Daily Tech Digest - July 30, 2018


Chamber of Digital Commerce Sets Out ICO and Token Guidelines
Former Securities and Exchange Commission (SEC) commissioner and CEO of Patomak Global Partners Paul Atkins comments, “These principles are an important tool for responsible growth and smart regulation that strikes the right balance between protecting investors while allowing for innovation in this new technological frontier. We think it is important to explain the unique attributes of blockchain-based digital assets, which are not all strictly investment based, and provide guidance to consumers, regulators and the industry.” The whitepaper is broken up into three distinct sections. The first offers a comprehensive overview of current and future regulations to give investors a stronger understanding of securities laws in the U.S., Canada, the U.K. and Australia. The second part showcases industry-developed principles for both trading platforms and token sponsors to better promote safe and legal business practices and lower the risks to organizers and traders. 


Panasonic launches rugged Toughbook T1 and L1 Android tablets
The Toughbook T1 has a 5-inch screen and runs Android 8.1 Oreo. Its built-in barcode reader lets retail workers, warehouse employees, and transportation and logistics employees quickly scan barcodes for better productivity, and its high-speed connectivity integrates with resource management systems and databases. The FZ-T1 is available in two models: one with Wi-Fi connectivity only, and another offering voice and data connections on AT&T and Verizon networks, as well as data connectivity through P.180, Panasonic's purpose-built network. The Toughbook L1 is a professional-grade tablet that can be mounted in a vehicle or used as a handheld device. It has a 7-inch screen, runs Android 8.1 Oreo, and includes an integrated barcode reader that is field-configurable for landscape or portrait modes. The L1 will be released in a Wi-Fi-only model and one that supports data service on Verizon, AT&T and Panasonic's P.180.


IBM banks on the blockchain to boost financial services innovation

On Monday, the tech giant said a proof-of-concept (PoC) design has been created for the platform, dubbed LedgerConnect. The system is a distributed ledger technology (DLT) platform intended for enterprise financial services companies, including banks, buy- and sell-side firms, FinTechs and software vendors. The goal for LedgerConnect is to bring these companies together to deploy, share, and use blockchain-based services hosted on the network, making adoption more cost-effective for companies as well as easier to access and deploy. Services will include Know Your Customer (KYC) processes, sanctions screening, collateral management, derivatives post-trade processing and reconciliation, and market data. "By hosting these services on a single, enterprise-grade network, organizations can focus on business objectives rather than application development, enabling them to realize operational efficiencies and cost savings across asset classes," IBM says.
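The article does not describe LedgerConnect's actual interfaces, but the pattern it points at, consuming a network-hosted service instead of building it in-house, can be sketched. In the illustration below the endpoint URL, payload fields and response shape are all invented for the sake of the example:

    import json
    import urllib.request

    # Hypothetical endpoint for a KYC check hosted on a shared ledger network.
    # The URL and field names are invented for illustration; LedgerConnect's
    # real interfaces are not public in this article.
    KYC_SERVICE_URL = "https://ledger-network.example.com/services/kyc/check"

    def check_counterparty(legal_entity_id):
        """Ask the shared KYC service for the current status of a counterparty."""
        payload = json.dumps({"lei": legal_entity_id}).encode("utf-8")
        request = urllib.request.Request(
            KYC_SERVICE_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)

    # One call against the shared service replaces an in-house KYC pipeline
    # for this check, which is the cost saving IBM is pointing at.
    # status = check_counterparty("5493001KJTIIGC8Y1R12")  # made-up LEI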


Don’t Let Your Data Lake Become a Data Swamp


To cope with the growing volume and complexity of data and to relieve pressure on IT, some organizations are migrating to the cloud. But this transition in turn creates other issues. For example, once data is made more broadly available via the cloud, more employees want access to it. Growing numbers and varieties of business roles are looking to extract value from increasingly diverse data sets, faster than ever, putting pressure on IT organizations to deliver real-time data access that serves the diverse needs of business users looking to apply real-time analytics to their everyday jobs. However, it's not just about better analytics: business users also frequently want tools that allow them to prepare, share, and manage data. To minimize tension and friction between IT and business departments, moving raw data to one place where everybody can access it sounded like a good move. When James Dixon coined the term in 2010, he envisioned the data lake as a large body of raw data held in a more natural state, where different users come to examine it, delve into it, or extract samples from it.
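A common first defence against the swamp is cataloguing raw data as it lands, so it stays discoverable. Below is a minimal, illustrative sketch of that idea; the catalog structure and field names are assumptions for the example, not any particular product:

    from datetime import datetime, timezone

    # Toy metadata catalog: registering each dataset on arrival is one common
    # way to keep a raw data lake searchable instead of letting it silt up.
    catalog = {}

    def register_dataset(path, owner, schema, description):
        """Record where a dataset lives, who owns it, and what it contains."""
        catalog[path] = {
            "owner": owner,
            "schema": schema,  # e.g. {"user_id": "int", "event": "str"}
            "description": description,
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        }

    def find_datasets(keyword):
        """Let business users locate raw data by description, not guesswork."""
        return [p for p, meta in catalog.items() if keyword in meta["description"]]

    register_dataset(
        "s3://lake/raw/clickstream/2018-07-30/",
        owner="web-analytics",
        schema={"user_id": "int", "event": "str", "ts": "timestamp"},
        description="Raw clickstream events from the web frontend",
    )
    print(find_datasets("clickstream"))  # -> ['s3://lake/raw/clickstream/2018-07-30/']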


3 Ways Automation & Integration Is Disrupting the HIT Status Quo

Integrated patient engagement solutions empower patients along the continuum of their healthcare experience, pre-visit to post-visit, with features such as self-scheduling, online access to consent forms and personal information, and communications with their providers via a user-friendly patient portal. And by engaging patients with this end-to-end lifecycle approach, practices can increase patient satisfaction rates, patient retention and referrals. ... “We wanted something that was easy to use for the patients and staff, straightforward, less expensive than our current solution, available to all our providers, and that would offer greater transparency to patients, particularly on which insurances we take,” notes Jared Boundy, MHA, director of operations for Washington-based Dermatology Arts. “We also felt that it needed to integrate with the other systems we already had in place. It had to be adaptable, too, as we didn’t want to pay an arm and a leg every time we added a provider or a location.”


AI Software Development: 7 Things You Need to Know

At the initial training stage, machine learning needs substantial computing resources, while the subsequent data processing stage is far less demanding. Previously, this varying requirement was difficult for those who wanted to implement machine learning but were unwilling to make a big one-time investment in adequately powerful servers. With the emergence of cloud technology, satisfying this requirement became easy: AI software development services can rely on either a corporate or a commercial cloud, e.g. Microsoft Azure or AWS. ... As artificial intelligence techniques mature, more people are interested in using these practices to control complex real-world systems that have hard deadlines. ... AI is a huge field with a wide area to cover, so it is difficult to recommend one single programming language. Of course, there are a variety of programming languages that can be used, but not all offer the best value for your effort. Considering their simplicity, prototyping capabilities, usefulness, usability and speed, the languages regarded as the best options for AI are Python, Java, Lisp, Prolog and C++.
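As a small illustration of why Python leads such lists, here is a complete prototype that trains and evaluates a classifier in roughly a dozen lines, assuming scikit-learn is installed:

    # A minimal prototyping example: train and score a classifier on the
    # bundled iris dataset (pip install scikit-learn).
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

The same experiment in C++ or Java would require far more ceremony before the first result, which is what "prototyping capability" buys in practice.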


Connecting whilst building – benefits of the IoT in construction

While the IIoT opens the door to a host of new opportunities such as cost reduction, worker safety, quality improvement and business growth, the prospect of gearing up for the next industrial revolution can cause apprehension. Implementing IIoT solutions can change the way IT interacts with production systems and field devices, but if this is matched with the right approach to connectivity, and with a grasp of the potential of the servitisation model, it needn't keep construction companies awake at night. Connectivity is the lifeblood of the IoT, and this is just as true in an industrial setting. Field connectivity is indispensable for conveying commands to field systems and devices, in addition to acquiring data for further analysis. It tends to be a cross-cutting and cross-layer function in IIoT systems, as both edge and cloud modules are able to access field data directly using one of a large number of protocols. These include OPC-UA (Unified Architecture), MQTT (Message Queue Telemetry Transport), DDS (Data Distribution Service), oneM2M and various other protocols, as illustrated in the Industrial Internet Connectivity Framework.
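To make field connectivity concrete, here is a minimal MQTT subscriber sketch using the Eclipse paho-mqtt 1.x client (pip install paho-mqtt); the broker host and topic are placeholders standing in for a site's own infrastructure:

    import paho.mqtt.client as mqtt

    BROKER_HOST = "broker.example.com"   # placeholder for a site's MQTT broker
    TOPIC = "site/crane-07/telemetry"    # hypothetical topic for one field device

    def on_connect(client, userdata, flags, rc):
        print("connected with result code", rc)
        client.subscribe(TOPIC)          # (re)subscribe on every reconnect

    def on_message(client, userdata, msg):
        # Each message carries one telemetry reading from the field device.
        print(msg.topic, msg.payload.decode())

    client = mqtt.Client()               # paho-mqtt 1.x callback API
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(BROKER_HOST, 1883, keepalive=60)
    client.loop_forever()                # block and dispatch callbacks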


Pushing the Boundaries of Computer Vision

Although augmented reality has occasionally been described as a bridge to true virtual reality, AR is actually more difficult to implement in some ways. Nevertheless, the technology has evolved rapidly in recent years, thanks in part to advances in computer vision. At the core of AR is a challenge relevant to other fields of computer vision: object recognition. Small variations in objects can prove challenging for image recognition software, and even a change in lighting can cause mismatches. Experts at Facebook and other companies have made tremendous progress through deep learning and other artificial intelligence fields, and these advances have the potential to make AR, and other vision applications dependent on object recognition, more powerful in the coming years. Another transformative use case is predicted to be agriculture. Agricultural science is charged with feeding the world, and computers have been making major strides in the field in recent years. Because farms are so large and often remote, image recognition enables individual farmers to be far more effective. Computer vision capable of detecting fruit can help farmers track progress and determine the right time for harvest.
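As a concrete example of the object recognition underpinning these use cases, the sketch below classifies one image with a pretrained ImageNet network via torchvision (assumes PyTorch and torchvision are installed; the image path is a placeholder):

    import torch
    from PIL import Image
    from torchvision import models, transforms

    # Standard ImageNet preprocessing for torchvision classifiers.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    model = models.resnet50(pretrained=True)   # ImageNet-trained classifier
    model.eval()

    image = Image.open("orchard_fruit.jpg")    # placeholder image file
    batch = preprocess(image).unsqueeze(0)     # shape: (1, 3, 224, 224)

    with torch.no_grad():
        logits = model(batch)
    print("predicted class index:", logits.argmax(dim=1).item())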


Monitoring Your Data Center Like a Google SRE

The SLO is used to define what SREs call the "error budget," a numeric line in the sand. The error budget is used to encourage collective ownership of service availability and to blamelessly resolve disputes about balancing risk and stability. For example, if programmers are releasing risky new features too frequently and compromising availability, this will deplete the error budget. SREs can point to the at-risk error budget and argue for halting releases and refocusing coders on efforts to improve system resilience. This approach lets the organization as a whole balance speed and risk against stability effectively. Paying attention to this economy encourages investment in strategies that accelerate the business while minimizing risk: writing error- and chaos-tolerant apps, automating away pointless toil, advancing by means of small changes and evaluating "canary" deployments before proceeding with full releases. Monitoring systems are key to making this whole, elegant tranche of DevOps/SRE discipline work. It's important to note that this has nothing to do with what kind of technologies you're monitoring, the processes you're wrangling or the specific techniques you might apply to stay above your SLOs.
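The error-budget arithmetic itself is simple enough to show directly, since the budget is just 1 minus the SLO measured over a window. A minimal sketch with illustrative numbers:

    def error_budget_remaining(slo, total_requests, failed_requests):
        """Return the fraction of the error budget still unspent (can go negative)."""
        budget = (1.0 - slo) * total_requests   # failures the SLO tolerates
        return (budget - failed_requests) / budget

    # Example: a 99.9% availability SLO over 10 million requests allows
    # 10,000 failures; 7,500 failures spends 75% of the budget.
    remaining = error_budget_remaining(slo=0.999,
                                       total_requests=10_000_000,
                                       failed_requests=7_500)
    print(f"error budget remaining: {remaining:.0%}")   # -> 25%

When the remaining budget approaches zero, that is the numeric trigger for the "halt releases, invest in resilience" argument described above.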


Utilize microservices to support a 5G network architecture


A microservices architecture, rather than a monolithic one, is the ideal cloud-based architecture for 5G. Only microservices can properly support a 5G network architecture, because no set of monolithic applications can meet the responsiveness, flexibility, updatability and scalability requirements that 5G demands. Virtualized network services must also adapt to new technologies and demands on the system as they come along. With a microservices-based architecture, this is a relatively easy task, accomplished via changes to individual microservices rather than to the whole system. The technologies included in 5G will likely change rapidly after the initial rollout, so this kind of adaptability is a necessity. Additionally, signal-related expectations of 5G, such as high availability, require the kind of flexibility that microservices can deliver. According to the NGMN, remote-location equipment should be self-healing, which means it requires flexible, built-in, AI-based diagnostic and repair software capable of at least re-establishing lost communication when isolated.
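What gives microservices this updatability is that each network function is a small, independently deployable unit exposing its own health and lifecycle endpoints. A minimal, illustrative sketch using Flask (the service name and routes are invented for the example):

    # One small network function with its own health endpoint, the building
    # block that lets an orchestrator restart or replace it independently of
    # the rest of the system (pip install flask).
    from flask import Flask, jsonify

    app = Flask("session-management")   # hypothetical 5G core function

    @app.route("/health")
    def health():
        # An orchestrator (e.g. Kubernetes) polls this to decide whether to
        # restart the instance, a simple form of the self-healing described above.
        return jsonify(status="ok", service="session-management", version="1.4.2")

    @app.route("/v1/sessions", methods=["POST"])
    def create_session():
        # Placeholder business logic: because each capability lives in its own
        # service, it can be updated or scaled without redeploying anything else.
        return jsonify(session_id="demo-123"), 201

    if __name__ == "__main__":
        app.run(port=8080)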



Quote for the day:


"The People That Follow You Are A Reflection Of Your Leadership." -- Gordon TredGold

