Daily Tech Digest - May 29, 2019

Is Lean IT Killing Your Digital Transformation Plans?

The first thing to keep in mind is that IT should not be in any huge hurry to significantly trim down in terms of time and technology waste. A proper framework must first be put in place that clearly outlines and categorizes technology services and how they should be implemented, supported, and spun down at the end of their lifecycle. These processes should be broad enough to encompass things like technical staff and management roles, service provider requirements, lifecycle planning, quality control and lines of communication. Also keep in mind that unlike Lean manufacturing, Lean IT must take into consideration the speed at which technology advances and the volatility of business needs. Manufacturing is far more static in nature, and major changes can be planned for well in advance. With IT, that’s not the case: the need to adopt disruptive digital technologies can strike at lightning speed. Added to this is the fact that DX is about converting all business processes to a digitized state under the operational umbrella of the IT department. Thus, even a minor pivot in business strategy requires IT to change or add technologies to accommodate shifting business processes.


10 years from now your brain will be connected to your computer

Brain-machine interfaces (BMIs) are an intriguing area of research with huge potential, offering the ability to directly connect the human brain to computers to share data or control devices. Some of the work on BMIs is one step away from science fiction. Probably the best-known company working on this technology today is Neuralink, the Elon Musk-backed firm that aims to develop ultra-high-bandwidth 'neural lace' devices to connect humans and computers. At least part of the reason for Musk's interest in brain-computer connections is that such technology could stop humans from being left behind by a (still to emerge) super-intelligent artificial intelligence. The idea is that connecting our minds directly to the AI with high-bandwidth links would at least give us a chance to keep up with the conversation. However, more basic forms of BMI technology have been used in medicine for years, such as cochlear implants, which provide a sense of sound to a person who is profoundly deaf or severely hard of hearing.


Blockchain and sustainability
Regardless of which solution is chosen, the current underlying structure of blockchain is simply not sustainable. If cryptocurrencies and the myriad other applications of the technology are to be used reliably and at scale, the system has to change. Ethereum has taken tangible steps towards doing this, but the chance of blockchain feasibly replacing central authorities – like banks and energy companies – remains slim. That doesn’t mean, however, that blockchain can’t be used to gradually improve transparency and trust in industries where there are environmental and ethical concerns. The tricky relationship between blockchain and sustainability demonstrates just how complex sustainable solutions can be. While blockchain has the potential to improve supply chain sustainability, it also carries the mammoth energy consumption of cryptocurrencies – particularly Bitcoin. Perhaps the evolution of blockchain won’t come from the financial sector, but from the governments, organisations, and communities that use it to support sustainability.


No real change a year into GDPR, says privacy expert

Since the implementation of the GDPR, Room said, there has been a “fixation” among privacy practitioners on the idea that the regulatory system needs to inflict pain and punishment to deliver change, with a great deal of discussion and focus on the potentially huge fines under the GDPR. “We are deluding ourselves about the power to change that comes from enforcement action such as fines. We should not be investing our hopes in pain if we want to deliver change,” he said, adding that this has already led many to believe the GDPR is about US tech giants. “One year on, many organisations are thinking the fight is against US technology companies and not really about them. Not only is that distortion troubling, but so too is the view that pain is key to change, because that suggests a fundamental failure to understand the significance and importance of the subject matter in its own right,” said Room. The focus should not be on the fines and other enforcement actions, he said, but on the fact that the GDPR is about fundamental rights and freedoms.


Deploying RPA: why DevOps and IT need more control

“Non-IT departments have targets and ambitions to transform their business and feel frustrated that IT is just trying to keep the lights on,” he said. “So when a technology like RPA comes along and it’s pitched and marketed to a business audience and they can see positive results almost immediately, it’s a no-brainer for them that they’re just going to try and run it themselves, rather than have a lengthy conversation with IT over how to best implement it or how it fits within their technology roadmap.” But should DevOps be worried? According to O’Donoghue, no. He said: “Ultimately, RPA does not take away the bulk of what DevOps and IT services teams do. There’s a whole spectrum of tasks they’re busy with, from on-the-spot patching to service development. RPA can only do a very tiny part of this. So we’re never going to see a direct competition between RPA and DevOps, which is more of a cultural methodology for IT development and operations.”


NVIDIA Launches Edge Computing Platform to Bring Real-Time AI to Global Industries

NVIDIA EGX was created to meet the growing demand to perform instantaneous, high-throughput AI at the edge – where data is created – with guaranteed response times, while reducing the amount of data that must be sent to the cloud. By 2025, 150 billion machine sensors and IoT devices will stream continuous data that will need to be processed – orders of magnitude more than is produced today by individuals using smartphones. Edge servers like those in the NVIDIA EGX platform will be distributed throughout the world to process data in real time from these sensors. ... EGX combines the full range of NVIDIA AI computing technologies with Red Hat OpenShift and NVIDIA Edge Stack, together with Mellanox and Cisco security, networking and storage technologies. This enables companies in the largest industries – telecom, manufacturing, retail, healthcare and transportation – to quickly stand up state-of-the-art, secure, enterprise-grade AI infrastructures.


DevOps for networking hits chokepoints, tangled communications


While NetOps still lags behind DevOps, major market players are looking to bridge that gap. Red Hat, for example, brought network automation into Ansible configuration management to enable DevOps and network teams to automate the deployment of network devices and connections in the same way they would with OSes and cloud services. Ansible Tower, a management console for Ansible Engine, can store network credentials and scale network automation, among other tasks. Collectively, these networking features are referred to as Ansible Network Automation. DevOps teams should watch to see if, or how, they evolve in light of IBM's acquisition of Red Hat. In another move, this time by an established networking vendor, F5 Networks invested in NetOps via its acquisition of Nginx, an open source app delivery platform, early in 2019. With Nginx, F5 aims to blend network management with DevOps practices, as well as strengthen its multi-cloud presence. At the time of the deal, F5 said it would meld its app and network security services with Nginx's app delivery and API management portfolio.


Perfect storm for data science in security


Another key contribution by data science is in using automated methods to describe the extent of an attack as fully as possible. “Detection and response go hand in hand, and so the more we can detail the extent of an attack in terms of detection, the more we can accelerate the response.” Data scientists are also working in the field of automated response, but Neil said it is “still early days” in this regard, and automated response remains highly dependent on detection capability. “You need to be very sure of your detection before you start shutting machines down, because a false positive here is quite expensive for the enterprise, so this is a real challenge. However, progress is being made, and Microsoft has some of these automated response systems deployed. But we are very careful about this. Automated response is a very long-term goal. Regardless of the hype, it is going to take us years to realise this fully.” That said, Neil believes a lot of the manual, human-driven cyber attacks by teams of well-funded attackers will start to be replaced. “I think we are going to start seeing attackers using automated decision making.”
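Neil's caution about false positives is, at bottom, an expected-cost calculation. The sketch below is a hypothetical illustration of that trade-off, not Microsoft's system; the cost figures and the should_auto_isolate helper are invented for the example:

```python
# Illustrative only: expected-cost reasoning behind "be very sure of your
# detection before you start shutting machines down". Cost figures are
# invented placeholders, not real enterprise numbers.

COST_FALSE_POSITIVE = 50_000   # isolating a healthy machine: lost productivity
COST_FALSE_NEGATIVE = 500_000  # leaving a real compromise running: breach damage

def should_auto_isolate(p_malicious: float) -> bool:
    """Act only when acting is cheaper in expectation than waiting."""
    expected_cost_of_acting = (1 - p_malicious) * COST_FALSE_POSITIVE
    expected_cost_of_waiting = p_malicious * COST_FALSE_NEGATIVE
    return expected_cost_of_acting < expected_cost_of_waiting

# With these numbers the detector must report P(malicious) above roughly 0.09
# before automation fires; as false-positive costs rise, the bar climbs.
print(should_auto_isolate(0.05))  # False -> escalate to a human analyst
print(should_auto_isolate(0.30))  # True  -> automated isolation
```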


How researchers are teaching AI to learn like a child


One of the most challenging tasks is to code instincts flexibly, so that AIs can cope with a chaotic world that does not always follow the rules. Autonomous cars, for example, cannot count on other drivers to obey traffic laws. To deal with that unpredictability, Noah Goodman, a psychologist and computer scientist at Stanford University in Palo Alto, California, helps develop probabilistic programming languages (PPLs). He describes them as combining the rigid structures of computer code with the mathematics of probability, echoing the way people can follow logic but also allow for uncertainty: If the grass is wet it probably rained—but maybe someone turned on a sprinkler. Crucially, a PPL can be combined with deep learning networks to incorporate extensive learning. While working at Uber, Goodman and others invented such a "deep PPL," called Pyro. The ride-share company is exploring uses for Pyro such as dispatching drivers and adaptively planning routes amid road construction and game days. Goodman says PPLs can reason not only about physics and logistics, but also about how people communicate, coping with tricky forms of expression such as hyperbole, irony, and sarcasm.
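Goodman's wet-grass example maps almost line for line onto code. Below is a minimal sketch of that reasoning in Pyro, the deep PPL mentioned above; the prior probabilities and the small noise term are illustrative assumptions, not values from any published model:

```python
# A toy Pyro model of "if the grass is wet it probably rained -- but maybe
# someone turned on a sprinkler". Priors here are invented for illustration.
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import Importance, EmpiricalMarginal

def wet_grass_model():
    rain = pyro.sample("rain", dist.Bernoulli(0.3))            # assumed prior
    sprinkler = pyro.sample("sprinkler", dist.Bernoulli(0.2))  # assumed prior
    # Grass is very likely wet if either cause occurred, rarely wet otherwise.
    p_wet = 0.01 + 0.90 * torch.max(rain, sprinkler)
    pyro.sample("wet", dist.Bernoulli(p_wet))
    return rain

# Condition on observing wet grass, then ask: how likely is it that it rained?
conditioned = pyro.condition(wet_grass_model, data={"wet": torch.tensor(1.0)})
posterior = Importance(conditioned, num_samples=2000).run()
p_rain = EmpiricalMarginal(posterior).mean.item()
print(f"P(rain | wet grass) ~ {p_rain:.2f}")  # about 0.67 with these priors
```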


Effective Risk Analysis in Cybersecurity, Operational Technology and the Supply Chain


From a cybersecurity perspective, Open Standards can be used to provide a proven, consensus-based methodology for the application of quantitative risk analysis, allowing for effective measurement with greater validity. In supply chain security, for example, the Open Trusted Technology Provider Standard exists to help providers of IT products utilize a quantitative approach to risk analysis. This enhances a manufacturer’s ability to identify how much risk is present and to determine which third party is the weakest link within its supply chain. In OT environments, however, risk evaluation methodologies like bow-tie analysis are often used to relate hazards, threats and mitigating controls. To enhance this technique, the addition of quantitative risk measurement will enable OT decision makers to more accurately evaluate which risks are worthy of mitigation. Although the measurement and management of risk has long been recognized as an important organizational responsibility, the hyper-complexity of today’s business environment has catapulted it to the forefront of the minds of senior executives.
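To make “how much risk is present” concrete: quantitative methods in the FAIR tradition typically model risk as event frequency times loss magnitude and simulate the resulting annual-loss distribution. The following is a minimal Monte Carlo sketch of that idea; every parameter is an illustrative assumption, not a calibrated figure from any standard:

```python
# Minimal Monte Carlo sketch of quantitative risk measurement for a single
# threat scenario (frequency x magnitude). All inputs are invented examples.
import numpy as np

rng = np.random.default_rng(seed=1)
TRIALS = 100_000

# Assumed inputs: events per year follow a Poisson distribution; each event's
# loss follows a triangular distribution (min / most likely / max, in dollars).
events_per_year = rng.poisson(lam=2.0, size=TRIALS)
annual_loss = np.array([
    rng.triangular(10_000, 40_000, 250_000, size=n).sum()
    for n in events_per_year
])

print(f"Mean annualized loss: ${annual_loss.mean():,.0f}")
print(f"95th-percentile loss: ${np.percentile(annual_loss, 95):,.0f}")
# Comparing these figures across scenarios shows which risks are "worthy of
# mitigation", as the excerpt puts it, rather than ranking them by gut feel.
```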



Quote for the day:


"It is the responsibility of leadership to provide opportunity, and the responsibility of individuals to contribute." -- William Pollard

