Daily Tech Digest - April 06, 2020

How DevOps is integral to a cloud-native strategy

Containerisation makes applications environment-agnostic and eliminates application conflicts between development and operations teams, in turn allowing greater collaboration between developers and testers. Breaking monolithic applications down into their constituent microservices also increases agility and creates a common toolset, terminology and set of processes shared by development and operations teams, which makes it easier for these teams to work with one another. This enables the advanced automation of processes and contributes to an organisation's move towards agile software development (defined by the continuous delivery of software created in rapid iterations). It's important to stress that these technologies will only be successfully implemented if that cultural shift happens too, which is where embracing DevOps becomes key. Going cloud-native is a gradual process and a learning experience; most organisations have established IT environments built on on-premises applications.
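To make "environment-agnostic" concrete, here is a minimal sketch (Python standard library only; the port and greeting variables are hypothetical stand-ins for real service configuration) of a microservice that reads all of its configuration from the environment, so the same container image can run unchanged in development, test and production:

```python
# Minimal sketch of an environment-agnostic microservice.
# All configuration comes from environment variables, not the image,
# so the same container runs unchanged in dev, test and prod.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = int(os.environ.get("PORT", "8080"))
GREETING = os.environ.get("GREETING", "hello from the orders service")

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":  # liveness probe for the orchestrator
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(GREETING.encode())

if __name__ == "__main__":
    HTTPServer(("", PORT), Handler).serve_forever()
```

Because nothing is hard-coded, promoting the service between environments means changing environment variables rather than rebuilding the image, which is what lets development and operations share a single artefact.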


"An increase in state digital surveillance powers, such as obtaining access to mobile phone location data, threatens privacy, freedom of expression, and freedom of association, in ways that could violate rights and degrade trust in public authorities -- undermining the effectiveness of any public health response. Such measures also pose a risk of discrimination and may disproportionately harm already marginalized communities," the joint statement said. "These are extraordinary times, but human rights law still applies. Indeed, the human rights framework is designed to ensure that different rights can be carefully balanced to protect individuals and wider societies. "States cannot simply disregard rights such as privacy and freedom of expression in the name of tackling a public health crisis. On the contrary, protecting human rights also promotes public health. Now more than ever, governments must rigorously ensure that any restrictions to these rights is in line with long-established human rights safeguards." As part of the statement, the signatories set out eight proposed conditions for all governments to adhere to if increased digital surveillance is used to respond to the COVID-19 pandemic.


Fog and Edge Computing: Principles and Paradigms provides a comprehensive overview of the state-of-the-art applications and architectures driving this dynamic field of computing while highlighting potential research directions and emerging technologies. Exploring topics such as developing scalable architectures, moving from closed systems to open systems, and ethical issues arising from data sensing, this timely book addresses both the challenges and opportunities that Fog and Edge computing presents. ... The Cloud Adoption Playbook helps business and technology leaders in enterprise organisations sort through the options and make the best choices for accelerating cloud adoption and digital transformation. Written by a team of IBM technical executives with a wealth of real-world client experience, this book cuts through the hype, answers your questions, and helps you tailor your cloud adoption and digital transformation journey to the needs of your organisation. ... The updated edition of this practical book shows developers and ops personnel how Kubernetes and container technology can help you achieve new levels of velocity, agility, reliability, and efficiency.


Applications: Combining the old with the new


There are a few reasons why mainframe applications cannot easily be migrated to public cloud infrastructure. Cresswell says mainframe applications will not run on the underlying cloud hardware without significant refactoring and recompilation. "They are typically compiled into mainframe-specific machine code, and the mainframe instruction-set architecture is substantially different from the x86 platforms that underpin almost all cloud services," he says. "Legacy mainframe applications rely on infrastructure software to manage batch and online activity, data access and many other legacy mainframe features. Like the applications themselves, this infrastructure software is also tied to the physical mainframe hardware and will not run in a conventional x86 cloud environment." Another barrier to migrating mainframe systems is that the mainframe software development pipeline cannot support many of the rapid deployment features that cloud-native applications rely on, says Cresswell, and it is virtually impossible to spin up testing environments on mainframes without extensive planning.
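To make the instruction-set point concrete: Linux on IBM Z reports the s390x architecture, while typical cloud virtual machines report x86_64, and machine code compiled for one cannot execute on the other. A small illustrative sketch (Python; the deployment check itself is hypothetical):

```python
# A tiny illustration of why binaries don't move between architectures:
# machine code compiled for IBM Z (s390x) cannot run on x86-64 cloud hosts.
import platform

arch = platform.machine()  # e.g. 'x86_64' on typical cloud VMs, 's390x' on Linux on IBM Z
if arch != "s390x":
    print(f"host is {arch}: mainframe binaries would need recompilation or emulation")
else:
    print("running on IBM Z hardware")
```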


7 Key Principles to Govern Digital Initiatives


An important starting point is to take an inventory of digital initiatives. This may sound like a straightforward task, but it is often quite challenging. People are reluctant to share information for fear they may lose control over their initiatives. Thus, it is helpful to stress that the inventory phase is about the centralization of information about digital initiatives, not control over them. Fred Herren, senior vice president, digital and innovation at SGS, the world’s largest provider of inspection, testing, and certification services, understood that applying a top-down approach to rules rarely works in decentralized cultures. He noted, “I think it’s necessary to walk the talk rather than give instructions. I’ve managed to get a lot of information because I’m not telling employees to stop [their activities]. I walk around and ask people what’s new and I always react positively.” ... Establishing appropriate key performance indicators (KPIs) is a critical exercise, particularly for digital initiatives that are highly dependent on strategic priorities related to the company’s future vision, success, and implementation objectives. However, when we asked leaders how they measure the performance of digital initiatives, most of them answered in one of two ways: either “we don’t” or “it depends.”


Emerging from AI utopia

Facial recognition is a good example of an AI-driven technology that is starting to have a dramatic human impact. When facial recognition is used to unlock a smartphone, the risk of harm is low, but the stakes are much higher when it is used for policing. In well over a dozen countries, law enforcement agencies have started using facial recognition to identify “suspects” by matching photos scraped from the social media accounts of 3 billion people around the world. Recently, the London Metropolitan Police used the technology to identify 104 suspects, 102 of whom turned out to be “false positives.” In a policing context, the human rights risk is highest because a person can be unlawfully arrested, detained, and ultimately subjected to wrongful prosecution. Moreover, facial recognition errors are not evenly distributed across the community. In Western countries, where there are more readily available data, the technology is far more accurate at identifying white men than any other group, in part because it tends to be trained on datasets of photos that are disproportionately made up of white men. Such uses of AI can cause old problems—like unlawful discrimination—to appear in new forms. Right now, some countries are using AI and mobile phone data to track people in self-quarantine because of the coronavirus disease 2019 pandemic. The privacy and other impacts of such measures might be justified by the scale of the current crisis, but even in an emergency, human rights must still be protected. Moreover, we will need to ensure that extreme measures do not become the new normal when the period of crisis passes.
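Working through the Met Police figures cited above shows how stark the error rate is; the short calculation below simply restates the reported numbers as a precision and a false discovery rate:

```python
# Worked arithmetic on the figures cited above:
# 104 people flagged as suspects, 102 of them false positives.
flagged = 104
false_positives = 102
true_positives = flagged - false_positives  # 2

precision = true_positives / flagged
print(f"precision: {precision:.1%}")                             # ~1.9%
print(f"false discovery rate: {false_positives / flagged:.1%}")  # ~98.1%
```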


Is Blockchain Necessary? An Unbiased Perspective

Bankers hate blockchain. It's obvious why they would: blockchain's greatest advantage is that it cuts costs down to infrastructure alone. No transaction fees, no maintenance charges, nothing. Effectively, blockchain makes banking obsolete, and honestly, I feel it should. The banking industry has remained essentially unchanged for millennia. It is an integral part of society whose mismanaged monetary transactions have incited myriad wars. Unfortunately, the banking industry is in a pathetic state. Bankers have too much power, too much control and too many streams of revenue. It needs to topple. It's a legacy system, and its pain points haven't changed since the days of Venetian merchants. There is so much abuse of power involved, and the fact that it is legal paints a grim picture. For example, the man who invented the credit card never wanted interest rates to go over 8%. Today, banks charge 12% to 18% on average, not including transaction, processing and various other fees. Blockchain can destroy and recreate this system. However, this brings us to the greatest chink in blockchain's armor: this transformative process is expensive, and decentralization makes it hard to coordinate.


Remote Working: What It Means For RPA


RPA still carries considerable risks with remote working. If anything, companies will need to engage in even more planning around their systems. "Enterprise-grade security needs to be baked into any RPA platform from the start, which helps provide greater resilience and business continuity," said Jason Kingdon, executive chairman at Blue Prism. More attention will also need to be paid to managing bot development and deployment; otherwise there could be much more sprawl across an organization, lessening the benefits of the technology. This is why it's important to have a Center of Excellence, or CoE (you can learn more about this in one of my recent Forbes.com posts). "You need to have a group of champions who control the system and monitor what bots are being built and who is building them," said Tabakman. "It's best to provide regular training around bot design and consider an approval process, where your champions review bots before they're deployed. You'll want to ensure that a bot being created doesn't create more problems than it solves, such as bots that go into infinite loops, resulting in more work for IT teams."
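As one concrete example of the kind of guardrail such champions might enforce before approving a bot, the sketch below (hypothetical bot logic and limits, written in Python for illustration; real RPA platforms expose this as platform configuration rather than code) caps the number of iterations so a faulty bot fails loudly instead of looping forever:

```python
# Hedged sketch of one CoE-style control: a hard iteration budget
# so a misbehaving bot cannot loop forever. Bot logic is hypothetical.
MAX_ITERATIONS = 1000

def process(item):
    print(f"processed {item}")  # stand-in for the bot's actual task

def run_bot(work_queue):
    for iteration in range(MAX_ITERATIONS):
        if not work_queue:
            return f"done after {iteration} iterations"
        process(work_queue.pop())
    # Budget exhausted: stop and escalate instead of looping forever.
    raise RuntimeError("iteration budget exceeded; escalating to the CoE")

print(run_bot(["invoice-1", "invoice-2"]))
```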


Overcoming flat data to unlock business insight and productivity

Artificial intelligence is eliminating entire swathes of manual intervention in the processing of documents, and, more importantly, adding context to them. It’s not enough to simply scan a document and store it along with a reference number: the technology must be able to add meaning to it and to create links with other related data, structured or unstructured. This type of technology falls into a category that we call Context Driven Productivity. At its core is the ability to extract information from flat data and transform it into semantic data, whereby links are created to other data sources, both internal and external, building relationships, connections and additional meaning. Semantic data allows humans or AI robots to gain contextual information automatically, rather than having to rely on a limited number of hard-wired connections. In practical terms, the possibilities are enormous. Not only will administrative workers be freed from the tedious task of manually processing incoming documents, but the resulting context-driven data will be infinitely more useful to any organisation.
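A minimal sketch of the flat-to-semantic transformation described above (the record, identifiers and predicates are hypothetical examples): once a scanned document is expressed as subject-predicate-object links, it stops being an isolated blob and gains context from other internal data:

```python
# Minimal sketch: turning a flat scan record into semantic triples
# (subject, predicate, object). Names and fields are hypothetical.
flat_record = {"doc_id": "INV-2041", "vendor": "Acme Ltd", "amount": "1200 EUR"}

# Links to other data sources turn an isolated scan into connected knowledge.
triples = [
    (flat_record["doc_id"], "is_a", "invoice"),
    (flat_record["doc_id"], "issued_by", flat_record["vendor"]),
    (flat_record["vendor"], "has_open_contract", "CT-77"),  # link to internal contract data
]

for subject, predicate, obj in triples:
    print(subject, predicate, obj)
```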


How cloud computing is changing the laboratory ecosystem


Cloud computing allows labs to tap immense computing power without the cost and complexity of running onsite server rooms. Switching from an onsite solution to the cloud alleviates the cost of IT infrastructure, reducing the cost of entry into the industry while also levelling the playing field for smaller laboratories. Moreover, cloud computing allows data to be extracted from laboratory devices and collated in the cloud: device integration between lab equipment and cloud services lets real-life data from experiments flow into a cloud system. One of the most popular products on the market is Cubuslab, a plug-and-play solution that serves as a laboratory execution system, collecting instrument data in real time and managing devices remotely. Collecting data at this volume requires a centralised system that integrates scientists' protocols and experimental annotations. The electronic lab notebook is becoming a common tool in research, allowing users to organise all their different data inputs and retrieve them at any point. This also allows large R&D projects to keep effective control of their data as they scale.
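For illustration, a hedged sketch of the device-integration idea (standard-library Python; the endpoint URL and payload schema are invented examples, not Cubuslab's actual API), in which an instrument reading is packaged and posted to a cloud laboratory system:

```python
# Hedged sketch of device-to-cloud integration: an instrument reading is
# posted to a cloud lab system. URL and payload are hypothetical examples.
import json
import urllib.request

reading = {"instrument": "balance-03", "value": 12.4082, "unit": "g"}

req = urllib.request.Request(
    "https://lims.example.com/api/readings",  # hypothetical endpoint
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # would transmit the reading in a real deployment
print("payload ready:", reading)
```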



Quote for the day:


"The art of communication is the language of leadership." -- James Humes

