Daily Tech Digest - January 21, 2018

Calculating the Costs of a Cyber Breach: Becoming the “Antifragile” Cyber Organization

We like the antifragile concept for two main reasons. First, when it comes to cybersecurity, what concerns people like us are these low-probability/high-impact events, sometimes called “fat-tail” events, that are difficult to account for and even harder to predict. Sure, we can say that a spear-phishing campaign could be catastrophic, but identifying which spear-phishing campaign will be the straw that breaks the camel’s back is a whole lot harder, if not impossible. Second, we like the antifragile concept because it is not only about resisting the breach; it is also about learning from the breach attempt. We like that, and that’s where we would like all organizations to be when it comes to their cyber posture. (Note: we are giving you the super oversimplified version of the antifragile concept.) So, if we want to become an “antifragile cyber organization,” where do our concerns lie? Actually, it is not so much with the technical capabilities.



MADIoT – The nightmare after XMAS (and Meltdown, and Spectre)


Now, there is a much larger underlying issue. Yes, software bugs happen, and hardware bugs happen. The former are usually fixed by patching the software; in most cases the latter are fixed by updating the firmware. However, that is not possible with these two vulnerabilities, as they are caused by a design flaw in the hardware architecture, only fixable by replacing the actual hardware. Luckily, with cooperation between the suppliers of modern operating systems and the hardware vendors responsible for the affected CPUs, the operating systems can be patched, complemented if necessary with additional firmware updates for the hardware. Additional defensive layers preventing malicious code from exploiting the holes – or at least making it much harder – are an “easy” way to make your desktop, laptop, tablet and smartphone devices (more) secure. Sometimes this comes at the penalty of a slowdown in device performance.


Forget bitcoin. Here come the blockchain ETFs

"Investors have been buying blindly, and there has been some abuse," said Christian Magoon, CEO of Amplify ETFs. "The SEC has to protect investors." But make no mistake. These two funds are set up to take advantage of the growing interest in blockchain. This is not the Winklevoss Bitcoin Trust, a fund that only owns bitcoin and is run by Cameron and Tyler, of Facebook and "The Social Network" movie fame. The Winklevii want to launch an ETF with the ticker symbol COIN, but the SEC has yet to approve it. In fact, the SEC seems unlikely to greenlight any funds that just want to invest in cryptocurrencies. Dalia Blass, director of the SEC's Division of Investment Management, wrote in a letter Thursday that it had many questions about these funds. And she said that until they are addressed, "we do not believe that it is appropriate for fund sponsors to initiate registration of funds that intend to invest substantially in cryptocurrency and related products."


Applying Quantum Physics to Data Security Matters Now and in the Future


As one of the few companies leveraging the power of quantum physics for random number generation, we also offer advanced key and policy management features that give customers complete control over the lifecycle and use of encryption keys. “Encryption and key management is complicated enough, and the injection of quantum mechanics into the discussion is enough to make most folks’ heads spin,” note co-authors Garrett Becker and Patrick Daly in the report. “By combining the added security of encryption keys based on quantum random numbers, advanced key lifecycle management and an HSM for protecting those keys, QuintessenceLabs has developed a compelling offering for those enterprises and agencies with a need for the highest level of data security.”


How long will patient live? Deep Learning takes on predictions

The team described their work in their paper, "Improving Palliative Care with Deep Learning," which is up on arXiv. The paper was submitted in November. The authors are Anand Avati, Kenneth Jung, Stephanie Harman, Lance Downing, Andrew Ng and Nigam Shah. The authors' Stanford affiliations span the Department of Computer Science, the Center for Biomedical Informatics Research, the Department of Medicine and the Stanford University School of Medicine. The algorithm was not developed to replace doctors but rather to provide a tool to improve the accuracy of prognoses. As Jeremy Hsu wrote in IEEE Spectrum, it serves "as a benign opportunity to help prompt physicians and patients to have necessary end-of-life conversations earlier." One can think of it as a triage tool for improving access to palliative care, Stephanie Harman, clinical associate professor of medicine at Stanford University and a co-author of the new study, told Gizmodo.


New Security Architecture Practitioner’s Initiative


The Security Architecture Practitioner’s Initiative is a joint effort of The Open Group Security Forum (a global thought leader in Enterprise Architecture) and The SABSA Institute to articulate in a clear, approachable way the characteristics of a highly qualified Security Architect. The focus of this initiative is on the practitioner, the person who fills the role of the Security Architect, and on the skills and experience that make them great. This project is not about security architecture as a discipline, nor about a methodology for security architecture, but rather about people and what makes them great Security Architects. The project team consists of pioneering Security Architects drawn from both The Open Group Security Forum and The SABSA Institute who have between them many decades of security architecture experience at organizations such as Boeing, IBM, HP, and NASA. Operating under the auspices of The Open Group and in collaboration with The SABSA Institute


Public Blockchain's Lure Will Become Irresistible for Enterprises in 2018


Central banks are already experimenting with the tokenization of their own currencies, but doing so in private, permissioned or proprietary blockchains that are managed by the central banks. It is a good start, but the next logical step is to create the legal and regulatory framework that enables the tokenization of fiat currency on any industrial or public blockchain. Once a closed-loop tokenized industrial blockchain exists, many of the key foundations of specialized blockchains would become add-on features in the true economic blockchain. Trade finance is easy if you trust that the representation of 1,000 phones, each worth $1,000, is accurate — you can loan money against those tokens in the blockchain. Similarly, customs declarations, tax calculations, and product history and provenance are all easily derived from looking at the history of the tokens in that blockchain. No separate blockchain is required for trade finance, payments or product traceability.


World Wide Data Wrestling

Today, almost every person, company and other entity is running after data. By combining all this disparate data, predictive analytics can create highly accurate models to predict pollution trends in advance, allowing civic agencies to make relevant predictions and changes to prevent spikes and keep pollution levels in check. Big data and analytics can also help improve traffic management in addition to just monitoring pollution levels, and the fates of artificial intelligence and big data are intertwined ... Organizations collect data from a variety of sources, including business transactions, social media and information from sensor or machine-to-machine data. In the past, storing it would’ve been a problem – but new technologies (such as Hadoop) have eased the burden. A comprehensive and widespread network such as this to track the causes of pollution at source will allow government agencies to create smarter strategies to combat pollution.



Don’t be fooled: AI-powered tech still needs to prove its intelligence

It’s essentially a definitional problem: for some reason, the industry is hellbent on using AI when what it actually means is machine learning (ML). This is a much narrower term, referring to what is essentially using trial and error to build a model that’s capable of guessing the answers to discrete questions very accurately. For example, take image recognition: say you want to build a system that separates pictures of cats from pictures of dogs. All you have to do is feed an ML algorithm enough pictures of cats, telling the system they are cats, and then enough pictures of dogs, telling it they are dogs. It will then build a model of what patterns to look for and eventually, after enough training, you should be able to feed it an unlabelled image, and it will be able to make a fairly accurate guess as to which of the two animals is in the picture.
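The cats-vs-dogs training loop described above can be sketched with a deliberately tiny stand-in for a real image model: instead of pixels, each animal is a pair of made-up numeric features, and a nearest-centroid rule plays the role of the learned model. All names, features and labels here are invented for illustration.

```python
# Toy supervised learning: label examples, "train" by averaging each
# class's features, then guess the label of an unseen example.

def train(examples):
    """examples: list of ((x, y) features, label). Returns per-label centroids."""
    sums, counts = {}, {}
    for (x, y), label in examples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {lab: (sx / counts[lab], sy / counts[lab])
            for lab, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Guess the label whose centroid is closest to the point."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(centroids, key=lambda lab: dist2(centroids[lab], point))

# Training set: labeled "pictures" reduced to two invented features.
training_data = [((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
                 ((3.0, 3.2), "dog"), ((2.8, 3.0), "dog")]
model = train(training_data)
print(predict(model, (1.1, 0.9)))  # near the cat cluster -> "cat"
```

A real classifier would learn far richer patterns from thousands of images, but the shape of the process is the same: labeled inputs in, a predictive model out.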


What Aspiring Data Scientists Are Looking For in Hiring Companies

Not long ago, big data was the exclusive territory of the most prominent IT brands. There weren’t as many experts in the field 20 or even 10 years ago, and many small companies functioned perfectly fine without using big data. But this isn’t the case anymore. Even the smallest startup companies and entrepreneurial ventures now rely on big data to execute their business models, locate specific demographics and ensure long-term success. Businesses leave almost nothing to chance anymore, and much of this transition is tied directly to the big data boom. Perhaps above all else, data scientists want to work with a company that offers job security and stability. Many professionals in the niche realize how few openings exist in the field, so they’ll be pursuing long-term assignments whenever possible. Companies that can accommodate this need and provide guaranteed work will likely find it easier to fill roles in big data management, as opposed to those that only need to complete a one-time project.



Quote for the day:


"A leader does not deserve the name unless he is willing occasionally to stand alone." -- Henry A. Kissinger


Daily Tech Digest - January 19, 2018

A fog computing fabric can have a variety of components and functions. It could include fog computing gateways that accept data IoT devices have collected. It could include a variety of wired and wireless granular collection endpoints, including ruggedized routers and switching equipment. Other aspects could include customer premises equipment (CPE) and gateways to access edge nodes. Higher up the stack, fog computing architectures would also touch core networks and routers and eventually global cloud services and servers. The OpenFog Consortium, the group developing reference architectures, has outlined three goals for developing a fog framework. Fog environments should be horizontally scalable, meaning they will support multiple industry vertical use cases; be able to work across the cloud-to-things continuum; and be a system-level technology that extends from things, over network edges, through to the cloud and across various network protocols.


How to enable TCP BBR to improve network speed on Linux

Google developed a TCP congestion control algorithm (CCA) called TCP Bottleneck Bandwidth and Round-trip propagation time (BBR) that overcomes many of the issues found in both Reno and CUBIC (the default CCAs). This new algorithm not only achieves significant bandwidth improvements, but also lower latency. TCP BBR is already employed on google.com servers, and now you can make it happen--so long as your Linux machine is running kernel 4.9 or newer. Out of the box, Linux uses Reno and CUBIC. ... The first thing you need to do is make sure your Linux machine is running a supported kernel. Issue the command uname -r. If your kernel is earlier than 4.9, this won't work; you'll have to upgrade your kernel. For instance, out of the box Ubuntu 16.04 runs kernel 4.4. If your server is such that the kernel can be updated, Ubuntu now has a very easy means of updating to a much newer kernel.
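The kernel-version check the article walks through can be sketched in a few lines. The parsing helper below is illustrative; the two sysctl settings it prints (`net.core.default_qdisc=fq` and `net.ipv4.tcp_congestion_control=bbr`) are the standard knobs for enabling BBR and require root to apply.

```python
# Check whether the running kernel is new enough for TCP BBR (>= 4.9),
# then print the sysctl settings to persist in /etc/sysctl.conf.
import platform

def supports_bbr(release: str) -> bool:
    """True if a release string like '4.4.0-116-generic' is kernel >= 4.9."""
    parts = release.split(".")
    major = int(parts[0])
    minor = int(parts[1].split("-")[0])
    return (major, minor) >= (4, 9)

try:
    ok = supports_bbr(platform.release())
except (IndexError, ValueError):
    ok = False  # non-Linux or unexpected version string

if ok:
    print("Kernel supports BBR; add these lines and run `sysctl -p`:")
    print("net.core.default_qdisc=fq")
    print("net.ipv4.tcp_congestion_control=bbr")
else:
    print("Kernel older than 4.9 (or not Linux); upgrade before enabling BBR.")
```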



Car hacking remains a very real threat as autos become ever more loaded with tech

A large-scale vehicle hacking resulting in death and destruction was depicted in last year's "The Fate of the Furious" action movie. “That’s Hollywood sensationalizing it, but that is not really that far-fetched," said Joe Fabbre, a director with Santa Barbara, Calif.-based Green Hills Software, which makes operating systems software for vehicles with a focus on security. “There are very skilled hackers out there who can beat through a lot of medium and low levels of robustness in terms of security that is present in a lot of cars today.” In response to the hacking threat, more vehicles are gaining the ability to wirelessly download security patches, similar to how computers and smartphones have been getting software updates for years. These over-the-air updates allow auto companies to respond to threats – and newly discovered vulnerabilities – faster than having to direct customers to bring their vehicles to dealerships.


Ransomware: Why the crooks are ditching bitcoin and where they are going next

"If you're the guy behind the ransomware campaign, you want people to pay you -- you don't want people not to be able to pay you! You want to make it as easy as possible," said Glusman. Meanwhile, ransomware victims don't really want to have to pay to get their files back at the best of times - they give in grudgingly - but the incentive to pay might go out the window if it's going to take them days to buy bitcoin and pay the hackers before getting their files back. And there's an even bigger headache: many forms of ransomware offer only a small window for victims to pay the ransom. If that expires, victims risk the ransom going up or even losing their data permanently. Delays in being able to buy bitcoin and then make the payment make it even harder for ransomware victims to be able to get their data back. This is also a headache for the ransomware crooks: ultimately, there's therefore no point in a ransomware distributor being in the business if they can't get paid for their illicit activity.


Create security culture to boost cyber defences, says Troy Hunt


“Even organisations that are security aware enough to be training employees on various related topics do not necessarily know how to make those hard skills part of the organisation’s culture,” he said. This realisation, he said, led to the development of a course on creating a security-centric culture for Pluralsight, an enterprise technology learning platform company. The course is aimed at helping technology professionals and management understand how to embed a culture of security in their organisations, said Hunt. Part of the problem, he said, is that many organisations’ development and security teams tend to work in separate silos. Typically, development groups build the software before it is passed to the security team, but this creates a divide between these groups. Developers tend to be scared of the security people, said Hunt, because the security people can stop software projects from going live if any critical security vulnerabilities are identified in the software code.


Blockchain, digital trust and distributed ledger technology – going big business

The question is how you deal with ever more and faster transactions at the core of digital business in a reliable way that doesn’t slow down transactions but, on the contrary, offers the speed they need in a trustworthy and cost-efficient way. Using a distributed technology is the answer for many. Enter blockchain technology. As mentioned in the introduction, blockchain technology is rooted in the world of cryptocurrencies, more specifically Bitcoin. That connotation will disappear and we will not speak about the blockchain but about blockchains (note the letter ‘s’), blockchain technology or distributed ledger technology. Blockchain technology is being tested and implemented across a broad range of applications, industries and use cases. Examples, on top of the Internet of Things and financial services


Leverage the power of the mainframe to make sense of your IoT data

Thankfully, new innovations on the mainframe make it possible to leave IoT data in each of the databases where they reside and join the different data sources to perform your analytics. After all, with so many IoT devices collecting data and storing it in different locations, eliminating the ETL/ELT process altogether certainly sounds like a more efficient means of analyzing data. The concept of data virtualization allows users to define the structure of a data source in a relational format so that SQL can be run against that data. This means disparate hierarchical data sources can be joined via SQL, just like relational data sources, creating an aggregate view of the available data, enterprisewide. Reading IoT data in situ using SQL can be performed on the mainframe using its enhanced capabilities and can eliminate the ETL/ELT process entirely. By using the workhorse platform retailers already have within their infrastructure, they can gain real-time access to their IoT data and provide their customers with a frictionless and customized shopping experience.
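As a rough sketch of the data-virtualization idea, the snippet below uses SQLite to stand in for the virtualization layer: two tables play the role of disparate data sources, and a single SQL join produces the aggregate view with no ETL step. Table and column names are invented for the example.

```python
# Join two differently-shaped "sources" with one SQL query, in place.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensor_readings (device_id TEXT, temp REAL)")
conn.execute("CREATE TABLE device_registry (device_id TEXT, store TEXT)")
conn.executemany("INSERT INTO sensor_readings VALUES (?, ?)",
                 [("d1", 21.5), ("d2", 19.0)])
conn.executemany("INSERT INTO device_registry VALUES (?, ?)",
                 [("d1", "Store A"), ("d2", "Store B")])

# One aggregate, enterprise-wide view across both sources:
rows = conn.execute("""
    SELECT r.store, s.temp
    FROM sensor_readings s JOIN device_registry r USING (device_id)
    ORDER BY r.store
""").fetchall()
print(rows)  # [('Store A', 21.5), ('Store B', 19.0)]
```

On a mainframe the sources would be hierarchical databases rather than two SQLite tables, but the payoff is the same: query the data where it lives instead of copying it out first.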


Cloud portability: Why you’ll never really get there

The reality is that porting applications, whether they’re in containers or not, requires a great deal of planning to deal with the compatibility issues of the different environments. The use of containers does not guarantee that your containerized applications will be portable from platform to platform, cloud to cloud. For example, you can’t take a containerized application meant for Linux and run it on Windows, or the other way around. Indeed, containers are really just a cool way of bundling applications with operating systems. You do get enhanced portability capabilities with containers, but you don’t get the “any platform to any platform” portability that many believe containers provide. Of course, enterprises want portability. And you can have it. All that’s needed is a greater planning effort when it comes to creating the applications in the first place. The fact is that all applications are portable if you have enough time and money.


No-collar workforce: collaborating in roles and new talent models


For HR organizations in particular, this trend raises a number of fundamental questions. For example, how can companies approach performance management when the workforce includes bots and virtual workers? What about onboarding or retiring non-human workers? These are not theoretical questions. One critical dimension of the no-collar workforce trend involves creating an HR equivalent to support mechanical members of the worker cohort. Given how entrenched traditional work, career, and HR models are, reorganizing and reskilling workers around automation will likely be challenging. It will require new ways of thinking about jobs, enterprise culture, technology, and, most importantly, people. Even with these challenges, the no-collar trend introduces opportunities that may be too promising to ignore. What if by augmenting a human’s performance, you could raise their productivity on the same scale that we have driven productivity in technology?


What is the impact and likelihood of global risks?

“Cyber-attacks are increasing and have become a global concern as many systems and devices that run critical infrastructure and decision making are now connected through the worldwide web. Public and private companies have become more vulnerable to cyber-attacks as established IT security controls are now failing to protect the current systems. As a result, cyber-attacks have been deemed one of the greatest threats and concerns to eight global economies – the USA, Germany, Estonia, Japan, Holland, Switzerland, Singapore, and Malaysia,” he noted. “This means that it is highly important that cyber-attacks become an urgent boardroom debate; they are no longer an IT problem, but a whole-company problem, and everyone is now responsible for cybersecurity. Cyber risks put regulatory frameworks under pressure as they try to adapt to these new high-frequency and high-risk economic threats.”



Quote for the day:


"Knowledge is the new capital, but it's worthless unless it's accessible, communicated, and enhanced." -- Hamilton Beazley


Daily Tech Digest - January 18, 2018

Understanding Supervised, Unsupervised, and Reinforcement Learning


With supervised learning, you feed the output of your algorithm into the system. This means that in supervised learning, the machine already knows the output of the algorithm before it starts working on it or learning it. A basic example of this concept would be a student learning a course from an instructor. The student knows what he/she is learning from the course. With the output of the algorithm known, all that a system needs to do is to work out the steps or process needed to get from the input to the output. The algorithm is taught through a training data set that guides the machine. If the process goes haywire and the algorithm comes up with results completely different from what should be expected, then the training data does its part to guide the algorithm back towards the right path. Supervised machine learning currently makes up most of the ML being used by systems across the world. The input variable (x) is used to connect with the output variable (y) through the use of an algorithm.
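The x-to-y mapping described above can be shown with the simplest possible supervised learner: fit a line y = a·x + b to labeled examples by ordinary least squares, then predict on an input the model never saw. The numbers are made up for the sketch.

```python
# Learn the mapping y = a*x + b from labeled (x, y) training pairs.

def fit_line(xs, ys):
    """Ordinary least squares for a single input variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

xs, ys = [1, 2, 3, 4], [3, 5, 7, 9]   # training set generated by y = 2x + 1
a, b = fit_line(xs, ys)
print(round(a), round(b))             # recovered slope and intercept: 2 1
print(round(a * 10 + b))              # prediction for unseen x = 10 -> 21
```

The "training data guiding the machine" here is just four labeled pairs, and the "model" is two numbers, but the workflow is the same one that scales up to deep networks.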



Shift your Java applications into containers with Jelastic PaaS

System containers offer multiple benefits when migrating an existing legacy application. IP addresses, hostnames, and locally stored data can survive container downtimes, there’s no need for port mapping, and you gain a far better isolation and virtualization of resources. Plus you get compatibility with SSH-based config tools and even hibernation and live migration of the memory state. The only perceptible disadvantage compared to application containers might be a slower start-up time as system containers are a bit heavier due to the additional services required for running multiple processes. In Jelastic, it is possible to run both application and system containers. In contrast to other PaaS vendors that use the so-called Twelve-Factor App methodology, Jelastic does not force customers to use any specific approach or application design in order to deploy cloud-native microservices and legacy monoliths.


How machine learning can be used to write more secure computer programs

By transforming software code into a graph, you can actually extract different properties from that code by analyzing the graph. ... Let’s take a smaller function that might have one IF block. One of the graph structures that’s first generated is called an abstract syntax tree. That’s a tree that you’d get by just parsing the code. ... For each IF and for each variable, for each statement, there’s going to be a node. For each operator, like if there’s an assignment, there’s also going to be a node, and they are all connected by edges. You soon run into a lot of nodes and edges. If you take something like, let’s say, the Linux kernel, you’ll have several hundreds of thousands of nodes. ... You can do a lot by essentially solving reachability problems in these graphs.
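Python's own `ast` module makes the code-as-graph idea concrete: parse a small function with one `if` block, then enumerate its nodes and parent-child edges. The sample function is invented; the bookkeeping mirrors the abstract syntax tree described above.

```python
# Turn a tiny function into an abstract syntax tree and count its
# nodes and edges -- even trivial code yields a surprisingly large graph.
import ast

code = """
def f(x):
    if x > 0:
        y = x + 1
        return y
    return 0
"""

tree = ast.parse(code)
nodes = [type(n).__name__ for n in ast.walk(tree)]
edges = [(type(p).__name__, type(c).__name__)
         for p in ast.walk(tree) for c in ast.iter_child_nodes(p)]

print(len(nodes), len(edges))  # the AST is a tree: edges == nodes - 1
```

Scaling this up to something like the Linux kernel produces the hundreds of thousands of nodes mentioned above, which is why the analysis turns into graph problems such as reachability.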


Cheap Raspberry Pi alternatives: 20 computers that cost less than the Pi 3


On its release in 2012, the $35 Raspberry Pi showed just how much computer you could get for a bargain-basement price. But the cost of single-board computers has just kept dropping, with the Raspberry Pi Foundation releasing the tiny Pi Zero for just $5. Today the Zero is one of several computers with a single-digit price tag, and if you're looking for an as-cheap-as-chips board you're spoiled for choice. These are the single-board computers that you can pick up for less than the price of the $35 Pi. One thing to bear in mind is that the cheapest offerings lack many of the features of the Raspberry Pi 3 Model B, and have more in common with the $5/$10 Raspberry Pi Zero. Even the more expensive boards are at somewhat of a disadvantage compared to the Pi range, lacking its breadth of stable software, tutorials and community support.


Server vendors push flex pricing to challenge cloud providers

For many customers who are just starting to use cloud services, these plans offer a means of making a graceful transition. “Customers want the flexibility to start with a certain capacity and scale as needed,” he said. The success of public cloud services is forcing the hand of vendors to compete by addressing some of the shortcomings of cloud. “Our advice back to these vendors is the time is right because more and more companies are putting more workloads on public cloud that require more storage,” said Stanley Stevens, an analyst with Technology Business Research. “They are getting sticker shock because when they replicated their environment in the public cloud, they realized a lot of that storage is inefficient and unused.” The real cost of the cloud is not the workload, but moving data around, he said. Cloud providers like Amazon and Microsoft charge you for data sent up to their data center, storage, processing, and data sent back down to you.


IT sabotage: Identifying and preventing insider threats


Insiders that commit IT sabotage are technically competent users who have the access and ability to carry out an attack, as well as the capability to conceal their illicit activities. These characteristics make detecting this kind of insider IT sabotage very difficult, as malicious behavior rarely looks any different than normal behavior. ... However, in nearly every IT insider sabotage attack, distinct patterns have been discovered, and the detection of these patterns can help identify malicious insider activities. The CERT Insider Threat Center has been working for more than 15 years cataloging, analyzing and detecting patterns of malicious insider behavior in order to understand who commits insider attacks, why they do it, when and where they do it, and how they carry out their attacks.


Next-gen Mirai botnet targets cryptocurrency mining operations


Satori.Coin.Robber works “primarily on the Claymore Mining equipment that allows management actions on port 3333 with no password authentication enabled (which is the default config),” the researchers said. “To prevent potential abuse, we will not discuss details.” Analysis of the botnet code revealed similarities with the original Satori, including similar code structures, encrypted configurations, similar configuration strings, and the same payload. However, the new variant also comes with a payload targeting the Claymore Miner that features an asynchronous network connection method and enables a new set of command-and-control communication protocols. Researchers noted that the author behind Satori.Coin.Robber has claimed the code is not malicious, and has even left an email address behind.


Convolutional neural networks for language tasks

Notice how the CNN processes the input as a complete sentence, rather than word by word as we did with the LSTM. For our CNN, we pass a tensor with all word indices in our sentence to our embedding lookup and get back the matrix for our sentence that will be used as the input to our network. Now that we have our embedded representation of our input sentence, we build our convolutional layers. In our CNN, we will use one-dimensional convolutions, as opposed to the two-dimensional convolutions typically used on vision tasks. Instead of defining a height and a width for our filters, we will only define a height, and the width will always be the embedding dimension. This makes sense intuitively, when compared to how images are represented in CNNs. When we deal with images, each pixel is a unit for analysis, and these pixels exist in both dimensions of our input image.
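A minimal NumPy sketch of the one-dimensional convolution described above: the filter spans the full embedding dimension, so it slides only along the word axis. The sentence length, embedding size, filter height and random values are all placeholders.

```python
# 1D "valid" convolution over an embedded sentence: the filter is as
# wide as the embedding, so it only moves along the word dimension.
import numpy as np

np.random.seed(0)
sentence_len, embed_dim, filter_height = 7, 4, 3
sentence = np.random.randn(sentence_len, embed_dim)  # one embedded sentence
filt = np.random.randn(filter_height, embed_dim)     # spans full embedding

out = np.array([np.sum(sentence[i:i + filter_height] * filt)
                for i in range(sentence_len - filter_height + 1)])
print(out.shape)  # (5,) -- one activation per position of the filter
```

Contrast this with a 2D image convolution, where the filter slides along both height and width; here there is nothing to slide over in the embedding direction.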


Configuration errors in Intel workstations being labeled a security hole

Normally computers with AMT have a BIOS password to prevent making low-level changes, but due to insecure defaults in the BIOS and AMT’s BIOS extension (MEBx) configuration, an attacker with physical access can log in using the default password “admin.” Given the bad security habits of many people, there’s a good chance this default password was not changed. By changing the default password, enabling remote access and setting AMT’s user opt-in to “None,” the attacker has now backdoored the machine and can gain access to the system remotely, assuming the attacker is on the same network as the target machine. Intel says this is a problem in how the machine is configured by the OEM. Its recommendation is that MEBx access be gated by the BIOS password and has said so since 2015. What F-Secure found is that some system manufacturers were not requiring a BIOS password to access MEBx. So it updated its guidance for proper AMT/MEBx security in December.


The role of the data curator: Make data scientists more productive

The data curator has a good understanding of the types of systems that store the data, and the types of tools that can be used for processing the data, even if they are not practitioners of these technologies themselves. They have up-to-date knowledge about datasets, their provenance, and what data curation is needed. They also understand the different types of analysis that need to be performed on specific datasets, as well as the expectations in terms of latency and availability set by diverse business users. By working with data engineers, data custodians, data analysts, and data scientists, the data curator develops a deep understanding of how data is used by the business, and how IT applies technology to make the data available. Data curators are making data analysts and data scientists more productive by allowing them to focus on what they do best.



Quote for the day:


"It is impossible to defeat an ignorant man in an argument." -- William G. Mcadoo


Daily Tech Digest - January 17, 2018

The Neuroscience of Intelligence: An Interview with Richard Haier


Neuroscience approaches have already made intelligence research more mainstream and ready for inclusion in policy discussions. For example, the single most important factor that predicts school success, by far, is the student’s intelligence. Socioeconomic status, family resources, and school and teacher quality all pale in comparison. The data showing this is overwhelming. Yet, the word “intelligence” is virtually absent from all discussions about education policies in the United States, and many other countries. Even if intelligence is mostly influenced by genes, all that means for education is that each student comes to school with a different set of strengths for learning. Teachers all know this and the common goal is to maximize each student’s potential. Attempts to create policies to do this without paying attention to what we know about intelligence have failed for decades, especially with respect to closing achievement gaps.



Why Your Data Could Be At Risk Without Decentralized Computing

According to industry experts, it will take decades for CPUs to be properly redesigned to resolve these issues and replaced. What should the world do to protect itself in the meantime? The answer is decentralization. This is a form of “trustless” computing that assumes from the start that no single machine can be relied upon, instead spreading information out across many different computers or “nodes.” In this framework, even though each individual entity has the potential to be compromised, the decentralized collective will always perform the work safely and correctly. Bitcoin, Ethereum, and blockchain technology in general offer notable examples of decentralized computing. Decentralization achieves two goals. First, no single machine is making all the decisions, so no single machine can unilaterally make bad decisions that affect individual users.
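A toy illustration of the "no single machine decides" point: poll several nodes and accept only a strict-majority answer, so one compromised node cannot change the outcome on its own. The node answers are invented for the sketch.

```python
# Majority consensus among untrusted nodes: no single answer wins
# unless more than half the nodes agree on it.
from collections import Counter

def consensus(answers):
    """Return the strict-majority answer, or None if there isn't one."""
    winner, votes = Counter(answers).most_common(1)[0]
    return winner if votes > len(answers) / 2 else None

honest = ["valid"] * 4
compromised = ["forged"]                 # one bad node out of five
print(consensus(honest + compromised))   # the lone bad node is outvoted
```

Real blockchain consensus (proof of work, BFT protocols) is far more involved, but the core property is this one: correctness degrades only when a large fraction of nodes misbehave, not when any single machine does.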


5 Ways SD-WAN Equips Enterprises to Improve Network Security


While the headlines have been alarming, overall industry trends are mixed. According to a recent report by the Ponemon Institute, the average cost of a data breach dropped by about 10 percent to $3.62 million in 2017. This is most likely tied to a reduction in the cost per record stolen, which declined from $158 in 2016 to $141 in 2017. However, the average size of data breaches rose 1.8 percent to more than 24,000 records. Clearly, this is not the time for enterprises to neglect network security. With the rapid expansion of the cloud, followed by what is likely to be an equally rapid move to the Internet of Things, wide-area infrastructure is in need of more flexible and robust protection. One of the most significant enhancements in this field is the advent of the software-defined wide-area network (SD-WAN). By abstracting regional connectivity on top of underlying hardware, enterprises can experience a number of benefits over traditional hardware-centric architectures.


6 things that prevent Blockchain from ruling the world

Generally speaking, the internet is fairly efficient when it comes to the transmission of data. The user requests information, and the server transmits back the piece of data requested with only a small amount of additional data required to get it there. The blockchain, however, in order to be preserved and to resist tampering, needs multiple copies distributed across many nodes. The blockchain therefore requires a large amount of storage – for example, Bitcoin’s blockchain was nearly 150GB in size as of last month, and it’s getting bigger all the time. Furthermore, transmitting so much data for the blockchain each time also consumes additional electricity, making the blockchain quite inefficient. At a time when efforts are being made to compress video further to decrease the data required for a download, blockchain’s bulkiness makes little sense.


Financial savings just the beginning for CIOs who understand code quality


It is not just about cutting costs, but improving development productivity and code quality. In the past year, NCOI has fed code into the Cast system four times, but is moving to a contract to enable it to do so monthly to keep up with more regular software updates. “This is so we can refresh our portal every month,” said van Eeden. Ironically, since using the Cast system NCOI has been using more developers because it is doing more development. “For our core ERP application, we have doubled software development productivity,” said van Eeden. “My output doubled, and the quality in the sense of downtime and the number of bugs also improved dramatically.” Van Eeden said he knows there have been no software outages since the company has been using the software intelligence platform, whereas previously it “didn’t even look at the robustness of systems”.


The role of trust in security: Building relationships with management and employees

In reality, security processes must constantly evolve based on discussions between the chief security officer, management, and employees in every business unit, accounting for emerging risks, new technologies, and recently uncovered vulnerabilities. Chief security officers need to first and foremost ensure that a solid understanding exists between the security team and the business units. There is no way that anyone could understand the nuances of a business unit’s capabilities, processes, assets, and services to the extent the unit itself does, so it is tremendously important for a chief security officer to meet with each unit and develop a comprehensive security plan, which is aligned on the corporate level. Only by gaining a more complete understanding of the unique needs of a business unit can a chief security officer develop safeguards that reduce risks.


Demystifying DynamoDB Streams


In order to build something even as simple as a master-slave replication, there are several primitives to understand. The first and foremost is ordering. Imagine if two transactions were to be applied sequentially to a database — the first writes a new entry and the second deletes this entry, which ultimately results in no data persisting in the database — but if the ordering is not guaranteed, the delete transaction could be processed first (causing no effect) and then the write transaction applied, which results in data incorrectly persisting in the database. The second core primitive is duplication: each single transaction should appear exactly once within the log. Failure to enforce ordering or prevent duplication within a log can result in the master and slave becoming inconsistent. ... There are multiple strategies to checkpointing, each of which is a trade-off between specificity and throughput.
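A minimal sketch of those two primitives (sorting by sequence number for ordering, and tracking seen sequence numbers to suppress duplicates) might look like the following; the record format and function name are assumptions for illustration, not the DynamoDB Streams API:

```python
def apply_log(replica, records, seen):
    """Apply a stream of (seq, op, key, value) records to a replica dict.

    Records are sorted by sequence number to enforce ordering, and
    sequence numbers already seen are skipped so each transaction is
    applied exactly once.
    """
    for seq, op, key, value in sorted(records, key=lambda r: r[0]):
        if seq in seen:          # duplicate delivery -- skip it
            continue
        seen.add(seq)
        if op == "put":
            replica[key] = value
        elif op == "delete":
            replica.pop(key, None)
    return replica

# Out-of-order, duplicated delivery of: put x, then delete x.
records = [(2, "delete", "x", None), (1, "put", "x", 42), (2, "delete", "x", None)]
print(apply_log({}, records, set()))  # -> {} : the delete correctly wins
```

Without the sort, the delete could be applied before the put and the stale value would persist; without the `seen` set, a redelivered record could be applied twice.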


How AI Would Have Caught the Forever 21 Breach

As a first step, we must recognize that the days of the desktop/server model are over. In the case of Forever 21, the POS devices served as ground zero — not a laptop, a server, or even a corporate printer. In the age of the Internet of Things, we increasingly rely on "nontraditional" devices to optimize efficiency and boost productivity. But what constitutes a nontraditional device, and how do we look for it? Is it a device without a monitor? A device without a keyboard? Today a nontraditional device could be anything from heating and cooling systems to Internet-connected coffee machines to a rogue Raspberry Pi hidden underneath the floorboards. Protecting registered corporate devices is not enough — criminals will look for the weakest link. As our businesses grow in digital complexity, we have to monitor the entire infrastructure, including the physical network, virtual and cloud environments, and nontraditional IT, to ensure we can spot irregularities as they emerge.


What is identity management? IAM definition, uses, and solutions

Compromised user credentials often serve as an entry point into an organization’s network and its information assets. Enterprises use identity management to safeguard their information assets against the rising threats of ransomware, criminal hacking, phishing and other malware attacks. Global ransomware damage costs alone are expected to exceed $5 billion this year, up 15 percent from 2016, Cybersecurity Ventures predicted. In many organizations, users sometimes have more access privileges than necessary. A robust IAM system can add an important layer of protection by ensuring a consistent application of user access rules and policies across an organization.  Identity and access management systems can enhance business productivity. The systems’ central management capabilities can reduce the complexity and cost of safeguarding user credentials and access.
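As a sketch of that "consistent application of user access rules" idea, a role-based check can be as simple as the following; the role names and permission strings are hypothetical examples, not any vendor's schema:

```python
# Hypothetical role -> permission map illustrating least privilege.
ROLES = {
    "analyst": {"reports:read"},
    "admin": {"reports:read", "reports:write", "users:manage"},
}

def is_allowed(user_roles, permission):
    """Grant access only if some assigned role explicitly carries the permission."""
    return any(permission in ROLES.get(role, set()) for role in user_roles)

print(is_allowed(["analyst"], "reports:read"))   # -> True
print(is_allowed(["analyst"], "users:manage"))   # -> False
```

Centralizing the check in one function is the point: every part of the system applies the same rules, so no user silently accumulates more access than their roles grant.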


Mental Models & Security: Thinking Like a Hacker

Although we cannot predict the future with great certainty, we often subconsciously make decisions based on probabilities. For example, when crossing the road, we believe there's a low risk of being hit by a car. The risk exists, but if you've looked for traffic, you are confident that you can cross. The Bayesian method says that one should consider all prior relevant probabilities and then incrementally update them as newer information arrives. This method is especially productive given the fundamentally nondeterministic world we experience: we must use both prior odds and new information to arrive at our best decisions. While there may not be a simple answer to what it means to "think like a hacker," the use of mental models to build frameworks of thought can help avoid the pitfalls associated with approaching every problem from the same angle.
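The incremental Bayesian updating described above can be made concrete with a small example; the alert rates below are invented numbers chosen only to show how new evidence shifts a prior:

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return P(H | evidence) from P(H), P(evidence | H), and P(evidence | not H)."""
    numerator = likelihood_h * prior
    return numerator / (numerator + likelihood_not_h * (1 - prior))

# Prior belief a host is compromised: 1%. Suppose an IDS alert fires 90% of
# the time on compromised hosts but also 5% of the time on clean ones.
p = bayes_update(prior=0.01, likelihood_h=0.90, likelihood_not_h=0.05)
print(round(p, 3))  # -> 0.154

# A second, independent alert arrives: update incrementally from the new prior.
p = bayes_update(prior=p, likelihood_h=0.90, likelihood_not_h=0.05)
print(round(p, 3))  # -> 0.766
```

One alert alone leaves the odds of compromise modest; it is the accumulation of evidence, each update feeding the next, that drives the belief toward certainty.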



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer


Daily Tech Digest - January 16, 2018

Data Management 2018
A lot is happening in the market today, as data continues to explode at unprecedented levels. Compliance is no longer an option, but a requirement. The push to cut costs and embrace the multi-cloud – yet still maintain visibility of your data – has never been more critical. At the same time, the need to safeguard against data breaches is an absolute must. And the necessity to gather as many insights from your data as possible could be the difference between success and failure for many organisations. So, what does all of this mean for today’s CIOs and IT decision makers? Here are the top five predictions for the coming year. For those who see challenges as opportunities, this could be an exciting year for you. ... New data valuation techniques are expected to get a boost from AI, reshaping information lifecycle management through the automation of policy enforcement and more intelligent data management actions.


What is Zero Trust? A model for more effective security

The Zero Trust model of information security basically kicks to the curb the old castle-and-moat mentality that had organizations focused on defending their perimeters while assuming everything already inside didn’t pose a threat and therefore was cleared for access. Security and technology experts say the castle-and-moat approach isn’t working. They point to the fact that some of the most egregious data breaches happened because hackers, once they gained access inside corporate firewalls, were able to move through internal systems without much resistance. “One of the inherent problems we have in IT is we let too many things run way too openly with too many default connections. We essentially trust way too much,” Cunningham says. “That’s why the internet took off – because everyone could share everything all the time. But it’s also a key fail point: If you trust everything, then you don’t have a chance of changing anything security wise.”


Microsoft Stresses Security, Responsible AI in Cloud Policy Updates


In the 2018 update, Microsoft is tackling some of the negative consequences of using AI and other technologies based on the tumultuous year the IT industry experienced in 2017. "We continue to witness cyber-attacks by nation-states on citizens, critical infrastructure and the institutions of democracy. We read on an almost daily basis about the criminal hacking of companies and governments to steal private and sensitive information of customers," wrote Smith in the 2018 update to the e-book. "We listen to the concerns about the loss of jobs to automation and the disruptive impact of artificial intelligence (AI) on entire sectors of the economy." In terms of cyber-security, Microsoft continues to honor its commitment to spend $1 billion in the IT security field each year. If necessary, the company is poised to use legal means to disrupt nation-state attacks.


Cloud computing: Three strategies for making the most of on-demand

"It's still complicated," says Marks, before suggesting more organisations will continue to move from an exploratory stage through to full adoption. "The difference today is that the cloud is understood for its different dimensions, be that at the level of the infrastructure, the platform or services. The migration of applications and data from the server downstairs to the public cloud is a shift that continues. The key point is that it's hard to think of a sensible reason why a CIO would buy hardware ever again -- and that should be your starting point." ZDNet speaks with three IT leaders at different stages of the cloud adoption process: exploring, transforming, and pioneering. Evidence from these three stages suggests cloud-led change remains a work in progress, where smart IT leaders assess their business context and provide an on-demand solution that can flex with future requirements.


Enterprise software spending set to grow thanks to AI and digital boost


“Looking at some of the key areas driving spending over the next few years, Gartner forecasts $2.9tn in new business value opportunities attributable to AI by 2021, as well as the ability to recover 6.2 billion hours of worker productivity,” he said. “Capturing the potential business value will require spending, especially when seeking the more near-term cost savings. “Spending on AI for customer experience and revenue generation is likely to benefit from AI being a force multiplier – the cost to implement will be exceeded by the positive network effects and resulting increase in revenue.” Gartner forecast a slight increase of 0.6% in datacentre spending in 2018 compared with 2017, but predicted a decline of 0.2% in 2019. As Computer Weekly has reported previously, this may be related to the increase in SaaS and cloud-based services.


Lessons in Becoming an Effective Data Scientist

The first skill that I look for when engaging with or hiring a data scientist is humility. I look for the ability to listen and engage with others who may not seem as smart as they are. And as you can see from our DEPP methodology, humility is the key to driving collaboration between the business stakeholders (who will never understand data science to the level that a data scientist does) and the data scientist. Humility is critical to our DEPP methodology because you can’t learn what’s important for the business if you aren’t willing to acknowledge that you might not know everything. Humility is one of the secrets to effective collaboration. Nowhere does business/data science collaboration play a more important role than in hypothesis development. If you get the hypothesis and the metrics against which you are going to measure success wrong, everything the data scientist does to support that hypothesis doesn’t matter.


7 Acquisitions that Point to Cloud Maturity


Over the better part of a decade, cloud computing mergers and acquisitions painted a picture in which cloud service providers were on the hunt to accumulate as many customers as possible. Thus, you saw massive build-outs of facilities and a haphazard set of mergers and acquisitions that had no real rhyme or reason. We also witnessed incredible price wars when it came to commodity cloud resources in the IaaS and PaaS space. In a nutshell, the cloud service provider market has always been about growth by any means necessary. If you contrast previous years' cloud acquisitions with those that occurred in the latter part of 2017 and into 2018, we start to see a new pattern forming. Sure, there are still signs of significant growth in the cloud space for those looking for bleeding-edge services. Yet at the same time, you see a trend toward stability, with cloud acquisition dollars trending toward following what the customer wants in a cloud service -- as opposed to the other way around.


As the cloud’s popularity grows, so does the risk to sensitive data

Despite the prevalence of cloud usage, the study found that there is a gap in awareness within businesses about the services being used. Only a quarter (25%) of IT and IT security practitioners revealed they are very confident they know all the cloud services their business is using, with a third (31%) confident they know. Looking more closely, shadow IT may be continuing to cause challenges. Over half of Australian (61%), Brazilian (59%) and British (56%) organizations are not confident they know all the cloud computing apps, platform or infrastructure services their organization is using. Confidence is higher elsewhere, with only around a quarter in Germany (27%), Japan (27%) and France (25%) not confident. Fortunately, the vast majority (81%) believe that having the ability to use strong authentication methods to access data and applications in the cloud is essential or very important.


Big Data 2018: 4 Reasons To Be Excited, 4 Reasons To Be Worried

Figure 1. TensorFlow Playground offers an interactive sandbox for exploring the foundations of TensorFlow. (Source: Google)
Machine-learning models can accurately perform recognition of specific patterns in data streams. In environments already inundated with data, this capability provides high value and distinct advantages, and the industry has responded accordingly. Data scientists can take advantage of a growing number of open-source machine-learning frameworks including Google’s TensorFlow, Apache MXNet, Facebook Caffe2, and Microsoft Cognitive Toolkit, among others. Most important, the task of building models has never been easier. For example, Amazon Web Services (AWS) offers deep learning AMIs (Amazon Machine Images) with the leading ML frameworks already built in and ready for use on the AWS cloud. For those just starting, Google’s TensorFlow Playground helps users learn more about the neural networks underlying machine learning frameworks, using simple data sets and pre-trained models.
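The kind of single-neuron network a sandbox like TensorFlow Playground visualizes can even be sketched in plain Python, with no framework at all; the toy data set (logical AND) and the hyperparameters here are illustrative choices, not anything prescribed by those tools:

```python
import math
import random

def train_neuron(data, epochs=2000, lr=0.5, seed=0):
    """Train one sigmoid neuron on (x1, x2) -> label rows with gradient descent."""
    random.seed(seed)
    w1, w2, b = random.random(), random.random(), random.random()
    for _ in range(epochs):
        for x1, x2, y in data:
            out = 1 / (1 + math.exp(-(w1 * x1 + w2 * x2 + b)))
            grad = out - y                 # cross-entropy gradient w.r.t. pre-activation
            w1 -= lr * grad * x1
            w2 -= lr * grad * x2
            b -= lr * grad
    # Return the trained neuron as a prediction function.
    return lambda a, c: 1 / (1 + math.exp(-(w1 * a + w2 * c + b)))

# Learn logical AND, one of the simple, linearly separable toy data sets:
net = train_neuron([(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)])
print([round(net(a, b)) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # -> [0, 0, 0, 1]
```

Stacking many such neurons into layers, and automating exactly this gradient arithmetic at scale, is what the full frameworks provide.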


Container infrastructure a silver lining amid Intel CPU flaw fixes

Meltdown and Spectre loom over containers
"Most folks running containers have something like [Apache] Mesos or Kubernetes, and that makes it easy to do rolling upgrades on the infrastructure underneath," said Andy Domeier, director of technology operations at SPS Commerce, a communications network for supply chain and logistics businesses based in Minneapolis. SPS uses Mesos for container orchestration, but it is evaluating Kubernetes, as well. Containers are often used with immutable infrastructures, which can be stood up and torn down at will and present an ideal means to handle the infrastructure changes on the way, due to these specific Intel CPU flaws or unforeseen future events. "It really hammers home the case for immutability," said Carmen DeArdo, technology director responsible for the software delivery pipeline at Nationwide Mutual Insurance Co. in Columbus, Ohio.



Quote for the day:


"When we lead from the heart, we don't need to work on being authentic we just are!" -- Gordon Tredgold


Daily Tech Digest - January 15, 2018

Blockchain Company Wants to Create Alternative Decentralized Digital Economy

One recently proposed solution to this is Pocketinns, which aims to disrupt this space by acting as a collection of marketplaces. Nearly anything you can think of would be available on the platform – its goal is to turn all the current monopolies on their heads by providing a safer, more secure alternative platform offering the same quality promised by these giant corporations. The company already has a home sharing and vacation rental marketplace active and live in Europe with 50,000 properties, is looking at adding another 250,000 properties in the next few months, and all of this happens at zero percent commission. Pocketinns looks to follow the monthly subscription model by offering multiple services on one single platform. In addition, the future vision includes building an internal financial network to support internal transactions, including payment processors, remittances, banking, etc.


20 years on, open source hasn’t changed the world as promised
This chicken-and-egg conundrum is starting to resolve itself, thanks to the forward-looking efforts of Google, Facebook, Amazon, and other web giants that are demonstrating the value of open-sourcing code. Although it’s unlikely that a State Farm or Chevron will ever participate in the same way as a Microsoft, we are starting to see companies like Bloomberg and Capital One get involved in open source in ways they never would have considered back when the term “open source” was coined in 1998, much less in 2007. It’s a start. Let’s also not forget that although we have seen companies use more open source code over the past 20 years, the biggest win for open source since its inception is how it has changed the narrative of how innovation happens in software. We’re starting to believe, and for good reason, that the best, most innovative software is open source.


Why you’ll fire Siri and do the job yourself

ObEN’s PAI approach is one answer to the question of how virtual assistants with agency might function. We’ve assumed for years that virtual assistants will do more than just answer our questions, which is mostly what they do today. Future virtual assistants should buy things, negotiate fees, automatically remind co-workers of their deadlines and more. Consider Amy, the x.ai virtual assistant. Amy is A.I. that interacts via email and schedules meetings. Amy has a personality and can make decisions in an email conversation, such as negotiating available meeting times with the participants. Amy is a virtual person, and many people who encounter Amy assume they’re interacting with a real human. If our virtual assistants are to be “personalities” like Amy, they could also be virtual representations of ourselves. This approach is actually more transparent than the A.I. that’s currently used.


Safeguarding your biggest cybersecurity target: Executives

“Executives need to internalize that they are targets,” says Bill Thirsk, vice president of IT and CIO at Marist College. “Cyber attackers take time to watch, plan, practice, hone, and harden their art before going after a high-value target. Attackers have the luxury of stealth, time, duplicity, and multiple platforms for designated random attacks — all of which work against normal human behavior, curiosity, and the need for connectedness.” An executive’s “digital footprint” needs to be understood and gaps must be closed as a matter of practice, Thirsk says. Social accounts should be registered, confirmed, and monitored, he says. But getting executives to buy into protection is a challenge. “Every statistic I’ve seen shows that executives are the least likely to adhere to policies that they expect everyone else to follow,” says Paul Boulanger, vice president and chief security consultant at SoCal Privacy Consultants. “In part, this is because they are the people most willing to sacrifice security for convenience.”


IT service management effectiveness hampered by lack of metrics


According to the study, the increasing demand placed on IT operations is resulting in teams taking on more work than they can handle. Axelos found that this could be having a negative effect on their reputation. “Despite struggling to keep up with demand and working beyond realistic expectations, they are still perceived as delivering poor performance,” the report stated. IT operations and development teams said they wanted to eliminate inefficient practices. The study found that 55% of ITSM professionals who took part in the survey showed an interest in identifying and eliminating wasteful work through the use of continuous service improvement, DevOps and agile practices. Axelos found that larger organisations tend to recognise lack of visibility as a problem, while smaller organisations struggle more with inefficient processes and understanding customer needs.


AI Begins to Infiltrate the Enterprise

The data that feeds AI systems can also present obstacles. "The gathering and curation of data is a key challenge," said Patience. "We see that over and over again, where either organizations don't have enough data, or they have it and can't get access to it.” Then there are the problems with the technology itself. While AI research has advanced incredibly quickly in recent years, we still don't have a general artificial intelligence that truly thinks and learns the way humans do. As a result, human interactions with AI are sometimes less than satisfactory. "The 'klutziness,' if you will, of a computer itself is a serious challenge," Hadley said, adding, "The opportunities for mistakes and disasters from the point of view of the customer experience are much more likely." That leads to a bigger issue: trust. "The overarching issue in the whole development of the field is do people trust the results that they get out of a machine?" Reynolds said.


Spectre and Meltdown explained: What they are, how they work, what's at risk

The problem arises because the protected data is stored in CPU cache even if the process never receives permission to access it. And because CPU cache memory can be accessed more quickly than regular memory, the process can attempt to access certain memory locations to find out if the data there has been cached — it still won't be able to access the data, but if the data has been cached, its attempt to read it will be rejected much more quickly than it otherwise would. Think of it as knocking on a box to see if it's hollow. Because of the way computer memory works, just knowing the addresses where data is stored can help you deduce what the data is. ... Spectre and Meltdown both open up possibilities for dangerous attacks. For instance, JavaScript code on a website could use Spectre to trick a web browser into revealing user and password information. Attackers could exploit Meltdown to view data owned by other users and even other virtual servers hosted on the same hardware, which is potentially disastrous for cloud computing hosts.
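The timing side channel described above can be illustrated with a toy model; the latencies and array names below are invented, and a real attack measures actual hardware timings rather than simulated ones:

```python
# Toy model: the attacker cannot read secret memory, but can time accesses.
CACHE_HIT_NS, CACHE_MISS_NS = 10, 100   # invented, illustrative latencies

def probe(address, cached_addresses):
    """Return the simulated time to touch an address. The access is denied
    either way, but a cached address 'fails fast'."""
    return CACHE_HIT_NS if address in cached_addresses else CACHE_MISS_NS

# Speculative execution has already cached one entry of attacker_array,
# indexed by the secret byte; timing every slot reveals which one.
secret = 7
cached = {f"attacker_array[{secret}]"}
timings = [probe(f"attacker_array[{i}]", cached) for i in range(16)]
recovered = timings.index(min(timings))  # the fast slot betrays the secret
print(recovered)  # -> 7
```

This is the "knocking on a box" idea in code: no protected data is ever read directly, yet the pattern of fast and slow responses leaks its value.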


Don't Use a Blockchain Unless You Really Need One

Most CoinDesk readers are probably familiar with the usefulness of decentralization in a monetary context (and if you're not, take a look at recent articles about cryptocurrency adoption in Iran, Venezuela, Russia and, ahem, the alt-right). The neutrality, censorship-resistance and openness of a permissionless network mean it will attract the odious along with the oppressed, and the software doesn't decide which is which. But why is decentralization worthwhile in the data use case? "Today, every piece of content and media you have is living somewhere owned by somebody," Ravikant explained. ... Then the conversation took a turn that I have to admit made me roll my eyes at first. "If someone creates a new Pokemon card game or a Magic the Gathering card game" online, he continued, the characters are "living and owned by a certain company in a certain format. You can't just go and reuse those assets."


IOT Security Needs A White Knight

A lack of quality control and the presence of a host of very old devices on IoT networks might be the most critical security threats, however. Decades-old hardware, which may not have been designed to be connected to the Internet in the first place, let alone stand up to modern-day security threats, creates a serious issue. “You have over 10 billion IoT devices out there already … and a lot of these devices were created in 1992,” noted Sarangan. Moreover, the huge number of companies making IoT-enabled hardware makes for a potentially serious problem where quality control is concerned. Big companies like Amazon and Microsoft and Google make headlines for their smart home gizmos, but the world of IoT is a lot broader than that. China, in particular, is a major source of lower-end IoT devices – speakers, trackers, refrigerators, bike locks and so on – and it’s not just the Huaweis and Xiaomis of the world providing the hardware.


Forget the CES hype, IoT is all about industry

In addition to all the new product previews, this year’s CES is full of summits, seminars, presentations and other sessions devoted to helping consumer products companies make, sell, deploy and monetize everything from smart cars and smart homes to smart cities. But I’m here to tell you that none of that really matters much to the future of the Internet of Things (IoT). Nope, despite the CES hype, the IoT is really all about industrial and business devices, networks and applications. Here’s the thing: The consumer side of IoT is consumed by the faddish and spectacular, not the everyday and useful. Just consider the kind of IoT products that have been featured at CES in previous years: There was the infamous smart toothbrush (expect more of those this year, too), not to mention smart hairbrushes and refrigerators. Not exactly must-haves for most people.



Quote for the day:


"An approximate answer to the right problem is worth a good deal more than an exact answer to an approximate problem." -- John Tukey


Daily Tech Digest - January 14, 2018

Strategy and Innovation Roadmapping Tools

EIRMA Roadmapping View
The ways of doing roadmapping have existed for some time, but supporting software has not. Motorola is credited with the development of roadmapping in the 1970s to support integrated product and technology strategic planning. Unfortunately, many have struggled with drawing and coloring boxes and connecting lines in different tools for over 20 years. This doesn’t have to be the case today. ... Added analytical capabilities in this group of strategy and innovation roadmapping tools may cover techniques such as scenario planning, Delphi, Blue Ocean and more. Such analysis may be disconnected from in-flight efforts; but such analysis may also allow for future plans to be considered in light of current-state results. Analysis techniques for the marketplace are also integral to creating and delivering strategy and innovation in many organisations. The marketplace we’ve profiled in our Market Guide for Strategy and Innovation Roadmapping Tools supports many of these techniques.


NASA Awarded A Grant For Ethereum Blockchain-Related Research

Among the goals of the program are measures to protect NASA vehicles from collisions with space junk orbiting the earth, which can damage or completely incapacitate them, and the processing of highly complex data. At the helm of the research project is Dr. Jin Wei (sometimes credited as Jin Kocsis or Jin Wei Kocsis), an assistant professor with the University of Akron's Department of Electrical and Computer Engineering.  A write-up published by the Collier Report of US Government Spending, which shares a significant amount of language with a project summary ostensibly penned by Wei, describes plans to develop a "data-driven resilient and cognitive networking management architecture." Wei's team will also conduct research into decentralized computing mechanisms that could prove instrumental in processing "the massive amount of high-dimensional data" often collected by NASA spacecraft.



By 2020 83% Of Enterprise Workloads Will Be In The Cloud

Digitally transforming enterprises (63%) is the leading factor driving greater public cloud computing engagement or adoption today. 66% of IT professionals say security is their most significant concern in adopting an enterprise cloud computing strategy. 50% of IT professionals believe artificial intelligence and machine learning are playing a role in cloud computing adoption today, growing to 67% by 2020. Artificial intelligence (AI) and machine learning will be the leading catalyst driving greater cloud computing adoption by 2020. These insights and findings are from LogicMonitor’s Cloud Vision 2020: The Future of the Cloud Study. The survey is based on interviews with approximately 300 influencers LogicMonitor interviewed in November 2017. Respondents include Amazon Web Services (AWS) re:Invent 2017 attendees, industry analysts, media, consultants and vendor strategists. The study’s primary goal is to explore the landscape for cloud services in 2020.


Blockchain: can the law keep up?

The legal implications of a disruptive technology such as blockchain vary as the technology is applied to different sectors and applications. Some of the key considerations are as follows: ... Distributed ledger technology is, just as described, distributed. There is no fixed location of a transaction, a registry or an application. It is therefore critical that the parties to any arrangement involving this technology have expressly agreed and recorded the jurisdiction and governing law which is to apply to the arrangement. Some jurisdictions are starting to address the legal and regulatory matters around blockchain. ... Contractual and legal issues must be seen from a different angle with blockchain technologies. How are service levels and performance defined? What is the liability position? In particular, the enforceability of an arrangement involving blockchain should be considered carefully.


The Future of Humans - Intersection of HR & AI


Artificial intelligence is transforming our lives at home and at work. At home, you may be one of the 1.8 million people who use Amazon's Alexa to control the lights, unlock your car, and receive the latest stock quotes for the companies in your portfolio. In total, Alexa is touted as having more than 3,000 skills, a number growing daily. In the workplace, artificial intelligence is evolving into an intelligent assistant that helps us work smarter. Artificial intelligence is not the future of the workplace; it is the present, happening today. The time is not far off when AI will contribute to every business function, making transactions effective and efficient. Human Resources will not be able to stay away from it for long, and HR professionals should embrace this change gracefully and make the most of it. Artificial intelligence is all about analysing, breaking down and transforming data into a humanized format that is easy to interpret and study. A good example of AI is the suggestions and predictions we get from our smartphones without having to ask for them.


Why the Organisation of the Tomorrow is a Data Organisation

A car company should no longer see itself as a car manufacturer, but as a software company that is in the business of moving people from A to B. It should look at how it can do so in the most reliable, comfortable and safe way. Whether it produces cars or self-flying taxis, or develops an Uber-like app, then becomes an open question. The same goes for, say, a bank: it is not a financial institution, but a data company that enables people to store money and make transactions safely. Whether it does so using a cryptocurrency or as a mobile-only bank is, again, an open question. Nowadays, any company, regardless of industry, should see itself as a data company. Seeing an organisation as a data company allows you to remove any inhibitors that prevent the business from delivering its product or service in the most efficient, effective and customer-friendly way.


Machine Learning's Greatest Potential Is Driving Revenue In The Enterprise


These and many other insights are from the recently published study, Global CIO Point of View. The entire report is downloadable here (PDF, 24 pp., no opt-in). ServiceNow and Oxford Economics collaborated on this survey of 500 CIOs in 11 countries on three continents, spanning 25 industries. In addition to the CIO interviews, leading experts in machine learning and its impact on enterprise performance contributed to the study. For additional details on the methodology, please see page 4 of the study and an online description of the CIO Survey Methodology here. Digital transformation is a cornerstone of machine learning adoption. 72% of CIOs have responsibility for digital transformation initiatives that drive machine learning adoption. The survey found that the greater the level of digital transformation success, the more likely machine learning-based programs and strategies would succeed. IDC predicts that 40% of digital transformation initiatives will be supported by machine learning and artificial intelligence by 2019.


Key considerations of AI, IoT and digital transformation

IoT is a key driver of both the machine learning and AI craze. With the volume of data produced by machines and people on a daily basis becoming unmanageable, it has become increasingly difficult to make use of this information — and the proliferation of connected sensors only serves to further up the ante. Without AI and machine learning, making heads or tails of this data is downright difficult and creates problems for businesses looking to use their data. With digital transformation at the forefront of many business initiatives, applying AI to IoT can help drive the innovation and business effectiveness that many companies are hoping to achieve. Applied correctly, these digital technologies can change the way companies operate and yield key competitive differentiators. Some industries are more proactive in this regard than others, which is best observed in Constellation Research’s recently released “Business Transformation 150.”
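To make the "automated triage of sensor data" idea concrete, here is a toy sketch (not any particular vendor's approach) that flags readings deviating sharply from a trailing window — the kind of filtering that, at scale and with real models, machine learning brings to high-volume IoT streams:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from the trailing `window` of prior values."""
    anomalies = []
    for i in range(window, len(readings)):
        past = readings[i - window:i]
        mu, sigma = mean(past), stdev(past)
        if sigma and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append((i, readings[i]))
    return anomalies

# A temperature stream with one obvious spike at index 6:
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 21.2, 35.7, 21.1]
print(flag_anomalies(temps))  # → [(6, 35.7)]
```

A real deployment would replace the z-score rule with a trained model, but the workflow — ingest, score against recent context, surface the outliers for humans — is the same.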


In 2018, AI will be listening and watching us more than ever: Is our privacy under threat?


Finding Alexa or Google Assistant (or even both) inside a television or speaker is no longer a surprise. But as the smart home of the future finally becomes an attainable reality, artificial intelligence is appearing everywhere. At CES it was in fridges and ovens, washing machines, dryers and even light switches. Yes, we are at a stage where even the most humble of household devices — the light switch — has been given a microphone, speakers, and a blue pulsating light to indicate when it is listening and thinking. Somehow, while our backs were turned, our light switches became intelligent. The revelations of widespread surveillance efforts by the NSA and Britain's GCHQ are still a recent memory. Yet the giants of Silicon Valley are fitting microphones and cameras in every room of our houses. The Amazon Echo Spot is designed as a bedside alarm clock — yet it has a small camera, an always-listening microphone and an always-on internet connection.


Blockchain and the Rise of Transaction Technology

Administrations transact with citizens to provide them with trusted public services. They transact with businesses and governments, too. Sometimes citizens transact with government through business. Within strategic sectors, such as energy and utilities, transacting is key. In an increasingly data-focused economy, transacting data can even be said to be a special type of virtualized critical infrastructure. This is why states and businesses need to focus on assuring trusted data structures. Blockchains and distributed ledgers, then, can be considered a tool for ensuring data integrity, immutability and trust. This does not mean we need to port everything to blockchain. But it can mean providing an additional transaction layer on top of existing data structures: a robust audit trail of what happens on our critical infrastructure. Yet the possible role of distributed ledgers within digital state infrastructure too often goes unrecognized.
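The audit-trail idea can be illustrated with a minimal hash chain — a sketch of the integrity mechanism, not any specific ledger product: each record stores the hash of its predecessor, so altering any earlier record invalidates everything after it.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first record

def chain_records(records):
    """Link records into a tamper-evident chain: each entry is hashed
    together with the hash of the previous entry."""
    chain, prev_hash = [], GENESIS
    for data in records:
        body = {"data": data, "prev_hash": prev_hash}
        prev_hash = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        body["hash"] = prev_hash
        chain.append(body)
    return chain

def verify(chain):
    """Recompute every hash; return False if any record was altered."""
    prev_hash = GENESIS
    for entry in chain:
        expected = hashlib.sha256(
            json.dumps({"data": entry["data"], "prev_hash": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if expected != entry["hash"]:
            return False
        prev_hash = expected
    return True

ledger = chain_records(["grant issued", "permit renewed", "deed transferred"])
assert verify(ledger)
ledger[1]["data"] = "permit revoked"  # tamper with one record...
assert not verify(ledger)             # ...and the whole chain fails
```

Production distributed ledgers add consensus, replication and signatures on top, but this hash linkage is the core of the immutability guarantee the excerpt describes.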



Quote for the day:

"Strength is when you have so much to cry for but you prefer to smile instead." -- Unknown