Daily Tech Digest - March 25, 2018

The Top 10 IoT Trends


In what might be the most obvious prediction of the decade, the IoT will continue to expand next year, with more and more devices coming online every single day. What isn’t so obvious about this prediction: where that growth will occur. The retail, healthcare, and industrial/supply chain industries will likely see the greatest growth. Forrester Research has predicted the IoT will become “the backbone” of customer value as it continues to grow. It is no surprise that retail is jumping aboard, hoping to harness the power of the IoT to connect with customers, grow their brands, and improve the customer journey in deeply personal ways. But industries like healthcare and supply chain are not far behind. They’re using the technology to connect with patients via wearable devices and to track products from factory to floor. In many ways, the full potential of the IoT is still being realized; we’ll likely see more of that in 2018.



Securing the Operational Technology (OT) - The Challenges

The increasing connectivity of previously isolated manufacturing systems, together with a reliance on remote supporting services for operational maintenance, has introduced new vulnerabilities to cyber attack. Not only is the number of attacks growing, but so is their sophistication. As OT security becomes a widely discussed topic, the awareness of OT operators is rising, but so is the hacker community’s knowledge and understanding of OT-specific problems and vulnerabilities. It’s true that the systems and devices involved in OT are often based on the same technologies as those of IT, and as such many of the threats they face are exactly the same. However, it is an open secret that OT security is not the same as IT security. While securing OT systems requires an integrated approach similar to IT, its objectives are inverted, with availability being the primary requirement, followed by integrity and confidentiality. There are certain other important differences as well, which mean that the OT infrastructure cannot be managed as an extension of the IT infrastructure.


How Blockchain Is Replacing Branding As A Source Of Trust


It's not difficult to see why we're heading towards brandless trust. If you think about a supply chain that's obsessed with finding a more efficient way of doing things, you see why we have a system that's always adding more agencies in between the beginning and the end points. And why there's a decreasing visibility of what's really going on. Where Molly was a single agency brand, her modern counterparts would be adding agencies everywhere to make things work cheaper, better and faster. If you can turn one link of the chain into two sub-links and bring an economy in here or there, you've 'improved' the system. Sure, you've opened it up to a greater risk of fraud, but that will be someone else's problem, higher up the chain. What we're witnessing isn't an accident of occasional fraud, it's an unavoidable consequence of our desire for cheaper, better and faster.


MIT researchers find that graphene can function as a superconductor


Due to its high conductivity, graphene could be used in semiconductors to greatly increase the speed at which information travels. Recently the Department of Energy conducted tests demonstrating that semiconductive polymers conduct electricity much faster when placed atop a layer of graphene than atop a layer of silicon. This holds true even if the polymer is thicker: a polymer 50 nanometers thick, when placed on top of a graphene layer, conducted a charge better than a 10-nanometer-thick layer of the same polymer. This flew in the face of previous wisdom, which held that the thinner a polymer is, the better it conducts charge. It is yet another example of graphene’s remarkable properties. The biggest obstacle to graphene’s use in electronics is its lack of a band gap, the gap between the valence and conduction bands in a material that, when crossed, allows for a flow of electrical current. The band gap is what allows semiconductive materials such as silicon to function as transistors; they can switch between insulating and conducting an electric current, depending on whether their electrons are pushed across the band gap or not.
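To make the band-gap point concrete (this is a textbook semiconductor relation, not something stated in the article): in an intrinsic semiconductor, the density of thermally excited carriers falls off exponentially with the size of the gap \(E_g\):

```latex
n_i \;\propto\; T^{3/2} \exp\!\left(-\frac{E_g}{2 k_B T}\right)
```

With \(E_g = 0\), as in pristine graphene, there is no insulating state to switch to, which is why opening a gap is the central obstacle to building graphene transistors.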


The importance of integrating legacy enterprises

CIOs also feel that the inability to integrate can be a source of competitive disadvantage. How many strategic planners have this in their SWOTT analysis? CIOs are clear, however, that integration alone isn’t sufficient to drive a competitive advantage. They say it is people collaborating on market-driven priorities, backed by integrated practices, that drives competitive advantage. Jack Gold concluded this discussion by saying “the organizations that win in the future are the ones that can make best use of ALL of their data and apps – legacy or otherwise.” It seems clear that legacy organizations that are built and integrated like the famed Winchester House will find themselves at a distinct disadvantage in an era of digital disruption. The speed and agility of integrating applications, data, and business capabilities matters today. Here, CIOs need to build internal competency versus perpetuating “duct tape” integrations. How well they do this can be a source of competitive advantage or competitive disadvantage.


What kind of AI future do we want?

"What will the role of humans be, if machines can do everything better and cheaper than us?" asked Max Tegmark, a professor of physics at the Massachusetts Institute of Technology and the author of Life 3.0: Being Human in the Age of Artificial Intelligence. He was speaking at the Beyond Impact summit on artificial intelligence Friday at The Globe and Mail in Toronto, presented in conjunction with the University of Waterloo. The assumption in such questions is that artificial intelligence is trying to progress to AGI, or artificial general intelligence, in which a machine will basically think a thought, or at least do an intellectual task on its own, as a human can. Some believe we may never reach true AGI, Dr. Tegmark noted. Machines may never have the consciousness of a living entity or show true creativity. Yet, "the future development of AI might go faster than typical human development, and there is a very controversial possibility of an intelligence explosion, where self-improving AI might rapidly leave human intelligence far behind," he said.


5 Blockchain Innovations Wall Street Is Watching in 2018

The biggest upside of blockchain is system integrity. Cryptocurrencies and blockchain technology eliminate the need for middlemen, the intermediaries that tend to overcomplicate payments and charge expensive fees on top of large transactions. The very design of blockchain lends itself to security. Because blockchain is a decentralized ledger, transactions are not visible to anyone besides the two parties engaging in the asset transfer. Crypto wallets are also essentially immune to fraud due to their complexity and uniqueness, which makes assets difficult to steal and invulnerable to forgery. For example, the Internet of Services Foundation has created a scalable blockchain infrastructure for the future of online business. Its high-throughput processing and security offer an intriguing alternative to cryptocurrency mainstays like Bitcoin and Ethereum. To date, these cryptocurrencies have been unable to scale for mass adoption.
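The forgery-resistance claim can be made concrete with a toy hash chain. This is a deliberately minimal sketch, not any production blockchain: each block embeds the hash of its predecessor, so rewriting history anywhere invalidates every later link.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(transactions):
    """Build a minimal chain: each block records the previous block's hash."""
    chain = []
    prev = "0" * 64  # genesis predecessor
    for tx in transactions:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Re-derive every link; tampering anywhere breaks a downstream prev_hash."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["alice->bob:5", "bob->carol:2"])
assert verify(chain)
chain[0]["tx"] = "alice->bob:500"   # tamper with an old transaction
assert not verify(chain)
```

Real blockchains add consensus, signatures, and proof-of-work on top, but the chained-hash structure is what makes recorded history tamper-evident.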


UK launches the world’s first crypto assets task force

The initiative is part of a larger collective fintech sector strategy; one which will “help the UK to manage the risks around crypto assets, as well as harnessing the potential benefits of the underlying technology,” as per Hammond. Philip Hammond is expected to announce the task force — which will comprise the Bank of England, the Financial Conduct Authority, and the Treasury itself — on Thursday, at the government’s second International Fintech Conference. The statement also announced the government’s interest in creating a UK-Australia ‘fintech bridge’, which will aim to connect the countries’ respective markets and help UK firms expand internationally. The British government has been mostly supportive of cryptocurrencies and blockchain technology, only sporadically calling for increased regulations in the industry. British Prime Minister Theresa May, speaking at the World Economic Forum in January, shared her concerns about potential criminal usage of cryptocurrencies.


J.P. Morgan’s massive guide to machine learning and big data jobs in finance

Before machine learning strategies can be implemented, data scientists and quantitative researchers need to acquire and analyze the data with the aim of deriving tradable signals and insights. J.P. Morgan notes that data analysis is complex. Today’s datasets are often bigger than yesterday’s. They can include anything from data generated by individuals (social media posts, product reviews, search trends, etc.), to data generated by business processes (company exhaust data, commercial transaction, credit card data, etc.) and data generated by sensors (satellite image data, foot and car traffic, ship locations, etc.). These new forms of data need to be analyzed before they can be used in a trading strategy. They also need to be assessed for ‘alpha content’ – their ability to generate alpha. Alpha content will be partially dependent upon the cost of the data, the amount of processing required and how well-used the dataset is already.


8 questions to ask about your industrial control systems security

An ICS is any device, instrumentation, and associated software and networks used to operate or automate industrial processes. Industrial control systems are commonly used in manufacturing, but they are also vital to critical infrastructure such as energy, communications, and transportation. Many of these systems connect to sensors and other devices over the internet—the industrial Internet of things (IIoT), which increases the potential ICS attack surface. "It is important that organizations leverage lessons learned securing enterprise IT but adapt those lessons to the unique characteristics of OT," says Eddie Habibi, CEO and founder of ICS security vendor PAS Global. "This includes moving beyond perimeter-based security in a facility and adding security controls to the assets that matter most – the proprietary control systems, which have primary responsibility for process safety and reliability," he says. The following are some of the key questions that plant operators, process control engineers, manufacturing IT specialists, and security personnel need to be asking when planning for ICS security, according to several experts.



Quote for the day:


"Acknowledging what you don't know is the dawning of wisdom." -- Charlie Munger


Daily Tech Digest - March 24, 2018

Cryptomining Botnets: A Threat to Your Security


Unauthorized mining has been going on for several years, but it’s only since 2017 that it’s been a major issue. The biggest mining botnets have pulled in millions of dollars. If you get infected by this type of software, you might not notice. It’s technically not malware, except for the fact that it shouldn’t be on your computer. People use the same software for legitimate mining. Security software often can’t detect it, but it costs you in several ways. Unauthorized mining can slow down your computers. Since computers performing active tasks consume more energy than idle ones, it increases your electric bill. If it has bugs, it can make your computers crash more often. It could create security holes and load more nasty software onto your machines. Some kinds of software can grab a computer’s processing cycles for mining without even breaking its security. A controversial Web application called Coinhive runs code on browsers to do mining. When the site owner says so up front and uses it instead of ads to fund the site, it’s generally considered OK.
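The reason unauthorized mining drives up CPU load and electric bills is the proof-of-work loop itself: hashing candidate values until one meets a difficulty target. A deliberately tiny sketch (real miners use different algorithms and vastly higher difficulty, but the compute cost scales the same way):

```python
import hashlib

def mine(data, difficulty=2):
    """Brute-force a nonce so the SHA-256 hash starts with `difficulty`
    hex zeros -- the CPU-burning loop at the heart of mining code."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block-payload", difficulty=3)
assert digest.startswith("000")
```

Each extra hex digit of difficulty multiplies the expected work by 16, which is why a hijacked fleet of machines, or thousands of visitors' browsers, is worth real money to an attacker.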



How insurers can get the most out of digital transformation

Many new skills are required for a successful digital transformation, including an understanding of analytics and the perpetual evolution of IT architecture to modernize IT. Insurers are struggling to attract talent, because these and other necessary skills are new, not only in insurance but across industries. Demand for talent in areas like big data is expected to exceed available supply by a factor of four. The resulting talent shortage means that leaders are responsible for establishing an environment that attracts talent, promotes personal growth, and offers a desirable and interconnected work environment and flexibility. Few insurers will become leading tech companies, as they will struggle to attract tech talent at scale on a par with the likes of digital natives. But digital leaders in the insurance sector, having prioritized technological literacy and infused areas like engineering, design, and agile with more talent, better understand the investments and trade-offs required in the digital age.


How ‘Not Invented Here’ Limits Innovation


NIH is a surprisingly common feature of the innovation landscape, and there are many other famous examples. Not least, Kodak’s rejection of both Edwin Land’s idea for the Polaroid process and Chester Carlson’s xerography underlines how easy it is to put up defenses against ideas originating from outside. NIH is a theme my colleague Oana-Maria Pop has written a great blog post about, but its persistence makes it worthwhile to take another look. Elting E. Morison gives a wonderful example in his detailed study “Gunfire at Sea,” which explores the tortuous journey the innovation of continuous-aim gunnery had in finding its way onto the decks of U.S. warships. Back in the late 19th century, naval gunnery was not very accurate. A U.S. Bureau of Ordnance study of one thousand shells fired during an exercise around the time of the Spanish-American War suggested that less than 3 percent were hitting the target. That’s a problem. A long way away in the South China Sea, Admiral Percy Scott of the British Navy was working on the solution. His squadron was doing gunnery practice with similarly poor results.


Shedding light on broadband performance for an optimized SD-WAN

The catalyst for this has been the rise of software-defined WANs (SD-WANs). Their multipath capabilities and network optimization make it possible to use broadband for business connectivity, and organizations of all sizes have been jumping aboard this trend. One of the challenges that remain, though, is that not all connections or types of broadband are created equal. A network manager at a business with branches located nationwide recently told me that his company's preferred broadband type has been cable, but he has been surprised by the variability in the resulting bandwidth by location and time of day. For example, in one large metro area, he is purchasing 50Mbps of bandwidth, more than adequate for most branches. During the mornings he often sees throughput of 100Mbps or more. Later in the day, however, when kids get home from school and the Xboxes and Netflix subscriptions start coming online, he has seen his bandwidth drop to as low as 8Mbps.


Impact of Artificial Intelligence on Paid Advertising


AI is the advancement of research technology that brings together many different factors based on the internet habits of the user. In other words, it is the gathering of information on the personal habits of the potential customer so that paid advertising can be better directed at the right audience. There are ethical implications of AI, as many feel that it crosses the line in gathering personal information. However, there is no doubt that it has helped improve the targeting of paid advertising efforts. AI does more than break down the demographics; it takes every search you make and every page you visit and turns that into information about your personal habits on the internet. It not only addresses your actions but also considers your vulnerabilities, which means that paid advertising in places like Google AdWords and Facebook PPC becomes more potent in its effects. Over the past few years, AI has had a major impact on the collection, interpretation, and distribution of the information provided to marketers so they can purchase advertising in the right places.


Citi wants fintech startups to disrupt institutional banking

“Contrary to the common belief, I think there is more opportunity for collaboration with fintech than disruption,” he added. “Particularly on the institutional side.” Sultan said Citi is already “very engaged” with the fintech space, scanning “thousands” of startups every year — saying it’s taken an equity position in “about 30” so far. It has established four “innovation centers” in Singapore, Dublin, Tel Aviv and London to act as its feelers on the fintech scene. And its investments are focused in four key areas, according to Sultan — namely: client experience; scalability; operating model agility; and innovation. “Pretty much every one of [the fintech startups it’s invested in] have an operating relationship with the businesses,” he continued. “So we are using their technology and integrating into our solutions. And helping them commercialize, appreciate our equity, as well as delivering a better solution to the client. So I think the philosophical change is you cannot get to the market fast enough on a proprietary basis.”


CEO Insights: As plants get smarter, manufacturers need to “watch the robots”

We are seeing three big drivers. The first is the rapidly decreasing cost and size of cameras. The number of cameras deployed will grow dramatically. Five years ago we were putting in half a dozen cameras to provide an overview of what was going on in the warehouse or factory floor; today we are putting in four or five times as many, on the ends of forklift trucks and inside industrial machinery. Wherever an operator doesn't have exactly the view they need, we are going to be placing cameras. This will just keep growing. The second big driver is increasing intelligence in cameras. More and more video analytics and storage are moving to the edge – inside the camera. Customers keep finding new opportunities to have the cameras provide not only images but also actionable data. For us, this is a great opportunity, and we are pushing forward with specialized analytics for industrial applications.


Still don't understand the blockchain? This explainer will help

“I think the blockchain is going to do really interesting things around the transfer of value,” writer and academic Rachel Botsman noted at Davos. Even though traditionally much of the chatter in the cryptocurrency communities has centred on disrupting the financial industry, Botsman doubts this technology will altogether result in currency exchanges bypassing the banks. “I don’t understand where that hype is coming from,” Botsman told the Forum's annual meeting. Indeed, many banks, though hardly endorsing the exchange of cryptocurrencies, are certainly waking up to the possibilities of the blockchain, which could eventually allow them to automate many systems that currently require a large in-office staff. As Brian Behlendorf, executive director of Hyperledger, explained to me at the Forum's annual meeting this year: “[Banks] can do a lot of what they do today (...) just faster and cheaper and with greater guarantees that the parties that they’re working with will make good on their commitments.”


Use your brain; AI won't soon replace it


The human brain may not be the most efficient form of intelligence; it needs a lot of biological backup machinery to make up for cells that die all the time, and its ability to store data is not as reliable as that of computers. Someday, many years from now, technology will probably exist that will be able to reconstruct the brain while cutting some corners for improved efficiency. But it's unlikely to be able to replicate every nuance of perception, memory, emotion, intuition. We often talk about today's artificial intelligence -- based on algorithms that essentially use the brute force of computers to crunch problems such as image recognition -- as if it'll soon replace humans at complex creative and communicative tasks. That kind of AI, however, will never do it. Progress along the same lines can produce smarter digital assistants than today's Siri or Alexa. But a human, equipped with a computer, will still run circles around them because of the sheer, currently irreproducible complexity of the human brain.


How Federal Cybersecurity is Evolving

In 2017, Congress passed the $500 million Modernizing Government Technology Act, which is part of defense funding. Although the act promises to play a role in improving lax cybersecurity efforts, it also aims to replace legacy systems with more modern ones, purportedly in an effort to cut back on operating expenses. Although the act, if properly funded, will bring extra attention to cybersecurity, replacing legacy systems also increases the burden of implementing proper security. The act will likely benefit users, as modern systems should be able to outperform legacy ones. However, there’s no guarantee that security efforts will be enough to prevent potentially large and damaging cyberattacks. Proper cybersecurity can be expensive, and the slow pace of the government means new technology can be slow to adopt. On the other side, mounting an attack is relatively inexpensive, and even the smallest of security holes can eventually lead to massive attacks.



Quote for the day:


"Having more data does not always give you the power to make better decisions." -- Jeffrey Fry


Daily Tech Digest - March 23, 2018

Google Enemy
There are countless other examples of Google acting as its own worst enemy and failing to follow through with a commendable initial vision. Look at the company's never-ending messaging mess, for instance, or the awkward implementation of Apple-like app shortcuts in Android 7.1. In the latter case, as I said at the time, "instead of thinking through what'd be the most sensible and user-friendly way for a feature like this to work, Google seemed to just emulate the way Apple did it." See the pattern? To a degree, a company being flexible and open to the evolution of its products — even when said transformation blatantly revolves around "borrowing" inspiration from other sources — can be an asset. But there's also something to be said for having the stones to stand by the value of your own ideas and remaining willing to recognize when you've got a good thing going, even if that thing requires a mix of refinement and promotion to reach its potential.


Starting in May, GDPR will force European banks to rethink how they store, manage, use and disseminate personally identifiable information, according to the report. "If they wish to partake in blockchain-based AML and EFM device, whitelist, and transactional data sharing, [financial institutions] must adapt their privacy policies and tools to be able to cope with this requirement," Forrester said. ... AML and EFM are harder than ever to enforce and need to rely on the most diverse data possible, Forrester said, adding that "verifying identities before allowing them to transact helps avoid fraud losses in a complex payment ecosystem." That's where blockchain can be useful. Because it is an immutable, auditable electronic record, blockchain ensures that transaction records contain artifacts and identifiers of previous transactions. "This allows authorized investigators to backtrack transactions on the blockchain more easily than with current AML and EFM systems," Forrester said.


Java security issues are real. Java was designed to be as secure as most other popular programming languages, and it offers features like SecurityManager to help improve security in certain contexts. However, Java applications are subject to a number of potential security vulnerabilities, including, but not limited to, various injection attacks. It's crucial for Java developers and administrators to keep common Java security vulnerabilities in mind as they write and deploy Java applications. Security-first programming is especially important in the case of Java because the cross-platform nature of Java code means that OS-level security frameworks can't always be trusted to keep applications secure. Nor should you expect end users to be able to manage Java security threats effectively. Sure, you can blame your users for running untrusted Java code or disabling automatic updates to their Java runtimes, but ultimately, the burden of writing secure Java applications and isolating code within a Java environment that might not be secure lies with developers.
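The injection risk mentioned above is language-agnostic, and so is the standard fix. The sketch below uses Python's built-in sqlite3 for brevity; the principle, parameterized queries instead of string concatenation, maps directly onto Java's PreparedStatement.

```python
import sqlite3

# In-memory database with a single hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "x' OR '1'='1"

# Vulnerable: concatenation lets the input rewrite the query itself.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe: the placeholder binds the input strictly as data, never as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()

assert len(unsafe) == 1   # the injected OR clause matched every row
assert len(safe) == 0     # no user is literally named "x' OR '1'='1"
```

The same discipline applies across the stack: treat all external input as data, and let the driver, not string handling, construct the query.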


Short sprints vs. big bang: the best way to adopt the cloud
When it comes to cloud adoption, large enterprises and government agencies focus on quick wins, using quick sprints, and are typically more successful than those companies that try to drive huge change over a longer period of time, aka the big-bang approach. ... The objectives of each may be exactly the same—to migrate most of the enterprise’s workloads—but the short-sprint approach is ten times more likely to demonstrate success and thus value than the big-bang approach.  The short-sprint approach also aligns with the typical corporate culture. People think in small tactical ways versus large and strategic, so expectations are for small, quick wins. The larger, longer strategic wins simply are not as valued by the executives and investors in the standard corporate culture. The big-bang approach can and does work—if the company can hold to its commitment that long and doesn’t need ROI along the way. 


4 best practices for automating mobile app testing

How can you test your mobile apps on a diversity of mobile devices, and in geographic locales that you're not even thinking of? This is the problem that BrowserStack, and other companies like it, set out to solve. These companies offer cloud-based test automation software and processes that enable you to test your app in virtually any simulated mobile device environment and in any test scenario. The test automation can eliminate steps for your QA staff, such as checking out app navigation, displays of data and images, and even data access, retrieval and update to a database. "Our goal was to develop tools that could automate as much of the diverse mobile app test process as possible, to save time, and to speed these apps to market," said Rao. "We were also aware of the mobile developer shortage in enterprises, and the fact that many companies can't secure the mobile app development talent that they need. Consequently, they have to find other ways to speed app development and test, like automating more of the process."


Intel To Release Most Powerful Mainstream Processor Ever To Beat AMD


Intel largely succeeded here, and thanks to higher clock speeds (and unfortunately a higher price too) on the CPU compared to AMD's price-wise equivalents, the Core i7-8700K was faster in many tests despite a two-core deficit. However, where raw multi-threaded performance is concerned, especially in benchmarks that aren't otherwise Intel-optimised, AMD gained the upper hand, meaning that Intel can't quite claim to be faster in everything. This is what it's looking to address with the new CPU, because even with better boosting algorithms, AMD's soon-expected second-generation Ryzen CPUs, due next month, probably won't be able to match something akin to a Core i7-8700K but with eight cores. Interestingly, AMD's 12-core Threadripper 1920X retails for $670, which means that there's a big gap for Intel to play with price-wise with its new eight-core CPU. The Core i7-8700K retails for around $350, so even if the new CPU costs $500, it's potentially going to blur the lines between AMD's and Intel's high-end desktop platforms.


How Microsoft and Databricks crafted a unique partnership for AI data processing

Using Azure Databricks, customers can take in data ingested through other services, prepare it, and process it using machine learning algorithms and other techniques. After that, it can be funneled out to other services like Cosmos DB and Power BI. Making a deep integration possible required a great deal of work on the part of both firms, however. Company representatives made many trips back and forth between Databricks’ office in San Francisco and Microsoft’s in Redmond. The partnership wasn’t without its challenges on either end, but both companies were committed to it for the sake of their joint customers. Andreessen Horowitz cofounder Ben Horowitz, who sits on Databricks’ board, has a close relationship with Microsoft CEO Satya Nadella and helped facilitate the two companies’ collaboration. Microsoft had to work through concerns about what it would mean for the company to deeply integrate Databricks with Azure systems, including those for incident management, handling support requests, and other functions. 


Why the Waterfall or Agile debate will be around forever


"What's wrong is to assume one or the other is always the answer," Raschke said. "When you look at a product and it needs flexibility [or] the client is not sure what they want, what the market needs or there is innovation required, Scrum is important. If it is embedded or mission-critical software, Waterfall might be a better choice." What phase the project is on also determines which methodology a team should use. Business case development, choice of infrastructure and decisions on deployment and maintenance are all aspects of the software development lifecycle that don't necessarily lend themselves to the Agile approach. When forced to align with either Waterfall or Agile, these business cases tend to fall more in line with Waterfall. Once those outlines are in place, the actual development might trend more toward either Waterfall or Agile, depending on the purpose and nature of the project. Raschke said to remember what is most important. "The question is: What do we need to do to get it to the customer? In that case, it usually ends up as hybrid."


Cisco, Verizon Take Information-Centric Networking For a Real World Spin

Cisco has developed what it calls Hybrid ICN (hICN), which enables the deployment of ICN within IP rather than as an overlay or replacement of IP. It preserves all features of ICN communication by encoding ICN names into IP addresses, according to Giovanna Carofiglio, a Cisco Distinguished Engineer. “hICN supports IPv4- or IPv6-RFC compliant packet formats, and guarantees transparent interconnection with standard IP networking equipment, simplifying the insertion of ICN technology in existing IP infrastructure and enabling coexistence with legacy IP traffic,” Carofiglio said. Cisco and Verizon expect that hICN will become a strong technology for 5G environments, in that ICN adoption may dramatically simplify next-generation network architecture by offering a unified content-aware and access-agnostic network substrate for the integration of heterogeneous networks, Carofiglio said. It is the hICN technology Verizon tested in its lab recently.
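To illustrate the general idea of encoding a content name into an IP address, here is a hypothetical scheme, emphatically not Cisco's actual hICN wire format: hash the name into the low 64 bits of an IPv6 address under a fixed routable prefix, so ordinary IP equipment can forward packets while the address still identifies content.

```python
import hashlib
import ipaddress

def name_to_ipv6(content_name, prefix="2001:db8::"):
    """Map a content name to an IPv6 address by hashing the name into
    the interface-identifier bits of a /64 prefix. Illustrative only."""
    base = int(ipaddress.IPv6Address(prefix))
    digest = hashlib.sha256(content_name.encode()).digest()
    suffix = int.from_bytes(digest[:8], "big")  # low 64 bits of the address
    return ipaddress.IPv6Address(base | suffix)

addr = name_to_ipv6("/video/news/segment-42")
print(addr)  # a syntactically ordinary IPv6 address under 2001:db8::/64
```

The mapping is deterministic, so any router or cache that knows the scheme can recognize requests for the same content, while legacy IP gear just sees a normal destination address.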



Optimising the smart office: A marriage of technology and people

A strong digital culture is clearly a positive thing, but there's room for improvement: the percentages of employees in strong-digital-culture businesses who rate themselves highly on empowerment (47%), innovativeness (39%) and -- in particular -- productivity (22%) might be expected to be higher, for example. Management consultancy McKinsey has recently suggested that productivity benefits from digitisation have yet to materialise at scale for several reasons, including "lag effects due to the need to reach technological and business readiness" and "costs associated with the absorption of management's time and focus on digital transformation". Another KPI Microsoft examined was engagement, or 'flow' -- the ability for workers to focus on the task at hand and deliver a better end result more efficiently. Overall, just 20 percent of respondents felt highly engaged at work, but there was a fourfold difference between engagement levels in businesses with strong versus weak digital cultures.



Quote for the day:


"Rarely have I seen a situation where doing less than the other guy is a good strategy." -- Jimmy Spithill


Daily Tech Digest - March 22, 2018

Five Pillars of Data Governance: Initiative Sponsorship

It’s an ongoing initiative that requires active engagement from executives and business leaders. Unfortunately, the 2018 State of Data Governance Report finds lack of executive support to be the most common roadblock to implementing DG. This is historical baggage. Traditional DG has been an isolated program housed within IT, and thus constrained within that department’s budget and resources. More significantly, managing DG solely within IT prevented those in the organization with the most knowledge of and investment in the data from participating in the process. This silo created problems ranging from a lack of context in data cataloging to poor data quality and a sub-par understanding of the data’s associated risks. Data Governance 2.0 addresses these issues by opening data governance to the whole organization. Its collaborative approach ensures that those with the most significant stake in an organization’s data are intrinsically involved in discovering, understanding, governing and socializing it to produce the desired outcomes.


How Serverless Computing Reshapes Security

First and foremost, serverless computing, as its name implies, lowers the risks involved with managing servers. While the servers clearly still exist, they are no longer managed by the application owner, and are instead taken care of by the cloud platform operators — for instance, Google, Microsoft, or Amazon. Efficient and secure handling of servers is a core competency for these platforms, and so it's far more likely they will handle it well. The biggest concern you can eliminate is addressing vulnerable server dependencies. Patching your servers regularly is easy enough on a single server but quite hard to achieve at scale. As an industry, we are notoriously bad at tracking vulnerable operating system binaries, leading to one breach after another. Stats from Gartner predict this trend will continue into and past 2020. With a serverless approach, patching servers is the platform's responsibility. Beyond patching, serverless reduces the risk of a denial-of-service attack.


Google parent's free DIY VPN: Alphabet's Outline keeps out web snoops

Outline promises to solve the double-edged sword of VPN services. There are loads of free VPN services, which in theory can protect sensitive information when using a public Wi-Fi network. However, as ZDNet's David Gewirtz has pointed out, you probably shouldn't entrust these with digging an encrypted tunnel between your computer and another machine. An alternative option is to pay around $120 a year for a VPN service, but again this requires trusting the provider and weighing up the jurisdiction it operates in. Outline offers journalists a cheaper way to set up their own VPN server on any cloud provider or on their own hardware, cutting out the need to trust a third party. "Outline gives you control over your privacy by letting you operate your own server. And Outline never logs your web traffic," Jigsaw product manager Santiago Andrigo wrote. "We made it possible to set up Outline on any cloud provider or on your own infrastructure so you can fully own and operate your own VPN and don't have to trust a VPN operator with your data."


Hexagonal Architecture as a Natural Fit for Apache Camel


Let's look at the two extremes: a layered architecture manages the complexity of a large application by decomposing it and structuring it into groups of subtasks of particular abstraction levels called layers. Each layer has a specific role and responsibility within the application and changes made in one layer of the architecture usually don't affect the components of other layers. In practice, this architecture splits an application into horizontal layers, and it is a very common approach for large monolithic web or ESB applications of the JEE world. On the other extreme is Camel, with its expressive DSL and route/flow abstractions. Based on the Pipes and Filters pattern, Camel would divide a large processing task into a sequence of smaller, independent processing steps (Filters) connected by a channel (Pipes). There is no notion of layers that depend on each other, and, in fact, because of its powerful DSL, a simple integration can be done in a few lines and a single layer only. In practice, Camel routes split your application by use case and business flow into vertical flows rather than horizontal layers.
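Camel itself expresses such flows in its Java or XML DSL; the underlying Pipes and Filters idea can be sketched language-agnostically. A minimal Python illustration (the filter names are invented for the example, not Camel API):

```python
# Pipes and Filters: each filter is an independent processing step; the
# "pipe" is just the composition that passes each message along the chain.
def validate(msg):
    if "id" not in msg:
        raise ValueError("message missing id")
    return msg

def enrich(msg):
    # add routing metadata without the filter knowing about other steps
    return {**msg, "source": "orders"}

def transform(msg):
    return {**msg, "id": str(msg["id"]).upper()}

def route(msg, filters):
    for f in filters:   # sequential, independent steps: no layers,
        msg = f(msg)    # just a message flowing through one vertical route
    return msg

result = route({"id": "a1"}, [validate, enrich, transform])
```

Each step can be tested, reordered, or replaced in isolation, which is exactly the property that lets Camel keep a simple integration to a few lines in a single "layer."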


Do Facebook Users Really Care About Online Privacy?

Facebook has had, since 2014, a platform policy that clearly states what developers of third-party apps can and cannot do. With regard to data, third-party apps have to elaborate in their privacy policy on what data they are collecting and how they plan to use that data. These third-party apps must also delete any data received from Facebook. What Facebook also does now is moderate third-party apps. Apps go through a review process where they must justify why that information is necessary for the app. Facebook characterizes "detailed information" as anything other than a user's friends, public profile, and email. Approval is only granted if apps can show that the information they request will be directly used. But Facebook's updated platform policy only came into place a year after the "Cambridge University researcher named Aleksandr Kogan had created a personality quiz app, which was installed by around 300,000 people who shared their data as well as some of their friends' data," as revealed by Zuckerberg in the public post.



Wide area Ethernet can fuel digital network transformation

Enter wide area Ethernet (WAE) services. WAE is a technology that has been around a long time but never gained the same level of adoption as other network services such as MPLS or consumer-type broadband services. In some ways Ethernet has always been a solution looking for a problem, as its attributes didn't align cleanly with the challenges most businesses faced. Indeed, the network was considered by many to be a commodity -- the basic plumbing, if you will -- where the information being transported was best-effort in nature. Because of this, network managers and procurement officers just went with what they knew, even though it was often considerably more expensive. Digital transformation (DX) is the problem that Ethernet and WAE have been waiting for. WAE directly addresses the business problems faced by digital organizations. ... Data continues to grow at exponential rates; 90% of all data that exists today, in fact, has been created in the past two years, according to ZK Research. IoT, video, mobile services and other data will only continue to add to the glut.


9 machine learning myths
Machine learning is proving so useful that it's tempting to assume it can solve every problem and applies to every situation. Like any other tool, machine learning is useful in particular areas, especially for problems you’ve always had but knew you could never hire enough people to tackle, or for problems with a clear goal but no obvious method for achieving it. Still, every organization is likely to take advantage of machine learning in one way or another, as 42 percent of executives recently told Accenture they expect AI will be behind all their new innovations by 2021. But you’ll get better results if you look beyond the hype and avoid these common myths by understanding what machine learning can and can’t deliver. ... Think of it as anything that makes machines seem smart. None of these are the kind of general “artificial intelligence” that some people fear could compete with or even attack humanity. Beware the buzzwords and be precise. 


As organizations matured, they could use the cloud control plane tools to create NAC rules. While the interface required training, the concepts were similar: traffic from one set of hosts was allowed or disallowed. However, the cloud security control plane does represent one of the early challenges in hybrid IT security: maintaining a consistent operations control plane. As hybrid IT services become more complex, security professionals require more granular controls between the public cloud and private infrastructure. Take the web and application tiers of a three-tier application as a universal example. Merely creating a firewall rule that allows traffic from the web tier to the application tier proved complex. Early private data center firewalls lacked the context of ephemeral cloud security objects. If the web tier leveraged elastic compute, the public cloud administrator had to ensure that auto-scaled web servers were all created in the same network scope for the static firewall to properly filter traffic.
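The mismatch described here, static address-based rules versus elastic membership, is why cloud-native controls tend to key on attributes rather than IPs. A hypothetical sketch (the rule shapes and tag names are illustrative, not any vendor's API):

```python
# A static rule can only enumerate addresses known when it was written.
static_rule = {"allow_from": ["10.0.1.10", "10.0.1.11"], "port": 8080}

# An attribute-based rule matches whatever the auto-scaler creates,
# as long as new web servers carry the right tag.
dynamic_rule = {"from_tag": "web-tier", "to_tag": "app-tier", "port": 8080}

def allowed(instance, rule):
    # membership is decided by tag, not by IP, so elastic hosts are covered
    return rule["from_tag"] in instance["tags"]

# a freshly auto-scaled web server with an address nobody predicted
scaled_out_host = {"ip": "10.0.3.77", "tags": ["web-tier", "autoscaled"]}
```

The auto-scaled host matches the tag-based rule but falls outside the static allow-list, which is the gap the passage describes.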


What would a regulated-IoT world look like?

Perhaps the most useful contrast to the U.S.’s lack of regulatory attention to IoT security issues is Europe, where the General Data Protection Regulation has provoked howls of outrage from the tech industry, but won praise from privacy rights advocates. GDPR, in essence, places the burden on companies to state clearly and up-front what types of user data will be gathered, and precisely what it will be used for. It also gives users the right to see data that has been collected about them, and to correct inaccuracies. It’s not wildly dissimilar to the most stringent data protection law currently on the books in the U.S. – the Health Insurance Portability and Accountability Act, better known as HIPAA. According to Sadeh, a more broad-based privacy protection law in the U.S., designed to address the threats posed by IoT and other technologies that have badly outstripped existing regulations, could easily resemble HIPAA with greater scope.



Google Is Working on Its Own Blockchain-Related Technology

The technology presents challenges and opportunities for Google. Distributed networks of computers that run digital ledgers can eliminate risks that come with information held centrally by a single company. While Google’s security is strong, it’s one of the largest holders of information in the world. The decentralized approach is also beginning to support new online services that compete with Google. Still, the company is an internet pioneer and has long experience embracing new and open web standards. To build its ledger, Google has looked at technology from the Hyperledger consortium, but it could opt for another type that may be easier to scale to run millions of transactions, one of the people familiar with the situation said. "Any time there’s a paradigm shift like this, there’s an opportunity for new giants to emerge -- but also for incumbents to adopt the new approach," said Elad Gil, a startup investor who worked on early mobile projects at Google more than a decade ago.



Quote for the day:


"Do not lose hold of your dreams or aspirations. For if you do, you may still exist but you have ceased to live." -- Thoreau


Daily Tech Digest - March 21, 2018

AI outpaces lawyers in reviewing legal documents


For the study, the lawyers and the LawGeex AI had to analyse five previously unseen contracts with 153 paragraphs of technical legal language, under controlled conditions that mirrored the way lawyers review and approve everyday contracts. The highest-performing lawyer matched LawGeex AI at 94% accuracy, while the lowest-performing lawyer achieved just 67%. The most notable difference in the test between machines and humans lies in the time factor: while it took LawGeex AI only 26 seconds to complete the task, the lawyers took an average of 92 minutes. The longest time spent by the humans to accomplish the test was 156 minutes and the shortest time recorded was 51 minutes. Commenting on the study, Gillian K. Hadfield, Professor of Law and Economics at the University of Southern California, said: “This research shows technology can help solve two problems – both making contract management faster and more reliable, and freeing up resources so legal departments can focus on building the quality of their human legal teams.”



Are You Just Keeping the Lights On in Your Datacenter?

The notion that IT struggles to move beyond its traditional role and into a more innovative one is very common. But, as the IDC statistic shows, IT is more often a cost center, rather than a source of innovation and revenue for the company. Why is this situation still so widespread? A core issue is that nearly everything in the datacenter is manual and not automated. Most datacenters have custom configurations that require their own manual maintenance with specialized tools. Incremental progress on any one or two helps, but it isn’t enough to substantially change the big picture for the company. Over time, people get used to this status quo and start to think that it is completely normal. They fall into the trap of believing that a huge step forward toward automation and innovation is impossible.



Top 10 open source legal stories that shook 2017

In February 2017, GitHub announced it was revising its terms of service and invited comments on the changes, several of which concerned rights in the user-uploaded content. The earlier GitHub terms included an agreement by the user to allow others to "view and fork" public repositories, as well as an indemnification provision protecting GitHub against third-party claims. The new terms added a license from the user to GitHub to allow it to store and serve content, a default "inbound=outbound" contributor license, and an agreement by the user to comply with third-party licenses covering uploaded content. While keeping the "view and fork" language, the new terms state that further rights can be granted by adopting an open source license. The terms also add a waiver of moral rights with a two-level fallback license, the second license granting GitHub permission to use content without attribution and to make reasonable adaptations "as necessary to render the website and provide the service."


Descriptive Statistics: The Mighty Dwarf of Data Science

Consider a case where a monitoring system is to detect anomalies within the data. Typically, one may turn to the classic means of outlier analysis like the DBSCAN-based approaches or LOF. Nothing wrong with these; they may perfectly well point towards the directions where the outliers may be present. However, these techniques may require substantial computational resources to complete the task on high volumes of data in a reasonably acceptable amount of time. A much faster alternative may come from treating the given case as a time series analysis problem. Such data coming from a system operating in ‘healthy’ conditions would have a typical, acceptable amplitude distribution, and in such a scenario any deviation from the expected shape may be considered a potential threat worth detecting. A very fast descriptive statistic aimed at summarizing the shape of the distribution of the signal is the ‘kurtosis’.
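As a minimal sketch of the idea, excess kurtosis can be computed in a single pass over the signal and compared against a healthy baseline; the signals and threshold below are invented for illustration:

```python
def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (0 for a normal distribution)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (var ** 2) - 3.0

# 'healthy' amplitude samples vs. a signal with a heavy-tailed spike
healthy = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.12, -0.07]
spiky   = [0.1, -0.2, 0.05, 5.0, -0.1, 0.0, 0.12, -0.07]

# a large jump in kurtosis flags a shift in distribution shape far more
# cheaply than a density-based scan such as DBSCAN or LOF
anomalous = abs(excess_kurtosis(spiky) - excess_kurtosis(healthy)) > 1.0
```

The single outlier fattens the tail of the distribution and drives the kurtosis sharply upward, which is exactly the deviation-from-expected-shape signal described above.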


What Are the Limits of Forensic Data Retention?


Internet Service Providers will retain IP addresses of customers and the servers that they connect to. Armed with this information, forensic investigators can determine which websites suspects are accessing. However, most ISPs do not keep records of the actual content their subscribers access. There are a couple of reasons for this. First of all, keeping records of all content would be far more demanding on their servers. They simply don’t have the resources, even in the age of big data. Even if they wanted to keep these records, it would be impossible to see what content customers are accessing on most websites. Most websites have encrypted connections, so Internet Service Providers can’t tell what their users are doing on them. For example, since Facebook uses HTTPS connections, Internet service providers can’t read the customers’ messages or see what content they post on their Facebook feed. Nor can they see what they are searching for on Google.


AI key to do 'more with less' in securing enterprise cloud services

Artificial intelligence (AI), machine learning (ML), and predictive analytics applications may one day prove to be the key to maintaining control and preventing successful hacks, data breaches, and network compromise. These technologies encompass deep learning, algorithms, and Big Data analysis to perform a variety of tasks. The main goal of AI and ML is usually to find noteworthy anomalies in systems and networks, whether it be suspicious traffic, unauthorized insider behavior and threats, or indicators of compromise. Able to evolve over time, the purpose of AI technologies is to learn, detect, and prevent suspicious and dangerous activities with improvements and refinements the longer such applications and systems are in use. This provides companies with a custom cybersecurity system which tailors itself to their requirements, in comparison to an off-the-shelf, traditional antivirus security solution -- which is no longer enough with so many threats lurking at the perimeter.


What is the leading IT cost-optimisation priority for CIOs?

This bolsters Gartner’s opinion that the most successful organisations are more likely to trust their IT organisation to manage their IT and digital technology spending. Respondents were also questioned about who manages the selection and approval of cost optimisation ideas. Those with visibility of both the IT shared services budget and all digital spending across the organisation reported that, on average, nearly half of their digital technology spending is paid for by the business. A quarter is paid for out of the IT budget, with chargeback to the business. “As you’d expect, CIOs have the most influence over the selection and approval of cost optimisation opportunities within IT shared services,” said Buchanan. “Interestingly, CIOs who focus on digital business opportunities have greater responsibility for cost optimisation than those who don’t. This suggests that CIOs are starting to exert influence over selecting and approving digital business ideas to optimise business costs.”


IoT in Healthcare: Balancing Patient Privacy & Innovation

Medical technology tends to lag behind other technologies, as the cost of mistakes at medical practices and hospitals can be astronomical. As a result, the field can lag behind when it comes to adopting the latest digital or IoT technologies. Patient privacy is a major issue, so all new technologies must be adopted carefully while adhering to various data compliance obligations that apply both to companies in general as well as healthcare organisations specifically. Managing clinics and hospitals is complex, and it’s expensive. Many healthcare organizations rely on multiple computer and networking systems. Through smart bracelets, administrators can better track patient movement, and they can determine how often patients meet with their doctors. In addition, IoT technology can make it easier to track and analyze patients’ vital signs and other metrics, offering invaluable feedback and resolution not possible with manual measurements.


Organizing for digital industrial leadership

Digital industrial leadership is transforming the industrial world. For BHGE, specifically, data and analytics are fundamentally changing the way work gets done in our business and in the oil and gas industry as we prepare for the next big step-change in productivity. When I think of digital industrial leadership, I think about using data to move from looking in the rear-view mirror to looking into the future. We are beyond making decisions based on order history; we are using data to be predictive and make recommendations for future sales targets. As an example, we are using artificial intelligence on the shop floor to understand what drives disruptive, unscheduled downtime of our welding machines. When information technology meets operations technology, we learn what behaviors or indicators lead up to that unplanned downtime. We can use predictive analytics to do preventive maintenance and improve our productivity.


Credit Risk Prediction Using Artificial Neural Network Algorithm

To predict credit default, several methods have been created and proposed. The choice of method depends on the complexity of the bank or financial institution and the size and type of the loan. The most commonly used method has been discriminant analysis, which uses a score function to support decision making, though some researchers have raised doubts about its validity because of its restrictive assumptions: normality and independence among variables [4]. Artificial neural network models have been created to overcome the shortcomings of these inefficient credit default models. The objective of this paper is to study the ability of neural network algorithms to tackle the problem of predicting credit default, which measures the creditworthiness of a loan application over a time period. A feed-forward neural network algorithm is applied to a small dataset of residential mortgage applications at a bank to predict credit default.
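As a toy illustration of the approach, not the paper's actual model or data, a single-hidden-neuron feed-forward network can be trained by gradient descent on a handful of invented mortgage features:

```python
import math
import random

random.seed(0)

# toy records: (loan_to_value, debt_to_income) -> default (1) or not (0);
# the numbers are invented for illustration, not from the paper's dataset
data = [((0.9, 0.5), 1), ((0.95, 0.6), 1), ((0.4, 0.2), 0),
        ((0.5, 0.25), 0), ((0.85, 0.55), 1), ((0.3, 0.15), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# one hidden neuron, one output neuron
w_h = [random.uniform(-1, 1), random.uniform(-1, 1)]; b_h = 0.0
w_o = random.uniform(-1, 1); b_o = 0.0

for _ in range(2000):
    for (x1, x2), y in data:
        h = sigmoid(w_h[0] * x1 + w_h[1] * x2 + b_h)  # forward pass
        p = sigmoid(w_o * h + b_o)
        d_o = p - y                                    # backpropagate error
        d_h = d_o * w_o * h * (1 - h)
        w_o -= 0.5 * d_o * h; b_o -= 0.5 * d_o
        w_h[0] -= 0.5 * d_h * x1; w_h[1] -= 0.5 * d_h * x2; b_h -= 0.5 * d_h

def predict(x1, x2):
    """Estimated probability of default for a loan application."""
    h = sigmoid(w_h[0] * x1 + w_h[1] * x2 + b_h)
    return sigmoid(w_o * h + b_o)
```

After training, `predict` scores a high loan-to-value, high debt-to-income application as far riskier than a conservative one, which is the creditworthiness measure the paper targets.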


Quote for the day:


"Never allow someone to be your priority while allowing yourself to be their option." -- Mark Twain


Daily Tech Digest - March 20, 2018

The future of computer security is machine vs machine

Because so much of our computing infrastructure will be protected and controlled by well-informed, cloud-based decision makers, the malware and hackers of the future will be forced to fight the centralized services first and foremost if they ever hope to spread. They will probably subscribe to these same services and look for holes, or subscribe to a malicious service that belongs to multiple services and looks for and sells weaknesses, much as some services do today in probing the accuracy of VirusTotal. This is where the future defense and attack scenarios start looking very machine versus machine. Our future defenses will be more centralized, coordinated, and automated. The hackers will have to do the same thing to stay ahead. If they don’t automate as much as or more than the defensive services do, they won’t be able to do as much damage. Hackers and malware will turn to automation and AI just as much as the defenders. When the defenders block the malicious thing that was being successful a few minutes ago, the malicious automated service will have to quickly respond. Whoever’s AI is better will ultimately win.


Empowering Citizen Data Scientists to Solve the AI Skills Shortage


Citizen data scientists are analysts with above-average skills but without a formal academic background in data science, explained Ashley Kramer, vice president of Product Management at Alteryx. "I see the citizen data scientists as this emerging group," she said in an interview. "They don't have a degree in data science but they're more advanced than your average analyst. They are people with advanced capabilities like writing scripts within Excel. They are starting to get to that next level of being able to create predictive analytics. They need a little bit of help because they're not programmers." This is the group of what in an earlier era were called power users. Alteryx is designed to help them. "With the data scientist shortage," Kramer said, "we provide a platform that can be used by citizen data scientists in a code-free way, which is really important as they're learning the process." Alteryx offers a "code-friendly" way for budding data scientists to begin creating machine learning models for business requirements such as predicting and preventing equipment failures.


How complexity, multicloud sprawl, and need for maturity hinder hybrid IT


To some degree, we’ve already hit that inflection point where technology is being used in inappropriate ways. A great example of this—and it’s something that just kind of raises the hair on the back of my neck—is when I hear that boards of directors of publicly traded companies are giving mandates to their organization to “go cloud.” The board should be very business-focused and instead they're dictating specific technology, whether it’s the right technology or not. That’s really what this comes down to.  Another example is folks that try and go all in on cloud but aren’t necessarily thinking about what’s the right use of cloud—in all forms: public, private, software as a service (SaaS). What’s the right combination to use for any given application? It’s not a one-size-fits-all answer. We in the enterprise IT space haven't really done enough work to truly understand how best to leverage these new sets of tools. We need to both wrap our head around it but also get in the right frame of mind and thought process around how to take advantage of them in the best way possible.


Enhancing digital infrastructure: Why it matters & the best strategies for your business

Change can be unsettling, and workers can understandably be apprehensive about any differences in role, focus or tasks, especially if it involves a technology element they don’t necessarily understand. For example; if moving to the ‘cloud’ you can’t just ‘buy cloud’ and hope everyone catches on to the concept overnight. No person or training course can transform single-handedly. There’s also a whole range of concepts to become familiar with for teams to work at the speed that cloud can enable. Focus on the move to ‘becoming cloud’ – a gradual process of change – which will transform the culture in a structured and less intimidating way. Help your staff by providing as much clarity as you can; translate top-line goals and priorities into specific metrics and KPIs for employees at all levels. Allow them time and space to experiment with new technology and let them know it’s ok to get things wrong! Encourage users to communicate and feedback so the right support is identified.


The evolution of systems requires an evolution of systems engineers

New frameworks, architectures, processes, and a thriving ecosystem of tools have emerged to help us meet those challenges. Some of these are in an embryonic state, but rapid adoption is driving quick maturity. We’ve seen this evolution in compute: it’s only been four years since containers became a mainstream technology, and we are now working with complex application-level abstractions enabled by tools like Kubernetes. A similar evolution is occurring with deployment, serverless, edge-computing technology, security, performance, and system observability. Finally, no changes can exist in a human and organizational vacuum. We have to develop the leadership skills necessary to build truly cross-functional teams and enable that rapid iteration needed to build these systems. We have to continue the work of the DevOps and SRE communities to break down silos, streamline transitions between teams and increase development velocity.


Surprisingly, These 10 Professional Jobs Are Under Threat From Big Data

When you read or hear news stories about the imminent takeover of robots and algorithms that will eliminate jobs for human workers, many times the first examples given are blue-collar jobs like factory workers and taxi drivers. And you may have mentally congratulated yourself because your “professional” job is safe from the threat of being outsourced to computers. But don’t feel so safe just yet. More and more, sophisticated algorithms and machine learning are proving that jobs previously thought to be the sole purview of humans can be done — as well or better — by machines. Boston Consulting Group has predicted that by 2025 as much as a quarter of jobs currently available will be replaced by either smart software or robots. A study out of Oxford University also suggested that as much as 35 percent of existing jobs in the U.K. could be at risk of automation inside the next 20 years.


IBM Watson Data Kits speed enterprise AI development

More than half of data scientists said they spend most of their time on janitorial tasks, such as cleaning and organizing data, labeling data, and collecting data sets, according to a CrowdFlower report, making it difficult for business leaders to implement AI technology at scale. Streamlining and accelerating the development process for AI engineers and data scientists will help companies more quickly gain insights from their data, and drive greater business value, according to IBM. "Big data is fueling the cognitive era. However, businesses need the right data to truly drive innovation," Kristen Lauria, general manager of Watson media and content, said in the release. "IBM Watson Data Kits can help bridge that gap by providing the machine-readable, pre-trained data companies require to accelerate AI development and lead to a faster time to insight and value. Data is hard, but Watson can make it easier for stakeholders at every level, from CIOs to data scientists."


An Incredible New Type of Brain Implant Can Boost Memory by 15%

By fine tuning the electrical activity of the probes, the scientists aimed to activate key components of the brain's memory network only when it struggled to store memories, but not when it was working fine. The basic concept itself of boosting memorisation and recall through neural stimulation is old ground. Neuroscientists have gradually progressed from using non-invasive Transcranial Magnetic Stimulation techniques to deep brain stimulation in an effort to tickle the right pathways and encourage the brain to store and reconnect with memories. While there have been encouraging successes by precisely targeting areas such as the hippocampus and medial temporal lobes, the results haven't always been consistent. Part of the problem could be the choice of location, but another issue could be the method. Past efforts have used what's called an open loop system, meaning the stimulation wasn't tweaked in response to the brain's activity.


Hyperconverged infrastructure gets its own Gartner magic quadrant

Hyperconvergence is an IT framework that combines storage, virtualized computing and networking into a single system to reduce data center complexity and increase scalability. Hyperconverged platforms include a hypervisor for virtualized computing, software-defined storage, and virtualized networking. The promise of hyperconverged infrastructure is simplicity and flexibility compared with legacy solutions. The integrated storage systems, servers and networking switches are designed to be managed as a single system, across all instances of a hyperconverged infrastructure. As hyperconvergence has caught on among enterprises, major system vendors have gotten into the action by acquiring startups or bundling their servers with HCI software through OEM arrangements. Gartner’s new magic quadrant specifically focuses on vendors that develop the core hyperconvergence software. The new magic quadrant drops the system hardware requirement that’s part of the HCIS appliance model, Gartner says.


A.I. and speech advances bring virtual assistants to work

“It is that idea of ambient computing, the idea that at any time I could just say ‘Alexa start my meeting,’ ‘Alexa how are my sales figures?’ or ‘Alexa I forgot to shut off the projector in the conference room please shut that off for me.’ It is very natural, it is very spontaneous,” Ibitski said. “And all of this is happening because of the advancements we have seen in what is called NLU, or natural language understanding. And that is the difference – it is understanding context.”  Collin Davis, general manager of Alexa for Business, said virtual assistants are already helping employees get work done. “What we are finding is a really interesting shift is happening, where voice is offering up almost another dimension of multi-tasking, where workers sitting at their desk can use Alexa almost as a vocal multi-tasker to be able to get information quickly without losing focus,” Davis said. “You could be working on a report and you need to know how many deals closed last quarter without having to reach into your pocket or find an app or switch websites – you just get the information that you need.”



Quote for the day:


"Leadership appears to be the art of getting others to want to do something you are convinced should be done." -- Vance Packard