Daily Tech Digest - May 14, 2018

Next-Gen ERP: Finance Leaders Transform into Superheroes


Finance leaders across many small and midsize businesses (SMBs) are truly modern-day business superheroes, flexing their influence across entire companies more than ever. They own responsibilities spanning regulatory compliance; treasury and asset management; investor relationships; and strategic advice to the CEO, president, or owner – all while concurrently managing their traditional finance, budgeting, and accounting functions. Today’s finance leaders are rising to prominence as they deal with significantly more business risk, with markets changing at an unrelenting pace and huge chunks of critical data becoming more readily available. Finance leaders don’t need “super vision” to see they must focus on making fact-driven decisions that potentially impact every area of the company – from recruiting to manufacturing and logistics. According to the SAP-sponsored Oxford Economics report, “How Finance Leadership Pays Off: Small and Midsize Business,” 82% of surveyed finance leaders are accepting this challenge. Yet, many still struggle with outdated technology and manual processes.



Crypto Fight: US Lawmakers Seek Freedom From Backdoors

"It is troubling that law enforcement agencies appear to be more interested in compelling U.S. companies to weaken their product security than using already available technological solutions to gain access to encrypted devices and services," Lofgren says. Other House lawmakers co-sponsoring the bill are Thomas Massie, R-Ky.; Jerrold Nadler, D-N.Y.; Ted Poe, R-Texas; Ted Lieu, D-Calif.; and Matt Gaetz, R-Fla. Their effort has earned plaudits from digital rights groups, including the Electronic Frontier Foundation, which on Thursday said that the bill "gets encryption right." The EFF's David Ruiz says in a blog post: "This welcome piece of legislation reflects much of what the community of encryption researchers, scientists, developers and advocates have explained for decades: there is no such thing as a secure backdoor." The move by technology vendors to strengthen data protections in their products has been fueled by ever-increasing cybercrime, hacking efforts sponsored by nation-states, and the scale of the mass surveillance programs being conducted by the U.S. and U.K. governments, as revealed in 2013 by former National Security Agency contractor Edward Snowden.


Google Duplex beat the Turing test: Are we doomed?

Modern AI scientists have called what became known as the Turing test somewhat simplistic, because computer intelligence can be seen in a wide variety of actions beyond the imitation of human conversation. Even so, Turing's test has gone essentially unsolved since he proposed it in 1950. The test is simple. In Volume LIX, Number 236 (October 1950) of Oxford University's MIND, a Quarterly Review of Psychology and Philosophy, Turing published a paper, Computing Machinery and Intelligence. While there were many important concepts in this document, one concept he put forth was what he called an "imitation game." There's a 2014 movie by that name, starring Sherlock's Benedict Cumberbatch. It's about Turing, and it's worth watching. In the imitation game, a second human, the "interrogator," communicates with both a human and a computer, sending what are essentially text messages to each and getting replies. If the interrogator cannot tell which of the two respondents is the human and which is the computer, the computer is said to have passed the Turing test: it has imitated a human so fully that a human couldn't tell the difference.
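The protocol Turing describes can be sketched in a few lines of code. The judge heuristic and the canned respondents below are purely illustrative, not anything from Turing's paper:

```python
def imitation_game(questions, human, machine, judge):
    """Toy sketch of Turing's imitation game. `human` and `machine` are
    callables that answer a question with text; `judge` receives the two
    anonymised transcripts and returns the label it believes is the
    machine. The machine passes if the judge picks wrong."""
    transcripts = {"A": [], "B": []}
    for q in questions:
        transcripts["A"].append(human(q))
        transcripts["B"].append(machine(q))
    return judge(transcripts) != "B"   # True: machine fooled the judge

# A naive judge that flags robotic, repetitive answers:
def judge(transcripts):
    for label, answers in sorted(transcripts.items()):
        if len(set(answers)) == 1:     # same reply every time -> machine
            return label
    return "A"

human = lambda q: f"hmm, about '{q}'... it depends"
bad_bot = lambda q: "I do not understand"
print(imitation_game(["How are you?", "What is love?"], human, bad_bot, judge))
# -> False: the repetitive bot is unmasked
```

A bot with varied replies would slip past this particular judge, which is exactly the point: the test is only as strong as the interrogator.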


Data Science for Startups: Introduction


This series is intended for data scientists and analysts who want to move beyond the model training stage and build data pipelines and data products that can be impactful for an organization. However, it could also be useful for other disciplines that want a better understanding of how to work with data scientists to run experiments and build data products. It is intended for readers with programming experience, and will include code examples primarily in R and Java. One of the first questions to ask when hiring a data scientist for your startup is: how will data science improve our product? At Windfall Data, our product is data, and therefore the goal of data science aligns well with the goal of the company: to build the most accurate model for estimating net worth. At other organizations, such as a mobile gaming company, the answer may not be so direct, and data science may be more useful for understanding how to run the business rather than improve products. However, in these early stages it’s usually beneficial to start collecting data about customer behavior, so that you can improve products in the future.


Scaffolding Entity Framework Core with CatFactory

Code generation is a common technique developers use to reduce the time spent writing code; most programmers build a code generator at some point in their professional lives. EF 6.x had a wizard for code generation that produced a DbContext and POCOs, but nothing for the Fluent API, repositories, or similar patterns. With .NET Core there is a command-line tool for code generation, but the scenario is the same: it generates only the DbContext and entities. With CatFactory we're aiming for a simple way to generate code with enterprise patterns. Please keep in mind that this is an alpha version of CatFactory, so don't expect a full code generation engine at this point. Why not use CodeDOM? CodeDOM is a complex code generation engine; that's not to say it's bad, but at the moment we're focused on generating code in the simplest possible way. We may add CodeDOM integration in upcoming versions.
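CatFactory itself is a .NET library, but the underlying technique of scaffolding, filling a template from a table definition, can be sketched in a few lines. The template, type map, and function names here are invented for illustration and are not CatFactory's API:

```python
# Minimal sketch of template-driven scaffolding: given a table
# definition, emit a POCO-style entity class as text.
ENTITY_TEMPLATE = """public class {name}
{{
{properties}
}}"""

TYPE_MAP = {"int": "int", "varchar": "string", "datetime": "DateTime"}

def scaffold_entity(table_name, columns):
    """columns: list of (column_name, sql_type) pairs."""
    props = "\n".join(
        f"    public {TYPE_MAP[sql_type]} {col} {{ get; set; }}"
        for col, sql_type in columns
    )
    return ENTITY_TEMPLATE.format(name=table_name, properties=props)

print(scaffold_entity("Product", [("Id", "int"), ("Name", "varchar")]))
```

A real scaffolder reads the table definitions from the database schema instead of a hand-written list, and emits the DbContext, Fluent API mappings, and repositories from further templates in the same way.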


This malware is harvesting saved credentials in Chrome, Firefox browsers

The new malware has a subset of the same functionality but has also been upgraded with an arsenal of expanded features, including a new network communication protocol and Firefox-stealing functionality. Vega Stealer is also written in .NET and focuses on the theft of saved credentials and payment information in Google Chrome. These credentials include passwords, saved credit cards, profiles, and cookies. When the Firefox browser is in use, the malware harvests specific files -- "key3.db", "key4.db", "logins.json", and "cookies.sqlite" -- which store various passwords and keys. However, Vega Stealer does not stop there. The malware also takes a screenshot of the infected machine and scans for any files on the system ending in .doc, .docx, .txt, .rtf, .xls, .xlsx, or .pdf for exfiltration. According to the security researchers, the malware is currently being utilized to target businesses in marketing, advertising, public relations, retail, and manufacturing. The phishing campaign designed to propagate the malware, however, is not sophisticated.


Growing CDN services market changes to meet cloud needs

The basic purpose of a CDN is still the same. But cloud use, growing reliance on mobile devices and application developers' needs to optimize their platforms are driving demands for CDN services that boost network performance and scalability, according to Ted Chamberlin, research vice president of cloud service providers at Gartner. Enterprises need their websites to be as dynamic as possible, and now they're looking at other pain points and turning to their CDN providers for help, he said. "They're saying, 'What else?'" That "what else" includes services like web application firewalls, distributed denial-of-service (DDoS) protection, bot mitigation, streaming video and e-commerce optimization.  Most of this happens through cloud platforms. The use of cloud-based CDN services continues to grow because they improve capabilities of web applications and storage, Chamberlin said. "Cloud is spurring everybody to do more than static content."  The general consensus is CDN services are in for a period of big growth. MarketsandMarkets forecasts the CDN services market will grow from $7.5 billion in 2017 to $30 billion in 2022, as CDN providers focus on security, compression, video, web optimization and data duplication features.


ASP.NET Core - The Power of Simplicity


Microsoft decided to go all-in on the Open Web Interface for .NET, or OWIN as it’s also known, and abstract away the webserver completely. This allows the framework, as well as its users, to completely ignore which server is responsible for accepting the incoming HTTP requests and instead focus on building the functionality that is needed. OWIN isn’t a new concept, though. The OWIN specification has been around for quite a few years, and Microsoft has allowed developers to use it while running under IIS for almost as long, through an open source project called Project Katana. In reality, Microsoft hasn’t just allowed us developers to use it through Katana; it has been the foundation for all ASP.NET authentication functionality for several years. So, what is OWIN really? To be honest, it’s fairly simple! And the simplicity is actually the thing that makes it so great. It’s an interface that manages to abstract away the webserver using only a predefined delegate and a generic dictionary of string and object. So instead of having an event-driven architecture where the webserver raises events that you can attach to, it defines a pipeline of so-called middlewares.
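The core OWIN idea, that an application is just a delegate over an environment dictionary, translates almost directly into other languages. Here is a rough Python analogue; the dictionary key names are made up for illustration and are not the real OWIN keys:

```python
# The "application" is just a callable taking an environment dictionary,
# and middlewares wrap each other to form a pipeline -- no webserver
# types appear anywhere.
def app(env):
    env["response.status"] = 200
    env["response.body"] = f"Hello {env.get('user', 'anonymous')}"

def auth_middleware(next_app):
    """Middleware: inspect the environment, then pass control down."""
    def middleware(env):
        token = env.get("request.headers", {}).get("Authorization")
        if token == "secret":
            env["user"] = "alice"
        next_app(env)              # call the next stage in the pipeline
    return middleware

pipeline = auth_middleware(app)
env = {"request.headers": {"Authorization": "secret"}}
pipeline(env)
print(env["response.body"])        # -> Hello alice
```

The server's only job is to build the environment dictionary from the raw HTTP request and invoke the pipeline, which is why any OWIN-compatible server can host the same application.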


The rise of outcome-driven software development

In theory, outcome-driven development is about investigating customer or end-user needs in order to work toward desired outcomes. As a business idea, the outcome-based methodology has been circulating since at least 2002, when a Harvard Business Review contributor outlined a multi-step outcome-based process for business growth, beginning with conducting outcome-focused customer interviews; registering and noting desired outcomes; organizing and rating those outcomes based on degrees of customer satisfaction; and finally harnessing desired outcomes to inform product design. But if the theoretical basis for outcome-driven development was laid more than 15 years ago, it’s only in recent years that we’ve seen it take hold in industries like software development, where the traditional “Big Bang” software launch is quickly being supplanted by a model of continuous development and delivery. Rather than focus on perfecting a piece of software in time for a perfect launch, innovative development teams view software as a constant work-in-progress.


IoT and personal devices pose huge security risk to enterprises


While 88% of the IT leaders that responded to the survey believe their security policy is either effective or very effective, nearly a quarter of employees from the US and UK did not know if their organisation had a security policy. Of those that reported that their organisation did have a security policy for connected devices, 20% of UK respondents claimed they either rarely, or never, follow it. Only one-fifth of respondents in the US and UK reported that they followed it to the letter. While security policies and security awareness have their place, they also have their limitations, according to RBS CISO Chris Ulliott. Commenting specifically on cyber security awareness training programmes, he told attendees of CrestCon 2018 in London that security professionals need to realise the limitations of such programmes. Ulliott is among those information security professionals who believe that device manufacturers and service providers need to put more effort into making things secure by design so they are safe to use without any fear of security risk.



Quote for the day:


"Grounded leaders are present for others, operate with fortitude, and influence with the full impact of their vision and strength." - Catherine Robinson-Walker


Daily Tech Digest - May 13, 2018

Routing Innovations for the Cloud Era

Modern cloud grade routing architectures improve network economics by increasing network utilization and service availability. They offer end-to-end entropy friendly traffic load balancing - from multi-homed service edges to much simpler ECMP friendly SPRING and IP fabric cores. Traffic load balancing across all available paths improves network utilization and simplifies network capacity planning by easy scale out, without requiring traffic re-engineering. Additionally, multi-pathing architectures improve service availability and reduce failure domains since traffic can reroute to an alternate path within milliseconds of a failure. Even better, multi-pathing architectures improve capital efficiency and network economics by allowing operators to run their networks ‘hotter,’ without compromising service SLAs. ... Ultimately, the great advantage of cloud grade networking is architectural simplicity that improves service agility and efficiency. With Juniper, deploying IP fabrics, EVPN, SPRING, RIFT and the Northstar Controller complement current network operations and architectures, and provide a graceful network transformation to modern, cloud era architectures.


Where Bank of America uses AI, and where its worries lie

“There's a chance AI models will be biased,” said Caroline Arnold, BofA's head of enterprise technology (which includes HR tech). “You might say, who's going to be successful at this company? An AI engine could find that people who golf are going to be successful at the company. On the other hand, using those same techniques can remove bias if you have the model ignore some of these things that are indicators of different groups but go on to the meat of the profile of the person and understand it in a deeper way.” Arnold believes an AI engine can never be the final say in who gets hired. Mehul Patel, CEO of Hired, a technology company whose software uses AI to match people to jobs, agreed that AI and humans have biases. “The good news about AI is, you can fix the bias,” he said. “We will boost underrepresented groups. The trouble with humans is they can't unwire their bias easily. Human bias far outweighs algorithmic bias. That's because we humans make quick decisions on people that aren't founded on what you're looking for in the job.”


Can blockchain technology live up to the hype? Barclays analysts say no


“It is high time to end the hype. Bitcoin is a slow, energy-inefficient dinosaur that will never be able to process transactions as quickly or inexpensively as an Excel spreadsheet,” wrote Nouriel Roubini, economist and cryptocurrency skeptic, in a recent Project Syndicate column he co-wrote called “The Blockchain Pipe Dream.” Of course, the advocates of blockchain are as ardently optimistic about what the technology can do, comparing blockchain with the early days of the internet. “It’s easy to compare blockchain with the internet due to the surrounding attention and the amount of money being poured into the respective spaces, but this only gives me more confidence that the technology will prevail long term,” said James Tabor, CEO of Media Protocol in an email to MarketWatch. “In the same vein as the internet made the flow of communication seamless and information readily available, blockchains can dismantle the centralized powers that have caused so much pain across all industries,” Tabor said.


Three elements drive interest in regulatory tech

According to a Juniper Research report, spending on regtech will grow by an average of 48% per annum over the next five years, rising from $10.6 billion in 2017 to $76.3 billion in 2022, as banks and financial services firms seek to avoid costly regulatory fines. Brennan Wright, head of marketing at identity verification and compliance company ThisIsMe, says the current staffing component dedicated to regulatory compliance within financial services organisations will fall to 1% to 2% by 2025, as new regtechs are introduced. "Technologies such as risk data aggregation and reporting tools, fraud detection tools and client onboarding systems will continue to empower compliance teams in the short term and will eventually replace many back-office positions; especially those mundane and admin-intensive roles. "The theme of change will favour legal and compliance teams that are technically savvy, have the necessary creative foresight and an ability to leverage the rapid innovation necessary to keep costs down, systems running smoothly and regulation in check," Wright points out.
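The two Juniper figures quoted above are internally consistent: compounding the 2017 base at the quoted annual growth rate for the five years to 2022 lands close to the forecast. A quick check:

```python
# Sanity check of the quoted figures: grow $10.6B by 48% a year
# for the five years from 2017 to 2022.
start, rate, years = 10.6, 0.48, 5
projected = start * (1 + rate) ** years
print(round(projected, 1))   # roughly 75.3, close to the quoted $76.3B
```

The small gap suggests the report's 48% is a rounded average annual growth rate rather than the exact compound rate.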


The Law of Blockchain: Beyond Government Control?

In the case of blockchain, it’s still early days and Blockchain and the Law reflects that. It contains little in the way of case law (blockchain disputes are only now coming before judges), and the authors, Primavera De Filippi and Aaron Wright, spend considerable time explaining just how blockchains work. Namely, they emphasize how blockchain software creates permanent ledgers that are distributed across multiple computers and are mostly beyond the reach of central authorities. The upshot is what the authors call “lex cryptographica” or a system of rules where autonomous, decentralized code — rather than legislators or judges — determine the outcome of given interactions and disputes. This has the potential to bring dramatic changes in fields like corporate and insurance law. For instance, a blockchain can distribute dividends to shareholders according to pre-coded smart contracts. Or, in the event of an earthquake, an insurer’s blockchain can consult a third-party server (known as an “oracle” in blockchain parlance) to obtain seismic information and arrange payouts.
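Real smart contracts run as on-chain code (e.g. in Solidity), but the pro-rata dividend logic the authors describe is simple to sketch. The figures and names below are invented for illustration:

```python
# Illustrative only: the dividend logic a smart contract might encode,
# written as plain Python rather than on-chain contract code.
def distribute_dividends(total_dividend, holdings):
    """Split a dividend pro rata across shareholders by share count."""
    total_shares = sum(holdings.values())
    return {owner: total_dividend * shares / total_shares
            for owner, shares in holdings.items()}

payouts = distribute_dividends(1000.0, {"alice": 60, "bob": 40})
print(payouts)   # -> {'alice': 600.0, 'bob': 400.0}
```

The "lex cryptographica" point is that once such logic is deployed on a blockchain, the payout happens automatically and no central authority can intervene to change it.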


Connect the Dots: IoT Security Risks in an Increasingly Connected World

For organizations deploying IoT technology, it’s crucial to establish an incident response team to remediate vulnerabilities and disclose data breaches to the public. All devices should be capable of receiving remote updates to minimize the potential for threat actors to exploit outlying weaknesses to steal data. In addition, security leaders must invest in reliable data protection and storage solutions to protect users’ privacy and sensitive enterprise assets. This is especially critical given the increasing need to align with data privacy laws, many of which impose steep fines for noncompliance. Because some regulations afford users the right to demand the erasure of their personal information, this capability must be built into all IoT devices that collect user data. Organizations must also establish policies to define how data is collected, consumed and retained in the IT environment. To ensure the ongoing integrity of IoT deployments, security teams should conduct regular gap analyses to monitor the data generated by connected devices. This analysis should include both flow- and packet-based anomaly detection.
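As a sketch of what flow-based anomaly detection on device telemetry can look like, the snippet below flags a device whose traffic deviates sharply from its own history. The threshold and numbers are illustrative assumptions, not from any particular product:

```python
from statistics import mean, stdev

def flag_anomaly(history, current, z_threshold=3.0):
    """history: past per-interval byte counts for one device;
    current: the latest reading. Flag readings far from the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

normal_traffic = [1000, 1100, 950, 1050, 1000]
print(flag_anomaly(normal_traffic, 1020))   # -> False (within normal range)
print(flag_anomaly(normal_traffic, 9000))   # -> True (sudden spike)
```

Packet-based detection works at a finer grain (inspecting payloads and protocol behaviour), but the per-device baseline idea is the same.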


Making The Case For Hybrid Cloud

Enterprises have a complicated relationship with the cloud. Infrastructure-as-a-service (IaaS) offerings from public-cloud providers offer appealing alternatives to acquiring and provisioning on-premises hardware. And line-of-business organizations love being able to subscribe to software-as-a-service (SaaS) offerings that bypass IT altogether. But application development and deployment teams—the people the company charges with leading the digital transformation—have to work harder to gain the benefits cloud computing promises. And clouds add new facets to IT environments already struggling under the weight of too much of a good thing. But now, hybrid clouds—private, on-premises clouds linked to public clouds with data and applications shared among them—promise to take the enterprise’s love affair with cloud computing to a new level. Descriptions of the cloud’s role in enterprise computing vary widely with who’s doing the describing. Public-cloud providers see almost all enterprise workloads moving to, yes, public clouds. To enable that transition, they’ve shored up their offerings with heightened security features. They offer service-level agreements covering availability and performance.


Connecting Enterprise IT Models to Institutional Missions and Goals

There is no doubt that replacing an ERP system requires a significant up-front investment. We needed a way to assess the cost of continuing operations with our existing ERP system against the cost of implementation and support for a replacement. To build these cost and value estimates, we worked closely with many IT teams including application support, infrastructure, data management, and client services to build a return on investment (ROI) model. In addition to licensing and maintenance costs, we looked at ongoing on-premises costs to support infrastructure, backups, and disaster recovery. We included the costs of satellite systems, such as the staff, faculty, and student portal, that we had developed over the years to improve the user experience. Finally, we factored in the cost to rewrite custom-developed modules if we stayed on the existing system. We ended up with a financial model that evaluated the 10-year costs of staying with our current ERP system against costs incurred in the implementation and support of a replacement.
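A model like the one described ultimately compares two cumulative cost streams over the same horizon. A toy version, with every figure invented for illustration (amounts in $M):

```python
# Hedged sketch of a 10-year cost comparison: staying on the current
# ERP (no up-front cost, but growing run costs) versus replacing it
# (large up-front investment, lower flat run costs).
def ten_year_cost(upfront, annual_run_cost, annual_growth=0.0, years=10):
    total = upfront
    cost = annual_run_cost
    for _ in range(years):
        total += cost
        cost *= 1 + annual_growth    # e.g. maintenance and rewrite creep
    return total

stay = ten_year_cost(upfront=0, annual_run_cost=2.0, annual_growth=0.05)
replace = ten_year_cost(upfront=8.0, annual_run_cost=1.2)
print(round(stay, 2), round(replace, 2))
```

In the real model, `upfront` would bundle implementation and licensing, and the run costs would bundle infrastructure, backups, disaster recovery, satellite systems, and custom-module rewrites as the article lists.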


Open Reference Architecture for Security and Privacy Documentation

Privacy is becoming more and more important. New technologies make our lives better but put our freedom and privacy under pressure. Analyzing large amounts of data makes it easier to detect terrorists and (cyber)criminals, and more data about more people can also help cure diseases. But at present these improvements come at a high price: big data analytics systems go over your user data and the traces you leave (e.g. mouse movements on web pages, location data) many times a day. Companies know better than you what you will need, think, and eat tomorrow. Your location is continuously tracked through all the communication devices you use; public transport can no longer be used anonymously, and cars are full of track-and-trace technology. When privacy, like security, is designed in from the start, there is less reason to worry about security and privacy breaches. And if more IT designs are open and published under an open license, the chance of mistakes in architecture and design shrinks: partly through the pressure of openness, but also because more experts can contribute to lowering the security and privacy risks of public and private systems.


The Multiplier Effect of Collaboration for Security Operations

Today, state, local and federal agencies are much better equipped to collaborate and coordinate response with real-time situational awareness and actionable situational intelligence.  We’re experiencing a similar evolution in the world of cybersecurity. For years, we’ve relied on a defense-in-depth approach to security where each team uses different point products from different vendors to protect valuable digital assets and systems. The problem is that these disparate technologies don’t interoperate, and each has its own intelligence, making it extremely difficult for tools and teams to share intelligence, collaborate and coordinate response. When security teams are dispersed all over the world, the challenge is even greater. This is where a threat intelligence platform comes into play. It can serve as the glue to integrate these disparate technologies. Automatically exporting and distributing key intelligence across the many different layers of your defense-in-depth architecture, it offers your different security teams access, as part of their workflow, to the threat intelligence they need to improve security posture and reduce the window of exposure and breach.



Quote for the day:


"You can't save time. You can only spend it, but you can spend it wisely or foolishly." -- Benjamin Hoff


Daily Tech Digest - May 12, 2018

Boston Dynamics' SpotMini robot dog goes on sale in 2019


Who'll buy it? Probably not you, at least to start.  Raibert didn't reveal price plans, but said the SpotMini robots could be useful for security patrols or for helping construction companies keep tabs on what's happening at building sites. SpotMini can be customized with attachments and extra software for particular jobs, he said. Eventually, though, the company hopes to sell it for use in people's homes. "Most places have something where wheels don't get you everywhere," Raibert said. "We think SpotMini can go to a much larger fraction of places." Boston Dynamics is among the highest-profile robot companies out there. It made a bang with its gas-powered Big Dog quadruped, which could navigate challenging terrain while keeping its balance. Later, the company unveiled Atlas, a humanoid robot that can do flips, pick up boxes and can now run. SpotMini, whose development began while Boston Dynamics was a Google subsidiary, is remarkable for being cute, as well as fascinating to watch. That's pretty valuable given how leery a lot of us are about our future robot overlords.



Back to the Future: Demystifying Hindsight Bias


When using the original dataset, information about the target label crept into the training data. Boat and Body are only known in the future, after the event has already occurred. They are not known in the present when making the prediction. If we train the model with such data, it will perform poorly in the present, as that piece of information would not legitimately be available. This problem is known formally as hindsight bias. And it is predominant in real-world data, which we’ve witnessed first-hand while building predictive applications at Salesforce Einstein. Here is an actual example in the context of predicting sales lead conversion: the data had a field called deal value which was populated intermittently when a lead was converted or was close to being converted (similar to the Boat and Body fields in the Titanic story). In layman's terms, it is like Marty McFly (from Back to the Future) traveling to the future, getting his hands on the Sports Almanac, and using it to bet on the games of the present. Since time travel is still a few years away, hindsight bias is a serious problem today.
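A minimal guard against this kind of leakage is to drop, before training, every field that is only populated after the outcome is known. The sketch below echoes the field names from the examples above, but the code itself is hypothetical:

```python
# Guard against hindsight bias: strip features that would not be
# available at prediction time before they reach the model.
LEAKY_FIELDS = {"deal_value", "boat", "body"}   # known only post-outcome

def strip_hindsight(rows, leaky=LEAKY_FIELDS):
    """Return training rows without features unavailable at prediction time."""
    return [{k: v for k, v in row.items() if k not in leaky}
            for row in rows]

rows = [{"age": 29, "fare": 71.3, "boat": "5", "survived": 1}]
print(strip_hindsight(rows))
# -> [{'age': 29, 'fare': 71.3, 'survived': 1}]
```

The hard part in practice is building that leaky-field list: it requires knowing, for each field, *when* it gets populated relative to the event being predicted, which is metadata most datasets do not carry.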


Cloud-Based Product Lifecycle Management Market is touching new levels

HTF MI recently introduced a new title, “Global Cloud-Based Product Lifecycle Management Market Size, Status and Forecast 2025,” from its database. The report provides an in-depth overview describing the product/industry scope and elaborates the market outlook and status to 2025. It gives you a competitive analysis of top manufacturers by sales volume, price, revenue (million USD) and market share; the top players include Dassault Systemes, Siemens AG, PTC Inc, Oracle Corporation, SAP SE, Autodesk, Inc, Arena Solutions, Aras, Infor & Accenture PLC. In this report the global Cloud-Based Product Lifecycle Management market is classified on the basis of product, end user, and geographical region. The report includes in-depth data on revenue generation by region and the major players in the Cloud-Based Product Lifecycle Management market.


The future for service – will you focus on AI, voice or search?


Sadly, service delivery today is anything but routine, predictable or scalable. Take a new application built in the cloud – an issue with the cloud provider could lead to all customers being locked out of their service. With each and every customer suddenly needing assistance, scaling up to cope with the problem is difficult; diagnosing the issue with a supplier is also tricky. Coping with a bigger problem and automating responses where possible is therefore necessary. In the State of the Service Desk Report, 13,000 service desk teams provided their insights into what is working and what is needed to cope in future. Around 69 per cent of front-line responders stated that they spent too much time firefighting, rather than being able to plan ahead through better problem management. Similarly, around a quarter pointed to increased automation as essential for their efficiency. Yet each company will have to look at its own approach to automation – there is no one-size-fits-all solution. There are a number of new options that service teams can take to evolve their approach – voice, AI and search.


How to Achieve Sustainable Employee Engagement in Healthcare

Enabling employees to do meaningful work is critical to employee engagement, and requires a consistent feedback loop and the right systems and processes to support them. Technology can be a powerful accelerant that offloads mundane tasks and allows employees to apply their skills and expertise to the things that technology can’t do—innately human things that require empathy, connectivity, communications, and influence. Unfortunately, many healthcare organizations are still operating on legacy systems and their employees are bogged down by slow technology that prevents them from fully engaging in their jobs. These employees end up spending significant time working on things that they weren’t hired to do such as piecing together and fact-checking spreadsheets and reports—activities that they should be able to do within the technology. The right technology will allow your workforce to do their best work by making what encompasses their role more automated, manageable, and efficient. And as regulations and patient expectations continue to change, the systems you choose should be agile enough to change with your organization’s needs. 


Three Fintechs leading Open banking initiatives in the UK

As the world starts warming up to the open banking culture, there is always going to be a tug of war between control and agility. As regulators tune their policies around data sharing and open banking, they will have to make decisions on how much control financial services firms have over customer data. At the same time, it is critical to work towards an agile open banking framework within a controlled and secure data sharing ecosystem that takes care of customers’ interests. The UK, as in most other aspects of fintech, has been spearheading open banking in policy and execution, but it would be myopic to assume that open banking starts and ends in the UK. I have touched upon different regulatory approaches to open banking and customer data sharing across the globe in my previous posts. Today, I focus on three of my favourite fintechs in the UK that are regulated to provide open banking services. ... These players and a few others not only add efficiencies for their business through open banking APIs and data analytics, but also create opportunities for businesses partnering with them.


The ethics lessons will continue until morality improves

So, why didn't Build start with that? For exactly the same reason that reactions to Google Duplex have been so divided: because technology powered by AI has the potential to make our lives far, far better -- or far, far more unbearable. Microsoft showed a meeting room camera system that recognised people walking into the room, greeted them by name, and transcribed every word they said -- even if their deafness made them a little harder to understand. That deaf team member could join in at an equal level with everyone else, and so could remote colleagues. Everyone got a list of what they had said they were going to do, delivered to their to-do lists. Empowering and convenient -- exactly the kind of system the $25 million AI for Accessibility grant programme Nadella announced is there to create. The same system in a railway station in a country with an authoritarian government, or even left on in an HR meeting room where someone is trying to report an abusive boss, would be deeply worrying. Google showed its Duplex assistant phoning a restaurant and sounding enough like a human to be treated like a real customer.


The hybrid cloud provides a best of both worlds solution

The direction that cloud services and providers are heading at the moment can be summed up in two major points. First, cloud providers are focusing on expanding their infrastructure and making it available in a number of different geographical locations. Second, they are ensuring that a variety of options and services, including IaaS and PaaS layers, are available to their users so that none are turned away. One may point out that cloud providers are not working as actively on security solutions, but this is offset by the shared responsibility model they have adopted, which treats cloud security as a responsibility shared equally between provider and user. This is why a hybrid cloud system seems to be the ideal solution: it allows enterprises to stay on top of the tech race with the cloud while retaining critical workloads on-premises to ensure their security. Despite a great number of entrants finding a haven in cloud and data centre technologies, a proper and flexible security solution for hybrid cloud systems remains to be formulated.


Coaching with Curiosity Using Clean Language and Agile


The Clean for Teams training is all about getting the team to be curious and supportive of each other using Clean Questions. It works wonders as long as people use no more than three questions in a row at a given time, keeping it light and not going as deep as you might in professional performance coaching or therapy.  In a recent workshop I gave, two colleagues were pairing up to practice the questions that they had just learned. They decided to use as a topic a discussion they had had the prior day at work. During the debrief, one commented that the trajectory of the conversation had been richer and more revealing than had been the conversation the day before. They used only a few questions and had had only 15 minutes of exposure to Clean Language. So yes, it is possible with the right guidance to put it to use in your everyday work, whether in a coaching relationship or not. You will experience an improvement in the way people relate to you and you to them, which is one of the outcomes of good coaching. The conditions for peer-to-peer coaching include having a space to listen, and a technique to separate out your own thinking so that you can stay within the mental model of the person you are listening to.


Understand Microservices Monitoring


The ultimate goal, of course, is for processes, errors, and bottlenecks to be managed in ways that are totally transparent to end users, as microservices-based platforms fix themselves with the help of microservices analytics. In the event of a bottleneck, for example, an end customer who tries to buy a widget or service on the web would ideally never receive an error message prompting them to "try again later." Developing microservices orchestrations and the associated analytics capabilities in-house is easier said than done, of course. To that end, third parties have emerged with solutions and services for organizations that lack the resources to develop these architectures themselves. "Microservices are moving toward mainstream use today and often show many integration points with existing monolithic enterprise applications," said Torsten Volk, an analyst for Enterprise Management Associates (EMA). "Meanwhile, vendors of DevOps-centric application and infrastructure analytics software are stepping up to monitor this often complex and dynamic world of applications consisting of shared services with often disconnected release schedules."
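One common building block of that self-healing behaviour is retrying transient failures with exponential backoff, so the end user never sees a "try again later" message. A minimal Python sketch of the idea (the FlakyService and its buy_widget method are invented for illustration, not from the article):

```python
import time

def call_with_retry(operation, max_attempts=4, base_delay=0.01):
    """Retry a flaky operation with exponential backoff so the
    caller never sees a transient failure."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up only after all attempts are exhausted
            time.sleep(base_delay * (2 ** attempt))  # back off: 10ms, 20ms, 40ms...

# Simulate a service that fails twice before succeeding.
class FlakyService:
    def __init__(self):
        self.calls = 0

    def buy_widget(self):
        self.calls += 1
        if self.calls < 3:
            raise ConnectionError("temporary bottleneck")
        return "order confirmed"

service = FlakyService()
result = call_with_retry(service.buy_widget)  # transparently absorbs two failures
```

Real platforms layer circuit breakers and monitoring on top of this, but the principle is the same: transient errors are handled inside the platform rather than surfaced to the customer.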



Quote for the day:


"To make a decision, all you need is authority. To make a good decision, you also need knowledge, experience, and insight." -- Denise Moreland


Daily Tech Digest - May 09, 2018

Europe may come to regret its new set of data rules


Worse, the rules could impede innovation. Many blockchain companies could be shut out entirely. Cloud computing may become substantially more complicated. Systems that rely on artificial intelligence could in many cases be incompatible with the GDPR’s mandates. It’s an ominous sign that Facebook has already started pulling some data projects from Europe. Yet all this is more or less by design; there will also be unintended consequences. Although the GDPR aims to improve data security, for instance, its privacy rules may compromise a crucial tool used by security researchers, thereby increasing spam, phishing attacks and malware. Its compliance costs could inhibit cybersecurity investment. Its emphasis on obtaining consent for data collection is, in practice, likely to mean endless “click to proceed” boxes that leave customers little more informed — and significantly more irritated — than before. For all these drawbacks, the EU deserves credit for illuminating — and attempting to resolve — a very real problem. European law enshrines a right to privacy. 


In Cybersecurity, Accountability Could be the Ultimate Innovation

Sacrificing short-term gains to reinforce the company’s mission has understandably been a big positive for their brand—and it’s been great for their business. In December of 2017, CVS announced it would buy Aetna, a move that could very well reshape the health insurance landscape in this country. Cybersecurity is an industry that can desperately use a dose of accountability-as-innovation. Accountability in cybersecurity is virtually non-existent. Despite billions of dollars spent worldwide on cybersecurity solutions, our position in cyberspace is now more precarious than ever. Recently, the World Economic Forum’s (WEF) Global Risks Landscape 2018 ranked cyber attacks alongside extreme weather events and the prospect of nuclear war as the most likely and dangerous risks threatening the stability of society. That means, on the internet, “attackers could trigger a breakdown in the systems that keep societies functioning.” Which we just saw happen last month when cyber actors held critical services provided by the city of Atlanta for ransom and even took Baltimore’s emergency 911 response system offline.


Forget Windows; Microsoft is now all about the cloud

Windows resides in the More Personal Computing segment, the revenue leader, but don’t let that deceive you. A closer look tells the real story. ... There’s no breakdown of Windows versus cloud, but Microsoft did say the Azure public cloud’s revenue boomed 93% year over year. The previous quarter it grew 98% year over year. And Microsoft also said that what it calls its “commercial cloud,” made up of Azure, Office 365, Dynamics 365 and other cloud services, brought in $6 billion in revenue in the third quarter, which was up 58% year over year. The More Personal Computing segment was up far less — only 13% year over year. Also notable in the third quarter: Windows and Devices chief Terry Myerson left the company. You can be sure he didn’t depart because Microsoft was going to devote more attention to Windows. Keep in mind, also, that a lot of Microsoft products are now essentially cloud-based, so there’s even more cloud revenue at the company than first meets the eye. Microsoft Office, for example, is increasingly a cloud service, with the company pushing Office 365 heavily over the client version of the Office suite.


What is an API? Application programming interfaces explained

Diving a little deeper, an API is a specification of possible interactions with a software component. For example, if a car was a software component, its API would include information about the ability to accelerate, brake, and turn on the radio. It would also include information about how to accelerate: Put your foot on the gas pedal and push. The “what” and “how” information come together in the API definition, which is abstract and separate from the car itself. One thing to keep in mind is that the name of some APIs is often used to refer to both the specification of the interactions and to the actual software component you interact with. The phrase “Twitter API,” for example, not only refers to the set of rules for programmatically interacting with Twitter, but is generally understood to mean the thing you interact with, as in “We’re doing analysis on the tweets we got from the Twitter API.” Let’s dig in by looking at the Java API and the Twitter API as examples. First, we’ll get a quick picture of these two APIs and how they fulfill the definition of “what” and “how.” Then, we’ll talk about when you’ll likely use APIs and what goes into a well-designed API.
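The car analogy can be made concrete. Here is a minimal Python sketch (the names CarAPI and HatchbackImpl, and the numbers, are invented for illustration) showing the specification of possible interactions kept separate from the component that implements them:

```python
from abc import ABC, abstractmethod

class CarAPI(ABC):
    """The 'what': the interactions a car exposes, abstract and
    separate from any particular car."""
    @abstractmethod
    def accelerate(self, pedal_pressure: float) -> float: ...

    @abstractmethod
    def brake(self) -> None: ...

    @abstractmethod
    def radio_on(self) -> str: ...

class HatchbackImpl(CarAPI):
    """One concrete component behind the interface; callers depend
    only on CarAPI, never on this class's internals."""
    def __init__(self):
        self.speed = 0.0

    def accelerate(self, pedal_pressure: float) -> float:
        self.speed += 10.0 * pedal_pressure  # the 'how' is hidden here
        return self.speed

    def brake(self) -> None:
        self.speed = 0.0

    def radio_on(self) -> str:
        return "radio playing"

car: CarAPI = HatchbackImpl()
speed = car.accelerate(0.5)  # caller uses only the API's contract
```

Swapping in a different implementation behind the same CarAPI leaves every caller unchanged, which is exactly the point of separating the specification from the component.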



Antipattern of the Month: Unresolved Proxy

Any proxy must be respected as having executive authority regarding value, so as not to be undermined. This includes authority over the articulation and ordering of work on a Product Backlog and how it is represented to the Development Team. The proxy must be a genuine and competent representative of the "real" PO, and recognized as being fully able to take decisive action and to provide information in a timely way. Unfortunately, though, a proxying model can be resorted to as a salve when genuine product ownership is weak. Stakeholders might expect a certain product capability to be available, but none may necessarily wish to own it. This can be the case with middleware for example. Several proxies might then be used, each of whom will represent certain capabilities on behalf of a notional though absent Product Owner. Great discipline is needed when a single clear proxy is unrecognized, since all must then agree to establish compensatory protocols through which they collaborate beyond their narrow interests.


Why CEOs Should Embrace Minimally Viable Moves

Minimally viable moves allow companies to pursue big bets with incremental amounts of risk instead of big sweeping chunks. It’s akin to an MVP (minimally viable product), which is designed to represent just enough of a new market-facing offer that you can get real feedback about it and course-correct as necessary. An MVM involves making just enough of an organizational change to determine whether or not the move will be valuable to your business. This is beneficial and empowering for business leaders at all levels. Instead of feeling that responding to disruption is equivalent to betting the farm, MVMs provide enough cover so that if mistakes happen, decision-makers don’t feel forced back to the drawing board. Going slow and steady allows for on-the-fly adjustments and never having to double back because of hastiness. Which minimally viable move you make depends on your organization and your objectives. For example, you can alter protocol for a common type of decision, skip a management feedback step in preparing for a customer visit, or shift hiring practices for a certain role.


Windows critical flaw: This security bug is under attack right now

In an advisory crediting Qihoo 360 Core Security researchers and Kaspersky Lab malware analysts for discovering a critical bug tagged as CVE-2018-8174, Microsoft details a remote code execution flaw residing not in Internet Explorer but in the Windows VBScript engine. However, it also explains that the bug can be exploited through Internet Explorer. Microsoft hasn't confirmed this is the bug reported by Qihoo 360 Core Security but notes the flaw is being exploited in the wild. "In a web-based attack scenario, an attacker could host a specially crafted website that is designed to exploit the vulnerability through Internet Explorer and then convince a user to view the website," Microsoft notes. "An attacker could also embed an ActiveX control marked 'safe for initialization' in an application or Microsoft Office document that hosts the IE rendering engine." Observed attacks have started with a malicious Word document, which when opened downloads an exploit written in VBScript that's hosted on a webpage, according to malware analysts at Kaspersky Lab.


Google’s developer show highlights the promise and perils of its data hoard


Google has stressed that its data systems are more secure and it keeps information anonymous. "We've long had a very robust and strong privacy program at Google," Pichai told investors last month. Yet Google already gives many Android app creators access to a sea of personal information, including location history and some shopping behavior. And it has been routinely criticized for the vast targeting in its advertising business and the spread of misinformation on search results and YouTube. Last week, Google said its latest security product, which restricts outside access to personal accounts like Gmail, was available for the iPhone. In March, it unveiled a new plan to stamp out fake news. Expect similar announcements at I/O. But the company will have to offer developers new features, too, some of which will likely give them fresh ways to track where people go and how they interact with their devices. There are about 25 conference sessions this week on the Google Assistant, a voice-enabled, AI-powered service that the company is trying to spread further and faster than Amazon's Alexa.


The Impact of MiFID II on Data Management: Q&A with MarkLogic's Ken Krupa

The volume of data that needs to be recorded makes the regulation a huge technology challenge. Many companies are finding that they have to update their technologies, infrastructures, and data management processes. To be compliant, firms need transparency and the ability to maintain a consistent view of the trade landscape at any point in time. All of these requirements will have a broad impact on data management and IT infrastructure, in large part because the old ways of dealing with data are no longer sufficient. The evolution of the IT infrastructure in the financial services industry has led to a proliferation of systems and fragmentation of data. Also, the rapid rise of social media, instant messaging, forum usage, unstructured data as a source of new content, and trader behavior analytics has increased the amount of information that grows outside transactional systems. All of this new information now falls under the remit of compliance and business planning.


How to create a data strategy for enterprise IoT

When it comes to enterprise adoption of IoT, most deployments are still in a pilot or proof-of-concept phase, according to Forrester Research senior analyst Paul Miller. These projects are often driven by operational teams, and are not necessarily linked to enterprise-wide technology strategies for cloud or data. "A lot of these deployments are early, small, and often under the radar of central IT," Miller said. "As they become more mission critical, there will be a very real need to ensure that they do comply with things like data policies, privacy policies, and security policies. But it's still early days, and there's relatively little formal policy around IoT deployment at the moment." Most companies are examining how to manage their existing data, in terms of how to secure and extract value from it, said Mark Hung, a research vice president at Gartner. "Both the speed and scale of data that IoT brings is a new challenge," Hung said. With so many endpoints, companies need to prepare to manage a large influx of information that must be analyzed in close to real time to gain the greatest insights, he added.



Quote for the day:


"The tragedy of life doesn't lie in not reaching your goal. The tragedy lies in having no goal to reach." -- Benjamin Mays


Daily Tech Digest - May 08, 2018

Who wants to go threat hunting?

To become a threat hunter, one must first work as a security analyst and likely graduate into the IR and cyber threat intelligence fields. Combined with knowledge of attacker methodology and tactics, threat hunting becomes a highly coveted skill, and one of the most advanced skillsets one could obtain in information security today. The core skills of a threat hunter include security operations and analytics, IR and remediation, attacker methodology, and cyber threat intelligence capabilities. Combined, a hunter is the special operations team of an organization's defensive and detection capabilities. A threat hunter takes the traditional indicators of compromise (IoCs) and, instead of passively waiting to detect them, aggressively goes out looking for them. Traditional intrusion detection doesn't do a great job against a crafty adversary, who will avoid tripping the normal intrusion detection defenses; it takes a threat hunter to find them. Not every company can have one. It takes a certain size and sophistication. ... Threat hunting teams need threat intelligence plus a network person, an endpoint person, a malware analyzer, and a scalable set of tools. A threat hunting team is like special operations forces.
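In its simplest form, going out looking for IoCs means proactively sweeping data you already have, such as logs, for known indicators rather than waiting for an alert to fire. A minimal Python sketch (the indicator values and log lines are invented for illustration):

```python
# Known indicators of compromise (IoCs): invented example values.
iocs = {
    "45.77.1.1",                          # suspicious IP from threat intel
    "badupdate.example.com",              # known command-and-control domain
    "d41d8cd98f00b204e9800998ecf8427e",   # malware file hash
}

def hunt(log_lines):
    """Proactively sweep log lines for any known indicator,
    instead of passively waiting for an IDS alert."""
    hits = []
    for lineno, line in enumerate(log_lines, start=1):
        for ioc in iocs:
            if ioc in line:
                hits.append((lineno, ioc))
    return hits

logs = [
    "GET /index.html from 10.0.0.5",
    "DNS query badupdate.example.com from host-17",
    "file dropped hash=d41d8cd98f00b204e9800998ecf8427e",
]
findings = hunt(logs)  # lines 2 and 3 match known indicators
```

Real hunts go far beyond string matching into behavioural analytics, but the posture is the same: the hunter queries the data rather than waiting for it to complain.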



Quantum computers and Bitcoin mining – Explained

With the application of the science behind electrons, the energy required to mine bitcoins could be drastically reduced, with a direct impact on protecting the environment. It is immaterial where the computers are located, because one of the properties this technology employs is having objects in any place at the same time. Because of the energy challenge, most mining companies using classical computers have opted to set up their data centers in regions with cold weather conditions most of the year. Quantum computers adequately sort out that issue. The economics of mining would definitely improve, with miners no longer having to worry about the exorbitant electricity bills they currently contend with. A large number of mining companies have had to migrate to China, where they can capitalize on relatively cheaper electricity. This need not be the case anymore, as particles can exist in multiple locations at once with quantum computing. The technology of quantum computing is in itself an incentive, a motivating factor for more people to engage in the mining activity.


Should you let IT teams choose their own collaboration tools?

While CIOs should consider stepping back from dictating what IT can use, they are still responsible for vetting, integrating and maintaining the solutions IT chooses, Palm says. “You’re not just a strategic advisor to the business as far as how these tools can enable efficiency and innovation, but you’re helping your teams choose the best tools to help them do the best job they can,” he says. Red Hat’s Kelly notes a fundamental issue CIOs encounter when shifting to a choose-your-own approach: “What you don’t want to do is completely let go, and then all of a sudden you have fifty different ways people are communicating with each other — that’s a mess. You as a CIO have to walk a line between standing there and saying, ‘You are required to use this and only this,’ and making it a free-for-all.” One chief concern is the possibility of constraints that regulatory compliance and data governance may have on these decisions, depending on your industry, Kelly says.


Financial sector cyber-related laws are a bellwether, says Deloitte


“While it is generally not possible to control when you have a crisis, quite often the cause of these crises is a cyber security incident, so it is worth information security teams in organisations engaging with the privacy teams to help understand where the organisation’s core risks lie, so they can prepare for these crises. A good response makes a huge difference.” Another thing that “absolutely attracts regulator attention”, said Bonner, is “pockets of complaints”, because even if the regulator does not have the resources to follow up on every single isolated complaint, if there are several customer complaints about a single organisation, the regulator will pay attention. “The lack of resources means that regulators will draw conclusions based on the nature and volume of the complaints,” he said. “So it could be by chance that a couple of entirely separate parts of your organisation have an issue that gets escalated to the regulator, but the conclusion will be that the organisation has a systemic problem.


Making sense of Handwritten Sections in Scanned Documents


It is challenging to achieve acceptable extraction accuracy when applying traditional search and knowledge extraction methods to these documents. Chief among these challenges are poor document image quality and handwritten annotations. The poor image quality stems from the fact that these documents are frequently scanned copies of signed agreements, stored as PDFs, often one or two generations removed from the original. This causes many optical character recognition (OCR) errors that introduce nonsense words. Also, most of these contracts include handwritten annotations which amend or define critical terms of the agreement. ... In recent years, computer vision object detection models using deep neural networks have proven to be effective at a wide variety of object recognition tasks, but require a vast amount of expertly labeled training data. Fortunately, models pre-trained on standard datasets such as COCO, containing millions of labeled images, can be used to create powerful custom detectors with limited data via transfer learning – a method of fine-tuning an existing model to accomplish a different but related task. Transfer learning has been demonstrated to dramatically reduce the amount of training data required to achieve state-of-the-art accuracy for a wide range of applications.
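The transfer-learning idea, freezing a pretrained model and fine-tuning only a small task-specific head on limited data, can be shown in miniature. This toy Python sketch (the "pretrained" feature extractor, the data, and the threshold are invented; a real detector would fine-tune a COCO-pretrained network in a deep learning framework) trains only the new head's weights:

```python
# Frozen "pretrained" feature extractor: in practice this would be a deep
# network trained on COCO; here a fixed hand-written stand-in, never updated.
def extract_features(x):
    return [x, x * x, 1.0]

# New task-specific head: a linear classifier whose weights we fine-tune.
weights = [0.0, 0.0, 0.0]

def predict(x):
    feats = extract_features(x)
    score = sum(w * f for w, f in zip(weights, feats))
    return 1 if score > 0 else 0

# Tiny labeled set for the new task: label 1 when x > 2, else 0.
data = [(0.5, 0), (1.0, 0), (3.0, 1), (4.0, 1)]

# Perceptron-style fine-tuning: only `weights` change; features stay frozen,
# which is why so little labeled data suffices.
for _ in range(20):
    for x, label in data:
        error = label - predict(x)
        if error:
            feats = extract_features(x)
            weights = [w + 0.1 * error * f for w, f in zip(weights, feats)]
```

The head has only three parameters to learn, so four examples are enough; the same economy is what lets a pretrained detector adapt to handwritten annotations with limited labeled documents.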


Smarter cities: why data-driven cities are the only option

New and exciting mobile and static infrastructure technologies are enabling safer communities, new low-cost utility services, more efficient city operations, and intelligent low-emissions transportation systems. One example of a UK smart city is Milton Keynes, which showcases driverless pods that ferry citizens along fixed routes across the city. ... Smart city programmes involve mass deployments of sensors, which are necessary to gather the data needed to justify and manage change. While sensors can be very cheap, deploying them can often be very expensive, especially at city-wide scale. But rather than deploying an entirely new network of sensors, cities and local authorities can improve efficiency by leveraging existing sensor networks to avoid expensive infrastructure investments. Telematics companies may seem unlikely partners in this endeavour, but with some of the world's largest organically grown vehicle datasets, telematics providers can grant access to aggregated data already blanketing cities across the globe, without the expense of building a sensor network.


11 Industries That Will Soon Be Disrupted By Blockchain

Let's face it: many people are resistant to technological changes, both in their personal lives and at the office. What they often lack is the vision to see how the new technology they are resisting will improve their lives in the future. Emerging technologies are exciting and bring innovation and new opportunities across the globe. They change our lives by altering the way we think and operate on a daily basis. Technological innovation can impact a lot more than our daily lives; it can disrupt entire industries and change the way we do business. As new technologies are developed, affected industries are forced to adapt or be replaced. The newest technology that is quickly becoming the next major disruption is blockchain: a digital ledger system used to securely record transactions, poised to impact the way business is done across the globe. Here are 11 prominent industries that are slated to be overhauled by blockchain technology in the near future.

1. The gambling industry: The days of coins falling from slot machines when someone hits a jackpot are long gone. But the coin could soon make a return, in digital form, via blockchain. RAcoin's mission is to make blockchain technology an essential part of the traditional gambling industry and to distribute RAcoin as a fully featured gambling cryptocurrency.

2. The payment industry: Kora is building an infrastructure for cross-border payment that facilitates financial transactions between people and businesses in a more transparent way using the blockchain, along with value-adds like identity and interoperability across a range of other financial services. Kora's use of blockchain aims to unlock growth in emerging markets by connecting people, communities, and capital. Its services include the ability to access marketplaces, make payments, transfers, and investments, and lend and pool capital across any community. Kora's native token, named KNT, will be the medium of interaction on its platform.

3. The real estate industry: Anyone who has ever purchased or sold a home knows just how much paperwork is involved in a real estate transaction. Blockchain technology can eliminate much of the headache these documents cause: all documents and transaction records can be stored securely with measurably less work and less cost. According to Piper Moretti, CEO of the Crypto Realty Group and a licensed realtor, blockchain can also potentially eliminate the escrow process, since smart contracts can release funding only when the conditions are met. Additionally, anyone who has worked with a real estate agent knows how frustrating commission rates can be, with many agents charging up to 6 percent. Deedcoin is looking to change that with its cryptocurrency-powered platform: through Deedcoin's platform and proprietary tokens, those rates decrease to just 1 percent. Deedcoin's distributed architecture gives power back to homeowners and buyers by tokenizing the process and eliminating middlemen, enabling direct interactions between agents and customers.

4. The healthcare industry: The healthcare industry has long needed a significant disruption in how it shares and stores medical data and records. The potential for error, fraud, and lost records has created distrust between consumers and healthcare providers. Blockchain technology can restore that trust by securely storing medical records that can be accurately and safely transferred to, and accessed by, authorized doctors and patients. Blockchain will also aid in the authorization and identification of people; one startup, Ontology, is already working to make positive, multi-source identification a reality across all industries using blockchain technology.

5. The legal industry: Blockchain technology is poised to disrupt parts of the legal industry through its ability to store and verify documents and data. For example, litigation over wills of the deceased or other documentation could be eliminated: records (including wills) stored on the blockchain can be quickly and securely verified, and any changes to the documents are authenticated and stored. Blockchain can also address legal issues around inheritance, including cryptocurrency assets. Safe Haven, for example, gives users the opportunity to secure digital assets so that an investor's legacy can be passed down to their children or designees safely and securely, eliminating lengthy court battles over digital inheritance.

6. The cryptocurrency exchange industry: Digital money is the way of the future, and it is thanks to blockchain that it can be securely transferred and recorded. However, the "mining" required to verify and authenticate every transaction of digital money requires an enormous amount of computing power. In recent years this has created problems on several platforms, when certain transactions "ran out of gas" or fizzled out due to the sheer amount of computation required, costing users valuable time and money. New developments in blockchain technology are changing the way the cryptocurrency exchange industry operates. Zen Protocol has developed an alternative that addresses this issue: unlike other platforms, Zen Protocol utilizes smart contracts that know in advance how much computation each contract requires, so unless there is enough "gas" to support a contract, it won't run.

7. Politics: In the recent past, government parties in the U.S. and around the world have been accused of rigging election results. That would not be possible with blockchain, which could take care of voter registration and verification of identity, and count the votes to ensure only legitimate votes were counted. Gone would be the days of recounting votes and voting-day drama.

8. The startup industry: With thousands of startups looking for investors, there is currently no way for them to get in front of the right investors without jeopardizing the security of their ideas; likewise, there is no good way for investors to find the companies they are interested in backing. Blockchain technology can change all of that, and in fact it has already started. Companies such as Pitch Ventures are creating a way for startups to pitch investors live in a secure manner: entrepreneurs create summaries of their product or service, and investors can quickly sort and find potential opportunities. An Ethereum smart-contract address provides a secure medium for the pitches, so privacy is maintained.

9. The video industry: Video is predicted to form 82% of all internet traffic by 2021, and blockchain may play a significant role by decentralizing the video infrastructure. Decentralizing video encoding, storage, and content distribution will dramatically reduce the cost of video traffic by tapping into $30 billion in wasted internet computing services. Startups like VideoCoin are already making good on the promise of freeing up this capital, which will allow entirely new and innovative ecosystems of video apps to emerge on the market.

10. The education industry: The education industry is poised to see significant breakthroughs from an emerging version of the internet that combines blockchain, cryptocurrency, and virtual reality. This new internet, known as "3DInternet," has the power to create a global classroom like never before. SocratesCoin is making big moves to make this a reality: the company will create a global community of faculty, students, campuses, and curriculum, with students of all ages, cultures, and locations. SocratesCoin has secured Nauka University, which will utilize 3DInternet to unite science, thought leadership, and education.

11. The banking industry: Blockchain technology has the potential to solve several significant problems faced by the banking industry, which today both stores money for its customers and handles the transfer of that money. Blockchain's inherently secure design would provide permanent records of the millions of transactions that take place in banking each day; this ledger system could significantly lower risk, while decentralization could make money transfers cheaper and faster.

And all of this disruption is a good thing. Whether or not you like introducing new tech into your life, we can all agree that added security for our financial data would give everyone more peace of mind. Blockchain-distributed ledger technology provides a safe and auditable way to record and transfer data; it can transform the way we live our everyday lives and disrupt any industry that uses data or transactions at all.

About the author: John White is the CMO and founder of Social Marketing Solutions. White writes at the crossroads of social media, entrepreneurship, startups, and marketing.
Many people are resistant to technological changes in both their personal lives and at the office. However, what they often lack is the vision to see how the new technology they are resisting will improve their lives in the future. Emerging technologies are exciting and bring innovation and new opportunities across the globe. They change our life by altering the way we think and operate on a daily basis. Technological innovation can impact a lot more than our daily lives. In fact, it can disrupt entire industries and change the way we do business. As new technologies are developed, affected industries are forced to adapt or be replaced. The newest technology that is quickly becoming the next major disruption is blockchain technology. Blockchain is a digital ledger system used to securely record transactions. It is poised to impact the way business is done across the globe. Here are nine prominent industries that are slated to be overhauled by blockchain technology in the near future.
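The "digital ledger" idea described above can be sketched in a few lines of Python: each block commits to a hash of its predecessor, so tampering with any recorded transaction invalidates every later block. This is a minimal, illustrative sketch only (no consensus, networking, or mining, which real blockchains require); the function names are the author's own, not from any library.

```python
import hashlib
import json

def block_hash(block):
    # Serialize the block deterministically, then hash it.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    # Each new block records the hash of the previous block.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    # Editing an earlier block breaks every later prev_hash link.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 10}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 4}])
print(verify(chain))                            # True
chain[0]["transactions"][0]["amount"] = 1000    # tamper with history
print(verify(chain))                            # False
```

The tamper-evidence here is exactly the property the banking item below relies on: records become permanent because rewriting one requires rewriting everything that follows.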


Microsoft's Project Brainwave brings fast-chip smarts to AI at Build conference


Project Brainwave differs from conventional AI infrastructure in two important ways. First, it uses a fast and flexible but unusual processor type called an FPGA, short for field-programmable gate array. It can be updated often to accelerate AI chores with the latest algorithms, and it handles AI tasks rapidly enough to be used for real-time jobs where response time is crucial. Second, customers eventually will be able to run AI jobs on Microsoft hardware at their own sites, not just by tapping into Microsoft's data centers, which speeds up operations another notch. "This is a unique offering," said Forrester analyst Mike Gualtieri. The project is a microcosm of the AI revolution sweeping the tech industry. On the one hand, it's maturing fast enough to become useful for countless tasks -- digesting legal contracts, finding empty parking spaces, looking for hiring biases and generating 3D models of people's bodies, limbs and heads from a video.


More time equals more opportunity for cyber attackers


Given enough time, a criminal siphoning data can slow the attack down to a level where it may look like normal network traffic noise, rather than attempt to send out gigabytes of data from a database, for example. New data can also be gained over time, such as new oil well exploration or pharmaceutical research. If this arrives in an already compromised database, the attacker is positioned, ready and waiting, and only needs to exfiltrate it. Third, a rushed attack can often be rolled back to a previous backup without too much trouble or data loss. If exploitation of a database occurs today and is discovered, restoring the database leaves only a short batch of transactions that may need to be updated, once the route in has been strengthened. As a result, the business impact is low. Conversely, an attack that takes place over many months may mean long periods of compromised backups, requiring extensive manual work to rebuild from the last known successful backup. In extreme cases, reliance on these backups may not be possible as tapes deteriorate or are reused/recycled.
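The arithmetic behind "slowing the attack down to noise level" is simple: the same few gigabytes that would stand out as a one-day burst disappear into the baseline when spread over months. A rough sketch, where the traffic volumes and detection threshold are illustrative assumptions, not measurements from any real detector:

```python
# Illustrative assumptions: a host with ~2 GB/day of normal outbound
# traffic, and a detector that flags any day above 1.5x that baseline.
baseline_gb_per_day = 2.0
alert_threshold = 1.5 * baseline_gb_per_day   # 3.0 GB/day

stolen_gb = 5.0  # total data the attacker wants to exfiltrate

# Rushed attack: everything leaves in a single day.
burst_day = baseline_gb_per_day + stolen_gb
print(burst_day > alert_threshold)   # True: 7.0 GB/day trips the alert

# Patient attack: the same data dripped out over 90 days.
drip_day = baseline_gb_per_day + stolen_gb / 90
print(drip_day > alert_threshold)    # False: ~2.06 GB/day looks normal
```

This is also why the backup problem described above gets worse with patient attackers: ninety days of "normal-looking" traffic means ninety days of backups taken while the system was already compromised.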


What is edge computing?

As centralized as this all sounds, the truly amazing thing about cloud computing is that a seriously large percentage of all companies in the world now rely on the infrastructure, hosting, machine learning, and compute power of a very select few cloud providers: Amazon, Microsoft, Google, and IBM. ... The advent of edge computing as a buzzword you should perhaps pay attention to is the realization by these companies that there isn’t much growth left in the cloud space. Almost everything that can be centralized has been centralized. Most of the new opportunities for the “cloud” lie at the “edge.” So, what is edge? The word edge in this context means literal geographic distribution. Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn’t mean the cloud will disappear. It means the cloud is coming to you.



Quote for the day:


"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore