Daily Tech Digest - September 09, 2018

Lenovo Partners With Blockchain Platform to Develop Its IoT and AR/VR Hybrid Software

“The Internet of Things and augmented reality are already changing the way we interact with the world. We are excited to partner with AR titan Lenovo New Vision. I see the combination of AI/AR and IoT revolutionizing the business environment," Credits CEO & founder Igor Chugunov said in the press release. Lenovo New Vision Technology intends to incorporate Credits’ blockchain solution to streamline internal operations and management procedures. “Credits [has] been chosen by Lenovo New Vision Technology thanks to its distinctive technical solutions, such as [a] unique consensus algorithm which consists of dPoS (delegated-proof-of-stake) and BFT (Byzantine Fault Tolerance) features,” says the press release. The blockchain platform that Credits has built is capable of performing up to one million transactions per second, with a processing speed of 0.01 seconds and commission rates as low as $0.001. The extended functionality of Credits’ smart contracts makes it possible to set cycles and create schedules.



The ultimate guide to finding and killing spyware and stalkerware on your smartphone

Our digital selves, more and more, are becoming part of our full identity. The emails we send, the conversations we have over social media -- both private and public -- as well as photos we share, the videos we watch, and the websites we visit all contribute to our digital forms. ... As mobile devices are now a common tool for social interactions, it is not just ad agencies, data miners, and surveillance-hungry powers that want to keep track of us. When a government agency, country, or cybercriminals decide to peek into our digital lives, there are generally ways to prevent them from doing so. Virtual private networks (VPNs), end-to-end encryption and using browsers that do not track user activity are all common methods. Sometimes, however, surveillance is more difficult to detect -- and closer to home. This guide will run through what spyware is, what the warning signs of infection are, and how to remove such pestilence from your mobile devices.


The How-To: Improving Customer Satisfaction With Digital Leadership

Hiring the right people and giving them a platform doesn’t automatically make them customer service champions. What does your business think about the digital revolution? What ideas do you have in your company on how to satisfy your customers with digital platforms? Do you have a guideline for every department on how they can use digital tools to improve their service to customers? Having a digital culture shows your employees that your business is serious about implementing digital innovations in serving customers. Zappos uses the 'WOW Philosophy' to go the extra mile in ensuring that their customers are happy. The company doesn’t want to be known as the “shoe company” but as “a great customer service company.” Zappos created a fun working environment, which leads to workers having fun at and with their tasks, and this rubs off on the customer. A solid digital culture is also a way of showing your customers that you’re trying your best as a business to improve customer service.


Industrial Networking Enabling IIoT Communication

Industrial networking is different from networking for the enterprise or networking for consumers. First there is the convergence of Information Technology (IT) and Operational Technology (OT). Important networking considerations include whether to use wired or wireless, how to support mobility (e.g. vehicles, equipment, robots and workers) and how to reconfigure components. Other factors include the lifecycle of the deployments, physical environmental conditions such as those found in mining and agriculture and electromagnetic conditions where interference from machines and equipment can be a problem. Then we have power. Will it be available? Or will devices need to run on local power such as batteries? Second there are the technical requirements. They include network latency and jitter, throughput needs, reliability and availability. The requirements can vary from being relaxed to highly demanding. The network must meet the end-to-end performance requirements for applications deployed both at the edge and in the cloud. 


Design Thinking: Future-proof Yourself from AI


Integrating design thinking and machine learning can give you “super powers” that future-proof whatever career you decide to pursue. To meld these two disciplines together, one must: Understand where and how machine learning can impact your business initiatives. While you won’t need to write machine learning algorithms (though I wouldn’t be surprised given the progress in “Tableau-izing” machine learning), business leaders do need to learn how to “think like a data scientist” in order to understand how machine learning can optimize key operational processes, reduce security and regulatory risks, and uncover new monetization opportunities; Understand how design thinking techniques, concepts and tools can create a more compelling and empathetic user experience with a “delightful” user engagement through superior insights into your customers’ usage objectives, operating environment and impediments to success. Let’s jump into the specifics about what business leaders need to know about integrating design thinking and machine learning in order to provide lifetime job security (my career adviser bill will be in the mail)!


The Pentagon is investing $2 billion into artificial intelligence

Artificial intelligence, which lets machines perform tasks traditionally done by humans, is a trendy topic in technology and business circles. For example, Google recently delighted and alarmed observers when it showed how an AI system could call a restaurant and book a reservation while sounding entirely human. Breakthroughs in the last decade have inspired companies to recruit top AI talent away from academia. Machines are now much more accurate at recognizing speech, understanding images and processing words, leading to products such as Amazon's Alexa, Apple's Siri, and Waymo's self-driving vans. The country's biggest and most innovative companies rely on it to stay ahead of competitors. Waymo's autonomous vehicles have driven more than 9 million miles on US roads thanks to artificial intelligence. National governments, such as Canada, China, India and France, are prioritizing AI now too. They view artificial intelligence as essential to growing their economies in the 21st century. Most notably, China has said it wants to be the global leader by 2030.


How corporations and startups can more effectively work with one another

If the corporate posture of the past around innovation could be described as “not invented here” with a strong bias toward building internally, today’s corporate posture leans in a much different direction, with many thinking about how to disrupt themselves before an external party beats them to it. Not surprisingly, this has created more corporate and startup partnerships. While getting this type of collaboration right is beneficial for both parties, if you speak to most startups selling into large enterprises or corporate executives looking to partner with startups today, you will find many justifiable frustrations on both sides. As the vice president of Business Development at RRE Ventures, an early-stage venture capital firm based in New York, a major part of my role is leading our business development initiatives, where we enable collaboration between corporations and startups. Before this role, I spent time on the corporate side and on the startup side, so I’ve gotten to see this dynamic from both angles throughout my career.


Decentralisation: the next big step for the world wide web

If it is done right, say enthusiasts, either you won’t notice or it will be better. One thing that is likely to change is that you will pay for more stuff directly – think micropayments based on cryptocurrency – because the business model of advertising to us based on our data won’t work well in the DWeb. Want to listen to songs someone has recorded and put on a decentralised website? Drop a coin in the cryptocurrency box in exchange for a decryption key and you can listen. Another difference is that most passwords could disappear. One of the first things you will need to use the DWeb is your own unique, secure identity, says Blockstack’s Ali. You will have one really long and unrecoverable password known only to you but which works everywhere on the DWeb and with which you will be able to connect to any decentralised app. Lose your unique password, though, and you lose access to everything. ... The decentralised web isn’t quite here yet. But there are apps and programs built on the decentralised model.


In the IoT world, general-purpose databases can’t cut it

We live in an age of instrumentation, where everything that can be measured is being measured so that it can be analyzed and acted upon, preferably in real time or near real time. This instrumentation and measurement process is happening in both the physical world, as well as the virtual world of IT. For example, in the physical world, a solar energy company has instrumented all its solar panels to provide remote monitoring and battery management. Usage information is collected from a customer’s panels and sent via mobile networks to a database in the cloud. The data is analyzed, and the resulting information is used to configure and adapt each customer’s system to extend the life of the battery and control the product. If an abnormality or problem is detected, an alert can be sent to a service agent to mitigate the problem before it worsens. Thus, proactive customer service is enabled based on real-time data coming from the solar energy system at a customer’s installation.
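The alerting step the article describes can be sketched in a few lines. This is a minimal illustration of threshold-based monitoring, not the solar company's actual system; the `Reading` type, field names, and the 11.5 V cutoff are all assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    panel_id: str
    battery_voltage: float  # volts, as reported over the mobile network

# Assumed cutoff for a healthy 12 V battery; a real system would calibrate this
LOW_VOLTAGE_THRESHOLD = 11.5

def check_readings(readings):
    """Return alert messages for any panel whose battery looks unhealthy."""
    alerts = []
    for r in readings:
        if r.battery_voltage < LOW_VOLTAGE_THRESHOLD:
            alerts.append(
                f"ALERT {r.panel_id}: voltage {r.battery_voltage:.1f} V below threshold"
            )
    return alerts
```

In a production pipeline the alerts would be routed to a service agent rather than returned as strings, but the shape of the logic, compare each incoming reading against a rule and escalate on violation, is the same.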


Don’t Rely on the Cloud Provider to Protect Your Data

Unfortunately, many organizations that rely on cloud services for mission-critical applications assume that there’s no need to protect the data and apps that live there. The survey above sheds light on this issue, showing that most IT organizations are either using just scripts (25 percent) or nothing at all (23 percent) to back up their AWS data. And when it comes to disaster recovery for AWS, only 19 percent have plans in place. Thirty-six percent are working on a plan, but an alarming 45 percent have no DR plan at all. Those who are relying on their cloud provider to protect their data are making a huge mistake. Because the big cloud services, such as Amazon EC2 or S3, are exceptionally durable and redundant, these organizations assume that the cloud’s architecture mitigates any risk of downtime from a failure, error, outage or security attack. This is wrong. The end-user license agreements of AWS and most large public clouds put final responsibility for data on the customer. While the “cloud” itself is secured by AWS, everything within that cloud is the customer’s responsibility.



Quote for the day:


"Leadership in the past was a model of direction and control. Now it should help people set directions for the future and facilitate their delivery." -- John Bailey


Daily Tech Digest - September 08, 2018

Why the cloud is the data, and the data is the cloud?


First, we have the use of easy-to-provision and auto-scaling virtual machines that provide a platform for widely distributed “share nothing” database operations. This provides a divide-and-conquer approach to gathering data from both structured and unstructured sources. It’s the ‘secret sauce’ behind the newfound “Hadoop-y” speed that was really not there in the world of traditional relational databases. Second, we have the ability to deliver data using data services that combine behavior and information. This places the database operations behind a well-defined API or service. For the most part, these are simple services that act very much like a traditional database query, or just produce data as requested from a single data source. However, these cloud-based data services, or, cloud APIs, are becoming complex. They can mash up data from multiple sources and externalize that data using a single interface. Thus, you may be able to ask a single question about the existing state of the company and have a service that considers data in hundreds, perhaps thousands of databases, using up-to-date operational data, to come back with a single meaningful answer.
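The "mash up data from multiple sources behind a single interface" idea can be shown with a toy data service. The dictionaries below stand in for separate databases; the function is the single API that consults both to answer one question. All names here are illustrative.

```python
# Two stand-ins for independent data sources (e.g. separate databases)
inventory_db = {"widget": 42}
orders_db = {"widget": 17}

def company_state(product):
    """A single data service that answers one question by consulting
    multiple backing sources and merging the result."""
    on_hand = inventory_db.get(product, 0)
    pending = orders_db.get(product, 0)
    return {
        "product": product,
        "on_hand": on_hand,
        "pending_orders": pending,
        "available": on_hand - pending,  # the "single meaningful answer"
    }
```

The caller never sees that two sources were consulted; scaling this from two dictionaries to hundreds of operational databases is what the cloud data services described above make practical.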



Proactive approach to defending computer systems

"The concept of MTD has been introduced with the aim of increasing the adversary's confusion or uncertainty by dynamically changing the attack surface, which consists of the reachable and exploitable vulnerabilities," Cho said. "MTD can lead to making the adversary's intelligence gained from previous monitoring no longer useful and accordingly results in poor attack decisions." The basic idea as it applies to IP addresses on computer networks is this: Change the IP address of the computer frequently enough so the attacker loses sight of where his victim is; however, this can be expensive, so the approach taken by the researchers in the collaboration here uses something known as software-defined networking. This lets computers keep their real IP addresses fixed, but masks them from the rest of the internet with virtual IP addresses that are frequently changing. Moore added that as the adage suggests, it is harder to hit a moving target. "MTD increases uncertainty and confuses the adversary, as time is no longer an advantage," Moore said.
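The virtual-IP scheme the researchers describe can be sketched as follows: the real address never changes, while the externally visible address rotates, so reconnaissance gathered earlier goes stale. This is a toy model of the idea, not the researchers' software-defined networking implementation; the class and the example addresses are invented.

```python
import itertools

class VirtualIPMapper:
    """Keep a fixed real IP but present a rotating virtual IP to the outside."""

    def __init__(self, real_ip, virtual_pool):
        self.real_ip = real_ip          # fixed; the host never renumbers
        self._pool = itertools.cycle(virtual_pool)
        self.current = next(self._pool)  # currently advertised virtual IP

    def rotate(self):
        """Advance to the next virtual address; the real one never changes."""
        self.current = next(self._pool)
        return self.current
```

In a real deployment an SDN controller would perform the rotation on a schedule and rewrite packet headers between virtual and real addresses; the sketch captures only the mapping itself.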


Key Algorithms and Statistical Models for Aspiring Data Scientists

As a data scientist who has been in the profession for several years now, I am often approached for career advice or guidance in course selection related to machine learning by students and career switchers on LinkedIn and Quora. Some questions revolve around educational paths and program selection, but many questions focus on what sort of algorithms or models are common in data science today. With a glut of algorithms from which to choose, it’s hard to know where to start. Courses may include algorithms that aren’t typically used in industry today, and courses may exclude very useful methods that aren’t trending at the moment. Software-based programs may exclude important statistical concepts, and mathematically-based programs may skip over some of the key topics in algorithm design. ... Because machine learning is a branch of statistics, machine learning algorithms technically fall under statistical knowledge, as well as data mining and more computer-science-based methods.


Windows 10 Enterprise customers will now get Linux-like support

Effective this month, for enterprise customers willing to pay the Enterprise edition premium, Microsoft is granting an extra year's support. The new changes are designed to encourage slow-moving enterprises to pick up the upgrade tempo for hundreds of millions of Windows 7 PCs, before that older OS reaches its retirement date in less than 500 days. Today's announcements are the latest twist in a series of changes and extensions in the three years since Windows 10's initial release in 2015. In November 2017, Microsoft extended support for version 1511 by six months, to April 2018. (The blog post announcing that change is no longer online.) Then, in February 2018, Microsoft announced similar six-month "servicing extensions for Windows 10," but this time with a noteworthy gotcha: The new, 24-month support lifecycle applied only to Enterprise and Education editions. If your organization has devices running Windows 10 Pro, they need to be updated every 18 months or sooner.


AWS vs. Azure: Users Share Their Experiences

What makes a user switch to AWS or Azure from their previous approach to handling a workload? The answers reveal what’s working well with these two cloud solutions. For example, it_user396519, another Amazon Redshift user, had been using an on-premise MySQL data warehouse. He switched to AWS “to reduce the cost and improve scalability.” Or, consider itmanage402807, a SQL Azure user who moved to the Microsoft cloud after evaluating databases like PostgreSQL. He explained, “We decided to switch because our .NET application works well with Microsoft solutions.” Adi L., an AWS Dynamo DB user at a healthcare company, chose AWS when he was also faced with the option of continuing with PostgreSQL. He noted, “We switched to DynamoDB for the scalability and ease of deployment and operation.” Wagner S., an Amazon EC2 user, switched from the Google Cloud Platform “because Amazon AWS offers more services and a lot more settings.”


Phishing alert: North Korea's hacking attacks show your email is still the weakest link

The North Korean group accused of some of the biggest cyber crimes ever conducted may have harnessed some highly sophisticated technologies, but their ability to break into computer networks worldwide often relied on nothing more than a bogus email. The US Department of Justice has formally charged a North Korean programmer for his part in some of the largest cyber-attacks in recent years, conducted by a group backed by the North Korean government. The 172-page criminal complaint published by the US Department of Justice provides an unprecedented insight into the workings of one of the most notorious hacking groups on the planet, but also shows how their most successful attacks were at least in part down to a blizzard of fake -- phishing -- emails. The group's activities allegedly include the devastating attack on Sony Pictures Entertainment in November 2014. The group launched their attack on the company in response to the movie The Interview, a comedy that depicted the assassination of North Korea's leader.


How the Equifax hack happened, and what still needs to be done


Equifax as a company hasn't faced many consequences. In January, Democratic senators proposed a law that would require credit-reporting agencies to protect the data they've amassed and pay a fine if they're hacked. The bill never went anywhere. "One year after they publicly revealed the massive 2017 breach, Equifax and other big credit reporting agencies keep profiting off a business model that rewards their failure to protect personal information -- and the Trump Administration and the Republican-controlled Congress have done nothing," Sen. Elizabeth Warren, a Democrat from Massachusetts, said in a statement. Warren isn't alone. At a House Energy and Commerce Committee hearing on Wednesday, where the focus was on Twitter and its CEO, Jack Dorsey, Rep. Ben Lujan pivoted his attention to Equifax. "We've not done anything as well for the 148 million people that were impacted by Equifax," said Lujan, a Democrat from New Mexico. "I think we should use this committee's time to make a difference in the lives of the American people and live up to the commitments that this committee has made: provide protections for our consumers."


Swaying the C-Suite: Proving the ROI of a sound security strategy

When you think about it, it does make sense that the C-suite and security or operations teams don’t speak the same language. Senior leaders are often tasked with cutting the fat. At the same time, organizations struggle to quantify the value of cybersecurity investments. It's important for IT and security leaders to note that true ROI comes from defending the organization against material impact, before it happens. Begin by proving your position with numbers. For example, cybercrime is estimated to cost approximately $6 trillion per year on average through 2021. As such, smart security spend pays for itself in cost savings, reputation protection and more, given the direct connection between loss prevention and a company's bottom line. We're facing a reality in which organizations understand they need to care about security, but to really get executive buy-in, the security team still needs to prove ROI — the right kind of ROI — and present a clear implementation plan. After providing facts and figures, a security roadmap can help highlight the tactical actions needed to sway the C-suite to commit and spend.
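"Proving your position with numbers" can be made concrete with a back-of-the-envelope ROI model: compare the expected loss a control avoids against what the control costs. The formula and all figures below are hypothetical illustrations, not data from the article.

```python
def security_roi(expected_loss, risk_reduction, annual_spend):
    """Simple security ROI: (loss avoided - spend) / spend.

    expected_loss  -- estimated annual loss from the threat, in dollars
    risk_reduction -- fraction of that loss the control is expected to avoid
    annual_spend   -- yearly cost of the control, in dollars
    """
    loss_avoided = expected_loss * risk_reduction
    return (loss_avoided - annual_spend) / annual_spend
```

For example, if a breach scenario carries a $2M expected annual loss and a $400k/year control is believed to cut that risk by 60%, the model reports a 200% return, the kind of figure that, paired with a roadmap, speaks the C-suite's language.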


The simple fix so your cloud costs don’t spin out of control

There is an easy fix, and it’s called cloud cost management or cloud usage management. It comprises the processes, approaches, and tools that let you keep cost in check—and, most important, keep those costs predictable. These are cloud cost governance tools to monitor usage and the associated costs. They do so by workload, by user, by department, or by any other way you want to slice it. These tools not only let you see who’s using what and when, and how much it costs, but do chargebacks and showbacks to make sure that the right budgets are funding the cloud usage. Perhaps the most important aspect of this technology is that you can set predetermined limits. This includes setting usage parameters such as not provisioning the most expensive instances of storage all the time and making sure that budget restrictions are adhered to. Ironically, even enterprises that are the most controlling when it comes to costs tend to think of cloud costs as something that’s unknown and so just accept whatever rolls in. No one knows what the bill will be, nor expects to.
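The "predetermined limits" idea reduces to simple bookkeeping: track spend per department and refuse provisioning once a budget is exhausted. The sketch below is illustrative only; real cost-governance tools layer reporting, chargebacks, and alerting on top of this core check.

```python
class CloudBudget:
    """Track per-department cloud spend against predetermined limits."""

    def __init__(self, limits):
        self.limits = dict(limits)               # department -> budget cap
        self.spent = {d: 0.0 for d in limits}    # department -> spend so far

    def provision(self, dept, cost):
        """Record a chargeback; reject the request if it would bust the budget."""
        if self.spent[dept] + cost > self.limits[dept]:
            return False  # budget restriction adhered to
        self.spent[dept] += cost
        return True
```

The point of the design is that the limit is enforced at provisioning time, before the cost is incurred, rather than discovered when the bill "rolls in."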


Facebook, Twitter Defend Fight Against Influence Operations

Numerous technology firms have disputed the notion that there is any political bias in their algorithms. "These accusations are not borne out by data and facts, and they have been widely discredited by major news organizations and experts," a coalition of technology industry groups said in a letter to the committee. But some industry watchers have suggested that while they see no political bias, private social media firms should be more transparent about how their algorithms work as well as their content management policies. "Charges of left-leaning bias are not new, of course," says Tarleton Gillespie, a principal researcher at Microsoft Research and an affiliated associate professor at Cornell University, on TechDirt. "They come from a very old playbook conservatives have used against newspapers and broadcasters for decades. Unfortunately, Silicon Valley is partly to blame for why it is working so well today. Search engines and social media platforms have been too secretive about how their algorithms work and too secretive about how content moderation works."



Quote for the day:


"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls


Daily Tech Digest - September 06, 2018

IBM researchers build AI-powered prototype to help small farmers test soil

The AgroPad is a paper device about the size of a business card. It has a microfluidics chip inside that can perform a chemical analysis of a water or soil sample in less than 10 seconds. A farmer simply puts his sample on one side of the card, and on the other side, a set of circles provides colorimetric test results. Using a dedicated smartphone app, the farmer can receive immediate, precise results. The app uses machine vision to translate the color composition and intensity into chemical concentrations, with results more reliable than those that rely on human vision. The current prototype measures pH, nitrogen dioxide, aluminum, magnesium and chlorine, though the research team is working on extending the library of chemical indicators. AgroPads could be personalized based on the needs of the individual farmer. Once the test results are in, the data can be streamed to the cloud and labeled with a digital tag to mark the time and location of the analysis. Results for millions of individual tests could be stored.
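The colorimetric step, translating color intensity into a chemical concentration, can be sketched as interpolation against a calibration table. This is a hypothetical illustration of the general technique, not IBM's machine-vision model; the calibration values are invented.

```python
def concentration_from_intensity(intensity, calibration):
    """Linearly interpolate a concentration from (intensity, concentration)
    calibration pairs, clamping outside the calibrated range."""
    pts = sorted(calibration)
    if intensity <= pts[0][0]:
        return pts[0][1]
    if intensity >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= intensity <= x1:
            t = (intensity - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

# Invented calibration for a pH indicator: measured intensity -> pH value
PH_CAL = [(0.0, 4.0), (0.5, 7.0), (1.0, 10.0)]
```

The AgroPad's app does considerably more (it extracts color composition from a photo under varying lighting), but the final mapping from a normalized color signal to a concentration follows this calibration-curve pattern.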



Microchip 'god mode' flaw: Is it time to rethink security?


This particular vulnerability might be described as a type of hardware backdoor, in which undocumented CPU instructions can take a process from an operating system's Ring 3, the least privileged level of access to resources, directly to Ring 0, the most privileged level of access to resources. Ring 3 is where applications run, and keeping them there keeps them from tinkering with the data or code that other applications use. Ring 0 is reserved for the operating system itself, which manages the resources that all running processes can access. An application needs to be in Ring 0 to enable this backdoor, but Domas found that some systems seem to have been shipped with the backdoor already enabled. Software running in Ring 0 can potentially bypass any security mechanism of other processes. If a process uses a password or cryptographic key, another process running in Ring 0 may also be able to get that password or key, thus virtually eliminating the security it provides.


How blockchain technology could aid key data challenges


The current model for storing data is by keeping said data stored in one place. For example, a Microsoft Word document is saved to a desktop. While access to that document may be made through the server, even remotely, it is still saved in a single, centralized location. Blockchain is, arguably, the exact opposite. Data is stored as the “block” of the technology. The blockchain, in its entirety, is an encrypted ledger that is replicated throughout the database. The data (block) is decentralized through this replication. In other words, it is not saved to a single place, but instead exists across all blocks in the network. Even though the ledger exists in a public space, a private key is required to access a specific block—it enables data to be distributed, but not copied. This manner of data storage protects information from ransomware and hacking attacks by requiring a hacker to simultaneously breach and affect every block in order to render any damage, as opposed to corrupting or stealing just one document in the current centralized version of data storage.
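The tamper-evidence property described above, that corrupting one block requires affecting every block after it, comes from hash chaining, and can be shown in a few lines. This is a minimal sketch of the general idea, not any particular platform's implementation; encryption and replication across nodes are omitted.

```python
import hashlib

def block_hash(data, prev_hash):
    """Hash a block's data together with its predecessor's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    """Build a ledger where each block commits to the one before it."""
    chain, prev = [], "0" * 64  # genesis predecessor
    for data in records:
        h = block_hash(data, prev)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every hash; any edit to any block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["data"], prev):
            return False
        prev = block["hash"]
    return True
```

Replicate this chain across many nodes and an attacker must not only rewrite every downstream block but do so on a majority of copies simultaneously, which is the defensive property the article contrasts with centralized storage.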


Is a developer career right for you? 10 questions to ask yourself

Developers are among the most in-demand tech professionals in the workforce, with high salaries offered to those with the right skill sets. While learning to code and breaking into a new career may seem daunting, the high number of open jobs and training opportunities could make development a great option for many people. "A lot of developers suffer with imposter syndrome and feeling like they don't have enough knowledge or experience to apply for a developer position," said Cristina Blanchard, a front end web developer at Brew Agency. "The truth is, if you have a solid knowledge and understanding of the most basic, core concepts of development, you can learn just about anything with the right training and a little tenacity. Don't be afraid to apply for a position you feel you may be under-qualified for, because you never know who may be willing to train you or help you get the experience you need."


Government projects watchdog recommends terminating Gov.uk Verify identity project


Sources suggest that GDS hopes to make a case to continue with Verify. Just this week, it announced three further digital services using Verify had reached the “private beta” testing stage, although none of the services have a launch date. ... GDS is also understood to be making a case that Verify remains essential to the ongoing roll-out of Universal Credit, the government’s new benefits system. But even there, the Department for Work and Pensions has had to develop an additional identity system after finding that hundreds of thousands of benefits applicants could be unable to register successfully on Verify. ... There are also question marks over the commitment of the IDPs – also known as certified companies. A report by McKinsey for the Cabinet Office showed that more than 80% of users chose two of the seven IDPs – Experian and the Post Office – leaving Digidentity, Royal Mail, Barclays, Citizen Safe and Secure Identity to pick up the remainder between them.


Designing a Usable, Flexible, Long-Lasting API

Most APIs aren't truly REST APIs, so if you choose to build a RESTful API, do you understand the constraints of REST including hypermedia/HATEOAS? If you choose to build a partial REST or REST-like API, do you know why you are choosing to not follow certain constraints of REST? Depending on what your API needs to be able to do and where your API will be used, legacy formats such as SOAP may make sense. However, with each format comes a tradeoff in terms of usability, flexibility, and development costs. Finally, as we start to plan our API, it's important to understand how our users will interact with the API and how they'll use it in conjunction with other services. Be sure to use tools like RAML or Swagger/OAI during this process to involve your users, provide mock APIs for them to interact with, and to ensure your design is consistent and meets their needs. As you design your API, it's also important to remember that you're laying a foundation to build upon at a later time.
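The hypermedia/HATEOAS constraint mentioned above can be illustrated with a response that carries links telling the client what it may do next, so valid transitions are discovered at runtime rather than hard-coded. The resource, fields, and link relations below are invented for the example.

```python
def order_resource(order_id, status):
    """Build a hypermedia-style representation of a hypothetical order.

    The available actions (links) depend on the resource's current state,
    which is the essence of HATEOAS.
    """
    links = [{"rel": "self", "href": f"/orders/{order_id}"}]
    if status == "open":
        links.append({"rel": "cancel", "href": f"/orders/{order_id}/cancel"})
    else:
        links.append({"rel": "receipt", "href": f"/orders/{order_id}/receipt"})
    return {"id": order_id, "status": status, "_links": links}
```

A client that follows only the links it is given never needs to know the URL structure in advance, which is precisely the flexibility that partial REST or REST-like APIs trade away when they skip this constraint.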


How to Cultivate Security Champions at the Workplace

Some things you consider simple are things that can make a big impact on people. Think even smaller, visiting with people one-on-one as time and events present themselves. Last note, there is no better time than an incident debrief to educate users one-on-one or in a group. The point is to get people’s attention. Show them why security is important. Show how easy it really can be for malicious actors to wreak havoc in your environment. Show how they can have a direct impact in helping to prevent that. A few people will take it to heart and develop a security mindset. Many people in information security are problem solvers. Approach it that way. Demonstrate to them how a malicious actor could easily attack your AD / Kerberos infrastructure. See how many ask what can be done to mitigate it. Instead of answering, ask them what they would do, what they can think of. Make it a problem for them to solve. Just keep your audience in mind. What will entice one audience, say demonstrating the intricacies of Kerberoasting to your server administrators, will be lost on business partners.


Cloud computing: Three reasons why it could be time to go cloud-first

"There's data governance questions and there might be clients that don't allow us to pass their data to the cloud -- and that's why there might be a situation why we can't go on demand. But our default option will always be to take the cloud option first if we can. And that approach will be multi-cloud where we'll use a range of providers." Kay says he doesn't believe the cloud is a one-size-fits-all situation right now. He says there's still a bit of an arms race taking place and that different providers have different strengths. "Some are better at doing things better than others. And we, therefore, want to be able to take advantage of those capabilities," says Kay. "So, we will not be dogmatic and push everything to a single provider. We're trying all of them -- at the SaaS level, we're using Salesforce, Microsoft Dynamics 365 and Workday. When it comes to IaaS and PaaS, we're using AWS, Azure and Google. That's a deliberate strategy. We have a view on which provider is stronger for a particular set of characteristics."


Four Ways to Take Charge in Your First Agile Project


Creating an environment of psychological safety is imperative for a high-performing team. Google conducted a two-year-long study on team collaboration and found that when individuals felt they could share their honest opinions without fear of backlash, they performed far better. When employees feel that their opinions matter, their engagement levels peak and, according to Gallup’s research, their productivity increases by an average of 12%. Unfortunately, this doesn’t always happen, especially when teams are new to Agile and Scrum. The Agile Report found that among the top challenges reported while adopting an Agile approach were alignment with cultural philosophy, lack of leadership support and troubles with collaboration. All of these issues are related to people’s personalities, including their strong points and areas of weakness. If a strong leader is put in place and a strategy is designed to work to each member’s strengths, many of these problems can diminish.


Ransomware Recovery: Don't Make Matters Worse

"Trying to decide whether to pay ransom or not is never an easy decision - the best answer is 'no, never' but that's not always a decision you can make," says former healthcare CIO David Finn, executive vice president of the security consultancy CynergisTek. "You will rarely negotiate from a position of strength with a hacker - or any criminal, for that matter. Having a well thought out plan would have helped, and certainly being able to restore the data yourself, without 'buying' decryption might have avoided the entire nasty event." An organization that chooses to pay attackers to unlock data "should apply the decryption key itself with whatever instruction the criminals can provide - instead of sending a file to the ransomware perpetrators to decrypt as evidence the key works," suggests Keith Fricke, principal consultant at tw-Security. "If possible, it is a good practice to have a third-party vendor pay for the decryption key on behalf of the [organization]," he adds.



Quote for the day:


"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn


Daily Tech Digest - September 05, 2018

Binary randomization makes large-scale vulnerability exploitation nearly impossible
To date, critical infrastructure cybersecurity has relied too much upon network monitoring and anomaly detection in an attempt to detect suspicious traffic before it turns problematic. The challenge with this approach is that it is reactionary and only effective after an adversary has breached some level of defenses. We take an entirely different approach, focusing on prevention by denying malware the uniformity it needs to propagate. To do this, we use a binary randomization technique that shuffles the basic constructs of a program, known as basic blocks, to produce code that is functionally identical, but logically unique. When an attacker develops an exploit for a known vulnerability in a program, it is helpful to know where all the code is located so that they can repurpose it to do their bidding. Binary randomization renders that prior knowledge useless, as each instance of a program has code in different locations. One way to visualize the concept of binary randomization is to picture the Star Wars universe at the time when Luke Skywalker and the Rebel Alliance set off to destroy the Death Star. 
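The basic-block shuffling idea can be illustrated with a toy interpreter (purely illustrative; real binary randomization rewrites machine code at build or load time, and every name here is hypothetical). Because each block names its successor explicitly, the storage order -- the "addresses" an exploit would rely on -- can be shuffled without changing behavior.

```python
import random

# A "program" as labeled basic blocks. Each block carries a list of ops and
# the label of its successor, so execution order does not depend on where a
# block happens to be stored.
def run(blocks, entry="entry"):
    acc, label = 0, entry
    while label is not None:
        ops, label = blocks[label]
        for op in ops:
            acc += op
    return acc

program = {
    "entry": ([1, 2], "middle"),
    "middle": ([3], "end"),
    "end": ([4], None),
}

# "Randomize" the layout: same blocks, new storage order (new "addresses").
items = list(program.items())
random.shuffle(items)
randomized = dict(items)

assert run(program) == run(randomized) == 10
print(list(randomized))  # layout differs from run to run; behavior is identical
```

An exploit hard-coded against one layout would fail on another instance, which is the property the article describes.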


How to eliminate project noise

Once the requirements and project scope are clearly set, it's important to operate within those boundaries and not fold in additional enhancements that will affect timelines and deliverables. Enhancement requests usually come from end users. If the request is easy to satisfy, like altering a screen layout or adding a data field edit, it likely can be added without much impact. But if the enhancement impacts multiple programs, it's time to redefine the project scope and timeline. Communicate it to all stakeholders, and make sure they are on board with any project changes. ... You can't control a change in business direction your boss makes. However, you can control your project timelines, resources, and how they will be impacted by these new priorities. When a priority change occurs, evaluate impact, and then define a new set of timelines for projects that were already underway. Communicate the impact to your staff and your superiors immediately. One mistake newer project managers make is that they are so eager to please that they try to maintain the original timelines of their projects and just add new projects.


Throwing more people at the problem no longer works

To understand how to proceed forward, we need to unpack the problem a bit. The first, and guiding, factor is understanding the nature of your business. I often go into organizations where the IT leadership has a limited understanding of their business…and not at the level they need today. To complicate matters, beyond the most senior IT leader, the level of business knowledge drops off precipitously. IT staff further into the organization know little more about their company than the average person does. While this has worked (marginally) in the past, it will not serve the company moving forward. We have long since passed the point where a company can function without the use of technology. Likewise, we have also passed the point where a CIO or IT leader can survive by technology knowledge alone. Hence the value of the traditional CIO is in decline while the value of the transformational CIO is on the upswing. See my post on The difference between the traditional CIO and the transformational CIO for more specifics.


Facing competing objectives, CIOs share prioritization strategies

Effective governance is not something you can do alone. You need a decision-making body to help prioritize IT investment, to establish transparency to your processes and decisions, and to share ownership of those decisions with the other decision makers. This piece is critical because face it: The “tyranny of now” is really the tyranny of your customers’ demands. And since there will never be enough money or bandwidth to meet all those demands at once, managing those expectations must be a team sport. Governance does that for you. To be proactive rather than reactive, CIOs must be realistic about capacity. Don’t promise what you can’t deliver. You can’t manage 45 concurrent major IT projects, so narrow the list and focus on 10. Is that painful? Sure, but it’s also effective. Finally, remember that flexibility is key. When something new emerges, line it up against your current top 10. If necessary, reprioritize. Governance doesn’t mean being rigid. It means being flexible. ... Of course, we need to get down to one to work efficiently and effectively as a single company. So, we held 14 workshops in the first four weeks after the acquisition, meeting with every part of the business, from accounting to legal to program execution.


Moving apps to the cloud? 3 steps to ensure good customer experiences

If your business is like most enterprises today, chances are good it's moving toward a best-of-breed, multi-cloud strategy. You're looking for applications best suited to the business's IT needs and want to run each of those apps in the optimal cloud environment. But if you're going to be mixing and matching cloud architectures and workloads to optimize performance, you need openness and flexibility. You need to select cloud providers and software vendors who embrace open standards, open source technologies, and who excel at ensuring cross-platform interoperability. Look for cloud providers and software companies who make it relatively easy for you to move workloads between on-premises data centers, their cloud and other clouds. Developers on your team will also be pleased with this commitment to openness. Today's developers expect to use modern, open source tools for management and customization. The last thing they want is to get boxed into using subpar, proprietary development tools simply because a software partner deems it necessary.


Cryptojacking campaign exploiting Apache Struts 2 flaw kills off the competition

Researchers from F5 Labs say the Apache bug is being used in a new cryptomining campaign which impacts Linux machines. According to the team, threat actors are harnessing PoC code for the Apache Struts 2 critical remote code execution vulnerability posted to Pastebin to infiltrate Linux systems for the purpose of mining Monero. Mining for cryptocurrency, such as Bitcoin (BTC), Ethereum (ETH), and Monero (XMR), is a completely legitimate activity which uses computing power to find virtual coins. However, when this power is taken without consent, such activities are considered cryptojacking. The most common tactic used by criminals in cryptojacking campaigns is the Coinhive script, a legitimate system which is being widely abused. In July, a massive cryptojacking campaign was uncovered in which a botnet used enslaved MikroTik routers to mine for Monero. Dubbed CroniX, the new attack exploits the Apache bug to send a single HTTP request at the same time as injecting an Object-Graph Navigation Language (OGNL) expression containing malicious JavaScript code.


How Do You Develop A Data Strategy? Here’re 6 Simple Steps That Will Help

There are millions of ways data can help a business but, broadly speaking, they fall into two categories: one is using data to improve your existing business and how you make business decisions. The second is using data to transform your day-to-day business operations. In practice, most companies start out wanting to improve their decision making and take it from there. However, if you want to use data, you must always start with a data strategy. What data you gather and how you analyse it will depend entirely on what you’re looking to achieve – so you need to have thought about this at the outset. Having a data strategy helps the whole process run more smoothly and prepares you and your people for the journey ahead. ... Getting the key company players and decision makers involved will help you create a better data strategy overall, and getting their buy-in at this crucial early stage means they’re more likely to put all that data to good use later on. Keep in mind that, like any business improvement process, things may shift or evolve along the way.


Securing IoT devices: Fortinet's FortiNAC automates the process

This week, security vendor Fortinet announced its new FortiNAC solution aimed at addressing many of the limitations of current NAC products. FortiNAC came to Fortinet via the acquisition of Bradford Networks earlier this year and fills a hole in the vendor's “Security Fabric” story of delivering consistent, end-to-end threat protection. The strength of FortiNAC is visibility and how it discovers all the endpoints. Instead of relying on a database or endpoint agents, FortiNAC is completely agentless and automates the discovery of endpoints by ingesting a wide range of data sources, such as RADIUS, SNMP, DHCP, LDAP and others, as well as behavioral information. This lets FortiNAC identify over 1,500 device types, compared to other solutions that can identify 500 to 1,000. ... Also, because it pulls information from a wide range of sources, it can identify devices connected on Wi-Fi or the wired network. The majority of IoT devices use Wi-Fi, which is where much of the focus has been from the NAC vendors, but wired IoT endpoints are used widely in many verticals.


Making Change Is Not a Matter of Willpower


“Employees had to revisit their decisions about how to get to work. They could not just mindlessly repeat their old habits. The new office location turned out to be an opportunity for those with strong environmental values to take mass transit rather than drive to work every day.” Changing contexts gave people the opportunity to think about what they were doing. That changed the habit triggers, which in turn created the opportunity to change behavior. “People have challenges in changing behavior because of a misunderstanding about what controls many of our everyday actions,” she says. “Motivation and understanding just aren’t enough on their own to effect change.” Old habits can endure longer than the motivation to try something new, even for the most dedicated of employees. “As leaders, we need to ask: ‘What do I want people to do on a daily basis in this environment?’” She advises, “Understand the underlying context, and make changes needed so that the desired behavior is easy and rewarding.... Everyone responds to that. When your focus is on the behavior, you can create change that outlasts people’s old habits.”


Notes from the frontier: Modeling the impact of AI on the world economy

New research from the McKinsey Global Institute attempts to simulate the impact of AI on the world economy. First, it builds on an understanding of the behavior of companies and the dynamics of various sectors to develop a bottom-up view of how AI technologies are adopted and absorbed. Second, it takes into account the disruptions that countries, companies, and workers are likely to experience as they transition to AI. There will very probably be costs during this transition period, and they need to be factored into any estimate. The analysis examines how economic gains and losses are likely to be distributed among firms, employees, and countries and how this distribution could potentially hamper the capture of AI benefits. Third, the research examines the dynamics of AI for a wide range of countries—clustered into groups with similar characteristics—with the aim of giving a more global view.



Quote for the day:


"When a man assumes leadership, he forfeits the right to mercy." -- Gennaro Angiulo


Daily Tech Digest - September 04, 2018

Cyber security training: Is it lacking in the enterprise?
Everyone in an organisation who is connected to the internet should be given general cyber security training. This is “definitely lacking,” says Wool. As phishing scams – among others – surge, the untrained employee remains a constant risk to the security of their company. The level of training needs to be improved, because currently “there is a poor understanding of the basics of the threat landscape,” according to Wool. “This is something that should be taught in elementary schools. When children learn how to use Excel, PowerPoint and Google, it makes sense for them also to be trained on basic safety rules, just like crossing the street.” ... “A lot can be done and it can be effective, but it takes a very long time to put together,” explains Wool. “Think: how long did it take the human race to figure out what needs to be done to make vehicle transportation reasonably safe. Think about sidewalks, zebra crossings, highway exit and entry ramps and so on. It took 100 years from the invention of the automobile to where we are now. When it comes to safety, we can always do better.”



Multi-Clouds And Composable Infrastructure At VM World

One important trend at VM World and leading up to the show was a focus on software defined infrastructure, including what is called composable infrastructure, which allows virtualizing and addressing individual components, such as storage devices. Before VM World, Dell EMC announced their PowerEdge MX, a high performance, modular infrastructure solution that the company said will easily adapt to future technologies and server disaggregation (a term often used in composable infrastructure). Dell EMC says that the system’s kinetic infrastructure is “uniquely designed without a mid-plane, enabling support for multiple generations of technology releases—processor technologies, new storage types and new connectivity innovations—well into the future. Specifically, the absence of a mid-plane enables direct compute to I/O module connections, allowing for future technology upgrades without disrupting customer operations and without a mid-plane upgrade.”


The Roadmap To Digital Manufacturing Transformation
To build a roadmap to digital transformation, more often than not companies look into the future, attempting to visualize where they want or need to be in twenty years, and plan backwards. For many, however, a more proactive approach to planning is to accept that “You can't know where you're going without knowing where you are now.” We often talk to companies who have predictive and preventative aspirations but who still don’t have machines networked, the necessary IT infrastructure to capture and aggregate machine data, or the internal organizational resources required to decipher the data and implement continuous process changes. This roadmap should actually be quite logical at its core: let’s become as capable as we can and get all our ducks in a row to ready ourselves for the greater journey ahead. Once we’ve optimized capability, it’s time to digitize our assets, visualize our manufacturing data in real time, and measure the success of our KPIs using our tools.


Fintech companies: The ideal talent pool for banks?

Partner up with fintech companies so you can leverage their talent pool. Thailand’s Bank of Ayudhya (BAY), for example, has forged relationships with 25 fintech companies so far, and expects that number to rise to 40 by the end of the year. Other banking giants in Thailand are using the same strategy. Bangkok Bank’s Executive VP Kukkong Ruckphaopunt told local media that the partnership model is a key strategy for acquiring tech talent. Wirawat Panthawangkun, Senior Executive VP of Kasikornbank, said KBank too is using a similar strategy, investing in tech firms to scout out tech talent from around the world. Banks and financial services institutions in other parts of Asia are following suit — except in China, which produces eight million computer science graduates every year and needs only the occasional innovator or thought leader to accelerate its tech growth.


What hiring managers want to see in data scientists’ CVs

Some candidates focus on who they reported to, others focus on the accuracy and/or complexity of the models they built, while others only mention the types of projects they worked on. ... My ethos, which is essential in a commercial environment, is to always start with the simplest possible model and only optimise and/or add complexity if/as required. This is precisely what the Lean Startup framework mandates and is precisely what we do in my Data Science team at Royal Mail. This is because you would usually hit diminishing returns as you continue to optimise and/or add complexity to a model, and the key is to know when your model is good enough to have a tangible business impact, and then deliver it, realise the value and move on to the next most crucial problem. So ideally, in the work experience section of the CV, I would like to see multiple impact statements, at least one for each Data Science role the candidate has held. This would give me confidence that the candidate has good commercial awareness and is worth investing in, as I can expect a good ROI.


Microservices in a Post-Kubernetes Era


On cloud native platforms, observability is not enough. A more fundamental prerequisite is to make microservices automatable, by implementing health checks, reacting to signals, declaring resource consumption, etc. It is possible to put almost any application in a container and run it. But to create a containerized application that can be automated and orchestrated effectively by a cloud-native platform requires following certain rules. Following these principles and patterns, will ensure that the resulting containers behave like a good cloud-native citizen in most container orchestration engines, allowing them to be scheduled, scaled, and monitored in an automated fashion. Rather than observing what is happening in a service, we want the platform to detect anomalies and reconcile them as declared. Whether that is by stopping the directing of traffic to a service instance, restarting, scaling up and down, or moving a service to another healthy host, retrying a failing request, or something else, this doesn’t matter.
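The automatability prerequisites listed above can be sketched in Python (hypothetical names, no particular framework's API assumed): a service that distinguishes liveness from readiness and reacts to SIGTERM, so an orchestrator can stop routing traffic to it and shut it down gracefully.

```python
import signal

# Minimal sketch of a "good cloud-native citizen": separate liveness and
# readiness answers, plus a SIGTERM handler for graceful shutdown.
class Service:
    def __init__(self):
        self.started = False
        self.shutting_down = False
        signal.signal(signal.SIGTERM, self._on_sigterm)

    def start(self):
        self.started = True  # e.g. after connections and caches are warmed

    def _on_sigterm(self, signum, frame):
        self.shutting_down = True  # stop taking new work, drain in-flight

    def live(self):
        # Liveness: "restart me" signal if this ever goes False.
        return True

    def ready(self):
        # Readiness: only route traffic while True.
        return self.started and not self.shutting_down

svc = Service()
assert not svc.ready()   # not ready before startup completes
svc.start()
assert svc.ready() and svc.live()
svc._on_sigterm(signal.SIGTERM, None)
assert not svc.ready()   # draining: the platform stops sending traffic
```

In a real deployment the two methods would back HTTP probe endpoints that the orchestrator polls; the point is that the platform, not an observer, acts on the answers.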


Card-Skimming Malware Campaign Hits Dozens of Sites Daily

Websites don't necessarily catch on quickly after an infection. "The average recovery time is a few weeks, but at least 1,450 stores have hosted the magentocore[dot]net parasite during the full past six months," de Groot writes. Attackers often execute a brute-force attack against a Magento control panel, de Groot says. And attackers are clever: Their code can remove other malicious code that's already in a Magento installation and is also designed to hide its tracks. It does so via a backdoor in a cron.php file placed by the attackers, which periodically downloads "malicious code, and, after running, delete[s] itself, so no traces are left," he writes. The code also changes the password for registered Magento users to "how1are2you3," de Groot writes. ... It's best to nuke infected installations and restart, he says. "Revert to a certified-safe copy of the codebase, if possible," de Groot writes. "Malware is often hidden in default HTML header/footers, but also in minimized, static JavaScript files, hidden deep in the codebase."


A glimpse into the dark underbelly of cryptocurrency markets


What is the business model of the coin rankings sites? Sites like CoinMarketCap, CoinGecko, CoinRanking, Cryptoslate, CryptoCoinRankings, CoinCodex, CryptoCoinCharts, (et al.) sell ads, and in some cases, insert affiliate links to the exchanges. Some of them will sell blended pricing APIs to more sophisticated traders who want a reliable price feed. Many if not most exchanges have affiliate schemes, and referral links (“reflinks”) can be a lucrative source of revenue if you are the intermediary between active traders and exchanges. Sometimes rankings sites win doubly by accepting payment for banner ads for exchanges or trading venues, and then including their own affiliate links in the ad itself. It’s good money if you can get it. Investors go to these sites to find links to exchanges where they can trade their coins of choice, especially if they are smaller projects and do not have many points of liquidity. Since the rankings sites are the ports of call for investors, they have an almost captive audience and can easily monetize with an affiliate link.


Bitcoin Gold delisted from major cryptocurrency exchange after refusing to pay hack damages

The hack at the center of this dispute took place between May 18 and 22, according to an incident response report published this May. The BTG team says the hack was a combination between a 51% attack and a double-spend attack. BTG experts said hackers rented servers through the NiceHash cryptocurrency mining market to overwhelm the Bitcoin Gold network and take control of more than half the BTG network computational hashrate. This is what cryptocurrency experts call a "51% attack," a dangerous scenario that grants attackers the ability to modify transaction details on the entire Bitcoin Gold network. The BTG team says that during the 3.5 days attackers overwhelmed the Bitcoin Gold network, hackers deposited large quantities of Bitcoin Gold funds at cryptocurrency trading platforms. Seconds after these deposits, hackers would convert the funds into another cryptocurrency and transfer the money to new accounts at other exchanges.
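Why a hashrate majority lets attackers rewrite transactions can be quantified with the catch-up probability from the original Bitcoin whitepaper (a general property of proof-of-work chains, not something derived in the article): with attacker hashrate share q and honest share p = 1 - q, the chance of ever overtaking the honest chain from z blocks behind is 1 when q ≥ p, and (q/p)^z otherwise.

```python
# Gambler's-ruin result from the Bitcoin whitepaper: probability that an
# attacker ever overtakes the honest chain from z blocks behind. A majority
# attacker (q >= 0.5) succeeds with certainty, which is why a 51% attack can
# rewrite recent history regardless of confirmation depth.
def catch_up_probability(q, z):
    p = 1.0 - q  # honest hashrate share
    if q >= p:
        return 1.0
    return (q / p) ** z

assert catch_up_probability(0.51, 6) == 1.0      # majority always catches up
assert catch_up_probability(0.10, 6) < 2e-6      # small minority fades fast
print(catch_up_probability(0.30, 6))  # ~0.0062 at 6 confirmations
```

This is why exchanges raise confirmation requirements after such attacks: against a minority attacker, each extra confirmation shrinks the risk geometrically, but against a majority it buys nothing.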


Google and Mastercard cut a secret deal to track retail sales data

A Google spokeswoman declined to comment on the partnership with Mastercard, but addressed the ads tool. "Before we launched this beta product last year, we built a new, double-blind encryption technology that prevents both Google and our partners from viewing our respective users’ personally identifiable information,” the company said in a statement. “We do not have access to any personal information from our partners’ credit and debit cards, nor do we share any personal information with our partners.” The company said people can opt out of ad tracking using Google’s “Web and App Activity” online console. Inside Google, multiple people raised objections that the service did not have a more obvious way for cardholders to opt out of the tracking, one of the people said. Seth Eisen, a Mastercard spokesman, also declined to comment specifically on Google. But he said Mastercard shares transaction trends with merchants and their service providers to help them measure "the effectiveness of their advertising campaigns.”



Quote for the day:


"Stressing output is the key to improving productivity, while looking to increase activity can result in just the opposite.” -- Paul Gauguin


Daily Tech Digest - September 03, 2018

Taking the pulse of machine learning adoption

The least surprising part of the survey is how respondents categorized their organizations' experience with ML: roughly half are beginners in the exploration phase who are just starting to investigate ML. The remainder -- early adopters with roughly two years of ML experience and "sophisticated" organizations with at least five years -- accounted for 36% and 15%, respectively. Our take is that if you blew out the survey to a totally blind sample taken from the general population, those numbers would drop considerably. Nonetheless, we'd surmise that these organizations, by virtue of their budgeting for IT/data or analytics-related learning, are among those who will be spending the lion's share on IT -- and AI and ML in particular. In the interest of full disclosure, these results are of more than passing interest to us because of the primary research that we're conducting for the day job -- Ovum research jointly sponsored with Dataiku on the people and process side of AI -- where we'll be presenting the results at the Strata conference next month.


The Moral Responsibility of Social Networks
How can social media outlets better tune their algorithms? It's a challenging technical problem, but it would also require a willingness to forgo ad revenue that plays on the back of intentionally manipulative or offensive content. There are also battles to be waged against crafty legitimate users who post edgy content that constantly skirts the boundaries of terms of service. As an example, Twitter struggled internally with how to handle right-wing commentator Alex Jones. But the decisions over Jones and lesser firebrands shouldn't be difficult. Neither Twitter nor Facebook or any other company would allow a speech in their corporate headquarters that, for example, employs racist dog whistles or subtly encourages aggression against refugees. And online, their policies should be no different. Such censorship would raise ire, of course. Just a handful of social media outlets have become the main channels for distributing information. Drawing up guidelines for acceptable content isn't difficult, but it is hard to evenly apply them.



For CIOs and CISOs security decision is no less than a dilemma

Just imagine the scene through the eyes of any CIO, CISO or CSO and most would agree it’s certainly a big dilemma – if not handled in the right way, it could be detrimental in its own way. “Exactly, of course we know that is the dilemma and what the right (security) approach should be – is what we are saying,” said Bhaskar Bakthavatsalu, Managing Director – Check Point, a cybersecurity solutions company known for its firewall technology. There are more than a thousand security vendors to deal with, a wide range of security technology products and solutions to choose from, security controls to be matched to the unique needs of the organisation and its business domains, and government and industry regulations to adhere to, plus distinctive business demands. ... On top of that, there are continuous cyber threats and unknown, sophisticated virus and malware attacks emerging almost every day from anonymous sources and cybercriminals operating from untraceable locations around the world.


Most UK businesses are not insured against security breaches and data loss, says study

“Third party risk is an interesting topic for cyber insurance underwriting that will certainly evolve as this space matures. Currently cyber insurance underwriting is more focused on the entities themselves being insured; however, underwriting takes numerous variables into consideration, and third-party risk will certainly be a factor in the underwriting process, in particular for larger enterprises.” “Security ratings is one of many variables utilised in the underwriting process. Things such as the company itself, the overall industry risk, responses from questionnaires issued, etc. are all factored in, in addition to security ratings. Each area is weighted accordingly to the overall risk being assessed. As the security ratings industry matures, more weight will certainly be lent to the information security ratings provide. When it comes to SMBs, insurers are less focused on assessing the individual risk of each individual company and more on managing the overall risk of the portfolio.”


Difference Between UX and UI Design

Years ago, we had doctors - just doctors. They practiced every kind of medicine, had small offices, and even made house calls. We called them general practitioners. As the field of medicine grew and research and knowledge expanded, doctors began to specialize. Now we go to one doctor for ear, nose and throat issues; we go to another for skin issues; we go to others for issues with any of our major internal organs. ... So, now we have UX and UI designers, each with their specific facets of web design. These terms are often used interchangeably, however, and there is some disagreement as to what exactly each specialty entails. So here is a basic definition of each. UX designers do a lot in the area of how users interact with products and services and designing that flow of interaction, but they do not focus on marketing or sales. They do, however, work with marketing departments on, for example, the sequence in which products and services may be presented.


Understanding Type I and Type II Errors

In statistical test theory, the notion of statistical error is an integral part of hypothesis testing. The statistical test requires an unambiguous statement of a null hypothesis, for example, "this person is healthy", "this accused person is not guilty" or "this product is not broken". The result of the test of the null hypothesis may be positive or may be negative. If the result of the test corresponds with reality, then a correct decision has been made. However, if the result of the test does not correspond with reality, then two types of error are distinguished: type I error and type II error. ... Type I and type II errors depend heavily upon the language or positioning of the null hypothesis. Changing the positioning of the null hypothesis can cause type I and type II errors to switch roles. It’s hard to make a blanket statement that a type I error is worse than a type II error, or vice versa. The severity of the type I and type II errors can only be judged in the context of the null hypothesis, which should be thoughtfully worded to ensure that we’re running the right test.
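Both error types can be made concrete with a quick Monte Carlo sketch (illustrative only; the test statistic, sample size and effect size here are assumptions, not from the article): a type I error is rejecting a true null, a type II error is failing to reject a false one.

```python
import random
import statistics

# Estimate the type I and type II error rates of a two-sided z-test of
# H0: mean = 0, with known sigma = 1, sample size 30, alpha = 0.05.
random.seed(0)

def z_test_rejects(sample, mu0=0.0):
    n = len(sample)
    z = (statistics.mean(sample) - mu0) / (1 / n ** 0.5)
    return abs(z) > 1.96  # two-sided critical value for alpha = 0.05

def rejection_rate(true_mean, trials=2000, n=30):
    rejected = sum(
        z_test_rejects([random.gauss(true_mean, 1) for _ in range(n)])
        for _ in range(trials)
    )
    return rejected / trials

# H0 is true: any rejection is a type I error (expect roughly alpha = 0.05).
type_i = rejection_rate(true_mean=0.0)
# H0 is false (true mean 0.5): failing to reject is a type II error.
type_ii = 1 - rejection_rate(true_mean=0.5)
print(round(type_i, 3), round(type_ii, 3))  # roughly 0.05 and 0.22
```

Rewording the null hypothesis (e.g. testing "mean = 0.5" instead) swaps which simulated scenario produces which error, matching the "switch roles" point above.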


Data breach reports see 75% increase in last two years

“Reporting data breaches wasn’t mandatory for most organisations before the GDPR came into force,” explained Andrew Beckett, “so while the data is revealing, it only gives a snapshot into the true picture of breaches suffered by organisations in the UK. “The recent rise in the number of reports is probably due to organisations gearing up for the GDPR as much as an increase in incidents. Now that the regulation is in force, we would expect to see a significant surge in the number of incidents reported, as the GDPR imposes a duty on all organisations to report certain types of personal data breach. “We would also expect to see an increase in the value of penalties issued, as the maximum possible fine has risen from £500,000 to €20 million or 4 per cent of annual turnover, whichever is higher. The ultimate impact is that businesses face not only a much greater financial risk around personal data, but also a heightened reputational risk.”
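The "whichever is higher" cap quoted above is a simple maximum of two terms. A minimal sketch (the function name and example turnover figures are illustrative, not from the article):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    # GDPR upper tier: €20 million or 4% of annual turnover,
    # whichever is higher.
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a €100M-turnover firm, 4% is €4M, so the €20M floor applies.
print(max_gdpr_fine(100_000_000))    # → 20000000
# For a €1B-turnover firm, 4% is €40M, which exceeds the floor.
print(max_gdpr_fine(1_000_000_000))  # → 40000000.0
```

The floor matters most for smaller organisations: below €500M in turnover, the €20 million figure dominates.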


5 Lessons I Have Learned From Data Science In Real Working Experience

Be like a detective. Carry out your investigation with laser focus on details. This is particularly important during the process of data cleaning and transformation. Data in real life is messy, and you must have the capability to pick up signals from the ocean of noise before you get overwhelmed. Therefore, having a detail-oriented mindset and workflow is of paramount importance to be successful in Data Science. Without a meticulous mindset or a well-structured workflow, you might lose your direction in the midst of diving into exploring your data. You may be diligently performing Exploratory Data Analysis (EDA) for some time but still may not have reached any insights. Or you may be consistently training your model with different parameters, hoping to see some improvement. Or perhaps you may be celebrating the completion of an arduous data cleaning process, when the data could in fact be not clean enough to feed to your model.


Is It Time to Replace Your Network's Annual Check-Up?

The evolution toward a more holistic, personalized health maintenance program will create an explosion of data. In fact, the amount of worldwide health care data is expected to grow to 25,000 petabytes in 2020. This will put more pressure on our communication networks. As a result, it's imperative to ensure the "health" of the data network is robust and that sharing patient information amongst all stakeholders is possible. Much like the annual physical health checkup, the traditional approach of many network managers was to conduct infrequent network performance checkups and to take action only when there was an unexpected outage or issue. In today's on-demand world, where users expect their communications to be available 24/7, this is no longer acceptable. If network managers look only for alarms, they see just a fraction of the information available at any given moment and lose the ability to see the complete network health picture. This can restrict how much preventive action can be taken to avoid network disruption.


The pressure's on: digital transformation seen as a make-or-break proposition for IT managers

As with many technology trends over the years, many executives rush to buy the shiny new gadgets, expecting them to work miracles on their calcified, customer-repelling processes. Digital transformation -- and all the technologies associated with it -- is only the latest example. Companies attempt to put digital approaches in place, thinking they can do things cheaper, without funding the essential background work, such as data integration. But the competitive pressure is intense: 85 percent of those surveyed said disruption in their industry has accelerated over the past 12 months. Thirty-five percent say the primary driver for digital transformation is advances made by competitors, 23 percent cite changes in regulation, and 20 percent pressure from customers - "meaning digital transformation is mostly being driven by reactive needs, instead of proactive ideas," the survey's authors conclude.



Quote for the day:


"If You Don't Like Your Situation, Take Actions To Change It, Hope Is Not A Strategy." -- Gordon Tredgold