Daily Tech Digest - April 27, 2018

Developers, rejoice: Now AI can write code for you

A new deep-learning application can help human programmers navigate the growing number of increasingly complex APIs, making coding easier for developers. The system, called BAYOU, was developed by Rice University computer scientists, with funding from the US Department of Defense's Defense Advanced Research Projects Agency (DARPA) and Google. While the technology is in its infancy, it represents a major breakthrough in using artificial intelligence (AI) for programming software, and could make coding much less time-intensive for human developers. BAYOU essentially acts as a search engine for coding, allowing developers to enter a few keywords and see Java code that will help with their task. Researchers have tried for more than 60 years to build AI systems that can write code, but failed because these methods require a lot of detail about the target program, making them inefficient, BAYOU co-creator Swarat Chaudhuri, an associate professor of computer science at Rice, said in a press release.



6 Reasons Why IT Workers Will Quit In 2018

Across every generation, job satisfaction is strong, with 70 percent of IT workers saying they are content with their current job. But while they enjoy their careers, nearly two-thirds of IT pros said they aren’t happy with their compensation. By generation, 68 percent of millennials (those born between 1981 and 1997) said they feel underpaid, while 60 percent of Gen Xers (those born between 1965 and 1980) and 61 percent of baby boomers (those born between 1946 and 1964) said the same. Of those who said they were looking for a new job in 2018, 81 percent of millennials said they wanted a higher salary, while 70 percent of Gen Xers and 64 percent of baby boomers said the same. Millennials may be more motivated by salary considering they earn an average of $50,000 per year. Meanwhile, Gen Xers in IT earn an average of $65,000 per year, while baby boomers average around $70,000 per year. Some companies are already taking steps to secure their junior workers with a pay raise, as 62 percent of millennials expect to get a raise in 2018 from their current employer and 31 percent expect a promotion.


Employees still in the dark about data protection


According to the EEF report, a “worryingly large” 12% of manufacturers surveyed have no process measures in place to mitigate the threat. Only 62% of respondents said they train staff in cyber security, 34% said they do not offer cyber security training and 4% said they did not know. “The Beyond the Phish report illustrates the importance of combining the use of assessments and training across many cyber security topic areas, including phishing prevention,” said Joe Ferrara, general manager at Wombat. “Our hope is that by sharing this data, infosec professionals will think more about the ways they are evaluating vulnerabilities within their organisations and recognise the opportunity they have to better equip employees to apply cyber security best practices and, as a result, better manage end-user risk.” According to Wombat, the report validates the need for organisations to use a combination of simulated attacks and question-based knowledge assessments to evaluate their end-users’ susceptibility to phishing.


Organizations gaining new benefits by automating data engineering

Historically, the necessity of data engineering was only matched by its tediousness. Preparation for data analytics and application use involved some wrangling that produced two undesirable side effects. First, wrangling measures like cleansing, transforming, integrating and curating raw data traditionally monopolized data scientists’ time. Second, the complexity and lengthy duration of these tasks often alienated the business from using data. However, a number of advancements in data engineering have now decreased data preparation time while increasing time for exploration and applications. By automating aspects of the wrangling process, expediting data quality measures, and making these functions both repeatable and easily shared with other users, alternative solutions to this problem are “empowering your more business type users with functionality that maybe would have only been available to a database administrator or DB doers,” explains Noah Kays, director of content subscriptions at Unilog, which offers a product information management platform.


Apple Is Struggling To Stop A 'Skeleton Key' Hack On Home Wi-Fi


Even with all Apple's expertise and investment in cybersecurity, there are some security problems that are so intractable the tech titan will require a whole lot more time and money to come up with a fix. Such an issue has been uncovered by Don A. Bailey, founder of Lab Mouse Security, who described to Forbes a hack that, whilst not catastrophic, exploits iOS devices' trust in Internet of Things devices like connected toasters and TVs. And, as he describes the attack, it can turn Apple's own chips into "skeleton keys." There's one real caveat to the attack: it first requires the hacker take control of an IoT technology that's exposed on the internet and accessible to outsiders. But, as Bailey noted, that may not be so difficult, given the innumerable vulnerabilities that have been highlighted in IoT devices, from toasters to kettles and sex toys. Once a hacker has access to one of those broken IoT machines, they can start exploiting the trust iOS places in them.


“SamSam” ransomware – a mean old dog with a nasty new trick

One cybersecurity catchphrase you’ll hear these days is that “X is the new ransomware”. That’s because the ransomware scene is no longer clearly dominated by long-running, well-known “brand names” (so to speak) such as CryptoLocker, TeslaCrypt or Locky. In other words, many people are convinced that ransomware has had its day, is dying out, and new threats are taking over. A popular value for the variable X in the equation above is cryptojacking, where crooks sneakily insinuate cryptocurrency mining software onto your computer or into your browser. Rather than snatching away your files, like ransomware does, cryptojackers steal your processing power and your electricity instead. This means that the crooks earn a tiny bit of money from every victim for as long as they’re infected, rather than taking the all-or-nothing approach of ransomware, where victims face a stark choice: pay and win, or refuse and lose.


Five areas of fintech that are attracting investment

Overall investment and merger and acquisition activity in fintech almost halved from a record high of $46.7bn in 2015 to only $24.7bn last year, according to KPMG. This is partly a natural, even welcome, correction after the initial hype. Uncertainty created by the Brexit vote and Mr Trump’s election has also had an effect, however. Another negative factor was the governance scandal last year at Lending Club, the biggest online lender in the US, combined with disappointing performances by some of its rivals, which turned investors off peer-to-peer lending. Investor interest continues to rise in some areas of fintech, however, including cyber security, artificial intelligence, blockchain technology and insurtech. There has also been positive news from the two winners of last year’s Future of Fintech awards. Paytm, the Indian electronic payments company, has thrived following the country’s withdrawal of high-value banknotes, and Transmit Security, the cyber security start-up, recently announced a $40m self-funding round.


Data and privacy breach notification plans: What you need to know

IT alone is not in a position to have all the knowledge needed to execute on even the most refined notification plans. Instead, “the lawyers, the security officers, crisis communication specialists and IT professionals all need to be lashed together at the hip,” Bahar said. “It takes their combined expertise and judgment.” Bahar even suggests that your organization’s legal team might have to take a leadership role in the notification process. “The potential litigation and regulatory stakes are so high, not to mention the public relations and reputational stakes, so the lawyers need to be heavily involved,” he says. The legal team can help work out what is said and how it is said to best meet requirements and minimize risk—and they don’t need to be wasting time conducting time-sensitive legal research. Many regulations require public disclosure of the breach, whether that’s to customers, shareholders, partners, and so on. This is where marketing and public relations teams can help with that communication.


Best Security Software: How 9 Cutting Edge Tools Tackle Today's Threats

Threats are constantly evolving and, just like everything else, tend to follow certain trends. Whenever a new type of threat is especially successful or profitable, many others of the same type will inevitably follow. The best defenses need to mirror those trends so users get the most robust protection against the newest wave of threats. Along those lines, Gartner has identified the most important categories in cybersecurity technology for the immediate future. We wanted to dive into the newest cybersecurity products and services from those hot categories that Gartner identified, reviewing some of the most innovative and useful from each group. Our goal is to discover how cutting-edge cybersecurity software fares against the latest threats, hopefully helping you to make good technology purchasing decisions. Each product reviewed here was tested in a local testbed or, depending on the product or service, within a production environment provided by the vendor. Where appropriate, each was pitted against the most dangerous threats out there today as we unleashed the motley crew from our ever-expanding malware zoo.


Sustainable Software with Agile


In the Agile Software Factory of Cegeka, all teams report bi-weekly to monitor whether we’re still doing the right things right within the agreed budget and timeframe. They fill in a progress report – PMI-style reporting on customer, timing, budget, scope, dependencies and quality. This report is made available to the software factory management and the customers. We value transparency and openness about the status of all project activities. It includes reporting on the sprint, the agreed SLAs, the defects and more. The application is monitored continuously from different perspectives. The Ventouris team has implemented a continuous build and deploy environment in which the automated tests run on each check-in of new code. If the build breaks, the information radiator indicates that it must be claimed and fixed with the highest priority. With test coverage of close to 100%, the team can avoid regression. The Ventouris team uses "New Relic" as its application monitoring tool to follow up on performance against each SLA per transaction type.



Quote for the day:


"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller


Daily Tech Digest - April 26, 2018

Delivering future-focused enterprises

It is important that CIOs and their teams ensure that IT isn’t perceived as the group that always says no, moves too slowly, or doesn't understand what the business needs. Part of fixing this is knocking down silos within IT. As part of this, CIOs need to empower their organizations to watch for technology changes that impact the business services they support. CIOs also need to address this through staffing strategy, designing roles that are good at more than transitions and operations. CIOs said it is also important to adopt and train everyone on a framework that brings IT together with one voice and as one team. Framework examples include ITIL, TOGAF, and IT4IT. Our CIOs said this process should optimize things IT-wide rather than for a single team. CIOs also said IT leaders need to push back on tactical band-aids and responses wherever possible. They need to establish proactive planning, strategic goals, and business-oriented metrics. For example, instead of measuring tickets per month for disk space, they should roll this kind of data up into a strategic metric regarding capacity-planning effectiveness.



Tackling Edge Computing Challenges

It’s easy to think edge computing magically solves many problems that cloud computing can’t, but there’s a trade-off due to the highly distributed nature of edge systems. The edge nodes are not completely independent, as each may need to share information with other nodes, and keeping data consistent is a challenge. The question is: How do I coordinate a large number of edge computing systems while still allowing them to work independently? This is a problem that has perplexed designers of distributed systems for many years. People call this the distribution, consistency, and synchronization problem. The number of edge computing systems will be high, so any solution will need to scale greatly. Altogether, this is a big problem to solve. Except for some very specialized workloads that simply process events and upload data, many applications processed at the edge need to share security, customer, and other contextual information. What kind of apps need to do this? IoT apps, gaming, advertising, virtual or augmented reality, and mobile apps are good examples.
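The coordination problem described here can be made concrete with a toy reconciliation rule. The sketch below uses naive last-write-wins merging of per-key state between two nodes; real edge platforms use more robust mechanisms such as vector clocks, CRDTs, or consensus protocols, and all names and data here are illustrative.

```python
# Naive last-write-wins merge: each node keeps key -> (timestamp, value).
# When two edge nodes synchronize, the newer write for each key survives.

def merge(local: dict, remote: dict) -> dict:
    """Merge two nodes' key -> (timestamp, value) maps; newest write wins."""
    merged = dict(local)
    for key, (ts, value) in remote.items():
        if key not in merged or ts > merged[key][0]:
            merged[key] = (ts, value)
    return merged

node_a = {"config": (100, "v1"), "session": (205, "active")}
node_b = {"config": (150, "v2")}

print(merge(node_a, node_b))
# → {'config': (150, 'v2'), 'session': (205, 'active')}
```

Last-write-wins is attractive because each node can keep working independently between syncs, but it silently drops concurrent updates, which is exactly why the consistency problem the author mentions remains hard at scale.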


5 signs you've been hit with an advanced persistent threat (APT)

APTs rapidly escalate from compromising a single computer to taking over multiple computers or the whole environment in just a few hours. They do this by reading an authentication database, stealing credentials, and reusing them. They learn which user (or service) accounts have elevated privileges and permissions, then go through those accounts to compromise assets within the environment. Often, a high volume of elevated log-ons occur at night because the attackers live on the other side of the world. If you suddenly notice a high volume of elevated log-ons across multiple servers or high-value individual computers while the legitimate work crew is at home, start to worry. APT hackers often install backdoor Trojan programs on compromised computers within the exploited environment. They do this to ensure they can always get back in, even if the captured log-on credentials are changed when the victim gets a clue. Another related trait: Once discovered, APT hackers don't go away like normal attackers. Why should they? They own computers in your environment, and you aren't likely to see them in a court of law.
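As a rough illustration of the heuristic above (this is not any vendor's detection logic), one could scan authentication logs for accounts with elevated log-ons across several hosts outside business hours. The thresholds, field layout, and records below are all made up.

```python
from datetime import datetime

# Hypothetical log records: (timestamp, account, host, elevated_privileges)
logons = [
    (datetime(2018, 4, 25, 2, 14), "svc_backup", "srv-01", True),
    (datetime(2018, 4, 25, 2, 31), "svc_backup", "srv-02", True),
    (datetime(2018, 4, 25, 2, 47), "svc_backup", "srv-03", True),
    (datetime(2018, 4, 25, 14, 5), "alice", "wkstn-7", False),
]

def flag_offhours_elevated(logons, start_hour=22, end_hour=6, min_hosts=3):
    """Flag accounts with elevated log-ons on many hosts outside business hours."""
    suspects = {}
    for ts, account, host, elevated in logons:
        off_hours = ts.hour >= start_hour or ts.hour < end_hour
        if elevated and off_hours:
            suspects.setdefault(account, set()).add(host)
    # Only accounts that touched at least min_hosts machines are suspicious
    return {acct: hosts for acct, hosts in suspects.items() if len(hosts) >= min_hosts}

print(flag_offhours_elevated(logons))
```

A rule this simple will produce false positives (legitimate overnight service accounts), which is why the article stresses comparing against what is normal for the legitimate work crew.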


Almost all London law firms are using or plan to use artificial intelligence

According to a survey of more than 100 law firms by real estate advisory CBRE, 48% are already using AI software in their businesses and 41% have imminent plans to do the same. The survey found 61% of the companies already using AI are doing so to generate and review legal documents. It also revealed 47% are using AI for due diligence purposes and 42% for research. About a third (32%) are using AI to carry out compliance and administrative legal support. Almost half (45%) said they expect a reduction in staff numbers as a result, but only 7% think senior jobs will be cut. “Our study found considerable uncertainty around the impacts of AI on employment, reflected by over 30% who were unsure of the potential impact at each level,” said Frances Warner Lacey, senior director of the central London tenant advisory group at CBRE. “This will make formulating a dynamic real estate strategy, to cope with these structural changes to the sector, particularly problematic for law firms.”


Cisco reinforces storage with new switches, mgmt. software

The idea is to eliminate the cycles spent in provisioning new devices and avert errors that typically occur when manually configuring complex zones. Even when a host or storage hardware is upgraded or a faulty facility is replaced, the switch automatically detects the change and zones them into the SAN, Cisco said. The switches also support a number of features that are typically only found in higher-end boxes, according to Adarsh Viswanathan, Cisco’s senior manager of storage product management and marketing. These include redundancy of components, HVAC/HVDC power options and smaller failure domains to ensure higher reliability. The switches also support Fibre Channel-NVMe to help customers moving towards all-flash storage environments. NVMe was developed for SSDs by a consortium of vendors including Intel, Samsung, Sandisk, Dell, and Seagate and is designed as a standard controller technology for PCI-Express interfaces between CPUs and flash storage. The switches fill out Cisco’s existing MDS storage-fabric switch line, which includes the 9132T 32 Port 32G Fibre Channel Switch and MDS 9396S 16G Multilayer Fabric Switch.


Artificial intelligence will be worth $1.2 trillion to the enterprise in 2018

Companies including Google, Apple, Microsoft, IBM, and Nvidia are already heavily involved in the research and development of AI-based products and services. According to CB Insights, startups worldwide are springing up to specialize in artificial intelligence with an emphasis in industries including customer relationship management, automotive, sales, marketing, and commerce. At first, Gartner believes strong growth will appear in the customer experience sector while enterprise players experiment with AI and offshoot technology, such as deep learning, neural networking, and machine learning software. Virtual agents, for example, can take over simple customer requests and tasks from call centers, reducing the cost for companies in offering customer helplines. By taking over the simple issues, human operators are then free to dedicate their time to complicated issues, which, in turn, may improve customer service.


SD-WAN benefits the changing network connectivity landscape


The future for services like MPLS, then, depends on the requirements for security and end-to-end traffic performance guarantees. With so many providers pushing SD-WAN as internet-based VPN services, MPLS will see a decline in usage, as IT teams view the platform as restrictive and expensive. The private nature of MPLS connections means an organization can access only certain cloud services, depending on whether it has connections to private cloud services in its data center or office locations. But MPLS is the technology of choice when enterprises require end-to-end traffic performance and privacy. While internet-based SD-WAN benefits include granular traffic control for both prioritization and connection states, quality of service (QoS) exists primarily at the customer edge. With MPLS, end-to-end traffic prioritization is an inherent property of the technology that translates into predictable latency and jitter to support mission-critical and delay-sensitive applications.


Why Hackers Love Healthcare

Most healthcare organizations spend just 3% of their IT budgets on security, while the SANS Institute — the largest provider of cybersecurity training and certifications — recommends spending at least 10%. For most healthcare organizations, security is often an afterthought. They don't provide regular cybersecurity training for their employees, which could help reduce insider threats. For example, 18% of healthcare employees say they're willing to sell their login credentials for between $500 and $1,000. And about one-quarter of healthcare employees know someone in their organization who has engaged in this practice. To address employee-related cyber vulnerabilities, it's important to note that while training is essential, it won't magically protect patients’ digital data. Although some hospitals struggle to deploy the most basic IT security measures, such as intrusion detection and the ability to wipe lost or stolen devices, it is imperative that basic cyber hygiene practices are coupled with ongoing training to both protect well-intended employees and mitigate future data loss from those seeking to profit.


How Machine Learning Is Changing the World -- and Your Everyday Life
Machine learning and the IoT are enhancing the way we communicate and live our daily lives. Impressive advancements are being made in mind-reading technology, such as the AlterEgo headset that responds to our brainwaves to control appliances around the house. This tech has been in development for some time, and while the AlterEgo is still a little awkward looking, it isn't difficult to picture how its wearability will be improved over the next decade. It's exciting to imagine the implications for these advancements to change the way you operate the appliances in your home. The automation of our domestic lives is already occurring. Amazon's Echo and Alexa allow for the voice-activated control of your smart home (the dimming of lights, closing of blinds, locking of doors, etc., all at your command). Even the humble fridge has been given the 21st-century makeover and is now connected to the internet. You can be at work and still see inside your fridge to know exactly what food you're running low on. You don't even necessarily need to go to the shop to restock. Your groceries can be ordered on the road and delivered to your door at your convenience.


Q&A on the Book Kanban Maturity Model: Evolving Fit-for-Purpose Organizations

The KMM is based on an organizational maturity model inspired and synthesized from a combination of the Capability Maturity Model Integration (CMMI) and Jerry Weinberg's maturity model published in his 1997 book, Software Quality Management, volume 1. The result of this synthesis gives us 7 levels, from 0 through 6. Levels 1 to 5 are intended as a direct mapping to the CMMI levels, with some minor changes in naming to improve clarity, and add a direct mapping to defined and observable business outcomes – something that was never explicit in CMMI. The unique selling point and key differentiator for the KMM is that the model maps increasing levels of business performance. We then correlated the observed practices and patterns of Kanban implementations against those observable business outcomes. For example, if a business steadily delivers good quality and predictable service and its customers are satisfied, then that is good enough for maturity level 3. If the satisfaction level is intermittent because service levels vary, and expectations aren't always met, then that is at most only maturity level 2.
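The outcome-to-level rule of thumb in the last two sentences can be sketched as a tiny function. This is a deliberate simplification of the KMM, which spans levels 0 through 6 and many more assessment criteria; the parameter names are illustrative, not part of the model.

```python
def kanban_maturity(quality_consistent: bool, delivery_predictable: bool,
                    customers_satisfied: bool) -> int:
    """Rough sketch of mapping observed business outcomes to a KMM level."""
    if quality_consistent and delivery_predictable and customers_satisfied:
        return 3  # steady quality, predictable service, satisfied customers
    if customers_satisfied:
        return 2  # satisfaction is intermittent because service levels vary
    return 1      # expectations are not being met

# A team with steady, predictable delivery and happy customers:
print(kanban_maturity(True, True, True))  # → 3
```

The point the authors make is precisely this direction of inference: the level is read off observable business outcomes, not off which practices a team claims to follow.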



Quote for the day:


"The essence of leadership is the capacity to build and develop the self-esteem of the workers." -- Irwin Federman


Daily Tech Digest - April 25, 2018

SOCs require automation to avoid analyst fatigue for emerging threats

SecOps needs an immediate shift across industries. Some SecOps teams develop playbooks for an additional layer of training, but when security events occur, it is uncommon to follow every step a playbook describes. The data becomes overwhelming, and the resulting alert fatigue leads analysts to overlook threats entirely, letting emerging threats slip through. The typical security analyst is facing a 40 percent increase in persistent threats and data breaches year over year. In the last year, there were over 1,500 breaches in the U.S. alone, exposing close to 179 million records. Additionally, the rising shortage of cybersecurity skills throughout the industry contributes to the threat detection fatigue experienced by current analysts. “In the ever-evolving threat landscape, we know machines can scale very well, but we cannot expect them to outpace human intelligence,” said Kumar Saurabh, CEO at LogicHub. “CISOs need to capitalize on irreplaceable expert human analyst knowledge to enrich security automation and provide the industry with the right training tools. This is the only way enterprises will stand a chance in protecting their most valued data.”



Introduction to Security and TLS

Encryption relies a lot on math, random number generators, and cryptographic algorithms. With encryption, there is the need for "keys": sequences of bits and bytes which are used to lock (encrypt) and unlock (decrypt) the data. With symmetric encryption, the same key is used to encrypt and decrypt a message. It means that everyone having that (blue) key will be able to decrypt the message. So, security depends on how securely I can distribute and keep that key. With asymmetric encryption, I have a pair of mathematically connected keys: a public green key and a private red key. I keep the red key private and do not disclose or distribute it. The green key is public: everyone can use it. Everyone can encrypt a message with the green public key, but only the one with the red private key is able to decrypt it. The public and private key build a pair of keys. They are different but mathematically related. That way, only the private key is able to decrypt a message encrypted with the public key.
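A toy numeric sketch of the two schemes helps make the distinction concrete. This is for illustration only: real TLS uses vetted algorithms (AES, RSA or elliptic curves, with proper padding and key sizes), never code like this.

```python
# Symmetric: the SAME key encrypts and decrypts (here, a trivial XOR cipher).
def xor_cipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

secret = xor_cipher(b"hello", 42)          # encrypt with the shared key
assert xor_cipher(secret, 42) == b"hello"  # decrypt with the same key

# Asymmetric: textbook RSA with tiny primes. The public key encrypts;
# only the holder of the private key can decrypt.
p, q = 61, 53
n = p * q                          # modulus, part of both keys
e = 17                             # public exponent (public key: e, n)
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (private key: d, n)

m = 65                  # a message, encoded as a number smaller than n
c = pow(m, e, n)        # anyone can encrypt with the public key
assert pow(c, d, n) == m  # only the private key recovers the message
```

The asymmetric half needs Python 3.8+ for the modular inverse via `pow(e, -1, ...)`. Note how the symmetric case forces the key-distribution problem the text describes, while the asymmetric case removes it: the public key can be handed to everyone.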


Securing smart factories: How Schneider Electric connects devices and prevents outages

What's happening over time for sure is, as you aggregate that data, as you can start to look at broader trends, you could start to bring in things like machine learning, and the thing that I think that we're seeing today that is the most pronounced, is that you still need quite a bit of human interaction when it comes to machine learning or AI. You need to identify patterns, and then you need to feed those back into machine learning so that you know what that pattern recognition looks like, and then you can start to take proactive measures, and so, just one example. You know a lot of outages or problems that happen in industrial setting, often start, you can actually look at things like partial discharge, or electrical partial discharge that happens in equipment. ... And so today, if you kind of looked at the signatures of what that looks like, a human being can look at that, you know we have thousands of electrical engineers in our company. Incredibly intelligent about what they do. You might not necessarily want to go out drinking with them but, they're a lot of fun too, to actually identify these problems. These guys can look at that, they can look at those signatures, they can instantly say, "You're going to have a problem here."


Is the U.S. headed toward a cashless economy through blockchain?

A government-backed digital currency could do away with banking fees that often target the poor who make many small, electronic payment transfers via services such as Western Union, while at the same time creating greater efficiencies. ... Cryptographic keys controlling funds could be in a consumer's control; the consumer could be issued a private key associated with their electronic funds and be able to use public keys for payments. Sweden's central bank, The Riksbank, is currently considering issuing a digital currency or cryptocurrency similar to bitcoin for mobile payments. Called the e-Krona, the digital currency would be used for smaller payments between consumers, businesses or with government agencies, and it would create safer and more efficient transactions, the government has argued. In 2015, Ecuador created the world's first state-sponsored digital currency, called Sistema de Dinero Electrónico, which was backed by the central bank; it allowed people to have money in accounts that could be traded on their phones. Ecuador, however, shuttered its electronic money system last year "due to lobbying by the banks," Garratt said.


How APIs can help prevent data warehouse hell

The challenge of breaking down data silos is just as much a problem of company culture as of technology. Department heads and even individual technicians may become territorial about the data in their care, reluctant to share it with others and suspicious about any plans to end a data silo, as is typical during traditional central warehousing efforts. A mandate from the top of the company can start the process of opening a data silo, but data owners may want to be able to do it at their own pace and comfort level. Adapting to those preferences is all but impossible in traditional big data projects. These problems shouldn’t cause despair. There’s a way to open data silos that addresses both the technical and human problems mentioned above. Data owners shouldn’t be forced to immediately dump all of the information together in a warehouse. The process should happen at a deliberate pace, set in part by the owners of the data. The data doesn’t ever have to leave the silo to be shared. The right API, inserted into the data silo by the data owner, provides access to the information for everyone who might need it.
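One way to picture the owner-controlled approach: a thin read-only query function over the silo that whitelists which fields ever leave it, so the owner shares access without surrendering the data. Every name and record below is hypothetical.

```python
# Data that stays inside the owner's silo; it is never exported wholesale.
SILO = [
    {"customer_id": 1, "region": "EMEA", "ssn": "XXX-XX-1234", "ltv": 1200},
    {"customer_id": 2, "region": "APAC", "ssn": "XXX-XX-5678", "ltv": 800},
]

# The owner decides which fields the API may return; ssn stays private.
SHARED_FIELDS = {"customer_id", "region", "ltv"}

def query(region: str):
    """Read-only API endpoint: returns only owner-approved fields."""
    return [
        {k: v for k, v in row.items() if k in SHARED_FIELDS}
        for row in SILO
        if row["region"] == region
    ]

print(query("EMEA"))  # → [{'customer_id': 1, 'region': 'EMEA', 'ltv': 1200}]
```

Because the whitelist lives with the data owner, widening access later (adding fields, adding callers) is a change the owner makes at their own pace, which is exactly the cultural point the article makes.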


Becoming a ‘Digital Bank’ More Than Lipstick on a Legacy Pig

To digitally deliver an exceptional customer experience, an organization must build from within, engaging all functional areas and stakeholders, to ensure a seamless and easy journey from shopping to purchase to use. This includes looking at all back-office processes and data flows to make sure they are in alignment with what is required by the digital consumer. For instance, it is virtually impossible to develop a 5-minute consumer loan product for both customers and prospects without completely revamping the process flow behind the scenes. It is obviously even more difficult to match the 30-second delivery offered by leading banks and credit unions worldwide. Beyond just changing the process, data needs to flow between legacy silos from the initiation of the process to its fruition. The digitization is even more difficult for a home loan, where the stakeholders include the realtor, loan underwriter, regulator, builder, insurer and the end customer. To optimize the journey, all of these stakeholders must be aligned and understand the final objective … to remove all friction from the consumer journey.


Artificial Intelligence: 6 Step Solution Decomposition Process


Success with artificial intelligence doesn’t begin with technology, but rather the business, and more specifically the people and processes running the business. Before deploying technology, leaders should seek to understand (envision) how artificial intelligence could power a profitable business, and drive compelling customer and operational outcomes. Collaboration with stakeholders and key constituents is critical to understanding the decisions and needs of the business. While every organization’s needs vary, there exists a consistent, transparent process that can drive a more stable and widespread adoption of artificial intelligence. Note: throughout this blog, when I use the term “artificial intelligence,” I mean that to include other advanced analytics such as deep learning, machine learning (supervised, unsupervised, reinforcement), data mining, predictive analytics, and statistics ... The power of this process is its simplicity. By staying focused on the business or operational objectives and tasks, businesses can successfully transform how they use data and analytics to produce optimal outcomes.



The importance of firmware security

There are several lessons to be taken away from the Fusée Gelée exploit, and they apply to OEMs as well as IT professionals. First off, manufacturers need to be sure that their hardware has been properly tested against all possible attacks. Fusée Gelée allows a device owner to hack their own hardware, which isn't a risk in itself, but it could also allow an attacker to write code to remotely execute a similar attack. Firmware security is a critical part of device design that can easily be exploited: just look at Spectre and Meltdown. Had Intel been diligent in seeking out vulnerabilities, it might not be facing a vulnerability in nearly every single processor it ever created. For IT support staff and security professionals, Fusée Gelée paints a whole other set of complications: hardware security. In the case of the Nintendo Switch, hardware modification was necessary to force the device to boot into recovery mode. Doing so isn't complicated though: it just requires bending an exposed pin.


Introduction to GraphQL

GraphQL was created as a query language for APIs. Its main purpose is to provide a flexible syntax for describing data requirements and interactions. Over its history, GraphQL has become an example of reliable, well-functioning software that can be used in a fairly simple way — even by junior-level programmers. Thanks to the features its creators implemented, GraphQL was able to replace earlier customized tools that had been designed for the same purpose. When we discuss the functions and aspects of GraphQL, it is essential to present those key capabilities. ... As you may have already noticed, one of the main benefits of GraphQL is that you, as a potential user, can get some development work done much more quickly. For example, instead of writing large amounts of boilerplate code, it may be enough to write one or two queries to fetch exactly what you need.
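To make the "ask only for what you need" idea concrete, here is a minimal sketch of the kind of query a GraphQL client sends. The schema and field names (user, name, posts, title) are hypothetical, invented for illustration:

```python
import json

# A GraphQL query names exactly the fields the client wants, and
# nothing more; the server returns JSON matching that shape.
query = """
query {
  user(id: 42) {
    name
    posts {
      title
    }
  }
}
"""

# Clients typically POST the query as JSON to a single endpoint,
# e.g. with an HTTP library: requests.post(url, json=payload)
payload = {"query": query}

print(json.dumps(payload))
```

Compare this with a REST design, where the client might need separate requests for the user and their posts, and would receive every field of each resource whether it needed them or not.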


Data protection is critical for all businesses

“Personal data is considered to be one of the most sensitive categories of data an organisation has access to, and perhaps it is the most valuable,” he says. “As the value of personal data increases, so should the controls needed to protect it. Personal data should be processed only with clear consent given by the data owner, with a transparent agreement and an organisation-wide focus on preventing data theft or misuse.” To identify misuse, he believes firms should constantly analyse their business procedures and operations to ensure they are compliant with the latest data protection safeguards. At Netskope, Thacker treats data protection as a constant operation. Firms should not assume that once they have installed or developed a system to protect customer data, they have nothing else to do. “I recommend enterprises continually discover new and amended business processes, working alongside the business to apply the necessary safeguards needed for protection,” he says. “The aim is to understand how employees – and third parties – are using personal data and to ensure it meets the sole purposes for which it was originally collected.”



Quote for the day:


"I think the next best thing to solving a problem is finding some humor in it." -- Frank A. Clark


Daily Tech Digest - April 24, 2018

The Importance Of EA for Business Transformation: Lessons Learned

In short, managing uncertainty is a necessity. Despite all the turbulence created by digital disruption, we believe that EA is mandatory for becoming a pioneer of innovation and a critical enabler of business vision. The main driver of this is that business reality is changing, and therefore IT needs to change. And EA practices need to reflect this change as well. Organizations that support Business Architecture as an integral part of EA have a significantly higher ability to execute on their corporate strategy because they have a clear understanding of the strategy and its impact on business and IT – and therefore have guidance to drive delivery. Enterprise Architects that deliver the highest business value and outcomes to their organization are those that focus on understanding the impact of major trends and opportunities on their business ecosystem, not just their own business. SKF IT uses Business and Enterprise Architecture to gain business insight and increase the relevance of IT.


Study Reveals Hottest Trends in Industrial IoT

Any time automation is mentioned, concerns about jobs are raised. While disruptive technology will affect job markets, it’s also leading to increased demand for talent, as AI and machine learning provide valuable information that must be carefully interpreted. When asked, CEOs around the globe discuss how critical talent is for remaining competitive, and demand will fuel higher salaries as companies compete for the best talent available. In the US, for example, over 80 percent of manufacturers claim to have difficulty finding qualified talent. Furthermore, 3.5 million jobs across the globe are likely to be created, leading to an increasing skills gap. New technology provides valuable opportunities for manufacturing and other fields, but it’s also placing pressure on C-level executives, as the cost of this new technology will demand responses for companies to remain viable. Executives will need to ensure they properly understand these new technologies and how they affect their segments, and they’ll need to uncover problems promptly to avoid being undercut by competitors.


Threat Actors Turn to Blockchain Infrastructure to Host & Hide Malicious Activity

Because blockchain top-level domains such as .bit are not centrally managed and have DNS lookup tables shared across a peer-to-peer network, takedown efforts become much more difficult. "When an individual registers a .bit — or another blockchain-based domain — they are able to do so in just a few steps online, and the process costs mere pennies." Domain registration is not associated with an individual's name or address but with a unique encrypted hash of each user. "This essentially creates the same anonymous system as Bitcoin for Internet infrastructure, in which users are only known through their cryptographic identity." Criminal interest in cryptocurrency-related topics is not new. As FireEye notes, threat actors have been exploring the possibility of leveraging the unique properties of blockchain technology to support malicious operations since at least 2009. One example is malicious actors' interest in Namecoin, a Bitcoin code-based cryptocurrency that allows pretty much anyone to register and manage domain names with the .bit extension.
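The "known only through their cryptographic identity" idea boils down to deriving an identifier from key material rather than from a name or address. A rough illustration of the general pattern, not Namecoin's actual address scheme:

```python
import hashlib

# Illustrative only: a registrant is identified by a hash of their
# public key, so no name or address is attached to the registration.
# Real blockchain address derivation differs in its details.
public_key = b"...some registrant's public key bytes..."
identity = hashlib.sha256(public_key).hexdigest()

print(identity[:16])  # a pseudonymous identifier
```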


Next generation of SCADA industrial controls will protect against cyber attack


Industrial control systems – known as supervisory control and data acquisition (SCADA) systems – which are used to control valves, motors and other industrial processes, are frequently based on technology that pre-dates the internet, and can be vulnerable to attack when modern control systems transmit and receive data over the internet. But large oil and manufacturing companies are working on plans to replace existing control system infrastructure with lower-cost alternatives that promise greater security against cyber attacks on control devices connected to the industrial internet of things, which links millions of internet-connected industrial devices. The project, co-ordinated by the Open Process Automation Forum, part of independent standards organisation The Open Group, aims to help oil and gas and process companies break free from manufacturer-specific industrial control systems, which are expensive to maintain and upgrade, and difficult to patch to protect against the latest security vulnerabilities.


Spring Has Splunk'd: Announcing New & Expanded Artificial Intelligence Capabilities

Reports claim AI is shaping the latest in consumer tech and also threatening future job growth. All of this is in the absence of a widely accepted definition of the term. Those of us dedicated to enterprise software are presented with a critical opportunity to move beyond the buzz. I’m excited to lead Platform marketing at Splunk, a company that has, for a decade, invested heavily in machine learning (ML)—predictive analytics, data clustering, and anomaly detection—which is a subset of artificial intelligence. Our customers—Hyatt, Recursion Pharmaceuticals, and TransUnion to name a few—rely on Splunk AI and ML to deliver actionable performance, productivity, and security benefits that map their real-world IT, security and business needs. Artificial intelligence through machine learning is integrated across our portfolio. AI through ML is embedded in our premium solutions (Splunk ITSI and Splunk UBA) for specific IT and security use cases. We also offer a customizable solution, Splunk Machine Learning Toolkit (MLTK)—applicable for a broad variety of use cases—within Splunk Cloud and Splunk Enterprise.


Nurses want to use IT, but are held back by barriers


“Poor connectivity when mobile working hinders information technology from being used to best effect,” the report said. “Systems fail to update and/or synchronise, programmes used for recording information fail to load and systems crash. This leads to nurses having to use paper-based methods of recording information and duplicating this onto IT systems back at base.” Another challenge is the cost of good IT systems. NHS organisations often work on yearly budgets, whereas the return on investment (ROI) of implementing digital systems is usually more long-term. “The ‘up-front’ cost of IT in a tight financial climate serves to increase the risks of waste if technology is not fully used,” the report said. “Systems are prone to crashing and are slow, leading to frustration and compelling community nurses to work from paper.” Some of the nurses surveyed also highlighted concerns that the use of IT took away from time spent with the patient, and that they often felt like the use of technology has “detracted from the role of being a nurse”.


Tech support scams are on the rise, up 24%, warns Microsoft

Not all of those scams were cold calls from fake tech support; some started at random websites that had a popup warning about detecting fake threats or fake error message popups. Other social engineering attacks started in email campaigns where the user would click on a URL or open a malicious attachment; once malware is on a computer, it can make system changes or flash fake error messages with a number to call to fix the problem. Scammers continue to resort to these tactics because they work so well to scare the pants off non-tech-savvy users. Of the 153,000 tech support scams reported to Microsoft, 15 percent of victims admitted to losing money in the scam. While most paid between $200 and $400 for the fake problems to be “fixed,” one scammer managed to drain the bank account of a user in the Netherlands. That poor person lost €89,000, which is about $108,838.54. For anyone wondering how a scammer managed to empty the victim’s bank account, Oregon’s FBI explained that some victims of tech support scammers first received a notification about a refund after overpaying for a previous tech support incident.


5 key enterprise IoT security recommendations

Not so long ago, the phrase “consumerization of IT” was on everyone’s lips. Whole publications and conferences (remember CITE, for Consumerization of IT in the Enterprise?) were created to chronicle the trend of corporations relying on products and services originally created for consumers — which were often easier to use and of higher quality than their business-oriented competitors. ... It turns out that in addition to the “enterprise grade” Internet of Things (IoT) devices they buy, corporate IT teams also have to deal with “consumer-grade” devices that may enter the company via a variety of channels, from non-IT company purchases to staff members bringing them in on their own. Examples include smart TVs, thermostats, smart speakers, fitness trackers, video cameras … basically anything connected to the company network that isn’t a computer, a phone, or a router. Not surprisingly, these devices often lack the comprehensive security features more commonly found on IoT products designed for enterprise use. Worse, perhaps, IT teams may not even be aware that these devices are being connected to their networks, much less be able to plan for their security.


'Death to JavaScript!' Blazor, for .NET Web Apps Using WebAssembly, Goes Alpha


Instead of a heavy dependence on JavaScript, notorious for its complex ecosystem, the new .NET Web framework lets developers use C#, Razor and HTML to create Web apps, with the help of WebAssembly, a low-level assembly-like language that serves as a compilation target for higher-order languages, including C, C# and C++. Razor is "an ASP.NET programming syntax used to create dynamic Web pages with ... C# or Visual Basic .NET." All those technologies combine to form Blazor, which we first reported on when a developer asked Microsoft's Scott Hanselman if the company was working on .NET targeting WebAssembly "so that we can get delivered from the insanity of JavaScript." The answer was "yes," and that answer has been realized in the first public preview. "Blazor enables full stack Web development with the stability, consistency, and productivity of .NET," Microsoft's Daniel Roth announced in a post yesterday. "While this release is alpha quality and should not be used in production, the code for this release was written from the ground up with an eye towards building a production quality Web UI framework."


Optimizing web apps with the Sonarwhal linter

The heart of Sonarwhal is its rule set: the rules contain the tests it applies to your website, and you can turn them on and off or adjust their severity in its configuration files. The default configuration offers a selection of rules, so you can choose to test HTTP options, as well as HTML, site security, and support for PWA functions. Many of the tests require a deep knowledge of web server capabilities as well as HTML and JavaScript. However, once you’ve tested a site, the report data can help tune content and server for the best, and most secure, performance. Results arrive in any of several formats. One option gives you the data in raw JSON, ideal for use in other applications. While JSON isn’t human-readable, other options show summaries, a list of specific code issues, or a table of error data. You can even drop result data into an Excel spreadsheet. The formatter model is extensible, so you can create your own formatters and offer them to other users.
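As a sketch of what consuming that JSON output might look like, here are a few lines of Python that summarize issues by rule. The field names below ("ruleId", "severity") are hypothetical, not Sonarwhal's actual report schema:

```python
import json
from collections import Counter

# Sample linter output in raw JSON form (invented for illustration).
raw = json.dumps([
    {"ruleId": "no-http-redirects", "severity": "error"},
    {"ruleId": "ssllabs", "severity": "error"},
    {"ruleId": "no-http-redirects", "severity": "warning"},
])

# Count how often each rule fired, to see where to focus tuning work.
issues = json.loads(raw)
by_rule = Counter(issue["ruleId"] for issue in issues)

print(by_rule.most_common(1))  # [('no-http-redirects', 2)]
```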



Quote for the day:


"Speak when you are angry, and you'll make the best speech you'll ever regret." -- Laurence Peter


Daily Tech Digest - April 23, 2018

Microsoft Boosts Anti-Phishing Skills 

Dubbed "Windows Defender Browser Protection" (WDBP), the free extension can be added to Chrome on Windows or macOS, and after a post-launch fix, Chrome OS as well. Like the defenses built into Edge, the add-on relies on Microsoft's SmartScreen technology that warns users of potentially malicious websites that may try to download malware to the machine or of sites linked in email messages that lead to known phishing URLs. Microsoft keeps a constantly-changing list of these likely bad destinations on its servers, that list generated in part from telemetry sent by SmartScreen users. At least that's what it appears WDBP does: Microsoft has not documented the extension's operation beyond some general information on its site and in the description on the Chrome Web Store. In the latter, Microsoft said: "If you click a malicious link in an email or navigate to a site designed to trick you into disclosing financial, personal or other sensitive information, or a website that hosts malware, Windows Defender Browser Protection will check it against a constantly updated list of malicious URLs known to Microsoft." That is SmartScreen.
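The core pattern, checking a destination against a continually updated list of known-bad URLs, can be sketched in a few lines. This is only an illustration with made-up hostnames; SmartScreen is a cloud service whose internals Microsoft has not documented:

```python
from urllib.parse import urlparse

# A stand-in for the server-side list of known-bad destinations,
# which in a real service would be updated continuously.
known_bad_hosts = {"phish.example.net", "malware.example.org"}

def is_flagged(url: str) -> bool:
    # Compare only the host, so every path on a bad site is caught.
    return urlparse(url).hostname in known_bad_hosts

print(is_flagged("https://phish.example.net/login"))  # True
print(is_flagged("https://example.com/"))             # False
```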


Cattle farms and ranches usually stretch over a large land area, making it difficult to monitor the whereabouts of grazing animals without human involvement. Using tracking collars, one can find the location of these animals in real time. Then, a data storage system can record this information in a database to ultimately form a baseline model of their movements during a given time period. Applying intelligent algorithms to these patterns helps us identify if the cattle’s movements are irregular, or if one or more animals are separated from the herd. This usually occurs if they are sick or injured. This solution can easily be implemented by small IoT trackers that communicate over an IoT network like Wi-SUN or other WANs. One could then have networking towers distributed across the fields to cover a large area. This information is then exposed to the farmer or rancher via a web portal or smartphone application, thus making it easy for them to consume it. Another area of IoT use in farming is the utilization of drones to improve crop health. Disease, and the ease with which it spreads amongst crops, is a real cause for concern as this directly impacts crop yield.
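The "separated from the herd" check described above can be sketched very simply: compute the herd's centroid from collar positions and flag any animal beyond a distance threshold. The coordinates and threshold below are invented for illustration:

```python
import math

# Latest reported collar positions (hypothetical grid coordinates).
positions = {
    "cow_1": (10.0, 10.0),
    "cow_2": (11.0, 10.5),
    "cow_3": (10.5, 11.0),
    "cow_4": (40.0, 42.0),  # wandered off
}

# Centroid of the herd.
cx = sum(x for x, _ in positions.values()) / len(positions)
cy = sum(y for _, y in positions.values()) / len(positions)

def separated(threshold=15.0):
    # Flag animals farther from the centroid than the threshold.
    return [name for name, (x, y) in positions.items()
            if math.hypot(x - cx, y - cy) > threshold]

print(separated())  # ['cow_4']
```

A production system would build the baseline from historical movement data rather than a fixed threshold, but the shape of the check is the same.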


'Tech Accord' Emphasizes Teamwork to Prevent Hacking Damage

The accord is designed to form a more cohesive defense among private companies, researchers, "civil society" and nongovernmental organizations against the range of threats. It also crucially includes a pledge to not assist governments in cyberattacks. "We will protect against tampering with and exploitation of technology products and services during their development, design, distribution and use," Smith writes in a blog post. "We will not help governments launch cyberattacks against innocent citizens and enterprises." Tension sparked between Microsoft and the U.S. government following the WannaCry ransomware outbreak in May 2017. The ransomware used a vulnerability in Microsoft's operating system to rapidly spread, causing millions of dollars in damages. North Korea has been accused by the U.S. and U.K. of developing WannaCry. The vulnerability was believed to have been one of the most productive ones used by U.S. National Security Agency. But a mysterious group calling itself the Shadow Brokers leaked the vulnerability in April 2017. 


Why human vulnerabilities are more dangerous to your business than software flaws

"Email remains the top attack vector...Attackers are adept at exploiting our natural curiosity, desire to be helpful, love of a good bargain, and even our time constraints to persuade us to click," the report said. Some 50% of all clicks on malicious emails occurred within an hour of it showing up in the victim's inbox. And 30% happened within 10 minutes of receiving the email. Hackers, either working on their own, with a group, or with a state-sponsored entity, attempted to take advantage of human trust in most cases. Nearly 55% of social media attacks that impersonated customer-support accounts were aimed at financial institutions. "Many of these attacks rely on social engineering," the report noted. "Others simply take advantage of inclinations for immediate gratification, improved status, or even the reward of 'getting something for nothing.'" The report continued: "But as the old adage goes, there is no such thing as a free lunch. The hidden costs of a bargain in social media channels can often be credential loss to phishing, coin mining through browser hijacking, and malware infections."



Analyst balks at blockchain distributed ledger in networking


Mike Fratto, an analyst at GlobalData in Sterling, Va., said he sees no purpose for the blockchain distributed ledger in networking. To Fratto, the technology that has attracted lots of industry attention is little more than a "relatively slow" database scattered across a network of computers. As a foundation for network management, blockchain "would be wildly inefficient," Fratto said in an interview. Also, there are much better technologies already in place for grappling with networks. "Fundamentally, blockchain doesn't solve the problems in network management that need to be solved," he said. In general, blockchain is a ledger used to store transactional information across a network of computers. The distributed nature of the technology makes it highly secure, because any change to a transaction that isn't validated by the whole system is immediately rejected.


Engineering Culture Revived: The Key to Digital Transformation


Superbet has established a market-leading position in Central and Eastern Europe for Retail betting; meanwhile, over the last year we have invested heavily in the establishment of a ‘dot com’ team that will launch us globally online. Along the way, we have embedded many acquisitions and so we have quite a ‘melting pot’ of nationalities and practices, but the entrepreneurial flair runs core through all. So for instance, our Slovakian Payment System team operates completely distinctly from our UK Pricing / Trading products team, but both came to our business with an existing implementation-driven approach to market evolution: the capability to test and learn built in as core practice. As we evolve our teams we are taking care to establish the right ‘conditions’ for engineering culture from the start; so for instance, working to a business outcome, it is the team that decides HOW this will be achieved. The teams are also responsible for recruitment such that new team members are selected by the team.


The New Rules Of IT Business Alignment In The Digital Era

“Budgets are shifting and budgets are everything. Whoever’s got the budget has final say,” observes Matthew Mead, CTO of digital technology consulting firm SPR. Mead has observed this transition in his own work. “Traditionally, if you were selling a business system, you’d sit down with IT representatives and one business person and have a very technical conversation. Nowadays, it’s shifted completely. A lot of times we’ll find ourselves in a meeting where the business has much more representation in terms of numbers of people and IT has much less. I think IT has become more of an influencer and consultant. It used to rule the roost and make the call. Now there are many voices and IT is just one of them.” That makes vendors’ jobs easier in ways that ought to worry every CIO. “When we sold to IT, the information we went over was so much more detailed and rigorous. There were a lot of details that had to be disclosed. Now when we work with a business, the experience is a much larger focus and some details that used to be important are no longer important,” Mead says, adding that some of those no-longer-discussed details might include security and maintenance requirements.


Will enterprise IoT become BYOD on steroids?

Unlike BYOD, IoT tools are “headless,” typically tied to a line of business to drive top-line revenue or bottom-line cost-cutting objectives. This means the importance of monitoring and managing these new things, to ensure the best possible performance over computer networks, will eclipse that of conventional networked clients. With all the power and benefits of IoT, it will also present a host of new challenges to enterprise IT teams, exceeding the recent challenges those teams have had to deal with, such as interoperability, protocols and security. IoT management is further complicated by the fact that some IoT devices have limited hardware capabilities, restricted networking capabilities and don’t run operating systems that support conventional IT or mobile device management. What’s more, IoT management tasks may be split across different factions in IT or network operations. Without a single source of insight into the performance of IoT devices that can be used by all the different networking constituents, more finger pointing among IT staff is sure to result. Another difficult thing for network managers to get a grip on is the impact of IoT-networked devices on capacity planning.



Get Ready for Cloud Native, Service-Meshed Java Enterprise


Java EE, cloud native and service meshes — this doesn’t really sound like a good fit. Or does it? Is it possible to develop modern, cloud native Java Enterprise applications that fulfill concerns such as scalability, monitoring, tracing, or routing — without implementing everything ourselves? And if so, how? In an enterprise landscape of microservices there is the challenge of adding technical concerns, such as discovery, security, monitoring, tracing, routing, or failure handling, to multiple or all services in a consistent way. Software teams can potentially implement their individual services in different technologies, yet they need to comply with organizational standards. Adding a shared asset such as an API gateway tangles the services together and somehow defeats the purpose of a microservice architecture. Redundancy, however, should be avoided as well. Service meshes transparently enhance each microservice that is part of the mesh with consistent technical concerns. These enhancements are added in a technology-agnostic way, without affecting the application.


Innovative CIOs make shift to managing IT as a product

"It's about: How do I move fast, continually adopting capabilities for our organization, much like if we had a product in the market we're evolving based on customer feedback and needs?" Piddington says. Piddington brought these practices with him to MRE in 2014, instituting a culture around crisper, agile software delivery tied to data operations. Piddington soon discovered a hidden gem: IT had built a software tool that uses machine learning algorithms to assess the health of laptops, server farms and other critical machines MRE consultants use to generate revenue. MRE’s help desk technicians used this information to fix machines before they went down. Recognizing the potential to create a new revenue stream, Piddington commercialized the tool, seeding an early version with some services clients to see if it would work in environments supporting thousands of machines. Under Piddington's leadership, MRE fine-tuned the app to support network endpoint devices and virtual machines and boosted the algorithm’s accuracy from 85 percent to 98 percent, before taking it to market in early 2017. Several customers are using it, he says.



Quote for the day:


"I count him braver who overcomes his desires than him who overcomes his enemies." -- Aristotle


Daily Tech Digest - April 22, 2018

New Fraud Statistics Show Rising Volume of Identity Theft

The Cifas data indicated that online retail fraud rose 49 percent last year. According to the report, identity fraud “remains a predominantly internet-based offense, with 84 percent of identity fraud occurring through online application channels.” Account takeover (ATO) fraud is also on the rise, experiencing a 7 percent increase over 2016. A recent Javelin report found that ATO fraud tripled last year, causing more than $5 billion in losses. In addition, the average resolution time for ATO was 16 hours. New account fraud (NAF), meanwhile, rose 70 percent as cybercriminals leveraged personally identifiable information (PII) to create fake credit card and bank accounts. The Cifas report also noted that actors are increasingly targeting older age groups for ATO fraud using social engineering techniques. These often take the form of phishing emails or over-the-phone “security checks” that ask victims to provide personal information for “verification.” Once attackers have PII in hand, they’re able to either compromise existing accounts or create new ones that may lead to claims of credit fraud or identity theft.



'WordPress of Blockchain' Startup Seeks to Solve Enterprise Pain Points

The Federated Network Protocol is aware of the number of validators, and their health, at all times. This awareness allows Hadron to predict the point of failure on the network and prevent it by spinning up temporary validators that keep the network alive while participants are alerted to the imbalance and instructed to remedy it. In this way, Dukkipatty said, the blockchains that use Elemential (which has designed its middleware for Hyperledger Fabric, Corda, Tendermint and private instances of ethereum) can continue working even when a problem arises. Currently, Elemential is working with the National Stock Exchange of India on a know-your-customer (KYC) compliance scheme that's built on a private blockchain. The pilot includes ICICI Bank, IDFC Bank, Kotak Mahindra Bank, IndusInd Bank and RBL Bank, as well as HDFC Securities, a Mumbai-based brokerage. While the system allows nodes on the same networks to communicate with each other, Elemential's aspirations go further than that.


The truth about data

There are many things that impact the quality and veracity of data throughout its life cycle. Errors can be introduced in the collection process, as it is cleaned or moved across disparate systems. It may have been gathered for a different purpose than what it is now being used for. Or it can simply be too old. When United Airlines recently looked at the data it was using to predict seating demands, the company discovered it was actually data from forecasts that were decades old. This lack of veracity resulted in inaccurate pricing models that cost United Airlines $1 billion (£700 million) per annum in missed revenue. It is therefore both surprising and alarming to discover that while 79pc of executives agree that their organisations are basing their most critical systems and strategies on data, many have not invested in the capabilities to verify the truth within it. Without establishing the veracity of that data, businesses leave themselves vulnerable and open to a threat that is critically overlooked.


How DataOps Is Transforming Data Management Practices

Data should be a shared asset, but many companies struggle to treat it as such. Data transcends traditional organizational structures and lines of business, and managers find it difficult to reconcile its governance against traditional business structures. It is not uncommon for data management projects to digress into organizational turf battles. This lack of sharing can result in many different versions of reality, where managers compete to promote their own. When data users don’t trust the data or each other, it’s hard to unlock value. Emerging technology providers think that they’ve found a path forward for building trust through a discipline called Data Operations, or “DataOps.” TAMR’s Palmer has been a pioneer in the field of DataOps, which he describes as “the framework of tools and culture that allow data engineering organizations to deliver rapid, comprehensive and curated data to their users”. He continues, “DataOps enable users to help curate and correct data when they consume it by providing feedback from the point of consumption”.


The biggest challenges for true modernization in 2018

"It's a great opportunity to have the top cover from the administration and the funding, hopefully, to get this done," one executive said. "But I see another opportunity in my organization to change some things. I'm looking at a culture shift and a kind of mind shift on how we do business. I want to be more adaptable, have more agility and be able to focus on cyber and data, and the only way to do those activities effectively is to change the skill set in-house. We also need to have a new strategy for managing data because I'm looking at things like deep learning and artificial intelligence." Other participants said they, too, are taking advantage of the opportunity to consider dramatic changes. "Our agency had eight CIOs in 10 years — and a year and a half without a CIO," one executive said. "It was constant turmoil. Staffing, hiring, rewarding, contracts — everything was broken. So we decided to blow it all up and start over. And we tell everybody to steal from anybody who's done this already. Let's not reinvent it if you don't have to."


AI In Marketing: Where And When It Can Make A Difference

Today’s CMO is tasked with the challenge of understanding a far greater number of channels, platforms and technologies than ever before. Couple that with the never-ending flow of data coming from every device, method and channel and it’s a recipe for data-processing disaster. The right investment can determine whether a CMO lasts less or more than the average 18-month lifetime. Artificial intelligence offers fascinating possibilities for marketing. While it’s still in its infancy, the power is in the hands of marketers to push for answers to the hard questions. Marketers looking to invest in new technologies must know how and why they’re going to apply them and evaluate how they will solve specific pain points. By working with teams made up of traditional marketers, who focus on the practical applications or technical investment, and more technically savvy computer scientists, who will be responsible for building out and deploying new solutions, CMOs can make far more informed decisions.


Tapping Into Data Capital with AI and Machine Learning


The enterprise data being leveraged includes a complete history of all candidates selected and hired, their key attributes, how they were onboarded once hired, and their eventual performance in the organization. An analysis engine extracts the key features that contributed to candidates’ success and builds a recommendation engine that rates new applicants on their likelihood to thrive at the organization. Simple data analytics, right? Yes, except that the algorithms, rather than people, decide which factors matter and which do not. Furthermore, the system continually processes the ongoing results of those candidates, updating its recommendation rules over time. The system learns from actual experience, just as humans do, but far more rapidly and objectively. “Now, extend this capability to other high-value, high-frequency business processes,” Hollis writes. “Timing and pricing of supply chain purchasing. Negotiating discounts on large orders. Measuring the temperature of your customers to determine when a small issue might become a big one. Today’s AI-informed recommendations become tomorrow’s advanced automation.”
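The loop described above — extract features from hiring history, rate new applicants, then fold actual outcomes back into the model — can be sketched as a toy online logistic-regression recommender. Everything below, from the feature names to the training data, is an illustrative assumption rather than anything from the article:

```python
import math

class CandidateRecommender:
    """Toy online model: learns feature weights from hiring outcomes and
    rates new applicants by predicted likelihood to thrive. Feature names
    are hypothetical, not from the article."""

    def __init__(self, features, lr=0.1):
        self.features = features
        self.weights = {f: 0.0 for f in features}
        self.bias = 0.0
        self.lr = lr

    def score(self, candidate):
        # Likelihood to thrive, squashed into 0..1 with a sigmoid.
        z = self.bias + sum(self.weights[f] * candidate.get(f, 0.0)
                            for f in self.features)
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, candidate, thrived):
        # One gradient step of logistic regression: the algorithm, not a
        # person, decides how much each factor matters.
        err = (1.0 if thrived else 0.0) - self.score(candidate)
        for f in self.features:
            self.weights[f] += self.lr * err * candidate.get(f, 0.0)
        self.bias += self.lr * err

model = CandidateRecommender(["years_experience", "referral", "assessment"])

# Replay the hiring history (features plus eventual performance) repeatedly,
# just as the system would process ongoing results over time.
history = [
    ({"years_experience": 0.8, "referral": 1, "assessment": 0.9}, True),
    ({"years_experience": 0.2, "referral": 0, "assessment": 0.3}, False),
]
for _ in range(50):
    for candidate, outcome in history:
        model.update(candidate, outcome)

strong = {"years_experience": 0.7, "referral": 1, "assessment": 0.8}
weak = {"years_experience": 0.1, "referral": 0, "assessment": 0.2}
print(model.score(strong) > model.score(weak))  # stronger profile rates higher
```

Because `update` can be called on each new outcome as it arrives, the same sketch covers the article’s point about the system refining its rules continuously rather than being retrained from scratch.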


Confused about mobile platforms? You’re not alone. Here’s clarity.

The very thin thread of evidence for a dual boot into Windows is a reference in the same commit to an internal Google document called “go/vboot-windows.” The trouble is, Google offering Windows on Pixelbooks doesn’t make sense: Google hardware exists to support Google software and services. What makes a little more sense is Fuchsia OS as “Alt OS.” (More on Fuchsia below.) It’s also possible that Google wants to enable enterprises, schools and developers to more easily dual-boot whatever OS they want to tinker with, as a way to encourage such customers to try Chrome OS. A number of experimental alternative OS projects are under way in the Linux community. They include GalliumOS, which is based on Xubuntu and is designed specifically for Chrome OS devices; GalliumOS itself ships a script that lets users dual-boot Chrome OS and GalliumOS. So the answer to the question of whether Chromebooks will run Windows is: maybe, but probably not.


Moving your data analytics to the cloud isn’t so easy

Moving the data doesn’t magically solve your integration challenges. Systems of record may still remain on premises, so they need to be synced with the data now stored in the cloud in a timely manner to get up-to-date results. This means using a mix of old and new data-integration technologies and setting up processes that cover both data movement and structure transformation. Finally, the cloud-based analytics databases themselves are complex and difficult to configure. Some of that complexity comes from the security subsystems in the database; these are necessary, but they must be worked out in the context of the database and the data analytics. That security must also be consistent with the rest of the systems the analytics touch, both in the cloud and on premises, which can mean most of the other operational systems that feed analytics in real time. Although these cloud analytics challenges can all be overcome, it’s up to IT to understand that the level of effort may actually be an 8 out of 10, when it thought (or, more likely, was told) that it would be a 5 out of 10.
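One common way to keep cloud analytics in sync with an on-premises system of record is a watermark-based incremental pull: fetch only the rows modified since the last run, transform them, and upsert them into the cloud store. The sketch below is a minimal, hypothetical illustration of that pattern; the function names and data are assumptions, not anything from the article:

```python
def fetch_changed_rows(source, since):
    """Stand-in for querying the on-prem system of record for rows
    modified after the given watermark timestamp."""
    return [r for r in source if r["updated_at"] > since]

def sync(source, cloud_store, watermark):
    """Pull changed rows, apply a structure transformation, and upsert
    them into the cloud copy. Returns the new watermark to persist."""
    for row in fetch_changed_rows(source, watermark):
        # Structure transformation: drop fields that should stay on-prem.
        transformed = {k: v for k, v in row.items() if k != "internal_id"}
        cloud_store[row["id"]] = transformed      # upsert by primary key
        watermark = max(watermark, row["updated_at"])
    return watermark

# Illustrative on-prem system of record (timestamps are fake epoch ticks).
on_prem = [
    {"id": 1, "internal_id": "x", "amount": 10, "updated_at": 100},
    {"id": 2, "internal_id": "y", "amount": 20, "updated_at": 205},
]
cloud = {}
wm = sync(on_prem, cloud, watermark=150)  # only row 2 is newer than 150
print(sorted(cloud))                      # -> [2]
```

In practice the watermark would be persisted durably between runs and the upsert would go through a real warehouse loader, but the control flow — and the need to decide which fields cross the boundary — is the same.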


Overcoming hidden data risks when managing third parties

Third-party risk management is becoming increasingly top of mind for organizations as they attempt to protect private and confidential data, strengthen their security posture, and reduce their risk exposure as part of the overall health of the business. High-profile breaches, like the one suffered by Target in 2014 or more recently by Netflix in 2017, continue to bring to the forefront the risks third parties can introduce to an organization. As the cloud has gone mainstream, an entirely new set of external risks has been introduced to our environments. Most organizations today rely on several, if not dozens, of external SaaS applications to run their business, not to mention cloud-based infrastructure and platform offerings. Data ranging from employee vacation time to business documentation to confidential customer information now resides in the cloud, creating a new frontier of risk with which organizations must contend. For many, the ability to manage this new frontier has not kept pace with the adoption of new, cost-effective technologies to better enable operations.



Quote for the day:


"Program testing can be used to show the presence of bugs, but never to show their absence!" -- Edsger W. Dijkstra