Daily Tech Digest - July 31, 2019

The Power Of Purpose: The ROI Of Purpose

The impact of a purpose-driven initiative on the health of the brand is another key area to be measured. The most important ‘silver bullet’ question is whether a clear correlation can be drawn between purpose and sales. However, given the complexity of the purchase funnel, I believe that at the very least measuring ‘Purchase Intent’ (‘Does this initiative make you more or less likely to purchase this brand?’) is the closest proxy. ... Often, one of the biggest upsides of purpose-driven initiatives is the effect they have on the employees of the company in terms of morale and motivation - not to mention recruiting new talent, especially Millennials and Gen-Z, who are increasingly motivated by the opportunity to work for a company that creates meaningful social and environmental impact (leading to lower recruiting costs). While each company has its own metrics for measuring employee engagement, a common one worth tracking is the impact on turnover: does the initiative make employees more or less likely to stay with the company? Benevity’s research shows that employees are 57% more likely to stay with a company that offers volunteering and fundraising opportunities, leading to significant cost reductions.



Why CIOs should focus on trimming their internal email footprint


Reducing businesses’ reliance on email is just one part of a wider shift in the way companies need to operate going forward. Stanley Louw, UK and Ireland head of digital and innovation at Avanade, believes organisations need a strategy that is digital, not a digital strategy. “The way we have always provided IT for work is actually holding us back,” he said. “You have to apply the same principles of customer experience to employee experience. What is the experience employees need to do their job? CIOs have to start partnering with HR.” But in Louw’s experience, IT departments still approach desktop IT from a pure IT perspective, which leaves their approach to the desktop archaic, very much rooted in legacy approaches to desktop management. Industry momentum around customer experience has changed the way businesses look at their customers, he said, adding: “You also need to look internally and start by modernising platforms.”


Chief Integration Officer!

As organizations go for technology-leveraged strategic transformation, they expect technology to help them maximize business value as an organization. This is different from better decision-making, operational efficiency, or a specific new capability at a functional level. The whole value accrued to the organization must be more than the sum of its parts, and someone needs to drive that. That someone, for a select few organizations, is a dedicated Chief Digital Officer. But more than 95% of organizations do not have a CDO role, and most of them do not intend to create one. Yet they still need someone to put all the pieces together to create organizational value. That integration has to be done by someone who thoroughly understands technology and its direction as well as the business. In most organizations, the CIO is the best person to drive that role. The reason it has not happened more widely is not so much that top management doubts CIOs’ capability as that CIOs are reluctant to move on from the nuts and bolts, since doing so may mean giving up control over a big chunk of the IT infrastructure budget.


Why Proxies Are Important for Microservices


The dynamic nature of microservices applications presents challenges when implementing reverse proxies. Services can come and go as they are revisioned or scaled, and will have random IP addresses assigned. Keeping the set of available services synchronized with the configuration of the reverse proxy is essential to ensure error-free operation. One solution is to use a service registry (e.g. etcd) and have each service maintain its registration while it is running. The reverse proxy watches the service registry to keep its configuration up to date. Kubernetes handles all of this automatically: the Kube DNS process maintains the service registry with an address (A) and service (SRV) record for each service, and the Kube Proxy process routes and load-balances requests across all instances of the services. With all incoming request traffic for a microservices application typically passing through proxies, it is essential to monitor the performance and health of those proxies. Instana sensors include support for Envoy Monitoring, Nginx Monitoring, and Traefik Monitoring, with more proxy technologies coming.
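For illustration, here is a minimal sketch of the registration half of that pattern, assuming the Python etcd3 client (the key layout and service address are hypothetical): the service holds its registry entry under a short-lived lease and keeps refreshing it, so a crashed instance disappears from the registry, and from the proxy's configuration, automatically.

    # Sketch: a service registering itself in etcd under a lease (etcd3 package).
    # If the service dies, the lease expires and the entry vanishes, so a
    # watching reverse proxy stops routing traffic to the dead instance.
    import time
    import etcd3

    client = etcd3.client(host="etcd.internal", port=2379)  # hypothetical host

    lease = client.lease(ttl=10)  # the entry lives 10 seconds unless refreshed
    client.put("/services/cart/10.0.3.17:8080", "healthy", lease=lease)

    while True:          # main service loop
        lease.refresh()  # heartbeat keeps the registration alive
        time.sleep(5)

On the other side, the proxy (or a small controller that rewrites its configuration) can subscribe with client.watch_prefix("/services/cart/") and reload whenever instances appear or disappear.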


Browser OS could turn the browser into the new desktop


A potential challenge is that the browser becomes the desktop for the end user, and that's something folks have to get used to. But to Google's credit, with its partnerships with vendors like VMware and Citrix, the UX challenge becomes almost invisible. We'll see how enterprises continue to approach this opportunity, which is ultimately more secure. For certain use cases -- field services, for example -- if a browser OS-based device dies or gets broken or lost, no data is lost. A user can just go get a new Chromebook and sign back in where he or she left off. That's an unheard-of value proposition -- that begin-where-you-left-off concept is powerful. One other problem enterprises may face is around Microsoft legacy infrastructure -- particularly around endpoint management. Microsoft has moved away from that to help bridge the divide, and Windows 10 is doing well. We'll see a lot more migration happening this year as the Windows 7 sunset comes closer.


This new Android ransomware infects you through SMS messages


Depending on the infected device's language setting, the messages will be sent in one of 42 possible language versions, and the contact's name is also included in the message automatically. If the link is clicked and the malicious app is installed manually, it often displays material such as a sex simulator, but the real payload runs quietly in the background. The app contains hardcoded command-and-control (C2) settings, as well as Bitcoin wallet addresses, within its source code; the attackers also use Pastebin as a conduit for retrieving them dynamically. Once the propagation messages have been sent, Filecoder scans the infected device to find all storage files and encrypts the majority of them. Filecoder encrypts file types including text files and images but fails to include Android-specific files such as .apk or .dex. ESET believes that the encryption list is no more than a copy-and-paste job from WannaCry, a far more severe and prolific form of ransomware. A ransom note is then displayed, with demands ranging from approximately $98 to $188 in cryptocurrency. There is no evidence that files are actually deleted once the threatened deadline passes.


Google bolsters hybrid cloud proposition for enterprises through VMware partnership


Google Cloud CEO Thomas Kurian confirmed the news in a blog post, in which he described the move as “another significant step” in his firm’s drive to bolster the enterprise appeal of its public cloud platform.  In recent years, this has seen the Google Cloud team roll out a series of data security and functionality improvements to its platform, and move to introduce industry-specific services. Such moves have resulted in Google Cloud developing into an $8bn annual revenue run rate company that is growing “at significant pace”, as was confirmed by Google CEO Sundar Pichai during a conference call to discuss its parent company’s second-quarter results earlier this month. “Customers are choosing Google Cloud for a variety of reasons,” said Pichai on the call, transcribed by Seeking Alpha. “Reliability and uptime are critical.” He also made reference to the “flexibility” that organisations also need when moving to the cloud, so they can proceed in their “own way”. 


The Future of API Management

There will be a continued emphasis on the developer: recognizing that the developer is king and making the job easier for them. We understood this for public APIs, but we still need to improve internally -- for example, with a service catalog for internal developers that makes it easier for them to ramp up and benefit from existing APIs. On the architecture side, everything is driven by containers and container platforms; Kubernetes is leading to microservices architectures with new approaches to controlling traffic, such as sidecar proxies like Envoy and service meshes like Istio that manage applications within the container cluster. As these things come up, there will be a proliferation of control points with multiple form factors. We embrace Envoy as the new gateway. Right now we live in a mixed world, and it's important to consider how service mesh and API management will overlap. API management is about the relationship between the provider of a service and the multiple consumers of that service. The greater the scale, the more important a formal API management platform becomes.


Zero days explained: how unpatched flaws become powerful weapons
A zero day is a security flaw that has not yet been patched by the vendor and can be exploited and turned into a powerful weapon. Governments discover, purchase, and use zero days for military, intelligence and law enforcement purposes — a controversial practice, as it leaves society defenseless against other attackers who discover the same vulnerability. Zero days command high prices on the black market, but bug bounties aim to encourage discovery and reporting of security flaws to the vendor. The patching crisis means zero days are becoming less important, and so-called 0ld-days become almost as effective. ... Not all zero days are complicated or expensive, however. The popular Zoom videoconferencing software had a nasty zero day that "allows any website to forcibly join a user to a Zoom call, with their video camera activated, without the user's permission," according to the security researcher's write-up. "On top of this, this vulnerability would have allowed any web page to DoS (Denial of Service) a Mac by repeatedly joining a user to an invalid call." The Zoom for Mac client also installs a web server on your laptop that can reinstall the Zoom client without your knowledge if it's ever been installed before.


Kubernetes security risks grow with adoption
One thing that is widely agreed upon by the security pros: as Kubernetes adoption and deployment grow, so will the security risks. There have been multiple recent events in the cloud and mobile dev spaces where these environments were compromised by attackers, involving everything from disruption and crypto mining to ransomware and data theft. Of course, these types of deployments are just as susceptible to exploits and attacks from attackers and insiders as traditional environments. Thus, it is all the more important to ensure your large-scale Kubernetes environment has the right deployment architecture and that you use security best practices for all these deployments. As Kubernetes is more widely adopted, it becomes a prime target for threat actors. “The rapid rise in adoption of Kubernetes is likely to uncover gaps that previously went unnoticed on the one hand, and on the other hand gain more attention from bad actors due to a higher profile,” says Amir Jerbi, CTO at Aqua Security.



Quote for the day:


"Leaders begin with a different question than others. Replacing who can I blame with how am I responsible?" -- Orrin Woodward


Daily Tech Digest - July 30, 2019

What to Look Out For When Selecting a DRaaS Provider

Before exploring DRaaS, your organization should have a business impact analysis. In performing a current business impact analysis, you will be able to posit what would happen in the event of a disaster or disruption of business operations. ... When picking which DRaaS provider is right for you, use this information to determine if providers can accommodate your needs. After figuring out what your disaster recovery requirements are exactly, you can ask questions of providers in order to ascertain if they can support your needs. In the event that you were to experience data loss or corruption, learn the procedures of the providers in that situation by asking questions such as: How many copies of your backups are available? Where are those backups located? Is the provider able to recreate an image of your data at a specific, previous point from available backups? In calendar terms, how far back are backups accessible? What is the provider’s protocol when you perform a failover to DRaaS and are ready to go back to your standard environment afterward?


Google researchers disclose vulnerabilities for 'interactionless' iOS attacks

According to the researcher, four of the six security bugs can lead to the execution of malicious code on a remote iOS device, with no user interaction needed. All an attacker needs to do is send a malformed message to a victim's phone, and the malicious code will execute once the user opens and views the received item. The four bugs are CVE-2019-8641 (details kept private), CVE-2019-8647, CVE-2019-8660, and CVE-2019-8662. The linked bug reports contain technical details about each bug, as well as proof-of-concept code that can be used to craft exploits. The fifth and sixth bugs, CVE-2019-8624 and CVE-2019-8646, can allow an attacker to leak data from a device's memory and read files off a remote device -- also with no user interaction. While it is always a good idea to install security updates as soon as they become available, the availability of proof-of-concept code means users should install the iOS 12.4 release without further delay.



Top 5 financial services processes that are ripe for automation


Barely a day goes by without the launch of a new report extolling the potential benefits of artificial intelligence (AI) and automation in the financial services industry. These reports often refer to the potential for cost reduction, increased operational efficiency, improved customer experience and, ultimately, bottom-line growth. Indeed, analysts predict that AI will deliver a 22 percent reduction in operating costs (a saving of more than $1trn) across the global financial services industry by 2030 as business leaders look to transform both front and back-office functions. Demand for AI is coming from both ends of the market: established banks are recognising the need to respond to huge sector-wide disruption and to develop more agile operations in order to compete, while smaller fintech firms are looking to AI and automation as a way to scale quickly while keeping costs down. The scale of the opportunity is so vast that it can sometimes be a challenge for banks and insurance firms to know where to start or how to identify the process automations that will deliver most value.


Avoid chaos with an IT crisis management playbook


The second significant component of an IT crisis management playbook is a breakdown of common or recurring issues and their suggested fixes. Append the top resolution suggestions from the application vendors as well. Don't expect to create an exhaustive list, but describe coverage for five to 10 of the business's most critical applications. Create a comprehensive index for both vendors and IT operations staff to see quickly if they need to escalate an issue -- and to whom -- with internal contact information attached. A common question about crisis management playbooks is the recommended format: paper or digital? If the modern paperless office is any indicator, create both. Paper binders require effort to update and store, but they also work without power -- something that's not a guarantee with a digital version.


Hackers target Telegram accounts through voicemail backdoor


According to the testimony of one of the arrested suspects, Walter Delgatti Neto, there’s another, potentially more vulnerable, way to get those verification messages – via voicemail. Accessing voicemail boxes turns out to be easier than it should be. Some people forget to set four-digit codes, and those that don’t can potentially be undone by crooks cycling through the 10,000 possibilities. Many voicemail systems fight back by checking that the number making an access call belongs to the subscriber, but these numbers can easily be spoofed if the attacker knows the correct number. If an attacker can access voicemail, they can access verification messages, such as Telegram’s, which are sent to voicemail if the hacker’s target is on a call or doesn’t answer three times in a row. Apparently, news of the weakness has spread on forums, leading to attacks on other high-value targets, including Puerto Rico Governor Ricardo Rosselló, whose position became untenable after his Telegram chats were recently leaked.


Strategy For and With AI

Our research strongly suggests that in a machine learning era, enterprise strategy is defined by the key performance indicators (KPIs) leaders choose to optimize. (See “About the Analysis.”) These KPIs can be customer centric or cost driven, process specific or investor oriented. These are the measures organizations use to create value, accountability, and competitive advantage. Bluntly: Leadership teams that can’t clearly identify and justify their strategic KPI portfolios have no strategy. In data-rich, digitally instrumented, and algorithmically informed markets, AI plays a critical role in determining what KPIs are measured, how they are measured, and how best to optimize them. Optimizing carefully selected KPIs becomes AI’s strategic purpose. Understanding the business value of optimization is key to aligning and integrating strategies for and with AI and machine learning. KPIs create accountability for optimizing strategic aspirations. Strategic KPIs are what smart machines learn to optimize. 


The Case For Transforming Banking (Even When Profits Are Strong)


Many financial institutions are saying the right things more than doing what is needed. Often, what is being done is in the context of banking from the past, as opposed to being recreated from the bottom up as you would if you were building a digital banking organization from scratch. And many of these initiatives are still moving at a snail’s pace. Organizations are building digital account opening, loan application and new customer onboarding processes, but the majority of these processes still require the consumer to come into the branch, or have far too many steps, similar to the paper-based processes of the past. And, while almost all organizations know the benefits of expanded data, advanced analytics and AI, very few have used these tools to personalize experiences or proactively offer solutions in real time. As stated in the BCG report, banking organizations must look at digital transformation in a holistic manner rather than as fragmented components that are not seamlessly integrated. More importantly, the direction for this transformation must come from the organization’s senior leadership and be supported by a culture that encourages innovation, digital customer experiences and aggressive market positioning.


Cyber security leadership in the age of fast and continuous delivery


Addressing the need for agile methods while sustaining adequate cyber security presents certain challenges for the CISO navigating a transforming business landscape. Here are the top six key triggers and challenges organizations are grappling with today. ... Cyber security usually has predefined contact points within a team's detailed planning and work schedule. These typically occur during initial software architecture definition and validation, with a couple of checkpoints ending with late testing and acceptance of the solution. Today, modern application security replaces these predefined interactions in the software lifecycle with more frequent interactions that increase dialogue, collaboration and efficiency. How do organizations re-organize cyber security to support this interaction, whether through staffing, automation or clever methodological workarounds? ... It's not rare today to see cyber departments hiring software developers who possess a strong understanding of modern dynamics and training them in cyber security.


Capital One’s breach was inevitable, because we did nothing after Equifax

This time it’s the financial giant and credit card issuer Capital One, which revealed on Monday a credit file breach affecting 100 million Americans and 6 million Canadians. Consumers and small businesses affected are those who obtained one of the company’s credit cards dating back to 2005. That includes names, addresses, phone numbers, dates of birth, self-reported income and more credit card application data — including over 140,000 Social Security numbers in the U.S., and more than a million in Canada. The FBI already has a suspect in custody. Seattle resident and software developer Paige A. Thompson, 33, was arrested and detained pending trial. She’s been accused of stealing data by breaching a web application firewall, which was supposed to protect it. Sound familiar? It should. Just last week, credit rating giant Equifax settled for more than $575 million over a data breach it had — and hid from the public for several months — two years prior. Why should we be surprised? Equifax faced zero fallout until its eventual fine. All talk, much bluster, but otherwise little action.


Is The Future Of Artificial Intelligence Tied To The Future Of Blockchain?

There is no doubt that blockchain is a disruptive technology that will give nations and all their components the foundation for a decentralized future. While blockchain is a disruptive technology, the way it is being used and applied has enormous energy and environmental impacts. The reason behind this is the process that is at the core of blockchain systems. The security of blockchain technology comes from its encryption, and the consensus mechanism of blockchain necessitates that all users require permission to write on the chain. Each of these requirements individually and collectively involves the intricate use of algorithms and enormous amounts of computing power. As the computing power needed to keep the current applications of blockchain running is not sustainable, it is one of the critical challenges facing the future of blockchain. It is not only blockchain and artificial intelligence, but all existing and emerging technologies, that are accelerating global computing power consumption. As a result, there is a visible need for increased computing power.



Quote for the day:


"Leaders stuck in old cow paths are destined to repeat the same mistakes. Change leaders recognize the need to avoid old paths, old ideas and old plans." -- Reed Markham


Daily Tech Digest - July 29, 2019

Back to basics: the top five technical issues to address in industrial cyber security

While industrial facilities are facing more cyber security challenges than they used to, the good news is that awareness around these challenges is increasing. That said, there’s still a marked difference between how well cyber security is understood in the consumer and corporate IT worlds, and how well it’s understood in industrial environments driven by OT. In a sense that’s not surprising. After all, most well-publicised attacks have been in consumer and corporate IT. But with attacks on critical industrial environments now becoming more frequent, people are starting to wake up to the operational, financial, reputational and even human and environmental damage they can inflict. Awareness is one thing. But the fundamentals of cyber security are still not being practised regularly. What are those fundamentals? In our cyber security work with organisations operating critical infrastructures around the world in sectors including power, oil and gas, water management, manufacturing and maritime, we’ve identified the top five technical issues that need addressing.



Deep learning is about to get easier - and more widespread


Deep learning algorithms often require millions of training examples to perform their tasks accurately. But many companies and organizations don’t have access to such large caches of annotated data to train their models (getting millions of pictures of cats is hard enough; how do you get millions of properly annotated customer profiles — or, considering an application from the health care realm, millions of annotated heart failure events?). On top of that, in many domains, data is fragmented and scattered, requiring tremendous efforts and funding to consolidate and clean for AI training. In other fields, data is subject to privacy laws and other regulations, which may put it out of reach of AI engineers. This is why AI researchers have been under pressure over the last few years to find workarounds for the enormous data requirements of deep learning. And it’s why there’s been a lot of interest in recent months as several promising solutions have emerged — two that would require less training data, and one that would allow organizations to create their own training examples.
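One widely used workaround, shown here purely as an illustration rather than as one of the specific emerging solutions the article alludes to, is transfer learning: start from a model pretrained on a large generic dataset, freeze its feature extractor, and retrain only a small head on the modest labeled data you do have. A minimal PyTorch/torchvision sketch:

    # Transfer-learning sketch: reuse an ImageNet-pretrained network so a new
    # task needs thousands rather than millions of labeled examples.
    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(pretrained=True)
    for param in model.parameters():
        param.requires_grad = False            # freeze the pretrained layers

    model.fc = nn.Linear(model.fc.in_features, 2)  # new head for a 2-class task

    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One illustrative training step on a dummy batch (stand-in for real data).
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 2, (8,))
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

Because only the small final layer is trained, far fewer labeled examples are needed to reach useful accuracy on the new task.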



Universities use data analytics to tackle student mental health


“What we are finding is that students with mental health conditions are far more likely to generate an alert than their peers,” says Ed Foster, the university’s student engagement manager. The alerts, which in its first year have been triggered for 3% to 4% of students, are sent to personal tutors, who can then contact the student. They also have access to a detailed dashboard on students’ engagement. “It helps them to prepare for that first discussion with a student, giving them some pointers,” says Foster. Nottingham Trent also provides students with access to their own dashboard, allowing them to see how they compare to averages for their peer group. Foster says that some students have found themselves motivated by the realisation they were falling behind. But some students at University of Leeds oppose such dashboards, according to Professor Neil Morris, dean of digital education.


Why Invest In Cloud-Based Machine Learning For Cybersecurity?

Offloading compute-intensive ML workloads to the cloud frees up precious local resources. Systems that do ML on the same local sensor that gathers the data itself must sacrifice performance and fidelity. By doing all the work on a local box, these tools spread themselves too thin and put a low cap on the value they can provide. Networks today move many gigabytes of data per second, and ML is only as accurate and valuable as the data you feed it; this means that the ideal scenario is one in which ML models can take advantage of all that data flying across the network. That simply isn't possible for products that restrict their ML processing to the compute power of an appliance in the data center. Moving ML to the cloud is the only way to take advantage of the sheer volume of data produced by today's enterprise networks. More often than not, modern enterprises are distributed across more than a single campus. Cloud-based ML architectures allow you to take full and cost-effective advantage of the intelligence, detection, and automation provided by ML-based technologies across all on-premises and cloud environments through a single deployment instead of getting limited insights that only apply to individual network regions where a box is installed.


Robotic Process Automation (RPA): 6 things people get wrong

“As companies look to digitally transform themselves, they are looking to streamline and modernize processes,” says John Thielens, CTO at Cleo. “While RPA perhaps can be viewed as a form of streamlining, it streamlines processes in place, but by itself does not necessarily improve them.” Thielens notes that this misunderstanding can occur in organizations that are looking for process improvements as part of a broader digital transformation; they might see RPA as a solution to process woes when it’s better looked at as a tool for achieving new efficiencies and productivity gains with well-established processes. There’s a related mistake people make with RPA: Automating a process you don’t fully understand. Eggplant COO Antony Edwards recently told us that this is a common pitfall: “Most people don’t have clearly defined processes, so they start automating, and either automate the wrong thing or get lost in trying to reverse-engineer the process.”


Govt panel says no need to mirror personal data in India

The government panel has, however, agreed with the Srikrishna panel on health data, where it had said personal data relating to health should be permitted to be transferred abroad for reasons of prompt action or emergency. The relaxation by a government panel with regard to storage and processing of personal data comes after the RBI last month relaxed its April 2018 circular, which required that all payment data generated in India be stored within the country. This came as some relief for major international firms such as American Express, Visa, Mastercard, Amazon, Paypal, Western Union, etc. In its amended circular, the RBI has allowed such international firms to store data abroad in cases where the transaction originates in the country but gets completed overseas, with a proviso that a mirror copy of such transactions be stored in India. However, for end-to-end domestic transactions, all storage still needs to be done within the country. Earlier, the RBI had not made a clear-cut distinction between transactions which are completed within the country and transactions which originate here and get completed overseas.


Why AI is entering its golden age

AI is working everywhere. To take one framework, think about the product lifecycle: You have to figure out what products or services to create, figure out how to price it, decide how to market and sell and distribute it so it can get to customers. After they’ve bought it, you have to figure out how to support them and sell them related products and services. If you think about this entire product lifecycle, AI is helping with every single one of those [stages]. For example, when it comes to creating products or services, we have this fantasy of people in a garage in Silicon Valley, inventing something from nothing. Of course, that will always happen. But we’ve also got companies that are mining Amazon and eBay data streams to figure out, what are people are buying? What’s an emerging category? If you think about Amazon’s private label businesses like Amazon Basics, product decisions are all data-driven. They can look to see what’s hot on the platform and make decisions like “oh, we have to make an HDMI cable, or we have to make a backpack.”


Managed security services will take center stage at Black Hat

A global cybersecurity skills shortage, that’s what. ESG research indicates that 53% of organizations say they have a problematic shortage of cybersecurity skills (note: I am an ESG employee). Furthermore, the recently published research report from ESG and the Information Systems Security Association (ISSA) indicates that 73% of organizations have been impacted by the cybersecurity skills shortage. Sixty-six percent of those impacted say the cybersecurity skills shortage has increased the workload on the infosec team, 47% say the cybersecurity skills shortage has led to the inability to learn or use cybersecurity technologies to their full potential, and 41% have had to hire and train junior employees rather than hiring more experienced staff. There’s one more implication around the cybersecurity skills shortage — nearly one-third (32%) of organizations have had to increase their use of professional/managed services because they remain understaffed and lacking advanced cybersecurity skills. Like I said, organizations can no longer tow the cybersecurity line alone — they need help.


DMARC's abysmal adoption explains why email spoofing is still a thing


Companies are not taking advantage of the protocol, despite the fact that DMARC has been around for years. This means that most companies are still vulnerable to business email compromise (BEC) attacks, phishing emails, and other types of email scams, as hackers can easily make their emails look authentic and pass their scams off as legitimate communications. The good news is that the DMARC adoption rate is better than in previous years. However, the bad news is that the needle is moving too slowly to make a difference. An FTC report from 2017 found that only 10% of 569 businesses with a significant online presence had deployed DMARC policies with their domains. A November 2018 Agari report found that half of the Fortune 500 companies were supporting DMARC, but that only 13% of companies had actually bothered to set up DMARC rejection policies -- meaning the DMARC protocol was installed, but wasn't actually used to stop spam, phishing, and scams from spoofed domains. A quarter later, in February 2019, Agari reported that the percentage of Fortune 500 companies actively using DMARC policies had gone up to 15%. A small increase, but still insufficient, since this still left hundreds of the world's biggest companies open to attacks.
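For context, deploying DMARC just means publishing a DNS TXT record at _dmarc.<domain>, for example "v=DMARC1; p=reject; rua=mailto:reports@example.com"; the p= tag is what separates monitoring-only deployments (p=none) from the rejection policies the Agari report counts (p=quarantine or p=reject). A small sketch that checks a domain's policy, assuming the dnspython package:

    # Look up a domain's DMARC enforcement policy via DNS (dnspython 2.x).
    from typing import Optional
    import dns.resolver

    def dmarc_policy(domain: str) -> Optional[str]:
        try:
            answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return None  # no DMARC record published at all
        for rdata in answers:
            record = b"".join(rdata.strings).decode()
            if record.lower().startswith("v=dmarc1"):
                for tag in record.split(";"):
                    name, _, value = tag.strip().partition("=")
                    if name.lower() == "p":  # the enforcement-policy tag
                        return value.strip().lower()
        return None

    # 'none' means DMARC is installed but not used to block spoofed mail;
    # only 'quarantine' or 'reject' actually stop it.
    print(dmarc_policy("example.com"))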


Rediscovering Lean

The trouble is, deep down, I knew I was not! My engineering metrics were great, better than ever. But my Pirate metrics (AAARRR!) sucked. Where was the real user impact I wanted? I grew weary. Each new initiative and with every one of those practices I adopted, I changed how I worked, which required conscious effort to discover, learn, implement and refine. Spending all my mental energy on improving delivery metrics was bad; continuously improving the wrong thing was worse. In the middle of this crisis of confidence, my head of engineering asked me to lead the development of a new mobile product. This was a big deal based on a real user need we could address. Along with it also came a large chance of failure. A year previously, I would have jumped on it. But with more self-awareness around my limitations delivering on such a large project, I wavered. With my approach not working, I needed to change. Was a killer process to solve my problems just around the corner, awaiting discovery?



Quote for the day:


"Leadership is developed daily, not in a day." -- John C. Maxwell


Daily Tech Digest - July 26, 2019

IoT’s role in expanding drone use

Iris Automation, not surprisingly, is in the business of creating workable collision avoidance systems for drones in an attempt to solve this issue. Variously called “detect and avoid” or “sense and avoid” technologies, these automated solutions are required for “beyond visual line of sight” (BVLOS) drone operations. There are multiple issues in play. As explained on Iris’ website, “Drone pilots are skilled aviators, but even they struggle to see and avoid obstacles and aircraft when operating drones at extended range [and] no pilot on board means low situational awareness. This risk is huge, and the potential conflicts can be extremely dangerous.” As “a software company with a hardware problem,” McKenzie said, Iris’ systems use artificial intelligence (AI), machine learning, computer vision, and IoT connectivity to identify and focus on the “small group of pixels that could be a risk.” Working together, those technologies are creating an “exponential curve” in detect-and-avoid technology improvements, she added. The result? Drones that “see better than a human pilot,” she claimed.


How artificial intelligence can tackle climate change

AI can also unlock new insights from the massive amounts of complex climate simulations generated by the field of climate modeling, which has come a long way since the first system was created at Princeton in the 1960s. Of the dozens of models that have since come into existence, all represent atmosphere, oceans, land, cryosphere, or ice. But, even with agreement on basic scientific assumptions, Claire Monteleoni, a computer science professor at the University of Colorado, Boulder and a co-founder of climate informatics, points out that while the models generally agree in the short term, differences emerge when it comes to long-term forecasts. “There’s a lot of uncertainty,” Monteleoni said. “They don't even agree on how precipitation will change in the future.” One project Monteleoni worked on uses machine learning algorithms to combine the predictions of the approximately 30 climate models used by the Intergovernmental Panel on Climate Change.
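As a toy illustration of combining many models' predictions (the numbers are invented, and this is not the project's actual algorithm), per-model weights can be learned from years where observations exist and then applied to the whole ensemble:

    # Toy ensemble combination: learn least-squares weights for 30 "models"
    # on 40 years of observations, then evaluate on the remaining 10 years.
    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_models = 50, 30
    truth = np.cumsum(rng.normal(0.02, 0.1, n_years))                  # observed series
    models = truth[:, None] + rng.normal(0, 0.3, (n_years, n_models))  # noisy models

    weights, *_ = np.linalg.lstsq(models[:40], truth[:40], rcond=None)
    combined = models @ weights

    print("plain average error: ", np.abs(models.mean(axis=1)[40:] - truth[40:]).mean())
    print("weighted combo error:", np.abs(combined[40:] - truth[40:]).mean())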


How the CIO role in healthcare has evolved at Northwell Health


It's a lot about how these new innovative technologies are going to take hold as time goes along and how they're going to change the model of how we deliver healthcare. One example might be bed throughput in our hospitals. Our hospitals are very busy, so how quickly we can get patients through them is very important to us so they're not sitting in the ER longer than they need to be. There have been bed management programs on the market for a lot of years. Many hospitals use them, big and small. And they operationally help you to move patients through your hospital faster. If we could take new technologies that are artificial intelligence-based with machine learning algorithms behind them and we can start to predict, in addition to the actual management of the beds, how likely is it they're going to get admitted, how likely is it they're going to go to the ICU, how likely is it that a patient is going to be discharged over the next few days.


Louisiana declares emergency over cyberattacks targeting schools

It’s the first time in Louisiana’s history that a cyberattack is being addressed like a disaster, but there is precedent for this approach. Colorado became the first state to treat a cyberattack like a major disaster in March 2018 after its Department of Transportation was compromised by the ransomware virus known as SamSam. That declaration, by then-Gov. John Hickenlooper, came after more than a week of the Office of Information Technology struggling to repair nearly 2,000 infected devices with a small staff working long hours and subsisting on junk food. Following Hickenlooper’s emergency declaration, oversight of the response to the cyberattack was given to the Colorado Office of Emergency Management, which adapted its natural-disaster playbook for an IT security situation, creating a unified command structure and bringing in the National Guard and support personnel from other states.


How to cautiously reap the benefits of DevOps as a service


Before DaaS adoption, carefully weigh its unique benefits and challenges. A DaaS provider offers access to more mature reporting tools out of the box that inform developers, as well as product and technical stakeholders, about ongoing project health and security. Additionally, cloud-based DevOps tools support collaboration between on-site and remote team members and facilitate data-driven processes, since staff can more easily work with the same data sets. DaaS also facilitates faster testing and deployment because it enables developers to scale compute and data storage on demand based on project requirements. But outsourcing to a DevOps-as-a-service provider isn't without its challenges. First, in-house development and operations teams still must manage the complexities of integration and workflow orchestration.


Transitioning GDPR preparations into successful operations

GDPR is also about value and trust in data, a central element of information governance. Information governance encompasses, among other things, information security or, at a digital level, cybersecurity. There are many organizations that were taken in with checklists and companies offering one-stop technological solutions, without taking the necessary steps to understand how personal data flows through the organization, as opposed to designing and implementing a framework that will fit with the culture and ways of working of your organization. Then there are those organizations that complained “it’s not fair” and placed it on the “too difficult to do” pile. On many occasions, senior stakeholders have told me that they could not see how GDPR affected them as they didn’t collect, store or process personal data – in all cases they had failed to grasp that employment data was personal data. Absorbing GDPR into business as usual requires a holistic approach to information governance.


A data breach forced this family to move home and change their names


As reported by the Hackney Gazette, a family in the area adopted a child and the details of who they were and where they lived were meant to be withheld from the birth parents.  However, during the adoption process in 2016, a solicitor appointed by Hackney Council mistakenly included an unredacted copy of the application form. The publication says that the exposed, sensitive data included the couple's names, addresses, phone numbers, dates of birth, and occupations. The scope of the breach was serious enough that the couple spoke to both the council and police, and ultimately decided that moving home and changing their names was the safest option for their adopted child.  Hackney Council acknowledged their mistake and paid out £106,000 ($131,000) in damages, which also helped facilitate the house move. The council apologized to the couple for the "human error." The UK's Information Commissioner's Office (ICO) was also notified. The data watchdog said that the breach had "potential to cause significant distress and detriment," but no further action was required beyond the council's compensation offer.


Why investment is needed in the cyber insurance market


While insurers are missing out on a considerable business market and will look to stop silent cyber, companies that use these commercial policies may also be putting themselves at risk of a complicated claims process that may ultimately be rejected. As such, businesses and insurance firms must work together to develop the cyber insurance market. Insurance policies can be a lifeline for companies when unexpected events or issues occur that lead to significant damage or loss to the company in some capacity or another. Standard insurance policies can cover several categories, including public liability and property. Yet when a cyber attack on a business damages property or stops business operations, it is unclear whether a company’s business insurance policy will cover the resulting claim. This lack of clarity has come to a head in the Mondelez vs Zurich legal battle currently taking place. Insurance firm Zurich is refusing to pay out on a $100m claim from food conglomerate Mondelez following a cyber attack.


What is Mesosphere DC/OS?

The Mesosphere Data Center Operating System (DC/OS) is a distributed OS that runs across a cluster. It's based on the open source kernel Mesos -- similar to the Linux kernel -- which abstracts IT hosting resources. Mesos uses a master/agent architecture, whereby a master daemon controls the agent daemons that run on nodes within the cluster. Because DC/OS is a distributed system, it lends itself to distributed applications. For example, an enterprise could use Mesosphere and its distributed architecture to run large and complex data processing software for a big data initiative, such as the Apache Cassandra database or the Apache Spark data analytics engine, across abstracted virtual and physical resources that act as a single entity. Mesosphere DC/OS enables greater scalability for those systems, since the OS spans more than one machine, and also simplifies node configuration across a cluster.


The security challenges of smart city IoT
Worse still, some legacy systems can be downright impossible to manage and update in a safe way. Older technology often contains no provision for live updates, and its vulnerabilities can be severe, according to the report. Physical access to some types of devices also remains a serious challenge. The report gives the example of wastewater treatment plants in remote locations in Australia, which were sabotaged by a contractor who accessed the SCADA systems directly. In addition to the risk of compromised control systems, the generalized insecurity of smart city IoT makes the vast amounts of data that it generates highly suspect. Improperly configured devices could collect more information than they’re supposed to, including personally identifiable information, which could violate privacy regulations. Also, the data collected is analyzed to glean useful information about such things as parking patterns, water flow and electricity use, and inaccurate or compromised information can badly undercut the value of smart city technology to a given user.



Quote for the day:


"Your greatest asset is your earning ability. Your greatest resource is your time." -- Brian Tracy


Daily Tech Digest - July 25, 2019

Beyond software development: the power of low code

The more efficiently the apps are developed and deployed, the better. Sometimes, businesses don’t need a Mercedes when a reliable Volkswagen will do the job just fine. The low code movement is indicative of how development is changing. Platforms including VisionX can be used by different types of users at various levels of expertise. Experienced developers can create apps faster using less time and resources, and without getting sidetracked iterating on problems that are not core to what the application is designed to do. On the other hand, less technical users can build apps in a controlled way via a visual design interface. The result is a repeatable process throughout the organisation to create sound applications faster. The easily replicable templates are at the core of the process, but along the way are powerful tools for greater customisation, advanced data modelling, automated logic, and flexible deployment so that the responsive applications work seamlessly across desktop, mobile, and other form factors.


Humanity at Work: Interview With Rich Sheridan, Author of Chief Joy Officer

In order to truly begin to understand what joy means in the workplace, we need to delve into the sort of things that are less visible, into questions such as: Why do we exist? What do we believe about ourselves? Who do we serve, and what would delight look like for them? Human energy lifts the spirit of your team, and purpose drives it forward. A shared purpose gives us that sense of why we work so hard every day on a particular goal, which in Menlo’s case is to delight those people we intend to serve. Energy and purpose are really at the root of what makes Menlo a joyful place to work — and then of course there are the other things we do as well that add joy to the room. There’s no question that laughter, as an example, is part of joy. There is a component of joy that is happiness, and while we may not be happy every minute of every day because the work we do can be challenging, I think we do need to carve out space and activities for having fun, being playful, cheerful and supportive.



Businesses and consumers see digital transformation differently

Most digital transformation efforts do not have visibility. If anything, consumers underestimate the number of businesses that are investing heavily in every customer experience outcome by at least 50 percent. Consumers also are not giving businesses credit for the level of investments they are making. Conversely, businesses are not listening to -- or understanding -- the needs of customers. Most digital transformation is internally focused: 68 percent of initiatives are business process centered, while only 28 percent are customer experience centered. A meager handful, four percent, consider their initiatives to be employee experience centered. It appears that a greater outward focus pays off for the business. A majority of customers, 62 percent, would spend more money if their digital experiences "feel effortless." The leading companies in the study "consistently recognize the need for change and they are prepared to take risks," the researchers report. "They have agile, customer-focused organizations that recognize digital transformation is a cultural change, not just a technological change."


Answer These 9 Questions to Determine if Your Data Is Safe

Do you incorporate "privacy and security by design" in your environment? - Privacy and security by design are methodologies based on proactively incorporating privacy and data protection from the very beginning. This approach follows seven principles for implementing growing processes within your IT and business environments. Advocating privacy and security early on in your design process for specific technologies, operations, architectures, and networks will ensure you are building a mature process throughout the design life cycle; Is sensitive data encrypted during transit and at rest? - Encryption keys are vital to the protection of transactions and stored data. Key management should be deployed at a level commensurate with the critical function that those keys serve. I strongly recommend encryption keys be updated on a regular basis and stored separately from the data. Essentially, data is always being pushed and pulled and protecting that information as it moves across boundaries should require strong encryption at rest and while in transit.
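To make the encrypted-at-rest advice concrete, here is a minimal sketch using Python's cryptography library; the file paths are hypothetical stand-ins for a real key management service or HSM, which is where production keys belong:

    # Encrypt data at rest, with the key stored separately from the data.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    with open("/secure/keystore/records.key", "wb") as f:  # hypothetical key store
        f.write(key)

    cipher = Fernet(key)
    token = cipher.encrypt(b"ssn=123-45-6789")             # sensitive payload
    with open("/data/record.enc", "wb") as f:              # ordinary data store
        f.write(token)

    # Reading the record back requires fetching the key from the key store.
    assert Fernet(key).decrypt(token) == b"ssn=123-45-6789"

The same library's MultiFernet class supports re-encrypting stored tokens under a fresh key, in line with the advice to update encryption keys on a regular basis.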


Predictive Analytics & Insights Discovery Were Made For Each Other


With the combined power of predictive analytics and prescriptive analytics (derived from insights discovery in big data), we can explore more insightful data-driven “what-if” decision scenarios. This type of “what-if analysis” can foresee the ripple effect of various alternative decisions that adjust different elements in our operations. When we use prescriptive analytics to turn the dials of those elements—either singly or in combination—we can see the implications across the entire landscape. The outputs of prescriptive analytics are particularly helpful in assessing risk, such as when making a strategic decision, or in deciding whether to make cuts or investments. Predictive analytics evaluate the risk of current conditions continuing on their current path, expressing the risk as a mathematical probability. Prescriptive analytics evaluate risk in new scenarios that unfold based upon different decisions, treatments, and options that we might choose to take. This is powerful insight when balancing priorities and considering tradeoffs.
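A toy sketch of that kind of what-if analysis (every number below is invented for illustration): fit a predictive model on history, then re-query it under alternative decision scenarios and compare the predicted outcomes.

    # Fit a simple demand model, then score alternative "what-if" scenarios.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Historical observations: [price, marketing_spend] -> units sold.
    X = np.array([[10, 1000], [12, 1500], [9, 800], [11, 2000], [10, 1200]])
    y = np.array([500, 480, 530, 560, 510])
    model = LinearRegression().fit(X, y)

    scenarios = {
        "status quo":      [10, 1200],
        "raise price":     [12, 1200],
        "boost marketing": [10, 1800],
    }
    for name, dials in scenarios.items():
        print(f"{name:>15}: ~{model.predict([dials])[0]:.0f} units predicted")

Turning the dials one at a time, or in combination, is exactly the ripple-effect exploration described above, just at miniature scale.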


Transformation Without Technology

When legacy companies mistakenly believe that adopting new technology is itself the central strategy for adapting to a digital world, they miss the important organizational and strategic changes that are legitimately essential to survival and prosperity in a dramatically changing environment. Focusing first on the nontechnological changes required for adapting to a digital world — those involving talent, leadership, culture, organization structure, and strategy — helps companies better understand the problems they face and explore other changes that may be necessary before locking into a technological solution. Yes, these same companies may also need to invest — often substantially — in new technology, but when they do, those investments will be focused quite differently than if they had dived headlong into technology as the starting point. In our book, we use the metaphor of the cyclone in The Wizard of Oz to describe the role of technology in digital transformation. We ask, “How much of the story of The Wizard of Oz is about the cyclone?” On the one hand, all of it is.


Kubernetes policy project takes enterprise IT by storm


"Treating compliance as code means adopting best practices from the software development process," Ryan wrote. "One of these is Don't Repeat Yourself. Decoupling policy from applications, and reusing policy definitions in multiple locations, is a good implementation of this rule." As Kubernetes environments grow to encompass Istio service mesh and Knative event-based orchestration in what Google calls the open cloud stack, the fact that OPA lends itself to Kubernetes policy enforcement but can expand to include those adjacent utilities boosts its appeal. "There's a lot of promise in how we might scale and better leverage OPA as the ecosystem grows," said Andy Domeier, senior director of technology operations at SPS Commerce, a Minneapolis-based communications network for supply chain and logistics businesses, which uses OPA in production.


Does Your Company Need That Chatbot?

In using chatbots, you’re relinquishing control of a wide range of customer data to AI. How will you keep it safe? What outer boundaries have you established to ensure it won’t be shared beyond your chat or phone call? What pieces of the chat will you keep, and which will you ditch in the name of data security? Companies like Bank of America have chosen to service customers on their own site or app, where they have complete control of the chatbot experience. Others, like Butterball, have decided to work with established platforms like Alexa, to field customer questions through Alexa’s “skill” platform. Obviously, questions about one’s personal finances need higher security than those about one’s Thanksgiving turkey recipe. Only you know the security requirements that will best fit your customers and industry. Just by the nature of AI taking time to learn what it needs to know, you need to understand that your chatbot won’t work perfectly straight out of the gate. As noted above, it may need to field thousands of inquiries to understand a customer’s intent or desire perfectly. Only you can answer the question: Is it really worth the effort?


Public cloud winners take all

As before, AWS, Microsoft, and Google make it into the Leaders quadrant. And as before, everyone else is largely a rounding error. For example, Alibaba gets credited as the top cloud in China, but Gartner also points out that “Alibaba Cloud’s financial losses are increasing and may prevent the company from continuing to invest in necessary expansions.” Not good. Meanwhile, Oracle keeps selling almost entirely to existing Oracle accounts (who presumably can’t escape), with little hope of expanding: “Oracle is unlikely to ever be viewed by the market as a general-purpose provider of integrated IaaS and PaaS offerings.” IBM gets much the same treatment. Which leaves us with the three leaders. Little has changed in Gartner’s assessment of AWS, Microsoft Azure, and Google Cloud. AWS, unsurprisingly, gets credited as “the most mature, enterprise-ready provider” and, as such, “enterprises make larger annual financial commitments to AWS” than to other cloud providers. Microsoft keeps using its enterprise heft to drag Azure into its hitherto on-premises customer base, while Google Cloud keeps getting noted for its innovative technology.


Informatica makes the case: Why data needs AI and vice versa


In a briefing a few weeks back, Informatica made a case for why data quality and data integration has outgrown the ability of humans. It starts with the torrents of data and the nature of the data. When you are tapping the social media or IoT firehose, you are often dealing with ingesting terabytes of data at a time. As multi-structured data, schema is far more complex. So far, this is not a case of ordinary CSV customer or product order files where, even if the schema is not consistent, it may be fairly straightforward to identify what's a name or what's a numerical field such as an order number, SKU, part number, phone number, tax ID, or social security number. Here, data prep tools emerged that used a modest level of machine learning to conduct pattern matching to identify the columns and how the columns of different data sets should be transposed or merged. Instead, the challenge is munging files where the data structure is far more cryptic and variable to the point where humans may not be able to parse it without some machine assist.
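A crude sketch of that kind of pattern matching (the regexes and the 80% threshold are simplified illustrations, not Informatica's actual logic):

    # Guess what a column contains by majority-vote regex matching.
    import re

    PATTERNS = {
        "ssn":   re.compile(r"^\d{3}-\d{2}-\d{4}$"),
        "phone": re.compile(r"^\+?1?[-. ]?\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$"),
        "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    }

    def guess_column_type(values):
        for label, pattern in PATTERNS.items():
            hits = sum(bool(pattern.match(v)) for v in values)
            if hits / max(len(values), 1) > 0.8:  # 80% of values must match
                return label
        return "unknown"

    print(guess_column_type(["123-45-6789", "987-65-4321"]))  # -> ssn
    print(guess_column_type(["bob@example.com", "a@b.co"]))   # -> email

Machine-learning-based data prep generalizes this idea by learning the patterns from examples instead of hand-coding them, which is what makes the more cryptic, variable data structures described above tractable at all.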



Quote for the day:


"What I've really learned over time is that optimism is a very, very important part of leadership." -- Bob Iger


Daily Tech Digest - July 24, 2019

Business Intelligence in Microservices: Improving Performance

One of the main benefits of the microservice design is its scalability. You can scale individual services without changing the whole system, saving resources and keeping the app less complex. One of the most famous cases proving this is Netflix: the company had to cope with a rapidly growing subscriber base, and the microservice design was a great solution for scaling it. Each microservice needs its own database; otherwise, you can't use all the benefits of the modularization pattern. This variety of databases leads to challenges in the reporting process, which we discuss below. Microservice design also speeds up app development and allows you to launch the product earlier. ... Generating reports required engagement with several different services, which caused performance issues. That's why the Freshcode team decided to optimize the app architecture by creating a separate reporting microservice. It received data from all the databases, stored it, and transformed it into custom reports.
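A minimal sketch of that reporting pattern, assuming one database per service and a generic 'events' read model (both hypothetical), might look like this in Python:

# Sketch of a dedicated reporting service: copy data out of each
# microservice's own database into a reporting store, so report queries
# never touch the operational services. Paths and table names are
# hypothetical.
import sqlite3

SOURCE_DBS = {"orders": "orders.db", "users": "users.db"}  # one DB per microservice

def sync_to_reporting_store(report_db="reporting.db"):
    report = sqlite3.connect(report_db)
    report.execute("CREATE TABLE IF NOT EXISTS facts (service TEXT, id TEXT, payload TEXT)")
    for service, path in SOURCE_DBS.items():
        src = sqlite3.connect(path)
        # Each service exposes a read model; here, a generic 'events' table.
        for row_id, payload in src.execute("SELECT id, payload FROM events"):
            report.execute("INSERT INTO facts VALUES (?, ?, ?)", (service, row_id, payload))
        src.close()
    report.commit()
    report.close()

In production, periodic full copies like this would typically give way to change data capture or event streaming, but the shape is the same: reads for reports never hit the operational services.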



Supervised Learning

Supervised Learning is one of the two major branches of machine learning. In a way, it is similar to how humans learn a new skill: someone else shows us what to do, and we are then able to learn by following their example. In the case of supervised learning algorithms, we usually need lots of examples, that is, lots of data providing the input to our algorithm and the expected output for each. The algorithm learns from this data and can then predict the output for new inputs it hasn't seen before. A surprising number of problems can be addressed with supervised learning. Many email systems use it to automatically classify emails as important or unimportant whenever a new message arrives in the inbox. More complex examples include image recognition systems, which can identify what an image contains purely from the input pixel values.
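The email example maps directly onto a small supervised pipeline. Here is a toy sketch with scikit-learn, using made-up training data:

# Toy version of the email example: learn from labeled messages, then
# predict labels for unseen ones. The training data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "quarterly report attached, please review before the board meeting",
    "your invoice is overdue, action required",
    "win a free cruise, click now",
    "hot singles in your area",
]
labels = ["important", "important", "unimportant", "unimportant"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(emails, labels)  # learn from input/output example pairs

print(model.predict(["meeting moved, review the attached report"]))  # likely ['important']

Real systems train on millions of messages, which is why, as the chatbot piece above notes, accuracy only arrives after the model has seen a great many examples.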


Ransomware: Why cities have become such a big target for cyberattacks


While some cities, such as Baltimore and Atlanta, have opted to spend millions on rebuilding systems rather than giving in to the demands of attackers, others have made the decision to pay, authorising their insurers to negotiate with the cybercriminals because it appears to be the simplest short-term option. It's possible that criminals aren't targeting cities on purpose, but rather searching for any internet-facing vulnerabilities they can find. However, it's also possible that groups have become aware that cities are easy pickings. Councils running small and medium-sized cities aren't flush with funds, meaning IT security doesn't get the investment it requires and security holes are left open. And, with a population totalling over 327 million people, there are thousands of small cities in the USA that make perfect targets for attackers. "These organisations are usually less secure than private institutions; they're not as cybersecurity-orientated. They're very prone to cyberattack -- and if something goes wrong, they can lose everything," says Marina Kidron, director of threat intelligence at Skybox Security.



Best practices for building an information governance program that will last

You need to make the case for change by creating the conditions that cause change. The best way to do this is through policies and procedures driven by laws, rules, regulations, standards, guidelines and best practices for managing the lifecycle of information. For example, I recently identified that a group within the agency had a process for storing information that resulted in multiple copies of PDFs being stored in multiple locations. I pointed out the IG issues with the process, but there was some resistance to change: “This is how we’ve done it for a long time, and there are good reasons for doing it this way.” Instead of arguing about process, I focused on changing the RIM policy to address how document drafts and extra copies are managed and disposed of. With that new policy in place, when I do assessments across the agency, I can point to the compliance policy as the rationale for change. Groups are far more receptive to change when compliance with clear policies is at stake.


Growing opportunities in the Internet of Things


The connectivity layer of the IoT technology stack is most tightly bound to mobile-network operators that offer standard cellular connectivity. A small number of well-financed start-ups have targeted this layer of the stack and have made progress in subsegments such as low-power wide-area connectivity. Connectivity technology occupies a still-growing market that’s strongly influenced by international standardization in this technology layer. In the third layer of the stack is cloud computing (which facilitates central processing and storage of data) and enablement platforms (which facilitate access to devices, data across devices, and connectivity standards). Complementary analytics and computational tools have emerged to interpret, visualize, and produce insights from device data. Together, these platforms have proliferated and developed over the past five years and now simplify device integration and application implementation—a favorable growth outlook for key players. The final and top layer, business applications, will continue to be highly fragmented, with many disparate solutions and established companies coexisting with significant start-up activity. 


Malware-based Attacks Dropped 20% Worldwide

Much of the malware decline has to do with the popularity of so-called fileless attacks and attackers using legitimate Windows and security tools to drill down deeper into their victims' networks. Some regions had very different stats, the study found: the US experienced the most dramatic drop in malware attacks - 17% - while Switzerland was hit with a 72% jump. Attackers are also abusing encrypted channels such as HTTPS and SSL-based VPN tunnels to camouflage their traffic and malicious code. SonicWall has seen some 1,100 encrypted attack attempts per day per customer, Conner says. Many organizations mistakenly assume encrypted traffic is legitimate traffic, he notes. The attackers are able to place malware in a file and "come through that Web channel and via that VPN," he explains. "They either go to the HTTPS site or right to the end user's desktop."


The push of the 5G IoT market towards edge computing
Technology has evolved over time from storing data on a central mainframe, then to personal computing, and later to cloud computing. Edge computing is the next logical step in this progression. This next evolution is only possible because devices are more powerful yet cheaper to produce. The expectation of an always-on internet connection means modern devices come loaded with additional computing power, and producing these capable devices is simply not as expensive as it once was. Consider the retail prices of small SD cards or USB flash drives, then reverse-calculate the factory prices. Contemporary cloud data is primarily hosted by, well, cloud servers – which means it is primarily hosted by a handful of multinational tech companies. For example, Amazon’s public cloud accounted for 47 percent of the market in 2017. However, the proliferation of capable devices throughout the modern home makes edge computing not only viable but practical. Instead of hosting personal data on a company’s leased cloud storage, why not keep it at home on the devices that produced it?


Bosque: Microsoft’s New Programming Language

Bosque is heavily inspired by TypeScript syntax and types, as well as the semantics of ML and Node/JavaScript. If you have a JavaScript background, Bosque should look familiar and feel pretty natural. In an attempt to improve upon the structured programming model of the 1970s that most people know and use in many languages, Bosque aspires to be a language for modern developers and shuns much of the complexity traditionally associated with programming languages. The team wondered how you would design a new programming language if you started from a blank slate with modern development needs in mind. Mark has always had an interest in semantic programming, and creating a compiler is a natural extension of that interest. Another aim of Bosque is to create a language that is easier for machines to understand, and reducing language boilerplate makes this simpler. It also aids the development of developer tooling such as IDEs and IDE plugins, an important area of business for Microsoft.


Zuckerberg responsible for Facebook privacy compliance after $5bn FTC fine


Facebook CEO Mark Zuckerberg will also be personally responsible for the company’s data protection practices, having to report to the FTC every three months to prove that user data is being safeguarded adequately. The order also imposes other significant privacy requirements on Facebook, including prohibiting it from using telephone numbers obtained to enable two-factor authentication for advertising, requiring it to provide clear and conspicuous notice of its use of facial recognition technology, and forcing it to establish and maintain a comprehensive data security programme. “The magnitude of the $5bn penalty and sweeping conduct relief are unprecedented in the history of the FTC,” said FTC chairman Joe Simons. “The relief is designed not only to punish future violations but, more importantly, to change Facebook’s entire privacy culture to decrease the likelihood of continued violations. ...”


Do you really know what’s happening with your servers?

As performance optimization is pushed by an ever more demanding user base, the need to use the right platform for the right job is becoming key. With the demand for more servers on more platforms increasing, the ability to spin these up (and down) in the cloud becomes an easier route than on-premises hosting. In fact, there has been a 15% increase in people using SQL Server in the cloud for at least part of their estate in the last year, with the use of hybrid estates becoming ever more common. For example, an organization may choose to retain sensitive data on on-premises servers, while data that is not classed as personally identifiable information (PII) is located in the cloud. With data spread across different server platforms in different locations in this way, maintaining an overview of estate health becomes a lot more difficult. As a result of all these changes, migrations are also becoming a headache. From ensuring servers are on the latest version to transferring data to cloud platforms, respondents to the survey revealed that migrating servers will be their biggest challenge this year.



Quote for the day:


"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith