Daily Tech Digest - May 22, 2019

The Net Promoter Score (NPS), which has long been used to measure the loyalty of firms’ customers, is under fire for becoming the false god of corporate America. In a searing article, the Wall Street Journal last week labeled NPS “a dubious metric” — one that is routinely cited by CEOs on earnings calls and that somehow, magically, never declines. “Much of Corporate America is obsessed with NPS,” declared the article, before going on to list many of the activities the measure is used to justify, from employee bonuses to executive compensation. NPS hasn’t been useless, though. We can thank it for underscoring the importance of customer satisfaction ever since it was introduced in 2003. But in 2019, executives should question its efficacy and seek something better and broader. There are at least three big reasons why. ... Since the introduction of NPS, customers’ expectations have soared and companies’ access to information about them has increased dramatically. Today, consumers expect next-day delivery of online purchases — with tracking and free returns.

Deep learning has existed since the 1970s, but a lack of computational resources meant only very basic neural networks could be built and trained, and the technology languished for many years. Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton changed this in 2012 when they demonstrated that it was possible to train deep neural networks efficiently on a large data set by using GPUs. Their deep learning system reduced the error rate on certain image classification tasks by almost 50%, a very significant improvement. Since then, deep learning has begun to make a meaningful impact in many areas of industry. Today, image recognition by machines trained via deep learning performs better in some scenarios than humans, from classifying animals to identifying indicators of cancer in blood and tumors in MRI scans. Deep learning applications have also extended to speech recognition, autonomous vehicles and more.

First, AI must be beneficial to humanity. It is essential to stress that AI must promote well-being, preserve dignity, and sustain our planet. The second principle is that AI must not infringe on privacy or undermine security. Third, AI must protect and enhance our autonomy and our ability to make decisions and choose between alternatives. AI must be our servant, not our master. The fourth principle concerns justice, or fairness. AI must promote prosperity and solidarity in the fight against inequality, discrimination, and unfairness. Innovation should be inclusive and promote diversity as well as tolerance. Finally, we cannot achieve all this unless we have AI systems that are understandable in terms of how they work (transparency) and explainable in terms of how and why they reach the conclusions they do (accountability). Other sets of principles have been formulated by the OECD, the IEEE and many others, but the HLEG principles represent a convergence of ethical thinking and, we believe, are currently the best show in town. The G20 meeting in Osaka this year will be a good place to start, followed perhaps by incorporation in, or an annexure to, the Universal Declaration of Human Rights.

Demand for systems integration services soaring in New Zealand

In 2018, New Zealand’s IT services market grew 1.9% to reach $3.43bn. Managed services continued to account for the largest share of total IT services revenue. Chayse Gorton, IDC’s ANZ market analyst for IT services, said that rather than build new applications from scratch, many Kiwi firms are tapping systems integration services to integrate software-as-a-service (SaaS) applications, as well as custom application development skills to modernise existing applications. Gorton added that as organisations shift from on-premise software deployments towards cloud-based SaaS models, systems integration services will become imperative. ... In addition, many New Zealand organisations have taken a lift-and-shift approach to cloud software deployments. Although lift-and-shift cloud migrations can help to lower on-premise infrastructure costs, the cost benefits of the cloud may not be fully realised without re-architecting an application, and performance issues may also crop up, according to IDC.

Real-Time Data Processing Using Redis Streams and Apache Spark Structured Streaming

Structured Streaming, a new capability introduced with Apache Spark 2.0, has gained a lot of traction in the industry and amongst the data engineering community. Built on top of the Spark SQL engine, the Structured Streaming APIs deliver an SQL-like interface for streaming data. Initially, Apache Spark processed Structured Streaming queries in micro-batches, with a latency of about 100 milliseconds. Last year, version 2.3 introduced low-latency (1 millisecond) "continuous processing", which is further fueling adoption of Structured Streaming. To work at the speed of Spark’s continuous processing, you need to augment it with a high-speed streaming database like Redis. This open source in-memory database is known for its high speed and sub-millisecond latency. Redis 5.0 recently introduced a new data structure called Redis Streams, which enables Redis to consume, hold and distribute streaming data between multiple producers and consumers.
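The Redis Streams structure described here behaves like an append-only log that multiple consumers can read from independently, each tracking its own position. Running the real thing requires a Redis server and a client library such as redis-py, so below is a self-contained toy model of those semantics; the MiniStream class, its method names (modeled loosely on the XADD/XREAD commands), and the sensor fields are invented for illustration and are not part of any Redis API:

```python
import itertools
import time

class MiniStream:
    """Toy in-memory model of a Redis Stream: an append-only log of
    (entry_id, field-map) pairs that many consumers read independently."""
    def __init__(self):
        self._entries = []               # list of (entry_id, fields)
        self._seq = itertools.count()    # tie-breaker within one millisecond

    def xadd(self, fields):
        # Redis entry IDs look like "<ms-timestamp>-<sequence>".
        entry_id = f"{int(time.time() * 1000)}-{next(self._seq)}"
        self._entries.append((entry_id, dict(fields)))
        return entry_id

    def xread(self, last_id=None):
        """Return every entry appended after last_id (None = from the start)."""
        if last_id is None:
            return list(self._entries)
        ids = [eid for eid, _ in self._entries]
        return self._entries[ids.index(last_id) + 1:]

# A producer appends; a consumer that already saw `first` gets only new data.
stream = MiniStream()
first = stream.xadd({"sensor": "t1", "temp": "21.5"})
stream.xadd({"sensor": "t1", "temp": "21.7"})
new_entries = stream.xread(last_id=first)
```

Because consumers keep their own last-seen ID rather than popping entries off a queue, the same stream can feed a Spark job and an archival job at the same time, which is the property the article is relying on.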

Majority of new technology investments in UAE will be on data

IoT investment in the UAE is expected to grow by 17% to $672.75 million, up from $574.89 million a year ago. Use cases span transportation, smart grids, airlines, police smart cameras, freight monitoring, production management, manufacturing operations, connected oil exploration, digital signage, and the smart home and wellness sectors. With the introduction of 5G cellular networks, many more devices are expected to come online. IDC saw a big uptake of blockchain initially in areas like banking and financial services and trade financing. In the last few years, Lalchandani said, adoption of blockchain by the public sector has gained traction, with use cases in the education, property and utility sectors. What we are seeing now is high-value, low-volume transactions, but in the next few years we will see blockchain technology embedded into everyday computing, with some sectors leveraging it as early adopters.

Speech and language: the crown jewel of AI with Dr. Xuedong Huang

At some point, let’s say computers can understand three hundred languages, can fluently communicate and converse. I have not run into a person who can speak three hundred languages. And not only can machines fluently communicate and converse, they can comprehend, understand, learn and reason, and can really finish all the PhD courses in all subjects. That knowledge acquisition and reasoning is beyond any individual’s capability. When that moment is here, you can think about how intelligent that AI is going to be. ... there are two levels of intelligence. The first level is really perceptive intelligence. That is the ability to see, to hear, to smell. Then the higher level is cognitive intelligence. That is the ability to reason, to learn and to acquire knowledge. Most of the AI breakthroughs we have today are at the perceptive level, such as speech recognition, speech synthesis, and computer vision.

Huawei Gets 90-Day Reprieve on Ban

If the ban does take place, consumers could feel the effects of the actions against Huawei. For example, Huawei, which is the world's second-largest maker of smartphones, will likely build its own version of Android based on the available open source code and without the influence of Google and its engineers, Bill Buchanan, a computer science professor at Edinburgh Napier University in Scotland, has argued on Medium. This, in turn, could produce a host of security issues for Huawei smartphone users because they will be using a possibly unsecured operating system without the benefits that come from Google's security expertise, says Priscilla Moriuchi, the director of strategic threat development at Recorded Future, a security vendor. "Based on the patterns of behaviors demonstrated by Huawei, we believe Google pulling Huawei's Android license will result in issues and delays for Huawei users, especially should Huawei use the open source version of Android," Moriuchi notes.

Enterprise IoT: Companies want solutions in these 4 areas

The IoT makes it possible to manage buildings and spaces more efficiently, with savings of 25% or more. Occupancy sensors can tell whether anyone is actually in a room, adjusting lighting and temperature to save money and conserve energy. Connected buildings can also help determine when meeting spaces are available, which can boost occupancy at large businesses and universities by 40% while cutting infrastructure and maintenance costs. Other sensors, meanwhile, can detect water and gas leaks and aid in predictive maintenance for HVAC systems. ... Asset trackers can instantly identify the location of all kinds of equipment, reducing the number of lost, stolen, and misplaced devices and machines and providing complete visibility into the location of your assets. Such trackers also save employees from wasting time hunting down the devices and machines they need. For example, PwC noted that during an average hospital shift, more than one-third of nurses spend at least an hour looking for equipment such as blood pressure monitors and insulin pumps.

Data governance, digital transformation driving ITAM strategies

As ITAM moves from being an administrative activity to a governance function, its governance framework and the insights provided by ITAM data enable business units to source technology-based solutions effectively without being categorized as ‘shadow IT’ or ‘IT bypass.’ As organizations focus on the value they derive from their investments rather than simply looking at costs, ITAM data must connect technology, business process and value. Detailed asset data is providing business unit stakeholders with a clearer understanding of IT services and the associated cost. Increased transparency is building trust between IT and the business. Industry data shows that the number of stakeholders involved in decision-making around technology investment has risen: according to a recent IDG study, the average buying committee now includes 21 influencers. Many of these decisions are being led from outside IT or even made without IT involvement.

Quote for the day:

"A leader is one who sees more than others see and who sees farther than others see and who sees before others see." -- LeRoy Eims

Daily Tech Digest - May 21, 2019

Top 10 Features to Look for in Automated Machine Learning

Following best practices when building machine learning models is a time-consuming yet important process. There is so much to do, from preparing the data, to selecting and training algorithms, to understanding how the algorithm is making decisions, all the way down to deploying models to production. I like to think of the machine learning design and maintenance process as being comprised of ten steps (see the diagram above). But if I want to save time, increase accuracy, and reduce risk, I don’t manually go through the entire machine learning process in order to build my machine learning models. Instead, I turn to automated machine learning, using clever software that knows how to automate the repetitive and mundane steps, freeing me up to do what humans are best at: communication, applying common sense, and being creative. And to get the most out of automated machine learning, I want it to automate each and every one of the 10 steps. So here’s my guide to what to look for in an automated machine learning system. ... Look for an automated machine learning platform that can automatically engineer new features from existing numeric, categorical, and text features. You will want a system that knows which algorithms benefit from extra feature engineering and which don’t, and only generates features that make sense given the data characteristics.
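One of the repetitive steps such platforms automate, picking algorithm settings, boils down to a search loop over candidate configurations. A minimal sketch of the idea in plain Python (the validation_error function is a stand-in invented for illustration; a real system would train and cross-validate a model at each point in the search space):

```python
import itertools

# Stand-in for model evaluation: pretend validation error depends on two
# hyperparameters. A real AutoML system would fit and cross-validate here.
def validation_error(depth, learning_rate):
    return abs(depth - 4) * 0.1 + abs(learning_rate - 0.1)

search_space = {
    "depth": [2, 4, 8],
    "learning_rate": [0.01, 0.1, 0.5],
}

# The automated step: exhaustively try every combination and keep the best,
# instead of hand-tuning one setting per run.
best_params, best_err = None, float("inf")
for combo in itertools.product(*search_space.values()):
    params = dict(zip(search_space.keys(), combo))
    err = validation_error(**params)
    if err < best_err:
        best_params, best_err = params, err
```

Real AutoML systems replace the exhaustive loop with smarter strategies (random or Bayesian search) and wrap the same pattern around the other steps too, from feature engineering to deployment, but the shape of the automation is the same.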

What is an enterprise wide agile transformation and why CIOs should lead it

Agile practices change the nature of how teams define their customers, align on implementation strategies, debate priorities and commit to getting work done. Agile teams with a history of consistent delivery and demonstrating a strong partnership with their customers can change the culture. Instead of top-down priorities and timelines, teams align on strategic goals and produce business outcomes with incremental deliveries. CEOs are looking for smarter, faster and more innovative organizations that can propel growth, enable winning customer experiences, compete with analytics and drive efficiencies with automation. ... They want more efficient and higher quality operations, smarter sales teams closing more strategic deals and financial groups reporting and forecasting in near real time. And CEOs don’t know how to get there. They are increasingly relying on their leaders and staff to pave the journey for them. CIOs who have excelled at delivering results and culture change with agile practices in IT have the opportunity to extend the practice, culture and mindset as an enterprise wide way of operating.

Why Enterprise Blockchain Projects Fail

Social coordination marks a key point of failure for enterprise blockchain projects.
For one, there is a general lack of vision and understanding that plagues many blockchain projects. Blockchain, like other technologies, does not live in a vacuum devoid of any significant linkage to organizational and societal norms, design, dysfunction and purpose. Add in years of pent-up inertia and entrenched behaviors present in organizations and markets, and just because something new can evoke positive change does not mean it will. Here, a clear organizational vision and a deep technical and strategic understanding of where blockchain is fit for purpose can go a long way. Unfortunately, many project leaders are hardly conversant in blockchain, let alone the array of other emerging technologies it must intersect with in order to extract maximum value and autonomy. Perhaps the biggest point of failure is the general lack of cyber hygiene present in many early blockchain projects. The second major point of failure, and perhaps the hardest to overcome, is blockchain’s social, organizational and market coordination issue.

Redis-Based Tomcat Session Management

Redis is an open-source in-memory data store; in fact, it is the most popular in-memory database currently available. Redisson is a Redis Java client that builds on Redis to make developing distributed Java applications easier and more efficient. Redisson offers distributed Java objects and services backed by Redis. Redisson's Tomcat Session Manager allows you to store Apache Tomcat sessions in Redis, letting you distribute requests across a cluster of Tomcat servers with non-sticky session management backed by Redis. Alternative options might serialize the whole session; with this Redis Tomcat manager, however, each session attribute is written into Redis during each invocation. Thanks to this, Redisson's Session Manager beats out other Redis-based managers in storage efficiency and optimized writes, making for highly efficient Tomcat session management.
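In practice, wiring the manager into Tomcat is a small configuration change. A hedged sketch of what that looks like, based on Redisson's documented Tomcat integration (the config file path and attribute values here are illustrative placeholders, not a drop-in configuration):

```xml
<!-- conf/context.xml: swap Tomcat's default session manager for Redisson's -->
<Context>
  <Manager className="org.redisson.tomcat.RedissonSessionManager"
           configPath="${catalina.base}/conf/redisson.yaml"
           readMode="REDIS"
           updateMode="DEFAULT"/>
</Context>
```

With updateMode="DEFAULT", attribute changes are written to Redis individually as they are set, which corresponds to the per-attribute write behavior described above; readMode="REDIS" reads attributes straight from Redis so any node in the cluster sees the same session state.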

Research indicates the only defense against killer AI is not developing it

If you’re thinking killer robots duking it out in our cities while civilians run screaming for shelter, you’re not wrong – but robots as a proxy for soldiers isn’t humanity’s biggest concern when it comes to AI warfare. This paper discusses what happens after we reach the point at which it becomes obvious humans are holding machines back in warfare. According to the researchers, the problem isn’t one we can frame as good and evil. Sure, it’s easy to say we shouldn’t allow robots to murder humans with autonomy, but that’s not how the decision-making process of the future is going to work. The researchers describe it as a slippery slope: if AI systems are effective, pressure to increase the level of assistance to the warfighter would be inevitable. Continued success would mean gradually pushing the human out of the loop, first to a supervisory role and then finally to the role of a “killswitch operator” monitoring an always-on lethal autonomous weapons system (LAWS).

RegTech solutions can mitigate risk while aiding regulatory compliance

While regulations continuously evolve, the costs of non-compliance are skyrocketing. Therefore, to adhere to stringent mandates and norms, banks and PSPs are turning to advanced technological capabilities for support. The result? Regulatory technology (RegTech) tools, solutions, and firms are gaining mainstream popularity among financial institutions looking to redefine and streamline compliance processes across jurisdictions, lines of business and client bases. RegTech digital solutions collect intelligence through data analytics, predictive modeling, and statistical tools. This functionality is particularly important when it comes to proactively addressing multiple regulations at once versus taking a one-at-a-time approach that may result in several remediation measures. It is no surprise that firms’ RegTech spending is expected to average 48% annual growth over the next five years, expanding from $10.6 billion in 2017 to $76.3 billion in 2022. RegTech drastically improves the efficiency of compliance-related processes, data aggregation, data analysis, and tailored need-based offerings.

The Best Reason for Your City to Ban Facial Recognition

We’re not prepared as a society to ensure that facial recognition will be used responsibly and without discriminatory effects. We’re not prepared as individuals for a world in which we can be automatically tracked and identified wherever we go without our knowledge or consent. Even if we were ready, the technology itself isn’t: Experts both inside and outside the technology industry acknowledge that the artificial intelligence underlying facial recognition systems still struggles with accuracy, particularly when it comes to identifying the faces of people of color — which is to say, the people who are most likely to be affected by it. In a test last year by the ACLU, Amazon’s facial recognition software falsely matched the faces of 28 members of the U.S. Congress to the mug shots of people who had been arrested. The mismatches disproportionately affected representatives of color. Perhaps most important, our governments and law enforcement agencies are not prepared to guard against abuses of the technology or the data it produces, to ensure it is kept confidential, or to constrain its use to the appropriate situations.

Building Digital-Ready Culture in Traditional Organizations

Recognizing the immense scalability of digital solutions, digital leaders typically focus on creating impact, assuming that profit will follow. At their best, these companies revolutionize how people and organizations interact, reinvent industries, and break the power of entrenched gatekeepers. The other three values support that mission. Speed helps companies stay ahead of competitors and keep up with rapidly changing customer desires. Openness encourages employees to challenge the status quo and work with anyone who can help them achieve their goals quickly. Autonomy gives people the freedom to do what’s right for the company and its customers without waiting for formal approval at every turn. Together, these values can foster an engaged, empowered workforce where employees feel a personal responsibility to constantly change the company — and often the world. The values of high-performing digital companies frame their essential practices: rapid experimentation, self-organization, data-driven decision-making, and an obsession with customers and results.

Why data governance matters – and who should own it?

CIOs say that all who own, manage and/or rely on data to make decisions should be involved in data governance. A financial services CIO said, “to use Gramm-Leach-Bliley Act (GLBA) terms, this includes data managers and regulation monitors. They must be at the table. In the end, this could include someone from just about every business area.” For many organizations, the legal department is a key stakeholder to align with to ensure the organization is meeting necessary governance requirements. Data can pose legal challenges: the longer you keep data, the more of it can be used in e-discovery. While the business may want to keep data forever, there is a risk in not defining and enforcing data retention as part of a data governance program. Data governance stakeholders, for this reason, often include leaders from operations, sales, marketing, HR, and accounting/finance. C-suite leaders need to play a role, and where they exist, information governance and records management functions need to be included.

Data Pipeline Automation: The Next Step Forward in DataOps

The emerging DataOps field borrows many concepts from the DevOps techniques used in general software engineering, including a focus on agility, leanness, and continuous delivery, Eckerson Group writes. The core difference is that it’s implemented in a data analytics environment that touches many data sources, data warehouses, and analytic methodologies. “As data and analytics pipelines become more complex and development teams grow in size,” Eckerson and Ereth write, “organizations need to apply standard processes to govern the flow of data from one step of the data lifecycle to the next – from data ingestion and transformation to analysis and reporting. The goal is to increase agility and cycle times, while reducing data defects, giving developers and business users greater confidence in data analytics output.” There are a handful of vendors delivering shrink-wrapped solutions in this area, and not (yet) many open source tools. While DataOps is growing in recognition and need, the tools that support automated data pipeline flows are relatively new, Eckerson Group writes.
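The step-to-step governance Eckerson and Ereth describe can be sketched as a pipeline runner that halts the moment a data-quality check fails, rather than letting defects flow into downstream reports. The stage functions, sample rows, and the rule they enforce below are all invented for illustration:

```python
# Minimal DataOps-style pipeline sketch: each stage is a function, and the
# runner stops with a clear error as soon as a quality check fails.
def ingest():
    # Stand-in for reading from a real source system.
    return [{"user": "a", "amount": 30}, {"user": "b", "amount": -5}]

def validate(rows):
    # Data-quality gate between ingestion and transformation.
    bad = [r for r in rows if r["amount"] < 0]
    if bad:
        raise ValueError(f"{len(bad)} row(s) failed amount >= 0 check")
    return rows

def transform(rows):
    # Would feed analysis and reporting in a real pipeline.
    return {r["user"]: r["amount"] for r in rows}

def run_pipeline(stages):
    data = None
    for stage in stages:
        data = stage(data) if data is not None else stage()
    return data

try:
    run_pipeline([ingest, validate, transform])
except ValueError as err:
    failure = str(err)   # the defect is caught at the validation step
```

Shrink-wrapped DataOps tools add scheduling, lineage tracking, and alerting on top, but the core contract is the same: every hop from one lifecycle step to the next passes through an explicit, testable gate.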

Quote for the day:

"Leadership is not a solo sport; if you lead alone, you are not leading." -- D.A. Blankinship

Daily Tech Digest - May 20, 2019

Extreme launches autonomous network strategy at Extreme Connect

ExtremeAI Security. The software gathers data from a variety of sources to detect errant network traffic and report anomalies to network operators. The security algorithms that analyze network, application and device data to identify malicious traffic run on Extreme's servers. ExtremeAI Security gathers traffic flow data from NetFlow-enabled switches and routers. The software also draws IoT device data from Extreme's Defender for IoT and application data from Extreme Analytics. The fourth source of information is third-party threat feeds that provide continuous updates on blacklisted URLs and malicious IP addresses. Defender for IoT identifies IoT devices and assists network managers in setting security policies for groups of connected hardware, which could include medical devices, surveillance cameras or point-of-sale systems. Extreme Analytics draws application telemetry from a sample of network traffic flow to monitor application performance and notify managers when it falls below a set baseline. Extreme includes both in its list of Elements products.

Agile Vs Kanban: What’s the Difference?

Agile is a beneficial method for projects where the final goal is not fixed; as the project progresses, development can adapt to the requirements of the product owner. Kanban is about reducing waste and removing activities that never add value to the team. ... The Kanban process centers on a board, called a "Kanban board," while Agile is a practice that promotes continuous iteration of development and testing throughout the SDLC. The Kanban board visualizes the workflow, making it easy to learn and understand. The goal of the Agile method is to satisfy the customer through continuous delivery of software; in the Kanban method, shorter cycle times deliver features faster. In the Agile method, breaking the entire project into smaller segments helps the scrum team focus on high-quality development, testing, and collaboration. Kanban needs far fewer organizational set-up changes to get started, whereas in Agile methodologies, sprint planning can consume the team for an entire day.
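The Kanban board at the heart of this comparison is essentially a set of columns with work-in-progress (WIP) limits: a card can only be pulled into a column that still has capacity, which is how Kanban caps cycle time and exposes bottlenecks. A minimal sketch (the column names and limits are illustrative):

```python
# Toy Kanban board: pulling a card into a full column is refused,
# forcing the team to finish in-flight work before starting more.
class KanbanBoard:
    def __init__(self, limits):
        self.limits = limits                          # column -> WIP limit
        self.columns = {name: [] for name in limits}  # column -> cards

    def add(self, column, card):
        if len(self.columns[column]) >= self.limits[column]:
            return False          # WIP limit reached: finish work first
        self.columns[column].append(card)
        return True

board = KanbanBoard({"todo": 5, "doing": 2, "done": 99})
board.add("doing", "feature-A")
board.add("doing", "bugfix-B")
blocked = board.add("doing", "feature-C")   # third card exceeds the limit
```

A refused pull is not an error but a signal: the "doing" column is the bottleneck, and the team swarms on finishing those two cards rather than starting a third.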

Google sees Gmail as key to its collaboration plans

Google faces strong competition as demand for team collaboration tools continues to soar; its rivals have already attracted significant numbers of users. Slack has 10 million daily active users, including 85,000 paid business customers, while Microsoft Teams, which like Hangouts Chat is available as part of a suite subscription, is used in 500,000 organizations. Facebook’s Workplace has more than 30,000 paid organizations and about 2 million users in total. It’s not clear how widely Hangouts Chat is actually used. The app is available as part of G Suite subscriptions, of which there are 5 million customers, but Google doesn’t break out stats for the messaging platform. Google’s offering appears to lag behind others in the market. “Based on our volume of conversations with clients, there isn’t much customer momentum with Hangouts Chat,” said Larry Cannell, a research director at Gartner. By integrating Hangouts Chat with Gmail, Google could spur greater adoption of the chat tool, said Angela Ashenden, a principal analyst at CCS Insight.

6 steps to avoiding an automation 'trap' by putting process first

Intelligent automation presents a powerful new lever with which to digitally transform an enterprise and fundamentally change how work gets done. By combining a wide range of techniques to enable the digitization, processing and evaluation of information, companies can improve the performance of a function, the effectiveness of the employees involved and, ultimately, the experience of the customer. Unfortunately, many attempts to implement intelligent automation disappoint because companies try to automate their current environment, rather than optimizing that environment to best leverage new tools and truly enable their workforce. One flawed approach focuses on finding applications for specific tools, much like a hammer looking for a nail. Certain steps might be automated, but they are fragmented across the existing flow, yielding fragmented capacity that can’t easily be realized as a benefit. Another common pitfall occurs when a process includes tasks that the tool isn’t intended to address, yet the tool is applied anyway, overextending its capabilities and introducing the risk of instability.

The Next Wellness Trend Should Be Google Spreadsheets

Spreadsheets in particular make it immediately clear—simply by opening and glancing at a document—when you’ve been neglecting your good habits. My philodendron houseplant needs regular, consistent watering to thrive, and so does a goal-tracking spreadsheet, which otherwise appears riddled with holes made of missing data. The motivation to fill out the spreadsheet is baked into the form: All those sad, empty boxes need to be filled in, and only you can do it, by completing whichever task you’ve set out for yourself and then marking it as done in the correlated column. “Rather than fall into patterns of procrastination that just breed more stress and hopelessness, a brief and specific to-do list can help you stay on track,” says Hershenberg. “When you make any steps toward that item on your to-do list, you can and should celebrate that effort. Finding a sense of accomplishment from things that are hard to do is an important part of improving depressed mood and low motivation.”

Killer SecOps Skills: Soft Is the New Hard

At just about every customer site, we are asked to help train SOC managers to do a better job of communicating technical information to non-technical executives. This is hard enough to do when you have time to prepare what you want to say, so imagine how stressful it can be to explain the nuances of a ransomware situation to a CFO or CEO when a decision on whether or not to pay the ransom needs to be made in a matter of minutes. ... SOC teams must be able to collect and disseminate information and tasks across multiple teams. For example, when correlating information about a new attack, clues usually come from multiple sources: network and endpoint experts, malware analysts, operations teams, and additional team members. Incident responders must not only communicate effectively and succinctly, they must be able to delegate to and project manage multiple teams that may have limited understanding of cybersecurity, and under accelerated timelines where broken communication channels can have irreversible negative consequences.

The case for general excellence

There’s no denying that in the modern world, the explosion of knowledge (and the efficiency of capitalism) promotes specialization. If you break a tooth, after all, I would suggest you see my wife, the dentist, rather than me, the generalist. Unfortunately, increasing specialization can have the paradoxical effect of narrowing horizons and limiting innovation to incremental advances. The scientific grant funding system seems to reinforce this syndrome. In medicine, where the spread of specialization is most obvious, patients in the U.S. often get good results on complex procedures (at very high prices), while the health of the population at large suffers. Does that mean expertise has no value? Of course not. But someone needs to see the big picture. Citing economist Robin Hogarth, Epstein relates a useful distinction here between the different kinds of arenas people work in. Chess and golf are “kind” learning environments: “Patterns repeat over and over, and feedback is extremely accurate and usually very rapid.” These environments tend to have strict and unchanging rules, and they reward repetition. Practice may not make perfect, but it certainly makes better.

In the 'post-digital' era, disruptive technologies are must-haves for survival

Organizations can best learn from companies – regardless of industry – that are exploring leveraging more than one DARQ capability to unlock value. This is where true disruption lies: those exploring how to integrate these seemingly standalone technologies together will be better positioned to reimagine their organizations and set new standards for differentiation within their industries. Volkswagen is an excellent example. The company is using quantum computing to test traffic flow optimization, as well as to simulate the chemical structure of batteries to accelerate development. To further bolster the results from quantum computing, the company is teaming up with Nvidia to add AI capabilities to future models. Volkswagen is also testing distributed ledgers to protect cars from hackers, facilitate automatic payments at gas stations, create tamper-proof odometers, and more. And the company is using augmented reality to provide step-by-step instructions to help its employees service cars.

AI vs. Machine Learning vs. Deep Learning

Deep learning describes a particular type of architecture that both supervised and unsupervised machine learning systems sometimes use. Specifically, it is a layered architecture where one layer takes an input and generates an output. It then passes that output on to the next layer in the architecture, which uses it to create another output. That output can then become the input for the next layer in the system, and so on. The architecture is said to be "deep" because it has many layers. To create these layered systems, many researchers have designed computing systems modeled after the human brain. In broad terms, they call these deep learning systems artificial neural networks (ANNs). ANNs come in several different varieties, including deep neural networks, convolutional neural networks, recurrent neural networks and others. These neural networks use nodes that are similar to the neurons in a human brain. Neural networks and deep learning have become much more popular over the last decade in part because hardware advances, particularly improvements in graphics processing units (GPUs), have made them much more feasible.
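The layered flow described above can be sketched in plain Python. This is a toy forward pass with arbitrary, hand-picked weights (not a trained network or any real framework's API); it only illustrates how each layer's output becomes the next layer's input:

```python
import math

def layer(inputs, weights, biases):
    """One dense layer: every output node computes a weighted sum
    of all inputs plus a bias, then applies a sigmoid activation."""
    outputs = []
    for w_row, b in zip(weights, biases):
        z = sum(x * w for x, w in zip(inputs, w_row)) + b
        outputs.append(1.0 / (1.0 + math.exp(-z)))  # sigmoid "neuron"
    return outputs

def forward(x, layers):
    """A 'deep' network is layers chained together: the output of
    one layer becomes the input of the next."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# Three layers with arbitrary weights (2 inputs -> 2 -> 2 -> 1 output).
network = [
    ([[0.5, -0.2], [0.3, 0.8]], [0.0, 0.1]),
    ([[1.0, -1.0], [0.25, 0.75]], [0.05, 0.0]),
    ([[0.6, 0.4]], [0.0]),
]
print(forward([1.0, 2.0], network))  # a single value between 0 and 1
```

Training such a network (adjusting the weights from data) is the part that GPUs made tractable; the forward pass itself is just this chain of layer evaluations.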

Black Hat Q&A: Bruce Schneier Calls For Public-Interest Technologists

I spend four chapters laying out the different government interventions that can improve cybersecurity in the face of some pretty severe market failures. They're complex, and involve laws, regulations, international agreements, and judicial action. The subsequent chapter is titled "Plan B," because I know that nothing in those four chapters will happen anytime soon. And I don't even think my Plan B ideas will come to pass. There are a lot of reasons for this, but I think the primary one is that technologists and policy makers don't understand each other. For the most part, they can't understand each other. They speak different languages. They make different assumptions. They approach problem solving differently. Give technologists a problem, and they'll try the best solution they can think of with the idea that if it doesn't work they'll try another -- failure is how you learn. Explain that to a policy maker, and they'll freak. Failure is how you never get to try again. Solving this requires a fundamental change in how we view tech policy. It requires public-interest technologists.  

Quote for the day:

"Take time to deliberate; but when the time for action arrives, stop thinking and go in." -- Andrew Jackson

Daily Tech Digest - May 19, 2019

Delivering Business Value Through AI To Impact Top Line, Bottom Line And Unlock ROI

Business leaders need to realize AI’s potential to unlock new sources of revenue in addition to improving customer targeting and loyalty. One of these ways is data monetization. What is data monetization? Simply put, data monetization refers to the act of generating measurable economic benefits from available data resources. According to Gartner, there are two distinct ways in which business leaders can monetize data. The more common of the two is direct monetization, which involves directly adding AI as a feature to existing offerings. ... Use cases discovered in this arena span social media sentiment mining, programmatic selection of advertising properties, measuring the effectiveness of marketing programs, ensuring customer loyalty and intelligent sales recommendations. AI also has huge potential to drive businesses to explore and exploit eCommerce platforms as a credible channel for sales and to help drive the digital agenda forward.

Has the UK government's cloud-first policy served its purpose?

The obvious concern in all this is that, if the cloud-first mandate is revoked completely, central government IT chiefs might start falling back into bad procurement habits, whereby cloud becomes an afterthought and on-premise rules supreme again. Maybe that is an extreme projection, but there are signs elsewhere that some of the behaviours that G-Cloud, in particular, was introduced to curb could be starting to surface again. One only has to look at how the percentage of deals awarded to SMEs via G-Cloud has started to slide of late, which has fed speculation that a new oligopoly of big tech suppliers is starting to form, one that will – in time – dominate the government IT procurement landscape. Where G-Cloud is concerned, there are also rumblings of discontent among the suppliers who populate the framework that it is becoming increasingly side-lined. Suppliers semi-regularly grumble that suggestions they have made to CCS or GDS about changes to the framework are being ignored, or not acted on as quickly as they would like.

There are several reasons why enterprise security threats -- especially malware attacks -- are on the rise, Kudelski Security's Kizziah said. "One of the most interesting is criminal groups' adoption of the latest, freely available malcode, which is quite advanced, easy to modify for different specific purposes, and modular, so it can use different techniques to infect an endpoint," Kizziah said. With more than two billion known malware samples out there and new malware being introduced every single day, it is impossible to achieve a reasonable level of protection with traditional approaches to cybersecurity, which are focused on "chasing the bad," Nyotron's Kolga said. Instead, businesses should refocus their efforts on an "ensuring good" approach, Kolga said. This can be achieved through whitelisting approaches for application control and OS behavior, he added. ... Cybercriminals will always find a way to infiltrate businesses, Kujawa believes. He advised companies to adopt a mindset that is not focused solely on prevention. Enterprises should have a plan in place for when threat actors gain access to networks, so that they can protect the most important data with additional layers of security and ensure that business operations are not disrupted.
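The "ensuring good" idea Kolga describes — allow only known-good software rather than chasing known-bad samples — can be illustrated with a hypothetical digest-based allowlist. The binaries and digests below are made up for the sketch; real application-control products enforce this at the OS level rather than in application code:

```python
import hashlib

# Hypothetical allowlist: SHA-256 digests of binaries approved to run.
ALLOWED_HASHES = set()

def approve(binary: bytes) -> None:
    """Record a known-good binary's digest in the allowlist."""
    ALLOWED_HASHES.add(hashlib.sha256(binary).hexdigest())

def may_execute(binary: bytes) -> bool:
    """'Ensure good': run only what is explicitly approved, instead of
    trying to enumerate billions of known-bad samples."""
    return hashlib.sha256(binary).hexdigest() in ALLOWED_HASHES

approve(b"\x7fELF...trusted-app")
print(may_execute(b"\x7fELF...trusted-app"))   # True
print(may_execute(b"\x7fELF...unknown-drop"))  # False
```

The appeal of the approach is that the allowlist stays small and stable, while the universe of malware it implicitly blocks keeps growing.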

FBI and Europol Disrupt GozNym Malware Attack Network

Authorities say this investigation was the result of cooperation between the U.S. and Bulgaria, Germany, Georgia, Moldova and Ukraine. An unusual aspect of the investigation is that charges were brought against suspects in the countries where they reside based, in part, on evidence gathered by the FBI and German authorities. "The prosecutions are based on shared evidence acquired through coordinated searches for evidence in Georgia, Ukraine, Moldova and Bulgaria, as well as from evidence shared by the United States and Germany from their respective investigations," the U.S. Justice Department says. Authorities say five suspects remain at large. All are believed to be in Russia, which did not cooperate with the investigation. The GozNym takedown involved close cooperation between the U.S. Department of Justice and counterparts abroad, supported by coordination from Europol, backed by Eurojust, the EU's agency for handling judicial cooperation on criminal matters among EU member states' agencies.

Demystifying Quantum Computing

Importantly, quantum computers aren’t suited for all problems. There are instances where classical computers can perform just as well as a quantum machine. Thus, quantum computers won’t replace classical computers; they’ll operate alongside them. However, more work and research remains to be done. Current quantum computers aren’t powerful or accurate enough yet to offer an advantage over classical computers. Today they can maintain entanglement for just 90 microseconds, and the algorithm can only run during this short timeframe. In quantum computers with superconducting qubits, the chip must be cooled to close to absolute zero, meaning that it must be totally isolated from the environment. Any outside noise or heat is enough to cause an entangled system to collapse. These limitations will have to be overcome before businesses can start using the technology widely. To date, the quantum computers that exist have been used largely for studying quantum computing and developing its potential use cases. Once quantum computers exceed the capabilities of classical computers, they’ll reach what is called quantum supremacy, and the true quantum era will be at hand. 

The Evolution Of The Chief Data Officer

“CDOs have emerged from one of two camps: IT or business,” she says. “CDOs that have risen through the ranks of a technology organisation recognise the value of data and see how it can be applied to improve the business. One of their biggest challenges is in building trust and credibility with business leaders, while pushing risk-averse technologists outside their comfort zones.” “Meanwhile, CDOs who come from the business side of an organisation have been frustrated with how slow IT may have been to respond to requests for self-service analytics, new types of data such as the IoT, and the evolution to AI. They are willing to take more risks and innovate faster because they know that the business’s livelihood depends upon it. Their biggest challenge is learning just enough of the technology—and there is a lot of it, which changes rapidly—to be respected by IT and to make the right decisions.” For all CDOs, regardless of background, the overarching aim is to create a business culture that is driven by data. How this is achieved may vary according to individual or organisation, but the end goal is the same: capture data, understand it, keep it safe, and use it to make the business better.

Only 9% of companies warn employees about IoT risks

IoT-related data breaches specifically caused by an unsecured IoT device or application increased from 15% in 2017 to 26% in the last year, the report found. It's possible that this number is actually larger, as most organizations said they are not aware of every unsecured IoT device or application in their environment, or introduced by third-party vendors, it noted. Despite these risks, only 9% of companies said their organizations currently inform and educate employees and third parties about the dangers created by IoT devices. The majority of organizations surveyed lack centralized accountability to address and manage IoT risks, according to the report. Only 21% of board members report that they are highly engaged in security practices, and understand third-party and cybersecurity risks in general. About one-third (32%) of the organizations surveyed said no single person or department is responsible for managing or implementing corrective actions to manage IoT risks, the report found.

Podcast: Adopting public cloud as a culture

The key to improved cloud adoption is opening the lines of communication, bridging the divides, and gaining new levels of understanding. As in the restaurant analogy, the chef says, “Well, I can add these ingredients, but it will change the flavor and it might increase the cost.” And then the finance people say, “Well, if we make better food, then more people will eat it.” Or, “If we lower prices, we will get more economies of scale.” Or, “If we raise prices, we will reduce the volume of diners.” It’s all about that balance―and it’s an open discussion among and between those three parts of the organization. This is the digital transformation we are seeing across the board. It’s about IT being more flexible, listening to the needs of the end users, and being willing to be agile in providing services. In exchange, the end users come to IT first, so IT understands where cloud use is going and can be responsive; IT knows better what the users want. It becomes not just that they want solutions faster but by how much. They can negotiate based on actual requirements.

The Power Of Hidden Teams

A recent Cisco study yielded comparable data. And according to Jones, “We can see from our data that teams with more-frequent check-ins have dramatically higher levels of engagement; so, moving forward, we are going to keep experimenting with smaller, more patient-centered, more agile teams, and keep investigating the link between span of control and patient outcomes — and all because we can see the link between attention, teams, and patient care.” The most-engaged teams — and the most-effective team leaders — understand that the currency of engagement is real, human attention. This helps us answer a long-standing question about the optimal span of control in all organizations. Some research puts the number at eight to 10, whereas some workplaces, such as call centers, push the limits with spans as great as 70 team members to one supervisor. Pinpointing the check-in, and the frequent attention it provides, as the key driver of engagement shows that “span of control” is more accurately span of attention. The research reveals that for people to be engaged, the span of control must allow each team leader to check in, one on one, with each team member every week of the year.

Attackers Exploit WhatsApp Flaw to Auto-Install Spyware

The U.K.'s National Cyber Security Center - the public-facing arm of GCHQ - has published guidance for all WhatsApp users. "The NCSC ... always recommends that people protect their device by installing updates as soon as they become available," it says. "The NCSC also recommends that people switch on automatic updates to install them as quickly as possible." Likewise, the Indian Computer Emergency Response Team, Cert-IN, has warned that attackers could launch further attacks. It's urging all users to upgrade immediately to latest version of WhatsApp. Questions remain about what exactly the exploit might allow attackers to do. For example, could they use it to escape Apple's iOS sandbox, and does updating eliminate any access they may now enjoy to a device? "Does updating the app remove whatever malware was placed on phone? Did they manage to pivot out of the app? I haven't seen any technical analysis of the malware yet so genuinely interested," says Alan Woodward, a professor of computer science at the University of Surrey.

Quote for the day:

"And how does one lead? We lead by doing; we lead by being." -- Bryant McGill

Daily Tech Digest - May 02, 2019

AI is already changing how cancer is diagnosed

Thanks to screening programmes, scientific breakthroughs, and technological advances in areas such as genetics and medical imaging, cancer is much more likely to be diagnosed at an early stage than it was several decades ago. However, accuracy in medical imaging diagnosis remains low, with professionals seeing 20-30 percent false negatives in chest X-rays and mammographies. False positive diagnoses (wrongly stating that there is a problem) are also common. AI can help counteract this, and the fact that healthcare is data-rich is an added bonus. The vast majority of AI applications within healthcare leverage machine learning algorithms. The more data they are exposed to, the more likely they are to unearth hidden patterns within it that can then be used to perform a task without being explicitly programmed to do so. ... “One of the biggest challenges that we wish to address when it comes to a cancer diagnosis is ‘early detection.’ If a patient is diagnosed early, the chance of survival increases exponentially.

The Zero Server Web Framework Allows Developers to Create Web Applications With No Configuration

The Zero Server web framework allows developers to create, build and develop web applications with server-side rendering and little to no configuration. The recently released major iteration of Zero accepts a mix of Node.js, React, Vue, HTML, MDX, and static files, with support for Svelte poised to follow suit in upcoming versions. Zero 1.0 features automatic configuration, file-system based routing, automatic dependency resolution, and more. With Zero 1.0, developers organize the miscellaneous pieces of a web application into folders, whose structure and content will be mapped to the routes served by Zero Server. The file-system based routing maps files to routes according to the file extension. Following old PHP conventions, content that resides in ./api/login.js is exposed at http://<SERVER>/api/login. This is valid for any file with a .js extension. Zero thus allows developers to define their API endpoints as individual functions. Zero additionally incorporates a route rewrite mechanism to allow for nested routes. Files with a .jsx extension are expected to contain React code that exports a React component specifying the page to display at the appropriate route.
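The PHP-style file-to-route mapping described above can be sketched independently of Zero itself. This is an illustrative reimplementation of the convention in Python, not Zero's actual code, and the extension list is inferred from the article's examples:

```python
import posixpath

# Extensions treated as server-handled code, per the article's examples;
# everything else is served as a static file under its own name.
CODE_EXTENSIONS = {".js", ".jsx"}

def route_for(path: str) -> str:
    """Map a project-relative file path to the route it would serve:
    './api/login.js' -> '/api/login' (extension dropped for code files),
    while static files keep their full name."""
    path = path.lstrip("./")  # naive strip of the leading './'
    root, ext = posixpath.splitext(path)
    if ext in CODE_EXTENSIONS:
        return "/" + root
    return "/" + path

print(route_for("./api/login.js"))        # /api/login
print(route_for("./components/App.jsx"))  # /components/App
print(route_for("./styles/site.css"))     # /styles/site.css
```

The appeal of the convention is that the folder tree *is* the router: no route table to configure, and adding an endpoint is just adding a file.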

Microservices introduce hidden security complexity, analyst warns

Microservices security is something that needs to be tackled urgently, but this is challenging, said Balaganski, because there are almost no established design patterns, best practices or standards for the design, implementation and maintenance of microservices. “It is important for organisations to first realise that there is a problem that they were not previously aware of and that they need to start asking the right questions and looking for the answers,” he said. “If organisations are not aware of the problems, they won’t be looking for solutions.” Understanding the basics of how microservices work and the security implications of using this architecture is a good place to start, said Balaganski. “If you don’t know the basics, you can’t plan your further strategy based on an informed risk assessment,” he said. “In terms of finding out what questions to ask, they should be looking at the draft special publication from Nist [the US National Institute of Standards and Technology] on Security strategies for microservices-based application systems, which is basically a list of things that need to be considered.”

Tips for creating a successful big data lake

Most data collected by enterprises today is thrown away. Some small percentage is aggregated and kept in a data warehouse for a few years, but most detailed operational data, machine-generated data, and old historical data is either aggregated or thrown away altogether. That makes it difficult to do analytics. For example, if an analyst recognizes the value of some data that was traditionally thrown away, it may take months or even years to accumulate enough history of that data to do meaningful analytics. The promise of the data lake, therefore, is to be able to store as much data as possible for future use. So, the data lake is sort of like a piggy bank (Figure 1-4)—you often don’t know what you are saving the data for, but you want it in case you need it one day. Moreover, because you don’t know how you will use the data, it doesn’t make sense to convert or treat it prematurely. You can think of it like traveling with your piggy bank through different countries, adding money in the currency of the country you happen to be in at the time and keeping the contents in their native currencies until you decide what country you want to spend the money in.

Is it still worth becoming a data scientist?

With slowing salary growth among data scientists and signs there may be a glut of junior talent, should aspiring data scientists pause for thought? Boykis' advice is to consider getting into the field by the "back door", by starting out in a tangentially related field like a junior developer or data analyst and working your way towards becoming a data scientist, rather than aiming straight for data scientist as a career. Stack Overflow's Silge has a slightly different interpretation of why salaries are levelling out and believes people shouldn't necessarily be deterred from entering the industry. "I think that what we're seeing is a little bit of the standardization and the professionalization of data science," she said. "The past ten years have been a bit of the Wild West when it comes to data science. 'How do you become a data scientist?', it's been a really open question. "I see the industry moving towards some consensus around 'What does it mean to be a data engineer? and 'What does it mean to be a data scientist?'. "When you get to that stage it becomes easier to hire for those roles, and when these roles are easier to hire for you don't have the crazy salary situation we had before."

CIO interview: Mark Holt, CTO, Trainline

“It’s just an amazing group of people,” he says. “We’ve gone from a quite slow-moving environment to one where we operate at e-commerce pace. We do more than 300 production releases every week. We have a team who are able to operate at that pace – and that requires a particular group of individuals, with the right skillset, attitude and approach.” Holt says it is not easy to find such talented professionals. He recognises that these highly skilled individuals are the types of people that Google or Facebook are looking to hire, and says the key to success is to continually think about how people work, the roles they fulfil and the supportive environment the business needs to create. “We focus on culture,” he says. “I like the phrase ‘intentional’ – we pay attention to our culture, we care about it and we nurture it on a daily basis. A lot of my conversations with my direct reports will be about culture and the cultural impact of doing something. If we make a change or move something around, what will happen? How does it feel to be in the development and infrastructure teams at Trainline?”

Revolutionary data compression technique could slash compute costs

In Zippads, as the new system is called, stored object hierarchical levels (called “pads”) are located on-chip and are directly accessed. The different levels (pads) have changing speed grades, with newly referenced objects being placed in the fastest pad. As a pad fills up, it begins the process of evicting older, not-so-active objects and ultimately recycles the unused code that is taking up desirable fast space and isn’t being used. Cleverly, at the fast level, the code parts aren’t even compressed, but as they prove their non-usefulness they get kicked down to compressed, slow-to-access, lower-importance pads—and are brought back up as necessary. Zippads would “see computers that can run much faster or can run many more apps at the same speeds,” an MIT News article says. “Each application consumes less memory, it runs faster, so a device can support more applications within its allotted memory.” Bandwidth is freed up, in other words. “All computer systems would benefit from this,” Sanchez, a professor of computer science and electrical engineering, says in the article. “Programs become faster because they stop being bottlenecked by memory bandwidth.”
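The pad hierarchy can be caricatured in a few lines of Python: a small, fast, uncompressed tier in front of a larger compressed one, with least-recently-used objects demoted on overflow and promoted back on access. This is only a software analogy of the idea — Zippads itself is a hardware memory design with different compression and eviction details:

```python
import zlib
from collections import OrderedDict

class PadStore:
    """Toy two-tier store: a capacity-limited uncompressed 'fast pad'
    backed by a compressed 'slow pad'."""

    def __init__(self, fast_capacity=2):
        self.fast = OrderedDict()   # uncompressed; most recently used last
        self.slow = {}              # compressed
        self.fast_capacity = fast_capacity

    def put(self, key, data: bytes):
        self.fast[key] = data
        self.fast.move_to_end(key)
        while len(self.fast) > self.fast_capacity:
            # Demote the least recently used object, compressing it.
            old_key, old_data = self.fast.popitem(last=False)
            self.slow[old_key] = zlib.compress(old_data)

    def get(self, key) -> bytes:
        if key in self.fast:
            self.fast.move_to_end(key)
            return self.fast[key]
        # Promote back up: decompress and reinsert into the fast pad.
        data = zlib.decompress(self.slow.pop(key))
        self.put(key, data)
        return data

store = PadStore(fast_capacity=2)
store.put("a", b"object a" * 10)
store.put("b", b"object b" * 10)
store.put("c", b"object c" * 10)  # evicts "a" into the compressed pad
print("a" in store.fast, "a" in store.slow)  # False True
print(store.get("a") == b"object a" * 10)    # True (decompressed on access)
```

The design choice mirrors the article's point: hot objects pay no compression cost at all, and compression effort is spent only on objects that have proven themselves cold.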

4 best practices for improving governance strategies

As the role of technology in corporate America has evolved over the last 30-plus years, the term “IT governance” has risen to prominence. In the most basic sense, IT governance is a formal framework that provides structure for organizations that ensures all IT investments and systems support core business objectives. In other words, it helps align IT strategy with business strategy. “As changes rapidly occur, it is essential to have a well-defined IT governance framework, a state of compliance within regulatory requirements, and a preemptive approach to IT business risks,” Arbour Group explains. For large organizations that have dozens of priorities and millions of dollars invested into various strategies at any point in time, IT governance is an absolute necessity. A failure to articulate the correct approach to IT governance could result in costly mistakes that prevent the organization from being successful. For business leaders that haven’t empowered their companies with IT governance – or even those who have, but know they aren’t taking full advantage – there’s ample room for improvement.

Automating trust with new technologies

The need for trust starts when a product or component leaves the factory or farm. A manufacturer that has implemented automated trust creates a digital “birth certificate” with specifications, provenance, cost, and other relevant data. It then enters this birth certificate (usually an IoT tag) into its existing ERP system, integrated with blockchain to create a secure, immutable, cryptographically sealed record. This record is instantly available, in identical form, on the different servers of the participants in this supply chain, such as the manufacturer, logistics providers, distributors, and wholesalers. Next come IoT sensors, to record location, temperature, ambient vibrations, and other measures to provide continuous end-to-end provenance. The logistics provider scans the sensors to connect them to the blockchain and to the digital birth certificate. As assets change location and condition, IoT sensors gather the data. Blockchain stores it, securely and immutably, with a timestamp on the servers of all of the participants.
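The "secure, immutable, cryptographically sealed record" idea reduces to hash chaining: each ledger entry embeds the hash of its predecessor, so tampering with an early record invalidates everything after it. A minimal sketch follows, with made-up product and sensor fields and a fixed timestamp for reproducibility; a real deployment would add signatures and distributed consensus:

```python
import hashlib
import json

def sealed_record(payload: dict, prev_hash: str) -> dict:
    """Append-only ledger entry: each record embeds the hash of its
    predecessor, so altering an earlier entry breaks the chain."""
    body = {"payload": payload, "prev": prev_hash, "ts": 1558500000.0}
    serialized = json.dumps(body, sort_keys=True).encode()
    body["hash"] = hashlib.sha256(serialized).hexdigest()
    return body

def verify(chain) -> bool:
    """Check that every entry's 'prev' matches its predecessor's hash."""
    return all(entry["prev"] == prev["hash"]
               for prev, entry in zip(chain, chain[1:]))

# Hypothetical "birth certificate" followed by an IoT sensor reading.
chain = [sealed_record({"type": "birth", "sku": "PUMP-042",
                        "provenance": "Plant 7"}, prev_hash="0" * 64)]
chain.append(sealed_record({"type": "sensor", "temp_c": 4.2,
                            "location": "in transit"}, chain[-1]["hash"]))

print(verify(chain))         # True
chain[0]["hash"] = "f" * 64  # tamper with the "birth certificate"
print(verify(chain))         # False
```

Because every participant holds an identical copy of the chain, a tampered record on one server disagrees with everyone else's, which is what makes the shared history trustworthy without a central arbiter.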

Using TypeScript with the MySQL Database

TypeScript, introduced in 2012, has had a recent surge in popularity. A recent JavaScript and Web Development InfoQ Trends Report notes that "TypeScript has had a dramatic rise in popularity, now in the top 10 most popular programming languages on GitHub...". In June 2018 TypeScript made its debut on the TIOBE Index of programming languages top 100 at #93 and the following month was ranked in the top 50. More recently TypeScript is ranked at 44 on the TIOBE index. TypeScript has emerged as a powerful environment for authoring web applications, providing significant improvements over standard JavaScript while remaining consistent with the language. In this article we'll explore in depth the details necessary to use TypeScript with Node.js, MySQL, and TypeORM to create a powerful solution for managing database access with server-side TypeScript. We'll build an example CRUD application to provide a complete end to end solution. The example application shall model a journal catalog. We shall assume familiarity with JavaScript.

Quote for the day:

"Knowledge is like underwear. It is useful to have it, but not necessary to show it off." -- Bill Murray

Daily Tech Digest - May 01, 2019

What Has Fintech Done, To Make Itself Feel Proud?

In this photo, a customer is assisted at an M-Pesa counter in Nairobi, Kenya, to make a money transfer. Photo Credit: AP Photo/Sayyid Abdul Azim
“What we are able to do as a fintech company is to offer better accessibility to financial products for this group of hardworking individuals, who are currently marginalized, particularly when it comes to accessing the lending system.” But what could the fintech industry do more of to prevent this financial worry in the first place? Boden points out the importance of “simplicity, accessibility and the user experience, keeping up the ‘mission to explain’”. “As long as we continue to demystify subjects which can often intimidate people such as pensions and investments, we will be fighting the good fight on financial inclusion. What the fintech industry must not lose sight of is its ability to listen to customers and adapt to meet their needs. This is an area where traditional financial services companies struggle to compete.” Sarkar also discusses how impactful financial education can be, “while highlighting the unique position employers have to support improved financial wellbeing of their workforce. For instance, our research uncovered that 77 percent of people trust their employer when it comes to information about their personal finances, and also trust their employer to keep that information confidential.

How to Automatically Determine the Number of Clusters in your Data - and more

Determining the number of clusters when performing unsupervised clustering is a tricky problem. Many data sets don't exhibit well separated clusters, and two human beings asked to visually tell the number of clusters by looking at a chart, are likely to provide two different answers. Sometimes clusters overlap with each other, and large clusters contain sub-clusters, making a decision not easy. ... A number of empirical approaches have been used to determine the number of clusters in a data set. They usually fit into two categories: Model fitting techniques: an example is using a mixture model to fit with your data, and determine the optimum number of components; or use density estimation techniques, and test for the number of modes...; and Visual techniques: for instance, the silhouette or elbow rule (very popular.) In both cases, you need a criterion to determine the optimum number of clusters. In the case of the elbow rule, one typically uses the percentage of unexplained variance.
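The elbow rule mentioned above can be demonstrated end to end with a tiny 1-D k-means and the percentage of unexplained variance (within-cluster sum of squares divided by total sum of squares). The data set here is synthetic, with three deliberately well-separated groups; real data, as the article notes, is rarely this clean:

```python
def kmeans_1d(points, k, iters=20):
    """Tiny 1-D k-means with deterministic initialization:
    centers start spread evenly across the sorted data."""
    pts = sorted(points)
    centers = [pts[i * (len(pts) - 1) // max(k - 1, 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pts:
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

def unexplained(points, k):
    """Percentage of unexplained variance: within-cluster sum of
    squares over total sum of squares (exactly 1.0 at k=1)."""
    mean = sum(points) / len(points)
    total = sum((p - mean) ** 2 for p in points)
    centers, clusters = kmeans_1d(points, k)
    within = sum((p - c) ** 2 for c, cl in zip(centers, clusters) for p in cl)
    return within / total

data = [1.0, 1.2, 0.9, 5.0, 5.1, 4.8, 9.0, 9.2, 8.9]  # three obvious groups
for k in (1, 2, 3, 4):
    print(k, round(unexplained(data, k), 3))
# The drop flattens sharply after k=3 -- the "elbow" at the true count.
```

Raising k always reduces unexplained variance, so the criterion is not the minimum but the point of diminishing returns, which is exactly why picking the elbow by eye can be ambiguous on overlapping clusters.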

Fintech lobby spending targets cryptocurrency taxation

While the Securities and Exchange Commission has released some guidance on when it would consider a digital token a security, the nascent industry has complained that the SEC’s most recent comments have muddied the already murky matter. That’s why the fintech industry is lobbying hard for a bill from Ohio Republican Rep. Warren Davidson to exempt digital tokens from securities regulations, said Kristin Smith of the Blockchain Association. “That’s probably been our biggest focus,” she said. “And it will continue to be our biggest focus for the next couple of months.” Tax issues are another priority, Smith said. Because cryptocurrencies can alternately be considered currencies, securities, futures contracts or something else, their tax treatment is a complicated question that the industry hopes can be simplified soon. The IRS has issued scant guidance on how to tax digital coins, said Jerry Brito, executive director at Coin Center. Brito is hoping a pair of cryptocurrency tax bills introduced last year can advance this year.

Plandek co-CEO: 5 areas for Agile team self-improvement

Agile is, after all, a relative term and fairly meaningless unless qualified. So do you know how agile your development is? One way to embed the culture change required to answer that key question is through self-improvement (SI) processes underpinned by the right agility metrics. Agile is already closely linked to SI — let’s remember that the Agile Manifesto states: “At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.” In other words, Agile is about continuous, team-driven SI. The fact that retrospectives are among the top five Agile techniques underscores SI’s importance (source: State of Agile report). Nevertheless, SI efforts regularly fail due to inadequate leadership and follow-through. Teams either don’t have the right tools to collect the data or set the wrong metrics. The latter can be especially problematic when Agile development projects are scaling.

The story of smart data

You may have a series of sensors connected to a patient, where you’re monitoring their vital statistics which, in turn, may alert healthcare professionals or physicians as to their ongoing remote treatment and care. Indeed, smart data has many stories to tell, but we may not necessarily be privy to its journey. Moreover, in the evolution of smart objects or things, we may need the support of “smart agents” – autonomous entities that have been empowered to make decisions for us. However, in our current design doctrine human interaction is still needed. ... Of course, we’ve also empowered our smart agents to learn – a true cause and effect paradigm, in turn, slowly diminishing the need for human intervention and, again, realizing a truer definition of “machine learning.” Agents will also use blockchain technology to provide a ledger – an historical reference to what they have learned and might know for future situations – yes, predictive analytics is another reality. Our smart data is a diverse collection of values that offer many insights into the various journeys undertaken by our smart agents.

How a Google Street View image of your house predicts your risk of a car accident

It turns out that a policyholder’s residence is a surprisingly good predictor of the likelihood that he or she will make a claim. “We found that features visible on a picture of a house can be predictive of car accident risk, independently from classically used variables such as age or zip code,” say Kidziński and Kita-Wojciechowska. When these factors are added to the insurer’s state-of-the-art risk model, they improve its predictive power by 2%. To put that in perspective, the insurer’s model is better than a null model by only 8% and is based on a much larger data set that includes variables such as age, sex, and claim history. So the Google Street View technique has the potential to significantly improve the prediction. And the current work is merely a proof of principle. The researchers say its accuracy could be improved using larger data sets and better data analysis. The researchers’ approach raises a number of important questions about how personal data should be used. Policyholders in Poland might be startled to learn that their home addresses had been fed into Google Street View to obtain and analyze an image of their residence.

How machine learning could change science

There are several projects underway to cure, understand, or otherwise ameliorate the symptoms of different cancers - three of which in the DOE specifically use machine learning, as well as a broader machine learning cancer research program known as CANDLE. "In this case, the DOE and [NIH's] National Cancer Institute are looking at the behavior of Ras proteins on a lipid membrane - the Ras oncogenic gene is responsible for almost half of colorectal cancers and a third of lung cancers.” Found on your cell membranes, the Ras protein is what “begins a signalling cascade that eventually tells some cell in your body to divide,” Streitz said. “So when you're going to grow a new skin cell, or hair is going to grow, this protein takes a signal and says, ‘Okay, go ahead and grow another cell.’” In normal life, that activity is triggered, and the signal is sent just once. But when there’s a genetic mutation, the signal gets stuck. “And now it says, grow, grow, grow, grow, again, just keep growing. And these are the very, very fast growing cancers like pancreatic cancer, for which there's currently no cure, but it's fundamentally a failure in your growth mechanism.”

Done Right, Cloud Native Culture Means Happier Java Developers

“What is ahead-of-time compilation? It’s pre-computation of application code using closed-world static analysis. That’s a fancy way of saying ‘do more at compilation time and less at runtime,’” Rocher said in his keynote at Code Rome. Micronaut moves dependency injection, aspect-oriented programming, configuration management, and bean introspection from the runtime part to the build-time part so that fast-launching services don’t eat up memory. But Rocher wasn’t done with optimizing. He whipped out a demo of GraalVM, “the new universal Java Virtual Machine from Oracle that converts Java to native machine code using AOT.” Not only does it work well with Micronaut, it also features a language framework called Truffle that allows languages to interoperate, so “a Java app can call a JavaScript app without any overhead.” In his demo of Micronaut on GraalVM, startup time was just 20 milliseconds and memory consumption was 18MB. “For a Java app, that is quite remarkable,” he said.
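To see why “do more at compilation time and less at runtime” matters, consider the difference between a reflection-based call, which traditional DI frameworks rely on at startup, and a statically resolved call, the kind of wiring Micronaut generates at build time. This is only an illustrative sketch of the trade-off, not Micronaut's actual generated code; the class and method names are hypothetical.

```java
import java.lang.reflect.Method;

// Sketch: the same greeting resolved two ways. The reflective path
// discovers the constructor and method at runtime (metadata lookups,
// boxing, access checks); the direct path is fully resolved by the
// compiler -- the kind of work AOT frameworks shift to the build step.
public class AotSketch {
    public static class GreetingService {
        public String greet(String name) { return "Hello, " + name; }
    }

    // Runtime discovery: class metadata is inspected on each call.
    static String viaReflection(String name) throws Exception {
        Object service = GreetingService.class.getDeclaredConstructor().newInstance();
        Method m = service.getClass().getMethod("greet", String.class);
        return (String) m.invoke(service, name);
    }

    // "Build-time" wiring: the call target is known statically, so a
    // closed-world compiler like GraalVM native-image can inline it.
    static String viaDirectCall(String name) {
        return new GreetingService().greet(name);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(viaReflection("Rome"));
        System.out.println(viaDirectCall("Rome"));
    }
}
```

Reflection is also the main obstacle to GraalVM's closed-world static analysis, which is why frameworks that precompute their wiring at build time pair so naturally with native-image compilation.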

2 Million IoT Devices Vulnerable to Complete Takeover

It’s hardly the first security issue in security and surveillance cameras, which hold sensitive data and video footage ripe for the taking by hackers. In July, IoT camera maker Swann patched a flaw in its connected cameras that would allow a remote attacker to access their video feeds. And in September up to 800,000 IP-based closed-circuit television cameras were vulnerable to a zero-day vulnerability that could have allowed hackers to access surveillance cameras, spy on and manipulate video feeds, or plant malware. “Security cameras continue to be the oxymoron of the 21st century,” said Joe Lea, vice president of product at Armis, in an email. “This is a perfect storm of a security exposure for an IoT device – no authentication, no encryption, near impossible upgrade path. We have to stop enabling connectivity over security – this is a defining moment in how we see lack of security for devices and lack of response.” In a comment to Threatpost, Marrapese said that vendors have a big part to play when it comes to doing more to secure their connected devices.

Creating Meaningful Diversity of Thought in the Cybersecurity Workforce

We have been discovering the value of diversity of thought through programs such as IBM’s new collar initiative and the San Diego Cyber Center of Excellence (CCOE)’s Internship and Apprenticeship Programs. IBM’s initiative and the CCOE’s program rethink recruiting to pull workers into cybersecurity from adjacent disciplines, not just adjacent fields. Toward the end of my stay at Intuit, I participated in a pilot program that brought innovation catalyst training to leaders outside of product development. Innovation catalysts teach the use of design thinking to deliver what the customer truly wants in a product. While learning the techniques I would later use to coach my teams and tease out well-designed services — services that would delight our internal customers — I was struck by an observation: People of different job disciplines didn’t just solve problems in different ways, they brought different values and valued different outcomes.

Quote for the day:

"Your first and foremost job as a leader is to take charge of your own energy and then help to orchestrate the energy of those around you." -- Peter F. Drucker