Daily Tech Digest - December 20, 2019

How 8 short months changed the cloud landscape


Even the staunchest believers in a strictly public cloud model are recognizing that the majority of organizations are simply not able to go all-in on public cloud. Between legacy systems, security/compliance requirements, and the need for flexibility, end-user organizations want solutions that allow them to move seamlessly between environments if necessary. Hybrid and multicloud are now becoming standard. This customer pressure is the number one factor driving the increase in mix-and-match cloud offerings. Beyond that pressure, mix-and-match cloud has also progressed because customer momentum around multicloud has grown. Initially it was the regulated companies — mainly financial services — that embraced multicloud, but demand has now moved beyond these early adopters to the enterprise mainstream. After IT decision makers see some success in using public clouds and start mapping the future, they often learn the reality of needing hybrid as part of the plan — and that it is smart to have multicloud as part of the plan as well.



Your Security Strategy Should Scale and Evolve Alongside Your Business


While all workers in a small business may know one another, large businesses often have greater numbers of employees, contractors and partners working in scattered locations and sometimes speaking different languages, which can lead to information security challenges. Likewise, interpersonal dynamics (i.e., office politics) are far more likely to affect human-factors issues for larger businesses than mom-and-pop operations. Also, because no one person can be an expert on all of the organization’s systems and technologies, proper communication becomes essential to maintaining information security, and communication failures or breakdowns can lead to dangerous mistakes or vulnerabilities. It’s no secret that people occasionally miscommunicate, so these risks, which small businesses consider far less frequently, must be mitigated in large businesses. As such, implementing formal policies and procedures around communication can help reduce human complexity issues significantly by outlining the specifics of proper communications and limiting the potential for riskier techniques.


Machine learning ops to lead AI in 2020


Commensurate with the growth of third-party model usage is the shift to consumption-centric approaches to ML model usage. The market is already shifting in the way that organizations approach model development and usage. While up to this point the primary emphasis has been on model development and creation for use by a single organization, the shift to consumption-centric models will require tooling and environments that address the specific needs of model users rather than model developers. In 2020, we'll see the growth of machine learning ops infrastructure that provides a range of functionality and capabilities for those looking to consume models. Machine learning ops systems are meant to simplify the usage and consumption of various AI and machine learning models that were built in-house or by third-party vendors. The platforms will offer features including model governance for controlling, limiting or prioritizing access to models for different users, allowing for collaboration in model usage among various team members.
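The model-governance idea described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual platform: the registry, role names, and priority scheme are all invented for the example.

```python
# Hypothetical sketch of a model-governance registry: it controls and
# prioritises which users may consume which models, as described above.
# Role names, model names, and the priority scheme are assumptions.
class ModelRegistry:
    def __init__(self):
        self._models = {}  # name -> (callable, allowed_roles, priority)

    def register(self, name, fn, allowed_roles, priority=0):
        self._models[name] = (fn, set(allowed_roles), priority)

    def invoke(self, name, user_role, *args):
        fn, roles, _ = self._models[name]
        if user_role not in roles:  # governance: limit access per role
            raise PermissionError(f"{user_role} may not use {name}")
        return fn(*args)

    def visible_to(self, user_role):
        """List the models a user may consume, highest priority first."""
        allowed = [(p, n) for n, (_, r, p) in self._models.items() if user_role in r]
        return [n for p, n in sorted(allowed, reverse=True)]

reg = ModelRegistry()
reg.register("churn", lambda x: x * 2, {"analyst", "admin"}, priority=1)
reg.register("fraud", lambda x: x + 1, {"admin"}, priority=5)
```

A real platform would add audit logging, quotas, and versioning on top of this access-control core, but the shape — a catalogue of models with per-user consumption rules — is the essence of what such tooling offers model consumers.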


Google Cloud’s 5 Boldest Moves of 2019

Over the summer it added a new security feature that puts context-aware capabilities in Cloud Identity-Aware Proxy, which allows companies to define and enforce granular access policies for apps and infrastructure based on a user’s identity and the “context” of their request, such as the user’s location, the time of day they are trying to access a particular app, or the security status of the device. This is good for both the enterprise and its employees: it improves a company’s security posture and means employees have easier access to the cloud and workloads running in the cloud on any device without using a virtual private network (VPN) client. A couple of months later Google added a security analytics product that identifies misconfigurations and compliance violations in companies’ GCP environments. And in November at the U.K. edition of its Cloud Next event, it unveiled a ton of capabilities around data encryption, network security, security analytics, and user protection.


How The Cloud Can Solve Life Science’s Big Data Problem

A growing number of large and small companies seem to agree with Benchling’s approach. Benchling announced today that it added 150 new clients to its roster in 2019 — more than doubling its customer base for the second consecutive year — with a significant expansion of enterprise customers. Biopharma was its largest and fastest-growing customer sector, but it also saw strong initial deployments in a range of new industries, including biomaterials, energy, consumer goods, and food and beverage. Earlier this year, Benchling closed $34.5 million in Series C funding to extend its product lead and expand commercial relationships and opened its Cambridge, MA, office in April. Since launching in 2012, Benchling says it has grown to become the most widely adopted life science R&D cloud software, used by over 170,000 scientists worldwide. What’s most impressive about its customer base is its diversity: from start-ups to multinationals, in every sector imaginable, including many of synthetic biology’s best-known names — Zymergen, Synlogic, Regeneron, and Intellia, to name a few.


How machine learning can transform AML

Traditional AML solutions resort to hard segmentations of customers based on KYC data (often inaccurate or aged) or special sequences of historical events. However, customers are too complex to be assigned to such hard-and-fast segments and need to be monitored continuously for anomalous behaviour through soft segmentation. The best approach is to aggregate customers’ banking transactions and generate archetypes of customer behaviour. Each customer is a mixture of these archetypes, which is adjusted in real time based on their financial and non-financial activity. Looked at in an “archetype space”, good customers are similar in behaviours, giving the appearance of customer cluster grouping. Machine learning models can detect and rank-order, with scores, those customers that fall in areas of low customer density or areas of extremely high similarity in behaviour. Machine learning can be used to bring the most urgent alerts to a human officer’s attention, which helps reduce false positives. For example, the FICO AML Threat Score prioritises investigation queues for SARs, to ensure the most pressing cases are looked at first.
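The archetype idea above can be sketched minimally: represent each customer as a mixture of behavioural archetypes, then score anomalies by how isolated a customer is in that archetype space. This is an illustrative toy, not FICO's actual method; the archetype names and the k-nearest-neighbour density proxy are assumptions.

```python
# Toy soft-segmentation sketch for AML monitoring. Each customer becomes a
# mixture of behavioural archetypes; customers whose mixtures sit in
# low-density regions of archetype space receive higher anomaly scores.
import math

ARCHETYPES = ["retail_spender", "cash_intensive", "cross_border"]  # hypothetical

def mixture(txn_counts):
    """Normalise per-archetype transaction counts into mixture weights."""
    total = sum(txn_counts.values())
    return [txn_counts.get(a, 0) / total for a in ARCHETYPES]

def knn_anomaly_score(target, population, k=3):
    """Mean distance to the k nearest neighbours in archetype space.
    A high score means the customer sits in a low-density region."""
    dists = sorted(
        math.dist(target, other) for other in population if other is not target
    )
    return sum(dists[:k]) / k

# A population of similar "good" customers and one behavioural outlier.
normal_pop = [mixture({"retail_spender": 9, "cash_intensive": 1}) for _ in range(5)]
outlier = mixture({"cash_intensive": 8, "cross_border": 2})
```

Production systems derive archetypes from the transactions themselves (e.g. via matrix factorisation or topic models) and update mixtures in real time, but the scoring intuition — distance from the dense clusters of good behaviour — is the same.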


Frankfurt
The infections across Germany aren't a surprise. In recent weeks, the Emotet gang has started targeting German users more often. On the same day the cities of Frankfurt and Bad Homburg were infected, the BSI sent out a security alert warning German organizations about an Emotet email spam campaign that was mimicking German government agencies -- most likely the method through which the two cities were infected. Joseph Roosen, a member of Cryptolaemus, a group of security researchers who track Emotet campaigns, told ZDNet that the Emotet operators often translate their email spam templates to German and target the country's users. ... At this point, it is very clear that the Emotet gang is putting quite the effort into infecting German targets, something it hadn't done before on this scale. While we've seen cities shut down networks in the past, this usually happened because of ransomware attacks. What German cities are doing now is a first. No cities have reacted like this in the case of an Emotet infection.


Master Data Management: A Modern Guide for Data Governance Professionals

Once you have this framework, you can aggregate your data, standardize it and match values. You can link and synchronize your records to align user and application data with the master set. Data quality tools support MDM frameworks by cleansing and normalizing data and removing errors and duplicate values. This has all become extremely important – and difficult to achieve – in our data-driven age. Without MDM, large enterprises struggle to manage linkages and trends across the business. A data engineer at a large insurance firm recently described his team as “data-rich, but information-poor” because each of a dozen recently acquired business units had slightly different definitions of automobile status. Definitions ranged from “new” and “almost new” to “used” and “certified pre-owned.” So it’s no surprise that MDM is moving from “nice-to-have” to “must-have” in many CIO budgets. Structured data volumes keep rising; platforms and applications keep sprawling; user groups and workflows keep drifting. 
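The cleanse-standardize-match cycle above can be sketched in a few lines, using the insurance example as a motivating case. This is a minimal illustration, not a real MDM product: the status mapping, record fields, and the choice to match on VIN are all assumptions made for the example.

```python
# Minimal MDM-style sketch: map each unit's local vocabulary onto a master
# vocabulary, then match and deduplicate records on a normalised key.
MASTER_STATUS = {  # hypothetical mapping onto the canonical value set
    "new": "new",
    "almost new": "used",
    "used": "used",
    "certified pre-owned": "used",
}

def standardise(record):
    """Cleanse one record: trim and lowercase the local status value,
    then map it onto the master vocabulary."""
    status = record["status"].strip().lower()
    return {**record, "status": MASTER_STATUS.get(status, "unknown")}

def deduplicate(records):
    """Match records on a normalised key (VIN here) and keep one master copy."""
    master = {}
    for rec in map(standardise, records):
        master.setdefault(rec["vin"].upper(), rec)
    return list(master.values())

records = [
    {"vin": "1abc", "status": " New "},
    {"vin": "1ABC", "status": "Certified Pre-Owned"},  # same vehicle, other unit
]
golden = deduplicate(records)
```

Real MDM platforms add survivorship rules (which duplicate's fields win), fuzzy matching for keys that don't align exactly, and synchronization back out to the source applications, but the core loop — normalise, match, keep one golden record — is what turns "data-rich, information-poor" into a usable master set.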


Edge computing trends for 2020s send internet into a new era


Edge computing will also affect other use cases, such as manufacturing, retail, healthcare, automotive and residential environments. However, as the need for speed and compute closer to data sources grows, these use cases may shift, and organizations that provide infrastructure services may glean the most benefits from edge computing. "Worldwide, general availability of edge computing beyond these early use cases will become commonplace sometime between 2025 to 2028," Trifiro said. "Providing an edge computing environment for a factory floor might happen earlier." Use case deployment may depend on the existing devices and architecture of an organization's network. The internet was developed for humans, yet humans aren't the only things that now use the internet or require it to function. Communication among various machines, devices and applications requires faster architecture to function efficiently, and edge computing aims to answer this call.


Get 2020 vision about edge computing and 5G

Edge computing exists along a spectrum, from the device edge to the infrastructure edge.
Edge computing also saves you from shuttling every bit of data back and forth between connected devices and the cloud. If you can determine the value of information close to where it’s created, you can optimize the way it flows. Limiting traffic to just the data that belongs on the cloud cuts down on bandwidth and storage costs, even for applications that aren’t sensitive to latency. Reliability stands to benefit from edge computing, too. A lot can go wrong between the device edge and centralized cloud. But in rugged environments like offshore platforms, refineries, or solar farms, the device and infrastructure edges can operate semi-autonomously when a connection to the cloud isn’t available. Distributed architectures can even be a boon to security. Moving less information to the cloud means there’s less information to intercept. And analyzing data at the edge distributes risk geographically. The endpoints themselves aren’t always easy to protect, so firewalling them at the edge helps limit the scope of an attack.
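The traffic-limiting pattern described above — determine the value of information close to where it's created, and send the cloud only what belongs there — can be sketched as a simple filter. The thresholds, field names, and summarisation rule here are invented for illustration.

```python
# Toy edge-filtering sketch: score readings locally, forward only the
# valuable (anomalous) ones to the cloud, and summarise the rest on-site,
# cutting bandwidth and storage costs.
def edge_filter(readings, threshold=90.0):
    """Run at the edge: keep high-temperature readings for the cloud,
    collapse the routine ones into a single local aggregate."""
    to_cloud = [r for r in readings if r["temp_c"] >= threshold]
    kept_local = [r for r in readings if r["temp_c"] < threshold]
    summary = {
        "count": len(kept_local),
        "avg_temp_c": (
            sum(r["temp_c"] for r in kept_local) / len(kept_local)
            if kept_local else None
        ),
    }
    return to_cloud, summary

readings = [{"temp_c": 20.0}, {"temp_c": 95.0}, {"temp_c": 22.0}]
to_cloud, summary = edge_filter(readings)
```

On an offshore platform or solar farm, a node running logic like this keeps working when the cloud link drops: it accumulates summaries locally and forwards the backlog once connectivity returns, which is the semi-autonomy the passage describes.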



Quote for the day:


"Every great leader has incredible odds to overcome." -- Wayde Goodall


Daily Tech Digest - December 18, 2019

Are California's New Data Privacy Controls Even Legal?

In the rush to get a GDPR-style regulatory framework in place in California, no one seemed to stop and ask whether what they were doing was actually legal. Indeed, many of the controls enshrined in the European law are fundamentally at odds with American principles of permissionless innovation and open interstate commerce. Huddleston and Adams point out that state laws like the CCPA may run into constitutional problems concerning speech and interstate trade. Data is often speech. Laws that regulate speech are subject to a high level of legal scrutiny because of our First Amendment protections. States don't get to ignore the First Amendment just because they really don't like Facebook. If they try to regulate data-as-speech, the courts may promptly strike them down. How might this look in terms of a state data law? One popular idea is to treat data usage differently depending on whether that data is "for sale" or not. Given the populist anger about data brokers and "monetizing our personal information," it makes sense that this would make its way into a bill.


Your First Month as a CISO: Forming an Information Security Program

As with any significant undertaking, it makes sense to begin by understanding the current state of the security program and the context in which you'll manage it. Start by covering the following. What security concerns do you, your colleagues, and your customers have? What gaps exist, and how might they affect the business? Sometimes figuring this out requires a formal assessment. Other times, it may require an informal survey, conversations with colleagues, and analysis to get to a sufficient starting point. I was fortunate to become the CISO after holding another role at Axonius, but there were many questions that my new position required me to ask that I hadn't asked before. What expectations does the organization have of the security program and your role? Account for your own ideas and, of course, get input from your manager and other stakeholders. When outlining the goals, understand the business and technical needs that will influence the security program. (You can see my objectives at Axonius, if you're curious.) And get to know the technical and business environment in which you'll be operating. 


Agile teams: 5 signs of trouble

An agile team needs autonomy, but every team needs proper management and organization given the fast-paced nature of agile development. The duty often lies with the scrum master, who should be implementing the best communication processes suited for the team. However, if your daily standups are longer than 15 minutes, then your team might not be as organized as you think. I have seen many different organizations hold long, crowded meetings with people who weren’t necessary to the project - which, not surprisingly, led to a bored and unmotivated team. The best practice is to narrow the number of invites to the people who are vital to the task at hand: Wasted meeting time leads to a lower ROI. Another consideration is to assess how teams are communicating on schedules, stories, and responsibilities. Determine if your team is communicating adequately on their boards, and ensure that everyone understands the overarching goal and that each person is aware of their role. If these points are not drummed into each individual member, then you need to take a step back and reorganize how the work is being done.


3 Real-World Disaster Recovery Scenarios

One of the worst-case scenarios that a modern business can face is a disaster that destroys part or all of its datacenter—and all of the servers and disks inside it. While such a situation is rare, it can happen, and not only as a result of a major natural disaster like an earthquake or hurricane. Issues such as electrical surges and even ill-fated squirrels can cause permanent datacenter damage. The best way to prepare your business for recovery from this type of disaster is to ensure that you have offsite copies of your data. If your production data lives on-premise in one of your datacenters, this would mean keeping backups of the data at another datacenter site or in the cloud. If your data is hosted in the cloud, you could back it up to local storage, to another cloud or a different region of the same cloud. You will also want to make sure that you have a way of restoring the backup data to new infrastructure quickly. Moving large amounts of data from one site to another over the Internet can take a long time, so it’s not always a wise strategy within the context of disaster recovery.


Expected skills needs for the future of work

Challenges and stakeholders related to skills policies in Europe
The ‘future of skills’ receives considerable attention from governments around the world and stands high on the political agenda of many international organizations. As an example, the EU has adopted an overarching strategy – the New Skills Agenda – to tackle a wide range of skills-related challenges. Many of the tools contained in this initiative aim at empowering individuals to develop new skills or to exploit the skills they already have. Nevertheless, even with the most innovative policies in place and the mobilization of huge public resources, the success of any skills strategy depends heavily on the motivation of individuals and their decisions to take a step forward. Hence, it is of great importance for policymakers and other stakeholders to understand the impact of technological change from the perspective of workers in order to develop effective policy tools to create a future that works for all. A number of academic studies have already shed light on the potential changes in the labour force of the future. This article, which presents the opinions of more than 15,000 workers across ten European countries, was designed to contribute to the overall debate by giving voice to the workers themselves and potentially bringing them closer to policymakers.


Technology is the answer to the healthcare productivity challenge

While there are major productivity gains to be secured through better management of clinical activity, technology can also transform many of the NHS’ support functions. In particular, the recruitment and management of temporary staff is often slow and inefficient, with checks being carried out manually. Yet many of these processes could be automated, significantly reducing the time taken to get staff where they are needed on the ward. The opportunities are not just to improve the way providers work. Patients too can benefit from the increasing availability of online and Skype consultations, which save them time and provide better, more accessible care. They are also more efficient. It is undoubtedly true that technology has often been hailed as the answer to the NHS’ problems and then disappointed. The reason for these failures lies in the NHS’ tendency to buy technology and then look for a problem for it to solve. To secure the productivity gains it needs, that approach must be reversed.


Tech-skill certificates may be more valuable than a college degree

A study, "Degrees at Work," released by Emsi in August, found only two majors directly sent grads into its business: engineering and computer science. The study also revealed that non-STEM grads devalue their education. Yet there are many available tech jobs, including positions unfilled for months at a time. And a reason for the disparity? Lack of skills. Graduates, and students in general, lack not only experience, but also practical skills to land the kind of jobs to which they aspire. "Our educational system is not equipped to meet the demands of the modern workplace," said Grace Suh, vice president of education for IBM citizenship. "We have more jobs than skilled people to fill them. There are already more than 700,000 open technology jobs in the US, and another half million expected over the next decade. Part of the problem is that we've focused too heavily on just one path to a good job: a bachelor's degree." To address this issue of the educated but unskilled, corporations are pairing with schools and alternative credentialing programs to help fill open positions, as well as match employees with their interests.


6 Essential Security Features for Network Monitoring Solutions

What are the essential security features that every network monitoring solution needs to have? Traditionally, network monitoring solutions focus strictly on examining the performance of a network and all connected devices. However, in order to unify security and performance monitoring for network professionals, an increasing number of network monitoring tools include security features. As one of the largest causes of performance-related issues for networks is security threats (such as malware), network monitoring solutions must be prepared to deal with security-related performance events. The exact scope of a network monitoring tool’s security feature set depends on the vendor’s focus, but every solution should have at least some native security capabilities built-in. Below, we’ve listed six essential security features for network monitoring solutions! If you’re looking for resources to help you evaluate network performance and security monitoring solutions, our free Network Monitoring Buyer’s Guide has you covered! 


DataOps: Your path to a culture of inclusion and innovation

A typical DataOps team incorporates data analysts, who prepare and wrangle data for operations and analysis; data scientists, who find new and emergent patterns in data; and data engineers, who create stable, scalable data pipelines. You may already have some or all of these roles within your company, or you may be able to outsource some of these functions to a third party. Then there are customers, both internal and external, who focus on defining data requirements and consuming data. People are just one part of the equation, and it is the convergence of people, process and technology that is needed to run your DataOps. To help the DataOps team succeed in their endeavours, there are specific tools available, which are generally more advanced technologies than those used in DevOps and agile methodologies. Self-service tooling for data discovery (search), data transformation and preparation, warehousing, data lakes, AI/ML and API management can all play vital roles in developing your DataOps. This advanced tooling can lead to the blurring of roles and a more emergent data ecosystem.


Wasmtime Enables Running WebAssembly From .NET Programs

Huene says sharing WebAssembly modules is significantly easier than sharing native libraries, e.g. C libraries, which require platform-specific builds. Instead, WebAssembly modules can be distributed without modifications. ... This workflow is not yet perfectly streamlined, since you have to deal with a number of low-level details such as converting between values, but things should improve in the future, says Huene, thanks to the upcoming WebAssembly interface types proposal. This will improve interoperability and simplify the exchange of complex data types between WebAssembly and .NET without requiring too much glue code for value marshalling, making a WebAssembly module appear just like any other .NET package. Untrusted code isolation is made possible by WebAssembly's requirement that a module explicitly import all external functions it is allowed to use and only access a region of memory reserved by the host platform. This makes WebAssembly modules effectively run inside a sandbox that enables the execution of foreign plugins with the guarantee that they cannot access the host platform indiscriminately.



Quote for the day:


"Leaders begin with a different question than others. Replacing 'who can I blame' with 'how am I responsible?'" -- Orrin Woodward


Daily Tech Digest - December 17, 2019

Microsoft’s C# 9.0 begins to take shape

Among the capabilities eyed for C# 9.0 thus far are: Simplified parameter null validation code, allowing standard null validation on parameters to be simplified using a small annotation on parameters; Support for a switch expression as a statement expression when every arm’s expression is also a statement expression. No common type among the arms is needed when used as a statement expression; Records, a simplified declaration form for C# class and struct types combining benefits of similar features. Records provide a mechanism for declaring a datatype by describing members of the aggregate as well as additional code or deviations from the usual boilerplate, if any; CallerArgumentExpression, enabling developers to capture the expression passed to a method, to allow better error messages in diagnostic/testing APIs and reduce keystrokes; Relaxing of ordering constraints around “ref” and “partial” modifiers on type declarations; Primary constructors, to reduce programming overhead by putting constructor arguments directly in scope throughout a class, obviating the need to explicitly declare a backing field.


Why the gig economy is in danger


The new California law, which takes effect on Jan. 1, 2020, will require companies to reclassify contract, freelance and contingent workers as full-time employees eligible for benefits, a guaranteed $12 to $13 state minimum wage and protections under the state's employment law. The employer must deduct Social Security and Medicare taxes from the freelancer's fees, and contribute to worker's comp and unemployment insurance. It will put a damper on what freelancers can use as tax deductions. And this is not a one-industry issue, despite the initial response being all about freelance writers. While there are some 20 jobs that will be exempt from the law, including "creatives" (artists), travel agents, fishermen, stockbrokers, accountants, architects, doctors, insurance agents, lawyers, grant writers, real estate agents, tutors, truck drivers, and manicurists, many people who gig are going to find their income seriously curtailed. ... Previously, California had applied a 10-factor test, often distilled down to one core factor, the "right to control," explained Danielle Lackey.


What we can learn from five recent IT outages

The year 2019 was remarkable for the sheer volume and diversity of IT outages that organizations experienced. It seemed like no one was immune from performance degradations, including major airlines, hospitals, commercial banks, stock exchanges, and even cloud providers. ... Salesforce faced its biggest service disruption in May 2019 when the deployment of a database script to its Pardot Marketing Cloud ended up granting elevated permissions to regular users. Salesforce had to block access to Pardot users to prevent employees from stealing sensitive corporate data. However, when this fix didn’t work, Salesforce had to then block network access to other Salesforce services like Sales Cloud and Service Cloud. Customers were unable to access the Pardot Marketing Cloud for 20 hours as Salesforce engineers took affected systems offline to resolve user access permissions. While Salesforce was able to restore data permissions for most customers within a day, it took an additional 12 days to roll out fixes for other Salesforce services.


Augmented Reality with the ArcGIS Runtime SDK for iOS

Augmented Reality (AR) experiences are designed to "augment" the physical world with virtual content. That means showing virtual content on top of a device's camera feed. As the device is moved around, that virtual content respects the real-world scale, position, and orientation of the camera's view. The ArcGIS Runtime SDK for iOS and the ArcGIS Runtime Toolkit for iOS from Esri together provide a simplified approach to developing AR solutions that overlay maps and geographic data on top of a live camera feed. Users can feel like they are viewing digital mapping content in the real world. In this article, we'll learn how to give users that AR map experience. But first, some terminology: in Runtime parlance, a Scene is a description of a 3D "Map" containing potentially many types of 3D geographic data. A Runtime SceneView is a UI component used to display that Scene to the user. When used in conjunction with the ArcGIS Toolkit, a SceneView can quickly and easily be turned into an AR experience to display 3D geographic data as virtual content on top of a camera feed.


Microsoft: We never encourage a ransomware victim to pay

"We never encourage a ransomware victim to pay any form of ransom demand," said Ola Peters, Senior Cybersecurity Consultant for Microsoft Detection and Response Team (DART), the OS maker's official incident response team. "Paying a ransom is often expensive, dangerous, and only refuels the attackers' capacity to continue their operations," Peters added. However, Microsoft understands that in many cases, organizations are sometimes left with only one option on the table -- paying the ransom -- as they don't have access to recent backups, or the ransomware encrypted the backups as well. But even if victims choose to pay the ransom, Microsoft warns that "paying cybercriminals to get a ransomware decryption key provides no guarantee that your encrypted data will be restored." ... Instead, Microsoft would want companies to take a pro-active approach and treat ransomware or any form of cyber-attack "as a matter of when" and not "whether." Companies, Microsoft says, should invest in minimizing the attack surface and in creating a solid backup strategy so they can recover from any attack.


12 programming mistakes to avoid

Failing to shore up the basics is the easiest way to undercut your code. Often this means overlooking how arbitrary user behavior will affect your program. Will the input of a zero find its way into a division operation? Will submitted text be the right length? Have date formats been vetted? Is the username verified against the database? Mistakes in the smallest places cause software to fail. Some developers exploit the error catching features of the code to cover up these failures. They wrap their entire stack with one big catch for all possible exceptions. They dump the error to a log file, return an error code, and let someone else deal with the issue. ... On the flip side, overly buttoned-up software can slow to a crawl. Checking a few null pointers may not make much difference, but some software is written to be like an obsessive-compulsive who must check that the doors are locked again and again so that sleep never comes. Relentless devotion to detail can even lock up software if the obsessive checking requires communicating with a distant website over the network.
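The basics listed above — zero finding its way into a division, unvetted text length, unchecked date formats — can be illustrated with a boundary-validation sketch. The function, field names, and limits here are invented for the example; the point is validating each input explicitly with a specific error, rather than wrapping the whole stack in one big catch-all.

```python
# Sketch of "shoring up the basics": validate arbitrary user input at the
# boundary, raising a specific error per failure, instead of dumping every
# exception into one catch-all log.
from datetime import datetime

def parse_submission(amount: str, parts: str, date_text: str, comment: str):
    """Validate each field explicitly before any computation happens."""
    divisor = int(parts)
    if divisor == 0:                      # will a zero reach a division?
        raise ValueError("parts must be non-zero")
    if not (1 <= len(comment) <= 280):    # is the submitted text the right length?
        raise ValueError("comment length out of range")
    when = datetime.strptime(date_text, "%Y-%m-%d")  # vet the date format
    return {
        "share": float(amount) / divisor,
        "date": when.date(),
        "comment": comment,
    }

result = parse_submission("100", "4", "2019-12-17", "looks fine")
```

Each check costs one line and fails loudly with a message the caller can act on, which is the opposite of "dump the error to a log file and let someone else deal with the issue." It also stays on the right side of the passage's other warning: validation happens once at the boundary, not obsessively on every internal call.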


A decade of smart city projects: What worked and what didn't


American cities faced unintended consequences as a result of one data-driven idea. To improve public transportation in low-income communities, cities started building apartments and condos near transit stops. Instead of expanding educational and employment opportunities, these developments encouraged gentrification, raised rents in poor neighborhoods, and priced out the same people the transit expansions were meant to serve. The San Diego Union-Tribune studied the developments in four California cities where about 400 multifamily buildings were completed or under construction within a half mile of a transit stop. In neighborhoods where most families made less than $64,000 a year, the newspaper's analysis found that monthly rent for a two-bedroom apartment was more than $3,500. In some areas where median household income was less than $30,000, the average rent on a two-bedroom apartment is still more than $3,300.


Most people are going into EA because they want to have a holistic view of the problem at hand. I do think that EA is a mindset that you can apply to any type of issue or problem you have. You look at an issue from many different perspectives and try to understand the fit between the issue or the problem and potential solutions. It's human nature to want to look at things from a holistic point of view. It’s such an interesting area to be in, because you can apply it to just about everything. Particularly, a general EA application, where you look at the business, how it works, and how that will affect the IT part of it. So looking at that holistic view I think is the important part -- and that’s the motivation. ... But to become agile doing EA means adopting the agile mindset, too. We talked earlier about EA being the mindset.


The future of intelligence analysis

The intelligence cycle
Intelligence leaders know that AI can help cope with this data deluge, but they may also wonder what impact AI will have on their work and workforce. According to surveys of private sector companies, there is a significant gap between the introduction of AI and understanding its impact. Nearly 20 percent of workers report experiencing a change in roles, tasks, or ways of working as a result of implementing AI, yet nearly 50 percent of companies have not measured how workers are being impacted by AI implementation. This article begins to tackle those questions, offering a task-level look at how AI may change work for intel analysts. It will also offer ideas for organizations seeking to speed adoption rates and move from pilots to full-scale deployment. AI is already here; let's see how it will shape the future of intelligence analysis. ... Intelligence flows through a five-step "cycle" carried out by specialists, analysts, and management across the IC: planning and direction; collection; processing; analysis and production; and dissemination. The value of outputs throughout the cycle, including the finished intelligence that analysts put into the hands of decision-makers, is shaped to an important degree by the technology and processes used, including those that leverage AI.


5 Top Cybersecurity and DevOps Trends for 2020


“The case for why companies should protect consumer data is clear: companies lose less money and consumer information is safe from predators,” said Simon Marchand, chief fraud prevention officer for Nuance Communications. “But in the event of a data breach, what many people don’t consider is that, once their data is stolen, it is often made available for the highest bidder on the dark web. And, in some cases, this personal data is used to fund some of the most heinous of crimes—from terrorist organizations to drug and human trafficking.” Companies have a responsibility to stop the broader implications of fraud that go beyond their bottom line and their brand perception, Marchand added: “It’s not only about preventing customer information from being stolen, it’s preventing fraudsters from getting in organizations with information stolen elsewhere.” To that, Munya Kanaventi, senior director of information security at Everbridge, added: “A gap exists in the current Chief Security Officer and Chief Information Security Officer job descriptions, which is the ability to add strategic value to the company. There’s a lot of highly technical people in this role, but when you advance to the C-suite title, there’s a need for business vision alongside technical prowess.”



Quote for the day:


"No man is good enough to govern another man without that other's consent." -- Abraham Lincoln


Daily Tech Digest - December 16, 2019

AI R&D is booming, but general intelligence is still out of reach


For a start, the majority of these milestones come from defeating humans in video games and board games — domains that, because of their clear rules and easy simulation, are particularly amenable to AI training. Such training usually relies on AI agents sinking many lifetimes' worth of work into a single game, training hundreds of years in a solar day: a fact that highlights how quickly humans learn compared to computers. Similarly, each achievement was set in a single domain. With very few exceptions, AI systems trained at one task can't transfer what they've learned to another. A superhuman StarCraft II bot would lose to a five-year-old playing chess. And while an AI might be able to spot breast cancer tumors as accurately as an oncologist, it can't do the same for lung cancer (let alone write a prescription or deliver a diagnosis). In other words: AI systems are single-use tools, not flexible intelligences that are stand-ins for humans. But — and yes, there's another but — that doesn't mean AI isn't incredibly useful. As this report shows, despite the limitations of machine learning, it continues to accelerate in terms of funding, interest, and technical achievements.



Data Management Patterns for Microservices Architecture

For applications where multiple transactions are involved, the Saga pattern is the predominant microservices data management pattern. A saga is a series of local transactions, each of which publishes an event stating the status of the operation it triggered. Each subsequent service depends on the status of the previous one, so when a transaction fails, the saga automatically undoes the transactions that were already completed. When a customer places an order in an eCommerce store, two services are involved: the customer service and the order service. When the customer service submits the order, the order enters a pending state. The saga contacts the eCommerce store through the order service and manages the placing of events. Once the order service receives confirmation about the order, it sends a reply, and depending on that reply the saga approves or rejects the order. The final status presented to the customer states either that the order will be delivered or that the buyer should proceed to payment.
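The saga mechanics described above — local steps, status events, and automatic undo on failure — can be sketched in a few lines. This is a hedged toy orchestrator, not any specific framework's API; the service and step names are illustrative.

```python
# Toy saga orchestrator: each local step is paired with a compensating
# action that undoes it if a later step fails.

def run_saga(steps):
    """steps: list of (action, compensation) callables. Runs actions in
    order; on failure, runs completed steps' compensations in reverse
    and rejects the saga."""
    completed = []
    for action, compensate in steps:
        try:
            action()
            completed.append(compensate)
        except Exception:
            for undo in reversed(completed):
                undo()
            return "rejected"
    return "approved"

def fail(msg):
    raise RuntimeError(msg)

log = []
order_saga = [
    (lambda: log.append("order pending"),   lambda: log.append("order cancelled")),
    (lambda: log.append("payment charged"), lambda: log.append("payment refunded")),
    (lambda: fail("out of stock"),          lambda: None),  # stock step fails
]
```

Running `run_saga(order_saga)` rejects the order, and the log shows the payment refund and order cancellation executing in reverse order of the original steps.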


Algorithmia: 50% of companies spend between 8 and 90 days deploying a single AI model


Despite the fierce search for data science talent in the enterprise, nearly 55% of companies represented in the report say they haven’t yet deployed a machine learning model (up from 51% of companies last year). A full one-fifth are still evaluating use cases or plan to move models into production within the year, and just over 22% have had models in production for two years or fewer. That jibes with a recent study conducted by analysts at International Data Corporation (IDC), which found that of the organizations already using AI, only 25% have developed an “enterprise-wide” AI strategy. Firms responding to that survey blamed the cost of AI solutions and a lack of qualified workers, as well as biased data and unrealistic expectations. As alluded to earlier, moving models into production remains a challenge for most organizations, according to Algorithmia. At least 20% of companies of all sizes say their data scientists spend a quarter of their time deploying models, owing to pervasive scaling blockers like sourcing the hardware, data, and tools and performing the necessary optimizations.


But AI doesn’t just operate behind the scenes. If you’ve ever applied for a job and then been engaged by a text conversation, there’s a chance you’re talking to a recruitment bot. Chatbots that use natural-language understanding created by companies like Mya can help automate the process of reaching out to previous applicants about a new opening at a company, or finding out whether an applicant meets a position’s basic requirements — like availability — thus eliminating the need for human phone-screening interviews. Mya, for instance, can reach out over text and email, as well as through messaging applications like Facebook and WhatsApp. Another burgeoning use of artificial intelligence in job selection is talent and personality assessments. ... These systems typically operate on a scale greater than a human recruiter. For instance, HireVue claims the artificial intelligence used in its video platform evaluates “tens of thousands of factors.” Even if companies are using the same AI-based hiring tool, they’re likely using a system that’s optimized to their own hiring preferences. Plus, an algorithm is likely changing if it’s continuously being trained on new data.


Spatial computing comes to the enterprise


As we've become increasingly familiar with the positive effects AR has on attention and memory encoding, it was exciting to see AR's adoption expand outside of a marketing context. In the workplace we observed practical applications of AR in areas such as employee onboarding, training, and professional development, with empirical evidence highlighting AR's power to drive efficiencies, shorten time to competency, and improve memory recall — galvanizing a disconnected workforce and helping reduce overheads. Pizza chain Papa Murphy's, for example, continues to leverage AR for its employee onboarding program by creating AR-powered stations at key training locations. These types of use cases are becoming increasingly common across a variety of industries — from financial services to healthcare, large consumer goods conglomerates to higher education and vocational learning institutions. As more businesses trial the technology and the best use cases are shared, the more adoption we'll see and the more mainstream AR will become as an L&D tool.


Predictions 2020: What's Going to Happen in Cloud Computing

Hyperconvergence emerged several years back to describe several data center elements consolidating into a single box. More recently, we've started to see the emergence of DHCI (distributed hyperconverged infrastructure), an approach that I see as contradictory and antithetical. As our industry moves forward in 2020, a new category will capture the essence of software-defined everything, and I believe it will be the notion of hybrid cloud. Hardware will still be required, but it could be located anywhere; software will continue to coordinate the increasing complexity to the point where the location of hardware will increasingly become irrelevant in 2020. ... Containerization and solution portability will become the new battleground for enterprise IT; vendors having "the best" deployment-specific point solutions will lose out to competitors that can span multiple domains (e.g., public cloud, private cloud, on-premises) with ubiquitous offerings, thereby providing freedom and leverage against lock-in. Advertising claims will soar.


AI's real impact? Freeing us from the tyranny of repetitive tasks


In 2020, AI will begin to live up to the hype by starting to generate real economic value through its application across industries. According to consulting firm PricewaterhouseCoopers, the widespread adoption of AI will add about $15.7 trillion (£12.8 trillion) to global GDP by 2030. Most of that business value will come not from AI-focused companies, but from the infusion of artificial intelligence into traditional industries. Early movers who embrace AI will become the winners. One defining area of AI infusion is in the automation of repetitive tasks, using technologies such as RPA (robotic process automation). RPA will see widespread application in the work performed by functions such as accounts payable, back-office processing and various forms of data management. Routine tasks associated with a large number of jobs will now lend themselves to automation, freeing up people’s time to focus on more complex endeavours. RPA is already creating some of the most valuable AI companies in the world. Another similar area of routine task replacement is the use of speech recognition and natural-language processing in customer service, telemarketing and telesales.


How to Effectively Achieve IT Resilience with Hybrid Cloud and Multi-cloud

As companies look to implement these alternative cloud models, it’s important that they fully understand the time and resource investments needed to ensure they’re not leaving the company susceptible to IT failures or cyberattacks. Orchestrating these environments in a way that meets both IT and business needs is no easy feat. ... There are a host of different cloud options that an organization can choose from. So it’s critical that companies take a pragmatic approach to evaluate their options and ensure they’re picking services that meet both IT and business needs. To do this, they should create a committee of key decision-makers to establish which data, systems, and applications are most critical to operations; set a budget; and discuss where data currently resides. This way, they have a full picture of the current status of their IT infrastructure and can establish parameters around what they’d ideally like the outcome of the project to be. The biggest mistake organizations make is embarking on these projects without identifying internal champions to lead the endeavor.


Like many new technologies, BCIs have attracted interest from the military, and US military emerging technology agency DARPA is investing tens of millions of dollars in developing a brain-computer interface for use by soldiers. More broadly, it's easy to see the appeal of BCIs for the military: soldiers in the field could patch in teams back at HQ for extra intelligence, for example, and communicate with each other without making a sound. Equally, there are darker uses that the army could put BCIs to -- like interrogation and espionage. ... There are currently two approaches to BCIs: invasive and non-invasive. Invasive systems have hardware that's in contact with the brain; non-invasive systems typically pick up the brain's signals from the scalp, using head-worn sensors. The two approaches have their own benefits and disadvantages. With invasive BCI systems, because electrode arrays are touching the brain, they can gather much more fine-grained and accurate signals. However, as you can imagine, they involve brain surgery, and the brain isn't always too happy about having electrode arrays attached to it -- the brain reacts with a process called glial scarring, which in turn can make it harder for the array to pick up signals.


As the saying goes, "You get out what you put in." An organisation can have masses of data, but unless it is cleansed and normalised it can be useless. We cannot take for granted knowing who the right John Smith is and being able to link a name with the correct address and date of birth. As usage-based insurance develops -- whether through aftermarket telematics devices, smartphone apps, connected vehicles, or, in the future, smart home data -- all that data needs to be gathered, normalised, and standardised so that consumers can enjoy an improved shopping experience based on their needs and preferences. In motor insurance we call this Driver DNA®, which allows insurers to verify and benchmark existing telematics scores. This market score becomes portable and allows drivers to take their driving score from one insurer and shop for insurance with another -- in the same way that no-claims discounts are universally applied. Image recognition ML techniques give us the speed limits of UK roads in real time. Without this data we could not know with a good degree of confidence that a person may be travelling at twice the speed limit in an urban area.
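The cleanse-and-normalise step mentioned above — making sure the "right John Smith" keys together across records — can be sketched simply. This is a generic illustration of canonicalising name, postcode, and date of birth, not the Driver DNA® pipeline itself; the field formats are assumptions.

```python
import re
from datetime import datetime

# Canonicalise the fields so that differently-formatted records
# for the same person produce the same linkage key.

def normalize_record(name: str, postcode: str, dob: str) -> tuple:
    name = re.sub(r"\s+", " ", name).strip().lower()   # collapse whitespace
    postcode = postcode.replace(" ", "").upper()       # uniform postcode
    dob = datetime.strptime(dob, "%d/%m/%Y").date().isoformat()  # ISO date
    return (name, postcode, dob)

a = normalize_record("John  Smith ", "ec1a 1bb", "05/03/1980")
b = normalize_record("john smith", "EC1A1BB", "05/03/1980")
```

After normalisation, `a` and `b` are identical keys, so the two raw records can be linked to the same individual.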



Quote for the day:


"Any one can hold the helm when the sea is calm." -- Publilius Syrus


Daily Tech Digest - December 15, 2019

5 Key Insights From Intel’s New “Accelerate Industrial”

A technical skills gap stands out as the number one obstacle to a successful digital transformation—flagged as crucial by over a third of respondents. Intel's report highlights a dramatic shift in the mix of skills needed for success: manufacturing companies believe the top 5 skills they will need for future growth are all digital skills, from data science to cybersecurity. Manufacturing skills, ranked as today's second most valuable ability, rank only #6 when looking at the future. Crucial will be the workforce's "digital dexterity", that is, the ability to understand both the manufacturing process and the new digital tools. To leverage the full value of digital-industrial innovations, companies will need to truly meld digital technologies into their manufacturing processes, and this requires a workforce fluent in both sets of skills. ... The skills gap represents a tremendous challenge for companies. At the moment, companies are trying to address the gap by setting up training programs in specific digital skills. This, however, will not be enough.



Blood test combined with AI program could speed up diagnosis of brain tumors

Dr Brennan has worked with Dr Matthew Baker, reader in chemistry at the University of Strathclyde, UK, and chief scientific officer at ClinSpec Diagnostics Ltd, to develop a test to help doctors quickly and efficiently find those patients who are most likely to have a brain tumor. The test relies on an existing technique, called infrared spectroscopy, to examine the chemical makeup of a person's blood, combined with an AI program that can spot the chemical clues that indicate the likelihood of a brain tumor. The researchers tried out the new test on blood samples taken from 400 patients with possible signs of brain tumor who had been referred for a brain scan at the Western General Hospital in Edinburgh, UK. Of these, 40 were subsequently found to have a brain tumor. Using the test, the researchers were able to correctly identify 82% of brain tumors. The test was also able to correctly identify 84% of people who did not have brain tumors, meaning it had a low rate of 'false positives'. In the case of the most common form of brain tumor, called glioma, the test was 92% accurate at picking up which people had tumors.
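The 82% and 84% figures above correspond to the standard screening metrics: sensitivity (fraction of actual tumors the test catches) and specificity (fraction of tumor-free patients it correctly clears). A small sketch makes the definitions concrete; the per-100 counts below are illustrative, not the study's raw confusion matrix.

```python
# Screening metrics from a confusion matrix.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """True-positive rate: share of actual tumors the test identifies."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """True-negative rate: share of tumor-free patients correctly cleared."""
    return true_neg / (true_neg + false_pos)

# Illustrative per-100 counts matching the reported rates:
sens = sensitivity(82, 18)   # 82 of 100 tumors flagged
spec = specificity(84, 16)   # 84 of 100 healthy patients cleared
```

With the study's imbalance (40 tumors among 400 patients), a high specificity matters: even a 16% false-positive rate on 360 tumor-free patients would send dozens of healthy people for scans.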


Google rolls out Verified SMS and Spam Protection in Android

As the name of the first feature hints, Verified SMS works by confirming the identity of the SMS sender. "When a message is verified, which is done without sending your messages to Google, you'll see the business name and logo as well as a verification badge in the message thread," said Roma Slyusarchuk, a Google software engineer on the Messages app. Verified SMS will only be used to verify the authenticity of SMS messages sent by businesses; it won't verify or add a verification badge to messages sent by ordinary users. Google said it created this feature to help users trust the messages they receive, especially for "things like one-time passwords, account alerts or appointment confirmations." The Android OS maker didn't explain how the new feature works, but it did say that it should be able to detect SMS messages sent from random numbers not previously associated with a company and, consequently, help prevent some phishing attacks.


Slow Down to Do More: “Leave room in your schedule for the unexpected” 

One of the biggest problems with rushing through things, both in work and in life, is that it increases the likelihood that you’ll make a mistake. Multitasking is a skill so many people want to fully harness, but the reality is that studies have shown that trying to focus on several tasks at once doesn’t allow you to do any of the tasks as well, and it doesn’t save you time. It can actually waste time because when you switch from one task to another, your brain must refocus. This requires additional time if you’re constantly switching back and forth, compared to if you just focus on one task at a time. In addition, people who rush through their work tend to have higher stress levels, which can lead to more health problems and a lower level of happiness. Finally, we need to find time to take some distance from our work, to take the high ground and just to think. We are constantly consumed by distractions, and when we take the time to break from the norm, and create room for thoughts to ideate, we will be considerably more productive, healthier and happier.


Agile Estimation — Prerequisites for Better Estimates

One person might consider all aspects of the functional and nonfunctional requirements and estimate a story as big. Another might estimate it as small without considering nonfunctional requirements like security and performance. The estimate also depends on your delivery best practices: if you consider unit tests, automation, accessibility, and device support part of the doneness criteria, then the estimate will be different. I definitely recommend making all of these best practices part of your estimation; they are a must for quality and easier maintenance, and they will cut down cost in the long run. ... The development team and the product management team must be on the same page. The development team must understand the business goals as well as the product management team does, understand the product management team's objectives, and identify the must-have requirements for supporting business growth. This will help you decide the type of architecture foundation required: if the business goals on the roadmap call for growth in size (users, data footprint), the architecture will have to differ from one serving shorter-term business goals.


Q&A on the Book Building Digital Experience Platforms

Digital Experience Platforms (DXPs) are integrated sets of technologies that aim to provide a user-centric, engaging experience, improve productivity, accelerate integration, and deliver solutions quickly. DXPs are based on a platform philosophy, so they can easily extend and scale to future demands of innovation and continuously adapt to changing technology trends. Enterprises get a solid, integrated foundation for all their applications, which meets the needs of organizations going through digital transformation and provides a better customer experience across all touchpoints. DXPs package the most essential set of technologies, such as content management, portals, and ecommerce, which are necessary to digitize enterprise operations and play a crucial role in the digital transformation journey. DXPs offer built-in features such as presentation, user management, content management, personalization, analytics, integrations, SEO, campaign management, social and collaboration, and search, among others.


4 Robotic Process Automation Trends For 2020

For a long time, prognosticators have anticipated a future in which robots and intelligent agents run the world to the detriment of human workers. Job losses, they predicted, would be unavoidable as AI did things faster, smarter, and with fewer HR headaches. But an HBR report that studied the effects of various RPA implementations found that replacing administrative employees was neither the primary goal nor a typical outcome in 47% of the initiatives studied. In fact, only a handful of those RPA projects led to reductions in headcount, and in many cases the tasks had already been moved to outside workers. Modern RPA bots are designed to adapt to changing conditions and determine the correct response automatically and quickly. RPA is most commonly thought of as a productivity and efficiency tool: reducing or eliminating tedious manual processes is an efficiency in itself. RPA and other forms of automation will also become a more visible part of data security strategies, not because an army of bots will be fighting threats on the front lines, but because they can help reduce the most pervasive risk of all: human error.



Angular Breadcrumbs with Complex Routing and Navigation

The UI structure of the breadcrumbs on any serious website looks simple. But the underlying code logic, operation rules, and navigation workflow are not simple at all, due to routing complexities and navigation varieties. This article will demonstrate a sample application with full-featured breadcrumbs and discuss how the implementation and testing issues were resolved. The sample application that can be downloaded with the above links is a modified version of the original Heroes example from the Angular Routing & Navigation documentation. I didn't want to reinvent the wheel by creating my sample application from scratch, and the Heroes example covers most routing patterns and types, so it can serve as a base for adding breadcrumb features. It is, however, not enough for demonstrating realistic breadcrumbs with complex navigation scenarios and workflow completeness. The modification tasks involve adding more pages with corresponding navigation routes, changing UI structures and styles, fixing active router link issues with custom alternatives, and updating the code logic for authenticated session creation and persistence, just to mention a few.


Blockchain Prediction: 2020 Will Enable Levels of Data Trust


It will seem counterintuitive to most CISOs and other security professionals to hear that something public is more secure. Enterprises often prefer to operate in their walled gardens and at first will be skeptical of public ledgers. But this stance will change over time. It is somewhat analogous to what happened with intranets and the internet: at first, enterprises only wanted systems connected internally (intranets), but eventually realized the value in connecting to external networks (the internet) as well. Interest in blockchain has also germinated a vibrant research community that’s looking into novel cryptographic techniques such as zero-knowledge proofs, trusted computing platforms, verifiable delay functions and other innovative “cryptoeconomic” tools. As this research moves from the lab to the data center, we anticipate that these technologies will make computing more secure and private than ever before. Security has always been a priority; more recently, privacy has become one as well. Individuals aren’t in control of their data: from your healthcare data to your browsing history, your data is at risk of being exposed or, worse, manipulated.


Two Critical Questions for your Enterprise Blockchain Application

Any data going on a public chain is open, accessible, and irrevocable. Thus, a public blockchain is not GDPR (and, from next year, CCPA) compliant unless the data has been encoded with quantum-resistant algorithms before being stored. Personally Identifiable Information (PII) or sensitive data compromising user privacy should not be stored on a blockchain. However, a blockchain still needs account (i.e., wallet) addresses that individually link to their real users ... The performance of software directly depends on the performance of its dependencies and their host environments. Blockchain brings a new paradigm of decentralized architecture, where every node on the chain constantly updates its state to maintain the world state. In addition, a blockchain application also needs to deal with the following issues and their varied implementations. ... A blockchain relies on the distributed consensus of participant nodes. PoW (Proof of Work) consensus takes more time to reach finality across the system than any PoS (Proof of Stake) system.
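The PoW consensus mentioned above boils down to a brute-force search: hash the block contents with a changing nonce until the digest meets a difficulty target. Here is a toy illustration of that mechanic (deliberately tiny difficulty; no production chain's parameters are implied).

```python
import hashlib

# Toy Proof-of-Work: find a nonce so that sha256(data + nonce)
# starts with `difficulty` hex zeros.

def mine(block_data: str, difficulty: int = 2) -> tuple:
    """Return (nonce, digest) once the digest meets the target.
    Low difficulty keeps this demo fast; real chains use far higher."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1
```

The asymmetry is the point: finding the nonce takes many hash attempts, but any node can verify the result with a single hash, which is what makes the work "proof."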



Quote for the day:


"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr


Daily Tech Digest - December 14, 2019

Watch Out: 7 Digital Disruptions for IT Leaders

Inexpensive sensors can now track physical biometrics, and organizations are working on providing hyper-personalized digital experiences, according to Gartner. The firm forecasts that by 2024, AI identification of emotions will influence more than half of the online ads that you see. This trend will reach beyond marketing to consumers: it could also be used in HR applications and be applied to employee evaluations, for instance. Gartner recommends that CIOs identify emotional trigger-based opportunities with employees and customers, add emotional-state evaluation to 360 Review processes, and mitigate privacy concerns with opt-in, for-pay emotion mining. ... While it cost 4% of the entire U.S. budget to put a man on the moon, putting a satellite into orbit now costs just $300,000, Plummer said. That has led to low Earth orbit getting mighty crowded, with hundreds of satellites. It also raises a host of new questions: What rules apply to data residency in space? What laws apply? What about crime in space? Countries and companies will be competing in space, and the cheaper it gets to launch a satellite, the more crowded it will become.


Corporate venture capital deals hit new record as banks invest in fintech competitors

Financial services corporate venture deals surged 500% from 2014 through the third quarter of 2019. Nearly half of the total financial services deals are in California. Citi Ventures is the most active when it comes to deal flow, with 66 venture deals, compared to 64 by Goldman Sachs' VC arm. Goldman has backed the most so-called unicorns, with five companies valued at more than $1 billion; its bets include Plaid, Circle and Marqeta. Six other financial services groups have invested in three or more unicorns. American Express Ventures came in at number three with 55 deals since 2014. Payment corporate venture capital deals reached peak levels this year, according to CB Insights. But capital markets corporate VC, with names like CME Ventures, Monex Ventures and Nasdaq Ventures, is slowing, according to CB Insights. It's not just corporate venture capital: overall, fintech funding is surging, with start-ups bringing in a record $24.6 billion in funding through September, according to CB Insights.


Going to the dark side: Should you consider becoming a consultant?

If there's one thing I find that makes or breaks a successful consultant, it's an ability to thrive in an uncertain environment. Some people think I'm joking when I tell them I have no idea where I'll be physically working, what company I'll be working with, and what I'll be trying to accomplish weeks from now, but it's absolutely true. For some people, that's a thrilling proposition and very different from a predictable role where they can map out their future with relative certainty months, or even years out. Consulting provides a unique opportunity to quickly gain wildly diverse experiences across industries, geographies, and technologies, and also creates an opportunity to reinvent your career on a regular basis as you acquire new experiences and quickly develop new skills. The downside to this unpredictability is what drives many people to leave the profession. It's difficult to plan everything from family events to routine doctor visits when you could literally be anywhere in the world, in some cases with a day's notice (or less). The excitement of the unknown can quickly become a frightening instability and a sense that you have no control over your destiny.


Financial innovation in China: leading the way and one eye on the future

In the payment space, Accenture predicts ongoing cumulative losses of $US 61 billion to China’s incumbent banks between 2019 and 2025 due to digital payment platforms. Currently, Alipay and WeChat Pay represent about 90% of the payment market. Meanwhile, the loan books of the neobanks have grown with incredible speed. The total credit on the books of the biggest neobanks owned by WeChat and Alipay at the end of 2017, was RMB 1.3 trillion, or 22% of all of China’s consumer credit. An impressive figure considering they did this within three years of launch. So yes, being threatened is undoubtedly an excellent motivator for innovation. But there is an even more significant threat. Banking is becoming an afterthought. With the ease of digital payment provided by WeChat Pay and Alipay, most users don’t care much about the bank’s app or services, they are irrelevant, and banks are used only for storing money. Relegated to the role that some refer to as “dumb pipes.”


Mozilla to force all add-on devs to use 2FA to prevent supply-chain attacks

When this happens, hackers can use the developers' compromised accounts to ship tainted add-on updates to Firefox users. Since Firefox add-ons have a pretty privileged position inside the browser, an attacker can use a compromised add-on to steal passwords, authentication/session cookies, spy on a user's browsing habits, or redirect users to phishing pages or malware download sites. These types of incidents are usually referred to as supply-chain attacks. When they happen, end users have no way of detecting if an add-on update is malicious or not, especially when a tainted update comes from the official Mozilla AMO -- a source considered secure by all Firefox users. Mozilla's decision to force add-on devs to enable 2FA is the best course of action the browser maker could have taken to prevent future supply-chain incidents. While there have been no known cases of AMO account hijackings for Firefox add-ons in recent years, there have been many cases of hijacked Chrome extensions.
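The article doesn't say which second factor Mozilla requires, but most 2FA deployments center on the TOTP codes generated by authenticator apps. As a hedged illustration of the mechanism (RFC 6238, not Mozilla's specific implementation), the whole scheme fits in a few lines of standard-library Python.

```python
import base64
import hashlib
import hmac
import struct
import time

# TOTP (RFC 6238): HMAC the current 30-second time-step counter with a
# shared secret, then extract a 6-digit code from the digest.

def totp(secret_b32: str, for_time: float = None, step: int = 30,
         digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    t = time.time() if for_time is None else for_time
    counter = int(t // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0]
            & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

Because the code is derived from a secret the attacker doesn't have and a counter that expires every 30 seconds, a phished account password alone is no longer enough to push a tainted add-on update.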


How the ArchiMate Modeling Standard Helps EAs Deliver Greater Business Agility

The key role of architecture is to ensure that you have flexibility in the short term and in the long term. Models are a great help in that, and that's of course where the ArchiMate standard comes in. It lets you create models in a standardized way that everybody understands the same way. It lets you analyze your architecture across many aspects, including identifying complexity bottlenecks, cost issues, and risks from outdated technology -- or any other kind of analysis you want to make. Enterprise architecture is the key discipline in this new world of digital transformation and business agility. Although the discipline has to change to move with the times, it is still very important for making sure that your organization is adaptive, can change with the times, and doesn't get stuck in an overly complex, legacy world. ... The capability concept, and the mapping between capabilities, is also very important. That allows you to see which capabilities are needed at each stage of value production.


Instagram explains how it uses AI to choose content for your Explore tab


In its blog post, though, Instagram’s engineers explain the operation of the Explore tab while steering clear of thorny political issues. “This is the first time we’re going into heavy detail on the foundational building blocks that help us provide personalized content at scale,” Instagram software engineer Ivan Medvedev told The Verge over email. (You can read about how Instagram organizes content on the main feed in this story from last year.) The post emphasizes that Instagram is huge, and the content it contains is extremely varied, “with topics varying from Arabic calligraphy to model trains to slime.” This presents a challenge for recommending content, which Instagram overcomes by focusing not on what posts users might like to see, but on what accounts might interest them instead. Instagram identifies accounts that are similar to one another by adapting a common machine learning method known as “word embedding.” Word embedding systems study the order in which words appear in text to measure how related they are.
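The account-embedding idea can be illustrated with a toy co-occurrence model: treat each browsing session as a "sentence" of account IDs, count which accounts appear together, and compare accounts by the similarity of their co-occurrence vectors. Everything below is invented for illustration (the session data and account names are hypothetical, and Instagram's real system learns dense embeddings at vastly larger scale):

```python
from collections import defaultdict
from math import sqrt

# Hypothetical browsing sessions: sequences of account IDs a user
# interacted with, treated like sentences of words.
sessions = [
    ["calligraphy_arts", "ink_daily", "pen_craft"],
    ["ink_daily", "calligraphy_arts", "pen_craft"],
    ["model_trains_hub", "rail_miniatures"],
    ["rail_miniatures", "model_trains_hub", "scale_models"],
]

def cooccurrence_vectors(sessions):
    """For each account, count how often every other account shares a session
    with it (a crude, sparse stand-in for a learned embedding)."""
    vecs = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for a in session:
            for b in session:
                if a != b:
                    vecs[a][b] += 1
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts)."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

vecs = cooccurrence_vectors(sessions)
```

Accounts that keep appearing in the same sessions (the two calligraphy accounts) end up with similar vectors, while accounts from an unrelated interest cluster (the model-train accounts) score near zero against them, which is the intuition behind recommending "accounts like the ones you already engage with."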


IoT Has Spawned Entity-Based Risks -- Now What?

The security problem will only grow more complex. A study conducted by 451 Research (via Yahoo Finance) estimates that “the number of IoT connected devices (excluding PCs, smart TVs, and game consoles) will be approximately 8 billion in 2019 and reaching nearly 14 billion in 2024," while a report from the International Data Corporation (via MarketWatch) forecasts that worldwide spending on IoT will reach $745 billion in 2019. Increased connectivity means increased security threats. From my experience, many IoT products don't get regular updates, while some can't be updated. This exposes devices to potential cyberattacks that target vulnerabilities in outdated hardware and software. In addition, most IoT devices come with default passwords that can be easily compromised using publicly available password lists and automated searches for particular devices. Others have weak credentials that are susceptible to brute-force password hacking. The exponential growth in IoT devices has led to more ransomware, malware and botnet attacks that are specifically targeting certain equipment.
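The default-credential problem described above is straightforward to check for defensively. As a minimal sketch (the credential list here is illustrative, not drawn from any specific product or published list), an audit script can flag logins that match well-known defaults or are short enough to brute-force:

```python
# Illustrative sample of well-known factory-default credential pairs.
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
    ("user", "1234"),
}

def audit_device(username: str, password: str) -> list:
    """Return human-readable findings for one device's login credentials."""
    findings = []
    if (username, password) in DEFAULT_CREDENTIALS:
        findings.append("uses a well-known default credential pair")
    if len(password) < 8:
        findings.append("password shorter than 8 characters (brute-forceable)")
    return findings
```

A fleet inventory tool could run `audit_device` against every managed device and report the ones that come back with findings, catching exactly the weak and default credentials the attacks above exploit.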


Battleground over accountability for AI


Vogel said that many people hold the view that AI systems are neutral, but they don't understand how many human touch points are involved in their development. With successful AI reliant on diversity in its data sets and development teams, the under-representation of different gender and cultural groups in the IT industry, he said, has exacerbated problems relating to AI neutrality. Lyndon Summers, the operations manager at Open Universities Australia, agreed that we need expertise from diverse backgrounds. He noted that some of the most successful service developments and improvements he has seen came from listening to call centre staff, as well as developers and software engineers. "One of the biggest values is the human touch points," said Summers. "We need to find the right balance between people and automation and, if we are going to increase the level of automation we use, we have to find roles for the people we displace and perhaps get them into roles to help us build even more automation".


Adaptive systems, machine learning and collaborative AI with Dr. Besmira Nushi

We’re really good at reasoning and imagination, and machines are good at processing these terabytes of data for us and giving us these patterns. However, you know, if we can use the machine capabilities in an efficient way, we can be quicker and faster, as I said. But then, on the other hand, you know, these are concepts that, if you think deeply about it, are not that new, in the sense that when we invented personal computing in the 80s, this is one of the reasons why it became so successful: the personal computer was suddenly this “buddy” that could help you do things faster and quicker. But then there is another thing that enabled that development in those years, and really, I think that is the field of human-computer interaction. ... Another one that we focus a lot on is predictability of errors. And what this really means is that, if I’m working with an AI algorithm, I should be able to, kind of, understand when that AI algorithm is going to make mistakes.



Quote for the day:



"Many men may see the King in a Kid but it takes a true leader to nurture it." -- Bernard Kelvin Clive