In terms of the legal implications of AI, vicarious liability and agency cannot be applied to AI in the same way as they would for employee liability. Due to the black box nature of AI and the lack of transparency in its reasoning, it is difficult to attribute liability. The Fairchild principle of a 'material increase in risk' could be applied in future to determine liability, but without legislative clarification, the position is not entirely clear. Furthermore, AI can monitor price changes within a market and react very quickly, thereby potentially stifling competition by creating a form of collusion in the market. The European Commission is taking the threat of AI in competition seriously and exploring solutions to resolve these types of issues. From an intellectual property perspective, legislation has not been updated to cover the ownership of AI-generated intellectual property. Companies will need to ensure that ownership of any materials or intellectual property created by AI vests in them or is transferred to them. In terms of ethics, the law cannot cover every moral scenario. AI is already creating unintended gender, race and socio-economic bias based on the data it works with.
Organizations are employing security automation and orchestration technologies to make sure that the right person, with the right data, is there at the right time to make decisions, he said. In cybersecurity, it is important that the organization is clear about what actions must be taken after an incident occurs. Automation technologies can make changes right away to contain the issue, he added, but relying on technology alone isn't enough to prepare for today's advanced threats. Organizations should also practice breach preparedness drills to test their response, he stressed. Implementing these security orchestration and automation practices also relies on strong leadership that develops a team atmosphere and teaches team members to work together during a crisis, he said. It will be important to exhibit these strong cultural traits during a breach, especially because cybersecurity playbooks can crack under pressure, he added. "People want to practice what it's like to go through a breach," he said. "Security orchestration gives you the technology to respond fast and encourages you to practice it so that [when things go wrong] you're ready."
Organizations use predictive analytics to sift through current and historical data to detect trends and forecast events and conditions that should occur at a specific time, based on supplied parameters. With predictive analytics, organizations can find and exploit patterns contained within data in order to detect risks and opportunities. Models can be designed, for instance, to discover relationships between various behavior factors. Such models enable the assessment of either the promise or risk presented by a particular set of conditions, guiding informed decision-making across various categories of supply chain and procurement events. ... While getting started in predictive analytics isn't exactly a snap, it's a task that virtually any business can handle as long as it remains committed to the approach and is willing to invest the time and funds necessary to get the project moving. Beginning with a limited-scale pilot project in a critical business area is an excellent way to cap start-up costs while minimizing the time before financial rewards begin rolling in. Once a model is put into action, it generally requires little upkeep as it continues to grind out actionable insights for many years.
"The IoT is useful when you know you can derive business benefit by making unknown processes visible," she says. "If you try and use sensors everywhere, you will get nowhere because it's too expensive and it's too imprecise. Rolls-Royce picks the places where its IoT solutions can make data visible and create significant operational benefits. That, for me, is the key to a successful IoT deployment." Gorski advises other digital chiefs to analyse their business operations and understand where a lack of data transparency creates a headache. She has seen big-bang instrumentation projects happen and, for the most part, these are difficult to justify. "They end up being expensive to implement," says Gorski. "It's costly to transmit data and the business ends up with a patchwork quilt of information. It's important to remember there isn't a single solution for IoT instrumentation and you must bootstrap technology together from lots of different suppliers. All that bootstrapping adds costs and creates complexity."
“We are investing millions of pounds in technology, yet we’ve got six organisations that still can’t tell us what their waiting lists are. It’s not acceptable,” he said. Barts Health NHS Trust, for instance, hasn’t submitted a referral to treatment report to NHS England for nearly four years. “We walk around most hospitals and we’ve not known how many beds we have and how many patients are lying in them,” said Swindells. “We need to at the very least get the data that we capture back out. If we can’t do the basics, me going cap in hand to the treasury for another £10bn to sort IT out just sounds like fool’s money.” He highlighted e-rostering as another example of failing to use data properly, saying most hospitals use an e-rostering system, which he described as a “glorified spreadsheet” and “expensive pieces of technology that are not enabling better rostering: not enabling the matching of staffing to clinical need, not enabling staff to be flexible about when they work and therefore making more available”. “We have to make this stuff work well,” said Swindells.
The playing field might not remain in its current state for long, with the main issue being the proposed $26.5 billion merger between T-Mobile and Sprint. Partridge said that would be a game-changer for carrier-based IoT in the U.S. "In the consumer business, T-Mobile's going to be in charge of that, they've been wildly successful – but I think in IoT, Sprint will have every opportunity to take the lead," he said. The idea, after the combination, would be to make acquisitions aimed at strengthening the new company's position on the enterprise side of service provisioning in general, and on IoT in particular, though there are a number of tactical options for pursuing such a strategy. The new company could get into fleet management, à la Verizon and AT&T, snap up IoT software companies and package their offerings into new branded services, move heavily into surveillance and security, or even into hardware. "The playbook is fairly open in terms of that, but the goal is to get away from connectivity-only value, because that's not the place to be," according to Partridge.
Instead of a collective sigh, May 25 might create more of a collective grunt. Most privacy professionals know that although a lot of work has been done in the run-up to D-day, GDPR compliance will require a constant focus. It is a journey, not a final destination. Those organizations that treat May 25 as the endpoint of their compliance drive will be proven wrong. Another distinction between organizations will be their levels of ambition. Some organizations will treat GDPR as a mere checklist exercise, an approach I call the "lawyer" approach (with all due respect to the lawyers amongst you, including myself). Legal compliance is core, but an organization's ambition should aim to go beyond it and create a true cultural change. I truly believe that these privacy leaders ultimately will be rewarded in the market, banking on what I call a "trust dividend" and reaping the benefits of constant investments in this space. Even though there is a broad spectrum amongst organizations around GDPR compliance, there are also some common themes and questions. In my role as CA Technologies Chief Privacy Strategist, I have had the opportunity to discuss GDPR with organizations, both public and private.
The battle for LBS relevance moves from companies that only support increasingly commoditized location data, which they license (e.g., mapping data for GPS), to those that can offer enhanced and supplemental services. Previously seen as an old-style GPS/mapping data company, the largest LBS company, HERE, is moving away from the old model, although not totally. It's changing from just being a database to being a value-added supplier of a full range of LBS with its Open Location Platform. HERE has several partnerships with auto companies (Audi, BMW) and others (Intel, Oracle, Amazon Web Services, Microsoft) to add platform capabilities beyond its extensive mapping database. Those capabilities include value-added services such as tracking, traffic, safety services, and HD maps. HERE's main cloud-based LBS platform competitor, Mapbox, offers similar services but does not include its own mapping database, instead allowing clients to link to their preferred mapping data. HERE and Mapbox have some distinct strategy differences: Mapbox relies on others' data sets and can connect as needed and by user preference. HERE has its own data sets and is looking to add value on top.
Downloadable applications are particularly vulnerable to cyber criminals, as they can be isolated from the network and attacked indefinitely until their defences are broken. Because so many people use their personal mobile devices for work purposes, a compromised app will not only harm the individual and the business entity that published the app but could also grant attackers access to enterprise networks. Any application on an app store can be downloaded by anyone, and that includes bad actors. If an app is lacking in protection, once downloaded a bad actor might reverse engineer the app, leaving it vulnerable to wide-scale tampering, IP/PII theft, or API attacks. With the code left so exposed, the threat is extremely likely to turn into a widespread attack resulting in a loss of customers, brand damage, lost revenue, and lost jobs. On the other hand, with a threat analytics solution in place from the start, apps can provide valuable insights to the business the moment they are downloaded from an app store, thereby closing the loop.
Today, most AI workloads use a preconfigured database optimized for a specific hardware architecture. The market is going toward software-enabled hardware that will allow organizations to intelligently allocate processing across GPUs and CPUs depending on the task at hand, said Chad Meley, vice president of analytic products and solutions at Teradata. Part of the challenge is that enterprises use multiple compute engines to access multiple storage options. Large enterprises tend to store frequently accessed, high-value data such as customer, financials, supply chain, product and the like in high-performing, high I/O environments, while less frequently accessed big data sets such as sensor readings, web and rich media are stored in cheaper cloud object storage. One of the goals of composable computing is to use containerization to spin up compute instances such as SQL engines, graph engines, machine learning engines and deep learning engines that can access data spread across these different storage options.
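The hot/cold tiering decision described above can be sketched in a few lines. The thresholds, tier names, and `Dataset` class below are illustrative assumptions for this sketch, not any vendor's actual API; a real system would base the routing on measured access patterns and cost models.

```python
# A hedged sketch of routing data to storage tiers by access
# frequency and business value. All names and thresholds here
# are hypothetical, chosen only to illustrate the pattern.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    reads_per_day: int   # how often the data is accessed
    business_value: str  # "high" (e.g. customer, financials) or "low" (e.g. sensor logs)

def choose_tier(ds: Dataset, hot_threshold: int = 100) -> str:
    """Send frequently accessed, high-value data to high-I/O storage;
    everything else goes to cheaper cloud object storage."""
    if ds.reads_per_day >= hot_threshold and ds.business_value == "high":
        return "high-io-warehouse"
    return "cloud-object-storage"

print(choose_tier(Dataset("customer", 5000, "high")))   # hot tier
print(choose_tier(Dataset("sensor_logs", 3, "low")))    # cold tier
```

In the composable model, containerized engines (SQL, graph, ML) would then be spun up against whichever tier holds the data, rather than copying data to a single fixed engine.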
Quote for the day:
"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan