AI needs a certification process, not legislation
Neither legal regulation nor ethical guidelines will keep AI development from running amok. That doesn’t mean there isn’t a solution, though. In fact, the solution is simpler than you might think: establish an independent body that can create standards and a certification program. ... These compliance frameworks set highly technical standards that require organizations to implement specific password protection measures, mobile device security, data segregation, firewall protections, and many other nuanced controls. While there’s no legal penalty for non-certification, certification is often a necessity for businesses wanting to engage with one another. In AI, I propose that technical experts, investors, and policymakers within the space come together to create a global, independent governing body responsible for establishing and enforcing AI standards. The standards — which should be reviewed regularly, with annual certification requirements — should spell out specific requirements such as avoiding bias in data sets, checks to ensure AI is being used ethically and in a way that isn’t discriminatory, controls around automated decision making, and emergency measures to stop an AI system.
One way predictive analytics is changing transportation is by forcing firms to evaluate how they arrange data sourced from electronic logs, video event recorders, electronic control modules, and other vehicle sensors. Organizing these sources is critical for triaging which transportation challenges to solve, and means finding relationships among the data that can be turned into useful experiences. For an automotive example, think of a Corvette. Specialty versions of the vehicle offer a Performance Data Recorder that enables telemetry overlays of vehicle data on video from a high-definition camera. That data, sourced from various system activities, is used to analyze driver sessions on a race track, fulfilling a customer wish. Questions of data organization will only grow as autonomous vehicle fleets become more prominent on public roads. Vehicles have historically managed this data in one format or another, but until now there was little opportunity to consider data at the network level, or to weigh whether a central repository or a local platform should host it. Autonomous vehicles generate data in real time, which can inform managers' logistics decisions as they happen.
Innovation Hubs v Regulatory Sandboxes and the Future of Innovation Facilitators
Regulators gain a better understanding of innovation in financial services, and firms better understand regulatory and supervisory expectations against the backdrop of rapid technological advancement. “In particular, innovation facilitators can help competent authorities to keep pace with developments by gaining near ‘real time’ insights into emerging technologies (such as distributed ledger technologies, big data analytics, artificial intelligence and machine learning) and their application in the financial sector. Competent authorities can apply these insights for the purposes of anticipating regulatory and supervisory issues and responding proactively.” At the same time, innovation facilitators make regulators more accessible to firms, in particular start-ups that lack resources and experience in regulatory matters. However, the report also summarises some of the risks perceived by competent authorities regarding the operation of innovation facilitators in general and regulatory sandboxes in particular.
Cybersecurity insurance: Read the fine print
Ernest Martin Jr. noted that cybersecurity insurance is trying to protect a new and volatile industry; a good example would be determining how to insure a business that locates the company's technology (hardware and/or software) in a third party's data center, which is becoming a common practice. "Even when a cyber policy provides a particular type of coverage, the actual scope of that coverage can be restricted in many ways," Dallas attorney Amy Elizabeth Stewart explains to Bounds. Stewart suggests that firms that outsource their digital assets should understand how the coverage works when third-party vendors are involved, if they want to avoid unpleasant surprises. Bounds offers an example from Renee Hornbaker, former chief financial officer for Stream Energy Inc. as well as Flowserve Corporation (now retired). Hornbaker told Bounds she did not look forward to getting cybersecurity insurance, adding, "I found it to be costly, difficult to purchase, and the application process was onerous." Bounds raises another point that could be a problem for some company executives: obtaining insurance will likely entail disclosing a lot of sensitive information to the insurer, such as infrastructure setup and security practices.
Apache Mesos is an open source cluster management tool that abstracts and isolates resources within distributed IT environments. Enterprises use Mesos with, or as an alternative to, Kubernetes for container orchestration in large-scale deployments. ... Readers should expect the build process -- compiling and linking the components of Apache Mesos -- to take about one hour on a two-core machine with 8 GB of memory. Close any servers and end any running tasks on your machine before you begin compiling the Apache Mesos installation, as the process can consume 100% of the memory and prevent even SSH login attempts. All commands must execute via the sudo command, which enables you to act as the administrative root user. Test frameworks are not critical: writing a Mesos test framework is a complicated process, and a regular user is unlikely to need one. Instead, IT admins are more likely to use a Mesos framework developed by an established project such as Hadoop, Spark or Cassandra.
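For readers who want to attempt the build themselves, the steps below are a minimal sketch following the Apache Mesos getting-started documentation; exact flags and prerequisites vary by Mesos version and Linux distribution, so treat this as an outline rather than a definitive recipe.

```shell
# Fetch the Mesos source tree.
git clone https://github.com/apache/mesos.git
cd mesos

# Generate the configure script (autotools-based build).
./bootstrap

# Build in a separate directory to keep the source tree clean.
mkdir build
cd build
../configure

# On a two-core, 8 GB machine, cap parallel jobs at the core count
# so the compile does not exhaust memory.
make -j 2

# Optionally run the test suite, then install (requires root).
make check
sudo make install
```

Running `make` without a `-j` cap is a common way to hit the memory exhaustion the article warns about; matching the job count to the core count keeps the build responsive enough for SSH sessions to survive.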
The Problem with Quantum Computers
The trouble is, quantum mechanics challenges our intuition. So we struggle to figure out the best algorithms for performing meaningful tasks. To help overcome these problems, our team at Los Alamos National Laboratory is developing a method to invent and optimize algorithms that perform useful tasks on noisy quantum computers. Algorithms are the lists of operations that tell a computer to do something, analogous to a cooking recipe. Compared to classical algorithms, the quantum kind are best kept as short as possible and, we have found, best tailored to the particular defects and noise regime of a given hardware device. That enables the algorithm to execute more processing steps within the constrained time frame before decoherence reduces the likelihood of a correct result to nearly zero. In our interdisciplinary work on quantum computing at Los Alamos, funded by the Laboratory Directed Research and Development program, we are pursuing a key step in getting algorithms to run effectively. The main idea is to reduce the number of gates in an attempt to finish execution before decoherence and other sources of errors have a chance to unacceptably reduce the likelihood of success.
Everyone possesses the ability to be a good innovator. We are all born this way. But most of us unlearn these abilities by spending much of our lives within the tightly controlled systems of our educational institutions and workplaces. A large 10-year study followed 1,600 children and tested their creativity—defined as the ability to engage in divergent thinking, i.e. to have original ideas that differ from anything seen before—at ages 5, 10, and 15. ... If those numbers don’t give you pause, I don’t know what will. These results also explain why our organizations lack innovation power. As citizens, we unlearn our skills of divergent thinking, and most of our organizations are built to promote and maintain this state. The organizations may have been founded by people who were creative geniuses, but unless the founders still run the organizations and are highly visible bearers of the culture, the organizations quickly change and are left to people who have largely unlearned divergent thinking and instead learned convergent thinking, which is the ability to be critical.
Was Chase’s Digital-Only Bank Spinoff a Viable Strategy?
Financial institutions need to transform themselves from product-centric to customer-centric, from efficiency to flexibility, and from digital support to digital-only. The winners in the banking industry will find ways to collect and act on insights faster than the competition. This is what Amazon does and what consumers will expect from their financial institution. This can’t be achieved by protecting existing branch-based organizations and processes, or by hoping that increased investments in technology will save the day. This is because the financial support of legacy branches and processes (at least to the degree that is occurring in most organizations) is not sustainable. Alternative providers can deliver greater value, better rates and a better experience at a lower cost. According to noted author and futurist Brett King, “We’re likely to see more digital-only offerings from traditional banks fail in the future where banks aren’t truly committed to digital transformation. The problem is that many traditional banks are doing this for PR reasons — not because they believe in digital as a destination. Ultimately they will fail because the traditional organization kills it off or starves it of adequate support.”
Cisco to buy IoT security, management firm Sentryo
Sentryo's ICS CyberVision lets enterprises ensure continuity, resilience and safety of their industrial operations while preventing possible cyberattacks, said Nandini Natarajan, industry analyst at Frost & Sullivan. "It automatically profiles assets and communication flows using a unique 'universal OT language' in the form of tags, which describe in plain text what each asset is doing. ICS CyberVision gives anyone immediate insights into an asset's role and behaviors; it offers many different analytic views leveraging artificial intelligence algorithms to let users deep-dive into the vast amount of data a typical industrial control system can generate. Sentryo makes it easy to see important or relevant information." In addition, Sentryo's platform uses deep packet inspection (DPI) to extract information from communications among industrial assets, Natarajan said. This DPI engine is deployed through an edge-computing architecture that can run either on Sentryo sensor appliances or on network equipment that is already installed. Thus, Sentryo can embed visibility and cybersecurity features in the industrial network rather than deploying an out-of-band monitoring network, Natarajan said.
Reducing data security complexity: Avoiding endpoint bloat
Whether agents, particularly security control agents, persist over time is the only metric worth our attention, because it puts a spotlight on the greatest hidden danger of all: the naturalness of security decay. Things fall apart. Rust never sleeps. Agents topple over. Decay is the fate of all security agents. But if these agents serve as the foundation of our security goals, or as the most technical expression of security intent, then what could possibly be more important? It’s not a question of whether security decay is happening in your environment; you can rest assured it is. What must be asked is: will you persist through it? This question demands an answer. Ideally, organizations reduce their overall security costs by monitoring how their endpoint controls work (or don’t) to reduce endpoint security decay. They validate safeguards and eliminate compliance failures. And they respond to threats and exposures with the confidence to control devices from anywhere.
Quote for the day:
"Leaders are people who believe so passionately that they can seduce other people into sharing their dream." -- Warren G. Bennis,