We Need to be Examining the Ethics and Governance of Artificial Intelligence
Recently, the role that pre-crime and artificial intelligence can play in our world has been explored in episodes of the popular Netflix TV show Black Mirror, focusing on the debate between free will and determinism. Working in counter-terrorism, I know that the use of artificial intelligence in the security space is fast becoming a reality. After all, decisions and choices previously made by humans are increasingly delegated to algorithms, which can advise on, and decide, how data is interpreted and what actions should result. Take the example of new technology that can not just recognize our faces but also determine our mood and map our body language. Such systems can even tell a real smile from a fake one. Being able to use this to predict the risk of a security threat in a crowded airport or train station, for example, and to prevent it from occurring, would be useful. Conversations I have had with individuals working in cyber-security indicate that this is already being done.
The Code defines 13 guidelines for manufacturers, service providers, developers and retailers to implement in order to ensure that IoT products are safe to use. They are: no default passwords; implement a vulnerability disclosure policy; keep software updated; securely store credentials and security-sensitive data; communicate securely; minimise exposed attack surface; ensure software integrity; ensure that personal data is protected; make systems resilient to outages; monitor system telemetry data; make it easy for consumers to delete personal data; make installation and maintenance of devices easy; and validate input data. HP Inc. and Centrica Hive are the first companies to sign up to the new Code. Minister for Digital Margot James said that these pledges are "a welcome first step," but "it is vital other manufacturers follow their lead to ensure strong security measures are built into everyday technology from the moment it is designed."
So-called password-less authentication, if implemented literally, would lead us to a world where we are deprived of the means to have our volition confirmed when our identity is authenticated. It would be a 1984-like world, incompatible with the values of democratic societies. Some people allege that passwords can and will be eliminated by biometrics or a PIN, but logic tells us this can never happen: the former requires a password or PIN as a fallback, and the latter is no more than the weakest form of numbers-only password. Various debates over ‘password-less’ or ‘beyond-password’ authentication only make it clear that the solution to the password predicament can be found only inside the family of broadly defined passwords. ... If a PIN or PINCODE, the weakest form of numbers-only password, had the power to kill the password, a small sedan should be able to kill the automobile. Advocates of this idea seem to claim that a PIN is stronger than a password when it is linked to a device, while the password is not linked to the device.
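The claim that a PIN is merely the weakest, numbers-only form of password can be made concrete with a rough keyspace comparison. The sketch below assumes a 4-digit PIN versus an 8-character mixed-case alphanumeric password; the lengths are illustrative assumptions, not figures from the piece.

import math

# Rough keyspace comparison, assuming a 4-digit numeric PIN versus an
# 8-character password drawn from upper/lower-case letters and digits.
# Both lengths are illustrative assumptions.
pin_space = 10 ** 4        # 10,000 possible 4-digit PINs
password_space = 62 ** 8   # roughly 2.2e14 possible 8-character passwords

print(f"4-digit PIN keyspace: {pin_space:,} (~{math.log2(pin_space):.1f} bits)")
print(f"8-char password keyspace: {password_space:,} (~{math.log2(password_space):.1f} bits)")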
Juniper advances network automation community, skillsets
“Since a critical part of automated operations is the individual engineers and processes they follow, Juniper has put deliberate investment into these areas by introducing many formal and informal training programs, cloud-based lab services, testing as a service, free trials, live throwdowns and [the new] Juniper Engineering Network (EngNet),” Koley wrote. Juniper EngNet is a portal that includes a variety of automation tools, resources and social communities. According to the vendor, the site features API documentation, access to Juniper Labs, virtual resources, a learning portal and an automation exchange of useful network automation tools. “Juniper Engineering Network is aimed at elevating the entire networking community to move beyond incumbent CLI knowledge and toward an automated, abstracted, self-driving technology. The networking community, including Juniper customers and partners, can contribute to the Automation Exchange within the community," Juniper stated.
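As a rough illustration of what moving beyond the CLI toward API-driven automation can look like, the sketch below uses Juniper's open-source PyEZ library (junos-eznc); the hostname and credentials are placeholders, and the snippet is not drawn from the EngNet portal itself.

# Minimal sketch of API-driven device interaction with Juniper's open-source
# PyEZ library (junos-eznc). Hostname and credentials below are placeholders.
# Install with: pip install junos-eznc
from jnpr.junos import Device

dev = Device(host="router.example.net", user="automation", passwd="changeme")
dev.open()

# Device facts are gathered over NETCONF rather than by screen-scraping the CLI.
print(dev.facts.get("hostname"))
print(dev.facts.get("version"))

dev.close()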
AI is no silver bullet for cyber security
“AI is not a silver bullet – when you look at the technology, you have to make sure that senior management is aware of its risks and you don’t invest in it unless you already have good cyber hygiene – starting with people,” said Pereira. User education is crucial, he said, because successful cyber attackers often exploit human weaknesses and emotions through social engineering and spear phishing to penetrate a system. “Those who don’t know how phishing attacks work will fall prey to them,” he said. “The panacea and antidote for phishing attacks is cyber education, which, when tailored for a person or function, is more effective than technology in stopping such attacks in many cases.” In deciding when and how to adopt AI to improve cyber security, Pereira said organisations should start with projects that address human and people risks, followed by processes and technology. “And when you get to the technology part, AI shouldn’t come first, but rather look at it as a way to enhance security processes, such as making it faster to review logs,” he said.
Deloitte says CIOs need to adapt or perish
Deloitte says that in 2018 CIOs need a better grasp on the big picture, and that means looking ‘inward’, ‘across’ and ‘beyond’ the business. “The digital era presents CIOs with the opportunity to look inward and reinvent themselves by breaking out of the trusted operator mould,” says the report. “We note, as in previous surveys, the importance of strong relationships to the CIO’s business success. This year we suggest that developing a technology fluency programme can help create a solid foundation for these relationship-building efforts. A tech fluency programme can provide organisations with knowledge about technology trends, scalability of emerging technologies and complexities of managing legacy core systems – while enabling CIOs to understand internal and external customer perspectives. “CIOs can also look across the IT organisation and transform it, particularly by focusing on the IT operating model, funding priorities and budget allocation, and tech talent and culture at the heart of their digital agendas.”
Why your machine-learning team needs better feature-engineering skills
The skill of feature engineering — crafting data features optimized for machine learning — is as old as data science itself. But it’s a skill I’ve noticed is becoming more and more neglected. The high demand for machine learning has produced a large pool of data scientists who have developed expertise in tools and algorithms but lack the experience and industry-specific domain knowledge that feature engineering requires. And they are trying to compensate for that with better tools and algorithms. However, algorithms are now a commodity and don’t generate corporate IP. Generic data is becoming commoditized and cloud-based Machine Learning Services (MLaaS) like Amazon ML and Google AutoML now make it possible for even less experienced team members to run data models and get predictions within minutes. As a result, power is shifting to companies that develop an organizational competency in collecting or manufacturing proprietary data — enabled by feature engineering. Simple data acquisition and model building are no longer enough.
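To make the idea concrete, here is a minimal, hypothetical sketch of feature engineering on transaction data using pandas; the column names and derived features are invented for illustration and are not taken from the article.

import pandas as pd

# Hypothetical raw transaction data; column names are invented for illustration.
raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2],
    "amount": [20.0, 250.0, 15.0, 15.0, 900.0],
    "timestamp": pd.to_datetime([
        "2018-11-01 09:15", "2018-11-01 23:40",
        "2018-11-02 10:05", "2018-11-02 10:06", "2018-11-03 02:30",
    ]),
})

# Domain-informed features: raw columns are reshaped into signals a model can use.
raw["hour"] = raw["timestamp"].dt.hour
raw["is_night"] = raw["hour"].isin(range(0, 6)).astype(int)  # late-night activity

features = raw.groupby("customer_id").agg(
    txn_count=("amount", "size"),
    avg_amount=("amount", "mean"),
    max_amount=("amount", "max"),
    night_txn_ratio=("is_night", "mean"),
)
print(features)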
How blockchain technology is transforming healthcare cybersecurity
An additional critical feature of blockchain technology is that every member of a blockchain generally can access and audit the entire ledger. This allows all interested parties to confirm and update the information contained in individual blocks. Another significant benefit is that laws and regulations can be programmed into the blockchain as smart contracts: logical, self-executing rules whose built-in agreement is enforced on all members. Smart contracts mimic traditional contracts and laws, and can be used to program in obligations and consequences. In this way, the requirements of specific data privacy and security laws, such as the Health Insurance Portability and Accountability Act of 1996 or the European Union General Data Protection Regulation, can be embedded in the blockchain. Innovators are already experimenting with blockchain use cases in the healthcare context that demonstrate many of these security benefits.
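As a purely conceptual sketch (not a healthcare implementation or an actual smart-contract platform), the toy example below shows the two ideas the excerpt describes: an append-only, auditable hash chain, and a rule that is enforced automatically before a record is accepted. The consent rule and record fields are invented for illustration.

import hashlib
import json

# Toy, conceptual ledger: each block commits to the previous block's hash,
# so any member holding a copy can audit the whole chain for tampering.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def consent_rule(record: dict) -> bool:
    # Stand-in for a "smart contract" style rule: reject records lacking
    # patient consent. Real systems would encode far richer policies.
    return record.get("patient_consent") is True

chain = [{"index": 0, "prev_hash": "0" * 64, "record": {"genesis": True}}]

def append_record(record: dict) -> None:
    if not consent_rule(record):
        raise ValueError("record rejected: consent rule not satisfied")
    prev = chain[-1]
    chain.append({"index": prev["index"] + 1,
                  "prev_hash": block_hash(prev),
                  "record": record})

def audit() -> bool:
    # Any participant can recompute the hashes and confirm nothing was altered.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

append_record({"patient_consent": True, "event": "lab result shared"})
print("chain valid:", audit())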
How To Integrate AI Into The Enterprise
Overcoming ignorance is a good place to start, and the tutorial given by Hammond was a pleasing break from many technology events, which are largely attended by people from whatever discipline the event covers. Data scientists attend data events, roboticists attend robotics events, and so on. At the O'Reilly event, however, techies were in the minority, with most of the attendees coming from managerial functions. The session began with an overview of what AI is, including a whistle-stop tour of machine learning and, specifically, the nature of learning itself, which feeds into the supervised, unsupervised and reinforcement learning models used by all machine learning systems today. Machine learning is, of course, just one aspect of AI, with McKinsey recently identifying five distinct forms, including physical AI, computer vision, natural-language processing and machine learning. Understanding what each of these is, even at a basic level, can help you make informed choices and avoid being suckered in by hype.
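For readers new to these categories, the sketch below contrasts supervised learning (which needs labelled examples) with unsupervised learning (which does not), using scikit-learn's bundled iris dataset; it is a generic illustration, not material from the tutorial itself.

# Generic illustration of supervised vs. unsupervised learning with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the model is trained against known labels (y).
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("supervised accuracy on training data:", clf.score(X, y))

# Unsupervised: the model only sees the raw features and finds structure itself.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster assignments for first 10 samples:", clusters[:10])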
Criminals' Cryptocurrency Addiction Continues
"With the increasing, malicious focus on cryptocurrency-related threats, attacks and exploits, it is clear that criminal innovation in this space continues unabated," Ferguson tells Information Security Media Group. "Starting from attacks targeting cryptocurrency wallets on individual users' machines - either directly or as an add-on to some widespread ransomware variants - attackers have rapidly diversified into direct breaches of cryptocurrency exchanges, malware for mining on traditional, mobile and even IoT devices, and developed attack methodologies specifically designed to target the mechanics of blockchain-based transactions, such as the 51 percent attack." The 51 percent attack gives attackers who can control more than 50 percent of a network's hash rate - or computing power - the power to reverse transactions on the blockchain or double-spend coins. The first half of this year saw five successful 51 percent attacks leading to "direct financial losses ranging from $0.55 million to $18 million," Moscow-based cybersecurity firm Group-IB says in a recently released cybercrime trends report.
Quote for the day:
"Leaders should influence others in such a way that it builds people up, encourages and edifies them so they can duplicate this attitude in others." -- Bob Goshen