The debate is over what it means when you run an actual quantum computer, such as Sycamore, and compare it to a simulation of that quantum computer inside a classical, electronic computer. Quantum simulation software, such as Microsoft's LIQUi|⟩ program, allows a traditional computer to represent a quantum computer in ordinary circuitry by translating quantum mechanics into mathematical structures known as matrices of complex numbers (numbers with both real and imaginary parts). With simulations, it's possible to compare how long it takes a real quantum circuit to produce a given computation with how long a classical computer takes to reproduce it by running the matrix math that mimics the quantum circuit's operations. Google and IBM are both looking at such simulations, and they're taking different views of what the comparison means. Google's point is that Sycamore is a device that does work it takes millions of conventional processors to simulate.
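The core of such a simulation can be sketched in a few lines. The following is a minimal, illustrative example (not LIQUi|⟩ itself), assuming NumPy: a qubit's state is a vector of complex amplitudes, and a gate is a matrix of complex numbers applied by ordinary matrix math. Because the state vector doubles in size with each added qubit, simulating a large circuit this way is what demands millions of conventional processors.

```python
import numpy as np

# A classical simulator tracks a quantum state as a vector of complex
# amplitudes and applies each gate as a matrix of complex numbers.
# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = np.array([1, 0], dtype=complex)  # qubit starts in |0>
state = H @ state                        # apply the gate via matrix math

# Born rule: measurement probabilities are |amplitude|^2.
probabilities = np.abs(state) ** 2
print(probabilities)                     # ~[0.5, 0.5]
```

An n-qubit circuit needs a vector of 2**n complex amplitudes and gate matrices of size 2**n by 2**n, which is why the classical cost explodes while the quantum device simply runs.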
A full 43% of the respondents cited phishing attacks that trick employees into sharing sensitive company information. Some 24% pointed to weak passwords, 15% to spear-phishing attacks targeted at specific individuals, and 15% to orphaned accounts. Data leakage or theft, whether from outside or inside the company, is a perennial concern for security professionals. Asked which type of data is most vulnerable to insider attacks, 63% of the respondents pointed to customer data, 55% to intellectual property, and 52% to financial data. ... Insider attacks pose enough of a concern that most organizations have certain tools in place to deal with them. Some 68% of those surveyed said they feel anywhere from moderately to extremely vulnerable to insider attacks. While 49% said they have the right controls to prevent an insider attack, 28% said they do not, and 23% were not sure. Most of the respondents use some type of analytics to detect insider threats, with 32% relying on activity management and summary reports, 29% on user behavior analytics, 28% on data access and movement analytics, and 14% on predictive analytics.
The Nuclear Power Corporation of India (NPCIL) admitted yesterday that one of the computers at its Kudankulam nuclear power plant (KKNPP) had been attacked by malware. The malware, NPCIL claimed, did not affect the plant's critical internal network; the company confirmed the attack only after earlier strong denials. "Identification of malware in NPCIL system is correct," A.K. Nema, Associate Director and Appellate Authority, NPCIL, belatedly admitted in a statement. "The matter was conveyed by CERT-In [Indian Computer Emergency Response Team] when it was noticed by them on 4 September 2019," he added. According to Nema, the matter was investigated by DAE [Department of Atomic Energy] cyber security specialists, who found that the compromised computer was connected to the internet and was used only for administrative work. He added that the virus infection was isolated from the plant's critical internal network. A day earlier, KKNPP senior official R Ramdoss had rejected social media reports claiming that domain controller-level access at KKNPP had been compromised.
Metrics come into their own when they act first as a tool to help people understand what's going on and what needs to improve, then as a way to track progress and measure success. Start by creating one metric per process – then, if that metric goes out of tolerance, you'll have a clear idea of how to address it. By contrast, "number of high severity vulnerabilities" is not easily actionable: if it goes beyond your tolerance, what action do you take? It's affected by multiple processes as well as circumstances beyond your control. As metrics are developed and you get new views on what your data is telling you, you'll likely uncover things that are slipping between the cracks of existing processes. If you have a metric that tracks the current status of a given process, but you require several projects to deal with legacy issues that existed before the process was created or updated, there is no harm in splitting that historical data out and tracking those remediation projects separately. This will help you avoid the situation where your metric conflates good current performance with past problems that are now under management via a different – and possibly longer-term – process.
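Splitting historical data out of a current-process metric can be illustrated with a deliberately simple sketch. Everything below is invented for the example (the field names, the cutoff date, the function): findings discovered before the process existed are counted as legacy backlog, so the current metric reflects only the current process.

```python
from datetime import date

# Hypothetical cutoff: when the current process was created/updated.
PROCESS_START = date(2019, 1, 1)

def split_metric(findings, cutoff=PROCESS_START):
    """Report current-process performance separately from the legacy
    backlog, so good current performance isn't conflated with past
    problems handled by a separate remediation project."""
    current = [f for f in findings if f["discovered"] >= cutoff]
    legacy = [f for f in findings if f["discovered"] < cutoff]
    return {"current_open": len(current), "legacy_backlog": len(legacy)}

# Example: two legacy findings, one found under the current process.
findings = [
    {"id": 1, "discovered": date(2017, 6, 1)},
    {"id": 2, "discovered": date(2018, 3, 9)},
    {"id": 3, "discovered": date(2019, 5, 20)},
]
print(split_metric(findings))  # {'current_open': 1, 'legacy_backlog': 2}
```

The current-process count can then go to zero on its own merits while the legacy remediation projects are tracked on their own, possibly longer, timeline.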
There's no shortage of information and data, but individuals who can discern what information is trustworthy amid the abundant mix of misinformation such as fake news, deep fakes, propaganda, and more will be critical to an organization's success. Critical thinking doesn't imply being negative; it's about being able to objectively evaluate information, how it should be used, and even whether it should be trusted by an organization. Employees who are open-minded, yet able to judge the quality of the information inundating us, will be valued. ... Technical skills will be required of employees in just about every job, since digital tools will be commonplace as the 4th industrial revolution impacts every industry. Artificial intelligence, the Internet of Things, virtual and augmented reality, robotics, blockchain, and more will become part of every worker's everyday experience, whether the workplace is a factory or a law firm. So not only do people need to be comfortable around these tools, they will need to develop skills to work with them.
Schools are often more vulnerable to cyberattacks than larger companies and enterprises, for a variety of reasons. Many school districts have only one or two IT people to serve the entire district, so the staffers are spread thin. Budget constraints have affected many schools, limiting the amount of money they can spend on security solutions. Most schools likely have the necessary security set up on individual computers and even on the overall network, but comprehensive perimeter protection may not be in place, potentially leading to data breaches and malware hosted on the school's website. Young students don't necessarily have the skills or training to adequately identify phishing emails and other threats, so such attacks are often more successful. The number of tablets and other devices issued by schools has increased in recent years, and students may use those devices on outside networks that aren't secure, raising the risk of infection. Even in the face of budget constraints and other limitations, schools should have adequate security measures in place to protect themselves, their data, and their students from security threats.
When we compare companies with relatively little AI experience (they’ve built five or fewer production systems) with those possessing extensive AI experience (they’ve built 20 or more production systems), we observe an interesting shift in “most wanted” roles (see chart). Early on, AI researchers are the most sought-after, with about a third of the less-experienced rating them a top-two needed role. Business leaders rank near the bottom. By the time adopters have become highly experienced at building AI solutions, business leaders have bubbled to the top, and AI researchers have sunk almost to the bottom. What can we make of this curious flip? Many companies embarking on AI initiatives may feel they need to hire AI superstars—researchers with advanced degrees who can invent new AI algorithms and techniques—to spearhead their efforts. By the time organizations have amassed a lot of AI experience, they may have filled their ranks with enough of these brilliant experts. At that stage, they’re eager to find business leaders who can play the crucial “translator” role: figuring out what the results from AI systems mean, and how they should factor into business decisions and actions.
Digitization spurs new priorities – alongside a full slate of historical departmental responsibilities. In enterprise tech speak, leverage simply means using tools, systems or techniques to convert relatively small effort into significantly greater output. Digital transformation won't fit nicely alongside traditional management processes or cleanly under one leader’s org chart. IT leaders have to get more out of themselves, their teams and their dollars to succeed in the new enterprise era. One industry survey finds that at least 84% of top CIOs are now responsible for areas outside of traditional IT, most commonly innovation and transformation. Further research reveals that 95% of CIOs expect digitization to change or remix their job. Regardless of where or how new responsibilities intersect business and technology, IT will have a key role to play. Yet, without suitable systems leverage, the CIO position is challenging. How else can they handle the burden of IT consumerization, mobile workforces, big-data challenges, shadow IT and cost management?
AIoT is defined as decision making aided by AI technologies in conjunction with connected IoT sensor, system or product data. AI technologies include deep learning, machine learning, natural language processing, voice recognition and image analysis. According to the survey, 34% of respondents said increasing revenue is the top goal for using AIoT. Improving the ability to innovate (17.5%), offering customers new digital services (14.3%) and decreasing operational costs (11.1%) were all key goals. Intel Americas chief data scientist Melvin Greer says AI and IoT are no longer separate technologies. “AI closes the loop in an IoT environment where IoT devices gather or create data, and AI helps automate important choices and actions based on that data,” explains Greer. “Today, most organisations using IoT are only at the first ‘visibility’ phase where they can start to see what’s going on through IoT assets. But they’re moving toward the reliability, efficiency and production phases, which are more sophisticated and require stronger AI capabilities.”
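Greer's "closing the loop" can be illustrated with a deliberately simple sketch. The function name, window, and threshold below are invented for the example: IoT devices produce a stream of readings, and an analytic step decides when a reading warrants an automated action.

```python
from statistics import mean, stdev

def closed_loop(readings, window=5, threshold=3.0):
    """Hypothetical sketch of AI closing the loop on IoT data: flag a
    reading as anomalous when it deviates more than `threshold` standard
    deviations from the recent window, and emit an action for it."""
    actions = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(readings[i] - mu) > threshold * sigma:
            actions.append((i, "alert"))  # automated choice based on the data
    return actions

readings = [20, 20, 21, 20, 20, 20, 20, 45]  # a temperature spike at the end
print(closed_loop(readings))  # [(7, 'alert')]
```

A real deployment would replace the simple statistical rule with a trained model, but the loop is the same: sensors gather data, an analytic layer automates the choice, and an actuator or operator carries out the action.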
Applied Inventions cofounder and computer theorist Danny Hillis and board members agreed to amend the draft document to say the governable principle should include “avoid unintended harm and disruption and for human disengagement of deployed systems.” The report, Hillis said, should be explicit and unambiguous that AI systems used by the military should come with an off switch for a human to press in case things go wrong. “I think this was the most problematical aspect about them because they’re capable of exhibiting and evolving forms of behavior that are very difficult for the designer to predict, and sometimes those forms of behavior are actually kind of self preserving forms of behavior that can get a bit out of sync with the intent and goals of the designer, and so I think that’s one of the most dangerous potential aspects about them,” he said. The Defense Innovation Board is chaired by former Google CEO Eric Schmidt, and members include MIT CSAIL director Daniela Rus, Hayden Planetarium director Neil deGrasse Tyson, LinkedIn cofounder Reid Hoffman, Code for America director Jennifer Pahlka, and Aspen Institute director Walter Isaacson.
Quote for the day:
"It's not about how smart you are--it's about capturing minds." -- Richie Norton