Intelligent supply chains boast the ability to perform continuous predictive analytics on enormous amounts of data, using machine learning models to review historical information and plan for current and future needs. To create such a system, the first thing a company needs is "an intelligent brain, a cognitive operating system," said Frederic Laluyaux, president and CEO of Aera Technology, a platform dedicated to building the self-driving enterprise. Companies already have the needed data in their transactional systems. "The brain does the job, and the data feeds the brain." The cognitive operating system provides the computing connectivity. Laluyaux said that Aera's system crawls transactional systems the way Google crawls websites. In some cases, it has to crawl 54 different ERPs at a single company, a complexity that is not unusual. "Every big company that has gone through mergers and acquisitions will have the same complexity," Laluyaux said. Even companies that have standardized their ERPs run many different modules.
TechRepublic's Nick Heath got an early hands-on with the Raspberry Pi 3 Model B+ and found in benchmarking tests that it's the fastest Raspberry Pi model available, both in single-core and quad-core measurements. As Heath notes, the addition of 802.11ac Wi-Fi gives the Raspberry Pi 3 Model B+ triple the maximum throughput of the Pi 3 Model B's 2.4GHz 802.11n Wi-Fi. Eben Upton, co-creator of the popular developer board, told TechRepublic that its B+ releases are all about refinement. "It's not a Raspberry Pi 4. I think what we've ended up with is something which is a really good B+, a bit too good for a B+, but that would be not really anywhere near enough for a Raspberry Pi 4," said Upton. "The B+ is our attention-to-detail spin for the product, where we take all the stuff we've learned in the past couple of years and smoosh it all together and make a new thing that addresses some of the concerns, or takes advantage of some of the opportunities that have come along in the intervening time."
McQuire admits that companies do struggle to keep up with technological progress. “Many firms are simply unable to keep up with the rapid technology changes. The threat landscape is transforming before our eyes with malware, ransomware, and phishing attacks all rising rapidly,” he says. “There is also significant regulatory change occurring in the form of GDPR, which adds new pressures and holds those with weak security and privacy processes financially accountable.” “You combine this with a general lack of security talent in most firms and the fact that most run a complex web of legacy security technologies that don’t properly protect them from employees who now access work information across a mix of devices and cloud apps, and you have a security market that is booming,” McQuire adds. ... CISOs, it appears, are trying to be present throughout the entire DX process. For instance, at an event late last year, Los Angeles CISO Timothy Lee said that CISOs who embrace digital transformation may help an organization adapt to a rapidly evolving global marketplace.
Quantum computing is quickly moving from the theoretical world to reality. Last week Google researchers revealed a new quantum computing processor that the company says may beat the best benchmark in computing power set by today's most advanced supercomputers. That's bad news for CISOs, because most experts agree that once quantum computing advances far enough and spreads wide enough, today's encryption measures are toast. The general consensus is that within about a decade, quantum computers will be able to brute-force public key encryption as it is designed today. Here are some key things to know and consider about this next generation of computing and its ultimate impact on security.
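The threat comes down to factoring: an RSA-style private key can be reconstructed from the factors of the public modulus, and Shor's algorithm lets a sufficiently large quantum computer find those factors efficiently. A toy sketch with deliberately tiny numbers makes the point (the brute-force loop stands in for Shor's algorithm; real keys use 2048-bit moduli):

```python
# Toy illustration of why factoring breaks RSA-style public-key
# encryption. The brute-force factoring loop stands in for Shor's
# algorithm, which a large quantum computer could run efficiently.

def toy_rsa_break(n, e, ciphertext):
    """Recover a plaintext given only the public key (n, e)."""
    p = next(i for i in range(2, n) if n % i == 0)  # factor the modulus
    q = n // p
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)            # rebuild the private exponent
    return pow(ciphertext, d, n)   # decrypt with the recovered key

# Textbook-sized demo key: n = 61 * 53 = 3233, public exponent e = 17.
n, e = 3233, 17
ciphertext = pow(65, e, n)         # encrypt message 65 with the public key
print(toy_rsa_break(n, e, ciphertext))  # 65
```

With a two-digit modulus the loop finishes instantly; with a 2048-bit modulus it would take longer than the age of the universe classically, which is exactly the margin quantum computers are expected to erase.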
Software-defined resiliency (SDR) is IBM’s approach to DRaaS. It helps ensure enterprise applications operate reliably and protect data even when disaster strikes. It’s the latest step in the journey to redefine data center operations in software. A software-defined approach makes disaster recovery more controllable and visible, enabling administrators to extend protection across hybrid cloud infrastructures. It also introduces perhaps the most valuable feature in an otherwise laborious process: orchestrated recovery. By automating and orchestrating the replication and recovery of not just the servers and virtual machines but also the applications and business services, disaster recovery becomes reliable and repeatable. SDR makes use of vendors’ existing data protection mechanisms, such as replication and backup, and manages them. Instead of using different tools for each enterprise software product, it provides a single interface to control all replication and recovery processes.
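Orchestrated recovery is, at its core, a runbook executed by software rather than by hand: a fixed sequence of steps that either all succeed or halt at the first failure, making the process repeatable. A minimal sketch of that idea, with hypothetical step names (not IBM's actual API):

```python
# Minimal sketch of orchestrated recovery: run the recovery steps in a
# fixed order and stop at the first failure, instead of driving each
# vendor's tool by hand. All names here are illustrative, not IBM's API.

def run_recovery_plan(steps):
    """Execute (name, action) steps in order; return the names completed."""
    completed = []
    for name, action in steps:
        if not action():            # each action reports success or failure
            raise RuntimeError(f"recovery halted at step: {name}")
        completed.append(name)
    return completed

# A plan mirroring the text: storage first, then VMs, then the
# application services that sit on top of them.
plan = [
    ("replicate-storage", lambda: True),
    ("recover-vms",       lambda: True),
    ("restart-apps",      lambda: True),
]
print(run_recovery_plan(plan))
```

The value is less in any single step than in the ordering and the stop-on-failure behavior: a human running three vendor consoles at 3 a.m. gets neither for free.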
Brain uploading will be familiar to readers of Ray Kurzweil’s books or other futurist literature. You may already be convinced that immortality as a computer program is definitely going to be a thing. Or you may think transhumanism, the umbrella term for such ideas, is just high-tech religion preying on people’s fear of death. Either way, you should pay attention to Nectome. The company has won a large federal grant and is collaborating with Edward Boyden, a top neuroscientist at MIT, and its technique just claimed an $80,000 science prize for preserving a pig’s brain so well that every synapse inside it could be seen with an electron microscope. Robert McIntyre, a computer scientist, and his cofounder Michael McCanna have been following the tech entrepreneur’s handbook with ghoulish alacrity. “The user experience will be identical to physician-assisted suicide,” he says. “Product-market fit is people believing that it works.” Nectome’s storage service is not yet for sale and may not be for several years. Also still lacking is evidence that memories can be found in dead tissue.
There has also been huge growth in adoption of serverless computing among cloud users. In the fourth quarter of 2017, serverless adoption grew by 667 percent among the sites tracked, the survey's authors report. This is up from 321 percent just the quarter before. "Serverless continues to be attractive to organizations since it doesn't require management of the infrastructure," the report's authors observe. "As companies migrate increasingly to the cloud and continue to build cloud-native architectures, we think the pace of serverless adoption will also continue to grow." The study's authors also looked at cloud CPU consumption to draw conclusions about how people are deploying cloud power. The dominance of general-purpose workloads (employed 43 percent of the time) shows that most organizations start their cloud journey by moving development and test workloads, mostly provisioned as standard instances.
The best SIEM vendor you can pick is one that understands that less is more. The Herjavec Group is one such company that recently caught my eye. Started by Robert Herjavec, one of the stars of ABC’s addictive Shark Tank television series, the Herjavec Group lives this philosophy. Here’s what Ira Goldstein, Herjavec Group’s senior vice president of global technical operations, said about their less-is-more philosophy, “[The data required to manage security for a modern enterprise infrastructure] has to be parsed, correlated, alerted, evaluated, analyzed, investigated, escalated, and remediated fast enough to protect integrity and operations. The only way to make sense of it all is to focus on fewer, more specific use cases that matter, as opposed to a high volume of low fidelity alerts.” “An effective security operation is driven by discipline, preventing use-case sprawl that causes information overload,” says Goldstein.
I recently ran into this very same problem, in which the bcp error message stated: An error occurred while processing the file “D:\MyBigFeedFile.txt” on data row 123798766555678. Given that the text file was far in excess of Notepad++’s limits (or even easy human-interaction limits), I decided not to waste my time trying to find an editor that could cope and simply looked for a way to pull out a batch of lines from within the file programmatically, to compare the good rows with the bad. I turned to PowerShell and came across the Get-Content cmdlet to read the file, which looked like it would give me at least part of what I wanted. At this point there are quite a few ways to look at specific lines in the file, but to look at the line in question together with its surrounding lines, I could only come up with one clean solution: by piping the output into the Select-Object cmdlet, I could use its -Skip and -First parameters.
Specifically, corporate officers, directors and “other corporate insiders” are prohibited from trading shares if they have knowledge of any unpublicized security incident within the company. While the overall intent of this latest statement is clear, the guidance is vague in key areas by design. For instance, the second section of the guidance emphasizes that companies must make "timely disclosure of any related material nonpublic information." It’s unclear what the SEC explicitly means by "timely disclosure," as the SEC doesn’t provide a specific time limit that companies must meet. This puts a lot of trust in corporate leaders to put speedy remediation and due diligence at the center of their security policy, which is a bit of a gamble given the track record of executive action during the fallout of the Equifax breach. The GDPR, on the other hand, is much more prescriptive, giving organizations 72 hours to report an incident related to the personal data of EU citizenry.
Quote for the day:
"Change is not a threat, it's an opportunity. Survival is not the goal, transformative success is." -- Seth Godin