PCIe is several times faster than SATA and offers far more parallelism, so its throughput is a better match for NAND flash. PCIe SSDs come in two physical formats: an add-in card that plugs into a PCIe slot, and M.2, which is about the size of a stick of gum and mounts directly on the motherboard. The add-in card format is most widely used in servers, while M.2 dominates consumer devices. There used to be a significant price difference between PCIe and SATA drives of the same capacity, but they have reached parity thanks to Moore’s Law, said Jim Handy, principal analyst with Objective Analysis, who follows the memory market. “The controller used to be a big part of the price of an SSD. But complexity has not grown with transistor count. It can have a lot of transistors, and it doesn’t cost more. SATA got more complicated, but PCIe has not. PCIe is very close to the same price as SATA, and [the controller] was the only thing that justified the price diff between the two,” he said. DigiTimes estimates that the price drop for NAND flash chips will cause global shipments of SSDs to surge 20 to 25 percent in 2019.
Edge computing is real. It's here, and companies have to have a strategy to handle the enormous influx of data coming in real time from devices globally. Analysts project there will be 50 billion telematics devices by 2020 and forecast the sum of the world's data will reach 175 zettabytes by 2025. Although edge computing is putting enormous pressure on IT infrastructure -- where legacy systems at the networking, storage, and application layers are straining today -- a new generation of systems is coming to market to help companies deal with the data explosion caused by edge computing. What is most exciting is the ability these new systems give companies to engage with customers in fundamentally new ways. There are examples of new business models being developed around the edge -- Netflix, Uber, and Amazon are notable examples -- but now many companies can adopt these new business models with next-generation, edge-aware systems emerging today.
The second-biggest improvement that Microsoft has made in HoloLens 2 is that the gesture control has been revamped. If I am to be completely honest, I have never had the best luck with getting HoloLens gestures to work. I always assumed that I was doing something wrong, because nobody else that I have talked to seems to have any trouble. From what I have heard about HoloLens 2, a new artificial intelligence (AI) processor and something called a time-of-flight depth sensor will collectively make it so that HoloLens will allow you to interact with holographic objects in the same way that you would interact with their real-world counterparts. This might mean being able to pick up a hologram and move it as if it were a physical object, as opposed to having to resort to using the convoluted gestures that are currently required. It remains to be seen how this new capability will actually be implemented, but I have high hopes that using HoloLens 2 will be far more intuitive than using its predecessor.
Most enterprises migrate their data to the public cloud in that second way: they simply cart it all from the data center to the cloud. Often there is no single source of truth in the on-premises databases, so all the data moved to the public cloud keeps its redundancies. Although it’s an architectural no-no, the reality is that most systems are built in silos, which is where the redundancies come from. Siloed systems often create their own versions of common enterprise data, such as customer data, order data, and invoice data. As a result, most enterprises have several security vulnerabilities that they have inadvertently moved to the cloud. ... The best solution to this problem is not to maintain redundant data at all. The CRM system almost certainly has APIs that allow secure access to customer data and can be integrated directly into the inventory system, or the other way around. The goal is to maintain data in a single physical location, even if it is accessed by multiple systems. Even if you do eliminate most of the redundant data, all your data should still be secured under a holistic security system that is consistent from application to application and from database to database.
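The single-source-of-truth pattern described above can be sketched as follows. This is an illustrative sketch only: `CrmClient` and its in-memory data are hypothetical stand-ins for a real CRM API client, not any actual product's interface.

```python
# Hypothetical sketch: the inventory system resolves customer records
# through the CRM's API at request time instead of keeping its own copy.
# CrmClient is an illustrative stand-in, not a real vendor API.

class CrmClient:
    """Stand-in for an HTTP client wrapping the CRM's customer API."""

    def __init__(self):
        # In practice this data lives only in the CRM -- the single
        # physical location for customer records.
        self._customers = {"C-1001": {"name": "Acme Corp", "tier": "gold"}}

    def get_customer(self, customer_id: str) -> dict:
        return self._customers[customer_id]


class InventorySystem:
    """Holds no customer data of its own; stores only stable customer IDs."""

    def __init__(self, crm: CrmClient):
        self._crm = crm
        self._orders = {"O-1": {"customer_id": "C-1001", "sku": "WIDGET"}}

    def describe_order(self, order_id: str) -> str:
        order = self._orders[order_id]
        # Customer data is fetched, never duplicated into inventory storage.
        customer = self._crm.get_customer(order["customer_id"])
        return f"{order['sku']} for {customer['name']}"


inventory = InventorySystem(CrmClient())
print(inventory.describe_order("O-1"))  # WIDGET for Acme Corp
```

Because the inventory system keeps only customer IDs, there is no second copy of customer records to fall out of sync or to secure separately.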
Let data analytics be your guide. In other words, take all your vulnerability scanning data and analyze it across a multitude of parameters, including asset value, known exploits, exploitability, threat actors, CVSS score, similar vulnerability history, etc. This data analysis can be used to calculate risk scores, and these risk scores can help guide organizations on which vulnerabilities should be patched immediately, which ones require compensating controls until they can be patched, which ones can be patched on a scheduled basis, and which ones can be ignored. Of course, few organizations will have the resources or data science skills to put together the right vulnerability management algorithms on their own, but vendors such as Kenna Security, RiskSense, and Tenable are all over this space. Furthermore, SOAR vendors such as Demisto, Phantom, Resilient, ServiceNow, and Swimlane are working with customers on runbooks to better manage the operational processes.
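A toy version of the risk-scoring idea above might look like this. The weights, thresholds, and tier names are invented for illustration; real products combine far more signals and calibrate them against historical exploit data.

```python
# Illustrative sketch only: a toy risk score combining CVSS, asset value,
# and known-exploit status, then mapping scores to remediation tiers.
# All weights and cutoffs here are assumptions made for the example.

def risk_score(cvss: float, asset_value: float, exploit_known: bool) -> float:
    """cvss in 0-10, asset_value normalized to 0-1, exploit_known from threat intel."""
    score = (cvss / 10) * 0.5 + asset_value * 0.3
    if exploit_known:
        score += 0.2  # a public exploit sharply raises priority
    return round(score, 2)


def remediation_tier(score: float) -> str:
    """Map a 0-1 risk score to one of the four response tiers."""
    if score >= 0.8:
        return "patch immediately"
    if score >= 0.6:
        return "compensating controls until patched"
    if score >= 0.3:
        return "patch on scheduled basis"
    return "ignore"


# A CVSS 9.8 flaw with a public exploit on a high-value asset:
s = risk_score(cvss=9.8, asset_value=0.9, exploit_known=True)
print(s, remediation_tier(s))  # 0.96 patch immediately
```

The point is not the particular formula but the workflow: scanner output plus context flows into a score, and the score drives a repeatable triage decision instead of patching by gut feel.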
A disaster recovery plan is a bit like an insurance policy: we all agree we need it and we all hope we’ll never use it. And as with insurance, nobody wants to discover their DR plan doesn’t actually protect them when a disaster hits. Similarly, nobody wants to find out that their DR plan is overdone – meaning they’ve been spending too much time, money and energy maintaining it. But if you don’t regularly stress test your DR plan, you could find yourself in one of these situations. I’ve worked with a lot of businesses, and I’ve noticed that few conduct regular stress tests of their DR plans. That’s a problem: no disaster recovery plan is good enough to magically transform as a business changes – and realistically, no business remains static. At a previous firm, we tested quarterly and found changes and updates during every test! So how can you verify that your DR plan fits your current needs? Follow these seven steps.
Cisco rates both of those router vulnerabilities as “High” and describes the problems like this: the first vulnerability is due to improper validation of user-supplied input. An attacker could exploit it by sending malicious HTTP POST requests to the web-based management interface of an affected device, and a successful exploit could allow the attacker to execute arbitrary commands on the underlying Linux shell as root. The second exposure is due to improper access controls for URLs. An attacker could exploit it by connecting to an affected device via HTTP or HTTPS and requesting specific URLs, and a successful exploit could allow the attacker to download the router configuration or detailed diagnostic information. Cisco said firmware updates that address these vulnerabilities are not yet available and no workarounds exist, but it is working on a complete fix for both. On the IOS front, the company said six of the vulnerabilities affect both Cisco IOS Software and Cisco IOS XE Software, one affects only Cisco IOS Software, and ten affect only Cisco IOS XE Software.
Deemed a work in progress with no official support from Microsoft and much functionality yet to be implemented, the GitHub-based project is described as an attempt to improve on currently available Python type checkers, with mypy mentioned specifically. Of course, the increasingly popular Visual Studio Code editor already sports a popular Microsoft-backed, jack-of-all-trades Python extension (just updated) that boasts more than 35 million downloads and 7.3 million installations and does type checking and a whole lot more. But Pyright isn't aiming to compete with that tool; rather, it aims to improve on its type-checking capabilities, which are powered by the Microsoft Python Language Server, a component that uses the language server protocol to provide IntelliSense and other advanced functionality for different programming languages in code editors and IDEs. "Pyright provides overlapping functionality but includes some unique features such as more configurability, command-line execution, and better performance," the GitHub project says.
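For readers unfamiliar with static type checking, here is a minimal example of the kind of annotation mismatch a checker such as Pyright or mypy reports. Note that Python itself does not enforce annotations at runtime, so this code runs without error; only the static checker complains.

```python
# Minimal example of an annotation mismatch that a static type checker
# flags but the Python interpreter happily ignores at runtime.

def parse_port(raw: str) -> int:
    # Annotated to return int, but one branch returns a str:
    # a type checker reports the inconsistency; the interpreter does not.
    if raw.isdigit():
        return int(raw)
    return "invalid"  # static type error: str is not int


print(parse_port("8080"))    # 8080
print(parse_port("eighty"))  # invalid
```

Catching this class of bug before the code ever runs is exactly the value proposition of tools like Pyright, and the performance and configurability claims in the project's README are about how quickly and flexibly that analysis can be done on large codebases.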
For computer vision and facial recognition systems to work reliably, they need training datasets that approximate real-world conditions. So far, researchers have had access to only a small number of image datasets, many of which are heavily populated with still pictures of fair-skinned men. This limitation impacts the accuracy of the technology when it comes across types of images it's not familiar with – those of women or people of color, for instance. Another challenge is related to the varying quality of the images on video feeds available from surveillance cameras. Often the cameras' scope and angle, as well as the lighting or weather during a given recording, make it difficult for law enforcement to track or re-identify people from security camera footage as they try to reconstruct crimes, protect critical infrastructure and secure special events. To help solve this problem, the Intelligence Advanced Research Projects Activity has issued a request for information regarding video data that will help improve computer vision research in multicamera networks.
The latest findings are contained in the fifth annual report to be issued by the NCSC's Huawei Cyber Security Evaluation Center, which the U.K. government launched in 2010 to review Huawei's business strategies and test all product ranges before they were potentially used in any setting that might have national security repercussions. The new report emphasizes that the findings should not imply that U.K. telecommunications networks are at any greater risk now than they were before. Rather, the findings are part of a high-level review to ensure that Britain's telecommunications networks remain as secure as possible. "We can and have been managing the security risk and have set out the improvements we expect the company to make. We will not compromise on the progress we need to see: sustained evidence of better software engineering and cybersecurity, verified by HCSEC," the NCSC spokeswoman says. "This report illustrates above all the need for improved cybersecurity in the U.K. telco networks, which is being addressed more widely by the digital secretary's review."
Quote for the day:
"Prosperity isn't found by avoiding problems, it's found by solving them." -- Tim Fargo