AMD issued a statement on Meltdown and Spectre, saying its processors are potentially vulnerable to only one of the three disclosed variants, and no one has demonstrated an AMD exploit as yet. This applies to both the new Epyc server processor and older Opteron server chips for the half dozen customers still using them. With ARM, it gets complicated. The company has published a list of cores at risk. ARM has three types of cores — Cortex-A, Cortex-M and Cortex-R. Cortex-M is a 32-bit embedded microcontroller core used in Internet of Things (IoT) devices, and it has no exposure. Cortex-R is also an embedded controller, used in real-time applications such as cars. Those run in closed systems and are difficult to attack in practice, although ARM said some of them are exposed. Only the Cortex-A line has exposure, and not all of those chips are at risk. For example, the Cortex-A53, which is the most widely used processor in smartphones and tablets, is not at risk.
The emerging blockchain-based distributed storage market could challenge traditional cloud storage services, such as Amazon AWS and Dropbox, for a cut of the cloud storage market. "Distributed compute and storage models are still in their infancy, but I do believe that there is an enormous market for this technology," said Paul Brody, Ernst & Young's (EY) Global Innovation Leader for Blockchain Technology. The idea of using P2P networks to aggregate computer resources is not new. In the early 2000s, BitTorrent launched as a distributed file-sharing protocol and grew to carry more than half of the internet's file-sharing bandwidth. Because blockchains come with a built-in mechanism for payments — cryptocurrencies, which were missing from the last go-around at P2P services — they are more likely to succeed, according to Brody.
Bitcoin is the first decentralized form of cryptocurrency, but it's certainly not the only one. A large number of blockchain-based cryptocurrencies have emerged since 2009, which raises the obvious question: How is Bitcoin different? Aside from its much greater value, there are several things that set Bitcoin apart from cryptocurrencies such as Ethereum, Dogecoin, Litecoin, and others. All of these cryptocurrencies use blockchain technology, but the method and purpose of each one is different. Ethereum, one of the most talked-about Bitcoin alternatives, isn't actually a value transfer platform; instead, it is used for distributed application programming. Ethereum does have a monetary value in the form of its fuel, called Ether, but that's just one part of its overall model. Other cryptocurrencies, like Litecoin, Dogecoin, and PotCoin, use blockchains but don't rely on the SHA-256 hash function like Bitcoin does; they use scrypt, a password-based key derivation function, to build coin hashes instead.
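The difference between the two hashing schemes can be sketched in a few lines of Python. This is only an illustration of the two primitives, not real mining code: the header bytes are made up, and the scrypt parameters here are illustrative rather than the exact values any particular coin uses.

```python
import hashlib

# A placeholder block header (real headers are structured binary data).
header = b"example block header"

# Bitcoin-style hashing: double SHA-256 over the block header.
btc_digest = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Litecoin-style hashing: scrypt, a memory-hard key derivation function,
# with the header acting as both password and salt. Parameters (n, r, p)
# are illustrative; memory hardness is what makes scrypt harder to
# accelerate with the specialized SHA-256 hardware used for Bitcoin.
ltc_digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1,
                            dklen=32).hex()

print(btc_digest)
print(ltc_digest)
```

Both produce a fixed-length digest, but computing scrypt requires a sizable scratch buffer in memory, which is the design choice that distinguishes scrypt-based coins from SHA-256-based ones.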
The underlying technologies are familiar: You can add content around executable code playgrounds using Markdown to format text. Azure Notebooks automatically adds UI to your code snippets, and you can use any of a selection of visualization tools for charting results. Data can be uploaded to and downloaded from local PCs, so you can take files you’ve been using with Excel’s analytics and use them in Azure Notebooks, letting you compare results and use business intelligence tools to prepare data before it’s used. You import online data with curl or wget, using Python code in a notebook or from a notebook’s built-in terminal window. There’s also integration with Dropbox, so you can share files with colleagues or use it to ensure you’re always working with the latest version of a file. Although Microsoft provides most of the tools you’ll need, it can only really support general-purpose analytical operations with tools like Python’s Anaconda data science extensions.
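For the import step, a notebook cell doesn't even need to shell out to curl or wget; Python's standard library can fetch a file directly. A minimal sketch, where the `fetch` helper name and the URL are illustrative, not part of the Azure Notebooks API:

```python
import urllib.request

def fetch(url, path):
    """Download a remote file into the notebook's working directory,
    roughly equivalent to running `!curl -o path url` in a cell."""
    urllib.request.urlretrieve(url, path)
    return path

# Hypothetical usage inside a notebook cell:
# fetch("https://example.com/sales.csv", "sales.csv")
```

From there the downloaded file can be read with pandas or whatever analysis library the notebook is using.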
The topic of “observability” has been getting much attention recently, particularly in relation to building and operating “cloud native” systems. Several thought leaders in this space, like Cindy Sridharan, have mused that observability could simply be a re-packaging of the age-old topic of monitoring (and argued that no amount of “observability” or “monitoring” tooling can ever be a substitute for good engineering intuition and instincts). Others, like Charity Majors, have looked back at the roots of the term, which was taken from control theory and corresponds to a measure of how well the internal states of a system can be inferred from knowledge of its external outputs. Both Sridharan and Majors argue that an observable system should enable engineers to ask ad hoc (or, following an incident, post hoc) questions about how the software works during execution. This eMag explores the topic of observability in depth, covering the role of the “three pillars of observability”: monitoring, logging, and distributed tracing.
The speed at which cybercriminals launch attacks means the industry has no choice but to be more vigilant in protecting the precious information it keeps for its investors, so it can give more peace of mind to advisors and their clients. The public already sees cybercrime as a major threat. Research by Bitdefender, a cybersecurity technology provider based in Bucharest, Romania, finds U.S. citizens are more concerned about stolen identities (79%) than email hacking (70%) or home break-ins (63%). One major problem for the financial-services industry is that authentication methods are “severely outdated,” according to Harvey. “Many institutions have not yet recognized that cyberfelons already have the data to beat these practices. Millions of clients’ assets are at risk.” ... Today’s authentication practices largely rely on the use of private data, such as passwords, PINs and Social Security numbers — information that cyberfelons already possess.
For a data scientist or analyst to evolve into an effective leader, three personal qualities are needed: curiosity, imagination, and creativity. The three are sequentially linked. Curious people constantly ask “Why are things the way they are?” and “Is there a better way of doing things?” Without these qualities, innovation will be stifled. The emergence of analytics is creating opportunities for analysts as leaders. Weak leaders are prone to a diagnostic bias. They can be blind to evidence and somehow believe their intuition, instincts, and gut feel are an acceptable substitute for fact-based information. In contrast, a curious person always asks questions. They typically love what they do. If they are also good leaders, they infect others with their enthusiasm. Their curiosity leads to imagination. Imagination considers alternative possibilities and solutions. Imagination in turn sparks creativity.
“We must recognize that although technologies such as machine learning, deep learning, and AI will be cornerstones of tomorrow’s cyber defenses, our adversaries are working just as furiously to implement and innovate around them,” said Steve Grobman, chief technology officer at McAfee, in recent comments to the media. “As is so often the case in cybersecurity, human intelligence amplified by technology will be the winning factor in the arms race between attackers and defenders.” This has naturally led to fears that this is AI vs AI, Terminator style. Nick Savvides, CTO at Symantec, says this is “the first year where we will see AI versus AI in a cybersecurity context,” with attackers more able to effectively explore compromised networks, and this clearly puts the onus on security vendors to build more automated and intelligent solutions.
It is obvious that this discussion is heading in the direction of classic security hygiene — risk management, identity management, patch management, and so on — to the extent the customer needs it, which is basically risk management. This needs to be done in every infrastructure, and it needs to be done professionally. However, as most companies do not have IT as their core competence, they try to run security with a 0.5 FTE who then has to cover all the tasks needed — a mission impossible. Even the big, global companies have difficulties with their inventory, with patch management (as a consequence), with their identities, and so on. I am deeply convinced that the cloud can help here! But first we need to understand the different responsibilities, knowing that this discussion is far from new.
The branch network is a critical piece of the IT infrastructure for most distributed organizations. The branch network is responsible for providing reliable, high-quality communications to and from remote locations. It must be secure, easy to deploy, centrally manageable and cost effective. Requirements for branch networks continue to evolve, with needs for increased bandwidth, quality of service, security and support for IoT. SDN and network virtualization technologies have matured to the point where they can deliver significant benefits for branch networks. For example, SD-WAN technology is rapidly being deployed to improve the quality of application delivery and reduce operational complexity. SD-WAN suppliers are rapidly consolidating branch network functions and have reduced (or eliminated) the need for branch routers and WAN optimization. The broader concept of SD-Branch is still in its early stages. During 2018, we will see a number of suppliers introduce their SD-Branch solutions.
Quote for the day:
"No obstacle is so big that one person with determination can't make a difference." -- Jay Samit