Daily Tech Digest - January 23, 2018

Meltdown and Spectre: How much are ARM and AMD exposed?

AMD issued a statement on Meltdown and Spectre saying it is potentially vulnerable to only one of the attacks' three variants, and no one has demonstrated an AMD vulnerability as yet. This applies both to the new Epyc server processor and to older Opteron server chips, for the half dozen customers still using them. With ARM, it gets complicated. The company has published a list of cores at risk. ARM has three families of cores: Cortex-A, Cortex-M and Cortex-R. Cortex-M is a 32-bit embedded microcontroller used in Internet of Things (IoT) devices, so it has no exposure. Cortex-R is also an embedded controller, used in real-time applications such as cars; those run in closed systems and are unlikely to be attacked in practice, although ARM says some are at risk of exposure. Only the Cortex-A line has exposure, and not all of those chips are at risk. For example, the Cortex-A53, the most widely used processor in smartphones and tablets, is not at risk.



Blockchain and cryptocurrency may soon underpin cloud storage

The emerging blockchain-based distributed storage market could challenge traditional cloud storage services, such as Amazon AWS and Dropbox, for a cut of the cloud storage market. "Distributed compute and storage models are still in their infancy, but I do believe that there is an enormous market for this technology," said Paul Brody, Ernst & Young's (EY) Global Innovation Leader for Blockchain Technology. The idea of using P2P networks to aggregate computer resources is not new. In the early 2000s, BitTorrent opened as a distributed file-sharing service and grew to handle more than half of the internet's file-sharing bandwidth. Because blockchains come with a built-in mechanism for payments – cryptocurrencies, which were missing from the last go-around at P2P services – they are more likely to succeed, according to Brody.


Bitcoin: A cheat sheet for professionals

Bitcoin is the first decentralized cryptocurrency, but it's certainly not the only one. A large number of blockchain-based cryptocurrencies have emerged since 2009, which raises the obvious question: how is Bitcoin different? Aside from its much greater value, several things set Bitcoin apart from cryptocurrencies such as Ethereum, Dogecoin, Litecoin, and others. All of these cryptocurrencies use blockchain technology, but the method and purpose of each one is different. Ethereum, one of the most talked-about Bitcoin alternatives, isn't actually a value-transfer platform; instead, it is used for distributed application programming. Ethereum does have a monetary component in the form of its fuel, called Ether, but that's just one part of its overall model. Other cryptocurrencies, like Litecoin, Dogecoin, and PotCoin, use blockchains but don't rely on SHA-256 hashing like Bitcoin does; they use scrypt, a password-based key derivation function, to build coin hashes instead.
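To make the hashing distinction concrete, here is a minimal Python sketch contrasting Bitcoin-style double SHA-256 with a scrypt-derived hash. The header bytes and salt are made-up stand-ins; the scrypt parameters (n=1024, r=1, p=1) are those commonly cited for Litecoin's proof of work.

```python
import hashlib

# Bitcoin-style proof of work hashes a block header twice with SHA-256.
# (The header here is an arbitrary stand-in byte string.)
header = b"example block header"
block_hash = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()
print(block_hash)

# Litecoin-style proof of work instead uses scrypt, a memory-hard
# password-based key derivation function.
pow_hash = hashlib.scrypt(header, salt=b"example salt", n=1024, r=1, p=1, dklen=32)
print(pow_hash.hex())
```

The point of scrypt's memory-hardness is to blunt the advantage of specialized SHA-256 mining hardware, which is one reason scrypt-based coins emerged.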


What you need to know about Azure Notebooks

The underlying technologies are familiar: You can add content around executable code playgrounds using Markdown to format text. Azure Notebooks automatically adds UI to your code snippets, and you can use any of a selection of visualization tools for charting results. Data can be uploaded to and downloaded from local PCs, so you can take files you’ve been using with Excel’s analytics and use them in Azure Notebooks, letting you compare results and use business intelligence tools to prepare data before it’s used. You import online data with Curl or Wget, using Python code in a notebook or from a notebook’s built-in terminal window. There’s also integration with Dropbox, so you can share files with colleagues or use it to ensure you’re always working with the latest version of a file. Although Microsoft provides most of the tools you’ll need, it can only really support general-purpose analytical operations with tools like Python’s Anaconda data science extensions.


The InfoQ eMag: APM & Observability


The topic of “observability” has been getting much attention recently, particularly in relation to building and operating “cloud native” systems. Several thought leaders in this space, like Cindy Sridharan, have mused that observability could simply be a re-packaging of the age-old topic of monitoring (and argued that no amount of “observability” or “monitoring” tooling can ever be a substitute for good engineering intuition and instincts). Others, like Charity Majors, have looked back at the roots of the term, which was taken from control theory and corresponds to a measure of how well the internal states of a system can be inferred from knowledge of its external outputs. Both Sridharan and Majors discuss that an observable system should enable engineers to ask ad hoc (or, following an incident, post hoc) questions about how the software works during execution. This eMag explores the topic of observability in depth, covering the role of the “three pillars of observability”: monitoring, logging, and distributed tracing.
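As a toy illustration of the “ask ad hoc questions” idea, the sketch below emits structured JSON events instead of free-form log lines (the services, routes and fields are hypothetical), so a question nobody anticipated up front can still be answered from the output after the fact:

```python
import json

# Emit structured events rather than free-form text. Each event is a
# self-describing JSON record.
events = []

def log_event(**fields):
    events.append(json.dumps(fields))

log_event(service="checkout", route="/pay", status=200, latency_ms=42)
log_event(service="checkout", route="/pay", status=500, latency_ms=870)
log_event(service="cart", route="/add", status=200, latency_ms=12)

# Ad hoc (or post hoc) question: which requests failed, and from where?
errors = [json.loads(e) for e in events if json.loads(e)["status"] >= 500]
print(errors)
```

The design choice is that the query is written after the data already exists; with unstructured text logs the same question would require fragile string parsing.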




Are Advisors’ Cyberdefenses Strong Enough?

The speed at which cybercriminals launch attacks means the industry has no choice but to be more vigilant in protecting the precious information it keeps for its investors, so it can give more peace of mind to advisors and their clients. The public already sees cybercrime as a major threat. Research by Bitdefender, a cybersecurity technology provider based in Bucharest, Romania, finds U.S. citizens are more concerned about stolen identities (79%) than email hacking (70%) or home break-ins (63%). One major problem for the financial-services industry is that authentication methods are “severely outdated,” according to Harvey. “Many institutions have not yet recognized that cyberfelons already have the data to beat these practices. Millions of clients’ assets are at risk.” ... Today’s authentication practices largely rely on the use of private data, such as passwords, PINs and Social Security numbers — information that cyberfelons already possess.


Do data scientists have the right stuff for the C-suite?


For a data scientist or analyst to evolve into an effective leader, three personal qualities are needed: curiosity, imagination, and creativity. The three are sequentially linked. Curious people constantly ask “Why are things the way they are?” and “Is there a better way of doing things?” Without these qualities, innovation will be stifled. The emergence of analytics is creating opportunities for analysts as leaders. Weak leaders are prone to diagnostic bias: they can be blind to evidence and somehow believe their intuition, instincts, and gut feel are acceptable masquerades for fact-based information. In contrast, a curious person always asks questions. They typically love what they do, and if they are also good leaders they infect others with enthusiasm. Their curiosity leads to imagination, which considers alternative possibilities and solutions; imagination in turn sparks creativity.


6 ways hackers will use machine learning to launch attacks

“We must recognize that although technologies such as machine learning, deep learning, and AI will be cornerstones of tomorrow’s cyber defenses, our adversaries are working just as furiously to implement and innovate around them,” said Steve Grobman, chief technology officer at McAfee, in recent comments to the media. “As is so often the case in cybersecurity, human intelligence amplified by technology will be the winning factor in the arms race between attackers and defenders.” This has naturally led to fears that this is AI vs AI, Terminator style. Nick Savvides, CTO at Symantec, says this is “the first year where we will see AI versus AI in a cybersecurity context,” with attackers more able to effectively explore compromised networks, and this clearly puts the onus on security vendors to build more automated and intelligent solutions.


Why the Cloud is more secure than On Prem

It is obvious that this discussion is heading toward classical security hygiene: risk management, identity management, patch management and so on, to the extent needed by the customer, which is basically risk management. This needs to be done in every infrastructure, and it needs to be done professionally. However, as most companies do not have IT as their core competence, they try to run security with a 0.5 FTE who then has to cover all the necessary tasks, a mission impossible. Even the big, global companies have difficulties with their inventory, with patch management (as a consequence), with their identities and so on. I am deeply convinced that the cloud can help there! But first we need to understand the different responsibilities, knowing that this discussion is far from new.


SD-Branch: What it is and why you'll need it

The branch network is a critical piece of the IT infrastructure for most distributed organizations. It is responsible for providing reliable, high-quality communications to and from remote locations, and it must be secure, easy to deploy, centrally manageable and cost-effective. Requirements for branch networks continue to evolve, with needs for increased bandwidth, quality of service, security and support for IoT. SDN and network virtualization technologies have matured to the point where they can deliver significant benefits for branch networks. For example, SD-WAN technology is rapidly being deployed to improve the quality of application delivery and reduce operational complexity. SD-WAN suppliers are rapidly consolidating branch network functions and have reduced (or eliminated) the need for branch routers and WAN optimization. The broader concept of SD-Branch is still in its early stages; during 2018, we will see a number of suppliers introduce their SD-Branch solutions.



Quote for the day:



"No obstacle is so big that one person with determination can't make a difference." -- Jay Samit


Daily Tech Digest - January 22, 2018

Buildings should behave like humans

“When we look at buildings as living structures, we can understand how various systems are connected and operate together,” says Dr. Filip Ponulak, principal data scientist at Site 1001, in a press release. Ponulak says all buildings should now be listening for issues, and that it’s an innovative way of managing new buildings. Site 1001 believes its system would also work in older buildings. Chief Innovation Officer Eric Hall told me one could draw an analogy with an aging car, except that unlike cars, buildings don’t have odometers to help identify failing parts. In other words, by collecting data on failings, predicting upkeep becomes possible — you know when things are likely to fail and can pre-empt them, like a flexible car service schedule. That lets facilities management “move to an entirely conditional and proactive maintenance schedule,” says Hall on the company’s website. Data centers fit into this platform, too, the company says. Indeed, I’ve written before about folks who think AI will ultimately self-manage the data center.



The future of AI and endpoint security

The key to machine learning success currently lies in the cloud. Traditional servers are not large or fast enough to process the data and create the models needed to detect and combat attacks, but by using cloud servers the process is quicker, easier and much more affordable than ever before, bringing it into the reach of more enterprises. Hackers are already using automated systems, machine learning and AI to create new cyber threats. Security experts think the next 12 months will see an acceleration in the adoption of machine learning by hackers as they try to carry out increasingly sophisticated phishing attacks. However, AI antivirus solutions are still relatively thin on the ground. Although a small number of companies do offer machine learning and AI cyber threat solutions for endpoints, such as Cylance, Darktrace and Symantec, this really should become the industry standard. Microsoft at least seems to have learned from its experience of WannaCry and is apparently turning to AI to create the next generation of anti-virus software.



Infosec expert viewpoint: Google Play malware

Another issue facing Google Play security is the complex and fragmentary nature of the Android device ecosystem, which has given rise to a patching problem, as unpatched devices are attractive targets. Google has been striving to improve on this issue, but a lack of direct control (multiple wireless carriers and manufacturers are responsible for pushing patches to a multitude of devices) will continue to hamper its efforts. Users should be discerning and skeptical when downloading anything and have passive protection along with regular backups. Watch out for malicious apps mimicking popular, reputable apps and check an app’s permissions to make sure it does not have access beyond its stated functionality. Although they cannot make up for preventative measures such as checking permissions, anti-malware products provide some protection from malicious code and can partially make up for failures to avoid malicious apps.


Collect the Dots: The New Possible for Digital Evidence

At the heart of this transformation is the power of computers. On all fronts, computational capability has soared in the last few years. Processor speeds are much faster than before, and multi-core technology takes that power to new levels. Network speeds have increased greatly, enabling much more data to be sent where it's needed, quickly and efficiently. And at the foundation of computing, the cost of data storage has never been lower. Cloud computing is another major force of change. The on-ramp for enterprise cloud computing has been long by some accounts, as many analysts predicted a faster migration from traditional data centers. But in 2017, enterprise cloud really took off. That's partly due to the ongoing juggernaut of Amazon Web Services, which arguably managed to pull off a 10-year head start on its competitors. But now, all the major software vendors are involved, including Microsoft with Azure, IBM with Bluemix, Oracle Cloud, Google Cloud and the SAP HANA Cloud Platform.


Wide-area networks: What WANs are and where they’re headed

Many believe that SD-WAN is poised to take off in 2018, moving from an early-adopter technology to mainstream implementation. Research firm IDC has predicted that SD-WAN revenues will hit $2.3 billion in 2018, with a potential revenue target of $8 billion by 2021. The first phase of SD-WAN aimed at creating hybrid WANs and aggregating MPLS and Internet connections to lower costs; the next phase will improve management and monitoring and provide better security, according to Lee Doyle of Doyle Research. A subset of SD-WAN called SD-Branch will help reduce the need for hardware within branch offices, replacing many physical devices with software running on off-the-shelf servers. Mobile backup across an SD-WAN can provide a failover for broadband connections as wireless WAN technology (4G, LTE, etc.) costs decrease. ... Asynchronous Transfer Mode (ATM) is similar to frame relay, with one big difference being that data is broken into standard-sized 53-byte packets called cells.
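The cell idea can be sketched in a few lines: an ATM cell is a fixed 53 bytes, a 5-byte header plus a 48-byte payload, so a toy segmenter just chunks the data to the payload size and pads the final cell. (Real ATM adaptation layers handle padding and headers differently; this only illustrates the fixed-size split.)

```python
CELL_PAYLOAD = 48  # bytes of payload per 53-byte ATM cell (5 bytes are header)

def to_cells(data: bytes) -> list:
    """Split data into fixed-size payloads, zero-padding the last one."""
    cells = []
    for i in range(0, len(data), CELL_PAYLOAD):
        chunk = data[i:i + CELL_PAYLOAD]
        cells.append(chunk.ljust(CELL_PAYLOAD, b"\x00"))
    return cells

cells = to_cells(b"x" * 100)
print(len(cells))  # 100 bytes -> 3 cells (48 + 48 + 4 padded to 48)
```

The fixed cell size is what lets ATM switches make fast, predictable forwarding decisions, in contrast to frame relay's variable-length frames.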


OnePlus Attackers Steal Credit Card Data From 40,000 Customers

The admission that there was a data breach comes three days after OnePlus announced that it was temporarily disabling credit card payments on its website. OnePlus disabled the credit card payments on Jan. 16, after receiving reports from customers that they were seeing unknown credit card charges after buying something online from OnePlus. "One of our systems was attacked, and a malicious script was injected into the payment page code to sniff out credit card info while it was being entered," OnePlus stated in an advisory on the breach. The attack appears to have been ongoing from mid-November 2017 until Jan. 11, 2018, OnePlus said. According to the company, credit card information (card numbers, expiration dates and security codes) that was entered on the Oneplus.net site may have been compromised. Users who saved their credit card information on the site, as well as those who use PayPal, do not appear to be impacted by the breach, however.


Take your online security more seriously this year

“People, me included, are lazy,” says web developer Joe Tortuga, “and ease of use is inversely related to security. If it’s too difficult, people just won’t do it.” Well, if you don’t work towards a safer internet for yourself and others, then who will? One of the most important online security measures we can adopt is using strong, difficult-to-guess passwords. Alas, in most cases, our passwords are not strong enough to confuse a hacker looking for a back door. A strong password should contain a mix of different characters that make it difficult to guess. In some cases, when signing up for a particular service on the internet, users supply their real details, an approach that many internet security experts advise against. “What happens is that you build up an online profile of yourself across several sites that hackers can use to guess your weak passwords,” says Shaun Murphy, CEO of the online security company PrivateGiant.
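The strong-password advice is easy to follow with a standard library; a minimal sketch using Python's cryptographically secure `secrets` module (the length and character set here are arbitrary choices, not a recommendation from the article):

```python
import secrets
import string

# Letters, digits and punctuation give a large alphabet to draw from.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def strong_password(length: int = 16) -> str:
    """Return a random password using a cryptographically secure source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(strong_password())
```

Crucially, `secrets` draws from the operating system's secure random source, unlike the `random` module, whose output is predictable and unsuitable for passwords.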


Understanding Supply Chain Cyber Attacks

Host organizations now face having to adapt security procedures to include not just internal infrastructures, but also vendors, customers, and even partners. While internal IT and security departments might have strong security practices for thwarting a wide range of direct attacks, third-party collaborators might not adhere to the same culture. Consequently, programs for vetting vendors need to be in place before fully integrating them into internal infrastructures. Building a vendor management program is ideal and should start with defining an organization's most important vendors. Building the program around a risk-based approach ensures that vendors are constantly evaluated and assessed, and their policies are consistent with the host organization. Besides requiring vendors to provide timely notification of any internal security incident, periodic security reports should be included in the collaboration guidelines to regularly ascertain their security status.


What are the key areas that need to be transformed for a smart city concept?

Manchester Smart City
True innovation comes from collaboration. This belief sits at the core of the Open Innovation challenge, launched today by Cisco and Manchester Science Partnerships (MSP), who are searching for some of the UK’s best small and medium-sized enterprises (SMEs) to work with, with a vision to transform Manchester through smart technologies. Convened by CityVerve, the UK’s smart city demonstrator, the challenge will see eight SMEs selected to participate in an eight-week initiative in Manchester that combines technology, data and creativity to tackle some of the city’s biggest problems in healthcare, transport and energy. Commencing in March 2018, the initiative gives SMEs the opportunity to work with partners from the public sector, corporate and academic worlds who are part of the CityVerve Internet of Things (IoT) test bed. The eight selected SMEs from across the UK will have the opportunity to put their innovative solutions to the test in a real-life situation.


Salted Hash Ep 15: The state of security now and the not too distant future

The adage that ‘it’s not if you’ll be hacked, but when’ is still realistic, but maybe now it’s wiser to consider what your organization can do to get in front of any potential situation and prevent as much damage as possible. “My prediction is that you’re going to see executives stop treating IT security as a product they can implement, and start treating it as an operational concern equally as important as managing their finances,” Lee remarked. In addition to looking ahead, we also take a look back at some interesting moments in 2017. One of the standouts is the Justice Department naming foreign actors and indicting them for their acts. This leads to an interesting conversation about the crossover between law and government operations, and down the path of: once it’s on the internet, it’s there forever.



Quote for the day:


"There are three secrets to managing. The first secret is have patience. The second is be patient. And the third most important secret is patience." -- Chuck Tanner


Daily Tech Digest - January 21, 2018

Calculating the Costs of a Cyber Breach: Becoming the “Antifragile” Cyber Organization

We like the antifragile concept for two main reasons. First, when it comes to cybersecurity, what concerns people like us are these low-probability/high-impact events, sometimes called “fat-tail” events, that are difficult to account for and even harder to predict. Sure, we can say that a spear-phishing campaign could be catastrophic, but identifying which spear-phishing campaign will be the straw that breaks the camel’s back is a whole lot harder, if not impossible. Second, we like the antifragile concept because it is not only about resisting the breach; it is also about learning from the breach attempt. We like that, and that’s where we would like all organizations to be when it comes to their cyber posture. (Note: we are giving you the super-oversimplified version of the antifragile concept.) So, if we want to become an “antifragile cyber organization,” where do our concerns lie? Actually, it is not so much with the technical capabilities.



MADIoT – The nightmare after XMAS (and Meltdown, and Spectre)


Now, there is a much larger underlying issue. Yes, software bugs happen, and hardware bugs happen. The first are usually fixed by patching the software; in most cases the latter are fixed by updating the firmware. However, that is not possible with these two vulnerabilities, as they are caused by a design flaw in the hardware architecture, fixable only by replacing the actual hardware. Luckily, with cooperation between the suppliers of modern operating systems and the hardware vendors responsible for the affected CPUs, the operating systems can be patched, complemented if necessary with additional firmware updates for the hardware. Additional defensive layers preventing malicious code from exploiting the holes – or at least making it much harder – are an “easy” way to make your desktop, laptop, tablet and smartphone devices (more) secure, although sometimes at the cost of a slowdown in device performance.


Forget bitcoin. Here come the blockchain ETFs

"Investors have been buying blindly, and there has been some abuse," said Christian Magoon, CEO of Amplify ETFs. "The SEC has to protect investors." But make no mistake. These two funds are set up to take advantage of the growing interest in blockchain. This is not the Winklevoss Bitcoin Trust, a fund that only owns bitcoin and is run by Cameron and Tyler, of Facebook and "The Social Network" movie fame. The Winklevii want to launch an ETF with the ticker symbol COIN, but the SEC has yet to approve it. In fact, the SEC seems unlikely to greenlight any funds that just want to invest in cryptocurrencies. Dalia Blass, director of the SEC's Division of Investment Management, wrote in a letter Thursday that it had many questions about these funds. And she said that until they are addressed, "we do not believe that it is appropriate for fund sponsors to initiate registration of funds that intend to invest substantially in cryptocurrency and related products."


Applying Quantum Physics to Data Security Matters Now and in the Future


As one of the few companies leveraging the power of quantum physics for random number generation, we also offer advanced key and policy management features that give customers complete control over the lifecycle and use of encryption keys. “Encryption and key management is complicated enough, and the injection of quantum mechanics into the discussion is enough to make most folks’ heads spin,” note co-authors Garrett Becker and Patrick Daly in the report. “By combining the added security of encryption keys based on quantum random numbers, advanced key lifecycle management and an HSM for protecting those keys, QuintessenceLabs has developed a compelling offering for those enterprises and agencies with a need for the highest level of data security.”


How long will patient live? Deep Learning takes on predictions

The team described their work in the paper "Improving Palliative Care with Deep Learning," which is up on arXiv. The paper was submitted in November. The authors are Anand Avati, Kenneth Jung, Stephanie Harman, Lance Downing, Andrew Ng and Nigam Shah; their Stanford affiliations span the Department of Computer Science, the Center for Biomedical Informatics Research, the Department of Medicine and the Stanford University School of Medicine. The algorithm was not developed to replace doctors but rather to provide a tool to improve the accuracy of prognoses, or, as Jeremy Hsu of IEEE Spectrum wrote, "as a benign opportunity to help prompt physicians and patients to have necessary end-of-life conversations earlier." One can think of it as a triage tool for improving access to palliative care, Stephanie Harman, clinical associate professor of medicine at Stanford University and a co-author of the new study, told Gizmodo.


New Security Architecture Practitioner’s Initiative


The Security Architecture Practitioner’s Initiative is a joint effort of The Open Group Security Forum (a global thought leader in Enterprise Architecture) and The SABSA Institute to articulate, in a clear, approachable way, the characteristics of a highly qualified Security Architect. The focus of this initiative is on the practitioner, the person who fills the role of the Security Architect, and on the skills and experience that make them great. The project is not about security architecture as a discipline, nor about a methodology for security architecture, but rather about people and what makes them great Security Architects. The project team, operating under the auspices of The Open Group and in collaboration with The SABSA Institute, consists of pioneering Security Architects drawn from both organizations who have between them many decades of security architecture experience at organizations such as Boeing, IBM, HP, and NASA.


Public Blockchain's Lure Will Become Irresistible for Enterprises in 2018


Central banks are already experimenting with the tokenization of their own currencies, but doing so in private, permissioned or proprietary blockchains that are managed by the central banks. It is a good start, but the next logical step is to create the legal and regulatory framework that enables the tokenization of fiat currency on any industrial or public blockchain. Once a closed-loop tokenized industrial blockchain exists, many of the key foundations of specialized blockchains would become add-on features in the true economic blockchain. Trade finance is easy if you trust that the representation of 1,000 phones, each worth $1,000, is accurate — you can loan money against those tokens in the blockchain. Similarly, customs declarations, tax calculations, and product history and provenance are all easily derived from looking at the history of the tokens in that blockchain. No separate blockchain is required for trade finance, payments or product traceability.
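The claim that provenance falls out of token history can be illustrated with a toy ledger. The token IDs and owners below are invented, and a real blockchain would add blocks, signatures and consensus; this only shows that once transfers are recorded, history and provenance are a simple query, not a separate system.

```python
# Each transfer is an append-only record: (token_id, from_owner, to_owner).
transfers = []

def transfer(token_id, frm, to):
    transfers.append((token_id, frm, to))

def provenance(token_id):
    """Derive a token's full chain of custody from the transfer history."""
    return [(f, t) for tid, f, t in transfers if tid == token_id]

# Hypothetical tokens, each representing one phone.
transfer("phone-001", "factory", "exporter")
transfer("phone-001", "exporter", "retailer")
transfer("phone-002", "factory", "wholesaler")

print(provenance("phone-001"))
```

In the same spirit, a lender assessing trade finance could count how many tokens a party currently holds by replaying the same history, rather than consulting a dedicated trade-finance chain.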


World Wide Data Wrestling

These days, almost every person, company and other entity is chasing data. By combining all of this disparate data, predictive analytics can create highly accurate models to forecast pollution trends in advance, allowing civic agencies to make relevant predictions and changes to prevent spikes and keep pollution levels in check. Big data and analytics can also help improve traffic management in addition to monitoring pollution levels, and the fates of artificial intelligence and big data are intertwined ... Organizations collect data from a variety of sources, including business transactions, social media and sensor or machine-to-machine data. In the past, storing it would have been a problem, but new technologies (such as Hadoop) have eased the burden. A comprehensive and widespread network such as this, tracking the causes of pollution at the source, will allow government agencies to create smarter strategies to combat pollution.



Don’t be fooled: AI-powered tech still needs to prove its intelligence

It’s essentially a definitional problem: for some reason, the industry is hell-bent on using AI when what it actually means is machine learning (ML). This is a much narrower term, referring to what is essentially using trial and error to build a model that’s capable of guessing the answers to discrete questions very accurately. For example, take image recognition: say you want to build a system that separates pictures of cats from pictures of dogs. All you have to do is feed an ML algorithm enough pictures of cats, telling the system they are cats, and then enough pictures of dogs, telling it they are dogs. It will then build a model of what patterns to look for and eventually, after enough training, you should be able to feed it an unlabelled image and it will make a fairly accurate guess as to which of the two animals is in the picture.
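The trial-and-error shape of this can be sketched without any ML library: "train" by averaging labelled feature vectors per class, then "guess" by nearest centroid. The two numeric features below are invented stand-ins for whatever patterns a real image model would learn; the point is only the train-then-predict workflow.

```python
def train(examples):
    """examples: list of (features, label). Returns one centroid per label."""
    sums, counts = {}, {}
    for feats, label in examples:
        s = sums.setdefault(label, [0.0] * len(feats))
        for i, v in enumerate(feats):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def predict(model, feats):
    """Guess the label whose centroid is closest (squared distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(feats, centroid))
    return min(model, key=lambda lbl: dist(model[lbl]))

# Hypothetical features, e.g. (ear pointiness, snout length), in [0, 1].
model = train([((0.9, 0.2), "cat"), ((0.8, 0.3), "cat"),
               ((0.2, 0.9), "dog"), ((0.3, 0.8), "dog")])
print(predict(model, (0.85, 0.25)))  # -> cat
```

A real image classifier replaces the hand-picked features with ones learned from many labelled pictures, but the guess-check-adjust loop the article describes is the same idea.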


What Aspiring Data Scientists Are Looking For in Hiring Companies

Not long ago, big data was the exclusive territory of the most prominent IT brands. There weren’t as many experts in the field 20 or even 10 years ago, and many small companies functioned perfectly fine without using big data. But this isn’t the case anymore. Even the smallest startup companies and entrepreneurial ventures now rely on big data to execute their business models, locate specific demographics and ensure long-term success. Businesses leave almost nothing to chance anymore, and much of this transition is tied directly to the big data boom. Perhaps above all else, data scientists want to work with a company that offers job security and stability. Many professionals in the niche realize how few openings exist in the field, so they’ll be pursuing long-term assignments whenever possible. Companies that can accommodate this need and provide guaranteed work will likely find it easier to fill roles in big data management, as opposed to those that only need to complete a one-time project.



Quote for the day:


"A leader does not deserve the name unless he is willing occasionally to stand alone." -- Henry A. Kissinger


Daily Tech Digest - January 19, 2018

A fog computing fabric can have a variety of components and functions. It could include fog computing gateways that accept data IoT devices have collected. It could include a variety of wired and wireless granular collection endpoints, including ruggedized routers and switching equipment. Other aspects could include customer premises equipment (CPE) and gateways to access edge nodes. Higher up the stack, fog computing architectures would also touch core networks and routers and eventually global cloud services and servers. The OpenFog Consortium, the group developing reference architectures, has outlined three goals for a fog framework: fog environments should be horizontally scalable, meaning they will support multiple industry vertical use cases; able to work across the cloud-to-things continuum; and a system-level technology that extends from things, over network edges, through to the cloud, and across various network protocols.


How to enable TCP BBR to improve network speed on Linux

Google developed a TCP Congestion Control Algorithm (CCA) called TCP Bottleneck Bandwidth and RTT (BBR) that overcomes many of the issues found in both Reno and CUBIC (the default CCAs). This new algorithm achieves not only significant bandwidth improvements but also lower latency. TCP BBR is already employed on google.com servers, and now you can make it happen--so long as your Linux machine is running kernel 4.9 or newer. Out of the box, Linux uses Reno and CUBIC. ... The first thing you need to do is make sure your Linux machine is running a supported kernel. Issue the command uname -r. If your kernel is earlier than 4.9, this won't work and you'll have to upgrade your kernel. For instance, out of the box Ubuntu 16.04 runs kernel 4.4. If your server is such that the kernel can be updated, Ubuntu now has a very easy means of updating to a much newer kernel.
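On a supported kernel, BBR is enabled through sysctl. The following is a minimal sketch (the fq queueing discipline is commonly paired with BBR for packet pacing); since it changes system-wide settings and requires root, treat it as commands to adapt, not a script to run blindly:

```shell
# Check the running kernel; BBR requires 4.9 or newer
uname -r

# Switch the congestion control algorithm to BBR, with fq as the qdisc
echo "net.core.default_qdisc=fq" | sudo tee -a /etc/sysctl.conf
echo "net.ipv4.tcp_congestion_control=bbr" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# Verify the change took effect
sysctl net.ipv4.tcp_congestion_control
```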



Car hacking remains a very real threat as autos become ever more loaded with tech

A large-scale vehicle hacking resulting in death and destruction was depicted in last year's "The Fate of the Furious" action movie. “That’s Hollywood sensationalizing it, but that is not really that far-fetched," said Joe Fabbre, a director with Santa Barbara, Calif.-based Green Hills Software, which makes operating systems software for vehicles with a focus on security. “There are very skilled hackers out there who can beat through a lot of medium and low levels of robustness in terms of security that is present in a lot of cars today.” In response to the hacking threat, more vehicles are gaining the ability to wirelessly download security patches, similar to how computers and smartphones have been getting software updates for years. These over-the-air updates allow auto companies to respond to threats – and newly discovered vulnerabilities – faster than having to direct customers to bring their vehicles to dealerships.


Ransomware: Why the crooks are ditching bitcoin and where they are going next

"If you're the guy behind the ransomware campaign, you want people to pay you -- you don't want people not to be able to pay you! You want to make it as easy as possible," said Glusman. Meanwhile, ransomware victims don't really want to have to pay to get their files back at the best of times - they give in grudgingly - but the incentive to pay might go out the window if it's going to take them days to buy bitcoin and pay the hackers before getting their files back. And there's an even bigger headache: many forms of ransomware offer only a small window for victims to pay the ransom. If that expires, victims risk the ransom going up or even losing their data permanently. Delays in buying bitcoin and making the payment make it even harder for ransomware victims to get their data back. This is also a headache for the ransomware crooks: ultimately, there's no point in a ransomware distributor being in the business if they can't get paid for their illicit activity.


Create security culture to boost cyber defences, says Troy Hunt


“Even organisations that are security aware enough to be training employees on various related topics do not necessarily know how to make those hard skills part of the organisation’s culture,” he said. This realisation, he said, led to the development of a course on creating a security-centric culture for Pluralsight, an enterprise technology learning platform company. The course is aimed at helping technology professionals and management understand how to embed a culture of security in their organisations, said Hunt. Part of the problem, he said, is that many organisations’ development and security teams tend to work in separate silos. Typically, development groups build the software before it is passed to the security team, but this creates a divide between these groups. Developers tend to be scared of the security people, said Hunt, because the security people can stop software projects from going live if any critical security vulnerabilities are identified in the software code.


Blockchain, digital trust and distributed ledger technology – going big business

Blockchain technology and distributed ledger in business
The question is how to deal with ever more, and ever faster, transactions at the core of digital business in a reliable way that doesn't slow them down but, on the contrary, offers the speed they need in a trustworthy and cost-efficient way. Using a distributed technology is the answer for many. Enter blockchain technology. As mentioned in the introduction, blockchain technology is rooted in the world of cryptocurrencies, more specifically Bitcoin. That connotation will disappear, and we will not speak about the blockchain but about blockchains (note the letter ‘s’), blockchain technology or distributed ledger technology. Blockchain technology is being tested and implemented across a broad range of industries and use cases. Examples, on top of the Internet of Things and financial services...


Leverage the power of the mainframe to make sense of your IoT data

Thankfully, new innovations on the mainframe make it possible to leave IoT data in each of the databases where it resides and join the different data sources to perform your analytics. After all, with so many IoT devices collecting data and storing it in different locations, eliminating the ETL/ELT process altogether certainly sounds like a more efficient means of analyzing data. The concept of data virtualization allows users to define the structure of a data source in a relational format so that SQL can be run against that data. This means disparate hierarchical data sources can be joined via SQL, just like relational data sources, creating an aggregate view of the available data, enterprisewide. Reading IoT data in situ using SQL can be performed on the mainframe using its enhanced capabilities and can eliminate the ETL/ELT process entirely. By using the workhorse platform retailers already have within their infrastructure, they can gain real-time access to their IoT data and provide their customers with a frictionless and customized shopping experience.
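The "join in place, no ETL" idea can be sketched with sqlite3 standing in for the mainframe's data virtualization layer: two separate stores are queried through one SQL statement. The table names, the attached second store, and the retail example are illustrative assumptions, not any vendor's actual API:

```python
import sqlite3

con = sqlite3.connect(":memory:")                 # relational customer data
con.execute("ATTACH DATABASE ':memory:' AS iot")  # a second, separate store
con.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
con.execute("CREATE TABLE iot.readings (customer_id INTEGER, beacon TEXT)")
con.execute("INSERT INTO customers VALUES (1, 'Ada'), (2, 'Lin')")
con.execute("INSERT INTO iot.readings VALUES (1, 'aisle-3'), (1, 'aisle-7')")

# One SQL statement joins both sources in place; no ETL/ELT step ran
rows = con.execute("""
    SELECT c.name, r.beacon
    FROM customers c JOIN iot.readings r ON r.customer_id = c.id
    ORDER BY r.beacon
""").fetchall()
print(rows)   # -> [('Ada', 'aisle-3'), ('Ada', 'aisle-7')]
```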


Cloud portability: Why you’ll never really get there

The reality is that porting applications, whether they're in containers or not, requires a great deal of planning to deal with the compatibility issues of the different environments. The use of containers does not guarantee that your containerized applications will be portable from platform to platform, cloud to cloud. For example, you can't take a containerized application meant for Linux and run it on Windows, or the other way around. Indeed, containers are really just a cool way of bundling applications with operating systems. You do get enhanced portability capabilities with containers, but you don't get the “any platform to any platform” portability that many believe they provide. Of course, enterprises want portability. And you can have it. All that's needed is a greater planning effort when it comes to creating the applications in the first place. The fact is that all applications are portable if you have enough time and money.


No-collar workforce: collaborating in roles and new talent models


For HR organizations in particular, this trend raises a number of fundamental questions. For example, how can companies approach performance management when the workforce includes bots and virtual workers? What about onboarding or retiring non-human workers? These are not theoretical questions. One critical dimension of the no-collar workforce trend involves creating an HR equivalent to support mechanical members of the worker cohort. Given how entrenched traditional work, career, and HR models are, reorganizing and reskilling workers around automation will likely be challenging. It will require new ways of thinking about jobs, enterprise culture, technology, and, most importantly, people. Even with these challenges, the no-collar trend introduces opportunities that may be too promising to ignore. What if, by augmenting a human's performance, you could raise their productivity on the same scale that we have driven productivity in technology?


What is the impact and likelihood of global risks?

“Cyber-attacks are increasing and have become a global concern as many systems and devices that run critical infrastructure and decision making are now connected through the world wide web. Public and private companies have become more vulnerable to cyber-attacks as established IT security controls are now failing to protect current systems. As a result, cyber-attacks have been deemed one of the greatest threats and concerns to eight global economies – the USA, Germany, Estonia, Japan, Holland, Switzerland, Singapore, and Malaysia,” he noted. “This means that it is highly important that cyber-attacks become an urgent boardroom debate; they are no longer an IT problem, but a whole-company problem, and everyone is now responsible for cybersecurity. Cyber risks put regulatory frameworks under pressure as they adapt to these new high-frequency and high-risk economic threats.”



Quote for the day:


"Knowledge is the new capital, but it's worthless unless it's accessible, communicated, and enhanced." -- Hamilton Beazley


Daily Tech Digest - January 18, 2018

Understanding Supervised, Unsupervised, and Reinforcement Learning


With supervised learning, you feed the output of your algorithm into the system. This means that in supervised learning, the machine already knows the output of the algorithm before it starts working on or learning it. A basic example of this concept would be a student learning a course from an instructor: the student knows what he or she is learning from the course. With the output known, all the system needs to do is work out the steps or process needed to get from the input to the output. The algorithm is taught through a training data set that guides the machine. If the process goes haywire and the algorithm comes up with results completely different from what is expected, the training data does its part to guide the algorithm back towards the right path. Supervised machine learning currently makes up most of the ML being used by systems across the world. An algorithm is used to learn the mapping from the input variable (x) to the output variable (y).
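The idea of learning from known (x, y) pairs can be sketched with a toy nearest-neighbour classifier. This is a hypothetical, minimal example in plain Python, not any particular ML library: the training pairs are the "known outputs" the text describes, and prediction just copies the label of the closest training input:

```python
# Training pairs: inputs with their known outputs (supervision)
train = [(1.0, "small"), (2.0, "small"), (8.0, "large"), (9.0, "large")]

def predict(x):
    """Label a new input with the label of its closest training example."""
    nearest_x, nearest_y = min(train, key=lambda pair: abs(pair[0] - x))
    return nearest_y

print(predict(1.4), predict(8.6))   # -> small large
```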



Shift your Java applications into containers with Jelastic PaaS

application container vs system container
System containers offer multiple benefits when migrating an existing legacy application. IP addresses, hostnames, and locally stored data can survive container downtimes, there's no need for port mapping, and you gain far better isolation and virtualization of resources. Plus you get compatibility with SSH-based config tools and even hibernation and live migration of the memory state. The only perceptible disadvantage compared to application containers might be a slower start-up time, as system containers are a bit heavier due to the additional services required for running multiple processes. In Jelastic, it is possible to run both application and system containers. In contrast to other PaaS vendors that use the so-called Twelve-Factor App methodology, Jelastic does not force customers to use any specific approach or application design in order to deploy cloud-native microservices and legacy monoliths.


How machine learning can be used to write more secure computer programs

By transforming software code into a graph, you can actually extract different properties from that code by analyzing the graph. ... Let's take a smaller function that might have one IF block. One of the graph structures that's first generated is called an abstract syntax tree. That's the tree you'd get by just parsing the code. ... For each IF and for each variable, for each statement, there's going to be a node. For each operator, like if there's an assignment, there's also going to be a node, and they are all connected by edges. You soon run into a lot of nodes and edges. If you take something like, let's say, the Linux kernel, you'll have several hundred thousand nodes. ... You can do a lot by essentially solving reachability problems in these graphs.
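Python's own ast module can illustrate the first structure mentioned, the abstract syntax tree, and how quickly nodes and edges accumulate even for a tiny function with one IF block. This is a sketch of the concept, not the researchers' actual tooling:

```python
import ast

# A small function with one IF block, like the interview's example
code = """
def f(a, b):
    if a > b:
        c = a - b
    else:
        c = b - a
    return c
"""
tree = ast.parse(code)

# Every statement, variable and operator becomes a node...
nodes = list(ast.walk(tree))
# ...connected by parent-child edges (a tree has exactly nodes - 1 edges)
edges = sum(len(list(ast.iter_child_nodes(n))) for n in nodes)
print(len(nodes), edges)
```

Even this six-line function produces dozens of nodes; scaling the same construction to the Linux kernel is what yields the hundreds of thousands of nodes described above.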


Cheap Raspberry Pi alternatives: 20 computers that cost less than the Pi 3


On its release in 2012, the $35 Raspberry Pi showed just how much computer you could get for a bargain-basement price. But the cost of single-board computers has just kept dropping, with the Raspberry Pi Foundation releasing the tiny Pi Zero for just $5. Today the Zero is one of several computers with a single-digit price tag, and if you're looking for a cheap-as-chips board you're spoiled for choice. These are the single-board computers that you can pick up for less than the price of the $35 Pi. One thing to bear in mind is that the cheapest offerings lack many of the features of the Raspberry Pi 3 Model B, and have more in common with the $5/$10 Raspberry Pi Zero. Even the more expensive boards are at something of a disadvantage compared to the Pi range, lacking its breadth of stable software, tutorials and community support.


Server vendors push flex pricing to challenge cloud providers

cloud computing - data center
For many customers who are just starting to use cloud services, these plans offer a means for making a graceful transition. “Customers want the flexibility to start with a certain capacity and scale as needed,” he said. The success of public cloud services is forcing the hand of vendors to compete by addressing some of the shortcomings of cloud. “Our advice back to these vendors is the time is right because more and more companies are putting more workloads on public cloud that require more storage,” said Stanley Stevens, an analyst with Technology Business Research. “They are getting sticker shock because when they replicated their environment in the public cloud, they realized a lot of that storage is inefficient and unused.” The real cost of the cloud is not the workload, but moving data around, he said. Cloud providers like Amazon and Microsoft charge you for data sent up to their data center, storage, processing, and data sent back down to you.


IT sabotage: Identifying and preventing insider threats


Insiders who commit IT sabotage are technically competent users who have the access and ability to carry out an attack, as well as the capability to conceal their illicit activities. These characteristics make detecting this kind of insider IT sabotage very difficult, as malicious behavior rarely looks any different than normal behavior. ... However, in nearly every IT insider sabotage attack, distinct patterns have been discovered, and the detection of these patterns can help identify malicious insider activities. The CERT Insider Threat Center has been working for more than 15 years cataloging, analyzing and detecting patterns of malicious insider behavior in order to understand who commits insider attacks, why they do it, when and where they do it, and how they carry out their attacks.


Next-gen Mirai botnet targets cryptocurrency mining operations


Satori.Coin.Robber works “primarily on the Claymore Mining equipment that allows management actions on port 3333 with no password authentication enabled (which is the default config),” the researchers said. “To prevent potential abuse, we will not discuss details.” Analysis of the botnet code revealed similarities with the original Satori, including similar code structures, encrypted configurations, similar configuration strings, and the same payload. However, the new variant also comes with a payload targeting the Claymore Miner that features an asynchronous network connection method and enables a new set of command and control communication protocols. Researchers noted that the author behind Satori.Coin.Robber has claimed the code is not malicious, and has even left an email address behind.


Convolutional neural networks for language tasks

Notice how the CNN processes the input as a complete sentence, rather than word by word as we did with the LSTM. For our CNN, we pass a tensor with all word indices in our sentence to our embedding lookup and get back the matrix for our sentence that will be used as the input to our network. Now that we have our embedded representation of our input sentence, we build our convolutional layers. In our CNN, we will use one-dimensional convolutions, as opposed to the two-dimensional convolutions typically used on vision tasks. Instead of defining a height and a width for our filters, we will only define a height, and the width will always be the embedding dimension. This makes sense intuitively, when compared to how images are represented in CNNs. When we deal with images, each pixel is a unit for analysis, and these pixels exist in both dimensions of our input image.
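A minimal sketch of such a one-dimensional text convolution in plain Python, with illustrative random values standing in for learned embeddings and filter weights: the filter's width is fixed to the embedding dimension, so it slides only along the sentence axis:

```python
import random
random.seed(0)

emb_dim, filter_h = 4, 3    # embedding width; a height-3 (trigram) filter
# A 7-word sentence: one embedding vector per word
sentence = [[random.random() for _ in range(emb_dim)] for _ in range(7)]
filt = [[random.random() for _ in range(emb_dim)] for _ in range(filter_h)]

def conv1d(sent, f):
    """Slide the filter along the sentence only; it spans the full width."""
    out = []
    for i in range(len(sent) - len(f) + 1):
        out.append(sum(sent[i + r][c] * f[r][c]
                       for r in range(len(f)) for c in range(emb_dim)))
    return out

feature_map = conv1d(sentence, filt)
print(len(feature_map))   # 7 words - filter height 3 + 1 = 5 positions
```

Each output position corresponds to one trigram of the sentence, which is why no second sliding dimension is needed, unlike the pixel grids of vision tasks.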


Configuration errors in Intel workstations being labeled a security hole

data breach security threat lock crime spyware
Normally computers with AMT have a BIOS password to prevent making low-level changes, but due to insecure defaults in the BIOS and AMT's BIOS extension (MEBx) configuration, an attacker with physical access can log in using the default password “admin.” Given the bad security habits of many people, there's a good chance this default password was not changed. By changing the default password, enabling remote access and setting AMT's user opt-in to “None,” the attacker has now backdoored the machine and can gain access to the system remotely, assuming the attacker is on the same network as the target machine. Intel says this is a problem in how the machine is configured by the OEM. Its recommendation is that MEBx access be gated by the BIOS password, and it has said so since 2015. What F-Secure found is that some system manufacturers were not requiring a BIOS password to access MEBx. So it updated its guidance for proper AMT/MEBx security in December.


The role of the data curator: Make data scientists more productive

The data curator has a good understanding of the types of systems that store the data, and the types of tools that can be used for processing the data, even if they are not practitioners of these technologies themselves. They have up-to-date knowledge about datasets, their provenance, and what data curation is needed. They also understand the different types of analysis that need to be performed on specific datasets, as well as the expectations in terms of latency and availability set by diverse business users. By working with data engineers, data custodians, data analysts, and data scientists, the data curator develops a deep understanding of how data is used by the business, and how IT applies technology to make the data available. Data curators are making data analysts and data scientists more productive by allowing them to focus on what they do best.



Quote for the day:


"It is impossible to defeat an ignorant man in an argument." -- William G. Mcadoo


Daily Tech Digest - January 17, 2018

The Neuroscience of Intelligence: An Interview with Richard Haier


Neuroscience approaches have already made intelligence research more mainstream and ready for inclusion in policy discussions. For example, the single most important factor that predicts school success, by far, is the student's intelligence. Socioeconomic status, family resources, school and teacher quality all pale in comparison. The data showing this is overwhelming. Yet, the word “intelligence” is virtually absent from all discussions about education policies in the United States, and many other countries. Even if intelligence is mostly influenced by genes, all that means for education is that each student comes to school with a different set of strengths for learning. Teachers all know this, and the common goal is to maximize each student's potential. Attempts to create policies to do this without paying attention to what we know about intelligence have failed for decades, especially with respect to closing achievement gaps.



Why Your Data Could Be At Risk Without Decentralized Computing

According to industry experts, it will take decades for CPUs to be properly redesigned to resolve these issues and replaced. What should the world do to protect itself in the meantime? The answer is decentralization. This is a form of “trustless” computing that assumes from the start that no single machine can be relied upon, instead spreading information out across many different computers or “nodes.” In this framework, even though each individual entity has the potential to be compromised, the decentralized collective will always perform the work safely and correctly. Bitcoin, Ethereum, and blockchain technology in general offer notable examples of decentralized computing. Decentralization achieves two goals. First, no single machine is making all the decisions, so no single machine can unilaterally make bad decisions that affect individual users.


5 Ways SD-WAN Equips Enterprises to Improve Network Security


While the headlines have been alarming, overall industry trends are mixed. According to a recent report by the Ponemon Institute, the average cost of a data breach dropped by about 10 percent to $3.62 million in 2017. This is most likely tied to a reduction in the cost per record stolen, which declined from $158 in 2016 to $141 in 2017. However, the average size of data breaches rose 1.8 percent to more than 24,000 records. Clearly, this is not the time for enterprises to neglect network security. With the rapid expansion of the cloud, followed by what is likely to be an equally rapid move to the Internet of Things, wide-area infrastructure is in need of more flexible and robust protection. One of the most significant enhancements in this field is the advent of the software-defined wide-area network (SD-WAN). By abstracting regional connectivity on top of underlying hardware, enterprises can experience a number of benefits over traditional hardware-centric architectures.


6 things that prevent Blockchain from ruling the world

Generally speaking, the internet is fairly efficient when it comes to the transmission of data. The user requests information, and the server transmits back the piece of data requested with only a small amount of additional data required to get it there. However, the blockchain, in order to be preserved and to prevent hacking, needs multiple copies distributed across many nodes. The blockchain therefore requires a large amount of storage – for example, Bitcoin's blockchain was nearly 150GB in size as of last month, and it's getting bigger all the time. Furthermore, transmitting so much data for the blockchain each time also consumes additional electricity, making the blockchain quite inefficient. At a time when efforts are being made to compress video further to decrease the data required for a download, blockchain's bulkiness makes little sense.


Financial savings just the beginning for CIOs who understand code quality


It is not just about cutting costs, but improving development productivity and code quality. In the past year, NCOI has fed code into the Cast system four times, but is moving to a contract to enable it to do so monthly to keep up with more regular software updates. “This is so we can refresh our portal every month,” said van Eeden. Ironically, since using the Cast system NCOI has been using more developers because it is doing more development. “For our core ERP application, we have doubled software development productivity,” said van Eeden. “My output doubled, and the quality in the sense of downtime and the number of bugs also improved dramatically.” Van Eeden said he knows there have been no software outages since the company has been using the software intelligence platform, whereas previously it “didn’t even look at the robustness of systems”.


The role of trust in security: Building relationships with management and employees

In reality, security processes must constantly evolve based on discussions between the chief security officer, management, and employees in every business unit, accounting for emerging risks, new technologies, and recently uncovered vulnerabilities. Chief security officers need to first and foremost ensure that a solid understanding exists between the security team and the business units. There is no way that anyone could understand the nuances of a business unit’s capabilities, processes, assets, and services to the extent the unit itself does, so it is tremendously important for a chief security officer to meet with each unit and develop a comprehensive security plan, which is aligned on the corporate level. Only by gaining a more complete understanding of the unique needs of a business unit can a chief security officer develop safeguards that reduce risks.


Demystifying DynamoDB Streams


In order to build something even as simple as master-slave replication, there are several primitives to understand. The first and foremost is ordering. Imagine if two transactions were to be applied sequentially to a database — the first writes a new entry and the second deletes this entry, which ultimately results in no data persisting in the database — but if the ordering is not guaranteed, the delete transaction could be processed first (causing no effect) and then the write transaction applied, which results in data incorrectly persisting in the database. The second core primitive is preventing duplication: each transaction should appear exactly once within the log. Failure to enforce ordering or prevent duplication within a log can result in the master and slave becoming inconsistent. ... There are multiple strategies for checkpointing, each of which is a trade-off between specificity and throughput.
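The two primitives can be sketched as a replica replaying a change log, with a checkpoint serving double duty for both ordering and duplicate suppression. The record shape and field names here are illustrative, not the actual DynamoDB Streams API:

```python
# A change log with sequence numbers; one record was delivered twice
log = [
    {"seq": 1, "op": "PUT", "key": "a", "value": 1},
    {"seq": 2, "op": "DELETE", "key": "a"},
    {"seq": 2, "op": "DELETE", "key": "a"},   # duplicate delivery
]

replica, checkpoint = {}, 0
for rec in sorted(log, key=lambda r: r["seq"]):   # primitive 1: ordering
    if rec["seq"] <= checkpoint:                  # primitive 2: skip duplicates
        continue
    if rec["op"] == "PUT":
        replica[rec["key"]] = rec["value"]
    else:                                         # DELETE
        replica.pop(rec["key"], None)
    checkpoint = rec["seq"]                       # remember progress

print(replica, checkpoint)   # -> {} 2 : write then delete leaves no data
```

Without the sort, the delete could land before the write and the replica would incorrectly keep the entry; without the checkpoint test, the duplicate would be applied twice.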


How AI Would Have Caught the Forever 21 Breach

As a first step, we must recognize that the days of the desktop/server model are over. In the case of Forever 21, the POS devices served as ground zero — not a laptop, a server, or even a corporate printer. In the age of the Internet of Things, we increasingly rely on "nontraditional" devices to optimize efficiency and boost productivity. But what constitutes a nontraditional device, and how do we look for it? Is it a device without a monitor? A device without a keyboard? Today a nontraditional device could be anything from heating and cooling systems to Internet-connected coffee machines to a rogue Raspberry Pi hidden underneath the floorboards. Protecting registered corporate devices is not enough — criminals will look for the weakest link. As our businesses grow in digital complexity, we have to monitor the entire infrastructure, including the physical network, virtual and cloud environments, and nontraditional IT, to ensure we can spot irregularities as they emerge.


What is identity management? IAM definition, uses, and solutions

Compromised user credentials often serve as an entry point into an organization’s network and its information assets. Enterprises use identity management to safeguard their information assets against the rising threats of ransomware, criminal hacking, phishing and other malware attacks. Global ransomware damage costs alone are expected to exceed $5 billion this year, up 15 percent from 2016, Cybersecurity Ventures predicted. In many organizations, users sometimes have more access privileges than necessary. A robust IAM system can add an important layer of protection by ensuring a consistent application of user access rules and policies across an organization.  Identity and access management systems can enhance business productivity. The systems’ central management capabilities can reduce the complexity and cost of safeguarding user credentials and access.


Mental Models & Security: Thinking Like a Hacker

Although we cannot predict the future with great certainty, we often subconsciously make decisions based on probabilities. For example, when crossing the road, we believe there's a low risk of being hit by a car. The risk exists, but if you've looked for traffic, you are confident that you can cross. The Bayesian method says that one should consider all prior relevant probabilities and then incrementally update them as newer information arrives. This method is especially productive given the fundamentally nondeterministic world we experience: we must use both prior odds and new information to arrive at our best decisions. While there may not be a simple answer to what it means to "think like a hacker," the use of mental models to build frameworks of thought can help avoid the pitfalls associated with approaching every problem from the same angle.
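The incremental Bayesian update described above can be sketched numerically; the probabilities below (attack base rate, signal detection rate, false-alarm rate) are purely illustrative:

```python
def bayes_update(prior, p_signal_given_attack, p_signal_given_benign):
    """One Bayes-rule step: P(H|E) = P(E|H)P(H) / P(E)."""
    num = p_signal_given_attack * prior
    return num / (num + p_signal_given_benign * (1 - prior))

p = 0.01                       # prior: base rate of a real attack
for _ in range(3):             # three independent suspicious observations
    p = bayes_update(p, 0.9, 0.1)
print(round(p, 3))             # belief rises sharply as evidence accumulates
```

Starting from long prior odds against an attack, three consistent signals push the posterior close to certainty, which is exactly the "update prior odds as new information arrives" discipline the article recommends.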



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer


Daily Tech Digest - January 16, 2018

Data Management 2018
A lot is happening in the market today, as data continues to explode at unprecedented levels. Compliance is no longer an option, but a requirement. The push to cut costs and embrace the multi-cloud – yet still maintain visibility of your data – has never been more critical. At the same time, the need to safeguard against data breaches is an absolute must. And the necessity to gather as many insights from your data as possible could be the difference between success and failure for many organisations. So, what does all of this mean for today's CIOs and IT decision makers? Here are the top five predictions for the coming year. For those who see challenges as opportunities, this could be an exciting year for you. ... New data valuation techniques are expected to get a boost from AI, reshaping information lifecycle management through the automation of policy enforcement and more intelligent data management actions.


What is Zero Trust? A model for more effective security

The Zero Trust model of information security basically kicks to the curb the old castle-and-moat mentality that had organizations focused on defending their perimeters while assuming everything already inside didn’t pose a threat and therefore was cleared for access. Security and technology experts say the castle-and-moat approach isn’t working. They point to the fact that some of the most egregious data breaches happened because hackers, once they gained access inside corporate firewalls, were able to move through internal systems without much resistance. “One of the inherent problems we have in IT is we let too many things run way too openly with too many default connections. We essentially trust way too much,” Cunningham says. “That’s why the internet took off – because everyone could share everything all the time. But it’s also a key fail point: If you trust everything, then you don’t have a chance of changing anything security wise.”


Microsoft Stresses Security, Responsible AI in Cloud Policy Updates


In the 2018 update, Microsoft is tackling some of the negative consequences of using AI and other technologies based on the tumultuous year the IT industry experienced in 2017. "We continue to witness cyber-attacks by nation-states on citizens, critical infrastructure and the institutions of democracy. We read on an almost daily basis about the criminal hacking of companies and governments to steal private and sensitive information of customers," wrote Smith in the 2018 update to the e-book. "We listen to the concerns about the loss of jobs to automation and the disruptive impact of artificial intelligence (AI) on entire sectors of the economy." In terms of cyber-security, Microsoft continues to honor its commitment to spend $1 billion in the IT security field each year. If necessary, the company is poised to use legal means to disrupt nation-state attacks.


Cloud computing: Three strategies for making the most of on-demand

"It's still complicated," says Marks, before suggesting more organisations will continue to move from an exploratory stage through to full adoption. "The difference today is that the cloud is understood for its different dimensions, be that at the level of the infrastructure, the platform or services. The migration of applications and data from the server downstairs to the public cloud is a shift that continues. The key point is that it's hard to think of a sensible reason why a CIO would buy hardware ever again -- and that should be your starting point." ZDNet speaks with three IT leaders at different stages of the cloud adoption process: exploring, transforming, and pioneering. Evidence from these three stages suggests cloud-led change remains a work in progress, where smart IT leaders assess their business context and provide an on-demand solution that can flex with future requirements.


Enterprise software spending set to grow thanks to AI and digital boost


“Looking at some of the key areas driving spending over the next few years, Gartner forecasts $2.9tn in new business value opportunities attributable to AI by 2021, as well as the ability to recover 6.2 billion hours of worker productivity,” he said. “Capturing the potential business value will require spending, especially when seeking the more near-term cost savings. “Spending on AI for customer experience and revenue generation is likely to benefit from AI being a force multiplier – the cost to implement will be exceeded by the positive network effects and resulting increase in revenue.” Gartner forecast a slight increase of 0.6% in datacentre spending in 2018 compared with 2017, but predicted a decline of 0.2% in 2019. As Computer Weekly has reported previously, this may be related to the increase in SaaS and cloud-based services.


Lessons in Becoming an Effective Data Scientist

The first skill that I look for when engaging with or hiring a data scientist is humility. I look for the ability to listen and engage with others who may not seem as smart as them. And as you can see from our DEPP methodology, humility is the key to driving collaboration between the business stakeholders (who will never understand data science to the level that a data scientist does) and the data scientist. Humility is critical to our DEPP methodology because you can’t learn what’s important for the business if you aren’t willing to acknowledge that you might not know everything. Humility is one of the secrets to effective collaboration. Nowhere does the importance of the business/data science collaboration play a more important role than in hypothesis development. If you get the hypothesis and the metrics against which you are going to measure success wrong, everything the data scientist does to support that hypothesis doesn’t matter.


7 Acquisitions that Point to Cloud Maturity


Over the better part of a decade, cloud computing mergers and acquisitions painted a picture where cloud service providers were on the hunt to accumulate as many customers as possible. Thus, you saw massive build-outs of facilities and a haphazard set of mergers and acquisitions that had no real rhyme or reason. We also witnessed incredible price wars when it came to commodity cloud resources in the IaaS and PaaS space. In a nutshell, the cloud service provider market has always been about growth by any means necessary. If you contrast previous years' cloud acquisitions with those that have occurred in the latter part of 2017 and into 2018, we start to see a new pattern forming. Sure, there are still signs of significant growth in the cloud space for those looking for bleeding-edge services. Yet at the same time, you see a trend towards stability, with cloud acquisition dollars trending toward following what the customer wants in a cloud service -- as opposed to the other way around.


As the cloud’s popularity grows, so does the risk to sensitive data

Despite the prevalence of cloud usage, the study found that there is a gap in awareness within businesses about the services being used. Only a quarter (25%) of IT and IT security practitioners revealed they are very confident they know all the cloud services their business is using, with a third (31%) confident they know. Looking more closely, shadow IT may be continuing to cause challenges. Over half of Australian (61%), Brazilian (59%) and British (56%) organizations are not confident they know all the cloud computing apps, platform or infrastructure services their organization is using. Confidence is higher elsewhere, with only around a quarter in Germany (27%), Japan (27%) and France (25%) not confident. Fortunately, the vast majority (81%) believe that having the ability to use strong authentication methods to access data and applications in the cloud is essential or very important.


Big Data 2018: 4 Reasons To Be Excited, 4 Reasons To Be Worried

Figure 1. TensorFlow Playground offers an interactive sandbox for exploring the foundations of TensorFlow. (Source: Google)
Machine-learning models can accurately perform recognition of specific patterns in data streams. In environments already inundated with data, this capability provides high value and distinct advantages, and the industry has responded accordingly. Data scientists can take advantage of a growing number of open-source machine-learning frameworks, including Google’s TensorFlow, Apache MXNet, Facebook’s Caffe2, and the Microsoft Cognitive Toolkit, among others. Most important, the task of building models has never been easier. For example, Amazon Web Services (AWS) offers deep learning AMIs (Amazon Machine Images) with the leading ML frameworks already built in and ready for use on the AWS cloud. For those just starting, Google’s TensorFlow Playground helps users learn more about the neural networks underlying machine-learning frameworks, using simple data sets and pre-trained models.
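The kind of tiny network TensorFlow Playground visualizes can be sketched in a few lines of plain Python (no framework involved). The sketch below runs a forward pass through a hand-built two-layer network – two inputs, two tanh hidden units, one sigmoid output – with weights chosen by hand for illustration rather than learned; the values and architecture are assumptions for the example, not taken from the article.

```python
import math

# A hand-built two-layer neural network like the small ones TensorFlow
# Playground visualizes: 2 inputs -> 2 hidden tanh units -> 1 sigmoid output.
# Weights are illustrative, not trained; this one fires when the first
# input exceeds the second.

W1 = [[1.0, -1.0], [-1.0, 1.0]]  # hidden-layer weights (2 units x 2 inputs)
b1 = [0.0, 0.0]                  # hidden-layer biases
W2 = [1.0, -1.0]                 # output-layer weights
b2 = -0.5                        # output-layer bias

def forward(x):
    hidden = [math.tanh(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i])
              for i in range(2)]
    logit = sum(w * h for w, h in zip(W2, hidden)) + b2
    return 1.0 / (1.0 + math.exp(-logit))  # sigmoid squashes to (0, 1)

print(forward([1.0, -1.0]))  # first input larger -> output above 0.5
print(forward([-1.0, 1.0]))  # second input larger -> output below 0.5
```

Training would adjust `W1`, `W2`, and the biases by gradient descent; frameworks such as TensorFlow automate exactly that step, which is why the barrier to building models keeps dropping.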


Container infrastructure a silver lining amid Intel CPU flaw fixes

Meltdown and Spectre loom over containers
"Most folks running containers have something like [Apache] Mesos or Kubernetes, and that makes it easy to do rolling upgrades on the infrastructure underneath," said Andy Domeier, director of technology operations at SPS Commerce, a communications network for supply chain and logistics businesses based in Minneapolis. SPS uses Mesos for container orchestration, but it is evaluating Kubernetes, as well. Containers are often used with immutable infrastructures, which can be stood up and torn down at will and present an ideal means to handle the infrastructure changes on the way, due to these specific Intel CPU flaws or unforeseen future events. "It really hammers home the case for immutability," said Carmen DeArdo, technology director responsible for the software delivery pipeline at Nationwide Mutual Insurance Co. in Columbus, Ohio.
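The rolling-upgrade pattern Domeier describes maps directly onto a Kubernetes Deployment spec. A minimal sketch might look like the fragment below; the service name, replica count, and image tag are illustrative assumptions, not details from the article.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service        # hypothetical service name
spec:
  replicas: 4
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1        # take at most one pod down at a time
      maxSurge: 1              # allow one extra pod during the roll
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
      - name: app
        image: example/app:1.0.1   # bump to a patched base image to trigger the roll
```

Pushing a new image tag (say, one rebuilt on a kernel-patched base) replaces pods one at a time, and `kubectl drain` lets patched hosts be rotated in underneath running workloads, which is the immutability argument DeArdo makes in practice.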



Quote for the day:


"When we lead from the heart, we don't need to work on being authentic we just are!" -- Gordon Tredgold