Daily Tech Digest - October 10, 2017

IT spending increases for software-defined storage, on-demand services

SDS is gaining popularity because of its versatility in a modern data center. Enterprise storage has been moving away from hardware-defined arrays as data centers migrate to virtualized and cloud-based infrastructure. SDS solutions run on commodity hardware, using virtualization and delivering all functionality, such as provisioning and de-duplication, in software. This adds automation, and thus speed, to storage networks. "For IT organizations undergoing digital transformation, SDS provides a good match for the capabilities needed — flexible IT agility; easier, more intuitive administration driven by the characteristics of autonomous storage management; and lower capital costs due to the use of commodity and off-the-shelf hardware," said Eric Burgener, research director at IDC, in a statement.

Rise in Insider Threats Drives Shift to Training, Data-Level Security

With an insider threat, the culprit is already inside the network. Securing the perimeter around the network — which has long been the focus for enterprise security — does not do the job against this kind of threat, whether it is malicious or unintentional. Nor is focusing on securing the perimeter the best strategy against many external threats. That's because data-smart companies want to be able to safely give partners, suppliers, and customers access to their networks in order to increase business opportunities. As a result of this shift, security needs to rest with the data itself, not just at the network level. The move to the cloud elevates the need for data-level protection. To reduce the risk of insider threats, companies and organizations need to focus on three areas.

Understanding Cloud Native Infrastructure: Interview with Justin Garrison and Kris Nova

A major benefit of public cloud comes from process rather than performance. The people-hours you can save by becoming an infrastructure consumer rather than an infrastructure builder will be very difficult to calculate, but will likely enable a new way of working that far outweighs the technical limitations of a public cloud. Not to mention that some of the best infrastructure builders and maintainers in the world work at public cloud providers, and the companies behind them spend billions every year on infrastructure build-out, R&D, and new features. The biggest consideration when building your own cloud is not what it will cost you to build the private cloud, but what it will cost you to maintain it and what happens when you fall behind public cloud offerings.

Make Cybersecurity A Priority in a Small Business’ Early Stages

The need for strong passwords is crucial to cybersecurity, no matter how often we groan about having to change (and remember) a new one. Shubhomita Bose covers this, along with data from Headway Capital, for smallbiztrends.com. The Headway infographic emphasizes having a company policy to avoid “weak” passwords, to change passwords on a regular basis, and to incorporate “two-factor authentication” — as some businesses are now doing with an additional text-message step in the password process. Ransomware is another increasingly significant threat to cybersecurity. ... As Anita Campbell, CEO of Small Business Trends, writes for Inc.com, “The ransom is displayed on the screen with a message stating you must pay a fine or fee in order to access your own system. Ransoms have ranged from hundreds of dollars to tens of thousands of dollars.”
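
The text-message step mentioned above is one form of second factor; authenticator apps implement the same idea with time-based one-time passwords (RFC 6238). A minimal stdlib sketch — the secret and timestamp below are the RFC's published test vector, not values from the article:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, t=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA-1, 30-second step)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)                 # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: key "12345678901234567890", T = 59 seconds -> "287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # prints 287082
```

A server and an authenticator app sharing the secret compute the same six digits for the current 30-second window, which is what makes the second factor verifiable without sending anything over SMS.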

Leaving employees to manage their own password security is a mistake

“In many cases, an organization’s password management practices are overly reliant on manual processes and far too often place an excessive level of trust in employees to use safe password practices,” said Matt Kaplan, GM of LastPass. “Far too many organizations are leaving the responsibility for password management to their employees and don’t have the automated password management technology in place to identify when things are going wrong. The threat posed by human behavior coupled with the absence of technology to underpin policy is leaving companies unnecessarily at risk from weak or shared passwords. Organizations need to focus on solving for both obstacles in order to significantly improve their overall security.”
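
A toy illustration of the kind of automated check the quote argues for — policy enforced in code rather than trust placed in employees. The specific rules and blocklist here are hypothetical, not LastPass's:

```python
import re

def violates_policy(password: str, min_length: int = 12) -> list[str]:
    """Return a list of policy violations; an empty list means the password passes."""
    problems = []
    if len(password) < min_length:
        problems.append(f"shorter than {min_length} characters")
    if not re.search(r"[A-Z]", password):
        problems.append("no uppercase letter")
    if not re.search(r"[a-z]", password):
        problems.append("no lowercase letter")
    if not re.search(r"\d", password):
        problems.append("no digit")
    if password.lower() in {"password", "letmein", "123456"}:
        problems.append("on the common-password blocklist")
    return problems

print(violates_policy("hunter2"))
# ['shorter than 12 characters', 'no uppercase letter']
```

A real deployment would add breached-password checks and run server-side at password creation, but even this sketch shows how "things going wrong" can be flagged automatically instead of left to habit.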

How IPv6 deployment affects the security of IoT devices

As a result of IPv6's vast address space, devices are provisioned with at least one unique global address and, thus, NATs are doomed to disappear. Therefore, a NAT's enforcement of the filtering policy to only allow outgoing communications is also likely to disappear, meaning communication between internal and external systems may no longer be policed by the network. In fact, the distinction between internal and external networks may disappear altogether if a filtering policy is not enforced at the network border. While this could have potential benefits -- for example, for peer-to-peer applications, in which unsolicited inbound communications are common -- this clearly comes at the expense of increased attack exposure.
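
The shift described above is visible in the address types themselves: an RFC 1918 IPv4 address is private and unreachable from outside by construction, while a typical IPv6 address is globally routable, so any "internal only" property must come from an explicitly enforced border filtering policy. A short sketch using Python's `ipaddress` module (the addresses are chosen purely for illustration):

```python
import ipaddress

# A NATed IPv4 host: private address, not routable on the public Internet.
v4 = ipaddress.ip_address("192.168.1.10")
print(v4.is_private, v4.is_global)   # True False

# A typical IPv6 host: a globally routable address. Nothing about the
# address itself blocks unsolicited inbound traffic; reachability now
# depends entirely on the filtering policy at the network border.
v6 = ipaddress.ip_address("2607:f8b0:4004:80b::200e")
print(v6.is_private, v6.is_global)   # False True

# Link-local addresses still exist, but hosts do not use them to talk
# to the wider Internet.
print(ipaddress.ip_address("fe80::1").is_link_local)  # True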

Organizational Culture Needs To Change So That Security And DevOps Can Exist In Tandem

Cloud adoption often started as what was called ‘shadow IT’ or ‘bypass IT’: it occurred outside of the mainstream IT and mainstream IT security groups. So in a sense, IT and IT security are still playing catch-up to the original adoption of cloud, even though they have been given responsibility for it now. And we have started to see that change. Two years ago, even in the US, we were often working with those ‘shadow IT’ projects; now the responsibility is moving into IT and IT security, so they’re bringing the traditional mindset. I think the remaining roadblock is that the developer pipeline is moving at a much faster pace than it did historically, when application introduction used to occur maybe in months.

Intel plans hybrid CPU-FPGA chips

“The advantage for FPGAs is that GPUs play in some areas but not all, and if you look at the use model of inline vs. offload, they are limited mostly to offload. So there’s a broader application space you can cover with an FPGA,” he said. The integrated solution provides tight coupling between CPU and FPGA with very high bandwidth, while the external PCI Express card is not as tightly coupled. For ultra-low-latency, high-bandwidth applications, integrated is a great fit, Friebe said. “Most of the differentiation [between integrated and discrete] is due to system architecture and data movement. In a data center environment where [you] run many different workloads, you don’t want to tie it to a particular app,” he said. The more specialized the design, the more performance you can squeeze out of the accelerator, said Friebe.

The future of mobility: Are we asking the right questions?

One of the categories requiring the sharpest questions about the future is mobility. The mobile present has many moving parts and is very complex, but base patterns are discernible. I believe every human on this planet needs at least to attempt to comprehend the current point to which the mobile revolution has brought us. Furthermore, I believe modern executives have a fiduciary responsibility to think long and hard about where the mobile revolution is taking us.  The most rapidly adopted consumer technology in the history of mankind, mobile technology has had a huge economic impact — more than $1 trillion — and has changed the corporate competitive landscape as well as how people live their daily lives. Some go so far as to argue that mobile technologies have changed what it is to be human.

Detecting and Analyzing Redundant Code

A typical analysis would involve running the tool repeatedly to prune back the source tree as brutally as possible. This was then followed by several cycles of reverting changes so as to get successful builds and passing tests. The reasons for failure were that the tool had behaved incorrectly or that a known limitation applied, examples of the latter being reflection or the existence of a code contract. The tool was trained on various GitHub repositories for C# projects, chosen on the basis that I had used them and thus wanted to contribute back. Ultimately a pull request was submitted to the community asking for discussion of the changes in my branch. As the tool is brutal, and I was engaging with these communities online for the first time, diplomacy was required; hopefully I didn’t offend too many people.
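
The core idea, stripped of the C# machinery, is reachability analysis: collect the symbols a program declares, collect the symbols it references, and flag the difference. A toy sketch in Python (the real tool works on C# and handles far more cases; reflection — looking a function up by its string name at runtime — is exactly the kind of reference this approach cannot see, hence the limitation noted above):

```python
import ast

source = '''
def used():
    return 1

def unused():
    return 2

print(used())
'''

tree = ast.parse(source)

# Symbols the program declares...
defined = {n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)}
# ...and symbols it actually references by name.
called = {n.id for n in ast.walk(tree) if isinstance(n, ast.Name)}

# Declared but never referenced: candidates for brutal pruning.
dead = defined - called
print(sorted(dead))  # ['unused']
```

Each pruning pass can change what is reachable (removing one function may orphan another), which is why the analysis is run repeatedly rather than once.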

Quote for the day:

"When you're around someone good, your own standards are raised." -- Ritchie Blackmore