Daily Tech Digest - December 09, 2019

The PC was supposed to die a decade ago. Instead, this happened


Not all that long ago, tech pundits were convinced that by 2020 the personal computer as we know it would be extinct. You can even mark the date and time of the PC's death: January 27, 2010, at 10:00 A.M. Pacific Time, when Steve Jobs stepped onto a San Francisco stage to unveil the iPad. The precise moment was documented by noted Big Thinker Nicholas Carr in The New Republic with this memorable headline: "The PC Officially Died Today." ... And so, here we are, a full decade after the PC's untimely death, and the industry is still selling more than a quarter-billion-with-a-B personal computers every year. Which is pretty good for an industry that has been living on borrowed time for ten years. Maybe the reason the PC industry hasn't suffered a mass extinction event yet is that it adapted, and that those competing platforms weren't able to take over every PC-centric task. So what's different as we approach 2020? To get a proper before-and-after picture, I climbed into the Wayback Machine and traveled back to 2010.


Netflix open sources data science management tool

Netflix has open sourced Metaflow, an internally developed tool for building and managing Python-based data science projects. Metaflow addresses the entire data science workflow, from prototype to model deployment, and provides built-in integrations to AWS cloud services.  Machine learning and data science projects need mechanisms to track the development of the code, data, and models. Doing all of that manually is error-prone, and tools for source code management, like Git, aren’t well-suited to all of these tasks. Metaflow provides Python APIs to the entire stack of technologies in a data science workflow, from access to the data through compute resources, versioning, model training, scheduling, and model deployment. ... Metaflow does not favor any particular machine learning framework or data science library. Metaflow projects are just Python code, with each step of a project’s data flow represented by common Python programming idioms. Each time a Metaflow project runs, the data it generates is given a unique ID. This lets you access every run—and every step of that run—by referring to its ID or user-assigned metadata.
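For a sense of what that looks like in practice, here is a minimal sketch of a flow using Metaflow's documented FlowSpec/step API; the flow name, the artifacts, and the stand-in "training" step are invented for illustration.

```python
from metaflow import FlowSpec, step

class TrainFlow(FlowSpec):
    """A toy flow: each @step is one node in the project's data flow."""

    @step
    def start(self):
        # Anything assigned to self becomes a versioned artifact of this run.
        self.data = [1.0, 2.0, 3.0, 4.0]
        self.next(self.train)

    @step
    def train(self):
        # Stand-in for real model training.
        self.model = sum(self.data) / len(self.data)
        self.next(self.end)

    @step
    def end(self):
        print("trained model:", self.model)

if __name__ == "__main__":
    TrainFlow()
```

Running `python train_flow.py run` executes the steps in order and records the run under a unique ID; past runs and their artifacts can then be inspected from the client API, for example via `Flow("TrainFlow").latest_run.data.model`.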



AppSec in the Age of DevSecOps

Application security as a practice is dynamic. No two applications are the same, even if they belong to the same market domain and presumably operate on identical business use cases. Some (of the many) factors that cause this variance include the technology stack of choice, the programming style of the developers, the culture of the product engineering team, the priorities of the business, the platforms used, and so on. This consequently results in a wide spectrum of unique customer needs. Take penetration testing as an example. This is a practice area that is presumably well-entrenched, both as a need and as an offering in the application security market. However, in today's age, even a singular requirement such as this could make or break an initial conversation. While for one prospect the need could be to conduct the test from a compliance-only perspective, another's need could stem from a proactive software security initiative. There are many others who have internal assessment teams and often look outside for a third-party view.


Data centers in 2020: Automation, cheaper memory

Storage-class memory is memory that goes in a DRAM slot and can function like DRAM, but can also function like an SSD. It has near-DRAM speed but has storage capabilities, too, effectively turning it into a cache for SSDs. Intel and Micron were working on SCM together but parted company. Intel released its SCM product, Optane, in May, and Micron came to market in October with QuantX. South Korean memory giant SK Hynix is also working on an SCM product that differs from the 3D XPoint technology Intel and Micron use. ... Remember when everyone was looking forward to shutting down their data centers entirely and moving to the cloud? So much for that idea. IDC's latest CloudPulse survey suggests that 85% of enterprises plan to move workloads from public to private environments over the next year. And a recent survey by Nutanix found that 73% of respondents are moving some applications off the public cloud and back on-prem. Security was cited as the primary reason. And since it's doubtful security will ever be good enough for some companies and some data, the mad rush to the cloud will likely slow a little as people become more picky about what they put in the cloud and what they keep behind their firewall.


Batch Goes Out the Window: The Dawn of Data Orchestration

Add to the mix the whole world of streaming data. By open-sourcing Kafka through the Apache Software Foundation, LinkedIn let loose the gushing waters of data streams. These high-speed freeways of data largely circumvent traditional data management tooling, which can't stand the pressure. Doing the math, we see a vastly different scenario for today's data, as compared to only a few years ago. Companies have gone from relying on five to 10 source systems for an enterprise data warehouse to now embracing dozens or more systems across various analytical platforms. Meanwhile, the appetite for insights is greater than ever, as is the desire to dynamically link analytical systems with operational ones. The end result is a tremendous amount of energy focused on the need for ... meaningful data orchestration. For performance, governance, quality and a vast array of business needs, data orchestration is taking shape right now out of sheer necessity. The old highways for data have become too clogged and cannot support the necessary traffic. A whole new system is required. To wit, there are several software companies focused intently on solving this big problem. Here are just a few of the innovative firms that are shaping the data orchestration space.


With the majority of companies looking for expertise in the three- to 10-year range, Robinson said they must change their traditional recruitment and training tactics. "The technical skill supply is far less than the demand, so companies are not going to simply be able to meet their exact needs on the open market," he said. "There must be a willingness to look outside the normal sources for technical skill, and there must be a willingness to invest in training to get workers up to speed once they are in house." The trend is toward specialization, "but this certainly introduces a financial challenge," he said, since most companies cannot afford to build large teams of specialists. So depending on its strategy, a company may lean more on generalists or explore different mixes of internal and external talent. "Even for tech workers who specialize, knowledge across the different areas of IT is necessary for efficient operation of complex systems," Robinson said. The primary approach most tech workers are taking for career growth is to deepen their skills in their area of expertise, he said. But they must have knowledge in other areas beyond this, Robinson stressed, especially as tech workers move from a junior level to an architect level.


Seagate doubles HDD performance with multi-actuator technology

The technology is pretty straightforward. Say you have multiple platters in a disk drive. A single actuator controls the drive heads and moves them all in unison over all of the platters. Seagate's multi-actuator makes two independent actuators out of one, so in a six-platter drive, the two actuators cover three platters each. ... While SSDs have buried HDDs in terms of performance, they simply can't match HDDs for capacity. Of course there are multi-terabyte SSDs available, but they cost many times more than the 12TB and 14TB HDDs that Seagate and its chief competitor Western Digital offer. And data centers are not about to go all-SSD yet, if ever. So there's definitely a place for faster HDDs in the data center. Microsoft has been testing Exos 2X14 enterprise hard drives with MACH.2 technology to see if it can maintain the IOPS required for some of Microsoft's cloud services, including Azure and the Microsoft Exchange Online email service, while increasing available storage capacity per data-center slot.
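As a rough back-of-the-envelope illustration (my own toy model, not Seagate's figures): because each actuator seeks independently, the drive behaves roughly like two half-capacity drives in one enclosure, so random IOPS scale with the actuator count while total capacity stays the same.

```python
# Toy model of why a second actuator roughly doubles random IOPS.
# The 170 IOPS baseline is an assumed typical figure for a 7200 rpm HDD,
# not a Seagate specification.

BASELINE_IOPS = 170     # assumed random IOPS with a single actuator
PLATTERS = 6
ACTUATORS = 2

platters_per_actuator = PLATTERS // ACTUATORS   # 3 platters per actuator
estimated_iops = BASELINE_IOPS * ACTUATORS      # two independent seek mechanisms

print(f"{platters_per_actuator} platters per actuator, "
      f"~{estimated_iops} random IOPS, capacity unchanged")
```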


Synchronizing Cache with the Database using NCache

Caching improves the performance of web applications by reducing resource consumption. It achieves this by storing page output or relatively stale application data across HTTP requests. Caching makes your site run faster and provides a better end-user experience. You can take advantage of caching to reduce the consumption of server resources by reducing server and database hits. The cache object in ASP.NET can be used to store application data and reduce expensive server (database server, etc.) hits. As a result, your web page is rendered faster. When you are caching application data, you would typically have a copy of the data in the cache that also resides in the database. This duplication of data (both in the database and in the cache) introduces data consistency issues: the data in the cache must be kept in sync with the data in the database. You should know how data in the cache can be invalidated and removed in real time when any change occurs in the database.
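To make the consistency problem concrete, here is a minimal, language-agnostic sketch of the cache-aside pattern in Python. The article itself targets NCache and ASP.NET, whose database-dependency features can evict stale entries automatically; the table, keys, and function names below are invented for illustration.

```python
import sqlite3

# Cache-aside sketch: reads go through the cache, writes hit the database and
# then evict the stale entry. A database dependency (as offered by products
# like NCache) automates that eviction when the underlying row changes.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO products VALUES (1, 'widget')")

cache = {}  # stand-in for a distributed cache client

def get_product(product_id):
    if product_id in cache:                 # fast path: serve from cache
        return cache[product_id]
    row = db.execute("SELECT name FROM products WHERE id = ?",
                     (product_id,)).fetchone()
    cache[product_id] = row[0]              # populate cache on a miss
    return row[0]

def update_product(product_id, new_name):
    db.execute("UPDATE products SET name = ? WHERE id = ?",
               (new_name, product_id))
    db.commit()
    cache.pop(product_id, None)             # invalidate so readers see fresh data

print(get_product(1))        # miss: loads 'widget' from the database
print(get_product(1))        # hit: served from the cache
update_product(1, "gadget")  # write to the database, cached copy evicted
print(get_product(1))        # miss again: reloads the fresh value
```

The manual eviction call is exactly the step a database dependency automates: when the underlying row changes, the cached item is removed so the next read fetches current data.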


Coders are the new superheroes of natural disasters

It will be a launching point for open-source programs like Call for Code and the Clinton Global Initiative University and will support the entire process of creating solutions for those most in need. Call for Code is seeking solutions for this year's challenge, and coders can go to the 2019 Challenge Experience to join. Call for Code unites developers and data scientists around the world to create sustainable, scalable, and life-saving open source technologies via the power of cloud, AI, blockchain, and IoT tech. Clinton Global Initiative University partners with IBM and commits to inspiring university students to harness modern, emerging, and open-source technologies to develop solutions for disaster response and resilience challenges. "Technology skills are increasingly valuable," Krook said, "even for students who don't intend to become professional software developers. For computer science students, putting the end user first, and empathizing with how they hope to use technology to solve their problems—particularly those that represent a danger to their health and well-being—will help them understand how to build high-quality and well-designed software."



There is a widespread belief that rules, structure and processes inhibit freedom and that organizations that want to build a culture of autonomy and performance need to avoid them like the plague. ... There are times in history when this has happened to entire societies. When the leaders of the French Revolution abolished the laws of the Ancien Régime, the result was terror. When Russia descended into chaos after the revolution of 1917, the result was civil war and the emergence of a tyrant, Stalin, who began a sustained terror of his own. When the Weimar Republic in Germany failed in the 1920s, the result was Hitler. In our own time, as social structures weaken, strongmen like Putin or Erdogan come to power and impose personal rules of their own. Societies which abolish laws become chaotic. In chaos, there is absolute freedom. As the philosopher Hegel observed, absolute freedom is not freedom at all, but a playground for the arbitrary exercise of power, which ends in terror. In terror, only a few are free, and many are slaves.



Quote for the day:


"To do great things is difficult; but to command great things is more difficult." -- Friedrich Nietzsche

