Daily Tech Digest - March 19, 2018

Linux Foundation unveils open source hypervisor for IoT products

The Linux Foundation recently unveiled ACRN (pronounced "acorn"), a new open source embedded reference hypervisor project that aims to make it easier for enterprise leaders to build an Internet of Things (IoT)-specific hypervisor. The project, further detailed in a press release, could help fast-track enterprise IoT projects by giving developers a readily available option for such an embedded hypervisor. It will also provide a reference framework for building a hypervisor that prioritizes real-time data and workload security in IoT projects, the release said. ACRN is made up of the hypervisor and its device model, complete with I/O mediators, the release noted. Firms like Intel, LG Electronics, Aptiv, and more have already contributed to the project. "ACRN's optimization for resource-constrained devices and focus on isolating safety-critical workloads and giving them priority make the project applicable across many IoT use cases," Jim Zemlin, executive director of The Linux Foundation, said in the release.

Java at a crossroads: Why the popular programming language needs to evolve to stay alive

Java is used most often in cloud computing, data science work, web development, and app development, said Karen Panetta, IEEE fellow and dean of graduate engineering at Tufts University. "I still see it evolving, and very popular," Panetta said. While languages such as Python are growing as well, Java is adapting to the increasing number of deep learning and machine learning workloads. "There's becoming a lot of libraries out there that are compatible for deep learning," Panetta said. "I think the fact that we keep talking about cloud computing and all of those things, that Java is still going to be the dominant player." Java also has more security options built in than Python, which makes it a good option for Internet of Things (IoT) applications, Panetta said. Java has a foothold everywhere, with large user groups and a wealth of libraries already written, making it a natural pathway for machine learning, Panetta said. "It's evolving to meet the needs," she added.

FPGA maker Xilinx aims range of software programmable chips at data centers

The first product range in the category is code-named Everest, due to tape out (have its design finished) this year and ship to customers next year, Xilinx announced Monday. Whether it's an incremental evolution of current FPGAs or something more radical is tough to say, since the company is unveiling an architectural model that leaves out many technical details, like precisely what sort of application and real-time processors the chips will use. The features that we do know about are consequential, though. Everest will incorporate a NoC (network-on-a-chip) as a standard feature and use the CCIX (Cache Coherent Interconnect for Accelerators) interconnect fabric, neither of which appears in current FPGAs. Everest will offer hardware and software programmability, and stands to be one of the first integrated circuits on the market to use 7nm manufacturing process technology (in this case, TSMC's). The smaller the manufacturing process, the greater the transistor density on processors, which leads to cost and performance efficiency.

Ethernet bandwidth costs fall to a six-year low

Cloud provider demand for more throughput increased last year's average bandwidth per switch port connection to almost 17Gbps, from 12Gbps in 2016, according to the latest report from Crehan Research Inc., based in San Francisco. "Public, private and hybrid cloud providers are looking to deploy much faster networks within and between data centers in order to handle the myriad of new and existing applications that their customers need," Seamus Crehan, president of Crehan Research, said in a statement. "In turn, the data center switch vendors are responding by offering significantly more bandwidth at little or no additional cost." The net result in 2017 was impressive increases in Ethernet bandwidth, port shipments and revenue in the branded switch market, Crehan said. Revenue rose 10% -- the highest annual growth in four years.

What CISOs must know about DFARS and NIST to be compliant

Privilege management and application control map to many of the different controls within the guidelines – and that's hardly surprising, given the proven effectiveness of the two security controls when combined and the visibility they provide. We know that privilege management allows admin rights to be applied to applications as needed – rather than giving the user too much access. Application control is the part that allows us to whitelist or blacklist an application from running at all. The good thing about these two technologies together is that they offer great "bang for the buck." Between them, they overlap to address controls in access control, audit and accountability, configuration management, maintenance, and system and information integrity. Compliance is crucial for CISOs because those who fail to comply will likely lose government contracts. Organizations that are able to demonstrate compliance at an early stage may be in a better position to secure additional wins.
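To make the whitelisting idea concrete, here is a minimal sketch of an application-control check; the application names and policy below are hypothetical illustrations, not drawn from any particular product:

```python
# Minimal application-control sketch: an allowlist decides whether a
# binary may launch at all. Names and policy here are hypothetical.
ALLOWED_APPS = {"winword.exe", "excel.exe", "outlook.exe"}

def may_run(executable_name: str) -> bool:
    """Return True only if the executable is on the allowlist."""
    return executable_name.lower() in ALLOWED_APPS

print(may_run("Excel.exe"))         # True: on the allowlist
print(may_run("unknown_tool.exe"))  # False: blocked by default
```

Real application-control products layer signatures, publishers, and paths on top of this, but the deny-by-default principle is the same.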

IT’s Most Wanted: 16 Traits Of Indispensable IT Pros

Taking fresh looks at old problems is an essential part of the digital transformations that are changing many organizations' cultures, leading to approaches like DevOps and agile and incorporating emerging technical solutions such as AI and IoT, says Christoph Goldenstern. ...  “There is one thing that IT staff cannot afford — and that’s to stand still,” Goldenstern says. “The willingness to learn and keep evolving, making yourself vulnerable in the process, is absolutely essential to staying relevant and being a growth driver in a constantly evolving business.” ... “In order to succeed in IT you need to have the ability to look at a problem, analyze it, and find a way to solve it,” Martini says. “I look for people who understand their strengths and weaknesses. These are the people that are most capable of learning on the job, while still improving the team’s ability overall. Look at raw potential. If you have potential and the drive to improve, the rest will follow.”

Disaster and Contingency Planning Lessons from the ICU

It’s inaccurate to accuse Memorial Hospital of not having a disaster plan. Theirs was 246 pages long. They had a designated disaster coordinator. What they didn’t have was a leader who had looked ahead multiple steps. They also failed to convert the plan's generic pieces into living, breathing human beings whose survival hinged on what moves they made next. No one at Memorial knew that the surrounding levees would break after Katrina, isolating the hospital. That the generators, whose move to a higher floor had always fallen to a lower budget priority than some other need, would be incapacitated by flooding. That the presence of patients of a provider that was leasing the seventh floor would multiply the census of extremely ill patients exponentially. ...  One physician was subsequently charged with second-degree murder, accused of choosing to euthanize the sickest patients without their consent. A comprehensive review of the situation indicates that, at a minimum, people in authority were scrambling to deal with their pieces of the game. No one was watching the whole board.

Android Oreo: 18 advanced tips and tricks

Got a notification you don't want to deal with immediately — but also don't want to forget? Use Oreo's super-handy (but also super-hidden!) snoozing feature: Simply slide a notification slightly to the left or right, then tap the clock icon that appears along its edge. That'll let you send it away for 15 minutes, 30 minutes, one hour, or two hours and then have it reappear as new when the time is right. ... Another new Oreo feature is the system-level ability for launchers to display dots on an app's home screen icon whenever that app has a notification pending — yes, much like the notification badges on iOS. Unlike iOS, though, Android already has an excellent system for viewing and managing notifications, which can make this addition feel rather redundant and distracting. But wait! Here's a little secret: You can disable the dots — if you know where to look. Mosey on back to the Apps & Notifications section of your system settings, then tap the line labeled "Notifications" and turn off the toggle next to "Allow notification dots."

Predictive maintenance: One of the industrial IoT’s big draws

CarForce is mostly focused on selling its product to garages, but Lora said that the potential beneficiaries are numerous. In the garage use case, mechanics can get real-time maintenance data from the vehicles they service, which offers both the ability to warn customers of impending problems and to correlate large data sets to help predict future reliability issues. It's a value-add because the garage can stay a step ahead of mechanical issues – an alert goes off, and the garage can contact the customer to schedule maintenance. Even an awareness that customer X might be coming in for an oil change on a given day can help with planning and scheduling. “If you look at the big data/AI path, step one is just seeing the data,” said Lora. It’s part of what she refers to as the “lilypad” approach to development – building one system to enable a leap to the next lilypad, and so on. CarForce plans to operate at a population level, predicting reliability and failures across big swaths of the automotive landscape.
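As a rough illustration of that alerting step, here is a minimal sketch of a threshold check over vehicle telemetry; the field names and thresholds are invented for the example and are not CarForce's actual product logic:

```python
# Hypothetical predictive-maintenance check: scan telemetry readings and
# raise alerts the garage could act on before a breakdown occurs.
def check_vehicle(telemetry: dict) -> list:
    alerts = []
    if telemetry.get("oil_life_pct", 100) < 15:
        alerts.append("Oil change due soon")
    if telemetry.get("battery_voltage", 12.6) < 11.8:
        alerts.append("Battery weak -- likely to fail")
    if telemetry.get("coolant_temp_c", 90) > 110:
        alerts.append("Engine running hot -- schedule inspection")
    return alerts

print(check_vehicle({"oil_life_pct": 9, "battery_voltage": 12.4}))
# ['Oil change due soon']
```

Population-level prediction would replace these hand-set thresholds with models trained across the whole fleet, but the alert-then-contact loop stays the same.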

The benefits of machine learning in network management

The problem with rule-based systems is that they require maintenance and frequent updating as new rules are needed. It is often too cumbersome to create rules where numerous changes in the conditions require very different results. In addition, these systems are not very flexible: a rule set may miss a problem if it doesn't exactly match the problem's symptoms. It's much better to build a system that can learn about problems from the network experts who use it -- much like training a person who is new to the field of networking. Then, as new problems and solutions are found, the system would learn the symptoms and the resulting actions to take. Much of the industry agrees that this kind of AI integration is among the key benefits machine learning brings to network management. For our purposes, think of machine learning and deep learning as examples of neural network technology. A neural network is trained by feeding it a lot of data from the domain in question -- along with the appropriate answer or response. Once trained, the neural network can produce the appropriate response when presented with new data.
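As a toy illustration of that train-then-generalize loop, here is a minimal sketch using scikit-learn; the symptom features, labels, and data points are all invented for the example:

```python
# Sketch: train a small neural network to map network symptoms to a
# diagnosis, the way experts would label historical incidents.
from sklearn.neural_network import MLPClassifier

# Each row: [latency_ms, packet_loss_pct, crc_errors_per_min] (hypothetical features)
X = [
    [5, 0.0, 0],     # healthy
    [120, 0.1, 0],   # congestion
    [8, 4.0, 90],    # bad cable / failing NIC
    [150, 0.2, 1],   # congestion
    [6, 5.5, 120],   # bad cable / failing NIC
    [4, 0.0, 1],     # healthy
]
y = ["healthy", "congestion", "physical_fault",
     "congestion", "physical_fault", "healthy"]

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)

# Presented with new, unseen symptoms, the trained network suggests a diagnosis.
print(model.predict([[130, 0.15, 0]]))  # likely "congestion"
```

A production system would retrain as experts label new incidents, which is exactly the learning loop the paragraph describes.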

Quote for the day:

"My past has not defined me, destroyed me, deterred me, or defeated me, it has only strengthened me." -- Steve Maraboli

Daily Tech Digest - March 18, 2018

The Differences Between Machine Learning And Predictive Analytics

Machine learning applications can be highly complex, but one that’s both simple and very useful for business is a machine learning algorithm that compares employee satisfaction ratings to salaries. Instead of plotting a static predictive satisfaction curve against salary figures for various employees, as predictive analytics would, the algorithm assimilates huge amounts of training data as it arrives, and any added training data refines its predictions in real time, making them more accurate and more helpful. ... Predictive analytics can be defined as the procedure of condensing huge volumes of data into information that humans can understand and use. Basic descriptive analytic techniques include averages and counts. Descriptive analytics, based on obtaining information from past events, has evolved into predictive analytics, which attempts to predict the future based on historical data. This concept applies complex techniques of classical statistics, like regression and decision trees, to provide credible answers to queries.
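The contrast can be sketched in a few lines: a curve fitted once on historical data versus a model that keeps updating as new records arrive. Everything below (salaries, satisfaction scores, the scaling factor) is hypothetical illustration:

```python
# Predictive-analytics style vs. machine-learning style on the same data.
import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor

salaries = np.array([[40_000], [55_000], [70_000], [90_000]])
satisfaction = np.array([5.1, 6.0, 6.8, 7.9])

# Fit once on historical data, then query the fixed curve.
static_model = LinearRegression().fit(salaries, satisfaction)

# Incrementally update as each batch of records arrives.
online_model = SGDRegressor(random_state=0)
for _ in range(200):  # repeated passes stand in for a streaming feed
    online_model.partial_fit(salaries / 100_000, satisfaction)  # scaled inputs

print(static_model.predict([[80_000]]))  # prediction from the fixed curve
print(online_model.predict([[0.8]]))     # prediction that new data keeps refining
```

New employee records would simply be fed to `partial_fit` again, whereas the static model must be rebuilt from scratch.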

Creating With Cognitive

“Our IAIC initiatives were really born in response to our Clients. They said to us, ‘We’re intrigued by the concept of Cognitive Automation — but we don’t really know how it can affect our business’. We knew we had to create a safe place where companies could experiment combining process based automation solutions like Robotics Process Automation (RPA) and Business Process Management (BPM) with Autonomic and Cognitive Assets. Our process starts with an Education session to demystify different types of automation and arrive on common definitions. We then begin Ideating on how this technology could impact the overall organization and business processes. Here we take a hard look at User Experience, End to End Process Views and what the “art of the possible” could become in the future. Next, we move to a Strategy session, where we bring data scientists, business analysts, software engineers and developers together to re-imagine what the client — and their customers — really need to meet their expectations and begin applying the defined technologies to actual client use cases.

Data-Savvy Banking Organizations Will Destroy Everyone Else

“To win in today’s market and ensure future viability, it is essential that organizations capture value quickly, change direction at pace, and shape and deliver new products and services. Organizations also need to maximize the use of ‘always on’ intelligence to sense, predict and act on changing customer and market developments,” said Debbie Polishook, group chief executive, Accenture. The good news is that, despite what appears to be an ominous future, over 40% of executives see more opportunities than threats compared to two years ago. The key is to break down silos and leverage data and insights to support both internal and external business needs. Intelligent organizations have five essential ingredients that contribute to a lasting and impactful business process transformation. These essentials provide the foundation for an agile, flexible and responsive organization that can act swiftly on market and consumer changes and be in a better position to succeed.

Where NEXT for Tech Innovation in 2018?

Healthcare is an industry that is ripe for disruption. We will begin to see the power of IoT in healthcare with the emergence of inexpensive, continuous ways to capture and share our data, as well as derive insights that inform and empower patients. Moreover, wearable adoption will create a massive stream of real-time health data beyond the doctor’s office, which will significantly improve diagnosis, compliance and treatment. In short, a person’s trip to the doctor will start to look different – but for the right reasons. Samsung is using IoT and AI to improve efficiency in healthcare. Samsung NEXT has invested in startups in this area, such as Glooko, which helps people with diabetes by uploading a patient’s glucose data to the cloud to make it easier to access and analyse. Another noteworthy investment in this space from Samsung NEXT is HealthifyMe, an Indian company whose mobile app connects AI-enabled human coaches with people seeking diet and exercise advice.

Security Settles on Ethereum in First-of-a-Kind Blockchain Transaction

“It’s quite exciting that you can now leverage any clearing system, and it’s legally enforceable on even a public blockchain,” said Avtar Semha, founder and CEO of Nivaura, whose technology was used last year to issue an ethereum bond. Semha says it’s unclear exactly how much the note being issued Friday will save on the overall cost of the transaction, but he added that in last year’s ethereum bond, the final cost was reduced from an estimated 40,000 pounds to about 50 pounds, “which is pretty awesome,” he said. Further, law firm Allen & Overy helped ensure the note was compliant, the Germany-based investment services firm Chartered Opus provided issuance services, and Marex helped fix and execute the note within a “sandbox” created by the U.K. Financial Conduct Authority (FCA). As revealed for the first time to CoinDesk, on March 14 Nivaura also received full regulatory approval from the FCA, which removed some restrictions and allows the company to operate commercially.

IoT and Data Analytics Predictions for 2018

With the increased collection of big data and the necessity of advanced analytics, 2018 will see heavy usage of cloud-based analytics software rather than on-premises software. Reports suggest that more than 50% of businesses will adopt a cloud-first strategy for their big data analytics initiatives. AI will completely revolutionize the way organizations work today. Enterprises will take full advantage of machine learning to optimize infrastructure behavior, transform business processes, and improve decision-making. Gartner states that AI is just the start of a 75-year technological cycle and that it will generate revenue for 30% of market-leading businesses. According to Gartner, natural language will play a dual role, as a source of input for many business applications and for a variety of data visualizations. Operational transformation will be necessary to adopt algorithmic business with DNNs (deep neural networks) in 2018; these will be a standardized component in the toolbox of more than 80% of data scientists.

Beyond Copy-Pasting Methods: Navigating Complexity

Why is agility a good idea? Jeff Sutherland’s book promises it in the title: Scrum: The Art of Doing Twice the Work in Half the Time. At first sight that looks like a pure efficiency issue. But if we ask experienced agilists why they are doing agile, they usually come up with a list of challenges for which agile works better than other approaches: users only half understand their own requirements; requirements change because the context changes; you have to build technological substitution into your design; your solution is connected to many other things, and they all interact and change; and inside the project you know there will be unpredictable problems and surprises. If we look behind the obvious: what is the common force that makes these challenges behave the way they do? The answer is complexity. Exploring complexity has a big advantage: once we understand more about the complexity behind the problems we are trying to solve with agile, we in fact clarify the purpose of our agile practice.

Building a new security architecture from the ground up

Overseeing an infrastructure that is operating thousands of servers is a burden on any architecture team. Moving those servers—all or in part—to the cloud takes patience and innovation. The innovation part, Fry said, is key because “most commercial security products are designed and built for specific use cases. Scale and complexity typically are not present,” meaning that architects in those situations need to adapt ready-built products to their networks or develop new tools from scratch, all of which takes time, money, and skill. Further, not all parts of the network can be treated equally; enterprise and customer-facing environments differ from test environments, which differ from production environments. When dealing with networks like those at Yahoo or Netflix, the need to think “outside the box” and innovate is “not desirable; it’s a requirement,” said Fry. Though a security architect may be primarily concerned about security features and controls, the business is primarily concerned about availability and uptime.

Enterprises need a new security architecture: Graeme Beardsell

Today, data, applications, and users are outside the firewall and on the cloud, where they traverse the public Internet. To paraphrase: traditional security systems are guarding a largely empty castle. This means that enterprises need new approaches in their concept of security and must build a new security architecture. Essentially, security should be designed to take advantage of the shape of the Internet and not try to defy it. The other challenge, which is perhaps ‘invisible’, is the large number of vendors and solutions that each organization needs to manage. Analysts now advocate rationalizing multiple solutions by different vendors into suites of solutions from a single vendor for greater efficiency and productivity, and moving toward solutions that share an integrated platform to facilitate data exchange and analysis.

10 Lessons Learned from 10 Years of Career in Software Testing

There is nothing wrong with getting certified, but it’s not compulsory. A good tester needs to possess testing skills like a sharp eye for detail and analytical and troubleshooting ability, and I believe no certification can prove that you are good at those skills. While writing test cases, none of us would prefer to think about boundary value analysis and decision tables explicitly. What one needs is to apply common sense to knowledge. Who would like a person who points out litter on your balcony and makes you sweep it? No matter that he is helping to make something clean, mostly he won’t be appreciated. That is how the profession is! You might or might not be appreciated for the quality improvement work you are doing, but you need to understand the importance of what you are doing. And from time to time, you need to pat yourself on the back for the work you are doing.
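For readers who haven't met the technique the author name-drops, here is a minimal boundary value analysis sketch with pytest; the function under test and its 1-100 valid range are hypothetical:

```python
# Boundary value analysis: for an input valid from 1 to 100, test just
# inside, on, and just outside each boundary.
import pytest

def accept_quantity(qty: int) -> bool:
    """Hypothetical function under test: valid quantities are 1..100."""
    return 1 <= qty <= 100

@pytest.mark.parametrize("qty,expected", [
    (0, False),    # just below the lower bound
    (1, True),     # on the lower bound
    (2, True),     # just above the lower bound
    (99, True),    # just below the upper bound
    (100, True),   # on the upper bound
    (101, False),  # just above the upper bound
])
def test_quantity_boundaries(qty, expected):
    assert accept_quantity(qty) is expected
```

In practice, an experienced tester picks these cases instinctively, which is the "common sense applied to knowledge" the author is describing.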

Quote for the day:

"Character is much easier kept than recovered." -- Thomas Paine

Daily Tech Digest - March 17, 2018

A Comparison Between Rust and Erlang

Erlang, being a high-level, dynamic and functional language, provides lightweight processes, immutability, distribution with location transparency, message passing, supervision behaviors, etc. Unfortunately, it is less than optimal at doing low-level stuff and is clearly not meant for that. ... Indeed, XML stanzas have to be read from the command line or from the network, and anything coming from outside of the Erlang VM into it is tedious to work with. You possibly know the odds. For this kind of use case, one could be tempted to consider a different language. In particular, Rust has recently come to the foreground due to its hybrid feature set, which makes similar promises to Erlang’s in many aspects, with the added benefit of low-level performance and safety. Rust compiles to binary and runs on your hardware directly, just like your C/C++ program would. How is it different from C/C++ then? A lot. According to its motto: “Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.”

What everyone gets wrong about touchscreen keyboards

They’re not thinking about technology they haven’t seen, or other ways of working with a device they haven’t tried. Another reason for the opposition is that two-screen laptops aren’t new. We’ve seen the idea tried in the past ten years in the form of Canova’s Dual-Screen Laptop, the Acer Iconia 6120 Dual Touchscreen Laptop, the Toshiba libretto W105-L251 7-Inch Dual Touchscreen Laptop and others. These devices were unpleasant to use and were rejected by laptop buyers. Future two-screen laptops will be the opposite. Here are five reasons why you’ll love two-screen laptops. ... Apple’s patents show an iTunes and Apple Music interface that replaces the on-screen keyboard with music controls, such as an equalizer, when one of these applications is running. It’s also easy to imagine what kinds of interfaces third-party developers could build: turntables for DJs, drawing pads for illustrators, advanced calculator keyboards for eggheads, speech notes for business presentations and game-specific controls for games.

Don't Drown Yourself With Big Data: Hadoop May Be Your Lifeline

Just how "big" is Big Data? According to IBM, 2.5 quintillion bytes of data are created every day, and 90 percent of all the data in the world was created in the last two years. Realizing the value of this huge information store requires data-analysis tools that are sophisticated enough, cheap enough, and easy enough for companies of all sizes to use. Many organizations continue to consider their proprietary data too important a resource to store and process off premises. However, cloud services now offer security and availability equivalent to that available for in-house systems. By accessing their databases in the cloud, companies also realize the benefits of affordable and scalable cloud architectures. The Morpheus database-as-a-service offers the security, high availability, and scalability organizations require for their data-intelligence operations. Performance is maximized through Morpheus's use of 100-percent bare-metal SSD hosting.

How business intelligence in banking is shifting the paradigm

Banking has always been a competitive environment, even before the digitization of the industry reached its present pace. Thanks to financial technology, the competition has become even tougher. Fintech companies are to banks what Uber is to taxis, and, as we know, taxi drivers aren’t happy about Uber. Apart from having their profits endangered by fintech companies, banks also face extreme pressure from regulators. Since the 2008 crisis, regulatory agencies such as the FRB, OCC and FDIC have been watching banks carefully. And while most banks didn’t participate in the activities that led to the crisis, all of them have to follow the strict compliance rules adopted after the market crash. Competitive business intelligence solutions for banking have to reflect all these requirements. They have to be flexible and transparent to adapt to the competitive and regulatory environment, and they have to be scalable to keep up with the growing digitization of the industry, as more clients are starting to forget the last time they visited a bank physically.

4 steps to implementing high-performance computing for big data processing

The message for company CIOs is clear: if you can avoid HPC and just use Hadoop for your analytics, do it. It is cheaper, easier for your staff to run, and might even be able to run in the cloud, where someone else (like a third-party vendor) can run it. Unfortunately, being an all-Hadoop shop is not possible for the many life sciences, weather, pharmaceutical, mining, medical, government, and academic companies and institutions that require HPC for processing. Because file sizes are large and processing needs are extreme, standard network communications, or connecting with the cloud, aren't alternatives, either. In short, HPC is a perfect example of a big data platform that is best run in-house in a data center. Because of this, the challenge becomes: how do you (and your staff) ensure that the very expensive hardware you invest in is in the best shape to do the job you need it to do?

ONF puts focus on white box switches with Stratum project

ONF intends to make Stratum available on a broad selection of networking silicon and white box switches. Stratum will also work with existing deployed systems, as well as future versions of programmable silicon. Stratum uses recently released SDN interfaces and doesn't embed control protocols. Instead, it's designed to support external network operating systems or embedded switch protocols, like Border Gateway Protocol. In this way, ONF said the Stratum project will be more versatile and available for a broader set of use cases. Founding members of the Stratum project include Big Switch Networks, VMware, Barefoot Networks, Broadcom and Google -- which donated production code to initiate the project for open white box switches. "Google has contributed the latest and greatest, and just because it's Google [its participation in the project] makes it reasonably significant," Doyle said.

Wave Computing close to unveiling its first AI system

"A bunch of companies will have TPU knock-offs, but that's not what we do--this was a multi-year, multi millions of dollars effort to develop a completely new architecture," CEO Derek Meyer said in an interview. "Some of the results are just truly amazing." With the exception of Google's TPUs, the vast majority of training is currently done on standard Xeon servers using Nvidia GPUs for acceleration. Wave's dataflow architecture is different. The Dataflow Processing Unit (DPU) does not need a host CPU and consists of thousands of tiny, self-timed processing elements designed for the 8-bit integer operations commonly used in neural networks. Last week, the company announced that it will be using 64-bit MIPS cores in future designs, but this really for housekeeping chores. The first-generation Wave board already uses an Andes N9 32-bit microcontroller for these tasks, so MIPS64 will be an upgrade that will give the system agent the same 64-bit address space as the DPU as well as support for multi-threading so tasks can run on their own logical processors.

How to use Linux file manager to connect to an sftp server

The sftp command is quite easy. Open up a terminal window and log in with the command sftp USERNAME@IPADDRESS (where USERNAME is the actual remote username and IPADDRESS is the address of the remote machine). Once logged in, you can then download files onto your local machine with the command get FILENAME (where FILENAME is the name of the file). You can upload files with the command put FILENAME (where FILENAME is the name of the file). But what if you don't want to work with the command line? Maybe you find the GUI a more efficient tool. If that's you, you're in luck, as most Linux file managers have built-in support for SSH and its included tools. With that in mind, you can enjoy a GUI sftp experience without having to install a third-party solution. As you might expect, this is quite easy to pull off. I'm going to demonstrate how to connect to a remote Ubuntu 16.04 server, via the sftp protocol, using both Elementary OS Pantheon Files and GNOME Files.
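And if you'd rather script the same get/put workflow than use a terminal or file manager, here is a minimal sketch with the third-party paramiko library; the host, credentials, and file paths are placeholders:

```python
# Scripted sftp sketch using paramiko (pip install paramiko).
# Host, credentials, and paths below are placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # demo only; verify host keys in production
client.connect("192.168.1.100", username="jack", password="password")

sftp = client.open_sftp()
sftp.get("reports/report.pdf", "report.pdf")  # download, like the sftp 'get' command
sftp.put("notes.txt", "uploads/notes.txt")    # upload, like the sftp 'put' command
sftp.close()
client.close()
```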

Deep Feature Synthesis Is the Future of Machine Learning

When data is conceptualized properly, sophisticated AI algorithms can make the most ingenious observations. Algorithms that have access to the right type of data may seem virtually omniscient. Unfortunately, real-world inputs can’t always be easily processed as the type of data that these algorithms depend on. At its core, machine learning depends on numerical data. Unfortunately, some qualitative data is not easily converted into a usable format. As human beings, we have one advantage over the AI algorithms that we sometimes expect to inevitably replace us. We understand the nuances of variables that aren’t easily broken down into strings of thousands of zeros and ones. The artificial intelligence solutions that we praise have yet to grasp this concept. The binary language that drives artificial intelligence has not changed in over half a century since it was first conceived.
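One small example of bridging that gap is one-hot encoding, which turns qualitative categories into the numeric columns algorithms require; the data below is invented for illustration:

```python
# Turn a qualitative column into numeric 0/1 features with pandas.
import pandas as pd

df = pd.DataFrame({"device_type": ["sensor", "gateway", "sensor", "camera"]})
encoded = pd.get_dummies(df, columns=["device_type"])
print(encoded)
# Each category becomes its own 0/1 column that an ML algorithm can consume.
```

Deep feature synthesis pushes this further, automatically stacking such transformations across related tables, but the underlying problem it solves is the same: qualitative inputs must become numbers before the algorithms can see them.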

4 key steps to building a comprehensive data strategy

As chief data officers and data scientists play more prominent roles in developing data strategies in many enterprises, we see organizations struggling to contend with these challenges and taking a shortsighted ‘save it now, worry about it later’ approach. These situations are worsening as data becomes more active and distributed across an enterprise, with many groups and individuals implementing unique and/or customized data management and storage solutions that often begin as unmanaged ‘aaS’ (as a service) projects and evolve into critical production systems with challenging data governance, security, access and cost management dynamics. Organizations that invest in developing and implementing a strategic data plan are fundamentally better prepared to anticipate, manage and capitalize on the increasing challenges and possibilities of data.

Quote for the day:

"There are things known and there are things unknown, and in between are the doors of perception." -- Aldous Huxley