Daily Tech Digest - March 17, 2018

A Comparison Between Rust and Erlang


Erlang, being a high-level, dynamic, and functional language, provides lightweight processes, immutability, distribution with location transparency, message passing, supervision behaviors, etc. Unfortunately, it is less than optimal at doing low-level stuff and is clearly not meant for that. ... Indeed, XML stanzas have to be read from the command line or from the network, and anything coming into the Erlang VM from the outside is tedious to work with. You probably know the drill. For this kind of use case, one could be tempted to consider a different language. In particular, Rust has recently come to the foreground due to its hybrid feature set, which makes promises similar to Erlang’s in many aspects, with the added benefit of low-level performance and safety. Rust compiles to binary and runs directly on your hardware, just like a C/C++ program would. How is it different from C/C++ then? A lot. According to its motto: “Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.”


What everyone gets wrong about touchscreen keyboards

Skeptics aren’t thinking about technology they haven’t seen, or about other ways of working with a device they haven’t tried. Another reason for the opposition is that two-screen laptops aren’t new. We’ve seen the idea tried in the past ten years in the form of Canova’s Dual-Screen Laptop, the Acer Iconia 6120 Dual Touchscreen Laptop, the Toshiba libretto W105-L251 7-Inch Dual Touchscreen Laptop and others. These devices were unpleasant to use and were rejected by laptop buyers. Future two-screen laptops will be the opposite. Here are five reasons why you’ll love two-screen laptops. ... Apple’s patents show an iTunes and Apple Music interface that replaces the on-screen keyboard with music controls, such as an equalizer, when one of these applications is running. It’s also easy to imagine the kinds of interfaces third-party developers could build: turntables for DJs, drawing pads for illustrators, advanced calculator keyboards for eggheads, speech notes for business presentations and game-specific controls for games.



Don't Drown Yourself With Big Data: Hadoop May Be Your Lifeline

Just how "big" is Big Data? According to IBM, 2.5 quintillion bytes of data are created every day, and 90 percent of all the data in the world was created in the last two years. Realizing the value of this huge information store requires data-analysis tools that are sophisticated enough, cheap enough, and easy enough for companies of all sizes to use. Many organizations still consider their proprietary data too important a resource to store and process off premises. However, cloud services now offer security and availability equivalent to those of in-house systems. By accessing their databases in the cloud, companies also realize the benefits of affordable, scalable cloud architectures. The Morpheus database-as-a-service offers the security, high availability, and scalability organizations require for their data-intelligence operations. Performance is maximized through Morpheus's use of 100-percent bare-metal SSD hosting.


How business intelligence in banking is shifting the paradigm

Banking has always been a competitive environment, even before the digitization of the industry acquired its present pace. Thanks to financial technology, the competition has become even tougher. Fintech companies are to banks what Uber is to taxis. And, as we know, taxi drivers aren’t happy about Uber. Apart from having their profits endangered by fintech companies, banks also experience extreme pressure from regulators. Since the 2008 crisis, regulatory agencies such as the FRB, OCC, and FDIC have been watching banks carefully. And while most banks didn’t participate in the activities that led to the crisis, all of them have to follow the strict compliance rules adopted after the market crash. Competitive business intelligence solutions for banking have to reflect all these requirements. They have to be flexible and transparent to adapt to the competitive and regulatory environment. They have to be scalable to keep up with the growing digitization of the industry, as more clients are starting to forget the last time they visited a bank branch in person.


4 steps to implementing high-performance computing for big data processing

The message for company CIOs is clear: if you can avoid HPC and just use Hadoop for your analytics, do it. It is cheaper, easier for your staff to run, and might even be able to run in the cloud, where someone else (like a third-party vendor) can run it for you. Unfortunately, being an all-Hadoop shop is not possible for the many life sciences, weather, pharmaceutical, mining, medical, government, and academic companies and institutions that require HPC for processing. Because file sizes are large and processing needs are extreme, standard network communications, or connecting with the cloud, aren't alternatives either. In short, HPC is a perfect example of a big data platform that is best run in-house in a data center. Because of this, the challenge becomes: how do you (and your staff) ensure that the very expensive hardware you invest in is in the best shape to do the job you need it to do?


ONF puts focus on white box switches with Stratum project


ONF intends to make Stratum available on a broad selection of networking silicon and white box switches. Stratum will also work with existing deployed systems, as well as future versions of programmable silicon. Stratum uses recently released SDN interfaces and doesn't embed control protocols. Instead, it's designed to support external network operating systems or embedded switch protocols, like Border Gateway Protocol. In this way, ONF said the Stratum project will be more versatile and available for a broader set of use cases. Founding members of the Stratum project include Big Switch Networks, VMware, Barefoot Networks, Broadcom and Google -- which donated production code to initiate the project for open white box switches. "Google has contributed the latest and greatest, and just because it's Google [its participation in the project] makes it reasonably significant," Doyle said.


Wave Computing close to unveiling its first AI system

"A bunch of companies will have TPU knock-offs, but that's not what we do--this was a multi-year, multi millions of dollars effort to develop a completely new architecture," CEO Derek Meyer said in an interview. "Some of the results are just truly amazing." With the exception of Google's TPUs, the vast majority of training is currently done on standard Xeon servers using Nvidia GPUs for acceleration. Wave's dataflow architecture is different. The Dataflow Processing Unit (DPU) does not need a host CPU and consists of thousands of tiny, self-timed processing elements designed for the 8-bit integer operations commonly used in neural networks. Last week, the company announced that it will be using 64-bit MIPS cores in future designs, but this really for housekeeping chores. The first-generation Wave board already uses an Andes N9 32-bit microcontroller for these tasks, so MIPS64 will be an upgrade that will give the system agent the same 64-bit address space as the DPU as well as support for multi-threading so tasks can run on their own logical processors.


How to use a Linux file manager to connect to an sftp server

The sftp command is quite easy to use. Open up a terminal window and log in with the command sftp USERNAME@IPADDRESS (where USERNAME is the actual remote username and IPADDRESS is the address of the remote machine). Once logged in, you can then download files onto your local machine with the command get FILENAME (where FILENAME is the name of the file). You can upload files with the command put FILENAME (where FILENAME is the name of the file). But what if you don't want to work with the command line? Maybe you find the GUI a more efficient tool. If that's you, you're in luck, as most Linux file managers have built-in support for SSH and its included tools. With that in mind, you can enjoy a GUI sftp experience without having to install a third-party solution. As you might expect, this is quite easy to pull off. I'm going to demonstrate how to connect to a remote Ubuntu 16.04 server, via the sftp protocol, using both Elementary OS Pantheon Files and GNOME Files.
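
For readers who would rather script the same get/put workflow than type it interactively, here is a minimal sketch using Python's third-party paramiko library. The article itself covers only the sftp command and GUI file managers, and the address, credentials, and file names below are placeholders.

    import paramiko

    # Connect to the remote machine (placeholder address and credentials).
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # demo-only trust policy
    client.connect("192.168.1.100", username="jack", password="password")

    # Open an SFTP session over the SSH connection.
    sftp = client.open_sftp()
    sftp.get("remote_file.txt", "local_file.txt")    # download, like `get FILENAME`
    sftp.put("local_file.txt", "uploaded_copy.txt")  # upload, like `put FILENAME`

    sftp.close()
    client.close()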


Deep Feature Synthesis Is the Future of Machine Learning

When data is conceptualized properly, sophisticated AI algorithms can make the most ingenious observations. Algorithms that have access to the right type of data may seem virtually omniscient. Unfortunately, real-world inputs can’t always be easily processed as the type of data these algorithms depend on. At its core, machine learning depends on numerical data, and some qualitative data is not easily converted into a usable format. As human beings, we have one advantage over the AI algorithms that some expect will inevitably replace us: we understand the nuances of variables that aren’t easily broken down into strings of thousands of zeros and ones. The artificial intelligence solutions we praise have yet to grasp this concept. The binary language that drives artificial intelligence has not changed in the more than half a century since it was first conceived.
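
To make the numerical-data point concrete, here is a minimal sketch of the easy case the passage alludes to: converting a qualitative (categorical) column into numeric indicator columns with one-hot encoding in pandas. The table and column names are invented for illustration; deep feature synthesis goes beyond this by deriving such features automatically across related tables.

    import pandas as pd

    # A toy table with one qualitative column.
    df = pd.DataFrame({
        "customer": ["a1", "a2", "a3"],
        "plan": ["basic", "premium", "basic"],  # qualitative input
    })

    # Expand "plan" into numeric indicator columns an algorithm can consume.
    encoded = pd.get_dummies(df, columns=["plan"])
    print(encoded)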


4 key steps to building a comprehensive data strategy

As chief data officers and data scientists play more prominent roles in developing data strategies in many enterprises, we see organizations struggling to contend with these challenges and taking a shortsighted ‘save it now, worry about it later’ approach. These situations are worsening as data becomes more active and distributed across the enterprise: many groups and individuals implement unique or customized data management and storage solutions that often begin as unmanaged ‘aaS’ (as a service) projects and evolve into critical production systems with challenging data governance, security, access, and cost management dynamics. Organizations that invest in developing and implementing a strategic data plan are fundamentally better prepared to anticipate, manage, and capitalize on the growing challenges and possibilities of data.



Quote for the day:


"There are things known and there are things unknown, and in between are the doors of perception." -- Aldous Huxley

