Daily Tech Digest - July 03, 2018

Facebook releases its load balancer as open-source code

Google is known to fiercely guard its data center secrets, but not Facebook. The social media giant has released two significant tools it uses internally to operate its massive social network as open-source code. The company has released Katran, the load balancer that keeps its data centers from overloading, as open source under the GNU General Public License v2.0; it is available on GitHub. In addition to Katran, the company is offering details on its Zero Touch Provisioning tool, which it uses to help engineers automate much of the work required to build its backbone networks. This isn’t Facebook’s first foray into open-sourcing the software that runs its network. Last month, the company open-sourced PyTorch, the software used for its artificial intelligence (AI) and machine learning projects. PyTorch is a Python-based package for writing tensor computations and deep neural networks using GPU acceleration. Facebook has to develop these kinds of software packages because, while there are plenty of off-the-shelf software products out there, none of them is made for a global social media company with 2 billion users.



If you thought GDPR was bad – Just wait for ePrivacy Regulation

GDPR delineates rules for obtaining clear and unambiguous consent for collection and use of personal information. ePR follows the same definition of what constitutes valid consent but makes it the “central legal ground” for the processing of electronic communications data, direct marketing communications and the access to end users’ terminal devices (phones, wearable devices, gaming consoles, etc.). One area of concern under ePR is that while GDPR also includes legitimate interest (as long as the consumers are aware of this and have consented to it) and contractual necessity as allowable factors for collecting and processing personal data, ePR lacks these exemptions to consent. This adds ambiguity, brings into question the alignment and relationship between GDPR and ePR and may effectively narrow how companies can process electronic communications data as well as what they can collect. ... Or on the flip side, does it mean that companies have to alter account origination processes to obtain consent for processing necessary to set up the account? Both questions seem to extend the need for obtaining consent beyond what the GDPR established.


Jabra Elite 65t true wireless earphones review: A true AirPod alternative

They’ll remain safe during a shower, and they’ll be fine if you get caught in the rain. Just don’t take them swimming. With their IP55 rating, they’ll stand up to a blast from a jet of water. But submersion? Not so much. ... If you remove one of the buds from your ear, it’ll pause whatever you’re listening to. Put it back in, and the music continues—a nice touch of sophistication. And staying true to Jabra’s roots as a Bluetooth headset maker, the Elite 65t can also be used with just one earbud, the right one, pushed into your ear canal. This makes the earphones a good choice for anyone looking to use their phone hands-free while driving. Oh, and should you lose one of your earphones, Jabra makes it easy to buy a replacement through its accessory site. Jabra says the 65t can run for up to five hours on a single charge. I found this estimate to be reasonably accurate. The earphones’ slim charging case, while larger than what you’ll see with a set of AirPods, has enough juice to provide two additional five-hour charges. Users will appreciate that 15 minutes of charging in the case provides about 90 minutes of music.


Ransomware: Not dead, just getting a lot sneakier

Ransomware may no longer be flavour of the month, but it remains a significant threat. The short-term damage means business can't be done while files are encrypted; the longer-term impact may be a loss of trust from customers and users who no longer feel the victim can be trusted to keep their data secure. There's also the possibility that a victim who pays the ransom could easily become infected again as attackers realise they've got an easy target on their hands. For cybercriminals, ransomware still offers a big payday, quickly, unlike malicious cryptocurrency mining, which requires patience to realise a pay-off. Behind much of the potency of ransomware is the EternalBlue SMB vulnerability, which allowed WannaCry, NotPetya and other ransomware attacks to self-propagate around networks. It's over a year since the NSA exploit was leaked by hackers, but there are plenty of organisations which, despite the clear demonstrations of the damage attacks exploiting EternalBlue can do, still haven't patched their networks.


The pros and cons of serverless architecture


Fundamentally, serverless lets developers focus on writing code. There are still servers somewhere in the stack, but the developer doesn't need to worry about managing those underlying resources. While services like Amazon Elastic Compute Cloud (EC2) require you to provision resources for the OS and the application, a serverless architecture simply asks how many resources a single invocation of your function requires. For example, a web testing suite might require 128 MB of RAM for any single website. Even if you deploy 10 million copies of that function, each individual one needs only 128 MB. They can even all run at the same time. Serverless focuses on what each individual request requires and then scales automatically. There are several different approaches to serverless development. Most developers who transition from a traditional framework, such as Flask, Rails or Express, might choose a serverless framework, such as Chalice for Python or Serverless for Node.js. These frameworks are similar to the traditional ones, which helps ease the transition for those developers.
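The one-invocation-one-request idea above can be sketched with a minimal Lambda-style handler. The event shape and the `check_website` helper are illustrative assumptions, not any platform's real API:

```python
import json

def check_website(url):
    # Placeholder for the web-testing work a real function would do;
    # each invocation needs only its own small memory footprint.
    return {"url": url, "status": "ok"}

def handler(event, context=None):
    # One invocation handles exactly one website; the platform, not the
    # developer, decides how many copies run in parallel.
    url = event.get("url", "")
    result = check_website(url)
    return {"statusCode": 200, "body": json.dumps(result)}

print(handler({"url": "https://example.com"}))
```

Scaling then falls out for free: ten million concurrent invocations are ten million independent calls to `handler`, each sized for a single request.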


How careless app developers are risking data of millions of sensitive users

Poorly secured backend databases in thousands of apps are leaking sensitive user data. Many app developers have put millions of users' sensitive medical and financial records at risk through poor coding practices. A recently released report by the mobile security firm Appthority describes the data leaks. The report pins the blame on app developers who failed to properly use Google’s Firebase cloud database. The platform, acquired by Google in 2014, is used for authentication of user details. Firebase is intended to make app development much easier by doing much of the manual authentication work for coders. Appthority’s report lists more than 3,000 apps that leaked user details. Most of these apps are Android-based, while only 600 are on iOS. These incorrectly configured Firebase databases have exposed many users on the internet. Many of these apps record sensitive information such as financial data, employee medical records, and plain-text passwords.

Why accounting matters to your cloud computing plans

While cloud computing can save you millions of dollars a year, it may actually cost you money, at least in the short term. That’s something I’ve run into from time to time with clients over the years. At issue is that you need to consider net savings. That means looking for the all-in cost of the cloud, including dealing with tax and other accounting implications. Although cloud computing is typically a superior model, walking away from traditional hardware and software has a cost as well. Indeed, in a few cases I’ve found that a cloud computing solution that will save $10 million a year actually will cost $15 million considering the impact of taxes. The gross savings made sense for cloud, but the net savings did not. So, how are cloud geeks supposed to deal with these accounting issues? By using business analysts to work up cloud ROI models. It’s not uncommon for these business analysts to be CPAs. Even more complex is the fact that most companies are multinational these days, and so you have to figure out the net cost impact not only for a single country, but for dozens of countries that have some pretty odd laws when it comes to accounting, especially tax issues.
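The gross-versus-net distinction is just arithmetic, which makes it easy to sketch. The cost breakdown below is entirely hypothetical, chosen only so the numbers match the article's $10M-savings-turned-$15M-cost example:

```python
def net_savings(gross_savings, migration_write_offs, tax_impact):
    """Net annual savings once the all-in costs of leaving the old
    hardware and software behind are counted."""
    return gross_savings - migration_write_offs - tax_impact

# Hypothetical breakdown: $10M gross savings wiped out by $9M in
# asset write-offs and $16M in tax impact -- a net cost of $15M.
result = net_savings(10_000_000, 9_000_000, 16_000_000)
print(result)  # -15000000
```

A real ROI model built by a CPA would have many more line items (depreciation schedules, per-country tax treatment), but the decision rule is the same: move only if the net, not the gross, is positive.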


The modern CSO: Future-proofing your organization in a disruptive world

Thinking well in advance about the risk involved in moving to new IT platforms should allow CSOs to make sure that some things (e.g., privacy by design) are taken into account from the start and that the emphasis on security and compliance is maintained. “It’s also worth keeping up with what is taking place on the security side by looking at the low hanging fruit for security problems. Patching machines, keeping software updated, managing access control – these are all well-understood issues that keep getting exploited,” he notes. “The big problems like WannaCry in 2017 were all due to known issues. Understanding those breaches and patching vulnerabilities quickly should keep companies ahead of the large majority of potential attacks.” New technologies such as containers should also make this easier. “Rather than having to build upon that existing IT infrastructure and keep updating it, you can use a clean container build each time that is up to date. You keep the containers as up to date as possible, you continuously audit any third-party software or plug-ins used within those containers, and you focus on those images in your library,” he explains.


Pulse Secure VPN enhanced to better support hybrid IT environments

Pulse Secure recently announced a new release of Pulse Connect Secure aimed at simplifying connectivity and security in cloud and hybrid IT environments. Pulse Connect Secure is fully mobile-aware, with features such as certificate-based authentication with an embedded certificate authority and an integrated endpoint container. Support for SAML authentication allows enterprises to blend data center and cloud resources into a robust user experience. Pulse Connect Secure simplifies network administration and compliance management with a centralized web-based console, end-user self-provisioning, and integration with EMM policy management platforms. Centralized appliance management delivers an IT administration experience that enables proactive and rapid responses to security threats and network events. Administrators are able to replicate configuration and policies from one appliance to others and perform bulk operations for firmware updates and policy changes. An administrative dashboard provides appliance status and unified compliance reporting with context-aware visibility of devices and users. Pulse Connect Secure can be deployed as a hardware, virtual, or cloud appliance.


Cybersecurity remains non-core competency for most C-suite executives

Whilst cybersecurity has now become a critical business function, it remains a non-core competence for a significant number of boards. CISOs have become increasingly common in recent years (recent research suggests that nearly two-thirds of large US companies now have a CISO position), but the majority do not report directly to the CEO, which reduces their effectiveness. Cyrus Mewawalla, Head of Thematic Research at GlobalData commented, “The frequency of cyberattacks is only likely to accelerate over the coming years, therefore it is vital that senior executives have a full understanding of the inherent risks and implications. The losers will be those companies whose boards do not take cybersecurity seriously, as they run a higher risk of being hacked.” It is hard to assess a company’s exposure to cybersecurity risk, but the composition of the board often provides clues: CEOs who do not have a CISO reporting directly to them present a high risk.



Quote for the day:


"The leadership team is the most important asset of the company and can be its worst liability." -- Med Jones


Daily Tech Digest - July 02, 2018

Microsoft Surface Studio: A cheat sheet

From the point of view of artists and designers, the Studio offers a high-end computer built around their creative needs, which does away with having to use a separate drawing tablet and computer. Even if creatives ignore the Surface Studio, its release is good news, likely to prompt incumbents like Apple and Wacom to spec up and cut the prices of new machines — in particular for the iMac, which the Studio has been compared to many times, despite the iMac lacking a touchscreen. By following up the immaculately designed Surface Book laptop with a striking machine like the Surface Studio, Microsoft also appears to be trying to establish itself as a competitor to Apple on the design front. The Surface Studio garnered good reviews but with sizable caveats. TechRepublic's sister site ZDNet praised its attractive high-resolution screen and snappy performance but criticized its high price, limited build-to-order and upgradeability options, as well as the fact the Surface Dial is not included by default. CNET had similar concerns, and also highlighted limitations of the GPU choice and lack of front-mounted USB ports and Thunderbolt connection.



UK government cyber security standard welcomed


The standard outlines a set of cyber security outcomes for government departments to achieve in the areas of identification, protection, detection, response and recovery. The outcomes-based approach is aimed at allowing government departments flexibility in how the standards are implemented, “dependent on their local context”, the document states, adding that “compliance with the standards can be achieved in many ways, depending on the technology choices and business requirements in question.” Some of the key requirements include clear lines of responsibility and accountability to named individuals for the security of sensitive information, training and guidance for senior accountable individuals, strict access control, use of secure configurations, regular patching, attention to email and web application security, developing an incident response and management plan and the testing of contingency mechanisms to ensure continued delivery of essential services. One of the few prescriptive uses of technologies is the use of Transport Layer Security version 1.2 (TLS v1.2) to protect email and data in transit.
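The TLS v1.2 floor is one of the few requirements that maps directly to code. As one illustrative way to enforce it, Python's standard `ssl` module (3.7+) lets an application refuse anything older:

```python
import ssl

# Build a default client context, then pin the minimum protocol
# version to TLS 1.2 so older, weaker versions are never negotiated.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

Equivalent knobs exist in most web servers and mail transfer agents; the point of the standard is the outcome (no pre-1.2 TLS in transit), not any particular tool.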


How to Write Better Code

For the first of these points, great books help you to read code. Some books that I strongly recommend are Clean Code, Implementation Patterns, Refactoring, The Art of Agile Development, The Pragmatic Programmer, and Practices of an Agile Developer. I've enjoyed reading all of these books immensely. These books will teach you considerations such as low coupling, high cohesion, and simple design. They will teach you useful principles like the Single Responsibility Principle and the Open-Closed Principle. The patterns and principles give you new concepts and a shared vocabulary for discussing code with your team. For the second of these points, Test-Driven Development is one great way to learn how to write code. I enjoy doing coding katas myself and often use them for teaching. But the most valuable skill when writing code is one I learned at code retreats: knowing when to delete the code you write. I don't just mean refactoring it to be smaller. I mean that, for coding exercises, you highlight all the files and press the delete button. For production code, after spending a few hours working on a task, use git reset --hard HEAD.
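A coding kata in the TDD style might look like the sketch below: the assertions are written first, the implementation is grown to satisfy them, and then (per the advice above) the whole file is deleted and the exercise repeated. FizzBuzz is used here purely as a stand-in kata:

```python
def fizzbuzz(n):
    # Implementation grown one failing assertion at a time.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# The "tests" that drove the implementation:
assert fizzbuzz(3) == "Fizz"
assert fizzbuzz(5) == "Buzz"
assert fizzbuzz(15) == "FizzBuzz"
assert fizzbuzz(7) == "7"
print("kata passes")
```

The value is not in keeping this file; it is in how much faster and cleaner the second and third rewrites come out.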


How a robot vacuum navigates your home

Newer, higher-end robot vacuums include self-navigation systems that use mapping technology. Each manufacturer implements its own particular spin on mapping, but each is currently built around one of two slightly different methods. One uses an onboard digital camera to take pictures of walls, ceilings, doorways, furniture and other landmarks. A version of this type of mapping is used in iRobot's Roomba 900 series vacuums and Samsung’s Powerbots. The other method, employed in vacuums like Neato's Botvac series, uses a laser range finder (also called LIDAR, for Light Detection and Ranging) that measures the distance to objects in the vacuum’s path. In either case, the robot vacuum uses the data it collects in combination with information from its other sensors to gradually build a map of the room during its initial cleaning. Mapping delivers significant advantages. Armed with a floor plan, the robot vacuum can plot the most efficient route through the room, which is why mapping models seem to move in more orderly straight lines than their non-mapping counterparts. Mapping also allows the robot vac to localize itself within the map, which tells it where it's been and where it still needs to go.
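At its simplest, the "where have I been, where do I still need to go" bookkeeping is an occupancy grid. The toy sketch below (room size and path are invented) shows the idea, stripped of all the sensor fusion a real vacuum does:

```python
def coverage(grid_w, grid_h, visited_cells):
    """Split a grid_w x grid_h room into cells already cleaned and
    cells still to visit, given the cells the vacuum has crossed."""
    visited = set(visited_cells)
    remaining = [(x, y) for x in range(grid_w) for y in range(grid_h)
                 if (x, y) not in visited]
    return len(visited), remaining

# An orderly straight-line pass over the first row of a 3x3 room:
done, todo = coverage(3, 3, [(0, 0), (1, 0), (2, 0)])
print(done, len(todo))  # 3 6
```

The camera or LIDAR data is what lets the robot know *which* cell it is actually in; once localized, route planning over the remaining cells is straightforward.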


Slack outages raise reliability concerns

In an interview last month, a Slack representative acknowledged the company's rapid growth has been challenging to keep up with at times. Since January 2015, the company has grown from 1.1 million daily users to 8 million regular users today. "To be frank, we're still learning as we go," said Julia Grace, senior director of infrastructure engineering at Slack, based in San Francisco. "This is such a complex piece of software. We're operating at a global scale. We're learning and evolving and growing and making the service better along the way." Some analysts pointed out that Slack's performance was much worse when it was starting out. "Once upon a time, in the very, very, very early days of Slack, they were built on a model that couldn't scale," said Michael Facemire, an analyst at Forrester Research. "You remember those outages; you remember the old days when [Slack] would be down, and it would be down for very perceivable amounts of time." Nevertheless, with tech powerhouses Cisco and Microsoft as competitors, Slack can no longer afford to look weak. Companies are unlikely to standardize on a collaboration vendor with an uptime record significantly less than rivals.


IEEE Sets Fog Computing Standard

“We now have an industry-backed and -supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” Helder Antunes, chairman of the OpenFog Consortium and senior director at Cisco, said in a statement.  According to the OpenFog website: “The sheer breadth and scale of IoT, 5G and AI applications requires collaboration at a number of levels, including hardware, software across edge and cloud, as well as the protocols and standards that enable all of our ‘things’ to communicate. "Existing infrastructures simply can’t keep up with the data volume and velocity created by IoT devices, nor meet the low-latency response times required in certain use cases such as emergency services and autonomous vehicles. "By extending the cloud closer to the edge of the network, fog enables latency-sensitive computing to be performed in proximity to the data-generating sensors, resulting in more efficient network bandwidth and more functional and efficient IoT solutions. Fog computing also offers greater business agility through deeper and faster insights, increased security and lower operating expenses.”


HealthEngine's Latest Problem: A Data Breach

Embattled Australian medical appointment booking service HealthEngine says late Friday it has notified 75 users of a data breach that may have exposed some identifying information. The data breach is the latest in a string of problems for HealthEngine, which has fallen under scrutiny for tampering with patient reviews and for its third-party marketing activities, which underpin its free medical booking service. The breach involved HealthEngine's Practice Recognition system, which allows patients to write reviews of practices. It is unclear when the breach occurred. More than 59,600 patient feedback entries may have been improperly accessed, and 75 of those contained "identifying information," HealthEngine says in a notice on its website. "Due to an error in the way the HealthEngine website operated, hidden patient feedback information within the code of the webpage was improperly accessed," the company says on its website. "The information is ordinarily not visible to users of the site."


Shadow IT: When employees venture to the dark side

While the factors leading to shadow IT today are the same, the outcome (and risks) are largely new. Equipped with a company credit card and their web browser, users are willing to go outside the scope of IT to get the apps they need to work productively, jeopardizing security and corporate compliance in the process. Employees who play fast and loose with the rules of IT can lead to renegade apps stirring up all sorts of trouble, and everyone can contribute to the problem. Executives who store sensitive notes and documents within apps like Evernote and Dropbox put company secrets at risk. The marketing department can cause financial headaches by, for example, purchasing unsanctioned Salesforce licenses for their team members. When shadow apps run amok outside the scope of IT, a lot can go wrong. Most importantly, without knowledge or control of the apps workers are using, IT admins cannot guarantee corporate or user privacy. Employee workflow and productivity are also at risk. Individual teams that use competing apps (for example, sales uses Slack while engineering uses Microsoft Teams) can make collaboration more difficult, if not impossible. And then there are the costs associated with paying for separate software licenses, or worse, paying double for the same software license across different teams.


Enabling stakeholders to boldly support data governance

Decisions on data governance programs ultimately affect many different stakeholders, some of whom are unknown when those decisions are made. Understanding the entirety of the people that are affected is where a lot of the change management aspects come into play. Widening the net to engage as many people as possible helps ensure the program is effective. Creating a data governance program behind closed doors with just a select few participants can be a recipe for disaster. Engagement depends on knowledge and buy-in, so don’t short-change your efforts by limiting participation. Communicate with as many data stakeholders as you can, especially with those who have taken on data community leadership roles. Data stakeholders want to be part of the process of creating, implementing and sustaining a data governance program. They want to know that their knowledge is appreciated and taken into account when decisions are made about policies, procedures, business rules, metadata and tools.


Dispatch From The Super Internet

The Super Internet is the sum total of how the internet operates when you’re running a very large number of Chrome extensions. It’s a different and better internet, where all the normal complaints don’t apply. On the Super Internet, you don’t enter passwords or see advertising. You don’t get tracked. Every page is HTTPS. And if you go to a page where your registered password has been leaked to the dark web, the Super Internet will tell you. Cloud applications on the Super Internet are ten times better than those available to most users. The Super Internet version of Gmail can send and receive SMS, do advanced mail merge, send recurring and scheduled emails, send PGP-encrypted emails, apply follow-up and due-date reminders to incoming emails, edit outgoing emails using HTML or Google Docs, block notification of senders when you open an email — the list goes on and on. The Super Internet has social networking features most users can’t even imagine. Twitter, for example, is enhanced with auto-refreshing streams, one-button account switching, instant and automatic following and unfollowing, the ability to remove any component of Twitter, including promoted tweets, and hundreds of additional features.



Quote for the day:


"The weak can never forgive. Forgiveness is the attribute of the strong." -- Mahatma Gandhi


Daily Tech Digest - July 01, 2018


There’s a lot of interest in becoming a data scientist, and for good reasons: high impact, high job satisfaction, high salaries, high demand. A quick search yields a plethora of possible resources that could help -- MOOCs, blogs, Quora answers to this exact question, books, Master’s programs, bootcamps, self-directed curricula, articles, forums and podcasts. Their quality is highly variable; some are excellent resources and programs, some are click-bait laundry lists. Since this is a relatively new role and there’s no universal agreement on what a data scientist does, it’s difficult for a beginner to know where to start, and it’s easy to get overwhelmed. ... Many of these resources follow a common pattern: 1) here are the skills you need and 2) here is where you learn each of these. Learn Python from this link, R from this one; take a machine learning class and “brush up” on your linear algebra. Download the iris data set and train a classifier (“learn by doing!”). Install Spark and Hadoop. Don’t forget about deep learning -- work your way through the TensorFlow tutorial.



Crypto Coin Graveyard Fills Up Fast as ICOs Meet Their Demise

Blockchain startups are faring worse than their counterparts in other industries. Of 103 companies that received initial seed or angel funding in 2013 and 2014, only 28 percent managed to raise additional funding, according to CB Insights’s October report. That compares with 46 percent of the 1,098 tech companies that raised a second round in the U.S. between 2008 and 2010. Among tech companies, 14 percent went on to a fourth round, while only 2 percent of the blockchain companies did, the researcher found. "I don’t think we found the killer app yet," said Arieh Levi, an analyst at CB Insights. "It just seems like there’s been a lot of projects tried, but there aren’t really many users of blockchain protocols beyond speculators and traders." The failed projects have cost investors billions. Barring outliers like BitConnect, which saw its market cap shrink to about $4 million from nearly $3 billion in December, most of the ICOs that birthed these coins were relatively small, but investors may have still lost as much as $500 million, estimated Lex Sokolin, global director of fintech strategy at Autonomous Research LLP.


Why adopt cloud technology in the financial services industry?

Financial firms should undertake a shift in thinking and put technology – rather than finance – at the core of their business. UEBA (User and Entity Behaviour Analytics) and CASB (Cloud Access Security Broker) technologies together provide solutions to these challenges. UEBA tracks what users are doing and how data is moving, flagging if user or data behaviour differs from what could be considered normal and safe. Whether authorised or not, employees can put data and systems at risk, even if they stay within the security policies managed by a CASB. For example, a hacker that’s tricked an employee into divulging their credentials can move cloud data laterally from different applications to a cloud system, designed to surreptitiously withdraw the data afterwards. A recent survey found that hackers can exit a network within an hour, armed with prized data, so it’s vital to spot a compromised account before it’s too late. CASB helps financial firms get the rules of engagement just right, as CASB security keeps users in line with an organisation’s cyber security policies.
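The core of the UEBA idea described above is simple to sketch: learn a per-user baseline and flag sharp deviations from it. The toy detector below uses a z-score over daily activity counts; the data and the threshold of 3 standard deviations are illustrative assumptions, and real UEBA products model far richer behaviour:

```python
import statistics

def is_anomalous(baseline_counts, todays_count, threshold=3.0):
    """Flag today's activity if it sits more than `threshold`
    standard deviations from the user's own historical baseline."""
    mean = statistics.mean(baseline_counts)
    stdev = statistics.stdev(baseline_counts)
    if stdev == 0:
        return todays_count != mean
    return abs(todays_count - mean) / stdev > threshold

# A user who normally downloads ~10 files a day suddenly pulls 500,
# the kind of lateral data movement a compromised account produces:
normal_days = [9, 10, 11, 10, 12, 9, 10]
print(is_anomalous(normal_days, 500))  # True
print(is_anomalous(normal_days, 11))   # False
```

This is why UEBA complements a CASB: the 500-file day can be fully within policy and still be the one-hour window in which a hacker walks out with the data.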


The Blockchain ecosystem v3: six months after the hype


The crypto world nowadays is not a safe haven. New businesses appear, evolve and rest on their laurels, while others default or fail to meet investors’ expectations. In the nine months since the first version of our map, many projects have soared, but some have terminated all activities. So, we have added a special section for discontinued projects and proven frauds, to remind you to triple-check before getting into adventurous endeavors. Nevertheless, positive news has also struck the crypto universe. We have placed on the map many new interesting projects with successfully closed ICOs from the past nine months, as well as companies that we think deserve our readers’ attention. Companies that haven’t released their product yet are painted in a pale gray color, to make the map even more informative. Although the number of projects we highlight here has more than doubled, half of the new ones still haven’t launched their product, thereby falling behind those that have. Of course, it’s quite expected that a decent portion of these newcomers arises from the financial services sector.



Using Topological Data Analysis to Understand the Behavior of Convolutional Neural Networks


There is a particular class of neural networks that are well adapted to databases of images, called convolutional neural networks. In this case, the input nodes are arranged in a square grid corresponding to the pixel array for the format of the images that comprise the data. The nodes are composed in a collection of layers, so that all edges whose initial node is in the i-th layer have their terminal node in the (i+1)-st layer. A layer is called convolutional if it is made up of a collection of square grids identical to the input layer, and it is understood that the weights at the nodes in each such square grid (a) involve only nodes in the previous layer that are very near to the corresponding node and (b) obey a certain homogeneity condition, so that for each square grid in layer i, the weights attached to a given node are identical to those for another node in the same grid, but translated to its surrounding neurons. Sometimes intermediate layers called pooling layers are introduced between convolutional layers, and in this case the higher convolutional layers are smaller square grids. Here is a schematic picture that summarizes the situation.
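The homogeneity condition (b) is the weight-sharing at the heart of a convolutional layer: one small kernel of weights is reused, translated, at every position of the input grid. A minimal pure-Python sketch of that operation, with a toy 3x3 image and an invented 2x2 kernel:

```python
def conv2d(image, kernel):
    """Valid 2D convolution: slide `kernel` over `image`, computing a
    weighted sum at each position with the SAME weights every time --
    the homogeneity condition described above."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
kernel = [[1, 0],
          [0, 1]]  # adds each pixel to its lower-right neighbour
print(conv2d(image, kernel))  # [[6, 8], [12, 14]]
```

Condition (a), locality, is visible in the index arithmetic: each output node draws only on the `kh x kw` input nodes directly beneath it. Pooling layers then shrink the resulting grid between convolutions.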


‘Moneyball’ing data – A closer look at how churn and propensity models work


As machine learning, deep learning, artificial intelligence, etc. become mainstream words that are taught in primary schools these days, it pays to fully understand how a system truly makes predictions and prescribes actions that a business should take. In this article, let’s look at how churn models and propensity-to-buy models can help you ‘moneyball’ your data. First things first: to ‘moneyball’ your data, you first need to have data. It can be anything from sales data, customer demographics, visits, social profiles, customer feedback, etc. This data, which forms the basis your models are trained on, is called ‘training data’. Models and algorithms are either pre-built or can be customized for a specific use case. For example, if you want to understand which segment of customers is going to churn in the next quarter, you can build a churn model which denotes the churn probability of a specific customer or set of customers as a percentage. You can then get an output along the lines of ‘Top 100 customers that are going to churn in Q2 18’ and use that report to engage those customers better.
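To make the "probability as a percentage" idea concrete, here is a toy churn scorer using a logistic model. The feature names and weights are invented for illustration; in practice the weights would be learned from the training data described above:

```python
import math

# Hypothetical learned weights: idle months and support tickets push
# churn risk up, long tenure pushes it down.
WEIGHTS = {"months_since_last_order": 0.4,
           "support_tickets": 0.3,
           "tenure_years": -0.5}
BIAS = -1.0

def churn_probability(customer):
    """Logistic model: weighted sum of features squashed into (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * customer.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# A long-idle customer scores far higher than a recently active one:
idle = {"months_since_last_order": 8, "support_tickets": 4, "tenure_years": 1}
active = {"months_since_last_order": 1, "support_tickets": 0, "tenure_years": 5}
print(churn_probability(idle) > churn_probability(active))  # True
```

Scoring every customer this way and sorting descending is exactly how a 'Top 100 customers likely to churn' report is produced.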


EU Report Says Cryptocurrencies 'Unlikely' to Challenge Central Banks

In the latest Monetary Dialogue report issued on June 26, the European Parliament's Committee on Economic and Monetary Affairs said that while cryptocurrencies have made financial transactions "relatively safe, transparent, and fast," they pose no threat to sovereign currencies around the world. The analysis, which was conducted by the Center for Social and Economic Research, a non-profit research institute based in Warsaw, first recognized the positive changes cryptocurrencies have brought to financial transactions, noting that they now "are used globally, disregarding national borders." Cryptocurrencies "respond to real market demand," the analysis claimed, and they will have the potential to become a "full-fledged private money" or even a permanent element to the global economy. However, the researchers said it is "unlikely" cryptocurrencies will threaten central banks and sovereign currencies and dismantle the existing monetary structures, especially in countries where their sovereign currencies are widely circulated.


The Spooky World of Quantum Computation


Quantum theories are counter-intuitive, because we all know that's not how the world works. The theories were unsettling even to the scientists who developed them, who struggled with their strange implications. Einstein hated the idea that the world was fundamentally non-deterministic, which led to his famous pronouncement that ‘God does not play dice.’ Schrödinger hoped his later theoretical work would eliminate what he called ‘the quantum jump nonsense’ and expressed regret at ever having contributed to quantum theory. To show how absurd superposition was, Schrödinger devised a thought experiment in which a cat was isolated from the outside world in a box. Also in the box were a radioactive rock, a bottle of cyanide, and a Geiger counter wired to a hammer, which would smash the bottle if any radioactivity was detected. The emission of particles from a radioactive source is a probabilistic event that cannot be predicted ahead of time. If a particle was emitted, the poison would be released, killing the cat. Quantum theory says that until a measurement is made, the decaying particle is in both a decayed and non-decayed state.


A Deep Dive Into Cloud-Agnostic Container Deployments

Container orchestration refers to the automated organization, linkage, and management of software containers. These concepts are conventional for most of the tools mentioned above. This article aims to take a deep dive into a comparison between the two dominant players. ... Kubernetes necessitates a set of manual configurations to tie its components to the Docker engine, and it comes with a distinct installation procedure for every operating system. Before installation, Kubernetes requires information such as node IP addresses, their roles, and their number. There are many tools available to simplify the install and config process, though. Kubernetes is considered relatively white-box, i.e. you can get a lot more out of it, but you really need a deep understanding of what makes Kubernetes tick to achieve this. The platform is not designed for novices or the faint of heart. Throughout the pros and cons of Docker Swarm, you can see that Swarm's focus is on ease of adoption and integration with Docker. Kubernetes, on the other hand, stands open and flexible.


2018 State of Testing Report


Open questions can sometimes be tricky but they are also incredibly interesting, as they provide an open platform for individual testers to express themselves and provide answers we could not foresee ahead of time. Specifically, in the question about non-testing tasks, we saw a number of recurring answers pointing towards testers working either closer to customers (organizing Beta Testing programs, or briefing customers directly on the functionality of the product) or representing those customers while serving as product owners within their teams. We also saw a number of answers stating that testers are now writing product code as part of their day-to-day tasks, aligned with the philosophy that teams are uniform and every member can and should be able to perform all tasks. Open questions are also an opportunity for respondents to release some of the tensions and frustrations they feel as part of their work... like the person who answered that one of his non-testing tasks was to serve as a ZOO KEEPER, something I am sure many of us have felt at one time or another in our testing careers.



Quote for the day:


"Leadership happens at every level of the organization and no one can shirk from this responsibility." -- Jerry Junkins


Daily Tech Digest - June 30, 2018

Block diagram showing system impact due to a failed microservice
Resiliency is the capability to handle partial failures while continuing to execute rather than crash. In modern application architectures — whether microservices running in containers on-premises or applications running in the cloud — failures are going to occur. For example, applications that communicate over networks are subject to transient failures. These temporary faults cause short periods of downtime due to timeouts, overloaded resources, networking hiccups, and other problems that come and go and are hard to reproduce. ... You can’t avoid failures, but you can respond in ways that keep your system up or at least minimize downtime. For example, when one microservice fails, its effects can cause the whole system to fail. ... Developers often use the Circuit Breaker and Retry patterns together to give retrying a break. Retry tries an operation again, but when it doesn’t succeed you don’t always want to just try it one more time, or you may risk prolonging the problem. The Circuit Breaker pattern effectively shuts down all retries on an operation after a set number of retries have failed.
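A minimal sketch of how the two patterns combine, assuming a simple in-process service call; the class names, thresholds, and timeouts here are illustrative, not any particular library's API:

```python
# Retry + Circuit Breaker sketch: retries handle transient faults,
# while the breaker stops retrying once failures pile up.
import time

class CircuitOpenError(Exception):
    """Raised when the breaker is open and calls are refused."""

class CircuitBreaker:
    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        # While open, fail fast until the reset timeout elapses.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise CircuitOpenError("circuit is open; failing fast")
            # Half-open: allow a trial call through.
            self.opened_at = None
            self.failures = 0
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # any success resets the count
        return result

def retry(breaker, func, attempts=3, base_delay=0.01):
    # Retry transient failures with exponential backoff, but stop
    # immediately once the breaker opens rather than prolonging the problem.
    for attempt in range(attempts):
        try:
            return breaker.call(func)
        except CircuitOpenError:
            raise
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

A call site would wrap a flaky operation as, say, `retry(CircuitBreaker(), fetch_inventory)` (where `fetch_inventory` is a hypothetical remote call); once the breaker trips, callers get a fast `CircuitOpenError` instead of piling more retries onto a struggling service.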



The Humanoid Banker – Science Fiction or Future?
It is obvious that banking will also have to change and adapt to the new reality. Big events in the financial industry have had a great impact over the last 10 years. The global financial crisis of 2007-08 was just the ignition for numerous scandals followed by huge fines, leading to a flood of new regulations (AML, KYC, Basel III). Unfortunately, most regulations have yet to prove their benefits for the consumer and bank client. They have definitely made service delivery more bureaucratic, time-consuming and costly, reducing client satisfaction. And, of course, they have also made life for bank staff more difficult, especially as resources are trimmed and cost-cutting continues. All of this is happening in the midst of a global shift of economic power towards the East. We should also note that the developed world, especially Europe, is rather more technophobic than progressive. This sharply contrasts with the emerging Asian economies, driven by behemoth China, which are already taking the lead in many technology and science disciplines, e.g., robotics, artificial intelligence, social media, smartphones, wearable technologies, the internet of things and so on.



Machine learning evolution (infographic)

Deep learning’s improved accuracy in image, voice, and other pattern recognition has made Bing Translator and Google Translate go-to services. And enhancements in image recognition have made Facebook Picture Search and the AI in Google Photos possible. Collectively, these have put machine recognition capabilities in the hands of consumers in a big way. What will it take to make similar inroads in business? Quality training data, digital data processing, and data science expertise. It will also require a lot of human intelligence, such as language-savvy domain experts who refine computable, logically consistent business context to allow logical reasoning. Business leaders will have to take the time to teach machines and incorporate machine intelligence into more processes, starting with narrow domains. Some in the statistically oriented machine learning research “tribes”—the Connectionists, the Bayesians and the Analogizers, for example—will worry that “human-in-the-loop” methods advocated by the Symbolists aren’t scalable. However, we expect these human-to-machine feedback loops, which blend methods from several tribes, will become a lot more common inside the enterprise over the next few years.


Reinventing the healthcare sector with Artificial Intelligence

India is also joining a growing list of countries that are using AI in healthcare. The adoption of AI in India is being propelled by the likes of Microsoft and a slew of health-tech startups. For instance, Manipal Hospitals, headquartered in Bengaluru, is using IBM Watson for Oncology, a cognitive-computing platform, to assist physicians in discovering personalised cancer care options, according to an Accenture report. For cardiac care, Columbia Asia Hospitals in Bengaluru is leveraging startup Cardiotrack’s AI solutions to predict and diagnose cardiac diseases. “Last year the company embarked on Healthcare NExT, a Microsoft initiative which aims to accelerate healthcare innovation through AI and cloud computing. By working side-by-side with the healthcare industry’s most pioneering players, we are bringing Microsoft’s capabilities in research and product development to help healthcare providers, biotech companies and organizations across India use AI and the cloud to innovate,” said Anil Bhansali, Corporate Vice President, Cloud & Enterprise, and Managing Director, Microsoft India (R&D) Private Limited.


Look For the ‘Human’ When Buying HR Technology


Cognitive data processing technologies have the ability to not only automate common tasks like generating and distributing reports, but also to perform duties as highly nuanced as career coaching across an entire employee base. Machine learning algorithms can learn an organization’s priorities for skills development, help assess an individual’s credentials, then make recommendations for training or positions to consider. Offered to employees through an existing learning management system or a mobile app, such innovations can facilitate large-scale corporate objectives of development and retention and give each and every worker the useful career information they crave today. Yet some new technologies have a wow factor that makes them seem useful or innovative when they may not be ready or reasonable to replace HR’s role. Virtual Reality (VR) is being touted for all sorts of uses, from eLearning to giving candidates a taste of what it’s like to work for a company. But these should be considered as augmenting traditional tactics, not entirely replacing them.


Some reasons to think about big data as data commons

Finally, the digital economy has generated a considerable fiscal distortion: companies that rely on digital business models pay on average half the effective tax rate of traditional companies, thanks to the “fluid” nature of their businesses and to the placement of their subsidiaries in countries with low tax regimes. In response, the European Commission has recently come up with two distinct legislative proposals to ensure a fair and efficient European tax system. In light of the above, an alternative data management model for a more equal and sustainable socio-economic digital environment is not only desirable but necessary. The DECODE project is building in this direction: a digital economy where citizen data is not stored in silos located and handled in overseas countries but rather self-controlled and available for broader communal use, with appropriate privacy protections and value distribution outcomes. The Data Commons approach centres precisely on the pooling and collective management of individually-produced streams of personal and sensor data that, combined with the public data of cities, will offer data-driven services that better respond to individual and overall community needs.


Third Wave Dev + Ops: Self-Learning DevOps for a Self-Learning World

DevOps AI success, in other words, may be largely contingent upon continuous testing. After all, we want our AI-powered recommendation engines to guide customers toward the right items while they're still on our websites. Why wouldn't we similarly want our AI-powered DevOps to guide developers toward the right behaviors while they're still on a feature or fix? ... Most IT organizations still treat cybersecurity as its own functional silo. In five years, this approach won't work—especially since next-generation architectures, such as blockchain, that depend on innate code might run on imperfectly secured environments beyond enterprise IT teams' control. Security must therefore become intrinsic to DevOps QA—not something for someone else to clean up after the fact. Building security checks directly into the integrated development environment (IDE) helps to protect companies from increasingly sophisticated attacks designed to discover and exploit design flaws. Security-enabled DevOps also saves money and speeds time to market in precisely the same way that QA shift left does.


Every Android device from the last 6 years may be at risk to RAMPage vulnerability

While not impossible, RAMPage is more difficult to practically exploit on end-user devices, partly because vendor-specific or device-specific issues make it harder to reliably create the conditions that allow for exploitation. Because of the degree of precision involved, it would theoretically be possible that the same model of phone with DRAM from different vendors would present different avenues of attack, or that certain optional hardware protections of LPDDR4, if added at manufacturing time, would partially mitigate the attack, the paper noted. Additionally, while the RAMPage attack was only demonstrated on an LG G4, it may be applicable to iOS devices and other devices using LPDDR2, 3, or 4 chips and running software with similar memory management techniques. That said, the researchers have proposed a fix for RAMPage called GuardION. From the tests in their whitepaper, they found it “results in a performance degradation of 6.6%, which we believe is still acceptable.”


The Insane Amounts of Data We're Using Every Minute

By the looks of the research, things are only getting bigger. In 2012, there were approximately 2.2 billion active internet users. In 2017, active internet users reached 3.8 billion people -- nearly 48 percent of the world’s population. When it comes to social media, data usage is unsurprisingly high. Since last year, Snapchat alone saw a 294 percent increase in the number of images shared per minute. Nearly 2.1 million snaps are shared every 60 seconds. On average, there are 473,400 tweets posted every minute, 49,380 Instagram photos and 79,740 Tumblr posts. So who’s behind all this social media madness? Americans upped their internet usage by 18 percent since 2017; however, it’s not all going to Snapchat and Twitter. Much of it is going to video-streaming services such as Netflix and YouTube. Since last year, Netflix saw a whopping 40 percent increase in streaming hours, going from 69,444 hours to 97,222. And YouTube videos have reached 4.3 million views per minute. Even the peer-to-peer transactions app Venmo saw a major data jump, with 32 percent more transactions processed every minute compared to last year. Overall, Americans use 3.1 million GB of data every minute.
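The year-over-year Netflix figure quoted above can be sanity-checked with a line of arithmetic (69,444 to 97,222 hours is almost exactly 40 percent growth):

```python
# Sanity-check the quoted year-over-year figures.
netflix_2017_hours, netflix_2018_hours = 69_444, 97_222
growth_pct = (netflix_2018_hours - netflix_2017_hours) / netflix_2017_hours * 100
print(f"Netflix streaming-hours growth: {growth_pct:.1f}%")

# 2.1 million snaps per minute works out to a per-second rate as well.
snaps_per_minute = 2_100_000
snaps_per_second = snaps_per_minute / 60
print(f"Snaps per second: {snaps_per_second:,.0f}")
```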


Building a Robust and Extensive Security Architecture


Building a device security system is the first line of defense in ensuring IoT security. The security capabilities of devices need to be configured to match their functions and computing resources, including memory, storage and CPU. For weak devices, such as water and gas meters, where resources are limited and power consumption and cost are issues, basic security capabilities are a must. These include basic two-way authentication, DTLS+, encrypted transmission and remote upgradability, as well as lightweight and secure transmission protocols. Strong devices with more powerful computing capabilities that don’t have power consumption constraints and are operationally critical, such as industrial control terminals and car networking equipment, require advanced security capabilities, including trusted devices, intrusion detection, secure startup, and anti-virus protection. Device chip security and security for lightweight operating systems such as LiteOS need defense capabilities in line with the security protections of strong devices.



Quote for the day:


"Remember this: Anticipation is the ultimate power. Losers react; leaders anticipate." -- Tony Robbins


Daily Tech Digest - June 29, 2018

What is Julia? A fresh approach to numerical computing
Julia is a free, open source, high-level, high-performance, dynamic programming language for numerical computing. It has the development convenience of a dynamic language with the performance of a compiled, statically typed language, thanks in part to a JIT compiler based on LLVM that generates native machine code, and in part to a design that implements type stability through specialization via multiple dispatch, which makes it easy to compile to efficient code. ... What we’re seeing here is that Julia code can be faster than C for a few kinds of operations, and no more than a few times slower than C for others. Compare that to, say, R, which can be almost 1,000 times slower than C for some operations. Note that one of the slowest tests for Julia is Fibonacci recursion; that is because Julia currently lacks tail recursion optimization. ... Julia fans claim, variously, that it has the ease of use of Python, R, or even Matlab. These comparisons bear scrutiny, as the Julia language is elegant, powerful, and oriented towards scientific computing, and the libraries supply a broad range of advanced programming functionality.



Using machine learning to understand the human immune system

The human immune system works at scales large enough to make your head spin. There are two billion lymphocytes in the body, among them what are known as 'helper' T cells and 'cytotoxic' or 'killer' T cells. Each T cell can recognise antigens -- the triggers that set off the immune system -- that are the signatures of bacteria, viruses, fungi or other invaders that have entered the body. Each T cell can bind to hundreds of different antigens, each potentially unique to a different bacterium or virus. Once a T cell has got a hit, depending on what type of T cell it is, it may kill the invader, or signal the millions of other immune cells to come and take on the wrongdoer too. Anyone taking a snapshot of the immune system when T cells are activated, by noting which T cell receptors are engaged and which antigens they bind to, could work out which disease has taken hold of the body. And, once the disease is known, doctors can see more clearly how it can be treated.



Why PGP is fundamentally flawed and needs to be fixed


From my vantage point, the biggest problem with encryption (outside of these vulnerabilities) is the fact that few people actually use it. Sure, those that are really, really concerned about privacy and security will make use of PGP in some form or function, but the average user (of which there are hundreds of millions) wouldn't know PGP if it reached out from their monitor and slapped them across the face to say, "Pay attention to me!" There's a reason for this. The average user doesn't know where to begin working with encryption ... on any level. Try talking your parents through the usage of encryption in email. Watch their faces go slack as every word you utter flies over their heads. The thing is, if PGP (or OpenPGP, or GPG, or GnuPG ... you get the idea) is to succeed, it needs to be used, and not just by the enlightened few. Encryption needs to become the standard. For that to happen, it needs to be built into email clients such that users require zero hand-holding to understand or use the technology. But before that can happen, the implementations of PGP need to be fixed.


Storage class memory set to upset hardware architectures


The more expensive, lower-capacity devices can be used as a host or array cache, adding the benefit of persistence compared with simply using DRAM. The extended endurance of these products compared with NAND flash also makes them better suited to write caching or use as an active data tier. Hyper-converged infrastructure (HCI) solutions can take advantage of low-latency persistent memory deployed in each host. Placing persistent storage on the PCIe bus, or even the memory bus, will significantly reduce I/O latency. But this also risks exposing inefficiencies in the storage stack, so suppliers will want to be quick to identify and fix any issues. Disaggregated HCI solutions, such as those from Datrium and NetApp, should see large performance improvements. In both cases, the architecture is built on shared storage with a local cache in each host. Performance is already high with NAND flash, but persistent caching using products such as Optane offers more resiliency (and less cache warm-up).


Data readiness is much more than mere cleanliness

Data set readiness comprises traditional data preparation ideas: data cleanliness and consistency, de-duplication and the management of unstructured data. (The seemingly simple task of mailing address normalization is a data preparation discipline in its own right.) In the world of the V’s – variety, volume, velocity, veracity, and even validity and volatility – the biggest challenge here is variety. Since data sets evolve over time as domain experts look for new insights and correlation with new data sources, some agility in the ability to acquire and integrate new data sets is a part of data set readiness, albeit in the “meta” sort of way where being ready to get more data ready is a prerequisite. Data pipeline readiness addresses some of the larger big data V’s: volume and velocity. Once you have models to execute, operationalizing them to operate reliably at scale and at business speed brings an entirely new set of challenges. Can your business handle the massive data flows? Can it handle them in an increasingly expeditious way?


Companies are struggling with security automation

“The cybercrime landscape is incredibly vast, organized and automated – cybercriminals have deep pockets and no rules, so they set the bar,” said Amy James, Director of Security Portfolio Marketing at Juniper Networks. “Organizations need to level the playing field. You simply cannot have manual security solutions and expect to successfully battle cybercriminals, much less get ahead of their next moves. Automation is crucial.” The growing threat landscape and the security skills gap facing cybersecurity teams demand that organizations implement automation for a stronger security posture. Respondents recognize this growing importance and how automation can improve productivity, address the growing volume of threats and reduce the rate of false positives. The top two benefits of security automation, according to respondents, are increased productivity of security personnel (64 percent) and automated correlation of threat behavior to address the volume of threats (60 percent). Over half (54 percent) of respondents say these automation technologies simplify the process of detecting and responding to cyber threats and vulnerabilities.


Will blockchain bring data ownership back to users?

Blockchain, a technology that has seen success in cryptocurrency and beyond through its security, efficiency and non-centralized control, has been seen as a way of democratizing data and putting ownership back into the hands of users. As compared to the current practices where ownership of user data is held by the enterprise, blockchain would enable the creation of a self-sovereign identity, where individuals control their own identities and personal data and are able to decide who to share it with, and to what extent. In addition, blockchain offers the possibility of micro-incentivizing people to share data at their own will, which can significantly disrupt current ways of working for industries such as advertising and content.  Organizations will need to come to terms with this new reality and be aligned with the changing mindsets and desires of their users when it comes to management of personal data. While a self-sovereign identity that is enabled by blockchain could revolutionize how personal data is managed, it does not come about without hurdles. For starters, the burden of managing and allocating access would have to be borne by the individual.


IT Mission Vision & Values Statements: Foundations For Success

The difference between a vision statement and a mission statement can be confusing. Some enterprise vision statements are actually missions and vice versa. A good vision paints a picture of a desired future state. It appeals to the heart, inspiring employees, customers, and other stakeholders to do their best. A good vision rarely changes, remaining constant through different leaders, economic circumstances, and challenges. A mission describes how the enterprise will get to the desired future state. It appeals to the head and is an anchor against which departments and programs can be measured to determine how well they support the enterprise. Missions evolve to reflect new challenges as intermediate goals are attained. ... Value statements describe the principles, beliefs and operating philosophy that shape culture. Strong values serve as a moral compass, guiding interactions among employees and providing a standard against which behaviors can be assessed. Passion, teamwork, integrity, diversity and quality are found in many enterprise value statements.


Slack outage causes disruption, but highlights importance of team chat

“For individuals and organizations using team collaboration tools such as Slack, real-time communications have become ubiquitous and second nature to their work,” said Raúl Castañón-Martínez, a senior analyst at 451 Research. High expectations for service availability, he added, mean that every downtime incident will be perceived as a “serious disruption.” “On the positive side, this signals that Slack has been successful in permeating the enterprise, and team collaboration tools are rapidly becoming a core productivity tool, alongside email and calendar,” he said. Slack did not provide Computerworld with further details of the cause of the outage, or the number of users affected. A spokesperson said: “On June 27th between 6:33am and 9:49am PT Slack experienced an outage where people could not connect to their workspace. We worked as quickly as possible to bring Slack back to normal for everyone, and we are continuing to investigate the cause of this issue. We’re deeply sorry for the disruption. Please see the Slack Status page for the latest information.”


Online banks struggle to stay connected with younger mobile users

J.D. Power’s findings imply that those digital-only institutions aren’t just competing with other banks. Consumer use of other apps, like Uber or Seamless, is influencing customers’ expectations in the banking sphere as well. Direct banks earned an overall satisfaction score of 863 out of 1,000 points in J.D. Power’s latest ranking, compared with traditional retail banks’ overall score of 806. Direct banks also scored higher than their traditional counterparts for service across all other banking channels, including websites and call centers. On mobile banking, however, direct banks held the narrowest lead over traditional retail banks, earning a score of 864 compared with traditional banks’ score of 850. Last year, both direct and traditional banks scored 872 on satisfaction with mobile channels. J.D. Power also found that direct bank customers’ awareness and usage of various mobile banking features had declined year over year across every single feature the firm tracks, including bill pay and person-to-person payments.



Quote for the day:


"Learn to appreciate what you have, before time makes you appreciate what you had." -- Anonymous


Daily Tech Digest - June 28, 2018

A closer look at Google Duplex


Duplex was — and still is — very much a work in progress. Among other things, the system didn’t provide a disclosure in the early days, a fact that could potentially violate the “two-party consent” required to record phone calls and conversations in states like Connecticut, Florida, Illinois, Maryland, Massachusetts, Montana, New Hampshire, Pennsylvania, Washington and Google’s own home base of California. “The consent-to-record issues here go beyond just Duplex to the broader legal implications of machine speech,” said Gabe Rottman, director of the Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press. “If the service extends to all-party consent states or globally, you could see questions pop up like whether consent is valid if you don’t know the caller is a machine. Curveballs like that are just going to multiply the more we get into the uncanny valley where automated speech can pass as human.” Going forward, the system will be confined to those states where the laws make it feasible. That also applies to interstate calls, so long as both sides are covered.



10 Hottest Job Skills for Young IT Workers

Daniel Culbertson, an economist at job posting site Indeed.com, says those younger workers are more attracted to technology jobs than older workers are. In addition, when workers under 40 go looking for a job, they tend to click on very different postings than their older counterparts do. For organizations that are looking to expand their head count in a tight labor market, attracting these younger workers can be critical to remaining competitive. That means they need to craft job postings that will appeal to Millennials. The skills that attract attention from young job candidates can also serve as a sort of compass for where the technology industry is heading. Because technology changes so quickly, tech workers tend to look for jobs related to areas they believe will become more important in the future. Their interests can highlight trends that are likely to remain relevant for some time. Culbertson ran an analysis of job seeker behavior on Indeed.com and came up with a list of terms that appeared most often in the job postings clicked by people under 40.


Will artificial intelligence bring a new renaissance?


Artificial intelligence and robotics were initially thought to be a danger to blue-collar jobs, but that is changing: white-collar workers – such as lawyers and doctors – who carry out purely quantitative analytical processes are also becoming an endangered species. Some of their methods and procedures are increasingly being replicated and replaced by software. For instance, researchers at MIT's Computer Science and Artificial Intelligence Laboratory, Massachusetts General Hospital and Harvard Medical School developed a machine learning model to better detect cancer. They trained the model on 600 existing high-risk lesions, incorporating parameters like family history, demographics, and past biopsies. It was then tested on 335 lesions, and they found it could predict the status of a lesion with 97 per cent accuracy, ultimately enabling the researchers to upgrade those lesions to cancer. Traditional mammograms uncover suspicious lesions, whose findings are then tested with a needle biopsy. Abnormalities would undergo surgery, with around 90 per cent usually turning out to be benign, rendering the procedures unnecessary.
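The train-then-evaluate workflow described (600 lesions to train on, 335 held out for testing) can be sketched with a toy classifier on synthetic data. Everything here is invented for illustration: the features, the nearest-centroid model, and the resulting accuracy have no connection to the actual MIT model.

```python
# Sketch of a train/held-out-test evaluation, mirroring the 600/335 split
# described above, with a toy nearest-centroid classifier on synthetic data.
import random

random.seed(1)

def make_lesion():
    # Hypothetical numeric features (e.g. patient age, prior-biopsy count);
    # label 1 stands for a lesion later confirmed malignant.
    malignant = random.random() < 0.5
    if malignant:
        return [random.gauss(60, 8), random.gauss(3, 1)], 1
    return [random.gauss(45, 8), random.gauss(1, 1)], 0

train_rows = [make_lesion() for _ in range(600)]   # training set
test_rows = [make_lesion() for _ in range(335)]    # held-out evaluation set

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

c_benign = centroid([x for x, y in train_rows if y == 0])
c_malignant = centroid([x for x, y in train_rows if y == 1])

def predict(x):
    # Classify by squared distance to the nearer class centroid.
    d0 = sum((a - b) ** 2 for a, b in zip(x, c_benign))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c_malignant))
    return 1 if d1 < d0 else 0

accuracy = sum(predict(x) == y for x, y in test_rows) / len(test_rows)
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the sketch is the evaluation discipline, never scoring the model on the data it was trained on, rather than the (deliberately simple) classifier itself.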


How to secure cloud buckets for safer storage


Amazon Macie automatically discovers and classifies data stored inside Amazon S3 buckets using machine learning technology for natural language processing, and this might very well be the future. It is clear that human error cannot be reduced to zero, so putting near-real-time automated controls in place to contain the risks once such an error inevitably occurs is a good approach. Another option is to enable Amazon's Default Encryption feature, which will automatically encrypt any file placed inside a bucket. Other available features include Amazon's permission checks and alarms and the use of access control lists. It is also critical to monitor public access and API calls. Alerts should be set and actioned to cover the dumping of large numbers of files or of large files in general. A SIEM can assist in correlating the required security event data for these alerts via rules and set thresholds. Data breaches through cloud storage are a problem that will not go away. There are many reasons why this topic is still such an issue, but there are mitigation options, and there have been some promising developments in this space.
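As a sketch of what enabling Default Encryption looks like programmatically, the snippet below builds the configuration payload for S3's `put_bucket_encryption` API (a real boto3 operation). The function names and bucket name are placeholders, and the S3 client is passed in as a parameter so the payload can be inspected without AWS credentials.

```python
# Sketch: enable S3 Default Encryption so every object written to the
# bucket is encrypted server-side, even if the uploader forgets to ask.

def default_encryption_config(algorithm="AES256"):
    # "AES256" selects SSE-S3 managed keys; "aws:kms" selects SSE-KMS.
    return {
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": algorithm}}
        ]
    }

def enable_default_encryption(s3_client, bucket):
    # s3_client is expected to be a boto3 S3 client (not created here,
    # so this module stays importable without AWS access).
    s3_client.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration=default_encryption_config(),
    )
```

With boto3 installed and credentials configured, this would be invoked as `enable_default_encryption(boto3.client("s3"), "my-bucket")`.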


Robots Are Our Friends -- How Artificial Intelligence Is Leveling-Up Marketing

Robots Are Our Friends -- How Artificial Intelligence Is Leveling-Up Marketing
Not only is AI automating jobs we don’t want to do, it’s also opening the doors to jobs we can’t do. Since AI can process an infinitely larger dataset than a human can, it can leverage that scale to identify marketing insights that would otherwise be lost. Say you want to take the next step in that content-marketing data-collection project: you not only want to catalogue all of the “video marketing” content, but all of the content being published in your industry more broadly. Ultimately, you'll want to use this catalogue to drive market-informed content campaigns of your own. Identifying emerging topics or types of articles that garner above-average shares can help direct new content creation to align with existing trends. A given article could have many different qualities that contribute to its success; it’s AI’s ability to tag and compare many data points that ultimately produces the marketing takeaway. AI’s strength in turning a mass of data into insight truly shines in the noisiest, highest-volume channels that a marketer hopes to master.


Reduce breach risk and costs with security resilience

Reduce breach risk and costs with security resilience
Best effort is a familiar scenario for most IT shops: the security engineer, an executive, or another leader has said, “We need to install some level of security.” This typically involves implementing firewalls, basic security components, and maybe some basic auditing and monitoring. The next rung up the ladder is regulatory compliance. This is often an executive-level initiative: the thought is that business needs compel the company to be compliant with PCI, HIPAA, or some other standard. One might think this would make the security architecture more robust. Unfortunately, while compliance may be necessary for auditing purposes, it does not guarantee security. The third level is essentially the defensive approach — “I’m going to make this network so secure that no one is going to break into it.” This is when all those inline and out-of-band devices are deployed. You can even create defense-in-depth strategies for prevention. For instance, if someone gets through port 80 on the firewall, the next step is to challenge the traffic with DPI (deep packet inspection). There are other things you can do as well, like implement prevention, detection, and response processes.


Crossing the Big Data / Data Science Analytics Chasm


Becoming more effective at leveraging data and analytics is forcing organizations to move beyond the world of Business Intelligence (BI) to embrace the world of predictive and prescriptive analytics. Business Intelligence is about descriptive analytics: retrospective analysis that provides a rearview-mirror view of the business—reporting on what happened and what is currently happening. Predictive analytics is forward-looking analysis: providing future-looking insights on the business—predicting what is likely to happen and what one should do ... Unfortunately, at many of the companies with which I talk and teach, there is an “analytics chasm” that is hindering the transition from descriptive questions to predictive insights and prescriptive actions. This chasm is preventing organizations from fully exploiting the potential of data and analytics to power the organization’s business and operational models ... Forever in search of the technology “silver bullet” (the newest technology that magically solves the Analytics Chasm challenge), IT organizations continue to buy new technologies without a good understanding of what it takes to cross the Analytics Chasm.


How to get blockchains to talk to each other

A startup called Aion is developing what it calls a “token bridge” that will let holders of Ethereum-based tokens back up their assets on another blockchain—initially, one built and run by Aion—without duplicating the actual monetary supply, says Matthew Spoke, the company’s founder. The process relies on a group of computers, also called nodes, that have the ability to recognize valid transactions and write new ones to each chain, Spoke says. The nodes that form the bridge will also have a process for reaching agreement amongst themselves and deciding whether to respond to a certain transaction on one of the chains by executing a corresponding one on the other. Spoke says a big difference between the pre-internet days and the blockchain world is the money: today’s competing protocols are often backed by billions of dollars of investment. That will probably ensure that many will succeed, meaning the future will be ruled by numerous blockchains, he says, and interoperability will be key to mainstream adoption. Whatever we end up with, it probably won’t look like the internet—but it could be just as transformative.
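The lock-and-mint flow Spoke describes can be illustrated with a toy model: bridge nodes independently validate a transaction seen on the source chain, and only once a quorum of them agrees does the bridge execute the corresponding transaction on the target chain. The node names, the quorum size, and the trivial validity check below are all hypothetical simplifications, not Aion's actual protocol:

```python
# Toy model of a cross-chain "token bridge": a quorum of bridge nodes
# observes a transaction on one chain and, on agreement, executes a
# corresponding one on the other. Purely illustrative.

QUORUM = 2  # signatures required before the bridge acts (assumed value)

def observe_lock(tx, node_keys):
    """Each bridge node independently validates the transaction and signs it."""
    return [key for key in node_keys if tx["amount"] > 0]  # trivial validity check

def execute_bridge(tx, signatures, target_chain):
    """Credit the recipient on the target chain only once a quorum agrees."""
    if len(signatures) >= QUORUM:
        target_chain[tx["to"]] = target_chain.get(tx["to"], 0) + tx["amount"]
        return True
    return False

target_chain = {}  # balances on the second chain
lock_tx = {"to": "alice", "amount": 10}
sigs = observe_lock(lock_tx, ["node-a", "node-b", "node-c"])
executed = execute_bridge(lock_tx, sigs, target_chain)
print(executed, target_chain)
```

Because the bridge credits the target chain only when it can attribute a matching lock on the source chain, the total supply across both chains is not duplicated, which is the property Spoke emphasizes.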


Get ready for upcoming 6G wireless, too

“High frequencies, in the range of 100GHz to 1THz (terahertz),” will be used for 100Gbps 6G, the ComSenTer scientists from the University of California, Santa Barbara say in a release. The group created the ComSenTer center, which is part of the Semiconductor Research Corporation (SRC) at their school. For spectrum comparison, Verizon’s initial 5G millimeter-wave trials (along with Qualcomm and Novatel Wireless) that are taking place now will only go as far up the spectrum as 39GHz. “Our center is simply the next-, next-generation of communication and sensing,” says Ali Niknejad, ComSenTer associate director and a UC Berkeley professor, on SRC’s website. It’s “something that may become ‘6G.’” “Extreme densification of communications systems, enabling hundreds and even thousands of simultaneous wireless connections” will be part of it, the researchers claim, “with 10 to 1,000 times higher capacity than the nearer-term 5G systems and network.” Medical imaging, augmented reality and sensing for the Internet of Things (IoT) are some of the applications the scientists say will be enhanced by faster-than-5G radios.


The Big Data Question: To Share or Not To Share

(Image: Odua Images/Shutterstock)
"People are realizing that the data they have has some value, either for internal purposes or selling to a data partner, and that is leading to more awareness of how they can share data anonymously," Mike Flannagan of SAP told InformationWeek in an interview earlier this year. He said that different companies are at different levels of maturity in terms of how they think about their data. Even if you share data that has been anonymized in order to train an algorithm, the question remains whether you are giving away your competitive edge when you share your anonymized data assets. Organizations need to be careful. "Data is extremely valuable," said Ali Ghodsi, co-founder and CEO of Databricks (the big data platform that got its start offering hosted Spark) and an adjunct professor at the University of California, Berkeley. In Ghodsi's experience, organizations don't want to share their data, but they are willing to sell access to it. For instance, organizations might sell limited access to particular data sets for a finite period of time. Data aggregators are companies that create data sets to sell by scraping the web, Ghodsi said.



Quote for the day:


"A leader must have the courage to act against an expert's advice." -- James Callaghan