Daily Tech Digest - March 09, 2019

Misconceptions about the term RPA: would removing a letter from the acronym help?

Removing the ‘robotic’ term may help to alleviate fears of robots taking over; but according to Jon Clark, proposition development at ActiveOps, it is the word ‘process’ which is the problem. “A process can be very wide-ranging and complex, and the type of robots we are seeing automate ‘tasks’ within a ‘process’, so I think the ‘P’ in RPA is part of the problem, not the ‘R’. This is a subtle distinction but creates a challenge in terms of perception,” he says. The process of a credit card application, for example, is made up of a series of steps such as checking details and credit scores, updating systems, sending confirmation emails and instructing the card printer. “That’s important because people tend to hear ‘process automation’ and think the whole thing will be automated. Unfortunately, it’s not that simple because robots aren’t yet able to do every task in the process,” he states. However, many within the industry believe that the RPA term should remain, and that changing any of the words could cause more problems than it solves.
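The task-versus-process distinction above can be sketched in a few lines. This is purely illustrative (the task names and the notion of an "automatable" set are assumptions, not any vendor's API): a process is a series of tasks, and today's robots handle only some of them.

```python
# Hypothetical sketch: a "process" is a series of tasks, and RPA bots
# automate individual tasks, not the whole process.
AUTOMATABLE = {"check details", "update systems", "send confirmation email"}

def run_process(tasks):
    """Partition a process into robot-handled and human-handled tasks."""
    robot = [t for t in tasks if t in AUTOMATABLE]
    human = [t for t in tasks if t not in AUTOMATABLE]
    return robot, human

credit_card_application = [
    "check details", "check credit score", "update systems",
    "send confirmation email", "instruct card printer",
]
robot, human = run_process(credit_card_application)
print("Automated tasks:", robot)
print("Still manual:   ", human)
```

Even in this toy version, the "process" is only partially automated, which is exactly the perception gap Clark describes.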


Online voting: Now Estonia teaches the world a lesson in electronic elections

Voting online, or i-voting, as it is often called in Estonia, takes place during the advance voting period that runs from the 10th until the fourth day before the election. It is not possible to i-vote on election day. The voting process itself is fairly simple. The voter needs a computer with an internet connection and a national ID card or a mobile ID with valid certificates and PIN codes. Once the voting application is downloaded, the software automatically checks if the voter is eligible to cast a ballot and displays the list of candidates according to the region where the voter is registered. After voters make their decision, the application encrypts their vote and it is securely sent to the vote-collecting server. Every vote also receives a timestamp, so if necessary, it is possible to verify later whether the vote was forwarded to the collecting server. As i-voting doesn't take place in a controlled environment like a polling station, the authorities have to ensure that the vote has been freely cast. So, voters can change their choice during the advance voting period digitally or at a polling station, and then the last vote given is the one that counts.
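The "last vote counts" rule combined with timestamping can be sketched as follows. This is a loose illustration, not Estonia's actual protocol: the hashing stands in for real public-key encryption of the ballot, and all names are invented.

```python
# Hedged sketch of re-voting where the last ballot replaces earlier ones.
# sha256 here is a stand-in for proper ballot encryption.
import hashlib

ballots = {}  # voter_id -> (encrypted_ballot, timestamp)

def cast_vote(voter_id, choice, timestamp):
    encrypted = hashlib.sha256(f"{voter_id}:{choice}".encode()).hexdigest()
    # A later vote (digital or at a polling station) overwrites the earlier one.
    ballots[voter_id] = (encrypted, timestamp)

cast_vote("voter-1", "candidate A", "2019-02-21T10:00:00")
cast_vote("voter-1", "candidate B", "2019-02-23T18:30:00")  # changed their mind
print(ballots["voter-1"][1])  # only the later ballot remains
```

The timestamp kept with each ballot is what would let auditors verify later that a vote reached the collecting server.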


Triton is the world’s most murderous malware, and it’s spreading


The malware made it possible to take over these systems remotely. Had the intruders disabled or tampered with them, and then used other software to make equipment at the plant malfunction, the consequences could have been catastrophic. Fortunately, a flaw in the code gave the hackers away before they could do any harm. It triggered a response from a safety system in June 2017, which brought the plant to a halt. Then in August, several more systems were tripped, causing another shutdown. The first outage was mistakenly attributed to a mechanical glitch; after the second, the plant's owners called in investigators. The sleuths found the malware, which has since been dubbed “Triton” (or sometimes “Trisis”) for the Triconex safety controller model that it targeted, which is made by Schneider Electric, a French company. In a worst-case scenario, the rogue code could have led to the release of toxic hydrogen sulfide gas or caused explosions, putting lives at risk both at the facility and in the surrounding area. Gutmanis recalls that dealing with the malware at the petrochemical plant, which had been restarted after the second incident, was a nerve-racking experience.


Blockchain marches steadily into global financial transaction networks

SWIFT is among a groundswell of financial services firms testing blockchain as a more efficient and transparent way of conducting cross-border financial transactions, unhampered by much of the regulatory oversight to which current networks must adhere. SWIFT may also be feeling pressure as more and more firms in financial services pilot, or outright adopt, DLT technology. "There is a lot of competition now," said Avivah Litan, Gartner vice president of research. "If you think about SWIFT, it was just a big banking network that moved money quickly and authenticated users, but it costs a lot to do that. And now there are competing initiatives using blockchain." Litan pointed to J.P. Morgan Chase, CLS Group and Ripple, a permissioned blockchain ledger that moves money using a proprietary cryptocurrency, as prime examples of those developing blockchain for cross-border financial transfers. "Ripple is a competitor in the sense that they are trying to set up a bank-to-bank network," Litan said.


GDPR: Still Plenty of Lessons to Learn

During the RSA panel, security expert Ariel Silverstone reported that as of the end of January, there were 41,000 breaches reported under GDPR that fell within the 72-hour notification window. Additionally, there have been about 250 investigations by the various data protection authorities. Silverstone noted that while GDPR involves all 28 countries of the EU, variations in how each country is implementing the law mean companies could face different penalties. For instance, he said that Germany's interpretation of the law makes a violation nearly a criminal matter, while other nations have been reducing fines. Silverstone also pointed out that the California Consumer Privacy Act, which adheres to some of the same principles as GDPR, is offering some of the same consumer protections that Europeans now enjoy. Mark Weatherford, the global information security strategist at Booking Holdings, told the audience that while complying with the GDPR rules is difficult, it's not impossible. Before his current job, he worked at a startup that needed to come into compliance.



A Practical Intro to Kotlin Multiplatform

Kotlin has enjoyed an explosion in popularity ever since Google announced first-class support for the language on Android, and Spring Boot 2 offered Kotlin support. You’d be forgiven for thinking that Kotlin only runs on the JVM, but that’s no longer true. Kotlin Multiplatform is an experimental language feature that allows you to run Kotlin in JavaScript, iOS, and native desktop applications, to name but a few. And best of all, it’s possible to share code between all these targets, reducing the amount of time required for development. This blog post will explore the current state of Kotlin Multiplatform by building a simple app that runs on Android, iOS, Browser JS, Java Desktop, and Spring Boot. Maybe in a few years, Kotlin will be a popular choice on all these platforms as well. ... To share Kotlin code between platforms, we’ll create a common module that has a dependency on the Kotlin standard library. For each platform we want to support, we’ll need to create a separate module that depends on the common module and the appropriate Kotlin language dependency.
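The common-module idea typically rests on Kotlin's expect/actual mechanism: the shared module declares what it needs, and each platform module supplies its own implementation. The sketch below is illustrative (the function names are invented, and in a real project the `expect` and `actual` declarations live in separate source sets, not one file):

```kotlin
// In the common module, shared by every target:
expect fun platformName(): String

fun greeting(): String = "Hello from ${platformName()}"

// In each platform-specific module, e.g. the JVM target:
actual fun platformName(): String = "JVM"
```

Each target (JS, iOS, Android, and so on) provides its own `actual` implementation, while all the logic built on `greeting()` stays in the common module.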


How Daimler is using graph database technology in HR


For us, we could see advantages to using graph technology in HR projects because HR data is not isolated, so you don't normally have one person working without a connection to another person. If you look at a company, every time you look at the people working in the company you will see that they all have a connection to other people working in the company, you won't see anybody who is completely isolated. That is one of the reasons why we thought that HR data might be a very good fit with a graph data model. We have started with trying to understand what graph and HR data have in common. ... The second reason, and it's a concrete reason why we created this structured application, is that we created our Leadership 2020 programme at Daimler. We are transforming as a company from the classical, hierarchical structure to a mixture of classic hierarchies and what is called a 'swarm', which is a mix of people working on the same project but coming from different departments and different hierarchies.


Blockchain boosters warn that regulatory uncertainty is harming innovation

Businesses and consumers are reluctant to develop and use blockchain applications in the face of uncertainty over whether they might violate outdated financial laws, the Chamber of Digital Commerce argues in its “National Action Plan” (PDF). Among other things, it calls for “clearly articulated and binding statements from regulators regarding the application of law to blockchain-based applications and tokens.” On Wednesday at the DC Blockchain Summit, SEC commissioner Hester Peirce warned industry advocates to be careful what they wish for. Peirce called the action plan “helpful” and agreed that clear regulatory guidelines are needed. But she cautioned against expecting the government to try to foster innovation, which she said could do more harm than good. Peirce urged patience and cooperation. Regulators are slow, she said, and this technology is complicated: “There’s a learning curve. People at the SEC are trying to learn about this space, and trying to understand where the pressure points are.”


2 reasons a federated database isn’t such a slam-dunk

First, performance. You can certainly mix data from an object-based database, a relational database, and even unstructured data, using a centralized, virtualized, metadata-driven view. But your ability to run real-time queries on that data, in a reasonable amount of time, is another story. The dirty little secret about federated database systems (cloud or not) is that unless you’re willing to spend the time it takes to optimize the use of the virtual database, performance issues are likely to pop up that make the use of a federated database, well, useless. By the way, putting the federated database in the cloud won’t help you, even if you add more virtual storage and compute to try to brute-force the performance. The reason is that so much has to happen in the background just to get the data in place from many different database sources. These issues are typically fixed by working out a good federated database design, tuning the database, and placing limits on how many physical databases can be involved in a single pattern of access. I’ve found that the limit is typically four or five.
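The "limit the number of physical databases per access pattern" advice can be made concrete with a toy sketch. Everything here is hypothetical (the source lists, the cap, and the query function are invented, not any real federation engine), but it shows the shape of the design constraint:

```python
# Sketch of a federated view with a hard cap on sources per query,
# mirroring the "four or five" physical-database limit described above.
MAX_SOURCES = 5

def federated_query(sources, predicate):
    """Merge matching rows from several physical sources into one result."""
    if len(sources) > MAX_SOURCES:
        raise ValueError(
            f"query spans {len(sources)} sources; limit is {MAX_SOURCES}")
    return [row for source in sources for row in source if predicate(row)]

orders_db  = [{"id": 1, "total": 250}, {"id": 2, "total": 40}]
archive_db = [{"id": 3, "total": 900}]
result = federated_query([orders_db, archive_db], lambda r: r["total"] > 100)
print(result)  # rows from both sources, filtered centrally
```

A real engine would push the predicate down to each source rather than pull everything centrally, which is precisely where the tuning effort the author mentions goes.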


How to use process data mining to improve DevOps

Process mining is the data-driven improvement of business processes, and data scientists often use it to suggest ways to enhance performance. Process data mining works for companies and DevOps teams with processes in place, as well as those that still need to create processes. In the first case, people can compare the best practices for their process with what regularly happens within the team. But, individuals at the enterprise level can also use process data mining to establish their processes. Information sources such as event logs give details about how and when people use tools. Process data mining shows people how far away they are from the target of an ideal process, which can also mean it helps people solidify the processes a DevOps team follows. Then, it’s possible to know how to make the most meaningful process-related improvements and discover the things going wrong. ... Process data mining allows for real-time data collection. The companies that successfully use DevOps rely on release cycle metrics that tell them about progress and quality levels.
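The core comparison described above (observed event-log traces versus an ideal process) can be sketched with a plain sequence diff. The step names and the "ideal" sequence here are assumptions for illustration only:

```python
# Minimal conformance-check sketch: where does an observed event-log trace
# depart from the ideal DevOps process?
import difflib

IDEAL = ["commit", "build", "test", "review", "deploy"]

def deviations(trace):
    """Return the diff operations where the trace departs from the ideal."""
    diff = difflib.SequenceMatcher(None, IDEAL, trace)
    return [op for op in diff.get_opcodes() if op[0] != "equal"]

observed = ["commit", "build", "deploy"]  # tests and review were skipped
for tag, i1, i2, j1, j2 in deviations(observed):
    print(tag, IDEAL[i1:i2], "->", observed[j1:j2])
```

An empty result means the team followed the target process; anything else points at the most meaningful process-related improvement to make first.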



Quote for the day:


"Strong convictions precede great actions." -- James Freeman Clarke


Daily Tech Digest - March 08, 2019

What we see is a bigger and bigger push to not just protect data, but demands to protect identity. That was always an expected quantity inside of companies on our own infrastructures in our own data centers because we needed to protect data and assets of value. But now this is being extended to an expectation for customers that we are doing business with to have securitized access. And this is such a big leap. It's only come into being in the last couple of years with regulations around data protection and privacy, that we need to once again make sure that that customer is who they say they are, in order to be able to ensure the privacy of that data. And this is causing a tremendous disruption in the marketplace, if not from a solution standpoint, it is definitely causing a disruption relative to thinking about architecture, thinking about how security is designed. We were originally designed to protect assets and we have firewalls and perimeters.


What is Big Data and why does it matter for business?

The misuse and mishandling of personal data is currently a hot topic, thanks in large part to the scandal involving Facebook and Cambridge Analytica. Increased regulation around the storage and processing of data is highly likely – indeed, it is already underway in Europe in the form of the General Data Protection Regulation (GDPR), which came into force in May 2018. Many technology areas are reliant on large data sets and any restrictions on their ability to use them could have significant consequences for future growth. ... Within vertical markets such as retail, where a sale can be won or lost in a matter of moments, there is no other way to make the necessary rapid-fire decisions, such as which offer to display for a specific customer as he or she enters a store. These decisions cannot wait for such transient events to be uploaded to the company’s cloud, so cloud providers such as Microsoft are revamping their own platforms to push critical analytics functions, such as predictive artificial intelligence (AI) algorithms, downstream to devices.


Marriott CEO shares post-mortem on last year's hack


"As part of our investigation into the alert, we learned that the individual whose credentials were used had not actually made the query," Sorenson said. At that point, the Marriott staff realized they were dealing with a probable breach, although they didn't know if it was something big or just the beginning of a hack that could be very easily contained before the attackers accessed any user data. The company said it brought in third-party forensic investigators on September 10, to help its IT staff look into a possible breach. The forensic firm's rummaging uncovered malware on the Starwood IT systems less than a week later. "The investigators uncovered a Remote Access Trojan ('RAT'), a form of malware that allows an attacker to covertly access, surveil, and even gain control over a computer. I was notified of the ongoing investigation that day, and our Board was notified the following day," the CEO said. Uncovering the full scope of the attack took significant forensic work, the CEO said. 


Gartner on futurology and the year 2035

One definition, explained Frank Buytendijk, is ‘futurism is about postulating possible, probable and preferable futures in order to prepare for them.’ But that implies that our role is quite passive — we sit back and wait. He prefers futurology or futurism as ‘the art and science of being able to take responsibility for the long term consequences of actions and decisions today.’ That’s an important definition. It implies we have a responsibility — we can shape and mould the future into an image we might prefer. So he asks the question: “How can we be pragmatic futurists?” Part of the problem is that our view of what the future is can be distorted by the prism of the present. Maybe our futuristic view is framed by rose-tinted crystal balls. Maybe it is distorted by whatever is in fashion at any moment. When Gartner asked for a view of the year 2035 five years or so ago, privacy was an overriding theme. In its latest survey, privacy featured far less; AI was either mentioned directly or by implication.


How to improve Apache server security by limiting the information it reveals

If you administer the Apache web server, you know there are quite a lot of things you can do to help improve its security. For example, you could (and should) employ mod_security. You could also hide directory folders, run only necessary modules, limit large requests, restrict browsing to specific directories, and so much more. But there are two often-overlooked steps you can take to give your Apache server a bit more security: turning off the Apache signature and configuring ServerTokens. Why does this help? Simple. If you broadcast your server's specific information, you would be informing potential malicious actors what they're up against. They could learn what web server you're using, what version of the web server, the hosting platform, and even more. You don't want that information displayed for all to see. So, how do you obfuscate that information? There are two options to be configured, and I'm going to show you exactly how to set them, so as to hide your server details. ... 
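For reference, the two directives mentioned above look roughly like this in an Apache configuration file (the file path varies by distribution; this is a minimal sketch, not the article's full walkthrough):

```apache
# e.g. in httpd.conf or a conf-available/security.conf snippet
ServerSignature Off   # no version footer on server-generated pages
ServerTokens Prod     # response header reads only "Server: Apache"
```

With `ServerTokens Prod`, Apache stops advertising its version and OS in the `Server` header; `ServerSignature Off` removes the matching footer from error pages and directory listings. Reload Apache after the change for it to take effect.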


Where container infrastructure and management investments yield ROI


The ecosystem of infrastructure, services, tools and expertise listed above turns a simple workload isolation technology into a scalable production platform for multiple applications, batch jobs and microservices. To assess the return on investment for these Capex and Opex charges, review the capabilities each provides. ... Meta-management products appeal to organizations with production containerized application experience, whether on premises or via a cloud service, that now want to standardize on container infrastructure and possibly a PaaS development platform. Within this category of tools is a range of subcategories. Organizations can turn to infrastructure management suites, such as HashiCorp Terraform and Consul, Joyent Triton, Rancher and Mesosphere. Alternatively, PaaS offerings that do the job include Pivotal Cloud Foundry, Red Hat OpenShift and Atos powered by Apprenda.


How to determine if Wi-Fi 6 is right for you

There’s a lot of hype around the next Wi-Fi standard, 802.11ax, more commonly known as Wi-Fi 6. Often new technologies are built up by the vendors as being the “next big thing” and then flop because they don’t live up to expectations. In the case of Wi-Fi 6, however, the fervor is warranted because it is the first Wi-Fi standard that has been designed with the premise that Wi-Fi is the primary connection for devices rather than a network of convenience. Wi-Fi 6 is loaded with new features, such as Orthogonal Frequency Division Multiple Access (OFDMA), 1024-QAM (quadrature amplitude modulation) encoding and target wake time (TWT), that make Wi-Fi faster and less congested. Many of these enhancements came from the world of LTE and 4G, which solved many of these challenges long ago. These new features will lead to a better mobile experience and longer client battery life, and they will open the door to a wide range of new applications that could not have been done on Wi-Fi before. For example, an architect could now use virtual reality (VR) over Wi-Fi to showcase a house.
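A quick back-of-the-envelope check shows where part of the 1024-QAM speedup comes from: each symbol carries log2(M) bits, so moving from Wi-Fi 5's 256-QAM to 1024-QAM raises the per-symbol payload by 25%. (This is only the raw modulation gain; real throughput also depends on OFDMA scheduling, channel width, and conditions.)

```python
# Raw bits-per-symbol gain from 256-QAM (Wi-Fi 5) to 1024-QAM (Wi-Fi 6).
from math import log2

def bits_per_symbol(m):
    """A QAM constellation of size m encodes log2(m) bits per symbol."""
    return int(log2(m))

old, new = bits_per_symbol(256), bits_per_symbol(1024)
print(f"256-QAM: {old} bits/symbol, 1024-QAM: {new} bits/symbol")
print(f"raw modulation gain: {(new - old) / old:.0%}")  # 25%
```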


It's just a graph, making gravitational waves in the real world

The combination of JSON-LD and schema.org has probably done more to spread the use of RDF than anything else. Just getting Google and other search engines to adopt it has led to an array of use cases. And yet, JSON-LD was hugely controversial in its time in the RDF community. This was not the last controversy the RDF community faced, but it seems like JSON-LD's success may have had something to teach. But we'll get back to that shortly. Property graphs have been around for about 10 years, and have been driven by the industry. As such, you could say they are a reversed mirror image of RDF: Pragmatism rules, tooling is abundant and easy to use, outreach and community building are a top priority, but standardization only came as an afterthought at this point. Most property graph solutions do not have a schema, or have a very basic schema. Just getting data in and out of property graph solutions is an exercise in patience and improvisation -- good luck representing a graph structure in CSV, and mapping that from solution to solution.
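To make the JSON-LD + schema.org combination concrete: it is just ordinary JSON with an `@context` that lets search engines read it as RDF. The values below are invented for illustration; only the `@context`, `@type`, and schema.org property names follow the actual convention.

```python
# A minimal JSON-LD document using the schema.org vocabulary.
import json

doc = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "It's just a graph",
    "author": {"@type": "Person", "name": "Jane Doe"},  # hypothetical author
}
print(json.dumps(doc, indent=2))
```

Because it stays valid JSON, web developers can embed it in a page without touching RDF tooling, which is arguably why it succeeded where earlier RDF serializations struggled.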


Extracting value from data: how to do it and the obstacles to overcome

The most significant obstacle for information sharing exchanges is whether the law or regulation will allow it. According to the survey, 33% of respondents said they would be unable to adjust to new regulations effectively for data protection and privacy. With certain business models this information sharing would not be easily achievable — under GDPR or CCPA (California Consumer Privacy Act) — unless with the explicit consent of the consumer. ... “A lot of this has to do with how companies are organised: 31% said we are organisationally siloed — the data that belongs to one business unit is locked up in that business unit, it is not shared with other business units — so they’re not getting the full value of their data, just because of the structure,” he says. Another interesting result from the survey was that 30% lack the data scientists or analytical talent, who would have the capabilities to better exploit the data. “So, there’s definitely a talent shortage leaving money on the table for companies,” confirms Cline.


How blockchain will manage networks

Smart Packet Contracts would protect the network from intrusions and so on, and a “Marconi Pipe” would be the channel. It provides the routing and processing. While it’s actually at the data-link OSI Layer 2 (switches, bridges in terms of hardware; and MAC and Ethernet in terms of protocols), it can also overlay on other infrastructure, such as wireless. A barter system, where network resources can be traded for compute resources, say, rounds out the concept. Monetization could indeed be introduced. Another angle is securing the multiple cloud-based systems running in enterprise. It’s a “challenge to make sure [multi-cloud] communication is secure and safe from attacks such as eavesdropping or ‘man in the middle,'" Jong Kim, chief architect of Marconi Foundation and Network World contributor, said in a VentureBeat article in January. “A common network where each connection point securely peers with every other point, regardless of cloud provider or container instance” could be provided with an Ethernet-layer blockchain.



Quote for the day:


"Challenges in life always seek leaders and leaders seek challenges." -- Wayde Goodall


Daily Tech Digest - March 07, 2019

Why Wi-Fi needs artificial intelligence
Over time, I expect AI to lead to fully autonomous networks where the AI runs the wired and wireless network. However, I don’t expect businesses to embrace the concept of a “self-driving network” immediately. Instead, the initial wave of AI as a network management tool will be to assist the engineer by providing recommendations coupled with automation of basic tasks, including troubleshooting and problem avoidance. Engineers shouldn’t fear AI or worry about the technology replacing them. Instead, they should look at it as their best friend because it will free up huge amounts of time, as much of the heavy lifting will be done by machines. The access edge, particularly the wireless network, is growing in importance. But at the same time, it is being pushed to do more because more devices are connecting to it, resulting in orders of magnitude more data traversing the network. Manual operational methods have never worked and certainly will not work in a hyper-connected world. AI-based systems are becoming mandatory to keep the performance of Wi-Fi high and to shed the reputation that flaky Wi-Fi is the norm.



5 trends driving the design of next-generation data centers


The efficiency of data centers is both an environmental concern and a large-scale economic issue for operators. Enterprises in diverse industries from automotive design to financial forecasting are implementing and relying on machine-learning in their applications, which results in more expensive and high-temperature data center infrastructure. It’s widely known that power and cooling represent the biggest costs that data center owners have to contend with, but new technologies are emerging to combat this threat. ... One of the most successful technologies that data center operators have put into practice to improve efficiency is monitoring software that implements the critical advances made in machine learning and artificial intelligence. Machines are much more capable of reading and predicting the needs of data centers second to second than their human counterparts, and with their assistance operators can manipulate cooling solutions and power usage in order to dramatically increase energy efficiency.



“When you are in a disaster recovery situation, you do not want the new person trying out the wings,” says Bruce Beam, chief information officer at (ISC)². Unfortunately, the number of cyber security positions outweighs the number of available cyber security professionals. The demand for cyber security professionals has outpaced supply in recent years, due to emerging threats and organisations increasing the amount of business they conduct online. According to a study, the number of organisations that reported shortages in the cyber security skills of their staff has increased over the past four years. In 2014, approximately 23% of organisations indicated this was a challenge, but this has now risen to more than 50%. Much of this rise has been due to the increasing workload of cyber security teams. Continuing professional development (CPD) has been used to ensure that skills remain relevant. 


Open Source Benefits to Innovation and Organizational Agility

To understand how organizations use open source today, Andrew Aitken presented the state of open source in the context of its evolution from the founders until today. Aitken identified four generations. Generation one, initiated in the early 70s, is represented by the evangelists and thought leaders who founded the open source movement, Richard Stallman, Linus Torvalds, Eric Raymond, etc. Their purpose was to make software free to allow anybody to contribute to their improvement. Generation two consists of influencers, such as Marc Fleury, Marten Mickos, Larry Augustin, who began to think about how to commercialize open source and launched the first few commercial open source companies. Generation three of open source started with the proliferation of the internet and the vast amount of data that became available to organizations. Dotcoms created new technologies to manage data and started open-sourcing their software. 


"If the insurer knows our drivers are always driving well on safer routes, then we might be able to bring down our premium," says Gifford. "So, there's opportunities like that when it comes to using blockchain — and that's just an example. But success in blockchain is all about getting partners on board." Gifford says effective partnerships are critical to Wincanton's broader development efforts. The firm launched an innovation programme called W² Labs last March, which gets startups to develop innovative solutions to the firm's challenges. Wincanton also uses its internal development team and works with external consultants, such as IBM and PA Consulting. The broader aims of these combined efforts is to produce what Gifford refers to as the Internet of Transport. These developments focus on three key areas. First, Winsight, an app that enables a paperless cab, so all the paper lorry drivers normally carry, such as routes and proof of delivery, is wrapped up into a single piece of software on a smart device.


"DevOps Institute is thrilled to share the research findings that will help businesses and the IT community understand the requisite skills IT practitioners need to meet the growing demand for T-shaped professionals," said Jayne Groll, CEO of DevOps Institute. "By identifying skill sets needed to advance the human side of DevOps, we can nurture the development of the T-shaped professional that is being driven by the requirement for speed, agility and quality software from the business." Automation, process, and soft skills were the top three most important skills categories, according to the report. Soft skills—including collaboration and cooperation, problem-solving, interpersonal skills, and sharing and knowledge transfer—are equally important as technical skills to DevOps practitioners, highlighting the need for well-rounded candidates in this field. "The reality of the DevOps world is one that is frequently changing," Erin Lovern, director of global talent acquisition at CloudBees, said in the report.


IoT Expands the Botnet Universe

Botnets composed of vulnerable IoT devices, combined with widely available DDoS-as-a-Service tools and anonymous payment mechanisms, have pushed denial-of-service attacks to record-breaking volumes. At the same time, new domains such as cryptomining and credentials theft offer more opportunities for hacktivism. ... A new piece of malware that takes advantage of Android-based devices exposing debug capabilities to the internet. It leverages scanning code from Mirai. When a remote host exposes its Android Debug Bridge (ADB) control port, any Android emulator on the internet has full install, start, reboot and root shell access without authentication. Part of the malware includes Monero cryptocurrency miners (xmrig binaries), which are executing on the infected devices. Radware’s automated trend analysis algorithms detected a significant increase in activity against port 5555, both in the number of hits and in the number of distinct IPs.
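The two trend signals mentioned at the end (hits per port and distinct source IPs per port) are simple to compute from scan logs. This is a generic sketch with invented log data, not Radware's actual algorithm:

```python
# Count hits and distinct source IPs per destination port from scan events.
from collections import defaultdict

def port_activity(events):
    """events: iterable of (src_ip, dst_port) tuples.
    Returns {port: (hit_count, distinct_ip_count)}."""
    hits = defaultdict(int)
    ips = defaultdict(set)
    for src_ip, dst_port in events:
        hits[dst_port] += 1
        ips[dst_port].add(src_ip)
    return {port: (hits[port], len(ips[port])) for port in hits}

events = [("10.0.0.1", 5555), ("10.0.0.2", 5555),
          ("10.0.0.1", 5555), ("10.0.0.3", 22)]
print(port_activity(events))  # port 5555: 3 hits from 2 distinct IPs
```

A spike in both numbers for port 5555 (the default ADB TCP port) is the pattern that flagged this campaign.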


Clearer North Korean link to global infrastructure malware campaign


The researchers were able to get a rare look at the workings of a nation state cyber espionage campaign after being handed a command-and-control server for the campaign by one of the governments targeted. This provided an opportunity to conduct a detailed analysis of code and data from the server responsible for the management of the operations, tools and tradecraft behind the campaign, previously thought to have run from October to November 2018. The analysis led to the identification of several previously unknown command-and-control centres and indicates that Sharpshooter began as early as September 2017, targeted a broader set of organisations in more industries and countries, and that it is currently ongoing. “McAfee Advanced Threat Research analysis of the command-and-control server’s code and data provides greater insight into how the perpetrators behind Sharpshooter developed and configured control infrastructure, how they distributed the malware, and how they stealthily tested campaigns prior to launch,” said Raj Samani.


Cisco uncorks 26 security patches for switches, firewalls

While the 26 alerts describe vulnerabilities that have a Security Impact Rating of “High,” most of them (23) affect Cisco NX-OS software, and the remaining three involve both software packages. The vulnerabilities span a number of problems that would let an attacker gain unauthorized access, gain elevated privileges, execute arbitrary commands, escape the restricted shell, bypass the system image verification checks or cause denial of service (DoS) conditions, Cisco said. It has released software fixes for all the vulnerabilities, and none of the problems affect Cisco IOS software or Cisco IOS XE software, the company said. Information about which Cisco FXOS Software and Cisco NX-OS Software releases are vulnerable and what to do about it is available in the fixed software section of the advisory. ... A couple of vulnerabilities in the Nexus software could let attackers gain elevated privileges on the switches and execute nefarious commands. The first weakness is due to an incorrect authorization check of user accounts and their associated group ID, Cisco wrote.


Artificial intelligence and cybersecurity: Attacking and defending

Social engineering remains one of the most common attack vectors. How often is malware introduced into systems when someone just clicks on an innocent-looking link? The fact is, to entice the victim to click on that link, quite a bit of effort is required. Historically, it’s been labor-intensive to craft a believable phishing email. Days and sometimes weeks of research, and the right opportunity, were required to successfully carry out such an attack. Things are changing with the advent of AI in cyberattacks. Analyzing large data sets helps attackers prioritize their victims based on online behavior and estimated wealth. Predictive models can go further and determine willingness to pay a ransom based on historical data, and even adjust the size of the pay-out to maximize the chances and, therefore, revenue for cybercriminals. Imagine that all the data available in the public domain, as well as secrets previously leaked through various data breaches, is combined for ultimate victim profiling in a matter of seconds with no human effort.



Quote for the day:


"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis


Daily Tech Digest - March 04, 2019


IBM said its Q System One, which has a 20-qubit processor, produced a Quantum Volume of 16, double the current IBM Q, which has a Quantum Volume of 8. IBM also said the Q System One has some of the lowest error rates IBM has measured. That progress is notable, but practical broad use cases are still years away. IBM said Quantum Volume would need to double every year to reach Quantum Advantage within the next decade. Faster progress on Quantum Advantage would speed up that timeline. IBM has doubled the power of its quantum computers annually since 2017. Once Quantum Advantage is hit, there would be new applications, more of an ecosystem and real business use cases. Consumption of quantum computing would still likely be delivered via cloud computing since the technology has some unique characteristics that make a traditional data center look easy. IBM made its quantum computing technology available in 2016 via a cloud service and is working with partners to find business and science use cases.
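IBM's decade estimate follows from simple compounding under its stated trajectory of doubling Quantum Volume every year. A small sketch of that arithmetic (the Quantum Advantage target of 16,384 below is purely an illustrative assumption, not a figure from the article):

```python
def years_to_reach(start_qv, target_qv, growth=2):
    """Years of steady annual growth (doubling by default) until
    Quantum Volume first reaches or exceeds a target."""
    years, qv = 0, start_qv
    while qv < target_qv:
        qv *= growth
        years += 1
    return years

# Starting from the Q System One's Quantum Volume of 16,
# ten annual doublings reach 16 * 2**10 = 16384.
print(years_to_reach(16, 16384))
```

Faster-than-doubling progress shortens the timeline accordingly, e.g. `growth=4` halves the number of years.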



Another Bitcoin Indicator Signals Price Bottom May Be Forming


Essentially, the MFI validates or confirms price trends. Many times, however, the indicator diverges from the prevailing market trend. For instance, BTC dashed hopes of a long-term bullish reversal with a break below $6,000 on Nov. 14 and hit a 15-month low of $3,122 on Dec. 15. The 14-week MFI also nosedived from the high of 43.00 in mid-November, confirming the sell-off in prices. The indicator, however, bottomed out with a higher low at 22.00, contradicting the lower low in bitcoin’s price. That bullish divergence is widely considered an early warning of a bearish-to-bullish trend reversal. Supporting that argument is the fact BTC snapped its record six-month losing streak with a 10 percent gain in February and the MFI rose from 25 to 44. Other indicators like the moving average convergence divergence (MACD) and the bearish crossover of the 50- and 100-week moving average are also signaling long-term bearish exhaustion. These tools, however, don’t incorporate trading volumes. The MFI, therefore, stands out as a more reliable technical tool.
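For readers unfamiliar with the indicator, the MFI is essentially a volume-weighted momentum oscillator: over a lookback window (14 weeks in the analysis above) it compares money flowing in on up-periods against money flowing out on down-periods. A minimal sketch of the standard formula (the function name and sample inputs are illustrative, not from the article):

```python
def money_flow_index(highs, lows, closes, volumes, period=14):
    """Money Flow Index: a volume-weighted oscillator bounded in [0, 100]."""
    typical = [(h + l + c) / 3 for h, l, c in zip(highs, lows, closes)]
    raw_flow = [tp * v for tp, v in zip(typical, volumes)]
    out = []
    for i in range(period, len(typical)):
        window = range(i - period + 1, i + 1)
        # Flow counts as positive when the typical price rose vs the prior bar
        pos = sum(raw_flow[j] for j in window if typical[j] > typical[j - 1])
        neg = sum(raw_flow[j] for j in window if typical[j] < typical[j - 1])
        # Money ratio -> index; an all-up window pins the MFI at 100
        out.append(100.0 if neg == 0 else 100 - 100 / (1 + pos / neg))
    return out
```

Because volume weights each period's contribution, the MFI can bottom out above a prior low even as price makes a lower low, which is exactly the bullish divergence described above.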



Dangerous gaps in cybersecurity investments


Historically, many companies have underfunded employee awareness education and training, but that tide largely turned as it became clear that employees with poor security practices were the source of many cyberbreaches. Even so, far too many companies still fail to educate their entire employee base, or to test employee awareness and practices on an ongoing basis. Cybersecurity insurance is a relative newcomer to the security budget mix. Companies have learned that – no matter their defenses – they face high odds of becoming cyberattack victims at some point. Given this awareness, insurance policies are almost certain to capture a growing percentage of the overall cybersecurity budget. However, insurance should be treated as a complement to strong security technology, staffing and education, not as an alternative to them. When making your cybersecurity investments, it’s critical that you direct the funds in a balanced way that addresses all of these security areas. Each one plays a critical role in building comprehensive defenses, and underfunding any of them could prove to be an extremely dangerous and costly error.



No Avoiding the Inevitable: The Time for Cyber Security Analytics is Now

“Most organizations understand security analytics as an elusive cluster of different technologies encompassing ‘a little bit of everything’,” said Pavlakis. “While on a top level they are somewhat correct in that respect, they, unfortunately, opt to pick whatever makes sense budget-wise.” Regardless of how organizations approach the security analytics marketplace, approach they will. For example, Gartner's 2019 CIO Agenda Survey found that analytics and cyber security top this year's priority lists among CIOs in the government sector. In analyzing Gartner's findings, Security Boulevard's Filip Truta suggested that government is actually a latecomer to this realization, and that other industries are already hip to the power of cyber security analytics. "High-profile data breaches have highlighted cybersecurity analytics as a formidable weapon against sophisticated attacks and advanced threats that elude prevention mechanisms at endpoint level," wrote Truta.


Huawei Denies Then Plays The Blame Game Over Cybersecurity Vulnerabilities


Claims and subsequent action by the United States and other countries have put Huawei, Supermicro, and ZTE under a negative spotlight, and the effects have been damaging from a revenue, brand, and loyalty perspective. Although the UK's National Cyber Security Centre (NCSC) deemed Huawei a "manageable risk," these companies will be challenged to regain their credibility and reputations in the security industry. Although it is nearly impossible to prove the claims against each company, it does force every equipment vendor to determine which side of the fence they are on, and it perhaps incentivizes the industry to make meaningful long-term changes and safeguards—especially as 5G becomes a reality. While these companies are on their heels, rivals like Cisco, Ericsson, Nokia, etc. have a healthy competitive opportunity to grow market share. However, as the saying goes, "what goes around, comes around": it will be easier for the industry to take care of itself before clueless bureaucrats and politicians do it for them.


Fintech in Sub-Saharan Africa: A Potential Game Changer


Sub-Saharan Africa is the only region in the world where close to 10 percent of GDP in transactions occur through mobile money. This compares with just 7 percent of GDP in Asia and less than 2 percent of GDP in other regions. Most African users now rely on mobile payments to send and receive money domestically. Increasingly, they are taking advantage of new services to also send and receive money internationally. In addition, they use mobile money to pay their bills, receive their wages, and pay for goods and services. Innovation is allowing Africans to move up the “financial services value chain.” From mobile payments, customers in sub-Saharan Africa are gaining access to mobile banking and other services as they open saving accounts, take out loans, purchase insurance, and invest in Government securities or in stock markets with a few touches of their mobile phone. They can even “borrow” electricity and pay later instead of sitting in the dark. New innovations in fintech are proceeding rapidly. New technologies are being developed and implemented on the continent, and they have the potential to yield significant benefits for Africa.


This coworking space is like a horse trailer, but for humans


No one looks forward to a day at the office–no matter how much free cold brew is on tap. We all dream of that digital nomad life, kicking up our feet at the beach while knocking out a day of emails. Mojitos optional. So I very much understand what the South African shared workspace company Work & Co (not to be confused with the New York digital design agency Work & Co) was thinking when it developed the Nova workspace–an office on wheels, which you can rent for $250 a day, and in exchange, the company will tow it to a uniquely beautiful location. I just didn’t imagine that the office would look like this: a horse trailer, but for humans. I mean, don’t get me wrong, there’s effort here! You have velvet upholstery (seating for six!), a hip little wallpapered corner, and plenty of windows for panoramic views of the scenery. You have coffee, shade, and a bathroom–fulfilling the three core components of Maslow’s Hierarchy of Needs When Working Remote. What more could you want?


How to prepare employees for AI's impact on the workforce

"The growth of artificial intelligence and emerging technologies (ET) is poised to reshape the workforce. While the exact impact of AI and ET is unclear, experts expect that many jobs currently performed by humans will be performed by robots in the near future, and at the same time, new jobs will be created as technology advances," said Elizabeth Mann Levesque of the Brookings Institution. Companies can ease employees' concerns about AI adoption by taking these two steps. In my career I've experienced many company reorganizations. The format usually consists of a consultant visit, the vice president explaining that the department is being assessed to maximize workflows and that it will benefit everyone—and then everyone goes back to their desks wondering if they will be laid off. As a junior staff member, I was a lay-off victim in my very first IT job. I did documentation that the consultant deemed "non-essential." Years later, I still recall the trauma of it. It wasn't getting laid off that hurt. It was going to work and not knowing what would happen next.


In cybersecurity, it’s AI vs. AI: Will the good guys or the bad guys win?

For all its promise, there are areas in which AI adds little value or may even create new vulnerabilities. Machine and deep learning work best when the problem domain is well-known, and the variables don’t change very much. Algorithms are good at detecting variations in patterns but not at identifying new ones. “To say you’re going to find the unknown is really tough,” said Tom Clare, senior product marketing manager at Fidelis Cybersecurity Inc., which specializes in threat detection and response. Changing variables can flummox machine learning algorithms, which is one reason they have so far had limited value in combating malware, the incidence of which has risen fivefold since 2013, according to SafetyDetective.com. Machine learning algorithms “inherently fail because the training set of malware changes too quickly,” said Doug Swanson, chief technology officer at Malwarebytes Corp. “The malware your model will see in the future will end up looking little to nothing like the malware it has seen, and been trained on, in the past.”


The Open Source Approach to Accelerating Digital Transformation

It’s worth noting that many of the advances and service innovations found in hyperscale cloud companies are actually achieved by leveraging the thousands of developers in the open source community, and OpenStack provides a compatible platform for taking advantage of these and even more advanced developments. The latest advances in distributed databases, containers, Kubernetes automation and scaling, platform as a service (PaaS), artificial intelligence, machine learning, Internet-of-Things and 5G networks are all available on OpenStack – sometimes long before the proprietary cloud vendors can develop systems to exploit these new technologies and make them available across all geographies. As customers embrace hybrid environments, the same technology that can be offered online can also be implemented in their own data centers. Many companies choose to put their variable workloads in the cloud, while keeping production on-site.



Quote for the day:


"It takes an influential leader to excellently raise up leaders of influence." -- Anyaele Sam Chiyson


Daily Tech Digest - March 03, 2019


You need to methodically identify the possible risks that could face your start-up. You might want to think outside the box for this one, because anything could, and will, happen, so you had better have an answer for every risk you have identified. Next, assess the likelihood that each risk will happen and understand how to respond. You might want to rank them from most likely to almost impossible. Unless you live in Tornado Alley you may consider a natural disaster a long shot. Arlene Dickinson, the Canadian Dragon’s Den maven, may disagree. She went to work one day after a brutal storm to find that the basement of her tony Toronto office had water damage. No, this wasn’t a leaky roof; the entire basement, where she stored client records, computers, taxes and more, was filled to the ceiling with smelly, dirty water. Everything was ruined, but she did have insurance. Once you see the types of risks you face, you will have to come up with your own ways to mitigate them.


No single central entity stores and processes payments or manages the admission of new units into the database, thus ensuring freedom of access. This is the key difference between DAG and its predecessors. In centralized systems, only one party was allowed to add transactions to the ledger, while in blockchains, only a select few - the miners - are allowed to do it. In DAG, everybody is allowed to write to the ledger. DAG also improves speed and throughput: instead of having one single chain of blocks, data can be added to any number of parallel interconnected “lanes.” One might think it would be challenging to keep this together while everybody is allowed to write to the ledger at the same time, and that is right. This is what consensus algorithms are for, and it is currently an area of active research in which some of the brightest minds are involved. Simplified, the intuition behind Obyte’s consensus algorithm is as follows: when a user adds a new transaction, it is placed on the ledger together with the addresses of twelve witnesses.
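The "parallel lanes" idea can be sketched with a toy ledger in which every new unit references the current tips of the graph. This is a deliberately simplified illustration, not Obyte's actual data structure or consensus rules:

```python
import hashlib

class DagLedger:
    """Toy DAG ledger: each new unit references the current tips, so
    concurrent writers extend parallel 'lanes' instead of one chain."""

    def __init__(self):
        self.units = {}    # unit_id -> (payload, parent_ids)
        self.tips = set()  # units not yet referenced by any later unit

    def add(self, payload, parents=None):
        # A writer references the tips it currently sees (or explicit parents)
        parents = tuple(sorted(parents if parents is not None else self.tips))
        uid = hashlib.sha256(repr((payload, parents)).encode()).hexdigest()[:12]
        self.units[uid] = (payload, parents)
        self.tips -= set(parents)  # referenced units stop being tips
        self.tips.add(uid)
        return uid
```

Two writers who both see the same tip create two parallel units; a later unit that references both tips stitches the lanes back together, which is where a consensus rule (such as witness-based ordering) has to decide on a single history.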


Dongle Danger: Operating Systems Don't Defend Memory
The weaknesses, collectively called Thunderclap, highlight a new class of threats posed by malicious peripherals. The research has been in the works since 2016, and Apple is one of several vendors that have issued software updates as a result. The work focused on the Thunderbolt 3 data transfer standard over USB Type-C connectors. Although operating systems are supposed to only allow a peripheral to have direct memory access for the resources it needs, researchers found that this defense isn't implemented effectively to prevent data theft. The research also covered PCI Express, or PCIe, an older set of device connection and data transfer protocols. Stealing data this way would require physical access to a device. "The combination of power, video and peripheral-device DMA over Thunderbolt 3 ports facilitates the creation of malicious charging stations or displays that function correctly but simultaneously take control of connected machines," the researchers write in a blog post.



The approach has the potential not just to diversify tech but to help “techify” everything else, said Megan Smith, former CTO for the Obama administration: “We could really work on ... the hardest problems together in this collaborative way.” Faculty at the new college will work with other MIT departments to cross-pollinate ideas. Classes will also be designed so that technical skills, social sciences, and the humanities are bound up together within each course rather than learned separately. “It’s not just thinking about how you learn computation,” Melissa Nobles, the dean of MIT’s School of Humanities, Arts, and Social Sciences, told MIT Technology Review after the main-stage event, “but it’s also students having an awareness of the larger political, social context in which we’re all living.” This has also been my driving mission with MIT Technology Review’s AI newsletter, The Algorithm: to dismantle our outdated notions that technology is for the tech people and social problems are for the humanities people; that there is such a thing as a “math person,” which is certainly not the same as a “people person.”



Psychologists and researchers have developed a systematic approach for discovering a sustainable solution to any problem. This technique, commonly referred to as the problem-solving cycle, starts with identifying the problem. After all, there could be multiple issues within one situation, and you could be focusing on the wrong one. Separate the symptoms from the cause. After defining the problem, form a strategy. This will vary depending on the situation and your preferences, but develop wide-ranging ideas while taking into consideration your resources. Are the solutions feasible? Come up with multiple ideas to have options. Organize your information: What do you know -- or not know -- about the problem? By collecting as much information as possible, you increase your chances of achieving a positive outcome. Once you settle on a solution, monitor its progress. The solution you developed should be measurable so you can assess whether it's reaching its destination. If not, you may need to implement an alternative strategy.


Kimball noted that when they started the company, they weren't yet sure where CockroachDB would fit into the ecosystem, or what kinds of companies would be willing and able to move to a new RDBMS. He went on to add, however, that in 2018 they began to answer those questions and ended with an impressive first year of revenue:  "It turns out that much of the Fortune 2000 is struggling with often board-level mandates to embrace the benefits of the public cloud. That modernization process opens the door to consideration of alternatives to Oracle, especially databases better suited to exploiting the opportunities inherent in the cloud.  Where CockroachDB has a big strategic advantage over the likes of AWS Aurora or Google Cloud Spanner is that we offer a bridge from the reality of existing on-premise deployments to the desired outcome of using the public cloud wherever it makes sense. CockroachDB can be run on-premise, hybrid, and across arbitrary cloud vendors."


“First, data is a rich source of insights and discovery about any domain, to understand deeper the things that we already know about the domain and to discover new things that we did not know about it. Second, data is the fuel (the essential input) for interesting algorithms and models that can be used to help predict the future, to optimize outcomes, to reveal emerging trends, and to detect anomalies, sometimes before they happen. Third, data is sensory input to our natural human activities of pattern detection and pattern recognition that become the basis for nearly all human decisions and actions as we move forward through our world. Fourth, data are measurements that encode knowledge – as such, data delivers a wonderful very human challenge to us to decode that knowledge and consequently to become smarter and wiser about people, processes, events, and all things. Finally, data ignites innovation, transformation, and value creation in all organizations and businesses through pattern exploration and pattern exploitation within the digital signals that flow all around us.”


Dow Jones Data Exposed on Public Server
Bob Diachenko, an independent security researcher, discovered that an Amazon Web Services-hosted Elasticsearch database exposed the records, TechCrunch first reported. The exposed data, which has since been secured, is Dow Jones' Watchlist database, which companies use as part of their risk and compliance efforts. Dow Jones says in a statement that "an authorized third party" was to blame for the exposure, but it did not name the company. Dow Jones declined to provide further details on the incident. Security researchers say the incident highlights the need for adequate vendor risk management. A recent Verizon report found that one of every two data breaches stems from third-party risks. Too many organizations focus on protecting their own IT infrastructure, ignoring the security of data handed over to a third party, security experts say. "This becomes a major issue because you are as vulnerable as your vendor managing your data," says Edwin Lim, director of professional services - APJ, at Trustwave, a Singtel company.



By creating a seemingly innocent application that holds a malicious exploit script, potential attackers can dupe users when the app asks for permission to access the external storage. A typical user is likely to approve the request, enabling the attacker to manipulate the data written on that storage and used by multiple applications. App development guidelines urge developers not to have their apps store sensitive code in the external storage, though our researchers found that many apps, including Google Translate, did not heed this advice. However, while security-related guidelines are great, frankly, it’s naïve to expect every developer in the world to have security top of mind when they write their code, let alone to have enough expertise to get it right. Google patched their applications that were affected by this particular vulnerability, as responsible companies do, but it goes to show that identifying just one entry point is enough to keep attackers in business.



The only thing constant is change. And, no matter the size of an enterprise, over the last few decades businesses across the world have witnessed tremendous changes in the ways they operate and run. One of the major factors influencing this change is, undoubtedly, the technological explosion in all aspects of our lives. From the gigantic machinery that man started out with, the ones with heavy knobs and loud motors, to sleek tablets and microchips with tremendous computing and processing power, the application of science and technology in businesses has brought about drastic changes, and mostly for the better. Since technology stepped into businesses, a flood of software, programs, applications, and interfaces designed exclusively for businesses to collaborate with teams, manage data, and derive insights from sales has hit the market. Today, we have a new class of businesses called ‘digital businesses’ that rely heavily on the Internet for their everyday functionality.



Quote for the day:



"A leader must have the courage to act against an expert's advice." -- James Callaghan


Daily Tech Digest - March 02, 2019


In the wildest dreams of enthusiasts, these devices will be a gateway to something called the decentralized web, or “Web 3.0.” In this future version of the internet, blockchains and similar technologies would support decentralized applications—“dapps”—that look and feel like the mobile apps we use today but run on public, peer-to-peer networks instead of the private servers of big tech companies. It’s widely thought that a major impediment to mainstream adoption of cryptocurrency and dapps is that these technologies are too difficult to use for people who are not especially tech savvy. Better user experiences, starting with cryptographic key management, could change that. But getting there is not straightforward, given that key security is paramount: you lose your keys, you lose your assets. This also explains why Ethereum creator Vitalik Buterin seems so excited about one particular feature of HTC’s Exodus 1, called social key recovery. Essentially, users can choose a small group of contacts and give them parts of their keys.
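HTC has not published the internals of social key recovery, but the idea resembles threshold secret sharing: the key is split so that any k of n trusted contacts can jointly reconstruct it, while fewer than k learn nothing. A sketch using Shamir's scheme over a prime field (the field size and function names here are illustrative assumptions, not HTC's implementation):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte key

def split_secret(secret, n, k):
    """Split an integer secret (< PRIME) into n shares; any k recover it."""
    # Random degree-(k-1) polynomial with the secret as constant term
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

With, say, five contacts and a threshold of three, losing the phone (or losing touch with one or two contacts) still leaves the key recoverable, while no single contact holds anything usable on their own.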



Q&A with Dominic Harvey, director at CWJobs: Plugging the tech skills gap

The tech sector is notorious for experiencing high staff turnover. Tech workers regularly switch jobs to climb the career ladder or have the chance to get their hands on the latest equipment, in turn taking their training and specialist skills with them. This can be seen as problematic for companies across the board, but the spreading out of great talent and important skills throughout the sector means an overall boost to the UK’s tech capability. Bigger companies should also seriously consider tech apprenticeship schemes as a long-term solution to addressing skill shortages, with this presenting the opportunity to train young people in the sector and the company from the ground up. So rather than viewing this trend as negative, businesses need to accept this is part of the industry and use it as an opportunity to equip the next generation of talent. I would encourage tech firms to focus, not just on investing in the latest equipment, but in creating a culture that sees employees leave on good terms, with the potential for them to return with an even greater skill set than when they left.


New chemistry-based data storage would blow Moore’s Law out of the water

Ultra-miniaturization, using chemistry and its molecules and atoms, has been on the scientific community’s radar for a while. However, it’s been rocky—temperature has been a problem, among other things. One big issue, which may be about to be solved, relates to controlling flowing electrons. The flowing current, acting like a wave, gets interfered with—a bit like a water wave. The trouble is called quantum interference and is an area in which the researchers claim to be making progress. Researchers want to get a handle on “not only measuring quantum phenomena in single molecules, but also controlling them,” says Nongjian "NJ" Tao, director of the ASU's Biodesign Center for Bioelectronics and Biosensors, in the article. He says that by better understanding charge-transport properties, they’ll be able to develop new, ultra-tiny electronic devices. If successful, data storage equipment and the general processing of information could end up operating through high-speed, high-power molecular switches. Transistors and rectifiers could also become molecular scale. Miniaturization-limiting silicon could be replaced.


Create an IT support process to take on any outage


Issue management platforms can initiate post-mortems as a follow-up action to a major or critical issue. The tool supplies a detailed log of the incident response timeline and actions/results for review. Post-mortems focus on root causes rather than proximate causes. Proximate causes are the reasons or triggers that started the issue. A root cause is the central fault that, if corrected, could prevent all such incidents. For example, an application throws an error because its volume runs out of storage. The application error is the proximate cause of the issue, but the root cause is a lack of monitoring of logical unit number (LUN) usage and remaining capacity. The post-mortem evaluation might result in new storage monitoring that triggers an alert when the LUN hits 85% full. With that fix in place, administrators can add storage before an application error ever occurs. Similarly, a post-mortem could inform a decision to upgrade systems or software.
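The root-cause fix described above amounts to a simple threshold check in whatever monitoring system is in place. A minimal sketch (the function name and data shape are hypothetical):

```python
def luns_over_threshold(luns, threshold=0.85):
    """Return the names of LUNs whose used/total capacity has reached
    the alert threshold (85% by default, as in the example above)."""
    return [name for name, (used_gb, total_gb) in luns.items()
            if total_gb > 0 and used_gb / total_gb >= threshold]

# A LUN at 87% usage trips the alert; one at 40% does not, so
# administrators can add storage before an application error ever occurs.
alerts = luns_over_threshold({"lun-app01": (870, 1000), "lun-db01": (400, 1000)})
print(alerts)
```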


Equifax CTO: 3 keys to leading culture change

Tech leaders need to recognize, and embrace, the biggest barrier to technology plans isn’t whether the technology will work – it’s the organizational culture. Ten or 15 years ago, the danger was in the details of knowing whether or not Vendor A could interoperate with Vendor B. Today, we don’t have those same technology barriers; instead, people and cultures have become our largest challenges regardless of the size of the company or industry. Encouraging teams to think and act differently requires an invitation to have them join you in a different way of working. Whether or not they accept that invitation is up to them, but it’s up to you to make decisions based on their response. Change must occur – people either want to change or you need to change the people. Look at your team and your people, your own cadence, your own style, and your own words. We have to be intentional about the way we work, the way we talk, the way we interact and the types of people we hire.


Node.js JavaScript vs PHP: Which programming language is winning over developers?


The growing popularity of Node.js JavaScript was captured by the 2018 Node.js User Survey, which also shed light on how the language is being used. "Node.js continues to see its popularity grow on every continent and in a very broad set of use cases due to its flexibility and utility for a wide variety of use cases," it stated, with web apps being the most popular use case, followed by enterprise apps. Of course, the problem with the surveys above is that they canvass developers who work primarily with JavaScript, and who as a result may be more likely to choose Node.js at the backend. It's also not necessarily the case that firms wholly use a single back-end language. Organizations may use Node.js JavaScript for some sites and services and PHP on servers supporting other sites. That dual use is borne out by the surveys, with almost one third of developers saying they used PHP alongside Node.js in the 2018 Node.js user survey, while respondents to the Vue.js survey also reported using a variety of languages at the backend.


ROX Is the New ROI: Prioritizing Customer Experience


PwC’s 2019 Global Consumer Insights Survey — the results of which will be published soon — highlights the need to focus on both ROI and ROX. For example, we asked more than 21,000 consumers in 27 territories around the world what they thought was the most influential type of advertising. About 35 percent said traditional TV ads, the highest percentage among all of the choices. This might seem like good news for the world’s biggest companies, because that’s where they still spend the bulk of their advertising dollars. But dig a little deeper, and you’ll see the desire for experience staring us right in the face. ... There are other ways you can balance your focus on ROX versus ROI, including with your physical retail space. Retailers, banks, and auto dealerships invest enormous amounts of resources and time in their stores, branches, and showrooms. Yet, in today’s harried world, where so much product research is done online by consumers, creating the best customer experience is often more about getting patrons in and out as quickly and efficiently as possible.


Data: if it’s the next oil, is it renewable or toxic?

It all boils down to privacy. Data has the potential to support the discovery of new medical treatments. It could transform healthcare for the better — and it is hard to find anyone who would not be in favour of that. But at what price? Regulators seem to have decided that in some cases the price is too high. ... The EU’s GDPR and other privacy regulations being rolled out across the world in countries like Canada, Japan and Brazil are an attempt to ensure we get the benefits of data without the penalty of lack of privacy. But GDPR does not always work. How often do you throw your hands up in frustration because you have to read and agree/disagree with privacy policies and opt-in requests, just to get a tiny piece of information? It sometimes takes longer to read the disclaimers and other compliance inspired literature, than get the actual information you need. According to Sarah Burnett, Executive Vice President and Distinguished Analyst at Everest Group: “Organisations are confusing their ability to share data internally between departments.”


Teen becomes first millionaire through HackerOne bug bounties


According to bug bounty pioneer and CEO of Luta Security, Katie Moussouris, although targeted bug bounties have a role to play in cyber security, they are not a “silver bullet” and, if poorly implemented, run the risk of wiping out talent pipelines by providing incentives for people with cyber security skills to work outside organisations in pursuit of bounties. Lopez said he was proud to see his work recognised and valued. “To me, this achievement represents that companies and the people that trust them are becoming more secure than they were before, and that is incredible. This is what motivates me to continue to push myself and inspires me to get my hacking to the next level,” he said. Lopez is a top-ranked hacker on HackerOne’s leaderboard, out of more than 330,000 hackers competing for the top spot. His specialty is finding insecure direct object reference (IDOR) vulnerabilities.
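For readers unfamiliar with the vulnerability class mentioned above: an IDOR occurs when an application returns a record for whatever identifier the caller supplies, without checking that the record belongs to the requesting user. The sketch below is illustrative only — the data and function names (`INVOICES`, `get_invoice_insecure`, `get_invoice_secure`) are hypothetical, not taken from any real application.

```python
# Minimal sketch of an insecure direct object reference (IDOR) and its fix.
# All names here are hypothetical; in a real web app the lookup would sit
# behind a route like GET /invoices/<invoice_id>.

INVOICES = {
    101: {"owner": "alice", "total": 250},
    102: {"owner": "bob", "total": 990},
}

def get_invoice_insecure(invoice_id, current_user):
    # IDOR: any valid ID returns the record, regardless of who asks.
    return INVOICES.get(invoice_id)

def get_invoice_secure(invoice_id, current_user):
    # Fixed: verify ownership before returning the record.
    invoice = INVOICES.get(invoice_id)
    if invoice is None or invoice["owner"] != current_user:
        return None  # a web framework would return 403/404 here
    return invoice

# "alice" can read bob's invoice through the insecure path...
assert get_invoice_insecure(102, "alice") is not None
# ...but not through the ownership-checked path.
assert get_invoice_secure(102, "alice") is None
```

The fix is a single authorisation check per lookup, which is why IDORs are both easy to introduce and lucrative for bounty hunters to find at scale.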


The big picture: Is IoT in the enterprise about making money or saving money?

Is IoT in the enterprise about making money or saving money?
Basically, we’ve got about a third of companies hoping to save some money, another third looking for new revenue from increased production, monetizing data or creating product-as-a-service offerings, and the last third expecting a little of both. That could mean that the IoT offers something for everyone, solving whatever problems a company might face, which is how Seth Robinson, senior director for technology analysis at CompTIA, pitched the results in a statement: “This recognition that IoT is not simply a tool for cost savings, but a potential source of new revenue, is mirrored by our finding that for a majority of companies, funding for IoT projects often comes from places other than the IT department. This demonstrates not only the importance of IoT to future strategy, but the company-wide impact IoT tends to have.” In other words, the IoT fights crime and cures cancer! It’s a deodorant that doubles as a floor wax!



Quote for the day:


"Many people read History books but it takes just a few people to LEAD the cause that will shape the course of HISTORY." -- Fela Durotoye