Daily Tech Digest - March 10, 2019

Hacking Our Identity: The Emerging Threats from Biometric Technology

Despite the seemingly enormous potential of biometric technology and its applications, the security it provides seems to be just an illusion, given the complex process, policy and people challenges it brings with it. While it is almost impossible to lose or replace biometrics, the question remains whether biometric technology is foolproof and ready for global implementation. That brings us to an important question: can the evolving biometric system be, in itself, a complete human identification and authentication system, or can it only be part of an identification system? ... Perhaps the biometric system can only be one part of an overall human identification or authentication process, as there are many other variables and parts of that process that will need to play an equal role in determining identity verification effectiveness. Moreover, since the evolving biometric technologies are vulnerable to errors and are easily tricked and manipulated (by AI), it is important that we evaluate whether the ongoing effort towards human identity authentication gives decision-makers the level of security they are hoping for.



Why Our Brains Fall for False Expertise, and How to Stop It

The brain uses shortcuts to manage the vast amounts of information it processes every minute in any given social situation. These shortcuts allow our nonconscious brain to sort the large volume of data while freeing up capacity in our conscious brain for whatever cognitive decision-making is at hand. This process serves us well in many circumstances, such as giving us the reflex to duck when someone throws a bottle at our head. But it can be harmful in others, such as when shortcuts lead us to fall for false expertise. At a cognitive level, the biases that lead us to believe false expertise are similarity (“People like me are better than people who aren’t like me”); experience (“My perceptions of the world must be accurate”); and expedience (“If it feels right, it must be true”). These shortcuts cause us to evaluate people on the basis of proxies, such as height, extroversion, gender, and other characteristics that don’t matter, rather than more meaningful ones.


How to create a transformational cybersecurity strategy: 3 paths


"There's a fine line between the deeply technical, scientific part of cybersecurity, and the people part, which we spend less time talking about—the stuff that actually enables a sustainable transformation," Budge said. "We've seen how one without the other can fail." A good strategy moves security from an IT issue to one of customer trust, Budge said. It also moves security from a technically-focused discipline to a holistic one, and gives business the freedom to achieve its digital aspirations, rather than acting as a blocking agent, she added. Bad cybersecurity strategies are those that cause companies to miss the breaches they experience, that invest in the wrong areas, that require teams to spend their time responding tactically, and that struggle to attract and retain talent, Budge said. No one silver bullet exists for creating a cybersecurity strategy; each is dependent upon the size of the organization, its cybersecurity maturity, and the level of support in the organization, Budge said.


People Are More Complex Than Computers

Defining what Human Resources looks like, and how it functions, in a decentralised organisation where more than half of the staff are independent consultants is the first issue. Another realisation that came pretty quickly is that software metaphors can only take you so far. People have feelings, whereas computers don't. In real life, you are always testing in production; there is no "staging environment". In software, when you make a mistake you can try again many times, or write an automated test to make sure the same issue won't happen again. In real life, this is impossible. Balancing freedom against accountability is hard, as is diversity and inclusion when growing a global, distributed organisation. In short, the growth of an organisation is neither linear nor predictable. What Equal Experts has learned from this process is that bigger is different, and many times you need to adapt dynamically, or "make it up as you go". As long as you strive for continuous improvement, and trust and empower your people, you are setting yourself up for success.


These are the keys to recruiting (and keeping) your Gen Z employees


“Gen Z is 100% digitally native, meaning they are the first job seekers to be born during the age of smartphones, self-service online tools and AI-enabled virtual assistants like Siri and Alexa. They’ve never known a world without the convenience and speed of digital interaction. Much of their time is spent on social media, streaming videos and gaming online,” says Kurt Heikkinen, CEO of candidate engagement and interview software Montage. “As a result, because so much of their world is instant, digital, and seamless, they expect the exact same experience when it comes to job searches and the hiring process. To create the kind of candidate experience that will engage Gen Z and accelerate job offers, explore interviewing technology that gives candidates more choice and control–like automated scheduling, AI-enabled virtual hiring assistants, and on-demand interviews–that offers candidates the high-touch, high-tech experience that they want during their job hunts.”


JP Morgan’s Stablecoin: A Feat of Engineering or Marketing?


JP Morgan’s stablecoin neatly connects the dots between settlement and volatility management by providing digital cash that can be used and redeemed at a stable rate. While this may sound like a significant achievement, all JP Morgan’s stablecoin actually provides is the ability for a counterparty to be paid by JP Morgan in exchange for being issued a digital certificate. It is in fact the antithesis of the idea of creating an ecosystem in which all participants can use a universally accepted and redeemable form of digital cash. Instead, it is a mechanism by which JP Morgan will redeem a token that it issues only on its own platform. This is akin to only being able to buy, gamble with, and cash in your gambling chips at the Venetian casino. And far from being a technology innovation, this is, at its most fundamental, old technology masquerading as new.


How Crypto Company is Fighting Setbacks to Deliver New Technology for Users


ILCoin says that things truly began to change in November 2017, putting the startup in a position where it could start to develop and build foundations for future success. The project says 2018 delivered much more change and positive developments than in the past three years combined, with its team establishing meaningful relationships with exchanges and listing sites. ILCoin says that its newly developed consensus mechanism, C2P, will help deliver levels of security on blockchain that have never been achieved by other projects. After learning harsh lessons during the early stages of its business, the company’s founders are determined to focus on creating sound technology that can make a difference to the public. ILCoin says this is a stark contrast to other companies which have aimed to promote ERC-20 tokens through exaggerated and often slick-looking marketing campaigns, even though the product has little substance.


A Great Engineer Needs the Liberal Arts

Every great developer I've worked with has excellent problem solving skills. I've participated in many technical interviews, on both sides of the table, where the goal wasn't to determine coding ability as much as it was to demonstrate how a person approaches a new problem. In STEM subjects, the scientific method is often employed as a logical set of analytical steps. ... It may be easy to forget that the process begins with asking a question and doing background research, and ends with communicating your findings. Coming up with a question, determining if it is the right question to ask, and doing background research, all require critical thinking skills which are the focus of the liberal arts. Effectively reporting your findings comes back to knowing your audience. If you wrote a simple prototype application to test performance improvement, how would you communicate the results to your non-technical product owner? Showing the raw code is probably no more helpful than writing a 50-page report.


How to Build an Enterprise Architecture Roadmap: 4 Styles to Consider

Of course, not everyone wants to go to Hawaii for sun and seafood. Perhaps snow and schnapps in the Swiss Alps is more your style? Similarly, each business has a different destination in mind and a different set of core metrics to guide it there. If what you primarily care about is using high-cost resources efficiently, you will need quite a different set of roadmapping priorities from an FMCG company, which is primarily focused on time-to-market. In ABACUS you can set any number of goals and then use the tool’s powerful analytics to quantitatively assess potential options. This includes out-of-the-box algorithms using equational, structural, discrete-event and Monte-Carlo techniques. You can run these using a range of metrics, including financial (e.g. TCO, ROI, NPV), technical (e.g. resource utilization, response times, availability, reliability) and environmental (e.g. carbon footprint, resource re-use, sustainability, heat and power consumption).
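As a toy illustration of one of those financial metrics, net present value discounts each period's cash flow back to today at a chosen rate. The cash flows and 10% rate below are invented for the example, not taken from ABACUS:

```kotlin
import kotlin.math.pow

// Net present value: sum of cash flows, each discounted back to period 0.
// The cash flows and discount rate here are made-up illustration values.
fun npv(rate: Double, cashFlows: DoubleArray): Double =
    cashFlows.withIndex().sumOf { (t, cf) -> cf / (1.0 + rate).pow(t) }

fun main() {
    // Year 0: -100 upfront investment; years 1-3: +50 per year, at 10%
    println("%.2f".format(npv(0.10, doubleArrayOf(-100.0, 50.0, 50.0, 50.0))))
}
```

A positive result suggests the option creates value at that discount rate; comparing the same metric across candidate roadmaps is what makes it useful for trade-off analysis.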


Catching Up On The Open Group Open Process Automation Forum

One key element that distinguishes process automation is that it is “always-on”: it is a non-stop effort. Once the plant stops, the organization stops making money. It is vitally important that the plant keeps operating, ideally at optimal efficiency. The manufacturer is strongly opposed to anything that will cause the plant to shut down, because that results in a direct loss of revenue to the organization. The same applies to industries besides oil and gas. It applies to pharmaceutical companies as they go through the whole process of generating products, packaging them and getting them out the door; there, a lot rides on how quickly you can get to market. Food and beverage is another example: soda, beer, cereals and all kinds of other foodstuffs are created from raw materials on a continuous-processing basis.



Quote for the day:


"Really great people make you feel that you, too, can become great." -- Mark Twain


Daily Tech Digest - March 09, 2019

Misconceptions about the term RPA: would removing a letter from the acronym help?

Removing the ‘robotic’ term may help to alleviate fears of robots taking over; but according to Jon Clark, proposition development at ActiveOps, it is the word ‘process’ which is the problem. “A process can be very wide-ranging and complex, and the type of robots we are seeing automate ‘tasks’ within a ‘process’, so I think the ‘P’ in RPA is part of the problem, not the ‘R’. This is a subtle distinction but creates a challenge in terms of perception,” he says. The process of a credit card application, for example, is made up of a series of steps such as checking details, checking credit scores, updating systems, sending confirmation emails and instructing the card printer. “That’s important because people tend to hear ‘process automation’ and think the whole thing will be automated. Unfortunately, it’s not that simple because robots aren’t yet able to do every task in the process,” he states. However, many within the industry believe that the RPA term should remain, and that changing any of the words could cause more problems than it solves.


Online voting: Now Estonia teaches the world a lesson in electronic elections

Voting online, or i-voting, as it is often called in Estonia, takes place during the advance voting period that runs from the 10th until the fourth day before the election. It is not possible to i-vote on election day. The voting process itself is fairly simple. The voter needs a computer with an internet connection and a national ID card or a mobile ID with valid certificates and PIN codes. Once the voting application is downloaded, the software automatically checks whether the voter is eligible to cast a ballot and displays the list of candidates according to the region where the voter is registered. After voters make their decision, the application encrypts their vote and it is securely sent to the vote-collecting server. Every vote also receives a timestamp, so if necessary, it is possible to verify later whether the vote was forwarded to the collecting server. As i-voting doesn't take place in a controlled environment like a polling station, the authorities have to ensure that the vote has been freely cast. So, voters can change their choice during the advance voting period, digitally or at a polling station, and the last vote cast is the one that counts.
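The re-voting rule ("the last vote counts") is straightforward to sketch. The ballot shape below is a hypothetical simplification for illustration, not Estonia's actual data model:

```kotlin
// Hypothetical, simplified model of repeat i-voting: each ballot carries a
// voter id and a timestamp, and only a voter's latest ballot is tallied.
data class Ballot(val voterId: String, val candidate: String, val timestamp: Long)

fun tally(ballots: List<Ballot>): Map<String, Int> =
    ballots.groupBy { it.voterId }           // collect each voter's ballots
        .values
        .map { votes -> votes.maxByOrNull { it.timestamp }!! }  // keep only the latest
        .groupingBy { it.candidate }
        .eachCount()                          // count one vote per voter

fun main() {
    val ballots = listOf(
        Ballot("v1", "Alice", 100),  // v1 votes Alice first...
        Ballot("v1", "Bob", 200),    // ...then changes to Bob: only this counts
        Ballot("v2", "Alice", 150),
    )
    println(tally(ballots))  // Bob and Alice end up with one vote each
}
```

The real system works on encrypted ballots and verified identities, but the tallying principle is the same: earlier ballots from the same voter are simply superseded.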


Triton is the world’s most murderous malware, and it’s spreading


The malware made it possible to take over these systems remotely. Had the intruders disabled or tampered with them, and then used other software to make equipment at the plant malfunction, the consequences could have been catastrophic. Fortunately, a flaw in the code gave the hackers away before they could do any harm. It triggered a response from a safety system in June 2017, which brought the plant to a halt. Then in August, several more systems were tripped, causing another shutdown. The first outage was mistakenly attributed to a mechanical glitch; after the second, the plant's owners called in investigators. The sleuths found the malware, which has since been dubbed “Triton” (or sometimes “Trisis”) for the Triconex safety controller model that it targeted, which is made by Schneider Electric, a French company. In a worst-case scenario, the rogue code could have led to the release of toxic hydrogen sulfide gas or caused explosions, putting lives at risk both at the facility and in the surrounding area. Gutmanis recalls that dealing with the malware at the petrochemical plant, which had been restarted after the second incident, was a nerve-racking experience.


Blockchain marches steadily into global financial transaction networks

SWIFT is among a groundswell of financial services firms testing blockchain as a more efficient and transparent way of conducting cross-border financial transactions, unhampered by much of the regulatory oversight to which current networks must adhere. SWIFT may also be feeling pressure as more and more firms in financial services pilot, or outright adopt, DLT technology. "There is a lot of competition now," said Avivah Litan, Gartner vice president of research. "If you think about SWIFT, it was just a big banking network that moved money quickly and authenticated users, but it costs a lot to do that. And now there are competing initiatives using blockchain." Litan pointed to J.P. Morgan Chase, CLS Group and Ripple, a permissioned blockchain ledger that moves money using a proprietary cryptocurrency, as prime examples of those developing blockchain for cross-border financial transfers. "Ripple is a competitor in the sense that they are trying to set up a bank-to-bank network," Litan said.


GDPR: Still Plenty of Lessons to Learn

During the RSA panel, security expert Ariel Silverstone reported that as of the end of January, there were 41,000 breaches reported under GDPR that fell within the 72-hour notification window. Additionally, there have been about 250 investigations by the various data protection authorities. Silverstone noted that while GDPR covers all 28 countries of the EU, variations in how each country implements the law mean companies could face different penalties. For instance, he said Germany's interpretation of the law treats a violation as nearly a criminal matter, while other nations have been reducing fines. Silverstone also pointed out that the California Consumer Privacy Act, which adheres to some of the same principles as GDPR, offers some of the same consumer protections that Europeans now enjoy. Mark Weatherford, the global information security strategist at Booking Holdings, told the audience that while complying with the GDPR rules is difficult, it's not impossible. Before his current job, he worked at a startup that needed to come into compliance.



A Practical Intro to Kotlin Multiplatform

Kotlin has enjoyed an explosion in popularity ever since Google announced first-class support for the language on Android, and Spring Boot 2 offered Kotlin support. You’d be forgiven for thinking that Kotlin only runs on the JVM, but that’s no longer true. Kotlin Multiplatform is an experimental language feature that allows you to run Kotlin in JavaScript, iOS, and native desktop applications, to name but a few. And best of all, it’s possible to share code between all these targets, reducing the amount of time required for development. This blog post will explore the current state of Kotlin Multiplatform by building a simple app that runs on Android, iOS, Browser JS, Java Desktop, and Spring Boot. Maybe in a few years, Kotlin will be a popular choice on all these platforms as well. ... To share Kotlin code between platforms, we’ll create a common module that has a dependency on the Kotlin standard library. For each platform, we’ll create a separate module that depends on the common module and the appropriate Kotlin language dependency.
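The sharing mechanism itself rests on Kotlin's expect/actual declarations, which must live in separate source sets (one `expect` in the common module, one `actual` per target). The single-file sketch below imitates that shape with an interface so it stays self-contained; all names are illustrative, not from the post's sample app:

```kotlin
// Single-file imitation of the common-module idea. In a real multiplatform
// project, Platform would be an `expect` declaration in the common module,
// with an `actual` implementation in each target's source set.
interface Platform { val name: String }

// Shared logic: written once in the common module, used by every target.
fun greeting(platform: Platform): String = "Hello from ${platform.name}"

// Per-target implementations (illustrative stand-ins for `actual` declarations).
object JvmPlatform : Platform { override val name = "JVM" }
object JsPlatform : Platform { override val name = "JS" }

fun main() {
    println(greeting(JvmPlatform))  // each target reuses the same shared logic
    println(greeting(JsPlatform))
}
```

The point of the pattern is that only the thin platform-specific layer is duplicated; everything above it compiles for every target from a single source.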


How Daimler is using graph database technology in HR


For us, we could see advantages to using graph technology in HR projects because HR data is not isolated: you don't normally have one person working without a connection to another person. If you look at a company, every time you look at the people working there you will see that they all have connections to other people in the company; you won't see anybody who is completely isolated. That is one of the reasons why we thought that HR data might be a very good fit for a graph data model. We started by trying to understand what graph and HR data have in common. ... The second reason, and it's a concrete reason why we created this structured application, is our Leadership 2020 programme at Daimler. We are transforming as a company from the classical, hierarchical structure to a mixture of classic hierarchies and what is called a 'swarm', a group of people working on the same project but coming from different departments and different hierarchies.


Blockchain boosters warn that regulatory uncertainty is harming innovation

Businesses and consumers are reluctant to develop and use blockchain applications in the face of uncertainty over whether they might violate outdated financial laws, the Chamber of Digital Commerce argues in its “National Action Plan” (PDF). Among other things, it calls for “clearly articulated and binding statements from regulators regarding the application of law to blockchain-based applications and tokens.” On Wednesday at the DC Blockchain Summit, SEC commissioner Hester Peirce warned industry advocates to be careful what they wish for. Peirce called the action plan “helpful” and agreed that clear regulatory guidelines are needed. But she cautioned against expecting the government to try to foster innovation, which she said could do more harm than good. Peirce urged patience and cooperation. Regulators are slow, she said, and this technology is complicated: “There’s a learning curve. People at the SEC are trying to learn about this space, and trying to understand where the pressure points are.”


2 reasons a federated database isn’t such a slam-dunk

First, performance. You can certainly mix data from an object-based database, a relational database, and even unstructured data, using centralized and virtualized metadata-driven views. But your ability to run real-time queries on that data, in a reasonable amount of time, is another story. The dirty little secret about federated database systems (cloud or not) is that unless you’re willing to spend the time it takes to optimize the use of the virtual database, performance issues are likely to pop up that make the use of a federated database, well, useless. By the way, putting the federated database in the cloud won’t help you, even if you add more virtual storage and compute to try to brute-force the performance. The reason is that so much has to happen in the background just to get the data in place from many different database sources. These issues are typically fixed by good federated database design, tuning the database, and placing limits on how many physical databases can be involved in a single pattern of access. I’ve found that the limit is typically four or five.


How to use process data mining to improve DevOps

Process mining is the data-driven improvement of business processes, and data scientists often use it to suggest ways to enhance performance. Process data mining works for companies and DevOps teams with processes in place, as well as those that still need to create processes. In the first case, people can compare the best practices for their process with what regularly happens within the team. But, individuals at the enterprise level can also use process data mining to establish their processes. Information sources such as event logs give details about how and when people use tools. Process data mining shows people how far away they are from the target of an ideal process, which can also mean it helps people solidify the processes a DevOps team follows. Then, it’s possible to know how to make the most meaningful process-related improvements and discover the things going wrong. ... Process data mining allows for real-time data collection. The companies that successfully use DevOps rely on release cycle metrics that tell them about progress and quality levels.



Quote for the day:


"Strong convictions precede great actions." -- James Freeman Clarke


Daily Tech Digest - March 08, 2019

What we see is a bigger and bigger push not just to protect data, but demands to protect identity. That was always an expected quantity inside of companies, on our own infrastructures, in our own data centers, because we needed to protect data and assets of value. But now this is being extended to an expectation for the customers we do business with to have securitized access. And this is such a big leap. It has only come into being in the last couple of years, with regulations around data protection and privacy, that we need to once again make sure that the customer is who they say they are, in order to be able to ensure the privacy of that data. And this is causing a tremendous disruption in the marketplace; if not from a solution standpoint, it is definitely causing a disruption relative to thinking about architecture, thinking about how security is designed. We were originally designed to protect assets, and we have firewalls and perimeters.


What is Big Data and why does it matter for business?

The misuse and mishandling of personal data is currently a hot topic, thanks in large part to the scandal involving Facebook and Cambridge Analytica. Increased regulation around the storage and processing of data is highly likely – indeed, it is already underway in Europe in the form of the General Data Protection Regulation (GDPR), which came into force in May 2018. Many technology areas are reliant on large data sets and any restrictions on their ability to use them could have significant consequences for future growth. ... Within vertical markets such as retail, where a sale can be won or lost in a matter of moments, there is no other way to make the necessary rapid-fire decisions, such as which offer to display for a specific customer as he or she enters a store. These decisions cannot wait for such transient events to be uploaded to the company’s cloud, so cloud providers such as Microsoft are revamping their own platforms to push critical analytics functions, such as predictive artificial intelligence (AI) algorithms, downstream to devices.


Marriott CEO shares post-mortem on last year's hack


"As part of our investigation into the alert, we learned that the individual whose credentials were used had not actually made the query," Sorenson said. At that point, the Marriott staff realized they were dealing with a probable breach, although they didn't know if it was something big or just the beginning of a hack that could be very easily contained before the attackers accessed any user data. The company said it brought in third-party forensic investigators on September 10, to help its IT staff look into a possible breach. The forensic firm's rummaging uncovered malware on the Starwood IT systems less than a week later. "The investigators uncovered a Remote Access Trojan ('RAT'), a form of malware that allows an attacker to covertly access, surveil, and even gain control over a computer. I was notified of the ongoing investigation that day, and our Board was notified the following day," the CEO said. Uncovering the full scope of the attack took significant forensic work, the CEO said. 


Gartner on futurology and the year 2035

Technologists can be pragmatic about futurism, but there is a need for us all to speak up
One definition, explained Frank Buytendijk, is that ‘futurism is about postulating possible, probable and preferable futures in order to prepare for them.’ But that implies our role is quite passive — we sit back and wait. He prefers to define futurology or futurism as ‘the art and science of being able to take responsibility for the long-term consequences of actions and decisions today.’ That’s an important definition. It implies we have a responsibility — we can shape and mould the future into an image we might prefer. So he asks the question: “How can we be pragmatic futurists?” Part of the problem is that our view of the future can be distorted by the prism of the present. Maybe our futuristic view is framed by rose-tinted crystal balls. Maybe it is distorted by whatever is in fashion at the moment. When Gartner asked for a view of the year 2035 five years or so ago, privacy was an overriding theme. In its latest survey, privacy features much less, while AI is mentioned either directly or by implication.


How to improve Apache server security by limiting the information it reveals

If you administer the Apache web server, you know there are quite a lot of things you can do to help improve its security. For example, you could (and should) employ mod_security. You could also hide directory folders, run only necessary modules, limit large requests, restrict browsing to specific directories, and much more. But there are two often-overlooked steps you can take to give your Apache server a bit more security: turning off the Apache signature and configuring ServerTokens. Why does this help? Simple. If you broadcast your server's specific information, you inform potential malicious actors of what they're up against. They could learn what web server you're using, what version of the web server, the hosting platform, and even more. You don't want that information displayed for all to see. So, how do you obfuscate it? There are two options to be configured, and I'm going to show you exactly how to set them so as to hide your server details. ...
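The two directives themselves are short; where they live varies by distribution (httpd.conf, apache2.conf, or a conf.d/ snippet are common locations), and Apache must be reloaded afterwards:

```apacheconf
# Place in the main config or a conf.d/ snippet, then reload Apache.
ServerSignature Off   # omit the version footer on server-generated error pages
ServerTokens Prod     # send only "Server: Apache" in the response header
```

With `ServerTokens Prod`, the header no longer reveals the version number, OS, or loaded modules; `ServerSignature Off` stops error pages from echoing the same details.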


Where container infrastructure and management investments yield ROI


The ecosystem of infrastructure, services, tools and expertise listed above turns a simple workload isolation technology into a scalable production platform for multiple applications, batch jobs and microservices. To assess the return on investment for these Capex and Opex charges, review the capabilities each provides. ... Meta-management products appeal to organizations with production containerized application experience, whether on premises or via a cloud service, that now want to standardize on container infrastructure and possibly a PaaS development platform. Within this category of tools is a range of subcategories. Organizations can turn to infrastructure management suites, such as HashiCorp Terraform and Consul, Joyent Triton, Rancher and Mesosphere. Alternatively, PaaS offerings that do the job include Pivotal Cloud Foundry, Red Hat OpenShift and Atos powered by Apprenda.


How to determine if Wi-Fi 6 is right for you

There’s a lot of hype around the next Wi-Fi standard, 802.11ax, more commonly known as Wi-Fi 6. Often new technologies are built up by the vendors as being the “next big thing” and then flop because they don’t live up to expectations. In the case of Wi-Fi 6, however, the fervor is warranted because it is the first Wi-Fi standard that has been designed with the premise that Wi-Fi is the primary connection for devices rather than a network of convenience. Wi-Fi 6 is loaded with new features, such as Orthogonal Frequency Division Multiple Access (OFDMA), 1024-QAM (quadrature amplitude modulation) encoding and target wake time (TWT), that make Wi-Fi faster and less congested. Many of these enhancements came from the world of LTE and 4G, which solved many of these challenges long ago. These new features will lead to a better mobile experience and longer client battery life, and they will open the door to a wide range of new applications that could not have been done on Wi-Fi before. For example, an architect could now use virtual reality (VR) over Wi-Fi to showcase a house.


It's just a graph, making gravitational waves in the real world

The combination of JSON-LD and schema.org has probably done more to spread the use of RDF than anything else. Just getting Google and other search engines to adopt it has led to an array of use cases. And yet, JSON-LD was hugely controversial in its time in the RDF community. It was not the last controversy the RDF community faced, but it seems JSON-LD's success may have had something to teach. We'll get back to that shortly. Property graphs have been around for about 10 years, and have been driven by industry. As such, you could say they are a mirror image of RDF: pragmatism rules, tooling is abundant and easy to use, outreach and community building are top priorities, but standardization has, at this point, only come as an afterthought. Most property graph solutions have no schema, or only a very basic one. Just getting data in and out of property graph solutions is an exercise in patience and improvisation -- good luck representing a graph structure in CSV, and mapping that from solution to solution.
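For context, this is roughly what a schema.org annotation in JSON-LD looks like when embedded in a web page for search engines to consume; all the values below are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline (placeholder)",
  "author": {
    "@type": "Person",
    "name": "Example Author"
  },
  "datePublished": "2019-03-08"
}
```

Under the hood this is RDF: the `@context` maps plain JSON keys to schema.org terms, which is exactly why it let web developers publish linked data without ever thinking about triples.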


Extracting value from data: how to do it and the obstacles to overcome

The most significant obstacle for information-sharing exchanges is whether law or regulation will allow them. According to the survey, 33% of respondents said they would be unable to adjust effectively to new regulations for data protection and privacy. With certain business models, this information sharing would not be easily achievable — under GDPR or the CCPA (California Consumer Privacy Act) — without the explicit consent of the consumer. ... “A lot of this has to do with how companies are organised: 31% said we are organisationally siloed — the data that belongs to one business unit is locked up in that business unit, it is not shared with other business units — so they’re not getting the full value of their data, just because of the structure,” he says. Another interesting result from the survey was that 30% lack the data scientists or analytical talent who would have the capabilities to better exploit the data. “So, there’s definitely a talent shortage leaving money on the table for companies,” confirms Cline.


How blockchain will manage networks

Smart Packet Contracts would protect the network from intrusions and the like, and a “Marconi Pipe” would be the channel, providing the routing and processing. While it actually sits at the data-link OSI Layer 2 (switches and bridges in terms of hardware; MAC and Ethernet in terms of protocols), it can also overlay other infrastructure, such as wireless. A barter system, where network resources can be traded for compute resources, say, rounds out the concept. Monetization could indeed be introduced. Another angle is securing the multiple cloud-based systems running in the enterprise. It’s a “challenge to make sure [multi-cloud] communication is secure and safe from attacks such as eavesdropping or ‘man in the middle,’” Jong Kim, chief architect of the Marconi Foundation and Network World contributor, said in a VentureBeat article in January. “A common network where each connection point securely peers with every other point, regardless of cloud provider or container instance” could be provided with an Ethernet-layer blockchain.



Quote for the day:


Challenges in life always seek leaders and leaders seek challenges. -- Wayde Goodall


Daily Tech Digest - March 07, 2019

Why Wi-Fi needs artificial intelligence
Over time, I expect AI to lead to fully autonomous networks where the AI runs the wired and wireless network. However, I don’t expect businesses to embrace the concept of a “self-driving network” immediately. Instead, the initial wave of AI as a network management tool will assist the engineer by providing recommendations coupled with automation of basic tasks, including troubleshooting and problem avoidance. Engineers shouldn’t fear AI or worry about the technology replacing them. Instead, they should look at it as their best friend, because it will free up huge amounts of time as much of the heavy lifting is done by machines. The access edge, particularly the wireless network, is growing in importance. But at the same time, it is being pushed to do more because more devices are connecting to it, resulting in orders of magnitude more data traversing the network. Manual operational methods have never worked and certainly will not work in a hyper-connected world. AI-based systems are becoming mandatory to keep the performance of Wi-Fi high and to shed the reputation that flaky Wi-Fi is the norm.



5 trends driving the design of next-generation data centers


The efficiency of data centers is both an environmental concern and a large-scale economic issue for operators. Enterprises in diverse industries from automotive design to financial forecasting are implementing and relying on machine learning in their applications, which drives up both the cost and the heat load of data center infrastructure. It’s widely known that power and cooling represent the biggest costs that data center owners have to contend with, but new technologies are emerging to rein them in. ... One of the most successful technologies that data center operators have put into practice to improve efficiency is monitoring software that implements the critical advances made in machine learning and artificial intelligence. Machines are much more capable of reading and predicting the needs of data centers second to second than their human counterparts, and with their assistance operators can manipulate cooling solutions and power usage in order to dramatically increase energy efficiency.



“When you are in a disaster recovery situation, you do not want the new person trying out the wings,” says Bruce Beam, chief information officer at (ISC)². Unfortunately, the number of cyber security positions far exceeds the number of available cyber security professionals. The demand for cyber security professionals has outpaced supply in recent years, due to emerging threats and organisations increasing the amount of business they conduct online. According to a study, the number of organisations that reported shortages in the cyber security skills of their staff has increased over the past four years. In 2014, approximately 23% of organisations indicated this was a challenge, but this has now risen to more than 50%. Much of this rise has been due to the increasing workload of cyber security teams. Continuing professional development (CPD) has been used to ensure that skills remain relevant.


Open Source Benefits to Innovation and Organizational Agility

To understand how organizations use open source today, Andrew Aitken presented the state of open source in the context of its evolution from the founders until today. Aitken identified four generations. Generation one, initiated in the early 70s, is represented by the evangelists and thought leaders who founded the open source movement: Richard Stallman, Linus Torvalds, Eric Raymond, etc. Their purpose was to make software free and allow anybody to contribute to its improvement. Generation two consists of influencers, such as Marc Fleury, Marten Mickos and Larry Augustin, who began to think about how to commercialize open source and launched the first few commercial open source companies. Generation three of open source started with the proliferation of the internet and the vast amount of data that became available to organizations. Dotcoms created new technologies to manage data and started open-sourcing their software.


"If the insurer knows our drivers are always driving well on safer routes, then we might be able to bring down our premium," says Gifford. "So, there's opportunities like that when it comes to using blockchain — and that's just an example. But success in blockchain is all about getting partners on board." Gifford says effective partnerships are critical to Wincanton's broader development efforts. The firm launched an innovation programme called W² Labs last March, which gets startups to develop innovative solutions to the firm's challenges. Wincanton also uses its internal development team and works with external consultants, such as IBM and PA Consulting. The broader aim of these combined efforts is to produce what Gifford refers to as the Internet of Transport. These developments focus on three key areas. First, Winsight, an app that enables a paperless cab, so all the paperwork lorry drivers normally carry, such as routes and proofs of delivery, is wrapped up into a single piece of software on a smart device.


"DevOps Institute is thrilled to share the research findings that will help businesses and the IT community understand the requisite skills IT practitioners need to meet the growing demand for T-shaped professionals," said Jayne Groll, CEO of DevOps Institute. "By identifying skill sets needed to advance the human side of DevOps, we can nurture the development of the T-shaped professional that is being driven by the requirement for speed, agility and quality software from the business." Automation, process, and soft skills were the top three most important skills categories, according to the report. Soft skills—including collaboration and cooperation, problem-solving, interpersonal skills, and sharing and knowledge transfer—are just as important as technical skills to DevOps practitioners, highlighting the need for well-rounded candidates in this field. "The reality of the DevOps world is one that is frequently changing," Erin Lovern, director of global talent acquisition at CloudBees, said in the report.


IoT Expands the Botnet Universe

Botnets composed of vulnerable IoT devices, combined with widely available DDoS-as-a-Service tools and anonymous payment mechanisms, have pushed denial-of-service attacks to record-breaking volumes. At the same time, new domains such as cryptomining and credentials theft offer more opportunities for hacktivism. ... One example is a new piece of malware that takes advantage of Android-based devices exposing debug capabilities to the internet. It leverages scanning code from Mirai. When a remote host exposes its Android Debug Bridge (ADB) control port, anyone on the internet has full install, start, reboot and root shell access without authentication. Part of the malware includes Monero cryptocurrency miners (xmrig binaries), which execute on the infected devices. Radware’s automated trend analysis algorithms detected a significant increase in activity against port 5555, both in the number of hits and in the number of distinct IPs.
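The kind of trend analysis described is straightforward to sketch. The snippet below uses hypothetical log records and a made-up alert threshold; it tallies the two signals Radware reports tracking for port 5555, total hits and distinct source IPs per destination port:

```python
from collections import defaultdict

# Hypothetical firewall log records as (source_ip, destination_port).
# A real feed would also carry timestamps, protocols, payload sizes, etc.
events = [
    ("203.0.113.5", 5555),
    ("203.0.113.5", 5555),
    ("198.51.100.7", 5555),
    ("198.51.100.7", 22),
]

hits = defaultdict(int)     # total hits per destination port
sources = defaultdict(set)  # distinct source IPs per destination port

for src, port in events:
    hits[port] += 1
    sources[port].add(src)

# Flag ports whose distinct-source count crosses an illustrative threshold;
# a rise in both counters suggests coordinated scanning rather than one
# noisy host.
THRESHOLD = 2
suspicious = {p for p, ips in sources.items() if len(ips) >= THRESHOLD}
print(hits[5555], len(sources[5555]), sorted(suspicious))
```

Tracking distinct sources alongside raw hit counts is what separates a botnet-style sweep (many IPs, few hits each) from a single misconfigured client.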


Clearer North Korean link to global infrastructure malware campaign


The researchers were able to get a rare look at the workings of a nation-state cyber espionage campaign after being handed a command-and-control server for the campaign by one of its targets. This provided an opportunity to conduct a detailed analysis of code and data from the server responsible for managing the operations, tools and tradecraft behind the campaign, previously thought to have run from October to November 2018. The analysis led to the identification of several previously unknown command-and-control centres and indicates that Sharpshooter began as early as September 2017, targeted a broader set of organisations in more industries and countries, and is currently ongoing. “McAfee Advanced Threat Research analysis of the command-and-control server’s code and data provides greater insight into how the perpetrators behind Sharpshooter developed and configured control infrastructure, how they distributed the malware, and how they stealthily tested campaigns prior to launch,” said Raj Samani.


Cisco uncorks 26 security patches for switches, firewalls

While the 26 alerts describe vulnerabilities that have a Security Impact Rating of “High,” most (23) affect Cisco NX-OS software, and the remaining three involve both software packages. The vulnerabilities span a number of problems that would let an attacker gain unauthorized access, gain elevated privileges, execute arbitrary commands, escape the restricted shell, bypass the system image verification checks or cause denial of service (DoS) conditions, Cisco said. It has released software fixes for all the vulnerabilities, and none of the problems affect Cisco IOS software or Cisco IOS XE software, the company said. Information about which Cisco FXOS Software and Cisco NX-OS Software releases are vulnerable and what to do about it is available in the fixed software section of the advisory. ... A couple of vulnerabilities in the Nexus software could let attackers gain elevated privileges on the switches and execute nefarious commands. The first weakness is due to an incorrect authorization check of user accounts and their associated group IDs, Cisco wrote.


Artificial intelligence and cybersecurity: Attacking and defending

Social engineering remains one of the most common attack vectors. How often is malware introduced in systems when someone just clicks on an innocent-looking link? The fact is, to entice the victim to click on that link, quite a bit of effort is required. Historically, it’s been labor-intensive to craft a believable phishing email. Days and sometimes weeks of research, and the right opportunity, were required to successfully carry out such an attack. Things are changing with the advent of AI in cyber. Analyzing large data sets helps attackers prioritize their victims based on online behavior and estimated wealth. Predictive models can go further and determine willingness to pay the ransom based on historical data, and even adjust the size of the pay-out to maximize the chances and, therefore, revenue for cybercriminals. Imagine all the data available in the public domain, along with secrets previously leaked through various data breaches, combined for ultimate victim profiling in a matter of seconds with no human effort.



Quote for the day:


"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis


Daily Tech Digest - March 04, 2019


IBM said its Q System One, which has a 20-qubit processor, produced a Quantum Volume of 16, double the current IBM Q, which has a Quantum Volume of 8. IBM also said the Q System One has some of the lowest error rates IBM has measured. That progress is notable, but practical broad use cases are still years away. IBM said Quantum Volume would need to double every year to reach Quantum Advantage within the next decade. Faster progress on Quantum Advantage would speed up that timeline. IBM has doubled the power of its quantum computers annually since 2017. Once Quantum Advantage is hit, there would be new applications, more of an ecosystem and real business use cases. Consumption of quantum computing would still likely be delivered via cloud computing since the technology has some unique characteristics that make a traditional data center look easy. IBM made its quantum computing technology available in 2016 via a cloud service and is working with partners to find business and science use cases.
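IBM's doubling claim makes the timeline easy to project. A quick sketch, assuming Quantum Volume keeps doubling annually from the Q System One's 16; the target threshold below is purely illustrative, since IBM has not tied Quantum Advantage to a specific number here:

```python
# IBM's stated progression: Quantum Volume 8 -> 16, doubling annually.
# Project forward under that assumption.
qv, year = 16, 2019
target = 2 ** 10  # illustrative stand-in for an "advantage" threshold

while qv < target:
    qv *= 2       # one doubling per year, per IBM's claimed pace
    year += 1

print(year, qv)   # reaching 1024 takes six doublings, i.e. 2025
```

The exponential framing is the point: whether the threshold is 2^10 or 2^20 changes the arrival date by only a handful of years, which is why IBM talks in terms of "within the next decade."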



Another Bitcoin Indicator Signals Price Bottom May Be Forming


Essentially, the MFI validates or confirms price trends. Many times, however, the indicator diverges from the prevailing market trend. For instance, BTC dashed hopes of a long-term bullish reversal with a break below $6,000 on Nov. 14 and hit a 15-month low of $3,122 on Dec. 15. The 14-week MFI also nosedived from the high of 43.00 in mid-November, confirming the sell-off in prices. The indicator, however, bottomed out with a higher low at 22.00, contradicting the lower low in bitcoin’s price. That bullish divergence is widely considered an early warning of a bearish-to-bullish trend reversal. Supporting that argument is the fact that BTC snapped its record six-month losing streak with a 10 percent gain in February and the MFI rose from 25 to 44. Other indicators, like the moving average convergence divergence (MACD) and the bearish crossover of the 50- and 100-week moving averages, are also signaling long-term bearish exhaustion. These tools, however, don’t incorporate trading volumes. The MFI, therefore, stands out as a more reliable technical tool.
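For readers unfamiliar with the indicator, the MFI is essentially a volume-weighted cousin of RSI, bounded between 0 and 100. A minimal sketch of the standard calculation, run on synthetic prices rather than market data:

```python
def money_flow_index(highs, lows, closes, volumes, period=14):
    """Money Flow Index: a volume-weighted momentum oscillator (0-100)."""
    # Typical price per bar, then "raw money flow" = typical price * volume.
    typical = [(h + l + c) / 3 for h, l, c in zip(highs, lows, closes)]
    raw_flow = [tp * v for tp, v in zip(typical, volumes)]

    pos = neg = 0.0
    # Over the last `period` bars, classify each bar's flow as positive or
    # negative depending on whether the typical price rose or fell.
    for i in range(len(typical) - period, len(typical)):
        if typical[i] > typical[i - 1]:
            pos += raw_flow[i]
        elif typical[i] < typical[i - 1]:
            neg += raw_flow[i]

    if neg == 0:          # all flow positive -> indicator pegged at 100
        return 100.0
    return 100 - 100 / (1 + pos / neg)

# Tiny synthetic example; the period is shortened to fit the data.
mfi = money_flow_index([10, 11, 12, 11], [10, 11, 12, 11],
                       [10, 11, 12, 11], [1, 1, 1, 1], period=2)
print(round(mfi, 2))  # ≈ 52.17
```

Because volume weights each bar's contribution, a price drop on heavy volume drags the MFI down far more than the same drop on thin volume, which is exactly the property the article credits over the MACD and moving-average tools.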



Dangerous gaps in cybersecurity investments


Historically, many companies have underfunded employee awareness education and training, but that tide largely turned as it became clear that employees with poor security practices were the source of many cyberbreaches. Even so, far too many companies still fail to educate their entire employee base, or to test employee awareness and practices on an ongoing basis. Cybersecurity insurance is a relative newcomer to the security budget mix. Companies have learned that – no matter their defenses – they face high odds of becoming cyberattack victims at some point. Given this awareness, insurance policies are almost certain to capture a growing percentage of the overall cybersecurity budget. However, insurance should be treated as a complement to strong security technology, staffing and education, not as an alternative to them. When making your cybersecurity investments, it’s critical that you direct the funds in a balanced way that addresses all of these security areas. Each one plays a critical role in building comprehensive defenses, and underfunding any of them could prove to be an extremely dangerous and costly error.



No Avoiding the Inevitable: The Time for Cyber Security Analytics is Now

“Most organizations understand security analytics as an elusive cluster of different technologies encompassing ‘a little bit of everything’," said Pavlakis. "While on a top level they are somewhat correct on that respect, they, unfortunately, opt to pick whatever makes sense budget-wise."  Regardless of how organizations approach the security analytics marketplace, approach they will. For example, Gartner's 2019 CIO Agenda Survey found that analytics and cyber security top this year's priority lists among CIOs in the government sector.  In analyzing Gartner's findings, Security Boulevard's Filip Truta suggested that government is actually a late-comer to this realization, and that other industries are already hip to the power of cyber security analytics. "High-profile data breaches have highlighted cybersecurity analytics as a formidable weapon against sophisticated attacks and advanced threats that elude prevention mechanisms at endpoint level," wrote Truta.


Huawei Denies Then Plays The Blame Game Over Cybersecurity Vulnerabilities


Claims and subsequent action by the United States and other countries have put Huawei, Supermicro, and ZTE under a negative spotlight, and the effects have been damaging from a revenue, brand, and loyalty perspective. Although the UK's National Cyber Security Centre (NCSC) deemed Huawei a "manageable risk," these companies will be challenged to regain their credibility and reputations in the security industry. Although it is nearly impossible to prove the claims against each company, it does force every equipment vendor to determine which side of the fence they are on, and perhaps incentivizes the industry to make meaningful long-term changes and safeguards—especially as 5G becomes a reality. While these companies are on their heels, rivals like Cisco, Ericsson, Nokia, etc. have a healthy competitive opportunity to grow market share. However, as the saying goes, "what goes around, comes around"; it will be easier for the industry to take care of itself before clueless bureaucrats and politicians do it for them.


Fintech in Sub-Saharan Africa: A Potential Game Changer


Sub-Saharan Africa is the only region in the world where close to 10 percent of GDP in transactions occur through mobile money. This compares with just 7 percent of GDP in Asia and less than 2 percent of GDP in other regions. Most African users now rely on mobile payments to send and receive money domestically. Increasingly, they are taking advantage of new services to also send and receive money internationally. In addition, they use mobile money to pay their bills, receive their wages, and pay for goods and services. Innovation is allowing Africans to move up the “financial services value chain.” From mobile payments, customers in sub-Saharan Africa are gaining access to mobile banking and other services as they open saving accounts, take out loans, purchase insurance, and invest in Government securities or in stock markets with a few touches of their mobile phone. They can even “borrow” electricity and pay later instead of sitting in the dark. New innovations in fintech are proceeding rapidly. New technologies are being developed and implemented on the continent, and they have the potential to yield significant benefits for Africa.


This coworking space is like a horse trailer, but for humans


No one looks forward to a day at the office–no matter how much free cold brew is on tap. We all dream of that digital nomad life, kicking up our feet at the beach while knocking out a day of emails. Mojitos optional. So I very much understand what the South African shared workspace company Work & Co (not to be confused with the New York digital design agency Work & Co) was thinking when it developed the Nova workspace–an office on wheels, which you can rent for $250 a day, and in exchange, the company will tow it to a uniquely beautiful location. I just didn’t imagine that the office would look like this: a horse trailer, but for humans. I mean, don’t get me wrong, there’s effort here! You have velvet upholstery (seating for six!), a hip little wallpapered corner, and plenty of windows for panoramic views of the scenery. You have coffee, shade, and a bathroom–fulfilling the three core components of Maslow’s Hierarchy of Needs When Working Remote. What more could you want?


How to prepare employees for AI's impact on the workforce

"The growth of artificial intelligence and emerging technologies (ET) is poised to reshape the workforce. While the exact impact of AI and ET is unclear, experts expect that many jobs currently performed by humans will be performed by robots in the near future, and at the same time, new jobs will be created as technology advances," said Elizabeth Mann Levesque of the Brookings Institution. Companies can ease employees' concerns about AI adoption by taking these two steps: In my career I've experienced many company reorganizations. The format usually consists of a consultant visit, the vice president explaining that the department is being assessed to maximize workflows and that it will benefit everyone—and then everyone goes back to their desks wondering if they will be laid off. As a junior staff member, I was a lay-off victim in my very first IT job. I did documentation that the consultant deemed "non-essential." Years later, I still recall the trauma of it. It wasn't getting laid off that hurt. It was going to work and not knowing what would happen next.


In cybersecurity, it’s AI vs. AI: Will the good guys or the bad guys win?

For all its promise, there are areas in which AI adds little value or may even create new vulnerabilities. Machine and deep learning work best when the problem domain is well-known and the variables don't change very much. Algorithms are good at detecting variations in patterns but not at identifying new ones. "To say you're going to find the unknown is really tough," said Tom Clare, senior product marketing manager at Fidelis Cybersecurity Inc., which specializes in threat detection and response. Changing variables can flummox machine learning algorithms, which is one reason they have so far had limited value in combating malware, the incidence of which has risen fivefold since 2013, according to SafetyDetective.com. Machine learning algorithms "inherently fail because the training set of malware changes too quickly," said Doug Swanson, chief technology officer at Malwarebytes Corp. "The malware your model will see in the future will end up looking little to nothing like the malware it has seen, and been trained on, in the past."


The Open Source Approach to Accelerating Digital Transformation

It’s worth noting that many of the advances and much of the services innovation found in hyperscale cloud companies are actually achieved by leveraging the thousands of developers in the open source community, and OpenStack provides a compatible platform for taking advantage of these, and even more advanced, developments. The latest advances in distributed databases, containers, Kubernetes automation and scaling, platform as a service (PaaS), artificial intelligence, machine learning, Internet of Things and 5G networks are all available on OpenStack – sometimes long before the proprietary cloud vendors can develop systems to exploit these new technologies and make them available across all geographies. As customers embrace hybrid environments, the same technology that is offered online can also be implemented in their own data centers. Many companies choose to put their variable workloads in the cloud, while keeping production on-site.



Quote for the day:


"It takes an influential leader to excellently raise up leaders of influence." -- Anyaele Sam Chiyson