Daily Tech Digest - May 31, 2019

How To Identify What Technologies To Invest In For Digital Transformation

There are many aspects of the experience, but if you look at the central pillars of a great experience, it comes down to the acronym “ACT.” The “A” pillar of ACT is anticipation. The platform must anticipate what the customer or employee needs when using the platform. The second pillar, C, is a reminder that the experience must be complete. The platform should not put the burden of tasks on the customer or employee; it should run the activity to its completion and deliver a satisfying, complete result back to the customer or employee. The third pillar, T, represents the timeliness factor. The experience needs to be performed in a time frame that is relevant and consistent with customer or employee expectations. An example is in sales, where the company has 45 minutes (or perhaps two days) to meet the stakeholder’s need. The “T” is not about response time; it’s about the appropriate amount of time that the individual gives the company to get to a complete answer. It could be seconds, hours or days.




The digital twin is an evolving digital profile of the historical and current behavior of products, assets, or processes and can be used to optimize business performance. Based on cumulative, real-time, real-world data measurements across an array of dimensions, the digital twin depends on connectivity—and the IIoT—to drive functionality. Amid heightened competition, demand pressures, inaccurate capacity assumptions, and a suboptimal production mix, one manufacturing company sought ways to drive operational improvements, accelerate production throughput, and promote speed to market. At the same time, however, the manufacturer was hampered by limited visibility into its machine life cycles, and knew relatively little about resource allocation throughout the facility. To gain deeper insight into its processes—and to be able to simulate how shifts in resources or demand might affect the facility—the manufacturer used sensors to connect its finished goods and implement a digital twin.



How iRobot used data science, cloud, and DevOps

The core item in the new design language is the circle in the middle of the robots. The circle represents the history of iRobot, which featured a bevy of round Roomba robots. "The circle is a nod back to the round robots and gives us the ability to be more expansive with geometries," Angle explains. But iRobot 2.0 also represents the maturation of iRobot. "Innovation at iRobot started back in the early days with a toolkit of robot technology. Innovation was really about market exploration and finding different ways for the toolkit to create value," Angle says. Through that lens, iRobot explored everything from robots for space exploration to toys to industrial cleaning and medical uses. "Our first 10 to 15 years of history is fraught with market exploration," Angle says. Ultimately, iRobot, founded in 1990, narrowed its focus to defense, commercial and consumer markets before focusing solely on home robots. iRobot divested its commercial and military robot divisions; the latter was ultimately acquired by FLIR for $385 million.


The Defining Role of Open Source Software for Managing Digital Data


Open source use is accelerating and driving some of the most exciting ventures of modern IT for data management. It is a catalyst for infusing innovation: for example, Apache Hadoop, Apache Spark, and MongoDB in big data; Android in mobile; OpenStack and Docker in cloud; AngularJS, Node.js, Eclipse Che, and React, among others, in web development; Talend and Pimcore in data management; and TensorFlow in machine learning. Plus, the presence of Linux is now everywhere—in the cloud, the IoT, AI, machine learning, big data, and blockchain. This adoption trend for open source software, especially in data management, will intensify in the coming years. Open source has a certain edge in that it does not restrain IT specialists and data engineers from innovating and making the use of data more pervasive. In my experience, successful data management depends on breaking down data silos in the enterprise, with a consolidated platform in place for rationalizing old data as well as deploying new data sources across the enterprise.


DevOps security best practices span code creation to compliance


Software security often starts with the codebase. Developers grapple with countless oversights and vulnerabilities, including buffer overflows; authorization bypasses, such as not requiring passwords for critical functions; overlooked hardware vulnerabilities, such as Spectre and Meltdown; and ignored network vulnerabilities, such as OS command or SQL injection. The emergence of APIs for software integration and extensibility opens the door to security vulnerabilities, such as lax authentication and data loss from unencrypted data sniffing. Developers' responsibilities increasingly include security awareness: They must use security best practices to write hardened code from the start and spot potential security weaknesses in others' code. Security is an important part of build testing within the DevOps workflow, so developers should deploy additional tools and services to analyze and evaluate the security posture of each new build.
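The SQL injection risk mentioned above is easy to demonstrate, and just as easy to harden against with parameterized queries. A minimal sketch using Python's built-in sqlite3 module (the table, column names and payload are illustrative):

```python
import sqlite3

# In-memory database with an illustrative users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is concatenated into the query.
    query = "SELECT name FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Hardened: a parameterized query treats the input purely as data.
    return conn.execute("SELECT name FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload returns every row from the unsafe version...
payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # [('alice',), ('bob',)]
# ...but matches nothing when bound as a parameter.
print(find_user_safe(payload))    # []
```

Static analyzers in the build pipeline flag exactly this kind of string-built query, which is one reason the article argues security belongs in build testing rather than only in review.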
Chief artificial intelligence officer
The CAIO might not be at the Executive Committee level, but beware the various other departments reaching out to own the role. AI often gets its initial traction through innovation teams – but is then stymied in the transition to broader business ownership. The IT function has many of the requisite technological skills but often struggles to make broader business cases or to deliver on change management. The data team would be a good home for the CAIO, but only if it is operating at the ExCom level: a strong management information (MI) function is a world away from a full AI strategy. Key functions may be strong users of AI – digital marketing teams or customer service teams with chatbots, for example – but they will always be optimising for specific things. So, who will make a good CAIO? This is a hard role to fill: balancing data science and technology skills with broader business change management experience means walking a fine line. Ultimately it will be circumstances that dictate where the balance should be struck. Factors include the broader team mix and the budget available, but above all the nature of the key questions that the business faces.


Researcher Describes Docker Vulnerability

Containers, which have grown in popularity with developers over the last several years, are a standardized way to package application code, configurations and dependencies into what's known as an object, according to Amazon Web Services. The flaw that Sarai describes is part of Docker's FollowSymlinkInScope function, which is typically used to resolve file paths within containers. Instead, Sarai found that this particular symlink function is subject to a time-of-check to time-of-use, or TOCTOU, bug. ... But a bug can occur that allows an attacker to modify these resource paths after resolution but before the assigned program starts operating on the resource. This allows the attacker to change the path after the verification process, thus bypassing the security checks, security researchers say. "If attackers can modify a resource between when the program accesses it for its check and when it finally uses it, then they can do things like read or modify data, escalate privileges, or change program behavior," Kelly Shortridge, vice president of product strategy at Capsule8, a security company that focuses on containers, writes in a blog post about this Docker vulnerability.
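The TOCTOU pattern described here is not Docker-specific; the same check-then-use gap appears whenever a path is validated in one step and opened in another. A minimal Python sketch of the shape of the bug (the file names are illustrative, and the attacker hook stands in for an attacker winning the real race):

```python
import os
import tempfile

def read_if_not_symlink(path, attacker=lambda: None):
    # CHECK: refuse symlinks at validation time.
    if os.path.islink(path):
        raise PermissionError("symlinks not allowed")
    attacker()  # TOCTOU window: stands in for an attacker winning the race
    # USE: by now the path may no longer be what was checked.
    with open(path) as f:
        return f.read()

def read_atomically(path):
    # Mitigation: let the kernel check and open in one step, so a swapped-in
    # symlink is rejected instead of followed (POSIX-only flag).
    fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW)
    with os.fdopen(fd) as f:
        return f.read()

tmp = tempfile.mkdtemp()
victim = os.path.join(tmp, "report.txt")
secret = os.path.join(tmp, "secret.txt")
with open(secret, "w") as f:
    f.write("top secret")
with open(victim, "w") as f:
    f.write("harmless")

def swap_for_symlink():
    os.remove(victim)
    os.symlink(secret, victim)

# The check passes, the attacker swaps the file, and the read follows
# the symlink: the security check has been bypassed.
print(read_if_not_symlink(victim, attacker=swap_for_symlink))  # top secret
```

Against the simulated attacker, `read_atomically` raises an error on the swapped-in symlink instead of following it, because the check and the use can no longer be separated.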


JDBC vs. ODBC: What's the difference between these APIs?

Many people associate ODBC with Microsoft because Microsoft integrates ODBC connectors right into its operating system. Furthermore, Microsoft has always promoted Microsoft Access as an ODBC-compliant database. In reality, the ODBC specification is based upon the Open Group's Call Level Interface specification, and is supported by a variety of vendors. The JDBC specification is owned by Oracle and is part of the Java API. Evolution of the JDBC API, however, is driven by the open and collaborative JCP and Java Specification Requests. So while Oracle oversees the API's development, progress is largely driven by the user community. Despite the separate development paths of ODBC and JDBC, both allow support of various, agreed-upon specifications by RDBMS vendors. These standards are set by the data management and interchange committee of the International Organization for Standardization (ISO), and both JDBC and ODBC vendors work to maintain compliance with the latest ISO specification.


LinkedIn Talent Solutions: 10 tips for hiring your perfect match

Best practices for hiring and recruiting on LinkedIn
The product uses AI to recommend relevant candidates that could be a good fit for an available role, and it leverages analytics to make recommendations in real time as you’re crafting your job description. LinkedIn Recruiter and Jobs also allows companies to target open roles using LinkedIn Ads to reach relevant candidates. In the new Recruiter and Jobs, talent professionals no longer have to jump back and forth between Recruiter and Jobs; the update puts search leads and job applicants for an open role within the same project, viewable on a single dashboard. Candidates can then be saved to your Pipeline, where they’ll move through the later stages of the hiring process. ... Finally, LinkedIn Pages allows organizations of any size to showcase their unique culture and employee experience by posting employee-created content, videos and photos. Candidates can visit an organization’s page to see what it has to offer, as well as get personalized job recommendations and connect with employees like them, according to LinkedIn. Real-time page analytics can identify who’s engaging with your organization’s page and which content is making the greatest impact.


Sidecar Design Pattern in Your Microservices Ecosystem

Segregating the functionalities of an application into a separate process can be viewed as a Sidecar pattern. The sidecar design pattern allows you to add a number of capabilities to your application without additional configuration code for third-party components. Just as a sidecar is attached to a motorcycle, in software architecture a sidecar is attached to a parent application and extends/enhances its functionalities. A sidecar is loosely coupled with the main application. Let me explain this with an example. Imagine that you have six microservices talking with each other in order to determine the cost of a package. Each microservice needs to have functionalities like observability, monitoring, logging, configuration, circuit breakers, and more. All these functionalities are implemented inside each of these microservices using some industry-standard third-party libraries. But is this not redundant? Does it not increase the overall complexity of your application?
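The redundancy the author points at is exactly what the sidecar removes: each service keeps only its business logic, while the cross-cutting concerns live once, in a reusable component beside it. In production the sidecar is a separate process or container that proxies the service's traffic; the in-process Python sketch below (all names and values are illustrative) shows only the separation of concerns:

```python
import time

class PricingService:
    # Business logic only: no logging, metrics, or retry code inside.
    def handle(self, request):
        return {"package": request["package"], "cost": 9.99}

class Sidecar:
    """Attaches cross-cutting concerns to any service it wraps.

    A real sidecar runs as its own process or container next to the
    application; wrapping in-process keeps the illustration self-contained.
    """
    def __init__(self, service, name):
        self.service = service
        self.name = name
        self.request_count = 0  # metrics live in the sidecar, not the service

    def handle(self, request):
        self.request_count += 1
        start = time.perf_counter()
        try:
            return self.service.handle(request)
        finally:
            elapsed = time.perf_counter() - start
            # Logging lives in the sidecar too.
            print(f"[{self.name}] handled request in {elapsed:.6f}s")

service = Sidecar(PricingService(), "pricing")
response = service.handle({"package": "small"})
```

Because the same `Sidecar` can wrap any of the six package-costing services, none of them needs its own copy of the observability libraries.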



Quote for the day:


"The essential question is not, 'How busy are you?' but 'What are you busy at?'" -- Oprah Winfrey


Daily Tech Digest - May 30, 2019

GDPR - Data Privacy And The Cloud

The recent and rapid transition to multi-cloud networks, platforms, and applications complicates this challenge. To meet data privacy requirements in such environments, organizations need to implement security solutions that span the entire distributed network in order to centralize visibility and control. This enables organizations to provide consistent data protections and policy enforcement, see and report on cyber incidents, and remove all instances of PII on demand. Achieving this requires three essential functions: Security needs to span multi-cloud environments. Compliance standards need to be applied consistently across the entire distributed infrastructure. While privacy laws may belong to a specific region, the cloud makes it easy to cross these boundaries. ... Compliance reporting requires centralized management. Compliance reporting needs to span the entire distributed infrastructure. As with other requirements, this also demands consistent integration throughout the cloud and with the on-premises security infrastructure. Achieving this requires the implementation of a central management and orchestration solution.


Disruption, data and the changing role of the CIO

This paradigm shift is a necessary result of the accelerated pace of technological change and increased pressure to adopt emerging technologies to avoid falling behind competitors. One possible response is to cling to the old ways, that is, to slow down adoption of 4IR technologies, and to resist the democratization of technology. But the risks of this approach, tempting as it might be given the sometimes overwhelming challenges, are high. First, a rigid or cumbersome process for adopting technologies will surely mean that competitors are moving forward faster. Second, a company that resists the democratization of technology may discourage potential employees who are intellectually curious. Further, such resistance to change may limit the potential of employees by signaling that compliance is more important than creativity. While having a heavy foot on the brake is a problem, a CIO who is pushing too hard on the accelerator isn’t the solution. The temptation is understandable.


Top 10 Future Trends In Android Development You Cannot Miss In 2019

Yes! People can now command smart devices to perform basic routine activities, and these devices will interact with the machine to run, stop, and function through the internet connection. The Internet of Things (IoT) refers to the increased interconnectedness among different smart devices through the internet. It is one step ahead in device-to-machine interaction. For this, the smart devices should feature an internet connection and sensors in order to allow the device to gather, receive, and transfer information. It’s very easy to operate and control the smart TV, a toaster in the kitchen, an air conditioner in the living room or a treadmill in the gym area through smart devices. ... It’s fascinating that the wearables market is thriving and alive. Smart wearables are basically technology which is worn on the body, close to the body or in the body. There’s no doubt that trends in wearables will go a step further to get many tasks done from a single smart device. Be it playing a game from a VR glass, a smartwatch or other Android wearables, or having a mobile nurse with you to track your health through a smart belt, smartwatch or smart glasses.


Hackers targeting UK universities a threat to national security


In light of this, and the threat that research programmes are under, 10% of 75 senior IT leaders polled by Vanson Bourne research “strongly agree” that a successful attack could have a harmful impact on the lives of UK citizens. Findings also show that nearly a quarter (24%) of UK universities polled believe their security and defence research may have already been infiltrated, while over half (53%) say a cyber attack on their institution has led to research ending up in foreign hands. “British universities have long been celebrated around the world for their academic excellence, and the role they play in not only driving technological and social innovation through research, but also advances in defence and security,” said Louise Fellows, director, public sector UK and Ireland, at VMware. “Keeping pace with today’s sophisticated cyber threats is an enormous challenge. Those responsible for protecting universities and the data they hold must examine how they can evolve practices and approaches in line with an increasingly complex threat landscape, including cyber security as a consideration at every stage of the research process by design,” she said.


Natural language processing explained

Like any other machine learning problem, NLP problems are usually addressed with a pipeline of procedures, most of which are intended to prepare the data for modeling. In his excellent tutorial on NLP using Python, DJ Sarkar lays out the standard workflow: Text pre-processing -> Text parsing and exploratory data analysis -> Text representation and feature engineering -> Modeling and/or pattern mining -> Evaluation and deployment. Sarkar uses Beautiful Soup to extract text from scraped websites, and then the Natural Language Toolkit (NLTK) and spaCy to preprocess the text by tokenizing, stemming, and lemmatizing it, as well as removing stopwords and expanding contractions. Then he continues to use NLTK and spaCy to tag parts of speech, perform shallow parsing, and extract n-gram chunks for tagging: unigrams, bigrams, and trigrams. He uses NLTK and the Stanford Parser to generate parse trees, and spaCy to generate dependency trees and perform named entity recognition.
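NLTK and spaCy do the heavy lifting in Sarkar's pipeline, but the shape of the early pre-processing steps (tokenizing, dropping stopwords, extracting n-grams) can be sketched with the standard library alone. This is not Sarkar's code, and the stopword list is a tiny illustrative subset:

```python
import re

STOPWORDS = {"the", "a", "an", "is", "and", "of", "to"}  # illustrative subset

def preprocess(text):
    # Tokenize on runs of letters/apostrophes, lowercase, drop stopwords.
    tokens = re.findall(r"[a-z']+", text.lower())
    return [t for t in tokens if t not in STOPWORDS]

def ngrams(tokens, n):
    # Slide a window of size n over the token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = preprocess("The pipeline prepares the data for modeling")
print(tokens)             # ['pipeline', 'prepares', 'data', 'for', 'modeling']
print(ngrams(tokens, 2))  # [('pipeline', 'prepares'), ('prepares', 'data'), ...]
```

Real toolkits add stemming, lemmatization and contraction expansion on top of this, which is why the tutorial reaches for NLTK and spaCy rather than regexes.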


Baltimore Ransomware Attack Triggers Blame Game

The Times reports that the exploit was used numerous times, and proved very valuable for intelligence operations over a five-year period, before the agency lost control of it. Only then did the NSA alert Microsoft to the flaw, leading to it quickly issuing patches. And now Baltimore is one of the latest victims of attackers exploiting the flaw, the Times reports. The short list of who to potentially blame for the Baltimore incident now includes: the National Security Agency, for building the exploit and holding onto it for five years, without alerting Microsoft, before losing control of it; the shadowy group - maybe foreign, maybe domestic - calling itself the Shadow Brokers, which leaked the exploit in April 2017; Microsoft, for not building bug-free operating systems; the city of Baltimore, for having failed to apply an emergency Windows security update more than two years after it was released in March 2017 - and two months later for older operating systems - which blocked EternalBlue exploits in every Windows operating system from XP onward; and, of course, the attackers, whoever they might be.


HiddenWasp Malware Targets Linux Systems
In a technical report published today, Nacho Sanmillan, a security researcher at Intezer Labs, highlights several connections and similarities that HiddenWasp shares with other Linux malware families, suggesting that some of HiddenWasp's code might have been borrowed. "We found some of the environment variables used in an open-source rootkit known as Azazel," Sanmillan said. "In addition, we also see a high rate of shared strings with other known ChinaZ malware, reinforcing the possibility that actors behind HiddenWasp may have integrated and modified some MD5 implementation from [the] Elknot [malware] that could have been shared in Chinese hacking forums," the researcher added. ... Hackers appear to compromise Linux systems using other methods, and then deploy HiddenWasp as a second-stage payload, which they use to control already-infected systems remotely.


Going beyond basic cyberhygiene to protect data assets

Skills and career development can start on a small scale, through free, vendor-sponsored programs, convenient online courses, or even at the library. ... By investing in learning as a lifestyle, common challenges such as finding time to sit down and complete a training module become easier to overcome. ... The scale and scope of cybercrime grows every day—new technologies introduce new vulnerabilities faster than they can be secured, and cybercriminals continue to find new ways to attack organizations. By understanding the pattern of evolution in the cyberlandscape and adopting an intelligence-based approach, technology and security professionals can arm themselves for anything that comes their way. As tech pros continue building security skills in daily operations, they take steps beyond basic cyberhygiene. Understanding their IT environment to uncover hidden risks, educating business leaders, leveraging data to show the value of IT efforts, implementing the “right” tools, and investing in training are key to going beyond basic cyberhygiene.


The technology itself has pushed adoption to these heights, said Graham Trickey, head of IoT for the GSMA, a trade organization for mobile network operators. Along with price drops for wireless connectivity modules, the array of upcoming technologies nestling under the umbrella label of 5G could simplify the process of connecting devices to edge-computing hardware – and the edge to the cloud or data center. “Mobile operators are not just providers of connectivity now, they’re farther up the stack,” he said. Technologies like narrow-band IoT and support for highly demanding applications like telehealth are all set to be part of the final 5G spec. ... That’s not to imply that there aren’t still huge tasks facing both companies trying to implement their own IoT frameworks and the creators of the technology underpinning them. For one thing, IoT tech requires a huge array of different sets of specialized knowledge. “That means partnerships, because you need an expert in your [vertical] area to know what you’re looking for, you need an expert in communications, and you might need a systems integrator,” said Trickey.


Business Associates Reminded of HIPAA Duties

Business Associates Reminded of HIPAA Duties
"Business associates still struggle with their HIPAA Security Rule obligations, in many of the same ways as do covered entities, including with regard to risk analysis, risk management and encryption, for example," says privacy attorney Iliana Peters of the law firm Polsinelli. "Business associates struggle with understanding their obligations to flow down the requirements of their business associate agreements with their own vendors that have access to protected health information." Covered entities and business associates alike must understand the lifecycle of their data so that appropriate HIPAA-required security safeguards are applied, Peters adds. And business associates should periodically conduct "mini-audits" of their security practices to ensure they are meeting obligations spelled out in their BA agreements, she says. Even though business associates became directly liable for HIPAA compliance nearly six years ago, confusion about their duties persists. "Some BAs fail to understand the full scope of their compliance responsibilities," says Kate Borten, president of privacy and security consultancy The Marblehead Group.




Quote for the day:


"If you truly love life, don’t waste time because time is what life is made of." -- Bruce Lee


Daily Tech Digest - May 29, 2019

Is Lean IT Killing Your Digital Transformation Plans?

The first thing to keep in mind is that IT should not be in any huge hurry to significantly trim down in terms of time and technology waste. A proper framework must first be put in place that clearly outlines and categorizes technology services, how they should be implemented, supported and spun down at the end of the lifecycle. These processes should be broad enough to encompass things like technical staff/management roles, service provider requirements, lifecycle planning, quality control and lines of communication. Also keep in mind that unlike Lean manufacturing, Lean IT must take into consideration the speed at which technology advances and the volatility in what the business needs. Manufacturing is far more static in nature – and major changes can be planned for well in advance. Yet, with IT, that’s not the case. The need to adopt disruptive digital technologies can strike at lightning speed. Added to this is the fact that DX is about converting all business processes to a digitized state under the operational umbrella of the IT department. Thus, even a minor pivot in business strategy requires IT to change or add new technologies to accommodate shifting business processes.


10 years from now your brain will be connected to your computer

BMIs (Brain Machine Interfaces) are an intriguing area of research with huge potential, offering the ability to directly connect the human brain to computers to share data or control devices. Some of the work on BMI is one step away from science fiction. Probably the best-known company working on this technology today is Neuralink, the Elon Musk-backed firm that aims to develop ultra high bandwidth 'neural lace' devices to connect humans and computers. At least part of the reason for Musk's interest in the idea of brain-computer connections is that such technology could stop humans getting left behind by a (still to emerge) super-intelligent artificial intelligence. The idea is that connecting our minds directly to the AI with high bandwidth links would at least give us a chance to keep up with the conversation. However, more basic forms of BMI technology have been used in medicine for years, like cochlear implants which provide a sense of sound to a person who is profoundly deaf or severely hard of hearing.


Blockchain and sustainability
Regardless of which solution is chosen, the current underlying structure of blockchain is simply not sustainable. If cryptocurrencies and the myriad of other applications of the technology are to be used reliably and at scale, the system has to change. Ethereum has taken tangible steps towards doing this, but the chance of blockchain feasibly replacing central authorities – like banks and energy companies – remains slim. That doesn’t mean, however, that blockchain can’t be used to gradually improve transparency and trust in industries where there are environmental and ethical concerns. The tricky relationship between blockchain and sustainability demonstrates just how complex sustainable solutions can be. While blockchain has the potential to improve supply chain sustainability, it also necessitates the mammoth energy consumption required by cryptocurrencies – particularly Bitcoin. Perhaps the evolution of blockchain won’t come from the financial sector, but from the governments, organisations, and communities that use it to support sustainability.


No real change a year into GDPR, says privacy expert

Since the implementation of the GDPR, Room said there has been a “fixation” among privacy practitioners on the idea that the regulatory system needs to deliver pain and punishment to deliver change, with a great deal of discussion and focus on the potentially huge fines under the GDPR. “We are deluding ourselves about the power to change that comes from enforcement action such as fines. We should not be investing our hopes in pain if we want to deliver change,” he said, adding that already this has led many to believe GDPR is about US tech giants. “One year on, many organisations are thinking the fight is against US technology companies and not really about them. Not only is that distortion troubling, but so too is the view that pain is key to change because that suggests a fundamental failure to understand the significance and importance of the subject matter in its own right,” said Room. The focus should not be on the fines and other enforcement actions, he said, but on the fact that the GDPR is about fundamental rights and freedoms.


Deploying RPA: why DevOps and IT need more control

“Non-IT departments have targets and ambitions to transform their business and feel frustrated that IT is just trying to keep the lights on,” he said. “So when a technology like RPA comes along and it’s pitched and marketed to a business audience and they can see positive results almost immediately, it’s a no-brainer for them that they’re just going to try and run it themselves, rather than have a lengthy conversation with IT over how to best implement it or how it fits within their technology roadmap.” But should DevOps be worried? According to O’Donoghue, no. He said: “Ultimately, RPA does not take away the bulk of what DevOps and IT services teams do. There’s a whole spectrum of tasks they’re busy with, from on-the-spot patching to service development. RPA can only do a very tiny part of this. So we’re never going to see a direct competition between RPA and DevOps, which is more of a cultural methodology for IT development and operations.”


NVIDIA Launches Edge Computing Platform to Bring Real-Time AI to Global Industries

NVIDIA EGX was created to meet the growing demand to perform instantaneous, high-throughput AI at the edge — where data is created — with guaranteed response times, while reducing the amount of data that must be sent to the cloud. By 2025, 150 billion machine sensors and IoT devices will stream continuous data that will need to be processed — orders of magnitude more than is produced today by individuals using smartphones. Edge servers like those in the NVIDIA EGX platform will be distributed throughout the world to process data in real time from these sensors. ... EGX combines the full range of NVIDIA AI computing technologies with Red Hat OpenShift and NVIDIA Edge Stack, together with Mellanox and Cisco security, networking and storage technologies. This enables companies in the largest industries — telecom, manufacturing, retail, healthcare and transportation — to quickly stand up state-of-the-art, secure, enterprise-grade AI infrastructures.


DevOps for networking hits chokepoints, tangled communications


While NetOps still lags behind DevOps, major market players look to bridge that gap. Red Hat, for example, brought network automation into Ansible configuration management to enable DevOps and network teams to automate the deployment of network devices and connections in the same way they would with OSes and cloud services. Ansible Tower, a management console for Ansible Engine, can store network credentials and scale network automation, among other tasks. Collectively, these networking features are referred to as Ansible Network Automation. DevOps teams should watch to see if, or how, they evolve in light of IBM's acquisition of Red Hat. In another move, this time by an established networking vendor, F5 Networks invested in NetOps via its acquisition of Nginx, an open source app delivery platform, early in 2019. With Nginx, F5 aims to blend network management with DevOps practices, as well as strengthen its multi-cloud presence. At the time of the deal, F5 said it will meld its app and network security services with Nginx's app delivery and API management portfolio.


Perfect storm for data science in security


Another key contribution by data science is in describing the extent of an attack as well as possible through automated methods. “Detection and response go hand in hand, and so the more we can detail the extent of an attack in terms of detection, the more we can accelerate the response.” Data scientists are also working in the field of automated response, but Neil said in this regard, it is “still early days” and automated response remains highly dependent on detection capability. “You need to be very sure of your detection before you start shutting machines down because a false positive here is quite expensive for the enterprise, so this is a real challenge. “However, progress is being made, and Microsoft has some of these automated response systems deployed. But we are very careful about this. Automated response is a very long-term goal. Regardless of the hype, it is going to take us years to realise this fully.” That said, Neil believes a lot of the manual, human-driven cyber attacks by teams of well-funded attackers will start to be replaced. “I think we are going to start seeing attackers using automated decision making.”


How researchers are teaching AI to learn like a child


One of the most challenging tasks is to code instincts flexibly, so that AIs can cope with a chaotic world that does not always follow the rules. Autonomous cars, for example, cannot count on other drivers to obey traffic laws. To deal with that unpredictability, Noah Goodman, a psychologist and computer scientist at Stanford University in Palo Alto, California, helps develop probabilistic programming languages (PPLs). He describes them as combining the rigid structures of computer code with the mathematics of probability, echoing the way people can follow logic but also allow for uncertainty: If the grass is wet it probably rained—but maybe someone turned on a sprinkler. Crucially, a PPL can be combined with deep learning networks to incorporate extensive learning. While working at Uber, Goodman and others invented such a "deep PPL," called Pyro. The ride-share company is exploring uses for Pyro such as dispatching drivers and adaptively planning routes amid road construction and game days. Goodman says PPLs can reason not only about physics and logistics, but also about how people communicate, coping with tricky forms of expression such as hyperbole, irony, and sarcasm.
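The wet-grass example above can be sketched as a tiny probabilistic program. This is a minimal illustration in plain Python rather than a real PPL such as Pyro, and all of the probabilities are invented for the sake of the example; it estimates P(rain | grass is wet) by rejection sampling, the simplest form of the inference a PPL automates.

```python
import random

# Toy version of the wet-grass example: logic plus uncertainty.
# All probabilities are made up for illustration only.
def sample_world(rng):
    rained = rng.random() < 0.3        # prior: 30% chance it rained
    sprinkler = rng.random() < 0.2     # prior: 20% chance the sprinkler ran
    # Grass is almost always wet if either cause occurred.
    wet = (rained or sprinkler) and rng.random() < 0.95
    return rained, sprinkler, wet

def p_rain_given_wet(n=100_000, seed=0):
    """Estimate P(rain | grass is wet) by rejection sampling."""
    rng = random.Random(seed)
    wet_count = rain_and_wet = 0
    for _ in range(n):
        rained, _, wet = sample_world(rng)
        if wet:
            wet_count += 1
            rain_and_wet += rained
    return rain_and_wet / wet_count

print(round(p_rain_given_wet(), 2))
```

Under these made-up priors the estimate comes out around 0.68: wet grass makes rain likely, but the sprinkler leaves room for doubt, which is exactly the "follow logic but allow for uncertainty" behaviour Goodman describes.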


Effective Risk Analysis in Cybersecurity, Operational Technology and the Supply Chain


From a cybersecurity perspective, Open Standards can be used to provide a proven, consensus-based methodology for the application of quantitative risk analysis, allowing for effective measurement that offers more validity. In supply chain security, for example, the Open Trusted Technology Provider Standard exists to help providers of IT products utilize a quantitative approach to risk analysis. This enhances the manufacturer’s ability to identify how much risk is present and determine which third party is the weakest link within the supply chain. In OT environments, however, risk evaluation methodologies like Bow-tie are often used to relate hazards, threats and mitigating controls. To enhance this technique, the addition of quantitative risk measurement will enable OT decision makers to more accurately evaluate which risks are worthy of mitigation. Although the measurement and management of risk has long been recognized as an important organizational responsibility, the hyper-complexity of today’s business environment has catapulted it to the forefront of the minds of senior executives.



Quote for the day:


"It is the responsibility of leadership to provide opportunity, and the responsibility of individuals to contribute." -- William Pollard


Daily Tech Digest - May 28, 2019

How Mindfulness Drives Better Design And Innovation

Mindful by Design draws its ideas from the fields of neuroscience, evidence-based mindfulness practices, design and storytelling exercises, and more; it is important to emphasize, with intention, that there is no one right way. I work with a variety of clients, including founders of small startups, CEOs of large multinational companies, school principals, researchers, artists, inventors and educators, guiding them to use a designer mindset. ... Mindful by Design is a toolkit with approaches that invite you to become the agent of change and action, to involve yourself in the moment and to learn to appreciate the quality of what is unfolding when we fully connect. ... Mindful by Design encourages each person to connect with their deeper sense of purpose, to trust, and to go beyond perceived boundaries and divisions, creating connection and bridges. Each individual is involved and empowered as a designer of personal and collective experience, also documenting and reflecting at each stage. This is a mindfulness saying: each moment is an invitation to learn and grow.


Five industries outside of tech being changed by DevOps

Today’s warehouses are substantially more high-tech than the ones from past eras. For example, it’s common for such facilities to use a warehouse management system (WMS) that allows keeping track of all items from the time they arrive on-site to when those products get packaged and shipped to their destinations. A WMS can keep track of stock numbers, product categories and more, telling warehouse workers precisely where to find a desired item within a sprawling warehouse. Implementing a WMS into a facility for the first time is not always easy, but it can become more straightforward with help from DevOps. ... The internet has drastically changed how people research hotels, book rooms, hire special events managers and more. It’s not surprising, then, that many of the companies in the sector turned to DevOps to maintain their competitiveness. ... Depending on DevOps shortens the time required to develop and test new offerings and speeds up the time to market for those products.


Leading your organization to responsible AI

The best solution is almost certainly not to avoid the use of AI altogether—the value at stake can be too significant, and there are advantages to being early to the AI game. Organizations can instead ensure the responsible building and application of AI by taking care to confirm that AI outputs are fair, that new levels of personalization do not translate into discrimination, that data acquisition and use do not occur at the expense of consumer privacy, and that their organizations balance system performance with transparency into how AI systems make their predictions. It may seem logical to delegate these concerns to data-science leaders and teams, since they are the experts when it comes to understanding how AI works. However, we are finding through our work that the CEO’s role is vital to the consistent delivery of responsible AI systems and that the CEO needs to have at least a strong working knowledge of AI development to ensure he or she is asking the right questions to prevent potential ethical issues. In this article, we’ll provide this knowledge and a pragmatic approach for CEOs to ensure their teams are building AI that the organization can be proud of.



The future of self-service is customer-led automation — Gartner

According to Gartner, organisations are turning to naturalistic engagement methods, such as voice and other AI-powered technologies, to give customers what they want and achieve higher operational efficiency. In fact, 91% of organisations are planning to deploy AI within the next three years. And, by 2030, a billion service tickets will be raised automatically by customer-owned bots. “What’s interesting is that when we begin to look at the dynamics of self-service and continued automation by organisations over a longer time frame, cracks begin to appear,” continued Mullen. “The burden of managing and supporting self-services is being taken from today’s support staff and being pushed into customers’ hands. This level of delegation, from ‘DIY’ to customer-led AI, will be a major force shaping customer self-service.” ... “As customers embrace these DIY mindsets, they will choose providers that allow them to interact easily with these consumer-controlled touchpoints, like smart speakers and VPAs. Enterprise-provided user interfaces will increasingly play second fiddle to customer-controlled experiences,” added Mullen.



Data storage: Everything you need to know about emerging technologies

With the rapid growth of data volumes at the edge and in data centers, it is increasingly difficult to move data to processors. Instead, processing is moving to the storage. There are two different ideas covered under the rubric of intelligent storage. At the edge, data pre-processing and reduction, perhaps using machine learning, reduces bandwidth requirements to data centers. In big data applications, sharing a pool of storage and/or memory allows as many processors as needed to share the data required to achieve the target performance. These concepts are currently labeled intelligent storage by HPE, Dell/EMC, and NGD Systems. It goes beyond the optimizations built into storage array controllers that manage issues with disk latency or access patterns. Call it storage intelligence v2. Consider a petabyte rack of fast, dense, non-volatile memory, attached to dozens of powerful CPUs in the next rack. With proper synchronization and fine-grained locking, thousands of VMs could operate on a massive data pool, without moving hundreds of terabytes across a network.


What new collaborations will you be doing in Microsoft's Fluid Framework?


Patton describes the Fluid framework as "A new distributed data structure platform that allows for hyper-performant scenarios with AI included. Think about it as the ability to have, say, simultaneously 18 different people that are around the world in different geographies with not just real-time collaboration, but AI translations happening at the same time in sub milliseconds." In other words, don't think of SharePoint as slow or clunky, or just an intranet site and document library: think of it as "a new hyper-fast and performant cloud platform that has AI built into it." What you work with through that SharePoint storage layer and distributed data structure isn't just a standard Office document; it's an Office document broken up into pieces — "components that can then be shared across other apps that have the ability to collaborate within the end points [with the changes] coming back to the original file." So a 'compound' Word document might include a component that's a table someone can be editing in the Word document, but that can also be shared into a Teams conversation where someone else can be adding more information.


When event-driven messaging is the right choice


With cloud integration, APIs are the prevailing mechanism. But let's say you deploy your CRM, such as Salesforce, in the cloud. First, you need to upload data, such as customer data, into the new CRM system. This is typically a batch process because you can't call an API a million times to populate the customer database in the CRM. So batch data integration is used frequently. We also see varied event-based scenarios where an application sends out a notification and all the applications [that integrate with it] receive the information in parallel. Instead of using the classic request-reply paradigm that [exists] when you use APIs, event technology lets you implement what is called a fire-and-forget mechanism: I send you a message and you receive it when you receive it. A good example of these event processes is the notifications you get on your mobile device. Occasionally, a notification pops up to tell you, for example, that your plane is delayed. This is classic event processing -- I send you a message and you do whatever you want with my message. But when I send you the message, I'm done.
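The fire-and-forget, publish/subscribe pattern described above can be sketched with a toy in-process event bus. This is only an illustration; a real deployment would use a message broker (Kafka, MQTT, a cloud notification service, and so on), and the topic name below is invented.

```python
from collections import defaultdict

class EventBus:
    """Toy in-process publish/subscribe bus."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Fire and forget: the publisher neither waits for a reply nor
        # knows who the receivers are; every subscriber gets the event.
        for handler in self._subscribers[topic]:
            handler(message)

bus = EventBus()
received = []
# Two independent consumers react to the same event in parallel,
# e.g. a mobile push notification and an SMS gateway.
bus.subscribe("flight.delayed", lambda msg: received.append(("push", msg)))
bus.subscribe("flight.delayed", lambda msg: received.append(("sms", msg)))
bus.publish("flight.delayed", {"flight": "XY123", "delay_min": 45})
print(received)
```

Contrast this with request-reply over an API: here the sender is "done" the moment `publish` returns, and each receiver decides independently what to do with the message.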


Amazon Is Working on a Device That Can Read Human Emotions

The notion of building machines that can understand human emotions has long been a staple of science fiction, from stories by Isaac Asimov to Star Trek’s android Data. Amid advances in machine learning and voice and image recognition, the concept has recently marched toward reality. Companies including Microsoft Corp., Alphabet Inc.’s Google and IBM Corp., among a host of other firms, are developing technologies designed to derive emotional states from images, audio data and other inputs. Amazon has discussed publicly its desire to build a more lifelike voice assistant. The technology could help the company gain insights for potential health products or be used to better target advertising or product recommendations. The concept is likely to add fuel to the debate about the amount and type of personal data scooped up by technology giants, which already collect reams of information about their customers. Earlier this year, Bloomberg reported that Amazon has a team listening to and annotating audio clips captured by the company’s Echo line of voice-activated speakers.


US Senate passes anti-robocalling bill


If the bill makes it through the House and is signed into law, it will empower the Federal Communications Commission (FCC) to inflict hefty new fines – as much as $10,000 per call – for illegal robocalls. The legislation would also increase the statute of limitations for bringing such cases, thereby giving FCC regulators more time to track down offenders. The act would also create an interagency task force to address the problem, and it would push carriers like AT&T and Verizon to deploy call authentication systems, such as the pending STIR/SHAKEN call identification protocols, into their networks. That’s now in the works: in September 2018, the Alliance for Telecommunications Industry Solutions (ATIS) announced the launch of the Secure Telephone Identity Governance Authority (STI-GA), designed to ensure the integrity of the STIR/SHAKEN protocols. That move paved the way for the remaining protocols to be established. 


Goodbye Passwords: Hello Identity Management


By 2022 there will be an estimated 29 billion connected devices, of which 18 billion will be related to IoT, according to a recent report by telecommunications firm Ericsson. Many of those connected things, plus the mobile apps and autonomous processes that drive them, will need new IAM solutions. “Identity and access management can depend on a lot of different things,” said Noam Liran, director of customer success at CyberArk. “It used to be just based on [the question of], does that identity have a password. Now, companies need to manage identities of microservices, cloud containers and mobile apps seeking access to privileged data in the cloud.” Liran added that even a website with a simple chat system needs access management. “A customer-service chatbot can be another form of identity to manage,” he said. “We have customers who are using a chatbot to grab tracking numbers from UPS or FedEx deliveries and then push the shipping data into a database.” Each one of those interactions requires a privileged relationship.



Quote for the day:


"True leaders bring out your personal best. They ignite your human potential" -- John Paul Warren


Daily Tech Digest - May 27, 2019

No cloud required: Why AI’s future is at the edge

More compact and capable software is paving the way for AI at the edge as well. Google LLC, for instance, debuted its TensorFlow Lite machine learning library for mobile devices in late 2017, enabling smart cameras to identify wildlife or imaging devices to make medical diagnoses even where there’s no internet connection. Some 2 billion mobile devices now have TensorFlow Lite deployed on them, Google staff research engineer Pete Warden said in a keynote presentation at the Embedded Vision Summit. And in March, Google rolled out an on-device speech recognizer to power speech input in Gboard, Google’s virtual keyboard app. The automatic speech recognition transcription algorithm is now down to 80 megabytes, so it can run on the Arm Ltd. A-series chip inside a typical Pixel phone, and that means it works offline, with no network latency or spottiness. Not least, rapidly rising privacy concerns about data traversing the cloud mean there’s also a regulatory reason to avoid moving data off the devices. “Virtually all the machine learning processing will be done on the device,” said Bier.



DDoS: a weapon of mass disruption

The five protocols most commonly used in attacks were the Domain Name System (DNS), the Network Time Protocol (NTP), the Simple Service Discovery Protocol (SSDP), the Simple Network Management Protocol (SNMP) and the Trivial File Transfer Protocol (TFTP), the last of which is a new entrant into the top five. So, as new protocols are being highlighted as the source of DDoS weapons, and the total number of attacks looks set to grow, what security measures can be taken? Cybersecurity companies compile millions-strong inventories of DDoS weapons, allowing blacklisted IP addresses to be blocked. Shin says that A10 Networks can create up to 96 million entries in a blacklist. “If you can get ahead and identify them, we can use this as a strategy to prevent DDoS attacks,” says Shin. A10 Networks and its partners use several approaches, including tracking bot-herders, analysing forensic data, scanning the internet for weapons signatures and tapping networks. Shin says it is important to have an “actionable defence”.
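At its core, the blacklist defence described above is a membership test on the source address of each incoming packet. A minimal sketch, with example addresses drawn from the RFC 5737 documentation ranges:

```python
# Toy sketch of blacklist filtering: drop traffic whose source IP is a
# known DDoS weapon. Real inventories run to tens of millions of
# entries, so an O(1) structure such as a hash set (or a Bloom filter
# in front of one) is essential.
blacklist = {"198.51.100.7", "203.0.113.42"}  # example entries only

def allow_packet(src_ip: str) -> bool:
    """Return True if the packet should be forwarded."""
    return src_ip not in blacklist

print(allow_packet("203.0.113.42"))  # listed weapon -> blocked (False)
print(allow_packet("192.0.2.1"))     # not listed    -> allowed (True)
```

In production this check sits in the data plane (firewall, scrubbing appliance, XDP program), but the logic is the same: get ahead of the attack by identifying the weapons first.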


U.S. Airports Will Use AI To Scan 97% Of Passengers' Faces Within 4 Years

The AI system has already been placed in 15 airports across the U.S. It has been tested on more than 15,000 flights and identified over 7,000 travelers who overstayed their visas. CBP calculates that 666,582 passengers who arrived by plane or boat overstayed visas in fiscal 2018. The main goal of the airport scans is to catch those who have overstayed their visas. For the past few years, overstayers have represented a majority of undocumented immigrants, larger than those who enter the country illegally. However, not everyone is thrilled about this venture. Critics argue that this use of AI is an invasion of privacy, and there is concern about how this information could be used outside the airport. With facial recognition data on so many people, it could be exploited by hackers or given to law enforcement and used unlawfully. The documents released by President Trump explicitly said there were no limits on how partnering airlines can use this facial recognition data. CBP did not answer specific questions about whether there are any guidelines for how other technology companies involved in processing the data can potentially also use it.


VMware talks up multi-cloud era, need to transform security

"How do you make 250 security products work [together]? It's insanity," Gelsinger quipped, noting that 80 percent of security budgets were being spent on detection and response, as opposed to prevention. He called for the need to help lower enterprises' attack surface and build the underlying infrastructure to prevent security incidents from happening in the first place.  Again, VMware was looking to provide the tools to help simplify this and enable its customers to better manage their security requirements. Last August, the vendor introduced VMware Secure State to automate configuration security and compliance monitoring in native cloud environments.  Rima Olinger, AWS's global alliance lead, also spoke at the forum to pitch the cloud platform's partnership with VMware Cloud, which she said had been adopted by enterprises across various sectors including financial services and healthcare.  VMware Cloud on AWS recently launched in Singapore and also was available in Sydney and Tokyo, according to Olinger.


Top 10 Cybersecurity Risks For 2019

Unfortunately, cloud storage is susceptible to abuse. A large risk factor is that Infrastructure as a Service (IaaS), which is responsible for functionality, has no secure registration process. What does that imply? Provided you have a credit card, you can sign up and start using the cloud immediately. That simplicity, in turn, makes the cloud vulnerable to spam, criminals, and other malicious attacks. To mitigate the situation, it is advisable that cloud service providers develop stronger authentication and registration processes. Additionally, they should have a way of monitoring credit card transactions. A thorough evaluation of network traffic is also crucial in eliminating cyber abuse. ... Shadow IT is software used within an organization but not supported by the company’s central IT system. What causes a breach in shadow IT is the fact that the risk of data loss does not receive much attention when it comes to data backups. More so, there is no control over who gets to access the data. Also, the backup and recovery processes have no one to monitor them.


CMO & CIO Collaboration- Integrating The Best Of C-Suite Management


To achieve new pinnacles of customer delight, the CMOs and CIOs know that it’s time for collaboration. Mature collaborations follow similar paths of evolution, transitioning from a role-specific focus to broader internal partnerships to integrated teams. As the data grows in an enterprise, the CMOs are turning to the CIOs to make sense of this information with a common goal to increase revenue in a dynamic, competitive era. The CIOs have a continuous role to play in turning new technology into revenue. They need the CMOs to help them meet the customer’s demand for this intelligent information. Thus, the CIOs and CMOs need to work together to turn all this data into growth numbers. As the worldwide volume of data grows at least 40 percent a year, the CMOs and CIOs have reached a stage where they depend on each other in a much more collaborative manner than ever before. That’s why many CMOs are waking up to the fact that IT can’t be treated like a back-office function anymore; rather, the CIO is becoming a strategic partner who is crucial to developing and executing marketing strategy.


Most enterprise IoT transactions are unencrypted

Researchers looked through one month’s worth of enterprise traffic traversing Zscaler’s cloud seeking the digital footprints of IoT devices. It found and analyzed 56 million IoT-device transactions over that time, and identified the type of devices, protocols they used, the servers they communicated with, how often communication went in and out and general IoT traffic patterns. The team tried to find out which devices generate the most traffic and the threats they face. It discovered that 1,015 organizations had at least one IoT device. The most common devices were set-top boxes (52 percent), then smart TVs (17 percent), wearables (8 percent), data-collection terminals (8 percent), printers (7 percent), IP cameras and phones (5 percent) and medical devices (1 percent). While they represented only 8 percent of the devices, data-collection terminals generated 80 percent of the traffic. The breakdown is that 18 percent of the IoT devices use SSL to communicate all the time, and of the remaining 82 percent, half used it part of the time and half never used it.
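The SSL-usage figures quoted above imply a simple overall breakdown, worked out here: 18 percent of devices always encrypt, and the remaining 82 percent split evenly between part-time and never.

```python
# Overall SSL-usage shares implied by the Zscaler figures: 18% of IoT
# devices always use SSL; the remaining 82% split half/half between
# using it part of the time and never using it.
always = 0.18
part_time = never = (1 - always) / 2

shares = {"always": always, "part-time": part_time, "never": never}
print({k: round(v, 2) for k, v in shares.items()})
```

That is, roughly 41 percent of the devices encrypt only some of their traffic and another 41 percent encrypt none of it, which is what makes the headline claim, that most enterprise IoT transactions are unencrypted, plausible.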


Beware of email lawsuit scam, an Android missed call con

A new kind of spam may be coming to Android phone users. The news site Bleeping Computer reports that a security company has discovered a campaign that tries to trick users with a message that says “Missed call.” One version suggests you’re going to get a new iPhone, or that there’s some sort of reward. The idea is to get you to click on an image or a link. Don’t fall for these scams. If you don’t know who a call is from, delete the message. Finally, some countries do a better job of filtering out malicious email than others. That’s one of the findings of a British information site called Merchant Machine. Thirty-six per cent of the email in Brazil carried malware, according to its research. Mexico was second with a rate of almost 30 per cent. By comparison, almost nine per cent of email in the U.S. was malicious, and almost five per cent in China. The lowest was 3.6 per cent in the Middle Eastern country of Oman. Still, in all of the countries studied, at least 60 per cent of email was spam. The security industry and internet service providers have to do better.


Fog computing vs. edge computing: What's the difference?


According to OpenFog, fog computing, which is also called fog networking and fogging, standardizes cloud extension out to the edge, encompassing all the space and activity between the two. Edge computing, in this case, is more limited in scope, as it refers to individual, predefined instances of computational processing that happen at or near network endpoints. With this paradigm, edge computing cannot create direct network connections between two endpoints or between an endpoint and an IoT gateway on its own; for that, it needs fog. ... Still other IT pros say the use of fog computing vs. edge computing depends specifically on the location of the distributed compute and storage resources. If processing capabilities are embedded directly within a connected endpoint, they call that edge computing. But if intelligence resides in a separate network node stationed between an endpoint and the cloud, such as a local node or IoT gateway, then it's fog.


Being an Ethical Software Engineer

What can we do if we care about ethics and want to bring it more into our practice? The main thing is probably to keep an open mind and keep asking questions. This is what it’s mostly about: asking questions. Thinking about what we do, how it affects other people, and whether we are happy with that effect. We’re lucky because we’re in a needed profession, and we have the ability to take a stand and be heard. We need to raise those questions when we encounter them, start these conversations, and even if we don’t have answers, at least bring the issues up, get people involved, raise awareness. Another powerful tool we have is choosing who we work for. There might always be some compromises when it comes to business priorities, but we can at least avoid helping the obviously ‘evil’ ones: companies that exploit their users, or working in questionable fields. Early on in my career, I had a very short period of working for a company in the online gambling industry. The ease I felt leaving the job made it clear to me that feeling good about where you work, and knowing that your efforts aren’t contributing to damaging society, is priceless.



Quote for the day:


"We cannot choose our external circumstances, but we can always choose how we respond to them." -- Epictetus


Daily Tech Digest - May 26, 2019

How is AI benefiting industries throughout Southeast Asia?

AI in agriculture is mainly used for precision farming, livestock monitoring, drone analytics and agriculture robots. Precision farming was the most widely used application in 2018, taking up about 35.6% of the global total. However, agriculture robots are expected to have a bigger share in the future. Speaking at a recent seminar titled “Connecting Manufacturing Industry with AI Technology”, Dr Siridej Boonsaeng, Dean of the College of Advanced Manufacturing Innovation, King Mongkut's Institute of Technology Ladkrabang, said that AI is presenting the agricultural sector in Thailand with great opportunities. "Self-driving farm vehicles and the process of sorting and grading agricultural products which involve complicated factors of random shape and variation are suitable tasks for AI to replace human when required," he said. During last year’s Grow Asia Forum, Vietnam’s Deputy Prime Minister Trịnh Đình Dũng called for the private sector to get more involved with cutting edge technologies in the 4.0 revolution in a bid to transform the agriculture industry of Southeast Asian countries.


Lack of Secure Coding Called a National Security Threat

The lack of secure coding is a pervasive and serious threat to national security, according to a new paper from the Institute for Critical Infrastructure Technology, a cybersecurity think tank. Rob Roy, an ICIT fellow who was co-author of the report, suggests in an interview with Information Security Media Group that an app standards body could play an important role in improving app security. "If there were some objective standards put in place that all software would have to abide by, then we could start to make progress," Roy says. "It may just be that there needs to be an objective standard ... and a legislative mandate that requires a certain level of assurance to provide an assured product." The "call to action" report, "Software Security Is National Security: Why the U.S. Must Replace Irresponsible Practices with a Culture of Institutionalized Security," discusses systemic issues with the software development landscape and what needs to be done to rectify the problem of negligent coding. But solving the problem won't be easy, given the problems of speed-to-market pressures and the sheer number of IoT devices being produced, the report notes.


Telangana launches draft Blockchain Policy

According to the draft framework, the Blockchain District will house all major blockchain technology companies; will have a huge incubator and a world-class facility for promoting research, innovation and industry collaboration. This one-of-its-kind initiative aims to put all blockchain companies based out of Hyderabad at a strategically advantageous position globally. The major highlights of the draft policy include the development of talent pool by collaborating with industries; creation of shared infrastructure facilities that can be used by startups, industry, academia and communities; promotion of research and innovation and enable collaboration and focus on community building. Other than these, the policy also looks at providing incentives and subsidies to enterprises, startups and other entities. While for enterprises, 25 per cent subsidy on lease rentals and 50 per cent subsidy on exhibition rentals will be provided, startups will get 100 per cent reimbursement of State GST, R&D grant of up to 10 per cent, one time grant of Rs 10 lakh to 10 blockchain startups per year for three years and patent filing assistance.


China's cybersecurity move is designed to both echo and address the U.S. sanctions against Huawei, as well as the country's leading surveillance equipment makers, including HikVision and Dahua. Under the new terms of reference, organizations within the country including network operators, IT services providers and even financial services companies, would need to conduct "comprehensive analysis and evaluation of risks brought about by national security." Nick Marro, a Hong Kong-based analyst with The Economist Intelligence Unit, told SCMP that "the regulatory opacity means that officials have quite a lot of flexibility in how they want to implement this - meaning it could be applied to U.S. firms in a way that embodies ‘qualitative measures’ as part of China’s trade war response." The goal, claims the Administration in its consultation document, is to "promote the application of advanced technologies, enhancing fairness and transparency, and protecting intellectual property rights." The Central Network Security and Informatization Committee "will take the national lead."


E-Commerce Skimming Attacks Evolve Into iFrame Injection


"What we notice are new fields to enter credit card data that did not exist on the left (untampered form)," Segura writes in a blog post. "By itself, this may not be out of the ordinary since online merchants do use such forms - including iFrame - as part of their checkout pages." Essentially, the iFrame jumps ahead in line. Although all PHP pages within the e-commerce site were infected, Segura says that the iFrame would only be triggered "if the current URL in the address bar is the shopping cart checkout page (onestepcheckout)." Apparently to help the malicious code avoid detection, "some extra checks (screen dimensions and presence of a web debugger) are also performed before continuing." The JavaScript that draws the iFrame comes from a domain, thatispersonal[.]com, which is hosted in Russia, Segura writes. Another script is used to process and validate the data. Once the victim enters their card details, the "data is sent via a POST request to the same malicious domain in a custom-encoded format." Finally, after that occurs, the user is directed to the proper PSP, where they can pay.


Why Blockchain-based Governance Requires In-Person Identity Verification

Any system of governance is vulnerable to this, so designing one that is “collusion-safe” is critical. That is easier said than done, because there is a second component to the game-theoretic design of a collusion-safe system, and it relates to identity: being “identity-free.” If everyone in the system is anonymous, collusion becomes much, much more difficult. Imagine how difficult it would be for OPEC to exist if no one knew who the producers of oil were. Identity-free systems, however, bring up another issue: they are vulnerable to manipulation in a number of ways, such as bribery. You can read through the whole post, but the net of it is that, in Vitalik’s mind, it is just not realistic to have a system that is both collusion-safe AND identity-free. It won’t work. Since compromising on collusion-safety is a non-starter, identity must be a key fixture. Furthermore, Vitalik suggests that the only realistic solution for identity is, ironically, in-person verification. At the end of the day, you need to show up and prove you are who you say you are.


Deep learning explained

While you could write deep learning programs from first principles, it’s far more efficient to use deep learning frameworks, especially given that they have been optimized for use with GPUs and other accelerators. The pre-eminent framework is TensorFlow, which originated at Google. The favored high-level API for TensorFlow is Keras, which can also be used with other back-end frameworks. PyTorch, from Facebook and others, is a strong alternative to TensorFlow, and has the distinction of supporting dynamic neural networks, in which the topology of the network can change from epoch to epoch. Fastai is a high-level third-party API that uses PyTorch as a back end. MXNet, from Amazon and others, is another strong alternative to TensorFlow, with a claim to better scalability. Gluon is the preferred high-level imperative API for MXNet. Chainer, from Preferred Networks (with contributions from IBM, Intel, and others), was in some ways the inspiration for PyTorch, given that it pioneered the define-by-run approach and supports dynamic neural networks.
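The define-by-run idea behind PyTorch and Chainer can be illustrated without either framework: the network is defined by ordinary host-language control flow as it runs, so its topology can differ from one forward pass to the next. Below is a minimal, framework-free Python sketch of the concept; the function and its toy "layer" are illustrative inventions, not any library's API.

```python
def dynamic_forward(x, depth):
    """A toy define-by-run network: the graph topology (number of
    layers applied) is decided by ordinary control flow at call time."""
    h = x
    for _ in range(depth):           # topology varies with `depth`
        h = max(0.0, 0.8 * h + 1.0)  # a fixed toy layer: ReLU(0.8h + 1)
    return h

# The same function yields computation graphs of different depths on
# different calls -- the dynamic behavior that static-graph frameworks
# historically could not express without special constructs.
shallow = dynamic_forward(2.0, depth=1)
deep = dynamic_forward(2.0, depth=4)
```

In PyTorch the same pattern holds with tensors: a Python `for` loop or `if` statement inside a model's `forward` method reshapes the autograd graph on every call.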


AI and machine learning driving skills revolution in business intelligence

An analyst’s role has been elevated over time, delivering a much more integral business impact. The research found a marked rise in the need for business skills (up 76% in the last five years versus 2009-2014), problem-solving (112%) and verbal communication skills (19%). Meanwhile, the need for Microsoft Excel (-49%), quantitative (-69%) and data analysis (-16%) skills has fallen considerably. Interpreting the results, Sir Cary Cooper, professor of organisational psychology at Alliance Manchester Business School, said: “In the future, the business analyst will be a different person. With AI and machine learning picking up mundane number-crunching work, the role now requires more innovative and original thinking. “This poses a challenge for employers in making sure they have people with the right skills in their workforce. As new technology comes into play, employers will need to re-evaluate the skills of their employees and develop training and recruitment practices that can make the most of the opportunities available.”


Why office spaces should include Wi‑Fi‑enabled buses


If enough businesses were to offer such services, the environmental benefit could be tremendous. Cars generally emit around 0.7 pounds of carbon dioxide per mile; in heavy traffic, that can rise to two pounds. With 50 seats, a full bus could prevent up to 100 pounds of carbon dioxide emissions for every mile of its route. Now imagine that such services used the electric-battery buses that are slowly beginning to roll out on the roads. One study has found they’re 2.5 times cleaner than diesel buses on average. And in areas where hydro, wind, and solar account for a great deal of electricity, like the West Coast, the fuel powering these buses could be largely free of emissions. Of course, some employees might regard this as simply one more benefit — like the provision of free food and snacks on-site — aimed at tethering people more closely to their offices and promulgating an always-on culture. And that danger certainly exists. But the power of Wi-Fi-enabled buses lies in the option they offer employees: a way to spend part of a standard workday, not extra hours on top of it.
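The back-of-envelope arithmetic above can be written out directly. This sketch uses the article's own figures (0.7 to 2 pounds of CO2 per car-mile, 50 bus seats) and assumes every rider would otherwise drive alone; it ignores the bus's own emissions, so the result is an upper bound.

```python
def co2_avoided_per_mile(riders, lb_co2_per_car_mile):
    """Pounds of CO2 avoided per bus-mile if each rider replaces
    one single-occupancy car (the bus's own tailpipe is ignored)."""
    return riders * lb_co2_per_car_mile

free_flowing = co2_avoided_per_mile(50, 0.7)   # roughly 35 lb per mile
heavy_traffic = co2_avoided_per_mile(50, 2.0)  # up to 100 lb per mile
```

The heavy-traffic case reproduces the article's "up to 100 pounds" figure; in free-flowing traffic the saving is closer to 35 pounds per mile.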


How Security Vendors Can Address the Cybersecurity Talent Shortage

While most efforts to address the talent shortage are centered on expanding technical skills to fill cybersecurity jobs, we need to be aware that the cybersecurity skills gap goes far beyond the job market for cybersecurity professionals. One of the biggest cyber-risks in today's workplace is a general lack of awareness of even the most basic attacks, such as phishing emails and other social engineering techniques. And that is due to a failure in understanding that cybersecurity is everyone's job, and organizations need training and education programs that address many different audiences. What cybersecurity vendors are usually quite good at is creating training programs to equip customers and partners with the knowledge and skills required to operate their own products. This is certainly critical as cybersecurity solutions become more sophisticated.  ... Formal programs are a necessary element to filling the skills gap, but a comprehensive training and education strategy must include strategic partnerships within government, academia, and NGOs.



Quote for the day:

"Real leadership is being the person others will gladly and confidently follow." -- John C. Maxwell