Daily Tech Digest - July 28, 2021

DevOps Is Dead, Long Live AppOps

The NoOps trend aims to remove the friction between development and operations by, as the name suggests, simply removing operations. This may seem a drastic solution, but we do not have to take it literally. The right interpretation — the feasible one — is to remove as much of the human component as possible from the deployment and delivery phases. That approach is naturally supported by the cloud, which helps things work by themselves. ... One of the clearest scenarios showing the benefit of AppOps is any application based on Kubernetes. If you open any cluster, you will find a lot of pod/service/deployment settings that are mostly the same. In fact, every PHP application has the same configuration, except for its parameters. The same goes for Java, .NET, and other applications. The problem is that Kubernetes is agnostic to the content of the applications it hosts, so it has to be told every detail. We have to start from the beginning for every new application, even if the technology is the same. Why? I should have to explain only once how a PHP application is composed.
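
That once-only idea can be made concrete with a small, purely illustrative sketch (the names, registry, and defaults below are assumptions, not from the article): encode the shape of a PHP deployment a single time, then stamp out Kubernetes manifests that differ only in their parameters.

```python
# Illustrative sketch: describe "how a PHP application is composed" once,
# then generate per-app Kubernetes manifests that differ only in parameters.
from typing import Dict, Optional

def php_deployment(name: str, image: str, replicas: int = 2,
                   env: Optional[Dict[str, str]] = None) -> dict:
    """Return a standard Deployment manifest for any PHP app."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"app": name, "stack": "php"}},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 80}],
                        "env": [{"name": k, "value": v}
                                for k, v in (env or {}).items()],
                    }]
                },
            },
        },
    }

# Two PHP apps, one shared definition -- only the parameters change.
shop = php_deployment("shop", "registry.example/shop:1.4", env={"APP_ENV": "prod"})
blog = php_deployment("blog", "registry.example/blog:2.0", replicas=1)
```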


Thrill-K: A Blueprint for The Next Generation of Machine Intelligence

Living organisms and computer systems alike must have instantaneous knowledge to allow for rapid response to external events. This knowledge represents a direct input-to-output function that reacts to events or sequences within a well-mastered domain. In addition, humans and advanced intelligent machines accrue and utilize broader knowledge with some additional processing. I refer to this second level as standby knowledge. Actions or outcomes based on standby knowledge require processing and internal resolution, which makes it slower than instantaneous knowledge but applicable to a wider range of situations. Humans and intelligent machines also need to interact with vast amounts of world knowledge so that they can retrieve the information required to solve new tasks or increase standby knowledge. Whatever the scope of knowledge within the human brain or the boundaries of an AI system, there is substantially more relevant information outside it that warrants retrieval. We refer to this third level as retrieved external knowledge.
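
A rough sketch of how the three levels might be layered in code; every name here is hypothetical, with toy handlers standing in for real models and retrieval systems:

```python
# Hypothetical sketch of the three Thrill-K knowledge levels.
class ThrillK:
    def __init__(self, reflexes, reasoner, search):
        self.reflexes = reflexes    # instantaneous: direct input->output mapping
        self.reasoner = reasoner    # standby: slower internal processing
        self.search = search        # retrieved external knowledge

    def answer(self, query):
        # Level 1: instantaneous knowledge -- immediate reaction.
        if (out := self.reflexes.get(query)) is not None:
            return out
        # Level 2: standby knowledge -- needs processing and resolution.
        if (out := self.reasoner(query)) is not None:
            return out
        # Level 3: retrieve from vast external knowledge, then learn from it.
        fact = self.search(query)
        self.reflexes[query] = fact   # promote into a faster level over time
        return fact

agent = ThrillK(
    reflexes={"2+2": "4"},
    reasoner=lambda q: "42" if "meaning" in q else None,
    search=lambda q: f"looked up: {q}",
)
print(agent.answer("2+2"), agent.answer("capital of France"))
```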


GitHub’s Journey From Monolith to Microservices

Good architecture starts with modularity. The first step towards breaking up a monolith is to think about the separation of code and data based on feature functionalities. This can be done within the monolith before physically separating them in a microservices environment. It is generally a good architectural practice to make the code base more manageable. Start with the data and pay close attention to how they’re being accessed. Make sure each service owns and controls access to its own data, and that data access only happens through clearly defined API contracts. I’ve seen a lot of cases where people start by pulling out the code logic but still rely on calls into a shared database inside the monolith. This often leads to a distributed monolith scenario where it ends up being the worst of both worlds - having to manage the complexities of microservices without any of the benefits, such as being able to quickly and independently deploy a subset of features into production. Getting data separation right is a cornerstone in migrating from a monolithic architecture to microservices.
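
The difference between reaching into a shared database and going through an owned API is easy to see in miniature. A hedged sketch, in which the service names, schema, and pricing logic are all invented for illustration:

```python
# Illustrative contrast: each service owns its data and exposes it only
# through an explicit API contract. The db handle here is hypothetical.

# Anti-pattern: a "distributed monolith" -- shipping code reaches directly
# into a table owned by the orders service.
def shipping_cost_bad(shared_db, order_id):
    (weight,) = shared_db.execute(
        "SELECT weight FROM orders WHERE id = ?", (order_id,))
    return weight * 0.5  # silently coupled to the orders schema

# Better: only OrdersService touches its own store; consumers depend on
# the published contract, so the schema behind it can change freely.
class OrdersService:
    def __init__(self, own_db):
        self._db = own_db  # private: no other service queries this directly

    def get_order_weight(self, order_id: int) -> float:
        """Part of the published API contract."""
        (weight,) = self._db.execute(
            "SELECT weight FROM orders WHERE id = ?", (order_id,))
        return weight

def shipping_cost(orders_api: OrdersService, order_id: int) -> float:
    return orders_api.get_order_weight(order_id) * 0.5
```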


Data Strategy vs. Data Architecture

By being abstracted from the problem solving and planning process, enterprise architects became unresponsive, he said, and “buried in the catacombs” of IT. Data Architecture needs to look at finding and putting the right mechanisms in place to support business outcomes, which could be everything from data systems and data warehouses to visualization tools. Data architects who see themselves as empowered to facilitate the practical implementation of the Business Strategy by offering whatever tools are needed will make decisions that create data value. “So now you see the data architect holding the keys to a lot of what’s happening in our organizations, because all roads lead through data.” Algmin thinks of data as energy, because stored data by itself can’t accomplish anything, and like energy, it comes with significant risks. “Data only has value when you put it to use, and if you put it to use inappropriately, you can create a huge mess,” such as a privacy breach. Like energy, it’s important to focus on how data is being used and have the right controls in place. 


Why CISA’s China Cyberattack Playbook Is Worthy of Your Attention

In the new advisory, CISA warns that the attackers will also compromise email and social media accounts to conduct social engineering attacks. A person is much more likely to click on an email and download software if it comes from a trusted source. If the attacker has access to an employee's mailbox and can read previous messages, they can tailor their phishing email to be particularly appealing – and even make it look like a response to a previous message. Unlike “private sector” criminals, state-sponsored actors are more willing to use convoluted paths to get to their final targets, said Patricia Muoio, former chief of the NSA’s Trusted System Research Group, who is now general partner at SineWave Ventures. ... Private cybercriminals look for financial gain. They steal credit card information and health care data to sell on the black market, hijack machines to mine cryptocurrencies, and deploy ransomware. State-sponsored attackers are after different things. If they plan to use your company as an attack vector to go after another target, they'll want to compromise user accounts to get at their communications.


Breaking through data-architecture gridlock to scale AI

Organizations commonly view data-architecture transformations as “waterfall” projects. They map out every distinct phase—from building a data lake and data pipelines up to implementing data-consumption tools—and then tackle each only after completing the previous ones. In fact, in our latest global survey on data transformation, we found that nearly three-quarters of global banks are knee-deep in such an approach. However, organizations can realize results faster by taking a use-case approach. Here, leaders build and deploy a minimum viable product that delivers the specific data components required for each desired use case. They then make adjustments as needed based on user feedback. ... Legitimate business concerns over the impact any changes might have on traditional workloads can slow modernization efforts to a crawl. Companies often spend significant time comparing the risks, trade-offs, and business outputs of new and legacy technologies to prove out the new technology. However, we find that legacy solutions cannot match the business performance, cost savings, or reduced risks of modern technology, such as data lakes.


Data-Intensive Applications Need Modern Data Infrastructure

Modern applications are data-intensive because they make use of a breadth of data in more intricate ways than anything we have seen before. They combine data about you, about your environment, about your usage and use that to predict what you need to know. They can even take action on your behalf. This is made possible because of the data made available to the app and data infrastructure that can process the data fast enough to make use of it. Analytics that used to be done in separate applications (like Excel or Tableau) are getting embedded into the application itself. This means less work for the user to discover the key insight or no work as the insight is identified by the application and simply presented to the user. This makes it easier for the user to act on the data as they go about accomplishing their tasks. To deliver this kind of application, you might think you need an array of specialized data storage systems, ones that specialize in different kinds of data. But data infrastructure sprawl brings with it a host of problems.  


The Future of Microservices? More Abstractions

A couple of other initiatives regarding Kubernetes are worth tracking. Jointly created by Microsoft and Alibaba Cloud, the Open Application Model (OAM) is a specification for describing applications that separates the application definition from the operational details of the cluster. It thereby enables application developers to focus on the key elements of their application rather than the operational details of where it deploys. Crossplane is the Kubernetes-specific implementation of the OAM. It can be used by organizations to build and operate an internal platform-as-a-service (PaaS) across a variety of infrastructures and cloud vendors, making it particularly useful in multicloud environments, such as those increasingly common in large enterprises through mergers and acquisitions. Whilst OAM seeks to separate out the responsibility for deployment details from writing service code, service meshes aim to shift the responsibility for interservice communication away from individual developers via a dedicated infrastructure layer that focuses on managing the communication between services using a proxy.
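
The split OAM describes can be pictured with a small sketch. The field names below only gesture at the idea and are not the actual OAM schema:

```python
# Rough sketch of the separation OAM aims for; field names are indicative
# of the idea only, not the real OAM specification.

# The developer describes WHAT the application is...
application = {
    "name": "orders",
    "components": [{
        "name": "orders-api",
        "image": "registry.example/orders:3.1",
        "ports": [8080],
    }],
}

# ...while platform operators attach HOW and WHERE it runs as separate,
# swappable operational traits.
operational_traits = {
    "orders-api": [
        {"trait": "autoscale", "min": 2, "max": 10},
        {"trait": "ingress", "host": "orders.example.com"},
    ],
}

# A platform (e.g., an internal PaaS) merges the two at deploy time,
# so service code never encodes cluster-specific details.
```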


Navigating data sovereignty through complexity

Data sovereignty is the concept that data is subject to the laws of the country in which it is processed. In a world of rapid adoption of SaaS, cloud and hosted services, the issues that data sovereignty can raise become obvious. In simpler times, data wasn’t something businesses needed to be concerned about; it could be shared and transferred freely with no consequence. Businesses that had a digital presence operated on a small scale, with low data demands hosted on on-premise infrastructure. This meant that data could be monitored and kept secure, much different from the more distributed and hybrid systems that many businesses use today. With so much data sharing and so little regulation, it all came crashing down with the Cambridge Analytica scandal in 2018, prompting strict laws on privacy. ... When dealing with on-premise infrastructure, governance is clearer, as it must follow the rules of the country it’s in. However, when it’s in the cloud, a business can store its data in any number of locations regardless of where the business itself is based.


How security leaders can build emotionally intelligent cybersecurity teams

EQ is important, as it has been found by Goleman and Cary Cherniss to positively influence team performance and to cultivate positive social exchanges and social support among team members. However, rather than focusing on cultivating EQ, cybersecurity leaders such as CISOs and CIOs are often preoccupied by day-to-day operations (e.g., dealing with the latest breaches, the latest threats, board meetings, team meetings and so on). In doing so, they risk overlooking the development and strengthening of their own EQ and that of the individuals within their teams. As well as EQ considerations, cybersecurity leaders must also be conscious of the team’s makeup in terms of gender, age and cultural attributes and values. This is very relevant to cybersecurity teams, as they are often hugely diverse. Such values and attributes will likely introduce a diverse set of beliefs defined by how and where an individual grew up and the values of their parents.



Quote for the day:

"The mediocre leader tells The good leader explains The superior leader demonstrates The great leader inspires." -- Buchholz and Roth

Daily Tech Digest - July 27, 2021

How AI Algorithms Are Changing Trading Forever

The Aite Group, in its report "Hedge Fund Survey, 2020: Algorithmic Trading," argues that the main reason for the growing popularity of algorithms in trading is the attempt to reduce the influence of the human factor in a highly volatile market. The economic fallout from COVID-19 saw a record-breaking drop in the American, European, and Chinese stock markets. Only a few months later, measures to stimulate the economy were able to stop the fall and reverse the downtrend. Thus, we get the first task of algorithmic trading: risk reduction in a market with high volatility. The second global advantage of algorithmic trading lies in the ability to analyze the potential impact of a trade on the market. This can be especially useful for hedge funds and institutional investors who handle large sums of money with a visible effect on price movements. The third fundamental advantage of trading algorithms is protection from emotions. Traders and investors, like all living people, experience fear, greed, regret over lost profits, and other emotions. These emotions have a negative impact on performance and results.
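
That third advantage is easy to illustrate: a rule-based system applies the same deterministic logic on every tick, whatever the trader is feeling. A toy sketch, not a real or advisable strategy:

```python
# Toy illustration of emotion-free, rule-based execution (not a real strategy).
def moving_average(prices, n):
    return sum(prices[-n:]) / n

def signal(prices, fast=5, slow=20):
    """Same deterministic rule on every tick -- no fear, greed, or FOMO."""
    if len(prices) < slow:
        return "hold"
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    if fast_ma > slow_ma:
        return "buy"
    if fast_ma < slow_ma:
        return "sell"
    return "hold"

prices = [100, 101, 99, 98, 102, 103, 105] * 3  # placeholder price history
print(signal(prices))
```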


How to prevent corporate credentials ending up on the dark web

Employees are the weakest link in any organization’s security posture. A Tessian report found that 43% of US and UK employees have made mistakes that resulted in cybersecurity repercussions for their organizations. Phishing scams, including emails that try to trick employees into sharing corporate login details, are particularly common. Educating employees on cyber threats and how to spot them is crucial to mitigating attacks. However, for training to be effective, it needs to consist of more than just repetitive lectures. In the report mentioned above, 43% of respondents said a legitimate-looking email was the reason they fell for a phishing scam, while 41% of employees said they were fooled because the email looked like it came from higher up. Live-fire security drills can help employees familiarize themselves with real-world phishing attacks and other password hacks. Security awareness training should also teach workers the importance of good practices like using a virtual private network (VPN) when working from home and making social media accounts private.


IT leadership: 4 ways to find opportunities for improvement

Technology leaders should regularly use their own technology to better identify pain points and opportunities for improvements. That means that I should be teaching and using the same systems that faculty does to understand their experience through their lens. I should be meeting regularly with them and generating a Letterman-style Top 10 list of the things I hate most about my technology experience. This is something to do with the students, too. What do they hate most about the technology at the university? And how can we partner with them to address these issues over the next 12 months? Several years ago, for example, we reexamined our application process. If a prospective student wanted to submit an application, we required them to generate a unique username and password. If the one they chose was already taken, they needed to continue creating alternate versions until they eventually landed upon one that was available. If someone began the application process and logged off to complete it later, then forgot their username and password, they’d have to start all over again. It was absurd.


Data Management In The Age Of AI

AI is increasingly converging the traditional high-performance computing and high-performance data analytics pipelines, resulting in multi-workload convergence. Data analytics, training and inference are now being run on the same accelerated computing platform. Increasingly, the accelerated compute layer isn’t limited to GPUs—it now involves FPGAs, graph processors and specialized accelerators. Use cases are moving from computer vision to multi-modal and conversational AI, and recommendation engines are using deep learning while low-latency inference is used for personalization on LinkedIn, translation on Google and video on YouTube. Convolutional neural networks (CNNs) are being used for everything from annotation and labeling to transfer learning. And learning is moving to federated learning and active learning, while deep neural networks (DNNs) are becoming even more complex, with billions of parameters. The result of these transitions is different stages within the AI data pipelines, each with distinct storage and I/O requirements.


SASE: Building a Migration Strategy

Gartner's analysts say that "work from anywhere" and cloud-based computing have accelerated cloud-delivered SASE offerings to enable anywhere, anytime secure access from any device. Security and risk management leaders should build a migration plan from the legacy perimeter and hardware-based offerings to a SASE model. One hindrance to SASE adoption, some security experts tell me, is that organizations lack visibility into sensitive data and awareness of threats. Too many enterprises have separate security and networking teams that don't share information and lack an all-encompassing security strategy, they say. "While the vendors are touting SASE as the end-all solution, the key to success would depend upon how well we define the SASE operating model, particularly when there are so many vendors coming up with SASE-based solutions," says Bengaluru-based Sridhar Sidhu, senior vice president and head of the information security services group at Wells Fargo. Yask Sharma, CISO of Indian Oil Corp., says that as data centers move to the cloud, companies need to use SASE to enhance security while controlling costs.


How to Bridge the Gap between Netops and Secops

If you were designing the managerial structure for a software development firm from scratch today, it’s very unlikely that you would separate NetOps and SecOps in the first place. Seen from the perspective of 2021, many of the monitoring and visibility tools that both teams seek and use seem inherently similar. Unfortunately, however, the historical development of many firms has not been that simple. Teams and remits are not designed from the ground up with rationality in mind – instead they emerge from a complex series of interactions and ever-changing priorities. This means that different teams often end up with their own priorities, and can come to believe that they are more important than those of other parts of your organization. This is seen very clearly in the distinction between SecOps and NetOps teams in many firms. At the executive level, your network exists in order to facilitate connections – between systems and applications but above all between staff members. Yet for many NetOps teams, the network can come to be seen as an end in itself.


The future of data science and risk management

“Enterprise data is growing nearly exponentially, and it is also increasing in complexity in terms of data types,” said Morgan. “We have gone way past the time when humans could sift through this amount of data in order to see large-scale trends and derive actionable insights. The platforms and best practices of data science and data analytics incorporate technologies which automate the analytics workflows to a large extent, making dataset size and complexity much easier to tackle with far less effort than in years past. “The second value-add is to leverage machine learning, and ultimately artificial intelligence, to go beyond historical and near-real-time trend analysis and ‘look into the future’, so to speak. Predictive analysis can unveil new customer needs for products and services and then forecast consumer reactions to resultant offers. Equally, predictive analytics can help uncover latent anomalies that lead to much better predictions about fraud detection and potentially risky behaviour. “Nothing can foretell the future with 100% certainty, but the ability of modern data science to provide scary-smart predictive analysis goes well beyond what an army of humans could do manually.”


DevOps Observability from Code to Cloud

DevOps has transformed itself in the last few years, completely changing from what we used to see as siloed tools connected together to highly integrated, single-pane-of-glass platforms. Collaboration systems like JIRA, Slack, and Microsoft Teams are connected to your observability tools such as Datadog, Dynatrace, Splunk, and Elastic. Finally, IT service management tools like PagerDuty are also connected in. Tying these best-in-class tools together on one platform, such as the JFrog Platform, yields high value to enterprises looking for an observability workflow. The security folks also need better visibility into an enterprise’s systems, to look for vulnerabilities. A lot of this information is available in Artifactory and JFrog Xray, but how do we leverage this information in other partner systems like JIRA and Datadog? It all starts with JFrog Xray’s security impact analysis, where we can generate an alert to Slack and robust security logs to Datadog to be analyzed by your site reliability engineer. A PagerDuty incident that’s also generated from Xray can then be used to create a JIRA issue quickly.
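
The glue between these tools is typically a small webhook handler. A minimal sketch, assuming a hypothetical Xray-style alert payload and placeholder URLs and credentials; the Slack incoming-webhook and Jira REST request shapes follow their public documentation:

```python
# Minimal sketch: fan a vulnerability alert out to Slack and Jira.
# The payload fields, URLs, and credentials are placeholder assumptions.
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"   # placeholder
JIRA_API = "https://yourco.atlassian.net/rest/api/2/issue"          # placeholder

def handle_xray_alert(alert: dict) -> None:
    summary = f"{alert['severity']}: {alert['cve']} in {alert['artifact']}"

    # 1. Notify the team channel (Slack incoming webhooks take {"text": ...}).
    requests.post(SLACK_WEBHOOK, json={"text": summary}, timeout=10)

    # 2. Open a tracking issue via Jira's REST API.
    requests.post(
        JIRA_API,
        json={"fields": {
            "project": {"key": "SEC"},
            "summary": summary,
            "description": alert.get("description", ""),
            "issuetype": {"name": "Bug"},
        }},
        auth=("bot@yourco.com", "API_TOKEN"),  # placeholder credentials
        timeout=10,
    )

# Example invocation (would attempt real HTTP calls, so left commented):
# handle_xray_alert({"severity": "High", "cve": "CVE-2021-0000",
#                    "artifact": "libexample:1.2.3"})
```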


6 Global Megatrends That Are Impacting Banking’s Future

The line between digital and physical has blurred, with consumers who once preferred brick-and-mortar engagements now researching, shopping and buying using digital channels more than ever. This trend is expected to increase across all industries. While organizations have enabled improved digital engagement over the past several months, there are still major pain points, mostly with speed, simplicity and cross-channel integration during the ‘first mile’ of establishing a relationship. The retail industry already understands that consumers are becoming increasingly impatient, wanting the convenience and transparency of eCommerce and the service and humanization of physical stores. In banking, consumers are diversifying their financial relationships, moving to fintech and big tech providers that can open relationships in an instant and personalize experiences. According to Brett King, founder of Moven and author of the upcoming book, ‘The Rise of Technosocialism’, “The ability to acquire new customers at ‘digital scale’ will impact market share and challenge existing budgets for branches. ..."


Understanding Contextual AI In The Modern Business World

Contextual AI can be divided into three pillars that help businesses become more visible to the people they want to reach. In the same sense, when a business is looking for a partner, it has to be sure that a prospect can offer the right services to fulfill its goals. Contextual AI aims to deliver that. The technology allows a brand to enhance its understanding of consumer interests. It is easy to make assumptions about consumer interests in different sectors, but difficult to prove them. ... In previous years, contextual AI was seen as an enhancing technology, but not an essential one. Now, the recognition of contextual AI as more than simply enhancing is growing. Businesses are constantly looking for more cost-effective solutions to their problems, and contextual AI offers one solution to fit that bracket. If you look at a similar alternative, such as behavioral advertising, it is heavily reliant on data — and lots of it. The huge amount of data required to make this a success means that businesses have to implement a successful collection, analysis and reporting solution in order to leverage it effectively. This can be a costly process if a business does not have large economies of scale.



Quote for the day:

"A true dreamer is one who knows how to navigate in the dark" -- John Paul Warren

Daily Tech Digest - July 26, 2021

CIOs and CFOs: creating a value-driven partnership

The CFO/CIO relationship is evolving in the UK and elsewhere in the world. The digitisation of everything is forcing both functions to recognise that technology is not just integral to running the business efficiently, but also permeates every aspect of business strategy and how companies define competitive advantage. Consequently, technology is exerting much greater influence on the way CFOs and CIOs think about their roles and how they define value for their organisations. ... “Technology is expanding the roles that CFOs and CIOs play in an organisation…”. It implies the need for closer collaboration between IT and finance in this country. If both roles collaborate and ask meaningful questions of each other, their shared expertise will enable them to better understand their contribution to delivering value for the business and how their combined skillsets can leverage the benefits of digitisation to become more productive. Yet, not all is sweetness and success, because traditionally both functions have come from very different standpoints when it comes to what value means to their organisations: “While the CFO-CIO relationship is interconnected, sometimes it can become divided, as both often speak different ‘languages’ about the same topic”.


Ignore API security at your peril

Many organizations are quick to embrace the potential and possibilities of connected devices and apps. However, they frequently neglect to put in place the right technology and processes needed to make their APIs secure. Understanding APIs in terms of private/partner/public differences, and understanding that these are not the same as internal/external, is just the start. Organizations should have both an API strategy and a well-managed API management platform in place, so that a thorough security review is undertaken before teams expose APIs to anybody or roll out new API designs. Similarly, any identified issue needs to be handled in a highly structured way. This includes conducting a full assessment of the impact and scope of reported vulnerabilities and having processes in place to ensure that all these issues are then resolved in a timely manner to prevent bigger problems arising further down the road. As organizations push ahead with using APIs to power digital transformation and deploy a new generation of app-based services, so the risk of unauthorized access and data exposure is growing.


AI Liability Risks to Consider

Most AI systems are not autonomous. They provide results, they make recommendations, but if they're going to make automatic decisions that could negatively impact certain individuals or groups (e.g., protected classes), then not only should a human be in the loop, but also a group of individuals who can help identify the potential risks early on, such as people from legal, compliance, risk management, privacy, etc. ... It states, "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her." While there are a few exceptions, such as getting the user's express consent or complying with other laws EU members may have, it's important to have guardrails that minimize the potential for lawsuits, regulatory fines and other risks. "You have people believing what is told to them by the marketing of a tool and they're not performing due diligence to determine whether the tool actually works," said Devika Kornbacher, a partner at law firm Vinson & Elkins. "Do a pilot first and get a pool of people to help you test the veracity of the AI output – data science, legal, users or whoever should know what the output should be."
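
One hedged way to picture the human-in-the-loop guardrail in code; everything here is hypothetical and schematic:

```python
# Hypothetical sketch: consequential automated decisions get a human gate.
def decide(application, model, reviewer_queue, threshold=0.9):
    score, decision = model(application)
    # Decisions with legal or similarly significant effects are never left
    # fully automated: adverse or low-confidence outcomes go to a person.
    if decision == "deny" or score < threshold:
        reviewer_queue.append((application, score, decision))
        return "pending_human_review"
    return decision

queue = []
print(decide({"applicant": "A-1001"}, lambda a: (0.97, "approve"), queue))  # approve
print(decide({"applicant": "A-1002"}, lambda a: (0.55, "approve"), queue))  # routed to review
```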


Digital transformation: 3 priorities for CIOs facing a tough climb

Leading a successful digital transformation is like leading a mountain climbing expedition: It takes courage, leadership, and perseverance. Consider these tips from a leader who's done both ... Imagine boiling the ocean in one day. That’s how digital transformation feels sometimes. The psychological impact becomes unbearable and overwhelming. By preparing and staying the course, however, digital transformation becomes an achievable feat with lasting outcomes. In the case of our climb, preparing meant wearing the right clothes, packing the right things, communicating with each other, trusting one another, fuelling ourselves with energy bars, breaking down the path into smaller chunks, and learning about the road ahead. As a leader, I ventured to turn our performance up that mountain from mediocre to exceptional. In digital transformation, this may mean upskilling the workforce and adopting new platforms. ... Climbing Mount Hood was precarious and mentally and physically difficult. I never wavered. I stuck to our goal because I knew the outcome would benefit everyone in my family. To soldier on, you must be that persistent.


How to Secure Your Cryptocurrency Wallet

Owners of Bitcoin, Ethereum, and other cryptocurrency typically trade on centralized platforms such as Robinhood, Coinbase, FTX, and others. They don't need to worry about creating and managing digital wallets since the platform handles those tasks. That's the convenience of a centralized platform. However, there are serious drawbacks to keeping your crypto assets on a platform. If the platform gets hacked, or your account credentials are stolen, or the government decides to seize your digital assets, you could lose all of your crypto investments. If you would rather not rely on these platforms to secure your digital assets and prefer not to be subject to their policies, it's better to move your digital assets off of the platform and to where you can have full control. Centralized platforms are the on-ramps to purchase digital assets with dollars. Once you make the purchase, you can take custody of your assets by transferring them to your wallet. Decentralized applications (dapp), on the other hand, require users to hold funds in their own wallet. Decentralized finance (DeFi) – such as lending, borrowing, insurance – requires using a digital wallet. DeFi is only slowly becoming available to users of centralized platforms.


How to Work Better Together: Building DEI Awareness in Tech

Increasingly, we also gatekeep on existing experience. By that I mean the problem that those new to our industry experience when they need to "get experience to get experience". This happens when entry-level roles already require some number of years of experience as a condition of hire. Without "year 0" opportunities, the only people in the available job pool will be people already behind the gate, and that number will decrease over time as people change industries, retire, or even want to go on holidays or sabbaticals. Perception of what success looks like is also a major barrier to success. A great example is the previous section, where I outlined groups of people who are not normally included in dress codes; not actively excluded, but invisibly, due to lack of representation or lack of awareness among those currently in the majority. A way to start self-testing for this is to see what comes to mind when I say "successful engineer", "manager", or "CEO". Specifically: what do the people in those roles look like and sound like, by default, in your mind’s eye?


Australia Says Uber 'Interfered' With Users' Privacy

The OAIC action comes almost five years after Uber's systems were infiltrated by attackers who stole user data. Uber's cover-up of the incident spurred outrage, inquiries and action by several regulators worldwide. Two attackers obtained login credentials from a private GitHub site that was used by some of Uber's engineers. They then used those login credentials to access an Amazon Web Services account that had an archive with rider and driver information. All told, there were 57 million accounts exposed. The data affected included names, email addresses and phone numbers for Uber customers as well as personal information of 7 million drivers and 600,000 driver's license numbers. Uber paid $100,000 in bitcoin to the two attackers and positioned the payment as a bug bounty. Uber did not reveal the breach until more than a year later, in November 2017. Shortly after that disclosure, Uber fired Joe Sullivan, its CSO. Sullivan, who is now CSO for Cloudflare, was charged in the U.S. with obstruction of justice and misprision, which is the deliberate concealment of a felony or treasonable act.


CISO Interview Series: How Aiming for the Sky Can Help Keep Your Organization Secure

Visibility is key to understanding your landscape, to understanding what ‘your organizational landscape’ and world looks like. The capability I would invest in is looking at your cyber risk profile, ensuring that you understand your risks. If you understand your risks, then you can help translate that across the business. Or it doesn’t need to be translated. It’s already done for you because you’ve got it in a risk profile that the business understands because the business will essentially dictate that.
Once you understand your risk profile, that gives you actions you can work towards. Even if you’re using a risk framework, without a good risk assessment you can be working on stuff that doesn’t really add value or isn’t a problem. Understanding your landscape is what gives you the visibility. Focus on your basics and get your policies and processes in place so that there is structure that everyone can work from. As an example, we work to four areas: governance, risk, and compliance; security operations center; secure architecture; and secure infrastructure. They are the four pillars we align to. What that means is your secure infrastructure is critical.


Health Care’s Digital Transformation: Three Trends To Watch For

A shift is happening within our health care system that is allowing more and more data to enter the health system. According to Capital Markets, 30% of the world’s data volume is being generated by the health care industry, and by 2025, the compound annual growth rate of data for health care will reach 36%. Health care organizations must develop a plan to manage this data and integrate it with SDoH data, AI-fueled behavioral science, patient history and more to facilitate a more proactive approach to care. Value-based care — a buzzword for years now that emphasizes preventative care — may finally be within reach if health care leaders are able to harness this data and integrate it into clinical workflows. Like the health care system itself, these topics are interwoven and complex. Overcoming these challenges will require hard work and dedication from the entire health care industry, but I am confident we are making incredible strides. We’re seeing cloud adoption that would have been unimaginable just 18 months ago. 


Re-focusing your tech strategy post-Covid

Too often businesses forget about the importance of measuring these KPIs long-term – in fact, research carried out last year by AppLearn found that just 12 per cent of organisations measure the success of their technology investments after one year, falling to five per cent after three years. When you consider the time and money ploughed into software roll outs, these stats are shocking. But there’s also the fact that software evolves and the way users interact with it can change, especially with major updates – this makes assessing the performance and value of investments beyond the first few years of implementation just as important. In the age of the digital workplace, data is king and will give business leaders greater insights into the technologies used and the end-to-end employee experience. To maintain productivity in the long-term, you must move beyond surface level vanity metrics and gather intelligent data points – this could be time spent navigating tasks within applications, task error/completion rates, what pages users have visited or where they’ve looked for support.



Quote for the day:

"We are reluctant to let go of the belief that if I am to care for something I must control it." -- Peter Block