
Daily Tech Digest - July 30, 2023

What Is Data Strategy and Why Do You Need It?

Developing a successful Data Strategy requires careful consideration of several key steps. First, it is essential to identify the business goals and objectives that the Data Strategy will support. This will help determine what data is needed and how it should be collected, analyzed, and used. Next, it is important to assess the organization’s current data infrastructure and capabilities. This includes evaluating existing databases, data sources, tools, and processes for collecting and managing data. It also involves identifying current gaps in skills or technology that need to be addressed. Once these foundational elements are in place, organizations can begin to define their approach to Data Governance. This involves establishing policies and procedures for managing Data Quality, security, privacy, compliance, and access. It may also involve developing a framework for decision-making that ensures the right people have access to the right information at the right time. Finally, organizations should consider how they will measure success in implementing their Data Strategy. 


Battling Technical Debt

Technical debt costs you money and takes a sizable chunk of your budget. For example, a 2022 Q4 survey by Protiviti found that, on average, an organization invests more than 30% of its IT budget and more than 20% of its overall resources in managing and addressing technical debt. This money is being taken away from building new and impactful products and projects, and it means the cash might not be there for your best ideas. ... Technical debt impacts your reputation. The impact can be huge and result in unwanted media attention and customers moving to your competitors. In an article about technical debt, Denny Cherry attributes performance woes at US airline Southwest Airlines to underinvestment in updating legacy equipment, which caused difficulties with flight scheduling as a result of "outdated processes and outdated IT." If you can't schedule a flight, you're going to move elsewhere. Furthermore, in many industries like aviation, downtime results in crippling fines. These could be enough to tip a company over the edge.


‘Audit considerations for digital assets can be extremely complex’

Common challenges when auditing crypto assets include understanding and evaluating controls over access to digital keys, reconciliations to the blockchain to verify existence of assets, considerations around service providers in terms of qualifications, availability and scope, and forms of reporting, among others. As the technology is rapidly evolving, the regulatory standards do not yet capture all crypto offerings. Everyone is operating in an uncertain regulatory environment, where the speed of change is significant for all participants. If you take accounting standards, for example, a common discussion today is how to measure these assets. Under IFRS, crypto assets are generally recognized as an intangible asset and recorded at cost. While this aligns with the technical requirements of the standards, it sometimes generates financial reporting that may not be well understood by users of the financial information who may be looking for the fair value of these assets.


Does AI have a future in cyber security? Yes, but only if it works with humans

One technique that has been around for a while is rolling AI technology into security operations, especially to manage repetitive processes. The AI filters out the noise, identifies priority alerts and screens them for analysts. It can also capture this data, look for anomalies and join the dots. Established vendors are already providing capabilities like this. Here at Nominet, we have masses of data coming into our systems every day, and being able to look at correlations to identify malicious and anomalous behaviour is very valuable. But once again we find ourselves in the definition trap. Being alerted when rules are triggered is moving towards ML, not true AI. But if we could give the system the data and ask it to find what looked truly anomalous, that would be AI. Organisations might get tens of thousands of security logs at any point in time. Firstly, how do you know if these logs show malicious activity, and if so, what is the recommended course of action?
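The article doesn't describe Nominet's systems in detail, but the idea of flagging "truly anomalous" behaviour in masses of log data can be sketched with a simple statistical baseline. Everything below is illustrative: the hourly counts are made up, and real deployments use far richer features than raw log volume.

```python
from statistics import mean, stdev

def find_anomalies(counts, threshold=3.0):
    """Flag positions whose log volume deviates strongly from the baseline."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma and abs(c - mu) / sigma > threshold]

# 24 hypothetical hourly log counts; hour 20 carries a suspicious spike.
hourly = [120, 115, 130, 118, 125, 122, 119, 121, 124, 117,
          123, 126, 120, 118, 122, 125, 119, 121, 123, 120,
          900, 124, 118, 122]
print(find_anomalies(hourly))  # → [20]
```

A rule-based system would need someone to write "alert above 500 events per hour" in advance; the statistical version learns its baseline from the data itself, which is the direction of travel the article describes.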


Moody’s highlights DLT cyber risks for digital bonds

The body of the paper warns of the cyber risks of smaller public blockchains, which are less decentralized and hence more vulnerable to attacks. It considers private DLTs more secure than similarly sized public blockchains because they have greater access controls. Moody's acknowledges that larger Layer 1 public blockchains such as Ethereum are far harder to attack, but upgrades to the network carry risks. A major challenge is the safeguarding of private keys. In reality, the most significant risks relate to the platforms themselves: bugs in smart contracts and in the oracles that introduce external data. It notes that currently many solutions don't have cash on ledger, which reduces the attack surface and makes them less attractive targets. As cash on ledger becomes more widespread, it enables greater automation, but exploiting smart contract weaknesses could then result in unintended payouts and other losses. Moody's specifically mentions the risks associated with third-party issuance platforms such as HSBC Orion, DBS, and Goldman Sachs' GS DAP.


Cyber Resilience Act: EU Regulators Must Strike the Right Balance to Avoid Open Source Chilling Effect

The good news is that developers are willing to work with regulators in fine-tuning the act. And why not get them involved? They know the industry, hold deep insights into prevailing processes and fully grasp the intricacies of open source. Additionally, open source is too lucrative and important to ignore. One suggestion is to clarify the wording. For example, replace "commercial activity" with "paid or monetized product." This would go some way towards narrowing the act's scope and ensuring that open-source projects are not unnecessarily targeted. Another is differentiating between market-ready software products and stand-alone components, ensuring that requirements and obligations are appropriately tailored. Meanwhile, regulators can provide funding in the legislation to actively support open source. For example, Germany grants resources to support developers in maintaining open-source software projects of strategic importance. A similar sovereign tech fund could prove instrumental in supporting and protecting the industry across the continent.


Organizational Resilience And Operating At The Speed Of AI

The challenge becomes—particularly for mid-market organizations that may not have the resources of their larger competitors—how to corral resources to ensure they can effectively incorporate AI. If businesses are to achieve the kind of organizational resilience that is necessary to build sustainable enterprises, they must accept that AI and automation will fundamentally change company structures, culture and operations. Much of this will require investment in “intangible goods, such as business processes and new skills,” as suggested in the Brookings Institution article, but I would like to add one additional imperative: data gravity. ... To operate at the speed of AI, systems must be able to access all the information within an organization’s disparate IT infrastructure. That data must be secure, have integrity and be without bias. AI requires data agility. Therefore, organizations should employ a data gravity strategy whereby all the data within an organization is consolidated into a central hub, creating a single view of all the information.


As Ransomware Monetization Hits Record Low, Groups Innovate

With ransomware profits in decline, groups have been exploring fresh strategies to drive them back up. While groups such as Clop have shifted tactics away from ransomware to data theft and extortion, other groups have been targeting larger victims, seeking bigger payouts. Some affiliates have been switching ransomware-as-a-service provider allegiance, with many Dharma and Phobos business partners adopting a new service named 8Base, Coveware says. Numerous criminal groups continue to wield crypto-locking malware. The greatest number of successful attacks Coveware saw during the second quarter involved either BlackCat or Black Basta ransomware, followed by Royal, LockBit 3.0, Akira, Silent Ransom and Cactus. One downside of crypto-locking malware is that attacks designed to take down the largest possible victims, in pursuit of the biggest potential ransom payment, typically demand substantial manual effort, including hands-on-keyboard time. Groups may also need to purchase stolen credentials for the target from an initial access broker, pay penetration testing experts or share proceeds with other affiliates.


How Indian organisations are keeping pace with cyber security

Jonas Walker, director of threat intelligence at Fortinet, said the digitisation of retail and the rise of e-commerce makes those sectors susceptible to payment card data breaches, supply chain attacks and attacks targeting customer information. “Educational institutions also hold a wealth of personal information, including student and faculty data, making them attractive targets for data breaches and identity theft,” he added. But enterprises in India are not about to let the bad actors get their way. Sakra World Hospital, for example, has segmented its networks and implemented role-based access, endpoint detection and response, as well as zero-trust capabilities for its internal network. It also conducts vulnerability assessments and penetration tests to secure its external assets. “Zero-trust should be implemented on your external security appliances as well,” he added. “The notification system should be strong and prompt so that action can be taken immediately to mitigate any cyber security risk.”


How Can Blockchain Lead to a Worldwide Economic Boom?

The inherent trustworthiness of distributed ledgers is a key factor here in that they greatly enhance critical economic drivers like supply chain management, land ownership, and the distribution of government and non-government services. At the same time, blockchain’s support of digital currencies provides greater access to capital, in large part by side-stepping the regulatory frameworks that govern sovereign currencies. And perhaps most importantly, blockchain helps to stymie public corruption and the diversion of funds away from their intended purpose, which allows capital and profits to reach those who have earned them and will put them to more productive uses. None of this should imply that blockchain will put the entire world on easy street. Significant challenges remain, not the least of which is the cost to establish the necessary infrastructure to support secure digital ledgers. Multiple hardened data centers are required to prevent hacking, along with high-speed networks to connect them.



Quote for the day:

"Leadership is a privilege to better the lives of others. It is not an opportunity to satisfy personal greed." -- Mwai Kibaki

Daily Tech Digest - March 05, 2023

Transforming transformation

Transformation has been a way of extracting value rather than re-invention. Financial services companies are particularly guilty of this. For example, in banking, digital has been a way of reducing costs by moving the “business of banking” into the hands of the end customer – hence why we all do things ourselves that the bank used to do for us. This focus on cost reduction has meant that processes have been optimised for the digital age at the expense of true innovation. The days of extracting value are almost over for the financial services industry. There are not many places left to reduce costs. So, they must become value creators, which means taking a leaf out of the digital giants’ book and finding ways of identifying and solving problems. ... But, according to Paul Staples, who was, until recently, head of embedded banking for HSBC, success will not be determined by technology but by the proposition, approach, and processes that the banks wrap around it. Pain points and value must be identified up front, forming the basis of what gets delivered.


Five Megatrends Impacting Banking Forever

The first megatrend impacting banking is the democratization of data and insights. More than ever, data is being collected everywhere, and it is the lifeblood of any financial institution. The democratization of data and insights refers to the process of making data and insights accessible to a wider audience, including both employees and customers. ... The explosion of hyper-personalization is driven by the use of significantly larger amounts of data, such as browsing and purchase history, interests and preferences, demographics and even survey information. With advanced technologies that include facial recognition, augmented reality and conversational AI, it is now possible to also offer customers highly personalized experiences that cater to their unique delivery preferences – in near real-time. ... Traditionally, banks and credit unions have viewed their relationship with consumers as a series of transactions. However, in recent years, there has been an increasing focus on providing a seamless and integrated engagement opportunity that can result in a more stable and long-term relationship. 


Understanding the Role of DLT in Healthcare

Finding actual healthcare circumstances where DLT could be useful and relevant is crucial. Instead of implementing a solution without first identifying an issue to answer, organizations must take into account any current requirements or challenges that the technology may help address. Organizations employing this technology must be aware of and receptive to the new organizational paradigms that accompany these solutions. Recognizing the paradigm shift to decentralized, distributed solutions is essential to evaluating this technology. ... In shared ledgers, whose validity and consistency are maintained by nodes through a variety of processes, including consensus mechanisms, protecting the secrecy of data entails ensuring that only authorized access is granted. Institutions are employing a multi-layered strategy for blockchain in healthcare, using private blockchains where all of the linked healthcare organizations are well-known and trusted.


Control the Future of Data with AI and Information Governance

“The average company manages hundreds of terabytes of data. For that data to prove an asset rather than a liability, it must be located, classified, cleansed, and monitored. With so much data entering the organization so quickly from so many disparate sources, conducting those data tasks manually is not feasible.” “For organizations to make accurate data-driven decisions, decision makers need clean, reliable data. By the same token, AI-powered analysis will only prove useful if based on complete and accurate data sets. That requires visibility into all relevant data. And it requires exhaustive checks for errors, duplicates, and outdated information.” “An important aspect of information governance includes data security. Privacy regulations, for example, require that organizations take all reasonable measures to keep confidential data safe from unauthorized access. This includes ensuring against inappropriate sharing and applying encryption to sensitive information.”


BI solution architecture in the Center of Excellence

Designing a robust BI platform is somewhat like building a bridge; a bridge that connects transformed and enriched source data to data consumers. The design of such a complex structure requires an engineering mindset, though it can be one of the most creative and rewarding IT architectures you could design. In a large organization, a BI solution architecture can consist of: Data sources; Data ingestion; Big data / data preparation; Data warehouse; BI semantic models; and Reports. At Microsoft, from the outset we adopted a systems-like approach by investing in framework development. Technical and business process frameworks increase the reuse of design and logic and provide a consistent outcome. They also offer flexibility in architecture leveraging many technologies, and they streamline and reduce engineering overhead via repeatable processes. We learned that well-designed frameworks increase visibility into data lineage, impact analysis, business logic maintenance, managing taxonomy, and streamlining governance. 


When finops costs you more in the end

Don’t overspend on finops governance. The same can be said for finops governance, which controls who can allocate what resources and for what purposes. In many instances, the cost of the finops governance tools exceeds any savings from nagging cloud users into using fewer cloud services. You saved 10%, but the governance systems, including human time, cost way more than that. Also, your users are more annoyed as they are denied access to services they feel they need, so you have a morale hit as well. Be careful with reserved instances. Another thing to watch out for is mismanaging reserved instances. Reserved instances are a way to save money by committing to using a certain number of resources for a set period. But if you’re not optimizing your use of them, you may end up spending more than you need to. Again, the cure is worse than the disease. You’ve decided that using reserved instances, say purchasing cloud storage services ahead of time at a discount, will save you 20% each year. However, you have little control over demand, and if you end up underusing the reserved instances, you still must pay for resources that you didn’t need.
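The reserved-instance trade-off above comes down to simple arithmetic: a discount only pays off if utilization stays above the break-even point. A minimal sketch with hypothetical prices (the rate and discount are made up, not any provider's actual pricing):

```python
def reserved_savings(on_demand_rate, discount, hours_committed, hours_used):
    """Money saved (positive) or lost (negative) by reserving vs. paying on demand."""
    reserved_cost = on_demand_rate * (1 - discount) * hours_committed
    on_demand_cost = on_demand_rate * hours_used
    return on_demand_cost - reserved_cost

# Hypothetical: a $0.10/hr instance, 20% discount, one 8,760-hour year.
print(reserved_savings(0.10, 0.20, 8760, 8760))  # fully used: saves ~$175
print(reserved_savings(0.10, 0.20, 8760, 5256))  # 60% used: loses ~$175
```

With a 20% discount the break-even utilization is 80%; below that, the "savings" instrument costs more than simply paying on demand, which is exactly the trap the paragraph warns about.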


Core Wars Shows the Battle WebAssembly Needs to Win

So the basics are that you have two or more competing programs, running in a virtual space and trying to corrupt each other with code. In summary: the assembler-like language is called Redcode. Redcode is run by a program called MARS. The competitor programs are called “warriors” and are written in Redcode, managed by MARS. The basic unit is not a byte, but an instruction line. MARS executes one instruction at a time, alternately for each “warrior” program. The core (the memory of the simulated computer), or perhaps “battlefield”, is a continuous wrapping loop of instruction lines, initially empty except for the competing programs, which are set apart. Code is run and data stored directly on these lines. Each Redcode instruction contains three parts: the operation itself (OpCode), the source address and the destination address. ... While in modern chips code moves through parallel threads in mysterious ways, the Core War setup is still pretty much the basics of how a computer works. However code is written, we know it ends up as a set of machine code instructions.
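The mechanics described above are simple enough to model in a few lines. The sketch below is a deliberately stripped-down MARS: real Redcode has many opcodes and addressing modes, while here only MOV (copy a core line) and DAT (an empty, lethal cell) exist, with all addresses relative to the current instruction.

```python
CORE_SIZE = 8  # real cores are thousands of lines; tiny here for clarity

def step(core, pc):
    """Execute the instruction line at pc; return the next pc, or None if the process dies."""
    op, a, b = core[pc]
    if op == "DAT":                      # executing a DAT kills the warrior
        return None
    if op == "MOV":                      # copy line (pc+a) to line (pc+b), wrapping
        core[(pc + b) % CORE_SIZE] = core[(pc + a) % CORE_SIZE]
    return (pc + 1) % CORE_SIZE

# The classic "Imp" warrior, MOV 0 1: it copies itself one line ahead
# every turn, crawling around the wrapping core forever.
core = [("DAT", 0, 0)] * CORE_SIZE
core[0] = ("MOV", 0, 1)
pc = 0
for _ in range(CORE_SIZE * 2):
    pc = step(core, pc)
print(all(line == ("MOV", 0, 1) for line in core))  # → True: the Imp filled the core
```

A second warrior would simply be placed elsewhere in the same list, with MARS alternating `step` calls between the two program counters.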


Data Fear Looms As India Embraces ChatGPT

Considering the vast amounts of data that OpenAI has amassed without permission—enough that there is a chance that ChatGPT will be trained on blog posts, product reviews, articles and more—its privacy policy raises legitimate concerns. The IP address of visitors, their browser’s type and settings, and the information about how visitors interact with the websites—such as the kind of content they engage with, the features they use, and the actions they take—are all collected by OpenAI in accordance with its privacy policy. Additionally, it compiles information on the user’s website and time-based browsing patterns. OpenAI also states that it may share users’ personal information with unspecified third parties without informing them to meet its business objectives. The lack of clear definitions for terms such as ‘business operation needs’ and ‘certain services and functions’ in the company’s policies creates ambiguity regarding the extent and reasoning for data sharing. To add to the concerns, OpenAI’s privacy policy also states that the user’s personal information may be used for internal or third-party research and could potentially be published or made publicly available.


Booking.com's OAuth Implementation Allows Full Account Takeover

While researchers only divulged how they used OAuth to compromise Booking.com in the report, they discovered other sites at risk from improperly applying the authentication protocol, Balmas tells Dark Reading. "We have observed several other instances of OAuth flaws on popular websites and Web services," he says. "The implications of each issue vary and depend on the bug itself. In our cases, we are talking about full account takeovers across them all. And there are surely many more that are yet to be discovered." OAuth provides an easy way for site owners to bypass the user login process, reducing friction in what is otherwise a "long and frustrating" experience, Balmas says. However, though it seems simple, implementing the technology successfully and securely is actually very complicated, and a single small wrong move can have a huge security impact, he says. "To put it in other words — it is very easy to put a working social login functionality on a website, but it is very hard to do it correctly," Balmas tells Dark Reading.


More automation, not just additional tech talent, is what is needed to stay ahead of cybersecurity risks

Just over three-quarters of CISOs believe that their limited bandwidth and lack of resources has led to important security initiatives falling by the wayside, and nearly 80% claimed they have received complaints from board members, colleagues or employees that security tasks are not being handled effectively. ... Stress is also having an impact on hiring. 83% of the CISOs surveyed admitted they have had to compromise on the staff they hire to fill gaps left by employees who have quit their job. “I’ve never tried harder in my career to keep people than I have in the past few years,” said Rader. “It’s so key to hang onto good talent because without those people you’re always going to be stuck focusing on operations instead of strategy.” But there are solutions — and it’s not just finding more talent, says George Tubin, director of product marketing at Cynet. He said CISOs want more automated tools to manage repetitive tasks, better training, and the ability to outsource some of their work.



Quote for the day:

"No great manager or leader ever fell from heaven; it's learned, not inherited." -- Tom Northup

Daily Tech Digest - January 15, 2023

How confidential computing will shape the next phase of cybersecurity

At its core, confidential computing encrypts data at the hardware level. It’s a way of “protecting data and applications by running them in a secure, trusted environment,” explains Noam Dror—SVP of solution engineering at HUB Security, a Tel Aviv, Israel-based cybersecurity company that specializes in confidential computing. In other words, confidential computing is like running your data and code in an isolated, secure black box, known as an “enclave” or trusted execution environment (TEE), that’s inaccessible to unauthorized systems. The enclave also encrypts all the data inside, allowing you to process your data even when hackers breach your infrastructure. Encryption makes the information invisible to human users, cloud providers, and other computer resources. Encryption is the best way to secure data in the cloud, says Kurt Rohloff, cofounder and CTO at Duality, a cybersecurity firm based in New Jersey. Confidential computing, he says, allows multiple sources to analyze and upload data to shared environments, such as a commercial third-party cloud environment, without worrying about data leakage.


Not All Multi-Factor Authentication Is Created Equal

Many legacy MFA platforms rely on easily phishable factors like passwords, push notifications, one-time codes, or magic links delivered via email or SMS. In addition to the complicated and often frustrating user experience they create, phishable factors such as these open organizations up to cyber threats. Through social engineering attacks, employees can be easily manipulated into providing these authentication factors to a cyber criminal. And by relying on these factors, the burden to protect digital identities lies squarely on the end user, meaning organizations’ cybersecurity strategies can hinge entirely on a moment of human error. Beyond social engineering, man-in-the-middle attacks and readily available toolkits make bypassing existing MFA a trivial exercise. Where there is a password or another weak, phishable factor, there is an attack vector for hackers, leaving organizations to suffer the consequences of account takeovers, ransomware attacks, data leakage, and more. A phishing-resistant MFA solution completely removes these factors, making it impossible for an end user to be tricked into handing them over even by accident, or for them to be collected by automated phishing tactics.
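Why can't a phishing-resistant factor be handed over, even by a cooperative victim? Because the authenticator binds the origin it is actually talking to into its response. The toy model below illustrates only the idea; real FIDO2/WebAuthn uses per-site public-key pairs, not the stand-in HMAC shared secret used here.

```python
import hashlib, hmac, os

device_key = os.urandom(32)  # stands in for the authenticator's private key

def authenticator_sign(challenge, origin):
    # The device signs the server's challenge *together with* the origin it
    # is actually talking to, so a look-alike site cannot reuse the result.
    return hmac.new(device_key, challenge + origin.encode(), hashlib.sha256).digest()

def server_verify(challenge, signature, expected_origin="https://bank.example"):
    good = hmac.new(device_key, challenge + expected_origin.encode(),
                    hashlib.sha256).digest()
    return hmac.compare_digest(good, signature)

challenge = os.urandom(16)
# Legitimate login: the origins match, so the server accepts.
print(server_verify(challenge, authenticator_sign(challenge, "https://bank.example")))        # → True
# Phished login: the user "cooperated" at the look-alike origin; the server rejects.
print(server_verify(challenge, authenticator_sign(challenge, "https://bank-login.example")))  # → False
```

The signature produced at the phishing origin is useless at the real one, so there is nothing for a social engineer to harvest in the first place.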


Europe’s cyber security strategy must be clear about open source

While the UK government has tried to recognise the importance of digital supply chain security, current policy doesn’t consider open source as part of that supply chain. Instead, regulation or proposed policies focus only on third-party software vendors in the traditional sense but fail to recognise the building blocks of all software today and the supply chain behind it. To hammer the point home, the UK’s 11,000+ word National Cyber Security Strategy does not include a single reference to open source. GCHQ guidance meanwhile remains limited, with little detailed direction beyond ‘pull together a list of your software’s open source components or ask your suppliers.’ ... In this sense, the EU has certainly been listening. The recently released Cyber Resilience Act (CRA) is its proposed regulation to combat threats affecting any digital entity and ‘bolster cyber security rules to ensure more secure hardware and software products’. First, the encouraging bits: the CRA doesn’t just call for vendors and producers of software to have (among other things) a Software Bill of Materials (SBoM) - it demands companies have the ability to recall components.
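For readers unfamiliar with the artefact the CRA asks for, an SBoM is just a machine-readable inventory of a product's components. A minimal sketch in the CycloneDX JSON format, one of the common SBoM standards (the single component listed is hypothetical):

```python
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {   # one entry per open-source building block shipped in the product
            "type": "library",
            "name": "lodash",
            "version": "4.17.21",
            "purl": "pkg:npm/lodash@4.17.21",  # package URL: ecosystem + name + version
        },
    ],
}
print(json.dumps(sbom, indent=2))
```

The `purl` identifier is what makes a recall obligation tractable: given a vulnerable package version, a vendor can search its SBoMs to find every affected product.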


Eight Common Data Strategy Pitfalls

Lack of data culture: Data hidden within silos with little communication between business units leads to a lack of data culture. Data Literacy and enterprise-wide data training are required to allow business staff to read, analyze, and discuss data. Data culture is the starting point for developing an effective Data Strategy.

The Data Strategy is too focused on data and not on the business side of things: When businesses focus too much on just data, the Data Strategy may end up serving only the needs of analytics without any focus on business needs. An ideal Data Strategy enlists human capabilities and provides opportunities for training staff to carry out the strategy to meet business goals. This approach works better if citizen data scientists are included in strategy teams to bridge the gap between the data scientist and the business analyst.

Investing in data technology before democratizing data: In many cases, Data Strategy initiatives focus on quick investment in technology without first addressing data access issues. If data access is not considered first, costly technology investments will go to waste.


Here's Why Your Data Science Project Failed (and How to Succeed Next Time)

Every data science project needs to start with an evaluation of your primary goals. What opportunities are there to improve your core competency? Are there any specific questions you have about your products, services, customers, or operations? And is there a small and easy proof of concept you can launch to gain traction and master the technology? The above use case from GE is a prime example of having a clear goal in mind. The multinational company was in the middle of restructuring, re-emphasizing its focus on aero engines and power equipment. With the goal of reducing their six- to 12-month design process, they decided to pursue a machine learning project capable of increasing the efficiency of product design within their core verticals. As a result, this project promises to decrease design time and budget allocated for R&D. Organizations that embody GE's strategy will face fewer false starts with their data science projects. For those that are still unsure about how to adapt data-driven thinking to their business, an outsourced partner can simplify the selection process and optimize your outcomes.


5 Skills That Make a Successful Data Manager

The role of a data manager in an organization is tricky. This person is often neither an IT guy who implements databases on his/her own, nor a business guy who is actually responsible for data or processes (that’s rather a Data Steward’s area of responsibility). So what’s the real value-add of a data manager (or even a data management department)? In my opinion, you need someone who is building bridges between the different data stakeholders on a methodical level. It’s rather easy to find people who consider themselves experts in a particular business area, data analysis method or IT tool, but it is rather complicated to find one person who is willing to connect all these people and to organize their competencies, as is often required in data projects. So what I am referring to are skills like networking, project management, stakeholder management and change management, which are required to build a data community step by step as the backbone of Data Governance. Without people, a data manager will fail! So in my opinion, a recruiter seeking data managers should not only challenge technical skills but also these people skills.


Why distributed ledger technology needs to scale back its ambition

There is nonetheless an expectation that DLT can prove to be a net good for financial markets. Foreign exchange markets have an estimated $8.9 trillion at risk every day because final settlement of transactions between two parties takes days. This is why the Financial Stability Board and the Committee on Payments and Market Infrastructures have focused their efforts on enhancing cross-border payments with a comprehensive global roadmap. Part of this roadmap includes exploring the use of DLT and Central Bank Digital Currencies. The problem may not be the technology itself, but the aim of replacing current technology systems with distributed networks. DLT networks are being designed to completely overhaul and replace legacy technology that financial markets depend on today. Many pilot projects, such as mBridge and Jura, rely on a single blockchain developed by a single vendor. This introduces a single point of trust, and removes many of the benefits of disintermediation.


Why is “information architecture” at the centre of the design process?

The information architecture within a design (both process and output) makes the balancing within the equation possible. It also ensures the equation is “solvable” by other people. It does this by introducing logical coherence. It ensures words, images, shapes and colours are used consistently. And it ensures that as we move from idea to execution, we stay true to the original intent — and can clearly articulate it — so that we can meaningfully measure the effectiveness of our design. Without this internal coherence and confidence that our output is an accurate, reliable test of our hypothesis, we’re not doing design. The power of design which has a consistent information architecture is that if we find that our idea (which we translate to intent, experiments and experiences) is not equal to the problem, we can interrogate every part of the equation. We may have made a mistake in execution. Maybe our idea wasn’t quite right. Or even more powerfully, maybe we didn’t really understand the problem fully. 


Improve Your Software Quality with a Strong Digital Immune System

You can improve your software quality with a strong digital immune system since a digital immune system is designed to guard against cyberattacks and other sorts of hostile activities on computer systems, networks, and hardware. It operates by constantly scanning the network and systems for indications of prospective threats and then taking the necessary precautions to thwart or lessen such dangers. This can entail detecting and preventing malicious communications, identifying and containing compromised devices, and patching security holes. A robust digital immune system should offer powerful and efficient protection against cyber threats and assist individuals and companies in staying secure online. Experts in software engineering are searching for fresh methods and strategies to reduce risks and maximize commercial impact. The idea of “digital immunity” offers a direction. It consists of a collection of techniques and tools for creating robust software programmes that provide top-notch user experiences. With the help of this roadmap, software engineering teams may identify and address a wide range of problems, including functional faults, security flaws, and inconsistent data.


Security Bugs Are Fundamentally Different Than Quality Bugs

For each one of the types of testing listed above, a different skillset is required. All of them require patience, attention to detail, basic technical skills, and the ability to document what you have found in a way that the software developers will understand and be able to fix the issue(s). That is where the similarities end. Each one of these types of testing requires different experience, knowledge, and tools, often meaning you need to hire different resources to perform the different tasks. Also, we can’t concentrate on everything at once and still do a great job at each one of them. Although theoretically you could find one person who is both skilled and experienced in all of these areas, it is rare, and that person would likely be costly to employ as a full-time resource. This is one reason that people hired for general software testing are not often also tasked with security testing. Another reason is that people who have the experience and skills to perform thorough and complete security testing are currently a rarity. 



Quote for the day:

"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls

Daily Tech Digest - December 10, 2022

Risk and resilience priorities, as told by chief risk officers

CROs acknowledge that they need to spend more time considering “over the horizon risks.” This gap in thinking was brought into sharp focus by the heavy impact the COVID-19 pandemic and geopolitical tensions had on their institutions’ risk profiles, including second- and third-order effects such as supply chain risk, inflation, and rising interest rates, which were not anticipated by most banking executives. Institutions were ill-prepared to address these highly consequential risks. The failure goes well beyond risk functions, however. Many organizations used forecasting to develop market strategies, but this approach failed to pick up major reality shifts in the recent past—from the financial crisis of the 2000s to the pandemic to geopolitical realignments. Leading institutions are moving to scenario-based foresight to increase institutional resilience against over-the-horizon risks. The risk function can play an important role here in ensuring that the scenarios capture existing and expected risks, while aligning function priorities against scenarios.


Should central banks use DLT for CBDC?

When it comes to the topic of whether “to DLT or not to DLT” in the world of CBDC, Mikhalev took a slightly different position. He stated that central banks have taken a top-down approach for hundreds of years, and while this works in many jurisdictions, it doesn’t work as well in emerging markets. “To have blockchain or not for CBDCs is increasingly being answered in the negative across established economies. ... This could reduce volatility in these emerging markets. These aspects, which are specifically inherent to decentralisation and the distribution of power, should have positive effects in emerging economies.” However, Mikhalev continued, in each conversation carried out with central bankers in developed countries, he has found that they perceive blockchain as having little effect in situations where the supervisory institutions are not ready or unwilling to alter their business models around a new technology. “Blockchain doesn't really make much difference if nothing changes in terms of the existing established top-down structure of CBDCs. However, in emerging economies, this seems to differ,” Mikhalev noted.


A compliance fight in Germany could hurt Microsoft customers

German compliance authorities “can live with the situation where Microsoft pretends to do everything right and the authorities pretend to have done everything in their power to force Microsoft to become compliant,” Hence said in an interview with Computerworld. Microsoft “does not fulfill the most basic requirements of GDPR. They lack basic transparency. We can’t assess what they are doing because they are not telling us.” This is where politics comes into play, where practical forces can influence government compliance actions. German regulators “are afraid of retribution. (With regulators thinking) we won't get more budget if we say that you can’t use Office any more. Or even Google Analytics, any more,” Hence said. “These are political issues. Nobody wants to be the bad guy.” Thus, Microsoft is likely to skate on the issue — at least for now. But what about enterprise IT execs? Are companies using Microsoft products immune from compliance punishments? Not necessarily. It might not seem fair to let Microsoft get away with this but to fine and otherwise punish its customers, but Hence argues that's quite likely. And not just in Germany.


Policy Developments around Blockchain

On September 15, 2022, Singapore’s Monetary Authority (MAS) launched the Financial Services Industry Transformation Map 2025 that provides a framework of strategies to develop the country as a leading global financial center through enhanced payment connectivity to build a responsible digital asset ecosystem. It also laid out clear strategies to explore DLT in use cases such as cross-border payments, trade finance, and capital markets, besides supporting tokenization of financial assets. The policy supports a central bank digital currency (CBDC) and public-private collaboration to develop the infrastructure required to deliver such a currency. However, the first off the blocks in 2022 was the Securities and Futures Commission of Hong Kong which issued a joint circular with the HK Monetary Authority on intermediaries that can undertake virtual asset-related activities. Per the statement, intermediaries distributing virtual assets need to comply with the SFC’s requirements for sale of the products.


Share of Emerging Technologies in IT Budgets

The National Association of Software and Service Companies (NASSCOM) and Boston Consulting Group (BCG) today released a report titled "Sandboxing into the Future: Decoding Technology's Biggest Bets" on the sidelines of NASTech 2022 in Bengaluru. The report aims to uncover and develop perspectives on big-bet technologies that can potentially disrupt markets in the next 3-5 years. Enterprise tech spending is estimated to reach $4.2 Tn globally by 2026, amongst which tech services companies represent the largest segment and are expected to reach $1.7 Tn by 2026 at a CAGR of 8.1%. As part of the study, 28 emerging technology themes from 11 tech families were identified – across markets and verticals – with the potential to disrupt markets, based on current tech spending, growth potential, innovation maturity, and funding momentum. Amongst these, 12 emerging technologies with high funding momentum and R&D focus have emerged as the “Biggest Bets”: Autonomous Analytics, AR & VR, Autonomous Driving, Computer Vision, Deep Learning, Distributed Ledger, Edge Computing, Sensor Tech, Smart Robots, Space Tech, Sustainability Tech, and 5G/6G.


Amazon Wants to Kill the Barcode

The system, called multi-modal identification, isn't going to fully replace barcodes soon. It's currently in use in facilities in Barcelona, Spain, and Hamburg, Germany, according to Amazon. Still, the company says it's already speeding up the time it takes to process packages there. The technology will be shared across Amazon's businesses, so it's possible you could one day see a version of it at a Whole Foods or another Amazon-owned chain with in-person stores. The problem that the system eliminates -- incorrect items coming down the line to be sent to customers -- doesn't happen too often, Amazon says. But even infrequent mistakes add up to significant slowdowns when considering just how many items a single warehouse processes in one day. Amazon's AI experts had to start by building up a library of images of products, something the company hadn't had a reason to create prior to this project. The images themselves as well as data about the products' dimensions fed the earliest versions of the algorithm, and the cameras continually capture new images of items to train the model with. The algorithm's accuracy rate was between 75% and 80% when first used, which Amazon considered a promising start. 


Cyber crime threatens manufacturing production

Targeted attacks are the most common, with smaller companies often the most vulnerable, yet many offering no cyber security training to staff. Sixty-two percent of manufacturers now have a formal cyber security procedure in place in the event of an incident, up 11% on last year’s figures with the same number giving a senior manager responsibility for cyber security. More than half (58%) have escalated this responsibility to board level. Stephen Phipson, CEO of Make UK, the manufacturers’ organisation said: “Digitisation is revolutionising modern manufacturing and becoming increasingly important to drive competitiveness and innovation. “While cost remains the main barrier to companies installing cyber protection, the need to increase the use of the latest technology makes mounting a defence against cyber threats essential. No business can afford to ignore this issue and while the increased awareness across the sector is encouraging, there is still much to be done. “Every business is vulnerable, and every business needs to take the necessary steps to protect themselves properly.”


How Do Agility and Software Architecture Fit Together?

But the question is, when we create software, we make decisions all the time. So what is architecture? How is architecture different from all of these normal decisions that we take? When I think about that, I always use a very loose definition: architecture is about the important things. I think Martin Fowler said something like that, and I really liked it, because it's about those things that have a high risk or a high cost of change if we need to re-evaluate or redo them. I think these types of decisions qualify as architectural decisions. And then the question is, when do we decide on these important things, whatever important means? In my opinion, there is something that is quite underappreciated in most Scrum teams, and from my experience it is that the product owner has a very important part when it comes to software architecture, because it always starts with the vision of the product.


What’s a distributed compliance ledger and how is one integrated into Matter?

Matter’s DCL is a network of independent servers operated by the CSA and its partners. Each DCL server includes a complete copy of the database. The original data is managed and controlled by the CSA. The DCL is implemented by connecting all the servers using a cryptographically secured protocol. The DCL makes it difficult to manipulate the data in the database and increases the security of Matter devices and networks. ... The manufacturer writes the data to the database to add a new product to the DCL. It’s not ‘active’ until approved by the CSA. Once the device has passed certification and the CSA has received confirmation from the PPA, the CSA adds “certified” to the status list, letting all members of the Matter ecosystem know that this is an approved device ready to be added to Matter networks. Database access is restricted. Device makers can only add data for their own products that are linked to their vendor identification (VendorID) number. Software updates must also be linked to the VendorID, or they will be rejected. Official CSA PPA bodies or the CSA can confirm or revoke device compliance data.
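To make those access rules concrete, here is a minimal Python sketch of the vendor-scoped write model described above. The class, method names, and status values are illustrative assumptions, not the CSA's actual schema or API.

```python
# Hypothetical sketch of the DCL write rules: a device maker may only add
# records tied to its own VendorID, and a record only counts as usable once
# the CSA marks it "certified".

class ComplianceLedger:
    def __init__(self):
        self.records = {}  # (vendor_id, product_id) -> status string

    def add_product(self, writer_vendor_id, vendor_id, product_id):
        # Reject writes for products outside the writer's own VendorID.
        if writer_vendor_id != vendor_id:
            raise PermissionError("writes restricted to own VendorID")
        self.records[(vendor_id, product_id)] = "pending"

    def certify(self, actor, vendor_id, product_id):
        # Only the CSA may confirm (or revoke) certification status.
        if actor != "CSA":
            raise PermissionError("only the CSA can change certification")
        self.records[(vendor_id, product_id)] = "certified"

    def is_certified(self, vendor_id, product_id):
        return self.records.get((vendor_id, product_id)) == "certified"
```

The point of the two separate checks is the separation of duties the excerpt describes: vendors control their own entries, while certification status is controlled solely by the CSA.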


4 tips for implementing consistent configuration and automation standards

The team regularly reviews standards with squads and SMEs to keep the operating system and middleware standards current and compliant with security and other requirements. We created a naming structure for standards to maintain version control for compliance and audit purposes. The standards form a baseline for maintaining playbooks for consistent automation across the environment. The organization also promotes InnerSource (the use of open source practices to improve internal software) and advocates reusing playbooks. We base the playbooks on common configuration and automation standards. This establishes governance for operating systems and middleware support. ... Automation is a continuous journey. Achieving touchless deployments requires standard configurations, processes, procedures, security guidelines, and other dependencies that you must review and validate periodically. These standards form the baseline that the automation team will adopt and implement. 



Quote for the day:

"We are drowning in information, but starved for knowledge." -- John Naisbitt

Daily Tech Digest - November 30, 2022

7 lies IT leaders should never tell

Things break, and in most cases, it comes as a surprise. IT consists of many systems requiring different degrees of connectivity and monitoring, making it difficult to know absolutely everything at every moment. The key to minimizing failures is to be proactive rather than simply waiting for bad things to happen. CIOs should not only expect things to break but also be honest about this with their team members and business colleagues. “Eat, sleep, and live that life,” advises Andre Preoteasa, internal IT director at IT business management firm Electric. “There are things you know, things you don’t know, and things you don’t know you don’t know,” he observes. “Write down the first two, then think endlessly about the last one — it will make you more prepared for the unknowns when they happen.” Preoteasa stresses the importance of building and maintaining detailed disaster recovery and business continuity plans. “IT leaders that don’t have [such plans] put the company in a bad position,” he notes. “The exercise alone of writing things down shows you’re thinking about the future.”


Amid Legal Fallout, Cyber Insurers Redefine State-Sponsored Attacks as Act of War

Acts of war are a common insurance exclusion. Traditionally, exclusions required a "hot war," such as what we see in Ukraine today. However, courts are starting to recognize cyberattacks as potential acts of war without a declaration of war or the use of land troops or aircraft. The state-sponsored attack itself constitutes a war footing, the carriers maintain. ... Effectively, Forrester's Valente notes, larger enterprises might have to set aside large stores of cash in case they are hit with a state-sponsored attack. Should insurance carriers be successful in asserting in court that a state-sponsored attack is, by definition, an act of war, no company will have coverage unless they negotiate that into the contract specifically to eliminate the exclusion. When buying cyber insurance, "it is worth having a detailed conversation with the broker to compare so-called 'war exclusions' and determining whether there are carriers offering more favorable terms," says Scott Godes, partner and co-chair of the Insurance Recovery and Counseling Practice and the Data Security & Privacy practice at District of Columbia law firm Barnes & Thornburg.


Top 5 challenges of implementing industrial IoT

Scalability is another challenge faced by professionals trying to make progress with their IIoT implementations. Bain’s 2022 study of IIoT decision-makers indicated that 80% of those who purchase IIoT technology scale fewer than 60% of their planned projects. The top three reasons why those respondents failed to scale their projects were that the integration effort was overly complicated and required too much effort, the associated vendors could not support scaling, and the life cycle support for the project was too expensive or not credible. One of the study’s takeaways was that hardware could help close gaps that prevent company decision-makers from scaling. Another best practice is for people to take a long-term viewpoint with any IIoT project. Some people may only think about what it will take to implement an initial proof of concept. That’s just a starting point. They’ll have to look beyond the early efforts if they want to eventually scale the project, but many of the things learned during the starting phase of a project can be beneficial to know during later stages.


AWS And Blockchain

The customer CIO, an extremely smart person, spoke up, in beautifully-rounded European vowels: “Here’s a use case I’ve been told about that’s on my mind.” He named a region in Asia and explained that the small farmers there mark their landholdings carefully, but then the annual floods sometimes wash the markers away. Then unscrupulous larger landowners use the absence of markers to cut away at the smallholdings of the poorest. “But if the boundary markers were on the blockchain,” he said, “they wouldn’t be able to do that, would they?” ... I thought. Then said “As a lifelong technologist, I’ve always been dubious about technology as a solution to a political problem. It seems a good idea to have a land-registry database but, blockchain or no, I wonder if the large landowners might be able to find another way to fiddle the records and still steal the land? Perhaps this is more about power than boundary markers?” Later in the ensuing discussion I cautiously offered something like the following, locking eyes on the CIO: “There are many among Amazon’s senior engineers who think blockchain is a solution looking for a problem.” He went entirely expressionless and the discussion moved on.

The key message is that before persisting the data into the storage layers (Bronze, Silver, Gold), the data must pass data quality checks, and corrupted records that fail those checks must be dealt with separately before they are written into the storage layer. ... The “Bronze => Silver => Gold” pattern is a type of data flow design, also called a medallion architecture. A medallion architecture is designed to incrementally and progressively improve the structure and quality of data as it flows through each layer of the architecture. This is why it is relevant for today’s article regarding data quality and reliability. ... Generally, the data quality requirements become more and more stringent as the data flows from raw to Bronze to Silver to Gold, as the Gold layer directly serves the business. You should, by now, have a high-level understanding of what a medallion data design pattern is and why it is relevant for a data quality discussion.
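The layered quality gates can be sketched in a few lines of Python. This is an illustrative toy, not any specific platform's API; the record fields and checks are made up for the example.

```python
# Medallion-style promotion: each hop applies a quality check, and records
# that fail are quarantined rather than written into the next layer.

def promote(records, check):
    """Split records into those that pass `check` and a quarantine list."""
    passed, quarantined = [], []
    for record in records:
        (passed if check(record) else quarantined).append(record)
    return passed, quarantined

raw = [
    {"id": 1, "amount": 10.0},    # clean record
    {"id": None, "amount": 5.0},  # fails the structural check
    {"id": 2, "amount": -3.0},    # passes structure, fails the business rule
]

# Bronze -> Silver: enforce basic structural quality (non-null key).
silver, structural_quarantine = promote(raw, lambda r: r["id"] is not None)

# Silver -> Gold: a stricter business rule, since Gold serves the business.
gold, business_quarantine = promote(silver, lambda r: r["amount"] >= 0)
```

The increasing strictness from layer to layer mirrors the excerpt's point: requirements tighten as data moves toward the Gold layer.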


The Digital Skills Gap is Jeopardising Growth

With people staying in workforces longer than ever before and careers spanning five decades becoming the norm, upskilling at a massive scale is needed. However, this need is not fully addressed; a worrying 6 in 10 (58%) people we surveyed in the UK told us that they have already been negatively affected by a lack of digital skills. Organisations can’t just rely on recruiting from a limited pool of digital specialists. More focus is also needed by organisations to upskill their own employees, in both tech and human digital skills. At a recent digital skills panel debate in Manchester, the director of a recruitment agency stated bluntly that: “Many businesses are currently overpaying to bring in external digital skills because of increased competition and this just isn’t sustainable. Upskilling your current teams should be as important as recruiting in new talent to keep costs in check and create a more balanced and loyal workforce.” It’s crucial to upskill employees, not only to get the necessary digital capabilities in our organisations, but to build loyalty and retain valued team members.


Emerging sustainable technologies – expert predictions

AI and automation technologies offer a smart solution, too; they could channel energy when it is plentiful into less time-sensitive uses, such as charging up electric vehicles or heating storage heaters. For example, Drax has looked at ways of combining AI with smart meters to channel our energy use, so that we take advantage of those periods when energy creation exceeds demand. The debate over whether we need new technologies or just need to scale up existing sustainable technologies has even reached the higher echelons of power. John Kerry, US special presidential envoy for climate, and a certain Bill Gates say we need technologies which haven’t been invented yet. World-renowned climate change scientist Michael Mann disagrees. In his expert opinion, we just need to scale up existing technologies. ... But there is one other application — an application which will create extraordinary opportunity and open the way for many technologies we have been considering up to now. When all of our power is provided by renewables, the total annual supply is likely to exceed total annual demand by a large margin.


Women in IT: Progress in Workforce Culture, But Problems Persist

From Milică's perspective, the greatest challenge facing women in IT today is a lack of role models. “Women need to be the role models who can inspire young minds, especially more women and minority leaders,” she says. “Even at the individual level, each of us -- teachers, parents, and other influential adults -- can plant the seed and grow the understanding among young people of the importance of IT jobs, and how that career path can make a difference in our world and society.” She adds hiring bias and pay inequality, along with the lack of female role models, leaders, and advancement opportunities, all discourage women from pursuing a STEAM career. “Women have to work much harder both to get hired and to advance their careers -- which perhaps explains why 52% of women in cybersecurity hold postgraduate degrees, compared to only 44% of men,” Milică notes. She adds the industry also hasn’t done a great job sparking interest at an early age. “Attention to a career path starts with children as early as elementary school, and by middle or high school, many students will have made their decisions,” she explains.


EPSS explained: How does it compare to CVSS?

EPSS aims to help security practitioners and their organizations improve vulnerability prioritization efforts. There is an exponentially growing number of vulnerabilities in today’s digital landscape, and that number is increasing due to factors such as the increased digitization of systems and society, increased scrutiny of digital products, and improved research and reporting capabilities. Organizations generally can only fix between 5% and 20% of vulnerabilities each month, EPSS claims. Fewer than 10% of published vulnerabilities are ever known to be exploited in the wild. Longstanding workforce issues are also at play: the annual ISC2 Cybersecurity Workforce Study shows shortages exceeding two million cybersecurity professionals globally. These factors warrant organizations having a coherent and effective approach to prioritizing the vulnerabilities that pose the highest risk, to avoid wasting limited resources and time. The EPSS model aims to provide some support by producing probability scores that a vulnerability will be exploited in the next 30 days; the scores range between 0 and 1, or 0% and 100%.
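As a rough illustration of how EPSS scores can drive prioritization, the Python sketch below ranks a small backlog by severity weighted by exploitation probability. The CVE names and score values are invented for the example (real EPSS scores are published daily by FIRST), and weighting CVSS by EPSS is just one simple scheme, not an official formula.

```python
# Toy prioritization: a high-severity bug that is unlikely to be exploited
# can rank below a moderate bug that attackers are actively going after.

vulns = [
    {"cve": "CVE-A", "cvss": 9.8, "epss": 0.02},  # severe but unlikely
    {"cve": "CVE-B", "cvss": 7.5, "epss": 0.90},  # likely to be exploited
    {"cve": "CVE-C", "cvss": 5.3, "epss": 0.01},
]

# Rank by expected impact: severity weighted by exploitation probability.
ranked = sorted(vulns, key=lambda v: v["cvss"] * v["epss"], reverse=True)
```

Here CVE-B tops the list (7.5 × 0.90 = 6.75) despite CVE-A's higher raw severity, which is precisely the reordering EPSS is meant to enable.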


Could it be quitting time?

The book tackles a challenge that proves stubbornly difficult for most people. Letting go of anything is hard, especially at a time when pundits tout the power of grit, building resilience, and toughing it out. Duke provides permission to see quitting as not only viable but often preferable, and she explains why people rarely give up at the right time. “Quitting is hard, too hard to do entirely on our own,” she writes. “We as individuals are riddled by the host of biases, like the sunk cost fallacy, endowment effect, status quo bias, and loss aversion, which lead to escalation of commitment. Our identities are entwined in the things that we’re doing. Our instinct is to want to protect that identity, making us stick to things even more.” These biases—some of them unconscious—prompt us to stick with jobs that have lost their appeal or value; hold on to losing stocks long after an inner voice screams “Sell!”; or endure myriad other situations that no longer serve us. Duke focuses far more on the thinking behind the decision to “quit or grit” rather than on the decision’s final outcomes.



Quote for the day:

"Teamwork is the secret that makes common people achieve uncommon results." -- Ifeanyi Enoch Onuoha

Daily Tech Digest - October 16, 2022

Top AI investors reveal State of AI in 2022

What’s new in 2022, and what made Benaich and Hogarth dedicate an entire section to AI safety, is the other end of AI safety. This is what Hogarth referred to as AI alignment: ensuring that an extremely powerful and superintelligent AI system doesn’t ever go rogue and start treating humanity badly in aggregate. The 2022 State of AI report is very much biased toward that end of safety because, according to Hogarth, the topic is not receiving enough attention. “We’re seeing exponential gain in capabilities, exponential use of compute, exponential data being fed into these [AI] models,” Hogarth said. “And yet we have no idea how to solve the alignment problem yet.” It’s still an unsolved technical problem where there are no clear solutions, he added: “That’s what alarms me — and I think that the thing that is probably the most alarming about all of it is that the feedback loops now are so violent. You have huge wealth creation happening in AI. So there’s more and more money flowing into making these models more powerful.”


Interview with Vinayak Godse, CEO of Data Security Council of India

We see ourselves as an important catalyst in the National Cybersecurity initiatives, especially in terms of the technology geopolitics that is now heating up; take the US–China tech war, for example. Cyber security in today’s day and age has become pivotal because the coming decade is going to be driven by technology, and cyber security is one fundamental area which will be driving all of these transitions. As per NASSCOM’s TECHADE 2020: Digital Tech Opportunities report, AI/ML, cloud, and cybersecurity will be crucial and critical for this decade. But, how will that happen? There are many different parts to this. Firstly, security should enable the growth of the industry. We aim to prepare the industry, the society, the individuals and, most importantly, the economy against possible issues and challenges regarding privacy. This is the second part. With technologies such as AI/ML, Data Analytics, and VR/AR gaining prominence, we will work towards solving the security problem in relation to these emerging technologies.


How Can Business DataOps Drive Growth?

DataOps is a fast-expanding area of expertise. Data analytics and operations specialists eager to learn how to develop and oversee DataOps procedures will have a successful future. They have the opportunity to guide the next generation of data teams and set the bar for data practices for at least the ensuing ten years. A creative, quickly expanding organization that reduces laborious and repetitive business activities will also have happier and more motivated employees. The time it takes to develop a concept into something valuable is crucial to businesses. Through the use of agile development methodologies, DataOps shortens lead times and the interval between iterations. Producing and delivering solutions in small pieces also enables them to be applied gradually. Shadow IT may form in businesses that use a sluggish development strategy for data solutions, with other departments creating their own solutions without the IT department’s approval or involvement.


Message Routing and Topics: A Thought Shift

There is one thing that caught my attention and fascination simultaneously. The central theme of real-time enterprise integration is message routing. Almost all messaging systems employ an intelligent, interest-based subscription mechanism that ensures the routing of messages to interested parties. The intelligent part of routing is built with rules around the Event type (name) and the content. Routing decisions based on an Event type name are straightforward because the name is easily accessible without unpacking the payload. However, routing based on filtering rules over the content requires unpacking the payload and evaluating the rules to determine a match. This directly impacts message throughput and performance. Content-based message routing (CBR) is performance-penalizing and not the best choice for real-time messaging scenarios, where latency undermines good decisions. In the early days of integration, content-based routing was considered essential.
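The performance difference between the two styles is easy to see in a sketch: type-based routing reads only the message envelope, while content-based routing must deserialize the payload and evaluate predicates against it. All names below are illustrative, not any particular broker's API.

```python
import json

# Type-based subscriptions: a plain lookup keyed on the event name.
subscriptions_by_type = {"OrderPlaced": ["billing"]}

def route_by_type(envelope):
    # Cheap: the event name sits in the envelope, so the payload is never touched.
    return subscriptions_by_type.get(envelope["type"], [])

def route_by_content(envelope, rules):
    # Expensive: the payload must be deserialized and every rule evaluated.
    payload = json.loads(envelope["payload"])
    return [dest for predicate, dest in rules if predicate(payload)]

msg = {"type": "OrderPlaced", "payload": json.dumps({"amount": 1500})}
content_rules = [(lambda p: p["amount"] > 1000, "fraud-review")]
```

The per-message deserialization and predicate evaluation in `route_by_content` is exactly the latency cost the excerpt attributes to CBR.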


Where Quantum Entanglement Is Actually Being Used

Quantum entanglement is a critical element of quantum information processing, and photonic entanglement of the type pioneered by the Nobel laureates is crucial for transmitting quantum information. Quantum entanglement can be used to build large-scale quantum communications networks. On a path toward long-distance quantum networks, Jian-Wei Pan, one of Zeilinger’s former students, and colleagues demonstrated entanglement distribution to two locations separated by 1,203 km on Earth via satellite transmission. However, direct transmission rates of quantum information are limited due to loss, meaning too many photons get absorbed by matter in transit so not enough reach the destination. Entanglement is critical for solving this roadblock, through the nascent technology of quantum repeaters. An important milestone for early quantum repeaters, called entanglement swapping, was demonstrated by Zeilinger and colleagues in 1998. Entanglement swapping links one each of two pairs of entangled photons, thereby entangling the two initially independent photons, which can be far apart from each other.


Virtual Panel: The New US-EU Data Privacy Framework

Given the Court of Justice of the European Union's (CJEU) stance regarding US surveillance law, it is not clear how GDPR can be made compatible with transatlantic data transmission. Thus it is likely that any new privacy frameworks will be challenged in courts. Yet, the newly proposed Trans-Atlantic Data Privacy Framework brings an attempt to solve the underlying issues and may include an independent Data Protection Review Court as a mechanism to solve disputes that could provide an effective solution. If the new framework did not pass European Courts' scrutiny, it is possible that a completely different approach to data privacy will be required in future to ensure data transmission and collaboration while granting privacy rights, such as treating user data as a currency or similarly to copyright. In this virtual panel, three knowledgeable experts in the field of data privacy discuss where the existing agreements fall short, whether a new privacy agreement could improve transatlantic data sharing while granting privacy rights for EU citizens and stronger oversight of US intelligence, and more.


Distributed Ledger Technology (DLT): The Solution to the Age of Digital Distrust?

DLT has more going for it than blockchain technology alone. Alternatives across the DLT spectrum already address the so-called blockchain trilemma: guaranteeing high security, scalability, and decentralization at once. While a blockchain works with a linked list of blocks, IOTA, for example, uses the ‘Tangle,’ an acyclic graph of mutually linked transactions that maintains the global shared state of the ledger while boosting speed. In addition, IOTA avoids classic transaction fees, which would be prohibitive for applications in the IoT area, through an alternative consensus algorithm based on the reputation of the nodes. ... What makes DLT so exciting and relevant is that it was conceived and developed for this decentralized digital world where trust is at a premium. Trust is not created simply by storing information safely; it also comes from how that information is created and continuously verified between all the partners of a business process. DLT determines the conditions under which nodes of the decentralized infrastructure capture and record new transactions, and when they do not.
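The Tangle structure described above — an acyclic graph in which each new transaction approves earlier transactions instead of being bundled into blocks — can be illustrated with a toy sketch. This is a simplified picture of the DAG only, not IOTA's actual protocol (real tip selection, reputation-based consensus, and validation are far more involved; all class and method names here are invented for the example):

```python
import hashlib
import random

class Transaction:
    """A node in a toy Tangle: each transaction approves up to two earlier ones."""
    def __init__(self, data, parents):
        self.data = data
        self.parents = parents  # earlier transactions this one approves
        payload = data + "".join(p.tx_id for p in parents)
        self.tx_id = hashlib.sha256(payload.encode()).hexdigest()

class Tangle:
    def __init__(self):
        self.genesis = Transaction("genesis", [])
        self.transactions = [self.genesis]

    def tips(self):
        """Transactions that no other transaction has approved yet."""
        approved = {p.tx_id for tx in self.transactions for p in tx.parents}
        return [tx for tx in self.transactions if tx.tx_id not in approved]

    def attach_round(self, payloads):
        """Simulate concurrent arrivals: every transaction issued in a round
        picks its parents from the same snapshot of the current tips."""
        current_tips = self.tips()
        for data in payloads:
            parents = random.sample(current_tips, k=min(2, len(current_tips)))
            self.transactions.append(Transaction(data, parents))

tangle = Tangle()
for i in range(3):
    tangle.attach_round([f"tx-{2 * i}", f"tx-{2 * i + 1}"])
```

Because each transaction references the hashes of the transactions it approves, approving a transaction implicitly vouches for its entire past — which is how a shared ledger state can be maintained without blocks or block-producing miners.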


Digital innovation and the future of financial services

The growth of Web3 technologies also offers new opportunities, as the internet evolves from read-only pages to more interactive and immersive experiences. “Web1 is traditional data. Web2 added social [interaction] as data. And in Web3, everything is data,” said Dr Booth. In Web3, that data has real value to its users, and underlying blockchain technology allows it to be transferred and monetised easily. The most recognisable example may be buying digital ‘land’ in the metaverse. Transactions that take weeks in the real world can be completed in seconds and recorded on a secure, immutable blockchain. “How we connect the metaverse to the real-verse is going to be where banks will play a role,” said Mr Williamson. The digitalisation of financial services will create virtual mountains of new and complex data, generated from disparate sources and stored in different locations. Yet when everything's digital, that kind of volume and complexity becomes manageable. From Dr Booth’s point of view, AI is the connective digital tissue holding everything together.
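The "secure, immutable blockchain" mentioned above rests on hash-linking: each block commits to the hash of its predecessor, so editing any past record invalidates every later link. A minimal, generic sketch of that idea (a textbook construction, not any particular chain's block format; the field names are invented for the example):

```python
import hashlib
import json

def block_hash(contents):
    """Hash a block's contents, which include the previous block's hash."""
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    contents = {"prev": prev, "transactions": transactions}
    chain.append({**contents, "hash": block_hash(contents)})
    return chain

def verify(chain):
    """Recompute every hash; an edit to any earlier block breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
        recomputed = block_hash({"prev": block["prev"],
                                 "transactions": block["transactions"]})
        if block["prev"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

chain = []
append_block(chain, [{"asset": "parcel-42", "to": "alice"}])
append_block(chain, [{"asset": "parcel-42", "to": "bob"}])
```

Tampering with any recorded transaction changes the hash it was committed under, so `verify` fails for the whole chain — which is why a transfer recorded this way can settle in seconds yet remain auditable indefinitely.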


How to Prove the ROI of Your Enterprise Architecture Efforts

The ROI of EA can be felt at the highest levels of the organization, but it can also have an impact at the individual department level. Cultivating this impact involves both research into the specific challenges different departments face and educating department heads on what EA can do. For example, legal teams may not know that enterprise architecture has a critical role to play when it comes to navigating compliance standards and regulations. Similarly, marketing and sales teams may not realize how EA can support data management to drive analytics and personalization efforts. Every corporate function today depends on technology to be effective. EA is all about better, more strategic uses of technology. It thus falls to EAs to evangelize their capabilities across the enterprise, seeking out often unexpected opportunities to improve operations and outcomes, department by department. Getting users from across your organization onto your EAM tool is no small undertaking.


How to turbocharge collaboration in innovation ecosystems

Handled in the right way, collaborations will align and connect potential co-innovators to a shared purpose—internally as well as externally. Importantly, value creation is no longer just a numbers game. Echoing the motto of former PepsiCo CEO Indra Nooyi, “performance with purpose,” impactful innovation in an ecosystem is driven by social values as much as by numbers. These are reflected in the challenges and specific problems the collaborators seek to address; in reframing the purpose, if necessary, so that it fits everyone’s objectives; and in the way the collaboration defines success. Much as in a team sport, egos and titles are swept aside in pursuit of a greater goal. ... Disruption, the digital revolution, COVID-19—in aggregate, these factors have blurred and, in some cases, dissolved the boundaries between organizations, segments, and entire industries. As a result, innovation ecosystems are emerging as the dominant paradigm for corporate innovation. Yet, because of the fundamental disparities embedded in their structures, ecosystems are difficult to form and initiate, let alone sustain.



Quote for the day:

"Be a Strong Leader, Even If You Follow a Weak Leader" -- Miles Anthony Smith