Ask questions like, “Are these applications still relevant?” or “Is this system working?” or “How can I make this system better?” to assess how you can add value and propel your organization to become an industry leader. The complex environment, fueled by continued advances in technology, hinders the organization’s ability to realize value. The enterprise architecture (EA) solution will likely not deliver immediate returns (Gong & Janssen, 2021). Kotusev (2018) noted that a rigid approach to enterprise architecture implementation is the worst strategy. Persistent evaluation of the EA solution is necessary to signal when adaptation is needed. It is appropriate to have parts of the EA strategy remain purposively generalized (Alwadain, 2020; Marcinkowski & Gawin, 2019). For example, a flexible EA solution can quickly transition to SaaS (software as a service) where that delivers more value than on-premises operations. Cooiman (2021) noted the importance of considering operations that directly support and influence portfolios, programs, projects, and business functions, such as supply chain management and payroll.
Get an MMF Baseline: Even if no formal metadata management framework (MMF) exists in an organization, an implicit one does. Technical documents mapping data architecture, the knowledgeable business analyst whom others turn to for help understanding reporting data, and data-entry procedures all provide context around an organization’s data and pieces of its MMF. Getting a baseline of what people, processes, and technology already exist, and how they inform the organization’s Metadata Management framework, just makes sense. Using a “qualified and knowledgeable data professional (and other skilled talents) to administer and interpret data readiness assessments,” along with data maturity models like those put forth by Gartner, or the Capability Maturity Model Integration (CMMI), gives a good MMF starting place. Be Clear About What an MMF Will Achieve: Be clear about why an organization needs to manage metadata and implement a Metadata Management framework. Metadata Management helps reduce training costs, enables better data usage across data systems, and simplifies communication.
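As a concrete (and purely illustrative) starting point, the baseline described above can be captured as a simple inventory of people, process, and technology assets tagged with a maturity level. The asset names, levels, and scoring scale below are hypothetical sketches, not drawn from Gartner or CMMI materials:

```python
from dataclasses import dataclass

# Illustrative five-level scale, loosely in the spirit of maturity models;
# the names and 1..5 scoring are assumptions for this sketch.
MATURITY_LEVELS = ["initial", "managed", "defined", "quantified", "optimizing"]

@dataclass
class MetadataAsset:
    name: str        # e.g. a data dictionary, an ETL job, a go-to analyst
    category: str    # "people", "process", or "technology"
    maturity: str    # one of MATURITY_LEVELS

def maturity_score(assets):
    """Average maturity across the inventory, on a 1..5 scale."""
    if not assets:
        return 0.0
    return sum(MATURITY_LEVELS.index(a.maturity) + 1 for a in assets) / len(assets)

# A hypothetical baseline built from the kinds of implicit assets named above.
baseline = [
    MetadataAsset("data architecture diagrams", "technology", "defined"),
    MetadataAsset("reporting analyst (tribal knowledge)", "people", "initial"),
    MetadataAsset("data-entry procedures", "process", "managed"),
]
print(f"baseline maturity: {maturity_score(baseline):.2f}")  # (3 + 1 + 2) / 3 = 2.00
```

Even this crude score gives the assessment a starting number to improve on, and the inventory itself documents where the implicit MMF currently lives.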
The right tooling will help you put your governance framework into practice, providing the guardrails and data visibility your teams need to boost trust and confidence in their data analysis. Perhaps the most fundamental tool for data governance—certainly the greatest help for us here at Tableau—is our integrated data catalog. It lets employees see data details like definitions and formulas, lineage and ownership information, and important data quality notifications, from certification status to events such as a failed data source refresh that leaves information out of date. A data catalog boosts the visibility of valuable metadata right in people’s workstreams, whether that metadata lives in Tableau or is brought in from an external metadata management system via an API. This also helps IT with impact analysis and change management, showing which users and assets are affected downstream when changes are made to a table.
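To make the impact-analysis idea concrete, here is a hedged Python sketch that walks catalog metadata to find the downstream assets a table change would affect. The response shape loosely mimics what a catalog metadata API (such as Tableau's GraphQL Metadata API) might return, but the field names here are illustrative assumptions, not the exact schema:

```python
# Sketch of downstream impact analysis over catalog metadata.
# "databaseTables" / "downstreamWorkbooks" mirror the flavor of lineage
# queries in catalog APIs; treat the exact shape as hypothetical.

def downstream_impact(catalog, table_name):
    """Return (workbook, owner) pairs that depend on the given table."""
    for table in catalog["databaseTables"]:
        if table["name"] == table_name:
            return [(wb["name"], wb["owner"]) for wb in table["downstreamWorkbooks"]]
    return []

# A canned lineage response standing in for a live API call.
sample_catalog = {
    "databaseTables": [
        {
            "name": "sales_fact",
            "downstreamWorkbooks": [
                {"name": "Quarterly Revenue", "owner": "finance-team"},
                {"name": "Regional Pipeline", "owner": "sales-ops"},
            ],
        }
    ]
}

# Before altering sales_fact, IT can see who is affected downstream:
print(downstream_impact(sample_catalog, "sales_fact"))
```

In practice the `catalog` dictionary would come from the metadata system's API rather than a literal, but the traversal, and the change-management value of asking the question before the change, is the same.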
A centralized DLT is not immutable: the ledger can be rewritten arbitrarily by whoever controls it, or as the result of a cyberattack. Because of its open and competitive nature (mining, staking, etc.), a public blockchain can achieve immutability, and hence its records are credible. Thousands of independent nodes ensure an unprecedented level of resistance to any sort of attack. The question of irreversibility usually comes next, after the discussion of immutability. How do you correct a mistake? What if you need to change your smart contract? What if you lost your private key? On a blockchain, there is nothing you can do retroactively — alteration is impossible. What’s done is done. In this regard, a centralized DLT is usually presented as the opposite of blockchain, and as an alternative to it. You will hear that DLTs can be designed so that those who control the network verify transactions on entry, and therefore non-compliant transactions are not allowed to pass through. But it would be a fallacy to think that censorship in the network will ultimately exclude all mistakes and unwanted transactions. There will always be a chance for a mistake.
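The immutability claim above rests on hash chaining: each block commits to the hash of its predecessor, so altering any earlier record invalidates every link that follows it. A minimal, self-contained sketch (a toy chain, not a real consensus protocol) shows why a retroactive edit is always detectable:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(records):
    """Link each record to the hash of the block before it."""
    chain, prev = [], "0" * 64  # genesis predecessor
    for record in records:
        block = {"record": record, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Recompute every link; any rewritten record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["pay Alice 5", "pay Bob 3", "pay Carol 7"])
assert verify(chain)
chain[0]["record"] = "pay Mallory 5"  # a retroactive edit...
assert not verify(chain)              # ...is immediately detectable
```

On a public blockchain, thousands of independent nodes each hold and verify such a chain, which is what turns "detectable" into "practically impossible to rewrite"; a centralized DLT controller, by contrast, can simply recompute all the hashes after editing.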
The extensive documentation, verified by third-party brokers, that has underpinned trading and commercial agreements in the past is at odds with digital ways of working. The same steps of these processes need to be maintained, but conducted through digital interfaces that are more open and more complex. Distributed ledger technology (DLT) can fill this gap. A distributed ledger is an approach that creates equal, decentralized copies of transactions instead of storing them in one central place (i.e., a database for digital records, or a document for analogue ones). What makes DLT so exciting and relevant is that it was conceived and developed for this decentralized digital world, where trust is at a premium. Instead of being built on existing relationships, trust can be anchored in encrypted processes (the so-called consensus algorithms), which control the transactions. It is not simply the safe storage of information that creates trust; it is also how the information is collected. DLT can determine the conditions under which nodes of the decentralized infrastructure capture and record new transactions, and when they do not.
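That last point, nodes recording a transaction only when agreed conditions hold, can be sketched as a rule check applied before anything is appended to the ledger. The rules and transaction fields below are invented for illustration and stand in for whatever a real network's consensus rules would enforce:

```python
# Toy sketch of entry-time validation: a node admits a transaction to its
# copy of the ledger only if every compliance rule passes. The rules and
# field names are hypothetical examples, not a real protocol.

RULES = [
    lambda tx: tx.get("amount", 0) > 0,               # no zero/negative transfers
    lambda tx: tx.get("sender") != tx.get("receiver"), # no self-dealing
    lambda tx: "signature" in tx,                      # must be signed
]

def admit(ledger, tx):
    """Append tx only if every rule passes; return whether it was admitted."""
    if all(rule(tx) for rule in RULES):
        ledger.append(tx)
        return True
    return False

ledger = []
admit(ledger, {"sender": "A", "receiver": "B", "amount": 10, "signature": "sigA"})
admit(ledger, {"sender": "A", "receiver": "A", "amount": 5, "signature": "sigA"})
print(len(ledger))  # → 1: only the compliant transaction was recorded
```

In a real network every node would run the same checks independently, so trust comes from the shared rules rather than from any single record-keeper.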
Quote for the day:
"We get our power from the people we lead, not from our stars and our bars." -- J. Stanford