Daily Tech Digest - December 12, 2022

14 lessons CISOs learned in 2022

Ransomware attacks have increased in 2022, with companies and government entities among the most prominent targets. Nvidia, Toyota, SpiceJet, Optus, Medibank, the city of Palermo, Italy, and government agencies in Costa Rica, Argentina, and the Dominican Republic were among the victims in 2022, a year in which the lines between financially and politically motivated ransomware groups continued to be blurred. A critical piece of any organization's defense strategy should be employee awareness and training because "employees continue to be targeted in threat actor strategies through phishing and other social engineering means," says Gary Brickhouse, CISO at GuidePoint Security. ... Organizations should also do more to keep up with vulnerabilities in both open- and closed-source software. However, this is no easy task since thousands of bugs surface yearly. Vulnerability management tools can help identify and prioritize vulnerabilities found in operating systems and applications.
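
At its core, the prioritization such tooling automates is a triage problem. A minimal sketch of that idea, with hypothetical CVE identifiers and scores (not real vulnerability data):

```python
# Hypothetical triage sketch: rank vulnerabilities for patching by
# known exploitation first, then raw CVSS severity. The CVE IDs,
# scores, and exploitation flags below are illustrative only.
vulns = [
    {"cve": "CVE-2022-0001", "cvss": 9.8, "exploited": True},
    {"cve": "CVE-2022-0002", "cvss": 5.3, "exploited": False},
    {"cve": "CVE-2022-0003", "cvss": 7.5, "exploited": True},
]

def priority(v):
    # Known exploitation outranks raw severity; CVSS breaks ties.
    return (v["exploited"], v["cvss"])

patch_order = sorted(vulns, key=priority, reverse=True)
for v in patch_order:
    print(v["cve"])
```

Real tools weigh far more signals (asset criticality, exposure, exploit maturity), but the output is the same: an ordered patch queue rather than an unranked list of thousands of bugs.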


Grow your own CIO: Building leadership and succession plans

To ensure the long-term health of the company, tech chiefs must focus on building up that middle tier of IT leaders, a reality many CIOs are only now recognizing the need to address. “There are not enough people out there — you have to develop your own people,’’ says Roberts, who estimates that only 10% to 20% of companies are “being intentional about doing formal development programs.’’ Mike Eichenwald, a senior client partner at Korn Ferry Consulting, agrees that it’s important to elevate individuals from vertical leadership roles within the pillars of infrastructure, engineering, product, and security to enterprise leadership roles. With technology converging in all aspects of the business, doing so will help organizations leverage the diversity of experience those midlevel managers have under their belts, and their learning curve and degree of risk will be minimized, Eichenwald says. “Unfortunately, organizations miss an opportunity to cultivate that talent internally and often find themselves needing to reach out to the [external] market to bring it in,’’ he adds.


Open source security fought back in 2022

Anyone paying attention to open source for the past 20 years—or even the past two—will not be surprised to see commercial interests start to flourish around these popular open source technologies. As has become standard, that commercial success is usually spelled c-l-o-u-d. Here's one prominent example: On December 8, 2022, Chainguard, the company whose founders cocreated Sigstore while at Google, released Chainguard Enforce Signing, which enables customers to use Sigstore-as-a-service to generate digital signatures for software artifacts inside their own organization using their individual identities and one-time-use keys. This new capability helps organizations ensure the integrity of container images, code commits, and other artifacts with private signatures that can be validated at any point an artifact needs to be verified. It also allows a dividing line where open source software artifacts are signed in the open in a public transparency log; however, enterprises can sign their own software with the same flow, but with private versions that aren’t in the public log. 
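
The public transparency log mentioned above is, conceptually, an append-only tamper-evident record of artifact signatures. A minimal sketch of that idea, assuming a simple hash chain (Sigstore's actual log, Rekor, is backed by a full Merkle tree; this is only a conceptual simplification):

```python
import hashlib

# Minimal sketch of an append-only, tamper-evident log, the core idea
# behind a transparency log. Each entry's chain hash commits to every
# entry before it, so any later edit is detectable.
class TransparencyLog:
    def __init__(self):
        self.entries = []  # list of (artifact_digest, chain_hash)

    def append(self, artifact: bytes) -> str:
        digest = hashlib.sha256(artifact).hexdigest()
        prev = self.entries[-1][1] if self.entries else ""
        chain = hashlib.sha256((prev + digest).encode()).hexdigest()
        self.entries.append((digest, chain))
        return chain

    def verify(self) -> bool:
        # Recompute the whole chain; any tampered entry breaks it.
        prev = ""
        for digest, chain in self.entries:
            if hashlib.sha256((prev + digest).encode()).hexdigest() != chain:
                return False
            prev = chain
        return True

log = TransparencyLog()
log.append(b"container-image-v1")
log.append(b"code-commit-abc123")
print(log.verify())  # True: chain is intact
```

The "dividing line" the article describes amounts to running the same signing flow against a private instance of such a log instead of the public one.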


Turning the vision of a utopic smart city into reality

It’s critical to consider what success looks like, and this can be measured by how user-friendly and efficient a service is, as well as by cost efficiencies. For instance, parking apps that indicate free spaces and process payment can cut the time needed to find a parking space in a new city from an hour to just a few minutes. It’s almost impossible to consider smart cities without thinking about the efficient energy management benefits of smart buildings. Sustainable initiatives such as integrated workplace management systems already have the capability to monitor over 50,000 data points per second, analyse data, and send it to mobile apps. This could see millions of users saving energy. With a long-term vision for smart city platforms to become unified or standardised, one solution can potentially work seamlessly anywhere in the world. Platforms could integrate city infrastructure and navigation, and access to emergency and city services. Transformation will be driven by users empowered with the right data, perhaps even according to their user type of student, tourist, or city resident.


Can real-time data visualisation deliver trust and opportunity?

What is interesting is that so much of this is driven through an ecosystem of partners. No one organisation can deliver the breadth and depth of data and tools needed to make such projects work and there is much to learn from that. Collaborations and partnerships can elevate and enhance real-time data visualisation and value. For many organisations however, real-time data is still virgin territory and real-time visualisation is one of those technologies where reality cannot hope to match expectation, at least according to Jaco Vermeulen, CTO of tech consultancy BML Digital. “Almost every customer says they want real-time visualisation, but then nine out of 10 can’t qualify why they need it, especially when it comes to what decisions or actions it will enable,” says Vermeulen. “This is usually because they start from the belief that the data is always available and therefore should be immediately understandable and yield profound insight. The truth is a bit more challenging.” ... “It is the real-time decisions that create impact,” he says. “Optimising supply chains, reducing waste and pollution, optimising operations, and informing and satisfying consumers. 


IBM’s Krishnan Talks Finding the Right Balance for AI Governance

The challenge comes essentially from not knowing how the sausage was made. One client, for instance, had built 700 models but had no idea how they were constructed or what stages the models were in, Krishnan said. “They had no automated way to even see what was going on.” The models had been built with each engineer’s tool of choice with no way to know further details. As a result, the client could not make decisions fast enough, Krishnan said, or move the models into production. She said it is important to think about explainability and transparency for the entire life cycle rather than fall into the tendency to focus on models already in production. Krishnan suggested that organizations should ask whether the right data is being used even before something gets built. They should also ask if they have the right kind of model and if there is bias in the models. Further, she said automation needs to scale as more data and models come in. The second trend Krishnan cited was the increased responsible use of AI to manage risk and reputation to instill and maintain confidence in the organization.
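
The missing capability Krishnan describes is essentially a model inventory: a registry recording who built each model, with what tool, and where it sits in the life cycle. A hypothetical minimal sketch (the stage names, model names, and tools are illustrative, not any vendor's product):

```python
# Hypothetical model registry sketch: track owner, tooling, and
# lifecycle stage for every model, so governance questions ("what
# stage is this in? who built it?") have an automated answer.
STAGES = ("proposed", "data-validated", "in-training", "validated", "production")

class ModelRegistry:
    def __init__(self):
        self.models = {}

    def register(self, name, owner, tool):
        self.models[name] = {"owner": owner, "tool": tool, "stage": STAGES[0]}

    def advance(self, name):
        # Move a model one step forward in its life cycle.
        m = self.models[name]
        i = STAGES.index(m["stage"])
        if i + 1 < len(STAGES):
            m["stage"] = STAGES[i + 1]

    def in_stage(self, stage):
        return [n for n, m in self.models.items() if m["stage"] == stage]

registry = ModelRegistry()
registry.register("churn-model", owner="alice", tool="xgboost")
registry.register("fraud-model", owner="bob", tool="pytorch")
registry.advance("churn-model")
print(registry.in_stage("data-validated"))  # ['churn-model']
```

Note that the earliest stage comes before any model is built, matching Krishnan's point that data and bias questions belong at the start of the life cycle, not only in production.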


13 tech predictions for 2023

“Different edges are implemented for different purposes. Edge servers and gateways may aggregate multiple servers and devices in a distributed location, such as a manufacturing plant. An end-user premises edge might look more like a traditional remote/branch office (ROBO) configuration, often consisting of a rack of blade servers. Telecommunications providers have their own architectures that break down into a provider far edge, a provider access edge, and a provider aggregation edge.” ... As we enter 2023, CIOs have earned a seat among the decision-makers and are now at the helm of company-wide technology decision-making. Amid a volatile economic climate, IT leaders must prioritize reducing costs, but they are finding themselves pulled between contrasting concerns of managing spend, dealing with security risks, and fostering innovation. As they navigate an uncertain market, CIOs will need to analyze company usage, along with their previous experience, to rethink business approaches and make decisions. The goal is to identify ways to reduce spend across the company, but not at the expense of key areas like cybersecurity and innovation.


Preventing a ransomware attack with intelligence: Strategies for CISOs

One of the most effective ways to stop a ransomware attack is to deny attackers access in the first place; without access, there is no attack. The adversary needs only one route of access, yet the defender has to be aware of, and prevent, all entry points into a network. Various types of intelligence can illuminate risk across the pre-attack chain—and help organizations monitor and defend their attack surfaces before they’re targeted by attackers. The best vulnerability intelligence should be robust and actionable. For instance, with vulnerability intelligence that includes exploit availability, attack type, impact, disclosure patterns, and other characteristics, vulnerability management teams can predict the likelihood that a vulnerability could be used in a ransomware attack. With this information in hand, vulnerability management teams, who are often under-resourced, can prioritize patching and preemptively defend against vulnerabilities that could lead to a ransomware attack. Having a deep and active understanding of the illicit online communities where ransomware groups operate can also help inform methodology and prevent compromise.
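
One way to picture the scoring step is as a weighted combination of the intelligence signals listed above. A purely illustrative sketch, where the signals and weights are hypothetical rather than any published model:

```python
# Illustrative vulnerability-intelligence scoring sketch: weight the
# characteristics described above to rank how likely a CVE is to be
# used in a ransomware attack. Weights are hypothetical.
WEIGHTS = {
    "exploit_available": 4,       # public exploit code exists
    "remote_code_execution": 3,   # high-impact attack type
    "actively_discussed": 2,      # chatter in illicit communities
    "recently_disclosed": 1,      # disclosure pattern
}

def ransomware_likelihood(signals: dict) -> int:
    # Sum the weights of the signals present; 10 is the maximum score.
    return sum(w for k, w in WEIGHTS.items() if signals.get(k))

cve = {"exploit_available": True, "remote_code_execution": True,
       "recently_disclosed": True}
print(ransomware_likelihood(cve))  # 8
```

An under-resourced team can then patch in descending score order, which is exactly the prioritization the article argues for.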


What to do when your devops team is downsized

If you lead teams or manage people, your first thought must be how they feel or how they are personally impacted by the layoffs. Some will be angry if they’ve seen friends and confidants let go; others may be fearful they’re next. Even when leadership does a reasonable job at communication (which is all too often not the case), chances are your teams and colleagues will have unanswered questions. Your first task after layoffs are announced is to open a dialogue, ask people how they feel, and dial up your active listening skills. Other steps to help teammates feel safe include building empathy for personal situations, energizing everyone around a mission, and thanking team members for the smallest wins. Use your listening skills to identify the people who have greater concerns and fears or who may be flight risks. You’ll want to talk to them individually and find ways to help them through their anxieties or recognize when they need professional help. You should also give people and teams time to reflect and adjust. Asking everyone to get back to their sprint commitments and IT tickets is insensitive and unrealistic, especially if the company laid off many people.


Our ChatGPT Interview Shows AI Future in Banking Is Scary-Good

ChatGPT is a large, advanced language processing model that is trained using a technique called generative pre-trained transformer, or GPT. This allows ChatGPT to generate human-like responses to questions and statements in a conversation, making it a powerful tool for a wide range of applications. Compared to traditional chatbots, which are often limited in their ability to understand and generate natural language, ChatGPT has the advantage of being able to provide more accurate and detailed responses. Additionally, because it is trained using a large amount of data, ChatGPT is able to learn and adapt to different conversational styles and contexts, making it more versatile and capable of handling a wider range of scenarios. ... The banking industry can use ChatGPT technology in a number of ways to improve their operations and provide better service to their customers. For example, ChatGPT can be used to automate customer service tasks, such as answering frequently asked questions or providing detailed information about products and services. This can free up customer service representatives to focus on more complex or high-value tasks, improving overall efficiency and customer satisfaction.



Quote for the day:

"Strong leaders encourage you to do things for your own benefit, not just theirs." -- Tim Tebow

Daily Tech Digest - December 11, 2022

Designing out of difficult times

Uniquely challenging times call for unique approaches, not the standard playbook. Design offers this fresh perspective. McKinsey research has shown that companies that embrace the business value of design are better able to respond to shifting landscapes and generate improved performance. From 2013 to 2018, these companies had total shareholder returns (TSR) 56 percentage points higher than those of their peers. In addition, companies that continued or increased their investment in innovation during the 2008–09 recession generated three times more growth compared with their industry peers in the three to five years that followed—in many cases leapfrogging their competitors. These results make sense given that a recession doesn’t mean that markets and customer needs suddenly stop evolving. In fact, such evolutions often speed up. For these reasons, we believe design should join topics such as finance, strategy, and talent on the CEO’s agenda. In this article, we explore specific examples where design has the potential to create significant value and boost an organization’s resilience. Executives can use the design function to unleash the power of creativity in strategy and problem solving in at least five important areas (exhibit).


Microsoft’s Distributed Application Framework Orleans Reaches Version 7

In Orleans, the desired distributed functionality is modelled as a grain, an addressable unit of execution that can send and receive messages to other grains and maintain its own state if necessary. The grains are virtual actors, persisted to durable storage and activated in memory on demand, in the same sense as virtual memory is an abstraction over a computer's physical memory. The grains had to inherit from the Grain base class in the previous versions of Orleans. Now the grains can be POCO objects. To get access to the functionality previously available only inside the Grain class, they can implement the IGrainBase interface instead. The Orleans runtime handles grain activation and deactivation, and finds and invokes grains as necessary. It also manages clusters of silos, the containers in which grains execute. Communication with the Orleans runtime is done using the client library. The last Orleans major version before 7.0 was version 3.0, released in 2019. The planned 4.0 release was later ported to .NET 7 and renamed to 7.0 to match the broader .NET 7 ecosystem launches.
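
Orleans itself is a .NET framework, but the virtual-actor idea behind grains can be sketched language-neutrally. The following Python sketch only illustrates the concept of identity-addressed activation and state rehydration; class and method names are invented for illustration:

```python
# Conceptual sketch of the virtual-actor model behind Orleans grains:
# a grain is addressed by identity, activated in memory on first use,
# and its state is loaded from / persisted to durable storage.
class CounterGrain:
    def __init__(self, grain_id, storage):
        self.grain_id = grain_id
        self.storage = storage
        self.count = storage.get(grain_id, 0)   # rehydrate persisted state

    def increment(self):
        self.count += 1
        self.storage[self.grain_id] = self.count  # persist on write

class Silo:
    """Activates grains on demand, as the Orleans runtime does."""
    def __init__(self):
        self.storage = {}   # stand-in for durable storage
        self.active = {}    # in-memory activations

    def get_grain(self, grain_id):
        if grain_id not in self.active:
            self.active[grain_id] = CounterGrain(grain_id, self.storage)
        return self.active[grain_id]

    def deactivate(self, grain_id):
        self.active.pop(grain_id, None)  # state survives in storage

silo = Silo()
silo.get_grain("user/42").increment()
silo.deactivate("user/42")              # evict from memory
print(silo.get_grain("user/42").count)  # 1: state was rehydrated
```

The virtual-memory analogy in the article maps directly onto this: callers always address the grain by identity, and the runtime decides when it lives in memory.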


Collaboration on IoT could transform risk and insurance

IoT offers an opportunity to fundamentally transform the insurance and risk management proposition in the large commercial space, to the mutual benefit of both customers and carriers. Networks of partnerships and IoT ecosystems will enable insurers to bundle technology, risk management services and risk transfer, while the flow of real-time risk data and insights would pave the way for a range of new and innovative solutions. However, these benefits can only be fully realised when IoT is installed in close collaboration with risk management and other business functions that stand to benefit or that would be impacted. Risk managers and risk experts play a vital role in transforming IoT data into a meaningful tool, and are ideally positioned to facilitate these discussions with internal stakeholders, as well as insurers. We are only at the start of this exciting journey, but collaboration will be critical for understanding customer needs and creating new solutions.


4 Steps to Help Organizations Embrace Risk from Emerging Technology

Particularly as companies invest in emerging technologies, business leaders need to listen more to their risk and compliance functions and integrate them into conversations about how those technologies will be implemented. Artificial intelligence is a great example: when companies rush to implement systems to accelerate efficiency and analyze trends, they risk creating disproportionate bias and violating personal privacy through data sourcing. Risk professionals need to be at the table from beginning to end to make sure that an evolving regulatory environment and other pitfalls are fully accounted for in the organization’s implementation process. While investment in risk management technology is helpful, it is insufficient without making structural changes to the organization to prioritize the risk function company-wide. ... When adopted purposefully, emerging technologies can make companies more efficient, more profitable, and better stewards for their employees, clients and communities. Risk is often unavoidable for early adopters of emerging technologies, but it can be mitigated if C-suites equip their risk functions with a holistic strategy and a voice in key business decisions.


EU fails to protect human rights in surveillance tech transfers

In its decision, the Ombudsman said that, having examined the documentation surrounding several EUTFA projects, there was no indication that proper human rights impact assessments were carried out. “The Ombudsman has identified shortcomings in that the Commission was not able to demonstrate that the measures in place ensured a coherent and structured approach to assessing the human rights impacts of EUTFA projects,” it said. “The Ombudsman finds it regrettable that the EUTFA projects in question were not subject to a clear human rights impact assessment, presented either as a separate document or a separate section in the action documents.” It further noted, for example, that despite the EUTFA projects covered by its inquiry being implemented in countries with major governance issues and poor human rights records, the analysis conducted by the Commission focused more on logistical issues, and that any assessments of the human rights impacts were “sporadic and unstructured” at best. It added that while the Commission itself considers the measures in place – including its multilayer approval process of projects; the use of specific “action” documentation for projects; and the possible suspension of funds – to be sufficient in safeguarding human rights, “the Ombudsman disagrees”.


Defining a Data-Driven Culture to Turn Uncertainty into Possibility

The limiting factor often lies in the design of organizational structures, especially in those focused on executing or exploiting current business models. The ultra-focus on efficiency here may lead to leaders stifling key impulses that lead to change, such as seeking new data or exploring new possibilities, which are also inherently basic human attributes. Recognizing this need for change in uncertain environments is a critical first step, and a strong data-driven culture values and promotes curiosity among its employees, and fosters creativity as an outcome to turn uncertainty into possibility. Nathan Furr, a celebrated author and professor at Institut Européen d’Administration des Affaires (INSEAD), has studied uncertainty based on interviews and observations across world-renowned leaders, innovators, and entrepreneurs. His studies provide strong evidence that leaders can be trained to face uncertainty and, as a result, discover new sources of revenue not seen before. You need to start by asking how your industry must evolve to meet the challenges of new technologies.


Is the Hassle of Sharing Data Worth the Value it Creates?

When deciding whether to address customer demands that require collection, use and sharing of their personal data, D&A leaders must consider the balance of economic and consumer value. Too often, a focus on regulations and legalese gets in the way of this conversation. Legal discussions about data-driven initiatives are often constrained by risk avoidance mindsets, which limits business impact and value. To prioritize business outcomes, executive leaders must focus data collection and monetization discussions on revenue generation, cost savings, and balanced risk mitigation. Start data sharing and monetization efforts by identifying known desired business outcomes, as well as unknown opportunities, for quantifiable economic benefit. Organizations can expend sizable investments for personal data rights, including the right to use/reuse and share/reshare data, but not all rights procured will match your business case, be relevant to consumer demand, or even be enforceable, resulting in economic waste. Invest resources to obtain legal rights to locate, use and share the “right” data to match your targeted monetization use case.


Privacy-first data via data mesh: migrating governance to federated delegation

As you are building organizational co-ownership and shared responsibility via federation, you also need to design systems that allow for federated design, development and use. This means pushing your architects to think about data governance in a federated manner, which might be new for them! For privacy experts at your organization, this means working closely with your information security and technical counterparts. How can we implement the privacy goals set by the organization in our architecture and software? How does our architecture design support privacy first? You may need to spend time exchanging knowledge to get to a place where this conversation can happen more fluidly and in the same language. If technical leaders at your organization haven’t already heard about Privacy Engineering — this is a great place for them to start learning the current technologies, processes, design and theory around incorporating privacy in a technical environment. 


Stakeholders want more than AI Bill of Rights guidance

The AI Bill of Rights offers guidance and resources to businesses, but is not an enforceable law, meaning businesses and federal agencies don't need to comply with the ethical AI principles it lays out, Engler said. Instead, the AI Bill of Rights catalyzes federal agencies to act on the guidance and points the way for policymakers to consider AI regulation, said Harlan Yu, executive director of technology and equity nonprofit Upturn, based in Washington, D.C. "This document in the long term, will be judged not by what's on paper but all the concrete actions that are going to flow from this document, particularly from the federal agencies," Yu said during the panel discussion. "We're talking about prospective rule-making, enforcement actions, regulatory guidance and legislative actions that really need to put these principles into practice." The AI Bill of Rights applies to all automated systems that significantly impact people, such as AI decision-making systems for housing, employment and healthcare-related decisions, said Sorelle Friedler, OSTP's assistant director for data and democracy and a panelist.


Does Enterprise Architecture belong in IT?

The belief among the panelists was that EA is a core component in delivering business value. They all saw themselves as acting on behalf of the business to support business objectives. I wondered if, perhaps, the reason some EAs wanted to move to the business was because they believed they would have more visibility and more purpose there. As if to echo my thoughts, at least one attendee was not totally confident that shifting to the business would have the desired effect. Mats Berglund from Ericsson wanted to understand why Enterprise Architects have a “kind of complex” about where they sit. Instead of asking whether we should report into this or that part of the organization, he said, “We should be asking ourselves, as EAs, what are we giving the business?” Mats said it didn't matter that EA was in IT. Enterprise Architects should continuously prove their value to the business wherever they sit. Their contribution, after all, is what makes EAs credible advisors to the business. Returning for a moment to Dominik Söhnle’s thesis, he makes the point there that part of this "IT or the business" tension arises from a lack of business focus in some of the EA tools on the market.



Quote for the day:

"There is no "one" way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer

Daily Tech Digest - December 10, 2022

Risk and resilience priorities, as told by chief risk officers

CROs acknowledge that they need to spend more time considering “over the horizon risks.” This gap in thinking was brought into sharp focus by the heavy impact the COVID-19 pandemic and geopolitical tensions had on their institutions’ risk profiles—including second- and third-order effects such as supply chain risk, inflation, and rising interest rates—which were not anticipated by most banking executives. Institutions were ill prepared to address these highly consequential risks. The failure goes well beyond risk functions, however. Many organizations used forecasting to develop market strategies, but this approach failed to pick up major reality shifts in the recent past—from the financial crisis of the 2000s to the pandemic to geopolitical realignments. Leading institutions are moving to scenario-based foresight to increase institutional resilience against over-the-horizon risks. The risk function can play an important role here in ensuring that the scenarios capture existing and expected risks, while aligning function priorities against scenarios.


Should central banks use DLT for CBDC?

When it comes to the topic of whether “to DLT or not to DLT” in the world of CBDC, Mikhalev took a slightly different position. He stated central banks have taken a top-down approach for hundreds of years, and while this works in many jurisdictions, it doesn’t work as well in emerging markets. “To have blockchain or not for CBDCs is increasingly being answered in the negative across established economies. ... This could reduce volatility in these emerging markets. These aspects which are specifically inherent to decentralisation and the distribution of power, should have positive effects in emerging economies.” However, Mikhalev continued, in each conversation carried out with central bankers in developed countries, he has found that they perceive blockchain as having little effect in situations where the supervisory institutions are not ready or unwilling to alter their business models around a new technology. “Blockchain doesn't really make much difference if nothing changes in terms of the existing established top-down structure of CBDCs. However, in emerging economies, this seems to differ,” Mikhalev noted.


A compliance fight in Germany could hurt Microsoft customers

German compliance authorities “can live with the situation where Microsoft pretends to do everything right and the authorities pretend to have done everything in their power to force Microsoft to become compliant,” Hence said in an interview with Computerworld. Microsoft “does not fulfill the most basic requirements of GDPR. They lack basic transparency. We can’t assess what they are doing because they are not telling us.” This is where politics comes into play, where practical forces can influence government compliance actions. German regulators “are afraid of retribution. (With regulators thinking) we won't get more budget if we say that you can’t use Office any more. Or even Google Analytics, any more,” Hence said. “These are political issues. Nobody wants to be the bad guy.” Thus, Microsoft is likely to skate on the issue — at least for now. But what about enterprise IT execs? Are companies using Microsoft products immune from compliance punishments? Not necessarily. It might not seem fair to let Microsoft get away with this but to fine and otherwise punish its customers, but Hence argues that's quite likely. And not just in Germany.


Policy Developments around Blockchain

On September 15, 2022, Singapore’s Monetary Authority (MAS) launched the Financial Services Industry Transformation Map 2025 that provides a framework of strategies to develop the country as a leading global financial center through enhanced payment connectivity to build a responsible digital asset ecosystem. It also laid out clear strategies to explore DLT in use cases such as cross-border payments, trade finance, and capital markets, besides supporting tokenization of financial assets. The policy supports a central bank digital currency (CBDC) and public-private collaboration to develop the infrastructure required to deliver such a currency. However, the first off the blocks in 2022 was the Securities and Futures Commission of Hong Kong which issued a joint circular with the HK Monetary Authority on intermediaries that can undertake virtual asset-related activities. Per the statement, intermediaries distributing virtual assets need to comply with the SFC’s requirements for sale of the products.


Share of Emerging Technologies in IT Budgets

National Association of Software and Services Companies (NASSCOM) and Boston Consultancy Group (BCG) today released a report titled "Sandboxing into the Future: Decoding Technology's Biggest Bets" on the sidelines of NASTech 2022 in Bengaluru. The report aims to uncover and develop perspectives on big-bet technologies that can potentially disrupt markets in the next 3-5 years. Enterprise Tech spending is estimated to reach $4.2 Tn by 2026 globally, amongst which Tech Services companies represent the largest segment and are expected to become $1.7 Tn by 2026 with a CAGR of 8.1%. As part of the study, 28 emerging technology themes from 11 tech families were identified – across markets and verticals – with the potential to disrupt markets, based on current tech spending, growth potential, innovation maturity, and funding momentum. Amongst these, 12 emerging technologies with high funding momentum and R&D focus have emerged as the “Biggest Bets”, including Autonomous analytics, AR & VR, Autonomous Driving, Computer Vision, Deep learning, Distributed Ledger, Edge Computing, Sensor Tech, Smart Robots, Space Tech, Sustainability Tech, and 5G/6G.


Amazon Wants to Kill the Barcode

The system, called multi-modal identification, isn't going to fully replace barcodes soon. It's currently in use in facilities in Barcelona, Spain, and Hamburg, Germany, according to Amazon. Still, the company says it's already speeding up the time it takes to process packages there. The technology will be shared across Amazon's businesses, so it's possible you could one day see a version of it at a Whole Foods or another Amazon-owned chain with in-person stores. The problem that the system eliminates -- incorrect items coming down the line to be sent to customers -- doesn't happen too often, Amazon says. But even infrequent mistakes add up to significant slowdowns when considering just how many items a single warehouse processes in one day. Amazon's AI experts had to start by building up a library of images of products, something the company hadn't had a reason to create prior to this project. The images themselves as well as data about the products' dimensions fed the earliest versions of the algorithm, and the cameras continually capture new images of items to train the model with. The algorithm's accuracy rate was between 75% and 80% when first used, which Amazon considered a promising start. 


Cyber crime threatens manufacturing production

Targeted attacks are the most common, with smaller companies often the most vulnerable, yet many offering no cyber security training to staff. Sixty-two percent of manufacturers now have a formal cyber security procedure in place in the event of an incident, up 11% on last year’s figures with the same number giving a senior manager responsibility for cyber security. More than half (58%) have escalated this responsibility to board level. Stephen Phipson, CEO of Make UK, the manufacturers’ organisation said: “Digitisation is revolutionising modern manufacturing and becoming increasingly important to drive competitiveness and innovation. “While cost remains the main barrier to companies installing cyber protection, the need to increase the use of the latest technology makes mounting a defence against cyber threats essential. No business can afford to ignore this issue and while the increased awareness across the sector is encouraging, there is still much to be done. “Every business is vulnerable, and every business needs to take the necessary steps to protect themselves properly.”


How Do Agility and Software Architecture Fit Together?

But when we create software, we make decisions all the time. So the question is: what is architecture? How is architecture different from all of these normal decisions that we take? When I think about that, I always use a very loose definition: architecture is about the important things. I think Martin Fowler said something like that, and I really liked it, because it's about those things that have a high risk or a high cost of change if we need to re-evaluate or redo them. Those are the types of decisions that qualify as architectural decisions. And then the question is: when do we decide on these important things, whatever "important" means? In my opinion, there is something quite underappreciated in most Scrum teams, and from my experience it is that the product owner has a very important part when it comes to software architecture, because it always starts with the vision of the product.


What’s a distributed compliance ledger and how is one integrated into Matter?

Matter’s DCL is a network of independent servers operated by the CSA and its partners. Each DCL server includes a complete copy of the database. The original data is managed and controlled by the CSA. The DCL is implemented by connecting all the servers using a cryptographically secured protocol. The DCL makes it difficult to manipulate the data in the database and increases the security of Matter devices and networks. ... To add a new product to the DCL, the manufacturer writes the data to the database. It’s not ‘active’ until approved by the CSA. Once the device has passed certification and the CSA has received confirmation from the PPA, the CSA adds “certified” to the status list, letting all members of the Matter ecosystem know that this is an approved device ready to be added to Matter networks. Database access is restricted. Device makers can only add data for their own products that are linked to their vendor identification (VendorID) number. Software updates must also be linked to the VendorID, or they will be rejected. Official CSA PPA bodies or the CSA can confirm or revoke device compliance data.
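The vendor-restricted write and CSA-approval flow described above can be sketched as a toy model. This is purely illustrative: the class and method names are invented, and the real DCL is a replicated, cryptographically secured ledger, not an in-memory Python object.

```python
from dataclasses import dataclass


@dataclass
class DclRecord:
    """A product entry; it stays 'pending' until the CSA approves it."""
    vendor_id: int
    product_id: int
    status: str = "pending"


class ComplianceLedger:
    """Toy model of the DCL's access rules: vendors may only write
    records tied to their own VendorID; only the CSA certifies."""

    def __init__(self):
        self._records = {}

    def add_product(self, writer_vendor_id: int, record: DclRecord):
        # Device makers can only add data linked to their own VendorID.
        if writer_vendor_id != record.vendor_id:
            raise PermissionError("writes must match the caller's VendorID")
        self._records[(record.vendor_id, record.product_id)] = record

    def certify(self, vendor_id: int, product_id: int):
        # Modeled as a CSA-only action, taken after PPA confirmation.
        self._records[(vendor_id, product_id)].status = "certified"

    def is_certified(self, vendor_id: int, product_id: int) -> bool:
        rec = self._records.get((vendor_id, product_id))
        return rec is not None and rec.status == "certified"
```

The point of the sketch is the two-step lifecycle: a write by the vendor creates a pending record, and only a separate approval step makes it visible as certified to the rest of the ecosystem.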


4 tips for implementing consistent configuration and automation standards

The team regularly reviews standards with squads and SMEs to keep the operating system and middleware standards current and compliant with security and other requirements. We created a naming structure for standards to maintain version control for compliance and audit purposes. The standards form a baseline for maintaining playbooks for consistent automation across the environment. The organization also promotes InnerSource (the use of open source practices to improve internal software) and advocates reusing playbooks. We base the playbooks on common configuration and automation standards. This establishes governance for operating systems and middleware support. ... Automation is a continuous journey. Achieving touchless deployments requires standard configurations, processes, procedures, security guidelines, and other dependencies that you must review and validate periodically. These standards form the baseline that the automation team will adopt and implement. 
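As an illustration of how a versioned standard can serve as the baseline for compliance and audit, here is a minimal sketch. The standard's name, version number, and settings are invented for the example; a real implementation would live in the playbooks and tooling the excerpt describes.

```python
# Hypothetical baseline standard, named and versioned for audit purposes.
BASELINE = {
    "name": "os-hardening-standard",
    "version": "1.4.0",
    "settings": {"ssh_root_login": "no", "password_max_days": 90},
}


def audit_host(host_config: dict, baseline: dict = BASELINE) -> dict:
    """Return the settings on a host that drift from the baseline,
    mapping each drifted key to its expected and actual values."""
    drift = {}
    for key, expected in baseline["settings"].items():
        actual = host_config.get(key)
        if actual != expected:
            drift[key] = {"expected": expected, "actual": actual}
    return drift
```

A host whose reported configuration matches the baseline yields an empty drift report; anything else is flagged for remediation, which is the kind of check a periodic review or automated playbook run would perform.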



Quote for the day:

"We are drowning in information, but starved for knowledge." -- John Naisbitt

Daily Tech Digest - December 09, 2022

Why CIOs must think of themselves as products—and hostage negotiators

Despite the growing breadth of the CIO’s role, Tyler believes that technology executives can go further still, extending their value and influence within the organization by thinking of themselves less as a service provider and more as a ‘powerful, valuable product’ that senior executives, partners and peers need to do their jobs effectively. “Product value proposition in business terms is something we develop when we’re trying to create the next generation of goods and services for our customers or citizens—any stakeholder we’re working with,” he said ... “Your leadership is a product that all of your executive team, partners, peers, and all of the people in your organization need,” he said. Tyler also suggested that CIOs must build their own product value proposition to deliver the maximum value to the business, and make a promise to stakeholders of how technology will help them achieve their desired outcomes, adding that technology leaders can take simple steps to start: by understanding who consumes IT, by deeply understanding their jobs, and by learning how IT can remove pains and create gains.


Digital transformation trends in 2023

Automation adoption is often viewed as one of the best methods that companies can employ to streamline processes and boost revenues without causing costs to spiral out of control. Recognising the benefits that can be brought by automation, 54 per cent of organisations have already begun implementing robotic process automation [RPA] into their processes, according to a Deloitte survey. With the economy in such dire straits, and the landscape continuing to look so grim for businesses, it is highly likely that we will see a greater number of companies investing in this technology than ever before. Not only will implementing automation be an affordable alternative to investing in a full digital transformation project for many businesses, it will also provide a basis upon which to create new efficiencies in the years to come. Indeed, Gartner predicts that, by 2024, hyperautomation will enable organisations to lower their operational costs by 30 per cent. At a time of great economic uncertainty, when every penny that businesses spend needs to be clearly justified, automation is a proven and dependable cost saver.


5 Ways to Embrace Next-Generation AI

While AI can solve big problems, it doesn’t have to do so all at once. “In the projects I’ve seen successful from our customers or internally, it is just getting moving with the tools you have or small investments and then growing from there,” Mark Maughan, chief analytics officer at cloud business intelligence platform Domo, asserted. “Just get moving. Get started. Test, learn, grow, and iterate.” ... A significant part of gaining that buy-in is having the talent that can communicate effectively with the various stakeholders. Leveraging storytelling to illustrate the problem and how AI can solve it is a powerful tool. The earlier organizations engage relevant stakeholders, the more likely the project is to be successful. That ability to communicate may or may not come naturally, but it can be learned. “One thing that we've always found very helpful is either rotations or shadowing for anyone in the data science analytic organization around the different business and operational stakeholder groups to get a better understanding of what is actually going on,” Finnerty said.


The cybersecurity challenges and opportunities of digital twins

Unfortunately, while CISOs should be key stakeholders in digital twin projects, they are almost never the ultimate decision maker, says Alfonso Velosa, research vice president of IoT at Gartner. “Since digital twins are tools to drive business process transformation, the business or operational unit will often lead the initiative. Most digital twins are custom-built to address a specific business requirement,” he says. When an enterprise buys a new smart asset, whether a truck, backhoe, elevator, compressor, or freezer, it will often come with a digital twin, according to Velosa. “Most of the operational teams will need a streamlined and cross-IT—not just CISO—set of support to integrate them into their broader business processes and to manage security.” If proper cybersecurity controls aren’t put in place, digital twins can expand a company’s attack surface, give threat actors access to previously inaccessible control systems, and expose pre-existing vulnerabilities. When the digital twin of a system is created, the potential attack surface effectively doubles—adversaries can go after the systems themselves or attack the digital twin of that system.


RegTech Can Help Solve ESG Data Management and Trust Challenges

The benefits that RegTech offers sustainability data officers go beyond mere automation of processes; they also include a guarantee that the data will be seamlessly integrated into a financial institution’s broader data assets and can ensure that it is traceable and auditable. Katie Carrasco, head of ESG at the Global Innovation Fund, an impact investment vehicle, agreed, saying that RegTech could provide a firm with a 360-degree view of its ESG data, a matter that’s critical to good governance, which in turn is important in ensuring the accurate identification of opportunities and risks. Data traceability and auditability will also provide for credibility, argued Mary Anne Bullock, global strategic account director for Solidatus. At a time when greenwashing is making headlines and undermining the ESG project, Bullock said that being able to trace data from source to use-case would help demonstrate its veracity, offer transparency into firms’ activities and build trust. Building trust comes down to credible metrics, said Seethepalli, and that would come when auditability is added into the data management mix.


As Complexity Challenges Security, Is Time the Solution?

"Complexity leaves me in a very depressing place; complexity is just forever increasing," said Moss, who's the founder of Black Hat and regularly opens the conference by detailing leading challenges as well as potential solutions. For addressing complexity, he said, "time has got me pretty excited." Simply put, being strategic about doing things faster - including detection and recovery - gives organizations one tactic to blunt the impact of increased complexity. Not all complexity involves technological evolution, such as malware built to better evade defenses, or criminals wielding zero-day exploits. Last week, researchers at OpenAI debuted ChatGPT, a prototype conversational AI chatbot that can sometimes appear to be human. This means added complexity for security professionals, since many security tools use attackers' poor command of English to detect and block phishing attacks. Expect criminals to soon use tools such as ChatGPT to write lures that seem to have been crafted by a native speaker, said Daniel Cuthbert, a veteran cybersecurity researcher who's a member of the U.K. government's new cyber advisory board.


Meta’s behavioral ads will finally face GDPR privacy reckoning in January

If Meta is forced to ask users if they want “personalized” ads (its favored euphemism for surveillance ads), that is definitely big news — given that rates of denials when web users are actually given a choice over targeted ads are typically very high. The crux of noyb’s original complaints against Meta services was that users were not offered a choice to deny its processing for advertising — despite the GDPR stipulating that if consent is the legal basis being claimed for processing personal data, it must be specific, informed and freely given. However — plot twist! — it later emerged that as the GDPR came into application, Meta had quietly switched from claiming consent as its legal basis for this behavioral advertising processing to saying it is necessary for the performance of a contract — and claiming users of Facebook and Instagram are in a contract with Meta to receive targeted ads. This argument implies that Meta’s core service is not social networking; it’s behavioral advertising. Max Schrems, noyb’s honorary chairman and long-time privacy law thorn in Facebook’s side, has called this an exceptionally shameless attempt to bypass the GDPR.


Introduction to Interface-Driven Development (IDD)

This concept already existed in some areas, such as Protocol-oriented programming in Swift or Interface-based programming in Java, and it is based on Design by Contract by Bertrand Meyer, described in his book “Object-Oriented Software Construction”. In the book, he discusses standards for contracts between a method and a caller. Also, Hunt and Thomas rely upon a similar concept in their “The Pragmatic Programmer” book, in the section on Prototyping Architecture: “Most prototypes are constructed to model the entire system under consideration. As opposed to tracer bullets, none of the modules in the prototype system need to be particularly functional. What you are looking for is how the system hangs together as a whole, again deferring details.” The problem this process needs to solve is that components are vaguely defined during design, and we tend to give some components more responsibility than necessary. The usual result of such a design is bad, untestable code.
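A minimal sketch of the idea in Python, using structural `Protocol` typing: the interface is agreed first, callers are written and tested against it, and concrete implementations come later. The `PaymentGateway` interface and its method are invented for illustration, not taken from any library.

```python
from typing import Protocol


class PaymentGateway(Protocol):
    """Contract agreed at design time, before any implementation exists."""

    def charge(self, account_id: str, amount_cents: int) -> bool: ...


def checkout(gateway: PaymentGateway, account_id: str, total_cents: int) -> str:
    # Caller code is written (and testable) against the interface alone.
    return "paid" if gateway.charge(account_id, total_cents) else "declined"


class FakeGateway:
    """A stand-in that satisfies the protocol structurally; a real
    implementation can be swapped in later without touching checkout()."""

    def charge(self, account_id: str, amount_cents: int) -> bool:
        return amount_cents <= 10_000  # approve only small test charges
```

Because the contract pins down each component's responsibility up front, `checkout` cannot quietly absorb responsibilities that belong behind the gateway, which is exactly the over-assignment of responsibility the excerpt warns about.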


Leveraging the full potential of zero trust

In line with the motivations behind cloud migration, Zscaler found that a focus on wider strategic outcomes is missing from how organizations are planning emerging technology initiatives. Regarding the single most challenging aspect of implementing emerging technology projects, 30% cited adequate security, followed by budget requirements for further digitization (23%). However, only 19% cited dependency on strategic business decisions as a challenge. While budget concerns are natural, the focus on securing the network while ignoring strategic business alignment suggests organizations are focused on security without a full understanding of its business benefit, and that zero trust itself is not yet understood as a business enabler. “The state of zero trust transformation within organizations today is promising – implementation rates are strong,” said Nathan Howe, VP of Emerging Tech, 5G at Zscaler. “But organizations could be more ambitious. There’s an incredible opportunity for IT leaders to educate business decision-makers on zero trust as a high-value business driver, especially as they grapple with providing a new class of hybrid workplace or production environment reliant on a range of emerging technologies, such as IoT and OT, 5G and even the metaverse.”


Going from Architect to Architecting: the Evolution of a Key Role

A fundamental principle of today’s software architecture is that it's an evolutionary journey, with varying routes and many influences. That evolution means we change our thinking based on what we learn, but the architect has a key role in enabling that conversation to happen. ... The architect playing a sole role in the software development game is no longer the case. Architecting a system is now a team sport. That team is the cross functional capability that now delivers a product and is made up of anyone who adds value to the overall process of delivering software, which still includes the architect. Part of the reason behind this, as discussed earlier, is that the software development ecosystem is a polyglot of technologies, languages (not only development languages, also business and technical), experiences (development and user) and stakeholders. No one person can touch all bases. This change has surfaced the need for a mindset shift for the architect; working as part of a team has great benefits but in turn has its challenges.



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kanter

Daily Tech Digest - December 08, 2022

NASA's next-gen robot will explore space and do your chores at home

The robot will be utilized in three sectors: commerce, space and personal home use -- in that specific order, from structured to unstructured environments. "Structured means you can control the environment," Cardenas says. "Unstructured means the environment is very dynamic – and there's no more dynamic environment than the home, right?" Before Apollo can become your newest family member, the robot has to be affordable, safe and agile enough to operate in such a dynamic environment. ... "One of NASA's goals is not just to develop technology for space exploration," said Azimi. "We also want these technologies to be available for use on Earth and that the outcome of the development projects that we undertake with our partners will be available to as many people as possible, to the maximum benefit of humanity in general." One major way Apollo will be able to help humanity is by supporting the commercial sector. Apollo will mitigate supply chain issues by doing the jobs that people don't necessarily want to do but are still vital to sustaining industry and the economy.


Five Actionable Success Tips for Security Professionals in 2023

Have a personal incident response plan - We all have CIRT/SIRT teams, major incident response plans and playbooks, but how many of us consider the real personal impact if we need to deploy these plans? Everyone has a home life, and they will differ greatly, but no one can run 24/7. Some of us have caring responsibilities, and we all get stressed. ... Turn the camera on/go into the office (if you have one) - This is about making connections and re-connecting in a post-pandemic world. Lots of us may have lost our offices but it’s so important to try and keep the human connectivity within technical professions. ... Know your business - This is focused on understanding how the business you work for makes its money. Working in cybersecurity and GRC, we are keen to see risks mitigated and controls applied, but the biggest risk to a business is that it doesn’t survive, and we need to be clear that our job is to help the business grow by protecting what it cares about and being trusted advisors, not the people who say “no.”


The Hidden Cost of Software Automation

Nothing is free. Even after we automate a process, it is not free: we shift the entire cost from manual work to the cost of creating and maintaining the automation. It is the maintenance cost of automation that often gets neglected. Even assuming the automation code never needs to change or improve, there’s still a need to upgrade the tools or libraries from time to time. These are all future overheads that are not present when the automation is created. ... After a while, the people who created the automation are no longer on the team. Nobody feels it’s a problem, as it still works. The people who created it don’t even remember it well, as it was created “centuries” ago. But like any software, nothing lasts. One day, something needs to change, and the automated code is a black box to the entire team. We cannot evolve it incrementally; the only option is to recreate everything from scratch, without a reference for what was done internally in the past. The bigger the automation, the costlier it is to rebuild. The cost of lost context comes with no warning sign. It’s a time bomb: not if, but when, it will explode if we don’t attend to it.


The Future of Technology Depends on the Talent to Run it

First, we need an upskilled team, trained in the technical competencies that will allow us to create and upgrade our products or services. We also need a consistent team, AKA low turnover. Our team should benefit from the institutional knowledge that comes with longer staff tenure. This means that we need to keep our staff happy enough to stay onboard, but it also means that we need to be scaling our teams thoughtfully, not recklessly. We’ve seen how companies that grow too quickly can end up suffering from sudden layoffs. This impedes company success, both because of a shrunken staff and because workers on the job market will be less interested in working for companies that could let them go with little notice. And finally, to get to the point where we have a highly skilled team with low turnover, we need to streamline our hiring and onboarding processes.


Is Your Data Team Enabled To Deliver The Killer Punch?

Even financially, it makes more sense to embed data into the product teams. Traditionally, data teams are often treated as cost centers while product teams are treated as profit centers. When we nurture an aggressive ambition to leverage data as a differentiator and to identify possible new revenue opportunities, it’s ironic to keep data in cost centers that are highly vulnerable to cost-cutting and first in line to get hit by industry slowness. It’s akin to cutting off a limb and then taking up a driving job! This dilemma about “centralized or federated” data teams doesn’t have a cookie-cutter response; it’s a function of organizational maturity. A centralized model is a foundational step; it helps to identify, establish and refine the scope, process and guidelines and, more essentially, to harvest niche data talent. The journey commences here, but it shouldn’t end here; it should evolve. The federated model is the next step: the product teams have an embedded data component, similar to an Agile team having a functional tester. Certain non-negotiables, such as data privacy (e.g., GDPR), security, data governance and cross-product features, will require representatives from product teams to come together to establish and implement enterprise guidelines.


3 Essential Tips for Adopting DevSecOps

Build generation is the best time to include a scan that checks to see if new vulnerabilities have been added. This scan should check the entire application, not just the new code. Adding this check to the pipeline will force developers to update and patch vulnerabilities in order for the pipeline to run. ... A good observability setup is not just for monitoring application health. It can also be helpful for identifying security issues. For example, a spike in traffic to an endpoint can signal an attack. Therefore, you want to create intelligent alerts that combine information about access sources, failed access attempts, operating systems and databases. Along with these alerts, you can add some predefined actions to prevent an attack from taking down your application. For example, try to figure out your app’s average usage and block or redirect access if you get an unexpected spike. But make sure that you’re on the same page with marketing and other departments so that you can properly prepare and change your limits when a spike is detected or predicted.
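The "average usage vs. unexpected spike" heuristic described above can be sketched in a few lines. The threshold multiplier and the per-interval request counts are assumptions for illustration; a production detector would use proper time-series baselines rather than a plain average.

```python
def spike_alert(request_counts: list, threshold: float = 3.0) -> bool:
    """Flag the latest interval if it exceeds `threshold` times the
    average of the preceding intervals (a crude spike heuristic)."""
    history, latest = request_counts[:-1], request_counts[-1]
    baseline = sum(history) / len(history)  # average usage so far
    return latest > threshold * baseline
```

An alert like this would feed the predefined actions the excerpt mentions (blocking or redirecting access), with the threshold raised temporarily when, say, marketing expects a legitimate traffic surge.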


Complexity is the enemy of cloud security

Most IT shops don’t consider complexity a significant metric to track when researching cybersecurity or cloud security. It’s often neglected because most security is a siloed set of processes. The architecture teams look at security as a black box where stuff is tossed over a wall and somehow magically becomes secure. We’ve needed to integrate security with development, architecture, and operations for a long time. Some organizations practice devsecops (development, security, and operations) and integrate these concepts, bringing everyone’s expertise to bear on all problems. In an ideal world, security is never somebody else’s problem because the lines of demarcation between development, architecture, security, and operations do not exist. Everyone works together across all development, design, and deployment aspects. Security is systemic to everything, which is the correct way to view it. When security is everywhere, it also becomes a factor when defining core cloud and non-cloud architectures, including the amount of complexity introduced and how to effectively manage it.


How Can Emerging Technology Actually Drive Value for Companies?

There is a connection between advancing data management analytics practices and the ability to derive value from emerging technologies. The most successful companies understand how to turn emerging technology into action. The framework for making emerging technology actionable begins with the question: Is the technology ready for your company? “Can it do what your business needs it to do?” Hopkins asked. Next, leaders need to consider if their companies are ready for the technology. “We really think about three maturity windows in which the emerging technologies will deliver return on investment,” Hopkins said. Already, some of these technologies are being widely used in companies today. “There's cloud data computing and natural language processing. Those things are delivering benefits for mainstream average firms today,” Hopkins pointed out. Others on the list, like explainable AI, edge intelligence and intelligent agents, are two to four years out for most firms, according to Hopkins. TuringBots, Web3, and extended reality could be five or more years out.


IT leaders adjust budget priorities as economic outlook shifts

These days, IT leaders are keeping a closer eye than usual on pricing, and in some cases are buying out their long-term cloud contracts to give themselves more flexibility. “Executive leadership doesn’t want to hear we’re locked in and can’t move,” US Silica’s Piddington says. Vendors “want to true you up but never want to true you down,” he adds, and shorter-term contracts can help incent them to do so. ... Although the supply-chain shortage and other factors have caused prices to increase for two or three years now, IDC’s Minton says IT buyers have had enough. “There’s pushback now,” he says, and when there was once more tolerance for the reasons behind vendor price increases, IT leaders are now saying they just can’t keep pace and must keep budgets within a narrow range. Piddington agrees, saying that the situation is forcing IT executives to “be smarter” and understand where the opportunities are within each vendor relationship to “pull the right levers.” Having strong relationships with vendors, and not just engaging in transactional deals, can “give you more potential” to create the flexibility to work with them on pricing.


Australia to develop new cyber security strategy

It would be unreasonable to expect to see detailed policy proposals, given that the minister was announcing work to develop a strategy, not the strategy itself. But her stated goal is to make Australia “the world’s most cyber-secure country by 2030”. O’Neil listed four ways that the government plans to make that happen: bringing the nation into the fight to protect citizens and the economy; strengthening international engagements so that Australia can be a global cyber leader; strengthening critical infrastructure and government networks; and building sovereign cyber security capabilities. During questions after the address, O’Neil said: “We’re not spending enough on cyber defence at the moment. One of my challenges is how we are going to address that problem.” She noted that securing government infrastructure will be expensive. The minister appeared to be calling for bipartisan support for the development and implementation of the strategy when she said: “Many in the opposition are good, thoughtful people who know that the approach we are taking – strong, serious, depoliticised – is how we make our country safer.”



Quote for the day:

"Leadership is a way of thinking, a way of acting and, most importantly, a way of communicating." -- Simon Sinek

Daily Tech Digest - December 07, 2022

4 Ways CFOs Can Mitigate Costs of Poor Data Management

CFOs can also join forces with their IT counterparts to elevate security procedures as part of the company ethos (without detracting from employee productivity). Incentivizing employees to be mindful of data security and data management policies that could lead to financial impacts is one way to jump-start this effort. ... When approving IT expenditures, CFOs have a great opportunity to ensure the emphasis is on data management projects that reduce financial risk and prevent waste of resources. For example, it may be worth investing in the establishment of a single sign-on for company employees. A single sign-on allows access to company data to be quickly turned off upon an employee’s departure. In general, tools that speed up the response time to vulnerabilities and reduce the attack surface — and hopefully stop breaches before they happen — are worth prioritizing. Freeware for data sanitization exists, but enterprise-grade tools provide assurances, such as certificates of erasure, which equate to less risk. Also, automating the different stages of data management processes not only increases productivity but can also significantly expedite the recycling or disposal of assets, mitigating storage issues and security risks.


The Unheard Story of Lost Anonymity

Although there are data privacy regulations in the picture, it is expected that pieces of our information will fall into the wrong hands through organization acquisitions, data breaches or data theft. Have you ever asked yourself why banks you have never opened an account with flood you with calls offering loans and credit cards? Or why you receive countless spam messages from unknown numbers asking you to update your KYC? How do these people you never shared your information with know your full name and your number? It is important to understand that your number is not simply a number. It is connected to a lot of information that may be sensitive—for example, your employer information, bank balance, personally identifiable information (PII) or maybe even personal health information. This information might begin with data you provided to a bank, to an e-recharge website or to a retail/e-commerce store where you made a purchase; from that point onward, however, your consent does not make any difference. Your information could be sold to anyone, from a marketing agency to criminals looking for targets.


What is ChatGPT and why does it matter? Here's what you need to know

Despite looking very impressive, ChatGPT still has limitations. Such limitations include the inability to answer questions that are worded a specific way, requiring rewording to understand the input question. A bigger limitation is a lack of quality in the responses it delivers -- which can sometimes be plausible-sounding but make no practical sense, or can be excessively verbose. Lastly, instead of asking for clarification on ambiguous questions, the model just takes a guess at what your question means, which can lead to unintended responses. Already this has led developer question-and-answer site Stack Overflow to at least temporarily ban ChatGPT-generated responses to questions. "The primary problem is that while the answers that ChatGPT produces have a high rate of being incorrect, they typically look like they might be good and the answers are very easy to produce," say Stack Overflow moderators in a post. Critics argue that these tools are just very good at putting words into an order that makes sense from a statistical point of view, but they cannot understand the meaning or know whether the statements they make are correct.


HHS: Web Trackers in Patient Portals Violate HIPAA

The warning from the department's Office of Civil Rights comes months after revelations that medical providers have used free web user tracking code offered by Facebook and Google in websites frequented by patients. Facebook parent Meta faces a proposed class action alleging it violated privacy law by collecting patient information via its Pixel tracker, including data on doctors, conditions and appointments. At least three major healthcare organizations in recent weeks have treated their previous use of web tracking code as a reportable data breach. ... "Providers, health plans, and HIPAA-regulated entities, including technology platforms, must follow the law. This means considering the risks to patients' health information when using tracking technologies,” said HHS OCR Director Melanie Fontes Rainer in a statement. The bulletin specifies that trackers embedded into login pages such as a patient or health plan beneficiary portal or a telehealth platform are particularly susceptible to transmitting protected health information if they contain trackers.


The Case for Transparency in Data Collection

The relationship between consumers and data transparency (or in some cases, lack of transparency) is not unique to internet marketing. Parallels can be drawn between online data transparency and methods used for years by retailer loyalty programs. Long before the internet, enrolling in a loyalty program gave the issuer access to a consumer’s personal spending habits, geographic spending data, and other personal data -- and consumers rarely read the fine print in their agreements. The reality is that reading and digesting privacy policies is a huge ask, especially on the internet, which has become synonymous with instant gratification. One study found it would take more than 200 hours -- longer than a typical work month -- to read, word for word, the privacy policies of the websites we visit each year. Entertaining as that statistic is, the upshot is that most consumers have no idea what they are saying yes to when signing into apps or agreeing to a website’s terms of service. They are blissfully unaware of exactly how companies use consumer data to test marketing campaigns, improve the customer journey, or share it with third parties.


Tone From The Top: How Top-Down Inspires Bottom-Up In PrivSec

Simply put, privacy and security are everyone's responsibility in the modern workplace. Gone are the days when cybersecurity "belongs" to the IT department and privacy "lives" in the legal team. Physical security and guest privacy begin at the reception desk, social engineering defenses start in the call center, and business email compromise (BEC) prevention starts in the finance office. Privacy and security are role-based these days, and as such, you need to start right at the bottom and work your way up. ... In working your way up the chain, it generally goes well until you hit the complexities of divisions with certain powers to bypass policy, often at the executive level. When any member of an executive team regularly avoids adhering to policies, it undermines the policies that have been put in place. Often this is done under the guise of "being agile" or "responding to demand" in the quickest way possible, but it does far more damage than good. In effect, the executive has stopped playing by the company's rules. And when that happens, it very quickly rubs off on the rest of the staff and paves the way for a "normalization of deviance."


What you should know when considering cyber insurance in 2023

Security advisors and consultants say they see insurers asking more questions of those seeking insurance policies. They’re requiring proof that applicants have achieved certain levels of security hardening, such as SOC 2 compliance. They’re reviewing security strategies and policies as well as security training and awareness programs. “Insurance companies are taking a closer look at all of those,” Wilkison says. This in turn has required more involvement from enterprise security leaders in the insurance procurement process. ... CISOs may also have to make adjustments to their strategies based on insurer demands. “If you want to get your claim, you usually have to use their panel of vendors or follow their procedures,” says Michael Pisano, a managing director at global consulting firm Protiviti. For example, clients will be required to have detailed response and recovery plans in place; in the event of an incident, insurers want clients to meet specific requirements, such as which lawyers to use and what forensic work to perform, and by whom.


How to Build Privacy By Design Into Customer Experience

The demands for data privacy are growing and there is no turning back. But is it too late to make a real difference? “We need to push back on the thinking that privacy is dead,” says Baber Amin, COO of Veridium, an integrated identity management platform provider. “It is not dead. In fact, more than ever, it needs to be nurtured and thought through in light of modern technology. A good example of not giving up is [the US Supreme Court case] Carpenter v United States.” The question remaining in this discussion is: Are companies ready, willing, and able to provide data privacy protections? ... “The proper way to go forward is through transparent privacy policies that notify users about the data and information we collect,” says Apu Pavithran, CEO of Hexnode, a device management company. “Transparency is the key if you want to generate trust and build a more valuable connection with consumers. However, building trust via openness requires time and effort, but can help firms outperform their competitors in terms of sales, revenue, and marketing ROI.”


Value-creating chief data officers: Cementing a seat at the top table

Across every industry, we found more CDO appointments have been made since our last study. The heavily regulated financial services industry, where effective use of data is vital for both reporting and compliance, continues to set the bar. Just over half of banks and insurers now have a CDO in place, a number that accounts for 22% of CDOs globally. But although banks (25), capital goods firms (18), and software firms (13) saw the most CDO appointments this year, household and personal products, automotive, food and beverage, and retail organizations saw the highest year-on-year increase in the proportion of companies with a CDO. Regardless of industry, CDO growth is still being driven by the largest companies, those with multimillion-dollar revenues and the largest head counts. This is likely due to their greater organizational and technological complexity. However, CDO appointments are on the rise across businesses of all sizes. The emergence of CDO positions in midsized firms suggests the role is increasingly recognized as a useful way to help executive teams pursue business growth.


What Is Code Churn?

Code churn, also known as code rework, occurs when a developer deletes or rewrites their own code shortly after writing it. Code churn is a normal part of software development, and watching churn trends can help managers notice when a deadline is at risk, when an engineer is stuck or struggling, when an area of the code is problematic, or when issues concerning external stakeholders arise. It is common for newly written code to go through multiple changes. The volume and frequency of code changes in a given period can vary due to several factors, and churn can be good or bad depending on when and why it takes place. ... Code churn varies with many factors. For instance, when engineers work on a fairly new problem, churn will most likely be higher than the benchmark, whereas when they work on a familiar or relatively easy problem, churn will most likely be lower. Churn can also vary with the stage of a project in the development lifecycle. Hence, it is important for engineering managers and leaders to develop a sense of the patterns, or benchmarks, of churn level for different teams and individuals across the organization.
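
To make the metric concrete, here is a minimal Python sketch of one possible churn proxy: of the lines deleted in each commit, count those touching a file that received additions within a recent window. The tuple record format, the 21-day window, and the file names are illustrative assumptions (real tooling would parse something like `git log --numstat`), not a standard definition of churn.

```python
from datetime import datetime, timedelta

def churn_rate(commits, window_days=21):
    """Fraction of all deleted lines that count as churn.

    `commits` is an iterable of (timestamp, path, added, deleted) tuples,
    ordered oldest first. A deletion counts as churn when the same file
    received additions within `window_days` before the deleting commit.
    """
    window = timedelta(days=window_days)
    recent_adds = []          # (timestamp, path) of commits that added lines
    churned = total_deleted = 0
    for ts, path, added, deleted in commits:
        if deleted:
            total_deleted += deleted
            # Was this file written to recently? If so, treat it as rework.
            if any(p == path and ts - t <= window for t, p in recent_adds):
                churned += deleted
        if added:
            recent_adds.append((ts, path))
    return churned / total_deleted if total_deleted else 0.0

# Example history: 40 of the 60 deleted lines rework recently written code.
history = [
    (datetime(2022, 12, 1), "parser.py", 100, 0),   # new code lands
    (datetime(2022, 12, 5), "parser.py", 10, 40),   # reworked 4 days later
    (datetime(2022, 12, 5), "cli.py", 50, 0),
    (datetime(2023, 2, 1),  "parser.py", 0, 20),    # old code retired
]
print(round(churn_rate(history), 3))  # → 0.667
```

Tracked over time per team, a rising value of a ratio like this is the kind of signal the paragraph above describes: it can distinguish healthy early-stage exploration from a struggling engineer or an unstable area of the codebase.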



Quote for the day:

"Listening to the inner voice, trusting the inner voice, is one of the most important lessons of leadership." -- Warren Bennis