Daily Tech Digest - December 25, 2023

Technical Debt is Killing Your Business: How a PLM Strategy Helps

Many organizations implicitly tolerate technical debt as a necessary investment to adapt to changing circumstances or swiftly seize new opportunities. Successful businesses stress the importance of managing technical debt through acceptance, measurement and proactive strategies, including the adoption of open standards, abstraction and incremental changes. ... Defining and adopting an effective PLM strategy is instrumental in managing technical debt comprehensively. A 2020 McKinsey study titled “Tech Debt: Reclaiming Tech Equity” highlighted the importance of strategic alignment, stating that “A degree of technical debt is an unavoidable cost of doing business, and it needs to be managed appropriately to ensure an organization’s long-term viability.” Furthermore, the study emphasized that “the goal is not to reach zero technical debt. That would involve devoting all resources to remediation rather than building points of competitive differentiation. It would also make it difficult to expedite IT development when strategic or risk considerations require it. Rather, companies should work to size, value and control their technical debt and regularly communicate it to the business.”


Improving the case for waste from data centers

The challenge originally stems from the practical complexities of collecting and harnessing residual heat from data centers. Planning authorities actively encourage heat reclamation, but the lack of existing infrastructure poses a significant obstacle. While planning conditions that require developers to allow for connections to ‘future’ heating networks are a positive move, this becomes futile where there is no corresponding plan for heat network development. Developers comply with the condition out of an obligation to meet regulatory requirements rather than in genuine expectation of the infrastructure ever being used. From the perspective of data center operators, investing in the infrastructure only makes sense when it generates Operational Expenditure (OpEx) savings through reduced power and water consumption. However, the misalignment in load profiles complicates this. The heating network’s demand peaks in winter and falls in summer, whilst the data center operates the opposite way, as it can take advantage of ‘free cooling’ during the colder months. This misalignment in load profiles also impacts the energy service companies (ESCos).
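To make the load-profile mismatch concrete, here is a toy Python sketch. Every monthly figure is an invented assumption, not data from any real site or network; the point is only that when heat-network demand peaks in winter while a free-cooled data center’s exportable heat peaks in summer, the heat that can actually be used month by month is far smaller than the annual totals on either side suggest.

# Toy illustration of the seasonal mismatch between heat-network demand and
# exportable data center waste heat. All numbers are assumptions in arbitrary
# heat units (index 0 = January ... 11 = December).
network_demand  = [100, 95, 80, 60, 40, 25, 20, 25, 40, 60, 85, 100]  # assumed
exportable_heat = [30, 32, 40, 55, 70, 80, 85, 80, 70, 55, 40, 32]    # assumed

# Heat usable in each month is capped by whichever side is smaller.
usable = [min(d, h) for d, h in zip(network_demand, exportable_heat)]

print(f"annual network demand:   {sum(network_demand)}")
print(f"annual exportable heat:  {sum(exportable_heat)}")
print(f"annual usable heat:      {sum(usable)}")
print(f"share of exportable heat actually used: {sum(usable) / sum(exportable_heat):.0%}")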


The rise of observability and why it matters to your business

Automation is a two-edged sword. It’s one of those alluring concepts, but there’s real caution around trusting machines to judge what actions should and shouldn’t be taken and when. So given the sensitive nature of change management, we would expect this trend to continue to lean toward AI-led automation, but it will take some time before humans are mostly out of the loop. Moreover, while many vendors claim to have AI, there’s a wide spectrum of capabilities, and customers should be very cautious about vendor claims in this regard. Not surprisingly, the regulated industries of financial services, healthcare and government show a much lower tendency to be mostly AI-led in this context over the next year (well under 5% say mostly AI-led), whereas industries such as energy and high tech are much more likely to adopt AI aggressively in this space. Interestingly, the data show that senior management is more likely to push for AI adoption, whereas the practitioners, who literally have their jobs on the line – that is, machines replacing humans or getting fired for implementing rogue automation – are much less optimistic.


Innovate to elevate: Blueprint for business excellence in 2024 and beyond

The upcoming year promises an exciting development in the form of GenAI, which will be integrated into everyday applications such as search engines, office software, design tools, and communication platforms. This integration will reveal its full potential as a super-smart hyper-automation engine. Because it can take over routine tasks, including information retrieval, scheduling, compliance management, and project organization, it will let individuals boost their productivity and efficiency. According to one report, hyper-automation combined with other technologies could, by 2024, automate work activities that currently occupy 60-70% of employees’ time. This development offers immense value to sectors such as software engineering, R&D, customer operations, marketing, and sales, making it an indispensable part of the IT industry. In this rapidly evolving world, organizations are constantly searching for ways to enhance customer service and drive growth. One of the most promising ways to achieve this is by embracing hyper-automation technologies such as AI-powered tools, Natural Language Processing (NLP), chatbots, and virtual assistants.


4 ways robotics, AI will transform industry in 2024

The future of manufacturing is intricately linked to IT/OT integration, as data will underpin innovation and efficiency. Research shows that the manufacturing industry has been at the forefront of adopting cloud-based software services, and we are already seeing some customers use these to enhance quality, cost efficiency, and predictability. That makes me confident that 2024 will see the growth of data-driven logistics and manufacturing systems. Many still have an outdated view of the cloud as merely a data collector and backup function, as we know it from our private lives. But the real potential and power don’t lie in storing data or even in linking machines. The real transformative leap comes when cloud-based software services connect humans and machines and help manufacturers simplify complex processes and make smarter decisions. The benefits of this digital evolution are significant. Remote access to manufacturing data enables quick responses to issues and continuous automation improvement. With dynamic systems now essential, trusted cloud technologies offer the latest in security and state-of-the-art services.


Proper Data Management Drives Business Success

Organizations across industries are excited about generative artificial intelligence (AI) and large language models (LLMs), and for good reason. Tools like ChatGPT have the potential to transform business and revolutionize how employees do their jobs, so it’s no surprise that many people are enthusiastic about implementing them within their organizations. However, LLMs are only as good as the data on which they are trained. If an organization’s data isn’t properly sorted, tagged, and secured, the addition of LLMs will not be nearly as transformative as business leaders hope. Nearly half (45%) of IT leaders admitted that ineffective and inefficient Data Management means they can’t leverage emerging technology such as generative AI, which can put them at a competitive disadvantage. IT leaders must holistically assess the state of their data practices before implementing generative AI. Only 13% of respondents reported that Data Management initiatives are their number one priority, so it’s unsurprising that 77% of the average U.S. company’s data is redundant, obsolete, or trivial (ROT) or dark data.
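As a small, hedged illustration of one piece of that assessment, the Python sketch below scans a file share for redundant (byte-identical duplicate) and obsolete (long-untouched) files, two of the ROT categories mentioned above. The share path and the five-year staleness threshold are illustrative assumptions, not figures from the surveys cited here.

# Sketch: flag redundant (duplicate) and obsolete (stale) files under a share.
import hashlib
import time
from pathlib import Path

OBSOLETE_AFTER_SECONDS = 5 * 365 * 24 * 3600  # assumption: untouched for ~5 years

def sha256(path: Path) -> str:
    """Content hash used to detect byte-identical duplicates."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root: str) -> None:
    seen: dict[str, Path] = {}
    now = time.time()
    for p in Path(root).rglob("*"):
        if not p.is_file():
            continue
        digest = sha256(p)
        if digest in seen:
            print(f"REDUNDANT  {p}  (duplicate of {seen[digest]})")
        else:
            seen[digest] = p
        age_seconds = now - p.stat().st_mtime
        if age_seconds > OBSOLETE_AFTER_SECONDS:
            print(f"OBSOLETE   {p}  (last modified {int(age_seconds / 86400)} days ago)")

if __name__ == "__main__":
    scan("/data/shared")  # hypothetical share to assess

A report like this is only a starting point, but it gives data owners something concrete to review before the data is ever exposed to an LLM.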


Understanding the NSA’s latest guidance on managing OSS and SBOMs

In an effort to provide context and prioritization to downstream product and software consumers, the guidance recommends suppliers and developers adopt Vulnerability Exploitability eXchange (VEX) documents to help consumers and customers know which components are actually impacted by a vulnerability, which have been resolved, and what should potentially be addressed via compensating controls. The NSA also recommends suppliers and vendors adopt attestation processes to demonstrate the secure development of a product throughout the building, scanning, and packaging stages of product development and distribution. This is being driven by industry efforts such as in-toto and the SSDF, with self-attestations used when machine-readable artifacts are not generated. This helps provide assurance not just of the components of an end product but of the security of the development process as well. To address vulnerabilities, the NSA recommends using not just CVE and NVD but also other vulnerability databases such as OSV, as well as vulnerability intelligence sources such as the CISA Known Exploited Vulnerabilities (KEV) catalog and the Exploit Prediction Scoring System (EPSS).
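As a hedged sketch of how a software consumer might act on that advice, the Python below queries the public OSV API for vulnerabilities affecting a single package version taken from an SBOM and flags any result that also appears in a locally downloaded copy of the CISA KEV catalog. The example package, the local KEV file path, and the KEV field names are assumptions for illustration, not anything prescribed by the NSA guidance.

# Look up OSV advisories for one SBOM component and cross-check CISA KEV.
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

def osv_vulns(name: str, version: str, ecosystem: str = "PyPI") -> list[dict]:
    """Return OSV records affecting the given package version."""
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    req = urllib.request.Request(OSV_QUERY_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp).get("vulns", [])

def kev_cve_ids(kev_path: str = "known_exploited_vulnerabilities.json") -> set[str]:
    """CVE IDs from a locally saved copy of the KEV catalog (path assumed)."""
    with open(kev_path) as f:
        return {entry["cveID"] for entry in json.load(f).get("vulnerabilities", [])}

if __name__ == "__main__":
    kev = kev_cve_ids()
    for vuln in osv_vulns("jinja2", "2.4.1"):      # example component from an SBOM
        ids = [vuln.get("id", "")] + vuln.get("aliases", [])
        exploited = any(i in kev for i in ids)
        print(f"{vuln.get('id')}  known-exploited={exploited}  "
              f"{vuln.get('summary', '')[:60]}")

Cross-referencing OSV results with KEV (and, similarly, EPSS scores) in this way is one simple form of the prioritization the guidance encourages.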


5 Common Data Science Challenges and Effective Solutions

The upskilling and reskilling of existing data science experts aren’t limited to technical skills. Data science experts also need enhanced problem-solving and communication skills. With the massive amount of data now available come new challenges and problems that need to be addressed. The solutions to these problems need to be properly communicated to team members and management, who may or may not have the expertise to interpret data on their own. We’ll explore this in more detail later. To address the challenge of demand outstripping the pool of data scientists, you need to stand out as an employer and attract some of the professionals in that pool. So, offer competitive salaries and benefits. The average base pay for data scientists in the US is $146,422, according to Glassdoor, and if you can offer more, so much the better. Whether you hire data scientists or already have data professionals as employees, you need to invest in data science workshops and training. These can help ensure your team’s data science skills are attuned to the times and reflect current practices and standards in the data science industry.


How Observability Strengthens Your Company Culture

Observability breaks down silos and makes collaboration across different clouds, databases, and dashboards seamless. For example, an issue that the DevOps team discovers through observability might lead them to collaborate with the design team in a way they never have before. Leaders should aim to do the same for their teams by fostering greater collaboration across the entire organization. A lack of effective collaboration and communication is the top cause of workplace failures, according to 86 percent of employees and executives. Just as observability is a step up from monitoring, collaboration is the output that evolves from transparent communication. Your head of accounting probably knows precisely where each decimal point needs to be within a spreadsheet and why it needs to be there. Can they say the same about the IT team’s technology stack or the sales team’s go-to-market plan? With a culture underpinned by collaboration, employees won’t just learn how to get along. They’ll understand why each cog in your machine functions the way it does, as well as the effect of their work on their fellow employees, the end product, and the business as a whole.


The Third-Party Threat for Financial Organisations

DORA requires financial entities to have robust contracts in place with ICT service providers. Financial organisations must also maintain a register of service providers and report on this to the competent authority every year. The key here is to manage risks. This includes managing the risk of having too many critical or important functions supported by a small number of service providers. In addition, DORA requires that financial entities only contract with providers that “comply with appropriate information security standards”. Where the ICT service provider supports critical or important functions, the financial entity must ensure the standards are “the most up-to-date and highest quality”. ... Unlike the GDPR (General Data Protection Regulation), DORA does not require that these standards be identified by a specific authority, so it’s reasonable to assume that ISO 27001 – since it sets the international benchmark for information security management – would qualify as such a standard. As Alan mentioned, certifications like ISO 22301 and Europrivacy™/® add further assurance, as do due diligence checks on suppliers’ resilience, particularly for critical suppliers.



Quote for the day:

"Innovation is taking two things that already exist and putting them together in a new way." -- Tom Freston
