Showing posts with label quality management. Show all posts

Daily Tech Digest - June 12, 2025


Quote for the day:

"It takes a lot of courage to show your dreams to someone else." -- Erma Bombeck


Tech Burnout: CIOs Might Be Making It Worse

“CIOs often unintentionally worsen burnout by underestimating the human toll of constant context switching, unclear priorities, and always-on availability. In the rush to stay competitive with AI-driven initiatives, teams are pushed to deliver faster without enough buffer for testing, reflection, or recovery,” Marceles adds. In the end, it’s the panic surrounding AI adoption, and not the technology itself, that’s accelerating burnout. The panic is running hot and high, surpassing anything CIOs and IT teams think of as normal. “The pressure to adopt AI everywhere is real, and CIOs are feeling it from every angle -- executives, investors, competitors. But when that pressure gets passed down as back-to-back initiatives with no breathing room, it fractures the team. Engineers get pulled into AI pilots without proper training. IT staff are asked to maintain legacy systems while onboarding new automation tools. And all of it happens under the expectation that this is just ‘the new normal,’” says Cahyo Subroto, founder of MrScraper, a data scraping tool. ... “What gets lost is the human capacity behind the tech. We don’t talk enough about how context-switching and unclear priorities drain cognitive energy. When everything is labeled critical, people lose the ability to focus. Productivity drops. Morale sinks. And burnout sets in quietly, until key people start leaving,” Subroto says.


Asset sprawl, siloed data and CloudQuery’s search for unified cloud governance

“The biggest challenge with existing tools is that they’re siloed — one for security, one for cost, one for asset inventory — making it hard to get a unified view across domains,” CloudQuery founder Yevgeny Pats told VentureBeat. “Even simple questions like ‘What EBS volume is attached to an EC2 that is turned off?’ are hard to answer without stitching together multiple tools.” ... Taking a developer-first approach is critical, said Pats, because developers are ultimately the ones building, operating and securing today’s cloud infrastructure. Still, many cloud visibility tools were built for top-down governance, not for the people actually in the trenches. “When you put developers first, with accessible data, flexible APIs and native language like SQL, you empower them to move faster, catch issues earlier and build more securely,” he said. Customers are finding ways to use CloudQuery beyond asset inventory. ... “Having a fully serverless solution was an important requirement,” Hexagon cloud governance and FinOps expert Peter Figueiredo and CloudQuery director of engineering Herman Schaaf wrote in a blog post. “This decision brought lots of benefits since there is no need for time-consuming updates and virtually zero maintenance.”
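The appeal of treating cloud assets as SQL-queryable data can be sketched with an in-memory database. The table and column names below are illustrative only, not CloudQuery's actual schema; the point is that Pats' example question collapses into one join instead of stitching together multiple tools:

```python
import sqlite3

# Minimal sketch: a cloud asset inventory modeled as plain SQL tables.
# Table/column names are hypothetical, not CloudQuery's real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE aws_ec2_instances (instance_id TEXT, state TEXT);
CREATE TABLE aws_ebs_volumes (volume_id TEXT, attached_instance_id TEXT, size_gb INTEGER);
INSERT INTO aws_ec2_instances VALUES ('i-001', 'stopped'), ('i-002', 'running');
INSERT INTO aws_ebs_volumes VALUES ('vol-a', 'i-001', 100), ('vol-b', 'i-002', 50);
""")

# "What EBS volume is attached to an EC2 that is turned off?" as one query:
rows = conn.execute("""
    SELECT v.volume_id, v.size_gb, i.instance_id
    FROM aws_ebs_volumes AS v
    JOIN aws_ec2_instances AS i ON i.instance_id = v.attached_instance_id
    WHERE i.state = 'stopped'
""").fetchall()
print(rows)  # -> [('vol-a', 100, 'i-001')]
```

The same shape of query works across domains: replace the WHERE clause with a cost or security predicate and the unified inventory answers it without a separate tool.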


Digital twins combine with AI to help manage complex systems

And it’s not just AI making digital twins better. The digital twins can also make for better AI. “We’re using digital twins to actually generate information for large language models,” says PwC’s Likens, adding that the synthetic data is of better quality when it comes from a digital twin. “We see opportunity to have the digital twins generate the missing pieces of data we need, and it’s more in line with the environment because it’s based on actual data.” A digital twin is a working model of a system, says Gareth Smith, GM of software test automation at Keysight Technologies, an electronics company. “It’ll respond in a way that mimics the expected response of the physical system.” ... Another potential use case for digital twins that might become more relevant this year is to help with understanding and scaling agentic AI systems. Agentic AI allows companies to automate complex business processes, such as solving customer problems, creating proposals, or designing, building, and testing software. The agentic AI system can be composed of multiple data sources, tools, and AI agents, all interacting in non-deterministic ways. That can be extremely powerful, but extremely dangerous. So a digital twin can monitor the behavior of an agentic system to ensure it doesn’t go off the rails, and test and simulate how the system will react to novel situations.


Will Quantum Computing Kill Bitcoin?

If a technological advance were to render these assets insecure, the consequences could be severe. Cryptocurrencies function by ensuring that only authorized parties can modify the blockchain ledger. In Bitcoin’s case, this means that only someone with the correct private key can spend a given amount of Bitcoin. ... Quantum computers, however, operate on different principles. Thanks to phenomena like superposition and entanglement, they can perform many calculations in parallel. In 1994, mathematician Peter Shor developed a quantum algorithm capable of factoring large numbers exponentially faster than classical methods. ... Could quantum computing kill Bitcoin? In theory, yes, if Bitcoin failed to adapt and quantum computers suddenly became powerful enough to break its encryption, its value would plummet. But this scenario assumes crypto stands still while quantum computing advances, which is highly unlikely. The cryptographic community is already preparing, and the financial incentives to preserve the integrity of Bitcoin are enormous. Moreover, if quantum computers become capable of breaking current encryption methods, the consequences would extend far beyond Bitcoin. Secure communications, financial transactions, digital identities, and national security all depend on encryption. In such a world, the collapse of Bitcoin would be just one of many crises.
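The number theory behind Shor's algorithm can be demonstrated classically: factoring N reduces to finding the multiplicative order r of some a mod N, after which gcd(a^(r/2) ± 1, N) usually yields a factor. The sketch below does the order-finding by brute force, which is the exponentially slow step a quantum computer replaces; everything else is ordinary arithmetic:

```python
from math import gcd

def factor_via_order(N, a):
    """Classical demo of the reduction behind Shor's algorithm:
    find the order r of a mod N, then try gcd(a^(r/2) +/- 1, N).
    Only the order-finding loop is what a quantum computer speeds up."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky case: a already shares a factor with N
    r, x = 1, a % N
    while x != 1:                 # brute-force order finding (the slow part)
        x = (x * a) % N
        r += 1
    if r % 2:
        return None               # odd order: retry with a different a
    y = pow(a, r // 2, N)
    f = gcd(y - 1, N)
    return f if 1 < f < N else None

print(factor_via_order(15, 7))   # -> 3, since 7 has order 4 mod 15
```

For RSA-sized N the loop above is hopeless, which is exactly why the quantum speedup for order finding threatens current public-key schemes.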


Smaller organizations nearing cybersecurity breaking point

Small and medium enterprises (SMEs) that do have budget to hire specialists often struggle to attract and retain skilled professionals due to the lack of variation in the role. Burnout is also a growing issue for the understaffed, underqualified IT teams common in small businesses. “With limited resource in the business, employees are often wearing multiple hats and the pressure to manage cybersecurity on top of their regular duties can lead to fatigue, missed threats, and higher turnover,” Exelby says. ... SMEs often mistakenly believe that cyber attackers only target larger organizations, but that’s often not the case — particularly because small business partners of larger companies are often deliberately targeted as part of supply chain attacks. “Threats are becoming more advanced but their resources aren’t keeping pace,” says Kristian Torode, director and co-founder of Crystaline, a specialist in SME cybersecurity. “Many SMEs are still relying on outdated systems or don’t have dedicated security teams in place, making them an easy target.” Torode adds: “They’re also seen by cybercriminals as an exploitable link in the supply chain, since they often work with larger enterprises.” “SMEs have traditionally been low-hanging fruit — with limited resources for cybersecurity training, advanced tools, or dedicated security teams,” Adam Casey, director of cybersecurity and CISO at cloud security firm Qodea, tells CSO.


Want fewer security fires to fight? Start with threat modeling

Some CISOs begin with one critical system or pilot project. From there, they build templates, training materials, and internal champions who help scale the practice across teams. Incorporating threat modeling into an organization’s development lifecycle doesn’t have to be daunting. In fact, it shouldn’t be, according to David Kellerman, Field CTO of Cymulate. “The key is to start small and make threat modeling approachable,” Kellerman says. Rather than rolling out a heavyweight process full of complex methodologies, CISOs should look for ways to embed threat modeling into workflows that teams already use. “I advise CISOs to embed threat modeling into existing workflows, such as architecture reviews, design discussions, or sprint planning, rather than creating separate, burdensome exercises.” This lightweight, integrated approach not only reduces resistance but helps normalize secure thinking within engineering culture. “Use simple frameworks like STRIDE or basic attacker storyboarding that non-security engineers can easily grasp,” Kellerman explains. “Make it collaborative and educational, not punitive.” As teams gain familiarity and confidence, organizations can gradually evolve their threat modeling capabilities. “The goal isn’t to build a perfect threat model on day one,” Kellerman says. “It’s to establish a security mindset that grows naturally within engineering culture.”
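Kellerman's "simple frameworks that non-security engineers can easily grasp" can be as lightweight as a checklist generator dropped into a design review. The sketch below is one illustrative way to turn STRIDE into per-component prompts; the component names and question wording are hypothetical, not a formal methodology:

```python
# Minimal sketch: STRIDE as a design-review checklist.
# The prompt wording here is illustrative, not a formal methodology.
STRIDE = {
    "Spoofing": "Can someone pretend to be this component or its users?",
    "Tampering": "Can its data be modified in transit or at rest?",
    "Repudiation": "Can actions happen without leaving a trace?",
    "Information disclosure": "Can data leak to the wrong audience?",
    "Denial of service": "Can it be overwhelmed or taken offline?",
    "Elevation of privilege": "Can a user gain rights they shouldn't have?",
}

def threat_checklist(components):
    """Expand each component into one discussion prompt per STRIDE category."""
    return [
        {"component": c, "category": cat, "question": q}
        for c in components
        for cat, q in STRIDE.items()
    ]

prompts = threat_checklist(["login API", "payments queue"])
print(len(prompts))  # -> 12: 2 components x 6 categories
```

Running this at the start of an architecture review or sprint-planning session keeps the exercise collaborative and bounded, rather than a separate heavyweight process.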


Rethinking Success in Security: Why Climbing the Corporate Ladder Isn’t Always the Goal

In the security field, like in many other fields, there seems to be constant pressure to advance. For whatever reason, the choice to climb the corporate ladder seems to garner far more reverence and respect than the choice to develop expertise and skills in one particular area of specialization. In other words, the decision to go higher and broader seems to be lauded more than the decision to go deeper and more focused. Yet, both are important in their own right. There are certain times in a security professional’s career when they find themselves at a crossroads – confronted by this issue. One career path is not more “correct” than another one. Which direction is the right one is an individual choice where many factors are relevant. ... It is the sad reality of the security field that we don’t show our respect and appreciation for our colleagues enough. That being said, the respect is there. See, one important thing to keep in mind is that respect is earned – not ordained or otherwise granted. If you are a great security professional, people take notice. You shouldn’t feel compelled to attain a specific title, paygrade, or otherwise just to get some respect. The dirty secret in the industry is that just because someone is in a higher-level role, it doesn’t mean that people respect them. 


The AI data center boom: Strategies for sustainable growth and risk management

Data center developers are experiencing extended lead times for critical equipment such as generators, switchgear, power distribution units (PDUs) and cooling systems. Global shortages in semiconductors and electrical components are still impacting timelines. Additionally, uncertainty regarding tariffs is further complicating procurement and planning processes, as potential changes in trade policies could affect the cost and availability of these essential components. ... Data center owners are increasingly trying to use low-carbon materials to decarbonize both the centers and construction operations. This approach includes concrete that permanently traps carbon dioxide, and steel produced using renewable energy. Microsoft is now building its first data centers made with structural mass timber to slash the use of steel and concrete, which are among the most significant sources of carbon emissions. ... Fires in data centers are typically caused by a breakdown of machinery, plant or equipment. A fire that spreads quickly can result in significant financial losses and business interruption. While the structures for data centers often have concrete frames that are not significantly impacted by fires, it’s the high-value equipment that drives losses – from cooling technology to high-end computer servers or graphic card components.


Managing software projects is a double-edged sword

Doing two platform shifts in six months was beyond challenging—it was absurd. We couldn’t have hacked together a half-baked version for even one platform in that time. It was flat-out impossible. Let’s just say I was quite unhappy with this request. It was completely unreasonable. My team of developers was being asked to work evenings and weekends on a task that was guaranteed to fail. The subtle implication that we were being rebellious and dishonest was difficult to swallow. So I set about making my position clear. I tried to stay level-headed, but I’m sure that my irritation showed through. I fought hard to protect my team from a pointless death march—my time in the Navy had taught me that taking care of the team was my top job. My protestations were met with little sympathy. My boss, who like me came from the software development tool company, certainly knew that the request was unreasonable, but he told me that while it was a challenge, we just needed to “try.” This, of course, was the seed of my demise. I knew it was an impossible task, and that “trying” would fail. How do you ask your team to embark on a task that you know will fail miserably and that they know will fail miserably? Well, I answered that question very poorly.


The CIO Has Evolved. It's Time the Board Catches Up

Across industries, CIOs have risen to meet the moment. They are at the helm of transformation strategies with business peers and drive digital revenue models. They even partner with CFOs to measure value, CMOs to reimagine customer experience and COOs to build data-driven models. ... CIOs have evolved. But if boards continue to treat them as back-room managers instead of strategic partners, they are underutilizing one of the most strategic roles in the enterprise. ... Today, every company is a technology company. AI, automation, cloud and digital platforms aren't just enablers. They form the foundation for competitive advantage and new revenue models. Similarly, cybersecurity is no longer just an IT challenge; it's a board-level fiduciary responsibility. Boards, however, predominantly engage with CIOs in a transactional manner. Issues such as budget approvals, risk reviews and project updates are common conversations. CIOs are rarely invited into conversations related to growth strategy, market reinvention or long-term capital allocation. This disconnect is proving to be a strategic liability. ... In industries where technology is the differentiator, CIOs should not just be in the boardroom; they should be shaping its agenda. Because if CIOs are empowered to lead, organizations don't just avoid risk, they build resilience, relevance and reinvention.

Daily Tech Digest - August 23, 2024

Generative AI is sliding into the ‘trough of disillusionment’

“Even as AI continues to grab the attention, CIOs and other IT executives must also examine other emerging technologies with transformational potential for developers, security, and customer and employee experience and strategize how to exploit these technologies in line with their organizations’ ability to handle unproven technologies,” Chandrasekaran said. ... Autonomous AI software was among four emerging technologies called out in the report because it can operate with minimal human oversight, improve itself, and become effective at decision-making in complex environments. “These advanced AI systems that can perform any task a human can perform are beginning to move slowly from science fiction to reality,” Gartner said in its report. “These technologies include multiagent systems, large action models, machine customers, humanoid working robots, autonomous agents, and reinforcement learning.” Autonomous agents are currently heading up the slope to the peak of inflated expectations. Just ahead of autonomous agents on that slope is artificial general intelligence, currently a hypothetical form of AI where a machine learns and thinks like a human does.


As Fintechs Stumble, A New Breed of ‘TechFins’ Move to the Fore

TechFins have provided many points of value in recent years, but particularly in 2024 and in the near future, they will benefit financial institutions most in the areas of: Leveraging the power of transaction data cleansing and analysis; Artificial intelligence (AI); Fraud prevention and cost mitigation; Extending the personalized user experience and reliability of the digital banking application; Transforming digital banking platforms into a digital sales and service platform; Increasing revenues and lowering costs for financial institutions. With financial institutions amassing high volumes of transaction data within their ecosystems, processing and analyzing that data is becoming a greater priority. According to the Pragmatic Institute, data practitioners spend 80% of their valuable time finding, cleaning, and organizing the data. This leaves only 20% of their time to actually perform analysis on it. This is the 80/20 rule, also known as the Pareto principle. TechFins can provide vital support to financial institutions’ data teams through transaction cleansing, leaving them more time to build campaigns and take action on the data.


The Developer Crisis: Mental Health, Burnout, and Retention

Seven out of 10 developers state that job satisfaction is the most important factor. Unplanned extra tasks and excessive overtime will have developers looking for the door. Businesses need to make it clear to both existing and new hires that they will do everything they can to respect these boundaries. Developers encounter constant roadblocks in their work, so time is precious. To help devs maintain a “flow state” (total focus on the task), businesses should consider re-evaluating their calendars to reduce unnecessary meetings. Where not already in place, software development frameworks could help dev teams better organize their work and progress through projects faster. As with any operational change, feedback is critical. ... By freeing developers from burdensome backend duties, they can stay creative and focus on developing innovative new frontend solutions to improve a customer’s overall experience. This makes brilliant business sense, particularly in the case of e-commerce, where standard feature developments, which would otherwise take up tons of developer resources, can be handled much more efficiently by a tech platform.


Vulnerability prioritization is only the beginning

Scrutiny of cybersecurity processes and performance is ratcheting up due to the dual hammers of increased regulatory scrutiny and the brutal trend of highly damaging attacks. The US Securities and Exchange Commission, the European Union, the US Department of Defense, the UK government, and the US Cybersecurity and Infrastructure Security Agency have all put or are putting in place significantly more stringent requirements for CISOs and their teams. Both the SEC and CISA have moved to push accountability to the Board of Directors and the C-Suite. This means that metrics alone are no longer sufficient for CISOs who want to provide full transparency. Process transparency has become just as critical to validate KPIs and allow auditors and the government to peer inside what were formerly security process “bottlenecks”. These bottlenecks are highly variable, human-centric processes, such as opening or closing a Jira ticket, back and forth commenting in a Slack thread, pushing a pull request on GitHub, or running a CI/CD pipeline to test and redeploy software after a patch. All can have human path dependencies, injecting uncertainty and variability.


Authentication and Authorization in Red Hat OpenShift and Microservices Architectures

Moving up the layers and looking at the blue layer (that is, interacting with OpenShift or Kubernetes in general) means communicating with the Kubernetes API server. This is true for both human and non-human users, whether they're using a GUI console or a terminal. Ultimately, all interaction with OpenShift or Kubernetes goes through the API server. The OAuth2/OIDC combination makes perfect sense for API authentication and authorization, so OpenShift features a built-in OAuth2 server. As part of the configuration of this OAuth2 server, a supported identity provider must be added. The identity provider helps the OAuth2 server confirm who the user is. Once this part has been configured, OpenShift is ready to authenticate users. For an authenticated user, OpenShift creates an access token and returns that token to the user. This token is called an OAuth access token. ... Users and Service Accounts can be organized into groups in OpenShift. Groups are useful when managing authorization policies to grant permissions to multiple users at once. For example, you can allow a group access to objects within a project instead of granting access to each user individually.
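The point that every interaction funnels through the API server with an OAuth access token can be sketched at the HTTP level. The host and token below are placeholders; the request is built but deliberately not sent, and the Groups endpoint path follows OpenShift's user.openshift.io API group:

```python
import urllib.request

# Minimal sketch: every client, human or service account, talks to the
# Kubernetes API server with an OAuth access token as a Bearer header.
# API_SERVER and TOKEN are placeholder values, not a real cluster.
API_SERVER = "https://api.cluster.example.com:6443"
TOKEN = "sha256~example-oauth-access-token"

req = urllib.request.Request(
    f"{API_SERVER}/apis/user.openshift.io/v1/groups",
    headers={"Authorization": f"Bearer {TOKEN}",
             "Accept": "application/json"},
)
# urllib.request.urlopen(req) would list Groups, subject to the RBAC
# permissions bound to whoever this token authenticates as.
print(req.get_header("Authorization"))
```

Whether the call comes from the web console, `oc`, or a CI job, it takes this same shape: the API server authenticates the token, then authorizes the request against the user's or group's role bindings.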


Bridging the digital divide: driving positive impact where it is needed most

There are definitely pros and cons to building rural fiber networks. On the one hand, by nature the construction process is more complex and expensive, but this typically means that there is little to no competition, leading to higher customer demand and lower overbuild risk. The challenges are even more acute in areas that qualify for Project Gigabit subsidies, with barriers including challenging terrain, geography, and geology, which often increases costs and extends timelines. Due to the distances being covered, rural rollouts often also require more permits and wayleaves from multiple landowners, further increasing complexity. Without subsidies, these projects would not be commercially viable, but with cost cover of between 60 percent to 80 percent of capex, a defensive position is created for contract winners, which increases returns for investors, while also supporting some of the most neglected rural communities. In these cases, network commercialization is also likely to be more achievable and we are starting to see a growing evidence base of strong customer cohort penetration in these projects which supports that thesis.


How IT Leaders Can Benefit From Active Listening

Active listening is crucial for effective leadership, says Justice Erolin, CTO at BairesDev, a technology services company. "It strengthens team dynamics, drives innovation, and ensures that all voices are heard," he observes in an email interview. When IT leaders speak, particularly with business stakeholders, they often err by assuming everyone understands the taxonomy and language being used, Chowning observes. "This is frequently the case with technology-related terminology that we understand well, but which business stakeholders might define or understand much differently," she adds. "If we start from unequal or disconnected positions, then we tend to hear something other than what the speaker intended." IT leaders can improve collaboration by understanding team members' perspectives and enhancing problem-solving with deeper insights, Erolin says. It can also help build trust by making team members feel heard. "Ultimately, leaders will be able to make better decisions through diverse viewpoints." Erolin notes that BairesDev incorporates active listening skills into its leadership training program, recognizing the tool's importance in fostering a culture of trust and collaboration.


Embracing Data and Emerging Technologies for Quality Management Excellence

Traditionally, quality management has been seen through a compliance lens – a necessary business cost to meet regulations. To unleash QM’s power as a catalyst for ongoing business growth and customer satisfaction, a fundamental mindset shift toward a more comprehensive, proactive approach is crucial. In the past, quality reporting and data tracking were reactive, addressing issues after they occurred. This fuels a fix-it-later cycle instead of prevention. The needed cultural change is from reactive to proactive QM. Forward-thinking firms use AI and predictive analytics to foresee problems before they arise, emphasizing prevention and continuous improvement. However, some companies remain regulation-focused due to deep-rooted challenges. Breaking this mold requires realigning toward customer-centricity, building robust systems that prioritize satisfaction and ongoing enhancement, with regulatory compliance as a natural outcome. It’s key to see quality and regulatory goals as aligned with each other and drivers of commercial growth, not in conflict with each other and inhibitors of it.


The reality of AI-centric coding

“Although AI is able to solve many college problem sets and handle small-to-medium snippets of code generation, it still struggles with complex logic, large code bases, and especially novel problems without precedent in the training data. Hallucinations and errors remain significant issues that require expert engineering oversight and correction,” Nag said. “These tools are far better at quick prototypes from scratch rather than iterating large applications, which is the bulk of engineering. Much of the context that drives large applications doesn’t actually exist in the code base at all.” Tom Taulli, who has authored multiple AI programming books, including this year’s AI-Assisted Programming: Better Planning, Coding, Testing, and Deployment, agreed that the move to greater GenAI coding efforts will catch most enterprises off guard. "These tools will mean a change in traditional workflows, approaches, and mindset. Consider that they are pretrained, so they are often not updated for the latest frameworks and libraries. Another issue is the context window. Code bases can be massive. But even the most sophisticated LLMs cannot handle the huge amount of code files in the prompts,” Taulli said.


Is AI Making Banking Safer or Just More Complicated?

The rise of AI in fraud detection has been a game changer. Through real-time analysis, machine learning and pattern recognition, AI tools can flag unusual transactions and often catch fraud before it occurs. AI's capabilities in anomaly detection allow financial institutions to be proactive, staying ahead of cybercriminals. But AI has its flaws. One of the most significant issues is the high rate of false positives. John MacInnes, a retired professor from Edinburgh, encountered this new reality firsthand. He tried to send 15,000 euros to a friend in Austria, expecting it to be a quick and routine transaction. The process became an ordeal involving the fraud team at Starling Bank. When MacInnes declined to provide personal messages and tax documents to prove the legitimacy of the payment, the bank took drastic action - it froze his account. It wasn't until the media wrote about his plight that the bank admitted it went too far and unfroze the account. This incident sheds light on a growing challenge for banks: While caution is understandable, overly aggressive fraud prevention can alienate the very customers they aim to protect.
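The false-positive trade-off at the heart of the MacInnes story can be shown with the simplest possible anomaly detector. This is a toy z-score flagger, not anything a bank would deploy; real systems use many features and learned models, but the tension is the same: a tighter threshold catches more fraud and also flags more legitimate outliers like a one-off 15,000-euro transfer:

```python
from statistics import mean, stdev

# Toy sketch: flag transactions whose amount deviates from a customer's
# history by more than `threshold` standard deviations. Illustrative only.
def flag_anomalies(history, new_amounts, threshold=3.0):
    mu, sigma = mean(history), stdev(history)
    return [amt for amt in new_amounts
            if abs(amt - mu) / sigma > threshold]

history = [120, 95, 130, 110, 105, 98, 125, 115]  # typical past transactions
print(flag_anomalies(history, [110, 15000]))      # -> [15000]
```

Lowering `threshold` widens the net: the model catches subtler fraud but starts flagging legitimate unusual payments, which is exactly the friction banks must balance against customer trust.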



Quote for the day:

"Difficulties strengthen the mind, as labor does the body." -- Seneca