Daily Tech Digest - January 09, 2024

10 ways to destroy developer happiness

Who doesn’t get annoyed by endless meetings? Developers are busy people, and most would rather spend their time coding than talking about it. Meetings that are not focused and efficient are a frequent source of disenchantment. “Meetings that drag on without contributing to progress can be very draining,” says Vlad Gukasov, software development engineer at Amazon. “These often take up valuable time that could be better spent on actual development work.” ... Unnecessary red tape can be incredibly frustrating to developers. “Navigating through layers of bureaucracy can be quite stifling,” Gukasov says. “The complexity of internal procedures can sometimes hinder the smooth progress of software development.” Developers like efficiency, says Remi Desmarais, director of engineering and software development at software company Tempo Software. “They frequently encounter delays, from waiting for clarification on requirements, to code processes like compilation, building, and testing, to seeking approval from code reviewers, all of which can hinder their progress,” he says.


Data Center Cooling: Embracing Liquid Cooling for the Era of Sustainable and Efficient Operations

While the adoption of liquid cooling undeniably offers an enticing remedy for the thermal challenges posed by high-density racks, its integration into data center management introduces a new set of considerations and complexities. The deployment of liquid cooling systems necessitates a bespoke infrastructure, comprising specialized components such as pumps, heat exchangers, and filtration systems. These elements work in concert to ensure the seamless circulation and efficient heat dissipation of the liquid coolant throughout the intricate network of electronic components. Beyond the physical requirements, the use of liquid coolants imposes a critical need for stringent safety protocols and specialized training for personnel entrusted with the operation and maintenance of these systems. The introduction of liquid into the data center ecosystem marks a shift that extends beyond hardware considerations, demanding a holistic approach to facility management and personnel training to guarantee the safe and effective functioning of these advanced cooling solutions.


Meet the industrial metaverse: How Sony and Siemens seek to unleash the power of immersive engineering

According to Siemens CEO Roland Busch, "This will empower customers to accelerate innovation, enhance sustainability, and adopt new technologies faster and at scale, leading to a profound transformation of entire industries and our everyday lives. Together with our customers and partners, Siemens is proud to announce new products that will bring the industrial metaverse a step closer to all of us." ... Another interesting phrase Siemens has been using is immersive engineering. The idea is that engineering, design, and content creation can be done inside a 3D environment using a toolkit called the Siemens Xcelerator portfolio. Siemens offers a product called NX Immersive Designer. This tool is designed to "seamlessly connect the real and digital worlds." Essentially, users can create a digital twin, a version of a real-world system modeled and simulated in the virtual world. Digital twins replicate physical entities with accurate virtual models, helping engineers simulate performance testing and make predictions about points of failure. The virtual nature of the twin allows many more variations and usages compared to the cost and risk of building a single physical prototype. 


Leadership opportunities must align with an individual's natural strengths: Gallup’s Rohit Kar

In addressing the correlation between effective leadership and employee engagement, our approach to leadership development strategy emphasises the enduring elements that remain constant amid technological advancements and societal shifts. We prioritise understanding the unchanging fundamental aspects of human nature, acknowledging that regardless of external influences, certain core needs persist. At Gallup, when we delve into engagement, we recognise that employees bring their emotions into the workplace. To measure these emotions productively, we simplify the process, ensuring it remains straightforward and outcome-oriented. Our focus lies in enabling managers to engage in meaningful conversations by providing adaptable frameworks and tools. We ensure scalability, speed, and agility in our delivery methods to cater to changing consumption patterns without altering the essence of what we deliver – because the core principles of management don't require alteration. Our unique approach lies in retaining fundamental principles while adapting to the evolving landscape. 


New York Times’ blockbuster suit could decide the fate of genAI

Microsoft and OpenAI say their use of copyrighted material is transformative. They contend the output of the chatbots transforms the original content into something different. The Times suit claims there’s no real transformation, that what Microsoft and OpenAI are doing is outright theft. It claims the companies are not just stealing Times content, but their audience as well, and making billions of dollars from it. People will have no need to read the Times either online or in print, if they can get all the newspaper’s information for free from a chatbot instead, the suit alleges. “There is nothing ‘transformative’ about using The Times’s content without payment to create products that substitute for The Times and steal audiences away from it. Because the outputs of Defendants’ GenAI models compete with and closely mimic the inputs used to train them, copying Times works for that purpose is not fair use.” ... So, who’s right? This is not a difficult call. The answer is simple. The Times is right. Microsoft and OpenAI are wrong. Microsoft and OpenAI are getting a free ride on copyrighted material that takes a tremendous amount of time and money to create, and using that material to reap big profits.


After Orange Disruption, Brace for More BGP Route Hijacking

Orange's attacker appeared to have obtained and used a valid password for the telco's administrator account with RIPE, for which two-factor authentication wasn't enabled. Security experts report that the source of the password appears to have been information-stealing malware called Raccoon. After gaining access to the account, the attacker used RIPE's hosted RPKI resource certification service to broadcast a valid, cryptographically signed route origin authorization to direct traffic to an autonomous system number not controlled by Orange, resulting in the traffic never reaching its intended destination. ... The telecommunications giant subsequently told Information Security Media Group in a statement: "The Orange account in the IP network coordination center (RIPE) has suffered an improper access that has affected the browsing of some of our customers." The company said it had immediately responded and resolved the problem later Wednesday and that "appropriate measures have been taken to prevent such an incident from happening again."
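The RPKI mechanism the attacker abused can be illustrated with a simplified route-origin validation check, a sketch of RFC 6811 semantics rather than Orange's or RIPE's actual tooling. A ROA binds an IP prefix (with a maximum length) to an authorized origin ASN, and routers compare each BGP announcement against the published ROAs, which is why control of the RIPE account was enough to make bogus announcements look legitimate:

```python
# Simplified route-origin validation (RFC 6811 semantics, illustrative only).
from ipaddress import ip_network

def validate_origin(announced_prefix: str, origin_asn: int, roas: list) -> str:
    """Return 'valid', 'invalid', or 'not-found' for a BGP announcement.

    Each ROA is a (prefix, max_length, authorized_asn) tuple.
    """
    prefix = ip_network(announced_prefix)
    covered = False
    for roa_prefix, max_len, roa_asn in roas:
        if prefix.subnet_of(ip_network(roa_prefix)):
            covered = True  # some ROA covers this prefix
            if prefix.prefixlen <= max_len and origin_asn == roa_asn:
                return "valid"
    return "invalid" if covered else "not-found"

# Legitimate ROA: 192.0.2.0/24 may only originate from AS64500.
roas = [("192.0.2.0/24", 24, 64500)]
print(validate_origin("192.0.2.0/24", 64500, roas))  # valid
print(validate_origin("192.0.2.0/24", 64501, roas))  # invalid: wrong origin ASN
```

An attacker who can publish ROAs through a compromised registry account simply adds a tuple naming their own ASN, flipping the verdict for their hijacked route from "invalid" to "valid".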


Data Governance & Controls: An Increasingly Critical Foundation for AML Compliance Programs

Ensuring effective AML compliance necessitates a steadfast commitment to data accuracy and integrity, achievable only through robust governance. Financial institutions can rely on the consistency and correctness of their AML compliance solutions only by establishing stringent controls that ensure both. The significance of data accuracy and integrity can easily be seen in relation to false positive and false negative alerts in transaction monitoring. False positives, stemming from inaccurate data triggering unnecessary alerts, can lead to a significant waste of resources and needless increases in program costs. Conversely, false negatives, arising from incomplete or unreliable data, may cause suspicious activity to go unreported, which could lead to potential regulatory action. Financial institutions that build strong data accuracy and integrity standards into their processes minimize these risks, thereby enhancing the overall efficiency and efficacy of their AML compliance program.
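The false-positive dynamic can be made concrete with a toy threshold rule. This is an illustrative sketch, not a real transaction-monitoring engine; the thresholds and risk tiers are invented. The point is that a conservative default for missing data converts every inaccurate record into an alert:

```python
# Toy AML monitoring rule: flag transactions above a risk-adjusted threshold.
# A missing or invalid customer risk rating conservatively defaults to "high",
# so every data-quality gap inflates the false-positive queue.
def flag_transaction(amount, risk_rating=None, threshold=10_000):
    if risk_rating not in ("low", "medium", "high"):
        risk_rating = "high"  # missing/bad data -> treated as highest risk
    limit = {"low": threshold, "medium": threshold * 0.5,
             "high": threshold * 0.1}[risk_rating]
    return amount >= limit

# Clean data: a $2,000 payment from a low-risk customer is not flagged.
print(flag_transaction(2_000, "low"))   # False
# Same payment with a missing rating IS flagged: a data-quality false positive.
print(flag_transaction(2_000, None))    # True
```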


Designing an IT department for a world defined by change

For years, inside the technology department, we’ve referred to “the business”, meaning everything other than IT, and for its part, “the business” has spent years trying to isolate IT as a mere cost centre. But nearly a quarter of the way through the 21st century, the need to integrate digital tools into every part of business means this splendid isolation no longer really works. Even in the most performant IT departments, centralised decision-making around tooling and technology choices falls behind the galloping pace of the market, and it's difficult to imagine how it could ever be otherwise in a department of finite resources. ... Still, refusing to allow the curation of information technology outside of that provided by the technology function becomes a self-fulfilling prophecy - the only people expected to understand the challenges of technology are those hired into the IT department. The only way to break that vicious cycle is to discard the model of the central IT department. Once you start questioning the existence of the IT department, it’s only a short journey to rejecting the concept of functional structures entirely.


Building a Better Analytics Team

Once you have the right group of people on your team, your challenge is to build an environment where they can work effectively. Ensure that they are clear about the organization’s mission, vision, values, and objectives. Create an environment where the connection between your team’s objectives, projects, and tasks and the targets and objectives of the organization are clear. Team members need to understand how the data architecture being built will have a direct impact on the decisions the organization is making and how their activities further the goal of becoming a more data-driven organization. Team building is not a one-time activity. It is built into everyday activities, including goal setting, performance reviews, project planning and execution, team meetings, and one-on-one meetings between you and your direct reports. Clarity of purpose increases each time it is reviewed and as the team’s objectives and goals are refined and realigned. As a leader, one of your main objectives is to remove obstacles that arise.


The Rise of Dual Ransomware Attacks

Two ransomware attacks mean that companies could be facing more data encryption, more data exfiltration, more data leaks, and multiple demands for payment. If an enterprise is dealing with a single attacker behind two attacks, they could be facing the challenge of different ransomware strains. How did the two strains impact an organization’s systems, and what will it take to remediate and return to normal operations? How much data was taken? Ransomware groups can take that exfiltrated data to leak sites to up the pressure on victims to pay. If two different attackers are in play, recovery can be even more complicated. Two different attackers may encrypt the same files. “We've had a couple of cases where the first ransomware incident occurs. They encrypted a bunch of files. In the midst of dealing with that, a second attack…those attackers encrypt the encrypted files of the other attackers,” Minder shares. “If that happens, where you get encrypted files encrypted again, the likelihood of corruption goes up a thousand percent, and you may not get your files back at all.”



Quote for the day:

“Our greatest fear should not be of failure but of succeeding at things in life that don't really matter.” -- Francis Chan

Daily Tech Digest - January 08, 2024

Can Generative AI and Data Quality Coexist?

Not only is it possible for generative AI and data quality to coexist, it’s imperative that they do, says Marinela Profi, AI strategy advisor for analytics software developer SAS in an email interview. Data for AI is like food for humans, she notes. “Based on the quality of the food you feed your body and your brain, you will receive a certain quality of outputs, such as higher performance or more focus.” Simply put, if you've neglected the quality of your enterprise data, or haven't defined a proper data strategy, you won't get value out of generative AI, Profi says. “On the flip side those who have implemented a strong data management discipline are uniquely positioned to gain a competitive advantage with generative AI.” ... New training techniques that require less data or that can learn more effectively from existing datasets might reduce the pressure on data quantity but increase the need for highly representative and unbiased data samples, Profi says. “Self-supervised and unsupervised learning techniques, in which models generate their own labels or learn from unlabeled data, reduce reliance on manually labeled datasets,” she notes.


10 top priorities for CIOs in 2024

We are in a cybersecurity pandemic right now, warns Juan Orlandini, CTO for North America at solutions and systems integrator Insight Enterprises. He encourages CIOs to focus on cyber preparedness to ensure they’ve done everything possible to prevent assaults. “Attackers are getting more sophisticated, and all it takes is one mistake for them to get in,” Orlandini notes. “Assume that attacks are inevitable.” Work toward having the right cybersecurity team in place, Orlandini advises. “This could be an in-house team or trusted advisors who can make sure you’ve done what you can to protect yourself.” ... Caldas believes it’s important to renew and maintain existing workforce competencies as well as to establish a high-performance culture that’s ready to deliver results in today’s fast-paced technology ecosystem. “We’re shifting away from building development plans based on job profiles alone and are now pivoting to build plans on top of a foundation of skills,” she states. “Skill-building will inform how we provide training as well as how team members can grow their careers.”


Insights From Quantum Computing Could Create Light-Controlled Memory Tech

The discovery is tightly linked to the realm of quantum technologies, and combines principles from two scientific communities that so far had little overlap: “We arrived at this understanding by using principles that are well established within the quantum computing and quantum optics communities but less so in the spintronics and magnetism communities.” The interaction between a magnetic material and radiation is well established when the two are in perfect equilibrium. However, the situation where there is both radiation and a magnetic material that are not in equilibrium has so far been described only partially. This non-equilibrium regime is at the core of quantum optics and quantum computing technologies. From our examination of this non-equilibrium regime in magnetic materials, while borrowing principles from quantum physics, we have underpinned the fundamental understanding that magnets can respond even at the short time scales of light. Moreover, the interaction turns out to be very significant and efficient.


IoT Data Governance: Taming the Deluge in Connected Environments

IoT environments typically comprise diverse devices and systems, often operating on different standards and protocols. Data Governance must address the challenge of integrating this disparate data to ensure seamless interoperability. This requires establishing common data models and communication protocols to use. For example, integrating data from wearable devices, electronic health records, and diagnostic equipment in a healthcare IoT setup is crucial for comprehensive patient monitoring and care. IoT Data Governance must also include policies for data storage and lifecycle management. This involves determining how long data should be stored and when it should be archived or deleted, as well as ensuring that storage solutions are scalable and cost-effective. Likewise, it’s important to understand that proper Data Governance doesn’t relate only to touchpoints between the device, the network, and the cloud. Instead, proper security protocols must be applied throughout the organization. From integrated document editors and AI assistants to plugins and VPNs, everything must be airtight.
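A storage and lifecycle policy of the kind described can be sketched as a simple lookup. The data classes and retention periods below are hypothetical placeholders, not drawn from the article; the point is that governance reduces to explicit, auditable rules about how long each class of IoT data stays hot, when it is archived, and when it is deleted:

```python
# Hypothetical IoT data-lifecycle policy: class -> (hot days, archive days).
from datetime import date

POLICY = {
    "wearable_telemetry": (30, 335),   # ~1 year total retention
    "diagnostic_imaging": (90, 3560),  # ~10 years total retention
    "device_logs": (7, 83),            # ~90 days total retention
}

def lifecycle_action(data_class: str, created: date, today: date) -> str:
    """Decide whether a record stays in hot storage, is archived, or deleted."""
    hot_days, archive_days = POLICY[data_class]
    age = (today - created).days
    if age <= hot_days:
        return "keep-hot"
    if age <= hot_days + archive_days:
        return "archive"
    return "delete"

print(lifecycle_action("device_logs", date(2024, 1, 1), date(2024, 1, 5)))  # keep-hot
print(lifecycle_action("device_logs", date(2023, 1, 1), date(2024, 1, 5)))  # delete
```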


5 Reasons Not to Use Serverless Computing

Serverless computing is cost-efficient in the sense that you only pay for the time your workloads are active. However, the per-minute cost of serverless is almost always higher than the cost of running an equivalent workload on a VM. For this reason, serverless may result in greater total costs than other types of cloud services, especially for workloads that are active most of the time. ... Each serverless computing platform works in a different way. That makes it challenging to migrate workloads from one serverless environment (like AWS Lambda) to another (such as Azure Functions). By comparison, the differences between other types of cloud services (such as AWS EC2 and Azure Virtual Machines) are less pronounced, leading to lower levels of lock-in when you use those services. ... Although serverless workloads theoretically run on-demand, in practice there is typically a delay between when serverless code is triggered and when it actually runs. This is especially true in the case of "cold starts," which happen when serverless code hasn't run recently. Sometimes, the delays in startup time can be a second or longer.
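The cost trade-off is easy to sketch numerically. The rates below are illustrative placeholders, not actual AWS or Azure pricing; the shape of the comparison is what matters: serverless wins at low utilization, while a flat-rate VM wins once the workload is active most of the month:

```python
# Back-of-the-envelope cost comparison with made-up rates.
def monthly_cost_serverless(active_hours, per_hour_rate=0.12):
    return active_hours * per_hour_rate   # pay only while the workload runs

def monthly_cost_vm(flat_monthly_rate=30.0):
    return flat_monthly_rate              # pay whether busy or idle

for active_hours in (50, 250, 500, 720):  # 720 h/month = always on
    s, v = monthly_cost_serverless(active_hours), monthly_cost_vm()
    cheaper = "serverless" if s < v else "VM"
    print(f"{active_hours:>3} active hours/month: "
          f"serverless ${s:6.2f} vs VM ${v:.2f} -> {cheaper}")
```

With these placeholder rates the break-even point is 250 active hours per month (about 35% utilization); beyond that, the higher per-hour serverless rate outweighs the savings from idle time.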


Industry 4.0 at Scale Can Make Smart Manufacturing Pervasive

We're barely scratching the surface to realize the potential of generative AI. Consider a worker's typical task: locating and utilizing manuals to address issues at a workstation. This process usually takes hours, sifting through pages and troubleshooting cryptic error codes. Now, picture digitizing these manuals, integrating them into an Amazon Bedrock system, and overlaying an Anthropic Claude interface. With this setup, a worker standing before a machine encountering an error can simply input the error code and receive a clear explanation of the problem. This saves an immense amount of time, directly translating to cost savings and increased uptime. But the real potential lies in integrating this knowledge into workflows seamlessly. Imagine moving from receiving an error message to being provided with repair instructions while simultaneously updating maintenance records and checking inventory for required spare parts - all automatically. This transformative capability eliminates the need for manual intervention, which currently involves hours of inventory checks, finding personnel for repairs, and managing associated logistics.


Embracing the future: Top 5 home automation trends for 2024

As we strive towards a sustainable future, the inclusion of energy-saving devices has become a priority. Energy-efficient smart home devices include unique features that make energy conservation easier than ever. Lighting systems, for example, independently dim or switch off when rooms are unused, saving electricity and extending bulb life. Scheduling lights and other devices to switch off at a preset time every day likewise ensures no energy is wasted. By incorporating energy-efficient smart home gadgets into daily life, people can not only reduce their environmental impact but also benefit from lower electricity costs. ... Artificial intelligence (AI) is stepping in to make the lives of people even more convenient and efficient. Virtual assistants powered by AI, such as Amazon Alexa and Google Assistant, have already become commonplace in many homes. These smart companions can answer inquiries and provide weather updates, as well as operate smart gadgets and much more. In the coming years, AI will become more important in home automation. Your home can learn your habits and adjust to create the perfect atmosphere.
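The occupancy-plus-schedule behavior described above amounts to a simple rule: follow room occupancy during the day, with a hard daily cutoff. This is a toy sketch with illustrative times, not any particular vendor's automation logic:

```python
# Toy smart-lighting rule: occupancy-driven, with a preset nightly off time.
from datetime import time

def light_should_be_on(room_occupied, now, daily_off=time(23, 0)):
    if now >= daily_off:        # hard cutoff every evening, regardless of occupancy
        return False
    return room_occupied        # otherwise, follow the occupancy sensor

print(light_should_be_on(True, time(21, 0)))   # True: occupied, before cutoff
print(light_should_be_on(True, time(23, 30)))  # False: scheduled off
print(light_should_be_on(False, time(21, 0)))  # False: room unused
```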


AI Will Create Demand and Empower Developers, Not Replace Them

AI takes care of the annoying, tedious, routine tasks that may otherwise take up a significant amount of developers’ time, so they’re able to better concentrate on the real work at hand. In fact, 92% of developers are already using AI to lighten their load. ... AI can do this in seconds. All a developer needs to do is tell the technology what they want to accomplish and what language they want to use, and they can get some understanding of the best approach to their problem. Learning: Although there’s a need to check for accuracy, AI can help developers understand code snippets and programming concepts without having to do the research themselves. Documentation: Nobody likes documentation. It’s tedious and difficult. However, AI documentation can help bring attention to things that didn’t work during the development process while reducing development times in the aftermath. Code quick-starts: This gives a major leg up to developers who have an idea but don’t know exactly where to begin. AI can generate code within seconds, regardless of the language. Even if the parameters of the project need a review, it gives you a head start.


Future challenges and innovations in cloud security platforms

The role of identity and access management systems in cloud security has evolved from simple gatekeeping to intelligent filtering. These systems now serve as sophisticated sentinels equipped to discern legitimate users from intruders. They’re no longer static barriers but dynamic shields, adapting to new threats and user behaviors. The advancement in these systems means they can now offer personalised access based on user roles and context, adding a layer of security that’s both smart and user-centric. Essentially, they act as the first line of defence, ensuring that only the right people can access the right data at the right time. ... The adage “it takes a village” holds particular relevance in cloud security, emphasising the importance of collective effort and cooperation in this field. Achieving unbreakable security is a collaborative mission, relying not on individual effort but on collective strength and teamwork. This collaboration transcends organizational boundaries, bringing together experts, companies, and competitors to forge a united front against cyber threats.


Social engineer reveals effective tricks for real-world intrusions

My main social engineering trick is just walking into a location like you belong there. People underestimate how far confidence will get you into a location and how unsuspecting people are when they feel secure. I’ve always said the only thing worse than no security is the false sense of security because it is tough to imagine something terrible will happen when you have that false sense of security. One of the main tricks that I do when I am doing a phishing attack is not to tell them that something positive has happened. I always have the topic of the e-mail to be unfortunate, something that may be a mistake, something that has happened that is important and, if not fixed immediately, could have dire consequences. People are very suspicious when they get an e-mail that something good has happened or will happen to them. Still, throughout history, humans have always craved information almost at any cost when they felt like a threatening situation was occurring around them. They need to discover what is happening and how it could affect them.



Quote for the day:

"Ambition is the path to success. Persistence is the vehicle you arrive in." -- Bill Bradley

Daily Tech Digest - January 07, 2024

2024 cybersecurity forecast: Regulation, consolidation and mothballing SIEMs

CISOs’ jobs are getting harder. Many are grappling with an onslaught of security threats, and now the legal and regulatory stakes are higher. The new SEC cybersecurity disclosure requirements have many CISOs concerned they’ll be left with the liability when an attack occurs. ... After the Cyber Resilience Act, policymakers and developers drive adoption of security-by-design. The CRA wisely avoided breaking the open source software ecosystem, but now the hard work starts: helping manufacturers adopt modern software development practices that will enable them to ship secure products and comply with the CRA, and driving public investment in open source software security to efficiently raise all boats. ... With the increase in digital business-as-usual, cybersecurity practitioners are already feeling lost in a deluge of inaccurate information from mushrooming multiple cybersecurity solutions coupled with a lack of cybersecurity architecture and design practices, resulting in porous cyber defenses.


Expert Insight: Adam Seamons on Zero-Trust Architecture

Zero trust goes beyond restricting access by need to know and the principle of least privilege. It’s about properly verifying access and being 110% certain that the access is legitimate. That means things like limiting access to specific criteria, such as by port or protocol, time period, IP address and/or physical location. ... A zero-trust network is about verification or double-checking. You want to be verifying not just the person, but also the device and limiting that access to specific permissions and rights that have been approved in advance. And you’re also restricting data access, particularly in situations like the example I just gave. Think of it like the difference between a key to the front door that gives you access to the whole house, and needing a key for the front door as well as separate keys for all the different rooms. ... AI and machine learning have both been used in detecting anomalies and suspicious patterns for some time, and will only continue to be used more. I expect SOCs to become increasingly reliant on AI. Getting more specific, log analysis is a key area for AI to automate. 
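The verification criteria listed above (port, protocol, time period, IP address, plus a trusted device) amount to a default-deny policy check: a request is allowed only if every attribute matches a pre-approved rule. The rule values below are hypothetical, and real zero-trust enforcement lives in network and identity infrastructure rather than application code, but a minimal sketch of the logic looks like this:

```python
# Default-deny access check: every attribute of the request must match an
# approved rule, in the spirit of "keys for every room, not just the front door".
from datetime import time
from ipaddress import ip_address, ip_network

RULES = [{
    "user": "alice", "device_trusted": True,
    "protocol": "https", "port": 443,
    "source_net": ip_network("10.20.0.0/16"),   # corporate range only
    "window": (time(8, 0), time(18, 0)),        # business hours only
}]

def allow(user, device_trusted, protocol, port, source_ip, now):
    for r in RULES:
        if (user == r["user"] and device_trusted == r["device_trusted"]
                and protocol == r["protocol"] and port == r["port"]
                and ip_address(source_ip) in r["source_net"]
                and r["window"][0] <= now <= r["window"][1]):
            return True
    return False  # default-deny: anything not explicitly verified is blocked

print(allow("alice", True, "https", 443, "10.20.5.9", time(9, 30)))   # True
print(allow("alice", False, "https", 443, "10.20.5.9", time(9, 30)))  # False: untrusted device
```

Failing any single check, an untrusted device, an off-hours request, an unexpected source network, denies access, which is the double-checking behavior the interview describes.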


6 innovative and effective approaches to upskilling

Beverage maker Torani has been mixing up L&D by flipping the traditional performance review — which can be “demoralizing” — on its head. It puts the onus on future rather than past performance and on employee learning aspirations, rather than manager assessment. ... Devine adds: “With today’s shift to agile working, some firms believe yearly performance objectives and appraisals are insufficient and inflexible. They need something more frequent, nimble, and focused on feedback, skills and future needs. But you still need managers to assess performance to justify and provide transparency on promotions and pay decisions.” ... Microsoft is supporting workers across its organization gain skills related to AI — from non-techies to IT professionals and leaders. Simon Lambert, chief learning officer at Microsoft UK, says: “One lesson we’ve learned from our AI learning journey is that upskilling means far more than merely equipping employees with skills. It requires an ecosystem that fosters adaptability and continuous learning. In the face of AI-upskilling demand, employees need faster, seamless access to learning infrastructure.


2024 Data Center Un-Predictions: Five Unlikely Industry Forecasts

The potential impact of data centers on local communities is an important issue. At a recent conference in Virginia, we had activists from the community right alongside data center leaders to discuss the challenges and opportunities we face. While there were still some disconnects, we met in the middle on some critical topics around power, community engagement, and ensuring we create a more sustainable future. ... Leveraging self-driving technology, robots independently chart and traverse the data center, gathering real-time sensor data. This lets them immediately juxtapose present patterns against pre-defined norms, facilitating swift identification of deviations for human examination. In an ever more interconnected and intricate environment, this robotic technology grants decision-makers enhanced visibility, rapidity, and a breadth of intelligence that surpasses what humans or stationary cameras can provide. This advanced capability is vital for maintaining the efficiency and security of data centers in our increasingly digital world.


US DOD’s CMMC 2.0 rules lift burdens on MSPs, manufacturers

The proposed rules also let manufacturers off the hook for complying with NIST SP 800-171. SP 800-171 is a set of NIST cybersecurity rules to protect sensitive federal information. “The requirements of the 800-171 set of cyber standards are designed for IT networks and information systems,” Metzger says. “They were never really designed for a manufacturing environment. It’s now said clearly in the proposed rules that the assessments won’t apply to operational technology." "That, to me, should cause manufacturers to breathe a huge sigh of relief because being required to meet NIST standards that simply don’t fit a manufacturing or OT environment is a recipe for trouble of many forms," Metzger says. “The most important change is what did not change. The document has essentially the same structure and strategy that was in 1.0. It requires third-party assessments for a very large number of defense suppliers.” The proposed version 2.0 of the CMMC rules was published in the Federal Register December 26. Interested parties have until February 26 to file comments with the DOD before the agency finalizes the rules.


Banking Innovation is Paramount Even as Regulatory and Competitive Pressures Mount

Guiding technology-forward regulations can empower banks to harness innovation, enhancing security, transparency, and customer value. Regulators should seek thoughtful oversight that encourages innovation while safeguarding against excessive risks instead of attempting to prevent the recurrence of a once-in-a-century financial crisis. Banks face a growing challenge to their market share from alternative lending platforms, which poses an existential threat, as noted in McKinsey’s 2023 Global Banking Annual Review. Over 70% of the growth in global financial assets since 2015 has shifted away from traditional bank lending, finding its way into private markets, institutional investors and the realm of “shadow banking.” Near-zero interest rates have enabled private equity firms and non-bank lenders to offer lower-cost loans. With its digitally savvy consumer base, the fintech sector has further accelerated this transition, particularly during the pandemic.


IT and OT cybersecurity: A holistic approach

As OT becomes more interconnected, the need to safeguard OT systems against cyber threats is paramount. Many cyber threats and vulnerabilities specifically target OT systems, which emphasizes the potential impact on industrial operations. Many OT systems still use legacy technologies and protocols that may have inherent vulnerabilities, as they were not designed with modern cybersecurity standards in mind. They may also use older or insecure communication protocols that may not encrypt data, making them susceptible to eavesdropping and tampering. Concerns about system stability often lead OT environments to avoid frequent updates and patches. This can leave systems exposed to known vulnerabilities. OT systems are not immune to social engineering attacks either. Insufficient training and awareness among OT personnel can lead to unintentional security breaches, such as clicking on malicious links or falling victim to social engineering attacks. Supply chain risks also pose a threat, as third-party suppliers and vendors may introduce vulnerabilities into OT systems if their products or services are not adequately secured.


Exploring the Future of Information Governance: Key Predictions for 2024

In today’s rapidly evolving digital landscape, information governance has become a collective responsibility. Looking ahead to 2024, we can anticipate a significant shift towards closer collaboration between the legal, compliance, risk management, and IT departments. This collaborative effort aims to ensure comprehensive data management and robust protection practices across the entire organization. By adopting a holistic approach and providing cross-functional training, companies can empower their workforce to navigate the complexities of information governance with confidence, enabling them to make informed decisions and mitigate potential risks effectively. Embracing this collaborative mindset will be crucial for organizations to adapt and thrive in an increasingly data-driven world. ... Blockchain technology, with its decentralized and immutable nature, has the tremendous potential to revolutionize information governance across industries. By 2024, as businesses continue to recognize the benefits, we can expect a significant increase in the adoption of blockchain for secure and transparent transaction ledgers.


Data Professional Introspective: Demystifying Data Culture

We are discussing data culture from several points of view: what new content should be added to the DCAM, where it would fall within the current framework structure, what changes we propose to that structure, what modifications should be made to existing content, how the new/modified content would be assessed, and so on. ... One can begin decomposing data culture from a high-level vision, which summarizes what the organization has accomplished when it can feel confident in asserting that, “We have a strong data culture.” One can also compile a collection of activities and behaviors that demonstrate a developed data culture, and then categorize them and parse them into the DCAM. Or, one can apply a combination of the two approaches, which is the path the working group has followed. The working definition posited to date includes a summary description of a strong data culture: “A strong data culture promotes data-driven decision-making, data transparency, and the alignment of data and analytics to business objectives. It prioritizes strategic data use and encourages sharing and collaboration around data.”


Tackling technical debt in the insurance industry

The impact of technical debt on insurers spans various dimensions. Data inefficiencies arise, leading to compliance issues and difficulties in recruiting and retaining talent. Outdated processes hinder optimal decision-making, impacting both established and newer insurers. Addressing technical debt requires insurers to foster a culture of change, emphasising the risks of neglecting this issue and aligning strategies with broader organisational objectives. Tackling technical debt involves immediate action, prioritised backlog creation, and adaptive development processes. Insurers are advised to navigate technical debt through a combination of incremental and transformational changes. Incremental adjustments and breakthrough advancements should complement comprehensive restructuring efforts for sustained and effective resolution. The roadmap to a resilient, innovative future in insurance hinges on proactive management of technical debt. Insurers must embark on their journey towards pricing transformation to remain competitive and future-ready.



Quote for the day:

"In the end, it is important to remember that we cannot become what we need to be by remaining what we are." -- Max De Pree

Daily Tech Digest - January 06, 2024

FTC offers $25,000 prize for detecting AI-enabled voice cloning

Through the Voice Cloning Challenge, the FTC aims to find a solution that can identify cases of voice cloning carried out with the help of generative AI. The agency calls it “an exploratory challenge” that could potentially provide a direction for the risk mitigation effort. The winning proposal will receive $25,000 and the runner-up will get $4,000. There are up to three honorable mentions, each awarded with $2,000. On January 2nd, the agency started accepting submissions via an online portal and will receive ideas for 10 days, until January 12 at 8:00 PM EST. Submissions must include a one-page overview of the proposal and a detailed description of up to 10 pages. Participants may also include a video to show how their idea works. All submissions will be judged based on their practical feasibility, impact on corporate accountability and burden on the consumer, and resilience to rapid technological advancements in the field. Should the challenge fail to yield any effective defense ideas, the FTC notes that the effort will serve as an early warning for policymakers and will highlight the need for more stringent regulations on the use of AI technology.


Building a Great Security Operations Center

Without a defined SOC strategy, security leaders may struggle to prioritize resources. A strategy provides direction based on various inputs such as the threat landscape, regulatory requirements and threat assessments specific to the organization. In the context of an SOC, the primary objective of the SOC strategy should be to avoid a situation where the cost and effort are high and the value and return on investment (ROI) are low. The aim of the SOC strategy is to ensure that the SOC effectively fulfils its function and, in doing so, helps the organization to fulfil its overall business objectives. A well-architected SOC provides a positive ROI by minimizing potential financial losses due to cyberincidents. At the same time, an SOC enhances an organization’s ability to detect and respond to cyberthreats in real time, safeguarding sensitive data and protecting the organization’s reputation. Therefore, compliance, ROI and risk reduction are interconnected. Although it is easy to get carried away with generic cybersecurity use cases, the development of business-aligned use cases is what separates average SOCs from great SOCs.


Is the vCISO Model Right for Your Organization?

It's getting harder to justify not having a CISO, so many businesses that have never had one are filling the gap with a virtual CISO (vCISO). A vCISO, sometimes referred to as a fractional CISO or CISO-as-a-service, is typically a part-time, outsourced security expert who helps businesses protect their infrastructure, data, personnel and customers. Depending on the needs of the company, vCISOs can work on-site or remotely, for the long term or short term. There are plenty of reasons why companies are going the vCISO route. Sometimes it's an internal crisis where a company's CISO has unexpectedly resigned and the board needs time to find a permanent new one. Other times it revolves around new regulatory or business requirements or a cybersecurity framework the company needs to adhere to, like NIST's Cybersecurity Framework 2.0. Sometimes a board member used to being briefed by the CISO may request a vCISO. "A smaller company might need a CISO but just a few days a week, and that type of delivery model is perfect for a vCISO," says Russell Eubanks, a vCISO who is also on the faculty of IANS Research and an instructor with SANS Institute.


Generative AI and Data Management: Transforming B2B Practices

Generative AI’s future in data management and analytics is shaped by promising trends poised to redefine data analysis methodologies. These trends encompass enhanced augmentation, deeper understanding and explanation, and the democratization of data analysis, presenting a transformative shift in how organizations harness data for insights and decision-making. Generative AI is poised to transcend traditional data visualization, evolving to augment the entire data analysis workflow. This evolution encompasses automated data exploration, hypothesis generation, data storytelling, and predictive analytics. AI’s capability to suggest patterns, relationships, and anomalies and generate comprehensive reports promises to revolutionize data-driven decision-making. The future of Generative AI goes beyond reporting events, delving into causality and explanations. The upcoming trends include causal inference, counterfactual analysis, and the integration of Explainable AI (XAI). These advancements ensure a profound understanding of the underlying causes behind observed trends and transparent insights for users.


4 Strategies for Migrating Monolithic Apps to Microservices

For many organizations, taking a lift-and-shift approach is the first step for migrating monolithic applications to Kubernetes and microservices. This involves directly lifting the monolith onto hardware hosted in the cloud, and then gradually breaking down the app into microservices. However, the lift-and-shift philosophy has its challenges, as organizations must refactor monoliths to optimize them for the cloud. Therefore, it’s often more cost-effective to refactor an application service by service into a containerized architecture. ... Dependencies within monolithic apps are deeply intertwined. These close relationships among components are one of the driving forces behind the move to Kubernetes and microservices, as they hinder flexible changes and deployment. When migrating an application to a microservices architecture, it’s important for teams to understand all dependencies among services and to reduce and streamline them as much as possible. Asynchronous messaging is key, allowing services to communicate by sending and receiving messages using queues. 
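The queue-based pattern described above can be sketched in a few lines (this is an in-process illustration using Python's standard-library `queue`; the service names are hypothetical, and a real deployment would use a broker such as a message queue service rather than threads in one process):

```python
import queue
import threading

# A message queue decouples two services: the producer never calls the
# consumer directly, so either side can be deployed and scaled independently.
orders = queue.Queue()
shipped = []

def order_service():
    # Publish events and move on -- no blocking call into another service.
    for order_id in (101, 102, 103):
        orders.put({"order_id": order_id, "event": "order_placed"})
    orders.put(None)  # sentinel: no more messages

def shipping_service():
    # Consume messages at its own pace, independent of the producer.
    while True:
        msg = orders.get()
        if msg is None:
            break
        shipped.append(msg["order_id"])

producer = threading.Thread(target=order_service)
consumer = threading.Thread(target=shipping_service)
producer.start()
consumer.start()
producer.join()
consumer.join()

print(shipped)
```

The key property is that the order service completes its work whether or not the shipping service is currently available, which is exactly the loose coupling that makes services independently deployable.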


Network Tokenization and Digital Identities Are Quietly Transforming Payment Security

Digital identities, through biometric data and multi-factor authentication, fortify the security of transactions. This not only protects users from identity theft but also strengthens the overall trustworthiness of digital payment systems. “We never really thought about, what does it mean to identify a person on the internet in a way that is portable and doesn’t require you to rely on a single private platform,” Mike Brock, CEO of TBD, a business from Block focused on open-source decentralized technologies, told PYMNTS. Digital identities play a crucial role in meeting regulatory requirements. By providing a secure and traceable means of verifying user identities, businesses can navigate compliance challenges more efficiently, reducing the complexities associated with anti-money laundering (AML) and know your customer (KYC) processes. “Combating Online Fraud With Digital Identification,” a PYMNTS Intelligence and Prove collaboration, finds that security is highly important for 83% of consumers, while 53% say consistent experiences across different platforms have a very or extremely big impact on their trust in financial institutions.


AI governance outlook: A Global South perspective

An under-regulated path for AI and emerging technologies may bring diverse negative outcomes, including a rise in inequality, loss of privacy, and ethical transgressions. By drawing on the history of the industrial revolutions, which brought drastic changes to people's social and economic lives, and by prioritising moral concerns, the G20 and GPAI member states can reduce the negative outcomes that would otherwise arise without the right steering and regulation. Despite the G20's significant influence and GPAI’s members’ technical expertise, many member states face issues with the digital divide, especially the unequal distribution of advanced technologies and their benefits. The divide deepens as AI development, mainly in developed markets, widens the gap between these countries and their developing counterparts in AI research and development (R&D). According to the 2023 AI Index Report, private investments in AI from 2013-22 in the United States (US) (US$250 billion) outpaced those of other economies including India, Japan, the United Kingdom (UK) and most of the other G20 nations.


At What Point Is Digital Transformation A Success?

“Digital transformation” sounds like an expensive, laborious slog. The good news is that most companies are likely closer to succeeding at it than they think. Getting in shape and digital transformation have a lot in common: planning, persistence and patience—with a lot of pragmatism—are the keys to achieving your goals. ... When you are in a new fitness regimen, have you “failed” because you’ve only lost 10 pounds of your 20-pound goal? Of course not. You celebrate your progress, and you keep working at it. In a digital transformation, each company’s goals and starting points are unique to their particular circumstances. As a result, based on the clients I work with daily, there are many ways to measure progress. ... In building a great company or social sector enterprise, there is no single defining action, no grand program, no one killer innovation, no solitary lucky break, no miracle moment. Rather, the process resembles relentlessly pushing a giant, heavy flywheel, turn upon turn, building momentum until a point of breakthrough, and beyond.


How to prepare for increased oversight of cybersecurity

DORA, the NIST Cybersecurity Framework 2.0 and the new SEC rules can help speed up this process. However, companies can also develop best practices to better implement board oversight of cybersecurity risk. First, covered entities must start planning now for the structural and cultural changes these rules and regulations will require—they will take time to implement. When done right, a risk management program will educate and empower company leaders to understand and confidently accept, mitigate or transfer risk. Second, to promote this strong governance at the C-suite and board level, companies must educate their leadership on how to take a front seat around cyber strategy and governance. Rather than an insulated organizational function, cyber risk management should be informed by a company’s business strategies, compliance landscape, and risk culture. Finally, it will be critical for organizations to understand specific roles and responsibilities and to maintain regular lines of communication. In addition to the board and other company leaders, security, communications, and legal teams should be involved in ongoing conversations around achieving a whole-of-business cyber governance strategy.


Optimizing PCI compliance in financial institutions

In practice, IT architectural patterns give architects the building blocks to design any IT solution. The architect chooses and orders the patterns available in the portfolio to meet the end goal. Having segmentation between infrastructure providing data processing and data storage is an example of a broad IT security architectural pattern. If the solution’s goal involves processing and storing data, the architect is constrained to place the pieces that will fulfill those tasks in the proper segments. Furthermore, if the operating system pattern is Linux Oracle Enterprise, the architect would use that pattern first in its design unless technical constraints made the consumption of this pattern suboptimal to accomplish the solution’s goal. All other needs, for example, authentication, encryption, log management, system configuration, would be treated the same—by using the architectural patterns available. The notion of pattern exists beyond IT in areas that a PCI security assessment touches, such as employee pre-employment practices, awareness security training, risk assessment methodology, or third-party service provider management.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - January 05, 2024

The dark side of AI: Scientists say there’s a 5% chance of AI causing humans to go extinct

Despite concerns about AI behaving in ways misaligned with human values, some argue that current technology cannot cause the catastrophic consequences predicted by skeptics. Nir Eisikovits, a philosophy professor, contends that AI systems cannot make complex decisions and do not have autonomous access to critical infrastructure. While the fear of AI wiping out humanity grabs attention, an editorial in Nature contends that the more immediate societal concerns lie in biased decision-making, job displacement, and the misuse of facial recognition technology by authoritarian regimes. The editorial calls for a focus on actual risks and actions to address them rather than fearmongering narratives. The prospect of AI with human-level intelligence raises the theoretical possibility of AI systems creating other AI, leading to uncontrollable “superintelligence.” Authors Otto Barten and Joep Meindertsma argue that the competitive nature of AI labs incentivizes tech companies to create products rapidly, possibly neglecting ethical considerations and taking risks.


10 Skills Enterprise Architects Need In 2024

While an abundance of legacy technology is a cause for concern, each application needs to be appraised on a case-by-case basis. It's possible that an older application could actually be a better functional fit for your organization. More likely, however, is that removing a legacy application could be more trouble than it's worth. When you have clarity on how each application fits into your IT landscape, it could become apparent that removing an application would cause more problems than it would solve. Just as enterprise architects need to become experts at surgically removing outdated applications, they also need to know when the time is right to remove an application and how to manage legacy technology until that point. That's the true value of enterprise architecture. ... As generative artificial intelligence (AI) and other new technologies continue to take the weight of work out of daily tasks, the value a human can add is more about communication, negotiation, and diplomacy. Getting stakeholders on board with enterprise architecture involves charm and understanding.


The European Data Act: New Rules for a New Age

Being a key element of the EU’s data strategy, the Data Act intends to lead to new, innovative services and more competitive prices for aftermarket services. According to the European Commission, the Data Act will make more data available for reuse, and it is expected to create 270 billion euros of additional gross domestic product by 2028. Complementing the Data Governance Act, which sets out the processes and structures to facilitate data sharing by companies across the EU and between sectors, the Data Act clarifies who can create value from industrial data and under which conditions. The Data Act also aims to put users and providers of data processing services on more equal footing in terms of access to data. ... The Data Act includes specific measures to allow users to gain access to the data their connected products generate (including the relevant metadata necessary to interpret such data) and to share such data with third parties to provide aftermarket or other data-driven innovative services. The Data Act further sets out that such data should be accessible in an easy, secure, comprehensive and structured manner, and it should be free of charge and provided in a commonly used machine-readable format.


Unlocking the Potential of Gen AI in Cyber Risk Management

Security automation powered by AI plays a pivotal role in streamlining various security functions, alleviating the workload for CSOs and CIOs and facilitating regulatory compliance. Security automation significantly simplifies routine security tasks, allowing human resources to pivot toward more intricate risk analysis and strategic decision-making. One of the notable contributions of AI lies in its assistance in meticulous code inspection and vulnerability assessment. For instance, tools such as Amazon Inspector for Lambda code and Amazon Detective provide indispensable support. Amazon Inspector aids in the comprehensive examination of code, identifying potential vulnerabilities or security loopholes within the Lambda functions, which are integral parts of many cloud applications. This early identification ensures preemptive measures are taken to fortify these vulnerabilities before deployment. Additionally, Amazon Detective helps security analysts by correlating and organizing vast amounts of data to identify patterns or anomalies that might signify a security issue. By leveraging machine learning and AI-driven insights, it streamlines the process of identifying and addressing such issues effectively.


Honeywell’s Journey to Autonomous Operations

We’ve integrated AI into our technical-support operations, enabling customers to receive answers to their technical questions within minutes or seconds, as opposed to the day or two it previously took. Today, the addition of generative AI has amplified the capabilities of industrial AI, making it even more powerful than ever before. For example, we’re currently looking at millions of instances of alarms being triggered in the plants of our industrial customers, to evaluate the potential use of such historical datasets to train a robust language model that would assist plant operators in identifying and addressing alarm issues promptly and providing guidance on necessary actions. ... With the convergence of IoT and AI software, the journey to autonomous operations is accelerating rapidly in the industrial world. However, automated decision-making requires both domain knowledge and the technical capabilities to build such a system. In vetting potential partners, look for one with the experience, data, and domain expertise to help you make the transition at scale.


Data and AI Predictions for 2024: Shifting Sands in Enterprise Data and AI Technologies

As organizations continue their shift to cloud-based data and analytics infrastructure, a more prudent fiscal outlook will be the theme for 2024. The cloud migration megatrend will not reverse, but organizations will scrutinize their cloud spend more than ever due to the challenging macroeconomic environment. In the cloud analytics arena, Databricks and Snowflake will continue their dominance with their well-established platforms. In particular, Databricks’ first-mover advantage for facilitating a lakehouse architecture will allow it to capture more market share. This paradigm combines the flexibility of data lakes with the management features of data warehouses, offering the best of both worlds to enterprises. On the other hand, Google BigQuery is expected to retain its stronghold within Google Cloud Platform (GCP) deployments, bolstered by deep integration with other GCP services and a strong existing customer base. However, the economic headwinds will compel enterprises to consider the total cost of ownership more closely. As a result, the traditional data warehouse architecture will see a decline in favor of the more cost-effective lakehouse design pattern. 


“You can’t prevent the unpreventable” - Rubrik CEO

A significant hurdle in the fight against cyber threats as a whole lies in legislation and prosecution. The most capable cyber criminal enterprises are often state-sponsored groups harbored within nations that share their sympathies. While it is possible to seize their cyber assets and disrupt their operations, it is near impossible to prosecute a criminal who is working on behalf of a hostile government. Sinha states that not enough is being done at both the business and governmental levels to create frameworks for information sharing. This means that when one business faces a successful attack, it can be studied to understand the methods of intrusion, how the data was encrypted or extracted, and what could have been done at each stage of the attack to minimize the damage. Not only does this allow businesses to improve their data security and recovery strategies, but it also provides attack playbooks that can be used to identify the groups responsible and their cyber infrastructure. However, there is an air of hesitation among many businesses, as many would prefer to pay a ransom rather than reveal that their organization was successfully breached, which could cause reputational and economic losses.


Gen AI: A Shield for Improved Cyber Resilience

Before implementing GenAI as a proper defense tool, teams and leaders need to understand the strengths and weaknesses of GenAI. Proper research and education on this topic will ensure sound security procedures, matching the appropriate tool to the corresponding task. An easy way to understand the benefits of a certain AI tool is by surveying its AI model card (sometimes known as a “system card”), which ultimately provides users with knowledge about its capabilities and advantages, what it has and has not been tested for, and its flaws and vulnerabilities. Vetting AI models is a vital step, and model provenance should be the first step of any and all defense strategies. Biden’s latest executive order on AI reinforces the importance of vetting AI models, requiring all AI models to be red-teamed to suss out potential weaknesses. Model provenance provides all documented history such as the AI model origin, the architecture and parameters it possesses, dependencies it may bear, the data used to train it, and other corresponding details.


Apache ERP Zero-Day Underscores Dangers of Incomplete Patches

The incident highlights attackers' strategy of scrutinizing any patches released for high-value vulnerabilities — efforts which often result in finding ways around software fixes, says Douglas McKee, executive director of threat research at SonicWall. "Once someone's done the hard work of saying, 'Oh, a vulnerability exists here,' now a whole bunch of researchers or threat actors can look at that one narrow spot, and you've kind of opened yourself up to a lot more scrutiny," he says. "You've drawn attention to that area of code, and if your patch isn't rock solid or something was missed, it's more likely to be found because you've extra eyes on it." ... The reasons that companies fail to fully patch an issue are numerous, from not understanding the root cause of the problem to dealing with huge backlogs of software vulnerabilities to prioritizing an immediate patch over a comprehensive fix, says Jared Semrau, a senior manager with Google Mandiant's vulnerability and exploitation group. "There is no simple, single answer as to why this happens," he says. 


Unlocking the Secrets of Data Privacy: Navigating the World of Data Anonymization, Part 1

Implementing data anonymization techniques presents many technical challenges that demand meticulous deliberation and expertise. One paramount obstacle lies in the intricacies of determining the optimal level of anonymization. A profound comprehension of the data's structure and the potential for re-identification is imperative when employing techniques such as k-anonymity, l-diversity, or differential privacy. Furthermore, scalability poses another formidable hurdle. With the continuous growth of data volumes, effectively applying anonymization techniques without unduly compromising performance becomes increasingly difficult. Numerous difficulties also emerge during implementation because of the diversity of data types, from structured data in databases to unstructured data in documents and images. Additionally, the challenge of keeping pace with ever-evolving data formats and sources necessitates constant updates and adaptations of anonymization strategies.
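To make the k-anonymity criterion concrete: a dataset is k-anonymous when every combination of quasi-identifiers is shared by at least k records. A minimal sketch (toy records and hypothetical field names, not a production privacy check):

```python
from collections import Counter

# Toy records; 'zip' and 'age_band' are the quasi-identifiers an attacker
# might link against external data, while 'diagnosis' is the sensitive value.
records = [
    {"zip": "021**", "age_band": "30-39", "diagnosis": "A"},
    {"zip": "021**", "age_band": "30-39", "diagnosis": "B"},
    {"zip": "021**", "age_band": "40-49", "diagnosis": "A"},
    {"zip": "021**", "age_band": "40-49", "diagnosis": "C"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

# Each (zip, age_band) combination appears twice, so the table is 2-anonymous.
print(k_anonymity(records, ["zip", "age_band"]))
```

Judging whether k=2 is "enough" is exactly the hard part the passage describes: it depends on the re-identification risk in context, and k-anonymity alone does not protect the sensitive column (which is what l-diversity and differential privacy address).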



Quote for the day:

"You may be disappointed if you fail, but you are doomed if you don't try." -- Beverly Sills