Daily Tech Digest - January 07, 2024

2024 cybersecurity forecast: Regulation, consolidation and mothballing SIEMs

CISOs’ jobs are getting harder. Many are grappling with an onslaught of security threats, and now the legal and regulatory stakes are higher. The new SEC cybersecurity disclosure requirements have many CISOs concerned they’ll be left with the liability when an attack occurs. ... After the Cyber Resilience Act, policymakers and developers drive adoption of security-by-design. The CRA wisely avoided breaking the open source software ecosystem, but now the hard work starts: helping manufacturers adopt modern software development practices that will enable them to ship secure products and comply with the CRA, and driving public investment in open source software security to efficiently lift all boats. ... With the increase in digital business-as-usual, cybersecurity practitioners are already feeling lost in a deluge of inaccurate information from a mushrooming number of cybersecurity solutions, coupled with a lack of cybersecurity architecture and design practices, resulting in porous cyber defenses.


Expert Insight: Adam Seamons on Zero-Trust Architecture

Zero trust goes beyond restricting access by need to know and the principle of least privilege. It’s about properly verifying access and being 110% certain that the access is legitimate. That means things like limiting access to specific criteria, such as by port or protocol, time period, IP address and/or physical location. ... A zero-trust network is about verification or double-checking. You want to be verifying not just the person, but also the device and limiting that access to specific permissions and rights that have been approved in advance. And you’re also restricting data access, particularly in situations like the example I just gave. Think of it like the difference between a key to the front door that gives you access to the whole house, and needing a key for the front door as well as separate keys for all the different rooms. ... AI and machine learning have both been used in detecting anomalies and suspicious patterns for some time, and will only continue to be used more. I expect SOCs to become increasingly reliant on AI. Getting more specific, log analysis is a key area for AI to automate. 
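The multi-factor verification Seamons describes can be illustrated with a minimal sketch. This is a hypothetical, deny-by-default policy check (the `AccessPolicy` class, field names, and the simplified /24 network matching are all illustrative assumptions, not any particular product's API): access is granted only when the user, the device, the source network, the port, and the time window all check out, mirroring the "separate keys for different rooms" idea.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical policy record: every field must match before access is granted.
@dataclass
class AccessPolicy:
    allowed_ports: set[int]
    allowed_networks: set[str]   # simplified: /24 prefixes as strings
    start: time                  # permitted time window
    end: time
    approved_devices: set[str]   # pre-registered device IDs

def is_access_allowed(policy: AccessPolicy, user_verified: bool,
                      device_id: str, src_ip: str, port: int,
                      now: time) -> bool:
    """Deny by default; grant only when user, device, network,
    port, and time window are all verified."""
    return (user_verified
            and device_id in policy.approved_devices
            and src_ip.rsplit(".", 1)[0] + ".0/24" in policy.allowed_networks
            and port in policy.allowed_ports
            and policy.start <= now <= policy.end)

policy = AccessPolicy(
    allowed_ports={443},
    allowed_networks={"10.20.30.0/24"},
    start=time(8, 0), end=time(18, 0),
    approved_devices={"laptop-0042"},
)

# Verified user on an approved device, allowed network, port, and hour:
print(is_access_allowed(policy, True, "laptop-0042", "10.20.30.77", 443, time(9, 30)))  # → True
# Same request on port 22 fails, even though everything else matches:
print(is_access_allowed(policy, True, "laptop-0042", "10.20.30.77", 22, time(9, 30)))   # → False
```

The point of the sketch is the conjunction: a verified identity alone is never sufficient, which is what distinguishes zero trust from perimeter-style "front door key" access.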


6 innovative and effective approaches to upskilling

Beverage maker Torani has been mixing up L&D by flipping the traditional performance review — which can be “demoralizing” — on its head. It puts the onus on future rather than past performance and on employee learning aspirations, rather than manager assessment. ... Devine adds: “With today’s shift to agile working, some firms believe yearly performance objectives and appraisals are insufficient and inflexible. They need something more frequent, nimble, and focused on feedback, skills and future needs. But you still need managers to assess performance to justify and provide transparency on promotions and pay decisions.” ... Microsoft is helping workers across its organization gain AI-related skills — from non-techies to IT professionals and leaders. Simon Lambert, chief learning officer at Microsoft UK, says: “One lesson we’ve learned from our AI learning journey is that upskilling means far more than merely equipping employees with skills. It requires an ecosystem that fosters adaptability and continuous learning. In the face of AI-upskilling demand, employees need faster, seamless access to learning infrastructure.”


2024 Data Center Un-Predictions: Five Unlikely Industry Forecasts

The potential impact of data centers on local communities is an important issue. At a recent conference in Virginia, we had activists from the community right alongside data center leaders to discuss the challenges and opportunities we face. While there were still some disconnects, we met in the middle on some critical topics around power, community engagement, and ensuring we create a more sustainable future. ... Leveraging self-driving technology, robots independently chart and traverse the data center, gathering real-time sensor data. This lets them immediately juxtapose present patterns against pre-defined norms, facilitating swift identification of deviations for human examination. In an ever more interconnected and intricate environment, this robotic technology grants decision-makers enhanced visibility, rapidity, and a breadth of intelligence that surpasses what humans or stationary cameras can provide. This advanced capability is vital for maintaining the efficiency and security of data centers in our increasingly digital world.


US DOD’s CMMC 2.0 rules lift burdens on MSPs, manufacturers

The proposed rules also let manufacturers off the hook for complying with NIST SP 800-171. SP 800-171 is a set of NIST cybersecurity rules to protect sensitive federal information. “The requirements of the 800-171 set of cyber standards are designed for IT networks and information systems,” Metzger says. “They were never really designed for a manufacturing environment. It’s now said clearly in the proposed rules that the assessments won’t apply to operational technology.” “That, to me, should cause manufacturers to breathe a huge sigh of relief because being required to meet NIST standards that simply don’t fit a manufacturing or OT environment is a recipe for trouble of many forms,” Metzger says. “The most important change is what did not change. The document has essentially the same structure and strategy that was in 1.0. It requires third-party assessments for a very large number of defense suppliers.” The proposed version 2.0 of the CMMC rules was published in the Federal Register on December 26. Interested parties have until February 26 to file comments with the DOD before the agency finalizes the rules.


Banking Innovation is Paramount Even as Regulatory and Competitive Pressures Mount

Guiding technology-forward regulations can empower banks to harness innovation, enhancing security, transparency, and customer value. Regulators should seek thoughtful oversight that encourages innovation while safeguarding against excessive risks instead of attempting to prevent the recurrence of a once-in-a-century financial crisis. Banks face a growing challenge to their market share from alternative lending platforms, which poses an existential threat, as noted in McKinsey’s 2023 Global Banking Annual Review. Over 70% of the growth in global financial assets since 2015 has shifted away from traditional bank lending, finding its way into private markets, institutional investors and the realm of “shadow banking.” Near-zero interest rates have enabled private equity firms and non-bank lenders to offer lower-cost loans. With its digitally savvy consumer base, the fintech sector has further accelerated this transition, particularly during the pandemic.


IT and OT cybersecurity: A holistic approach

As OT becomes more interconnected, the need to safeguard OT systems against cyber threats is paramount. Many cyber threats and vulnerabilities specifically target OT systems, underscoring the potential impact on industrial operations. Many OT systems still use legacy technologies and protocols that may have inherent vulnerabilities, as they were not designed with modern cybersecurity standards in mind. They may also use older or insecure communication protocols that do not encrypt data, making them susceptible to eavesdropping and tampering. Concerns about system stability often lead OT environments to avoid frequent updates and patches. This can leave systems exposed to known vulnerabilities. OT systems are not immune to social engineering attacks either: insufficient training and awareness among OT personnel can lead to unintentional security breaches, such as clicking on malicious links or falling for phishing lures. Supply chain risks also pose a threat, as third-party suppliers and vendors may introduce vulnerabilities into OT systems if their products or services are not adequately secured.


Exploring the Future of Information Governance: Key Predictions for 2024

In today’s rapidly evolving digital landscape, information governance has become a collective responsibility. Looking ahead to 2024, we can anticipate a significant shift towards closer collaboration between the legal, compliance, risk management, and IT departments. This collaborative effort aims to ensure comprehensive data management and robust protection practices across the entire organization. By adopting a holistic approach and providing cross-functional training, companies can empower their workforce to navigate the complexities of information governance with confidence, enabling them to make informed decisions and mitigate potential risks effectively. Embracing this collaborative mindset will be crucial for organizations to adapt and thrive in an increasingly data-driven world. ... Blockchain technology, with its decentralized and immutable nature, has the tremendous potential to revolutionize information governance across industries. By 2024, as businesses continue to recognize the benefits, we can expect a significant increase in the adoption of blockchain for secure and transparent transaction ledgers.


Data Professional Introspective: Demystifying Data Culture

We are discussing data culture from several points of view: what new content should be added to the DCAM, where would it fall within the current framework structure, what changes we propose to that structure, what modifications should be made to existing content, how the new/modified content would be assessed, and so on. ... One can begin decomposing data culture from a high-level vision, which summarizes what the organization has accomplished when it can feel confident in asserting that, “We have a strong data culture.” One can also compile a collection of activities and behaviors that demonstrate a developed data culture, and then categorize them and parse them into the DCAM. Or, one can apply a combination of the two approaches, which is the path the working group has followed. The working definition posited to date includes a summary description of a strong data culture: “A strong data culture promotes data-driven decision-making, data transparency, and the alignment of data and analytics to business objectives. It prioritizes strategic data use and encourages sharing and collaboration around data.”


Tackling technical debt in the insurance industry

The impact of technical debt on insurers spans various dimensions. Data inefficiencies arise, leading to compliance issues and difficulties in recruiting and retaining talent. Outdated processes hinder optimal decision-making, impacting both established and newer insurers. Addressing technical debt requires insurers to foster a culture of change, emphasising the risks of neglecting this issue and aligning strategies with broader organisational objectives. Tackling technical debt involves immediate action, prioritised backlog creation, and adaptive development processes. Insurers are advised to navigate technical debt through a combination of incremental and transformational changes. Incremental adjustments and breakthrough advancements should complement comprehensive restructuring efforts for sustained and effective resolution. The roadmap to a resilient, innovative future in insurance hinges on proactive management of technical debt. Insurers must embark on their journey towards pricing transformation to remain competitive and future-ready.



Quote for the day:

"In the end, it is important to remember that we cannot become what we need to be by remaining what we are." -- Max De Pree

Daily Tech Digest - January 06, 2024

FTC offers $25,000 prize for detecting AI-enabled voice cloning

Through the Voice Cloning Challenge, the FTC aims to find a solution that can identify cases of voice cloning with the help of generative AI. The agency calls it “an exploratory challenge” that could potentially provide a direction for the risk mitigation effort. The winning proposal will receive $25,000 and the runner-up will get $4,000. There are up to three honorable mentions, each awarded with $2,000. On January 2nd, the agency started accepting submissions via its online portal and will receive ideas for 10 days, until January 12, 08:00 PM EST. Submissions must include a one-page overview of the proposal and a detailed description of up to 10 pages. Participants may also include a video to show how their idea works. All submissions will be judged based on their practical feasibility, impact on corporate accountability and burden on the consumer, and resilience to rapid technological advancements in the field. Should the challenge fail to yield any effective defense ideas, the FTC notes that the effort will serve as an early warning for policymakers and would highlight the need for more stringent regulations on the use of AI technology.


Building a Great Security Operations Center

Without a defined SOC strategy, security leaders may struggle to prioritize resources. A strategy provides direction based on various inputs such as the threat landscape, regulatory requirements and threat assessments specific to the organization. In the context of an SOC, the primary objective of the SOC strategy should be to avoid a situation where the cost and effort are high and the value and return on investment (ROI) are low. The aim of the SOC strategy is to ensure that the SOC effectively fulfils its function and, in doing so, helps the organization to fulfil its overall business objectives. A well-architected SOC provides a positive ROI by minimizing potential financial losses due to cyberincidents. At the same time, an SOC enhances an organization’s ability to detect and respond to cyberthreats in real time, safeguarding sensitive data and protecting the organization’s reputation. Therefore, compliance, ROI and risk reduction are interconnected. Although it is easy to get carried away with generic cybersecurity use cases, the development of business-aligned use cases is what separates average SOCs from great SOCs.


Is the vCISO Model Right for Your Organization?

It's getting harder to justify not having a CISO, so many businesses that have never had one are filling the gap with a virtual CISO (vCISO). A vCISO, sometimes referred to as a fractional CISO or CISO-as-a-service, is typically a part-time, outsourced security expert who helps businesses protect their infrastructure, data, personnel and customers. Depending on the needs of the company, vCISOs can work on-site or remotely, for the long term or short term. There are plenty of reasons why companies are going the vCISO route. Sometimes it's an internal crisis where a company's CISO has unexpectedly resigned and the board needs time to find a permanent new one. Other times it revolves around new regulatory or business requirements or a cybersecurity framework the company needs to adhere to, like NIST's Cybersecurity Framework 2.0. Sometimes a board member used to being briefed by the CISO may request a vCISO. "A smaller company might need a CISO but just a few days a week, and that type of delivery model is perfect for a vCISO," says Russell Eubanks, a vCISO who is also on the faculty of IANS Research and an instructor with SANS Institute.


Generative AI and Data Management: Transforming B2B Practices

Generative AI’s future in data management and analytics shines with promising trends to redefine data analysis methodologies. These trends encompass enhanced augmentation, deeper understanding and explanation, and the democratization of data analysis, presenting a transformative shift in how organizations harness data for insights and decision-making. Generative AI is poised to transcend traditional data visualization, evolving to augment the entire data analysis workflow. This evolution encompasses automated data exploration, hypothesis generation, data storytelling, and predictive analytics. AI’s capability to suggest patterns, relationships, and anomalies and generate comprehensive reports promises to revolutionize data-driven decision-making. The future of Generative AI goes beyond reporting events, delving into causality and explanations. The upcoming trends include causal inference, counterfactual analysis, and the integration of Explainable AI (XAI). These advancements ensure a profound understanding of underlying causes behind observed trends and transparent insights for users.


4 Strategies for Migrating Monolithic Apps to Microservices

For many organizations, taking a lift-and-shift approach is the first step for migrating monolithic applications to Kubernetes and microservices. This involves directly lifting the monolith onto hardware hosted in the cloud, and then gradually breaking down the app into microservices. However, the lift-and-shift philosophy has its challenges, as organizations must refactor monoliths to optimize them for the cloud. Therefore, it’s often more cost-effective to refactor an application service by service into a containerized architecture. ... Dependencies within monolithic apps are deeply intertwined. These close relationships among components are one of the driving forces behind the move to Kubernetes and microservices, as they hinder flexible changes and deployment. When migrating an application to a microservices architecture, it’s important for teams to understand all dependencies among services and to reduce and streamline them as much as possible. Asynchronous messaging is key, allowing services to communicate by sending and receiving messages using queues. 
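The queue-based decoupling described above can be sketched in a few lines. This is an illustrative toy, not a production messaging stack: a hypothetical "order service" publishes events to a queue and an independent "billing service" consumes them at its own pace, so neither component calls the other directly.

```python
import queue
import threading

# Shared message queue standing in for a broker (e.g., RabbitMQ or Kafka).
orders = queue.Queue()

def order_service():
    # Publishes events and moves on without waiting for the consumer.
    for order_id in (101, 102, 103):
        orders.put({"event": "order_created", "id": order_id})
    orders.put(None)  # sentinel: no more messages

processed = []

def billing_service():
    # Consumes events at its own pace; any backlog simply waits in the queue.
    while True:
        msg = orders.get()
        if msg is None:
            break
        processed.append(msg["id"])

producer = threading.Thread(target=order_service)
consumer = threading.Thread(target=billing_service)
producer.start(); consumer.start()
producer.join(); consumer.join()
print(processed)  # → [101, 102, 103]
```

Because the only shared surface is the message contract, either service can be rewritten, scaled, or redeployed without the intertwined dependencies that make monoliths hard to change.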


Network Tokenization and Digital Identities Are Quietly Transforming Payment Security

Digital identities, through biometric data and multi-factor authentication, fortify the security of transactions. This not only protects users from identity theft but also strengthens the overall trustworthiness of digital payment systems. “We never really thought about, what does it mean to identify a person on the internet in a way that is portable and doesn’t require you to rely on a single private platform,” Mike Brock, CEO of TBD, a business from Block focused on open-source decentralized technologies, told PYMNTS. Digital identities play a crucial role in meeting regulatory requirements. By providing a secure and traceable means of verifying user identities, businesses can navigate compliance challenges more efficiently, reducing the complexities associated with anti-money laundering (AML) and know your customer (KYC) processes. “Combating Online Fraud With Digital Identification,” a PYMNTS Intelligence and Prove collaboration, finds that security is highly important for 83% of consumers, while 53% say consistent experiences across different platforms have a very or extremely big impact on their trust in financial institutions.


AI governance outlook: A Global South perspective

An under-regulated path for AI and emerging technologies may bring diverse negative outcomes. These outcomes may lead to a rise in inequality, loss of privacy, and ethical transgressions. By contextualising this through understanding the history of the industrial revolutions that brought drastic changes in people's social and economic lives and prioritising moral concerns, the G20 and GPAI member states can reduce negative results that will arise without the right steering and regulation. Despite the G20's significant influence and GPAI’s members’ technical expertise, many member states face issues with the digital divide, especially the unequal distribution of advanced technologies and their benefits. The divide deepens as AI development, mainly in developed markets, widens the gap between these countries and their developing counterparts in AI research and development (R&D). According to the 2023 AI Index Report, private investment in AI from 2013-22 in the United States (US) (US$250 billion) outpaced that of other economies including India, Japan, the United Kingdom (UK) and most of the other G20 nations.


At What Point Is Digital Transformation A Success?

“Digital transformation” sounds like an expensive, laborious slog. The good news is that most companies are likely closer to succeeding at it than they think. Getting in shape and digital transformation have a lot in common: planning, persistence and patience—with a lot of pragmatism—are the keys to achieving your goals. ... When you are in a new fitness regimen, have you “failed” because you’ve only lost 10 pounds of your 20-pound goal? Of course not. You celebrate your progress, and you keep working at it. In a digital transformation, each company’s goals and starting points are unique to their particular circumstances. As a result, based on the clients I work with daily, there are many ways to measure progress. ... In building a great company or social sector enterprise, there is no single defining action, no grand program, no one killer innovation, no solitary lucky break, no miracle moment. Rather, the process resembles relentlessly pushing a giant, heavy flywheel, turn upon turn, building momentum until a point of breakthrough, and beyond.


How to prepare for increased oversight of cybersecurity

DORA, NIST 2.0 frameworks and the new SEC rules can help speed up this process. However, companies can also develop best practices to better implement board oversight of cybersecurity risk. First, covered entities must start planning now for the structural and cultural changes these rules and regulations will require—they will take time to implement. When done right, a risk management program will educate and empower company leaders to understand and confidently accept, mitigate or transfer risk. Second, to promote this strong governance at the C-suite and board level, companies must educate their leadership on how to take a front seat on cyber strategy and governance. Rather than an insulated organizational function, cyber risk management should be informed by a company’s business strategies, compliance landscape, and risk culture. Finally, it will be critical for organizations to understand specific roles and responsibilities and to maintain regular lines of communication. In addition to the board and other company leaders, security, communications, and legal teams should be involved in ongoing conversations around achieving a whole-of-business cyber governance strategy.


Optimizing PCI compliance in financial institutions

In practice, IT architectural patterns give architects the building blocks to design any IT solution. The architect chooses and orders the patterns available in the portfolio to meet the end goal. Having segmentation between infrastructure providing data processing and data storage is an example of a broad IT security architectural pattern. If the solution’s goal involves processing and storing data, the architect is constrained to place the pieces that will fulfill those tasks in the proper segments. Furthermore, if the operating system pattern is Oracle Enterprise Linux, the architect would use that pattern first in the design unless technical constraints made the consumption of this pattern suboptimal to accomplish the solution’s goal. All other needs—for example, authentication, encryption, log management, and system configuration—would be treated the same: by using the architectural patterns available. The notion of pattern exists beyond IT in areas that a PCI security assessment touches, such as pre-employment screening practices, security awareness training, risk assessment methodology, or third-party service provider management.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - January 05, 2024

The dark side of AI: Scientists say there’s a 5% chance of AI causing humans to go extinct

Despite concerns about AI behaving in ways misaligned with human values, some argue that current technology cannot cause the catastrophic consequences predicted by skeptics. Nir Eisikovits, a philosophy professor, contends that AI systems cannot make complex decisions and do not have autonomous access to critical infrastructure. While the fear of AI wiping out humanity grabs attention, an editorial in Nature contends that the more immediate societal concerns lie in biased decision-making, job displacement, and the misuse of facial recognition technology by authoritarian regimes. The editorial calls for a focus on actual risks and actions to address them rather than fearmongering narratives. The prospect of AI with human-level intelligence raises the theoretical possibility of AI systems creating other AI, leading to uncontrollable “superintelligence.” Authors Otto Barten and Joep Meindertsma argue that the competitive nature of AI labs incentivizes tech companies to create products rapidly, possibly neglecting ethical considerations and taking risks.


10 Skills Enterprise Architects Need In 2024

While an abundance of legacy technology is a cause for concern, each application needs to be appraised on a case-by-case basis. It's possible that an older application could actually be a better functional fit for your organization. More likely, however, is that removing a legacy application could be more trouble than it's worth. When you have clarity on how each application fits into your IT landscape, it could become apparent that removing an application would cause more problems than it would solve. Just as enterprise architects need to become experts at surgically removing outdated applications, they also need to know when the time is right to remove an application and how to manage legacy technology until that point. That's the true value of enterprise architecture. ... As generative artificial intelligence (AI) and other new technologies continue to take the weight of work out of daily tasks, the value a human can add is more about communication, negotiation, and diplomacy. Getting stakeholders on board with enterprise architecture involves charm and understanding.


The European Data Act: New Rules for a New Age

Being a key element of the EU’s data strategy, the Data Act intends to lead to new, innovative services and more competitive prices for aftermarket services. According to the European Commission, the Data Act will make more data available for reuse, and it is expected to create 270 billion euros of additional gross domestic product by 2028. Complementing the Data Governance Act, which sets out the processes and structures to facilitate data sharing by companies across the EU and between sectors, the Data Act clarifies who can create value from industrial data and under which conditions. The Data Act also aims to put users and providers of data processing services on more equal footing in terms of access to data. ... The Data Act includes specific measures to allow users to gain access to the data their connected products generate (including the relevant metadata necessary to interpret such data) and to share such data with third parties to provide aftermarket or other data-driven innovative services. The Data Act further sets out that such data should be accessible in an easy, secure, comprehensive and structured manner, and it should be free of charge and provided in a commonly used machine-readable format.


Unlocking the Potential of Gen AI in Cyber Risk Management

Security automation powered by AI plays a pivotal role in streamlining various security functions, alleviating the workload for CSOs and CIOs and facilitating regulatory compliance. Security automation significantly simplifies routine security tasks, allowing human resources to pivot toward more intricate risk analysis and strategic decision-making. One of the notable contributions of AI lies in its assistance in meticulous code inspection and vulnerability assessment. For instance, tools such as Amazon Inspector for Lambda code and Amazon Detective provide indispensable support. Amazon Inspector aids in the comprehensive examination of code, identifying potential vulnerabilities or security loopholes within Lambda functions, which are integral parts of many cloud applications. This early identification ensures preemptive measures are taken to fortify these vulnerabilities before deployment. Additionally, Amazon Detective helps security analysts by correlating and organizing vast amounts of data to identify patterns or anomalies that might signify a security issue. By leveraging machine learning and AI-driven insights, it streamlines the process of identifying and addressing those issues effectively.
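The anomaly-flagging idea behind such tooling can be illustrated without any cloud service. This is a deliberately simple statistical sketch on synthetic data (the hourly failed-login counts are invented for the example, and it does not reflect how Amazon Detective works internally): hours whose event counts deviate from the baseline by more than two standard deviations are surfaced for an analyst.

```python
import statistics

# Synthetic hourly failed-login counts; hour 7 contains an injected spike.
hourly_failed_logins = [4, 6, 5, 7, 5, 6, 4, 98, 5, 6]

mean = statistics.mean(hourly_failed_logins)
stdev = statistics.stdev(hourly_failed_logins)

# Flag any hour whose z-score exceeds 2 as worth human review.
anomalies = [(hour, count)
             for hour, count in enumerate(hourly_failed_logins)
             if stdev and abs(count - mean) / stdev > 2]

print(anomalies)  # → [(7, 98)]
```

Production systems replace this fixed threshold with learned baselines across many correlated signals, but the workflow is the same: automation narrows millions of events down to the handful an analyst should investigate.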


Honeywell’s Journey to Autonomous Operations

We’ve integrated AI into our technical-support operations, enabling customers to receive answers to their technical questions within minutes or seconds, as opposed to the day or two it previously took. Today, the addition of generative AI has amplified the capabilities of industrial AI, making it even more powerful than ever before. For example, we’re currently looking at millions of instances of alarms being triggered in the plants of our industrial customers -- to evaluate the potential use of such historical datasets to train a robust language model that would assist plant operators in identifying and addressing alarm issues promptly and providing guidance on necessary actions. ... With the convergence of IoT and AI software, the journey to autonomous operations is accelerating rapidly in the industrial world. However, automated decision-making requires both domain knowledge and the technical capabilities to build such a system. In vetting potential partners, look for one with the experience, data, and domain expertise to help you make the transition at scale.


Data and AI Predictions for 2024: Shifting Sands in Enterprise Data and AI Technologies

As organizations continue their shift to cloud-based data and analytics infrastructure, a more prudent fiscal outlook will be the theme for 2024. The cloud migration megatrend will not reverse, but organizations will scrutinize their cloud spend more than ever due to the challenging macroeconomic environment. In the cloud analytics arena, Databricks and Snowflake will continue their dominance with their well-established platforms. In particular, Databricks’ first-mover advantage for facilitating a lakehouse architecture will allow it to capture more market share. This paradigm combines the flexibility of data lakes with the management features of data warehouses, offering the best of both worlds to enterprises. On the other hand, Google BigQuery is expected to retain its stronghold within Google Cloud Platform (GCP) deployments, bolstered by deep integration with other GCP services and a strong existing customer base. However, the economic headwinds will compel enterprises to consider the total cost of ownership more closely. As a result, the traditional data warehouse architecture will see a decline in favor of the more cost-effective lakehouse design pattern. 


“You can’t prevent the unpreventable” - Rubrik CEO

A significant hurdle in the fight against cyber threats as a whole lies in legislation and prosecution. The most capable cyber criminal enterprises are often state-sponsored groups harbored within nations that share their sympathies. While it is possible to seize their cyber assets and disrupt their operations, it is near impossible to prosecute a criminal who is working on behalf of a hostile government. Sinha states that not enough is being done at both the business and governmental levels to create frameworks for information sharing. This means that when one business faces a successful attack, it can be studied to understand the methods of intrusion, how the data was encrypted or extracted, and what could have been done at each stage of the attack to minimize the damage. Not only does this allow businesses to improve their data security and recovery strategies, but it also provides attack playbooks that can be used to identify the groups responsible and their cyber infrastructure. However, there is an air of hesitation among many businesses, as many would prefer to pay a ransom rather than reveal that their organization was successfully breached, which could cause potential reputational and economic losses.


Gen AI: A Shield for Improved Cyber Resilience

Before implementing GenAI as a proper defense tool, teams and leaders need to understand its strengths and weaknesses. Proper research and education on this topic will help teams establish sound security procedures and match the appropriate tool to the corresponding task. An easy way to understand the benefits of a given AI tool is to review its model card (sometimes known as a “system card”), which tells users what the model is good at, what it has and has not been tested for, and its flaws and vulnerabilities. Vetting AI models is a vital step, and model provenance should be the first step of any defense strategy. Biden’s latest executive order on AI reinforces the importance of vetting AI models, requiring them to be red-teamed to suss out potential weaknesses. Model provenance provides the model’s documented history: its origin, the architecture and parameters it possesses, the dependencies it may bear, the data used to train it, and other corresponding details.


Apache ERP Zero-Day Underscores Dangers of Incomplete Patches

The incident highlights attackers' strategy of scrutinizing any patches released for high-value vulnerabilities — efforts which often result in finding ways around software fixes, says Douglas McKee, executive director of threat research at SonicWall. "Once someone's done the hard work of saying, 'Oh, a vulnerability exists here,' now a whole bunch of researchers or threat actors can look at that one narrow spot, and you've kind of opened yourself up to a lot more scrutiny," he says. "You've drawn attention to that area of code, and if your patch isn't rock solid or something was missed, it's more likely to be found because you've extra eyes on it." ... The reasons that companies fail to fully patch an issue are numerous, from not understanding the root cause of the problem to dealing with huge backlogs of software vulnerabilities to prioritizing an immediate patch over a comprehensive fix, says Jared Semrau, a senior manager with Google Mandiant's vulnerability and exploitation group. "There is no simple, single answer as to why this happens," he says. 


Unlocking the Secrets of Data Privacy: Navigating the World of Data Anonymization, Part 1

Implementing data anonymization techniques presents many technical challenges that demand careful deliberation and expertise. One paramount obstacle lies in determining the optimal level of anonymization. A deep comprehension of the data's structure and the potential for re-identification is imperative when employing techniques such as k-anonymity, l-diversity, or differential privacy. Scalability poses another formidable hurdle: as data volumes continue to grow, applying anonymization techniques without unduly compromising performance becomes increasingly difficult. Further difficulties arise from the differing nature of data types, from structured data in databases to unstructured data in documents and images. Additionally, keeping pace with ever-evolving data formats and sources necessitates constant updates and adaptations of anonymization strategies.
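As a concrete illustration of the first challenge, here is a minimal sketch of the check at the heart of k-anonymity: every combination of quasi-identifier values must occur in at least k records. The records and quasi-identifier names below are made up for illustration.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears in at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical, already-generalized records (exact age and ZIP replaced
# by an age band and a 3-digit ZIP prefix).
records = [
    {"age_band": "30-39", "zip3": "902", "diagnosis": "flu"},
    {"age_band": "30-39", "zip3": "902", "diagnosis": "asthma"},
    {"age_band": "40-49", "zip3": "913", "diagnosis": "flu"},
]

print(is_k_anonymous(records, ["age_band", "zip3"], 2))  # False: the 40-49/913 group has only one record
```

The tension the excerpt describes shows up immediately: making the third record indistinguishable requires further generalization, which costs data utility.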



Quote for the day:

"You may be disappointed if you fail, but you are doomed if you don't try." -- Beverly Sills

Daily Tech Digest - January 04, 2024

Beyond transactions: Reimagining banking with superior digital customer journeys

In their journey toward digitalization, many banks have concentrated on streamlining customer journeys, often through targeted solutions like automation or AI to address specific problem areas. However, true digital success goes beyond these piecemeal enhancements. For banks to truly excel in the digital arena, a shift in mindset is required. It’s about adopting a platform-first approach that offers a more integrated, holistic solution rather than completely overhauling existing systems. This approach isn’t just about fine-tuning isolated components; it’s about seamlessly connecting these components through a unified platform that enhances the overall efficiency and effectiveness of the banking ecosystem. ... Many banks streamline, but the real leaders in digital banking are those who reimagine the entire customer journey. They take their ‘hands off the wheel,’ designing systems that amplify human potential while supporting touchless operations. They weave AI into the fabric of banking to deliver seamless service that’s not just efficient but also intuitive and connected. 


5 Microservices Design Patterns Every DevOps Team Should Know

The API gateway serves as a single entry point for all client requests. It routes requests to the appropriate microservice and subsequently aggregates the responses. It also handles cross-cutting concerns like authentication, monitoring and rate limiting. Furthermore, it provides a unified API that is easier for clients to consume, shielding them from the complexity of the microservices architecture. However, the API Gateway pattern is not without its challenges. It can become a bottleneck if not properly designed and scaled, and it is a single point of failure unless made highly available. Despite these challenges, with careful design choices and good operational practices, the API Gateway pattern can greatly simplify client interaction with microservices. ... The Circuit Breaker pattern aims to prevent this scenario. With the Circuit Breaker pattern, you can prevent a network or service failure from cascading to other services. When a failure is detected, the circuit breaker trips and prevents further calls to the failing service. It then periodically attempts to call the service, and if successful, it closes the circuit and lets the calls go through.
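The circuit-breaker behavior described above can be sketched in a few lines. This is a toy illustration, not a production implementation; real libraries such as resilience4j add failure-rate windows, half-open call limits and metrics.

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: opens after `max_failures` consecutive
    failures, then allows a single trial call after `reset_timeout` seconds."""
    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: call rejected")
            # Timeout elapsed: fall through and let one trial call probe the service.
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip (or re-trip) the breaker
            raise
        # A success closes the circuit and resets the failure count.
        self.failures = 0
        self.opened_at = None
        return result
```

Callers wrap each downstream request in `breaker.call(...)`; while the circuit is open, the failing service gets no traffic at all, which is exactly what stops the cascade.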


6 warning signs CIOs should look out for in 2024

2023 saw a massive boom in AI, and governments are starting to catch up. In the US, President Biden rolled out an executive order on the safe and secure uses of AI, while in the European Union, lawmakers in December agreed on the details of the AI Act — one of the first bills in the world to establish comprehensive rules for AI. So CIOs will have to follow the debate closely as the year progresses. “Staying updated with new regulations, especially regarding AI ethics, data usage, and copyright concerns, is crucial,” says Bilyk. “Ignoring these changes can lead to legal complications and a loss of public trust.” Companies should make sure they have enough compliance experts, while startups need to hire them early on because they have to understand if and how regulations apply to them. Also, it helps if CIOs know exactly which AI-powered tools their company uses and how their in-house tools are developed. Not knowing this is a serious red flag. “A lot of times, leadership, or the legal side, doesn’t even know what developers are building,” Joseph Thacker, security researcher at AppOmni, told CSO. “I think for small and medium enterprises, it’s going to be pretty tough.”


A Look at Microsoft's Secure Future Initiative

My main gripes with the SFI, and the way it's presented, are twofold. First, Satya is nowhere to be seen. This is not the CEO having a moment of deep realization that the current path is dangerous and the ship needs to steer clear of the icebergs ahead. It's security leaders talking about incremental security improvements. Second, it's marketing word salad (mostly, see more below). It talks in generalities about improvements but not enough about concrete, measurable and transparent details that we can see and use to start rebuilding our trust in Microsoft's security culture. ... In an ever-increasing "speed of feature release" competition with AWS in the IaaS and PaaS cloud spaces, and ditto in SaaS with Google (I think this is a mistake; Google Workspace is no longer a serious competitor to Microsoft 365 except in very small businesses) and to a lesser extent Salesforce and others, will program managers at Microsoft be incentivized to say "No, we can't release this feature now, even though the competition is, because we'll need to spend another two months ironing out the security issues"?


Want to Identify Good Generative AI Use Cases? Don’t Be Boring!

When we talk about impact, we should always be thinking about acceleration. Code generation, for instance, is a great example of generative AI’s ability to take complex inputs and produce complex outputs. Writing code, even for master coders, can be a complex and time-consuming task. Great code is also invaluable to organizations and their solutions. With generative AI, an expert coder could prompt a model with something like, “Generate some code that sorts this array here in descending order” and receive a working output almost instantaneously. Sure, they could write that code themselves, but generative AI accelerates their job multiple times over. This is a capability that, until now, was only a dream. Conversational analytics is another “golden zone” use case because of its accelerative potential. Within business intelligence (BI) and other advanced organizational data/AI tools, you could ask a model something like: “What have our sales been for each of our product groups, and are there any trends?” To do this in code would take a person, team, or department a long time.


Emotional Cyberrisk Management Decisions

There are certain aspects of managing cyberrisk in which emotion can play an important role. However, far too often, emotion is treated as the default tool in risk management, rather than the specialty tool it is meant to be. Cognitive biases and heuristics have an outsized influence on cyberrisk management. They are hard-wired into our brains and, if managed poorly, can make it difficult to ascertain reality. One of the more common sources of bias is confirmation bias, which is the tendency to search for, interpret, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. Confirmation bias may lead someone to disregard evidence that contradicts their existing beliefs. Overconfidence bias is also common. This leads individuals to overestimate their abilities, such as the effectiveness of their cybersecurity controls. Lastly, the anchoring effect happens when someone relies too heavily on the first piece of information ingested (i.e., the anchor) when making decisions, such as an initial risk assessment that may be outdated.


Why Data Quality Will Rise to the Top of Enterprise Priorities in 2024

Data quality is essential. Even if you have found the data sets that will be appropriate for training an AI model or digging for insights, your results will be poor if the quality of your data is poor. Ownership of data and data quality are core to business success but are still too often ignored or overlooked by the executives and boards of directors of most organizations. We can see this in the disconnects between perception and reality. Over 80% of executives that HFS surveyed say they trust their data, but the reality is that many people are still doing a lot of work to get data quality to a level where the data can be consumed and relied upon. As data quality takes on greater importance, it will escalate to an executive-level conversation. Enterprises will need to start collecting and documenting data, metadata, processes, and business rules as they pursue data quality. Without these basic elements, AI models won’t be able to produce insightful and accurate results. If you haven’t yet, you’ll need to invest in initiatives to improve your data quality so you can build a strong foundation for AI use.


The Rise of Linux in Edge Computing and IoT

Edge devices often operate with limited resources. While Linux’s scalability is a clear asset, striking a balance between functionality and resource usage is important. Careful optimization is essential to ensure efficient utilization of limited resources. ... There is a vast array of edge devices, ranging from sensors and actuators to gateways and edge servers. This diversity can present a challenge in terms of maintenance and optimization. Addressing various architectures and hardware configurations is a time-intensive process that requires continuous effort. ... Decentralization is a common feature of edge computing systems, with a multitude of devices each having unique vulnerabilities. Regularly patching and updating these devices can be challenging, as can tracking them. Organizations must be vigilant about maintaining the security integrity of edge computing systems. ... Integrating various devices into a cohesive system poses complex challenges. Developers must navigate these intricacies to ensure interoperability among devices running Linux and other operating systems.


Ransomware Group Steals Australian Courts' Video Recordings

Louise Anderson, chief executive officer of the court, did not reveal the nature of the attack but said the department had been able to isolate the affected network and could confirm that the unauthorized access had been restricted to recordings stored on the network. "We understand this will be unsettling for those who have been part of a hearing. We recognize and apologize for the distress that this may cause people," Anderson said. "Maintaining security for court users is our highest priority. Our current efforts are focused on ensuring our systems are safe and making sure we notify people in hearings where recordings may have been accessed." Anderson did not detail the court's efforts to regain access to the compromised recordings but said changes are being made to hearing arrangements and will be announced shortly. Anderson said the hackers had breached a single computer system that manages only audiovisual recordings for all court jurisdictions. "The system holds recordings for around 28 days, so the primary investigation period is Nov. 1 to Dec. 21, which is when we identified the problem and isolated and disabled the affected network," she said.


Align IT With the Business

For most CIOs, redefining IT as a strategic discipline and getting the rest of the organization to recognize this isn’t an overnight process. This is also an area where the CIO has to take the lead. Taking the lead means changing perceptions of IT by altering IT practices so IT better aligns with the business. ... Those CIOs regularly visit with these managers to understand business goals and pain points, and then strategize on how IT solutions can help the managers (and the business) succeed. As part of this process, more CIOs are placing IT personnel such as business analysts directly into business units, or they’re teaming their own IT analysts with citizen developers in user departments. This creates inter-departmental cooperation, and it is a natural onramp for relationship building with other business managers. ... Being transformative in your company can be risky and isn’t for the faint of heart, but there are CIOs who have understood their companies’ businesses and have come up with transformative solutions that have grown out of IT. These breakthroughs have reinvented their companies.



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick

Daily Tech Digest - January 03, 2024

5 best practices for digital twin implementation

Rather than wait until post-build, consider initiating digital twins during the planning, design, and construction phases of your projects. At the planning stage, this can enable plan simulation and various what-if scenario testing prior to committing to real-world investment. Part of the benefit of digital twins is they can address the full lifecycle from construction twins to operational twins. The digital twins, therefore, know far more than after-the-fact asset management systems, and the learnings and insights captured by the twin during design and build can improve operations and maintenance. According to Rapos, early incorporation allows for better data collection, more accurate modeling, and immediate feedback during the construction or development phase. It’s crucial to understand that digital twins aren’t just a final product, but a dynamic tool that evolves and adds value throughout the project’s life. Delaying its development can result in missed opportunities for optimization and innovation.


Why exit the cloud? 37signals explains

37signals was a significant cloud user with a $3.2 million cloud budget for 2022. The company pledged $600,000 to procure Dell servers, envisioning significant savings during the next five years. Of course, there were questions, and Hansson did an excellent job of addressing them one by one in the FAQ, such as the additional costs in terms of humans needed to run the on-premises systems, how optimization only took them so far in the cloud, and how they handled security requirements. Hansson also explained the limited abilities of cloud-native applications to reduce costs and highlighted the need for a world-class team to address security concerns, which the company has. Notably, privacy regulations and GDPR compliance were underscored as reasons for European companies to opt for self-owned hardware as opposed to relying on the cloud. Of course, this is not the case for everyone. ... Everyone is looking for a single answer, and it doesn’t exist. The requirements of your systems will dictate what platform you should use—not whatever seems trendy. Sometimes the cloud provides the most value, but not always.


Size doesn’t matter!

Small enterprises are less likely to have dedicated IT staff, let alone afford cybersecurity specialists. Security solutions are usually considered too expensive (Chidukwani 2022), and their technical features come across as too complex to be handled in-house. As a consequence, there is a tendency to rely heavily on external IT vendors that provide sub-optimal support without customized care (Benz 2020). Fear-driven, some business owners take the reactive route: instead of a unified threat solution, they continue to buy off-the-shelf security products in response to recently emerging threats, leaving many leakages unplugged and their protection ineffective. These human, financial, and technical resource constraints create a puzzling gap between the cybersecurity awareness of small business leaders and their commensurate commitment to addressing the risk. Alongside the well-known construct of the ‘digital divide’, academic literature now also acknowledges a ‘security divide’: lagging investments in cybersecurity solutions coupled with increasing cyber incidents at SMEs (Heidt et al., 2019).


Cybersecurity challenges emerge in the wake of API expansion

APIs are already the fundamental building blocks of any modern organization today, and that will become even more evident going forward. As organizations look to transform their digital business and enter the era of the API economy, we expect that we will be building and using more and more APIs. That’s especially true if we take a look at some of the trends that are happening in technology nowadays. Things like VR/AR glasses, wearable devices, and voice-controlled devices all require APIs to work. APIs will play a more critical role as the world transitions to more browserless devices. All this growth and expansion means more APIs, more requests, and more security challenges. The toughest thing about API security is that, in most cases, organizations don’t know that hackers are exploiting their APIs because they don’t have access to API data in real time. That’s why tooling that provides that real-time visibility will become even more critical.
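The real-time API visibility the excerpt says most organizations lack starts with something simple: recording every request as it passes through. A minimal sketch, assuming a WSGI application (the app, route, and sink below are all hypothetical):

```python
import json, time

def api_visibility_middleware(app, sink):
    """Wrap a WSGI app so every API call is recorded as it happens;
    real tooling would ship these records to an analysis backend."""
    def wrapped(environ, start_response):
        sink.append({
            "ts": time.time(),
            "method": environ["REQUEST_METHOD"],
            "path": environ.get("PATH_INFO", ""),
        })
        return app(environ, start_response)
    return wrapped

def demo_app(environ, start_response):
    # Stand-in for a real API endpoint.
    start_response("200 OK", [("Content-Type", "application/json")])
    return [json.dumps({"ok": True}).encode()]

events = []
app = api_visibility_middleware(demo_app, events)

# Simulate one request without running a real server.
body = app({"REQUEST_METHOD": "GET", "PATH_INFO": "/v1/users"}, lambda s, h: None)
print(events[0]["path"])  # /v1/users
```

With a record per call, anomalies such as a sudden spike of requests to an endpoint nobody documented become visible as they happen rather than in a post-incident review.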


Attackers Abuse Google OAuth Endpoint to Hijack User Sessions

OAuth enables applications to access data and resources on other trusted online services and sites based on permissions set by a user, and it is the mechanism responsible for the authentication handoff between the sites. While the standard is certainly useful, it also presents risk to organizations if it's not implemented correctly, and there are a number of ways attackers can abuse vulnerable instances and the standard itself. For example, security researchers have found flaws in its implementation that have exposed key online services platforms such as Booking.com and others to attack. Meanwhile, others have used malicious OAuth apps of their own creation to compromise Microsoft Exchange servers. In the case of the Google endpoint, the OAuth exploit discovered by Prisma targets Google Chrome's token_service table to extract tokens and account IDs of logged-in Chrome profiles, according to CloudSEK. That table contains two "crucial" columns, titled "service (GAIA ID)" and "encrypted_token," Karthick M explained.
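For context on that "authentication handoff," here is a minimal sketch of the standard OAuth 2.0 authorization-code exchange (RFC 6749, section 4.1.3). The endpoint, client credentials, and code below are hypothetical, and the request is only built, not sent; the point is that the token returned by this exchange is exactly the kind of credential the attacks above steal.

```python
from urllib.parse import urlencode

def build_token_request(token_url, client_id, client_secret, code, redirect_uri):
    """Build the OAuth 2.0 authorization-code-for-token exchange as a
    (url, form_body) pair, per RFC 6749 section 4.1.3."""
    body = urlencode({
        "grant_type": "authorization_code",
        "code": code,                  # one-time code from the authorization redirect
        "redirect_uri": redirect_uri,  # must match the one used in the redirect
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return token_url, body

url, body = build_token_request(
    "https://auth.example.com/token",      # hypothetical authorization server
    "demo-client", "demo-secret",
    "abc123", "https://app.example.com/cb")
print("grant_type=authorization_code" in body)  # True
```

The server's response to this POST carries the access (and often refresh) token; anything that can read stored tokens, such as the Chrome token_service table described above, bypasses this handoff entirely.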


Observability in 2024: More OpenTelemetry, Less Confusion

Observability has transcended its traditional association with monitoring to find bugs and resolve outages; it now extends its influence across different interfaces and tools, demonstrating enhanced openness and compatibility, and is increasingly used to make forecasts. These forecasts can involve predicting outages before they happen, cost shifts, resource usage and other variables that previously were much harder to anticipate and mostly involved trial and error. ... “This means that organizations can now use a single agent to collect observability data across their increasingly distributed and therefore complex universe of microservices applications,” Volk said. “This could significantly simplify one of today’s most significant pain points in observability: instrumentation. Developers can now benefit from the continuously increasing auto-instrumentation capabilities of OpenTelemetry and no longer have to worry about instrumenting their code for specific observability platforms.” However, such freedom of choice, born of a proliferation of tools, has created challenges of its own.
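The auto-instrumentation idea Volk describes, collecting telemetry without editing application code, can be reduced to a toy collector. This sketch only mimics the concept and uses none of the actual OpenTelemetry API: it wraps named functions at runtime so every call records a "span" with a name and duration.

```python
import functools, time

SPANS = []  # collected toy "spans": (name, duration_seconds)

def auto_instrument(obj, names):
    """Wrap the named callables on `obj` so each call records a span,
    without touching the callables' own source - the idea behind
    auto-instrumentation, reduced to an in-process collector."""
    for name in names:
        original = getattr(obj, name)
        @functools.wraps(original)
        def wrapper(*args, __orig=original, __name=name, **kwargs):
            start = time.perf_counter()
            try:
                return __orig(*args, **kwargs)
            finally:
                SPANS.append((__name, time.perf_counter() - start))
        setattr(obj, name, wrapper)

# A stand-in for un-instrumented application code.
class checkout:
    @staticmethod
    def total(prices):
        return sum(prices)

auto_instrument(checkout, ["total"])
print(checkout.total([2, 3]))       # 5
print([name for name, _ in SPANS])  # ['total']
```

Real OpenTelemetry agents do this patching against well-known libraries (HTTP clients, database drivers, frameworks) and export the spans in a vendor-neutral format, which is what frees developers from platform-specific instrumentation.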


IT’s Key Role in Planting ESG Effort

The one thing we know about all compliance measures is that they require new levels of integration that the company usually lacks. If you can focus on integration work now, you will be more agile-and better prepared for ESG regs when they hit. Keep your ears to the ground - You can learn a lot about the directions ESG is taking from your outside audit firms, regulators and your internal legal or regulatory department. These entities already have information in advance on future ESG directions and what laws or regulations are likely to be forthcoming. Do your part internally - Several years ago, I was visiting with the CIO of a large healthcare company in the Northeast. He told me that the company wanted to trim its carbon footprint and that the first place the company looked for tangible results was in the data center. “This prompted us to move more IT to the cloud, and even to build a new, eco-friendly data center,” he said. “We virtualized servers as much as possible, reduced energy consumption, mandated that all new equipment we purchased used less power, and even redid the HVAC unit airflows.”


Why 2024 will be the year of ‘augmented mentality’

With this AI technology now available for consumer use, companies are rushing to build them into systems that can guide you through your daily interactions. This means putting a camera, microphone and motion sensors on your body in a way that can feed the AI model and allow it to provide context-aware assistance throughout your life. The most natural place to put these sensors is in glasses, because that ensures cameras are looking in the direction of a person’s gaze. Stereo microphones on eyewear (or earbuds) can also capture the soundscape with spatial fidelity, allowing the AI to know the direction that sounds are coming from — like barking dogs, honking cars and crying kids. In my opinion, the company that is currently leading the way to products in this space is Meta. Two months ago they began selling a new version of their Ray-Ban smart glasses that was configured to support advanced AI models. The big question I’ve been tracking is when they would roll out the software needed to provide context-aware AI assistance.


Google flaunts concurrency, optimization as cloud rivals overhaul platforms

Kazmaier explains that Google’s approach to concurrency avoids spinning up more virtual machines and instead improves performance on a sub-CPU level unit. “It moves these capacity units seamlessly around, so you may have a query which is finishing and freeing up resources, which can be moved immediately to another query which can benefit from acceleration. All of that micro-optimization takes place without the system sizing up. It's constantly giving you the ideal projection of the capacity you use on the workloads you run,” he says. A paper from Gartner earlier last year approved of the approach. "A mix of on-demand and flat-rate pricing slot reservation models provides the means to allocate capacity across the organization. Based on the model used, slot resources are allocated to submitted queries. Where slot demand exceeds current availability, additional slots are queued and held for processing once capacity is available. This processing model allows for continued processing of concurrent large query workloads," it says.
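The slot-based processing model Gartner describes can be sketched as a toy scheduler: queries draw slots from a shared pool, queue when demand exceeds capacity, and start as soon as finishing queries free slots. Slot counts and query names below are purely illustrative.

```python
from collections import deque

class SlotPool:
    """Toy model of slot-based query scheduling."""
    def __init__(self, total_slots):
        self.free = total_slots
        self.waiting = deque()   # (query_id, slots_needed), FIFO
        self.running = {}        # query_id -> slots held

    def submit(self, query_id, slots_needed):
        if self.free >= slots_needed:
            self.free -= slots_needed
            self.running[query_id] = slots_needed
        else:
            self.waiting.append((query_id, slots_needed))  # held until capacity frees

    def finish(self, query_id):
        # Freed slots move immediately to the next queued queries that fit.
        self.free += self.running.pop(query_id)
        while self.waiting and self.waiting[0][1] <= self.free:
            qid, n = self.waiting.popleft()
            self.free -= n
            self.running[qid] = n

pool = SlotPool(total_slots=100)
pool.submit("q1", 60)
pool.submit("q2", 60)        # queued: only 40 slots free
pool.finish("q1")            # q2 starts as soon as q1's slots free up
print(sorted(pool.running))  # ['q2']
```

The micro-optimization Kazmaier describes goes further, reassigning capacity at sub-CPU granularity while queries run, but the queue-until-capacity behavior is the same.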


As AI Advances, Who Is Looking to Its Architecture?

There is a case to be made, though, that enterprise architects have a much more fundamental role to play in our current phase of technological evolution than simply implementing its advancements into our workflows. AI solutions must seek to enhance the role of the enterprise architecture and their productivity, not attempt to supplant it. Standards are important not just because they enable collaboration, but because they build consensus. A successful standard draws on the insights and expertise of the whole community of practitioners which needs to use it. In that process, many conversations are had – and occasionally quite fraught ones – in the interest of finding a common understanding of what a good, mature, responsible, successful approach looks like. One that puts the human at the center of the decision loop. The point of listing so many of AI’s potential positive outcomes earlier in this article was not just to emphasize how dramatic and wide-ranging its impact could be. 



Quote for the day:

"People often say that motivation doesn't last. Well, neither does bathing - that's why we recommend it daily." -- Zig Ziglar

Daily Tech Digest - January 02, 2024

Decoding the Black Box of AI – Scientists Uncover Unexpected Results

“If the GNNs do what they are expected to, they need to learn the interactions between the compound and target protein and the predictions should be determined by prioritizing specific interactions,” explains Prof. Bajorath. According to the research team’s analyses, however, the six GNNs essentially failed to do so. Most GNNs only learned a few protein-drug interactions and mainly focused on the ligands. Bajorath: “To predict the binding strength of a molecule to a target protein, the models mainly ‘remembered’ chemically similar molecules that they encountered during training and their binding data, regardless of the target protein. These learned chemical similarities then essentially determined the predictions.” According to the scientists, this is largely reminiscent of the “Clever Hans effect”. This effect refers to a horse that could apparently count. How often Hans tapped his hoof was supposed to indicate the result of a calculation. As it turned out later, however, the horse was not able to calculate at all, but deduced expected results from nuances in the facial expressions and gestures of his companion.


Why 2024 is the year for IT managers to revamp their green IT plans

Conversations with enterprise operators reveal that many do not consolidate applications when they refresh their IT equipment and are hesitant to deploy power-aware workload management tools out of concern for impacting the reliability of their IT operations. IT managers must guide their organisations to intelligently utilise their available equipment capacity, using software tools to measure, manage, and maximise utilisation within reliability constraints. All organisations should set equipment utilisation goals and build multi-year efficiency project plans to improve IT infrastructure energy performance. Data available from IT equipment manufacturers indicate that workloads on two to eight old servers can be migrated to one new server when deploying n+1 (e.g., Intel or AMD CPU generation 3 to 4) or n+2 (e.g., Intel or AMD CPU generation 2 to 4) technology. Similar improvements can be achieved in storage and network equipment. Consolidating CPU workload and storage and network operations delivers a defined workload on three-quarters to one-half of the equipment.
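The consolidation arithmetic above reduces to a ceiling division; the fleet sizes and the 4:1 ratio below are purely illustrative, chosen from the 2:1 to 8:1 range the manufacturers' data suggests.

```python
def consolidated_fleet(old_servers, consolidation_ratio):
    """Servers needed after migrating the workloads of
    `consolidation_ratio` old servers onto each new one (rounded up)."""
    return -(-old_servers // consolidation_ratio)  # ceiling division

# Hypothetical refresh: 400 old servers at a 4:1 n+2-generation ratio.
print(consolidated_fleet(400, 4))  # 100
```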


AI Everywhere, All the Time: Top Developments of 2023

AI-powered robots are increasingly automating tasks in manufacturing and logistics, among other industries, driving efficiency and changing the nature of work. Some major milestones include Tesla's Optimus Bot prototype demonstrating dexterous and adaptable humanoid robots, which could shape future automation solutions. Separately, Boston Dynamics' Atlas showcased its parkour skills, paving the way for applications in search and rescue or disaster response. The AlphaFold 2 AI system, developed by Alphabet subsidiary DeepMind, can perform predictions of protein structure, and stands to revolutionize drug discovery and personalized medicine, carrying the potential for helping mitigate numerous diseases. Robotic surgery systems grow ever more sophisticated, while AI-powered prosthetics offer amputees greater control and functionality. AI algorithms are already assisting doctors in medical diagnosis for diseases such as cancer, offering increased accuracy and early detection possibilities. 


How Gamification Can Help Your Business

At work, gamification is often used to build employee experience by promoting fun competition and immersive learning experiences, leading to better information retention and a heightened incentive to engage in ongoing learning and upskilling, Ringman says. Gamification is frequently used to boost staff productivity. “In any business, there are many things that need to be done every day that many of us aren’t naturally motivated to do,” Avila observes. Gamification provides helpful context, guidance, and rewards, allowing tasks to be completed faster and more efficiently while improving focus. “This, in turn, helps the company achieve larger business goals.” Brands can also tap into gamification as they strive to engage customers and transform ordinary interactions into memorable experiences. Ringman notes that brands can use gamification to add extra fun to loyalty programs by hosting contests and competitions, as well as awarding virtual badges and trophies to customers as they complete various actions or pass significant milestones.


Envisioning a great future – India as a SuperPower

A nation’s growth is underpinned by technological advancement and how swiftly it adopts tech. During the recent state visit of India’s Prime Minister, Narendra Modiji, to the United States, he put a lot of emphasis on growing technology that will revolutionize various industries. India is fast moving towards digitization. The thrust from the Government of India with the Digital India initiative, and the growing use of digital technologies such as Artificial Intelligence, Machine Learning and Data Analytics across private organisations, is bringing a phenomenal shift in India’s growth and development. Secondly, there is going to be a lot of disruption in the way we work. With AI, much work will be done by bots, so it is important to have highly skilled labor to manage the AI, which will also require upskilling the workforce even as we gain more leisure. The way society works will change, and it will need to be adaptable. AI tools can be used as add-on tools to enable our lawyers, CAs, economists and leaders at large. Today, India is a force to be reckoned with in the domain of Information Technology, without an iota of doubt.


Wi-Fi 7’s mission-critical role in enterprise, industrial networking

Wi-Fi 7 devices can use multi-link operation (MLO) in the 2.4 GHz, 5 GHz, and 6 GHz bands to increase throughput by aggregating multiple links, or to quickly move critical applications to the optimal band via seamless switching between links. Fast link switching allows Wi-Fi 7 devices to avoid interference and access Wi-Fi channels without delaying critical traffic. This and other new features also make Wi-Fi 7 ideal for immersive XR/AR/VR, online gaming, and other consumer applications that require high throughput, low latency, minimal jitter, and high reliability. ... Naturally, there are challenges in achieving seamless connectivity between 5G and Wi-Fi. A lot of industry alignment is needed to enable frictionless movement between networks, across technologies, vendors, and areas such as authentication, QoS, QoE, and security. The Wireless Broadband Alliance is playing a key role in bringing all the stakeholders (operators, enterprises, and network owners) together to ensure collaboration and alignment on the frameworks that will deliver seamless connectivity.
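The two MLO modes described above can be sketched as a toy model. This is purely illustrative: the band names, metrics, and functions below are hypothetical and do not correspond to any real Wi-Fi driver or chipset API.

```python
# Toy model of Wi-Fi 7 multi-link operation (MLO). Illustrative only:
# the Link fields and both functions are hypothetical, not a real API.
from dataclasses import dataclass

@dataclass
class Link:
    band: str              # e.g. "2.4GHz", "5GHz", "6GHz"
    throughput_mbps: float # nominal link rate
    interference: float    # 0.0 (clean) .. 1.0 (fully congested)

def aggregate_throughput(links):
    """Link aggregation: bulk traffic uses all links simultaneously,
    each contributing its interference-discounted capacity."""
    return sum(l.throughput_mbps * (1 - l.interference) for l in links)

def best_link(links):
    """Fast link switching: latency-critical traffic moves to the band
    with the highest effective (interference-discounted) throughput."""
    return max(links, key=lambda l: l.throughput_mbps * (1 - l.interference))

links = [
    Link("2.4GHz", 300, 0.5),   # congested legacy band
    Link("5GHz", 1200, 0.25),
    Link("6GHz", 2400, 0.0),    # clean band
]
print(aggregate_throughput(links))  # 3450.0 combined effective Mbps
print(best_link(links).band)        # 6GHz chosen for critical traffic
```

In this sketch, a bulk transfer would see the combined effective capacity of all three links, while an XR session would be steered to the clean 6 GHz link alone.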


Data Center Governance Trends to Watch in 2024

Historically, data center governance did not drive frequent conversations in the data center industry. Data center operators sometimes talked about it, but it has not tended to be a core area of concern – perhaps because, unlike other types of governance, data center governance isn't a requirement for businesses seeking to meet regulatory rules or avoid compliance fines. Looking ahead, however, governance in data centers is likely to become a more common item of discussion. Data centers have now matured to the point that businesses are increasingly keen to squeeze as much efficiency as possible out of them. In the past, disorganized data center assets or suboptimal server room layouts may not have been critical. But today, data center operators face growing pressure to maximize the efficiency of their facilities. Certain regulators are now requiring disclosures about data center emissions, for example, meaning that increasing energy efficiency through effective governance practices has become important for protecting businesses' brands and reputations.


Essential skills for today’s threat analysts

Very often, for instance, there's an urgent need to communicate a new vulnerability to different audiences, which demands tailored communications for technical teams, CISOs, and board members. Williams highlights task management and patience, especially when dealing with uncertain or misleading information, and above all, coordinating between different sources of information. "So much of threat hunting today relates to that living off the land kind of thing where you're seeing things that look malicious. And so oftentimes you’re developing hypotheses and that involves consulting system admin and working toward a resolution," says Williams. ... It's also a mind game, with threat hunters needing to be highly adaptable as threats change daily, sometimes hourly. "You need to change with them. Never allow an inflexible mind to pervade your operational approach," says Brian Hussey, VP of threat hunting, intelligence and DFIR at SentinelOne. At the same time, you also need to see the forest for the trees. "Often threat actors introduce surface changes to their attack patterns, but core modus operandi remains unchanged, leaving important opportunities to identify and eliminate new attacks, even before they arrive," Hussey tells CSO.
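Hussey's point about unchanged modus operandi is the idea behind behavioral detection rules, which match on how a process behaves rather than on easily changed surface indicators like file hashes or names. The sketch below is a minimal, hypothetical illustration of that idea; the rule, field names, and events are invented for this example and are not drawn from any real detection product.

```python
# Illustrative sketch only: a hypothetical behavioral rule that keys on
# the parent/child process relationship and command-line pattern (the
# "core modus operandi") instead of file hashes, which attackers rotate.
import re

BEHAVIOR_RULES = [
    # An Office app spawning a shell that downloads or decodes content:
    # the payload's hash and name change often, this pattern rarely does.
    {"parent": r"winword\.exe|excel\.exe",
     "child": r"powershell\.exe|cmd\.exe",
     "cmdline": r"downloadstring|invoke-webrequest|-enc"},
]

def flag_event(event):
    """Return True if a process event matches any behavioral rule."""
    for rule in BEHAVIOR_RULES:
        if (re.search(rule["parent"], event["parent"], re.I)
                and re.search(rule["child"], event["child"], re.I)
                and re.search(rule["cmdline"], event["cmdline"], re.I)):
            return True
    return False

suspicious = {"parent": "WINWORD.EXE",
              "child": "powershell.exe",
              "cmdline": "powershell -enc JABjAGwA..."}
print(flag_event(suspicious))  # True: behavior matches despite any renamed payload
```

Even if the attacker recompiles the payload or renames the script, the parent-child relationship and command-line intent still trip the rule, which is why behavior-level signatures survive surface changes.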


Want to tackle technical debt? Sell it as business risk

There is no magic potion that can eliminate all technical debt, but it can be attacked through budgeting if it is framed as more than just upgrading IT infrastructure. What CIOs need to do instead is present IT infrastructure investment as an important corporate financial and risk management issue that the business can’t afford to ignore. ... Technical budget justifications for IT infrastructure upgrades, which are seldom linked to end business strategies, make it easy for budget decision-makers to defer IT infrastructure investment. Instead, budget decision-makers figure that the company can “make do” because IT will somehow find a way to keep systems running. CIOs must change this thinking. They can start the process by changing IT infrastructure investment justifications from technical explanations to corporate financial and risk management explanations. ... CIOs should also team with the CFO to help reframe the tech debt narrative, because CFOs are always on the lookout for new corporate financial and risk management scenarios.


Leveraging Leadership: The Fourfold Path to Business Control

Belief systems function as a mechanism for communicating the core values, objectives, and mission of the organization, thus providing guidance and motivation to staff members. By encouraging people to improve their customer service through the inculcation of positive values, conduct, performance, and a feeling of inclusion, this lever supports the fulfillment of the organization's objectives. In the absence of a clearly defined belief system, employees are forced to rely on conjecture about the organization's intended behaviors and objectives. ... Without stifling individuals' capacity for innovation or entrepreneurship, this control mechanism permits the development of policies and standards that warn individuals away from undesirable behavior. Boundary systems implement regulations, codes of conduct, and predetermined strategic boundaries to delineate acceptable and unacceptable employee conduct, thereby establishing governing parameters. These boundaries clearly define the serious consequences of violating ethical principles and the outcomes that must be avoided.



Quote for the day:

"It is better to fail in originality than to succeed in imitation." -- Herman Melville