Daily Tech Digest - January 08, 2024

Can Generative AI and Data Quality Coexist?

Not only is it possible for generative AI and data quality to coexist, it’s imperative that they do, says Marinela Profi, AI strategy advisor for analytics software developer SAS, in an email interview. Data for AI is like food for humans, she notes. “Based on the quality of the food you feed your body and your brain, you will receive a certain quality of outputs, such as higher performance or more focus.” Simply put, if you’ve neglected the quality of your enterprise data, or haven’t defined a proper data strategy, you won’t get value out of generative AI, Profi says. “On the flip side, those who have implemented a strong data management discipline are uniquely positioned to gain a competitive advantage with generative AI.” ... New training techniques that require less data or that can learn more effectively from existing datasets might reduce the pressure on data quantity but increase the need for highly representative and unbiased data samples, Profi says. “Self-supervised and unsupervised learning techniques, in which models generate their own labels or learn from unlabeled data, reduce reliance on manually labeled datasets,” she notes.


10 top priorities for CIOs in 2024

We are in a cybersecurity pandemic right now, warns Juan Orlandini, CTO for North America at solutions and systems integrator Insight Enterprises. He encourages CIOs to focus on cyber preparedness to ensure they’ve done everything possible to prevent assaults. “Attackers are getting more sophisticated, and all it takes is one mistake for them to get in,” Orlandini notes. “Assume that attacks are inevitable.” Work toward having the right cybersecurity team in place, Orlandini advises. “This could be an in-house team or trusted advisors who can make sure you’ve done what you can to protect yourself.” ... Caldas believes it’s important to renew and maintain existing workforce competencies as well as to establish a high-performance culture that’s ready to deliver results in today’s fast-paced technology ecosystem. “We’re shifting away from building development plans based on job profiles alone and are now pivoting to build plans on top of a foundation of skills,” she states. “Skill-building will inform how we provide training as well as how team members can grow their careers.”


Insights From Quantum Computing Could Create Light-Controlled Memory Tech

The discovery is tightly linked to the realm of quantum technologies, and it combines principles from two scientific communities that so far had little overlap: “We arrived at this understanding by using principles that are well established within the quantum computing and quantum optics communities but less so in the spintronics and magnetism communities.” The interaction between a magnetic material and radiation is well established when the two are in perfect equilibrium. However, the situation where both the radiation and the magnetic material are out of equilibrium has so far been described only partially. This non-equilibrium regime is at the core of quantum optics and quantum computing technologies. From our examination of this non-equilibrium regime in magnetic materials, while borrowing principles from quantum physics, we have established the fundamental understanding that magnets can respond even on the short time scales of light. Moreover, the interaction turns out to be very significant and efficient.


IoT Data Governance: Taming the Deluge in Connected Environments

IoT environments typically comprise diverse devices and systems, often operating on different standards and protocols. Data Governance must address the challenge of integrating this disparate data to ensure seamless interoperability. This requires establishing common data models and communication protocols. For example, integrating data from wearable devices, electronic health records, and diagnostic equipment in a healthcare IoT setup is crucial for comprehensive patient monitoring and care. IoT Data Governance must also include policies for data storage and lifecycle management. This involves determining how long data should be stored and when it should be archived or deleted, as well as ensuring that storage solutions are scalable and cost-effective. Likewise, it’s important to understand that proper Data Governance doesn’t relate only to touchpoints between the device, the network, and the cloud. Instead, proper security protocols must be applied throughout the organization. From integrated document editors and AI assistants to plugins and VPNs, everything must be airtight.
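The retain-then-archive-then-delete lifecycle described above can be sketched as a simple policy check. This is a minimal illustration, not a production governance engine; the data classes and retention thresholds are invented for the example.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: class names and day counts are illustrative,
# not drawn from any specific regulation or standard.
RETENTION_POLICY = {
    "wearable_telemetry": {"hot_days": 30, "archive_days": 365},
    "diagnostic_imaging": {"hot_days": 90, "archive_days": 3650},
}

def lifecycle_action(data_class: str, created_at: datetime, now: datetime) -> str:
    """Return 'retain', 'archive', or 'delete' based on record age."""
    policy = RETENTION_POLICY[data_class]
    age = now - created_at
    if age <= timedelta(days=policy["hot_days"]):
        return "retain"          # keep in primary (hot) storage
    if age <= timedelta(days=policy["archive_days"]):
        return "archive"         # move to cheaper, scalable cold storage
    return "delete"              # past the governed retention window
```

A real implementation would run such a check as a scheduled job against the storage layer, but the decision logic is the same.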


5 Reasons Not to Use Serverless Computing

Serverless computing is cost-efficient in the sense that you only pay for the time your workloads are active. However, the per-minute cost of serverless is almost always higher than the cost of running an equivalent workload on a VM. For this reason, serverless may result in greater total costs than other types of cloud services, especially for workloads that are active most of the time. ... Each serverless computing platform works in a different way. That makes it challenging to migrate workloads from one serverless environment (like AWS Lambda) to another (such as Azure Functions). By comparison, the differences between other types of cloud services (such as AWS EC2 and Azure Virtual Machines) are less pronounced, leading to lower levels of lock-in when you use those services. ... Although serverless workloads theoretically run on-demand, in practice there is typically a delay between when serverless code is triggered and when it actually runs. This is especially true in the case of "cold starts," which happen when serverless code hasn't run recently. Sometimes, the delays in startup time can be a second or longer.
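The cost trade-off above is easy to make concrete with back-of-the-envelope arithmetic. The rates below are hypothetical placeholders, not actual AWS or Azure pricing; the point is the break-even logic, not the numbers.

```python
# Hypothetical rates -- substitute real pricing for your provider.
VM_COST_PER_HOUR = 0.10             # always-on VM, billed whether busy or idle
SERVERLESS_COST_PER_MINUTE = 0.005  # billed only while the workload is active

def monthly_cost_vm(hours_in_month: float = 730.0) -> float:
    """An always-on VM pays for every hour in the month."""
    return VM_COST_PER_HOUR * hours_in_month

def monthly_cost_serverless(active_minutes: float) -> float:
    """Serverless pays only for active execution time."""
    return SERVERLESS_COST_PER_MINUTE * active_minutes

def serverless_is_cheaper(active_minutes_per_month: float) -> bool:
    return monthly_cost_serverless(active_minutes_per_month) < monthly_cost_vm()
```

With these example rates, the VM costs $73/month, so serverless wins only below about 14,600 active minutes (roughly a 33% duty cycle) — consistent with the point that mostly-active workloads favor VMs.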


Industry 4.0 at Scale Can Make Smart Manufacturing Pervasive

We're barely scratching the surface to realize the potential of generative AI. Consider a worker's typical task: locating and utilizing manuals to address issues at a workstation. This process usually takes hours, sifting through pages and troubleshooting cryptic error codes. Now, picture digitizing these manuals, integrating them into a platform such as Amazon Bedrock, and overlaying a conversational interface such as Anthropic's Claude. With this setup, a worker standing before a machine encountering an error can simply input the error code and receive a clear explanation of the problem. This saves an immense amount of time, directly translating to cost savings and increased uptime. But the real potential lies in integrating this knowledge into workflows seamlessly. Imagine moving from receiving an error message to being provided with repair instructions while simultaneously updating maintenance records and checking inventory for required spare parts - all automatically. This transformative capability eliminates the need for manual intervention, which currently involves hours of inventory checks, finding personnel for repairs, and managing associated logistics.
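The error-code-to-explanation flow described above boils down to retrieving the relevant manual passage and handing it to a model with the question. The sketch below is a hypothetical illustration: the error codes, manual snippets, and prompt wording are invented, and a real system would send the prompt to a hosted model via an API client rather than just returning it.

```python
# Invented digitized-manual index: error code -> relevant manual excerpt.
MANUAL_INDEX = {
    "E042": "Hydraulic pressure below threshold; inspect pump seal P-7.",
    "E113": "Spindle over-temperature; verify coolant flow sensor.",
}

def build_repair_prompt(error_code: str) -> str:
    """Assemble the context an LLM would need to explain a machine fault."""
    snippet = MANUAL_INDEX.get(error_code, "No manual entry found.")
    return (
        f"Machine reported error {error_code}.\n"
        f"Relevant manual excerpt: {snippet}\n"
        "Explain the likely fault and list repair steps for a technician."
    )
```

The same prompt-assembly step is where the workflow integration happens: the code that looks up the manual excerpt can also update maintenance records and query spare-parts inventory before the technician even reads the answer.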


Embracing the future: Top 5 home automation trends for 2024

As we strive towards a sustainable future, the inclusion of energy-saving devices has become a priority. Energy-efficient smart home devices include unique features that make energy conservation easier than ever. Lighting systems, for example, independently dim or switch off when rooms are unused, saving electricity and extending bulb life. Schedules that switch off lights and other devices at a preset time every day also eliminate wasted energy. By incorporating energy-efficient smart home gadgets into daily life, people can not only reduce their environmental impact but also benefit from lower electricity costs. ... Artificial intelligence (AI) is stepping in to make the lives of people even more convenient and efficient. Virtual assistants powered by AI, such as Amazon Alexa and Google Assistant, have already become commonplace in many homes. These smart companions can answer inquiries and provide weather updates, as well as operate smart gadgets and much more. In the coming years, AI will become more important in home automation. Your home can learn your habits and adjust to create the perfect atmosphere.


AI Will Create Demand and Empower Developers, Not Replace Them

AI takes care of the annoying, tedious, routine tasks that may otherwise take up a significant amount of developers’ time, so they’re able to better concentrate on the real work at hand. In fact, 92% of developers are already using AI to lighten their load. ... AI can do this in seconds. All a developer needs to do is tell the technology what they want to accomplish and what language they want to use, and they can get some understanding of the best approach to their problem. Learning: Although there’s a need to check for accuracy, AI can help developers understand code snippets and programming concepts without having to do the research themselves. Documentation: Nobody likes documentation. It’s tedious and difficult. However, AI-assisted documentation can help bring attention to things that didn’t work during the development process while reducing development times afterward. Code quick-starts: This gives a major leg up to developers who have an idea but don’t know exactly where to begin. AI can generate code within seconds, regardless of the language. Even if the parameters of the project need a review, it gives you a head start.


Future challenges and innovations in cloud security platforms

The role of identity and access management systems in cloud security has evolved from simple gatekeeping to intelligent filtering. These systems now serve as sophisticated sentinels equipped to discern legitimate users from intruders. They’re no longer static barriers but dynamic shields, adapting to new threats and user behaviors. The advancement in these systems means they can now offer personalised access based on user roles and context, adding a layer of security that’s both smart and user-centric. Essentially, they act as the first line of defence, ensuring that only the right people can access the right data at the right time. ... The adage “it takes a village” holds particular relevance in cloud security, emphasising the importance of collective effort and cooperation in this field. Achieving unbreakable security is a collaborative mission, relying not on individual effort but on collective strength and teamwork. This collaboration transcends organizational boundaries, bringing together experts, companies, and competitors to forge a united front against cyber threats.
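The "right people, right data, right time" idea above is essentially a context-aware access decision: role, resource, time, and network context are all checked together. The sketch below is a minimal illustration with invented roles, resources, and rules, not a real IAM policy engine.

```python
from datetime import time

# Invented role-to-resource grants for the example.
ROLE_RESOURCES = {
    "analyst": {"reports"},
    "admin": {"reports", "user_db"},
}

def allow_access(role: str, resource: str, request_time: time,
                 ip_trusted: bool) -> bool:
    """Grant access only when role, resource, hour, and network all check out."""
    in_business_hours = time(8, 0) <= request_time <= time(18, 0)
    return (resource in ROLE_RESOURCES.get(role, set())
            and in_business_hours
            and ip_trusted)
```

Production IAM systems evaluate far richer context (device posture, behavioral baselines, geolocation), but the shape of the decision — conjunction of identity and context signals — is the same.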


Social engineer reveals effective tricks for real-world intrusions

My main social engineering trick is just walking into a location like you belong there. People underestimate how far confidence will get you into a location and how unsuspecting people are when they feel secure. I’ve always said the only thing worse than no security is a false sense of security, because it is tough to imagine something terrible will happen when you have that false sense of security. One of the main tricks I use when running a phishing attack is not to tell targets that something positive has happened. I always make the topic of the e-mail something unfortunate, something that may be a mistake, something that has happened that is important and, if not fixed immediately, could have dire consequences. People are very suspicious when they get an e-mail that something good has happened or will happen to them. Still, throughout history, humans have craved information almost at any cost when they felt a threatening situation was unfolding around them. They need to discover what is happening and how it could affect them.



Quote for the day:

"Ambition is the path to success. Persistence is the vehicle you arrive in." -- Bill Bradley

Daily Tech Digest - January 07, 2024

2024 cybersecurity forecast: Regulation, consolidation and mothballing SIEMs

CISOs’ jobs are getting harder. Many are grappling with an onslaught of security threats, and now the legal and regulatory stakes are higher. The new SEC cybersecurity disclosure requirements have many CISOs concerned they’ll be left with the liability when an attack occurs. ... After the Cyber Resilience Act, policymakers and developers drive adoption of security-by-design. The CRA wisely avoided breaking the open source software ecosystem, but now the hard work starts: helping manufacturers adopt modern software development practices that will enable them to ship secure products and comply with the CRA, and driving public investment in open source software security to efficiently raise all boats. ... With the increase in digital business-as-usual, cybersecurity practitioners are already feeling lost in a deluge of inaccurate information from a mushrooming number of cybersecurity solutions, coupled with a lack of cybersecurity architecture and design practices, resulting in porous cyber defenses.


Expert Insight: Adam Seamons on Zero-Trust Architecture

Zero trust goes beyond restricting access by need to know and the principle of least privilege. It’s about properly verifying access and being 110% certain that the access is legitimate. That means things like limiting access to specific criteria, such as by port or protocol, time period, IP address and/or physical location. ... A zero-trust network is about verification or double-checking. You want to be verifying not just the person, but also the device and limiting that access to specific permissions and rights that have been approved in advance. And you’re also restricting data access, particularly in situations like the example I just gave. Think of it like the difference between a key to the front door that gives you access to the whole house, and needing a key for the front door as well as separate keys for all the different rooms. ... AI and machine learning have both been used in detecting anomalies and suspicious patterns for some time, and will only continue to be used more. I expect SOCs to become increasingly reliant on AI. Getting more specific, log analysis is a key area for AI to automate. 
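The "separate keys for all the different rooms" analogy maps naturally to per-connection verification: each request is checked against pre-approved user-device-connection tuples rather than trusting anything that got past the front door. The sketch below is a toy illustration with invented users, devices, and permissions, not a real zero-trust product.

```python
# Invented pre-approved grants: (user, device) -> allowed (protocol, port) pairs.
# In a real deployment these would come from a policy store and be combined
# with time-window and location checks, as described above.
APPROVED = {
    ("alice", "laptop-123"): {("tcp", 443), ("tcp", 22)},
}

def zero_trust_allow(user: str, device_id: str, protocol: str, port: int) -> bool:
    """Verify the person AND the device AND the specific connection details."""
    permitted = APPROVED.get((user, device_id), set())
    return (protocol, port) in permitted
```

Note that a valid user on an unapproved device is refused just like an unknown user — that is the double-checking the passage describes.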


6 innovative and effective approaches to upskilling

Beverage maker Torani has been mixing up L&D by flipping the traditional performance review — which can be “demoralizing” — on its head. It puts the onus on future rather than past performance and on employee learning aspirations, rather than manager assessment. ... Devine adds: “With today’s shift to agile working, some firms believe yearly performance objectives and appraisals are insufficient and inflexible. They need something more frequent, nimble, and focused on feedback, skills and future needs. But you still need managers to assess performance to justify and provide transparency on promotions and pay decisions.” ... Microsoft is helping workers across its organization gain skills related to AI — from non-techies to IT professionals and leaders. Simon Lambert, chief learning officer at Microsoft UK, says: “One lesson we’ve learned from our AI learning journey is that upskilling means far more than merely equipping employees with skills. It requires an ecosystem that fosters adaptability and continuous learning. In the face of AI-upskilling demand, employees need faster, seamless access to learning infrastructure.”


2024 Data Center Un-Predictions: Five Unlikely Industry Forecasts

The potential impact of data centers on local communities is an important issue. At a recent conference in Virginia, we had activists from the community right alongside data center leaders to discuss the challenges and opportunities we face. While there were still some disconnects, we met in the middle on some critical topics around power, community engagement, and ensuring we create a more sustainable future. ... Leveraging self-driving technology, robots independently chart and traverse the data center, gathering real-time sensor data. This lets them immediately juxtapose present patterns against pre-defined norms, facilitating swift identification of deviations for human examination. In an ever more interconnected and intricate environment, this robotic technology grants decision-makers enhanced visibility, rapidity, and a breadth of intelligence that surpasses what humans or stationary cameras can provide. This advanced capability is vital for maintaining the efficiency and security of data centers in our increasingly digital world.


US DOD’s CMMC 2.0 rules lift burdens on MSPs, manufacturers

The proposed rules also let manufacturers off the hook for complying with NIST SP 800-171. SP 800-171 is a set of NIST cybersecurity rules to protect sensitive federal information. “The requirements of the 800-171 set of cyber standards are designed for IT networks and information systems,” Metzger says. “They were never really designed for a manufacturing environment. It’s now said clearly in the proposed rules that the assessments won’t apply to operational technology." "That, to me, should cause manufacturers to breathe a huge sigh of relief because being required to meet NIST standards that simply don’t fit a manufacturing or OT environment is a recipe for trouble of many forms," Metzger says. “The most important change is what did not change. The document has essentially the same structure and strategy that was in 1.0. It requires third-party assessments for a very large number of defense suppliers.” The proposed version 2.0 of the CMMC rules was published in the Federal Register December 26. Interested parties have until February 26 to file comments with the DOD before the agency finalizes the rules.


Banking Innovation is Paramount Even as Regulatory and Competitive Pressures Mount

Guiding technology-forward regulations can empower banks to harness innovation, enhancing security, transparency, and customer value. Regulators should seek thoughtful oversight that encourages innovation while safeguarding against excessive risks instead of attempting to prevent the recurrence of a once-in-a-century financial crisis. Banks face a growing challenge to their market share from alternative lending platforms, which poses an existential threat, as noted in McKinsey’s 2023 Global Banking Annual Review. Over 70% of the growth in global financial assets since 2015 has shifted away from traditional bank lending, finding its way into private markets, institutional investors and the realm of “shadow banking.” Near-zero interest rates have enabled private equity firms and non-bank lenders to offer lower-cost loans. With its digitally savvy consumer base, the fintech sector has further accelerated this transition, particularly during the pandemic.


IT and OT cybersecurity: A holistic approach

As OT becomes more interconnected, the need to safeguard OT systems against cyber threats is paramount. Many cyber threats and vulnerabilities specifically target OT systems, which emphasizes the potential impact on industrial operations. Many OT systems still use legacy technologies and protocols that may have inherent vulnerabilities, as they were not designed with modern cybersecurity standards in mind. They may also use older or insecure communication protocols that may not encrypt data, making them susceptible to eavesdropping and tampering. Concerns about system stability often lead OT environments to avoid frequent updates and patches. This can leave systems exposed to known vulnerabilities. OT systems are not immune to social engineering attacks either. Insufficient training and awareness among OT personnel can lead to unintentional security breaches, such as clicking on malicious links or falling victim to social engineering attacks. Supply chain risks also pose a threat, as third-party suppliers and vendors may introduce vulnerabilities into OT systems if their products or services are not adequately secured.
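The legacy-protocol risk called out above — communication that does not encrypt data in transit — is something an asset inventory can surface mechanically. The sketch below is a simplified illustration; the asset records and the cleartext-protocol classification are examples only, not a complete taxonomy of OT protocols.

```python
# Protocols commonly deployed without transport encryption (illustrative set).
CLEARTEXT_PROTOCOLS = {"telnet", "ftp", "modbus", "http"}

def flag_insecure(assets: list) -> list:
    """Return names of inventory assets whose protocol sends data in the clear,
    making them candidates for eavesdropping and tampering."""
    return [a["name"] for a in assets if a["protocol"] in CLEARTEXT_PROTOCOLS]

inventory = [
    {"name": "plc-1", "protocol": "modbus"},
    {"name": "hmi-2", "protocol": "https"},
    {"name": "legacy-gw", "protocol": "telnet"},
]
```

In practice such a report feeds the compensating-control discussion — segmentation, protocol gateways, or monitored enclaves — since patching or replacing legacy OT gear is often infeasible, as the passage notes.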


Exploring the Future of Information Governance: Key Predictions for 2024

In today’s rapidly evolving digital landscape, information governance has become a collective responsibility. Looking ahead to 2024, we can anticipate a significant shift towards closer collaboration between the legal, compliance, risk management, and IT departments. This collaborative effort aims to ensure comprehensive data management and robust protection practices across the entire organization. By adopting a holistic approach and providing cross-functional training, companies can empower their workforce to navigate the complexities of information governance with confidence, enabling them to make informed decisions and mitigate potential risks effectively. Embracing this collaborative mindset will be crucial for organizations to adapt and thrive in an increasingly data-driven world. ... Blockchain technology, with its decentralized and immutable nature, has the tremendous potential to revolutionize information governance across industries. By 2024, as businesses continue to recognize the benefits, we can expect a significant increase in the adoption of blockchain for secure and transparent transaction ledgers.


Data Professional Introspective: Demystifying Data Culture

We are discussing data culture from several points of view: what new content should be added to the DCAM, where would it fall within the current framework structure, what changes we propose to that structure, what modifications should be made to existing content, how the new/modified content would be assessed, and so on. ... One can begin decomposing data culture from a high-level vision, which summarizes what the organization has accomplished when it can feel confident in asserting that, “We have a strong data culture.” One can also compile a collection of activities and behaviors that demonstrate a developed data culture, and then categorize them and parse them into the DCAM. Or, one can apply a combination of the two approaches, which is the path the working group has followed. The working definition posited to date includes a summary description of a strong data culture: “A strong data culture promotes data-driven decision-making, data transparency, and the alignment of data and analytics to business objectives. It prioritizes strategic data use and encourages sharing and collaboration around data.”


Tackling technical debt in the insurance industry

The impact of technical debt on insurers spans various dimensions. Data inefficiencies arise, leading to compliance issues and difficulties in recruiting and retaining talent. Outdated processes hinder optimal decision-making, impacting both established and newer insurers. Addressing technical debt requires insurers to foster a culture of change, emphasising the risks of neglecting this issue and aligning strategies with broader organisational objectives. Tackling technical debt involves immediate action, prioritised backlog creation, and adaptive development processes. Insurers are advised to navigate technical debt through a combination of incremental and transformational changes. Incremental adjustments and breakthrough advancements should complement comprehensive restructuring efforts for sustained and effective resolution. The roadmap to a resilient, innovative future in insurance hinges on proactive management of technical debt. Insurers must embark on their journey towards pricing transformation to remain competitive and future-ready.



Quote for the day:

"In the end, it is important to remember that we cannot become what we need to be by remaining what we are." -- Max De Pree

Daily Tech Digest - January 06, 2024

FTC offers $25,000 prize for detecting AI-enabled voice cloning

Through the Voice Cloning Challenge, the FTC aims to find a solution that can identify cases of voice cloning with the help of generative AI. The agency calls it “an exploratory challenge” that could potentially provide a direction for the risk mitigation effort. The winning proposal will receive $25,000 and the runner-up will get $4,000. There are up to three honorable mentions, each awarded with $2,000. On January 2nd, the agency started accepting submissions via an online portal and will receive ideas for 10 days, until January 12, 08:00 PM EST. Submissions must include a one-page overview of the proposal and a detailed description of up to 10 pages. Participants may also include a video to show how their idea works. All submissions will be judged based on their practical feasibility, impact on corporate accountability and burden on the consumer, and resilience to rapid technological advancements in the field. Should the challenge fail to yield any effective defense ideas, FTC notes that the effort will serve as an early warning for policymakers and would highlight the need for more stringent regulations on the use of AI technology.


Building a Great Security Operations Center

Without a defined SOC strategy, security leaders may struggle to prioritize resources. A strategy provides direction based on various inputs such as the threat landscape, regulatory requirements and threat assessments specific to the organization. In the context of an SOC, the primary objective of the SOC strategy should be to avoid a situation where the cost and effort is high and the value and return on investment (ROI) is low. The aim of the SOC strategy is to ensure that the SOC effectively fulfils its function and, in doing so, helps the organization to fulfil its overall business objectives. A well-architected SOC provides a positive ROI by minimizing potential financial losses due to cyber incidents. At the same time, an SOC enhances an organization’s ability to detect and respond to cyber threats in real time, safeguarding sensitive data and protecting the organization’s reputation. Therefore, compliance, ROI and risk reduction are interconnected. Although it is easy to get carried away with generic cybersecurity use cases, the development of business-aligned use cases is what separates average SOCs from great SOCs.


Is the vCISO Model Right for Your Organization?

It's getting harder to justify not having a CISO, so many businesses that have never had one are filling the gap with a virtual CISO (vCISO). A vCISO, sometimes referred to as a fractional CISO or CISO-as-a-service, is typically a part-time, outsourced security expert who helps businesses protect their infrastructure, data, personnel and customers. Depending on the needs of the company, vCISOs can work on-site or remotely, for the long term or short term. There are plenty of reasons why companies are going the vCISO route. Sometimes it's an internal crisis where a company's CISO has unexpectedly resigned and the board needs time to find a permanent new one. Other times it revolves around new regulatory or business requirements or a cybersecurity framework the company needs to adhere to, like NIST's Cybersecurity Framework 2.0. Sometimes a board member used to being briefed by the CISO may request a vCISO. "A smaller company might need a CISO but just a few days a week, and that type of delivery model is perfect for a vCISO," says Russell Eubanks, a vCISO who is also on the faculty of IANS Research and an instructor with SANS Institute.


Generative AI and Data Management: Transforming B2B Practices

Generative AI’s future in data management and analytics shines with promising trends to redefine data analysis methodologies. These trends encompass enhanced augmentation, deeper understanding and explanation, and the democratization of data analysis, presenting a transformative shift in how organizations harness data for insights and decision-making. Generative AI is poised to transcend traditional data visualization, evolving to augment the entire data analysis workflow. This evolution encompasses automated data exploration, hypothesis generation, data storytelling, and predictive analytics. AI’s capability to suggest patterns, relationships, and anomalies and generate comprehensive reports promises to revolutionize data-driven decision-making. The future of Generative AI goes beyond reporting events, delving into causality and explanations. The upcoming trends include causal inference, counterfactual analysis, and the integration of Explainable AI (XAI). These advancements ensure a profound understanding of underlying causes behind observed trends and transparent insights for users.


4 Strategies for Migrating Monolithic Apps to Microservices

For many organizations, taking a lift-and-shift approach is the first step for migrating monolithic applications to Kubernetes and microservices. This involves directly lifting the monolith onto hardware hosted in the cloud, and then gradually breaking down the app into microservices. However, the lift-and-shift philosophy has its challenges, as organizations must refactor monoliths to optimize them for the cloud. Therefore, it’s often more cost-effective to refactor an application service by service into a containerized architecture. ... Dependencies within monolithic apps are deeply intertwined. These close relationships among components are one of the driving forces behind the move to Kubernetes and microservices, as they hinder flexible changes and deployment. When migrating an application to a microservices architecture, it’s important for teams to understand all dependencies among services and to reduce and streamline them as much as possible. Asynchronous messaging is key, allowing services to communicate by sending and receiving messages using queues. 
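The asynchronous-messaging decoupling described above can be sketched with an in-process queue: the producing service publishes a message and moves on, and the consuming service picks it up when ready. Service names here are hypothetical, and a real migration would use a broker such as RabbitMQ or Kafka rather than Python's in-memory `queue.Queue`, but the decoupling pattern is the same.

```python
import json
import queue

# Stand-in for a message broker between two extracted microservices.
shipping_queue = queue.Queue()

def order_service_place_order(order_id: str) -> None:
    """Publish an event and return immediately -- no synchronous call into
    the shipping service, which breaks the monolith's tight coupling."""
    event = {"event": "order_placed", "order_id": order_id}
    shipping_queue.put(json.dumps(event))

def shipping_service_poll() -> dict:
    """Consume the next event whenever the shipping service is ready."""
    return json.loads(shipping_queue.get())
```

Because the producer never waits on the consumer, either service can be deployed, scaled, or taken down independently — exactly the flexibility the migration is after.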


Network Tokenization and Digital Identities Are Quietly Transforming Payment Security

Digital identities, through biometric data and multi-factor authentication, fortify the security of transactions. This not only protects users from identity theft but also strengthens the overall trustworthiness of digital payment systems. “We never really thought about, what does it mean to identify a person on the internet in a way that is portable and doesn’t require you to rely on a single private platform,” Mike Brock, CEO of TBD, a business from Block focused on open-source decentralized technologies, told PYMNTS. Digital identities play a crucial role in meeting regulatory requirements. By providing a secure and traceable means of verifying user identities, businesses can navigate compliance challenges more efficiently, reducing the complexities associated with anti-money laundering (AML) and know your customer (KYC) processes. “Combating Online Fraud With Digital Identification,” a PYMNTS Intelligence and Prove collaboration, finds that security is highly important for 83% of consumers, while 53% say consistent experiences across different platforms have a very or extremely big impact on their trust in financial institutions.


AI governance outlook: A Global South perspective

An under-regulated path for AI and emerging technologies may bring diverse negative outcomes. These outcomes may lead to a rise in inequality, loss of privacy, and ethical transgressions. By contextualising this through understanding the history of the industrial revolutions that brought drastic changes in people's social and economic lives and prioritising moral concerns, the G20 and GPAI member states can reduce negative results that will arise without the right steering and regulation. Despite the G20's significant influence and GPAI’s members’ technical expertise, many member states face issues with the digital divide, especially the unequal distribution of advanced technologies and their benefits. The divide deepens as AI development, mainly in developed markets, widens the gap between these countries and their developing counterparts in AI research and development (R&D). As per the AI Index Report 2023, private investment in AI from 2013-22 in the United States (US$250 billion) outpaced that of other economies including India, Japan, the United Kingdom (UK) and most of the other G20 nations.


At What Point Is Digital Transformation A Success?

“Digital transformation” sounds like an expensive, laborious slog. The good news is that most companies are likely closer to succeeding at it than they think. Getting in shape and digital transformation have a lot in common: planning, persistence and patience—with a lot of pragmatism—are the keys to achieving your goals. ... When you are in a new fitness regimen, have you “failed” because you’ve only lost 10 pounds of your 20-pound goal? Of course not. You celebrate your progress, and you keep working at it. In a digital transformation, each company’s goals and starting points are unique to their particular circumstances. As a result, based on the clients I work with daily, there are many ways to measure progress. ... In building a great company or social sector enterprise, there is no single defining action, no grand program, no one killer innovation, no solitary lucky break, no miracle moment. Rather, the process resembles relentlessly pushing a giant, heavy flywheel, turn upon turn, building momentum until a point of breakthrough, and beyond.


How to prepare for increased oversight of cybersecurity

The DORA and NIST CSF 2.0 frameworks and the new SEC rules can help speed up this process. However, companies can also develop best practices to better implement board oversight of cybersecurity risk. First, covered entities must start planning now for the structural and cultural changes these rules and regulations will require—they will take time to implement. When done right, a risk management program will educate and empower company leaders to understand and confidently accept, mitigate or transfer risk. Second, to promote this strong governance at the C-suite and board level, companies must educate their leadership on how to take a front seat in cyber strategy and governance. Rather than being an insulated organizational function, cyber risk management should be informed by a company’s business strategies, compliance landscape, and risk culture. Finally, it will be critical for organizations to understand specific roles and responsibilities and to maintain regular lines of communication. In addition to the board and other company leaders, security, communications, and legal teams should be involved in ongoing conversations around achieving a whole-of-business cyber governance strategy.


Optimizing PCI compliance in financial institutions

In practice, IT architectural patterns give architects the building blocks to design any IT solution. The architect chooses and orders the patterns available in the portfolio to meet the end goal. Segmentation between the infrastructure providing data processing and the infrastructure providing data storage is an example of a broad IT security architectural pattern. If the solution’s goal involves processing and storing data, the architect is constrained to place the pieces that will fulfill those tasks in the proper segments. Furthermore, if the operating system pattern is Oracle Enterprise Linux, the architect would use that pattern first in the design unless technical constraints made the consumption of this pattern suboptimal for accomplishing the solution’s goal. All other needs, such as authentication, encryption, log management, and system configuration, would be treated the same way: by using the architectural patterns available. The notion of a pattern extends beyond IT into areas that a PCI security assessment touches, such as pre-employment screening practices, security awareness training, risk assessment methodology, and third-party service provider management.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - January 05, 2024

The dark side of AI: Scientists say there’s a 5% chance of AI causing humans to go extinct

Despite concerns about AI behaving in ways misaligned with human values, some argue that current technology cannot cause the catastrophic consequences predicted by skeptics. Nir Eisikovits, a philosophy professor, contends that AI systems cannot make complex decisions and do not have autonomous access to critical infrastructure. While the fear of AI wiping out humanity grabs attention, an editorial in Nature contends that the more immediate societal concerns lie in biased decision-making, job displacement, and the misuse of facial recognition technology by authoritarian regimes. The editorial calls for a focus on actual risks and actions to address them rather than fearmongering narratives. The prospect of AI with human-level intelligence raises the theoretical possibility of AI systems creating other AI, leading to uncontrollable “superintelligence.” Authors Otto Barten and Joep Meindertsma argue that the competitive nature of AI labs incentivizes tech companies to create products rapidly, possibly neglecting ethical considerations and taking risks.


10 Skills Enterprise Architects Need In 2024

While an abundance of legacy technology is a cause for concern, each application needs to be appraised on a case-by-case basis. It's possible that an older application could actually be a better functional fit for your organization. More likely, however, is that removing a legacy application could be more trouble than it's worth. When you have clarity on how each application fits into your IT landscape, it could become apparent that removing an application would cause more problems than it would solve. Just as enterprise architects need to become experts at surgically removing outdated applications, they also need to know when the time is right to remove an application and how to manage legacy technology until that point. That's the true value of enterprise architecture. ... As generative artificial intelligence (AI) and other new technologies continue to take the weight of work out of daily tasks, the value a human can add is more about communication, negotiation, and diplomacy. Getting stakeholders on board with enterprise architecture involves charm and understanding.


The European Data Act: New Rules for a New Age

As a key element of the EU’s data strategy, the Data Act is intended to foster new, innovative services and more competitive prices for aftermarket services. According to the European Commission, the Data Act will make more data available for reuse, and it is expected to create 270 billion euros of additional gross domestic product by 2028. Complementing the Data Governance Act, which sets out the processes and structures to facilitate data sharing by companies across the EU and between sectors, the Data Act clarifies who can create value from industrial data and under which conditions. The Data Act also aims to put users and providers of data processing services on more equal footing in terms of access to data. ... The Data Act includes specific measures to allow users to gain access to the data their connected products generate (including the relevant metadata necessary to interpret such data) and to share such data with third parties to provide aftermarket or other data-driven innovative services. The Data Act further sets out that such data should be accessible in an easy, secure, comprehensive and structured manner, and it should be free of charge and provided in a commonly used machine-readable format.


Unlocking the Potential of Gen AI in Cyber Risk Management

Security automation powered by AI plays a pivotal role in streamlining various security functions, alleviating the workload for CSOs and CIOs and facilitating regulatory compliance. Security automation significantly simplifies routine security tasks, allowing human resources to pivot toward more intricate risk analysis and strategic decision-making. One of the notable contributions of AI lies in its assistance in meticulous code inspection and vulnerability assessment. For instance, tools such as Amazon Inspector for Lambda code and Amazon Detective provide indispensable support. Amazon Inspector aids in the comprehensive examination of code, identifying potential vulnerabilities or security loopholes within Lambda functions, which are integral parts of many cloud applications. This early identification ensures preemptive measures are taken to fortify these vulnerabilities before deployment. Additionally, Amazon Detective helps security analysts by correlating and organizing vast amounts of data to identify patterns or anomalies that might signify a security issue. By leveraging machine learning and AI-driven insights, it streamlines the process of identifying and addressing such issues effectively.


Honeywell’s Journey to Autonomous Operations

We’ve integrated AI into our technical-support operations, enabling customers to receive answers to their technical questions within minutes or seconds, as opposed to the day or two it previously took. Today, the addition of generative AI has amplified the capabilities of industrial AI, making it even more powerful than ever before. For example, we’re currently looking at millions of instances of alarms being triggered in the plants of our industrial customers -- to evaluate the potential use of such historical datasets to train a robust language model that would assist plant operators in identifying and addressing alarm issues promptly and providing guidance on necessary actions. ... With the convergence of IoT and AI software, the journey to autonomous operations is accelerating rapidly in the industrial world. However, automated decision-making requires both domain knowledge and the technical capabilities to build such a system. In vetting potential partners, look for one with the experience, data, and domain expertise to help you make the transition at scale.


Data and AI Predictions for 2024: Shifting Sands in Enterprise Data and AI Technologies

As organizations continue their shift to cloud-based data and analytics infrastructure, a more prudent fiscal outlook will be the theme for 2024. The cloud migration megatrend will not reverse, but organizations will scrutinize their cloud spend more than ever due to the challenging macroeconomic environment. In the cloud analytics arena, Databricks and Snowflake will continue their dominance with their well-established platforms. In particular, Databricks’ first-mover advantage for facilitating a lakehouse architecture will allow it to capture more market share. This paradigm combines the flexibility of data lakes with the management features of data warehouses, offering the best of both worlds to enterprises. On the other hand, Google BigQuery is expected to retain its stronghold within Google Cloud Platform (GCP) deployments, bolstered by deep integration with other GCP services and a strong existing customer base. However, the economic headwinds will compel enterprises to consider the total cost of ownership more closely. As a result, the traditional data warehouse architecture will see a decline in favor of the more cost-effective lakehouse design pattern. 


“You can’t prevent the unpreventable” - Rubrik CEO

A significant hurdle in the fight against cyber threats as a whole is in legislation and prosecution. The most capable cyber criminal enterprises are often state-sponsored groups harbored within nations that share their sympathies. While it is possible to seize their cyber assets and disrupt their operations, it is near impossible to prosecute a criminal who is working on behalf of a hostile government. Sinha states that not enough is being done at both the business and governmental levels to create frameworks for information sharing. This means that when one business faces a successful attack, it can be studied to understand the methods of intrusion, how the data was encrypted or extracted, and what could have been done at each stage of the attack to minimize the damage. Not only does this allow businesses to improve their data security and recovery strategies, but also provides attack playbooks that can be used to identify the groups responsible and their cyber infrastructure. However, there is an air of hesitation among many businesses as many would prefer to pay a ransom rather than reveal that their organization was successfully breached, which could cause potential reputational and economic losses.


Gen AI: A Shield for Improved Cyber Resilience

Before implementing GenAI as a proper defense tool, teams and leaders need to understand the strengths and weaknesses of GenAI. Proper research and education on this topic will ensure accurate security procedures, matching the appropriate tool to the corresponding task. An easy way to understand the benefits of a certain AI tool is by surveying its AI model card (sometimes known as a “system card”), which ultimately provides users with knowledge about its benefits and advantages, what it has and has not been tested for, and its flaws and vulnerabilities. Vetting AI models is a vital step, and model provenance should be the first step of any and all defense strategies. Biden’s latest executive order on AI reinforces the importance of vetting AI models, requiring all AI models to be red-teamed to suss out potential weaknesses. Model provenance provides all documented history, such as the AI model's origin, the architecture and parameters it possesses, dependencies it may bear, the data used to train it, and other corresponding details.


Apache ERP Zero-Day Underscores Dangers of Incomplete Patches

The incident highlights attackers' strategy of scrutinizing any patches released for high-value vulnerabilities — efforts which often result in finding ways around software fixes, says Douglas McKee, executive director of threat research at SonicWall. "Once someone's done the hard work of saying, 'Oh, a vulnerability exists here,' now a whole bunch of researchers or threat actors can look at that one narrow spot, and you've kind of opened yourself up to a lot more scrutiny," he says. "You've drawn attention to that area of code, and if your patch isn't rock solid or something was missed, it's more likely to be found because you've extra eyes on it." ... The reasons that companies fail to fully patch an issue are numerous, from not understanding the root cause of the problem to dealing with huge backlogs of software vulnerabilities to prioritizing an immediate patch over a comprehensive fix, says Jared Semrau, a senior manager with Google Mandiant's vulnerability and exploitation group. "There is no simple, single answer as to why this happens," he says. 


Unlocking the Secrets of Data Privacy: Navigating the World of Data Anonymization, Part 1

Implementing data anonymization techniques presents many technical challenges that demand meticulous deliberation and expertise. One paramount obstacle lies in the intricacies of determining the optimal level of anonymization. A profound comprehension of the data's structure and the potential for re-identification is imperative when employing techniques such as k-anonymity, l-diversity, or differential privacy. Furthermore, scalability poses another formidable hurdle. With the continuous growth of data volumes, applying anonymization techniques effectively without unduly compromising performance becomes increasingly difficult. Numerous difficulties also emerge during implementation because of the varied nature of data types, from structured data in databases to unstructured data in documents and images. Additionally, the challenge of keeping pace with ever-evolving data formats and sources necessitates constant updates and adaptations of anonymization strategies.
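To make the re-identification concern above concrete, k-anonymity can be measured by grouping records on their quasi-identifiers and finding the smallest group. The sketch below is a minimal illustration; the dataset, field names, and generalized values are purely hypothetical:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the size of the
    smallest group of records sharing identical quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Ages and ZIP codes have already been generalized into ranges/prefixes.
records = [
    {"age": "30-39", "zip": "021**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "021**", "diagnosis": "cold"},
    {"age": "40-49", "zip": "030**", "diagnosis": "flu"},
    {"age": "40-49", "zip": "030**", "diagnosis": "asthma"},
]
print(k_anonymity(records, ["age", "zip"]))  # 2
```

A dataset is k-anonymous when every quasi-identifier combination is shared by at least k records; deciding how far to generalize fields like age and ZIP code is exactly the "optimal level of anonymization" trade-off described above.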



Quote for the day:

"You may be disappointed if you fail, but you are doomed if you don't try." -- Beverly Sills

Daily Tech Digest - January 04, 2024

Beyond transactions: Reimagining banking with superior digital customer journeys

In their journey toward digitalization, many banks have concentrated on streamlining customer journeys, often through targeted solutions like automation or AI to address specific problem areas. However, true digital success goes beyond these piecemeal enhancements. For banks to truly excel in the digital arena, a shift in mindset is required. It’s about adopting a platform-first approach that offers a more integrated, holistic solution rather than completely overhauling existing systems. This approach isn’t just about fine-tuning isolated components; it’s about seamlessly connecting these components through a unified platform that enhances the overall efficiency and effectiveness of the banking ecosystem. ... Many banks streamline, but the real leaders in digital banking are those who reimagine the entire customer journey. They take their ‘hands off the wheel,’ designing systems that amplify human potential while supporting touchless operations. They weave AI into the fabric of banking to deliver seamless service that’s not just efficient but also intuitive and connected. 


5 Microservices Design Patterns Every DevOps Team Should Know

The API gateway serves as a single entry point for all client requests. It routes requests to the appropriate microservice and subsequently aggregates the responses. It also handles cross-cutting concerns like authentication, monitoring and rate limiting. Furthermore, it provides a unified API that is easier to consume by the client, shielding them from the complexity of the microservices architecture. However, the API Gateway pattern is not without its challenges. It can become a bottleneck if not properly designed and scaled. Also, it’s a single point of failure unless made highly available. Despite these challenges, with careful design choices and good operational practices, the API Gateway pattern can greatly simplify client interaction with microservices. ... The Circuit Breaker pattern aims to prevent this scenario. With the Circuit Breaker pattern, you can prevent a network or service failure from cascading to other services. When a failure is detected, the circuit breaker trips and prevents further calls to the failing service. It then periodically attempts to call the service, and if successful, it closes the circuit and lets the calls go through.
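The trip-and-retry cycle of the Circuit Breaker pattern can be sketched in a few lines. This is a simplified illustration, not a production implementation; the class name, threshold, and timeout values are all assumptions:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: fail fast after repeated failures,
    then allow a trial call once the reset timeout has elapsed."""

    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            # Timeout elapsed: half-open, let one trial call through.
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        else:
            self.failures = 0
            self.opened_at = None  # success closes the circuit
            return result
```

After `failure_threshold` consecutive failures the breaker rejects calls immediately; once `reset_timeout` elapses it lets a single trial call through, and a success closes the circuit again, which is the periodic-retry behavior described above.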


6 warning signs CIOs should look out for in 2024

2023 saw a massive boom in AI, and governments are starting to catch up. In the US, President Biden rolled out an executive order on the safe and secure uses of AI, while in the European Union, lawmakers in December agreed on the details of the AI Act — one of the first bills in the world to establish comprehensive rules for AI. So CIOs will have to follow the debate closely as the year progresses. “Staying updated with new regulations, especially regarding AI ethics, data usage, and copyright concerns, is crucial,” says Bilyk. “Ignoring these changes can lead to legal complications and a loss of public trust.” Companies should make sure they have enough compliance experts, while startups need to hire them early on because they have to understand if and how regulations apply to them. Also, it helps if CIOs know exactly which AI-powered tools their company uses and how their in-house tools are developed. Not knowing this is a serious red flag. “A lot of times, leadership, or the legal side, doesn’t even know what developers are building,” Joseph Thacker, security researcher at AppOmni, told CSO. “I think for small and medium enterprises, it’s going to be pretty tough.”


A Look at Microsoft's Secure Future Initiative

My main gripes with the SFI, and the way it's presented, are twofold. First, Satya is nowhere to be seen. This is not the CEO having a moment of deep realization that the current path is dangerous and the ship needs to steer clear of the icebergs ahead. It's security leaders talking about incremental security improvements. Second, it's marketing word salad (mostly, see more below). It talks in generalities about improvements but not enough about concrete, measurable and transparent details that we can see and use to start rebuilding our trust in Microsoft's security culture. ... In an ever-increasing "speed of feature release" competition with AWS in the IaaS and PaaS cloud spaces, and ditto in SaaS with Google (I think this is a mistake, Google Workspace is no longer a serious competitor to Microsoft 365 except in very small businesses) and to a lesser extent Salesforce and others, will program managers at Microsoft be incentivized to say "No, we can't release this feature now, even though the competition is, because we'll need to spend another two months ironing out the security issues"?


Want to Identify Good Generative AI Use Cases? Don’t Be Boring!

When we talk about impact, we should always be thinking about acceleration. Code generation, for instance, is a great example of generative AI’s ability to take complex inputs and produce complex outputs. Writing code, even for master coders, can be a complex and time-consuming task. Great code is also invaluable to organizations and their solutions. With generative AI, an expert coder could prompt a model with something like, “Generate some code that sorts this array here in descending order” and receive a working output almost instantaneously. Sure, they could write that code themselves, but generative AI accelerates their job multiple times over. This is a capability that, until now, was only a dream. Conversational analytics is also another “golden zone” use case because of its accelerative potential. Within business intelligence (BI) and other advanced organizational data/AI tools, you could ask a model something like: “What have our sales been for each of our product groups, and are there any trends?” To do this in code would take a person, team, or department a long time. 
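The hypothetical prompt above ("sort this array in descending order") might return a working snippet such as the following; the function name and sample data are illustrative:

```python
def sort_descending(values):
    """Return a new list sorted from largest to smallest."""
    return sorted(values, reverse=True)

print(sort_descending([3, 1, 4, 1, 5]))  # [5, 4, 3, 1, 1]
```

The point is not the snippet's difficulty but the turnaround: a working, reviewable output in seconds rather than a task on someone's backlog.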


Emotional Cyberrisk Management Decisions

There are certain aspects of managing cyberrisk in which emotion can play an important role. However, far too often, emotion is treated as the default tool in risk management, rather than the specialty tool it is meant to be. Cognitive biases and heuristics have an outsized influence on cyberrisk management. They are hard-wired into our brains and, if managed poorly, can make it difficult to ascertain reality. One of the more common sources of bias is confirmation bias, which is the tendency to search for, interpret, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. Confirmation bias may lead someone to disregard evidence that contradicts their existing beliefs. Overconfidence bias is also common. This leads individuals to overestimate their abilities, such as the effectiveness of their cybersecurity controls. Lastly, the anchoring effect happens when someone relies too heavily on the first piece of information ingested (i.e., the anchor) when making decisions, such as an initial risk assessment that may be outdated.


Why Data Quality Will Rise to the Top of Enterprise Priorities in 2024

Data quality is essential. Even if you have found the data sets that will be appropriate for training an AI model or digging for insights, your results will be poor if the quality of your data is poor. Ownership of data and data quality are core to business success but are still too often ignored or overlooked by the executives and boards of directors of most organizations. We can see this in the disconnect between perception and reality. Over 80% of executives that HFS surveyed say they trust their data, but the reality is that many people are still doing a lot of work to get data quality to a level where the data can be consumed and relied upon. As data quality takes on greater importance, it will escalate to an executive-level conversation. Enterprises will need to start collecting and documenting data, metadata, processes, and business rules as they pursue data quality. Without these basic elements, AI models won’t be able to produce insightful and accurate results. If you haven’t yet, you’ll need to invest in initiatives to improve your data quality so you can build a strong foundation for AI use.


The Rise of Linux in Edge Computing and IoT

Edge devices often operate with limited resources. While Linux’s scalability is an asset, striking a balance between functionality and resource usage is important. Careful optimization is essential to ensure efficient utilization of limited resources. ... There is a vast array of edge devices, ranging from sensors and actuators to gateways and edge servers. This diversity can present a challenge in terms of maintenance and optimization. Addressing various architectures and hardware configurations is a time-intensive process that requires continuous effort. ... Decentralization is a common feature of edge computing systems, with a multitude of devices each having unique vulnerabilities. Regularly patching and updating these devices can be challenging, as well as difficult to track. Organizations must be vigilant about maintaining the security integrity of edge computing systems. ... Integrating various devices into a cohesive system poses complex challenges. Developers must navigate the intricacies to ensure interoperability among devices running Linux and other operating systems.


Ransomware Group Steals Australian Courts' Video Recordings

Louise Anderson, chief executive officer of the court, did not reveal the nature of the attack but said the department had been able to isolate the affected network and could confirm that the unauthorized access had been restricted to recordings stored on the network. "We understand this will be unsettling for those who have been part of a hearing. We recognize and apologize for the distress that this may cause people," Anderson said. "Maintaining security for court users is our highest priority. Our current efforts are focused on ensuring our systems are safe and making sure we notify people in hearings where recordings may have been accessed." Anderson did not share the court's efforts to regain access to the compromised recordings but said changes are being made to hearing arrangements and will be announced shortly. Anderson said the hackers had breached a single computer system that manages only audiovisual recordings for all court jurisdictions. "The system holds recordings for around 28 days, so the primary investigation period is Nov. 1 to Dec. 21, which is when we identified the problem and isolated and disabled the affected network," she said.


Align IT With the Business

For most CIOs, redefining IT as a strategic discipline and getting the rest of the organization to recognize this isn’t an overnight process. This is also an area where the CIO has to take the lead. Taking the lead means changing perceptions of IT by altering IT practices so IT better aligns with the business. ... Those CIOs regularly visit with these managers to understand business goals and pain points, and then strategize on how IT solutions can help the managers (and the business) succeed. As part of this process, more CIOs are placing IT personnel such as business analysts directly into business units, or they’re teaming their own IT analysts with citizen developers in user departments. This creates inter-departmental cooperation, and it is a natural onramp for relationship building with other business managers. ... Being transformative in your company can be risky and isn’t for the faint of heart, but there are CIOs who have understood their companies’ businesses and have come up with transformative solutions that have grown out of IT. These breakthroughs have reinvented their companies.



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick