
Daily Tech Digest - March 01, 2024

Why Large Language Models Won’t Replace Human Coders

Are any of these GenAI tools likely to become substitutes for real programmers? Unless the accuracy of coding answers supplied by models rises to within an acceptable margin of error (i.e., 98-100%), probably not. Let’s assume for argument’s sake, though, that GenAI does reach this margin of error. Does that mean the role of software engineering will shift so that you simply review and verify AI-generated code instead of writing it? Such a hypothesis could prove faulty if the four-eyes principle is anything to go by. It’s one of the most important mechanisms of internal risk control, mandating that any activity of material risk (like shipping software) be reviewed and double-checked by a second, independent, and competent individual. Unless AI is reclassified as an independent and competent lifeform, it shouldn’t qualify as one pair of eyes in that equation anytime soon. If there is a future in which GenAI becomes capable of end-to-end development and of building human-machine interfaces, it is not the near future. LLMs can do an adequate job of interacting with text and elements of an image. There are even tools that can convert web designs into frontend code.


The future of farming

SmaXtec’s solution requires cows to swallow what the company calls a “bolus” - a small device that consists of sensors to measure a cow’s pH and temperature, an accelerometer, and a small processor. “It sits inside the cow and constantly measures very important body health parameters, including temperature, the amount of water intake, the drinking volume, the activity of the animal, and the contraction of the rumen in the dairy cow,” Scherer said. Rumination is a process of regurgitation and re-digestion. “You could almost envision this as a Fitbit for cows,” he said, adding that by constantly measuring those parameters at a high density - short timeframes with high robustness and high accuracy - SmaXtec can make assessments about potential diseases that are about to break out. ... Small Robot Company is known for its Tom robot. Tom - the robot - is faintly reminiscent of Doctor Who’s robotic dog K9. The device wheels itself up and down fields, capturing images and mapping out the land. The data is then taken from Tom’s SSD and uploaded to the cloud, where an AI identifies the different plants and weeds, and provides a customized fertilizer and herbicide plan for the crops.
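As a rough illustration of the kind of screening such high-frequency readings make possible, here is a minimal Python sketch that flags samples drifting sharply from a rolling baseline. The field names, sampling window, and threshold are hypothetical and are not SmaXtec's actual method.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=48, z_threshold=2.5):
    """Flag sensor samples that deviate sharply from a rolling baseline.

    `readings` is a list of floats (e.g. rumen temperature sampled every
    few minutes); the window size and z-score threshold are illustrative.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            alerts.append((i, readings[i]))
    return alerts

# Example: a sudden temperature spike stands out against the rolling baseline.
temps = [38.6 + 0.05 * (i % 3) for i in range(60)] + [40.1]
print(flag_anomalies(temps))
```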


The CISO: 2024’s Most Important C-Suite Officer

Short- and long-term solutions to navigating increased regulatory and plaintiff bar scrutiny start with the CISO. Cybersecurity defense strategies, implementation, and monitoring fall under the purview of the CISO, who must closely coordinate with other members of the C-suite as well as boards of directors. Recent lawsuits highlight individual fiduciary liability for cybersecurity controls and accurate disclosures. Individual liability demands increased knowledge of, participation in, and shared ownership of cybersecurity defense decisions. Gone are the days when liability risks could be eliminated by placing the blame on a single security officer. Boards and other C-suite executives now bear personal risk for company cybersecurity defenses and preparedness. CISOs carry primary ownership for formulating and maintaining robust cybersecurity defenses and preparedness. This starts with implementing secure-by-design and other leading security frameworks. It extends to effective real-time threat monitoring and continual assessment of the company’s capabilities to defend against advanced cyber threats, or the “Defining Threat of Our Time.”


Generative AI and the big buzz about small language models

LLMs can create a wide array of content, from text and images to audio and video, with multimodal systems emerging to handle more than one of these modalities. They process massive amounts of information to execute natural language processing (NLP) tasks that approximate human speech in response to prompts. As such, they are ideal for pulling from vast amounts of data to generate a wide range of content, as well as for conversational AI tasks. This requires a significant number of servers, storage, and the all-too-scarce GPUs that power the models — at a cost some organizations are unwilling or unable to bear. It’s also tough to satisfy ESG requirements when LLMs hog compute resources for training, augmenting, fine-tuning, and other tasks organizations require to hone their models. In contrast, SLMs consume fewer computing resources than their larger brethren and provide surprisingly good performance — in some cases on par with LLMs on certain benchmarks. They’re also more customizable, allowing organizations to execute specific tasks. For instance, SLMs may be trained on curated data sets and combined with retrieval-augmented generation (RAG) to help refine search. For many organizations, SLMs may be ideal for running models on premises.
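As a rough illustration of the RAG pattern mentioned above, here is a minimal Python sketch. It substitutes naive keyword overlap for real vector embeddings and a placeholder function for the small model's completion call, so it shows only the shape of the pipeline, not a production implementation.

```python
def retrieve(query, documents, top_k=2):
    """Rank documents by naive keyword overlap with the query.
    A production RAG system would use vector embeddings instead."""
    q_terms = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_terms & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

def answer_with_rag(query, documents, generate):
    """Prepend retrieved context to the prompt before calling the model.
    `generate` stands in for any small language model's completion call."""
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

docs = ["SLMs run well on premises.", "LLMs need large GPU clusters.",
        "RAG grounds answers in curated data."]
print(answer_with_rag("Why use RAG with an SLM?", docs, generate=lambda p: p))
```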


Captive centers are back. Is DIY offshoring right for you?

Captive centers are no longer just a means of value creation, providing cost savings and driving process standardization. They are driving organization-wide innovation, facilitating digital transformations, and contributing to revenue growth. Unlike earlier generations of what are increasingly being called “global capability centers,” which tended to be large operations set up by multinationals, more than half of last year’s new centers were launched by first-time adopters, and they are on the smaller side, with fewer than 250 full-time employees and in some cases fewer than 50. The desire to build internal IT capabilities amid a tight talent market is at the heart of the trend. As companies have grown comfortable with offshore and nearshore delivery, the captive model offers the opportunity to tap larger populations of lower-cost talent without handing the reins to a third party. “Eroding customer satisfaction with outsourcing relationships — per some reports, at an all-time low — has caused some companies to opt to ‘do it themselves,’” says Dave Borowski, senior partner, operations excellence, at West Monroe. What’s more, establishing a captive center no longer needs to be entirely DIY.


Questioning cloud’s environmental impact

Contrary to popular belief, cloud computing is not inherently green. Cloud data centers require a lot of energy to power and maintain their infrastructure. That should be news to nobody. Cloud is becoming the largest user of data center space, perhaps only to be challenged by the growth of AI data centers, which are becoming a developer’s dream. But wait, don’t cloud providers use solar and wind? Although some use renewable energy, not all adopt energy-efficient practices. Many cloud services rely on coal-fired power. Ask cloud providers which of their data centers run on renewable energy. Most will provide a non-answer, saying their power mix is complex and ever-changing. I’m not going too far out on a limb in stating that most use nonrenewable power and will do so for the foreseeable future. The carbon emissions from cloud computing largely stem from the power consumed by the providers’ platforms and the inefficiencies embedded within applications running on these platforms. A cloud provider may do an excellent job of building a multitenant system that optimizes the servers it runs, but it has no control over how well its customers leverage those resources.


Revolutionizing Real-Time Data Processing: The Dawn of Edge AI

For effective edge computing, efficient and computationally cost-effective technology is needed. One promising option is reservoir computing, a computational method designed for processing signals that are recorded over time. It transforms these signals into complex patterns using reservoirs that respond nonlinearly to them. In particular, physical reservoirs, which exploit the dynamics of physical systems, are both computationally cost-effective and efficient. However, their ability to process signals in real time is limited by the natural relaxation time of the physical system, which also forces adjustments to achieve the best learning performance. ... Recently, Professor Kentaro Kinoshita and Mr. Yutaro Yamazaki developed an optical device with features that support physical reservoir computing and allow real-time signal processing across a broad range of timescales within a single device. Explaining the motivation for the study, Prof. Kinoshita says: “The devices developed in this research will enable a single device to process time-series signals with various timescales generated in our living environment in real-time. In particular, we hope to realize an AI device to utilize in the edge domain.”
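The optical device itself is hardware, but the underlying reservoir-computing idea can be illustrated in software. The sketch below is a minimal echo state network in Python on a toy sine-wave prediction task: a fixed random reservoir responds nonlinearly to the input signal, and only a simple linear readout is trained. It is not the researchers' device or method, just the general technique under assumed toy parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny echo state network: a fixed random "reservoir" responds nonlinearly
# to an input signal, and only the linear readout layer is trained.
N = 100                                    # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (N, 1))      # input weights (fixed, untrained)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for a 1-D input signal u."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W_in[:, 0] * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave from reservoir states.
t = np.linspace(0, 20, 500)
u, target = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
W_out = np.linalg.lstsq(X, target, rcond=None)[0]  # train the readout only
print("readout MSE:", np.mean((X @ W_out - target) ** 2))
```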


Agile software promises efficiency. It requires a cultural shift to get right

The end result of these fake agile practices is lip service and ceremonies at the expense of the original manifesto’s principles, Bacon said. ... To get agile right, Wickham recommended building on situations in your organization where agile is practiced relatively effectively. Most often, that involves teams building internal tools, such as administrative panels for customer support or CI/CD pipelines. Those use cases have more tolerance for “let’s put something up, ask for feedback, iterate, repeat,” he said. After all, internal customers are prepared to accept seeing something that is initially imperfect. “This indicates to me that people comprehend agile and have at least a baseline understanding of how to use it, but a lack of willingness to use it as defined when it comes to external customers,” said Wickham. ... “Agile is an easy term to toss around as a ‘solution,’” Richmond said. “But effective agile does not have a cookie-cutter solution to improving execution.” Getting it right requires a focus on what has to happen to understand the company’s challenges, how those challenges arise from the business environment, in what way those challenges impact business outcomes, and then, finally, identifying how to apply agile concepts to the business.


Building a Strong Data Culture: A Strategic Imperative

Effective executive backing is crucial for prioritizing and financing data initiatives that help cultivate an organization’s data-centric culture. Initiatives such as data literacy programs equip employees with vital data skills that are fundamental to fostering such a culture. Nonetheless, these programs often fail to thrive without the robust support of leadership. Results from the same Alation research show that only 15 percent of companies with moderate or weak data leadership integrate data literacy across most departments or throughout the entire organization. This is in stark contrast to the 61 percent adoption rate in companies with strong data leadership. Moreover, strong data leadership involves more than just endorsement; it requires executives to actively engage and set an example in data culture initiatives. For instance, when an executive carves out time from her hectic schedule to partake in data literacy training, it conveys a much more powerful message to her team than if she were to simply instruct others to prioritize such training. This hands-on approach by leaders underscores the importance of data literacy and demonstrates their commitment to embedding a data-driven culture in the organization.


Cybercriminals harness AI for new era of malware development

Threat actors have already shown how AI can help them develop malware with only a limited knowledge of programming languages, brainstorm new TTPs, compose convincing text to be used in social engineering attacks, and increase their operational productivity. Large language models such as ChatGPT remain in widespread use, and Group-IB analysts have observed continued interest on underground forums in ChatGPT jailbreaking and specialized generative pre-trained transformer (GPT) development, with actors looking for ways to bypass ChatGPT’s security controls. Group-IB experts have also noticed that, since mid-2023, four ChatGPT-style tools have been developed for the purpose of assisting cybercriminal activity: WolfGPT, DarkBARD, FraudGPT, and WormGPT – all with different functionalities. FraudGPT and WormGPT are highly discussed tools on underground forums and Telegram channels, tailored for social engineering and phishing. Conversely, tools like WolfGPT, which focus on code or exploits, are less popular due to training complexities and usability issues. Yet their advancement poses risks for sophisticated attacks.



Quote for the day:

"It takes courage and maturity to know the difference between a hoping and a wishing." -- Rashida Jourdain

Daily Tech Digest - January 07, 2024

2024 cybersecurity forecast: Regulation, consolidation and mothballing SIEMs

CISOs’ jobs are getting harder. Many are grappling with an onslaught of security threats, and now the legal and regulatory stakes are higher. The new SEC cybersecurity disclosure requirements have many CISOs concerned they’ll be left with the liability when an attack occurs. ... After the Cyber Resilience Act, policymakers and developers will drive adoption of security by design. The CRA wisely avoided breaking the open source software ecosystem, but now the hard work starts: helping manufacturers adopt modern software development practices that will enable them to ship secure products and comply with the CRA, and driving public investment in open source software security to efficiently raise all boats. ... With the increase in digital business-as-usual, cybersecurity practitioners already feel lost in a deluge of inaccurate information from a mushrooming number of cybersecurity solutions; coupled with a lack of cybersecurity architecture and design practices, this results in porous cyber defenses.


Expert Insight: Adam Seamons on Zero-Trust Architecture

Zero trust goes beyond restricting access by need to know and the principle of least privilege. It’s about properly verifying access and being 110% certain that the access is legitimate. That means things like limiting access to specific criteria, such as by port or protocol, time period, IP address and/or physical location. ... A zero-trust network is about verification or double-checking. You want to be verifying not just the person, but also the device and limiting that access to specific permissions and rights that have been approved in advance. And you’re also restricting data access, particularly in situations like the example I just gave. Think of it like the difference between a key to the front door that gives you access to the whole house, and needing a key for the front door as well as separate keys for all the different rooms. ... AI and machine learning have both been used in detecting anomalies and suspicious patterns for some time, and will only continue to be used more. I expect SOCs to become increasingly reliant on AI. Getting more specific, log analysis is a key area for AI to automate. 
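As a rough illustration of the multi-factor verification Seamons describes, here is a minimal Python sketch that grants access only when the user, device posture, source network, port, and time of day all pass. The policy fields and values are hypothetical; a real zero-trust deployment would rely on an identity provider, device-posture service, and policy engine rather than a script like this.

```python
from dataclasses import dataclass
from datetime import time
from ipaddress import ip_address, ip_network

@dataclass
class AccessRequest:
    user: str
    device_compliant: bool   # e.g. patched, encrypted, managed device
    source_ip: str
    port: int
    when: time

# Illustrative policy only: specific users, a source subnet, one port,
# and a permitted time window, mirroring the criteria mentioned above.
POLICY = {
    "allowed_users": {"a.seamons"},
    "allowed_network": ip_network("10.20.0.0/16"),
    "allowed_ports": {443},
    "allowed_window": (time(8, 0), time(18, 0)),
}

def authorize(req: AccessRequest) -> bool:
    """Deny by default: every check must pass for access to be granted."""
    return (req.user in POLICY["allowed_users"]
            and req.device_compliant
            and ip_address(req.source_ip) in POLICY["allowed_network"]
            and req.port in POLICY["allowed_ports"]
            and POLICY["allowed_window"][0] <= req.when <= POLICY["allowed_window"][1])

print(authorize(AccessRequest("a.seamons", True, "10.20.5.7", 443, time(9, 30))))
```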


6 innovative and effective approaches to upskilling

Beverage maker Torani has been mixing up L&D by flipping the traditional performance review — which can be “demoralizing” — on its head. It puts the onus on future rather than past performance and on employee learning aspirations, rather than manager assessment. ... Devine adds: “With today’s shift to agile working, some firms believe yearly performance objectives and appraisals are insufficient and inflexible. They need something more frequent, nimble, and focused on feedback, skills and future needs. But you still need managers to assess performance to justify and provide transparency on promotions and pay decisions.” ... Microsoft is helping workers across its organization gain skills related to AI — from non-techies to IT professionals and leaders. Simon Lambert, chief learning officer at Microsoft UK, says: “One lesson we’ve learned from our AI learning journey is that upskilling means far more than merely equipping employees with skills. It requires an ecosystem that fosters adaptability and continuous learning. In the face of AI-upskilling demand, employees need faster, seamless access to learning infrastructure.”


2024 Data Center Un-Predictions: Five Unlikely Industry Forecasts

The potential impact of data centers on local communities is an important issue. At a recent conference in Virginia, we had activists from the community right alongside data center leaders to discuss the challenges and opportunities we face. While there were still some disconnects, we met in the middle on some critical topics around power, community engagement, and ensuring we create a more sustainable future. ... Leveraging self-driving technology, robots independently chart and traverse the data center, gathering real-time sensor data. This lets them immediately compare current readings against pre-defined norms, quickly flagging deviations for human examination. In an ever more interconnected and intricate environment, this robotic technology grants decision-makers enhanced visibility, rapidity, and a breadth of intelligence that surpasses what humans or stationary cameras can provide. This advanced capability is vital for maintaining the efficiency and security of data centers in our increasingly digital world.
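As a rough illustration of comparing readings against pre-defined norms, here is a minimal Python sketch that checks one sensor snapshot against fixed operating ranges and returns the deviations to escalate for human review. The sensor names and thresholds are illustrative, not any vendor's actual values.

```python
# Illustrative operating norms; real data-center thresholds vary by facility.
NORMS = {
    "inlet_temp_c": (18.0, 27.0),
    "humidity_pct": (40.0, 60.0),
    "sound_level_db": (60.0, 85.0),
}

def review_queue(snapshot):
    """Compare one robot sensor snapshot against pre-defined norms and
    return the deviations that should be escalated for human examination."""
    findings = []
    for sensor, value in snapshot.items():
        low, high = NORMS.get(sensor, (float("-inf"), float("inf")))
        if not low <= value <= high:
            findings.append(f"{sensor}={value} outside [{low}, {high}]")
    return findings

print(review_queue({"inlet_temp_c": 31.2, "humidity_pct": 45.0,
                    "sound_level_db": 88.5}))
```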


US DOD’s CMMC 2.0 rules lift burdens on MSPs, manufacturers

The proposed rules also let manufacturers off the hook for complying with NIST SP 800-171, a set of NIST cybersecurity rules to protect sensitive federal information. “The requirements of the 171 set of cyber standards are designed for IT networks and information systems,” Metzger says. “They were never really designed for a manufacturing environment. It’s now said clearly in the proposed rules that the assessments won’t apply to operational technology.” “That, to me, should cause manufacturers to breathe a huge sigh of relief because being required to meet NIST standards that simply don’t fit a manufacturing or OT environment is a recipe for trouble of many forms,” Metzger says. “The most important change is what did not change. The document has essentially the same structure and strategy that was in 1.0. It requires third-party assessments for a very large number of defense suppliers.” The proposed version 2.0 of the CMMC rules was published in the Federal Register on December 26. Interested parties have until February 26 to file comments with the DOD before the agency finalizes the rules.


Banking Innovation is Paramount Even as Regulatory and Competitive Pressures Mount

Guiding technology-forward regulations can empower banks to harness innovation, enhancing security, transparency, and customer value. Rather than attempting to prevent the recurrence of a once-in-a-century financial crisis, regulators should pursue thoughtful oversight that encourages innovation while safeguarding against excessive risk. Banks face a growing challenge to their market share from alternative lending platforms, which poses an existential threat, as noted in McKinsey’s 2023 Global Banking Annual Review. Over 70% of the growth in global financial assets since 2015 has shifted away from traditional bank lending, finding its way into private markets, institutional investors, and the realm of “shadow banking.” Near-zero interest rates have enabled private equity firms and non-bank lenders to offer lower-cost loans. The fintech sector, with its digitally savvy consumer base, has further accelerated this transition, particularly during the pandemic.


IT and OT cybersecurity: A holistic approach

As OT becomes more interconnected, the need to safeguard OT systems against cyber threats is paramount. Many cyber threats and vulnerabilities specifically target OT systems, which emphasizes the potential impact on industrial operations. Many OT systems still use legacy technologies and protocols that may have inherent vulnerabilities, as they were not designed with modern cybersecurity standards in mind. They may also use older or insecure communication protocols that may not encrypt data, making them susceptible to eavesdropping and tampering. Concerns about system stability often lead OT environments to avoid frequent updates and patches. This can leave systems exposed to known vulnerabilities. OT systems are not immune to social engineering attacks either. Insufficient training and awareness among OT personnel can lead to unintentional security breaches, such as clicking on malicious links or falling victim to social engineering attacks. Supply chain risks also pose a threat, as third-party suppliers and vendors may introduce vulnerabilities into OT systems if their products or services are not adequately secured.


Exploring the Future of Information Governance: Key Predictions for 2024

In today’s rapidly evolving digital landscape, information governance has become a collective responsibility. Looking ahead to 2024, we can anticipate a significant shift towards closer collaboration between the legal, compliance, risk management, and IT departments. This collaborative effort aims to ensure comprehensive data management and robust protection practices across the entire organization. By adopting a holistic approach and providing cross-functional training, companies can empower their workforce to navigate the complexities of information governance with confidence, enabling them to make informed decisions and mitigate potential risks effectively. Embracing this collaborative mindset will be crucial for organizations to adapt and thrive in an increasingly data-driven world. ... Blockchain technology, with its decentralized and immutable nature, has tremendous potential to revolutionize information governance across industries. By 2024, as businesses continue to recognize the benefits, we can expect a significant increase in the adoption of blockchain for secure and transparent transaction ledgers.
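As a rough illustration of where a ledger's tamper-evidence comes from, here is a minimal Python sketch of hash chaining: each block's hash covers the previous block, so altering an earlier entry invalidates everything after it. This is a single-node sketch and omits the consensus mechanism that makes a real blockchain decentralized.

```python
import hashlib, json, time

def add_block(chain, record):
    """Append a record whose hash covers the previous block, so any later
    edit to an earlier entry breaks every hash that follows it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev_hash": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)
    return chain

def verify(chain):
    """Recompute each hash; tampering with any block makes this return False."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected or (
                i > 0 and block["prev_hash"] != chain[i - 1]["hash"]):
            return False
    return True

ledger = []
add_block(ledger, {"doc": "policy-v1", "action": "approved"})
add_block(ledger, {"doc": "policy-v1", "action": "archived"})
print(verify(ledger))                          # True
ledger[0]["record"]["action"] = "rejected"
print(verify(ledger))                          # False after tampering
```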


Data Professional Introspective: Demystifying Data Culture

We are discussing data culture from several points of view: what new content should be added to the DCAM, where would it fall within the current framework structure, what changes we propose to that structure, what modifications should be made to existing content, how the new/modified content would be assessed, and so on. ... One can begin decomposing data culture from a high-level vision, which summarizes what the organization has accomplished when it can feel confident in asserting that, “We have a strong data culture.” One can also compile a collection of activities and behaviors that demonstrate a developed data culture, and then categorize them and parse them into the DCAM. Or, one can apply a combination of the two approaches, which is the path the working group has followed. The working definition posited to date includes a summary description of a strong data culture: “A strong data culture promotes data-driven decision-making, data transparency, and the alignment of data and analytics to business objectives. It prioritizes strategic data use and encourages sharing and collaboration around data.”


Tackling technical debt in the insurance industry

The impact of technical debt on insurers spans various dimensions. Data inefficiencies arise, leading to compliance issues and difficulties in recruiting and retaining talent. Outdated processes hinder optimal decision-making, impacting both established and newer insurers. Addressing technical debt requires insurers to foster a culture of change, emphasising the risks of neglecting this issue and aligning strategies with broader organisational objectives. Tackling technical debt involves immediate action, prioritised backlog creation, and adaptive development processes. Insurers are advised to navigate technical debt through a combination of incremental and transformational changes. Incremental adjustments and breakthrough advancements should complement comprehensive restructuring efforts for sustained and effective resolution. The roadmap to a resilient, innovative future in insurance hinges on proactive management of technical debt. Insurers must embark on their journey towards pricing transformation to remain competitive and future-ready.



Quote for the day:

"In the end, it is important to remember that we cannot become what we need to be by remaining what we are." -- Max De Pree