Daily Tech Digest - January 21, 2024

What is RAG? More accurate and reliable LLMs

Retrieval-Augmented Generation (RAG) is an AI framework that is significantly shaping the field of Natural Language Processing (NLP). It is designed to improve the accuracy and richness of content produced by language models. A synthesis of the key points from various sources: RAG retrieves facts from an external knowledge base to provide grounding for large language models (LLMs). This grounding ensures that the information the LLMs generate is based on accurate and current data, which is particularly important given that LLMs can sometimes produce inconsistent outputs. The framework operates as a hybrid, integrating both retrieval and generative models, which allows RAG to produce text that is not only contextually accurate but also rich in information. Its ability to draw from extensive databases enables it to contribute contextually relevant and detailed content to the generative process. Finally, RAG addresses a key limitation of foundational language models, which are generally trained offline on broad domain corpora and are not updated with new information post-training.
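The retrieve-then-generate pattern described above can be sketched in a few lines. The word-overlap scorer below is a deliberately naive stand-in for a real embedding-based vector store, and the resulting prompt would be handed to whatever LLM API is in use; the function names and sample knowledge base are invented for illustration.

```python
def retrieve(query: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query.
    A real RAG system would use embeddings and a vector store here."""
    q_words = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_grounded_prompt(query: str, knowledge_base: list[str]) -> str:
    """Prepend retrieved facts so the model generates from current data."""
    context = "\n".join(retrieve(query, knowledge_base))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

kb = [
    "RAG grounds LLM output in an external knowledge base.",
    "Foundation models are trained offline and can go stale.",
    "Graphene is a carbon-based material.",
]
prompt = build_grounded_prompt("Why does RAG help stale LLMs?", kb)
```

Because the retrieved facts are injected at query time, updating the knowledge base updates the model's answers without any retraining, which is exactly the post-training limitation the excerpt describes.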


Redefining Quantum Bits: The Graphene Valley Breakthrough

Because quantum information is much more prone than its classical counterpart to being corrupted by the surrounding environment – and thereby rendered unsuitable for computational tasks – researchers who study different qubit candidates must characterize their coherence properties: these tell them how well and for how long quantum information can survive in their qubit system. In most traditional quantum dots, electron spin decoherence can be caused by the spin-orbit interaction, which introduces an unwanted coupling between the electron spin and the vibrations of the host lattice, and by the hyperfine interaction between the electron spin and the surrounding nuclear spins. In graphene, as in other carbon-based materials, both spin-orbit coupling and hyperfine interaction are weak, which makes graphene quantum dots especially appealing for spin qubits. The results reported by Garreis, Tong, and co-authors add one more promising facet to the picture. ... The hexagonal symmetry observed in this so-called real space is also present in momentum space, where the vertices of the lattice correspond not to the spatial locations of carbon atoms but to values of momentum associated with the free electrons on the lattice.


5 Ways AI Can Make Your Human-To-Human Relationships More Effective

Understanding your audience is a major challenge for many business leaders. After all, if you knew what did or didn’t appeal to your audience, it would be much easier to speak to them in a meaningful, engaging way that sparks lasting connections. And AI can help here, too. This was illustrated to me during a recent conversation with James Webb, co-founder and CTO of Comb Insights, whose app uses proprietary AI to provide sentiment scores on comments on social media posts. "Using AI to quickly evaluate the overall sentiment of the comments on a post can give business leaders an immediate understanding of whether their content resonated with their audience,” he told me in an interview. “Seeing the ratio of positive to neutral or negative comments, and seeing the most common words that show up in the comments, can provide quick insights into why a post succeeded or failed. With this instant understanding of their audience, businesses can pivot in the type of social media content they produce so they can strengthen these important digital relationships.”
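As a rough illustration of the analysis Webb describes, the sketch below computes the positive-comment ratio and the most common words from a set of already-labeled comments. In practice the sentiment labels would come from a proprietary AI model; the function name and sample data here are invented for illustration.

```python
from collections import Counter

def comment_insights(comments: list[tuple[str, str]]) -> dict:
    """Summarize (text, sentiment) pairs, where sentiment is one of
    'positive', 'neutral', or 'negative' as assigned by an AI model."""
    sentiments = Counter(label for _, label in comments)
    words = Counter(
        word
        for text, _ in comments
        for word in text.lower().split()
    )
    total = len(comments) or 1  # avoid division by zero on an empty post
    return {
        "positive_ratio": sentiments["positive"] / total,
        "top_words": [w for w, _ in words.most_common(3)],
    }

comments = [
    ("love this product", "positive"),
    ("love the design", "positive"),
    ("not for me", "negative"),
    ("okay I guess", "neutral"),
]
insights = comment_insights(comments)
```

The ratio gives the at-a-glance "did this resonate" signal, while the common-word counts hint at why a post succeeded or failed.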


The missing link of the AI safety conversation

From a practical standpoint, the high cost of AI development means that companies are more likely to rely on a single model to build their product — but product outages or governance failures can then have a ripple effect. What happens if the model you’ve built your company on no longer exists or has been degraded? Thankfully, OpenAI continues to exist today, but consider how many companies would be out of luck if OpenAI lost its employees and could no longer maintain its stack. Another risk is relying heavily on systems that are probabilistic by nature. We are not used to this; the world we live in has so far been engineered and designed to function with definitive answers. Even if OpenAI continues to thrive, its models are fluid in terms of output, and the company constantly tweaks them, which means the code you have written to support them, and the results your customers rely on, can change without your knowledge or control. Centralization also creates safety issues. These companies work in their own best interest. If there is a safety or risk concern with a model, you have much less control over fixing that issue and less access to alternatives.


Intro to Digital Fingerprints

Digital fingerprinting is a technique used to identify users across different websites based on their unique device and browser characteristics. These characteristics, or fingerprint parameters, can include various software and hardware attributes (CPU, RAM, GPU, media devices such as cameras, mics, and speakers), as well as location, time zone, IP address, screen size and resolution, browser and OS languages, network and internet-provider details, and more. The combination of these parameters creates a unique identifier, or fingerprint, that can be used to track a user's online activity. Fingerprints play a crucial role in online security, enabling services to identify and authenticate unique users. They also make it possible for users to trick such systems in order to stay anonymous online. If you can manipulate your fingerprints, however, you can run tens, hundreds, or more accounts that pretend to be unique, authentic users. While this may sound cool, it has serious implications: it can enable an army of bots that spreads spam and fakes across the internet, potentially resulting in fraud.
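Conceptually, a fingerprint is just a stable hash over the collected attributes: same attributes in, same identifier out, and any changed attribute yields a different one. A minimal sketch, with the caveat that real fingerprinting libraries gather far more signals than the illustrative keys used here:

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Combine device/browser attributes into one stable identifier.
    Canonical JSON (sorted keys) makes the hash deterministic."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

fp_a = device_fingerprint({
    "user_agent": "Mozilla/5.0 ...",
    "screen": "1920x1080",
    "timezone": "UTC+5:30",
    "languages": ["en-US", "hi-IN"],
})
fp_b = device_fingerprint({
    "user_agent": "Mozilla/5.0 ...",
    "screen": "1920x1080",
    "timezone": "UTC+1",  # one changed attribute changes the whole fingerprint
    "languages": ["en-US", "hi-IN"],
})
```

This also shows why spoofing works: anyone who can control the reported attributes controls the resulting identifier, which is exactly how one machine can masquerade as many "unique" users.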


Looking at a data-driven financial future for India

In the intricate landscape of financial services, managing vast data, complex silos, and strict compliance demands a strategic solution. A hybrid data mesh is an innovative approach to financial operations that brings flexibility and coherence. This method combines a distributed architecture with a single source of truth (SSOT), ensuring accurate, secure, and compliant data handling. Data distribution across systems and functions facilitates quick insights while adhering to quality and privacy standards. The hybrid data mesh concept integrates the advantages of a distributed architecture tailored to domain-specific data with the SSOT, providing enhanced flexibility and scalability. This fusion ensures data coherence and accuracy while allowing domain independence, reinforcing security, and streamlining traceable and auditable compliance. Predictive models can be tailored to specific products or customer segments by harnessing AI and ML tools, enhancing decision-making in a dynamic market. This streamlined approach identifies growth opportunities and nurtures a culture of adaptability and innovation.


L&D trends that will define 2024

AI-assisted coding/software development employs AI to help write and review code. The technology’s potential to help new developers improve their code and save time is valuable. The edtech sector, in particular, will employ AI to create customised learning experiences, in addition to using tools that offer instant feedback on code. We could be looking at automating assessments for unbiased, error-free evaluations. Manually identifying personalised learning journeys for numerous individuals is time-consuming and extremely difficult; AI-assisted coding can help solve this operational challenge. Soon, we’ll give users quick, accurate responses and allow them to accelerate their learning journeys. ... Organisations will focus on data-driven, business-aligned learning initiatives for specific job-role competencies, in order to quantify L&D impact by tracking employee metrics such as job performance, efficiency, engagement, and satisfaction in new ways. When properly implemented, the accumulated data can raise confidence levels among higher-ups and lead to sustained investment in training practices. Organisations can also analyse the information to identify areas of positive impact and concentrate their L&D efforts there for consistently better outcomes.


New Guidance Urges US Water Sector to Boost Cyber Resilience

"Cyber threat actors are aware of - and deliberately target - single points of failure," the guidance states. "A compromise or failure of a water and wastewater sector organization could cause cascading impacts throughout the sector and other critical infrastructure sectors." The incident response guide aims to provide organizations with best practices for all four stages of the incident response life cycle - from preparation through detection, recovery and post-incident activities. The guidance says "the cyber incident reporting landscape is constantly evolving" and encourages water sector officials to review their reporting obligations and "consider engaging in additional voluntary reporting and/or information sharing" measures. Eric Goldstein, CISA's executive director for cybersecurity, said in a statement announcing the joint guidance that the U.S. water and wastewater sector "is under constant threat from malicious cyber actors." "In the new year, CISA will continue to focus on taking every action possible to support 'target-rich, cyber-poor' entities like WWS utilities by providing actionable resources and encouraging all organizations to report cyber incidents," he said.


Banking at the Precipice: Navigating the Fifth Industrial Revolution

As retail banking stands amid the Fourth Industrial Revolution’s digital transformation, leaders now must prepare for an imminent Fifth Industrial Revolution poised to profoundly reshape markets and experiences. Defined by extreme personalization, mass customization and precision augmentation, the emerging revolution’s exact disruptions remain somewhat undefined. Yet advancements in generative artificial intelligence, ambient interfaces and hyper-connectivity hint at consumer-in-command days ahead. ... Most of these Fifth Industrial Revolution financial applications seem unimaginable today. Imagine augmented live views layering physical surfaces like a retail store, billboard or car dealership with tailored offers based on persona identification and real-time transactional and behavioral data. Moving further, imagine a ‘digital twin agent’ seamlessly negotiating a personalized deal or pre-approved financing instantly. In this world, augmented and mixed reality interfaces, bridging physical and virtual worlds, will be able to move money experiences from transactions to value-based propositions based on where your eyes focus and engagements you have had in the past.


How generative AI is changing entrepreneurship

Entrepreneurs are expected to do a wide range of time-consuming tasks, from writing emails and answering phone calls to orchestrating product demonstrations and coding a website. “AI does all of those things well,” Mollick said. “It lets you focus more on what your top skill is, and it kind of handles everything else.” Generative AI can also serve as a guide. “A third of Americans have a business idea that they haven’t acted on because they don’t know what to do next,” Mollick said. “The AI can tell you what to do next, help you write the emails, [and] help you build the product.” Mollick noted that users should be aware of the benefits and limitations of the technology. “It’s kind of like an intern who wants to make you happy and therefore lies a lot and is kind of naive [and] never admits that they made a mistake,” he said. “Once you think about [AI] that way, you end up in much better shape.” Generative AI is a new general-purpose technology — one that comes around once in a generation and touches just about everything humans do, Mollick said, like electricity, computers, and the internet have. For entrepreneurs, generative AI can assist with researching ideas, coming up with logos and names, creating a website, and more, Mollick said.



Quote for the day:

"Leadership is not about titles, positions, or flow charts. It is about one life influencing another." -- John C. Maxwell

Daily Tech Digest - January 20, 2024

CISOs Struggle for C-Suite Status Even as Expectations Skyrocket

In many instances, CISOs who want clear risk guidance from their board don't get it. Barely more than one-third (36%) described their board as offering them clear enough insight into their organization's risk tolerance levels for them to act upon. "The evolution of the CISO role over the past few years has accelerated dramatically," says Nick Kakolowski, research director at IANS. With organizations digitizing more of their operations, CISOs are taking on more responsibilities and have become de facto owners of digital risk, he says. "[But] organizations haven't figured out how to support and empower them as the scope of the role grows." Concerns have been growing within the CISO community in recent years about the escalating expectations around the role, even as their ability to meet those expectations has remained largely unchanged. Incidents such as the SEC charging SolarWinds CISO Tim Brown last October with fraud and internal control failures over the 2020 breach at the company, and a judge sentencing former Uber CISO Joe Sullivan to three years of probation over a 2016 breach, have fueled those concerns.


Three of four CISOs ready for job change

“Satisfaction has been rising consistently for the past few years, but last year, it dipped,” says IANS Research Director Nick Kakolowski. “Last year, the pressure on CISOs ratcheted up big time with the new SEC rules and CISOs being held personally liable for breaches. ... “The environment surrounding CISOs is extremely turbulent right now, and their individual exposure to lawsuits is at an all-time high. CISOs face a real danger of being indicted or sued for things outside of their control,” adds Patrick “Pat” Arvidson, chief strategy officer for Interpres, a maker of a threat-informed defense surface management platform. ... Another finding in the report is that CISOs aren’t getting the facetime with boards that they need. Eighty-five percent of CISOs in the survey indicated their board should offer clear guidance on their organization’s risk tolerance for the CISO to act on, but only 36% found that to be the case. “We are seeing some boards figuring this out and being effective there, but across the board, there’s either a lack of visibility at the board level—CISOs aren’t consistently reporting to the board—or CISOs and boards haven’t figured out how to speak each other’s language,” Kakolowski says.


How Accelerated Adoption of a Data Governance Framework Helped a Large Financial Services Organization Build a Snowflake Data Vault

The Domain Working Group meetings were instrumental in helping both business stakeholders and technology developers walk through examples and requirements for merging sometimes incomplete, inaccurate, and inconsistent data from 3 sources into a single complete, accurate, and consistent golden record. As business stakeholders started to understand the savings in time spent querying 3 data sources, reconciling and explaining differences between sources, and deciding which data is most trusted, and also started to see the benefits of having a single authoritative view of their domain data, enthusiasm for the Data Vault initiative increased. Embedding data governance practices and tools by creating a data governance workstream within a business or technology project is one of many approaches an organization can take to expand or accelerate engagement, adoption, and implementation of a data governance program. The success of this Data Vault project was partially attributed to the established data governance framework and team, but the biggest benefit was the adoption of data governance by dozens of previously unaware employees through exposure to the data governance program and witnessing real-life benefits of active end-to-end data governance made part of their everyday job responsibilities.


Putting a Number on Bad Data

Several quantifiable metrics can serve as a starting point for evaluating the cost of bad data, including the rate of occurrence or number of incidents per year, time to detection, and time to resolution. ... Number and frequency of incidents: While some companies may experience data incidents on a daily basis, others may go days – if not weeks – without one. The criticality of the incidents can vary from something “minor,” such as stale data linked to a dashboard that nobody has used in ages, to a data duplication problem causing the server to overload and ultimately go down. ... Mean time to resolution (MTTR): What happens once an incident is reported? MTTR is the average time spent between becoming aware of a data incident and resolving it. The resolution time is greatly influenced by the criticality of the incident and the complexity of the data platform, which is why we are considering the average for the purpose of this framework. ... Mean time to production (MTTP) is the average time it takes to ship new data products or, in other words, the average time to market for data products. This could be the time spent by an analyst “cleaning” the data for a data science model.
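The metrics above can be combined into a rough annual cost figure. The formula and the cost rate below are an assumed framing for illustration, not figures taken from the article:

```python
def bad_data_cost(incidents_per_year: int,
                  mean_time_to_detect_hrs: float,
                  mean_time_to_resolve_hrs: float,
                  engineering_cost_per_hr: float) -> float:
    """Rough annual cost of bad data from the article's metrics:
    incident count, time to detection, and time to resolution.
    The hourly cost rate is an assumed input."""
    hours_per_incident = mean_time_to_detect_hrs + mean_time_to_resolve_hrs
    return incidents_per_year * hours_per_incident * engineering_cost_per_hr

# e.g. 50 incidents/yr, 4h to detect, 6h to resolve, $100 per engineer-hour
annual_cost = bad_data_cost(50, 4.0, 6.0, 100.0)
```

Even this crude model makes the levers visible: halving time to detection or time to resolution cuts the bill proportionally, which is why those two averages are worth tracking separately.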


Microservices Architecture: Navigating the Buzz

Despite the apparent advantages, there are various challenges that I think are important to highlight. Worth noting is that they are all avoidable when considered and planned for upfront. A common reason why teams end up sticking with a traditional monolithic approach is that microservices bring increased complexity. This complexity comes in the form of teams needing to understand how to design, build, and manage distributed systems. More specifically, not knowing how to implement a reliable communication protocol between microservices is a recurring pain point that leads to decreased system performance and, in turn, has teams switching back to their monolithic system. The increased number of interactions between services also makes system testing and debugging considerably harder. Beyond these difficulties, another major concern when considering microservices is security: implementing robust authentication, authorization, and encryption across each and every service is crucial.
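One basic building block of the "reliable communication" the paragraph alludes to is bounded retries with exponential backoff around every cross-service call. A minimal sketch with a simulated flaky downstream service (names invented for illustration); a production setup would add timeouts, jitter, and circuit breaking:

```python
import time

def with_retries(call, attempts=3, base_delay=0.1):
    """Retry a cross-service call with exponential backoff,
    re-raising the error once the attempt budget is exhausted."""
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Simulated flaky downstream service: fails twice, then succeeds.
state = {"calls": 0}
def flaky_inventory_service():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("inventory service unavailable")
    return {"sku": "A-42", "stock": 7}

result = with_retries(flaky_inventory_service)
```

Without this kind of discipline, a single transient network blip cascades through every dependent service, which is precisely the performance pain point that sends teams back to the monolith.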


Attribute-based encryption could spell the end of data compromise

The history of ABE goes back to a ground-breaking 2005 paper titled “Fuzzy Identity-Based Encryption.” Fifteen years later, recognizing the paper’s significance, the International Association for Cryptologic Research (IACR) gave it a 2020 Test of Time Award. One of its co-authors, Dr. Brent Waters, later said the paper has had a three-fold impact. First, there has been the concept of ABE as its own application with distinctive new use cases, several of which are discussed below. Second, the cryptographic research community not only has spent years studying ABE, but also used ABE as a building block, leveraging it to obtain new results in work on other problems. Third, according to Dr. Waters, the work in ABE “inspired us to rethink encryption in even bigger and grander ways.” One such outgrowth has been functional encryption, which allows a user to learn only a function of a data set. For ABE, the end goal is fine-grained access to the data itself. On its own, that’s a revolution. An ABE scheme can provide the right user with a key to very specific data. Not to an entire file cabinet, so to speak, but to a single line item within a category of filed documents.
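The "fine-grained access" idea can be illustrated with a toy policy check. To be clear, this is not the cryptography: in real ABE the policy is enforced mathematically, so a key whose attributes don't satisfy the ciphertext's policy simply fails to decrypt. The policy shape and attribute names below are invented purely to show the access-structure concept:

```python
def policy_allows(user_attrs: set[str], policy: dict) -> bool:
    """Toy evaluation of an attribute policy of the form
    {'all_of': [...], 'any_of': [...]}. In ABE this logic lives
    inside the ciphertext, not in application code."""
    all_of = policy.get("all_of", [])
    any_of = policy.get("any_of", [])
    ok_all = all(a in user_attrs for a in all_of)
    ok_any = (not any_of) or any(a in user_attrs for a in any_of)
    return ok_all and ok_any

# "Single line item" access: must be in finance, and be an auditor or manager.
line_item_policy = {"all_of": ["finance-dept"], "any_of": ["auditor", "manager"]}
```

The key difference from ordinary access control lists is that there is no server making this decision at read time; the data can sit anywhere, and only attribute-matching keys can open it.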


The Cashless Future: Convenience Versus Privacy and Freedom

While convenience reigns, privacy and governmental control remain crucial considerations. Financial inclusion must also be championed, ensuring everyone has access to secure and equitable payment methods. This transition demands careful navigation, balancing innovation with the principles of trust and individual freedom. The spectre of Central Bank Digital Currencies (CBDCs) further fuels the debate. Some fear a dystopian future controlled by governments with direct access to digital money. However, this ignores the historical evolution of ethics, regulations and frameworks. Laws like the Ten Commandments and the Magna Carta, enacted in our tribal and agrarian past, have evolved alongside society, forming the cornerstone of trust and control within our modern, interconnected world. We willingly relinquish information through regulatory KYC and AML practices in exchange for the security and transparency of banks. Similarly, could CBDCs provide a trusted digital foundation without succumbing to the anxieties of overreach? Perhaps the future holds not a revolution, but an evolution. A landscape where a foundational digital currency, overseen by central banks, coexists with diverse ecosystems like Disney coins or Amazon credits.


Harnessing the Power of Diverse Data with Fern Halper

Halper began with a quick overview of what constitutes diverse data. “Diverse data is pretty much just what it sounds like -- data in formats other than structured data,” she said. “This includes unstructured and semistructured data (for example, XML and JSON) and data from different sources (such as social media and IoT devices).” She explained that this diverse data is becoming more important as companies seek any way to compete better in their markets. “For example, a company can use the unstructured or semistructured data from their call center interactions to better predict when a customer might churn.” This diverse data can also be used to uncover hidden insights, make better predictions, and otherwise better respond to what’s happening on the ground, she added. “This reflects what we saw in the survey, which was that the primary driver for using diverse data, cited by 53% of respondents, was to better understand customers. This was followed by use cases related to operational efficiency, which were cited by 43%.” The conversation then turned to the subject of how organizations were managing all this data.


Deprecated npm packages that appear active present open-source risk

The problem is probably much worse because Aqua only checked direct dependencies, not transitive ones as well — the dependencies of dependencies. The dependency chain for npm packages can go many levels deep, and not accounting for this is a common reason why vulnerable code might make it into projects undetected. “This situation becomes critical when maintainers, instead of addressing security flaws with patches or CVE assignments, opt to deprecate affected packages,” the Aqua researchers said in their report. “What makes this particularly concerning is that, at times, these maintainers do not officially mark the package as deprecated on npm, leaving a security gap for users who may remain unaware of potential threats.” ... The npm repository does give package maintainers the option of marking packages as deprecated, which appears as a warning to users visiting the page. They can also include a note for users with additional information, such as alternatives. This can be considered official deprecation. However, other signs can indicate that a project is dead even if it doesn’t have a big warning on it.
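One way to surface official deprecations programmatically is to inspect a package's registry metadata (the JSON served at registry.npmjs.org/&lt;package&gt;), where deprecated versions carry a `deprecated` notice; `npm view <package> deprecated` exposes the same field from the command line. A sketch over fabricated sample metadata:

```python
def deprecated_versions(registry_metadata: dict) -> dict[str, str]:
    """Collect the 'deprecated' notices, keyed by version, from npm
    registry metadata. An empty result means no official deprecation,
    which, as the article notes, does not prove the project is alive."""
    return {
        version: info["deprecated"]
        for version, info in registry_metadata.get("versions", {}).items()
        if "deprecated" in info
    }

# Fabricated sample of the registry response shape:
sample = {
    "name": "some-old-package",
    "versions": {
        "1.0.0": {"deprecated": "no longer maintained, use other-package"},
        "2.0.0": {},
    },
}
notices = deprecated_versions(sample)
```

To catch the transitive case the article warns about, such a check would need to walk the full resolved dependency tree (e.g. the lockfile), not just the direct dependencies in package.json.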


How CEOs can mitigate compounding risks

Leaders should instruct their risk management functions to broaden the aperture on the risk scenarios they monitor to include compounding risks. For example, once risk managers have identified the top risks to the business, they often create an enterprise-level risk management map. Instead, the team should consider how and which individual risks could combine to create a new compounding risk, with particular focus on risks that may be minor individually but have high frequency (IT outages, for instance). Looking at the business through the lens of the customer rather than through product offerings can help risk managers see small but recurring friction points that could cause customers to leave. ... Many compounding risks stem from trends with long-term time horizons such as climate change, market or business model innovations, or changing consumer behaviors. These risks tend to build slowly until they hit the tipping point of becoming existential for the organization. A horizon planning approach can help management teams address risks that can emerge at various stages by looking at three horizons: first, maintaining and defending the core business; second, nurturing emerging businesses; and third, creating genuinely new businesses.



Quote for the day:

"Whenever you see a successful business, someone once made a courageous decision." -- Peter F. Drucker

Daily Tech Digest - January 19, 2024

SolarWinds VP Offers 2024 Predictions on AI

As CIOs are either in the process of implementing AI into observability efforts, or at the very beginning stages, Stewart says data hygiene and management is going to be a key factor. “One of the key components is really understanding where you’re at on that observability journey,” he says. “There are a lot of disparate tools and different observability offerings that may be very segmented … The key is having the full data set across that stack that allows the AI technology to leverage that data, because if the engines don’t have the data across the stack, then there’s going to be parts of the puzzle that are missing, and AI is just not going to be able to accommodate.” ... “IT budgets aren’t getting bigger,” Stewart says. “And in many cases, budgets are shrinking based on concerns with the economy. Folks are looking for ways to save and some of that will certainly come through automation and efficiencies. And some of that will come through tool and vendor consolidation. The ability to leverage various AI technologies is certainly something that people are interested in to realize those efficiencies.”


Beware of hidden cloud fees

Fees complicate cost management, especially when transferring data across different cloud platforms, as is typical for multicloud deployments. Various factors such as location, geography, and data type can also significantly affect the size of these charges. Egress charges, levied for data transferred out of a cloud service provider’s network, are now a hot-button issue, even though they’ve been a part of cloud bills for years. High egress charges can inflate operational costs and discourage organizations from transitioning between cloud providers or moving their data to more cost-effective alternatives, even back to their own data centers. As one of my clients put it, they feel their data is being held for ransom. ... Of course, many are looking to the cloud providers charging these fees to fix the issue. The providers may not be legally obligated to remove these fees, but they are listening to cloud users and have taken steps to reduce egress charges. Many enterprises are questioning their need to be in the cloud in the first place and could move to other platforms if costs are too high. Much of the repatriation that’s occurring is purely about cost. All things being equal, companies would rather stay in the cloud, and relief from these fees could keep some of them on the public cloud providers.
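Part of what makes egress bills hard to predict is that the pricing is typically tiered by volume. A sketch of a tiered calculation; the tier sizes and rates below are hypothetical, not any provider's actual price list:

```python
def egress_charge(gb_transferred: float,
                  tiers: list[tuple[float, float]]) -> float:
    """Compute a tiered egress bill. Each tier is (gb_in_tier,
    price_per_gb), consumed in order until the transfer is covered."""
    remaining, total = gb_transferred, 0.0
    for tier_gb, price in tiers:
        used = min(remaining, tier_gb)
        total += used * price
        remaining -= used
        if remaining <= 0:
            break
    return total

# e.g. first 10 TiB at $0.09/GB, next 40 TiB at $0.085/GB (hypothetical rates)
bill = egress_charge(15_000, [(10_240, 0.09), (40_960, 0.085)])
```

Running a planned migration's data volume through a model like this, before committing to a provider, is one way to avoid the "held for ransom" surprise the article describes.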


New study urges industry to address generational division in tech skills

As artificial intelligence becomes increasingly common across industries, experts are urging companies to address the gaps to sustain organisational capability. “Technology is transforming organisations – faster and more diverse than ever. Communication, collaboration, financial savings, productivity and security are underpinning these shifts and forming the catalyst for change,” said Greg Weiss, an HR consultant, onboarding expert, and the founder of Career365. Capterra’s study identified three primary challenges that hinder the speed of digital transformation: the usage gap among employees, limited access to resources or training, and the constant introduction of new tools making it difficult to adapt. The research also revealed that while millennials are naturally inclined to digital tools (87 per cent), baby boomers and Generation Z are almost equally drawn to new technology (85 per cent). “The appetite is definitely there. It’s a matter of how these employees are facilitated and bridging the digital generation gap is crucial. A cookie-cutter approach to training and support doesn’t work in a divergent workforce – as their alignment differs,” Weiss said.


The Case for (and Against) Monorepos on the Frontend

Monorepos aren’t just for enterprise applications and large companies like Google, Savkin said. As it stands now, though, polyrepos tend to be the most common approach, with each line of business or functionality having its own repo. Take, for example, a bank. Its website or app might have a credit card section and an auto loan section. But what if there needs to be a common message, function or even just a common design change across the divisions? A polyrepo setup makes that harder, he said. “Now I need to do a coordination thing with team A, team B,” Savkin said. “In a polyrepo case, it can take many months.” In a monorepo, it’s easy to make that one change in as little as a day, he added. It also enables sharing components and libraries across development teams. Monorepos helped Jotform, an online forms company based in San Francisco, reduce its technical debt on the frontend, according to frontend architect and engineering director Berkay Aydin, who wrote last week about the company’s move to a monorepo for the frontend. “We don’t have multiple configs or build processes anymore,” Aydin wrote. “Now, we’re sure every application is using the same configurations.”
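The shared-config point can be made concrete with a minimal npm-workspaces layout; the repository, app, and package names below are invented to mirror the bank example:

```json
{
  "name": "bank-frontend",
  "private": true,
  "workspaces": [
    "apps/credit-cards",
    "apps/auto-loans",
    "packages/shared-ui"
  ]
}
```

Both apps resolve `packages/shared-ui` locally from the same repository, so a design change there, plus any build-config change at the root, reaches every division in a single commit rather than a months-long cross-team coordination effort.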


Enterprises struggle with Agile methodology, reports long-standing survey of practitioners

According to the report, Agile is most successful at small companies. “Those in small companies are more likely than those in medium and large ones to say they are satisfied [with Agile],” the report states, and “74 percent of small companies (versus 62 percent at large companies) said at least 50 percent of their applications were delivered on time and with quality”. A key problem, which will not be a surprise to developers, is that “the business side is very slow to embrace Agile. Almost half of survey takers pointed to a generalized resistance to organizational change or culture clash as the reasons why the business side is not embracing Agile, up 7 points from 2022.” ... Scrum is a specific Agile methodology and is used by 63 percent of Agile teams, according to the report, which also states that Scrum has been the most popular Agile methodology since 2006, when the survey was first conducted. That said, even Scrum has many variants, and the survey states that “the Agile landscape continues to be very fragmented.” 22 percent of survey respondents said that “we don’t follow a mandated framework” and 12 percent that “we created our own enterprise Agile framework.”


The OWASP AI Exchange: an open-source cybersecurity guide to AI components

In the context of AI systems, OWASP’s AI Exchange discusses development-time threats in relation to the development environment used for data and model engineering, outside the regular application development scope. This includes activities such as collecting, storing, and preparing data and models, and protecting against threats such as data leaks, poisoning, and supply chain attacks. Specific controls cited include development data protection using methods such as encrypting data at rest, implementing access control to data, including least-privileged access, and implementing operational controls to protect the security and integrity of stored data. Additional controls cover development security for the systems involved, including the people, processes, and technologies. This includes implementing controls such as personnel security for developers and protecting source code and configurations of development environments, as well as their endpoints, through mechanisms such as virus scanning and vulnerability management, as in traditional application security practices. Compromised development endpoints could affect development environments and the associated training data.


NIST Offers Guidance on Measuring and Improving Your Company’s Cybersecurity Program

The publication is designed to be used together with any risk management framework, such as NIST’s Cybersecurity Framework or Risk Management Framework. It is intended to help organizations move from general statements about risk level toward a more coherent picture founded on hard data. “Everyone manages risk, but many organizations tend to use qualitative descriptions of their risk level, using ideas like stoplight colors or five-point scales,” said NIST’s Katherine Schroeder, one of the publication’s authors. “Our goal is to help people communicate with data instead of vague concepts.” Achieving this goal, according to the authors, involves moving from qualitative descriptions of risk — perhaps using broad categories such as high, medium or low risk level — to quantitative ones that carry less ambiguity and subjectivity. An example of the latter would be a statement that 98% of authorized system user accounts belong to current employees and 2% belong to former employees. The team developed the new draft guidance partly in response to public requests and feedback from a pre-draft call for comment. 
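The report's example of a quantitative statement can be reproduced in a few lines; the account data below is hypothetical, constructed to match the 98%/2% split in the text.

```python
# Illustrative sketch: turning a raw account inventory into the kind of
# quantitative metric the NIST draft describes, e.g. "98% of authorized
# system user accounts belong to current employees."

def account_ownership_metric(accounts):
    """accounts: list of (username, is_current_employee) pairs.
    Returns the percentage of accounts held by current employees."""
    total = len(accounts)
    current = sum(1 for _, is_current in accounts if is_current)
    return round(100 * current / total, 1)

# Hypothetical inventory: 98 current employees, 2 former employees.
accounts = [(f"user{i}", i < 98) for i in range(100)]
print(account_ownership_metric(accounts))  # 98.0
```

A metric like this is unambiguous in a way that a "medium risk" label is not, which is the shift the publication advocates.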


What is credential stuffing and how can I protect myself? A cybersecurity researcher explains

Credential stuffing is a type of cyber attack where hackers use stolen usernames and passwords to gain unauthorised access to other online accounts. In other words, they steal a set of login details for one site, and try it on another site to see if it works there too. This is possible because many people use the same username and password combination across multiple websites. It is common for people to use the same password for multiple accounts (even though this is very risky). Some even use the same password for all their accounts. This means if one account is compromised, hackers can potentially access many (or all) of their other accounts with the same credentials. ... The best way is to never reuse passwords across multiple sites or apps. Always use a unique and strong password for each online account. Choose a password or passphrase that is at least 12 characters long, complex, and hard to guess. It should include a mix of uppercase and lowercase letters, numbers, and symbols. Don’t use pet names, birthdays or anything else that can be found on social media. You can use a password manager to generate unique passwords for all your accounts and store them securely.
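The generation step a password manager performs can be sketched with Python's `secrets` module. The length and character-class policy below follow the article's advice; the function name is illustrative.

```python
# Generate a strong random password meeting the article's policy:
# 12+ characters with upper, lower, digits and symbols mixed in.

import secrets
import string

def generate_password(length: int = 16) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Resample until every required character class is present.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in string.punctuation for c in pw)):
            return pw

print(generate_password())  # different random output on every run
```

`secrets` draws from the operating system's cryptographic random source, unlike the `random` module, which is predictable and unsuitable for passwords.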


54% of data fiduciaries lack experience in enforcing data protection laws

The research findings are based on the provisions of India's Digital Personal Data Protection (DPDP) Act, which was enacted in August 2023. The rules for the Act are yet to be released for public consultation. The findings are part of research carried out by the think tank Esya Centre in a report titled "An Empirical Evaluation of the Implementation Challenges of the Digital Personal Data Protection Act 2023: Insights and Recommendations for the Way Forward." It draws on insights from 16 industry stakeholders: 13 data fiduciaries and three experts. "India has come a long way from the early iterations of the Data Protection Bill to the enactment of the Digital Personal Data Protection Act, 2023. The decision to eschew localization requirements and a compliance-heavy framework heralds a commitment to a progressive framework. It is now time to ensure that the prospective rules maintain the forward-thinking approach underpinning the parent Act and preserve a compliance-light data protection regime in the country," said Meghna Bal, Head of Research, Esya Centre.


Navigating The 'Fog Of A Cyberattack': Critical Lessons In Governance From The SEC Cybersecurity Rule

The short breach notification timeline attached to the SEC’s new cybersecurity disclosure rule is loud and clear: C-Suite leaders and boards have important work to do in ensuring their organizations can quickly identify, understand and publicly disclose material cybersecurity events and impacts. In this case, the expression “fog of war” is a useful analogy for understanding a critical complication. The term recognizes that many factors on which action in war is based are “wrapped in a fog of greater or lesser uncertainty.” The fog of a cyber event will similarly make the four-business-day timeline incredibly challenging. ... Instead of making battle plans mid-crisis, prepare now, establishing how incidents are identified, how reports get written and who’s responsible for determining materiality. Create rough boundaries for evaluating materiality (e.g., questions to ask, example incidents) to make decisions as clear as possible. Incomplete information is better than no information. You may not have a complete picture to share publicly, and that’s okay. But when you do your initial disclosure, establish when your next update will be shipped. 



Quote for the day:

“The more you lose yourself in something bigger than yourself, the more energy you will have.” -- Norman Vincent Peale

Daily Tech Digest - January 18, 2024

A tougher balancing act in 2024, the year of the CISO

What’s making things more difficult for CISOs? The ESG/ISSA data indicates that business aspects of running a cybersecurity program like working with the board, overseeing regulatory compliance, and managing a budget are primary contributing factors. This makes sense as the CISO role has evolved from technical overseer to business executive over the past few years. At the same time, organizations have increased their dependence on IT for automation, optimization, customer service, and digital transformation. ... Like their non-CISO colleagues, CISOs are particularly stressed by things like an overwhelming workload, working with disinterested business managers, and keeping up with the security requirements of new business initiatives. It’s worth noting that 26% of CISOs are also stressed about monitoring the security status of third parties their organization does business with, compared with 12% of non-CISOs. Third-party relationships are often associated with business processes and therefore tied closely with business units. Unfortunately, security teams probably don’t have deep visibility into the day-to-day security performance at these firms.


How do agile and DevOps interrelate?

Agile and DevOps have much in common. In fact, DevOps grew as an offshoot (or improvement) of agile, as many industry leaders found dysfunction in IT and software development. While incremental improvements support quality products, the competing objectives of individual IT workers lowered overall performance. To remedy the problem, developers proposed the more integrated approach of DevOps. Of course, the new philosophy offered different core values, which caused a split between two opposing communities. Developers grappled with what looked like conflicting philosophies, leading to the most common misconception about agile and DevOps: that they don’t interrelate. On the surface, the pundits have much to draw on. DevOps engineers focus on software scalability, speed, and team integration. Agile focuses on the slower, iterative process of software development, with more emphasis on continuous testing. More importantly, Agile silos individuals while DevOps integrates. Without the operability of DevOps, infrastructure responsibility falls to the wayside. But without the basic building blocks of the incremental, customer-focused method of agile, DevOps has no fundamental processes on which to stand.


AI Fraud Act Could Outlaw Parodies, Political Cartoons, and More

So just how broad is this bill? For starters, it applies to the voices and depictions of all human beings "living or dead." And it defines digital depiction as any "replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or part using digital technology." Likeness means any "actual or simulated image… regardless of the means of creation, that is readily identifiable as the individual." Digital voice replica is defined as any "audio rendering that is created or altered in whole or part using digital technology and is fixed in a sound recording or audiovisual work which includes replications, imitations, or approximations of an individual that the individual did not actually perform." This includes "the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital means, technology, service, or device." These definitions go way beyond using AI to create a fraudulent ad endorsement or musical recording. They're broad enough to include reenactments in a true-crime show, a parody TikTok account, or depictions of a historical figure in a movie.


Navigating digital transformation in insurance sector: Challenges, opportunities and innovations

Notable developments include the changes regulators have introduced in cybersecurity and information security management, in managing connectivity with websites, vendors, and employees, and in the digital transformation they have pushed us toward. Today almost everything is handled digitally: an employee meets a customer and the customer fills in the form digitally, with little of the manual form-filling that nonetheless persists in many parts of the country and at many companies. That said, digital adoption has risen markedly. When everything is handled digitally, you have to think through what controls you could have, and employees are now forced to do exactly that. For example, we have introduced OTPs, so customers must confirm each step with a one-time password. For a policy I bought last week, I had to enter six OTPs with that company. I wondered why so many were required, but when I looked at how the salesperson handled the process, it was effective and efficient, and all of it is for the safety of the customer; that thought process is given to the customer.


Productivity Paradox: Productivity in the Age of Knowledge Work

We sometimes forget that every employee within an enterprise is, at their core, also a consumer. Their personal preferences, shaped by daily interactions with personal technology, inevitably spill over into their professional lives. Consequently, the ubiquity of Macs, iPhones and iPads in the consumer market has sparked a growing demand for these devices in the workplace. This shift has only been hastened by the “Bring Your Own Device” (BYOD) movement, wherein employees sought to use their trusted personal devices for professional tasks, yearning for the familiarity and ease of use they’ve grown accustomed to. Instead of resisting this tide, more and more forward-thinking enterprises are instead leaning in. For one, IT leaders have recognized that specific hardware platforms matter less these days as they shift more of their applications to the public cloud. Reliability is another major factor, especially for remote employees who don’t have an IT helpdesk at their disposal. And when our survey respondents were asked if they agreed or disagreed with the statement “Apple takes enterprise security, compliance, and privacy concerns more seriously than other vendors,” three-quarters of them concurred.


6 hot networking and data center skills for 2024

“The current rage in AI technology is more than just a fad,” Leary says. “It is delivering real measurable benefits to IT organizations and the businesses they serve. And there is no sign of slowdown on the horizon. Driven by pressing capacities and costs, physical data center designs are changing significantly” with generative AI. Organizations need IT staffers who can help assure that generative AI is provided the data, data processing, and data exchanges needed to deliver on its promise, Leary says. ... Knowledge about cyber security products and services—as well as the threats they guard against—never goes out of fashion. Organizations are facing a barrage of threats against their networks and data centers, so finding people with related skills remains a high priority. “Companies are constantly having to pay attention to their security as more and more cyber attacks happen,” Vick says. “That is something that is not slowing down, so they are having to update their firewalls and other security features.” Enterprises are building out their security teams, in some cases looking for people to update their security posture with a variety of technologies.


Navigating data management modernisation to deliver the AI-ready tech stack of the future

Forward-thinking IT leaders already see a direct correlation between modernising the data management journey across the entire tech stack and facilitating the extraction of value from data at the speed and scale needed for real-time intelligence. The same Alteryx research suggests that digital transformation relating to AI and machine learning will be the number one characteristic of the future enterprise, and tech stack priorities are already shifting to reflect this. Generative AI, quantum computing and machine learning operations (MLOps) are cited as the technologies most likely to see the largest shift in accelerated adoption in the future. While it only seems a short time since the pandemic forced many organisations to accelerate transformation at breakneck speeds, the rise of AI-related technologies will reinfuse these transformations. Why? Because AI has lowered the barrier to delivering productivity gains by delivering data-driven insights with just a sentence or a prompt. With countless data-oriented AI technologies and intelligent systems already available, the ultimate goal of this transformation is to modernise the data management journey across the entire stack.


Continuous Quality Assurance: Strategizing Automated Regression Testing for Codebase Resilience

In QA software testing, automated regression processes can autonomously identify unexpected behaviors or regressions in the software. ... End-users anticipate a consistent and dependable performance from software, recognizing that any disruptions or failures can profoundly affect their productivity and overall user experience. The implementation of regression testing proves invaluable in identifying unintended consequences, validating bug fixes, upholding consistency across versions, and securing the success of continuous deployment. Through early identification and resolution of regressions, development teams can proactively safeguard against issues reaching end-users, thereby preserving the quality and reliability of their software. ... Automated regression testing can be strategized based on the complexity of the codebase for approaches like retesting everything, selective re-testing, and prioritized re-testing. Tools such as Functionize, Selenium, Watir, Sahi Pro, and IBM Rational Functional Tester can be used to automate regression testing and improve efficiency.
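One of the approaches mentioned, prioritized re-testing, can be sketched as ranking tests by historical failure rate per unit of runtime and then filling a fixed time budget. All names and numbers below are hypothetical, and real tools use richer signals (code coverage, recent changes) than this two-field heuristic.

```python
# Minimal sketch of prioritized regression-test selection: run the tests
# most likely to catch a defect per second of runtime, within a budget.

from dataclasses import dataclass

@dataclass
class RegressionTest:
    name: str
    failure_rate: float  # historical fraction of runs that caught a defect
    runtime_s: float     # average runtime in seconds

def prioritize(tests, budget_s):
    # Rank by expected defects caught per second, then greedily fill the budget.
    ranked = sorted(tests, key=lambda t: t.failure_rate / t.runtime_s, reverse=True)
    selected, used = [], 0.0
    for t in ranked:
        if used + t.runtime_s <= budget_s:
            selected.append(t.name)
            used += t.runtime_s
    return selected

suite = [
    RegressionTest("checkout_flow", 0.20, 30.0),
    RegressionTest("login", 0.05, 5.0),
    RegressionTest("search", 0.01, 60.0),
]
print(prioritize(suite, budget_s=40.0))  # ['login', 'checkout_flow']
```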


Sustainable Partnerships Pay Dividends

The first step in establishing highly effective partnerships, von Koeller says, is to identify what an organization hopes to achieve and establish a roadmap for meeting specific goals and metrics. This requires a focus on shared values and coordination across departments and groups. “You have to understand your footprint, understand what environmental impact an action has, and how to engage suppliers to achieve alignment around your targets,” she explains. Open and honest communication among partners is vital. Too often, larger companies fail to understand what suppliers can do and what they can’t do, particularly when they’re located in faraway countries, or a supply chain has numerous layers. Partners downstream and upstream face their own set of challenges -- environmental, political, and practical -- that can make it difficult to conform to strict standards. A particularly daunting aspect -- especially for smaller firms supplying raw materials or specialized components -- is onerous data collection and reporting requirements, Linich notes. As a result, smaller companies may require funding and technical assistance from larger partners, including aid in setting up software and IT systems that support sustainability.


Get started with Anaconda Python

The Anaconda distribution is a repackaging of Python aimed at developers who use Python for data science. It provides a management GUI, a slew of scientifically oriented work environments, and tools to simplify the process of using Python for data crunching. It can also be used as a general replacement for the standard Python distribution, but only if you’re conscious of how and why it differs from the stock version of Python. ... The most noticeable thing Anaconda adds to the experience of working with Python is a GUI, the Anaconda Navigator. It is not an IDE, and it doesn’t try to be one, because most Python-aware IDEs can register and use the Anaconda Python runtime themselves. Instead, the Navigator is an organizational system for the larger pieces in Anaconda. With the Navigator, you can add and launch high-level applications like RStudio or JupyterLab; manage virtual environments and packages; set up “projects” as a way to manage work in Anaconda; and perform various administrative functions. Although the Navigator provides the convenience of a GUI, it doesn’t replace any command-line functionality in Anaconda, or in Python generally.



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kantor

Daily Tech Digest - January 17, 2024

Improving Supply Chain Security, Resiliency

Regulatory compliance plays a vital role in how cybersecurity strategies are built: Compliance mandates like GDPR and the NIST Cybersecurity Framework provide foundations for data protection, access control, and incident response. “With these baselines in place, organizations can ensure that there is a certain level of security across all supply chain partners, which reduces the overall risk landscape,” Bachwani says. “Compliance also fosters a culture of security, which drives continuous improvement.” He adds that the pressure to meet regulatory standards necessitates ongoing risk assessments, proactive risk management practices, and regular vulnerability patching, which prioritizes cybersecurity in decision-making. “Regulatory frameworks often come with heavy fines and reputational damage for those who do not comply,” Bachwani notes. “This incentivizes everyone within the supply chain to prioritize cybersecurity and invest in robust safeguards.” Christopher Warner, senior security consultant at GuidePoint Security, says regulatory frameworks often specify security controls and standards that organizations must follow.


Quantum entanglement discovery is a revolutionary step forward

This discovery opens the door to new quantum communication protocols, utilizing topology as a medium for quantum information processing. Such protocols could revolutionize how we encode and transmit information in quantum systems, especially in scenarios where traditional encoding methods fail due to minimal entanglement. In summary, the significance of this research lies in its potential for practical applications. For decades, preserving entangled states has been a major challenge. The team’s findings suggest that topology can remain intact even as entanglement decays, offering a novel encoding mechanism for quantum systems. Professor Forbes concludes with a forward-looking statement, saying, “We are now poised to define new protocols and explore the vast landscape of topological nonlocal quantum states, potentially revolutionizing how we approach quantum communication and information processing.” ... It’s a physical process where pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance.
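The definition in the final sentence is captured by the textbook two-qubit Bell state, which cannot be factored into independent states for particles $A$ and $B$:

```latex
\left|\Phi^{+}\right\rangle
  = \frac{1}{\sqrt{2}}
    \left( \left|0\right\rangle_A \left|0\right\rangle_B
         + \left|1\right\rangle_A \left|1\right\rangle_B \right)
```

Measuring either particle immediately fixes the outcome for the other, however far apart the two are, which is the nonlocal correlation the excerpt describes.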


Staffing levels: are data centers at risk of unnecessary outages?

As for whether there were sufficient staff onsite during the Microsoft outage, and what the optimal number of staff present should be, John Booth, Managing Director of Carbon3IT Ltd, and Chair of the Energy Efficiency Group of the Data Centre Alliance, says it very much depends on the design and scale of the data center, as well as on the level of automation for monitoring and maintenance. Data centers are also often reliant on outsourced personnel for specific maintenance and emergency tasks, who typically offer a four-hour response time. Beyond this, he suggests more information is needed to determine whether 7 staff were sufficient, but admits that 3 members of staff are usually the norm for a night shift, “with perhaps more during the day depending on the rate of churn of equipment.” Davis adds that there is no reliable rule of thumb because each and every organization and site is different. However, there are generally accepted staff calculation techniques that can determine the right staffing levels for a particular data center site. As for the Microsoft incident, he’d need to formally do the calculations to decide whether 3 or 7 technicians were sufficient. It’s otherwise just a guess.
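The article doesn't spell out the calculation techniques Davis refers to, but a generic shift-coverage estimate gives a feel for the arithmetic involved. The availability factor below is an assumed figure, not one from the report.

```python
# Back-of-envelope 24x7 shift-coverage sketch: how many people must be on
# the roster to keep one seat continuously staffed. This is a generic
# coverage formula, not the specific techniques cited in the article.

def staff_per_seat(hours_per_week=168, fte_hours_per_week=40, availability=0.75):
    """availability: assumed fraction of paid hours actually worked on
    shift, after leave, training and sickness are deducted."""
    return hours_per_week / (fte_hours_per_week * availability)

seats = 3  # e.g. three technicians required on shift at all times
print(round(staff_per_seat() * seats, 1))  # 16.8 people on the roster
```

Even this crude estimate shows why "3 on the night shift" implies a much larger total team, and why site-specific calculations matter.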


Projecting 2024 Cybertrends and C-Suite Responsibilities

Organizations must comply with various regulations and standards, such as the EU General Data Protection Regulation (GDPR), the US State of California Consumer Privacy Act (CCPA), the Payment Card Industry Data Security Standard (PCI DSS), and the US Health Insurance Portability and Accountability Act (HIPAA). Non-compliance can result in fines, legal action, or reputational damage. Compliance can be achieved if C-suite executives establish a compliance framework that requires them to assess and monitor their compliance status and implement necessary policies and procedures. They should also stay up to date on the changing regulatory and compliance landscape and engage with regulators and policymakers. ... The persistent cybersecurity skills gap is the shortage of qualified and experienced cybersecurity professionals on the job market. The cybersecurity skills gap can affect the ability of organizations to prevent, detect, and respond to cyberthreats. To help fill the skills gap, C-level executives should invest in the recruitment, retention, and development of their cybersecurity talent, and offer competitive compensation and benefits.


Here’s what you should look for in an OKR Management Tool

Communication is central to ensuring the success of any goal-setting framework. Make sure the technology you are leveraging allows the capturing of feedback, thoughts and comments on an ongoing basis. Using Keka’s OKR tool, teams can engage in meaningful discussions, share insights, and offer feedback directly on objectives and key results, fostering a culture of transparency and continuous improvement via the comments and 1 on 1 meeting feature. This functionality also empowers teams to set their own aligned goals, tailoring objectives to their unique strengths and challenges while still contributing to the larger organisational mission. ... Reminders about OKRs are highly advantageous as they keep objectives and key results at the forefront of individuals' and teams' attention, minimising the risk of goals becoming overlooked or forgotten during daily tasks. These reminders serve as nudges, encouraging consistent progress tracking, timely updates, and proactive adjustments. By maintaining goal visibility and urgency, this feature ensures that teams stay on track, deadlines are met, and alignment with broader strategic objectives remains strong, ultimately driving improved goal achievement and organisational success.


The CISO’s guide to accelerating quantum-safe readiness

With a dynamic perspective of their enterprise-wide cryptographic usage, CISOs can begin the work of cybersecurity risk assessments. This step involves working with cybersecurity and privacy managers to prioritize sensitive and critical data sets most at risk from “harvest now, decrypt later” attacks and with the highest business value and impact. To translate these insights into a quantum-safe strategy, security leaders should evaluate the business relevance in relation to the complexity of mitigation for specific assets so that they can plan their quantum-safe transition in a way that optimizes performance, compatibility and ease of integration. ... The final step in the journey to quantum-safe security is the transformation of cryptographic infrastructure to incorporate quantum-resistant cryptography. Before deploying quantum-safe solutions to their stack, security leaders should equip their teams with the tools and education to test the new cryptographic protocols and evaluate the potential impact on systems and performance. Quantum-safe solutions that can be updated without having to overhaul their cybersecurity infrastructure will help CISOs establish crypto-agility and ensure they can proactively and seamlessly address potential quantum vulnerabilities.
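The first step the excerpt describes, building a view of enterprise-wide cryptographic usage, can be caricatured as a text scan for quantum-vulnerable algorithm names. This is a hypothetical minimal sketch: the pattern list and sample input are illustrative, and a real inventory would also cover certificates, key stores, library dependencies, and network traffic.

```python
# Toy cryptographic-inventory scan: flag mentions of public-key algorithms
# that are vulnerable to a future quantum computer (via Shor's algorithm).

import re

QUANTUM_VULNERABLE = {
    "RSA": r"\bRSA\b",
    "ECDSA": r"\bECDSA\b",
    "ECDH": r"\bECDH\b",
    "DH": r"\bDHE?\b",
}

def scan_for_vulnerable_crypto(text):
    """Return the sorted names of vulnerable algorithms mentioned in text."""
    return sorted({name for name, pat in QUANTUM_VULNERABLE.items()
                   if re.search(pat, text)})

sample = "tls: ECDHE-RSA-AES256-GCM-SHA384\nsigning: ECDSA P-256"
print(scan_for_vulnerable_crypto(sample))  # ['ECDSA', 'RSA']
```

An inventory like this feeds the prioritization step: assets using these algorithms to protect long-lived sensitive data are the first candidates for "harvest now, decrypt later" risk assessment.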


Magic Keyboard vulnerability allows takeover of iOS, Android, Linux, and MacOS devices

“The user does not have to have a keyboard paired with their phone already. And as long as Bluetooth is enabled on the Android device, at any time the phone is on them, and Bluetooth is on, the attacker can then force pair an emulated keyboard with the Android device and inject keystrokes, including at the lock screen.” Newlin then turned to Linux. “It turns out that the Linux attack is very, very similar,” he said. “On Linux, as long as the host is discoverable and connectable over Bluetooth, the attacker can force-pair a keyboard and inject keystrokes without the user’s confirmation. And so, this is distinct from Android in that the device has to be not only connectable but also discoverable and connectable on Linux for the attack.” Linux fixed this bug in 2020 but left the fix disabled by default. ... Newlin encourages security researchers to continue probing Bluetooth flaws. “I think it’ll probably be a while [before the full extent of Bluetooth flaws is known] because it will take the community actually fleshing these out and identifying all these additional affected systems beyond what I’ve seen myself,” he said.


How Edge Analytics Can Deliver the Competitive Edge Your Business Needs

Traditional data analytics models struggle to keep up with all the data that’s being generated. Traditional data analytics is also no match for today’s data velocity. As the speed at which data is created continues to grow, there will be an even greater need for real-time processing. The interpretation and application of real-time analytics can vary based on the specific industry and its requirements. Real-time analytics is a broad concept that is adapted to suit the needs of different industries and sectors. ...  By addressing these traditional data analytics challenges, edge analytics is becoming more prominent. It’s a natural progression -- taking data and business where they need to go now. ... Businesses can move faster with edge analytics because of its reduced latency. This is possible because edge analytics processes data closer to where it was generated, so organizations get data insights quicker. Reduced latency is particularly critical for applications that require real-time response such as battlefield scenarios, fraud detection, and supply chain management. Because edge analytics reduces the data load on the network, it also saves energy, reduces carbon emissions, and helps organizations meet their sustainability goals to protect the planet.


How OpenAI plans to handle genAI election fears

For its part, OpenAI said ChatGPT will redirect users to CanIVote.org for specific election-related queries. The company is also focusing on enhancing the transparency of AI-generated images using its DALL-E technology with plans to incorporate a "cr" icon on such photos, signaling they are AI-generated. The company also plans to enhance its ChatGPT platform by integrating it with real-time global news reporting, including proper attribution and links. The news initiative is an expansion of an agreement made last year with the German media conglomerate Axel Springer. Under that deal, ChatGPT users gain access to summarized versions of select global news content from Axel Springer's various media channels. ... There's no universal rule for how genAI should be used in politics. Last year, Meta declared it would prohibit political campaigns from using genAI tools in their advertising and mandate that politicians reveal any such use in their ads. Similarly, YouTube said all content creators must disclose whether their videos contain "realistic" but altered media, including those created with AI.


Storytelling for CIOs: From niche to bestseller

“For a CIO, or anyone in a senior position with responsibility for data, the best way to succeed is to make projects come to life,” says Caroline Carruthers, formerly a pioneering chief data officer at Network Rail, which manages train stations and infrastructure in the UK, and now CEO of data consultancy Carruthers and Jackson. “You can give people all the dashboards, charts and figures in the world, but it’s when you help them understand the thinking behind what you do and bring it to life that you get the buy-in you need.” Often, CIOs use stories as a form of Esperanto or a translation layer. “I always find there’s benefit in using a story to help my audience understand what can sometimes be very technical concepts that I’m trying to communicate to non-technical people,” says Adam Miller, CIO of UK insurer, Markerstudy Group. “Get the story right, then people understand the plan and you’ve a much better chance of them buying in. I also find that a good story is just as important for highlighting the impact of inaction too, which can often be the easiest option for people to take.”



Quote for the day:

"Leadership Seductions are behaviors or attitudes in which we become 'stuck'" -- Catherine Robinson-Walker