Daily Tech Digest - June 05, 2023

How to create generative AI confidence for enterprise success

The key to enterprise-ready generative AI lies in rigorously structuring data so that it provides proper context, which can then be leveraged to train highly refined large language models (LLMs). A well-choreographed balance between polished LLMs, actionable automation and select human checkpoints forms strong anti-hallucination frameworks that allow generative AI to deliver correct results that create real B2B enterprise value. ... The initial phase of any company’s system is the blank slate that ingests information tailored to a company and its specific goals. The middle phase is the heart of a well-engineered system, which includes rigorous LLM fine-tuning. OpenAI describes fine-tuning models as “a powerful technique to create a new model that’s specific to your use case.” This occurs by taking generative AI’s normal approach and training models on many more case-specific examples, thus achieving better results. In this phase, companies have a choice between using a mix of hard-coded automation and fine-tuned LLMs.
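To make the “many more case-specific examples” step concrete, here is a minimal sketch of preparing such examples in the JSONL chat format that OpenAI’s fine-tuning API accepts. The catalog domain, part numbers, and answers are hypothetical placeholders; a real dataset would contain hundreds of curated examples drawn from the company’s own data.

```python
import json

# Hypothetical case-specific examples an enterprise might collect.
# OpenAI's fine-tuning API expects one JSON object per line (JSONL),
# each holding a short chat transcript the model should imitate.
examples = [
    {"messages": [
        {"role": "system", "content": "You answer questions about the ACME parts catalog."},
        {"role": "user", "content": "What is the lead time for part A-113?"},
        {"role": "assistant", "content": "Part A-113 ships in 5-7 business days."},
    ]},
    {"messages": [
        {"role": "system", "content": "You answer questions about the ACME parts catalog."},
        {"role": "user", "content": "Is part B-220 compatible with the V2 housing?"},
        {"role": "assistant", "content": "Yes, B-220 fits both V1 and V2 housings."},
    ]},
]

def to_jsonl(records):
    """Serialize training examples to JSONL, one example per line."""
    return "\n".join(json.dumps(r) for r in records)

jsonl = to_jsonl(examples)
print(len(jsonl.splitlines()))  # → 2
```

The resulting file would then be uploaded to a fine-tuning job; the human-checkpoint step the article describes would review these examples before any training run.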

Governments worldwide grapple with regulation to rein in AI dangers

Although a number of countries have begun to draft AI regulations, such efforts are hampered by the reality that lawmakers constantly have to play catchup to new technologies, trying to understand their risks and rewards. “If we refer back to most technological advancements, such as the internet or artificial intelligence, it’s like a double-edged sword, as you can use it for both lawful and unlawful purposes,” said Felipe Romero Moreno, a principal lecturer at the University of Hertfordshire’s Law School whose work focuses on legal issues and regulation of emerging technologies, including AI. AI systems may also do harm inadvertently, since humans who program them can be biased, and the data the programs are trained with may contain bias or inaccurate information. “We need artificial intelligence that has been trained with unbiased data,” Romero Moreno said. “Otherwise, decisions made by AI will be inaccurate as well as discriminatory.”

10 notable critical infrastructure cybersecurity initiatives in 2023

In April, a group of OT security companies that usually compete with one another announced they were setting aside their rivalries to collaborate on a new vendor-neutral, open-source, and anonymous OT threat warning system called ETHOS (Emerging Threat Open Sharing). Formed as a nonprofit, ETHOS aims to share data on early threat indicators and discover new and novel attacks threatening industrial organizations that run essential services, including electricity, water, oil and gas production, and manufacturing systems. It has already gained US CISA endorsement, a boost that could give the initiative greater traction. All organizations, including public and private asset owners, can contribute to ETHOS at no cost, and founders envisage it evolving along the lines of the open-source Linux project. ETHOS community and board members include some of the top OT security companies: 1898 & Co., ABS Group, Claroty, Dragos, Forescout, NetRise, Network Perception, Nozomi Networks, Schneider Electric, Tenable, and Waterfall Security.

UK has time limit on ensuring cryptocurrency regulatory leadership

The report also said that interest in digital assets among investors and the general public led to the conclusion that cryptocurrency is more than a fad and is here to stay, and that cross-government planning is required if the UK wants to take the opportunities it offers. These recommendations followed contributions from crypto sector regulators, industry experts and the general public. The report said: “Other countries around the world are moving quickly to develop clear regulatory frameworks for cryptocurrency and digital assets. The UK must move within a finite window of opportunity within the next 12-18 months to ensure early leadership within this sector.” Scottish National Party MP and chair of the APPG, Lisa Cameron, said: “This is the first report of its kind compiled jointly involving Members of Parliament and the House of Lords and we are keen that it contributes to evidence-based policy development across the sector.”

3 things CIOs must do now to accurately hit net-zero targets

One of the immediate efforts CIOs can take to accelerate sustainability goals includes selecting energy-efficient software, which can have a major impact on energy consumption. The Uniting Technology and Sustainability report surveyed companies that said they were taking various approaches to incorporate sustainability throughout the software development lifecycle. ... This opportunity to collaborate with sustainability in mind extends to the influence CIOs hold over where and how employees work. By integrating remote working capabilities, the CIO plays a hand in an organization’s shift to an increasingly remote or hybrid workforce model—a move that can significantly reduce a company’s carbon footprint. This effort has the potential to not only create sustainability at scale, but increase employee satisfaction, which will power a more sustainable organization. ... CEOs believe new technology will allow them to reach sustainability goals and build resilience, with 55% of CEOs enhancing sustainability data collection capabilities, and 48% transitioning to a cloud infrastructure.

Serverless is the future of PostgreSQL

Shamgunov sees two primary benefits to running PostgreSQL serverless. The first is that developers no longer need to worry about sizing. All the developer needs is a connection string to the database without worrying about size/scale. Neon takes care of that completely. The second benefit is consumption-based pricing, with the ability to scale down to zero (and pay zero). This ability to scale to zero is something that AWS doesn’t offer, according to Ampt CEO Jeremy Daly. Even when your app is sitting idle, you’re going to pay. But not with Neon. As Shamgunov stresses in our interview, “In the SQL world, making it truly serverless is very, very hard. There are shades of gray” in terms of how companies try to deliver that serverless promise of scaling to zero, but only Neon currently can do so, he says. Do people care? The answer is yes, he insists. “What we’ve learned so far is that people really care about manageability, and that’s where serverless is the obvious winner. [It makes] consumption so easy. All you need to manage is a connection stream.” 
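The scale-to-zero behavior described above can be illustrated with a toy model: compute is billed only while the endpoint is active, and the endpoint suspends after an idle timeout so hours of idleness cost nothing. The class, the 300-second timeout, and the billing logic are illustrative assumptions, not Neon’s actual implementation.

```python
class ScaleToZeroEndpoint:
    """Toy model of a scale-to-zero database endpoint: compute time is
    billed only between queries while active, and the endpoint suspends
    after an idle timeout. Illustrative only."""

    def __init__(self, idle_timeout=300):
        self.idle_timeout = idle_timeout
        self.active = False
        self.last_query_at = None
        self.billable_seconds = 0.0

    def query(self, now):
        if not self.active:
            self.active = True          # cold start: resume compute
        else:
            # bill the active time since the previous query
            self.billable_seconds += now - self.last_query_at
        self.last_query_at = now

    def tick(self, now):
        """Suspend compute once the endpoint has been idle long enough."""
        if self.active and now - self.last_query_at >= self.idle_timeout:
            self.billable_seconds += self.idle_timeout  # bill the idle window
            self.active = False

ep = ScaleToZeroEndpoint(idle_timeout=300)
ep.query(now=0)
ep.query(now=10)      # 10 active seconds billed
ep.tick(now=400)      # idle past the timeout: suspend
ep.query(now=10_000)  # hours later: cold start, the long gap costs nothing
print(ep.billable_seconds)  # → 310.0
```

The contrast with an always-on instance is the point: there, the gap between `now=400` and `now=10_000` would have been billed too.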

Cloud conundrum: The changing balance of microservices and monolithic applications

Containers and microservices are great for packaging an application and everything it needs in a single place, making it easier for developers to run it across many different platforms and computing equipment. Containers are also better at scaling an application up and down than starting and stopping a whole bunch of VMs, since they take a fraction of a second to bring up, versus minutes for a VM. But there are still tradeoffs. Here is one way to describe the situation: “The microservices architecture is more beneficial for complex and evolving applications. But if you have a small engineering team aiming to develop a simple and lightweight application, there is no need to implement them.” But it would be wise not to discount VMs entirely. They can be an important stepping stone from the on-premises world, as Southwire Co. LLC’s Chief Information Officer Dan Stuart told SiliconANGLE in a recent interview. “We had a lot of old technology in our data center and were already familiar with VMware, so that made the move to Google’s Cloud easier,” he said.

A Case for Event-Driven Architecture With Mediator Topology

The most straightforward cases for reliability involve the converter services. The service locks a message in the queue when it starts processing and deletes it when it has finished its work and sent the result. If the service crashes, the message will become available again in the queue after a short timeout and can be processed by another instance of the converter. If the load grows faster than new instances are added or there are problems with the infrastructure, messages accumulate in the queue. They will be processed right after the system stabilizes. In the case of the Mediator, all the heavy lifting is again done by the Workflow Core library. Because all running workflows and their state are stored in the database, if an abnormal termination of the service occurs, the workflows will continue execution from the last recorded state. Also, we have configurations to retry failed steps, timeouts, alternative scenarios, and limits on the maximum number of parallel workflows. What’s more, the entire system is idempotent, allowing every operation to be retried safely without side effects and mitigating the concern of duplicate messages being received.
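The lock-then-delete pattern and the idempotency guarantee described above can be sketched with a toy in-memory queue. The visibility-timeout mechanics mirror how queues like SQS behave; the class names, the 30-second timeout, and the `processed` set are illustrative assumptions, not the article’s actual stack.

```python
class VisibilityQueue:
    """Toy queue with a visibility timeout: receiving a message locks it
    for `timeout` seconds; if the worker crashes before deleting it, the
    message becomes visible again for another instance. Illustrative only."""

    def __init__(self, timeout=30):
        self.timeout = timeout
        self.messages = {}  # msg_id -> (body, invisible_until)

    def put(self, msg_id, body):
        self.messages[msg_id] = (body, 0.0)

    def receive(self, now):
        """Lock and return the first visible message, or None."""
        for msg_id, (body, invisible_until) in self.messages.items():
            if now >= invisible_until:
                self.messages[msg_id] = (body, now + self.timeout)
                return msg_id, body
        return None

    def delete(self, msg_id):
        self.messages.pop(msg_id, None)

processed = set()  # idempotency: remember already-converted message ids

def convert(queue, now):
    item = queue.receive(now)
    if item is None:
        return
    msg_id, _body = item
    if msg_id not in processed:  # duplicates are safe to retry
        processed.add(msg_id)    # ...the actual conversion work goes here
    queue.delete(msg_id)

q = VisibilityQueue(timeout=30)
q.put("m1", "video.avi")
q.receive(now=0)    # a worker locks the message, then crashes: no delete
convert(q, now=5)   # message still invisible -> nothing to do
convert(q, now=31)  # timeout elapsed -> another instance finishes the job
print(len(q.messages), "m1" in processed)  # → 0 True
```

Because the handler checks `processed` before doing work, a message delivered twice (e.g. deleted after the timeout already expired) causes no side effects, which is exactly the duplicate-message concern the article says idempotency mitigates.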

GDPR — How does it impact AI?

It is no surprise that legislation has lagged behind the unprecedented rise of AI, but this is where leaning more on data protection regulation may help to fill an important gap in the meantime. Another factor that has completely altered the landscape in the past five years is the UK’s exit from the EU, which brought additional complexities for the effective monitoring of personal data; while ‘UK GDPR’ is largely the same as the EU version, it carries some slight differences, making it imperative for companies to increase education around data usage so they understand the new policy landscape and avoid running afoul of those differences. ... Looking ahead, although the landscape has undoubtedly become far more complex, I remain a firm believer that the GDPR and AI can still work successfully in tandem as long as rigorous measures, checks and best practices are embedded firmly into business strategies and on the proviso that AI-related policy also evolves as a way to supplement existing data regulations.

The metaverse: Not dead yet

“We are in a winter for the metaverse, and how long that chill lasts remains to be seen,” said J.P. Gownder, vice president and principal analyst on Forrester's Future of Work team. Late last year, the analyst firm predicted a drop-off in interest during 2023 as a more realistic picture of the technology’s current possibilities emerged. “The hype was way exceeding the reality of the capabilities of the technology, the interest from customers — both business and consumer — and just the overall maturity of the market.” Yet the metaverse concept isn’t going away. “We think that, in the future, something like the metaverse will exist, whereby we have a 3D experience layer over the internet,” said Gownder. Don’t expect this to happen any time soon, though: the development of the metaverse could take a decade, according to Forrester. ... As metaverse hype subsides, the underlying technologies continue to develop and evolve, on both the hardware and software front. ... “There continues to be steady development of metaverse-type concepts. But just like we saw with the march to autonomous vehicles, this takes a long time to mature and put into place,” Lightman said.

Quote for the day:

“Being a leader, at its core, is about how we show up each day to work with the people in our charge.” -- Claudio Toyama
