Daily Tech Digest - August 30, 2024

Balancing AI Innovation and Tech Debt in the Cloud

While AI presents incredible opportunities for innovation, it also sheds light on the need to reevaluate existing governance awareness and frameworks to include AI-driven development. Historically, DORA metrics were introduced to identify elite engineering organizations based on two critical categories: speed and safety. Speed alone does not indicate elite engineering if safety is disregarded altogether, and AI-driven applications cannot be exempt from that standard. Running AI applications according to data privacy, governance, FinOps and policy standards is more critical now than ever, before this tech debt spirals out of control and data privacy is infringed upon by machines that are no longer in human control. Data is not the only thing at stake, of course; costs and breakage should also be a consideration. If the CrowdStrike outage from last month has taught us anything, it is that even seemingly simple code changes can bring down entire mission-critical systems at a global scale when not properly released and governed. Guarding against that means enforcing rigorous data policies, cost-conscious policies, compliance checks and comprehensive tagging of AI-related resources.


AI and Evolving Legislation in the US and Abroad

The best way to prepare for regulatory changes is to get your house in order. Most crucial is having an AI and data governance structure. This should be part of the overall product development lifecycle so that you’re thinking about how data and AI are being used from the very beginning. Some best practices for governance include: forming a cross-functional committee to evaluate the strategic use of data and AI products; ensuring you have experts from different domains working together to design algorithms that produce output that is relevant, useful and compliant; implementing a risk assessment program to determine what risks are at issue for each use case; and executing an internal and external communication plan to inform about how AI is being used in your company and the safeguards you have in place. AI has become a significant competitive factor in product development. As businesses develop their AI programs, they should continue to abide by responsible and ethical guidelines to help them stay compliant with current and emerging legislation. Companies that follow best practices for responsible use of AI will be well-positioned to navigate current rules and adapt as regulations evolve.


The paradox of chaos engineering

Although chaos engineering offers potential insights into system robustness, enterprises must scrutinize its demands on resources, the risks it introduces, and its alignment with broader strategic goals. Understanding these factors is crucial to deciding whether chaos engineering should be a focal area or a supportive tool within an enterprise’s technological strategy. Each enterprise must determine how closely to follow this technological evolution and how long to wait for their technology provider to offer solutions. ... Chaos engineering offers a proactive defense mechanism against system vulnerabilities, but enterprises must weigh its risks against their strategic goals. Investing heavily in chaos engineering might be justified for some, particularly in sectors where uptime and reliability are crucial. However, others might be better served by focusing on improvements in cybersecurity standards, infrastructure updates, and talent acquisition. Also, what will the cloud providers offer? Many enterprises get into public clouds because they want to shift some of the work to the providers, including reliability engineering. Sometimes, the shared responsibility model is too focused on the desires of the cloud providers rather than those of their tenants. You may need to step it up, cloud providers.


Generative AI vs large language models: What’s the difference?

While generative AI has become popular for content generation more broadly, LLMs are making a massive impact on the development of chatbots. This allows companies to provide more useful responses to real-time customer queries. However, there are differences in the approach. A basic generative AI chatbot, for example, would answer a question with a set answer taken from a stock of responses upon which it has been trained. Introducing an LLM as part of the chatbot set-up means its responses become much more detailed and reactive, as if the reply has come from a human advisor instead of from a computer. This is quickly becoming a popular option, with firms such as JP Morgan embracing LLM chatbots to improve internal productivity. Other useful implementations of LLMs are to generate or debug code in software development or to carry out brainstorming or research tasks by tapping into various online sources for suggestions. This ability is made possible by another related AI technology called retrieval-augmented generation (RAG), in which LLMs draw on vectorized information outside of their training data to root responses in additional context and improve their accuracy.
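
The retrieval step behind RAG is easy to picture in code. Below is a minimal, illustrative C# sketch of retrieval-augmented prompting: the hashed bag-of-words `Embed` function is a toy stand-in for a real embedding model, and the passages and question are invented; a production system would query a vector database and pass the assembled prompt to an LLM.

```csharp
using System;
using System.Linq;

class RagSketch
{
    // Toy embedding: hashed bag-of-words. A real system would call a
    // trained embedding model; this stand-in only keeps the sketch runnable.
    static double[] Embed(string text)
    {
        var v = new double[64];
        foreach (var word in text.ToLowerInvariant()
                     .Split(' ', StringSplitOptions.RemoveEmptyEntries))
            v[(word.GetHashCode() & 0x7fffffff) % 64] += 1.0;
        return v;
    }

    // Cosine similarity between two embedding vectors.
    static double Cosine(double[] a, double[] b)
    {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.Sqrt(na * nb) + 1e-12);
    }

    static void Main()
    {
        string[] passages =
        {
            "Refunds are processed within five business days.",
            "Our head office is located in London.",
            "Returned items must include the original receipt.",
        };
        string question = "How long do refunds take?";

        // Retrieve the passages most similar to the question and prepend
        // them, so the LLM can ground its answer in external context.
        var q = Embed(question);
        var context = passages
            .OrderByDescending(p => Cosine(q, Embed(p)))
            .Take(2);

        Console.WriteLine("Context:\n" + string.Join("\n", context) +
                          "\n\nQuestion: " + question);
    }
}
```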


Agentic AI: Decisive, operational AI arrives in business

Agentic AI, at its core, is designed to automate a specific function within an organization’s myriad business processes, without human intervention. AI agents can, for example, handle customer service issues, such as offering a refund or replacement, autonomously, and they can identify potential threats on an organization’s network and proactively take preventive measures. ... Cognitive AI agents can also serve as assistants in the healthcare setting by engaging with a patient daily to support mental healthcare treatment, and as student recruiters at universities, says Michelle Zhou, founder of Juji AI agents and an inventor of IBM Watson Personality Insights. The AI recruiter could ask prospective students about their purpose of visit, address their top concerns, infer the students’ academic interests and strengths, and advise them on suitable programs that match their interests, she says. ... The key to getting the most value out of AI agents is getting out of the way, says Jacob Kalvo, co-founder and CEO of Live Proxies, a provider of advanced proxy solutions. “Where agentic AI truly unleashes its power is in the ability to act independently,” he says. 


Protecting E-Commerce Businesses Against Disruptive AI-driven Bot Threats

Bot attacks have long been a thorn in the side of e-commerce platforms. With the growing number of shoppers regularly interacting and sharing their data on retail websites, combined with high transaction volumes and a growing attack surface, these online businesses have been a lucrative target for cybercriminal activity. From inventory hoarding, account takeover, and credential stuffing to price scraping and fake account creation, these automated threats have often caused significant damage to e-commerce operations. By using a variety of sophisticated evasion techniques in distributed bot attacks, such as rapidly rotating IPs and identities and manipulating HTTP headers to appear as legitimate requests, attackers have been able to evade detection by traditional bot detection tools. ... With the evolution of generative AI models and their increasing adoption by bot operators, bot attacks are expected to become even more sophisticated and aggressive in nature. In the future, gen AI-based bots may be able to independently learn, communicate with other bots, and adapt in real time to an application’s defensive mechanisms.


How Copilot is revolutionising business process automation and efficiency

Copilot is essential for optimising operations in addition to increasing productivity. Companies frequently struggle with inefficiencies brought on by human error and manual processes. Copilot ensures seamless operations and lowers the possibility of errors by automating these activities. Customer service automation is a case in point. According to a survey, 72% of consumers believe that agents should automatically be aware of their personal information and service history. Customer relationship management (CRM) systems can incorporate Copilot to give agents real-time information and recommendations, guaranteeing a customised and effective service experience. The efficiency of customer support operations is further enhanced by intelligent routing of questions and automated responses. ... For example, Copilot can forecast performance, assess market trends, and provide investment recommendations in the financial industry. Deloitte claims that artificial intelligence (AI) can cut operating costs in the finance sector by as much as 20%. Copilot’s automated data analysis and accurate recommendation engine help financial organisations remain ahead of the curve and confidently make strategic decisions.


Is your data center earthquake-proof?

Leuce explains that when Colt DCS designs the layout of a data center, it ensures the most critical parts, such as the data halls, electrical rooms, and other ancillary rooms required for business continuity, are placed on the isolation base. Other elements, such as generators, which are often designed to withstand an earthquake, can then be placed directly on the ground. ... A final technique employed by Colt DCS is the use of dampers – hydraulic devices that dissipate the kinetic energy of seismic events and cushion the impact between structures. Having previously deployed lead dampers at its first data center in Inzai, Japan, Colt has gone a step further at its most recently built facility in Keihanna, Japan, where it is using a combination of an oil damper made of naturally laminated rubber plus a friction pendulum system, a type of base isolation that allows damping both vertically and horizontally. “The reason why we mix the friction pendulum with the oil damper is because with the oil damper, you can actually control the frequency in the harmonics pulsation of the building, depending on the viscosity of the oil, while the friction pendulum does the job of dampening the energy in both directions, so you bring both technologies together,” Leuce explains.


Digital IDV standards, updated regulation needed to fight sophisticated cybercrime

In the face of rising fraud and technological advancements, there is a growing consensus on the need for innovative approaches to financial security. As argued in a recent Forbes article, the upcoming election season presents an opportunity to rethink the ecosystem that supports financial innovation. In the article, Penny Lee, president and CEO of the Financial Technology Association (FTA), advocates for policies that foster technological advancements while ensuring robust regulatory frameworks to protect consumers from emerging threats. ... Amidst these challenges, the payments industry is experiencing a surge in innovation aimed at combating fraud and enhancing security. Real-time payments and secure digital identity systems are at the forefront of these efforts. The U.S. Payments Forum Summer Market Snapshot highlights a growing interest in real-time payments systems, which enable instant transfer of funds and provide businesses and consumers with immediate access to their money. These systems are designed to improve cash flow management and reduce the risk of fraud through enhanced authentication measures.


Transformer AI Is The Healthcare System's Way Forward

Transformer-based LLMs are adapting quickly to the amount of medical information the NHS deals with per patient and on a daily basis. The size of the ‘context windows’, or input, is expanding to accommodate larger patient files, critical for quick analysis of medical notes and more efficient decision making by clinical teams. Beyond speed, these models serve well for quality of output, which can lead to more optimal patient care. An ‘attention mechanism’ learns how different inputs relate to each other. In a medical context, this can include the interactions of different drugs in a patient’s record. It can find relationships between medicines and certain allergies, predicting the outcome of this interaction on the patient’s health. As more patient records become electronic, the larger training sets will allow LLMs to become more accurate. These AI models can do what takes humans hours of manual effort – sifting through patient notes, interpreting medical records and family history and understanding relationships between previous conditions and treatments. The benefit of having this system in place is that it creates a full, contextual picture of a patient that helps clinical teams make quick decisions about treatment and counsel.
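
To make the "attention mechanism" concrete, here is a minimal self-attention sketch in C# under simplifying assumptions: the learned query/key/value projections that a real transformer trains are omitted, and the inputs are toy vectors rather than embedded medical records.

```csharp
using System;

class AttentionSketch
{
    // Scaled dot-product attention: each row of q attends over rows of k/v.
    static double[][] Attention(double[][] q, double[][] k, double[][] v)
    {
        int n = q.Length, m = k.Length, d = q[0].Length;
        var output = new double[n][];
        for (int i = 0; i < n; i++)
        {
            // How strongly does input i relate to every other input?
            var scores = new double[m];
            for (int j = 0; j < m; j++)
            {
                for (int t = 0; t < d; t++) scores[j] += q[i][t] * k[j][t];
                scores[j] /= Math.Sqrt(d); // scale for numerical stability
            }

            // Softmax turns raw scores into attention weights summing to 1.
            double max = double.NegativeInfinity, sum = 0;
            foreach (double s in scores) max = Math.Max(max, s);
            var w = new double[m];
            for (int j = 0; j < m; j++) { w[j] = Math.Exp(scores[j] - max); sum += w[j]; }
            for (int j = 0; j < m; j++) w[j] /= sum;

            // Output i is the attention-weighted average of the value vectors.
            output[i] = new double[v[0].Length];
            for (int j = 0; j < m; j++)
                for (int t = 0; t < v[0].Length; t++)
                    output[i][t] += w[j] * v[j][t];
        }
        return output;
    }

    static void Main()
    {
        // Two 3-dimensional toy inputs attending over each other.
        var x = new[] { new[] { 1.0, 0.0, 1.0 }, new[] { 0.0, 2.0, 0.0 } };
        var outp = Attention(x, x, x); // self-attention: q = k = v
        Console.WriteLine(string.Join(", ", outp[0]));
    }
}
```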



Quote for the day:

"Are you desperate or determined? With desperation comes frustration; With determination comes purpose achievement, and peace." -- James A. Murphy

Daily Tech Digest - March 02, 2024

Rust on the Rise: New Advocacy Expected to Advance Adoption

Recent advocacy and research efforts from agencies like the National Security Agency (NSA), Cybersecurity and Infrastructure Security Agency (CISA), National Institute of Standards and Technology (NIST), and ONCD “can serve as valuable evidence of the considerable risk memory-safety vulnerabilities pose to our digital ecosystem,” the Rust Foundation‘s Executive Director & CEO, Rebecca Rumbul, told The New Stack. Moreover, Rumbul said the Rust Foundation believes that the Rust programming language is the most powerful tool available to address critical infrastructure security gaps. “As an organization, we are steadfast in our commitment to further strengthening the security of Rust through programs like our Security Initiative,” she said. Meanwhile, looking specifically at software development for space systems, the ONCD report says both memory-safe and memory-unsafe programming languages meet the organization’s requirements for developing space systems. “At this time, the most widely used languages that meet all three properties are C and C++, which are not memory-safe programming languages,” the report said.


The Power of Hyperautomation in Banking

Hyperautomation significantly improves operational efficiency within banks, as it helps automate routine processes that include document processing, transaction reconciliation and data entry, decreasing the requirement for manual intervention. This not only streamlines processes but also reduces errors, leading to a more reliable as well as cost-effective operation. Banks can use hyperautomation to offer personalized, 24/7 services to their customers. Chatbots and virtual assistants powered by artificial intelligence can respond to inquiries as well as perform transactions around the clock. Faster response times, coupled with the ability to tailor services to individual customer requirements, lead to enhanced customer satisfaction as well as loyalty. “Hyperautomation facilitates organizations to improve customer experience by reducing the friction in user self-service applications and streamlining broken onboarding processes. It enables faster support and sales query resolution through relevant integrations, AI/ML, and assistive technologies,” says Arvind Jha, Former General Manager – Product Management and Marketing, Newgen Software.


What Is Data Completeness and Why Is It Important?

Data completeness is an important aspect of Data Quality. Data Quality refers to how accurate and reliable the data is overall. Data completeness specifically focuses on missing data or how complete the data is, rather than concerns of inaccurate or duplicated data. A lack of data completeness is normally the result of information that was never collected. For example, if a customer’s name and email address are supposed to be collected, but the email address is missing, it is difficult to communicate with the customer. ... Missing chunks of information restrict or bias the decision-making process. Attempting to perform analytics with incomplete data can produce blind spots and biases, and result in missed opportunities. Currently, business leaders use data analytics to make decisions that range from marketing to investment strategies to medical diagnostics. In some situations, data missing key pieces of information is still used, which can lead to dangerous mistakes and false conclusions. Assessing and improving data completeness should be done before performing analytics.
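
As a quick illustration of what such an assessment can look like, here is a minimal C# sketch that scores one field's completeness; the Customer record and sample rows are invented for the example, and a real pipeline would score every required field the same way.

```csharp
using System;
using System.Linq;

// Illustrative record: Email is the field whose completeness we measure.
record Customer(string Name, string? Email);

class CompletenessCheck
{
    static void Main()
    {
        var customers = new[]
        {
            new Customer("Ada", "ada@example.com"),
            new Customer("Grace", null),                 // missing email
            new Customer("Alan", "alan@example.com"),
        };

        // Completeness = share of records where the field is populated.
        double emailCompleteness =
            customers.Count(c => !string.IsNullOrWhiteSpace(c.Email))
            / (double)customers.Length;

        Console.WriteLine($"Email completeness: {emailCompleteness:P0}"); // roughly 67%
    }
}
```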


A socio-technical approach to data management is crucial in our decentralised world

To improve the odds of successfully building an effective data management strategy, working with a trusted and experienced data partner to help shift the organisation’s data culture is a crucial - and often missing - step. The Data and Analytics Leadership Annual Executive Survey 2023 found that cultural factors are the biggest obstacle to delivering value from data investments. Data fabrics, meshes and modern data stacks will continue to consolidate an increasingly decentralised world by making the management of data easier. However, ensuring control over security and governance, and extracting value from trustworthy data, requires a tactical shift to what we call a socio-technical approach. In other words, any strategy must be made up of an investment in people, process and technology to be successful. This is because data management involves more than the technical aspects of data storage, processing and analysis. It also includes the social aspects of data governance, change management, data quality management, user upskilling and collaboration between different teams. Organisations that know how to use technology the best will have an edge over their competitors.


Blockchain is one step away from mainstream adoption

Blockchain’s growth is already reshaping traditional business processes and models. In the financial sector, blockchain facilitates faster and more secure transactions. Supply chain management benefits from increased transparency and traceability, ensuring the authenticity and integrity of products. Smart contracts automate and streamline complex agreements, minimizing the risk of fraud and error. And in addition to sparking rising trading volumes, the SEC’s approval of spot bitcoin ETFs sent a global signal of validation to governments reviewing the viability of blockchain applications in both the private and public sectors. Importantly, the evolution of blockchain has given credence to — and bestowed practicality upon — the concept of decentralized finance (DeFi). We’re already in a reality where traditional financial services are replicated, and even improved, using blockchain technology. This is transformative because it will eliminate the need for intermediaries, opening the door to financial participation for virtually anyone with internet access. This democratization of finance has the potential to provide financial services to underserved populations and redefine the global financial landscape.


Biometrics Regulation Heats Up, Portending Compliance Headaches

What this all means is that it will be complicated for companies doing business nationally because they will have to audit their data protection procedures and understand how they obtain consumer consent or allow consumers to restrict the use of such data and make sure they match the different subtleties in the regulations. Contributing to the compliance headaches: The executive order sets high goals for various federal agencies in how to regulate biometric information, but there could be confusion in terms of how these regulations are interpreted by businesses. For example, does a hospital's use of biometrics fall under rules from the Food and Drug Administration, Health and Human Services, the Cybersecurity and Infrastructure Security Agency, or the Justice Department? Probably all four. ... Meanwhile, AI-induced deepfake video impersonations by criminals that abuse biometric data like face scans are on the rise. Earlier this year, a deepfake attack in Hong Kong was used to steal more than $25 million, and there are certainly others who will follow as AI technology gets better and easier to use for producing biometric fakes. The conflicting regulations and criminal abuses could explain why consumer confidence in biometrics has taken a nosedive.


The Role of Data in Crafting Personalized Customer Journeys

Comprehensive customer profiles draw on data sourced from multiple touchpoints that often sit in silos, such as online visits, purchases, forms, customer support interactions, social media engagement, mobile app usage, and other channels recognized in the CRM system. This further facilitates real-time data processing and identifies customer behaviors and preferences. As briefly discussed previously, predictive analytics consumes historical customer data and powers forecasting of expected behaviors and preferences. This segments data based on different parameters such as demographics, behaviors, preferences, etc. Ultimately, it acts as the seed for planting responsive marketing campaigns. While we are at it, an important strategy is cross-channel integration. Given the scale of the marketing landscape, it is important to consider all channels and systems. So, the data collected from multiple sources is integrated and analyzed through data management platforms to create a unified, cross-channel 360-degree view. Such interoperability delivers an omnichannel experience, thereby increasing customers' lifetime value. To ensure better customer loyalty, implement practices in alignment with the regulations.


Checkout Lessons: What Banks Need to Borrow from eCommerce

eCommerce has much to teach the financial and healthcare industries, which also experience high seasonality and peak traffic periods. Events like 401(k) sign-ups, healthcare enrollments, and tax days are notorious for bringing down systems. In my experience, performance is synonymous with user experience. ... Many digital-first banks don’t operate physical branches. Their success is due to a singular focus on user experience, performance, speed, flexibility, and a mobile-first approach. This is what has won over the current generation of young people who do not need to visit a teller. It’s crucial for banks to recognize the importance of these advancements and to take action. Otherwise, they risk losing their competitive edge. In the U.S., some banks perform exceptionally well with only an online presence, with USAA as a prime example. Some companies, like Capital One, are innovating by transforming their banks into cafés. They provide WiFi, allowing customers to work and do more than just banking. This shift dramatically enhances the user experience.


Fintech at its Finest: Adding Value with Innovation

The best fintech platforms are constantly listening to their customers. Whether that’s through harnessing the power of AI to create an optimal user experience or continuously innovating based on customer feedback, a good fintech is creating exactly what its customers want and need. ... The best fintech platforms have innovative technologies at their core and are increasingly harnessing AI and machine learning to enhance their services. But crucially, they are also designed to be intuitive for users. After all, businesses have just 10 minutes to set up digital accounts or risk losing consumer trust. Millennials and Gen Z make up a significant part of fintech’s core market, so it’s providers who can cater to tech-savvy generations and prioritise smooth customer experiences that will differentiate themselves in an increasingly crowded market. ... In the bustling world of fintech, the top platforms set themselves apart by cleverly blending practices that keep them growing and succeeding, even when faced with challenges. These platforms develop excellent solutions, using technologies like blockchain, AI and advanced data analytics to tackle old financial problems and improve user experiences.


Enabling Developers To Become (More) Creative

What influence does collaboration have on creativity? Now we are starting to firmly tread into management territory! Since software engineering happens in teams, the question becomes how to build a great team that's greater than the sum of its parts. There are more than just a few factors that influence the making of so-called "dream teams". We could use the term "collective creativity" since, without a collective, the creativity of each genius would not reach as far. The creative power of the individual is more negligible than we dare to admit. We should not aim to recruit the lone creative genius, but instead try to build collectives of heterogeneous groups with different opinions that manage to push creativity to its limits. ... Managers can start taking simple actions towards that grand goal. For instance, by helping facilitate decision-making, as once communication goes awry in teams, the creative flow is severely impeded. Researcher Damian Tamburri calls this problem "social debt." Just like technical debt, when there's a lot of social debt, don't expect anything creative to happen. Managers should act as community shepherds to help reduce that debt.



Quote for the day:

"A real entrepreneur is somebody who has no safety net underneath them." -- Henry Kravis

Daily Tech Digest - June 19, 2023

Finding the Nirvana of information access control or something like it

In the mythical land of Nirvana, where everything is perfect, CISOs would have all the resources they needed to protect corporate information. The harsh reality, which every CISO experiences daily, is that few entities have unlimited resources. Indeed, in many entities when the cost-cutting arrives, it is not unusual for security programs that have not (so far) positioned themselves as a key ingredient in revenue preservation to be thrown by the wayside — if you ever needed motivation to exercise access control to information, there you have it. ... For those who thought they were finished with Boolean logic in secondary school, it's back — and attribute-based access control (ABAC) is a prime example of the practicality of utilizing that logic in decision trees to determine access permission. The adoption of ABAC allows access to protected information to be “hyper-granular.” An individual’s access may initially be defined by their role and must certainly fall within the established policies.
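
As a concrete illustration of that Boolean logic, here is a minimal ABAC check in C#; every attribute name, clearance level, and policy clause is invented for the example rather than drawn from any particular product.

```csharp
using System;

// ABAC: access is a Boolean expression over subject, resource and
// environment attributes, rather than a simple role lookup.
record Subject(string Department, int ClearanceLevel);
record Resource(string Classification, string OwningDepartment);
record Context(bool OnCorporateNetwork, int HourOfDay);

class AbacSketch
{
    // Policy: department must match, clearance must cover the
    // classification, and access must come from the corporate
    // network during working hours.
    static bool CanRead(Subject s, Resource r, Context c) =>
        s.Department == r.OwningDepartment
        && s.ClearanceLevel >= RequiredLevel(r.Classification)
        && c.OnCorporateNetwork
        && c.HourOfDay is >= 8 and < 18;

    static int RequiredLevel(string classification) => classification switch
    {
        "public" => 0,
        "internal" => 1,
        "confidential" => 2,
        _ => int.MaxValue, // unknown labels are denied by default
    };

    static void Main()
    {
        var alice = new Subject("finance", 2);
        var report = new Resource("confidential", "finance");
        var ctx = new Context(OnCorporateNetwork: true, HourOfDay: 10);
        Console.WriteLine(CanRead(alice, report, ctx)); // True
    }
}
```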


Goodbyes are difficult, IT offboarding processes make them harder

To ensure that the business continues even though the employee is gone, stale accounts are created with grace periods during which the employee’s credentials can still be used to access the organization’s networks. This is great for retaining the knowledge this employee accumulated and ensuring that their replacement is well-briefed, but since the employee is gone, nobody will remember to monitor their account, as malicious actors will soon notice. This employee may also have been forwarding emails to their personal email account or accessing their work email from personal devices for business purposes, making it easier for hackers to obtain sensitive company data and impossible for the organization to know. Existing offboarding processes may frustrate business executives due to their rigidity – and they aren’t alone in their annoyance. What’s bad for security is also, inevitably, bad for business. Security teams today must manually ensure that all access privileges, including access to various systems, applications, databases and physical facilities, be promptly terminated.

Leaders are made, not born: Although this is technically correct, which is why we rarely see 5-year-olds running companies or countries (though, in fairness, the adults that do often fail to provide convincing signs of superior emotional or intellectual maturity), people’s potential for leadership can be detected at a very young age. Furthermore, the dispositional enablers that increase people’s talent for leadership have a clear biological and genetic basis. ... The best leaders are confident: Not true. Although confidence does predict whether someone is picked for a leadership role, once you account for competence, expertise, intelligence, and relevant personality traits, such as curiosity, empathy, and drive, confidence is mostly irrelevant. And yet, our failure to focus on competence rather than confidence, and our lazy tendency to select leaders on style rather than substance (such as during presidential debates, job interviews, and short-term in-person interactions), contributes to most of the leadership problems described in point 1. Note that when leaders have too much confidence they will underestimate their flaws and limitations, putting themselves and others at risk.


How Organizations Can Create Successful Process Automation Strategies

Organizations can promote more collaboration by adopting a modified “Center of Excellence” (CoE) approach. In some companies, that might mean assembling a community devoted to process automation tasks and strategies, in which practitioners can share best practices and ask questions of one another. The CoE should help members from business and IT teams work together better by coordinating tasks, avoiding reinventing projects from scratch, and generally empowering them to drive continuous improvement together. Some organizations may want to create a central focus on process automation without using the actual CoE term. The terminology itself carries some legacy baggage from centralized Business Process Management (BPM) software. Some relied on a centralized approach for their CoE, counting on one team to implement process automation for the entire organization. That approach often led to bottlenecks for both developers and line-of-business leaders, giving the CoE a bad reputation with few demonstrable results.


8 habits of highly secure remote workers

By working in a public place you are exposing yourself to serious cybersecurity risks. The first and most direct is the over-the-shoulder attack, also known as shoulder surfing. All this takes is for an observant, determined hacker to be sitting in the same space as you, paying close attention to your every move. ... "As you use public Wi-Fi, you are exposing your laptop or your device to the same network somebody else can log on to so that means they can actually peruse through your network, depending on the security of the local network on your laptop," says Gartner VP Analyst, Patrick Hevesi. Doing work in a public space while also not using public Wi-Fi may seem like a paradox, but there are simple and secure solutions. The first is using a VPN when accessing corporate information in public. ... "Your security is as good as your password, because that's the first line of defense," says Shah. "You want to make sure that you have a good strong password, and also don't use the same password for all the other sites you may be accessing."


Multicloud deployments don't have to be so complicated

The solution to these problems is not scrapping a complex cloud deployment. Indeed, considering the advantages that multicloud can bring (cost savings and the ability to leverage best-of-breed solutions), it’s often the right choice. What gets enterprises in trouble is the lack of an actual plan that states where and how they will store, secure, access, manage, and use all business data no matter where it resides. It’s not enough to push inventory data to a single cloud platform and expect efficiencies. We’re only considering data complexity here; other issues also exist, including access to application functions or services and securing all systems across all platforms. Data is typically where enterprises see the problems first, but the other matters will have to be addressed as well. A solid plan tells a complete data access story and includes data virtualization services that can make complex data deployments more usable by business users and applications. It also enables data security and compliance using a software layer that can reduce complexity with abstraction and automation. Simple data storage is only a tiny part of the solution you need to consider.


E-Commerce Firms Are Top Targets for API, Web Apps Attacks

Attack vectors, such as server-side template injection, server-side request forgery and server-side code injection, have also become popular and may lead to data exfiltration and remote code execution. "This, in turn, may be playing a role in preventing online sales and damaging a company's reputation," the researchers said, citing an Arcserve survey in which 60% of consumers said they wouldn't buy from a website that had been breached in the previous 12 months. SSTI is a hacker favorite for zero-day attacks. Its use is well-documented in "some of the most significant vulnerabilities in recent years, including Log4j," the researchers said. Hackers mainly targeted commerce companies with Log4j, and 58% of all exploitation attempts happened in the space. The Hafnium criminal group popularized SSRF, which it used to attack Microsoft's Exchange Servers and reportedly to launch a supply chain cyberattack that affected 60,000 organizations, including commerce firms. Hafnium used the SSRF vulnerability to run commands on the web servers, according to the report.


It’s going to take AI to power AI

AI in the datacentre has the ability to act as a pair of eyes, keeping a keen watch on every aspect of the facility to detect and prevent threats. Analysing data from sources such as online access logs and network traffic would allow AI systems to watch for and alert organisations to cyber breaches in seconds. Further, we’re heading in the direction where AI-powered sensors could apply human temperature checks and facial recognition to monitor for physical intrusions. Ultimately, AI will have the opportunity to tune datacentres to operate like well-oiled machines, making sure all components work in harmony to deliver the highest level of performance in our AI-hungry world – a world pressurised by a cost-of-energy crisis and expanding cyber security threats. While the reality is more nuanced, put plainly, it is going to take AI to power AI. In fact, Gartner estimates that half of all cloud datacentres will use AI by 2025. It’s going to be a productive couple of years for industry developing one of the fastest-growing technologies, rolling it out, and doing so in a way that ensures trust.


Beyond ChatGPT: What is the Business Value of Generative Artificial Intelligence?

Beyond the attraction to the technology itself, generative AI has huge potential business value. Regardless of the processes, professions, or sectors of activity involved, the common thread among artificial intelligence projects is their shared objective of enabling, expediting, or enhancing human actions, either by facilitating or accelerating them. The use of AI usually starts with a question, or a problem. This is immediately followed by the analysis of a significant amount of exogenous information or endogenous information, with the aim of obtaining an answer to the question or problem through the creation of information useful to humans: aiding decision-making, detecting an anomaly, analyzing a hand-drawn schema, prioritizing problems to be solved, etc. More broadly, the automated generation of information makes it easier and safer to streamline some processes, such as moving from an idea to a first version by allowing for quicker validation or failure recognition, A/B testing, and simplified re-experimentation. 


Even in cloud repatriation there is no escaping hyperscalers

Hansson’s blog sparked pushback from cloud advocates like TelcoDR CEO Danielle Royston. She contended in an interview with Silverlinings that those using the cloud aren’t just paying for servers, but also for the proprietary tools the different cloud giants provide, the salaries they pay their top-tier developer talent, the hardware upgrades they make available to cloud users and the built-in security they offer. For those who use the cloud to its full potential, she said, the cloud is “the gift that keeps on giving.” Not only that, but those looking to repatriate workloads will need to invest significant time and money to transition back and hire more staff to develop new applications and manage the on-prem servers, she added. ... So, who’s right? Well, it seems the answer will vary by company and even by application. Pichai explained the cloud is the ideal environment for a small handful of workloads, namely “vanilla applications” which incorporate only standard rather than specialized features and “spikey applications” which need to scale on demand to accommodate irregular patterns of usage.



Quote for the day:

"To be an enduring, great company, you have to build a mechanism for preventing or solving problems that will long outlast any one individual leader" -- Howard Schultz

Daily Tech Digest - June 29, 2022

Why Data Is The Lifeblood Of Modern Organizations

AI – or machine learning, to be more specific – is powered by data (by which we generally mean information). This is because it uses information to “learn” how to make decisions. The more information it receives - such as, for example, road traffic conditions, in the case of a self-driving car – the better it can learn to do whatever it is supposed to do. Simply by watching examples of what happens when a vehicle travels on the road in different situations (environment, time of day, etc.), it gets better at understanding the decisions that have to be made to achieve its objective – traveling from A to B without hitting anything or hurting anyone. Likewise, the usefulness of IoT is down to its ability to transmit data between disparate devices that can then be used to make better decisions. When all of the machinery on a connected factory floor, for example, is talking to every other piece of machinery, it's possible to spot where performance issues are creating inefficiencies, as well as predict where malfunctions and breakdowns are likely to impair performance of the manufacturing operation as a whole.


How AI and Machine Learning will revolutionize the future of eCommerce

One of the numerous advantages of machine learning is the automation of many processes. Personalization is a prime illustration of this. The entire marketplace’s look may be altered using machine learning models for eCommerce to suit a specific buyer. AI personalization in eCommerce is primarily driven by user involvement, which improves the usability and appeal of the consumer experience (with more conversions and sales). Marketplaces want consumers to stay on their sites longer and make more purchases. To make that happen, they modify various website features to meet the specific user’s demands. ... The area of price adjustment is where you may see the full extent of machine learning’s advantages. eCommerce is one of those sectors where competition is quite severe, particularly in specialized consumer markets like hardware or beauty items. Because of this, obtaining as many advantages as possible is essential if you want to draw in and keep clients. Price is one of the key motivators for 47% of eCommerce shoppers, according to a BigCommerce survey.


Hertzbleed explained

The first thing to note is that Hertzbleed is a new type of side-channel attack that relies on changes in CPU frequency. Hertzbleed is a real, and practical, threat to the security of cryptographic software. ... In short, the Hertzbleed attack shows that, under certain circumstances, dynamic voltage and frequency scaling (DVFS), a power management scheme of modern x86 processors, depends on the data being processed. This means that on modern processors, the same program can run at different CPU frequencies (and therefore take different wall-clock times). For example, we expect that a CPU takes the same amount of time to perform the following two operations because it uses the same algorithm for both. ... When running sustained workloads, CPU overall performance is capped by TDP. Under modern DVFS, it maximizes its performance by oscillating between multiple P-states. At the same time, the CPU power consumption is data-dependent. Inevitably, workloads with different power consumption will lead to different CPU P-state distribution.
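
The measurement idea can be sketched in a few lines. The C# below runs an identical instruction sequence over all-zero versus dense data and compares wall-clock time; whether a gap actually appears depends on the processor's DVFS behaviour, so this only illustrates the approach rather than reproducing the attack, and the workload and constants are invented.

```csharp
using System;
using System.Diagnostics;

class HertzbleedSketch
{
    // Same instruction sequence either way; only the data differs.
    // seed = 0 keeps every operand all-zero (low power); a dense seed
    // keeps operands at high Hamming weight (higher power).
    static long TimeWorkload(ulong seed)
    {
        ulong x = seed;
        var sw = Stopwatch.StartNew();
        for (long i = 0; i < 500_000_000; i++)
            x = (x >> 1) ^ (x << 3) ^ seed;
        sw.Stop();
        GC.KeepAlive(x); // keep the loop from being optimized away
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        // Under data-dependent DVFS, these wall-clock times can differ
        // even though the executed instructions are identical.
        Console.WriteLine($"zeros: {TimeWorkload(0)} ms");
        Console.WriteLine($"dense: {TimeWorkload(0xFFFF_FFFF_FFFF_FFFF)} ms");
    }
}
```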


Orlando will test if a physical city can be the center of the metaverse

The Orlando Economic Partnership (the region’s economic development group) is working with Unity to create a digital twin of the 800-square-mile metro area that will use new 3D technology to map out scenarios on everything from infrastructure to real estate to talent availability and more. The Unity rendering will capture 3D scans of exteriors and interiors of buildings, and it will help with the analysis of power grid expansions, traffic flow, stoplight timing, and climate change. The Orlando folks also participated in last week’s ringing of the Nasdaq bell in the metaverse by futurist Cathy Hackl, chief metaverse officer of Journey. Hackl is working with the city to help cement its reputation in the metaverse, and the bell ringing happened in both the physical stock exchange building and the metaverse. “I see the area from South Florida, which is focused on crypto, all the way up to Orlando, which is the simulation capital of the world, becoming one of the metaverse and Web3 innovation corridors to keep your eye on,” Hackl said.


Business AI solutions for beginners: What is vertical intelligence?

In the modern paradigm, one of your company’s greatest assets is the data generated by your employees, clients, and customers. And, sadly, most businesses are leaving money on the table by simply storing that data away somewhere to collect digital dust. The problem: How do you audit your company’s entire data ecosystem, deploy models to identify and infer actionable items, and turn those insights into positive business outcomes? The solution: vertical intelligence. Unfortunately, “vertical intelligence” is a buzzword. If you try to Google it, you’ll just get pages and pages of companies that specialize in it explaining why it’s important. Nobody really tells you what it is in the context of modern AI solutions. ... Vertical intelligence is the combination of human expertise and big data analytics applied with surgical precision and timing. As NowVertical Group’s COO, Sasha Grujicic, told Neural, we’re coming out of a once-in-a-century pandemic. And, unlike most industries, the world of AI had a positive surge during the COVID lockdowns.


One Day, AI Will Seem as Human as Anyone. What Then?

Even if no skills or capacities separate humans from artificial intelligence, there is still a reason and a means to fight the assessment that machines are people. If you attribute the same moral weight to something that can be trivially and easily digitally replicated as you do to an ape that takes decades to grow, you break everything—society, all ethics, all our values. If you could really pull off this machine moral status (and not just, say, inconvenience the proletariat a little), you could cause the collapse, for example, of our capacity to self-govern. Democracy means nothing if you can buy and sell more citizens than there are humans, and if AI programs were citizens, we so easily could. So how do we break the mystic hold of seemingly sentient conversations? By exposing how the system works. This is a process both “AI ethicists” and ordinary software “devops” (development and operations) call “transparency.” What if we all had the capacity to “lift the lid,” to change the way an AI program responds? Google seems to be striving to find the right set of filters and internal guardrails to make something more and more capable of human-like conversation. 


LaMDA Is An ‘AI Baby’ That Will Outrun Its Parent Google Soon

Compared to other chatbot conversations, LaMDA shows streaks of both consistency and randomness within a few lines of conversation. It maintains the logical connection even when the subject is changed without being prompted by a relevant question. ... That trait apart, the one other significant differentiating factor seems to be how it can reach out to external sources of information to achieve “factual groundedness”. A research paper published by Google with Cornell University mentions that the model has been trained using around 1.56T words of public data and web text. Google very specifically mentions safety, in terms of the model’s consistency with a set of human values, avoiding harmful suggestions and unfair bias, and enhancing model safety using a LaMDA classifier fine-tuned with a small amount of crowdworker-annotated data. This still leaves ample scope for debate and improvement, as one crowdworker might think he is talking to the LaMDA chatbot when he is in fact talking to another crowdworker.


How to Use Span in C# to Improve Application Performance

Using Span<> leads to performance increases because spans are always allocated on the stack. Since garbage collection does not have to suspend execution as often to clean up unreferenced objects on the heap, the application runs faster. Pausing an application to collect garbage is always an expensive operation and should be avoided if possible. Span<> operations can be as efficient as operations on arrays: indexing into a span does not require a computation to determine the memory address to index to. Another implementation of a span in C# is ReadOnlySpan<>. It is exactly like Span<> other than that its indexer returns a readonly ref T, not a ref T. This allows us to use ReadOnlySpan<> to represent immutable data types such as String. Spans can use other value types such as int, byte, ref structs, bool, and enum. Spans cannot use types like object, dynamic, or interfaces. ... Spans are not appropriate in all situations. Because we are allocating memory on the stack using spans, we must remember that there is less stack memory than heap memory.
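
A short example of these points, assuming a recent .NET runtime (the span overload of int.Parse requires .NET Core 2.1 or later): spans slice existing memory without copying, and stackalloc puts a scratch buffer on the stack where the garbage collector never sees it.

```csharp
using System;

class SpanExample
{
    static void Main()
    {
        // Parse "2022-06-29" without a single substring allocation:
        // each Slice is a view over the original string's characters.
        ReadOnlySpan<char> date = "2022-06-29";
        int year = int.Parse(date.Slice(0, 4));
        int month = int.Parse(date.Slice(5, 2));
        int day = int.Parse(date[8..]);
        Console.WriteLine($"{year}/{month}/{day}");

        // Span over stack memory: no heap allocation, no GC pressure.
        Span<int> buffer = stackalloc int[8];
        for (int i = 0; i < buffer.Length; i++) buffer[i] = i * i;
        Console.WriteLine(buffer[7]); // 49
    }
}
```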


The making and value of metaverse worlds

Technology, media, and telecom companies, for instance, benefit directly by providing technological enablers, such as 5G, next-generation Wi-Fi or broadband networks, and new operating systems, app stores, and platforms to foster more content creation. Meanwhile, AR and VR tools are being actively explored and used in industries ranging from healthcare to industrial goods. Companies should start by familiarising their organisations with the potential impact of the metaverse. To start with, it’s important to do an assessment of how your business may be positively or negatively affected by the three biggest trends: the rise of m-worlds; improvements in AR, VR, and MR; and the expanding use of Web3 assets enabled by blockchain. Companies can then choose areas of focus in the metaverse and potential use cases for their own efforts. Finally, they can decide whether to become part of building this new infrastructure; monetise content and virtual assets; create B2B or B2C content, or even inward-facing experiences such as customer showrooms, virtual conferences, and remote collaboration solutions; or attract relevant audiences, both existing customers and prospects of interest.


CFOs and Automation: Battling Inflation, Increasing Employee Productivity

The CFO must also unlock the investment they've made in staff by providing them with tools that automate mundane, low-value work. “CFOs are fully aware that inflation drives up the cost of hiring and maintaining talent,” explains Karlo Bustos, vice president of professional services for Board International. “They must provide an environment where things aren't hard to do, in a very manual-based function such as invoicing collection activities, building out financial plans, and making financial models.” For CFOs to mitigate the expense of hiring talent and the manual nature of many tasks, they need to provide an environment of automation, collaboration, easily shared data, and enabling technologies. “Being proactive in automation is understanding the business,” he says. “CFOs are more inclined to invest in automation technology to deliver value, so that they can compress some of the inflationary pressures they have on their internal cost structure.” That perspective was shared by Wayne Slater, director of product marketing for Prophix, a performance management software provider. 



Quote for the day:

"Leadership is about change... The best way to get people to venture into unknown terrain is to make it desirable by taking them there in their imaginations." -- Noel Tichy

Daily Tech Digest - February 15, 2022

Cloud storage data residency: How to achieve compliance

Data residency and data sovereignty are increasingly governed by local laws. There is an increasing push towards data sovereignty, in part because of supply chain and security concerns. As Mathieu Gorge, CEO at compliance experts Vigitrust, points out, firms and governments alike are increasingly concerned about geopolitical risk. Firms also need to be aware of data adequacy requirements if they intend to move data across borders. This could come into play if they move between hyperscaler regions and AZs, or change SaaS providers. “There is adequacy between the UK and EU, but you are still relying on clauses in the contract to demonstrate that adequacy,” he cautions. Meanwhile, the challenge of data residency is becoming more complicated as more countries roll out data sovereignty regulations. The EU’s GDPR does not actually include stipulations on data residency, relying instead on data adequacy. The UK’s post-Brexit approach follows that of GDPR. But the growth of local data privacy laws is increasingly linked to more localised, or even nationalistic, views of IT resources, and specific regulations and laws can also set out data residency requirements.


Log4j hearing: 'Open source is not the problem'

“Open source is not the problem,” stated Dr. Trey Herr, director of the Cyber Statecraft Initiative at the Atlantic Council think tank, during a US Senate Committee on Homeland Security & Government Affairs hearing this week. “Software supply-chain security issues have bedeviled the cyber-policy community for years.” Experts have been predicting a long-term struggle to remedy the Log4j flaw and its impact. Security researchers at Cisco Talos, for example, stated that Log4j will be widely exploited moving forward, and users should patch affected products and implement mitigation solutions as soon as possible. The popular Java logging software is widely used in enterprise and consumer services, websites, and applications as an easy-to-use common utility to support client/server application development. If exploited, the Log4j weakness could let an unauthenticated remote actor take control of an affected server system and gain access to company information or unleash a denial-of-service attack. The Senate panel called on experts to find out about industry responses and ways to prevent future software exposures.


How to create data management policies for unstructured data

Automate as much as you can. A declarative approach is the goal. While there are many options available now using independent data management software to manage policies across storage, many organizations still employ IT managers and spreadsheets to create and track policies. The worst part of this bespoke manual effort is searching for files containing certain attributes and then moving or deleting them. These efforts are inefficient, incomplete, and impede the goals of having policies; it’s painful to maintain them, and IT professionals have too many competing priorities. Plus, this approach limits the potential of using policies to continuously curate and move data to data lakes for strategic AI and ML projects. Instead, look for a solution with an intuitive interface to build and execute on a schedule and that runs in the background without human intervention. Measure outcomes and refine. Any data management policy should be mapped to specific goals, such as cost savings on storage and backups. It should measure those outcomes and let you know their status so that if those goals are not being met, you can change the plans accordingly.
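
One way to picture the declarative approach: express each policy as data plus a predicate, and let a generic engine apply it on a schedule. The C# sketch below is hypothetical; the path, age threshold, and "archive tier" action are placeholders, and a real tool would also report outcomes against the policy's goals.

```csharp
using System;
using System.IO;
using System.Linq;

// A policy is declared as data: a name, a predicate that selects files,
// and an action to apply. The engine below just evaluates the rule.
record Policy(string Name, Func<FileInfo, bool> Matches, Action<FileInfo> Apply);

class PolicyEngine
{
    static void Main()
    {
        var archive = new Policy(
            "archive-cold-data",
            f => f.LastAccessTimeUtc < DateTime.UtcNow.AddYears(-3),
            f => Console.WriteLine($"would move {f.Name} to archive tier"));

        // Intended to run unattended on a schedule, not by hand.
        // "/data/projects" is an illustrative path.
        foreach (var file in new DirectoryInfo("/data/projects")
                     .EnumerateFiles("*", SearchOption.AllDirectories)
                     .Where(archive.Matches))
        {
            archive.Apply(file);
        }
    }
}
```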


7 Ways to Fail at Microservices

Microservices envy is a problem, because microservices aren’t the sort of thing we should be envying. One of our consultants has a heuristic that if a client keeps talking about Netflix and asking for microservices, he knows the engagement is in trouble. Almost certainly, they’re not moving to microservices for the right reason. If the conversation is a bit deeper, and covers things like coupling and cohesion, then he knows they’re in the right space. The starting ambition for a microservices transformation should never be the microservices themselves. Microservices are the means to achieve a higher-level goal of business agility or resiliency or equivalent. Actually, microservices are not even the only means; they're a means. ... It’s important to ask "do you have microservices, or do you have a monolith spread over hundreds of Git repos?" That, unfortunately, is what we often see. This is a distributed monolith, and it’s a terrible thing. It's hard to reason about. It's more prone to errors than its monolithic equivalent. With a conventional monolith where it's all contained in a single development environment, you get benefits such as compile-time checking and IDE refactoring support.


Demystifying the UK’s National Cyber Strategy 2022

Cyber resilience and digital security overlap different “pillars” of the strategy but share the same goal of enhancing the security posture of the UK, which requires a whole of society outlook. The government’s efforts in taking an active role in the development and adoption of technologies critical to cyber space is applaudable. To remain in sync with the pace of change, there needs to be collaborative and active engagement with experts that have a deep understanding of the threats in cyber space and how to secure the technologies required. The National Cyber Strategy outlines the government’s vision to build on its influence and take on a leading role in promoting technologies and security best practices critical to cyber space globally. It must not wait until the telecommunications industry encounters problems with 5G deployments and organisations are left trying to retrospectively fix their security weaknesses. Organisations must build their networks securely from the start, and effective guidance will be key to supporting this development. 


Why Ransomware Groups Such as BlackCat Are Turning to Rust

BlackCat's migration to Rust, which can run on embedded devices and integrate with other languages, comes as no surprise to Carolyn Crandall, chief security advocate at network security specialist Attivo Networks. She tells ISMG that attackers are always going to innovate with new code that is designed to circumvent endpoint defense systems. Crandall says BlackCat ransomware is "extremely sophisticated" because it is human-operated and command line-driven. ... Anandeshwar Unnikrishnan, senior threat researcher at cybersecurity firm CloudSEK, tells ISMG that threat actors, especially malware developers, will eventually move away from traditional programing languages they formerly used to write malware, such as C or C++, and adopt newer languages, such as Rust, Go and Nim. Unnikrishnan says there are plenty of reasons for malware developers to migrate to languages such as Rust, Go and Nim. But the main reasons are because these newer languages are fast and can evade static analysis of most malware detection systems.


How healthy boundaries build trust in the workplace

Boundaries are the mental, emotional, and physical limits people maintain with respect to others and their environment, and psychologists consider them healthy if they ensure an individual’s continued well-being and stability. They serve many valuable functions. They help protect us, clarify our own responsibilities and those of others, and preserve our physical and emotional energy. They help us stay focused on ourselves, honor our values and standards, and identify our personal limits. Physical workplace boundaries may include delineating an individual’s personal space in a shared office or limiting body contact to handshakes rather than hugs. Mental boundaries reflect individuals’ important beliefs, values, and opinions. At work, that may mean not participating in activities that conflict with a person’s religious convictions, like betting pools, or personal choices, such as not drinking alcohol at office events. Emotional boundaries relate to people’s feelings being acknowledged and respected and may manifest as individuals not discussing their personal lives with coworkers.


Edge computing: 3 ways you can use it now

Edge infrastructure is what enables, for example, a “smart” factory floor armed with sensors and other connected devices that generate endless streams of data. “The manufacturing and warehousing sectors have been early adopters, with use cases like preventive maintenance and augmented reality/virtual reality (AR/VR) remote assistance applications powered by on-prem edge compute,” Mishra says. “Warehouse automation through robotics, location-based solutions, and supply chain optimization are also viewed as key use cases for edge.” A specific technology to watch here is computer vision: the artificial intelligence (AI) discipline focused on computer-based recognition of images and video. “Manufacturing is doing really interesting work in the smart factory floor with quality control using computer vision to identify a slip in production quality before it becomes detectable to humans,” says Paul Legato, VP of platform engineering at Wallaroo. Experts expect that computer vision applications, powered by edge infrastructure, will be a hotbed of new use cases going forward.
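
To make the quality-control idea concrete, below is a minimal sketch of edge-side visual inspection. It uses OpenCV as the vision library (our choice; the article names no specific stack), with a simple dark-blob heuristic standing in for a trained defect model; the camera index, threshold values and function names are illustrative assumptions.

```python
# A minimal, hypothetical sketch of edge-side quality control (OpenCV >= 4).
# A real deployment would run a trained model here; the dark-blob heuristic
# below is only a stand-in for "detect a slip in production quality".
import cv2

DEFECT_AREA_THRESHOLD = 500  # pixels; would be tuned per production line

def inspect_frame(frame) -> bool:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Flag dark blobs (scratches, voids) larger than the threshold.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > DEFECT_AREA_THRESHOLD for c in contours)

cap = cv2.VideoCapture(0)  # camera attached to the edge node (assumed index 0)
ok, frame = cap.read()
if ok and inspect_frame(frame):
    print("Possible defect: divert part for manual inspection")
cap.release()
```

Because this loop runs on the edge node next to the camera, only the verdict (or flagged frames) needs to leave the factory floor, which is the bandwidth and latency argument for edge compute.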


Five lessons for building your B2B e-commerce audience

You need to grow and tend relationships with your target audience, but those relationships will only be as good as the technology you deploy. Your technology is your connection. I’ve seen too many organisations succumb to the fear that digital platforms will take all the flavor out of their brand. But if you choose the right solution, you’re going to have more interaction, more connection, and more opportunities to convey your brand. E-commerce soars when it’s part of a high-quality omnichannel solution designed with B2B complexities in mind. Still not sure if tech is the answer? Private equity firms, key players in the B2B ecosystem, tend to keep their finger on the pulse of future-friendly concepts. You can sense which way the wind is blowing by the new talent they bring in. ... It might seem counterintuitive, but digital drives more human connection. One of today’s most compelling paradoxes is that while markets are more complex and the buyer’s journey has a thousand detours (I’ll get to that point in a moment), there’s a clear imperative within that complexity.


Evolving a data integration strategy

In addition to undermining data governance, poorly integrated data leads to poor customer service. “In the digital economy, the customer expects you to know and have ready insight into every transaction and interaction they have had with the organisation,” says Tibco CIO Rani Johnson. “If a portion of a customer’s experience is locked in a silo, then the customer suffers a poor experience and is likely to churn to another provider.” Breaking down such data silos requires business change. “Building end-to-end data management requires organisational changes,” says Nicolas Forgues, former chief technology officer (CTO) at Carrefour and now CTO at consulting firm Veltys. “You need to train both internal and external staff to fulfil the data mission for the company.” Businesses risk missing the bigger picture, in terms of spotting trends or identifying early indicators of change, if they lack a business-wide approach to data management and a strategy for integrating silos. In Johnson’s experience, one reason for poor visibility of data is that business functions and enterprise applications are often decentralised.



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - January 12, 2022

NIST Updates Cybersecurity Engineering Guidelines

NIST’s publication is a resource for computer engineers and other professionals on the programming side of cybersecurity efforts. “This publication addresses the engineering-driven perspective and actions necessary to develop more defensible and survivable systems, inclusive of the machine, physical, and human components that compose those systems and the capabilities and services delivered by those systems,” the document reads. Spanning more than 200 pages, the publication takes a holistic approach to systems engineering. NIST researchers give an overview of the objectives and concepts of modern security systems, primarily regarding the protection of a system’s digital assets. One of the key updates the NIST authors made in the latest version is a fresh emphasis on security assurances. In software systems engineering, assurance is the evidence that a given system’s security procedures are robust enough to mitigate asset loss and prevent cyberattacks. Ron Ross, a NIST fellow and one of the document’s authors, told Nextgov that system assurances act as justification that a security system can operate effectively.


9 ways that cybersecurity may change in 2022

On the plus side, digital wallets can ensure the identity of the user in business or financial transactions, reduce fraud and identity theft, and shrink the cost and overhead for organizations that typically create physical methods of authentication. On the minus side, a person can be at risk if their mobile device is lost or stolen; a device with an exhausted battery is of little use when trying to present a digital ID; and any digital verification that requires connectivity will fail if there's no cellular or Wi-Fi available. ... Shadow or zombie APIs pose a security risk, as they're typically hidden, unknown and unprotected by traditional security measures. More than 90% of attacks in 2022 will focus on APIs, according to Durand, and for organizations without the right API controls and security practices, these shadow APIs will become the weak link. ... Information technology and operational technology will collide as IT teams assume responsibility for the security of physical devices. This trend will require interoperability between IT and OT, leading to a convergence of technology to determine who can physically get into a building and who can access key applications.
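
As a rough illustration of how shadow APIs might be surfaced, the sketch below diffs the endpoints observed in gateway access logs against a documented inventory; the log format, paths and inventory contents are assumptions for illustration, not drawn from the article.

```python
# A toy sketch: find endpoints that appear in traffic but not in the
# documented API inventory. Paths and log lines are hypothetical.
documented = {"/api/v1/orders", "/api/v1/users"}

access_log = [
    "GET /api/v1/orders 200",
    "POST /api/v1/users 201",
    "GET /api/internal/export 200",  # undocumented -> shadow API candidate
]

observed = {line.split()[1] for line in access_log}
for path in sorted(observed - documented):
    print(f"Shadow endpoint, likely unprotected by policy: {path}")
```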


First for software, agile is a boon to manufacturing

Overall, applying agile methodologies should be a priority for every manufacturer. For aerospace and defense companies, whose complex projects have typically followed the long time horizons of waterfall development, agile design and development are needed to propel the industry into the age of urban air mobility and the future of space exploration. ... Over the past decade, agile software development has focused on DevOps (“development and operations”), which creates the interdisciplinary teams and culture for application development. Likewise, design companies and product manufacturers have taken the lessons of agile and reintegrated them into the manufacturing life cycle. As a result, manufacturing now consists of small teams iterating on products, feeding real-world lessons back into the supply chain, and using software tools to speed collaboration. In the aerospace and defense industry, well known for the complexity of its products and systems, agile is delivering benefits.


Observability, AI And Context: Protecting APIs From Today's (And Tomorrow's) Attacks

Today's digital economy is built on a foundation of APIs that enable critical communications, making it possible to deliver a richer set of services faster to users. Unfortunately, most security solutions reflect an outmoded way of thinking: organizations deploy tools and practices that revolve around network security, intrusion detection and mitigating application vulnerabilities. For modern API-driven applications, now the de facto deployment model in the cloud, these traditional security practices simply do not scale. Given the complexity of APIs and the breadth and depth of their deployment across organizations, security and IT teams need to tackle this problem with a structured process: one grounded in API application security best practices, with procedures that constantly evaluate an organization's APIs and their security posture, and the ability to automate remediation when those APIs are attacked.


2022 will be the year we all start talking about online data collection

From uncovering trends to conducting market research, there are countless reasons why businesses collect publicly available web data from their competitors. Though the competitors in question often engage in data collection themselves, most will regularly block access attempts and make site changes to prevent their public data from being accessed, even though the information targeted is on public display. All this could be about to change. While it may seem counterintuitive (after all, why would you want to give away information to your competitors?), some businesses are beginning to realise that it's in their best interests to allow their public data to be collected by responsible, well-defined, and compliant data practitioners. Firstly, preventing data collection is like a game of whack-a-mole: when you block one tactic, smart practitioners will simply find another. Secondly, accepting some forms of data collection will enable businesses to accurately distinguish between organic user traffic and collector traffic, giving them a clearer insight into what data is being collected and by whom.
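
One hedged sketch of how that distinction could be drawn in practice: classify incoming requests by User-Agent against an allowlist of declared, compliant collectors. The collector names and the substring match below are hypothetical; real schemes would also verify the client, for example by reverse DNS or signed tokens.

```python
# A toy classifier: separate declared collector traffic from organic traffic
# by User-Agent. Collector names are hypothetical examples.
KNOWN_COLLECTORS = {"AcmeDataBot", "PricewatchCrawler"}

def classify(user_agent: str) -> str:
    if any(bot in user_agent for bot in KNOWN_COLLECTORS):
        return "collector"
    return "organic"

print(classify("Mozilla/5.0 (Windows NT 10.0) ..."))               # organic
print(classify("AcmeDataBot/2.1 (+https://example.com/bot-info)"))  # collector
```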


Omnichannel E-commerce Growth Increases API Security Risk

API-led connectivity overcomes the obstacles retailers face in gathering data from disparate systems and consolidating it into monolithic data warehouses. Since each individual system updates separately, information may be out of date by the time it hits the database. APIs enable retailers to build an application network that serves as a connectivity layer for data stores and assets in the cloud, on-premises or in hybrid environments. As a result, mobile applications, websites, IoT devices, and CRM and ERP systems (order management, point of sale, inventory management and warehouse management) can all work as one coherent system that connects and shares data in real time. ... The downside to this rapid growth and development in e-commerce has been a concerning rise in API security attacks, with threat actors executing numerous high-profile breaches against public-facing applications. For example, developers use APIs to connect resources like web registration forms to various backend systems. This flexibility, however, also creates an entry point for automated attacks.
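
A minimal sketch of that connectivity layer, assuming hypothetical internal endpoints for the inventory, order and warehouse systems: one aggregation call fans out to each system of record in parallel, so clients see current data without a monolithic warehouse in the middle.

```python
# A hypothetical aggregation layer: query each system of record in parallel
# rather than copying everything into one stale data warehouse.
from concurrent.futures import ThreadPoolExecutor
import json
import urllib.request

BACKENDS = {  # assumed internal endpoints, for illustration only
    "inventory": "http://inventory.internal/api/stock/SKU-123",
    "orders":    "http://oms.internal/api/orders?sku=SKU-123",
    "warehouse": "http://wms.internal/api/locations/SKU-123",
}

def fetch(item):
    name, url = item
    with urllib.request.urlopen(url, timeout=2) as resp:
        return name, json.load(resp)

def product_view() -> dict:
    # Fan out to all backends concurrently; each stays the system of record.
    with ThreadPoolExecutor() as pool:
        return dict(pool.map(fetch, BACKENDS.items()))
```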


Collaborative Governance Will Be The Driver of The API Economy

Most companies with API programs don’t have advanced API management tools and can only manage a couple of releases a year from inception to production. Collaborative governance, with an automated platform, is the way forward to plug this gap from a business standpoint and help them get to market faster. A whole team can then understand how APIs mature and prepare responses to varying requirements. ... Collaborative governance democratizes the API-building process, as anyone on a team should be able to build, manage, and maintain APIs. Add a low-code, results-driven platform or AI-assisted development tools to the mix, and developers won’t always need to learn new tools and technologies from scratch or interact with multiple parties. By centralizing ownership through version-controlled configuration, enterprises can avoid the disruption caused by manual errors or configuration changes and enable reusability. Time to production is also reduced through continuous integration and delivery (CI/CD).
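
As a sketch of what such a version-controlled governance gate could look like in CI, the check below fails the pipeline when an API definition lacks required governance fields; the field names and example configs are assumptions, not a specific product's schema.

```python
# A hypothetical CI gate over version-controlled API configuration:
# fail the pipeline if any API definition lacks required governance fields.
import sys

REQUIRED_FIELDS = {"owner", "rate_limit", "version"}  # assumed policy

api_configs = {  # in practice, parsed from definition files in the repo
    "orders-api":  {"owner": "team-commerce", "rate_limit": 100, "version": "1.2.0"},
    "reports-api": {"owner": "team-data", "version": "0.9.1"},  # missing rate_limit
}

failures = [
    f"{name}: missing {sorted(REQUIRED_FIELDS - set(cfg))}"
    for name, cfg in api_configs.items()
    if REQUIRED_FIELDS - set(cfg)
]

if failures:
    print("Governance check failed:\n" + "\n".join(failures))
    sys.exit(1)  # block the release until the definition is fixed
```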


How AI helps essential businesses respond to climate change

Underpinning the AI-based forecast platform is a convolutional neural network (CNN) model, which extracts features from radar reflectivity and meteorological satellite images. This is supported by a trained machine-learning model capable of performing highly accurate, close-to-real-time local weather forecasting in minutes, while a generative adversarial network (GAN) generates forecast images with exceptional clarity and detail. One benefit of this AI-based prediction model is that it outperforms traditional physics-based models: the Global/Regional Assimilation and PrEdiction System (GRAPES), for example, requires hours to generate forecasting data, far too slow for organisations that need to make near-real-time decisions based on anticipated weather events. Some of the data is conveyed via high-resolution imagery with one-kilometre grid spacing, and updates every 10 minutes provide fresh insights, enabling real-time adjustments to plans or arrangements based on unfolding or predicted weather events.
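
A minimal sketch of the CNN feature-extraction component, written in PyTorch (our framework choice; the article names none). Channel counts, tile size and the single-channel output are assumptions; the production platform pairs such an extractor with a trained forecasting model and a GAN for sharpening the output imagery.

```python
# A toy convolutional "nowcasting" extractor over stacked radar and
# satellite channels. All sizes are illustrative assumptions.
import torch
import torch.nn as nn

class NowcastCNN(nn.Module):
    def __init__(self, in_channels: int = 8):  # e.g. 4 radar + 4 satellite frames
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # 1x1 convolution maps features to one forecast value per grid cell.
        self.head = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) on a 1 km grid
        return self.head(self.encoder(x))

frames = torch.randn(1, 8, 256, 256)  # a 256 km x 256 km tile (assumed size)
forecast = NowcastCNN()(frames)       # -> (1, 1, 256, 256) forecast grid
```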


Stargate gRPC: The Better Way to CQL

In 2008, Google developed, open-sourced and released Protocol Buffers, a language-neutral mechanism for serializing structured data. In 2015, Google released gRPC (also open source), incorporating Protocol Buffers into its effort to modernize Remote Procedure Calls (RPC). gRPC has two important performance characteristics. One is improved data serialization, which makes data transit over the network much more efficient. The other is the use of HTTP/2, which enables bidirectional communication. As a result, gRPC supports four call types: unary calls; client-side streaming calls; server-side streaming calls; and bidirectional calls, which are a composite of client-side and server-side streaming. Put all this together and you have a mechanism that is fast, very fast compared with other HTTP-based APIs: gRPC message transmission can be 7x to 10x faster than traditional REST APIs. In other words, a solution based on gRPC can offer performance comparable to native drivers.
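
The four call types map directly onto service methods. Below is a minimal Python servicer sketch showing all four; the `orders.proto` service, the generated `orders_pb2` modules and the message fields are hypothetical, standing in for stubs produced by `grpc_tools.protoc`.

```python
# A hypothetical gRPC servicer illustrating the four call types.
# Assumes stubs generated from a hypothetical orders.proto, e.g.:
#   python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. orders.proto
from concurrent import futures
import grpc

import orders_pb2       # hypothetical generated message classes
import orders_pb2_grpc  # hypothetical generated service base class

class OrderService(orders_pb2_grpc.OrderServiceServicer):
    def GetOrder(self, request, context):
        # Unary: one request in, one response out.
        return orders_pb2.Order(id=request.id, status="SHIPPED")

    def UploadOrders(self, request_iterator, context):
        # Client-side streaming: many requests in, one summary out.
        count = sum(1 for _ in request_iterator)
        return orders_pb2.UploadSummary(received=count)

    def WatchOrder(self, request, context):
        # Server-side streaming: one request in, a stream of responses out.
        for status in ("RECEIVED", "PACKED", "SHIPPED"):
            yield orders_pb2.Order(id=request.id, status=status)

    def SyncOrders(self, request_iterator, context):
        # Bidirectional: both sides stream concurrently over HTTP/2.
        for req in request_iterator:
            yield orders_pb2.Order(id=req.id, status="ACKED")

server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
orders_pb2_grpc.add_OrderServiceServicer_to_server(OrderService(), server)
server.add_insecure_port("[::]:50051")
# server.start(); server.wait_for_termination()
```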


2022 promises to be a challenging year for cybersecurity professionals

One thing the pandemic has demonstrated is an unprecedented shift in endpoints, workloads, and where data and applications reside. Today, the Federal workforce remains mostly remote; telework is conducted over modern endpoints such as mobile devices and tablets, and applications and productivity tools are now cloud-hosted solutions. To be effective, an Agency must include those additional endpoints and mobile devices in its asset inventory, manage and validate the devices for conformance with its security policies, and know and validate the identities of both the user and their device. Additionally, cloud-hosted applications must be brought into the zero-trust framework, protected by strong conditional access controls, effective vulnerability management and automated patch management processes. I am optimistic that we can make great strides toward improving cybersecurity in 2022 if we are smart and pragmatic about prioritization, risk management, and leveraging automation to help us work smarter, not harder.
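
A toy sketch of the conditional access idea: grant a cloud-hosted application request only when the user is verified and the device is both in the asset inventory and policy-compliant. The attribute names are illustrative, not drawn from any particular zero-trust product.

```python
# A toy conditional access decision: every factor must check out on
# every request. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_verified: bool        # e.g. MFA passed
    device_in_inventory: bool  # device appears in the agency's asset inventory
    device_compliant: bool     # patched, encrypted, conforms to policy

def allow(req: AccessRequest) -> bool:
    return req.user_verified and req.device_in_inventory and req.device_compliant

print(allow(AccessRequest(True, True, False)))  # False: unmanaged device blocked
```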



Quote for the day:

"Making those around you feel invisible is the opposite of leadership." --  Margaret Heffernan