Showing posts with label BYOC. Show all posts

Daily Tech Digest - June 13, 2025


Quote for the day:

"Never stop trying; Never stop believing; Never give up; Your day will come." -- Mandy Hale




Hacking the Hackers: When Bad Guys Let Their Guard Down

"For defenders, these leaks are treasure troves," says Ensar Seker, chief information security officer (CISO) at threat intelligence cybersecurity company SOCRadar. "When analyzed correctly, they offer unprecedented visibility into actor infrastructure, infection patterns, affiliate hierarchies, and even monetization tactics." The data can help threat intel teams enrich indicators of compromise (IoCs), map infrastructure faster, preempt attacks, and potentially inform law enforcement disruption efforts, he says. "Organizations should track these OpSec failures through their [cyber threat intelligence] programs," Seker advises. "When contextualized correctly, they're not just passive observations; they become active defensive levers, helping defenders move upstream in the kill chain and apply pressure directly on adversarial capabilities." External leaks — like the DanaBot leak — often ironically are rooted in the same causes that threat actors abuse to break into victim networks: misconfigurations, unpatched systems, and improper segmentation that can be exploited to gain unauthorized access. Open directories, exposed credentials, unsecured management panels, unencrypted APIs, and accidental data exposure via hosting providers are all other opportunities for external discovery and exploration, Baker says. 


Prioritising cyber resilience in a cloud-first world

The explosion of data across their multi-cloud, hybrid and on-premises environments is causing concern among global CIOs, with 86 per cent saying it is beyond the ability of humans to manage. Aware that the growing complexity of their multi-provider cloud environments exposes their critical data and puts their organisation’s business resilience at risk, these leaders need to be confident they can restore their sensitive data at speed. They also need certainty when it comes to rebuilding their cloud environment and recovering their distributed cloud applications. To achieve these goals and minimise the risk of contamination resulting from ransomware, CIOs need to ensure their organisations implement a comprehensive cyber recovery plan that prioritises the recovery of both clean data and applications and mitigates downtime. ... Data recovery is just one aspect of cyber resilience for today’s cloud-powered enterprises. Rebuilding applications is an often overlooked task that can prove a time-consuming and highly complex proposition when undertaken manually. Being able to recover what matters most, quickly, should be a tried and tested component of every cloud-first strategy. Fortunately, today’s advanced security platforms now feature automation and AI options that can facilitate this process in hours or minutes rather than days or weeks.


Mastering Internal Influence, a Former CIO’s Perspective

Establishing and building an effective relationship with your boss is one of the most important hard skills in business. You need to consciously work with your supervisor in order to get the best results for them, your organization, and yourself. In my experience, your boss will appreciate you initiating a conversation regarding what is important to them and how you can help them be more successful. Some managers are good at communicating their expectations, but some are not. It is your job to seek to understand what your boss’s expectations are. ... You must start with the assumption that everyone reporting to you is working in good faith toward the same goals. You need to demonstrate a trusting, humble, and honest approach to doing business. As the boss, you need to be a mentor, coach, visionary, cheerleader, confidant, guide, sage, trusted partner, and perspective keeper. It also helps to have a sense of humor. It is first vital to articulate the organization’s values, set expectations, and establish mutual accountability. Then you can focus on creating a safe work ecosystem. ... You’ll begin to change the culture by establishing the values of the organization. This is an important step to ensure that everyone is on the same page and working toward the same goals. Then, you’ll need to make sure they understand what is expected of them.


The Rise of BYOC: How Data Sovereignty Is Reshaping Enterprise Cloud Strategy

BYOC allows customers to run SaaS applications using their own cloud infrastructure and resources rather than relying on a third-party vendor’s infrastructure. This framework transforms how enterprises consume cloud services by inverting the traditional vendor-customer relationship. Rather than exporting sensitive information to vendor-controlled environments, organizations maintain data custody while still receiving fully managed services. This approach addresses a fundamental challenge in modern enterprise architecture: how to maintain operational efficiency while also ensuring complete data control and regulatory compliance. ... BYOC adoption is driven primarily by increasing regulatory complexity around data sovereignty. The article Cloud Computing Trends in 2025 notes that “data sovereignty concerns, particularly the location and legal jurisdiction of data storage, are prompting cloud providers to invest in localized data centers.” Organizations must navigate an increasingly fragmented regulatory landscape while maintaining operational consistency. And when regulations vary country by country, having data in multiple third-party networks can dramatically compound the problem of knowing which data is subject to a specific country’s regulations.


AI Will Steal Developer Jobs (But Not How You Think)

“There’s a lot of anxiety about AI and software creation in general, not necessarily just frontend or backend, but people are rightfully trying to understand what does this mean for my career,” Robinson told The New Stack. “If the current rate of improvement continues, what will that look like in 1, 2, 3, 4 years? It could be pretty significant. So it has a lot of people stepping back and evaluating what’s important to them in their career, where they want to focus.” Armando Franco also sees anxiety around AI. Franco is the director of technology modernization at TEKsystems Global Services, which employs more than 3,000 people. It’s part of TEKsystems, a large global IT services management firm that employs 80,000 IT professionals. ... This isn’t the first time in history that people have fretted about new technologies, pointed out Shreeya Deshpande, a senior analyst specializing in data and AI with the Everest Group, a global research firm. “Fears that AI will replace developers mirror historical anxieties seen during past technology shifts — and, as history shows, these fears are often misplaced,” Deshpande said in a written response to The New Stack. “AI will increasingly automate repetitive development tasks, but the developer’s role will evolve rather than disappear — shifting toward activities like AI orchestration, system-level thinking, and embedding governance and security frameworks into AI-driven workflows.”


Unpacking the security complexity of no-code development platforms

Applications generated by no-code platforms are first and foremost applications. Therefore, their exploitability is first and foremost attributed to vulnerabilities introduced by their developers. To make things worse for no-code applications, they are also jeopardized by misconfigurations of the development and deployment environments. ... Most platforms provide controls at various levels that allow white-listing / blacklisting of connectors. This makes it possible to put guardrails around the use of “standard integrations”. Keeping tabs on these lists in a dynamic environment with a large number of developers is a big challenge. Shadow APIs are even more difficult to track and manage, particularly when some of the endpoints are determined only at runtime. Most platforms do not provide granular control over the use of shadow APIs but do provide a kill switch to disable their use entirely. Mechanisms exist in all the platforms that allow for secure development of applications and automations if used correctly. These mechanisms, which can help prevent injection vulnerabilities, traversal vulnerabilities and other types of mistakes, vary in how complex they are for developers to use. Unvetted data egress is also a big problem in these environments, just as it is in general enterprise environments.
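The connector guardrails and the all-or-nothing shadow API kill switch described above can be sketched as a simple policy check. The connector names and policy shape here are illustrative assumptions, not any specific platform's API:

```python
# Minimal sketch of connector allow/block guardrails for a no-code platform.
# Connector names and the policy structure are invented for illustration.

ALLOWED_CONNECTORS = {"sharepoint", "sql-server", "teams"}
BLOCKED_CONNECTORS = {"personal-dropbox", "unknown-http"}

def connector_permitted(name: str, kill_switch_shadow_apis: bool = True) -> bool:
    """Return True if a connector may be used under the current policy."""
    name = name.lower()
    if name in BLOCKED_CONNECTORS:
        return False
    if name in ALLOWED_CONNECTORS:
        return True
    # Anything not explicitly listed is treated as a shadow API: as noted
    # above, most platforms only offer an all-or-nothing kill switch here.
    return not kill_switch_shadow_apis

print(connector_permitted("sharepoint"))        # True: explicitly allowed
print(connector_permitted("personal-dropbox"))  # False: explicitly blocked
print(connector_permitted("some-new-api"))      # False: shadow API, kill switch on
```

The hard part in practice is not the check itself but keeping the lists current as developers add integrations, which is exactly the governance challenge the excerpt describes.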


Modernizing Financial Systems: The Critical Role of Cloud-Based Microservices Optimization

Cloud-based microservices provide many benefits to financial institutions across operational efficiency, security, and technology modernization. Economically, these architectures enable faster transaction processing by reducing latency and optimizing resource allocation. They also lower infrastructure expenses by replacing monolithic legacy systems with modular, scalable services that are easier to maintain and operate. Furthermore, the shift to cloud technologies increases demand for specialized roles in cloud operations and cybersecurity. In security operations, microservices support zero-trust architectures and data encryption to reduce the risk of fraud and unauthorized access. Cloud platforms also enhance resilience by offering built-in redundancy and disaster recovery capabilities, which help ensure continuous service and maintain data integrity in the event of outages or cyber incidents. ... Building secure and scalable financial microservices requires a few key technologies: Docker and Kubernetes for containerizing and managing multiple microservices, cloud functions for serverless computing to run calculations on demand, API gateways to ensure secure communication between services, and Kafka for real-time data streaming and monitoring.
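The gateway role in that stack can be illustrated with a toy sketch of an API-gateway-style check in front of microservice handlers. The service names and token scheme are invented for illustration; real gateways use proper authentication such as OAuth or mTLS:

```python
# Toy API-gateway sketch: authenticate every request (zero-trust style),
# then route it to the right microservice handler. Illustrative only.

SERVICES = {
    "payments": lambda req: f"processed {req['amount']}",
    "accounts": lambda req: f"balance for {req['user']}",
}
VALID_TOKENS = {"secret-token-1"}  # stand-in for real credential validation

def gateway(service: str, token: str, req: dict) -> str:
    if token not in VALID_TOKENS:
        return "403 forbidden"          # authenticate every call, every time
    handler = SERVICES.get(service)
    if handler is None:
        return "404 unknown service"
    return handler(req)

print(gateway("payments", "secret-token-1", {"amount": "100 EUR"}))  # processed 100 EUR
print(gateway("payments", "bad-token", {"amount": "100 EUR"}))       # 403 forbidden
```

The point of the pattern is that individual services never have to trust the caller directly; all authentication and routing is centralized at the gateway.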


Ultra Ethernet Consortium publishes 1.0 specification, readies Ethernet for HPC, AI

Among the key areas of innovation in the UEC 1.0 specification is a new mechanism for network congestion control, which is critical for AI workloads. Metz explained that the UEC’s approach to congestion control does not rely on a lossless network as has traditionally been the case. It also introduces a new mode of operation where the receiver is able to limit sender transmissions as opposed to being passive. “This is critical for AI workloads as these primitives enable the construction of larger networks with better efficiency,” he said. “It’s a crucial element of reducing training and inference time.” ... Metz said that four workgroups got started after the main 1.0 work began, each with their own initiatives that solidify and simplify deploying UEC. These workgroups include storage, management, compliance and performance. He noted that all of these workgroups have projects under development to strengthen ease of use, deliver efficiency improvements in the next stages, and simplify provisioning. UEC is also working on educational materials to help inform networking administrators on UEC technology and concepts. The group is also working with industry ecosystem partners. “We have projects with OCP, NVM Express, SNIA, and more – with many more on the way to work on each layer – from the physical to the software,” Metz said.
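The receiver-driven transmission limiting Metz describes can be illustrated with a toy credit-based flow-control scheme. This is a drastic simplification for intuition only; the actual UEC 1.0 transport semantics are far more sophisticated:

```python
# Toy credit-based flow control: the receiver grants credits for its free
# buffer slots, and the sender may only transmit while it holds credit.
# This illustrates the receiver actively limiting the sender, rather than
# being a passive endpoint. Not the real UEC protocol.

class Receiver:
    def __init__(self, buffer_slots: int):
        self.credits_available = buffer_slots

    def grant(self) -> int:
        """Hand all currently available credits to the sender."""
        c, self.credits_available = self.credits_available, 0
        return c

    def release(self, n: int) -> None:
        """Free buffer slots after processing, making new credits available."""
        self.credits_available += n

class Sender:
    def __init__(self):
        self.credits = 0
        self.sent = 0

    def receive_grant(self, credits: int) -> None:
        self.credits += credits

    def try_send(self, packets: int) -> int:
        """Send at most as many packets as we hold credits for."""
        sendable = min(packets, self.credits)
        self.credits -= sendable
        self.sent += sendable
        return sendable

rx, tx = Receiver(buffer_slots=4), Sender()
tx.receive_grant(rx.grant())
print(tx.try_send(10))  # 4 -- the receiver, not the sender, sets the pace
```

Because the sender can never outrun the credits it has been granted, buffers are not overrun even without a lossless fabric underneath.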


From Tools to Teammates: Are AI Agents the New Marketers?

The key difference is that traditional models were built as generic tools designed to perform a wide range of tasks. On the other hand, AI agents are designed to meet businesses’ specific needs. Businesses can train a single agent, or a group of them, on their data to handle tasks unique to their operations. This translates to better outcomes, improved performance, and stronger business impact. Another huge advantage of using AI agents is that they help unify marketing efforts and create a cohesive marketing ecosystem. Another major shift that comes with AI agent implementation is something called AIAO (AI Agent Optimisation). This is highly likely to become the next big alternative to traditional SEO. Now, marketers optimise content around specific keywords like “best project management software.” But with AIAO, that’s changing. AI agents are built to understand and respond to much more complex, conversational queries, like “What’s the best project management tool with timeline boards that works for marketing teams?” It’s no longer about integrating the right phrases into your content. It’s about ensuring your information is relevant, clear, and easy for AI agents to understand and process. Semantic search is going to take the lead.
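The shift from exact keywords to looser, meaning-oriented matching can be illustrated with a toy comparison. Real semantic search uses vector embeddings; plain token overlap here is only a stand-in for the idea:

```python
# Toy contrast between exact-keyword matching and similarity scoring.
# Token overlap is a crude stand-in for embedding-based semantic search.

def keyword_match(doc: str, phrase: str) -> bool:
    """Classic exact-phrase matching: all or nothing."""
    return phrase.lower() in doc.lower()

def overlap_score(doc: str, query: str) -> float:
    """Fraction of query terms found in the document (0.0 to 1.0)."""
    d, q = set(doc.lower().split()), set(query.lower().split())
    return len(d & q) / len(q)

doc = "Our project management software offers timeline boards for marketing teams"
query = "best project management tool with timeline boards for marketing teams"

print(keyword_match(doc, "best project management software"))  # False: no exact hit
print(round(overlap_score(doc, query), 2))  # 0.7: partial match still scores
```

The exact phrase fails entirely, while the similarity score still surfaces the document for a conversational query; embedding-based search extends this to synonyms and paraphrases that share no tokens at all.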


Under the Cloud: How Network Design Drives Everything

Let’s be clear, the network isn’t an accessory; it’s the key ingredient that determines how well your cloud performs, how secure your data is, how quickly you can recover from a disaster, and how easily you scale across borders or platforms. Think of it as the highway system beneath your business. Sleek, fast roads make for a smooth ride, while congested or patchy ones will leave you stuck in traffic. ... It’s tempting to get caught up in the flashier parts of cloud infrastructure, like server specs and cutting-edge tools, but none of it works well without a strong network underneath. Here’s the truth. Your network is doing the quiet, behind-the-scenes heavy lifting. It’s what keeps your games lag-free, your financial systems always on, and your hybrid workloads running smoothly across platforms even if it doesn’t get all the attention. You should think of your network as the glue that holds it all together – from your cloud services to your bare metal setup. It is what makes it possible for AI models to work seamlessly across regions, for backups to run smoothly in the background, and for your users to enjoy fast, always-on experiences without ever thinking about what’s happening behind the scenes. ... A reliable, secure and performant network is nothing if it can’t be managed the right way. Having the right architecture, tools and knowledge to support it is key to success.

Daily Tech Digest - March 31, 2025


Quote for the day:

"To succeed in business it is necessary to make others see things as you see them." -- Aristotle Onassis



World Backup Day: Time to take action on data protection

“The best protection that businesses can give their backups is to keep at least two copies, one offline and the other offsite”, continues Fine. “By keeping one offline, an airgap is created between the backup and the rest of the IT environment. Should a business be the victim of a cyberattack, the threat physically cannot spread into the backup as there’s no connection to enable this daisy-chain effect. By keeping another copy offsite, businesses can prevent the backup suffering due to the same disaster (such as flooding or wildfires) as the main office.” ... “As such, traditional backup best practices remain important. Measures like encryption (in transit and at rest), strong access controls, immutable or write-once storage, and air-gapped or physically separated backups help defend against increasingly sophisticated threats. To ensure true resilience, backups must be tested regularly. Testing confirms that the data is recoverable, helps teams understand the recovery process, and verifies recovery speeds, whilst supporting good governance and risk management.” ... “With the move towards a future of AI-driven technologies, the amount of data we generate and use is set to increase exponentially. With data often containing valuable information, any loss or impact could have devastating consequences.”
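The regular backup testing urged above can be partially automated. Here is a minimal sketch, assuming simple file-based backups, that verifies each copy matches the primary by checksum; real recovery testing also has to exercise the restore procedure itself:

```python
# Minimal backup-verification sketch: confirm each backup copy's checksum
# matches the primary. File paths in the demo are stand-ins for real
# offline/offsite copies, which would live on separate media or sites.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copies(primary: Path, copies: list[Path]) -> dict[str, bool]:
    """Map each copy path to whether it exists and matches the primary."""
    want = sha256_of(primary)
    return {str(c): c.exists() and sha256_of(c) == want for c in copies}

# Demo with temporary files standing in for the primary and one copy.
with tempfile.TemporaryDirectory() as d:
    primary = Path(d) / "data.db"
    primary.write_bytes(b"critical records")
    offsite = Path(d) / "offsite.db"
    shutil.copy(primary, offsite)
    print(verify_copies(primary, [offsite]))  # a healthy copy reports True
```

A check like this only proves the copies are intact; as the excerpt notes, teams still need to rehearse the actual recovery to know the data is restorable at the required speed.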


5 Common Pitfalls in IT Disaster Recovery (and How to Avoid Them)

One of the most common missteps in IT disaster recovery is viewing it as a “check-the-box” exercise — something to complete once and file away. But disaster recovery isn’t static. As infrastructure evolves, business processes shift and new threats emerge, a plan that was solid two years ago may now be dangerously outdated. An untested, unrefreshed IT/DR plan can give a false sense of security, only to fail when it’s needed most. Instead, treat IT/DR as a living process. Regularly review and update it with changes to your technology stack, business priorities, and risk landscape. ... A disaster recovery plan that lives only on paper is likely to fail. Many organizations either skip testing altogether or run through it under ideal, low-pressure conditions (far from the chaos of a real crisis). When a true disaster hits, the stress, urgency, and complexity can quickly overwhelm teams that haven’t practiced their roles. That’s why regular, scenario-based testing is essential. ... Even the most robust IT disaster recovery plan can fail if roles are unclear and communication breaks down. Without well-defined responsibilities and structured escalation paths, response efforts become disorganized and slow — often when speed matters most.


How CISOs can balance business continuity with other responsibilities

The challenge for CISOs is providing security while ensuring the business recovers quickly without reinfecting systems or making rushed decisions that could lead to repeated incidents. The new reality of business continuity is dealing with cyber-led disruptions. Organizations have taken note, with 46% of organizations nominating cybersecurity incidents as the top business continuity priority ... While CISOs may find that their remit is expanding to cover business continuity, a lack of clear delineation of roles and responsibilities can spell trouble. To effectively handle business continuity, cybersecurity leaders need a framework to collaborate with IT leadership. Responding to events requires a delicate balance between thoroughness of investigation and speed of recovery that traditional business continuity plan approaches may not fit. On paper, the CISO owns the protection of confidentiality, integrity, and availability, but availability was outsourced a long time ago to either the CIO or facilities, according to Blake. “BCDR is typically owned by the CIO or facilities, but in a cyber incident, the CISO will be holding the toilet chain for the attack, while all the plumbing is provided by the CIO,” he says.


Two things you need in place to successfully adopt AI

A well-defined policy is essential for companies to deploy and leverage this technology securely. This technology will continue to move fast and innovate, giving automation and machines more power in organizational decision-making, and the first line of defense for companies is a clear, accessible AI policy that the whole company is aware of and subscribes to. Enforcing a security policy also means defining what risk ratings are acceptable for an organization, and being able to reprioritize those ratings as the environment changes. There are always going to be errors and false positives. Different organizations have different risk tolerances or different interpretations depending on their operations and data sensitivity. ... Developers need to have a secure code mindset that extends beyond basic coding knowledge. Code written by developers needs to be clear, elegant, and secure. If it is not, the code is left open to attack. Industry-driven secure coding training is therefore a must, and it needs to be built into an organization’s DNA, especially during a time when the already prevalent AppSec dilemma is being intensified by the current tech layoffs.


3 things haven’t changed in software engineering

Strategic thinking has long been part of a software engineer’s job, to go beyond coding to building. Working in service of a larger purpose helps engineers develop more impactful solutions than simply coding to a set of specifications. With the rise in AI-assisted coding—and, thus, the ability to code and build much faster—the “why” remains at the forefront. We drive business impact by delivering measurable customer benefits. And you have to understand a problem before you can solve it with code. ... The best engineers are inherently curious, with an eye for detail and a desire to learn. Through the decades, that hasn’t really changed; a learning mindset continues to be important for technologists at every level. I’ve always been curious about what makes things tick. As a child, I remember taking things apart to see how they worked. I knew I wanted to be an engineer when I was able to put them back together again. ... Not every great coder aspires to be a people leader; I certainly didn’t. I was introverted growing up. But as I worked my way up at Intuit, I saw firsthand how the right leadership skills could deepen my impact, even when I wasn’t charged with leading anybody. I’ve seen how quick decision making, holistic problem solving, and efficient delegation can drive impact at every level of an organization. And these assets only become more important as we fold AI into the process.


Understanding AI Agent Memory: Building Blocks for Intelligent Systems

Episodic memory in AI refers to the storage of past interactions and the specific actions taken by the agent. Like human memory, episodic memory records the events or “episodes” an agent experiences during its operation. This type of memory is crucial because it enables the agent to reference previous conversations, decisions, and outcomes to inform future actions. ... Semantic memory in AI encompasses the agent’s repository of factual, external information and internal knowledge. Unlike episodic memory, which is tied to specific interactions, semantic memory holds generalized knowledge that the agent can use to understand and interpret the world. This may include language rules, domain-specific information, or self-awareness of the agent’s capabilities and limitations. One common semantic memory use is in Retrieval-Augmented Generation (RAG) applications, where the agent leverages a vast data store to answer questions accurately. ... Procedural memory is the backbone of an AI system’s operational aspects. It includes systemic information such as the structure of the system prompt, the tools available to the agent, and the guardrails that ensure safe and appropriate interactions. In essence, procedural memory defines “how” the agent functions rather than “what” it knows.
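The three memory types described above can be sketched as plain containers. This is a deliberately minimal illustration; production agents back episodic memory with conversation stores and semantic memory with vector databases for RAG-style retrieval:

```python
# Sketch of the three agent memory types: episodic (past interactions),
# semantic (facts and knowledge), and procedural (how the agent operates).
from dataclasses import dataclass, field

@dataclass
class AgentMemory:
    episodic: list[dict] = field(default_factory=list)        # past episodes
    semantic: dict[str, str] = field(default_factory=dict)    # factual knowledge
    procedural: dict[str, str] = field(default_factory=dict)  # prompts, tools, guardrails

    def remember_episode(self, user_msg: str, action: str) -> None:
        """Record one interaction so later turns can reference it."""
        self.episodic.append({"user": user_msg, "action": action})

    def recall_fact(self, key: str):
        # A real RAG system would do vector retrieval here, not a dict lookup.
        return self.semantic.get(key)

mem = AgentMemory(procedural={"system_prompt": "You are a helpful agent."})
mem.semantic["capital_of_france"] = "Paris"
mem.remember_episode("What's the capital of France?", "answered: Paris")
print(mem.recall_fact("capital_of_france"))  # Paris
```

The split mirrors the article's framing: procedural memory defines how the agent runs, semantic memory holds what it knows, and episodic memory accumulates what it has done.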


Why Leadership Teams Need Training In Crisis Management

You don’t have the time to mull over different iterations or think about different possibilities and outcomes. You and your team need to make a decision quickly. Depending on the crisis at hand, you’ll need to assess the information available, evaluate potential risks, and make a timely decision. Waiting can be detrimental to your business. Failure to inform customers that their information was compromised during a cybersecurity attack could lead them to take their business elsewhere. ... Crisis or not, communication is how teams facilitate information and build trust. During a crisis, it’s up to the leader to communicate efficiently and effectively to the internal teams. It’s natural for panic to ensue during a time of unpredictability and stress. ... it’s not only internal communications that you’re responsible for. You also need to consider what you’re communicating to your customers, vendors, and shareholders. This is where crisis management can come in handy. While you should know how best to speak to your team, communicating externally can prove more challenging. ... One crisis can be the end of your business if not handled properly and considerately. This is especially the case for businesses that undergo internal crises, such as cybersecurity attacks, product recalls, or miscalculated marketing campaigns.


SaaS Is Broken: Why Bring Your Own Cloud (BYOC) Is the Future

BYOC allows customers to run SaaS applications using their own cloud infrastructure and resources rather than relying on a third-party vendor’s infrastructure. This hybrid approach preserves the convenience and velocity of SaaS while balancing cost and ownership with the control of self-hosted solutions. Building a BYOC stack that is easy to adopt, cost-effective, and performant is a significant engineering challenge. But as a software vendor, there are many benefits to your customers that make it worth the effort. ... SaaS brought speed and simplicity to software consumption, while traditional on-premises deployments offered control and predictability. But a more balanced approach is emerging as companies face rising costs, compliance challenges, and the need for data ownership. BYOC is the consolidated evolution of both worlds — combining the convenience of SaaS with the control of on-premises deployments. Instead of sending massive amounts of data to third-party vendors, companies can run SaaS applications within their cloud infrastructure. This means predictable costs, better compliance, and tailored performance. We’ve seen this hybrid model succeed in other areas. Meta’s Llama gained massive adoption as users could run it on their infrastructure.


What Happens When AI Is Used as an Autonomous Weapon

The threat to enterprises is already substantial, according to Ben Colman, co-founder and CEO at deepfake and AI-generated media detection platform Reality Defender. “We’re seeing bad actors leverage AI to create highly convincing impersonations that bypass traditional security mechanisms at scale. AI voice cloning technology is enabling fraud at unprecedented levels, where attackers can convincingly impersonate executives in phone calls to authorize wire transfers or access sensitive information,” Colman says. Meanwhile, deepfake videos are compromising verification processes that previously relied on visual confirmation, he adds. “These threats are primarily coming from organized criminal networks and nation-state actors who recognize the asymmetric advantage AI offers. They’re targeting communication channels first because they’re the foundation of trust in business operations.” Attackers are using AI capabilities to automate, scale, and disguise traditional attack methods. According to Casey Corcoran, field CISO at SHI company Stratascale, examples include creating more convincing phishing and social engineering attacks and automatically modifying malware so that it is unique to each attack, thereby defeating signature-based detection.


Worldwide spending on genAI to surge by hundreds of billions of dollars

“The market’s growth trajectory is heavily influenced by the increasing prevalence of AI-enabled devices, which are expected to comprise almost the entire consumer device market by 2028,” said Lovelock. “However, consumers are not chasing these features. As the manufacturers embed AI as a standard feature in consumer devices, consumers will be forced to purchase them.” In fact, for organizations, AI PCs could solve key issues organizations face when using cloud and data center AI instances, including cost, security, and privacy concerns, according to a study released this month by IDC Research. This year is expected to be the year of the AI PC, according to Forrester Research. It defines an AI PC as one that has an embedded AI processor and algorithms specifically designed to improve the experience of AI workloads across the central processing unit (CPU), graphics processing unit (GPU), and neural processing unit, or NPU. ... “This reflects a broader trend toward democratizing AI capabilities, ensuring that teams across functions and levels can benefit from its transformative potential,” said Tom Mainelli, IDC’s group vice president for device and consumer research. “As AI tools become more accessible and tailored to specific job functions, they will further enhance productivity, collaboration, and innovation across industries.”

Daily Tech Digest - February 12, 2025


Quote for the day:

“If you don’t have a competitive advantage, don’t compete.” -- Jack Welch


Security Is Blocking AI Adoption: Is BYOC the Answer?

Enterprises face unique hurdles in adopting AI at scale. Sensitive data must remain within secure, controlled environments, avoiding public networks or shared infrastructures. Traditional SaaS models often fail to meet these stringent data sovereignty and compliance demands. Beyond this, organizations require granular control, comprehensive auditing and full transparency to trace every AI decision and data access. This ensures vendors cannot interact with sensitive data without explicit approval and documentation. These unmet needs create a significant gap, preventing regulated industries from deploying AI solutions while maintaining compliance and security. ... The concept of Bring Your Own Cloud (BYOC) isn’t new. It emerged as a middle ground between traditional SaaS and on-premises deployments, promising to combine the best of both worlds: the convenience of managed services with the control and security of on-premises infrastructure. However, its history in the industry has been marked by both successes and cautionary tales. Early BYOC implementations often failed to live up to their promises. Some vendors merely deployed their software into customer cloud accounts without proper architectural planning, resulting in what was essentially remotely managed on-premises environments. 


The Importance of Continuing Education in Data and Tech

Continuing education plays a vital role in workforce development and career advancement within the tech industries, where rapid technological advancements and evolving market demands necessitate a culture of lifelong learning. As businesses increasingly rely on sophisticated data analytics, artificial intelligence (AI), and cloud technologies, professionals in these fields must continuously update their skills to remain competitive. Continuing education offers a pathway for individuals to acquire new capabilities, adapt to emerging technologies, and gain proficiency in specialized areas that are in high demand. By engaging in ongoing learning opportunities, tech professionals can enhance their expertise, making them more valuable to their current employers and more attractive to potential future ones. ... Professional certifications and competency-based education have become significant avenues for career advancement in the data and tech field. As the landscape of technology rapidly evolves, organizations increasingly seek professionals who possess validated skills and up-to-date knowledge. Professional certifications serve as tangible proof of one’s expertise in specific areas such as data governance, analytics, cybersecurity, or cloud computing. These certifications, offered by leading industry bodies and tech companies, are designed to align with current industry standards and demands.


Agents, shadow AI and AI factories: Making sense of it all in 2025

“Agentic AI” promises “digital agents” that learn from us, and can perceive, reason problems out in multiple steps and then make autonomous decisions on our behalf. They can solve multilayered questions that require them to interact with many other agents, formulate answers and take actions. Consider forecasting agents in the supply chain predicting customer needs by engaging customer service agents, and then proactively adjusting warehouse stock by engaging inventory agents. Every knowledge worker will find themselves gaining these superhuman capabilities backed by a team of domain-specific task agent workers helping them tackle large complex jobs with less expended effort. ... However, the proliferation of generative, and soon agentic AI, presents a growing problem for IT teams. Maybe you’re familiar with “shadow IT,” where individual departments or users procure their own resources, without IT knowing. In today’s world we have “shadow AI,” and it’s hitting businesses on two fronts. ... Today’s enterprises create value through insights and answers driven by intelligence, setting them apart from their competitors. Just as past industrial revolutions transformed industries — think about steam, electricity, internet and later computer software — the age of AI heralds a new era where the production of intelligence is the core engine of every business. 


Is VMware really becoming the new mainframe?

“CIOs can start to unwind their dependence on VMware,” he says. “But they need to know it may not have any material reduction in their spend with Broadcom over multiple renewals. They’re going to have to get completely off Broadcom.” Still, Warrilow recommends that CIOs running VMware consider alternatives over the long term. They should also look for exit strategies for other market-dominant IT products they use, given that Broadcom has seen early success with VMware, he says. “The cautionary tale for CIOs is that this is just the beginning,” he says. “Every tech investment firm is going to be saying, ‘I want what Broadcom has with their share price.’” ... “The comparison works a bit, maybe from a stickiness perspective, because customers have built their applications and workload using virtualization technology on VMware,” he says. “When they have to do a mass refactoring of applications, it’s very, very hard.” But the analogy has its limitations because many users think of mainframes as a legacy technology, while VMware’s cloud-based products address future challenges, he adds. “The cloud is the future for running your AI workload,” Shenoy says. “Customers have trusted us for the last 20 to 25 years to run their business-critical applications, and the interesting part right now is we are seeing a lot of growth of these AI workloads and container workloads running on VMware.”


Deep Learning – a Necessity

It is essential in architecture that we realize that a skill set is not an arbitrary thing. It isn’t “learn one skill and you are done.” It also isn’t “learn any skill from any background and you’re in.” It is the application of all of the identified and necessary skills combined that makes a distinguished architect. It is also important to understand the purpose and context of mastery. Working in a startup is very different from working in a large corporation. Industry can change things significantly as well. Always remember that the profession’s purpose has to be paramount in the learning. For example, both doctors and lawyers have to deal with clients and need human interaction skills to be successful. Yet, the nature and implementation of these skills differ drastically. We will explore this point in a further article. However, do not underestimate the impact of changing the meaning of the profession while claiming similar skills. The current environment is rife with this kind of co-opting of terminology and tools to fundamentally alter the purpose of architecture. ... In medicine and other professions, an individual studies and practices for 7+ years to become fully independent, and they never stop learning. This learning is tracked by both mentors and the profession. Because medicine is so essential to humans, it is important that professionals are measured and constantly update and hone their competencies.


Crawl, then walk, before you run with AI agents, experts recommend

The best bet for percolating AI agents throughout the organization is to keep things as simple as possible. "Companies and employees that have already found ways to operationalize intelligent agents for simple tasks are best placed to exploit the next wave with agentic AI," said Benjamin Lee, professor of computer and information science at the University of Pennsylvania. "These employees would already be engaging generative AI for simple tasks and they would be manually breaking complex tasks into simpler tasks for the AI. Such employees would already be seeing productivity gains from using generative AI for these simple tasks." Rowan agreed that enterprises should adopt a crawl, walk, run approach: "Begin with a pilot program to explore the potential of multiagent systems in a controlled, measurable environment." "Most people say AI is at the toddler stage, whereas agentic AI is like a tween," said Ben Sapp, global practice lead of intelligence at Digital.ai. "It's functional and knows how to execute certain functions." Enterprises and their technology teams "should socialize the use of generative AI for simple tasks within their organizations," Lee continued. "They should have strategies for breaking complex tasks into simpler ones so that, when intelligent agents become a reality, the sources of productivity gains are transparent, easily understood, and trusted."
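The strategy Lee describes — breaking a complex task into simpler subtasks, each handled by a generative model call whose result stays individually reviewable — can be sketched as a small pipeline. This is an illustrative pattern only; `call_model` is a placeholder for whatever LLM endpoint an organization uses, and the three-step decomposition is hypothetical.

```python
# Minimal sketch of the "break complex tasks into simpler ones" strategy.
# call_model stands in for any generative-AI API; nothing here is a
# specific vendor's SDK.

def call_model(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM endpoint here.
    return f"[draft for: {prompt}]"

def decompose(task: str) -> list[str]:
    # A real agent might ask the model itself to produce this plan;
    # here we hard-code a three-step split for illustration.
    return [f"{task} - step {i}" for i in (1, 2, 3)]

def run_pipeline(task: str) -> list[str]:
    # Each simple subtask gets its own model call, so the sources of
    # any productivity gain stay transparent and easy to audit.
    return [call_model(sub) for sub in decompose(task)]

results = run_pipeline("summarize Q2 incident reports")
```

Keeping the decomposition explicit, as above, is what lets teams later swap in an autonomous agent for any single step without losing visibility into the whole workflow.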


Growth of digital wallet use shaking up payment regulations and benefits delivery

Australian banks are calling on the government to pass legislation that accommodates payments with digital wallets within the country’s regulatory framework. A release from the Australian Banking Association (ABA) argues that with the country’s residents making $20 billion worth of payments across 500 million transactions each month with mobile wallets, all players within the payment ecosystem should be under the remit of the Reserve Bank of Australia. ... Digital wallets are by far the most popular method of making cross-border payments, according to a new report from Payments Cards & Mobile. The How Digital Wallets Are Transforming Cross-Border Transactions report shows digital wallets are chosen for international transactions by 42.1 percent, more than the next two most popular methods, money transfer services (16.8 percent) and bank accounts (14.8 percent), combined. Transactions with digital wallets are much faster than wire transfers, are available to people who don’t possess bank accounts, and have lower fees than bank transfers, the report says. Interoperability remains a challenge, and regulations and infrastructure limitations could pose barriers to adoption, but the report authors only expect the dominance of digital wallets to increase in the years ahead.


My vision is to create a digital twin of our entire operations, from design and manufacturing to products and customers

We approach this transformation from three dimensions. First is empathy – truly understanding not just who our customers are, but their emotions. This is where the concept of creating a ‘digital twin’ of the customer comes in. Second is innovation – not just adopting new technologies but ensuring that our processes are lean, digitised, and seamless throughout the customer journey, from research to purchase, service, and brand loyalty. The goal is to provide a consistent and empathetic experience across all touchpoints.  ... The first challenge is identifying our customers. For example, if a distributor in one business also buys from another or if a consumer connects with one of our industrial projects, it’s hard to track. To address this, we launched a customer UID project, which has been in progress for months. It helps us identify customers across channels while keeping an eye on privacy and adhering to upcoming data protection regulations. The second part involves gathering all customer-related data in one place. Over the past three years, we unified all customer interactions into a single platform with a one CRM strategy, which was complex but essential. Now, with AI solutions like social listening combined with sentiment analysis, we can understand what our customers are saying about us and where we need to improve, both in India and globally. 


Will AI Chip Supply Dry Up and Turn Your Project Into a Costly Monster?

CIOs and other IT leaders face tremendous pressure to quickly develop GenAI strategies in the face of a potential supply shortage. Given the cost of individual units, spending can easily reach into the multi-million-dollar range. But it wouldn’t be the first time companies have dealt with semiconductor shortages. During the COVID-19 pandemic, a spike in PC demand for remote work, combined with global shipping disruptions, created a chip drought that impacted everything from refrigerators to automobiles and PCs. “One thing we learned was the importance of supply chain resiliency, not being overly dependent on any one supplier and understanding what your alternatives are,” Hoecker says. “When we work with clients to make sure they have a more resilient supply chain, we consider a few things … One is making sure they rethink how much inventory they want to keep for their most critical components so they can survive any potential shocks.” She adds, “Another is geographic resiliency, or understanding where your components come from and whether you feel you’re overly exposed to any one supplier or any one geography.” Nvidia’s GPUs, she notes, are harder to find alternatives for -- but other chips do have alternatives. “There are other places where you can dual-source or find more resiliency in your marketplace.”


WTF? Why the cybersecurity sector is overrun with acronyms

Imagine an organization is in the midst of a massive hack or security breach, and employees or clients are having to Google frantically to translate company emails, memos or crisis plans, slowing down the response. When these acronyms inevitably migrate into a cybersecurity company’s external marketing or communications efforts, they’re almost guaranteed to cause the general public to tune out news about issues and innovations that could have a far-reaching impact on how people live their lives and conduct their businesses. This is especially true as artificial intelligence (AI!) and machine learning (ML!) technologies expand and new acronyms emerge to keep pace with developments. Acronyms can also have unfortunate real-life connotations — point of sale, to name just one example. When shortened to POS, it can suggest something is… well, crappy. ... So, what’s behind the tendency to shorten terms to a jumble of often incomprehensible acronyms and abbreviations? “On the one hand, acronyms, abbreviations and jargon are used to achieve brevity, standardization and efficiency in communication, so if a profession is steeped in complex and technical language, it will likely be flowing with acronyms,” says Ian P. McCarthy, a professor of innovation and operations management at Simon Fraser University in Burnaby, British Columbia.