Daily Tech Digest - April 21, 2023

A team of ex-Apple employees wants to replace smartphones with this AI projector

It's a seamless blend of technology and human interaction that Humane believes can extend to daily schedule run-downs, seeing map directions, and receiving visual aids for cooking or when fixing a car engine -- as suggested by the company's public patents. The list goes on. Chaudhri also demoed the wearable's voice translator, which converted his English into French while using an AI-generated voice to retain his tone and timbre, as reported by designer Michael Mofina, who watched the recorded TED Talk before it was taken down. Mofina also shared an instance when the wearable was able to recap the user's missed notifications without sounding invasive, framing them as, "You got an email, and Bethany sent you some photos." Perhaps the biggest draw to Humane and its AI projector is the team behind it. That roster includes Chaudhri, a former Director of Design at Apple who worked on the Mac, iPod, iPhone, and other prominent devices, and Bethany Bongiorno, also from Apple, who was heavily involved in the software management of iOS and macOS.


Three issues with generative AI still need to be solved

Generative AI uses massive language models, it’s processor-intensive, and it’s rapidly becoming as ubiquitous as browsers. This is a problem because existing, centralized datacenters aren’t structured to handle this kind of load. They are I/O-constrained, processor-constrained, database-constrained, cost-constrained, and size-constrained, making a massive increase in centralized capacity unlikely in the near term, even though the need for this capacity is going vertical. These capacity problems will increase latency, reduce reliability, and over time could throttle performance and reduce customer satisfaction with the result. The need is for a more hybrid approach where the AI components necessary for speed are retained locally (on devices) while the majority of the data resides centrally to reduce datacenter loads and decrease latency. Without a hybrid solution — where smartphones and laptops can do much of the work — use of the technology is likely to stall as satisfaction falls, particularly in areas such as gaming, translation, and conversations where latency will be most annoying.
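The hybrid split described above can be sketched as a simple router: a lightweight on-device model handles latency-sensitive requests, and only heavier queries go to a central service. Everything here (the `local_model`/`remote_model` stand-ins and the token threshold) is illustrative; the article describes the pattern, not an API.

```python
# Hypothetical sketch of latency-aware routing between a small on-device
# model and a large centralized model. All names and thresholds are
# illustrative assumptions, not part of any real product.
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    served_by: str  # "local" or "remote"

def local_model(prompt: str) -> Answer:
    # Stand-in for a small on-device model: fast, limited capability.
    return Answer(text=f"[local] {prompt[:32]}", served_by="local")

def remote_model(prompt: str) -> Answer:
    # Stand-in for a large datacenter model: capable, higher latency.
    return Answer(text=f"[remote] full answer to: {prompt}", served_by="remote")

def route(prompt: str, latency_sensitive: bool, max_local_tokens: int = 20) -> Answer:
    """Serve latency-sensitive, small requests locally; send the rest upstream."""
    if latency_sensitive and len(prompt.split()) <= max_local_tokens:
        return local_model(prompt)
    return remote_model(prompt)
```

The point of the sketch is only that the routing decision is cheap to make at the edge, which is where latency-sensitive uses like translation and conversation would benefit.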


Exploring The Incredible Capabilities Of Auto-GPT

The first notable application is code improvement. Auto-GPT can read, write and execute code and thus can improve its own programming. The AI can evaluate, test and update code to make it faster, more reliable, and more efficient. In a recent tweet, Auto-GPT’s developer, Significant Gravitas, shared a video of the tool checking a simple example function responsible for math calculations. While this particular example contained only a simple syntax error, the AI corrected the mistake in roughly a minute; in a codebase containing hundreds or thousands of lines, the same fix could take a human much longer. ... The second notable application is in building an app. Auto-GPT detected that Varun Mayya needed the Node.js runtime environment to build an app, which was missing on his computer. Auto-GPT searched for installation instructions, downloaded and extracted the archive, and then started a Node server to continue with the job. While Auto-GPT made the installation process effortless, Mayya cautions against using AI for coding unless you already understand programming, as it can still make errors.


The Best (and Worst) Reasons to Adopt OpenTelemetry

Gathering telemetry data can be a challenge, and with OpenTelemetry now handling essential signals like metrics, traces and logs, you might feel the urge to save your company some cash by building your own system. As a developer myself, I totally get that feeling, but I also know how easy it is to underestimate the effort involved by just focusing on the fun parts when kicking off the project. No joke, I’ve actually seen organizations assign teams of 50 engineers to work on their observability stack, even though the company’s core business is something else entirely. Keep in mind that data collection is just a small part of what observability tools do these days. The real challenge lies in data ingestion, retention, storage and, ultimately, delivering valuable insights from your data at scale. ... At the very least, auto-instrumentation will search for recognized libraries and APIs and then add some code to indicate the start and end of well-known function calls. Additionally, auto-instrumentation takes care of capturing the current context from incoming requests and forwarding it to downstream requests.
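What auto-instrumentation does under the hood can be illustrated with a toy decorator: wrap recognized function calls in spans, and forward the incoming trace context to downstream calls. This mimics the idea only; it is not the real OpenTelemetry API, and `SPANS` is a stand-in for an exporter.

```python
# Toy illustration of auto-instrumentation: record a span around each
# well-known function call and propagate the caller's trace context
# downstream. Not the real OpenTelemetry API; names are illustrative.
import functools
import uuid

SPANS = []  # collected (span_name, trace_id) records, stand-in for an exporter

def instrument(fn):
    """Record a span around fn, extracting or creating the trace context."""
    @functools.wraps(fn)
    def wrapper(*args, context=None, **kwargs):
        ctx = context or {"trace_id": uuid.uuid4().hex}  # extract, or start a trace
        SPANS.append((fn.__name__, ctx["trace_id"]))      # mark the call as a span
        return fn(*args, context=ctx, **kwargs)           # inject into downstream call
    return wrapper

@instrument
def handle_request(payload, context=None):
    # An instrumented entry point that fans out to a downstream call.
    return fetch_user(payload["user"], context=context)

@instrument
def fetch_user(user_id, context=None):
    # A downstream call that inherits the same trace context.
    return {"user": user_id}
```

Real auto-instrumentation agents do this by patching known libraries and HTTP clients at load time, so both spans above would share one trace without any application code changes.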


OpenAI’s hunger for data is coming back to bite it

The Italian authority says OpenAI is not being transparent about how it collects users’ data during the post-training phase, such as in chat logs of their interactions with ChatGPT. “What’s really concerning is how it uses data that you give it in the chat,” says Leautier. People tend to share intimate, private information with the chatbot, telling it about things like their mental state, their health, or their personal opinions. Leautier says it is problematic if there’s a risk that ChatGPT regurgitates this sensitive data to others. And under European law, users need to be able to get their chat log data deleted, he adds. OpenAI is going to find it near-impossible to identify individuals’ data and remove it from its models, says Margaret Mitchell, an AI researcher and chief ethics scientist at startup Hugging Face, who was formerly Google’s AI ethics co-lead. The company could have saved itself a giant headache by building in robust data record-keeping from the start, she says. Instead, it is common in the AI industry to build data sets for AI models by scraping the web indiscriminately and then outsourcing the work of removing duplicates or irrelevant data points, filtering unwanted things, and fixing typos.


Executive Q&A: The State of Cloud Analytics

Many businesses are trying hard right now to stay profitable during these times of economic uncertainty. The startling takeaway to us was that business and technical leaders see cloud analytics as the tool -- not a silver bullet, but a critical component -- for staying ahead of the pack in the current economic climate. Not only that, organizations need to do more with less and, as it turns out, cloud analytics is not only a wise investment during good economic times, but also in more challenging economic times. Businesses reap benefits from the same solution (cloud analytics) in either scenario. For example, cloud analytics is typically more cost-effective than on-premises analytics solutions because it eliminates the need for businesses to invest in expensive hardware and IT infrastructure. It also offers the flexibility businesses need to quickly experiment with new data sources, analytics tools, and data models to get better insights -- without having to worry about the underlying infrastructure.


AI vs. machine learning vs. data science: How to choose

It's a common topic for organizational leaders—they want to be able to articulate the core differences between AI, machine learning (ML), and data science (DS). However, sometimes they do not understand the nuances of each and thus struggle to strategize their approach to things such as salaries, departments, and where they should allocate their resources. Software-as-a-Service (SaaS) and e-commerce companies specifically are being advised to focus on an AI strategy without being told why or what that means exactly. Understanding the complexity of the tasks you aim to accomplish will determine where your company needs to invest. It is helpful to quickly outline the core differences between each of these areas and give better context to how they are best utilized. ... To decide whether your company needs to rely on AI, ML, or data science, focus on one principle to begin: Identify the most important tasks you need to solve and let that be your guide.


The strong link between cyber threat intelligence and digital risk protection

ESG defined cyber threat intelligence as, “evidence-based actionable knowledge about the hostile intentions of cyber adversaries that satisfies one or several requirements.” In the past, this definition really applied to data on IoCs, reputation lists (e.g., lists of known bad IP addresses, web domains, or files), and details on TTPs. The intelligence part of DRP is intended to provide continuous monitoring of things like user credentials, sensitive data, SSL certificates, or mobile applications, looking for general weaknesses, hacker chatter, or malicious activities in these areas. For example, a fraudulent website could indicate a phishing campaign using the organization’s branding to scam users. The same applies for a malicious mobile app. Leaked credentials could be for sale on the dark web. Bad guys could be exchanging ideas for a targeted attack. You get the picture. It appears from the research that the proliferation of digital transformation initiatives is acting as a catalyst for threat intelligence programs. When asked why their organizations started a CTI program, 38% said “as a part of a broader digital risk protection effort in areas like brand reputation, executive protection, deep/dark web monitoring, etc.”


4 perils of being an IT pioneer

An enterprise-wide IT project is deemed successful only when a team member at the lowest level of the hierarchy adopts it. Ensuring adoption of any new solution is always a challenge, and even more so for a solution based on a new technology. There’s pushback from end users because they find the idea of losing power or skills in the face of new technology disconcerting. For any IT leader, overcoming this mental inertia is always among the toughest challenges. Moreover, IT leaders have seen many initiatives based on new technologies fail because there was no buy-in from the company’s top leadership. Even if users adopt the new technology, the initial learning curve is often steep, impacting productivity. Most organizations can’t afford or aren’t ready to accept the temporary revenue loss due to the disruption caused by the new technology. Therefore, business and IT leaders must have a clear understanding of the risk/reward principle when rolling out new tech. Buy-in from top management as a top-down mandate can make adoption of new technology easier.


Is Generative AI an Enterprise IT Security Black Hole?

Shutting the door on generative AI might not be a possibility for organizations, even for the sake of security. “This is the new gold rush in AI,” says Richard Searle, vice president of confidential computing at Fortanix. He cited news of venture capital looking into this space along with tech incumbents working on their own AI models. Such endeavors may make use of readily available resources to get into the AI race fast. “One of the important things about the way that systems like GPT-3 were trained is that they also use common crawl web technology,” Searle says. “There’s going to be an arms race around how data is collected and used for training.” That may also mean increased demand for security resources as the technology floods the landscape. “It seems like, as in all novel technologies, what’s happening is the technology is racing ahead of the regulatory oversight,” he says, “both in organizations and the governmental level.”



Quote for the day:

"Our chief want is someone who will inspire us to be what we know we could be." -- Ralph Waldo Emerson

Daily Tech Digest - April 20, 2023

How to succeed as a fractional CIO

A fractional CIO is typically an experienced IT leader who is external to the enterprise yet acts as an accountable leader and extension of the executive team, says Dave Hartman, president of IT management consulting firm Hartman Executive Advisors. “A fractional CIO thinks beyond technical needs and considers the needs of the organization from a strategic business perspective.” ... Beyond expertise and management skills, a fractional CIO can provide an independent point of view to enterprise leadership in critical areas, such as emerging technologies and IT security, as well as updating or building a technology roadmap. “This can help key decision-making and can sometimes segue into providing more hands-on help in executing on the chosen path,” says Amelia Tyagi, co-founder and CEO of Business Talent Group. In some cases, an enterprise may turn to a fractional CIO to serve as an interim executive, assuming the leadership role for a fixed period of time. “Fractional and interim CIOs are particularly effective solutions for companies that are undergoing rapid change, or those that have an unexpected leadership gap,” Tyagi explains.


AI Heightens Cyber Risk for Legacy Weapon Systems

Artificial intelligence's nascent centrality to offensive weapons development means the United States should take bold steps to ensure that adversaries are unable to develop their models, said Rand Corp. CEO Jason Matheny. "These AI models right now are very brittle," Matheny said. "We need to be thinking about ways that we can slow down progress elsewhere by doing things like adversarial attacks, data poisoning and model inversion. Let's use the tricks that we're seeing used against us and make sure that we understand the state of the art." Data poisoning - in which adversaries alter the data used to train AI models in order to distort the resulting algorithms - is already a risk for the United States, said Shift5 co-founder and CEO Josh Lospinoso. "These are real problems," he said. "We need to think clearly about shoring up those security vulnerabilities in our AI algorithms before we deploy these broadly and have to clean the mess up afterwards."


TUC says government is failing to protect workers from AI harms

While AI-powered workplace surveillance offers greater control to organisations over worker behaviour, Pakes said the increasing datafication of employees is also a “profit centre” for employers, which can sell the data on to third parties. “Not all of us, but many of us, can take our work just about anywhere now with technology, but it also means our work and our bosses can follow us just about everywhere, into our private lives, into our homes,” he said, adding that AI-powered surveillance is no longer restricted to the “canary in the coal mine” of logistics and warehouse workers. “It doesn’t matter if you’re blue collar or white collar, doesn’t matter if you’re in a factory, in the office or at home – this software can check us and track us and invade us, and we really need to talk about it.” Gina Neff, executive director of the Minderoo Centre for Technology and Democracy at the University of Cambridge, said that as part of her research she has interviewed numerous economists who only offered a “collective shrug” when asked what they think the overall, long-term impact of AI will be on work.


ENISA’s Threat Landscape and the Effect of Ransomware

According to ENISA, cybersecurity threats continued to grow during the COVID-19 pandemic. The pandemic increased cybersecurity threats and attack surfaces. It also provided attackers opportunities to exploit the new normal, partly because of the growth in people’s online presence (e.g., social media), hybrid working models and the transition to more cloud-based solutions. The boom in the transportation industry’s courier, express and parcel (CEP) business was also a factor because, during the pandemic, CEP delivery services became a critical infrastructure. The acceleration in new artificial intelligence (AI) technology and advanced features (e.g., AI adaptability through machine learning [ML] and automated phishing email distributions) also spurred the growth of cybersecurity threats. These cyberattacks became more mainstream, leading to more targeting of enterprises through home offices. In addition, state-backed or state-sponsored groups have taken advantage of the pandemic to conduct cyberespionage and implement COVID-19–related social engineering lures.


This new technology could blow away GPT-4 and everything like it

Known as Hyena, the technology is able to achieve equivalent accuracy on benchmark tests, such as question answering, while using a fraction of the computing power. In some instances, the Hyena code is able to handle amounts of text that make GPT-style technology simply run out of memory and fail. "Our promising results at the sub-billion parameter scale suggest that attention may not be all we need," write the authors. That remark refers to the title of a landmark AI paper of 2017, "Attention Is All You Need". In that paper, Google scientist Ashish Vaswani and colleagues introduced the world to Google's Transformer AI program. The Transformer became the basis for every one of the recent large language models. But the Transformer has a big flaw. It uses something called "attention," where the computer program takes the information in one group of symbols, such as words, and moves that information to a new group of symbols, such as the answer you see from ChatGPT, which is the output.
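The "attention" operation described above can be shown in a few lines: each output position mixes information from all input positions, weighted by how similar its query is to each key. This is a minimal pure-Python sketch of scaled dot-product attention for illustration only; real models do this with batched tensor math, and its quadratic cost in sequence length is exactly the bottleneck alternatives like Hyena aim to avoid.

```python
# Minimal scaled dot-product attention: for each query, score every key,
# softmax the scores into weights, and output a weighted mix of values.
# Pure-Python illustration; not production code.
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """queries/keys/values: lists of equal-length vectors (lists of floats)."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(dimension).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        # Each output is a weighted mix of all value vectors.
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs
```

Note the nested loop over queries and keys: with n tokens the score matrix has n × n entries, which is why memory and compute blow up on long inputs.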


Skills-first hiring can increase talent pools by up to 20x

Generally, employers are looking for candidates with transferable in-demand skills such as leadership and specific technological abilities, Duke said. “In nearly all cases, employers will find that candidates always have skills that can be applied to a position, even if the candidate is coming from a vastly different industry. This increases the talent pool and makes it easier for employers to find good candidates,” she said. As an example of how the skills-first approach works, Duke said that when employers looking to hire digital marketing managers use this method, the available talent pool increases by almost 22x. “That’s because many of the skills associated with this job are common across other jobs and industries,” she explained. “In this case, about 30 separate job titles across the U.S. have relevant skills for this job, but most companies would overlook those candidates.” Every role at an organization can be broken down into a set of skills needed to do the job well. Every person has a set of skills, whether they’re an existing employee or part of an external talent pool, Duke said. 


Data privacy implications of ChatGPT

Although automated decision-making can be useful for organizations, there are serious concerns and risks to individuals subject to such processes, such as adverse legal effects based on processes they may not understand or that may be exacerbating and replicating biases and discriminatory practices. For example, the American Civil Liberties Union has opined that “AI is built by humans and deployed in systems and institutions that have been marked by entrenched discrimination . . . bias is in the data used to train the AI . . . and can rear its head throughout the AI’s design, development, implementation, and use.” Similar concerns were raised in a 2022 Constangy webinar on AI featuring Commissioner Keith Sonderling of the Equal Employment Opportunity Commission. Further, the Italian data protection authority is investigating additional data privacy implications of ChatGPT, such as whether it can comply with the GDPR, its legal basis for processing, collecting, and storing mass amounts of personal data, and its lack of age verification tools. In the meantime, Italy has temporarily banned ChatGPT.


Why is ETL Dying?

Traditional ETL pipelines have faced difficulties in supporting the agility required by modern analytics use cases, leaving business users waiting in line for their desired results. As a result, ETL pipelines are often viewed as a hindrance to better performance, and businesses must carefully assess their current role and explore how they can be optimally leveraged in the contemporary analytics landscape. Traditional ETL processes require moving large amounts of data across various stages and systems, making them slow, demanding on resources, and prone to errors. This can be challenging for modern data-driven businesses to manage, as traditional ETL tools often come with a high price tag and demand substantial investments in hardware, software, and personnel resources. In contrast, newer data platforms present pre-built services and extensions that can lessen these expenses and enable enterprises to concentrate on providing meaningful outcomes to their users. For example, Google Cloud's Datastream is one instance of this approach, capable of managing real-time CDC with minimal coding or setup.
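The change data capture (CDC) idea mentioned above can be illustrated with a toy snapshot diff: compare two versions of a table and emit insert/update/delete events. Managed services like Datastream read the database's change log rather than diffing snapshots, but the emitted events look broadly similar; this sketch is illustrative only.

```python
# Toy sketch of change data capture: diff two table snapshots and emit
# change events. Real CDC services tail the database's transaction log
# instead of diffing, but the output events take a similar shape.
def capture_changes(old, new):
    """old/new: dicts mapping primary key -> row dict. Returns change events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))      # new row appeared
        elif old[key] != row:
            events.append(("update", key, row))      # existing row changed
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))  # row was removed
    return events
```

Streaming these events as they happen is what lets downstream analytics stay current without the periodic bulk moves that make traditional ETL slow and error-prone.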


These medical IoT devices carry the biggest security risks

"Advances in technology are essential to improve the speed and quality of care delivery as the industry is challenged with a shortage of care providers, but with increasingly connected care comes a bigger attack surface," said Mohammad Waqas, Armis' principal solutions architect for healthcare. "Protecting every type of connected device, medical, IoT, even the building management systems, with full visibility and continuous contextualised monitoring is a key element to ensuring patient safety." The prevalence of unprotected devices comes as the healthcare sector continues to face fresh cybersecurity risks. The sector saw a 31% climb in threat activities between January and March this year compared to the previous quarter, according to Armis, citing figures from its intelligence platform. Other evidence suggests the healthcare sector is increasingly reliant on connected devices. ... Singapore's Cyber Security Agency (CSA) has also warned that critical IoT devices are potential targets in ransomware attacks, with cyber criminals recognising that the infection of these devices could lead to significant downtime costs and damage.


IBM takes a pragmatic approach to enterprise AI

IBM has integrated AI with its mainframes. The newest z16 Big Iron boasts an AI accelerator built onto its core Telum processor that can do 300 billion deep-learning inferences per day with one millisecond latency, according to IBM. The latest version of its z/OS operating system will include a new AI Framework for system operations to optimize IT processes, simplify management, improve performance, and reduce skill requirements. The new version will also support technologies to deploy AI workloads co-located with z/OS applications and will feature improved cloud capabilities. IBM said AI-powered workload management will intelligently predict upcoming workloads and react by allocating an appropriate number of batch runs, thus eliminating manual fine-tuning and trial-and-error approaches. “Systems are getting more and more complex, so we want to simplify operations with AI and automation by bringing a very prescriptive solution to our clients that will give them value out of the box and then much more,” Chopra said.



Quote for the day:

"Leaders need to strike a balance between action and patience." -- Doug Smith

Daily Tech Digest - April 19, 2023

Why Your Current Job May Be Holding Back Your IT Career

Failing to pursue professional development opportunities and not maintaining a current and relevant skillset are both great ways to shift a career into neutral. “This includes not keeping up with the latest industry trends and technologies, not networking with other professionals, and not pursuing additional training or education opportunities,” Delfine says. “IT professionals need to continually develop their skillsets and be aware of and learn new methods and tools that can be applied across multiple industries.” Another mistake is spending too little or too much time in a particular role. Knowing when to stay and when to move on is a skill within itself, says Erin Goheen, vice president of technology at freight and logistics services firm XPO. “I've seen cases where job-hopping can be detrimental to one's career because it prohibits technologists from maximizing the amount of learning and skill development gained in a particular role,” she explains. “Conversely, if you’re in a role for too long and you're no longer learning and expanding your professional capabilities, other professionals who are actively growing in similar roles will pass you in their career trajectories.”


Top risks and best practices for securely offboarding employees

Shadow IT and information systems that aren’t part of a business’s identity and access management (IAM) architecture are a huge risk to successful, secure offboarding, says Richard Jones, global CISO at Orange Cyberdefense. This is magnified for cloud and SaaS systems/applications that don’t require specific network access or physical presence in an office, with IT teams often unaware of the extent of employees’ SaaS usage. ... Another challenge is managing software asset licenses. If employees aren’t properly offboarded from cloud system licenses this can lead to excessive IT costs as well as security risks, as licenses are often charged per user, per month, Jones says. It’s not just the risks of outgoing employees themselves that CISOs need to consider. “In most cases, mass layoffs cause remaining employees to be concerned about their job security, which can increase insider threats and introduce security gaps caused by unintentional negligence,” says Mohan Koo, CTO at DTEX Systems.


How Cybersecurity Leaders Can Capitalize on Cloud and Data Governance Synergy

In today’s modern organizations, explosive amounts of digital information are being used to drive business decisions and activities. However, both organizations and individuals may not have the necessary tools and resources to effectively carry out data governance at a large scale. I’ve experienced this scenario in both large private and public sector organizations: trying to wrangle data in complex environments with multiple stakeholders, systems, and settings. It often leads to incomplete inventories of systems and their data, along with who has access to it and why. Cloud-native services, automation, and innovation enable organizations to address these challenges as part of their broader data governance strategies and under the auspices of cloud governance and security. Many IaaS hyperscale cloud service providers offer native services to enable activities such as data loss protection (DLP). For example, Amazon Macie automates the discovery of sensitive data, provides cost-efficient visibility, and helps mitigate the threats of unauthorized data access and exfiltration.


Seven Tips for Achieving Dynamic Professional Transformation with Framework Modeling

Framework modeling can be a significant differentiator and can empower professionals with rich knowledge repositories of best practices derived from frameworks. The modeling of the framework offers a big-picture approach and life cycle perspective for achieving goals. This can aid professionals as existing and emerging technologies impact which professional skills are relevant and required in the market. Innovative technologies continue to emerge and create an impact on employment due to new services made possible through innovation and automation. For example, there is much speculation about how ChatGPT will impact employment opportunities in various lines of work. There is also widespread concern that management will prefer to harness technology rather than employees when considering value delivery in the future. Hence, professionals as knowledge workers can benefit by upgrading their skills by adapting the framework modeling approach. ...  Framework modeling can be considered the skill of carving the required knowledge from the structure and contents of a framework per an enterprise’s needs.


FBI and FCC warn about “Juicejacking” – but just how useful is their advice?

The idea is simple: people on the road, especially at airports, where their own phone charger is either squashed away deep in their carry-on luggage and too troublesome to extract, or packed into the cargo hold of a plane where it can’t be accessed, often get struck by charge anxiety. Phone charge anxiety, which first became a thing in the 1990s and 2000s, is the equivalent of electric vehicle range anxiety today, where you can’t resist plugging in for a bit more juice right now, even if you’ve only got a few minutes to spare, in case you hit a snag later on in your journey. But phones charge over USB cables, which are specifically designed so they can carry both power and data. So, if you plug your phone into a USB outlet that’s provided by someone else, how can you be sure that it’s only providing charging power, and not secretly trying to negotiate a data connection with your device at the same time? What if there’s a computer at the other end that’s not only supplying 5 volts DC, but also sneakily trying to interact with your phone behind your back?


7 keys to controlling serverless cloud costs

Overprovisioning memory and CPU allocation are two culprits often found behind serverless computing cost overruns. When you execute a serverless function in your cloud application, your CSP allocates resources according to the function’s configuration. Then when billing time comes around, your CSP bases your billing on the amount of resources your application consumes. It makes good business sense to spend the extra time during the design phase to determine the appropriate amount of resources that each serverless function requires, so you’re minimizing costs. Train your cloud developers to use compute only when necessary, advises CloudZero. They give the example of using step functions to call APIs instead of Lambda functions, meaning you only pay for the step functions. The major CSPs and cloud management platforms include key performance indicator (KPI) monitoring dashboards of one form or another. You can also use observability tools, such as Datadog, for KPI monitoring. Monitoring your serverless KPIs should figure prominently in your project and deployment plans.
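The over-provisioning point above comes down to simple arithmetic: serverless compute is typically billed as memory allocation times duration times a per-GB-second rate, so doubling the memory allocation doubles the compute bill even if the function never uses it. The rate below is an illustrative assumption, not a quote of any provider's current pricing.

```python
# Back-of-envelope serverless cost model showing why over-provisioned
# memory inflates the bill: cost = (memory in GB) x (duration in s)
# x (rate per GB-second). The default rate here is illustrative only.
def monthly_cost(invocations, avg_duration_s, memory_mb,
                 rate_per_gb_s=0.0000166667):
    gb_seconds = invocations * avg_duration_s * (memory_mb / 1024)
    return gb_seconds * rate_per_gb_s
```

For example, at 10 million invocations of 200 ms each, dropping the allocation from 1024 MB to 256 MB cuts the compute portion of the bill to a quarter, which is why right-sizing during the design phase pays off.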


New DDoS attacks on Israel’s enterprises, infrastructure should be a wake-up call

“Generally speaking, all these attacks happen with more or less sophisticated forms, either abusing different vulnerabilities and systems or brute force DDoS,” Izrael said. “What’s different about these is that an unsophisticated DDoS tactic would be to blast a website with traffic and take it down. What’s happening here is that attackers have been targeting a lot of weak spots where they are taking down services.” Izrael added that the attackers have also managed to hobble, albeit briefly, smart IoT functionality at individual homes, buildings and other structures. Justin Cappos, professor of computer science and engineering at the NYU Tandon School of Engineering, said network provisioning operators need to pay attention to any new group launching large-scale DDoS attacks. ... Izrael said the combination of direct attacks by the Iranian government and indirect attacks by affiliated groups achieves two goals: keeping the provenance of the attacks very murky and making the attack seem bigger because the origin of the attacks is unclear. 


Rising to the challenge: the role of boards in effective bank governance

Effective governance has been a priority of our supervision for several years, and will continue to be in the years to come. As part of our work on this priority, we are carrying out an update of our supervisory expectations on governance. Today’s seminar is an important opportunity to listen to the industry as we fine-tune those expectations, and marks one of many milestones along the way. Particularly in the current climate, it is essential for banks to have strong and effective governance. A bank needs a board that can steer it through calm and stormy waters alike, setting the compass on the strategy for the bank, while ensuring a sustainable business model and monitoring risks in a forward-looking manner. In today’s environment, backward-looking indicators of risk might be misleading. It is therefore more important than ever for boards to be vigilant. Boards need to take a proactive approach to identifying emerging risks and trends, assessing potential impacts on the bank, and taking appropriate actions to mitigate them.


Unlocking the power of a multigenerational workforce

Those organisations that don’t innovate die a slow death; those that are not open to change and not forward-looking will not be far behind. Organisations have to constantly employ different ‘listening methods’ to gauge the pulse of employees across generations, check on new trends and keep revisiting their programs and policies to imbibe what’s new, instead of sticking to the ‘tried and tested’. ... Learning only happens when one’s thoughts and opinions are challenged by people from entirely different backgrounds or with a very different thought process from one’s own. The influx of talent from diverse groups, especially from across generations, hence remains essential for the organisation. The early-age talent brings enthusiasm and challenge; the older age group infuses much-needed wisdom and experience! Sensitising managers and leaders is equally important, since they steer the staff who take the organisation ahead, especially in turbulent times. ‘How to lead a team with members across generations’ is a learning module that organisations must invest in – incorporating elements like empathy, situational leadership and leaving one’s ego behind.


CIO Fletcher Previn on designing the future of work

The network that can properly support hybrid work needs to be more distributed and porous, and it has a very different attack surface than when we were all in the office. Technologies like Zero Trust become even more important, along with split tunnel VPNs and having the right endpoint security strategy so you don’t have to backhaul all the traffic in order to inspect it. You need carrier and path diversity at your carrier neutral facilities and network points of presence, and you want to have a good peering strategy so you can bring applications closer to the end users and take traffic off the public internet. Full-stack observability becomes more urgent in a hybrid world. How do we really understand the experience our employees are having when they are connecting from across all sorts of networks that we don’t manage? We need to understand the performance of the public internet and various SaaS tools in order to really know what our hybrid work experience is going to be for our people. We also need tools that provide valuable observability that lets us detect and fix problems before our employees even know there is an issue brewing.



Quote for the day:

"Leadership should be born out of the understanding of the needs of those who would be affected by it." -- Marian Anderson

Daily Tech Digest - April 17, 2023

The Power Of Silence: 10 Reasons Silent People Are Successful

Being silent often goes hand-in-hand with improved observation. When you’re not focused on expressing your thoughts, you have more mental bandwidth to take in your surroundings. This heightened awareness allows you to understand people and situations better. Many successful individuals credit their observation skills as contributing to their achievements. By carefully observing their environment, they can identify opportunities and threats others might overlook. A quiet mind leads to better focus and concentration. When you’re silent, it’s easier to direct your attention to the task at hand, free from distractions or competing thoughts. This improved focus can enhance your decision-making abilities and boost your overall productivity. ... Silence can be a powerful tool for emotional regulation. Silent individuals often excel at managing their emotions, avoiding impulsive actions, and maintaining composure in challenging situations. Staying calm under pressure can lead to better decision-making and increased resilience.


7 cybersecurity mindsets that undermine practitioners and how to avoid them

Security is often seen as a standalone function or additional product that is bolted onto the real infrastructure or as a discrete thing to be finalized and delivered. This is a long-standing view in software development, something similar to the way we once thought about quality: as a distinct, separate component of things. “Quality is not an act, it’s a habit,” according to an elegant paraphrase of Aristotle. Just like quality, security is not a finished product but rather an ongoing discipline. When we see security as a practice, to be continually refined and honed, it frees up the energy to engage it as such. We grow healthier by exercising regularly and monitoring our diet daily; such is security. If we want to get good at guitar or a martial art, we must keep coming back to it and refining it, but there is always more to develop — just as in security. Instead of bemoaning this fact, we can lean into it and use it to fuel our efforts. It’s actually a blessing to work in a field that always has room for growth and can fully engage our capabilities. 


A distributed database load-balancing architecture with ShardingSphere

The key point of ShardingSphere-Proxy cluster load balancing is that the database protocol itself is designed to be stateful (connection authentication status, transaction status, Prepared Statements, and so on). If the load balancer in front of ShardingSphere-Proxy cannot understand the database protocol, your only option is layer-4 (transport-level) load balancing of the ShardingSphere-Proxy cluster. In this case, a specific proxy instance maintains the state of the database connection between the client and ShardingSphere-Proxy. Because the proxy instance maintains the connection state, layer-4 load balancing can only achieve connection-level load balancing: multiple requests on the same database connection cannot be distributed across multiple proxy instances, so request-level load balancing is not possible. ... Theoretically, there is no functional difference between a client connecting directly to a single ShardingSphere-Proxy or to a ShardingSphere-Proxy cluster through a load-balancing portal. However, there are some differences in the technical implementation and configuration of the different load balancers.
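The connection-level limitation described above can be sketched in a few lines of Python. This is a toy illustration, not ShardingSphere code: the class names and proxy labels are invented, and a real layer-4 balancer works on TCP connections rather than Python objects. The point it shows is that balancing happens once, at connect time, so every statement on a connection is pinned to the instance holding that connection's state.

```python
from itertools import cycle

class FourLayerBalancer:
    """Toy sketch of layer-4 (transport-level) load balancing:
    the balancing decision is made once per connection, so every
    request on that connection goes to the same proxy instance."""

    def __init__(self, proxy_instances):
        self._next = cycle(proxy_instances)  # simple round-robin

    def open_connection(self):
        # The chosen instance then holds all connection state
        # (auth status, transactions, prepared statements).
        return Connection(next(self._next))

class Connection:
    def __init__(self, proxy):
        self.proxy = proxy

    def execute(self, sql):
        # Every statement is routed to the pinned instance; spreading
        # statements across instances would lose the connection state.
        return (self.proxy, sql)

balancer = FourLayerBalancer(["proxy-1", "proxy-2"])
conn_a = balancer.open_connection()
conn_b = balancer.open_connection()

# All statements on conn_a hit one instance; conn_b hits the other.
routes_a = {conn_a.execute(q)[0] for q in ("BEGIN", "UPDATE t SET x=1", "COMMIT")}
routes_b = {conn_b.execute(q)[0] for q in ("SELECT 1",)}
print(routes_a, routes_b)
```

Request-level balancing would require the balancer itself to speak the database protocol and track per-connection state, which is exactly what a plain layer-4 proxy cannot do.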


How Synthetic Data Can Help Train AI and Maintain Privacy

Common use cases for synthetic data include software engineering when new features are built but no production data is available, says Jim Scheibmeir, senior director analyst with Gartner. For instance, software being tested for an autonomous vehicle might need new information about the weather or obstructions in the road, he says; different scenarios can be generated to test that autonomous algorithm and prepare it. Data scientists who are trying to create new algorithms, Scheibmeir says, or need to prove out new hypotheses might struggle to get their hands on production data. That limited availability might have to do with restricted access, compliance, or regulation, making synthetic data attractive. The rise of generative AI might also play a role in synthetic data generation. “Certainly, ChatGPT is going to reinvigorate our imagination of what generative can do for us,” Scheibmeir says. “Gartner urges organizations to look at proper test data management, including synthetic data generation, for a few different reasons.”
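The basic idea of synthetic test data can be shown with a deliberately naive sketch: resample each column independently from a toy "production" table, so no real row is reproduced wholesale. Everything here, including the column names and values, is invented for illustration; real synthetic-data tools model joint distributions and privacy guarantees rather than sampling columns independently.

```python
import random

def synthesize(rows, n, seed=0):
    """Naive synthetic-data sketch: draw each column's values
    independently from the empirical values in `rows`, producing
    n new records that preserve per-column distributions but do
    not copy whole real rows by construction of each field."""
    rng = random.Random(seed)
    # Pool the observed values per column.
    columns = {key: [r[key] for r in rows] for key in rows[0]}
    return [
        {key: rng.choice(values) for key, values in columns.items()}
        for _ in range(n)
    ]

# Toy stand-in for production data (entirely made up).
production = [
    {"age": 34, "city": "Oslo", "spend": 120.0},
    {"age": 51, "city": "Lyon", "spend": 80.5},
    {"age": 29, "city": "Kyiv", "spend": 310.2},
]

synthetic = synthesize(production, n=100)
print(len(synthetic), sorted(synthetic[0]))
```

Even this crude approach yields arbitrarily many test records in the production schema, which is often all a feature-development or CI pipeline needs.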


What business executives don’t understand about IT

When the CEO doesn’t think IT is important enough to get top-level attention, that message filters down to the rest of the corporation. IT is not viewed as being as important as Sales, Finance, Manufacturing, Operations, or Marketing — dangerous in a highly competitive environment where efficient or innovative systems can spell the difference between the corporation’s success or failure. ... Systems development is another important area executives need to understand. The systems IT develops will not be used by IT; rather, they will become integral to the requesting department. It is important, therefore, that management understand the processes involved in proposing the system, estimating the cost, determining the ROI, producing the deliverables, changing the specifications and time frames, and measuring the system’s effectiveness. After all, the completed system may impact sales projections, departmental costs, and individual incentives, to name a few. Management must also ensure that the people in the user organization are given the time and recognition to do the work required to develop the precise specifications of the system.


Tech companies including Adobe are taking a new look at a big industry debt issue

Despite the drag of technical debt that the data suggests, some industry executives say it gets a bad reputation. “If you’re tech-debt-free, you’re not innovating,” said Frans Xavier, CTO of low-code/no-code security automation platform Swimlane. In this sense, technical debt is a signal of iteration. In fact, in a recent report from consumer electronics company TE Connectivity, 55% of the engineers surveyed said it’s iteration — not total transformation — that represents innovation at its core. Adobe head of strategic development for creative cloud partnerships Chris Duffey is looking to reshape technical debt. “I would offer to reframe technical debt as the value of insight gathering throughout the innovation creation process,” Duffey said. The “fail fast” dogma that propels much of the technology industry (when not taken literally) references experimentation, insight gathering, and optimization, he added. This can be hard to see when you look solely at the data, in part because it’s difficult to quantify the process of innovation. 


Moving beyond DEI: Fostering belongingness in the workplace

Measuring belongingness is different than simply measuring diversity and inclusion. Diversity and inclusion are behaviours, meaning they can be mostly measured through policy and procedures. On the other hand, belongingness is an emotional response that covers an array of factors such as an individual’s trust, comfort, and openness towards the company. Therefore, belongingness happens when the employee is ‘valued’, and value here means not only are they acknowledged and appreciated for their work, but they also understand how their work contributes to the company’s vision, mission, key priorities, and growth. It also means that ‘they matter’ – being part of the teams, on ‘top of mind’ for leading and driving the initiatives, being ‘trusted’ and ‘cared for’, and that is the ultimate cement that joins them to the culture of the company. Belongingness is in these little things that define “moments that matter” – let's explain that in greater detail with questions that come to an employee’s mind when they experience an organization.


Enhance data governance with distributed data stewardship

Data stewards are a central point of contact. They enforce accountability of the data lifecycle, and oversee data governance and visibility. In many instances, data stewardship is a centralized business or IT function. These settings require enterprise data governance or expertise in data management and governance execution. Distributed data stewardship is a model or framework that allows teams closest to the data to manage access and permissions. Data management is decentralized and resides within the business unit. ... The core component of a distributed data stewardship program is similar to a data stewardship one. The success of such a model depends on how well a decentralized IT, governance and distributed access management model works. Because a distributed data stewardship model delegates data management responsibilities throughout the enterprise, the fundamental difference between a data stewardship model and a distributed data stewardship model is in shifting an organization toward decentralizing data access. This requires time, effort, cadence and key stakeholders who agree and adhere to such a framework.


Cognitive flexibility: the science of how to be successful in business and at work

Cognitive flexibility aids learning under uncertainty and helps in negotiating complex situations. This is not merely about changing your decisions. Higher cognitive flexibility involves rapidly realising when a strategy is failing and changing strategies. The importance of cognitive flexibility was first discovered in clinical patients. The function engages areas of the brain involved with decision making, including the prefrontal cortex and striatal circuitry. When this circuitry becomes dysfunctional due to neurological diseases or psychiatric disorders, it can cause rigidity of thought and a failure to adapt. Cognitive flexibility is required in many real-world situations. The category of workers that requires the highest level of adaptability is arguably entrepreneurs. Entrepreneurs need to show flexibility not only in terms of idea generation, but also for resource allocation and social exchanges. Indeed, our previous research has shown that entrepreneurs, compared with high-level managers, have increased cognitive flexibility. This ultimately helps them to solve problems and make risky decisions successfully.


IT leadership: Mission-driven IT and finding your "why"

People talk about IT strategy or tech strategy or product strategy; they talk about deliverables, roadmaps, all of that stuff. To me, it all starts with the mission, and our mission is to transform lives by unlocking better evidence. And really what that means day to day is helping facilitate and support and enable the clinical trial process, which we know in recent years especially has—the importance of which is really second to none. It’s accelerated during the pandemic, naturally, as we look for treatments and preventatives for Covid. But now, what it’s done is it’s poured gas on the fire in a whole bunch of other areas, too. So the industry is working faster than ever, and I like to think we’re doing life-changing work. I believe we are. And the technology that we build at Clario and the expertise that we bring helps support the companies that are running clinical trials, the sponsors, the people who are running trials day to day, the sponsor—or the trial teams, as well as the sites. You know, the folks, the nurses, the clinicians, the physicians who are all part of this process and helping facilitate this.



Quote for the day:

"Increasingly, management's role is not to organize work, but to direct passion and purpose." -- Greg Satell

Daily Tech Digest - April 15, 2023

6 best practices to develop a corporate use policy for generative AI

The first step to craft your corporate use policy is to consider the scope. For example, will this cover all forms of AI or just generative AI? Focusing on generative AI may be a useful approach since it addresses large language models (LLMs), including ChatGPT, without having to boil the ocean across the AI universe. ... Involve all relevant stakeholders across your organization – This may include HR, legal, sales, marketing, business development, operations, and IT. Each group may see different use cases and different ramifications of how the content may be used or misused. Involving IT and innovation groups can help show that the policy isn’t just a clamp-down from a risk management perspective, but a balanced set of recommendations that seek to maximize productive use and business benefit while at the same time managing business risk. Consider how generative AI is used now and may be used in the future – Working with all stakeholders, itemize all your internal and external use cases that are being applied today, and those envisioned for the future.


There Is No Resilience without Chaos

Chaos engineering has emerged as an increasingly essential process for maintaining application reliability, not only in cloud native environments but in any IT environment. Unlike pre-production testing, chaos engineering involves determining when and how software might break in production by testing it in a non-production scenario. In this way, chaos engineering becomes an essential way to prevent outages long before they happen. ... Chaos engineering, when done properly, requires observability. Problems that can cause outages or degrade performance can be detected well ahead of time, as bugs, poor performance, security vulnerabilities, and the like become manifest during a proper chaos engineering experiment. Once these bugs and kinks that could potentially lead to outages if left unheeded are detected and resolved, true continued resiliency in DevOps can be achieved. In the event of a failure, the SRE or operations person seeking the source of error is often overloaded with information.
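The core mechanic of a chaos experiment is simple enough to sketch: deliberately inject latency and failures into a dependency, then verify that the calling code's resilience measures (here, a retry loop) still produce a correct result. This is a minimal illustration, not a chaos-engineering tool; the decorator, function names, and fault rates are all invented for the example.

```python
import random
import time

def chaos(failure_rate=0.2, max_latency_s=0.05, seed=None):
    """Wrap a function with fault injection: random added latency
    plus occasional simulated timeouts, mimicking what a chaos
    experiment does to a dependency."""
    rng = random.Random(seed)

    def decorator(fn):
        def wrapper(*args, **kwargs):
            time.sleep(rng.uniform(0, max_latency_s))  # inject latency
            if rng.random() < failure_rate:
                raise TimeoutError(f"chaos: injected failure in {fn.__name__}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

def call_with_retry(fn, attempts=5):
    """The resilience measure under test: retry on injected timeouts."""
    for attempt in range(1, attempts + 1):
        try:
            return fn(), attempt
        except TimeoutError:
            continue
    raise RuntimeError("service unavailable after retries")

@chaos(failure_rate=0.5, max_latency_s=0.001, seed=42)
def fetch_price():
    return 99

result, attempts_used = call_with_retry(fetch_price)
print(result, attempts_used)
```

Running this with observability in place (logs, metrics on `attempts_used`) is what turns random breakage into a controlled experiment: you learn how often the retry budget is consumed long before a real outage forces the question.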


Data Governance: Simple and Practical

Purpose-driven data governance programs narrow their focus to deliver urgent business needs and defer much of the rest, with a couple caveats. First, data governance programs are doomed to fail without senior executive buy-in and continuous engagement of key stakeholders. Without them, no purpose can be fulfilled. Second, data governance programs must identify and gain commitment from relevant (but perhaps not all) data owners and stewards, but that doesn’t necessarily mean roles and responsibilities need to be fully fleshed out right away. Identify the primary purpose then focus on it – sounds like a simple formula, but it’s not obvious. Many data governance leaders are quick to define and pursue their three practices or five pillars or seven elements, and why shouldn’t they? They need those capabilities, but wanting it all comes at the sacrifice of getting it now. Generate business value with your primary purpose before expanding. ... An insurer explained to me their dashboards weren’t always refreshed, and when they were, wide fluctuations in values made it impossible to make informed decisions.


What is platform engineering? Evolving devops

The developer portal is the main mechanism and expression of platform engineering. Its main purpose is to gather together the developer's tooling, documentation, and interactivity in one place. It is a kind of front end to the organization's developer infrastructure. Developer portals (aka internal developer platforms) have evolved out of several needs and trends. This primer on developer portals delineates these tools into three types: universal service catalog, API catalog tied to API gateway, and microservices catalog. APIs figure large in platform engineering because the uptake of microservices architecture has caused a great deal of increased complexity for modern software teams. Orchestrating microservices in a large organization can be very challenging. Just understanding what microservices are involved in a given use case can be difficult. A developer portal offers a unified view into the overall web of microservices. Another aspect of the developer portal is offering a standard framework to combine the tools used by the organization.


EU privacy regulators to create task force to investigate ChatGPT

In a statement posted on its website, the EDPB said the task force was intended to “foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities.” Last month, Italy’s data privacy regulator issued a temporary ban against ChatGPT over alleged privacy violations relating to the chatbot’s collection and storage of personal data. Italy's guarantor for the protection of personal data ordered the temporary halt on the processing of Italian users’ data by ChatGPT’s parent firm OpenAI, unless it complied with EU privacy laws. In order to have the service reinstated, the Italian guarantor outlined a list of data protection requirements that OpenAI must comply with, including increased transparency into how ChatGPT processes data, the right for nonusers to opt out of having their data processed, and an age-gating system for signing up to the service. In the wake of the ban, OpenAI CEO Sam Altman tweeted: “We of course defer to the Italian government and have ceased offering ChatGPT in Italy (though we think we are following all privacy laws).”


The mechanics of entrepreneurship

Lidow codifies this innovative shove, arguing that entrepreneurs invent and create enduring change in one of three ways: by scaling supply, scaling demand, or scaling simplicity. The first category includes those who scaled up their supply by devising an efficient system and then repeating it. In the late 1700s, the enterprising coin-maker Matthew Boulton, for example, leveraged his superior knowledge of metalworking to create a new process for producing coins quickly and uniformly—spawning countless societal changes. This included the swarm of entrepreneurs in the early- to mid-1800s who conceived the modern railway. Titans of the second category, scaled demand, include cultivators of desire like Wedgwood, Selfridge, and the American PR pioneer Edward Bernays, who coined the phrase “public relations” and created the industry. Through carefully cultivated propaganda campaigns, Bernays convinced wide swaths of folks in the US to support the country’s efforts in World War I and, later, stimulated broad demand for products such as bacon and tobacco.


3 IT leadership mistakes to avoid

The first exercise we undertook was brainstorming and agreeing on a set of operating principles, such as all ideas would be respected regardless of which side they came from; facts and data—not emotion—would drive decision-making; and creating a positive client experience would be our collective North Star. These principles became our rallying cry and helped lead the team to a very successful client conversion. Contrast that with leaders who set rigid rules for their teams to follow. Leading by a set of hard rules will limit innovation, hinder individual and team development, and create a constant need to add or modify the rules as situations change. ... There is no such thing as a perfect organizational structure—there’s only an array of alternatives, each with its own respective strengths and weaknesses. The only way to make an inherently flawed organizational structure work is to have individuals collaborate under a common strategy, purpose, and shared goals. Great teams also take individuals who are willing to sacrifice for the good of the whole.


Data leader Tejasvi Addagada on the value of data governance

If data is siloed, it cannot be used for developing insights and products. For an organization that is yet to invest in managing its data and thinks centralization is costly or a bottleneck, a data mesh architecture is a decentralized approach at its core, with each domain team ingesting its operational and analytical data and developing data products. ... From the initial concept of corporate governance, IT governance has evolved into the recent concept of data governance. Globally, the adoption of cloud services, the evolution of modern data stacks, and improved data literacy have led to a greater interest in governing data over the past years. Implementing data governance is necessary to get sustainable value from data. A subfunction can be formalized as an authorized provisioning service. It can support activities that help ensure that a data element can be rightfully sourced from a designated provisioning point. In addition, it can have the domain team express their trust by certifying data as a system of record that is authorized for provisioning.


Google Cloud Unveils AI Tools to Streamline Preauthorizations

“The Claims Acceleration Suite’s Claims Data Activator uses Document AI, Healthcare Natural Language API, and Healthcare API to convert this unstructured data to structured data and establish data interoperability,” Waldron says. “This speeds up the process, and significantly reduces administrative burdens and costs, enabling experts to make faster, more informed decisions that improve patient care.” A quick prior authorization process is essential to speeding up the process for a patient who may need approval for transportation to an important medical procedure such as a colonoscopy, according to Waldron. Patients also seek prior authorizations to use a digital device as part of weight management or a care management plan for conditions such as diabetes. A goal of Google’s Claims Data Activator is to make healthcare prior authorization data more interoperable, or accessible for all parties. 


Data sharing between public and private is the answer to cybersecurity

Businesses and governments are already interlinked in their attempts to keep ahead of cybercriminals. You only need to look at examples such as the recent Royal Mail attack, which saw the NCSC and the business working together to reduce its impact. And across the Pond, Biden’s newly announced Cybersecurity Strategy will focus on ensuring closer collaboration on cyber between government and industry. Whilst all of this is moving in the right direction, there’s more work to be done to create more intentional and systematic cross-sharing and learning from one another. To kickstart the open flow of knowledge in the industry, both public and private organizations could sponsor a wider peer network for security experts that streamlines intelligence from private to public or vice versa and offers support. Gartner offers a Peer Connect network of business leaders that encourages the open discussion of trends and ideas, critical to business decision-making.



Quote for the day:

"A leader's dynamic does not come from special powers. It comes from a strong belief in a purpose and a willingness to express that conviction." -- Kouzes & Posner

Daily Tech Digest - April 07, 2023

Why leadership training fails — and how to fix it

It could be attributed to the existing culture of the leadership team. Do the organization’s leaders possess a growth mindset, or is intellectual humility lacking? Unless a leader wants to improve, it’s unlikely that they will. Leaders must be motivated to make the time and have the patience for the kind of reflective practice that makes learning stick. What about the group of high-potential employees, who have every intention of applying what they learn, but then struggle to translate the skills and knowledge into practice? It’s possible that the way the program is designed could be hindering successful learning transfer. One size leadership training does not fit all. To be effective, it must be designed with the learners’ needs in mind, whether they’re high-performing individual contributors without supervisory experience, or C-suite executives. If participants don’t find the content relevant to their role and objectives, learner engagement will suffer. Lastly, “leadership training” cannot be presented as a one-off event at the organization but rather, an ongoing process. Formal training is only one aspect of learning.


Delivery Leadership is both an Art and a Science

In the present time, with businesses becoming increasingly interconnected and globalized, enterprises worldwide seek modern technology products that are straightforward yet impactful in enhancing their operational efficiency, productivity, market penetration, and reducing operational expenses. These business requirements prompt organizations to explore technologies such as Cloud computing, ERP, AI, Data Analytics, Automation, and Business Intelligence. The provision of intricate IT Solutions and Services demands expertise and attributes from both the Art and Science aspects of the field. ... Delivery Leaders must have exposure to Industry, domain, and business knowledge to be successful in their role. They must place themselves in the shoes of customers and think about what value-add services their customers perceive to be important for their businesses. Proactive approach and futuristic thinking are the two most important skills a delivery leader must possess. They must also encourage a culture in which sharing prescriptive approaches and making business recommendations become the new norm. 


Asynchronous Patterns for Microservice Communication

Since microservices communicate using asynchronous methods, keeping their patterns fast and responsive is essential. Fortunately, there are several quick ways to do this. For example, having your services communicate asynchronously with RabbitMQ or Kestrel is a good idea before falling back on synchronous methods. This way, you can maximize network efficiency while minimizing response delays. You can also use Kestrel’s retries for excellent reliability and scalability when communicating between machines. In addition, it’s a good idea to use event-driven communication for better responsiveness between your components and clients. If you want to connect multiple microservices without creating dependencies or tightly coupling them, consider using asynchronous message-based communication in your microservices architecture. This approach leverages events to facilitate communication between microservices, which is commonly referred to as event-driven communication.
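The decoupling that event-driven communication buys can be shown with a minimal in-process sketch. This toy bus stands in for a real broker such as RabbitMQ; the topic names, handlers, and payloads are all illustrative. The key property is that the publisher never calls its consumers directly and does not know who they are.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for a message broker: publishers
    emit events to a topic, and any number of subscribers react,
    keeping the services loosely coupled."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fire-and-forget: the publisher does not depend on, or even
        # know about, the consumers of the event.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
shipped, billed = [], []

# Two independent "services" react to the same event; the order
# service that publishes it never calls either one directly.
bus.subscribe("order.placed", lambda order: shipped.append(order["id"]))
bus.subscribe("order.placed", lambda order: billed.append(order["id"]))

bus.publish("order.placed", {"id": "o-42", "total": 99.0})
print(shipped, billed)
```

With a real broker the same shape holds, but the publish call crosses the network and consumers process events on their own schedule, which is where the responsiveness and resilience benefits come from.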


GPT and the Future of High-Performance Computing and Big Data Analytics

The emergence of Generative Pre-trained Transformer (GPT) models has revolutionized the field of high-performance computing and big data analytics. GPT models are capable of learning from large datasets and producing highly accurate results with minimal effort. This has enabled organizations to quickly analyze large datasets and extract meaningful insights. GPT models have been successfully used in a variety of applications, such as natural language processing, image recognition, and machine translation. With the increasing availability of large datasets, GPT models are expected to become even more powerful and efficient. This will enable organizations to gain deeper insights into their data and make better decisions. In addition, GPT models can be used to speed up the development of high-performance computing systems. GPT models can be used to optimize the hardware and software components of these systems, allowing them to run faster and more efficiently. 


Essential Soft Skills for Testers: Unlocking Success in Your Testing Career

Collaboration skills are essential for testers, as they often work closely with various team members, including developers, product managers, and other stakeholders, to ensure the delivery of high-quality products. In this section, we will explore three crucial collaboration skills that enable testers to be effective team players: active participation, cross-functional cooperation, and providing and receiving constructive feedback. Active participation refers to engaging fully in team activities, sharing ideas, and contributing meaningfully to discussions and decisions. ... Cross-functional cooperation is the ability to work effectively with team members from different departments or areas of expertise. Testers who excel in cross-functional cooperation can effectively communicate with developers, designers, product managers, and others to identify and resolve issues, share knowledge, and promote a shared understanding of project goals. This skill is particularly important for testers in agile environments, where cross-functional teams are the norm and effective cooperation is critical for delivering high-quality products on time.


What Engineers Need to Know About Using Agile for Digital Transformation

As Agile software development techniques became more widely applied, so the pace of technological change continued to quicken. From the emergence of the cloud to the increase in mobility, and onto the rise of data analytics and artificial intelligence, businesses in every sector began using IT systems to power internal processes and external services. Digital transformation has emerged as shorthand for businesses seeking to reinvent themselves on a foundation of digital data and technology. Whether it’s digitizing paper records, creating new electronic channels to market or analyzing data to produce new insights, companies can use technology to improve an existing business process. Agile development has played a crucial role in many of these digitalization programs, especially the creation of IT applications. The successful rollout of these software-focused projects has encouraged engineers to start thinking about how Agile techniques can be used in other areas of IT, including digital transformation initiatives.


How generative AI can hurt cloud operations

Generative AI algorithms can be incompatible with existing cloud computing systems, leading to integration issues. This can delay the deployment of generative AI algorithms and cause problems with system performance or efficiency. ... Generative AI algorithms can exhibit unpredictable behavior, which leads to unexpected outcomes. This can result in system errors, degraded system performance, and other issues that are difficult to predict. I suspect we’ll get better at predicting behavior as we learn more about generative AI system operations, but the learning curve will be painful. I’ve already had some generative AI systems pulled off cloud systems due to unpredictable behavior and, what’s worse, unpredictable cloud computing bills. Generative AI is an unstoppable force in the enterprise technology space. It’s yet another technology made more accessible and affordable by cloud computing, and the easy availability of this technology will reverberate through the marketplace. Generative AI will become a technology that allows businesses to succeed by out-innovating their competition.
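The "unpredictable cloud computing bills" problem can be mitigated with a hard spending cap in front of the model calls. The sketch below is a minimal, illustrative budget guard: the model name, pricing table, and token counts are assumptions for demonstration, not any real provider's API or prices.

```python
# Hypothetical budget guard around generative AI API calls.
# Model name and per-token prices below are illustrative assumptions.

PRICE_PER_1K_TOKENS = {"example-model": 0.02}  # assumed pricing table (USD)


class BudgetExceededError(RuntimeError):
    pass


class BudgetGuard:
    """Tracks estimated spend and refuses calls past a hard cap."""

    def __init__(self, monthly_cap_usd: float):
        self.cap = monthly_cap_usd
        self.spent = 0.0

    def charge(self, model: str, tokens: int) -> float:
        """Record the estimated cost of a call, or raise before exceeding the cap."""
        cost = PRICE_PER_1K_TOKENS[model] * tokens / 1000
        if self.spent + cost > self.cap:
            raise BudgetExceededError(
                f"call would raise spend to ${self.spent + cost:.2f}, "
                f"cap is ${self.cap:.2f}"
            )
        self.spent += cost
        return cost


guard = BudgetGuard(monthly_cap_usd=100.0)
guard.charge("example-model", tokens=50_000)  # estimated $1.00 at the assumed price
print(f"spent so far: ${guard.spent:.2f}")
```

In practice the guard would wrap the actual API client, so a runaway workload fails fast with a clear error instead of silently accumulating charges.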


Data, AI and automation will never replace humans. Fact

While these technologies are nothing new, they do continue to advance at pace. This presents the opportunity to leverage them to help solve some of the biggest challenges we face in society as well as in business. But we will only succeed when we remain the masters of the technology, not its servants. Using AI and automation to empower people, not replace them, allows organisations to be data-driven yet technology-enabled and people-centric, with software used as a tool that helps humans do their best work and removes the drudgery of manual tasks. And it makes complete sense, because there will always be a moment of truth when a human must be involved at a crucial point. An automated process might take someone 75 per cent of the way, but a person needs to complete the rest. And if they can put all their effort into that 25 per cent, the result will be a better outcome for the employee, the customer and the organisation that brings them together. Ultimately, those who try to remove people from the equation are destined to fail.
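The "automation does 75 per cent, a person completes the rest" pattern is commonly implemented as a human-in-the-loop workflow: automation produces a draft, and a status gate ensures nothing ships without a person signing off. The sketch below is a minimal illustration under assumed names; the `Task` fields and review flow are hypothetical, not a reference to any particular product.

```python
# Minimal human-in-the-loop sketch: automation drafts, a person finishes.
# All field names and the review flow are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Task:
    payload: str
    draft: Optional[str] = None   # filled in by automation
    final: Optional[str] = None   # filled in by a human
    status: str = "new"


def automate(task: Task) -> Task:
    """The automated ~75%: produce a draft, never a final answer."""
    task.draft = f"auto-draft for: {task.payload}"
    task.status = "needs_review"
    return task


def human_review(task: Task, reviewer_edit: str) -> Task:
    """The human ~25%: the 'moment of truth' where a person signs off."""
    if task.status != "needs_review":
        raise ValueError("humans only review drafted work")
    task.final = reviewer_edit
    task.status = "done"
    return task


t = human_review(automate(Task("refund request #123")),
                 "refund approved, add apology note")
print(t.status)  # done
```

The key design choice is that `status` only reaches `"done"` through `human_review`, which keeps the person in control of the outcome rather than merely observing it.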


How artificial intelligence can inform decision-making

To implement AI for decision-making, organizations need a modern data infrastructure to support new data types and often massive amounts of data. Many organizations are moving to the cloud for data management and making use of data engineers and newer pipeline tools to help integrate data and make sure it is trustworthy. They are also hiring DevOps teams to deploy models and monitor them in production. According to a TDWI Best Practices Report, 67 percent of organizations deploying AI technologies today state that AI projects are built by data scientists and are deployed into production by DevOps teams. Some organizations are also using augmented intelligence applications, where AI is infused into the software to automate functionality such as data cleansing, deriving insights, or building predictive models. In addition to hiring specialists, organizations must also build excitement and trust among all employees. It is essential to involve stakeholders in the design and implementation of AI systems to ensure that they understand how the systems work and are comfortable using them.
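The "make sure it is trustworthy" step that pipeline tools automate usually boils down to a validation gate: rows that fail explicit rules are quarantined with a reason before they reach a model. The sketch below is a minimal illustration; the record shape and rules are assumptions chosen for the example, not a specific tool's API.

```python
# Minimal data-trust gate of the kind data pipeline tools automate.
# The record shape and validation rules are illustrative assumptions.

def validate(records):
    """Split records into clean rows and rejects, each reject paired with a reason."""
    clean, rejects = [], []
    for r in records:
        if r.get("customer_id") is None:
            rejects.append((r, "missing customer_id"))
        elif not isinstance(r.get("amount"), (int, float)) or r["amount"] < 0:
            rejects.append((r, "invalid amount"))
        else:
            clean.append(r)
    return clean, rejects


rows = [
    {"customer_id": 1, "amount": 19.99},
    {"customer_id": None, "amount": 5.0},   # quarantined: missing key
    {"customer_id": 2, "amount": -3},       # quarantined: negative amount
]
clean, rejects = validate(rows)
print(len(clean), len(rejects))  # 1 2
```

Keeping the reject reasons alongside the quarantined rows is what makes the pipeline auditable: downstream teams can see not just that data was dropped, but why.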


CDOs Want Increased Investments in Data Management, Cloud

“The challenge comes from managing the increased volume and variety of data, the need to integrate it into business intelligence, and most notably, the need to keep data infrastructure updated to ensure various data types are supported and organizations are following data compliance considerations,” he says. He explains CDOs are seeking greater investments in data management because they need to identify how they can create a measurable impact for the customer experience. “This can only be done by collecting and analyzing data, which can often be tedious and require large sums of time and resources,” Adya says. “Hence, many are doubling down on data management resources to get the job done quicker and easier.” As the digital ecosystem evolves, enterprises are being forced to innovate and rely on cloud to accelerate digital transformation. “Effectively leveraging data through the cloud gives organizations a competitive edge and increases resiliency by being able to respond to disruptions and spot new market opportunities through intelligent data,” he explains.



Quote for the day:

"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley