Daily Tech Digest - November 30, 2023

Super apps: the next big thing for enterprise IT?

Enterprise super apps will allow employers to bundle the apps employees use under one umbrella, said Gartner analyst Jason Wong. This will create efficiency and convenience: different departments can select only the apps they want, much like a marketplace, to customize their working experiences. Other advantages of super apps for enterprises include providing a more consistent user experience, combating app fatigue and app sprawl, and enhancing security by consolidating functions into one company-managed app. Wong said the analyst firm is seeing interest in super apps from organizations, including big box stores and other retailers, that have a lot of frontline workers who rely on their mobile devices to do their jobs. One company that has adopted a super app to enhance the experience of its frontline workers and other employees is TeamHealth, a leading physician practice in the US. TeamHealth is using an employee super app from MangoApps, which unifies all the tools and resources employees use daily within one central app.


Meta faces GDPR complaint over processing personal data without 'free consent'

The case centres on whether Meta can legitimately claim to have obtained free consent from its customers to process their data, as required under GDPR, when the only alternative is for customers to pay a substantial fee to opt out of ad-tracking. The complaint will be watched closely by social media companies such as TikTok, which are reported to be considering offering ad-free services to customers outside the US to meet the requirements of European data protection law. Meta denied that it was in breach of European data protection law, citing a European Court of Justice ruling in July 2023 which it said expressly recognised that a subscription model was a valid form of consent for an ad-funded service. Spokesman Matt Pollard referred to a blog post announcing Meta’s subscription model, which stated, “The option for people to purchase a subscription for no ads balances the requirements of European regulators while giving users choice and allowing Meta to continue serving all people in the EU, EEA and Switzerland”.


India’s Path to Cyber Resilience Through DevSecOps

DevSecOps, a collaborative methodology between development, security, and operations, places a strong emphasis on integrating security practices into the software development and deployment processes. In India, the approach has gained substantial traction due to several reasons, including a security-first mindset, adherence to compliance requirements and escalating cybersecurity threats. A survey revealed that the primary business driver for DevSecOps adoption is a keen focus on business agility, achieved through the rapid and frequent delivery of application capabilities, as reported by 59 per cent of the respondents. From a technological perspective, the most significant factor is the enhanced management of cybersecurity threats and challenges, a factor highlighted by 57 per cent of the participants. Businesses now understand the importance of proactive security measures. DevSecOps encourages a security-first mentality, ensuring that security is an integral part of the development process from the outset.


Cybersecurity and Burnout: The Cybersecurity Professional's Silent Enemy

In the world of cybersecurity, where digital threats are a constant, the mental health of professionals is an invaluable asset. Mindfulness not only emerges as a shield against the stress and burnout that pose security risks to organizations, but it also becomes a key strategy to reduce the costs associated with lost productivity and staff turnover. By adopting mindfulness practices and preventing burnout, cybersecurity professionals not only preserve their well-being, but also contribute to a healthier work environment, improve the responsiveness and effectiveness of cybersecurity teams, and ensure the continued success of companies in this critical technology field. Cybersecurity challenges are multidimensional. They cannot be managed in only one dimension. Mindfulness is an essential tool to keep us one step ahead. By recognizing the value of emotional well-being in the fight against cyberattacks, we can build a stronger and more sustainable defense. Cybersecurity is not only a technical issue, but also a human one, and mindfulness presents itself as a key piece in this intricate security puzzle.


Will AI replace Software Engineers?

While AI is automating some tasks previously done by devs, it’s not likely to lead to widespread job losses. In fact, AI is creating new job opportunities for software engineers with the skills and expertise to work with AI. According to a 2022 report by the McKinsey Global Institute, AI is expected to create 9 million new jobs in the United States by 2030. The jobs that are most likely to be lost to AI are those that are routine and repetitive, such as data entry and coding. However, software engineers with the skills to work with AI will be in high demand. ... Embrace AI as a tool to enhance your skills and productivity as a software engineer. While there's concern about AI replacing software engineers, it's unlikely to replace high-value developers who work on complex and innovative software. To avoid being replaced by AI, focus on building sophisticated and creative solutions. Stay up-to-date with the latest AI and software engineering developments, as this field is constantly evolving. Adapt to the changing landscape by acquiring new skills and techniques. Remember that AI and software engineering can collaborate effectively, as AI complements human skills. 


Bridging the risk exposure gap with strategies for internal auditors

Without a strategic view of the future — including a clear-eyed assessment of strengths, weaknesses, opportunities, threats, priorities, and areas of leakage — internal audit is unlikely to recognize actions needed to enable success. There is no bigger threat to organizational success than a misalignment between exponentially increasing risks and a failure to respond due to a lack of vision, resources, or initiative. Create and maintain a good, well-documented strategic plan for your internal audit function. This can help you organize your thinking, force discipline in definitions, facilitate implementation, and continue asking the right questions. Nobody knows for certain what lies ahead, and a well-developed strategic plan is a key tool for preparing for chaos and ambiguity. ... Companies may have less time than they think to prepare for compliance, and internal auditors should be supporting their organizations in getting the right enabling processes and technologies in place as soon as possible. This will require a continuing focus on breaking down silos and improving how internal audit collaborates with its risk and compliance colleagues. 


Generative AI in the Age of Zero-Trust

Enter generative AI. Generative AI models generate content, predictions, and solutions based on vast amounts of available data. They’re making waves not just for their ‘wow’ factor, but for their practical applications. It’s only natural that employees would gravitate to the latest technology offering the ability to make them more efficient. For cybersecurity, this means potential tools that offer predictive threat analysis based on patterns, provide automatic code fixes, dynamically adjust policies in response to evolving threat landscapes and even automatically respond to active attacks. If used correctly, generative AI can shoulder some of the burdens of the complexities that have built up over the course of the zero-trust era. But how can you trust generative AI if you are not in control of the data that trains it? You can’t, really. ... This is forcing organizations to start setting generative AI policies. Those that choose the zero-trust path and ban its use will only repeat the mistakes of the past. Employees will find ways around bans if it means getting their job done more efficiently. Those who harness it will make a calculated tradeoff between control and productivity that will keep them competitive in their respective markets.


Organizations Must Embrace Dynamic Honeypots to Outpace Attackers

There are a number of ways in which AI-powered honeypots are superior to their static counterparts. The first is that because they can independently evolve, they can become far more convincing through automatic evolution. This sidesteps the problem of constantly making manual adjustments to present the honeypot as a realistic facsimile. Secondly, as the AI learns and develops, it will become far more adept at planting traps for unwary attackers. Hackers will not only have to move more slowly than usual to avoid those traps, but once a trap is triggered, it will likely provide far richer data to defense teams about what attackers are clicking on, the information they’re after, and how they’re moving across the site. Finally, using AI tools to design honeypots means that, under the right circumstances, even tangible assets can be turned into honeypots. ... Therefore, turning tangible assets into honeypots allows defense teams to target their energy more efficiently and enables the AI to learn faster, as there will likely be more attackers coming after a real asset than a fake one.


Almost all developers are using AI despite security concerns, survey suggests

Many developers place far too much trust in the security of code suggestions from generative AI, the report noted, despite clear evidence that these systems consistently make insecure suggestions. “The way that code is generated by generative AI coding systems like Copilot and others feels like magic," Maple said. "When code just appears and functionally works, people believe too much in the smoke and mirrors and magic because it appears so good.” Developers can also value machine output over their own talents, he continued. "There’s almost an imposter syndrome," he said. ... Because AI coding systems use reinforcement learning algorithms to improve and tune results when users accept insecure open-source components embedded in suggestions, the AI systems are more likely to label those components as secure even if this is not the case, it continued. This risks the creation of a feedback loop where developers accept insecure open-source suggestions from AI tools and then those suggestions are not scanned, poisoning not only their organization’s application code base but the recommendation systems for the AI systems themselves, it explained.


Former Uber CISO Speaks Out, After 6 Years, on Data Breach, SolarWinds

Sullivan says the key mistake he made was not bringing in third-party investigators and counsel to review how his team handled the breach. "The thing we didn't do was insist that we bring in a third party to validate all of the decisions that were made," he says. "I hate to say it, but it's more CYA." Now, Sullivan advises other CISOs and companies about navigating their responsibilities in disclosing breaches, especially as the new Securities and Exchange Commission (SEC) incident reporting requirements are set to take effect. Sullivan says he welcomes the new regulations. "I think anything that pushes towards more transparency is a good thing," he says. He recalls that when he was on former President Barack Obama's Commission on Enhancing National Cybersecurity, he pushed to give companies immunity if they are transparent early on during security incidents. That hasn't happened until now, according to Sullivan, who says the jury is still out on the new regulations, which will require action starting in December.



Quote for the day:

"The distance between insanity and genius is measured only by success." -- Bruce Feirstein

Daily Tech Digest - November 29, 2023

How the Rise of the Gig Economy Is Changing Startup Investment

The gig economy’s ascent, defined by short-term, project-based work, is propelled by smartphone pervasiveness and widespread use of the internet. Enabled by seamless digital interactions, gig workers connect with employers on platforms. Increasing demand for flexible work, driven by a desire for autonomy and personalised schedules, fuels this shift. This tech-driven evolution offers a dynamic alternative to traditional employment models. ... As the gig economy reshapes work dynamics, startup investors position themselves to capitalise on the flexibility, efficiency, and rapid scaling of these forward-thinking businesses. The gig economy’s rise is transforming how investors evaluate startups. There’s a noticeable shift toward companies that not only understand the gig economy complexities but also effectively harness its potential. Seamless integration of gig workers is now a pivotal factor in startup investment assessments. Investors value companies employing the gig economy for enhanced flexibility, cost efficiency, and access to specialised skills. Startups with a strategic approach to the gig economy are increasingly the preferred choice for investors navigating the evolving business landscape.


The Emergence of Corporate Data Governance

It is both an opportunity and a challenge for corporate governance as new technologies such as big data platforms and cloud platforms digitize and process data at scale in firms. The pandemic has also intensified the need for digitizing data and improving accountability in organizations. As a result, data availability in firms has increased, and data has become a strategic asset that drives firm valuation. A growing amount of data combined with insignificant and poor-quality information has been a challenge for large corporations for years. A 2008 survey of 200 organizations across the globe by Pierce, Dismute, and Yonke found that 58% recognized data as a strategic asset. The management of data as information and its intelligibility has become a high priority in corporate governance and regulatory compliance. Data Governance is generally defined as the allocation of roles, decision-making rights, and accountability relating to data assets. The goal of Data Governance is to ensure that policies and ownership of data are enforced within an organization.


The Role of the CISO in Digital Transformation

Digital transformation isn't solely technical. It involves the entire organization, is driven by business needs and customer expectations, and can impact the way that work gets done from top to bottom. In the absence of a strong CISO making their voice heard, it's all too easy for decisions to be made that may not fully consider critical security implications. A strong CISO is an effective collaborator, working as an equal partner with key stakeholders such as the CIO, CTO, and CEO. A CISO needs to connect the dots between security and business success, using a combination of technical expertise and organizational influence to ensure security controls are properly incorporated, even during times of rapid organizational change. The difference between a capable CISO and an exceptional one often comes down to the ability to see both the big picture of business strategy and the fine details of technical security at the same time. Business units seeking new technological solutions may not have the necessary visibility beyond their individual spans of control to consider factors like data security and the flow of sensitive information between multiple different cloud-based tools. 


What’s the Go language really good for?

Go binaries run more slowly than their C counterparts, but the difference in speed is negligible for most applications. Go performance is as good as C for the vast majority of work, and generally much faster than other languages known for speed of development (e.g., JavaScript, Python, and Ruby). Executables created with the Go toolchain can stand alone, with no default external dependencies. The Go toolchain is available for a wide variety of operating systems and hardware platforms, and can be used to compile binaries across platforms. Go delivers all of the above without sacrificing access to the underlying system. Go programs can talk to external C libraries or make native system calls. In Docker, for instance, Go interacts with low-level Linux functions, cgroups, and namespaces, to work container magic. The Go toolchain is freely available as a Linux, macOS, or Windows binary or as a Docker container. Go is included by default in many popular Linux distributions, such as Red Hat Enterprise Linux and Fedora, making it somewhat easier to deploy Go source to those platforms. 
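A minimal sketch of what that self-contained toolchain looks like in practice: the program below uses only the standard library, so `go build` yields a single standalone binary, and cross-compiling for another platform is just a matter of setting environment variables (the output filenames in the comments are illustrative).

```go
// A dependency-free program: everything it needs is in the standard
// library, so `go build` produces one standalone binary. Cross-compiling
// is just an environment change, e.g.:
//   GOOS=linux   GOARCH=amd64 go build -o app-linux .
//   GOOS=windows GOARCH=amd64 go build -o app.exe   .
package main

import (
	"fmt"
	"runtime"
)

// hostInfo reports the OS/architecture pair the binary was compiled for.
func hostInfo() string {
	return fmt.Sprintf("%s/%s", runtime.GOOS, runtime.GOARCH)
}

func main() {
	fmt.Println("running on", hostInfo())
}
```

Because `runtime.GOOS` and `runtime.GOARCH` are baked in at compile time, the same source reports `linux/amd64`, `windows/amd64`, and so on, depending on the build target.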


How Digital Transformation is revolutionizing the Financial-Tech industry

In an age of rising cyber threats and data breaches, security is paramount in the FinTech industry. Digital transformation has ushered in new security measures, such as biometric authentication and blockchain technology. These innovations have made financial transactions more secure, ensuring that customers’ data and assets remain protected. Furthermore, digital transformation has revolutionized the way we make payments. The rise of mobile payments and digital wallets has made traditional methods, such as cash and cheques, almost obsolete. Today, individuals can make instantaneous payments using their smartphones or wearable devices, eliminating the need for physical currency or cards. This has not only made payments more convenient but has also increased financial inclusion by providing access to financial services for those without a bank account. Traditional financial channels often fail to reach underbanked or unbanked populations, leaving millions without access to vital financial services. FinTech solutions, driven by mobile technology, are reaching previously untouched markets.


The 3 biggest risks from generative AI - and how to deal with them

As well as assessing data protection risks across external processes, organizations need to be aware of how employees use data in generative AI applications and models. Litan says these kinds of risks cover the unacceptable use of data, which can compromise the decision-making process, including being slack with confidential inputs, producing inaccurate hallucinations as outputs, and using intellectual property from another company. Add in ethical issues and fears that models can be biased, and business leaders face a confluence of input and output risks. Litan says executives managing the rollout of generative AI must ensure people across the business take nothing for granted. ... Businesses deal with a range of cybersecurity risks on a day-to-day basis, such as hackers gaining access to enterprise data due to a system vulnerability or an error by an employee. However, Litan says AI represents a different threat vector. "These are new risks," she says. "There's prompt injection attacks, vector database attacks, and hackers can get access to model states and parameters."


Demystifying Container Security for Developers

Any container security program must take into account the security of the creation and contents of the containers themselves. There are several criteria by which to analyze container security, starting with the foundational elements of host security, then considering platform security elements, and finally, examining the elements of the container and orchestrator itself. Infrastructure security includes the integrity of the physical and virtual resources that underpin container operations, both metaphorically and literally. Containers run on physical hardware somewhere, therefore the hosting environment’s security affects the security of the containerized environment. ... While many deployments are done in different ways, the hosting operating system’s security and controls are vital for the same reasons infrastructure security is — if the OS is compromised, the workloads operating in it cannot be protected. The security of containerized architectures relies on best practices including strong identity and access management, OS security controls and secure deployments under an assumed compromise model.


Striking the right balance in leadership behaviour

Most leaders demonstrate either task-driven behaviour or people- and relationship-driven behaviour. The leaders demonstrating task-driven behaviour are more focused on allocating tasks, organising the work, providing the structure, setting the work context, defining roles and responsibilities, ensuring feedback mechanisms, and diligent processes for ensuring timely delivery of the task and outcomes. People-driven leaders prioritise relationships, build camaraderie, work in a more harmonious culture, and value respect, trust, and other humane aspects of teamwork. While these two behavioural approaches differ, they are inherently linked. However, an excessive emphasis on either end can undermine both the organization's progress and team cohesion. Striking a harmonious balance between these approaches is pivotal. It's not just about finding equilibrium; it's about seamlessly integrating these behavioural styles. This integration enhances productivity, fosters stronger team bonds, and fortifies the organization to tackle challenges for sustained growth.


Analyst Panel Says Take the Quantum Computing Plunge Now…

Sorensen said, “What’s so magical about quantum right now is the beauty of the low barrier to entry. In the old days if you wanted to get an HPC, and Jay knows this, you had to drop $25 million to bring a Cray in. You had to hire 25 guys and they lived downstairs in the basement. They never came out and they wrote code all the time and they spoke a language that you didn’t understand, and you had to pay them an awful lot of money to do that. The barriers to entry to get into HPC were high. “The barrier to entry in quantum is you sit down, you go to AWS or Strangeworks. You pick your cloud access model of choice, you sign up for a couple of bucks, you grab a couple of new hires that just came out of school with a degree in quantum chemistry or something, and you go and you play, and you figure out how that’s going to work. So, the barriers to entry of quantum are amazing. I’ve said it before, and I’ll say it again, if it wasn’t for cloud access, none of us would be sitting here vaguely interested in quantum; it’s what really is driving interest.” Boisseau had a similar take. 


Adopting Asynchronous Collaboration in Distributed Software Teams

In an async-first culture though, we adopt a bias for action. We make the best decision we can at the moment, document it, and move on. The focus is on getting things done. If something’s wrong, we learn from it, refactor and adapt. The bias for action improves the team’s ability to make and record decisions. Decisions are, after all, the fuel for high-performing teams. All these benefits aside, an async-first culture also helps you improve your meetings. When you make meetings the last resort, the meetings you have are the ones you need. You’ll gain back the time and focus to make these few, purposeful meetings useful to everyone who attends. Going async-first does not mean that synchronous interactions are not valuable. Team interactions benefit from a fine balance between asynchronous and synchronous collaboration. On distributed teams, this balance should tilt towards the asynchronous, lest you fill everyone’s calendars with 80 days of meetings a year. That said, you can’t ignore the value of synchronous collaboration.



Quote for the day:

"If you want people to think, give them intent, not instruction." -- David Marquet

Daily Tech Digest - November 28, 2023

How a digital design firm navigated its SOC 2 audit

One of the more intense aspects of the audit was the testing of our incident response plan. We had to provide records of past incidents, how they were handled, and the lessons learned. Moreover, the auditors conducted tabletop exercises to assess our preparedness for potential future security events. After weeks of evaluation, the auditors presented their findings. We excelled in some areas, such as in our encryption of sensitive data and our robust user authentication systems. However, they also identified areas for improvement, like the need for more granular access controls and enhanced monitoring of system configurations. Post-audit, we were given a roadmap of sorts: a list of recommendations to address the identified deficiencies. This phase was dedicated to remediation, where we worked diligently to implement the auditors’ suggestions and improve our systems. Reflecting on the transformative impact of SOC 2 certification, L+R has discerned a profound shift in the dynamics of client engagement and internal processes. SOC 2 certification transcends the realm of compliance, fostering enriched dialogues, bolstering trust, and catalyzing decision-making at the executive level.


Is anything useful happening in network management?

The first of these is a management take on something that's already becoming visible in a broader way: absorbing the network into something else. Companies have said for years that the data center network, the LAN, is really driven by data center hardware/software planning and not by network planning. They're now finding that a broader use of hybrid cloud, where the cloud becomes the front-end technology for application access, is pulling the WAN inside the cloud. The network, then, is becoming less visible, and thus explicit network management is becoming less important. ... The second development gaining attention is being proposed by a number of vendors, the largest being Nokia. It envisions using "digital twin" technology, something most commonly associated with IoT and industrial metaverse applications, to construct a software model of the network based on digital twins of network devices. With this approach, the network becomes in effect an industrial system, and potentially could then be monitored and controlled by tools designed for industrial IoT and industrial metaverse deployments. 


The Basis for Business and Solution Architecture

The Business Architect, just like the Solution Architect, is a business technology strategist. The delivery of technology driven business value is core to their professional capability and career. So for that purpose they share a set of skills/competencies, language, professional goals, and experiences with each other. Any other method of architecture has been shown to fail. Without the shared capabilities and focus the team quickly begins to cannibalise its own value proposition to the enterprise and argue about baseline definitions, purpose and ownership. The level of team synergy and shared community is one of the most important first steps to a mature architecture practice. With that in place the Business and Solution Architects work very well together and in alignment with the EA from strategy through execution to measured value. Business Architects must focus on program level outcomes, those that are scoped at the business capability and/or department or region level. These levels are where real business goals and measurements occur and stand closest to the customer while retaining executive authority.


Is the Future of Data Centers Under the Sea?

One of the most appealing aspects of underwater data centers is their proximity to large population centers. Around half of the world’s population lives within 125 miles of a coastal area. Situating data centers near coastal population centers would allow for lower latency and more efficient handling of data. This would increase speeds for various digital services. Perhaps counterintuitively, the servers themselves might also benefit from being dropped in the drink. The inert gases and liquids used to fill underwater data centers are less corrosive than ambient air, leading to longer lifespans for the equipment. The servers are also protected from possible human damage incurred by everyday movement -- people banging into them, dropping them, or accidentally unplugging them. Placing a data center pod or retrieving it for maintenance is fairly simple, according to Subsea Cloud’s Williams. “Let's say the water is 100 meters deep. It's just an hour-long job. If it’s 3,000 meters deep, it will probably take five or six hours to get the pod down.”


What you don’t know about data management could kill your business

Contributing to the general lack of data about data is complexity. There are many places in the enterprise where data spend happens. Individual business units buy data from third parties, for example. Taking enterprise-wide inventory of all the data feeds being purchased and getting an accurate picture of how all that purchased data is being put to use would be a good first step. The reality is that a significant portion of the data sloshing about modern enterprises is replicated in multiple locations, poorly classified, idiosyncratically defined, locked in closed platforms, and trapped in local business processes. Data needs to be made more liquid in the way of an asset portfolio — that is, transformed to ease data asset reuse and recombination. ... Traditionally business schools have avoided data as a topic, pumping out business leaders who erroneously feel that data is someone else’s job. I recall the mean-spirited dig at early career Harvard Business School alums expecting their assistants to bring in the day’s work arrayed as a case study — that is, a crisp 20-page synopsis of all the relevant issues.


Stop panic buying your security products and start prioritizing

Look inward and optimize. Companies need to understand what inside their networks and data is most attractive and most vulnerable to attackers. Get visibility into what you have, calculate the value of your tools, and use the information to move forward. Understanding risk by gaining full visibility into what you already have can allow companies to communicate better with investors and the public in the case of an attack or breach. For example, they will be able to give clear information about the impact (or lack of impact) on the business when an attack occurs and lay out clear steps for remediation, not having to guess the next best course of action. ... It is important to remember that the goal is not to buy more tools to chase the growing number of vulnerabilities that experts find every day, but to protect the assets that are most relevant to overall vital business operations and limit the fallout of inevitable cyber incidents. By attaching a dollar value to the cyber risks the organization is up against, you will be in a much better position to discuss your security plan and budgetary needs.


US, UK Cyber Agencies Spearhead Global AI Security Guidance

The guidance, the agencies say, is good for all AI developers but is particularly aimed at AI providers who use models hosted by a third party or who use APIs to connect to a model. Risks include adversarial machine learning threats stemming from vulnerabilities in third-party software and hardware applications. Hackers can exploit those flaws to alter model classification and regression performance and to corrupt training data and carry out prompt injection or data poisoning attacks that influence the AI model's decisions. Hackers can also target vulnerable systems to allow unauthorized actions as well as extract sensitive information. The guidance describes cybersecurity as a "necessary precondition" for AI safety and resilience. CISA, in particular, has been on a protracted campaign to evangelize the benefits of secure by design while also warning tech companies that the era of releasing products to the public containing security flaws must come to an end (see: US CISA Urges Security by Design for AI). The guidelines represent a "strong step" in providing universal standards and best practices for international AI security operations and maintenance, according to Tom Guarente, vice president of external and government affairs for the security firm Armis. "But the devil is in the details."


Data De-Identification: Balancing Privacy, Efficacy & Cybersecurity

There are two primary laws guiding online privacy: the General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA), although many countries and states have started to write their own. Among the various safeguard measures, data de-identification is a prime one. Both laws define data de-identification as the process of anonymizing PII so that no piece of secondary information, when combined with the personal data, can identify the individual. The industry unanimously agrees that some attributes are personal data, including name, address, email address, and phone number. Others, such as an IP address (and variants of it), are a matter of interpretation. These laws neither explicitly list the attributes that are personal nor say how and when to anonymize, beyond sharing a few best practices. ... However, full anonymization of personal data and the data linked to it is useless to businesses in this ever-more-digital world. Every new technological breakthrough demands massive input of data sets — both personal and aggregated.
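A minimal sketch of the de-identification idea: direct identifiers are replaced with salted one-way hashes so records can still be joined in aggregate without exposing the individual. The field names and the salt below are illustrative assumptions, not drawn from GDPR or CPRA, and real programs would also consider quasi-identifiers such as ZIP code and birth date.

```python
import hashlib

# Hypothetical set of direct identifiers; real classifications vary
# by jurisdiction and by the organization's own data standards.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address"}

def pseudonymize(value, salt):
    """Stable one-way token: same input + salt always yields the same token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def deidentify(record, salt="org-secret-salt"):
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            # Keep a stable pseudonym so aggregates still join,
            # but the original value is not recoverable from it.
            out[field] = pseudonymize(str(value), salt)
        else:
            out[field] = value
    return out

record = {"name": "Jane Doe", "email": "jane@example.com", "age": 34}
print(deidentify(record))
```

Because the pseudonym is deterministic for a given salt, analytics such as "distinct customers per region" still work on the de-identified data.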


NCSC publishes landmark guidelines on AI cyber security

The set of guidelines has been designed to help the developers of any system that incorporates AI make informed decisions about cyber security during the development process – whether it is being built from scratch or as an add-on to an existing tool or service provided by another entity. The NCSC believes security to be an “essential pre-condition of AI system safety” and integral to the development process from the outset and throughout. In a similar way to how the secure-by-design principles alluded to by CISA’s Easterly are increasingly being applied to software development, the cognitive leap to applying the same guidance to the world of AI should not be too difficult to make. The guidelines, which can be accessed in full via the NCSC website, break down into four main tracks – Secure Design, Secure Development, Secure Deployment and Secure Operation and Maintenance, and include suggested behaviours to help improve security. These include taking ownership of security outcomes for customers and users, embracing “radical transparency”, and baking secure-by-design practices into organisational structures and leadership hierarchies.


AI partly to blame for spike in data center costs

The research firm attributes the year-over-year jump in absorption to the impact of AI requirements, as well as the growth of traditional cloud computing needs across the region. However, the industry is facing challenges, not the least of which is securing enough power. "Securing power continues to be a challenge for developers, pushing some to the outskirts of primary markets, as well as fueling growth in secondary and tertiary markets. This has led to an uptick in land-banking by companies hoping to secure space and power for future growth," DatacenterHawk said in its report. ... Cloud providers continue to consume most of the capacity in North America. AI and machine learning also continue to drive activity across US and Canadian markets, though the full impact of these rapidly growing industries has yet to be seen, the report noted. Submarkets within major markets will continue to grow from hyperscale users and data center operators, DatacenterHawk predicts. Older, enterprise data centers will be targets for AI companies that need bridge power quickly, providing an environment that allows them to grow larger over time.



Quote for the day:

"Amateurs look for inspiration; the rest of us just get up and go to work." -- Chuck Close

Daily Tech Digest - November 27, 2023

From Risk to Resilience: Safeguarding BFSI Against Increasing Threats

As financial transactions increasingly migrate to digital platforms, safeguarding sensitive data and systems has become the linchpin for maintaining trust and stability in the industry. Customer trust forms the bedrock of any successful financial institution. With the advent of digital banking and the proliferation of online transactions, customers expect their financial data to be treated with the utmost confidentiality and security. A single breach can erode trust irreparably, leading to customer attrition and reputational damage. To uphold trust, BFSI organizations must adopt a proactive cybersecurity posture. This entails not only implementing robust security measures but also fostering a culture of cybersecurity awareness among employees and customers alike. ... Converged IAM represents a paradigm shift in cybersecurity strategy. It combines traditional IAM, which manages user identities and access to resources, with Identity Governance and Administration (IGA), which ensures compliance with internal policies and external regulations. This convergence empowers organizations to have a unified view of user identities and their associated access rights, thereby bolstering security measures.


Innovation in data centers: Navigating challenges and embracing sustainability

Navigating the challenge of finding solutions that meet all constraints is a constant endeavour in the data center industry. Daily operations involve continuous optimization efforts, where sustainability and cost-effectiveness are pivotal considerations. Contrary to common perception, sustainable solutions are not invariably more expensive; their cost-effectiveness depends on the thorough assessment of environmental implications. Consider the approach the industry has taken to battery technology optimization as an example. Traditionally, lead batteries have been a standard industry solution. However, exploring new technologies, such as lithium-ion batteries, introduces a diverse range of options. While these batteries may be more intricate and expensive in the production phase, a holistic lifecycle analysis reveals their extended service life and lower total cost of ownership. This emphasises the need to evaluate innovation not only in terms of initial costs but also in terms of environmental impact and the overall project lifecycle.
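The lifecycle argument above can be made concrete with a rough total-cost-of-ownership comparison: a battery that costs more up front can still win once replacements over the project horizon are counted. All figures below are hypothetical, chosen only to illustrate the shape of the calculation.

```python
import math

# Hypothetical TCO sketch: purchase cost times the number of times
# the unit must be bought over the project horizon. Real analyses
# would add energy, cooling, maintenance, and disposal costs.

def tco(unit_cost, service_life_years, horizon_years=15):
    replacements = math.ceil(horizon_years / service_life_years)
    return unit_cost * replacements

lead_acid = tco(unit_cost=100, service_life_years=5)     # 3 purchases -> 300
lithium_ion = tco(unit_cost=220, service_life_years=15)  # 1 purchase  -> 220
print(lead_acid, lithium_ion)
```

Even with a unit price more than double, the longer-lived option comes out cheaper over the full lifecycle in this toy example, which is the point the paragraph makes.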


Rise of the cyber CPA: What it means for CISOs

The biggest value-add these new talents are likely to deliver is in helping CISOs sell security programs more effectively. "CISOs are not known to speak in [terms of] ROI effectively, at least not in the practical ROI issues lines of business executives care about. And after hearing these ineffective arguments for years, many CFOs are eventually not listening," Yigal Rechtman, managing partner of Rechtman Consulting, a New Jersey-based compliance and forensic accounting firm, tells CSO. Even if the new cyber accountants don't immediately deliver better ROI arguments, argues Phil Neray, the VP of cyber defense security at Gem Security, their financial approach and different mindsets might prove quite valuable. "Fighting our cyber adversaries requires having different approaches and different viewpoints and different worldviews," he tells CSO. "Therefore, having a diversity of perspectives on your security team is going to make your team stronger. And these cyber accountants might do just that."


Why it’s the perfect time to reflect on your software update policy

The foundation of a sound software update policy begins with thorough pre-work. This involves setting the groundwork for delivering successful updates, creating an inventory of devices, documenting baseline configurations, and understanding the applications that are critical to business operations. Organizations must establish baseline configurations and communicate the requisite standards to users. A comprehensive inventory of all devices used for work, including BYOD and unmanaged devices, is essential. This also encompasses documenting the end of support for devices being phased out, noting the critical business applications in use, and understanding which devices and users depend on them. Identifying devices that are no longer receiving security updates yet access critical applications should be a priority. Similarly, sufficient staff must be allocated to the help desks to cope with increased queries during update rollouts. Organizations should also prepare a diverse group of informed early adopters and testers from across the business spectrum to ensure that feedback is timely and representative. 
It’s easy to predict a rosy future but far harder to deliver it. Gates can gush that “agents will be able to help with virtually any activity and any area of life,” all within five years, but for anyone who has actually used things like Midjourney to edit images, the results tend to be really bad, and not merely in terms of quality. I tried to make Mario Bros. characters out of my peers at work and discovered that Caucasians fared better than Asians. ... “The key to understanding the real threat of prompt injection is to understand that AI models are deeply, incredibly gullible by design,” notes Simon Willison. Willison is one of the most expert and enthusiastic proponents of AI’s potential for software development (and general use), but he’s also unwilling to pull punches on where it needs to improve: “I don’t know how to build it securely! And these holes aren’t hypothetical, they’re a huge blocker on us shipping a lot of this stuff.” The problem is that the LLMs believe everything they read, as it were. By design, they ingest content and respond to prompts. They don’t know how to tell the difference between a good prompt and a bad one.


Scaling SRE Teams

Scaling may come naturally if you do the right things in the right order. First, you must identify what your current state is in terms of infrastructure. How well do you understand the systems? Determine existing SRE processes that need improvement. For the SRE processes that are necessary but are not employed yet, find the tools and the metrics necessary to start. Collaborate with the appropriate stakeholders, use feedback, iterate, and improve. ... SLOs set clear, achievable goals for the team and provide a measurable way to assess the reliability of a service. By defining specific targets for uptime, latency, or error rates, SRE teams can objectively evaluate whether the system is meeting the desired standards of performance. Using specific targets, a team can prioritize their efforts and focus on areas that need improvement, thus fostering a culture of accountability and continuous improvement. Error budgets provide a mechanism for managing risk and making trade-offs between reliability and innovation. 
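The SLO and error-budget mechanics described above reduce to simple arithmetic: an availability target implies a fixed allowance of downtime per window, and the team spends against that allowance. The sketch below assumes a 30-day window and illustrative numbers.

```python
# Error-budget sketch: a 99.9% availability SLO over a 30-day window
# allows roughly 43.2 minutes of downtime; incidents spend the budget.

def error_budget_minutes(slo, window_days=30):
    """Allowed downtime in minutes for a given availability SLO."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - slo)

budget = error_budget_minutes(0.999)   # ~43.2 minutes per 30 days
spent = 12.5                           # hypothetical downtime observed so far
remaining = budget - spent
print(f"budget={budget:.1f} min, remaining={remaining:.1f} min")
```

When the remaining budget runs low, the team trades innovation for reliability work; when plenty remains, riskier releases are affordable. That is the trade-off mechanism the paragraph refers to.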


IT staff augmentation pros and cons

IT staff augmentation does have aspects that organisations will need to consider before they decide to adopt it. James suggests there can be an impact on workplace culture, which is often overlooked by organisations. “Certain contractors, particularly those who frequently work for the same company or who are contracted on a longer-term basis, will ingratiate themselves into the fabric and culture of a team, while others simply want to get in, get their work done and get out,” says James. “This could affect internal staff, employee relations and damage company culture.” Temporary team members could also feel left out and be less motivated than in-house employees, suggests Martin Hartley, group CCO of emagine Consulting, which can cause issues if they then decide to leave. “If you put the time into bringing someone into your team and they don’t work out, businesses are forced to repeat the process and go back out to the market to find a person with the right skills to replace that role, which takes time,” he warns. There can also be legal issues to think about.


10 things keeping IT leaders up at night

Many CIOs are troubled by a similar, related issue — a lack of full knowledge of and visibility into what they have in their IT environments. “It’s not knowing what you don’t know,” says Laura Hemenway, president, founder, and principal of Paradigm Solutions, which supports large enterprise-wide transformations. Many IT departments lack strong documentation around their code, processes, and systems, says Hemenway, who also serves as a fractional CIO and is a leader with the Arizona SIM chapter. Additionally, they don’t know all the places where their organization’s data lives, who touches it, and why. “CIOs went through so much so quickly in the past few years, that there is no transformation project that’s not full of data unknowns, process gaps, broken interfaces, or expired programs,” she says, calling them “all ticking time bombs.” “And unless CIOs take the time to create a solid foundation, this is going to be pulling at them, rolling around in the back of their head,” she says.


How To Prepare Your Organization for AI’s Impact on Data Processes and Products

In many ways, the advanced data literacy trend is connected to the growth of AI technologies. However, beyond the need for users to understand the breadth and power of AI, advanced data literacy requires users know the related terminologies, KPIs, and metrics connected with AI. Data literacy is not just about how data is consumed. It's about how it is classified within your data ecosystem. You should focus on the literacy needs of each user and train them to understand the parts of your data ecosystem they will need to access to contribute to AI product development. You must implement a solid, comprehensive framework that informs when and how you roll out data literacy programs in the various departments and teams in your organization. ... You will also need a compliance strategy that incorporates all the latest requirements and includes processes for implementation. The best way to ensure this is through a dedicated data governance tool. The tool should be configured to ensure that only verified users can access specific data assets and that data is only made available for usage according to compliance regulations. 


Unlocking Business Growth: A Strategic Approach To Tackling Technical Debt

Tech debt is like a time bomb ticking within an organisation. While short-term solutions may work at first, they often crumble over time, causing costly rework and inefficiencies. A series of trade-offs and technical workarounds begin to compromise the long-term health of an organisation’s technological infrastructure. Although it is distinct from obsolescence or depreciation, its consequences are far-reaching: loss of systemic intelligence, compromised operational advantage and ineffective use of capital. In the simplest terms, it causes organisations to pay 20-40% to manage this inefficiency; it also consumes as much as 30% of the IT team’s talent. ... The first step is to reframe tech debt as a component of modernisation. This pivot refocuses the organisation on a clear and shared vision for modernisation efforts. It is an opportunity for candid executive conversations to assess the existing tech debt and strategies for the future. Step two gets to the business benefits of modernisation. These extend beyond the IT department, so the effort to support and manage modernisation does as well.



Quote for the day:

"When the sum of the whole is greater than the sum of the parts, that's the power of teamwork." -- Gordon Tredgold

Daily Tech Digest - November 26, 2023

European Commission Failing to Tackle Spyware, Lawmakers Say

As that deadline looms, lawmakers accused the European Commission of failing to act. On Thursday, they passed a resolution that attempts to force the European Commission to present the legislative changes recommended in May by the PEGA Committee. At a plenary session in Strasbourg, EU lawmakers said that the European Commission's inaction had facilitated an uptick in recent spyware cases. Such cases have included the alleged targeting of exiled Russian journalist Galina Timchenko using Pegasus when she was based in Germany, as well as the Greek government's attempt to thwart investigations into spyware abuse by its ministers. In contrast to the EU approach, lawmakers highlighted the U.S. government's blacklisting in July of European spyware firms Intellexa and Cytrox and the Biden administration's citing of the companies' risk to U.S. national security and foreign policy. Speaking at the Thursday plenary, EU Justice Commissioner Didier Reynders condemned using spyware to illegally intercept personal communications, adding that member states cannot use "national security" as a legal basis to circumvent existing laws and indiscriminately target their citizens.


Mastering the art of differentiation: Vital competencies for thriving in the age of artificial intelligence

With AI designed to make decisions using algorithms grounded in data and patterns, these algorithms are only as dependable as the data they are trained on and can be influenced by the assumptions and biases of their creators. Consequently, it is imperative to employ critical thinking skills to assess AI decisions and guarantee that they align with our values and objectives. Moreover, critical thinking is essential for resolving complex issues that may exceed AI’s capabilities. Developing critical thinking skills involves cultivating the ability to analyze, evaluate, and synthesize information to make informed decisions. ... In this rapidly evolving modern landscape, heavily influenced by digital technologies, cultivating a high LQ is indispensable for the long-term success and sustainability of both employees and organizations. In the business world, change is constant, making continuous learning and development essential at every level of the organization to ensure we consistently make the right decisions. High LQ empowers employees to foster innovation and creativity, cultivate resilience, and position themselves more effectively to future-proof their careers. 


Digital advocacy group criticizes current scope of the EU AI Act

The group’s core argument is that the AI Act now goes beyond its original intended scope, and should instead remain focused on high-risk use cases, rather than being directed at specific technologies. Digital Europe also warned that the financial burden the Act could place on companies wanting to bring AI-enabled products to market could make operating out of the EU unsustainable for smaller organizations. “For Europe to become a global digital powerhouse, we need companies that can lead on AI innovation also using foundation models and GPAI (general-purpose AI),” the statement read. “As European digital industry representatives, we see a huge opportunity in foundation models, and new innovative players emerging in this space, many of them born here in Europe. Let’s not regulate them out of existence before they get a chance to scale, or force them to leave.” The letter was signed by 32 members of Digital Europe and outlined four recommendations that signatories believe would allow the Act to strike the necessary balance between regulation and innovation.


HR Leaders unleashing retention success through employee well-being

“The pandemic brought the discourse on mental health to the forefront and normalised talk about stress and mental health in all forums. Accordingly, a formalised framework to address the mental health of employees has been put in place. Wellness webinars on these topics are delivered through tie-ups with service providers and in-house subject matter experts. Webinars on mental health are regularly organised with an aim to destigmatise mental health by increasing awareness of topics such as mental health awareness, digital & screen detox, and stress management. We continuously work on instituting policies that are customised to the individual and life-stage needs of the employees. An employee assistance program, in tie-up with a service provider, is in place to facilitate mental health conversations with qualified professionals. In addition, employees are nudged to incorporate habits that help take care of their mental well-being as an unconscious part of their lives. Initiatives such as the 'Mental Health Bingo’ card and ‘I De-stress myself by __’ campaigns have been launched.


How generative AI changes the data journey

We see generative AI used in the observability space throughout many industries, especially regarding compliance. Let’s look at healthcare, an industry where you must comply with HIPAA. You are dealing with sensitive information, generating tons of data from multiple servers, and you must annotate the data with compliance tags. An IT team might see a tag that says, “X is impacting 10.5.34 from GDPR…” The IT team may not even know what 10.5.34 means. This is a knowledge gap—something that can very quickly be fulfilled by having generative AI right there to quickly tell you, “X event happened, and the GDPR compliance that you’re trying to meet by detecting this event is Y…” Now, the previously unknown data has turned into something that is human readable. Another use case is transportation. Imagine you’re running an application that’s gathering information about flights coming into an airport. A machine-generated view of that will include flight codes and airport codes. Now let’s say you want to understand what a flight code means or what an airport code means. Traditionally, you would use a search engine to inquire about specific flight or airport codes. 


Banks May Be Ready for Digital Innovation: Many of the Staff Aren’t

A major roadblock to training workers is that many don’t actually bank with their employer. This makes training critical, especially for frontline staff members, says John Findlay, chief executive and founder of digital learning company LemonadeLXP, based in Ontario, Canada. “If their staff doesn’t bank with them, they don’t use the technologies on offer and it’s pretty difficult for them to promote them to customers,” he says. It’s also difficult for them to answer customer questions. Brian McNutt, U.S. vice president of product management at Dutch engagement platform Backbase, says banks should incentivize their staff to actually use their services as much as possible. One approach is to offer special rates or deals to employees, he says. “I think that really the most important thing is that they are customers themselves. There’s really no replacement for that. For somebody to really be able to empathize or understand customers, they have to experience the products themselves.”


The Future of Software Engineering: Transformation With Generative AI

The application of Generative AI in software engineering is not just a technical enhancement but a fundamental change in how software is conceptualized, developed, and maintained. This section delves into the key themes that underline this transformative integration, elucidating the diverse ways in which Generative AI is reshaping the field. Generative AI is revolutionizing the way code is written and maintained. AI models can now understand programming queries in natural language and translate them into efficient code, significantly reducing the time and effort required from human developers. This has several implications:Enhanced productivity: Developers can focus on complex problem-solving rather than spending time on routine coding tasks. Learning and development: AI models can suggest best coding practices and offer real-time guidance, acting as a learning tool for novice programmers. Code quality improvement: With AI's ability to analyze vast codebases, it can recommend optimizations and improvements, leading to higher quality and more maintainable code.


Reports: China’s Alibaba Shuts Down Quantum Lab

DoNews reported this week that Alibaba’s DAMO Academy (the Academy for Discovery, Adventure, Momentum and Outlook) has closed down its quantum laboratory for budget and profitability reasons. More than 30 people, possibly among China’s brightest quantum researchers, lost their positions to the budget ax, according to the news outlet’s internal sources. As further evidence, DoNews reports that the official DAMO Academy website has removed the quantum laboratory’s introduction page. According to the story, translated into English: “Insiders claimed that Alibaba’s DAMO Academy Quantum Laboratory had undergone significant layoffs, but it was not clear at that time whether the entire quantum computing team had been disbanded.” Media reports further suggest that many of the laid-off DAMO Academy quantum team members have begun sending their resumes to other companies. According to The Quantum Insider’s China’s Quantum Computing Market brief, Alibaba is a diverse tech conglomerate that has been active in quantum since 2015. The company’s Quantum Lab Academy has been teaching employees and students about the prospects of quantum computing.


It’s time the industry opts for collaborative manufacturing

The transition from an analogue factory to a digital one underscores the necessity of a coherent and efficient digital infrastructure. This transformation extends beyond the primary tasks of manufacturing, adding efficiency at every stage, including the cutting room. Investments in IoT-enabled machinery, though costly, can lead to significant improvements in output and efficiency. ... The technology underlines the importance of integrated planning software, which aids in production planning, order flow management and the efficient consumption of raw materials. As technology continues to evolve and digitisation gains ground, an important question emerges while making the roadmap: What are the social implications of this technological revolution? In a city like Bengaluru and its surrounding manufacturing hubs, more than 3.5 million women toil in the garment industry, forming the majority of the workforce. Their livelihoods hinge on operating sewing machines, a vocation they might continue for the next two decades. 


The Digital Revolution in Banking: Exploring the Future of Finance

As banks continue to close their physical branches, it becomes crucial to balance the convenience of digital banking and the personalized service that customers crave. While online banking has become increasingly popular, some still prefer the in-person experience of visiting their local branch and interacting with staff. This is especially important when it comes to welcoming new customers. To address this, emerging technologies, such as augmented reality (AR) and virtual reality (VR), may offer a solution to bridge the gap between digital convenience and personalized service. Imagine you are a banking executive looking for ways to improve your customer experience. You know that digital banking is the future, but you also understand that some customers still crave the personalized service of visiting a physical branch. This is where augmented reality (AR) and virtual reality (VR) come in. By incorporating AR into your mobile app, you can enhance the interface and provide customers with more information in an immersive way. 



Quote for the day:

"Success is the sum of small efforts, repeated day-in and day-out." -- Robert Collier

Daily Tech Digest - November 25, 2023

Building a Successful Data Quality Program

Assessing Data Quality often includes establishing a standard of acceptable Data Quality, using data profiling and analysis techniques, and using statistical methods to identify and correct any Data Quality issues. The key features (often called “dimensions”) that should be examined and measured are: Completeness: data should not be missing or have incomplete values. Uniqueness: locate and eliminate copies to ensure the information in the organization’s data files is free of duplication. Validity: how useful the data is, and how well it conforms to the organization’s standards. Timeliness: data can be measured by its relevance and freshness; old information that is no longer true or accurate should be removed so as not to cause confusion. Accuracy: the precision of the data, and how accurately it represents real-world information. Consistency: when data is copied, the information should remain consistent and accurate. The need for a single source of accurate in-house data provides a good argument for the use of master data and its best practices.
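Several of these dimensions can be measured directly with simple profiling code. The sketch below scores completeness and uniqueness over a list of records; the field names and sample rows are illustrative.

```python
# Profiling sketch for two Data Quality dimensions: completeness
# (required cells present and non-empty) and uniqueness (no
# duplicate values in a key field). Fields and rows are hypothetical.

def completeness(records, required_fields):
    """Fraction of required cells that are present and non-empty."""
    total = len(records) * len(required_fields)
    filled = sum(
        1 for r in records for f in required_fields
        if r.get(f) not in (None, "")
    )
    return filled / total if total else 1.0

def uniqueness(records, key):
    """Fraction of records carrying a distinct value for the key field."""
    values = [r.get(key) for r in records]
    return len(set(values)) / len(values) if values else 1.0

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},          # missing email
    {"id": 2, "email": "c@x.com"},   # duplicate id
]
print(completeness(rows, ["id", "email"]))  # 5 of 6 cells filled
print(uniqueness(rows, "id"))               # 2 distinct ids out of 3
```

Scores like these, tracked over time against an agreed threshold, turn the abstract dimensions into a measurable standard of acceptable quality.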


Building brand trust in a new era of data privacy

Emily emphasized the importance of anonymizing data to utilize it in aggregate without compromising individual privacy, a task that requires close collaboration between technical and marketing departments. Anita introduced the intriguing concept of a Chief Trust Officer, a role highlighted by Deloitte, which spans data, business, and marketing, safeguarding all aspects of compliance and privacy. The idea of having such a partner resonated with her, underlining the multifaceted nature of trust in business operations. Jake echoed the sentiment, stressing the need for understanding the types of data at hand and leveraging them without violating regulations - a balance that is critical yet challenging to achieve. These insights from the panelists underscore a common theme: building brand trust in the digital age is a multifaceted challenge that requires a blend of transparency, consistency, and compliance. As we continue to delve into this topic, it's clear that the role of data privacy is not just a technical issue but a cornerstone of the customer-brand relationship.


How Does Technical Debt Affect QA Testers

How many times have your testers been caught off guard at the last minute when the delivery manager abruptly appeared and said, “Guys, we need to launch our product in a week, and we are very sorry for not communicating this sooner. Please complete all test tasks ASAP so that we can begin the demo”? Simply put, any missing tests or “fix it later” attitude can result in a tech debt problem. Lack of test coverage, excessive user stories, short sprints, and other forms of “cutting corners” due to time constraints all contribute significantly to the building of technical debt in QA practice. When the complexity of the testing mesh began to grow with each new sprint, a US-based online retailer with a strong presence across various websites and mobile apps found itself in a real-world “technical debt” dilemma. ... Most QA managers mistakenly believe that tech debt is a legitimate result of putting all of their effort into the current sprint alone, which leads to completing test coverage manually and ignoring automation entirely. According to agile principles, we should see the tech debt problem as an inability to maintain and meet QA benchmarks.


How digital twins will enable the next generation of precision agriculture

Digital twins are digital representations of physical objects, people or processes. They aid decision-making through high-fidelity simulations of the twinned physical system in real time and are often equipped with autonomous control capabilities. In precision agriculture, digital twins are typically used for monitoring and controlling environmental conditions to stimulate crop growth at an optimal and sustainable rate. Digital twins provide a live dashboard to observe the environmental conditions in the growing area, and with varying autonomy, digital twins can control the environment directly. ... Agriculture is among the lowest-digitalized sectors, and digital maturity is an absolute prerequisite to adopting digital twins. As a consequence, costs related to digital maturity often overshadow technical costs in smart agriculture. A company undergoing the early stages of digitalization will have to think about choosing a cloud provider, establishing a data strategy and acquiring an array of software licences, to name just a few critical challenges.


What are Software Design Patterns?

Software design patterns are an essential aspect of software development that helps developers and engineers create reusable and scalable code. These patterns provide solutions to commonly occurring problems in software design, enabling developers to solve these problems efficiently and effectively. In essence, a software design pattern is a general solution to a recurring problem in software design that has been proven to be effective. It's like a blueprint for a specific type of problem that developers can use to create software systems that are reliable, maintainable, and scalable. Software design patterns have been around for a long time and are widely used in the software development industry. They are considered to be a best practice in software design because they provide a standardized approach to solving common problems, making it easier for developers to communicate and collaborate with one another. In this blog, we will explore what software design patterns are, the different types of software design patterns, and the benefits of using them in software development. 


Examples of The Observer Pattern in C# – How to Simplify Event Management

The observer pattern is an essential software design pattern used in event-driven programming and user interface development. It is composed of three primary elements: the subject, the observer, and concrete observers. The subject keeps track of observer objects and notifies them of changes in its state. The observer is the object that wishes to be notified when the subject's state changes, and a concrete observer is an implementation of the observer interface. One of the pattern's significant advantages is that it enables efficient event management: developers can trigger related events without tightly coupling the code that produces the events to the code that responds to them. The pattern also helps keep changes localized, so a modification in one place does not cause a ripple effect or chain reaction of changes elsewhere in the codebase. The subject defines the interface for attaching and detaching observers.
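The article's examples are in C#; as a language-neutral illustration of the three roles it describes (subject, observer interface, concrete observer), here is a minimal Python sketch. The class and method names are illustrative assumptions, not the article's code.

```python
class Subject:
    """Keeps track of observers and notifies them when its state changes."""

    def __init__(self):
        self._observers = []
        self._state = None

    def attach(self, observer):
        self._observers.append(observer)

    def detach(self, observer):
        self._observers.remove(observer)

    @property
    def state(self):
        return self._state

    @state.setter
    def state(self, value):
        # Changing the state automatically notifies all attached observers,
        # so callers never have to wire the two sides together by hand.
        self._state = value
        self._notify()

    def _notify(self):
        for observer in self._observers:
            observer.update(self)


class LoggingObserver:
    """A concrete observer: records every state change it is told about."""

    def __init__(self):
        self.seen = []

    def update(self, subject):
        self.seen.append(subject.state)


subject = Subject()
watcher = LoggingObserver()
subject.attach(watcher)
subject.state = "ready"
subject.state = "running"
print(watcher.seen)  # ['ready', 'running']
```

Note the loose coupling the paragraph describes: `Subject` knows only that each observer has an `update` method, so new concrete observers can be added without touching the subject's code.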


Cloud Computing: A Comprehensive Guide to Trends and Strategies

As a company moves to the cloud, it reduces the number of servers and other hardware its IT department has to maintain. Cloud computing makes efficient use of today’s powerful processors, fast networks, and massive amounts of storage. Cloud virtual machines allow businesses to run multiple servers on one physical machine, and containers take that concept a step further. Containers are a lightweight form of virtualization that packages an application and its dependencies in a portable unit. This means that if, for instance, a company wants to run a web server, it no longer has to devote a physical or virtual machine to hosting the server software: a container with only the needed bits runs in the cloud, appearing to the outside world as if it were its own dedicated machine. Many containers can run in the same cloud instance for maximum efficiency. Building on this model, platforms that run code on demand, with no servers for the developer to manage, are called serverless computing or Function as a Service (FaaS). The application-level isolation inherent in serverless computing restricts the attack surface that attackers can exploit.
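The FaaS model mentioned above boils down to writing a single entry-point function that the platform invokes per request. The sketch below is a generic, hypothetical handler in Python; the `handler(event, context)` signature and the response shape are illustrative conventions (modeled loosely on common FaaS platforms), not a specific provider's API.

```python
import json


def handler(event, context=None):
    """Hypothetical FaaS-style entry point. The cloud platform, not the
    developer, provisions and scales the containers that run this function;
    the developer ships only this code."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }


# Simulate the platform invoking the function with a request event
response = handler({"name": "cloud"})
print(response["statusCode"])  # 200
```

Because the function sees only its own event and has no long-lived server process around it, the isolation the paragraph mentions comes largely for free.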


Judges Urged To Stay Abreast Of Electronic Tools

The Cyber Security Authority (CSA), with funding support from the European Commission’s Technical Assistance and Information Exchange (TAIEX) instrument, is undertaking a series of workshops across Ghana to enhance the capacity of the judiciary and prosecutors in handling cybercrime and electronic evidence, a decisive factor in upholding the rule of law. Expressing excitement about the training, the Chief Justice said e-commerce, e-trade, e-contracts, and intellectual property rights, among others, were now being conducted virtually, and the expertise of judges in these new trends was a prerequisite for the efficient trial of cyber-related cases, particularly in the gathering of electronic data. “Judges must develop new working skills by staying abreast of the digital space. You must develop leadership skills in this arena if you want to remain relevant in the system,” she stressed. Albert Antwi-Boasiako stated that the major regulatory activity being undertaken by the Authority to license cybersecurity service providers and accredit cybersecurity establishments and professionals was tailored to support the training of the judges.


Candy Alexander Explains Why Bandwidth and Security are Both in High Demand

It became painfully clear to everyone that productivity depended first and foremost on bandwidth. The increased bandwidth of networks has become the primary factor of success; whether you're a business looking to ensure the productivity of your remote workers or to provide a cloud service, throughput is everything. With that, ubiquitous access to highly available networks has expanded worldwide. In today's digital world, businesses of all sizes rely on data. That data is used to make decisions, operate efficiently, and serve customers. Data is essential for everything, from product development to marketing and customer support. However, with the rise of remote work and cloud computing, it has become more challenging to ensure that data is always accessible and secure. Cybersecurity's golden triad of confidentiality, integrity, and availability is now applied to data itself rather than to on-premises systems and networks. Again, data has become more important than ever before.


Why Departments Hoard Data—and How to Get Them to Share

"Data hoarding within organizations can be attributed to a combination of cultural, operational and psychological factors," said Jon Morgan, CEO and editor-in-chief of Venture Smarter, a consulting firm in San Francisco. "When departments view data as a source of power or control, they are less inclined to share it with others, fearing that it might diminish their influence." Operational inefficiencies can also lead to data hoarding. "If access to data is cumbersome or time-consuming, employees may be less motivated to share it, preferring to keep it close for their own convenience," Morgan said. In addition, "psychological factors like fear of criticism or a desire to protect one's domain can also drive data hoarding." Employees may worry that sharing data will expose their mistakes or weaknesses, leading to a reluctance to collaborate, he said. Jace McLean, senior director of strategic architecture at Domo, a data platform based in American Fork, Utah, said he believes that cultural factors are the most important lever to use in changing data-hoarding habits. 



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller