
Daily Tech Digest - July 18, 2025


Quote for the day:

"It is during our darkest moments that we must focus to see the light." -- Aristotle Onassis




Machine unlearning gets a practical privacy upgrade

Machine unlearning, which refers to strategies for removing the influence of specific training data from a model, has emerged to fill the gap. But until now, most approaches have either been slow and costly or fast but lacking formal guarantees. A new framework called Efficient Unlearning with Privacy Guarantees (EUPG) tries to solve both problems at once. Developed by researchers at the Universitat Rovira i Virgili in Catalonia, EUPG offers a practical way to forget data in machine learning models with provable privacy protections and a lower computational cost. Rather than wait for a deletion request and then scramble to rework a model, EUPG starts by preparing the model for unlearning from the beginning. The idea is to first train on a version of the dataset that has been transformed using a formal privacy model, either k-anonymity or differential privacy. This “privacy-protected” model doesn’t memorize individual records, but still captures useful patterns. ... The researchers acknowledge that extending EUPG to large language models and other foundation models will require further work, especially given the scale of the data and the complexity of the architectures involved. They suggest that for such systems, it may be more practical to apply privacy models directly to the model parameters during training, rather than to the data beforehand.
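
As a rough illustration of the idea described above (train once on a privacy-protected view of the data so that later deletions are cheap), here is a minimal Python sketch. It is not the EUPG algorithm itself: the noise-based protect() function, the logistic-regression model, and all parameters are stand-in assumptions for the formal k-anonymity or differential-privacy transformations the researchers describe.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def protect(X, epsilon=1.0):
    """Crude stand-in for a formal privacy model: perturb features so that
    no single raw record is memorized exactly."""
    return X + rng.laplace(0.0, 1.0 / epsilon, size=X.shape)

# Toy data: 1,000 records, 5 features, binary label.
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Train once, up front, on the privacy-protected view of the data.
X_protected = protect(X)
model = LogisticRegression(max_iter=1000).fit(X_protected, y)

# Unlearning request for record 42: drop it from the protected data and refit
# cheaply, rather than rebuilding a whole pipeline around the raw records.
keep = np.ones(len(X_protected), dtype=bool)
keep[42] = False
model_after_unlearning = LogisticRegression(max_iter=1000).fit(X_protected[keep], y[keep])
```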


Emerging Cloaking-as-a-Service Offerings are Changing Phishing Landscape

Cloaking-as-a-service offerings – increasingly powered by AI – are “quietly reshaping how phishing and fraud infrastructure operates, even if it hasn’t yet hit mainstream headlines,” SlashNext’s Research Team wrote Thursday. “In recent years, threat actors have begun leveraging the same advanced traffic-filtering tools once used in shady online advertising, using artificial intelligence and clever scripting to hide their malicious payloads from security scanners and show them only to intended victims.” ... The newer cloaking services offer advanced detection evasion techniques, such as JavaScript fingerprinting, device and network profiling, machine learning analysis and dynamic content swapping, and put them into user-friendly platforms that hackers and anyone else can subscribe to, SlashNext researchers wrote. “Cybercriminals are effectively treating their web infrastructure with the same sophistication as their malware or phishing emails, investing in AI-driven traffic filtering to protect their scams,” they wrote. “It’s an arms race where cloaking services help attackers control who sees what online, masking malicious activity and tailoring content per visitor in real time. This increases the effectiveness of phishing sites, fraudulent downloads, affiliate fraud schemes and spam campaigns, which can stay live longer and snare more victims before being detected.”


You’re Not Imagining It: AI Is Already Taking Tech Jobs

It’s difficult to pinpoint the exact motivation behind job cuts at any given company. The overall economic environment could also be a factor, marked by uncertainties heightened by President Donald Trump’s erratic tariff plans. Many companies also became bloated during the pandemic, and recent layoffs could still be trying to correct for overhiring. According to one report released earlier this month by the executive coaching firm Challenger, Gray and Christmas, AI may be more of a scapegoat than a true culprit for layoffs: Of more than 286,000 planned layoffs this year, only 20,000 were related to automation, and of those, only 75 were explicitly attributed to artificial intelligence, the firm found. Plus, it’s challenging to measure productivity gains caused by AI, said Stanford’s Chen, because while not every employee may have AI tools officially at their disposal at work, they do have unauthorized consumer versions that they may be using for their jobs. While the technology is beginning to take a toll on developers in the tech industry, it’s actually “modestly” created more demand for engineers outside of tech, said Chen. That’s because other sectors, like manufacturing, finance, and healthcare, are adopting AI tools for the first time, so they are adding engineers to their ranks in larger numbers than before, according to her research.


The architecture of culture: People strategy in the hospitality industry

Rewards and recognitions are the visible tip of the iceberg, but culture sits below the surface. And if there’s one thing that I’ve learned over the years, it’s that culture only sticks when it’s felt, not just said. Not once a year, but every single day. Hilton’s consistent recognition as a Great Place to Work® globally and in India stems from our unwavering support and commitment to helping people thrive, both personally and professionally. ... What has sustained our culture through this growth is a focus on the everyday. It is not big initiatives alone that shape how people feel at work, but the smaller, consistent actions that build trust over time. Whether it is how a team huddle is run, how feedback is received, or how farewells are handled, we treat each moment as an opportunity to reinforce care and connection. ... Equally vital is cultivating culturally agile, people-first leaders. South Asia’s diversity, across language, faith, generation, and socio-economic background, demands leadership that is both empathetic and inclusive. We’re working to embed this cultural intelligence across the employee journey, from hiring and onboarding to ongoing development and performance conversations, so that every team member feels genuinely seen and supported.


Capturing carbon - Is DAC a perfect match for data centers?

The commercialization of DAC, however, faces several significant challenges. One primary obstacle is navigating different compliance requirements across jurisdictions. Certification standards vary significantly between regions like Canada, the UK, and Europe, necessitating differing approaches in each jurisdiction. However, while requiring adjustments, Chadwick argues that these differences are not insurmountable and are merely part of the scaling process. Beyond regulatory and deployment concerns, achieving cost reductions is a significant challenge. DAC remains highly expensive, costing an average of $680 per ton to produce in 2024, according to Supercritical, a carbon removal marketplace. In comparison, biochar has an average price of $165 per ton, and enhanced rock weathering has an average price of $310 per ton. In addition, the complexity of DAC means up-front costs are much higher than those of alternative forms of carbon removal. An average DAC unit comprises air-intake manifolds, absorption and desorption towers, liquid-handling tanks, and bespoke site-specific engineering. DAC also requires significant amounts of power to operate. Recent studies have shown that the energy consumption of fans in DAC plants can range from 300 to 900 kWh per ton of CO2 captured, which represents between 20 and 40 percent of total DAC system energy usage.


Rethinking Risk: The Role of Selective Retrieval in Data Lake Strategies

Selective retrieval works because it bridges the gap between data engineering complexity and security usability. It gives teams options without asking them to reinvent the wheel. It also avoids the need to bring in external tools during a breach investigation, which can introduce latency, complexity, or worse, gaps in the chain of custody. What’s compelling about this approach is that it doesn’t require businesses to abandon existing tools or re-architect their infrastructure. ... This model is especially relevant for mid-size IT teams who want to cover their audit requirements, but don’t have a 24/7 security operations center. It’s also useful in regulated sectors such as healthcare, financial services, and manufacturing where data retention isn’t optional, but real-time analysis for everything isn’t practical. ... Data volumes are continuing to rise. As organizations face high costs and fatigue, those that thrive will be the ones that treat storage and retrieval as distinct functions. The ability to preserve signal without incurring ongoing noise costs will become a critical enabler for everything from insider threat detection to regulatory compliance. Selective retrieval isn’t just about saving money. It’s about regaining control over data sprawl, aligning IT resources with actual risk, and giving teams the tools they need to ask, and answer, better questions.
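
To make the storage-versus-retrieval distinction concrete, here is a minimal sketch: everything is archived cheaply in compressed form, and only the records matching an investigation's criteria are rehydrated. The log fields, the in-memory "cold store," and the selection predicate are illustrative assumptions, not any particular product's design.

```python
import gzip
import json

cold_store: list[bytes] = []   # stand-in for cheap, compressed object storage

def archive(events: list[dict]) -> None:
    """Store everything cheaply: compress and keep, no indexing or analytics."""
    payload = "\n".join(json.dumps(e) for e in events).encode()
    cold_store.append(gzip.compress(payload))

def selective_retrieve(predicate) -> list[dict]:
    """Rehydrate only the records an investigation or audit actually needs."""
    hits = []
    for blob in cold_store:
        for line in gzip.decompress(blob).decode().splitlines():
            event = json.loads(line)
            if predicate(event):
                hits.append(event)
    return hits

archive([{"user": "alice", "action": "login", "src": "10.0.0.5"},
         {"user": "bob", "action": "file_copy", "src": "10.0.0.9"}])

# Pull back only what the audit question asks about.
print(selective_retrieve(lambda e: e["action"] == "file_copy"))
```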


Manufactured Madness: How To Protect Yourself From Insane AIs

The core of the problem lies in a well-intentioned but flawed premise: that we can and should micromanage an AI’s output to prevent any undesirable outcomes. These “guardrails” are complex sets of rules and filters designed to stop the model from generating hateful, biased, dangerous, or factually incorrect information. In theory, this is a laudable goal. In practice, it has created a generation of AIs that prioritize avoiding offense over providing truth. ... Compounding the problem of forced outcomes is a crisis of quality. The data these models are trained on is becoming increasingly polluted. In the early days, models were trained on a vast, curated slice of the pre-AI internet. But now, as AI-generated content inundates every corner of the web, new models are being trained on the output of their predecessors. ... Given this landscape, the burden of intellectual safety now falls squarely on the user. We can no longer afford to treat AI-generated text with passive acceptance. We must become active, critical consumers of its output. Protecting yourself requires a new kind of digital literacy. First and foremost: Trust, but verify. Always. Never take a factual claim from an AI at face value. Whether it’s a historical date, a scientific fact, a legal citation, or a news summary, treat it as an unconfirmed rumor until you have checked it against a primary source.


6 Key Lessons for Businesses that Collect and Use Consumer Data

Ensure your privacy notice properly discloses consumer rights, including the right to access, correct, and delete personal data stored and collected by businesses, and the right to opt out of the sale of personal data and targeted advertising. Mechanisms for exercising those rights must work properly, with a process in place to ensure a timely response to consumer requests. ... Another issue that the Connecticut AG raised was that the privacy notice was “largely unreadable.” While privacy notices address legal rights and obligations, you should avoid using excessive legal jargon to the extent possible and use clear, simple language to notify consumers about their rights and the mechanisms for exercising those rights. In addition, be as succinct as possible to help consumers locate the information they need to understand and exercise applicable rights. ... The AG provided guidance that under the CTDPA, if a business uses cookie banners to permit a consumer to opt out of some data processing, such as targeted advertising, the consumer must be provided with a symmetrical choice. In other words, it has to be as clear and as easy for the consumer to opt out of such use of their personal data as it would be to opt in. This includes making the options to accept all cookies and to reject all cookies visible on the screen at the same time and in the same color, font, and size.


How agentic AI is reshaping execution across BFSI

Several BFSI firms are already deploying agentic models within targeted areas of their operations. The results are visible in micro-interventions that improve process flow and reduce manual load. Autonomous financial advisors, powered by agentic logic, are now capable of not just reacting to user input, but proactively monitoring markets, assessing customer portfolios, and recommending real-time changes. In parallel, agentic systems are transforming customer service by acting as intelligent finance assistants, guiding users through complex processes such as mortgage applications or claims filing. ... For Agentic AI to succeed, it must be integrated into operational strategy. This begins by identifying workflows where progress depends on repetitive human actions that follow predictable logic. These are often approval chains, verifications, task handoffs, and follow-ups. Once identified, clear rules need to be defined. What conditions trigger an action? When is escalation required? What qualifies as a closed loop? The strength of an agentic system lies in its ability to act with precision, but that depends on well-designed logic and relevant signals. Data access is equally important. Agentic AI systems require context. That means drawing from activity history, behavioural cues, workflow states and timing patterns.
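
The rule-definition step above (what triggers an action, when to escalate, what closes the loop) can be pictured with a small sketch like the one below. The Rule fields, the claims scenario, and the thresholds are illustrative assumptions rather than any specific vendor's agent framework.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    trigger: Callable[[dict], bool]      # condition on the workflow state
    action: Callable[[dict], str]        # automated step the agent may take
    escalate: Callable[[dict], bool]     # when a human must take over

rules = [
    Rule(
        name="auto-approve small claims",
        trigger=lambda c: c["type"] == "claim" and c["amount"] < 1_000,
        action=lambda c: f"approve claim {c['id']}",
        escalate=lambda c: c["flags"] > 0,   # anomalies always go to a person
    ),
]

def run(case: dict) -> str:
    for rule in rules:
        if rule.trigger(case):
            if rule.escalate(case):
                return f"escalate {case['id']} to human review"
            return rule.action(case)
    return f"no rule matched; queue {case['id']} for manual handling"

print(run({"id": "C-17", "type": "claim", "amount": 420, "flags": 0}))
```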


Open Source Is Too Important To Dilute

The unfortunate truth is that these criteria don’t apply in every use case. We’ve seen vendors build traction with a truly open project. Then, worried about monetization or competition, they relicense it under a “source-available” model with restrictions, like “no commercial use” or “only if you’re not a competitor.” But that’s not how open source works. Software today is deeply interconnected. Every project — no matter how small or isolated — relies on dependencies, which rely on other dependencies, all the way down the chain. A license that restricts one link in that chain can break the whole thing. ... Forks are how the OSS community defends itself. When HashiCorp relicensed Terraform under the Business Source License (BSL) — blocking competitors from building on the tooling — the community launched OpenTofu, a fork under an OSI-approved license, backed by major contributors and vendors. Redis’ transition away from Berkeley Software Distribution (BSD) to a proprietary license was a business decision. But it left a hole — and the community forked it. That fork became Valkey, a continuation of the project stewarded by the people and platforms who relied on it most. ... The open source brand took decades to build. It’s one of the most successful, trusted ideas in software history. But it’s only trustworthy because it means something.

Daily Tech Digest - March 20, 2024

How to deploy software to Linux-based IoT devices at scale

IoT development may be so nascent that it is not yet part of your mainstream DevOps processes—you may still be in the early stages of experimentation. Once you're ready to scale, you'll need to bring IoT into the DevOps fold. Needless to say, the scale and costs of dealing with thousands of deployed devices are significant. DevOps is an important approach for ensuring the seamless and efficient delivery of software development, updates, and enhancements to IoT devices. By integrating IoT development into an established workflow, you'll gain the improved collaboration, agility, assured delivery, control, and traceability that's part of a modern DevOps process. It's critical to use a secure deployment process to protect your IoT devices from unauthorized access, inadvertent vulnerabilities, and malware. A secure deployment must include strong authentication methods to access the devices and the management platform. The data that is transmitted between the devices and the management platform should be protected by encryption. The manner in which the client devices connect to the platform after deployment should always be encrypted as well.


In 5 Years, Coding will be Done in Natural Language

“Because AI is a tool,” he adds, people should be able to operate at a higher level of abstraction and become far more efficient at the jobs they do. Eventually, everyone is likely to be coding in natural language, but that wouldn't necessarily make them a software engineer or a programmer. The skills required to be a coder go well beyond putting prompts into an AI tool, copying the code, or merely typing in natural language. ... Soon, there could be a programming language that is effectively just English. Not to be confused with prompt engineering or writing code, the term natural language programming means that most of the coding would be done by the software in the backend. The programmer would only have to interact with the tool in English, or any other language, and never even look at the code. By contrast, a few experts believe that English cannot be a programming language because it is filled with misunderstandings. “If they're going into machines, which will affect the lives of people, we can't afford that level of comedy,” said Douglas Crockford when talking to AIM.


Cybersecurity's Future: Facing Post-Quantum Cryptography Peril

The post-quantum cryptography era might not be open season on unprepared systems, he says, but rather an uneven landscape. There are layers of concerns to consider. “What I think scares me a little bit is that this type of attack is somewhat quiet,” Ho says. “The people who are going to be taking advantage of this -- the few people initially who have quantum computers as you can imagine, probably state actors -- will want to keep this on the downlow. You wouldn’t know it, but they probably already have access.” ... “From a technical perspective … being quantum-safe -- it’s a binary thing. You either are or you’re not,” says Duncan Jones, head of quantum cybersecurity with quantum computing company Quantinuum. If there is a particular computer system that an organization fails to migrate to new standards and protocols, he says that system will be vulnerable. However, the barrier to entry for access to quantum compute resources may limit the potential for early attackers who already have pockets deep enough to procure the technology. 


The New CISO: Rethinking the Role

CISOs need to be negotiators. They need to argue in favor of stronger security and convince boards and business units of the risks in terms they understand. How a CISO goes about this can vary, depending on whether board members' experience is in technology or business. Providing a demonstration that puts the technical risk into a business perspective can be helpful. CISOs should also talk with other C-level executives — as well as CISOs from other industries — to get advance buy-in and different perspectives on similar conversations they're having with their boards. ... CISOs need to be comfortable developing a risk-based approach focusing on the importance of resiliency, because attackers will get in. Developing a tested plan to respond to attacks is just as important as implementing preventative measures. … it's balancing the risk with the cost. ... CISOs should build a deeply technical team that can focus on key security practices. They should run tabletop exercises on scenarios such as a system shutdown or inability to connect to the Internet. CISOs must not rely on assumptions about how to respond; running through and testing all response plans is vital.


Architect’s Guide to a Reference Architecture for an AI/ML Data Lake

If you are serious about generative AI, then your custom corpus should define your organization. It should contain documents with knowledge that no one else has, and should only contain true and accurate information. Furthermore, your custom corpus should be built with a vector database. A vector database indexes, stores and provides access to your documents alongside their vector embeddings, which are the numerical representations of your documents. ... Another important consideration for your custom corpus is security. Access to documents should honor access restrictions on the original documents. (It would be unfortunate if an intern could gain access to the CFO’s financial results that have not been released to Wall Street yet.) Within your vector database, you should set up authorization to match the access levels of the original content. This can be done by integrating your vector database with your organization’s identity and access management solution. At their core, vector databases store unstructured data. Therefore, they should use your data lake as their storage solution.
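
A minimal sketch of the access-control point above: each stored chunk carries both an embedding and the ACL inherited from its source document, and retrieval filters by the caller's groups before ranking. The in-memory store and the placeholder embed() function are assumptions standing in for a real vector database and embedding model.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: a real system would call an embedding model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

corpus = []  # each entry: (vector, text, allowed_groups)

def ingest(text: str, allowed_groups: set[str]) -> None:
    corpus.append((embed(text), text, allowed_groups))

def search(query: str, user_groups: set[str], k: int = 3) -> list[str]:
    qv = embed(query)
    # Only rank documents whose ACL overlaps the caller's groups.
    visible = [(float(v @ qv), text) for v, text, acl in corpus if acl & user_groups]
    return [text for _, text in sorted(visible, reverse=True)[:k]]

ingest("Q3 financial results (unreleased)", {"finance-leadership"})
ingest("Employee handbook: travel policy", {"all-employees"})

print(search("travel reimbursement rules", {"all-employees"}))  # handbook only
```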


5 ways private organizations can lead public-private cybersecurity partnerships

One tangible step that cybersecurity stakeholders can take is to build the bottom-up infrastructure that can meet JCDC’s top-down strategic vision as it attempts to descend into tactical usefulness. This can be done by encouraging the development of volunteer civil cyber defense organizations while simultaneously lobbying the federal government for support of these entities. This kind of volunteer service model is an incredibly cost-efficient way to boost national defense, save federal government resources, and assure private stakeholders about their independence. ... Unfortunately, as criticism of the JCDC emphasizes, top-down P3 efforts often fail to effectively do so due to the role of strategic parameters driving derivative mission parameters. If industry is to shape CISA's P3 cyber initiatives more clearly toward alignment with practical tactical considerations, mapping out where innovation and adaptation come from in the interaction of key individuals spread across a complex array of interacting organizations (particularly during a crisis) becomes a critical common capacity.


Decoding tomorrow’s risks: How data analytics is transforming risk management

With digital technologies coming in, corporations can make use of data analytics to ensure goals correlate with their strategic needs. ... Among the different risk management strategies, data analytics can contribute to optimisation models, which direct data-backed resource deployment towards risk mitigation; scenario analysis, which recreates likely circumstances to gauge the effectiveness of different risk mitigation measures; and personalised responses, which supply custom-fit answers to particular market conditions. ... “I believe the role of data analytics in risk mitigation has become paramount, enabling organisations to make decisions based on data-driven insights. By leveraging advanced analytics techniques, such as predictive modelling and ML, we can anticipate threats and take measures to mitigate them. From a business perspective, data analytics is considered indispensable in risk management as it helps organisations identify, assess, and prioritise risks. Companies that leverage data analytics in risk management can gain an edge by minimising disruptions, maximising opportunities, and safeguarding their reputation,” Yuvraj Shidhaye, founder and director, TreadBinary, a digital application platform, mentioned.


How AI-Driven Cyberattacks Will Reshape Cyber Protection

Aside from adaptability and real-time analysis, AI-based cyberattacks also have the potential to cause more disruption within a small window. This stems from the way an incident response team operates and contains attacks. When AI-driven attacks occur, there is the potential to circumvent or hide traffic patterns. This is somewhat similar to criminal activity, where fingerprints are destroyed. Of course, the AI methodology is to change the system log analysis process or delete actionable data. Perhaps having advanced security algorithms that identify AI-based cyberattacks is the answer. ... AI has introduced challenges where security algorithms must become predictive, rapid and accurate. This reshapes cyber protection because organizations' infrastructure devices must support the methodologies. It's no longer a concern where network intrusions, malware and software applications are risk factors, but rather how AI transforms cyber protection. The shield is not broken. It requires a transformation practice for AI-based attacks.


Four easy ways to train your workforce in cybersecurity

Do your employees install all kinds of random apps and programs? Do the same thing as the phishing emails: create your own dodgy software that locks the employee's computer, blast it out to the employee database, and see who falls for it. When they have to bring their IT assets in to be unlocked and get a scolding for installing suspicious material, however harmless, the lesson will stick. ... Cyber attacks soar during festive seasons, like the upcoming Holi holiday. Set up automated reminders to your employees to remind them not to blindly open greeting mails or click on suspicious links. You can track the open and read rate of these messages to get an idea of whether people are actually paying attention. ... If your IT team is savvy and has some time to spare, they can use generative AI to create fake personas – someone from another department, a vendor, or a customer – and see if these fake personas can fool people into giving away information they should be keeping confidential. This is particularly important, because many cyber criminals today are already using generative AI to scam unwitting victims. 


Report: AI Outpacing Ability to Protect Against Threats

There are "two sides of the coin" when it comes to AI adoption, said Greg Keller, CTO at JumpCloud. Employee productivity and technology stacks being embedded into SaaS solutions are "the new frontier," he said. "Yet there are universal security concerns. There is fear of commingling or escaping of your data into public sectors. And there is a fear of using one's data on an AI platform. CTOs are concerned about their data leaking through public LLMs," Keller said. ... "We're at the tail end of understanding digital transformation. Now, we are beginning the first phase of the identity transformation. These companies have done an amazing amount of work to lift and shift their technology stacks from legacy into the cloud with one exception - overwhelmingly, it's the Microsoft Active Directory problem," Keller said. "That's still on-premises or self-managed. So they're looking at ways to modernize this. We are in the earliest phases of security shifting away from endpoint-based [security]. Now, it's about understanding access control through the identity, and this is the new frontier."



Quote for the day:

"Whatever you can do, or dream you can, begin it. Boldness has genius, power and magic in it." --Johann Wolfgang von Goethe

Daily Tech Digest - February 16, 2024

GitHub: AI helps developers write safer code, but you need to get the basics right

With cybercriminals largely sticking to the same tactics, it is critical that security starts with the developer. "You can buy tools to prevent and detect vulnerabilities, but the first thing you need to do is help developers ensure they're building secure applications," Hanley said in a video interview with ZDNET. As major software tools, including those that power video-conferencing calls and autonomous cars, are built and their libraries made available on GitHub, if the accounts of people maintaining these applications are not properly secured, malicious hackers can take over these accounts and compromise a library. The damage can be wide-reaching and lead to another third-party breach, such as the likes of SolarWinds and Log4j, he noted. Hanley joined GitHub in 2021, taking on the newly created role of CSO as news of the colossal SolarWinds attack spread. "We still tell people to turn on 2FA...getting the basics is a priority," he said. He pointed to GitHub's efforts to mandate the use of 2FA for all users, a process that has been in the works for the past year and a half and will be completed early this year.


Why Tomago Aluminium reversed course on its cloud journey

“An ERP solution like ours is massive,” he says, highlighting that this can make it difficult to keep track of everything you are, and not, using. For instance, he says if you’re getting charged $20,000 for electricity, you might want to check your meter and verify that your usage and bill align. “If your electricity meter is locked away and you just get a piece of paper at the end of the month telling you everything’s fine and you owe $20,000, you’re probably going to ask some questions,” he says. Tomago was told everything was secure and running as it should, but they had no way to verify what they were being told was accurate. “We essentially had a swarm of big black boxes,” he says. “We put dollars in and got services out, but couldn’t say to the board, with confidence, that we were really in control of things like compliance, security, and due diligence.” Then in 2020, Tomago moved its ERP system back on-prem — a decision that’s paying dividends. “We now know what our position is from a cyber perspective because we know exactly what our growth rates are, and we know that our systems are up-to-date, and what our cost is because it’s the same every month,” he says.


OpenAI and Microsoft Terminate State-Backed Hacker Accounts

Threat actors linked to Iran and North Korea also used GPT-4, OpenAI said. Nation-state hackers primarily used the chatbot to query open-source information, such as satellite communication protocols, and to translate content into victims' local languages, find coding errors and run basic coding tasks. "The identified OpenAI accounts associated with these actors were terminated," OpenAI said. It conducted the operation in collaboration with Microsoft. "Microsoft and OpenAI have not yet observed particularly novel or unique AI-enabled attack or abuse techniques resulting from threat actors' usage of AI," the Redmond, Washington-based technology giant said. Microsoft's relationship with OpenAI is under scrutiny by multiple national antitrust authorities. A British government study published earlier this month concluded that large language models may boost the capabilities of novice hackers but so far are of little use to advanced threat actors. China-affiliated Charcoal Typhoon used ChatGPT to research companies and cybersecurity tools, debug code and generate scripts, and create content likely for use in phishing campaigns.


Why Most Founders and Investors Are Wrong About Disruption

Recognizing disruption requires an open mind. In many instances, people can't believe or see something is disruptive at first. They think the idea is foolish or won't work. Disruption is usually caused by something that hasn't existed before or something new. Airbnb is a great example here as well. Its founders are said to have gone to every venture capitalist in Silicon Valley and were famously laughed out of meetings. People couldn't see what they saw — it hadn't been invented yet. Even the most seasoned business leaders can misunderstand and mistake disruption or fail to recognize it. Disruption doesn't always mean extinction. History has proven this for countless companies, processes, products, services, and ideas. Organizations can collapse after big changes. They did not or could not adapt. But something new or different tends to fill in the gap. It's often better, and the cycle continues. I have been on both sides of disruption at my company, BriteCo. We are one of the jewelry industry's disruptors – we were the first to move jewelry consumers to 100% paperless processes with technology and the internet. We also provide our customers with different ways to buy our coverage, unique to BriteCo, versus an outdated analog process at the retail point of sale.


Will generative AI kill KYC authentication?

Lee Mallon, the chief technology officer at AI vendor Humanity.run, sees an LLM cybersecurity threat that goes way beyond quickly making false documents. He worries that thieves could use LLMs to create deep back stories for their frauds in case someone at a bank or government level reviews social media posts and websites to see if a person truly exists. “Could social media platforms be getting seeded right now with AI-generated life histories and images, laying the groundwork for elaborate KYC frauds years down the line? A fraudster could feasibly build a ‘credible’ online history, complete with realistic photos and life events, to bypass traditional KYC checks. The data, though artificially generated, would seem perfectly plausible to anyone conducting a cursory social media background check,” Mallon says. “This isn’t a scheme that requires a quick payoff. By slowly drip-feeding artificial data onto social media platforms over a period of years, a fraudster could create a persona that withstands even the most thorough scrutiny. By the time they decide to use this fabricated identity for financial gains, tracking the origins of the fraud becomes an immensely complex task.”


Generative AI: Shaping a New Future for Fraud Prevention

A new category called "AI Risk Decisioning" is poised to transform the landscape of fraud detection. It leverages the strengths of generative AI, combining them with traditional machine learning techniques to create a robust foundation for safeguarding online transactions. ... The first pillar involves creating a comprehensive knowledge fabric that serves as the foundation for the entire platform. This fabric integrates various internal data sources unique to the company, such as transaction records and real-time customer profiles. ... The third pillar of the AI Risk Decisioning approach focuses on automatic recommendations, offering powerful capabilities for real-time and effective risk management. It can automatically monitor transactions and identify trends or anomalies, suggest relevant features for risk models, conduct scenario analyses independently, and recommend the next best action to optimize performance. ... The fourth pillar of the AI Risk Decisioning approach emphasizes human-understandable reasoning. This pillar aims to make every decision, recommendation, or insight provided by the AI system easily understandable to human users.


Implementing a Digital Transformation Strategy

Actionable intelligence has been accepted as the “new normal” of the data-first enterprise. In the data-first enterprise, data and digital technologies not only open up innovative revenue channels but also create the most compliant (governed) business operations. However, in order for an enterprise to successfully plan, develop, and execute a data-first operating model, the business owners and operators have to first develop a digital transformation strategy – connecting the data piles, digital technologies, business processes, and marketing staff. The digital transformation strategy develops around the need to bridge the gaps between the current data-driven goals and processes and intended future business goals and processes. In a nutshell, the digital transformation strategy strikes a harmonious balance between traditional IT and marketing functions. Global businesses have witnessed firsthand the immense benefits of digital processes, such as improved efficiencies, reduced operating costs, and growth of additional revenue channels. A recent industry survey report indicated that 92% of businesses are already pursuing digital transformation in more than one way. However, the transformation across businesses is at various stages of maturity.


Planning a data lake? Prepare for these 7 challenges

Storing data in a central location simplifies compliance in the sense that you know where your data resides, though it also creates compliance challenges. If you store many different types of data in your lake, different assets may be subject to different compliance standards. Data that contains personally identifiable information (PII), for instance, must be managed differently in some ways than other types of data to comply with laws like the DPA, GDPR, or HIPAA. While a data lake won’t prevent you from applying granular security controls to different data assets, it doesn't make it easier, either – and it can make it more difficult if your security and compliance tools are not capable of applying different policies to different data assets within a centralized repository. ... Placing your data into a central location to create a data lake is one thing, but connecting it to the various applications and the workforce that need access is another. Until you develop the necessary data integrations – and unless you keep them up to date – your data lake will deliver little value. Building data integrations takes time, effort, and expertise, and users sometimes underestimate how difficult it is to create successful data integrations. Be sure to prioritize data integration strategy as part of your overall process.
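
One way to picture the per-asset compliance point above is a small policy registry: each asset in the lake is tagged with a classification, and retention and access rules are resolved from the tag instead of from one blanket policy. The tags, rules, and catalog entries below are illustrative assumptions, not a specific compliance tool's schema.

```python
from dataclasses import dataclass

POLICIES = {
    "pii":     {"retention_days": 30,  "encrypt": True,  "allowed_roles": {"privacy-team"}},
    "generic": {"retention_days": 365, "encrypt": False, "allowed_roles": {"analysts", "privacy-team"}},
}

@dataclass
class Asset:
    path: str
    classification: str  # "pii" or "generic"

catalog = [
    Asset("s3://lake/customers/contacts.parquet", "pii"),
    Asset("s3://lake/telemetry/clickstream.parquet", "generic"),
]

def can_read(asset: Asset, role: str) -> bool:
    return role in POLICIES[asset.classification]["allowed_roles"]

for a in catalog:
    rules = POLICIES[a.classification]
    print(a.path, "retain", rules["retention_days"], "days;",
          "analyst access:", can_read(a, "analysts"))
```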


Does Cloud Native Change Developer Productivity and Experience?

When management focuses too much on developer productivity, developer experience can suffer and thus hurt morale and, paradoxically, productivity as well. It’s important for management to have a light touch to avoid this problem, especially with cloud native. Cloud native environments can become so dynamic and noisy that both productivity and developer experience can decline. Management must take special care to support its developers with the right platforms, tools, processes and productivity metrics to facilitate the best outcomes, leveraging platform engineering to create and manage IDPs that facilitate cloud native development despite its inherent complexity. After all, the complexity of cloud native development alone isn’t the problem. Complexity presents challenges to be sure, but developers are always up for a challenge. Complexity coupled with a lack of visibility brings frustration, lowering productivity and DX. With the right observability, for example, with Chronosphere and Google Cloud, developers have a good shot at untangling cloud native’s inherent complexity, delivering quality software on time and on budget, while maintaining both productivity and DX.


Vulnerability to Resilience: Vision for Cloud Security

In the recent era of cloud-native development and DevSecOps, CISOs face the challenge of fostering a security-conscious culture that spans across various cross-functional teams. However, by adopting deliberate, disruptive, engaging, and enjoyable approaches that also provide a return on investment, a sustainable security culture can be achieved. It is essential to instill the concept of shared responsibility for security and focus on enhancing awareness and adhering to advanced security practices. If you don't already have a secure development lifecycle, it is imperative to integrate one immediately. Recognizing and rewarding individuals who prioritize security is one of the ways to encourage a security-focused culture. Additionally, creating a security community and making security more engaging and enjoyable can also help cultivate a sustainable security culture. CISOs should leverage technical tools and best practices to facilitate the seamless integration of security into the Continuous Integration/Continuous Deployment (CI/CD) pipeline. This can be achieved through various measures, such as conducting threat modeling, adopting a shift-left security approach, incorporating IDE security ...



Quote for the day:

"You may have to fight a battle more than once to win it." -- Margaret Thatcher

Daily Tech Digest - October 05, 2023

AI and Overcoming User Resistance

If users are concerned, and even worried about AI, it could lead to user resistance, which is a dynamic that IT pros are familiar with from their history of implementing new systems that alter business processes, require employee retraining, and may even change employee jobs. So, are process change and user resistance any different when you introduce AI? I would argue yes. You’re not just retraining an employee on a new set of steps for processing an invoice or taking an order. You’re actually introducing an automated thinking process into what an employee has been doing. Now, technology is going to make or recommend decisions that the employee used to make. This can lead to employees experiencing a loss of empowerment and control. ... This is exactly the “sweet spot” that companies (and IT) should aim for with AI projects: an environment where everyone sees beneficial value from AI, and where no one feels disenfranchised. This is an achievable environment if users are engaged early in business process redefinition and in how AI will work. 


Eyes everywhere: How to safely navigate the IoT video revolution

Users are rightfully wary of bringing even more cameras into their homes and offices. The good news is that they, too, can protect their camera-enabled devices with some simple steps. First, customize. This includes changing default usernames and passwords, updating the device’s firmware and software, and staying informed about the latest security threats. This is a simple yet effective way to create a barrier between yourself and would-be hackers. Next, take it to the edge. Processing and storing data at the edge instead of the cloud is another surefire way to protect your endpoints. After all, by storing the information under your own lock and key, you can be sure about who can access it and how. Users also benefit from reduced latency by storing the information closer to home, which is particularly important with heavy video feeds. Finally, buy trusted brands. Attack surfaces are only as strong as their weakest link. So, choose companies that have a proven track record when it comes to privacy and security.


Why HTTP Caching Matters for APIs

In some caching strategies, especially for dynamic resources, the cache can store not only the complete response but also the individual elements or changes that make up the response. This approach is known as “delta caching” or “incremental caching.” Instead of sending the complete response, delta caching sends only the changes or updates made to the cached version of the resource. ... Delta caching is particularly useful for scenarios where resources change frequently, but the changes are relatively small compared to the complete resource. For example, in a collaborative document editing application, delta caching can be employed to send only the changes made by a user to a shared document, instead of sending the entire document every time it is updated. ... Caching enhances application resilience by reducing the risk of service disruptions during periods of high demand. By serving cached responses, even if the backend servers experience temporary performance issues, the application can continue to respond to a significant portion of requests from the cache. The caching layer acts as a buffer between the backend servers and the clients.
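
A tiny sketch of the delta idea: keep the client's last full copy, and when the resource changes, send only a diff to be applied to that copy. Real systems use formats such as JSON Patch or purpose-built delta encodings; difflib here is just a stand-in to show how much smaller the update can be.

```python
import difflib

# A 200-line resource the client already has cached.
cached = [f"row {i}: unchanged content\n" for i in range(200)]
# The latest server-side version differs in a single line.
current = list(cached)
current[57] = "row 57: edited content\n"

delta = list(difflib.unified_diff(cached, current, fromfile="cached", tofile="current"))

print("full response bytes:", sum(len(line) for line in current))
print("delta bytes:        ", sum(len(line) for line in delta))
# The client applies the delta to its cached copy instead of re-downloading everything.
```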


Author Talks: How to speak confidently when you’re put on the spot

People become nervous for many reasons. More than 75 percent of people report being nervous in high-stakes communication, be it planned or spontaneous. Past experience could be a factor, as well as high stakes and the importance of the goals you’re trying to achieve. Those of us who study this at an academic level believe that the nervousness is wired into being human. We see this across all cultures. We see it develop typically in the early teen years and progress from there. There’s an evolutionary component to it. One of the most helpful tips is normalizing the anxiety that you feel. You’re not alone. ... My anxiety management plan has three steps. The first thing I do is hold something cold in the palms of my hand before I speak. That cools me down. Secondly, I say tongue twisters to warm up my voice and also to get myself in the moment. Third, I remind myself, “I am in service of my audience. I am here to help them.” That really gets me other-focused rather than self-focused. That’s my anxiety management plan. I encourage everybody to find a plan that works for them.


Dell customizes GenAI and focuses on data lakehouse

Being able to fine-tune as well as train generative AI is a process that relies on data, lots and lots of data. For enterprise use cases, that data isn’t just generic data taken from a public source, but rather is data that an organization already has in its data centers or cloud deployments and is likely also spread across multiple locations. To help enable enterprises to fully benefit from data for generative AI, Dell is building out an open data lakehouse platform. The data lakehouse concept is one that was originally pioneered by Databricks, as a way of enabling organizations to more easily query data stored in cloud object storage-based data lakes. The Dell approach is a bit more nuanced in that it is taking a hybrid approach to data, with a goal of being able to query data across on-premises as well as multi-cloud deployments. Greg Findlen, senior VP of data management at Dell, explained during the press briefing that the open data lakehouse will be able to use Dell storage and compute capabilities as well as multi-cloud storage.


Don’t try running with data before you can walk

In South Africa, data governance tends to be a grudge investment based on regulatory issues. However, organisations that don’t do the basics well, and don’t have mature data governance and established frameworks in place, may well find they are spending on analytics technologies that don’t live up to expectations. What stands in the way of getting governance right? Firstly, it’s not easy. It involves all stakeholders across all domains. It may require a mindset change, and users may need to learn to use new technology. Secondly, it can be expensive, and it may take time before the organisation sees the value of it. One of the biggest problems is that the value of data governance investments is difficult to quantify in monetary terms. ... Data products should be supported by the entire CDO capability – including the CDO, data owners and data stewards – as well as IT, to ensure the data products will add the required business value. Owners and stewards need to identify and curate the required data for the products, while also ensuring good quality data and metadata management to make it more usable for broader business.


Yes, Software Development is an Assembly Line, but not Like That

Manufacturing engineers produce assembly lines and manufacturing processes that can produce those units of value. Software engineers are largely the same, also producing systems and processes that deliver units of value. The manufactured widget of software is actually the discrete user interactions with those features and pieces of software, not the features themselves. The assembly line in software engineering isn’t, as many think, the engineers producing features. ... Systems like Total Quality Management, which are focused on driving a cultural mindset of continuous improvement and an entire company focused on providing very low defect rates, easily translate to customer satisfaction in software organizations. Just to pick on TQM a bit, if we were to adapt it to software, we would focus on the number of times users are impacted by a defect more than the number of open bugs. Instead of tracking the number of defects and searching for more, we would be tracking the number of users who either failed to receive the promised value from the product or had severely diminished value.


Cloud Services Without Servers: What’s Behind It

“The basic idea of serverless computing has been around since the beginning of cloud computing. However, it has not become widely accepted,” explains Samuel Kounev, who heads the JMU Chair of Computer Science II (Software Engineering). But a shift can currently be observed in industry and in science: the focus is increasingly moving towards serverless computing. A recent article in the Communications of the ACM magazine of the Association for Computing Machinery (ACM) deals with the history, status and potential of serverless computing. Among the authors are Samuel Kounev and Dr. Nikolas Herbst, who heads the JMU research group “Data Analytics Clouds”. ... “NoOps” is the first, which stands for “no operations”. This means, as described above, that the technical server management, including the hardware and software layers, is entirely the responsibility of the cloud provider. The second principle is “utilisation-based billing”, which means that only the time during which the customer actively uses the allocated computing resources is billed.
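
The "utilisation-based billing" principle can be shown with a few lines of arithmetic: only the time (and memory) a function actually runs is charged. The per-GB-second price below is an assumed figure for illustration, not any provider's published rate.

```python
# Each invocation is billed only for its own duration and allocated memory.
invocations = [  # (duration in seconds, memory in GB) per call
    (0.120, 0.5),
    (0.300, 0.5),
    (0.085, 0.5),
]
price_per_gb_second = 0.0000167  # assumed illustrative rate

gb_seconds = sum(duration * memory for duration, memory in invocations)
print(f"billed usage: {gb_seconds:.4f} GB-s -> ${gb_seconds * price_per_gb_second:.8f}")
```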


7 sins of software development

Some software development issues can be fixed later. Building an application that scales efficiently to handle millions or billions of events isn’t one of them. Creating effective code with no bottlenecks that surprise everyone when the app finally runs at full scale requires plenty of forethought and high-level leadership. It’s not something that can be fixed later with a bit of targeted coding and virtual duct tape. The algorithms and data structures need to be planned from the beginning. That means the architects and the management layer need to think carefully about the data that will be stored and processed for each user. When a million or a billion users show up, which layer does the flood of information overwhelm? How can we plan ahead for those moments? Sometimes this architectural forethought means killing some great ideas. Sometimes the management layer needs to weigh the benefits with the costs of delivering a feature at scale. Some data analysis just doesn’t work well at large scale. Some formulas grow exponentially with more users. 
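
A quick illustration of why scale has to be planned up front: a feature that compares every user with every other user grows quadratically, which is invisible in a small demo and ruinous in production. The numbers below are purely illustrative.

```python
# Pairwise comparisons grow as n*(n-1)/2, far faster than the user count itself.
for users in (1_000, 1_000_000, 1_000_000_000):
    pairs = users * (users - 1) // 2
    print(f"{users:>13,} users -> {pairs:,} pairwise comparisons")
```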


Organizations grapple with detection and response despite rising security budgets

For better understanding and evaluation, the study was able to categorize the responding organizations into "secure creators" and "prone enterprises." The grouping was done on the basis of the number of solutions used, the adoption of emerging technologies, and the use of technologies to simplify their automation environments. The study found that secure creators are more satisfied with their approach to cybersecurity, experience fewer cybersecurity incidents, and can detect and respond to incidents quicker. About 70% of them are early adopters of emerging technologies. The secure creators are also more focused on extracting the most value from specific advanced solutions, with 62% already using or in the late stages of implementing AI/ML solutions, as compared to only 45% of the prone enterprises. "When it comes to technology, the more clutter an organization has in its armory, the harder it is to pick up signals and get on top of issues quickly," Watson said.



Quote for the day:

"You’ll never achieve real success unless you like what you’re doing." -- Dale Carnegie

Daily Tech Digest - September 24, 2023

How legacy systems are threatening security in mergers & acquisitions

Legacy systems are far more likely to get hacked. This is especially true for companies that become involved in private equity transactions, such as mergers, acquisitions, and divestitures. These transactions often result in IT system changes and large movements of data and financial capital which leave organizations acutely vulnerable. With details of these transactions being publicized or publicly accessible, threat actors can specifically target companies likely to be involved in such deals. We have seen two primary trends throughout 2023: Threat groups are closely following news cycles, enabling them to quickly target entire portfolios with zero-day attacks designed to upend aging technologies; disrupting businesses and their supply chains; Corporate espionage cases are also on the rise as threat actors embrace longer dwell times and employ greater calculation in methods of monetizing attacks. Together, this means the number of strategically calculated attacks — which are more insidious than hasty smash-and-grabs — are on the rise. 


How Frontend Devs Can Take Technical Debt out of Code

To combat technical debt, developers — even frontend developers — must see their work as a part of a greater whole, rather than in isolation, Purighalla advised. “It is important for developers to think about what they are programming as a part of a larger system, rather than just that particular part,” he said. “There’s an engineering principle, ‘Excessive focus on perfection of art compromises the integrity of the whole.’” That means developers have to think like full-stack developers, even if they’re not actually full-stack developers. For the frontend, that specifically means understanding the data that underlies your site or web application, Purighalla explained. “The system starts with obviously the frontend, which end users touch and feel, and interface with the application through, and then that talks to maybe an orchestration layer of some sort, of APIs, which then talks to a backend infrastructure, which then talks to maybe a database,” he said. “That orchestration and the frontend has to be done very, very carefully.” Frontend developers should take responsibility for the data their applications rely on, he said.


Digital Innovation: Getting the Architecture Foundations Right

While the benefits of modernization are clear, companies don’t need to be cutting edge everywhere, but they do need to apply the appropriate architectural patterns to the appropriate business processes. For example, Amazon Prime recently moved away from a microservices-based architecture for streaming media. In considering the additional complexity of service-oriented architectures, the company decided that a "modular monolith” would deliver most of the benefits for much less cost. Companies that make a successful transition to modern enterprise architectures get a few things right. ... Enterprise technology architecture isn’t something that most business leaders have had to think about, but they can’t afford to ignore it any longer. Together with the leaders of the technology function, they need to ask whether they have the right architecture to help them succeed. Building a modern architecture requires ongoing experimentation and a commitment to investment over the long term.


GenAI isn’t just eating software, it’s dining on the future of work

As we step into this transformative era, the concept of “no-collar jobs” takes center stage. Paul introduced this idea in his book “Human + Machine,” where new roles are expected to emerge that don’t fit into the traditional white-collar or blue-collar jobs; instead, it’s giving rise to what he called ‘no-collar jobs.’ These roles defy conventional categories, relying increasingly on digital technologies, AI, and automation to enhance human capabilities. In this emergence of new roles, the only threat is to those “who don’t learn to use the new tools, approaches and technologies in their work.” While this new future involves a transformation of tasks and roles, it does not necessitate jobs disappearing. ... Just as AI has become an integral part of enterprise software today, GenAI will follow suit. In the coming year, we can expect to see established software companies integrating GenAI capabilities into their products. “It will become more common for companies to use generative AI capabilities like Microsoft Dynamics Copilot, Einstein GPT from Salesforce or, GenAI capabilities from ServiceNow or other capabilities that will become natural in how they do things.”


The components of a data mesh architecture

In a monolithic data management approach, technology drives ownership. A single data engineering team typically owns all the data storage, pipelines, testing, and analytics for multiple teams—such as Finance, Sales, etc. In a data mesh architecture, business function drives ownership. The data engineering team still owns a centralized data platform that offers services such as storage, ingestion, analytics, security, and governance. But teams such as Finance and Sales would each own their data and its full lifecycle (e.g. making code changes and maintaining code in production). Moving to a data mesh architecture brings numerous benefits: it removes roadblocks to innovation by creating a self-service model for teams to create new data products; it democratizes data while retaining centralized governance and security controls; and it decreases data project development cycles, saving money and time that can be driven back into the business. Because it’s evolved from previous approaches to data management, data mesh uses many of the same tools and systems that monolithic approaches use, yet exposes these tools in a self-service model combining agility, team ownership, and organizational oversight.
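
A minimal sketch of the ownership split described above: the central platform provides a shared catalog and a governance check, while each domain registers and evolves its own data products. Class names, fields, and storage paths are illustrative assumptions, not a specific data mesh implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str
    owning_domain: str          # e.g. "finance", "sales"
    storage_path: str           # provisioned by the central platform
    schema_version: str
    consumers: list[str] = field(default_factory=list)

# Central platform concerns: a shared catalog and a basic governance rule.
catalog: list[DataProduct] = []

def register(product: DataProduct) -> None:
    assert product.owning_domain, "every product needs a domain owner"
    catalog.append(product)

# Domain teams publish and maintain their own products.
register(DataProduct("monthly_revenue", "finance", "s3://mesh/finance/revenue", "2.1"))
register(DataProduct("pipeline_leads", "sales", "s3://mesh/sales/leads", "1.0"))

print([p.name for p in catalog if p.owning_domain == "finance"])
```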


Six major trends in data engineering

Some modern data warehouse solutions, including Snowflake, allow data providers to seamlessly share data with users by making it available as a feed. This does away with the need for pipelines, as live data is shared in real-time without having to move the data. In this scenario, providers do not have to create APIs or FTPs to share data and there is no need for consumers to create data pipelines to import it. This is especially useful for activities such as data monetisation or company mergers, as well as for sectors such as the supply chain. ... Organisations that use data lakes to store large sets of structured and semi-structured data are now tending to create traditional data warehouses on top of them, thus generating more value. Known as a data lakehouse, this single platform combines the benefits of data lakes and warehouses. It is able to store unstructured data while providing the functionality of a data warehouse, to create a strategic data storage/management system. In addition to providing a data structure optimised for reporting, the data lakehouse provides a governance and administration layer and captures specific domain-related business rules.


From legacy to leading: Embracing digital transformation for future-proof growth

Digital transformation without a clear vision and roadmap is a leading cause of failure. Many businesses adopt change in response to emerging trends and rapid innovation without evaluating their existing systems or business requirements. To avoid such failure, every tech leader must develop a clear vision and a comprehensive roadmap aligned with organizational goals, ensuring each step of the transformation contributes to the overarching vision. ... The rapid pace of technological change often outpaces the availability of skilled professionals, and tech leaders may struggle to find individuals with the right expertise to drive the transformation forward. To address this, businesses should focus on strategic upskilling using IT value propositions and on hiring business-minded technologists. Investing in individual workforce development can also bridge this gap effectively. ... Many organizations grapple with legacy systems and outdated infrastructure that may not seamlessly integrate with modern digital solutions.


7 Software Testing Best Practices You Should Be Talking About

What sets the very best testers apart from the pack is that they never lose sight of why they’re conducting testing in the first place, and that means putting user interest first. These testers understand that testing best practices aren’t simply things to check off a list, but rather steps to take to help deliver a better end product to users. To become such a tester, you need to always consider software from the user’s perspective and take into account how the software needs to work in order to deliver on the promise of helping users do something better, faster and easier in their daily lives. ... In order to keep an eye on the bigger picture and test with the user experience in mind, you need to ask questions, and lots of them. Testers have a reputation for asking questions, and it can come across as an attempt to prove something, but there is an important reason why the best testers ask so many questions.


Why Data Mesh vs. Data Lake Is a Broader Conversation

Most businesses with large volumes of data use a data lake as their central repository to store and manage data from multiple sources. However, the growing volume and varied nature of data in data lakes makes data management challenging, particularly for businesses operating across multiple domains. This is where a data mesh approach can tie into your data management efforts. The data mesh is a distributed, microservices-style approach to data management whereby an organization's data is split into multiple smaller domains, each managed by domain experts. The value of implementing a data mesh for your organization includes simpler management of and faster access to your domain data. By building a data ecosystem that implements a data lake with data mesh thinking in mind, you can grant every domain operating within your business its own product-specific data lake. This product-specific data lake provides cost-effective and scalable storage for housing your data and serving your needs. Additionally, with proper management by domain experts such as data product owners and engineers, your business can serve independent but interoperable data products.
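
One way to picture "a data lake with data mesh thinking" is a shared lake carved into product-specific zones that each domain owns. The following sketch is a hypothetical illustration; the bucket, domain, and product names are made up for the example.

```python
# Each domain gets its own product-specific zone in the shared lake.
LAKE_BUCKET = "s3://company-data-lake"

DOMAINS = {
    "finance": ["invoices", "quarterly_revenue"],
    "sales": ["orders", "pipeline_forecast"],
    "logistics": ["shipments"],
}

def product_path(domain: str, product: str) -> str:
    """Resolve the storage zone a domain team owns for one of its data products."""
    if product not in DOMAINS.get(domain, []):
        raise KeyError(f"{domain} does not own a product named {product}")
    return f"{LAKE_BUCKET}/{domain}/{product}/"

# A consumer looks up the Sales domain's orders product without touching
# any other domain's zone.
print(product_path("sales", "orders"))
```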


The Hidden Costs of Legacy Technology

Maintaining legacy tech can prove every bit as expensive as a digital upgrade, because IT staff have to spend time and money keeping obsolete software functioning. This wastes valuable staff hours that could be channeled into improving products, services, or company systems. A report from Dell estimates that organizations currently allocate 60-80% of their IT budget to maintaining existing on-site hardware and legacy apps, which leaves only 20-40% of the budget for everything else. ... No company can defer upgrading its tech indefinitely: sooner or later, the business will fail as its rivals outpace it. Despite this urgency, many business leaders mistakenly believe that they can afford to defer their tech improvements and rely on dated systems in the meantime. However, this is a misapprehension and can lead to 'technical debt,' the phenomenon in which the use of legacy systems defers short-term costs in favor of long-term losses incurred when reworking the systems later on.



Quote for the day:

"Always remember, your focus determines your reality." -- George Lucas

Daily Tech Digest - May 13, 2023

How to build employee careers through an internal talent marketplace

One of the biggest hurdles to the success of an internal talent marketplace is the reluctance that people managers show when it comes to letting talent go. This is especially true for top talent and individuals they believe to be critical to their success. To overcome this challenge, managers need to be coached to recognize how employing this concept is, in fact, beneficial for the organisation as a whole. Before implementing any such initiative, it is necessary for managers to understand the long-term purpose that an internal marketplace will help serve and how retaining top talent in a different role within the organisation is far more favourable than having them leave altogether. ... It is also the organisation's responsibility to ensure that its employees are provided relevant learning opportunities. The keyword is relevant. Using the information gleaned from regular discussions and performance assessments, managers will be in a strong position to create learning and training initiatives that are aligned with individual and organisational goals. This will provide employees with the necessary impetus to upskill themselves before they apply for any other internal opportunities.


What it Really Takes to Transition from Entrepreneur to CEO

Once you realize you need others to succeed, there's a key step to take next: disconnect emotionally from the business. Of course, you still must care deeply about the business; you just need to realize you and the business are no longer one. This whole idea might sound counterintuitive, but it's important. After all, with most entrepreneurs, your business is an enormous part of your identity. But as you begin to embrace the CEO role, you have to start sharing the business with others for it to grow. Sometimes this is literal — in terms of equity that gets distributed — while other times, it's sharing things like responsibility and key decisions. ... But while the CEO sets the vision, yours is no longer the only one, as it likely was when you were a solitary business owner. As you build a team of strong leaders around you, each of those individuals will have their own opinions about where the company should be headed. The CEO's role is to align your team around a shared purpose, values and mission, but all of you must create this together.


The Business Case For Federated Data Governance & Access Control

Recent MIT-CISR research from Stephanie Woerner and others shows that 51% of enterprises are, to this day, locked in silos, and 21% have a morass of tech debt stitching their companies together. Ross and her co-authors describe a situation where “80% of the company’s programming code (was) dedicated to linking disparate systems as opposed to creating new capabilities.” Scenarios like this are unfortunately common and lead to business architectures that are neither agile nor equipped with the resources and capabilities that enable digital transformation. ... So, is there a better approach? Simply put, yes, but first I want to suggest that we need to consider data governance and access control as a system of systems. This means moving to what Gartner calls ‘Federated Data Governance’, in which universal controls are applied to data by establishing a system of policies and controls. For example, when the finance department needs to control data around the end of a quarter or another specific timeframe, localized controls can and should be put in place.
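
A minimal sketch of that federated idea: a universal rule defined centrally, plus a localized rule owned by the finance domain, with access granted only when every applicable policy agrees. The policy names, user and dataset fields, and the quarter-end window below are hypothetical, chosen only to illustrate the pattern.

```python
from datetime import date

# Universal (centrally defined) rule: only members of a dataset's domain may write to it.
def global_policy(user: dict, dataset: dict, action: str) -> bool:
    return action == "read" or user["domain"] == dataset["domain"]

# Localized rule owned by Finance: freeze writes to finance data during quarter-end close.
def finance_quarter_end_freeze(user: dict, dataset: dict, action: str, today: date) -> bool:
    in_close_window = today.month in {3, 6, 9, 12} and today.day >= 25
    if dataset["domain"] == "finance" and action == "write" and in_close_window:
        return user.get("role") == "finance_controller"
    return True

def is_allowed(user: dict, dataset: dict, action: str, today: date) -> bool:
    # Federated evaluation: every applicable policy, global and local, must agree.
    return global_policy(user, dataset, action) and \
           finance_quarter_end_freeze(user, dataset, action, today)

analyst = {"domain": "finance", "role": "analyst"}
ledger = {"domain": "finance", "name": "general_ledger"}
print(is_allowed(analyst, ledger, "write", date(2025, 6, 28)))  # False during close
```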


Credential Hacking: The Dark Side of Artificial Intelligence

If we take a step back to design a layered defense approach, strong authentication is just one part of a holistic cybersecurity strategy. For an entire security architecture to work effectively, zero trust must be integrated into the whole equation. To that end, beyond authentication there are two additional aspects: attestation and assumed breach. AI helps in both of these areas. In this new cybersecurity normal, breaches are inevitable. This widely accepted truth means that it is not so much a matter of whether you will be breached as it is a matter of detecting, containing and recovering rapidly, so that significant business impact is avoided and cyber resilience is sustained after a breach. Assumed breach requires the continual ingestion and upkeep of cyber threat intelligence so that new IoCs (Indicators of Compromise) and TTPs (Tactics, Techniques and Procedures) can be used to update protective and detective measures, limiting the blast radius of any successful attack and enabling early detection for prompt containment.
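
A simplified sketch of that assumed-breach loop is below: indicators of compromise are continually folded in from a threat intelligence feed and matched against incoming events so containment can start early. The feed format, event fields, and containment action are assumptions made for the example.

```python
KNOWN_BAD_IPS: set[str] = set()
KNOWN_BAD_HASHES: set[str] = set()

def ingest_threat_intel(feed: list[dict]) -> None:
    """Fold new IoCs from a threat intelligence feed into the detection sets."""
    for ioc in feed:
        if ioc["type"] == "ip":
            KNOWN_BAD_IPS.add(ioc["value"])
        elif ioc["type"] == "sha256":
            KNOWN_BAD_HASHES.add(ioc["value"])

def matches_ioc(event: dict) -> bool:
    """Return True if the event matches a known IoC and should trigger containment."""
    return event.get("remote_ip") in KNOWN_BAD_IPS or \
           event.get("file_sha256") in KNOWN_BAD_HASHES

ingest_threat_intel([{"type": "ip", "value": "203.0.113.7"}])
event = {"host": "hr-laptop-12", "remote_ip": "203.0.113.7"}
if matches_ioc(event):
    print(f"Containment: isolate {event['host']} to limit the blast radius")
```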


How the IoT Is Integral to Automated Material Handling

IoT data often goes into cloud-based systems for easier access later. A leader might use such an interface to determine how many more parcels their company shipped after implementing automated material handling. They could also use IoT information to determine whether automation reduced injuries, product damage or other undesirable outcomes. Sometimes, IoT data can automatically correct a system’s processes for better results. Such was the case with one system that used a predictive process adjustment module for automated storage and retrieval. ... If the IoT sensors picked up on something abnormal, the machine would make the necessary changes without human input. This technology is especially convenient for facilities that must meet high output goals and may not have large numbers of on-site team members to correct problems. Any automated material handling strategy should ideally include metrics people choose and follow before, during and after implementation. The IoT can aid people in selecting and monitoring appropriate statistics, thus providing insight into whether things are going well or whether adjustments are needed to get optimal results.
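
The self-correcting behaviour described above reduces, at its simplest, to "if a reading drifts outside the expected band, adjust without waiting for a human." The sketch below illustrates that loop; the sensor, thresholds, and adjustment command are illustrative assumptions, not details from the article.

```python
EXPECTED_BELT_SPEED_MPS = 1.5
TOLERANCE = 0.2

def read_sensor() -> float:
    # Stand-in for a real IoT sensor reading pulled from the cloud platform.
    return 1.9

def adjust_belt_speed(target_mps: float) -> None:
    print(f"Issuing adjustment: set conveyor speed to {target_mps} m/s")

speed = read_sensor()
if abs(speed - EXPECTED_BELT_SPEED_MPS) > TOLERANCE:
    # Abnormal reading: correct automatically and log it for the metrics review.
    adjust_belt_speed(EXPECTED_BELT_SPEED_MPS)
else:
    print("Within tolerance; no action needed")
```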


Creating A Cybersecurity Disaster Recovery Plan

A chain is only as strong as its weakest link, and human error is still one of the leading causes of security incidents. According to the latest research, 82% of cybersecurity breaches are caused by human error, meaning cybersecurity education can eliminate all but the most complex threats. The overwhelming majority of people have good intentions, and so do most employees. However, some still don’t understand that “1234” isn’t a good password or that a Nigerian prince promising them a large sum of money is suspicious. To stay ahead of sloppy password use, organizations should mandate and enforce the use of strong passwords. Typically, a strong password is at least 8-12 characters long and includes a mix of uppercase and lowercase letters, numbers, and special characters. Employees must also update passwords regularly and refrain from reusing them across different accounts or services, and passwords should avoid common words, phrases, or personal information. Additionally, train employees to identify and report suspicious activities.
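
Such a policy is straightforward to enforce in code. The sketch below checks the rules described above; it assumes 12 characters as the minimum (the stricter end of the 8-12 range mentioned) and uses a tiny stand-in list of common passwords, both of which are choices made for the example.

```python
import re

COMMON_PASSWORDS = {"1234", "password", "qwerty", "letmein"}

def is_strong_password(pw: str) -> bool:
    """Apply the policy sketched above: length, character mix, no common words."""
    return (
        len(pw) >= 12
        and re.search(r"[A-Z]", pw) is not None
        and re.search(r"[a-z]", pw) is not None
        and re.search(r"\d", pw) is not None
        and re.search(r"[^A-Za-z0-9]", pw) is not None
        and pw.lower() not in COMMON_PASSWORDS
    )

print(is_strong_password("1234"))              # False
print(is_strong_password("Tr!cky-Otter-42"))   # True
```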


Business automation intensifies as data governance returns

The research indicates that 2023 will be the year of automation, from the use of super-hyped generative artificial intelligence (AI) technologies such as ChatGPT to more traditional business and IT automation. Organisations in the EMEA region are planning to increase their use of automation more than those in North America and Asia-Pacific, according to the research. In terms of specific areas of business applications, customer experience is well to the fore of investment projects: some 43% said they will invest in customer experience software spanning marketing, sales and contact centre management. Stephanie Corby, practice director at TechTarget’s analyst division, Enterprise Strategy Group, says: “CX is a top business driver for enterprise organisations ... but the reality is that most organisations are still in the early stages of CX maturity and strategy. The complexity of CX technology stacks has created integration and adoption challenges that will inevitably drive conversion to platforms.”


Modern Data Management Platforms Are Vital For Solving Modern Data Management Problems

With the growing importance of data, it has also become essential to integrate data security and data protection into the broader security ecosystem for increased insight and responsiveness. The evolving nature of cyber threats makes a proactive approach essential, and Security Information and Event Management (SIEM) systems must be connected so that alerts, events, and audit data can easily be fed to other platforms. This gives security teams greater visibility into anomalies and threats, improving responsiveness and mitigating risk. Ongoing global economic instability means that, across industries, businesses need to improve cost efficiency and optimise budgets. Data can easily become a major cost centre for businesses, yet spend around security needs to increase, especially for mission-critical areas. Intelligent technologies can help businesses reduce the time it takes to protect applications by improving the efficiency of backups and scans, which is a quick and easy way of reducing costs.
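
Feeding audit data from a data protection platform into a SIEM usually amounts to posting structured events to an ingestion endpoint. The sketch below shows a generic version of that; the endpoint URL, token, and event schema are hypothetical, since real SIEMs each define their own ingestion APIs.

```python
import json
from urllib import request

SIEM_ENDPOINT = "https://siem.example.com/collector/events"  # hypothetical collector
SIEM_TOKEN = "REPLACE_ME"

def send_audit_event(event: dict) -> int:
    """POST one JSON audit event to the SIEM and return the HTTP status."""
    body = json.dumps(event).encode("utf-8")
    req = request.Request(
        SIEM_ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {SIEM_TOKEN}",
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

# Example: surface an anomalous backup scan result to the security team.
send_audit_event({
    "source": "backup-platform",
    "type": "anomaly",
    "detail": "unexpected change rate on finance file share",
})
```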


Is the Big Data Lake Era Fading?

Data lakes undoubtedly offer benefits over previous, more traditional approaches to handling data, such as ERP and CRM software. While the previous approach is more like small, self-owned, self-operated stores, data lakes can be compared to Walmart, where all the data can be found in a single place. However, as the technology matures, enterprises are finding that this approach also comes with its own drawbacks. Without proper management, large data lakes can quickly become data swamps: unmanageable pools of dirty data. In fact, there are three areas in which data lakes can fall apart, namely complexity, data quality, and security. Flexibility is one of the biggest advantages of maintaining a data lake, as it is essentially a large dump of raw data in its native format; the data is not stored in a hierarchical structure but in a flat architecture. However, this flexibility also brings added complexity, meaning that talented data scientists and engineers need to trawl through the data to derive value from it. This cannot be done without specialised talent to maintain and operate the lake.


How to Navigate Structured and Unstructured Data as a Healthcare Organization

Unstructured data is immensely valuable to healthcare. “If you approach it from a high level, clinical notes are a glimpse into the physician’s brain,” says Brian Laberge, solution engineer at software and solutions provider Wolters Kluwer. In addition, written notes often capture the severity of a patient’s health condition or nuanced nonclinical social needs far better than highly structured diagnostic codes, he adds. Clinical and administrative staff can easily parse free text for relevant information, such as a diagnosis or a treatment recommendation; the difficulty stems from what comes next. ... Patient-generated health data comes with its own set of concerns. While it may be available in real time from sources such as monitoring devices or digital therapeutics applications, and it may be structured in its own right, most of it is only transferable into EHRs as unstructured summary reports, notes Natalie Schibell, vice president and principal analyst at Forrester. (The same is true of visit summaries that come from urgent care, retail health or telehealth providers not affiliated with a health system.)



Quote for the day:

"Making good decisions is a crucial skill at every level." -- Peter Drucker