
Daily Tech Digest - October 15, 2025


Quote for the day:

"Blessed are those who can give without remembering and take without forgetting." -- Anonymous



One Leader, Two Roles: The CISO-DPO Hybrid Model

The convergence is not without its challenges. The breadth of combined responsibilities could lead to overload and burnout for leaders trying to keep pace with evolving technical threats and fast-changing privacy regulations. In addition, lapses in compliance could lead to hefty penalties for the organization, particularly as regulatory bodies are now penalizing CISOs for faltering in their compliance and reporting efforts, a reminder that continuous learning is not optional but essential. This hybrid role requires people who are multi-skilled and knowledgeable in both domains, a seemingly daunting task. CISOs and DPOs must be viewed as closely associated partners in the compliance journey, not as individuals who create a conflict of interest.  ... A hybrid role enables faster translation of regulatory requirements into security controls, resulting in accelerated compliance efforts and improved resilience overall. An integrated approach is thus far more efficient than individuals operating in silos, such as a DPO having to rely on a CISO who has only an overarching security focus rather than a DPO-specific mandate. By creating an ecosystem where security and privacy reinforce each other, enterprises can foster collaboration and build trust and long-term value in an era of relentless digital risk.


Beyond the Black Box: Building Trust and Governance in the Age of AI

Without enough controls, organizations run the risk of being sanctioned by regulators, losing their reputation, or facing adverse impacts on people and communities. These threats can be managed only by an agile, collaborative AI governance model that prioritizes fairness, accountability, and human rights. ... Organizations must therefore strike a balance between openness and accountability, holding back to protect sensitive assets. This can be achieved by constructing systems that can explain their decisions clearly, keeping track of how models are trained, and making decisions using personal or sensitive data interpretable. ... Methods like adversarial debiasing, sample reweighting, and human evaluators assist in fixing errors prior to their amplification, making sure the results reflect values like justice, equity, and inclusion. ... Privacy-enhancing technologies (PETs) promote the protection of personal data while enabling responsible usage. For example, differential privacy adds a touch of statistical “noise” to keep individual identities hidden. Federated learning enables AI models to learn from data distributed across multiple devices, without needing access to the raw data. ... Compliance must be embedded in the AI lifecycle by means of impact assessments, documentation, and control scaling, especially for high‑risk applications like biometric identification or automated decision‑making.
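As a concrete illustration of the differential-privacy idea mentioned above, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. The function names and parameterization are ours, not from the article; this is one standard way to add the "statistical noise" described:

```python
import math
import random

def laplace_sample(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the result by at most 1. Adding Laplace(1/epsilon) noise
    # therefore gives epsilon-differential privacy; smaller epsilon
    # means stronger privacy and more noise.
    return true_count + laplace_sample(1.0 / epsilon)
```

Individual answers are noisy, but aggregates remain useful: averaging many noisy counts converges on the truth while any single individual's presence stays hidden.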


The rise of purpose-built clouds

The rise of purpose-built clouds is also driving multicloud strategies. Historically, many enterprises have avoided multicloud deployments, citing complexity in managing multiple platforms, compliance challenges, and security concerns. However, as the need for specialized solutions grows, businesses are realizing that a single vendor can’t meet their workload demands. ... Another major reason for purpose-built clouds is data residency and compliance. As regional rules like those in the European Union become stricter, organizations may find that general cloud platforms can create compliance issues. Purpose-built clouds can provide localized options, allowing companies to host workloads on infrastructure that satisfies regulatory standards without losing performance. This is especially critical for industries such as healthcare and financial services that must adhere to strict compliance standards. Purpose-built platforms enable companies to store data locally for compliance reasons and enhance workloads with features such as fraud detection, regulatory reporting, and AI-powered diagnostics. ... The rise of purpose-built clouds signals a philosophical shift in enterprise IT strategies. Instead of generic, one-size-fits-all solutions, organizations now recognize the value in tailoring investments to align directly with business objectives. 


Establishing Visibility and Governance for Your Software Supply Chain

Even if your organization isn’t the direct target, you can fall victim to attackers. A supply chain attack designed to gain access to a bank, for example, could also poison your supply chain. The attackers will gladly take your customer information or hold your servers hostage to ransomware. Modern software supply chains are incredibly complex webs of third-party code. To properly secure the supply chain, organizations must first gain visibility into all of the components that go into their applications. This is necessary not just on a per-application basis, but across the entire portfolio. ... The first step is to start building software bills of materials (SBOMs) at build time. The SBOM records what goes into your software, so it is the foundational piece of building asset visibility. You can then use that information to build a knowledge graph about your supply chain, including vulnerabilities and software licenses. When you aggregate these SBOMs across your entire application portfolio, you get a holistic view of all dependencies. ... One final piece of the puzzle is tracking software provenance. Tracking and gating on software provenance gives you another avenue to protect yourself from vulnerable code. This is often overlooked, but given the prevalence of attacks against open source library repositories, it’s more important than ever. 
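The aggregation step described above can be sketched in a few lines. The snippet below merges per-application SBOMs into one portfolio-wide dependency map; it assumes CycloneDX-style parsed JSON with a top-level `components` list of `name`/`version` entries (the field names reflect that format, and the function name is ours):

```python
from collections import defaultdict

def aggregate_sboms(sbom_docs: dict) -> dict:
    """Merge per-application SBOMs into a portfolio-wide dependency map.

    sbom_docs maps an application name to a parsed SBOM document whose
    CycloneDX-style "components" list holds {"name": ..., "version": ...}.
    Returns {(name, version): set of applications depending on it}.
    """
    portfolio = defaultdict(set)
    for app_name, doc in sbom_docs.items():
        for component in doc.get("components", []):
            key = (component["name"], component.get("version", "unknown"))
            portfolio[key].add(app_name)
    return dict(portfolio)
```

With this holistic view, answering "which applications ship this vulnerable library version?" becomes a single lookup rather than a per-team scramble.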


Avoiding chain of custody crisis: In-house destruction for audit-proof compliance

Chain of custody refers to the documented and unbroken trail of accountability that records the lifecycle of a sensitive asset, from creation and use to final destruction. For data stored on physical media like hard disk drives (HDDs), solid state drives (SSDs), or e-media, maintaining a secure and traceable chain of custody is essential for demonstrating regulatory compliance and ensuring operational integrity. ... With the right high-security equipment, such as NSA-listed paper shredders, hard drive crushers and shredders, and disintegrators, destruction can occur at the point of use – or at least within the facility – under supervision and with real-time documentation. This eliminates transport risks, reduces reliance on third parties, and keeps sensitive data within your organization’s security perimeter. ... Compliance auditors are increasingly looking beyond destruction certificates. They want transparency. That means policies, procedures, logs, and physical proof. With an in-house program, organizations can tailor destruction workflows to meet specific regulatory frameworks, from NIST 800-88 guidelines to DoD or ISO standards. ... High-security data destruction isn’t just about preventing breaches. It’s about instilling confidence both internally with leadership and stakeholders, and externally with regulators and clients. By keeping destruction in-house, organizations send a clear message: data security is non-negotiable.


If Architectures Could Talk, They’d Quote Your Boss

Architecture doesn’t fail in the codebase. It fails in the meeting rooms. In the handoffs. In the silences between teams who don’t talk — or worse, assume they understand each other. The real complexity lives between the lines — not of code, but of communication. And once we stop pretending otherwise, we begin to see that the technical is inseparable from the social. ... There’s a deep irony in the fact that many of us in software come from a binary world — one shaped by certainty, logic, and repeatability. We’re trained to seek out 1s and 0s, true or false, compile or fail. But architecture lives in the fog — in uncertainty, trade-offs, and risk. It’s not a world of 1s and 0s, but of shifting constraints and grey zones. Where engineers long for clarity, architecture demands comfort with ambiguity. Decisions rarely have a single correct answer — they have consequences, compromises, and contexts that evolve over time. It’s a game of incomplete information, where clarity emerges only through conversation, alignment, and compromise. This also explains why so many of our colleagues feel frustrated. Requirements change. Priorities shift. Stakeholders contradict each other. And it’s tempting to see all that as failure. But it’s not failure — it’s the environment. It’s how complex systems grow. Architecture isn’t about eliminating uncertainty. It’s about giving teams just enough structure to move within it with confidence.


CIOs’ AI confidence yet to match results

According to a new survey from AIOps observability provider Riverbed, 88% of technical specialists and business and IT leaders believe their organizations will make good on their AI expectations, despite only 12% currently having AI in enterprise-wide production. Moreover, just one in 10 AI projects has been fully deployed, respondents say, suggesting that enthusiasm is significantly outpacing the ability to deliver. ... One problem with IT leaders’ possible overconfidence about AI expectations is that most organizations have no concrete expectations to begin with, says Warren Wilbee, CTO of supply chain software provider ToolsGroup. “Are the expectations a 10% productivity gain, or a 2% drop in staffing?” he says. “The expectations are ill-defined.” Other AI experts see AI enthusiasm outpacing the difficulties of deploying the technology. In many cases, company leaders underestimate the technology requirements and the compliance and governance demands, says Patrizia Bertini, managing partner at UK IT regulatory advisory firm Aligned Consulting Group. ... Many organizations’ leaders don’t understand the full implications of rolling out and using AI, he says, with many not realizing the extent to which the technology will change the nature of work. Instead of executing tasks, many employees will manage agents that complete those tasks — a seismic shift. “Agentic AI holds enormous potential, but the path to full deployment will take time, requiring effort and investment,” he says. 


What if your privacy tools could learn as they go?

The research explains that traditional local differential privacy methods tend to be conservative because they assume no knowledge about the data. This leads to adding more noise than needed, which harms data utility. The PML approach narrows that gap by making use of whatever knowledge can be safely derived from the data itself. This design shift resonates with challenges seen in industry. ... Beyond the case studies, the research provides a set of mathematical results that can be applied to other privacy settings. It shows how to compute optimal mechanisms under uncertainty, including closed-form solutions for simple binary data and a convex optimization program for more complex datasets. These results mean that privacy engineers could, in theory, design systems that automatically adjust to the available data. The framework explains how to choose privacy parameters to meet a desired balance between protection and accuracy, given a known probability of error. ... This research offers a way to improve one of the biggest tradeoffs in privacy engineering: the loss of utility caused by assuming no prior knowledge about the data-generating process. By allowing systems to safely incorporate limited, empirically derived information, it becomes possible to provide strong privacy guarantees while preserving more data usefulness. The findings also suggest that privacy guarantees do not have to come at such a steep cost to data utility. 
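To make the utility-versus-privacy mechanics concrete, here is a minimal sketch of randomized response, the classic local differential privacy baseline the PML work improves on. The paper's framework is more general; this example only illustrates the baseline trade-off, and the function names and parameterization are ours:

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    # Report the true bit with probability e^eps / (e^eps + 1),
    # otherwise flip it. This satisfies epsilon-local differential
    # privacy: each individual report is plausibly deniable.
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def debias_mean(reports, epsilon: float) -> float:
    # Invert the known flipping probability to recover an unbiased
    # estimate of the true proportion of 1s from the noisy reports.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)
```

The smaller epsilon is, the closer the flip probability gets to a fair coin and the more samples are needed for a usable estimate. That inflation of noise under a worst-case, no-prior-knowledge assumption is exactly the utility cost the research aims to reduce.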


13 cybersecurity myths organizations need to stop believing

Big tech platforms have strong verification that prevents impersonation - Some of the largest tech platforms like to talk about their strong identity checks as a way to stop impersonation. But looking good on paper is one thing, and holding up to the promise in the real world is another. “The truth is that even advanced verification processes can be easily bypassed,” says Ben Colman ... Buying more tools can bolster cybersecurity protection - One of the biggest traps businesses fall into is the assumption that they need more tools and platforms to protect themselves. And once they have those tools, they think they are safe. Organizations are lured into buying products “touted as the silver-bullet solution,” says Ian McShane. “This definitely isn’t the key to success.” Buying more tools doesn’t necessarily improve security because organizations often don’t have a tools problem but an operational one. ... Hiring more people will solve the cybersecurity problem - Professionals who are truly talented and dedicated to security are not that easy to find. So instead of searching for people to hire, businesses should prioritize retaining their cybersecurity professionals. They should invest in them and offer them the chance to gain new skills. “It is better to have a smaller group of highly trained IT professionals to keep an organization safe from cyber threats and attacks, rather than a disparate larger group that isn’t equipped with the right skills,” says McShane.


Where Stale Data Hides Inside Your Architecture (and How to Spot It)

Every system collects stale data over time — that part is obvious. What’s less obvious is how much of it your platform will accumulate and, more importantly, whether it builds up in places it never should. That’s no longer just an operational issue but an architectural one. ... Stale data often hides not in caching itself but in the gaps between cache layers. When application, storefront, and CDN caches don’t align, the system starts serving conflicting versions of the truth, like outdated prices or mismatched product images. ... A clear warning sign that your cache may hide stale data is when problems vanish after cache purges, only to return later. It often means the layers are competing rather than cooperating. ... One of the heaviest anchors for enterprise systems is transactional history that stays in production far longer than it should. Databases are built to serve current workloads, not to carry the full weight of years of completed orders and returns. ... Integrations with legacy systems often look stable because they “just work.” The trouble is that over time, those connections become blind spots. Data is passed along through brittle transformations, copied into staging tables, or synchronized with outdated protocols. ... Preventing stale data requires making freshness an architectural principle. It often starts with centralized cache management, because without a single policy for invalidation and refresh, caches across layers will drift apart.
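The "single policy for invalidation" idea above can be sketched in a few lines: one coordinator fans every purge out to all layers, so the application, storefront, and CDN tiers cannot drift apart and serve conflicting versions of the truth. Class names and the in-memory stores are illustrative; real deployments would wrap Redis, Varnish, a CDN purge API, and so on:

```python
class CacheLayer:
    """One cache tier (application, storefront, or CDN)."""
    def __init__(self, name: str):
        self.name = name
        self.store = {}

    def set(self, key, value):
        self.store[key] = value

    def get(self, key):
        return self.store.get(key)

    def invalidate(self, key):
        self.store.pop(key, None)

class CacheCoordinator:
    """Single point of invalidation: every purge fans out to all layers,
    so no tier can keep serving an outdated version after an update."""
    def __init__(self, layers):
        self.layers = layers

    def invalidate(self, key):
        for layer in self.layers:
            layer.invalidate(key)
```

The design point is that no code path may purge one layer directly; every invalidation goes through the coordinator, which is what keeps the layers cooperating rather than competing.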

Daily Tech Digest - October 18, 2020

How Robotic Process Automation Can Become More Intelligent

Artificial Intelligence (AI) and its constituent disciplines, including Machine Learning (ML) and Natural Language Processing (NLP), give an RPA task the ability to learn and make decisions. Put simply, RPA is for doing; AI is for deciding what should be done. AI makes RPA intelligent. Together, these technologies give rise to Cognitive Automation, which automates many use cases that were simply impossible before. The most recent turning point came when virtualized platforms permitted resources to be added and removed for processes based on their workloads. This let organizations explore opportunities to define their processes with automated rules, and it was from this that Robotic Process Automation emerged. RPA goes a step further by automating monotonous processes so that human intervention is no longer needed. A straightforward application is rule-based responses for certain workflows: once the rules are coded, they need no further intervention, and the RPA handles everything. Organizations have profited from implementing RPA-based solutions and processes, substantially reducing expenses.
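The rule-based workflow idea can be sketched minimally: once the rules are coded, matching work items are handled with no human in the loop. The invoice scenario, thresholds, and names below are illustrative, not from the article:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # when does this rule apply?
    action: Callable[[dict], str]      # what does the bot do?

def process(work_item: dict, rules: List[Rule]) -> List[str]:
    # Fire every rule whose condition matches; no human intervention.
    return [r.action(work_item) for r in rules if r.condition(work_item)]

rules = [
    Rule("auto-approve small invoices",
         lambda w: w["amount"] < 1000,
         lambda w: f"approved:{w['id']}"),
    Rule("escalate large invoices",
         lambda w: w["amount"] >= 1000,
         lambda w: f"escalated:{w['id']}"),
]
```

This is the "doing" half; the AI half described above would sit upstream, classifying documents or extracting fields so that the right rules can fire.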


So You Want to Be a Data Protection Officer?

The GDPR states the Data Protection Officer must be capable of performing their duties independently, and may not be “penalized or dismissed” for performing those duties. (The DPO’s loyalties are to the general public, not the business. The DPO’s salary can be considered a tax for doing business on the internet.) Philip Yannella, a Philadelphia attorney with Ballard Spahr, said: “A Data Protection Officer can’t be fired because of the decisions he or she makes in that role. That spooks some U.S. companies, which are used to employment at will. If a Data Protection Officer is someone within an organization, he or she should be an expert on GDPR and data privacy.” Not having a Data Protection Officer could get quite expensive, resulting in stiff fines on data processors and controllers for noncompliance. Fines are administered by member state supervisory authorities who have received a complaint. Yannella went on to say, “No one yet knows what kind of behavior would trigger a big fine. A lot of companies are waiting to see how this all shakes out and are standing by to see what kinds of companies and activities the EU regulators focus on with early enforcement actions.”


The State of Enterprise Architecture and IT as a Whole

EA is an enterprise-wide, business-driven, holistic practice to help steer companies toward their desired longer-term state and to respond to planned and unplanned business and technology change. Embracing EA Principles is a central part of EA, though rarely adopted. The focus in those early days was reducing complexity by addressing duplication, overlap, and legacy technology. With the line between technology and applications blurring, and application sprawl happening almost everywhere, a focus on rationalizing the application portfolio soon emerged. I would love to say that EA adoption was smooth, but there were many distractions and competing industry trends, everything from ERP to ITIL to innovation. The focus was on delivery and operations, and there was little mindshare for strategic, big-picture, and longer-term thinking. Practitioners were rewarded only for supporting project delivery. Many left the practice. And frankly, a lot of people who didn’t have EA skills were thrust into the role. That further exacerbated adoption challenges and defined the delivery-oriented, technology-focused path EA would follow. It is still dominant today.


Managing and Governing Data in Today’s Cloud-First Financial Services Industry

Artificial intelligence and machine learning technologies have proven to accelerate the ability of banks, insurance companies, and retail brokerages to successfully combat fraud, manage risk, cross-sell and upsell, and provide tailored services to existing clients. To harness the power and potential of these solutions, financial institutions will look to leverage both external data from third-party vendors and partners and in-house data to mine for the best answers and recommendations. Today’s cloud-native and cloud-first solutions offer financial institutions the ability to capture, process, analyze, and leverage the intelligence from this data much faster, more efficiently, and more effectively than trying to do it internally. Improving Customer Experience Through Digital Modernization: Banks and insurance companies have been modernizing and/or replacing legacy core systems, many of which have been around for decades, with cloud-native and cloud-first solutions. These include offerings from organizations like FIS Global, nCino, and my former employer EllieMae in the banking industry, and offerings from Guidewire and DuckCreek for cloud-native policy administration, claims, and underwriting solutions in the insurance sector.


Optimizing Privacy Management through Data Governance

Data governance is responsible for ensuring data assets are of sufficient quality, and that access is managed appropriately to reduce the risk of misuse, theft, or loss. Data governance is also responsible for defining guidelines, policies, and standards for data acquisition, architecture, operations, and retention, among other design topics. In the next blog post, we will discuss further the segregation of duties shown in figure 1; however, at this point it is important to note that modern data governance programs need to take a holistic view to guide the organization to bake quality and privacy controls into the design of products and services. Privacy by design is an important concept to understand and a requirement of modern privacy regulations. At the simplest level it means that processes and products that collect and/or process personal information must be architected and managed in a way that provides appropriate protection, so that individuals are not harmed by the processing of their information or by a privacy breach. Malice is not present in all privacy breaches. Organizations have experienced breaches related to how they managed physical records containing personal information, because staff were not trained to properly handle the information.


The Definitive Guide to Delivering Value from Data using DataOps

The DataOps solution to the hand-over problem is to allow every stakeholder full access to all process phases and tie their success to the overall success of the entire end-to-end process ... Value delivery is a sprint, not a relay. Treat the data-to-value process as a unified team sprint to the finish line (business value) rather than a relay race where specialists pass the baton to each other in order to get to the goal. It is best to have a unified team spanning multiple areas of expertise responsible for overall value delivery instead of single specialized groups responsible for a single process phase. ... A well-architected data infrastructure accelerates delivery times, maximizes output, and empowers individuals. The DevOps movement played an influential role in decoupling and modularizing software infrastructure from single monolithic applications to multiple fail-safe microservices. DataOps aims to bring the same influence to data infrastructure and technology. ... At its core, DataOps aims to promote a culture of trust, empowerment, and efficiency in organizations. A successful DataOps transformation needs strategic buy-in, from C-suite executives to individual contributors.


How do Organizations Choose a Cloud Service Provider? Is it Like an Arranged Marriage?

While not as critical a decision as marriage, most organizations today face a similar trust-based dilemma: which cloud service provider to trust with their data? There is no debate over the clear value drivers for cloud computing: performance, cost, and scalability, to name a few. However, the lack of control and oversight could make organizations hesitant to hand over their most valuable asset, information, to a third party, trusting that it has adequate information protection controls in place. With any trust-based decision, external validation can play an important role. Arranged marriages rely on positive feedback and references, mostly attested by the matchmaker. They also rely on supporting evidence such as corroboration from relatives and more tangible factors such as the education and career history of the potential bride or groom. In the case of cloud service providers, independent validation such as certifications, attestations, or other information protection audits can make or break a deal. The notion of cloud computing may have existed as far back as the 1960s, but cloud services took the form we know today with the launch of services from big players such as Amazon, Google and Microsoft in 2006-2007. 


Professor creates cybersecurity camp to inspire girls to choose STEM careers

The way I got into cybersecurity: I got into cybersecurity, I would say, maybe five years ago. But in the field of IT, I always liked to pull things apart, figure out how they work, and problem solve. I was always in the field of IT. I worked as a programmer at IBM for a couple of years, and then I segued into the academy, because I felt I could be more impactful in front of a classroom. In that IBM programming setting, I noticed I was the only woman, and the only woman of color, in that field. I said, "OK, I need to do something to change this." I went into the academy and said, "Maybe if I was an instructor, I could then empower more young women to go and pursue this field of study." Then, as those five years went on, the cybersecurity discipline really became very hot. And really, it was very, very intriguing how hackers were hacking in and sabotaging systems. Again, it was like a puzzle, problem solving: how can we out-think the hacker, and how can we make things safe? That became very, very intriguing to me. Then I wrote this grant, the GenCyber grant (Dr. Li-Chiou Chen, my chair at the time, recommended that I explore a grant for GenCyber), and I submitted it, and I was shocked that I won the grant.


Germany’s blockchain solution hopes to remedy energy sector limitations

If successfully executed, Morris explained that BMIL could serve as the basis for a wide range of DERs supporting both Germany’s wholesale and retail electricity markets: “This will make it easy, efficient and low cost for any DER in Germany to participate in the energy market. Grid operators and utility providers will also gain access to an untapped decarbonized Germany energy system.” However, technical challenges remain. Mamel from DENA noted that BMIL is a project built around the premise of interoperability — one of blockchain’s greatest challenges to date. While DENA is technology agnostic, Mamel explained that DENA aims to test a solution that will be applicable to the German energy sector, which already consists of a decentralized framework with many industry players using different standards. As such, DENA decided to take an interoperability approach to drive Germany’s energy economy, testing two blockchain development environments in BMIL. Both Ethereum and Substrate, the blockchain-building framework for Polkadot, will be applied, along with different concepts regarding decentralized identity protocols.


How to Overcome the Challenges of Using a Data Vault

Within the data vault approach, there are certain layers of data. These range from the source systems where data originates, to a staging area where data arrives from the source system, modeled according to the original structure, to the core data warehouse, which contains the raw vault, a layer that allows tracing back to the original source system data, and the business vault, a semantic layer where business rules are implemented. Finally, there are data marts, which are structured based on the requirements of the business. For example, there could be a finance data mart or a marketing data mart, holding the relevant data for analysis purposes. Out of these layers, the staging area and the raw vault are best suited to automation. The data vault modeling technique brings ultimate flexibility by separating the business keys, which uniquely identify each business entity and do not change often, from their attributes. This results, as mentioned earlier, in many more data objects being in the model, but it also provides a data model that can be highly responsive to changes, such as the integration of new data sources and business rules.
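The separation of stable business keys from changing attributes can be sketched as a hub/satellite pair, with a hash of the normalized business key as the join key. The table shapes, the MD5 choice, and the customer example are illustrative; real data vault implementations vary:

```python
import hashlib

# Hub: stable business identities. Satellite: attributes that change over time.
hub_customer = {}   # hash_key -> normalized business key
sat_customer = []   # (hash_key, load_ts, attributes)

def hub_hash_key(business_key: str) -> str:
    # Normalize, then hash the business key into a stable surrogate key,
    # so satellite rows can be added freely without ever touching the hub.
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

def load_customer(business_key: str, attributes: dict, load_ts: str) -> str:
    hk = hub_hash_key(business_key)
    hub_customer.setdefault(hk, business_key.strip().upper())
    sat_customer.append((hk, load_ts, attributes))
    return hk
```

New attribute versions or new sources just append satellite rows keyed by the same hash, which is why the model stays responsive to change despite the larger object count.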



Quote for the day:

"The closer you get to excellence in your life, the more friends you'll lose. People love average and despise greatness." -- Tony Gaskins