Showing posts with label cyberpsychology. Show all posts

Daily Tech Digest - October 12, 2025


Quote for the day:

"Trust because you are willing to accept the risk, not because it's safe or certain." -- Anonymous



AI and Data Governance: The Power Duo Reshaping Business Intelligence

Fortunately, the relationship between AI and data governance isn’t one-sided. By leveraging automation, pattern recognition, and real-time analytics, AI enables organizations to manage data quality, compliance, and security more effectively. AI models can identify inaccuracies or inconsistencies, flag anomalies, and automatically correct missing or duplicate records, minimizing the risk of generating misleading results from poor-quality datasets. AI can also track organizational data in real time, ensuring accurate classification of sensitive information, enforcing access controls, and proactively identifying policy violations before they escalate. This approach enables organizations to move away from manual auditing and adopt automated, self-correcting governance workflows. ... To leverage the full potential of the relationship between AI and governance, organizations must establish a continuous feedback loop between their governance frameworks and AI systems. AI shouldn’t function independently; it must be constantly updated and aligned with governance policies to maintain accuracy, transparency, and compliance. One of the best ways to achieve this is by using intelligent data platforms such as Semarchy’s master data management (MDM) and data catalog solutions. These solutions unify and control AI data from a trusted, single source of truth, ensuring consistency across business functions.
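The automated checks described above — flagging duplicate and anomalous records before they poison downstream analytics — can be sketched in a few lines. The record shape, the normalization key, and the z-score rule below are illustrative assumptions, not any vendor's implementation:

```typescript
// Minimal sketch of automated data-quality checks (assumed record shape).
type CustomerRecord = { id: string; email: string; revenue: number };

// Flag duplicate records by a normalized key (here: lowercased, trimmed email).
function findDuplicates(records: CustomerRecord[]): CustomerRecord[] {
  const seen = new Set<string>();
  const dupes: CustomerRecord[] = [];
  for (const r of records) {
    const key = r.email.trim().toLowerCase();
    if (seen.has(key)) dupes.push(r);
    else seen.add(key);
  }
  return dupes;
}

// Flag numeric outliers with a simple z-score rule — a stand-in for the kind of
// anomaly detection an ML model would do.
function findAnomalies(records: CustomerRecord[], threshold = 3): CustomerRecord[] {
  const values = records.map(r => r.revenue);
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance = values.reduce((a, v) => a + Math.pow(v - mean, 2), 0) / values.length;
  const std = Math.sqrt(variance);
  if (std === 0) return [];
  return records.filter(r => Math.abs(r.revenue - mean) / std > threshold);
}
```

In a real governance workflow these flags would feed a remediation queue rather than silently dropping records, so that corrections remain auditable.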


Building cyber resilience in a volatile world

Supply chain attacks show just how fragile the ecosystem can be, given that when one link breaks, the shockwaves ripple across agencies and sectors. That’s why the shift away from outmoded ideas of “prevention” by building walls around environments to a new kind of resilience is so stark. For example, zero trust is no longer optional; it’s the baseline. Verification must be constant, and assumptions about “safe” internal networks belong in the past. Meanwhile, AI governance and quantum-resistant cryptography have jumped from academic conversations to immediate government standards. Institutional muscle is being flexed too.  ... The transformation ahead is as much cultural as technical. Agencies must shift from being static defenders to dynamic operators, and need to be ready to adapt, recover, and press on even as attacks intensify. Cybersecurity is not just another line item in the IT budget, but rather the backbone of national resilience. The ability to keep delivering services, protect citizen trust, and safeguard critical infrastructure is now inseparable from how well agencies manage cyber risk. Resilience is not built by chance. It’s built through strategy, investment, and relentless partnership. It means turning frameworks into live capability, leveraging industry expertise, and embedding a mindset that sees cyber not as a constraint but as a foundation for confidence and continuity.


Fighting Disinformation Demands Confronting Social and Economic Drivers

Moving beyond security theater requires embracing ideological critique as a foundational methodology for information integrity policy research. This means shifting from “how do we stop misinformation?” to “what material and symbolic interests does information serve, and how do power relations shape what counts as legitimate knowledge?” This approach demands examining not just false information, but the entire apparatus through which some beliefs become hegemonic and others verboten. Ideological critique offers three analytical tools absent from current information integrity policy research. First, it provides established scholarly techniques for examining how seemingly neutral technical systems encode worldviews and serve specific class interests. Platform algorithms, content moderation policies, and fact-checking systems all embed assumptions about authority, truth, and social order that more often than not favor existing power arrangements. Second, it offers frameworks for understanding how dominant groups maintain cognitive hegemony: the ability to shape not just what people think, but how they think. Third, it provides tools for analyzing how groups develop counter-hegemonic consciousness, alternative meaning-making systems and their ‘hidden transcripts’. Adopting these techniques can yield better policy responses to disinformation.


Cloud Infrastructure Isn't Dead, It's Just Becoming Invisible

Let's be honest: most cloud platforms are more alike than different. Storage, compute, and networking are commoditized. APIs are standard. Reliability and scalability are expected. Most agree that the cloud itself is no longer a differentiator; it's a utility. That's why the value is moving up the stack. Engineers don't need more IaaS; they need better ways to work with it. They want file systems that feel local, even when they're remote. They want zero-copy collaboration and speed. And they want all of that without worrying about provisioning, syncing, or latency. Today, cloud users are shifting their expectations toward solutions that utilize standard infrastructure such as object storage and virtual servers, yet abstract away the complexity. The appeal is in performance and usability improvements that make infrastructure feel invisible. ... What makes this shift important is that it's rooted in practical need. When you're working with terabytes or petabytes of high-resolution video, training a model on noisy real-world data, or collaborating across time zones on a shared dataset, traditional cloud workflows break down. Downloading files locally isn't scalable, and copying data between environments wastes time and resources. Latency is a momentum killer. This is where invisible infrastructure shines. It doesn't just abstract the cloud; it makes it better suited to the way developers actually build and collaborate today.


The great misalignment in business transformation

It’s easy to point the finger at artificial intelligence (AI) for today’s disruption in the tech workforce. After all, AI is changing how coding, analysis and even project management are done. Entire categories of tasks are being automated. Advocates argue that workers will inevitably be replaced, while critics frame it as the next wave of technological unemployment. Recent surveys have shown that employee optimism is fading. ... The problem is compounded by the emphasis on being “more artistic” or “more technical.” Both approaches miss the mark. Neither artistry for its own sake nor hyper-technical detail guarantees relevance if business problems remain unsolved. The technology industry has always experienced cycles of boom and bust. From the dot-com bubble to the recent AI surge, waves of hiring and layoffs are nothing new. What is new, however, is the growing realization that some jobs may not need to come back at all. ... Analysis without insight devolves into repetitive reporting, adding noise rather than clarity. Creativity without business grounding drifts into theatre, producing workshops and “innovation sessions” that inspire but fail to deliver results. Both are missing the target. Worse still, companies have proven they can operate without many of these roles altogether. The lesson is clear: being more artistic or more technical is not the answer. 


The Architecture Repository: Turning Enterprise Architecture into a Strategic Asset

While the Enterprise Continuum provides the context — a spectrum from generic to organization-specific models — the Architecture Repository provides the structure to store, manage, and evolve those models. ... At the heart of the repository lies the Architecture Metamodel. This is the blueprint for how architectural content is structured, related, and interpreted. It defines the vocabulary, relationships, and rules that govern the creation and classification of artifacts. The metamodel ensures consistency across the repository. Whether you’re modeling business processes, application components, or data flows, the metamodel provides a common language and structure. It’s the foundation for traceability, reuse, and integration. In practice, the metamodel is tailored to the organization’s needs. It reflects the enterprise’s modeling standards, governance policies, and stakeholder requirements. It’s not just a technical artifact — it’s a strategic enabler of clarity and coherence. ... Architecture must respond to real needs. The Architecture Requirements Repository captures all authorized requirements — business drivers, stakeholder concerns, and regulatory mandates — that guide architectural development. ... Architecture is not just about models — it’s about solutions. The Solutions Landscape presents the architectural representation of Solution Building Blocks (SBBs) that support the Architecture Landscape.
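The metamodel's role — a shared vocabulary of artifact kinds plus rules about which relationships between them are legal — can be illustrated with a small validator. The kinds and rules below are hypothetical examples, not TOGAF's normative content metamodel:

```typescript
// Sketch of a metamodel as code: artifact kinds, allowed relationships, and a
// consistency check. All names here are illustrative.
type ArtifactKind = "BusinessProcess" | "ApplicationComponent" | "DataEntity";
type Artifact = { id: string; kind: ArtifactKind };
type Relation = { from: string; to: string; type: "realizes" | "accesses" };

// The metamodel's rules: which relationship types may connect which kinds.
const allowedRelations: Record<Relation["type"], [ArtifactKind, ArtifactKind]> = {
  realizes: ["ApplicationComponent", "BusinessProcess"],
  accesses: ["ApplicationComponent", "DataEntity"],
};

// Validate a set of artifacts and relations against the metamodel.
function validate(artifacts: Artifact[], relations: Relation[]): string[] {
  const byId = new Map(artifacts.map(a => [a.id, a]));
  const errors: string[] = [];
  for (const r of relations) {
    const [fromKind, toKind] = allowedRelations[r.type];
    const from = byId.get(r.from);
    const to = byId.get(r.to);
    if (!from || !to) errors.push(`${r.type}: unknown artifact`);
    else if (from.kind !== fromKind || to.kind !== toKind)
      errors.push(`${r.type} not allowed from ${from.kind} to ${to.kind}`);
  }
  return errors;
}
```

Encoding the rules once and validating every artifact against them is what gives the repository the traceability and consistency the paragraph describes.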


Cyberpsychology’s Influence on Modern Computing

Psychological research on decision making and cognitive processes has been fundamental to understanding perceptions and behavior in the areas of cybersecurity and cyberprivacy. Much of this work focuses on cognitive biases and emotional states, which inform the actions of both users and attackers. ... Both cognition and affect play a role in these phenomena. Specifically, under conditions of diminished information processing—such as in the case of cognitive demands or affective experiences such as a positive mood state—people are less likely to make decisions based on strongly held beliefs. For example, a consumer’s positive emotional state, such as happiness with the Internet, mediates the negative effects of information-collection concerns on their willingness to disclose personal information. Interestingly, cybersecurity experts are as vulnerable to phishing and social engineering attacks as those who are not cybersecurity experts. A deep understanding of the perceptual, cognitive, and emotional mechanisms that result in lapses of judgment or even behavior incongruent with one’s intellectual understanding is vital to minimizing such threats. In addition to cognitive and emotional states, personality models have provided insight into human behavior vis-à-vis technology. The “big five” personality theory, also known as the five-factor model, is a widely accepted framework that has been applied to a broad range of cyber-related behaviors, including cybersecurity.


The Cybersecurity Skills Gap and the Role of Diversity

Cybersecurity is often presented as a technically demanding field, she points out. “This further discourages some women from first entering the industry. For those who have, it’s then about being able to continue growing their careers when they may feel challenged by perceived technical demands,” says Pinkard. And today, cybersecurity is not a purely technical subject. Demand for technical skills will always exist, but the job has changed, says Amanda Finch, CEO, The Chartered Institute for Information Security. ... While the low number of women in cybersecurity is concerning, it’s also important to consider how other types of diversity can help fill the skills gap in the workforce. Inclusion and opportunity are “100% about more than just bringing in more women”: “It's about the different life perspective,” says Pinkard. Those “lived perspectives” are driven by areas such as neurodiversity, ethnic diversity and physical ability diversity, she says. ... Too many companies still treat diversity as a compliance exercise, says Mullins. “When it was no longer a legal requirement in the US, many simply stopped. Others will say, ‘we want more women’, but won’t update their maternity policies and complain that only men apply to their roles. Or they say ‘we want neurodiverse talent’, but resist implementing more flexible working policies to facilitate them.”


Data quality is no longer optional

AI systems can only be as good as the data that feeds them. When information is incomplete, inconsistent or trapped in silos, the insights and predictions those systems produce become unreliable. The risk is not just missed opportunities but strategic missteps that erode customer trust and competitive positioning. ... Companies with a strong digital foundation are already ahead in AI adoption, and those without risk drowning in information while starving their AI models of the clean, reliable inputs they need. But before any organisation can realise AI’s full potential, it must first build a resilient data foundation, and the enterprises that place data quality at the heart of their digital strategy are already seeing measurable gains. By investing in robust governance, integrating AI with data management and removing silos across departments, they create connected teams and more agile operations. ... Raising data quality is not a one-off exercise; it requires a cultural shift that calls for collaboration across IT, operations and business units. Leaders must set clear standards for how data is captured, cleaned and maintained, and champion the idea that every employee is a steward of data integrity. The long-term challenge is to design data architectures that can support scale and complexity and embrace distributed paradigms that support interoperability. These architectures do more than maintain order. 


Shadow AI in Your Systems: How to Detect and Control It

"Shadow AI" is when people in an organization use AI tools like generative models, coding assistants, agentic bots, or third-party LLM services without getting permission from IT or cybersecurity. This is the next step in the evolution of "shadow IT," but the stakes are higher because models can read sensitive text, make API calls on their own, and do automated tasks across systems. Industry definitions and primers say that shadow AI happens when employees use AI apps without official supervision, which can lead to data leaks, privacy issues, and compliance problems. ... Agents that automate web interactions usually need credentials, API keys, or tokens to act on employees' behalf. Agents can get into systems directly if keys are poorly managed or embedded in scripts. ... Telltale signs include outbound traffic to known AI provider endpoints, nonstandard hostname patterns, and unusual POST bodies. Modern proxy and firewall logs often show URLs and headers that reveal which model vendors are being used. Check your web gateway and proxy logs for spikes in API calls and for endpoints you don't recognize. ... Agents often perform many navigations, clicks, and form submissions in a short amount of time, which is different from how people browse. Look for machine-like navigation patterns: intervals that are always the same, or pages that are crawled in tight loops.
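The two log-based detections suggested above — outbound traffic to known AI endpoints, and machine-regular request timing — might look like the following sketch. The hostname list, log shape, and spread threshold are assumptions for illustration, not a complete detection product:

```typescript
// Sketch: flag proxy-log entries that point at AI providers, and detect
// suspiciously uniform request timing that suggests a scripted agent.
type LogEntry = { ts: number; host: string }; // ts in milliseconds

// Illustrative list — in practice this would come from threat-intel feeds.
const AI_HOSTS = ["api.openai.com", "api.anthropic.com", "generativelanguage.googleapis.com"];

function flagAiEndpoints(log: LogEntry[]): LogEntry[] {
  return log.filter(e => AI_HOSTS.some(h => e.host === h || e.host.endsWith("." + h)));
}

// Humans browse with irregular gaps; agents fire requests at near-constant
// intervals. A low relative spread of inter-request gaps is a warning sign.
function looksAutomated(log: LogEntry[], maxRelativeSpread = 0.1): boolean {
  if (log.length < 3) return false;
  const ts = [...log].sort((a, b) => a.ts - b.ts).map(e => e.ts);
  const gaps = ts.slice(1).map((t, i) => t - ts[i]);
  const mean = gaps.reduce((a, b) => a + b, 0) / gaps.length;
  const variance = gaps.reduce((a, g) => a + Math.pow(g - mean, 2), 0) / gaps.length;
  return mean > 0 && Math.sqrt(variance) / mean < maxRelativeSpread;
}
```

Either signal alone yields false positives; in practice you would correlate them with user identity and data volume before escalating.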

Daily Tech Digest - July 04, 2023

Banking Tech Forecast: Cloudy, With a Chance of Cyber Risk

The breakneck pace of adoption has also resulted in a shortage of security experts who understand the overlapping, yet unique needs of the two industries. The cybersecurity sector has faced a shortage of skilled security professionals for most of its existence. Cloud solutions help mitigate this, because security can be integrated into the infrastructure and managed in a centralized place, Betz said. "Even then, financial institutions are still expected to conduct due diligence and oversight of third parties. This ability to evaluate security in a complex environment requires a high level of skill, which will continue to be highly sought-after," she said. Hiring and retraining existing staff to meet the volume of needed workers is a challenge too, compounded by a regulatory landscape that increasingly pushes multi-cloud infrastructure for resiliency, Leach said. This means that a financial institution may be required to support multiple cloud service providers that operate differently and have different approaches to security assets.


Data Mesh: A Pit Stop on the Road to a Data-Centric Culture

The process behind these benefits is also noteworthy. Most importantly, the notion of data decentralization is deceptively simple, and potentially revolutionary. Think of how IT consumerization has upended traditional technology implementation: Where IT specialists once made all the decisions on which tools to buy for business professionals and dictated how all that hardware and software was to be used, those end users now call the shots. They freely buy the devices they want and download the apps they like, then wait for IT to catch up. This provides enormous benefits. With data mesh we’re seeing similar movement toward data democratization. When line-of-business teams and other constituencies within the enterprise gain unprecedented access, and even ownership, of business data that was previously guarded, it accelerates collaboration and enables custom strategies to solve specific business problems. Data access also becomes simpler when interfaces and navigation are not just user-friendly but attuned to the priorities of specific functions, rather than having a more generic or enterprise-wide approach.


5 key mistakes IT leaders make at board meetings

It’s important to avoid speaking technical jargon, but sometimes you’re asked to define a technical term or explain a technology. One approach both Puglisi and I recommend is to answer technical questions with analogies from your industry. We both worked in the construction industry, so, for example, we might help these executives understand Scrum in software development by comparing it to design-build and agile construction project methodologies. ... Sometimes you need a spark to create a sense of urgency, but don’t take this approach too far. I once heard a CISO say, “If you can’t convince the board, then scare them,” which might get a CISO a yes to an investment, but lose credibility over time. CISOs who are natural presenters and storytellers can connect with the board using these skills, but only if given sufficient time to use this approach. If presenting isn’t your best skill, or you only have a few minutes to present, storytelling may confuse directors, says Tony Pietrocola, president and co-founder of AgileBlue. 


Beyond Browsers: The Longterm Future of JavaScript Standards

Developers don’t want to write their code multiple times to run on different server-side runtimes, which they have to do today. It slows down development, increases the maintenance load and may put developers off supporting more platforms like Workers, Snell noted. Library and framework creators in particular are unhappy with that extra work. “They don’t want to go through all that trouble and deal with these different development life cycles on these different runtimes with different schedules and having to maintain these different modules.” That’s a disadvantage for the runtimes and platforms as well as for individual developers wanting to use tools like database drivers that are specifically written for one runtime or another, he pointed out. “We talk to developers who are creating Postgres drivers or MongoDB drivers and they don’t want to rewrite their code to fit our specific API. It would be great if it was an API that worked on all the different platforms.” WinterCG is trying to coordinate what Ehrenberg calls “a version of fetch that makes sense on servers” as well as supporting Web APIs like text encoder and the set timeout APIs in a compatible way across runtimes.
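The portability goal can be illustrated with a module that touches only Web APIs the major server-side runtimes (Node.js 18+, Deno, Bun, Cloudflare Workers) already share: TextEncoder, TextDecoder, setTimeout, and the standard fetch request shape. The payload and endpoint are hypothetical:

```typescript
// Sketch: runtime-agnostic helpers built only on shared Web APIs, so the same
// module works on Node, Deno, Bun, and Workers without per-runtime rewrites.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

// Build a standard fetch RequestInit — no runtime-specific HTTP client needed.
// Usage (assuming a hypothetical endpoint): fetch(url, buildUpload({ a: 1 }))
function buildUpload(payload: object): { method: string; headers: Record<string, string>; body: Uint8Array } {
  return {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: encoder.encode(JSON.stringify(payload)),
  };
}

// setTimeout-based delay — also identical across runtimes.
const delay = (ms: number) => new Promise<void>(res => setTimeout(res, ms));
```

Code written this way is exactly what WinterCG's interoperability work is meant to guarantee: one implementation, many runtimes.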


EU judgment sinks Meta’s argument for targeted ads

Article 6(1)(b) of the GDPR establishes that the practice can be justified only if the contract between the user and the service operator cannot be fulfilled without the data being processed. This contractual necessity for data processing is usually understood rather narrowly. For example, it enables an online retailer to provide a customer’s address to a courier, which is clearly necessary data processing under the terms of the contract between the store and the customer. Meta had relied on Article 6(1)(b) as its main justification for data processing for targeted advertising by claiming that targeted advertising was part of the service it contractually owes its users – a clause it introduced as part of a change to its terms of service (ToS) made at the stroke of midnight on 25 May 2018, which is the precise second the GDPR first came into force. Max Schrems, founder of Austria-based data protection campaign group NOYB, argued that Meta seemed to have taken the view that it could just “add random elements” to the contract, i.e. its ToS, covering personalised advertising, to avoid offering users a yes or no consent option.


Cloud Equity Group’s Sean Frank Talks AI Mingling with the Cloud

What AI is going to allow cloud to do is really kind of predict those changes that are needed. So instead of reacting to slowdowns because you’re running out of memory, you’re running out of processing power -- it could predict when those things are going to happen based on analyzing historical data and then it can allocate the resources dynamically. For a small organization or a small environment, it may not sound like a big task but as you’re thinking about these larger enterprises that might be running hundreds or thousands of servers that are all running different programs and they interact, if one thing goes down it can affect the rest of the ecosystem. It really is a very important aspect in being able to help make sure that everything is up and can be maintained. From a maintenance standpoint as well, it delivers a very significant benefit. Right now, maintenance in general, in IT and cloud-based infrastructure, is largely reactive. There’s a problem, the user reports the problem, and someone from the IT department will then investigate the problem.
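The predict-then-allocate idea can be sketched with a deliberately simple trend forecast: extrapolate recent utilization one step ahead and scale before the capacity ceiling is hit. A real system would use far richer models; the window size and headroom threshold here are illustrative assumptions:

```typescript
// Sketch: forecast the next utilization reading from recent history and decide
// proactively whether to add capacity, instead of reacting after saturation.
function forecastNext(usage: number[], window = 5): number {
  const recent = usage.slice(-window);
  if (recent.length < 2) return recent[recent.length - 1] ?? 0;
  // Average step between consecutive readings, extrapolated one step forward.
  const steps = recent.slice(1).map((v, i) => v - recent[i]);
  const avgStep = steps.reduce((a, b) => a + b, 0) / steps.length;
  return recent[recent.length - 1] + avgStep;
}

// Scale up when the forecast crosses a headroom threshold (e.g. 80% of capacity).
function shouldScaleUp(usage: number[], capacity: number, headroom = 0.8): boolean {
  return forecastNext(usage) > capacity * headroom;
}
```

The point is the control-flow inversion: the trigger fires on the forecast, not on the failure.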


Is Zero Trust Achievable?

Microsegmentation is often the biggest hurdle businesses will face when implementing Zero Trust and many decide to forego it. Because the architecture is now based on security need, it can be complex to implement, particularly over on-premise private networks which tend to be flat and have high levels of implicit trust. Microsegmentation projects can run on for months and failure rates are high. Forrester’s Best Practices for Zero Trust Microsegmentation found that out of the 14 vendors who attempted to microsegment their private networks, 11 failed. Forrester concluded that, in order to succeed, senior management buy-in is needed to oversee the removal of implicit trust between identities. However, there are other issues that can see projects flounder. Many organisations have legacy systems in situ, for instance, that were designed to function on a perimeterised network and so assume trust. Realistically, the business may have to build the ZTNA around these until they are retired or replaced by cloud-based SaaS alternatives, suggests the National Cyber Security Centre in its Zero Trust Architecture Principles.
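The principle microsegmentation enforces — default-deny between workloads, with explicit allow rules replacing implicit trust — reduces to a small policy check. The workload names, ports, and rules below are illustrative, not a real segmentation product:

```typescript
// Sketch of default-deny segmentation: a flow is permitted only if an explicit
// rule allows it; everything else — including "internal" traffic — is denied.
type Rule = { from: string; to: string; port: number };

// Illustrative allow-list for a three-tier app.
const allowList: Rule[] = [
  { from: "web", to: "api", port: 443 },
  { from: "api", to: "db", port: 5432 },
];

function isAllowed(from: string, to: string, port: number): boolean {
  // No implicit trust: only an exact match against an explicit rule permits the flow.
  return allowList.some(r => r.from === from && r.to === to && r.port === port);
}
```

The hard part of real projects is not this check but discovering which flows legacy systems actually need before the default flips to deny — which is why the buy-in Forrester describes matters.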


Data Governance: The Oft-Overlooked Pillar of Strong Data Quality

Ungoverned data typically originates because someone in the organization, such as a data analyst, produces data without establishing strong governance policies to control how the data will be shared, secured, and maintained over time. Data producers may also lack an understanding of the underlying data they are working with and the business rules for aggregating or disseminating various KPIs related to the data. The decision to share data without first sorting out these issues is not usually the result of malfeasance. On the contrary, ungoverned data often emerges because someone in the organization creates data that other people need, and in the rush to ensure that they can start using the data, sharing begins before anyone puts a plan in place for governing the data over the long term. It doesn’t help, either, that it’s common for businesses to have IT solutions in place that make it easy to share data, but not to govern it. The typical IT organization implements software that can store and distribute information across an organization, but the IT department has little or no knowledge of how different business units will use the data.


Why cyberpsychology is such an important part of effective cybersecurity

Cyberpsychologists and enterprise cybersecurity practitioners both stress the need to better understand how people interact with technology to create a stronger cybersecurity posture. They point to statistics showing that most breaches involve some sort of human misstep. Verizon's 2023 Data Breach Investigations Report, for example, found that "74% of all breaches include the human element, with people being involved either via error, privilege misuse, use of stolen credentials or social engineering." As Huffman says, hackers "don't want to go toe-to-toe with your firewall. They don't want to challenge your antivirus, because that's very difficult, not when they can exploit the largest vulnerability on every network on the planet right now -- that's us, people. Cybercriminals are not just hacking computers; they are hacking humans. Because ... unlike computers, we actually respond to propaganda." Psychology gets at why humans do what they do, says Huffman, founder of cybersecurity services firm Handshake Leadership. 


CISA's New 'CyberSentry' Program to Tighten ICS Security

The CyberSentry program is part of the several cybersecurity provisions in the National Defense Authorization Act signed by U.S. President Joe Biden in December 2021. The act provisioned $768 billion in defense spending including the cybersecurity component of various national and federal agencies. The program aims to support CISA's efforts to defend U.S. critical infrastructure network operators that support national critical functions such as power and water supply, banks and financial institutions, and healthcare by monitoring both known and unknown malicious activity affecting IT and OT networks. The CyberSentry program is based on a mutual agreement between CISA and participating critical infrastructure partners. The program is voluntary and is provided at no additional fees or equipment costs to the participating partners. Under the program, CISA harnesses sensitive government information and provides visibility and mitigation of cyber threats targeting critical infrastructure.



Quote for the day:

"You can only lead others where you yourself are willing to go." -- Lachlan McLean