Daily Tech Digest - March 04, 2023

How security leaders can effectively manage Gen Z staff

Gen Z will look for jobs in organizations that share their values, and they are likely to remind their superiors of those values if they find themselves being asked to do something that contradicts them. Be ready for situations like this, and make sure the company’s values aren’t just a marketing creation. Another way to look at this is to proactively recruit individuals whose values resonate with the company’s. All working generations have experienced the pros and cons of working from home, the office or a mix of both. This is unlikely to be a Gen Z-only preference, but younger generations may be more prone to think, “Why do I need to go to a specific location to do a job I can perform from anywhere?” ... The two aspects here are peer training and paid training. Gen Z is eager to learn but also to move forward. While this may not be effective for all roles, it can be a positive in cybersecurity, where attackers and attacks are always evolving fast.


LastPass Hack Highlights Importance of Applicable Acceptable Use Policies

While LastPass has made it clear that several corrective measures have been taken post-incident to prevent similar hacks, the argument persists that this type of exploitation was preventable. Specifically, one control that should be scrutinized is the LastPass Acceptable Use Policy (AUP). These important documents provide employees with a set of rules, set by the company, that explain how employees may access or use corporate networks, devices or data. Many of these policies require that corporate data be accessed and managed only on corporate systems. This specific provision allows the organization to control both physical and logical access to important information, such as business operations and client data. As the business world has shifted to a more distributed and remote configuration, corporate AUPs require additional scrutiny as well. Specifically, companies should take a hard look at the applicability of the Bring Your Own Device (BYOD) mentality and consider the security implications that could emerge through mismanagement.


3 Steps to Unlock the Power of Behavioral Data

In practice, a strong data culture is a “decision culture,” according to McKinsey research: one in which an organization can accelerate the application of advanced analytics, powering improved business performance and decision-making. Furthermore, Forrester found that organizations that use data to derive insights for decision-making are almost three times more likely to achieve double-digit growth. So why is it such a challenge to create this type of culture? ... Data creation is the process of creating high-quality, contextual behavioral data to power AI and other advanced data applications. Instead of working with the data exhaust generated by SaaS applications and black-box analytics tools, data creation allows a choice of metrics that best reflect the organization’s needs. This saves data teams considerable time, as it continuously delivers a highly trusted real-time stream of data that evolves with the business.


5 steps for building a digital transformation-ready enterprise architecture

In a hyper-competitive and increasingly cloud-based business environment, it's clear that digital-first is the only way forward. Of course, the transformation could have been smoother. For most businesses, it's happened in fits and starts—a program written here, a piece of software implemented there. The end result, in many cases, has been a patchwork: out-of-date applications, redundant or overly complicated programs, and generally clogged internal processes. Think of a big, tangled pile of extension cords—it's unclear what goes where, what can be safely removed, what needs replacing, and so forth. These clogged processes present a serious problem for businesses engaged in digital transformation. They can slow down a company's inner workings and, over time, lead to lost productivity and revenue. That's why it's imperative for companies to clear away the cobwebs and redesign their internal processes for maximal productivity—to, in other words, embark on an organization-wide program of enterprise architecture.


Crucial role of data protection in the battle against ransomware

Central to any cybersecurity strategy being developed is the role of the IT infrastructure teams and storage administrators in the secure storage and protection of data. However, formulating and implementing a strategy alone will not be enough; organisations must rigorously test their resiliency plans. It is essential to identify the cracks in the defences as a proactive strategy, even as learnings are applied reactively. A key reason behind the rise of ransomware attacks is that the attack surface, meaning the systems that are accessible and could be compromised, is massive and constantly growing. The larger the enterprise, the larger the attack surface, as there are many vulnerable endpoints and pieces of software in use. Any breach that occurs must thus be quickly contained, and its impact minimised as much as possible. Merely adding more storage to a data centre is not the solution. Enterprises will need to incorporate immutable storage and encryption technology and optimise the recovery process.


US Cybersecurity Strategy Shifts Liability Issues to Vendors

The administration envisions that it will roll out more stringent software development practices, work with vendors to implement them in the software development process and then work with industry and Congress to establish a liability shield for companies that adopt those practices. That process will take well over a year, the senior administration official predicts. Veracode founder and Chief Technology Officer Chris Wysopal says drawing from the NIST Secure Software Development Framework for the safe harbor law is more aspirational than realistic since the liability shield must consider a company's maturity and security posture. Kalember says no current institutions are well positioned to assess compliance with NIST or assign blame after a security incident. "We need a few different levels of what building safe software means," Wysopal tells ISMG. "The SSDF is a good starting point, but I think it does need to be more practical and more basic."


The government cannot win at cyber warfare without the private sector

The Council on Foreign Relations (CFR) recommends “a program of deepening public-private collaboration between the Defense Department (DOD) and the defense industry” to stop these hacks. It suggests this because it recognizes that it is the private sector that owns and operates the networks and systems the problem countries target, while the public sector “lacks the same picture of the threat environment.” The CFR is right. Private-sector actors regularly face hacking attempts and understand that their survival in the marketplace hinges upon addressing them swiftly and efficiently. The government, by contrast, doesn’t recognize many of these threats until they occur. The government has the ability to contract with anyone, so why wouldn’t it choose to work more closely with private companies? Consider the case of the Office of Personnel Management, which suffered the headline-making 2015 hack from China.


Five Factors For Planning A Data Governance Strategy

Effective data governance begins with having a comprehensive record of the data within the organization; however, according to one survey, for two-thirds of organizations, at least half of their data is dark. This dark data represents untapped insights that are not being leveraged by the organization. Also concerning is the fact that this same lack of data quality and availability can result in an estimated 29% of an employee’s time being spent on non-value-added tasks. ... Data democratization can be shaped by AI-enabled governance policies that control access to the cataloged data. This self-service access to data affords a degree of autonomy for users to work with the data—and the insights it can provide—independently, regardless of their position within the organization. The impact of data democratization can be felt across an entire organization. Users are able to access data securely and work with data on their own without being occupied by tasks that produce no benefit to the organization. As a result, the IT department can be available to handle other important tasks.


The Move to Unsupervised Learning: Where We Are Today

In addition to the need for explainability, another significant challenge to the widespread adoption of deep learning is the increasing reliance on labeled data, that is, adding labels to raw data such as text files and images to identify them and provide context that machine learning models can recognize and learn from. Supervised learning has made significant and impressive advances in recent years, demonstrating the ability to learn from massive amounts of labeled data. There is, however, a limit to how much AI can advance using supervised learning alone. In many real-world scenarios, the availability of large amounts of labeled data is a challenge, either due to a lack of resources or the inherent nature of the problem itself. Ensuring class balance in the labeled data presents another challenge: it is often the case that some classes make up a large proportion of the data, while other classes might not be adequately represented. Furthermore, ensuring the trustworthiness of labeled data can present yet another challenge.
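The class-balance issue described above can be made concrete with a quick check of label proportions. The sketch below is a generic illustration; the 10% threshold and the toy labels are assumptions, not drawn from any particular dataset:

```python
from collections import Counter

def class_balance(labels, warn_below=0.10):
    """Return each class's share of the data, flagging under-represented ones."""
    counts = Counter(labels)
    total = sum(counts.values())
    shares = {cls: n / total for cls, n in counts.items()}
    underrepresented = [cls for cls, share in shares.items() if share < warn_below]
    return shares, underrepresented

# A toy labeled dataset: 95 "spam" examples but only 5 "ham" examples.
labels = ["spam"] * 95 + ["ham"] * 5
shares, flagged = class_balance(labels)
# flagged == ["ham"], since ham is only 5% of the data
```

A check like this, run before training, surfaces the imbalance early so the team can gather more examples of the minority class or apply resampling.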


The Biggest Enterprise Architecture Trends in 2023

Most Enterprise Architects endlessly tweak their systems to improve change delivery. As with all things in life, the changes aren't perfect the first time around, and adapting is essential. Each round of change, however small, ultimately improves the system. Many trends overlap, and adapting ways of working ties in with using the social aspects of the architecture described above. Organizations can track the history of change initiatives to see the applications, processes, and information impacted over time. Understanding how the change works gives leaders vital information to make decisions. By tracking people, teams, and departments, organizational and communication pathways become clear. Over time, the tracking shows patterns of where change occurs. When it’s clear where change is happening and failing, the patterns can guide the reorganization of teams. It can also help teams work as independently as possible, improve cross-team coordination, and aid prioritization.



Quote for the day:

"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy

Daily Tech Digest - March 03, 2023

Irish Authorities Levy GDPR Fine in Centric Health Breach

DPC says that while Centric stated in its initial breach notification that 70,000 data subjects were affected by the breach, it only issued notifications to the 2,500 individuals whose data was irretrievably lost in the incident. Besides the inadequate breach communication to affected individuals, the fine levied against Centric also reflects a variety of other GDPR infringements, including "failure to implement technical and organizational measures appropriate to the level of risk" posed to personal and special category data on Centric's server. "The failure to implement the necessary safeguards in an effective manner at the appropriate time led to the possibility of patients' personal data being erroneously disclosed to unauthorized people," the report says. Centric, in a statement provided to Information Security Media Group, says that at the time of the cyberattack, it immediately informed the DPC and cooperated fully with the investigation. "We want to assure our patients that we take our responsibility to protect their data and ensure the security of our IT systems very seriously," Centric says. 


Gitpod flaw shows cloud-based development environments need security assessments

"Many questions remain unanswered with the adoption of cloud-based development environments: What happens if a cloud IDE workspace is infected with malware? What happens when access controls are insufficient and allow cross-user or even cross-organization access to workspaces? What happens when a rogue developer exfiltrates company intellectual property from a cloud-hosted machine outside the visibility of the organization's data loss prevention or endpoint security software?," the Snyk researchers said in their report, which is part of a larger project to investigate the security of CDEs. ... In fact, CDEs are in many ways a big improvement over traditional IDEs: They can eliminate the configuration drift that happens over time with developer workstations/laptops, they can eliminate the dependency collisions that occur when developers work on different projects, and can limit the window for attacks because CDE workspaces run as containers and can be short-lived.


Responsible AI: The research collaboration behind new open-source tools offered by Microsoft

Through its Responsible AI Toolbox, a collection of tools and functionalities designed to help practitioners maximize the benefits of AI systems while mitigating harms, and other efforts for responsible AI, Microsoft offers an alternative: a principled approach to AI development centered around targeted model improvement. Improving models through targeting methods aims to identify solutions tailored to the causes of specific failures. This is a critical part of a model improvement life cycle that not only includes the identification, diagnosis, and mitigation of failures but also the tracking, comparison, and validation of mitigation options. The approach supports practitioners in better addressing failures without introducing new ones or eroding other aspects of model performance. “With targeted model improvement, we’re trying to encourage a more systematic process for improving machine learning in research and practice,” says Besmira Nushi, a Microsoft Principal Researcher involved with the development of tools for supporting responsible AI.


Now Microsoft has a new AI model - Kosmos-1

The researchers also tested how Kosmos-1 performed in the zero-shot Raven IQ test. The results found a "large performance gap between the current model and the average level of adults", but also found that its accuracy showed potential for MLLMs to "perceive abstract conceptual patterns in a nonverbal context" by aligning perception with language models. The research into "web page question answering" is interesting given Microsoft's plan to use Transformer-based language models to make Bing a better rival to Google search. "Web page question answering aims at finding answers to questions from web pages. It requires the model to comprehend both the semantics and the structure of texts. The structure of the web page (such as tables, lists, and HTML layout) plays a key role in how the information is arranged and displayed. The task can help us evaluate our model's ability to understand the semantics and the structure of web pages," the researchers explain.


How AI can improve quality assurance: seven tips

One of the areas where AI is proving its worth for quality assurance is the software development sector. AI seems particularly well-suited to regression testing. That approach requires checking that previously tested versions of software keep working as expected following code modifications. Or AI could help create new test cases. Some AI models can recognise or come up with scenarios without prior exposure to them. If you’re thinking about using AI for testing help, identify which processes typically take humans the longest or where errors happen most often. Then, assess whether AI might avoid some of those issues and speed up the steps testers typically go through when verifying all is well with new software. Also, keep in mind that using AI for software testing works best when you have a large data set. That’s why training your AI models thoroughly is so necessary, and not a step to take hastily.


Edge Computing Eats the Cloud?

Additionally, Sedoshkin says that smartphones are “more compact” than a set of GPUs and peripheral components, which make more sense in an R&D lab environment. He predicts this trend will continue to intensify. “Many real-world applications require the usage of a smartphone anyway, and these devices are capable of running pre-trained neural networks on edge. Smartphone manufacturers will continue increasing computational power and memory capacity on edge devices. However, R&D labs will use specialized hardware for training and testing AI/ML algorithms, and DIY enthusiasts will use specialized lightweight chipsets," Sedoshkin says. In short, there is little to stop the encroachment of edge computing on the cloud’s lofty turf. There isn’t much friction to slow it down, either. “The future of edge computing is an evolving landscape; however, ‘ubiquitous’ is the best word that describes it because it will evolve to be all around us,” Tiwari says. And by ubiquitous, industry watchers say they literally mean everywhere.


4 tips to freshen up your IT resume in 2023

Every IT hiring manager looks for professionals who are passionate about their work. And what better way to show this than by discussing your passion projects? In your resume’s contact information section, add a link to any outside projects you’ve worked on over the years, casual or professional. Remember that these don’t need to be overly complex or high-tech – the point is to show you’re passionate about technology outside of work. Even if your contributions involve small edits or suggestions to other people’s code, include them on your resume. That said, these profiles must be up to date. If you haven’t updated a profile in years, don’t include it. Keeping your IT resume updated and relevant in 2023 is crucial for job seekers in the competitive technology industry. And while many IT professionals get job offers without an optimized resume, an exceptional resume might just be what stands between you and your top-choice company.


The role of human insight in AI-based cybersecurity

Traditional cybersecurity solutions, like secure email gateways (SEGs), rely on pre-defined rules and patterns to identify potential threats. However, these rules and patterns can become outdated quickly, leading to a high rate of false positives and false negatives. Sophisticated phishing attacks can also evade SEG systems as they impersonate known trusted senders or take over accounts. By using RLHF, the model can learn from human feedback and continuously adapt to new threats as they emerge. Enterprise security teams spend as much as 33% of their time dealing with phishing scams. Since traditional cybersecurity solutions often rely on manual processes, this leads to delays in detecting and responding to potential threats. By combining AI and RLHF, teams can better identify potential threats, resulting in up to a 90% reduction in the amount of time needed to identify and react to phishing scams, while also significantly reducing the organization’s risk posture.


Biden's Cybersecurity Strategy Calls for Software Liability, Tighter Critical Infrastructure Security

The requirements will be performance based, adaptable to changing requirements, and focused on driving adoption of secure-by-design principles. "While voluntary approaches to critical infrastructure security have produced meaningful improvements, the lack of mandatory requirements has resulted in inadequate and inconsistent outcomes," the strategy document said. Regulation can also level the playing field in sectors where operators compete with others to underspend on security because there is otherwise no incentive to implement better security. The strategy provides critical infrastructure operators that might not have the financial and technical resources to meet the new requirements with potential new avenues for securing those resources. Joshua Corman, former CISA chief strategist and current vice president of cyber safety at Claroty, says the Biden administration's choice to make critical infrastructure security a priority is an important one.


Interactive Microservices as an Alternative to Micro Front-Ends for Modularizing the UI Layer

Interactive microservices are based on a new type of web API that Qworum defines, the multi-phase web API. What differentiates these APIs from conventional REST or JSON-RPC web APIs is that endpoint calls may involve more than one request-response pair, also called a phase. ... Unbounded composability — Interactive microservices can call other endpoints and even themselves during their execution. The maximum depth of allowed nested calls is unbounded, and each call has a full-page UI at its disposal regardless of nesting depth. This is unlike micro front-ends, which typically cannot be nested beyond 1 or 2 levels at most, because the UI surface area allocated to each micro front-end becomes vanishingly smaller with increasing nesting depth. General applicability — Qworum services are more generally applicable for distributed applications than micro front-ends, as the latter are generally tied to a particular web application (ad hoc micro front-ends), front-end framework (React micro front-ends, Angular micro front-ends etc.) or organisation.



Quote for the day:

"When building a team, I always search first for people who love to win. If I can't find any of those, I look for people who hate to lose." -- H. Ross Perot

Daily Tech Digest - March 02, 2023

Cyberattackers Double Down on Bypassing MFA

MFA flooding, where an attacker will repeatedly attempt to log in using stolen credentials to create a deluge of push notifications, aims to take advantage of users' fatigue with security warnings. "Push notifications are a step up from SMS, but are susceptible to MFA flooding and MFA fatigue attacks, bombarding the victim with notifications in the hope they will click 'Allow' on one of them," Caulfield says. Another popular tactic — the account reset attack — aims to fool tech support into giving attackers control of a targeted account, an approach that led to the successful compromise of the developer Slack channel for Take-Two Interactive's Rockstar Games, the maker of the Grand Theft Auto franchise. "An attacker will compromise a user’s credentials, and then pose as a vendor or IT employee and ask the user for a verification code or to approve an MFA prompt on their phone," says Jordan LaRose, practice director for infrastructure security at NCC Group. "Attackers will often use the information they’ve already compromised as part of the social engineering attack to lull users into a false sense of security."
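On the defensive side, the flooding pattern described above can be detected by counting push prompts per user within a sliding time window. The sketch below is a generic illustration, not any vendor's product; the threshold, window size, and event shape are assumptions to be tuned per environment:

```python
from collections import defaultdict, deque

class PushFloodDetector:
    """Flag users receiving an unusual burst of MFA push prompts."""

    def __init__(self, max_prompts=5, window_seconds=60):
        self.max_prompts = max_prompts
        self.window = window_seconds
        self.prompts = defaultdict(deque)  # user -> prompt timestamps

    def record_prompt(self, user, ts):
        """Record one push prompt; return True if it looks like MFA flooding."""
        q = self.prompts[user]
        q.append(ts)
        # Evict prompts that have fallen out of the sliding window.
        while q and ts - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_prompts

detector = PushFloodDetector()
# Six prompts for one user within 10 seconds: the sixth trips the alarm.
alerts = [detector.record_prompt("alice", t) for t in range(0, 12, 2)]
```

An alert like this would typically feed an automated response, such as temporarily suppressing push approval for that account and requiring a phishing-resistant factor instead.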


Three Trends That Could Impact Data Management In 2023

Like cybercrime, the digitization of the customer experience is almost as old as the computer itself, but it really came into its own in the mobile age. What some call digitization 1.0 was all about “mobile, simplified design and new kinds of applications.” Digitization 2.0 homed in on customer demand—“apps anywhere, anytime, on any interface, and with any method of interaction: voice, social media, chat, texting, wearables, and even when you are sitting in your car.” What I’m calling digitization 3.0 here is a doubling down on both 1.0 and 2.0 to make data even more usable and provide unprecedented access to it. I began this article by highlighting the value of data. I think the companies that can extract even more value from their data while maintaining its security, resiliency and privacy will be the ones that not only survive today’s economic uncertainty but thrive during and after it. The key is contextualizing your data to make it more useful. This starts with the steps I’ve listed in the preceding two sections as a foundation.


Best and worst data breach responses highlight the do's and don'ts of IR

When it comes to data breaches, is there a sliding scale? In other words, if a tiny school district gets hit with a ransomware attack, do we give the IT team a partial pass because they probably lack the resources and skill level of a more tech-savvy company? On the other hand, if a company whose entire business model is based on protecting user passwords gets hacked, do we judge them more harshly? Which brings us to LastPass, which experienced an embarrassing breach that was first announced in August 2022 as simply a minor incident confined to the application development environment. By December that breach had spread to customer data including company names, end-user names, billing addresses, email addresses, telephone numbers, and IP addresses. LastPass gets high marks for transparency. The company continued to issue public updates following the initial August announcement. 


‘Digital twin’ tech is twice as great as the metaverse

A “digital twin” is not an inert model. It’s a personalized, individualized, dynamically evolving digital or virtual model of a physical system. It’s dynamic in the sense that everything that happens to the physical system also happens to the digital twin: repairs, upgrades, damage, aging, etc. Companies are already using “digital twins” for integration, testing, monitoring, simulation, and predictive maintenance on bridges, buildings, wind farms, aircraft and factories. But these are still very early days in the “digital twin” realm. ... A digital twin system has three parts: the physical system, the virtual digital copy of that physical system and a communications channel linking the two. Increasingly, this communication is the relaying of sensor data from the physical system. It’s made from three major technology categories: if you imagine a Venn diagram with “metaverse” technologies in one circle, “IoT” in a second circle and “AI” in the third, “digital twin” technology occupies the overlapping center. Digital twins are different from models or simulations in that they are far more complex and extensive and change with incoming data from the physical twin.
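The three-part structure described above (physical system, virtual copy, communications channel) can be sketched as a toy model. All names here are illustrative assumptions, not any vendor's API; a real twin would sync over a message bus rather than a loop:

```python
class PhysicalPump:
    """Stand-in for a real asset emitting sensor readings."""

    def __init__(self):
        self.hours_run = 0

    def read_sensors(self):
        self.hours_run += 1
        # Toy wear model: vibration grows with hours of operation.
        return {"hours_run": self.hours_run,
                "vibration_mm_s": 0.2 * self.hours_run}

class DigitalTwin:
    """Virtual copy kept in sync by a stream of sensor readings."""

    def __init__(self):
        self.state = {}

    def ingest(self, reading):
        self.state.update(reading)  # the twin evolves with incoming data

    def needs_maintenance(self, vibration_limit=1.0):
        return self.state.get("vibration_mm_s", 0.0) > vibration_limit

# The "communications channel": relaying sensor data from physical to twin.
pump, twin = PhysicalPump(), DigitalTwin()
for _ in range(6):
    twin.ingest(pump.read_sensors())
```

After six readings the twin's mirrored state crosses the vibration limit, so a predictive-maintenance check against the twin flags the physical pump without touching it.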


Backup testing: The why, what, when and how

The aim of all testing is to ensure you can recover data. Those recoveries might be of individual files, volumes, particular datasets – associated with an application, for example, or even an entire site, or several. So, testing has to happen at differing levels of granularity to be effective. That means the differing levels of file, volume, site, and so on, as above. But it also means by workload and system type, such as archive, database, application, virtual machine or discrete systems. At the same time, the backup landscape in an organisation is subject to constant change, as new applications are brought online, and as the location of data changes. This is more the case than ever with the use of the cloud, as applications are developed in increasingly rapid cycles, and by novel methods of deployment such as containers. ... So, it’s likely that testing will take place at different levels of the organisation on a schedule that balances practicality with necessity and importance. Meanwhile, that testing must consider the constantly changing backup landscape.


Considerations for Developing Cybersecurity Awareness Training

Regular, ongoing cybersecurity awareness training is important, and the best time to start is during the new employee onboarding process. This sets the correct expectations in terms of what to do and what not to do before a new employee has access to the enterprise’s information assets or data. ... Enterprises may consider using classroom-based training (physical, virtual or a mixture of both) or a learning management system (LMS) to automate the delivery and tracking of cybersecurity awareness training. There are many online LMS providers, such as Absorb LMS and SAP Litmos, and they provide useful tools for creating online courses, quizzes and surveys. After online courses are created, an enterprise can use the LMS to organize and distribute online courses to its employees as needed. The LMS can also be used to monitor training progress, view analytics and allow employees to provide feedback in order for the enterprise to recalibrate its learning program for maximum impact.


AI and data privacy: protecting information in a new era

First of all, business technology leaders should consider whether they need AI at all, and whether their problems could be solved by more conventional methods. There is nothing worse than the "I want ML/AI solutions in my business, but I don't know what for yet" approach. To introduce AI you need to consider the entire architecture that will build, train and deploy models, and consider how to collect and process large amounts of data. This requires assembling a good team, consisting of people such as data engineers, ML engineers and data scientists. It’s necessary to process large amounts of data and master many tools, so it is not as simple as writing a web application in a standard framework. Tech leaders should also be aware that AI comes with risks. They will need more and more computing resources to build increasingly sophisticated AI platforms. They will need to stay constantly abreast of news from the world of AI, where everything changes rapidly; it may turn out that in six months a much better solution or model for a particular problem has already been created.


Think carefully before considering cloud repatriation

It’s particularly difficult for smaller companies to repatriate, simply because, at their scale, the savings aren’t worth the effort. Why buy real estate and hardware and pay extra salaries only to save a small amount? By contrast, very large companies have the scale to repatriate. But do they want to? “Do Visa, American Express or Goldman Sachs want to be in the IT hardware business?” asks Sample, rhetorically. “Do they want to try to take a modest gain by moving far outside their competency?” Switching can also be complicated when the cost of change isn’t considered part of the calculation. A marginal run-rate savings gained from pulling an application back on-prem may be offset by the cost of change, which includes disrupting the business and missing out on opportunities to do other things, such as upgrading the systems that help generate revenue. A major transition may also cause downtime, sometimes planned and other times unplanned. “A seamless cutover is rarely possible when you’re moving back to private infrastructure,” says Sample.


The High Costs of Going Cloud-Native

When it comes to reasons to move to the public cloud, “saving money” has long since been replaced with “increased agility.” Like the systems vendors they are in the process of displacing, AWS, Microsoft Azure, and Google Cloud have figured out how to maximize their customers’ infrastructure spend, with a little (actually a lot of) help from the unalterable economics of data gravity. ... Titled “Cloud-Native Development Report: The High Cost of Ownership,” the white paper tracks the journey of a hypothetical company as it embarks on a transition to cloud-native computing. ... While many of the technologies at the core of the cloud-native approach, such as Kubernetes, are free and open source, there are substantial costs incurred, predominantly around the infrastructure and the personnel. In particular, the costs of finding IT practitioners with experience in using and deploying cloud-native technologies was among the biggest hurdles identified by the report, which you can read here.


Security automation: 3 priorities for CIOs

To begin with, the CIO has choices to make about the testing approaches that will be deployed. Automation in AppSec can refer to tools and processes, ranging from automated vulnerability scanning (dynamic analysis) and static code analysis to software composition analysis and other types of security testing. The most advanced approaches can take things a step further by combining multiple forms of testing – perhaps augmenting DAST with interactive application security testing (IAST) and software composition analysis (SCA) – into a single scan for a comprehensive analysis of the organization’s security risk posture in a single frame. ... Meanwhile, in workflow terms, IT leaders should use customizable solutions to trigger scans at certain points in the development pipeline or based on a predefined schedule. This will allow CIOs and their teams to coordinate scans at specific times or in response to certain events like deploying new code or detecting a security incident.
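The trigger logic described above, scans launched on a predefined schedule or in response to events such as a new code deployment, can be sketched generically. The event names, interval, and scan callback below are illustrative assumptions, not any particular product's API:

```python
import time

class ScanScheduler:
    """Trigger security scans on a fixed schedule or on pipeline events."""

    TRIGGER_EVENTS = {"code_deployed", "security_incident"}

    def __init__(self, scan_fn, interval_seconds=24 * 3600):
        self.scan_fn = scan_fn
        self.interval = interval_seconds
        self.last_scan = None

    def on_event(self, event, now=None):
        """Event-driven trigger: scan immediately on qualifying events."""
        if event in self.TRIGGER_EVENTS:
            self._scan(time.time() if now is None else now)

    def tick(self, now=None):
        """Schedule-driven trigger: scan if the interval has elapsed."""
        now = time.time() if now is None else now
        if self.last_scan is None or now - self.last_scan >= self.interval:
            self._scan(now)

    def _scan(self, now):
        self.last_scan = now
        self.scan_fn()

runs = []
sched = ScanScheduler(lambda: runs.append("scan"), interval_seconds=100)
sched.tick(now=0)                         # first scheduled scan runs
sched.tick(now=50)                        # within the interval: skipped
sched.on_event("code_deployed", now=60)   # new code deployed: scan runs
sched.tick(now=120)                       # interval reset at 60: skipped
```

In practice the `tick` call would come from a cron-style timer and `on_event` from CI/CD webhooks, with `scan_fn` kicking off the combined DAST/IAST/SCA scan.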



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller

Daily Tech Digest - March 01, 2023

Intel Releases Quantum Software Development Kit Version 1.0 to Grow Developer Ecosystem

The SDK is a customizable and expandable platform that provides greater flexibility when developing quantum applications. It also enables users to compare compiler output files, a standard feature in classical computing development, to discern how well an algorithm is optimized by the compiler. It allows users to see the source code and work at lower levels of abstraction, gaining insight into how a system stores data. Additional features include: Code in familiar patterns - Intel has extended the industry-standard LLVM with quantum extensions and developed a runtime environment modified for quantum computing, and the Intel Quantum Simulator (IQS) provides a state-vector simulation of a universal quantum computer. Efficient execution of hybrid classical-quantum workflows - The compiler extensions allow developers to integrate results from quantum algorithms into their C++ projects, opening the door to the feedback loops needed for hybrid quantum-classical algorithms like the quantum approximate optimization algorithm (QAOA) and the variational quantum eigensolver (VQE).
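
The hybrid feedback loop behind VQE and QAOA can be illustrated with a toy example. This sketch is not Intel SDK code: the "quantum" measurement is replaced by the analytic expectation value of a one-qubit Hamiltonian H = Z for the state Ry(θ)|0⟩, which is cos(θ); a real SDK would obtain that number by running the circuit on a simulator or device, then feed it back to the classical optimizer exactly as below.

```python
# Toy sketch of the hybrid quantum-classical feedback loop in VQE/QAOA.
import math

def expectation(theta):
    """Stand-in for a quantum measurement: <Ry(theta) 0| Z |Ry(theta) 0> = cos(theta)."""
    return math.cos(theta)

def variational_minimize(theta=0.3, lr=0.4, steps=200):
    """Classical optimizer loop: measure, compute gradient, update the parameter."""
    for _ in range(steps):
        grad = -math.sin(theta)   # d/dtheta of cos(theta)
        theta -= lr * grad
    return theta, expectation(theta)

theta, energy = variational_minimize()
print(round(energy, 6))  # converges toward the ground-state energy -1
```

The structure – quantum evaluation inside a classical optimization loop – is what the compiler extensions make efficient, since each iteration crosses the classical-quantum boundary.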


Day in the Life of a Chief Developer Experience Officer (CDXO)

According to Cauduro, the overarching goal is to put himself in the developer’s shoes—he constantly thinks about common developer workflows and considers how to create a seamless experience throughout the entire development life cycle. Next is spreading awareness throughout the company about DX principles and how to increase DX within their offerings. A CDXO will likely answer directly to executive leadership but might interface with many departments. A CDXO may direct teams to construct developer-specific tools, like libraries, documentation, SDKs and self-service environments. “DX is a mindset,” said Cauduro. “The whole company needs to be engaged in it.” “As with any C-level position, your job is on the one hand to make your team’s life easier in any way you can,” said Burazin. “And on the other to convey the developers’ issues or ideas to the company in hopes of nudging the company in the correct direction.”


ChatGPT vs GDPR – what AI chatbots mean for data privacy

As an open tool, the billions of data points ChatGPT is trained on are accessible to malicious actors, who could use this information to carry out any number of targeted attacks. One of the most concerning capabilities of ChatGPT is its potential to create realistic-sounding conversations for use in social engineering and phishing attacks, such as urging victims to click on malicious links, install malware, or give away sensitive information. The tool also opens up opportunities for more sophisticated impersonation attempts, in which the AI is instructed to imitate a victim’s colleague or family member in order to gain trust. Another attack vector might be to use machine learning to generate large volumes of automated, legitimate-looking messages to spam victims and steal personal and financial information. These kinds of attacks can be highly detrimental to businesses. For example, a payroll-diversion Business Email Compromise (BEC) attack, composed of impersonation and social engineering tactics, can have huge financial, operational, and reputational consequences for an organisation.


‘Most web API flaws are missed by standard security tests’

APIs can become less of a liability by including security-focused team members during design, encouraging secure coding, conducting regular security tests, and monitoring programming calls for attacks and misuse. Securing web APIs requires a different approach to classic web application security, according to Ball. “Standard web application tests will result in false-negative findings for web APIs,” he explains. “Tools and techniques that are not calibrated specifically to web APIs will miss nearly all of the common vulnerabilities.” A notable example is a vulnerability in the USPS Informed Visibility API, first reported by security journalist Brian Krebs. The web application was thoroughly tested one month before Krebs reported the data exposure, but during testing, tools like Nessus and HP WebInspect were applied generically to the targets, and a significant web API vulnerability went undetected. This undiscovered flaw in the USPS Informed Visibility API allowed any authenticated user to access the email addresses, usernames, package updates, mailing addresses, and phone numbers associated with 60 million customers.
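
The class of flaw behind exposures like the USPS case is broken object-level authorization (BOLA/IDOR): the API checks that a caller is logged in, but never that the requested record belongs to the caller. The sketch below is a self-contained toy, with hypothetical records and handlers, showing why an API-aware probe catches what a generic scanner does not.

```python
# Toy illustration of a BOLA/IDOR flaw and an API-aware probe for it.
# All records, users, and handlers here are hypothetical.

RECORDS = {1: {"owner": "alice", "email": "alice@example.com"},
           2: {"owner": "bob", "email": "bob@example.com"}}

def get_record(record_id, authenticated_user):
    """Vulnerable handler: verifies the user is logged in, but never
    verifies that the record belongs to that user."""
    if authenticated_user is None:
        return None  # unauthenticated requests are rejected
    return RECORDS.get(record_id)

def bola_probe(fetch, victim_id, attacker):
    """API-aware test: does an authenticated non-owner receive another
    user's data? Generic web scanners rarely perform this check."""
    record = fetch(victim_id, attacker)
    return record is not None and record["owner"] != attacker

print(bola_probe(get_record, victim_id=1, attacker="bob"))  # True => vulnerable
```

A generic scanner sees valid authentication and a 200 response and reports nothing; only a test that knows which object the caller *should* be able to read flags the leak.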


Exploring biometrics within payments

It’s an obvious question, but despite all the potential benefits of adopting biometric security, the technology still features several vulnerabilities and weak points. First, a fingerprint scanner or smartphone camera cannot be relied upon to be available at every transaction. While consumers can use biometric authorization on most mobile devices, desktops still make up a large portion of eCommerce sales. Additionally, companies will need to adopt hardware capable of reading and interpreting this data in order to accept biometric payments. The price of this hardware could be cost-prohibitive, depending on what is needed and how far a company wants to take contactless payments. Finally, we cannot forget the consumer factor. Consumers are more anxious than ever about their privacy and where their personal data goes. Even if biometric scans do not actually save or store their biometric information, many consumers might still refuse to provide these identifiers.


Building resilience in a polycrisis world

Seeing and responding to risk differently first requires leaders to clearly pinpoint where plausible risks could materialize and do the most damage to key operations and services. This can be tricky if companies have traditionally approached risk in a siloed way. Company leaders should spend time with one another to work through “what if?” scenarios, with an eye toward highlighting where exactly in the business a problem or failure would be most catastrophic to customers. ... Now that the executives had their focus – the outcome of getting cash – they could begin looking at all the ways customers do so, including ATMs and the workers who service them, brick-and-mortar banks, and the tech and third parties that help with electronic transfer payments, and could build resilience across all of those functions rather than focusing on individual mechanisms. Prioritization exercises also help leaders tease out false assumptions. Leaders at a UK housing management company had believed that collecting rents via the company’s app was the key to business continuity.


Field-Programmable Qubit Arrays: The Quantum Analog of FPGAs

FPQAs make quantum algorithms more resource-efficient by reducing qubit and gate overhead. The ability to quickly update the qubit layout and connectivity enables rapid testing, benchmarking and optimization of algorithms—in a way, delivering a customized computer for each calculation. One example of how FPQAs can be used to achieve better quantum computing performance is optimization. Many optimization problems can be described mathematically in terms of graphs. The nodes describe the variables in the optimization problem and the edges can represent various relationships between them. For instance, the nodes can describe the potential location of 5G towers, and edges describe pairs of towers that cannot be simultaneously operated without generating interference. In another, more abstract representation, each node can be a stock, and an edge between two nodes denotes that these stocks are correlated. These graphs can be mapped onto analog FPQAs by assigning each node to a qubit and setting the connectivity so that two qubits interact if the corresponding atoms have an edge.
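
The node-to-qubit mapping described above is mechanical enough to sketch. The graph below is a hypothetical 5G-tower interference example in the spirit of the text; real FPQA hardware additionally constrains atom placement so that only atoms within a blockade radius interact, which this sketch ignores.

```python
# Minimal sketch of mapping an optimization graph onto an FPQA-style layout:
# each node becomes a qubit, each edge a required pairwise interaction.

def map_graph_to_qubits(nodes, edges):
    """Assign one qubit index per node and list the required interactions."""
    qubit_of = {node: i for i, node in enumerate(nodes)}
    interactions = [(qubit_of[a], qubit_of[b]) for a, b in edges]
    return qubit_of, interactions

# Hypothetical interference graph: edges join towers that cannot
# operate simultaneously without interfering.
towers = ["t1", "t2", "t3", "t4"]
conflicts = [("t1", "t2"), ("t2", "t3"), ("t3", "t4")]
qubit_of, interactions = map_graph_to_qubits(towers, conflicts)
print(interactions)  # [(0, 1), (1, 2), (2, 3)]
```

Because the layout is reprogrammable, a different problem graph simply yields a different qubit assignment and interaction list, with no hardware redesign.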


CISA director urges tech industry to take responsibility for secure products

Accepting the continued use of unsafe technology products presents a greater risk to the nation than the Chinese spy balloon that was shot down off the coast of South Carolina, and it cannot be allowed to continue, Easterly said. “By design, we’ve normalized the fact that technology products are released to market with dozens, hundreds or thousands of defects — such poor construction would be unacceptable in any critical field,” she said during the address. The burden for cybersecurity has disproportionately been placed on consumers and small organizations, who are least aware of the threats and least able to protect themselves. Easterly said no one would be expected to buy a car that lacked seat belts and air bags as standard features, and nobody should be expected to pay additional money for secure technology products. Government can advance legislation to prevent technology companies from disclaiming liability by establishing higher standards of care, Easterly said.


Cybersecurity in wartime: how Ukraine's infosec community is coping

Defending organizations during an ongoing war put Cossack Labs' cybersecurity experts on an accelerated learning path, says Pilyankevich's colleague Anastasiia Voitova, head of customer solutions. "What I learned is that the priorities are very different from peacetime," she says. "The risks are different; the threats are very different. We have this real enemy. It's not textbook security. No. These are real issues, and we need to build real mitigation to these real issues." One could easily fall into the trap of creating systems that use the highest possible level of security, but Voitova believes this can be a mistake because a system that's too paranoid won't be usable. "This trade-off drama of how to balance security and usability, right now, can cost you even more because if you create a super secure system, but no one will use it, it will lead people to adopt insecure methods," she says. "And if insecure messages are intercepted, people might be injured." Such mistakes are more likely to occur as the war continues and users face prolonged stress and tiredness.


The CIO’s new C-suite mandate

Executives who used to stay in their own lane now find themselves needing closer alignment with one another to manage economic uncertainty, explosive growth, and digital and business transformations, and CIOs have become central figures as business strategists and changemakers. This new C-suite dynamic requires three big shifts to be successful, according to Dan Roberts, CEO of Ouellette & Associates Consulting. CIOs must change the narrative of their relationship with their counterparts, they must prepare their IT teams to deliver on the new narrative, and they must convince the C-suite to share the technology load. It’s a tall order for sure. “I would say just 10% to 15% [of C-suite relationships] are healthy and thriving and are in the trenches together with shared ownership and accountability,” Roberts says. But those CIOs who can look across the enterprise and find new ways to drive revenue or better orchestrate the customer experience and then can communicate and sell their vision to their C-suite counterparts are at the high end of the maturity curve, he adds.



Quote for the day:

"We get our power from the people we lead, not from our stars and our bars." -- J. Stanford

Daily Tech Digest - February 28, 2023

Does the Future of Work include Network as a Service (NaaS)?

NaaS enables companies to implement a network infrastructure that evolves over time, providing the flexibility to adapt as business needs change. With NaaS, companies can focus on business outcomes and service-level objectives for their network and the accessibility required for their community of workers, partners, and customers. NaaS frees organizations from having to keep up with the pace of technology change by relying on the strength and expertise of their implementation partner. NaaS replaces the large upfront capital expenditures that often go into new network infrastructure design, planning, and implementation with a monthly subscription or flexible consumption model, alleviating the financial impact of rebuilding a new workplace environment. NaaS also provides more flexibility by not tying the organization to specific hardware or capital investments that may eventually become obsolete.


How Technical Debt Hampers Modernization Efforts for Organizations

“When you develop an application, you take certain shortcuts for which you're going to have to pay the price back later on,” explains Olivier Gaudin, cofounder and CEO of SonarSource, which develops open-source software for continuous code quality and security. “You accept that your code is not perfect. You know that it will have a certain cost when you come back to it later. It will be a bit more difficult to read, to maintain or to change.” ... Experts note the patience and long-term strategy required to overcome technical debt. “It’s a matter of focusing on longer-term strategy over short-term financial goals,” Orlandini says. “Unfortunately for Southwest, the issues were well-known. However, the business as a whole did not have the will or motivation to invest in fixing it until it was too late. They are an extreme example but serve as a very valid case in point of what can happen if you do not understand the issues and the ultimate repercussions of not investing to avoid a meltdown, in whatever form that would take for each organization.”


Just how big is this new generative AI? Think internet-level disruption

While resources and information availability increased by an unprecedented degree, so too did misinformation, scams, and criminal activity. One of the biggest problems with ChatGPT is that it presents completely wrong information as eloquently and confidently as it presents accurate information. Unless requested, it doesn't provide sources or cite where that information came from. Because it aggregates a tremendous amount of free-form information, it's often impossible to trace how it comes by its knowledge and assertions. This makes it ripe for corruption and gaming. At some point, AI designers will need to open their systems to the broader internet. When they do, oh boy, it's going to be rough. Today, there are entire industries dedicated to manipulating Google search results. I'm often required by clients to put my articles through software applications that weigh each word and phrase against how much Google oomph it produces, and then I'm asked to change what I write to appeal more to the Google algorithms.


Is blockchain really secure? Here are four pressing cyber threats you must consider

Blockchains use consensus protocols to reach agreement among participants when adding a new block. Since there is no central authority, vulnerabilities in consensus protocols can allow attackers to control a blockchain network and dictate its consensus decisions through various attack vectors, such as majority (51%) and selfish-mining attacks. ... The second threat is related to the exposure of sensitive and private data. Blockchains are transparent by design, and participants may share data that attackers can use to infer confidential or sensitive information. As a result, organizations must carefully evaluate their blockchain usage to ensure that only permitted data is shared, without exposing any private or sensitive information. ... Attackers may also compromise private keys to control participants’ accounts and associated assets, using classical information technology methods such as phishing and dictionary attacks, or by exploiting vulnerabilities in blockchain clients’ software.
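
The arithmetic behind the majority (51%) attack is worth seeing. The gambler's-ruin analysis in the Bitcoin whitepaper gives the probability that an attacker holding a fraction q of the hash power ever overtakes an honest chain that is z blocks ahead; the sketch below implements that published formula.

```python
# Attacker catch-up probability from the Bitcoin whitepaper's
# gambler's-ruin analysis (q = attacker's share of hash power,
# z = attacker's deficit in blocks).
import math

def attacker_success(q, z):
    """Probability an attacker starting z blocks behind ever catches up."""
    p = 1.0 - q
    if q >= p:
        return 1.0            # a majority attacker always catches up eventually
    lam = z * (q / p)
    total = 0.0
    for k in range(z + 1):    # Poisson-weighted attacker progress
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        total += poisson * (1.0 - (q / p) ** (z - k))
    return 1.0 - total

print(attacker_success(0.51, 6))          # 1.0: majority hash power wins
print(attacker_success(0.10, 6) < 0.001)  # deep confirmations defeat a minority
```

This is why "51%" is the threshold that matters: below it, each additional confirmation drives the attacker's success probability toward zero exponentially; at or above it, no number of confirmations helps.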


Behaviors To Avoid When Practicing Pair Programming

Despite its popularity, pair programming seems to be a methodology that is not widely adopted by the industry. When it is, what “pair” and “programming” mean may vary with the specific context. Sometimes pair programming is used at specific moments throughout the practitioners’ day to fulfill specific tasks, as reported by Lauren Peate on the podcast Software Engineering Unlocked hosted by Michaela Greiler. But in XP, pair programming is the default approach to developing all aspects of the software. Due to this variation in interpretation, companies that adopt pair programming may harbor misconceptions about how to practice it. Often, this is the root cause of a poor pairing experience. ... Lack of soft (social) skills: the driver-and-navigator style requires the pair to focus on a single problem at once, so the navigator should support and question the driver’s decisions to keep both in sync. When that does not happen, the collaboration session may suffer from a lack of interaction between the pair.


When it comes to network innovation, we must protect the data ‘pipes’

We must conclude that any encrypted information collected by foreign intelligence services will eventually be cracked, given sufficient compute power and time. This is one reason why supercomputers are part of the race for information dominance. At supercomputer scale, the amount of compute is truly calculated in the cost to build and the cost to operate. If you do not have access to cutting-edge chips, you can simply increase the number of compute chips – central processing units, graphics processing units, or other compute units like AI accelerators. It will cost more to build and more electricity to operate, but the compute will be available to the government or corporation that invested in the system. Without a true “zero trust” scheme, any compromise of any node on any network becomes a pivot point for further attacks. The problem with “zero trust” is that to be effective, you need a mature network model that can be secured, not a “growing, organic network” that is adapting rapidly to meet the needs of the user.


Unstructured data and the storage it needs

As we’ve seen, unstructured data is more or less defined by the fact that it is not created by use of a database. More structure may be applied to unstructured data later in its life, but then it becomes something else. ... It’s quite possible to build adequately performing file and object storage on-site using spinning disk. At the capacities needed, HDD is often the most economic option. But advances in flash manufacturing have led to high-capacity solid-state storage becoming available, and storage array makers have started to use it in hardware capable of file and object storage. This is QLC – quad-level cell – flash, which stores four bits per cell to provide higher storage density, and so a lower cost per GB, than any other commercially usable flash today. The trade-off with QLC, however, is that flash endurance can be compromised, so it is better suited to large-capacity, less frequently accessed data. Yet the speed of flash is particularly well suited to unstructured use cases such as analytics, where rapid processing, and therefore rapid I/O, is needed.


The Cybersecurity Hype Cycle of ChatGPT and Synthetic Media

Historically, spearphishing messages have been partially or entirely crafted by people. Synthetic chat makes it possible to automate this process – and highly advanced synthetic chat, like ChatGPT, can make these messages seem just as convincing as, or more convincing than, a human-written message. It also opens the door to automated, interactive malicious communications. With this in mind, threat actors can quickly and cheaply scale formerly high-cost, highly effective approaches like spearphishing. These capabilities could be used to support cybercrime, nation-state operations and more. Advances like ChatGPT may also have a meaningful impact on information operations, which have come to the forefront due to foreign influence in recent US presidential elections. Technologies such as ChatGPT can generate lengthy, realistic content supporting divisive narratives, which could help scale up information operations.


How to de-risk your digital ecosystem

In short, in any de-risking framework, one must assume that the largest source of cyberthreats comes not from someone breaking in, but rather from a door left open for an uninvited guest. Organizations must adapt their mindset, their processes, and their resources accordingly. ... In many organizations, the responsibility for closing risk gaps falls to several leaders, but not to a single point of authority. The failure is understandable as digital ecosystems touch multiple dimensions of an enterprise. But then responsibility for the total risk environment and de-risking is shared — though not necessarily met. A lack of accountability results in a lack of power to act and set de-risking as a priority within the organization. ... Without understanding the context of the business, understanding and remediating risk is difficult to do effectively. For example, an outside vendor can be a potential source of risk but also plays a critical and central role in the business. Resolving and mitigating the issue may require special handling and attention.


Closing the Cybersecurity Talent Gap

Cybersecurity is often viewed as just another technical talent field, yet candidates are expected to possess a wide range of rapidly evolving knowledge and skills. When filling staffing gaps, leaders should examine the skill sets that are missing from their current team, such as creative problem solving, stakeholder communications, buy-in development, and change enablement. “Look for candidates who will help balance out existing team skills as opposed to individuals who match a specific technical qualification,” Glair says. Before hiring can begin, it's necessary to attract suitable candidates. Initial search steps should include website updates and social media posts, Glair says. He also suggests creating an internal “cybersecurity academy” that will build talent from within the organization. “This should include the technical, process, communications, and leadership skills needed to address today’s cybersecurity challenges,” Glair notes. Burnet recommends sponsoring a “sourcing jam.” “That means getting recruiters and/or hiring managers in a room together ... to trawl through their networks and get them to personally reach out.”



Quote for the day:

"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour