Daily Tech Digest - January 20, 2023

Generative AI isn’t about what you think it is

ChatGPT and other generative artificial intelligence (AI) programs like DALL-E are often thought of as a way to get rid of workers, but that isn’t their real strength. What they really do well is improve on the work people turn out. There’s often a conflict between doing something fast and doing it well — a conflict generative AI could end by helping people become better and faster creators. And clearly, if these tools were presented more as assistants rather than as a replacement for people, the blowback we’ve seen (most recently in court) could be tamped down. ... We usually measure productivity as the amount of work done in a given time — without taking into account the quality of that work. Typically, the faster you do something, the lower the quality. Quality in and of itself is an interesting subject. I remember reading the book “Zen and the Art of Motorcycle Maintenance,” which uses storytelling to explain how quality is fluid and depends on the perception of the person observing it. For instance, what’s considered high quality in a sweatshop would be completely unacceptable in a Bentley factory.


Enterprises remain vulnerable through compromised API secrets

While many security teams assign specific entitlements to API keys, tokens, and certificates, the survey discovered that more than 42% do not. That means they’re granting all-or-nothing access to any users bearing these credentials, which, although the path of least resistance in access management, also increases security risk. Corsha’s researchers also found that 50% of respondents have little-to-no visibility into the machines, devices, or services (i.e., clients) that leverage the API tokens, keys, or certificates that their organizations are provisioning. Limited visibility can lead to secrets that are forgotten, neglected, or left behind, making them prime targets for bad actors to exploit undetected by traditional security tools and best practices. Another red flag: although 54% of respondents rotate their secrets at least once a month, 25% admit that they can take as long as a year to rotate secrets. The long-lived, static nature of these bearer secrets makes them prime targets for adversaries, much like the static nature of passwords to online accounts.
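The rotation cadence discussed above lends itself to automation. Below is a minimal sketch, assuming a hypothetical inventory that records when each secret was last rotated, of a check that flags secrets overdue for monthly rotation:

```python
from datetime import datetime, timedelta

# Hypothetical inventory: secret identifier -> timestamp of last rotation.
SECRETS = {
    "api-key-billing": datetime(2023, 1, 5),
    "tls-cert-gateway": datetime(2022, 3, 1),
    "svc-token-reports": datetime(2022, 12, 20),
}

MAX_AGE = timedelta(days=30)  # rotate at least monthly

def overdue_secrets(inventory, now, max_age=MAX_AGE):
    """Return identifiers of secrets whose last rotation exceeds max_age."""
    return sorted(name for name, rotated in inventory.items()
                  if now - rotated > max_age)

print(overdue_secrets(SECRETS, datetime(2023, 1, 20)))
# flags the certificate rotated ten months ago and the month-old token
```

A report like this, run on a schedule, turns the survey's "at least once a month" rotation goal into something a pipeline can enforce rather than a policy teams must remember.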


The essential checklist for effective data democratization

In many cases, only IT has access to data and data intelligence tools in organizations that don’t practice data democratization. So in order to make data accessible to all, new tools and technologies are required. Of course, cost is a big consideration, says Orlandini, as well as deciding where to host the data, and having it available in a fiscally responsible way. An organization might also question if the data should be maintained on-premises due to security concerns in the public cloud. But Kevin Young, senior data and analytics consultant at consulting firm SPR, says organizations can first share data by creating a data lake on object storage services such as Amazon S3 or Google Cloud Storage. ... Most organizations don’t end up with data lakes, says Orlandini. “They have data swamps,” he says. But data lakes aren’t the only option for creating a centralized data repository. Another is through a data fabric, an architecture and set of data services that provide a unified view of an organization’s data, and enable integration from various sources on-premises, in the cloud and on edge devices. A data fabric allows datasets to be combined, without the need to make copies, and can make silos less likely.


Creating Great Psychologically Safe Teams

Conflict avoidance can be corrosive, even deadly, causing teams to miss opportunities and needlessly exposing them to risk. Members might recognize hazards but decline to bring them up, perhaps for fear of being seen as throwing a colleague under the bus… No matter how sensitive the issue or how serious the criticism, members must feel free to voice their thoughts openly—though always constructively—and respond to critical input with curiosity, recognizing that it is a crucial step toward a better solution. Mamoli pointed out that "there is a lot of misunderstanding around psychological safety," saying that "it doesn’t mean we’re super nice to each other and feel comfortable all the time." She explained that the resulting behaviour should be that teams "hold each other accountable" and can safely provide direct feedback saying "this is what I need from you. Or you are not doing this." She said that "this is what we need to remember psychological safety really means."


Big Tech Behind Bars? The UK's Online Safety Bill Explained

One major criticism of the Online Safety Bill is that it poses a threat to freedom of expression due to its potential for censoring legal content. Rights organizations strongly opposed the requirement for tech companies to crack down on content that was harmful but not illegal. An amendment in November 2022 removed mention of "lawful but harmful" content from the text, instead obliging tech companies to introduce more sophisticated filter systems to protect people from exposure to content that could be deemed harmful. Ofcom will ensure platforms are upholding their terms of service. Child safety groups opposed this amendment, claiming that it watered down the bill. But as the most vocal proponents of the bill, their priority remains ensuring that the legislation passes into law. Meanwhile, concerns over censorship continue. An amendment to the bill introduced this week would make sharing videos that showed migrants crossing the Channel between France and the UK in "a positive light" illegal. Tech companies would be required to proactively prevent users from seeing this content.


Quantum Computing Poses a Security Risk: Its Implications Are Unfavorable

The foundation of quantum computing is quantum mechanics, which is fundamentally different from classical computing. Bits are used in traditional computing to process information, and they can only be in one of two states: 0 or 1. Quantum bits, or qubits, which can be in multiple states at once, are used in quantum computing to process data. This enables quantum computers to carry out some computations much more quickly than traditional computers. The potential for quantum computing to defeat many of the encryption algorithms currently in use to safeguard sensitive data is one of its most important implications. Encryption algorithms are made to be hard to crack because they depend on mathematical problems that conventional computers cannot solve in any practical amount of time. Because quantum computers can solve some of these problems far more quickly, encryption could be broken much faster. The security of sensitive data, including financial information, personal information, and secrets of national security, is seriously impacted by this.
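The point about hard mathematical problems can be seen in miniature with RSA, whose security rests on the difficulty of factoring the public modulus. A toy sketch with deliberately tiny primes (real keys use primes hundreds of digits long) shows that recovering the factors of n is all it takes to break the key: trivial here, classically infeasible at real key sizes, but efficient for a quantum computer running Shor's algorithm:

```python
# Toy RSA keypair with deliberately tiny primes; real keys use ~2048-bit moduli.
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent (modular inverse): 2753

message = 65
cipher = pow(message, e, n)    # encrypt with the public key
plain = pow(cipher, d, n)      # decrypt with the private key
assert plain == message

# The attack is simply factoring n: trivial by trial division at this size,
# infeasible classically at real key sizes, efficient with Shor's algorithm.
factor = next(f for f in range(2, n) if n % f == 0)
print(factor, n // factor)     # recovers p and q, and with them the private key
```

Once an attacker has p and q, they can recompute d exactly as the key owner did, which is why a practical factoring speedup breaks the scheme outright.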


Combatting the ongoing issue of cyberattacks on the education sector

The growing threat of cyberattacks has underscored that organisations can no longer depend on conventional perimeter-based defences to protect critical systems and data. New regulations and industry standards are aimed at shifting the cybersecurity paradigm – away from the old mantra of ‘trust but verify’ and instead towards a Zero Trust approach, whereby access to applications and data is denied by default. Threat prevention is achieved by only granting access to networks and workloads utilising policy informed by continuous, contextual, risk-based verification across users and their associated devices. There are many starting points on the path to Zero Trust. However, one driving principle to determine your priority of implementation should be the knowledge that the easiest way for cyberattackers to gain access to sensitive data is by compromising a user’s identity. ... Furthermore, post-mortem analysis has repeatedly found that compromised credentials are subsequently used to establish a beachhead on an end-user endpoint, which typically serves as the main point of access to an enterprise network.
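The deny-by-default model described above can be sketched as a policy function: access is granted only when every contextual signal passes, and any missing or unrecognized signal falls through to a deny. The signal names below are hypothetical illustrations, not any particular vendor's schema:

```python
def authorize(request):
    """Deny-by-default: grant access only when every risk signal checks out."""
    checks = [
        request.get("identity_verified") is True,  # e.g. MFA-backed identity
        request.get("device_compliant") is True,   # managed, patched device
        request.get("risk_score", 1.0) < 0.5,      # continuous risk evaluation;
                                                   # absent score defaults to max risk
    ]
    return all(checks)  # any missing or failing signal means deny

print(authorize({"identity_verified": True, "device_compliant": True,
                 "risk_score": 0.2}))   # granted: every signal passes
print(authorize({"identity_verified": True}))  # denied: no device posture signal
```

The key design choice is that the defaults are hostile: a request that omits a signal is treated as failing it, which is the inversion of the perimeter model's implicit trust.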


How A Company’s Philosophy To ‘Shift Left’ Is Making Headway In The Data Privacy World

Whether privacy sits within legal, security, or both is less important than ensuring your privacy team is well-resourced and able to collaborate with the organization as a whole. Key to this collaboration is making sure you have the necessary legal and engineering staff to conduct privacy reviews and navigate a rapidly evolving regulatory landscape. Separately, you need to overcome the perception that privacy is an obstacle to productivity and get your product and growth teams to see privacy as a competitive advantage that allows them to build quickly and win consumer trust. Otherwise, pushback, low adoption, and apathy will prevent you from making any real progress. To unify product development with privacy standards, you have to make it impossibly easy for product teams to comply with privacy standards. That means bringing the privacy program directly into their process, right where they are already working, as well as giving them easy-to-understand guardrails that let them build quickly, without having to engage in a painful back and forth with the privacy lawyers and engineers conducting privacy reviews.


Managing Expectations in Low-Code/No-Code Strategies

To guarantee a LC/NC strategy is successful, organizations must ensure there is a bulletproof infrastructure, data governance and security system in place, as well as full visibility into their data and applications. “As a first step, enterprises must gain an understanding of their data -- what it is, where it is and what it’s worth,” Mohan says. “From there, IT leaders can understand where security and compliance vulnerabilities lie and then work to eliminate these threats while ensuring sufficient oversight for potential legal and contractual issues.” While the responsibility of developing a LC/NC strategy falls, initially, on an enterprise’s CTO or CIO, Mohan advises tech leadership to loop in experts in data security, data protection and governance to address cyber and compliance threats and ensure employees are following proper company and legal protocols. ... “Every level of leadership can decide to use a low-code/no-code strategy, ranging from an engineering team manager who is tasked with building products for the company, to a CTO setting the strategic direction of the organization's engineering efforts,” he explains.


Attackers Crafted Custom Malware for Fortinet Zero-Day

The BoldMove backdoor, written in C, comes in two flavors: a Windows version and a Linux version that the threat actor appears to have customized for FortiOS, Mandiant said. When executed, the Linux version of the malware first attempts to connect to a hardcoded command-and-control (C2) server. If successful, BoldMove collects information about the system on which it has landed and relays it to the C2. The C2 server then relays instructions to the malware, ending with the threat actor gaining full remote control of the affected FortiOS device. Ben Read, director of cyber-espionage analysis at Mandiant, says some of the core functions of the malware, such as its ability to download additional files or open a reverse shell, are fairly typical of this type of malware. But the customized Linux version of BoldMove also includes capabilities to manipulate specific features of FortiOS. "The implementation of these features shows an in-depth knowledge of the functioning of Fortinet devices," Read says. "Also notable is that some of the Linux variant's features appear to have been rewritten to run on lower-powered devices."



Quote for the day:

"It is the responsibility of leadership to provide opportunity, and the responsibility of individuals to contribute." -- William Pollard

Daily Tech Digest - January 19, 2023

Security risks of ChatGPT and other AI text generators

Yet ChatGPT is likely just the beginning of AI-powered cybercrime. Over the next five years, future iterations of AI will indeed change the game for cybersecurity attackers and defenders, argues a research paper entitled "The security threat of AI-powered cyberattacks" released in mid-December 2022 by Traficom, the Finnish government's transportation and communications agency. "Current rapid progress in AI research, coupled with the numerous new applications it enables, leads us to believe that AI techniques will soon be used to support more of the steps typically used during cyberattacks," says Traficom. "We predict that AI-enabled attacks will become more widespread among less skilled attackers in the next five years. As conventional cyberattacks will become obsolete, AI technologies, skills and tools will become more available and affordable, incentivizing attackers to make use of AI-enabled cyberattacks." The paper says while AI cannot help with all aspects of a cyberattack, it will boost attackers' "speed, scale, coverage and sophistication" by automating repetitive tasks.


Why Innovation Depends on Intellectual Honesty

Anxious teams score high on intellectual honesty and moderate to low on psychological safety. Team members are encouraged to be brutally honest because it’s better to be right, and win, than it is to be nice. To return to Steve Jobs: He famously described his approach as being designed to keep “the B players, the bozos, from larding the organization. Only the A players survive.” Just as famously, he cared little for creating social cohesion. Apple’s former chief design officer, Jony Ive, has described a conversation during which Jobs berated him for wanting to be liked by his team at the expense of being completely honest about the quality of their work. This example illustrates two types of conflict that emerge from intellectual honesty: task conflict and relationship conflict. Task conflict — disagreement about the work — can be highly productive for innovation and team performance. But relationship conflict, which arises when the way someone says or does something makes people feel rejected, is detrimental. Here’s why. On teams that have an anxious culture, people are willing to push one another to learn through disagreement.


8 ‘future of work’ mistakes IT leaders must avoid

Virtual reality is one technology that could have an impact on the future of work, and some IT leaders are considering the benefits. Oculus headsets from Meta, for example, are being rolled out on a trial basis at the University of Phoenix, which has made the decision to go fully remote. This was a big mindset change for Smith, who felt pre-pandemic that “face-to-face collaboration was better and high fidelity for creativity purposes,’’ he says. “Then, when everything shifted to full-time remote, it went against my core beliefs, so personally, I had to lean in.” Smith has come to realize that staying remote has not affected IT’s ability to collaborate and teams have been able to remain productive and launch “complex new products into the marketplace.” He says that working remotely has increased his ability to access tech talent outside of the Phoenix area. But when people were working in a hybrid model early on, there would be multiple conversations going on, and “people on the remote end were getting the short end of the stick” because they “couldn’t get a word in edgewise,’’ Smith recalls.


Top intelligent automation trends to watch in 2023

Automation technology will be key in automating previously inflexible processes whilst providing intelligent data-led nudges that help agents work efficiently in a complex operating environment. This means that companies can offer an unprecedented level of flexibility and support to their staff, while making significant improvements to engagement and wellbeing. By improving engagement between employees and employers – and fostering a culture of support and encouragement – everyone benefits. ... Since machine learning (ML) rose to significance a decade or so ago, it has rapidly transformed nearly every industry. Businesses would be wise to sharpen their skills and learn what ML has to offer. Whilst technologies in the past only processed static, historical data, ML provides a real-time capability that closes that gap. It can help organisations become better at predicting flows and responding to them proactively rather than reactively. The potential improvement to areas such as customer service is enormous. Solutions can leverage “productionising” ML models – by which a model is transformed into a scalable, observable, mission-critical, production-ready software solution – at their core.


What kind of future will AI bring enterprise IT?

The incremental approach turns out to be the smartest way to build with AI/ML. As AWS Serverless Hero Ben Kehoe argues, “When people imagine integrating AI … into software development (or any other process), they tend to be overly optimistic.” A key failing, he stresses, is belief in AI/ML’s potential to think without a commensurate ability to fully trust its results: “A lot of the AI takes I see assert that AI will be able to assume the entire responsibility for a given task for a person, and implicitly assume that the person’s accountability for the task will just sort of … evaporate?” In the real world, developers (or others) have to take responsibility for outcomes. If you’re using GitHub Copilot, for example, you’re still responsible for the code, no matter how it was written. If the code ends up buggy, it won’t work to blame the AI. The person with the paystub will bear the blame, and if they can’t verify how they arrived at a result, well, they’re likely to scrap the AI model before they’ll give up their job. This is not to say that AI and ML don’t have a place in software development or other areas of the enterprise.


How CISOs can manage the cybersecurity of high-level executives

The risk faced by executives has grown rapidly as the pandemic-driven rise of hybrid work increased the blurring of professional and personal digital lives. Complex geopolitical tensions, opportunities for digital activism against corporates—particularly in industries with higher risk profiles—and the prospect of financial gain from targeting wealthy leaders have all raised the stakes on the personal digital lives of executives. A large organization, especially if it's a publicly listed company with a C-suite leadership team that has a presence in the media and on social media, can be a lightning rod for the attention of bad actors, says Gergana Winzer, partner of cyber services with KPMG Australia. “Some of these small-time criminals have awakened to the reality of being able to make monetary returns by utilizing easy-to-buy malware or ransomware online and just deploying it across those types of high-net-worth individuals,” Winzer says. This class of personal risks can take many different forms, according to Pierson, who says one of the biggest risks is to intellectual property—the loss of corporate documents from executives’ personal devices or personal accounts where there are fewer or no controls.


Taking the Reins on IT Interoperability

Interoperability can be elusive because many organizations embark on tactical changes or fail to see the complete picture, Barnett says. “In many cases, they focus only on a part of the organization without fully understanding the impact on technology investments, process reengineering, and human capital assets,” he explains. The intersection of operational technology (OT) and information technology (IT) can prove particularly nettlesome. Historically, these two entities have operated separately, with attempts to connect systems and data an afterthought. “This often leads to the creation of data silos … that hinder agility, reduce productivity, impede customer experience improvements, and hamper scalability,” Barnett says. Business and IT leaders who ignore these problems do so at their own peril. Accenture found that 66% of organizations struggle with the sheer number of applications. This results in technical debt and a loss of agility, McKillips says. In addition, 60% are unable to align their application strategy with overall business goals and 44% struggle to identify the right business case or ROI. Remarkably, 34% believe interoperability is simply too expensive.


ICS Confronted by Attackers Armed With New Motives, Tactics, and Malware

The report identified top trends in the ICS threat landscape based on a compilation of information from various sources including open source media, CISA ICS-CERT advisories, and Nozomi Networks telemetry, as well as on exclusive IoT honeypots that Nozomi researchers employ for "a deeper insight into how adversaries are targeting OT and IoT, furthering the understanding of malicious botnets that attempt to access these systems," Gordon says. What researchers observed over the last six months was a significant uptick in attacks that caused disruption to a number of industries, with transportation and healthcare being among the top new sectors finding themselves in the crosshairs of adversaries among more traditional targets. Attackers are using various methods of initial entry to ICS networks, although some common weak security links that have historically plagued not just ICS but the entire enterprise IT sector — weak/cleartext passwords and weak encryption — continue to be the top access threats. Still, “Root” and “admin” credentials are most often used as a way for threat actors to gain initial access and escalate privileges once in the network, the findings show.


Cybersecurity CTO: A day in the life

Given the scope of the job, a CTO is rarely going to have a consistent daily schedule. Instead, goals and cadences are established weekly. That being said, I do go into the office every day. My typical workday begins at 9:30 a.m., and I take an electric scooter to get into the office. Our headquarters are located in Tel Aviv, so the weather is almost always perfect for the scooter. On a weekly basis, I hold one-on-one meetings with specific managers to understand team needs, review KPIs to ensure they’re being met, and review our proof-of-concept (POC) projects to ensure our customers and potential customers are advancing. These POC reviews are where we often catch technical issues, allowing us to fix them before they cause problems for our customers. While I’m responsible for several employees within our R&D department, I do my best to distance myself and empower our VP of R&D to manage the team. The goal is quality – not getting bogged down in how or when people work. I usually wrap up my time in the office around 6:30 or 7:00 p.m. 


Proven Solutions to Five Test Automation Issues

When you run your automated tests, you need the dependent systems to support your test scenarios. That includes setting up the API and service responses to match what is needed for your test cases. Setting up test data in backends might be problematic, as they might not be within your team’s control. Relying on another team to set up the test data for you means you may end up with incorrect or missing test data and, therefore, cannot continue working on or running your automated tests. Another issue is that even if you have the test data, running your automated tests frequently in the build pipeline might use it all up (test data burning). Then you need a test data refresh, which might take even longer than the partial test data setup, and you are blocked again. Even if you have all the test data you need, when you (or some other team) run their automated or manual tests against the same services, the test data might change (for example, the balance on account or list of purchased items by a user). The tests break again because of test data issues rather than actual issues with the product.
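One common way around the shared-backend problem described above is to stub the dependent service so each test controls its own responses and no shared test data gets burned. Here is a minimal sketch using Python's unittest.mock; the client and endpoint names are hypothetical:

```python
from unittest.mock import Mock

def get_account_balance(account_id, client):
    """Production code path: queries a backend service the team doesn't control."""
    return client.get(f"/accounts/{account_id}/balance")["amount"]

# The test replaces the real client with a mock that returns exactly the
# data this scenario needs: no shared backend, no test-data burning, and
# no breakage when another team mutates the account's real balance.
stub = Mock()
stub.get.return_value = {"amount": 150.0}

assert get_account_balance("acct-42", stub) == 150.0
stub.get.assert_called_once_with("/accounts/acct-42/balance")
print("balance lookup test passed")
```

Stubbing trades end-to-end fidelity for repeatability, so teams typically keep a small set of integration tests against real backends alongside the fast, self-contained ones.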



Quote for the day:

"People don't resist change. They resist being changed." -- Peter M. Senge

Daily Tech Digest - January 18, 2023

How Will Cloud Computing Make Drone-Based Solutions Smarter

Due to its multi-sector application, cloud-processed data becomes a valuable resource. For governments and enterprises, it can become a viable source of revenue. As new urban and rural projects are commissioned, these high-resolution datasets are crucial for the planning process. It is useful in satisfying several government schemes such as PM Gram Sadak Yojna, PM Awas Yojna, Bharat FiberNet, and many more. For instance, SVAMITVA data along with DEM layers can help officials chart out the most optimum route of power lines for rural electrification. Similarly, digital terrain maps can help ascertain the natural slopes and assist engineers in designing efficient gravity-aided sewage networks. Cloud computing creates a centralised repository of GIS data which has the potential to drive innovation. Prior to cloud processing, data sharing of this kind had software and hardware limitations. However, the cloud brings forth unified data standards across the country making it hassle-free to access high-quality data. 


Why Cybersecurity Learning and Development is a Lifeline During Economic Downturn

More than a third of Europe’s largest tech companies are currently based in London and the UK remains a beacon of technological innovation. Yet, our research suggests that tech companies across the UK lack the technological skills they need to thrive and remain safe in the challenging months ahead. With DCMS’ UK Data Skills Gap report highlighting that the supply of university graduates with specialist technological skills is limited, companies must accept they have a larger role to play to increase digital skills internally rather than simply looking outside for ready-made talent. Business leaders must put adequate investment and support behind the upskilling of current employees to bolster cybersecurity talent and drive innovation. At the same time, employees should prioritize cybersecurity-related L&D to make themselves an invaluable asset to their organization – proactively identifying training opportunities with a quality L&D partner, one that aligns with their unique learning style and objectives. While there is no cookie-cutter approach to upskilling, employees should be granted access to a range of learning opportunities as part of a defined path of individual development.


Artificial intelligence is here, but the technology faces major challenges in 2023

Whether AI will replace human jobs is less important than more vital ethical questions that need to be addressed in 2023, Bhargava says. The more pressing concern is "who's making these things and what questions are they asking about what biases are baked into it." When tools like ChatGPT are designed by teams with limited perspectives and diversity, the result is a tool lacking in perspective. "These systems that get built … are mirrors for our culture and our practices," says Bhargava. "Which way do they point and who's looking in them? No, they don't embed bias; they reflect it." There are some measures being taken to address the ethical questions around AI bias. Dakuo Wang, associate professor of art and design and computer science, says ChatGPT's real innovation is how it uses human data labelers during the process of training the AI to limit bias and increase accuracy. But even then, the technology is only as good as the data it's been trained on. Without the right data, the inaccuracies and limitations become much more obvious––and potentially dangerous.


Ransomware Looms Large on Third-Party Risk Landscape

First, it is important to have a clear understanding of the enterprise’s IT-related supply chain. This includes identifying all of the suppliers, subcontractors and other partners that process, transmit or store data used in the creation of the enterprise’s products and services. It is also important to understand the relationships between these different entities, as well as the specific products and services that each one provides, which results in a mapping. Once the supply chain has been mapped out, the next step is to identify the potential risks associated with each component of the chain. This includes both external and internal risks. External risks might include things like natural disasters, political instability or economic downturns. Internal risks might include things like employee turnover, equipment failure or data breaches. To identify these risks, enterprises should consider conducting a risk assessment. This will involve gathering and analyzing data from a variety of sources, including supplier contracts, insurance policies and regulatory compliance reports. 
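The mapping-then-risk-identification steps above can be sketched as a simple data structure: each supplier is recorded with the services it provides and the risks identified for it, which then supports queries across the whole chain. Supplier names and risks here are invented for illustration:

```python
# Hypothetical supply-chain map: each supplier with the services it
# provides and the risks identified for it during assessment.
supply_chain = {
    "CloudHost Inc.": {"services": ["data storage"],
                       "risks": ["region outage", "data breach"]},
    "PayProc Ltd.":   {"services": ["payment gateway"],
                       "risks": ["regulatory change"]},
    "DevShop GmbH":   {"services": ["contract development"],
                       "risks": ["employee turnover", "data breach"]},
}

def suppliers_with_risk(chain, risk):
    """Which components of the chain share a given risk?"""
    return sorted(name for name, info in chain.items()
                  if risk in info["risks"])

# A shared risk across multiple suppliers signals a concentration the
# risk assessment should prioritize.
print(suppliers_with_risk(supply_chain, "data breach"))
```

Even a flat inventory like this makes the "mapping" output queryable, so the subsequent risk assessment can rank shared exposures instead of reviewing suppliers one at a time.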


DevOps and platform engineering

Despite many new teams and job titles springing up around DevOps, the platform engineering team is, perhaps, the most aligned to the mindset and objectives of DevOps. Platform teams work with development teams to create one or more golden pathways representing a supported set of technology choices. These pathways don't prevent teams from using something else. Pathways encourage alignment without enforcing centralized decisions on development teams. Rather than pick up tickets, such as "create a test environment", platform teams create easy-to-use self-service tools for the development teams' use. A critical part of platform engineering is treating developers as customers, solving their problems and reducing friction while advocating the adoption of aligned technology choices. ... Platform engineering alone doesn't provide a complete organizational view of performance. The DevOps structural equation model shows us capabilities for leadership, management, culture, and product that are outside a platform team's scope.


The Internet of Things: What security risks should you look out for?

With more businesses adopting the IoT and with smart homes becoming increasingly popular, focusing on cybersecurity alone is not nearly enough. It is also important to ensure the physical security of these devices. Most of these devices are generally quite small and easily accessible and could be tampered with or stolen. Once stolen, these devices may be taken to another location where they can be disassembled and probed for any data. These stolen devices might also be used to breach the IoT systems to which they are connected. Moreover, a hacker could plant a bug in a device without even having to move it. These issues highlight how important physical security is and why companies need to take steps to ensure the physical safety of their device network. There are several standards for cybersecurity today, and in a lot of cases, companies are even required by law to comply with some of these standards. Unfortunately, no such international standards exist for the IoT. All we have are best practices and recommendations. While steps are being taken to strengthen IoT security, we have yet to see a framework of recognized, international standards for IoT security.


A Platform Team Product Manager Determines DevOps Success

As you build platforms out across the organization, Kersten said, it’s important to ensure that the feedback loops expand accordingly. “If you first build self-service for your own team it tends to be a simpler problem,” he said. “You’ve got the feedback loops already. You should, within a team, be talking to each other. Thinking about what you do as self-service and trying to build those abstractions for yourself, then you’re hopefully freeing up time.” As the platform embraces other teams, “You can’t do platform engineering if you don’t have some way of talking to the people who are actually going to be using the services you build, and working out what their actual problems are, because their problems will be different from yours.” The “State of DevOps” report’s findings underscore the need for a product manager with these “soft skills” to make platform engineering a success at scale. Sixty-one percent of respondents said strong communication skills were the most important product management skills for a platform team’s success.


Why Applying Constant Pressure on Yourself Can Significantly Improve Your Productivity and Success

As with so many things, working through pressure gets easier with practice. It's like a muscle or a skill — you have to train it to strengthen it. No one is walking into the weight room for the first time and squatting with 400 pounds, nor would it be recommended. Without training, you're only going to hurt yourself. There's a reason Lionel Messi is consistently chosen to take penalty kicks; he's taken so many before and has found a way to be comfortable and successful through what's arguably the most pressure-inducing moment of the game. He's been put in the situation before and risen to the challenge repeatedly in a way other players haven't mastered yet. ... Different people have different strategies, but something I've found crucial is recognizing the adrenaline that comes with the feeling of pressure. On a physical level, the fear you might feel during those moments is not all that different from the feeling you get when you're excited, like climbing the highest point of a rollercoaster. The trick is channeling that adrenaline towards the latter and using it to fuel excitement rather than fear. 


AI and Human Creativity - Could it Lead to General Cognitive Decline?

AI might be able to generate new and novel ideas by remixing what is fed into it, but that doesn't necessarily help the humans who create the input improve access to their own creative powers. It's not just about the quality of what is generated; it is also about improving our thinking skills. Creativity might be innate, but we can always get better at inviting its presence. In my experience, that's a mental skill that improves with practice. And highlighting the importance of the human element in creativity is all well and good, but creators in a hurry could be ever more inclined to just press a button to get output that meets a deadline instead of going inside, reflecting, and finding the creative state necessary to do it on their own. And, of course, yes, AI is a tool and it is about how you use it. I think it is also about how you frame its purpose and how that relates to our values as a society. Consider the relative importance of the intrinsic value of creativity versus a context that gives more weight to speed of delivery and volume of output.


Enterprise Architecture Must Evolve for Digital Transformation

Current enterprise architecture (EA) frameworks were first developed in the 1980s, and while they have been iterated on since, widely adopted EA frameworks still rest on the same architectural foundations as when they were established. Take, for example, The Open Group Architecture Framework (TOGAF), which had its first version published in 1995. The foundation still consists of the same four architectural domains: business, application, data, and technical. That foundation was laid before the internet became ingrained in everyday business. And this is part of the problem. Today it is not uncommon to equate technology with the worldwide connection that is so embedded in our everyday lives. While TOGAF has managed to support businesses up to now through versioning, including integrating the internet and new capabilities into its architecture, it wasn’t purpose-built for today’s possibilities—digital business. Our understanding of what’s possible drives the need for modernizing enterprise architecture.



Quote for the day:

"No man can stand on top because he is put there." -- H. H. Vreeland

Daily Tech Digest - January 17, 2023

The 7 new rules of IT leadership

There’s no question that stable, strong IT infrastructure is more essential now than ever, yet CIOs can’t succeed by making a steady state the end-all, be-all. Instead, they must be change agents who are not only OK with constant change but also advocate for it while ensuring infrastructure can scale and support that change. “Success is managing change versus moving from one fixed stone to another,” Cameron says. “So for CIOs to be really successful in this new environment, they need to be able to make change continuous, and they have to find ways as leaders to help their people understand how to do that.” He adds: “That means making structural changes.” There is a mindset shift here, but equally important — if not more so — is the need to change how work actually happens. One of the most prominent adjustments for IT is the move from approaching technology delivery as projects — something that’s planned, executed, and completed — to a product mindset that embraces incremental improvements delivered throughout a digital tool’s lifecycle.


Essential skills for becoming a CTO

The easiest way into a management and leadership role is to become an engineering manager before you become a CTO. Assuming you have that engineering manager role in your company, there are a whole bunch of great books on engineering management, such as The Pragmatic Programmer by David Thomas and Andrew Hunt. Another good one is Accelerate, which shows you how to measure software delivery performance. A good general technical management book is The Manager’s Path by Camille Fournier, while The Five Dysfunctions of a Team by Patrick Lencioni is very good on psychological safety and interpersonal relations within a team. ... Soft skills have possibly been neglected in the past. Nobody should be trying to take on a management or leadership position without any understanding of what it means to deal with people and motivate them. Empathy, communication, and creating an environment of psychological safety, so that people can really push the boundaries of what they work on without fear of reprisal, are all important in a management role.


AI Lawyer: It's Starting as a Stunt, but There's a Real Need

Advocates say AI's ability to sort information, spot patterns and quickly pull up data means that in a short time, it could become a "copilot" for our daily lives. Already, coders on Microsoft-owned GitHub are using AI to help them create apps and solve technical problems. Social media managers are relying on AI to help determine the best time to post a new item. Even we here at CNET are experimenting with whether AI can help write explainer-type stories about the ever-changing world of finance. So, it can seem like only a matter of time before AI finds its way into research-heavy industries like the law as well. And considering that 80% of low-income Americans don't have access to legal help, while 40% to 60% of the middle class still struggle to get such assistance, there's clearly demand. AI could help meet that need, but lawyers shouldn't feel like new technology is going to take business away from them, says Andrew Perlman, dean of the law school at Suffolk University. It's simply a matter of scale. "There is no way that the legal profession is going to be able to deliver all of the legal services that people need," Perlman said.


The EU wants to regulate your favorite AI tools

Lawmakers in Europe are working on rules for image- and text-producing generative AI models that have created such excitement recently, such as Stable Diffusion, LaMDA, and ChatGPT. They could spell the end of the era of companies releasing their AI models into the wild with little to no safeguards or accountability. These models increasingly form the backbone of many AI applications, yet the companies that make them are fiercely secretive about how they are built and trained. We don’t know much about how they work, and that makes it difficult to understand how the models generate harmful content or biased outcomes, or how to mitigate those problems. The European Union is planning to update its upcoming sweeping AI regulation, called the AI Act, with rules that force these companies to shed some light on the inner workings of their AI models. It will likely be passed in the second half of the year, and after that, companies will have to comply if they want to sell or use AI products in the EU or face fines of up to 6% of their total worldwide annual turnover.


CFOs zero in on digital transformation

Evaluating the results of one’s digital transformation efforts is a constant challenge for financial leaders, who must also deal with finding and retaining digital talent as well as aggregating all of the information they need across their organization in order to build a technology roadmap, Horvat said. As a result, CFOs are currently focusing their digital transformation efforts on the finance function. “What they’re prioritizing is really maturing that FP&A function, getting FP&A-specific tools to platform their planning and budgeting,” Horvat said. Ninety percent of CFOs surveyed pointed to evaluating their finance strategy, scope and design as their top priority for 2023, according to the survey, while 83% pointed to planning finance transformation efforts. It is also important to note that CFOs are personally involved in their organizations’ digital transformation efforts both broadly and within the finance function, Horvat said. “I think a lot of it has to do with owning that strategy piece of it, to make sure that it’s advancing in a way that serves the interests of the organization,” he said.


How to succeed in cyber crisis management and avoid a Tower of Babel

Organizations need to develop a working assumption of the main threat factors, targets, and practical ramifications of a cyberattack. The organization should also identify the main scenarios it may need to deal with, including a situation that results in shutting down the main business activities and a situation in which sensitive information is leaked or stolen. These scenarios should be based on the nature of the organization, the sector in which it operates, its geographic location, and its history of cyber events. They should be updated constantly as the business and the threats change and grow. Publicly listed companies should also be aware of the risks to image and finances that could come with attacks, as regulations increasingly require reporting of cyber incidents. In addition, each organization needs to determine its guiding principles by answering key questions, such as whether it would negotiate with attackers and whether it would ever consider paying a ransom. It also needs to decide who will mitigate an attack – an internal team or a hired third party.


How AI chatbot ChatGPT changes the phishing game

If attackers ask ChatGPT directly to suggest ideas for a phishing email, they'll get a warning message that this topic is "not appropriate or ethical." But if they ask for suggestions for a marketing email, or an email to tell people about a new human resources webpage, or to ask someone to review a document prior to a meeting—that, ChatGPT will be very happy to do. ... ChatGPT is not limited to English. It says it knows about 20 languages, including Russian, Standard Chinese, and Korean, but people have tested it with nearly 100. That means you can explain what you need in a language other than English, then ask ChatGPT to output the email in English. ChatGPT is blocked in Russia, but there's plenty of discussion in Russian explaining how to get to it via proxies and VPN services and how to get access to a foreign phone number to confirm your location. ... "ChatGPT and large language models in general will be used for benign content much more than for malicious content," says Andy Patel, researcher at WithSecure, who recently released a research report about hackers and GPT-3, a predecessor of the model behind ChatGPT.


Greener supply chains call for IoT innovation

With businesses and CEOs facing demands for environmental change and enhanced revenue growth simultaneously, supply chains need to be revolutionised. This can be achieved by strategically integrating the right systems and sensors to unlock opportunities, especially those that reduce energy consumption and waste throughout product lifecycles. The Gartner study that unearths the CEO findings is entitled 2022 CEO Survey: Sustainability and ESG Become Enduring Change. It says CEOs are also becoming increasingly aware that new technologies have a crucial role to play in supporting sustainability improvements. Artificial Intelligence (AI) was identified by 18% of respondents, putting it at the top of the list of sustainability supporting technologies, with digitalisation ranking second with 11%. While these findings indicate a growing awareness of technology’s potential to support sustainability, only 4% of CEOs identified IoT-related technologies as a primary example, when in fact it is set to be a major driver.


7 tell-tale signs of fake DevOps

An organization that hyper-focuses on a tool- and technology-centric DevOps culture, rather than on people and processes, is 180 degrees out of sync. “It’s crucial to assess current business practices and needs,” says Mohan Kumar, senior architect at TEKsystems, an IT service management firm. Kumar recommends prioritizing teams. “Instill DevOps culture into communication, collaboration, feedback collection, and analysis,” he suggests. “An experiment-friendly environment that allows developers to fail fast, recover fast, and learn faster builds a blame-free culture within the organization.” Kumar also suggests nurturing a stream of creative ideas by tapping into teams’ collective intelligence. DevOps adoption is an iterative process, so the CIO should begin by evaluating the development team’s current state and then gradually building a strategy of continuous improvement involving people, processes, and tools that can evolve along with future needs and developments. “Ultimately, creativity is a muscle that must be exercised continuously to grow,” Kumar observes.


Digital transformation: 4 tips to keep it human-centered

Rather than diving head-first into digital transformation, it is important to take a step back, consider these factors, and act accordingly. By taking a human-centered approach to digital transformation initiatives, organizations can use technology to transform the lives of the people they serve. We recently saw one of our customers create significant positive change when they considered the people involved in a necessary technology upgrade. ... Human-centered digital transformation requires companies to recognize that people lay the foundation for digital transformation and, therefore, must take the necessary steps to create a seamless experience throughout the process. The shift to a digital-first business environment can be challenging to all stakeholders as they are expected to adapt to rapid changes at an organizational level. Keeping pace with the changing needs of employees and customers will alleviate this burden and foster a strong company culture.



Quote for the day:

"Practice isn't the thing you do once you're good. It's the thing you do that makes you good." -- Malcolm Gladwell

Daily Tech Digest - January 16, 2023

Why Cyber Insurance Will Revive Cyber Business Intelligence

Because cyber insurance deals with risk that has been transferred, there is a subtle but powerful distinction from the need to understand your own risk. In many cases, insurance companies that can curate low risk pools and a favorable loss ratio can significantly improve profits. That’s not the only way they make money, but it is one way. Now enter the resurgence of cyber business intelligence. While concepts like cyber threat intelligence and risk assessments focus on preventing loss, cyber business intelligence aligns with concepts already utilized elsewhere in a business environment. “What pieces of knowledge and trends can I follow – that by following them I can be more profitable?” This is a different mindset. This is one anchored on the idea that “you’ve got to spend money to make money.” This drives a culture and enthusiasm that can foster better innovation, better results and faster progress. There’s another key word there. Business. Not only relevant to technical experts, this information is equally relevant to business leaders and key decision makers. 


No Black Boxes: Keep Humans Involved In Artificial Intelligence

Not all AI needs are created equal. For instance, in low-stakes situations, such as image recognition for noncritical needs, it’s not likely necessary to understand how the programs are working. However, it is critical to understand how code operates and continues to develop in situations with important outcomes, including medical decisions, hiring decisions, or car safety decisions. It’s important to know where human intervention is needed and when it’s necessary for input and intervention. Additionally, because educated men mainly write AI code, according to (fittingly) the Alan Turing Institute, there’s a natural bias to reflect the experiences and worldviews of those coders. Ideally, coding situations in which the end goal implicates vital interests need to focus on “explainability” and clear points where the coder can intervene and either take control or adjust the program to ensure ethical and desirable end performance. Further, those developing the programs—and those reviewing them—need to ensure the source inputs aren’t biased toward certain populations.


3 Things New Engineering Managers Should Focus On

A high-performing team consists of engaged, happy, and motivated people — truly getting the best out of your team means getting the best from each individual. So what does that mean for you? It means quickly getting up to speed on each team member’s background, experiences, portfolio, strengths, growth areas, and goals. How do they want to be recognized? What style of feedback do they prefer? How do they learn best? What goals do they have? The more nuance you can learn about each person, the more successful you will be in leading them. By setting up 1:1 meetings, you’ll be able to learn about each person on your team, coach them, and discuss their progress towards goals. ... Instead of rolling in and making changes, spend this time learning about the processes your team is already using. What are the team’s goals? How do they work together and separately? How does your team integrate with other teams — or not, currently, and is that an issue? Who are the customers and partners?


Post-quantum cybersecurity threats loom large

Considering this net-positive shift in budgets, it’s no surprise that 74% of enterprise leaders have adopted or are planning to adopt quantum computing. Interestingly, nearly 30% of respondents that have adopted or plan to adopt quantum computing expect to see a competitive advantage due to quantum computing within the next 12 months. This represents more than a sevenfold increase year-over-year from 2021 (4%) and highlights the growing commitment to near-term quantum computing initiatives as the technology continues to mature. “We’re getting a unique glimpse into the quantum adoption mindset of global enterprise executives, which mirrors what we’re seeing in our customer base,” said Christopher Savoie, CEO of Zapata Computing. “These findings become more interesting when compared to the data we saw last year. Over the past 12 months, we’ve seen significant new developments in technology, particularly generative AI, and near-term advantages from quantum-inspired technologies that are fueling the momentum for quantum computing planning and adoption.


Data will be the king in 2023!

The importance of cyber-risk governance is no longer limited to CISOs; conversations are deepening on how organizations can ensure data resiliency, adaptability, and security at the C-suite level. As we approach 2023, business leaders will need to assess their data infrastructure with a five-point focus — scalability, flexibility, agility, security, and cost. Data protection and management will become a top-tier priority for business leaders. Significant amounts of the IT budget will be allocated and invested in technologies to prevent, detect, and recover from cyberattacks that are now a matter of when, not if. A study by PwC found that 62% of respondents expect their security budget to increase by as much as 10% in 2023. As cloud investments continue to soar in 2023, the threat landscape will shift in parallel and become more sophisticated. As per the recent Commvault-IDC survey, over 28% of Indian enterprises stated they will have multiple private and/or public cloud environments and migrate workloads and data between them by 2023. Thus, protection and data recoverability will be essential components in the enterprise security toolbox of organizations.


What to expect from SASE certifications

Compared to other networking certifications, like the CCNA, which is more about how to operate the technology, Cato’s SASE and SSE certifications are high-level overviews. “Our certification is more about what SASE and SSE mean, what are the implications, and what it means to different IT teams,” says Webber-Zvik. “You see presentations, whiteboards, reading materials, and at the end of each section, there is a quiz. When you complete all the sets and pass all the tests, you get the certification.” The majority of the material covered is not Cato-specific, he says. However, the certification does use Cato’s implementation of SASE and SSE in its examples. Take, for instance, single-pass processing. According to Gartner, this is a key characteristic of SASE, and it means that networking and security are integrated. “We explain it according to Gartner’s definition,” Webber-Zvik says. “We also provide an example of Cato’s implementation and use that to articulate what single-pass processing can look like when it’s outside Gartner theory and in real life.” There is no charge for Cato’s certification training and exam, but that might change, he says.


How to Overcome Challenges in an API-Centric Architecture

There are several potential solutions. If the use case allows it, the best option is to make tasks asynchronous. Calling multiple services inevitably takes too long, and it is often better to set the right expectations by promising to provide the results when they are ready rather than forcing the end user to wait on the request. When service calls have no side effects (such as search), there is a second option: latency hedging, where we start a second call when the wait time exceeds the 80th percentile and respond with whichever call returns first. This can help control the long tail. The third option is to complete as much work as possible in parallel, starting as many service calls as possible instead of waiting for each response before making the next call. This is not always possible, because some service calls might depend on the results of earlier ones. However, code that calls multiple services in parallel, then collects and combines the results, is considerably more complex than calling them one after the other.
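The hedging and parallel fan-out options above can be sketched with Python's asyncio. This is a minimal sketch, not a production pattern: the simulated services, their latencies, and the 100 ms hedge threshold (standing in for a measured 80th-percentile wait time) are all hypothetical.

```python
import asyncio
import random

async def call_service(name: str) -> str:
    # Stand-in for a downstream service with variable latency.
    await asyncio.sleep(random.uniform(0.01, 0.2))
    return f"{name}-result"

async def hedged_call(name: str, hedge_after: float = 0.1) -> str:
    """Latency hedging: if the first call exceeds the threshold,
    fire a duplicate and return whichever finishes first.
    Only safe for side-effect-free calls such as search."""
    first = asyncio.create_task(call_service(name))
    done, _ = await asyncio.wait({first}, timeout=hedge_after)
    if done:
        return first.result()
    second = asyncio.create_task(call_service(name))
    done, pending = await asyncio.wait(
        {first, second}, return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:
        task.cancel()
    return done.pop().result()

async def main() -> list[str]:
    # Independent services are called in parallel, not one after the other.
    return await asyncio.gather(
        hedged_call("search"), hedged_call("profile"), hedged_call("ads")
    )

results = asyncio.run(main())
print(results)
```

Because `gather` preserves argument order, the combined results line up with the original calls even though the underlying requests complete in arbitrary order.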


The CIO’s role is changing – here’s why

Faced with an increasing number of threats both internal and external, CIOs have had to prioritise areas such as cyber security in recent years just to keep their businesses protected. In doing so, they’ve also been charged with embracing the latest technological developments such as artificial intelligence, big data analytics, and the plethora of connected devices that comprise the burgeoning Internet of Things; technologies that will foster greater innovation and provide their businesses with a more competitive edge. Increasingly, however, it won’t necessarily be an organisation’s IT department that drives the adoption of emerging technologies. More often, other areas of the business will now be in a better position to identify the innovative technology that will deliver greater customer value, and the specific use cases in which it can be implemented. 77 per cent of CIOs surveyed by Gartner claimed that IT staff are primarily providing innovation and collaboration capabilities, compared with 18 per cent stating that non-IT personnel are providing these tools. 


SRE in 2023: 5 exciting predictions

Whether it’s AI assistance, VR immersion, or web3 decentralization, 2023 will continue to push organizations to adopt cutting-edge technology. It’s a challenge to guess which of these ideas will flourish and which will flounder, but either way, having a reliable foundation will be necessary. Adopting even the most successful new ideas at scale will bring new obstacles and types of incidents. These growing pains of new technologies will require new approaches. As organizations experience these growing pains, they’ll turn to SRE to keep their customers happy while they adjust. Incident retrospectives can help teams handle new sources of incidents quickly, while a reliability mindset can keep customer happiness the number one priority. Reliability is the subjective experience of users based on their expectations of the service. While this is a helpful way to align priorities with customer needs, 2023 will bring an even more holistic definition of reliability. Organizations will start thinking about the reliability of their system, not just in terms of their users’ experiences, but as a complete package covering everything starting from development ideation.


10 data security enhancements to consider as your employees return to the office

The unauthorized disclosure of data isn’t always the result of malicious actors. Often, data is accidentally overshared or lost by employees. Keep your employees informed with cyber security education. Employees who go through regular phishing tests may be less likely to engage with malicious actors over email or text messaging. ... An inventory of software, hardware and data assets is essential. Having control over the assets with access to your corporate environment starts with an inventory. Inventories can be a part of the overall vulnerability management program to keep all assets up to date, including operating systems and software. Furthermore, a data inventory or catalogue identifies sensitive data, which allows appropriate security controls like encryption, access restrictions and monitoring to be placed on the most important data. ... Reducing your overall data footprint can be an effective way of reducing risk. 
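The data-inventory step above can be sketched as a small classifier that flags likely-sensitive columns so that controls such as encryption, access restrictions, and monitoring can be targeted at them. The table schemas and the regex patterns for what counts as "sensitive" are hypothetical placeholders:

```python
import re

# Hypothetical patterns marking a column name as sensitive.
SENSITIVE_PATTERNS = [r"ssn", r"email", r"dob", r"salary", r"password"]

def classify_columns(schema: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return, per table, the columns flagged as sensitive."""
    flagged = {}
    for table, columns in schema.items():
        hits = [
            col for col in columns
            if any(re.search(p, col, re.IGNORECASE) for p in SENSITIVE_PATTERNS)
        ]
        if hits:
            flagged[table] = hits
    return flagged

schema = {
    "employees": ["id", "name", "email", "salary"],
    "projects": ["id", "title", "deadline"],
}
print(classify_columns(schema))  # {'employees': ['email', 'salary']}
```

In practice the schema would be pulled from the database catalog rather than hard-coded, and name matching would be supplemented by content sampling, but the shape of the inventory is the same.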



Quote for the day:

"Smart leaders develop people who develop others, don't waste your time on those who won't help themselves." -- John C Maxwell

Daily Tech Digest - January 15, 2023

How confidential computing will shape the next phase of cybersecurity

At its core, confidential computing encrypts data at the hardware level. It’s a way of “protecting data and applications by running them in a secure, trusted environment,” explains Noam Dror—SVP of solution engineering at HUB Security, a Tel Aviv, Israel-based cybersecurity company that specializes in confidential computing. In other words, confidential computing is like running your data and code in an isolated, secure black box, known as an “enclave” or trusted execution environment (TEE), that’s inaccessible to unauthorized systems. The enclave also encrypts all the data inside, allowing you to process your data even when hackers breach your infrastructure. Encryption makes the information invisible to human users, cloud providers, and other computer resources. Encryption is the best way to secure data in the cloud, says Kurt Rohloff, cofounder and CTO at Duality, a cybersecurity firm based in New Jersey. Confidential computing, he says, allows multiple sources to analyze and upload data to shared environments, such as a commercial third-party cloud environment, without worrying about data leakage.


Not All Multi-Factor Authentication Is Created Equal

Many legacy MFA platforms rely on easily phishable factors like passwords, push notifications, one-time codes, or magic links delivered via email or SMS. In addition to the complicated and often frustrating user experience they create, phishable factors such as these open organizations up to cyber threats. Through social engineering attacks, employees can be easily manipulated into providing these authentication factors to a cyber criminal. And by relying on these factors, the burden to protect digital identities lies squarely on the end user, meaning organizations’ cybersecurity strategies can hinge entirely on a moment of human error. Beyond social engineering, man-in-the-middle attacks and readily available toolkits make bypassing existing MFA a trivial exercise. Where there is a password and other weak and phishable factors, there is an attack vector for hackers, leaving organizations to suffer the consequences of account takeovers, ransomware attacks, data leakage, and more. A phishing-resistant MFA solution completely removes these factors, making it impossible for an end user to be tricked into handing them over, even by accident, or to have them collected by automated phishing tactics.


Europe’s cyber security strategy must be clear about open source

While the UK government has tried to recognise the importance of digital supply chain security, current policy doesn’t consider open source as part of that supply chain. Instead, regulation or proposed policies focus only on third-party software vendors in the traditional sense but fail to recognise the building blocks of all software today and the supply chain behind it. To hammer the point home, the UK’s 11,000+ word National Cyber Security Strategy does not include a single reference to open source. GCHQ guidance meanwhile remains limited, with little detailed direction beyond ‘pull together a list of your software’s open source components or ask your suppliers.’ ... In this sense, the EU has certainly been listening. The recently released Cyber Resilience Act (CRA) is its proposed regulation to combat threats affecting any digital entity and ‘bolster cyber security rules to ensure more secure hardware and software products’. First, the encouraging bits: the CRA doesn’t just call for vendors and producers of software to have (among other things) a Software Bill of Materials (SBoM); it also demands companies have the ability to recall components.


Eight Common Data Strategy Pitfalls

- Lack of data culture: Data hidden within silos with little communication between business units leads to a lack of data culture. Data Literacy and enterprise-wide data training are required to allow business staff to read, analyze, and discuss data. Data culture is the starting point for developing an effective Data Strategy.
- The Data Strategy is too focused on data and not on the business side of things: When businesses focus too much on just data, the Data Strategy may end up serving only the needs of analytics without any focus on business needs. An ideal Data Strategy enlists human capabilities and provides opportunities for training staff to carry out the strategy to meet business goals. This approach will work better if citizen data scientists are included in strategy teams to bridge the gap between the data scientist and the business analyst.
- Investing in data technology before democratizing data: In many cases, Data Strategy initiatives focus on quick investment in technology without first addressing data access issues. If data access is not considered first, costly technology investments will go to waste.


Here's Why Your Data Science Project Failed (and How to Succeed Next Time)

Every data science project needs to start with an evaluation of your primary goals. What opportunities are there to improve your core competency? Are there any specific questions you have about your products, services, customers, or operations? And is there a small and easy proof of concept you can launch to gain traction and master the technology? The above use case from GE is a prime example of having a clear goal in mind. The multinational company was in the middle of restructuring, re-emphasizing its focus on aero engines and power equipment. With the goal of reducing their six- to 12-month design process, they decided to pursue a machine learning project capable of increasing the efficiency of product design within their core verticals. As a result, this project promises to decrease design time and budget allocated for R&D. Organizations that embody GE's strategy will face fewer false starts with their data science projects. For those that are still unsure about how to adapt data-driven thinking to their business, an outsourced partner can simplify the selection process and optimize your outcomes.


5 Skills That Make a Successful Data Manager

The role of a data manager in an organization is tricky. This person is often neither an IT guy who implements databases on his/her own, nor a business guy who is actually responsible for data or processes (that is rather a Data Steward's area of responsibility). So what's the real value-add of a data manager (or even a data management department)? In my opinion, you need someone who builds bridges between the different data stakeholders on a methodical level. It is rather easy to find people who consider themselves experts in a particular business area, data analysis method, or IT tool, but it is rather complicated to find one person who is willing to connect all these people and to organize their competencies, as is often required in data projects. So what I am referring to are skills like networking, project management, stakeholder management, and change management, which are required to build a data community step by step as the backbone for Data Governance. Without people, a data manager will fail! So in my opinion, a recruiter who seeks data managers should not only test technical skills but also these people skills.


Why distributed ledger technology needs to scale back its ambition

There is nonetheless an expectation that DLT can prove to be a net good for financial markets. An estimated $8.9 trillion is at risk in foreign exchange markets every day because final settlement of transactions between two parties can take days. This is why the Financial Stability Board and the Committee on Payments and Market Infrastructures have focused their efforts on enhancing cross-border payments with a comprehensive global roadmap. Part of this roadmap includes exploring the use of DLT and Central Bank Digital Currencies. The problem may not be the technology itself, but the aim of replacing current technology systems with distributed networks. DLT networks are being designed to completely overhaul and replace the legacy technology that financial markets depend on today. Many pilot projects, such as mBridge and Jura, rely on a single blockchain developed by a single vendor. This introduces a single point of trust and removes many of the benefits of disintermediation.


Why is “information architecture” at the centre of the design process?

The information architecture within a design (both process and output) makes the balancing within the equation possible. It also ensures the equation is “solvable” by other people. It does this by introducing logical coherence. It ensures words, images, shapes and colours are used consistently. And it ensures that as we move from idea to execution, we stay true to the original intent — and can clearly articulate it — so that we can meaningfully measure the effectiveness of our design. Without this internal coherence and confidence that our output is an accurate, reliable test of our hypothesis, we’re not doing design. The power of a design with a consistent information architecture is that if we find that our idea (which we translate to intent, experiments and experiences) is not equal to the problem, we can interrogate every part of the equation. We may have made a mistake in execution. Maybe our idea wasn’t quite right. Or even more powerfully, maybe we didn’t really understand the problem fully.


Improve Your Software Quality with a Strong Digital Immune System

A strong digital immune system can improve your software quality because it is designed to guard computer systems, networks, and hardware against cyberattacks and other hostile activity. It operates by constantly scanning the network and systems for indications of prospective threats and then taking the necessary precautions to thwart or lessen those dangers. This can entail detecting and preventing malicious communications, identifying and containing compromised devices, and patching security holes. A robust digital immune system should offer powerful and efficient protection against cyber threats and help individuals and companies stay secure online. Experts in software engineering are searching for fresh methods and strategies to reduce risks and maximize commercial impact. The idea of “digital immunity” offers a direction. It consists of a collection of techniques and tools for creating robust software programmes that provide top-notch user experiences. With this roadmap, software engineering teams can identify and address a wide range of problems, including functional faults, security flaws, and inconsistent data.
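The scan-then-respond cycle described above can be sketched in a few lines. This is a minimal illustration, not a real product API: the event shape, the request threshold, and the function names are all assumptions made for the example.

```python
# Minimal sketch of a "digital immune system" loop: observe client
# events, flag anomalous behaviour, and quarantine the offenders.
# The threshold and event format are illustrative assumptions.
from collections import Counter

REQUEST_LIMIT = 100  # assumed per-window request ceiling per client


def scan(events):
    """Count requests per client; return the set exceeding the limit."""
    counts = Counter(event["client"] for event in events)
    return {client for client, n in counts.items() if n > REQUEST_LIMIT}


def respond(quarantined, blocklist):
    """Take the 'necessary precaution': add offenders to a blocklist."""
    blocklist.update(quarantined)
    return blocklist


if __name__ == "__main__":
    # One noisy client ("a", 150 requests) and one normal one ("b", 5).
    events = [{"client": "a"}] * 150 + [{"client": "b"}] * 5
    print(respond(scan(events), set()))
```

A real digital immune system would replace the simple rate threshold with richer signals (observability data, auto-remediation, chaos testing), but the detect-and-contain shape of the loop stays the same.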


Security Bugs Are Fundamentally Different Than Quality Bugs

For each one of the types of testing listed above, a different skillset is required. All of them require patience, attention to detail, basic technical skills, and the ability to document what you have found in a way that the software developers will understand and be able to fix the issue(s). That is where the similarities end. Each one of these types of testing requires different experience, knowledge, and tools, often meaning you need to hire different resources to perform the different tasks. Also, we can’t concentrate on everything at once and still do a great job at each one of them. Although theoretically you could find one person who is both skilled and experienced in all of these areas, it is rare, and that person would likely be costly to employ as a full-time resource. This is one reason that people hired for general software testing are not often also tasked with security testing. Another reason is that people who have the experience and skills to perform thorough and complete security testing are currently a rarity. 



Quote for the day:

"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls