Daily Tech Digest - November 17, 2022

Why Cybersecurity Should Highlight Veteran-Hiring Programs

Even veterans who did not work in cybersecurity while in the military still bring valuable skills to the field. The military emphasizes teamwork, adaptability, and responsibility, all traits that security professionals need to have. Military personnel are also trained in careful decision-making under extreme pressure using the available information. ... Earlier this year, the White House National Cyber Workforce and Education Summit issued a call to action to increase cybersecurity education and training opportunities. One of the announcements was to encourage more apprenticeship programs to help develop and train the cybersecurity workforce. In the months since, there have been a number of initiatives from cybersecurity organizations, including the SANS Institute. Many colleges and universities have specific training programs to give veterans hands-on experience in various areas. Cybersecurity training platform Cybrary said this week it is partnering with VetSec, a community of over 3,300 veterans working in or transitioning into cybersecurity, and TechVets, a bridge service for moving veterans, service leavers, reservists, and their families into IT careers.


ESG and C: Does Cybersecurity Deserve Its Own Pillar in ESG Frameworks?

Thefts of personal information during a cybersecurity breach erode trust on the part of customers, investors, employees and other stakeholders, demonstrating the link between cyber risk and social risk. The new disclosure and reporting requirements embedded in the Securities and Exchange Commission’s latest regulations governing the oversight of cybersecurity underline the link between governance risk and cyber risk. All this evidence shows that cybersecurity is already part of ESG and that, perhaps, a more appropriate abbreviation would be ESGC. Most enterprise risk management policies have already expanded their oversight from purely financial risk to these other areas, including cybersecurity. Cyber risk can be as harmful to a company’s reputation and value as any other ESG issue, and the damage is inflicted and experienced in much the same way. As cyberattacks increase in size and frequency, the direct and indirect damage to companies — including loss of customer confidence, reputational damage, potential impact on the stock price and possible regulatory actions or litigation — arguably touches all aspects of ESG.


Efficient data governance with AI segmentation

An effective and efficient technology is available to replace such archaic methods and reduce risk fast, at a fraction of the cost: artificial intelligence (AI) segmentation. With AI-based segmentation, after scanning just a small statistical sample of files, we ascertain which attributes of a file make it more likely to contain sensitive data. This gives us the information we need to prioritize our search for high-risk data. For example, are Word documents at higher risk than PowerPoint presentations? Is there a particular folder that is more likely to contain sensitive data? Once we have our riskiest data highlighted, we can immediately start a full scan and remediation process, eliminating the highest risk as early in the process as possible. Thus, we have prioritized the remediation process to achieve the greatest risk reduction in the least amount of time. To see why this matters, suppose we have many terabytes of data broken up into chunks of 100 terabytes. Indexing or scanning a single 100-terabyte chunk could require several months of work, and going through all of it takes even longer.
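The sampling idea can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's implementation: the "sensitive" pattern, file list, and scoring function are invented for the example.

```python
import random
import re
from collections import defaultdict

# Hypothetical pattern for "sensitive" data (SSN-like strings).
SENSITIVE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def score_attributes(files, sample_size=100, seed=0):
    """Scan a small random sample of (path, text) pairs and estimate,
    per file attribute (here: the extension), how often files with
    that attribute contain sensitive data."""
    random.seed(seed)
    sample = random.sample(files, min(sample_size, len(files)))
    hits, totals = defaultdict(int), defaultdict(int)
    for path, text in sample:
        ext = path.rsplit(".", 1)[-1].lower()
        totals[ext] += 1
        if SENSITIVE.search(text):
            hits[ext] += 1
    # Higher ratio => prioritize a full scan of files with this attribute.
    return sorted(((hits[e] / totals[e], e) for e in totals), reverse=True)

files = [("a.docx", "SSN 123-45-6789"), ("b.pptx", "agenda"),
         ("c.docx", "id 987-65-4321"), ("d.pptx", "meeting notes")]
print(score_attributes(files, sample_size=4))
# [(1.0, 'docx'), (0.0, 'pptx')]
```

The ranked output answers exactly the questions posed above (are Word documents riskier than PowerPoint presentations?) without touching the full corpus.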


What Makes a Good Cybersecurity Professional?

A cybersecurity professional is, at their core, an analytical person who looks at a problem from multiple points of view and devises an approach to solve the problem. When doing so, they must collaborate with people from different backgrounds and functions to understand the problem in depth and in context. This requires good communication skills, unlike some other technology-heavy roles in which a specification (spec) is provided and the task is strictly to achieve the spec. Someone could be an analyst, a risk advisor, a banker or a human resources (HR) professional and they could still be considered a cybersecurity professional. Cybersecurity relies on an understanding of human behavior and on contextual transactions in different lines of business. For example, a banker knows the pitfalls of processes when it comes to banking-related operations. They can bring this wealth of knowledge to cybersecurity operations by gaining the right skill set to build their technological capability. Imagine this to be a role where a consultant envisions the strategy and the solution and builds the technological stack required to solve the problem.


Agility, business more important than cost, network management for IT teams

Half of IT teams rated end-to-end visibility as a top priority, while just under that number said multicloud software-defined networks (SDNs) were on their most-wanted list. The bottom line in this area, said Cisco, was that although an unpredictable world challenges IT organisations, it also presents new opportunities for those that use technology to support dynamic business needs. It said IT needs to adopt an agile, cloud-like operations model for everything it does, including network operations. Cisco also remarked that as endpoints and applications become more dispersed and distributed, network complexity multiplies. While adoption of public cloud is growing, 50% of workloads are still deployed on-premise, and as a result, most environments will continue to be a mix of public cloud, hosted, private cloud, edge and on-premise environments, it said. CloudOps and NetOps figured highly in both operational and organisational trends, and there was greater alignment between the objectives of both. In all, 49% of CloudOps and 42% of NetOps respondents said security was their top motivation for using multiple clouds, and both said business performance, security and agility were top priorities.


Networking for remote work puts the emphasis on people, not sites

Many companies had to support work-from-home (WFH) during COVID, and most looked forward to having their staff back in the office. Most now tell me that some or all of the staff isn’t coming back, and that remote work is a given for at least some positions, likely for a very long time. That’s opened major questions about how these now-forever-roaming workers are connected to information resources and to each other. Didn’t we solve this already, with Zoom and Teams? Sort of. Collaborative video applications provide a reasonable substitute for meetings, but you still have the challenge of application access and information delivery. A bit over 80% of enterprises I’ve talked with say they need to make a remote worker look like they’re at their desk, and they need to be able to work as though they were as well. During lockdown, most companies said they relied on sending files and documents to workers. A few used SD-WAN technology to connect workers’ homes to the company VPN. The former strategy is very limiting and inefficient; you can’t replace checking account status online by sending around documents.


5 ways to find hidden IT talent inside your organization

Nobody knows the hidden IT talents of non-IT employees better than their managers and co-workers. At TruStone, business leaders and managers are open to recognizing employees with IT potential that could benefit both the employee’s career and the company. “We’re transparent that this would be a great person for [an IT] career progression, so maybe they should come into IT,” Jeter says. Jeter often discovers talent through his team’s product management consults inside the organization. “With a lot of scaled agile framework, we have product owners that sit outside of IT but within the business in areas like consumer lending, member services, or mortgages. We have technologies to align with them and they orchestrate the backlog” and other supporting duties, Jeter says. “They see what IT does, and we see what they do — and some of them want to come into IT.” IT scored a new team member recently after a product owner in operations worked with IT on a product management consult. He had been with the company for nine years and worked in training before business operations.


Not patched Log4j yet? Assume attackers are in your network, say CISA and FBI

The ubiquitous nature of Apache Log4j means it's embedded in a vast array of applications, services and enterprise software tools that are written in Java and used by organizations around the world, many of which rushed to apply the fixes. But despite the urgent messaging around the need to apply critical security updates, there are still organizations that haven't done so – meaning they're still vulnerable to any cyber criminals or other malicious hackers looking to exploit Log4j. Now CISA and the FBI have warned organizations with affected VMware systems that didn't immediately apply patches or workarounds "to assume compromise and initiate threat hunting activities". The cybersecurity advisory (CSA) also warns any organizations that detect a compromise as a result of Log4j to "assume lateral movement" by the attackers, investigate any connected systems and audit accounts with high privilege access. "All organizations, regardless of identified evidence of compromise, should apply the recommendations in the mitigations section of this CSA to protect against similar malicious cyber activity," said the alert.


Don’t get lost in the cloud: How to manage multiple providers

For organizations that take an ad hoc approach to multi-cloud, the risks include cost overruns from engaging several different services, Linthicum warns. “They don’t have a good handle on how the money’s being spent in a particular cloud provider, even within the primary provider,” he says. “And they’re getting huge cloud bills they didn’t expect, to the point that the boards of directors and CIOs and C-suites are starting to notice.” Security concerns are another reason to centralize cloud management. Organizations with three or four providers should have a security system that spans all of them so they can avoid juggling several separate dashboards, Linthicum suggests. “That’s where people get confused, and that’s how breaches occur.” Besides a plan for how the various providers work together, Linthicum says, it’s crucial to have a financial operations (FinOps) program that monitors and manages usage and costs. “Many enterprises don’t have that right now,” he notes. “They wait for their bill to show up and then figure out what went on, and they’re just running it on a spreadsheet where they’re not getting a true FinOps program in place.”
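A FinOps program ultimately comes down to comparing spend against expectations before the bill lands, rather than reconstructing it afterwards in a spreadsheet. A minimal sketch of that check, with invented provider names and figures:

```python
def overruns(spend_by_provider, budgets):
    """Return providers whose month-to-date spend exceeds budget,
    so surprises are caught before the bill shows up."""
    return {p: s - budgets[p]
            for p, s in spend_by_provider.items()
            if s > budgets.get(p, float("inf"))}

# Hypothetical month-to-date figures across three providers.
spend = {"aws": 42_000.0, "azure": 9_500.0, "gcp": 12_700.0}
budget = {"aws": 40_000.0, "azure": 10_000.0, "gcp": 12_000.0}
print(overruns(spend, budget))  # {'aws': 2000.0, 'gcp': 700.0}
```

A real FinOps practice pulls these numbers continuously from each provider's billing API and alerts on the deltas; the point of the sketch is only that the monitoring loop is simple once the data is centralized.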


When, Why and How Facilitation Skills Help Scrum Teams

Stress levels run high when two vocal members in the team, Minal and Linda, clash during the Sprint Retrospective. Minal expresses her disappointment with how little progress the team made during the Sprint. She adds sarcastically that maybe it was because of how they decided to implement the work, which was originally Linda’s idea. Linda responds quickly in a defensive tone, and calls out that Minal is always quick to point out what’s wrong but doesn’t contribute many suggestions of her own. The Retrospective devolves into a tense back and forth between them, and the Scrum Master, who is facilitating the Retrospective, firmly stops their argument. The Scrum Master moves the discussion to improvement ideas for the team, and many items around managing work in progress and exploring new technologies are suggested, but ultimately the team is stuck at an impasse over what to carry forward. The timebox ends and the Scrum Master calls an end to the Sprint Retrospective with no plans for the team on how to improve.



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick

Daily Tech Digest - November 16, 2022

Should we measure developer productivity?

If we concede that it is possible to measure developer productivity (a proposition that I am not completely sold on), we then must ask whether we should do that. The desire to do so is certainly strong. Managers want to know who their best developers are, and they want metrics that will help them at performance evaluation time. HR wants to be able to document performance issues. CEOs want to know that the money they are spending is being used effectively. Even if you use new tools to measure individual developer productivity, those metrics will likely be gamed. Lines of code is considered a joke metric these days. “You want lines of code? I’ll give you lines of code!” Is number of commits per day or average time to first PR comment any different? If you measure individual developers on these metrics, they will most definitely improve them. But at what cost? Likely at the cost of team productivity. An old CEO of mine used to say that software development is a team sport. If individual developers are measured against each other on any metric, they will start competing with each other, especially if money and promotions are on the line.


Technology spending will rise next year. And this old favourite is still a top priority

White says huge macro-economic pressures around the globe are causing senior executives to think much more carefully about how to get close to customers, to boost growth, and to potentially take cost out of the business. She also refers to pressures on supply chains. Executives have seen the disruptions caused first by the pandemic and then Russia's invasion of Ukraine, and are now looking for tools to respond flexibly to fluctuations in supply and demand. The solutions to many of these challenges, says White, are likely to come via technology. And for many businesses, the starting point for that response is going to be continued investment in cloud computing. This focus on on-demand IT might seem surprising. After a decade or more on the IT agenda, and a couple of years of targeted investment due to the pandemic, you'd be forgiven for assuming that a shift to cloud computing was yesterday's news. ... However, the Nash Squared survey shows that interest in the cloud is still very much today's priority. "It's still growing and evolving as a market, with a quite young set of technologies and capabilities," says White.


A modern approach to enterprise software development

For all the potential benefits that low-code tools offer in terms of enabling people in the business to develop their own software to improve the efficiency of the business processes with which they interact, the industry is recognising the massive risk that this poses. Dyson’s Wilmot said the business has concentrated on operational excellence focused on project audits, adding that people and the process around low-code development are crucial. He suggested that CIOs should decide: “Who will be your core low-code coders in IT and in the business?” Wilmot also urged CIOs considering the idea of opening up low-code development to business users who would like to code, to ensure that processes are in place to prevent the code they develop from “running wild”. Clearly there are numerous opportunities to improve on how things work, especially in organisations that have grown organically over time, where, to achieve a business objective, employees need to use numerous systems that don’t talk to each other. More often than not, data has to be rekeyed, which is both error-prone and labour-intensive.


Three Ingredients of Innovative Data Governance

The first important feature of innovative data governance is providing a data set that is statistically similar to the real data set without exposing private or confidential data. This can be accomplished using synthetic data. Synthetic data is created using real data to seed a process that can then generate data that appears real but is not. Variational autoencoders (VAEs), generative adversarial networks (GANs), and real-world simulation create data that can provide a basis for experimentation without leaking real data and exposing the organization to untenable risk. VAEs are neural networks composed of encoders and decoders. During the encoding process, the data is transformed in such a way that its feature set is compressed. During this compression, features are transformed and combined, removing the details of the original data. During the decoding process, the compression of the feature set is reversed, resulting in a data set that is like the original data but different. The purpose of this process is to identify a set of encoders and decoders that generate output data that is not directly attributable to the initial data source.
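The compress-then-reconstruct idea behind a VAE can be illustrated with a deliberately tiny, non-neural sketch. Real VAEs learn the encoder and decoder and sample from a latent distribution; here both are fixed, hand-written functions, purely to show how detail is discarded in encoding and a similar-but-different record comes back out of decoding:

```python
def encode(x):
    # "Compress": average adjacent feature pairs (4 features -> 2),
    # discarding detail, loosely analogous to a VAE's encoder.
    return [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]

def decode(z):
    # "Decompress": each latent value stands in for both original
    # features, loosely analogous to a VAE's decoder.
    out = []
    for v in z:
        out.extend([v, v])
    return out

record = [10.0, 12.0, 3.0, 5.0]        # a "real" record
synthetic = decode(encode(record))     # [11.0, 11.0, 4.0, 4.0]
print(synthetic)  # similar scale and shape, but not the original values
```

Because the round trip is lossy, the output cannot be traced back to the exact input values, which is the property that makes the VAE approach attractive for synthetic data.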


What’s Holding Up Progress in Machine Learning and AI? It’s the Data, Stupid

While companies are having some success in putting machine learning and AI into production, they would be further along if data management issues weren’t getting in the way, according to Capital One’s new report, “Operationalizing Machine Learning Achieves Key Business Outcomes,” which was released today. ... “There’s a real appetite to scale that thing quickly,” he says. “And if you don’t step back and say, hey, the thing you stood up in the sandbox, let’s actually make sure that you’re systematizing it, making it widely available, putting metadata on top of it, putting traceability and flows, and doing sort of all the foundational scaffolding and infrastructure steps that are needed for this thing to be sustainable and reusable.” “That requires a ton of discipline and hygiene and potentially waiting a bit before the thing that you want to scale up starts to see impact in the marketplace,” Kang continues. “The temptation is always there. So what ends up happening, through no ill intent, is these proof of concepts start to see impact, and then all of a sudden you find yourself in a place where there’s a bunch of data silos and a bunch of other data engineering infrastructure challenges.”


Twitter's CISO Takes Off, Leaving Security an Open Question

"Twitter made huge strides towards a more rational internal security model, and backsliding will put them in trouble with the FTC, SEC, 27 EU DPAs and a variety of other regulators," he said — ironically, in a tweet. "There is a serious risk of a breach with drastically reduced staff." Many others also view the cuts and the exodus of senior executives — both voluntarily and involuntarily — as severely crippling the social media giant's capabilities, especially in critical areas such as security, privacy, spam, fake accounts, and content moderation. "These are huge losses to Twitter," says Richard Stiennon, chief research analyst at IT-Harvest. "Finding qualified replacements will be extremely expensive." Kissner's exit is sure to add to what many view as a deepening crisis at Twitter following Musk's takeover. Among those that have been axed previously are CEO Parag Agrawal, chief financial officer Ned Segal, legal chief Vijaya Gadde, and general counsel Sean Edgett. Teams affected by Musk's layoffs reportedly include engineering, product teams, and those responsible for content creation, machine learning ethics, and human rights.


How to prepare for ransomware

We know that bad actors are motivated by financial gains, and we are starting to see evidence where they are mining the exfiltrated data for additional sources of potential revenue. For many years, the cyber security community has been saying it’s not a case of “if” you’ll be attacked, but “when”. That being the case, it is important to examine all these phases and make sure that adequate time and effort is allocated to preparing to defend against and prevent an incident, while also conducting the requisite detection, response and recovery activities. IT security leaders should work under the assumption that a ransomware attack will be successful, and ensure that the organisation is prepared to detect it as early as possible and recover as quickly as possible. The ability to quickly detect and contain a ransomware attack will have the biggest impact on any outage or disruption that is caused. The first and most common question is: should the ransom be paid? Ultimately, this has to be a business decision. It needs to be made at an executive or board level, with legal advice. 


The unimon, a new qubit to boost quantum computers for useful applications

To experimentally demonstrate the unimon, the scientists designed and fabricated chips, each of which consisted of three unimon qubits. They used niobium as the superconducting material apart from the Josephson junctions, in which the superconducting leads were fabricated using aluminum. The team measured the unimon qubit to have a relatively high anharmonicity while requiring only a single Josephson junction without any superinductors, and bearing protection against noise. The geometric inductance of the unimon has the potential for higher predictability and yield than the junction-array-based superinductors in conventional fluxonium or quarton qubits. "Unimons are so simple and yet have many advantages over transmons. The fact that the very first unimon ever made worked this well, gives plenty of room for optimization and major breakthroughs. As next steps, we should optimize the design for even higher noise protection and demonstrate two-qubit gates," added Prof. Möttönen.


Data privacy: why consent does not equal compliance

A serious blind spot for brands is caused by consent models. Many organisations assume that obtaining consent from users to collect and process their data ensures compliance. In reality, consent does not equal compliance. Many brands operate under an illusion of compliance, when, in fact, they are routinely leaking personal data across their media supply chain and tolerating the unlawful collection and sharing of data by unauthorised third parties. Research from Compliant reveals that there are a number of ways in which brands are inadvertently putting themselves at risk. For example, our analysis shows that of the 91 per cent of the EU advertisers using a Consent Management Platform (CMP), 88 per cent are passing user data to third parties before receiving consent to do so. While a properly implemented CMP is a useful tool for securing consent, integrating these platforms with legacy technologies and enterprise architectures is clearly a problem. Another risk stems from “piggybacking”, where unauthorised cookies and tags collect data from brand websites without the advertiser’s permission.
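The core fix is mechanical: no third-party tag should receive data until consent for that tag's declared purpose exists. A simplified sketch of such a gate (vendor names, purposes, and the consent model are all hypothetical, and a real CMP integration is considerably more involved):

```python
def fire_tags(user_consent, tags):
    """Only pass data to a third-party tag after the user has
    consented to that tag's declared purpose; block the rest."""
    fired, blocked = [], []
    for tag in tags:
        if user_consent.get(tag["purpose"], False):
            fired.append(tag["vendor"])
        else:
            blocked.append(tag["vendor"])
    return fired, blocked

consent = {"analytics": True, "advertising": False}
tags = [{"vendor": "analytics.example", "purpose": "analytics"},
        {"vendor": "ads.example", "purpose": "advertising"},
        {"vendor": "social.example", "purpose": "social"}]
print(fire_tags(consent, tags))
# (['analytics.example'], ['ads.example', 'social.example'])
```

Note the default-deny stance: a purpose the user was never asked about ("social" here) is treated as unconsented, which is the opposite of how the leaky integrations described above behave.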


Machine learning: 4 adoption challenges and how to beat them

Machine learning algorithms may still behave unpredictably even after they have been trained for data analysis. This lack of predictability becomes an issue when leveraging AI in decision-making leads to unexpected outcomes. As the Harvard Business School reported in its 2021 Hidden Workers: Untapped Talent report, ML-based automated hiring software rejected many applicants due to overly rigid selection criteria. That’s why ML-based analysis should always be complemented with ongoing human supervision. Talented experts should monitor your ML system’s operation on the ground and fine-tune its parameters with additional training datasets that cover emerging trends or scenarios. Decision-making should be ML-driven, not ML-imposed. The system's recommendation must be carefully assessed and not accepted at face value. Unfortunately, combining algorithms and human expertise remains challenging due to the lack of ML professionals in the job market.
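One common way to keep decision-making ML-driven rather than ML-imposed is a confidence gate: only high-confidence recommendations are acted on automatically, and everything else is queued for a human reviewer. A minimal sketch (the threshold and data are illustrative, not a recommendation for any particular domain):

```python
def route(predictions, threshold=0.9):
    """ML-driven, not ML-imposed: auto-accept only confident
    recommendations; queue the rest for human review."""
    auto, review = [], []
    for candidate, confidence in predictions:
        (auto if confidence >= threshold else review).append(candidate)
    return auto, review

# Hypothetical (candidate, model confidence) pairs.
preds = [("A", 0.97), ("B", 0.62), ("C", 0.91), ("D", 0.40)]
print(route(preds))  # (['A', 'C'], ['B', 'D'])
```

Tuning the threshold is itself a human decision: lowering it trades reviewer workload for more unreviewed, potentially rigid automated rejections of the kind the Harvard report describes.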



Quote for the day:

"Good leaders must first become good servants." -- Robert Greenleaf

Daily Tech Digest - November 15, 2022

The Chief Trust Officer Role Can Be the Next Career Step for CISOs

Many CISOs are already unofficially doing the work that comes with the CTrO role, according to Pollard. They are doing customer-facing work, navigating third-party risk management, and focusing on enterprise resilience. “CISOs that spend more time on customer-facing activity, they are at companies that grow faster,” Pollard asserted. “Cybersecurity touches revenue, and security leaders that are able to carve out the time to focus on customer activity help drive hyper growth.” CISOs who are driving growth for their companies are playing an important part on the leadership team, and if they’ve been in the role for a long enough time, it could be time to ask the question “What comes next?” CISOs who have been in their position for 48 months are due for a title-level promotion, according to Pollard. And CTrO is that next step. ... Through his research, Pollard is seeing the CTrO role filled at a number of organizations. Cisco has a chief trust officer. So does SAP. “We're not talking about small, innovative startups. We're talking about goliath businesses that recognize the importance of trust in what they do,” Pollard said.


How regulation of the metaverse could impact your business

The regulatory challenges faced by Web3 are currently much fresher, arguably more nuanced and in some cases, urgent. It cannot be regulated as a single entity, as its multitude of use cases demand a multitude of approaches. Specific rules governing the security and availability of systems, finance, archives, identity and IP rights will need to be set. The good news is that policymakers could leverage Web3’s benefits to impose regulation. As it’s based on decentralisation and automation, it’s not far-fetched to imagine the technology being used to enforce and automate taxation, for example. Currently, Web3 platforms like cryptocurrency exchanges or NFT marketplaces aren’t standardised, with inconsistent UX and language used to communicate concepts. Often, these platforms have little or no duty to educate about safety or establish protections, and while platforms like Coinbase and OpenSea do a good job here, it’s far from the norm and scams are still commonplace owing to lack of understanding.


Private 5G drives sustainable and agile industrial operations

Looking at business outcomes such as sustainability and agility, the partners regard industrial private 5G as an enabler of digital transformation in smart manufacturing to help deliver connected worker applications, mobile asset applications and untethered fixed industrial asset applications. The former are seen as able to increase visibility and intelligence through mobile digital tools, such as analytics, digital twins and augmented reality (AR), while mobile asset applications increase agility and efficiency with autonomous vehicles, such as automated guided vehicles (AGVs) and autonomous mobile robots (AMRs). The consortium’s tests were run according to an established test plan provided by Rockwell Automation with success criteria of zero faults. It outlined a series of test cases to establish reliable Ethernet/IP standard and safety (CIP Safety) I/O connections from a GuardLogix area controller, with a range of requested packet interval (RPI) settings – the rate at which the controller and the I/O exchange data – over the 5G RAN to the FLEX 5000 standard and safety I/O.


Who Moved My Code? An Anatomy of Code Obfuscation

The best security experts will tell you that there’s never an easy, or a single solution to protect your intellectual property, and combined measures, protection layers and methods are always required to establish a good protective shield. In this article, we focus on one small layer in source code protection: code obfuscation. Though it’s a powerful security method, obfuscation is often neglected, or at least misunderstood. When we obfuscate, our code becomes unintelligible, thus preventing unauthorized parties from easily decompiling, or disassembling it. Obfuscation makes our code impossible, or nearly impossible, for humans to read or parse. Obfuscation is, therefore, a good safeguarding measure used to preserve the proprietary nature of the source code and protect our intellectual property. To better explain the concept of obfuscation, let’s take “Where’s Waldo” as an example. Waldo is a known illustrated character, always wearing his red and white stripy shirt and hat, as well as black-framed glasses.


Should security systems be the network?

The appeal and real benefits of having the security systems be the whole network are clearest for smaller and midsized companies. They are more likely to have uniform and relatively simple needs, and also to have thinner staffing. They are more likely to have difficulty affording, attracting, and retaining the talent they need in both security and networking. So, having just one platform to become expert in, one platform to train new staff on or to outsource the management of lets them make the most of the staff they have. The benefits are less clear for larger companies. These tend to have more complex environments and requirements, and are less likely to tolerate the risks of monoculture given they are better able to staff for and support a blended ecosystem. So, should security systems be the network? For smaller organizations, it looks viable with the caveats outlined above. For most larger organizations, I think the answer is currently no. Instead, they should focus on making their network systems a bigger part of the security infrastructure.


Democratization Is The Key To Upskill At Work And Improve ROI

Creating actionable data and analytics programs to educate employees is one of the most effective ways to bridge the skills gap. We have seen successes with executive-sponsored datathons or when companies gamify their learning experience. We also think it’s important for technical data experts to act as mentors to knowledge workers with domain expertise and guide them through the analytics process. We believe this collaboration between technical experts and domain experts will help organizations achieve breakthroughs with their data faster. Finally, analytics needs to be easy, not complex. Organizations should invest in technologies that move away from being highly dependent on writing code. ... Data and analytics generate ROI in many ways. First are the time savings. Organizations that shift from spreadsheet-based processes save several hours per week, sometimes up to a third of their time per worker – multiply this by all the domain experts and knowledge workers still stuck in spreadsheets and you’ve got some serious time savings. This is just the tip of the iceberg.


Top cybersecurity threats for 2023

Disgruntled employees can sabotage networks or make off with intellectual property and proprietary information, and employees who practice poor security habits can inadvertently share passwords and leave equipment unprotected. This is why there has been an uptick in the number of companies that use social engineering audits to check how well employee security policies and procedures are working. In 2023, social engineering audits will continue to be used so IT can check the robustness of its workforce security policies and practices. ... Cases of data poisoning in AI systems have started to appear. In a data poisoning attack, a malicious actor finds a way to inject corrupted data into an AI system that will skew the results of an AI inquiry, potentially returning a false result to company decision makers. Data poisoning is a new attack vector into corporate systems. One way to protect against it is to continuously monitor your AI results. If you suddenly see a system trending significantly away from what it has revealed in the past, it’s time to look at the integrity of the data.
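That continuous monitoring can start very simply: track whether the distribution of model outputs is drifting away from its historical baseline. A hypothetical sketch (the numbers, tolerance, and mean-based statistic are invented for illustration; production systems would use richer distribution tests):

```python
def drift_alert(baseline, recent, tolerance=0.1):
    """Flag when the mean of recent model outputs drifts from the
    historical baseline by more than the tolerated fraction, a
    cheap first signal that inputs may have been poisoned."""
    base_mean = sum(baseline) / len(baseline)
    recent_mean = sum(recent) / len(recent)
    drift = abs(recent_mean - base_mean) / abs(base_mean)
    return drift > tolerance, round(drift, 3)

history = [0.52, 0.49, 0.51, 0.50, 0.48]   # past model outputs
today = [0.71, 0.69, 0.75, 0.70]           # sudden upward trend
print(drift_alert(history, today))  # (True, 0.425)
```

An alert like this does not prove poisoning, but it is exactly the "trending significantly away from the past" trigger described above, and it tells you when to audit the integrity of the training data.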


Corporate execs confident on sustainability goals, admit more work needed

Efforts to achieve sustainability goals can broadly be grouped into several areas: green resources procurement, which includes sustainable energy and water; operational efficiency, which includes the IT value chain, supply chain and other scope 3 emission sources that make up 40% of all greenhouse gas emissions; and end of lifecycle, including circular economy or recycling products to create new ones. For example, data centers and cloud industries tend to focus on green energy procurement (since they use a lot of energy to power data centers) as well as operational efficiency to reduce power usage, according to Abhijit Sunil, a senior analyst with Forrester Research. “Standards are certainly evolving, and more and more organizations are held accountable for their commitments and how they take action towards it,” Sunil said. For example, Sunil noted, government scrutiny will continue to increase, holding more “greenwashers” accountable. Greenwashers are companies that deceptively purport that their products, aims and policies are environmentally friendly.


The office of 2023: Top workforce trends that will shape the year ahead

Roderick believes an overarching theme for the workplace in 2023 will be adjusting how employees work remotely. He says there could be an uptick in surveillance for remote workers that will allow managers to observe productivity, and executives could enforce return-to-office mandates as a reaction to a slowdown in business. ... "The world of work has been through huge changes since the pandemic, and it would be good not to see the positives of this change undone by a recession." Silverglate believes that technology, office redesign, and sustainability will all propel hybrid and remote working in 2023. Video conferencing became a staple in work-from-home practices, but VR is emerging to make the experience more immersive and productive. "When many are in person and a team member needs to be virtual, VR technology can truly reduce the perceived gap between the two, which is one of the largest complaints I've heard about the challenges of traditional video-conferencing technology as it relates to hybrid teams," he says.


From Async Code Reviews to Co-Creation Patterns

The way it goes is that once a developer thinks they are done with coding, they invite other team members to review their work. This is nowadays typically done by raising a Pull Request and inviting others for a review. But because reviewers are busy with their own work items and a plethora of other things happening in the team, they are not able to react immediately. So, while the author is waiting for a review, they also want to feel productive, so rather than twiddling their thumbs, they start working on something else. Eventually, when the reviewer(s) become available and provide feedback on the PR and/or ask for changes, the author of the PR is not available because they are busy with something else. This delayed ping-pong communication can extend over several days or weeks and a couple of iterations, until the author and reviewer(s) converge on a solution they are both satisfied with, which gets merged into the main branch.



Quote for the day:

"How was your day? If your answer was 'fine,' then I don't think you were leading." -- Seth Godin

Daily Tech Digest - November 13, 2022

Cybersecurity leaders want to quit. Here's what is pushing them to leave

Almost a third of chief information security officers (CISOs) and IT security managers in the UK and US are considering leaving their current organization, according to new research. Not only that, but a third are planning to quit their jobs within the next six months. ... many IT security leaders are struggling to keep up with evolving threats and new cybersecurity practices, while also reporting issues around recruitment, retention and work-life balance that are prompting many to turn away from the industry. When asked about the aspect of their role that they disliked most, 30% cited the lack of a work-life balance, with 27% saying that too much time was spent on 'firefighting' rather than addressing strategic business issues. On top of the 32% of CISOs planning a departure due to the stresses of the job, 52% admitted that they are struggling to keep up to date with new frameworks and models such as Zero Trust, while a further 20% felt that having the right skills on their team was "a serious challenge".


Why Is Optimism a Critical Security Skill?

There’s a different way to think about the practice of security: as a vision- and mission-based endeavor. When security practitioners log in each day to start work, they are protecting people they care about: their colleagues, partners and customers. They’re also safeguarding their organizations’ ability to do business in a complex world by delivering vital products and services that others need and ensuring society functions as intended. As a result, security teams are creating a better world for everyone. For employees in organizations, these connections may either be explicit or implied. Security professionals who protect national infrastructure for a government agency, a nonprofit’s ability to deliver aid or an e-commerce firm’s ability to deliver goods will likely see the value in safeguarding their organizations’ business and operations. Yet, countless others provide processes or services that enable the effective functioning of businesses and community life. These professionals, too, should take pride in fulfilling their organization’s vision and mission.


Networking and Data Center Horrors That Scare IT Managers

Carrie Goetz, D.MCO and Principal/CTO at StrategITcom, LLC, and a frequent speaker at Network Computing events, offered up some spooky incidents she has encountered over the years. “There was the case of cleaning people plugging vacuum cleaners into UPS outlets when cleaning the data center and shutting them down. Happened every night sometime between 2 and 4 AM. The only way we caught it was to sit up there at that time.” “Or how about doing an audit of the gear in a data center? The customer thought they had about 2,600 servers, and we found over 3,000 physical machines. Some had not passed a bit of traffic in years.” Talk about a nightmare. She noted, “decommissioning was not in their vocabulary until after the audit.” Another example should send chills down any IT manager’s spine. “We took over a contract for a prison health care provider. They had previously hired another company. When all of the deliveries were late, the customer started investigating and found out that the company was staging their servers in a shed with a dirt floor and no AC running. They kept going up and down, and two failed due to dirt and moisture.”


Hervé Tessler – ‘Cyberattacks can mean total reputational death’

When I joined the business 33 years ago, nobody ever talked about cybersecurity. I don’t recall the word. Everything was physical: what if somebody’s gotten into my flat or I’m afraid someone’s going to take my car, attack me in the street to take my watch or my wallet or whatever. Obviously, the world has become much more digital. All these fears and threats became digital. What I would say over the past few years is that we’ve seen a massive amplification of risk. Large companies have a board and an IT group under the board looking at cybersecurity. They take it very seriously. They are scared to death of any brand damage. They are relatively focused, which is not the same as being well equipped. What I’ve found out over the last six months is that more and more small and mid-sized businesses are paying a lot more attention to cyber. When I open the newspaper, there’s not a day without a small story about a major cyber negative impact on a business. There was a recent cyberattack on a French hospital that sent them back to the Middle Ages – it lasted for months. Of course, SMBs taking cybersecurity seriously is an opportunity for us to help them with this threat.


Cyber criminals have World Cup Qatar 2022 in their sights

The Digital Shadows Photon research team have been tracking cyber threats coalescing around the World Cup over the past 90 days using a specially created alert system. They have found that, broadly, threats to the event can be arranged into four categories – brand protection, cyber threat, physical protection and data leakages. Of these, most of the observed activity relates to the cyber threat category. “Scams could present themselves in many forms,” the Photon team wrote in a newly published online advisory. “For instance, financially motivated threat actors often plant malicious URLs spoofing these events that lead to fraudulent sites, hoping to maximise their chances of scamming naive internet users for a quick, illicit profit. “At the same time, hacktivist groups may exploit the public attention given to such events to exponentially increase the reach of their message. State-sponsored advanced persistent threat (APT) groups may also decide to target global sporting events to achieve state goals against the hosting country or the broader event community.”


Agile or V-Shaped: What Should Be Your Next Software Development Life Cycle Model?

The agile model is known for its flexibility and responsiveness to change. This makes it ideal for projects that are constantly evolving or that require quick turnarounds. However, this flexibility can also be a downside, as it can lead to scope creep and unrealistic expectations. The V-shaped model is more rigid and structured, but this can also be seen as a strength. This model helps to prevent scope creep by clearly defining the deliverables at each stage of the project. It also provides more structure and transparency, which can help to keep stakeholders informed and on track. However, the downside of this model is that it can be inflexible and resistant to change. So, which model is best for your project? Ultimately, it depends on your specific needs and objectives. The agile software development life cycle model is a great choice for small to medium-sized projects, because it offers the flexibility and adaptability that are essential at that scale. So if you need a flexible and responsive approach, the agile model may be the better fit.


Veteran CIOs on leading IT today

It’s a different role as you shift from manager to director. It’s realizing your entire organization is essentially run by someone else tactically, so really backing off and letting them fail or succeed on their own. And then you have to figure out how to focus very differently on alignment, making sure all the leaders are on the same page, because now you have really good leaders and they’re all running in different paths. In IT especially, managers tend to want to hang on to some of the hands-on work after they become directors, and that’s often because they’re promoted due to skill, not leadership. So suddenly they find themselves at a director level and they’re just a really good engineer. They don’t have any other tool in their toolkit except doing it themselves. So it’s hard for them to let others own things completely. It’s really important to get a good mentor or someone in place to help them. Usually what you hear is the horror stories, where someone fails miserably and then they learn and pick themselves up and go again. We’ve got to find a way to prevent that.


IT security: 3 areas to prioritize for the rest of 2022

If companies fail to frequently audit access policies to ensure that external groups can only access the systems they need, this is another avenue that hackers can easily exploit. It’s also essential to immediately cut off access after parting ways with a consultant and periodically confirm that former contractors no longer have access. Ensure an established timeline for auditing access policies – and never allow it to slip. Addressing password hygiene is another critical consideration. According to the most recent Verizon Data Breach Investigations Report, over 80 percent of hacking incidents involved stolen credentials. And studies have repeatedly shown that at least 71 percent of people reuse passwords. If just one of the sites associated with a reused password has been breached, then all other accounts protected by that password are also at risk. With workforce management challenges on the horizon for 2023, it’s essential to implement policies and procedures addressing the inherent security vulnerabilities of the Great Resignation.
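
A periodic audit of external access can be as simple as comparing contract end dates against accounts that are still enabled. The sketch below is a hypothetical illustration: real audits would pull records from an identity provider rather than a hand-built list, and the field names shown are assumptions.

```python
from datetime import date

def flag_stale_access(accounts, today=None):
    """Return users whose contract has ended but whose access is
    still enabled -- candidates for immediate deprovisioning."""
    today = today or date.today()
    return [
        a["user"]
        for a in accounts
        if a["access_enabled"] and a["contract_end"] < today
    ]
```

Running a check like this on a fixed schedule, rather than ad hoc, is what keeps the auditing timeline from slipping.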


Cookies for MFA Bypass Gain Traction Among Cyberattackers

Stealing session cookies has become one of the most common ways that attackers circumvent multifactor authentication. The Emotet malware, the Raccoon Stealer malware-as-a-service, and the RedLine Stealer keylogger all have functionality for stealing session tokens from the browsers installed on a victim's system. In August, security software firm Sophos noted that the popular red-teaming and attack tools Mimikatz, Metasploit Meterpreter, and Cobalt Strike all could be used to harvest cookies from the browsers' caches as well, which the firm called "the new perimeter bypass." "Cookies associated with authentication to Web services can be used by attackers in 'pass the cookie' attacks, attempting to masquerade as the legitimate user to whom the cookie was originally issued and gain access to Web services without a login challenge," Sean Gallagher, a threat researcher with Sophos, stated in the August blog post. "This is similar to 'pass the hash' attacks, which use locally stored authentication hashes to gain access to network resources without having to crack the passwords."
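
On the defensive side, one small mitigation is making sure session cookies carry the hardening attributes browsers support. The sketch below is a minimal, illustrative check of a raw Set-Cookie header; it will not stop malware that lifts cookies straight out of a browser's store, but it flags cookies that are unnecessarily easy to steal or replay.

```python
def audit_set_cookie(header):
    """Report session-hardening attributes missing from a raw
    Set-Cookie header string."""
    attrs = {p.strip().split("=")[0].lower() for p in header.split(";")[1:]}
    missing = []
    if "secure" not in attrs:
        missing.append("Secure")      # cookie may travel over plain HTTP
    if "httponly" not in attrs:
        missing.append("HttpOnly")    # cookie readable by injected scripts
    if "samesite" not in attrs:
        missing.append("SameSite")    # cookie sent on cross-site requests
    return missing
```

Pairing attribute checks like this with short session lifetimes and re-authentication for sensitive actions limits how long a stolen cookie stays useful.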


Scrutinising AI requires holistic, end-to-end system audits

To combat the lack of internal knowledge around how AI systems are developed, the auditing experts agreed on the pressing need for a standardised methodology for how to conduct a socio-technical audit. They added that while a standardised methodology currently does not exist, it should include practical steps to take at each stage of the auditing process, but not be so prescriptive that it fails to account for the highly contextual nature of AI. However, digital rights academic Michael Veale said standardisation is a tricky process when it comes to answering inherently social questions. “A very worrying trend right now is that legislators such as the European Commission are pushing value-laden choices around fundamental rights into SDOs [standards development organisations],” he said ... Another risk of prescriptive standardisation, according to Brown, is that the process descends into a glorified box-ticking exercise. “There’s a danger that interrogation stops and that we lose the ability to really get at the harms if they just become standardised,” he said.



Quote for the day:

"What I've really learned over time is that optimism is a very, very important part of leadership." -- Bob Iger

Daily Tech Digest - November 10, 2022

Building Higher-Quality Software With Open Source CD

Prior to the rise of open source CD solutions, companies often relied on point automation using scripts. These could improve efficiency a bit, but when companies moved from the monolithic architecture of a mainframe or on-premises servers to a microservices-based production environment, the scripts could not be easily adapted or scaled to cope with the more complex environment. This led to the formulation of continuous delivery orchestration solutions that could ensure code updates would flow to their destination in a repeatable, orderly manner. Two highly popular open source CD solutions have emerged: Spinnaker and Argo. Spinnaker was developed by Netflix and extended by Google, Microsoft and Pivotal. It was made available on GitHub in 2015. Spinnaker creates a “paved road” for application delivery, with guardrails to ensure only valid infrastructure and configurations reach production. It facilitates the creation of pipelines that represent a software delivery process. These pipelines can be triggered in a variety of ways, including manually, via a cron expression, at the completion of a Jenkins job or other pipeline, and other methods.


Technical Debt is Quantifiable as Financial Debt: an Impossible Thing for Developers

There are many things about technical debt that can be quantified. Henney mentioned that we can list off and number specific issues in code and, if we take the intentional sense in which technical debt was originally introduced, we can track the decisions that we have made whose implementations need to be revisited. If we focus on unintentional debt, we can look at a variety of metrics that tell us about qualities in code. There’s a lot that we can quantify when it comes to technical debt, but the actual associated financial debt is not one of them, as Henney explained: The idea that we can run a static analysis over the code and come out with a monetary value that is a meaningful translation of technical debt into a financial debt is both a deep misunderstanding of the metaphor – and how metaphors work – and an impossibility. According to Henney, quantifying how much financial debt is present in the code doesn’t work. At the very least, we need a meaningful conversion function that takes one kind of concept, e.g., "percentage of duplicate code" or "non-configurable database access", and translates it to another, e.g., euros and cents.
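
To see the contrast Henney draws, here is a toy example of the kind of metric that genuinely is quantifiable: the percentage of duplicate lines in a source file. It is deliberately crude (real duplication detectors compare token sequences, not raw lines), and notice that nothing in it can emit euros and cents.

```python
from collections import Counter

def duplicate_line_percentage(source):
    """Percentage of non-blank lines that repeat another line --
    a crude but quantifiable proxy for code duplication."""
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    if not lines:
        return 0.0
    counts = Counter(lines)
    duplicates = sum(n - 1 for n in counts.values() if n > 1)
    return 100.0 * duplicates / len(lines)
```

The number it returns is a code-quality signal; translating it into a monetary figure would require exactly the conversion function Henney argues does not exist.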


How industrial IoT is forcing IT to rethink networks

IIoT is redefining the types of data that enterprises use, and how networks process this data. For example, an IIoT network primarily transmits and processes unstructured data, not fixed-record transactional data. In contrast, the corporate network processes data that is far more predictable, digestible and manageable. The sheer bulk and traffic of IIoT data make it virtually a necessity to implement a single, private, dedicated network at each manufacturing facility for use with its IoT. Security is also a concern, because the networks that operate on the edges of the enterprise must often be maintained and administered by non-IT personnel who don’t have training in IT security practices. It’s not uncommon for someone on a production floor to shout a password to another employee so they can access a network resource — nor is it uncommon for someone on the floor to admit another individual into a network equipment cage that is supposed to be physically secured and accessible by only a few authorized personnel.


Cloud architects are afraid of automation

As humans, we’re just not that good. While we have experience driving cars and can look out the front window, we don’t have a perfect understanding of current data, past data, and what this data likely means in the operation and driving of the vehicle. Properly configured automation systems do. For the same reasons that we are anxious when our cars drive away without us actively turning the wheel, we are slow to adopt automation for cloud deployments. Those charged with making core decisions about automating security, operations, finops, etc., are actively avoiding automation, largely because they are uncomfortable with critical processes being carried out without humans looking on. I get it. At the end of the day, automation is a leap of faith that the automated systems will perform better than humans. I understand the concern that they won’t work. The adage is true: “To really screw things up requires a computer.” If you make a mistake in setting these systems up, you can indeed do real damage. So, don’t do that. However, as many people also say: “The alternative sucks.” Not using automation means you’re missing out on approaches and mechanisms to run your cloud systems cheaper and more efficiently.


Cybersecurity, cloud and coding: Why these three skills will lead demand in 2023

As the scale and growth of software development accelerates, and with ongoing AI developments in programming and engineering, the role requirements of software development also look set to change. "AI/ML are changing the world of programming much like the calculator and the computer changed the world," says Stormy Peters, VP of Communities at GitHub. "These technological advancements are taking care of a lot of the mundane, grunt work that developers once had to devote all their time to. Development looks different now." ... As we enter 2023 and software development remains at the heart of business strategies, problem-solving, critical thinking and other human skills will prove integral. "While emerging technologies will increasingly enable them to stay in the flow and solve challenging problems, the technicalities in being able to program, engineer, and develop code through a high level understanding of AI, DevOps, and programming languages will also stay central in importance to the discipline," she adds.


How to effectively compare storage system performance

The best metrics to compare are the ones most applicable to the applications and workloads you will run. If the application is an Oracle database, the performance metric most applicable is 8 KB mixed read/write random IOPS. When the vendor only provides the 4 KB variation, there is a way to roughly estimate the 8 KB results -- simply divide the 4 KB results in half. If the vendor objects, ask for actual 8 KB test results. Use this same simple math for other I/O sizes. Throughput is somewhat more difficult to standardize, especially if vendors don't supply it. You can roughly calculate it by multiplying the sequential read IOPS by the size of the I/O. Latency is the most difficult to standardize, especially when vendors measure it differently. There are many factors that affect application latency, such as storage system load, storage capacity utilization, storage media, storage DRAM caching, storage network congestion, application server load, application server utilization and application server contention. The most important question to ask is how the vendor measured the latency, under what loads and from where. 
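
The back-of-the-envelope math described above is simple enough to capture directly. The helpers below encode the article's rules of thumb; they are rough estimates only, and actual 8 KB test results from the vendor should always take precedence.

```python
def estimate_iops(measured_iops, measured_io_kb, target_io_kb):
    """Scale IOPS inversely with I/O size: doubling the block size
    roughly halves the IOPS, per the rule of thumb above."""
    return measured_iops * measured_io_kb / target_io_kb

def throughput_mb_per_s(sequential_read_iops, io_size_kb):
    """Approximate throughput as sequential read IOPS x I/O size."""
    return sequential_read_iops * io_size_kb / 1024

# A vendor quotes 400,000 IOPS at 4 KB; the 8 KB estimate is half:
print(estimate_iops(400_000, 4, 8))       # 200000.0 IOPS
print(throughput_mb_per_s(100_000, 4))    # 390.625 MB/s
```

These estimates say nothing about latency, which, as noted, depends on load, caching, network congestion and how the vendor measured it in the first place.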


8 secrets of successful IT freelancers

An often-overlooked skill is having the knowledge, courage, and ability to steer the client in the right direction. “The customer wants to use the freelancer’s experience and proactivity, therefore it’s very important that the IT freelancer states his or her true opinion when he or she thinks that the customer is moving in the wrong direction,” says Soren Rosenmeier, CEO of Right People Group, a firm that matches clients with IT and business consultants. Don’t jump the gun, however. Before offering any crucial advice, it’s important to have a complete understanding of the issue at hand. “There might also be a lot of other factors … in the organization that the IT freelancer is unaware of,” Rosenmeier notes. Therefore, prior to offering a suggestion, it’s important to first listen to exactly what the client wants. If the IT freelancer is honest and upfront, the client will receive the benefit of hiring a highly experienced expert, including insights from all the experience the freelancer has gained by working with many other organizations. “At the same time, the customer gets the simplicity and the execution that they want from an external expert that’s hired in to do a specific job,” Rosenmeier says.


In a managed service model, who is responsible and accountable for data?

When it comes to ensuring compliance, since accountability always lies with the business, it is essential to ensure that the MSP is compliant before outsourcing any data management functions. Before this can be done, however, it is essential to establish what exactly needs to be complied with. This is often the most difficult question, with a myriad of regulations and legislation applicable depending on the sector and regions the business operates in. There are two pillars to consider when engaging with an MSP in regard to responsibility for data management: one is data availability and recovery, and the second is the retention of data. However, the requirements for compliance, and ultimately accountability, in each will depend on the individual business. This means that before your data can be deemed compliant, you need to understand what that means for your business and have a framework in place that outlines this.


What the experts say about the cybersecurity skills gap

In terms of the skills that are needed, all three cybersecurity leaders agreed that there are various technical skills necessary, similar to any IT role. However, Killian pointed out that not every cybersecurity role is purely a technical one. “Technical skills are usually easier to learn than other important skills like curiosity, ability to ‘play’ in the grey – security issues are rarely obvious ‘yes or no’ problems to solve – and the ability to build relationships with stakeholders. So, unless technical skills are required for the role at hand, they should be prioritised appropriately in job postings,” she said. ... Naidoo reaffirmed that great attitudes and high aptitudes are essential as “technical skills can be taught”. However, she also said it’s important to keep on top of how the tech industry is evolving. “Whatever technical skills are needed in the industry, a corresponding security skill is necessary to secure that technology. So, whether that’s blockchain, quantum or artificial intelligence, or even traditional functions like networks, operating systems and databases, one needs to understand these technologies in order to properly secure them.”


How organisations can right-size their data footprint

“Going on a data diet can be healthy. Cutting out all that junk data that bloats our systems costs us money, raises our data risks and distracts us from the nutritious data that will help us grow. Sometimes, less is truly more.” To reduce data risks and identify useful data, organisations can create synthetic data, which is artificially created data with similar attributes to the original data. According to Gartner, synthetic data will enable organisations to avoid 70% of privacy violation sanctions. Parker said: “If you have sensitive customer data that you want to use but you can’t, you could replace it with synthetic data without losing any of the insights it can deliver.” She added that this could also facilitate data sharing across countries and in industries such as healthcare and financial services. In the UK, for example, the Nationwide Building Society used its transaction data to generate synthetic datasets that could be shared with third-party developers without risking customer privacy, she said. Parker said synthetic data will also enable organisations to plug gaps in the actual data used by artificial intelligence (AI) models. 
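
As a toy illustration of the concept, synthetic values for a single numeric column can be drawn from a distribution fitted to the real data. This is a deliberately simplistic sketch; real synthetic-data generators model joint distributions and correlations across many columns, which is what preserves analytical value while protecting privacy.

```python
import random
import statistics

def synthesize_column(values, n, seed=0):
    """Draw n synthetic values that mimic the mean and spread of a
    real numeric column without reproducing any actual record."""
    rng = random.Random(seed)
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [rng.gauss(mu, sigma) for _ in range(n)]
```

The synthetic column can then be shared with third parties for development or testing, much as in the Nationwide example, without exposing the underlying customer records.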



Quote for the day:

"Leverage is the ability to apply positive pressure on yourself to follow through on your decisions even when it hurts." -- Orrin Woodward