Daily Tech Digest - December 09, 2022

Why CIOs must think of themselves as products—and hostage negotiators

Despite the growing breadth of the CIO’s role, Tyler believes that technology executives can go further still, extending their value and influence within the organization by thinking of themselves less as a service provider and more as a ‘powerful, valuable product’ that senior executives, partners and peers need to do their jobs effectively. “Product value proposition in business terms is something we develop when we’re trying to create the next generation of goods and services for our customers or citizens—any stakeholder we’re working with,” he said ... “Your leadership is a product that all of your executive team, partners, peers, and all of the people in your organization need,” he said. Tyler also suggested that CIOs must build their own product value proposition to deliver the maximum value to the business, and make a promise to stakeholders of how technology will help them achieve their desired outcomes, adding that technology leaders can take simple steps to start by understanding who consumes IT, by deeply understanding their jobs, and by learning how IT can remove pains and create gains.


Digital transformation trends in 2023

Automation adoption is often viewed as one of the best methods that companies can employ to streamline processes and boost revenues without causing costs to spiral out of control. Recognising the benefits that can be brought by automation, 54 per cent of organisations have already begun implementing robotic process automation (RPA) into their processes, according to a Deloitte survey. With the economy in such dire straits, and the landscape continuing to look so grim for businesses, it is highly likely that we will see a greater number of companies investing in this technology than ever before. Not only will implementing automation be an affordable alternative to investing in a full digital transformation project for many businesses, it will also provide a basis upon which to create new efficiencies in the years to come. Indeed, Gartner predicts that, by 2024, hyperautomation will enable organisations to lower their operational costs by 30 per cent. At a time of great economic uncertainty, when every penny that businesses spend needs to be clearly justified, automation is a proven and dependable cost saver.


5 Ways to Embrace Next-Generation AI

While AI can solve big problems, it doesn’t have to do so all at once. “In the projects I’ve seen successful from our customers or internally, it is just getting moving with the tools you have or small investments and then growing from there,” Mark Maughan, chief analytics officer at cloud business intelligence platform Domo, asserted. “Just get moving. Get started. Test, learn, grow, and iterate.” ... A significant part of gaining that buy-in is having the talent that can communicate effectively with the various stakeholders. Leveraging storytelling to illustrate the problem and how AI can solve it is a powerful tool. The earlier organizations engage relevant stakeholders, the more likely the project is to be successful. That ability to communicate may or may not come naturally, but it can be learned. “One thing that we've always found very helpful is either rotations or shadowing for anyone in the data science analytic organization around the different business and operational stakeholder groups to get a better understanding of what is actually going on,” Finnerty said.


The cybersecurity challenges and opportunities of digital twins

Unfortunately, while CISOs should be key stakeholders in digital twin projects, they are almost never the ultimate decision maker, says Alfonso Velosa, research vice president of IoT at Gartner. “Since digital twins are tools to drive business process transformation, the business or operational unit will often lead the initiative. Most digital twins are custom-built to address a specific business requirement,” he says. When an enterprise buys a new smart asset, whether a truck, backhoe, elevator, compressor, or freezer, it will often come with a digital twin, according to Velosa. “Most of the operational teams will need a streamlined and cross-IT—not just CISO—set of support to integrate them into their broader business processes and to manage security.” If proper cybersecurity controls aren’t put in place, digital twins can expand a company’s attack surface, give threat actors access to previously inaccessible control systems, and expose pre-existing vulnerabilities. When the digital twin of a system is created, the potential attack surface effectively doubles—adversaries can go after the systems themselves or attack the digital twin of that system.


RegTech Can Help Solve ESG Data Management and Trust Challenges

The benefits that RegTech offers sustainability data officers go beyond mere automation of processes; they also include a guarantee that the data will be seamlessly integrated into a financial institution’s broader data assets and can ensure that it is traceable and auditable. Katie Carrasco, head of ESG at Global Innovation Fund, an impact investment vehicle, agreed, saying that RegTech could provide a firm with a 360-degree view of its ESG data, something critical to good governance, which in turn is important for accurately identifying opportunities and risks. Data traceability and auditability will also provide for credibility, argued Mary Anne Bullock, global strategic account director for Solidatus. At a time when greenwashing is making headlines and undermining the ESG project, Bullock said that being able to trace data from source to use-case would help demonstrate its veracity, offer transparency into firms’ activities and build trust. Building trust comes down to credible metrics, said Seethepalli, and that would come when auditability is added into the data management mix.


As Complexity Challenges Security, Is Time the Solution?

"Complexity leaves me in a very depressing place; complexity is just forever increasing," said Moss, who's the founder of Black Hat and regularly opens the conference by detailing leading challenges as well as potential solutions. For addressing complexity, he said, "time has got me pretty excited." Simply put, being strategic about doing things faster - including detection and recovery - gives organizations one tactic to blunt the impact of increased complexity. Not all complexity involves technological evolution, such as malware built to better evade defenses, or criminals wielding zero-day exploits. Last week, researchers debuted ChatGPT, a prototype conversational AI chatbot that can sometimes appear to be human. This means added complexity for security professionals, since many security tools use attackers' poor command of English to detect and block phishing attacks. Expect criminals to soon use tools such as ChatGPT to write lures that seem to have been crafted by a native speaker, said Daniel Cuthbert, a veteran cybersecurity researcher who's a member of the U.K. government's new cyber advisory board.


Meta’s behavioral ads will finally face GDPR privacy reckoning in January

If Meta is forced to ask users if they want “personalized” ads (its favored euphemism for surveillance ads), that is definitely big news — given that rates of denials when web users are actually given a choice over targeted ads are typically very high. The crux of noyb’s original complaints against Meta services was that users were not offered a choice to deny its processing for advertising — despite the GDPR stipulating that if consent is the legal basis being claimed for processing personal data, it must be specific, informed and freely given. However — plot twist! — it later emerged that as the GDPR came into application, Meta had quietly switched from claiming consent as its legal basis for this behavioral advertising processing to saying it is necessary for the performance of a contract — and claiming users of Facebook and Instagram are in a contract with Meta to receive targeted ads. This argument implies that Meta’s core service is not social networking; it’s behavioral advertising. Max Schrems, noyb’s honorary chairman and long-time privacy law thorn in Facebook’s side, has called this an exceptionally shameless attempt to bypass the GDPR.


Introduction to Interface-Driven Development (IDD)

This concept already existed in some areas, such as protocol-oriented programming in Swift or interface-based programming in Java, and it is based on Design by Contract by Bertrand Meyer, described in his book “Object-Oriented Software Construction”. In the book, he discusses standards for contracts between a method and a caller. Hunt and Thomas also rely on a similar concept in their book “The Pragmatic Programmer”, in the section on prototyping architecture: “Most prototypes are constructed to model the entire system under consideration. As opposed to tracer bullets, none of the modules in the prototype system need to be particularly functional. What you are looking for is how the system hangs together as a whole, again deferring details.” The problem this process aims to solve is that components are often vaguely defined during design, and we tend to give some components more responsibility than necessary. The usual result of such a design is bad, untestable code.
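
To make the interface-first idea concrete, here is a minimal sketch in Python using `typing.Protocol` (Python's structural analogue of a Swift protocol or Java interface). The `ReportStorage` contract, `InMemoryStorage` prototype, and `archive_report` function are illustrative names, not taken from the books cited above: the point is simply that the contract is agreed on first, callers depend only on it, and a throwaway implementation can stand in while the design is validated.

```python
from typing import Protocol


class ReportStorage(Protocol):
    """The contract agreed on during design, before any real implementation exists."""

    def save(self, name: str, content: bytes) -> None: ...

    def load(self, name: str) -> bytes: ...


class InMemoryStorage:
    """Throwaway prototype; it satisfies ReportStorage structurally, no inheritance needed."""

    def __init__(self) -> None:
        self._data: dict = {}

    def save(self, name: str, content: bytes) -> None:
        self._data[name] = content

    def load(self, name: str) -> bytes:
        return self._data[name]


def archive_report(storage: ReportStorage, name: str, content: bytes) -> None:
    # Callers are written against the interface, which keeps each component's
    # responsibility narrow and makes the code testable with fake implementations.
    storage.save(name, content)
```

A real backend (a database, object store, and so on) can later replace `InMemoryStorage` without touching `archive_report`, which is exactly the "how the system hangs together" question the prototype is meant to answer.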


Leveraging the full potential of zero trust

In line with the motivations behind cloud migration, Zscaler found that a focus on wider strategic outcomes is missing from how organizations are planning emerging technology initiatives. Regarding the single most challenging aspect of implementing emerging technology projects, 30% cited adequate security, followed by budget requirements for further digitization (23%). However, only 19% cited dependency on strategic business decisions as a challenge. While budget concerns are natural, the focus on securing the network while ignoring strategic business alignment suggests organizations are focused on security without a full understanding of its business benefit, and that zero trust itself is not yet understood as a business enabler. “The state of zero trust transformation within organizations today is promising – implementation rates are strong,” said Nathan Howe, VP of Emerging Tech, 5G at Zscaler. “But organizations could be more ambitious. There’s an incredible opportunity for IT leaders to educate business decision-makers on zero trust as a high-value business driver, especially as they grapple with providing a new class of hybrid workplace or production environment reliant on a range of emerging technologies, such as IoT and OT, 5G and even the metaverse.”


Going from Architect to Architecting: the Evolution of a Key Role

A fundamental principle of today’s software architecture is that it's an evolutionary journey, with varying routes and many influences. That evolution means we change our thinking based on what we learn, but the architect has a key role in enabling that conversation to happen. ... The architect no longer plays a solo role in the software development game. Architecting a system is now a team sport. That team is the cross-functional capability that now delivers a product and is made up of anyone who adds value to the overall process of delivering software, which still includes the architect. Part of the reason behind this, as discussed earlier, is that the software development ecosystem is a polyglot of technologies, languages (not only development languages, but also business and technical ones), experiences (development and user) and stakeholders. No one person can touch all bases. This change has surfaced the need for a mindset shift for the architect; working as part of a team has great benefits but in turn has its challenges.



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kanter

Daily Tech Digest - December 08, 2022

NASA's next-gen robot will explore space and do your chores at home

The robot will be utilized in three sectors: commerce, space and personal home use -- in that specific order, from structured to unstructured environments. "Structured means you can control the environment," Cardenas says. "Unstructured means the environment is very dynamic – and there's no more dynamic environment than the home, right?" Before Apollo can become your newest family member, the robot has to be affordable, safe and agile enough to operate in such a dynamic environment. ... "One of NASA's goals is not just to develop technology for space exploration," said Azimi. "We also want these technologies to be available for use on Earth and that the outcome of the development projects that we undertake with our partners will be available to as many people as possible, to the maximum benefit of humanity in general." One major way Apollo will be able to help humanity is by supporting the commercial sector. Apollo will mitigate supply chain issues by doing the jobs that people don't necessarily want to do but are still vital to sustaining industry and the economy.


Five Actionable Success Tips for Security Professionals in 2023

Have a personal incident response plan - We all have CIRT/SIRT teams, major incident response plans and playbooks, but how many of us consider the real personal impact if we need to deploy these plans? Everyone has a home life, and they will differ greatly, but no one can run 24/7. Some of us have caring responsibilities, and we all get stressed. ... Turn the camera on/go into the office (if you have one) - This is about making connections and re-connecting in a post-pandemic world. Lots of us may have lost our offices but it’s so important to try and keep the human connectivity within technical professions. ... Know your business - This is focused on understanding how the business you work for makes its money. Working in cybersecurity and GRC, we are keen to see risks mitigated and controls applied, but the biggest risk to a business is that it doesn’t survive, and we need to be clear that our job is to help the business grow by protecting what it cares about and being trusted advisors, not the people who say “no.”


The Hidden Cost of Software Automation

Nothing is free. Even after we automate a process, it is not free; we shift the entire cost from manual work to the cost of creating and maintaining the automation. It is the maintenance cost of automation that often gets neglected. Even assuming the automation code never needs to change or improve, there’s still a need to upgrade the tools or libraries from time to time. These are all future overheads that are not present when creating the automation. ... After a while, the people who created the automation are no longer on the team. Nobody feels it’s a problem, as it still works. The people who created it don’t even remember it well, as it was created “centuries” ago. But like any software, nothing lasts. One day, something needs to change. This automated code is like a black box to the entire team. We cannot evolve it incrementally. The only option is to recreate everything from scratch without a reference for what was done internally in the past. The bigger the automation, the costlier it is to rebuild. The cost of lost context has no warning sign. It’s a time bomb, and the question is not if, but when it will explode if we don’t attend to it.


The Future of Technology Depends on the Talent to Run it

First, we need an upskilled team, trained in the technical competencies that will allow us to create and upgrade our products or services. We also need a consistent team, AKA low turnover. Our team should benefit from the institutional knowledge that comes with longer staff tenure. This means that we need to keep our staff happy enough to stay onboard, but it also means that we need to be scaling our teams thoughtfully, not recklessly. We’ve seen how companies that grow too quickly can end up suffering from sudden layoffs. This impedes company success, both because of a shrunken staff and because workers on the job market will be less interested in working for companies that could let them go with little notice. And finally, to get to the point where we have a highly skilled team with low turnover, we need to streamline our hiring and onboarding processes.


Is Your Data Team Enabled To Deliver The Killer Punch?

Even financially, it makes more sense to embed data into the product teams. Traditionally, data teams are treated as cost centers while product teams are treated as profit centers. When we nurture an aggressive ambition to leverage data as a differentiator and to identify possible new revenue opportunities, it’s ironic to keep data in cost centers that are highly vulnerable to cost-cutting and first in line to get hit by industry slowness. It’s akin to cutting off a limb and taking up a driving job! This dilemma about “centralized or federated” data teams doesn’t have a cookie-cutter answer; it’s a function of organizational maturity. A centralized model is a foundational step; it helps to identify, establish and refine the scope, process and guidelines, and, more essentially, to harvest niche data talent. The journey commences here, but it shouldn’t end here; it should evolve. The federated model comes next: the product teams have an embedded data component, similar to an Agile team having a functional tester. Certain non-negotiables, such as data privacy (e.g., GDPR), security, data governance, and cross-product features, will require representatives from product teams to come together to establish and implement enterprise guidelines.


3 Essential Tips for Adopting DevSecOps

Build generation is the best time to include a scan that checks to see if new vulnerabilities have been added. This scan should check the entire application, not just the new code. Adding this check to the pipeline will force developers to update and patch vulnerabilities in order for the pipeline to run. ... A good observability setup is not just for monitoring application health. It can also be helpful for identifying security issues. For example, a spike in traffic to an endpoint can signal an attack. Therefore, you want to create intelligent alerts that combine information about access sources, failed access attempts, operating systems and databases. Along with these alerts, you can add some predefined actions to prevent an attack from taking down your application. For example, try to figure out your app’s average usage and block or redirect access if you get an unexpected spike. But make sure that you’re on the same page with marketing and other departments so that you can properly prepare and change your limits when a spike is detected or predicted.
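
The "figure out your app’s average usage and react to unexpected spikes" advice can be sketched in a few lines. This is a minimal, illustrative Python example (not from the article): a `SpikeDetector` that counts requests in fixed time windows and flags the current window when it exceeds a chosen multiple of the average of recent windows. Real deployments would use their monitoring or WAF tooling rather than hand-rolled code, and would tune the window, history and threshold to their own baseline.

```python
from collections import deque
from typing import Optional
import time


class SpikeDetector:
    """Flag a window whose request count exceeds `threshold` times the
    average of the last few completed windows (a crude usage baseline)."""

    def __init__(self, window_seconds: float = 60.0,
                 history_windows: int = 10, threshold: float = 3.0) -> None:
        self.window_seconds = window_seconds
        self.threshold = threshold
        self.history = deque(maxlen=history_windows)  # counts of past windows
        self.current = 0                              # count in the open window
        self.window_start: Optional[float] = None

    def record(self, now: Optional[float] = None) -> bool:
        """Record one request; return True if the open window looks like a spike."""
        now = time.monotonic() if now is None else now
        if self.window_start is None:
            self.window_start = now
        if now - self.window_start >= self.window_seconds:
            # Close out the finished window and start a new one.
            self.history.append(self.current)
            self.current = 0
            self.window_start = now
        self.current += 1
        if not self.history:
            return False  # no baseline yet, nothing to compare against
        baseline = sum(self.history) / len(self.history)
        return baseline > 0 and self.current > self.threshold * baseline
```

When `record()` returns `True`, the predefined action the article describes (blocking, redirecting, or paging someone) would kick in; raising the threshold during a planned marketing campaign is exactly the cross-department coordination the excerpt warns about.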


Complexity is the enemy of cloud security

Most IT shops don’t consider complexity a significant metric to track when researching cybersecurity or cloud security. It’s often neglected because most security is a siloed set of processes. The architecture teams look at security as a black box where stuff is tossed over a wall and somehow magically becomes secure. We’ve needed to integrate security with development, architecture, and operations for a long time. Some organizations practice devsecops (development, security, and operations) and integrate these concepts, bringing everyone’s expertise to bear on all problems. In an ideal world, security is never somebody else’s problem because the lines of demarcation between development, architecture, security, and operations do not exist. Everyone works together across all development, design, and deployment aspects. Security is systemic to everything, which is the correct way to view it. When security is everywhere, it also becomes a factor when defining core cloud and non-cloud architectures, including the amount of complexity introduced and how to effectively manage it.


How Can Emerging Technology Actually Drive Value for Companies?

There is a connection between advancing data management analytics practices and the ability to derive value from emerging technologies. The most successful companies understand how to turn emerging technology into action. The framework for making emerging technology actionable begins with the question: Is the technology ready for your company? “Can it do what your business needs it to do?” Hopkins asked. Next, leaders need to consider if their companies are ready for the technology. “We really think about three maturity windows in which the emerging technologies will deliver return on investment,” Hopkins said. Already, some of these technologies are being widely used in companies today. “There's cloud data computing and natural language processing. Those things are delivering benefits for mainstream average firms today,” Hopkins pointed out. Others on the list, like explainable AI, edge intelligence and intelligent agents, are two to four years out for most firms, according to Hopkins. TuringBots, Web3, and extended reality could be five or more years out.


IT leaders adjust budget priorities as economic outlook shifts

These days, IT leaders are keeping a closer eye than usual on pricing, and in some cases are buying out their long-term cloud contracts to give themselves more flexibility. “Executive leadership doesn’t want to hear we’re locked in and can’t move,” US Silica’s Piddington says. Vendors “want to true you up but never want to true you down,” he adds, and shorter-term contracts can help incent them to do so. ... Although the supply-chain shortage and other factors have caused prices to increase for two or three years now, IDC’s Minton says IT buyers have had enough. “There’s pushback now,” he says, and when there was once more tolerance for the reasons behind vendor price increases, IT leaders are now saying they just can’t keep pace and must keep budgets within a narrow range. Piddington agrees, saying that the situation is forcing IT executives to “be smarter” and understand where the opportunities are within each vendor relationship to “pull the right levers.” Having strong relationships with vendors, and not just engaging in transactional deals, can “give you more potential” to create the flexibility to work with them on pricing.


Australia to develop new cyber security strategy

It would be unreasonable to expect to see detailed policy proposals, given that the minister was announcing work to develop a strategy, not the strategy itself. But her stated goal is to make Australia “the world’s most cyber-secure country by 2030”. O’Neil listed four ways that the government plans to make that happen: bringing the nation into the fight to protect citizens and the economy; strengthening international engagements so that Australia can be a global cyber leader; strengthening critical infrastructure and government networks; and building sovereign cyber security capabilities. During questions after the address, O’Neil said: “We’re not spending enough on cyber defence at the moment. One of my challenges is how we are going to address that problem.” She noted that securing government infrastructure will be expensive. The minister appeared to be calling for bipartisan support for the development and implementation of the strategy when she said: “Many in the opposition are good, thoughtful people who know that the approach we are taking – strong, serious, depoliticised – is how we make our country safer.”



Quote for the day:

"Leadership is a way of thinking, a way of acting and, most importantly, a way of communicating." -- Simon Sinek

Daily Tech Digest - December 07, 2022

4 Ways CFOs Can Mitigate Costs of Poor Data Management

CFOs can also join forces with their IT counterparts to elevate security procedures as part of the company ethos (without detracting from employee productivity). Incentivizing employees to be mindful of the data security and data management lapses that could lead to financial impacts is one way to jump-start this effort. ... When approving IT expenditures, CFOs have a great opportunity to ensure the emphasis is on data management projects that reduce financial risk and prevent waste of resources. For example, it may be worth investing in the establishment of a single sign-on for company employees. A single sign-on allows access to company data to be quickly turned off upon an employee’s departure. In general, tools that speed up the response time to vulnerabilities and reduce the attack surface — and hopefully stop breaches before they happen — are worth prioritizing. Freeware for data sanitization exists, but enterprise-grade tools provide assurances, such as certificates of erasure, which equate to less risk. Also, automating the different stages of data management processes not only increases productivity but can also significantly expedite the recycling or disposal of assets, mitigating storage issues and security risks.


The Unheard Story of Lost Anonymity

Although there are data privacy regulations in the picture, it is expected that pieces of our information will fall into some wrong hands through organization acquisitions, data breaches or data theft. Have you at any point asked yourself why banks you have never opened an account with flood you with calls offering loans and credit cards? Or why you receive countless spam messages from unknown numbers asking you to update your KYC? How do these people you never shared your information with know your full name and your number? It is important to understand that your number is not simply a number. It is connected to a lot of information that may be sensitive—for example, your employer information, bank balance, personally identifiable information (PII) or maybe even personal health information. This information may originate from data you provided to a bank, to an e-recharge website or to a retail/e-commerce store where you might have made a purchase; from that point onward, however, your consent does not make any difference. Your information could be sold to anyone, from a marketing agency to criminals looking for targets.


What is ChatGPT and why does it matter? Here's what you need to know

Despite looking very impressive, ChatGPT still has limitations. Such limitations include the inability to answer questions that are worded in a specific way, requiring rewording before the model understands the question. A bigger limitation is a lack of quality in the responses it delivers -- which can sometimes be plausible-sounding but make no practical sense or can be excessively verbose. Lastly, instead of asking for clarification on ambiguous questions, the model just takes a guess at what your question means, which can lead to unintended responses. Already this has led developer question-and-answer site Stack Overflow to at least temporarily ban ChatGPT-generated responses to questions. "The primary problem is that while the answers that ChatGPT produces have a high rate of being incorrect, they typically look like they might be good and the answers are very easy to produce," say Stack Overflow moderators in a post. Critics argue that these tools are just very good at putting words into an order that makes sense from a statistical point of view, but they cannot understand the meaning or know whether the statements they make are correct.


HHS: Web Trackers in Patient Portals Violate HIPAA

The warning from the department's Office of Civil Rights comes months after revelations that medical providers have used free web user tracking code offered by Facebook and Google in websites frequented by patients. Facebook parent Meta faces a proposed class action alleging it violated privacy law by collecting patient information via its Pixel tracker, including data on doctors, conditions and appointments. At least three major healthcare organizations in recent weeks have treated their previous use of web tracking code as a reportable data breach. ... "Providers, health plans, and HIPAA-regulated entities, including technology platforms, must follow the law. This means considering the risks to patients' health information when using tracking technologies,” said HHS OCR Director Melanie Fontes Rainer in a statement. The bulletin specifies that trackers embedded into login pages such as a patient or health plan beneficiary portal or a telehealth platform are particularly susceptible to transmitting protected health information if they contain trackers.


The Case for Transparency in Data Collection

The relationship between consumers and data transparency (or in some cases, lack of transparency) is not unique to internet marketing. Parallels can be drawn between online data transparency and methods used for years by retailer loyalty programs. Long before the internet, enrolling in a loyalty program gave the issuer access to a consumer’s personal spending habits, geographic spending data, and other personal data -- and consumers rarely read the fine print in their agreements. The reality is, reading and taking the time to digest privacy policies is a huge ask, especially in the context of the internet, which has become synonymous with instant gratification. One study found it takes more than 200 hours -- longer than a typical work month -- to read the average privacy policy word-for-word on the websites we visit each year. Although that is an entertaining statistic, most consumers do not have any idea what they are saying yes to when signing into apps or agreeing to a website’s terms of service. They are blissfully unaware of exactly how companies use consumer data to test marketing campaigns, improve the customer journey, or share with third parties. 


Tone From The Top: How Top-Down Inspires Bottom-Up In PrivSec

Simply put, privacy and security are everyone's responsibility in the modern workplace. Gone are the days when cybersecurity "belongs" to the IT department and privacy "lives" in the legal team. Physical security and guest privacy begin at the reception desk, social engineering defenses start in the call center, and business email compromise (BEC) prevention starts in the finance office. Privacy and security are role-based these days, and as such, you need to start right at the bottom and work your way up. ... In working your way up the chain, it generally goes well until you hit the complexities of divisions with certain powers to bypass policy, often at the executive level. When any member of an executive team regularly avoids adhering to policies, it undermines the policy that has been put in place. Often, this is exercised under the guise of "being agile" or "responding to demand" in the quickest way possible, but it does far more damage than it does good. Effectively in these situations, the executive has stopped playing by the company rules. And when this happens, it very easily and very rapidly rubs off on the rest of the staff and paves the way for "normalization of deviance." 


What you should know when considering cyber insurance in 2023

Security advisors and consultants say they see insurers asking more questions of those seeking insurance policies. They’re requiring proof that applicants have achieved certain levels of security hardening, such as SOC 2 compliance. They’re reviewing security strategies and policies as well as security training and awareness programs. “Insurance companies are taking a closer look at all of those,” Wilkison says. This in turn has required more involvement from enterprise security leaders in the insurance procurement process. ... CISOs may also have to make adjustments to their strategies based on insurer demands. “If you want to get your claim, you usually have to use their panel of vendors or follow their procedures,” says Michael Pisano, a managing director at global consulting firm Protiviti. For example, they will be required to have detailed response and recovery plans in place—in the event of an incident, insurers want clients to meet specific requirements, such as which lawyers should be used and what forensics should be performed, and by whom. 


How to Build Privacy By Design Into Customer Experience

The demands for data privacy are growing and there is no turning back. But is it too late to make a real difference? “We need to push back on the thinking that privacy is dead,” says Baber Amin, COO of Veridium, an integrated identity management platform provider. “It is not dead. In fact, more than ever, it needs to be nurtured and thought through in light of modern technology. A good example of not giving up is [the US Supreme Court case] Carpenter v United States.” The question remaining in this discussion is: Are companies ready, willing, and able to provide data privacy protections? ... “The proper way to go forward is through transparent privacy policies that notify users about the data and information we collect,” says Apu Pavithran, CEO of Hexnode, a device management company. “Transparency is the key if you want to generate trust and build a more valuable connection with consumers. Building trust via openness requires time and effort, but it can help firms outperform their competitors in terms of sales, revenue, and marketing ROI.”


Value-creating chief data officers: Cementing a seat at the top table

Across every industry, we found more CDO appointments have been made since our last study. The heavily regulated financial services industry—where effective use of data is vital for both reporting and compliance—continues to set the bar. Just over half of banks and insurers now have a CDO in place, a number that accounts for 22% of CDOs globally. But although banks (25), capital goods firms (18), and software firms (13) accounted for the most CDO appointments this year, household and personal products, automotive, food and beverage, and retail organizations saw the highest year-on-year increase in the proportion of companies with a CDO. Regardless of industry, CDO growth is still being driven by the largest companies—those with multimillion-dollar revenues and the largest head count. This is likely due to their greater organizational and technological complexity. However, CDO appointments are on the rise across businesses of all sizes. The emergence of CDO positions in midsized firms suggests the role is beginning to be more widely recognized as a useful way to help executive teams pursue business growth.


What Is Code Churn?

Code churn, also known as code rework, is when a developer deletes or rewrites their own code shortly after it has been composed. Code churn is a normal part of software development, and watching trends in it can help managers notice when a deadline is at risk, when an engineer is stuck or struggling, when a code area is becoming problematic, or when issues concerning external stakeholders come up. It is common for newly composed code to go through multiple changes. The volume and frequency of code changes in a given period of time can vary due to several factors, and code churn can be good or bad depending on when and why it is taking place. ... Code churn varies depending on many factors. For instance, when engineers work on a fairly new problem, churn would most likely be higher than the benchmark, whereas when developers work on a familiar or relatively easier problem, it would most likely be lower. Churn can also vary depending on the stage of a project in the development lifecycle. Hence, it is important for engineering managers and leaders to develop a sense of the patterns or benchmarks of churn level for different teams and individuals across the organization.
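The measurement behind these trends can be sketched in a few lines. The sketch below is a hypothetical illustration rather than a reference implementation: the function names are my own, and it assumes per-commit added/deleted line counts (as produced by `git log --numstat`) have already been parsed into dictionaries upstream. It defines churn simply as the fraction of written lines that are later deleted or rewritten within the window.

```python
from collections import defaultdict

def churn_by_file(commits):
    """Sum added/deleted line counts per file across a window of commits.

    `commits` is a list of dicts mapping file path -> (added, deleted),
    e.g. parsed upstream from `git log --numstat`.
    """
    totals = defaultdict(lambda: [0, 0])
    for commit in commits:
        for path, (added, deleted) in commit.items():
            totals[path][0] += added
            totals[path][1] += deleted
    return dict(totals)

def churn_rate(added, deleted):
    """Fraction of written lines later deleted or rewritten in the window."""
    return deleted / added if added else 0.0
```

Consistent with the excerpt's advice, a rate like this is most useful when compared against a team's or project phase's own historical baseline, not against a fixed universal threshold.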



Quote for the day:

"Listening to the inner voice, trusting the inner voice, is one of the most important lessons of leadership." -- Warren Bennis

Daily Tech Digest - December 06, 2022

Stealth Data Collection Threatens Employee Privacy

It’s no secret that collecting sensitive information comes with risks, says Alan Brill, senior managing director of the cyber risk practice at business advisory firm Kroll. “You may be collecting information that's covered by laws or regulations, whether you know it or not,” he warns. “Collecting data that you don’t actually need in order to perform a business process represents 100% risk and 0% value.” Enterprise leadership has to recognize that collecting unneeded information, or information that's not used for intended purposes, can be an actual danger to the organization. “This decision should not be delegated solely to IT leaders,” Brill says. ... The fastest way to identify confidential and unnecessary data is by using advanced data loss prevention (DLP) capabilities to search for specific patterns, such as email addresses, phone numbers, protected health information, and personally identifiable information (PHI/PII) data types, says Doug Saylors, a cybersecurity partner with global technology research and advisory firm ISG. Another protection measure, aimed at limiting traffic visibility, is to require remote workers to use VPN connections whenever linking to the enterprise network, he adds.
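As a rough illustration of the pattern-based scanning Saylors describes, the sketch below searches text for a few PII-like patterns with regular expressions. The patterns and function name are hypothetical and deliberately simplistic; production DLP products use far more robust detectors (checksum validation, context analysis, trained classifiers).

```python
import re

# Illustrative patterns only; real DLP detectors are much stricter.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(text):
    """Return a dict of pattern name -> list of matches found in `text`."""
    hits = {}
    for name, pattern in PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[name] = found
    return hits
```

A scan like this over file shares or mailboxes gives a first inventory of where sensitive-looking data lives, which is the precondition for deciding what the business actually needs to keep.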


AWS names 6 key trends driving machine learning innovation and adoption

Increasing volumes of data, and different types of data, are being used to train ML models. This is the second key trend Saha identified. Organizations are now building models that have been trained on structured data sources such as text, as well as unstructured data types including audio and video. Having the ability to get different data types into ML models has led to the development of multiple services at AWS to help in training models. One such tool that Saha highlighted is SageMaker Data Wrangler, which helps users process unstructured data using an approach that makes it practical for ML training. AWS also added new support for geospatial data in SageMaker this week at the re:Invent conference. ... The final key trend that will drive ML forward is democratizing the technology, making tools and skills accessible to more people. “Customers tell us that they … often have a hard time in hiring all the data science talent that they need,” Saha said. The answers to the challenge of democratization, in Saha’s view, lie in continuing to develop low-code and use case-driven tools, and in education.


Balancing cybersecurity costs and business protection

For many SMEs, cuts to cybersecurity budgets may feel justified due to a lack of breaches encountered in the past. However, the reality is those defences are why they’ve never had an attack. You wouldn’t get rid of a house alarm because you’ve never been burgled. Cybersecurity should be no different. Organisations may also think they can do away with security measures because they’re too small – that they’re not a juicy enough target. But the opposite can be true. Hackers can see smaller businesses as easy prey that won’t have the same calibre of defence as a large corporation – and as more likely to give in to demands too. ... When thinking about cybersecurity, another area that is often overlooked is the possibility of human error. While an employee accidentally retaining data can pose a risk just as serious as an external hacker, preventing accidental breaches shouldn’t cost the earth and there are simple ways to minimise the chance of one happening. Regular training is the most effective way to prevent a slip-up and will empower staff to stay on top of new threats. It’s important, however, that this training is targeted and being applied in the right areas.


Great Leaders Manage Complexity with Self-Awareness and Context Awareness

Undoubtedly, people across organizations have expectations of “leaders.” In a general sense, they expect them to lead. In my experience, this entails a diverse set of expectations from various people within a collective or shared context. The most common expectations I’ve come across are providing answers and clarity, guidance, context, direction and vision, structure, and accountability. Think of how expectations are entangled with the framing of leadership. People seem to have different specific needs to take steps toward something and make progress. My experience is that a person’s historical experiences significantly influence their needs, which vary with context. People’s awareness of themselves, a specific situation, and others varies. So what people think is needed is sometimes not relevant or appropriate. These are some reasons I’ve found the specifics of leadership challenging, to say the least. Some of the sources that I’ve found particularly helpful when managing these challenges—understanding individual and contextual needs—are SCARF by David Rock and Wardley Mapping.


Machine Learning Models: A Dangerous New Attack Vector

Researchers demonstrated how such an attack would work in a POC focused on the PyTorch open source framework, showing also how it could be broadened to target other popular ML libraries, such as TensorFlow, scikit-learn, and Keras. Specifically, researchers embedded a ransomware executable into the model's weights and biases using a technique akin to steganography; that is, they replaced the least significant bits of each float in one of the model's neural layers, Janus says. Next, to decode the binary and execute it, the team used a flaw in the PyTorch/pickle serialization format that allows arbitrary Python modules to be loaded and methods executed. They did this by injecting a small Python script at the beginning of one of the model's files, preceded by an instruction for executing the script, Janus says. "The script itself rebuilds the payload from the tensor and injects it into memory, without dropping it to the disk," she says. ... The resulting weaponized model evades current detection from antivirus and endpoint detection and response (EDR) solutions while suffering only an insignificant loss in efficacy, the researchers said.
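The weight-embedding step can be illustrated in isolation. The sketch below is a benign, hypothetical demonstration of the steganographic idea only: hiding a few bytes in the lowest mantissa bit of float32 values and recovering them. The helper names are my own, it deliberately omits the pickle-based execution half of the attack, and it is not drawn from the researchers' POC.

```python
import struct

def embed_bits(weights, payload):
    """Hide `payload` bytes in the lowest mantissa bit of float32 weights.

    Each float carries one bit, so the numerical change to the model is
    tiny, which is why the technique is hard to spot by eyeballing weights.
    """
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(weights):
        raise ValueError("layer too small for payload")
    out = list(weights)
    for idx, bit in enumerate(bits):
        (n,) = struct.unpack("<I", struct.pack("<f", out[idx]))
        n = (n & ~1) | bit                      # overwrite the lowest bit
        (out[idx],) = struct.unpack("<f", struct.pack("<I", n))
    return out

def extract_bits(weights, nbytes):
    """Recover `nbytes` bytes previously embedded by `embed_bits`."""
    data = bytearray()
    for i in range(nbytes):
        byte = 0
        for j in range(8):
            (n,) = struct.unpack("<I", struct.pack("<f", weights[i * 8 + j]))
            byte |= (n & 1) << j
        data.append(byte)
    return bytes(data)
```

Note that the dangerous part of the attack is the deserialization, not the hiding: formats that store only raw tensors (such as safetensors) close the arbitrary-code-execution path even if bits are still hidden in the weights.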


How to get cloud migration right

A successful migration — like a house renovation — begins with an analysis of your current environment. Knowing how DNS/DHCP functions in your environment, as well as identifying adjacent technologies and integrations, security posture, and business processes is a necessary step. It won’t prevent all surprises during migration, but it can help. Next, outline and explore the challenges related to your current network architecture. Stakeholders should arrive with a vision of their ideal infrastructure. What things do they not want to see in their new network? What do they want to prevent, improve, and optimize — and how do they expect the cloud to help? Resilience drives many enterprises to cloud migration. This might occur after crippling outages that disrupt user experiences and business operations. But the hunt for efficiency and new IT initiatives that can reduce service level agreements are also factors. There’s another often-ignored factor that can derail cloud migrations: not including the right stakeholders. In an on-premises environment, the main stakeholders were the data center or network team. Successful cloud migrations demand inclusion. 


When blaming the user for a security breach is unfair – or just wrong

The best place to start is understanding employee roles, resources, and access habits, Laxdal says. For example, financial workers should understand the specific risks to business accounts and social engineering attempts such as BEC scams that may target them. Development departments will have different risk areas to focus on; for example, their IP on hosted servers or malware hidden in public open-source libraries. HR, on the other hand, is dealing with PII (financial, banking, and healthcare information) that shouldn’t be shared over any channel, particularly given that anyone can impersonate a CEO and request files or transfers. “All of these vectors are being used globally against information assets and are overwhelmingly credential-based attacks that are perpetrated through phishing. Users need to understand why and be part of that discussion with real-world examples,” Laxdal explains. “Sit down with your employees, ask about their typical day and access requirements. And understand each functional area of the business so you can design controls and training for their business.”


7 ways to cope with a C-level rival

Embrace and uplift your foe, regardless of whether they embrace and uplift you, advises Paola Saibene, principal consultant at IT and business advisory and consulting firm Resultant. “Your focus should be on revamping, refreshing, reinventing, and progressing, so that you’ll be known as a leader, no matter what,” she says. “If you put your focus on getting better and better, and being inclusive along the way, the rival will have fewer and fewer opportunities to bring you down.” Also try to view the situation from the antagonist’s perspective, no matter how unjustified it may be. “Take the opportunity to bring them into a conversation to deconstruct the issue,” Saibene recommends. “If it’s not a personal issue, it’s solvable.” Finally, if all efforts at compromise fail, feel free to proceed with no regrets, Saibene suggests, fully realizing that you tried your best. As you struggle with your nemesis, it’s important to maintain your composure and not let emotions get in the way of your decision-making process, says Kimberley Tyler-Smith, a former McKinsey & Co. analyst. Currently the strategist at career tech service company Resume Worded, Tyler-Smith advises seeking impartial help.


Data Clean Rooms: Enabling Analytics, Protecting Privacy

According to Forrester, to qualify as a data clean room, security and privacy controls must be embedded in the tool so that enterprise and customer data is protected before it’s shared and analyzed. This means they must include strong identity and access management capabilities, and encryption of data entering the “clean room,” among other protections. But it’s not just the tools that need to incorporate these protections. Forrester says that clean rooms must have processes in place to protect privacy, too. For instance, a critical process would include normalizing data before entering the room and verifying the degree of de-identification when data leaves the room. Another essential piece is a process to assess risks in the data clean room, according to Forrester. Forrester lists two other keys to data clean rooms: transparent data governance controls, and a self-governing analysis experience. One data clean room provider is Snowflake. Originally known for its cloud data warehouse services, the company has long worked with end-customer data, and it was talking about data clean rooms as far back as January 2020 as a way to continue data analysis while abiding by new regulations such as GDPR and the California Consumer Privacy Act.
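The normalize-before-entry step Forrester describes can be sketched simply. The example below is hypothetical (the function name and fields are my own): it trims and lower-cases a direct identifier, then replaces it with a keyed hash (HMAC-SHA-256) so that parties sharing the key derive the same match key for the same person and can still join records inside the clean room without exposing the raw value. Real clean rooms layer access controls, encryption, and aggregation thresholds on top of this.

```python
import hashlib
import hmac

def pseudonymize(records, key_field, secret):
    """Normalize a direct identifier, then replace it with a keyed hash.

    Illustrative only: returns new records in which `key_field` has been
    swapped for a `match_key` suitable for joins inside a clean room.
    """
    out = []
    for rec in records:
        rec = dict(rec)                            # don't mutate the input
        raw = rec.pop(key_field).strip().lower()   # normalize before hashing
        digest = hmac.new(secret, raw.encode(), hashlib.sha256).hexdigest()
        rec["match_key"] = digest
        out.append(rec)
    return out
```

Because the hash is keyed, an outsider without the shared secret cannot brute-force common identifiers back out of the match keys, which a plain unsalted hash would allow.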


Adopting Low Code/No Code: Six Fitnesses to Look For

LCNC platforms usually offer more than one hosting option. Typical hosting options include the LCNC provider's cloud and self-hosting within your own infrastructure. Choosing the provider's cloud has the benefit of making use of their end-to-end DevOps toolchain, which should be operationally easy to manage and, in turn, cost-effective. Note that this benefit is greatest when a significant number of applications are hosted within their environment, and those applications are isolated in nature (meaning, limited integrations with existing applications residing in your hosting infrastructure). You also need to check if the provider's cloud offers transparency to your deployed resources and offers support tiered to your application’s criticality. For example, if you have a significant number of applications developed on the Mendix platform and you plan to build more applications on it, then choosing Mendix Cloud may be cost-effective and operationally simple.



Quote for the day:

"Being honest and open is the only way to convince cynical employees that you truly want to establish a partnership with them." - Florence M. Stone

Daily Tech Digest - December 05, 2022

Is SASE right for your organization? 5 key questions to ask

Many analysts say that SASE is particularly beneficial for mid-market companies because it replaces multiple, and often on-premises, tools with a unified cloud service. Many large enterprises, on the other hand, will not only have legacy constraints to consider, but they may also prefer to take a layered security approach with best-of-breed security tools. Another factor to consider is that the SASE offering might be presented as a consolidated solution, but if you dig a little deeper it might actually be a collection of different tools from various partnering vendors, or features obtained through acquisition that have not been fully integrated. Depending on the service provider, SASE offers a unified suite of security services, including but not limited to encryption, multifactor authentication, threat protection, Data Leak Prevention (DLP), DNS, and traditional firewall services. ... With incumbents such as Cisco, VMware, and HPE all rolling out SASE services, enterprises with existing vendor relationships may be able to adopt SASE without needing to worry much about protecting previous investments.


How gamifying cyber training can improve your defences

Gamification is an attempt to enhance systems and activities by creating similar experiences to those in games, in order to motivate and engage users, while building their confidence. This is typically done through the application of game-design elements and game principles (dynamics and mechanics) in non-game contexts. Research into gamification has proved that it has positive effects. ... Gamification has been dismissed by some as a fad, but the application of elements found within game playing, such as competing or collaborating with others and scoring points, can effectively translate into staff training and improve engagement and interest. “The way that cyber security training sessions are happening is changing and it’s for the better,” says Helen McCullagh, a cyber risk specialist for an end-user organisation. “If you look at the engagement of sitting people down and them doing a one-hour course every year, then it is merely a box-ticking exercise. Organisations are trying to get 100% compliance, but what you have are people sitting there doing their shopping list.”


The 3 Phases Of The Metaverse

There are several misconceptions about the metaverse today. In simple terms, the metaverse is the convergence of physical and digital on a digital plane. In its ideal phase, you can access the metaverse from anywhere, just like the internet. Early metaverse apps were focused on creating games with tokenized incentives (play-to-earn) and hadn’t initially been thought of as contributing to the next phase of the internet. One of the most prominent examples is the online game Second Life, which is regarded as the earliest web2-based metaverse platform. Users have an identity projected through an avatar and participate in activities—very much a limited “second” life. ... Unlike the previous phase, Phase 2 is all about creating utilities. Brands, IP holders and companies investing in innovation have been collaborating with gaming metaverse dApps to understand consumer behaviors and economic dynamics. No-coding tools, as well as software development kits, in this phase, are empowering the end user to co-create alongside developers, designers, brands and retail investors. Still, interoperability—the import and export of digital assets—is only possible on a single chain, and the user experience is still seen as gaming in 2-D or 3-D environments.


Why the Agile approach might not be working for your projects

Although Scrum is a well-described methodology, when applied in practice it is often tailored to the specific circumstances of the organisation. These adaptations are often called ScrumBut (“we use Scrum, but …”). Some deviations from the fundamental principles of Scrum, however, may be problematic. These undesirable deviations are called anti-patterns — bad habits formed and influenced by the human factor. What exactly can we consider an anti-pattern? It can be a disagreement on whether or not the task is completed, a disruption caused by the customer, unclear items in the backlog, the indecisiveness of stakeholders (customers, management, etc.), and lack of authority or poor technical knowledge on the part of the Scrum master. We collected detailed information in three Scrum teams using a variety of data collection procedures over a sustained period of time — including observation, surveys, secondary data, and semi-structured interviews — to get a detailed understanding of anti-patterns, and their causes and consequences.


Rise of Data and Asynchronization Hyped Up at AWS re:Invent

Because it was believed that asynchronous programming was difficult, he said, operating systems tended to have restrained interfaces. “If you wanted to write to the disk, you got blocked until the block was written,” Vogels said. Change began to emerge in the 1990s, he said, with operating systems designed from the ground up to expose asynchrony to the world. “Windows NT was probably the first one to have asynchronous communication or interaction with devices as a first principle in the kernel.” Linux, Vogels said, did not pick up asynchrony until the early 2000s. The benefit of asynchrony, he said, is that it is natural, compared with the illusion of synchrony. When compute systems are tightly coupled together, it could lead to widespread failure if something goes wrong, Vogels said. With asynchronous systems, everything is decoupled. “The most important thing is that this is an architecture that can evolve very easily without having to change any of the other components,” he said. “It is a natural way of isolating failures. If any of the components fails, the whole system continues to work.”
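Vogels' point about decoupling can be illustrated with a queue between two components. The sketch below is a minimal, hypothetical example: the producer and consumer share only a queue, so neither blocks on the other's internal work, and either side could be replaced or scaled out without changing the other.

```python
import asyncio

async def producer(queue, items):
    # The producer only hands work to the queue; it never waits for the
    # consumer to finish processing, so the two sides are decoupled.
    for item in items:
        await queue.put(item)
    await queue.put(None)            # sentinel: no more work

async def consumer(queue, results):
    while True:
        item = await queue.get()
        if item is None:
            break
        results.append(item * 2)     # stand-in for real processing

async def main(items):
    queue = asyncio.Queue()
    results = []
    # The coroutines share only the queue contract, so either one can
    # evolve independently of the other.
    await asyncio.gather(producer(queue, items), consumer(queue, results))
    return results
```

The same shape appears at system scale with message brokers and event buses: the queue is the contract, and a slow or restarted consumer does not stall the producer.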


Entity Framework Fundamentals

EF has two ways of managing your database. In this tutorial, I will explain only one of them: code first. The other is database first. There is a big difference between them, but code first is the most used. But before we dive in, I want to explain both approaches. Database first is used when there is already a database present and the database will not be managed by code. Code first is used when there is no current database, but you want to create one. I like code first much more because I can write entities (these are basically classes with properties) and let EF update the database accordingly. It's just C# and I don't have to worry about the database much. I can create a class, tell EF it's an entity, update the database, and all is done! Database first is the other way around. You let the database 'decide' what kind of entities you get. You create the database first and create your code accordingly. ... With Entity Framework, it all starts with a context. It associates entities and relationships with an actual database. Entity Framework comes with DbContext, which is the context that we will be using in our code.


How Executive Coaching Can Help You Level Up Your Organization

As we all know, the desire for personal growth is extremely valuable. However, as employee demands from the workplace have shifted, leadership skills have not. As employees climb the ranks, they find their way into leadership without necessarily learning the skills and techniques required to lead. Many new leaders turn to a trusted mentor who can only provide information based on lived experience. On the other hand, executive coaches are tasked with improving performance and capabilities as their day job. But there is a misconception that executive coaches are for leaders who have done something wrong. While it's true that an executive coach could help a difficult employee become a better teammate, they can also be guides for leaders to pursue their desired career paths. Leadership coaching holds that the main drivers of innovation in an organization are the people and the corporate culture, and it can provide leaders with the tools to master these levers. An executive coaching professional can guide leaders through the steps that allow them to set the foundations of an innovative and competitive company.


Ransomware: Is there hope beyond the overhyped?

The old way of thinking about cyber security was imagining it like a castle. You’ve got the vast perimeter – the castle walls – and inside was the keep, where employees and data would live. But now organisations are operating in various locations. They’ve got their cloud estate in one or more providers, source code residing in another location, and vast numbers of work devices that are now no longer behind the castle walls, but at employees’ homes – the list could go on forever. These are all areas that could potentially be breached and used to gain intelligence on the business. The attack surface is growing, and the castle wall can no longer circle around all these places to protect them. Attack surface management will play a big part in tackling this issue. It allows security and IT teams to almost visualise the external parts of the business, identify targets, and assess risks based on the opportunities they present to a malicious attacker. In the face of a constantly growing attack surface, this can enable businesses to establish a proactive security approach and adopt principles such as assume breach and cyber resilience.


How data analysts can help CIOs bridge the tech talent shortfall

Business analytics are only as good as the data they’re using. Given the wealth and complexity of data, it’s easy to understand why leaders are often overwhelmed in their attempts to access better analytics and insights. This is where data professionals can help. Data scientists and analysts are experts in statistics, math, databases, and systems. They are especially adept at looking at historical metrics, recognizing patterns, pulling in market insights, and identifying outlier data to ensure the best points are utilized. They’re also able to organize vast amounts of unstructured data, which is often very valuable but difficult to analyze, by leveraging conventional databases and other tools to make the data more actionable. ... It’s also important to look at the attributes of the data scientists and analysts themselves. In addition to having technical skills, data professionals with a background in programming, data visualization, and machine learning are also highly valuable. On the non-technical side, they should have strong interpersonal and communication skills to relay their findings to the tech team and those without a tech or math background.


What Does Technical Debt Tell You?

Making most architectural decisions at the beginning of a project, often before the QARs are precisely defined, results in an upfront architecture that may not be easy to evolve and will probably need to be significantly refactored when the QARs are better defined. By contrast, having a continuous flow of architectural decisions as part of each Sprint results in an agile architecture that can better respond to QAR changes. Almost every architectural decision is a trade-off between at least two QARs. For example, consider security vs. usability. Regardless of the decision being made, it is likely to increase technical debt, either by making the system more vulnerable by giving priority to usability or making it less usable by giving priority to security. Either way, this will need to be addressed at some point in the future, as the user population increases, and the initial decision to prioritize one QAR over the other may need to be reversed to keep the technical debt manageable. Other examples include scalability vs. modifiability, and scalability vs. time to market. These decisions are often characterized as "satisficing", i.e., "good enough".



Quote for the day:

"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr