Daily Tech Digest - November 15, 2022

The Chief Trust Officer Role Can Be the Next Career Step for CISOs

Many CISOs are already unofficially doing the work that comes with the CTrO role, according to Pollard. They are doing customer-facing work, navigating third-party risk management, and focusing on enterprise resilience. “CISOs that spend more time on customer-facing activity, they are at companies that grow faster,” Pollard asserted. “Cybersecurity touches revenue, and security leaders that are able to carve out the time to focus on customer activity help drive hyper growth.” CISOs who are driving growth for their companies are playing an important part on the leadership team, and if they’ve been in the role for a long enough time, it could be time to ask the question “What comes next?” CISOs who have been in their position for 48 months are due for a title-level promotion, according to Pollard. And CTrO is that next step. ... Through his research, Pollard is seeing the CTrO role filled at a number of organizations. Cisco has a chief trust officer. So does SAP. “We're not talking about small, innovative startups. We're talking about goliath businesses that recognize the importance of trust in what they do,” Pollard said.


How regulation of the metaverse could impact your business

The regulatory challenges faced by Web3 are currently much fresher, arguably more nuanced and in some cases, urgent. It cannot be regulated as a single entity, as its multitude of use cases demand a multitude of approaches. Specific rules governing the security and availability of systems, finance, archives, identity and IP rights will need to be set. The good news is that policymakers could leverage Web3’s benefits to impose regulation. As it’s based on decentralisation and automation, it’s not far-fetched to imagine the technology being used to enforce and automate taxation, for example. Currently, Web3 platforms like cryptocurrency exchanges or NFT marketplaces aren’t standardised, with inconsistent UX and language used to communicate concepts. Often, these platforms have little or no duty to educate about safety or establish protections, and while platforms like Coinbase and OpenSea do a good job here, it’s far from the norm and scams are still commonplace owing to lack of understanding.


Private 5G drives sustainable and agile industrial operations

Looking at business outcomes such as sustainability and agility, the partners regard industrial private 5G as an enabler of digital transformation in smart manufacturing to help deliver connected worker applications, mobile asset applications and untethered fixed industrial asset applications. Connected worker applications are seen as able to increase visibility and intelligence through mobile digital tools, such as analytics, digital twins and augmented reality (AR), while mobile asset applications increase agility and efficiency with autonomous vehicles, such as automated guided vehicles (AGVs) and autonomous mobile robots (AMRs). The consortium’s tests were run according to an established test plan provided by Rockwell Automation with success criteria of zero faults. It outlined a series of test cases to establish reliable Ethernet/IP standard and safety (CIP Safety) I/O connections from a GuardLogix area controller, with a range of requested packet interval (RPI) settings – the rate at which the controller and the I/O exchange data – over the 5G RAN to the FLEX 5000 standard and safety I/O.


Who Moved My Code? An Anatomy of Code Obfuscation

The best security experts will tell you that there’s never an easy, single solution to protect your intellectual property; combined measures, protection layers and methods are always required to establish a good protective shield. In this article, we focus on one small layer in source code protection: code obfuscation. Though it’s a powerful security method, obfuscation is often neglected, or at least misunderstood. When we obfuscate, our code becomes unintelligible, thus preventing unauthorized parties from easily decompiling or disassembling it. Obfuscation makes our code impossible (or nearly impossible) for humans to read or parse. Obfuscation is, therefore, a good safeguarding measure used to preserve the proprietary nature of the source code and protect our intellectual property. To better explain the concept of obfuscation, let’s take “Where’s Waldo” as an example. Waldo is a known illustrated character, always wearing his red and white stripy shirt and hat, as well as black-framed glasses.


Should security systems be the network?

The appeal and real benefits of having the security systems be the whole network are clearest for smaller and midsized companies. They are more likely to have uniform and relatively simple needs, and also to have thinner staffing. They are more likely to have difficulty affording, attracting, and retaining the talent they need in both security and networking. So, having just one platform to become expert in, one platform to train new staff on or to outsource the management of, lets them make the most of the staff they have. The benefits are less clear for larger companies. These tend to have more complex environments and requirements, and are less likely to tolerate the risks of monoculture given they are better able to staff for and support a blended ecosystem. So, should security systems be the network? For smaller organizations, it looks viable with the caveats outlined above. For most larger organizations, I think the answer is currently no. Instead, they should focus on making their network systems a bigger part of the security infrastructure.


Democratization Is The Key To Upskill At Work And Improve ROI

Creating actionable data and analytics programs to educate employees is one of the most effective ways to bridge the skills gap. We have seen successes with executive-sponsored datathons or when companies gamify their learning experience. We also think it’s important for technical data experts to act as mentors to knowledge workers with domain expertise and guide them through the analytics process. We believe this collaboration between technical experts and domain experts will help organizations achieve breakthroughs with their data faster. Finally, analytics needs to be easy, not complex. Organizations should invest in technologies that move away from being highly dependent on writing code. ... Data and analytics generate ROI in many ways. First are the time savings. Organizations that shift from spreadsheet-based processes save several hours per week, sometimes up to a third of their time per worker – multiply this by all the domain experts and knowledge workers still stuck in spreadsheets and you’ve got some serious time savings. This is just the tip of the iceberg.


Top cybersecurity threats for 2023

Disgruntled employees can sabotage networks or make off with intellectual property and proprietary information, and employees who practice poor security habits can inadvertently share passwords and leave equipment unprotected. This is why there has been an uptick in the number of companies that use social engineering audits to check how well employee security policies and procedures are working. In 2023, social engineering audits will continue to be used so IT can check the robustness of its workforce security policies and practices. ... Cases of data poisoning in AI systems have started to appear. In a data poisoning attack, a malicious actor finds a way to inject corrupted data into an AI system, skewing the results of an AI inquiry and potentially returning false results to company decision makers. Data poisoning is a new attack vector into corporate systems. One way to protect against it is to continuously monitor your AI results. If you suddenly see a system trending significantly away from what it has revealed in the past, it’s time to look at the integrity of the data.
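The continuous-monitoring advice above can be reduced to a simple statistical check: compare recent model outputs against their historical baseline and alert on a sharp shift. A minimal sketch follows; the z-score approach and the threshold value are illustrative assumptions, not a production poisoning detector.

```python
import statistics

def drift_alert(history, recent, z_threshold=3.0):
    """Return True when the mean of recent model outputs sits more than
    z_threshold historical standard deviations away from the baseline mean."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return abs(statistics.mean(recent) - mu) > z_threshold * sigma

# Baseline scores hover around 0.50; a sudden jump warrants a data-integrity review.
baseline = [0.52, 0.48, 0.50, 0.51, 0.49, 0.50]
print(drift_alert(baseline, [0.50, 0.51]))  # stable output, no alert
print(drift_alert(baseline, [0.90, 0.92]))  # sharp shift, alert fires
```

In practice this check would run on a rolling window of production inference results, with the window size and threshold tuned to the system's normal variance.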


Corporate execs confident on sustainability goals, admit more work needed

Efforts to achieve sustainability goals can broadly be grouped into several areas: green resources procurement, which includes sustainable energy and water; operational efficiency, which includes the IT value chain, supply chain and other scope 3 emission sources that make up 40% of all greenhouse gas emissions; and end of lifecycle, including circular economy or recycling products to create new ones. For example, data centers and cloud industries tend to focus on green energy procurement (since they use a lot of energy to power data centers) as well as operational efficiency to reduce power usage, according to Abhijit Sunil, a senior analyst with Forrester Research. “Standards are certainly evolving, and more and more organizations are held accountable for their commitments and how they take action towards it,” Sunil said. For example, Sunil noted, government scrutiny will continue to increase, holding more “greenwashers” accountable. Greenwashers are companies that deceptively purport that their products, aims and policies are environmentally friendly.


The office of 2023: Top workforce trends that will shape the year ahead

Roderick believes an overarching theme for the workplace in 2023 will be adjusting how employees work remotely. He says there could be an uptick in surveillance for remote workers that will allow managers to observe productivity, and executives could enforce return-to-office mandates as a reaction to a slowdown in business. ... "The world of work has been through huge changes since the pandemic, and it would be good not to see the positives of this change undone by a recession." Silverglate believes that technology, office redesign, and sustainability will all propel hybrid and remote working in 2023. Video conferencing became a staple in work-from-home practices, but VR is emerging to make the experience more immersive and productive. "When many are in person and a team member needs to be virtual, VR technology can truly reduce the perceived gap between the two, which is one of the largest complaints I've heard about the challenges of traditional video-conferencing technology as it relates to hybrid teams," he says.


From Async Code Reviews to Co-Creation Patterns

The way it goes is that once a developer thinks they are done with coding, they invite other team members to review their work. This is nowadays typically done by raising a Pull Request and inviting others for a review. But, because reviewers are busy with their own work items and a plethora of other things happening in the team, they are not able to react immediately. So, while the author is waiting for a review, they also want to feel productive, thus they start working on something else instead of twiddling their thumbs and waiting for a review. Eventually, when reviewer(s) become available and provide feedback on the PR and/or ask for changes, the author of the PR is then not available because they are busy with something else. This delayed ping-pong communication can extend over several days/weeks and a couple of iterations, until the author and reviewer(s) converge on a solution they are both satisfied with and which gets merged into the main branch.



Quote for the day:

"How was your day? If your answer was 'fine,' then I don't think you were leading" -- Seth Godin

Daily Tech Digest - November 13, 2022

Cybersecurity leaders want to quit. Here's what is pushing them to leave

Almost a third of chief information security officers (CISOs) and IT security managers in the UK and US are considering leaving their current organization, according to new research. Not only that, but a third are planning to quit their jobs within the next six months. ... many IT security leaders are struggling to keep up with evolving threats and new cybersecurity practices, while also reporting issues around recruitment, retention and work-life balance that are prompting many to turn away from the industry. When asked about the aspect of their role that they disliked most, 30% cited the lack of a work-life balance, with 27% saying that much time was spent on 'firefighting' rather than addressing strategic business issues. On top of the 32% of CISOs planning a departure due to the stresses of the job, 52% admitted that they are struggling to keep up to date with new frameworks and models such as Zero Trust, while a further 20% felt that having the right skills on their team was "a serious challenge".


Why Is Optimism a Critical Security Skill?

There’s a different way to think about the practice of security: as a vision- and mission-based endeavor. When security practitioners log in each day to start work, they are protecting people they care about: their colleagues, partners and customers. They’re also safeguarding their organizations’ ability to do business in a complex world by delivering vital products and services that others need and ensuring society functions as intended. As a result, security teams are creating a better world for everyone. For employees in organizations, these connections may either be explicit or implied. Security professionals who protect national infrastructure for a government agency, a nonprofit’s ability to deliver aid or an e-commerce firm’s ability to deliver goods will likely see the value in safeguarding their organizations’ business and operations. Yet, countless others provide processes or services that enable the effective functioning of businesses and community life. These professionals, too, should take pride in fulfilling their organization’s vision and mission.


Networking and Data Center Horrors That Scare IT Managers

Carrie Goetz, D.MCO and Principal/CTO at StrategITcom, LLC, and a frequent speaker at Network Computing events, offered up some spooky incidents she has encountered over the years. “There was the case of cleaning people plugging vacuum cleaners into UPS outlets when cleaning the data center and shutting them down. Happened every night sometime between 2 and 4 AM. The only way we caught it was to sit up there at that time.” “Or how about doing an audit of the gear in a data center? The customer thought they had about 2,600 servers, and we found over 3,000 physical machines. Some had not passed a bit of traffic in years.” Talk about a nightmare. She noted, “decommissioning was not in their vocabulary until after the audit.” Another example should send chills down any IT manager’s spine. “We took over a contract for a prison health care provider. They had previously hired another company. When all of the deliveries were late, the customer started investigating and found out that the company was staging their servers in a shed with a dirt floor and no AC running. They kept going up and down, and two failed due to dirt and moisture.”


Hervé Tessler – ‘Cyberattacks can mean total reputational death’

When I joined the business 33 years ago, nobody ever talked about cybersecurity. I don’t recall the word. Everything was physical: what if somebody’s gotten into my flat or I’m afraid someone’s going to take my car, attack me in the street to take my watch or my wallet or whatever. Obviously, the world has become much more digital. All these fears and threats became digital. What I would say over the past few years is that we’ve seen a massive amplification of risk. Large companies have a board and an IT group under the board looking at cybersecurity. They take it very seriously. They are scared to death of any brand damage. They are relatively focused, which is not the same as being well equipped. What I’ve found out over the last six months is that more and more small and mid-sized businesses are paying a lot more attention to cyber. When I open the newspaper, there’s not a day without a small story about a major cyber negative impact on a business. There was a recent cyberattack on a French hospital that sent them back to the Middle Ages – it lasted for months. Of course, SMBs taking cybersecurity seriously is an opportunity for us to help them with this threat.


Cyber criminals have World Cup Qatar 2022 in their sights

The Digital Shadows Photon research team have been tracking cyber threats coalescing around the World Cup over the past 90 days using a specially created alert system. They have found that broadly, threats to the event can be arranged into four categories – brand protection, cyber threat, physical protection and data leakages. Of these, most of the observed activity relates to the cyber threat category. “Scams could present themselves in many forms,” the Photon team wrote in a newly published online advisory. “For instance, financially motivated threat actors often plant malicious URLs spoofing these events to fraudulent sites, hoping to maximise their chances of scamming naive internet users for a quick, illicit, profit. “At the same time, hacktivist groups may exploit the public attention given to such events to exponentially increase the reach of their message. State-sponsored advanced persistent threat (APT) groups may also decide to target global sporting events to achieve state goals against the hosting country or the broader event community.”


Agile or V-Shaped: What Should Be Your Next Software Development Life Cycle Model?

The agile model is known for its flexibility and responsiveness to change. This makes it ideal for projects that are constantly evolving or that require quick turnarounds. However, this flexibility can also be a downside, as it can lead to scope creep and unrealistic expectations. The V-shaped model is more rigid and structured, but this can also be seen as a strength. This model helps to prevent scope creep by clearly defining the deliverables at each stage of the project. It also provides more structure and transparency, which can help to keep stakeholders informed and on track. However, the downside of this model is that it can be inflexible and resistant to change. So, which model is best for your project? Ultimately, it depends on your specific needs and objectives. The Agile software development life cycle model is a great choice for small to medium-sized projects. This is because it offers flexibility and adaptability, which are essential when working on smaller-scale projects. So if you need a flexible and responsive approach, then the agile model may be a better fit.


Veteran CIOs on leading IT today

It’s a different role as you shift from manager to director. It’s realizing your entire organization is essentially run by someone else tactically, so really backing off and letting them fail or succeed on their own. And then you have to figure out how to focus very differently on alignment, making sure all the leaders are on the same page, because now you have really good leaders and they’re all running in different paths. In IT especially, managers tend to want to hang on to some of the hands-on work after they become directors, and that’s often because they’re promoted due to skill, not leadership. So suddenly they find themselves at a director level and they’re just a really good engineer. They don’t have any other tool in their toolkit except doing it themselves. So it’s hard for them to let others own things completely. It’s really important to get a good mentor or someone in place to help them. Usually what you hear is the horror stories, where someone fails miserably and then they learn and pick themselves up and go again. We’ve got to find a way to prevent that.


IT security: 3 areas to prioritize for the rest of 2022

If companies fail to frequently audit access policies to ensure that external groups can only access the systems they need, this is another avenue that hackers can easily exploit. It’s also essential to immediately cut off access after parting ways with a consultant and periodically confirm that former contractors no longer have access. Ensure an established timeline for auditing access policies – and never allow it to slip. Addressing password hygiene is another critical consideration. According to the most recent Verizon Data Breach Investigations Report, over 80 percent of hacking incidents involved stolen credentials. And studies have repeatedly shown that at least 71 percent of people reuse passwords. If just one of the sites associated with a reused password has been breached, then all other accounts protected by that password are also at risk. With workforce management challenges on the horizon for 2023, it’s essential to implement policies and procedures addressing the inherent security vulnerabilities of the Great Resignation.


Cookies for MFA Bypass Gain Traction Among Cyberattackers

Stealing session cookies has become one of the most common ways that attackers circumvent multifactor authentication. The Emotet malware, the Raccoon Stealer malware-as-a-service, and the RedLine Stealer keylogger all have functionality for stealing session tokens from the browsers installed on a victim's system. In August, security software firm Sophos noted that the popular red-teaming and attack tools Mimikatz, Metasploit Meterpreter, and Cobalt Strike all could be used to harvest cookies from the browsers' caches as well, which the firm called "the new perimeter bypass." "Cookies associated with authentication to Web services can be used by attackers in 'pass the cookie' attacks, attempting to masquerade as the legitimate user to whom the cookie was originally issued and gain access to Web services without a login challenge," Sean Gallagher, a threat researcher with Sophos, stated in the August blog post. "This is similar to 'pass the hash' attacks, which use locally stored authentication hashes to gain access to network resources without having to crack the passwords."
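The "pass the cookie" technique works because, after login, a typical server checks only that the presented token maps to an authenticated session; it never re-checks credentials or the MFA factor. The toy sketch below illustrates that server-side property (the function names and flow are invented for illustration, not taken from any real framework):

```python
import secrets

# Server-side session store: token -> authenticated username.
sessions = {}

def login(username, password):
    # Credential and MFA checks elided; on success, issue a session token
    # that the browser will store as a cookie.
    token = secrets.token_hex(16)
    sessions[token] = username
    return token

def handle_request(cookie_token):
    """Possession of a valid token is treated as proof of identity;
    the password and MFA challenge are never re-evaluated."""
    return sessions.get(cookie_token)

alice_cookie = login("alice", "correct-horse")
# An attacker who exfiltrates alice_cookie is indistinguishable from Alice:
print(handle_request(alice_cookie))  # serves the request as "alice"
```

This is why defenses focus on shortening session lifetimes, binding tokens to device or network attributes, and detecting token reuse from anomalous locations.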


Scrutinising AI requires holistic, end-to-end system audits

To combat the lack of internal knowledge around how AI systems are developed, the auditing experts agreed on the pressing need for a standardised methodology for how to conduct a socio-technical audit. They added that while a standardised methodology currently does not exist, it should include practical steps to take at each stage of the auditing process, but not be so prescriptive that it fails to account for the highly contextual nature of AI. However, digital rights academic Michael Veale said standardisation is a tricky process when it comes to answering inherently social questions. “A very worrying trend right now is that legislators such as the European Commission are pushing value-laden choices around fundamental rights into SDOs [standards development organisations],” he said ... Another risk of prescriptive standardisation, according to Brown, is that the process descends into a glorified box-ticking exercise. “There’s a danger that interrogation stops and that we lose the ability to really get at the harms if they just become standardised,” he said.



Quote for the day:

"What I've really learned over time is that optimism is a very, very important part of leadership." -- Bob Iger

Daily Tech Digest - November 10, 2022

Building Higher-Quality Software With Open Source CD

Prior to the rise of open source CD solutions, companies often relied on point automation using scripts. These could improve efficiency a bit, but when companies moved from the monolithic architecture of a mainframe or on-premises servers to a microservices-based production environment, the scripts could not be easily adapted or scaled to cope with the more complex environment. This led to the formulation of continuous delivery orchestration solutions that could ensure code updates would flow to their destination in a repeatable, orderly manner. Two highly popular open source CD solutions have emerged, Spinnaker and Argo. Spinnaker was developed by Netflix and extended by Google, Microsoft and Pivotal. It was made available on GitHub in 2015. Spinnaker creates a “paved road” for application delivery, with guardrails to ensure only valid infrastructure and configurations reach production. It facilitates the creation of pipelines that represent a software delivery process. These pipelines can be triggered in a variety of ways, including manually, via a cron expression, or at the completion of a Jenkins job or another pipeline, among other methods.


Technical Debt is Quantifiable as Financial Debt: an Impossible Thing for Developers

There are many things about technical debt that can be quantified. Henney mentioned that we can list off and number specific issues in code and, if we take the intentional sense in which technical debt was originally introduced, we can track the decisions that we have made whose implementations need to be revisited. If we focus on unintentional debt, we can look at a variety of metrics that tell us about qualities in code. There’s a lot that we can quantify when it comes to technical debt, but the actual associated financial debt is not one of them, as Henney explained: The idea that we can run a static analysis over the code and come out with a monetary value that is a meaningful translation of technical debt into a financial debt is both a deep misunderstanding of the metaphor – and how metaphors work – and an impossibility. According to Henney, quantifying how much financial debt is present in the code doesn’t work. At the very least, we need a meaningful conversion function that takes one kind of concept, e.g., "percentage of duplicate code" or "non-configurable database access", and translates it to another, e.g., euros and cents.


How industrial IoT is forcing IT to rethink networks

IIoT is redefining the types of data that enterprises use, and how networks process this data. For example, an IIoT network primarily transmits and processes unstructured data, not fixed record transactional data. In contrast, the corporate network processes data that is far more predictable, digestible and manageable. The bulk and traffic of IIoT data make it virtually a necessity to implement a single, private, dedicated network for each manufacturing facility for use with its IoT. Security is also a concern, because the networks that operate on the edges of the enterprise must often be maintained and administered by non-IT personnel who don’t have training in IT security practices. It’s not uncommon for someone on a production floor to shout a password to another employee so they can access a network resource — nor is it uncommon for someone on the floor to admit another individual into a network equipment cage that is supposed to be physically secured and accessible by only a few authorized personnel.


Cloud architects are afraid of automation

As humans, we’re just not that good. While we have experience driving cars and can look out the front window, we don’t have a perfect understanding of current data, past data, and what this data likely means in the operation and driving of the vehicle. Properly configured automation systems do. For the same reasons that we are anxious when our cars drive away without us actively turning the wheel, we are slow to adopt automation for cloud deployments. Those charged with making core decisions about automating security, operations, finops, etc., are actively avoiding automation, largely because they are uncomfortable with critical processes being carried out without humans looking on. I get it. At the end of the day, automation is a leap of faith that the automated systems will perform better than humans. I understand the concern that they won’t work. The adage is true: “To really screw things up requires a computer.” If you make a mistake in setting these systems up, you can indeed do real damage. So, don’t do that. However, as many people also say: “The alternative sucks.” Not using automation means you’re missing out on approaches and mechanisms to run your cloud systems cheaper and more efficiently


Cybersecurity, cloud and coding: Why these three skills will lead demand in 2023

As the scale and growth of software development accelerates, and with ongoing AI developments in programming and engineering, the role requirements of software development also look set to change. "AI/ML are changing the world of programming much like the calculator and the computer changed the world," says Stormy Peters, VP of Communities at GitHub. "These technological advancements are taking care of a lot of the mundane, grunt work that developers once had to devote all their time to. Development looks different now." ... As we enter 2023 and software development remains at the heart of business strategies, problem-solving, critical thinking and other human skills will prove integral. "While emerging technologies will increasingly enable them to stay in the flow and solve challenging problems, the technicalities in being able to program, engineer, and develop code through a high level understanding of AI, DevOps, and programming languages will also stay central in importance to the discipline," she adds.


How to effectively compare storage system performance

The best metrics to compare are the ones most applicable to the applications and workloads you will run. If the application is an Oracle database, the performance metric most applicable is 8 KB mixed read/write random IOPS. When the vendor only provides the 4 KB variation, there is a way to roughly estimate the 8 KB results -- simply divide the 4 KB results in half. If the vendor objects, ask for actual 8 KB test results. Use this same simple math for other I/O sizes. Throughput is somewhat more difficult to standardize, especially if vendors don't supply it. You can roughly calculate it by multiplying the sequential read IOPS by the size of the I/O. Latency is the most difficult to standardize, especially when vendors measure it differently. There are many factors that affect application latency, such as storage system load, storage capacity utilization, storage media, storage DRAM caching, storage network congestion, application server load, application server utilization and application server contention. The most important question to ask is how the vendor measured the latency, under what loads and from where. 
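The two rules of thumb above, halving a 4 KB IOPS figure to approximate the 8 KB figure and multiplying sequential read IOPS by I/O size to approximate throughput, are simple arithmetic. A sketch of both (these are rough estimates that assume bandwidth rather than latency is the bottleneck, as the article notes):

```python
def estimate_iops(measured_iops, measured_io_kb, target_io_kb):
    """Scale an IOPS figure to a different I/O size, assuming constant bandwidth.
    E.g., halving a vendor's 4 KB result approximates the 8 KB result."""
    return measured_iops * measured_io_kb / target_io_kb

def estimate_throughput_mb_s(seq_read_iops, io_size_kb):
    """Rough throughput in MB/s: sequential read IOPS times the I/O size."""
    return seq_read_iops * io_size_kb / 1024

print(estimate_iops(200_000, 4, 8))          # 200K 4 KB IOPS suggests ~100K at 8 KB
print(estimate_throughput_mb_s(50_000, 128))  # 50K IOPS at 128 KB is ~6250 MB/s
```

As the article advises, treat these as sanity checks for comparing vendors, and ask for actual test results at your application's I/O size whenever possible.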


8 secrets of successful IT freelancers

An often-overlooked skill is having the knowledge, courage, and ability to steer the client in the right direction. “The customer wants to use the freelancer’s experience and proactivity, therefore it’s very important that the IT freelancer states his or her true opinion when he or she thinks that the customer is moving in the wrong direction,” says Soren Rosenmeier, CEO of Right People Group, a firm that matches clients with IT and business consultants. Don’t jump the gun, however. Before offering any crucial advice, it’s important to have a complete understanding of the issue at hand. “There might also be a lot of other factors … in the organization that the IT freelancer is unaware of,” Rosenmeier notes. Therefore, prior to offering a suggestion, it’s important to first listen to exactly what the client wants. If the IT freelancer is honest and upfront, the client will receive the benefit of hiring a highly experienced expert, including insights from all the experience the freelancer has gained by working with many other organizations. “At the same time, the customer gets the simplicity and the execution that they want from an external expert that’s hired in to do a specific job,” Rosenmeier says.


In a managed service model, who is responsible and accountable for data?

When it comes to ensuring compliance, since accountability always lies with the business, it is essential to ensure that the MSP is compliant before outsourcing any data management functions. However, before this can be done, it is essential to establish what exactly it is that needs to be complied with, which is often the most difficult question, with a myriad of regulations and legislation being applicable depending on the sector and regions the business operates in. There are two pillars to consider when engaging with an MSP with regard to responsibility for data management: the first is data availability and recovery, and the second is the retention of data. However, the requirements for compliance, and ultimately accountability, in each will depend on the individual business. This means that before your data can be deemed compliant, you need to understand what that means for your business and have a framework in place that outlines this.


What the experts say about the cybersecurity skills gap

In terms of the skills that are needed, all three cybersecurity leaders agreed that there are various technical skills necessary, similar to any IT role. However, Killian pointed out that not every cybersecurity role is purely a technical one. “Technical skills are usually easier to learn than other important skills like curiosity, ability to ‘play’ in the grey – security issues are rarely obvious ‘yes or no’ problems to solve – and the ability to build relationships with stakeholders. So, unless technical skills are required for the role at hand, they should be prioritised appropriately in job postings,” she said. ... Naidoo reaffirmed that great attitudes and high aptitudes are essential as “technical skills can be taught”. However, she also said it’s important to keep on top of how the tech industry is evolving. “Whatever technical skills are needed in the industry, a corresponding security skill is necessary to secure that technology. So, whether that’s blockchain, quantum or artificial intelligence, or even traditional functions like networks, operating systems and databases, one needs to understand these technologies in order to properly secure them.”


How organisations can right-size their data footprint

“Going on a data diet can be healthy. Cutting out all that junk data that bloats our systems costs us money, raises our data risks and distracts us from the nutritious data that will help us grow. Sometimes, less is truly more.” To reduce data risks and identify useful data, organisations can create synthetic data, which is artificially created data with similar attributes to the original data. According to Gartner, synthetic data will enable organisations to avoid 70% of privacy violation sanctions. Parker said: “If you have sensitive customer data that you want to use but you can’t, you could replace it with synthetic data without losing any of the insights it can deliver.” She added that this could also facilitate data sharing across countries and in industries such as healthcare and financial services. In the UK, for example, the Nationwide Building Society used its transaction data to generate synthetic datasets that could be shared with third-party developers without risking customer privacy, she said. Parker said synthetic data will also enable organisations to plug gaps in the actual data used by artificial intelligence (AI) models. 
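The synthetic data idea described above can be illustrated with a minimal sketch. This is a simplification, not Nationwide's or Gartner's actual method, and the field names and values are hypothetical: generate new rows that preserve each field's mean and spread without copying any real record.

```python
import random
import statistics

def synthesize(records, n, seed=42):
    """Generate n synthetic rows that mimic each field's mean and
    spread in the originals, without copying any real row."""
    rng = random.Random(seed)
    stats = {}
    for field in records[0]:
        values = [r[field] for r in records]
        stats[field] = (statistics.mean(values), statistics.pstdev(values) or 1.0)
    return [{f: round(rng.gauss(mu, sigma), 2) for f, (mu, sigma) in stats.items()}
            for _ in range(n)]

# Hypothetical transaction records standing in for sensitive customer data.
real = [{"amount": 120.0, "age": 34}, {"amount": 80.0, "age": 41},
        {"amount": 95.0, "age": 29}]
fake = synthesize(real, 5)
print(len(fake), sorted(fake[0]))  # 5 ['age', 'amount']
```

Production-grade generators model joint distributions and correlations rather than independent per-field statistics, but the principle is the same: analysts and third parties get data with the original's statistical shape, not the original records.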



Quote for the day:

"Leverage is the ability to apply positive pressure on yourself to follow through on your decisions even when it hurts." -- Orrin Woodward

Daily Tech Digest - November 09, 2022

5 ways to use predictive insights to get the most from your data

With the proliferation of SaaS tools, we seem to be collecting so much more data, yet most companies still struggle to integrate it properly to extract insights that would be indicative of future performance. There are a variety of reasons for that: internal data privacy, legacy mindset around who owns what data, lags in data warehousing strategy or operational know-how about the mechanics of integrating it. ... The CMO Survey found that after a decade of integrating customer data across channels, marketers are still struggling, with most giving their organization a 3.5 out of 7 score on the effectiveness of their customer information integration across purchasing, communication and social media channels. ... Too often organizations are overly focused on dashboards and analyzing past trends to determine future actions. Dashboards and reports are often thought of as the final deliverables of data, but this thinking is limiting data’s value. Think about how your acquisition, monetization and retention journeys are orchestrated today, then feed predictive scoring data right into those business systems and tools. 


Coming Clean: Why Cybersecurity Transparency Is A Strength, Not A Weakness

In the wake of the new disclosure proposals, the management of cybersecurity events can no longer be an afterthought in maintaining operating standards. It’s now been elevated to a major concern along with financial risks, such as capital and credit risk. Despite the technical challenges, compliance is generally straightforward. Organizations must develop discipline in how they detect and defend against cyber threats. In addition, they must improve the way they report on them. If they don’t want their next cyber incident to turn into a material event, they need to minimize the risk of a breach in the first place. Remember, the opposite of due diligence is negligence. One way to get started is to focus on the application layer, as that’s where the “money” is. Decades of focus on network-based threats have improved the protection from some cyberattacks, but many business applications remain vulnerable. Applications suffer numerous vulnerabilities outlined by the OWASP Top 10. These are known, common threats that can be countered by using Web application firewalls.


AI eye checks can predict heart disease risk in less than a minute, finds study

“This AI tool could let someone know in 60 seconds or less their level of risk,” the lead author of the study, Prof Alicja Rudnicka, told the Guardian. If someone learned their risk was higher than expected, they could be prescribed statins or offered another intervention, she said. Speaking from a health conference in Copenhagen, Rudnicka, a professor of statistical epidemiology at St George’s, University of London, added: “It could end up improving cardiovascular health and save lives.” Circulatory diseases, including cardiovascular disease, coronary heart disease, heart failure and stroke, are major causes of ill health and death worldwide. Cardiovascular disease alone is the most common cause of death globally. It accounts for one in four deaths in the UK alone. While several tests to predict risk exist, they are not always able to accurately identify those who will go on to develop or die of heart disease. Researchers developed a fully automated AI-enabled tool, Quartz, to assess the potential of retinal vasculature imaging – plus known risk factors – to predict vascular health and death.


Mobile Application Security Best Practices

Strong credentials are a must for both web and mobile application development. For mobile apps, you can choose to either have a native login flow, which means the user enters their credentials within the app, or a web-based login flow, where they are directed to a web browser to log in. Native login flows provide a better user experience but are generally thought to be less secure. Hypermedia authentication APIs are a solution now popping up to bridge this gap and provide the best of both worlds. Hypermedia authentication APIs interact with the authorization server directly without the need for an intermediary like the browser window. Regardless of how the user enters their credentials, your app should enforce some type of password policy to ensure a strong password is used, and it should not store the access and refresh tokens anywhere except secure storage (like the iOS keychain or Android Keystore). ... Finally, your mobile app should follow best practices for secure coding, just as you would with web applications. Security should be incorporated from the start of the app’s design, with testing occurring throughout the development process.
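As a small illustration of the password-policy point, here is a hedged sketch of policy enforcement. The specific rules and thresholds are assumptions for the example, not a standard; real apps should follow current NIST/OWASP guidance.

```python
import re

# Hypothetical minimal password policy: length, mixed case, digit, symbol.
RULES = [
    (r".{12,}", "at least 12 characters"),
    (r"[a-z]", "a lowercase letter"),
    (r"[A-Z]", "an uppercase letter"),
    (r"\d", "a digit"),
    (r"[^\w\s]", "a symbol"),
]

def check_password(pw):
    """Return a list of unmet policy requirements (empty means the password passes)."""
    return [msg for pattern, msg in RULES if not re.search(pattern, pw)]

print(check_password("hunter2"))             # fails length, uppercase, symbol
print(check_password("Str0ng&Secure-Pass"))  # meets every rule -> []
```

The same check can run in the app and again server-side, since client-side validation alone is trivially bypassed.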


Cybersecurity threats: what awaits us in 2023?

Businesses will still be mostly concerned with ransomware. The conflict between Russia and Ukraine has marked an end to any possible law enforcement cooperation in the foreseeable future. We can therefore expect that cybercrime groups from either bloc will feel safe to attack companies from the opposing side. Some may even perceive this as their patriotic duty. The economic downturn will lead more people to poverty, which always translates to increased criminality, and we know ransomware to be extremely profitable. ... Zero trust will take on greater prominence with the continued role of the remote and hybrid workplace. Remote work will continue driving the need for zero trust since hybrid work is now the new normal. With the federal government mandating agencies to adopt zero-trust network policies and design, we expect this to become more common and the private sector to follow suit as 2023 becomes the year of verifying everything. ... In 2023, we might see a slight decline in the raw number of ransomware attacks, reflecting the slowdown of the cryptocurrency markets.


Google and Renault are creating a 'software-defined vehicle'

Renault will leverage Google's Cloud technology to securely manage data capture and analytics. They'll also use Google's ML and AI capabilities. "Our collaboration with Renault Group has improved comfort, safety, and connectivity on the road," Sundar Pichai, CEO of Google and Alphabet, said in a statement. "Today's announcement will help accelerate Renault Group's digital transformation by bringing together our expertise in the cloud, AI, and Android to provide for a secure, highly-personalized experience that meets customers' evolving expectations." Google shares that some features of the SDV will include predictive maintenance, accurate real-time detection of vehicle failures, a better driving experience, and insurance models reflective of driving behaviors. "Equipped with a shared IT platform, continuous over-the-air updates, and streamlined access to car data, the SDV approach developed in partnership with Google will transform our vehicles to help serve future customers' needs," said Luca de Meo.


Why automating finance is just an integration game

What is clear is the increasing demand for decision intelligence with financial analytics at its heart. RPA suppliers are increasingly repositioning themselves as automated intelligence companies, using RPA tools to drive key functions, such as finance. Gartner believes a third of large organisations will be using decision intelligence for structured decision-making to improve competitive advantage in the next two years. Recent research by enterprise application integration firm Jitterbit backs this up. Focusing on mid-sized companies (referred to as Mittelstand) in the DACH region (comprising Germany, Austria and Switzerland), Jitterbit found that 73% of these businesses want to be hyperautomated within three years because “the health of their company depends on it”. The barriers to achieving this are typical – too many manual data processes, isolated data silos and a lack of departmental integration. What is becoming clear is that financial analytics can be the core and the catalyst of intelligent automation transformations.


Detecting Cyber Risks Before They Lead to Downtime

To avoid costly downtime, threats to operational continuity must be detected and investigated as early as possible. That can be accomplished by scanning connected devices for configuration changes and vulnerabilities. However, unlike traditional IT, OT assets cannot be continuously scanned in the same manner and many risks will remain unnoticed. Instead, a system designed for manufacturing environments must have the ability to passively monitor the network infrastructure to locate assets and detect behavior changes and anomalies. That requires understanding dozens of industrial protocols and continuously monitoring the communications and checking against a database of OT/ICS-specific Indicators of Compromise (IOCs, or evidence of a breach) and CVEs. The bane of many monitoring systems is they produce a flood of information about potential harm, not all of it urgent. To be useful, critical alerts must be prioritized based on operational or cybersecurity risk so the right team can respond. For example, OT engineers need to quickly spot undesired process values, incorrect measurements or when a critical device fails so they can resolve issues more quickly.
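The alert-prioritization step described above can be sketched simply. The asset criticalities, alert categories, and weights below are illustrative assumptions, not values from any real OT monitoring product: each alert is ranked by a combined score of asset criticality and detection severity so the most urgent items surface first.

```python
# Hypothetical triage weights: how critical each asset class is to operations,
# and how severe each kind of detection is.
ASSET_CRITICALITY = {"plc": 3, "hmi": 2, "historian": 1}
SEVERITY = {"ioc_match": 3, "config_change": 2, "anomaly": 1}

def score(alert):
    """Combined operational/security risk score for one alert."""
    return (ASSET_CRITICALITY.get(alert["asset"], 0)
            * SEVERITY.get(alert["kind"], 0))

def prioritize(alerts):
    """Order alerts so the highest combined risk comes first."""
    return sorted(alerts, key=score, reverse=True)

alerts = [
    {"asset": "historian", "kind": "anomaly"},
    {"asset": "plc", "kind": "ioc_match"},
    {"asset": "hmi", "kind": "config_change"},
]
top = prioritize(alerts)[0]
print(top["asset"], top["kind"])  # plc ioc_match
```

A real system would fold in many more signals (process values, CVE severity, exposure), but the point stands: ranking, not raw volume, is what lets the right team respond to the flood of alerts.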


Challenges to Successful AI Implementation in Healthcare

Incorporating AI systems could improve healthcare efficiency without compromising quality, and this way, patients could receive better and more personalized care. Investigations, assessments, and treatments can be simplified and improved by using AI systems that are smart and efficient. However, implementing AI in healthcare is challenging because it needs to be user-friendly and deliver value for patients and healthcare professionals. AI systems are expected to be easy to use, self-instructing, and not require extensive prior knowledge or training. Besides being simple to use, AI systems should also be time-saving and never demand different digital operating systems to function. ... The healthcare experts noted that implementing AI systems in the county council will be difficult due to the healthcare system’s internal capacity for strategic change management. To build the capability to work with AI implementation strategies at the regional level, experts highlighted the need for infrastructure and joint ventures with familiar structures and processes.


AI Ethics: Four Essentials CIOs Must Know

Enterprises must investigate how the data used to train the algorithm is used in order to develop explainable AI. Although this won’t address the bias issue, it will guarantee that firms are aware of the underlying causes of any problems so they can take appropriate action. Synthetic data, in addition to actual data sets, is essential for addressing ethical issues. For instance, synthetic data can be used to correct biases in real data that are unjust and skewed toward particular groups of individuals. Additionally, synthetic data can be used to boost the volume and produce an objective dataset if the volume is inadequate. ... Executives must design AI systems that can instantly identify fabricated data and immoral behavior. This necessitates screening suppliers and partners for the improper use of AI in addition to examining a company’s own AI. Examples include the employment of convincing false text and videos to discredit competitors or the use of AI to carry out sophisticated cyber-attacks. As AI technologies become more accessible, this problem will worsen.



Quote for the day:

"Good leaders make people feel that they're at the very heart of things, not at the periphery." -- Warren G. Bennis

Daily Tech Digest - November 08, 2022

Public sector IT projects need ethical data practices from start

“When you openly communicate your purpose – when you’re open about how you’re doing things, how data has been used – you instil the most valuable thing when doing data projects, which is trust,” he says. “We are probably in a place where people don’t trust organisations or the government with their data, and that’s a bad place to be… so the transparency, openness, communicating purpose, engaging with people is fairly key.” Ahmed adds that, even in situations where it is not possible to consult the public – for example, when systems are being built for law enforcement or intelligence purposes, or just internally for use by civil servants – openness among the internal stakeholders is still very important. All of this should also be documented: “To make [ethical] framework that’s suitable, to what you’re doing as an organisation, keep referring to it, refresh it, measure your outcomes, learn from it… the potential loss of revenue or reputation for an organisation or government departments are huge if it goes wrong. With ethics, the win is not getting things wrong, the win is having less negative impact or more positive impact.”


Securing APIs and Microservices in the Cloud

No matter what your role is within the software development and API process, you need to be thinking about the implications of your work. Are you being as secure as possible? Because we end up with delays, fails, or worse. What I mean by this is, obviously, you have delays or fails. That can just be annoying when you end up with everything behind schedule, over budget. Everyone a little bit stressed. It's more than just that. If we look at security over the last five years, it used to be a case where you would get your personal details stolen. It might be credit cards. It might be address, or in America, social security number. What you're starting to see now, especially in the last year or two, is a lot more ransomware. While before, it would be like, we're going to take your data, and that's obviously bad. People can replace credit cards. It's a bit harder to replace your social security, there has been large rises of fraud. It's bad. Ransomware is worse, because then you are totally locked out of your system. Then you are, as the name suggests, at ransom.


The Fiction of Sentient AI

Artificial intelligence has a unique possibility that tickles this part of the imagination: it seems familiar. We can recognize a part of ourselves in it: the systematic thinking part. It is rationality without irrationality. Pure reason in action where the results are always clean and binary. There is a non-human purity of thought in the elegance of its solutions. We have always yearned futilely for this purity – in our food, our philosophical quests, our relationships, even in our gods. But this purity, the simple and bold ease with which we solve at a rapid pace our most bothersome tasks, creates both complacency and trepidation. The complacency is apparent. We have always, as a species, had an infinite capacity for self-congratulation. The trepidation part is more intriguing. It exposes how poorly we have actually sized ourselves up, and reveals at once both the fallacy of thought, its ambitions and its limitations. The fever-dream of sentient AI and the climax of our race is a trite template for thousands of sci-fi stories across several media.


Ways to spot if your organisation has a false sense of security – and what to do about it

We tend to think of the security team as solely responsible for all aspects of systems security implementation and management, with the IT team more focused on enabling the workstreams of the organisation through data protection, and ensuring backup and recovery systems are properly implemented. Given the complementary nature of security threats, and the need to easily back up and recover if/when an attack occurs, it seems logical that these two groups would collaborate closely. But that’s often not the case. In fact, we found this split between the two roles to be worryingly prevalent in our research. 19% of UK SecOps decision-makers responding to our survey believe collaboration with IT is not strong, and 5% went as far as to call it “weak.” Flipping the coin, among IT decision-makers, 16% believe collaboration is not strong. Across the two roles, in total, 20% of IT and SecOps respondents believe the collaboration between the two is not strong.


How ‘synthetic media’ will transform business forever

The world is on the brink of a revolution in various “realities” — virtual, augmented, and mixed (VR, AR and MR). Mark Zuckerberg’s Meta is planning a grand pivot from old-school social networking to next-generation virtual or mixed reality, which Zuckerberg famously branded “the Metaverse.” By definition, AR, VR, and MR involve digital content, either existing in a digital world (VR) or superimposed on the real world (AR). Nearly all of this content will come in the form of synthetic media. In fact, Meta has already introduced a new AI-powered synthetic media engine, called Make-A-Video. As with the new generation of AI-art engines, Make-A-Video uses text prompts to create videos. Meta is currently promoting this engine as a very fast way for creators to create video content or virtual environments. Normally, a company making, say, marketing content would need to hire a production crew, pay for post-production work, hire actors, find a location — all that. But products like Make-A-Video suggest that, in the near future, a single creative could make video productions alone in a few hours.


Low-code and no-code automation accelerates user experience for financial institutions

Low-code and no-code automations help business users, non-engineers and non-developers solve end-to-end customer experience problems easily and quickly by creating the use cases themselves. The modern technology stack of low-code automation tools enables citizen developers to solve their problems and replicate solutions at scale across other businesses without having to rely on overburdened IT staff. One example: many traditional systems focus on identity verification or KYC, whereas much of the fraud in loan financing may happen through a fraudulent pay stub. As pay stubs correlate with whether a consumer will pay back their loans successfully on time, it is important to apply AI/ML automation to validate this important correlator and avoid defaults. Distinguishing a fraudulent transaction from a clean one quickly and correctly also benefits merchants as well as end customers, through better user experience and revenue generation.
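The article describes AI/ML validation of pay stubs; as a rough stand-in for that idea, here is a simple rule-based consistency check. The field names and rules are hypothetical, and a production system would use trained models on far richer features.

```python
# Hypothetical rule-based sketch: flag pay stubs whose numbers
# don't add up before trusting them as a repayment signal.
def paystub_flags(stub):
    """Return a list of consistency problems found in a pay stub dict."""
    flags = []
    expected_net = stub["gross"] - stub["deductions"]
    if abs(expected_net - stub["net"]) > 0.01:
        flags.append("net pay does not equal gross minus deductions")
    if stub["gross"] <= 0:
        flags.append("non-positive gross pay")
    return flags

clean = {"gross": 4000.0, "deductions": 900.0, "net": 3100.0}
fishy = {"gross": 4000.0, "deductions": 900.0, "net": 3900.0}
print(paystub_flags(clean), paystub_flags(fishy))
```

Even this trivial arithmetic check catches crude fabrications; the ML layer the article alludes to would extend the same idea to patterns rules cannot express.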

One of the challenges of developer platforms is that they shouldn’t be viewed as things that can just be built, launched, and then forgotten about; they require constant evolution and maintenance. The feedback loops initiated by a considered communication strategy will help here, but it’s also important to consider the ways the platform evolves alongside the organization and emerging technologies. A technique included for the first time in Volume 27 of our Technology Radar—incremental developer platforms—can be particularly useful as a way of responding to these multifaceted demands. Such an approach not only ensures alignment with the specific needs of users, but also prevents the platform from being derailed by over-ambition—something that typically stems from a preconceived vision of what the platform should look like. The virtues of an incremental approach to software are widely accepted by the industry, so why don’t we bring this thinking to the way we think about platforms and internal tooling?


Question your successes as much as your failures

What did you do to pull it off? Can it be replicated? What did you learn? It’s an exercise that many leaders I’ve interviewed talked up even before the pandemic. “I’ve learned to question success a lot more than failure,” said Kat Cole, who is the president and chief operating officer of Athletic Greens, a nutrition company. As she told me in an interview years ago, “I’ll ask more questions when sales are up than I do when they’re down. I ask more questions when things seem to be moving smoothly, because I’m thinking: ‘There’s got to be something I don’t know. There’s always something.’ This approach means that people don’t feel beat up for failing, but they should feel very concerned if they don’t understand why they’re successful.” Successes can feel like moments for celebration, rather than furrowed-brow scrutiny. But it is precisely those moments of interrogation that can lead to insights and longer-term competitive advantages. Today’s shorter economic cycles create more momentum, both good and bad, and you want to be riding a wave rather than trying to paddle against it.


What is confidential computing?

There are as many ways confidential computing can work as there are companies coding them, but recall the definition noted above. Google Cloud uses confidential virtual machines with the Secure Encrypted Virtualization (SEV) extension supported by 3rd Gen AMD EPYC CPUs. Data remains encrypted in memory with node-specific, dedicated keys that are generated and managed by the processor; these keys are created within the hardware during node creation and never leave it. Today, IBM claims to be on the fourth generation of its confidential computing products, starting with IBM Cloud’s Hyper Protect Services and Data Shield in 2018. In pride of place among Hyper Protect services is a FIPS 140-2 Level 4 certified cloud hardware security module. Both products are rated for regulations such as HIPAA, GDPR, ISO 27K and more. IBM also offers HPC Cluster, a portion of IBM Cloud where customers’ clusters are made confidential using “bring your own encrypted operating system” and “keep your own key” capabilities.


The security dilemma of data sprawl

Security teams can look at specific verticals to understand what’s working (and what’s not) when it comes to limiting data sprawl in the workplace. The finance sector, for instance, is a prime example of an industry that has more stringent security controls and regulations, therefore limiting apps in the workplace. The Netskope Threat Labs team recently found that fewer than 1 in 10 employees in finance use personal applications at work. Instead, they use managed apps that are closely monitored by security teams. Other sectors are having a more difficult time limiting data sprawl, given the remote nature of the business and less stringent industry regulations. Retail employees, for example, are using a slew of cloud apps in the workplace regularly. In fact, 40% of users in the retail industry are uploading data to personal apps. It is crucial for IT security teams, not just in this sector but across all industries, to take proactive measures to help minimize the risk of data sprawl.



Quote for the day:

"The quality of leadership, more than any other single factor, determines the success or failure of an organization." -- Fred Fiedler and Martin Chemers

Daily Tech Digest - November 07, 2022

Introduction to SPIFFE/SPIRE

The Secure Production Identity Framework For Everyone (SPIFFE) is a specification for workload identity. According to Gilman, the easiest way to think about SPIFFE is as a passport. Similar to how people are issued passports in a common shape with a barcode and standard information, SPIFFE dictates the standard methods to prove and validate the identity of a service. It’s like bringing the “Sign in with Google” experience to the software services themselves, he adds. There are three key components in SPIFFE. First, SPIFFE specifies that services shall identify themselves with what’s called a SPIFFE ID, which is defined as a URI in the format of spiffe://trust-domain-name/path. These IDs are then encoded into a SPIFFE Verifiable Identity Document or SVID. SVIDs aren’t so much a document type themselves — instead, they support either X.509 or JWT document types. Last but not least, SPIFFE specifies a workload API that issues and rotates these SVIDs, along with the keys needed to validate them. SPIRE is the code that implements the SPIFFE specification—you can think of it as a production-ready SPIFFE runtime environment. 
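The spiffe://trust-domain-name/path shape described above can be sketched in a few lines. This is a simplified illustration, not a conformant implementation: the SPIFFE specification imposes further constraints on trust domain names and paths that are omitted here.

```python
from urllib.parse import urlparse

def parse_spiffe_id(uri):
    """Split a SPIFFE ID into (trust_domain, path); reject malformed input.
    Simplified: the real spec constrains allowed characters and forbids
    things like query strings and fragments."""
    parsed = urlparse(uri)
    if parsed.scheme != "spiffe" or not parsed.netloc:
        raise ValueError(f"not a SPIFFE ID: {uri!r}")
    return parsed.netloc, parsed.path

domain, path = parse_spiffe_id("spiffe://example.org/billing/payments")
print(domain, path)  # example.org /billing/payments
```

In practice a workload never parses its own identity by hand like this; it receives an SVID (and the keys to validate peers' SVIDs) from the SPIFFE workload API that SPIRE implements.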


C-Suite: Businesses are placing a high priority on cybersecurity

The C-suite now frequently discusses cybersecurity in the boardroom. IT and business leaders have historically had difficulty cooperating on cyber risk management, but this disagreement seems to be worse than ever right now. According to a study, over 90% of IT decision-makers think their organisation would be willing to forego cybersecurity in favour of other objectives. Such a strategy of trading security for short-term gains is not worth the cybersecurity risk, which includes monetary losses and reputational harm. An organisation must resolve this business-IT conflict and come to a consensus on cyber risk as a crucial component of business risk in order to succeed in the post-pandemic era of hybrid or remote workforces. Organisations will be able to maximise their commercial opportunities and prevent pricey breaches by using this to better identify, communicate, and mitigate cyber risk across the workplace. Additionally, research shows that 38% of business decision-makers and 50% of IT leaders believe the C-Suite fully comprehends cyber dangers.


Disadvantages of industrial IoT

Using IIoT creates massive amounts of data. That wouldn’t matter, were it not for the fact that this information needs to be processed quickly in order to be of any use. Especially when applied to digital operations, data processing is key to success. Additionally, all this generated information brings matters of privacy and security into question. IoT itself is a relatively new concept, and protecting the data that it collects will require companies to find different and more efficient ways to sort through digital assets. At the very least, businesses operating with IIoT technology should be sure to invest in secure cloud computing infrastructure. Without strong digital assets, IIoT implementation will become even more complicated and risky than it already is. ... Transitioning to IIoT is costly. Regardless of the need for new systems, as mentioned above, current IoT expenses are already high. This is because IIoT uses sophisticated software to analyze productivity and predict future trends and issues. It is also capable of deploying smart-sensing software for use in technology and agricultural businesses. Combined with the network that IIoT provides to companies, the expense of developing a digital strategy can be hefty.


10 future trends for working with business leaders

CEOs of the world’s largest companies tell IDC that they already make around 30% of their revenue from digital products, and they expect that proportion to grow in the years to come. IDC identifies three dimensions along which enterprises can achieve this growth. First, they can exploit new channels: e-commerce, mobile apps, or the creation of new distribution paths such as enabling the circular economy. Second, they can adopt additional revenue models: pay-per-use, subscriptions, dynamic pricing, transaction fees, or payment for outcomes. And third, they can seek to monetize new digital assets: data, intellectual property, or virtual objects. Developing such new revenue streams requires that CIOs keep pressing ahead with digital spending. “If you pause, you’re already behind,” he says. Building new products may involve skills that CIOs don’t yet have on their roster. “You have to have the right mix of in-house and partners that can enable quicker development,” says Powers.


How to prepare for a SOC 2 audit – it’s a big deal, so you’d better get ready

“Companies tend to write their controls down and never look at them again, so preparing for the audit is an appropriate time to look at and update them if they don’t reflect what you’re doing,” says Paul Perry, a member of the Emerging Trends Working Group with the governance group ISACA and the Security, Risk and Controls Practice Leader with accounting and advisory firm Warren Averett. Auditors want to see well-documented policies, but they also want to see them in action to verify that organizations are doing in day-to-day practice what those policies say they should be doing. For example, software engineers may be testing code, but they need to do so in a manner that follows the process and documentation requirements outlined in the organization’s policies. That’s the kind of action auditors will want to see, Yawn says. Review security and privacy controls to ensure they’re aligned with the organization’s own security and privacy policies as well as regulatory requirements and industry best practices.


Does data need to come from a single source of truth?

Using potentially flawed data in the decision-making process not only leads to incorrect decision-making, but can have a negative impact on future data operations. If there isn’t real clarity about where the source of the data is, what its quality is and what it really means, how can employees really trust that data? And if they can’t trust it, the consequences can be serious, with executives developing a negative view of data-driven decision making and underinvesting in future data projects. It’s a vicious data circle that can end in a business not fully realising the true value from arguably its most important asset. It is crucial, therefore, that data is trusted and accurate, but ensuring data is reliable across multiple different sources is another challenge entirely. The key is giving employees a single pane of glass through which to see all of the available data. This not only provides a single point of reference for employees that allows them to search for data on a reliable platform, but also gives them access to data from a wide range of different sources such as CRM or ERP systems.
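The "single pane of glass" idea can be sketched as a view that pulls the same customer's fields from several systems and tags each value with its source, so users can see where data came from rather than trusting a single copy blindly. The system names, IDs, and fields below are hypothetical.

```python
# Hypothetical records for the same customer held in two systems.
crm = {"C-100": {"name": "Acme Ltd", "owner": "jsmith"}}
erp = {"C-100": {"name": "Acme Ltd", "credit_limit": 50000}}

def unified_view(customer_id, sources):
    """Merge one customer's fields across systems, keeping
    (source, value) pairs so provenance stays visible."""
    view = {}
    for source_name, table in sources.items():
        for field, value in table.get(customer_id, {}).items():
            view.setdefault(field, []).append((source_name, value))
    return view

view = unified_view("C-100", {"crm": crm, "erp": erp})
print(sorted(view))  # ['credit_limit', 'name', 'owner']
```

Fields reported by multiple systems (here, "name") keep every source's value, which is exactly what makes conflicts, and therefore quality problems, visible instead of silently resolved.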


How Chipmakers Are Implementing Confidential Computing

"Everybody wants to continue to reduce the attack surface of data," says Justin Boitano, vice president and general manager of Nvidia's enterprise and edge computing operations. "Up to this point, it is obviously encrypted in transit and at rest. Confidential computing solves the encrypted in-use at the infrastructure level." Nvidia is taking a divergent approach to confidential computing with Morpheus, which uses artificial intelligence (AI) to keep computer systems secure. For example, Morpheus identifies suspicious user behavior by using AI techniques to inspect network packets for sensitive data. "Security analysts can go and fix the security policies before it becomes a problem," Boitano says. "From there, we also realize the big challenges — you have to kind of assume that people are already in your network, so you have also got to look at the behavior of users and machines on the network." Nvidia is also using Morpheus to establish security priorities for analysts tracking system threats.
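The workflow Boitano describes, inspecting traffic for sensitive data so analysts can fix policies before a leak becomes a problem, can be sketched with a toy rule-based scanner. Morpheus itself applies trained AI models at line rate; the regex patterns and packet strings below are illustrative stand-ins for that detection step, not part of the Morpheus API.

```python
import re

# Toy illustration of payload inspection for sensitive data: flag
# packets whose contents match patterns that suggest a leak. Real
# systems like Morpheus use AI models rather than fixed regexes.

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def flag_sensitive(payload: str) -> list:
    """Return the names of sensitive-data patterns found in a payload."""
    return [name for name, rx in PATTERNS.items() if rx.search(payload)]

packets = [
    "GET /index.html HTTP/1.1",
    "POST /signup user=jane@example.com",
]
for p in packets:
    hits = flag_sensitive(p)
    if hits:
        print(f"ALERT: {hits} found in packet: {p!r}")
```

The alerts would feed an analyst queue, which is where the behavioural analysis of users and machines that Boitano mentions takes over.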


Memory-Based Cyberattacks Become More Complex, Difficult To Detect

There are two broad classifications of memory attacks. The first involves attacks on storage devices that are used to boot or load an operating system or software for a machine. Greenberg said that often, but not always, these require physical access to the machine to mount an effective attack on the storage, although an already compromised machine may further corrupt the storage such that the machine remains permanently compromised until it is completely erased and restarted. Encryption can help protect these storage devices. The second involves RAM devices that store temporary data. These devices are more likely to be attacked through the machine itself, including through internet-connected attacks. Physical attacks on RAM are also a possibility. Most systems’ security comes from physical security combined with built-in memory protection and run-time security provided through the system. “But as new ways of exploiting cybersecurity weaknesses are discovered over time, more advanced memory types tend to contain mitigating features for those methods,” Greenberg said.


The new CIO security priority: Your software supply chain

Whether it’s components, APIs, or serverless functions, most organizations underestimate what they’re using by an order of magnitude unless they run routine inventories, Worthington points out. “They find out that some of these APIs aren’t using proper authentication methods or are maybe not written in a way that they expected them to be and maybe some of them are even deprecated,” she says. Beyond vulnerabilities, evaluating the community support behind a package is as important as understanding what the code does, because not all maintainers want the burden of having their code treated as a critical resource. “Not all open source is made the same,” she warns. “Open source may be free to download but certainly the use of it is not free. Your use of it means that you are responsible for understanding the security posture behind it, because it’s in your supply chain. You need to contribute back to it. Your developers need to participate in fixing vulnerabilities,” says Worthington, who suggests organizations should also be prepared to contribute monetarily, either directly to open-source projects or to initiatives that support them with resources and funds.
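The routine inventory Worthington recommends can start very simply. As a sketch, the standard-library `importlib.metadata` module can enumerate every Python distribution installed in an environment, producing a list that could then be checked against vulnerability feeds; dedicated SBOM generators and tools such as `pip-audit` go much further, so treat this only as the first step.

```python
from importlib.metadata import distributions

# Minimal dependency inventory: map each installed Python distribution
# to its version. Real supply-chain tooling would cross-reference this
# list against vulnerability databases and maintenance signals.

def inventory() -> dict:
    """Return {distribution name: version} for the current environment."""
    return {d.metadata["Name"]: d.version for d in distributions()}

deps = inventory()
print(f"{len(deps)} packages installed")
for name, version in sorted(deps.items())[:5]:
    print(f"  {name}=={version}")
```

Running an inventory like this on a schedule, rather than once, is what surfaces the forgotten and deprecated components the article warns about.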


The Future of DevOps Is No-Code

For DevOps, the starting point for upskilling is to train non-DevOps personnel to become effective members of the DevOps team. And this is where no-code and low-code DevOps tools come in. With no-code and low-code tools, even complete development novices can learn to build websites and applications. If someone has enough computer knowledge to drag-and-drop, they can probably learn no-code tools. And those with a little more computer savvy can put low-code tools to good use. As their name suggests, no-code and low-code tools facilitate software and application development with minimal need for writing or understanding code. Instead of building code, developers rely on visual, drag-and-drop processes to piece together pre-made functionality. So instead of needing to understand the intricacies of specific programming languages, developers need only have a good feel for the business’s needs, the overall application architecture, and the application’s workflows.



Quote for the day:

"You don't have to hold a position in order to be a leader." -- Anthony J. D'Angelo