Daily Tech Digest - October 02, 2022

CIOs Still Waiting for Cloud Investments to Pay Off

“You discover your cloud architecture is immature when you’re surprised by the bill,” said Mr. Roese. “Usually the reason it was expensive is that they processed the data in a suboptimal way, in the wrong place with the wrong tools with the wrong economic model.” Roughly 67% of 1,000 senior technology leaders at U.S. firms across industries said they have yet to see a significant return on cloud investments, KPMG said Thursday in its annual technology survey. “Many first movers expected significant IT cost efficiency from their cloud investments,” said Barry Brunsman, a principal in KPMG’s CIO Advisory group. “We have seen a shift away from that expectation in favor of speed and agility.” Mr. Brunsman said the most common issues preventing a better return on cloud spending were insufficient talent or skills among tech teams, added security and compliance requirements, and a misalignment on expected outcomes. Global cloud spending is projected to reach $830.5 billion by the end of the year, up 17.5% from 2021, but slowing from last year’s growth rate of 18.3%, according to International Data Corp. It expects growth to drop to 16.3% next year.


To Code or Not to Code: The Benefits of Automating Software Testing

The hoped-for solution to these problems has been test automation. This offers a way to reduce manual involvement, test greater volumes, remove the risk of human error and accelerate time to market as much as tenfold for a serious competitive advantage. Yet, even organizations that have rolled out automated testing have discovered that a big problem remains. They can’t easily scale these solutions because professional-grade coding skills are still required. Even if they are marketed as “low-code,” they’re still far too complex for business users. Even testers usually lack the coding skills to set up tests on their own. As a result, coding skills remain a resource bottleneck that slows down the testing process and limits collaboration with business users. What’s more, as automated testing becomes more pervasive throughout the organization, the maintenance workload grows. All those test engineers an organization hired to implement an automation framework increasingly spend their time maintaining code instead of building bigger and smarter test scopes. Scaling becomes impossible.


Where JavaScript is headed in 2022

Angular is showing ominous signs of weakness around retention and interest, ranking near the bottom at #9. Nevertheless, it remains #2 for actual usage, and #3 for awareness. Vue continues to be a strong contender, with a decent ranking across all categories. Overall the story on the front end is of incremental refinements, rather than revolutionary upheaval. And on the back end? Next.js instigated the full-stack JavaScript movement, and remains second only to Express in both awareness and usage. The comparison of Next to Express is of course imperfect. Express is a server-side framework only, the workhorse of Node-based HTTP. ... Another interesting discovery in relation to languages that compile to JavaScript is the popularity of Elm. Elm is an ingenious functional language geared for web development, and highly regarded for its enablement of fast and fluent applications. But it’s also a mothballed project without any commits for months. The takeaway? Clearly the basic ideas in Elm are still desirable and popular. Perhaps a new leader could take up the project and carry it forward to the benefit of the entire ecosystem.


Blockchain: An Immutable Future?

The general consensus is that the pandemic delayed the adoption of distributed ledger technologies. Companies worldwide felt the repercussions of supply chain disruption and changes to consumer habits, which meant that the implementation of blockchain fell low on priority lists. However, in some cases, blockchain technology was used effectively to coordinate logistics. Despite the limited use, the global crisis helped drive further discussion about blockchain’s benefits; for example, ledger technologies could potentially have helped counter the fake vaccines that flooded the market during the height of the pandemic. This topic also feeds into a wider conversation about the influx of counterfeit medicines in pharma supply chains. ... The computing power needed to mine (add information to the blockchain) plus the duplication of work is an obvious source of environmental concern. Beyond environmental challenges, every piece of data added to a blockchain needs to be transcribed onto every copy of the blockchain, resulting in a much greater cost than server or cloud storage. 


The 5 Absolute Best X-Factors That Increase Enterprise Value

The problem with the traditional business model is that you don’t know when clients will return, if at all. Both cash flow and forecasting are problematic. The first step in transforming your business model is implementing recurring revenue. Again, you can look to Software as a Service (SaaS) for inspiration. Chances are, you pay a monthly fee for your favorite movie streaming service. Customers face no hidden billing surprises because they know exactly what they’ll pay. You enjoy predictable cash flow and can budget accurately. The second step in transforming your business model is creating long-term exclusive contracts. Your mission is to show clients why they are better off with a long-term commitment. ... A rich and thriving culture is the foundation of your business. It’s your culture that shapes the customer experience. Your culture determines whether you leverage a market opportunity or not. Buyers look for a culture that promotes resilience, innovation, and accountability. Your future buyer knows that people come and go in a business, including you. A rich and thriving culture transcends people and takes on a life of its own.


Three Ways Banks Can Engage Younger Consumers in the Metaverse

The metaverse lets banks roll out the virtual red carpet for customers, with tailored experiences for specific segments and personas. Personalized virtual banking enables that special something that leaves customers feeling valued. Within a metaverse branch, banks can create virtual rooms in which avatars of relationship managers and customer advisors work one-on-one with high-net-worth individuals, for instance. They might also provide services to individuals looking to create a college fund or businesses interested in obtaining loans. Metaverse banking’s combination of personalization and community puts a fresh, modern spin on CX, and it’s an especially powerful draw for young banking consumers who are critical to the future of banking. ... To connect with the next generation of connected consumers, banks must begin building their presence among the more popular metaverses and increase engagement with younger demographic audiences through 3D banking, personalized services and DAOs. The good news? For payment providers and retail and commercial banks, there are no obstacles to preparing now, and it is not too late to get ahead. 


Biology Inspires a New Kind of Water-Based Circuit That Could Transform Computing

Since this is closer to the way the brain transports information, they say, their device could be the next step forward in brain-like computing. "Ionic circuits in aqueous solutions seek to use ions as charge carriers for signal processing," write the team led by physicist Woo-Bin Jung of the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in a new paper. "Here, we report an aqueous ionic circuit… This demonstration of the functional ionic circuit capable of analog computing is a step toward more sophisticated aqueous ionics." A major part of signal transmission in the brain is the movement of charged molecules called ions through a liquid medium. Although the incredible processing power of the brain is extremely challenging to replicate, scientists have thought that a similar system might be employed for computing: pushing ions through an aqueous solution. This would be slower than conventional, silicon-based computing, but it might have some interesting advantages.


Failure of Russia’s cyber attacks on Ukraine is most important lesson for NCSC

She said three things could be attributed to the “unexpected” lack of success – “impressive” Ukrainian cyber defences, “incredible” support from the cyber security sector and “impressive collaboration” between the US, EU, Nato, the UK, and others. “Just as we have seen inspirational and heroic defence by Ukrainian military on the battlefield, we have seen incredibly impressive defensive cyber operations by Ukrainian cyber security practitioners. Many commentators have suggested that this has been the most effective defensive cyber activity undertaken under sustained pressure in history,” added Cameron. She said the constant cyber attacks on Ukraine, emanating from Russia, over the past decade had prepared the country’s cyber defences. “In many ways, Russia has made Ukraine match fit over the past 10 years by consistently attacking them,” said Cameron. “We haven’t seen ‘cyber Armageddon’, but that’s not a surprise to cyber professionals, who never expected it. What we have seen is a very significant conflict in cyber space – probably the most sustained and intensive cyber campaign on record.”


Feds: Chinese Hacking Group Undeterred by Indictment

The United States began publicly indicting Chinese hackers in 2014 in a strategy to pressure Beijing by exposing the organizations and individuals behind state-sponsored cybertheft. The strategy seemed to pay dividends when Chinese leader Xi Jinping in September 2015 pledged to cease cyber-enabled economic espionage. The strategy's utility has since come under mounting fire as it became apparent that Chinese state-sponsored hacking responded to Xi's promise by becoming stealthier rather than by ending. "The indictment did not hinder APT41's operations as they progressed into 2021," concludes the Department of Health and Human Services' Health Sector Cybersecurity Coordination Center in a Thursday threat brief. "Stealing foreign IP is a primary objective of state-sponsored Chinese cyberespionage groups such as APT41, as it contributes to China's ambitious business and economic development goals," says Paul Prudhomme, a former Department of Defense threat analyst who is head of threat intelligence advisory at Rapid7.


What Is Artificial Intelligence in Software Testing?

You may wonder, “Don’t test automation tools do this already?” Of course, test automation tools already have AI in effect, but they have limitations. Where AI shines in software development is when it’s applied to remove those limitations, enabling software test automation tools to provide even more value to developers and testers. The value of AI comes from reducing the direct involvement of the developer or tester in the most mundane tasks. We still have a great need for human intelligence in applying business logic, strategic thinking, creative ideas, and the like. For example, consider that most, if not all, test automation tools run tests for you and deliver results. Most don’t know which tests to run, so they run all of them or some predetermined set. What if an AI-enabled bot could review the current state of test statuses, recent code changes, code coverage, and other metrics, and then decide which tests to run and run them for you? Bringing in decision-making that’s based on changing data is an example of applying AI. Good news! Parasoft handles automated software testing at this level.
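The decision-making the paragraph describes can be sketched in a few lines. This is a toy "test-selection bot" in the spirit of the example above: instead of running everything, it picks tests whose covered files were just changed, plus recent failures. All field names, rules, and thresholds here are illustrative, not taken from Parasoft or any real tool.

```python
# Toy test-selection logic: choose tests based on recent code changes
# and per-test metadata, rather than running the full suite every time.

def select_tests(tests, changed_files, rerun_failures=True):
    """Return names of the tests worth running for this change set."""
    selected = []
    for test in tests:
        touches_change = bool(test["covers"] & changed_files)
        recently_failed = test["last_status"] == "fail"
        if touches_change or (rerun_failures and recently_failed):
            selected.append(test["name"])
    return selected

tests = [
    {"name": "test_login",  "covers": {"auth.py"},   "last_status": "pass"},
    {"name": "test_report", "covers": {"report.py"}, "last_status": "fail"},
    {"name": "test_search", "covers": {"search.py"}, "last_status": "pass"},
]
select_tests(tests, {"auth.py"})  # -> ["test_login", "test_report"]
```

A real AI-enabled bot would weigh richer signals (coverage deltas, historical flakiness, runtime budgets), but the shape of the decision is the same: changing data in, a chosen subset of tests out.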



Quote for the day:

"It's not the position that makes the leader. It_s the leader that makes the position." -- Stanley Huffty

Daily Tech Digest - October 01, 2022

3 wins and 3 losses for cloud computing

The cloud can successfully provide business agility. I always tell my clients that most enterprises moved to cloud for the perceived cost savings but stayed for the agility. Cloud provides businesses with the ability to turn IT on a dime, and enterprises that move fast in today’s more innovative markets (such as retail, healthcare, and finance) find that the speed at which cloud systems can change provides a real force multiplier for the business. The cloud offers industrial-strength reliability. Most who pushed back on cloud computing argued that we would put all our eggs in a basket that could prove unreliable. ... The businesses that moved to cloud computing anticipated significant cost savings. Those savings never really materialized except for completely new businesses that had no prior investment in IT. In fact, most enterprises looked at their cloud bills with sticker shock. The primary culprit? Enterprises that did not use cloud finops programs to effectively manage cloud costs. Also, cloud providers offered pricing and terms that many enterprises did not understand (and many still don’t).


Data storytelling: A key skill for data-driven decision-making

Rudy is a firm believer in letting the data unfold by telling a story so that when the storyteller finally gets to the punch line or the “so what, do what” there is full alignment on their message. As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding). The storyteller should then break it down into five buckets: people, hardware, software, services, other (more nodding), Rudy says. Then further break it down into these technology areas: cloud, security, data center, network, and so on (more nodding). Next the storyteller reveals that based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that compared to competitors of similar size and complexity, the storyteller’s organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.
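Rudy's walk-through reduces to simple arithmetic: divide each bucket's spend by its usage volume and compare against a peer benchmark. A sketch of that step, with entirely made-up spend figures, volumes, and peer unit costs:

```python
# Illustrative version of the benchmark story above: compute a unit cost
# per technology area and flag any area costlier than peers of similar
# size and complexity. All numbers are invented for the example.

def flag_overspend(spend, volume, peer_unit_cost):
    """Return {area: unit_cost} for areas above the peer benchmark."""
    flagged = {}
    for area, total in spend.items():
        unit = total / volume[area]
        if unit > peer_unit_cost[area]:
            flagged[area] = round(unit, 2)
    return flagged

spend  = {"cloud": 1_200_000, "security": 900_000, "network": 600_000}  # $/yr
volume = {"cloud": 40_000, "security": 10_000, "network": 30_000}       # usage units
peer   = {"cloud": 35.0, "security": 60.0, "network": 25.0}             # $/unit

flag_overspend(spend, volume, peer)  # -> {"security": 90.0}
```

The punch line lands the same way it does in Rudy's telling: the audience has already nodded along to every input, so the flagged area is hard to argue with.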


Active vs. Passive Network Monitoring: What Should Your Company Use?

Active network monitoring, also known as synthetic network monitoring, releases test traffic onto the network and observes that traffic as it travels through. This traffic is not taken from actual transactions that occur on a network, but rather sent through the network in order for your monitoring solution to examine it on its path. Test traffic usually mimics the typical network traffic that flows through your system, so your administrators gain the most relevant insights into the network. ... Passive network monitoring refers to capturing network traffic that flows through a network and analyzing it afterwards. Through a collection method like log management or network taps, passive monitoring compiles historic network traffic to paint a bigger picture of your company’s network performance. The primary use for passive network monitoring is discovering and predicting performance issues that happen at specific instances and areas of your network. ... The question that might be passing through your mind is “should my business use active monitoring or passive monitoring for my network performance strategy?”
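The active side can be sketched in a handful of lines: send a synthetic transaction, time it, and alert when too many recent probes run slow. The probe target, thresholds, and alerting rule below are all illustrative assumptions, not a production design.

```python
# Minimal active-monitoring sketch: a synthetic probe plus a simple
# rule over recent samples. Thresholds are arbitrary examples.
import time
import urllib.request

def probe(url, timeout=5.0):
    """One synthetic transaction: fetch the URL, return seconds elapsed."""
    start = time.monotonic()
    urllib.request.urlopen(url, timeout=timeout).read()
    return time.monotonic() - start

def evaluate(samples, slow_threshold=0.5, max_slow_fraction=0.2):
    """Alert when too many recent probes exceeded the latency threshold."""
    slow = sum(1 for s in samples if s > slow_threshold)
    return (slow / len(samples)) > max_slow_fraction
```

A passive monitor would instead feed `evaluate` with timings reconstructed after the fact from logs or taps; the analysis is similar, but the data comes from real user traffic rather than generated probes.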


A New Linux Tool Aims to Guard Against Supply Chain Attacks

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised. With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure,” Chris DeRusha, the US federal chief information security officer and deputy national cyber director, wrote in the White House announcement. "This is not theoretical: Foreign governments and criminal syndicates are regularly seeking ways to compromise our digital infrastructure.” When it comes to Wolfi, Santiago Torres-Arias, a software supply chain researcher at Purdue University, says that developers could accomplish some of the same protections with other Linux distributions, but that it’s a valuable step to see a release that’s been stripped down and purpose-built with supply chain security and validation in mind. “There’s past work, including work done by people who are now at Chainguard, that was kind of the precursor of this train of thought that we need to remove the potentially vulnerable elements and list the software included in a particular container or Linux release,” Torres-Arias says.


Using governance to spur, not stall, data access for analytics

“Without good governance controls, you not only have the policy management risk, but you also risk spending much, much more money than you intend, much faster,” says Barch. “We knew that maximizing the value of our data, especially as the quantity and variety of that data scales, was going to require creating integrated experiences with built-in governance that enabled the various stakeholders involved in activities like publishing data, consuming data, governing data and managing the underlying infrastructure, to all seamlessly work together.” What does this blended approach to data governance look like? For Capital One, it’s what Barch calls “sloped governance.” With a sloped governance approach, you can increase governance and controls around access and security for each level of data. For example, private user spaces, which don’t contain any shared data, can have minimal data governance requirements. As you move further into production, the controls get stricter and take more time to be implemented.


Microsoft: Hackers are using open source software and fake jobs in phishing attacks

The hacking group has targeted employees in media, defense and aerospace, and IT services in the US, UK, India, and Russia. The group was also behind the massive attack on Sony Pictures Entertainment in 2014. Also known as Lazarus, and tracked by Microsoft as ZINC, the group was observed by Google Cloud's Mandiant threat analysts in July spear-phishing targets in the tech and media sectors with bogus job offers, using WhatsApp to share a trojanized instance of PuTTY. "Microsoft researchers have observed spear-phishing as a primary tactic of ZINC actors, but they have also been observed using strategic website compromises and social engineering across social media to achieve their objectives," MSTIC notes. "ZINC targets employees of companies it's attempting to infiltrate and seeks to coerce these individuals into installing seemingly benign programs or opening weaponized documents that contain malicious macros. Targeted attacks have also been carried out against security researchers over Twitter and LinkedIn."


New deepfake threats loom, says Microsoft’s chief science officer

In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as “the illusion of talking to a real person. Imagine a scammer calling your grandmom who looks and sounds exactly like you.” Compositional deepfakes, he continued, go further with a bad actor creating many deepfakes to compile a “synthetic history.” ... The rise of ever-more sophisticated deepfakes will “raise the bar on expectations and requirements” of journalism and reporting, as well as the need to foster media literacy and raise awareness of these new trends. In addition, new authenticity protocols to confirm identity might be necessary, he added – even new multifactor identification practices for admittance into online meetings. There may also need to be new standards to prove content provenance, including new watermark and fingerprint methods; new regulations and self-regulation; red-team efforts and continuous monitoring.


How Does WebAuthn Work?

WebAuthn is quite clever. It leverages the power of public key cryptography to create a way for users to log in to mobile and web applications without those applications having to store any secret information at all. Usually, when one thinks of public key cryptography, one thinks of using it to send a secret message to a person, who then decrypts and reads it. But the same math can work in reverse as an identity check: if you send someone a message encrypted with their public key, then they, and only they, can decrypt it, because only they hold the private key that corresponds to the given public key. Once they prove they can, you can be highly confident that they are the entity they say they are. Currently, the major browsers – Chrome, Firefox, Edge, and Safari – all support the WebAuthn specification. If your phone – iPhone or Android – has a fingerprint reader or facial scanner, it supports WebAuthn. Windows provides WebAuthn support via Windows Hello. All of this translates to passwordless authentication quite nicely.
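The challenge-response idea can be shown with textbook RSA and deliberately tiny numbers. This is a toy only: real WebAuthn uses vetted signature schemes (such as ECDSA) at proper key sizes, and nothing below is secure. What the toy does show is the core trick: the server stores only a public key, and whoever can answer a fresh random challenge must hold the matching private key.

```python
# Toy proof-of-possession flow with textbook RSA. Insecure by design;
# it only illustrates the "reverse" use of public-key cryptography.
import random

n, e, d = 3233, 17, 2753   # toy RSA: n = 61 * 53, (n, e) public, d private

def respond(challenge):            # authenticator side: uses private key d
    return pow(challenge, d, n)

def check(challenge, response):    # server side: uses only the public key
    return pow(response, e, n) == challenge

challenge = random.randrange(2, n)        # server picks a fresh challenge
check(challenge, respond(challenge))      # True: private-key holder verified
check(challenge, respond(challenge) + 1)  # False: forged response rejected
```

Because the server never stores anything secret, a database breach yields only public keys, which is exactly the property that makes WebAuthn attractive.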


Why developers hold the key to cloud security

APIs drive cloud computing. They eliminate the requirement for a fixed IT architecture in a centralized data center. APIs also mean attackers don’t have to honor the arbitrary boundaries that enterprises erect around the systems and data stores in their on-premises data centers. While identifying and remediating misconfigurations is a priority, it’s essential to understand that misconfigurations are just one means to the ultimate end for attackers: control plane compromise. This has played a central role in every significant cloud breach to date. Empowering developers to find and fix cloud misconfigurations when developing IaC is critical, but it’s equally important to give them the tools they need to design cloud architecture that’s inherently secure against today’s control plane compromise attacks. ... Developers are in the best (and often only) position to secure their code before deployment, maintain its secure integrity while running, and better understand the specific places to provide fixes back in the code. But they’re also human beings prone to mistakes operating in a world of constant experimentation and failure. 
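A pre-deployment misconfiguration scan of the kind described above might look like the following sketch. The resource shapes and rule list are invented for illustration; real IaC scanners (for Terraform plans, CloudFormation templates, and so on) cover far more ground.

```python
# Sketch of a pre-deployment IaC check: scan a simplified resource list
# for settings that commonly lead to control plane compromise.

def find_misconfigurations(resources):
    """Return a list of (resource_name, problem) pairs."""
    findings = []
    for r in resources:
        if r.get("type") == "storage_bucket" and r.get("public_read"):
            findings.append((r["name"], "bucket is publicly readable"))
        if r.get("type") == "iam_policy" and "*" in r.get("actions", []):
            findings.append((r["name"], "wildcard IAM actions"))
        if r.get("type") == "security_group":
            for rule in r.get("ingress", []):
                if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") == 22:
                    findings.append((r["name"], "SSH open to the world"))
    return findings

resources = [
    {"type": "storage_bucket", "name": "logs", "public_read": True},
    {"type": "security_group", "name": "web",
     "ingress": [{"cidr": "0.0.0.0/0", "port": 443}]},
]
find_misconfigurations(resources)  # -> [("logs", "bucket is publicly readable")]
```

Running a check like this in the developer's own workflow, before deployment, is what puts the fix back in the code rather than in a post-incident ticket.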


IT enters the era of intelligent automation

Companies also need to optimize business processes to increase the effectiveness of automation, Nallapati says. “Working together in a partnership, the business unit and the automation teams can leverage their expertise to refine the best approach and way forward to optimize the efficiency of the bot/automation,” she says. Technology leaders should make sure to get business leaders and users involved in the IA process, Ramakrishnan says. “Educate them about the possibilities and collaborate with them in joint problem-solving sessions,” he says. ... “With a large number of customers and a large number of invoices to process every day, any small savings through automation goes a long way in increasing productivity, accuracy, and improving employee and end customer satisfaction,” Ramakrishnan says. Similar to the type of hackathons that are common in IT organizations today, Ramakrishnan says, “we partnered with the business to have a business-side hackathon/ideathon. We educated the key users from the billing team on the possibilities of automation, and then they were encouraged to come back with ideas on automation.”



Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith

Daily Tech Digest - September 30, 2022

5 Signs That You’re a Great Developer

Programming directly changes the way your brain works: you start to think more algorithmically and solve problems faster, and that carries over into other aspects of your life. Good programmers not only can learn anything else much faster, especially in tech-related directions, but they also make great entrepreneurs and CEOs. Look at Elon Musk, for instance: he was a programmer and built his own game when he was 12. ... As we briefly discussed previously, programming encourages creative thinking and teaches you how to approach problems in the most effective way. But in order to do so, you must first be able to solve a lot of these difficulties and have a passion for doing so; only then will you probably succeed as a developer. If you’ve just started and thought it was easy, you’re completely wrong. You just haven’t encountered genuinely challenging problems yet, and the more you learn, the more difficult and complex the problems get. Because you need not only to solve them, but to solve them in the most effective way possible, speed up your algorithm, and optimize everything. 


Experimental WebTransport over HTTP/3 support in Kestrel

WebTransport is a new draft specification for a transport protocol similar to WebSockets that allows the usage of multiple streams per connection. WebSockets allowed upgrading a whole HTTP TCP/TLS connection to a bidirectional data stream. If you needed to open more streams you’d spend additional time and resources establishing new TCP and TLS sessions. WebSockets over HTTP/2 streamlined this by allowing multiple WebSocket streams to be established over one HTTP/2 TCP/TLS session. The downside here is that because this was still based on TCP, any packets lost from one stream would cause delays for every stream on the connection. With the introduction of HTTP/3 and QUIC, which uses UDP rather than TCP, WebTransport can be used to establish multiple streams on one connection without them blocking each other. For example, consider an online game where the game state is transmitted on one bidirectional stream, the players’ voices for the game’s voice chat feature on another bidirectional stream, and the player’s controls are transmitted on a unidirectional stream. 
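The head-of-line-blocking difference can be caricatured in a few lines. In this toy model (stream names and loss pattern invented for illustration), one lost packet stalls every stream on a TCP connection, but only its own stream over QUIC:

```python
# Toy model of head-of-line blocking: packets arrive in connection order,
# one is lost and must be retransmitted. Which streams stall while waiting?

def blocked_streams(packets, lost_index, transport):
    """packets: stream name of each packet, in arrival order."""
    lost_stream = packets[lost_index]
    if transport == "tcp":
        # TCP delivers the whole connection in order: every stream with a
        # packet after the loss is held back until retransmission.
        return {lost_stream} | set(packets[lost_index + 1:])
    # QUIC orders data per stream: only the lost packet's own stream waits.
    return {lost_stream}

packets = ["game", "voice", "game", "controls", "voice"]
blocked_streams(packets, 1, "tcp")   # -> {"game", "voice", "controls"}
blocked_streams(packets, 1, "quic")  # -> {"voice"}
```

In the game scenario above, a dropped voice packet over QUIC leaves game state and controls flowing, which is precisely the property WebTransport inherits from HTTP/3.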


Software builders across Amazon require consistent, interoperable, and extensible tools to construct and operate applications at our peculiar scale; organizations will extend on our solutions for their specialized business needs. Amazon’s customers benefit when software builders spend time on novel innovation. Undifferentiated work elimination, automation, and integrated opinionated tooling reserve human interaction for high judgment situations. Our tools must be available for use even in the worst of times, which happens to be when software builders may most need to use them: we must be available even when others are not. Software builder experience is the summation of tools, processes, and technology owned throughout the company, relentlessly improved through the use of well-understood metrics, actionable insights, and knowledge sharing. Amazon’s industry-leading technology and access to top experts in many fields provides opportunities for builders to learn and grow at a rate unparalleled in the industry. As builders we are in a unique position to codify Amazon’s values into the technical foundations; we foster a culture of belonging by ensuring our tools, training, and events are inclusive and accessible by design.


The Troublemaker CISO: How Much Profit Equals One Life?

We take for granted that those who are charged with protecting us are doing so with our best interest at heart. There is no shaving off another few cents just to increase value to shareholders over the life of a person. Lucky for me, there is a shift in the boardrooms and governing bodies to see how socially responsible you are and whether you are acting in the best interest of the people and not just the investors. If the members of the board and governing body are considering these topics when steering a business, isn't it time to relook at how and why we do things? Are we as CISOs not accountable to leadership to impress on them the risk that IOT/internet connectivity poses to critical networks - and especially to healthcare? It is time to be firm in expressing the risk and saying we would rather spend a bit more money and time and do it the safe way. And this should be listed as the top risk in the company. The other big issue I have with this type of network being connected is one of transparency.


Digital Twins Offer Cybersecurity Benefits

A key difficulty, from a cybersecurity perspective, is the fact drug production lines are made up of multiple different technologies, running different operating systems that are often provided by different suppliers. “Integrating multiple systems from different suppliers can provide expanded attack surface that can be exploited by cyber adversary,” continues Mylrea. To address this, Mylrea and Grimes developed what they refer to as “biosecure digital twins”—replicas of manufacturing lines they use to identify potential points of attack for hackers. “The digital twin is essentially a high-fidelity virtual representation of critical manufacturing processes. From a security perspective, this improves monitoring, detection, and mitigation of stealthy attacks that can go undetected by most conventional cybersecurity defenses,” explains Mylrea. “Beyond security, the biosecure digital twin can optimize performance and productivity by detecting when critical systems deviate from their ideal state and correct in real time to enable predictive maintenance that prevent costly faults and safety failures.”


Unlocking cyber skills: This year’s essential back-to-school lesson plan

Technology is continually advancing, which will only create more avenues for cybersecurity roles in the future. While it’s essential to inform students about the types of careers in cybersecurity, teachers and career advisors should be aware of the skills and qualities the sector needs beyond technical computer and software knowledge. Once this is achieved, it can shed light on the roles students can go into. Technical skills are critical in cybersecurity, yet they can be learned, fostered, and evolved throughout a student’s career. Schools need to tap into individual students’ strengths in hopes of encouraging them to pursue cyber positions. Broadly, cybersecurity enlists leaders, communicators, researchers, critical thinkers… the list goes on. Having the qualities needed to fulfil various roles in the industry can position a student remarkably well when they first start out. Yet, this comes down to their mentors in high school being able to communicate that a student’s inquisitive nature or presenting skills can be applied to various sectors.


Data literacy: Time to cure data phobia

Data literacy is an incredibly important asset and skill set that should be demonstrated at all levels of the workplace. In simple terms, data literacy is the fundamental understanding of what data means, how to interpret it, how to create it and how to use it both effectively and ethically across business use cases. Employees who have been trained in and applied their knowledge of how to use company data demonstrate a high level of data literacy. Although many people have traditionally associated data literacy skills with data professionals and experts, it’s becoming necessary for employees from all departments and job levels to develop certain levels of data literacy. The Harvard Business Review stated: “Companies need more people with the ability to interpret data, to draw insights and to ask the right questions in the first place. These are skills that anyone can develop, and there are now many ways for individuals to upskill themselves and for companies to support them, lift capabilities, and drive change. Indeed, the data itself is clear on this: Data-driven decision-making markedly improves business performance.”


To BYOT & Back Again: How IT Models are Evolving

The growing complexity of IT frameworks is startling. A typical enterprise has upwards of 1,200 cloud services and hundreds of applications running at any given moment. On top of that, employees have their own smartphones, and many use their own routers and laptops. Meanwhile, various departments and groups -- marketing, finance, HR and others -- subscribe to specialized cloud services. The difficulties continue to pile up -- particularly as CIOs look to build out more advanced data and AI frameworks. McKinsey & Company found that between 10% and 20% of IT budgets are devoted to adding more technology in an attempt to modernize the enterprise and pay down technical debt. Yet, part of the problem, it noted, is “undue complexity” and a lack of standards, particularly at large companies that stretch across regions and countries. In many cases, orphaned and balkanized systems, data sprawl, data silos, and complex device management requirements follow. For CIOs seeking simplification and tighter security, the knee-jerk reaction is often to clamp down on choices and options.


IT leadership: What to prioritize for the remainder of 2022

To deliver product-centric value, it’s best to have autonomous, cross-functional teams running an Agile framework. Those teams can include technical practitioners, design thinkers, and business executives. Together, they can increase business growth by as much as 63%, Infosys’ Radar report uncovered. Cross-pollination efforts can spread Agile across the entire enterprise, building credibility and trust among high-level stakeholders toward an iterative process that can deliver meaningful, if incremental, business results. Big-bang rollouts, with a raft of modernizations released in one fell swoop, may seem attractive to management or other stakeholders. But they carry untold risk: developers scrambling to fix bugs after the fact, account teams working to retain disgruntled customers. Approach cautiously, and consider an Agile roadmap of smaller, iterative developments instead of the momentous release. This approach also breaks down the considerable task of application modernization into smaller, bite-sized chunks.


How Policy-as-Code Helps Prevent Cloud Misconfigurations

Policy-as-code is a great cloud configuration solution because it eliminates the potential for human error and makes it more difficult for hackers to interfere. Policy compliance is crucial for cloud security, ensuring that every app and piece of code follows the necessary rules and conditions. The easiest way to ensure nothing slips through the cracks is to automate the compliance management process. Policy-as-code is also a good choice in a federated risk management model. A set of common standards are applied across a whole organization, although departments or units retain their own methods and workflows. PaC fits seamlessly into this high-security system by scaling and automating IT policies throughout a company. Preventing cloud misconfiguration relies on effectively ensuring every app and line of code is adhering to an organization’s IT policies. PaC offers some key benefits that make this possible without being a hassle. Policy-as-code improves the visibility of IT policies since everything is clearly defined in code format. 
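The idea of expressing IT policies as code can be sketched in a few lines. The following is a minimal, hypothetical illustration (not any particular cloud provider's schema or policy engine): policies are plain functions that inspect a resource's configuration, and an automated check runs all of them against every resource, so nothing slips through the cracks.

```python
# Policy-as-code sketch: each policy is a function that inspects a
# resource's configuration and reports a violation. Resource shapes and
# rules are illustrative examples, not a real cloud provider's schema.

def no_public_storage(resource):
    """Storage buckets must not allow public access."""
    if resource["type"] == "storage_bucket" and resource.get("public_access"):
        return "public access is enabled"

def encryption_required(resource):
    """All resources must have encryption at rest enabled."""
    if not resource.get("encrypted", False):
        return "encryption at rest is disabled"

POLICIES = [no_public_storage, encryption_required]

def evaluate(resources):
    """Run every policy against every resource; collect violations."""
    violations = []
    for res in resources:
        for policy in POLICIES:
            problem = policy(res)
            if problem:
                violations.append((res["name"], policy.__name__, problem))
    return violations

resources = [
    {"name": "logs-bucket", "type": "storage_bucket",
     "public_access": True, "encrypted": True},
    {"name": "app-db", "type": "database", "encrypted": False},
]

for name, rule, problem in evaluate(resources):
    print(f"{name}: {rule} -> {problem}")
```

Because the policies are code, they are version-controlled, reviewable, and automatically applied to every resource, which is what gives PaC its visibility and scaling benefits.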



Quote for the day:

"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward

Daily Tech Digest - September 29, 2022

Hybrid work is the future, and innovative technology will define it

We’re starting to see an amplification of recognition tools, of coaching platforms, of new and exciting ways to learn that are leveraging mobility and looking at how people want to work and to meet them where they are, rather than saying, “Here’s the technology, learn how to use it.” It’s more about, “Hey, we’re learning how you want to work and we’re learning how you want to grow, and we’ll meet you there.” We’re really seeing an uptake in the HR tech space of tools that acknowledge the humaneness underneath the technology itself. ... The second layer consists of the business applications we’ve come to know and love. Those include HR apps, business applications, supply chain applications, and financial applications, et cetera. Certainly, there is a major role in this distributed work environment for virtual application delivery and better security. We need to access those mission-critical apps remotely and have them perform the same way whether they’re virtual, local, or software as a service (SaaS) apps -- all through a trusted access security layer.


Web 3: How to prepare for a technological revolution

It hardly needs to be said, but over the last few decades, the internet has grown to be arguably the most integral cog ensuring a smooth-running, functioning society. It is so ingrained that almost every industry in the world would be unable to function properly without it. And this reliance will only grow as Web 3 becomes the norm, which makes it critical that we begin to educate children now on its uses and how to navigate it. Already, many of today’s adults will find it difficult to explain what Web 3 is, let alone teach the next generation how to use it. Educating children early will not only help them thrive in the future, but they will also be able to pass gained knowledge up the chain to their parents. This is, of course, just history repeating itself. It is the equivalent of kids showing their parents how to use a touch screen or work their email. But the revolution Web3 is about to bring is on a different scale to any previous technological advancement. Soon, the greatest opportunities will be solely available on the new internet, and it is critical we ensure every child has the opportunity to succeed.


Health data governance and the case for regulation

Without appropriate data governance procedures and training in place, healthcare organizations are likely to find themselves in danger of noncompliance. HIPAA violations in particular can occur at any level of an organization; if an undertrained staff member or unsecured database is operating in your organization, there’s a huge likelihood that they will eventually misuse patient data and breach HIPAA regulations. This kind of breach can lead to noncompliance, fines, legal issues, poorer patient experiences and even a loss of trust within the greater medical community. Data governance means the difference between a successful and fully operational facility and a facility that gets shut down by the government. On the other hand, when data governance principles are applied successfully in the healthcare sector, a slew of benefits outside of basic compliance can be realized. Patients feel confident that their information is safe and begin to refer their friends and family members to your network. Data becomes easier to find, label and organize for new operational use cases and emerging patient technologies. 


Closing the Gap Between Complexity and Performance in Today’s Hybrid IT Environments

Nowadays, the increasing need for security on all fronts has fueled collaboration between teams on a regular basis. This, in turn, has spurred more proactivity from an internal IT operations perspective. Proactivity, bolstered by a unified view into traffic and communication, is a key aspect of closing the gap between cloud complexity and performance — because it starts at the IT cultural level. Technical capabilities like deep observability can support team prioritization of detection and management on a more holistic level, addressing all aspects of IT infrastructure. With this, organizations can feel more confident in overcoming cloud-based challenges and mitigating connected cyber vulnerabilities as a collective force. An all-encompassing, proactive approach is needed to speedily detect cyber threats, respond to the corresponding activity, and enact a remediation plan. Within hybrid and multi-cloud environments, data and communication costs can skyrocket. A common driver is raw packet traffic, which can interfere with control of and visibility into the right data.


Blockchain and artificial intelligence: on the wave of hype

Blockchain is an innovative digital information storage system that stores data in an encrypted, distributed ledger format. The data is encrypted and distributed across multiple computers, which makes the ledger tamper-proof. It is a secure database that can only be read and updated by those with permission. There are only a few examples today of blockchain and artificial intelligence being combined, mostly in studies conducted by academics and scientists, but the two concepts work well together. ... Today’s computers are extremely fast, but they also require a constant supply of data and instructions, without which it is impossible to process information or perform tasks. Blockchain run on standard computers therefore requires significant computing power because of its encryption processes. Secure data monetization could be one result of combining blockchain and artificial intelligence. Monetization of collected data is a source of revenue for many companies; Facebook and Google are among the biggest and best known.


How MLops deployment can be easier with open-source versioning

Among the many reasons why there is a growing number of vendors in the sector, a significant one is that building and deploying ML models is often a complicated process with many manual steps. A primary goal of MLops tools is to help automate the process of building and deploying models. While automation is important, it only solves part of the complexity. A key challenge for artificial intelligence (AI) models, identified in a recently released Gartner report, is that only about half of AI models actually make it into production. From Guttmann’s perspective, application developers tend to build things in a linear way. This implies, for example, that new code written six months after the initial development is better than the original. That same view does not tend to work with machine learning, as the process involves more research and more experimentation to determine what actually works best. “Development is always money sunk into the problem until you actually see the fruits of the effort and we want to decrease that development time to a minimum,” he said.
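The non-linear nature of ML development described here is what makes versioning central to MLops. A toy registry sketch (illustrative only, not any specific MLops tool's API) makes the point: every experiment is stored alongside its metrics, and the "best" version is selected by evaluation score rather than by recency.

```python
import hashlib
import json

# Toy experiment registry: versions are keyed by a content hash of their
# parameters, and "best" is chosen by metric, not by how recent it is.

class Registry:
    def __init__(self):
        self.versions = {}

    def log(self, params, metrics):
        """Store an experiment keyed by a hash of its parameters."""
        key = hashlib.sha256(
            json.dumps(params, sort_keys=True).encode()
        ).hexdigest()[:12]
        self.versions[key] = {"params": params, "metrics": metrics}
        return key

    def best(self, metric="accuracy"):
        """Later runs are not assumed better; select by the metric."""
        return max(self.versions.values(), key=lambda v: v["metrics"][metric])

reg = Registry()
reg.log({"lr": 0.1, "depth": 4}, {"accuracy": 0.81})
reg.log({"lr": 0.01, "depth": 8}, {"accuracy": 0.88})
reg.log({"lr": 0.001, "depth": 16}, {"accuracy": 0.84})  # newest, not best

print(reg.best()["params"])
```

Unlike a linear commit history, the most recent experiment here loses to an earlier one, which is exactly the behavior the quoted perspective describes.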


Robotic Process Automation Will Shape the Future of Hotel Operations

On the guest-facing side of the business, automation can be applied to virtually every touchpoint of the guest journey. Marketing automation comes in the form of upsell opportunities, re-marketing or recovery campaigns in the pre-stay, pre-check-in and post-cancellation stages. In the back of the house, automation is helping the marketing, revenue and sales departments get more done with fewer resources. Integrated CRM systems have become the heart of new, guest-centric personalization strategies such as automated email marketing programs that are proving to be huge time-savers. Revenue managers are tapping automation to stay on top of pricing and demand trends. RPA reduces the common challenges presented by running a business on a fragmented tech stack. Siloed systems often lead to a great deal of manual effort, such as copying, importing, exporting data from one system to another, or the common “swivel-chair integration.” Through RPA, operators can create workflows that fill feature gaps or replicate features from other systems, saving them time and money.
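The "swivel-chair integration" fix can be pictured with a small sketch. Both "systems" below are stand-in dictionaries, not real hotel software APIs: a bot workflow reads reservations from a property-management system and writes marketing contacts into a CRM, replacing the manual copy/export/import step.

```python
# RPA-style workflow sketch: move records from one siloed system to
# another automatically. Data shapes are hypothetical examples.

pms_reservations = [   # property-management system (source)
    {"guest": "Lee", "email": "lee@example.com", "checkout": "2022-10-05"},
    {"guest": "Kim", "email": "kim@example.com", "checkout": "2022-10-06"},
]
crm_contacts = []      # CRM (target) used for automated email campaigns

def sync_reservations_to_crm():
    """One automated step instead of copying rows by hand."""
    for res in pms_reservations:
        crm_contacts.append({
            "email": res["email"],
            "campaign": "post-stay-survey",
            "send_after": res["checkout"],
        })
    return len(crm_contacts)

print(sync_reservations_to_crm())
```

A real RPA workflow would call the two systems' actual interfaces, but the shape is the same: read from one silo, transform, write to the other, on a schedule instead of a swivel chair.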


Russian hackers' lack of success against Ukraine shows that strong cyber defences work

Since the invasion, Cameron said, "what we have seen is a very significant conflict in cyberspace – probably the most sustained and intensive cyber campaign on record." But she also pointed to the lack of success of these campaigns, thanks to the efforts of Ukrainian cyber defenders and their allies. "This activity has provided us with the clearest demonstration that a strong and effective cyber defence can be mounted, even against an adversary as well prepared and resourced as the Russian Federation." Cameron argued that not only does this provide lessons for what countries and their governments can do to protect against cyberattacks, but there are also lessons for organisations on how to protect against incidents, be they nation-state backed campaigns, ransomware attacks or other malicious cyber operations. "Central to this is a commitment to long-term resilience," said Cameron. "Building resilience means we don't necessarily need to know where or how the threat will manifest itself next. Instead, we know that most threats will be unable to breach our defences. And when they do, we can recover quickly and fully."


The Unlikely Journey of GraphQL

GraphQL is drawing the spotlight because refactoring or modernization of applications into microservices is stressing REST to its limits. As information consumers, we expect more from the digital platforms that we use. Shop for a product, and we also will want to find reviews, competing offers, autofill keyword search, and likely other options. Monolithic apps crack under the load, and for similar reasons, the same fate could be happening to REST, which requires pinpoint commands to specific endpoints. And with complex queries, lots of pinpoint requests. Facebook developers created GraphQL as a client specification for alleviating the bottlenecks that were increasingly cropping up when fetching data from polyglot sources to a variety of web and mobile clients. With REST, developers had to know all the endpoints. By contrast, with GraphQL, the approach is declarative: specify what data you need rather than how to produce it. While REST is imperative, GraphQL is declarative. 
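The declarative idea can be shown without a GraphQL server at all. The sketch below is a simplified stand-in (the data and "query" format are invented for illustration): the client names exactly the fields it wants, and a recursive resolver returns only that shape, instead of the client stitching together several REST endpoint responses.

```python
# Plain-Python sketch of GraphQL's declarative field selection: the
# client specifies WHAT data it needs; the server shapes the response.

PRODUCT = {
    "name": "Laptop",
    "price": 999,
    "reviews": [{"stars": 5, "text": "Great"}, {"stars": 4, "text": "Good"}],
    "competing_offers": [{"seller": "ShopB", "price": 949}],
}

def resolve(data, selection):
    """Return only the requested fields, recursing into sub-selections."""
    result = {}
    for field, sub in selection.items():
        value = data[field]
        if sub and isinstance(value, list):
            result[field] = [resolve(item, sub) for item in value]
        elif sub:
            result[field] = resolve(value, sub)
        else:
            result[field] = value
    return result

# "Give me the name, and only the star rating of each review."
query = {"name": None, "reviews": {"stars": None}}
print(resolve(PRODUCT, query))
```

One request fetches the product name and review ratings together; with REST, the same page might need pinpoint calls to a product endpoint and a reviews endpoint, then client-side trimming.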


Cryptojacking, DDoS attacks increase in container-based cloud systems

The Sysdig report also noted that there has been a jump in DDoS attacks that use containers since the start of the Russian invasion of Ukraine. "The goals of disrupting IT infrastructure and utilities have led to a four‑fold increase in DDoS attacks between 4Q21 and 1Q22," according to the report. "Over 150,000 volunteers have joined anti‑Russian DDoS campaigns using container images from Docker Hub. The threat actors hit anyone they perceive as sympathizing with their opponent, and any unsecured infrastructure is targeted for leverage in scaling the attacks." Meanwhile, a pro-Russian hacktivist group called Killnet has launched several DDoS attacks on NATO countries, including, but not limited to, websites in Italy, Poland, Estonia, Ukraine, and the United States. “Because many sites are now hosted in the cloud, DDoS protections are more common, but they are not yet ubiquitous and can sometimes be bypassed by skilled adversaries,” Sysdig noted. “Containers pre‑loaded with DDoS software make it easy for hacktivist leaders to quickly enable their volunteers.”



Quote for the day:

"Good leaders value change, they accomplish a desired change that gets the organization and society better." -- Anyaele Sam Chiyson

Daily Tech Digest - September 28, 2022

How to Become an IT Thought Leader

Being overly tech-centric is a common mistake aspiring thought leaders make. Such individuals start with a technology, then look for problems to solve. “Instead, it's important to remember that an IT thought leader drives digital change,” Zhao says. “Understanding the technology is only one aspect of IT thought leadership.” Ross concurs. “I’ve seen several troubling examples of large technology purchases occurring before key business requirements were fully understood,” he says. “Seek first to understand the desired business outcomes and remember that technology is a potential enabler of those outcomes, but never a cure-all.” A strong business case is essential for any proposed new technology, Bethavandu says. “If your company is not ready for, say, DevOps or containerization, be self-aware and don’t push for those projects until your organization is ready,” he states. On the other hand, excessive caution can also be dangerous. “If you want to be a thought leader, you have to be bold and you cannot be afraid of failing,” Bethavandu says.


Most Attackers Need Less Than 10 Hours to Find Weaknesses

Overall, nearly three-quarters of ethical hackers think most organizations lack the necessary detection and response capabilities to stop attacks, according to the Bishop Fox-SANS survey. The data should convince organizations to not just focus on preventing attacks, but aim to quickly detect and respond to attacks as a way to limit damage, Bishop Fox's Eston says. "Everyone eventually is going to be hacked, so it comes down to incident response and how you respond to an attack, as opposed to protecting against every attack vector," he says. "It is almost impossible to stop one person from clicking on a link." In addition, companies are struggling to secure many parts of their attack surface, the report stated. Third parties, remote work, the adoption of cloud infrastructure, and the increased pace of application development all contributed significantly to expanding organizations' attack surfaces, penetration testers said. Yet the human element continues to be the most critical vulnerability, by far. 


Discover how technology helps manage the growth in digital evidence

With limited resources, even the most skilled law-enforcement personnel are hard-pressed to comb through terabytes of data that may include hours of videos, tens of thousands of images, and hundreds of thousands of words in the form of text, email, and other sources. One possible solution is to augment skilled investigators and forensic examiners with technology. Some of the key technological capabilities that can be applied to this problem are AI and machine learning. AI and machine learning models and applications create processes that read, watch, extract, index, sort, filter, translate, and transcribe information from text, images, and video. By utilizing technology to carve through and analyze data, it’s possible to reduce the data mountain to a series of small hills of related content and add tags that make it searchable. That allows people to spend their time and energy on work that is most valuable in the investigation. The good news is that help is available. Microsoft has multiple AI and machine learning processes within our Microsoft Azure Cognitive Services. 


The modern enterprise imaging and data value chain

The costs and consequences of the current fragmented state of health care data are far-reaching: operational inefficiencies and unnecessary duplication, treatment errors, and missed opportunities for basic research. Recent medical literature is filled with examples of missed opportunities—and patients put at risk because of a lack of data sharing. More than four million Medicare patients are discharged to skilled nursing facilities (SNFs) every year. Most of them are elderly patients with complex conditions, and the transition can be hazardous. ... “Weak transitional care practices between hospitals and SNFs compromise quality and safety outcomes for this population,” researchers noted. Even within hospitals, sharing data remains a major problem. ... Data silos and incompatible data sets remain another roadblock. In a 2019 article in the journal JCO Clinical Cancer Informatics, researchers analyzed data from the Cancer Imaging Archive (TCIA), looking specifically at nine lung and brain research data sets containing 659 data fields in order to understand what would be required to harmonize data for cross-study access.


Cloud’s key role in the emerging hybrid workforce

One key to the mistakes may be the overuse of cloud computing. Public clouds provide more scalable and accessible systems on demand, but they are not always cost-effective. I fear that much like when any technology becomes what the cool kids are using, cloud is being picked for emotional reasons and not business reasons. On-premises hardware costs have fallen a great deal during the past 10 years. Using these more traditional methods of storage and compute may be way more cost-effective than the cloud in some instances and may be just as accessible, depending on the location of the workforce. My hope is that moving to the cloud, which was accelerated by the pandemic, does not make us lose sight of making business cases for the use of any technology. Another core mistake that may bring down companies is not having security plans and technology to support the new hybrid workforce. Although few numbers have emerged, I suspect that this is going to be an issue for about 50% of companies supporting a remote workforce.


Why zero trust should be the foundation of your cybersecurity ecosystem

Recently, zero trust has developed a large following due to a surge in insider attacks and an increase in remote work – both of which challenge the effectiveness of traditional perimeter-based security approaches. A 2021 global enterprise survey found that 72% of respondents had adopted zero trust or planned to in the near future. Gartner predicts that spending on zero trust solutions will more than double to $1.674 billion between now and 2025. Governments are also mandating zero trust architectures for federal organizations. These endorsements from the largest organizations have accelerated zero trust adoption across every sector. Moreover, these developments suggest that zero trust will soon become the default security approach for every organization. Zero trust enables organizations to protect their assets by reducing the chance and impact of a breach. It also reduces the average breach cost by at least $1.76 million, can prevent five cyber disasters per year, and save an average of $20.1 million in application downtime costs.


Walls between technology pros and customers are coming down at mainstream companies

Tools assisting with this engagement include "prediction, automation, smart job sites and digital twins," he says. "We have resources in each of our geographic regions where we scale new technology from project to project to ensure the 'why' is understood, provide necessary training and support, and educate teams on how that technology solution makes sense in current processes and day-to-day operations." At the same time, getting technology professionals up to speed with crucial pieces of this customer collaboration -- user experience (UX) and design thinking -- is a challenge, McFarland adds. "There is a widely recognized expectation to create seamless and positive customer experiences. That said, specific training and technological capabilities are a headwind that professionals are experiencing. While legacy employees may be fully immersed and knowledgeable about a certain program and its technical capabilities, it is more unusual to have both the technical and UX design expertise. The construction industry is working to find the right balance of technology expertise and awareness with UX and design proficiencies."


Why Is the Future of Cloud Computing Distributed Cloud?

The distributed cloud freshly redefines cloud computing: it is a public cloud architecture that handles data processing and storage in a distributed manner. Simply put, a business using distributed cloud computing can store and process its data in various data centers, some of which may be physically situated in other regions. A content delivery network (CDN), a geographically distributed network architecture, is one example of a distributed cloud. It is designed to deliver content (most frequently video or music) quickly and efficiently to viewers in various places, significantly reducing load times. Distributed clouds, however, offer advantages to more than just content producers and artists. They can be utilized in multiple business contexts, including transportation and sales, and even within particular geographical regions. For instance, a supplier of streaming services can use centralized cloud resources to format video and then store the content on CDNs spread out globally.


How to Become a Data Analyst – Complete Roadmap

First, understand this: the field of data analysis is not about computer science but about applying computation, analysis, and statistics. This field focuses on working with large datasets and producing useful insights that help solve real-life problems. The whole process starts with a hypothesis that needs to be answered, followed by gathering new data to test that hypothesis. There are two major categories of Data Analyst: tech and non-tech. The two work with different tools, and tech-domain professionals are also required to know programming languages such as R or Python. The working professional should be fluent in statistics so that they can turn any amount of raw data into a well-organized structure. ... Today, countless companies generate data on a daily basis and use it to make crucial business decisions. It helps in deciding their future goals and setting new milestones. We’re living in a world where data is the new fuel, and to make it useful, data analysts are required in every sector.


Software developer: A day in the life

An analytics role will require you to learn new skills continuously, look at things in new ways, and embrace new perspectives. In technology and business, things happen quickly. It is important to always keep up with what is happening in the industries in which you are involved. Never forget that at its core, technology is about problem-solving. Don’t get too attached to any coding language; just be aware that you probably won’t be able to use the language you like, do the refactor you want, or perform the update you expect all the time. The end focus is always on the client, and their needs take priority over developer preferences. Be prepared to use English every day. To keep your skills sharp, read documentation, talk to others often, and watch videos. ... Any analytics professional who is interested in elevating their career should always be attentive to new technologies and updates, become an expert in some specific language/technology, and understand the low level of programming in a variety of languages. Finally, if you enjoy logic, math, and problem-solving, consider a career in software development. The world needs your skills to solve big challenges.



Quote for the day:

"Leadership Principle: As hunger increases, excuses decrease." -- Orrin Woodward

Daily Tech Digest - September 27, 2022

In the shift to hybrid work, don’t overlook your in-person workforce

As companies think through their workforce strategies, taking a few critical steps can help. First, make sure that the in-person cohort receives the same amount of consideration as remote and hybrid workers. New ways of working clearly pose challenges in terms of productivity, but there is a real risk in senior leaders focusing most of their time and attention on remote-work issues. Second, measure employee sentiment, over time, to understand which factors are successful in boosting engagement and morale among the in-person workforce, and where the organization can improve. Third, look for ways to increase the autonomy of in-person workers. Encourage them to make suggestions about how their work can be done better, and empower them to act on those suggestions. Create some degree of flexibility in terms of scheduling. For example, enable workers to have more say in setting schedules, and allow workers to trade shifts. Fourth, invest in upskilling initiatives; they are a key driver of empowerment and engagement. 


Caught in the crossfire of cyber conflict

Cyber events are now routinely crossing thresholds that would have been viewed as increasingly risky 20 years ago. The result is that offensive cyber operations are now manageable for countries such as the US but are now catastrophic for smaller countries that are thrust into the cyber conflict space. The potential scale of this effect likely makes smaller countries ideal targets for sophisticated actors looking to demonstrate their capabilities. Iran appears to have stronger evidence on Israel’s role in the ‘Predatory Sparrow’ campaign (the two countries have been exchanging attacks for years) but opted to attack Albania’s government for harbouring the MeK—using the disruptive incident to send a message to Iran’s enemies. This incident is chilling because it shows the spread of sophisticated cyber capabilities, and the growing intent to conduct such operations. Most theories around cyber conflict have kept the US as a key player in such conflicts—‘Predatory Sparrow’ and Iran’s response have shown that this is outdated. 


How DevOps Practices will Expedite AI Adoption?

Although AI has developed and revolutionized many corporate processes, there are still obstacles to overcome because it necessitates a lot of human labor. Getting a dataset, cleaning it, training a model on it, and making predictions can all prove tricky. A different problem is creating a fluid, generalized training pattern or transferring a specific approach from one situation to another. Businesses can adapt operational procedures such as the DevOps culture to achieve more noticeable outcomes, resulting in a practical development, deployment, and operations pipeline. ... DevOps and IT teams must work closely to achieve this; as a result, a central repository for model artifacts is required, and ML engineers must redesign the production model. Thus, a smooth collaboration between the IT, DevOps, and data scientist teams is crucial. MLOps, or machine learning operations, is a different way of describing the confluence of people, processes, practices, and underlying technology that automate the implementation, monitoring, and management of AI/ML models in production in a scalable and thoroughly controlled manner.


India: Crucial cyberwarfare capabilities need to be upgraded

The world has seen many cases of cyber-attacks in espionage and sabotage. Many significant cyberattacks in the military and civil spaces have occurred in recent months. APT41, a Chinese state-sponsored hacking group, allegedly hacked into six US state governments between May 2021 and February 2022. Another significant incident, a Distributed Denial of Service (DDoS) attack the preceding month, targeted Israeli government websites. While the government has said this was the largest cyber-attack Israel has faced, investigations are yet to determine the source of the attack. Similarly, a targeted cyber-attack campaign on Russian research institutes was discovered in June 2021. The target was research institutes under the Rostec Corporation, whose primary expertise is the research and development of highly technological defence solutions. In India, researchers detected a new ransomware that made its victims donate money to the needy. However, this ransomware, called Goodwill, also acts maliciously by causing temporary or even permanent loss of company data and the possible closure of a company’s operations and finances.


The API gateway pattern versus the Direct client-to-microservice communication

In a microservices architecture, the client apps usually need to consume functionality from more than one microservice. If that consumption is performed directly, the client needs to handle multiple calls to microservice endpoints. What happens when the application evolves and new microservices are introduced or existing microservices are updated? If your application has many microservices, handling so many endpoints from the client apps can be a nightmare. Since the client app would be coupled to those internal endpoints, evolving the microservices in the future can cause high impact for the client apps. ... When you design and build large or complex microservice-based applications with multiple client apps, a good approach to consider can be an API Gateway. This pattern is a service that provides a single-entry point for certain groups of microservices. It's similar to the Facade pattern from object-oriented design, but in this case, it's part of a distributed system. The API Gateway pattern is also sometimes known as the "backend for frontend" (BFF) because you build it while thinking about the needs of the client app.
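The pattern can be sketched in a few lines. The service functions below stand in for real network calls and are purely illustrative: the client makes one call to the gateway, which fans out to several internal microservices and aggregates the result, so clients never couple to internal endpoints.

```python
# Minimal API Gateway / backend-for-frontend sketch. The service
# functions are stand-ins for calls to internal microservices.

def profile_service(user_id):
    return {"name": "Ada", "tier": "gold"}

def orders_service(user_id):
    return [{"id": 1, "item": "keyboard"}]

def recommendations_service(user_id):
    return ["mouse", "monitor"]

class ApiGateway:
    """Single entry point that hides internal endpoints from clients."""

    def dashboard(self, user_id):
        # If a microservice is split, renamed, or re-versioned, only
        # the gateway changes; client apps keep calling one endpoint.
        return {
            "profile": profile_service(user_id),
            "orders": orders_service(user_id),
            "recommendations": recommendations_service(user_id),
        }

gateway = ApiGateway()
print(gateway.dashboard(user_id=42))
```

One gateway call replaces three client-side calls, and evolving the internal services no longer ripples out to every client app, which is the core benefit the pattern promises.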


Why Choose a NoSQL Database? There Are Many Great Reasons

Speed is critical to innovation, but so is flexibility. A core principle of agile development is responding quickly to change. Often when the requirements change, the data model also needs to change. With relational databases, developers often have to formally request a “schema change” from the database administrators. This slows down or stops development. By comparison, a NoSQL document database fully supports agile development because it is schema-less and does not statically define how the data must be modeled. Instead, it defers to the applications and services, and thus to the developers as to how data should be modeled. With NoSQL, the data model is defined by the application model. Applications and services model data as objects (such as employee profile), multivalued data as arrays (roles) and related data as nested objects or arrays (for instance, manager relationship). Relational databases, however, model data as tables of rows and columns — related data as rows within different tables, multivalued data as rows within the same table.
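The contrast in data modeling can be made concrete with a small sketch (field names are illustrative): one self-describing document holds multivalued data as an array and related data as a nested object, while the relational equivalent spreads the same facts over separate tables that must be joined back together.

```python
# Document model: the shape matches the application object directly.
employee_doc = {
    "_id": "emp-1001",
    "name": "Priya",
    "roles": ["developer", "reviewer"],           # multivalued -> array
    "manager": {"id": "emp-0007", "name": "Sam"}, # related -> nested object
}

# Relational model: the same data flattened into rows across tables.
employees = [("emp-1001", "Priya", "emp-0007"), ("emp-0007", "Sam", None)]
employee_roles = [("emp-1001", "developer"), ("emp-1001", "reviewer")]

def roles_for(emp_id):
    """Relational read: a join-style scan over a separate table."""
    return [role for eid, role in employee_roles if eid == emp_id]

# Document read: no join needed.
print(employee_doc["roles"])
print(roles_for("emp-1001"))
```

Adding a new field to the document (say, a list of certifications) needs no schema-change request; the relational layout would need a new table or column first, which is the agility gap the passage describes.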


Securing the Internet of Things

Unlike humans, who need to be able to access a potentially unbounded number of destinations (websites), the endpoints that an IoT device needs to speak to are typically far more bounded. But in practice, there are often few controls in place (or available) to ensure that a device only speaks to your API backend, your storage bucket, and/or your telemetry endpoint. Our Zero Trust platform, however, has a solution for this: Cloudflare Gateway. You can create DNS, network or HTTP policies, and allow or deny traffic based not only on the source or destination, but on richer identity- and location-based controls. It seemed obvious that we could bring these same capabilities to IoT devices, and allow developers to better restrict and control which endpoints their devices talk to (so they don't become part of a botnet). ... Security continues to be a concern: if your device needs to talk to external APIs, you have to ensure the credentials it uses are explicitly scoped, so that they cannot be pulled from the device and used in ways you don't expect.
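Because a device's destinations are bounded, its egress policy can be expressed as a deny-by-default allowlist. The sketch below illustrates that idea in plain code; the hostnames are hypothetical, and Cloudflare Gateway expresses the equivalent as DNS, network or HTTP policies rather than application code.

```python
from urllib.parse import urlparse

# The small, known set of endpoints this (hypothetical) device may contact.
ALLOWED_HOSTS = {
    "api.example-iot.com",        # API backend
    "telemetry.example-iot.com",  # telemetry endpoint
    "storage.example-iot.com",    # storage bucket
}

def egress_allowed(url: str) -> bool:
    """Deny by default; permit only explicitly listed destinations."""
    return urlparse(url).hostname in ALLOWED_HOSTS

print(egress_allowed("https://api.example-iot.com/v1/report"))   # True
print(egress_allowed("https://botnet-c2.example.net/join"))      # False
```

Enforcing this at the network layer, rather than on the device itself, means a compromised device still cannot reach anything outside the list.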


Modern Enterprise Data Architecture

In traditional architecture development, data modeling is the simple task of deriving data elements from requirements, depicting the relation between the entities through entity relationship (ER) diagrams, and defining the parameters (data types, constraints, validations) around the data elements. This means that data modeling is done as a single-step activity in a traditional architecture by defining the data definition language (DDL) scripts from requirements. ... A database acts as the brain for an IT application because it serves as the central store for data being transacted and referenced in the application. Database administrators (DBAs) handle database tuning, security activities, backup, DR activities, server/platform updates, health checks, and all other management and monitoring activities of databases. When you use a cloud platform for application and database development, the aforementioned activities are critical for better security, performance, and cost efficiency. 
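The "single-step" DDL approach described above can be illustrated with a small script. The schema below is a made-up example derived from a requirement like "track employees and their departments"; it uses SQLite only because it is self-contained.

```python
import sqlite3

# Illustrative DDL: entities, data types, constraints and the relation
# between them are all fixed up front, as in traditional data modeling.
ddl = """
CREATE TABLE department (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);
CREATE TABLE employee (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    dept_id INTEGER NOT NULL REFERENCES department(id)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['department', 'employee']
```

Any later change to these definitions is exactly the kind of schema migration that modern, iterative architectures try to plan for rather than treat as a one-off.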


Data privacy can give businesses a competitive advantage

It is a similar story when it comes to protecting personal data: a competitive edge is waiting to be revealed through compliance. The fines that non-compliance brings are perhaps one of the most-reported aspects of the new regulation. Serious breaches can cost a company €20m, or 4 per cent of global annual revenue, whichever is higher, but the Information Commissioner's Office (ICO) has been very clear it has no intention to scapegoat businesses using these powers. The GDPR requires that data be held and processed securely, and though the law does not mandate specific technologies, Article 32 sets out clearly what is expected. The ICO's advice is that processing the minimum amount of personally identifiable information possible is a good start. Then, storing it securely and in an encrypted form makes sense. In certain circumstances, anonymising data so it can collectively provide insight without revealing identities is another tactic many organisations are using. Securing data so it cannot be hacked is a worthy end in its own right.
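One common minimisation tactic along these lines is to replace direct identifiers with salted hashes, so records can still be grouped and counted without revealing who they belong to. The sketch below is illustrative only: the salt must be stored securely and separately, and this is pseudonymisation rather than full anonymisation.

```python
import hashlib

SALT = b"example-secret-salt"  # hypothetical; load from secure storage

def pseudonymise(email: str) -> str:
    """Replace a direct identifier with a salted SHA-256 token."""
    return hashlib.sha256(SALT + email.encode("utf-8")).hexdigest()

token = pseudonymise("alice@example.com")
# The same input always yields the same token, so aggregate
# counts and joins still work without exposing the identity.
print(token == pseudonymise("alice@example.com"))  # True
print(token == pseudonymise("bob@example.com"))    # False
```

Under the GDPR, pseudonymised data is still personal data; only truly irreversible anonymisation takes data out of scope.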


7 Metrics to Measure the Effectiveness of Your Security Operations

The main objective of a resilient security operations program should be lowering an organization's MTTD and MTTR to limit any damage done by a cyber incident to your organization. MTTD measures the amount of time it takes to discover a potential security threat. This metric helps you understand the effectiveness of your organization's security operations and your team's speed and ability to recognize a threat. Therefore, the goal is to keep this metric as low as possible in order to reduce the impact of a compromise on your organization. Meanwhile, MTTR helps you measure the time it takes to respond to a threat once it is detected. The longer the response time, the more likely a compromise escalates into a damaging data breach. The goal, just as with MTTD, is to speed up your response and decrease your risk. Both MTTD and MTTR are key metrics for measuring and improving your team's capabilities, and it is crucial to keep tracking your team's effectiveness as your organization's maturity grows. Like any fundamental business operation, to mature your organization you should measure operational effectiveness to determine whether your organization is reaching its KPIs and SLAs.
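Computing the two metrics is straightforward once each incident records when the compromise occurred, when it was detected, and when it was resolved. The timestamps below are invented for illustration.

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records with the three timestamps needed.
incidents = [
    {"occurred": datetime(2022, 9, 1, 8, 0),
     "detected": datetime(2022, 9, 1, 10, 0),
     "resolved": datetime(2022, 9, 1, 14, 0)},
    {"occurred": datetime(2022, 9, 5, 9, 0),
     "detected": datetime(2022, 9, 5, 9, 30),
     "resolved": datetime(2022, 9, 5, 11, 30)},
]

# MTTD: mean time from compromise to detection.
mttd_hours = mean((i["detected"] - i["occurred"]).total_seconds() / 3600
                  for i in incidents)
# MTTR: mean time from detection to resolution.
mttr_hours = mean((i["resolved"] - i["detected"]).total_seconds() / 3600
                  for i in incidents)

print(f"MTTD: {mttd_hours:.2f} h, MTTR: {mttr_hours:.2f} h")
# -> MTTD: 1.25 h, MTTR: 3.00 h
```

Tracking these averages per month or quarter is what turns them into the KPI/SLA evidence the text describes.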



Quote for the day:

"Leadership is the art of giving people a platform for spreading ideas that work" -- Seth Godin