Daily Tech Digest - October 02, 2022

CIOs Still Waiting for Cloud Investments to Pay Off

“You discover your cloud architecture is immature when you’re surprised by the bill,” said Mr. Roese. “Usually the reason it was expensive is that they processed the data in a suboptimal way, in the wrong place with the wrong tools with the wrong economic model.” Roughly 67% of 1,000 senior technology leaders at U.S. firms across industries said they have yet to see a significant return on cloud investments, KPMG said Thursday in its annual technology survey. “Many first movers expected significant IT cost efficiency from their cloud investments,” said Barry Brunsman, a principal in KPMG’s CIO Advisory group. “We have seen a shift away from that expectation in favor of speed and agility.” Mr. Brunsman said the most common issues preventing a better return on cloud spending were insufficient talent or skills among tech teams, added security and compliance requirements, and a misalignment on expected outcomes. Global cloud spending is projected to reach $830.5 billion by the end of the year, up 17.5% from 2021, but slowing from last year’s growth rate of 18.3%, according to International Data Corp. It expects growth to drop to 16.3% next year.


To Code or Not to Code: The Benefits of Automating Software Testing

The hoped-for solution to these problems has been test automation. This offers a way to reduce manual involvement, test greater volumes, remove the risk of human error and accelerate time to market as much as tenfold for a serious competitive advantage. Yet, even organizations that have rolled out automated testing have discovered that a big problem remains. They can’t easily scale these solutions because professional-grade coding skills are still required. Even if they are marketed as “low-code,” they’re still far too complex for business users. Even testers usually lack the coding skills to set up tests on their own. As a result, coding skills remain a resource bottleneck that slows down the testing process and limits collaboration with business users. What’s more, as automated testing becomes more pervasive throughout the organization, the maintenance workload grows. All those test engineers an organization hired to implement an automation framework increasingly spend their time maintaining code instead of building bigger and smarter test scopes. Scaling becomes impossible.


Where JavaScript is headed in 2022

Angular is showing ominous signs of weakness around retention and interest, ranking near the bottom at #9. Nevertheless, it remains #2 for actual usage, and #3 for awareness. Vue continues to be a strong contender, with a decent ranking across all categories. Overall the story on the front end is of incremental refinements, rather than revolutionary upheaval. And on the back end? Next.js instigated the full-stack JavaScript movement, and remains second only to Express in both awareness and usage. The comparison of Next to Express is of course imperfect. Express is a server-side framework only, the workhorse of Node-based HTTP. ... Another interesting discovery in relation to languages that compile to JavaScript is the popularity of Elm. Elm is an ingenious functional language geared for web development, and highly regarded for its enablement of fast and fluent applications. But it’s also a mothballed project without any commits for months. The takeaway? Clearly the basic ideas in Elm are still desirable and popular. Perhaps a new leader could take up the project and carry it forward to the benefit of the entire ecosystem.


Blockchain: An Immutable Future?

The general consensus is that the pandemic delayed the adoption of distributed ledger technologies. Companies worldwide felt the repercussions of supply chain disruption and changes to consumer habits, which meant that the implementation of blockchain fell low on priority lists. However, in some cases, blockchain technology was used effectively to coordinate logistics. Despite the limited use, the global crisis helped drive further discussion about blockchain’s benefits; for example, ledger technologies could potentially have helped counter the fake vaccines that flooded the market during the height of the pandemic. This topic also feeds into a wider conversation about the influx of counterfeit medicines in pharma supply chains. ... The computing power needed to mine (add information to the blockchain) plus the duplication of work is an obvious source of environmental concern. Beyond environmental challenges, every piece of data added to a blockchain needs to be transcribed onto every copy of the blockchain, resulting in a much greater cost than server or cloud storage. 
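The replication-cost point above is easy to make concrete with back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not numbers from the article: every full node keeps its own copy of the ledger, so total storage scales with the node count.

```python
def replicated_storage_gb(data_gb: float, num_nodes: int) -> float:
    """Total storage consumed when every node keeps a full copy of the ledger."""
    return data_gb * num_nodes

# Illustrative numbers: a 500 GB ledger replicated across 10,000 full nodes.
single_copy = 500.0                              # centralized server/cloud storage
blockchain = replicated_storage_gb(500.0, 10_000)

print(f"Centralized: {single_copy:,.0f} GB")
print(f"Blockchain:  {blockchain:,.0f} GB ({blockchain / single_copy:,.0f}x)")
```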


The 5 Absolute Best X-Factors That Increase Enterprise Value

The problem with the traditional business model is that you don’t know when clients return, if at all. Both cash flow and forecasting are problematic. The first step in transforming your business model is implementing recurring revenue. Again, you can look to Software as a Service (SaaS) for inspiration. Chances are, you pay a monthly fee for your favorite movie streaming service. Customers face no hidden billing surprises, as they know exactly what they’ll pay. You enjoy the predictability of cash flow and now have accurate budgets. The second step in transforming your business model is creating long-term exclusive contracts. Your mission is to show clients why they are better off with a long-term commitment. ... A rich and thriving culture is the foundation of your business. It’s your culture that plays a role in the customer experience. Your culture determines if you leverage a market opportunity or not. Buyers look for a culture that promotes resilience, innovation, and accountability. Your future buyer knows that people come and go in a business, including you. A rich and thriving culture transcends people and takes on a life of its own.


Three Ways Banks Can Engage Younger Consumers in the Metaverse

The metaverse lets banks roll out the virtual red carpet for customers, with tailored experiences for specific segments and personas. Personalized virtual banking enables that special something that leaves customers feeling valued. Within a metaverse branch, banks can create virtual rooms in which avatars of relationship managers and customer advisors work one-on-one with high-net-worth individuals, for instance. They might also provide services to individuals looking to create a college fund or businesses interested in obtaining loans. Metaverse banking’s combination of personalization and community puts a fresh, modern spin on CX, and it’s an especially powerful draw for young banking consumers who are critical to the future of banking. ... To connect with the next generation of connected consumers, banks must begin building their presence among the more popular metaverses and increase engagement with younger demographic audiences through 3D banking, personalized services and DAOs. The good news? For payment providers and retail and commercial banks, there are no obstacles surrounding the preparation, and it is not too late to get ahead. 


Biology Inspires a New Kind of Water-Based Circuit That Could Transform Computing

Since this is closer to the way the brain transports information, they say, their device could be the next step forward in brain-like computing. "Ionic circuits in aqueous solutions seek to use ions as charge carriers for signal processing," write the team led by physicist Woo-Bin Jung of the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in a new paper. "Here, we report an aqueous ionic circuit… This demonstration of the functional ionic circuit capable of analog computing is a step toward more sophisticated aqueous ionics." A major part of signal transmission in the brain is the movement of charged molecules called ions through a liquid medium. Although the incredible processing power of the brain is extremely challenging to replicate, scientists have thought that a similar system might be employed for computing: pushing ions through an aqueous solution. This would be slower than conventional, silicon-based computing, but it might have some interesting advantages.


Failure of Russia’s cyber attacks on Ukraine is most important lesson for NCSC

She said three things could be attributed to the “unexpected” lack of success – “impressive” Ukrainian cyber defences, “incredible” support from the cyber security sector and “impressive collaboration” between the US, EU, Nato, the UK, and others. “Just as we have seen inspirational and heroic defence by Ukrainian military on the battlefield, we have seen incredibly impressive defensive cyber operations by Ukrainian cyber security practitioners. Many commentators have suggested that this has been the most effective defensive cyber activity undertaken under sustained pressure in history,” added Cameron. She said the constant cyber attacks on Ukraine, emanating from Russia, over the past decade had prepared the country’s cyber defences. “In many ways, Russia has made Ukraine match fit over the past 10 years by consistently attacking them,” said Cameron. “We haven’t seen ‘cyber Armageddon’, but that’s not a surprise to cyber professionals, who never expected it. What we have seen is a very significant conflict in cyber space – probably the most sustained and intensive cyber campaign on record.”


Feds: Chinese Hacking Group Undeterred by Indictment

The United States began publicly indicting Chinese hackers in 2014 in a strategy to pressure Beijing by exposing the organizations and individuals behind state-sponsored cybertheft. The strategy seemed to pay dividends when Chinese leader Xi Jinping in September 2015 pledged to cease cyber-enabled economic espionage. The strategy's utility has since come under mounting fire as it became apparent that Chinese state-sponsored hacking responded to Xi's promise by becoming stealthier rather than by ending. "The indictment did not hinder APT41's operations as they progressed into 2021," concludes the Department of Health and Human Services' Health Sector Cybersecurity Coordination Center in a Thursday threat brief. "Stealing foreign IP is a primary objective of state-sponsored Chinese cyberespionage groups such as APT41, as it contributes to China's ambitious business and economic development goals," says Paul Prudhomme, a former Department of Defense threat analyst who is head of threat intelligence advisory at Rapid7.


What Is Artificial Intelligence in Software Testing?

You may wonder, “Don’t test automation tools do this already?” Of course, test automation tools already have AI in effect, but they have limitations. Where AI shines in software development is when it’s applied to remove those limitations, enabling software test automation tools to provide even more value to developers and testers. The value of AI comes from reducing the direct involvement of the developer or tester in the most mundane tasks. We still have a great need for human intelligence in applying business logic, strategic thinking, creative ideas, and the like. For example, consider that most, if not all, test automation tools run tests for you and deliver results. Most don’t know which tests to run, so they run all of them or some predetermined set. What if an AI-enabled bot could review the current state of test statuses, recent code changes, code coverage, and other metrics, and then decide which tests to run and run them for you? Bringing in decision-making that’s based on changing data is an example of applying AI. Good news! Parasoft handles automated software testing at this level.



Quote for the day:

"It's not the position that makes the leader. It's the leader that makes the position." -- Stanley Huffty

Daily Tech Digest - October 01, 2022

3 wins and 3 losses for cloud computing

The cloud can successfully provide business agility. I always tell my clients that most enterprises moved to cloud for the perceived cost savings but stayed for the agility. Cloud provides businesses with the ability to turn IT on a dime, and enterprises that move fast in today’s more innovative markets (such as retail, healthcare, and finance) find that the speed at which cloud systems can change provides a real force multiplier for the business. The cloud offers industrial-strength reliability. Most who pushed back on cloud computing argued that we would put all our eggs in a basket that could prove unreliable. ... The businesses that moved to cloud computing anticipated significant cost savings. Those savings never really materialized except for completely new businesses that had no prior investment in IT. In fact, most enterprises looked at their cloud bills with sticker shock. The primary culprit? Enterprises that did not use cloud finops programs to effectively manage cloud costs. Also, cloud providers offered pricing and terms that many enterprises did not understand (and many still don’t).


Data storytelling: A key skill for data-driven decision-making

Rudy is a firm believer in letting the data unfold by telling a story so that when the storyteller finally gets to the punch line or the “so what, do what” there is full alignment on their message. As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding). The storyteller should then break it down into five buckets: people, hardware, software, services, other (more nodding), Rudy says. Then further break it down into these technology areas: cloud, security, data center, network, and so on (more nodding). Next the storyteller reveals that based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that compared to competitors of similar size and complexity, the storyteller’s organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.
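The unit-cost reveal at the end of that story is simple arithmetic: spend per technology area divided by usage volume, compared against a peer benchmark. The figures below are hypothetical, purely to illustrate the breakdown Rudy describes.

```python
# Hypothetical benchmark figures: annual spend and usage volume per technology
# area, plus a peer median unit cost for companies of similar size.
spend = {"cloud": 12_000_000, "security": 9_000_000, "network": 6_000_000}
volume = {"cloud": 40_000, "security": 10_000, "network": 30_000}
peer_unit_cost = {"cloud": 310, "security": 700, "network": 195}

for area in spend:
    unit = spend[area] / volume[area]          # cost per unit of usage
    delta = unit - peer_unit_cost[area]
    flag = "over" if delta > 0 else "under"
    print(f"{area:8s} unit cost ${unit:,.0f} ({flag} peers by ${abs(delta):,.0f})")
```

Security at $900 per unit against a $700 peer median is the "now everyone is really paying attention" moment.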


Active vs. Passive Network Monitoring: What Should Your Company Use?

Active network monitoring, also known as synthetic network monitoring, releases test traffic onto the network and observes that traffic as it travels through. This traffic is not taken from actual transactions that occur on a network, but rather sent through the network in order for your monitoring solution to examine it on its path. Test traffic usually mimics the typical network traffic that flows through your system so your administrators will gain the most relevant insights into the network. ... Passive network monitoring refers to capturing network traffic that flows through a network and analyzing it afterwards. Through a collection method like log management or network taps, passive monitoring compiles historic network traffic to paint a bigger picture of your company’s network performance. The primary use for passive network monitoring is for discovering and predicting performance issues that happen at specific instances and areas of your network. ... The question that might be passing through your mind is “should my business use active monitoring or passive monitoring for my network performance strategy?”
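A minimal active probe can be sketched in a few lines: send synthetic traffic (here, just a TCP handshake) toward a target and time it. This is an illustrative sketch, not a production monitoring tool; real active monitors also probe application-layer behavior.

```python
import socket
import time

def probe_tcp_latency(host: str, port: int, timeout: float = 2.0) -> float:
    """Active (synthetic) probe: open a TCP connection and time the handshake.

    Returns elapsed seconds; raises OSError if the target is unreachable.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return time.perf_counter() - start

# Example: probe a locally listening service on an ephemeral port.
server = socket.create_server(("127.0.0.1", 0))
_, port = server.getsockname()
latency = probe_tcp_latency("127.0.0.1", port)
server.close()
print(f"handshake latency: {latency * 1000:.2f} ms")
```

Run on a schedule against key services, probes like this surface reachability and latency trends before real users notice them.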


A New Linux Tool Aims to Guard Against Supply Chain Attacks

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised. With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure,” Chris DeRusha, the US federal chief information security officer and deputy national cyber director, wrote in the White House announcement. "This is not theoretical: Foreign governments and criminal syndicates are regularly seeking ways to compromise our digital infrastructure.” When it comes to Wolfi, Santiago Torres-Arias, a software supply chain researcher at Purdue University, says that developers could accomplish some of the same protections with other Linux distributions, but that it’s a valuable step to see a release that’s been stripped down and purpose-built with supply chain security and validation in mind. “There’s past work, including work done by people who are now at Chainguard, that was kind of the precursor of this train of thought that we need to remove the potentially vulnerable elements and list the software included in a particular container or Linux release,” Torres-Arias says.


Using governance to spur, not stall, data access for analytics

“Without good governance controls, you not only have the policy management risk, but you also risk spending much, much more money than you intend, much faster,” says Barch. “We knew that maximizing the value of our data, especially as the quantity and variety of that data scales, was going to require creating integrated experiences with built-in governance that enabled the various stakeholders involved in activities like publishing data, consuming data, governing data and managing the underlying infrastructure, to all seamlessly work together.” What does this blended approach to data governance look like? For Capital One, it’s what Barch calls “sloped governance.” With a sloped governance approach, you can increase governance and controls around access and security for each level of data. For example, private user spaces, which don’t contain any shared data, can have minimal data governance requirements. As you move further into production, the controls get stricter and take more time to be implemented.
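The sloped-governance idea lends itself to a small sketch: each successive data zone inherits the controls of the previous one and adds stricter ones. Zone and control names below are hypothetical, not Capital One's actual policy set.

```python
# Hypothetical "sloped" governance: controls accumulate as data moves from
# private workspaces toward production.
ZONES = ["private_workspace", "shared_analytics", "production"]
ADDED_CONTROLS = {
    "private_workspace": {"authentication"},
    "shared_analytics":  {"access_review", "data_classification"},
    "production":        {"encryption_at_rest", "audit_logging", "change_approval"},
}

def required_controls(zone: str) -> set[str]:
    """Union of the controls added at this zone and every zone below it."""
    controls: set[str] = set()
    for z in ZONES[: ZONES.index(zone) + 1]:
        controls |= ADDED_CONTROLS[z]
    return controls

print(sorted(required_controls("private_workspace")))  # minimal governance
print(sorted(required_controls("production")))         # full control set
```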


Microsoft: Hackers are using open source software and fake jobs in phishing attacks

The hacking group has targeted employees in media, defense and aerospace, and IT services in the US, UK, India, and Russia. The group was also behind the massive attack on Sony Pictures Entertainment in 2014. Also known as Lazarus, and tracked by Microsoft as ZINC, Google Cloud's Mandiant threat analysts saw the group spear-phishing targets in the tech and media sectors with bogus job offers in July, using WhatsApp to share a trojanized instance of PuTTY. "Microsoft researchers have observed spear-phishing as a primary tactic of ZINC actors, but they have also been observed using strategic website compromises and social engineering across social media to achieve their objectives," MSTIC notes. "ZINC targets employees of companies it's attempting to infiltrate and seeks to coerce these individuals into installing seemingly benign programs or opening weaponized documents that contain malicious macros. Targeted attacks have also been carried out against security researchers over Twitter and LinkedIn."


New deepfake threats loom, says Microsoft’s chief science officer

In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as “the illusion of talking to a real person. Imagine a scammer calling your grandmom who looks and sounds exactly like you.” Compositional deepfakes, he continued, go further with a bad actor creating many deepfakes to compile a “synthetic history.” ... The rise of ever-more sophisticated deepfakes will “raise the bar on expectations and requirements” of journalism and reporting, as well as the need to foster media literacy and raise awareness of these new trends. In addition, new authenticity protocols to confirm identity might be necessary, he added – even new multifactor identification practices for admittance into online meetings. There may also need to be new standards to prove content provenance, including new watermark and fingerprint methods; new regulations and self-regulation; red-team efforts and continuous monitoring.


How Does WebAuthn Work?

WebAuthn is quite clever. It leverages the power of public key cryptography to create a way for users to log in to mobile and web applications without those applications having to store any secret information at all. Usually, when one thinks of public key cryptography, one thinks of using it to send a secret message to a person who then decrypts it and reads it. Well, this can work in reverse as a proof of identity. If you send someone a message encrypted with their public key, only they can decrypt it, because only they hold the private key that corresponds to the given public key. Once they do, you can be highly confident that they are the entity they claim to be. Currently, all the major browsers – Chrome, Firefox, Edge, and Safari – support the WebAuthn specification. If your phone – iPhone or Android – has a fingerprint reader or facial scanner, it supports WebAuthn. Windows provides WebAuthn support via Windows Hello. All of this translates to passwordless authentication quite nicely.
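The possession proof described above can be illustrated with textbook RSA. This toy uses tiny primes and no padding, purely to show the principle; real WebAuthn relies on vetted signature schemes (such as ES256) exposed through the browser's `navigator.credentials` API, never raw RSA like this.

```python
import secrets

# Textbook-RSA toy parameters (the classic 61 * 53 example).
p, q = 61, 53
n = p * q          # public modulus (3233)
e = 17             # public exponent
d = 2753           # private exponent: e * d == 1 (mod (p-1)*(q-1))

# The relying party picks a random challenge and encrypts it with the
# user's PUBLIC key; only the holder of the PRIVATE key can recover it.
challenge = secrets.randbelow(n)
ciphertext = pow(challenge, e, n)
recovered = pow(ciphertext, d, n)

assert recovered == challenge      # the authenticator proved key possession
print("challenge round-tripped:", recovered == challenge)
```

Because the server only ever stores the public key, a database breach leaks nothing an attacker can log in with, which is the core appeal of WebAuthn.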


Why developers hold the key to cloud security

APIs drive cloud computing. They eliminate the requirement for a fixed IT architecture in a centralized data center. APIs also mean attackers don’t have to honor the arbitrary boundaries that enterprises erect around the systems and data stores in their on-premises data centers. While identifying and remediating misconfigurations is a priority, it’s essential to understand that misconfigurations are just one means to the ultimate end for attackers: control plane compromise. This has played a central role in every significant cloud breach to date. Empowering developers to find and fix cloud misconfigurations when developing IaC is critical, but it’s equally important to give them the tools they need to design cloud architecture that’s inherently secure against today’s control plane compromise attacks. ... Developers are in the best (and often only) position to secure their code before deployment, maintain its secure integrity while running, and better understand the specific places to provide fixes back in the code. But they’re also human beings prone to mistakes operating in a world of constant experimentation and failure. 


IT enters the era of intelligent automation

Companies also need to optimize business processes to increase the effectiveness of automation, Nallapati says. “Working together in a partnership, the business unit and the automation teams can leverage their expertise to refine the best approach and way forward to optimize the efficiency of the bot/automation,” she says. Technology leaders should make sure to get business leaders and users involved in the IA process, Ramakrishnan says. “Educate them about the possibilities and collaborate with them in joint problem-solving sessions,” he says. ... “With a large number of customers and a large number of invoices to process every day, any small savings through automation goes a long way in increasing productivity, accuracy, and improving employee and end customer satisfaction,” Ramakrishnan says. Similar to the type of hackathons that are common in IT organizations today, Ramakrishnan says, “we partnered with the business to have a business-side hackathon/ideathon. We educated the key users from the billing team on the possibilities of automation, and then they were encouraged to come back with ideas on automation.”



Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith

Daily Tech Digest - September 30, 2022

5 Signs That You’re a Great Developer

Programming changes the way your brain works: you start to think more algorithmically and solve problems faster, so it affects other aspects of your life as well. Good programmers not only learn anything else much faster, especially in tech-related fields, but also make great entrepreneurs and CEOs. Look at Elon Musk, for instance: he was a programmer and built his own game when he was 12. ... As we briefly discussed previously, programming encourages creative thinking and teaches you how to approach problems in the most effective way. But to get there, you must first be able to solve a lot of difficult problems and have a passion for doing so; only then will you likely succeed as a developer. If you’ve just started and thought it was easy, you’re completely wrong. You just haven’t encountered genuinely challenging problems yet, and the more you learn, the more difficult and complex the problems become, because you need not only to solve them, but to solve them in the most effective way possible, speed up your algorithm, and optimize everything.


Experimental WebTransport over HTTP/3 support in Kestrel

WebTransport is a new draft specification for a transport protocol similar to WebSockets that allows the usage of multiple streams per connection. WebSockets allowed upgrading a whole HTTP TCP/TLS connection to a bidirectional data stream. If you needed to open more streams you’d spend additional time and resources establishing new TCP and TLS sessions. WebSockets over HTTP/2 streamlined this by allowing multiple WebSocket streams to be established over one HTTP/2 TCP/TLS session. The downside here is that because this was still based on TCP, any packets lost from one stream would cause delays for every stream on the connection. With the introduction of HTTP/3 and QUIC, which uses UDP rather than TCP, WebTransport can be used to establish multiple streams on one connection without them blocking each other. For example, consider an online game where the game state is transmitted on one bidirectional stream, the players’ voices for the game’s voice chat feature on another bidirectional stream, and the player’s controls are transmitted on a unidirectional stream. 
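The no-head-of-line-blocking property behind that game example can be modeled conceptually. Python's standard library has no WebTransport or QUIC support, so the sketch below only models the idea: each stream is an independent queue, and a stall on one (the voice stream here) never delays delivery on the others.

```python
import asyncio

# Conceptual model only: three independent "streams" on one connection,
# mirroring the game-state / voice / controls example above.

async def consume(name: str, stream: asyncio.Queue, out: list[str]) -> None:
    while True:
        item = await stream.get()
        if item is None:                 # sentinel: stream closed
            return
        out.append(f"{name}:{item}")

async def main() -> list[str]:
    game_state, voice, controls = asyncio.Queue(), asyncio.Queue(), asyncio.Queue()
    delivered: list[str] = []
    tasks = [
        asyncio.create_task(consume("state", game_state, delivered)),
        asyncio.create_task(consume("voice", voice, delivered)),
        asyncio.create_task(consume("ctrl", controls, delivered)),
    ]
    await game_state.put("tick1")
    await controls.put("jump")
    # The voice stream is "stalled" (nothing arrives), yet the other two
    # streams still deliver their data independently.
    await asyncio.sleep(0)
    for q in (game_state, voice, controls):
        await q.put(None)
    await asyncio.gather(*tasks)
    return delivered

print(asyncio.run(main()))
```

Under TCP-based WebSockets, one lost packet would stall all three flows; QUIC's per-stream delivery is what removes that coupling.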


Software builders across Amazon require consistent, interoperable, and extensible tools to construct and operate applications at our peculiar scale; organizations will extend on our solutions for their specialized business needs. Amazon’s customers benefit when software builders spend time on novel innovation. Undifferentiated work elimination, automation, and integrated opinionated tooling reserve human interaction for high judgment situations. Our tools must be available for use even in the worst of times, which happens to be when software builders may most need to use them: we must be available even when others are not. Software builder experience is the summation of tools, processes, and technology owned throughout the company, relentlessly improved through the use of well-understood metrics, actionable insights, and knowledge sharing. Amazon’s industry-leading technology and access to top experts in many fields provides opportunities for builders to learn and grow at a rate unparalleled in the industry. As builders we are in a unique position to codify Amazon’s values into the technical foundations; we foster a culture of belonging by ensuring our tools, training, and events are inclusive and accessible by design.


The Troublemaker CISO: How Much Profit Equals One Life?

We take for granted that those who are charged with protecting us are doing so with our best interest at heart. There is no shaving off another few cents just to increase value to shareholders over the life of a person. Lucky for me, there is a shift in the boardrooms and governing bodies to see how socially responsible you are and whether you are acting in the best interest of the people and not just the investors. If the members of the board and governing body are considering these topics when steering a business, isn't it time to relook at how and why we do things? Are we as CISOs not accountable to leadership to impress on them the risk that IoT/internet connectivity poses to critical networks - and especially to healthcare? It is time to be firm in expressing the risk and saying we would rather spend a bit more money and time and do it the safe way. And this should be listed as the top risk in the company. The other big issue I have with this type of network being connected is one of transparency.


Digital Twins Offer Cybersecurity Benefits

A key difficulty, from a cybersecurity perspective, is the fact drug production lines are made up of multiple different technologies, running different operating systems that are often provided by different suppliers. “Integrating multiple systems from different suppliers can provide expanded attack surface that can be exploited by cyber adversary,” continues Mylrea. To address this, Mylrea and Grimes developed what they refer to as “biosecure digital twins”—replicas of manufacturing lines they use to identify potential points of attack for hackers. “The digital twin is essentially a high-fidelity virtual representation of critical manufacturing processes. From a security perspective, this improves monitoring, detection, and mitigation of stealthy attacks that can go undetected by most conventional cybersecurity defenses,” explains Mylrea. “Beyond security, the biosecure digital twin can optimize performance and productivity by detecting when critical systems deviate from their ideal state and correct in real time to enable predictive maintenance that prevent costly faults and safety failures.”


Unlocking cyber skills: This year’s essential back-to-school lesson plan

Technology is continually advancing, which will only create more avenues for cybersecurity roles in the future. While it’s essential to inform students about the types of careers in cybersecurity, teachers and career advisors should be aware of the skills and qualities the sector needs beyond technical computer and software knowledge. Once this is achieved, it can shed light on the roles students can go on to. Technical skills are critical in cybersecurity, yet they can be learned, fostered, and evolved throughout a student’s career. Schools need to tap into individual students’ strengths in hopes of encouraging them to pursue cyber positions. Broadly, cybersecurity enlists leaders, communicators, researchers, critical thinkers… the list goes on. Having the qualities needed to fulfil various roles in the industry can position a student remarkably well when they first start in the industry. Yet, this comes down to their mentors in high school being able to communicate that a student’s inquisitive nature or presenting skills can be applied to various sectors.


Data literacy: Time to cure data phobia

Data literacy is an incredibly important asset and skill set that should be demonstrated at all levels of the workplace. In simple terms, data literacy is the fundamental understanding of what data means, how to interpret it, how to create it and how to use it both effectively and ethically across business use cases. Employees who have been trained in and applied their knowledge of how to use company data demonstrate a high level of data literacy. Although many people have traditionally associated data literacy skills with data professionals and experts, it’s becoming necessary for employees from all departments and job levels to develop certain levels of data literacy. The Harvard Business Review stated: “Companies need more people with the ability to interpret data, to draw insights and to ask the right questions in the first place. These are skills that anyone can develop, and there are now many ways for individuals to upskill themselves and for companies to support them, lift capabilities, and drive change. Indeed, the data itself is clear on this: Data-driven decision-making markedly improves business performance.”


To BYOT & Back Again: How IT Models are Evolving

The growing complexity of IT frameworks is startling. A typical enterprise has upwards of 1,200 cloud services and hundreds of applications running at any given moment. On top of that, employees have their own smartphones, and many use their own routers and laptops. Meanwhile, various departments and groups -- marketing, finance, HR and others -- subscribe to specialized cloud services. The difficulties continue to pile up -- particularly as CIOs look to build out more advanced data and AI frameworks. McKinsey & Company found that between 10% and 20% of IT budgets are devoted to adding more technology in an attempt to modernize the enterprise and pay down technical debt. Yet, part of the problem, it noted, is “undue complexity” and a lack of standards, particularly at large companies that stretch across regions and countries. In many cases, orphaned and balkanized systems, data sprawl, data silos, and complex device management requirements follow. For CIOs seeking simplification and tighter security, the knee-jerk reaction is often to clamp down on choices and options.


IT leadership: What to prioritize for the remainder of 2022

To deliver product-centric value, it’s best to have autonomous, cross-functional teams running an Agile framework. Those teams can include technical practitioners, design thinkers, and business executives. Together, they can increase business growth by as much as 63%, Infosys’ Radar report uncovered. Cross-pollination efforts can spread Agile across the entire enterprise, building credibility and trust among high-level stakeholders toward an iterative process that can deliver meaningful, if incremental, business results. Big-bang rollouts, with a raft of modernizations released in one fell swoop, may seem attractive to management or other stakeholders. But they carry untold risk: developers scrambling to fix bugs after the fact, account teams working to retain disgruntled customers. Approach cautiously, and consider an Agile roadmap of smaller, iterative developments instead of the momentous release. An iterative roadmap also breaks down the considerable task of application modernization into smaller, bite-sized chunks.


How Policy-as-Code Helps Prevent Cloud Misconfigurations

Policy-as-code is a great cloud configuration solution because it eliminates the potential for human error and makes it more difficult for hackers to interfere. Policy compliance is crucial for cloud security, ensuring that every app and piece of code follows the necessary rules and conditions. The easiest way to ensure nothing slips through the cracks is to automate the compliance management process. Policy-as-code is also a good choice in a federated risk management model. A set of common standards are applied across a whole organization, although departments or units retain their own methods and workflows. PaC fits seamlessly into this high-security system by scaling and automating IT policies throughout a company. Preventing cloud misconfiguration relies on effectively ensuring every app and line of code is adhering to an organization’s IT policies. PaC offers some key benefits that make this possible without being a hassle. Policy-as-code improves the visibility of IT policies since everything is clearly defined in code format. 
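A minimal policy-as-code sketch makes the idea concrete: policies are plain data that can be versioned, reviewed, and evaluated automatically against every resource. The rule names below are hypothetical; real deployments would use an engine such as Open Policy Agent rather than this hand-rolled evaluator.

```python
# Minimal policy-as-code sketch. Each policy states a key a resource's
# configuration must have and the value it must equal.
POLICIES = [
    {"id": "no-public-buckets", "key": "public_access", "must_equal": False},
    {"id": "encrypt-at-rest",   "key": "encrypted",     "must_equal": True},
]

def evaluate(resource: dict) -> list[str]:
    """Return the ids of every policy the resource violates."""
    return [
        p["id"]
        for p in POLICIES
        if resource.get(p["key"]) != p["must_equal"]
    ]

bucket = {"name": "logs", "public_access": True, "encrypted": True}
print(evaluate(bucket))   # -> ['no-public-buckets']
```

Because the policies live in code, the same checks run in CI before deployment and on a schedule against live infrastructure, which is how PaC closes the misconfiguration gap described above.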



Quote for the day:

"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward