Daily Tech Digest - October 01, 2022

3 wins and 3 losses for cloud computing

The cloud can successfully provide business agility. I always tell my clients that most enterprises moved to the cloud for the perceived cost savings but stayed for the agility. The cloud gives businesses the ability to turn IT on a dime, and enterprises that move fast in today’s more innovative markets (such as retail, healthcare, and finance) find that the speed at which cloud systems can change is a real force multiplier for the business. The cloud offers industrial-strength reliability. Most who pushed back on cloud computing argued that we would be putting all our eggs in a basket that could prove unreliable. ... The businesses that moved to cloud computing anticipated significant cost savings. Those savings never really materialized, except for completely new businesses with no prior investment in IT. In fact, most enterprises looked at their cloud bills with sticker shock. The primary culprit? Enterprises that did not use cloud FinOps programs to manage cloud costs effectively. Cloud providers also offered pricing and terms that many enterprises did not understand (and many still don’t).

Data storytelling: A key skill for data-driven decision-making

Rudy is a firm believer in letting the data unfold by telling a story, so that when the storyteller finally gets to the punch line, the “so what, do what,” there is full alignment on the message. As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off by saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding). The storyteller should then break it down into five buckets: people, hardware, software, services, and other (more nodding), Rudy says. Then break it down further into technology areas: cloud, security, data center, network, and so on (more nodding). Next, the storyteller reveals that, based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that, compared to competitors of similar size and complexity, the organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.
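The top-down breakdown Rudy describes can be sketched as a quick benchmark check. All figures, area names, and the peer-benchmark numbers below are hypothetical, invented purely to illustrate the "unit cost vs. peers" step of the story:

```python
# Hypothetical IT-benchmark data: annual spend ($M), volume of usage,
# and peer unit costs ($M per unit) per technology area.
spend_by_area = {
    "cloud": 12.0, "security": 9.0, "data center": 15.0, "network": 6.0,
}
usage_by_area = {
    "cloud": 400, "security": 300, "data center": 500, "network": 300,
}
peer_unit_cost = {
    "cloud": 0.031, "security": 0.022, "data center": 0.031, "network": 0.021,
}

def overspend_areas(spend, usage, peers):
    """Return areas whose unit cost exceeds the peer benchmark."""
    flagged = {}
    for area, total in spend.items():
        unit = total / usage[area]          # unit cost at current usage
        if unit > peers[area]:
            flagged[area] = round(unit, 3)
    return flagged

print(overspend_areas(spend_by_area, usage_by_area, peer_unit_cost))
```

With these made-up numbers, only security is flagged, which is exactly the moment in the story where, as Rudy puts it, everyone starts really paying attention.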

Active vs. Passive Network Monitoring: What Should Your Company Use?

Active network monitoring, also known as synthetic network monitoring, releases test traffic onto the network and observes that traffic as it travels through. This traffic is not taken from actual transactions occurring on the network; rather, it is sent through the network so your monitoring solution can examine it along its path. Test traffic usually mimics the typical traffic that flows through your system, so your administrators gain the most relevant insights into your network. ... Passive network monitoring refers to capturing network traffic that flows through a network and analyzing it afterwards. Through a collection method such as log management or network taps, passive monitoring compiles historical network traffic to paint a bigger picture of your company’s network performance. The primary use for passive network monitoring is discovering and predicting performance issues that occur at specific times and in specific areas of your network. ... The question that might be passing through your mind is, “Should my business use active monitoring or passive monitoring for my network performance strategy?”
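The two approaches can be contrasted in a short sketch: the active probe injects a test transaction and times it, while the passive analysis mines latencies out of already-captured traffic logs. The log format, the SLO threshold, and the nearest-rank percentile are all assumptions for illustration:

```python
import math
import re
import time

def active_probe(send_request, slo_ms=200.0):
    """Active/synthetic check: inject one test transaction and time it."""
    start = time.perf_counter()
    send_request()                               # the injected test traffic
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {"latency_ms": elapsed_ms, "slo_met": elapsed_ms <= slo_ms}

# Assumed log format: "<method> <path> <latency>ms"
LOG_LINE = re.compile(r"(?P<path>\S+) (?P<ms>\d+)ms$")

def percentile(values, pct):
    """Nearest-rank percentile over captured samples."""
    ordered = sorted(values)
    idx = max(0, math.ceil(pct / 100 * len(ordered)) - 1)
    return ordered[idx]

def passive_analysis(log_lines):
    """Passive check: analyze latencies from historical traffic logs."""
    latencies = [int(m["ms"]) for line in log_lines
                 if (m := LOG_LINE.search(line))]
    return {"samples": len(latencies), "p95_ms": percentile(latencies, 95)}
```

The active probe answers "is the path healthy right now?", while the passive analysis answers "how has this path behaved over time?", which is why many teams end up running both.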

A New Linux Tool Aims to Guard Against Supply Chain Attacks

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised. With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure,” Chris DeRusha, the US federal chief information security officer and deputy national cyber director, wrote in the White House announcement. “This is not theoretical: Foreign governments and criminal syndicates are regularly seeking ways to compromise our digital infrastructure.” When it comes to Wolfi, Santiago Torres-Arias, a software supply chain researcher at Purdue University, says that developers could accomplish some of the same protections with other Linux distributions, but that it’s a valuable step to see a release that’s been stripped down and purpose-built with supply chain security and validation in mind. “There’s past work, including work done by people who are now at Chainguard, that was kind of the precursor of this train of thought that we need to remove the potentially vulnerable elements and list the software included in a particular container or Linux release,” Torres-Arias says.

Using governance to spur, not stall, data access for analytics

“Without good governance controls, you not only have the policy management risk, but you also risk spending much, much more money than you intend, much faster,” says Barch. “We knew that maximizing the value of our data, especially as the quantity and variety of that data scales, was going to require creating integrated experiences with built-in governance that enabled the various stakeholders involved in activities like publishing data, consuming data, governing data and managing the underlying infrastructure, to all seamlessly work together.” What does this blended approach to data governance look like? For Capital One, it’s what Barch calls “sloped governance.” With a sloped governance approach, you can increase governance and controls around access and security for each level of data. For example, private user spaces, which don’t contain any shared data, can have minimal data governance requirements. As you move further into production, the controls get stricter and take more time to be implemented.
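Barch's "sloped governance" idea, controls tightening as data moves toward production, can be sketched as a simple policy lookup. The level names and controls below are hypothetical illustrations, not Capital One's actual policy:

```python
# Illustrative governance slope: each successive level adds controls.
GOVERNANCE_SLOPE = [
    ("private-user-space", {"access_review": False, "encryption": True,
                            "approval_required": False}),
    ("shared-team-space",  {"access_review": True,  "encryption": True,
                            "approval_required": False}),
    ("production",         {"access_review": True,  "encryption": True,
                            "approval_required": True}),
]

def required_controls(level):
    """Return the controls required at a given data level."""
    for name, controls in GOVERNANCE_SLOPE:
        if name == level:
            return controls
    raise ValueError(f"unknown data level: {level}")
```

Encoding the slope as data rather than scattered conditionals is one way to make the "built-in governance" Barch describes auditable: the policy itself becomes something stakeholders can read and review.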

Microsoft: Hackers are using open source software and fake jobs in phishing attacks

The hacking group has targeted employees in media, defense and aerospace, and IT services in the US, UK, India, and Russia, and was also behind the massive attack on Sony Pictures Entertainment in 2014. Tracked by Microsoft as ZINC and also known as Lazarus, the group was seen by Google Cloud's Mandiant threat analysts spear-phishing targets in the tech and media sectors with bogus job offers in July, using WhatsApp to share a trojanized instance of PuTTY. "Microsoft researchers have observed spear-phishing as a primary tactic of ZINC actors, but they have also been observed using strategic website compromises and social engineering across social media to achieve their objectives," MSTIC notes. "ZINC targets employees of companies it's attempting to infiltrate and seeks to coerce these individuals into installing seemingly benign programs or opening weaponized documents that contain malicious macros. Targeted attacks have also been carried out against security researchers over Twitter and LinkedIn."

New deepfake threats loom, says Microsoft’s chief science officer

In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as “the illusion of talking to a real person. Imagine a scammer calling your grandmom who looks and sounds exactly like you.” Compositional deepfakes, he continued, go further, with a bad actor creating many deepfakes to compile a “synthetic history.” ... The rise of ever-more-sophisticated deepfakes will “raise the bar on expectations and requirements” of journalism and reporting, as well as the need to foster media literacy and raise awareness of these new trends. In addition, new authenticity protocols to confirm identity might be necessary, he added, even new multifactor identification practices for admittance into online meetings. There may also need to be new standards to prove content provenance, including new watermark and fingerprint methods; new regulations and self-regulation; red-team efforts; and continuous monitoring.
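The fingerprinting piece of content provenance can be illustrated with a minimal sketch: the publisher records a cryptographic hash of the original media, and a consumer recomputes it to detect tampering. This is only the simplest building block; real provenance standards such as C2PA add signed metadata, edit histories, and much more:

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Publisher side: compute a content fingerprint for the original media."""
    return hashlib.sha256(media_bytes).hexdigest()

def is_authentic(media_bytes: bytes, published_fingerprint: str) -> bool:
    """Consumer side: recompute the fingerprint and compare."""
    return fingerprint(media_bytes) == published_fingerprint
```

A bare hash only proves bytes are unchanged; it says nothing about who published them, which is why provenance schemes pair fingerprints with digital signatures from the publisher.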

How Does WebAuthn Work?

WebAuthn is quite clever. It leverages the power of public key cryptography to create a way for users to log in to mobile and web applications without those applications having to store any secret information at all. Usually, when one thinks of public key cryptography, one thinks of using it to send a secret message to a person who then decrypts and reads it. Well, this can kind of work in reverse. If you send them a message encrypted with their public key, then they, and only they, can decrypt it, because only they hold the private key that corresponds to the given public key. Once they do, you can be highly confident that they are the entity they claim to be. Currently, all the major browsers, Chrome, Firefox, Edge, and Safari, support the WebAuthn specification. If your phone, iPhone or Android, has a fingerprint reader or facial scanner, it supports WebAuthn. Windows provides WebAuthn support via Windows Hello. All of this translates to passwordless authentication quite nicely.
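In practice, the WebAuthn ceremony uses digital signatures rather than encryption: at registration the authenticator generates a key pair and the server stores only the public key; at login the server sends a random challenge, the device signs it with the private key, and the server verifies the signature. A minimal sketch of that challenge-response idea using Ed25519 from the third-party `cryptography` package (the real ceremony also involves origin checks, attestation objects, and signature counters):

```python
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Registration: the authenticator creates a key pair; the private key never
# leaves the device, and the relying party stores only the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Login: the server issues a random challenge, the device signs it,
# and the server verifies the signature with the stored public key.
challenge = os.urandom(32)
signature = private_key.sign(challenge)

try:
    public_key.verify(signature, challenge)
    authenticated = True
except InvalidSignature:
    authenticated = False
```

Because the server holds only public keys, a database breach leaks nothing an attacker can use to log in, which is the property the article is getting at with "without storing any secret information at all."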

Why developers hold the key to cloud security

APIs drive cloud computing. They eliminate the requirement for a fixed IT architecture in a centralized data center. APIs also mean attackers don’t have to honor the arbitrary boundaries that enterprises erect around the systems and data stores in their on-premises data centers. While identifying and remediating misconfigurations is a priority, it’s essential to understand that misconfigurations are just one means to the ultimate end for attackers: control plane compromise. This has played a central role in every significant cloud breach to date. Empowering developers to find and fix cloud misconfigurations when developing IaC is critical, but it’s equally important to give them the tools they need to design cloud architecture that’s inherently secure against today’s control plane compromise attacks. ... Developers are in the best (and often only) position to secure their code before deployment, maintain its secure integrity while running, and better understand the specific places to provide fixes back in the code. But they’re also human beings prone to mistakes operating in a world of constant experimentation and failure. 
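Finding misconfigurations in IaC before deployment, as the article urges, usually means running policy checks against the parsed template. The resource schema and the two rules below are made up purely to illustrate the shape of such a scanner; real tools evaluate far richer policy sets:

```python
# Toy pre-deployment IaC scanner over a parsed template (illustrative only).
resources = [
    {"type": "storage_bucket", "name": "logs", "public": True},
    {"type": "storage_bucket", "name": "app",  "public": False},
    {"type": "security_group", "name": "ssh",  "open_ports": [22, 80]},
]

# Each rule maps a resource to a (possibly empty) list of findings.
RULES = {
    "storage_bucket": lambda r: ["bucket is public"] if r.get("public") else [],
    "security_group": lambda r: ["port 22 open to all"]
                                if 22 in r.get("open_ports", []) else [],
}

def scan(resources):
    """Return (resource name, issue) pairs for every rule violation."""
    findings = []
    for r in resources:
        for issue in RULES.get(r["type"], lambda _: [])(r):
            findings.append((r["name"], issue))
    return findings
```

Running such checks in CI puts the fix where the article says it belongs: with the developer, in the code, before anything reaches the cloud control plane.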

IT enters the era of intelligent automation

Companies also need to optimize business processes to increase the effectiveness of automation, Nallapati says. “Working together in a partnership, the business unit and the automation teams can leverage their expertise to refine the best approach and way forward to optimize the efficiency of the bot/automation,” she says. Technology leaders should make sure to get business leaders and users involved in the IA process, Ramakrishnan says. “Educate them about the possibilities and collaborate with them in joint problem-solving sessions,” he says. ... “With a large number of customers and a large number of invoices to process every day, any small savings through automation goes a long way in increasing productivity, accuracy, and improving employee and end customer satisfaction,” Ramakrishnan says. Similar to the type of hackathons that are common in IT organizations today, Ramakrishnan says, “we partnered with the business to have a business-side hackathon/ideathon. We educated the key users from the billing team on the possibilities of automation, and then they were encouraged to come back with ideas on automation.”

Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith
