Daily Tech Digest - June 30, 2023

3 things that make a CIO-CFO dream team

A study conducted by Gartner, detailed in the report “CIOs: Improve How You Collaborate With Your CFO,” found that when CFOs are asked how well their most senior IT executive understands the impact of technology on finance, more than half indicate that their IT counterparts are lacking in this area. But surprisingly few companies choose CIOs for their financial skills. “Financial knowledge is not something clients typically ask for when recruiting a CIO,” says Thistle. “However, the CIO will be expected to understand and manage IT costs and budgets, both Capex and Opex.” ... “Even when there is a CFO of IT, the person at the top, the CIO, still needs to understand finance,” he says. “Most CIOs don’t have the benefit of a background in finance. I’ve never met a CIO who went into IT to manage money, yet that’s what they have to do. They have to run IT like a business within a business.” The biggest challenge is not getting a handle on cost, but on value. CIOs can easily show cost on a general ledger. But estimating the future value of technology is more art than science. Investment decisions need to be driven by business outcomes that can be measured and, ideally, monetized.


‘Shadow’ AI use becoming a driver of insider cyber risk

“People don’t need to have malicious intent to cause a data breach,” said Ray. “Most of the time, they are just trying to be more efficient in doing their jobs. But if companies are blind to LLMs accessing their back-end code or sensitive data stores, it’s just a matter of time before it blows up in their faces.” Insider threats are thought to be the underlying reason for almost 60% of data breaches, according to Imperva’s own data, but many are still not properly prioritised by organisations, since a significant number of them are simply cases of human error – a recent study by the firm found that 33% of organisations don’t perceive insiders as a significant threat. Ray said trying to restrict AI usage inside the organisation now was very much a case of shutting the stable door after the horse had bolted. “Forbidding employees from using generative AI is futile,” said Ray. “We’ve seen this with so many other technologies – people are inevitably able to find their way around such restrictions and so prohibitions just create an endless game of whack-a-mole for security teams, without keeping the enterprise meaningfully safer.”


Generative AI may help make 'low-code' more 'no-code' - but with important caveats

AI will ultimately serve "as a way to enable low-code and no-code environments," says Leon Kallikkadan, vice president of technology at Atrium. "I also think that as other partnerships come onboard it will make low-code and no-code more of a possibility. I believe it will be a phased approach whereby as you, the human developer, build it, an AI component will start creating a vision or future step. The long-term possibilities depend on how deep the integration is, but yes, it can go that far to become a low-code, no-code environment." No-code and low-code solutions may be a good fit for non-technical users. "Low code is more geared towards non-coders," says Jesse Reiss, CTO of Hummingbird. "It provides organizations with the ability to reimagine business processes without requiring deep IT expertise. This is crucial for small- to medium-sized businesses, especially during the ongoing labor challenge, where they can be short-staffed or lack the resources to support business operations." Generative AI, by contrast, is more suitable for development work requiring high-level expertise, experts say.


Top Issues Architecture Leaders Need to Address in 2023

Over the next five years, leaders need to be aware that the architect resource shortage will not improve. Resources may be unavailable in the marketplace as you look to refill your bench. Today, there are 10 to 20 open positions for every available architect looking for a job, and the future job market looks bleak. This resource shortage means architecture leaders will need to develop the skills and experience internally, use technology to do more with fewer people, or, most probably, a combination of both. If you’re looking to do more with less or to train new architects, determine now how to maintain the tribal knowledge of your senior architects. ... Most of today’s architects analyze in Excel or in the standalone modeling tools they work in. When architects look at only a minimal set of information, they miss the broad operational data available across the organization, found in systems like CMDBs, CRMs, ERPs, HR solutions, and facility management systems, which capture critical operational data about what’s going on in terms of manufacturing processes, business processes, org structures, costs, and more.


SEC notice to SolarWinds CISO and CFO roils cybersecurity industry

The move by the SEC will make CSOs more individually accountable for cybersecurity, said Agnidipta Sarkar, a former CISO of pharmaceuticals company Biocon. "Though it doesn’t mean that the CISO has been charged, it is a new milestone. From today onwards, CISOs will increasingly be made accountable for the decisions they take or do not take," Sarkar said. However, attributing blame solely to the CISO or CFO might not always be fair or accurate, said Ruby Mishra, CISO at KPMG India. "In order to manage cybersecurity effectively, the organization adopts a multilayered approach involving various stakeholders and departments. Holding the CISO or CFO solely responsible for a cyberattack may overlook the collective responsibility," Mishra said. ... "Before issuing the notice, the SEC may have considered a variety of factors, including the specific circumstances and legal frameworks, or whether the CISO demonstrated negligence by failing to implement adequate security measures, neglecting SEC policies, guidelines, and practices, or ignoring known vulnerabilities," Mishra said.


3 Initiatives CISOs Can Take for Their Security and Resilience Journey

Businesses can help reduce the risk of a data breach by creating the right cyber defense and recovery plans. This comprehensive strategy should include the following: A risk assessment of the IT environment’s threat landscape; An incident response plan that defines in detail the procedures to follow after a breach; A business continuity plan that outlines how to recover from a breach as quickly and gracefully as possible. According to the U.S. Department of Defense, “zero trust” means that organizations should “never trust, always verify” (DOD CIO, 2022). Rather than granting indiscriminate access to applications, devices, and other IT assets, businesses should give users only the resources they need when they need them. In a zero-trust approach, all users, devices, and applications are treated as potentially compromised, with the organization’s defenses locked down accordingly. Techniques may include strict access controls, multifactor authentication (MFA), and monitoring user activities. Certified CISOs should act to define a zero-trust strategy that aligns with the organization’s IT governance and compliance requirements.
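
Zero trust is easiest to reason about as a per-request policy decision rather than a one-time login check. The TypeScript sketch below is a minimal illustration of that idea; the request fields, the entitlement table, and the anomaly threshold are all hypothetical, and a real deployment would delegate each gate to an identity provider and a policy engine.

```typescript
interface AccessRequest {
  userId: string;
  deviceCompliant: boolean;   // e.g., patched and disk-encrypted
  mfaVerified: boolean;
  resource: string;
  anomalyScore: number;       // 0 (normal) to 1 (highly unusual behavior)
}

// Hypothetical per-user entitlements; a real system would query a policy store.
const entitlements: Record<string, string[]> = {
  alice: ["payroll-api"],
  bob: ["build-server"],
};

// Evaluate every request as if it arrived from an untrusted network.
function authorize(req: AccessRequest): boolean {
  if (!req.mfaVerified) return false;        // multifactor authentication
  if (!req.deviceCompliant) return false;    // device posture
  if (req.anomalyScore > 0.8) return false;  // user-activity monitoring
  const allowed = entitlements[req.userId] ?? [];
  return allowed.includes(req.resource);     // least-privilege access control
}
```

Nothing succeeds by default here: every gate must pass on every request, which is the practical meaning of "never trust, always verify."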


Proxmox 8: New Features and Home Lab Upgrade Instructions

Proxmox VE (Proxmox Virtual Environment) is an open-source server virtualization management solution that allows users to manage virtual machines, such as Windows or Linux machines, as well as Linux containers. It’s based on the Debian Linux distribution and combines two virtualization technologies, KVM (Kernel-based Virtual Machine) and LXC (Linux Containers), managed through a web-based interface. The Proxmox platform is used in virtual environments to improve efficiency and ease management tasks. It allows users to deploy, manage, and monitor virtual machines (VMs) and containers, network settings, storage systems, and more, all from a single, integrated platform. Proxmox also provides high-level features like live migration of VMs without downtime, high availability, and automated backups, making it a robust choice for managing virtual environments, whether for small businesses or larger enterprises. Its open-source nature allows for active community involvement and provides a cost-effective solution for virtualization needs.


Secret CSO: Dan Garcia, EnterpriseDB

It’s important to surround yourself with people who are there to support you and push you to be the best that you can be. Having a strong support system is vital. Along the way I had many mentors, some of whom played an important role for where I was at the time. Mandy Andress, who is the CISO at Elastic, provided me with my opportunity within Security Operations at MassMutual, and I’ll always be grateful for that chance. ... Balance. Information security is one of the few areas within the business that cuts through multiple departments, functions, skill sets, and problems that need attention. Finding the balance of where to spend your time and resources can be challenging, but it’s an important thing to get right in order to most effectively solve organisational problems. ... Hiring within information security will always be challenging. We’re not just looking for technical skills, but also an individual’s experience, their past organisations’ security posture, and how those companies approached processes and program structure.


Inside the race to build an ‘operating system’ for generative AI

The operating-system analogy helps to illustrate the magnitude of the change that generative AI is bringing to enterprises. It is not just about adding a new layer of software tools and frameworks on top of existing systems. It is also about giving the system the authority and agency to run its own process, for example, deciding which LLM to use in real time to answer a user’s question, and when to hand off the conversation to a human expert. In other words, an AI managing an AI, according to Intuit’s Srivastava. Finally, it’s about allowing developers to leverage LLMs to rapidly build generative AI applications. This is similar to the way operating systems revolutionized computing by abstracting away the low-level details and enabling users to perform complex tasks with ease. Enterprises need to do the same for generative AI app development. Microsoft CEO Satya Nadella recently compared this transition to the shift from steam engines to electric power. “You couldn’t just put the electric motor where the steam engine was and leave everything else the same, you had to rewire the entire factory,” he told Wired.
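
A minimal sketch of that "AI managing an AI" routing layer might look like the following TypeScript; the model names, thresholds, and query profile are invented for illustration, and in practice the profile would itself come from a classifier.

```typescript
type Route = { model: string } | { handoff: "human-expert" };

// Hypothetical classifier output describing an incoming user question.
interface QueryProfile {
  complexity: number;   // 0..1, estimated reasoning depth required
  domainRisk: number;   // 0..1, e.g., tax or legal questions score high
}

// Pick a backend per request: a cheap model for easy queries, a large
// model for hard ones, and a human expert when the stakes are too high.
function route(profile: QueryProfile): Route {
  if (profile.domainRisk > 0.9) return { handoff: "human-expert" };
  if (profile.complexity < 0.3) return { model: "small-fast-llm" };
  return { model: "large-reasoning-llm" };
}
```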


A Perfect Wave: Event Driven Business Architecture

In traditional IT, data used to be hidden behind fortified castle walls. Access was difficult, and the main purpose was to store the data securely. This is changing. Modern IT has started to act as a nervous system, ensuring that data is made available as soon as possible wherever it is needed, and that it can be used immediately to gain an advantage based on fully up-to-date information. Let’s take a quick look at three customer quotes that describe very well why customers move to Event Driven Business Architecture: “We need to move at the speed of business,” says Scott, IT, Fortune 500 customer, translating to: everything has become so much faster and we need to be able to support our business. “We want our ERP to be a team player,” says Derrick, Fortune 500 customer, translating to: player skills don’t just add up in a team sport, they multiply. This is why your ERP talking to your SuccessFactors talking to your Ariba in real time is so important. It adds lots of value. “It’s a sin,” says Alex, Automotive Supplier, translating to: it is a sin not to use your business data. Don’t just hide it and lock it away so that nobody can use it, as is still too often done.



Quote for the day:

"There is no substitute for knowledge." -- W. Edwards Deming

Daily Tech Digest - June 29, 2023

How the new deepfake reality will impact cyber insurance

Deepfakes can ruin a company's reputation, bypass biometric controls, phish unsuspecting users into clicking malicious links, and convince financial agents to transfer money to offshore accounts. Attacks leveraging deepfakes can happen over many channels, from social media to fake person-to-person video calls over Zoom. Voicemail, Slack channels, email, mobile messaging, and metaverses are all fair game for distributing deepfake scams to businesses and personal users. Cyber liability insurers are beginning to take notice, and as they do, their security requirements are beginning to adjust to the new 'fake' reality. This includes, but is not limited to, better hygiene across the enterprise, renewed focus on home worker systems, enforced multifactor authentication, out-of-band confirmation to avoid falling for deepfake phishing attempts, user and partner education, and third-party context-based verification services or tools. ... For the most part, organizations will need to focus on requirements that are in their cyber insurance policies.


Confronting Financial Fraud in Payments with the Help of AI

AI is being considered for fraud protection efforts in different ways by these organizations. Schmiedl said JPMorgan Chase has evolved from algorithms to machine learning and neural nets to look at fraudulent card activity, examine unstructured data, and extract entities. “There’s an inherent signal in every email,” he said. “Actors that are trying to create fraudulent emails tend to basically use different patterns and you can learn those patterns through AI/ML.” JPMorgan Chase is assessing the use of large language models, Schmiedl said, for fraud, risk, and other possible areas. Such efforts have focused on in-house data and resources, he said, concentrating on the firm's own ecosystem rather than looking externally. “If you start using these models and outside data, you start to see things that are presented like facts that aren’t facts,” Schmiedl said. Swift is building a new AI platform, Bhatia said, with tech players such as Google, Microsoft, and others. “We really believe that this is going to help us add on to the rule-based engines that we already have today and really bring a higher success rate in helping with fraud,” she said.


Recovery options: Copy-on write vs redirect-on-write snapshots

Consider a copy-on-write system, which duplicates blocks before overwriting them with new data. In essence, when a block within the protected entity needs to be changed, the system copies that block to a separate snapshot area before it is overwritten. This approach uses three I/O operations for each write: one read and two writes. Prior to overwriting a block, its previous value must be read and written to a different location, followed by the write of the new data. Should a process attempt to access the snapshot at a later time, it does so through the snapshot system, which is aware of which blocks have been changed since the snapshot was created. ... In contrast, a redirect-on-write system utilizes pointers to represent all protected entities. When a block needs to be changed, the storage system simply redirects the pointer associated with that block to another block and writes the data there. The snapshot system maintains a record of all block locations constituting a given snapshot, which is essentially a list of pointers that correspond to the block locations.
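
The bookkeeping difference between the two approaches is easier to see in code. The toy TypeScript model below is a sketch, not a storage implementation: the copy-on-write path performs a read plus two writes per changed block, while the redirect-on-write path performs a single write and a pointer update, and a snapshot is just a copy of the pointer list.

```typescript
// Copy-on-write: before overwriting a block, copy its old value aside.
class CopyOnWriteVolume {
  constructor(
    private blocks: string[],
    private snapshotArea = new Map<number, string>()
  ) {}

  write(index: number, data: string): void {
    if (!this.snapshotArea.has(index)) {
      // Read the old value and write it to the snapshot area...
      this.snapshotArea.set(index, this.blocks[index]);
    }
    this.blocks[index] = data; // ...then write the new data: 1 read + 2 writes.
  }

  readSnapshot(index: number): string {
    // Unchanged blocks are served from the live volume.
    return this.snapshotArea.get(index) ?? this.blocks[index];
  }
}

// Redirect-on-write: never overwrite; write elsewhere and move a pointer.
class RedirectOnWriteVolume {
  private store: string[];
  private live: number[]; // pointer table for the live volume

  constructor(blocks: string[]) {
    this.store = [...blocks];
    this.live = blocks.map((_, i) => i);
  }

  snapshot(): number[] {
    return [...this.live]; // a snapshot is a copy of the pointer list
  }

  write(index: number, data: string): void {
    this.store.push(data); // one write to a fresh location, no prior read
    this.live[index] = this.store.length - 1;
  }

  read(pointers: number[], index: number): string {
    return this.store[pointers[index]]; // works for live or snapshot pointers
  }
}
```

Reading an old snapshot through the redirect-on-write volume is just a lookup through the saved pointer list, which is why snapshot creation is nearly free in this model.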


Five Ways AI is Likely to Change How Organizations Approach Information Risk and Security

Many security events and incidents are the result of insecure application code, applications that are misconfigured or applications that have been manipulated by adversaries and used as part of their attack activities. The volume of security-related software patches and updates that are produced by application vendors on an ongoing basis has provided clear evidence that current approaches to application security must be enhanced to be effective. AI is likely to accelerate these enhancements by integrating application security-based LLMs into application development and security testing and protective tools such as static application security testing (SAST) and dynamic application security testing (DAST), software composition analysis (SCA), web application firewalls (WAFs), application programming interface (API) security gateways, and quality assurance and penetration testing. These LLMs can ensure that application source code and running applications are tested against—and are resilient to—variations and permutations of known and expected attacker methods and tactics in a highly efficient and risk-based testing environment.


Svelte vs Angular: Pros and Cons of Modern Web Development

Introduced to the world in 2016, Svelte emerged as the unlikely hero in the tangled saga of JavaScript frameworks. Its mission? To revolutionize the way we think about reactivity in web apps. Svelte has a sort of "wax on, wax off" philosophy: rather than doing all the heavy lifting in the browser, it does its magic in the build step. While other frameworks are trying to build a luxurious skyscraper complete with a rooftop pool and a helipad, Svelte is content with constructing a cozy, energy-efficient home that fulfills all your needs. In the fast-paced, ever-changing universe of web development, sometimes less is more. Think of Svelte as a stealthy web ninja – it's lightweight, fast, and packs a powerful punch. It's reactive - change the state, and the DOM updates automatically. It’s like having a little elf inside your code, waiting patiently to sweep away any unnecessary work. ... Don't just take my word for it - look around! Angular is powering everything from IBM's online support pages to Delta Airlines' booking platform. It's as versatile as it is powerful, and it's up for whatever challenge you're ready to throw its way.


State of the API: Microservices Gone Macro and Zombie APIs

Engineers and developers ranked zombie APIs as a higher concern than executives did; executives placed “loss of institutional memory” as slightly more concerning than loss of maintenance, aka zombie APIs. ... “That’s the emergence of zombie APIs, because a lot of institutional knowledge lies with the people who built it,” Sobti told The New Stack. “Once the people transition out, the change management is complex, and that’s where cataloging your APIs, internal APIs in particular, becomes very critical.” API catalogs can keep track of internal APIs in one place, he added. There are dedicated teams that are now responsible not just for building the underlying infrastructure that allows the catalogs to exist, but also for managing the catalog and creating the practices for getting those APIs into the catalogs. That is where reuse becomes critical, he added. As further proof of the need for better documentation, the survey found that a lack of documentation was cited as the primary obstacle to consuming an API.


Taking IT outsourcing to the next level

When two parties enter a complex IT outsourcing deal, they need to work collaboratively, communicate effectively, and build trust. This is where relational contracts come in. Unlike transactional contracts, which focus on legal obligations and penalties, relational contracts emphasize collaboration, communication, and problem-solving by specifying mutual goals and establishing governance structures to keep the parties’ expectations and interests aligned over the long term. Formally, a relational contract is defined as “a legally enforceable written contract establishing a commercial partnership within a flexible contractual framework based on social norms and jointly defined objectives, prioritizing a relationship with the continuous alignment of interests before the commercial transactions.” Complex relationships in which it is impossible to predict every what-if scenario are tailor-made for relational contracts. Large IT outsourcing projects provide a strong example of this, due to the technical complexity of the work and the number of stakeholders involved.


AI-led business processes – getting the balance right between business impact and staff satisfaction

While seen only in parts today, AI is showing signs of potential scale that would create larger impacts on business processes. In addition, businesses across multiple industries are predicted to focus more on the value that can only be added by human employees. According to research from Boston Consulting Group, just 30 per cent of AI investment is spent on algorithms and technologies, while the remaining 70 per cent goes towards embedding AI into business processes and agile ways of working. ... The expertise within AIM Reply, alongside that of its partner companies, has been vital in helping organisations across retail, consumer packaged goods, manufacturing, logistics, financial services and insurance drive value from evolving AI capabilities. Focused on serving as a boutique for AI and hyperautomation platforms and solutions, the company encourages its clients to adopt an end-to-end approach that focuses on business goals, enabling the business to make informed decisions based on data and business benefits, rather than use cases.


AI requirements exceed infrastructure capabilities for many IT teams, study finds

Companies lacking the proper hardware to do AI training have two options: make a massive investment in hardware or turn to cloud service providers for AI-as-a-service, which most of the top cloud service providers now offer. Rather than make the million-dollar investment in hardware, an enterprise could upload the data to be processed and let the cloud service provider do the heavy lifting. The enterprise could take the trained data models back when the processing is done. Customers often will opt for end-to-end solutions from AI vendors in the cloud, especially initially, “because they make it easy for the customers with a simple button,” Voruganti said. But variable cloud costs – which enterprises incur with each read or write to cloud-based data, or with every data extraction, for example – may cause IT teams to reconsider that approach. Voruganti said he’s seeing companies choose to place foundation models with different cloud service providers based on their areas of expertise.


Risk Management of Human and Machine Identity in a Zero Trust Security Context

While humans and their associated accounts are often the primary targets of security measures, they merely represent the activity of the machines they interact with. In a Zero Trust deployment, embracing the concept of "machine as proxy human" becomes crucial. This approach allows organizations to apply security rules and surveillance to all devices, treating each as if a malicious human were operating behind it. By considering machines as proxy humans within the context of Zero Trust, organizations can extend security measures to encompass all devices and systems within their environment. This includes user devices, servers, IoT devices, and other interconnected components. By treating machines as potential threat actors, organizations can enforce strict access controls, apply behavioral analytics, and continuously monitor for suspicious activities or deviations from expected behavior. This shift in mindset enables organizations to proactively detect and respond to potential security threats, regardless of whether they originate from human actors or compromised machines.



Quote for the day:

"Leadership is absolutely about inspiring action, but it is also about guarding against mis-action." -- Simon Sinek

Daily Tech Digest - June 28, 2023

The digital future may rely on ultrafast optical electronics and computers

Our system of ultrafast optical data transmission is based on light rather than voltage. Our research group is one of many working with optical communication at the transistor level – the building blocks of modern processors – to get around the current limitations with silicon. Our system controls reflected light to transmit information. When light shines on a piece of glass, most of it passes through, though a little bit might reflect. That is what you experience as glare when driving toward sunlight or looking through a window. We use two laser beams transmitted from two sources passing through the same piece of glass. One beam is constant, but its transmission through the glass is controlled by the second beam. By using the second beam to shift the properties of the glass from transparent to reflective, we can start and stop the transmission of the constant beam, switching the optical signal from on to off and back again very quickly. With this method, we can switch the glass properties much more quickly than current systems can send electrons. So we can send many more on and off signals – zeros and ones – in less time.


SEO Poisoning Attacks on Healthcare Sector Rising, HHS Warns

Some threat actors also use targeted types of SEO poisoning, including spear-phishing, to go after specific users, such as IT administrators and other privileged users. "The technique enables attackers to target and customize their attacks to specific audiences, making them more challenging to identify and defend against," HHS HC3 wrote. Common SEO poisoning methods also include typosquatting, which targets users who might open their browser and input a website address that has an inadvertent typo or click on a link with a misspelled URL, HHS HC3 said. Attackers often register domain names that are similar to legitimate ones but contain minor spelling errors. ... Other tactics include cloaking, which involves presenting search engine crawlers with different material than what is shown to the user who clicks the link; manipulating search rankings by artificially inflating a website's click-through rate; and using private link networks, which involves connecting a group of unrelated websites to create a network of backlinks to a main website.
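
Typosquat detection often boils down to measuring how close a candidate domain is to a protected one. The sketch below uses the classic Levenshtein edit distance against a hypothetical brand list; production tools also check homoglyphs, inserted hyphens, and alternate top-level domains.

```typescript
// Levenshtein edit distance: minimum single-character edits to turn a into b.
function editDistance(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

const protectedDomains = ["example.com"]; // hypothetical brand list

// Flag domains within one edit of a protected name, e.g., "examp1e.com".
function looksTyposquatted(domain: string): boolean {
  return protectedDomains.some(
    (d) => d !== domain && editDistance(d, domain) <= 1
  );
}
```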


Democratizing data to fuel data-driven business decisions

Gone are the days when all an organization’s data lived in just one place. Business and technical users alike must be able to fully leverage data that spans cloud, distributed, and mainframe infrastructures. Because of this, effective data intelligence tools are those that can be applied at scale and support various technology connections to successfully plug and play into large organizations’ complex environments. This is increasingly critical as a growing number of organizations turn to hybrid solutions to leverage the benefits of both the cloud and the mainframe. According to a recent Rocket Software survey, an overwhelming 93% of respondents strongly agree with the statement, “I believe my organization needs to embrace a hybrid infrastructure model that spans from mainframe to cloud.” Today’s data intelligence tools need to remove the barriers that prevent organizations from leveraging their data assets to the fullest. From being able to find and use the right data to increasing data use and trust, maintaining a competitive edge requires organizations to leverage trusted data to make informed and strategic business decisions.


I'm done with Red Hat (Enterprise Linux)

This past week, Red Hat took that knife and twisted it hard, when they published this blog post. Let there be no mistake: this was meant to destroy the distributions the community built to replace what Red Hat took away. There were only two things that kept me around after Red Hat betrayed us the first time: First, instead of attacking the community of open source users, many Red Hatters reached out and asked, "How can we do better?" It didn't heal the wound, but it meant something, knowing someone at Red Hat would at least listen. Second—and more importantly—Rocky Linux and AlmaLinux stepped in. They prevented a mass-exodus from the Red Hat ecosystem, giving developers like me a stable target for my open source work. But Rocky and Alma relied on Red Hat sharing their source code. Here's how it used to work: Red Hat would grab a copy of Linux; they would add magic sauce that makes it Red Hat Enterprise Linux; they would release a new version; and they would update a source code repository with all the data required to build it from scratch.


What’s Next for JavaScript: New Features to Look Forward to

TypeScript was developed to make JavaScript developers more productive, rather than to replace JavaScript, but it’s also been a source of improvements to the language. Currently, you use TypeScript to make types explicit in your code while you’re writing it — but then you remove them when your code runs. Still some way off, the stage 1 Type Annotations proposal, which would allow type information to be included in JavaScript code but have it treated as comments by JavaScript engines, is important because it converges TypeScript and JavaScript for consistency in a way that keeps them aligned, but also makes it clear that they’re working at different layers. Developers can use first-class syntax for types, whether that’s TypeScript or Flow syntax, rather than long JSDoc comment blocks, and know that their code is still compatible with JavaScript engines and JavaScript tooling — avoiding the complexity of needing a build step to erase the types before their code will run, Palmer pointed out. “There’s huge value just in having static types that only exist during development and are fully erased during runtime,” he explained.
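
A small example makes the proposal concrete. The function below is valid TypeScript today; under the Type Annotations proposal, a JavaScript engine would skip the annotation positions the same way it skips comments, so source like this could run directly without a build step.

```typescript
// Annotated source: the types guide editors and type checkers at development time...
function totalPrice(items: { price: number }[], taxRate: number): number {
  return items.reduce((sum, item) => sum + item.price, 0) * (1 + taxRate);
}

// ...but carry no runtime behavior. After type erasure, the engine
// effectively runs the untyped equivalent:
//
//   function totalPrice(items, taxRate) {
//     return items.reduce((sum, item) => sum + item.price, 0) * (1 + taxRate);
//   }
```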


Human brain-inspired computer memory design set to increase energy efficiency and performance

The researchers focused on hafnium oxide, an insulating material commonly used in the semiconductor industry. However, there was one significant obstacle to overcome: hafnium oxide lacks structure at the atomic level, making it unsuitable for memory applications. But the team found an ingenious solution by introducing barium into thin films of hafnium oxide, resulting in the formation of unique structures within the composite material. These novel structures, known as vertical barium-rich "bridges," allowed electrons to pass through while the surrounding hafnium oxide remained unstructured. At the points where these bridges met the device contacts, an adjustable energy barrier was created. This barrier influenced the electrical resistance of the composite material and enabled multiple states to exist within it. ... One remarkable aspect of this breakthrough is that the hafnium oxide composites are self-assembled at low temperatures, unlike other composite materials that require expensive high-temperature manufacturing methods.


Enhancing Security With Data Management Software Solutions

Implementing a data management platform offers a multitude of advantages. Foremost, it ensures heightened data integrity, furnishing dependable and accurate information crucial for making well-informed decisions. Furthermore, it mitigates the risk of data breaches, shielding the reputation of your business and preserving the trust of your customers. Lastly, it streamlines regulatory compliance, which proves invaluable considering the stringent data regulations prevalent in numerous jurisdictions. This feature serves as a lifeline, aiding in the avoidance of potential legal entanglements and financial repercussions that may arise from non-compliance. By embracing a comprehensive data management platform, businesses can enjoy the assurance of data accuracy, fortify their security measures, and navigate the complex landscape of regulatory requirements with ease, ultimately fostering growth, resilience, and long-term success. When you are confronted with a myriad of data management solutions, selecting the ideal one for your business requires careful consideration.


Wi-Fi 7 is coming — here's what to know

Here's how the Wi-Fi Alliance explains the upcoming standard: “Based on the developing IEEE 802.11be standard, Wi-Fi 7 will be the next major generational Wi-Fi technology evolution. Wi-Fi 7 focuses on physical (PHY) and medium access control (MAC) improvements capable of supporting a maximum throughput of at least 30Gbps to increase performance, enable Wi-Fi innovations, and expand use cases. Additional Wi-Fi 7 enhancements will support reduced latency and jitter for time sensitive networking applications including AR/VR, 4K and 8K video streaming, automotive, cloud computing, gaming, and video applications, as well as mission critical and industrial applications. As with other Wi-Fi generations, Wi-Fi 7 will be backward compatible and coexist with legacy devices in the 2.4, 5, and 6 GHz spectrum bands.” The alliance promises peak data rates of 46Gbps, which is almost four times faster than Wi-Fi 6 (802.11ax) and 6E and five times faster than Wi-Fi 5 (802.11ac). Wi-Fi 7 is also known as IEEE 802.11be Extremely High Throughput (EHT). It works in the 2.4GHz, 5GHz, and 6GHz bands.


Hackers Targeting Linux and IoT Devices for Cryptomining

These bots are also instructed to download and execute additional scripts to brute-force every host in the hacked device's subnet and backdoor any vulnerable systems using the Trojanized OpenSSH package. The bots' purpose is to maintain persistence and deploy mining malware crafted for Hiveon OS systems, which are Linux-based open-source operating systems designed for cryptomining. Microsoft attributed the campaign to a user named "asterzeu" on the cardingforum.cx hacking forum. The user offered multiple tools for sale on the platform, including an SSH backdoor, Microsoft said. Microsoft's disclosure comes two days after a report on a similar campaign was published by the AhnLab Security Emergency Response Center. The attack campaign consists of the Tsunami - another name for Kaiten - DDoS bot being installed on inadequately managed Linux SSH servers, the report said. As observed in Microsoft's analysis, Tsunami also installed various other malware, cryptominers, and obfuscation tools, such as ShellBot, XMRig CoinMiner, and Log Cleaner.


Most popular generative AI projects on GitHub are the least secure

The OpenSSF Scorecard is a tool created by the OpenSSF to assess the security of open-source projects and help improve them. The metrics it bases the assessment on are facts about the repository, such as the number of vulnerabilities it has, how often it's maintained, and whether it contains binary files. By running Scorecard on a project, different parts of its software supply chain are checked, including the source code, build dependencies, testing, and project maintenance. The purpose of the checks is to ensure adherence to security best practices and industry standards. Each check has a risk level associated with it, representing the estimated risk of not adhering to a specific best practice. Individual check scores are then compiled into a single aggregate score to gauge the overall security posture of a project. Currently, there are 18 checks, divided into three themes: holistic security practices, source code risk assessment, and build process risk assessment. The Scorecard assigns a score on an ordinal scale from 0 to 10, along with a risk level, for each check.
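
As an illustration of how per-check results might roll up into one number, here is a small risk-weighted averaging sketch. The weights are hypothetical, not Scorecard's actual coefficients; the point is only that higher-risk checks pull harder on the final 0-to-10 score.

```typescript
type RiskLevel = "critical" | "high" | "medium" | "low";

interface CheckResult {
  name: string;
  score: number; // 0..10, as Scorecard reports per check
  risk: RiskLevel;
}

// Hypothetical weights: riskier checks count more toward the aggregate.
const weight: Record<RiskLevel, number> = {
  critical: 10,
  high: 7.5,
  medium: 5,
  low: 2.5,
};

function aggregateScore(results: CheckResult[]): number {
  const totalWeight = results.reduce((sum, r) => sum + weight[r.risk], 0);
  const weighted = results.reduce((sum, r) => sum + r.score * weight[r.risk], 0);
  return totalWeight === 0 ? 0 : weighted / totalWeight; // still on the 0..10 scale
}

// Example: a rarely maintained repo that does pin its dependencies.
aggregateScore([
  { name: "Maintained", score: 2, risk: "high" },
  { name: "Pinned-Dependencies", score: 9, risk: "medium" },
]); // ≈ 4.8
```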



Quote for the day:

"Becoming a leader is synonymous with becoming yourself. It is precisely that simple, and it is also that difficult." -- Warren G. Bennis

Daily Tech Digest - June 27, 2023

The unhappy reality of cloud security in 2023

Configuration problems are often the most significant risk to cloud data and the most often overlooked. Show me a breach, and I’ll show you something stupid that allowed it to happen. One recent example is a large car manufacturer that had more than two million customers’ data exposed due to misconfigurations in its cloud storage systems. Rarely are properly configured security systems bypassed to gain access to data. More often, storage systems are left exposed or databases lack encryption. ... Not only are APIs provided by the cloud vendors, they are also built into business applications. They provide “keys to the kingdom” and are often left as open access points to business data. Other emerging threats include the use of generative AI systems to automate fakery. As I covered here, these AI-driven attacks are occurring now. As bad actors get better at leveraging AI systems (often free cloud services), we’ll see automated attacks that can work around even the most sophisticated security systems. It will be tough to keep up with the new and innovative ways attacks can occur.


How Automation Enables Remote Work

In a remote work setup, effective communication is paramount. Automation tools such as Slack and Microsoft Teams facilitate better communication by automating tasks like scheduling meetings, sending reminders and translating messages. These tools can also automate the process of organizing and archiving conversations, making it easier to retrieve information when needed. Additionally, they can automate the process of updating team members about project changes or important announcements. These features ensure that all team members are on the same page, enhancing collaboration, reducing the chances of miscommunication and ultimately leading to a more cohesive and efficient team. Automation in human resources (HR) is a game-changer in remote work settings. HR automation software can streamline recruitment, automating resume sorting, interview scheduling and follow-up emails. It can also enhance onboarding, automating welcome emails and account setups. Performance management can be improved with automated feedback collection and goal tracking. 


Self-healing code is the future of software development

It’s easy to imagine a more iterative process that would tap into the power of multi-step prompting and chain-of-thought reasoning, techniques that research has shown can vastly improve the quality and accuracy of an LLM’s output. An AI system might review a question, suggest tweaks to the title for legibility, and offer ideas for how to better format code in the body of the question, plus a few extra tags at the end to improve categorization. Another system, the reviewer, would take a look at the updated question and assign it a score. If it passes a certain threshold, it can be returned to the user for review. If it doesn’t, the system takes another pass, improving on its earlier suggestions and then resubmitting its output for approval. We are lucky to be able to work with colleagues at Prosus, many of whom have decades of experience in the field of machine learning. I chatted recently with Zulkuf Genc, Head of Data Science at Prosus AI. He has focused on Natural Language Processing (NLP) in the past, co-developing an LLM-based model to analyze financial sentiment, FinBert, that remains one of the most popular models at HuggingFace in its category.
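
That generate, review, and resubmit loop is straightforward to express in code. In the TypeScript sketch below, suggestRevision and scoreQuality are stubs standing in for LLM calls; the loop structure, score threshold, and pass limit are the parts being illustrated.

```typescript
// Stub for an LLM call that rewrites a draft given reviewer feedback.
async function suggestRevision(draft: string, feedback: string): Promise<string> {
  return `${draft}\n<!-- revised per: ${feedback} -->`; // placeholder
}

// Stub for a second LLM acting as the reviewer: a score plus a critique.
async function scoreQuality(draft: string): Promise<{ score: number; feedback: string }> {
  return { score: Math.min(1, draft.length / 500), feedback: "Add a code sample." }; // placeholder
}

// Revise until the reviewer's score clears the threshold, or give up
// after maxPasses and return the best attempt so far.
async function refine(question: string, threshold = 0.8, maxPasses = 3): Promise<string> {
  let draft = question;
  let feedback = "Improve title legibility, code formatting, and tags.";
  for (let pass = 0; pass < maxPasses; pass++) {
    draft = await suggestRevision(draft, feedback);
    const review = await scoreQuality(draft);
    if (review.score >= threshold) return draft; // passes: return to the user for review
    feedback = review.feedback;                  // fails: improve and resubmit
  }
  return draft;
}
```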


Why an ecosystem helps IT projects move forward

To support the data strategy set by the company’s chief data officer, the team needed to specify the capabilities required from a data platform in line with the company’s tech strategy, which is about being cloud-first. Stuart Toll, senior enterprise architect at LGIM, said that time to market, integration time, and skills were among the criteria used to assess the data platform providers. For Toll, while LGIM could probably have made any data platform work, “we are an asset management firm,” he said. “We buy where we can and only build to differentiate.” This influenced the company’s data integration strategy. LGIM did not want to be in the business of stitching lots of tools together, as Matt Bannock, head of data engineering at LGIM, explained. ... Bannock said that with some tools, IT departments need to spend time on data integration. “Being able to just start working with the data, start running the calculation and start generating the output is much more valuable to us than the potential half a percent advantage we could achieve if we created our own ecosystem,” he said. “There’s a lot of benefit in buying into an ecosystem.”


Key Considerations When Hiring a Chief Information Security Officer

Look for candidates who possess a deep understanding of cybersecurity technologies, risk management frameworks, and regulatory compliance. Experience in managing security incidents, implementing security controls, and developing effective security strategies is also crucial. ... A CISO must understand the business landscape in which the organization operates. They should align security objectives with overall business goals and demonstrate a keen understanding of the organization’s risk appetite. A CISO with business acumen can effectively prioritize security investments, articulate the value of security measures to executive management, and build a security program that supports the organization’s strategic objectives. ... The field of cybersecurity is ever-evolving, with new threats emerging regularly. It is crucial for a CISO to stay up-to-date with the latest trends, technologies, and best practices in information security. Look for candidates who demonstrate a commitment to continuous learning, involvement in industry forums, and participation in relevant certifications and conferences.


10 things every CISO needs to know about identity and access management (IAM)

CISOs must consider how to move away from passwords and adopt a zero-trust approach to identity security. Gartner predicts that by 2025, 50% of the workforce and 20% of customer authentication transactions will be passwordless. ... Identity threat detection and response (ITDR) tools reduce risks and can improve and harden security configurations continually. They can also find and fix configuration vulnerabilities in the IAM infrastructure; detect attacks; and recommend fixes. By deploying ITDR to protect IAM systems and repositories, including Active Directory (AD), enterprises are improving their security postures and reducing the risk of an IAM infrastructure breach. ... Attackers are using generative AI to sharpen their attacks on the gaps between IAM, PAM and endpoints. CrowdStrike’s Sentonas says his company continues to focus on this area, seeing it as central to the future of endpoint security. Ninety-eight percent of enterprises confirmed that the number of identities they manage is exponentially increasing, and 84% of enterprises have been victims of an identity-related breach.


Decentralized Storage: The Path to Ultimate Data Privacy and Ownership

A move towards decentralization opens up a significant possibility for individual users: monetization. Data sovereignty would allow users to monetize their data and available storage space. Contributing towards the network storage would allow users to earn passive income purely from allowing other users to store data on their drives. This could be an alarming concept for users at first. But the realization that only you can access your data on the network using your key, regardless of which node it is stored in, should significantly help overcome this fear. Decentralization also has important implications for businesses and organizations. For example, companies can reduce the risks associated with data breaches and protect customer information more effectively, allowing for more trust with customers in the long term. Organizations could also contribute to network storage on a larger scale, allowing for new economic opportunities.


Too Much JavaScript? Why the Frontend Needs to Build Better

Often, it boils down to one common problem: too much client-side JavaScript. This is not a cost-free error. One retailer realized they were losing $700,000 a year per kilobyte of JavaScript, Russell said. “You may be losing all of the users who don’t have those devices because the experience is so bad,” he said. That doesn’t mean developers are wrong to ship client-side JavaScript, which is why Russell hates to be prescriptive about how to handle the problem. Sometimes it makes sense, depending on the data model and whether it has to live on the client so that you can access the next email (think Gmail) or paint the next operation quickly (think Figma). That usage tends to correlate with very long sessions, he said. But developers should realize, too, that some frameworks prioritize this approach. “The premise of something like React, the premise of something like Angular, is that [the] data model is local,” he said. “So if that premise doesn’t meet the use case, then those tools just fundamentally don’t make sense,” he said. “You really do have to shoehorn them in for some kind of an exogenous reason, then you hope that it plays out.”


The hardest part of building software is not coding, it’s requirements

Is the idea behind using AI to create software just to let those same stakeholders talk directly to a computer to create an SMS-based survey? Is AI going to ask probing questions about how to handle all the possible issues of collecting survey data via SMS? Is it going to account for all the things that we as human beings might do incorrectly along the way, and how to handle those missteps? In order to produce a functional piece of software from AI, you need to know what you want and be able to clearly and precisely define it. There are times when I’m writing software just for myself where I don’t realize some of the difficulties and challenges until I actually start writing code. Over the past decade, the software industry has transitioned from the waterfall methodology to agile. ... So many software projects using waterfall have failed because the stakeholders thought they knew what they wanted and thought they could accurately describe it and document it, only to be very disappointed when the final product was delivered. Agile software development is supposed to be an antidote to this process.


Beyond Backups: Evolving Strategies for Data Management and Security

As businesses continue to generate more data, the need to revamp data management services, including the implementation of effective data backup and recovery strategies, has become central. Comprehensive data backup continues to evolve, and AI and ML have become potent tools in this field, revolutionizing the way organizations approach data backup and recovery. In 2023, we will witness an increase in the adoption of AI/ML technologies such as self-monitoring and management of IT assets, and automation and orchestration of IT activities across on-premises and cloud environments. AI will play an increasingly important role, both for malicious purposes and in building more proactive and pre-emptive strategies. To enable a competitive advantage for all our customers, our data protection capabilities are fuelled by AI/ML at their core. Our self-driving backup uses AI and ML to automate backup and recovery operations and management, including setup, monitoring, deep visibility, real-time insights, and service level agreement (SLA) tracking.



Quote for the day:

"The problem with being a leader is that you're never sure if you're being followed or chased." -- Claire A. Murray

Daily Tech Digest - June 26, 2023

Generative AI Gets a SaaSy Touch

Although pure-play Generative AI startups (Rephrase.ai, Blend, ProbeAI, and Fasthr.AI, to name a few) are not that common yet in India, the technology has set foot in many industries such as healthcare, music and art, finance, advertising and marketing, and gaming and entertainment, among others. This has opened up huge opportunities for the SaaS (software as a service) sector. Tapping into the opportunity are India's leading SaaS companies, who are following in the footsteps of global software giants like SAP, Salesforce, and IBM and investing in Generative AI like never before. ... Other prominent SaaS companies are also investing heavily in Generative AI. Freshworks recently unveiled Freddy Self-Service, Freddy Copilot and Freddy Insights to make AI more accessible to every workplace. The new predictive and assistive generative AI capabilities embedded within Freshworks solutions and platform are said to go beyond content generation and help support agents, sellers, marketers, IT teams and leaders become more efficient with a revolutionary new way to interact with their business software.


When low-code and no-code can accelerate app modernization

“Software architecture requires broadening our perspective and considering the larger system, a wider range of possibilities, and a longer time scale,” says Andrew Davis, senior director of methodology at Copado. “Legacy applications are living proof that a significant portion of the cost of software is in its long-term maintenance. Low-code and no-code applications are designed to reduce complexity and thus increase maintainability over time.” Low-code or no-code platforms can help accelerate application modernization, but not in every case. You need a good match between the application's business requirements, user experience, data, architecture, integrations, security needs, compliance factors, and reporting on the one hand, and a low-code or no-code platform’s capabilities on the other. ... Some types of applications and use cases are better candidates for low-code and no-code. Applications used for departmental business processes such as approval workflows, onboarding, content management, work queues, request processing, budget management, and asset tracking are high on the list.


The CISO’s toolkit must include political capital within the C-suite

Where a security leader sits in a company’s pecking order or to whom they report “is fundamentally irrelevant, because every organization sees things differently,” according to John Stewart, president of Talons Ventures and a former chief security and trust officer at Cisco. “The relevant piece is access, support, authorities, and accountability,” Stewart tells CSO. Stewart has cautioned CISOs many times to be careful of the “I need to report to the CEO to be effective” instinct. “That suggests either the business, the culture, or the individual are ineffective.” A more effective approach should be, according to Stewart: “I need access to the CEO with their support and a clear understanding of my responsibilities and authorities that is backed up with action.” This is pretty much in line with the thinking of Malcolm Harkins, former CISO at Intel and other entities, who tells CSO that it is “unimportant” to whom an individual CISO reports. “The CISO is the one who should be responsible and accountable for mitigating risk,” he says. 


Is starting with a startup the right choice?

When presented with the choice between failing with a plan and succeeding without one, many startup founders and teams lean towards the latter. The prevailing belief is that success is the ultimate goal, regardless of the presence or absence of a plan. However, this perspective fails to consider the reliability, repeatability, and scalability of such success. Without a plan, it becomes uncertain whether the success can be attributed to one's own efforts, sheer luck, or the misfortune of competitors. Relying on hope alone is not a sound strategy. In the fast-paced world of startups, the constant action and hustle often mask personal weaknesses and systemic deficiencies. It is only when the economic tide recedes that the true vulnerabilities of startups are exposed, as astutely captured by Warren Buffett's observation, "Only when the tide goes out do you discover who's been swimming naked." During a favorable economic climate, many startups may appear successful without a well-defined strategy. They simply need to be in the right place at the right time.


Office workers feel AI is better than a human boss

Looking at employee concerns around AI bosses, softer skills and capabilities are the key areas respondents think a robot would lack. In addition, just under half of employees think they would struggle to see a robot as an authoritative figure. Despite this, over one in five (22%) admit they would feel more comfortable talking about their frustrations at work to a robot than to their boss. According to Business Name Generator, this may be due to not wanting to cause the conflict or emotional distress that is part of human interaction. The survey found that 18% would trust a robot boss over their current one. The office workers polled said that a lack of appreciation is the biggest frustration they have with their bosses, with 14% experiencing this currently. Micromanagement, being a “know-it-all”, and lacking patience are among the frustrations featured in the top 10, with over one in 10 respondents experiencing each. Poor management skills, such as bosses who are disorganised or have unclear expectations, also featured in the top 10 of complaints.


A Walkthrough of Adopting Infrastructure as Code

The world of cloud infrastructure is a bit daunting. Pulumi supports over 100 clouds. AWS has over 200 services with over 1,000 individual resources and over 300,000 configurable properties across all of them. Pulumi is a multicloud tool, but “multicloud” does not mean “lowest common denominator.” Instead, Pulumi exposes all those individual clouds, resources and properties in their raw, unadulterated form. The benefit of this is that you have the entire capabilities of all of these clouds right at your fingertips. The downside is that to use them you need to understand these clouds and how to use them properly. As a result, you’ll probably quickly find that you want a starting point, rather than a blank page. Pulumi Templates are a good way to get started. They represent over a dozen of the most common application and infrastructure architectures on the most popular clouds. They were built to be simple enough to be understandable at a glance but complete enough to be useful in practice. 
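
For a feel of what such a starting point looks like, here is roughly the shape of a minimal Pulumi TypeScript program; the resource name is arbitrary, and the exact properties available depend on the provider and its version.

```typescript
import * as aws from "@pulumi/aws";

// Each cloud resource is a typed class, and the provider's configurable
// properties are exposed directly on its arguments.
const bucket = new aws.s3.Bucket("app-assets", {
  versioning: { enabled: true }, // one of many raw S3 properties
});

// Stack outputs surface resolved values (names, URLs) after deployment.
export const bucketName = bucket.id;
```

Running pulumi up previews the changes and applies the declared state.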


Data Governance in Higher Ed is Critical. Here’s How to Achieve and Sustain It.

“It’s important to remember, however, that data is an institutional asset,” she says. Divided as institutions are by different schools and academic departments — not to mention professional departments like IT, enrollment or student life — silos are often unavoidable in higher education. Those silos are often one of the biggest barriers to developing and implementing an effective data governance strategy across an institution. That’s because data governance works better as a Venn diagram than in silos, says Matthew Hagerty, a consultant who specializes in IT, efficiency and analytics, and faculty engagement at EAB, a Washington, D.C.-based education consulting firm. “Make sure the right people are in the room to craft that policy,” Hagerty says. He says that many times during initial data governance meetings, “maybe halfway through, someone will raise their hand and ask, ‘Wait a second. Why isn’t Bob from finance here? Who’s representing human resources on this committee?’”


Hate being more productive? Ignore AI agents

Business school professor and technologist Ethan Mollick offers what I’ve found to be a very useful framing for how to think about generative AI: “It is not good software, [rather] it is pretty good people.” And rather than thinking about AIs as people who replace those already on the payroll, treat them like “eager interns” that can help those people be more productive. This metaphor can help on two fronts. First, it keeps the need for human supervision front and center. Just as hiring and productively managing interns is a valuable competency for an organization, so too is using ChatGPT, Microsoft’s Copilot, or Google’s Bard. But you would no more blindly trust this class of model than you would even the most promising intern. Second, and as important: IT isn’t responsible for hiring interns in Finance and HR. Likewise, Finance and HR (and every other function) must build their own competency in figuring out how to use these tools to be more productive. The job to be done is closer to answering domain-specific staffing questions than IT questions.


Top Tips for Weeding Out Bad Data

Bad data often really means low-quality data. In this case, it’s up to the data owner to define the acceptable level of quality in terms of relevance, accuracy, age, or other criteria. “But bad data can also mean inappropriate data, in which case ‘appropriate’ would need to be defined,” says Erik Gfesser, director and chief architect at business advisory firm Deloitte Global. One enterprise’s highly useful data might be meaningless to another. Since many use cases aren’t particularly demanding, data quality doesn’t always have to adhere to the same standards. “As such, judgment often needs to be used to determine what’s appropriate,” he explains. It’s also important to check for duplicate records, which can be caused by data entry errors or identical data being retrieved from multiple sources. “A clearly defined data governance program and an enterprise-level data pipeline design that’s shared enterprise-wide are the best ways to prevent duplicate records,” Shah recommends. It’s possible to identify outliers and detect anomalies by comparing values that appear significantly different from the rest of the data, or by running statistical tests, such as regression analysis, hypothesis testing, or correlation analysis, to identify patterns in the data, Shah says.
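
Two of the checks described here, duplicate records and statistical outliers, are simple to automate. The TypeScript sketch below keeps the first record per id and flags values more than k standard deviations from the mean; the three-sigma default is a common convention, not a universal rule.

```typescript
interface Row {
  id: string;
  value: number;
}

// Drop exact duplicate records, keeping the first occurrence of each id.
function dedupe(rows: Row[]): Row[] {
  const seen = new Set<string>();
  return rows.filter((r) => {
    if (seen.has(r.id)) return false; // duplicate: drop it
    seen.add(r.id);
    return true;
  });
}

// Flag outliers: values more than k standard deviations from the mean.
function outliers(rows: Row[], k = 3): Row[] {
  const values = rows.map((r) => r.value);
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance = values.reduce((a, b) => a + (b - mean) ** 2, 0) / values.length;
  const sd = Math.sqrt(variance);
  return rows.filter((r) => Math.abs(r.value - mean) > k * sd);
}
```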


Choosing the Right Data Architecture

"If it turns out that none of those references is close to your scale, doing what you want to do, then you know you're well beyond the frontier of the vendor’s product." If that’s the case, then you need to conduct tests to help control and manage your risk. "The best kind of test is a full-scale, realistic benchmark, and the best case is where you have more than one credible vendor." Winter recommends testing two or three solutions and comparing the results. You can see if any vendor can demonstrate they have the capability to meet your most critical requirements. If multiple vendors pass this test, then examine differences in cost, complexity, and the agility of the solution. "These differences can be very revealing. Once you've illuminated what's going on via testing, you can get into much deeper conversations with the vendor about what you're seeing in the behavior of the system. We've had remarkable experiences doing this, even with the most modern systems in the cloud."



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kantor

Daily Tech Digest - June 25, 2023

Traffic Routing in Ambient Mesh

The ambient mesh deployment model is much leaner than the sidecar data plane deployment model, allowing for incremental adoption of service mesh features and making adoption less risky. Because ambient mesh includes fewer components, it reduces infrastructure costs and improves performance, as captured in this blog post. Ambient mesh does all this while retaining all the critical service mesh features, including zero trust security. ... The new Rust-based ztunnel proxy is responsible for mTLS, authentication, L4 authorization and telemetry in the ambient mesh. Its job is to proxy the traffic between ambient mesh pods. Optionally, the ztunnel also proxies traffic to L7 waypoint proxies, ingress proxies and, in the future, egress proxies. Ztunnels on different nodes establish a tunnel using HBONE (HTTP-Based Overlay Network Environment). A similar tunnel is established between the ztunnel and the waypoint proxy, if one exists. The tunnel established between the ztunnels allows the source ztunnel to connect to the destination workload on behalf of the source workload.
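Read as pseudocode, the traffic path described above looks roughly like the following (a conceptual TypeScript sketch only; ztunnel itself is written in Rust, and the hop names are illustrative):

```typescript
// Conceptual sketch of the ambient mesh data path: source workload ->
// local ztunnel -> (optional waypoint proxy for L7) -> remote ztunnel
// -> destination workload. Not real ztunnel code.
interface Workload {
  name: string;
  node: string;
}

function trafficPath(src: Workload, dst: Workload, hasWaypoint: boolean): string[] {
  const hops = [src.name, `ztunnel@${src.node}`];
  if (hasWaypoint) {
    // L7 policy is applied by a waypoint proxy, when one exists.
    hops.push("waypoint-proxy (L7)");
  }
  // Hops between proxies travel over mTLS-secured HBONE tunnels.
  hops.push(`ztunnel@${dst.node}`, dst.name);
  return hops;
}
```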


Unleashing Business Growth: The Power of Adopting Enterprise Architecture

Enterprise architecture plays a vital role in the success and growth of modern businesses. By aligning business and IT strategies, enhancing agility, optimizing resources, mitigating risks, and fostering innovation, EA provides a solid foundation for sustained growth and competitive advantage. As businesses continue to navigate an increasingly complex landscape, leveraging the business-critical value of enterprise architecture becomes imperative for capturing new opportunities and driving long-term success. So, whether you are a business leader, IT professional, or decision-maker, embracing EA as a strategic imperative will position your organization for growth, resilience, and innovation in an ever-changing business landscape. Remember, developing an effective enterprise architecture is not a one-time effort but an ongoing journey of adaptation and improvement. It requires collaboration, commitment, and continuous refinement to realize its full potential in driving business growth.


IT firms expect to increase hiring next quarter, ManpowerGroup says

Among the roles most in demand in IT are project managers, business analysts, and software developers. "I wish we could clone full stack developers. We can't find enough of them," Doyle said. In past years, ManpowerGroup’s survey was conducted by telephone; this year, it was done online. Regionally, the strongest hiring intentions for next quarter are in the West, where 43% of employers plan to add workers, according to ManpowerGroup. In the Northeast, 40% of employers plan to increase staff; in the Midwest, 32% plan to do so; and in the South, 29% do. Large organizations with more than 250 employees are more than three times as optimistic about hiring in the next quarter as small firms (those with fewer than 10 employees), with employment outlooks of +47% and +14%, respectively. Earlier this month, the US Bureau of Labor Statistics (BLS) released its hiring data for the month of May; it showed a 0.3-percentage-point rise in overall unemployment, from 3.4% to 3.7%.


Building Effective Defenses Against Social Engineering

In addition to awareness training and education, a number of technologies are available to augment and fortify efforts to limit the impact of social engineering attacks. Cloud-based email security gateways are just one example. Depending on budget, staffing, the age of existing infrastructure, the value of the assets to be protected and other factors, a layered defense strategy may range from relatively low-cost and simple to more elaborate (and expensive) endeavors. Enforcing strong passwords is an example of a relatively cheap, easy and fast tactic that can be highly effective in averting data breaches and other cyberattacks. Other strategies and techniques can be rolled out in parallel with existing technologies to minimize disruption while preparing for a new, stronger security infrastructure. Zero-trust network access (ZTNA) is one such example; it can be deployed alongside a secure sockets layer (SSL) virtual private network (VPN), working as an overlay at first to boost security and eventually replacing the VPN entirely.
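As a small illustration of the "cheap, easy and fast" end of that spectrum, a password policy check can be a few lines of code. The rules below (TypeScript) are illustrative examples, not a vetted standard:

```typescript
// Illustrative password policy: minimum length plus a mix of character
// classes. Real policies should follow current guidance, which tends
// to favor length and breach checks over composition rules.
function meetsPasswordPolicy(password: string): boolean {
  return (
    password.length >= 12 &&
    /[a-z]/.test(password) &&
    /[A-Z]/.test(password) &&
    /[0-9]/.test(password) &&
    /[^A-Za-z0-9]/.test(password)
  );
}
```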


Data Breach Lawsuit Alleges Mismanagement of 3rd-Party Risk

The latest GoAnywhere-related lawsuit alleges that ITx could have prevented the theft of sensitive data "had it limited the patient information it shared with its business associates and employed reasonable supervisory measures to ensure that adequate data security practices, procedures and protocols were being implemented and maintained by business associates." ITx's "collective inadequate safeguarding and supervision of class members' private information that they collected and maintained, and its failure to adequately supervise its business associates, vendors and/or suppliers" has put the plaintiffs and class members at risk for ID fraud and theft crimes, the complaint also alleges. The lawsuit says victims will be at higher risk for phishing, data intrusion and other illegal schemes through the misuse of their private information. It also points out that their data is still held by ITx and could be exposed to future breaches without the court's corrective action. The lawsuit seeks monetary damages, lifetime credit and identity monitoring for the plaintiff and class members, as well as a court order for ITx to take measures to prevent any future similar data security incidents.


Who owns the code? If ChatGPT's AI helps write your app, does it still belong to you?

Attorney Richard Santalesa, a founding member of the SmartEdgeLaw Group based in Westport, Conn., focuses on technology transactions, data security, and intellectual property matters. He points out that there are issues of contract law as well as copyright law -- and they're treated differently. From a contractual point of view, Santalesa contends that most companies producing AI-generated code will, "as with all of their other IP, deem their provided materials -- including AI-generated code -- as their property." OpenAI (the company behind ChatGPT) does not claim ownership of generated content. According to their terms of service, "OpenAI hereby assigns to you all its right, title and interest in and to Output." Clearly, though, if you're creating an application that uses code written by an AI, you'll need to carefully investigate who owns (or who claims to own) what. For a view of code ownership outside the US, ZDNET turned to Robert Piasentin, a Vancouver-based partner in the Technology Group at McMillan LLP, a Canadian business law firm.


Shadow SaaS, changing contracts and ChatGPT adoption: SaaS trends to watch

As more companies move to remote work, many find that shorter (one-year) contracts are preferable because they allow for more flexibility. Reducing contract length is also a way for organizations to reduce overhead costs. One-year contracts accounted for 79% of all contracts in 2020 and 85% of all contracts in 2022; three-year and longer contracts declined the most year over year. In 2023, SaaS spend per employee averaged $9,643. Large businesses spent an average of $7,492 per employee in 2022, while medium-sized businesses spent $10,045 and small businesses spent $11,196. Large businesses spent less because they received volume discounts and enterprise-wide licensing agreements, as well as better economies of scale with consumption-based apps, Productiv said. “To avoid shadow IT, organizations need to develop appropriate SaaS governance policies that help teams take their free and purchased apps out of the shadows and ensure the right level of corporate policies for procurement, security and compliance,” Chandarana said.


How AI is reshaping demand for IT skills and talent

AI also opens new doors to security threats and compliance issues that organizations must be prepared to address. “On the technical side, I see security as hugely important,” says Hendrickson. “A lot of companies say, ‘We’re not letting people touch ChatGPT yet, we’re just not allowing it—it’s blocked.’” But end users’ propensity for finding ways to improve their work processes will no doubt lead to greater levels of shadow IT around such emerging technologies, so the security implications will eventually need to be tackled with more than attempts to hold back the tide. Moreover, Hendrickson points out that just a few years ago, discussions around machine learning centered on its ability to break encryption, and with quantum machine learning on the horizon, that concern has only grown. As companies navigate AI in the workplace, they’re going to need skilled professionals who can identify potential risks and pinpoint possible solutions. There are also increased complexities around “managing the infrastructure and platforms that provide resources to power applications, and to store and access data,” says Kim.


Decision Rights Rule the World – Architecture Design Part 3

Think of the number of technology-related decisions made daily in your organization. Try to imagine every library, product, SaaS tool, vendor agreement, pattern, style, and reference model that is being chosen by one or more people. They range from huge (ERP, standardizing on a single cloud vendor, information management structures) to small (a library dependency, applying a pattern to code, GitHub structure). The real question is, how many of those are architecturally relevant (note: it is NOT all of them)? And how many of them come with a decision record of any kind? I have asked this question of countless audiences and teams over time. The answer is… almost none. And that is scary. We end up with WHAT we decided, not WHY we decided it. Traceability, understanding, and decision excellence are all thrown out the window because we think it might take too long. Just FYI: whenever I have implemented decision management in teams, important decisions (structural, value-based, etc.) go FASTER, not slower. The decision record allows us to compare apples to apples instead of rehashing long-winded, emotionally charged, opinion-heavy, biased arguments.
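A decision record does not need to be heavyweight to capture the WHY. One possible shape is sketched below in TypeScript, loosely modeled on common architecture decision record (ADR) templates; the fields are a suggestion, not a standard:

```typescript
// One lightweight shape for a decision record; field names are
// illustrative, loosely following common ADR templates.
interface DecisionRecord {
  title: string; // e.g., "Standardize on a single cloud vendor"
  status: "proposed" | "accepted" | "superseded";
  context: string; // the forces at play: WHY a decision was needed
  decision: string; // WHAT was chosen
  alternativesConsidered: string[]; // what was rejected, and why
  consequences: string; // trade-offs accepted, follow-up work created
  date: string; // ISO date, e.g., "2023-06-25"
}
```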


Structured for Success: 4 Architectural Pillars of Cyber Resilience

Having centralized visibility is fundamental to not only taking control of cloud environments but also bridging silos. In a recent survey conducted by Forrester, 83% of IT decision-makers said a single consolidated view for managing their organizations’ cloud and IT services would help achieve their business outcomes — including improving their cybersecurity posture. ... Immutable data storage ensures that once data is written, it is impossible to change, erase or otherwise interfere with it. This functionality guards against malware, ransomware, and both unintentional and malicious human behavior. Since it effectively protects data against any change or erasure, such as a ransomware attack that tries to encrypt data, immutability is commonly regarded as a prerequisite in the battle against ransomware. ... Beyond this 3-2-1 rule (keeping three copies of data on two different types of media, with one copy offsite), organizations need a scalable backup and recovery infrastructure, one that makes management fast and simple, to sustain business continuity and operations in the current cybersecurity landscape.
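To make the write-once contract concrete, here is a toy in-memory illustration (TypeScript). Real immutability is enforced at the storage layer, for example with object locks or WORM media; this sketch only models the behavior:

```typescript
// Toy model of immutable storage: writes are accepted once, and
// nothing can overwrite or delete what was written.
class ImmutableStore {
  private objects = new Map<string, Uint8Array>();

  write(key: string, data: Uint8Array): void {
    if (this.objects.has(key)) {
      // Once written, an object can never be overwritten, which is
      // exactly what defeats ransomware trying to encrypt it in place.
      throw new Error(`"${key}" is immutable and cannot be overwritten`);
    }
    this.objects.set(key, data);
  }

  read(key: string): Uint8Array | undefined {
    return this.objects.get(key);
  }

  // Deliberately no delete(): erasure is not part of the interface.
}
```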



Quote for the day:

"Leadership without mutual trust is a contradiction in terms." -- Warren Bennis