Daily Tech Digest - March 18, 2024

Generative AI will turn cybercriminals into better con artists. AI will help attackers craft well-written, convincing phishing emails and websites in different languages, enabling them to widen the nets of their campaigns across locales. We expect to see the quality of social engineering attacks improve, making lures more difficult for targets and security teams to spot. As a result, we may see an increase in the risks and harms associated with social engineering – from fraud to network intrusions. ... AI is driving the democratisation of technology by helping less skilled users to carry out more complex tasks more efficiently. But while AI improves organisations’ defensive capabilities, it also has the potential to help malicious actors carry out attacks against lower system layers, namely firmware and hardware, where attack efforts have been on the rise in recent years. Historically, such attacks required extensive technical expertise, but AI is beginning to show promise in lowering these barriers. This could lead to more efforts to exploit systems at the lower level, giving attackers a foothold below the operating system and the industry’s best software security defences.


Get the Value Out of Your Data

A robust data strategy should have clearly defined outcomes and measurements in place to trace the value it delivers. However, it is important to acknowledge the need for flexibility during the strategic and operational phases. Consequently, defining deliverables becomes crucial to ensure transparency in the delivery process. To achieve this, adopting a data product approach focused on iteratively delivering value to your organization is recommended. The evolution of DevOps, supported by cloud platform technology, has significantly improved the software engineering delivery process by automating development and operational routines. Now, we are witnessing a similar agile evolution in the data management area with the emergence of DataOps. DataOps aims to enhance the speed and quality of data delivery, foster collaboration between IT and business teams, and reduce the associated time and costs. By providing a unified view of data across the organization, DataOps enables faster and more confident data-driven decision-making, ensuring that data is accurate, current, and secure. It automates and brings transparency to the measurements required for agile delivery through data product management.
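
To make the idea of automated, transparent measurement more concrete, here is a minimal sketch of the kind of data-quality check a DataOps pipeline might run on each delivered batch. The column names (order_id, updated_at) and thresholds are illustrative assumptions, not part of the article; real DataOps tooling would wire checks like these into orchestration, dashboards, and alerting.

```python
import pandas as pd

def data_quality_report(df: pd.DataFrame, max_null_rate: float = 0.01,
                        max_staleness_days: int = 1) -> dict:
    """Compute simple DataOps-style quality measurements for one delivered batch."""
    report = {}
    # Completeness: share of missing values per column.
    null_rates = df.isna().mean()
    report["columns_over_null_threshold"] = null_rates[null_rates > max_null_rate].to_dict()
    # Uniqueness: duplicated business keys usually signal an upstream problem.
    report["duplicate_order_ids"] = int(df["order_id"].duplicated().sum())
    # Freshness: how old is the newest record in the batch?
    newest = pd.to_datetime(df["updated_at"], utc=True).max()
    report["stale"] = bool(pd.Timestamp.now(tz="UTC") - newest > pd.Timedelta(days=max_staleness_days))
    return report

# Illustrative batch with one missing amount and one duplicated key.
batch = pd.DataFrame({
    "order_id": [1, 2, 2],
    "amount": [10.0, None, 5.0],
    "updated_at": ["2024-03-18T08:00:00Z"] * 3,
})
print(data_quality_report(batch))
```

A report like this can be published alongside each data product release, which is what gives business teams the transparency the paragraph describes.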


Exposure to new workplace technologies linked to lower quality of life

Part of the problem is that IT workers need to stay updated with the newest tech trends and figure out how to use them at work, said Ryan Smith, founder of the tech firm QFunction, also unconnected with the study. The hard part is that new tech keeps coming in, and workers have to learn it, set it up, and help others use it quickly, he said. “With the rise of AI and machine learning and the uncertainty around it, being asked to come up to speed with it and how to best utilize it so quickly, all while having to support your other numerous IT tasks, is exhausting,” he added. “On top of this, the constant fear of layoffs in the job market forces IT workers to keep up with the latest technology trends in order to stay employable, which can negatively affect their quality of life.” ... “As IT has become the backbone of many businesses, that backbone is key to the business’s operations, and in most cases revenue,” he added. “That means it’s key to the business’s survival. IT teams now must be accessible 24 hours a day. In the face of a problem, they are expected to work 24 hours a day to resolve it. ...”


6 best operating systems for Raspberry Pi 5

Even though it has been nearly seven years since Microsoft debuted Windows on Arm, there has been a noticeable lack of ARM-powered laptops. The situation is even worse for SBCs like the Raspberry Pi, which aren’t even on Microsoft’s radar. Luckily, the talented team behind the WoR project managed to find a way to install Windows 11 on Raspberry Pi boards. ... Finally, we have the Raspberry Pi OS, which has been developed specifically for the RPi boards. Since its debut in 2012, the Raspberry Pi OS (formerly Raspbian) has become the operating system of choice for many RPi board users. Since it was hand-crafted for the Raspberry Pi SBCs, it’s faster than Ubuntu and light years ahead of Windows 11 in terms of performance. Moreover, most projects tend to favor Raspberry Pi OS over the alternatives. So, it’s possible to run into compatibility and stability issues if you use any other operating system when attempting to replicate the projects created by the lively Raspberry Pi community. You won’t be disappointed with the Raspberry Pi OS if you prefer a more minimalist UI. That said, despite including pretty much everything you need to make the most of your RPi SBC, the Raspberry Pi OS isn't as user-friendly as Ubuntu.


Speaking without vocal cords, thanks to a new AI-assisted wearable device

The breakthrough is the latest in Chen's efforts to help those with disabilities. His team previously developed a wearable glove capable of translating American Sign Language into English speech in real time to help users of ASL communicate with those who don't know how to sign. The tiny new patch-like device is made up of two components. One, a self-powered sensing component, detects and converts signals generated by muscle movements into high-fidelity, analyzable electrical signals; these electrical signals are then translated into speech signals using a machine-learning algorithm. The other, an actuation component, turns those speech signals into the desired voice expression. The two components each contain two layers: a layer of biocompatible silicone compound polydimethylsiloxane, or PDMS, with elastic properties, and a magnetic induction layer made of copper induction coils. Sandwiched between the two components is a fifth layer containing PDMS mixed with micromagnets, which generates a magnetic field. Utilizing a soft magnetoelastic sensing mechanism developed by Chen's team in 2021, the device is capable of detecting changes in the magnetic field caused by mechanical forces—in this case, the movement of laryngeal muscles.
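
To make the "translated into speech signals using a machine-learning algorithm" step concrete, here is a deliberately simplified sketch of the general idea: fixed-length windows of sensor readings are mapped to phrase labels with a standard classifier. The window size, labels, and the nearest-neighbour model are illustrative assumptions, not the UCLA team's actual pipeline.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy training data: each row is a flattened window of muscle-signal samples,
# each label is the phrase the wearer mouthed while that window was recorded.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 100))          # 60 windows x 100 samples (illustrative)
y_train = np.repeat(["hi", "yes", "no"], 20)  # 3 candidate phrases

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# At run time, a new window from the sensing component is classified, and the
# predicted phrase would drive the actuation component's voice output.
new_window = rng.normal(size=(1, 100))
print(clf.predict(new_window)[0])
```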


We can’t close the digital divide alone, says Cisco HR head as she discusses growth initiatives

At Cisco, we follow a strengths-based approach to learning and development, wherein our quarterly development discussions extend beyond performance evaluations to uplifting ourselves and our teams. We understand that a one-size-fits-all approach is inadequate. To best play to our employees' strengths, we have to be flexible, adaptable, and open to what works best for each individual and team. This enables us to understand individual employees' unique learning needs, enabling us to tailor personalised programs that encompass diverse learning options such as online courses, workshops, mentoring, and gamified experiences, catering to diverse learning styles. As a result, our employees are energized to pursue their passions, contributing their best selves to the workplace. Measuring the quality of work, internal movements, employee retention, patents, and innovation, along with engagement pulse assessments, allows us to gauge the effectiveness of our programs. When it comes to addressing the challenge of retaining talent, it's essential for HR leaders to consider a holistic approach. 


Vector databases: Shiny object syndrome and the case of a missing unicorn

What’s up with vector databases, anyway? They’re all about information retrieval, but let’s be real, that’s nothing new, even though it may feel like it with all the hype around it. We’ve got SQL databases, NoSQL databases, full-text search apps and vector libraries already tackling that job. Sure, vector databases offer semantic retrieval, which is great, but SQL databases like Singlestore and Postgres (with the pgvector extension) can handle semantic retrieval too, all while providing standard DB features like ACID. Full-text search applications like Apache Solr, Elasticsearch and OpenSearch also rock the vector search scene, along with search products like Coveo, and bring some serious text-processing capabilities for hybrid searching. But here’s the thing about vector databases: They’re kind of stuck in the middle. ... It wasn’t that early either — Weaviate, Vespa and Milvus were already around with their vector DB offerings, and Elasticsearch, OpenSearch and Solr were ready around the same time. When technology isn’t your differentiator, opt for hype. Pinecone’s $100 million Series B funding was led by Andreessen Horowitz, which in many ways is living by the playbook it created for the boom times in tech.
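
To illustrate the point about Postgres handling semantic retrieval, here is a minimal sketch of a pgvector-style similarity query issued from Python. The table name, three-dimensional embeddings, and connection string are assumptions for illustration; the <-> operator is pgvector's distance operator used to order rows by nearest neighbour.

```python
import psycopg2

# Assumed connection string and schema, for illustration only.
conn = psycopg2.connect("dbname=demo user=demo password=demo host=localhost")
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute("CREATE TABLE IF NOT EXISTS docs (id bigserial PRIMARY KEY, body text, embedding vector(3));")
cur.execute("INSERT INTO docs (body, embedding) VALUES (%s, %s)", ("hello world", "[0.1, 0.2, 0.3]"))
conn.commit()

# Semantic retrieval: order rows by distance to a query embedding.
query_embedding = "[0.1, 0.2, 0.25]"
cur.execute(
    "SELECT body FROM docs ORDER BY embedding <-> %s::vector LIMIT 5;",
    (query_embedding,),
)
print(cur.fetchall())
```

In a real application the embeddings would come from an embedding model and the query would typically be combined with ordinary SQL filters, which is exactly the "standard DB features plus semantic retrieval" combination the paragraph describes.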


The Role of Quantum Computing in Data Science

Despite its potential, the transition to quantum computing presents several significant challenges to overcome. Quantum computers are highly sensitive to their environment, with qubit states easily disturbed by external influences – a problem known as quantum decoherence. This sensitivity requires that quantum computers be kept in highly controlled conditions, which can be expensive and technologically demanding. Moreover, concerns about the future cost implications of quantum computing on software and services are emerging. Ultimately, the prices will be sky-high, and we might be forced to search for AWS alternatives, especially if they raise their prices due to the introduction of quantum features, as is the case with Microsoft banking everything on AI. This raises the question of how quantum computing will alter the prices and features of both consumer and enterprise software and services, further highlighting the need for a careful balance between innovation and accessibility. There’s also a steep learning curve for data scientists to adapt to quantum computing.


AI-Driven API and Microservice Architecture Design for Cloud

Implementing AI-based continuous optimization for APIs and microservices in Azure involves using artificial intelligence to dynamically improve performance, efficiency, and user experience over time. Here's how you can achieve continuous optimization with AI in Azure:
- Performance monitoring: Implement AI-powered monitoring tools to continuously track key performance metrics such as response times, error rates, and resource utilization for APIs and microservices in real time.
- Automated tuning: Utilize machine learning algorithms to analyze performance data and automatically adjust configuration settings, such as resource allocation, caching strategies, or database queries, to optimize performance.
- Dynamic scaling: Leverage AI-driven scaling mechanisms to adjust the number of instances hosting APIs and microservices based on real-time demand and predicted workload trends, ensuring efficient resource allocation and responsiveness.
- Cost optimization: Use AI algorithms to analyze cost patterns and resource utilization data to identify opportunities for cost savings, such as optimizing resource allocation, implementing serverless architectures, or leveraging reserved instances.
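
The dynamic-scaling item above can be illustrated with a small, vendor-neutral sketch: predict the next interval's load from recent metrics and size the instance fleet accordingly. The linear trend is a crude stand-in for whatever forecasting model is used, and the capacity, headroom, and minimum-fleet figures are assumptions; in Azure this logic would typically be fed by Azure Monitor metrics and applied through autoscale rules rather than a standalone script.

```python
import numpy as np

def recommend_instances(recent_rps: list[float], capacity_per_instance: float = 50.0,
                        headroom: float = 1.2, min_instances: int = 2) -> int:
    """Predict next-interval load from a simple linear trend and size the fleet."""
    y = np.asarray(recent_rps, dtype=float)
    x = np.arange(len(y))
    slope, intercept = np.polyfit(x, y, 1)        # stand-in for an ML forecaster
    predicted = max(slope * len(y) + intercept, 0.0)
    needed = int(np.ceil(predicted * headroom / capacity_per_instance))
    return max(needed, min_instances)

# Requests-per-second samples from the last few monitoring intervals (illustrative).
history = [180, 210, 250, 300, 340]
print(recommend_instances(history))  # scales out ahead of the rising trend
```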


4 ways AI is contributing to bias in the workplace

Generative AI tools are often used to screen and rank candidates, create resumes and cover letters, and summarize several files simultaneously. But AIs are only as good as the data they're trained on. GPT-3.5 was trained on massive amounts of widely available information online, including books, articles, and social media. This online data inevitably reflects societal inequities and historical biases, which the AI bot inherits and replicates to some degree. No one using AI should assume these tools are inherently objective because they're trained on large amounts of data from different sources. While generative AI bots can be useful, we should not underestimate the risk of bias in an automated hiring process -- and that reality is crucial for recruiters, HR professionals, and managers. Another study found racial bias is present in facial-recognition technologies that show lower accuracy rates for dark-skinned individuals. Something as simple as data for demographic distributions in ZIP codes being used to train AI models, for example, can result in decisions that disproportionately affect people from certain racial backgrounds.



Quote for the day:

"The most common way people give up their power is by thinking they don't have any." -- Alice Walker

Daily Tech Digest - March 17, 2024

Generative AI will drive a foundational shift for companies — IDC

“Over the last year, most organizations debated creating Chief AI Officers and centers of excellence to decide how to embed AI and create new business centers for new AI-enabled products and services,” said Rick Villars, group vice president of IDC’s Worldwide Research division. CIOs are also rethinking their capital investment plans and staffing needs based on AI initiatives, according to Villars, including how AI will affect an organization’s long-term revenue and profitability. Most organizations are likely to choose a hybrid approach to building out their AI plans — that is, companies will partner with service providers while also customizing existing AI platforms such as ChatGPT, as well as building their own proprietary, but smaller, AI models for specific use cases. “All applications you buy will become more intelligent. ... Phil Carter, group vice president of IDC’s Worldwide Thought Leadership Research, said organizations shouldn’t expect an immediate ROI from their investments. Like other major economic shifts, such as the arrival of the tractor in farming, genAI technology can take decades to achieve widespread adoption and ROI.


Blockchain in trademark and brand protection, explained

Through the use of blockchain technology, firms are able to generate immutable records of product legitimacy. It is possible to provide each product with a unique identification number that allows retailers and customers to instantly confirm its legitimacy. In addition to shielding customers against fake items, this also helps firms preserve their goodwill, ensure data integrity, and win over new customers. Additionally, supply chains benefit from the transparency and traceability that blockchain offers, allowing firms to monitor the flow of goods from manufacturing to distribution. Businesses can use blockchain technology to confirm the legitimacy of products and spot any illegal or fake goods circulating in the market. ... it might be difficult and expensive to integrate blockchain technology with current systems and procedures. To apply blockchain efficiently, firms might need to redesign their infrastructure and make considerable investments in new technology and knowledge. This can be a major hurdle, particularly for smaller companies with tighter budgets. The implementation of blockchain in brand protection is further complicated by problems with scalability and interoperability.
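
As a toy illustration of the "unique identifier plus immutable record" idea, the sketch below chains hashes of product records so that any tampering with an earlier entry invalidates every entry after it. This is only a local simulation of the property, under assumed record fields; a real brand-protection system would anchor these records on an actual blockchain and handle identities, signatures, and distribution.

```python
import hashlib
import json

def record_hash(product: dict, previous_hash: str) -> str:
    """Hash a product record together with the previous entry, forming a tamper-evident chain."""
    payload = json.dumps(product, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# A brand registers each unit; retailers and customers recompute the chain to verify it.
ledger = []
prev = "0" * 64
for unit in [{"sku": "BAG-001", "serial": "A1"}, {"sku": "BAG-001", "serial": "A2"}]:
    entry_hash = record_hash(unit, prev)
    ledger.append({"record": unit, "hash": entry_hash, "prev": prev})
    prev = entry_hash

# Verification: any edit to an earlier record changes every later hash.
ok = all(e["hash"] == record_hash(e["record"], e["prev"]) for e in ledger)
print("ledger verified:", ok)
```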


Open source is not insecure

It’s too easy whenever there is a major vulnerability to malign the overall state of open source security. In fact, many of these highest profile vulnerabilities show the power of open source security. Log4shell, for example, was the worst-case scenario for an OSS vulnerability at a scale and visibility level—this was one of the most widely used libraries in one of the most widely used programming languages. (Log4j was even running on the Mars rover. Technically this was the first intergalactic OSS vulnerability!) The Log4shell vulnerability was trivial to exploit, incredibly widespread, and seriously consequential. The maintainers were able to patch it and roll it out in a matter of days. It was a major win for open source security response at the maintainer level, not a failure. ... But today, most software consumption is occurring outside of distributions. The programming language package managers themselves—npm (JavaScript), pip (Python), Ruby Gems (Ruby), composer (PHP)—look and feel like Linux distribution package managers, but they work a little differently. They basically offer zero curation—anyone can upload a package and mimic a language maintainer.


AI is keeping GitHub chief legal officer Shelley McKinley busy

“I would say that AI is taking up [a lot of] my time — that includes things like ‘how do we develop and ship AI products,’ and ‘how do we engage in the AI discussions that are going on from a policy perspective?,’ as well as ‘how do we think about AI as it comes onto our platform?’,” McKinley said. The advance of AI has also been heavily dependent on open source, with collaboration and shared data pivotal to some of the most preeminent AI systems today — this is perhaps best exemplified by the generative AI poster child OpenAI, which began with a strong open-source foundation before abandoning those roots for a more proprietary play ... “Regulators, policymakers, lawyers… are not technologists,” McKinley said. “And one of the most important things that I’ve personally been involved with over the past year, is going out and helping to educate people on how the products work. People just need a better understanding of what’s going on, so that they can think about these issues and come to the right conclusions in terms of how to implement regulation.” At the heart of the concerns was that the regulations would create legal liability for open source “general purpose AI systems,” which are built on models capable of handling a multitude of different tasks.


Is OpenAI Opening Up To Quantum?

It’s likely that the potential for quantum to solve certain computational tasks critical to OpenAI’s growth is one reason for the quantum feelers, as it were. First, as AI models become more sophisticated, the computational resources required to train them have skyrocketed. Quantum computing offers a potential solution to this bottleneck, promising speed-ups for specific types of computations, including those involved in machine learning and optimization problems. Quantum computers could one day — relying on superposition and entanglement — process vast amounts of data in ways that classical computers will struggle to manage and — again, eventually — consume far fewer economic and environmental resources. OpenAI CEO Sam Altman recently made headlines for reports that he was seeking $7 trillion to make chips, apparently to feed this massive need for speed and processing power. He’s since said the reports on that figure were inaccurate, but the move still underscores OpenAI’s computational dilemma — grow, but reduce costs and improve performance. In a sentence, then, the potential integration of quantum computing with AI could boost model efficiency.


Flexera 2024 State of the Cloud Reveals Spending as the Top Challenge of Cloud Computing

“This is a complex year for cloud adoption. Organizations are navigating economic uncertainties by investing in generative AI, security, and sustainability while prioritizing cost management,” said Brian Adler, Senior Director, Cloud Market Strategy at Flexera. He added, “Cloud adoption continues to grow. The shift toward hybrid and multi-cloud environments underscores the importance of comprehensive cost management, with nearly half of all workloads and data now in the public cloud. FinOps practices and cloud centers of excellence are growing as companies move toward centralized, strategic cloud management.” The report also shows an increase in multi-cloud usage, rising to 89% from 87% last year. Sixty-one percent of large enterprises use multi-cloud security, and 57% use multi-cloud FinOps as cost optimization tools. Organizations are taking a centralized approach to cloud, with 63% of organizations already having a cloud center of excellence (CCOE) and 14% planning on creating one within the next year. Sustainability has been high on the priority list of organizations.


Cloud CISO Perspectives: Easing the psychological burden of leadership

CISOs are the public face of an organization’s security team, and they sit at the nexus of the security experts, engineers, and developers who report to them, the organization’s security policies, and the executives and board of directors who they report to. They often are blamed for security breaches that occur on their watch, and yet CISOs are not fleeing their jobs — recent data suggests that, despite the stress of the role, they stay at their employer for more than four and a half years at a time. While a CISO who has stayed with one company for five years has clearly demonstrated their dedication to defending their organization’s data and supporting its security teams, it doesn’t mean that they’re happy. High-profile data breaches are on the rise, and government agencies are imposing stricter regulatory requirements including increasing levels of legal accountability (and even personal liability) for their organization’s cybersecurity posture. The stresses CISOs contend with can take a psychological toll, lead to poor decision-making, and even burnout. 


Tech Transformation in Food Technology with AI

AI-driven predictive analytics offer crop management assistance. AI uses historical data, weather patterns, and soil conditions to forecast crop yields, identify optimal planting times, and predict potential disease outbreaks. This proactive approach allows farmers to implement preventive measures, adjust farming practices, and mitigate risks, ultimately improving crop quality and quantity. ... Automation is crucial in streamlining food processing operations. AI-powered robotics and machine learning systems automate repetitive tasks such as sorting, grading, and packaging, enhancing efficiency, consistency, and speed. This reduces labour costs and minimises human errors, ensuring uniform product quality and meeting stringent industry standards and consumer expectations. ... AI technologies optimise every aspect of the food supply chain, from farm to fork. AI algorithms optimise logistics by analysing data on transportation, inventory management, and consumer preferences. They minimise transportation costs and reduce food wastage. Real-time monitoring and predictive analytics enable proactive decision-making, ensuring timely delivery and optimal utilisation of resources.


Modernizing Data Management with Karen Lopez

“One thing I’ve found working in the data industry is that there’s always something new coming over the horizon,” Lopez began. “Even so, we can still find ourselves suffering from the same struggles I was working on 35 years ago.” However, she pointed out, although relational databases were the core of everything until about 10 years ago, at that time there was an explosion of other types of databases and data stores -- a fact that makes the addition of the word “modern” much more meaningful than it otherwise might have been. “There are just so many more opportunities for new approaches to data management now,” she added. “I’m usually more of a skeptic when I see ‘modern’ in front of anything,” Lopez said. “There are certain standards, principles, and practices that work even in this new environment. It usually takes someone with a lot of hard-won experience to be able to tell whether one of these new systems or tools is trustworthy. Some of these things may be really exciting, but they just don’t catch on. For example, maybe they’re not scalable or they don’t meet the cost-benefit test -- there are plenty of reasons.”


Navigating Application Security in the AI Era

AI-generated code and organization-specific AI models have quickly become important parts of corporate IP. This begs the question: Can compliance protocols keep up? AI-generated code is typically created by piecing together multiple snippets of code found in publicly available code repositories. However, issues arise when AI-generated code pulls these pieces from open source libraries with license types that are incompatible with an organization’s intended use. Without regulation or oversight, this type of “non-compliant” code based on un-vetted data can jeopardize intellectual property and sensitive information. Malicious reconnaissance tools could automatically extract the corporate information shared with any given AI model, or developers may share code with AI assistants without realizing they’ve unintentionally revealed sensitive information. ... AI can be used to deliberately create malicious, difficult-to-detect code and insert it into open-source projects. AI-driven attacks are often vastly different from what human hackers would create – and different from what most security protocols are designed to protect against, allowing them to evade detection.



Quote for the day:

"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr

Daily Tech Digest - March 16, 2024

New knowledge base compiles Microsoft Configuration Manager attack techniques

“As with most 30-year-old technologies, Configuration Manager was not designed with modern security considerations,” the SpecterOps researchers said in a blog post announcing the new resource. “Many of its default configurations enable various components of its attack surface. Couple that with the inherent challenges of Active Directory environments and you have a massive attack surface suffering from a combined 55 years of technical debt.” The researchers claim they’ve encountered Configuration Manager deployments in almost every Active Directory environment they’ve investigated, a testament to the utility and popularity of the platform which allows admins to deploy applications, software updates, operating systems and compliance settings on a wide scale to servers and workstations. ... One of the most common insecure configurations for Configuration Manager encountered by SpecterOps are overprivileged network access accounts, which is one of the many accounts that SCCM uses for its various tasks. “We (very) commonly find the network access account to be configured as the client push installation account (local admin on all clients), SCCM Administrator, or even domain administrator,” the researchers said.


The IaC Weight on DevOps’ Shoulders

On the one hand, distributing the IaC load lessens the burden on the DevOps teams, but the downside is that it becomes difficult to understand which resources are actually in use and which have been temporarily created for testing purposes. With many owners creating resources on demand, once they are no longer needed, these leftovers create confusion around dependencies and make cloud platforms disorganized and difficult to maintain. Just as enabling more hands to touch IaC creates greater sprawl and disorder, more users with less governance invite careless cost sprawl as well. This often results in duplicate and unused resources accumulating, wasting budgets at a time when they are tight and every penny counts. With a lack of automation and oversight, environments grow messy and expensive. The sprawl issues can also impact security, as expanding permissions raises valid security concerns that are intensified when clouds become disorganized and difficult to maintain. Well-intentioned developers may misconfigure resources or expose sensitive systems, and without proper methods to manage drift or misconfiguration, this can pose real risks to organizations and systems. Another important aspect that increases with less oversight is intentional insider risk.
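
One common countermeasure to this kind of sprawl is a periodic scan for resources nobody claims. The sketch below lists EC2 instances missing "owner" and "purpose" tags; it assumes AWS credentials are configured, and the required tag keys are an illustrative policy rather than anything prescribed here. Similar checks can cover other resource types or be enforced in the IaC pipeline itself.

```python
import boto3

# Tag keys every resource is expected to carry (assumed governance policy).
REQUIRED_TAGS = {"owner", "purpose"}

def untagged_instances(region: str = "us-east-1") -> list[str]:
    """Return IDs of EC2 instances that do not carry all required tags."""
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"].lower() for t in instance.get("Tags", [])}
                if not REQUIRED_TAGS.issubset(tags):
                    flagged.append(instance["InstanceId"])
    return flagged

if __name__ == "__main__":
    print("Instances missing required tags:", untagged_instances())
```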


How Observability Is Different for Web3 Apps

Many blockchain networks impose a fee for every transaction relayed over the network and successfully written to the blockchain. On the Ethereum network, for example, this fee is known as gas. As a result, it is critical that you not only monitor the functionality of your Web3 dApp but also pay close attention to its economic efficiency. Unnecessarily large or overly frequent transactions increase the cost of running your Web3 dApp. ... Decentralized applications rely heavily on smart contracts. A smart contract refers to a self-executing program deployed on a blockchain and executed by the nodes that run the network. Web3 dApps depend upon smart contracts for their operations. They serve as the “backend logic” of the dApp, running on the “server” (blockchain network). The operations executed by a smart contract often incur transaction fees. These fees are used to compensate the nodes that run the blockchain network for the computational power they provide to run the smart contract code. Additionally, smart contracts often handle sensitive operations like releasing or receiving funds in the form of cryptocurrency.
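
As a small illustration of tracking economic efficiency, the sketch below uses web3.py to pull a transaction receipt and compute what that transaction cost in ETH. The RPC endpoint and transaction hash are placeholders, not real values; an observability job would aggregate these figures per contract or per user flow and alert when a release makes transactions unexpectedly expensive.

```python
from web3 import Web3

# Assumed RPC endpoint and placeholder transaction hash, for illustration only.
w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))
tx_hash = "0x" + "ab" * 32

receipt = w3.eth.get_transaction_receipt(tx_hash)
gas_used = receipt["gasUsed"]
gas_price_wei = receipt["effectiveGasPrice"]

# Fee in ETH = gas consumed * price paid per unit of gas, converted from wei.
fee_eth = gas_used * gas_price_wei / 1e18
print(f"tx {tx_hash[:10]}... consumed {gas_used} gas, costing {fee_eth:.6f} ETH")
```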


10 Cloud Security Best Practices 2024: Expert Advice

Digital supply chain security must be at the top of every company’s agenda as organizations increasingly work with third and fourth parties to drive innovation, said Nataraj Nagaratnam, IBM Fellow and CTO for Cloud Security at IBM. Modern enterprises require a vast array of hybrid and multi-cloud environments to support data storage and applications, he said. While industry cloud platforms with built-in security and controls are already helping enterprises within regulated industries de-risk the digital supply chain, including protecting banks and the vendors they transact with, organizations will need to continue to be diligent. Cloud security services can help reduce risk and enhance the compliance of cloud environments. He told Techopedia: “Enterprises must take a holistic approach to their hybrid cloud cybersecurity strategies by adopting risk management solutions that can help them gain visibility into third- and fourth-party risk posture while achieving continuous compliance.” Enterprise technology analyst David Linthicum added that it’s important for companies to vet and monitor third-party cloud service providers to ensure they meet security standards and align with the organizations’ requirements.


Data Governance Coaching: A Newcomer's Journey As A Data Manager

Companies are increasingly recognizing the importance of reliable data for informed decision-making. At the heart of this transformation are individuals like me, new data managers tasked with overseeing specific data domains within the enterprise. The foundational element of this data-driven shift lies in the role concept, a framework that identifies and nominates data managers based on their skills, knowledge, and passion for data. Despite their different expertise and company affiliations, this group has a common goal – to ensure high-quality data within their respective responsibility areas. Tackling an initial use case within our data domain is crucial to embarking on this journey successfully. ... The narrative of a data manager’s journey in a forward-thinking company emphasizes continuous growth through data governance coaching. A comprehensive approach, including training, use case implementation, and ongoing support, successfully operationalizes data managers. Past insights stress the importance of the close link between business processes and data management, the seamless identification of data managers, the operational-level conceptualization, and the recognition of varied data domains.


Building a Sustainable Data Ecosystem

While data sharing is essential for advancing generative AI technology, it also presents significant challenges, particularly regarding privacy, security, and ethical use of data. As generative AI models become increasingly sophisticated, concerns about potential misuse, unauthorized access, and infringement of individual rights have grown. Developing sustainable policy frameworks is crucial to address these challenges and ensure that generative AI technology is deployed responsibly and ethically. Effective policies can establish guidelines and standards for data-sharing practices, promote transparency and accountability, and mitigate risks associated with privacy violations and misuse of generated content. Moreover, robust policy frameworks can foster stakeholder trust, encourage collaboration, and contribute to generative AI technology's long-term sustainability and advancement. Generative AI is a subset of artificial intelligence focused on creating new content that mimics or resembles human-generated content, such as images, text, or sound. This is achieved through machine learning techniques, including deep learning algorithms such as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and transformers.


Why Are There Fewer Women Than Men in Cybersecurity?

The tech industry, including cybersecurity, has been rightly criticized for its "bro culture," which can be unwelcoming and even hostile to women. This culture is characterized by practices and attitudes that devalue women's contributions, overlook them for promotions and challenging projects, and subject them to harassment and discrimination. The recent surge in employee population growth from other cultures, many of which are used to the devaluation of women outside of the workforce, doesn’t translate well or do anything reformative. Such an environment not only discourages women from remaining in the field but also dissuades others from entering it. The underrepresentation of women in cybersecurity is also self-perpetuating due to the lack of visible female role models in the field. Women considering a career in cybersecurity often find few examples of successful female professionals to inspire them. This lack of visibility contributes to the misconception that cybersecurity is not a viable or welcoming career path for women. The absence of female mentors and role models means that aspiring women in cybersecurity lack guidance, support and networking opportunities that are crucial for career development and advancement in any and all fields.


Answers for the IT Skills Gap

One effective strategy is to deploy autonomous automation in your enterprise storage infrastructure, reducing complexity and thereby decreasing the dependence on specialized IT skills that are becoming harder to find. With the power of autonomous automation, an admin can manage petabytes of storage easily and cost effectively. ... A complementary strategy is to automate the technical support process through Artificial Intelligence for IT Operations (AIOps). AIOps supports scalable, multi-petabyte storage-as-a-service (STaaS) solutions, enabling enterprises to simplify and centralize IT operations and improve cost management. ... A third strategy for closing the gap is storage consolidation. We have a $20 billion enterprise customer that went from 27 storage arrays from three different vendors to only four arrays. A Fortune 100 customer dramatically reduced their storage infrastructure, going from 450 floor tiles to only 50 floor tiles running all the same applications and workloads. This consolidation had many benefits, but one of the key ones was reducing the need for IT manpower. You don’t need such high-level skills with years of experience when the need for IT resources has been streamlined.


6 CISO Takeaways From the NSA's Zero-Trust Guidance

After tackling any other fundamental pillars, companies should look to kick off their foray into the Network and Environment pillar by segmenting their networks — perhaps broadly at first, but with increasing granularity. Major functional areas include business-to-business (B2B) segments, consumer-facing (B2C) segments, operational technology such as IoT, point-of-sale networks, and development networks. After segmenting the network at a high level, companies should aim to further refine the segments, Rubrik's Mestrovich says. "If you can define these functional areas of operation, then you can begin to segment the network so that authenticated entities in any one of these areas don't have access without going through additional authentication exercises to any other areas," he says. "In many regards, you will find that it is highly likely that users, devices, and workloads that operate in one area don't actually need any rights to operate or resources in other areas." Zero-trust networking requires companies to have the ability to quickly react to potential attacks, making software-defined networking (SDN) a key approach not only to pursuing microsegmentation but also to locking down the network during a potential compromise.
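
A deliberately simplified sketch of that segmentation idea follows: segments are modelled as named zones, cross-segment access is denied by default, and even explicitly allowed paths require additional authentication. The segment names and allowed pairs are assumptions for illustration; in practice such a policy would live in SDN controllers or firewall rule sets rather than application code.

```python
# Illustrative zero-trust segment policy: deny by default, and require
# additional authentication even on explicitly allowed cross-segment paths.
SEGMENTS = {"b2b", "b2c", "iot-ot", "pos", "dev"}
ALLOWED_WITH_REAUTH = {("dev", "b2b"), ("b2c", "pos")}  # assumed business needs

def evaluate_access(source: str, destination: str) -> str:
    if source not in SEGMENTS or destination not in SEGMENTS:
        return "deny (unknown segment)"
    if source == destination:
        return "allow (same segment, still subject to identity checks)"
    if (source, destination) in ALLOWED_WITH_REAUTH:
        return "allow after additional authentication"
    return "deny (no cross-segment right defined)"

for path in [("dev", "b2b"), ("iot-ot", "pos"), ("b2c", "b2c")]:
    print(path, "->", evaluate_access(*path))
```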


The Role of Enterprise Architecture in Business Transformation

In the context of strategy management, tools such as strategic roadmaps and business model canvases can support planning and communicating the business objectives of your organization. To put the strategy into execution, businesses need to organize their resources – people, process, information and technologies – into a composable set of capabilities. These are usually documented in the form of a business capability map. To provide an overview of the available and required resources, portfolios such as process portfolio, application portfolio management, data catalogue and technology radar need to be in place. One or more capabilities are described in operating models. Here, organizations define how the elements of the portfolio are connected to realize those capabilities. By analysing capability maturity, data quality, and technology fitness, strategic gaps are identified and roadmaps for implementation and transformation are specified to close these gaps. ... EA can serve many initiatives and therefore many stakeholders in your organization. However, no matter how convenient and simple EA can be, we cannot expect everyone to be familiar with every aspect of EA, nor with the modeling languages that are used to implement it.



Quote for the day:

"Leadership means forming a team and working toward common objectives that are tied to time, metrics, and resources." -- Russel Honore