Daily Tech Digest - June 26, 2022

Only 3% of Open Source Software Bugs Are Actually Attackable, Researchers Say

Determining what's attackable means looking beyond the presence of open source dependencies with known vulnerabilities and examining how they're actually being used, says Manish Gupta, CEO of ShiftLeft. "There are many tools out there that can easily find and report on these vulnerabilities. However, there is a lot of noise in these findings," Gupta says. ... The idea of analyzing for attackability also involves assessing additional factors: whether the package that contains the CVE is loaded by the application, whether it is in use by the application, whether the package is in an attacker-controlled path, and whether it is reachable via data flows. In essence, it means taking a simplified threat modeling approach to open source vulnerabilities, with the goal of drastically cutting down on the fire drills. CISOs have already become all too familiar with these drills. When a new high-profile supply chain vulnerability like Log4Shell or Spring4Shell hits the industry back channels, then blows up into media headlines, their teams are called on to pull long days and nights figuring out where these flaws impact their application portfolios, and even longer hours applying fixes and mitigations to minimize risk exposure.
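
A minimal sketch of that triage, assuming a hypothetical scanner-output format (the Finding fields and example entries below are illustrative, not ShiftLeft's actual data model): a reported CVE is kept only if it clears all four attackability checks.

```typescript
// Hypothetical data model: one record per CVE finding reported by a dependency scanner.
interface Finding {
  cve: string;
  pkg: string;
  loadedByApp: boolean;              // is the package ever loaded at runtime?
  usedByApp: boolean;                // does application code actually call into it?
  onAttackerControlledPath: boolean; // can untrusted input reach the vulnerable code path?
  reachableViaDataFlow: boolean;     // does analysis show a data-flow path to the flaw?
}

// Keep only findings that satisfy all of the attackability criteria described above,
// so teams can prioritize the small slice of genuinely exploitable issues.
function attackable(findings: Finding[]): Finding[] {
  return findings.filter(
    (f) =>
      f.loadedByApp &&
      f.usedByApp &&
      f.onAttackerControlledPath &&
      f.reachableViaDataFlow
  );
}

// Usage: of these two findings, only the first survives the triage.
const report: Finding[] = [
  { cve: "CVE-2021-44228", pkg: "log4j-core", loadedByApp: true, usedByApp: true, onAttackerControlledPath: true, reachableViaDataFlow: true },
  { cve: "CVE-2020-0001", pkg: "unused-lib", loadedByApp: false, usedByApp: false, onAttackerControlledPath: false, reachableViaDataFlow: false },
];
console.log(attackable(report).map((f) => f.cve)); // ["CVE-2021-44228"]
```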


The Power and Pitfalls of AI for US Intelligence

Depending on the presence or absence of bias and noise within massive data sets, especially in more pragmatic, real-world applications, predictive analysis has sometimes been described as “astrology for computer science.” But the same might be said of analysis performed by humans. A scholar on the subject, Stephen Marrin, writes that intelligence analysis as a discipline by humans is “merely a craft masquerading as a profession.” Analysts in the US intelligence community are trained to use structured analytic techniques, or SATs, to make them aware of their own cognitive biases, assumptions, and reasoning. SATs—which use strategies that run the gamut from checklists to matrixes that test assumptions or predict alternative futures—externalize the thinking or reasoning used to support intelligence judgments, which is especially important given the fact that in the secret competition between nation-states not all facts are known or knowable. But even SATs, when employed by humans, have come under scrutiny by experts like Chang, specifically for the lack of scientific testing that can evidence an SAT’s efficacy or logical validity.


Data Modeling and Data Models: Not Just for Database Design

The prevailing application-centric mindset has caused the fundamental problems that we have today, Bradley said, with multiple disparate copies of the same concept in system after system after system after system. Unless we replace that mindset with one that is more data-focused, the situation will continue to propagate, he said. ... Models have a wide variety of applicable uses and can present different levels of detail based on the intended user and context. Similarly, a map is a model that can be used like models are used in a business. As with data models, there are different levels of maps for different audiences and different purposes. A map of the counties in an election will provide a different view than a street map used for finding an address. A construction team needs a different type of detail on a map they use to connect a building to city water, and a lesson about different countries on a globe uses still another level of detail targeted to a different type of user. Similarly, some models are more focused on communication and others are used for implementation.


Microverse IDE Unveiled for Web3 Developers, Metaverse Projects

"With Microverse IDE, developers and designers collaboratively build low-latency, high-performance multiuser Microverse spaces and worlds which can then be published anywhere," the company said in a June 21 news release. As part of its Multiverse democratization effort, Croquet has open sourced its Microverse IDE Metaverse world builder and some related components under the Apache License Version 2.0 license so developers and adopters can examine, use and modify the software as needed. ... The California-based Croquet also announced the availability of its multiplane portal technology, used to securely connect independent 3D virtual worlds developed by different parties, effectively creating the Metaverse from independent microservices. These connections can even span different domains, the company said, thus providing safe, secure and decentralized interoperability among various worlds independent of the large technology platforms. "Multiplane portals solve a fundamental problem in the Metaverse with linking web-based worlds in a secure and safe way," the company said.


5 Firewall Best Practices Every Business Should Implement

Changes that impact your IT infrastructure happen every single day. You might install new applications, deploy additional network equipment, grow your user base, adopt non-traditional work practices, etc. As all this happens, your IT infrastructure’s attack surface will also evolve. Sure, you can make your firewall evolve with it. However, making changes to your firewall isn’t something you should take lightly. A simple mistake can take some services offline and disrupt critical business processes. You could also inadvertently expose ports to external access and compromise their security. Before you apply changes to your firewall, you need to have a change management plan. The plan should specify the changes you intend to implement and what you hope to achieve. ... A poorly configured firewall can be worse than having no firewall at all, because it gives you a false sense of security. The same is true of firewalls deployed without proper planning or routine audits. However, many businesses are prone to these missteps, resulting in weak network security and a failed investment.


Debate over AI sentience marks a watershed moment

While it is objectively true that large language models such as LaMDA, GPT-3 and others are built on statistical pattern matching, subjectively this appears like self-awareness. Such self-awareness is thought to be a characteristic of artificial general intelligence (AGI). Well beyond the mostly narrow AI systems that exist today, AGI applications are supposed to replicate human consciousness and cognitive abilities. Even in the face of remarkable AI advances of the last couple of years there remains a wide divergence of opinion between those who believe AGI is only possible in the distant future and others who think this might be just around the corner. DeepMind researcher Nando de Freitas is in this latter camp. Having worked to develop the recently released Gato neural network, he believes Gato is effectively an AGI demonstration, only lacking in the sophistication and scale that can be achieved through further model refinement and additional computing power. The deep learning transformer model is described as a “generalist agent” that performs over 600 distinct tasks with varying modalities, observations and action specifications. 


Data Architecture Challenges

Most traditional businesses preserved data privacy by holding function-specific data in departmental silos. In that scenario, data used by one department was not available or accessible by another department. However, that caused a serious problem in the advanced analytics world, where 360-degree customer data or enterprise marketing data are everyday necessities. Companies, irrespective of their size, type, or nature of business, soon realized that to succeed in the digital age, data had to be accessible and shareable. Then came data science, artificial intelligence (AI), and a host of related technologies that transformed businesses overnight. Today, an average business is data-centric, data-driven, and data-powered. Data is thought of as the new currency in the global economy. In this globally competitive business world, data in every form is traded and sold. For example, 360-degree customer data, global sales data, health care data, and insurance history data are all available with a few keystrokes. A modern Data Architecture is designed to “eliminate data silos, combining data from all corners of the company along with external data sources.”


One in every 13 incidents blamed on API insecurity – report

Lebin Cheng, vice president of API security at Imperva, commented: “The growing security risks associated with APIs correlate with the proliferation of APIs, combined with the lack of visibility that organizations have into these ecosystems. At the same time, since every API is unique, every incident will have a different attack pattern. A traditional approach to security where one simple patch addresses all vulnerabilities doesn’t work with APIs.” Cheng added: “The proliferation of APIs, combined with the lack of visibility into these ecosystems, creates opportunities for massive, and costly, data leakage.” ... By the same metric, professional services are also highly exposed to API-related problems (10-15%), while manufacturing, transportation, and utilities (4-6% each) are in the mid-range. Industries such as healthcare have less than 1% of security incidents attributable to API-related security problems. Many organizations are failing to protect their APIs because it requires equal participation from the security and development teams, which historically have been somewhat at odds.


What Are Deep Learning Embedded Systems And Their Benefits?

Deep learning is a hot topic in machine learning, with many companies looking to implement it in their products. Here are some benefits that deep learning embedded systems can offer: Increased Efficiency and Performance: Deep learning algorithms are incredibly efficient, meaning they can achieve high levels of performance even when running on small devices. This means that deep learning embedded systems can be used to improve the performance of existing devices and platforms or to create new devices that are powerful and efficient. Reduced Size and Weight: Deep learning algorithms are often very compact and can be implemented on small devices without sacrificing too much performance or capability. This reduces the device’s size and weight, making it more portable and easier to use. Greater Flexibility: Deep learning algorithms can often exploit complex data sets to improve performance. This means deep learning embedded systems can be configured to work with various data sets and applications, giving them greater flexibility and adaptability.


State-Backed Hackers Using Ransomware as a Decoy for Cyber Espionage Attacks

The activity cluster, attributed to a hacking group dubbed Bronze Starlight by Secureworks, involves the deployment of post-intrusion ransomware such as LockFile, Atom Silo, Rook, Night Sky, Pandora, and LockBit 2.0. "The ransomware could distract incident responders from identifying the threat actors' true intent and reduce the likelihood of attributing the malicious activity to a government-sponsored Chinese threat group," the researchers said in a new report. "In each case, the ransomware targets a small number of victims over a relatively brief period of time before it ceases operations, apparently permanently." Bronze Starlight, active since mid-2021, is also tracked by Microsoft under the emerging threat cluster moniker DEV-0401, with the tech giant emphasizing its involvement in all stages of the ransomware attack cycle right from initial access to the payload deployment. ... The key victims encompass pharmaceutical companies in Brazil and the U.S., a U.S.-based media organization with offices in China and Hong Kong, electronic component designers and manufacturers in Lithuania and Japan, a law firm in the U.S., and an aerospace and defense division of an Indian conglomerate.



Quote for the day:

"Leadership has a harder job to do than just choose sides. It must bring sides together." -- Jesse Jackson

Daily Tech Digest - June 25, 2022

What Are CI And CD In DevOps And How Do They Work?

The purpose of continuous delivery is to put a packaged artifact into production. The whole delivery process, including deployment, is automated with CD. CD tasks may involve provisioning infrastructure, tracking changes (ticketing), deploying artifacts, verifying and tracking those changes, and ensuring that these changes are halted if any problems arise. Some firms use certain parts of continuous delivery to help maintain their operational duties. A good example is employing a CD pipeline to handle infrastructure deployment. Some organizations will leverage their CD pipelines to coordinate infrastructure setup and configuration using configuration management automation tools such as Ansible, Chef, or Puppet. A CI/CD pipeline may appear to be overhead, but it is not. It is essentially an executable definition of the procedures that any developer must take in order to deliver a new version of a software product. Without an automated pipeline, developers would have to complete these steps manually, which would be significantly less productive.
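
In practice these stages are defined in a CI/CD service's own configuration (Jenkins, GitLab CI, GitHub Actions, and the like). As a rough sketch of the sequencing and fail-fast behaviour only, the TypeScript below runs placeholder commands for build, test, infrastructure provisioning, and deployment, halting the moment any stage fails.

```typescript
import { execSync } from "node:child_process";

// Hypothetical pipeline stages: each step must succeed before the next runs,
// so a broken change never reaches production. The commands are placeholders.
const stages: { name: string; command: string }[] = [
  { name: "build", command: "npm run build" },               // package the artifact
  { name: "test", command: "npm test" },                     // verify the change
  { name: "provision", command: "ansible-playbook site.yml" }, // example infrastructure step
  { name: "deploy", command: "npm run deploy:staging" },     // ship the packaged build
];

for (const stage of stages) {
  try {
    console.log(`--- ${stage.name} ---`);
    execSync(stage.command, { stdio: "inherit" });
  } catch {
    // A failed stage halts the pipeline and nothing further is delivered.
    console.error(`Stage "${stage.name}" failed; aborting delivery.`);
    process.exit(1);
  }
}
console.log("All stages passed: the change is ready to go to production.");
```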


Why You Need to Be an Influencer Brand and the 3 Rs of Becoming One

Of course, brands creating content has been around for decades. Content marketing is creating and distributing valuable, relevant and consistent content to attract/retain an audience, driving profitable action. The difference is that influencer brands have shifted their entire orientation to a consumer-centric integrated marketing communications (IMC) mindset. Influencer brands go beyond blogs, infographics, eBooks, testimonials, and how-to guides that appeal to the head. They have learned to appeal to the heart of their audience. This comes from seeing the world from the target's perspective, a shift that can be seen in brands following the three Rs of influence to direct their content creation. For example, the focus of Yeti Coolers' content and engagement isn't selling coolers; it is selling a lifestyle that the coolers help enable. They organize products so customers can shop by activity. Images and copy lead with stories of the adventures their audience can have with the gear — fishing, hunting, camping, by the coast, in the snow, on the ranch and in the rodeo arena.


3 certification tips for IT leaders looking to get ahead

If leveraged properly, certifications can also assist IT decision-makers in their key leadership responsibilities. For example, Puneesh Lamba, CIO of Shahi Exports, an apparel manufacturing company, acknowledges that “certifications have helped him perform better in board meetings, thereby making it easier to get approvals on IT spending.” “Typically, CIOs from large technology companies have strong IT skills but poor communications skills, while it’s just the opposite for CIOs in customer facing B2C companies. These technology leaders need to get certified in areas that they lack. While CIOs push their team to get certified, they need to come out of their comfort zones and follow suit,” says Chandra. But the benefits of certifications won’t accrue automatically. IT leaders seeking to advance their skills and careers need to build a strategy aimed at squeezing the maximum value out of what certifications can offer. Here, four CIOs share their experiences in pursuing certifications and offer advice on how to make the most of these valuable career advancement tools as an IT leader.


Magnetic superstructures as a promising material for 6G technology

The race to realize sixth generation (6G) wireless communication systems requires the development of suitable magnetic materials. Scientists from Osaka Metropolitan University and their colleagues have detected an unprecedented collective resonance at high frequencies in a magnetic superstructure called a chiral spin soliton lattice (CSL), revealing CSL-hosting chiral helimagnets as a promising material for 6G technology. The study was published in Physical Review Letters. Future communication technologies require expanding the frequency band from the current few gigahertz (GHz) to over 100 GHz. Such high frequencies are not yet possible, given that existing magnetic materials used in communication equipment can only resonate and absorb microwaves up to approximately 70 GHz with a practical-strength magnetic field. Addressing this gap in knowledge and technology, the research team led by Professor Yoshihiko Togawa from Osaka Metropolitan University delved into the helicoidal spin superstructure CSL.


Don’t fall into the personal brand trap

While you can try to emulate the positive qualities of branding, the truth is that rulebook wasn’t designed with you in mind. Brands are static creations, while you must be a dynamic participant in your life and career. Brands let the consensus of others dictate their values and meaning, while you must discover both for yourself. Brands chase consistency by reorienting to match the expectations of “consumers,” while you must preserve room to grow and develop without a sense of self-fraudulence. Take the personal-branding prescription too far, and you run the risk of cementing your identity to the brand. New passions are unexplored. Fears and struggles must be ignored over concerns of not being “on brand.” And your life endeavors are filtered through the lens of marketability rather than the pursuit of their intrinsic worth. All of which can be counterproductive to your sense of authenticity. As one meta-analysis found, authenticity had a positive relationship with both well-being and engagement. But to achieve that, you must meet yourself as you are today, not who you were 10 years ago when you settled on your personal brand.


Is NextJS a Better Choice Than React in 2022?

If you know React, you kind of know NextJS. This is because Next is a React framework. You have components just like in React. CSS has a different naming convention, but that's the biggest change. The reason Next is so good is that it gives you options. If you want a page to have good SEO, you can use getServerSideProps. If you want to use CSR, you can use useEffect to call your APIs, like in React. Adding TypeScript to your Next project is also very simple. You even have a built-in router and don't have to use React Router. The option to choose between CSR, SSR, and SSG is what makes Next the best. You even get a free tier on Vercel for your Next project. Now that you're convinced that you should use Next.js, you might wonder how to change your existing website to Next. Next.js is designed for gradual adoption. Migrating from React to Next is pretty straightforward and can be done slowly by gradually adding more pages. You can configure your server so that everything under a specific subpath points to the Next.js app. If your site is abc.com, you can configure abc.com/about to serve a Next.js app. This has been explained really well in the Next.js docs.
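
A minimal sketch of the two options mentioned above, assuming a hypothetical page path, endpoint, and data shape: getServerSideProps fetches data on the server for every request (which is what helps SEO), while the plain-React pattern fetches in useEffect after the page has loaded in the browser.

```tsx
// pages/users.tsx — server-side rendering: data is fetched on each request, good for SEO.
import type { GetServerSideProps } from "next";
import { useEffect, useState } from "react";

type User = { id: number; name: string };

export const getServerSideProps: GetServerSideProps<{ users: User[] }> = async () => {
  // Hypothetical API endpoint; replace with your real data source.
  const res = await fetch("https://example.com/api/users");
  const users: User[] = await res.json();
  return { props: { users } };
};

export default function Users({ users }: { users: User[] }) {
  return <ul>{users.map((u) => <li key={u.id}>{u.name}</li>)}</ul>;
}

// Client-side rendering, as in plain React: fetch after the component mounts.
export function UsersCSR() {
  const [users, setUsers] = useState<User[]>([]);
  useEffect(() => {
    fetch("https://example.com/api/users")
      .then((res) => res.json())
      .then(setUsers);
  }, []);
  return <ul>{users.map((u) => <li key={u.id}>{u.name}</li>)}</ul>;
}
```

For SSG, the same page would swap getServerSideProps for getStaticProps, so the data is fetched once at build time instead of on every request.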


How machine learning AI is going to change gaming forever

Obviously, machine learning techniques have broad implications for almost every sector of life, but how they will intersect with gaming has potentially some of the broadest implications for Microsoft as a business. One problem the video game industry generally faces right now pertains to the gap between expectations and investment. Video games are becoming increasingly complex to make, fund, and manage, as they explode in complexity and graphical fidelity. We've seen absolutely insane Unreal Engine demos that showcase near-photorealistic scenes and graphics, but the manual labor involved in producing a full game based on some of these principles is truly palpable in terms of both time and expense. What is typically thought of as "AI" in a gaming context generally hasn't been AI in the true sense of the word. Video game non-player characters (NPCs) and enemies generally operate on a rules-based model that often has to be manually crafted by a programmer. Machine learning models are, importantly, far more fluid, able to produce their own rules within parameters and respond dynamically to new information on the fly.
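
As a rough illustration of that distinction (the observation fields, actions, and weights below are invented for this sketch), a rules-based NPC policy enumerates every branch by hand, while a learned policy scores actions from trained parameters and can generalize to situations no one explicitly coded.

```typescript
// Hypothetical NPC decision-making, contrasting the two approaches described above.
type Observation = { playerDistance: number; npcHealth: number };
type Action = "attack" | "retreat" | "patrol";

// 1. Classic rules-based "game AI": every branch is hand-crafted by a programmer.
function ruleBasedPolicy(obs: Observation): Action {
  if (obs.npcHealth < 20) return "retreat";
  if (obs.playerDistance < 10) return "attack";
  return "patrol";
}

// 2. Learned policy: behaviour comes from trained weights rather than explicit branches.
//    The numbers here are placeholders standing in for parameters produced by training.
const weights = {
  attack: { playerDistance: -0.5, npcHealth: 0.2, bias: 3 },
  retreat: { playerDistance: 0.1, npcHealth: -0.4, bias: 5 },
  patrol: { playerDistance: 0.3, npcHealth: 0.1, bias: 0 },
};

function learnedPolicy(obs: Observation): Action {
  const scored = (Object.keys(weights) as Action[]).map((a) => {
    const w = weights[a];
    return { a, score: w.playerDistance * obs.playerDistance + w.npcHealth * obs.npcHealth + w.bias };
  });
  return scored.sort((x, y) => y.score - x.score)[0].a; // pick the highest-scoring action
}

console.log(ruleBasedPolicy({ playerDistance: 5, npcHealth: 80 })); // "attack"
console.log(learnedPolicy({ playerDistance: 5, npcHealth: 80 }));
```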


Reflections about low-code data management

As more people began using the Internet, better tools and resources became available. Today, the market is full of low-code Content Management Systems (CMS) and drag-and-drop website builders (WordPress, HubSpot, Shopify, Squarespace, etc.) that make it easy to create a professional-looking website without any coding knowledge. While there are still a handful of very specific use cases where you would need to code a website from scratch, organizations realized that using a low-code CMS or drag-and-drop builder was a much better option in the vast majority of cases. This shift has led to a dramatic decrease in the amount of time and effort required to build a website. In fact, you can now create an entire website in just a few hours using these low-code tools. With every great shift comes some level of resistance. At first, web developers were skeptical of (or outright opposed to) low-code tools for the following reasons: Fear of Replacement: Developers saw these tools as a threat to their jobs. Power & Flexibility: Developers were unconvinced that they would be powerful, flexible, or customizable enough to produce the same quality of work.


Inside the Metaverse: Architects See Opportunity in a Virtual World

“The metaverse is not an escape, and it's not a video game,” Patrik Schumacher, principal at Zaha Hadid Architects (ZHA), told RECORD. “It will become the immersive internet for corporations, for education, for retail, and also for socializing and networking in more casual arenas. Everything we are doing in the real world could potentially be substituted or augmented or paralleled with interactions in the metaverse.” ZHA was one of the first major firms to take the plunge into metaverse design. In early March, the firm announced that it would build an entire metaverse city—a digital version of the unrecognized, and as yet unbuilt, sovereign state “Liberland” that was founded seven years ago by the right-wing Czech politician Vít Jedlička. “At the time, I was very frustrated with planning regulations and overbearing political constraints on city development,” says Schumacher, who has long fought against government intervention in urban development.


5 social engineering assumptions that are wrong

Users may be more inclined to interact with content if it appears to originate from a source they recognize and trust, but threat actors regularly abuse legitimate services such as cloud storage providers and content distribution networks to host and distribute malware as well as credential harvesting portals, according to Proofpoint. “Threat actors may prefer distributing malware via legitimate services due to their likelihood of bypassing security protections in email compared to malicious documents. Mitigating threats hosted on legitimate services continues to be a difficult vector to defend against as it likely involves implementation of a robust detection stack or policy-based blocking of services which might be business relevant,” the report read. ... There’s a tendency to assume that social engineering attacks are limited to email, but Proofpoint detected an increase in attacks perpetrated by threat actors leveraging a robust ecosystem of call center-based email threats involving human interaction over the telephone. “The emails themselves don’t contain malicious links or attachments, and individuals must proactively call a fake customer service number in the email to engage with the threat actor. ...”



Quote for the day:

"The ability to stay calm and polite, even when people upset you, is a superpower." -- Vala Afshar

Daily Tech Digest - June 24, 2022

Toward data dignity: Let’s find the right rules and tools for curbing the power of Big Tech

Enlightened new policies and legislation, building on blueprints like the European Union’s GDPR and California’s CCPA, are a critical start to creating a more expansive and thoughtful formulation for privacy. Lawmakers and regulators need to consult systematically with technologists and policymakers who deeply understand the issues at stake and the contours of a sustainable working system. That was one of the motivations behind the creation of the Ethical Tech Project—to gather like-minded ethical technologists, academics, and business leaders to engage in that intentional dialogue with policymakers. We are starting to see elected officials propose regulatory bodies akin to what the Ethical Tech Project was designed to do—convene tech leaders to build standards protecting users against abuse. A recently proposed federal watchdog would be a step in the right direction to usher in proactive tech regulation and start a conversation between the government and the individuals who have the know-how to find and define the common-sense privacy solutions consumers need.


For HPC Cloud The Underlying Hardware Will Always Matter

For a large contingent of those ordinary enterprise cloud users, the belief is that a major benefit of the cloud is not thinking about the underlying infrastructure. But, in fact, understanding the underlying infrastructure is critical to unleashing the value and optimal performance of a cloud deployment. Even more so, HPC application owners need in-depth insight and therefore, a trusted hardware platform with co-design and portability built in from the ground up and solidified through long-running cloud provider partnerships. ... In other words, the standard lift-and-shift approach to cloud migration is not an option. The need for blazing fast performance with complex parallel codes means fine-tuning hardware and software. That’s critical for performance and for cost optimization, says Amy Leeland, director of hyperscale cloud software and solutions at Intel. “Software in the cloud isn’t always set by default to use Intel CPU extensions or embedded accelerators for optimal performance, even though it is so important to have the right software stack and optimizations to unlock the potential of a platform, even on a public cloud,” she explains.


NSA, CISA say: Don't block PowerShell, here's what to do instead

Defenders shouldn't disable PowerShell, a scripting language, because it is a useful command-line interface for Windows that can help with forensics, incident response and automating desktop tasks, according to joint advice from the US spy service the National Security Agency (NSA), the US Cybersecurity and Infrastructure Security Agency (CISA), and the New Zealand and UK national cybersecurity centres. ... So, what should defenders do? Remove PowerShell? Block it? Or just configure it? "Cybersecurity authorities from the United States, New Zealand, and the United Kingdom recommend proper configuration and monitoring of PowerShell, as opposed to removing or disabling PowerShell entirely," the agencies say. "This will provide benefits from the security capabilities PowerShell can enable while reducing the likelihood of malicious actors using it undetected after gaining access into victim networks." PowerShell's extensibility, and the fact that it ships with Windows 10 and 11, gives attackers a means to abuse the tool. 


How companies are prioritizing infosec and compliance

“This study confirmed our long-standing theory that when security and compliance have a unified strategy and vision, every department and employee within the organization benefits, as does the business customer,” said Christopher M. Steffen, managing research director of EMA. “Most organizations view compliance and compliance-related activities as ‘the cost of business,’ something they have to do to conduct operations in certain markets. Increasingly, forward-thinking organizations are looking for ways to maximize their competitive advantage in their markets and having a best-in-class data privacy program or compliance program is something that more savvy customers are interested in, especially in organizations with a global reach. Compliance is no longer a ‘table stakes’ proposition: comprehensive compliance programs focused on data security and privacy can be the difference in very tight markets and are often a deciding factor for organizations choosing one vendor over another.”


IDC Perspective on Integration of Quantum Computing and HPC

Quantum and classical hardware vendors are working to develop quantum and quantum-inspired computing systems dedicated to solving HPC problems. For example, using a co-design approach, quantum start-up IQM is mapping quantum applications and algorithms directly to the quantum processor to develop an application-specific superconducting computer. The result is a quantum system optimized to run particular applications such as HPC workloads. In collaboration with Atos, quantum hardware start-up Pasqal is working to incorporate its neutral-atom quantum processors into HPC environments. NVIDIA’s cuQuantum Appliance and cuQuantum software development kit provide enterprises the quantum simulation hardware and developer tools needed to integrate and run quantum simulations in HPC environments. At a more global level, the European High Performance Computing Joint Undertaking (EuroHPC JU) announced its funding for the High-Performance Computer and Quantum Simulator (HPCQS) hybrid project.


Australian researchers develop a coherent quantum simulator

“What we’re doing is making the actual processor itself mimic the single carbon-carbon bonds and the double carbon-carbon bonds,” Simmons explains. “We literally engineered, with sub-nanometre precision, to try and mimic those bonds inside the silicon system. So that’s why it’s called a quantum analog simulator.” Using the atomic transistors in their machine, the researchers simulated the covalent bonds in polyacetylene. According to the SSH theory, there are two different scenarios in polyacetylene, called “topological states” – “topological” because of their different geometries. In one state, you can cut the chain at the single carbon-carbon bonds, so you have double bonds at the ends of the chain. In the other, you cut the double bonds, leaving single carbon-carbon bonds at the ends of the chain and isolating the two atoms on either end due to the longer distance in the single bonds. The two topological states show completely different behaviour when an electrical current is passed through the molecular chain. That’s the theory. “When we make the device,” Simmons says, “we see exactly that behaviour. So that’s super exciting.”


Is Kubernetes key to enabling edge workloads?

Lightweight and deployed in milliseconds, containers enable compatibility between different infrastructure environments and apps running across disparate platforms. Isolating edge workloads in containers protects them from cyber threats while microservices let developers update apps without worrying about platform-level dependencies. Benefits of orchestrating edge containers with Kubernetes include: Centralized Management — Users control the entire app deployment across on-prem, cloud, and edge environments through a single pane of glass. Accelerated Scalability — Automatic network rerouting and the capability to self-heal or replace existing nodes in case of failure remove the need for manual scaling. Simplified Deployment — Cloud-agnostic, DevOps-friendly, and deployable anywhere from VMs to bare metal environments, Kubernetes grants quick and reliable access to hybrid cloud computing. Resource Optimization — Kubernetes maximizes the use of available resources on bare metal and provides an abstraction layer on top of VMs optimizing their deployment and use.


Canada Introduces Infrastructure and Data Privacy Bills

The bill sets up a clear legal framework and details expectations for critical infrastructure operators, says Sam Andrey, a director at think tank Cybersecure Policy Exchange at Toronto Metropolitan University. The act also creates a framework for businesses and government to exchange information on the vulnerabilities, risks and incidents, Andrey says, but it does not address some other key aspects of cybersecurity. The bill should offer "greater clarity" on the transparency and oversight into what he says are "fairly sweeping powers." These powers, he says, could perhaps be monitored by the National Security and Intelligence Review Agency, an independent government watchdog. It lacks provisions to protect "good faith" researchers. "We would urge the government to consider using this law to require government agencies and critical infrastructure operators to put in place coordinated vulnerability disclosure programs, through which security researchers can disclose vulnerabilities in good faith," Andrey says.


Prioritize people during cultural transformation in 3 steps

Addressing your employees’ overall well-being is also critical. Many workers who are actively looking for a new job say they’re doing so because their mental health and well-being has been negatively impacted in their current role. Increasingly, employees are placing greater value on their well-being than on their salary and job title. This isn’t a new issue, but it’s taken on a new urgency since COVID pushed millions of workers into the remote workplace. For example, a 2019 Buffer study found that 19 percent of remote workers reported feeling lonely working from home – not surprising, since most of us were forced to severely limit our social interactions outside of work as well. Leaders can help address this by taking actions as simple as introducing more one-to-one meetings, which can boost morale. One-on-one meetings are essential to promoting ongoing feedback. When teams worked together in an office, communication was more efficient mainly because employees and managers could meet and catch up organically throughout the day.


Pathways to a Strategic Intelligence Program

Strong data visualization capabilities can also be a huge boost to the effectiveness of a strategic intelligence program because they help executive leadership, including the board, quickly understand and evaluate risk information. “There’s an overwhelming amount of data out there and so it’s crucial to be able to separate the signal from the noise,” he says. “Good data visualization tools allow you to do that in a very efficient, impactful and cost-effective manner, and to communicate information to busy senior leaders in a way that is most useful for them.” Calagna agrees that data visualization tools play an important role in bringing a strategic intelligence program to life for leaders across functions within any organization, helping them to understand complex scenarios and insights more easily than narrative and other report forms may permit. “By quickly turning high data volumes into complex analyses, data visualization tools can enable organizations to relay near real-time insights and intelligence that support better informed decision-making,” she says. Data visualization tools can help monitor trends and assumptions that impact strategic plans and market forces and shifts that will inform strategic choices.



Quote for the day:

"Patience puts a crown on the head." -- Ugandan Proverb