Daily Tech Digest - January 14, 2024

Quantum mechanics uncovers hidden patterns in the stock market

What does this mean for the stock market? It implies that higher volatility and a slower reversion to equilibrium amplify herding behavior among investors, especially during times of uncertainty and information asymmetry. The study goes further by testing this model with empirical data from the U.S. stock market. Using the growth rate of gross domestic product (GDP) and forecaster uncertainty as indicators for business cycles and economic uncertainty, respectively, they found a positive correlation between the power law exponent and the GDP growth rate, and a negative correlation with forecaster uncertainty. This confirms their theoretical predictions and highlights the role of economic uncertainty in linking business cycles with herding behavior in stock returns. ... “Our study shows that quantum mechanics can be a useful tool to understand the stock market, a complex system with many interacting agents. We hope that our study can inspire more interdisciplinary research that combines physics and finance to explore the hidden patterns and mechanisms of the stock market,” he states.


'We Never Upskill Fast Enough': NTT DATA Services CEO Bob Pryor on mastering change

It's always a challenge, and to be honest, we never upskill fast enough given the myriad of options available. However, we're heavily investing in training, development, and skilling across all levels. Retaining talent involves helping them acquire more advanced technologies and skills in high-demand disciplines. Individuals tend to find greater satisfaction in roles that require complexity over those that are simpler to master. Constantly evolving the mix of skills, technology, and labour is crucial. Take AI, for example: it doesn't eliminate labour; it enhances the efficacy of the people who work with it. In healthcare, top oncologists use advanced AI algorithms for diagnosis, medical devices, and treatment. The challenge isn't whether they are displaced by technology but whether we're scaling them fast enough to use the advanced technologies we're investing in and developing. Working effectively with AI involves having people smart enough to ask the right questions: what to create, what questions to ask, and how to interpret language models.


5 Ways To Upskill As A Leader And Gain Respect From Your Team

Leadership is about building relationships, not task lists. This year, upskill yourself by building these skills to develop a leadership style that inspires cooperation and motivation, not fear. ... Being polite shows the people around you that you respect them, and they are more likely to return the favor. It costs you nothing to be kind. A basic greeting can go a long way, as can asking about your employees’ weekends, family, etc. Remember to say please and thank you. Never interrupt when your employees are talking, and show that you respect their time, work, and ideas. ... Bossing people around doesn’t feel great long term. You know when there’s tension in your office and when people aren’t glad to see you. It’s not good for your mental health to spend nine hours a day (or more) with people who resent your presence. When you tap into your humanity to create better relationships with your employees and become a leader people enjoy working with, not only will you feel more respected as a person, but you’ll likely also enjoy the benefits of a happier workforce, such as higher productivity, better work and even higher profits.


Yes, We're Still Messing Up Hybrid Work. Here's Where Exactly We're Going Wrong.

Hybrid work environments are dynamic, and what works one day may not be effective the next. Managers must be trained to be flexible in their leadership approach, adapting to the varying needs of their team members. This adaptability also means being open to feedback and willing to continuously learn and evolve their management style. It involves understanding the unique challenges and opportunities of managing remote and in-office team members and being adept at creating a cohesive team culture that bridges the physical divide. Honing communication skills is another key focus. In a hybrid setup, clear and inclusive communication is paramount. Managers need to be adept at conveying their messages effectively across various digital platforms, ensuring that every team member, whether remote or in-office, feels equally involved and informed. ... Developing strategies for remote team building is equally important. Hybrid work models can lead to a sense of disconnection among team members.


It’s time to fix flaky tests in software development

Not only do flaky tests threaten the quality and speed of software delivery, they pose a very real threat to the happiness and satisfaction of software developers. Similar to other bottlenecks in the software development process, flaky tests take developers out of their creative flow and prevent them from doing what they love: creating software. Imagine a test passes on one run and fails on the next, with no relevant changes made to the codebase in the interim. This inconsistent behavior can create a fog of confusion, and lead developers down demoralizing rabbit holes to figure out what’s gone wrong. It’s a huge waste of time and energy. By addressing flaky tests, technology leaders can directly improve the developer experience. Instead of getting tangled up in a web of phantom problems that drain their time and energy, developers are able to spend more time on fulfilling tasks like creating new features or refining existing code. When erratic tests are eliminated, the development process runs much more smoothly, resulting in a more motivated and happier team.
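One pragmatic first step is simply measuring flakiness: rerun a suspect test many times with no code changes and see whether the verdict varies. A minimal sketch of that idea; the `flaky_test` below is a deliberately contrived, hypothetical example whose outcome depends on unseeded randomness, which is one common root cause:

```python
import random

def flaky_test():
    # Contrived flaky test: it depends on unseeded randomness, so it
    # passes only when the random sample happens to come out sorted.
    sampled = random.sample(range(10), k=3)
    return sampled == sorted(sampled)

def detect_flakiness(test, runs=200):
    """Rerun a zero-argument test with no code changes in between.
    Observing both True and False outcomes means the test is flaky."""
    outcomes = {test() for _ in range(runs)}
    return len(outcomes) > 1
```

Once a test is flagged this way, it can be quarantined or fixed (for example, by seeding the randomness) instead of silently eroding trust in the suite.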


Building Cybersecurity Resilience With the Power of Habit

Clear's principles and philosophy, advocating for small yet consistent changes, should resonate deeply with cyberprofessionals. These principles, while not originally intended for use in the cybersecurity realm, can be creatively applied to construct a robust framework for a resilient cybersecurity culture. Clear's principles can be adapted to the cultivation of cybersecurity habits. ... The journey can begin with the fundamentals, for example, the management of cloud access rights. This involves regularly reviewing who has access to what information or resources and why, revoking access rights when an employee changes roles or leaves the organization, and implementing the principle of least privilege, wherein users are given the minimum levels of access necessary to perform their jobs. These minor changes, when consistently applied, can become the building blocks of an enterprise’s cybersecurity framework. The cumulative effect of such microchanges can be surprising. 
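The access-review habit described above is straightforward to automate. A minimal sketch, assuming a simple role-to-permissions baseline; all role and permission names here are hypothetical:

```python
# Hypothetical least-privilege baseline: the minimum permissions each role needs.
ROLE_BASELINE = {
    "analyst": {"storage:read"},
    "engineer": {"storage:read", "storage:write"},
}

def review_access(user_grants, user_roles):
    """Periodic access review: return, per user, any granted permissions
    that exceed the baseline for their current role. A user with no role
    (e.g. someone who has left) has every grant flagged for revocation."""
    excess = {}
    for user, grants in user_grants.items():
        allowed = ROLE_BASELINE.get(user_roles.get(user), set())
        extra = set(grants) - allowed
        if extra:
            excess[user] = extra
    return excess
```

Run on a schedule, a check like this turns "review who has access to what and why" from a good intention into a small, repeatable habit.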


Customer Experience Is King, but CIOs Could Do More to Help

The very nature of how customer experience projects get defined and shepherded places IT at the back of the room, as an executor of tasks but not as a strategic leader. Is this bad? Not necessarily, considering that the end business units interacting with the customer ostensibly have expertise in dealing with customers, and are in the best position to know what customers want. However, as technology becomes a more integral element of the selling, informing, fulfillment and servicing of customers, there also is unique expertise that IT brings to the table. It can be invaluable in improving the customer experience, and it can also avert disaster. Being able to sell non-stop, 24/7 to worldwide customers is a major driver of e-commerce, as is the ability to provide customers with self-service options that can reduce internal operational costs for companies. Analytics, which can assess an individual customer's or a demographic's buying habits and anticipate what customers will want to buy next, is also seen as beneficial. 


Leveraging Chaos Engineering To Test The Resilience Of Distributed Computing Systems

It helps build the resilience of distributed computing systems and improves their ability to withstand unexpected disruptions. Read on to know how. Chaos engineering applies chaos theory to achieve this: it introduces random and unexpected behavior in a controlled manner to identify system weaknesses. How does it benefit organizations? By enabling them to surface weaknesses before they cause real failures. As a result, an organization can proactively adopt measures to plug potential vulnerabilities and improve system stability. ... The concept might look similar to stress testing, but the two are not the same. There are some key differences. For one, chaos engineering uses controlled randomness to proactively identify system or network issues and correct them. It also tests and corrects all components at the same time, and practitioners tend to look beyond possible causes and obvious issues. 
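In practice, chaos experiments often start with simple fault injection: make a dependency fail randomly but reproducibly, then verify that the caller's resilience mechanisms cope. A minimal sketch with hypothetical service names; the fixed seed makes the "chaos" repeatable, which is the controlled part of the experiment:

```python
import random

class InjectedFault(Exception):
    """Raised by the chaos wrapper instead of a real infrastructure error."""

def chaos(failure_rate, rng=random.Random(42)):
    """Decorator that makes a call fail at the given rate, reproducibly,
    so retry/fallback logic can be exercised in a controlled way."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if rng.random() < failure_rate:
                raise InjectedFault(f"injected failure in {fn.__name__}")
            return fn(*args, **kwargs)
        return inner
    return wrap

@chaos(failure_rate=0.3)
def fetch_inventory():
    # Hypothetical downstream service call.
    return {"sku-1": 12}

def fetch_with_retries(attempts=5):
    """The resilience mechanism under test: retry on injected faults."""
    for _ in range(attempts):
        try:
            return fetch_inventory()
        except InjectedFault:
            continue
    raise RuntimeError("service unavailable after retries")
```

If the retry loop is removed, the experiment eventually fails, which is exactly the weakness chaos engineering is meant to surface before production traffic does.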


Neither ‘Agile’ nor Architecture are Going Anywhere

Want to move the enterprise to little-a or big-A agile? Want to modernize the technology stack? Implement flex points in subsystems? Integration effectiveness? Harness information for outcomes? Deliver technology services? Event-Driven Architecture? Customer-Centric Design? Manage cross-system compatibility and quality attributes? Handle mergers and acquisitions well? Project/team thinking does not account for these outcomes. The product owner doesn't understand them, and the development lead is focused on speed, simplicity and delivery; they may not understand them either. Architecture connects big outcomes to little decisions. I have seen huge objectives brought low by simple development decisions. ... From the board room to the basement. From idea to outcome. In between operating responsibilities. In between competing business objectives. With partners. With vendors. With an ever-changing technology adoption cycle. From finance to legal to customer impacts, it takes a LOT of facilitation, discussion, decision making and prioritization to deliver a balanced, advantageous technology strategy. 


Demystifying Cloud Trends: Statistics and Strategies for Robust Security

The Shared Responsibility Model is a security and compliance framework that defines the responsibilities of cloud service providers (CSPs) and cloud customers for securing every aspect of the cloud environment, including hardware, infrastructure, endpoints, data, configurations, settings, operating system (OS), network controls and access rights. In basic terms, this model helps clarify who is responsible for securing various aspects of the cloud infrastructure, services, and data. The division of responsibilities varies depending on the cloud deployment model. ... Implementing strong IAM practices enforcing the principle of least privilege to restrict access rights for users and systems and regularly reviewing and updating access permissions can have a major positive impact on an organization’s cloud security posture. It’s as simple as granting users and other cloud resources access only to the resources they require, and only to the extent required. Multi-factor authentication (MFA) adds an additional layer of security, ensuring that only authorized users have access to resources and data. 
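As a concrete illustration of the MFA layer, the rotating six-digit codes produced by authenticator apps are typically time-based one-time passwords (TOTP, RFC 6238), derived from a shared secret and the current time. A compact sketch using only the Python standard library; this is for illustration of the mechanism, and production systems should rely on a vetted library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238) over HMAC-SHA1.
    `secret` is the raw shared key; `at` is a Unix timestamp
    (defaults to now); codes rotate every `step` seconds."""
    counter = int(at if at is not None else time.time()) // step
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides derive the code from the same secret and clock, a stolen password alone is not enough: the attacker would also need the secret that seeds this rolling code.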



Quote for the day:

"We become what we think about most of the time, and that's the strangest secret." -- Earl Nightingale

Daily Tech Digest - January 13, 2024

Frenemies to friends: Developers and security tools

Cultural shifts happen when security is built into the developer’s existing flow, as opposed to being injected as its own new stage in the pipeline. Look for points in their process where they are already in “pause” or “edit” mode, like at the Pull Request, where you can surface vulnerabilities and ask for remediation efforts. Doing so can avoid context switching and feelings of being interrupted. Capitalizing on an existing developer pause point can help train your developers to look at security vulnerabilities like functionality bugs, a skill they already have, while also shortening feedback loops. ... Developer-to-developer enablement is key. There is often a feeling of mistrust between engineering and security, but developers share the same interests and have the same priorities. Let individual contributors have an opportunity to educate and enable other individual contributors. If you have had a successful pilot or PoC team, or notice self-motivated folks using the tool proactively, give them space to share their experience with the tool. 
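A sketch of what a pull-request-time check might look like: scan the changed dependency pins against an advisory list and emit review comments phrased like ordinary bug reports, so the finding lands inside the developer's existing pause point. The advisory data and version pins below are entirely hypothetical:

```python
# Hypothetical advisory feed: package name -> versions with known vulnerabilities.
ADVISORIES = {
    "requests": {"2.19.0"},
    "pyyaml": {"5.3"},
}

def parse_requirements(text):
    """Parse `name==version` pins from a requirements-style file,
    ignoring comments and unpinned entries."""
    deps = {}
    for line in text.splitlines():
        line = line.split("#")[0].strip()
        if "==" in line:
            name, version = line.split("==", 1)
            deps[name.strip().lower()] = version.strip()
    return deps

def pr_security_check(requirements_text):
    """Return review comments for vulnerable pins, worded like ordinary
    functionality bugs rather than security alarms."""
    comments = []
    for name, version in parse_requirements(requirements_text).items():
        if version in ADVISORIES.get(name, set()):
            comments.append(f"{name}=={version} has a known vulnerability; please bump.")
    return comments
```

Surfacing the result as a Pull Request comment keeps the feedback loop short and avoids yanking the developer into a separate security tool.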


The Joys and Pains of DevOps

DevOps is very much a culture change in the way development, operations and even security work together. Even though DevOps aims to improve this, in many cases, these areas still function in silos. There are times when one area implements something that blocks another; and as a DevOps leader, you’re often in the middle trying to figure out the best path forward while also finding an acceptable middle ground. ... A well-engineered DevOps solution should render the team invisible. That includes both the happy path, when deployments succeed, and how well you enable teams to solve their deployment issues. There is also one common element of what makes DevOps rewarding: improving developer experience and business outcomes. Dale Francis, director of product development at Climavision, says the rewards of DevOps come from solving problems, so day-to-day operations become simple and the experience for developers better. In addition, maturing as a DevOps organization also lets everyone focus more on solving business problems, rather than fighting technical issues. 


Why Engineering Is Key To A Flourishing Workplace Culture

If your engineering strategy demands precision but your workplace culture tolerates ambiguity and shortcuts, you won't get anywhere. If your engineering strategy demands accountability but your workplace culture doesn't draw connections between an individual's efforts and the higher goals of the operation, you won't get anywhere. If your engineering strategy demands innovation but your workplace culture rewards risk aversion, you won't get anywhere. ... In an arena as complex and technical as engineering, it's easy to lose sight of the human side. Whether your workplace is in-person, remote or hybrid, it's crucial to create spaces (literal or virtual) where employees feel connected and empowered to ask questions. Trust and creativity flourish in an environment where autonomy and authentic connections coexist. ... Inertia is fatal to engineering. Regularly evaluate and adopt new technologies. Find out what your customers need. Find out what hurdles they're up against. Think three steps ahead so your tech stack supports the evolving needs of your business and the market.


Life's Too Short to Work With Incompatible People

Celebrate failure and learn to give feedback. When you embrace failure, you learn and course-correct more quickly. Failure is a sign you're doing something right. You're testing, learning, flexing your creative muscles and moving on efficiently after hitting a brick wall. You must build a team open to feedback to make the most of your failures for the company's good. Feedback is the mode by which we make positive changes out of failure. The challenge? Feedback makes most people cringe. We associate it with criticism as opposed to growth. ... Clear communication may seem like an obvious necessity on high-performing teams, but it's something that's often taken for granted. Unclear communication can quickly tank a team's efforts. A team that has mastered precise communication, on the other hand, can achieve incredible outcomes quickly. We follow an "open book" mentality at Wistia. On all-hands calls, we share candid information about the state of the company – inclusive of the good and the bad – so everyone has the big picture. 


Researchers demo new CI/CD attack techniques in PyTorch supply-chain

Khan initially found a critical vulnerability that could have led to the poisoning of GitHub Actions’ official runner images. The “runners” are the VMs that execute build actions defined inside GitHub Actions workflows. After reporting the vulnerability to GitHub and receiving a $20,000 bug bounty for it, Khan realized that the core issue he found was systemic and that thousands of other repositories were likely impacted. Since then, Khan and Stawinski found vulnerabilities in the software repositories and development infrastructure of major corporations and software projects and collected hundreds of thousands of dollars in rewards through bug bounty programs. Their “victims” included Microsoft DeepSpeed, a Cloudflare application, the TensorFlow machine-learning library, the crypto wallets and nodes of several blockchains, and PyTorch, one of the most widely used open-source machine-learning frameworks. PyTorch was originally developed by Meta AI, a subsidiary of Meta, but its development is now governed by the PyTorch Foundation, an independent organization that operates under the Linux Foundation’s umbrella.


For a Secure Foundation, Health Systems Must Address Technical Debt

We need to update network equipment and workstations. We may still even have Windows Server 2003 and 2008. And hardware is not as expensive as the applications that are on there. So that level of technical debt ends up competing for dollars in healthcare, where you need to have nice offices and that type of thing. So we’re competing with other projects or capital, where other organizations may think of that as just an ongoing IT update expense. ... I might hear this stuff at home occasionally, but it’s the same with IT projects. “Hey, we had an acquisition. We got them up and running. We didn’t take care of their technical debt, so we’re assuming it.” We’re going through some of those servers now, and it’s like, can we even find anybody who knows anything about it, or is everyone just afraid to turn it off? What I like to say is, if you didn’t sit around the right campfire, you don’t know the story. So for me, my job sometimes is just to keep asking those questions: “Who knows something about this server?” Sometimes it comes down to the scream test, but I’ve developed a quality I call positive persistence. I just keep asking questions politely until we make progress.


The way forward is to make technology 'human-like': Report

As the world undergoes a massive technological transformation, artificial intelligence (AI) and other disruptive technologies will increasingly adopt a more human-like or "Human by Design" approach, according to a new study published on Wednesday. As these technologies become more human-like and intuitive for people to use, they will increasingly lead to a new era of unprecedented productivity and creativity, said the report, titled 'Accenture Technology Vision 2024: Human by design, how AI unleashes the next level of human potential,' which also emphasizes that enterprises that prepare for this shift now will be the winners in the future. The research further highlights that as human-centric technologies continue to advance, they are becoming easier to interact with and more seamlessly integrated into every aspect of our lives. ... As AI, spatial computing, and body-sensing technologies evolve to imitate human capabilities and become less noticeable, the true focus will be on the people who are empowered with new capabilities to achieve what was once considered impossible.


Expert Insight: Andrew Snow on a landmark GDPR ruling

For organisations, it makes clear beyond all doubt that ignorance isn’t an excuse. In fact, if organisations – or managers within them – plead ignorance to the infringement now, they may face a higher fine than if they had taken responsibility for their actions. For regulators, an important precedent has been set. This ruling has provided them with clear direction on where the line falls when deciding on issuing administrative penalties, including fines. For instance, the EDPB [European Data Protection Board] recently reported on another case, involving the Slovak and Hungarian authorities, where there was a dispute over the ownership. The Hungarian regulator ultimately determined that both parties jointly determined the purposes of processing, so were joint controllers – and as such, breached the GDPR because their agreement failed to document this and, by extension, their respective responsibilities. Given the timing of this decision, it probably wasn’t influenced by the ECJ ruling, but I expect that future cases like this would use the ruling as a precedent.


What Are Digital Twins and How Can They Be Used in Healthcare?

Trayanova’s research is on applying personalized digital twin approaches to clinical decision-making. She aims to improve predictive diagnostics and to predict optimal treatment plans for patients. This is currently being used to treat patients with heart rhythm disorders. At Johns Hopkins, Trayanova and her team can create a personalized digital twin representing the geometry of a patient’s heart. The digital twin includes the heart’s structure; disease remodeling such as damage, fibrosis and inflammation identified through MRI or PET scans; and its electrical wave propagation. When an electrical wave propagates to the heart, it triggers a contraction. However, if a patient has scarring or other damage, the wave will catch in that area and, rather than propagating through the heart, it will recirculate and cause an arrhythmia. To treat the arrhythmia, the digital twin must accurately represent the damage as well as the electrical activity of each cell in the heart. “Now you have something that dynamically links the heart’s components,” Trayanova says. Using the digital twin, she and her team can send a signal and watch how the electrical wave propagates through the model. 
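The propagate-or-block behavior described here can be illustrated with a deliberately tiny toy model, nothing like the actual Johns Hopkins twin: a one-dimensional "cable" of excitable cells in which scar tissue never conducts. This sketch only captures conduction and block, not the re-entrant circuits that cause arrhythmia:

```python
# Cell states in a toy excitable "cable".
REST, EXCITED, REFRACTORY, SCAR = 0, 1, 2, 3

def step(cells):
    """One tick: an excited cell excites resting neighbours and then
    becomes refractory; refractory cells recover; scar never conducts."""
    nxt = list(cells)
    for i, state in enumerate(cells):
        if state == EXCITED:
            nxt[i] = REFRACTORY
            for j in (i - 1, i + 1):
                if 0 <= j < len(cells) and cells[j] == REST:
                    nxt[j] = EXCITED
        elif state == REFRACTORY:
            nxt[i] = REST
    return nxt

def wave_reaches_end(cells, ticks=50):
    """Stimulate-and-watch: does the wave arrive at the far end?"""
    for _ in range(ticks):
        if cells[-1] == EXCITED:
            return True
        cells = step(cells)
    return cells[-1] == EXCITED
```

Even at this toy scale, the same "send a signal and watch it propagate" workflow applies: healthy tissue conducts the wave end to end, while a transmural scar blocks it.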


What will the metaverse mean for business models?

In media and entertainment, the primary model of business has evolved from ownership to subscription. In the past, most people bought CDs and DVDs to build a collection – today, owning vinyl is booming in popularity again. But for the majority of people, the accepted model is accessing songs, films and TV series online and building your own virtual library. The difference is that if you stop paying the subscription, you have nothing. Will it be the same in the metaverse? We’ll have to wait and see. But it’s safe to assume that people will want ownership of their assets without paying a subscription (except for the wallet that protects them). To complicate things, there is the question of what role content from Generative AI will play in metaverse business models. Today, it’s generally accepted that no one owns work created by Generative AI. But won't this change? In fact, this assumption may even be wrong – in the UK for example, the law implies that the creators of the AI platform own anything wholly created by it. 



Quote for the day:

"Great leaders do not desire to lead but to serve." -- Myles Munroe

Daily Tech Digest - January 12, 2024

Navigating Tomorrow: Becoming an Enterprise of the Future

Preparing for what lies ahead goes far beyond just implementing the right technologies, it is about developing a culture that embraces change with empathy. Cultivating a mindset across the organisation that values innovation, continuous learning, and agility ensures that every employee charges forward with confidence. In times of economic uncertainties and technological advancements, it is crucial that we practice empathy. Naturally, there is some fear that technologies like AI will replace human workers. As such, leaders must help employees understand that technology is here to augment their roles and empower them to spend more time on other valuable tasks. The key to embracing any new technology and providing access at scale is to get everyone in the team on board. Whether greeted with excitement or anxiety, leaders must champion this culture of change by encouraging employees to seek new ways of working while ensuring they remain engaged and valued. Data-driven decision-making will undoubtedly continue to be the cornerstone of future business endeavours. 


The Importance of Enterprise Architecture in the Modern Business Landscape

The field of Enterprise Architecture is constantly evolving, driven by emerging trends and innovations. One of the significant trends is the adoption of cloud computing and hybrid IT environments. Cloud-based solutions offer scalability, flexibility, and cost-efficiency, making them increasingly popular among businesses. Enterprise Architecture helps organizations leverage these technologies by designing architectures that integrate cloud services and on-premises infrastructure, ensuring seamless operations and efficient resource utilization. Another emerging trend is the incorporation of artificial intelligence (AI) and machine learning (ML) in Enterprise Architecture practices. AI and ML technologies enable businesses to automate processes, analyze vast amounts of data, and gain valuable insights. By integrating AI and ML into their Enterprise Architecture frameworks, organizations can enhance decision-making, optimize business processes, and improve overall efficiency. Furthermore, the rise of digital transformation has had a significant impact on Enterprise Architecture. 


Top 8 challenges IT leaders will face in 2024

To guide an organization through uncertainty, IT leaders must help ensure everyone in the company is on the same page, Srivastava says. Instead of playing catch-up, he suggests a proactive approach with clear communication as a guiding principle. “It starts with establishing a clear set of agreed upon initiatives and outcomes for the organization,” he says. “We have to make sure everyone understands what they are doing, why they are doing it, and — most importantly — how success will be measured.” ... Security is a challenge that makes the list of top CIO worries perennially, but Grant McCormick, CIO of cybersecurity company Exabeam, notes a rising need for increased collaboration between IT and security teams to address the issue. “The role of the CIO has recently seen a massive convergence with cybersecurity,” says McCormick. “Regardless of whether or not security reports into the CIO, or another leader within the company, it is in everyone’s best interest to be conscious of the organization’s security posture and to enable IT and cybersecurity to work in a highly synchronized manner.”


Economic Uncertainty Doesn’t Mean Compromising Cybersecurity

This futuristic technology isn’t just something to tap into to enrich individual experiences; it can also help solve some of society’s most pressing challenges and, most of all, keep people safe. For cryptocurrencies, where there is estimated to be four times more fraud than in regular fiat payments, technology providers are devising new innovations to stay ahead. New solutions can help customers make informed decisions that protect their business, as well as the entire payments ecosystem. A simple dashboard can provide visibility of crypto spend, transaction volumes and an anti-money laundering risk rating exposure. Through solutions like these, banks and other businesses can earn and, importantly, keep the trust of their customers—on whom their business depends. Trust is fragile. It can be broken in a nanosecond. And as the global financial ecosystem expands, it’s getting harder for organizations to navigate the maze of cyber risks alone. Businesses, merchants, financial institutions and fintechs need trailblazing tools and expert knowledge to understand the risks they’re facing. 


Redefining Data Governance: Bridging The Gap Between Technical And Domain Experts

As the data industry gravitates toward decentralization, specifically federated systems, the absence of a robust framework in data governance, master data and data quality becomes glaringly evident. The prevailing issue in many companies is not the sheer volume of data or a lack of technological options but the erroneous assumption that their data is inherently primed for insights, AI applications and democratization. This misconception overshadows the real challenge: the need for a comprehensive approach to data management that integrates the expertise of domain professionals. The advent of practical AI applications marks a watershed moment in the history of data governance. This technology is not just a tool for automation; it serves as a bridge between the technical and business realms. It provides a platform where business experts can meaningfully contribute to data strategies and decision-making processes. Technical teams initially assumed the mantle of data governance out of necessity due to the requisite skill sets. 


Orchestrating Resilience Building Modern Asynchronous Systems

The first one is state management. Basically, the problem here is that you need to contemplate lots of possible combinations of states and events. For example, the "review received" message could come in while the campaign is in pending state instead of the relevant waiting state, or an out of sequence event could come in from somewhere, and so on. All of those cases need to be handled, even though they are not the most likely sequence of events and states. ... Handling retries becomes a task almost as complex as implementing primary logic, sometimes even more so. You can think of implementing your retry mechanisms in different ways, for example by storing a retry counter in the database and incrementing it on each failed attempt until either you succeed or reach the maximum allowed number of retries. Alternatively, you could embed the retry counter in the queue message itself, so you dequeue a message, process it, and, if it fails, re-enqueue the message and increment the retry count. In both cases this implies a huge overhead for developers.
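The second retry pattern described, embedding the counter in the queue message itself, can be sketched as follows. The message shape, the in-memory queue, and the dead-letter handling are illustrative assumptions standing in for a real message broker:

```python
from collections import deque

MAX_RETRIES = 3

def process_with_retries(queue, handler, dead_letters):
    """Drain a queue of {"body": ..., "retries": n} messages. On failure,
    re-enqueue the message with its embedded retry counter incremented;
    once MAX_RETRIES is exceeded, route the body to a dead-letter list."""
    while queue:
        msg = queue.popleft()
        try:
            handler(msg["body"])
        except Exception:  # broad catch for the sketch; real code would be narrower
            retries = msg.get("retries", 0) + 1
            if retries > MAX_RETRIES:
                dead_letters.append(msg["body"])
            else:
                queue.append({"body": msg["body"], "retries": retries})
```

Even this small sketch shows the overhead the author is pointing at: the retry bookkeeping, the re-enqueueing, and the dead-letter routing are all hand-written plumbing sitting next to the primary logic.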


Attackers deploy rootkits on misconfigured Apache Hadoop and Flink servers

In the attack chain against Hadoop, the attackers first exploit the misconfiguration to create a new application on the cluster and allocate computing resources to it. In the application container configuration, they put a series of shell commands that use the curl command-line tool to download a binary called “dca” from an attacker-controlled server inside the /tmp directory and then execute it. A subsequent request to Hadoop YARN will execute the newly deployed application and therefore the shell commands. Dca is a Linux-native ELF binary that serves as a malware downloader. Its primary purpose is to download and install two other rootkits and to drop another binary file called tmp on disk. It also sets a crontab job to execute a script called dca.sh to ensure persistence on the system. The tmp binary that’s bundled into dca itself is a Monero cryptocurrency mining program, while the two rootkits, called initrc.so and pthread.so, are used to hide the dca.sh script and tmp file on disk. The IP address that was used to target Aqua’s Hadoop honeypot was also used to target Flink, Redis, and Spring framework honeypots. 


Merck's Cyberattack Settlement: What Does it Mean for Cyber Insurance Coverage?

The Merck and Mondelez cases are likely not going to be the last of their kind. More legal disputes between insurers and insureds, whether regarding war exclusions or other issues, could arise in the future. “I think that the cyber litigation is just getting started,” says Stern. More cases could drive change in the way cyber insurance companies approach risk tied to cyberattacks and what is considered cyberwarfare. When new risks challenge the existing approach to coverage, it drives industry change. “Maybe it takes a second or a third dispute to really achieve a definitive conclusion on that particular matter,” says Kannry. “Then, what can often happen is insurance industry says, ‘You know what, that type of loss needs to be understood and defined separately.’” Compared to many other insurance products, cyber insurance is relatively new. That means there remains plenty of room for the development of innovative ways to offer cyber insurance coverage. But the road forward likely won’t be without bumps for insurers and insureds.


Organizations Must Be Prudent To Realize Value In Generative AI

Rather than being swayed by the allure of generative AI capabilities, remain steadfast about the core features that can genuinely transform and enhance your operations. This pragmatic approach should be considered a short- to mid-term strategy for any forward-thinking organization. The reality is that features closely coupled with generative AI capabilities are still on the horizon. It will be at least a couple of years before they become commonplace. To navigate this transformative landscape effectively as an analytics professional, you must equip yourself with a deep understanding of generative AI. This proficiency will enable you to distinguish between features loosely coupled with generative AI and features that are natively and seamlessly integrated into the technology stack. Furthermore, keep a vigilant eye on the vendors supplying your critical business software. A vendor's stance and commitment to generative AI can profoundly impact how your organization operates in the future. 


LLM hype fades as enterprises embrace targeted AI models

LLMs were created by research teams exploring the capabilities of AI technology rather than as models designed to solve specific business problems. As a result, their capabilities are broad and shallow: writing a fairly generic email or press release, for example. For the modern business, they have limited capabilities beyond that, requiring more data to produce results with any depth. While the AI landscape used to be dominated solely by OpenAI, major names in the tech world are beginning to outperform ChatGPT with their own LLMs, including Google’s new Gemini model. However, because the capabilities of these new large language models are so broad, the text- and image-based benchmarks used to measure a model’s prowess are just as general, ranging from simple multi-step reasoning to basic arithmetic. If an AI company’s gauge for a successful generative AI platform is how correctly it can complete rudimentary math equations, that has little to no relevance for the work of an enterprise organization.



Quote for the day:

"Before you are a leader, success is all about growing yourself when you become a leader, success is all about growing others." -- Jack Welch

Daily Tech Digest - January 11, 2024

Four Ways the Evolution of AI Is Changing the Corporate Governance Landscape

There is no doubt that AI has been touted as the long-awaited answer to everyone’s productivity and efficiency woes. Tools like ChatGPT can do everything from generating interview questions to writing a song. It can create pictures, deliver data, and solve complex problems. Yet AI is not without its issues, and some believe that the most pressing dangers associated with this technology have not even begun to emerge. AI giants have been very clear that society must pay close attention to AI development. It’s crucial for directors and investors alike to understand that while science fiction movies seem like they belong in a fantasy realm, the reality they depict may not be as far-fetched as it seems. Similarly, scientists cannot take for granted that a bent toward corporate profit won’t motivate boards to push AI developers in that same direction. Instead of making an attempt to battle the behemoth of monetary thirst, it may be a better idea to come up with creative ways to make social goals and AI safety profitable. If developers can’t overcome the opposing viewpoint, why not try to find a way to join them?


The Incident Lifecycle: How a Culture of Resilience Can Help You Accomplish Your Goals

There are three points within the incident lifecycle where we can focus time and energy to improve the learning cycle and gain some bandwidth to improve resilience in the system. It’s not easy, because you’ll generally have to make small adjustments and changes along the way. CTOs won’t generally approve $100,000 for cross-incident analysis (that won’t be a marketable improvement to stakeholders) without evidence that it’s helpful. ... You need perspectives from across the organization. The discussion shouldn’t include only the incident manager and the person who pushed the bad code. I find that folks in marketing, product management, and especially customer support have great insights into the impact of an incident. When you meet, make sure it's an open conversation – the person facilitating should be talking less than anyone else in the room. This way, you will capture how this incident affected different groups. You may learn, for example, that the on-call engineer lacked dashboard access or customer support got slammed with complaints.


Nurturing Leadership Through The Power Of Reading

The most straightforward yet impactful way reading can contribute to self-development is through gaining knowledge. Whether extracting insights from books, articles or research papers, immersing oneself in written content is a foundational pillar of continuous development. This direct approach is not just about gathering information; it's also about internalizing concepts and lessons to create a reservoir of intellectual wealth for informed decision-making and sustained professional evolution. The simple power of reading remains a reliable means of absorbing knowledge—a timeless practice that can help propel individuals toward continuous growth and success. ... Reading also facilitates internal exploration. Self-help and philosophical literature invite introspection, which can nurture profound self-awareness. Atomic Habits by James Clear, for example, provides actionable insights for leaders seeking to enhance their habits and maximize their potential, fostering a deeper understanding of personal strengths and weaknesses. 


CI Is Not CD

A crucial difference I’ve often observed is that CI and CD tools have different audiences. While developers are often active on both sides of CI/CD, CD tools are frequently used by a wider group of people. ... CD tools have a range of subtle features that make it easier to handle deployment scenarios. They have a way to manage environments and infrastructure. This mechanism applies the correct configuration for each deployment and provides a way to handle deployments at scale, such as managing tenant-specific infrastructure or deployments to different locations (such as retail stores, hospitals or cloud regions). Alongside practical deployment features, CD tools also make the state of deployments visible to everyone who needs to know what software versions are where. This removes the need for people to ask for status updates, just as your task board handles work items. If you want to know your bank balance, you don’t want to phone your bank; you want to self-serve the answer instantly. The same is true for your deployments.
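The environment-management mechanism described above can be sketched in a few lines. In this entirely hypothetical example, a base configuration is layered with environment- and tenant-specific overrides, the way a CD tool might resolve settings for each deployment target (a cloud region, a hospital, a retail store); all names and values are made up.

```python
# Base settings shared by every deployment target.
BASE = {"replicas": 2, "log_level": "info", "feature_flags": {"beta": False}}

# Overlays keyed by (environment, tenant); None means "all tenants".
OVERRIDES = {
    ("prod", None): {"replicas": 6},
    ("prod", "hospital-a"): {"feature_flags": {"beta": True}},
}

def resolve(env, tenant=None):
    """Merge base settings with environment and tenant overlays."""
    def deep_merge(dst, src):
        for key, value in src.items():
            if isinstance(value, dict) and isinstance(dst.get(key), dict):
                deep_merge(dst[key], value)
            else:
                dst[key] = value
        return dst

    # Copy BASE (including nested dicts) so overlays never mutate it.
    config = {k: (dict(v) if isinstance(v, dict) else v) for k, v in BASE.items()}
    for key in ((env, None), (env, tenant)):
        deep_merge(config, OVERRIDES.get(key, {}))
    return config

# prod scales up for everyone; one tenant additionally gets a beta flag.
assert resolve("prod", "hospital-a")["replicas"] == 6
```

The point is the shape of the mechanism, not the specifics: one source of truth, with per-target differences expressed as small, reviewable deltas rather than copied configs.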


Managing CEO expectations is this year’s Priority No. 1

Today’s CEOs are more likely to get their IT visions from stories written by credulous writers authoring for online business media. That’s if we’re lucky. If we aren’t, they’ll want Tony Stark’s ability to conjure up high-tech solutions by gesticulating into a 3D touch interface while arguing with the AI that ran Iron Man’s lab. That leaves it up to you, your company’s hard-working CIO, to temper the CEO’s expectations, bringing them down from the Marvel Cinematic Universe to Earth, 2024. Because CEOs’ real reality (“real” by definition) is likely to be disappointing compared to the MCU and other semi-fictional realities they see, hear of, or imagine, CIOs can worry a little less about how IT might disappoint them on this score. ... Okay, fair’s fair and fun’s fun. But few CEOs will be completely consumed by these semi-whimsical depictions of information technology’s future. They’ll continue to have practical concerns, too, like where all the money is that cloud computing was supposed to save them. Some disappointments, that is, are both evergreen and rooted in real reality. 


Embracing offensive cybersecurity tactics for defense against dynamic threats

The essence of a coalition approach in offensive cyber operations is straightforward: combining forces to enhance cyber defense capabilities. This approach is critical in today’s world, where cyber threats transcend national borders. By pooling resources, knowledge, and intelligence, a coalition approach facilitates a more comprehensive and effective response to cyber threats. In the financial industry, for example, FS-ISAC supports exactly this kind of collaboration. Effective implementation involves establishing clear communication channels, defining shared objectives, and ensuring mutual trust among participating entities. ... Looking ahead, the line between offense and defense in cybersecurity is blurring. The future I envision is one where the two are not distinct entities but different aspects of a singular, holistic strategy. Offensive tools will be used not just to attack but to inform, to scout for threats and act before they materialize. This integrated approach is akin to a martial artist’s stance, ready to block and strike simultaneously.


CES 2024: Will the Coolest New AI Gadgets Protect Your Privacy?

As Tschider points out, "COPPA doesn’t have any cybersecurity requirements to actually reinforce its privacy obligations. This issue is only magnified in contemporary AI-enabled IoT because compromising a large number of devices simultaneously only requires pwning the cloud or the AI model driving function of hundreds or thousands of devices. Many products don't have the kind of robust protections they actually need." She adds, "Additionally, it relies primarily on a consent model. Because most consumers don't read privacy notices (and it would take well over a hundred days a year to read every privacy notice presented to you), this model is not really ideal." For Tschider, a superior legal framework for consumer electronics might take bits of inspiration from HIPAA, or New York State's cybersecurity law for financial services. But really, one need only look across the water for an off-the-shelf model of how to do it right. "For cybersecurity, the NIS 2 Directive out of the EU is broadly useful," Tschider says, adding that "there are many good takeaways both from the General Data Protection Regulation and the AI Act in the EU."


Critical Components for Data Fabric Success

In a physical data fabric, users access data, run analytics on it, or use APIs at a consumption layer to deliver the data wherever it is needed. Prior to that, data is modeled, prepared, and curated in the discovery layer, and transformed and/or cleansed as needed in the orchestration layer. In the ingestion layer, data is drawn from one or more data sources (which can be on premises or in the cloud) and stored in the persistence layer, which is usually a data lake or data warehouse. Logical data fabrics integrate data using data virtualization to establish a single, trusted source of data regardless of where the data is physically stored. This enables organizations to integrate, manage, and deliver distributed data to any user in real time regardless of the location, format, and latency of the source data. Unlike a logical data fabric, a physical data fabric requires the ability to physically centralize all the required data from multiple sources before it can deliver the data to consumers. Data also needs to be physically transformed and replicated every time and be adapted to each new use case. 
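The logical-versus-physical distinction can be sketched in a few lines. In this toy example, the two in-memory lists stand in for an on-premises database and a cloud store; a logical fabric answers a query by joining them in place at request time, whereas a physical fabric would first replicate both into a central store. All names and records here are illustrative.

```python
# Stand-ins for two distributed sources: an on-prem database table
# and a cloud data lake. Neither is copied anywhere.
on_prem_orders = [{"id": 1, "customer": "acme", "total": 120}]
cloud_shipments = [{"order_id": 1, "status": "delivered"}]

def virtual_order_view(order_id):
    """Join data from both sources at query time -- no replication.

    This is the essence of data virtualization: the consumer sees one
    unified record, regardless of where each field physically lives.
    """
    order = next(o for o in on_prem_orders if o["id"] == order_id)
    shipment = next(s for s in cloud_shipments if s["order_id"] == order_id)
    return {**order, "status": shipment["status"]}

print(virtual_order_view(1))  # one combined record; sources untouched
```

A physical fabric would instead run an ingestion step that copies both sources into a lake or warehouse before any such view could be served, which is exactly the replication and re-transformation cost the passage above describes.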


Boost Your Business With Digital Twin Technology

Digital twins allow businesses to answer questions that can directly impact strategic and operational decisions. “Organizations can move from answering simple questions about asset performance to understanding how these assets -- machines, assembly lines, supply chains -- will operate in the future, and what actions the business can take to meet performance and uptime goals,” Mann explains. Manufacturers are the businesses most likely to gain value from digital twin technology. “Manufacturers look to understand the causes of downtime, model scenarios to improve efficiency, and reduce waste,” says Devin Yaung, senior vice president, group enterprise, IoT products and services, at technology and business solutions provider NTT, in an email interview. Digital twins of individual machines permit instant views into maintenance issues and potential failures. “The growth of connected IoT sensors and devices has allowed all industries to gain insights into assets,” Yaung says. “Because of this explosion of connectivity, we are seeing large adoption not only in manufacturing but also in utilities, mining, hospitals, ports, airports, logistics/transportation, agriculture, and many other industries.”


Hey Gen. Z, you’re looking for tech jobs in all the wrong places

The pace of digital adoption and technological change today is far greater than it's ever been, according to Ger Doyle, senior vice president of US-based IT staffing firm Experis. The rise of AI and genAI is likely to accelerate that trend, “so new graduates, as well as those in the workforce today, need to embrace a concept of life-long learning to stay relevant in the new world,” Doyle said. Pandor agreed: “Candidates should remain consistently curious throughout the job-searching process. Keeping up to date with the latest trends and developments in the digital world by reading technical news enables them to showcase their interest in the ever-changing sector when they do land a job interview. From a more practical perspective, talent can also continue to practice and enhance their technical skills while job hunting so that they are ready to hit the ground running.” Younger job candidates might not be aware of the breadth and diversity of roles available, Pandor said, and they shouldn’t rule out other opportunities early in their careers.



Quote for the day:

“Nobody talks of entrepreneurship as survival, but that’s exactly what it is.” -- Anita Roddick

Daily Tech Digest - January 10, 2024

It's Time to Take a Modern Approach to Password Management

Standards for decentralized identity are being advocated by recognized bodies such as W3C. While regulations and other aspects such as authorization, role, and attribute-based access are still further developing, businesses and institutions now have the opportunity to create interoperable designs that can seamlessly integrate with this new model. In this architecture, the most trusted identity providers are likely to play a dominant role as decentralized issuers (DID), which will be crucial for the adoption of VCs. Users are more likely to trust these established brands to certify their digital credentials. However, new vendors, brands, and institutions may emerge to compete in this space and position themselves as market leaders. Furthermore, a witness ledger, which offers traceability and trust of VC transactions, will likely be supported by a technology similar to a blockchain network but more eco-friendly. This will enable digital merchants to verify the credibility of a credential and, ultimately, of their potential customers. 
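A minimal sketch of the issue-and-verify flow behind VCs is shown below. Real deployments use asymmetric signatures and DID resolution; a shared HMAC key stands in here purely to keep the example self-contained, and all identifiers and claims are made up.

```python
import hashlib
import hmac
import json

# Stand-in key material; real issuers sign with asymmetric keys
# resolvable through their DID document.
ISSUER_KEY = b"issuer-demo-key"

def issue_credential(claims):
    """Issuer signs the claims; the proof travels with the credential."""
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}

def verify_credential(credential):
    """A verifier with the issuer's key material re-derives the proof."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"])

vc = issue_credential({"degree": "BSc", "holder": "did:example:alice"})
assert verify_credential(vc)

# Any tampering with the claims invalidates the issuer's proof.
tampered = {"claims": {**vc["claims"], "degree": "PhD"}, "proof": vc["proof"]}
assert not verify_credential(tampered)
```

The merchant-side check described above is the `verify_credential` step: trust derives from the issuer's signature over the claims, not from the channel the credential arrived on.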


Putting AI to Work: Systems of Intelligence and Actionable Agency

Pervasive AI will create a new System of Intelligence (SoI) that integrates data, technologies, platforms, and practices for the purposes of finding and understanding patterns, extracting insights, promoting efficiency and creativity, and facilitating decision-making. This will illuminate how organizations actually perform on a functional basis, through real-time data inputs, allowing people greater awareness and more meaningful action. Here is why: the system of intelligence is designed to work in a way that is different from traditional data systems or systems of record. Rather than requiring users to know how to extract insights from the data, the system of intelligence is designed to identify patterns, ask questions, and provide insights in a way that is easy for all users to understand. Standards and practices of the SoI are still emerging, which gives leaders the rare chance to both learn from and guide the development of a new system of work in the coming year. This is necessary work, since imagining that nothing will change with AI is akin to thinking that, at the dawn of television, radio would simply be transposed wholesale, with no particular effect on culture, process, or business models.


Leveraging Blockchain Technology to Counter the Threat of Deepfake Videos

One of the fundamental features of blockchain is its immutability. Once data is added to a blockchain, it becomes virtually impossible to alter or erase. Applying this characteristic to video content could create an immutable record of the original footage, ensuring that any subsequent alterations or manipulations would be immediately apparent. ... Blockchain’s timestamping capabilities can provide a reliable chronology of when content is created, modified, or accessed. Integrating blockchain into the video creation process allows for the creation of a verifiable and transparent timeline for each piece of content. This timestamping ensures that any attempt to manipulate videos would be easily traceable, enabling swift identification of the source of misinformation and aiding in the attribution of responsibility. ... Blockchain operates on a decentralized network of nodes, each maintaining a copy of the ledger. This decentralized nature can be harnessed for video verification, where multiple nodes across the network can independently verify the authenticity of a given video.
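These three properties, immutability, timestamping, and independent verification, can be illustrated with a toy hash chain. Each record below stands in for a video fingerprint plus its timestamp; because every block's hash covers its predecessor's hash, altering an earlier record breaks every later link, which is what makes tampering evident. This is a sketch of the principle, not a real blockchain.

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash before the first block

def chain(records):
    """Link each record's hash to its predecessor, like blockchain blocks."""
    blocks, prev = [], GENESIS
    for data in records:
        digest = hashlib.sha256((prev + data).encode()).hexdigest()
        blocks.append({"data": data, "prev": prev, "hash": digest})
        prev = digest
    return blocks

def is_valid(blocks):
    """Re-derive every hash; any edited block breaks the chain."""
    prev = GENESIS
    for block in blocks:
        recomputed = hashlib.sha256((prev + block["data"]).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != recomputed:
            return False
        prev = block["hash"]
    return True

# Each "data" entry stands in for a video fingerprint and a timestamp.
ledger = chain(["video1|2024-01-10T12:00", "video1-edit|2024-01-10T12:05"])
assert is_valid(ledger)

ledger[0]["data"] = "deepfake|2024-01-10T12:00"  # retroactive tampering
assert not is_valid(ledger)
```

In a real deployment the "data" would be a cryptographic fingerprint of the video file, and validity would be checked independently by many nodes rather than by a single function.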


How to Build Team Culture in a Remote-Work World

A positive team culture leads to happier employees. This may result in increased productivity over the long run. Because a positive work environment leads to things like friendships and increased levels of support between coworkers, you're more likely to see lower turnover rates and higher employee retention rates when you emphasize team culture. Positive team cultures also reduce levels of stress and anxiety among employees. With a less stressful environment, your skilled workers are more likely to remain with your company long-term. Additionally, they'll share their positive experiences as an employee. As this word-of-mouth spreads, your positive team culture may eventually result in your business becoming a sought-after place to work. In addition to these hiring and personnel benefits, positive team cultures correlate directly with profitability. You might engage in a chance discussion with a coworker over the water cooler or in the breakroom — leading to opportunities, collaborations and innovations that otherwise might not have happened.


Faster than ever: Wi-Fi 7 standard arrives

For home networks, Wi-Fi 7 enhances the performance of smart home devices, providing a more reliable connection for Internet of Things technologies. The improved bandwidth and speed are perfect for families, like mine, which have multiple devices streaming high-definition content simultaneously. Your overall Wi-Fi performance, whether it's just you or your family and friends, will see a dramatic improvement. In businesses, Wi-Fi 7 can support more devices with minimal interference. This capability makes it ideal for large offices and coworking spaces. The improved speed and stability facilitate seamless video conferencing and efficient cloud-based applications, which are essential for modern companies. All that's the good news. The bad news is that the 6 GHz wireless spectrum uses shorter wavelengths. Short wavelengths are great for fast data transfers at close range, so they're great for connecting to your Wi-Fi 7-enabled HDTV a few feet away from your router. But short wavelengths are poor at connecting at long distances and suffer greater interference from physical obstructions, such as dense walls or floors in a building.
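The range penalty can be quantified with the standard free-space path loss formula, FSPL(dB) = 20 log10(d) + 20 log10(f) - 147.55, with d in meters and f in Hz. The sketch below compares the 2.4 GHz and 6 GHz bands at the same distance; it models frequency only, so the additional attenuation from walls and floors, which hits 6 GHz even harder, is not included.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Same 10 m distance, two Wi-Fi bands:
loss_2g4 = fspl_db(10, 2.4e9)  # legacy 2.4 GHz band
loss_6g = fspl_db(10, 6.0e9)   # Wi-Fi 7's 6 GHz band

# Higher frequency means more loss at the same distance, even before
# obstructions are taken into account.
assert loss_6g > loss_2g4
```

The gap works out to 20 log10(6.0/2.4), roughly 8 dB, regardless of distance, which is why 6 GHz shines across a room but struggles across a house.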


3 Essential Attitudes & Dispositions of Good Corporate Governance

While leadership and governance are two concepts that often work in tandem, every director must understand that they are not the same thing. Leadership refers to a person’s ability to influence the attitudes and actions of others to lead them toward a common goal. Governance, on the other hand, should be about making decisions that lead to increased corporate performance and meeting or exceeding agreed-upon targets. Being a good leader without shifting their mindset toward governance can make directors behave in selfish and territorial ways, putting undue focus on their own desires and beliefs. On the other hand, having the power to govern without the skill of leadership can often lead to passivity and a bent toward bureaucracy, which can easily stop board progress in its tracks. ... Many directors — especially those who are new to the position — struggle with speaking up when they see something happening that needs the board’s attention. They may be worried about receiving private or public backlash or derailing the company’s progress toward meeting targets. 


Why most companies suck at digital transformation

Focus on architecture in the wide, without forgetting architecture in the narrow. Enterprises need to understand the holistic architecture required to support positive DX outcomes, not just focus on individual systems. This is an outcome of a comprehensive strategy, in that we’re utilizing all systems in place, including legacy and other on-premises assets, and establishing how they will work and play well with migrated or net-new systems existing on public clouds. If companies focus only on small systems or architectures, they usually neglect to understand how those systems will exist within a strategically defined DX ecosystem. This results in decoupled projects that may be impressive on their own but provide little or no value to the larger strategy, which matters more than the parts that make it up. ... The most significant issue is that most don’t even understand what digital transformation is, even those with the term in their titles. Instead, they focus on the tactics, meaning tools and technology, never understanding the plan to make things incrementally better.


Researchers develop technique to prevent software bugs

Baldur took several months to build. The work was done as a collaboration with Google and built on top of a significant amount of prior research. First, whose team performed its work at Google, used Minerva, an LLM trained on a large corpus of natural-language text and then fine-tuned on 118GB of mathematical scientific papers and webpages containing mathematical expressions. Next, she further fine-tuned the LLM on Isabelle/HOL, the language in which the mathematical proofs are written. Baldur then generated an entire proof and worked in tandem with the theorem prover to check its work. When the theorem prover caught an error, it fed the proof, along with information about the error, back into the LLM so that it could learn from its mistake and generate a new and, hopefully, error-free proof. This process yields a remarkable increase in accuracy. An existing tool for automatically generating proofs, Thor, can generate proofs 57% of the time. 
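The generate-check-repair loop described above can be sketched with stand-ins for both components. In this toy version, a mocked "LLM" and a mocked "prover" replace Minerva and Isabelle/HOL; the part the sketch is meant to show is the control flow, feeding the prover's error message back into the next generation attempt.

```python
def mock_llm(theorem, error=None):
    """Stand-in for the fine-tuned LLM; Baldur calls a real model here."""
    if error is None:
        return "by simp"  # first, naive proof attempt
    return "by auto"      # revised attempt, informed by the error message

def mock_prover(theorem, proof):
    """Stand-in for the theorem prover: accept only one proof script."""
    if proof == "by auto":
        return True, None
    return False, "simp failed to discharge goal"

def prove(theorem, max_attempts=3):
    """Generate a proof, check it, and repair on failure."""
    error = None
    for _ in range(max_attempts):
        proof = mock_llm(theorem, error)
        ok, error = mock_prover(theorem, proof)
        if ok:
            return proof          # the prover certified this proof
    return None                   # give up after max_attempts

assert prove("x + 0 = x") == "by auto"
```

Because the theorem prover certifies every accepted proof, the LLM can hallucinate freely during generation without compromising soundness; a wrong proof is simply rejected and retried.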


Reconciling Agile Development With AI Safety

On one hand, the Agile principles of an iterative approach, regular risk management checks, cross-expertise collaboration, solicitation of third-party feedback at every stage, and adaptability to changing priorities or new findings seem well-suited to the seamless incorporation of responsible AI practices. However, we think responsible AI development will require a full revamp of the software development lifecycle, from pre-training assessments of data to post-deployment monitoring for performance and safety. Some practices, such as automated algorithmic checks (like tests for data bias and model performance metrics, all of which are part of Stanford’s HELM set of evaluations) can be utilized anywhere in the development lifecycle. Other techniques may be purely ex post, like algorithmic audits. An Agile approach avoids engineering siloes, allowing for stage-specific practices to be adopted where necessary, while ensuring stage-agnostic practices are adopted at all relevant stages, and allowing these stages to proceed in tandem.


Modern-day manufacturing: A process built on data governance

The average manufacturer generates high volumes and different types of data, including customer information, production orders, and shipment tracking, to name a few. This is further compounded with every supplier, distributor, and third party that’s added to the supply chain. Without a system to validate all this data, a manufacturer can find itself with inaccurate or incomplete data. Poor data quality not only leads to operational inefficiencies and mistakes, it also hinders the organization’s growth by limiting its ability to forecast demands and plan production runs. ... Within complex manufacturing ecosystems, it can be unclear who owns data as it flows across the supply chain. Various teams generate and use different types of data, making ownership and responsibility a challenge to pin down. ... Establishing data ownership involves identifying primary stakeholders who are responsible for ensuring the quality, security, and correct use of data assets. 



Quote for the day:

"We live in a society obsessed with public opinion. But leadership has never been about popularity." -- Marco Rubio