Daily Tech Digest - January 13, 2024

Frenemies to friends: Developers and security tools

Cultural shifts happen when security is built into the developer’s existing flow, as opposed to being injected as its own new stage in the pipeline. Look for points in their process where they are already in “pause” or “edit” mode, like at the Pull Request, where you can surface vulnerabilities and ask for remediation efforts. Doing so can avoid context switching and feelings of being interrupted. Capitalizing on an existing developer pause point can help train your developers to look at security vulnerabilities like functionality bugs, a skill they already have, while also shortening feedback loops. ... Developer-to-developer enablement is key. There is often a feeling of mistrust between engineering and security, but developers share the same interests and have the same priorities. Let individual contributors have an opportunity to educate and enable other individual contributors. If you have had a successful pilot or PoC team, or notice self-motivated folks using the tool proactively, give them space to share their experience with the tool. 


The Joys and Pains of DevOps

DevOps is very much a culture change in the way development, operations and even security work together. Even though DevOps aims to improve this, in many cases, these areas still function in silos. There are times when one area implements something that blocks another; and as a DevOps leader, you’re often in the middle trying to figure out the best path forward while also finding an acceptable middle ground. ... A well-engineered DevOps solution should render the team invisible. That includes both the happy path, when deployments succeed, as well as how well you enable teams to solve their deployment issues. There is also one common element of what makes DevOps rewarding: improving developer experience and business outcomes. Dale Francis, director of product development at Climavision, says the rewards of DevOps come from solving problems, so day-to-day operations become simple and the experience for developers better. In addition, maturing as a DevOps organization also lets everyone focus more on solving business problems, rather than fighting technical issues. 


Why Engineering Is Key To A Flourishing Workplace Culture

If your engineering strategy demands precision but your workplace culture tolerates ambiguity and shortcuts, you won't get anywhere. If your engineering strategy demands accountability but your workplace culture doesn't draw connections between an individual's efforts and the higher goals of the operation, you won't get anywhere. If your engineering strategy demands innovation but your workplace culture rewards risk aversion, you won't get anywhere. ... In an arena as complex and technical as engineering, it's easy to lose sight of the human side. Whether your workplace is in-person, remote or hybrid, it's crucial to create spaces (literal or virtual) where employees feel connected and empowered to ask questions. Trust and creativity flourish in an environment where autonomy and authentic connections coexist. ... Inertia is fatal to engineering. Regularly evaluate and adopt new technologies. Find out what your customers need. Find out what hurdles they're up against. Think three steps ahead so your tech stack supports the evolving needs of your business and the market.


Life's Too Short to Work With Incompatible People

Celebrate failure and learn to give feedback. When you embrace failure, you learn and course-correct more quickly. Failure is a sign you're doing something right. You're testing, learning, flexing your creative muscles and moving on efficiently after hitting a brick wall. You must build a team open to feedback to make the most of your failures for the company's good. Feedback is the mode by which we make positive changes out of failure. The challenge? Feedback makes most people cringe. We associate it with criticism as opposed to growth. ... Clear communication may seem like an obvious necessity on high-performing teams, but it's something that's often taken for granted. Unclear communication can quickly tank a team's efforts. A team that has mastered precise communication, on the other hand, can achieve incredible outcomes quickly. We follow an "open book" mentality at Wistia. On all-hands calls, we share candid information about the state of the company – inclusive of the good and the bad – so everyone has the big picture. 


Researchers demo new CI/CD attack techniques in PyTorch supply-chain

Khan initially found a critical vulnerability that could have led to the poisoning of GitHub Actions’ official runner images. The “runners” are the VMs that execute build actions defined inside GitHub Actions workflows. After reporting the vulnerability to GitHub and receiving a $20,000 bug bounty for it, Khan realized that the core issue he found was systemic and that thousands of other repositories were likely impacted. Since then, Khan and Stawinski found vulnerabilities in the software repositories and development infrastructure of major corporations and software projects and collected hundreds of thousands of dollars in rewards through bug bounty programs. Their “victims” included Microsoft Deepspeed, a Cloudflare application, the TensorFlow machine-learning library, the crypto wallets and nodes of several blockchains, and PyTorch, one of the most widely used open-source machine-learning frameworks. PyTorch was originally developed by Meta AI, a subsidiary of Meta, but its development is now governed by the PyTorch Foundation, an independent organization that operates under the Linux Foundation’s umbrella.


For a Secure Foundation, Health Systems Must Address Technical Debt

We need to update network equipment and workstations. We may still even have Windows 2003 and 2008. And hardware is not as expensive as the applications that are on there. So that level of technical debt, and competing for those dollars, where in healthcare you need to have nice offices and that type of thing. So we’re competing with those, with other projects or capital, where other organizations may think of that as just an ongoing IT update expense. ... I might hear this stuff at home occasionally, but it’s the same with IT projects. “Hey, we had an acquisition. We got them up and running. We didn’t take care of their technical debt, so we’re assuming that.” We’re going through some of those servers now, and it’s like, can we even find anybody that knows anything about it, or is it just that everyone’s afraid to turn it off? What I like to say is, if you didn’t sit around the right campfire, you don’t know the story. So for me, my job sometimes is just to keep asking those questions: “Who knows something about this server?” Sometimes it comes down to the scream test, but I’ve developed a quality I call positive persistence. I just keep asking questions politely until we make progress.


The way forward is to make technology 'human-like': Report

As the world undergoes a massive technological transformation, artificial intelligence (AI) and other disruptive technologies will increasingly adopt a more human-like or "Human by Design" approach, according to a new study published on Wednesday. As these technologies become more human-like and intuitive to use, they will increasingly usher in a new era of unprecedented productivity and creativity, said the report, titled 'Accenture Technology Vision 2024: Human by design, how AI unleashes the next level of human potential,' which also emphasizes that enterprises that prepare for this shift now will be the winners in the future. The research further highlights that as human-centric technologies continue to advance, they are becoming easier to interact with and more seamlessly integrated into every aspect of our lives. ... As AI, spatial computing, and body-sensing technologies evolve to imitate human capabilities and become less noticeable, the true focus will be on the people who are empowered with new capabilities to achieve what was once considered impossible.


Expert Insight: Andrew Snow on a landmark GDPR ruling

For organisations, it makes clear beyond all doubt that ignorance isn’t an excuse. In fact, if organisations – or managers within them – plead ignorance to the infringement now, they may face a higher fine than if they had taken responsibility for their actions. For regulators, an important precedent has been set. This ruling has provided them with clear direction on where the line falls when deciding on issuing administrative penalties, including fines. For instance, the EDPB [European Data Protection Board] recently reported on another case, involving the Slovak and Hungarian authorities, where there was a dispute over the ownership. The Hungarian regulator ultimately determined that both parties jointly determined the purposes of processing, so were joint controllers – and as such, breached the GDPR because their agreement failed to document this and, by extension, their respective responsibilities. Given the timing of this decision, it probably wasn’t influenced by the ECJ ruling, but I expect that future cases like this would use the ruling as a precedent.


What Are Digital Twins and How Can They Be Used in Healthcare?

Trayanova’s research is on applying personalized digital twin approaches to clinical decision-making. She aims to improve predictive diagnostics and to predict optimal treatment plans for patients. This is currently being used to treat patients with heart rhythm disorders. At Johns Hopkins, Trayanova and her team can create a personalized digital twin representing the geometry of a patient’s heart. The digital twin includes the heart’s structure; disease remodeling such as damage, fibrosis and inflammation identified through MRI or PET scans; and its electrical wave propagation. When an electrical wave propagates through the heart, it triggers a contraction. However, if a patient has scarring or other damage, the wave will catch in that area and, rather than propagating through the heart, it will recirculate and cause an arrhythmia. To treat the arrhythmia, the digital twin must accurately represent the damage as well as the electrical activity of each cell in the heart. “Now you have something that dynamically links the heart’s components,” Trayanova says. Using the digital twin, she and her team can send a signal and watch how the electrical wave propagates through the model.
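To make the blocking behavior described above concrete, here is a deliberately crude one-dimensional excitable-medium toy (not the Johns Hopkins model — all states and rules here are illustrative simplifications): cells cycle through resting, excited, and refractory states, and "scar" cells never excite, so a wave stimulated at one end cannot pass the scar.

```python
# States: resting (0), excited (1), refractory (2); scar cells (3) never excite.
REST, EXCITED, REFRACTORY, SCAR = 0, 1, 2, 3

def step(cells):
    nxt = cells[:]
    for i, state in enumerate(cells):
        if state == EXCITED:
            nxt[i] = REFRACTORY              # excited cells become refractory
        elif state == REFRACTORY:
            nxt[i] = REST                    # then recover
        elif state == REST:
            if EXCITED in cells[max(i - 1, 0):i + 2]:
                nxt[i] = EXCITED             # resting cells catch the wave
    return nxt

# A strip of tissue with a scar in the middle; stimulate the left end.
tissue = [EXCITED] + [REST] * 4 + [SCAR] + [REST] * 4
for _ in range(10):
    tissue = step(tissue)

print(tissue[6:])  # [0, 0, 0, 0] — the wave never crosses the scar
```

A real cardiac twin uses patient-specific geometry and per-cell electrophysiology rather than a three-state automaton, but the same principle applies: the model must represent the damaged region to predict how a signal travels around it.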


What will the metaverse mean for business models?

In media and entertainment, the primary model of business has evolved from ownership to subscription. In the past, most people bought CDs and DVDs to build a collection – today, owning vinyl is booming in popularity again. But for the majority of people, the accepted model is accessing songs, films and TV series online and building your own virtual library. The difference is that if you stop paying the subscription, you have nothing. Will it be the same in the metaverse? We’ll have to wait and see. But it’s safe to assume that people will want ownership of their assets without paying a subscription (except for the wallet that protects them). To complicate things, there is the question of what role content from Generative AI will play in metaverse business models. Today, it’s generally accepted that no one owns work created by Generative AI. But won't this change? In fact, this assumption may even be wrong – in the UK for example, the law implies that the creators of the AI platform own anything wholly created by it. 



Quote for the day:

"Great leaders do not desire to lead but to serve." -- Myles Munroe

Daily Tech Digest - January 12, 2024

Navigating Tomorrow: Becoming an Enterprise of the Future

Preparing for what lies ahead goes far beyond just implementing the right technologies; it is about developing a culture that embraces change with empathy. Cultivating a mindset across the organisation that values innovation, continuous learning, and agility ensures that every employee charges forward with confidence. In times of economic uncertainties and technological advancements, it is crucial that we practice empathy. Naturally, there is some fear that technologies like AI will replace human workers. As such, leaders must help employees understand that technology is here to augment their roles and empower them to spend more time on other valuable tasks. The key to embracing any new technology and providing access at scale is to get everyone in the team on board. Whether greeted with excitement or anxiety, leaders must champion this culture of change by encouraging employees to seek new ways of working while ensuring they remain engaged and valued. Certainly, data-driven decision-making will undoubtedly continue to be the cornerstone of future business endeavours.


The Importance of Enterprise Architecture in the Modern Business Landscape

The field of Enterprise Architecture is constantly evolving, driven by emerging trends and innovations. One of the significant trends is the adoption of cloud computing and hybrid IT environments. Cloud-based solutions offer scalability, flexibility, and cost-efficiency, making them increasingly popular among businesses. Enterprise Architecture helps organizations leverage these technologies by designing architectures that integrate cloud services and on-premises infrastructure, ensuring seamless operations and efficient resource utilization. Another emerging trend is the incorporation of artificial intelligence (AI) and machine learning (ML) in Enterprise Architecture practices. AI and ML technologies enable businesses to automate processes, analyze vast amounts of data, and gain valuable insights. By integrating AI and ML into their Enterprise Architecture frameworks, organizations can enhance decision-making, optimize business processes, and improve overall efficiency. Furthermore, the rise of digital transformation has had a significant impact on Enterprise Architecture. 


Top 8 challenges IT leaders will face in 2024

To guide an organization through uncertainty, IT leaders must help ensure everyone in the company is on the same page, Srivastava says. Instead of playing catch-up, he suggests a proactive approach with clear communication as a guiding principle. “It starts with establishing a clear set of agreed upon initiatives and outcomes for the organization,” he says. “We have to make sure everyone understands what they are doing, why they are doing it, and — most importantly — how success will be measured.” ... Security is a challenge that makes the list of top CIO worries perennially, but Grant McCormick, CIO of cybersecurity company Exabeam, notes a rising need for increased collaboration between IT and security teams to address the issue. “The role of the CIO has recently seen a massive convergence with cybersecurity,” says McCormick. “Regardless of whether or not security reports into the CIO, or another leader within the company, it is in everyone’s best interest to be conscious of the organization’s security posture and to enable IT and cybersecurity to work in a highly synchronized manner.”


Economic Uncertainty Doesn’t Mean Compromising Cybersecurity

This futuristic technology isn’t just something to tap into to enrich individual experiences; it is also to help solve some of society’s most pressing challenges and, most of all, to keep people safe. For cryptocurrencies, where there is estimated to be four times more fraud than in regular fiat payments, technology providers are devising new innovations to stay ahead. New solutions can help customers make informed decisions that protect their business, as well as the entire payments ecosystem. A simple dashboard can provide visibility of crypto spend, transaction volumes and an anti-money laundering risk rating exposure. Through solutions like these, banks and other businesses can earn and, importantly, keep the trust of their customers—on whom their business depends. Trust is fragile. It can be broken in a nanosecond. And as the global financial ecosystem expands, it’s getting harder for organizations to navigate the maze of cyber risks alone. Businesses, merchants, financial institutions and fintechs need trailblazing tools and expert knowledge to understand the risks they’re facing. 


Redefining Data Governance: Bridging The Gap Between Technical And Domain Experts

As the data industry gravitates toward decentralization, specifically federated systems, the absence of a robust framework in data governance, master data and data quality becomes glaringly evident. The prevailing issue in many companies is not the sheer volume of data or a lack of technological options but the erroneous assumption that their data is inherently primed for insights, AI applications and democratization. This misconception overshadows the real challenge: the need for a comprehensive approach to data management that integrates the expertise of domain professionals. The advent of practical AI applications marks a watershed moment in the history of data governance. This technology is not just a tool for automation; it serves as a bridge between the technical and business realms. It provides a platform where business experts can meaningfully contribute to data strategies and decision-making processes. Technical teams initially assumed the mantle of data governance out of necessity due to the requisite skill sets. 


Orchestrating Resilience Building Modern Asynchronous Systems

The first one is state management. Basically, the problem here is that you need to contemplate lots of possible combinations of states and events. For example, the "review received" message could come in while the campaign is in pending state instead of the relevant waiting state, or an out of sequence event could come in from somewhere, and so on. All of those cases need to be handled, even though they are not the most likely sequence of events and states. ... Handling retries becomes a task almost as complex as implementing primary logic, sometimes even more so. You can think of implementing your retry mechanisms in different ways, for example by storing a retry counter in the database and incrementing it on each failed attempt until either you succeed or reach the maximum allowed number of retries. Alternatively, you could embed the retry counter in the queue message itself, so you dequeue a message, process it, and, if it fails, re-enqueue the message and increment the retry count. In both cases this implies a huge overhead for developers.
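The second variant above — carrying the retry counter inside the queue message itself — can be sketched as follows. The queue here is a hypothetical in-memory stand-in for a real broker, and the handler names are illustrative:

```python
import json

MAX_RETRIES = 3

class InMemoryQueue:
    """Hypothetical stand-in for a real message broker."""
    def __init__(self):
        self.messages = []
        self.dead_letters = []

    def put(self, msg):
        self.messages.append(msg)

    def get(self):
        return self.messages.pop(0) if self.messages else None

    def dead_letter(self, msg):
        self.dead_letters.append(msg)

def handle_message(raw_message, queue, process):
    """Carry the retry counter inside the message instead of a database row."""
    message = json.loads(raw_message)
    try:
        process(message["payload"])
    except Exception:
        retries = message.get("retries", 0)
        if retries < MAX_RETRIES:
            # Re-enqueue with an incremented counter for another attempt.
            message["retries"] = retries + 1
            queue.put(json.dumps(message))
        else:
            # Retries exhausted: park the message for manual inspection.
            queue.dead_letter(json.dumps(message))

# Drive the loop with a processor that always fails.
queue = InMemoryQueue()
queue.put(json.dumps({"payload": "review received"}))

def always_fails(payload):
    raise RuntimeError("downstream unavailable")

while (raw := queue.get()) is not None:
    handle_message(raw, queue, always_fails)

print(len(queue.dead_letters))  # → 1
```

Even this minimal sketch shows the overhead the article describes: the retry bookkeeping, the re-enqueue path, and the dead-letter handling are all code the developer must write and test alongside the primary logic.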


Attackers deploy rootkits on misconfigured Apache Hadoop and Flink servers

In the attack chain against Hadoop, the attackers first exploit the misconfiguration to create a new application on the cluster and allocate computing resources to it. In the application container configuration, they put a series of shell commands that use the curl command-line tool to download a binary called “dca” from an attacker-controlled server inside the /tmp directory and then execute it. A subsequent request to Hadoop YARN will execute the newly deployed application and therefore the shell commands. Dca is a Linux-native ELF binary that serves as a malware downloader. Its primary purpose is to download and install two other rootkits and to drop another binary file called tmp on disk. It also sets a crontab job to execute a script called dca.sh to ensure persistence on the system. The tmp binary that’s bundled into dca itself is a Monero cryptocurrency mining program, while the two rootkits, called initrc.so and pthread.so, are used to hide the dca.sh script and tmp file on disk. The IP address that was used to target Aqua’s Hadoop honeypot was also used to target Flink, Redis, and Spring framework honeypots.


Merck's Cyberattack Settlement: What Does it Mean for Cyber Insurance Coverage?

The Merck and Mondelez cases are likely not going to be the last of their kind. More legal disputes between insurers and insureds, whether regarding war exclusions or other issues, could arise in the future. “I think that the cyber litigation is just getting started,” says Stern. More cases could drive change in the way cyber insurance companies approach risk tied to cyberattacks and what is considered cyberwarfare. When new risks challenge the existing approach to coverage, it drives industry change. “Maybe it takes a second or a third dispute to really achieve a definitive conclusion on that particular matter,” says Kannry. “Then, what can often happen is insurance industry says, ‘You know what, that type of loss needs to be understood and defined separately.’” Compared to many other insurance products, cyber insurance is relatively new. That means there remains plenty of room for the development of innovative ways to offer cyber insurance coverage. But the road forward likely won’t be without bumps for insurers and insureds.


Organizations Must Be Prudent To Realize Value In Generative AI

Rather than being swayed by the allure of generative AI capabilities, remain steadfast about the core features that can genuinely transform and enhance your operations. This pragmatic approach should be considered a short- to mid-term strategy for any forward-thinking organization. The reality is that features closely coupled with generative AI capabilities are still on the horizon. It will be at least a couple of years before they become commonplace. To navigate this transformative landscape effectively as an analytics professional, you must equip yourself with a deep understanding of generative AI. This proficiency will enable you to distinguish between features loosely coupled with generative AI and features that are natively and seamlessly integrated into the technology stack. Furthermore, keep a vigilant eye on the vendors supplying your critical business software. A vendor's stance and commitment to generative AI can profoundly impact how your organization operates in the future. 


LLM hype fades as enterprises embrace targeted AI models

LLMs were created by research teams exploring the capabilities of AI technology rather than as models designed to solve specific business problems. As a result, their capabilities are broad and shallow — writing fairly generic emails or press releases, for example. For the modern business, they have limited capabilities beyond that, requiring more data to produce results with any depth. While the AI landscape used to be dominated solely by OpenAI, major names in the tech world are beginning to outperform ChatGPT with their own LLMs, including Google’s new Gemini model. However, due to the broad capabilities of these new large language models, the text and image-based benchmarks used to determine the model’s prowess were just as general. These benchmarks ranged from simple multi-step reasoning to basic arithmetic. If an AI company’s gauge for a successful Generative AI platform is how correctly it can complete rudimentary math equations, that has little to no relevance for the work of an enterprise organization.



Quote for the day:

"Before you are a leader, success is all about growing yourself. When you become a leader, success is all about growing others." -- Jack Welch

Daily Tech Digest - January 11, 2024

Four Ways the Evolution of AI Is Changing the Corporate Governance Landscape

There is no doubt that AI has been touted as the long-awaited answer to everyone’s productivity and efficiency woes. Tools like ChatGPT can do everything from generating interview questions to writing a song. They can create pictures, deliver data, and solve complex problems. Yet AI is not without its issues, and some believe that the most pressing dangers associated with this technology have not even begun to emerge. AI giants have been very clear that society must pay close attention to AI development. It’s crucial for directors and investors alike to understand that while science fiction movies seem like they belong in a fantasy realm, the reality they depict may not be as far-fetched as it seems. Similarly, scientists cannot take for granted that a bent toward corporate profit won’t motivate boards to push AI developers in that same direction. Instead of making an attempt to battle the behemoth of monetary thirst, it may be a better idea to come up with creative ways to make social goals and AI safety profitable. If developers can’t overcome the opposing viewpoint, why not try to find a way to join them?


The Incident Lifecycle: How a Culture of Resilience Can Help You Accomplish Your Goals

There are three points within the incident lifecycle where we can focus time and energy to improve the learning cycle and gain some bandwidth to improve resilience in the system. It’s not easy, because you’ll generally have to make small adjustments and changes along the way. CTOs won’t generally approve $100,000 for cross-incident analysis (that won’t be a marketable improvement to stakeholders) without evidence that it’s helpful. ... You need perspectives from across the organization. The discussion shouldn’t include only the incident manager and the person who pushed the bad code. I find that folks in marketing, product management, and especially customer support have great insights into the impact of an incident. When you meet, make sure it's an open conversation – the person facilitating should be talking less than anyone else in the room. This way, you will capture how this incident affected different groups. You may learn, for example, that the on-call engineer lacked dashboard access or customer support got slammed with complaints.


Nurturing Leadership Through The Power Of Reading

The most straightforward yet impactful way reading can contribute to self-development is through gaining knowledge. Whether extracting insights from books, articles or research papers, immersing oneself in written content is a foundational pillar of continuous development. This direct approach is not just about gathering information; it's also about internalizing concepts and lessons to create a reservoir of intellectual wealth for informed decision-making and sustained professional evolution. The simple power of reading remains a reliable means of absorbing knowledge—a timeless practice that can help propel individuals toward continuous growth and success. ... Reading also facilitates internal exploration. Self-help and philosophical literature invite introspection, which can nurture profound self-awareness. Atomic Habits by James Clear, for example, provides actionable insights for leaders seeking to enhance their habits and maximize their potential, fostering a deeper understanding of personal strengths and weaknesses. 


CI Is Not CD

A crucial difference I’ve often observed is that CI and CD tools have different audiences. While developers are often active on both sides of CI/CD, CD tools are frequently used by a wider group of people. ... CD tools have a range of subtle features that make it easier to handle deployment scenarios. They have a way to manage environments and infrastructure. This mechanism applies the correct configuration for each deployment and provides a way to handle deployments at scale, such as managing tenant-specific infrastructure or deployments to different locations (such as retail stores, hospitals or cloud regions). Alongside practical deployment features, CD tools also make the state of deployments visible to everyone who needs to know what software versions are where. This removes the need for people to ask for status updates, just as your task board handles work items. If you want to know your bank balance, you don’t want to phone your bank; you want to self-serve the answer instantly. The same is true for your deployments.


Managing CEO expectations is this year’s Priority No. 1

Today’s CEOs are more likely to get their IT visions from stories written by credulous writers authoring for online business media. That’s if we’re lucky. If we aren’t, they’ll want Tony Stark’s ability to conjure up high-tech solutions by gesticulating into a 3D touch interface while arguing with the AI that ran Iron Man’s lab. That leaves it up to you, your company’s hard-working CIO, to temper the CEO’s expectations from what they infer from the Marvel Cinematic Universe to Earth 2024. Because CEOs’ real reality (“real” by definition) is likely to be disappointing compared to the MCU and other semi-fictional realities they see, hear of, or imagine, CIOs can worry a little less about how IT might disappoint them on this score. ... Okay, fair’s fair and fun’s fun. But few CEOs will be completely consumed by these semi-whimsical depictions of information technology’s future. They’ll continue to have practical concerns, too, like where all the money is that cloud computing was supposed to save them. Some disappointments, that is, are both evergreen and rooted in real reality.


Embracing offensive cybersecurity tactics for defense against dynamic threats

The essence of a coalition approach in offensive cyber operations is straightforward: combining forces to enhance cyber defense capabilities. This approach is critical in today’s world, where cyber threats transcend national borders. By pooling resources, knowledge, and intelligence, a coalition approach facilitates a more comprehensive and effective response to cyber threats. In the financial industry for example we have FS-ISAC that supports all these. Effective implementation involves establishing clear communication channels, defining shared objectives, and ensuring mutual trust among participating entities. ... Looking ahead, the line between offense and defense in cybersecurity is blurring. The future I envision is one where these two are not distinct entities but different aspects of a singular, holistic strategy. Offensive tools will be used not just to attack but to inform, to scout for threats and act before they materialize. This integrated approach is akin to a martial artist’s stance, ready to block and strike simultaneously.


CES 2024: Will the Coolest New AI Gadgets Protect Your Privacy?

As Tschider points out, "COPPA doesn’t have any cybersecurity requirements to actually reinforce its privacy obligations. This issue is only magnified in contemporary AI-enabled IoT because compromising a large number of devices simultaneously only requires pwning the cloud or the AI model driving function of hundreds or thousands of devices. Many products don't have the kind of robust protections they actually need." She adds, "Additionally, it relies primarily on a consent model. Because most consumers don't read privacy notices (and it would take well over a hundred days a year to read every privacy notice presented to you), this model is not really ideal." For Tschider, a superior legal framework for consumer electronics might take bits of inspiration from HIPAA, or New York State's cybersecurity law for financial services. But really, one need only look across the water for an off-the-shelf model of how to do it right. "For cybersecurity, the NIS 2 Directive out of the EU is broadly useful," Tschider says, adding that "there are many good takeaways both from the General Data Protection Regulation and the AI Act in the EU."


Critical Components for Data Fabric Success

In a physical data fabric, users access data, run analytics on it, or use APIs at a consumption layer to deliver the data wherever it is needed. Prior to that, data is modeled, prepared, and curated in the discovery layer, and transformed and/or cleansed as needed in the orchestration layer. In the ingestion layer, data is drawn from one or more data sources (which can be on premises or in the cloud) and stored in the persistence layer, which is usually a data lake or data warehouse. Logical data fabrics integrate data using data virtualization to establish a single, trusted source of data regardless of where the data is physically stored. This enables organizations to integrate, manage, and deliver distributed data to any user in real time regardless of the location, format, and latency of the source data. Unlike a logical data fabric, a physical data fabric requires the ability to physically centralize all the required data from multiple sources before it can deliver the data to consumers. Data also needs to be physically transformed and replicated every time and be adapted to each new use case. 
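The logical-fabric idea — a single query interface over data that stays where it lives — can be sketched minimally as follows. All names here are illustrative, not any specific product's API:

```python
class LogicalDataFabric:
    """Toy virtualization layer: one query interface, data stays at its source."""
    def __init__(self):
        self.sources = {}  # dataset name -> zero-arg callable returning rows

    def register(self, name, fetch):
        self.sources[name] = fetch

    def query(self, name, predicate=lambda row: True):
        # No physical centralization: rows are fetched and filtered on demand,
        # regardless of where the source actually holds them.
        return [row for row in self.sources[name]() if predicate(row)]

# Two stand-in sources: an "on-premises" list and a simulated cloud API.
on_prem_orders = [{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]

fabric = LogicalDataFabric()
fabric.register("orders", lambda: on_prem_orders)
fabric.register("inventory", lambda: [{"sku": "A-1", "site": "cloud"}])

eu_orders = fabric.query("orders", lambda r: r["region"] == "EU")
print(eu_orders)  # [{'id': 1, 'region': 'EU'}]
```

A physical fabric, by contrast, would copy and transform the rows from both sources into a central store (a lake or warehouse) before any consumer could query them — the replication step this sketch deliberately omits.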


Boost Your Business With Digital Twin Technology

Digital twins allow businesses to answer questions that can directly impact strategic and operational decisions. “Organizations can move from answering simple questions about asset performance to understanding how these assets -- machines, assembly lines, supply chains -- will operate in the future, and what actions the business can take to meet performance and uptime goals,” Mann explains. Manufacturers are the businesses most likely to gain value from digital twin technology. “Manufacturers look to understand the causes of downtime, model scenarios to improve efficiency, and reduce waste,” says Devin Yaung, senior vice president, group enterprise, IoT products and services, at technology and business solutions provider NTT, in an email interview. Digital twins of individual machines permit instant views into maintenance issues and potential failures. “The growth of connected IoT sensors and devices has allowed all industries to gain insights into assets,” Yaung says. “Because of this explosion of connectivity, we are seeing large adoption not only in manufacturing but also in utilities, mining, hospitals, ports, airports, logistics/transportation, agriculture, and many other industries.”


Hey Gen. Z, you’re looking for tech jobs in all the wrong places

The pace of digital adoption and technological change today is far greater than it's ever been, according to Ger Doyle, senior vice president of US-based IT staffing firm Experis. The rise of AI and genAI is likely to accelerate that trend, “so new graduates, as well as those in the workforce today, need to embrace a concept of life-long learning to stay relevant in the new world,” Doyle said. Pandor agreed: “Candidates should remain consistently curious throughout the job-searching process. Keeping up to date with the latest trends and developments in the digital world by reading technical news enables them to showcase their interest in the ever-changing sector when they do land a job interview. From a more practical perspective, talent can also continue to practice and enhance their technical skills while job hunting so that they are ready to hit the ground running.” Younger job candidates might not be aware of the breadth and diversity of roles available, Pandor said, and they shouldn’t rule out other opportunities early in their careers.



Quote for the day:

“Nobody talks of entrepreneurship as survival, but that’s exactly what it is.” -- Anita Roddick

Daily Tech Digest - January 10, 2024

It's Time to Take a Modern Approach to Password Management

Recognized bodies such as the W3C are advocating standards for decentralized identity. While regulations and other aspects such as authorization, role, and attribute-based access are still developing, businesses and institutions now have the opportunity to create interoperable designs that can seamlessly integrate with this new model. In this architecture, the most trusted identity providers are likely to play a dominant role as decentralized issuers (DID), which will be crucial for the adoption of VCs. Users are more likely to trust these established brands to certify their digital credentials. However, new vendors, brands, and institutions may emerge to compete in this space and position themselves as market leaders. Furthermore, a witness ledger, which offers traceability and trust of VC transactions, will likely be supported by a technology similar to a blockchain network but more eco-friendly. This will enable digital merchants to verify the credibility of a credential and, ultimately, of their potential customers. 
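The issue-then-verify flow behind verifiable credentials can be sketched in a few lines. This is a deliberately simplified stand-in: real VCs use public-key signature schemes (such as Ed25519) so that anyone can verify without the issuer's secret, whereas this sketch uses an HMAC purely to illustrate the mechanics of proofs over claims; the key name and claim fields are invented for the example.

```python
import hashlib
import hmac
import json

# Hypothetical issuer secret; a real issuer would hold a private signing key.
ISSUER_KEY = b"issuer-secret"

def issue_credential(claims: dict) -> dict:
    """Bind a proof to a canonical serialization of the claims."""
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}

def verify_credential(cred: dict) -> bool:
    """Recompute the proof and compare in constant time."""
    payload = json.dumps(cred["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cred["proof"], expected)

vc = issue_credential({"subject": "did:example:123", "degree": "BSc"})
print(verify_credential(vc))    # True
vc["claims"]["degree"] = "PhD"  # tampering invalidates the proof
print(verify_credential(vc))    # False
```

The point the section makes about trusted issuers shows up here: verification is only as trustworthy as the party controlling the issuing key.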


Putting AI to Work: Systems of Intelligence and Actionable Agency

Pervasive AI will create a new System of Intelligence (SoI) that integrates data, technologies, platforms, and practices for the purposes of finding and understanding patterns, extracting insights, promoting efficiency and creativity, and facilitating decision-making. This will illuminate how organizations actually perform on a functional basis, through real-time data inputs, allowing people greater awareness and meaningful action. Here is why: The system of intelligence is designed to work in a way that is different from traditional data systems or systems of record. Rather than requiring users to know how to extract insights from the data, the system of intelligence is designed to identify, ask questions, and provide insights in a way that is easy for all users to understand. Standards and practices of the SoI are still emerging, which gives leaders the rare chance to both learn from and guide the development of a new system of work in the coming year. This is necessary work, since imagining that nothing will change with AI is akin to thinking that, at the dawn of television, radio would simply be transposed wholesale, with no particular effect on culture, process, or business models.


Leveraging Blockchain Technology to Counter the Threat of Deepfake Videos

One of the fundamental features of blockchain is its immutability. Once data is added to a blockchain, it becomes virtually impossible to alter or erase. Applying this characteristic to video content could create an immutable record of the original footage, ensuring that any subsequent alterations or manipulations would be immediately apparent. ... Blockchain’s timestamping capabilities can provide a reliable chronology of when content is created, modified, or accessed. Integrating blockchain into the video creation process allows for the creation of a verifiable and transparent timeline for each piece of content. This timestamping ensures that any attempt to manipulate videos would be easily traceable, enabling swift identification of the source of misinformation and aiding in the attribution of responsibility. ... Blockchain operates on a decentralized network of nodes, each maintaining a copy of the ledger. This decentralized nature can be harnessed for video verification, where multiple nodes across the network can independently verify the authenticity of a given video.
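The immutability and timestamping properties described above reduce, at their core, to hash-chaining: a content fingerprint is anchored in a record that links to the previous record, so rewriting any entry breaks every later link. The sketch below is a hypothetical toy ledger, not any production blockchain, and the byte strings stand in for real video data.

```python
import hashlib
import time

def fingerprint(video_bytes: bytes) -> str:
    """Content fingerprint of the original footage."""
    return hashlib.sha256(video_bytes).hexdigest()

# Toy ledger: each record links to the previous one, so altering any
# anchored entry would change every subsequent link.
ledger = []

def anchor(video_bytes: bytes) -> dict:
    prev = ledger[-1]["link"] if ledger else "genesis"
    record = {"hash": fingerprint(video_bytes), "ts": time.time(), "prev": prev}
    record["link"] = hashlib.sha256(
        (record["hash"] + str(record["ts"]) + prev).encode()
    ).hexdigest()
    ledger.append(record)
    return record

original = b"...raw video frames..."
anchor(original)

# Verification: a manipulated copy no longer matches the anchored hash.
tampered = original + b"deepfake edit"
print(fingerprint(tampered) == ledger[0]["hash"])  # False
print(fingerprint(original) == ledger[0]["hash"])  # True
```

The decentralized verification the section mentions amounts to many independent nodes holding copies of this ledger and running the same comparison.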


How to Build Team Culture in a Remote-Work World

A positive team culture leads to happier employees. This may result in increased productivity over the long run. Because a positive work environment leads to things like friendships and increased levels of support between coworkers, you're more likely to see lower turnover rates and higher employee retention rates when you emphasize team culture. Positive team cultures also reduce levels of stress and anxiety among employees. With a less stressful environment, your skilled workers are more likely to remain with your company long-term. Additionally, they'll share their positive experiences as an employee. As this word-of-mouth spreads, your positive team culture may eventually result in your business becoming a sought-after place to work. In addition to these hiring and personnel benefits, positive team cultures correlate directly with profitability. You might engage in a chance discussion with a coworker over the water cooler or in the breakroom — leading to opportunities, collaborations and innovations that otherwise might not have happened.


Faster than ever: Wi-Fi 7 standard arrives

For home networks, Wi-Fi 7 enhances the performance of smart home devices, providing a more reliable connection for Internet of Things technologies. The improved bandwidth and speed are perfect for families, like mine, which have multiple devices streaming high-definition content simultaneously. Your overall Wi-Fi performance, whether it's just you or your family and friends, will see a dramatic improvement. In businesses, Wi-Fi 7 can support more devices with minimal interference. This capability makes it ideal for large offices and coworking spaces. The improved speed and stability facilitate seamless video conferencing and efficient cloud-based applications, which are essential for modern companies. All that's the good news. The bad news is that the 6 GHz wireless spectrum uses shorter wavelengths. Short wavelengths are great for fast data transfers at close range, so they're great for connecting to your Wi-Fi 7-enabled HDTV a few feet away from your router. But short wavelengths are poor at connecting at long distances and suffer greater interference from physical obstructions, such as dense walls or floors in a building.
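The wavelength claim is easy to check from the relation λ = c / f: higher frequency means shorter waves, and 6 GHz waves are less than half the length of 2.4 GHz ones, which is why they fare worse through walls and floors.

```python
# Wavelengths of the common Wi-Fi bands, from lambda = c / f.
c = 299_792_458  # speed of light, m/s

for band_ghz in (2.4, 5, 6):
    wavelength_cm = c / (band_ghz * 1e9) * 100
    print(f"{band_ghz} GHz -> {wavelength_cm:.1f} cm")
# 2.4 GHz -> 12.5 cm
# 5 GHz -> 6.0 cm
# 6 GHz -> 5.0 cm
```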


3 Essential Attitudes & Dispositions of Good Corporate Governance

While leadership and governance are two concepts that often work in tandem, every director must understand that they are not the same thing. Leadership refers to a person’s ability to influence the attitudes and actions of others to lead them toward a common goal. Governance, on the other hand, should be about making decisions that lead to increased corporate performance and meeting or exceeding agreed-upon targets. Being a good leader without shifting their mindset toward governance can make directors behave in selfish and territorial ways, putting undue focus on their own desires and beliefs. On the other hand, having the power to govern without the skill of leadership can often lead to passivity and a bent toward bureaucracy, which can easily stop board progress in its tracks. ... Many directors — especially those who are new to the position — struggle with speaking up when they see something happening that needs the board’s attention. They may be worried about receiving private or public backlash or derailing the company’s progress toward meeting targets. 


Why most companies suck at digital transformation

Focus on architecture in the wide, without forgetting architecture in the narrow. Enterprises need to understand the holistic architecture required to support positive DX outcomes, not just focus on individual systems. This is an outcome of a comprehensive strategy, in that we’re utilizing all systems in place, including legacy and other on-premises assets, and establishing how they will work and play well with migrated or net-new systems existing on public clouds. If companies focus only on small systems or architectures, they usually neglect to understand how those will exist within a strategically defined DX ecosystem. This results in decoupled projects that may be impressive on their own but provide little or no value to the larger strategy, which matters more than the parts that make it up. ... The most significant issue is that most don’t even understand what digital transformation is, even those with the term in their titles. Instead, they focus on the tactics, meaning tools and technology, never understanding the plan to make things incrementally better.


Researchers develop technique to prevent software bugs

Baldur took several months to build. The work was done as a collaboration with Google, and built on top of a significant amount of prior research. First, whose team performed its work at Google, started with Minerva, an LLM trained on a large corpus of natural-language text and then fine-tuned on 118GB of mathematical scientific papers and webpages containing mathematical expressions. Next, she further fine-tuned the LLM on Isabelle/HOL, the language in which the mathematical proofs are written. Baldur then generated an entire proof and worked in tandem with the theorem prover to check its work. When the theorem prover caught an error, it fed the proof, as well as information about the error, back into the LLM, so that it could learn from its mistake and generate a new and, hopefully, error-free proof. This process yields a remarkable increase in accuracy over Thor, an existing tool for automatically generating proofs, which can generate proofs 57% of the time. 
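The generate-check-repair loop described above can be sketched as follows. Both `llm_generate` and `prover_check` are stand-ins invented for illustration — they do not reflect Baldur's actual interfaces — but the control flow mirrors the description: generate a proof, hand it to the prover, and on failure feed the failed proof plus the error message back into the model.

```python
# Hedged sketch of an LLM + theorem-prover repair loop (stand-in functions).

def llm_generate(theorem: str, prior_proof=None, error=None) -> str:
    # Stand-in: a real system would query the fine-tuned LLM here,
    # conditioning on the failed proof and the prover's error message.
    return "by simp" if error else "by auto"

def prover_check(theorem: str, proof: str):
    # Stand-in for Isabelle/HOL: returns (ok, error_message).
    if proof == "by simp":
        return True, None
    return False, "auto failed"

def prove(theorem: str, max_attempts: int = 4):
    proof, error = None, None
    for _ in range(max_attempts):
        proof = llm_generate(theorem, proof, error)
        ok, error = prover_check(theorem, proof)
        if ok:
            return proof
    return None  # give up after the attempt budget is spent

print(prove("rev (rev xs) = xs"))  # "by simp", found after one repair round
```

The key design choice the loop captures is that the prover's error output is not discarded: it becomes part of the next prompt, turning verification failures into training signal for the next attempt.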


Reconciling Agile Development With AI Safety

On one hand, the Agile principles of an iterative approach, regular risk management checks, cross-expertise collaboration, solicitation of third-party feedback at every stage, and adaptability to changing priorities or new findings seem well-suited to the seamless incorporation of responsible AI practices. However, we think responsible AI development will require a full revamp of the software development lifecycle, from pre-training assessments of data to post-deployment monitoring for performance and safety. Some practices, such as automated algorithmic checks (like tests for data bias and model performance metrics, all of which are part of Stanford’s HELM set of evaluations) can be utilized anywhere in the development lifecycle. Other techniques may be purely ex post, like algorithmic audits. An Agile approach avoids engineering siloes, allowing for stage-specific practices to be adopted where necessary, while ensuring stage-agnostic practices are adopted at all relevant stages, and allowing these stages to proceed in tandem.


Modern-day manufacturing: A process built on data governance

The average manufacturer generates high volumes and different types of data, including customer information, production orders, and shipment tracking, to name a few. This is further compounded with every supplier, distributor, and third party that’s added to the supply chain. Without a system to validate all this data, a manufacturer can find itself with inaccurate or incomplete data. Poor data quality not only leads to operational inefficiencies and mistakes, it also hinders the organization’s growth by limiting its ability to forecast demands and plan production runs. ... Within complex manufacturing ecosystems, it can be unclear who owns data as it flows across the supply chain. Various teams generate and use different types of data, making ownership and responsibility a challenge to pin down. ... Establishing data ownership involves identifying primary stakeholders who are responsible for ensuring the quality, security, and correct use of data assets. 



Quote for the day:

"We live in a society obsessed with public opinion. But leadership has never been about popularity." -- Marco Rubio

Daily Tech Digest - January 09, 2024

10 ways to destroy developer happiness

Who doesn’t get annoyed by endless meetings? Developers are busy people, and most would rather spend their time coding than talking about it. Meetings that are not focused and efficient are a frequent source of disenchantment. “Meetings that drag on without contributing to progress can be very draining,” says Vlad Gukasov, software development engineer at Amazon. “These often take up valuable time that could be better spent on actual development work.” ... Unnecessary red tape can be incredibly frustrating to developers. “Navigating through layers of bureaucracy can be quite stifling,” Gukasov says. “The complexity of internal procedures can sometimes hinder the smooth progress of software development.” Developers like efficiency, says Remi Desmarais, director of engineering and software development at software company Tempo Software. “They frequently encounter delays, from waiting for clarification on requirements, to code processes like compilation, building and testing to seeking approval from code reviewers, which can hinder their progress,” he says.


Data Center Cooling: Embracing Liquid Cooling for the Era of Sustainable and Efficient Operations

While the adoption of liquid cooling undeniably offers an enticing remedy for the thermal challenges posed by high-density racks, its integration into data center management introduces a new set of considerations and complexities. The deployment of liquid cooling systems necessitates a bespoke infrastructure, comprising specialized components such as pumps, heat exchangers, and filtration systems. These elements work in concert to ensure the seamless circulation and efficient heat dissipation of the liquid coolant throughout the intricate network of electronic components. Beyond the physical requirements, the use of liquid coolants imposes a critical need for stringent safety protocols and specialized training for personnel entrusted with the operation and maintenance of these systems. The introduction of liquid into the data center ecosystem marks a shift that extends beyond hardware considerations, demanding a holistic approach to facility management and personnel training to guarantee the safe and effective functioning of these advanced cooling solutions.


Meet the industrial metaverse: How Sony and Siemens seek to unleash the power of immersive engineering

According to Siemens CEO Roland Busch, "This will empower customers to accelerate innovation, enhance sustainability, and adopt new technologies faster and at scale, leading to a profound transformation of entire industries and our everyday lives. Together with our customers and partners, Siemens is proud to announce new products that will bring the industrial metaverse a step closer to all of us." ... Another interesting phrase Siemens has been using is immersive engineering. The idea is that engineering, design, and content creation can be done inside a 3D environment using a toolkit called the Siemens Xcelerator portfolio. Siemens offers a product called NX Immersive Designer. This tool is designed to "seamlessly connect the real and digital worlds." Essentially, users can create a digital twin, a version of a real-world system modeled and simulated in the virtual world. Digital twins replicate physical entities with accurate virtual models, helping engineers simulate performance testing and make predictions about points of failure. The virtual nature of the twin allows many more variations and usages compared to the cost and risk of building a single physical prototype. 


Leadership opportunities must align with an individual's natural strengths: Gallup’s Rohit Kar

In addressing the correlation between effective leadership and employee engagement, our approach to leadership development strategy emphasises the enduring elements that remain constant amid technological advancements and societal shifts. We prioritise understanding the unchanging fundamental aspects of human nature, acknowledging that regardless of external influences, certain core needs persist. At Gallup, when we delve into engagement, we recognise that employees bring their emotions into the workplace. To measure these emotions productively, we simplify the process, ensuring it remains straightforward and outcome-oriented. Our focus lies in enabling managers to engage in meaningful conversations by providing adaptable frameworks and tools. We ensure scalability, speed, and agility in our delivery methods to cater to changing consumption patterns without altering the essence of what we deliver – because the core principles of management don't require alteration. Our unique approach lies in retaining fundamental principles while adapting to the evolving landscape. 


New York Times’ blockbuster suit could decide the fate of genAI

Microsoft and OpenAI say their use of copyrighted material is transformative. They contend the output of the chatbots transforms the original content into something different. The Times suit claims there’s no real transformation, that what Microsoft and OpenAI are doing is outright theft. It claims the companies are not just stealing Times content, but their audience as well, and making billions of dollars from it. People will have no need to read the Times either online or in print, if they can get all the newspaper’s information for free from a chatbot instead, the suit alleges. “There is nothing ‘transformative’ about using The Times’s content without payment to create products that substitute for The Times and steal audiences away from it. Because the outputs of Defendants’ GenAI models compete with and closely mimic the inputs used to train them, copying Times works for that purpose is not fair use.” ... So, who’s right? This is not a difficult call. The answer is simple. The Times is right. Microsoft and OpenAI are wrong. Microsoft and OpenAI are getting a free ride to use copyrighted material that takes a tremendous amount of time and money to create, and uses that material to reap big profits.


After Orange Disruption, Brace for More BGP Route Hijacking

Orange's attacker appeared to have obtained and used a valid password for the telco's administrator account with RIPE, for which two-factor authentication wasn't enabled. Security experts report that the source of the password appears to have been information-stealing malware called Raccoon. After gaining access to the account, the attacker used RIPE's hosted RPKI resource certification service to broadcast a valid, cryptographically signed route origin authorization to direct traffic to an autonomous system number not controlled by Orange, resulting in the traffic never reaching its intended destination. ... The telecommunications giant subsequently told Information Security Media Group in a statement: "The Orange account in the IP network coordination center (RIPE) has suffered an improper access that has affected the browsing of some of our customers." The company said it had immediately responded and resolved the problem later Wednesday and that "appropriate measures have been taken to prevent such an incident from happening again."
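The ROA mechanism at the center of this incident can be illustrated with a minimal route origin validation check in the spirit of RFC 6811: a route is "valid" only if some ROA covers its prefix, authorizes its origin ASN, and permits its prefix length. The ROA data below is invented for the example, not Orange's actual records.

```python
import ipaddress

# Illustrative ROA set: prefix 192.0.2.0/24 may only originate from AS64500.
roas = [
    {"prefix": ipaddress.ip_network("192.0.2.0/24"),
     "max_length": 24,
     "asn": 64500},
]

def rov_state(prefix_str: str, origin_asn: int) -> str:
    """Classify an announced route as valid, invalid, or not-found."""
    prefix = ipaddress.ip_network(prefix_str)
    covering = [r for r in roas if prefix.subnet_of(r["prefix"])]
    if not covering:
        return "not-found"  # no ROA covers this prefix
    for roa in covering:
        if roa["asn"] == origin_asn and prefix.prefixlen <= roa["max_length"]:
            return "valid"
    return "invalid"  # covered, but wrong origin or too-specific prefix

print(rov_state("192.0.2.0/24", 64500))      # valid
print(rov_state("192.0.2.0/24", 64666))      # invalid: unauthorized origin
print(rov_state("198.51.100.0/24", 64500))   # not-found
```

Note the limitation the Orange incident exposes: ROV protects against routes that contradict the signed ROAs, but if an attacker compromises the registry account and publishes a validly signed ROA, networks performing ROV will treat the malicious route as legitimate.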


Data Governance & Controls: An Increasingly Critical Foundation for AML Compliance Programs

Ensuring effective AML compliance necessitates a steadfast commitment to data accuracy and integrity, achievable only through robust governance. Financial institutions can only rely on the consistency and correctness of their AML compliance solutions by establishing stringent controls ensuring accuracy and integrity. The significance of data accuracy and integrity can easily be seen in relation to false positive and false negative alerts in transaction monitoring. False positives, stemming from inaccurate data triggering unnecessary alerts, can lead to a significant waste of resources and needless increases in program costs. Conversely, false negatives, arising from incomplete or unreliable data, may cause suspicious activity to go unreported, which could lead to potential regulatory action. Financial institutions that build strong data accuracy and integrity standards into their processes minimize these risks, thereby enhancing the overall efficiency and efficacy of their AML compliance program.


Designing an IT department for a world defined by change

For years, inside the technology department, we’ve referred to “the business”, meaning everything other than IT, and for its part, “the business” has spent years trying to isolate IT as a mere cost centre. But nearly a quarter of the way through the 21st century, the need to integrate digital tools into every part of business means this splendid isolation no longer really works. Even in the most performant IT departments, centralised decision-making around tooling and technology choices falls behind the galloping pace of the market, and it's difficult to imagine how it could ever be otherwise in a department of finite resources. ... Still, by refusing to allow the curation of information technology outside of that provided by the technology function, it becomes a self-fulfilling prophecy - the only people expected to understand the challenges of technology are those hired into the IT department. The only way to break that vicious cycle is to discard the model of the central IT department. Once you start questioning the existence of the IT department, it’s only a short journey to rejecting the concept of functional structures entirely.


Building a Better Analytics Team

Once you have the right group of people on your team, your challenge is to build an environment where they can work effectively. Ensure that they are clear about the organization’s mission, vision, values, and objectives. Create an environment where the connection between your team’s objectives, projects, and tasks and the targets and objectives of the organization are clear. Team members need to understand how the data architecture being built will have a direct impact on the decisions the organization is making and how their activities further the goal of becoming a more data-driven organization. Team building is not a one-time activity. It is built into everyday activities, including goal setting, performance reviews, project planning and execution, team meetings, and one-on-one meetings between you and your direct reports. The clarity of purpose is something that increases each time it is reviewed and as the team’s objectives and goals are refined and realigned. As a leader, one of your main objectives is to remove obstacles that arise. 


The Rise of Dual Ransomware Attacks

Two ransomware attacks mean that companies could be facing more data encryption, more data exfiltration, more data leaks, and multiple demands for payment. If an enterprise is dealing with a single attacker behind two attacks, they could be facing the challenge of different ransomware strains. How did the two strains impact an organization’s systems, and what will it take to remediate and return to normal operations? How much data was taken? Ransomware groups can take that exfiltrated data to leak sites to up the pressure on victims to pay. If two different attackers are in play, recovery can be even more complicated. Two different attackers may encrypt the same files. “We've had a couple of cases where the first ransomware incident occurs. They encrypted a bunch of files. In the midst of dealing with that, a second attack…those attackers encrypt the encrypted files of the other attackers,” Minder shares. “If that happens, where you get encrypted files encrypted again, the likelihood of corruption goes up a thousand percent, and you may not get your files back at all.”



Quote for the day:

“Our greatest fear should not be of failure but of succeeding at things in life that don't really matter.” -- Francis Chan