
Daily Tech Digest - April 15, 2025


Quote for the day:

“Become the kind of leader that people would follow voluntarily, even if you had no title or position.” -- Brian Tracy



Critical Thinking In The Age Of AI-Generated Code

Besides understanding our own code, reviewing AI-generated code is an invaluable skill nowadays. Tools like GitHub's Copilot and DeepCode can code-review better than a junior software developer. Depending on the complexity of the codebase, they can save us time in code reviewing and pinpoint cases we may have missed, but they are not flawless. We still need to verify that the AI assistant's review produced no false positives or false negatives, that it did not miss anything important, and that it got the context right. The hybrid approach seems to be the most effective one: let AI handle the grunt work and rely on developers for the critical analysis. ... After all, reviewing AI-generated code is an excellent opportunity to educate ourselves while improving our code-reviewing skills. Keep in mind that, to date, AI-generated code optimizes for patterns in its training data, which may not align with coding first principles. It may follow templated solutions rather than custom designs, and it may include unnecessary defensive code or overly generic implementations. We need to check that it has chosen the most appropriate solution for each generated code block. Another common problem is that LLMs may hallucinate.
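The review habits described here are easiest to see in a hypothetical example; the function names and "before/after" code below are invented for illustration, not taken from any AI tool's output. AI-generated code often wraps a simple task in templated, defensive scaffolding that a human reviewer should question:

```python
# Hypothetical illustration: the kind of templated, overly defensive code an
# AI assistant may generate, next to what a reviewer might reduce it to.

# AI-generated style: swallows every error and hides the caller's mistakes
def average_generated(values):
    try:
        if values is None:
            return None
        if not isinstance(values, (list, tuple)):
            values = list(values)
        if len(values) == 0:
            return None
        total = 0
        for v in values:
            total += v
        return total / len(values)
    except Exception:
        return None

# After review: same result for valid input, but failures are loud and clear
def average_reviewed(values):
    if not values:
        raise ValueError("average requires a non-empty sequence")
    return sum(values) / len(values)
```

Neither version is wrong in isolation; the reviewer's job is to ask whether the defensive branches reflect real requirements or merely patterns from the model's training data.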


DeepCoder: Revolutionizing Software Development with Open-Source AI

One of the DeepCoder project’s most significant contributions is the introduction of verl-pipeline, an optimized extension of verl, the open-source RLHF library. The team identified sampling (the generation of long token sequences) as the primary bottleneck in training and developed “one-off pipelining” to address this challenge. This technique overlaps sampling, reward calculation and training, reducing end-to-end training times by up to 2.5x. This optimization is game-changing for coding tasks requiring thousands of unit tests per reinforcement learning iteration, making previously prohibitive training runs accessible to smaller research teams and independent developers. For DevOps professionals, DeepCoder represents an opportunity to integrate advanced code generation directly into CI/CD pipelines without dependency on API-gated services. Teams can fine-tune the model on their codebase, creating customized assistants that understand their specific architecture and coding patterns. ... DeepCoder’s open-source nature aligns with the DevOps collaboration and shared improvement philosophy. As more organizations adopt and contribute to the model, we can expect to see specialized versions emerge for different programming languages and problem domains.
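The pipelining idea can be sketched in a few lines. This is not the actual verl-pipeline API, just a minimal illustration of the overlap it describes: while the trainer consumes batch i, a background worker is already sampling batch i+1, so generation latency hides behind reward calculation and training instead of adding to it.

```python
# Minimal sketch of "one-off pipelining" (illustrative only, not verl's API):
# prefetch the next sampling batch while the current one is scored and trained.
from concurrent.futures import ThreadPoolExecutor
import time

def sample_batch(i):
    """Stand-in for long token-sequence generation, the stated bottleneck."""
    time.sleep(0.01)
    return [f"rollout-{i}-{j}" for j in range(4)]

def score_and_train(batch):
    """Stand-in for reward calculation plus a gradient step."""
    time.sleep(0.01)
    return len(batch)

def pipelined_training(num_iters):
    trained = 0
    with ThreadPoolExecutor(max_workers=1) as sampler:
        future = sampler.submit(sample_batch, 0)      # prefetch first batch
        for i in range(num_iters):
            batch = future.result()                   # wait only if sampling lags
            if i + 1 < num_iters:
                future = sampler.submit(sample_batch, i + 1)  # overlap next batch
            trained += score_and_train(batch)
    return trained
```

The gain is largest exactly when sampling dominates the iteration time, which matches the bottleneck the DeepCoder team identified.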


Transforming Software Development

AI assistants are getting smarter, moving beyond prompt-based interactions to anticipate developers’ needs and proactively offer suggestions. This evolution is driven by the rise of AI agents, which can independently execute tasks, learn from their experiences and even collaborate with other agents. Next year, these agents will serve as a central hub for code assistance, streamlining the entire software development lifecycle. AI agents will autonomously write unit tests, refactor code for efficiency and even suggest architectural improvements. Developers’ roles will need to evolve alongside these advancements. AI will not replace them. Far from it; proactive AI assistants and their underlying agents will help developers build new skills and free up their time to focus on higher-value, more strategic tasks. ... AI models are more powerful when trained on internal company data, which allows them to generate insights specific to an organization’s unique operations and objectives. However, this often requires running models on premises for security and compliance reasons. With open source models rapidly closing the performance gap with commercial offerings, more businesses will deploy models on premises in 2025. This will allow organizations to fine-tune models with their own data and deploy AI applications at a fraction of the cost.


Cybercriminal groups embrace corporate structures to scale, sustain operations

We have seen cross-collaboration between groups that specialize in specific activities. For example, one group specializes in social engineering, while another focuses on scaling malware and botnets to uncover open servers that yield database breaches. They, in turn, can sell access to those who focus on ransomware attacks. Recently, we have seen collaboration between AI/ML developers who scrape public records to build org charts, as well as lists of real estate holdings. This data is then used en masse with situational and location data to populate PDF attachments in emails that look like real invoices, with executives’ names in fake prior email responses as part of the thread. ... the recent development in hackers organizing into larger groups has allowed the stakes to get even higher. Look at the Lazarus Group, who pulled off one of the largest heists ever by targeting Bybit and stealing $1.5 billion in Ethereum, subsequently converting $300 million into unrecoverable funds. This group is likely state-sponsored and funding North Korean military programs. Therefore, understanding North Korean national interests will hint at future targets. The increasing scale of their attacks likely reflects greater resources allocated by North Korea, more sophisticated tooling and capabilities, lessons learned from previous operations, and a growing number of personnel trained in cyber operations.


Agentic AI might soon get into cryptocurrency trading — what could possibly go wrong?

Not everyone is bullish on the intersection of Web3, agentic AI and blockchain. Forrester Research vice president and principal analyst Martha Bennett is among those who are skeptical. In 2023, she co-authored an online post critical of Worldcoin, now the World project, and her opinion hasn’t changed in several regards. World project still faces major challenges, including privacy issues and concerns about its iris biometric technology, she said. And agentic AI is still in its early stages and not yet capable of supporting Web3 transactions. Most current generative AI (genAI) tools, including LLMs, lack the autonomy defined as “agentic AI.” “There’s no AI technology today that would be able to automate Web3 transactions in a reliable and secure manner,” she said. Given the risks and the potential for exploitation, it’s too soon to rely on AI systems with high autonomy for Web3 transactions. She did note, however, that Web3 already uses automation through smart contracts — self-executing electronic contracts with the terms of the agreement directly written into code. “Will Web3 go mainstream in 2025? My overall answer is no, but there are nuances,” she said. “If mainstream means mass consumer adoption, it’s a definite no. There’s simply not enough utility there for consumers.” Web3, Bennett said, is largely a self-contained financial ecosystem, and efforts to boost adoption through Decentralized Physical Infrastructure Networks (DePIN), such as Tools for Humanity’s, haven’t led to major breakthroughs.


Artificial Intelligence fuels rise of hard-to-detect bots 

“The surge in AI-driven bot creation has serious implications for businesses worldwide,” said Tim Chang, General Manager of Application Security at Thales. “As automated traffic accounts for more than half of all web activity, organisations face heightened risks from bad bots, which are becoming more prolific every day.” ... “This year’s report sheds light on the evolving tactics and techniques utilised by bot attackers. What were once deemed advanced evasion methods have now become standard practice for many malicious bots,” Chang said. “In this rapidly changing environment, businesses must evolve their strategies. It’s crucial to adopt an adaptive and proactive approach, leveraging sophisticated bot detection tools and comprehensive cybersecurity management solutions to build a resilient defense against the ever-shifting landscape of bot-related threats.” ... Analysis in the report reveals a deliberate strategy by cyber attackers to exploit API endpoints that manage sensitive and high-value data. Implications of this trend are especially impactful for industries that rely on APIs for their critical operations and transactions. Financial services, healthcare, and e-commerce sectors are bearing the brunt of these sophisticated bot attacks, making them prime targets for malicious actors seeking to breach sensitive information.


Humans at the helm of an AI-driven grid

A growing number of utilities are turning to AI-based tools to process vast data streams and streamline tasks once managed by manual calculation. For instance, algorithms can analyse weather patterns, historical consumption, and real-time sensor readings to make more accurate power demand and renewable energy generation forecasts. This supports more efficient balancing of supply and demand, reducing the likelihood of overloaded transformers or unexpected brownouts. Some utilities are also exploring AI-driven alarm management, which can filter the flood of alerts triggered by a network issue. Instead of operators sifting through hundreds of notifications, AI tools can be used to identify and highlight the most critical issues in real time. Another AI application is congestion management: detecting trouble spots on the grid where demand might exceed capacity and even proposing rerouting strategies to keep electricity flowing reliably. While still in their early stages, AI tools hold promise for driving operational efficiency in many daily scenarios. ... Even the smartest algorithm, however, lacks the broader perspective and accountability that people bring to grid management. Power and Utility companies are tasked with a public service mandate: they must ensure safety, affordability, and equitable access to electricity.
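The alarm-management idea can be sketched simply. The severity levels and alert fields below are invented for illustration and do not come from any utility's actual system; the point is only that ranking and truncating the alert flood is a small, inspectable operation that keeps the human operator in control:

```python
# Illustrative sketch: rank a flood of grid alerts so operators see the most
# critical (and, among equals, the most recent) first. Fields are invented.
SEVERITY = {"info": 0, "warning": 1, "critical": 2}

def top_alerts(alerts, limit=3):
    """Return the highest-priority alerts: critical before warning before info,
    and newer alerts before older ones at the same severity."""
    ranked = sorted(
        alerts,
        key=lambda a: (SEVERITY[a["severity"]], a["time"]),
        reverse=True,
    )
    return ranked[:limit]
```

A real deployment would learn these rankings from operator feedback rather than hard-code them, but the human-on-top principle is the same.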


CISO Conversations: Maarten Van Horenbeeck, SVP & CSO at Adobe

The digital divide is simple to understand but complex to solve. Fundamentally, it separates those who have access to cyber technology and cyber knowledge from those who do not. There are areas of the world and socio-economic groups or demographics who have little or very limited access to the internet, and consequently very little awareness of cybersecurity. But cyber and cyber threats are worldwide; and technology is increasingly integrated and interconnected globally. “Cyber issues emanating from the digital divide don’t just play out far away from our homes – they play out very close to our homes as well,” warns Van Horenbeeck. “There’s a huge divide between people who know, for example, not to reuse passwords, to use multi factor authentication, and those individuals that have none of that experience at all.” In effect, the digital divide creates a largely invisible and unseen threat surface for the long-connected world. He believes that technology companies can play a part in solving this problem by making cybersecurity features easy to understand and use, and cites two examples of the Adobe approach. “We invested, for example, in support for passkeys because we feel it’s a more effective and easier method of authentication that is also more secure.”


How AI, Robotics and Automation Transform Supply Chains

Enterprises designing robots to augment the human workforce need to take design thinking and ergonomic approaches into consideration. Designers must think about how robots comprehend and understand their physical surroundings without tripping over cables or objects on the floor, obstructing movement or causing human injuries. These robots are created to collaborate with humans on repetitive tasks and to lift heavy loads. Last year, OT.today featured stories on how humanoid robots augmented the human workforce at Amazon, Mercedes, NASA and the Piaggio Group. In 2017, Alibaba invested in AI labs and the DAMO Academy. At its flagship Computing Conference in 2018, held in Hangzhou, China, Alibaba showcased a range of robots designed for warehouses, autonomous deliveries and other sectors, including hospitality and pharmaceuticals. More recently, Alibaba invested in LimX Dynamics, a company specializing in humanoid and robotic technology. Japanese automobile manufacturers have been using industrial robots since the early 1980s. Chip manufacturing companies in Taiwan and other countries also use them. Robots assist in surgeries in the healthcare sector. But none of those early manufacturing robots resembled humanoids or even had the advanced AI seen in today's robots.


CIOs are overspending on the cloud — but still think it’s worth it

CIOs should also embrace DevOps practices tied to cost reduction when consuming cloud resources, Sellers says. One pitfall that doesn’t get enough attention: Many organizations don’t educate developers on the cost of cloud services, despite the glut of developer services large cloud providers make trivial to call. “I’ve lost track of how many services Amazon provides that developers can just use, and some of those can be quite expensive, but a developer doesn’t really know that,” Sellers says. “They’re like, ‘Instead of writing my own solution to this, I can just call this service that Amazon already provides, and boom, my job is done.’” The disconnect between developers and financial factors in the cloud is a real problem that leads to increased cloud costs, adds Nick Durkin, field CTO at Harness, provider of an AI-driven software development platform. Without knowing the costs of accessing a cloud-based GPU or CPU, for example, a developer is like a home builder who doesn’t know the cost of wood or brick, Durkin says. “If you’re not giving your smartest engineers access to the information about services that they can optimize on, how would you expect them to do it?” he says. “Then, finance comes back a month later with a beating stick.”

Daily Tech Digest - October 15, 2024

The NHI management challenge: When employees leave

Non-human identities (NHIs) support machine-to-machine authentication and access across software infrastructure and applications. These digital constructs enable automated processes, services, and applications to authenticate and perform tasks securely, without direct human intervention. Access is granted to NHIs through various types of authentication, including secrets such as access keys, certificates and tokens. ... When an employee exits, secrets can go with them. Those secrets – credentials, NHIs and associated workflows – can be exfiltrated from mental memory, recorded manually, stored in vaults and keychains, on removable media, and more. Secrets that have been exfiltrated are considered “leaked.” ... An equally great risk is that employees, especially developers, create, deploy and manage secrets as part of software stacks and configurations, as one-time events or in regular workflows. When they exit, those secrets can become orphans, whose very existence is unknown to colleagues or to tools and frameworks. ... The lifecycle of NHIs can stretch beyond the boundaries of a single organization, encompassing partners, suppliers, customers and other third parties.
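One way to make the orphan risk concrete is an offboarding check that flags NHIs whose only recorded owner has left. This is a hypothetical sketch, not a real secrets-management API; the record fields are invented for illustration:

```python
# Hypothetical offboarding check: flag non-human identities whose recorded
# owner is no longer an active employee, before they become orphans.
def orphaned_nhis(nhis, active_employees):
    """nhis: list of {"name": ..., "owner": ...} records.
    Returns the names of NHIs with no active owner."""
    active = set(active_employees)
    return [n["name"] for n in nhis if n["owner"] not in active]
```

In practice this assumes every NHI has an owner on record at all, which is exactly the inventory discipline the article argues most organizations lack.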


How Ernst & Young’s AI platform is ‘radically’ reshaping operations

We’re seeing a new wave of AI roles emerging, with a strong focus on governance, ethics, and strategic alignment. Chief AI Officers, AI governance leads, knowledge engineers and AI agent developers are becoming critical to ensuring that AI systems are trustworthy, transparent, and aligned with both business goals and human needs. Additionally, roles like AI ethicists and compliance experts are on the rise, especially as governments begin to regulate AI more strictly. These roles go beyond technical skills — they require a deep understanding of policy, ethics, and organizational strategy. As AI adoption grows, so too will the need for individuals who can bridge the gap between the technology and the focus on human-centered outcomes. ... Keeping humans at the center, especially as we approach AGI, is not just a guiding principle — it’s an absolute necessity. The EU AI Act is the most developed effort yet in establishing the guardrails to control the potential impacts of this technology at scale. At EY, we are rapidly adapting our corporate policies and ethical frameworks in order to, first, be compliant, but also to lead the way in showing the path of responsible AI to our clients.


The Truth Behind the Star Health Breach: A Story of Cybercrime, Disinformation, and Trust

The email that xenZen used as “evidence” was forged. The hacker altered the HTML code of an email using the common “inspect element” function—an easy trick to manipulate how a webpage appears. This allowed him to make it seem as though the email came directly from the CISO’s official account. ... XenZen’s attack demonstrates how cybercriminals are evolving. They are using psychological warfare to create chaos. In this case, xenZen not only exploited a vulnerability but also fabricated evidence to frame the CISO. The security community needs to stay vigilant and anticipate attacks that may target not just systems but also individuals and organizations through disinformation. ... Making the CISO a scapegoat for security breaches without proper evidence is a growing concern. Organizations must understand the complexities of cybersecurity and avoid jumping to conclusions. Security teams should have the support they need, including legal protection and clear communication channels. Transparency is essential, but so is the careful handling of internal investigations before pointing fingers.


How CIOs and CTOs Are Bridging Cross-Functional Collaboration

Ashwin Ballal, CIO at software company Freshworks, believes that the organizations that fail to collaborate well across departments are leaving money on the table. “Siloed communications create inefficiencies, leading to duplicative work, poor performance, and a negative employee experience. In my experience as a CIO, prioritizing cross-departmental communication has been essential to overcoming these challenges,” says Ballal. His team continually reevaluates the tech stack, collaborating with leaders and users to confirm that the organization is only investing in software that adds value. This approach saves money and helps keep employees engaged by minimizing their interactions with outdated technology. He also uses employees as product beta testers, and their feedback impacts the product roadmap. ... “My recommendation for other CIOs and CTOs is to regularly meet with departmental leaders to understand how technology interacts across the organization. Sending out regular surveys can yield candid feedback on what’s working and what isn’t. Additionally, fostering an environment where employees can experiment with new technologies encourages innovation and problem-solving.”


2025 Is the Year of AI PCs; Are Businesses Onboard?

With the rise of real-time computing needs and the proliferation of IoT devices, businesses are realizing the need to move AI closer to where the data is - at the edge. This is where AI PCs come into play. Unlike their traditional counterparts, AI PCs are integrated with neural processing units (NPUs) that enable them to handle AI workloads locally, reducing latency and providing a more secure computing environment. "The anticipated surge in AI PCs is largely due to the supply-side push, as NPUs will be included in more CPU vendor road maps," said Ranjit Atwal, senior research director analyst at Gartner. NPUs allow enterprises to move from reactive to proactive IT strategies. Companies can use AI PCs to predict IT infrastructure failures before they happen, minimizing downtime and saving millions in operational costs. NPU-integrated PCs also allow enterprises to process AI-related tasks, such as machine learning, natural language processing and real-time analytics, directly on the device without relying on cloud-based services. And with generative AI becoming part of enterprise technology stacks, companies investing in AI PCs are essentially future-proofing their operations, preparing for a time when gen AI capabilities become a standard part of business tools.


Australia’s Cyber Security Strategy in Action – Three New Draft Laws Published

Australia is following in the footsteps of other jurisdictions such as the United States by establishing a Cyber Review Board. The Board’s remit will be to conduct no-fault, post-incident reviews of significant cyber security incidents in Australia. The intent is to strengthen cyber resilience, by providing recommendations to Government and industry based on lessons learned from previous incidents. Limited information gathering powers will be granted to the Board, so it will largely rely on cooperation by impacted businesses. ... Mandatory security standards for smart devices - The Cyber Security Bill also establishes a framework under which mandatory security standards for smart devices will be issued. Suppliers of smart devices will be prevented from supplying devices which do not meet these security standards, and will be required to provide statements of compliance for devices manufactured in Australia or supplied to the Australian market. The Secretary of Home Affairs will be given the power to issue enforcement notices (including compliance, stop and recall notices) if a certificate of compliance for a specific device cannot be verified.


The Role of Zero Trust Network Access Tools in Ransomware Recovery

By integrating with existing identity providers, Zero Trust Network Access ensures that only authenticated and authorized users can access specific applications. This identity-driven approach, combined with device posture assessments and real-time threat intelligence, provides a robust defense against unauthorized access during a ransomware recovery. Moreover, ZTNA’s application-layer security means that even if a user’s credentials are compromised, the attacker would only gain access to specific applications rather than the entire network. This granular access control is crucial in containing ransomware attacks and preventing lateral movement across the network. ... As a cloud-native solution, ZTNA can easily scale to meet the demands of organizations of all sizes, from small businesses to large enterprises. This scalability is particularly valuable during a ransomware recovery, where the need for secure access may fluctuate based on the number of systems and users involved. ZTNA’s flexibility also allows it to integrate with various IT environments, including hybrid and multi-cloud infrastructures. This adaptability ensures that organizations can deploy ZTNA without the need for significant changes to their existing setups, making it an ideal solution for dynamic environments.
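The application-layer access model described here can be illustrated in a few lines. This is not a real ZTNA product API, only a sketch of the decision it makes on every request: access is granted per application, conditioned on identity and device posture, so a stolen credential never yields network-wide reach:

```python
# Illustrative sketch of a per-request ZTNA decision (not a real product API):
# access is evaluated per application, never granted network-wide.
def authorize(identity, device_healthy, app, entitlements):
    """Grant access only if this identity is entitled to this specific app
    and the requesting device passes its posture check."""
    return device_healthy and app in entitlements.get(identity, set())
```

Even with valid credentials, an attacker on an unhealthy device, or requesting an app outside the identity's entitlements, is denied, which is what contains lateral movement during a ransomware recovery.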


What Is Server Consolidation and How Can It Improve Data Center Efficiency?

Server consolidation is the process of migrating workloads from multiple underutilized servers into a smaller collection of servers. ... although server consolidation typically focuses on consolidating physical servers, it can also apply to virtual servers. For instance, if you have five virtual hosts running on the same physical server, you might consolidate them into just three virtual hosts. Doing so would reduce the resources wasted on hypervisor overhead, allowing you to maximize the return on investment from your server hardware. ... To determine whether server consolidation will reduce energy usage, you’ll have to calculate the energy needs of your servers. Typically, power supplies indicate how many watts of electricity they supply to servers. Using this number, you can compare how energy requirements vary between machines. Keep in mind, however, that actual energy consumption will vary depending on factors like CPU clock speed and how active server CPUs are. So, in addition to comparing the wattage ratings on power supplies, you should track how much electricity your servers actually consume, and how that metric changes before and after you consolidate servers.
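The wattage comparison is simple arithmetic, and a worked example makes it concrete. All figures below are invented for illustration; nameplate wattage gives only a first estimate, and measured draw should confirm it:

```python
# Worked example: estimate annual energy before and after consolidation from
# power-supply wattage ratings. All wattage figures are invented.
def annual_kwh(watts, hours=24 * 365):
    """Convert a continuous draw in watts to kilowatt-hours per year."""
    return watts * hours / 1000

before = sum(annual_kwh(w) for w in [450, 450, 400, 400, 350])  # five servers
after = sum(annual_kwh(w) for w in [600, 600, 550])             # three denser servers
savings = before - after  # estimated kWh saved per year
```

The same calculation run on metered consumption, rather than nameplate ratings, is what actually confirms whether the consolidation paid off.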


How DDoS Botnets Are Used to Infect Your Network

The threat posed by DDoS botnets remains significant and complex. As these malicious networks grow more sophisticated, understanding their mechanisms and potential impacts is crucial for organizations. DDoS botnets not only facilitate financial theft and data breaches but also enable large-scale spam and phishing campaigns that can undermine trust and security. To effectively defend against these threats, organizations must prioritize proactive measures, including regular updates, robust security protocols, and vigilant monitoring of network activity. By implementing strategies to identify and mitigate botnet attacks, businesses can safeguard their systems and data from potential harm. Ultimately, a comprehensive understanding of how DDoS botnets operate—and the strategies to combat them—will empower organizations to navigate the challenges of cybersecurity and maintain a secure digital environment. As a CERT-In empanelled organization, Kratikal is equipped to enhance your understanding of potential risks. Our manual and automated Vulnerability Assessment and Penetration Testing (VAPT) services proficiently discover, detect, and assess vulnerabilities within your IT infrastructure. 


Banks Must Try the Flip Side of Embedded Finance: Embedded Fintech

With a one-way-street perspective on embedded finance, the idea is that if payment volume is moving to tech companies then banks should power the back end of the tech experience. This is a good start but the threat from fintech companies to retail banks will only continue to deepen in the future. Customer adoption is higher than ever for some fintechs like Chime and Nubank, for example. A better approach would be for banks to use embedded fintech to improve customer experience by upgrading banks’ tech offerings to retain customers and grow within their customer base. Embedded fintech can help these organizations stay competitive technologically. ... There are many opportunities for innovation with embedded payroll. Banks are uniquely positioned to offer tailored payroll solutions that map to what small businesses today want. Payroll is complex and needs to be compliant to avoid hefty penalties. Embedded payroll lets banks offload costs, burdens and risks associated with payroll. Banks can offer faster payroll with less risk when they hold the accounts for employers and payees. They can also give business customers a fuller picture of their cash flow, offering them peace of mind. 



Quote for the day:

"Pull the string and it will follow wherever you wish. Push it and it will go nowhere at all." -- Dwight D. Eisenhower

Daily Tech Digest - March 10, 2024

What’s the privacy tax on innovation?

A few decades ago, California had one of the strongest definitions for certifying Organic foods in the US. Eventually, the US government stepped in with a watered-down definition. Despite the pain of new privacy controls, the US data broker industry will lobby for a similar approach to at least harmonize privacy regulations at the Federal level that limit the impact on their business models when operating across state lines. For businesses and consumers, a more equitable approach would be to add a few more teeth to the cost of data misuse arising from legal sales, employee theft, or breaches. A few high-profile payouts arising from theft or when this data is used as part of multi-million dollar ransomware attacks on critical business systems would have a focusing effect on better privacy management practices. Another option is to turn to banks as holders of trust. Banks may be a good first point for managing the financial data we directly share with them. But what about all the data that others gather that may not be tied to traditional identifiers like social security numbers (SSN) used to unify data, such as IP addresses, phone numbers, Wi-Fi hubs, or the trail of GPS dots that gravitate to your home or office?


Living with the ghost of a smart home’s past

There were the window shades that always opened at 8AM and always closed at sundown. My brother disconnected everything that looked like a hub, and still, operating on some inaccessible internal clock, the shades carried on as they were once programmed to do. ... This is the state of home ownership in 2024! People have been making their homes smart with off-the-shelf parts for well over a decade now. Sometimes they sell those homes, and the new homeowners find themselves mired in troubleshooting when they should be trying to pick out wall colors. Some former homeowners will provide onboarding to the home’s smart home system, but most do as the guy who used to own my brother’s house did. They walk away and leave it as an adventure for the next person. ... I really hope the new renters of my old Brooklyn walk-up appreciate all the 2014 Philips Hue lights I left installed in the basement. There’s a calculus you make as you’re moving. It’s a hectic time, and there’s a lot to be done. Do you want to spend half the day freeing all those Hue bulbs from their obnoxious and broken recessed light housings, or do you want to leave a potential gift for the next homeowner and get started on nesting in your new place? 


Overcoming the AI Privacy Predicament

According to one study by Brookings, while 57% of consumers felt that AI will have a net negative impact on privacy, 34% were unsure about how AI would affect their privacy. Indeed, AI evokes a mixed set of thoughts and emotions in consumers. For most people, the promise of AI is clear: from increasing efficiency, to automating mundane tasks and freeing up more time for creative work, to improving outcomes in areas such as healthcare and education. ... In the realm of AI, the lack of trust is significant. Indeed, 81% of consumers think the information collected by AI companies will be used in ways people are uncomfortable with, as well as in ways that were not originally intended. That consumers are put in a seemingly impossible predicament regarding their privacy leaves them little choice but to (a) consent, or (b) forgo use of the product or service. Both choices leave consumers wanting more from the digital economy. When a new technology has negative implications for privacy, consumers have shown they are willing to engage in privacy-protective behaviors, such as deleting an app, withholding personal information, or abandoning an online purchase altogether.


How Static Analysis Can Save Your Software

While static analysis is a means of pattern detection, fixing an actual bug (for example, dereferencing a null pointer) is much harder, albeit possible. It becomes mathematically difficult to track exponentially increasing possible states. We call this “path explosion.” Say you’re writing code that, given two integers, divides one by the other, and there are various failure modes depending on the integers’ values. But what if the denominator is zero? That results in undefined behavior, and it means you need to look at where those integers came from, their possible values and what branches they took along the way. If you can see that the denominator is checked against zero before the division — and branches away if it is — you should be safe from division-by-zero issues. This theoretical stepping through stages of code is called “symbolic execution.” It’s not too complicated if the checkpoint is fairly close to the division process, but the further away it gets, the more branches you must account for. Crossing the function boundary gets even trickier. But once you have calls from other translation units, the problem becomes intractable in the general case. 
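The division example can be made concrete. The sketch below shows the guarded pattern a symbolic-execution pass tries to prove safe: when the zero check sits right next to the division, the proof is easy, and every function boundary or extra branch between them multiplies the paths the analyzer must track.

```python
# The guarded-division pattern from the example above: a symbolic executor
# must discover the zero check and prove the division below it is safe.
def safe_ratio(numerator, denominator):
    if denominator == 0:        # the branch the analyzer must find
        return None             # branch away: no division happens here
    return numerator / denominator  # provably safe on this path
```

Move the check into a caller several frames away and the analyzer must carry the constraint `denominator != 0` across each boundary, which is the path explosion the article describes.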


Avoiding Shift Left Exhaustion – Part 1

Shift left requires developers to be involved in testing, quality assurance, and collaboration throughout the development cycle. While this is undoubtedly beneficial for the final product, it can lead to an increased workload for developers who must balance their coding responsibilities with testing and problem-solving tasks. ... Adapting to Shift left practices often requires developers to acquire new skills and stay current with the latest testing methodologies and tools. This continuous learning can be intellectually stimulating and exhausting, especially in an industry that evolves rapidly. Developers must understand new tools, processes, and technologies as more things get moved earlier in the development lifecycle. ... The added pressure of early and continuous testing and the demand for faster development cycles can lead to developer burnout. When developers are overburdened, their creativity and productivity may suffer, ultimately impacting the software quality they produce. ... Shifting testing and quality assurance left in the development process may impose strict time constraints. Developers may feel pressured to meet tight deadlines, which can be stressful and lead to rushed decision-making, potentially compromising the software’s quality.


Ransomware Attacks on Critical Infrastructure Are Surging

Especially under fire are critical services. Healthcare and public health agencies dominated, filing 249 ransomware reports with IC3 last year, followed by 218 reports from critical manufacturing and 156 from government facilities. Ransomware-wielding attackers are potentially targeting these sectors most because they perceive the victims as having a proclivity to pay, given the risk to life or essential business processes posed by their systems being disrupted. Last year, IC3 received a ransomware report from at least one victim in all of the 16 critical infrastructure sectors - which include financial services, food and agriculture, energy and communications - except for two: dams, and nuclear reactors, materials and waste. The ransomware group tied to the largest number of successful attacks against critical infrastructure reported to IC3 last year was LockBit, followed by Alphv/BlackCat, Akira, Royal and Black Basta. Law enforcement recently disrupted Alphv/BlackCat, as well as LockBit, after which each group separately claimed to have rebooted before appearing to go dark.


What’s the missing piece for mainstream Web3 adoption?

Today’s Web3 lacks a unifying ecosystem, causing the market to fracture into multiple, independently evolving use cases. Crypto enthusiasts have to use various decentralized applications (DApps) and platforms to perform multiple transactions and interact with the different sectors of Web3. However, this isn’t a sustainable growth model for the Web3 industry and is more of a deterrent than a benefit when it comes to crypto adoption. ... Recognizing the need for a more integrated approach, some Web3 players are moving beyond the hype. Legion Network is emerging as a notable example among these. As a one-stop shop for Web3, Legion Network addresses the complexity of the industry and reaches new audiences. It brings together essential Web3 use cases, including a proprietary crypto wallet with comprehensive portfolio tracking, DeFi swaps and bridges, engaging play-to-earn/win games, captivating quests with prize rewards, a launchpad for emerging projects and a unique SocialFi experience that fosters community engagement.


What’s Driving Changes in Open Source Licensing?

In response to the challenges posed by cloud computing, some vendor-driven open source projects have changed their licenses or their GTM models. For example, MongoDB, Elastic, Confluent, Redis Labs and HashiCorp have adopted new licenses that restrict the use of their software-as-a-service by third parties or require them to pay fees or share their modifications. These changes are intended to protect the revenue and sustainability of the original vendors and to ensure that they can continue to invest in the open source project. However, these changes have also caused some controversy and backlash from the user community, who may feel that the project is becoming less open and more proprietary or that they are losing some of the benefits and freedoms of open source. However, community-driven open source projects have largely maintained their permissive licenses and their collaborative approach. These projects still benefit from the diversity and scale of their user community, who contribute to the development, maintenance, support and security of the software. These projects also leverage the support of organizations and foundations, such as the Linux Foundation, the Apache Software Foundation and the CNCF, who provide governance, funding and infrastructure. 


Botnets: The uninvited guests that just won’t leave

Reducing response time is vital. The longer the dwell time, the more likely it is that botnets can impact a business, particularly given that botnets can spread across many devices in a short period. How can security teams improve detection processes and shrink the time it takes to respond to malicious activity? Security practitioners should have multiple tools and strategies at their disposal to protect their organization’s networks against botnets. An obvious first step is to block access to all recognized command-and-control (C2) infrastructure. Next, leverage application control to restrict unauthorized access to your systems. Additionally, use Domain Name System (DNS) filtering to target botnets explicitly, concentrating on each category or website that might expose your system to them. DNS filtering also helps to mitigate the domain generation algorithms (DGAs) that botnets often use. Monitoring data as it enters and leaves devices is vital as well, as you can spot botnets as they attempt to infiltrate your computers or those connected to them. This is what makes security information and event management (SIEM) technology paired with malicious indicator-of-compromise (IoC) detections so critical to protecting against bots.
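As a rough illustration of the layered approach described above, the sketch below combines a C2 blocklist lookup with a character-entropy heuristic for DGA-style domain names. The blocklist entries, the length cutoff and the entropy threshold are invented for the example; production DNS filtering relies on curated threat-intelligence feeds, not a hand-rolled heuristic.

```python
import math
from collections import Counter

# Hypothetical C2 domains, standing in for a real threat-intelligence feed.
C2_BLOCKLIST = {"evil-c2.example", "botnet-cmd.example"}

def shannon_entropy(s):
    """Bits of entropy per character; DGA labels tend to score high."""
    counts = Counter(s)
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

def classify_domain(domain, entropy_threshold=3.8):
    """Verdict for an outbound DNS query: block known C2, flag likely
    DGA names by character entropy, otherwise allow."""
    name = domain.lower().rstrip(".")
    if name in C2_BLOCKLIST:
        return "block: known C2"
    label = name.split(".")[0]
    if len(label) >= 12 and shannon_entropy(label) > entropy_threshold:
        return "flag: possible DGA"
    return "allow"

for d in ["evil-c2.example", "mail.google.com", "xq7gk2zr9vbd41hw.example"]:
    print(d, "->", classify_domain(d))
```

The design mirrors the article's ordering: an exact-match denylist is cheap and catches known infrastructure, while the entropy heuristic is the fallback for the algorithmically generated names a static list can never keep up with.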


Are You Ready to Protect Your Company From Insider Threats? Probably Not

The real problem is that employees and employers don’t trust each other. This is an enormous risk for employers, as this environment makes it more likely that insider threats, security risks that originate from within the company, will emerge or intensify when tensions are high and motivations such as financial strain, dissatisfaction or desperation drive individuals to act against their own organization. That’s the bad news. The worst news is that most companies are unprepared to meet the moment. ... Insider threats often betray their motivation. Sometimes, they tell colleagues about their intentions. Other times, their actions speak louder than words, as attempts to work around security protocols, active resentment for coworkers or leadership, or general job dissatisfaction can be red flags that an insider threat is about to act. Explaining the impact of human intelligence, the U.S. Cybersecurity and Infrastructure Security Agency (CISA) writes, “An organization’s own personnel are an invaluable resource to observe behaviors of concern, as are those who are close to an individual, such as family, friends, and coworkers.”



Quote for the day:

"Leaders must be close enough to relate to others, but far enough ahead to motivate them." -- John C. Maxwell

Daily Tech Digest - June 03, 2023

Is it Possible to Calculate Technology Debt?

Perhaps we should rename it Architectural Debt or even Organisational Debt? From an Enterprise Architecture standpoint, we talk about “People, Processes, and Technology,” all of which contribute to the debt over time and form a more holistic view of the real debt. It does not matter what it is called as long as there is consistency within the organisation and it has been defined, agreed and communicated. ... The absence of master data management, quality, data lineage, and data validation all contribute to data debt. People debt is caused by having to support out-of-date assets (software and/or infrastructure), the resulting deskilling over time and the missed opportunity to reskill, all of which potentially lead to employee attrition. Processes requiring modification can become dependent on technology due to the high cost of change; the alternative is adjusting the design to accommodate poorly designed processes. While Robotic Process Automation (RPA) can provide a rapid solution in such cases, it raises the question of whether the automation simply perpetuates flawed processes without addressing the underlying issue.
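One minimal way to make such a holistic debt visible is a weighted composite index over the dimensions the excerpt names. The weights and the 0-10 severity scores below are purely illustrative assumptions, not an industry standard; the point is only that an agreed, consistent formula lets an organisation track the debt over time.

```python
# Hypothetical weights over the debt dimensions discussed above.
WEIGHTS = {"technology": 0.35, "data": 0.25, "people": 0.2, "process": 0.2}

def debt_index(scores):
    """Combine per-dimension scores (0 = no debt, 10 = severe) into one index."""
    assert set(scores) == set(WEIGHTS), "score every dimension"
    return sum(WEIGHTS[d] * scores[d] for d in WEIGHTS)

# Example assessment: heavy technology debt, moderate elsewhere.
example = {"technology": 7, "data": 5, "people": 4, "process": 6}
print(round(debt_index(example), 2))  # -> 5.7
```

Whatever formula is chosen matters less than the article's own point: define it, agree it and communicate it, so successive measurements are comparable.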


There Are Four Types of Data Observability. Which One is Right for You?

Business KPI Drifts: Since data observability tools monitor the data itself, they are often used to track business KPIs just as much as they track data quality drifts. For example, they can monitor the range of transaction amounts and notify where spikes or unusual values are detected. This autopilot system will show outliers in bad data and help increase trust in good data. Data Quality Rule Building: Data observability tools have automated pattern detection, advanced profiling, and time series capabilities and, therefore, can be used to discover and investigate quality issues in historical data to help build and shape the rules that should govern the data going forward. Observability for a Hybrid Data Ecosystem: Today, data stacks consist of data lakes, warehouses, streaming sources, structured, semi-structured, and unstructured data, API calls, and much more. ... Unlike metadata monitoring that is limited to sources with sufficient metadata and system logs – a property that streaming data or APIs don’t offer – data observability cuts through to the data itself and does not rely on these utilities.
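The transaction-amount example above can be sketched as a simple rolling z-score monitor: each new value is compared against the mean and spread of the preceding window, and spikes are flagged. The window size and threshold are illustrative; real observability tools use far richer time-series models.

```python
import statistics

def flag_spikes(values, window=5, z_thresh=3.0):
    """Flag indices whose z-score vs. the preceding window exceeds z_thresh."""
    flagged = []
    for i in range(window, len(values)):
        hist = values[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.pstdev(hist) or 1e-9  # avoid div-by-zero on flat windows
        if abs(values[i] - mu) / sigma > z_thresh:
            flagged.append(i)
    return flagged

# Transaction amounts with one anomalous spike at index 6.
amounts = [100, 102, 98, 101, 99, 100, 5000, 103]
print(flag_spikes(amounts))  # -> [6]
```

Because the monitor looks at the data values themselves rather than metadata or system logs, it works equally well on streaming sources and API payloads, which is the hybrid-ecosystem advantage the excerpt highlights.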


Why Companies Should Consider Developing A Chief Security Officer Position

The combination of the top-down and cross-functional influence of the CSO with the technical reach of the CISO should be key to creating and maintaining the momentum required to deliver change and break business resistance where it happens. In my experience, firms looking to implement this type of CSO position should start looking internally for the right executive: Ultimately the role is all about trust, and your candidate should have intimate knowledge of how to navigate the internal workings of the organization. I would recommend looking for someone who is an ambitious leader—not someone at an end-of-career position. Additionally, consider assigning this role to a seasoned executive. Someone you believe is motivated overall by the protection of the business from active threats, able to take an elevated long-term view where required, over and above the short-term fluctuations of any business. Demonstrating leadership in a field this complex should be seen as an opportunity to showcase skills that can be applied elsewhere in the organization.


Threatening botnets can be created with little code experience, Akamai finds

According to the research, the Dark Frost actor is selling the tool as a DDoS-for-hire service and as a spamming tool. “This is not the first exploit by this actor,” said West, who noted that the attacker favors Discord to openly tout their wares and brag. “He was taking orders there, and even posting screenshots of their bank account, which may or may not be legitimate.” ... The Dark Frost botnet uses code from the infamous Mirai botnet, which West said was easy to obtain and highly effective at exploiting hundreds of machines. It is therefore emblematic of how, with source code from previously successful malware strains and AI code generation, someone with minimal knowledge can launch botnets and malware. “The author of Mirai put out the source code for everyone to see, and I think that it started and encouraged the trend of other malware authors doing the same, or of security researchers publishing source code to get a bit of credibility,” said West.


Experts say stopping AI is not possible — or desirable

"These systems are not imbued with the capability to do all the things that they're now able to do. We didn’t program GPT-4 to write computer programs but it can do that, particularly when it’s combined with other capabilities like code interpreter and other programs and plugins. That’s exciting and a little daunting. We’re trying to get our hands wrapped around the risk profiles of these systems, profiles that are evolving literally on a daily basis. “That doesn't mean it's all net risk. There are net benefits as well, including in the safety space. I think [AI safety research company] Anthropic is a really interesting example of that, where they are doing some really interesting safety testing work where they are asking a model to be less biased and at a certain size they found it will literally produce output that is less biased simply by asking it. So, I think we need to look at how we can leverage some of those emerging capabilities to manage the risk of these systems themselves as well as the risk of what’s net new from these emerging capabilities.”


How IT can balance local needs and global efficiency in a multipolar world

Technical architecture solutions, such as microservices, can help companies balance the level of local solution tailoring with the need to harness scale efficiencies. While not new, these solutions are more widely accepted and can be more easily realized in modern cloud platforms. These developments are enabling leading companies to evolve their operating models by building standardized, modular, and configurable solutions that maximize business flexibility and efficiency while making data management more transparent ... However useful these localization capabilities are, they will not work as needed unless local teams have sufficient autonomy (at some companies, local teams in China, for example, clear decisions through central headquarters, which is a major roadblock for pace and innovation). The best companies provide local teams with specific decision rights within guidelines and support them by providing necessary capabilities, such as IT talent embedded with local market teams to get customer feedback early.


Constructing the innovation mandate

We need to understand that successful innovation actually touches all aspects of a business: it contributes to improving business processes, identifies new, often imaginative, ways to reduce costs, builds out existing business models into new directions and value, and discovers new ways of positioning in markets. To get to a consistent performance of innovation and creativity within organizations, you need to rely on a process, a structure and the consistent ability to foster a culture of innovation. An innovation mandate is a critical tool for defining the scope and direction of innovation and the underlying values, commitment and resources placed behind it. Normally this innovation mandate comes in the form of a document, generally built by a small team of senior leaders, innovation experts and subject matter experts. That group should possess a deep understanding of the existing organization’s strategy, business models, operations and culture, and a wider appreciation of the innovation landscape, the “fields of opportunity” and the emerging practices of innovation management.


3 Unexpected Technology Duos That Will Supercharge Your Marketing

While geofencing isn't the newest technology to enter the marketing spectrum, it is improving exponentially day by day. Geofencing creates virtual geographic boundaries around targeted areas, and when someone crosses into one of those areas, it creates a triggered response — your ads will show up while they're browsing their favorite sites or checking their email. ... Website content can be a major trust builder for your businesses and therefore can play a vital part in turning an interested prospect into a buying customer. But many a business owner has cringed at the thought of writing copy for their website ... let alone regularly updating it with blog posts or e-newsletter articles. Creating large amounts of content can be a constant challenge for business owners, and I get it. You're already busy running a business! But what I want small business owners to realize is that they have access to many tools — some of them free — that will do 95% of the writing for them.
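Under the hood, the triggered response described above reduces to a point-in-radius test: is the user's reported location inside the virtual boundary? A minimal sketch using the haversine great-circle distance follows; the storefront coordinates and 200 m radius are made up for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(user, fence_center, radius_m):
    """True when the user is inside the circular fence -> trigger the ad."""
    return haversine_m(*user, *fence_center) <= radius_m

store = (40.7580, -73.9855)  # hypothetical storefront near Times Square
print(in_geofence((40.7585, -73.9850), store, 200))  # roughly 70 m away
print(in_geofence((40.7000, -73.9000), store, 200))  # several km away
```

Real ad platforms layer polygon fences, dwell-time rules and opt-in location consent on top of this basic check, but the circular test is the core primitive.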


The Evolution of the Chief Privacy Officer

Given the natural overlap between privacy, security and the uses of data, strategic cooperation is key. “It’s about building a strategy together to develop an enterprise approach,” Jones said. “My role is to build privacy and transparency into every state system and application and business process at every stage of the life cycle.” Cotterill looks to Indiana’s IT org chart to help define the spheres of responsibility. The governor appoints the chief information officer and chief data officer, and the CISO and CPO report to each of them, respectively. “The CIO, and the CISO reporting to him, they’re focused on providing cost-effective, secure, consistent, reliable enterprise IT services and products,” he said. “For the CDO, with the CPO reporting to him … we have a threefold mission: to empower innovation, enable the use of open data, and do that all while maintaining data privacy.” IT provides “that secure foundation to do business,” while he and the CDO “are focused on the substantive use of data to drive decisions and improve outcomes,” he said.


Should Data Engineers be Domain Competent?

A traditional data engineer views a table with one million records as relational rows that must be crunched, transported and loaded to a different destination. In contrast, an application programmer approaches the same table as a set of member information or pending claims that impact life. The former is a pure-play technical view, while the latter is more human-centric. These drastically differing lenses form the genesis of the data siloes ... When we advocate domain knowledge, let’s not relegate it to a few business analysts who are tasked to translate a set of high-level requirements into user stories. Rather, domain knowledge implies that every data engineer gets a grip on the intrinsic understanding of how functionality flows and what it tries to accomplish. Of course, this is easier to preach than practice, as expecting a data team to understand thousands of tables and millions of rows is akin to expecting them to navigate a freeway at peak time in reverse gear, blindfolded. It would be disastrous. While it’s amply evident that data teams need domain knowledge, it’s hard to expect that centralized data teams will deliver efficient results.



Quote for the day:

"Leaders are visionaries with a poorly developed sense of fear and no concept of the odds against them. " -- Robert Jarvik

Daily Tech Digest - January 29, 2022

BotenaGo Botnet Code Leaked to GitHub, Impacting Millions of Devices

Researchers also found additional hacking tools, from several sources, collected in the same repository. Alien Labs called the malware source code “simple yet efficient,” able to carry out malware attacks with a grand total of a mere 2,891 lines of code (including empty lines and comments). In its November writeup, Alien Labs noted that BotenaGo, written in Google’s open-source Golang programming language, could exploit 33 vulnerabilities for initial access. The malware is light, easy to use and powerful. BotenaGo’s 2,891 lines of code are all that’s needed for a malware attack, including, but not limited to, installing a reverse shell and a telnet loader used to create a backdoor to receive commands from its command-and-control (C2) operator. Caspi explained that BotenaGo has automatic setup of its 33 exploits, presenting an attacker a “ready state” to attack a vulnerable target and infect it with an appropriate payload based on target type or operating system. The source code leaked to GitHub features a “supported” list of vendors and software used by BotenaGo to target its exploits at a slew of routers and IoT devices.


The best IT skill for the 2020s? Become an 'evergreen' learner

For starters, the "soft" skills will matter in the months and years ahead. These include professional skills such as communication, leadership, and teamwork, says Don Jones, vice president of developer skills at Pluralsight. Then there is a need for "tech-adjacent skills, like a familiarity with project management and business analysis." Jones urges an "evergreen" approach to skills mastery, as technology evolves too quickly to commit to a single platform or solution set. "The biggest-impact skill is the ability to learn," he says. "There's no single tech skill you can invest in that won't change or be outdated in a year; your single biggest skill needs to be the ability to update skills and learn new skills." This also means placing a greater emphasis on emotional intelligence, as many emerging systems will be built on artificial intelligence, analytics, or automation that mimic human processes, therefore augmenting human workers. "Anyone can be taught to swap out memory, but the skill of communication and responding to human emotion is not a skill so easily taught," says Chris Lepotakis.


Three things Web3 should fix in 2022

Web3 backers love to talk about how blockchain networks are computers that can be programmed to do anything you imagine, given superpowers by the fact that they are also decentralized. Ethereum was the first of these computers to get real traction, but it was quickly overwhelmed by traffic. Traffic is managed by charging fees to use the computer, and the fees to complete a single transaction on the Ethereum network can run over $100. Imagine spending $75 to create a “free” Facebook account and another $75 every time you wanted to post something, and you have a sense of what it would be like to participate in a social network on the blockchain today. Ethereum is in the midst of a transformation designed to make it more efficient — which is to say, faster, less expensive, and less wasteful of energy. In the meantime, technologists routinely appear announcing that they have built a more efficient blockchain. Solana, for example, is a company that raised $314 million last year to build what it calls “the fastest blockchain in the world.” With that in mind, let’s check in on how the fastest blockchain in the world was doing on Sunday, when the aforementioned crypto crash led many people to use it to buy and sell assets.


Five Data Governance Trends for Organizational Transformation in 2022

There is a growing challenge to better govern data as it increases in variety and volume, and there is an estimate that 7.5 septillion gigabytes of data is generated every single day. Moreover, in organizations, silos are getting created through multiple data lakes or data warehouses without the right guidelines, which will eventually be a challenge in managing this data growth. To achieve nimbleness, we can simplify the data landscape by using a semantic fabric, popularly called data fabric, based on a strong Metadata Management operating model. This can further make data interoperable between divisions and functions while working to a competitive advantage. Data fabric simplifies Data Management, across cloud and on-premise data sources, even though data is managed as domains. In addition, data democratization can be a strong enabler for managing data across domains with ease and making data available as well as interoperable. Allowing business users to source and consume relevant data for their instantaneous reporting or generation of insights can reduce significant turnaround time in acquiring or sourcing data traditionally.


How the metaverse could impact the world and the future of technology

The metaverse could potentially use virtual reality, or augmented reality as we know it now, to immerse users in an alternate world. The technology is still being developed, but companies like Meta say they are building and improving these devices. Meta's Oculus Quest, now in its second model, is one such device. "When you're in the metaverse, when you're in a virtual reality headset, you will feel like you're actually sitting in a room with someone else who can see you, who can see all of your nonverbal gestures, who you can respond to and mimic," Ratan said. Immersive worlds and creating online avatars is nothing new, as games like Grand Theft Auto Online, Minecraft and Roblox have already created virtual universes. Meta's announcement last October aims to go beyond entertainment, and create virtual workspaces, homes and experiences for all ages. "What's happening now is the metaverse for social media without gaming," Ratan said. "The new metaverse is designed to support any type of social interaction, whether that's hanging out with your friends or having a business meeting."


Use the Drift and Stability of Data to Build More Resilient Models

Data drift represents how a target data set is different from a source data set. For time-series data (the most common form of data powering ML models), drift is a measure of the “distance” of data at two different instances in time. The key takeaway is that drift is a singular, or point, measure of the distance between two different data distributions. While drift is a point measure, stability is a longitudinal metric. We believe resilient models should be powered by data attributes that exhibit low drift over time — such models, by definition, would exhibit less drift-induced misbehavior. In order to manifest this property, drift over time, we introduce the notion of data stability. Stable data attributes drift little over time, whereas unstable data is the opposite. We provide additional details below. Consider two different attributes: the daily temperature distribution in NYC in November (TEMPNovNYC) and the distribution of the tare weights of aircraft at public airports (AIRKG). It is easy to see that TEMPNovNYC has lower drift than AIRKG; one would expect less variation between November temperatures in NYC across various years than between the weights of aircraft at two airports.
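One common way to make the drift-versus-stability distinction concrete is the Population Stability Index (PSI): a point measure of drift between two binned snapshots, which can then be averaged over consecutive snapshots to yield a longitudinal stability score. The excerpt does not name a specific distance metric, so PSI is an assumption here, and the binned counts below are invented stand-ins for TEMPNovNYC and AIRKG.

```python
import math

def psi(expected_counts, actual_counts):
    """Population Stability Index between two binned distributions — a point
    measure of drift (rule of thumb: < 0.1 stable, > 0.25 significant drift)."""
    e_tot, a_tot = sum(expected_counts), sum(actual_counts)
    total = 0.0
    for e, a in zip(expected_counts, actual_counts):
        ep = max(e / e_tot, 1e-6)  # smooth empty bins to avoid log(0)
        ap = max(a / a_tot, 1e-6)
        total += (ap - ep) * math.log(ap / ep)
    return total

def stability(snapshots):
    """Longitudinal stability: mean drift between consecutive snapshots
    (lower = more stable, per the drift-vs-stability distinction above)."""
    drifts = [psi(snapshots[i], snapshots[i + 1]) for i in range(len(snapshots) - 1)]
    return sum(drifts) / len(drifts)

# Invented histograms: November NYC temperatures barely move year to year,
# while aircraft tare weights differ sharply between airports.
nyc_temp = [[5, 20, 50, 20, 5], [6, 19, 49, 21, 5], [5, 21, 48, 20, 6]]
aircraft = [[40, 30, 20, 8, 2], [10, 15, 30, 30, 15], [5, 10, 20, 35, 30]]
print(stability(nyc_temp) < stability(aircraft))  # -> True
```

This mirrors the article's structure exactly: `psi` is the point measure between two instants, and `stability` is the longitudinal metric built on top of it.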


How to become an AI influencer

An influencer has huge responsibilities to fulfill. As someone with a big following, it is important for them to understand the kind of impact they can have on their target audience, especially if that audience is young or just starting out in their career. Venkat Raman, co-founder of Aryma Labs, a data consulting firm, lists a few things influencers should keep in mind while creating their content. Don’t give false hopes: An influencer should not give people false hopes. He adds, “I see many posts and tweets where some influencers proclaim that one does not need to know advanced math to break into data science. The poor aspirants believe it, and when they face the tough curriculum, they give up. I think we need to be honest. This will help set the correct expectations.” ... Many influencers in the field teach statistics through their content. Statistics is one of the core foundations of data science. Raman adds, “I have seen even the most popular YouTubers teach statistics wrongly.” The foundation can’t be left shaky. The influencers owe it to their audience to teach the right stuff. Unfortunately, in the chase for ‘number of followers’ and the pressure to constantly create content, they end up creating substandard content.


‘Dark Herring’ Billing Malware Swims onto 105M Android Devices

On the technical side, once the Android application is installed and launched, a first-stage URL is loaded into a webview, which is hosted on Cloudfront, researchers said. The malware then sends an initial GET request to that URL, which sends back a response containing links to JavaScript files hosted on Amazon Web Services cloud instances. The application then fetches these resources, which it needs to proceed with the infection process — and specifically, to enable geo-targeting. “One of the JavaScript files instructs the application to get a unique identifier for the device by making a POST request to the “live/keylookup” API endpoint and then constructing a final-stage URL,” according to the analysis. “The baseurl variable is used to make a POST request that contains unique identifiers created by the application, to identify the device and the language and country details.” The response from that final-stage URL contains the configuration that the application will use to dictate its behavior, based on the victim’s details. Based on this configuration, a mobile webpage is displayed to the victim, asking them to submit their phone number to activate the app (and trigger the direct carrier billing, or DCB, charges).


4 ways to mature your digital automation strategy

Immature strategies focus on simple tasks. It’s a great place to start, but to get the most out of automation, it needs to grow. To evolve these task-based automations into automated workflows, applications and systems need to communicate with each other. Steadily adding connected systems provides the opportunity to build increasingly complex, end-to-end workflows. As more processes are connected, you will need a platform to manage the increasing complexity. Fortunately, vendors in different segments of enterprise IT are converging with offerings of business process automation (BPA) suites that include integration libraries and automation and workflow capabilities. This trend provides support for organizations building out their strategies and validates the importance of automation paired with connectivity. RPA bots are very popular because they are powerful and easy to use. This is both a blessing and a curse because RPA is often used when it shouldn’t be, leading to poorly designed processes. 


Integrating IoT in Your Business

If you look at the LoRaWAN ecosystem as a whole, we now have a few hundred hardware partners that have created off-the-shelf products. So the first thing we say is: okay, just don’t start by building your own hardware; look at what’s there. And of course, we have experience with a lot of these devices and we’ve highlighted them. And of course, we also know as a company which ones are of higher quality and which are of lesser quality. But this abundance of availability makes sure that you can choose, and also makes sure there’s a market. Second, if you want to move into, let’s say, custom hardware development, because the sensor is not out there, or because you want to build up IP, or, I mean, you can think of many reasons. What you now see is that in the LoRaWAN ecosystem there are a lot of libraries, a lot of tools, a lot of modules, and that also makes it easier to build your own hardware. So we’ve started an open-code initiative called the Generic Node, where we offer the ecosystem an example of what we feel the perfect LoRaWAN device should be, and you can use it for inspiration, or we can help you further.



Quote for the day:

"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks

Daily Tech Digest - December 09, 2021

How should we regulate DeFi?

There is opportunity for the appropriate level of regulation to give DeFi enough breathing space to make a difference: boosting transparency, increasing financial inclusion and enabling credit for 8 billion people, a shift that would see the world take a tremendous jump toward prosperity. Yet there is also potential for overreach that would stifle innovation and growth and have unintended consequences. Unfortunately, we seem to be well down this path already. What is needed is the realization that DeFi shares many of the same goals as financial regulators: overhauling inflexible processes and delivering wider access, cheaper prices and more stability — all while ensuring these benefits are widely shared with all participants in the market. ... DeFi has the potential to create fairer, more transparent and more liquid markets through completely new mechanisms, helping everyone to reduce fraud and front-running, resolving fragmentation and creating markets that are efficient, resilient, fair and equally accessible to all — not just participants that have the right connections.


How to make agile actually work for analytics

The most striking difference between what we do and what software developers do is in our end products. In software, the goal is to get to a product that the end-user loves. In data, our goal is to help people make a decision they trust, and the journey the user takes to get there can be just as important as the end result. Most commonly we see this manifested in how we tell stories with our data. We use notebooks to capture context and process, and presentations to guide users to an understanding. It’s in this process that we establish trust, turn charts into insights, and make our data valuable. This is also the driver behind one of the greatest pains of our work: follow-up questions and ad-hoc requests. These questions and requests come from a place of curiosity and represent a desire to have that same intimate understanding of data that we get in crafted data stories. And yet, in practice, we try to eliminate these questions with processes that front-load requirements gathering and tools that have made no room for this way of working.


Cloudentity SaaS platform enables zero trust access control for APIs

Deployable in minutes, Cloudentity empowers businesses to deliver Open Banking, Embedded Finance and other innovative online services without changing identity providers or application code. Cloudentity delivers a declarative identity and authorization framework that works across any cloud to simplify access control and data governance. From Open Banking to eCommerce fraud prevention, Cloudentity makes it easier to deliver cloud-native applications and safer to extend your data to the customers and partners that matter most. A standout capability of the new SaaS platform is its drag-and-drop Data Lineage feature, which provides a simple and intuitive way of mapping identity and user context data to an application. For developers, Data Lineage solves the complexities of Single Sign-On (SSO) and provides real-time control over who can access each element of your API data. ITOps, DevOps and SecurityOps teams can rapidly validate controls and pinpoint areas that need to be updated or fixed to prevent API data leakage and meet personal data protection obligations.


SaaS DR/BC: If You Think Cloud Data is Forever, Think Again.

Humans and technology have always had co-dependent challenges. Let’s face it, it’s one of the main reasons my career exists! So it stands to reason that human interference, whether deliberate or not, is a common reason for losing information. This can be as innocuous as uploading a CSV file that corrupts data sets, accidentally deleting product listings, or overwriting code repositories with a forced push. There’s also intentional human interference. This means someone with authorized access nuking a bunch of stuff. It may sound far-fetched, but we have seen terminated employees or third-party contractors cause major issues. It’s not very common, but it happens. Cyberthreats are next on the list, which are all issues that most technical operations teams are used to. Most of my peers are aware that the level of attacks increased during the global pandemic, but the rate of attacks had already been increasing prior to COVID-19. Ransomware, phishing, DDoS, and more are all being used to target and disrupt business operations. If this happens, data can be compromised or completely wiped out.


Starting an SRE Team? Stay Away From Uptime.

Why shouldn't you be too concerned about your uptime metrics? In reality, SRE can mean different things to different teams, but at its core, it’s about making sure your service is reliable. After all, it’s right there in the name. Because of this, many people assume that uptime is the most valuable metric for SRE teams. That is flawed logic. For instance, an app can be “up,” but if it’s incredibly slow or its users don’t find it to be practically useful, then the app might as well be down. Simply keeping the lights on isn’t good enough, and uptime alone doesn’t take into account things like degradation or if your site’s pages aren’t loading. It may sound counterintuitive, but SRE teams are in the customer service business. Customer happiness is the most important metric to pay attention to. If your service is running well and your customers are happy, then your SRE team is doing a good job. If your service is up and your customers aren’t happy, then your SRE team needs to reevaluate. A more holistic approach is to view your service in terms of health.
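The gap between raw uptime and customer-perceived health is easy to make concrete. The sketch below is illustrative only: the request log, the 300 ms latency threshold, and the function names are all invented, not taken from any real SRE toolchain. It computes a naive uptime figure alongside a customer-centric SLI that also counts slow responses as bad:

```python
# Hedged sketch: a service can look "up" while failing its customers.
# All data and thresholds here are hypothetical.

LATENCY_SLO_MS = 300  # assumed threshold for a "fast enough" response

# (served_ok, latency_ms) for a handful of illustrative requests
requests = [
    (True, 120), (True, 250), (True, 900),
    (True, 1500), (False, 0), (True, 180),
]

def uptime_sli(log):
    """Fraction of requests that got any response at all."""
    return sum(ok for ok, _ in log) / len(log)

def customer_sli(log, slo_ms=LATENCY_SLO_MS):
    """Fraction of requests that were served AND fast enough."""
    good = sum(1 for ok, ms in log if ok and ms <= slo_ms)
    return good / len(log)

print(f"uptime:   {uptime_sli(requests):.0%}")    # looks healthy
print(f"customer: {customer_sli(requests):.0%}")  # tells a different story
```

With this toy log, only one request outright fails, so uptime looks strong, yet half the requests were too slow to count as a good customer experience.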
 

An opportunity is coming to drive up the number of women in tech

Another key element is creating the right culture and environment for diversity to thrive. In a gender context, an important aspect here is male allyship. Men have a real role to play in supporting the ‘levelling up’ agenda. They need to see that increasing gender diversity and equity is not just an issue for women themselves – it’s for everyone. They can become active allies through their own behaviours and actions. This extends right up to board level and executive leadership. We need to continue to work to influence leader behaviour and build their understanding of people’s different styles. Instances of men talking over women in the boardroom or not listening to ideas are still all too common. Reporting is also critical. You can’t change what you don’t measure. Collating diversity statistics and reporting them to the board and more widely around the business is an essential part of raising awareness and stimulating action. Transparent reporting was in fact seen as the most effective lever for improving diversity and inclusion in this year’s survey.


Is the “great resignation” coming for you?

When employees feel their personal ambitions are too difficult to achieve, they start to think about leaving. Those ambitions might involve having a family while maintaining a career, gaining a range of professional experiences, or even accumulating personal experiences such as travel. People will ask: “I don’t mind making sacrifices, but are the trade-offs producing the benefits I expected?” When that question surfaces, employees are already halfway out the door. For example, young men and women who are working extremely hard and don’t have time for friends, exercise, or adventures may start to doubt that the company is the right place for them—even if the pay is fabulous. ... Managers often show great care about performance and little concern about the whole person who is delivering the results. Feeling uncared for is deadly for motivation and destructive to performance over the long run. Many managers rarely ask about other aspects of their team members’ lives, their personal interests, or their ambitions. Too few managers show genuine understanding and appreciation for what it took to deliver such great results.


DevSecOps jobs: 3 ways to get hired

Automation is a major part of DevSecOps, and this requires the use of multiple software applications and tools. For example, companies use a variety of different application security testing (AST) tools, which are essential to ensure that the code being used in development is safe and to prevent malicious packages from being introduced. These tools can be static (SAST), dynamic (DAST), or interactive (IAST), and they can come from different vendors. Some may include automated vulnerability detection, prioritization, and even remediation capabilities that can address issues without requiring IT staff to spend much time researching vulnerabilities. The lesson: Many different tools are used in DevSecOps, and these will likely change as new innovations are introduced. Stay informed and updated on industry trends, especially if you are early in your journey, because the tools and needs of today might be very different in a few years’ time. The idea behind shifting left and DevSecOps is to break down the traditional separation between developers, security, and IT professionals.


Google TAG Disrupts Blockchain-Enabled Botnet

Google is skeptical about the complete disruption of Glupteba's operations. It says: "The operators of Glupteba are likely to attempt to regain control of the botnet using a backup command and control mechanism that uses data encoded on the Bitcoin blockchain." The botnet also has a feature that allows it to evade traditional takedowns. TAG says that a conventional botnet-infected device looks for predetermined domain addresses that point to the C2 server. The instructions to locate these domains are hard-coded in the malware installed on the victim's device. If the predetermined domains are taken down by law enforcement agencies or others, the infected devices can no longer receive instructions from the C2 servers and therefore can no longer be operated by the bot controller. The Glupteba botnet, however, does not rely solely on predetermined domains to ensure its survival, the TAG researchers say. When the botnet’s C2 server is interrupted, Glupteba malware is hard-coded to search the public Bitcoin blockchain for transactions involving three specific Bitcoin addresses that are controlled by the Glupteba botnet operators.
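The blockchain fallback described above can be sketched in a few lines. Everything in this sketch is a made-up stand-in for illustration: the watched addresses, the toy XOR "cipher," and the simplified transaction format are assumptions, and Glupteba's real encoding, encryption scheme, and Bitcoin addresses are not reproduced here. The idea is only that a defender (or the malware itself) scans transactions from hard-coded addresses for an embedded payload and recovers a fallback C2 domain from it:

```python
# Hedged sketch of blockchain-encoded backup C2 discovery.
# Addresses, cipher, and transaction format are all hypothetical.

from typing import Optional

# Placeholder stand-ins for the hard-coded addresses the malware watches
HARDCODED_ADDRESSES = {"bc1-example-address-1", "bc1-example-address-2"}

def xor_decrypt(payload: bytes, key: bytes) -> bytes:
    """Toy XOR cipher standing in for the real (unknown here) scheme."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

def find_backup_c2(transactions, key: bytes) -> Optional[str]:
    """Return the first decodable domain found in watched transactions.

    `transactions` is a list of dicts like
    {"from": <address>, "op_return": <bytes or None>},
    a simplified stand-in for data fetched from a block explorer.
    """
    for tx in transactions:
        if tx["from"] in HARDCODED_ADDRESSES and tx.get("op_return"):
            domain = xor_decrypt(tx["op_return"], key).decode("ascii", "ignore")
            if "." in domain:  # crude sanity check that this looks like a domain
                return domain
    return None
```

Because the update channel lives on a public, append-only ledger, taking down DNS domains alone cannot sever it, which is the resilience property TAG describes.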


You’re Doing it Wrong: It’s Not About Data and Applications – It’s About Processes

We often model processes to document them, to validate them with stakeholders, to teach them to others – and most of all, to improve them. In far too many companies, what they do and why they do it is implicit, not communicated well, and invites plenty of competing points of view as to what it really is. You need to tackle the process first before you attempt to automate any of its tasks. Not doing so would be like digging holes with a crane instead of a shovel, but without thinking about whether the holes are being dug in the right places (or should be dug at all). It’s not enough to think about saving time and money. Automating a process (not just its activity) documents it, makes it teachable and scalable, and goes a long way to reducing or eliminating mistakes (high-profile errors can be a major catalyst for process automation). It also makes a process easily audited and monitored. And it’s a lot easier to figure out how to improve a process you can see. And improvement is a must; if there’s one thing to expect when it comes to process automation, it’s change.



Quote for the day:

"Great Groups need to know that the person at the top will fight like a tiger for them." -- Warren G. Bennis