Daily Tech Digest - January 04, 2023

AI is coming to the network

The dynamics of infusing AI into a network organization will, as with many other forms of automation, center on four modes of interaction: offloading, reskilling, deskilling, and displacing. AI offloading means putting AI tools at the command of trained and experienced networking professionals to help them do their work. The idea is to make network pros more effective by allowing them to offload tasks that are repetitive, complex, time-sensitive, or demanding of extremely high levels of focused attention, but that are not creative. This should free these scarce and precious resources to do other, higher-level work instead, while paying minimal, supervisory attention to what the AI is doing. (Human attention is the most precious resource in any IT shop.) The network team doesn’t shrink, and its portfolio of services can even grow without the team also having to grow to make that possible. Reskilling allows network staff to be trained to move into other parts of IT or into entirely different kinds of jobs. It also encompasses the idea of using AI to help train new network staff up to proficiency.


Distributed SQL: An Alternative to Database Sharding

Distributed SQL is the new way to scale relational databases with a sharding-like strategy that's fully automated and transparent to applications. Distributed SQL databases are designed from the ground up to scale almost linearly. ... In simple terms, a distributed SQL database is a relational database with transparent sharding that looks like a single logical database to applications. Distributed SQL databases are built on a shared-nothing architecture, with a storage engine that scales both reads and writes while maintaining true ACID compliance and high availability. Distributed SQL databases have the scalability features of NoSQL databases—which gained popularity in the 2000s—but don’t sacrifice consistency. They keep the benefits of relational databases and add cloud compatibility with multi-region resilience. A different but related term is NewSQL (coined by Matthew Aslett in 2011). This term also describes scalable and performant relational databases. However, NewSQL databases don’t necessarily include horizontal scalability.
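To make “transparent sharding” concrete, here is a minimal Python sketch of the hash-based routing a distributed SQL engine performs behind the single logical database it presents to applications. The shard count and key format are invented for illustration; a real engine also handles replication, rebalancing, and distributed transactions.

```python
# Minimal sketch of the hash-based routing a distributed SQL engine
# automates behind a single logical database. Shard count and key
# format are hypothetical.
import hashlib

NUM_SHARDS = 4
shards = {i: {} for i in range(NUM_SHARDS)}  # stand-ins for storage nodes

def shard_for(key: str) -> int:
    """Deterministically map a row key to a shard."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

def put(key: str, row: dict) -> None:
    # The application just "writes a row"; placement is transparent.
    shards[shard_for(key)][key] = row

def get(key: str):
    return shards[shard_for(key)].get(key)

put("user:42", {"name": "Ada"})
print(get("user:42"), "lives on shard", shard_for("user:42"))
```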


How layoffs can affect diversity in tech—and what to do about it

Although layoffs have dominated the conversation during the latter part of the year, evidence shows that the Great Resignation isn’t over yet. Online job site Hired found that attracting, hiring, and retaining top talent has proven difficult, citing employee burnout as a key challenge and placing the blame on rapid changes in the employment environment and angst over mass layoffs and hiring freezes. For companies yet to announce job cuts, Laman said that before any decision is made, organizations need to be sure they factor DE&I into decisions around layoffs. ... However, Williams argued that there's a lot of evidence to suggest that we pattern match when we try to spot potential. One of the really big risks from all these layoffs, then, is that if just one type of person is disproportionately represented at the leadership level making the decisions about who stays and who goes, those leaders will fail to understand or recognize the potential of people who look, or are, very different from them. Carver agrees, noting that being a good manager and being a good technologist are not one and the same, meaning people are often promoted despite lacking some necessary management skills.


How Global Turmoil and Inflation Will Impact Cybersecurity and Data Management in 2023

Rising geopolitical tensions between China, Russia, and NATO allies are responsible for increased cybersecurity threats. This will lead to companies tightening security measures in 2023. With healthcare, financial, defense, and public utility sectors facing new threats from politically motivated bad actors, the organizations with cloud-based IT operations should consider employing “data geofencing” through contractual agreements with their cloud providers -- many of which store data in global data centers -- to ensure data is kept within designated regions due to national security concerns and local legal requirements. Organizations in highly regulated industries must be on high alert to protect data and websites against DDoS attacks and phishing expeditions. Data management and cybersecurity professionals should work together to devise and execute new strategies that “meet the moment” and mitigate the potential for critical customer and corporate data eventually winding up on the Dark Web. One way data teams can support company security policies is by “flipping the script” on data asset management. 


Cyberattackers Torch Python Machine Learning Project

In the latest attack on PyTorch, the attacker registered the name of a software package that PyTorch developers would load from the project's private repository, and because the malicious package existed in the public PyPI repository, it gained precedence. The PyTorch Foundation removed the dependency in its nightly builds and replaced the PyPI project with a benign package, the advisory stated. ... Fortunately, because the torchtriton dependency was only imported into the nightly builds of the program, the impact of the attack did not propagate to typical users, Paul Ducklin, a principal research scientist at cybersecurity firm Sophos, said in a blog post. "We're guessing that the majority of PyTorch users won't have been affected by this, either because they don't use nightly builds, or weren't working over the vacation period, or both," he wrote. "But if you are a PyTorch enthusiast who does tinker with nightly builds, and if you've been working over the holidays, then even if you can't find any clear evidence that you were compromised, you might nevertheless want to consider generating new SSH key pairs as a precaution, and updating the public keys that you've uploaded to the various servers that you access via SSH."
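The root weakness here is a resolver preferring a public index over a private one for the same package name. A general mitigation, sketched below, is hash pinning: record the digest of the vetted artifact and refuse anything that does not match, no matter which index served it. The wheel path and pinned digest are hypothetical; pip supports the same idea natively through --require-hashes in requirements files.

```python
# Sketch of hash-pinning as a dependency-confusion mitigation: an
# artifact fetched from *any* index must match a digest recorded when
# the legitimate package was vetted. Path and digest are hypothetical.
import hashlib
import sys

PINNED_SHA256 = "0f3b...replace-with-the-vetted-artifact-digest"

def verify(wheel_path: str, pinned: str) -> None:
    h = hashlib.sha256()
    with open(wheel_path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    if h.hexdigest() != pinned:
        sys.exit(f"refusing to install {wheel_path}: hash mismatch")

verify("downloads/somepkg-1.0-py3-none-any.whl", PINNED_SHA256)
```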


Why it might be time to consider using FIDO-based authentication devices

Every business needs a secure way to collect, manage, and authenticate passwords. Unfortunately, no method is foolproof. Storing passwords in the browser and sending one-time access codes by SMS or authenticator apps can be bypassed by phishing. Password management products are more secure, but they have vulnerabilities, as shown by the recent LastPass breach that exposed an encrypted backup of a database of saved passwords. For organizations with high security requirements, that leaves hardware-based login options such as FIDO devices. The FIDO (Fast Identity Online) standard is maintained by the FIDO Alliance and aims to reduce reliance on passwords for security. It does so by complementing or replacing them with strong authentication based on public-key cryptography. FIDO includes specs that take advantage of biometric and other hardware-based security measures, either from specialized hardware security gadgets or the biometric features built into most new smartphones and some PCs. That makes FIDO and other physical key or token methods more phishing-resistant and harder for attackers to bypass.
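The public-key idea at the heart of FIDO can be shown in a few lines: the server stores only a public key and verifies a signature over a fresh random challenge, so there is no reusable secret for a phisher to capture. This toy Python sketch uses the cryptography package's Ed25519 primitives and illustrates only the cryptographic core, not the actual WebAuthn/CTAP protocol.

```python
# Toy challenge-response in the spirit of FIDO: the server verifies a
# signature over a fresh challenge instead of checking a shared secret.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Enrollment: the authenticator creates a key pair; the server stores
# only the public key.
device_key = Ed25519PrivateKey.generate()
server_stored_public_key = device_key.public_key()

# Login: the server issues a random challenge, the device signs it.
challenge = os.urandom(32)
signature = device_key.sign(challenge)

try:
    server_stored_public_key.verify(signature, challenge)
    print("authenticated")
except InvalidSignature:
    print("rejected")
```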


Why organizations tend to fall short on secure data management

Developing a more comprehensive structure for data classification by determining a piece of data’s value, its risk profile, or its level of sensitivity can improve understanding of the data retention period, informing data policy that mitigates risk and reduces the attack surface for a potential breach. That means determining from the outset that data needs to be sanitized after a set time and through a set policy, rather than waiting until the asset it sits on is disposed of. Equally, by thinking about the information lifecycle from the get-go, enterprises can make quick decisions on whether they should even have that data; if not, they should erase it immediately, with a certificate proving that the erasure has been successful. If data has only been held as part of a project, then when that project finishes the team should remove it from the infrastructure under that organization’s command. Classifying data appropriately can provide actionable insight to restructure policies and help employees better understand the information lifecycle management process.
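As a rough illustration, a classification-driven retention check can be as simple as the Python sketch below. The three-tier scheme and retention windows are hypothetical; in practice they would come from a governance catalog rather than hard-coded constants.

```python
# Minimal sketch of classification-driven retention with a hypothetical
# three-tier scheme.
from datetime import datetime, timedelta, timezone

RETENTION = {
    "public": timedelta(days=3650),
    "internal": timedelta(days=730),
    "sensitive": timedelta(days=90),  # sanitize soonest; highest risk
}

def is_expired(classification: str, created_at: datetime) -> bool:
    """True when the record has outlived its retention window."""
    return datetime.now(timezone.utc) - created_at > RETENTION[classification]

record_created = datetime(2022, 1, 1, tzinfo=timezone.utc)
if is_expired("sensitive", record_created):
    print("erase the record and log a certificate of erasure")
```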


7 downsides of open source culture

The word community gets thrown around a lot in open source circles, but that doesn’t mean open source culture is some sort of Shangri-La. Open source developers can be an edgy group: brusque, distracted, opinionated, and even downright mean. It is also well known that open source has a diversity problem, and certain prominent figures have been accused of racism and sexism. Structural inequality may be less visible when individuals contribute to open source projects with relative anonymity, communicating only through emails or bulletin boards. But sometimes that anonymity begets feelings of disconnection, which can make the collaborative process less enjoyable, and less inclusive, than it's cracked up to be. Many enterprise companies release open source versions of their product as a “community edition.” It's a great marketing tool and also a good way to collect ideas and sometimes code for improving the product. Building a real community around that project, though, takes time and resources. If a user and potential contributor posts a question to an online community bulletin board, they expect an answer. 


Is Silicon Valley's Unique Aura Fading Away?

The Silicon Valley mindset is about using technology to push for what’s possible, not what’s probable, says Shannon Goggin, co-founder and CEO of San Francisco-based benefits data platform provider Noyo. “It’s about taking big swings, building the future, and creating breakthroughs.” Yet after decades of tech dominance, doubts about Silicon Valley's long-term industry supremacy are beginning to appear. ... With the rise of remote work, the value that Silicon Valley employees once placed on a vibrant office life of trendy workspaces, elaborate on-site meals, and transportation has faded, Jain observes. “People are now looking for solid employers who offer opportunities for collaboration and the ability to make a difference,” he says. “Silicon Valley tech firms are starting to take notice.” Thanks to emerging distributed company models, the Silicon Valley mindset will continue spreading to other areas, Goggin says. “The startup ecosystem is incredibly supportive, and I will be proud to see the next generation of companies create even more opportunity for people who haven’t historically had the chance to participate in the startup and tech ecosystem,” she says.


9 steps to protecting backup servers from ransomware

The backup server should not be connected to lightweight directory access protocol (LDAP) or any other centralized authentication system. These are often compromised by ransomware and can easily be used to gain usernames and passwords to the backup server itself or to its backup application. Many security professionals believe that no administrator accounts should be put in LDAP, so a separate password-management system may already be in place. A commercial password manager that allows sharing of passwords only among people who require access could fit the bill. MFA can increase the security of backup servers, but use a method other than SMS or email, both of which are frequently targeted and circumvented. Consider a third-party authentication application such as Google Authenticator or Authy, or one of the many commercial products. Backup systems should be configured so that almost no one has to log in directly to an administrator or root account. For example, if a user account is set up on Windows as an administrator account, that user should not have to log into it in order to administer the backup system.
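For the app-based MFA option, here is a small Python sketch of time-based one-time passwords (the mechanism behind Google Authenticator and Authy), using the third-party pyotp library. The secret shown is illustrative; real secrets are provisioned per user and stored securely.

```python
# Sketch of app-based TOTP verification using the pyotp library.
import pyotp

secret = pyotp.random_base32()  # shared once with the user's app
totp = pyotp.TOTP(secret)

# URI a user would scan into their authenticator app; names are invented.
print("provisioning URI:", totp.provisioning_uri(name="backup-admin",
                                                 issuer_name="backup-server"))

code = totp.now()  # what the app would display right now
print("code accepted:", totp.verify(code))
```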



Quote for the day:

"A tough hide with a tender heart is a goal that all leaders must have." -- Wayde Goodall

Daily Tech Digest - January 03, 2023

Security Top IT Investment Priority in 2023

Dennis Monner, chief commercial officer at Aryaka, says he thinks what IT leaders are finding is that the talent they really need on their teams is in short supply. “The boundaries between the traditional, functional disciplines are getting fuzzy, requiring a new breed of security professional,” he explains. “The cloud team needs to understand the network. The network team needs to understand security. It’s driving them to rethink their investment and hiring strategy.” He adds that recruiting, training, and retention all take real dollars from the budget that could potentially be deployed in services that guarantee performance. “You can only outsource security to a certain degree,” Haff cautions. “Even if you're 100% in a public cloud, you're still largely responsible for your own application security, as well as your internal access and authentication procedures.” A cloud provider can implement all manner of security tech and processes, but if you don't control who has access, those won't do much good. “It was somewhat disappointing that, although our survey generally showed investments in people was a high priority, ‘hiring security or compliance staff’ was one of the lowest security funding priorities,” he adds.


How biometric payments are tackling financial exclusion

Even the most reluctant individuals are likely to have succumbed to contactless payments and some form of digitised banking in recent times. This will have the positive impact of making the needed transition to biometrics more seamless. Using fingerprints or facial recognition to unlock phones or access apps is not unusual. If anything, they have been convenient and comforting additions to the surge of tech innovations over the last couple of decades. There is a relief in knowing that these portals are being secured by methods that are almost impossible to replicate. It is a breakthrough that financial players and governments in the world’s most developed countries still need to catch up with, as emerging economies have already capitalised on biometrics’ capabilities for almost a decade now. In India, for example, internal fraud and leakage from pension payments dropped by 47% after transitioning from cash to biometric smart cards. Because the solution bypasses the need for prior credit ratings or credentials, the country has also been able to catalyse safe online banking among previously unbanked adults since biometrics’ introduction in 2014.


Decentralised finance – a threat for traditional FS firms, or an opportunity?

Done right, DeFi offers traditional banks and financial services firms the ability to reduce costs, increase speed and attract new customers who are looking for simplified, more attractive, and secure solutions. When we look at the current payments ecosystem, we’re confronted with a maze of payments services, systems and rules which rely on a cacophony of different players. DeFi offers a solution to this inherent friction, delivering ecosystems that can run autonomously based on rules and verify transactions without human intervention. The main attractions of this innovation are two-fold. Firstly, it reduces inefficiency while eliminating fees, manual effort (e.g. for corporate actions) and intermediaries. Basic transactions can be executed at any time, from any place, with the only requirements being an internet connection and a compliant wallet. By removing the middleman in asset rights transfers, lowering exchange fees, and giving access to wider global markets, moving securities on blockchain could save between $17B and $24B in global trade processing costs.


Engineering Best Practices of CI Pipelines

The essence of a CI/CD system is to aim for green builds and to resolve issues quickly when a red build occurs, meaning a test has failed. When the automated tests run, any failure results should be visible to all team members. Then, it should be a top priority for the team to make the build work again. Green builds and rapid fixes are critical for two reasons. First, when tests are failing, it is not possible to test forthcoming development and changes accurately. Second, continuous deployment will be halted because no new, validated packages exist. Although it may seem frustrating to stop active development and instead focus on fixing failed tests, this mindset will ensure optimal application stability. An efficient CI/CD system should be the only path that leads to the production environment. In other words, if you have confidently built a CI/CD system with a comprehensive set of tests, there should be no other way to deploy applications to the production system. It can be highly tempting — and common — to maintain administrator privileges and deploy an application to the production systems “just this once.”
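A bare-bones version of the “only path to production” gate might look like the Python sketch below: deployment runs only after a green test run, with no manual bypass. The deploy command is a placeholder, and a real pipeline would live in a CI system rather than a script.

```python
# Sketch of a deploy gate: production is reachable only through a green
# test run. The deploy command is a hypothetical placeholder.
import subprocess
import sys

def run_tests() -> bool:
    result = subprocess.run([sys.executable, "-m", "pytest", "-q"])
    return result.returncode == 0

if not run_tests():
    # Red build: stop the line; fixing the tests is now the top priority.
    sys.exit("build is red - deployment blocked")

subprocess.run(["./deploy.sh", "production"], check=True)  # placeholder step
```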


Blue-Green Deployment From the Trenches

The concept of blue-green deployment is to have (at least) two instances of an application running at one time. When a new version is released, it can be released to just one (or some) instances, leaving the others running on the old version. Access to this new version can be restricted completely at first, then potentially released to a subset of consumers, until confidence in the new release is achieved. At this point, access to the instance(s) running the old version can be gradually restricted and then these too can be upgraded. This creates a release with zero downtime for users. There are, of course, caveats. Any breaking change to data sources or APIs means that old requests cannot be processed by the new version, which rules out a blue-green release. It’s one of my favourite interview questions to ask how one might approach a breaking change in a blue-green environment on the off-chance that someone comes up with a great solution, but it would probably involve some bespoke routing layer to enrich or adapt "old" requests to the "new" system. At which point, you’d have to consider whether it isn’t better just to have some good old downtime. 
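A toy version of the traffic shift can clarify the mechanics: a weighted router sends a small, growing share of requests to the new (green) version while the old (blue) version keeps serving the rest. The backend URLs are hypothetical, and a real setup would do this at the load balancer or service mesh, not in application code.

```python
# Toy weighted router for a blue-green rollout. Backend URLs are invented.
import random

BLUE = "http://blue.internal:8080"    # old version
GREEN = "http://green.internal:8080"  # new version
green_weight = 0.10                   # start by exposing 10% of traffic

def pick_backend() -> str:
    return GREEN if random.random() < green_weight else BLUE

# As confidence grows, ramp green_weight toward 1.0, then retire blue.
share = sum(pick_backend() == GREEN for _ in range(10_000)) / 10_000
print(f"green received ~{share:.0%} of traffic")
```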


Top 10 AI Trends that Will Redefine Technology in the Year 2023

The role of AI and data science in innovation and automation will increase in 2023. Data ecosystems are able to scale, decrease waste, and provide timely data from a variety of inputs. But laying the foundation for change and fostering innovation is crucial. With the use of AI, software development processes can be optimised, and further advantages include greater collaboration and a larger body of knowledge. We need to foster a data-driven culture and go past the experimental stages in order to change to a sustainable delivery model. This will undoubtedly be a significant advancement in AI. ... Over the past few years, IT systems have become more sophisticated. Vendors will seek platform solutions that offer visibility across numerous monitoring domains, including application, infrastructure, and networking, according to a new Forrester prediction. ... The automatic modification of neural net topologies and improved tools for data labelling are two promising areas of automated machine learning. When the selection and improvement of a neural network model are automated, the cost and time to market of new artificial intelligence (AI) solutions will be reduced.


How the EU plans to take on big tech in 2023

Increasing competition could leave gaps for European challengers to enter. The EU, however, has historically struggled to turn its world-leading research into big tech companies. One barrier is the notoriously slow and inefficient transfer of IP from academia to the economy. This problem is illustrated by the EU producing more research papers than the US, but turning far fewer into commercial applications. According to Luigi Congedo, a venture capitalist and Innovation Advisor at marketing firm Clarity, this weakness can be reduced by changing the EU’s investment framework. This, he argues, could stimulate a more effective technology transfer — and prevent promising startups from being acquired by Silicon Valley giants. “We need to create our Google, Facebook, and Microsoft, and, in order to do it, create a better environment to compete and do business across the continent,” he said. “If we fail in creating a real European platform for innovation and instead maintain the current ‘country-based model,’ all our emerging businesses will end up becoming M&A targets for American multinational companies.”


The limitations of mathematical modeling

Thompson believes these failures are often owing to misaligned incentives: “Those who correctly estimate significant tail risks [i.e., deviations from the normal distribution in a statistical model] may not be recognized or rewarded for doing so. Before the event, tail risks are unknown anyway if they can only be estimated from past data,” and “after the event, there are other things to worry about.” In short, it was in investors’ interest to design a model that characterized unlikely risks as infinitesimally so, and regulators weren’t paying attention. So why should we bother with models at all? Occasionally, Thompson believes, they do get it right. Her preferred example concerns research by two chemists, F. Sherwood Rowland and Mario Molina, who in the 1970s modeled the potential impact on the ozone layer of the continued release of chlorofluorocarbons, or CFCs. Within 15 years of their research, an international agreement, the Montreal Protocol, had been signed to limit CFC use, and it is now possible that the ozone layer could recover to its 1980 level by 2050. “The acceptability of the model was a function of the relatively simple context and the low costs of action,” Thompson explains.
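A small worked example shows how a thin-tailed model understates tail risk. Using made-up units, the probability of exceeding five standard deviations under a normal model versus a fat-tailed Student-t with three degrees of freedom differs by roughly four orders of magnitude:

```python
# Illustration of tail-risk underestimation: a 5-sigma event under a
# normal model vs. a fat-tailed Student-t(3). Both models are hypothetical.
from scipy import stats

normal_tail = stats.norm.sf(5)   # P(X > 5) under N(0, 1)
fat_tail = stats.t.sf(5, df=3)   # same threshold, heavier tails

print(f"normal model: {normal_tail:.2e}")  # ~2.9e-07
print(f"student-t(3): {fat_tail:.2e}")     # ~7.7e-03, ~4 orders larger
```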


Business and tech leadership predictions 2023

With remote working on the rise – despite some companies attempting to go back to the office – global hiring will continue to increase. More and more people will be able to work in digital jobs that can be done from anywhere. “When you hire internationally, you have access to a much larger talent pool, and with the possibility of hiring employees to work from anywhere in the world, companies will have a unique opportunity of filling their roles in a more diverse way to increase cross-cultural competency in remote teamwork,” says Kelvin Ong, chief of staff at online software engineering school Microverse. However, Ong agrees with James Wilknson that this means IT managers will have to develop their soft skills, such as explicit and clear written communication (“low-context communication”) and sending messages where there is a time lag before you get a response (“asynchronous communication”). ... Hedley says: “Most recessions are mild and temporary. While they are not fun, recessions can be endured. Second, business owners can, to a large extent, control their own destiny. And that’s especially true when it comes to identifying and hiring the talent that will move the needle.”


How to be the manager your IT team needs in 2023

Authenticity is important in creating high-performing teams because it lays the groundwork for strong relationships and environments in which employees can bring their whole, best selves to work. Being authentic doesn’t mean baring all your darkest secrets, but it does mean understanding your own personal style and drivers and helping your team understand those. Humans are wired for consistency, so when you show up consistently and authentically, your employees know what to expect, how to approach you, and what’s important. Better still, they feel they have space to share who they are and what drives them. ... Perhaps the most important tip, though, is to be present when you are with your team. Shifting to all-virtual work over the last couple of years has taken a toll on our ability to focus in the moment. We are constantly typing emails while listening to conference calls or responding to chats and texts while also trying to write articles or create solutions for clients. The pressure to multitask is great, but the benefits of focus and attention are even greater.



Quote for the day:

"Be so good at what you do that no one else in the world can do what you do." -- Robin Sharma

Daily Tech Digest - January 02, 2023

5 ways CIOs will disappoint their CEOs in 2023

Promise #1: The cloud will save money.
Disappointment: It never did, and still won't.
Why it won't: You can buy servers as cheaply as the cloud providers, and they need to add a profit margin when they charge you for using them.
What you should promise instead: Unlike on-premises infrastructure, the cloud lets IT easily add capacity in small increments when demand requires it. And — and this is the biggie — it also lets IT shed capacity when it's no longer needed. The result? When demand is seasonal or unpredictable the cloud truly does save money. But when demand is steady, or increases in demand are predictable, on-premises infrastructure costs less. In the cloud, fixed costs are small but incremental costs are big. The costs of on-premises systems are the opposite (a toy cost comparison follows below). ...
Promise #4: ‘Agile’ means no more big-project failures.
Disappointment: Your name will be on some miserable Agile project failures this year.
What’s going to go wrong: Your company is going to make three Agile mistakes. The first, and worst, is that it won’t lose the habit of insisting on multitasking — developers will still be asked to juggle multiple competing projects, and their top priority will still be the next phone call.
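Here is the toy cost comparison promised above, with invented prices, showing why utilization decides which side wins:

```python
# Toy cost comparison behind "the cloud saves money only for variable
# demand". All prices are made up for illustration.
HOURS_PER_MONTH = 730
ONPREM_MONTHLY = 400.0  # amortized server: high fixed, low incremental cost
CLOUD_PER_HOUR = 1.0    # pay-as-you-go: low fixed, high incremental cost

def cloud_cost(utilized_hours: float) -> float:
    return utilized_hours * CLOUD_PER_HOUR

for utilization in (0.15, 0.50, 0.95):
    hours = HOURS_PER_MONTH * utilization
    cheaper = "cloud" if cloud_cost(hours) < ONPREM_MONTHLY else "on-prem"
    print(f"{utilization:>4.0%} utilized: cloud ${cloud_cost(hours):7.2f}"
          f" vs on-prem ${ONPREM_MONTHLY:7.2f} -> {cheaper}")
```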


Digital transformation: 4 security tips for 2023

Cybersecurity training keeps employees, customers, and vendors safe from cyberattacks. Take the initiative to seek out top-of-the-line training resources that will walk you through every aspect of promoting a secure environment. Training does not need to be expensive. Learn how to avoid data breaches, cultivate a security-first mindset, and maintain airtight security. While no measure can prevent a cyberattack entirely, proper training can help minimize your risk and reduce the chance of a breach. In addition, continue to sweat the small stuff. While one weak password or phishing email may not seem like a big deal, it’s in your best interest to take every threat seriously. Implement strong password complexity controls and policies, develop and maintain phishing campaigns, track user activity, and create policies for sharing information on the internet. For example, posting information on social media could reveal answers to common security questions. Staying vigilant will help your organization avoid trouble in the future.


Wireless electronics can power trillions of IoT sensors. Here's how

We are yet to witness the full potential of IoT, but before that, we need to overcome a big challenge. The sensors that make IoT networks possible require power to stay functional, and unfortunately, our existing energy solutions are not enough to support this demand. A team of researchers at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia has been working on this problem and in their latest study, they propose an interesting solution. The authors reveal details about wireless-powered electronics that promise to meet the energy demands of IoT networks in a sustainable and eco-friendly manner. Sensors are currently powered by technologies like Li-ion batteries. Although batteries can power a large network of devices, they need to be replaced again and again. Therefore a battery-based approach is expensive, unsustainable, and harmful to the environment. For instance, conventional batteries are made of metals that are procured through mining activities resulting in air and soil contamination. Plus, when these batteries are not carefully disposed of, they release toxic chemicals into our environment.


Agile vs. waterfall: Comparing project management cultures

Waterfall and agile culture are different forms of managing software projects, but they are made of the same constituent concept: people managing people. The values we covered, on the other hand, are not interchangeable. They are different in kind; they are indeed the quintessential difference between agile and waterfall. Following the scrum guide by the book, having squads, agile coaches, dailies, and meetups might make you show up as agile, but unless your values are aligned with the Manifesto, you’re just dressing waterfall as agility. This is precisely the scenario we have been witnessing in the last few years. As more and more companies see the results of strong agile culture creating unicorns and industry juggernauts, more of them want a quick way to execute digital transformation. What happens is that they start practicing agile but keep the waterfall values: control, inflexibility, and hierarchy. Even worse, since the number of successfully transformed companies is far smaller than the number that merely pretend to have transitioned, more and more people have no experience with agile values, leading them to believe that doing agile with waterfall values is perfectly normal.


What Rust Brings to Frontend and Web Development

“Rust to WebAssembly is one of the most mature paths because there’s a lot of overlap between the communities,” Gardner told The New Stack. “A lot of people are interested in both Rust and WebAssembly at the same time.” It’s not an either “Rust or JavaScript” or even “WebAssembly or JavaScript” situation, he said. It’s possible to blend WebAssembly with JavaScript. “You’re going to see some people rewrite for WebAssembly, but you’re going to see some people take advantage of WebAssembly where appropriate, and then use JavaScript for connecting the various pieces under the hood, and maybe running portions of the application as necessary,” he said. ... Chris Siebenmann, a Unix systems administrator at the University of Toronto’s CS Labs, has a theory about that: Languages spread when developers like using the language to accomplish things that matter to them. Right now, that language is Rust. “Rust is a wave of the future because a lot of people are fond of it and they are writing more and more things in Rust, and some of these things are things that matter to plenty of people,” Siebenmann wrote in 2021.


An Entity to DTO

According to Martin Fowler, a DTO is: “An object that carries data between processes in order to reduce the number of method calls. When you're working with a remote interface, such as Remote Facade, each call to it is expensive. As a result, you need to reduce the number of calls. The solution is to create a Data Transfer Object that can hold all the data for the call.” So, initially, DTOs were intended to be used as containers for remote calls. In a perfect world, DTOs should not have any logic inside and should be immutable. We use them only as state holders. Nowadays, many developers create DTOs to transfer data between application layers, even for in-app method calls. If we use JPA as a persistence layer, we often read the opinion that it is bad practice to use entities in the business logic and that all entities should be immediately replaced by DTOs. We recently introduced DTO support in the JPA Buddy plugin. The plugin can create DTOs based on JPA entities produced by data access layer classes and vice versa – create entities based on POJOs. This allowed us to look at DTOs more closely and see how we can use them on different occasions.
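Although the passage is about Java and JPA, the pattern itself is language-agnostic. A minimal Python sketch, with hypothetical field names, shows the two properties called out above: the DTO is immutable and carries only state, assembled from a richer persistence entity.

```python
# Language-agnostic sketch of the DTO idea: an immutable, logic-free
# state holder assembled from a persistence entity. Names are invented.
from dataclasses import dataclass

class CustomerEntity:
    """Stand-in for a persistence-layer entity (e.g., a JPA entity)."""
    def __init__(self, customer_id: int, name: str, password_hash: str):
        self.customer_id = customer_id
        self.name = name
        self.password_hash = password_hash  # never leaves the data layer

@dataclass(frozen=True)  # immutable: a pure state holder, no behavior
class CustomerDTO:
    customer_id: int
    name: str

def to_dto(entity: CustomerEntity) -> CustomerDTO:
    # Carry only what the remote caller needs, in one transfer object.
    return CustomerDTO(entity.customer_id, entity.name)

print(to_dto(CustomerEntity(1, "Ada", "x9...")))
```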


Blockchain & Internet Of Things Are A Perfect Match

It won’t all be plain sailing if we’re to migrate IoT workloads to a blockchain-based infrastructure. There are some key issues that need to be overcome, but luckily a number of interesting solutions are already being built. One of the main challenges with blockchain is that it’s not a low-latency protocol. As such, most blockchains process a very low number of transactions per second, and that presents issues for large-scale IoT device networks, as these require extremely rapid rates of data transfer to keep up. Ethereum, the world’s most popular smart contract blockchain, is only capable of processing around seven transactions per second, for example. Moreover, the Ethereum network is often congested, leading to high transaction costs. In its natural state, it’s not a realistic platform for large-scale IoT deployments. The answer to this problem may lie in scaling solutions like Boba Network, which is a Layer-2 network and hybrid compute platform that powers lightning fast transactions with much lower costs than traditional Layer-1 networks. Boba Network relies on a technology called optimistic rollups, which enable multiple transactions to be bundled into one and processed simultaneously. 
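Conceptually, the rollup win for IoT is many small writes collapsing into one on-chain commitment. The toy Python sketch below bundles hypothetical sensor readings and commits a single hash; real optimistic rollups also post the underlying data and rely on fraud proofs, which this ignores.

```python
# Toy illustration of the rollup idea: many IoT readings are bundled
# off-chain and committed as one hash, so the base chain sees a single
# transaction instead of hundreds. Readings are invented.
import hashlib
import json

readings = [{"sensor": i, "temp_c": 20 + i % 5} for i in range(500)]

batch = json.dumps(readings, sort_keys=True).encode()
commitment = hashlib.sha256(batch).hexdigest()

print(f"{len(readings)} readings -> 1 on-chain commitment: {commitment[:16]}...")
```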


Getting data loss prevention right

DLP is not a plug-and-play solution. There is considerable prep work that must take place before anything is deployed. Reliable processes must exist for identifying data, performing continuous inspections, and verifying results. There must be a clear framework that identifies how data is classified, what gets blocked, and who is responsible for ultimately setting policies. Historically, many DLPs have relied on regular-expression (regex) pattern matching over data access, which offers mediocre insight into how data is used. In other words, even with the right people at the helm, the tools may be lackluster. DLP’s middling capabilities, often wielded by untrained IT departments, have given it a reputation for over-promising and under-delivering. Without a strong ability to apply context to data, many DLPs are glorified string-matching tools that overwhelm analysts with false positives. ... Many of DLP’s shortcomings are attributable to untrained staff or poor implementations. Some DLPs are built upon frameworks with functional limitations that may negatively impact their effectiveness.
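A tiny example makes the false-positive problem concrete: a naive regex that flags any 16-digit run fires on a card number and on a harmless tracking ID alike, because it has no notion of context. The sample strings are invented.

```python
# Why regex-only DLP overwhelms analysts: a naive credit-card pattern
# flags any 16-digit run, context-free. Sample strings are invented.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){16}\b")

samples = [
    "customer card 4111 1111 1111 1111 on file",  # plausibly sensitive
    "shipment tracking id 8471 0394 5512 9968",   # harmless, same shape
]

for text in samples:
    if CARD_PATTERN.search(text):
        print("FLAGGED:", text)  # both fire; the tool sees no difference
```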


Ransomware ecosystem becoming more diverse for 2023

The ransomware ecosystem has changed significantly in 2022, with attackers shifting from large groups that dominated the landscape toward smaller ransomware-as-a-service (RaaS) operations in search of more flexibility and drawing less attention from law enforcement. This democratization of ransomware is bad news for organizations because it also brought in a diversification of tactics, techniques, and procedures (TTPs), more indicators of compromise (IOCs) to track, and potentially more hurdles to jump through when trying to negotiate or pay ransoms. ...  "Fast forward to this year, when the ransomware scene seems as dynamic as ever, with various groups adapting to increased disruptive efforts by law enforcement and private industry, infighting and insider threats, and a competitive market that has developers and operators shifting their affiliation continuously in search of the most lucrative ransomware operation." ... This trend is likely to continue in 2023 with ransomware groups expected to come up with new extortion tactics to monetize attacks on victims where they're detected before deploying the final ransomware payload.


Driving Employee Retention and Performance Through Recruiting

When the job market reopened as the pandemic wound down, there simply weren’t enough workers to fill jobs. Recruiters and hiring managers were under a lot of pressure to fill roles and fill them fast. The Muse CEO and founder Kathryn Minshew explains it this way: With companies desperate to hire and HR pros stretched thin, recruiters may be going rogue and stretching the truth to fill roles. Or, they could say things they think are true, but they don’t have the full picture of the workplace experience. She advises companies to be honest about what it’s like to work there, including successes as well as areas for improvement. Interviews should be a two-way street, and you must give candidates enough time to ask questions about company culture. "When people feel like they have opted into a situation with eyes wide open," Minshew says, "they’re much more likely to accept the good and the bad, and to show up as engaged, productive, satisfied employees. Rather than fluffy mission statements, what if you were able to openly and transparently connect candidates to their personal purpose from their first connection to your employer brand?"



Quote for the day:

"A lot of people have gone farther than they thought they could because someone else thought they could. " -- Zig Zigler

Daily Tech Digest - January 01, 2023

New Year’s resolutions for cloud pros

We live in days when cloud skills are defined by specialization. People aren’t just cloud database experts, they are experts on a specific cloud database on a specific cloud provider. The same can be said for cloud-based business intelligence, a specific SaaS provider, or cloud operations focused on a specific OS configuration. We seem to fall into niches. This limits your options if your specific cloud technology becomes less popular. It’s better to have a skill waiting in the wings than to learn one at the last minute. Look at job sites to see what skills are most in demand that are somewhat related to your current skills and obtain the basic chops that will allow you to talk your way into a new gig if needed. For instance, if you’re focused just on a single cloud object database, perhaps learn about one or two other object databases on another cloud provider. This should be a relatively easy transition given that the concepts are much the same. You can diversify even more, such as learning about cloud-native development if you’re currently a cloud developer.
 

The one real problem with synthetic media

Synthetic media promises a very near future in which advertisements are custom generated for each customer, super realistic AI customer service agents answer the phone even at small and medium-sized companies, and all marketing, advertising and business imagery is generated by AI, rather than human photographers and graphics people. The technology promises AI that writes software, handles SEO, and posts on social media without human intervention. Great, right? The trouble is that few are thinking about the legal ramifications. Let’s say you want your company’s leadership to be presented on an “About Us” page on your website. Companies now are pumping existing selfies into an AI tool, choosing a style, then generating fake photos that all look like photos taken in the same studio with the same lighting, or painted by the same artist with the same style and palette of colors. But the styles are often “learned” by the AI by processing (in legal terms) the intellectual property of specific photographers or artists.


The Curious Case of Linux: It’s for Everyone, but Nobody Uses it

There are three main reasons that users shy away from using Linux. The first is the perceived unintuitiveness of the OS, which is the biggest fear of new users. The second is the lack of support for applications, games, and devices – a problem that has plagued Linux forever. The third, and most questionable, is the toxic fanbase associated with the operating system, which commonly undermines the efforts of newcomers to the ecosystem. Command-line-interface nightmares are the most-quoted barrier for newcomers entering the ecosystem. In addition, software developers rarely optimise applications for use on Linux, making compatibility a nightmare for creators and power users. To combat this, the community has come up with distros that inherently require less technical know-how than others. One of the best examples of this is Pop!_OS. ... Another major problem that average users have with Linux is not only the lack of software, but a lack of support for games.


Building Security Champions

A Security Champion is a team member that takes on the responsibility of acting as the primary advocate for security within the team and acting as the first line of defense for security issues within the team. Or, more plainly: The person who is most excited about security on a team. They want to read the book, fix the bug, or ask security questions. Every time. Security champions are your communicators. They deliver security messages to each dev team, teaching, sharing, and helping. They are your point of contact, delivering messages to and from the security team and keeping you up to date on what matters to your team. They are your advocate. They perform security work, for their dev team, with your help. They also advocate for security, asking questions in situations you would have been left out of. Raising concerns you might have missed. They are a peer for everyone on their team and can influence in ways that you yourself cannot. In the next few paragraphs, we will cover how to build an amazing security champions program!


Italian Healthcare Group Targeted in Data-Leaking Shakedown

The criminals claim they reached out directly to hospital staff: "We has also ask some of employees during phone calls about the incident but they answered that they didn't heard about any breach. So, they were asked to review the evidence in Live Chat and we have repeatedly tried to make it clear that hundreds of thousands of personal data have been compromised due to their negligence." The criminals add: "Our advise is to replace the entire IT staff and have them undergo proficiency tests and check them for budget wasting as well." Take all such posturing and self-serving announcements with a big grain of salt, says Brett Callow, a threat analyst at security firm Emsisoft who closely tracks ransomware groups' activities. ... "Why do they do this? It's all about PR and branding. They think that organizations may be less likely to want to hand money to the type of evil criminals who are happy to put lives at risk by carrying out financially motivated attacks on hospitals."


Workplace Trends You Need to Know for 2023

As we near the end of 2022, a shift is happening — for the better. The U.S. Surgeon General reported that 71% of employees believe their employer is more concerned about their mental health and wellbeing than ever before. This is a huge step forward and one we must grasp and run with. In response, the U.S. Surgeon General released a framework that aims to support workplaces in better improving the mental health and wellbeing of their employees. This includes: Ensuring there is an opportunity for growth, valuing employee contributions, enhancing social connections in the workplace and focusing on achieving better work-life integration. We're likely to see more mental wellbeing initiatives and strategies employed across businesses that deliver meaningful and practical help to their employees — from self-care days off once a month to increased wellbeing benefits, mental health first aid training and even adaptations to the workplace.


US Congress funds cybersecurity initiatives in FY2023 spending bill

The bill stipulates that no government agency may use their funds to buy telecom equipment from Chinese tech giants Huawei or ZTE for “high or moderate impact information systems,” as determined by the National Institute of Standards and Technology (NIST). It further states that agencies cannot use any of their funds for technology, including biotechnology, digital, telecommunications, and cyber, developed by the People’s Republic of China unless the secretary of state, in consultation with the USAID administrator and the heads of other federal agencies, as appropriate, determines that such use does not adversely impact the national security of the United States. Moreover, no agency can spend funds on entities owned, directed, or subsidized by China, Iran, North Korea, or Russia unless the FBI or other appropriate federal entity has assessed any risk of cyber espionage or sabotage associated with acquisitions from these entities. ... Finally, the bill amends the Federal Food, Drug, and Cosmetic Act to make medical device makers meet specific cybersecurity standards. 


Cloud Adoption Plans Accelerate, Highlighting Need for Qualified IT

As organizations transition to providing digital solutions in a digital workplace, public, multi, and hybrid cloud adoption is on the rise. Farid Roshan, global head of digital enablement practice at Altimetrik, says the traditional data center mindset leads to high sunk costs for procuring appliances and difficulty in attaining talent to support data center maintenance activities. “Organizations lose precious time and energy focusing on managing infrastructure vs. building products that bring value to their customers,” he says. From his perspective, public cloud platforms provide IT teams the ability to focus on creating innovative solutions and attracting highly skilled talent to develop products that drive business growth, while reducing overall IT cost of ownership. Roshan adds cloud adoption can lead to unexpected delays and failure in transforming organizations if the cloud strategy is not well understood across the organization. “Understanding the goals for moving to the cloud as well as implementing an executive cloud strategy, defining a roadmap and OKRs, will allow for business and IT groups to align their annual and quarterly goals,” he says.


What is the role of the data manager?

The data manager’s function is essentially to oversee the value chain and ensure data is delivered effectively, says Carruthers. “This means helping create data which is accessible, usable and safe. Information can then be delivered to the right place and in a good condition so it can be used in the most effective way possible.” Carruthers compares the role of data manager to the conductor in an orchestra. “The manager is there to oversee the whole data team, rather than frantically trying to play every instrument themselves. As the orchestra analogy suggests, it is a data manager’s role to ensure the song sheet is followed by every team member. This means managing the use of data to ensure it goes through the correct value chain.” The data manager role is not just about being “good with data”. It involves a combination of technical and interpersonal skills, says Andy Bell, vice president global data product management at data integrity specialist Precisely. As well as technical skills, he says data managers need to have “a thorough understanding about the application of technology”. 


Cybercriminals create new methods to evade legacy DDoS defenses

Attackers will continue to make their mark in 2023 by trying to develop new ways to evade legacy DDoS defenses. We saw Carpet Bomb attacks rearing their head in 2022 by leveraging the aggregate power of multiple small attacks, designed specifically to circumvent legacy detect-and-redirect DDoS protections or neutralize ‘black hole’ sacrifice-the-victim mitigation tactics. This kind of cunning will be on display as DDoS attackers look for new ways of wreaking havoc across the internet and attempt to outsmart existing thinking around DDoS protection. In 2023, the cyberwarfare that we have witnessed with the conflict in Ukraine will undoubtedly continue. DDoS will continue to be a key weapon in the Ukrainian and other conflicts both to paralyse key services and to drive political propaganda objectives. DDoS attack numbers rose significantly after the Russian invasion in February and DDoS continues to be used as an asymmetric weapon in the ongoing struggle.



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller

Daily Tech Digest - December 31, 2022

Credentials Are the Best Chance To Catch the Adversary

It used to be that attackers would batter the networks of their targets. Now, they may use LinkedIn and social media to identify your employees’ personal email accounts, hack them, and look for other credentials. External actors may also identify unhappy employees posting negative reviews on Glassdoor and offer to buy their credentials. Or these actors may just boldly call your employees out of the blue and offer to pay them for their login information and ongoing approval of multi-factor authentication (MFA) prompts. As a result, MFA is no longer a reliable tool in preventing attacks, as it can be easily gamed by malicious insiders. ... Not every attack uses stolen credentials to gain initial access to networks, but every attack eventually involves credentials. After gaining access to networks, bad actors see who has privileged access. ... Between nation-state actors, criminal gangs, computer-savvy teenagers and disgruntled insiders, the likelihood is that your network has already been penetrated. What you need now is to detect these attacks at speed to minimize their damage.


Artificial Intelligence Without The Right Data Is Just... Artificial

Successful AI “requires data diversity,” says IDC analyst Ritu Jyoti in a report from earlier in 2022. “Similarly, the full transformative impact of AI can be realized by using a wide range of data types. Adding layers of data can improve accuracy of models and the eventual impact of applications. For example, a consumer's basic demographic data provides a rough sketch of that person. If you add more context such as marital status, education, employment, income, and preferences like music and food choices, a more complete picture starts to form. With additional insights from recent purchases, current location, and other life events, the portrait really comes to life.” To enable AI to scale and proliferate across the enterprise, “stakeholders must ensure a solid data foundation that enables the full cycle of data management, embrace advanced analytical methods to realize the untapped value of data,” says Shub Bhowmick, co-founder and CEO of Tredence. “In terms of data availability and access, businesses need a way to parse through huge tracts of data and surface what’s relevant for a particular application,” says Sachdev.


Web3, the Metaverse and Crypto: Trends to Expect in 2023 and Beyond

If something good can come from FTX, it is that more regulations are coming, especially for centralized crypto exchanges, along with stricter rules on investor protection in the crypto trading space. Even Congress is paying attention, having summoned SBF for a congressional hearing (he was arrested the day before the scheduled hearing). These regulations are overdue – I have advocated for regulating centralized crypto exchanges since 2017. However, it’s better late than never. Legislators and regulators worldwide have zeroed in on the crypto market with an attempt to lay out rules, which hopefully prevents future catastrophes such as FTX. But legislators and regulators must be cautious in their approach, making sure not to stifle Web3 innovation. If they understand the difference between cryptocurrency as an asset class that trades on a centralized trading platform, and innovation that utilizes Web3 technology, and stick to investor protection while creating a welcoming environment for the development of Web3 applications, then we can expect a favorable legislative environment both for investors and developers.


Microservices Integration Done Right Using Contract-Driven Development

When all the code is part of a monolith, the API specification for a service boundary may just be a method signature. These method signatures can be enforced through mechanisms such as compile-time checks, thereby giving early feedback to developers. However, when a service boundary is lifted to an interface such as an HTTP REST API by splitting the components into microservices, this early feedback is lost. The API specification, which was earlier documented as an unambiguous method signature, now needs to be documented explicitly to convey the right way of invoking it. This can lead to a lot of confusion and communication gaps between teams if the API documentation is not machine-parsable. ... Adopting an API specification standard such as OpenAPI or AsyncAPI is critical to bring back the ability to communicate API signatures in an unambiguous and machine-readable manner. While this adds to developers’ workload to create and maintain these specs, the benefits outweigh the effort.
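As a sketch of the early feedback such specs enable, the Python snippet below validates a provider response against a schema fragment of the kind a tool might extract from an OpenAPI document, using the third-party jsonschema library. The schema and response are hypothetical.

```python
# Sketch of contract checking via a machine-readable schema, using the
# jsonschema library. Schema and response are invented examples.
from jsonschema import validate, ValidationError

# Fragment of the kind a tool might pull from components/schemas in an
# OpenAPI document.
ORDER_SCHEMA = {
    "type": "object",
    "required": ["id", "status"],
    "properties": {
        "id": {"type": "integer"},
        "status": {"type": "string", "enum": ["OPEN", "SHIPPED"]},
    },
}

response = {"id": "not-an-int", "status": "OPEN"}  # a drifting provider

try:
    validate(instance=response, schema=ORDER_SCHEMA)
except ValidationError as err:
    print("contract broken:", err.message)  # caught before consumers are
```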


The Threat of Predictive Policing to Data Privacy and Personal Liberty

It's not just related to law enforcement targeting; it's also related to any legal decisions. Custody decisions, civil suit outcomes, insurance decisions, and even hiring decisions can all be influenced by the RELX-owned LexisNexis system, which gathers and aggregates data. Unfortunately, there's little recourse for someone who was unfairly treated due to a data-based risk assessment because people are rarely privy to the way these decisions are made. So, a corporate HR manager or Family Court judge could be operating off bad or incomplete data when making decisions that could effectively change lives. RELX and Thomson Reuters have disclaimers freeing them from liability for inaccurate data, which means your information could be mixed in with someone else's, causing serious repercussions in the wrong circumstances. In 2016, a man named David Alan Smith successfully sued LexisNexis Screening Solutions when the company provided his prospective employer with an inaccurate background check. 


10 digital twin trends for 2023

Over the last year, the world has been wowed by how easy it is to use ChatGPT to write text and Stable Diffusion to create images. ... Over the next year, we can expect more progress in connecting generative AI techniques with digital twin models for describing not only the shape of things but how they work. Yashar Behzadi, CEO and founder of Synthesis AI, a synthetic data tools provider, said, “This emerging capability will change the way games are built, visual effects are produced and immersive 3D environments are developed. For commercial usage, democratizing this technology will create opportunities for digital twins and simulations to train complex computer vision systems, such as those found in autonomous vehicles.” ... Hybrid digital twins make it easier for CIOs to understand the future of a given asset or system. They will enable companies to merge asset data collected by IoT sensors with physics data to optimize system design, predictive maintenance and industrial asset management. Banerjee foresees more and more industries adopting this approach with disruptive business results in the coming years.


Change Management is Essential for Successful Digital Transformation

Vasantraj notes, “Organizational culture is vital in fostering leadership and enabling enterprises to adapt. Successful teams are built on trust and the ability to put aside self-interest and work together. Teams must think of organizations as a single entity and keep a growth mindset.” This type of collaborative culture doesn’t emerge without a lot of effort. Amy Ericson, a Senior Vice President at PPG, suggests one way a great change management leader can make their efforts employee-centric is to lead with empathy. She makes three helpful recommendations, “First, ask how your people are. Really ask them. Then, listen. You may find that they’re struggling, and your interest in how they are doing and genuine concern will help them move forward productively. Second, acknowledge their situation and ask how you can help. Do they need access to new tools or resources? Do they need a different schedule? Third, thank them, and follow through. Praise their courage to be honest, and deliver on your promises to help them succeed.”[5] Beyond being an empathetic leader, the BCG team highly recommends getting employees involved from the beginning of the change process.

‘There’s a career in cybersecurity for everyone,’ Microsoft Security CVP says

When there’s an abundance of opportunities, there are many ways of getting into that opportunity. We do have an incredible talent shortage. Going back to a myth buster, 37% of the people that we surveyed said that they thought a college degree was necessary to be in security. It’s not true. You don’t need a college degree. Many security jobs don’t require a four-year college degree. You can qualify by getting a certificate, an associate degree from a community college. Hence, why we are working with community colleges. There’s also a lot of resources for free because it can be daunting. The cost itself can be daunting, but there’s a lot of resources. Microsoft has a massive content repository that we have made available. We have made certifications. These are available to anyone who wants to take them, and there are ways you can train yourself and get into cybersecurity. We have this abundance of opportunity, which creates new ways of getting in, and we need to educate people about all these facets about how they can get in.


How the Rise of Machine Identities Impacts Enterprise Security Strategies

First, security leaders must rethink their traditional identity and access management (IAM) strategies. Historically, IAM has focused on human identities authenticating access to systems, software, and apps on a business network. However, with the rise of containers, APIs and other technologies, a secure IAM approach must utilize cryptographic certificates, keys and other digital secrets that protect connected systems and support an organization’s underlying IT infrastructure. With the shift to the cloud, a Zero Trust framework has become the new security standard, where all users, machines, APIs and services must be authenticated and authorized before being able to access apps and data. In the cloud, there is no longer a traditional security perimeter around the data center, so the service identity is the new perimeter. When handling machine identities, fine-grained consent controls are essential in protecting privacy as data is moved between machines. The authorization system discerns the “who, what, where, when, and why” and confirms that the owner has consented to the sharing of that data and the person requesting access isn’t a fraudster.
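As a concrete illustration, here is a minimal sketch of machine-to-machine authentication over mutual TLS, one common way to treat service identity as the perimeter. The article names no implementation; the URL, file paths, and certificate locations below are hypothetical placeholders.

    import requests  # third-party HTTP client, assumed installed

    # The calling service presents its own certificate (its machine identity)
    # and only trusts servers signed by the organization's internal CA.
    resp = requests.get(
        "https://orders.internal.example.com/v1/orders",  # hypothetical internal API
        cert=("/etc/identity/service.crt", "/etc/identity/service.key"),
        verify="/etc/identity/internal-ca.pem",
        timeout=5,
    )
    resp.raise_for_status()
    print(resp.json())

Because both sides must present a certificate, a request without a valid machine identity is rejected before any application logic runs, which is the Zero Trust posture the article describes.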


3 Predictions For Fintech Companies’ Evolution In 2023

If you spend even five minutes on LinkedIn, you know the debate between in-person, hybrid and distributed work is still a hot one. But what does the data tell us? Owl Labs’ State of Remote Work Report found the number of workers choosing to work remotely in 2022 increased 24%, the number choosing hybrid went up 16%, and interest in in-office work dropped by 24%. The data keeps rolling in: a McKinsey study found that, when offered, almost everyone takes the opportunity to work flexibly. Companies looking to embrace this flexible work mindset should focus on improving and optimizing synchronous activities like all-hands meetings, lunch and learns, and coffee chats. Supporting asynchronous work is also important. Personally, I’m a champion of written and narrative documentation of projects, which allows people to review and process on their own time and at their own pace. In my experience, this makes meetings even more productive and impactful so people can focus on the outcomes of time spent together. No one has a crystal ball for what the next year holds.



Quote for the day:

"Leadership matters more in times of uncertainty." -- Wayde Goodall

Daily Tech Digest - December 29, 2022

10 IT certifications paying the highest premiums today

The Certified in the Governance of Enterprise IT (CGEIT) certification is offered by ISACA to validate your ability to handle “the governance of an entire organization” and can also help prepare you for moving to a C-suite role if you aren’t already in an executive leadership position. The exam covers general knowledge of governance of enterprise IT, IT resources, benefits realization, and risk optimization. To qualify for the exam, you’ll need at least five years of experience in an advisory or oversight role supporting the governance of IT in the enterprise. ... The AWS Certified Security certification is a specialty certification from Amazon that validates your expertise and ability with securing data and workloads in the AWS cloud. The exam is intended for those working in security roles with at least two years of hands-on experience securing AWS workloads. It’s recommended that candidates for the exam have at least five years of IT security experience designing and implementing security solutions. ... To earn the certification, you will need to pass the AWS Certified Security Specialty exam, which consists of multiple-choice and multiple-response questions.


When will cloud computing stop growing?

So, no matter where the market goes, and even if the hyperscalers begin to seem more like legacy technology, the dependencies will remain and growth will continue. The hyperscaler market could become more complex and fragmented, but public clouds are the engines that drive growth and innovation. Will it stop growing at some point? I think there are two concepts to consider: First, cloud computing as a concept. Second, the utility of the technology itself. Cloud computing is becoming so ubiquitous, it will likely just become computing. If we use mostly cloud-based consumption models, the term loses meaning and is just baked in. I actually called for this in a book I wrote back in 2009. Others have called for this as well, but it’s yet to happen. When it does, my guess is that the cloud computing concept will stop growing, but the technology will continue to provide value. The death of a buzzword. The utility, which is the most important part, carries on. Cloud computing, at the end of the day, is a much better way to consume technology services. The idea of always owning our own hardware and software, running our own data centers, was never a good one.


Modernise and Bolster Your Data Management Practice with Data Fabric

Data has emerged as an invaluable asset that can not only be used to power businesses but can also be put to the wrong use for individual benefit. With stringent regulatory norms around data handling and management in place, data security, governance, and compliance need dedicated attention. Data fabric can significantly improve security by integrating data and applications from across physical and IT systems. It enables a unified and centralized route to creating policies and rules. The ability to automatically link policies and rules to metadata such as data classifications, business terms, user groups, and roles, including policies on data access controls, data privacy, data protection, and data quality, ensures optimized data governance, security, and compliance. Changing business dynamics require businesses to stay ahead of the curve by aptly and actively using data. Data fabric is a data operational layer that weaves through huge volumes of data from multiple sources and processes it using machine learning, enabling businesses to discover patterns and insights in real time.


It’s a Toolchain!

Even ‘one’ toolchain is really not the same chain of tools; it is the same CI/CD tool managing a pool of others. This has really interesting connotations for the idea of the “weakest link in the chain,” whether we’re talking security, compliance or testing, because the weakest link might depend on which tools are spawned this run. To take an easy example that doesn’t overlap with the biggest reason above: targeting containers for test and virtual machines (VMs) for deployment. Some organizations do this type of thing regularly due to licensing or space issues. That is two different deployment steps in ‘one’ toolchain. There are more instances like this than you would think. “This project uses make, that one uses cmake” is an example of the type of scenario we’re talking about. These minor variations are handled by what gets called from CI, as in the sketch below. Finally, most of the real-life organizations I stay in touch with are both project-based and constantly evolving. That makes both of the above scenarios the norm, not the exception. While they would love to have one stack and one toolchain for all projects, no one realistically sees that happening anytime soon.
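A minimal sketch of how such variation is often absorbed at the CI entry point: the same pipeline step inspects the project and invokes whichever build tool it finds. The function name and layout are hypothetical, not from the article.

    import pathlib
    import subprocess

    def build(project_dir: str) -> None:
        """Dispatch to cmake or make depending on what the project uses."""
        project = pathlib.Path(project_dir)
        if (project / "CMakeLists.txt").exists():
            subprocess.run(["cmake", "-B", "build"], cwd=project, check=True)
            subprocess.run(["cmake", "--build", "build"], cwd=project, check=True)
        elif (project / "Makefile").exists():
            subprocess.run(["make"], cwd=project, check=True)
        else:
            raise RuntimeError(f"no recognized build file in {project}")

The pipeline stays ‘one’ toolchain, but the actual chain of tools differs from run to run, which is exactly why the weakest link can vary.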


How DevOps is evolving into platform engineering

Platform engineering is the next big thing in the DevOps world. It has been around for a few years. Now the industry is shifting toward it, with more companies hiring platform engineers or cloud platform engineers. Platform engineering opens the door for self-service capabilities through more automated infrastructure operations. With DevOps, developers are supposed to follow the "you build it, you run it" approach. However, this rarely happens, partly because of the vast number of complex automation tools. Since more and more software development tools are available, platform engineering is emerging to streamline developers' lives by providing and standardizing reusable tools and capabilities as an abstraction over complex infrastructure. Platform engineers focus on internal products for developers. Software developers are their customers, and platform engineers build and run a platform for developers. Platform engineering also treats internal platforms as a product with a heavy focus on user feedback. Platform teams and the internal development platform scale out the benefits of DevOps practices.


Top 5 Cybersecurity Trends to Keep an Eye on in 2023

Cybersecurity must evolve to meet these new demands as the world continues shifting toward remote and hybrid working models. With increased reliance on technology and access to sensitive data, organizations need to ensure that their systems are secure and their employees are equipped to protect against cyber threats. Organizations should consider implementing security protocols such as Multi-Factor Authentication (MFA), which requires additional authentication steps to prove the user’s identity before granting access to systems or data. MFA can provide an additional layer of protection against malicious actors who may try to access accounts with stolen credentials. Businesses should also consider developing policies and procedures for securing employee devices. This could include providing employees with secure antivirus software and encrypted virtual private networks (VPNs) for remote connections. Additionally, employees should be trained on the importance of strong passwords, unique passwords for each account, and the dangers of using public networks.
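For illustration, here is a minimal time-based one-time password (TOTP) sketch of the second MFA factor, using the pyotp library; the article names no implementation, and the user and issuer names are invented.

    import pyotp  # third-party TOTP library, assumed installed

    # Provisioned once per user and stored server-side.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    # URI the user scans into an authenticator app during enrollment.
    print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

    # At login, after the password check, verify the 6-digit code the user enters.
    user_code = totp.now()  # stand-in for real user input in this sketch
    assert totp.verify(user_code, valid_window=1)  # tolerate one 30s step of clock drift

Because the code changes every 30 seconds and is derived from a secret the attacker never sees, a stolen password alone is not enough to log in.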


Understanding Data Management, Protection, and Security Trends to Design Your 2023 Strategy

Today more than ever, there is a need for a modernized approach to data security, given that threats are growing increasingly sophisticated. Authentication-as-a-Service with built-in SSO capabilities, tightly integrated with cloud apps, will secure online access. Data encryption solutions with comprehensive key management will help customers protect their digital assets whether on-premises or in the cloud. EDRM solutions with the widest file and app support will aid customers in protecting and controlling their data even outside their networks. DLP solutions with integrated user behavior analysis (UBA) modules help customers leverage their investment in DLP. Data discovery and classification help organizations get complete visibility into sensitive data through efficient discovery, classification, and risk analysis across heterogeneous data stores. These are some of the approaches through which organizations can benefit from OEMs designing data security solutions and products.


US-China chip war puts global enterprises in the crosshairs

“In addition to the chipmakers and semiconductor manufacturers in China, every company on the supply chain of advanced chipsets, such as the electronic vehicle manufacturers and HPC [high performance computing] makers in China, will be hit," said Charlie Dai, research director at market research firm Forrester. "There will also be collateral damage to the global technology ecosystem in every area, such as the chip design, tooling, and raw materials.” Enterprises might not feel the burn right away, since interdependencies between China and the US will be hard to unwind immediately. For example, succumbing to pressure from US businesses, in early December the US Department of Defense said it would allow its contractors to use chips from the banned Chinese chipmakers until 2028. In addition, the restrictions are not likely to have a direct effect on the ability of the global chip makers to manufacture semiconductors, since they have not been investing in China to manufacture chips there, said Pareekh Jain, CEO at Pareekh Consulting.


Financial Services Was Among Most-Breached Sectors in 2022

The practice of attackers sneaking so-called digital skimmers - typically, JavaScript code - onto legitimate e-commerce or payment platforms also continues. These tactics, known as Magecart-style attacks, most often aim to steal payment card data when a customer goes to pay. Attackers either use that data themselves or batch it up into "fullz," referring to complete sets of credit card information that are sold via a number of different cybercrime forums. Innovation continues among groups that practice Magecart tactics. In recent weeks, reports application security vendor Jscrambler, three different attack groups have begun wielding new, similar tactics designed to inject malicious JavaScript into legitimate sites. One of the groups has been injecting a "Google Analytics look-alike script" into victims' pages, while another has been injecting a "malicious JavaScript initiator that is disguised as Google Tag Manager." The third group is also injecting code, but does so by having registered the domain name for Cockpit, a free web marketing and analytics service that ceased operations eight years ago. 
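A hypothetical sketch of the detection side: flag script tags served from hosts outside an allowlist, the kind of check that can surface a look-alike analytics domain such as the disguised Google Tag Manager loader described above. The allowlist and example URL are invented for illustration.

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    ALLOWED_SCRIPT_HOSTS = {"www.googletagmanager.com", "www.google-analytics.com"}

    class ScriptAuditor(HTMLParser):
        """Report any external script loaded from an unexpected host."""
        def handle_starttag(self, tag, attrs):
            if tag == "script":
                src = dict(attrs).get("src")
                host = urlparse(src).netloc if src else ""
                if host and host not in ALLOWED_SCRIPT_HOSTS:
                    print(f"suspicious external script: {src}")

    ScriptAuditor().feed(
        '<script src="https://gtm-analytics.example/loader.js"></script>'
    )

A real deployment would run such checks continuously against rendered pages, since skimmer scripts are typically injected after the site ships.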


Microservices Integration Done Right Using Contract-Driven Development

Testing an application is not just about testing the logic within each function, class, or component. Features and capabilities are the result of these individual snippets of logic interacting with their counterparts. If a service boundary/API between two pieces of software is not properly implemented, it leads to what is popularly known as an integration issue. Example: if functionA calls functionB with only one parameter while functionB expects two mandatory parameters, there is an integration/compatibility issue between the two functions. Within a single codebase, the compiler or a unit test flags this immediately, and such quick feedback helps us course-correct and fix the problem early. However, when we look at such compatibility issues at the level of microservices, where the service boundaries are at the HTTP, messaging, or event level, a deviation from or violation of the service boundary is not immediately identified during unit and component/API testing. The microservices must be tested with all their real counterparts to verify whether there are broken interactions. This is what is broadly (and in a way wrongly) classified as integration testing.
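To make the functionA/functionB example concrete, here is a minimal sketch of checking a consumer's recorded expectation against a provider's signature at test time. The names and the contract format are invented for illustration and do not represent any specific contract-testing tool.

    import inspect

    def function_b(x, y):  # provider: two mandatory parameters
        return x + y

    # Consumer's recorded expectation: it calls with one positional argument.
    consumer_contract = {"positional_args": 1}

    required = [
        p for p in inspect.signature(function_b).parameters.values()
        if p.default is inspect.Parameter.empty
    ]
    if consumer_contract["positional_args"] < len(required):
        print(f"integration issue: consumer sends "
              f"{consumer_contract['positional_args']} argument(s), "
              f"provider requires {len(required)}")

Contract-driven development applies the same idea at the HTTP or messaging boundary, so the incompatibility surfaces in a fast unit-level check rather than in a full environment with all real counterparts deployed.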



Quote for the day:

"To command is to serve : nothing more and nothing less." -- Andre Marlaux