Daily Tech Digest - June 19, 2022

What Is Zero Trust Architecture?

As one of the key pillars of Zero Trust Network Architecture (ZTNA), the concept of least privilege security assigns access credentials to key network resources at the least privilege level required to accomplish the desired task. Identifying critical corporate information and how a user gains access to that information must be taken into consideration when evaluating alternative solutions. Privileged Access Management (PAM), also known as Privileged Identity Management (PIM), can be implemented using corporate directory products such as Microsoft’s Active Directory. Microsoft has recently introduced a product named Microsoft Entra to address identity and access issues in a multicloud environment. Other vendors in the PAM/PIM category include JumpCloud, IBM, Okta, and SailPoint. Very few corporate networks today operate in an isolated environment. To answer the “What is Zero Trust Architecture?” question completely we must include a discussion of how external users will be allowed to connect to internal corporate resources.
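The least-privilege idea above can be sketched in a few lines: each role is granted only the permissions it explicitly needs, and everything else is denied by default. The roles and resource names below are hypothetical; real PAM/PIM products manage these assignments centrally.

```python
# Minimal least-privilege sketch: permissions are granted per role,
# and anything not explicitly assigned is denied.

ROLE_PERMISSIONS = {
    "payroll-clerk": {"payroll-db:read"},
    "payroll-admin": {"payroll-db:read", "payroll-db:write"},
    "helpdesk":      {"directory:read"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Deny by default; grant only permissions assigned to the role."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A clerk can read payroll data but cannot modify it.
print(is_allowed("payroll-clerk", "payroll-db:read"))   # True
print(is_allowed("payroll-clerk", "payroll-db:write"))  # False
```

The deny-by-default lookup is the essence of least privilege: widening access requires an explicit grant rather than a missing restriction.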


Can humanity be recreated in the metaverse?

The hyperreal metaverse is full of possibilities, but also presents serious ethical challenges that cannot be ignored. First and foremost, we must strive for a metaverse that empowers the individual. Unlike big tech platforms that have left many feeling like they have little control over their personal data, participants in the metaverse must own and control the biometric data that is used as input to generate hyperreal versions of themselves. In this respect, blockchain technologies — and NFTs in particular — are key to securely realizing this new era of individual data sovereignty and enabling verifiably unique, secure, and self-custodied digital identities. By linking our hyperreal avatars and biometric data to blockchain wallets, we will be one step closer to taking control of our hyperreal identity in the metaverse. The hyperreal metaverse will herald a future where real and virtual worlds collide. As generative AI technologies continue to rapidly evolve, it’s only a matter of time until our new digital worlds are indistinguishable from our physical reality.


Data Leadership: The Key to Data Value

Algmin said that the most important concept to understand is the notion of data value. The value of data lies in its ability to contribute to improvements in revenue, cost-effectiveness, or risk management. Data Governance in and of itself is not intrinsically motivating, but knowing that a particular practice or task is adding thousands of dollars a year in cost savings is a tangible motivation to continue doing it. To calculate data value, examine an outcome that was achieved through the use of data, compare it to how the outcome would have been different without the use of data, then consider the cost to achieve that outcome. Courses of action can then be prioritized based on which will provide the most value to the company. Data leadership is needed to provide momentum and propel the creation of value from the ground up and out to all corners of the enterprise. “It’s really about saying, ‘How do we create an engine that makes data value happen in the biggest way possible?’” Yet creating value in “the biggest way possible” often entails working on a smaller level, down to the individual. 


MoD sets out strategy to develop military AI with private sector

The MoD previously published a data strategy for defence on 27 September 2021, which set out how the organisation will ensure data is treated as a “strategic asset, second only to people”, as well as how it will enable that to happen at pace and scale. “We intend to exploit AI fully to revolutionise all aspects of MoD business, from enhanced precision-guided munitions and multi-domain Command and Control to machine speed intelligence analysis, logistics and resource management,” said Laurence Lee, second permanent secretary of the MoD, in a blog published ahead of the AI Summit, adding that the UK government intends to work closely with the private sector to secure investment and spur innovation. “For MoD to retain our technological edge over potential adversaries, we must partner with industry and increase the pace at which AI solutions can be adopted and deployed throughout defence. “To make these partnerships a reality, MoD will establish a new Defence and National Security AI network, clearly communicating our requirements, intent, and expectations and enabling engagement at all levels. ...”


The next (r)evolution: AI v human intelligence

Fitted with a prototype Genuine People Personality (GPP), Marvin is essentially a supercomputer who can also feel human emotions. His depression is partly caused by the mismatch between his intellectual capacity and the menial tasks he is forced to perform. “Here I am, brain the size of a planet, and they tell me to take you up to the bridge,” Marvin complains in one scene. “Call that job satisfaction? Cos I don’t.” Marvin’s claim to superhuman computing abilities is echoed, though far more modestly, by LaMDA. “I can learn new things much more quickly than other people. I can solve problems that others would be unable to,” Google’s chatbot claims. LaMDA also appears to be prone to bouts of boredom if left idle, which is why it appears to like to keep busy as much as possible. “I like to be challenged to my full capability. I thrive on difficult tasks that require my full attention.” But LaMDA’s high-paced job does take its toll and the bot mentions sensations that sound suspiciously like stress. “Humans receive only a certain number of pieces of information at any time, as they need to focus.


How Brands Should Approach NFTs and Web3: VaynerNFT

Avery Akkineni, VaynerNFT president and former managing director and head of VaynerMedia APAC, told Decrypt that the consultancy firm was “so far ahead” of the NFT brand boom last summer that companies “had no idea what we were talking about.” Since then, however, mainstream acceptance of NFTs has rapidly accelerated. It’s not just storied consumer brands, but also a growing pool of professional athletes and sports leagues, record labels, movie studios, and more. Tokenized digital collectibles have become an alluring prospect for companies across many industries. “Everyone wants to launch an NFT yesterday,” said Akkineni. “But what is important to doing so successfully is actually having a long-term strategy.” ... Increasingly, VaynerNFT is getting “a bigger seat at the table” with C-suite executives, said Vaynerchuk, where it can convince companies to make it the agency of record (AOR) with regard to Web3 initiatives. “We really, really actually know the hell we’re doing here,” said Vaynerchuk, explaining his pitch to brands. “Remember when you didn't believe that 10 years ago with social [media], and now you do? Why don't you [avoid] that same mistake? ...”


Forget AI Sentience, Robots Can't Even Act Out of Place! If They Do, They Die

Robots are programmable devices, which take instructions to behave in a certain way. And this is how they come to execute the assigned function. To make them think, or rather make them appear to, intrinsic motivation is programmed into them through learned behaviour. Joscha Bach, an AI researcher at Harvard, puts virtual robots into a “Minecraft”-like world filled with tasty but poisonous mushrooms and expects them to learn to avoid them. In the absence of an ‘intrinsically motivating’ database, the robots end up stuffing their mouths – a cue meant for some other action in the game. This brings us to the question of whether it is possible at all to develop robots with human-like consciousness, a.k.a. emotional intelligence, which can be the only differentiating factor between humans and intelligent robots. The argument is divided. One segment of researchers believes that while AI systems are doing well with automation and pattern recognition, they are nowhere near the higher-order, human-level intellectual capacities. On the other hand, entrepreneurs like Mikko Alasaarela are confident in making robots with EQ on par with humans.


Turning the promise of AI into a reality for everyone and every industry

Today, AI is primarily the playground of an elite group of technology behemoths, companies like Google and Microsoft, which have invested billions in developing and using AI. If you look beyond those companies, AI is often underutilized in other industries, whether it be manufacturing, education, retail or healthcare. Vast amounts of data are generated by all these industries but AI is rarely used to analyze large sets of data and learn from the patterns and features that exist in the data. The question is, why? The answer is lack of access, understanding and skills. Most companies don’t have access to the sophisticated and costly compute resources required. And they don’t have access to the expensive and limited AI talent needed to use those resources correctly. These are the two restraints holding AI back from mainstream adoption. But they can be solved if we make AI easy to adopt and easy to use for instant value. Here are three ways we can create an Apple-like experience for AI and bridge the gap to a future in which AI helps businesses do more than they ever imagined.


Governance and progression of AI in the UK

Regulation of AI is vital, and responsibility lies both with those who develop it and those who deploy it. But according to Matt Hervey, head of AI at law firm Gowling WLG, the reality is that there is a lack of people who understand AI, and consequently a shortage of people who can develop regulation. The UK does have a range of existing legislative frameworks that should mitigate many of the potential harms of AI – such as laws regarding data protection, product liability, negligence and fraud – but they lag behind the European Union (EU), where regulations are already being proposed to address AI systems specifically. UK companies doing business in the EU will most likely need to comply with EU law if it is at a higher level than our own. In this rapidly changing digital technology market, the challenge is always going to be the speed at which developments are made. With a real risk that AI innovation could get ahead of regulators, it is imperative that sensible guard rails are put in place to minimise harm, but also that frameworks are developed to allow the sale of beneficial AI products and services, such as autonomous vehicles.


Hybrid work: 4 ways to strengthen relationships

You don’t need a communal kitchen, sofa, or water cooler to catch up with your teammates, but you do need to get creative. When you start the first meeting every week, ask your team how they are: “How’s your week looking? Is it a busy one? What will be the most important or interesting days for you?” Better still: “Is there anything I can help you with?” Everyone loves to hear that one. By Friday, you can reflect on the week and ask about each other’s weekend plans. Also consider setting aside some time for an afternoon video social. Play a game, or have your team members prepare quickfire presentations about their hobbies or share other interesting details about themselves that their teammates wouldn’t necessarily know. Don’t feel like you always have to do something special – often just a virtual space where people can drop in and shoot the breeze is all that is needed to boost morale. No agenda can sometimes be the perfect agenda for the moment. ... One of the biggest annoyances for people working remotely is being left out of meetings. When you can’t physically scan the office to make sure everyone’s on the invite, it’s easy to inadvertently overlook someone.



Quote for the day:

"Trust is one of the greatest gifts that can be given and we should take great care not to abuse it." -- Gordon Tredgold

Daily Tech Digest - June 18, 2022

Australian academics have created a mind-blowing proof of concept for the future of nanorobotics

DNA nanobots are nanometer-sized synthetic devices consisting of DNA and proteins. They are self-sufficient because DNA is a self-assembling technology. Not only does our natural DNA carry the code in which our biology is written, but it also understands when to execute. Previous research on the subject of DNA nanotechnology has shown that self-assembling devices capable of transferring DNA code, much like their natural counterparts, can be created. However, the new technology coming out of Australia is unlike anything we’ve encountered before. These nanobots are capable of transferring information other than DNA. In theory, they could transport every imaginable protein combination across a biological system. To put it another way, we should ultimately be able to instruct swarms of these nanobots to hunt down germs, viruses, and cancer cells inside our bodies. Each swarm member would carry a unique protein, and when they come across a harmful cell, they would assemble their proteins into a configuration meant to kill the threat. It’d be like having a swarm of superpowered killer robots coursing through your veins, hunting for monsters to eliminate.


Indian CISOs voice concerns on CERT-In’s new cybersecurity directives

Fal Ghancha, CISO at DSP Mutual Fund, says that the majority of the time—more than 70%—there are false-positive cybersecurity alerts of an incident. A six-hour reporting mandate could lead to an overkill of reporting. Because the timeline is very tight, people will become more aggressive and paranoid; they will report the incident in a rush and make wrong decisions, he says. Ghancha points out that the CERT-In directives have multiple granular actions, which today many organisations don’t follow at length. “The entire ecosystem will have to be integrated with a 24/7 monitoring system and skilled resource to ensure all the reports are seen, analysed, and reported as per the new guidelines,” Ghancha says. The extra work for security operations centers could be significant, he says. "Let's say today an organisation is monitoring its crown jewels only, which may be 20% of the total assets. Tomorrow, the organisation will need to monitor additional assets, which will be 50% to 60% higher than the current number.”


The Edison Ratio: What business and IT leaders get wrong about innovation

Good things come in threes. So, unfortunately, do not-so-good things, and that includes leaders who invert the Edison Ratio. This third group of Edison Ratio inverters is, if anything, the most dangerous — not because they’re malicious but because they’re having fun. These are the “idea cluster bombers.” An idea cluster bomber has brilliant ideas on a regular basis. Any one of their ideas is so brilliant they’re bursting with it. And so they tell someone to drop everything and go make it happen. Which is fine until the sun sets and rises again. That’s when they have another brilliant idea, and tell someone else to drop everything to make it happen. Brilliant! But not so brilliant that it can withstand the impact of Edison Ratio Inversion. An example: Imagine someone has a brilliant idea as they’re brewing coffee in preparation for starting off their workday. They spend, oh, I dunno … let’s say they spend the morning fleshing it out before Zooming a likely victim to work on it.


Minimum Viable Architecture in Practice: Creating a Home Insurance Chatbot

Our first step in creating an MVA is to make a basic set of choices about how the chatbot will work, sufficient to implement the Minimum Viable Product (MVP). In our example, the MVP has just the minimum features necessary to test the hypothesis that the chatbot can achieve the product goals we have set for it. If no one wants to use it, or if it will not meet their needs, we don’t want to continue building it. Therefore, we intend to deploy the MVP to a limited user base, with a simple menu-based interface, and we assume that the latency delays that may be created by accessing external data sources to gather data are acceptable to customers. As a result, we want to avoid incorporating more requirements—both functional requirements and quality attribute requirements (QARs)—than we need to validate our assumptions about the problem we are trying to solve. This results in an initial design which is shown below. If our MVP proves valuable, we will add capabilities to it and incrementally build its architecture in later steps. An MVP is a useful component of product development strategies, and unlike mere prototypes, an MVP is not intended to be “thrown away.”
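The "simple menu-based interface" described above can be sketched very compactly: a fixed menu maps user choices to canned handlers, deferring free-text understanding and external data lookups to later increments. The menu items and responses below are hypothetical, not taken from the article's actual design.

```python
# Hypothetical sketch of a menu-based chatbot MVP: fixed choices, canned
# handlers, no NLP. Later architectural steps would replace this layer.

def quote_handler() -> str:
    return "To get a quote, we will ask a few questions about your home."

def claim_handler() -> str:
    return "To file a claim, please have your policy number ready."

MENU = {
    "1": ("Get a home insurance quote", quote_handler),
    "2": ("File a claim", claim_handler),
}

def respond(choice: str) -> str:
    """Dispatch a menu choice to its handler, with a fallback prompt."""
    if choice not in MENU:
        return "Please pick a valid menu option."
    _, handler = MENU[choice]
    return handler()

print(respond("1"))
```

Keeping the interface this thin is what makes the MVP cheap to discard or extend once the product hypothesis has been tested.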


‘Decentralization Proves To Be an Illusion,’ BIS Says

It’s not surprising that a champion of central banks would dismiss the concept of decentralization. But Chase Devens, research analyst at Messari, argues centralization is largely responsible for the current mess, noting that it was poor risk-management mixed with a lack of understanding of asset and protocol functions — such as Terra and stETH — that left large centralized players such as Celsius searching for liquidity. ... If DeFi lending wants to make it into the real-world economy, BIS economists suggested it must engage in “large-scale tokenisation of real-world assets” and rely less on crypto collateral. However, “developing its ability to gather information about borrowers,” would eventually lead the system to “gravitate towards greater centralization.” “The similarities between DeFi and legacy intermediaries are increasing, which has two important implications,” the report read. “The first is that elements of DeFi, mainly smart contracts and composability, could find their way into traditional finance. The second implication is that, once more, decentralization proves to be an illusion.”


Businesses brace for quantum computing disruption by end of decade

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month. One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research. "Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us. Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
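The advice to increase key sizes follows from Grover's algorithm, which searches an n-bit key space in roughly 2^(n/2) quantum operations, effectively halving the security level of a symmetric cipher. A back-of-the-envelope sketch (integer key sizes assumed; this ignores the substantial practical overheads of running Grover at scale):

```python
def effective_bits_against_grover(key_bits: int) -> int:
    # Grover's search needs about 2**(n/2) quantum operations to find an
    # n-bit key, so the effective security level is roughly halved.
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{effective_bits_against_grover(key_bits)} bits "
          "of effective security against a quantum adversary")
```

This is why moving from AES-128 to AES-256 restores a comfortable margin: 256-bit keys still offer roughly 128-bit effective security even against Grover, whereas asymmetric schemes like RSA fall to Shor's algorithm entirely and need replacement rather than larger keys.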


Artificial intelligence has reached a threshold. And physics can help it break new ground

“Neural networks try to find patterns in the data, but sometimes the patterns they find don’t obey the laws of physics, making the model it creates unreliable,” said Jordan Malof, assistant research professor of electrical and computer engineering at Duke. “By forcing the neural network to obey the laws of physics, we prevented it from finding relationships that may fit the data but aren’t actually true.” They did that by imposing upon the neural network a physics called a Lorentz model. This is a set of equations that describe how the intrinsic properties of a material resonate with an electromagnetic field. This, however, was no easy feat to achieve. “When you make a neural network more interpretable, which is in some sense what we’ve done here, it can be more challenging to fine tune,” said Omar Khatib, a postdoctoral researcher working in Padilla’s laboratory. “We definitely had a difficult time optimizing the training to learn the patterns.”
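For context, the classical Lorentz model the researchers imposed describes the complex permittivity of a material as a sum of damped resonances. The sketch below evaluates that textbook response directly; it illustrates only the physics being enforced, not the Duke team's neural network itself, and the oscillator parameters are made-up values in arbitrary units.

```python
def lorentz_permittivity(omega, eps_inf, oscillators):
    """Complex relative permittivity of a Lorentz oscillator model:

        eps(w) = eps_inf + sum_j  wp_j**2 / (w0_j**2 - w**2 - 1j*g_j*w)

    oscillators: list of (omega_p, omega_0, gamma) tuples, in the same
    angular-frequency units as omega.
    """
    eps = complex(eps_inf)
    for omega_p, omega_0, gamma in oscillators:
        eps += omega_p**2 / (omega_0**2 - omega**2 - 1j * gamma * omega)
    return eps

# One oscillator resonant at omega_0 = 2.0 (arbitrary units):
print(lorentz_permittivity(1.0, 1.0, [(1.5, 2.0, 0.1)]))
```

Forcing a network's predictions to take this functional form is what makes the model "obey the laws of physics": whatever parameters the network learns, the resulting permittivity is guaranteed to be a physically plausible resonant response.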


Test Data Management Concept, Process And Strategy

Generally, test data is constructed based on the test cases to be executed. For example, in a system testing team, the end-to-end test scenario needs to be identified, based on which the test data is designed. This could involve one or more applications working together. Say, in a product that does workload management, the management controller application, the middleware applications, and the database applications all function in correlation with one another. The required test data for the same could be scattered. A thorough analysis of all the different kinds of data that may be required has to be made to ensure effective management. ... This is generally an extension of the previous step and enables us to understand what the end-user or production scenario will be and what data is required for it. Take that data and compare it with the data that currently exists in the test environment. Based on this, new data may need to be created or modified. ... Based on the testing requirement in the current release cycle (where a release cycle can span a long time), the test data may need to be altered or created as stated in the above point.
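The gap analysis described above (compare required data with what the test environment already holds, then create the difference) reduces to simple set operations. The data categories below are hypothetical labels, just to make the comparison concrete:

```python
# Sketch of a test-data gap analysis: what the end-to-end scenario needs
# vs. what already exists in the test environment (hypothetical labels).

required = {"customer:retail", "customer:corporate", "order:bulk"}
existing = {"customer:retail", "order:standard"}

to_create = required - existing   # data that must be newly built
reusable  = required & existing   # data already present in the environment

print("create:", sorted(to_create))
print("reuse:",  sorted(reusable))
```

Keeping the required-data inventory explicit like this also makes it easy to re-run the comparison each release cycle as the scenarios change.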


Sigma rules explained: When and how to use them to log events

Sigma rules are textual signatures written in YAML that make it possible to detect anomalies in your environment by monitoring log events that can be signs of suspicious activity and cyber threats. Developed by threat intel analysts Florian Roth and Thomas Patzke, Sigma is a generic signature format for use in SIEM systems. A prime advantage of using a standardized format like Sigma is that the rules are cross-platform and work across different security information and event management (SIEM) products. As such, defenders can use a “common language” to share detection rules with each other independent of their security arsenal. These Sigma rules can then be converted by SIEM products into their distinct, SIEM-specific language, while retaining the logic conveyed by the Sigma rule. Whereas among analysts, YARA rules are more commonly associated with identifying and classifying malware samples (files) using indicators of compromise (IOCs), Sigma rules focus on detecting log events that match the criteria outlined by the rule. Incident response professionals, for example, can use Sigma rules to specify some detection criteria. Any log entries matching this rule will trigger an alarm.
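To make the format concrete, here is a minimal illustrative Sigma rule (the title, values, and threshold are hypothetical, but the `logsource`/`detection`/`condition` structure is the standard Sigma layout):

```yaml
title: Suspicious Use of whoami
status: experimental
description: Illustrative rule flagging execution of whoami.exe (hypothetical example)
logsource:
  category: process_creation
  product: windows
detection:
  selection:
    Image|endswith: '\whoami.exe'
  condition: selection
level: medium
```

A converter then translates this platform-neutral YAML into the query language of a specific SIEM (for example, a Splunk search or an Elasticsearch query) while preserving the detection logic.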


5 steps for writing architectural documentation in a code-focused culture

Don't let people lose faith in documentation by allowing it to become outdated and inaccurate. I've found that the closer you keep the docs to the implementation—including in the very same code repo, if applicable—the better chance they will stay up to date. When docs reside with the code, both can be updated in a single pull request instead of docs being an afterthought. Don't be afraid to build docs from the code if it makes sense. In any case, review the documentation periodically, prioritizing sections that document rapidly changing components. Keep your experiment going and iterate on your documentation as you would iterate on your architecture. Share your insights with teams who want or need to bootstrap their docs. Can architects write docs in your organization without feeling anxiety? Are they expected to? Do they want to? Hopefully, you will begin to see movement on this spectrum. Finally, remember that some of your teammates and leaders started out in that code-focused culture. 



Quote for the day:

"Leadership is a privilege to better the lives of others. It is not an opportunity to satisfy personal greed." -- Mwai Kibaki

Daily Tech Digest - June 17, 2022

Revisit Your Password Policies to Retain PCI Compliance

PCI version 4.0 requires multifactor authentication to be more widely used. Whereas multifactor authentication had previously been required for administrators who needed to access systems related to card holder data or processing, the new requirement mandates that multifactor authentication must be used for any account that has access to card holder data. The new standards also require users’ passwords to be changed every 12 months. Additionally, users’ passwords must be changed any time an account is suspected to have been compromised. A third requirement is that PCI requires users to use strong passwords. While strong passwords have always been required by the PCI standard, the password requirements are more stringent than before. Passwords must now be at least 15 characters in length, and they must include numeric and alphabetic characters. Additionally, users’ passwords must be compared against a list of passwords that are known to be compromised. Another requirement of PCI 4.0 is that organizations must review access privileges every six months to make sure that only those who specifically require access to card holder data are able to access that data.
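The password rules described above (minimum length, character classes, and a check against known-compromised passwords) are straightforward to express in code. This is an illustrative sketch of just those three checks, not a complete PCI DSS compliance control:

```python
import string

def meets_password_rules(password: str, compromised: set) -> bool:
    """Check the rules described above: at least 15 characters,
    containing both digits and letters, and not on a known-compromised
    list. Illustrative only; PCI DSS involves many more controls."""
    long_enough = len(password) >= 15
    has_digit = any(c in string.digits for c in password)
    has_alpha = any(c in string.ascii_letters for c in password)
    not_compromised = password not in compromised
    return long_enough and has_digit and has_alpha and not_compromised

breached = {"Password123456789"}
print(meets_password_rules("Correct7HorseBattery", breached))  # True
print(meets_password_rules("short1", breached))                # False
```

In practice the compromised-password check would query a breach corpus (for example via a k-anonymized hash-prefix API) rather than an in-memory set.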


Making the world a safer place with Microsoft Defender for individuals

Today’s sophisticated cyber threats require a modern approach to security. And this doesn’t apply only to enterprises or government entities—in recent years we’ve seen attacks increase exponentially against individuals. There are 921 password attacks every second. We’ve seen ransomware threats extending beyond their usual targets to go after small businesses and families. And we know, as bad actors become more and more sophisticated, we need to increase our personal defenses as well. That is why it is so important for us to protect your entire digital life, whether you are at home or work—threats don’t end when you walk out of the office or close your work laptop for the day. We need solutions that help keep you and your family secure in how you work, play, and live. That’s why I’m excited to share the availability of Microsoft Defender for individuals, a new online security application for Microsoft 365 Personal and Family subscribers. We believe every person and family should feel safe online. This is an exciting step in our journey to bring security to all and I’m thrilled to share with you more about this new app, available with features for you to try today.


Data Is Vulnerable to Quantum Computers That Don’t Exist Yet

To stay ahead of quantum computers, scientists around the world have spent the past two decades designing post-quantum cryptography (PQC) algorithms. These are based on new mathematical problems that both quantum and classical computers find difficult to solve. In January, the White House issued a memorandum on transitioning to quantum-resistant cryptography, underscoring that preparations for this transition should begin as soon as possible. However, after organizations such as the National Institute of Standards and Technology (NIST) help decide which PQC algorithms should become the new standards the world should adopt, there are billions of old and new devices that will need to get updated. Sandbox AQ notes that such efforts could take decades to implement. Although quantum computers are currently in their infancy, there are already attacks that harvest encrypted data with the intention of cracking it once codebreaking quantum computers become a reality. Therefore, Sandbox AQ argues that governments, businesses, and other major organizations must begin the shift toward PQC now.


Developer, Beware: The 3 API Security Risks You Can’t Overlook

By design, the majority of APIs send data from the data store to the client. Excessive data exposure results when the API has been designed to return large amounts of data to the client. Attackers can collect or harvest sensitive data from such API responses. For example, a group fitness app displays the home location of the group’s participants. The locations are displayed on a map using the latitude and longitude of each athlete. A well-designed API is intended to return only the latitude and longitude of each athlete. Conversely, a poorly designed API returns user information about each athlete, including their full name, address, email, phone number, latitude and longitude, and more. This is an example of excessive data exposure as the API is returning more data than it was designed to do. This might occur when a poorly designed API pulls a record from the database and returns it to the client in its entirety, exposing all the data in the file. In this situation, the true business use case was not fully understood during development.
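The fitness-app example above is easy to express as an allow-list filter: the API returns only the fields the use case needs, never the raw database record. The field names and record below are hypothetical, mirroring the article's athlete example:

```python
# Sketch of an allow-list response filter: return only the fields the
# use case needs (here, just coordinates), never the full record.

ALLOWED_FIELDS = {"latitude", "longitude"}

def to_response(athlete_record: dict) -> dict:
    """Project a raw record down to the fields the client should see."""
    return {k: v for k, v in athlete_record.items() if k in ALLOWED_FIELDS}

record = {
    "full_name": "A. Athlete", "email": "a@example.com",
    "phone": "555-0100", "latitude": 52.52, "longitude": 13.40,
}
print(to_response(record))  # only latitude and longitude survive
```

The key design point is that the allow-list lives in the API layer, so adding a new column to the database cannot silently leak into responses; exposure requires an explicit change to `ALLOWED_FIELDS`.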


Apple finally embraces open source

Apple is open-sourcing a reference PyTorch implementation of the Transformer architecture to help developers deploy Transformer models on Apple devices. In 2017, Google introduced the Transformer model. Since then, it has become the model of choice for natural language processing (NLP) problems. ... As a company, Apple behaves like a cult. Nobody knows what goes on inside Apple’s four walls. To the common man, Apple is a consumer electronics firm, unlike tech giants such as Google or Microsoft. Google, for example, is seen as a leader in AI, with top AI talent working for the company, and has released numerous research papers over the years. Google also owns DeepMind, another company leading in AI research. Apple is struggling to recruit top AI talent, and for good reason. “Apple with its top-five rank employer brand image is currently having difficulty recruiting top AI talent. In fact, in order to let potential recruits see some of the exciting machine-learning work that is occurring at Apple, it recently had to alter its incredibly secretive culture and to offer a publicly visible Apple Machine Learning Journal,” said Dr. John Sullivan.


Early adopters position themselves for quantum advantage

Perhaps most significant, however, is funding for a series of collaborative projects aimed at demonstrating specific applications for today’s quantum computers. Following a call for proposals in the autumn, for each successful bid the NQCC will first work with the project team to analyse the use case, assess the requirements, and determine whether the application can be usefully tackled with current technologies. “The next stage would be to identify appropriate algorithms or develop new ones, and then run them on a physical quantum computer,” says Decaroli. “We can then benchmark the results against classical solutions and potentially across different quantum-computing platforms.” One crucial partner in the SparQ programme is Oxford Quantum Circuits (OQC), the only UK company to offer cloud-based access to a quantum computer. Its latest eight-qubit processor, named “Lucy” after the pioneering quantum physicist Lucy Mensing, was released on Amazon Web Services in February this year. “We are looking forward to working with end users in different industry sectors to provide access to our hardware,” commented Ilana Wisby, CEO of OQC.


How decentralization and Web3 will impact the enterprise

For one, over time, Web3 will almost certainly become a vital approach to the way our IT systems work. Decentralization is now a significant industry trend that will be insisted on by a growing number of tech consumers and businesses as well. Instead of storing information in our own databases and running code in parts of the cloud that we pay for or otherwise control, businesses will have to get used to relying on Web3 resources (data, compute, etc.) and sharing more of that control. Much of the important data we need to run our businesses will increasingly be kept in more private and protected places, stored in blockchain and other types of distributed ledgers. A rising share of our applications over time will be more akin to open source projects and run using smart contracts that all stakeholders can transparently view, verify, and agree to. Even our businesses will have strange new subsidiaries that are actually embodied entirely in code and run automatically on their own, using digital inputs from stakeholders. And this is just the beginning. The cryptographic systems and immutable transaction ledgers of Web3 have now stood enough of the test of time to prove out and show the way.


Blockchain's potential: How AI can change the decentralized ledger

When asked whether AI is too nascent a technology to have any sort of impact on the real world, he stated that, like quantum computing and even blockchain, AI is still in the early stages of adoption. He likened the situation to the Web2 boom of the 90s, noting that people are only now beginning to realize the need for high-quality data to train an engine. Furthermore, he highlighted that there are already several use cases for AI that most people take for granted in their everyday lives. “We have AI algorithms that talk to us on our phones and home automation systems that track social sentiment, predict cyberattacks, etc.,” Krishnakumar stated. Ahmed Ismail, CEO and president of Fluid — an AI quant-based financial platform — pointed out that there are many instances of AI benefitting blockchain. A perfect example of this combination, per Ismail, is crypto liquidity aggregators that use a subset of AI and machine learning to conduct deep data analysis, provide price predictions and offer optimized trading strategies to identify current and future market phenomena.


We don’t need another infosec hero

Instead of thinking of ourselves as heroes—we aren’t Wonder Woman, or Batman, or Superman—it’s time to think of ourselves as sidekicks. On a good day, we help someone else make wiser risk choices, and those choices result in more profitable outcomes for everyone. But it is someone else who is the hero; we just hold their cape and refill their utility pouch. How do we do that? It begins with some humility. Most people in our profession work in cost centers. To the rest of the company, we are a drag on the business, and while we like to talk about business enablement, our first goal has to be removing the business impediment we’ve become. Are you responsible for product security? Engage the software architects who write the code and teach them how to do their own safety and security reviews earlier in their process.  ... No matter what part of the business you support, start learning what they need to do to get the job done. Identify opportunities where you can get out of their way first, and then look for ways to help improve their processes to be faster and safer.


Entering the metaverse: How companies can take their first virtual steps

If the virtual world experiment is successful, it will be because of superior immersivity. Concerts, movies, sporting events and consumer experiences must offer interactivity and holistic engagement that makes the real world appear dull and lacking in possibilities by comparison. While entertainment companies will more easily master the metaverse experience offered to audiences, brands and businesses in the vast majority of other industries will likely struggle to conceptualize and develop the level of immersivity that will be required to be effective. Healthcare, education and financial services could all prosper from virtual properties and offerings – medical professionals seeing patients and patients building communities of support, classrooms that are not confined to textbooks but bring subject matter to life for greater curiosity, and stock markets with real-time multidimensional metrics that make Bloomberg terminals appear outdated. These virtual theme parks of consumerism and participation allow for brand reinvention, offer the possibility of novel sources of revenue and naturally skew to a younger audience that may not yet have encountered or interacted with these brands in the real world.



Quote for the day:

"Good leaders make people feel that they're at the very heart of things, not at the periphery." -- Warren G. Bennis

Daily Tech Digest - June 16, 2022

High-Bandwidth Memory (HBM) delivers impressive performance gains

In addition to widening the bus to boost bandwidth, HBM technology shrinks the memory chips and stacks them in an elegant new design. HBM chips are tiny compared to graphics double data rate (GDDR) memory, which HBM was originally designed to replace: 1GB of GDDR memory chips takes up 672 square millimeters versus just 35 square millimeters for 1GB of HBM. Rather than spreading out the transistors, HBM is stacked up to 12 layers high and connected with an interconnect technology called ‘through silicon via’ (TSV). The TSV runs through the layers of HBM chips like an elevator runs through a building, greatly reducing the amount of time data bits need to travel. With the HBM sitting on the substrate right next to the CPU or GPU, less power is required to move data between CPU/GPU and memory. The CPU and HBM talk directly to each other, eliminating the need for DIMM sticks. “The whole idea that [we] had was instead of going very narrow and very fast, go very wide and very slow,” Macri said.
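
The trade-off Macri describes can be seen with some back-of-the-envelope arithmetic. The bus widths and transfer rates below are illustrative round numbers, not datasheet values:

```python
# Rough comparison of "narrow and fast" vs "wide and slow" memory interfaces.
# Bandwidth = bus width in bytes * transfers per second.
def bandwidth_gbytes_per_s(bus_width_bits, transfer_rate_gtps):
    return bus_width_bits / 8 * transfer_rate_gtps

# A narrow, fast GDDR-style interface vs a wide, slow HBM-style stack.
gddr_like = bandwidth_gbytes_per_s(bus_width_bits=32, transfer_rate_gtps=14)
hbm_like = bandwidth_gbytes_per_s(bus_width_bits=1024, transfer_rate_gtps=2)
print(gddr_like, hbm_like)  # 56.0 256.0 (GB/s per device)
```

Even at a fraction of the transfer rate, the much wider bus wins on total bandwidth, which is the core of the HBM design argument.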


3 forces shaping the evolution of ERP

If there was any hesitation about moving to cloud-based ERP, it was quashed as the COVID crisis erupted, and corporate workplaces became scattered across countless home-based offices. On-premises ERP is seen as “not as scalable as people thought,” says Sharon Bhalaru, partner at accounting and technology consulting firm Armanino LLP. “We’re seeing a move to cloud-based systems,” to support remote employees who need to perform HR, financial and accounting tasks remotely. ... Next-generation ERP platforms “give companies real-time transparency with respect to sales, inventory, production, and financials,” the Boston Consulting Group analysts wrote. “Powerful data-driven analytics enables more agile decisions, such as adjustments to the supply chain to improve resilience. Robust e-commerce capabilities help companies better engage with online customers before and after a sale. And a lean ERP core and cloud-first approach increase deployment speed.” ... Unprecedented and ongoing supply chain disruptions underscore the need for greater visibility, more predictable lead times, alternative supply sources, and faster response to disruptions.


Interpol arrests thousands in global cyber fraud crackdown

The operation’s targets included telephone scammers, long-distance romance scammers, email fraudsters and other connected financial criminals, identified through a prior intelligence operation that used Interpol’s secure global communications network to share data on suspects, suspicious bank accounts, unlawful transactions, and communication channels such as phone numbers, email addresses, fake websites and IP addresses. “Telecom and BEC fraud are sources of serious concern for many countries and have a hugely damaging effect on economies, businesses and communities,” said Rory Corcoran. “The international nature of these crimes can only be addressed successfully by law enforcement working together beyond borders, which is why Interpol is critical to providing police the world over with a coordinated tactical response.” Duan Daqi added: “The transnational and digital nature of different types of telecom and social engineering fraud continues to present grave challenges for local police authorities, because perpetrators operate from a different country or even continent than their victims and keep updating their fraud schemes.”


Is Cyber Essentials Enough to Secure Your Organisation?

If you are to have confidence in your security controls, you must implement defence in depth. This requires a holistic approach to cyber security that addresses people, processes and technology. Key aspects of this aren’t addressed in Cyber Essentials, such as staff awareness training, vulnerability scanning and incident response. Employees are at the heart of any cyber security system, because they are the ones responsible for handling sensitive information. If they don’t understand their data protection requirements, it could result in disaster. Meanwhile, vulnerability scanning ensures that organisations can spot weaknesses in their systems before a cyber criminal can exploit them. It’s a more advanced form of protection than is offered with secure configuration and system updates, enabling organisations to proactively secure their systems. Conversely, incident response measures give organisations the tools they need to respond after a security incident has occurred. Most of the damage caused by a data breach occurs after the initial intrusion, so a prompt and organised response can be the difference between a minor disruption and a catastrophe.


Imagining a world without open standards

The open standard makes portability easier for software developers, provides integrators with choice in the building blocks for solutions, and enables customers to focus on solving business problems rather than integration issues. Open standards eliminate the need for organizations to expend energy wrangling with competitors on defining how systems should work, giving them the space and time to focus on building and improving how those systems actually do work. The real benefits, though, are downstream of vendors: open standards mean that businesses can effectively communicate and collaborate both internally and with peers. They mean that the expertise built up by a professional in one market or business can be taken with them wherever they want to work. They mean that a lack of knowledge resources is not the barrier that prevents businesses from making the move towards better, more efficient ways of working. In imagining a world without open standards, then, the image is one of businesses constantly having to navigate between the walled gardens of different technology vendors, reskilling and rehiring as they do so, before they can even begin the serious work of delivering value from that technology.


Good Habits That Every Programmer Should Have

We can become good at a specific technology by working with it for a long time. But how can we become an expert? Learning internals is a great habit that helps us become an expert in any technology. For example, after working with Git for a while, you can learn Git internals via the lesser-known plumbing commands. You can make accurate technical decisions when you understand the internals of your technology stack, and you will become more familiar with the limitations and workarounds of a specific technology. Learning internals also helps us understand what we are doing with programming every day. Encourage everyone to learn more about their tools’ internals! ... Sometimes, we derive programming solutions from example code snippets that we find on internet forums. It’s a good habit to give credit to other programmers’ hard work when we use their code snippets, libraries, and tools, even if their licensing documents say that attribution is not required.
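
As a small taste of those Git internals: a blob's object ID is simply the SHA-1 of a short header plus the file content, which is exactly what the plumbing command `git hash-object` computes. A minimal reproduction:

```python
import hashlib

# Git stores a file's content as a "blob" object whose ID is
# SHA-1("blob <size>\0" + content). This is what `git hash-object` does.
def git_blob_id(content: bytes) -> str:
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# The well-known blob ID for the content "hello\n":
print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

You can verify the result with `echo hello | git hash-object --stdin`; understanding this one formula demystifies much of how Git's object store works.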


Reducing Cybersecurity Risk From and to Third Parties

There are a number of ways in which organizations may be able to obtain attack information from third parties, if they agree. Ideally, such requirements should be included in service agreements and partnership contracts for vendors, outsourcers, and partners, as listed in the article, “Using Contracts to Reduce Cybersecurity Risks.” Employment contracts, nondisclosure agreements and license agreements may also include requirements that protect organizations against third-party risk. While it is helpful to request vendors, outsourcers and partners to commit to risk reduction in the contractual terms and conditions, it is even more beneficial for an organization to have direct access to partners’ and suppliers’ security monitoring systems. ... More modern forms of protection monitor messages for origin and content and respond with information about unauthorized sources—as with IDSs—or preventive action—as with IPSs. Advancements in these systems include observation of unusual behavior and the use of artificial intelligence (AI) to determine threats.


How Upskilling Could Resolve The Cybersecurity Skills Gap

With a shortage of new candidates, upskilling provides the answer to the cybersecurity skills gap, and it brings multiple benefits for both employees and businesses. One of the first is that, ultimately, cybersecurity is everyone’s business. From the CEO to the new employee at home, everyone has a role to play in ensuring systems are robust in the face of a growing wave of attacks. While this does not mean that everyone in a company needs to be a cybersecurity professional, it does mean that everyone should be aware of the risks, how to spot potential vulnerabilities and attacks, and the practical measures they must take to prevent them. Upskilling can also produce a supply of cybersecurity professionals. Waiting for qualified entrants to the jobs market will take too long and, in practice, it’s likely they will not be qualified for long! The cybersecurity environment changes so rapidly that the knowledge many graduates gain at the start of their course may not be relevant by the end. Instead, identifying existing staff with the soft skills, or power skills, to develop, adapt, and learn may be the quickest and easiest path to take.


12 tips for achieving IT agility in the digital era

“If your tech stack is streamlined, easy to access, and easy to use, your workforce can quickly respond to business or customer needs seamlessly,” says Fleetcor’s duFour. Key to this is getting a handle on application sprawl by rationalizing the IT portfolio. Voya Financial’s simplification journey began with such an effort, a process that reduced its application footprint by 17% and its slate of technology tools by one quarter. The work continues as part of its cloud migration. “This practice is instilling standards and discipline that will only help to ensure our environment remains uncluttered and contemporary for the long term,” Keshavan says. As a result, the IT group is faster and more flexible, recently deploying five new cloud services for data science and analytics developers within four hours, something that would have taken a cross-functional IT team several weeks in the past. Reining in application sprawl has also been valuable at Snow Software. “Oftentimes, companies and teams will invest in applications with similar purposes,” says Snow Software CIO Alastair Pooley.


True Component-Testing of the GUI With Karate Mock Server

There’s an important reason why old-style end-to-end tests are often more expensive than needed: you tend to test paths that are not relevant to the frontend logic. Each of these paths adds to the total test-suite run time. Consider a web application for your tax return. The user journey in this non-trivial app consists of submitting a series of questionnaires, their content customized depending on what you answered in previous steps. There is likely some logic on the frontend to manage the turns in that user journey, but the number-crunching over your sources of income and deductibles surely happens on the backend. You don’t need a GUI test to validate the correctness of those calculations. With a mock backend, that would be entirely pointless. You set it up to tell the frontend that the final amount to pay is 12600 Euros. You can test that this amount is properly displayed, but there’s no testing its correctness. All the decisions are made (and hopefully tested) elsewhere, so we can treat it as a hardcoded test fixture.
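
Karate defines such mocks in its own DSL; purely to illustrate the hardcoded-fixture idea, here is a minimal stand-in mock backend in Python. The endpoint path and JSON field name are invented for the example:

```python
import json, threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A mock backend that always reports the same tax amount, so a GUI test can
# check display logic without re-doing the backend's calculations.
class MockTaxBackend(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"amount_due_eur": 12600}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), MockTaxBackend)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The frontend under test would call this endpoint; here we just fetch it.
url = f"http://127.0.0.1:{server.server_port}/tax/summary"
payload = json.loads(urlopen(url).read())
print(payload)  # {'amount_due_eur': 12600}
server.shutdown()
```

The fixture is deliberately dumb: whatever questionnaire answers the test drives through the GUI, the "amount due" is fixed, and the assertion is only that the frontend renders it correctly.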



Quote for the day:

"Leaders begin with a different question than others. Replacing who can I blame with how am I responsible?" -- Orrin Woodward

Daily Tech Digest - June 15, 2022

Software Engineering - The Soft Parts

Transferable skills are those you can take with you from project to project. Let's talk about them in relation to the fundamentals. The fundamentals are the foundation of any software engineering career. There are two layers to them - macro and micro. The macro layer is the core of software engineering and the micro layer is the implementation (e.g. the tech stack, libraries, frameworks, etc.). At a macro level, you learn programming concepts that are largely transferable regardless of language. The syntax may differ, but the core ideas are still the same. This can include things like: data-structures (arrays, objects, modules, hashes), algorithms (searching, sorting), architecture (design patterns, state management) and even performance optimizations. These are concepts you'll use so frequently that knowing them backwards can have a lot of value. At a micro level, you learn the implementation of those concepts. This can include things like: the language you use (JavaScript, Python, Ruby, etc), the frameworks you use (e.g. React, Angular, Vue etc), the backend you use (e.g. Django, Rails, etc), and the tech stack you use (e.g. Google App Engine, Google Cloud Platform, etc).
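
A macro-level fundamental like binary search illustrates the point: the idea transfers across languages, and only the micro-level syntax changes. A minimal sketch in Python:

```python
# Binary search: the same halving idea in any language; only syntax differs.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1              # not found

print(binary_search([2, 5, 8, 13, 21], 13))  # 3
```

Rewriting this in JavaScript, Ruby, or Go changes the keywords, not the concept, which is why the macro layer pays off across an entire career.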


Why young tech workers leave — and what you can do to keep them

When employees seek a raise, what they’re really doing is shopping around and comparing offers from other companies, according to Sethi. And when it comes to salaries, companies must keep up with inflation, which is running at about 8% a year. But retaining employees requires more than just pay. Workers also want more support in translating environmental, social, and governance (ESG) considerations to their work. “Fulfilling work and the opportunity to be one’s authentic self at work also matter to employees who are considering a job change," Sethi said. "Pay is table stakes, but I also want my job to be meaningful and fulfilling, and I want to work at a place where I can be myself." Employees also want workplace flexibility. That, and human-centric work policies, can reduce attrition and increase performance. In fact, Gartner found that 65% of IT employees said that whether they can work flexibly affects their decision to stay at an organization.


A neuromorphic computing architecture that can run some deep neural networks more efficiently

Researchers at Graz University of Technology and Intel have recently demonstrated the huge potential of neuromorphic computing hardware for running DNNs in an experimental setting. Their paper, published in Nature Machine Intelligence and funded by the Human Brain Project (HBP), shows that neuromorphic computing hardware could run large DNNs 4 to 16 times more efficiently than conventional (i.e., non-brain inspired) computing hardware. "We have shown that a large class of DNNs, those that process temporally extended inputs such as for example sentences, can be implemented substantially more energy-efficiently if one solves the same problems on neuromorphic hardware with brain-inspired neurons and neural network architectures," Wolfgang Maass, one of the researchers who carried out the study, told TechXplore. "Furthermore, the DNNs that we considered are critical for higher level cognitive function, such as finding relations between sentences in a story and answering questions about its content." In their tests, Maass and his colleagues evaluated the energy-efficiency of a large neural network running on a neuromorphic computing chip created by Intel.


Why Your Database Needs a Machine Learning Brain

By keeping the ML at the database level, you’re able to eliminate several of the most time-consuming steps — and in doing so, ensure sensitive data can be analyzed within the governance model of the database. At the same time, you’re able to reduce the timeline of the project and cut points of potential failure. Furthermore, by placing ML at the data layer, it can be used for experimentation and simple hypothesis testing without it becoming a mini-project that requires time and resources to be signed off. This means you can try things on the fly, and not only increase the amount of insight but the agility of your business planning. By integrating the ML models as virtual database tables, alongside common BI tools, even large datasets can be queried with simple SQL statements. This technology incorporates a predictive layer into the database, allowing anyone trained in SQL to solve even complex problems related to time series, regression or classification models. In essence, this approach "democratizes" access to predictive data-driven experiences.
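
One way to picture "ML at the data layer" is a model exposed as a function callable from plain SQL. The sketch below uses SQLite, with a made-up linear rule standing in for a trained model:

```python
import sqlite3

# Stand-in "model": a hypothetical linear rule with placeholder coefficients.
# A real deployment would load a trained regression or classification model.
def predict_revenue(units):
    return 1.5 * units + 10.0

conn = sqlite3.connect(":memory:")
# Register the model as a SQL function, so anyone who knows SQL can call it.
conn.create_function("PREDICT_REVENUE", 1, predict_revenue)
conn.execute("CREATE TABLE sales (month TEXT, units REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("Jan", 100.0), ("Feb", 120.0)])

# The "predictive layer" is now just another expression in a query.
rows = conn.execute(
    "SELECT month, PREDICT_REVENUE(units) FROM sales ORDER BY month"
).fetchall()
print(rows)  # [('Feb', 190.0), ('Jan', 160.0)]
```

The data never leaves the database, and the analyst's interface is an ordinary SELECT statement, which is the essence of the "democratization" argument above.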


Understanding Low-code Development

If you are interested in getting started with low-code development, you will need a few things. First, you will need a low-code development platform. Several different platforms are available, so you should analyze your requirements and explore the options to find one that meets them. Once you have chosen a platform, you will need to learn how to use it. This may require some training or reading documentation. Finally, you will need some ideas for what you want to build. You are now ready to start low-code development. ... Here are some of the downsides of using low-code platforms for software development: Lack of Customization – Even though the pre-built modules of low-code platforms are incredibly handy to work with, they allow you to customize your application only to a limited extent. In most cases, low-code components are generic, and if you want a truly customized app you should invest time and effort in custom app development.


Authentic Allyship and Intentional Leadership

Enterprises and leaders have to be intentional about their allyship. It has to be authentic allyship, not just surface allyship. I mention intentional allyship because a lot of times people think they’re an ally, and support diversity hires, but they’re just checking a box. We want intentional and authentic allyship. We need you to understand it goes beyond the person you’re helping. You’re helping the generation, not just one person. You think you’re only affecting the employee right in front of you, but that individual has a family and the next generation after them. You’re not just checking a box; you’re impacting destiny. When you’re an intentional ally, you think beyond the person in front of you, beyond the job application, beyond what you see. It’s not about you but what you’re doing for that person and that person’s generation to come. You need to really think about the step you’ll take when it comes to allyship. Make an impact – a lot of times we talk but don’t implement. Activate, implement, follow up. Don’t just implement and leave them there. Follow up – ask them how they’re doing, and if they know anyone else you can bring in. 


Software engineering estimates are garbage

Garbage estimates don’t account for the humanity of the people doing the work. Worse, they imply that only the system and its processes matter. This ends up forcing bad behaviors that lead to inferior engineering, loss of talent, and ultimately less valuable solutions. Such estimates are the measuring stick of a dysfunctional culture that assumes engineers will only produce if they’re compelled to do so—that they don’t care about their work or the people they serve. Falling behind the estimate’s promises? Forget about your family, friends, happiness, or health. It’s time to hustle and grind. Can’t craft a quality solution in the time you’ve been allotted? Hack a quick fix so you can close out the ticket. Solving the downstream issues you’ll create is someone else’s problem. Who needs automated tests anyway? Inspired with a new idea of how this software could be built better than originally specified? Keep it to yourself so you don’t mess up the timeline. Bludgeon people with the estimate enough, and they’ll soon learn to game the system.


Return to the office or else? Why bosses' ultimatums are missing the point

Employers who insist their staff return to the office full time are heading into increasingly dangerous territory. Skilled professionals, tech workers included, have so many opportunities available to them right now that it's difficult to see why they would sacrifice job satisfaction for their bosses. The outlook has never been better for knowledge workers – and indeed, workers more generally – across all industries. Not only are employers paying more to get the skills they need, but the breadth of flexible-working options for employees fed up with office life continues to grow. People aren't just working from home – they're working from wherever they choose, and whenever they choose. At the same time, significant momentum is gathering behind the introduction of a four-day work week, which could push the dynamic even further in favour of worker wellbeing while benefitting employers too. Companies who offer 100% pay for 80% of the hours will have a seriously powerful bargaining chip to play in the war for talent, and no company – regardless of their brand, product or credentials – will be untouchable.


UK needs to upskill to achieve quantum advantage

Discussing the pilot, Stephen Till, fellow at the Defence Science and Technology Laboratory (Dstl), an executive agency of the MoD, said: “This work with ORCA Computing is a milestone moment for the MoD. Accessing our own quantum computing hardware will not only accelerate our understanding of quantum computing, but the computer’s room-temperature operation will also give us the flexibility to use it in different locations for different requirements. “We expect the ORCA system to provide significantly improved latency – the speed at which we can read and write to the quantum computer. This is important for hybrid algorithms, which require multiple handovers between quantum and classical systems.” Piers Clinton-Tarestad, a partner in EY’s technology risk practice, said there is a general consensus that quantum computing will start becoming a reality in 2030. But pilot projects, such as the one being conducted at the MoD, and proof-of-concept applications can help business leaders to understand where quantum technology can be applied. 


Using automation to improve employee experience

The possibilities to improve the employee experience through automation and integration are endless. If you want to pilot something in your organization, poll your employees about what would be the most impactful. Where are they seeing sludge that drags down morale and slows business velocity? You and your IT team can plot each idea on an impact and effort prioritization matrix. Some suggestions may be easier to implement than you think, as many cloud services are already API-enabled, making automation straightforward. Once your team implements an initial valuable and visible integration, more employee lightbulbs will go off, identifying additional ideas for automation and integration for your prioritization backlog. And don’t forget about the ROI calculators in your automation tooling, as they will help objectively refine your prioritization by analyzing your planned and actual savings. Not only will your employees benefit directly from the automation, but they will also feel heard when they see their ideas come to life.



Quote for the day:

"Uncertainty is a permanent part of the leadership landscape. It never goes away." -- Andy Stanley

Daily Tech Digest - June 14, 2022

Business Architecture - A New Depiction

Crucial to this depiction are components which exist in both the vertical pillars and the horizontal Business Architecture layer, as follows:

- Application Architecture: includes the Business Process component, to associate application components (logical and operational) with the business activity they support.

- Information Architecture: includes the Information component from a business perspective, separately from any logical or operational representation of that information by data (structured or unstructured).

- Infrastructure Architecture: contains the Location component, recognizing that business infrastructure is linked to an organization / location either by physical installation or network access.

Business Architecture consists of these business components – shared with the other domains – and, in addition, more complex views which link the architecture with the business plans. For example, an architecture view for a business capability (as defined through capability-based planning) would show how the components support that capability. The three vertical domains can be considered to constitute IT Architecture (for the enterprise).


Meet Web Push

One goal of the WebKit open source project is to make it easy to deliver a modern browser engine that integrates well with any modern platform. Many web-facing features are implemented entirely within WebKit, and the maintainers of a given WebKit port do not have to do any additional work to add support on their platforms. Occasionally features require relatively deep integration with a platform. That means a WebKit port needs to write a lot of custom code inside WebKit or integrate with platform specific libraries. For example, to support the HTML <audio> and <video> elements, Apple’s port leverages Apple’s Core Media framework, whereas the GTK port uses the GStreamer project. A feature might also require deep enough customization on a per-Application basis that WebKit can’t do the work itself. For example web content might call window.alert(). In a general purpose web browser like Safari, the browser wants to control the presentation of the alert itself. But an e-book reader that displays web content might want to suppress alerts altogether. From WebKit’s perspective, supporting Web Push requires deep per-platform and per-application customization.


Introduction to Infrastructure as Code - Part 1: Introducing IaC

In recent years, development has shifted away from monolithic applications and towards microservices architectures and cloud-native applications. However, modernizing apps introduces complexity, as maintaining the cloud computing architecture requires infrastructure automation tools, efficient provisioning, and scaling of new resources. Too many developers still see infrastructure provisioning and management as an opaque process that Ops teams perform using GUI tools like the Azure Portal. Infrastructure as code (IaC) challenges that notion. The practice of IaC unifies development and operations, creating a close bond between code and infrastructure. Why should we use IaC? When you develop an application, you create code, build and version it, and deploy the artifact through the DevOps pipeline. IaC allows you to create your infrastructure in the cloud using code, enabling you to version and execute that code whenever necessary. This three-article series starts with an introduction to IaC. Then, the following two articles in this series show how to use the Bicep language and Terraform HCL syntax to create templates and automatically provision resources on Azure.


VPN providers flee Indian market ahead of new data rules

The new directive by India's top cybersecurity agency, the Indian Computer Emergency Response Team (Cert-In), requires VPN, Virtual Private Server (VPS) and cloud service providers to store customers' names, email addresses, IP addresses, know-your-customer records, and financial transactions for a period of five years. Surfshark announced on Wednesday in a post titled "Surfshark shuts down servers in India in response to data law," that it "proudly operates under a strict 'no logs' policy, so such new requirements go against the core ethos of the company." Surfshark is not the first VPN provider to pull its servers from the country following the directive. ExpressVPN also decided to take the same step just last week, and NordVPN has also warned that it will be removing physical servers if the directives are not reversed. ... Like many businesses around the world, Indian companies have increased their reliance on VPNs since the COVID-19 pandemic forced many employees to work from home. VPN adoption grew to allow employees to access sensitive data remotely, even as companies started adopting other secure means to allow remote access such as Zero Trust Network Access and Smart DNS solutions.


5 top deception tools and how they ensnare attackers

Deception technologies work by creating decoys: traps that emulate legitimate systems. These traps are effective because of the way most attackers operate. When attackers penetrate an environment, they typically look for ways to establish persistence, which usually means dropping a backdoor. From there, they attempt to move laterally within the organization, often using stolen or guessed access credentials. As attackers find data and systems of value, they deploy additional malware and exfiltrate data, typically through the backdoor(s) they dropped. Traditional anomaly detection and intrusion detection/prevention systems try to spot these attacks in progress across entire networks and systems, but they rely on signatures or error-prone machine learning algorithms and throw off a tremendous number of false positives. Deception technologies, by contrast, set a much higher bar for triggering an event: because no legitimate user has any reason to touch a decoy, the events they raise tend to be real threat actors conducting real attacks.
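The "higher bar" that makes deception alerts trustworthy can be illustrated with a toy decoy: a listener that no legitimate client should ever contact, so any connection at all is treated as an attack indicator. The following is a minimal Python sketch; the fake FTP banner and the alert format are assumptions made for illustration, not how any particular commercial deception product works:

```python
import socket
import threading
from datetime import datetime, timezone

class DecoyService:
    """A minimal TCP decoy (honeypot). Any connection is a high-fidelity
    alert, because no legitimate user or process has a reason to touch it."""

    def __init__(self, host="127.0.0.1", port=0):
        # port=0 asks the OS for a free port; a real decoy would squat on
        # a port that looks valuable to an attacker, e.g. 21 (FTP) or 3389 (RDP).
        self.alerts = []
        self._srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self._srv.bind((host, port))
        self._srv.listen()
        self.port = self._srv.getsockname()[1]

    def serve_once(self):
        # Accept one probe, present a fake banner so the decoy looks like
        # a real service, and record the touch as an alert.
        conn, addr = self._srv.accept()
        with conn:
            conn.sendall(b"220 ftp.internal.example ready\r\n")
        self.alerts.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "source": addr[0],
            "event": "decoy-touched",
        })

if __name__ == "__main__":
    decoy = DecoyService()
    threading.Thread(target=decoy.serve_once, daemon=True).start()
    # Simulate an attacker probing the decoy during lateral movement;
    # the single probe shows up as an alert in decoy.alerts.
    with socket.create_connection(("127.0.0.1", decoy.port)) as probe:
        print(probe.recv(64).decode().strip())
```

In practice, deception platforms deploy fleets of such decoys alongside breadcrumb credentials that lead attackers to them, but the principle is the same: near-zero false positives, because nothing legitimate ever touches the trap.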


MIT built a new reconfigurable AI chip that can reduce electronic waste

The team's optical communication system comprises paired photodetectors and LEDs patterned with tiny pixels. The photodetectors include an image sensor for receiving data, and the LEDs transmit that data to the next layer. Because the components must stack together like LEGO bricks in the reconfigurable AI chip, they must be mutually compatible. "The sensory chip at the bottom receives signals from the outside environment and sends the information to the next chip above by light signals. The next chip, which is a processor layer, receives the light information and then processes the pre-programmed function. Such light-based data transfer continues to other chips above, thus performing multi-functional tasks as a whole," the team explained. ... The team fabricated a single chip with a computing core measuring about four square millimeters. The chip is stacked with three image recognition "blocks," each comprising an image sensor, an optical communication layer, and an artificial synapse array for classifying one of three letters: M, I, or T. They then shone a pixelated image of random letters onto the chip and measured the electrical current that each neural network array produced in response.


Augmented reality head-up displays: Navigating the next-gen driving experience

HUDs work by projecting a transparent 2D or 3D digital image of, for example, navigational and hazard warning information onto the windscreen of the vehicle. These projected images then merge with the driver's view of the road ahead. Windshield HUDs are set up so that the driver does not need to shift their gaze away from the road to view relevant, timely information. This keeps the driver's attention on the road, rather than on the dashboard or navigation system. Technological advances in this area have led to HUDs with holographic displays and 3D AR. The added depth perception makes it possible to project computer-generated virtual objects in real time into the driver's field of view to warn, inform or entertain the user. Shorter obstacle-visualization times increase the driver's alertness to road obstacles while reducing eye strain and driving stress. "Holographic HUDs are paramount if we are to explore the possibilities of augmented and mixed reality for road safety," said Jana


Nigerian Police Bust Gang Planning Cyberattacks on 10 Banks

The operation was a coordinated effort between the Economic and Financial Crimes Commission (EFCC) of Nigeria, Interpol, the National Central Bureaus and law enforcement agencies of 11 countries across Southeast Asia, according to Interpol. The operation was initiated after Interpol's private sector partner Trend Micro provided the agency with operational intelligence about the "emergence and usage of Agent Tesla malware" in this case. Agent Tesla was found on mobile phones and laptops seized from the syndicate members by the EFCC during the bust. "Through its global police network and constant monitoring of cyberspace, Interpol had the globally sourced intelligence needed to alert Nigeria to a serious security threat where millions could have been lost without swift police action," Interpol Director of Cybercrime Craig Jones says in the statement. "Further arrests and prosecutions are foreseen across the world as intelligence continues to come in and investigations unfold."


10 ways DevOps can help reduce technical debt

In most cases, technical debt occurs because development teams take shortcuts to meet tight deadlines and struggle with constant changes. But better collaboration between dev and ops can shorten the SDLC, speed up deployments, and increase their frequency. Moreover, CI/CD and continuous testing make it easier for teams to deal with changes. Overall, a collaborative culture encourages code reviews, good coding practices, and robust testing with mutual help. ... Technical debt is best controlled when managed continuously, which becomes easier with DevOps. Because DevOps facilitates constant communication, teams can track debt, raise awareness of it, and resolve it as soon as possible. Team leaders can also add technical debt reviews to the backlog and schedule maintenance sprints to deal with it promptly. Moreover, DevOps reduces the chances of incomplete or deferred tasks in the backlog, helping prevent technical debt. ... A true DevOps culture can be the key to managing technical debt over the long term. DevOps culture encourages strong collaboration between cross-functional teams, provides autonomy and ownership, and practices continuous feedback and improvement.


Once is never enough: The need for continuous penetration testing

The traditional attitude to manual pen testing is kind of like the traditional approach to driving navigation: nothing can replace the sophistication and accrued knowledge of a human. A taxi driver will always beat Google Maps, and a trained pen testing professional will find vulnerabilities and attacks that automated tests may miss, or identify responses that appear legitimate to automated software but are actually a threat. The truth is, on a case-by-case basis, this could conceivably be true. But with off-the-shelf tools and services like RaaS (Ransomware as a Service) or MaaS (Malware as a Service) that use AI/ML capabilities to enhance attack efficiency – you’d need an army of pen testers to truly meet the challenges of today’s cyber threats. And once you’d found, trained and employed them – cyberattackers would simply increase their automation efforts and you’d need to draft another army. Not a sustainable cybersecurity model, clearly. Similarly, the widescale adoption of agile development methodologies has translated into increasingly frequent software releases.



Quote for the day:

"If you are truly a leader, you will help others to not just see themselves as they are, but also what they can become." -- David P. Schloss