Daily Tech Digest - October 01, 2021

6 steps for third-party cyber risk management

Classify vendors based on the inherent risk they pose to the organization (i.e., risk that doesn’t take into account existing mitigations). To do this, create a scoping questionnaire that can be completed by the employee who owns the vendor relationship to capture vital information regarding the service being offered, the location and level of data being accessed, stored or processed, and other factors that indicate what kind of security assessment may be needed. Every vendor presents a different level of risk. For example, vendors that provide critical services usually have access to sensitive information and therefore pose a larger threat to the organization. This is where a vendor risk questionnaire comes in. You can develop your own or use one of the templates available online. In certain cases your organization may be required to comply with standards like SOC 2 Type 2, ISO 27001, NIST SP 800-53, NIST CSF, PCI-DSS, CSA CCM, etc. It’s also important that your questionnaire covers questions related to such frameworks and compliance requirements.
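
To make the classification step concrete, here is a minimal Python sketch of how answers from a hypothetical scoping questionnaire could be mapped to an inherent-risk tier; the field names, weights, and thresholds are illustrative assumptions, not prescribed by any of the frameworks mentioned above.

```python
# Minimal sketch of the inherent-risk classification step, assuming a
# hypothetical scoping questionnaire with these illustrative fields;
# weights and thresholds are made up for demonstration purposes.

from dataclasses import dataclass

@dataclass
class ScopingAnswers:
    provides_critical_service: bool  # vendor supports a critical business process
    data_sensitivity: str            # "public", "internal", "confidential", "regulated"
    data_location: str               # e.g. "on-prem", "vendor-hosted", "offshore"
    has_network_access: bool         # vendor connects into internal systems

SENSITIVITY_WEIGHTS = {"public": 0, "internal": 1, "confidential": 2, "regulated": 3}

def inherent_risk_tier(answers: ScopingAnswers) -> str:
    """Map questionnaire answers to a risk tier, ignoring existing mitigations."""
    score = SENSITIVITY_WEIGHTS.get(answers.data_sensitivity, 3)
    score += 2 if answers.provides_critical_service else 0
    score += 1 if answers.has_network_access else 0
    score += 1 if answers.data_location == "offshore" else 0
    if score >= 5:
        return "high"    # warrants a full security assessment
    if score >= 3:
        return "medium"  # standard questionnaire and evidence review
    return "low"         # lightweight review

# A critical vendor processing regulated data offshore lands in the "high" tier.
print(inherent_risk_tier(ScopingAnswers(True, "regulated", "offshore", True)))
```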


Incentivizing Developers is the Key to Better Security Practices

To improve their cybersecurity prowess, development teams must first be taught the necessary skills. Scaffolded learning and tools like Just-in-Time (JiT) training can make this process much less painful and help build on existing knowledge in the right context. The principle of JiT is that developers are served the right knowledge at just the right time: if a JiT developer training tool detects that a programmer is writing an insecure piece of code, or accidentally introducing a vulnerability into their application, it can step in, show the developer how to fix that problem, and demonstrate how to write more secure code for the same function in the future. With a commitment to upskilling in place, the old methods of evaluating developers based solely on speed need to be eliminated. Instead, coders should be rewarded for their ability to write secure code, with the best developers becoming security champions who help the rest of the team improve their skills.
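
As an illustration of the kind of fix a JiT training tool might surface (this snippet is my own example, not from the article), consider a query built by string concatenation versus one that binds user input as a parameter:

```python
# A hedged illustration (not from the article) of the kind of fix a JiT tool
# might teach: replacing string-built SQL, which is open to injection, with a
# parameterized query. Uses only Python's built-in sqlite3 module.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_insecure(name: str):
    # Vulnerable: user input is concatenated straight into the SQL statement,
    # so a crafted name like "x' OR '1'='1" changes the query's meaning.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_secure(name: str):
    # Fixed: the driver binds the value, so the input cannot alter the query.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_insecure("x' OR '1'='1"))  # returns every row
print(find_user_secure("alice"))           # returns only alice's row
```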


The Turbulent Past And Uncertain Future Of AI

Although deep-learning systems tend to be black boxes that make inferences in opaque and mystifying ways, neuro-symbolic systems enable users to look under the hood and understand how the AI reached its conclusions. The U.S. Army is particularly wary of relying on black-box systems, as Evan Ackerman describes in "How the U.S. Army Is Turning Robots Into Team Players," so Army researchers are investigating a variety of hybrid approaches to drive their robots and autonomous vehicles. Imagine if you could take one of the U.S. Army's road-clearing robots and ask it to make you a cup of coffee. That's a laughable proposition today, because deep-learning systems are built for narrow purposes and can't generalize their abilities from one task to another. What's more, learning a new task usually requires an AI to erase everything it knows about how to solve its prior task, a conundrum called catastrophic forgetting. At DeepMind, Google's London-based AI lab, the renowned roboticist Raia Hadsell is tackling this problem with a variety of sophisticated techniques.
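
As a toy illustration of catastrophic forgetting (my own sketch, unrelated to DeepMind's actual techniques), the snippet below trains a tiny logistic-regression model on one task, then continues training it only on a second task, and shows how its accuracy on the first task collapses:

```python
# A toy numpy sketch (my illustration, not DeepMind's work) of catastrophic
# forgetting: a single logistic-regression "network" trained on task A, then
# fine-tuned only on task B, loses most of its accuracy on task A.

import numpy as np

rng = np.random.default_rng(0)

def make_task(direction):
    """Binary task: the label is whether a point lies on one side of `direction`."""
    X = rng.normal(size=(500, 2))
    y = (X @ direction > 0).astype(float)
    return X, y

def train(w, X, y, epochs=200, lr=0.5):
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))       # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)  # gradient step on log loss
    return w

def accuracy(w, X, y):
    return (((X @ w) > 0).astype(float) == y).mean()

task_a = make_task(np.array([1.0, 0.0]))   # boundary along one axis
task_b = make_task(np.array([0.0, 1.0]))   # boundary along the other axis

w = train(np.zeros(2), *task_a)
print("task A accuracy after training on A:", accuracy(w, *task_a))

w = train(w, *task_b)                      # keep training, but only on task B
print("task A accuracy after training on B:", accuracy(w, *task_a))
print("task B accuracy:", accuracy(w, *task_b))
```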


Increase Your DevOps Productivity Using Infrastructure as Low Code

Often what people focus on around DevOps is the tooling element, as this often leads people down the continuous integration and continuous delivery (CI/CD) route. One of the most popular open-source CI/CD tools is Jenkins, an all-in-one automation server that brings together the various parts of the software development life cycle. There are endless tools available on the market that can fit into your DevOps processes and cover virtually any technology stack you can think of these days. As Jenkins is one of the most popular, let’s take a look at some of its pros and cons in comparison with other infrastructure as low code tools. Because Jenkins is open source, you get full control over the platform and what you do with it. Unfortunately, that also puts all the responsibility on you to make sure it’s doing what it should be doing. Starting at the infrastructure level, Jenkins is something you have to host yourself, which naturally comes with an associated cost for the underlying resources.


Russian Scientists Use Supercomputer To Probe Limits of Google’s Quantum Processor

From the early days of numerical computing, quantum systems have appeared exceedingly difficult to emulate, though the precise reasons for this remain a subject of active research. Still, this apparently inherent difficulty of a classical computer to emulate a quantum system prompted several researchers to flip the narrative. Scientists such as Richard Feynman and Yuri Manin speculated in the early 1980s that the unknown ingredients which seem to make quantum computers hard to emulate using a classical computer could themselves be used as a computational resource. For example, a quantum processor should be good at simulating quantum systems, since they are governed by the same underlying principles. Such early ideas eventually led to Google and other tech giants creating prototype versions of the long-anticipated quantum processors. These modern devices are error-prone: they can only execute the simplest of quantum programs, and each calculation must be repeated multiple times to average out the errors and eventually form an approximation.
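
Two of the points in the excerpt are easy to make concrete with a short sketch (my own illustration, not the researchers' code): the memory a classical computer needs to store a full n-qubit state grows as 2^n, and noisy hardware results are repeated and averaged to form an approximation.

```python
# A short numpy sketch (my own illustration, not the researchers' code) of two
# points from the excerpt: storing a full n-qubit state classically takes
# memory growing as 2^n, and noisy hardware results are repeated many times
# and averaged to approximate the true value.

import numpy as np

def statevector_bytes(n_qubits: int) -> int:
    """Memory needed to hold an n-qubit state as complex128 amplitudes."""
    return (2 ** n_qubits) * 16  # 16 bytes per complex amplitude

for n in (20, 40, 60):
    print(f"{n} qubits -> {statevector_bytes(n) / 1e9:.3e} GB")

# Toy "experiment": the true expectation value is 0.7, but each shot has a 2%
# chance of being flipped by readout error; averaging many shots recovers an
# approximation of the underlying value.
rng = np.random.default_rng(0)
true_value, flip_prob, shots = 0.7, 0.02, 10_000
outcomes = rng.binomial(1, true_value, size=shots)
flipped = np.where(rng.random(shots) < flip_prob, 1 - outcomes, outcomes)
print("estimate from", shots, "noisy shots:", flipped.mean())
```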


The Eclat algorithm

In this article, you will learn everything you need to know about the Eclat algorithm. Eclat stands for Equivalence Class Clustering and Bottom-Up Lattice Traversal, and it is an algorithm for association rule mining (which also encompasses frequent itemset mining). Association rule mining and frequent itemset mining are easiest to understand through their application to basket analysis: the goal is to understand which products are often bought together by shoppers. The resulting association rules can then be used, for example, in recommender engines (for online shopping) or for store layout improvements (for offline shopping). ECLAT is not the first algorithm for association rule mining; the foundational algorithm in the domain is Apriori. Because Apriori was the first algorithm proposed in the domain, it has since been improved upon in terms of computational efficiency. Two faster, state-of-the-art alternatives to Apriori exist: FP-Growth and ECLAT.
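
A minimal sketch of the core ECLAT idea on a toy basket dataset (the transactions and support threshold are assumptions made for illustration): store, for each item, the set of transactions that contain it, then grow frequent itemsets depth-first by intersecting those TID sets.

```python
# A minimal sketch of the core ECLAT idea on a toy basket dataset; the
# transactions and support threshold are assumptions made for illustration.

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
    {"milk", "beer"},
]
min_support = 2  # minimum number of transactions an itemset must appear in

# Vertical layout: each item maps to the set of transaction IDs containing it.
tidsets = {}
for tid, basket in enumerate(transactions):
    for item in basket:
        tidsets.setdefault(item, set()).add(tid)

def eclat(prefix, items, results):
    """Depth-first search over itemsets, extending by TID-set intersection."""
    for i, (item, tids) in enumerate(items):
        if len(tids) >= min_support:
            itemset = prefix + (item,)
            results[itemset] = len(tids)
            # Candidate extensions keep only intersections that stay frequent.
            suffix = [(other, tids & other_tids)
                      for other, other_tids in items[i + 1:]
                      if len(tids & other_tids) >= min_support]
            eclat(itemset, suffix, results)

frequent = {}
eclat((), sorted(tidsets.items()), frequent)
print(frequent)  # e.g. {('bread',): 3, ('bread', 'milk'): 2, ('milk',): 3, ...}
```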


Why Coding Interviews Are Getting So Hard?

If candidates are sheep, then interviewers are wolves. The sheep learn to run faster and faster because they want to survive, and so do the wolves. Years ago, there weren’t any interview practice materials. New grads would review their data structures and algorithms textbooks to prepare for coding interviews, and we would turn to senior students who had been through the interview process to pick up some wisdom. ... If you are an interviewer, try to avoid problems that are easily available on the internet, or at least tweak them before using them. Try to avoid problems that clearly require practice, e.g., dynamic programming. Try to focus less on whether a problem is solved perfectly and pay more attention to how candidates think and approach the problem. If you are a candidate, prepare for interviews as hard as you can! Frankly speaking, that may not be the best use of your time, but you need to do what you need to do. And after the interview, don’t share the problems. The world is big and pretty diverse. The discussion above is based on my very limited experience, and it might be wrong in a different context.


For networking pros, every month is Cybersecurity Awareness Month

Not sure why the organizers didn’t make “Cybersecurity First” the theme of the month’s first week, but it is not for me to second-guess the federal Cybersecurity & Infrastructure Security Agency (CISA) and the public/private National Cyber Security Alliance (NCSA), organizers of the annual awareness month. NCSAM is a great idea, just as are Bat Appreciation Month, Church Library Month, and International Walk to School Month, all of which also occur in October. It’s always good to be reminded that precautions and safeguards are needed when navigating a sometimes dangerous digital world. And that walking to school benefits students physically and mentally. For enterprise professionals, of course, every month is Cybersecurity Awareness Month. Security is constantly on the minds of enterprise IT pros, if not the minds of enterprise workers (sore subject!). And well it should be, coming off a year described by the CrowdStrike 2021 Global Threat Report as “perhaps the most active year in memory.”


Cloud computing in manufacturing: from impossible to indispensable

Advancements in infrastructure, combined with the exponential growth of software offerings in the cloud, have accelerated the digitisation of supply chains, allowing companies to operate and interact with each other in a more transparent and automated way. Companies are quickly expanding their operational intelligence, moving from single-asset descriptive analytics, where manufacturers are informed of what has happened, to prescriptive analytics, where manufacturers are informed of options to respond to what’s about to happen, across multiple lines and factories, all the way to critical elements of their supply chain. The exponential value creation cycle enabled by the Cloud Continuum does not depend on IT only. It requires organisations to have a well-defined vision, an adequate operating model, and a properly designed set of technology adoption principles. The adoption of cloud solutions without these three components usually leads to difficulty scaling and sustaining the intended benefits. In summary, cloud adoption in manufacturing went from a concept deemed impossible to an indispensable capability.


Today’s cars are mobile data centers, and that data needs to be protected

The utopian vision of the AV paradigm removing the stress of having to pilot the vehicle, improving road safety, and managing urban traffic flows has already given rise to what manufacturers are referring to as the “passenger economy”. While we are chauffeured by software, we will be able to work, shop, and play from the comfort of our seats with continuous network connectivity. Independent of our own data demand, our vehicles will also be exchanging sensor and telemetry data with other vehicles to avoid collisions, with our smart cities to ensure an efficient journey time, and with the manufacturer to schedule maintenance and contribute to the next generation of car design. All this critical data, however, could form the basis of a dystopian nightmare. Compromised applications might disable the software controlling safety systems on which AVs will depend. Knowledge of the driver’s identity, social media streams, and location might trigger an avalanche of targeted advertising from local services, a loss of privacy, and potentially compromised personal safety.



Quote for the day:

"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward
