With perimeter defenses increasingly a thing of the past, attack surfaces growing, and adversaries becoming more capable, the managed detection and response (MDR) model has piqued interest across major industries. A crucial difference between MDR and traditional ransomware defenses is MDR’s proactive response to threats. MDR is a powerful managed security service that combines threat intelligence, threat hunting, security monitoring, incident analysis, and incident response. It leverages telemetry on endpoints, monitors user behaviors, and helps produce a data-driven baseline of a business’s ‘normal’ activities, whether on premises or in the cloud. Essentially, it couples the best detection technologies with security expertise to seek out and eliminate threats before catastrophic damage occurs. Ransomware protection has been critical for businesses, especially during the pandemic. COVID-19 has proven to be a nightmare for assessing what ‘normal’ behavior looks like for organizations. Most companies lacked contingencies for adapting to the pandemic.
The fact is, low-code and no-code have been terms for probably 15 years, if not more, in one way or another. I think I remember trying to write my first website in a low-code front-page application, but what did I do? The second I did that, I had to jump into the code, the HTML code, to actually make it work. But we are at a different time, I think really a unique time, where we have a broad base of the workforce, the majority of the workforce now, that is the millennial generation or younger. So we have a younger workforce that actually grew up with technology and has used it day in and day out. We don't really think of it as, 'Oh, well, you had apps and phones,' but that familiarity with technology has given them a technical literacy that just comes with today's day and age. Now, if you couple that with the fact that low-code platforms are much more powerful than they were before, you have a perfect union: people who just want to get stuff done and will figure out technology if you give it to them, and technology that is powerful enough, yet simple enough, to really innovate on. Now, there is something you mentioned, Bill, that is really important, which is that enterprises have to be bought into this.
You’ll see more self-healing, self-configuring and self-provisioning. Day 2 operations will be seamless, and self-correcting work will all be done in software automatically. In many ways we have already achieved these capabilities with Mist and our Wi-Fi technology, which has a self-correcting mechanism. In the data center, operations will be driven by automation to eliminate errors and to find and correct particular problems. Our focus on AI has been a real shot in the arm for the company and our customers. As we pull more and more telemetry from our routers and switches, automation and AI will drive a lot more functionality into our software. The data gathered by telemetry is king. You need that kind of data to gain insights into what’s going on and how devices and software are working. You find out how the network is operating through packet capture and the state of the cloud network, and then look for deviations. In our case, [Juniper’s AI-powered virtual assistant] Marvis in 2019 learned of network problems and could solve 20% of them without intervention. Now that number is over 80% of problems solvable automatically, in part due to all of the intelligent telemetry it gathers.
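The "baseline, then look for deviations" idea can be sketched with a simple statistical check. This is only an illustration of the principle, not Juniper's Marvis implementation: flag any telemetry sample that sits too many standard deviations away from the series baseline.

```python
import statistics

def deviations(samples, threshold=3.0):
    """Flag telemetry samples that deviate from the baseline.

    Baseline = mean/stdev of the series; any point more than
    `threshold` standard deviations away is reported as an anomaly.
    """
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:          # perfectly flat series: nothing to flag
        return []
    return [(i, x) for i, x in enumerate(samples)
            if abs(x - mean) / stdev > threshold]

# Hypothetical per-interface packet counts; the spike at index 5
# is the kind of deviation a monitoring system would surface.
counts = [100, 102, 98, 101, 99, 500, 100, 97]
print(deviations(counts, threshold=2.5))  # -> [(5, 500)]
```

Real systems use far richer baselines (per-device, time-of-day, seasonality), but the principle is the same: learn "normal" from telemetry, then alert on what falls outside it.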
The blockchain is a constantly growing list of information. That information is stored in blocks, and all these blocks are linked together: each block references the block that precedes it and the one that follows it, and the information each block contains is secured by a cryptographic function called a hash. This makes the information virtually tamper-proof. It is a secure, open and public database. To illustrate how the blockchain works, the metaphor of a ledger distributed among many people is often used. It would be a great book in which digital events are recorded. The fundamental thing here is that this book is "distributed", that is, shared among many different parties (nodes). It can only be updated by consensus of the majority of the system's participants and, once entered, the information can never be deleted. The Bitcoin blockchain, for example, contains an accurate and verifiable record of all the transactions made in its history. In other words, the authenticity of the blockchain is not verified by a third party, but by the consensus of the whole: it is the network of users itself that participates in it.
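A minimal sketch of the hash-linking idea in Python (illustrative only; real blockchains add proof-of-work, Merkle trees, and peer-to-peer consensus): each block stores the hash of the previous block, so editing any historical block breaks every link after it.

```python
import hashlib
import json

def block_hash(body):
    """Hash a block's contents (which include the previous block's hash)."""
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block whose 'prev' field links it to the last block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return chain

def is_valid(chain):
    """Recompute every hash; any edit to an earlier block breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                   # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(chain))                   # False: the links no longer check out
```

This is why the ledger is described as append-only: rewriting one entry would require recomputing every later block, and the rest of the network would reject the altered copy.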
The planned law is intended to apply to any company selling an AI product or service into the EU, not just to EU-based companies and individuals — so, as with the EU’s data protection regime, it will be extraterritorial in scope. The overarching goal for EU lawmakers is to foster public trust in how AI is implemented to help boost uptake of the technology. Senior Commission officials talk about wanting to develop an “excellence ecosystem” that’s aligned with European values. “Today, we aim to make Europe world-class in the development of a secure, trustworthy and human-centered Artificial Intelligence, and the use of it,” said Commission EVP Margrethe Vestager, announcing adoption of the proposal at a press conference. “On the one hand, our regulation addresses the human and societal risks associated with specific uses of AI. This is to create trust. On the other hand, our coordinated plan outlines the necessary steps that Member States should take to boost investments and innovation. To guarantee excellence. All this, to ensure that we strengthen the uptake of AI across Europe.”
Crossgen2 is an exciting new platform addition and part of the .NET 6 release. It is a tool that enables both generating and optimizing code in a new way. The crossgen2 project is a significant effort, and is the focus of multiple engineers. I thought it might be interesting to try a more conversational approach to exploring new features. ... Crossgen’s pedigree comes from the early .NET Framework days. Its implementation is tightly coupled with the runtime (it is essentially just the runtime and JIT attached to a PE file emitter). We are building a new version of Crossgen – Crossgen2 – which starts with a new code base architected to be a compiler that can perform analysis and optimizations not possible with the previous version. ... As the .NET Core project matured and we saw usage grow across multiple application scenarios, we realized that crossgen’s limitation of only being able to produce native code of one flavor, with one set of characteristics, was going to be a big problem. For example, we might want to generate code with different characteristics for the Windows desktop on one hand and Linux containers on the other. The need for that level of code-generation diversity is what motivated the project.
Language is sequential data: you can view it as a stream of words, where the meaning of each word depends on the words that come before it and the words that come after it. That is why computers have such a hard time understanding language — to understand one word, you need its context. Sometimes the output also needs to be a sequence of data (words). A good example is translating English into Serbian: the input to the algorithm is a sequence of words, and the output must be a sequence as well. ... During the training process, the Encoder is supplied with word embeddings from the English language. Computers don’t understand words; they understand numbers and matrices (sets of numbers). That is why we map words into a vector space, assigning a vector (in some latent vector space) to each word in the language. These are word embeddings. There are many available word embeddings, such as Word2Vec. However, the position of a word in the sentence is also important for context.
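A toy sketch of the two ideas in this paragraph — word embeddings plus positional information — using hand-picked 4-dimensional vectors and a sinusoidal positional encoding (the vectors here are invented for illustration, not Word2Vec output):

```python
import math

# Toy embedding table: in practice these vectors are learned
# (e.g. by Word2Vec); here they are hand-picked 4-dim vectors.
EMBEDDINGS = {
    "i":    [0.1, 0.3, 0.0, 0.2],
    "love": [0.9, 0.1, 0.4, 0.0],
    "you":  [0.2, 0.8, 0.1, 0.5],
}

def positional_encoding(pos, dim):
    """Sinusoidal position vector, so the model can tell word order apart."""
    return [
        math.sin(pos / 10000 ** (i / dim)) if i % 2 == 0
        else math.cos(pos / 10000 ** ((i - 1) / dim))
        for i in range(dim)
    ]

def encode(sentence):
    """Embedding + positional encoding for each word in the sentence."""
    vectors = []
    for pos, word in enumerate(sentence.lower().split()):
        emb = EMBEDDINGS[word]
        pe = positional_encoding(pos, len(emb))
        vectors.append([e + p for e, p in zip(emb, pe)])
    return vectors

seq = encode("I love you")
print(len(seq), len(seq[0]))  # 3 words, 4 dimensions each
```

Note that the same word at two different positions now produces two different vectors — exactly the extra signal the paragraph says the encoder needs beyond the raw embedding.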
To determine just how effective micro-segmentation can be, Illumio conducted a red team exercise with Bishop Fox. The team was tasked with finding “crown jewel” assets in a test environment, and while they did not face a defensive blue team, they were pitted against increasingly tight micro-segmentation policies. The first and lowest-level policy tested was environmental separation. This is a fairly coarse-grained approach in which workloads in different environments, such as production, testing, or development, can only connect with others in the same environment. It quickly became clear that even this simple level of separation could force attackers to take at least three times as long to reach their target. This 300-percent increase in difficulty for the intruder meant defensive tools and security personnel had much more time to detect and investigate signs of unusual activity. The next level of micro-segmentation, application ringfencing, proved even more effective, creating a 450-percent increase in difficulty for the attacker. At this stage, only workloads associated with a specific application could talk to each other.
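The two policy tiers can be sketched as a simple connection check (a hypothetical model for illustration, not Illumio's actual policy engine):

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    environment: str   # e.g. "production", "development"
    application: str   # e.g. "payments", "hr"

def allowed(src, dst, level):
    """Illustrative policy check for the two micro-segmentation tiers.

    'environment' -- only workloads in the same environment may talk.
    'ringfence'   -- same environment AND same application required.
    """
    if level == "environment":
        return src.environment == dst.environment
    if level == "ringfence":
        return (src.environment == dst.environment
                and src.application == dst.application)
    raise ValueError(f"unknown policy level: {level}")

web_prod = Workload("web-1", "production", "payments")
db_prod  = Workload("db-1", "production", "payments")
db_dev   = Workload("db-2", "development", "payments")
hr_prod  = Workload("hr-1", "production", "hr")

print(allowed(web_prod, db_prod, "environment"))  # True: same environment
print(allowed(web_prod, db_dev, "environment"))   # False: crosses environments
print(allowed(web_prod, hr_prod, "ringfence"))    # False: different application
```

Each tier shrinks the set of reachable hosts, which is why the attacker's path to the "crown jewels" gets so much longer at every step.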
The entire quantum industry is "still finding its way to what applications are really useful," he said. "You tend to see this list of potential applications, a heralded era of quantum computing, but I don't think we really know," he said. The Qatalyst software from QCI focuses on the kinds of problems that are of perennial interest, generally in the category of optimization, particularly constrained optimization, where minimizing a given loss function or objective function is complicated by constraints of some sort on the variables, such as bounded values. ... "They are described at a high level as the traveling salesman problem, where you have multi-variate sort of outcomes," said Liscouski. "But it's supply-chain logistics, it's inventory management, it's scheduling, it's things that businesses do today that quantum can really accelerate the outcomes in the very near future." Such problems are "a very important use case," said Moulds. Quantum computers are "potentially good at narrowing the field in problem spaces, searching through large potential combinations in a wide variety of optimization problems," he said.
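To see why such problems invite acceleration, here is a classical brute-force traveling-salesman solver (a toy with an invented distance matrix): it must examine (n-1)! candidate tours, which is exactly the combinatorial explosion that optimization-focused quantum approaches aim to tame.

```python
from itertools import permutations

# Symmetric distances between 4 cities (illustrative numbers only).
DIST = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(tour):
    """Total length of a closed tour (returns to the starting city)."""
    legs = zip(tour, tour[1:] + tour[:1])
    return sum(DIST[a][b] for a, b in legs)

def best_tour(n):
    """Exhaustively check all (n-1)! tours starting at city 0.

    Fine for tiny n, hopeless at scale: 20 cities already means
    ~1.2e17 tours -- the blow-up optimization hardware targets.
    """
    start = (0,)
    return min((start + p for p in permutations(range(1, n))),
               key=tour_length)

tour = best_tour(4)
print(tour, tour_length(tour))  # -> (0, 1, 3, 2) 18
```

Quantum and quantum-inspired solvers do not enumerate tours this way; the point of the sketch is only to make the size of the search space concrete.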
Agile at the organizational level is changing the DNA of organizations; it brings higher autonomy to creative, innovative, and collaborative teams that are better designed to deal with the complexity and unpredictability of VUCA challenges. It requires flexibility and quick responses to change. It breaks the fundamental beliefs that classical management was built on, and creates a strong need for a change in leadership. Dynamic structures with no fixed design are hard to manage the traditional way, and the growth of emergent leadership is inevitable. Agile leaders are catalysts and servant leaders; they are role models of a new way of working. They coach, mentor, and encourage others to become agile leaders as well. Being an agile leader is a journey, and agile leaders need to focus on helping other leaders around them grow, to make agility as a whole sustainable. Having a critical mass of agile leadership is crucial for any agile environment; without it, we are only creating another process and adding terminology, and all we get is “fake agile,” not business results.
Quote for the day:
"Leaders need to strike a balance between action and patience." -- Doug Smith