Supporting low-latency commands and parallel queues, NVMe is designed to exploit the performance of high-end SSDs. "It not only offers significantly higher performance and lower latencies for existing applications than legacy protocols, but also enables new capabilities for real-time data processing in the data center, cloud and edge environments," says Yan Huang, an assistant professor of business technologies at Carnegie Mellon University's Tepper School of Business. "These capabilities can help businesses stand out from their competition in the big data environment." NVMe is particularly valuable for data-driven businesses, especially those that require real-time data analytics or are built upon emerging technologies. The NVMe protocol is not limited to connecting flash drives; it can also serve as a networking protocol. The arrival of NVMe-oF (NVMe over Fabrics) now allows organizations to create a very high-performance storage network with latencies that rival direct-attached storage (DAS). As a result, flash devices can be shared among servers as needed.
The authors note that certain standards may indicate what is not compliant but offer no guidance on how to bring the item into compliance. They also observed the common state in which many organizations are "perennially noncompliant with corporate standards." In one example, the breach of SingHealth, the book lays out a series of occurrences in which auditors gave a clean bill of compliance yet the organization was compromised. While a no-finding audit may seem desirable to many at the time, such audits often simply shift costs forward: the audit was paid for, a breach happened anyway, and the breach incurs significant cost. Post-breach, many organizations must patch the root cause plus anything else that was harmed after the attackers established their beachhead, and then deal with any regulatory and financial fallout based on the type of data lost. Those previous clean audits offer no assistance in the aftermath, as they have been proven deficient.
Cyber threat intelligence is the discipline of tracking adversaries, following breadcrumbs, and producing intelligence you can use to help your team and make the other side’s life harder. To achieve that, the five-year-old MSTIC team includes former spies and government intelligence operators whose experience at places like Fort Meade, home to the National Security Agency and US Cyber Command, translates immediately to their roles at Microsoft. MSTIC names dozens of threats, but the geopolitics are complicated: China and the United States, two of the most significant players in cyberspace and the two biggest economies on earth, are virtually never called out the way countries like Iran, Russia, and North Korea frequently are. “Our team uses the data, connects the dots, tells the story, tracks the actor and their behaviors,” says Jeremy Dallman, a director of strategic programs and partnerships at MSTIC. “They’re hunting the actors—where they’re moving, what they’re planning next, who they are targeting—and getting ahead of that.”
While it may seem trivial, the conflict here is a fundamental one in approaches to artificial intelligence. Namely, how far can you get with mere statistical associations between huge sets of data, and how much do you need to introduce abstract concepts for real intelligence to arise? At one end of the spectrum, Good Old-Fashioned AI, or GOFAI, dreamed up machines that would be entirely based on symbolic logic. The machine would be hard-coded with the concept of a dog, a flower, a car, and so forth, alongside all of the symbolic “rules” we internalize that allow us to distinguish between dogs, flowers, and cars. Such a system would be able to explain itself, because it would deal in high-level, human-understandable concepts. The equation is closer to: “ball” + “stitches” + “white” = “baseball”, rather than a set of millions of numbers linking various pathways together. There are elements of GOFAI in Google’s new approach to explaining its image recognition: the new algorithm can recognize objects based on the sub-objects they contain.
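The contrast the paragraph draws can be sketched in a few lines of Python. This is a hypothetical rule-based classifier in the GOFAI spirit, where recognition is an explicit, explainable conjunction of sub-concepts rather than millions of learned weights; the rules and object names are illustrative, not Google's actual algorithm:

```python
# Illustrative GOFAI-style classifier: objects are recognized by
# explicit symbolic rules over their sub-objects, so every decision
# can be explained in human-understandable terms. Purely a sketch.

RULES = {
    "baseball": {"ball", "stitches", "white"},
    "dog": {"fur", "tail", "snout"},
    "car": {"wheels", "windshield", "doors"},
}

def classify(observed_parts):
    """Return (label, explanation) for the first rule whose required
    parts are all present, or (None, ...) if nothing matches."""
    for label, required in RULES.items():
        if required <= set(observed_parts):
            return label, f"{' + '.join(sorted(required))} = {label}"
    return None, "no rule matched"

label, why = classify({"white", "ball", "stitches", "grass"})
print(label, "|", why)  # → baseball | ball + stitches + white = baseball
```

The explanation string is the point: unlike a neural network's weight vector, the system's reason for an answer is the rule itself.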
The recent demand for low-code development comes from a desire to modernise IT environments quickly without taking a rip-and-replace approach, says Scheurman. “The push from the business on software development is to do things fast. They also want to automate. That is why I think low-code and robotic process automation (RPA) are part of a continuous spectrum.” Nick Ford, vice-president of product and solution marketing at low-code supplier Mendix, agrees that the hidden benefit of low-code is meeting user needs. “What often happens is there is an impetus for an idea – a new insurance product, for example,” he says. “That might be built as a prototype by a subject-matter expert who creates the data model on-screen in low-code, but over time that is fleshed out and made production-ready, including integration with back-office systems, by a developer collaborating on the same model. It is not waterfall – they have different windows into the model to do different things.”
Many of today’s security and fraud problems occur within applications and are difficult, if not impossible, to detect from outside the application. For example, if a fraudster has obtained a user’s login details via a credential attack, their login can appear entirely normal – but once inside the site, they can start to behave maliciously. Nixer CyberML allows development teams to rapidly add machine learning based detection to online applications (online banking, ecommerce systems, ticket sites, critical business apps, etc.) that can learn to accurately distinguish between good and bad user behaviour. This initial release, designed for developers, includes the Nixer CyberML architecture, code libraries for Spring Framework-based applications, and a local Nixer CyberML Engine designed to help with credential protection functionality. The Nixer CyberML Engine stores and processes anonymous application event data, and contains the machine learning algorithms that determine whether events are normal or potentially malicious.
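Nixer's actual engine and APIs aren't shown here, but the kind of in-application signal it describes – separating normal logins from credential-stuffing behaviour – can be sketched as a simple event-rate check. The event shape and thresholds below are assumptions for illustration; a real engine like the one described would use learned models rather than a fixed cutoff:

```python
from collections import defaultdict

# Minimal sketch of in-application credential-attack detection:
# count failed logins per source IP inside a sliding time window
# and flag sources that exceed a threshold. Threshold values and
# the (timestamp, ip, success) event shape are illustrative.

WINDOW_SECONDS = 60
MAX_FAILURES = 5

def flag_suspicious(events):
    """events: iterable of (timestamp, source_ip, success) tuples.
    Returns the set of IPs with more than MAX_FAILURES failed
    logins inside any WINDOW_SECONDS span."""
    failures = defaultdict(list)
    for ts, ip, success in events:
        if not success:
            failures[ip].append(ts)
    suspicious = set()
    for ip, times in failures.items():
        times.sort()
        for i in range(len(times)):
            # count failures in the window starting at times[i]
            j = i
            while j < len(times) and times[j] - times[i] <= WINDOW_SECONDS:
                j += 1
            if j - i > MAX_FAILURES:
                suspicious.add(ip)
                break
    return suspicious
```

The point of running this inside the application, rather than at the network edge, is that the application sees login outcomes directly and can correlate them with post-login behaviour.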
Combined with real-time feeds of the patient's ultrasound scans, this lets the clinician assess vital signs and decide whether a hospital intervention is needed, or if the wound can be managed directly in the vehicle. "To improve the efficiency of healthcare, we need to understand that not everyone needs to come to the hospital," said Clutton-Block. "With this technology, we can decide a lot better whether a wound should be healed on the spot, or if it requires further assistance." If the patient needs an operation, he added, the clinician can make sure that the hospital has surgeons ready as soon as the ambulance pulls in. It is slightly premature, however, to expect to see smart ambulances driving around every city corner anytime soon. Clutton-Block explained that, contrary to preconceptions, this is not because the technology is too immature: "Actually, I don't think the technology is very difficult," he laughed. "And compared to some hospital equipment, which can reach hundreds of thousands of pounds, a VR headset isn't very expensive either."
With fewer (but higher-quality) collaborative projects, the team needed fewer meetings. Fewer meetings meant less time developing agendas and building presentations, and fewer invitations clogging already packed in-boxes. The best part? The meetings that they did have felt essential and relevant to everyone attending them, meaning they did better work. The M&M’s retail leadership team became better collaborators by collaborating less. Less collaboration cleared the calendar and mental space that allowed them to dig deeper for higher-quality work. The impact wasn’t only in dollars (though the business was more profitable than it had been in years). Their engagement scores went up because employees were doing more meaningful collective work. It might seem counterintuitive to think about how you can collaborate less. But when you collaborate on the projects that truly matter most, you’ll get much better results. Sure, you could opt for hypercollaboration, and maybe you can’t undo all the apps already put in place for it.
PlatformDIGITAL is intended to provide a foundation for customers to address the need for global coverage, capacity, and ecosystem connectivity from a single data center provider; tailor infrastructure deployments and controls matched to business needs; operate deployments as a seamless extension of any global infrastructure; and enable global distributed workflows at centers of data exchange to remove data gravity barriers and scale digital business. Digital Realty's PDx approach was developed by enterprise IT practitioners and was created by codifying hundreds of product deployment combinations into repeatable implementation patterns. The goal is to allow customers to quickly deploy enterprise infrastructure and to scale their digital businesses globally. It’s a similar interconnection strategy to that of DRT’s chief rival Equinix, but slightly different, notes David Cappuccio, distinguished analyst with Gartner. “This is a move by Digital Realty to compete on a global scale with Equinix. They have 220+ sites and have interconnected them all, similar to Equinix. But rather than focusing on the interconnection strategy and being the infrastructure glue for global enterprises like Equinix, they are focusing on the data part, with the idea that as you move applications or workloads closer to the customer, or a specific geo to solve location or compliance issues, you are also moving data.”
Although it might at first sound counterintuitive, AI enables marketers to create highly personalised consumer experiences. It does so by offering a deeper understanding of the consumer, particularly when it comes to how they perceive and interact with the company and brand. By analysing input such as social media activity, marketers can harness real-time data to see what is being said about their brand and specific marketing campaigns, and then use this information to modify the messaging to achieve maximum effectiveness. Data-driven AI solutions are also a massive aid when it comes to creating personalised marketing campaigns that get the right message across to the right people. The data previously available to marketers was typically made up of demographic attributes such as age, location and gender. Now, there’s an abundance of much more informative data that is readily available to capture and analyse, including customers' past and present behavioural patterns and previous interactions between customer and brand. Just think how much time it would take a small team to capture and analyse each consumer interaction!
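The monitoring loop described above – scoring what is being said about a brand, then aggregating per campaign so messaging can be adjusted – can be sketched with a toy keyword tally. The word lists and scoring rule are illustrative assumptions; production systems would use trained sentiment models:

```python
# Toy sketch of real-time brand sentiment from social posts:
# score each mention by counting positive vs negative keywords,
# then average across a campaign's mentions so marketers can see
# whether messaging needs adjusting. Word lists are illustrative.

POSITIVE = {"love", "great", "fast", "recommend"}
NEGATIVE = {"slow", "broken", "refund", "disappointed"}

def score_post(text):
    """Crude per-post sentiment: +1 per positive keyword present,
    -1 per negative keyword present."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def campaign_sentiment(posts):
    """Average sentiment score across a campaign's mentions."""
    return sum(score_post(p) for p in posts) / len(posts)

posts = [
    "Love the new plan, great value",
    "Shipping was slow and support disappointed me",
]
print(campaign_sentiment(posts))
```

Even this crude version shows the automation argument the paragraph closes on: a loop over posts replaces the small team manually reading each consumer interaction.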
Quote for the day:
"People seldom improve when they have no other model but themselves." -- Oliver Goldsmith