This challenge will only be amplified as the amount of data available to retailers increases: the market for retail Internet of Things (IoT) sensors, RFID tags, beacons and wearables is projected to grow 23% annually through 2025, generating the data needed for targeted customer experiences and optimized operations. As retail consumers increasingly live and shop across multiple channels, a new analytics strategy is needed to take advantage of all that additional data. A single data pipeline, which slows both the extraction of insights and the decision-making based on them, is not the right fit for this new paradigm: it prevents analytics from delivering insights at the pace line-of-business decision makers need. In a single-version-of-the-truth (SVOT) world, employees often lose patience with the process and attempt do-it-yourself data strategies. An environment where marketing, sales, demand planning, supply chain, operations and finance each apply their own tools, filters and data-modeling decisions will produce a multitude of interpretations, even when they start from the same pile of data.
Despite mounting evidence of the substantial benefits of analytics, most companies have barely scratched the surface of what is possible. The good news is that the tide is turning. The field is increasingly attracting new talent, who are introducing skills such as data science and statistics to the realm of HR. This furthers progress, as do advances in technologies enabling real-time collection and analysis of unstructured as well as structured data. Consequently, the growth of these skills is set to continue. Building a people analytics function, coupled with capitalizing on technologies that collect, store, and dynamically visualize data, enables companies to put information at the fingertips of business leaders to support decision-making. Moreover, this democratization of data can also help managers by providing data on their own behaviors, as well as insights that support employee engagement, development, and performance.
PCI Express 5.0 finalized, but 4.0-compatible hardware is only now shipping
On its own merits, PCIe 5.0 is impressive, doubling the transfer rate of PCIe 4.0, which in turn doubled that of PCIe 3.0. To put it in perspective, a PCIe 5.0 x1 slot delivers the same bandwidth (~4 GB/s) as a full-size, first-generation PCIe x16 slot from 2003, the kind commonly used for graphics cards. In terms of practical deployment, it is likely to be some time before PCIe 5.0 devices arrive, though Intel may skip PCIe 4.0 entirely, as its Compute Express Link (CXL) technology for connecting FPGA-based accelerators is built on PCIe 5.0. This should be taken with a grain of salt: rumors also indicated that Intel planned to skip its 10nm manufacturing process in favor of moving to 7nm following low yields on 10nm parts, yet Intel's Computex announcements show 10nm plans for mobile systems, though desktop-class CPUs have yet to be announced. From an implementation standpoint, the technical jump from 4.0 to 5.0 is smaller than from 3.0 to 4.0, making a quick upgrade of existing 4.0 designs likely.
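The bandwidth comparison above is simple arithmetic: each PCIe generation doubles the per-lane transfer rate, and usable bandwidth scales with lane count. A minimal sketch of that math, using the commonly cited raw rates and encoding overheads (8b/10b for gen 1–2, 128b/130b for gen 3+):

```python
# Approximate per-lane bandwidth for PCIe generations (illustrative figures).
# Raw transfer rates in GT/s; gen 1-2 use 8b/10b encoding, gen 3+ use 128b/130b.

TRANSFER_RATES_GT_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}

def lane_bandwidth_gb_s(gen: int) -> float:
    """Usable bandwidth per lane in GB/s, after encoding overhead."""
    rate = TRANSFER_RATES_GT_S[gen]
    overhead = 8 / 10 if gen <= 2 else 128 / 130
    return rate * overhead / 8  # divide by 8 bits to convert to bytes

# A PCIe 5.0 x1 link roughly matches a first-generation x16 slot:
x1_gen5 = lane_bandwidth_gb_s(5) * 1    # ~3.94 GB/s
x16_gen1 = lane_bandwidth_gb_s(1) * 16  # 4.00 GB/s
print(f"PCIe 5.0 x1:  {x1_gen5:.2f} GB/s")
print(f"PCIe 1.0 x16: {x16_gen1:.2f} GB/s")
```

The same function shows the doubling the article describes: a gen-4 lane delivers exactly half the bandwidth of a gen-5 lane, and a gen-3 lane half of that again.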
Sustainable Operations in Complex Systems With Production Excellence
Production excellence is a set of skills and practices that allow teams to be confident in their ownership of production. Production-excellence skills are often found among SRE teams or individuals with the SRE title, but they ought not be solely their domain. Closing the feedback loop on production ownership requires us to spread these skills across everyone on our teams. Under production ownership, operations become everyone's responsibility rather than "someone else's problem". Every team member needs a basic fluency in operations and production excellence even if it is not their full-time focus. And teams need support in cultivating those skills, and need to feel rewarded for doing so. There are four key elements to making a team and the service it supports perform predictably in the long term. First, teams must agree on which events matter to user satisfaction and eliminate extraneous alerts for those that do not. Second, they must improve their ability to explore production health, starting from symptoms of user pain rather than from potential-cause-based exploration.
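The first element, agreeing on which events matter to users, is typically formalized as a service-level objective (SLO) with an error budget. As a hedged illustration (the 99.9% target and request counts below are invented for this sketch, not taken from the article), the underlying arithmetic is straightforward:

```python
# Illustrative SLO / error-budget arithmetic; targets and counts are made up.

def error_budget(slo_target: float, total_requests: int) -> int:
    """Number of failed requests the SLO permits over the window."""
    return round(total_requests * (1 - slo_target))

def budget_remaining(slo_target: float, total: int, failed: int) -> float:
    """Fraction of the error budget still unspent (negative means the SLO is breached)."""
    budget = total * (1 - slo_target)
    return (budget - failed) / budget

# A 99.9% availability SLO over 1,000,000 requests permits 1,000 failures;
# after 250 failures, roughly three quarters of the budget remains.
print(error_budget(0.999, 1_000_000))
print(budget_remaining(0.999, 1_000_000, 250))
```

Framing alerts around budget burn, rather than around every individual failure, is one common way teams cut the "extraneous alerts" the article mentions.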
A Quantum Revolution Is Coming
Now, individuals and entities across NGIOA are part of an entangled global system. Since the ability to generate and manipulate pairs of entangled particles is at the foundation of many quantum technologies, it is important to understand and evaluate how the principles of quantum physics translate to the survival and security of humanity. If an individual human is seen as a single atom, is our behavior guided by deterministic laws? How does individual human behavior impact the collective human species? How is an individual representative of how collective systems, whether economic or security-based, operate? Acknowledging this emerging reality, Risk Group initiated a much-needed discussion on the strategic impact of quantum physics on the financial industry with Joseph Firmage, Founder & Chairman of the National Working Group on New Physics in the United States, on Risk Roundup.
CIO interview: Sam Shah, director for digital development, NHS England
Shah believes the effective use of standards across emerging technology will help break forms of supplier lock-in that have previously characterised much of the provision of NHS systems and services. To encourage providers to generate innovative solutions to business challenges in the health service, Shah says the sector needs to be a more attractive place for IT suppliers. “We’re keen to help – we want to generate grants to help innovators in the UK work in partnership with the NHS,” he says. “We have an entire network of academics and scientists that support our work. And we have a much more open approach to development, so that suppliers can start working with the NHS in a more meaningful way. “As we amass more data and connect more datasets, we have an opportunity to bring about precision public health to reduce inequalities and to reduce the burden on society. We can create precision medicine that allows clinicians to prescribe much more precisely around the needs of the patient. Our world is becoming more data-driven, but we need help from suppliers to deliver these services.”
Put simply, location intelligence is the ability to derive business insights from geospatial information. Those with well-developed location intelligence abilities use GIS, maps, data, and analytical skills to solve real-world problems, specifically business problems. This is an important distinction. Location intelligence is primarily a business term that refers to solving business problems. GIS may be the technical foundation of location intelligence, but it’s not the same thing. ... In reality, when you factor location into analysis, you open up a world of opportunity. Specifically, you make it possible to tackle a unique set of problems. Think about an offshore oil company trying to predict and monitor sea ice activity. Rogue icebergs or shifting ice floes, driven by global climate change, pose a tremendous risk to the safe operation of offshore oil rigs and shipping vessels. Mitigating sea ice risk is inherently about predicting and monitoring the location of the ice: its size, shape, and speed, and the consequences if it impacts an oil platform.
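As a concrete, purely hypothetical illustration of factoring location into analysis, a system monitoring iceberg drift might start with the great-circle distance between successive position fixes (the coordinates and time interval below are invented; the haversine formula itself is standard):

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points in kilometres (haversine formula)."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical iceberg fixes taken 6 hours apart near the Grand Banks:
d = haversine_km(48.00, -49.00, 48.10, -48.80)
speed_kmh = d / 6  # drift speed over the interval
print(f"drift: {d:.1f} km at ~{speed_kmh:.1f} km/h")
```

From distance and heading over time, an analyst can project a drift track and flag fixes whose projected path intersects a platform's exclusion zone, which is exactly the kind of problem that only becomes tractable once location is part of the analysis.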
European Union Votes to Create a Huge Biometrics Database
The identity records will include names, dates of birth, passport numbers, and other ID information, while the biometric details include fingerprints and facial scans. The primary aim of the biometric database is to let EU border and law enforcement personnel search for people’s information faster, an upgrade to the current practice of going through separate databases when looking for information. The interoperability of the CIR will ensure that law enforcement officers have fast, seamless, systematic and controlled access to the information they need to perform their tasks. It would also detect multiple identities linked to the same set of biometric data and facilitate identity checks of third-country nationals (TCNs) on the territory of a Member State by police authorities. The CIR would also enable identification of TCNs who lack proper travel documents.
With regard to a blockchain platform that offers a space for content creators to go about their business unhindered, there is a lot of potential, and already some use cases of a decentralised content platform with an incentivisation program attached. Many within the blockchain sphere are aware of Steemit, a blogging and social networking website that uses the Steem blockchain to reward publishers and curators. It is a useful service because, thanks to its decentralised nature, there should be no censorship - though that is in question, since Steemit Inc still heads up the entire operation. But in principle, a fully decentralised content platform allows free rein regarding posting, and because of the token economy attached to it, there is monetisation, with crowd sentiment driving the content. Many will worry about hate speech and other dangers being pronounced on these decentralised platforms, but from a rather libertarian viewpoint, such platforms will only be as successful as the demand for them.
WebAssembly and Blazor: A Decades Old Problem Solved
In mid-April 2019, Microsoft gently nudged a young framework from the "anything is possible" experimental phase to a "we're committed to making this happen" preview. The framework, named Blazor because it runs in the browser and leverages a templating system or "view engine" called Razor, enables the scenario .NET developers almost gave up on. It doesn't just allow developers to build client-side code with C# (no JavaScript required), but also allows developers to run existing .NET Standard DLLs in the browser without a plugin. ... HTML5 and JavaScript continued to win the hearts and minds of web developers. Tools like jQuery normalized the DOM and made it easier to build multi-browser applications, while at the same time browser engines started to adopt a common DOM standard to make it easier to build once and run everywhere. An explosion of front-end frameworks like Angular, React, and Vue.js brought Single Page Applications (SPA) mainstream and cemented JavaScript as the language of choice for the browser operating system.
Quote for the day:
"Great spirits have always encountered violent opposition from mediocre minds." -- Albert Einstein