Daily Tech Digest - August 18, 2019

Realities and myths for 5G’s impact on logistics

The disruptive potential of 5G in logistics is all about the Internet of Things. 4G and Wi-Fi networks already serve as the ‘connective tissue’ between every device we connect to the internet, including computers, phones, wearables, home appliances and major business infrastructure. Every business relies on data to function, and logistics companies handle even more data than most. The confluence of employee functions, delivery vehicles, material handling equipment and facility control systems has always demanded lightning-fast connections with low latency and high uptime. 5G can deliver on that promise once it’s up and running. Individual devices will be able to establish their own internet connections, provided they bring their own power or have access to it. With latency up to 10 times lower than 4G’s, companies will be able to distribute and exchange far larger quantities of data than ever.

Cloud security is too important to leave to cloud providers

The need to take control of security rather than turn ultimate responsibility over to cloud providers is taking hold among many enterprises, an industry survey suggests. The Cloud Security Alliance, which surveyed 241 industry experts, identified an "Egregious 11" of cloud security issues. The survey's authors point out that many of this year's most pressing issues put the onus of security on end-user companies rather than on service providers. "We noticed a drop in ranking of traditional cloud security issues under the responsibility of cloud service providers. Concerns such as denial of service, shared technology vulnerabilities, and CSP data loss and system vulnerabilities -- which all featured in the previous 'Treacherous 12' -- were now rated so low they have been excluded in this report. These omissions suggest that traditional security issues under the responsibility of the CSP seem to be less of a concern. Instead, we're seeing more of a need to address security issues that are situated higher up the technology stack that are the result of senior management decisions."

Asbeck estimates that it will take only a few years before labor industries adopt exoskeletons, and not much longer before passive suits become affordable and commonplace among the able-bodied. By providing greater endurance, exoskeletons could give the elderly extra years of an active lifestyle, such as hiking. ... “A lot of us wear a Bluetooth device in our ear now, and that was tech you saw the military wearing in a movie fifteen years ago,” Haas said. “It looks crazy and futuristic, but now we all see that type of technology.” If wearable robotics make you think less of Iron Man and more of Wall-E, where futuristic humans rely on robotic lounge chairs for mobility and become too bloated to walk, David Perry, an engineer at Harvard Biodesign Lab, isn’t concerned. He says these suits will maintain the health of people who do incredible physical feats. His leg exoskeleton, which soldiers tested, won't alleviate all the stress on the wearer’s muscles. But it does make walking about 15% easier, even while carrying a heavy load.

Managing compliance costs with quality data

Quality data should allow compliance teams to screen against a range of datasets, including sanctions, politically exposed persons (PEPs), law enforcement lists, regulatory lists, and adverse media records geared towards sanctions compliance, AML, countering the financing of terrorism, and countering proliferation finance. Data should also be comprehensive, de-duplicated, consistent, accurate, configurable, and up-to-date. Given this extensive list of requirements, it is important to choose a data provider with global capabilities that can deliver reliable and trusted data. The dataset being considered should have a highly analytical set of inclusion criteria and be de-duplicated. A successful provider should allow extensive ‘slicing and dicing’ and configuration and, perhaps most importantly of all, must take the control of operational costs seriously. It is worth mentioning that overscreening (screening beyond regulatory requirements) can significantly contribute to excessive operational costs.
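The screening and de-duplication requirements above can be sketched in a few lines. This is a minimal illustration, not a real compliance workflow: the names, list contents, and normalization rule are all hypothetical, and production systems use fuzzy matching rather than exact lookups.

```python
# Minimal sketch: normalize and de-duplicate watchlist entries from several
# source lists, then screen a customer name against the combined set.
# All names here are invented for illustration.

def normalize(name: str) -> str:
    """Lowercase, trim, and collapse whitespace so near-duplicates compare equal."""
    return " ".join(name.lower().split())

def build_watchlist(raw_entries):
    """De-duplicate raw entries from multiple source lists into one set."""
    return {normalize(e) for e in raw_entries}

def screen(customer_name, watchlist):
    """Return True if the customer matches a watchlist entry."""
    return normalize(customer_name) in watchlist

raw = ["John  Doe", "john doe", "JANE ROE"]   # overlapping source lists
watchlist = build_watchlist(raw)
print(len(watchlist))                  # 2 entries remain after de-duplication
print(screen("John Doe", watchlist))   # True
```

Keeping the list de-duplicated before screening is one concrete way a provider controls operational costs: every duplicate entry is a potential duplicate alert an analyst has to clear.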

Can private online communities drive corporate cultural change?

By creating a dedicated interactive workspace for your customers, partners and company staff, you create a collaborative space built on trust. Potential customers look for, and often rely on, current-customer interaction for recommendations and support. Through customer engagement you can educate your customers on new product enhancements and product rollouts. Public online communities are typically full of distracting noise, or communication is one-way, isolated in reviews, FAQs or chatbot exchanges between the customer and the brand. Although subcultures formed by joining groups can limit noise, building brand loyalty amid the distracting noise of public online communities is still not as safe or effective, and the data can’t be gathered to increase a company’s value and effectiveness or enhance product development. In a dedicated, private online community, interactive real-time networking and tribal problem-solving help to create long-term partnerships and friendships.

Major breach found in biometrics system used by banks, UK police and defence firms

The researchers said the sheer scale of the breach was alarming because the service is in 1.5m locations across the world and because, unlike a leaked password, a leaked fingerprint can never be changed. “Instead of saving a hash of the fingerprint (that can’t be reverse-engineered) they are saving people’s actual fingerprints that can be copied for malicious purposes,” the researchers said in the paper. The researchers made multiple attempts to contact Suprema before taking the paper to the Guardian late last week. Early on Wednesday morning (Australian time) the vulnerability was closed, but they still have not heard back from the security firm. Suprema’s head of marketing, Andy Ahn, told the Guardian the company had conducted an “in-depth evaluation” of the information provided by vpnmentor and would inform customers if there was a threat. “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets,” Ahn said.
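The one-way property the researchers refer to can be illustrated with a salted hash. This is only a sketch of the principle: real biometric template protection is considerably more involved, since two scans of the same finger never match byte-for-byte, so plain hashing alone does not work for fingerprint matching. The template bytes and salt below are stand-ins.

```python
import hashlib

def store(template: bytes, salt: bytes) -> str:
    """Persist only a salted SHA-256 digest, never the raw data.

    The digest cannot be reversed into the original bytes, which is the
    property the raw-fingerprint storage in the breach lacked.
    """
    return hashlib.sha256(salt + template).hexdigest()

raw_template = b"\x01\x02\x03\x04"   # stand-in for captured biometric data
salt = b"per-record-salt"

stored = store(raw_template, salt)
print(len(stored))   # 64 hex characters; the original bytes are unrecoverable
```

The contrast is the whole point of the researchers' criticism: a database of digests is of little use to an attacker, while a database of raw fingerprints can be replayed against the same sensors indefinitely.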

Here’s How Artificial Intelligence Is Fueling Climate Change

At the moment, data centers (the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs and all the other data stored somewhere other than on your phone or home computer) use about 2% of the world’s electricity. ... According to The MIT Technology Review, Dickerson recently told a conference audience in San Francisco that, unless super-efficient semiconductors arrive in the next five years, data centers handling AI demands could account for 10% of the world’s electricity use by 2025, a hundred-fold increase in half a decade. Dickerson’s forecast is a worst-case scenario, and other tech execs have given estimates that vary wildly. Some think data centers, period, will suck up 10% of the global electricity load. Others think usage will remain relatively flat, in part because large companies can handle vast amounts of data in more efficient ways. Google, for example, is using AI technology to cool its data centers, reducing the energy used for cooling by 40%.

On Stocks And Machine Learning

Undoubtedly, the “cognitive biases” described by Kahneman and Tversky act on and affect the decisions of even the most experienced and famed stock analysts and portfolio managers in the world. Specifically, the “Confirmation Bias” may lead analysts to purchase stocks that are well-known, popular and “juicy”. Analysts are usually swamped with information and data on the companies they follow, which may inflate their confidence level in their analysis of these companies’ stocks. The “Anchoring Bias” will make it difficult for an analyst to sell a stock he purchased even if he discovers that he erred in his original analysis of that stock’s performance. The “Representativeness Bias” may also lead analysts to wrong investments: when investigating a company’s history and profile, the analyst tends to assume that these parameters will repeat themselves in the future. This assumption ignores the “reversion to the mean” phenomenon, which is typical of financial markets and the economy in general.
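The "reversion to the mean" phenomenon the passage ends on can be sketched with a toy AR(1)-style series, in which each period's deviation from a long-run mean shrinks by a fixed factor. The mean, pull factor, and starting value below are invented for illustration and do not model any real market.

```python
import random

def simulate(start, mean=100.0, pull=0.5, noise=0.0, steps=10, seed=0):
    """Mean-reverting toy series: next = mean + pull * (current - mean) + noise.

    With pull < 1, the deviation from the mean decays geometrically, so
    extreme starting values drift back toward the long-run mean.
    """
    rng = random.Random(seed)
    x = start
    path = [x]
    for _ in range(steps):
        x = mean + pull * (x - mean) + rng.gauss(0, noise)
        path.append(x)
    return path

path = simulate(start=150.0)     # noise=0 so the decay is easy to see
print(round(path[0]))            # 150: starts far above the mean
print(round(path[-1]))           # 100: pulled back toward the long-run mean
```

An analyst who extrapolates the exceptional starting value forward, as the representativeness bias encourages, is betting against exactly this pull.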

Data management roles: Data architect vs. data engineer, others

How do these data management roles compare? Data architects design and help implement database systems and other repositories for corporate data, Bowers said. They're also responsible for ensuring that organizations comply with internal and external regulations on data, and for evaluating and recommending new technologies, he added. Bowers described a data architect as a "know-it-all" who has to be familiar with different databases and other data management tools, as well as use cases, technology costs and limitations, and industry trends. "I had to master a ton of technologies to become a data architect," he said. A data modeler identifies business rules and entities in data sets and designs data models for databases and other systems to help reduce data redundancy and improve data integration, according to Bowers. Data modelers make less money on average than many other IT workers, but you get what you pay for, he cautioned.

Data Management No Longer an IT Issue

The next-generation data management platform needs to treat data differently. It needs to see data as a liquid core asset, not a static one, that can be quickly ingested, stored in the most appropriate data formats and locations, and easily accessed by any analytical processing engine. The data architecture should be flexible, scalable, high-performance, integrated, and secure. But this does not mean you need to create an entirely new enterprise data platform, according to Han. "The core components are still the same -- applications, middleware, database, analytics, and systems. However, when we build the new data architecture on top of the existing framework, we must be aware that there are new access points like mobile and IoT for collecting data today, which did not exist 15 years ago. There is also a huge abundance of data that comes in a variety of formats today. So, the question is, how can we integrate them all?" Oracle’s Big Data SQL, an end-to-end big data and AI platform, looks at all data in unison and integrates it to maximize its value.
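The integration question Han raises, reconciling feeds that arrive in different formats, can be sketched in miniature. This is not Oracle Big Data SQL; it is a hypothetical illustration in which a CSV feed from a legacy system and a JSON feed from an IoT endpoint are normalized into one record shape. The field names are invented.

```python
import csv
import io
import json

# Two feeds describing the same kind of reading, but in different formats
# and under different field names. Normalizing them into one shape is the
# precondition for any engine querying "all data in unison".

def from_csv(text):
    """Legacy CSV feed: columns 'device' and 'temp'."""
    return [{"device": r["device"], "temp": float(r["temp"])}
            for r in csv.DictReader(io.StringIO(text))]

def from_json(text):
    """IoT JSON feed: keys 'id' and 'reading'."""
    return [{"device": d["id"], "temp": float(d["reading"])}
            for d in json.loads(text)]

csv_feed = "device,temp\nwarehouse-1,21.5\n"
json_feed = '[{"id": "sensor-7", "reading": 19.0}]'

unified = from_csv(csv_feed) + from_json(json_feed)
print(len(unified))                    # 2 records from two source formats
print(sorted(r["device"] for r in unified))
```

The design point mirrors the quote: the existing feeds are left as they are, and the new layer built on top maps each format into a common schema rather than replacing the underlying systems.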

Quote for the day:

"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will
