Cloud washing occurs when software providers attach the “cloud” label to a program to rebrand it or boost sales. These programs differ from cloud-native software, which is built specifically for the cloud. Cloud-washed tools take advantage of companies that want to integrate cloud solutions into their infrastructure, and they should be avoided. Though cloud washing is a serious problem, it is easy to spot if you know what you’re looking for. We’ve put together some tips on how to tell whether a program is cloud washed and what cloud-native tools actually look like. Companies cloud wash by claiming that legacy software (old or outdated software that continues to be supported) is cloud software. Whether it’s a ploy to attract uninformed customers or evidence of a lack of cloud understanding, marketers attach the word “cloud” to their old programs to boost sales. This tends to happen with tools that connect to the internet.
We now know that data collected from one channel needs to inform efforts in every other channel and that technologies introduced as channel-specific tools now need to work across entire organizations — something even the marketing clouds have trouble with. Because of the way marketing technology has evolved, marketers are left managing very complicated tech stacks composed of multiple technologies, stitched together to complete what should be seamless, interconnected marketing processes. It’s no wonder that even though companies have more technology at their disposal than at any other point in history, only 39% of executives today say they feel they have the digital capabilities they need to compete. As someone who has spent the last decade reimagining how to process, analyze and act on audience, channel and tactic data at scale, I believe the introduction of artificial intelligence (AI) will be the final tipping point for marketing’s digital transformation — despite challenges that remain. Here’s how.
"The rise of machine learning will really make our industry unrecognizable in the future," said Anthony Cowell, head of asset management for KPMG in the Cayman Islands. His clients include some of the world's largest asset managers, hedge funds and private-equity firms. For instance, Citi Private Bank has deployed machine learning to help financial advisors answer a question they're frequently asked: What are other investors doing with their money? By using technology, the bank can anonymously share portfolio moves being made by clients all over the planet. "Traditionally that kind of information was sourced from your network. You might have had a few coffees or heard about it over a cocktail," Philip Watson, head of the global investment lab at Citi and chief innovation officer at Citi Private Bank, told CNN Business. "Now, we can share insight that is very valuable."
Citi also built a recommender engine that uses machine learning tools to advise clients.
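The article doesn't describe how Citi's engine works, but recommender engines of this kind are commonly built on collaborative filtering: score assets a client doesn't hold by the similarity-weighted preferences of comparable clients. A minimal sketch under that assumption (the client data, asset names, and functions are all hypothetical, not Citi's):

```python
import math

# Hypothetical client-by-asset "interest" matrix (illustrative only).
ratings = {
    "client_a": {"bonds": 5, "equities": 3, "gold": 1},
    "client_b": {"bonds": 4, "equities": 4, "gold": 2},
    "client_c": {"equities": 5, "gold": 4},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts."""
    common = set(u) & set(v)
    num = sum(u[k] * v[k] for k in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def recommend(target, ratings, top_n=1):
    """Score assets the target hasn't rated by similarity-weighted peer ratings."""
    peers = {c: cosine(ratings[target], r)
             for c, r in ratings.items() if c != target}
    scores = {}
    for client, sim in peers.items():
        for asset, rating in ratings[client].items():
            if asset not in ratings[target]:
                scores[asset] = scores.get(asset, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("client_c", ratings))  # suggests an asset client_c hasn't rated
```

Production systems replace the toy cosine similarity with matrix factorization or learned embeddings, but the shape of the computation is the same.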
Back then we thought about higher bitrates, increased spectrum efficiency, etc. We also had a few ideas that we were not able to get into the LTE standard due to backward compatibility issues – the most notable feature being what we call "lean carrier". We also put energy efficiency high on the requirement list. Most importantly, we saw a need for solutions that could support all kinds of communication needs, way beyond traditional services; for example, serving the forecasted massive IoT market was one of our key requirements. And, in order to really stretch our design, we added support for critical machine-type communication, which is now known as URLLC (ultra-reliable low-latency communication). ... 5G will also be combined with edge computing and third-party applications running close to the devices. On the edge, AI will be able to learn and control most of our infrastructure in smart cities and smart manufacturing in factories. So the transformational impact of 5G will be enormous compared to the previous Gs.
Hackers often gain access to systems by exploiting weaknesses in edge devices such as routers and switches. Now, other devices such as smart thermostats, doorbells, and even security cameras are also vulnerable. Simply put, the same rigor is often not applied to ensuring that these IoT devices are secure. Blockchain technology can be used to protect systems and devices from attacks. According to Joseph Pindar, co-founder of the Trusted IoT Alliance, blockchain can give those IoT devices enough “smarts” to make security decisions without relying on a central authority. For instance, devices can form a group consensus about what constitutes normal behavior within a given network and lock down any nodes that behave suspiciously. Blockchain technology can also protect the data exchanges happening between IoT devices. It can be used to attain near real-time secure data transmissions and ensure timely communication between devices located thousands of miles apart.
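The blockchain mechanics aren't specified here, but the group-consensus idea Pindar describes can be reduced to a majority vote among peer devices. A deliberately simplified sketch (the device names and quorum rule are illustrative; a real deployment would record the votes on a tamper-evident distributed ledger rather than in a plain dict):

```python
from collections import Counter

def consensus_verdict(votes, quorum=0.5):
    """Quarantine a node only if more than `quorum` of peers flag it.

    `votes` maps each observing device to its verdict on the node in
    question: "suspicious" or "normal". No single device (and no central
    authority) can trigger the lockdown alone.
    """
    tally = Counter(votes.values())
    return tally["suspicious"] / len(votes) > quorum

# Three of four peers report anomalous behavior from the same node.
votes = {
    "thermostat": "suspicious",
    "doorbell": "suspicious",
    "camera_1": "normal",
    "camera_2": "suspicious",
}
print(consensus_verdict(votes))  # True -> the group locks down the node
```

The point of routing this decision through consensus, rather than a central server, is that compromising one device (or the server) is no longer enough to suppress or forge an alert.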
This deal with Entrust Datacard is expected to close during the second quarter of 2019, subject to the successful completion of the acquisition of Gemalto by Thales and the approval of Entrust as a suitable purchaser by the European Commission, US Department of Justice, Australian Competition and Consumer Commission, and New Zealand Commerce Commission. Thales said the deal will enable nCipher Security, which has more than 300 employees and reported more than €100m in revenues in 2018, to “continue to deliver innovative solutions and services and strengthen its market leadership”. It added that Entrust Datacard is a global leader in public key infrastructure (PKI) solutions and services, and that PKI is the primary use case for general purpose hardware security modules (GP HSMs), protecting infrastructure private keys such as root and issuing certification authority keys. “This makes Entrust Datacard the ideal organisation for Thales to divest this business, ensuring its leadership position in the GP HSMs market and providing trust, integrity and control to business-critical applications,” the company said.
Disinformation has spawned a new sub-industry within journalism, with fact checkers working around the clock to analyse politicians’ speeches, articles from other publications and news reports, and government statistics among much else. But the sheer volume of disinformation, together with its ability to multiply and mutate like a virus on a variety of social platforms, means that thorough fact-checking is only possible on a tiny proportion of disputed articles. While technology has provided the seedbed and distribution for disinformation, it also offers a solution to the issue. Artificial intelligence in particular offers powerful tools in the fight against disinformation, working on multiple levels to identify dubious content. These techniques are broadly split between content-based and response-based identification. The former works much like a human fact checker, by matching the content of an article with trusted sources of information to highlight errors or outright lies.
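As a rough illustration of content-based identification, a checker can score a claim against a corpus of trusted statements and surface the closest match for a human to compare. A toy sketch (the trusted statements are invented for the example, and real systems use claim extraction and semantic embeddings rather than raw word overlap):

```python
import re

def tokens(text):
    """Lowercase word/number tokens from a piece of text."""
    return set(re.findall(r"\w+", text.lower()))

def jaccard(a, b):
    """Word-overlap similarity between two texts (0.0 to 1.0)."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

# Invented "trusted source" statements, standing in for a fact database.
trusted = [
    "The unemployment rate fell to 3.9 percent in 2018.",
    "GDP grew by 2.9 percent in 2018.",
]

def best_match(claim, sources):
    """Return (score, statement) for the trusted statement closest to the claim."""
    return max((jaccard(claim, s), s) for s in sources)

score, source = best_match("Unemployment fell to 3.9 percent in 2018", trusted)
print(round(score, 2), source)
```

A low best-match score does not prove a claim false; it only flags it as unverifiable against the trusted corpus, which is why such tools triage for human fact checkers rather than replace them.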
Use cross-functional teams to plan and design this modernization effort. Functional experts from areas such as IT, strategy, R&D, customer interaction, and operations can all work together in an agile “sandbox” environment to design the changes around a set of coordinated specifications. In this early stage, and throughout the initiative, you thus link leading-edge knowledge of the changing technology with deep, day-to-day awareness of the desired results. As you bring these teams together, you will establish a shared frame of reference — a common language to describe the features you want and the capabilities you are building. This also will help engage new stakeholders as they join in the effort. A major transportation company revamped its online system this way, improving the integration between the website that handled passenger bookings and the back-office functions that, among other things, routed travel. In its intensive sandbox sessions, the company set up temporary cross-functional working groups, which became known as “tribes.”
“An attacker could exploit this vulnerability by connecting to the cluster service manager and injecting commands into the bound process,” Cisco wrote in its security advisory. Cisco says the vulnerability is due to insufficient input validation in Cisco HyperFlex software releases prior to 3.5. Such input can alter the control flow or data flow of a program and cause a number of resource control problems. Cisco has released a software update to address this vulnerability and said there are no workarounds. The second vulnerability – rated 8.1 on Cisco's scale – is a snafu in the hxterm service of Cisco HyperFlex Software that could let an attacker connect to the service as a non-privileged, local user. A successful exploit could allow the attacker to gain root access to all member nodes of the HyperFlex cluster in Cisco HyperFlex software releases prior to 3.5, according to the security advisory.
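As a general illustration of the defence the advisory implies was missing (this is not Cisco's code, and the command and pattern are invented), insufficient input validation is typically fixed by allow-listing what input may look like and by never passing untrusted strings through a shell:

```python
import re
import subprocess

# Hypothetical allow-list: node names may only contain these characters.
NODE_NAME = re.compile(r"^[A-Za-z0-9_-]{1,32}$")

def ping_node(node):
    """Reach a cluster node by name, rejecting anything outside the allow-list."""
    if not NODE_NAME.fullmatch(node):
        raise ValueError(f"rejected input: {node!r}")
    # Arguments are passed as a list with no shell, so metacharacters
    # like ';' or '&&' are inert data, not command separators.
    return subprocess.run(["ping", "-c", "1", node],
                          capture_output=True, text=True)

# A classic injection attempt fails validation before any command runs.
try:
    ping_node("host1; rm -rf /")
except ValueError as e:
    print(e)
```

The two habits shown here, validating against a strict allow-list and invoking subprocesses without a shell, close off most command-injection paths regardless of what the attacker appends to the input.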
A battle is also brewing in the US over state and federal privacy laws. Several states have passed laws aimed at data privacy and ethical use. The most prominent and restrictive of these is the California Consumer Privacy Act of 2018, set to take effect in 2020 and billed as the toughest data privacy law in the country (incorporating many GDPR-like restrictions). Many companies have lobbied against this and other state bills, pushing for less restrictive measures and asking that a uniform federal law supersede all state legislation. To this end, both the US Chamber of Commerce and the Internet Association, which represents companies like Amazon, Facebook, Google, and Twitter, have released their own recommendations for a federal bill. The Data Care Act introduced by a group of US senators, a competing congressional bill, the Information Transparency and Personal Data Control Act, and the White House recommendations round out the plethora of proposals.
Quote for the day:
"The ultimate measure of a man is not where he stands in moments of comfort, but where he stands at times of challenge and controversy." -- Martin Luther King, Jr.