Resiliency is the capability to handle partial failures while continuing to execute rather than crashing. In modern application architectures, whether microservices running in containers on-premises or applications running in the cloud, failures are going to occur. For example, applications that communicate over networks are subject to transient failures: temporary faults caused by timeouts, overloaded resources, networking hiccups, and other problems that come and go and are hard to reproduce. ... You can't avoid failures, but you can respond in ways that keep your system up or at least minimize downtime. For example, when one microservice fails, the effects can cascade and bring the whole system down. ... Developers often use the Circuit Breaker and Retry patterns together to give retrying a break. Retry tries an operation again, but when it doesn't succeed, you don't always want to just try it one more time, or you may risk prolonging the problem. The Circuit Breaker pattern effectively shuts down all retries on an operation after a set number of retries have failed.
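A minimal sketch of how the two patterns compose, assuming a generic Python call site (the class, function names, and thresholds here are illustrative, not from the article):

```python
import time

class CircuitOpenError(Exception):
    """Raised when the circuit is open and calls are short-circuited."""

class CircuitBreaker:
    # Illustrative thresholds; real systems tune these per dependency.
    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise CircuitOpenError("circuit open; skipping call")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # a success closes the circuit again
        return result

def retry(fn, attempts=3, base_delay=0.5):
    """Retry with exponential backoff; give up after `attempts` tries."""
    for attempt in range(attempts):
        try:
            return fn()
        except CircuitOpenError:
            raise  # don't keep hammering a tripped breaker
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Usage: breaker = CircuitBreaker(); retry(lambda: breaker.call(fetch_orders))
```

Retry absorbs short transient faults, while the breaker stops retry storms once failures persist, which is exactly the "give retrying a break" behavior described above.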
It is obvious that banking will also have to change and adapt to the new reality. Big events have reshaped the financial industry over the last 10 years. The global financial crisis of 2007-08 was just the ignition for numerous scandals followed by huge fines, leading to a flood of new regulations (AML, KYC, Basel III). Unfortunately, most regulations have yet to prove their benefits for the consumer and bank client. They have definitely made service delivery more bureaucratic, time-consuming and costly, reducing client satisfaction. And, of course, they have also made life more difficult for bank staff, especially as resources are trimmed and cost-cutting continues. All of this is happening in the midst of a global shift of economic power towards the East. We should also note that the developed world, especially Europe, is rather more technophobic than progressive. This sharply contrasts with the emerging Asian economies, driven by behemoth China, which are already taking the lead in many technology and science disciplines, e.g., robotics, artificial intelligence, social media, smartphones, wearable technologies and the internet of things.
Machine learning evolution (infographic)
Deep learning’s improved accuracy in image, voice, and other pattern recognition has made Bing Translator and Google Translate go-to services, and enhancements in image recognition have made Facebook Picture Search and the AI in Google Photos possible. Collectively, these have put machine recognition capabilities in the hands of consumers in a big way. What will it take to make similar inroads in business? Quality training data, digital data processing, and data science expertise. It will also require a lot of human intelligence, such as language-savvy domain experts who refine a computable, logically consistent business context that machines can reason over. Business leaders will have to take the time to teach machines and incorporate machine intelligence into more processes, starting with narrow domains. Some in the statistically oriented machine learning research “tribes” (the Connectionists, the Bayesians and the Analogizers, for example) will worry that the “human-in-the-loop” methods advocated by the Symbolists aren’t scalable. However, we expect these human-to-machine feedback loops, which blend methods from several tribes, to become a lot more common inside the enterprise over the next few years.
Reinventing the healthcare sector with Artificial Intelligence
India is also joining a growing list of countries using AI in healthcare. The adoption of AI in India is being propelled by the likes of Microsoft and a slew of health-tech startups. For instance, Manipal Hospitals, headquartered in Bengaluru, is using IBM Watson for Oncology, a cognitive-computing platform, to help physicians discover personalised cancer care options, according to an Accenture report. For cardiac care, Columbia Asia Hospitals in Bengaluru is leveraging startup Cardiotrack’s AI solutions to predict and diagnose cardiac diseases. “Last year the company embarked on Healthcare NExT, a Microsoft initiative which aims to accelerate healthcare innovation through AI and cloud computing. By working side-by-side with the healthcare industry’s most pioneering players, we are bringing Microsoft’s capabilities in research and product development to help healthcare providers, biotech companies and organizations across India use AI and the cloud to innovate,” said Anil Bhansali, Corporate Vice President, Cloud & Enterprise, and Managing Director, Microsoft India (R&D) Private Limited.
Look For the ‘Human’ When Buying HR Technology
Cognitive data processing technologies can not only automate common tasks like generating and distributing reports, but also perform duties as highly nuanced as career coaching across an entire employee base. Machine learning algorithms can learn an organization’s priorities for skills development, help assess an individual’s credentials, and then recommend training or positions to consider. Offered to employees through an existing learning management system or a mobile app, such innovations can support large-scale corporate objectives of development and retention while giving every worker the useful career information they crave today. Yet some new technologies have a wow factor that makes them seem useful or innovative when they are not yet ready or reasonable replacements for HR’s role. Virtual Reality (VR) is being touted for all sorts of uses, from eLearning to giving candidates a taste of what it’s like to work for a company. But these should be seen as augmenting traditional tactics, not replacing them entirely.
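As a toy illustration of the kind of matching such a system performs (the skill names, profiles, and scoring below are invented for the example; real products learn these weights from HR data):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two skill vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(employee_gap, courses, top_n=2):
    """Rank courses by how well they cover the employee's skill gap."""
    scored = [(cosine(employee_gap, vec), name) for name, vec in courses.items()]
    return [name for _, name in sorted(scored, reverse=True)[:top_n]]

# Hypothetical skill axes: [sql, people_mgmt, public_speaking, python].
# The gap is the difference between the org's target profile and the
# employee's current skills; courses are tagged by the skills they teach.
gap = [0.8, 0.1, 0.6, 0.0]
courses = {
    "Advanced SQL": [1.0, 0.0, 0.0, 0.2],
    "Presentation Skills": [0.0, 0.2, 1.0, 0.0],
    "Intro to Management": [0.0, 1.0, 0.3, 0.0],
}
print(recommend(gap, courses))  # -> ['Advanced SQL', 'Presentation Skills']
```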
Some reasons to think about big data as data commons
Finally, the digital economy has generated a considerable fiscal distortion: companies that rely on digital business models pay on average half the effective tax rate of traditional companies, thanks to the “fluid” nature of their businesses and to the placement of their subsidiaries in countries with low tax regimes. In response, the European Commission has recently come up with two distinct legislative proposals to ensure a fair and efficient European tax system. In light of the above, an alternative approach to data management, for a more equal and sustainable socio-economic digital environment, is not only desirable but necessary. The DECODE project is building in this direction: a digital economy where citizen data is not stored in silos located and handled overseas but rather self-controlled and available for broader communal use, with appropriate privacy protections and value distribution outcomes. The Data Commons approach centres precisely on the pooling and collective management of individually produced streams of personal and sensor data that, combined with cities’ public data, will offer data-driven services that better respond to individual and community needs.
Third Wave Dev + Ops: Self-Learning DevOps for a Self-Learning World
DevOps AI success, in other words, may be largely contingent upon continuous testing. After all, we want our AI-powered recommendation engines to guide customers toward the right items while they're still on our websites. Why wouldn't we similarly want our AI-powered DevOps to guide developers toward the right behaviors while they're still on a feature or fix? ... Most IT organizations still treat cybersecurity as its own functional silo. In five years, this approach won't work, especially since next-generation architectures, such as blockchain, that depend on the innate security of their code may run in imperfectly secured environments beyond enterprise IT teams' control. Security must therefore become intrinsic to DevOps QA, not something for someone else to clean up after the fact. Building security checks directly into the integrated development environment (IDE) helps protect companies from increasingly sophisticated attacks designed to discover and exploit design flaws. Security-enabled DevOps also saves money and speeds time to market in precisely the same way that shifting QA left does.
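A minimal sketch of the "shift security left" idea: a pre-commit-style check that scans source files for obvious hard-coded secrets before code ever reaches CI (the patterns and file handling are deliberately simplified assumptions; real scanners ship far larger rule sets):

```python
import re
import sys
from pathlib import Path

# Two illustrative rules: an AWS-style access key id, and a
# password/API-key literal assigned in source.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),
    re.compile(r"(?i)(password|api_key)\s*=\s*['\"][^'\"]+['\"]"),
]

def scan(paths):
    findings = []
    for path in paths:
        try:
            text = Path(path).read_text(errors="ignore")
        except OSError:
            continue  # skip unreadable files rather than failing the hook
        for lineno, line in enumerate(text.splitlines(), 1):
            if any(p.search(line) for p in SECRET_PATTERNS):
                findings.append(f"{path}:{lineno}: possible hard-coded secret")
    return findings

if __name__ == "__main__":
    problems = scan(sys.argv[1:])
    for p in problems:
        print(p)
    sys.exit(1 if problems else 0)  # non-zero exit fails the commit/build
```

Wired into a git hook or pipeline stage, the same check runs while the developer is still on the change, which is the feedback-timing argument the excerpt makes.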
Every Android device from the last 6 years may be at risk from the RAMPage vulnerability
While not impossible, RAMPage is difficult to exploit in practice on end-user devices, partly because vendor-specific or device-specific issues make it harder to reliably create the conditions that allow for exploitation. Because of the degree of precision involved, the paper noted, it is theoretically possible that the same model of phone with DRAM from different vendors would present different avenues of attack, or that certain optional hardware protections in LPDDR4, if added at manufacturing time, would partially mitigate the attack. Additionally, while the RAMPage attack was only demonstrated on an LG G4, it may be applicable to iOS devices and other devices using LPDDR2, 3, or 4 chips and running software with similar memory management techniques. That said, the researchers have proposed a fix for RAMPage called GuardION. In the whitepaper's tests, they found that GuardION "results in a performance degradation of 6.6%, which we believe is still acceptable."
The Insane Amounts of Data We're Using Every Minute
By the looks of the research, things are only getting bigger. In 2012, there were approximately 2.2 billion active internet users. In 2017, active internet users reached 3.8 billion people -- nearly 48 percent of the world’s population. When it comes to social media, data usage is unsurprisingly high. Since last year, Snapchat alone saw a 294 percent increase in the number of images shared per minute; nearly 2.1 million snaps are shared every 60 seconds. On average, 473,400 tweets, 49,380 Instagram photos and 79,740 Tumblr posts are posted every minute. So who’s behind all this social media madness? Americans have upped their internet usage by 18 percent since 2017; however, it’s not all going to Snapchat and Twitter. Much of it is going to video-streaming services such as Netflix and YouTube. Since last year, Netflix saw a whopping 40 percent increase in streaming hours, going from 69,444 hours to 97,222. And YouTube videos have reached 4.3 million views per minute. Even the peer-to-peer payments app Venmo saw a major data jump, with 32 percent more transactions processed every minute than last year. Overall, Americans use 3.1 million GB of data every minute.
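The growth figures are easy to sanity-check; for instance, the Netflix jump works out to almost exactly 40 percent (labeling the two readings "last year" and "this year" as the excerpt does):

```python
# Netflix streaming hours per minute, from the figures quoted above.
hours_last_year, hours_this_year = 69_444, 97_222
growth = (hours_this_year - hours_last_year) / hours_last_year
print(f"{growth:.1%}")  # -> 40.0%
```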
Building a Robust and Extensive Security Architecture
Building a device security system is the first line of defense in ensuring IoT security. The security capabilities of a device need to be configured to match its functions and computing resources, including memory, storage and CPU. For weak devices, such as water and gas meters, where resources are limited and power consumption and cost are concerns, basic security capabilities are a must. These include two-way authentication, DTLS+, encrypted transmission and remote upgradability, as well as lightweight, secure transmission protocols. Strong devices with more powerful computing capabilities that don’t have power consumption constraints and are operationally critical, such as industrial control terminals and car networking equipment, require advanced security capabilities, including trusted devices, intrusion detection, secure startup, and anti-virus protection. Device chip security and security for lightweight operating systems such as LiteOS need defense capabilities in line with the security protections of strong devices.
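Python's standard library speaks TLS rather than DTLS, but the two-way ("mutual") authentication idea is the same; a minimal sketch of a server that refuses devices lacking a valid certificate (the certificate paths and port are placeholders, and a real deployment would provision per-device certificates):

```python
import socket
import ssl

CA_CERT = "ca.pem"          # CA that signed the device certificates
SERVER_CERT = "server.pem"  # server certificate chain
SERVER_KEY = "server.key"

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile=SERVER_CERT, keyfile=SERVER_KEY)
context.load_verify_locations(cafile=CA_CERT)
# Two-way authentication: the server also demands a client certificate.
context.verify_mode = ssl.CERT_REQUIRED

with socket.create_server(("0.0.0.0", 8883)) as sock:
    with context.wrap_socket(sock, server_side=True) as tls_sock:
        conn, addr = tls_sock.accept()  # handshake authenticates both sides
        print("device authenticated:", conn.getpeercert()["subject"])
        conn.close()
```

On a weak device, the same handshake would typically run over DTLS with a lightweight cipher suite; the point of the sketch is only the CERT_REQUIRED step that makes authentication mutual.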
Quote for the day:
"Remember this: Anticipation is the ultimate power. Losers react; leaders anticipate." -- Tony Robbins