OCR tools are undergoing a quiet revolution as ambitious software providers combine them with AI. As a consequence, data-capture software is simultaneously capturing information and comprehending its content. In practice this means that AI tools can check for mistakes independently of a human user, providing streamlined fault management. But how do these tools work? The answer differs slightly depending on which AI platform you’re using. One detailed case study of how AI is used to enhance OCR can be seen in Infrrd’s work with a global investment firm. Infrrd IDC, a hybrid AI and OCR tool, was used to help manage financial reports: it captured financial reports in various languages and translated them into English. To do this, Infrrd used a combination of machine learning and computer vision algorithms, which analysed document layout during pre-processing to pinpoint which information was to be recorded.
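Layout analysis of this kind typically begins by grouping raw OCR word tokens (text plus bounding box) into lines and regions before any field extraction happens. The following is a minimal sketch of that pre-processing step; the token format and the pixel tolerance are illustrative assumptions, not Infrrd's actual pipeline:

```python
# Group OCR word tokens into text lines by vertical position.
# Each token is (text, x, y, width, height); y grows downward.

def group_into_lines(tokens, y_tolerance=10):
    """Cluster tokens whose vertical centres fall within y_tolerance
    pixels of each other, then sort each cluster left-to-right."""
    lines = []  # each entry: [centre_y, [tokens]]
    for tok in sorted(tokens, key=lambda t: t[2]):
        centre = tok[2] + tok[4] / 2
        for line in lines:
            if abs(line[0] - centre) <= y_tolerance:
                line[1].append(tok)
                break
        else:
            lines.append([centre, [tok]])
    # Join each line's tokens in reading order (left to right).
    return [" ".join(t[0] for t in sorted(line[1], key=lambda t: t[1]))
            for line in lines]

tokens = [
    ("Revenue", 10, 100, 60, 12), ("$1.2M", 120, 102, 40, 12),
    ("Costs", 10, 130, 45, 12), ("$0.8M", 120, 131, 40, 12),
]
print(group_into_lines(tokens))  # → ['Revenue $1.2M', 'Costs $0.8M']
```

Once tokens are organised into lines like this, downstream models can label each line (e.g. as a table row or a heading) and decide which values to record.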
According to analysts, it’s been clear for some time that the server virtualization market is approaching a saturation point. Gartner reported that license revenues for x86 virtualization declined for the first time ever in the first quarter of 2016, with most enterprises reporting data-center virtualization levels of 75% or higher. And by 2017, Gartner declared the server-virtualization market so mature that it stopped doing its annual server-virtualization Magic Quadrant reports altogether. Meanwhile, the threat to VMware goes beyond companies having virtualized pretty much every workload that can be virtualized. In a bid to reduce capital expenditures and increase business agility, organizations are trying to downsize their data centers and shift existing workloads to the cloud, either to SaaS platforms or to cloud infrastructure from AWS or Azure. And as companies decide to go cloud-native for all new applications, they are turning to cutting-edge approaches like containerization, microservices and serverless computing, which don’t require a traditional VM.
First of all, enterprise architects must take a future-oriented perspective. Innovation requires more exploration and risk-taking than architects are typically used to. In many organizations, their chief responsibility is to keep track of the complexities of the current situation, with the purpose of finding opportunities for local improvements, risk reduction or cost savings. Established architecture methods and practices are often aimed at staying in control. You also see this in the terminology used. Often, architects want to design a ‘future-proof’ architecture or system that will easily accommodate any potential future requirements. However, in a volatile environment there is no such thing as a future-proof solution. The only thing you can do is to design something for change, ensuring that change itself is, and stays, as easy as possible. There are trade-offs to consider here. If you want to maximize cost efficiency, sharing resources across your enterprise may be a good idea.
5G will shatter the current 4G speed limitation, increasing it by up to 1,000 times, enabling 8K video applications, or allowing rural subscribers to enjoy the same Internet experience as their urban counterparts. 5G will also drastically lower network application latency from hundreds of milliseconds to just a few—we’re talking single digits—giving rise to near real-time machine-to-machine interaction currently found only in science fiction movies. Surgeries will be performed from the other side of the globe, while fatal car crashes could be virtually eliminated. Fully autonomous robotic factories could request maintenance before any failures occur, while a fleet of drones could apply pesticides to crops with surgical precision. Just as Albert Einstein pushed humanity’s understanding of physics to new heights, 5G will push mankind to achieve new speeds as latency drops, and drive the number of connected devices beyond anything previously imagined.
According to Christopher, it appears that in the rush to adopt RPA, enterprises may not be taking an integrated approach to automation and are failing to comprehensively examine processes before they automate them. “I’ve heard lots of enterprises actually admit this,” said Christopher. “Before one of our recent roundtables on intelligent automation, I was going around the room talking to different business leaders, and lots of them admitted how, when they were starting out, they’d look at processes and say ‘let’s just swap in some automation’, then they realise they’re just left with a different form of a worker doing the same job.” It seems enterprises are leading with a solution before identifying the problem. “Automation should be seen as an opportunity to drive dramatic process improvement,” added Christopher. This, of course, is no mean feat. Let’s say, for example, you’re a multinational producer, and you want to improve your order-to-cash process. From order entry all the way through to the delivery of goods and receipt of payments, it’s a huge project.
What bankers should be worried about is the government--not fintech and Big Tech firms. Specifically, politicians who have no idea: 1) how the banking system works, and 2) what the difference between a Main Street bank and a Wall Street bank is. It's more than just potential regulatory changes that threaten banks, however (not that what some of these politicians want to propose won't be painful). The problem is that it's taken the banking industry roughly 10 years to rebuild its standing with consumers (not counting the one west coast bank that seems to do everything in its power to keep its reputation in the tank). For 10 years I've said that banks wouldn't be in the clear until a new villain came along (you probably don't remember that it was banks, on the heels of the financial crisis, who saved British Petroleum from being the most hated villain after the Gulf oil spill). With the data abuses by Facebook (the British government calls the company "digital gangsters"), and news that Amazon paid no taxes--again--Big Tech is becoming the new villain.
Platform as a service, or PaaS, is one of the biggest trends to look out for in the Fintech space. It will allow solutions to go beyond the cloud computing arena: companies can extend solutions out of the box and add smart customization to satisfy diverse industry needs. Diverse functions like sending and receiving payments, advanced payment services, infrastructure building and enhancement, and unconventional new user experiences are the future of new collaborations in the Fintech industry. According to data sourced in late 2018 from the World Bank, India houses the second largest unbanked population in the world. This is a clear indication that the Government needs to look at non-traditional channels that have the ability to drive change and impact the economy favourably. Indian Fintech companies have been instrumental in creating lean cloud-based solutions to reach out to the masses.
Sen. Mark Warner (D-Va.) sent a letter to several major health care groups on Thursday asking what they have done to prevent cyberattacks and how the federal government can help them address cyber issues. “The increased use of technology in health care certainly has the potential to improve the quality of patient care, expand access to care (including by extending the range of services through telehealth), and reduce wasteful spending,” Warner wrote in the letter, according to a release. “However, the increased use of technology has also left the health care industry more vulnerable to attack.” Warner, the vice chair of the Senate Intelligence Committee and co-chair of the Senate Cybersecurity Caucus, cited a Government Accountability Office report that found that more than 113 million health care records were stolen in 2015 through cyberattacks. The letter was sent to organizations like the American Hospital Association, the American Medical Association, the National Rural Health Association and the Healthcare Leadership Council.
Only a decade and a half ago, companies still had their own data centers. Now, cloud computing has made virtual clusters of computers and storage available to those same firms, and delivery of those services is concentrated among companies like Amazon, Google and Microsoft. “Now,” said Shtilman, “nobody thinks about building this stuff out on their own. They go to these large platforms, and these platforms give you what you need, globally, in any data center — in Singapore, in Europe, in China. We believe that, five years down the road, this is what is going to happen in FinTech,” through a flexible and multi-faceted platform geared toward helping companies — small and large — offer services across geographies. At present, with multi-currency support across 65 holding currencies and 170 payout currencies, Rapyd’s fund collection offerings include cards, cash (which the CEO noted is “still king” in many countries), bank transfers and local eWallets. Fund disbursements include push-to-card and local eWallet options.
The Rookout team describe their breakpoint functionality as "non-breaking breakpoints", as the corresponding application execution does not actually pause or halt as it would with a traditional active debugger. They also state that "no restarts, redeployment or extra coding is required" in order to set these breakpoints, and this can lead to very quick hypothesis testing and bug detection. When a Rookout breakpoint is hit within an application's execution flow, an engineer can view stack traces and global variable values, as well as specify individual variable "watches". InfoQ learned that in the case of Java debugging, the underlying mechanism that provides the breakpoint functionality is based on java.lang.instrument, which allows Java programming language agents to instrument programs running on the JVM. The instrumentation of an application is accomplished by adding a Rookout dependency to the codebase, e.g. via Maven or Gradle.
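Rookout's Java mechanism relies on java.lang.instrument agents, but the general idea of a "non-breaking breakpoint" can be illustrated more simply. The sketch below uses Python's `sys.settrace` to snapshot a function's local variables when it returns, without ever pausing execution; this is a conceptual analogue only, not Rookout's implementation, and the function names are made up:

```python
import sys

captured = []  # snapshots collected by the non-breaking breakpoint

def set_nonbreaking_breakpoint(func_name):
    """Install a trace hook that records the local variables of
    func_name each time it returns -- without halting the program."""
    def local_tracer(frame, event, arg):
        if event == "return":
            captured.append(dict(frame.f_locals))  # snapshot, don't pause
        return local_tracer

    def global_tracer(frame, event, arg):
        if event == "call" and frame.f_code.co_name == func_name:
            return local_tracer  # trace only the target function's frames
        return None

    sys.settrace(global_tracer)

def price_with_tax(amount):
    tax = amount * 0.2
    total = amount + tax
    return total

set_nonbreaking_breakpoint("price_with_tax")
result = price_with_tax(100)   # runs at full speed, state is captured
sys.settrace(None)             # uninstall the hook
print(result, captured)
```

A production tool like Rookout does the equivalent with bytecode instrumentation rather than a trace hook, which keeps the overhead low enough for live environments.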
Quote for the day:
"Money can't buy happiness, but it can make you awfully comfortable while you're being miserable." -- Clare Boothe Luce