Daily Tech Digest - February 24, 2019

AI and OCR: How optical character recognition is being revitalised

OCR tools are undergoing a quiet revolution as ambitious software providers combine them with AI. As a consequence, data capturing software is simultaneously capturing information and comprehending the content. In practice this means that AI tools can check for mistakes independently of a human user, providing streamlined fault management. But how do these tools work? The answer differs slightly depending on which AI platform you're using. One detailed case study of how AI is used to enhance OCR can be seen in Infrrd's work with a global investment firm. Infrrd IDC, a hybrid AI and OCR tool, was used to help manage financial reports. The tool was used to copy financial reports in various languages and translate them into English. To do this, Infrrd used a combination of machine learning and computer vision algorithms. These algorithms were used to analyse document layout during pre-processing to pinpoint what information was to be recorded.
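The article does not detail the fault-management mechanics, but the idea can be sketched: the OCR engine emits tokens with confidence scores, and an automated check routes low-confidence tokens for review instead of accepting them silently. Everything below (the function name, threshold and sample tokens) is illustrative, not Infrrd's actual API.

```python
# Hypothetical sketch of AI-assisted OCR post-processing: each token arrives
# from the OCR engine with a confidence score, and tokens below a threshold
# are routed to a fault-management queue rather than silently accepted.
def flag_low_confidence(tokens, threshold=0.85):
    """Split (text, confidence) pairs into accepted and flagged lists."""
    accepted, flagged = [], []
    for text, confidence in tokens:
        (accepted if confidence >= threshold else flagged).append(text)
    return accepted, flagged

ocr_output = [("Revenue", 0.98), ("12,4O0", 0.61), ("EUR", 0.95)]
accepted, flagged = flag_low_confidence(ocr_output)
# "12,4O0" (a likely O/0 confusion) ends up in the flagged list for review.
```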


VMware’s ongoing reinvention

According to analysts, it’s been clear for some time that the server virtualization market is approaching a saturation point. Gartner reported that license revenues for x86 virtualization declined for the first time ever in the first quarter of 2016, with most enterprises reporting data-center virtualization levels of 75% or higher. And by 2017, Gartner declared the server-virtualization market so mature that it stopped doing its annual server-virtualization Magic Quadrant reports altogether. Meanwhile, the threat to VMware goes beyond companies having virtualized pretty much every workload that can be virtualized. In a bid to reduce capital expenditures and increase business agility, organizations are trying to downsize their data centers and shift existing workloads to the cloud, either on SaaS platforms or cloud infrastructure from AWS or Azure. And as companies decide to go cloud-native for all new applications, they are turning to cutting-edge approaches like containerization, micro-services and serverless computing, which don’t require a traditional VM.


How Enterprise Architects Can Contribute to Innovation

First of all, enterprise architects must take a future-oriented perspective. Innovation requires more exploration and risk-taking than architects are typically used to. In many organizations, their chief responsibility is to keep track of the complexities of the current situation, with the purpose of finding opportunities for local improvements, risk reduction or cost savings. Established architecture methods and practices are often aimed at staying in control. You also see this in the terminology used. Often, architects want to design a ‘future-proof’ architecture or system that will easily accommodate any potential future requirements. However, in a volatile environment there is no such thing as a future-proof solution. The only thing you can do is to design something for change, ensuring that change itself is, and stays, as easy as possible. There are trade-offs to consider here. If you want to maximize cost efficiency, sharing resources across your enterprise may be a good idea. 


5G-Ready Network Today Requires a Secure, Automated Cloud Architecture

5G will shatter the current 4G speed limitation, increasing it by up to 1,000 times, enabling 8K video applications or allowing rural subscribers to enjoy the same Internet experience as their urban counterparts. 5G will also drastically lower network application latency from hundreds of milliseconds to just a few—we’re talking single digits—giving rise to near real-time machine-to-machine interaction currently found only in science fiction movies. Surgeries will be performed from the other side of the globe, while fatal car crashes could be virtually eliminated. Fully autonomous robotic factories could request maintenance before any failures occur, while a fleet of drones could apply pesticides to crops with surgical precision. Just as Albert Einstein pushed humanity’s understanding of physics to new heights, 5G will push mankind to achieve new speeds as latency drops and drive the number of connected devices beyond anything previously imagined.


Scaling RPA: before automating processes, improve them

According to Christopher, it appears that in the rush to adopt RPA, enterprises may not be taking an integrated approach to automation and are failing to comprehensively examine processes before they automate them. “I’ve heard lots of enterprises actually admit this,” said Christopher. “Before one of our recent roundtables on intelligent automation, I was going around the room talking to different business leaders and lots of them admitted how when they were starting out they’d look at processes and say ‘let’s just swap in some automation’, then they realise they’re just left with a different form of a worker doing the same job.” It seems enterprises are leading with a solution before identifying the problem. “Automation should be seen as an opportunity to drive dramatic process improvement,” added Christopher. This, of course, is no mean feat. Let’s say, for example, you’re a multi-national producer, and you want to improve your order to cash process. From order entry all the way through to the delivery of goods and receipt of payments, it’s a huge project. 


The Biggest Threat To Banks Isn't Fintech Or Big Tech--It's The Government

What bankers should be worried about is the government--not fintech and Big Tech firms. Specifically, politicians who have no idea: 1) How the banking system works, and 2) What the difference between a Main Street bank and a Wall Street bank is. It's more than just potential regulatory changes that threaten banks, however (not that what some of these politicians want to propose won't be painful). The problem is that it's taken the banking industry roughly 10 years to rebuild its standing with consumers (not counting the one west coast bank that seems to do everything in its power to keep its reputation in the tank). For 10 years I've said that banks wouldn't be in the clear until a new villain came along (you probably don't remember that it was banks, on the heels of the financial crisis, who saved British Petroleum from being the most hated villain after the Gulf oil spill). With the data abuses by Facebook (the British government calls the company "digital gangsters"), and news that Amazon paid no taxes--again--Big Tech is becoming the new villain.


Top Fintech Trends Revamping Financial Technology

Platform as a service, or PaaS, is one of the biggest trends to look out for in the Fintech space. This will allow solutions to go beyond the cloud computing arena. Companies can extend solutions out of the box and add smart customization to satisfy diverse industry needs. Diverse functions like sending and receiving payments, advanced payment services, infrastructure building and enhancement, and unconventional new user experiences are the future of new collaborations in the Fintech industry. According to data sourced in late 2018 from the World Bank, India houses the second largest unbanked population in the world. This is a clear indication that the Government needs to look at non-traditional channels that have the ability to drive change and impact the economy favourably. Indian Fintech companies have been instrumental in creating lean cloud-based solutions to reach out to the masses.


Warner questions health care groups on cybersecurity

Sen. Mark Warner (D-Va.) sent a letter to several major health care groups on Thursday asking what they have done to prevent cyberattacks and how the federal government can help them address cyber issues. “The increased use of technology in health care certainly has the potential to improve the quality of patient care, expand access to care (including by extending the range of services through telehealth), and reduce wasteful spending,” Warner wrote in the letter, according to a release. “However, the increased use of technology has also left the health care industry more vulnerable to attack.” Warner, the vice chair of the Senate Intelligence Committee and co-chair of the Senate Cybersecurity Caucus, cited a Government Accountability Office report that found that more than 113 million health care records were stolen in 2015 through cyberattacks. The letter was sent to organizations like the American Hospital Association, the American Medical Association, the National Rural Health Association and the Healthcare Leadership Council.


FinTech-As-A-Service Eyes Global Payments Simplification Makeover

Only a decade and a half ago, companies still had their own data centers. Now, cloud computing has made virtual clusters of computers and storage available to those same firms, and delivery of those services is concentrated among companies like Amazon, Google and Microsoft. “Now,” said Shtilman, “nobody thinks about building this stuff out on their own. They go to these large platforms, and these platforms give you what you need, globally, in any data center — in Singapore, in Europe, in China. We believe that, five years down the road, this is what is going to happen in FinTech,” through a flexible and multi-faceted platform geared toward helping companies — small and large — offer services across geographies. At present, with multi-currency support across 65 holding currencies and 170 payout currencies, Rapyd’s fund collection offerings include cards, cash (which the CEO noted is “still king” in many countries), bank transfers and local eWallets. Fund disbursements include push-to-card and local eWallet options.


Debugging Microservices Running in Containers: Tooling Review at KubeCon NA

The Rookout team describe their breakpoint functionality as "non-breaking breakpoints", as the corresponding application execution does not actually pause or halt as it would with a traditional active debugger. They also state that "no restarts, redeployment or extra coding is required" in order to set these breakpoints, and this can lead to very quick hypothesis testing and bug detection. As a result of a Rookout breakpoint being hit within an application's execution flow, an engineer can view stack traces and global variable values, as well as specify individual variable "watches". InfoQ learned that in the case of Java debugging, the underlying mechanism that provides the breakpoint functionality is based on java.lang.instrument, which allows Java programming language agents to instrument programs running on the JVM. The instrumentation of an application is accomplished by adding a Rookout dependency to the codebase e.g. via Maven or Gradle.
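Rookout's Java implementation rests on java.lang.instrument, but the "non-breaking breakpoint" idea, capturing state without halting execution, can be sketched in plain Python with the standard-library sys.settrace hook. This is an analogy, not Rookout's code: local-variable snapshots are recorded as lines execute, and the program never pauses.

```python
import sys

# Snapshots captured as the target function runs; execution is never halted.
snapshots = []

def non_breaking_breakpoint(frame, event, arg):
    # Record the locals of `compute` on every line event, then keep tracing.
    if event == "line" and frame.f_code.co_name == "compute":
        snapshots.append(dict(frame.f_locals))  # copy locals, don't pause
    return non_breaking_breakpoint

def compute(x):
    y = x * 2
    return y + 1

sys.settrace(non_breaking_breakpoint)
result = compute(5)
sys.settrace(None)
# result is 11; snapshots holds the captured local-variable states,
# including one where y == 10, without any debugger pause.
```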



Quote for the day:


"Money can't buy happiness, but it can make you awfully comfortable while you're being miserable." --  Clare Boothe Luce


Daily Tech Digest - February 23, 2019

Why Not All FinTech Providers Are FinTech Firms

“There’s often a disconnect between what a business shows and what’s going on underneath, especially when it comes to technology. It’s almost like an iceberg: The client sees a small, nimble solution, but underneath looms a technology monolith that’s difficult to turn and slow to get around,” Thomas said. He also weighed in on the role financial technology vendors have to play in the future. “As technology companies, we have to acknowledge that we’re playing a leading role in how the industry develops. How can we expect banks to transform if they’re relying on technology dodos instead of agile, forward-thinking FinTechs?” The central tenet here, for all traditional financial services firms, including technology vendors, is recognizing the need not just for change, but a shift to a more agile approach where they can deliver products with the same speed as their FinTech competitors. “It’s no longer good enough to talk the FinTech talk; you have to be able to walk the walk,” said Thomas.


Misconfiguration Leads to Major Health Data Breach

The misconfigured database at UW Medicine was the result of a coding error when data was being moved onto a new server, a UW Medicine spokeswoman tells Information Security Media Group. The organization is not offering free credit or ID monitoring services because the exposed files contained no Social Security numbers, patient financial information or medical records, the spokeswoman says. The files contained protected health information that UW Medicine is legally required to track in order to comply with, for example, Washington state reporting requirements, the statement says. The exposed information included patients' names, medical record numbers, and a description and purpose of the information shared for regulatory reporting purposes. "The database is used to keep track of the times UW Medicine shares patient health information that meets certain legal criteria," the statement says. The most common reasons involve situations where UW Medicine is required by Washington state law to share patient information with public health authorities, law enforcement and Child Protective Services, the organization notes.


On the future of blockchain and its impact on banking

Quoting his own understanding of blockchain, Balakrishnan says, "Trade finance is the only justified use case which will give a RoI, where people will allow to be impacted, as it being a genuine problem across." ... As far as payment companies are concerned, the barrier to them adopting blockchain would be the legacy systems that have to undergo major change to shift to a newer platform, but a visionary would pave the way for it. The lack of standardization across organizations will ensure that the adoption and change will happen in the banks first as an efficiency mechanism and then play out in other segments. Another avenue would be consortium lending, as Balakrishnan explains, "Multiple banks can come together and look at consortium lending, with assets being clear, reducing frauds, a typical NPA story, a TPA account where money should flow through--can we build a mechanism where all of us could access it and the primary lending institution can play the role of a conveyor and the rest of the stuff is available to us."


BlackBerry acquires Cylance to cement security capability


“Today, BlackBerry took a giant step forward toward our goal of being the world’s largest and most trusted AI [artificial intelligence]-cyber security company,” said John Chen, executive chairman and CEO of BlackBerry. “Securing endpoints and the data that flows between them is absolutely critical in today’s hyper-connected world. By adding Cylance’s technology to our arsenal of cyber security solutions, we will help enterprises intelligently connect, protect and build secure endpoints that users can trust.” Cylance’s machine learning and AI technology is a strategic addition to BlackBerry’s end-to-end secure communications portfolio. In particular, Cylance’s embeddable AI technology is expected to accelerate the development of BlackBerry Spark, the secure communications platform for the internet of things. Designed for ultra security and industry-specific safety certifications, such as ISO 26262 in vehicles, BlackBerry Spark taps into the company’s existing security portfolio of technology that includes FIPS-validated, app-level, AES 256-bit encryption to ensure data is always protected.


Most popular programming language frameworks and tools for machine learning

More than 1,300 people, mainly working in the tech, finance and healthcare sectors, revealed which machine-learning technologies they use at their firms in a new O'Reilly survey. The list is a mix of software frameworks and libraries for data science favorite Python, big data platforms, and cloud-based services that handle each stage of the machine-learning pipeline. Most firms are still at the evaluation stage when it comes to using machine learning, or AI as the report refers to it, and the most common tools being implemented were those for 'model visualization' and 'automated model search and hyperparameter tuning'. Unsurprisingly, the most common form of ML being used was supervised learning, where a machine-learning model is trained using large amounts of labelled data. For instance, a computer-vision model tasked with spotting people in video might be trained on images annotated to indicate whether they contain a person.
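As a toy illustration of supervised learning, here is a pure-Python 1-nearest-neighbour classifier "trained" on a handful of labelled points; the feature vectors stand in for the annotated images in the survey's example, and all data and labels are invented.

```python
# Minimal supervised learning: classify a new point by the label of its
# nearest labelled training example (1-nearest-neighbour).
def predict(labelled_data, point):
    """labelled_data: list of (features, label); returns predicted label."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(labelled_data, key=lambda pair: sq_dist(pair[0], point))
    return label

training = [((0, 0), "no_person"), ((1, 0), "no_person"), ((9, 9), "person")]
print(predict(training, (8, 8)))  # -> person
```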


Calculating Quantum Computing's Future


The most popular approach to quantum computing uses superconducting electronic circuits, piggybacking on the foundations of the semiconductor industry. Whereas ordinary computers encode information as silicon-inscribed bits, either “zeros” or “ones,” quantum computers use quantum bits, or “qubits” (pronounced cue-bits). These particles, weirdly, inhabit multiple states at once. To keep them in flux, they must be kept isolated and cold. Very, very cold. “What you’re looking at is the world’s most expensive refrigerator,” says Bob Sutor, head of quantum strategy at IBM, while gesturing at a 20-qubit quantum computer the company unveiled in January. Despite its small size, Rigetti, founded by a physicist who previously built quantum computers at IBM, believes it can challenge the titans. The company sells a quantum computing cloud service to researchers who are racing to be the first to achieve “quantum advantage,” when a quantum computer outperforms a traditional one.
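The "multiple states at once" behaviour can be made concrete with a few lines of arithmetic: a single qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal sketch of an equal superposition:

```python
import math

# A qubit state is (a, b) with |a|^2 + |b|^2 = 1; measurement yields 0 with
# probability |a|^2 and 1 with probability |b|^2. Equal superposition:
a = b = 1 / math.sqrt(2)
p_zero, p_one = abs(a) ** 2, abs(b) ** 2
print(p_zero, p_one)  # each is ~0.5: the qubit is "both" until measured
```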


Big Data, AI & IoT, Part Three: What's Stopping Us?

This series of articles has looked at the promise of Big Data, AI, and IoT, and how they all make up one ecosystem. So after looking at the benefits of these technologies in specific environments, it is worth a review of the obstacles faced before they will realize their full potential. Business leaders and media outlets alike have begun to clamor around the promise of AI, Big Data and IoT as if they are a magic bullet that will solve the world’s problems. But no technology exists in a vacuum, and the potential impact of these technologies is currently mitigated by barriers such as standardization, a lack of understanding, and unrealistic expectations at the top of many organizations. Wading through these issues is a challenge, but businesses, enterprises and governments are starting to realize that cooperation and steady progress will bring a quicker win than rushing in head first. Any new technology faces a host of issues in development and rollout, but looking at the current IoT landscape can shed some light on the challenges facing adopters of these particular emerging technologies.


Criminals, Nation-States Keep Hijacking BGP and DNS

DNS is also being abused for cyber espionage. In November 2018, Cisco Talos said it had spotted such a campaign targeting government domains in Lebanon and the United Arab Emirates. "We are naming it DNSpionage due to the fact that it supports DNS tunneling as a covert channel to communicate with the attackers' infrastructure," Talos said. In January, FireEye documented a global DNS hijacking campaign "that has affected dozens of domains belonging to government, telecommunications and internet infrastructure entities across the Middle East and North Africa, Europe and North America," possibly sponsored by Iran. As security blogger Brian Krebs has reported, one problem with attacks that utilize DNS is that few companies monitor for malicious DNS changes. Woodward says that's a problem with BGP hijacking as well. While large, well-resourced organizations may quickly spot any such hijacking, service providers in small countries may not.
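A basic monitor for the kind of malicious DNS change described above can be as simple as diffing freshly resolved records against a known-good baseline. The sketch below uses invented hostnames and addresses, and takes the resolved records as input rather than performing live lookups:

```python
# Diff observed DNS answers against a known-good baseline; any host whose
# records differ is a candidate hijack worth alerting on.
def detect_dns_changes(baseline, observed):
    """baseline/observed: {hostname: [record, ...]}; returns changed hosts."""
    return sorted(host for host, records in observed.items()
                  if set(records) != set(baseline.get(host, [])))

baseline = {"mail.example.com": ["198.51.100.7"]}
observed = {"mail.example.com": ["203.0.113.66"]}  # resolution was redirected
print(detect_dns_changes(baseline, observed))  # -> ['mail.example.com']
```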


Post-Breach HIPAA Enforcement: A Call for 'Safe Harbors'

Among its other breach-related suggestions, CHIME also recommends "amending [HIPAA] language around the responsibilities of business associates by adding that for breaches that start with them they must bear responsibility." That includes notification of media and breach reporting to HHS. Under the current HIPAA rules, covered entities are responsible for notification of breaches by their business associates. The AHA offers similar safe harbor suggestions. "Despite complying with HIPAA rules and implementing best practices, hospitals and healthcare providers will continue to be the targets of sophisticated cyberattacks, and some attacks will inevitably succeed," AHA writes. Whether exploiting previously unknown vulnerabilities or taking advantage of an organization with limited resources, attackers will continue to be successful, AHA notes. "The AHA believes that victims of attacks should be given support and resources, and enforcement efforts should rightly focus on investigating and prosecuting the attackers," AHA writes.


Uber Open-Sources Ludwig Code-Free Deep-Learning Toolkit

Ludwig is built on top of Google's TensorFlow deep-learning library. There are other "wrappers" of TensorFlow that provide friendly interfaces, such as Keras or Gluon. However, these still require users to define their neural networks by writing code (usually Python). Ludwig pre-packages a large number of popular deep-learning patterns, which can be combined and configured using a YAML file. A large class of deep-learning solutions for vision and speech problems follow an "encoder/decoder" pattern. In this pattern, the input is converted from raw data into a tensor representation, which is then fed into one or more layers of a neural network. The layer types depend on the input data. For example, image data is often fed into a convolutional neural network (CNN), while text data is fed into a recurrent neural network (RNN). Likewise, the output of the network is converted from tensors back into output data, often passing through RNN layers (if the output is text) or some other common layer type.
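Ludwig's configuration is not reproduced in the article, but its shape can be sketched; here the declaration is shown as a Python dict rather than YAML, with invented feature names, following the encoder/decoder pattern the article describes (consult Ludwig's own documentation for the exact schema):

```python
# Hedged sketch of a Ludwig-style declarative model definition: inputs and
# outputs are declared by name, type and encoder/decoder; no network code.
model_definition = {
    "input_features": [
        {"name": "product_image", "type": "image", "encoder": "stacked_cnn"},
        {"name": "review_text", "type": "text", "encoder": "rnn"},
    ],
    "output_features": [
        {"name": "summary", "type": "text", "decoder": "generator"},
    ],
}
# From such a declaration, the toolkit would wire a CNN encoder for the
# image, an RNN encoder for the text, and an RNN-based text decoder.
```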



Quote for the day:


"The test we must set for ourselves is not to march alone but to march in such a way that others will wish to join us." -- Hubert Humphrey


Daily Tech Digest - February 22, 2019

Cloud Washing: How to Spot It and How to Avoid It
Cloud washing occurs when software providers attach the “cloud” label to a program in an effort to rebrand or boost sales. These programs differ from cloud-native software, which are built specifically for the cloud. Cloud washed tools take advantage of companies that want to integrate cloud solutions into their infrastructure. As such, they should be avoided at all costs. Though cloud washing is a serious problem, it can be easy to find if you know what you’re looking for. We’ve put together some tips on how to tell if a program is cloud washed and what cloud-native tools actually look like. Companies cloud wash by claiming that legacy software (old or outdated software that continues to be supported) is cloud software. Whether it’s a ploy to attract uninformed customers or evidence of a lack of cloud understanding, marketers will attach the word “cloud” to their old programs to boost sales. This tends to happen with tools that connect to the Internet.


Why AI Transformation Is Digital Transformation, Fully Realized

We now know that data collected from one channel needs to inform efforts in every other channel and that technologies that were introduced as channel-specific tools now need to work across entire organizations — something even the marketing clouds have trouble with. Because of the way marketing technology has evolved, marketers are left managing very complicated tech stacks comprised of multiple technologies, stitched together to complete what should be seamless and interconnected marketing processes. It’s no wonder that even though companies have more technology at their disposal than at any other point in history, only 39% of executives today say they feel they have the digital capabilities they need to compete. As someone who has spent the last decade reimagining how to process, analyze and act on audience, channel and tactic data at scale, I believe the introduction of artificial intelligence (AI) will be the final tipping point for marketing’s digital transformation — despite challenges that remain. Here’s how.


How elite investors use artificial intelligence and machine learning to gain an edge


"The rise of machine learning will really make our industry unrecognizable in the future," said Anthony Cowell, head of asset management for KPMG in the Cayman Islands. His clients include some of the world's largest asset managers, hedge funds and private-equity firms. For instance, Citi Private Bank has deployed machine learning to help financial advisors answer a question they're frequently asked: What are other investors doing with their money? By using technology, the bank can anonymously share portfolio moves being made by clients all over the planet. "Traditionally that kind of information was sourced from your network. You might have had a few coffees or heard about it over a cocktail," Philip Watson, head of the global investment lab at Citi and chief innovation officer at Citi Private Bank, told CNN Business. "Now, we can share insight that is very valuable."
Citi also built a recommender engine that uses machine learning tools to advise clients.


Behind-the-scenes look at 5G Evolution

Back then we thought about higher bitrates, increased spectrum efficiency, etc. We also had a few ideas that we were not able to get into the LTE standard due to backward compatibility issues – the key feature being what we call "lean carrier". We also put energy efficiency high on the requirement list. Most importantly, we saw a need for solutions that could support all kinds of communication needs, way beyond traditional services; for example, serving the forecasted massive IoT market was one of our key requirements. And, in order to really stretch our design, we added support for critical machine type communication, which is now known as URLLC (ultra-reliable low latency communication). ... 5G will also be combined with edge computing and 3rd party applications running close to the devices. On the edge, AI will be able to learn and control most of our infrastructure in smart cities and smart manufacturing in factories. So the transformational impact of 5G will be enormous compared to the previous Gs.


4 Promising Use Cases Of Blockchain In Cybersecurity


Hackers often gain access to systems by exploiting weaknesses in edge devices. These include routers and switches. Now, other devices such as smart thermostats, doorbells, even security cameras are also vulnerable. Simply put, the same rigour is often not applied to ensuring that these IoT devices are secure. Blockchain technology can be used to protect systems and devices from attacks. According to Joseph Pindar, co-founder of the Trusted IoT Alliance, blockchain can give those IoT devices enough “smarts” to make security decisions without relying on a central authority. For instance, devices can form a group consensus regarding the normal occurrences within a given network, and lock down any nodes that behave suspiciously. Blockchain technology can also protect all the data exchanges happening between IoT devices. It can be used to attain near real-time secure data transmissions and ensure timely communication between devices located thousands of miles apart.
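The group-consensus idea reduces to a toy majority vote: each device reports what it observes on the network, and any device whose report deviates from the majority view becomes a lockdown candidate. No blockchain machinery appears below, just the voting logic the article alludes to, with invented device names and states:

```python
from collections import Counter

# Majority-vote consensus: devices whose observations deviate from the
# majority view of "normal" network behaviour are flagged for lockdown.
def nodes_to_lock_down(reports):
    """reports: {device_id: observed_state}; returns deviating device ids."""
    majority_state, _ = Counter(reports.values()).most_common(1)[0]
    return sorted(dev for dev, state in reports.items()
                  if state != majority_state)

reports = {"thermostat": "normal", "doorbell": "normal", "camera": "port_scan"}
print(nodes_to_lock_down(reports))  # -> ['camera']
```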


Thales to sell nCipher to Entrust Datacard


This deal with Entrust Datacard is expected to close during the second quarter of 2019, subject to the successful completion of the acquisition of Gemalto by Thales and the approval of Entrust as a suitable purchaser by the European Commission, US Department of Justice, Australian Competition and Consumer Commission, and New Zealand Commerce Commission. Thales said the deal will enable nCipher Security, which has more than 300 employees and reported more than €100m in revenues in 2018, to “continue to deliver innovative solutions and services and strengthen its market leadership”. It added that Entrust Datacard is a global leader in public key infrastructure (PKI) solutions and services, and the primary use case for GP HSMs in protecting infrastructure private keys such as root and issuing certification authorities keys. “This makes Entrust Datacard the ideal organisation for Thales to divest this business, ensuring its leadership position in the GP HSMs market and providing trust, integrity and control to business-critical applications,” the company said.


How AI can help to prevent the spread of disinformation

Disinformation has spawned a new sub-industry within journalism, with fact checkers working around the clock to analyse politicians’ speeches, articles from other publications and news reports, and government statistics among much else. But the sheer volume of disinformation, together with its ability to multiply and mutate like a virus on a variety of social platforms, means that thorough fact-checking is only possible on a tiny proportion of disputed articles. While technology has provided the seedbed and distribution for disinformation, it also offers a solution to the issue. Artificial intelligence in particular offers powerful tools in the fight against disinformation, working on multiple levels to identify dubious content. These techniques are broadly split between content-based and response-based identification. The former works much like a human fact checker, by matching the content of an article with trusted sources of information to highlight errors or outright lies.
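A stripped-down version of content-based identification is plain word-overlap matching of a claim against trusted statements. Real systems use far richer NLP than Jaccard similarity, and the statements below are invented, but the matching principle is the same:

```python
# Score a claim against trusted statements by word overlap (Jaccard
# similarity), and return the closest trusted statement for comparison.
def jaccard(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def best_match(claim, trusted_statements):
    return max(trusted_statements, key=lambda s: jaccard(claim, s))

trusted = ["unemployment fell to 4 percent last quarter",
           "the budget deficit grew last quarter"]
print(best_match("unemployment fell sharply last quarter", trusted))
# best match is the unemployment statement, which a checker then compares
# against the claim to highlight discrepancies
```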


10 Principles for Modernizing Your Company’s Technology


Use cross-functional teams to plan and design this modernization effort. Functional experts from areas such as IT, strategy, R&D, customer interaction, and operations can all work together in an agile “sandbox” environment to design the changes around a set of coordinated specifications. In this early stage, and throughout the initiative, you thus link leading-edge knowledge of the changing technology with deep, day-to-day awareness of the desired results. As you bring these teams together, you will establish a shared frame of reference — a common language to describe the features you want and the capabilities you are building. This also will help engage new stakeholders as they join in the effort. A major transportation company revamped its online system this way, improving the integration between the website that handled passenger bookings and the back-office functions that, among other things, routed travel. In its intensive sandbox sessions, the company set up temporary cross-functional working groups, which became known as “tribes.”


Cisco warns on HyperFlex security vulnerabilities

“An attacker could exploit this vulnerability by connecting to the cluster service manager and injecting commands into the bound process,” Cisco wrote in its Security Advisory. Cisco says that the vulnerability is due to insufficient input validation in Cisco HyperFlex software releases prior to 3.5. Such input can impact the control flow or data flow of a program and cause a number of resource control problems. Cisco has released a software update to address this vulnerability and said that there are no other workarounds to address this exposure. The second vulnerability – rated 8.1 on Cisco's scale – is a snafu in the hxterm service of Cisco HyperFlex Software that could let an attacker connect to the service as a non-privileged, local user. A successful exploit could allow the attacker to gain root access to all member nodes of the HyperFlex cluster in Cisco HyperFlex software releases prior to 3.5, according to the security advisory.
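Insufficient input validation before a value reaches a command line is a general class of bug, not specific to HyperFlex. A hedged illustration of the defensive pattern (the allow-list regex and command name are invented, not Cisco's fix): validate against an allow-list and pass arguments as a list so no shell ever parses them.

```python
import re

# Only plain node names are allowed through; anything with shell
# metacharacters (spaces, semicolons, pipes) is rejected up front.
SAFE_NODE_NAME = re.compile(r"^[A-Za-z0-9._-]+$")

def build_command(node_name):
    """Build an argv list for a (hypothetical) cluster tool."""
    if not SAFE_NODE_NAME.match(node_name):
        raise ValueError("rejected suspicious input")
    # List form means the value is a single argument, never shell-parsed.
    return ["cluster-tool", "status", node_name]

print(build_command("node-01"))  # -> ['cluster-tool', 'status', 'node-01']
```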


How and why the data privacy mandate is expanding


A battle is also brewing in the US over state and federal privacy laws. Several states have passed laws aimed at data privacy and ethical use. The most prominent and restrictive of these is the California Consumer Privacy Act of 2018, set to take effect in 2020 and billed as the toughest data privacy law in the country (incorporating many GDPR-like restrictions). Many companies have lobbied against this and other state bills, pushing for less restrictive measures and asking that a uniform federal law supersede all state legislation. To this end, both the US Chamber of Commerce and the Internet Association, which represents companies like Amazon, Facebook, Google, and Twitter, have released their own recommendations for a federal bill. The Data Care Act introduced by a group of US senators, a competing congressional bill, the Information Transparency and Personal Data Control Act, and the White House recommendations round out the plethora of proposals.



Quote for the day:


"The ultimate measure of a man is not where he stands in moments of comfort, but where he stands at times of challenge and controversy." -- Martin Luther King, Jr.


Daily Tech Digest - February 21, 2019

Data Mining — What, Why, How?


Data mining sits at the intersection of statistics (analysis of numerical data), artificial intelligence / machine learning (software and systems that perceive and learn like humans based on algorithms), and databases. Translating these into technical skills leads to requiring competency in Python, R, and SQL, among others. In my opinion, a successful data miner should also have business context and knowledge and other so-called soft skills (teamwork, business acumen, communication, etc.) in addition to the above-mentioned technical skills. Why? Remember that data mining is a tool with the sole purpose of achieving a business objective by accelerating predictive capabilities. Pure technical skill will not accomplish that objective without some business context. The following article from KDnuggets supports my point: data mining job advertisements mention terms such as team skills, business acumen, and analytics very frequently. The same article also has SQL, Python, and R at the top of the list of technical skills.



Two Sides of a Coin: Blockchain, Ethics and Human Rights

What does it mean to say that a technology is evil? Given Krugman’s arguments, it’s easy to see what he meant: bitcoin is used exclusively for acts which are morally bad; hence, bitcoin is itself evil. As an ethical argument, this is willfully ignorant; you don’t need a Nobel Prize to find examples of blockchain being used for social good. But, interestingly, the underlying thought pattern – that bitcoin is evil because it brings about bad consequences – is an example of a legitimate moral theory known as consequentialism. If Krugman was arguing along consequentialist lines, his error lies in disregarding bitcoin’s positive aspects and in the failure to make the assumption of this ethical framework explicit. Intrigued, we started searching the academic databases for ethical frameworks applied to blockchain, but found nothing. Yet we kept finding controversies surrounding certain blockchain use cases which relied implicitly on the ethical frameworks that philosophers have developed over thousands of years.


Zuckerberg Eyeing Blockchain For Facebook Login And Data Sharing


In the interview, Zuckerberg said that authentication was a use of blockchain that he is potentially interested in. However, he caveated it by saying: “I haven’t found a way for this to work.” He added: “You basically take your information, you store it on some decentralized system, and you have the choice of whether to log in in different places, and you’re not going through an intermediary.” “There’s a lot of things that I think would be quite attractive about that. For developers, one of the things that is really troubling about working with our system, or Google’s system for that matter, or having to deliver services through Apple’s App Store is that you don’t want to have an intermediary between serving the people who are using your service and you.” “Where someone can just say 'hey, we as a developer have to follow your policy and if we don’t, then you can cut off access to the people we are serving'. That’s kind of a difficult and troubling position to be in.”


Power over Wi-Fi: The end of IoT sensor batteries?

The researchers believe that harvesting 150 microwatts of power (the power level of a typical Wi-Fi signal) with one of the rectennas could produce around 40 microwatts of electricity—enough to power a chip. Scaling the system to a vehicle, data center hall, or similar-sized setup, which they say is possible in part because their MoS2 material is thin and flexible, would conceivably generate commensurate power. The researchers also say the non-rigid, battery-free system is better than others’ attempts at rectennas because it captures “daily” signals such as “Wi-Fi, Bluetooth, cellular LTE, and many others," says Xu Zhang, of collaborator Carnegie Mellon University, in the article. Other radio-frequency-to-power converters, which are thick and inflexible, aren’t wideband enough, the groups say. Of course, radio waves already power some chips. RFID tags are an example. But those solutions are limited in their power and, therefore, range and bandwidth, which is why the search is on for something better.
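The arithmetic behind those figures is easy to check; a quick sketch (the variable names are mine, not the researchers'):

```python
# Back-of-the-envelope check of the figures quoted above.
incident_uw = 150.0   # power of a typical Wi-Fi signal reaching the rectenna (microwatts)
harvested_uw = 40.0   # electricity one rectenna is expected to produce (microwatts)

efficiency = harvested_uw / incident_uw
print(f"Implied RF-to-DC conversion efficiency: {efficiency:.1%}")
```

That works out to roughly 27% conversion efficiency from ambient Wi-Fi to usable DC power.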


UK committed to working with EU cyber security partners


Within the cyber security sphere, Martin said it was “objectively true” that nearly all the functions of the NCSC fall outside the scope of EU competence. “It follows that our enhanced cooperation with European partners, and the EU as a whole, in cyber security over recent years is not automatically affected by the UK’s changing relationship with the EU,” he said. “Pretty much everything we do now to help European partners, and what you do to help us, on cyber security can, should, and I am confident will, continue beyond 29 March.” In the past, said Martin, the UK has shared classified and other threat data with EU member states and institutions and played a role in the development of European thinking in areas such as standards and incident response.


What organizations can do to mitigate threats to data management

Adding granular encryption with BYOK (Bring Your Own Key) is an effective weapon in breach prevention. If even an administrator or engineer who manages data in an organization cannot read that data, a hacker will be stopped cold – he may be effective in stealing the data, but not in using it for his own gain. Threats to cybersecurity are considerable and are becoming worse with the proliferation of big data and its use in AI. Good practices raise awareness of cybersecurity risks and help organizations create robust, reliable and fast disaster recovery plans (DRPs) in advance. And, organizations can gain by using AI to monitor systems, detect vulnerabilities, and bridge those vulnerabilities, turning AI into a strategic asset. Many organizations' cloud data environments lack the technology for the effective automation of data privacy compliance, and they find it challenging to meet the requirements of the most stringent regulation for data protection, GDPR.
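To illustrate the BYOK idea, here is a sketch of the envelope pattern it typically rests on: a fresh data key encrypts the data, and the customer's own key wraps the data key, so the provider's staff hold only ciphertext. All names are hypothetical, and the XOR "cipher" is a toy stand-in for a real algorithm such as AES key wrapping; never use XOR for actual encryption.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- illustration only, not real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_with_byok(plaintext: bytes, customer_key: bytes):
    """Envelope pattern: a random data key encrypts the data; the customer's
    key wraps the data key. The provider stores only the ciphertext and the
    wrapped key, so its own admins cannot read the data."""
    data_key = os.urandom(32)
    ciphertext = xor_bytes(plaintext, data_key)
    wrapped_key = xor_bytes(data_key, customer_key)
    return ciphertext, wrapped_key

def decrypt_with_byok(ciphertext: bytes, wrapped_key: bytes, customer_key: bytes) -> bytes:
    """Only a holder of the customer key can unwrap the data key and decrypt."""
    data_key = xor_bytes(wrapped_key, customer_key)
    return xor_bytes(ciphertext, data_key)
```

Because the customer key never leaves the customer, a breach of the provider's storage yields data that is unreadable without it.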


How to recruit top tech talent: Do's and don'ts

Dice Editor Nate Swanner said they were surprised that remote work rated so highly on the list and added that "tech pros can see through the pizazz: A flashy job title, dedicated parking spot and a fresh MacBook Pro won't cumulatively overcome great health benefits or remote work." Research firm Gartner has found that things may not be so simple, though: Benefits like healthcare may be highly desired, but they're also basic expectations for job seekers. "Instead, candidates want to know which benefits set the organization apart," Gartner said, noting that educational benefits, well-being initiatives, and innovative perks are far more likely to attract top talent. Giving credence to Gartner's argument is its research on the types of benefits mentioned in a job posting vs. how much time that posting remains up. Mentions of medical care, employee well-being, and work-life balance had zero impact on how long a posting went unfilled, while dental/vision coverage, financial benefits, family programs, and disability/life insurance all significantly reduced the amount of time it took to fill a job.


Move over HR: Why tech is taking charge of company culture

The key lesson, says Lewis, is that the broader organisation sees the plus-points that a new way of working brings and then demands similar benefits. "In the same way that it happened in the IT industry in terms of Scrum and Agile, I think people have started to realise that smaller, cross-functional teams can add value in other areas of the business," he says. Lewis, therefore, posits a change in perception, one in which non-IT executives recognise that digital chiefs have broad expertise that can help change the business for the better. Board members who call on their CIOs for advice on people and processes find new ways to overcome the cultural challenges associated with transformation. That view resonates with Brad Dowden, interim CIO and director at Intercor Transformations. He says the experience digital leaders have of running transformation programmes definitely leaves them well-placed to advise the rest of the organisation — including HR chiefs — about the best ways to pursue successful culture change initiatives.


Breaking the chains: How FUD is holding the cyber sector hostage


The biggest cyber danger for companies is not the CFO getting hacked by Chinese wizard-class hackers using an offensive AI-driven quantum virus via blockchain – it’s someone from the accounts team, clicking on that phishing email link because he did his mandatory corporate security training seven months ago and has forgotten to double-check the URL. It could also be someone from the development team facing a tight deadline and nabbing some code from GitHub, without having the time to really read through it and find that remote shell buried in line 2,361. Suppliers can hype and sensationalise the capabilities of their products, and the scale of the threat, but ultimately all they are doing is damaging customers’ trust – the trust that is vital for a company to know that its cyber security strategy is based on a proportional and relevant response to the threats it faces as an organisation.


Using Contract Testing for Applications With Microservices

What makes contract testing awesome is that it does this in a way which really fits well into a microservice workflow, said Groeneweg. The most important thing is that it decouples the test between the service that consumes the API (the consumer) and the API itself (the provider). This allows you to bring either to production without needing the other. It's especially useful when they are maintained by different teams, because it enables them to be autonomous in testing and releasing.
Groeneweg stated that contract testing is a way of reducing the risk of integration bugs. Also, contract testing is a lot faster than other ways of integration testing. That's important as it allows you to decrease lead time and cut the waste caused by slow feedback from tests, he said. As the consumer defines the contract, contract testing also leads to better interfaces and APIs that are actually used.
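A minimal sketch of the consumer-driven idea in plain Python (standing in for dedicated tooling such as Pact; the contract fields are invented for illustration): the consumer publishes the fields it actually relies on, consumer tests run against a stub built from the contract, and the provider verifies its real responses against the same contract in its own CI.

```python
# The consumer publishes the contract: only the fields and types it relies on.
ORDER_CONTRACT = {
    "id": int,
    "status": str,
    "total_cents": int,
}

def satisfies_contract(payload: dict, contract: dict) -> bool:
    """Provider-side verification: every contracted field exists with the
    right type. Extra fields are allowed, so the provider can evolve
    independently without breaking this consumer."""
    return all(
        field in payload and isinstance(payload[field], expected)
        for field, expected in contract.items()
    )

def stub_response(contract: dict) -> dict:
    """Consumer-side stub generated from the contract -- no live provider
    is needed to test the consumer."""
    samples = {int: 1, str: "example"}
    return {field: samples[expected] for field, expected in contract.items()}
```

Each side tests against the shared contract rather than against the other's deployment, which is what lets the two services release independently.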



Quote for the day:


"The key to successful leadership today is influence, not authority." -- Ken Blanchard


Daily Tech Digest - February 20, 2019

Excessive Permissions are Your #1 Cloud Threat


IT administrators and hackers now have identical access to publicly-hosted workloads, using standard connection methods, protocols, and public APIs. As a result, the whole world becomes your insider threat. Workload security, therefore, is defined by the people who can access those workloads, and the permissions they have. ... One of the primary reasons for migrating to the cloud is speeding up time-to-market and business processes. As a result, cloud environments make it very easy to spin up new resources and grant wide-ranging permissions, and very difficult to keep track of who has them, and what permissions they actually use. All too frequently, there is a gap between granted permissions and used permissions. In other words, many users have too many permissions, which they never use. Such permissions are frequently exploited by hackers, who take advantage of unnecessary permissions for malicious purposes. As a result, cloud workloads are vulnerable to data breaches, service violation (i.e., completely taking over cloud resources), and resource exploitation.
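The granted-versus-used gap described above reduces to a set difference once you have access logs. A sketch with invented, AWS-style action names (the actual audit tooling would vary by platform):

```python
def permission_gap(granted: set, used: set) -> set:
    """Permissions a principal holds but never exercises -- prime
    candidates for revocation under least privilege."""
    return granted - used

# Illustrative data: permissions granted to a user vs. actions seen in audit logs.
granted = {"s3:GetObject", "s3:PutObject", "s3:DeleteBucket", "iam:PassRole"}
used = {"s3:GetObject", "s3:PutObject"}

print(sorted(permission_gap(granted, used)))
```

Running a report like this periodically and revoking the unused permissions shrinks exactly the attack surface the article is warning about.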



The most advanced hacking groups are getting more ambitious

Groups like Chafer, DragonFly, Gallmaker and others are all conducting highly-targeted hacking campaigns as they look to gather intelligence against businesses they think hold valuable information. Once, attackers might have needed the latest zero-days to gain entry into corporate networks; now, spear-phishing emails laced with malicious content are what most likely provide attackers with the initial entry they need. And because these espionage groups are so proficient at what they do, they have well tried-and-tested means of conducting activity once they're inside a network. "It's like they have steps which they go through which they know are effective to get into networks, then for lateral movement across networks to get what they want," Orla Cox, director of Symantec's security response unit told ZDNet.


Why blockchain may be blockchain’s best cybersecurity option

Developers should take the initiative to build their own decentralized security applications for anti-phishing, anti-malware, intrusion detection and distributed VPNs to deploy on the global blockchain. The bottom line is that it's not enough to just trust blockchain's security because it is more transparent than other approaches to data security and privacy. Developers, miners and even enterprises need to look at the entire digital ecosystem when considering security, as every single point provides savvy hackers a weak link to exploit. As blockchain investment continues to skyrocket and the crypto markets continue to diversify – even with the recent slowdown – we will see more unique and sophisticated examples of cyber criminals penetrating blockchain's security veneer. That's the paradox of technology: for every positive innovation tech creates, there is almost an equal number of sinister "innovations" to match. This is most certainly true of blockchain. The key is to keep discussing threats to blockchain to inspire those securing it.


How Estonia became an e-government powerhouse

Estonia is among the elite group of countries in the highest echelons of the UN's E-Government Development Index (EDGI), with its citizens and public servants able to access a wide range of services online using secure digital IDs, including making payments, accessing full health records, and internet voting. Estonia has been building out its e-government since the mid-90s, not long after declaring independence from the Soviet Union. The program continues to make headlines with bold new digital initiatives, such as its e-residency program, which gives anyone living anywhere in the world the ability to receive a government-issued digital ID and full access to Estonia's public e-services. Today, 99% of the public services are available online 24/7, 30% of Estonians use i-Voting, and the country estimates the reduced bureaucracy has saved 800 years of working time.


The 11 biggest issues IT faces today

“Security professionals must be extra vigilant with detection and training against these threats,” says John Samuel, CIO at CGS. “This year, companies will need to introduce AI-based protection systems to be able to contain any such attacks introduced by this next-gen tech.” Grinnell says AI wasn’t a factor in the most notable attacks of the last year, but he expects that to change. “I believe 2019 will bring the first of many AI-driven attacks on U.S. companies, critical infrastructure and government agencies,” he says. “Let’s hope I’m wrong.” Forward-thinking organizations are now implementing privacy by design in their products, but making sure those efforts meet GDPR standards is an ongoing concern. Google, for example, just saw a record fine by French regulators over how the company collects data. “U.S. businesses will need to consider a GDPR-type policy to protect citizens even before any regulations are enacted,” Samuel says. “Ultimately, there must be international guidelines to ensure customer privacy and protection on a global scale to allow for easier compliance.”


Setting expectations and preparing for a new breed of cyberattacks

Lateral movement is a method used by cyberattackers to move through a network, as they search for the essential data that is the eventual target of the breach. Continuing to hide in plain sight, cybercriminals are leveraging non-malware / fileless attack methods to do this, which is the biggest indicator that attackers aren’t just focused on one component of an organization, but are seeking additional targets as they infiltrate the network. In order for today’s organizations to prepare for these threats to security, they first need to solve the problem of visibility. True endpoint visibility should allow you to “turn back the clock” and see exactly what happened on the endpoint at a specific date. To understand how significant this capability is, we found that an organization with 10,000 endpoints is estimated to see more than 660 attempted cyberattacks per day.


Can work allocation algorithms play fair?


Allocating work by algorithm is not an inherently bad idea, according to James Farrar, chair of United Private Hire Drivers, a branch of the Independent Workers Union of Great Britain, who has co-led legal action against Uber for drivers’ rights. Many drivers working for conventional minicab companies pay a fee for the privilege (Uber takes a percentage, typically 20-25%), only to see controllers giving the best jobs to their friends, ordering them to collect their takeaway food and even demanding bribes. “People didn’t just walk away from those operators, they ran to Uber,” Farrar says. But the company’s algorithms create their own problems. Uber tells prospective drivers that “there’s no office and no boss”, adding that “with Uber, you’re in charge”. Farrar says this is not reflected in the ways the company’s algorithms allocate jobs and influence driver behaviour.


Unactioned data subject access requests could lead to legal action

A Talend report published in September 2018 found that only 30% of organisations are able to fulfil DSARs within the GDPR’s 30-day deadline. This shows how difficult it is to maintain an effective DSAR process. Requests have increased substantially since the GDPR took effect, while the deadline to respond has decreased and the amount of information that must be provided has increased. It’s no surprise, therefore, that many organisations are looking for help. The GDPR DSAR Support Service, provided by our sister company GRCI Law, is a perfect example of how you can simplify the process. GRCI Law’s experienced data privacy lawyers and DPOs (data protection officers) will manage the process on your behalf to ensure that requests are completed in accordance with the GDPR’s requirements.


How managed network services are evolving to simplify the global WAN

The first step toward evolving the managed network services market was network function virtualization (NFV). “When the service providers were facing the need to streamline their operation, move faster, respond faster, they took an approach of virtualizing appliances,” says Yovel. “Think about all the different network functions that used to be in the old network—next-generation firewalls, various orchestration solutions, VPN solutions, and so on. They virtualized all these boxes, but that didn't change the core dynamic of the network itself. Each function coming from different vendors still had its own management interface, plus its own scaling and sizing environment. The fact the appliance was virtualized didn’t change that. They still had the same problem with the centralized architecture as in the past.” Consider the example of virtualizing a firewall. Mobile users still need to connect over the internet over long distances to some firewall in some location to get the security they need. The fact that the firewall is virtualized doesn’t change that dynamic.


Microservices With CQRS and Event Sourcing

Microservices are independent, modular services that have their own layered architecture. When microservices share the same database, the data model can rely on relationships among the tables associated with the different services. ... A shared database is not recommended in a microservices-based approach, because, if there is a change in one data model, then other services are also impacted. As part of microservices best practices, each microservice should have its own database. ... The limitation of this approach is that transaction management cannot be properly handled. If customer data is deleted, the corresponding order also has to be deleted for that customer. Though this can be achieved with workarounds, like calling a delete service in the Order service, atomicity is not achievable in a straightforward way. This needs to be handled with customization.
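One way the event-sourcing side of CQRS addresses that atomicity limitation is to make a single appended event (for example, a CustomerDeleted event) the source of truth, with each service deriving its state by replaying the log rather than coordinating deletes across databases. A minimal sketch, with all names hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class EventStore:
    """Append-only log; current state is derived by replaying events."""
    events: list = field(default_factory=list)

    def append(self, event: dict) -> None:
        self.events.append(event)

def replay_customer(events: list, customer_id: str) -> dict:
    """Rebuild one customer's state (including their orders) from the log.
    A single CustomerDeleted event wipes the derived state -- no cross-service
    delete transaction is needed."""
    state = {"exists": False, "orders": []}
    for e in events:
        if e.get("customer_id") != customer_id:
            continue
        if e["type"] == "CustomerCreated":
            state["exists"] = True
        elif e["type"] == "OrderPlaced":
            state["orders"].append(e["order_id"])
        elif e["type"] == "CustomerDeleted":
            state = {"exists": False, "orders": []}
    return state
```

The Order service would maintain its own read model by consuming the same events, so both services converge on the deletion without a distributed transaction.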



Quote for the day:


"The world's greatest achievers have been those who have always stayed focussed on their goals and have been consistent in their efforts." -- Roopleen