Daily Tech Digest - July 14, 2019

German banks are moving away from SMS one-time passcodes

The cyber-security industry has been warning about the insecurity of SMS OTP for years, and not only because of SIM swapping attacks, which are essentially social engineering. The deeper concern is that SMS-based authentication rests on inherent and unpatchable weaknesses in the SS7 protocol used in the backbone of all mobile telephony networks. Vulnerabilities in this protocol allow attackers to silently hijack a user's phone number, even without the telco's knowledge, letting threat actors track users or authorize online payments and login requests. These vulnerabilities have not gone unnoticed in Germany. ... While two-step verification and two-factor authentication are recommended, security experts have long warned against relying on SMS as "the second factor." Instead, they recommend authenticator apps or hardware security tokens, two of the methods that German banks are now rolling out to secure their systems and replace SMS-based authentication.
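
As an illustration of the authenticator-app approach, here is a minimal sketch of RFC 6238 time-based one-time password (TOTP) generation using only Python's standard library; the base32 secret shown is a placeholder for demonstration, not a real credential.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, digits=6, period=30):
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period                 # 30-second time step
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Placeholder secret for illustration only; real secrets are provisioned by the bank or app.
print(totp("JBSWY3DPEHPK3PXP"))

Unlike an SMS code, nothing here ever crosses the SS7 network: both the bank and the app derive the same six digits locally from the shared secret and the current time.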


Four Insurtech Startups Shaking Up The Insurance Industry

Insurtech can ease the insurance claim process
Insurtech companies tend to focus on increased personalization and greater speed and efficiency of services to meet changing customer needs, with many using AI to offer deeper data insights. And while some are set on displacing the industry incumbents, others are working with the leading insurance firms as they transition to the age of digital innovation. ... The inconsistent and often unpredictable nature of external data can cause insurers a real headache, leading to delays in processing new opportunities and managing existing books of business. Untangler is the insurtech that gets around that problem, using AI to recognize inbound customer or employee data in any format and transform it into readable data in seconds, so that providers can create quotes without having to convert the data cell by cell. Launched in May this year, the technology was developed by entrepreneurs Richard Stewart and Steve Carter, initially for their own startup Untangl, which provides employee benefits, including insurance products, to SMEs.
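
Untangler's actual technology is proprietary, but purely as a hypothetical sketch of the general idea of mapping arbitrary inbound column headers onto a canonical quoting schema, the fragment below uses fuzzy string matching from Python's standard library; the field names and sample headers are invented.

from difflib import get_close_matches

# Hypothetical canonical schema an insurer might quote against.
CANONICAL_FIELDS = ["first_name", "last_name", "date_of_birth", "salary", "postcode"]

def map_headers(raw_headers):
    """Map arbitrary inbound spreadsheet headers to canonical field names."""
    mapping = {}
    for header in raw_headers:
        normalized = header.strip().lower().replace(" ", "_")
        match = get_close_matches(normalized, CANONICAL_FIELDS, n=1, cutoff=0.6)
        mapping[header] = match[0] if match else None    # None = flag for human review
    return mapping

print(map_headers(["First Name", "Last Name", "Date of Birth", "Annual Salary", "Post Code"]))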


Beware of Geeks Bearing AI Gifts

Artificial? Definitely. Intelligent? Maybe.
Last March, McDonald’s Corp. acquired the startup Dynamic Yield for $300 million, in the hope of employing machine learning to personalize customer experience. In the age of artificial intelligence, this was a no-brainer for McDonald’s, since Dynamic Yield is widely recognized for its AI-powered technology and recently even landed a spot in a prestigious list of top AI startups. Neural McNetworks are upon us. Trouble is, Dynamic Yield’s platform has nothing to do with AI, according to an article posted on Medium last month by the company’s former head of content, Mike Mallazzo. It was a heartfelt takedown of phony AI, which was itself taken down by the author but remains engraved in the collective memory of the internet. Mr. Mallazzo made the case that marketers, investors, pundits, journalists and technologists are all in on an AI scam. The definition of AI, he writes, is so “jumbled that any application of the term becomes defensible.” Mr. Mallazzo’s critique, however, conflates two different issues. The first is the deliberately misleading marketing that is common to many hyped technologies, and is arguably epitomized by some blockchain companies.


Healthcare Organizations Too Confident in Cybersecurity

"There are some surprises in the results, particularly the higher than expected confidence that organizations have in regards to the security of their patient portal and telemedicine platforms given that only 65% deploy multi-factor authentication," said Erin Benson, director of market planning for LexisNexis Health Care. "Multi-factor authentication is considered a baseline recommendation by key cybersecurity guidelines. Every access point should have several layers of defense in case one of them doesn't catch an instance of fraud. At the same time, the security framework should have low-friction options up front to maintain ease of access by legitimate users." The report findings suggest that traditional authentication methods are insufficient, multi-factor authentication should be considered a baseline best practice and the balance between optimizing the user experience and protecting the data must be achieved in an effective cybersecurity strategy, the press release said.


Big Data Governance: 4 Steps to Scaling an Enterprise Data Governance Program

No matter how big your data governance program becomes, it must retain its agility. If you can’t adapt quickly, then you’ll lose momentum and your initiative will start to deliver diminishing returns. A big challenge here is aligning the huge number of people involved in your initiative. We’ve already discussed the need for collaboration, but crowdsourcing solutions to big decisions can soon lead to analysis paralysis. The solution is to develop an efficient decision-making system that allows for everyone’s voice to be heard. A best-practice decision-making framework, such as a DACI approach (Driver, Approver, Contributor, and Informed), can help here. These frameworks establish a continuous cycle of listening and acting, where everyone has a chance to feed into the discussion, but a small group of clearly identified people retain control over decision-making. That way, everyone’s happy and you make steady progress.
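
As a concrete (hypothetical) illustration, a DACI assignment for a single governance decision can be captured as simple structured data so that ownership stays unambiguous; the roles and names below are placeholders.

from dataclasses import dataclass, field

@dataclass
class DaciDecision:
    """One governance decision with clearly identified DACI roles (names are placeholders)."""
    decision: str
    driver: str                                              # keeps the decision moving
    approver: str                                            # the single person who signs off
    contributors: list[str] = field(default_factory=list)    # consulted for input
    informed: list[str] = field(default_factory=list)        # told of the outcome

retention_policy = DaciDecision(
    decision="Retention period for customer interaction data",
    driver="Data Governance Lead",
    approver="Chief Data Officer",
    contributors=["Legal", "Analytics", "IT Security"],
    informed=["All data stewards"],
)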


Is Programming Knowledge Required to Pursue Data Science?

The answer is no! Data Science is not just about having technical knowledge. As a domain that spans both the computer science world and the business world, it has a fair share of skills that are vital for becoming a data scientist yet are not technical at all. In fact, the non-technical skills mentioned below arguably account for around 60% of the work of a data scientist. These skills were not mentioned in the Venn diagram, but they are equally important in any ideal data-driven project. ... Data science is a field for everyone. From an application developer to a businessperson, everyone has a base skill set that enables them to start a fresh career in Data Science. Even those who do not want to learn to program can hone their strengths on the business or mathematical side and still be a part of this wonderful domain. At the end of the day, a sense of problem solving and commitment is all that one needs to excel in any given situation.


Brazil is at the forefront of a new type of router attack


According to Avast researchers David Jursa and Alexej Savčin, most Brazilian users are having their home routers hacked while visiting sports and movie streaming sites, or adult portals. On these sites, malicious ads (malvertising) run special code inside users' browsers to detect the IP address and model of the victim's home router. Once they have the router's IP and model, the malicious ads use a list of default usernames and passwords to log into the device without the user's knowledge. The attacks take a while, but most users won't notice anything because they're usually busy watching the video streams on the sites they've just accessed. If the attack is successful, additional malicious code relayed through the ads modifies the default DNS settings on the victim's router, replacing the DNS server IP addresses the router receives from the upstream ISP with the IP addresses of DNS servers managed by the hackers. The next time the user's smartphone or computer connects to the router, it receives the malicious DNS server IP addresses and thus funnels all DNS requests through the attackers' servers, allowing them to hijack and redirect traffic to malicious clones.
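
A simple way to spot this kind of tampering from a client on the network is to compare answers from the resolver your router hands out against a known-good public resolver. The sketch below (using the third-party dnspython package) illustrates the idea; the tested domain and trusted resolver are chosen arbitrarily, and differing answers are only a warning sign, since large CDNs legitimately return many IPs.

# Requires the third-party "dnspython" package (pip install dnspython).
import dns.resolver

DOMAIN = "www.google.com"        # arbitrary, frequently targeted example
TRUSTED_RESOLVER = "1.1.1.1"     # known-good public resolver for comparison

def resolve_with(nameserver=None):
    resolver = dns.resolver.Resolver()
    if nameserver:
        resolver.nameservers = [nameserver]   # otherwise: the router-supplied DNS server
    return {rdata.address for rdata in resolver.resolve(DOMAIN, "A")}

local = resolve_with()
trusted = resolve_with(TRUSTED_RESOLVER)

if local.isdisjoint(trusted):
    print(f"Warning: {DOMAIN} resolves differently via the router's DNS: {local} vs {trusted}")
else:
    print("No obvious DNS discrepancy detected.")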


Building Your Federal Data Strategy

Federal agency missions, resourcing and data maturity levels vary greatly, and thus each agency will have a unique perspective on the Federal Data Strategy and how it can leverage it. Regardless of an agency's starting point, however, it is imperative to build an agency strategy and implementation plan based on the Federal Data Strategy's three guiding data principles (i.e., Ethical Governance, Conscious Design and Learning Culture) and three best practices (i.e., building a culture that values data and promotes public use; governing, managing and protecting data; and promoting efficient and appropriate data use). Most of these principles and practices will likely be driven initially by new policies, reporting requirements, budget planning, and associated activities. Given the necessary policy staffing and approval processes, however, it will take some time to get these in place. A sequential, policy-only approach to the strategy is apt to result in extended timelines, hindering rapid progress.


What Matters Most with Data Governance and Analytics in the Cloud?


As much as there are benefits, there are understandably issues to address when moving data and redeploying processes from legacy platforms to Big Data platforms and the cloud in order to reduce costs and achieve higher processing performance. "There is a desire to modernize infrastructure into Big Data platforms or clouds," commented Smith. Syncsort provides solutions for data infrastructure optimization, the cloud, data availability, security, data integration, and data quality. As businesses build data lakes to centralize data for advanced analytics, they also need to ingest mainframe data into Big Data platforms like Hadoop and Spark. According to Smith, the top analytics use cases that drive data lakes and enterprise data hubs are advanced/predictive analytics, real-time analytics, operational analytics, data discovery and visualization, and machine learning and AI. The top legacy data sources that fill the data lake are enterprise data warehouses, RDBMS, and mainframe/IBM i systems. "Legacy infrastructure is so critical," Smith commented. "These systems are not going away." Mainframes and IBM i still run the core transactional apps of most enterprises, according to the company.
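
Syncsort's own connectors are proprietary, but as a generic illustration of the ingestion pattern described above, here is a minimal PySpark sketch that pulls a legacy RDBMS or warehouse table into a data lake as Parquet; the JDBC connection details, table name and partition column are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("legacy-ingest").getOrCreate()

# Placeholder JDBC details for a legacy warehouse or RDBMS source.
claims = (spark.read.format("jdbc")
          .option("url", "jdbc:db2://legacy-host:50000/CLAIMSDB")
          .option("dbtable", "CLAIMS.POLICY_HISTORY")
          .option("user", "etl_user")
          .option("password", "***")
          .option("fetchsize", "10000")
          .load())

# Land the data in the lake in a columnar format, partitioned for analytics.
claims.write.mode("append").partitionBy("POLICY_YEAR").parquet("s3a://data-lake/raw/policy_history/")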


Sensitive Data Governance Still a Difficult Challenge


Governing all of an organization's mountains of sensitive data, or even knowing what sensitive data exists and where it's located within the enterprise, isn't easy. Data classification is hard to accomplish: given the volume of data that must be discovered, it often does not happen reliably, particularly when the task is left to business users. A large health care provider stores 4.1 billion columns of data. A financial services company takes in more than 10 million data sets per day. As the data pours in, only a small percentage of it, the so-called Critical Data Elements (CDEs), is tagged, in a painfully slow and error-prone manual process that leaves most data miscategorized, lost or still waiting to be discovered, and impossible to track. Most companies have between 100 and 200 CDEs, but customers typically have thousands and sometimes even millions of data elements, depending on their business, data organization and representation. This presents a risk because the CCPA covers any data you know and possess about your customers.
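
Manual tagging clearly does not scale. As a hypothetical illustration of automating part of the discovery step, the sketch below scans sampled values from a column for a few common sensitive-data patterns; the regexes, labels and hit-rate threshold are simplified and invented for illustration.

import re

# Simplified patterns for a few common sensitive data elements (illustrative, not exhaustive).
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?\d[\d\s().-]{7,}$"),
}

def classify_column(sample_values, min_hit_rate=0.8):
    """Tag a column with the sensitive-data type most of its sampled values match, if any."""
    non_empty = [v for v in sample_values if v]
    for label, pattern in PATTERNS.items():
        hits = sum(1 for v in non_empty if pattern.match(v))
        if non_empty and hits / len(non_empty) >= min_hit_rate:
            return label
    return None   # unrecognized: route to a human data steward

print(classify_column(["alice@example.com", "bob@example.org", ""]))   # "email"
print(classify_column(["123-45-6789", "987-65-4321"]))                 # "us_ssn"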



Quote for the day:


"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour

