Daily Tech Digest - November 03, 2017

Transforming the organization into a cognitive enterprise will be an arduous task and an evolutionary process. Jobs will not disappear overnight, and many organizations will outright fail to leverage the power of this technology — and will suffer the business consequences as a result. Success is far from inevitable because there are two significant problems with leveraging machine learning in the enterprise: data and bias. Machine learning only works with data. Lots and lots of data. It’s called machine learning because the machine must be ‘taught’ by giving it data from which it can distill patterns, and, in most cases, the teaching data must be ‘clean’ — meaning that it must be accurate and represent the desired outcomes. This means that for machine learning to work, an organization must begin with lots and lots of good, clean data.
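
A minimal sketch of that "teach with clean data" step, assuming pandas and scikit-learn (neither is named in the article) and an invented churn dataset: incomplete and duplicate rows are removed before the model is allowed to distill a pattern from the examples.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# A toy "teaching" dataset: one row is incomplete, one is an exact duplicate.
raw = pd.DataFrame({
    "monthly_spend": [120.0, 85.5, None, 430.0, 95.0, 95.0],
    "churned":       [0,     0,    1,    1,     0,    0],
})

# 1. Clean: drop incomplete rows and exact duplicates before teaching.
clean = raw.dropna().drop_duplicates()

# 2. Teach: the model learns only from the cleaned, accurate examples.
model = LogisticRegression()
model.fit(clean[["monthly_spend"]], clean["churned"])

# 3. Use: predict the churn label for a new customer.
print(model.predict(pd.DataFrame({"monthly_spend": [400.0]})))
```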


How to select the best self-service BI tool for your business
If most of your data is on Azure, you might want to rule out BI systems that run only on Amazon Web Services, and vice versa. If possible, you want the data and the analysis to be colocated for performance reasons. Vendors tend to cite analyst reports that are most favorable to their product. Don't trust the vendor's skimmed abstract or take the diagram they show you at face value: ask for and read the whole report, which will mention cautions and weaknesses as well as strengths and features. Also treat inclusion in an analyst's report with a large grain of salt: ... Some BI platforms now use in-memory databases and parallelism to accelerate queries. In the future, you may see more highly parallelized GPU-based databases built into BI services — third parties are building these, demonstrating impressive speedups.
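
To make that last point concrete, here is a small sketch of an in-memory, parallel analytical query. DuckDB is used purely as an illustrative third-party engine (the article names no specific product), and the table and columns are invented.

```python
import duckdb
import pandas as pd

# A synthetic fact table large enough that a parallel scan is worthwhile.
sales = pd.DataFrame({
    "region": ["east", "west", "north", "south"] * 250_000,
    "amount": range(1_000_000),
})

con = duckdb.connect()               # purely in-memory database
con.register("sales", sales)         # expose the DataFrame as a SQL view
con.execute("SET threads TO 8")      # let the engine parallelize the scan

totals = con.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY total DESC"
).df()
print(totals)
```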


Where is my data!? Why GDPR is good for Mainframes

The implications of GDPR for the mainframe are vast. The increased use of mobile devices alone is driving exponential growth in transaction volumes, and that data contains massive amounts of PII. This personal data is spread across the organization, widely used, transformed and accessed in different ways by different people, meaning application-based controls are not enough for complying with the regulation. The key first step toward achieving GDPR compliance for mainframe data is identifying and classifying the data and determining which of it contains PII. Based on that classification, you will have a view of what personal data is being stored and where, and therefore of the levels of risk in your organization. If personal data is circulating outside the assigned channels and flows, it’s important to understand why and assess the associated risk to that data.
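
As an illustration of that classification step, here is a minimal sketch, assuming Python's standard re module and hypothetical record text and patterns (none of which come from the article), that flags which extracted records appear to contain PII so they can feed a risk inventory.

```python
import re

# Hypothetical detection patterns for a few common PII categories.
PII_PATTERNS = {
    "email":   re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone":   re.compile(r"\b\d{3}[-. ]\d{3}[-. ]\d{4}\b"),
    "natl_id": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(record):
    """Return the set of PII categories detected in a record."""
    return {name for name, pattern in PII_PATTERNS.items() if pattern.search(record)}

records = [
    "Order 4412 shipped to jane.doe@example.com",
    "Batch job completed in 312 ms",
    "Callback requested: 555-013-2048",
]

for rec in records:
    print(classify(rec) or "no PII detected", "|", rec)
```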


Tapping into big data’s potential

Big data has several aspects, and they are relevant to how central banks deal with data in general. When you look into the responses to the survey, they clearly show that, although it is unstructured data as far as the research is concerned, it could be structured and voluminous for other purposes – such as the credit register. I think there is a question about what the data is used for, and not so much the size or the structured-versus-unstructured demarcation. ... Firstly, there are those who say big data is primarily the type of unstructured data the private sector is dealing with. According to a recent BIS review, central banks are clearly interested too, for example, in looking at internet searches for nowcasting. A second area that is really key for central banks is dealing with very large administrative and financial datasets. It is not simply being large that makes it big data; it is being both large and complex.


Facebook's plan to throw humans at security, ... equates to indictment on AI progress

For Facebook, the crisis isn't due to Russians tinkering with election sentiment. The crisis for Facebook is trust. You are the product. If you don't trust Facebook's information, you may not engage as much. Facebook needs you to pass along information. The fact that there is shock -- shock I tell you -- over how Facebook can be used to manipulate the masses is almost comical. After all, those tools are the same reason marketers are fueling Facebook's financial dominance over the ad industry. But this rant isn't an indictment of social media lemmings or Facebook's controls or approach to ads. The Facebook conference call -- and Zuckerberg's solution to double headcount on security and throw humans at the fake news and trust issue -- is really an indictment of its AI prowess. Facebook simply doesn't have the tools or AI engineering to automate its way out of its mess.


Stratis: Blockchain-as-a-Service (BaaS)


Stratis is a flexible and powerful Blockchain Development Platform designed for the needs of real-world financial services businesses and other organizations that want to access the benefits of Blockchain technologies without the overheads inherent in running their own network infrastructure. ... Stratis is designed with the integration of fiat gateways in mind from the outset. It allows financial organizations to use the blockchain to transfer existing currencies that are both readily accepted by mainstream consumers and not subject to damaging volatility: tokens of value that are simply digital equivalents of regular money. This ‘best of both worlds’ approach means that businesses can maintain compliance in whatever way they see fit, according to jurisdiction and organisational policy, while simultaneously using the blockchain as a store of value.
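
As a rough illustration of the "digital equivalent of regular money" idea (the classes and method names below are hypothetical and are not Stratis's actual API), a fiat gateway only issues tokens against confirmed deposits and burns them on redemption, so the token supply always mirrors the fiat held in reserve.

```python
class FiatGateway:
    def __init__(self, currency):
        self.currency = currency
        self.reserve = 0      # fiat units held off-chain at the gateway
        self.balances = {}    # on-chain token balances per account

    def issue(self, account, amount):
        """Mint tokens 1:1 against a confirmed fiat deposit."""
        self.reserve += amount
        self.balances[account] = self.balances.get(account, 0) + amount

    def redeem(self, account, amount):
        """Burn tokens and release the matching fiat from the reserve."""
        if self.balances.get(account, 0) < amount:
            raise ValueError("insufficient token balance")
        self.balances[account] -= amount
        self.reserve -= amount


gateway = FiatGateway("GBP")
gateway.issue("alice", 1_000)
gateway.redeem("alice", 250)
print(gateway.balances["alice"], gateway.reserve)  # 750 750
```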


The Future of Cybersecurity Part II: The Need for Automation

Threats are evolving so quickly on the black hat side that the only way to combat them is through automated and intelligent defense layers that can quickly identify new and existing threats and then make decisions to mitigate them. I call this type of cybersecurity defense “actionable intelligence.” It requires deploying interconnected security solutions everywhere across your expanded network, including deep into the cloud. The goal is to create a security solution that is able to see and identify the stages of a threat and then make a decision on its own. Such an expert system is able to identify and block attacks at network speeds so that we don’t have to rely on humans, who often miss too much and respond far too slowly, to take action. This may require rethinking – and even retooling – your security infrastructure. To start, devices need to be able to see each other and share threat intelligence.
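
A minimal sketch of that pattern, with hypothetical class names rather than any real vendor API: devices publish the indicators they observe to a shared feed, and every other device blocks anything on that feed without waiting for a human decision.

```python
class ThreatIntelFeed:
    def __init__(self):
        self.indicators = set()   # e.g. malicious source IPs or file hashes

    def publish(self, indicator):
        self.indicators.add(indicator)


class SecurityDevice:
    def __init__(self, name, feed):
        self.name = name
        self.feed = feed

    def detect(self, source_ip):
        # A detection on one device is immediately shared with all the others.
        self.feed.publish(source_ip)

    def inspect(self, source_ip):
        # The block/allow decision is made locally, from shared intelligence.
        if source_ip in self.feed.indicators:
            return f"{self.name}: BLOCK {source_ip}"
        return f"{self.name}: allow {source_ip}"


feed = ThreatIntelFeed()
edge_fw = SecurityDevice("edge-firewall", feed)
cloud_gw = SecurityDevice("cloud-gateway", feed)

edge_fw.detect("203.0.113.7")             # the edge firewall spots an attacker
print(cloud_gw.inspect("203.0.113.7"))    # the cloud gateway blocks it automatically
print(cloud_gw.inspect("198.51.100.20"))  # unknown traffic is still allowed
```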


Data lake and data warehouse – know the difference

If you’re still struggling with the notion of a data lake, then maybe the following analogy will clarify matters. Think of a data mart or data warehouse as a storage facility stocked with cases of bottled water. Those cases didn’t just magically appear overnight. People and machines gathered and purified the water; only after packaging was it ready for people to buy and drink. By comparison, think of a data lake as a large body of natural water that you would only drink if you were dying of thirst. But if you need 50 gallons of water to put out a fire, you don’t need to buy cases of bottled water and empty them out one by one. It’s all there, ready to go. In keeping with this analogy, the “water” in a data lake flows in from many places: rivers, tributaries and waterfalls. That is, the data lake doesn’t hold only one type of water (that is, data). Data lakes can house all types of data: structured, semistructured and unstructured.
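
A small sketch of that last point, using local directories as a stand-in for object storage (all paths and filenames are invented): raw files of any shape land in the lake untouched, while the warehouse only accepts rows that fit a fixed, purified schema.

```python
import csv
import json
import pathlib

# The lake: structured, semistructured and unstructured data land as-is.
lake = pathlib.Path("data_lake/raw")
lake.mkdir(parents=True, exist_ok=True)

(lake / "orders.csv").write_text("order_id,amount\n1,19.99\n2,5.00\n")
(lake / "clickstream.json").write_text(json.dumps({"user": 42, "path": "/pricing"}))
(lake / "support_call.txt").write_text("Customer reported a login issue...")

# The warehouse: "bottled water" conforming to one curated schema.
warehouse = pathlib.Path("warehouse")
warehouse.mkdir(exist_ok=True)
with open(warehouse / "orders.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["order_id", "amount"])
    writer.writeheader()
    writer.writerow({"order_id": 1, "amount": 19.99})
    writer.writerow({"order_id": 2, "amount": 5.00})
```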


Blockchain Technology and The Changing Global Economy at the Ethereal Summit

There are many parallels between the adoption of blockchain technology in emerging markets and the mainstream adoption of telecommunication in the 21st century. Instead of using phone lines, developing countries utilized newer technology and developed their infrastructure using satellite wireless communication. By "piggybacking" on the cell technology of developed countries, developing countries were able to incorporate new technology in an efficient and cost-effective way. Similarly, countries with fewer established financial systems are taking advantage of decentralized financial institutions powered by blockchain technology instead of establishing traditional banks. Although implementation speeds will vary by country, blockchain technology has the potential to empower all markets, including those looking for a technological piggyback.


What Is "Cloud-native" Data and Why Does It Matter?


Be aware that in cloud-native systems, the unified log often becomes the system of record. Materialized views show you a representation of that data for a given purpose. This is a different way of thinking about data storage, and for many, it turns the idea of a database inside out! The unified log holds individual transactions from your various inputs. Those items may inflate into objects or records in your applications or cache. This may be a new way for you to store data, but it’s proven to be an excellent solution at scale. That said, you don't have to throw out your trusty relational database. Instead, reassess how you use it. For example, if you've been using your relational database for application session state, consider introducing something like Redis and getting familiar with key-value stores. At the same time, introduce modern relational databases like Google Cloud Spanner that are designed for geographic resilience and cloud-scale performance on demand.
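
A minimal sketch of the log-plus-view pattern, using an in-process list and dict as stand-ins for a real unified log (such as Kafka) and a real view store: the append-only log is the system of record, and a materialized view is just one purpose-built projection that can be rebuilt from it at any time.

```python
# System of record: every transaction, appended in order, never rewritten.
unified_log = []

def append(event):
    unified_log.append(event)

def materialize_balances(log):
    """Replay the log to build a view answering one question: account balances."""
    balances = {}
    for event in log:
        balances[event["account"]] = balances.get(event["account"], 0) + event["amount"]
    return balances

append({"account": "acct-1", "amount": 500})
append({"account": "acct-2", "amount": 120})
append({"account": "acct-1", "amount": -75})

# The view is disposable and can be rebuilt whenever needed; the log is not.
print(materialize_balances(unified_log))   # {'acct-1': 425, 'acct-2': 120}
```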



Quote for the day:


"If you are filled with pride then you'll have no room for wisdom." -- African Proverb

