Daily Tech Digest - May 06, 2017

Artificial intelligence will make or break us. Here's how we need to respond

The problem, for some, is the assumption that new technological breakthroughs are incomparable to those in the past. Many scholars, pundits, and practitioners would agree with Alphabet Executive Chairman Eric Schmidt that technological phenomena have their own intrinsic properties, which humans “don’t understand” and should not “mess with.” Others may be making the opposite mistake, placing too much stock in historical analogies. The technology writer and researcher Evgeny Morozov, among others, expects some degree of path dependence, with current discourses shaping our thinking about the future of technology, thereby influencing technology’s development. Future technologies could subsequently impact our narratives, creating a sort of self-reinforcing loop.


Mind the Gap

The sheer number of IT departments that fail to acknowledge the security gaps cyber-attackers can exploit is astonishing. The problem is that many within the industry believe they have their security posture under control, but they haven’t looked at the wider picture. The number of threats grows every day, and as new technologies and opportunities emerge, companies need new security infrastructure to cope with the shifting threat landscape. Meanwhile, C-level executives struggle to approve the budgets needed to bring enterprise security up to the next level of protection. Companies that do not keep up with the latest trends are left more vulnerable to data breaches as a consequence.


APT10’s devastating cyber attack shows anti-virus defences can't be relied on

Using meticulously acquired data, these emails masquerade as messages from a public sector entity, such as the Japan International Cooperation Agency, while the attachments are crafted to address a topic of direct relevance to the recipient. For most employees, opening such an attachment is virtually automatic, activating the malware code hidden in the structure or content of the file. This sophisticated malware immediately rips through networks, heading for the plans, the designs and the data that these incredibly well-resourced threat actors want to steal. ... These solutions are not only incapable of detecting 100% of the viruses out there; they also cannot detect the sophisticated threats that hackers such as APT10 now deploy inside the instruments essential to everyday business – email attachments.
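
To make that concrete, here is a minimal, illustrative Python sketch (not a real defence, and not specific to APT10) of the kind of crude attachment screening such attacks are built to evade: flagging attachment types commonly used to smuggle executable code past employees. The extension list is an assumption chosen purely for illustration.

    import email
    from email import policy

    # File types commonly abused to carry hidden malware code; this list
    # is illustrative only, not a complete or authoritative blocklist.
    RISKY_EXTENSIONS = {".exe", ".scr", ".js", ".hta", ".docm", ".xlsm"}

    def risky_attachments(raw_message: bytes) -> list:
        """Return the names of attachments with high-risk extensions."""
        msg = email.message_from_bytes(raw_message, policy=policy.default)
        flagged = []
        for part in msg.iter_attachments():
            name = (part.get_filename() or "").lower()
            if any(name.endswith(ext) for ext in RISKY_EXTENSIONS):
                flagged.append(name)
        return flagged

Checks like this catch only the crudest lures; as the excerpt notes, APT10’s payloads hide inside the structure or content of otherwise plausible documents, which is precisely where signature-based detection falls short.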


Concern mounts at Indian ID scheme as portals ‘leak’ 100m people’s details

The disclosures came as part of a report entitled Information Security Practices of Aadhaar (or lack thereof): A Documentation of Public Availability of Aadhaar Numbers with Sensitive Personal Financial Information, which focuses on just four of India’s numerous government portals:

- The National Social Assistance Programme (NSAP), which supports unemployed, elderly, sick and disabled citizens;
- The National Rural Employment Guarantee Act (NREGA) scheme, which guarantees households in rural areas at least 100 days of wage employment each year;
- The Chandranna Bima Scheme, Govt. of Andhra Pradesh, which provides relief to families if a worker is disabled or killed;
- The Daily Online Payment Reports of NREGA, Govt. of Andhra Pradesh, which track progress and payments under the NREGA scheme.


Digital Strategy Vs. Digital Transformation: What's The Difference?

How much appetite for going digital do you have? This is where the question of digital strategy versus digital transformation comes in. The two terms are often misused, most commonly by being used interchangeably, when they are in fact two very different things. A digital strategy is a strategy focused on using digital technologies to better serve one particular group of people (customers, employees, partners, suppliers, etc.) or the needs of one particular business group (HR, finance, marketing, operations, etc.). The scope of a digital strategy can be quite narrow, such as using digital channels to market to consumers in a B2C company, or broader, such as re-imagining how marketing could be made more efficient, and ideally more effective, through digital tools like CRM, marketing automation and social media monitoring.


HSBC adopts cloud-first strategy to solving big data business problems

“We deliberately picked projects that were real business problems, because we didn’t want to do a meaningless proof of concept that was kind of interesting to us but didn’t really solve anything,” says Knott. “We chose those five areas because they are important, but they’re not so big that we’re betting the bank on the success or failure of these things.” Some of these use cases, such as the bank’s anti-money laundering work, require sifting through billions of transactions in search of suspicious activity, and the organisation wants to use machine learning models to cut the time this work takes and improve its accuracy. ... “There was a huge appetite to do this, but we also needed to satisfy ourselves that cloud is safe and secure, our regulators are happy, and all the important people are comfortable with what we are doing,” he says.
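
As a rough illustration of that anti-money laundering use case, the sketch below flags unusual transactions with scikit-learn’s IsolationForest. The feature names, toy data and contamination rate are assumptions for illustration only; nothing here reflects HSBC’s actual models.

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Toy transaction features; a real AML pipeline would engineer
    # many more from billions of records.
    txns = pd.DataFrame({
        "amount":     [120.0, 35.5, 9800.0, 42.0, 15000.0, 60.0],
        "hour":       [10, 14, 3, 11, 2, 16],
        "txns_7days": [5, 12, 0, 8, 1, 9],
    })

    # Isolation forests score how easily a point can be isolated from
    # the rest; outliers (large small-hours transfers from otherwise
    # quiet accounts) isolate quickly and score as anomalies.
    model = IsolationForest(contamination=0.2, random_state=0)
    txns["suspicious"] = model.fit_predict(txns) == -1  # -1 = outlier

    print(txns[txns["suspicious"]])  # candidates for human review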


Why Log Shipping is Better than Database Mirroring for Migrations

Using log shipping, I can go straight from failover to having the database in an AG in a matter of seconds, not hours. Setup will be a little more work because I’m setting up two secondaries (two sets of restores), but those restores can run at the same time, so the difference in setup time is negligible. The big difference is that I don’t have to go through the restore process for the secondary database: because the databases came from the same source database, their log chains are intact and their logs are already in sync. The failover process for this log shipping scenario does have to be performed in a very specific manner. I wrote about this process in my 2011 article for SQL Server Pro magazine, 3 Log Shipping Techniques. It is actually a very simple process once you understand it.
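
As a rough sketch of that failover step, with hypothetical server, database and share names (and driven from Python via pyodbc purely for illustration), the core sequence is a tail-log backup on the old primary followed by a restore WITH RECOVERY on the log-shipped secondary:

    import pyodbc

    # Hypothetical connection strings for the old primary and the
    # log-shipped secondary that will become the new primary.
    PRIMARY = ("Driver={ODBC Driver 17 for SQL Server};"
               "Server=old-prod;Database=master;Trusted_Connection=yes")
    SECONDARY = ("Driver={ODBC Driver 17 for SQL Server};"
                 "Server=new-prod;Database=master;Trusted_Connection=yes")

    def run(conn_str: str, sql: str) -> None:
        conn = pyodbc.connect(conn_str, autocommit=True)
        try:
            conn.execute(sql)
        finally:
            conn.close()

    # 1. Tail-log backup on the old primary; NORECOVERY leaves the
    #    source in a restoring state so no new writes break the chain.
    run(PRIMARY, r"BACKUP LOG SalesDB TO DISK = N'\\share\SalesDB_tail.trn'"
                 r" WITH NORECOVERY")

    # 2. Apply the tail on the secondary and bring it online; the other
    #    secondary's intact log chain lets it join the AG without a
    #    fresh restore.
    run(SECONDARY, r"RESTORE LOG SalesDB FROM DISK = N'\\share\SalesDB_tail.trn'"
                   r" WITH RECOVERY")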


Can Design Thinking Unleash Organizational Innovation?

To support and accelerate this “fail fast / learn faster” environment, we do our preliminary data science work using small data sets (10 to 20 GB) on jazzed-up laptops running all of our favorite data management and data science tools. We don’t want to get hung up spending lots of time and resources setting up a big analytic sandbox with large data sets: just the aggregation, cleansing, aligning, transforming and enriching of terabyte-sized data sets can substantially slow the rapid “fail fast / learn faster” data science process. We can learn a lot about the variables and metrics that might be better predictors of business performance in a small environment before we start to operationalize the resulting data and analytics in our data lake.
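
As a small sketch of that “small data first” workflow, the snippet below carves a laptop-sized sample out of a large source file using pandas; the file name and 1% sampling rate are assumptions for illustration.

    import pandas as pd

    # Stream the large file in chunks and keep a 1% random sample, so
    # the full data set never has to fit in laptop memory.
    sample_frames = []
    for chunk in pd.read_csv("transactions_full.csv", chunksize=1_000_000):
        sample_frames.append(chunk.sample(frac=0.01, random_state=42))

    small = pd.concat(sample_frames, ignore_index=True)

    # Persist the small working set for fast, repeatable experiments.
    small.to_csv("transactions_sample.csv", index=False)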


The Hidden Costs of Bad Data

Data is all around us and has a profound impact on our daily lives. But what happens when we rely on bad data to make a decision? Is it as simple as arriving late to work because of bad directions, or does bad data have a more costly and meaningful impact on our lives? Erroneous decisions made from bad data are not only inconvenient but also extremely costly. IBM examined the cost of poor data quality in the United States and estimated that decisions made from bad data cost the US economy roughly $3.1 trillion each year. Research from Experian Data Quality also found that bad data has a direct impact on the bottom line of 88% of American companies, with the average loss amounting to 12% of a company’s overall revenue. A Gartner report also found that 27% of the data in the world’s top companies is flawed.


TensorFlow, an open source software library for machine learning

TensorFlow delivers a set of modules (with both Python and C/C++ APIs) for constructing and executing TensorFlow computations, which are expressed as stateful dataflow graphs. These graphs make it possible for applications like Google Photos to become incredibly accurate at recognizing locations in images based on popular landmarks. In 2011, Google developed a product called DistBelief that worked on the positive reinforcement model. The machine would be given a picture of a cat and asked if it was a picture of a cat. If the machine guessed correctly, it was told so; an incorrect guess would lead to an adjustment so that it could better recognize the image. TensorFlow improves on this concept by sorting data through layers of operations called nodes; diving deeper into the layers allows for more, and more complex, questions about an image.
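
A minimal sketch of that dataflow-graph model, using the TensorFlow 1.x Python API current at the time of writing: operations are nodes in a graph, and nothing executes until the graph is run inside a session.

    import tensorflow as tf  # TensorFlow 1.x API

    # Build a dataflow graph: each operation is a node, and the edges
    # are the tensors that flow between them. Nothing runs yet.
    a = tf.placeholder(tf.float32, name="a")
    b = tf.placeholder(tf.float32, name="b")
    total = tf.add(a, b, name="total")

    # Execution is deferred until the graph runs in a session.
    with tf.Session() as sess:
        print(sess.run(total, feed_dict={a: 3.0, b: 4.0}))  # 7.0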



Quote for the day:


"Change the changeable, accept the unchangeable, and remove yourself from the unacceptable." -- Denis Waitley

