Daily Tech Digest - November 28, 2018


Modern enterprise solutions are not only smarter but also require no physical infrastructure at all, making them more cost-effective than older technologies. The rise in the number of software-as-a-service (SaaS) based enterprise management products has consequently helped more and more entrepreneurs build digitized enterprises through the use of simple and efficient products. SaaS-based cloud application services do not require customers to run their own storage, servers or databases, yet offer greater capabilities such as interoperability and easier customization. This allows service providers to integrate advanced technologies such as artificial intelligence, machine learning, data mining and analytics into their enterprise systems and business processes to unlock higher levels of productivity.



The 10 most in-demand tech jobs of 2019

The tech jobs landscape of 2019 will likely look largely the same as it did in 2018, with roles in software development, cybersecurity, and data science dominating across industries. "Emerging technologies will be key catalysts for the in-demand jobs we expect to see in 2019," said Sarah Stoddard, community expert at job search site Glassdoor. "From artificial intelligence, automation, virtual reality, cryptocurrency and more, demand for jobs in engineering, product, data science, marketing and sales will continue to rise in order to support the innovation happening across the country." More and more often, traditional companies are beginning to resemble tech companies, and this trend will likely continue throughout the next year, Stoddard said. "As employers across diverse industries, from health care to finance to automotive and more, continue to implement various technologies to streamline workflows and boost business, the demand for top-notch workers who have a balance of technical and soft skills will continue to rise."


GDPR is encouraging UK IT directors to pay cyber ransoms


The Sophos study revealed that small businesses were least likely to consider paying a ransomware demand, with 54% of IT directors at UK companies with fewer than 250 employees ruling out paying their attackers, while just 11% of directors at companies with 500-750 employees said they would opt for this approach. The study, based on more than 900 interviews conducted by market research firm Sapio Research, also showed that UK IT directors are significantly more likely to pay up than their counterparts in other Western European countries. Of the five European countries studied, Irish IT directors were the least likely to pay: just 19% said they would “definitely” be willing to pay a ransom rather than a larger fine. IT directors in France, Belgium and the Netherlands were also less likely to pay a ransom, with only 33% of respondents in France, 24% in Belgium and 38% in the Netherlands saying they would “definitely” be willing to pay.


New Hacker Group Behind 'DNSpionage' Attacks in Middle East

"It's clear that this adversary spent time understanding the victims' network infrastructure in order to remain under the radar and act as inconspicuous as possible during their attacks," the Talos report noted. The new campaign is the second in recent months targeting Middle East organizations and is a sign of the recently heightened interest in the region among cyberattackers. In September, Check Point reported on new surveillance attacks on law enforcement and other organizations in Palestine and other Middle East regions by a group known as Big Bang. A Siemens report from earlier this year described oil and gas organizations in the Middle East as the most aggressively targeted in the world. Half of all cyberattacks in the region are aimed at companies in these two sectors. According to Siemens, a startling 75% of organizations in these sectors have been involved in at least one recent cyberattack that either disrupted their OT network or led to confidential data loss.


Cisco predicts nearly 5 zettabytes of IP traffic per year by 2022

Cisco says that since 1984, over 4.7 zettabytes of IP traffic have flowed across networks, but that’s just a hint of what’s coming. By 2022, more IP traffic will cross global networks than in all prior “internet years” combined up to the end of 2016. In other words, more traffic will be created in 2022 than in the first 32 years since the internet started, Cisco says. One of the more telling facts of the new VNI is the explosion of machine-to-machine (M2M) and Internet of Things (IoT) traffic. For example, M2M modules accounted for 3.1 percent of IP traffic in 2017 but will account for 6.4 percent by 2022, said Thomas Barnett, director of service provider thought leadership at Cisco. By 2022, M2M connections will make up 51 percent of the total devices and connections on the internet. A slew of applications – smart meters, video, healthcare monitoring, smart car communications, and more – will continue to contribute to significant growth in traffic. What that means is customers and service providers will need to secure and manage M2M traffic in new and better ways, Barnett said.


The journey to turning your organisation into a platform

A traditional organisation, which produces a product or service, can become a platform organisation that facilitates exchanges between producers, even its previous competitors, and consumers – it has swapped the means of production for the means of connection. Many platform organisations are now more valuable and durable than traditional companies. Consequently, firms and government agencies now investigate them in their annual strategy processes and innovation groups. So how do you make that journey from traditional “brownfield” organisation to one that can really benefit from the opportunity of being platform-centric? There are three phases to the journey: design, launch and grow: For traditional companies, the search for a platform business model starts outside in an emerging ecosystem, but should also relate to the value created in the existing business model, otherwise the organisation loses the potential competitive advantage of its relationships, intellectual property, products, services, domain knowledge, scale, data and so on.


Quantum Computing to Protect Data: Will You Wait and See or Be an Early Adopter?

One area of data protection that will be affected by quantum computing capabilities is encryption. You see, quantum computing will make current-day encryption practices obsolete. The traditional Public Key Infrastructure (PKI) system in use today could easily come crashing down when public keys become vulnerable to attack by quantum machines. Instead of years to decipher codes, we could be down to minutes, or even an instant. That changes life pretty darn dramatically. Just imagine all those security certificates issued for websites, emails and digital signatures to validate authentication becoming obsolete in a matter of minutes. We can already sense the drool from cyber criminals and adversarial nations. Here comes the “the sky is falling” talk, so here’s the disclaimer: we don’t expect this encryption calamity to happen tomorrow, but we do expect it to happen within our lifetime. It’s not unreasonable to think within a decade or so; the 10-15 year mark is plausible, especially once you take study and standardization into consideration. But that’s the problem with any new technology: timing.


How better standards can decrease data security spending needs

Companies across a variety of industries are feeling the strain of increasingly savvy malware and other digital attacks that threaten data security – but it’s not just information that’s at risk. These attacks are also putting pressure on budgets: 92 percent of companies are planning cyber security budget increases, according to a report by Enterprise Strategy Group. But can budgets keep up with growing security needs? Particularly for small businesses, the only option may be to standardize security practices to hold down costs. As in any industry, standardization makes it easier for companies to assess their needs and access appropriate tools, and helps reduce the overall cost of those tools. Data security, however, is a quickly changing field, creating a barrier to standardization. Recently, though, standardization at the highest levels – specifically, starting with the federal government – has opened new doors for companies seeking cyber security solutions that don’t cost a fortune and work better than current approaches.


The need for data literacy

There has been an explosion in the data available for decision making – and marketing is no different. In fact, many would argue that being able to understand data, in particular customer data, is now critical to success. For marketing to be truly successful, marketers need to put the customer at the heart of everything, from the initial product or service design right through to delivery and after-purchase support; having a clear understanding of customer data at each critical point is therefore a necessity. Because data is now so important, it is often referred to as ‘the new oil’ or ‘the universal language of this fourth industrial revolution’. What is for sure is that the modern marketer needs to be able to ask questions of machines and use data to build knowledge, make decisions and communicate its meaning to board members or stakeholders. The ability to translate data into usable information that can drive and articulate more meaningful campaigns to audiences is a key skill for modern marketers.


Sentiment Analysis: What's with the Tone?


A typical use case is feedback analysis. Depending on the tone of the feedback — upset, very upset, neutral, happy and very happy — the feedback takes a different path in a support center. Sentiment analysis is indeed widely applied in voice of the customer (VOC) applications. For example, when analyzing responses in a questionnaire or free comments in a review, it is extremely useful to know the emotion behind them in addition to the topic. A disgruntled customer will be handled in a different way from an enthusiastic advocate. From the VOC domain, the step to applications for healthcare patients or for political polls is quite short. Similarly, the number of negative vs. positive comments can decide the future of a YouTube video or a Netflix movie. How can we extract sentiment from a text? Sometimes even humans are not that sure of the real emotion when reading between the lines. Even if we manage to extract the feature associated with sentiment, how can we measure it?
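A toy lexicon-based scorer makes the routing idea concrete. Everything here – the word lists, thresholds and queue names – is invented for illustration; production VOC systems typically use trained models rather than simple word counts.

```python
# Illustrative lexicon-based sentiment routing for a support center.
# Word lists and thresholds are invented, not from any real system.
POSITIVE = {"great", "love", "excellent", "happy", "helpful"}
NEGATIVE = {"bad", "terrible", "upset", "slow", "broken"}

def sentiment_score(text: str) -> int:
    """Count positive words minus negative words."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def route_feedback(text: str) -> str:
    """Send feedback down a different path depending on its tone."""
    score = sentiment_score(text)
    if score <= -2:
        return "escalate"    # very upset
    if score < 0:
        return "priority"    # upset
    if score == 0:
        return "standard"    # neutral
    return "thank-you"       # happy or very happy

print(route_feedback("terrible service, app is slow and broken"))  # escalate
```

A real pipeline would also need to handle negation and sarcasm – exactly the "reading between the lines" problem the article raises.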



Quote for the day:


"An entrepreneur without funding is a musician without an instrument." -- Robert A. Rice Jr


Daily Tech Digest - November 27, 2018

Mass data fragmentation requires a storage rethink
It’s been estimated that up to 60 percent of secondary data storage is taken up by copies, needlessly consuming space, adding cost and raising risk. Worse, there is no re-purposing of the data for other use cases, such as test/development (where frequent copies of data are made for developers to test or stage their apps) or analytics (where data is copied and centralized in a lake or warehouse to run reports against). Today’s distributed, mobile organizations and easy access to cloud services mean there are more options than ever for data to be stored in multiple locations – perhaps without IT’s knowledge or control. And with the advent of edge computing and the Internet of Things (IoT), some data will never move from its edge location but will need to be managed in situ, away from conventional infrastructure and control. The specialized and siloed nature of secondary infrastructure and operations means IT is burdened with extra Opex and organizational overhead just to "keep the lights on," as well as extra cycles for coordination across functions to meet SLAs, recover from failures, manage upgrade cycles, troubleshoot support issues, and so on.



How to avoid the coming cloud integration panic

Enterprises typically don’t think about data, process, and service integration until there is a tactical need. Even then, they typically get around the issues by pulling together a quick and dirty solution, which often involves FTP, a file drop, or even Federal Express. The result of all this is that a lot of integration between the cloud and on-premises systems remains undone, be it data integration, process integration, or service integration. This will become a crisis in 2019 for many enterprises, because they can spend the entire year, or more, just pulling together integration solutions for their public cloud systems—which they now depend on for some mission-critical processes. To avoid that crisis, here’s what you need to do. First, catalog all data, services, and processes, using some sort of repository to track them all. You need to do this for all on-premises systems and all public cloud systems, and you need to do so with the intent of understanding most of the properties so you can make sure the right things are talking to the right things.
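As a sketch of what such a catalog repository might look like, the snippet below tracks assets by kind and location and flags dependencies that cross the cloud/on-premises boundary – the integrations most likely to have been left undone. The asset names and fields are hypothetical.

```python
from dataclasses import dataclass, field

# Minimal integration catalog: one record per asset, tagged by kind
# (data, service, process) and where it runs. All names are invented.
@dataclass
class Asset:
    name: str
    kind: str                 # "data" | "service" | "process"
    location: str             # "on-prem" | "aws" | "azure" | ...
    depends_on: list = field(default_factory=list)

catalog = [
    Asset("orders-db", "data", "on-prem"),
    Asset("billing-api", "service", "aws", depends_on=["orders-db"]),
]

# Flag dependencies that span cloud and on-prem: these are the
# integration points to review first.
by_name = {a.name: a for a in catalog}
cross = [(a.name, d) for a in catalog for d in a.depends_on
         if by_name[d].location != a.location]
print(cross)  # [('billing-api', 'orders-db')]
```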


TLA calls on tech industry to hire one million tech workers by 2023


TLA suggested increasing the amount of funding for female-founded businesses to increase diversity in the city’s tech sector, and recommended encouraging women to join investment firms to push up the likelihood of funding for female-led firms. Linda Aiello, senior vice-president of international employee success at Salesforce, said the “cognitive diversity” of teams created by having a mix of talent will help firms to better reflect their customers, and considering diversity in the tech industry is not only becoming “increasingly important” for product design, but should be considered at all levels of a company. “The technology sector, like almost every other industry, faces a diversity gap,” she said. “This is an issue that’s felt across all organisations and all sectors and it crosses so many threads from gender and race to religion, sexuality and socio-economic backgrounds – each of which contributes to the cognitive diversity of a team.” 


Researchers Use Smart Bulb for Data Exfiltration

For their experiment, the researchers used the Magic Blue smart bulbs, which work with both Android and iOS, and which rely on Bluetooth 4.0 for communication. The devices are made by a Chinese company called Zengge, which claims to be a supplier for brands such as Philips and Osram.  The bulbs are marketed as supporting Bluetooth Low Energy (Bluetooth LE or Bluetooth Smart) and the researchers focused on those using the Low Energy Attribute Protocol (ATT). Some of the bulbs are only Bluetooth Smart Ready, the researchers said.  The bulbs use Just Works as pairing method, which allowed Checkmarx to sniff the communication with the mobile application used for control. The Android application, the company discovered, works with other bulbs that have the same characteristics as well.  The researchers paired the mobile phone running the iLight app with the smart bulb and started controlling the device, while also attempting to capture the traffic.


How to implement Enterprise DevOps: 5 steps

Under a traditional IT operating model, there are generally too many handoffs between teams, said John Brigden, vice president of Amazon Web Services (AWS) Managed Services, during a Monday session at AWS re:Invent 2018. "You've got lots of handoffs when a change is made, or any kind of adjustment is made to the environment ... and that can result in loss of innovation, loss of speed, and a lot of other challenges the enterprise faces today," Brigden said during the session. The notion of DevOps and DevOps teams in general can also be flawed, he added. "You might have tens, even hundreds of DevOps teams in your environment, and if these DevOps teams are left to figure everything out for themselves—network configuration, security compliance, compliance with PCI, change management, automation, in addition to writing the application to achieve their business outcome —you can get to a place where you have a lot of non-standardization, a lot of complexity, and perhaps create an environment that could slow down what you're really trying to achieve," Brigden said.


Weren’t algorithms supposed to make digital mortgages colorblind?

Some online lenders, such as Upstart (which does not offer mortgages), have said their algorithms help reduce the cost of credit and give more people offers at better pricing than traditional lenders. Upstart uses “alternative” data about education, occupation and even loan application variables in its underwriting models. (For instance, people who ask for round numbers like $20,000 are a higher risk than people who ask for odder numbers like $19,900.) “A lot of variables that tend to be correlated with speed or lack of prudence are highly correlated with default,” Upstart co-founder Paul Gu said in a recent interview. “And indications that someone desperately needs the money right away will be correlated with defaults.” Such factors are less discriminatory than relying on FICO scores, which correlate with income and race, according to the online lender. But in the mortgage area, it appears that bank and fintech lenders are baking traditional methods of underwriting into their digital channels.
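The "round number" signal described above is easy to turn into an underwriting feature. The function below is a hypothetical illustration of that kind of feature engineering, not Upstart's actual model:

```python
def roundness(amount: int) -> int:
    """Trailing zeros in the requested amount: a crude proxy for how
    'round' a figure is (e.g. $20,000 -> 4, $19,900 -> 2).
    Illustrative only; not Upstart's real feature engineering."""
    if amount == 0:
        return 0
    zeros = 0
    while amount % 10 == 0:
        amount //= 10
        zeros += 1
    return zeros

print(roundness(20000))  # 4
print(roundness(19900))  # 2
```

A risk model could then treat higher roundness as a weak signal of a hastier, less planned request, alongside the other application variables.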


It’s complicated: How enterprises are approaching IAM challenges


IAM is all of these things and more – and for those running security in the enterprise, it is clear that living with the multiplicity of IAM is par for the course because IAM is more than just identity provisioning or access governance or single sign-on (SSO) or any one of a long list of disciplines. The success, or otherwise, of identity management in companies today relies on moving from singular and isolated technical initiatives to a full IAM programme – or at least having a plan for such a journey. “If you had to single out a sector at the cutting edge of IAM, it’s financial services,” says Martin Kuppinger ... “That’s because finances need good protection – and regulators and the sector itself have long required secure digital identities and standardised processes. Yet that’s only one part of the IAM story now, because next to this security-first identity agenda is a parallel consumer-convenience move being driven by the large digital companies that are developing a different kind of expertise in consumer identity management.”


Pattern Recognition and Machine Learning

Download Bishop Pattern Recognition and Machine Learning 2006
This leading textbook provides a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners. No previous knowledge of pattern recognition or machine learning concepts is assumed. This is the first machine learning textbook to include a comprehensive coverage of recent developments such as probabilistic graphical models and deterministic inference methods, and to emphasize a modern Bayesian perspective. It is suitable for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bioinformatics. This hard cover book has 738 pages in full colour, and there are 431 graded exercises. Solutions for these exercises and extensive support for course instructors are provided on Christopher Bishop’s page. Now available to download in full as a PDF.


Hiring tips: 9 secrets to working with IT recruiters

You can’t expect recruiting professionals, whether internal or external, to find the best talent if you’re not one hundred percent honest and open about the available role or roles, what you’re looking for, your timeline, what you’re willing to pay and the amount of competition for the vacancy, says Mondo’s Zafarino. “One thing that is key from the recruiter’s perspective is having full transparency from the CIO or IT hiring manager,” Zafarino says. “If there are internal candidates in the running, too; if you’re using other agencies as well, that’s fine. But you must communicate this to your recruiting partner. Let them know where your budget approval stands, or if you’re still working on getting the resources. And the most important thing is allocating the right amount of time for recruiters to fill the need. If it’s an urgent need, we’ll go full steam ahead, but if it’s a more passive potential hire then we’ll reallocate sources according to your needs and where you’re at in the process.”


Great Scrum Masters Are Grown, Not Born


Here's my assertion: Scrum Masters are Agile Coaches because they do what Agile Coaches at the program level do; they just do it within the scope of one or a few teams. They need all the skills and self-leadership that Agile Coaches at the program level need to be really effective for the teams they serve.  I am part of the working group ICAgile commissioned to refresh the Learning Path for Agile Coaching which was released earlier this year. When we got together, one of the main things we wanted to adjust in the community at large was this notion that a Scrum Master is somehow a less powerful role than Agile Coach or that it's even an administrative role that does not require a lot of skill. These were damaging applications of the roles that we saw across the industry. It resulted in stunted Scrum Masters who were not allowed to develop the skills needed to really help teams not only deliver, but deliver while improving team capabilities. The people on the ground need a full complement of skills because on the ground, with teams, day in and day out, is where the action is.



Quote for the day:


"Leadership happens at every level of the organization and no one can shirk from this responsibility." -- Jerry Junkins


Daily Tech Digest - November 26, 2018

The race to create a real-life Star Trek medical scanner


Basil Leaf Technologies is still working towards creating a Tricorder in the way that most people think of it: a single device that can diagnose a range of conditions. For a real-life Tricorder to serve as a universal diagnostic tool in the way that Star Trek envisioned, it would need to be able to analyse far more biomarkers than the DxtER currently does. Handily, scientists are also working on expanding the capabilities of Tricorder-like devices. Earlier this year, researchers from the University of Glasgow created a handheld sensor device based on a CMOS chip that can analyse a number of metabolites in blood or urine, analysing them to diagnose conditions including heart attacks. Elsewhere, companies are working on creating Tricorder type hardware with a focus on infectious disease: the Q-POC, made by QuantumDx, is expected to launch next year, and brings handheld diagnostics for bacterial and viral infections.


“The increasing use of hybrid cloud environments by enterprises also lines up nicely with the software-defined data center story, which HCI is certainly a large part of,” Lagana says. HCI has become a suitable platform for broader use due to a lot of the underlying improvements in the technology, Lagana says. At the same time, many enterprises have gone through an IT “refresh cycle” and HCI seems like a natural transition. “We’ve spoken with some HCI adopters and, in some cases, folks we’re talking to are upgrading multiple generation-old infrastructure running on old, sometimes now unsupported software,” Lagana says. “At that point, if the old server and/or storage technology they’re using is that far behind what’s now available, it becomes a matter of the level of complexity they’re seeking in their new environment.”


11 common wireless security risks you don't have to take


The best thing is to acknowledge your wireless ecosystem has security holes in it. This is even more likely when you have users connecting to random wireless hotspots at home, while traveling and so on. Even if you eliminate all the above vulnerabilities and implement WPA3, your business can be exposed to someone mimicking a legitimate AP -- the "evil twin" vulnerability, which has been around since the inception of Wi-Fi. Not only can an evil twin attack exploit network systems and information, but when it does happen you'll likely never know about it. The evil twin vulnerability can be mitigated using a wireless intrusion prevention system offered by many of the big networking vendors. Still, these systems won't protect your mobile users when they are out and about.



Regulator action will take time – six months is too early to get a proper read. Yet we can still get a feel for what is going on by looking at what’s happening in a given country. The UK is interesting: its Information Commissioner predates GDPR, as the UK’s privacy regulations go back to 1998. The UK commissioner is currently publishing findings and levelling fines after investigations into activities dating back to 2016. That gives us a feel for how long investigations may take under GDPR; perhaps we will not know the full impact, or the magnitude of fines levied, for another two years. Facebook was lucky in that its Cambridge Analytica troubles fell under the prior law, resulting in a £500,000 fine rather than the billions allowed by GDPR. Breaches at British Airways and others, which took place since GDPR became active, are being carefully monitored to see whether they were properly reported to the UK commission within the 72-hour limit after being discovered.

When working on complex challenges, you’ll need to try doing new things (new offerings) and doing old things in new ways (new processes). But this risk-taking has to be prudent. At my firm, new team members must have the diligence and humility to learn the established way of handling a problem before they invent a new way. We try small experiments in safe contexts (tweaking established offerings and processes with trusting and trusted partners) before trying big experiments in dangerous contexts. In Mexico, for instance, although the work involved a unique situation and lots of trial and error, a foundation of decades of relevant experience enabled us to advance. You can improvise well only if you have practiced a lot. ... Often, you can’t rely only on your own perspective. Ask for feedback: from your colleagues, clients, and anyone else involved with the problem you’re trying to solve. Ask casually and formally, verbally and in writing, and with specific and open-ended questions.



8 strategies to keep legacy systems running


While most organizations are moving business operations to Software as a Service (SaaS) and cloud computing solutions, some retain dependencies on legacy platforms and the software that runs on them. Maintaining access to these deprecated platforms can often be a source of frustration for IT, as aging hardware and software often require scavenging websites such as Craigslist or eBay for decades-old parts. However, new parts and software can be used in their place, making the process easier. For software that requires an older operating system, VirtualBox can readily be used to virtualize the OS and application, allowing the legacy environment to run on modern hardware. VirtualBox has a built-in host for Remote Desktop Protocol (RDP), allowing users to connect remotely to a VirtualBox VM. VirtualBox is more adept at handling virtualization for legacy software than QEMU/KVM or other modern hypervisors.


Why Deep Defense Should Start with Detecting Compromised Credentials


In a worst-case scenario, the credentials for an admin account could grant access to an advanced threat actor – once they are in the environment, they can move laterally, place backdoors, RATs and other software to become persistent, and exfiltrate the data of employees or customers to resell or use for their own financial gain. Though phishing and spear-phishing remain somewhat seminal techniques, particularly when combined with social engineering, malware use is often more efficient in terms of volume and timeliness than phishing. Though more complex skills are required for this tactic to be efficient, many malware families are openly sold as-a-service – AgentTesla, for example, is marketed at $6-15 per month, with customer support and updates available, bringing the barrier to entry down. Advanced attackers may use malware to infect machines and move laterally in an organization’s network.


New Linux crypto-miner steals your root password and disables your antivirus


The trojan itself is a giant shell script of over 1,000 lines of code. This script is the first file executed on an infected Linux system. The first thing the script does is find a folder on disk to which it has write permissions, so it can copy itself there and later use it to download other modules. Once the trojan has a foothold on the system, it uses one of two privilege escalation exploits – CVE-2016-5195 (also known as Dirty COW) or CVE-2013-2094 – to get root permissions and full access to the OS. The trojan then sets itself up as a local daemon, and even downloads the nohup utility to achieve this if the utility is not already present. After the trojan has a firm grasp on the infected host, it moves on to the primary function it was designed for: cryptocurrency mining. The trojan first scans for and terminates the processes of several rival cryptocurrency-mining malware families, and then downloads and starts its own Monero-mining operation.
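The trojan's first step – finding a directory it can write to – boils down to a simple permission scan. A minimal Python equivalent is sketched below (useful for understanding or auditing the behavior; the candidate directory list is an assumption for illustration, not taken from the malware itself):

```python
import os

# Sketch of a writable-directory scan like the trojan's first step:
# walk a list of candidate paths and return the first one the current
# user can write to. Candidate list is illustrative only.
CANDIDATES = ["/usr/local/bin", "/var/tmp", "/tmp", os.path.expanduser("~")]

def first_writable(paths):
    for p in paths:
        if os.path.isdir(p) and os.access(p, os.W_OK):
            return p
    return None

print(first_writable(CANDIDATES))
```

Defenders can run the same scan to see which world-writable locations on a host would give a foothold to this class of malware.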


How to tell which IoT predictions to pay attention to

Probably the main reason for the difficulty in predicting where the IoT market at large is going to go is that there's no general agreement on a precise definition of the boundaries of that market. Hence, the large number of large numbers purporting to describe the "size of the IoT market," which are frequently measuring very different aspects of it. “Everyone knows it’s going to be big,” said Alan Griffiths, principal consultant with market researcher Cambashi. “And no one’s got the faintest idea, in my opinion, of how big it’s going to be.” He talks to top technical people – CIOs and CTOs – for his estimates of the IIoT market, which gives him a better read on who’s buying what. Griffiths’ research on the IIoT market highlights another important point: IoT trend predictions focused on more specific market segments, or on particular technologies, tend to be a lot more digestible. The relevant details needed to create such an analysis are easier to get, and it’s more difficult to make guesswork look presentable.


Blockchain Implementations are still POCs


The problem is with well-intentioned people in the business and technology community who are still in awe of the promises of Bitcoin. They are now hurting the cause and becoming a burden by forcing a single line of thought, or a single defined checklist, onto any Blockchain implementation. I think the technology should be allowed to evolve organically and not be made a prisoner of the ‘original idea.’ I believe identifying the business problem you want to solve will be the key to the success of any Blockchain implementation (rather than the phrase ‘Blockchain implementation,’ it should really be ‘Blockchain network setup’ and ‘application implementations’ on that setup). Eliminating intermediaries is a Utopian idea in which one is asked to enter a business transaction trusting the programmers of a Blockchain platform rather than an entity that can be dragged into a court of law in the event of a dispute.



Quote for the day:


"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner


Daily Tech Digest - November 25, 2018

Artificial intelligence: Germans see no reason to fear robot coworkers

One example of how AI can benefit people is automated driving. Bosch is striving to make road transportation emissions-free, accident-free, and stress-free. With nine out of ten accidents currently attributable to human error, smart technology could use AI to prevent many of these from happening in the first place. Connected manufacturing is another banner field for AI. In a smart factory, people and machines will work together as an intelligent team. Robots will relieve people of strenuous and dangerous tasks and learn from experience, reducing people’s burden. The Bosch survey found that many Germans can imagine accepting this situation. Two-thirds of respondents – 67 percent – believe that manufacturing and mobility are going to benefit greatly from artificial intelligence. They are also open to working with a robot if it takes over routine chores. Half of all respondents could well imagine such a situation, and would above all devote the free time gained to social or creative activities.


Women in Blockchain: CryptoWendyO talks about her motivation

There’s so much negative energy directed at crypto from mainstream financial institutions because the public “doesn’t like change.” “Because crypto is intangible, it’s hard for the masses to understand. We saw this with the internet and credit cards. If you notice, the group of folks present when credit cards became mainstream still write cheques – as time progresses, so will the masses.” The recent falls after the hard fork mean the market – which is basic supply and demand – needs a “catalyst to bring in new money.” WendyO says: “There’s nothing we can do individually to stop negative price action. What we can do is support one another and continue to support the entrepreneurs building in the space. They are the key to mass adoption. Once blockchain projects are seamless and make life easier for the masses, they will come.” Asked why people are panicking so much, she says: “Price impacts the human psyche so much. People are entering into positions without proper risk management and education.”



We all know how the media and the film industry are overhyping AI with androids and over-intelligent systems. Some computer pioneers, with Alan Turing (you may want to watch The Imitation Game to appreciate the legend he is) at the forefront, set off on projects with a view to making machines that think. Turing, however, realised that this would be abysmally difficult, and in 1950 proposed: instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s? If this were then subjected to an appropriate course of education, one would obtain the adult brain. This idea grew into what we now call deep learning. Fast forward to 2018: we have, and are still gathering, massive amounts of data. We have, and are still developing, more and more advanced algorithms. But do we have the hardware to crunch all those calculations within reasonable time? And if we do, can it be done without all those GPUs causing another global warming on their own by literally heating up from all the processing?


Forget Robots, Blockchain Technology May Be the Real Threat to Your Job

Blockchain isn't just the technology behind the Bitcoin craze. It could also mean the end of the middle manager.
Traditionalists say this is a necessary component of an organization, freeing senior management to think strategically and move away from the day-to-day, while building a talent bench of the next generation of senior managers. Detractors ask what a middle manager actually adds to the bottom line, pointing to an unclear or difficult to define return on investment. The truth, as is often the case, lies somewhere in the middle. But it may not matter. Many organizations have clear, tangible, quantifiable key performance indicators for day-to-day functions, like sales closed or widgets shipped. With the advent of smart contracts on blockchain, it’s clear: robots aren’t the only ones gunning for your job. Blockchain technology is too. A smart contract is code designed to facilitate, verify or enforce performance of set terms. ... Notably, this is not a far-off concept—it’s something that, in many situations, could be implemented tomorrow.
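The point about smart contracts replacing a middle manager's sign-off can be sketched in a few lines. This is a hypothetical illustration, not any real contract platform's API: all class and parameter names (`MilestoneContract`, `report_kpi`, and so on) are invented, and the "contract" here is plain Python standing in for on-chain code that would verify a KPI and release escrowed funds automatically.

```python
# Hypothetical sketch: a "smart contract" as code that verifies performance
# terms and releases payment automatically, replacing a manager who would
# otherwise sign off on the milestone. Names are invented for illustration.

class MilestoneContract:
    """Holds funds in escrow and pays out when a measurable KPI is met."""

    def __init__(self, payer, payee, kpi_target, escrow_amount):
        self.payer = payer
        self.payee = payee
        self.kpi_target = kpi_target      # e.g. widgets shipped
        self.escrow = escrow_amount       # funds locked until terms are met
        self.settled = False

    def report_kpi(self, measured_value):
        """Called with an attested KPI reading; enforces the terms in code."""
        if self.settled:
            return "already settled"
        if measured_value >= self.kpi_target:
            self.settled = True
            return f"release {self.escrow} to {self.payee}"
        return "terms not yet met; escrow retained"

contract = MilestoneContract("Acme", "WidgetCo", kpi_target=1000, escrow_amount=5000)
print(contract.report_kpi(800))    # terms not yet met; escrow retained
print(contract.report_kpi(1200))   # release 5000 to WidgetCo
```

On a real blockchain the KPI reading would come from an oracle and the payout would be an on-chain transfer, but the managerial function being automated is exactly this conditional check.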


Rebooting analytics leadership: Time to move beyond the math
CAOs often find themselves doing this heavy lifting with a limited sphere of influence. They typically do not have the profit-and-loss or revenue accountability that would grant them due power in the organization. Moreover, like chief marketing officers a decade ago, CAOs need—but typically lack—a true seat at the C-suite table, placing them at a disadvantage when trying to obtain adequate funding or resources to power the analytics agenda. ... Arguably, none of the previous CAO personas could succeed in today’s landscape. We’ve entered an era that requires a new CAO persona—the Catalyst—who embraces a style of leadership geared toward addressing the current demands, roadblocks, and scrutiny most companies face when deploying AI and advanced analytics at scale. Catalysts approach their role very differently from past CAO personas, in ways that leaders from more scientific and technical backgrounds may never have attempted before.



How voice biometrics catches fraudsters


According to Costain, it is relatively easy for the system to identify a new voice. Often, a fraudster will phone in to check whether stolen credentials are valid, but in certain cases, the fraudster may scam the customer to obtain these credentials. “It’s a bit like epidemiology with Patient Zero,” he said. The same voice may try to access multiple accounts, which would signal an attempted fraud. RBS has also been compiling a database of evidence, which Costain said has led to a few police arrests of people who have made fraudulent calls. Over the next six months, the bank will have technology to enable customers to determine whether a call they receive from the bank is genuine, he said. Experian’s Global fraud report 2018 found that customers want to be recognised, while businesses want to address the growing fraud they are experiencing.
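The "Patient Zero" pattern Costain describes, one voice probing many accounts, amounts to a simple grouping check once the biometric engine has assigned each call a voiceprint. A minimal sketch under that assumption (the voiceprint IDs, account names, and threshold below are all invented for illustration):

```python
from collections import defaultdict

# Hypothetical sketch: assume each inbound call yields a voiceprint ID from
# the biometric engine. If the same voiceprint turns up against several
# different accounts, flag it as a likely fraudster testing stolen
# credentials ("Patient Zero" style).

calls = [
    ("vp_017", "acct_A"), ("vp_017", "acct_B"),
    ("vp_042", "acct_C"), ("vp_017", "acct_D"),
]

accounts_by_voice = defaultdict(set)
for voiceprint, account in calls:
    accounts_by_voice[voiceprint].add(account)

# Flag any voice seen against three or more distinct accounts.
suspects = {vp for vp, accts in accounts_by_voice.items() if len(accts) >= 3}
print(suspects)   # {'vp_017'}
```

A production system would weight this by time windows and match confidence, but the core signal is the same one-to-many mapping.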


AI and Neuroscience: A virtuous circle


Another key challenge in contemporary AI research is known as transfer learning. To be able to deal effectively with novel situations, artificial agents need the ability to build on existing knowledge to make sensible decisions. Humans are already good at this - an individual who can drive a car, use a laptop or chair a meeting is usually able to cope even when confronted by an unfamiliar vehicle, operating system or social situation. Researchers are now starting to take the first steps towards understanding how this might be possible in artificial systems. For example, a new class of network architecture known as a “progressive network” can use knowledge learned in one video game to learn another. The same architecture has also been shown to transfer knowledge from a simulated robotic arm to a real-world arm, massively reducing the training time. Intriguingly, these networks bear some similarities to models of sequential task learning in humans. These tantalising links suggest that there are great opportunities for future AI research to learn from work in neuroscience.
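The structural idea behind a progressive network can be shown in miniature. This is a toy sketch, not DeepMind's implementation: the layer sizes and weights are arbitrary, and only the forward pass is shown. The essential feature is that the first column's weights are frozen after training on task A, and the second column receives its hidden activations through lateral connections, so old knowledge is reused rather than overwritten.

```python
import numpy as np

# Toy sketch of a "progressive network" column pair. Column 1 is assumed
# already trained on task A and frozen; column 2 learns task B but also
# receives lateral connections from column 1's hidden layer.

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Column 1: frozen weights from the old task.
W1_a = rng.normal(size=(4, 8))   # input -> hidden (task A), never updated

# Column 2: trainable weights for the new task, plus lateral weights U
# mapping the frozen column's hidden activations into the new hidden layer.
W1_b = rng.normal(size=(4, 8))   # input -> hidden (task B)
U    = rng.normal(size=(8, 8))   # lateral: frozen hidden -> new hidden
W2_b = rng.normal(size=(8, 2))   # hidden -> output (task B)

def forward(x):
    h_a = relu(x @ W1_a)             # features computed with frozen weights
    h_b = relu(x @ W1_b + h_a @ U)   # new column sees old features laterally
    return h_b @ W2_b

x = rng.normal(size=(3, 4))          # batch of 3 inputs
print(forward(x).shape)              # (3, 2)
```

Training would update only `W1_b`, `U` and `W2_b`, which is what prevents the new task from erasing task-A knowledge.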


6 ways to include dark data in analytic strategies

The goal for CIOs is simple: find out what data is under company management that the company possibly didn't know it had. Then, develop a strategic data plan with executives that addresses what to do with this data so that it delivers its highest value to the company. ... As soon as it is determined that certain areas of data are useful, begin to digitalize and exploit them for value so you can get that data working for you. ... Outside data sources can enhance the value of data you already have under management. A prime example is the monitoring of Greenland's ice pack. If you monitor climate change and are concerned about the pace of global warming, you can study historical photos of Greenland's land mass from decades ago. Comparing Greenland as it was decades ago with how it is today can demonstrate both the impact and progression of global warming. ... As paper-based forms of unstructured data are digitalized, it is essential for the data to undergo quality assurance checks for integrity and quality.


Generative Adversarial Networks (GANs) – The Basics You Need To Know

As the name suggests, these are called adversarial networks because the architecture is made up of two neural networks contesting with each other, each with a different job. The first network is called the generator, because it generates new data instances. The second, the discriminator, evaluates the generator's output for authenticity. The cycle continues until the generated results approach the desired accuracy. ... To understand generative adversarial networks, it is important to first differentiate between supervised and unsupervised learning. ... GANs are a fairly new architecture in the deep learning domain. They fall under the unsupervised learning category, and on generative tasks they can far outperform traditional neural networks; they are best known for synthesizing realistic images, such as faces of people who do not exist.
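The generator/discriminator split can be made concrete with a structural sketch. This is a toy illustration only: the networks are single linear layers, the "real" data is arbitrary, and no training is performed, so it shows the two roles and the adversarial round rather than a working GAN.

```python
import numpy as np

# Structural sketch of a GAN (toy setup, no training): the generator turns
# random noise into fake samples, the discriminator scores samples as
# real vs fake, and training would alternate between the two contestants.

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Generator: noise (2-d) -> sample (3-d); a single linear layer here.
G = rng.normal(size=(2, 3))
# Discriminator: sample (3-d) -> probability the sample is real.
D = rng.normal(size=(3, 1))

def generate(n):
    noise = rng.normal(size=(n, 2))
    return noise @ G                       # fake data instances

def discriminate(samples):
    return sigmoid(samples @ D)            # in (0, 1): closer to 1 = "real"

real = rng.normal(loc=4.0, size=(8, 3))    # stand-in for the true data
fake = generate(8)

# One adversarial "round": D is scored on telling real from fake; G would
# then be updated to push D's score on fake samples toward "real".
d_score_real = discriminate(real).mean()
d_score_fake = discriminate(fake).mean()
print(fake.shape)                          # (8, 3)
```

In a real GAN both players are deep networks and the loop above runs for many iterations, with each update to one network changing the problem faced by the other.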


Distributed Machine Learning Is The Answer To Scalability And Computation Requirements


It was this challenge of handling large-scale data, given the limits on the scalability and efficiency of learning algorithms with respect to computational and memory resources, that gave rise to distributed ML. For example, if an algorithm's memory demands outpace the main memory available, the algorithm will not scale: it will be unable to process the training data set, or will not run at all due to memory restrictions. Distributed ML algorithms arose to handle very large data sets and to be efficient and scalable with regard both to accuracy and to computational requirements. They are part of large-scale learning, which has received considerable attention over the last few years thanks to its ability to spread the learning process across several workstations, using distributed computing to scale up learning algorithms. It is these advances that make ML tasks on big data scalable, flexible and efficient.
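The simplest form of this idea is data parallelism: shard the training set across workers, have each worker compute a gradient on its shard, then aggregate. The sketch below simulates four "workers" in one process (the data, shard count, and learning rate are illustrative choices, not from the article); with equal shards, the average of the shard gradients equals the full-batch gradient.

```python
import numpy as np

# Toy data-parallel training sketch: the training set is split into shards,
# each worker computes the gradient of a linear-regression loss on its
# shard, and the shard gradients are averaged before one global update.

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))
true_w = np.arange(5, dtype=float)
y = X @ true_w                      # noiseless targets for the demo

w = np.zeros(5)

def local_gradient(X_shard, y_shard, w):
    """Mean-squared-error gradient computed on one worker's shard."""
    err = X_shard @ w - y_shard
    return X_shard.T @ err / len(y_shard)

for step in range(200):
    shards = zip(np.array_split(X, 4), np.array_split(y, 4))  # 4 "workers"
    grads = [local_gradient(Xs, ys, w) for Xs, ys in shards]
    w -= 0.1 * np.mean(grads, axis=0)   # aggregate, then one global update

print(np.allclose(w, true_w, atol=1e-3))   # True
```

Real systems replace the list comprehension with workers on separate machines and the `np.mean` with an all-reduce or a parameter server, but the arithmetic being distributed is the same.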



Quote for the day:


"You can't just wish change; you have to live the change in order for it to become a reality." -- Steve Maraboli


Daily Tech Digest - November 23, 2018

Indian IT companies have had to transform every part of their process for faster growth 

To get here, Indian IT companies have had to change every part of their process—even becoming less Indian. They have shed jobs, changed how they work, upended part of their business models by focusing on hiring at client locations abroad, invested in training their employees, and chased acquisitions. The goal this time: growing digital revenue, in the hopes that it would offset the contraction in the traditional business, which is still over 60% of their revenues. “The core purpose of IT has changed to helping transform businesses and drive revenues from reducing cost and improving efficiency. This has led to a new wave of growth for IT, helping customers digitally transform their businesses,” said Hexaware CEO R Srikrishna. ... IT companies are looking at building the same campus recruitment engine in client markets as they did in India, and are focusing on current campus hires to become ambassadors for them at their universities. On Wednesday, Infosys announced it would hire 1,200 locals in Australia by 2020, over a third of whom would come from campuses. 



With more and more data centers being built, and once we are done bashing the economy and are ready to make hay as it starts turning around, we can alter the shape of our destiny dramatically. As other industries sag (think of hotels, airlines and the like, whose decline will certainly reduce carbon emissions somewhat), there is strong reason to believe that the spin-offs from remote-everything will go into overdrive. Today we are cautiously talking about online conferences; soon there will be lots of online activities firing up all over the place. This will lead to huge data-crunching operations, as more and more information arrives in audio, video and other formats and demands more processing power. It will be extremely demanding for data centers, no matter how centralized they are.


TechUK calls on Matt Hancock to fast-track NHS digitisation


Hancock has said he wants the health tech industry to thrive, which TechUK said is a great idea, but very far from the current status quo where tech companies find health and social care one of the most difficult sectors to “crack”. It called on Hancock to ensure better access to data and improved procurement. “Data, like oil, is worth nothing if it is left in the ground,” the manifesto said. “Far too much data is held in non-digital form or in siloed repositories making it impossible to join up. “Tech companies that need data to build, develop, test and prove their solutions find it difficult to access, while companies that produce valuable data find it difficult to feed back into the System to inform better decision making.” 


What machine learning means for software development

Fractal art
Will machine learning eat software, as Pete Warden and Andrej Karpathy have argued? After all, “software eating the world” has been a process of ever-increasing abstraction and generalization. A laptop, phone, or smart watch can replace radios, televisions, newspapers, pinball machines, locks and keys, light switches, and many more items. All these technologies are possible because we came to see computers as general-purpose machines, not just number crunchers. From this standpoint, it’s easy to imagine machine learning as the next level of abstraction, the most general problem solver that we’ve found yet. Certainly, neural networks have proven they can perform many specific tasks: almost any task for which it’s possible to build a set of training data. Karpathy is optimistic when he says that, for many tasks, it’s easier to collect the data than to explicitly write the program.


Socially Responsible Automation: A Framework for Shaping the Future

We define SRA as the set of technology choices, business strategies, innovation approaches, and management practices that move the affordances of automation beyond cost and performance efficiencies towards profitable and sustainable growth with more and better jobs driving economic development and social cohesion. SRA strives to optimize both business and social goals by adopting “common good” and “shared value” ideals. A minimal approach to SRA would be one where technology decisions are guided by their potential negative impact on jobs and the workforce; where mechanisms such as economic modelling, decision frameworks, and human factors approaches are employed to quantitatively and qualitatively assess technology choices and outcomes, and where appropriate trade-offs are made to balance the economic benefits of automation with the social costs of labor reduction and unemployment.


Chromecast (2018) review: Google's revamped media streamer is what you make of it

The new Chromecast isn’t much different from the second-generation model from 2015. The new design has rounder edges, but it’s still a small puck that hangs behind the TV on a 3-inch HDMI cable, and it still uses the TV’s USB port or a wall outlet for power. (You’ll likely need to choose the latter if you want the Chromecast to turn the TV on when it connects to your phone.) The way you use Chromecast hasn’t really changed, either: In lieu of a remote control and TV-based menus, Chromecast uses the streaming apps on your iOS or Android device as the interface. Apps that support Chromecast will show a cast button that links your device to the television, and whatever video you select will begin playing on the larger screen. You can also use the Chrome browser on a laptop or desktop to launch video from websites that offer that feature. What’s different, then?


How data scientists can help operational analytics succeed

A typical company has an organization that develops and an organization that operates. When I was consulting with PayPal, we had a group of talented professionals that constantly improved the functionality of the PayPal website. There was an equally talented group of professionals responsible for handling the operations of the production site. This operations group had a very different environment within which to succeed. That is why they had the best tools available to analyze what was happening at any point in time, and the best practices for troubleshooting problems in the moment. Data science can help tremendously with monitoring and troubleshooting. A key difference between operations and development is in their perspective of the status quo. For operations, stability is the goal—preserve the status quo; therefore, data science must be used to alert operators when the situation is not normal.
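The "alert operators when the situation is not normal" task reduces, in its simplest form, to learning a baseline from healthy history and flagging readings that stray too far from it. A minimal sketch, with invented numbers (the metric, its baseline distribution, and the threshold are all illustrative assumptions):

```python
import numpy as np

# Minimal anomaly-alerting sketch: learn the baseline mean and spread of a
# monitored signal from healthy history, then flag readings more than a
# few standard deviations away so operators are paged only when the
# status quo is genuinely broken.

rng = np.random.default_rng(3)
baseline = rng.normal(loc=120.0, scale=5.0, size=500)   # healthy latency, ms

mu, sigma = baseline.mean(), baseline.std()

def is_anomalous(reading, threshold=4.0):
    """Alert when the reading's z-score exceeds the threshold."""
    return abs(reading - mu) / sigma > threshold

print(is_anomalous(121.0))   # False: within normal variation
print(is_anomalous(260.0))   # True: not normal, alert the operators
```

Production monitoring would use rolling windows, seasonality models and tuned thresholds, but the data-science contribution is the same: a quantitative definition of "normal" that a pager rule can act on.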


Digital Well-being — It's time to look after ourselves


We keep choosing the immediate enjoyment of likes, reacts, swipes and claps over long-term flourishing, punching a hole in our well-being. But all this is not good for nothing: each swipe, click and reaction generates tonnes of revenue for companies, in exchange for your sleep. As they say, if you're not paying for it, you are the product. You may have come across recent announcements by Google and Apple addressing digital well-being by monitoring screen time. But to really achieve that state, one must possess the necessary digital skills. I have explained this using an analogy between two phases of life. At a young age, when children are exposed to real-world society, they are taught real-world skills such as language, manners and values, and given other resources, to overcome the challenges they might face in life. When children are similarly exposed to digital technology, are they prepared and equipped with the digital skills to face the challenges they may come across?


Malware Moves: Attackers Retool for Cryptocurrency Theft

Modular malware called TrickBot, which has also been used to mine for cryptocurrency, is up to new tricks. "TrickBot has traditionally targeted banking customers in multiple geographies to steal login credentials in order to commit identity fraud and facilitate fraudulent transactions," researchers at Digital Shadows say in a research report. But TrickBot's designers have been adding capabilities that appear designed to extend the malware's reach. In February, they added an open source monero cryptocurrency-mining module. And in March, they added the ability to crypto-lock devices, "potentially helping threat actors to extort victims," the research report says. Last month, Vitali Kremez, director of research at threat intelligence firm Flashpoint, warned that TrickBot had been updated to include a module designed to steal passwords from multiple types of applications and browsers.


Mirai Evolves From IoT Devices to Linux Servers

Netscout researchers say they have observed what appears to be a relatively small number of threat actors attempting to deliver the malware to Linux servers by exploiting a recently disclosed vulnerability in Hadoop YARN. The YARN vulnerability is a command injection flaw that gives attackers a way to remotely execute arbitrary shell commands on a vulnerable server. Many of the servers running Hadoop YARN are x86-based. Netscout has been tracking attempts to exploit the flaw using its global network of honeypots, and says it has observed tens of thousands of exploit attempts daily. In November alone, Netscout observed attackers attempting to deliver some 225 unique malicious payloads via the Hadoop YARN vulnerability. Of those, at least a dozen of the malware samples were Mirai variants.



Quote for the day:


"Adapt what is useful, reject what is useless, and add what is specifically your own." -- Bruce Lee