Daily Tech Digest - February 21, 2018

The New Era Of Artificial Intelligence


AI will soon become commoditized and democratized, just as electricity was in its time. Today we use computers, smartphones, other connected devices, and, mostly, apps. Whilst access to internet technologies has constantly improved over the past decades, very few people are able to program these and generate income by intelligently exploiting consumer data, which, in theory, is not theirs. GAFA (Google, Amazon, Facebook and Apple) and the Chinese BAT (Baidu, Alibaba and Tencent) are among the most prominent players in these fields. Tomorrow’s world would be different with the emergence of relatively simple, portable AI devices, which might not necessarily be connected to each other by the internet, but would feature completely new protocols and peer-to-peer technologies. This will significantly re-empower consumers. Because it is decentralized, portable AI will be available for the masses within a decade or so. Its use will be intuitive, just as driving a car is today. Portable AI will also be less expensive than motorized vehicles,


What Are DevSecOps and Vulnerabilities?

The principles of security and communications should be introduced at every step of the way when building applications. The philosophy of DevSecOps was created by security practitioners who seek to “work and contribute value with less friction”. These practitioners run a website that details an approach to improving security, explaining that “the goal of DevSecOps is to bring individuals of all capabilities to a high level of security efficiency in a short period of time. Security is everyone’s responsibility.” The DevSecOps statement includes principles such as building a lower-access platform, focusing on science, avoiding fear, uncertainty and doubt, collaboration, continuous security monitoring and cutting-edge intelligence. The DevSecOps community promotes action directed at detecting potential issues or exploitable weaknesses. In other words, think like an attacker and use similar tactics, such as penetration attempts, to identify gaps that can be exploited and need to be addressed.


7 essential technologies for a modern data architecture

At the center of this digital transformation is data, which has become the most valuable currency in business. Organizations have long been hamstrung in their use of data by incompatible formats, limitations of traditional databases, and the inability to flexibly combine data from multiple sources. New technologies promise to change all that. Improving the deployment model of software is one major facet of removing barriers to data usage. Greater “data agility” also requires more flexible databases and more scalable real-time streaming platforms. In fact, no fewer than seven foundational technologies are combining to deliver a flexible, real-time “data fabric” to the enterprise. Unlike the technologies they are replacing, these seven software innovations are able to scale to meet the needs of both many users and many use cases. For businesses, they have the power to enable faster and more intelligent decisions and to create better customer experiences.


Tesla cloud systems exploited by hackers to mine cryptocurrency

Researchers from the RedLock Cloud Security Intelligence (CSI) team discovered that cryptocurrency mining scripts, used for cryptojacking -- the unauthorized use of computing power to mine cryptocurrency -- were operating on Tesla's unsecured Kubernetes instances, which allowed the attackers to steal Tesla's AWS compute resources to line their own pockets. Tesla's AWS system also contained sensitive data, including vehicle telemetry, which was exposed through the theft of the unsecured credentials. "In Tesla's case, the cyber thieves gained access to Tesla's Kubernetes administrative console, which exposed access credentials to Tesla's AWS environment," RedLock says. "Those credentials provided unfettered access to non-public Tesla information stored in Amazon Simple Storage Service (S3) buckets." The unknown hackers also employed a number of techniques to avoid detection. Rather than using typical public mining pools in their scheme


Micron sets its sights on quad-cell storage

Single-level cell (SLC) flash, with one bit per cell, first emerged in the late 1980s when flash drives first appeared for mainframes. In the late 1990s came multi-level cell (MLC) drives capable of storing two bits per cell. Triple-level cell (TLC) didn't come out until 2013, when Samsung introduced its 840 series of SSDs. So, these advances take a long time, although they are being sped up by a massive increase in R&D dollars in recent years. Multi-bit flash memory chips store data by managing the number of electronic charges in each individual cell. With each additional bit per cell, the number of voltage states doubles: SLC NAND tracks only two voltage states, while MLC has four, TLC has eight, and QLC has 16. This translates to a much lower tolerance for voltage fluctuations. As density goes up, the computer housing the SSD must be rock-stable electrically, because without that stability you risk damaging cells. This means putting supporting electronics around the SSD to protect it from fluctuations.
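
As a quick illustration of why each added bit is so much harder to engineer, here is a minimal sketch with my own toy arithmetic (the 3.3 V read window is an assumed placeholder, not a Micron figure) of how the state count grows and the sensing margin shrinks:

```python
# Toy arithmetic only: states per cell double with each added bit, so the
# voltage margin separating adjacent states shrinks accordingly.
READ_WINDOW_V = 3.3  # assumed nominal read window, not a vendor spec

for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    states = 2 ** bits
    margin_mv = READ_WINDOW_V / states * 1000
    print(f"{name}: {bits} bit(s)/cell, {states} states, ~{margin_mv:.0f} mV per state")
```

The QLC row lands around 200 mV per state, which is why small electrical fluctuations that SLC shrugged off become a real data-integrity risk at four bits per cell.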



When it comes to cyber risk, execute or be executed!

Accountability must be clearly defined, especially in strategies, plans and procedures. Leaders at all levels need to maintain vigilance and hold themselves and their charges accountable to execute established best practices and other due care and due diligence mechanisms. Organizations should include independent third-party auditing and pen-testing to better understand their risk exposure and compliance posture. Top organizations don’t use auditing and pen-testing for punitive measures, but rather, to find weaknesses that should be addressed. Often, they find that personnel need more training, and regular cyber drills and exercises to get to a level of proficiency commensurate with their goals. Those organizations that fail are those that do not actively seek to find weaknesses or fail to address known weaknesses properly. Sound execution of cyber best practices buys down your overall risk. With today’s national prosperity and national security reliant on information technology, the stakes have never been higher.


Hack the CIO

CIOs have known for a long time that smart processes win. Whether they were installing enterprise resource planning systems or working with the business to imagine the customer’s journey, they always had to think in holistic ways that crossed traditional departmental, functional, and operational boundaries. Unlike other business leaders, CIOs spend their careers looking across systems. Why did our supply chain go down? How can we support this new business initiative beyond a single department or function? Now supported by end-to-end process methodologies such as design thinking, good CIOs have developed a way of looking at the company that can lead to radical simplifications that can reduce cost and improve performance at the same time. They are also used to thinking beyond temporal boundaries. “This idea that the power of technology doubles every two years means that as you’re planning ahead you can’t think in terms of a linear process, you have to think in terms of huge jumps,” says Jay Ferro, CIO of TransPerfect, a New York–based global translation firm.


Taking cybersecurity beyond a compliance-first approach

With high-profile security breaches continuing to hit the headlines, organizations are clearly struggling to lock down data against the continuously evolving threat landscape. Yet these breaches are not occurring at companies that have failed to recognize the risk to customer data; many have occurred at organizations that are meeting regulatory compliance requirements to protect customer data. Given the huge investment companies in every market are making in order to comply with the raft of regulation that has been introduced over the past couple of decades, this continued vulnerability is – or should be – a massive concern. Regulatory compliance is clearly no safeguard against a data breach. Should this really be a surprise, however? With new threats emerging weekly, the time lag inherent in the regulatory creation and implementation process is an obvious problem. It can take over 24 months for regulators to understand and identify weaknesses within existing guidelines, update and publish requirements, and then set a viable timeline for compliance.


Three sectors being transformed by artificial intelligence


While these industries will see significant AI adoption this year, the AI platforms and products that scale to mainstream adoption won’t necessarily be the household names you may expect. As the “Frightful Five” continue to grow and expand their reach across industries, they have designed powerful AI products. However, these platforms present challenges for smaller companies looking to implement AI solutions, as well as for larger companies in competitive industries such as retail, online gaming, shipping, and travel, to name a few. How can an advertiser on Facebook feel comfortable entrusting its data to a tech behemoth that may sell a product that competes with its business? Should a big data company using a Google AI feature be concerned about the privacy of its data? These risks are very real, yet businesses have options. They can instead choose to host data on independent platforms with independent providers, guarding their intellectual property while also supercharging the advancement of AI technology.


What the ‘versatilist’ trend means for IT staffing

IT staff who once focused only on systems in the datacenter now focus on systems in the public cloud as well. This means that while they understand how to operate the LAMP stacks in their enterprise datacenters, as well as virtualization, they also understand how to do the same things in a public cloud. As a result, they have moved from one role to two roles, or even more. However, the expectation is that eventually the traditional systems will go away completely, and they will just be focused on the cloud-based systems. I agree with Gartner on that, too. While I understand where Gartner is coming from, the more automation that sits between us and the latest technology, the more technology specialists we need, not fewer. So, I’m not convinced that IT versatilists will gain new business roles to replace the loss of the traditional datacenter roles, as Gartner suggests will happen.



Quote for the day:


"We're so busy watching out for what's just ahead of us that we don't take time to enjoy where we are." -- Bill Watterson


Daily Tech Digest - February 20, 2018

Regression Testing Strategies: An Overview


Change is the key concept of regression testing. The reasons for these changes usually fall into four broad categories:

New functionality. This is the most common reason to run regression testing. The old and new code must be fully compatible. When developers introduce new code, they don't fully concentrate on its compatibility with the existing code, so it is up to regression testing to find possible issues.

Functionality revision. In some cases, developers revise the existing functionality and discard or edit some features. In such situations, regression testing checks whether the feature in question was removed or edited with no damage to the rest of the functionality.

Integration. In this case, regression testing assures that the software product performs flawlessly after integration with another product.

Bug fixes. Surprisingly, developers' efforts to patch the found bugs may generate even more bugs. Bug fixing requires changing the source code, which in turn calls for re-testing and regression testing.



How the travel industry is using Big Data to tailor-make your holidays


It doesn’t take much paranoia to see how this is obviously beneficial to the airlines: your type of credit card gives a rough idea of your credit score, your billing address can give an idea of your social status, and even your email address says something about you. Plus, it’s easy to spot if you regularly fly alone. Or is your family with you? Is a certain financially unconnected person always in the seat next to you? Are you flying to a ‘romantic’ location? Did you book a nice hotel, or are you a cheapskate? Are any of your Facebook friends or Twitter followers on the flight? What have you been looking at on the in-flight WiFi? And what events are happening in the area you bought your flight to? All this data allows airlines to develop better models of their customers, and therefore gives them ever better ways of refining their pricing models. Certain airlines are already running reverse auctions on upgrades, but this could be taken further.



5 Ways Blockchain Is Changing The Face Of Innovation in 2018


The volatility in cryptocurrencies is well-known and not for the faint-hearted, especially over recent weeks. Blockchain-based payment network Havven sets out to provide the first decentralized solution to price stability. Designed to provide a practical cryptocurrency, Havven uses a dual token system to reduce price volatility. The fees from transactions within the system are used to collateralise the network, secured by blockchain and supposedly enabling the creation of an asset-backed stablecoin. Think of Tether, but without being tied to the dollar. Each transaction generates fees that are paid to holders of the collateral token, and as transaction volume grows, the value of the platform increases. Havven is a low-fee, stable payment network that wants to enable anyone anywhere to transact with anyone else. It's an interesting addition to the increasingly crowded crypto space.


Could we soon be seeing utility cryptocurrency mining?

Proof-of-work is the main model for cryptocurrency mining and blockchain, especially for Bitcoin. Basically, the way to guarantee the order of transactions is to slow down the system and make it computationally onerous to add a new block – i.e. it takes time and computing capacity. If two blocks are added simultaneously, then it is basically a competition to see who can perform the calculation tasks faster and add more to the chain, because the longer fork wins. The reward for adding a block is to receive some tokens (e.g. Bitcoins). SHA-256 (Secure Hash Algorithm), which came with Bitcoin, is a commonly used model, and there are targets for the hash value that basically force miners to perform a lot of calculations for each block to achieve the targeted value. The benefit of the current algorithm is that the results are easy to verify, so it is simple to see whose block is added to the chain. It would probably take quite a lot of work to develop models in which miners perform some otherwise useful computation as proof of work.
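
To make the "computationally onerous, cheap to verify" property concrete, here is a minimal toy sketch (illustrative only; Bitcoin's real target arithmetic and block format differ):

```python
import hashlib

# Toy proof-of-work: find a nonce so that SHA-256(block_data + nonce)
# starts with a given number of zero hex digits. Lowering the target
# (more leading zeros) makes finding a valid nonce exponentially harder.
def mine(block_data: str, difficulty: int = 4) -> int:
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # cheap for anyone to verify, costly to find
        nonce += 1

print(mine("block: alice->bob 5 BTC", difficulty=4))
```

Each extra leading zero in the target multiplies the expected number of hash attempts by 16, while verification always remains a single hash.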


For artificial intelligence to thrive, it must explain itself


The reason for this fear is that deep-learning programs do their learning by rearranging their digital innards in response to patterns they spot in the data they are digesting. Specifically, they emulate the way neuroscientists think that real brains learn things, by changing within themselves the strengths of the connections between bits of computer code that are designed to behave like neurons. This means that even the designer of a neural network cannot know, once that network has been trained, exactly how it is doing what it does. Permitting such agents to run critical infrastructure or to make medical decisions therefore means trusting people’s lives to pieces of equipment whose operation no one truly understands. If, however, AI agents could somehow explain why they did what they did, trust would increase and those agents would become more useful. And if things were to go wrong, an agent’s own explanation of its actions would make the subsequent inquiry far easier. Even as they acted up, both HAL and Eddie were able to explain their actions. 


Build a multi-cloud app with these four factors in mind

A key driver behind multi-cloud adoption is increased reliability. In 2017, Amazon's Simple Storage Service went down due to a typo in a command executed during routine maintenance. In the pre-cloud era, the consequences of an error like that would have been relatively negligible. But, due to the growing dependence on public cloud infrastructure, that one typo reportedly cost upwards of $150 million in losses across many companies. A multi-cloud app -- or an app designed to run on various cloud-based infrastructures -- helps mitigate these risks; if one platform goes down, another steps in to take its place. ... Infrastructure changes should take days, not months. Regardless of the reason -- to save money, to prevent vendor lock-in or simply to run your app in a development environment without design compromises -- writing code without a specific cloud platform in mind helps ensure it will run on any platform.
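
One common way to write code without a specific cloud in mind is to hide storage behind an interface. The sketch below is a minimal illustration under that assumption; the class names and bucket layout are invented, and real adapters would wrap boto3, google-cloud-storage, and so on:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Cloud-neutral storage interface the application codes against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

class LocalStore(ObjectStore):
    """In-memory stand-in; an S3Store or GCSStore adapter would match it."""
    def __init__(self):
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

def save_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application code depends only on the interface, so failing over
    # to a second provider needs no rewrites, only a different adapter.
    store.put(f"reports/{name}", body)

save_report(LocalStore(), "q1.csv", b"revenue,cost\n")
```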




The Impact Of Artificial Intelligence Over The Next Half Decade

You will get a fully automated health checkup every time you take a bath or use the toilet at your house. Body fluids and temperature will be analyzed by sensors, and the data will be forwarded to an “AI doctor” that will be able to inform you if there is something wrong with you and how to proceed. OK, maybe this one will take a little longer than a decade. “ASIMO”-like droids will begin to be sold as “physical personal assistants” – not so different from the “common” robots in the movie AI; mainly to perform nursing support for an aging population. Cognitive Augmentation – as Maurice Conti explained, we are already “augmented”. Each and every one of us has a smartphone which is connected to the Internet and can easily reach out to a simple service like Google to get immediate knowledge about some unknown fact of life upon needing it. 

What an artificial intelligence researcher fears about AI


Along the way, we will find and eliminate errors and problems through the process of evolution. With each generation, the machines get better at handling the errors that occurred in previous generations. That increases the chances that we’ll find unintended consequences in simulation, which can be eliminated before they ever enter the real world. Another possibility that’s farther down the line is using evolution to influence the ethics of artificial intelligence systems. It’s likely that human ethics and morals, such as trustworthiness and altruism, are a result of our evolution — and factor in its continuation. We could set up our virtual environments to give evolutionary advantages to machines that demonstrate kindness, honesty and empathy. This might be a way to ensure that we develop more obedient servants or trustworthy companions and fewer ruthless killer robots. While neuroevolution might reduce the likelihood of unintended consequences, it doesn’t prevent misuse.
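
As a loose illustration of "giving evolutionary advantages" to desired traits, here is a toy genetic-algorithm sketch. It is entirely my own simplification: real neuroevolution evolves network weights and topologies, and the "cooperation" trait here is just a stand-in bit in the genome.

```python
import random

def fitness(genome: list[int]) -> float:
    task_score = sum(genome)                     # stand-in for task skill
    cooperation_bonus = 5.0 if genome[0] == 1 else 0.0  # shaped trait reward
    return task_score + cooperation_bonus

def evolve(pop_size=20, genes=10, generations=50) -> list[int]:
    pop = [[random.randint(0, 1) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # selection pressure
        children = [[g ^ (random.random() < 0.05) for g in p]  # mutation
                    for p in parents]
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())  # best genome nearly always carries the rewarded trait
```

Because the bonus term tilts selection, genomes carrying the rewarded trait dominate within a few generations, which is the shaping effect the passage describes.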


Red Hat CIO talks open hybrid cloud, next-generation IT

There's no single roadblock on the journey, which is ongoing. But the biggest hurdle is one of people: having your people ready with the skills needed for this. We looked at this and asked: What are the types of skills we need resident in our team to live in this world? Do we want to hire people or leverage contractors? Then we built some programs around efforts to upskill our people; it's incumbent on us to help them learn new skills. But we had a mix of all three [new hires, contractors and upskilled staff]. I don't think it's pragmatic to think you can do one versus the other. I think you need to think about all three of those. [On the other hand] just giving it to a provider and saying, 'Go figure this out,' is a recipe for disaster. You have to stay very engaged.


Growing an Innovative Culture


Creating an innovative culture requires strong leaders who realise that changes in the culture have to start with themselves. We speak to many executives who think they can change the culture by creating a special team to foster innovation. This is not a "make it so" change. It requires everyone (including the executive) to behave differently in order to change the culture. Most executives and upper management are not motivated to change their behavior, as their rewards system is usually based on short-term financial measures and not value delivery to customers and other stakeholders. Organisational risk aversion is another big barrier to innovation. We are frequently asked to provide stories to executives on how their competitors or other organisations much like themselves have implemented innovation. No one wants to be the first to try something new or different, for fear of failure.



Quote for the day:


"Leaders who won't own failures become failures." -- Orrin Woodward


Daily Tech Digest - February 19, 2018


The problem is that employee satisfaction can be a double-edged sword. While satisfied employees are good for current activities, that very satisfaction can inhibit innovation. Transformative innovation is difficult. It is far easier to stick with what we know works and tweak the current process than it is to start over. People who are satisfied with the current way of doing business are not likely to transform it. People who transform their organizations must be aggravated enough with the current situation that they’re willing to bear the effort and risk to change it. Leaders who want their organizations to continuously transform must not only look for dissatisfaction on which to capitalize, but also be willing to cultivate dissatisfaction in their employees. ... The right kind of dissatisfaction is a mindset of constantly questioning the status quo and striving for more-than-incremental change. The wrong kind is constantly finding fault with the current situation, arguing that it is somebody else’s fault and assuming it’s somebody else’s responsibility to fix.



Dear IT security pros: it's time to stop making preventable mistakes

Just think about it – how many log analysis services do you know? They generally all have a nice UI. The same goes for SIEMs. But the confusion comes with the graphic and alert overload – red and green icons telling analysts there are numerous findings that require attention. Security analysts usually don’t know which alerts to start executing on – and it’s hard to determine which alert carries the highest risk and which is just noise because no one has adjusted its threshold. And to make matters worse, once a security analyst has opened an alert to start vetting it, they’re usually too scared to close down wide-open-to-the-internet ports because they don’t know the extent of the impact that will have on the company’s production environment. As a security advisor, the thing that really irritates me is just how preventable most (if not all) of the 2017 attacks I researched were. Companies like Equifax are not being decimated by unusually savvy hackers; they are being exposed by their own internal mistakes. Most of these errors are straight out of any “Tech Security 101” textbook.



Global cyber risk perception: Highest management priorities

The survey also found that a vast majority – 75% – identified business interruption as the cyber loss scenario with the greatest potential to impact their organization. This compares to 55% who cited breach of customer information, which has historically been the focus for organizations. Despite this growing awareness and rising concern, only 19% of respondents said they are highly confident in their organization’s ability to mitigate and respond to a cyber event. Moreover, only 30% said they have developed a plan to respond to cyber-attacks. “Cyber risk is an escalating management priority as the use of technology in business increases and the threat environment gets more complex,” said John Drzik, president of Global Risk and Digital at Marsh. “It’s time for organizations to adopt a more comprehensive approach to cyber resilience, which engages the full executive team and spans risk prevention, response, mitigation and transfer.”


Meaningful AI Deployments Are Starting To Take Place: Gartner

Meaningful Artificial Intelligence (AI) deployments are just beginning to take place, according to Gartner. Gartner’s 2018 CIO Agenda Survey shows that 4% of CIOs have implemented AI, while a further 46% have developed plans to do so. "Despite huge levels of interest in AI technologies, current implementations remain at quite low levels," said Whit Andrews, research vice president and distinguished analyst at Gartner. "However, there is potential for strong growth as CIOs begin piloting AI programs through a combination of buy, build and outsource efforts." As with most emerging or unfamiliar technologies, early adopters are facing many obstacles to the progress of AI in their organizations. Gartner analysts have identified the following four lessons that have emerged from these early AI projects


Hacking critical infrastructure via a vending machine? The IoT reality

Many are currently, and rightly, concerned about protection from outside threats getting into important networks. The latest firewalls, intrusion prevention systems and advanced protection systems all play a part in defence, but as more and more connected devices enter networks, it is now critical to look at threats from within as well. If firms do not have proper infrastructure to support IoT devices, they risk exposing their corporate networks to malicious activities. This can lead to devastating effects, especially if hackers uncover vulnerabilities in IoT devices within critical infrastructure. A good starting point for businesses taking network security seriously in today's hyper-connected world is to increase awareness of all the devices on the network and implement centralised management systems that help ensure compliance.


Ok, I Was Wrong, MDM is Broken Too: Insular, Dictatorial MDM Doesn’t Work

In master data management, fundamentally, your data problems are not technology problems. They are not even MDM problems. Your data problems aren't even really, well … data problems. They are business problems. They are the problem of getting four business people, three data stewards and several application managers into a room to formally agree on what revenue is for a customer record stored in the sales, marketing, ERP, and finance systems. MDM problems are about getting the right people educated, motivated and in agreement. And this can be messy and difficult. When you succeed with MDM, you succeed by working from the business down. When you fail, you fail because you design and implement something around a technology first and then ‘release’ your master data solution to various practitioners around your company and expect them to comply. Like my peers in my freshman programming course, we race to implement without spending enough time planning, negotiating and understanding.


Dissect the SQL Server on Linux high availability features


The availability group configurations that provide both high availability and data protection require three synchronous replicas. When there is no Windows Server failover cluster, the availability group configuration is stored in the master database on the participating SQL Server instances, and at least three synchronous replicas are needed to provide high availability and data protection. An availability group with two synchronous replicas can provide data protection, but this configuration cannot provide automatic high availability. If the primary replica has an outage, the availability group will automatically fail over. However, applications cannot automatically connect to the availability group until the primary replica is recovered. You can have a mixed availability group that contains both Windows and Linux replicas, but Microsoft only recommends this for data migration.
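
For monitoring such a configuration from Python, here is a hedged sketch that queries the standard availability-group DMVs. The server name and connection string are placeholders, and it assumes the pyodbc package plus the Microsoft ODBC driver are installed:

```python
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Placeholder connection details; point these at a real replica.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sql1;UID=sa;PWD=..."
)

# Join the availability-group catalog views to the runtime state DMV
# to see each replica's role and synchronization health.
rows = conn.cursor().execute("""
    SELECT ag.name, ar.replica_server_name,
           rs.role_desc, rs.synchronization_health_desc
    FROM sys.availability_groups ag
    JOIN sys.availability_replicas ar ON ag.group_id = ar.group_id
    JOIN sys.dm_hadr_availability_replica_states rs
         ON ar.replica_id = rs.replica_id
""").fetchall()

for name, replica, role, health in rows:
    print(name, replica, role, health)
```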


“Less is More”: Four Steps to Aligning Your Project Queue and Goals Today

Today, as grown-ups, we find that “busywork” no longer holds the cachet it once may have. With corporate belts tightening and analytics available that expose the efficacy of each and every tactic, bloat can be harmful or fatal to even the most well-intentioned of marketing professionals. And with 40 percent of corporate enterprises still bemoaning the fact that they can’t prove the ROI of their marketing activities, it’s clear that in many marketing departments the project queue may be filled with plenty to keep the team busy – but is it hitting the mark? I recently spent time with a financial services client that was struggling to define growth as it battled for market share in a crowded segment. Upon evaluating its marketing portfolio, it was clear that it had completed many projects in the recent past – but only a handful had yielded what one would consider to be “big wins.”


How to connect to a remote MySQL database with DBeaver

If your database of choice is MySQL, you have a number of options. You can always secure shell into that server and manage the databases from the command line. You can also install a tool like phpMyAdmin or adminer to take care of everything via a web-based interface. But what if you'd prefer to use a desktop client? Where do you turn? One possible option is DBeaver. DBeaver is a free, universal SQL client that can connect to numerous types of databases—one of which is MySQL. I want to show you how to install and use DBeaver to connect to your remote MySQL server. DBeaver is available for Windows, macOS, and Linux. I'll be demonstrating on a Ubuntu 17.10 desktop connecting to a Ubuntu Server 16.04. The installation of DBeaver is fairly straightforward, with one hitch. Download the necessary .deb file from the downloads page and save it to your ~/Downloads directory. Open up a terminal window and change into that directory with the command cd ~/Downloads.


5 things that will slow your Wi-Fi network

The 2.4 GHz frequency band has 11 channels (in North America), but provides only up to three non-overlapping channels when using the default 20 MHz-wide channels, or just a single channel if using 40 MHz-wide channels. Since neighboring APs should be on different non-overlapping channels, the 2.4 GHz frequency band can become crowded very quickly. The 5 GHz band, however, provides up to 24 channels. Not all APs support all the channels, but all the channels are non-overlapping if using 20 MHz-wide channels. Even when using 40 MHz-wide channels, you could have up to 12 non-overlapping channels. Thus, in this band, you have less chance of co-channel interference among your APs and any other neighboring networks. You should try to get as many Wi-Fi clients as you can to use the 5 GHz band on your network to increase speeds and performance. Consider upgrading any 2.4 GHz-only Wi-Fi clients to dual-band clients.



Quote for the day:



"Learn to do favors not for the people that can later return the favor but for those that need the favor." -- Unknown


Daily Tech Digest - February 18, 2018

Ready or Not, It's Time to Embrace AI

AI has changed online commerce by enabling brands to make sense of their data and put it to good use with smarter algorithms. In this age of conversational commerce, artificial intelligence is critical to providing a personalized experience. Businesses without an AI strategy are almost certain to perish. According to Forrester, insights-driven businesses will "steal" $1.2 trillion per year from their "less-informed peers." Until a few months back, only bigger companies could afford the sizable investments required to implement AI. That's no longer the case. AI is becoming more accessible to businesses of all sizes. In the next few years, AI will continue to expand its reach throughout organizations. Early adopters already are reaping the benefits. If you're not one of them, now is the time to start. Here are four reasons you (and every small-business owner) should incorporate AI-enabled technology in your sales and customer-service strategies.


Artificial Intelligence And The Threat To Salespeople


If you work in sales, now is the time to step your game up in a major way. Companies are investing in technology to replace salespeople. The truth is, your company thinks you're overpaid. If you're a salesperson, you're probably making six, seven or eight figures a year, and your company believes it's too much money. Now, listen, I'm not here to give you good news. I'm here to give you the truth. Here's what I see in the wave of the future. Those who know how to program the technology, operate the robots and work with artificial intelligence — the computer programs, algorithms, etc. — will be the salespeople remaining in their jobs. No longer will you be able to say, "People expect service. They want me to answer when the phone rings." Admin jobs will be automated. ... A human. We can expect a slew of robots to replace a lot of mid-level income earners. Many salespeople making six and seven figures a year will be removed, no matter their skills and sales. Artificial intelligence is far stronger than our natural-born intelligence.


Why is it so hard to train data scientists?

A data scientist should be familiar with databases, as much of the world’s data is organized in relational and non-relational databases. For working with a variety of data types, the data scientist needs to be able to parse and render files and convert between data formats. Working with large databases often requires programming skills beyond basic scripting in R or Python, as well as knowledge of algorithm design and operating systems. Machine learning is also a required skill. In other words, a complete data scientist should have knowledge of computer science at the level of a trained computer scientist. A data scientist must also be highly familiar with statistics, and understand multiple statistical methods for tasks such as regression, dimensionality reduction, statistical significance analysis, Monte Carlo simulations, and Bayesian methods, to name a few.


Trend Micro Cybersecurity Reference Architecture for Operational Technology

Vulnerabilities arise particularly when just-in-time manufacturing and a faster speed to market leave less time for product safety testing. These vulnerabilities might not be uncovered until millions of vehicles have been released, in which case the necessary patching procedure is all but certain to prove even more costly — not only to the affected carmaker’s finances but also to its reputation. It’s important, then, for security measures to be properly applied right from the outset of the car manufacturing process, starting in the design phase. That is why it is important for device manufacturers to integrate security into the device itself, to ensure consumers and businesses are protected from these challenges the minute they install an IoT device. Because of these challenges, Trend Micro has developed a cybersecurity solution called Trend Micro Internet of Things (IoT)


Is REST the New SOAP?

Almost a decade ago there was a flurry of activity around REST- and SOAP-based systems. Several authors wrote about the pros and cons of one or the other, or about when you should consider using one instead of the other. However, with much attention moving away from SOAP-based Web Services to REST and HTTP, the arguments and discussions died down, and many SOA practitioners adopted REST (or plain HTTP) as the basis for their distributed systems. Recently, though, Pakal De Bonchamp wrote an article called "REST is the new SOAP" in which he compares using REST to "a testimony to insanity". His article is long and detailed, but among the key points he makes is the complexity, in his view, of simply exposing a straightforward API: something that could be done via an RPC mechanism in "a few hours" can take much longer with REST. Why? Because: no more standards, no more precise specifications. Just a vague “RESTful philosophy”, prone to endless metaphysical debates, and as many ugly workarounds.
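
To make the contrast concrete, here is a minimal sketch of the same operation exposed RPC-style and REST-style. The endpoint names and the in-memory store are invented for illustration, and it assumes Flask 2.x for the method-specific decorators:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
accounts = {"42": {"balance": 100}}  # toy in-memory store

@app.post("/rpc/close_account")        # RPC style: invoke a named procedure
def close_account():
    accounts.pop(request.json["id"], None)
    return jsonify(ok=True)

@app.delete("/accounts/<account_id>")  # REST style: act on a resource URI
def delete_account(account_id):
    accounts.pop(account_id, None)
    return "", 204
```

The RPC route names the verb itself, while the REST route leans on HTTP's own verbs and resource URIs; which of these is simpler to design and document is exactly what the debate is about.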


Where do blockchain opportunities lie? Top FinTech influencers weigh in

Any evolution in infrastructure must support the service and product expectations of the marketplace. Perhaps the most notable change in the financial services space would be the transition from corporate to individual data ownership and privacy. This in itself will fundamentally change the relationship between a bank and its customers, as well as industry revenue models. Beyond this, you’ll see intermediaries from our traditional financial services model get squeezed out as blockchain technologies reduce overall risk to any transaction. Additionally, blockchain technology’s standardization of information will enable broader adoption of adjacent new age technologies such as RPA and AI. These technologies leveraged together will move traditional financial services off of spreadsheets. This will require more training and retraining of personnel.


Strava’s privacy PR nightmare shows why you can’t trust social fitness apps

Strava needs its users to share their rides, runs, and swims. After all, the more activities they share—currently users post over 1.3 million activities per day—the more evidence Strava has to encourage others to keep using the app, and perhaps even trade up from the free version to an $8-per-month one. More shared data also means more to feed into Strava’s Metro business, which sells anonymized commuter data to cities. The company wasn’t profitable as of this past fall, but its CEO, James Quarles, clearly sees these two lines of business as the main paths to growth, assuming it gets more and more information from its users. And, frankly, using Strava in a very social way can be addicting. Since it began, in 2009, the company has perfected the art of fitness gamification and competitive sharing. Its app lets you see basic stats from your and your friends’ workouts; it encourages you to give each other kudos for completing activities


“Unlearn” to Unleash Your Data Lake

Sometimes it’s necessary to unlearn long-held beliefs (i.e. 2-point shooting in a predominantly isolation-offense game) in order to learn new, more powerful, game-changing beliefs (i.e. 3-point shooting in a rapid ball-movement offense). Sticking with our NBA example, Phil Jackson is considered one of the greatest NBA coaches, with 11 NBA World Championships coaching the Chicago Bulls and the Los Angeles Lakers. Phil Jackson mastered the “Triangle Offense” that played to the strengths of the then-dominant players Michael Jordan and Kobe Bryant to win those 11 titles. However, the game passed Phil Jackson by as the economics of the 3-point shot changed how to win. Jackson’s tried-and-true “Triangle Offense” failed with the New York Knicks, leading to the team’s dramatic under-performance and ultimately his firing. It serves as a stark reminder of how important it is to be ready to unlearn old skills in order to move forward.


Why a CHRO Will Be the Next Must-Have Role in the Boardroom


The primary job of any board of directors is to make sure the right leadership team is in place to drive the business, and the CEO is at the heart of that goal. A strong leadership bench is one with a succession plan in place, but this is a delicate topic. There are disclosure issues around such material information, of course, and some CEOs need encouragement to leave when the time is right – whether the change is contentious or not. Similarly, boards are often nervous about the timing of such shifts, particularly when they perceive a lack of a strong successor. Managing through these issues doesn’t come naturally to many board members, but it does for experienced CHROs. Such executives can offer insights on planned transitions and how to navigate the process, from identifying internal candidates to talking about development plans to introducing these topics to chief executives.


How IoT Affects the CISO's Job

"There are a lot of companies that are well positioned to handle IoT, but there are a lot that are so focused on just the day-to-day security work of keeping windows PCs and Linux servers secure, that they haven't gotten started at all," Pesatore says in an interview with Information Security Media Group. CISOs need to take steps to ensure they're involved in device acquisition decisions in all departments within the enterprise, he stresses. "Security and IT need to be involved in the decisions on building and buying these types of devices so we can make sure they are as secure and safe as possible," he says. And security staffs need to diversify their skills as a wider variety of devices are used in the enterprise, he adds. "When you look at the internet of things devices, it's a very heterogeneous world. There are all kinds of different operating systems and software and communications standards," he notes.



Quote for the day:

"The man who is afraid to risk failure seldom has to face success." -- John Wooden


Daily Tech Digest - February 17, 2018

The Three Do’s of DDoS protection

Attackers have been putting DDoS firmly in the IT and network consciousness – and they did it by substantially raising the bar for just how big and disruptive a DDoS attack can now be. ... DDoS attacks are not just growing in strength and frequency; they are also diversifying in whom they target and in kind, spanning application-layer as well as volumetric attacks. You no longer need to be a big organisation to be impacted by DDoS – everyone is now a target. And as more of us conduct our business on internet-based systems, the risk of costly disruption grows. Attacks are backed by significant malicious resources, and are most effectively countered by the service provider that connects you to the Internet. DDoS attacks can strike at any time, potentially crippling network infrastructure and severely degrading network performance and reachability. Depending upon the type and severity of an attack on a website or other IP-accessible system, the impact can result in thousands or even millions of dollars of lost revenue.



When Streams Fail: Implementing a Resilient Apache Kafka Cluster at Goldman Sachs


Gorshkov reminded the audience of latency numbers that every programmer should know, and stated that the speed of light dictates that a best-case network round trip from New York City to San Francisco takes ~60ms, Virginia to Ohio takes ~12ms, and New York City to New Jersey takes ~4ms. With data centers in the same metro area or otherwise close, multiple centers can effectively be treated as a single redundant data center for disaster recovery and business continuity. This is much the same approach as taken by modern cloud vendors like AWS, with infrastructure being divided into geographic regions, and regions being further divided into availability zones. Allowing multiple data centers to be treated as one leads to an Apache Kafka cluster deployment strategy as shown on the diagram below, with a single conceptual cluster that spans multiple physical data centers.
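
As a sketch of what the stretch-cluster idea implies in practice, the snippet below creates a topic whose three replicas can be spread across three metro-area data centers. It assumes the kafka-python package and rack-tagged brokers (broker.rack set per data center in each broker's server.properties); the hostname and topic name are placeholders:

```python
from kafka.admin import KafkaAdminClient, NewTopic

# Placeholder bootstrap host; any broker in the stretch cluster works.
admin = KafkaAdminClient(bootstrap_servers="dc1-broker:9092")

# Replication factor 3 asks for one copy of every partition per data
# center, provided brokers carry broker.rack=dc1|dc2|dc3 tags so Kafka's
# rack-aware placement spreads replicas across the centers.
admin.create_topics([
    NewTopic(name="orders", num_partitions=12, replication_factor=3)
])
```

With that layout, losing an entire data center still leaves every partition fully available, which is what lets nearby physical sites be treated as one redundant logical cluster.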


Can Cybersecurity be Entrusted with AI?

Will AI be the bright future of security, given that the sheer volume of threats is becoming very difficult for humans alone to track? Or could AI usher in a darker era? It all depends on natural intelligence, which is needed to develop AI and machine-learning tools. Despite popular belief, these technologies cannot replace humans (my own personal opinion); using them requires human training and oversight. As the results reveal, AI is here to stay, and it will have a large impact on security strategies moving forward, working side by side with natural intelligence. The current state of cybersecurity is highly vulnerable, but bringing AI systems into the mix can serve as a real turning point. These systems come with a number of substantial benefits that will help prepare cybersecurity professionals to take on cyber-attacks and safeguard the enterprise.


What’s Driving India’s Fintech Boom?

Industry analysts expect that payments will be a pathway to other areas such as lending, insurance, wealth management and banking. “Most people in India lack credit history. Digital payments give them a credit history which can be leveraged in other areas,” explains Prantik Ray, professor of finance at XLRI – Xavier School of Management. Ravi Bapna, professor of business analytics and information systems at the Carlson School of Management, University of Minnesota, adds: “Innovative data-driven and behavioral risk management models can overcome barriers that arise from lack of widespread and robust credit scoring of individuals.” Rajesh Kandaswamy, research director-banking and securities at Gartner, points out that in mature geographies, payment mechanisms are already evolved and basic banking services are a given. However, in countries like China and India, digital payments are evolving in tandem with the growth in ecommerce.


In a digital world, do you trust the data?

Trust is now a defining factor in an organization's success or failure. Indeed, trust underpins reputation, customer satisfaction, loyalty and other intangible assets. It inspires employees, enables global markets to function, reduces uncertainty and builds resilience. The problem is that, in today's environment, trust isn't just about the quality of an organization's brands, products, services and people. It's also about the trustworthiness of the data and analytics that are powering its technology. KPMG International's Guardians of trust report explores the evolving nature of trust in the digital world. Based on a survey of almost 2,200 global information technology (IT) and business decision-makers involved in strategy for data initiatives, this report identifies some of the key trends and emerging principles to support the development of trusted analytics in the digital age.


The Great Disruption of Your Career

Seriously; even coffee shops are now using affordable facial recognition technology with basic CRM to create an amazing experience for customers... "Hi Tony, your triple-shot decaf, skim, soy latte is on its way... did you manage to go water-skiing on the weekend?" Perfect... I'll be able to keep my head down deleting spammy emails while rocking away to Spotify... no need to place an order in advance or make eye contact or interact with anyone while securing my morning caffeine fix :-) White collar professions are not immune to the employment apocalypse. Combinations of technology with offshoring to lower-cost markets are already biting like a savage dog at your crotch. Do you lie awake at night wondering how you can make yourself indispensable? What do you really do that cannot be automated?


Designing, Implementing, and Using Reactive APIs


Reactive programming is a vast subject and is well beyond the scope of this article, but for our purposes, let’s define it broadly as a way to define event driven systems in a more fluent way than we would with a traditional imperative programming style. The goal is to move imperative logic to an asynchronous, non-blocking, functional style that is easier to understand and reason about.  Many of the imperative APIs designed for these behaviors (threads, NIO callbacks, etc.) are not considered easy to use correctly and reliably, and in many cases using these APIs still requires a fair amount of explicit management in application code. The promise of a reactive framework is that these concerns can be handled behind the scenes, allowing the developer to write code that focuses primarily on application functionality. The very first question to ask yourself when designing a reactive API is whether you even want a reactive API! Reactive APIs are not the correct choice for absolutely everything.
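
As a loose illustration of that imperative-to-declarative shift, here is a sketch in Python's asyncio (rather than the JVM reactive frameworks the article has in mind) that composes three non-blocking operations without any explicit thread or callback management:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for non-blocking I/O such as an HTTP call or DB query.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main():
    # Declare what should run; the event loop schedules it. No thread
    # pools, locks, or hand-written callbacks to manage.
    results = await asyncio.gather(
        fetch("quotes", 0.2), fetch("orders", 0.1), fetch("risk", 0.3)
    )
    print(results)

asyncio.run(main())
```

The imperative equivalent would either block a thread per call or juggle completion callbacks by hand, which is exactly the bookkeeping a reactive style pushes behind the scenes.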


Wireless Reshaping IT/OT Network Best Practices

IoT, with its accompanying cloud services and Big Data analytics, routinely delivers immense and unheard-of amounts of data from devices and sensors. That means network architectures continue to adapt and will change dramatically to handle the data flow from these sensors. It also means networks will become outward-focused, as the amount of data acquired from edge devices dwarfs the amount of data produced inside the network. Previously, network architecture for wireless used a design that had a wireless access point directly connected to wired Ethernet. Network backhauls were always wired. More recently, however, companies with sprawling multi-building campuses, manufacturing sites, or process plants have been using wireless backhauls. Some of these use WiMAX (IEEE 802.16) as broadband microwave links. Others are designed as optical. These wireless backhauls are significantly less expensive to install, and provide secure data transmission.


GDPR: The Data Subject Perspective

The discussion that followed highlighted a key point: the value of the data means that stakes are high. Organizations are understanding how much value can be driven by intelligent use of data. My opinion is that many individuals have sold themselves short in negotiations around use of personal data. This is because individual data subjects have had limited knowledge, power or influence at a negotiating table that doesn’t really exist – unlike the agreement process for other contracts in which both parties are normally well informed. GDPR implication: The key is intelligent use of data. Personal data which is not managed correctly will have less impact on an organization’s bottom line, and will become a burden under GDPR. Organizations should review their data collection mechanisms and consider data minimisation, and data masking technology to implement privacy by default and design.


A business guide to raising artificial intelligence in a digital economy

The report highlights a need for a fundamental shift in leadership that is required to cultivate partnerships with customers and business partners, and to further accelerate the adoption of artificial intelligence as the fuel for enterprises to grow and deliver social impact. Accenture's 2018 report ...  highlights how rapid advancements in technologies -- including artificial intelligence (AI), advanced analytics and the cloud -- are enabling companies to not just create innovative products and services, but change the way people work and live. This, in turn, is changing companies' relationships with their customers and business partners. "Technology," said Paul Daugherty, Accenture's chief technology and innovation officer, "is now firmly embedded throughout our everyday lives and is reshaping large parts of society. This requires a new type of relationship, built on trust and the sharing of large amounts of personal information."



Quote for the day:


"A wise man gets more use from his enemies than a fool from his friends." -- Baltasar Gracian


Daily Tech Digest - February 16, 2018

5 early warning signs of project failure
One of the first (and biggest) warning signs that your project may be headed for failure is an internal culture that is resistant to change. Projects bring about improvements in workflows and new operational best practices, often with an increased use of technology. These changes can create a significant amount of fear, as employees assume the end result will mean job losses or major disruption to their individual working world. Many projects have been internally sabotaged right from the start as result of these fears. How can you tell if you have a culture that is resistant to change? Employees who are resistant to change are often reluctant to share information and exhibit negative attitudes towards the project and its benefits, either through direct communication or body language and facial expressions. Alleviating these fears by creating a culture that embraces change is key.



A quick-and-dirty way to predict human behavior

Machine learning and AI technologies are everywhere. One of their top uses is to predict human behavior. Luckily, people are creatures of habit. Moreover, when given the freedom to do anything they want, most people will do what everyone else is doing (I'm paraphrasing a badly remembered quote). That makes it kind of easy to predict what people will do next, at least statistically. Imagine you go to a website and start rating things. First you rate a cat picture, then a baseball, and then a Magpul FMG-9. There were also a few things you didn't rate on the same page. Assuming that someone else made rankings similar to yours, we can probably "guess" how you'd rank the other things. ... The algorithm that many recommendations are based on is called Alternating Least Squares (or some form of it). With ALS, you use a training set or, if you have a lot of users, you can use some of them as the training set to rate the others.
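
Here is a toy ALS sketch, my own minimal version with made-up ratings (production systems use sparse, distributed implementations such as Spark MLlib's): it alternates closed-form least-squares solves for user and item factors, then predicts the unrated cells.

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[5, 4, 0], [4, 0, 1], [1, 1, 5]], dtype=float)  # 0 = unrated
mask = R > 0
k, lam = 2, 0.1                        # latent factors, regularization
U = rng.normal(size=(R.shape[0], k))   # user factors
V = rng.normal(size=(R.shape[1], k))   # item factors

for _ in range(20):
    for i in range(R.shape[0]):        # fix V, solve each user's factors
        Vi = V[mask[i]]
        U[i] = np.linalg.solve(Vi.T @ Vi + lam * np.eye(k),
                               Vi.T @ R[i, mask[i]])
    for j in range(R.shape[1]):        # fix U, solve each item's factors
        Uj = U[mask[:, j]]
        V[j] = np.linalg.solve(Uj.T @ Uj + lam * np.eye(k),
                               Uj.T @ R[mask[:, j], j])

print(np.round(U @ V.T, 1))            # predictions, including unrated cells
```

Each half-step is an ordinary regularized least-squares problem with a closed-form solution, which is why alternating the two converges quickly and parallelizes well across users and items.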


HP expands its Device-as-a-Service offering to include Apple

Through its DaaS offering, HP determines the contractual relationship enterprises want to have, whether it's with a value-added reseller, a global systems integrator or a direct relationship with HP, "and then we provide it back to you within a utility model or a per-user, per-device pricing model," said Jonathan Nikols, global head of HP's Device-as-a-Service. For example, the cost of a contract would include an SLA on how fast the turnaround time on a device repair and replacement should be – whether it's next day or in four days. When an end user's device breaks or needs replacing, they file a help-desk ticket just as they would with any IT shop; the ticket is automatically routed to the HP DaaS service. The service also handles employee on-boarding and off-boarding. Mixed-device environments are the norm now, HP said, making it increasingly difficult and costly for organizations to manage multiple device types, OSes and vendors.


Who should buy a Ryzen APU, and who shouldn't

If you're asking yourself, "Should I buy a Ryzen APU?" for a new budget gaming PC, the short answer is yes, probably. That's because for building a ground-up, entry-level gaming machine, the Ryzen APU is the best game in town, and possibly the only game for DIY builders, in the face of wallet-busting GPU prices. But for everyone? Well, no. There is no one-size-fits-all answer, so read on to learn who should buy the Ryzen APU—and who shouldn't. ... AMD's new APUs have essentially enough CPU and GPU power to enable satisfying gaming at 720p to 1080p. Both APUs combine quad-core Zen x86 cores with up to 11 Vega graphics cores, and the Ryzen 5 2400G also has SMT. The integrated graphics offers roughly double to triple the gaming performance of Intel's HD 630 graphics, which is inside everything from an $85 Pentium to a $380 Core i7.


How your company can prevent a data breach – and what to do if one occurs
As any executive whose company has suffered a data breach knows, the true costs of cybercrime are devastating, far-reaching and continue long after business functions have been restored. Between investigation and repair costs, customer notification requirements, contractual liabilities and workflow continuity, worldwide spending to mitigate the impact of cyberattacks is projected to reach an unprecedented $90 billion this year. Then there are the indirect costs, which include legal fees and public reputation rebuilding. This last component is particularly crucial, since a recent Gemalto survey revealed that 70 percent of consumers said they would cut ties with a company that had suffered a cyberattack. Indeed, businesses are anticipated to bear the brunt of cybercrime's growing financial burden. Over half of last year's cyberattacks targeted corporations; and among all small businesses, 58 percent have been hit by a data breach.


Juniper Networks Expands Portfolio for Secure Multicloud Computing


“The promise of multicloud is to deliver an infrastructure that is secure, ubiquitous, reliable and fungible and where the migration of workloads will be a simple and intuitive process,” said Bikash Koley, chief technology officer at Juniper Networks. “For IT to be successful in becoming multicloud-ready, it is critical organizations consider not only the data center and public cloud, but also the on-ramps of their campus and branch networks. Otherwise, enterprises will face fractured security and operations as network boundaries prevent seamless, end-to-end visibility and control.” A Juniper-commissioned study by PwC found that workload migration will be underway over the next three years across every core functional area, such as customer service, systems management, marketing, compute bursting, business applications, DevOps, and backup and recovery.


Bitcoin thieves use Google AdWords to target victims

The fraudsters established "gateway" phishing links that appeared in search results when potential victims searched Google for cryptocurrency-related keywords, such as "blockchain" or "bitcoin wallet." These links, bolstered by the purchase of Google AdWords, would then send victims to malicious domains, which would serve phishing content depending on the IP address and likely language of the visitor. According to the team, the hackers are focusing on countries where access to traditional banking may be difficult, such as Estonia, Nigeria, Ghana, and a number of other African countries. Where access to banking is difficult, cryptocurrency, as a decentralized asset recorded on the blockchain, may empower users financially. However, it seems that the cybercriminals behind the campaign also know there may be more interest from residents of these countries, and this has shaped the focus of their phishing campaigns.


Google's Android P will make it easier for OEMs to copy iPhone X

Google's intent to make Assistant more visible in Android comes as the personal digital assistant market is becoming too crowded, forcing potential competitors out. While Amazon's Alexa and Microsoft's Cortana are available as downloadable apps, Samsung's built-in and much-maligned Bixby assistant continues to linger, though The Verge has called for Bixby's death. Cortana was conspicuously absent at CES 2018, leading ZDNet's Larry Dignan to declare the trade show "Cortana's Funeral." Facebook announced the discontinuation of its virtual assistant, M, in January. As for screen cutouts, it remains to be seen whether this design trend will persist or wind up as a fad similar to 3D phones. While Apple's use of the technology is notable, it seems unlikely that manufacturers are holding back on shipping phones for lack of OS-level support. Of note, Sharp also produced a handful of 3D smartphones, available primarily in Japan.


Nokia is re-evaluating its wearables division


Once a global leader in mobile, the company failed to embrace the smartphone revolution, selling to Microsoft, which then shuttered the whole thing entirely. Of course, the Nokia name is back in the smartphone space, but that comes under a licensing deal through HMD — a company founded by former execs from the company. Interestingly, recent numbers show that the brand has actually been doing pretty well. Wearables, on the other hand, have stagnated, forcing brands to exit the space, sell or shutter entirely. The herd has thinned over the past year, and even top names like Fitbit have struggled to keep their heads above water. For Nokia, acquiring a company like Withings no doubt seemed like a quick way to hit the ground running — but the timing was rough on this one. Hopefully this doesn't mark the end of the Withings/Nokia Health line, which made some really solid and innovative devices.


Cloud sync vs backup: Which disaster recovery works better for business continuity?

Backup is the traditional way most businesses protect their digital assets from disaster. At regular intervals, changes in local storage are transferred to either a local backup device or a cloud backup service. Usually, these changes are incremental and go into backup archives. A good backup service will store ongoing snapshots, so it's always possible to go back in time and recover an old document. The gotcha with backup systems is that recovery is often cumbersome. You usually have to launch a backup program on your PC, dig through the various backup instances, and initiate a restore. In most cases, you can't really use or read the files in the backups until they're restored to your computer. Cloud sync, by contrast, takes files that exist on your local computer and moves them into a cloud infrastructure. Most cloud infrastructures encourage you to work on the files in the cloud.
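
To illustrate the incremental-snapshot idea the article describes, here is a hedged sketch. The paths and layout are invented, and real products add compression, retention policies and deduplication; it simply copies changed files into a timestamped snapshot and keeps a manifest for point-in-time recovery:

```python
import hashlib, json, pathlib, shutil, time

SRC, DEST = pathlib.Path("docs"), pathlib.Path("backups")  # illustrative paths

def file_hash(p: pathlib.Path) -> str:
    return hashlib.sha256(p.read_bytes()).hexdigest()

def snapshot(prev_manifest: dict) -> dict:
    """Copy only files whose content changed since the previous snapshot."""
    snap_dir = DEST / time.strftime("%Y%m%d-%H%M%S")
    snap_dir.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for p in SRC.rglob("*"):
        if not p.is_file():
            continue
        digest = file_hash(p)
        manifest[str(p)] = digest
        if prev_manifest.get(str(p)) != digest:      # new or modified file
            target = snap_dir / p.relative_to(SRC)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(p, target)
    # The manifest records every file's hash, so an older version can be
    # found in the snapshot where it last changed -- the "go back in time"
    # recovery step the article mentions.
    (snap_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return manifest
```

The restore step, walking manifests backwards to locate a file's last-changed snapshot, is exactly the cumbersome part the article contrasts with sync services that keep files directly usable in the cloud.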



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman