Daily Tech Digest - February 23, 2018

Cisco automation tools make it easier for network admins

Cisco has a new automation software portfolio that helps global service providers manage massive amounts of network data and better prepare for impending security threats. "We built out an entirely new portfolio of automation tools. It really centers on the fact that our customers have a whole set of challenges. They're currently spending on average somewhere between 3-4 times the amount to operate an infrastructure than they are just to purchase the infrastructure," said Jonathan Davidson, senior vice president and general manager of Cisco Service Provider Networking. In 2016, there were 17 billion devices and connections running on service provider networks and this is forecast to grow to 27 billion by 2021. To address this shift, the Cisco Crosswork Network Automation portfolio will assist industry adoption of complete lifecycle network automation and intent-based networking to help networks predict change and react in near real time.



Leveraging Security to Enable Your Business

The first step is to look into more modern technologies, such as a reverse proxy, which can overcome the cumbersome nature of multiple VPNs and ensure quick, seamless, and secure access from anywhere, on any device. With this approach, there is no need to repeatedly require MFA once a user has "passed the test" of proving who they are. Businesses can also leverage adaptive authentication technology, which automatically adjusts authentication requirements relative to the risk of the request. For example, an initial login may require MFA, but subsequent logins by the same user, from the same device, in the same day would not. If, however, the request suddenly comes from an unknown device, there could be something fishy going on. With adaptive authentication, the rules for an MFA requirement for specific risky login instances can be preset and automatically enforced.
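To make the mechanism concrete, here is a minimal Python sketch of how such preset adaptive-authentication rules might look. The record fields, known-device store, and rules are all hypothetical, not taken from any particular vendor's product.

```python
from dataclasses import dataclass

@dataclass
class LoginRequest:
    user_id: str
    device_id: str
    ip_address: str

# Devices and networks each user has previously authenticated from.
known_devices = {("alice", "laptop-7f3a")}
known_networks = {("alice", "203.0.113.7")}

def requires_mfa(req: LoginRequest, verified_today: bool) -> bool:
    """Apply preset rules: unfamiliar device/network, or first login of the day."""
    unknown_device = (req.user_id, req.device_id) not in known_devices
    unknown_network = (req.user_id, req.ip_address) not in known_networks
    return (not verified_today) or unknown_device or unknown_network

print(requires_mfa(LoginRequest("alice", "laptop-7f3a", "203.0.113.7"), True))  # False: same user, device, day
print(requires_mfa(LoginRequest("alice", "tablet-9c1d", "203.0.113.7"), True))  # True: unknown device
```

A production system would score many more signals (geolocation, time of day, behavioral patterns) rather than applying hard yes/no rules, but the principle is the same: risk decides when MFA fires.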


AI for good: Can AI be trusted - and is it too late to ask?

The answer seems to point towards human input: in the words of AI researcher Professor Joanna Bryson, “if the underlying data reflects stereotypes, or if you train AI from human culture, you will find bias.” And if we’re not careful, we risk integrating that bias into the computer programs that are fast taking over the running of everything from hospitals to schools to prisons – programs that are supposed to eliminate those biases in the first place. Nigel Willson, global strategist at Microsoft, points out the importance of recognising that no technology is ever black and white. “The reality is that AI is like anything else – it can be very dangerous, or it can be amazing, based on how it’s used or misused,” he says. AI is only as accurate as the information on which it is trained – meaning that we must be very careful with how we train it. Awareness of ‘unfair’ bias integrated into decades of data has led researchers to attempt the design of algorithms that counteract that bias when scraping the data: but this raises the question of what constitutes ‘fairness’.


Telecom Opportunities: How to Monetize IoT

When League of Legends, one of the most popular online video games, ran into problems with lag, its developer created its own internet to let players connect to the game. Riot Games built a network of routers, data centers and peer ISPs that placed latency before costs. Players from any part of the country would be directly connected to Riot’s access servers rather than routers on the regular ISP network. With 5G, telcos can offer new levels of latency, but there is more than just network connectivity that they can offer to gaming companies. One example Ericsson showed me during a recent visit to Kista, Sweden, was an interface that allowed the gamer to manage their account from inside the game; for example, they could top up their data allowance without having to exit the game.


“There is also growing use of managed security services to complement their on-site capability and provide secure file transfers and software updates, as well as continuous monitoring,” he said. However, he said that although there is a high level of awareness of the need for good cyber security in industrial operations, in many cases cyber security fundamentals are not yet in place. A recent Honeywell-sponsored survey by LNS Research of 130 decision makers from industrial companies revealed that only 37% were monitoring their plant systems for suspicious behaviour and 20% were not conducting regular risk assessments. “The survey also found that 53% said they had already experienced a cyber security breach, but that is not surprising, given how young we are globally in cyber protection for critical infrastructure and industrial cyber security,” said Zindel.


Big Data Isn’t a Thing; Big Data is a State of Mind


Big Data is about exploiting the unique characteristics of data and analytics as digital assets to create new sources of economic value for the organization. Most assets exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar as an asset is finite – it can only be used to buy one item or service at a time. Same with human assets, as a person can only do one job at a time. But measuring the value of data as an asset is not constrained by those transactional limitations. In fact, data is an unusual asset as it exhibits an Economic Multiplier Effect, whereby it never depletes or wears out and can be used simultaneously across multiple use cases at near-zero marginal cost. This makes data a powerful asset in which to invest. Understanding the economic characteristics of data and analytics as digital assets is the first step in monetizing your data via predictive, prescriptive and preventative analytics.


How long does it take to detect a cyber attack?

The study found that US companies took an average of 206 days to detect a data breach. This is a slight increase on the previous year (201 days). Ponemon suggests all organizations should aim to identify a breach within 100 days. The average cost of identifying a breach within this time was $5.99 million, but for breaches that took longer to identify, the average cost rose to $8.70 million. There is a similar correlation in terms of containing a breach. Breaches that took less than 30 days to contain had an average cost of $5.87 million, but this rose to $8.83 million for breaches that took longer to contain. The good news is that organizations have become significantly better at containing breaches, with the average time dropping from 70 days in 2016 to 55 days. The majority of breached organizations are notified by someone other than their own staff, according to Mandiant’s M-Trends 2017 report. It found that 53% of breaches were discovered by an external source.


Hackers are selling legitimate code-signing certificates to evade malware detection


Code-signing certificates are designed to give your desktop or mobile app a level of assurance by making apps look authentic. Whenever you open a code-signed app, it tells you who the developer is and provides assurance that the app hasn't been tampered with in some way. Most modern operating systems, including macOS, only run code-signed apps by default. But not only does code-signing have an effect on users who inadvertently install malware; code-signed apps are also harder for network security appliances to detect. The research said that hardware using deep packet inspection to scan network traffic "become[s] less effective when legitimate certificate traffic is initiated by a malicious implant." That's been picked up by some hackers, who are selling code-signing certificates for as little as $299. Extended validation certificates, which are meant to go through a rigorous vetting process, can sell for $1,599.
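The integrity half of that claim is ordinary signature verification, illustrated in the sketch below with the Python `cryptography` package, assuming an RSA key. Note that the attack the researchers describe does not break this check: a binary signed with a purchased legitimate certificate passes it cleanly, which is exactly why these certificates command such prices. Certificate-chain and revocation checking are omitted for brevity.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def binary_is_untampered(binary: bytes, signature: bytes, public_key_pem: bytes) -> bool:
    """Check that `binary` was signed by the key behind the certificate."""
    public_key = serialization.load_pem_public_key(public_key_pem)
    try:
        public_key.verify(signature, binary, padding.PKCS1v15(), hashes.SHA256())
        return True   # Signature matches: the OS treats the app as authentic.
    except InvalidSignature:
        return False  # Tampered binary or wrong key: the app is rejected.
```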


Machine-learning promises to shake up large swathes of finance


Natural-language processing, where AI-based systems are unleashed on text, is starting to have a big impact in document-heavy parts of finance. In June 2016 JPMorgan Chase deployed software that can sift through 12,000 commercial-loan contracts in seconds, compared with the 360,000 hours it used to take lawyers and loan officers to review the contracts. Machine-learning is also good at automating financial decisions, whether assessing creditworthiness or eligibility for an insurance policy. Zest Finance has been in the business of automated credit-scoring since its founding in 2009. Earlier this year it rolled out a machine-learning underwriting tool to help lenders make credit decisions, even for people with little conventional credit-scoring information. It sifts through vast amounts of data, such as people’s payment history or how they interact with a lender’s website.
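As a rough illustration of that kind of automated underwriting, the sketch below trains a gradient-boosted classifier with scikit-learn on invented data; it is not ZestFinance's model, and the feature names are placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
# Invented features: payment-history score, website-interaction signal, income band.
X = rng.random((500, 3))
# Invented labels: 1 means the loan was repaid.
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.random(500) > 0.5).astype(int)

model = GradientBoostingClassifier().fit(X, y)
applicant = [[0.9, 0.4, 0.7]]
print("probability of repayment:", model.predict_proba(applicant)[0, 1])
```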


The emerging link between employee well-being and cyber security services

This epidemic means big problems for employees and employers alike — and a significant opportunity for brokers who can provide solutions that protect employees’ financial well-being. When identity thieves take advantage of employees’ stolen personal information to obtain credit or loans, or commit various types of fraud, both employees and employers pay a steep price. ...  In other words, the identity theft resolution process is not only stressful for employees, it also has a significant impact on their productivity at work. That's because, without the assistance of an identity theft resolution resource, employees have to do a lot of legwork, such as filing police reports, writing letters and making trips to financial institutions to report fraud.



Quote for the day:


"You never really learn much from hearing yourself speak." -- George Clooney


Daily Tech Digest - February 22, 2018

Organizations are investing more money in their analytics programs. These programs do more now than recommending a new blouse or what to watch next on Netflix. If you are SpaceX and your data is incorrect, it could result in the loss of a multi-million-dollar rocket, Biltz said. That's a big deal. The Accenture report, culled from survey responses of more than 6,300 business and IT executives worldwide, found that 82% of those executives are using data to drive critical and automated decisions. What's more, 97% of business decisions are made using data that managers consider to be of unacceptable quality, Accenture notes, citing a study published in HBR. "Now it becomes vitally important that the data you have is as true, as correct, as you can make it," Biltz said. "Right now, organizations don't have the systems in place to do that." Plus, there's just more data now, coming from a variety of different sources, than there ever has been in the past.
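A first step toward such systems can be as modest as automated data-quality assertions. The sketch below, using pandas with hypothetical column names, shows the flavor of check Biltz is calling for.

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Collect simple trust signals about a table of orders."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_values": int(df.isna().sum().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }

orders = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, None]})
print(quality_report(orders))
# {'rows': 3, 'duplicate_rows': 0, 'missing_values': 1, 'negative_amounts': 1}
```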


9 ways to overcome employee resistance to digital transformation

While it's easy to assume technology changes would cause the most issues in the transformation process, tech isn't actually the root of the problem, said R/GA Austin's senior technology director Katrina Bekessy. "Rather, it's usually organizing the people and processes around the new tech that's difficult," Bekessy said. "It's hard to change the way people work, and realign them to new roles and responsibilities. In short, digital transformation is not only a transformation of tech, but it also must be a transformation in a team's (or entire company's) culture and priorities." Inertia and ignorance are two key parts of employee resistance to transformation, according to Michael Dortch, principal analyst and managing editor at DortchOnIT.com. "Inertia results in the 'but we've always done it this way' response to any proposed change in operations, process, or technology, while ignorance limits the ability of constituents to see the necessity and benefits of digital transformation," Dortch said.


8 Machine Learning Algorithms explained in Human language

What we call “Machine Learning” is none other than the meeting of statistics and the incredible computation power available today (in terms of memory, CPUs, GPUs). This domain has become increasingly visible and important because of the digital revolution of companies, which has led to the production of massive data of different forms and types at ever increasing rates: Big Data. On a purely mathematical level, most of the algorithms used today are already several decades old. ... You are looking for a good travel destination for your next vacation. You ask your best friend for his opinion. He asks you questions about your previous trips and makes a recommendation. You then decide to ask a group of friends, who ask you questions randomly. They each make a recommendation. The chosen destination is the one that has been most recommended by your friends. Both your best friend and the group will recommend good destinations. But while the first method can work very well for you personally, the second will be more reliable across many different people.
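That analogy reads like the classic decision-tree versus random-forest contrast, made concrete in the scikit-learn sketch below on synthetic data: the lone tree is the single friend, the forest is the voting group.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=8, random_state=0)

tree = DecisionTreeClassifier(random_state=0)                      # one friend's opinion
forest = RandomForestClassifier(n_estimators=100, random_state=0)  # the group's vote

print("single tree:", cross_val_score(tree, X, y).mean())
print("forest:     ", cross_val_score(forest, X, y).mean())
```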


3 Things You Need to Know (and Do) Before Adopting AI

AI enables machines to learn and act, either in place of humans or to supplement the work of humans. We’re already seeing widespread use of AI in our daily lives, such as when brands like Netflix and Amazon present us with options based on our buying behaviors, or when chat bots respond to our queries. AI is used to pilot airplanes and even streamline our traffic lights. And, that’s just the beginning as we enter the age of AI and machine learning, with these technologies replacing traditional manufacturing as drivers of economic growth. A McKinsey Global Institute study found that technology giants Baidu and Google spent up to $30 billion on AI in 2016, with 90 percent of those funds going to research, development and deployment, and 10 percent to AI acquisitions. In 2018, AI adoption is expected to jump from 13 percent to 30 percent, according to Spiceworks' 2018 State of IT report.


Is the IoT backlash finally here?

As pretty much everyone knows, the Internet of Things (IoT) hype has been going strong for a few years now. I’ve done my part, no doubt, covering the technology extensively for the past 9 months. As vendors and users all scramble to cash in, it often seems like nothing can stop the rise of IoT. Maybe not, but there have been rumblings of a backlash against the rise of IoT for several years. Consumers and experts worry that the IoT may not easily fulfill its heavily hyped promise, or that it will turn out to be more cumbersome than anticipated, open up serious security issues, and compromise our privacy. Others fear the technology may succeed too well, eliminating jobs and removing human decision-making from many processes in unexamined and potentially damaging ways. As New York magazine put it early last year, “We’re building a world-size robot, and we don’t even realize it.” Worse, this IoT robot “can only be managed responsibly if we start making real choices about the interconnected world we live in.”


Intel expects PCs with fast 5G wireless to ship in late 2019

Intel will show off a prototype of the new 5G connected PC at the Mobile World Congress show in Barcelona. In addition, the company will demonstrate data streaming over the 5G network. At its stand, Intel said it will also show off eSIM technology—the replacement for actual, physical SIM cards—and a thin PC running 802.11ax Wi-Fi, the next-gen Wi-Fi standard. Though 5G technology is the mobile industry’s El Dorado, it always seems to be just over the next hill. Intel has promoted 5G for several years, saying it will handle everything from a communications backbone for intelligent cars to swarms of autonomous drones talking amongst themselves. Carriers, though, have started nailing down when and where customers will be able to access 5G technology. AT&T said Wednesday, for example, that a dozen cities, including Dallas and Waco, Texas, and Atlanta, Georgia, will receive their first 5G deployments by year’s end. Verizon has plans for three to five markets, including Sacramento, California.


Who's talking? Conversational agent vs. chatbot vs. virtual assistant


A conversational agent is more focused on what it takes in order to maintain a conversation. With virtual agents or personal assistants, those terms tend to be more relevant in cases where you're trying to create this sense that the conversational agent you're dealing with has its own personality and is somehow uniquely associated with you. At least for me, the term virtual assistant sort of metaphorically conjures the idea of your own personal butler -- someone who is there with you all the time, knows you deeply but is dedicated to just you and serving your needs. ... I think there becomes an intersection between the two ideas. For it to serve you on a personal level, any kind of good personal assistant or virtual assistant needs to retain a great deal of context about you but then use that context as a way of interacting with you -- to use the conversational agent technique for not just anticipating your need but responding to your need and getting to know you better to be able to respond to that need better in the future.


Why the GDPR could speed up DevOps adoption

One of the key trends happening now, especially with changing demographics and changes in technology, is that most people are interacting with businesses digitally, via their phones, via their computers and so on. For a lot of businesses, whether it's retail or banking or insurance or what have you, the face of the business has started to become digital, and where they're not becoming digital, new companies are springing up that are disrupting those businesses. The single biggest thing about DevOps, the whole movement, is agility: the ability to bring applications to market quicker, so this new demographic that's interacting with all the businesses digitally can consume and interact with these businesses in the ways they're used to interacting with everything else, and so these businesses can protect themselves against disruption from other people.


Cisco Report Finds Organizations Relying on Automated Cyber-Security

Among the high-level findings in the 68-page report is that 39 percent of organizations stated they rely on automation for their cyber-security efforts. Additionally, according to Cisco's analysis of over 400,000 malicious binary files, approximately 70 percent made use of some form of encryption. Cisco also found that attackers are increasingly evading defender sandboxes with sophisticated techniques. "I'm not surprised attackers are going after supply chain, using cryptography and evading sandboxed environments, we've seen all these things coming for a long time," Martin Roesch, Chief Architect in the Security Business Group at Cisco, told eWEEK. "I've been doing this for so long, it's pretty hard for me to be surprised at this point." Roesch did note, however, that he was pleasantly surprised that so many organizations are now relying on automation, as well as machine learning and artificial intelligence, for their cyber-security operations.


Artificial general intelligence (AGI): The steps to true AI

AI lets a relatively dumb computer do what a person would do using a large amount of data. Tasks like classification, clustering, and recommendations are done algorithmically. No one paying close attention should be fooled into thinking that AI is more than a bit of math. AGI is where the computer can “generally” perform any intellectual task a person can and even communicate in natural language the way a person can. This idea isn’t new. While the term “AGI” harkens back to 1987, the original vision for AI was basically what is now AGI. Early researchers thought that AGI (then AI) was closer to becoming reality than it actually was. In the 1960s, they thought it was 20 years away. So Arthur C. Clarke was being conservative with the timeline for 2001: A Space Odyssey. A key problem was that those early researchers started at the top and went down. That isn’t actually how our brain works, and it isn’t the methodology that will teach a computer how to “think.” In essence, if you start with implementing reason and work your way down to instinct, you don’t get a “mind.”



Quote for the day:


"A man's character may be learned from the adjectives which he habitually uses in conversation." -- Mark Twain


Daily Tech Digest - February 21, 2018

The New Era Of Artificial Intelligence


AI will soon become commoditized and democratized, just as electricity was in its time. Today we use computers, smartphones, other connected devices, and, mostly, apps. Whilst access to internet technologies has constantly improved over the past decades, very few people are able to program these and generate income by intelligently exploiting consumer data, which, in theory, is not theirs. GAFA (Google, Amazon, Facebook and Apple) and the Chinese BAT (Baidu, Alibaba and Tencent) are among the most prominent players in these fields. Tomorrow’s world would be different with the emergence of relatively simple, portable AI devices, which might not necessarily be connected to each other by the internet, but would feature completely new protocols and peer-to-peer technologies. This will significantly re-empower consumers. Because it is decentralized, portable AI will be available for the masses within a decade or so. Its use will be intuitive, just as driving a car is today. Portable AI will also be less expensive than motorized vehicles.


What is DevSecOps and Vulnerabilities?

The principles of security and communication should be introduced at every step of the way when building applications. The philosophy of DevSecOps was created by security practitioners who seek to “work and contribute value with less friction”. These practitioners run a web site that details an approach to improving security, explaining that “the goal of DevSecOps is to bring individuals of all capabilities to a high level of security efficiency in a short period of time. Security is everyone's responsibility.” The DevSecOps manifesto includes principles such as building a lower-access platform, focusing on science, avoiding fear, uncertainty and doubt, collaboration, continuous security monitoring and cutting-edge intelligence. The DevSecOps community promotes action directed at detecting potential issues and exploiting weaknesses. In other words, think like an adversary and perform similar tactics, such as attempting penetration to identify gaps that could be exploited and need to be addressed.


7 essential technologies for a modern data architecture

At the center of this digital transformation is data, which has become the most valuable currency in business. Organizations have long been hamstrung in their use of data by incompatible formats, limitations of traditional databases, and the inability to flexibly combine data from multiple sources. New technologies promise to change all that. Improving the deployment model of software is one major facet to removing barriers to data usage. Greater “data agility” also requires more flexible databases and more scalable real-time streaming platforms. In fact no fewer than seven foundational technologies are combining to deliver a flexible, real-time “data fabric” to the enterprise. Unlike the technologies they are replacing, these seven software innovations are able to scale to meet the needs of both many users and many use cases. For businesses, they have the power to enable faster and more intelligent decisions and to create better customer experiences.


Tesla cloud systems exploited by hackers to mine cryptocurrency

Researchers from the RedLock Cloud Security Intelligence (CSI) team discovered that cryptocurrency mining scripts, used for cryptojacking -- the unauthorized use of computing power to mine cryptocurrency -- were operating on Tesla's unsecured Kubernetes instances, which allowed the attackers to steal the Tesla AWS compute resources to line their own pockets. Tesla's AWS system also contained sensitive data including vehicle telemetry, which was exposed due to the unsecured credentials theft. "In Tesla's case, the cyber thieves gained access to Tesla's Kubernetes administrative console, which exposed access credentials to Tesla's AWS environment," RedLock says. "Those credentials provided unfettered access to non-public Tesla information stored in Amazon Simple Storage Service (S3) buckets." The unknown hackers also employed a number of techniques to avoid detection in their scheme, such as avoiding the typical public mining pools.


Micron sets its sights on quad-cell storage

Single-level cell (SLC) flash, with one bit per cell, first emerged in the late 1980s, when flash drives first appeared for mainframes. In the late 1990s came multi-level cell (MLC) drives capable of storing two bits per cell. Triple-level cell (TLC) didn't come out until 2013, when Samsung introduced its 840 series of SSDs. So these advances take a long time, although they are being sped up by a massive increase in R&D dollars in recent years. Multi-bit flash memory chips store data by managing the number of electronic charges in each individual cell. With each added bit per cell, the number of voltage states doubles. SLC NAND tracks only two voltage states, while MLC has four voltage states, TLC has eight, and QLC has 16. This translates to much lower tolerance for voltage fluctuations. As density goes up, the computer housing the SSD must be rock-stable electrically, because without that stability you risk damaging cells. This means supporting electronics around the SSD to protect it from fluctuations.
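The doubling is just powers of two, as this small calculation shows:

```python
# States per cell double with each added bit: 2 ** bits.
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    print(f"{name}: {bits} bit(s) per cell -> {2 ** bits} voltage states")
```

Sixteen states must fit inside the same physical voltage range that SLC splits only in two, which is why QLC's margin for electrical noise is so much thinner.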



When it comes to cyber risk, execute or be executed!

Accountability must be clearly defined, especially in strategies, plans and procedures. Leaders at all levels need to maintain vigilance and hold themselves and their charges accountable to execute established best practices and other due care and due diligence mechanisms. Organizations should include independent third-party auditing and pen-testing to better understand their risk exposure and compliance posture. Top organizations don’t use auditing and pen-testing for punitive measures, but rather, to find weaknesses that should be addressed. Often, they find that personnel need more training, and regular cyber drills and exercises to get to a level of proficiency commensurate with their goals. Those organizations that fail are those that do not actively seek to find weaknesses or fail to address known weaknesses properly. Sound execution of cyber best practices buys down your overall risk. With today’s national prosperity and national security reliant on information technology, the stakes have never been higher.


Hack the CIO

CIOs have known for a long time that smart processes win. Whether they were installing enterprise resource planning systems or working with the business to imagine the customer’s journey, they always had to think in holistic ways that crossed traditional departmental, functional, and operational boundaries. Unlike other business leaders, CIOs spend their careers looking across systems. Why did our supply chain go down? How can we support this new business initiative beyond a single department or function? Now supported by end-to-end process methodologies such as design thinking, good CIOs have developed a way of looking at the company that can lead to radical simplifications that can reduce cost and improve performance at the same time. They are also used to thinking beyond temporal boundaries. “This idea that the power of technology doubles every two years means that as you’re planning ahead you can’t think in terms of a linear process, you have to think in terms of huge jumps,” says Jay Ferro, CIO of TransPerfect, a New York–based global translation firm.


Taking cybersecurity beyond a compliance-first approach

With high profile security breaches continuing to hit the headlines, organizations are clearly struggling to lock down data against the continuously evolving threat landscape. Yet these breaches are not occurring at companies that have failed to recognize the risk to customer data; many have occurred at organizations that are meeting regulatory compliance requirements to protect customer data.  Given the huge investment companies in every market are making in order to comply with the raft of regulation that has been introduced over the past couple of decades, this continued vulnerability is – or should be – a massive concern. Regulatory compliance is clearly no safeguard against data breach. Should this really be a surprise, however? With new threats emerging weekly, the time lag inherent within the regulatory creation and implementation process is an obvious problem. It can take over 24 months for the regulators to understand and identify weaknesses within existing guidelines, update and publish requirements, and then set a viable timeline for compliance.


Three sectors being transformed by artificial intelligence


While these industries will see significant AI adoption this year, the AI platforms and products that scale to mainstream adoption won’t necessarily be the household names you may expect. As the “Frightful Five” continue to grow and expand their reach across industries, they have designed powerful AI products. However, these platforms present challenges for smaller companies looking to implement AI solutions, as well as larger companies in competitive industries such as retail, online gaming, shipping, and travel to name a few. How can an advertiser on Facebook feel comfortable entrusting its data to a tech behemoth that may sell a product that competes with its business? Should a big data company using a Google AI feature be concerned about the privacy of its data? These risks are very real, yet businesses have options. They can instead choose to host data on independent platforms with independent providers, guarding their intellectual property while also supercharging the advancement of AI technology.


What the ‘versatilist’ trend means for IT staffing

IT staff who once focused only on systems in the datacenter now focus on systems in the public cloud as well. This means that while they understand how to operate the LAMP stacks in their enterprise datacenters, as well as virtualization, they also understand how to do the same things in a public cloud. As a result, they have moved from one role to two roles, or even more. However, the intention is that eventually the traditional systems will go away completely, and they will be focused solely on the cloud-based systems. I agree with Gartner on that, too. While I understand where Gartner is coming from, the more automation that sits between us and the latest technology, the more technology specialists we need, not fewer. So, I’m not convinced that IT versatilists will gain new business roles to replace the loss of the traditional datacenter roles, as Gartner suggests will happen.



Quote for the day:


"We're so busy watching out for what's just ahead of us that we don't take time to enjoy where we are." -- Bill Watterson


Daily Tech Digest - February 20, 2018

Regression Testing Strategies: an Overview


Change is the key concept of regression testing. The reasons for these changes usually fall into four broad categories:

- New functionality. This is the most common reason to run regression testing. The old and new code must be fully compatible, and when developers introduce new code, they don't fully concentrate on its compatibility with the existing code. It is up to regression testing to find possible issues.
- Functionality revision. In some cases, developers revise the existing functionality and discard or edit some features. In such situations, regression testing checks whether the feature in question was removed or edited with no damage to the rest of the functionality.
- Integration. In this case, regression testing assures that the software product performs flawlessly after integration with another product.
- Bug fixes. Surprisingly, developers' efforts to patch the found bugs may generate even more bugs. Bug fixing requires changing the source code, which in turn calls for re-testing and regression testing, as the sketch after this list illustrates.
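For the bug-fix category, a regression test typically pins the corrected behavior so it cannot silently regress later. A minimal pytest-style sketch, with a hypothetical function and bug number:

```python
# Hypothetical function and bug, for illustration only.
def normalize_phone(raw: str) -> str:
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits[-10:]  # keep the 10-digit national number

def test_regression_leading_country_code():
    # Bug #123: "+1 (555) 867-5309" used to come back with 11 digits.
    # This test keeps the fix from being reintroduced by future changes.
    assert normalize_phone("+1 (555) 867-5309") == "5558675309"
```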



How the travel industry is using Big Data to tailor-make your holidays


It doesn’t take much paranoia to see how this is obviously beneficial to the airlines: your type of credit card gives a rough idea of your credit score, your billing address can give an idea of your social status, and even your email address says something about you. Plus, it’s easy to spot if you regularly fly alone. Or is your family with you? Is a certain financially-unconnected person always in the seat next to you? Are you flying to a ‘romantic’ location? Did you book a nice hotel, or are you a cheapskate? Are any of your Facebook friends or Twitter followers on the flight? What have you been looking at on the in-flight WiFi? And what events are happening in the area where you bought your flight to? All this data allows airlines to develop better models of their customers, and therefore give them ever better ways of refining their pricing models. Certain airlines are already running reverse auctions on upgrades, but this could be taken further.



5 Ways Blockchain Is Changing The Face Of Innovation of 2018


The volatility in cryptocurrencies is well-known and not for the faint-hearted, especially over recent weeks. Blockchain-based payment network Havven sets out to provide the first decentralized solution to price stability. Designed to provide a practical cryptocurrency, Havven uses a dual token system to reduce price volatility. The fees from transactions within the system are used to collateralise the network, secured by blockchain and supposedly enabling the creation of an asset-backed stablecoin. Think of Tether, but not being tied to the dollar. Each transaction generates fees that are paid to holders of the collateral token and as transaction volume grows, the value of the platform increases. Havven is a low-fee and stable payment network that wants to enable anyone anywhere to transact with anyone else. It's an interesting addition to the increasingly crowded crypto space.


Could we soon be seeing utility cryptocurrency mining?

Proof-of-work is the main model for cryptocurrency mining and blockchain, especially for Bitcoin. Basically, the way to guarantee the order of transactions is to slow down the system and make it computationally onerous to add a new block – i.e. it takes time and computing capacity. If two blocks are added simultaneously, then it is basically a competition to see who can perform the calculation tasks faster and add more to the chain, because the longer fork wins. The reward for adding a block is to receive some tokens (e.g. Bitcoins). SHA-256 (Secure Hash Algorithm), which came with Bitcoin, is a commonly used model, and there are targets for the hash value that basically force the system to perform a lot of calculations for each transaction to achieve the targeted value. The benefit of the current algorithm is that the results are easy to check, and it is easy to see whose block is added to the chain. It would probably take quite a lot of work to develop models in which miners perform some otherwise useful computation as their proof of work.
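A toy version of that proof-of-work loop fits in a few lines of Python: finding the nonce is deliberately expensive on average, while verifying it takes a single hash. (The difficulty here counts leading zero hex digits; real Bitcoin difficulty is expressed differently but follows the same idea.)

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Find a nonce whose SHA-256 digest starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # costly to find on average...
        nonce += 1

nonce = mine("block: alice pays bob 1 BTC")
# ...but trivial for anyone else to verify with one hash:
digest = hashlib.sha256(f"block: alice pays bob 1 BTC{nonce}".encode()).hexdigest()
print(nonce, digest)
```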


For artificial intelligence to thrive, it must explain itself


The reason for this fear is that deep-learning programs do their learning by rearranging their digital innards in response to patterns they spot in the data they are digesting. Specifically, they emulate the way neuroscientists think that real brains learn things, by changing within themselves the strengths of the connections between bits of computer code that are designed to behave like neurons. This means that even the designer of a neural network cannot know, once that network has been trained, exactly how it is doing what it does. Permitting such agents to run critical infrastructure or to make medical decisions therefore means trusting people’s lives to pieces of equipment whose operation no one truly understands. If, however, AI agents could somehow explain why they did what they did, trust would increase and those agents would become more useful. And if things were to go wrong, an agent’s own explanation of its actions would make the subsequent inquiry far easier. Even as they acted up, both HAL and Eddie were able to explain their actions. 


Build a multi-cloud app with these four factors in mind

A key driver behind multi-cloud adoption is increased reliability. In 2017, Amazon's Simple Storage Service went down due to a typo in a command executed during routine maintenance. In the pre-cloud era, the consequences of an error like that would be relatively negligible. But, due to the growing dependence on public cloud infrastructure, that one typo reportedly cost upwards of $150 million in losses across many companies. A multi-cloud app -- or an app designed to run on various cloud-based infrastructures -- helps mitigate these risks; if one platform goes down, another steps in to take its place. ... Infrastructure changes should take days, not months. Regardless of the reason -- to save money, to prevent vendor lock-in or simply to run your app in a development environment without design compromises -- writing code without a specific cloud platform in mind ensures it will run on any server.
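One common way to write such platform-agnostic code is to put an interface between the application and each vendor. A minimal Python sketch, with the adapter internals stubbed out:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """The only storage API the application is allowed to see."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class S3Store(ObjectStore):       # would wrap boto3 in a real build
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class GCSStore(ObjectStore):      # would wrap google-cloud-storage
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

def save_report(store: ObjectStore, report: bytes) -> None:
    # Application code never names a vendor, so failing over to another
    # cloud is a configuration change rather than a rewrite.
    store.put("reports/latest.pdf", report)
```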




The Impact Of Artificial Intelligence Over The Next Half Decade

You will get a fully automated health checkup every time you take a bath or use the toilet at your house. Body fluids and temperature will be analyzed by sensors and the data will be forwarded to an “AI doctor” that will be able to inform you if there is something wrong with you and how to proceed. Ok, maybe this one will take a little longer than a decade. “ASIMO”-like droids will begin to be sold as “physical personal assistants” – and they’re not so much different from the “common” robots in the movie AI – mainly to perform nursing support for an aging population. Cognitive Augmentation – As Maurice Conti explained, we are already “augmented”. Each and every one of us has a smartphone which is connected to the Internet and can easily reach out to a simple service like Google to get immediate knowledge about some unknown fact of life upon needing it.

What an artificial intelligence researcher fears about AI


Along the way, we will find and eliminate errors and problems through the process of evolution. With each generation, the machines get better at handling the errors that occurred in previous generations. That increases the chances that we’ll find unintended consequences in simulation, which can be eliminated before they ever enter the real world. Another possibility that’s farther down the line is using evolution to influence the ethics of artificial intelligence systems. It’s likely that human ethics and morals, such as trustworthiness and altruism, are a result of our evolution — and factor in its continuation. We could set up our virtual environments to give evolutionary advantages to machines that demonstrate kindness, honesty and empathy. This might be a way to ensure that we develop more obedient servants or trustworthy companions and fewer ruthless killer robots. While neuroevolution might reduce the likelihood of unintended consequences, it doesn’t prevent misuse.
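A toy sketch of that idea appears below: an evolutionary loop whose fitness function rewards a "cooperation" trait alongside task skill, so kinder agents gain the selective advantage the author describes. All traits, weights and numbers here are invented.

```python
import random

random.seed(1)
population = [{"skill": random.random(), "cooperation": random.random()}
              for _ in range(50)]

def fitness(agent: dict) -> float:
    # Weighting cooperation gives "kind" agents a selective advantage.
    return 0.6 * agent["skill"] + 0.4 * agent["cooperation"]

for generation in range(20):
    parents = sorted(population, key=fitness, reverse=True)[:10]    # selection
    population = [{trait: min(1.0, p[trait] + random.gauss(0, 0.05))
                   for trait in p}
                  for p in random.choices(parents, k=50)]           # mutation

best = max(population, key=fitness)
print(f"cooperation of fittest agent: {best['cooperation']:.2f}")
```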


Red Hat CIO talks open hybrid cloud, next-generation IT

There's no one single roadblock that exists for the journey, which is ongoing. But the biggest hurdle is one of people, to have your people ready with the skills needed for this. We looked at this and asked: What are the types of skills we need resident in our team to live in this world? Do we want to hire people or leverage contractors? Then we built some programs around efforts to upskill our people; it's incumbent on us to help them learn new skills. But we had a mix of all three [new hires, contractors and upskilled staff]. I don't think it's pragmatic to think you can do one versus the other. I think you need to think all three of those. [On the other hand] just giving it to a provider saying, 'Go figure this out,' is a recipe for disaster. You have to stay very engaged.


Growing an Innovative Culture


Creating an innovative culture requires strong leaders who realise that change in the culture has to start with themselves. We speak to many executives who think they can change the culture by creating a special team to foster innovation. This is not a "make it so" change. It requires everyone (including the executive) to behave differently in order to change the culture. Most executives and upper management are not motivated to change their behavior, as their reward systems are usually based on short-term financial measures and not value delivery to customers and other stakeholders. Organisational risk aversion is another big barrier to innovation. We are frequently asked to provide executives with stories about how their competitors or other organisations much like their own have implemented innovation. No one wants to be the first to try something new or different, for fear of failure.



Quote for the day:


"Leaders who won't own failures become failures." -- Orrin Woodward


Daily Tech Digest - February 19, 2018


The problem is that employee satisfaction can be a double-edged sword. While satisfied employees are good for current activities, that very satisfaction can inhibit innovation. Transformative innovation is difficult. It is far easier to stick with what we know works and tweak the current process than it is to start over. People who are satisfied with the current way of doing business are not likely to transform it. People who transform their organizations must be aggravated enough with the current situation that they’re willing to bear the effort and risk to change it. Leaders who want their organizations to continuously transform must not only look for dissatisfaction on which to capitalize, but also be willing to cultivate dissatisfaction in their employees. ... The right kind of dissatisfaction is a mindset of constantly questioning the status quo and striving for more-than-incremental change. The wrong kind is constantly finding fault with the current situation, arguing that it is somebody else’s fault and assuming it’s somebody else’s responsibility to fix.



Dear IT security pros: it's time to stop making preventable mistakes

Just think about it – how many log analysis services do you know? They generally all have a nice UI. The same goes for SIEMs. But the confusion comes with the graphic and alert overload – red and green icons telling analysts there are numerous findings that require attention. Security analysts usually don’t know which alerts to start executing on – and it’s hard to determine which alert poses the highest risk and which is just noise because no one has tuned its threshold. And to make matters worse, once a security analyst has opened an alert to start vetting it, they’re usually too scared to close down wide-open-to-the-internet ports because they don’t know the extent of the impact that will have on the company’s production environment. As a security advisor, the thing that really irritates me is just how preventable most (if not all) of the 2017 attacks I researched were. Companies like Equifax are not being decimated by unusually savvy hackers; they are being exposed by their own internal mistakes. Most of these errors are straight out of any “Tech Security 101” textbook.



Global cyber risk perception: Highest management priorities

The survey also found that a vast majority – 75% – identified business interruption as the cyber loss scenario with the greatest potential to impact their organization. This compares to 55% who cited breach of customer information, which has historically been the focus for organizations. Despite this growing awareness and rising concern, only 19% of respondents said they are highly confident in their organization’s ability to mitigate and respond to a cyber event. Moreover, only 30% said they have developed a plan to respond to cyber-attacks. “Cyber risk is an escalating management priority as the use of technology in business increases and the threat environment gets more complex,” said John Drzik, president Global Risk and Digital, Marsh. “It’s time for organizations to adopt a more comprehensive approach to cyber resilience, which engages the full executive team and spans risk prevention, response, mitigation and transfer.”


Meaningful AI Deployments Are Starting To Take Place: Gartner

Meaningful Artificial Intelligence (AI) deployments are just beginning to take place, according to Gartner. Gartner’s 2018 CIO Agenda Survey shows that 4% of CIOs have implemented AI, while a further 46% have developed plans to do so. "Despite huge levels of interest in AI technologies, current implementations remain at quite low levels," said Whit Andrews, research vice president and distinguished analyst at Gartner. "However, there is potential for strong growth as CIOs begin piloting AI programs through a combination of buy, build and outsource efforts." As with most emerging or unfamiliar technologies, early adopters are facing many obstacles to the progress of AI in their organizations. Gartner analysts have identified four lessons that have emerged from these early AI projects.


Hacking critical infrastructure via a vending machine? The IOT reality

Many are currently, and rightly, concerned about protection from outside threats getting into important networks. The latest firewalls, intrusion prevention systems, advanced protection systems all play a part in defence, but as more and more connected devices enter networks, it is now critical to look at threats from within as well.  If firms do not have proper infrastructure to support IoT devices, they risk exposing their corporate networks to malicious activities. This can lead to devastating effects, especially if hackers uncover vulnerabilities in IoT devices within critical infrastructure. A good starting point for businesses as they take their network security efforts seriously in today's hyper-connected world, is to increase awareness of all the devices on the network and implement centralised management systems that help ensure compliance.


Ok, I Was Wrong, MDM is Broken Too: Insular, Dictatorial MDM Doesn’t Work

In master data management, fundamentally, your data problems are not technology problems. They are not even MDM problems. Your data problems aren’t even really well … data problems. They are business problems. They are the problem of getting four business people, three data stewards and several application managers into a room to formally agree on what revenue is for a customer record stored in the sales, marketing, ERP, and finance systems. MDM problems are about getting the right people educated, motivated and in agreement. And this can be messy and difficult. When you succeed with MDM, you succeed by working from the business down. When you fail, you fail because you design and implement something around a technology first and then you ‘release’ your master data solution to various practitioners around your company and expect them to comply. Like my peers in my freshman programming course, we race to implement without spending enough time planning, negotiating and understanding.


Dissect the SQL Server on Linux high availability features


The availability group configurations that provide high availability and data protection require three synchronous replicas. When there is no Windows Server failover cluster, the availability group configuration is stored in the master database on the participating SQL Server instances, and at least three synchronous replicas are needed to provide both high availability and data protection. An availability group with two synchronous replicas can provide data protection, but this configuration cannot provide automatic high availability. If the primary replica has an outage, the availability group will automatically fail over; however, applications cannot automatically connect to the availability group until the primary replica is recovered. You can have a mixed availability group that contains both Windows and Linux replicas, but Microsoft recommends this only for data migration.


“Less is More”: Four Steps to Aligning Your Project Queue and Goals Today

Today, as grown ups, “busywork” no longer holds the cachet it once may have. With corporate belts tightening and analytics available that expose the efficacy of each and every tactic, bloat can be harmful or fatal to even the most well intentioned of marketing professionals. And with 40 percent of corporate enterprises still bemoaning the fact that they can’t prove the ROI of their marketing activities, it’s clear that in many marketing departments, the project queue may be filled with plenty to keep the team busy – but is it hitting the mark? I recently spent time with a financial services client that was struggling to define growth, as it battled for market share in a crowded segment. Upon evaluating its marketing portfolio, it was clear that it had completed many projects in the recent past – but only a handful had yielded what one would consider to be “big wins.” 


How to connect to a remote MySQL database with DBeaver

If your database of choice is MySQL, you have a number of options. You can always secure shell into that server and manage the databases from the command line. You can also install a tool like phpMyAdmin or adminer to take care of everything via a web-based interface. But what if you'd prefer to use a desktop client? Where do you turn? One possible option is DBeaver. DBeaver is a free, universal SQL client that can connect to numerous types of databases—one of which is MySQL. I want to show you how to install and use DBeaver to connect to your remote MySQL server. DBeaver is available for Windows, macOS, and Linux. I'll be demonstrating on a Ubuntu 17.10 desktop connecting to a Ubuntu Server 16.04. The installation of DBeaver is fairly straightforward, with one hitch. Download the necessary .deb file from the downloads page and save it to your ~/Downloads directory. Open up a terminal window and change into that directory with the command cd ~/Downloads.


5 things that will slow your Wi-Fi network

The 2.4 GHz frequency band has 11 channels (in North America), but only provides up to three non-overlapping channels when using the default 20 MHz wide channels or just a single channel if using 40 MHz-wide channels. Since neighboring APs should be on different non-overlapping channels, the 2.4 GHz frequency band can become too small very quickly. The 5 GHz band, however, provides up to 24 channels. Not all APs support all the channels, but all the channels are non-overlapping if using 20 MHz-wide channels. Even when using 40 MHz-wide channels, you could have up to 12 non-overlapping channels. Thus, in this band, you have less chance of co-channel interference among your APs and any other neighboring networks. You should try to get as many Wi-Fi clients as you can to use the 5 GHz band on your network to increase speeds and performance. Consider upgrading any 2.4 GHz-only Wi-Fi clients to dual-band clients.



Quote for the day:



"Learn to do favors not for the people that can later return the favor but for those that need the favor." -- Unknown


Daily Tech Digest - February 18, 2018

Ready or Not, It's Time to Embrace AI

AI has changed online commerce by enabling brands to make sense of their data and put it to good use with smarter algorithms. In this age of conversational commerce, artificial intelligence is critical to providing a personalized experience. Businesses without an AI strategy are almost certain to perish. According to Forrester, insights-driven businesses will "steal" $1.2 trillion per year from their "less-informed peers.” Until a few months back, only bigger companies could afford the sizable investments required to implement AI. That's no longer the case. AI is becoming more accessible to businesses of all sizes. In the next few years, AI will continue to expand its reach throughout organizations. Early adopters already are reaping the benefits. If you're not one of them, now is the time to start. Here are four reasons you (and every small-business owner) should incorporate AI-enabled technology in your sales and customer-service strategies.


Artificial Intelligence And The Threat To Salespeople


If you work in sales, now is the time to step your game up in a major way. Companies are investing in technology to replace salespeople. The truth is, your company thinks you're overpaid. If you're a salesperson, you're probably making six, seven or eight figures a year, and your company believes it's too much money. Now, listen, I'm not here to give you good news. I'm here to give you the truth. Here's what I see in the wave of the future. Those who know how to program the technology, operate the robots and work with artificial intelligence — the computer programs, algorithms, etc. — will be the salespeople remaining in their jobs. No longer will you be able to say, "People expect service. They want me to answer when the phone rings." Admin jobs will be automated. ... A human. We can expect a slew of robots to replace a lot of mid-level income earners. Many salespeople making six and seven figures a year will be removed, no matter their skills and sales. Artificial intelligence is far stronger than our natural-born intelligence.


Why is it so hard to train data scientists?

A data scientist should be familiar with databases, as many of the world’s data are organized in relational and non-relational databases. For working with a variety of data types, the data scientist needs to be able to parse and render files and convert between data formats. Working with large databases often requires programming skills beyond basic scripting in R or Python, as well as knowledge of algorithm design and operating systems. Machine learning is also a required skill. In other words, a complete data scientist should have knowledge of computer science at the level of a trained computer scientist. A data scientist must also be highly familiar with statistics, and understand multiple statistical methods for tasks such as regression, dimensionality reduction, statistical significance analysis, Monte Carlo simulations, and Bayesian methods, to name a few.
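To pick one item from that list, a Monte Carlo simulation can be only a few lines; this classic sketch estimates pi from the fraction of random points that fall inside the unit circle.

```python
import random

random.seed(0)
n = 1_000_000
# A point (x, y) drawn uniformly from the unit square lands inside the
# quarter circle x^2 + y^2 <= 1 with probability pi/4.
inside = sum(1 for _ in range(n)
             if random.random() ** 2 + random.random() ** 2 <= 1.0)
print("pi is approximately", 4 * inside / n)
```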


Trend Micro Cybersecurity Reference Architecture for Operational Technology

Vulnerabilities arise particularly when just-in-time manufacturing and a faster speed to market leave less time for product safety testing. These vulnerabilities might not be uncovered until millions of vehicles have been released, in which case the necessary patching procedure is all but certain to prove even more costly — not only to the affected carmaker’s finances but also to its reputation. It’s important, then, for security measures to be properly applied right from the outset of the car manufacturing process, starting in the design phase. That is why it is important for device manufacturers to integrate security into the device itself, to ensure consumers and businesses are protected from these challenges the minute they install an IoT device. Because of these challenges, Trend Micro has developed a cybersecurity solution called Trend Micro Internet of Things (IoT)


Is REST the New SOAP?

Almost a decade ago there was a flurry of activity around REST and SOAP based systems. Several authors wrote about the pros and cons of one or the other, or when you should consider using one instead of the other. However, with much attention moving away from SOAP-based Web Services to REST and HTTP, the arguments and discussions died down and many SOA practitioners adopted REST (or plain HTTP) as the basis for their distributed systems. However, recently Pakal De Bonchamp wrote an article called "REST is the new SOAP" in which he compares using REST to "a testimony to insanity". His article is long and detailed but some of the key points he makes include the complexity, in his view, of simply exposing a straightforward API which could be done via an RPC mechanism in "a few hours" yet with REST can take much longer. Why? Because: No more standards, no more precise specifications. Just a vague “RESTful philosophy”, prone to endless metaphysical debates, and as many ugly workarounds.
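The contrast is easiest to see side by side. The Flask sketch below (illustrative, not from the article) exposes the same lookup twice: once RPC-style, where the URL names the action, and once REST-style, where the URL names the resource and the HTTP verb carries the action.

```python
# Illustrative only. Requires `pip install flask`.
from flask import Flask, jsonify, request

app = Flask(__name__)
accounts = {"42": {"balance": 100}}

# RPC style: the URL names the action; everything is a POST with a payload.
@app.post("/api/getAccountBalance")
def get_account_balance():
    return jsonify(accounts[request.json["account_id"]])

# REST style: the URL names the resource; GET/PUT/DELETE carry the action.
@app.get("/accounts/<account_id>")
def read_account(account_id):
    return jsonify(accounts[account_id])
```

The RPC endpoint is indeed quicker to design and document, which is the article's point; the REST endpoint buys uniform semantics and cacheability at the cost of the debates the author complains about.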


Where do blockchain opportunities lie? Top FinTech influencers weigh in

Any evolution in infrastructure must support the service and product expectations of the marketplace. Perhaps the most notable change in the financial services space would be the transition from corporate to individual data ownership and privacy. This in itself will fundamentally change the relationship between a bank and its customers, as well as industry revenue models. Beyond this, you’ll see intermediaries from our traditional financial services model get squeezed out as blockchain technologies reduce overall risk to any transaction. Additionally, blockchain technology’s standardization of information will enable broader adoption of adjacent new age technologies such as RPA and AI. These technologies leveraged together will move traditional financial services off of spreadsheets. This will require more training and retraining of personnel.


Strava’s privacy PR nightmare shows why you can’t trust social fitness apps

Strava needs its users to share their rides, runs, and swims. After all, the more activities they share—currently users post over 1.3 million activities per day—the more evidence Strava has to encourage others to keep using the app, and perhaps even trade up from the free version to an $8-per-month one. More shared data also means more to feed into Strava’s Metro business, which sells anonymized commuter data to cities. The company wasn’t profitable as of this past fall, but its CEO, James Quarles, clearly sees these two lines of business as the main paths to growth, assuming it gets more and more information from its users. And, frankly, using Strava in a very social way can be addicting. Since it began in 2009, the company has perfected the art of fitness gamification and competitive sharing. Its app lets you see basic stats from your and your friends’ workouts; it encourages you to give each other kudos for completing activities.


“Unlearn” to Unleash Your Data Lake

Sometimes it’s necessary to unlearn long held beliefs (i.e. 2-point shooting in a predominately isolation offense game) in order to learn new, more powerful, game changing beliefs (i.e. 3-point shooting in a rapid ball movement offense). Sticking with our NBA example, Phil Jackson is considered one of the greatest NBA coaches, with 11 NBA World Championships coaching the Chicago Bulls and the Los Angeles Lakers. Phil Jackson mastered the “Triangle Offense” that played to the strengths of the then dominant players Michael Jordan and Kobe Bryant to win those 11 titles. However, the game passed Phil Jackson as the economics of the 3-point shot changed how to win. Jackson’s tried-and-true “Triangle Offense” failed with the New York Knicks leading to the team’s dramatic under-performance and ultimately his firing. It serves as a stark reminder of how important it is to be ready to unlearn old skills in order to move forward.


Why a CHRO Will Be the Next Must-Have Role in the Boardroom


The primary job of any board of directors is to make sure the right leadership team is in place to drive the business, and the CEO is at the heart of that goal. A strong leadership bench is one with a succession plan in place, but this is a delicate topic. There are disclosure issues around such material information, of course, and some CEOs need encouragement to leave when the time is right – whether the change is contentious or not. Similarly, boards are often nervous about the timing of such shifts, particularly when they perceive a lack of a strong successor. Managing through these issues doesn’t come naturally to many board members, but it does for experienced CHROs. Such executives can offer insights on planned transitions and how to navigate the process, from identifying internal candidates to talking about development plans to introducing these topics to chief executives.


How IoT Affects the CISO's Job

"There are a lot of companies that are well positioned to handle IoT, but there are a lot that are so focused on just the day-to-day security work of keeping windows PCs and Linux servers secure, that they haven't gotten started at all," Pesatore says in an interview with Information Security Media Group. CISOs need to take steps to ensure they're involved in device acquisition decisions in all departments within the enterprise, he stresses. "Security and IT need to be involved in the decisions on building and buying these types of devices so we can make sure they are as secure and safe as possible," he says. And security staffs need to diversify their skills as a wider variety of devices are used in the enterprise, he adds. "When you look at the internet of things devices, it's a very heterogeneous world. There are all kinds of different operating systems and software and communications standards," he notes.



Quote for the day:

"The man who is afraid to risk failure seldom has to face success." -- John Wooden