Daily Tech Digest - March 16, 2018

The future of IT: Snapshot of a modern multi-cloud data center

Multi-cloud Data Centers Are Emerging as a Hedge Against the Major Commercial Clouds
The idea of cloud computing remains simplicity itself, which is a key element of its appeal: move the cost and complexity of procuring, provisioning, operating, and supporting an endless array of hardware, software, and enabling services for your business out to a third party, which does it all for you, yet more securely and with much greater economies of scale. Writ large across virtually all industries, a comprehensive shift to the cloud thus continues to be a top objective of CIOs in many organizations this year -- and it remains the objective despite misgivings that we're really just going back to the monolithic IT vendor world again. Not surprisingly, enabling such a strategic move is also the top business goal of the leading commercial cloud vendors, namely Amazon, Microsoft, and Google, who continue to vie vigorously for market share, technical leadership, and -- some would say -- the most interesting and valuable part of the market itself ... Hosting companies like Rackspace used to be able to provide a hedge that IT departments could use for such purposes, through services like co-location. However, most such providers have not been able to keep up with the overall capacity race or compete in the bruising cost-efficiency battles that the top cloud providers can afford to wage.



The Containerization of Artificial Intelligence

While AI is more hype than reality today, machine intelligence — also referred to as predictive machine learning — driven by meta-analysis of large data sets using correlations and statistics, provides practical ways to reduce the need for human intervention in policy decision-making. A typical by-product of such applications is the creation of behavioral models that can be shared across policy stores for baselining or policy modification. ... Adoption of AI can be disruptive to organizational processes and must sometimes be weighed against the cost of dismantling existing analytics and rule-based models. The application of AI must be built on the principle of shared security responsibility: under this model, technologists and organizational leaders accept joint responsibility for securing data and corporate assets, because security is no longer strictly the domain of specialists and affects both operational and business fundamentals.


AI & Blockchain: 3 Major Benefits Of Combining These Two Mega-Trends


AI, as the term is most often used today, is, simply put, the theory and practice of building machines capable of performing tasks that seem to require intelligence. Currently, cutting-edge technologies striving to make this a reality include machine learning, artificial neural networks and deep learning. Meanwhile, blockchain is essentially a new filing system for digital information, which stores data in an encrypted, distributed ledger format. Because data is encrypted and distributed across many different computers, it enables the creation of tamper-proof, highly robust databases which can be read and updated only by those with permission. Although much has been written from an academic perspective on the potential of combining these ground-breaking technologies, real-world applications are sparse at the moment. However, I expect this situation to change in the near future. So here are three ways in which AI and blockchain are made for each other.
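The tamper-evidence described above comes from each block committing to a cryptographic hash of its predecessor, so altering any historical record invalidates every later link. A minimal sketch in Python (deliberately ignoring consensus, signatures, and the encryption layer a real blockchain adds on top):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append a new block that commits to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})
    return chain

def verify(chain):
    """Recompute every link; any tampering breaks the chain downstream."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")
assert verify(chain)
chain[0]["data"] = "alice pays bob 500"  # tamper with history
assert not verify(chain)
```

Distributing copies of such a chain across many computers is what makes retroactive edits detectable by everyone else, which is the property the AI-plus-blockchain proposals lean on.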


Cyber criminals using complex financial system, study shows


The findings on cyber criminal money-laundering and cashing-out methods are part of a study, led by Michael McGuire, senior lecturer in criminology at Surrey University, into the macroeconomics of cyber crime and how its various elements link together. “This is the first time the financial flows of cyber criminals have been put together into a composite picture,” said McGuire, who will present the full findings of the nine-month Web of Profit study at the RSA Conference in San Francisco on 17-19 April. “Law enforcement and cyber security professionals can use the study to understand how revenue generation is feeding into laundering, and how laundering is feeding into more traditional methods of money-laundering and the way cyber criminals are spending their money, so that they look at the intersections between the various networks more carefully,” he told Computer Weekly.


To OSPF Or Not? Which Routing Protocol To Use

OSPF with a multipoint MAN is a classic DR/BDR LAN situation, reducing the amount of peer-to-peer flooding. I haven’t run into this at large scale in a design setting yet. Would having such a MAN provide a pretty good reason to run OSPF overall? How would one damp instability in such a network? Large failure domain? What number of peers is “too big” for a full mesh MAN? The other problem I’m still mulling over is the OSPF WAN to dual datacenters design. In one case, a customer was running more than 250 VLANs (one per area) over DWDM, and more recently over OTV between datacenters, with more than 4000 GRE over IPsec tunnels. Dual hub DMVPN and BGP route reflectors look very attractive compared to that. “Totally stubby EIGRP” — hubs that advertise only 0/0 or corporate default to remote sites — could also work well. By the way, if you are using EIGRP, note Cisco’s clever recent stub-site feature, which was probably built to simplify IWAN.
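On the "too big for a full mesh" question, one useful back-of-the-envelope input is that a full mesh of n peers needs n(n-1)/2 adjacencies, so neighbor state and flooding work grow quadratically. A quick illustration (the peer counts are arbitrary examples, not vendor guidance):

```python
def full_mesh_adjacencies(n):
    """Number of point-to-point adjacencies in a full mesh of n peers."""
    return n * (n - 1) // 2

# 10 peers -> 45 adjacencies; 250 peers -> 31,125 adjacencies,
# which is why hub-and-spoke designs (DMVPN, route reflectors) scale better.
for n in (10, 50, 100, 250):
    print(n, full_mesh_adjacencies(n))
```

The quadratic growth, more than any single hard limit, is usually what decides the "how many peers is too many" question for a given router's CPU and memory budget.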


5 Applications Of Smart Contracts

Due to a lack of automated administration, it can take months for an insurance claim to be processed and paid. This is as problematic for insurance companies as it is for their customers, leading to admin costs, backlogs, and inefficiency. Smart contracts can simplify and streamline the process by automatically triggering a claim when certain events occur. For example, if you lived in an area that was hit by a natural disaster and your house sustained damage, the smart contract would recognise this and begin the claim. Specific details (such as the extent of damage) could be recorded on the blockchain in order to determine the exact amount of compensation. ... The terms of a mortgage agreement, for example, are based on an assessment of the borrower’s income, outgoings, credit score and other circumstances. The need to carry out these checks, often through third parties, can make the process lengthy and complicated for both the lender and the borrower. Cut out the middlemen, however, and the parties could deal directly with each other (as well as access all the relevant details in one location).
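The claims flow described above boils down to a conditional payout keyed on an externally reported event. A toy sketch only: real smart contracts run as on-chain code (e.g. Solidity) and consume oracle feeds, and every field name below is hypothetical:

```python
def settle_claim(policy, event):
    """Toy parametric-insurance logic: pay out automatically when a
    qualifying event is reported for the insured region."""
    if event["region"] != policy["region"]:
        return 0
    if event["type"] not in policy["covered"]:
        return 0
    # Payout scales with the damage recorded on-chain, capped at the insured sum.
    return min(event["damage_assessed"], policy["insured_sum"])

policy = {"region": "coastal-07", "covered": {"flood", "storm"}, "insured_sum": 100_000}
event = {"region": "coastal-07", "type": "flood", "damage_assessed": 35_000}
assert settle_claim(policy, event) == 35_000
```

The point of the sketch is that once the trigger conditions and payout formula are encoded, no human adjuster sits between the event and the payment, which is where the months of processing time disappear.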


China wants to shape the global future of artificial intelligence

“[The Chinese government] sees standardization not only as a way to provide competitiveness for their companies, but also as a way to go from being a follower to setting the pace,” says Jeffrey Ding, a student at Oxford University’s Future of Humanity Institute who studies China’s nascent AI industry, and who translated the report. The government’s plan cites the way US standards bodies have influenced the development of the internet, expressing a desire to avoid having the same thing happen with AI. China’s booming AI industry and massive government investment in the technology have raised fears in the US and elsewhere that the nation will overtake international rivals in a fundamentally important technology. In truth, it may be possible for both the US and the Chinese economies to benefit from AI. But there may be more rivalry when it comes to influencing the spread of the technology worldwide. “I think this is the first technology area where China has a real chance to set the rules of the game,” says Ding.


Open Source & Smart Mobility In The Transportation Industry


Open source projects in the big data space move their development and feature sets along quickly to harness the latest enhancements in technology, performance, and scalability. New best practices get baked into data platform solutions very quickly, and a huge community of data scientists, scripters, and programmers all work toward the same goal, making best-of-breed technology available to anyone. At the foundational level, innovation occurs so rapidly that it is unrealistic to expect a vendor to encapsulate all these new developments in anything but a proprietary solution layered on top. Selecting an open source platform for data projects also removes the risk of vendor lock-in. When it comes to the data space, as with most things, putting all your eggs in one basket is inadvisable. Much of the innovation occurring in the open source data space is directly attributable to the best and brightest minds’ aversion to being tied down to a single vendor, which makes a shared effort much more attractive.


Transforming Bank Compliance with Smart Technologies

Digitization, the final stage in the transformation process, has the potential to create a step change in compliance operations. The catalyst is the emergence of smart technologies, which offer significant performance improvements and the ability to mimic human capabilities such as learning, language use, and decision making. Smart technologies have multiple potential applications in the context of compliance, from support for relatively routine tasks in client onboarding to analysis of unstructured data sets—for example, in relation to money laundering. Across the board, these technologies offer a route to significant efficiency gains and can help employees work more effectively. The starting point in building a cutting-edge compliance framework is to establish a taxonomy that describes and classifies key areas of risk. Such a taxonomy is also a prerequisite for defining the scope of a target operating model. The six most relevant types of compliance risks relate to financial crime and conduct.


IBM sets up no-fee Data Science Elite team to speed up AI adoption


Big Blue is calling the latter “Cloud Private for Data”, based on an in-memory database. It adds up to a platform for doing data science, data engineering and app building. IBM said the aim is to “enable users to build and exploit event-driven applications capable of analysing data from things like IoT [internet of things] sensors, online commerce, mobile devices, and more”. ... IBM is also announcing a “Data Science Elite Team”, described as a “no-charge consultancy dedicated to solving clients’ real-world data science problems and assisting them in their journey to AI”. Patricia Maqetuka, chief data officer at South African bank Nedbank, has used the team. She said: “Nedbank has a long tradition of using analytics on internal, structured data. Thanks to IBM Analytics University Live, we were exposed to the guidance and counsel of IBM’s Elite team. This team helped us to unlock new paradigms for how we think about our analytics and change the way we look at use cases to unlock business value.”



Quote for the day:


"Don't waste words on people who deserve your silence. Sometimes the most powerful thing you can say is nothing at all." -- Joubert Botha


Daily Tech Digest - March 15, 2018

How Valuable Is Your Company's Data?

"As best as I can tell, there's no manual on how to value data but there are indirect methods. For example, if you're doing deep learning and you need labeled training data, you might go to a company like CrowdFlower and they'd create the labeled dataset and then you'd get some idea of how much that type of data is worth," said Ben Lorica, chief data officer at O'Reilly Media. "The other thing to look at is the valuation of startups that are valued highly because of their data." Observation can be especially misleading for those who fail to consider the differences between their organization and the organizations they're observing. The business models may differ, the audiences may differ, and the amount of data the organization has and the usefulness of that data may differ. Yet a common mistake is to assume that because Facebook or Amazon did something, what they did is a generally applicable template for success. However, there's no one magic formula for valuing data because not all data is equally valuable, usable or available.



Let Your Data Scientists Be Human


Humans are better at common sense than computers, instantly recognizing when a decision doesn’t make sense. This does not mean that humans are obsolete. Humans are stronger at communication and engagement, context and general knowledge, creativity, and empathy. When I have a frustrating problem, I want to talk to a human — someone who will understand my exasperation, listen to my experience, and make me feel valued as a customer while also solving my problem for me. And humans can be creative. I recently heard music composed by a computer, and I’m sure that song won’t make it into the Top 40! Traditionally, businesses have hired data scientists who manually designed and built algorithms. The data scientists spent much of their time writing code and applying mathematics and statistics. Data scientists had no time to be human.


Yet Another Hospital Gets Extorted by Cybercriminals

Unlike the vast majority of ransomware attacks, the Hancock attack was not the byproduct of a successful phishing campaign. “The [hacking] group obtained the login credentials of a vendor that provides hardware for one of the critical information systems used by the hospital,” explains Hancock Health President and CEO Steve Long. “Utilizing these compromised credentials, the hackers targeted a server located in the emergency IT backup facility utilized by the hospital...” Since they’d made a practice of regularly backing up all of their critical files, Hancock administrators initially believed that they would be able to purge the compromised files and replace them with clean backup versions. Unfortunately, it turned out that the “electronic tunnel” between the backup site and the hospital had been intentionally blocked. Several days later, administrators discovered that “the core components of the backup files from many other systems had been purposefully and permanently corrupted by the hackers.”


Is this the dawn of the robot CEO as artificial intelligence progresses?

A human CEO can be corrupted by outside influence, but generally they have the freedom to make up their own minds and will face life-changing consequences should their impropriety be discovered. Robot CEOs, on the other hand, could be completely ‘brain-washed’ by cybercriminals. For all of their incisive decision making and their unfaltering commitment to the company’s balance sheets, board and shareholders, a robot CEO could effectively ruin a company in seconds, or – if obfuscation is the game – quietly skim the company of profits in a ‘death by a thousand cuts’ approach. Kaspersky Lab researchers find the idea of robot CEOs intriguing, but have some very real concerns about a future where robots are given too much responsibility. Cybercriminals go where the money is. That means if the robot stands between them and the possibility of substantial financial gain, they’ll find a way to exploit it. It’s always a cat-and-mouse game in cybersecurity. We come up with a defence; they find a way around it. It would be no different for a robot CEO.


Can The CIO Be The CDO?


The SAP Digital Transformation Executive Study indicates that successful companies must combine the best of these modes, resulting in what is effectively a “bimodal” approach to driving innovation. Our findings suggest that 72% of digital transformation leaders see a bimodal architecture as key to maintaining their core processes while quickly implementing next-generation technology. For better or worse, CIOs are traditionally associated with mode 1 – keeping the company running efficiently and effectively, at the lowest cost and least disruption. (No wonder they reigned during the era of deploying ERP systems.) It’s mission-critical, but it’s also the less glamorous side of IT today. In contrast, CDOs are all about disruption and digital transformation – the “mode 2” initiatives: driving new sources of revenue generation and using data to improve the customer experience. According to the SAP study, mode 2 initiatives fall into the category of “core business goals” for 96% of the Top 100 leaders in digital transformation, compared with 61% of laggards.


Voice-Operated Devices, Enterprise Security & the 'Big Truck' Attack

Biometric authentication, for one, doesn't solve the problem. In theory, Alexa could learn to identify authorized people's voices and listen only to the commands they give. But while this seems like a possible solution, the opposite is actually true. To begin with, there is an inherent trade-off between usability and security. Implementing such a system means that users would have to go through an onboarding process to teach Alexa, or any other voice-enabled device, how they sound. Compared to the status quo, where Alexa works out of the box, that is a serious degradation in user comfort. Biometric identification also means false rejections: if your voice sounds different because you are sick, sleepy, or eating, Alexa will probably not accept you as an authorized user. And this is not all — there are systems available (like this example of Adobe VoCo) that, using a sample of a person's voice saying one thing, can generate a new sample of that voice saying another thing.


With auditability, deep learning could revolutionise insurance industry


Given the low level of risk involved, said Natusch, “correlation is sufficient to drive action”. For applications such as Google’s photo identification, neural network-based algorithms are sufficient, he said. But in a risk-averse use case, such as decision-making in healthcare, people need to understand why a given decision was made, he said. This requires a causation-based approach, more suited to probabilistic graphical models, he added. Speaking of his experiences at Prudential, Natusch said: “We need two models – one to understand historical data, and something for handwriting recognition.” Discussing how handwriting recognition could streamline claims processing at the insurer, Natusch said that once a paper claims form is scanned, it ends up as a grayscale image. This is effectively a set of numbers that can be analysed using a neural network.
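The "set of numbers" point is worth making concrete: a scanned grayscale form is just a grid of pixel intensities, which is flattened and scaled before being fed to a neural network. A tiny illustrative sketch (the 4x4 "image" below is made up, standing in for one scanned character):

```python
# A grayscale scan is a grid of intensities: 0 = black, 255 = white.
image = [
    [255, 255, 255, 255],
    [255,   0,   0, 255],
    [255,   0,   0, 255],
    [255, 255, 255, 255],
]

def to_input_vector(img):
    """Flatten the grid row by row and scale intensities into [0, 1],
    the numeric form a neural network's input layer consumes."""
    return [px / 255.0 for row in img for px in row]

vec = to_input_vector(image)
assert len(vec) == 16 and min(vec) == 0.0 and max(vec) == 1.0
```

Real handwriting-recognition pipelines work on much larger grids (e.g. 28x28 for MNIST-style digits), but the principle is the same: the network never sees "a form", only this vector of numbers.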


Innovation in Retail Banking 2017

The level of investment in both digitalization and innovation has increased in lockstep as a result. The report illustrates the varying priorities of organizations of different sizes and the challenges and opportunities in the marketplace. More than ever, it is clear that having a defined innovation business model, with the application of data and advanced insights, is an imperative for success. It is also clear that being a ‘fast follower’ is not a viable strategy. We would like to thank Efma and Infosys Finacle for their partnership and their sponsorship of the 9th annual Innovation in Retail Banking research report. Their partnership has enabled us to create the most robust benchmarking of digitalization and innovation in banking, and to better understand the impact across all components of the financial services ecosystem.


How to train and deploy deep learning at scale

There was no deep learning in Spark MLlib at the time. We were trying to figure out how to perform distributed training of deep learning in Spark. Before getting our hands dirty and trying to implement anything, we wanted to do some back-of-the-envelope calculations to see what speed-ups you could hope to get. ... The two main ingredients here are just computation and communication. ... We wanted to understand this landscape of distributed training, and, using Paleo, we've been able to get a good sense of this landscape without actually running experiments. The intuition is simple: if we're very careful in our bookkeeping, we can write down the full set of computational operations required for a particular neural network architecture when it's performing training.
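The back-of-the-envelope flavor of such estimates can be sketched as one training step's compute time (which shrinks as workers are added) plus gradient-exchange time (which grows with them). This is a deliberately crude stand-in, not Paleo's actual model, and every number below is hypothetical:

```python
def estimated_step_time(flops_per_step, device_flops, params, workers,
                        bandwidth_bytes_per_s, bytes_per_param=4):
    """Crude estimate of one data-parallel training step:
    compute split across workers, plus naive gradient aggregation."""
    compute = flops_per_step / (device_flops * workers)
    # Naive all-to-one aggregation: every worker ships its full gradient.
    communication = (params * bytes_per_param * workers) / bandwidth_bytes_per_s
    return compute + communication

# With made-up numbers, 8 workers cut compute 8x but multiply gradient
# traffic, so the end-to-end speed-up lands far below 8x.
one = estimated_step_time(1e12, 1e12, 25e6, 1, 1e9)
eight = estimated_step_time(1e12, 1e12, 25e6, 8, 1e9)
assert 1 < one / eight < 8
```

Even this toy version reproduces the qualitative finding: once communication dominates, adding workers stops helping, which is exactly the kind of conclusion one wants to reach before running real experiments.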


Outbrain Outgrows Initial Big Data Infrastructure, Migrates

"If a researcher or algorithm developer wants to introduce something new, we want to say 'let's try it, let's go for it,'" Yaron said. But it had become too difficult to do that with the five-year-old Hadoop cluster, which had grown to 330 nodes. "It resulted in bad performance for our algorithms when the Hadoop cluster wasn't stable enough," Yaron said. "At some point it became a source of frustration." Yaron's team decided to rebuild its Hadoop infrastructure with new physical servers and standardize on a MapR implementation of Hadoop. "We are a company that runs a lot of open source, and we try to contribute to the open source community as well," Yaron told me. "However, there are cases where we feel there is value to enterprise technologies." The new physical servers changed the ratio between disk space, RAM, and CPU, Yaron said, and the hardware and software upgrade enabled Outbrain to reduce the footprint of its Hadoop servers in the data center to one-third of what it had been.



Quote for the day:


"More people would learn from their mistakes if they weren't so busy denying them." -- Harold J. Smith


Daily Tech Digest - March 14, 2018

How to build an intelligent supply chain


Intelligent supply chains boast the ability to perform continuous predictive analytics on enormous amounts of data and use machine learning models to review historical information to plan for current and future needs. To create such a system, the first thing a company needs is "an intelligent brain, a cognitive operating system," said Frederic Laluyaux, president and CEO of Aera Technology, a platform dedicated to building the self-driving enterprise. Companies already have the needed data from their transactional systems. "The brain does the job, and the data feeds the brain." The cognitive operating system provides the computing connectivity. Laluyaux said that Aera’s system crawls transactional systems, like Google crawls websites. In some cases, their system has to crawl 54 different ERPs at one company, which is not unusual. "Every big company that has gone through mergers and acquisitions will have the same complexity," Laluyaux said. Even if the company has standardized their ERPs, they have many different modules.



Raspberry Pi 3 Model B+ arrives

TechRepublic's Nick Heath got an early hands-on with the Raspberry Pi 3 Model B+ and found in benchmarking tests that it's the fastest Raspberry Pi model available, both in single-core and quad-core measurements. As Heath notes, the addition of 802.11ac Wi-Fi gives the Raspberry Pi 3 Model B+ triple the maximum throughput of the Pi 3 Model B's 2.4GHz 802.11n Wi-Fi. Eben Upton, co-creator of the popular developer board, told TechRepublic that its B+ releases are all about refinement. "It's not a Raspberry Pi 4. I think what we've ended up with is something which is a really good B+, a bit too good for a B+, but that would be not really anywhere near enough for a Raspberry Pi 4," said Upton. "The B+ is our attention-to-detail spin for the product, where we take all the stuff we've learned in the past couple of years and smoosh it all together and make a new thing that addresses some of the concerns, or takes advantage of some of the opportunities that have come along in the intervening time."


What is security’s role in digital transformation?

McQuire admits that companies do struggle to keep up with the technological progress. “Many firms are simply unable to keep up with the rapid technology changes. The threat landscape is transforming before our eyes with malware, ransomware, and phishing attacks all rising rapidly,” he says. “There is also significant regulatory change occurring in the form of GDPR, which adds new pressures and holds those with weak security and privacy processes financially accountable.” “You combine this with a general lack of security talent in most firms and the fact that most run a complex web of legacy security technologies that don’t properly protect them from employees who now access work information across a mix of devices and cloud apps, and you have a security market that is booming,” McQuire adds. ...  CISOs, it appears, are trying to be present throughout the entire DX process. For instance, at an event late last year, Los Angeles CISO Timothy Lee said that CISOs that embrace digital transformation may help an organization adapt to a rapidly evolving global marketplace.


What CISOs Should Know About Quantum Computing

Quantum computing is quickly moving from the theoretical world to reality. Last week Google researchers revealed a new quantum computing processor that the company says may beat the best benchmark in computing power set by today's most advanced supercomputers. That's bad news for CISOs, because most experts agree that once quantum computing advances far enough and spreads widely enough, today's encryption measures are toast. The general consensus is that within about a decade, quantum computers will be able to break public-key encryption as it is designed today. Here are some key things to know and consider about this next generation of computing and its ultimate impact on security.
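The threat is usually split in two: Shor's algorithm would break today's RSA and elliptic-curve public-key schemes outright, while Grover's algorithm "only" gives a quadratic speed-up against brute-forcing symmetric ciphers, roughly halving a key's effective bit strength. The common rule of thumb:

```python
def grover_effective_bits(key_bits):
    """Grover's quadratic search speed-up roughly halves the effective
    security level of a symmetric key of key_bits bits."""
    return key_bits // 2

assert grover_effective_bits(128) == 64   # AES-128 drops to ~64-bit work factor
assert grover_effective_bits(256) == 128  # AES-256 keeps a comfortable margin
```

This is why the usual advice pairs "move to post-quantum public-key algorithms" with the much simpler "double your symmetric key sizes."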


Support Always-On Demands With Software-Defined Resiliency and DRaaS


Software-defined resiliency (SDR) is IBM’s approach to DRaaS. It helps ensure enterprise applications operate reliably and protect data even when disaster strikes. It’s the latest step in the journey to redefine data center operations in software. A software-defined approach makes disaster recovery more controllable and visible, enabling administrators to extend across hybrid cloud infrastructures. It also introduces perhaps the most valuable feature in an otherwise laborious process: orchestrated recovery. By automating and orchestrating the replication and recovery of not just the servers and virtual machines but also the applications and business services, disaster recovery becomes reliable and repeatable. SDR makes use of existing vendors’ data protection mechanisms like replication and backup, but also manages them. Instead of using different tools for each enterprise software product, it provides a single interface to control all replication and recovery processes.


A startup is pitching a mind-uploading service that is “100 percent fatal”


Brain uploading will be familiar to readers of Ray Kurzweil’s books or other futurist literature. You may already be convinced that immortality as a computer program is definitely going to be a thing. Or you may think transhumanism, the umbrella term for such ideas, is just high-tech religion preying on people’s fear of death. Either way, you should pay attention to Nectome. The company has won a large federal grant and is collaborating with Edward Boyden, a top neuroscientist at MIT, and its technique just claimed an $80,000 science prize for preserving a pig’s brain so well that every synapse inside it could be seen with an electron microscope. McIntyre, a computer scientist, and his cofounder Michael McCanna have been following the tech entrepreneur’s handbook with ghoulish alacrity. “The user experience will be identical to physician-assisted suicide,” he says. “Product-market fit is people believing that it works.” Nectome’s storage service is not yet for sale and may not be for several years. Also still lacking is evidence that memories can be found in dead tissue.



Serverless computing, containers see triple-digit quarterly growth among cloud users

There has also been huge growth in adoption of serverless computing among cloud users. In the fourth quarter of 2017, serverless adoption grew by 667 percent among the sites tracked, the survey's authors report. This is up from 321 percent just the quarter before. "Serverless continues to be attractive to organizations since it doesn't require management of the infrastructure," the report's authors observe. "As companies migrate increasingly to the cloud and continue to build cloud-native architectures, we think the pace of serverless adoption will also continue to grow." The study's authors also looked at cloud CPU consumption to draw conclusions about how people are deploying cloud power. The dominance of general-purpose workloads (employed 43 percent of the time) shows that most organizations start their cloud journey by moving development and test workloads, mostly provisioned as standard instances.


Avoiding security event information overload

The best SIEM vendor you can pick is one that understands that less is more. The Herjavec Group is one such company that recently caught my eye. Started by Robert Herjavec, one of the stars of ABC’s addictive Shark Tank television series, the Herjavec Group lives this philosophy. Here’s what Ira Goldstein, Herjavec Group’s senior vice president of global technical operations, said about their less-is-more philosophy, “[The data required to manage security for a modern enterprise infrastructure] has to be parsed, correlated, alerted, evaluated, analyzed, investigated, escalated, and remediated fast enough to protect integrity and operations. The only way to make sense of it all is to focus on fewer, more specific use cases that matter, as opposed to a high volume of low fidelity alerts.” “An effective security operation is driven by discipline, preventing use-case sprawl that causes information overload,” says Goldstein.


Reading very BIG text files using PowerShell

I recently ran into this very same problem, in which the bcp error message stated: An error occurred while processing the file “D:\MyBigFeedFile.txt” on data row 123798766555678. Given that the text file was far in excess of Notepad++ limits (or even the usability limits of easy human interaction), I decided not to waste my time trying to find an editor that could cope and simply looked for a way to pull out a batch of lines from within the file programmatically, to compare the good rows with the bad. I turned to PowerShell and came across the Get-Content cmdlet to read the file, which looked like it would give me at least part of what I wanted. At this point, there are quite a few ways to look at specific lines in the file, but in order to look at the line in question and its surrounding lines I could only come up with one clean solution: by passing the pipeline output into the Select-Object cmdlet, I could use its rich -Skip and -First functionality.
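For readers outside PowerShell, the same skip-then-take idea can be sketched in Python with a lazy iterator, so the multi-gigabyte file is never loaded whole (the path, target line, and window size below are illustrative):

```python
from itertools import islice

def window_around(path, line_number, context=5):
    """Yield (line_no, text) for lines surrounding a 1-based line number,
    reading lazily so the whole file never sits in memory."""
    start = max(line_number - context, 1)
    with open(path, "r", errors="replace") as fh:
        # islice skips the first start-1 lines, then takes the window.
        for i, line in enumerate(islice(fh, start - 1, line_number + context),
                                 start):
            yield i, line.rstrip("\n")

# e.g. inspect the rows around a hypothetical bad data row:
# for n, text in window_around("bigfeed.txt", 123798766555678, context=3):
#     print(n, text)
```

Like the `Get-Content | Select-Object -Skip ... -First ...` pipeline, the cost is proportional to how far into the file the target line sits, not to the file's total size held in memory.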


How to Interpret the SEC’s Latest Guidance on Data Breach Disclosure

Specifically, corporate officers, directors and “other corporate insiders” are prohibited from trading shares if they have knowledge of any unpublicized security incident within the company. While the overall intent of this latest statement is clear, the guidance is vague in key areas by design. For instance, the second section of the guidance emphasizes that companies must make "timely disclosure of any related material nonpublic information." It’s unclear what the SEC explicitly means by "timely disclosure," as the SEC doesn’t provide a specific time limit that companies must meet. This puts a lot of trust in corporate leaders to put speedy remediation and due diligence at the center of their security policy, which is a bit of a gamble given the track record of executive action during the fallout of the Equifax breach. The GDPR, on the other hand, is much more prescriptive, giving organizations 72 hours to report an incident related to the personal data of EU citizenry.



Quote for the day:


"Change is not a threat, it's an opportunity. Survival is not the goal, transformative success is." -- Seth Godin


Daily Tech Digest - March 13, 2018

Understanding The Strengths And Weaknesses Of Biometrics

Of all biometric methods, facial recognition is the latest to enter the market. While original iterations could be defeated using photos of the appropriate person, modern implementations map the structure and movement of the face to reduce the success of this kind of forgery. While the technology is new, if proven effective it could be a reasonable alternative to some of the other methods mentioned. However, with current attacks and false positives demonstrated against the Apple FaceID system, more advancement is likely required in facial recognition. It’s clear that there have been some significant advances made in biometric security. In terms of the level of security it provides, there is still some way to go before most methods are likely to receive widespread adoption. Another barrier to adoption is the level of public discomfort with keeping physical details on record as, thanks to fingerprints, biometrics are commonly associated with identifying criminals.


Businesses need to take cryptojacking seriously


Although some recently-discovered cryptojacking campaigns have compromised websites to hijack the computing resources of visitors to those sites temporarily through their browsers, corporate servers offer more computing power and are a much more attractive target. Although businesses need to be aware of both types of cryptojacking attack because website compromises could lead to brand damage and affect web-based services, cryptojacking attacks that target corporate servers arguably represent the greater risk. The main aim of cryptojacking is to hijack computing power to carry out the calculations required to generate cryptocurrencies, but that does not mean there is not a significant impact on the business. The most obvious effect is that businesses may experience a slowdown in responses from their servers and there may be some availability issues because of illicit cryptocurrency mining activity, causing costly downtime, especially to online businesses.


Cisco’s intent-based networks now available for the WAN

Cisco SD-WAN vAnalytics is built on technology the company got in the Viptela acquisition and is designed to provide visibility into WAN performance and capacity planning. Like its IBN solution for the data center and campus, SD-WAN vAnalytics allows network professionals to perform “what if” scenarios to try things and see what happens before the changes are committed. This is much more effective than the traditional model of hoping things work and then rebooting the router if things go awry. If the system notices a problem, it provides corrective actions and the steps taken to implement them. Over time, these actions will be executed automatically, but we’re still in the crawl phase of IBN and it’s unrealistic to expect customers to fully automate things. The term vAnalytics is actually a bit of a misnomer, as the offering is a suite combining vAnalytics, which does the baselining, trending, data mining, comparisons, and cause-and-effect analysis, with vManage, which provides the real-time and historical visibility, troubleshooting tools, capacity planning, and utilization data.


5 biggest healthcare security threats for 2018

Healthcare organizations tend to have a few attributes that make them attractive targets for attackers. A key reason is the number of different systems that are not patched regularly. “Some of them are embedded systems that, due to the way the manufacturer has created them, can’t be easily patched. If the healthcare IT department were to do so, it would cause significant problems with the way the vendor can support them,” says Perry Carpenter, chief evangelist and strategy officer at KnowBe4. The critical nature of what healthcare organizations do puts them on the radar of attackers. Health data is a valuable commodity in the cybercriminal world, and that makes it a target for theft. Because of what’s at stake—the well-being of patients—healthcare organizations are more likely to pay ransomware demands ... There is no reason to believe that ransomware attacks will tail off this year. “Until we harden our people and our systems sufficiently, [ransomware] will continue to prove successful and gain more momentum.”


Malware 'Cocktails' Raise Attack Risk

The "old favorites" piece is important. According to SonicWall CEO Bill Conner, "New malware is down, but the number of variants is up." And the number of variants appears to be growing. "Last year we were seeing about 500 new variants a day. In February, that had gone to 700 a day," he says. It's not like the variant writers were creative, though - many of the iterations are mash-ups of existing malware. "The variants have gone down in terms of exploit kits, but new malware cocktails are going up," says Conner. The growth of "malware cocktails" in part is due to the rise in ransomware-as-a-service operations around the world. That's bad news because ransomware-as-a-service allows less programming-skilled actors into the malware game, and some targets are twice-victimized. "About half of the [ransomware victims] did pay, but even among those who paid many weren't able to get their data back because the variants didn't contain all the keys," Conner says.


How Blockchain Is Helping Democratize Access to Credit


Enter blockchain, the technology Rodrigues believes can make this vision a reality. “Blockchain is changing both the technology and the power structure behind the credit industry,” Rodrigues said. “It’s shaking power structures that previously had to rely on banks, credit bureaus, and nation-states as middlemen.” Blockchain is increasingly being tested as a way to track that which was previously difficult to pin down, from securing virtual assets to giving refugees an immutable financial identity. Put simply, blockchain is a database of encrypted transactions stored across a network of computers. That network actively participates in the validation, upkeep, and accuracy of the database, and is paid for doing so in cryptocurrencies. Swapy Network will run on the Ethereum blockchain and issue its own cryptographic tokens, called Swapy Tokens, to be used to buy and sell various services across the company’s three products.


Severe flaws could turn your smart camera into someone else's surveillance tool

The camera, which has night vision and a motion sensor, can capture video, supports two-way communication, and has a built-in speaker. It works with a cloud-based service and can be controlled via smartphones, tablets, or computers. Kaspersky Lab identified multiple vulnerabilities in the affected camera’s firmware and cloud implementation. In fact, even the architecture of the cloud service itself was vulnerable. Regarding the dangerous vulnerability in the cloud service architecture, Kaspersky Lab’s researchers noted, “An intruder could gain access via the cloud to all cameras and control them. One of the main problems associated with the cloud architecture is that it is based on the XMPP protocol. Essentially, the entire Hanwha smart camera cloud is a Jabber server. It has so-called rooms, with cameras of one type in each room. An attacker could register an arbitrary account on the Jabber server and gain access to all rooms on that server.”


Microsoft Teams will integrate with Cortana, add transcription and translation features

According to Microsoft, Cortana voice integrations for Teams-enabled devices will launch later this year, allowing users to easily make a call, join a meeting or add people to meetings using natural, spoken language. What’s more, these voice capabilities will extend to IP phones and conference room devices, as well. This feature alone could be a big selling point for Microsoft Teams, but it’s one of several the company announced are in the works. Also coming in 2018 is cloud recording — another feature that takes advantage of advances in voice technology in recent years. Microsoft Teams will be able to record meetings with a click, then create an automatic transcription of what was said. Meeting attendees can choose to play back the meeting in full, or just a key part, using the transcription as reference. This feature will also be advanced in the future to include facial recognition, so meeting remarks can be properly attributed.


5 principles of monitoring microservices

As the building blocks of microservices, containers are black boxes that span the developer laptop to the cloud. But without real visibility into containers, it’s hard to perform basic functions like monitoring or troubleshooting a service. You need to know what’s running in the container, how the application and code are performing, and if they’re generating important custom metrics. And as your organization scales up, possibly running thousands of hosts with tens of thousands of containers, deployments can get expensive and become an orchestration nightmare. To get container monitoring right, you have a few choices: Ask developers to instrument their code directly, run sidecar containers, or leverage universal kernel-level instrumentation to see all application and container activity. Each approach has advantages and drawbacks, so you will need to review which one fulfills your most important objectives.
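A minimal sketch of the first option above, instrumenting application code directly with custom metrics (class and metric names here are hypothetical; a real service would normally use an established metrics client library rather than rolling its own):

```python
from collections import defaultdict
import threading

class Metrics:
    """Minimal thread-safe counter registry an application could expose
    to a metrics scraper. Purely illustrative, not a production design."""
    def __init__(self):
        self._lock = threading.Lock()
        self._counters = defaultdict(int)

    def inc(self, name, amount=1):
        # Counters must be updated under a lock: handlers run concurrently
        with self._lock:
            self._counters[name] += amount

    def snapshot(self):
        # Return a plain dict copy for the scraper to serialize
        with self._lock:
            return dict(self._counters)

metrics = Metrics()
# Simulated request handling: three successes, one error
for _ in range(3):
    metrics.inc("requests_total")
metrics.inc("errors_total")
```

The trade-off the article describes follows from exactly this pattern: direct instrumentation gives the richest custom metrics, but every service team has to maintain code like this, which is what sidecar and kernel-level approaches try to avoid.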


The future of storage: Pure Storage CEO Charlie Giancarlo shares his predictions

There are a bunch of things going on in data centers today that weren’t a factor even a couple of years ago. This includes artificial intelligence, machine learning, video processing and analytics. These only perform as well as the lowest common denominator in the underlying infrastructure. With advanced Intel CPU- and Nvidia GPU-based applications, the goal should be to keep the massively parallel processors busy so there are no idle cycles. Being able to keep the parallel processors fed means having an active data store that allows multiple groups to access the data at the same time. Magnetic disks aren’t fast enough to keep up, but flash is. Furthermore, legacy storage solutions that were designed for magnetic disk cannot deliver the speed that flash can provide without degrading flash performance.



Quote for the day:


"Being responsible sometimes means pissing people off." -- Colin Powell


Daily Tech Digest - March 12, 2018

Monetizing Services in Digital Time with AI and Machine Learning

Machine learning can help determine the right level of personalization for offers and deliver them at the perfect digital moment, increasing take-up rates for services. AI can identify problems faster with automated root cause analysis and anomaly detection, then trigger corrective actions to ensure retention and protect revenues. According to eMarketer, in 2018, around 1.87 billion individuals worldwide will use a mobile phone to watch digital video, an 11.9 percent increase compared to 2017. With real-time offerings like this on the line, every second counts when solving problems. Machine learning even brings opportunities to monetize operator data towards third parties, with offerings for verticals like retail and transportation. Combined with demographics and social media, operators can sell reports that show anonymized movements of crowds for tracking store performance against competitors, choosing the next location for a shop, identifying the most traveled routes for urban planning, and deciding how to lay out the next metro line.


Ransomware for robots is the next big security nightmare

Taking what was learned in previous studies of the security vulnerabilities of robots, researchers were able to inject and run code in Pepper and NAO robots and take complete control of the systems, giving them the option to shut the robot down or modify its actions. The researchers said it was possible for an attacker with access to the Wi-Fi network the robot is running on to inject malicious code into the machine. "The attack can come from a computer or other device that is connected to the internet, so a computer gets hacked, and from there, the robot can be hacked since it's in the same network as the hacked computer," said Cerrudo. Unlike computers, robots don't yet store vast amounts of valuable information that the user might be willing to pay a ransom to retrieve. But, as companies often don't have backups to restore systems from, if a robot becomes infected with ransomware, it's almost impossible for the user to restore it to normal by themselves.


How Postgresql Just Might Replace Your Oracle Database

PostgreSQL replacing Oracle database? Salesforce might make it happen
If, in fact, Salesforce is developing a homegrown replacement for Oracle’s database, it might well be building it on PostgreSQL, the database Salesforce has actively flirted with since 2012. In 2013, Salesforce hired Tom Lane, a prominent PostgreSQL developer. In that same year, it hired several more, and even today PostgreSQL experience is called out for in dozens of jobs advertised on the company’s career page. Just as Facebook, Google, and other web giants have shaped MySQL to meet their aggressive demands for scale, so too might Salesforce be able to mold PostgreSQL to wean it from its dependence on Oracle. ... Oracle would claim that it isn’t worried, but the DB-Engines database popularity ranking, which measures database popularity across a range of factors, should give it pause. For years, PostgreSQL has been on the rise, even as Oracle and MySQL have faded. PostgreSQL is now in a strong fourth place, with MongoDB right behind it.


The 10 most common cybersecurity scams uncovered

In the beginning there was the internet, and shortly after that came the internet scammers. Online scams include everything from the now-legendary Nigerian prince meme to the less-well-known but infinitely more devious fake shopping websites. If you were curious about the origins of these deceitful hoaxes, we’ve got you covered. On the other hand, should you be worried about the repercussions of falling victim to one of these fraudulent schemes, we’ll also touch on that.  Online scams are typically malware disguised as rewards or charitable gestures. After all, what is the Nigerian prince scam other than an attempt to get you to care about getting someone else out of a rut, and providing you with a huge payout for doing so? All the examples below are designed to prick our consciences, or play on our greed or vanity in one way or another. Some of these cyber-scams are actually pretty ingenious, but ultimately malicious – others are just plain malicious.


Could Singapore hold the secret to preparing workers for an uncertain future?


Singapore offers a simple yet elegant solution: “second-skilling.” Tay realized that in today’s economy, second-skilling — developing your skills in a sector other than the one you work in — is necessary for career resiliency; it gives you options and flexibility. That second skill can either complement the skills you’re already using in your current job, or offer a completely alternative path. But who pays for second-skill training? The answer in Singapore is surprising. Thanks in part to Tay’s lobbying, every Singaporean 25 and older gets S$500 (about US$350) for skills training of their choice from the government through the SkillsFuture program. The money’s in a virtual credit account, and the government plans to provide periodic top-ups. It can pay for training in anything a person might want to learn, not just what their company needs them to know. “Many programs are already funded 80 to 90 percent,” says Tay. “So the five hundred dollars can be used to pay for the unfunded portions, which, previously, we had to fork out from our own pockets.”


5 emerging tech trends at SXSW that will shape 2018

Sadly, generalized intelligence (think HAL 9000) is still a dream in engineers’ minds. However, that doesn’t mean that AI isn’t still a hot topic for 2018. With computers getting faster and GPUs being re-purposed, we’re seeing an explosion of innovation, from machine-learning models that validate brand creative to those that create unique art and music. Companies like IBM and Getty Images are asking how they can apply large-scale AI to the creative process, and what that means for their business. Meanwhile, L’Oréal is applying machine learning to improve product recommendations through its smart hairbrush. Despite all that, some of the more compelling topics this year are around the ethics and morality of AI. As AI is used in more serious applications (like self-driving cars and medicine), we rely on machines to make life-or-death decisions. Who is responsible for these decisions, and what rules do they follow? Still a Wild West to be figured out.


Third-party security vetting: Do it before you sign a contract

partnership collaboration puzzle pieces unity
Archer contends relationships are key. “Security has to be able to say, ‘We're not going to do business with that vendor,’” he says. To enforce a policy like that, the c-suite must take security seriously. If there’s not a CSO to represent you, talk to the CEO yourself. “If you can't get through the front door, maybe you get through the back door,” he recommends. Either way, he adds, “Establish those relationships.” Then grow relationships with the actual prospective vendors. At Fannie Mae, this starts with a security best practices questionnaire included in all RFIs. Archer’s team divided vendors into two groups — critical and regular — by the type of data they’ll access. For prospective critical vendors, there are around 250 questions. Regular vendors get shorter, industry-specific versions of the questionnaire. Most questions for both groups are primarily yes or no: “Are you SOC 1 and SOC 2 compliant?”, for example. The RFI is also an opportunity for prospective vendors to get to know you. In addition to adding questions, Fannie Mae outlines security expectations.


How Retail Shifted from Business Intelligence to Data Science

Being competitive in retail often means raising or lowering prices depending on what rival companies are also offering. Walmart is one retailer that spends tons of money on monitoring not just its own transactions, but also the price changes of its competitors while taking into consideration its own stock levels. With the aid of data science, Walmart is able to implement real-time changes to its pricing and never lose its edge over other retailers. In the past, to price products this fiercely took a lot more time and couldn’t be monitored so conveniently or, most importantly, predicted. This also has a second advantage, which is the possibility of moving away from the traditional end of sales technique of significantly lowering prices once demand has almost vanished at the end of a season, to instead drop prices more gradually, which has been shown to be more effective. Business intelligence is still an incredibly useful thing to have, with 62% of retailers reporting that using data is creating a competitive advantage, according to IBM.
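The gradual-markdown idea can be sketched as a toy pricing rule (purely illustrative, not Walmart's actual model; the parameter names and numbers are invented):

```python
def gradual_markdown(base_price, weeks_left, total_weeks, floor=0.5):
    """Step the price down linearly over the season instead of one
    end-of-season cut. `floor` is the fraction of the base price
    charged in the final week. Illustrative only."""
    progress = 1 - weeks_left / total_weeks   # 0 at season start, toward 1 at end
    return round(base_price * (1 - (1 - floor) * progress), 2)

# An $80 item over an eight-week season, repriced weekly
season = [gradual_markdown(80.0, w, 8) for w in range(8, 0, -1)]
```

Each week the price eases down a little, so demand is harvested throughout the season rather than waiting for a single steep clearance discount at the end.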


Aryaka adds new security layers to global private network


Each security layer, which ranges from distributed denial-of-service protection to network edge and cloud security options, offers specific features, according to Gary Sevounts, chief marketing officer at Aryaka, based in San Mateo, Calif. Additional layers include a virtual firewall for Amazon Web Services and Microsoft Azure instances and built-in compartmentalization and contained environment controls. Finally, an early warning portal identifies behavior anomalies and potential risks. By offering multiple security layers from different security vendors, Sevounts said Aryaka's customers can receive extra protection in case there is a problem with one of the layers. "If you have all your layers from a single provider and something happens to the core technology, then all the layers are ineffective, and the company is susceptible to attacks," he said.


A new mindset drives a new way forward

Why should IT have to struggle deploying cobbled-together kits, and trying to overcome difficult lighting or acoustic challenges? And why should participants have to spend 10 minutes trying to launch a meeting? Or suffer through inaudible sound, shouting into phones, camera angles that are too tight or too wide for the room, etc.? So yes, more scenarios are being supported, but the result is greater complexity to achieve a mediocre (or worse) experience — and this frustrates people, hampers productivity, and slows adoption. To make matters worse, those solutions can come at an unnecessarily high cost to buy and manage. In order to make real strides in collaboration today and into the future, you have to simultaneously address all three competing pressures. Solutions have to be able to support more scenarios — including scenarios we haven’t yet imagined — in simpler ways, while delivering a better experience.



Quote for the day:


"The past has no power over the present moment." -- Eckhart Tolle


Daily Tech Digest - March 11, 2018

KDD Process
For our purposes, however, we will separate the data preparation from the modeling as its own regimen. As Python is the ecosystem, much of what we will cover will be Pandas-related. For the uninitiated, Pandas is a data manipulation and analysis library, one of the cornerstones of the Python scientific programming stack, and a great fit for many of the tasks associated with data preparation. Data preparation can be seen in the CRISP-DM model shown above (though it can reasonably be argued that "data understanding" falls within our definition as well). We can also equate our data preparation with the framework of the KDD Process -- specifically its first three major steps: selection, preprocessing, and transformation. We can break these down into finer granularity, but at a macro level, these steps of the KDD Process encompass what data wrangling is.
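A minimal Pandas sketch of those three KDD steps, run on a made-up toy dataset (column names and values are invented for illustration):

```python
import pandas as pd

# A tiny raw dataset with missing values and an irrelevant column
raw = pd.DataFrame({
    "age":    [25, None, 47, 31],
    "income": [50000, 62000, None, 48000],
    "city":   ["NY", "SF", "NY", "LA"],
})

# 1. Selection: keep only the columns relevant to the task at hand
selected = raw[["age", "income"]]

# 2. Preprocessing: handle missing values (here, fill with the column mean)
cleaned = selected.fillna(selected.mean())

# 3. Transformation: rescale each column to the [0, 1] range
transformed = (cleaned - cleaned.min()) / (cleaned.max() - cleaned.min())
```

Real pipelines involve far more than this, of course, but even this toy version shows how naturally the selection, preprocessing, and transformation steps map onto a few lines of Pandas.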


The Difference Between Entrepreneur and Executive

Entrepreneurs must understand that their business(es) should run without them. Systems and structure must be executed by management, and each member of an enterprise should know his/her role. When venture capitalists and bankers invest in a new start-up, it is the first thing they look for – business structure. The passionate nature of the founder may get them to the table, but it is true day-to-day business management they look for. Look at Ray Kroc, founder of McDonald's. ... Executives, on the other hand, should take a page from the entrepreneur by looking beyond the numbers and going with their gut. When Mazda introduced the Miata, all the marketing data out there said nothing about a little convertible sports car. It was the last thing on the American consumers’ mind. But Mazda did the unthinkable – they put passion back into driving with a fun and affordable roadster that brought back the days of British MG Midgets and weekends in the country.


Great Data Scientists Don’t Just Think Outside the Box, They Redefine the Box


The data scientists didn’t wait until someone developed a better Machine Learning algorithm. Instead, they looked at the wide variety of Machine Learning and Deep Learning tools and algorithms available to them, and applied them to a different, but related use case. If we can predict the health of a device and the potential problems that could occur with that device, then we can also help customers prevent those problems, significantly enhancing their support experience and positively impacting their environment. ... One of a data scientist’s most important characteristics is that they refuse to take “it can’t be done” as an answer. They are willing to try different variables and metrics, and different type of advanced analytic algorithms, to see if there is another way to predict performance. This graphic measures the activity between different IT systems. Just like with data science, this image shows there’s no lack of variables to consider when building your Machine Learning and Deep Learning models!


This NYC Startup Supercharges Advisors With AI and NLP

By focusing on data that is often overlooked or misclassified, such as tickers, instrument names, strategies, investment goals and many other financial entity types, we’re able to provide “4K NLP for financial data” as an input into our engine. Its robust platform includes three new configurable APIs: the first, Personalized Insights, curates personalized stories of “what to say”; the second, Client Prioritization, helps answer the question of “who to talk to” by providing a prioritized list of clients to call, with the reasons for outreach. The company’s third API, Expert Conversation, is a natural language interface with data aggregation, curation and linking capabilities. It is focused on question answering for market, ETF, mutual fund and equities research. It’s a smarter, faster way to get answers to questions that are buried in research reports or sit behind many screens.


Hackers create 'ghost' traffic jam to confound smart traffic systems

The attack manipulates the mechanism I-SIG uses to manage queues, by spoofing the attack vehicle's predicted arrival time and the requested phase of the traffic lights (I-SIG lets vehicles request a green light for their arrival, and decides whether or not to grant it based on the queue it's created of all the incoming requests). “The attacker can change the speed and location in its BSM [Basic Safety Message – El Reg] message to set the arrival time and the requested phase of her choice and thus increase the corresponding arrival table element by one”, the paper said. The attack, they claimed, has a 94 per cent success rate, and on average, would increase delays by 38.2 per cent. The best defence against these and other attacks, the researchers say, is a combination of more robust algorithms, better performance in the roadside units that give the system its realtime feedback, and better validation of vehicle-originated messages.
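The arrival-table bookkeeping described in the paper can be illustrated with a toy model (this is not the actual I-SIG code, just the tallying behavior as the quoted passage describes it; times and phase numbers are invented):

```python
from collections import Counter

# Each incoming BSM reports a predicted arrival time and a requested
# signal phase; the controller tallies them in an arrival table.
arrival_table = Counter()

def receive_bsm(arrival_time, requested_phase):
    """Increment the arrival table element for this (time, phase) pair."""
    arrival_table[(arrival_time, requested_phase)] += 1

# Honest traffic: two vehicles arriving at t=10 requesting phase 2
receive_bsm(10, 2)
receive_bsm(10, 2)

# Spoofed BSM: the attacker freely picks the reported speed/location,
# so she can inflate any (time, phase) element of her choice by one
receive_bsm(30, 4)
```

Because the controller trusts the self-reported speed and location in each BSM, a single spoofed message is enough to bump an arbitrary table element, which is exactly the lever the researchers exploited to distort the light-scheduling decisions.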


This crazy invention by an Indian Banker will leave you speechless

The most interesting fact about “Bankerpedia” is that the portal was coded in six days on a mobile phone during his daily commute from Elgin Mills Civil Lines, Kanpur to Ghatampur (he lived in Kanpur and was posted in Ghatampur, roughly 100 km of daily travel to and fro). At the time of coding, he didn’t know that one day it would be used by thousands of learners. His idea of finding a way to ease the effort of collecting notes, sorting out what to study and what to leave, and sharing all those notes with colleagues preparing for the same exam turned out to be a great one. ... He crowd-sourced it with several bankers and, to update and maintain the study material, created artificial intelligence bots to take care of these things. Apart from this, he has kept the security of users and their data a top priority by using SSL. The use of SSL ensures that all data transmitted between the web server and browser remains encrypted.


How to replace and upgrade a MacBook Pro hard disk


Upon receiving the SSD, I moved the screws from the side of the old disk to the same locations on the new drive, and then installed the drive in the MacBook Pro. I also reconnected the battery to the motherboard and replaced the hard drive retention piece, as well as the bottom cover and all screws. I connected the thumb drive to the MacBook Pro, booted up the laptop while pressing the Option key, and then chose to boot from the thumb drive that read Install OS X El Capitan. I selected the SSD as the disk to which I wanted to install the operating system, and then I marveled at how easy the process was. Next, the installation process failed. I was greeted with a nonsensical error that read "This copy of the Install OS X El Capitan application can't be verified. It may have been corrupted or tampered with during downloading." The file was fine; it wasn't corrupt, nor had it been tampered with.


A multi-sided approach to financing the smart city

Building successful Smart City initiatives requires collaboration between engaged individuals, city governments and a growing range of private commercial organisations. Yet there is a practical difficulty in all of this – finding a way to pay for Smart Cities - not an easy subject at a time when there is enormous pressure on public finances in countries all over the world. Most city and national governments are unable to fund new initiatives of this kind from taxpayer income alone, and that leads them to seek partnerships and alliances with commercial bodies and technology specialists to design and deliver new services and new options. This is where problems start to arise. Technology companies are interested in partnering with city governments for their own reasons. They want to test their ideas, gain proofs of concept, access useful development data and other research requirements for building their own businesses.


How to build a data-first culture for a digital transformation

There’s not just one metric you need to pay attention to, but it’s not hundreds either. Organizations can get overly excited about data, then all of a sudden, you’re overwhelmed. So we decided to focus on data that helped us understand customer behavior and eliminate the unknowns. Look-alikes (an algorithmically assembled group of people who resemble, in some way, an existing group) based on existing segments of customers were most valuable, and over time we layered additional elements, such as demographics, behavior, age, current carrier, and location. We then overlay those insights with data from digital properties: website, mobile app, stores, and call centers. And we started to understand better our customers’ journeys across the web, as they called us, tweeted about us, etc. We’re now starting to teach our “bots” to learn more about contextually relevant interactions with the customer. For example, if a customer visits one of our stores, then comes online and looks at various sets of pages or has a pending order, the bot learns how to respond to that specific customer profile. 


What Is The Difference Between Artificial Intelligence (AI) And Machine Learning?

As it turned out, one of the very best application areas for machine learning for many years was computer vision, though it still required a great deal of hand-coding to get the job done. People would go in and write hand-coded classifiers like edge detection filters so the program could identify where an object started and stopped; shape detection to determine if it had eight sides; a classifier to recognize the letters "S-T-O-P." From all of those hand-coded classifiers they would develop algorithms to make sense of the image and "learn" to determine whether it was a stop sign. ... Back at that summer of '56 conference, the dream of those AI pioneers was to construct complex machines, enabled by emerging computers, that possessed the same characteristics as human intelligence. This is the concept we think of as "General AI": fabulous machines that have all of our senses (maybe even more), all of our reason, and think just as we do. You've seen these machines endlessly in movies, as friend (C-3PO) and foe (The Terminator).
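A tiny sketch of the kind of hand-coded edge-detection filter described above (toy image and kernel; real vision pipelines used far more elaborate filters and many stages):

```python
import numpy as np

# A dark "object" on a light background: the kind of input a hand-coded
# classifier had to make sense of before learned features took over.
image = np.ones((6, 6))
image[2:4, 2:4] = 0.0                  # the object occupies a 2x2 block

kernel = np.array([-1.0, 0.0, 1.0])    # simple 1-D horizontal gradient

# Convolve each row with the kernel; the response is large exactly
# where the object starts and stops, and zero on flat regions.
edges = np.abs(
    np.stack([np.convolve(row, kernel, mode="valid") for row in image])
)
```

Every rule here is written by hand: the kernel, the threshold you would apply next, the shapes you would then test for. The shift the article describes is precisely that machine learning replaced this hand-written feature engineering with features learned from data.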



Quote for the day:


"Be determined to handle any challenge in a way that will make you grow." -- Les Brown