Daily Tech Digest - June 05, 2018

10 Open Source Security Tools You Should Know

The people, products, technologies, and processes that keep businesses secure all come with a cost — sometimes quite hefty. That is just one of the reasons why so many security professionals spend at least some of their time working with open source security software. Indeed, whether for learning, experimenting, dealing with new or unique situations, or deploying on a production basis, security professionals have long looked at open source software as a valuable part of their toolkits. However, as we all are aware, open source software is not necessarily free software; globally, open source software is a huge business. With companies of various sizes and types offering open source packages and bundles with support and customization, the argument for or against open source software often comes down to its capabilities and quality. For the tools in this slide show, software quality has been demonstrated by thousands of users who have downloaded and deployed them. The list is broken down, broadly, into categories of visibility, testing, forensics, and compliance. If you don't see your most valuable tool on the list, please add it in the comments.



The growing ties between networking roles and automation

Automation was expected to steal jobs and replace human intelligence. But as network automation use cases have matured, Kerravala said, employees and organizations increasingly see how automating menial network tasks can benefit productivity. To automate, however, network professionals need programming skills to determine the desired network output. They need to be able to tell the network what they want it to do. All of this brings me to an obvious term that's integral to automation and network programming: program, which means to input data into a machine to cause it to do a certain thing. Another definition says to program is "to provide a series of instructions." If someone wants to give effective instructions, a person must understand the purpose of the instructions being relayed. A person needs the foundation -- or the why of it all -- to get to the actual how. Regarding network automation, the why is to ultimately achieve network readiness for what the network needs to handle, whether that's new applications or more traffic, Cisco's Leary said.


5 ways location data is making the world a better place

In the insurance sector, detailed data creates better predictions and more accurate customer quotes. Yet potential purchasers often don’t know the information needed for rigorous risk assessments, such as the distance of their house from water. Furthermore, lengthy and burdensome questionnaires can lose firms business; analysis from HubSpot found that reducing form fields improves customer conversions. PCA Predict uses its Location Intelligence platform to compile free data from the Land Registry and Ordnance Survey, including LiDAR height maps, as well as commercial address data, to determine accurate information on a potential customer’s property, such as distance from a river network, height, footprint, whether the property is listed and its risk of wind damage. The model is also being developed to determine a building’s age using machine learning and road layout. “We take disparate datasets and apply different types of analysis to extract easy-to-use attributes for insurers,” says Dr Ian Hopkinson, senior data scientist at GBG, the parent company of PCA.


Adoption of Augmented Analytics Tools Is Increasing Among Indian Organizations

Indian organizations are increasingly moving from traditional enterprise reporting to augmented analytics tools that accelerate data preparation and data cleansing, said Gartner, Inc. This change is set to positively impact the analytics and business intelligence (BI) software market in India in 2018. Gartner forecasts that analytics and BI software market revenue in India will reach US$304 million in 2018, an 18.1 percent increase year over year. ... "Indian organizations are shifting from traditional, tactical and tool-centric data and analytics projects to strategic, modern and architecture-centric data and analytics programs," said Ehtisham Zaidi, principal research analyst at Gartner. "The 'fast followers' are even looking to make heavy investments in advanced analytics solutions driven by artificial intelligence and machine learning, to reduce the time to market and improve the accuracy of analytics offerings."


Apple’s Core ML 2 vs. Google’s ML Kit: What’s the difference?

A major difference between ML Kit and Core ML is support for both on-device and cloud APIs. Unlike Core ML, which can’t natively deploy models that require internet access, ML Kit leverages the power of Google Cloud Platform’s machine learning technology for “enhanced” accuracy. Google’s on-device image labeling service, for example, features about 400 labels, while the cloud-based version has more than 10,000. ML Kit offers a set of easy-to-use APIs for basic use cases: text recognition, face detection, barcode scanning, image labeling, and landmark recognition. Google says that new APIs, including a smart reply API that supports in-app contextual messaging replies and an enhanced face detection API with high-density face contours, will arrive in late 2018. ML Kit doesn’t restrict developers to prebuilt machine learning models. Custom models trained with TensorFlow Lite, Google’s lightweight offline machine learning framework for mobile devices, can be deployed with ML Kit via the Firebase console, which serves them dynamically.


How to evaluate web authentication methods

Two attributes I hadn’t given a lot of thought to are “requiring explicit consent” and “resilient to leaks from other verifiers.” The former ensures that a user’s authentication is not initiated without them knowing about it, and the latter is about preventing related authentication secrets from being used to deduce the original authentication credential. The authors evaluate all the covered authentication solutions across all attributes, and they include a nice matrix chart so you can see how each compares to the others. It’s a genius table that should have been created a long time ago. The authors rate each authentication option as satisfying, partially satisfying, or not satisfying each attribute. The attributes aren’t ranked, but anyone could easily take the unweighted framework, add or delete attributes, and weight it with their own needed importance. For example, many authentication evaluators looking for real-world solutions will want to add cost (both initial and ongoing) and vendor product solutions. The authors’ candid conclusions include: “A clear result of our exercise is that no [authentication] scheme we examined is perfect – or even comes close to perfect scores.”
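To make the "add your own weights" suggestion concrete, here is a small Python sketch. The attribute names, weights, and per-scheme scores below are invented for illustration; they are not the paper's actual ratings.

```python
# Hypothetical re-creation of a rating matrix as numbers:
# 1.0 = satisfies, 0.5 = partially satisfies, 0.0 = does not satisfy.
weights = {
    "explicit_consent": 2.0,     # invented weight
    "resilient_to_leaks": 3.0,   # invented weight
    "deployment_cost": 1.0,      # an attribute the text suggests adding
}

schemes = {
    "passwords":    {"explicit_consent": 1.0, "resilient_to_leaks": 0.0, "deployment_cost": 1.0},
    "hardware_otp": {"explicit_consent": 1.0, "resilient_to_leaks": 1.0, "deployment_cost": 0.5},
}

def weighted_score(scores, weights):
    # Normalize by total weight so every score lands in [0, 1].
    total = sum(weights.values())
    return sum(scores[attr] * w for attr, w in weights.items()) / total

for name, scores in schemes.items():
    print(name, round(weighted_score(scores, weights), 2))
# passwords 0.5, hardware_otp 0.92
```

Changing the weights dict is all it takes to re-rank the schemes for a particular deployment, which is exactly the kind of customization the authors' unweighted framework invites.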


Advanced Architecture for ASP.NET Core Web API


Before we dig into the architecture of our ASP.NET Core Web API solution, I want to discuss what I believe is a single benefit which makes .NET Core developers' lives so much better; that is, Dependency Injection (DI). Now, I know you will say that we had DI in .NET Framework and ASP.NET solutions. I will agree, but the DI we used in the past would be from third-party commercial providers or maybe open source libraries. They did a good job, but for a good portion of .NET developers, there was a big learning curve, and all DI libraries had their unique way of handling things. Today with .NET Core, we have DI built right into the framework from the start. Moreover, it is quite simple to work with, and you get it out of the box. The reason we need to use DI in our API is that it allows us to have the best experience decoupling our architecture layers, and also allows us to mock the data layer, or have multiple data sources built for our API. To use the .NET Core DI framework, just make sure your project references the Microsoft.AspNetCore.All NuGet package (which contains a dependency on Microsoft.Extensions.DependencyInjection).
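Since the author's C# code isn't reproduced in this digest, here is a language-neutral sketch of the idea in Python: constructor injection lets a service depend on an abstract data layer, so a mock repository can be swapped in for tests. All class and method names here are hypothetical, not from the article.

```python
from typing import Protocol

class AlbumRepository(Protocol):
    # The abstraction the service layer depends on.
    def get_all(self) -> list[str]: ...

class SqlAlbumRepository:
    # Stand-in for a real data source.
    def get_all(self) -> list[str]:
        return ["fetched from SQL"]

class InMemoryAlbumRepository:
    # Mock data layer, handy for unit tests.
    def get_all(self) -> list[str]:
        return ["album A", "album B"]

class AlbumService:
    # Constructor injection: the service never creates its own
    # repository, so the data layer can be swapped without touching it.
    def __init__(self, repo: AlbumRepository) -> None:
        self._repo = repo

    def list_albums(self) -> list[str]:
        return self._repo.get_all()

service = AlbumService(InMemoryAlbumRepository())
print(service.list_albums())  # ['album A', 'album B']
```

In ASP.NET Core the container does the wiring for you at registration time; the decoupling benefit — test doubles and multiple data sources behind one interface — is the same.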


Intuitively Understanding Convolutions for Deep Learning


The advent of powerful and versatile deep learning frameworks in recent years has made implementing convolution layers in a deep learning model an extremely simple task, often achievable in a single line of code. However, understanding convolutions, especially for the first time, can often feel a bit unnerving, with terms like kernels, filters, channels and so on all stacked onto each other. Yet convolutions as a concept are fascinatingly powerful and highly extensible, and in this post, we’ll break down the mechanics of the convolution operation step by step, relate it to the standard fully connected network, and explore just how convolutions build up a strong visual hierarchy, making them powerful feature extractors for images. The 2D convolution is a fairly simple operation at heart: you start with a kernel, which is simply a small matrix of weights. This kernel “slides” over the 2D input data, performing an elementwise multiplication with the part of the input it is currently on, and then summing up the results into a single output pixel.
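The sliding-window description above fits in a few lines of code. This is a minimal pure-Python illustration (not from the original post), using the deep learning convention of no kernel flip, i.e. cross-correlation:

```python
def conv2d(image, kernel):
    # "Valid" 2D convolution: slide the kernel over the input,
    # multiply elementwise, and sum into a single output pixel.
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            row.append(acc)
        out.append(row)
    return out

# 5x5 linear ramp image; a Laplacian-style kernel responds to
# curvature, so a linear ramp should produce all zeros.
image = [[float(5 * r + c) for c in range(5)] for r in range(5)]
kernel = [[0, 1, 0],
          [1, -4, 1],
          [0, 1, 0]]
result = conv2d(image, kernel)
print(len(result), len(result[0]))  # 3 3
```

Note how a 3x3 kernel over a 5x5 input yields a 3x3 output: each output pixel corresponds to one position of the window, which is the spatial shrinkage that padding is later used to avoid.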


Windows Server 2019 embraces SDN

The new virtual network peering functionality in Windows Server 2019 allows enterprises to peer their own virtual networks in the same cloud region through the backbone network. This provides the ability for virtual networks to appear as a single network. Fundamentally, stretched networks have been around for years and have provided organizations the ability to put server, application and database nodes in different sites. However, the challenge has always been the IP addressing of the nodes in opposing sites. When there were only two static sites in a traditional wide area network, the IP scheme was relatively static: you knew the subnet and addressing of Site A and Site B. However, in the public cloud and multi-cloud world – where your target devices may actually shift between racks, cages, datacenters, regions or even hosting providers – having addresses that may change based on failover, maintenance, elasticity changes, or network changes creates a problem. Network administrators already spend, and will drastically increase, the amount of time devoted to addressing, readdressing and updating device tables to keep up with the dynamic movement of systems.


Managing a hybrid cloud computing environment


Ensuring the security of physical edge networking connections and the connectivity of all communication is equally essential. This requires redundant networking components that utilize built-in failover capabilities. Finally, careful selection of the power infrastructure is vital to supporting all elements of edge computing. The ability to maintain power at all times via the use of backup power and integration of the remote monitoring of the power infrastructure into the customer’s management system are paramount. You can do this by seeking UPSs, rackmount power distribution units (PDUs) and power management software with remote capabilities. Being able to remotely reboot UPSs or PDUs can be extremely helpful in edge applications. In addition, solutions like Eaton’s Intelligent Power Manager software can enhance your disaster avoidance plan by allowing you to set power management alerts, configurations and action policies. By creating action policies for remediation, Eaton enables you to automate server power capping, load shedding and/or virtual machine migration should problems occur.



Quote for the day:


"Don't be buffaloed by experts and elites. Experts often possess more data than judgement." -- Colin Powell


Daily Tech Digest - June 03, 2018

How financial institutions can start with artificial intelligence

AI is increasingly becoming the way for leading financial services to provide everything from customer service to investment advice, says PwC’s Mike Quindazzi. Yet, few banking industry CEOs are considering the impact of AI on future skills, despite the impact that AI is already having on trading desks and reshaping customer interactions. “Protecting the base and avoiding risks is clear and present in the minds of banking leaders,” says Quindazzi. “Many challenges persist due to bias, privacy, trust, lack of trained staff, and regulatory concerns. In the near term, ‘augmented intelligence’ solutions, in which machines assist humans, are quickly making their way into operation.” 'AI or die!' seems to be the rallying cry at every banking conference these days, according to Bradley Leimer, of Explorer Advisory and Capital. “But before going down the path of building and implementing solutions leveraging AI and similar tools, financial institutions must ask themselves where they’re falling short in regard to providing their customers true lifetime value around their finances,” Leimer adds.



Maintaining Malaysia’s digital transformation trajectory

Digital transformation - or DX, as IDC calls it - should be placed as the core strategy and organisations should accelerate the DX pace to thrive in the competitive digital ecosystem. IDC Malaysia's FutureScape 2018 predictions primarily focus on the four pillar technology areas: Cloud, Mobility, Social, and Big Data and Analytics, as well as six innovation accelerators: Augmented and Virtual Reality (AR/VR), Cognitive/AI Systems, Next-Gen Security, Internet of Things (IoT), 3D Printing and Robotics. Some of IDC's expectations are eye-opening: By 2021, at least 20 percent of Malaysia's GDP will be digitised - with growth in every industry driven by digitally-enhanced offerings, operations and relationships; by 2020, investors will use platform/ecosystem, data value, and customer engagement metrics as valuation factors for all enterprises. ... By getting the private sector to partner in funding #MYCYBERSALE 2017 with continued support by MDEC, PIKOM was able to reduce government funding for the project by 40 percent while increasing the Gross Merchandise Value (GMV), or sales generated through the online sales, by 55 percent.


Innovative companies think differently about people

Modern businesses are facing new problems that need fresh thinking. Hiring the same people as before isn’t going to cut it. The variety of perspectives in diverse teams delivers better products, services and customer experiences – and obviously that’s good for business. The ABC recently launched Employable Me, a show about a group of jobseekers aiming to prove that having a neurological condition shouldn’t make them unemployable. It goes to the heart of this need to explore new talent pools. The unemployment rate for people on the autism spectrum was above 30 per cent in 2015, more than three times the rate for people with disability and almost six times the general population. Yet people with these disorders are often highly intelligent. Some have great attention to detail or an intense commitment to delivering high-quality work. They tend to be lateral thinkers and have immeasurable value to offer.


What frustrates Data Scientists in Machine Learning projects?


There is an explosion of interest in data science today. One just needs to insert the tag-line ‘Powered-by-AI’, and anything sells. But that’s where the problems begin. Data science sales pitches often promise the moon. Then, clients raise the expectations a notch up and launch their moonshot projects. Ultimately, it’s left to the data scientists to take clients to the moon, or leave them marooned. An earlier article, ‘4 Ways to fail a Data scientist Job interview’, looked at the key blunders candidates commit in the pursuit of data science. Here, we wade into the fantasy world of expectations from data science projects, and find out the top misconceptions held by clients. We’ll talk about the 8 most common myths I’ve seen in machine learning projects, and why they annoy data scientists. If you’re getting into data science, or are already mainstream, these are potential grenades that might be hurled at you. Hence, it would be handy knowing how to handle them.


The blockchain explained for non-engineers

Blockchain buzz is inescapable. And while the technology has transformed some companies and minted fresh millionaires in a dazzlingly short period of time, blockchain is as confounding as it is powerful. If you're confused by the hype, you're not alone. The blockchain is a decentralized, vettable, and secure technology that has, in less than a decade, become a powerful driver of digital transformation poised to help create a new employment economy. Evangelists claim blockchain tech will disrupt industrial supply chains, streamline real estate transactions, and even redefine the media industry. "Think of blockchain as the next layer of the internet," said Tom Bollich, CTO of MadHive. "HTTP gave us websites ... now we have blockchain, which is like a new layer of computing." Employment data seems to validate blockchain's current hype cycle. Google search data indicates a cresting wave of interest in the tech, and according to Indeed.com, searches for blockchain-related jobs spiked nearly 1000 percent since 2015. Enterprise organizations like Capital One, Deloitte, ESPN, and eBay are hiring blockchain engineers, retraining project managers to facilitate integrations, and even searching for specialized attorneys.


Unusual Breach Report by Humana Shines Light on Fraud Prevention

In a statement provided to Information Security Media Group, a Humana spokeswoman says the company's initial analyses, and its continuous, ongoing monitoring activities, indicate that fewer than 200 members were impacted in the incident. "The abnormal activity was first identified as an anomaly in our interactive voice response reporting tools. It was noted that an abnormally high abandon rate was being observed from a small number of telephone exchanges," she says. "All evidence in this particular incident indicates that the abnormal activity was benign." Ryan Kriger, Vermont's assistant attorney general, tells ISMG that Humana reported to the state that 11 Vermont residents were affected by the recent incident. He adds that it's not clear if the incident reported by Humana, involving callers who might have been trying to confirm the personally identifiable information of other individuals, qualifies as a data breach.


Network security has become irrelevant: Zscaler CEO

Most of the thousands of security companies today sell on the fear of uncertainty. There is so much noise that it is very hard to figure out who to choose. I envisioned the digital transformation taking place in the enterprise, and how it would disrupt traditional network and security models. I asked myself simple questions before starting Zscaler. The world was changing, and employees were beginning to go mobile. More and more applications were becoming SaaS-enabled. I saw a lot of cloud based businesses such as Salesforce, and I figured that security could also be done in the cloud. That’s when I decided to create a security platform where companies can comfortably and securely access SaaS applications, without having to worry about buying, deploying and managing. The differentiating factor for us is that we are not looked at as a security product. We are an enabler of business because companies want agility in today’s environment. The idea is to enable businesses to do things better and in a secure manner. Our technology solution is designed to provide security across the cloud stack.


Why a Coffee Shop Will Probably Be Your Workspace Within 10 Years


A study by CTrip of 500 volunteers found that individuals who worked from home were 13.5 percent more efficient and 9 percent more engaged than their peers working in the office. They also took shorter breaks, fewer sick days and less time off, and attrition rates were 50 percent better. Job satisfaction was higher overall, too. Another study by TINYpulse had similar positive results. Subsequently, more and more companies--particularly those in the transportation, computer, information systems and mathematics industries--are giving workers the leeway to work outside the standard cubicle. These companies don't particularly care where workers work, so long as they finish the jobs they're assigned on time with the expected quality level. In fact, they're using flexible work options to attract new hires, particularly millennials. I should point out here that, in the CTrip study, many workers eventually went back to the office when given the opportunity. Workers want flexibility, but they also want to get away from being so isolated and to combat the accurate perception that they wouldn't be considered prominently for bonuses and promotion.


Parallel programming no longer needs to be an insurmountable obstacle

Parallel code gets its speed benefit from using multiple threads instead of the single one that sequential code uses. Deciding how many threads to create can be a tricky question, because more threads don't always result in faster code: if you use too many threads, the performance of your code might actually go down. There are a couple of rules that will tell you what number of threads to choose. This depends mostly on the kind of operation that you want to perform and the number of available cores. Computation-intensive operations should use a number of threads lower than or equal to the number of cores, while I/O-intensive operations like copying files have no use for the CPU and can therefore use a higher number of threads. The code doesn’t know which case is applicable unless you tell it what to do; otherwise, it will default to a number of threads equal to the number of cores.
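The rules above can be captured in a small helper. This is an illustrative Python sketch (the article names no language), and the 4x oversubscription factor for I/O-bound work is an assumption for demonstration, not a rule from the text:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def pick_worker_count(io_bound, cores=None):
    # Rule of thumb from the text: compute-bound work gains nothing
    # beyond one thread per core; I/O-bound work (file copies, network
    # calls) spends most of its time waiting, so oversubscribe.
    cores = cores or os.cpu_count() or 1
    return cores * 4 if io_bound else cores  # 4x is an assumed factor

def run_tasks(tasks, io_bound):
    # tasks is a list of zero-argument callables; map preserves order.
    with ThreadPoolExecutor(max_workers=pick_worker_count(io_bound)) as pool:
        return list(pool.map(lambda task: task(), tasks))

# Example: four trivial "I/O" tasks run on an oversubscribed pool.
results = run_tasks([lambda: 1, lambda: 2, lambda: 3, lambda: 4], io_bound=True)
print(results)  # [1, 2, 3, 4]
```

For real workloads, the right oversubscription factor depends on how long each task waits relative to how long it computes, so it is worth measuring rather than guessing.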


The Hybrid Cloud Habit You Need to Break

Plenty of small companies start off with a data center in the basement. A few years and a couple of satellite offices later, the company decides to move some applications onto a private cloud to accommodate the geography of its workforce. A few years after that, it moves other applications to a public cloud service to stay ahead of traffic surges, lower costs, and add agility. At each stage, the network administrator establishes security protocols for the new environment based on the new architecture. But many network administrators never go back and adjust the data center’s security in light of the new private cloud, and the protocols are seldom adjusted when the second cloud is added. There are lots of reasons for this. Budget plays a role. A planned cloud adoption might have a budget for security that only factors in the new environment. Or the administrator might believe that, having checked for hardware and policy compatibility between the new environments, the security policies are aligned, and there’s no need and no time to go back.



Quote for the day:


"Confidence comes not from always being right but from not fearing to be wrong." -- Peter T. McIntyre


Daily Tech Digest - June 02, 2018

AI application in CX
For a simple, isolated interaction, AI is able to deliver results by simply knowing that an email is an email and a campaign is a campaign. Our web analytics and CRM platforms take advantage of this inherent luxury. But in holistic, cross-channel journey analytics, the idea that touchpoints of a similar category will be the same across enterprises is an antiquated notion. Customer journeys are as unique to individual businesses as fingerprints. Every company has their own set of touchpoints and a distinct method for employing those engagements in their customer experience. For AI to deliver value, it must be given some context. By context, I mean more than simply designating a certain interaction as an “inbound call” and another as “order fulfillment.” AI must know the significance of these events in shaping a customer behavior. That requires an awareness of both the journey that these touchpoints helped to shape and the KPIs which were subsequently impacted by that customer behavior—whether related to revenue, profitability, customer lifetime value, customer satisfaction or other factors driving high-level business performance.



The real issue here isn't the level of spending--it's the underlying philosophies and organizational cultures driving (and determining) the tech spending levels. In a recent blog post, Chris Skinner wrote about the excuses smaller banks give for their resistance to technology ... "If you don’t think you can change a teeny-weeny bank, then what the hell are you doing there? Massive banks are changing and they’ve got 1000x the challenges you have. Most of the barriers to small financial firms seizing the digital opportunities are created by negative thinking. But then I have to say that most small financial firms I’ve met are ultimately constrained by the negative thinking of their CEO. This is because many small financial firms are led by a CEO who was anointed ages ago. They got the job, and they’ve been there for years. They’re not really a CEO to be honest, but just a caretaker for the next guy." I agree with the "massive banks are changing" point, but not the rest of the quote. I know (and work with) a lot of mid-size bank and credit union CEOs who have been in their role for a while.


New European Union Data Law GDPR Impacts Are Felt By Largest Companies


These regulations are impacting companies globally, not just European firms. Forbes reported in December 2017 that GDPR will affect US-based businesses as well – even those without clients or operations in the European Union. Indeed, as Oliver Smith reported earlier this month, GDPR has cost US-based companies nearly $7.8 billion in compliance to avoid the multi-million dollar fines and penalties. ... These numbers are astronomical, and for data-based entrepreneurial startups, prohibitive. As CNN’s Ivana Kottasová reports, “The cost of complying with the new law has already forced an online game producer, a small social network, and a mobile marketing firm to close key businesses or shut down entirely.” This regulation will greatly impact data-driven businesses in Europe and across the globe. The 28-state European Union is the world’s second-largest economy, an economy that companies with a digital presence can’t help but interact with. These are the companies that capture our interest as 30 Under 30 observers. As these startup founders grapple with the implications of the GDPR, many hesitate to move forward until they both understand and can comply with these regulations in a cost-effective way.


Continuous Development Will Change Organizations as Much as Agile Did

Today, leading companies are embracing a new business process methodology. Once again, it has started in the bowels of technology companies and startups. And, once again, business leaders would do well to pay close attention to the strategic implications. The methodology is Continuous Development, which, like agile, began as a software development methodology. Rather than improving software in one large batch, updates are made continuously, piece-by-piece, enabling software code to be delivered to customers as soon as it is completed and tested. Companies that can successfully implement Continuous Development throughout their organization will find dramatic strategic benefits ... Continuous Development is a growing trend in the software industry. And for good reason: it represents a more effective method for software development in order to achieve both external and internal objectives. Various estimates and surveys suggest that as many as 20% of software professionals are using some form of it. Business executives at companies large and small would be wise to embrace this new methodology and even push their organizations to adopt this more flexible, powerful technique to develop technology products.


Achieving Intelligent Automation – Leveraging IoT Data from Automated Systems

In the future, Brendan believes that we will see more city-level optimization of logistics and transport networks. Aside from the obvious energy-saving benefits of AI in such applications, Brendan says that there could be benefits which are harder to perceive at first glance. For example, HVAC systems in buildings typically have hundreds of different sensors gathering data. Sensors can record the air flow and air temperature data from vents connected to the outside environment. Brendan goes on to say that this is but a step towards more complete autonomy through AI. He adds that today AI can identify anomalies; two years from now, AI will likely be able to identify whether an anomaly is a critical problem or not (again from historical evidence). Analogous to how autonomous vehicles are now removing humans from the loop, he sees AI platforms aimed at leveraging IoT data develop similar capabilities in the future. Looking five years ahead, Brendan has the following predictions about how AI applications for automated systems might evolve.


Banking Playing Catch Up in Technology, Conceding Battle for Payments


As the banking industry continues to move more transactions to digital channels and adjusts the technology used in back-office operations, costs are being reduced, productivity is increasing and responses to risk and compliance needs are improving. As a result, and for the first time in its five-year history, the annual Economist Intelligence Unit survey on the future of retail banking, conducted for Temenos, shows that global bank executives are now more concerned with technology-driven trends than they are by regulation. About 58% of the 400 senior banking executives surveyed across the globe said “changing customer behavior and demands” will have the biggest impact on retail banks in the years to 2020. In addition, “technology and digital” (48%) are now bigger trends than “regulatory fines and recompense orders” (43%). This trend does not hold in North America, where regulatory fines and penalties are still the primary concern for large banks (56%), compared with just 34% who said the same about new technologies such as artificial intelligence and blockchain (14% lower than global results).


How to avoid the coming cloud complexity crisis

Create a complexity management plan. This means taking a few steps back and understanding your own issues before you start throwing processes, technology, and a lot of cash at the problems. In this plan, you need to define your approach to dealing with traditional and cloud-driven complexity, how systems will be tracked, how you’ll minimize complexity going forward, and the use of technology to assist you. Select the tools needed to manage complexity. This is a Pandora’s box, because everyone has an idea of what tools will be helpful. In my work, I end up in a lot of emotional discussions around something that should be very logical. You need to pick tools that provide the following capabilities: configuration management, devops automation, hybrid monitoring and management tools, and cloud-specific tools such as cloud services brokers (CSBs) or cloud management platforms (CMPs). Set up processes. This means taking the time to figure out the core processes for tracking cloud and traditional resources, the services bound to those resources, and the data that exists around those resources. How do you add and/or remove resources? Who does it? And what tools do you use?


How Capital One sees digital identity as a business opportunity

“It fits with Capital One’s strategy, not just from a digital identity services perspective but with the broader platform business model that Capital One has used to expand its set of services beyond just a narrow set of financial services,” Shevlin said. “Capital One is positioning itself to become the Amazon of banking more than any other big bank is.” Nash was the director of identity services at Google as well as the senior director of consumer identity at PayPal. He announced in a blog post that Confyrm, which he founded in 2013, would be sold to Capital One. “We were eager to scale our efforts to help restore trust in digital identities, and we were fortunate to find a partner in Capital One, who shared our vision and commitment to improving consumer identity protection,” Nash wrote in the post. In response to inquiries from American Banker, Capital One provided a link to Nash's blog post. The price of the Confyrm deal was not disclosed. Capital One is not the only bank working with APIs to help protect customers online.


Demystifying black box AI


In Hong Kong, the finance and insurance industries are probably the most likely to be affected by the black box AI problem, according to Chun. These industries are increasingly using machine learning in fraud detection, investment advice, portfolio management, algorithmic trading, and loan or insurance underwriting. Bias can result from machine learning or AI systems: machine learning learns from data, so it is going to replicate any biases in the data set. “If the data models themselves contain biases then the results from AI machine learning will potentially also be biased,” he said. “The black box AI phenomenon is particularly problematic for consumer facing applications,” Chun added. “For example, if a loan or insurance policy got rejected because of AI recommendations, the consumer would want to know why.” In these situations, humans have to be involved in reviewing the AI algorithm or offering explanations to the consumer. Echoing the same sentiment, Samson Tai, IBM Hong Kong’s distinguished engineer and CTO, said: “Biases arise mainly due to problems in data processes rather than training algorithms. It’s important to be aware of the issues of biases in data sets.”
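A simple screening of the training data can surface this kind of bias before a model ever sees it. The sketch below uses hypothetical loan records and the common "four-fifths" screening heuristic; the data, group labels and threshold are invented for illustration:

```python
# Sketch: detecting bias in a training set before it reaches a model.
# The records and threshold below are hypothetical illustrations.

def approval_rate(records, group):
    subset = [r for r in records if r["group"] == group]
    return sum(r["approved"] for r in subset) / len(subset)

def disparate_impact(records, group_a, group_b):
    """Ratio of approval rates; values far below 1.0 suggest bias."""
    return approval_rate(records, group_a) / approval_rate(records, group_b)

loans = [
    {"group": "A", "approved": 1}, {"group": "A", "approved": 1},
    {"group": "A", "approved": 1}, {"group": "A", "approved": 0},
    {"group": "B", "approved": 1}, {"group": "B", "approved": 0},
    {"group": "B", "approved": 0}, {"group": "B", "approved": 0},
]

ratio = disparate_impact(loans, "B", "A")
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.25 / 0.75 = 0.33
if ratio < 0.8:  # the "four-fifths rule" often used as a screening heuristic
    print("Warning: training data may encode bias against group B")
```

A check like this does not explain a black-box model, but it addresses the data-process problem Tai describes: biased inputs found before training, not after rejection letters go out.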


Don't Neglect Physical Security of 'Workstations'

A May 30 cybersecurity alert issued by the Department of Health and Human Services' Office for Civil Rights urges HIPAA covered entities and BAs to pay closer attention to providing good physical security for "workstations," which include a wide variety of devices. In its monthly newsletter alert for May, OCR notes that while the HIPAA Security Rule specifically references "workstations," the term is defined in the HIPAA rule as "a computing device, for example a laptop or desktop computer, or any other device that performs similar functions - and electronic media - stored in its immediate environment. Portable electronic devices ... included in this definition ... could include tablets, smart phones and similar portable electronic devices." Physical security is an important component of the HIPAA Security Rule that is often overlooked, OCR writes. "What constitutes appropriate physical security controls will depend on each organization and its risk analysis and risk management process."



Quote for the day:


"You're making the biggest mistake of all when you think your title means you can't be mistaken." -- @LeadToday


Daily Tech Digest - May 24, 2018

Fintech is disrupting big banks, but here’s what it still needs to learn from them

As a general rule, fintech’s priorities lean more toward customer convenience than risk management. The sector’s value proposition is based largely on its ability to say yes where traditional finance would say no, allowing more people to take out loans, open credit cards, and open checking accounts than ever before. Just like tech startups that are funded by venture capital, fintechs also place a premium on growth, which makes turning down a potential customer due to credit risk (or any other factor) painful, but essential for sustainable growth. Though it’s definitely possible to grow while managing risk intelligently, it’s also true that pressure to match the “hockey-stick” growth curves of pure tech startups can lead fintechs down a dangerous path. Startups should avoid the example of Renaud Laplanche, former CEO of peer-to-peer lender Lending Club, who was forced to resign in 2016 after selling loans to an investor that violated that investor’s business practices, among other accusations of malfeasance. It’s not just financial risk that they may manage badly: the sexual harassment scandal that recently rocked fintech unicorn SoFi shows that other types of risky behavior can impact bottom lines, too.


How to mitigate the complexity of microservice communication


"The biggest single challenge arises from the fact that, with microservices, the elements of business logic are connected by some sort of communication mechanism … rather than direct links within program code," said Randy Heffner, principal analyst at Forrester. This means there are more opportunities for errors in network and container configurations, errors in request or response, network blips, and errors in security configurations, configs and more. In other words, there are simply many more places where things can go wrong with microservice communication. It's also much more challenging to debug application logic that flows across multiple microservices. In a monolithic app, a developer can embed multiple debug and trace statements in code that will all automatically go into one log. With microservices, developers need to collect logs and other debugging outputs. They then need to correlate those logs and outputs into a single stream in order to debug a collection of microservices that interact. This is even harder to do when multiple streams of testing are active in an integration testing environment.


27 Incredible Examples Of AI And Machine Learning In Practice

With approximately 3.6 petabytes of data (and growing) about individuals around the world, credit reference agency Experian gets its extraordinary amount of data from marketing databases, transactional records and public information records. It is actively embedding machine learning into its products to allow for quicker and more effective decision-making. Over time, the machines can learn to distinguish the data points that are important from those that aren’t, and the insight extracted will allow Experian to optimize its processes. American Express processes $1 trillion in transactions and has 110 million AmEx cards in operation. It relies heavily on data analytics and machine learning algorithms to help detect fraud in near real time, saving millions in losses. Additionally, AmEx is leveraging its data flows to develop apps that can connect a cardholder with products, services and special offers, and it is giving merchants online business trend analysis and industry peer benchmarking.
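As a rough illustration of near-real-time fraud screening (not AmEx's actual system), a new transaction can be compared against a cardholder's spending history; production systems use far richer features and models, and the amounts and threshold here are invented:

```python
# Sketch: flag a transaction that deviates sharply from the cardholder's
# own history, using a simple z-score against past amounts.

import statistics

def is_suspicious(history, amount, z_threshold=3.0):
    """True when the amount is far outside the cardholder's normal range."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = (amount - mean) / stdev
    return z > z_threshold

past_amounts = [25.0, 40.0, 18.0, 32.0, 27.0, 45.0, 30.0]

print(is_suspicious(past_amounts, 35.0))   # in line with history: False
print(is_suspicious(past_amounts, 900.0))  # far outside it: True
```

The appeal of even this crude rule is speed: it needs only a running mean and deviation per card, so it can run as each transaction arrives.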


Skills shortage a major cyber security risk


Security industry leaders are increasingly putting emphasis on cyber resilience based on good detection and response capabilities, rather than relying mainly on defence technologies and controls. “These results reflect the difficulty in defending against increasingly sophisticated attacks and the realisation that breaches are inevitable – it’s just a case of when and not if,” said Piers Wilson, director at the IISP. “Security teams are now putting increasing focus on systems and processes to respond to problems when they arise, as well as learning from the experiences of others.” When it comes to investment, the survey suggests that for many organisations, the threats are outstripping budgets in terms of growth. The number of businesses reporting increased budgets dropped from 70% to 64%, and businesses with falling budgets increased from 7% to 12%. According to the IISP, economic pressures and uncertainty in the UK market are likely to be restraining factors on security budgets, while the demands of the General Data Protection Regulation (GDPR) and other regulations such as the Payment Services Directive (PSD2) and the Networks and Information Systems Directive (NISD) are undoubtedly putting more pressure on limited resources.


Talos finds new VPNFilter malware hitting 500K IoT devices, mostly in Ukraine

While the researchers have said that such a claim isn't definitive, they have observed VPNFilter "actively infecting" Ukrainian hosts, utilising a command and control infrastructure dedicated to that country. The researchers also state VPNFilter is likely state sponsored or state affiliated. As detailed by the researchers, the stage 1 malware persists through a reboot, which normal malware usually does not, with the main purpose of the first stage to gain a persistent foothold and enable the deployment of the stage 2 malware. "Stage 1 utilises multiple redundant command and control (C2) mechanisms to discover the IP address of the current stage 2 deployment server, making this malware extremely robust and capable of dealing with unpredictable C2 infrastructure changes," the researchers wrote. The stage 2 malware possesses capabilities such as file collection, command execution, data exfiltration, and device management; however, the researchers said some versions of stage 2 also possess a self-destruct capability that overwrites a critical portion of the device's firmware and reboots the device, rendering it unusable.


AWS facial recognition tool for police highlights controversy of AI in certain markets

Amazon had also touted the ability to use Rekognition with footage from police body camera systems, though the ACLU notes that mentions of this type of interaction were scrubbed from the AWS website "after the ACLU raised concerns in discussions with Amazon," adding that this capability is still permissible under Amazon's terms of service. This change "appears to be the extent of its response to our concerns," according to the ACLU. Naturally, using cloud services to build a panopticon is likely to generate concern among residents of the localities that have deployed the technology. Under optimal circumstances, this would be implemented following a referendum or, at a minimum, a period of public comment about combining surveillance technology with mass facial recognition. The ACLU sought documents indicating that any such outreach was attempted, though no documents were discovered. It does, however, point out the existence of an internal email from a Washington County employee stating that the "ACLU might consider this the government getting in bed with big data."


DevOps is a culture, but here's why it's actually not


For DevOps to continue to grow, though, we must put aside the idea that DevOps is a culture. That framing is simply not sufficient and can encourage an all-or-nothing approach. DevOps is a transformation process and a collaboration philosophy, and this particular definition comes with different approaches and different criteria for success. It is time to set up standards to help people imagine practical goals and adopt new norms we can all share. Instead of an all-or-nothing approach, standards unify people and organizations around common goals that are independent of the technology used, the team size, priorities or any other criterion. Setting up standards can also be an iterative process. Take time to think through and grow standards that can continuously shape the interaction between developers, ops, code and servers. And make sure these DevOps standards give the different stakeholders time to experiment, learn and provide feedback. The twelve-factor app methodology, cloud-native principles and the Consortium for IT Software Quality standards are good examples to follow and iterate on.


AI boosts data center availability, efficiency

AI in the data center, for now, revolves around using machine learning to monitor and automate the management of facility components such as power and power-distribution elements, cooling infrastructure, rack systems and physical security. Inside data-center facilities, there are increasing numbers of sensors that are collecting data from devices including power back-up (UPS), power distribution units, switchgear and chillers. Data about these devices and their environment is parsed by machine-learning algorithms, which cull insights about performance and capacity, for example, and determine appropriate responses, such as changing a setting or sending an alert. As conditions change, a machine-learning system learns from the changes – it's essentially trained to self-adjust rather than rely on specific programming instructions to perform its tasks. The goal is to enable data-center operators to increase the reliability and efficiency of the facilities and, potentially, run them more autonomously. However, getting the data isn’t a trivial task.
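The monitor-and-respond loop described above can be reduced to a toy example: learn a rolling baseline from recent sensor readings and alert when a new reading drifts from it. The sensor values, window size and tolerance are invented for illustration:

```python
# Sketch: a rolling-baseline check of the kind a data-center monitoring
# system might apply to chiller temperatures. Values here are made up.

from collections import deque

class SensorMonitor:
    def __init__(self, window=5, tolerance=4.0):
        self.readings = deque(maxlen=window)
        self.tolerance = tolerance

    def observe(self, value):
        """Learn the recent baseline; alert when a reading drifts from it."""
        if len(self.readings) == self.readings.maxlen:
            baseline = sum(self.readings) / len(self.readings)
            if abs(value - baseline) > self.tolerance:
                return f"ALERT: reading {value} deviates from baseline {baseline:.1f}"
        self.readings.append(value)
        return None

monitor = SensorMonitor()
for temp in [18.0, 18.5, 17.9, 18.2, 18.4, 18.1, 26.0]:
    alert = monitor.observe(temp)
    if alert:
        print(alert)
```

The "self-adjusting" quality the paragraph describes comes from the sliding window: the baseline tracks whatever the equipment has recently been doing, rather than a hard-coded setpoint.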


The path to explainable AI

Traceability also addresses several challenges in AI’s implementation. First, it supports quality in new, emerging applications of this advanced technology. Second, in the evolution of human and machine interactions, traceability makes answers more understandable by humans, which helps drive AI’s adoption and the change management necessary for successful implementations. Third, it helps drive compliance in regulated industries such as life sciences, healthcare, and financial services. Traceability exists in some more mature AI applications like computational linguistics. In other, less mature technologies, the so-called black box problem still tends to appear. This mostly occurs in the context of deep neural networks: machine learning algorithms that are used for image recognition or natural language processing involving massive data sets. Because a deep neural network is established through multiple correlations across these massive data sets, it is hard to know, for now, why it came to a particular conclusion. Companies need a more comprehensive governance structure, especially for advanced technologies like neural networks that do not permit traceability.


Discussions on the Future of .NET Core


One of the major weaknesses of .NET Core today is the misunderstanding that comes with it. Countless developers are still asking, "What's the difference between .NET Core, .NET Standard and .NET Framework?" Likewise, which one should they choose, and why? The choices aren't always easy or clear. For example, it is actually possible to have a .NET Core application that targets the .NET Framework – which, if you think about it, is really confusing, because we know that both the .NET Framework and .NET Core are runtime implementations of .NET Standard. The .NET Core terminology is overloaded: there are .NET Core applications, the .NET Core CLI, the .NET Core SDK and .NET Core runtimes. I believe there is much room for improvement in making all of this easier to digest and use. There’s still some work to be done on the performance side of things. Kestrel, the ASP.NET Core web server, performs extremely well in the TechEmpower “plaintext” benchmark, but not so well in the higher-level tests involving database queries and the like. Much of the code that’s been migrated over from the full-fat .NET Framework could be improved a lot in that regard. The great thing is that people are now diving into the code and digging these things out.



Quote for the day:



"Prosperity isn't found by avoiding problems, it's found by solving them." -- Tim Fargo


Daily Tech Digest - May 23, 2018

12 Frequently Asked Questions on Deep Learning


Feature engineering is the process of using domain knowledge to create feature extractors that reduce the complexity of the data and make patterns more visible to learning algorithms. This process is difficult and expensive in terms of time and expertise. In machine learning, most applied features need to be identified by an expert and then hand-coded according to the domain and data type. For example, features can be pixel values, shape, texture, position and orientation. The performance of most machine learning algorithms depends on how accurately the features are identified and extracted. Deep learning algorithms, by contrast, try to learn high-level features from the data itself. This is a very distinctive part of deep learning and a major step beyond traditional machine learning, because it removes the task of developing a new feature extractor for every problem. A convolutional neural network, for example, will learn low-level features such as edges and lines in its early layers, then parts of faces, and then a high-level representation of a face.
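The contrast can be made concrete with a hand-engineered feature extractor, the kind of thing a deep network learns on its own. Below, a Sobel-style horizontal edge filter is applied by hand to a tiny synthetic image; the image and kernel are illustrative only:

```python
# Sketch: a hand-coded feature extractor (horizontal edge detector).
# A deep network would learn filters like this from data instead of
# having an expert write them down.

def apply_filter(image, kernel):
    """Valid-mode sliding-window filter (cross-correlation), pure Python."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = sum(image[i + a][j + b] * kernel[a][b]
                      for a in range(kh) for b in range(kw))
            row.append(acc)
        out.append(row)
    return out

# A tiny image: dark top half, bright bottom half (a horizontal edge).
image = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [9, 9, 9, 9],
    [9, 9, 9, 9],
]

sobel_horizontal = [
    [-1, -2, -1],
    [ 0,  0,  0],
    [ 1,  2,  1],
]

edges = apply_filter(image, sobel_horizontal)
print(edges)  # → [[36, 36], [36, 36]]: strong responses at the boundary
```

Every window here straddles the dark/bright boundary, so every output is a strong edge response; on a uniform image the same filter would return zeros.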



No CS degree? For skilled developers, 75% of hiring managers don't care

Strong work experience is the most important qualification that recruiters and hiring managers look for when filling tech positions, the report found. However, resume-bolstering factors like degree, prestige, and skill keywords are not accurate predictors of future job success, according to the report. Instead, hiring managers and recruiters are looking to indicators that demonstrate ability, such as previous work experience, years of work, and personal projects, which come closer to measuring a candidate's skills. ... Hiring managers' top three measures of success in recruiting were quality of candidate, future performance success, and employee retention, the report found. Failing to align on candidates' skills and failing to align on expectations are two of the top hurdles hiring managers face when working with recruiters, the report found. To solve this problem, recruiters should regularly check in with hiring managers to understand the nuances of the technical skills hiring managers are looking for in each open role. For example, what are the crucial must-have skills for a full-stack developer versus a back-end developer? This can help narrow down the pool of qualified candidates.


Doctors are using AI to see how diseases change our cells


This model can predict where these organelles will be found in any new cell, so long as it’s provided with an image from a microscope. The researchers also used AI to create a probabilistic model that takes its best guess at where one might expect to find those same organelles if provided with a cell’s size and shape, along with the location of its nucleus. These models are useful for doctors and scientists because they provide a close-up look at the effects of cancer and other diseases on individual cells. By feeding the AI with data and images of cancer cells, they can get a more complete picture of how the cell, and its individual components, are affected. And that can indicate how doctors can help each patient with treatment tailored to their disease. The team from the Allen Institute hopes their tools can help democratize medical research, improving healthcare in underserved areas. So the researchers are working to improve them, creating more complete models, according to NPR. They hope to have a broader database, full of models of more cells, available over the next few months.


Everything you need to know about the new general data protection regulations

GDPR applies to any organisation operating within the EU, as well as any organisations outside of the EU which offer goods or services to customers or businesses in the EU. That ultimately means that almost every major corporation in the world will need to be ready when GDPR comes into effect, and must start working on their GDPR compliance strategy. There are two different types of data-handlers the legislation applies to: 'processors' and 'controllers'. The definitions of each are laid out in Article 4 of the General Data Protection Regulation. A controller is "person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of processing of personal data", while the processor is "person, public authority, agency or other body which processes personal data on behalf of the controller". If you are currently subject to the UK's Data Protection Act, for example, it's likely you will have to look at GDPR compliance too.


You’ve probably been hiring the wrong kind of data scientist

A lot of people like to call themselves data scientists because they’re using point-and-click tools, like Tableau and Excel, to perform data analysis and visualization in order to gain business insights. ... The real challenge comes from handling large datasets, including textual or other unstructured raw data, and doing so in real time–all of which requires programmatic execution. That is, coding. Indeed, many of the gains in AI and data science are thanks to what researchers are calling the “Unreasonable Effectiveness of Data”–being able to learn programmatically from astronomical data sets. This work is also highly nuanced and detailed, and doing the wrangling and cleaning properly is crucial for developing effective machine intelligence later on. Point-and-click software just isn’t sophisticated enough to substitute for good programming skills (after all, you can perform machine learning with Excel). This goes beyond just the usual mantra of “garbage in, garbage out.” Employers are trying to manage turbocharged public relations on social media while staying in regulators’ good graces despite that enhanced scrutiny.
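The kind of programmatic wrangling the author argues for looks roughly like this: messy, semi-structured records that point-and-click tools handle poorly are normalized with a few lines of code. The records and field names are invented for illustration:

```python
# Sketch: programmatic cleaning of raw records before any analysis.
# Inconsistent casing, stray whitespace, currency symbols and a
# partially empty row are all handled in code.

import re

raw_records = [
    "  Alice ,  ACME Corp , $1,200.50 ",
    "BOB,acme corp,980",
    "carol , Beta LLC ,  $2,000 ",
    ", Beta LLC, 150",            # missing name: should be dropped
]

def clean(record):
    name, company, amount = [field.strip() for field in record.split(",", 2)]
    if not name:
        return None  # reject incomplete rows rather than guessing
    return {
        "name": name.title(),
        "company": company.title(),
        "amount": float(re.sub(r"[^\d.]", "", amount)),
    }

cleaned = [r for r in (clean(rec) for rec in raw_records) if r]
print(cleaned)
```

None of this is glamorous, but decisions like "drop the row or impute the name?" are exactly the nuanced wrangling choices that determine how well any later machine learning works.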


A Simple and Scalable Analytics Pipeline


The core piece of technology I’m using to implement this data pipeline is Google’s DataFlow, which is now integrated with the Apache Beam library. DataFlow tasks define a graph of operations to perform on a collection of events, which can be streaming data sources. This post presents a DataFlow task implemented in Java that streams tracking events from a PubSub topic to a data lake and to BigQuery. An introduction to DataFlow and its concepts is available in Google’s documentation. While DataFlow tasks are portable, since they are now based on Apache Beam, this post focuses on how to use DataFlow in conjunction with additional managed services on GCP to build a simple, serverless, and scalable data pipeline. The data pipeline that performs all of this functionality is relatively simple. The pipeline reads messages from PubSub and then transforms the events for persistence: the BigQuery portion of the pipeline converts messages to TableRow objects and streams directly to BigQuery, while the AVRO portion of the pipeline batches events into discrete windows and then saves the events to Google Storage.
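The two branches can be illustrated in plain Python (deliberately not the actual Beam/DataFlow API): one function maps each event to a BigQuery-style row, while the other groups events into fixed time windows as the AVRO branch would before writing batches. Event fields and window size are invented:

```python
# Sketch of the pipeline's two branches, as pure Python rather than Beam:
# per-event row conversion (BigQuery side) and fixed-window batching
# (AVRO/Google Storage side).

import json

events = [
    {"ts": 0,  "user": "u1", "action": "login"},
    {"ts": 4,  "user": "u2", "action": "click"},
    {"ts": 11, "user": "u1", "action": "purchase"},
]

def to_table_row(event):
    """BigQuery branch: one event becomes one flat row."""
    return {"timestamp": event["ts"], "user": event["user"],
            "payload": json.dumps({"action": event["action"]})}

def window(events, size):
    """AVRO branch: group events into discrete, fixed-size time windows."""
    windows = {}
    for e in events:
        windows.setdefault(e["ts"] // size, []).append(e)
    return windows

rows = [to_table_row(e) for e in events]
batches = window(events, size=10)
print(len(rows), sorted(batches))  # → 3 [0, 1]
```

In the real pipeline Beam handles the streaming reads, windowing and writes; the point of the sketch is only that one branch is stateless per event while the other must accumulate state per window.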


7 risk mitigation strategies for the cloud

“Cloud services often encourage ‘casual use’ of data; I can collect, search and store anything just about anywhere” is the hook, says John Hodges, vice president of product strategy for AvePoint. “We often see this in systems like Box, DropBox or OneDrive, where there is a real mixed-use danger in how content is stored and shared.” The simple solution? Prohibit services where mixed-use is likely to be a problem. ... Zero trust is an IT security strategy wherein an organization requires every user, system or device inside or outside its perimeter to be verified and validated before connecting to its systems. How can you use a zero trust model to mitigate cloud risk? For Insurity, an organization that specializes in property and casualty insurance services and software, a zero trust approach means restricting access tightly. “We provide logical access to the minimum set of users with a minimum set of rights and privileges in line with job function requirements. This control is audited internally by our Enterprise Security team and externally as part of our annual SOC audit,” says Jonathan Victor, CIO of Insurity. Regularly examine user access levels and ask yourself whether they make sense.


What Is Microservices? An Introduction to Microservice Architecture


Now, let us look at a use case to get a better understanding of microservices. Take the classic example of a shopping cart application. When you open a shopping cart application, all you see is just a website. But behind the scenes, the application has a service for accepting payments, a service for customer support and so on. Assume that the developers of this application created it in a monolithic framework: all the features are put together in a single code base and sit on top of a single underlying database. Now, suppose that a new brand is coming up in the market and the developers want to add the details of that upcoming brand to the application. Then they not only have to rework the service for new labels, but they also have to reframe the complete system and deploy it accordingly. To avoid such challenges, the developers of this application decided to shift from a monolithic architecture to microservices.


Cybercriminals Battle Against Banks' Incident Response

Persistent attackers aren't backing down when banks detect them and launch their incident response processes, either. One in four bank CISOs in the Carbon Black study say their institution faced attackers fighting back when they got spotted, trying to deter defenses and the investigation into the attack. "They are leaving wipers or destructive malware to inhibit [IR], deleting logs, and inhibiting the capacity of forensics tools," for example, says Tom Kellermann, chief cybersecurity officer at Carbon Black. "Sometimes they are using DDoS to create smokescreens during events." These counter-IR activities are forcing banks to be more proactive and aggressive as well, he says. "They need to have threat hunting teams. You can't just rely on telemetry and alerts." While banks often rely on their IR playbooks, attackers have the freedom to freelance and counter IR activities: they change their malware code on the fly when it gets detected, delete activity logs to hide their tracks, and even target bank security analysts and engineers to help their cause.


Certain types of content make for irresistible phishes

It used to be that fear, urgency and curiosity were the top emotional motivators behind successful phishes. Now they’ve been replaced by entertainment, social media and reward/recognition. According to the company, simulated eCards, internal promotion/reward programs, and a number of financial and compliance scenarios (e.g., phishes with “Financial Information Review” or “Compliance Training” in the subject line) are most successful at getting users to click. Employees should be trained to be aware of their emotional reactions to emails and see them as phishing triggers. “When creating simulations, remember consumer scams—those phony Netflix or LinkedIn emails sent to busy employees, who are glad to switch gears and click on something fun,” the company notes. “Understand the dynamics of entertainment or social phishing (think uncritical social acceptance and shortened URLs).” And when it comes to emails promising rewards, employees should be taught to be critical of rewards and deals that sound too good to be true.



Quote for the day:


"You don't have to hold a position in order to be a leader." -- Anthony J. D'Angelo


Daily Tech Digest - May 22, 2018

Smart Homes of Tomorrow – This Is Why We Can’t Have Nice Things


What new design concepts are needed to address the emerging technologies and risks presented by the ever-connected smart home? It's not about informing everyone that everything can be hacked. It’s about creating awareness and helping others understand the risks involved with modern technology. It’s about helping builders and designers understand the technological solutions required by their clients and how to implement them correctly, so they too can educate users on how to maintain their systems safely and securely. Consumers need to be aware of their devices’ remote and local environment, and how their data is collected and stored. They also need to be aware of how their personal devices and appliances can be affected by outages outside of their control, like a DDoS attack on a cloud environment or something as simple as a power outage. Finally, we as a community need to put pressure on manufacturers to produce secure devices with clear plans for how to patch and mitigate future vulnerabilities. Manufacturers also have to begin working together to secure user data and integrate it safely in a crowded environment of smart links and physical devices, ultimately preventing unauthorized remote access.



Legit tools exploited in bank heists


Native tools such as PowerShell and Windows Management Instrumentation (WMI) grant users exceptional rights and privileges to carry out the most basic commands across a network. These “non-malware” or fileless attacks account for more than 50% of successful breaches, the report said, with attackers using existing software, allowed applications and authorised protocols to carry out malicious activities. In this way, attackers are able to gain control of computers without downloading any malicious files and therefore remain unnoticed by malware-detection security systems.  ... Finally, PowerShell is used to connect to a command and control server to download a malicious PowerShell script designed to find sensitive data and send it to the attacker, all without downloading any malware. Almost every Carbon Black customer (97%) was targeted by a non-malware attack during each of the past two years, but the report notes that awareness of malicious usage for tools such as PowerShell has never been higher, with 90% of CISOs reporting seeing an attempted attack using PowerShell.


4 best practices for tapping the potential of prescriptive analytics

One potential value of prescriptive analytics is that you don’t necessarily need a ton of data to reach the best decision or outcome. Prescriptive analytics focuses the question you’re asking, and the decisions you’re trying to reach, to one tangible answer using a smart model of your business that is not dependent on the amount of data (how much or how little) that you have. Predictive techniques and functionalities can be great at identifying a multitude of options through statistical modeling and forecasting, as long as you have the relevant data—but that’s precisely the problem. It’s difficult to process and synthesize numerous options and the nuanced differences among them to determine what you should actually do. How can you be sure that you’re making the best decision? How can you be sure of the impact it will have on your company? Prescriptive analytics can involve hundreds of thousands of tradeoffs associated with a question you might have, and it uses the available data to identify the best decision and impact relative to the goal you’re trying to achieve. 
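Prescriptive analytics in miniature: rather than forecasting many options and leaving the choice to a human, enumerate candidate decisions against a model of the business and return the single best one. The pricing model, numbers and candidate range below are invented for illustration:

```python
# Sketch: prescribing one best decision from a business model, without
# needing a large historical data set. The demand model is hypothetical.

def profit(price, model):
    """A simple demand model: higher prices sell fewer units."""
    units = max(0, model["base_demand"] - model["sensitivity"] * price)
    return units * (price - model["unit_cost"])

business = {"base_demand": 1000, "sensitivity": 40, "unit_cost": 5}

# Evaluate every candidate decision and prescribe the best one.
candidates = range(6, 25)
best_price = max(candidates, key=lambda p: profit(p, business))
print(best_price, profit(best_price, business))  # → 15 4000
```

The candidate set here is tiny enough to enumerate; real prescriptive engines solve the same "best decision relative to a goal" problem over hundreds of thousands of tradeoffs with optimization solvers rather than brute force.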


Technological advancements are changing the landscape for new and existing businesses alike


With Brexit looming over both small and large businesses operating within the UK, maintaining a competitive price point despite new tariffs will be a tough challenge to overcome. However, with new technological solutions opening up greater levels of efficiency and productivity, this task should not appear as daunting as many believe. Tech companies will continue to innovate solutions for both current and future business issues, finding new ways to improve efficiency within the business environment. Voice recognition technology such as iBOB, for instance, is now freeing up valuable time for business owners and receptionists. It is reducing the need for routine customer service interactions on the phone, such as appointment bookings, which makes the remaining human contact more impactful and allows small businesses to focus their resources on more profitable aspects of their business. Employing affordable technological solutions, with an aim to focus the added work hours on tasks more closely related to the bottom line, will allow existing businesses to maintain their competitive position within the market.


Cultural change more important than tech when implementing big data

“It is not your data or my data; it is the firm’s data, and the value you create for the business is from that data,” Tewary explained. “It is a transformation. It’s changing the people culture aspect, so there’s a lot of education. You know, you have to be an evangelist. You wear multiple hats to show people the value.” For Tewary at Verizon, finding advocates within the company for sharing big data was crucial. “We found champions,” he said. “For example, finance … was a good champion for us, where we used the data and analytics to really actually launch some very critical initiatives for the firm — asset-backed securities. … That created the momentum.” Dobrin agreed with this strategy of using champions within an enterprise to help lead the way for the entire company. “It’s not just a jump to the top of the ladder, because there’s just a lot of work that’s required to do it. You can do that within a business unit.” While the whole enterprise doesn’t need to get there all at the same time, as other areas of the enterprise begin to see the use of big data and how it can change the game, they will be open to the idea, Dobrin explained.


What is SDN? How software-defined networking changed everything

(Image: 180507-06-open-vswitch.jpg)
To stay competitive in networking and to avoid being obsoleted by history, network equipment vendors have either blazed the trails for SDN, or found themselves adopting SDN reluctantly, perhaps looking a little singed in the process. One vendor clearly in the former camp, not the latter, is Juniper Networks. It plunged into the SDN field during the fateful year of 2012, first by purchasing a firm called Contrail, and then by building it into an open-source virtual appliance ecosystem unto itself: OpenContrail. As the diagram above depicts, OpenContrail serves as a device that provides the routing logic for distributed operating systems that host Docker containers. ... "It's a big part of operating and automating both a virtual and a physical infrastructure. It orchestrates the VNFs [virtual network functions] and puts together the service chains, all the way to the edge and to the core. Contrail uses vRouter and, in a distributed data center infrastructure, reach into any part of the cloud, string up the required VNFs, stitch together the different pieces of the service, and deliver a custom service to a certain vertical, or a group of end customers. It automates that whole process of customizing the services that can be offered, ultimately, to our service provider customers."


How can you categorize consumers who keep remaking themselves?


Perhaps there won’t be a “mass market” for consumer goods anymore; just a mass of individuals who are increasingly difficult to categorize, and who reinvent themselves from moment to moment, from platform to platform. People will still want to gather in groups with like-minded people. But they will find them through technology and data and connect with them based on their shared values and interests rather than practical connections, such as living in the same area. Rather than being defined by markers such as gender, age or location, they will express themselves in ways that are more fluid and flexible. In one of the future worlds we modeled at our hack week in Berlin, these groups – or “tribes” – broke down physical borders and formed their own communities, both real and virtual. They started to pool their purchasing power and demand a different relationship with brands. Today, consumer-facing companies try to tailor offers and discounts that will appeal to individual consumers, based on their purchasing data – with varying degrees of skill and success. In the future, will products themselves be individualized?


Containers and microservices and serverless, oh my!

Instead of using containers to run applications, serverless computing replaces containers with another abstraction layer. Its functions or back-end services are one-job programs that use compute resources without requiring the developer to manage them. Instead of calling functions in the traditional sense, in serverless a developer calls a working program to provide a service for the program they’re building. The Cloud Native Computing Foundation (CNCF) Serverless Working Group defines serverless computing as “building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment.” Or, for another definition: “Serverless architectures refer to applications that significantly depend on third-party services,” says Mike Roberts, engineering leader and co-founder of Symphonia, a serverless and cloud architecture consultancy. “By using these ideas, and by moving much behavior to the front end, such architectures remove the need for the traditional ‘always on’ server system sitting behind an application.”
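As an illustration of that one-job, finer-grained deployment model, here is a minimal sketch of a function written in a FaaS-handler style. The event shape and handler signature are assumptions modeled loosely on common provider conventions, not any specific platform’s API:

```python
# Minimal sketch of a single-purpose, serverless-style function. On a real
# platform, the runtime (not the developer) would invoke, scale, and bill it
# in response to demand; here we simply call it directly.

import json

def handler(event, context=None):
    """One job only: turn an order event into a priced confirmation."""
    order = json.loads(event["body"])
    total = sum(item["qty"] * item["price"] for item in order["items"])
    return {"statusCode": 200, "body": json.dumps({"total": total})}

# Invoking it locally with a hypothetical event payload:
event = {"body": json.dumps({"items": [{"qty": 2, "price": 3.5}]})}
response = handler(event)
```

The point is the granularity: the unit of deployment is this one function, not a container or server hosting a whole application.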


Websites Still Under Siege After 'Drupalgeddon' Redux

Nearly two months after critical Drupal fixes were released, security firm Malwarebytes says it is still finding dozens of unpatched websites that have been exploited to host cryptocurrency miners or, in other cases, to redirect visitors to malware (see Cryptocurrency Miners Exploit Widespread Drupal Flaw). The problems stem from two critical vulnerabilities in Drupal, both of which are remotely exploitable. That’s a perfect combination for attackers: give them a widely used piece of software such as Drupal, plus known vulnerabilities that can be easily and remotely exploited without even needing to trick a victim into taking any action. The first flaw, CVE-2018-7600, was revealed March 28, and the second, CVE-2018-7602, on April 25. The vulnerabilities were so severe that they were dubbed Drupalgeddon 2 and Drupalgeddon 3. Although patches have been available since the vulnerabilities were publicized, attackers are still taking advantage of websites that haven’t been upgraded. “Rolling out a CMS is the easy part,” writes Jerome Segura, lead malware intelligence analyst with Malwarebytes, in a blog post. “Maintaining it is where most problems occur due to lack of knowledge, fear of breaking something and, of course, costs.”


Effective IoT Security Requires Machine Learning

Artificial intelligence (AI) is a branch of computer science focused on the theory and development of computer systems capable of performing tasks that normally require human intelligence, such as visual perception and decision-making. Machine learning is a subset of AI that focuses on using algorithms to parse data, learn from it, and then make predictions. In contrast to a static algorithm, a critical aspect of machine learning is that the machine is “trained” using large amounts of data and algorithms that give it the ability to continually learn how to perform a given task. Tools based on machine learning are necessary to supplement the existing set of security tools. These new tools help organizations identify and mitigate the emerging generation of security breaches that are designed to leverage both legacy and evolving attack surfaces to evade the enterprise’s traditional defenses. When evaluating security tools based on machine learning, there are three key concepts that IT organizations should keep in mind.
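To make the contrast with a static algorithm concrete, the toy detector below “learns” a baseline from observed traffic and flags readings that fall far outside it. It stands in for real ML-based security tools, which are far more sophisticated; the device and all numbers are invented:

```python
# Toy illustration of "training" versus a fixed rule: instead of a hard-coded
# threshold, the detector learns a statistical baseline from normal traffic
# and flags observations far from that baseline. Figures are hypothetical.

from statistics import mean, stdev

class AnomalyDetector:
    def train(self, samples):
        """Learn the baseline (mean and spread) from observed normal data."""
        self.mu = mean(samples)
        self.sigma = stdev(samples)

    def is_anomalous(self, value, threshold=3.0):
        """Flag anything more than `threshold` standard deviations from baseline."""
        return abs(value - self.mu) > threshold * self.sigma

# "Train" on normal requests-per-minute from a hypothetical IoT device...
detector = AnomalyDetector()
detector.train([98, 102, 100, 97, 103, 101, 99, 100])

# ...then score new observations as they arrive.
suspicious = detector.is_anomalous(250)  # a sudden traffic spike
```

Because the baseline is learned rather than hard-coded, retraining on fresh data lets the same detector adapt as a device’s normal behavior evolves, which is the property the article’s “continually learn” framing describes.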



Quote for the day:


"There are plenty of difficult obstacles in your path. Don't allow yourself to become one of them." -- Ralph Marston