Daily Tech Digest - August 31, 2018

IoT gets smarter but still needs backend analytics

The difference between doing analytics completely on an endpoint device or partially on a device is an important one, according to Gartner research vice president Mark Hung. At the core, the analytics done by IoT implementations is about machine learning and artificial intelligence, letting systems take data provided by smart endpoints and fashion it into actionable insights about reliability, performance, and other line-of-business information automatically. Applying the lessons learned from sophisticated ML is easy enough, even for relatively constrained devices, but some parts of the ML process are much too computationally intensive to happen at most endpoints. This means that the endpoints themselves don’t change their instructions, but that they provide information that can be used by a more powerful back-end to customize a given IoT implementation on a per-endpoint basis. The case of video analytics for smart city applications like traffic monitoring – using a system where the cameras themselves track pedestrians and motorists, then score that data against a centrally-created AI model – is an instructive one.
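
That split, heavy training on a central back-end with cheap scoring on the endpoint, is easy to sketch. Below is a minimal, hypothetical Python illustration (toy data, sklearn assumed; not Gartner's or any vendor's pipeline): the back-end does the expensive fitting, ships the model, and the endpoint only scores.

```python
import pickle
from sklearn.linear_model import SGDClassifier

# --- Central back-end: computationally heavy training ---
X_train = [[0.1, 0.9], [0.8, 0.2], [0.2, 0.8], [0.9, 0.1]]  # toy features
y_train = [0, 1, 0, 1]                                      # toy labels
model = SGDClassifier().fit(X_train, y_train)
model_blob = pickle.dumps(model)          # "shipped" down to the endpoint

# --- Endpoint: cheap scoring only, no training ---
edge_model = pickle.loads(model_blob)
reading = [[0.15, 0.85]]                  # e.g., features from a camera frame
print("endpoint decision:", edge_model.predict(reading))
```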


The anatomy of fake news: Rise of the bots

Spreading misinformation has become such a mainstream topic that even the term ‘Twitter bot’ has established itself in the modern lexicon. Whilst the term is well known, it can be argued that the development and inner workings of Twitter bots are less well understood. Indeed, even identifying which accounts are bots is considerably difficult, and with good reason: their objective of appearing to be legitimate accounts requires constant refinement. This continuous innovation from botnet operators is necessary as social media companies get better at identifying automated accounts. A recent study conducted by SafeGuard Cyber analysed the impact and techniques leveraged by such bots, and in particular looked at bots attributed to Russian disinformation campaigns on Twitter. The research challenges the concept of a monolithic bot army: the 320,000 accounts identified were divided into thematic categories presenting both sides of the story.


How to retrofit the cloud for security: 2 essential steps

Identity and access management (IAM) can be retrofitted after a cloud migration without a lot of effort. While it depends on the IAM system you use, the native IAM systems found in clouds such as Amazon Web Services and Microsoft Azure are typically both a better and a quicker choice. At the end of the day, of course, it’s your particular requirements that will determine your choice of IAM. Keep in mind that IAM systems depend on directory services to maintain identities and to provide the proper authorization to those identities. You must deploy one of those systems if you don’t already have one. Also, keep in mind that IAM is only of value if all applications and data are included in the system, both in the cloud and on-premises. I’m not a fan of shortcuts when it comes to cloud computing security. However, reality sometimes makes these shortcuts a necessary evil. The result is not as good as if security were integrated from the start. But where security was never implemented at all, most data and applications are simply left at risk.


Why Everyone’s Thinking About Ransomware The Wrong Way

If you think your IT systems are the target of ransomware, you’re not alone. But you’re also not correct. Your IT systems are just the delivery mechanism. The real target is your employees. Ransoms rely on psychological manipulation that IT systems aren’t susceptible to (AI isn’t there just yet). The systems are the prisoner being held for money. The psychology of ransomware is complex, and the two main types — locker and crypto — use different tactics and are successful within different populations of people (more on this later). It’s not just a case of getting your workforce to abide by security rules and keep their eyes open for dodgy ransom notes (this just helps prevent the data and systems from becoming prisoners). You must recognize their unique psychological susceptibilities and design work practices that prevent individuals within your workforce from becoming attractive targets. As mentioned above, ransomware uses complex psychological tactics to get its targets to pay. The two main types of ransomware play off different psychological vulnerabilities.


Here's what two executive surveys revealed about blockchain adoption

Rajesh Kandaswamy, a Gartner fellow and chief blockchain researcher, had a more sobering analysis of blockchain adoption, saying that while interest among enterprises is high, actual deployments are rare. Even when enterprises do perform proof of concept projects, they're often rolled out under pressure from executives who want to do "something" with blockchain. "Most industries are not close to adoption, and even when they do, they do limited activity to test the technology, not as much because of a strong business case," Kandaswamy said via email. A Gartner CIO survey released in May revealed that fewer than 1% of more than 3,100 respondents had rolled out production blockchain systems. Gartner has since completed a second survey whose numbers have yet to be released, but the number of adopters remains low, Kandaswamy said. ... "The challenge for CIOs is not just finding and retaining qualified engineers, but finding enough to accommodate growth in resources as blockchain developments grow," Gartner Research vice president David Furlonger stated in the report.


Android 'API breaking' vulnerability leaks device data, allows user tracking

All versions of Android, including OS forks -- such as Amazon's Kindle FireOS -- are believed to be affected, potentially impacting millions of users. The cybersecurity firm initially reported its findings to Google in March. ... The patch was confirmed in early August, leading to the public disclosure of the vulnerability. Google has fixed the security flaw in the latest version of the Android operating system, Android P, also known as Android 9 Pie. However, the tech giant will not fix prior versions of Android as resolving the vulnerability "would be a breaking API change," according to the cybersecurity firm. Earlier this month, Google announced the launch of Android 9 Pie, which is already rolling out to Android users on some devices. Android devices manufactured by vendors including Nokia, Xiaomi, and Sony will receive the updated OS by the end of fall. The update includes new gesture navigation, themes, and adaptive settings for screen brightness and battery life, among others. Users able to upgrade to Android 9 are encouraged to do so.


Chip shrinking hits a wall -- what it means for you

“The vast majority of today’s fabless customers are looking to get more value out of each technology generation to leverage the substantial investments required to design into each technology node. Essentially, these nodes are transitioning to design platforms serving multiple waves of applications, giving each node greater longevity. This industry dynamic has resulted in fewer fabless clients designing into the outer limits of Moore’s Law,” said Thomas Caulfield, who was named CEO of GlobalFoundries last March, in a statement. Making the move to a new process node is no trivial matter: it takes billions of dollars to move down one node in process technology. What Caulfield is saying is that there are fewer customers for such bleeding-edge manufacturing processes, so the return on investment isn’t there. “I think we’ve reached a change in Moore’s Law. Moore’s Law is an economic law: that we reduce the cost of transistors with each generation. We will still reduce the size of the transistor but at a slower rate,” said Jim McGregor, president of Tirias Research, who follows the semiconductor industry.
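
McGregor's economic framing can be made concrete with a toy calculation. The figures below are invented for illustration, not actual foundry data: if wafer cost rises about as fast as transistor density, the cost per transistor stops falling.

```python
# Moore's Law as an *economic* law: cost per transistor is roughly
# wafer cost divided by transistor density. All numbers are illustrative.
nodes = [
    #  name,  density multiplier, wafer-cost multiplier (vs. first node)
    ("14nm", 1.0, 1.0),
    ("10nm", 1.6, 1.5),
    ("7nm",  2.4, 2.5),
]

for name, density, wafer_cost in nodes:
    print(f"{name}: relative cost per transistor = {wafer_cost / density:.2f}")
# When the last column stops shrinking, the economic incentive to chase
# the newest node disappears, which is Caulfield's point about fewer takers.
```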


No-code and low-code tools seek ways to stand out in a crowd


A suite of prebuilt application templates aims to help users build and customize a bespoke application, such as salesforce automation, recruitment and applicant tracking, HR management and online learning. And a native mobile capability enables developers to take the apps they've built with Skuid and deploy them on mobile devices with native functionality for iOS and Android. "We're seeing a lot of folks who started in other low-code/no-code platforms move toward Skuid because of the flexibility and the ability to use it in more than one type of platform," said Ray Wang, an analyst at Constellation Research in San Francisco. "People want to be able to get to templates, reuse templates and modify templates to enable them to move very quickly." Skuid -- named for an acronym, Scalable Kit for User Interface Design -- was originally an education software provider, but users' requests to customize the software for individual workflows led to a drag-and-drop interface to configure applications.


Will Google's Titan security keys revolutionize account security?

Titan security keys use the FIDO Universal Second Factor (U2F) protocol, which relies on public key cryptography. Adding a Titan device to an account ties a public key to that account; during login, the device supplies a cryptographic signature that is verified against that public key, proving possession of the matching private key. Titan keys also protect against phishing attacks from fake login portals: even with a compromised password, a Titan-enabled account is still protected. When a user logs in to a fake portal, Google said, the key will know that it isn't a legitimate website and will stop the login process immediately. Don't assume that Titan keys are only usable with Google accounts—the FIDO protocol is a popular one that works with a multitude of websites and applications. Any website that supports U2F will work with a Titan key. Titan hardware is also built to be secure—Google designed the devices around a secure element hardware chip that contains all the necessary firmware for it to function, and all of that information is sealed in during the manufacturing process, as opposed to being installed afterward.
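
The challenge-response idea behind U2F is compact enough to sketch. This is a simplified illustration using the Python `cryptography` package, not Google's actual implementation; the origin string and flow are stand-ins for the real protocol's origin binding, which is what defeats look-alike phishing portals.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the key generates a key pair; the server stores the public key.
device_private_key = ec.generate_private_key(ec.SECP256R1())
server_public_key = device_private_key.public_key()

# Login: the server issues a fresh challenge; the signed payload is bound
# to the site's origin, so a fake portal's signature would never verify.
challenge = os.urandom(32)
origin = b"https://accounts.example.com"   # placeholder origin

signature = device_private_key.sign(challenge + origin,
                                    ec.ECDSA(hashes.SHA256()))

try:
    server_public_key.verify(signature, challenge + origin,
                             ec.ECDSA(hashes.SHA256()))
    print("second factor verified: login approved")
except InvalidSignature:
    print("login rejected")
```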


DDD With TLC


When introducing DDD to a new team, start with bounded contexts – breaking down big problems into small, manageable, solvable problems. But leave out the terminology and just start doing it. Understanding the dynamics of a team in order to successfully coach them has a lot to do with instinct and empathy. It’s so important to listen carefully and to be respectful, non-judgmental and kind. People resist DDD because they believe it is too much to learn or is too disruptive to their current process. Solving small problems is a good approach that can build trust in adopting DDD. Domain modeling is an art, not a science, so it’s not uncommon to run into a wall and circle back, or even have a revelation that makes you change direction. Teams benefit from encountering that with a coach who is familiar with modeling and is not worried about the perspective changing while the team works through the process.
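
To make "bounded contexts" concrete without the terminology, here is a minimal, hypothetical Python sketch: the same real-world concept modeled independently in two small contexts, each owning only the attributes its problem needs, sharing nothing but the identity that crosses the boundary.

```python
from dataclasses import dataclass

# Billing context: "customer" means someone we invoice.
@dataclass
class BillingCustomer:
    customer_id: str
    payment_terms_days: int
    outstanding_balance: float

# Shipping context: "customer" means somewhere we deliver.
@dataclass
class ShippingCustomer:
    customer_id: str
    delivery_address: str
    preferred_carrier: str

# Each context solves its own small, manageable problem with its own model.
bill = BillingCustomer("c-42", payment_terms_days=30, outstanding_balance=120.0)
ship = ShippingCustomer("c-42", delivery_address="1 Main St",
                        preferred_carrier="UPS")
```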



Quote for the day:

"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks

Daily Tech Digest - August 30, 2018

Companies are not focusing enough on machine identities, says study
We spend billions of dollars protecting usernames and passwords but almost nothing protecting the keys and certificates that machines use to identify and authenticate themselves. The number of machines on enterprise networks is skyrocketing and most organisations haven’t invested in the intelligence or automation necessary to protect these critical security assets. The bad guys know this, and they are targeting them because they are incredibly valuable assets across a wide range of cyber-attacks. According to the study, Securing The Enterprise With Machine Identity Protection: Newer technologies, such as cloud and containerisation, have expanded the definition of a machine to include a wide range of software that emulates physical machines. Furthermore, these technologies are spawning a tidal wave of new, rapidly changing machines on enterprise networks.
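
As a small flavor of the machine-identity hygiene the study calls for, the sketch below uses only Python's standard library to inventory the TLS certificates machines present and flag ones nearing expiry; the host list is a placeholder for a real inventory.

```python
import socket
import ssl
import time

hosts = ["example.com", "example.org"]     # placeholder machine inventory
context = ssl.create_default_context()

for host in hosts:
    with socket.create_connection((host, 443), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' is the certificate's expiry timestamp.
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    days_left = int((expires - time.time()) // 86400)
    warning = "  <-- renew soon" if days_left < 30 else ""
    print(f"{host}: certificate expires in {days_left} days{warning}")
```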



The Evolution of IoT Attacks


In addition to the evolution of IoT devices, there has been an evolution in the way attackers think and operate. The evolution of network capabilities and large-scale data tools in the cloud has helped foster the expansion of the IoT revolution. The growth of cloud and always-on availability to process IoT data has been largely adopted among manufacturing facilities, power plants, energy companies, smart buildings and other automated technologies such as those found in the automotive industry. But this has increased the attack surface for those that have adopted and implemented an army of possibly vulnerable or already exploitable devices. The attackers are beginning to notice this growing field of vulnerable devices that hold valuable data. In a way, the evolution of IoT attacks continues to catch many off guard, particularly the explosive campaigns of IoT based attacks. For years, experts have warned about the pending problems of a connected future, with IoT botnets as a key indicator, but very little was done to prepare for it. Now, organizations are rushing to identify good traffic vs malicious traffic and are having trouble blocking these attacks since they are coming from legitimate sources.


Microservices development will fail with monolithic mindset


Effective microservices development requires organizational change that goes beyond simple, single-team DevOps, said Brian Kirsch, an IT architect and instructor at Milwaukee Area Technical College. Without an overarching DevOps infrastructure across all projects, too many enterprises have created siloed DevOps mini-teams, each producing hundreds of microservices. It's not possible to create a cohesive product when each team works independently and doesn't know what others are doing, Kirsch said. An important practice for organizations moving to microservices is to standardize development tools, frameworks and platforms. Standardization prevents overspending on tools and training and discourages expertise silos and competition for resources. In siloed development, each team in a company often uses its own preferred technology. This reduces engineering resources, because developers may lack skill sets needed to switch teams or substitute on a team using another technology, Kirsch said.


Top 9 Data Science Use Cases in Banking


Banks are obliged to collect, analyze, and store massive amounts of data. ... Nowadays, digital banking is becoming more popular and widely used. This creates terabytes of customer data, so the first step for a data science team is to isolate the truly relevant data. After that, armed with information about customer behaviors, interactions, and preferences, data specialists can use accurate machine learning models to unlock new revenue opportunities for banks by isolating and processing only the most relevant client information to improve business decision-making. Risk modeling is a high priority for investment banks, as it helps to regulate financial activities and plays the most important role when pricing financial instruments. Investment banking evaluates the worth of companies to create capital in corporate financing, facilitate mergers and acquisitions, conduct corporate restructuring or reorganizations, and for investment purposes.


Improving security is top driver for ISO 27001


“Unfortunately, as long as cyber crime remains a lucrative trade, risks will continue to escalate and attackers will continue to proliferate,” said Alan Calder, founder and executive chairman of IT Governance. “To counter this, organisations need to be fully prepared. ISO 27001, an information security standard designed to minimise risks and mitigate damage, offers the preparedness that organisations need.”  Other top reasons for implementing ISO 27001 include gaining a competitive advantage (57%), ensuring legal and regulatory compliance (52%) and achieving compliance with the EU’s General Data Protection Regulation (GDPR), which was cited by 48% of respondents. According to IT Governance, ISO 27001 provides an excellent starting point for achieving the technical and operational measures required by the GDPR to help mitigate data breaches. Closely in line with the drivers for implementing ISO 27001, improved information security was by far the greatest advantage afforded by achieving certification, according to 89% of respondents.


NSX technology shifts virtual administrator responsibilities


NSX technology, and network virtualization broadly, lives at the kernel on each of the hosts. It has to exist at this level to have access to the traffic it needs without affecting the performance of the VMs. This means it's a host extension, and it falls on the virtual admin to ensure installation and functionality. After that's complete, however, the responsibilities can shift to different people. The functions of firewall and router rules haven't changed just because the environment has moved from physical to virtual, which implies these functions remain the network engineers' responsibilities. The network engineers still have relevant, specialized knowledge, but these rules are often generated automatically based on the VM deployment. Network mapping software, such as vRealize Network Insight, can help manage the additional complexity. Network engineers and virtual admins can both use these tools to examine the virtual network, ensure functionality and minimize risk before establishing a software-defined network.


What is CUDA? Parallel programming for GPUs

Without GPUs, those training runs would have taken months rather than a week to converge. For production deployment of those TensorFlow translation models, Google used a new custom processing chip, the TPU (tensor processing unit). In addition to TensorFlow, many other DL frameworks rely on CUDA for their GPU support, including Caffe2, CNTK, Databricks, H2O.ai, Keras, MXNet, PyTorch, Theano, and Torch. In most cases they use the cuDNN library for the deep neural network computations. That library is so important to the training of the deep learning frameworks that all of the frameworks using a given version of cuDNN have essentially the same performance numbers for equivalent use cases. When CUDA and cuDNN improve from version to version, all of the deep learning frameworks that update to the new version see the performance gains. Where the performance tends to differ from framework to framework is in how well they scale to multiple GPUs and multiple nodes.
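
As a quick illustration, assuming PyTorch is installed, a framework will report the CUDA and cuDNN versions it was built against, which is what determines those shared performance numbers:

```python
import torch

# A framework exposes the CUDA/cuDNN versions it was built against;
# a cuDNN version bump is what delivers the across-the-board speedups.
print("CUDA available:", torch.cuda.is_available())
print("CUDA version:  ", torch.version.cuda)
print("cuDNN version: ", torch.backends.cudnn.version())

if torch.cuda.is_available():
    # Placing tensors on the GPU routes the math through CUDA libraries.
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x          # runs as a GPU kernel rather than on the CPU
    print("result lives on:", y.device)
```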



NASA to use data lasers to beam data from space to Earth

Laser communication is not as easy as radio, though, NASA explains. That’s partly because the Earth’s rotation, coupled with the amount of time it takes data to reach the ground station from the spacecraft — albeit faster than radio — means tricky timing calculations are needed to determine where the narrower laser needs to hit. Traditional radio simply needs a data dump, from space, in the vicinity of the ground receiver, whereas laser needs to be continually connected during the transmission. The agency intends to employ a special pointing-and-locking mechanism. The idea is that a pre-scheduled passing craft’s telescope picks up a finder-signal sent from the ground station. That allows the transmitter to lock on. Mirrors in the spacecraft’s laser modulator are driven by sensors, and they send the beam. Using the LCRD, NASA is aiming for a 1.24 Gigabits per second, geosynchronous-to-ground optical link with two ground stations. The first flight, run by NASA's Goddard Space Flight Center in Greenbelt, Maryland, is expected to take place next year.
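
A back-of-the-envelope calculation, using standard physical constants rather than NASA's figures, shows why the timing is tricky: the ground station moves a measurable distance while the beam is in flight.

```python
import math

C = 299_792_458              # speed of light, m/s
GEO_ALTITUDE = 35_786_000    # geosynchronous altitude, m
EARTH_RADIUS = 6_371_000     # mean Earth radius, m

one_way = GEO_ALTITUDE / C   # beam flight time, spacecraft to ground
ground_speed = 2 * math.pi * EARTH_RADIUS / 86_164   # equatorial rotation, m/s
drift = ground_speed * one_way                       # station moves mid-flight

print(f"one-way light time: {one_way * 1000:.1f} ms")
print(f"ground station drifts ~{drift:.0f} m while the beam is in flight")
print(f"bits in flight at 1.24 Gb/s: {1.24e9 * one_way / 1e6:.0f} Mb")
```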


Want a CIO role? Here are the top skills you need and how to get there

While technical skills are increasingly critical, that doesn't necessarily mean executive teams are looking for a former programmer or network engineer to fill their CIO role. A CIO must appreciate the balance between the hype/promise of new technologies and the reality of business, Inuganti said. Despite the need for technical skills, making a jump into leadership at the CIO level requires a deep understanding of the business. CIO candidates must understand the metrics that drive the business, what competitors are doing, and more, Inuganti said. The market previously swung too far toward the business side of things, but with the growth of cloud, big data, artificial intelligence (AI) and other technologies, it now demands more technical skills. In terms of what skills are currently hot, data was always there, Inuganti said, but skills around data analysis are growing in desirability for CIOs. He said it's the hottest commodity in the market today, based on what he has seen with executive searches.


Inside the world's most prolific mobile banking malware

The malware's ability to read messages also means it can intercept text messages from the bank containing one-time passwords, helping the attackers to steal from accounts that use additional security. In addition, Asacub ensures the user can't check their mobile banking balance or change any settings because the permissions it has been given enable it to prevent the legitimate banking app from running on the phone. The attacks might seem basic, but they still work, and Kaspersky figures say Asacub currently accounts for 38 percent of mobile banking trojan attacks. "The example of the Asacub Trojan shows us that mobile malware can function for several years with minimal changes in its distribution pattern," Shishkova told ZDNet. "One of the main reasons for this is that the human factor can be leveraged through social engineering: SMS-messages look like they are meant for a certain user, so victims unconsciously click on fraudulent links. In addition, with regular change of domains from which the Trojan is distributed, catching it requires heuristic methods of detection," she added.



Quote for the day:


"The People That Follow You Are A Reflection Of Your Leadership." -- Gordon TredGold

Daily Tech Digest - August 29, 2018

More often than not, the network will be multi-vendor, consisting of numerous domains with operational and architectural teams operating in silos. Computer networks are complex, and this complexity can be ’managed out’ by introducing a framework that abstracts it. Lowering complexity and getting things right in a standardized way introduces you to the world of automation. Within a network, there are elements that have traditionally been either easy or hard to automate. Here, one should not assume that if something is easy to automate, we should immediately dive in without considering the ease-versus-impact ratio. Operating system upgrades are easy to automate but have a large impact if something goes wrong. No one wants to live in a world of VLANs and ports; realistically, they have a relatively low impact with a very basic configuration that needs to be on every switch. This type of device-level automation is an easy ramp to automation as long as it does not touch any services.
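
As a hypothetical sketch of that easy ramp, the snippet below pushes the same basic VLAN baseline to every switch. It assumes the netmiko library; hosts, credentials, and interface names are placeholders, and nothing here touches running services.

```python
from netmiko import ConnectHandler

switches = ["192.0.2.10", "192.0.2.11"]    # placeholder management addresses

# The same low-impact VLAN baseline, applied identically to every switch.
vlan_config = [
    "vlan 110",
    "name USERS",
    "interface range GigabitEthernet1/0/1 - 24",
    "switchport access vlan 110",
]

for host in switches:
    conn = ConnectHandler(device_type="cisco_ios", host=host,
                          username="netops", password="example")  # placeholders
    print(conn.send_config_set(vlan_config))
    conn.disconnect()
```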


The bright future of machine learning

Mind+machine partnerships are useful in any situation where you’re dealing with a large amount of data, constrained time, or the need for continuous coaching or training. Using machine partners can help us make better choices, decisions and products, with a key advantage being that machines can work on demand, while another human might not be able to learn or respond as quickly. An example I use in class is the use of mind+machine to improve food safety at restaurants. Computers use pre-existing historical data to predict when to inspect restaurants. The computer can identify a restaurant that’s likely to have a violation, and then human inspectors can follow up. And this partnership has done a much better job — inspectors were getting to restaurants that had critical violations about a week earlier than they would have if they had just gone with the normally generated schedule.


This Entrepreneur Shares The Focus Strategy That Helped Him Build an App Used by Millions

Moving, with all the attendant logistical headaches and emotional investment, can be one of the most stressful things you can experience. When your living situation is in flux, it can affect every aspect of your life. And that’s before you factor in trying to find a stranger, or an untested friend or acquaintance, to split the costs with you. This is the problem that Ajay Yadav wants to solve with his company Roomi. He founded the startup in 2015 to help people looking for roommates connect with people who actually are who they say they are. Users of Roomi sign up for the service by completing a background check that includes ID verification and social media accounts. If the prospective roommates think they have a match, they can plan to meet through a secure in-app messaging platform. Since launching in New York City three years ago, the company has expanded to more than 20 cities, acquired four companies, raised $17 million in funding and built a user base of 2.4 million.


Stop Talking Gobbledygook to the Business

For explaining or defending a machine’s good decisions and fixing the bad ones, you’ll want to be able to see scored machine learning output for each record, combined with plain-business-language details of the variable values that most influenced the predicted outcome. For example: Record ID 232333 was predicted to be a high-value customer because of size greater than 10,000 employees, monthly spend between $1M and $1.5M, and so on for relevant decision-influencing input variables. To start earning stakeholder trust early on in your machine learning projects, share intermediate reports, such as top outcome influencers, that can be invaluable to the line of business. Machine learning can rapidly narrow down the scope of potential variables that matter most when faced with hundreds or thousands of variables to analyze. As you create models, visually share progress and insights on where your model is accurate and where it makes mistakes using scatter plots, combination charts and interactive data visualization tools.
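
One simple way to produce that per-record, plain-language explanation is to rank each feature's contribution to a linear model's score. A minimal sketch, with sklearn assumed and the data and column names invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

feature_names = ["employees_k", "monthly_spend_k"]   # invented columns
X = np.array([[12, 1200], [0.5, 30], [15, 900], [0.2, 10], [8, 1500]])
y = np.array([1, 0, 1, 0, 1])                        # 1 = high-value customer

scaler = StandardScaler()
model = LogisticRegression().fit(scaler.fit_transform(X), y)

# Per-record explanation: each feature's contribution to the score is its
# scaled value times the model coefficient; rank to name top influencers.
record = scaler.transform([[11, 1100]])[0]
contributions = record * model.coef_[0]
ranked = sorted(zip(feature_names, contributions), key=lambda t: -abs(t[1]))
for name, c in ranked:
    print(f"{name}: contribution {c:+.2f} to the high-value prediction")
```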


Dell Latitude 7490 review: A solid business all-rounder

There is a good range of ports and connectors, including a Smart Card reader, NFC sensor and fingerprint reader -- the latter two located on the wrist rest. The Smart Card reader sits on the front left edge, where there are also two USB 3.1 ports, a full size HDMI connector and a USB Type-C port with DisplayPort and Thunderbolt 3. The large, round power jack is at the back of this edge. Meanwhile the right edge offers a headset jack, a MicroSD card reader, a SIM slot, a third USB 3.1 port and an Ethernet port with a spring-out base that means it can be accommodated easily in the chassis. The pop-out SIM card caddy is perhaps a little vulnerable, though it's about as invisible as it could be, nestled at the bottom of the right edge. It accommodates a Micro-SIM rather than a Nano-SIM. It's nice to see a MicroSD card slot here, although full-size SD would be welcome too. My review sample performed well. Simultaneous writing into a web app, audio streaming and 20-plus Chrome tabs opened across two application windows presented it with no difficulties at all.


The GDPR And The B2B Seller: Keep Calm And Sell On


B2B sellers are struggling to engage empowered B2B buyers — those traveling on self-directed journeys — who are raising the bar for more insight, more co-creation, and more creativity. Piled on top of these challenges, GDPR seems like the whim of a capricious god in a cosmic smackdown, throwing more obstacles in the way of sales representatives. However, many of the identified seller pain points are actually ineffective tactics — a vestige of a bygone era — that are off-putting to customers and prospects who are fed up with a barrage of impersonal, non-purposeful, and irrelevant communications. GDPR prohibits selling methods that leverage nonconsensual use of personal data, and this new reality will ultimately be good for sellers willing to shift their behaviors. Sales will spend less time doing data entry and sending automatic emails and more time focusing on how they can help interested customers. One of the sales leaders we interviewed shared this sentiment: “For sales representatives to stay relevant, they need to stop automating things. This is just the tip of the iceberg for sales and marketing teams becoming more human.”


Are AI and “deep learning” the future of, well, everything?

Machine learning and deep learning have grown from the same roots within computer science, using many of the same concepts and techniques. Simply put, machine learning is an offshoot of artificial intelligence that enables a system to acquire knowledge through a supervised learning experience. It’s a straightforward enough process, in theory: a human being provides data for analysis, and then gives error-correcting feedback that enables the system to improve itself. Depending upon the patterns in the data it’s exposed to, and which of those it recognises, the system will adjust its actions accordingly. It's this ability to self-develop without the need for explicit programming, but rather to change and adapt when exposed to new data, that makes machine learning such a powerful tool. However, what makes deep learning even more valuable is that it does so without, or with much less, human supervision. David Wood, co-founder of Symbian and now a “futurist” at Delta Wisdom, explains the difference using the example of face recognition.
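
The supervised loop described above is small enough to show in full: labeled examples in, error-correcting fit, adapted predictions out. A minimal sketch with sklearn assumed and toy data:

```python
from sklearn.tree import DecisionTreeClassifier

# A human supplies labeled examples; fitting is the error-correcting
# feedback that adjusts the system until predictions match the labels.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]   # data provided for analysis
y = [0, 1, 1, 1]                       # labels: the supervision

model = DecisionTreeClassifier().fit(X, y)
print(model.predict([[1, 0]]))         # behavior learned, not programmed
```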


5 ways the World Economic Forum says AI is changing banking


“As products and services become more easily comparable and therefore commoditized, it’s not sufficient any more to compete on delivering credit quickly and at a good price, which have been the historic competitive levers” for banks, said Rob Galaski, Deloitte Global Banking and Capital Marketing Consulting leader and one of the authors of the report. For example, to keep its auto loan business relevant, Royal Bank of Canada is piloting a forecasting tool for car dealers to predict demand for vehicle purchases based on customer data. Such information could be more valuable to the dealers than any banking product, Galaski said. “We think that is an exemplar of how we see the industry changing overall,” he said. “Much of the AI debate coming into our work was around replacing humans and doing existing things better or faster. But that take on AI dramatically underestimates the impact. The very way we go about conducting business can be redesigned using AI.”


Excess data center heat is no longer a bug -- it’s a feature!

Developed by MIRIS in cooperation with architecture firm Snøhetta, Skanska, Asplan Viak, and Nokia, The Spark also requires urban data centers to be built in close proximity to the buildings hoping to use the excess heat. These kinds of urban locations may further increase costs and put practical limitations on the size of data centers that can take advantage of the concept. While smaller data centers are increasingly popular, they may not be able to achieve the economies of scale enjoyed by the largest facilities, which can run into the millions of square feet. In addition, depending on the time of day, the weather, and other factors, the heat generated by the data center may not always precisely match the needs of the surrounding community, either generating more heat than the local homes and businesses need, or requiring them to get additional heat from other sources. That’s why the Lyseparken implementation includes a stake in the local power company, Fast Company said, and will “produce and consume electricity from a mix of renewable sources, including solar and thermal energy.”


EU regulation will drive U.S. banks to embrace FinTech or lose market share

"We are already seeing the U.K.'s open banking initiative, which is based on but wider than PSD2, being explored in other markets, including in Central America, Asia and Africa. So it wouldn't be surprising to see similar developments in the U.S.," he added. Even before being pressured by PSD2, some European banks were embracing emerging digital technologies, such as real-time electronic payments; they often gained the technology either through partnerships with FinTechs or by acquiring them outright. "U.K. banks are not, at this stage, seeing FinTechs so much as competitors as they are seeing them as potential collaborators with whom they can develop new journeys, services and products," Chertkow said. "What is clear is that consumer behaviors are changing, particularly with younger generations. Traditional banks need to decide whether they want to maintain their existing business model and seek to differentiate it from the FinTechs or whether they need to respond by copying the best user experiences of the FinTechs."



Quote for the day:


"Commitment is the conviction that it's right to fight for what you want." --Tim Fargo


Daily Tech Digest - August 27, 2018

What are next generation firewalls? How the cloud and complexity affect them

So far, nextgen firewall vendors haven't been able to fully translate their features to the needs of cloud environments, says NSS Labs' Spanbauer. "This is a significant engineering feat, and we're not quite there yet with a perfect replica, virtualized or physical." However, they are taking advantage of other capabilities that cloud offers, including the real-time sharing of threat intelligence data. "If you're patient zero, then that's an incredibly difficult scenario to block against," he says. "However, if you give it a minute or two minutes, then patient 10 or 15 to 20, with real-time updates, can be protected by virtue of the cloud abilities of the firewall." There's also the possibility of nextgen firewalls expanding into the endpoint security space. "If they merged, that would be a lot easier for enterprises to manage," says Spanbauer. "But that's not going to happen." Perimeter protection and endpoint protection will remain distinct for the foreseeable future, but the two sets of technologies could mutually benefit one another, he says.



Modular Downloaders Could Pose New Threat for Enterprises

The threat actor behind the campaign — an entity that Proofpoint identifies as TA555 — has been distributing AdvisorsBot via phishing emails containing a macro that initially executed a PowerShell command to download the malware. Since early August, the attacker has been using a macro to run a PowerShell command, which then downloads a PowerShell script capable of running AdvisorsBot without writing it to disk first, Proofpoint said. Interestingly, since first releasing the malware in May, its authors have completely rewritten it in PowerShell and .NET. Proofpoint has dubbed the new variant PoshAdvisor and describes it as not identical to AdvisorsBot but containing many of the same functions, including the ability to download additional modules. ... It is certainly unusual for malware authors to do so and may be an attempt to further evade defenses. "For the enterprise, more variety in the threat landscape and newly coded malware increase complexity for defenders and should be driving investments in threat intelligence, robust layered defenses, and end user education," she says.


Machine learning turns unstructured secondary storage into globally accessible data

It’s important to note, particularly for security-minded organizations, that Cohesity isn’t aggregating the data, just the object metadata, which then points to where the data is. Now storage administrators can globally roll out policies or make upgrades across the multi-node environment with a single click. ... One of the biggest and underappreciated benefits of SaaS is the ability to aggregate data across multiple customers and compare the data. In one’s consumer life, think of Amazon providing recommendations such as “Customers that bought X also bought Y.” Cohesity can compare data and understand its utilization or backup frequency or other data management capabilities against its peers and then make the appropriate changes. Digital CIOs need to shed conventional thinking around storage and think more about globally accessible and optimized data. This becomes particularly important in the ML era, when the quality of data can make the difference between being a market leader or a laggard. In particular, secondary storage may be the biggest, wasted resource that a company has, and being able to harness the knowledge and insights captured in it could help organizations accelerate their digital transformation efforts.
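
The idea of aggregating only metadata can be pictured in a few lines. The sketch below is illustrative only, with invented names and fields rather than Cohesity's API: the index records where each object lives and what state it is in, so global policies can be applied without the data ever moving.

```python
# Object ID -> location and attributes; the data itself never moves.
metadata_index = {}

def register(node, obj_id, size_gb, last_backup):
    metadata_index[obj_id] = {"node": node, "size_gb": size_gb,
                              "last_backup": last_backup}

register("nyc-cluster-1", "vm-finance-db", 512, "2018-08-26")
register("sfo-cluster-2", "vm-web-frontend", 64, "2018-08-25")

# A global policy rollout is just a walk over the aggregated index.
stale = [obj for obj, meta in metadata_index.items()
         if meta["last_backup"] < "2018-08-26"]
print("objects needing a fresh backup:", stale)
```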


Why do enterprises take a long time to install vital security updates

The failure to rapidly deploy and install security updates is placing businesses at greater risk of a targeted cyberattack, as hackers look to exploit the vulnerabilities of outdated systems. Kollective’s report also found that 37% of IT managers list ‘a failure to install updates’ as the biggest security threat of 2018. This makes outdated software a bigger threat than password vulnerabilities (33%), BYOA / BYOD (22%) and unsecured USB sticks (9%). Even more startling, 13% of large businesses have given up on actively managing software distribution, and are, instead, passively asking employees to update their own systems. Kollective blames the failure to install updates on a combination of slow testing procedures and an inability to distribute updates automatically at scale. As Dan Vetras, CEO of Kollective explains: “Following numerous corporate cyberattacks over the last 12 months, today’s businesses are spending more than ever before on enhancing and improving their security systems. But, this investment is wasted if they aren’t keeping their systems up-to-date.


Here comes ‘antidisinformation as a service’

Most of the disinformation accounts deleted by Facebook, Twitter, Google and Microsoft were discovered not by those companies or the U.S. government, but by a company called FireEye. I told you in this space last year about disinformation as a service (DaaS). Most of the Russian disinformation campaigns are carried out by a private company called the Internet Research Agency. But now comes AaaS — antidisinformation as a service. That’s what FireEye provided this week to the Silicon Valley social networking companies. It considers itself a kind of NSA for hire — an intelligence organization, but for enterprises. How does it do it? FireEye’s methodology is multifaceted and a trade secret. But the company’s core competencies lie in discovering hidden malware and network hacks with the use of proprietary technology to detect behavioral anomalies — behavior by code and websites that isn’t normal. Once it finds the general nature of the weird behavior, it then does a lot of shoe-leather research.


What IPv6 features can be found in the latest specification?


The core IPv6 specification -- RFC 2460 -- has changed considerably since it was first released. The new IPv6 features are geared toward reliability, as well as operational and security considerations. To that end, the revised spec contains a security analysis of IPv6, with references to some of the work that's been carried out during the last few years, particularly in the area of IPv6 addressing. Other enhancements target IPv6 extension headers and fragmentation. For example, the original IPv6 specification allowed overlapping fragments -- that is, fragments that covered the same chunk of data from the original unfragmented datagram. The use of overlapping fragments to circumvent security controls was already very popular in the IPv4 world. However, even when there was no legitimate use for them in the IPv6 world, overlapping fragments were still considered valid. Such fragments were eventually declared illegal by RFC 5722, which was published in 2009. Thus, the new specification incorporates that update, banning overlapping fragments.
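
For illustration, the sketch below builds the kind of overlapping fragments RFC 5722 outlaws, assuming the scapy library; the addresses come from the IPv6 documentation prefix, and a compliant stack must now drop such a datagram entirely.

```python
from scapy.all import IPv6, IPv6ExtHdrFragment, Raw

src, dst = "2001:db8::1", "2001:db8::2"    # documentation-prefix placeholders

# First fragment carries bytes 0-15 of the original payload.
frag1 = (IPv6(src=src, dst=dst)
         / IPv6ExtHdrFragment(id=1, offset=0, m=1) / Raw(b"A" * 16))

# offset is counted in 8-byte units, so offset=1 starts at byte 8 and
# overlaps frag1's bytes 8-15: exactly what RFC 5722 declared illegal.
frag2 = (IPv6(src=src, dst=dst)
         / IPv6ExtHdrFragment(id=1, offset=1, m=0) / Raw(b"B" * 16))

frag1.show()
frag2.show()
# send(frag1); send(frag2)   # emitting these requires raw-socket privileges
```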


Microsoft, Salesforce plan to open source major enterprise software products

Microsoft ultimately decided that ONE is too important to keep to itself. “We have decided that this is such an important resource for everybody that just hoarding it ourselves is not the right thing to do,” Bahl said. “So, we are making it available to the entire community so that they can now — and it’s not just for production systems, but also for students that are now graduating.” The software will help large enterprises improve their network uptime by simulating changes to their network before rolling them out live. Microsoft hasn’t disclosed where it plans to release ONE, but GitHub — which Microsoft is in the process of acquiring — seems the logical choice. TransmogrifAI is an automated machine learning library for structured data, which makes sense coming from Salesforce, since its CRM products are built on the traditional row-and-column structure of a relational database. It’s written in Scala and built on top of Apache Spark, Apache’s in-memory analytics software.


Microsoft ups effort to drive Surface Go adoption

One of the most fascinating things about executive leadership in most technology firms is that they generally don’t get marketing. It doesn’t seem to be taught in engineering schools, and even those who get business degrees either opt not to take those classes or didn’t understand what they were taught. The result is that, in general, marketing is underfunded and staffed by people who don’t understand the critical parts of human nature that form the foundation of successful marketing campaigns. Apple, during Steve Jobs’ tenure, was my best example of a firm that truly got the power of marketing, and that company rose to be the most valuable (in terms of market cap) company in the segment. This was even though, for much of that time, it was largely a one-product company (iPod to iPhone). Apple outspent everyone it competed with, with the occasional exception of Samsung, which sometimes outspent Apple to powerful competitive effect (it takes regular shots at Apple).


10 common pitfalls that threaten data quality strategies


“Implementing a data quality strategy is not as simple as installing a tool or a one-time fix,” explains Patty Haines, president and founder of Chimney Rock Information Solutions, Inc., a consultancy that aids organizations in building business intelligence and analytics environments by providing data warehouse and business intelligence services, solutions and mentoring. “Organizations across the enterprise need to work together to identify, assess, remediate, and monitor data with the goal of continual data improvements.” Haines offers her advice on 10 top challenges to a successful data quality strategy. ... “If differences in the definition and use of data continue, it can allow poor quality data to be entered, managed and reported,” Haines says. “The data quality strategy must include the business community, data governance, and subject matter experts working together to determine consistent and agreed-upon definitions to improve the quality of data.”


Why Facebook is powerless to stop its own descent

You could certainly argue that Facebook's problems aren't all of its own making. It's a tool that people use in whatever ways they decide. The fact that humanity has used the social network to power a renewal of tribalism, nationalism, and bigotry is hardly a phenomenon that Facebook or anyone else would have predicted. The problem for Facebook is that it took so long to respond--and it only truly did so after the issue became a PR nightmare. It had the opportunity to step up and figure out where the line was between healthy dialogue and hate speech, and it passed the buck. It prioritized user growth and activity over creating a healthy platform. A crisis doesn't build character, it reveals it--as the aphorism goes. Facebook has lost credibility. Few believe that it can be a leader in solving a problem that it helped create. As a result, the narrative around Facebook as a company and a platform is that it doesn't look out for its users' best interests. It doesn't put them first. And so more people are tuning out.



Quote for the day:


"Defeat is not the worst of failures. Not to have tried is the true failure." -- George Woodberry


Daily Tech Digest - August 26, 2018

There’s good reason for Gartner’s confidence. AI has been moving fast and is already a key area of research and development for many organizations. The fruits of these efforts can already be seen in algorithms that influence things such as social media feeds, autonomous vehicles, apps and even some call centers. And that rapid progress means that we’ll soon start seeing AI technology appear “virtually everywhere” over the next 10 years as it becomes available to the masses. Gartner says movements and trends such as cloud computing, open-source technology projects and the “maker” community are fueling AI’s rise. It adds that AI is most prevalent in technologies such as AI platform as a service, artificial general intelligence, autonomous driving, autonomous mobile robots, flying autonomous vehicles, smart robots, conversational AI platforms, deep neural networks and virtual assistants.



What It Takes To Disrupt A Massive Industry

The strategy involves finding a gap in an incumbent’s product line and creating something that will augment it. This essentially makes you a friend, not a foe. To this end, Nir initially built an analytics layer on top of existing platforms to help with cyber fraud. “And it worked out,” he said. “We became partners with the major players. We talked to their partners, sold to their customers.” But of course, this strategy must go to the next level if a company is to get to scale. “We poured the profits from the helper app back into engineering, building our next-gen SIEM platform,” said Nir. “Not only did the sales fund development, but we had access to customers who were using all the major SIEM platforms. We had a front-row seat and saw all the problems these customers were having with the legacy products. Needless to say, the big SIEM vendors weren’t pleased, but we had enough sales and momentum to go it alone. Also, we knew from experience how much better our next-gen platform was. When we launched our SIEM, we already had customers committed."


Is it time to automate politicians?


A robot could take over every politician’s favourite task of cutting ribbons to inaugurate new buildings. We already cede decision-making responsibility on health and finances to algorithms, why not with voting? An automated democracy could replace both politicians and ballot boxes. That may be extreme. Yet comical though it sounds, parts of our politics have already been technified. Consider reach. Both Narendra Modi, India’s prime minister, and the French presidential candidate Jean-Luc Mélenchon beamed holograms of themselves to speak to several groups of thousands of people simultaneously. Next, there’s the message. In America’s 2016 election, candidates used social-media advertising to target different voters with different messages. The growing automation of our government is no longer sci-fi. Instead, it’s a reality we are only beginning to grasp. So to the question, can we replace politicians with robots? The answer is a soft yes.


Cyber Resilience: Where do you rank?


It is imperative for businesses to be proactive rather than reactive when it comes to cyber security. Businesses need to ensure every part of their enterprise is protected, including from employees as well as contractors and supply chain partners. A small vulnerability can be hugely detrimental to a business's cyber security. An effective method of preventing cyber attacks is to develop a culture of resilience within a business. Cyber security should not be the exclusive domain of the IT department; it has companywide consequences, and it should be the responsibility of the C-suite to drive a cyber-safe culture within their organisations. As cyber crime presents itself in a variety of forms, businesses can combat the risk of a cyber attack by implementing staff training to spot potential cyber attacks, like phishing emails, establishing password strength and change requirements, and mandating software updates and data back-ups to secure their data, all while restricting what data employees can access and share.


How artificial intelligence is transforming the financial ecosystem

Artificial intelligence is fundamentally changing the physics of financial services. It is weakening the bonds that have held together the component parts of incumbent financial institutions, opening the door to entirely new operating models and ushering in a new set of competitive dynamics that will reward institutions focused on the scale and sophistication of data much more than the scale or complexity of capital. A clear vision of the future financial landscape will be critical to good strategic and governance decisions as financial institutions around the world face growing competitive pressure to make major strategic investments in AI and policy makers seek to navigate the challenging regulatory and social uncertainties emerging globally. Building on the World Economic Forum’s past work on disruptive innovation in financial services, this report provides a comprehensive exploration of the impact of AI on financial services.


Predictive Analytics: The Future of Financial Marketing


As we move from traditional analytics to predictive analytics, we can leverage new technology to deliver marketing messages to customers. Beyond direct mail, email, and even digital marketing, new touchpoints, such as chatbots, and voice-first interactive assistants will provide new ways to engage with a consumer. “Artificial intelligence (AI) that is fueled by predictive analytics, machine learning, and natural language processing will be the brains behind the face,” states Aite Group. Predictive analytics is the future of financial institution marketing, predicting when a consumer will experience a life event or need a financial service solution. This advanced form of needs analysis, once only available to the largest organizations, is now financially and operationally available to organizations of all sizes. The combination of predictive analytic tools and advanced digital delivery options can guide the customer to the best financial solution at the most opportune time … sometimes before the consumer even realizes they have a need.


New DevOps Study Offers A Reality Check for Financial Services

Despite reason for optimism, there are still potential hurdles to making the most of DevOps practices. Among those whose businesses have already migrated to DevOps, 71 percent of respondents claimed to have experienced challenges. In 26 percent of cases, IT leaders found that the operating teams were limiting the transition to DevOps. Another 26 percent reported difficulty due to a management structure lacking clear business objectives, which made defining a DevOps strategy difficult. The survey’s results paint a clear picture: organizations need a unified approach in order to meet their DevOps goals. Speaking about areas where financial services organizations should focus, Hayes-Warren stated that IT leaders should lead the drive to automate processes and applications by leveraging the cloud. Furthermore, companies need to focus on their culture to ensure they’re adopting organization-wide practices that integrate members of their IT departments.


Will Machine Learning AI Make Human Translators An Endangered Species?


Training a neural machine to translate between languages requires nothing more than feeding a large quantity of material, in whichever languages you want to translate between, into the neural net algorithms. To adapt to this rapid transformation, One Hour Translation has developed tools and services designed to distinguish between the different translation services available, and pick the best one for any particular translation task. "For example, for travel and tourism, one service could be great at translating from German to English, but not so good at Japanese. Another could be great at French but poor at German. So we built an index to help the industry and our customers. We can say, in real time, which is the best engine to use, for any type of material, in any language." This work – comparing the quality of NMT-generated translations – gives a clue as to how human translators could see their jobs transforming in coming years.
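
The engine-selection index described in the quote can be pictured as a lookup keyed by content type and language pair. A deliberately simplified sketch, with engine names and scores invented for illustration:

```python
# Quality scores per (content type, language pair); figures are invented.
quality_index = {
    ("travel", ("de", "en")): {"engine_a": 0.91, "engine_b": 0.78},
    ("travel", ("ja", "en")): {"engine_a": 0.62, "engine_b": 0.88},
    ("legal",  ("fr", "en")): {"engine_a": 0.85, "engine_b": 0.70},
}

def best_engine(domain, src, dst):
    """Pick the highest-scoring engine for this material, in real time."""
    scores = quality_index[(domain, (src, dst))]
    return max(scores, key=scores.get)

print(best_engine("travel", "ja", "en"))   # engine_b wins for Japanese travel
```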


Fintech Without Borders: Regulators Consult on Global Financial Innovation Network

The GFIN Regulators are encouraging responses from “innovative financial services firms, financial services regulators, technology companies, technology providers, trade bodies, accelerators, academia, consumer groups and other stakeholders keen on being part of the development of the GFIN.” Firms should submit responses to GFIN@fca.org.uk. Feedback submitted to this email address will be shared among the GFIN Regulators unless a firm specifically states otherwise. Alternatively, firms may provide feedback or arrange to discuss the Consultation with one or more particular GFIN Regulators. Contact details for these purposes are provided in the Consultation. A regulatory sandbox is a platform for firms to test innovative new products, services or business models on a limited scale before a full launch. A number of national regulators including the FCA, the HKMA and the MAS have developed such sandboxes.


How ‘Similar-Solution’ Information Sharing Reduces Risk at the Network Perimeter


Even when information is shared, it’s typically between identical solutions deployed across various sites within a company. While this represents a good first step, there is still plenty of room for improvement. Let us consider the physical security solutions found at a bank as an analogy for cybersecurity solutions. A robber enters a bank. Cameras didn’t detect the intruder wearing casual clothes or anything identifying him or her as a criminal. The intruder goes to the first teller and asks for money. The teller closes the window. Next, the robber moves to a second window, demanding money and that teller closes the window. The robber moves to the third window, and so on until all available windows are closed. Is this the most effective security strategy? Wouldn’t it make more sense if the bank had a unified solution that shared information and shut down all of the windows after the first attempt?
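
The fix the bank analogy argues for is a shared indicator feed: the first control that spots the intruder publishes what it learned, and every other control "closes its window" at once. A minimal, hypothetical in-process sketch of that idea:

```python
# One detection, shared everywhere: a tiny in-process "indicator bus".
class IndicatorBus:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, control):
        self.subscribers.append(control)

    def publish(self, indicator):
        # In the bank analogy, the first alarm closes every window at once.
        for control in self.subscribers:
            control.block(indicator)

class SecurityControl:
    def __init__(self, name):
        self.name, self.blocklist = name, set()

    def block(self, indicator):
        self.blocklist.add(indicator)
        print(f"{self.name}: now blocking {indicator}")

bus = IndicatorBus()
for name in ("firewall", "web-gateway", "email-gateway"):
    bus.subscribe(SecurityControl(name))

# The first control that sees the attacker shares what it learned.
bus.publish("203.0.113.7")
```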



Quote for the day:


"Added pressure and responsibility should not change one's leadership style, it should merely expose that which already exists." -- Mark W. Boyer


Daily Tech Digest - August 25, 2018


Biometrics aren't just being used at border control. Sydney Airport has announced it's teaming up with Qantas, Australia's largest airline, to use facial recognition to simplify the departure process. Under a new trial, passengers on select Qantas international flights can have their face and passport scanned at a kiosk when they check in. From then on, they won't need to present their passport to Qantas staff -- they'll be able to simply scan their face at a kiosk when they drop off luggage, enter the lounge and board their flight at the gate. Travellers will still need to go through regular airport security and official immigration processing, but all of their dealings with Qantas can be handled with facial recognition. "Your face will be your passport and your boarding pass at every step of the process," Geoff Culbert, Sydney Airport CEO, said of the new development. 


Google just gave control over data center cooling to an AI


Now, Google says, it has effectively handed control to the algorithm, which is managing cooling at several of its data centers all by itself. “It’s the first time that an autonomous industrial control system will be deployed at this scale, to the best of our knowledge,” says Mustafa Suleyman, head of applied AI at DeepMind, the London-based artificial-intelligence company Google acquired in 2014. The project demonstrates the potential for artificial intelligence to manage infrastructure—and shows how advanced AI systems can work in collaboration with humans. Although the algorithm runs independently, a person manages it and can intervene if it seems to be doing something too risky. The algorithm exploits a technique known as reinforcement learning, which learns through trial and error. The same approach led to AlphaGo, the DeepMind program that vanquished human players of the board game Go.
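
DeepMind's system itself is proprietary, but the trial-and-error loop of reinforcement learning can be shown with a toy, one-step (bandit-style) example; the states, actions, and reward numbers below are all invented for illustration.

```python
import random

# Toy reinforcement learning: choose a cooling setpoint (action) for a
# discretized load level (state). Reward is negative energy use, with a
# heavy penalty for under-cooling a heavy load.
states, actions = range(3), range(3)      # load levels; setpoint choices
q = {(s, a): 0.0 for s in states for a in actions}
alpha, epsilon = 0.1, 0.2                 # learning rate, exploration rate

def reward(state, action):
    energy = float(action)                # more cooling costs more energy
    too_hot = (state - action) > 1        # heavy load, too little cooling
    return -energy - (10.0 if too_hot else 0.0)

for _ in range(5000):                     # learn through trial and error
    s = random.choice(states)
    if random.random() < epsilon:
        a = random.choice(actions)        # explore
    else:
        a = max(actions, key=lambda x: q[(s, x)])  # exploit what was learned
    q[(s, a)] += alpha * (reward(s, a) - q[(s, a)])

for s in states:
    best = max(actions, key=lambda x: q[(s, x)])
    print(f"load level {s}: learned setpoint {best}")
```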


Overlook 5G security at your peril


Attacks can come in many different shapes and sizes: user malware, fraudulent calls, spam, viruses, data and identity theft, and denial of service, to name a few. The rise in security threats is partly due to the growing deployment of carrier Wi-Fi access infrastructure and small cells in public areas, offices and homes, and will increase exponentially with M2M. Historically, carrier-grade telecom networks have had an excellent record for user and network security; however, today’s communications infrastructure is far more vulnerable than its predecessors. And with security threats constantly evolving, service providers must invest in the right tools to keep on top of the issue. Much of this increased risk stems from the move to the IP-centric LTE architecture: the flatter design exposed 4G networks because there are fewer hops between the access edge and the core network, and this will continue to be an issue with 5G networks.


Companies lack leadership capabilities for digital transformation projects

Percentage of organizations believing they have the required capabilities
Yet, even after years of exponential growth in the digital and digital-consulting arenas, new Capgemini research shows that the implementation of digital transformation projects is still in its nascent stages. According to the responses of more than 1,300 business leaders from some 750 organisations, only a relatively small number of companies have the digital (39%) and managerial (35%) capabilities needed to make their digital transformation successful. That these figures remain below 50% is surprising in itself; what is even more striking is that, compared with exactly the same measurement six years ago, there has actually been a decline in firms’ general readiness for digital transformation. Capgemini found that organisations today feel less equipped with the right leadership skills: 35% in 2018, down from 45% in 2012. According to Vincent Fokke, Chief Technology Officer at Capgemini in the Benelux, this is an important point to note.


AI and Robots: Not What You Think


Depending on what you read – and choose to believe about what you read – either AI-driven robots can autonomously decide what work gets done, how it gets done and who does it, or there are decades of work ahead before we see a material impact. Personally, I think we’re somewhere in the middle, as manufacturers – pragmatists that they are – design and implement manufacturing strategies in a very deliberate way to meet business requirements, then focus ongoing efforts on making key processes better and better. And I think that collaborative robots (cobots) will play a larger and larger role in accelerating progress. The AI that cobots possess makes them much more than machines for dirty, dull and dangerous work. So let the world watch and wait for artificial intelligence that will enable wholesale change in how we drive, care for our aged, teach our children and more. Manufacturers don’t have to wait for artificial-intelligence-driven robots to help them make their operations better.


Serverless vs. Containers

Debate about serverless vs. containers often starts with control, or the lack thereof in the case of serverless. This is not new. In fact, I clearly remember the same debates around control when AWS was starting to gain traction way back in 2009. Now, nearly 10 years later, the dust has settled on that original debate, but we have failed to learn our lesson. It's human nature to want control, but how much are you willing to pay for it? Do you know the total cost of ownership (TCO) you will take on? The ability to control your own infrastructure comes with a lot of responsibilities. To take on these responsibilities, you need to have the relevant skill sets in your organization. That means salaries (easily the biggest expense in most organizations), agency fees, and time taken away from your engineers and managers for recruitment and onboarding. Given the TCO involved, the goal of having that control has to be to optimize for something (for example, to achieve predictable performance for a business-critical workflow), not to have control for its own sake.
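
One way to frame that question is a back-of-envelope TCO comparison, sketched below in Python. Every figure and function name is a hypothetical placeholder chosen to show the shape of the calculation, not real salaries or cloud pricing.

```python
# Back-of-envelope TCO sketch for "control vs. managed" -- all figures are
# hypothetical placeholders illustrating the calculation, not benchmarks.

def self_managed_tco(engineers: int, salary: float, infra: float) -> float:
    """Owning the infrastructure: people costs usually dominate."""
    hiring_overhead = 0.15 * engineers * salary   # recruiting, onboarding
    return engineers * salary + hiring_overhead + infra

def serverless_tco(requests_m: float, cost_per_m: float,
                   ops_fte: float, salary: float) -> float:
    """Paying per use: less ops headcount, metered compute instead."""
    return requests_m * cost_per_m + ops_fte * salary

print(self_managed_tco(engineers=3, salary=150_000, infra=60_000))
print(serverless_tco(requests_m=500, cost_per_m=20.0,
                     ops_fte=0.5, salary=150_000))
```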


The Evolution of Internet of Things: New Business Models Need Interoperability


The rate of connected-device growth often cited by Gartner, Deloitte and others is predicated on the proliferation of data, the effect that growth will have on existing businesses, and the number of new businesses that will be created. But if the current trend of siloed, single-use-case IoT solutions continues, these predictions for connected-device growth may not be realised. Open APIs between product and service providers are the key technology for resolving this issue. ... That is simply too expensive and time-consuming, particularly for smaller businesses, to maintain. Managing the connection to even a single partner results in maintenance costs that a business with tight margins might not find viable. Gartner has predicted that 75 per cent of IoT projects will take twice the time allocated, because of the increasing complexity associated with developing this connectivity. So what is the solution?
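
To sketch what an open API between a device maker and a service partner could look like, here is a minimal, hypothetical REST endpoint using Flask. The route, payload fields and token check are illustrative assumptions, not any vendor's published interface.

```python
# Minimal sketch of an open device-data API -- hypothetical route, fields,
# and auth; the point is a stable, documented interface partners can reuse.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

# Stand-in for a device registry; a real provider would back this with a DB.
DEVICES = {
    "sensor-42": {"type": "temperature", "value_c": 21.5,
                  "updated": "2018-08-25T09:00:00Z"},
}

@app.route("/v1/devices/<device_id>/readings")
def readings(device_id):
    # Partners authenticate with a bearer token instead of bespoke plumbing.
    if request.headers.get("Authorization") != "Bearer demo-partner-token":
        abort(401)
    device = DEVICES.get(device_id)
    if device is None:
        abort(404)
    return jsonify(device)

if __name__ == "__main__":
    app.run(port=8080)
```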


Streamlining Data Science and Analytics Workflows for Maximum ROI

Despite the multitude of tasks associated with the data science position, its basic workflow (in terms of analytics) is readily codified into three steps. The first is data preparation, or data wrangling, where the data scientist starts with raw data and “just tries to make sense of it before they’re doing anything real with it,” Mintz explains. “Then there’s the actual model building, when they’re building a machine learning model. Assuming they find something valuable, there’s getting that insight back into the hands of the people who can use it to make the business run better.” Typically, data scientists approach building a new analytics solution for a specific business problem by accessing raw data from what might be a plethora of sources. Next, they engage in a lengthy process to prepare the data for consumption. “So much time and energy goes into that,” says Mintz. “You look at the surveys of data scientists and they say 70-80% of my time goes to data cleaning.”
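
A minimal pandas/scikit-learn sketch of those three steps might look like the following; the file name, column names and model choice are hypothetical placeholders, not Mintz's actual pipeline.

```python
# Sketch of the three-step workflow: wrangle, model, deliver.
# File name, columns, and model are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# 1. Data preparation: the part surveys say eats 70-80% of the time.
df = pd.read_csv("customers.csv")          # hypothetical raw extract
df = df.dropna(subset=["spend", "tenure_months", "churned"])
df["spend_per_month"] = df["spend"] / df["tenure_months"].clip(lower=1)

# 2. Model building.
X = df[["spend_per_month", "tenure_months"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# 3. Delivery: hand scored records back to the people who can act on them.
df["churn_risk"] = model.predict_proba(X)[:, 1]
df.sort_values("churn_risk", ascending=False).head(20).to_csv(
    "at_risk_customers.csv", index=False)
```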


Stranded and in Need of Rescue: Your Enterprise Data


In today’s enterprise, it is still very common for data to be stored disparately across any number of locations and systems. Getting to a single version of the truth is virtually impossible with siloed data, and different areas of the business act and operate in different ways, depending on which version of the truth they subscribe to or have access to. In fact, in an upcoming report due later this month, IDC names data silos as the number one challenge for Digital Transformation (DX). That is because isolated data leads inexorably to isolated working practices, and those are the antithesis of an integrated strategy, which is what DX is all about: integration. No wonder, then, that figures such as those produced by Harvard Business Review and Forbes show that nearly two thirds of DX initiatives are failing. Optimal use of information is a Critical Success Factor for today’s enterprise.


Network technologies are changing faster than we can manage them

Data breaches and user experience are the two biggest network worries. About 33 percent of network professionals said a data breach worries them the most about their network. Given the almost daily data breaches, who can blame them? In an ideal world, network managers would like to see tools that combine network and security management. However, only about 40 percent of respondents said their organization was using the same stack of tools to manage both network performance and security. Network pros are also being overwhelmed by the huge proliferation of cloud and network management tools, and many organizations are trying combinations of tools to manage the challenge. Network traffic analytics appears to be the most commonly used, with just over 28 percent of network professionals using it to manage their network challenges.



Quote for the day:


"If you don’t have some self doubts and fears when you pursue a dream, then you haven’t dreamed big enough." -- Joe Vitale