Daily Tech Digest - October 04, 2018


We have to describe the world as it is for us to gain useful insights. Sure, we might then use those insights to convert that reality to how it ought to be, but our incoming information, plus its processing, has to be morally blind. There is quite a movement out there to insist that all algorithms, all AIs, must be audited. That there can be no black boxes – we must know the internal logic and information structures of everything. This is so we can audit them to ensure that none of the conscious or unconscious failings of thought and prejudice that humans are prey to are built into them. But, as above, this fails on one ground – that we humans are prey to such things. Thus a description of, or calculation about, a world inhabited by humans must at least acknowledge, if not incorporate, such prejudices. Otherwise the results coming out of the system aren’t going to be about this world, are they?



Understanding Spring Reactive: Introducing Spring WebFlux


With the introduction of Servlet 3.1, Spring MVC could achieve non-blocking behavior. But because the Servlet API contains several interfaces that are still blocking (perhaps for backward compatibility), there was always the chance of accidentally using a blocking API in an application that was intended to be non-blocking. In such scenarios, the use of a blocking API will certainly bring down the application sooner or later. ... The purpose of this series is to demonstrate the evolution of the Servlet/Spring stack from the blocking to the non-blocking paradigm. I am not going into the details of Spring WebFlux in this tutorial, but I will introduce a sample Spring Boot application using Spring WebFlux. One point to notice in the diagram above is that Spring WebFlux is Servlet-container agnostic: it works on Servlet containers and also on Netty through the Reactor Netty project. In my Spring Boot application, I have a dependency on WebFlux as spring-boot-starter-webflux, and at server startup it reports that the application is ready with Netty.
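
As a rough sketch of what such an application looks like (my illustration, not the article's sample app; the route and handler are hypothetical), a functional WebFlux endpoint can be declared like this, with spring-boot-starter-webflux pulling in Reactor Netty as the default server:

```java
// Minimal Spring WebFlux application, assuming the spring-boot-starter-webflux
// dependency; by default it starts on Netty rather than a Servlet container.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.http.MediaType;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.ServerResponse;
import reactor.core.publisher.Mono;

import static org.springframework.web.reactive.function.server.RequestPredicates.GET;
import static org.springframework.web.reactive.function.server.RouterFunctions.route;
import static org.springframework.web.reactive.function.server.ServerResponse.ok;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        // With WebFlux on the classpath, the startup log reports Netty as the server.
        SpringApplication.run(DemoApplication.class, args);
    }

    // Functional, non-blocking route: the handler returns a Mono instead of
    // blocking the calling thread while the response body is produced.
    @Bean
    public RouterFunction<ServerResponse> routes() {
        return route(GET("/hello"),
                request -> ok().contentType(MediaType.TEXT_PLAIN)
                               .body(Mono.just("Hello, reactive world!"), String.class));
    }
}
```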


Asking the right questions to define government’s role in cybersecurity

Cyberthreats cross national boundaries, with victims in one jurisdiction and perpetrators in another—often among nations that don’t agree on a common philosophy of governing the internet. And complicating it all, definitions of criminal offences vary across jurisdictions, legal assistance arrangements are too slow, and operating models for day-to-day policing are optimized for crimes committed by local offenders. ... Each country is addressing the challenge in its own way, just as companies tackle the issue individually. Approaches vary even among leading countries identified by the Global Cybersecurity Index, an initiative of the United Nations International Telecommunication Union. Differences typically reflect political and legal philosophy, federal or national government structures, and how far government powers are devolved to state or local authorities. They also reflect public awareness and how broadly countries define national security—as well as technical capabilities among policy makers.


Iron Ox uses AI and robots to grow 30 times more produce than traditional farms


Iron Ox’s first 1,000-square-foot farm, which is in full production as of this week, taps a robotic arm equipped with a camera and computer vision systems that can analyze plants at sub-millimeter scale and execute tasks like planting and seeding. A 1,000-pound mobile transport system roughly the size of a car, meanwhile, delivers harvested produce — including leafy greens such as romaine, butterhead, and kale and herbs like basil, cilantro, and chives — using sensors and collision avoidance systems “similar to that of a self-driving car.” Cloud-hosted software acts as a sort of brain for the system, ingesting data from embedded sensors and using artificial intelligence (AI) to detect pests, forecast diseases, and “ensure cohesion across all parts.” It might sound like pricey tech, but Alexander and company said they worked to keep costs down by using off-the-shelf parts and implementing a scalable transport system.


From Visibility To Vision: Staying Competitive In An Open Banking Future


One of the reasons the digital experiences of established banks remain so lackluster is a failure by both customers and employees to report instances of slow or faulty systems. Across the board there is a growing apathy and acceptance of poorly performing technology, creating a self-perpetuating cycle of unsatisfied users. The first step in rectifying this problem is to give power and visibility back to the IT team and the business by providing them with system monitoring solutions that can quantify “normal” behavior as a benchmark for identifying deviations, so they can truly measure the user’s experience. These solutions would effectively bypass the reliance on the end user to report issues and instead focus on creating more agile capabilities to proactively identify and rectify areas of degrading performance. Once IT departments are equipped with an intelligent and proactive infrastructure, banks can effectively compete by delivering digital services that offer a superior customer experience.


Everyone, everywhere is responsible for IIoT cyber security


Cyber security threats are coming at us from every direction, not just from our corporate networks. Operational networks were simply not built for connectivity, and carefully thought-out security protocols are being ignored for the benefit of data access to drive productivity gains. Unfortunately, threat vectors now extend even to base-level assets. Attackers can target anything from a connected thermostat to a wireless field device in order to cause danger. This heralds a new type of aggressive, innovative cyber attack for industrial control systems, which are becoming increasingly accessible over the internet, often inadvertently. The actors, too, have changed, and they are becoming more sophisticated every day. Attack techniques, tools and lessons are readily available on the dark web, which means low-level cyber criminals have access to the information they need to attempt more serious attacks.


How updating an outdated industrial control system can work with fog computing

According to fog computing and automation startup Nebbiolo Technologies – which declined to name the client directly, saying only that it’s a “global” company – the failure of one of those Windows IPCs (industrial PCs) could result in up to six hours of downtime for said client. The client wanted that time cut down to minutes. It’s a tricky issue. If those 9,000 machines were all in a data center, you could simply virtualize the whole thing and call it a day, according to Nebbiolo’s vice president of product management, Hugo Vliegen. But it's a heterogeneous environment, with the aging computers running critical control applications for the production lines – their connections to the equipment can't simply be abstracted into the cloud or a data center. Architecturally, however, the system is a bit simpler. Sure, there are a lot of computers, but they’re all managed remotely. The chief problem is visibility and failover, Vliegen said. “If they fail, they’re looking at six hours downtime,” he said on Tuesday in a presentation at the Fog World Congress in San Francisco.


5 mistakes even the best organizations make with product and customer data

“In 2018, digital business transformation will be played out at scale, sparking shifts in organizational structure, operating models, and technology platforms. CEOs will expect their CIOs to lead digital efforts by orchestrating the enabling technologies, closing the digital skills gap, and linking arms with CMOs and other executive peers better positioned to address the transformational issues across business silos.”  The need to address these business silos has been a key driver in the growth of master data management (MDM). MDM integrates multiple disparate systems across organizations by streamlining the process of aggregating and consolidating information about products, customers, suppliers, employees, assets and reference data from multiple sources and formats. It connects that information to derive actionable insights and publishes it to backend systems as well as online and offline channels.


Codefirst: The Future of UI Design


If you look at your laptop, tablet, or mobile phone today, you’ll notice that the latest craze to sweep the industry is flat design. Flat design was a dramatic departure from Apple’s ubiquitous skeuomorphism style to one that celebrated minimalism. This trend boasted a UI that leveraged simplicity, flat surfaces, cleaner edges, and understated graphics. The flat design trend evidences a shift within the industry to make designs scale across many different form factors. Websites, on the other hand, have incorporated polygonal shapes, simple geometric layers, and bold lines that grab the audience’s attention. Tactile designs have also grown in popularity in recent months. This design trend makes objects appear hyper-real. Beyond these current trends, there are many examples of websites without borders, without multiple layers, with purposeful animation, and large images. Going forward, you can undoubtedly expect the bar to be raised within the app and web world to ensure that both UI and UX work seamlessly together to improve user interactions.


Incorporate NIST security and virtualization recommendations


The main goal of following these NIST virtualization recommendations is to ensure the secure execution of the platform's baseline functions. These recommendations primarily target cloud service providers that offer infrastructure as a service and enterprise IT teams planning to implement virtual infrastructures to host line-of-business applications. According to NIST, hypervisor platforms are susceptible to security threats via three primary channels: the enterprise network where the hypervisor host resides, rogue or compromised VMs accessing virtualized resources, and web interfaces for the platform's management services and consoles. NIST breaks down the hypervisor platform into the following five baseline functions: VM process isolation (HY-BF1), device mediation and access control (HY-BF2), direct command execution from guest VMs (HY-BF3), VM lifecycle management (HY-BF4), and hypervisor platform management (HY-BF5).



Quote for the day:


"Great Leaders Focus On Sustainable Success Rather Than Quicker Wins." -- Gordon TredGold


Daily Tech Digest - October 03, 2018

The problem with many of the standard metrics is that they fail to take into account how different groups might have different distributions of risk. In particular, if there are people who are very low risk or very high risk, then it can throw off these measures in a way that doesn't actually change what the fair decision should be. ... The upshot is that if you end up enforcing or trying to enforce one of these measures, if you try to equalize false positive rates, or you try to equalize some other classification parity metric, you can end up hurting both the group you're trying to protect and any other groups for which you might be changing the policy. ... A layman's definition of calibration would be, if an algorithm gives a risk score—maybe it gives a score from one to 10, and one is very low risk and 10 is very high risk—calibration says the scores should mean the same thing for different groups. We basically say in our paper that calibration is necessary for fairness, but it's not good enough. Just because your scores are calibrated doesn't mean you aren't doing something funny that could be harming certain groups.
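
As a toy illustration of that definition (mine, not the researchers'): calibration can be spot-checked by bucketing people on their score and comparing the observed outcome rate per bucket across groups. The groups, scores, and outcomes below are made up:

```java
import java.util.List;

// Toy calibration check: for each (group, score) bucket, the observed rate of
// the outcome should be roughly equal across groups at the same score.
public class CalibrationCheck {

    record Person(String group, int score, boolean outcome) {}

    public static void main(String[] args) {
        List<Person> people = List.of(
                new Person("A", 8, true), new Person("A", 8, true), new Person("A", 8, false),
                new Person("B", 8, true), new Person("B", 8, false), new Person("B", 8, false));

        // Observed outcome rate per group among people who received score 8.
        for (String group : List.of("A", "B")) {
            double rate = people.stream()
                    .filter(p -> p.group().equals(group) && p.score() == 8)
                    .mapToDouble(p -> p.outcome() ? 1.0 : 0.0)
                    .average().orElse(Double.NaN);
            System.out.printf("group %s, score 8: observed rate %.2f%n", group, rate);
        }
        // Calibrated scores would show similar rates for A and B; as the
        // interview notes, equal rates alone still would not guarantee fairness.
    }
}
```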


Here’s a solution to the AI talent shortage: Recruit philosophy students
Who would have thought it? If schools and universities are going to help create a generation that is equipped to support the AI revolution, they might be better off teaching philosophy and psychology. Sport might be a good analogy. If you are trying to hire talent, you might be better off hiring staff while they are young, grabbing them from school or university as part of placements perhaps, an approach Melanie Oldham explains in this piece. It is an approach that sports clubs are fully versed in — football teams with their academies and talent scouts, scouring the playing fields on a Saturday morning. It often works out as a more effective approach than getting the cheque book out and buying players after they emerge. But for Rinku Singh and Dinesh Patel the route to stardom in baseball was not conventional. They joined the American baseball world after entering a talent contest in India. It was an unorthodox recruitment process made famous by the movie ‘Million Dollar Arm.’



What Is Deep Learning AI? A Simple Guide With 8 Practical Examples


It encompasses machine learning, where machines can learn by experience and acquire skills without human involvement. Deep learning is a subset of machine learning where artificial neural networks, algorithms inspired by the human brain, learn from large amounts of data. Similarly to how we learn from experience, the deep learning algorithm would perform a task repeatedly, each time tweaking it a little to improve the outcome. We refer to ‘deep learning’ because the neural networks have various (deep) layers that enable learning. Just about any problem that requires “thought” to figure out is a problem deep learning can learn to solve. The amount of data we generate every day is staggering—currently estimated at 2.6 quintillion bytes—and it’s the resource that makes deep learning possible. Since deep-learning algorithms require a ton of data to learn from, this increase in data creation is one reason that deep learning capabilities have grown in recent years.
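
That "perform a task repeatedly, tweaking a little each time" loop is just iterative optimisation. A deliberately tiny sketch (my example, a single weight fitted by gradient descent, nothing like a real deep network):

```java
// Minimal gradient-descent loop: repeat the task, measure the error, and
// nudge the weight a little in the direction that reduces it.
public class TinyLearner {
    public static void main(String[] args) {
        double[] xs = {1, 2, 3, 4};
        double[] ys = {2, 4, 6, 8};   // true relationship: y = 2x
        double w = 0.0;               // the single "weight" we are learning
        double lr = 0.05;             // learning rate: how big each tweak is

        for (int epoch = 0; epoch < 100; epoch++) {
            double grad = 0;
            for (int i = 0; i < xs.length; i++) {
                double pred = w * xs[i];
                grad += 2 * (pred - ys[i]) * xs[i]; // d(squared error)/dw
            }
            w -= lr * grad / xs.length;             // the small tweak
        }
        System.out.printf("learned w = %.3f (target 2.0)%n", w);
    }
}
```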


A CIO forges a data strategy plan for creating actionable data


Information that you don't think is relevant right now can change in value. So wherever we can put a hook to preserve information for the future, we'll do that. Even if we don't take all the content and turn it into actionable data, we may take that data and leave it unstructured. We always like to leave that door open if there's information that the client has but can't think of a business case to use right now. ... It's a way of representing information -- subject, predicate, object. You start with metadata: You pull the information out about the data you're working with. Say I'm working with a journal article, so who is the author? What college did the author go to? That's just raw data. Now you want to relate that to other data. You have this author who attended this university and got this degree. Now you have not just three pieces of data, you have three related pieces of information that give you much more context.
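
A minimal sketch of the subject-predicate-object idea (illustrative identifiers, not the actual system described in the interview):

```java
import java.util.List;

// Subject-predicate-object triples: three raw facts become related,
// queryable information once they share identifiers.
public class TripleDemo {

    record Triple(String subject, String predicate, String object) {}

    public static void main(String[] args) {
        List<Triple> graph = List.of(
                new Triple("article:42", "hasAuthor", "person:jsmith"),
                new Triple("person:jsmith", "attended", "university:stateU"),
                new Triple("person:jsmith", "holdsDegree", "PhD"));

        // "Which university did the author of article:42 attend?"
        graph.stream()
             .filter(t -> t.subject().equals("article:42") && t.predicate().equals("hasAuthor"))
             .flatMap(a -> graph.stream()
                     .filter(t -> t.subject().equals(a.object()) && t.predicate().equals("attended")))
             .forEach(t -> System.out.println(t.object())); // prints university:stateU
    }
}
```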


Facebook Breach: Single Sign-On of Doom

"Due to the proliferation of SSO, user accounts in identity providers are now keys to the kingdom and pose a massive security risk. If such an account is compromised, attackers can gain control of the user's accounts in numerous other web services," according to "O Single Sign-Off, Where Art Thou?," a recently published report into "single sign-on account hijacking and session management on the web" authored by five researchers at the University of Illinois at Chicago. In the case of the Facebook breach, for example, its SSO system could have been used for a range of other sites, including its own Instagram, as well as Tinder, Spotify and others. "Our study on the top 1 million websites according to Alexa found that 6.3 percent of websites support SSO. This highlights the scale of the threat, as attackers can gain access to a massive number of web services," the researchers say. ... "Another very critical yet overlooked problem is that the stolen tokens can be used to obtain access to a user's account on other websites that support Facebook SSO *even if the user doesn't use Facebook SSO* to access them," he adds. "This depends on third-party implementations."


Augmented reality, fog, and vision: Duke professor outlines importance of smart architectures

Some of the trade-offs, she said, are already fairly well-known. For instance, many tasks that aren’t terribly demanding from a compute or network perspective are best accomplished at the edge, but the advantages in terms of latency are outweighed by the cloud’s more potent computing capabilities for more complex tasks. “When the task is small, the response time is dominated by the communication time, and the communication time is much smaller for edge systems,” she said. “Once you talk about larger tasks, however, there are more resources in the cloud, so computing time becomes more of a component in response time and the cloud connection will be faster than the edge.” “We also noted that connections to the cloud are much faster in on-campus conditions than they are in nearby residential areas, and this is well-known – connections from campuses to the cloud are optimized.” It’s an important point for academic researchers, she noted. Testing systems in areas that might not have a university laboratory’s optimized network connections yields results that are much more applicable to the real-world challenges faced by businesses.
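
Her observation reduces to a simple model: response time is roughly communication time plus compute time, and which term dominates depends on task size. A back-of-the-envelope sketch with invented numbers:

```java
// Toy response-time model: edge wins on small tasks (latency-dominated),
// cloud wins on large tasks (compute-dominated). All numbers are illustrative.
public class EdgeVsCloud {
    public static void main(String[] args) {
        double edgeRtt = 0.005, cloudRtt = 0.050;   // seconds per round trip
        double edgeOps = 1e9, cloudOps = 1e11;      // operations per second

        for (double taskOps : new double[]{1e6, 1e9, 1e12}) {
            double edge  = edgeRtt  + taskOps / edgeOps;
            double cloud = cloudRtt + taskOps / cloudOps;
            System.out.printf("task %.0e ops -> edge %.3fs, cloud %.3fs (%s wins)%n",
                    taskOps, edge, cloud, edge < cloud ? "edge" : "cloud");
        }
    }
}
```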


Achieving the right balance of data privacy and IT security


A comprehensive data protection strategy must consider the integration of best practices to both security and privacy. Data integrity, retention, and availability are part of the overall data protection goal for an organization, and as such, they are tied directly to individuals’ rights as data subjects. ... Privacy cannot exist without security, but security can exist without privacy – not an ideal situation for anyone concerned. With the continued advance of technology, organizations and individuals must continue to increase awareness and knowledge of data protection, data threats, and the steps required to ensure security and privacy while still maintaining effective business practices and relatable social media interactions. The way to develop a resilient privacy and data protection program is to combine privacy- and security-related thinking into a common approach that makes it easier for employees in all organizational levels to do the right thing. As we continue to move forward in the data-driven world, we must view ourselves as data subjects and strive to attain an agile balance between security and privacy interests.


New details released on Huawei's intent-based network


The new S7530-HI and S6720-HI are fully programmable Ethernet switches based on Huawei's silicon Ethernet Network Processor. The custom application-specific integrated circuit delivers advanced features and is complemented with merchant silicon for standard functions. One of the unique attributes of this intent-based network line is it includes an integrated wireless controller for unified wired and wireless network management. The S7530-HI is equipped with all Gigabit Ethernet ports, and the S6720-HI has 100 Gigabit Ethernet uplinks. That makes the S6720-HI the first programmable, fixed form-factor switch with uplinks of that speed. These switches target the campus network and are designed to work with Huawei's wireless access points, which are ready for the internet of things, because they support a range of wireless protocols, including Bluetooth, Zigbee and radio frequency ID.


How Bank of England is using Splunk for proactive security


The bank is using Splunk to move away from a reactive SOC that only responds to known threats, and is now working towards being more proactive – or, as Pagett calls it, SOC 2.0. “The proactive model is around getting in lots of data and then what we call behavioural profiling or adversary modelling,” he says. “We try to model what our attackers might do from a behavioural point of view, and then we look for those behaviours.” Pagett says hackers can change the technology and techniques they use, but it is difficult for them to change their behaviour, making this the easiest way to spot when an attack is about to happen or is under way. The bank uses Splunk to mine the datasets needed to begin predicting these shifts in behaviour. This could range from a large number of failed password attempts to something more sophisticated, such as a spear-phishing attack with booby-trapped Microsoft Word attachments.
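
In Splunk this kind of rule would be expressed as a search over indexed events; the underlying logic is just thresholding a behaviour over a time window. A language-agnostic sketch of that logic (the window and threshold are hypothetical, not the Bank of England's actual rules):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Behavioural profiling in miniature: count failed logins per user within a
// time window and flag users whose behaviour exceeds a threshold.
public class FailedLoginDetector {

    record Event(String user, long epochSeconds, boolean failed) {}

    public static void main(String[] args) {
        List<Event> events = List.of(
                new Event("alice", 100, true), new Event("alice", 130, true),
                new Event("alice", 150, true), new Event("bob", 200, false));

        long windowSeconds = 300;   // 5-minute window (hypothetical)
        int threshold = 3;          // flag at 3+ failures (hypothetical)
        long now = 360;

        Map<String, Long> failures = new HashMap<>();
        for (Event e : events) {
            if (e.failed() && now - e.epochSeconds() <= windowSeconds) {
                failures.merge(e.user(), 1L, Long::sum);
            }
        }
        failures.forEach((user, count) -> {
            if (count >= threshold) {
                System.out.println("ALERT: " + user + " had " + count + " failed logins");
            }
        });
    }
}
```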


IT pros see security win with Microsoft Managed Desktop

Microsoft administrators said they see a clear value to this managed service -- which could potentially remove some tedious aspects of desktop management -- in an age when most users prefer physical devices. "We have folks spread across the country, so we have to wait for a shipment of laptops, and then image them and get them set up for the users," said David Bussey, systems engineer at the nonprofit Public Company Accounting Oversight Board in Washington, D.C. "What [Microsoft Managed Desktop] has to offer fits some of those pain points we're going through." Microsoft Managed Desktop allows businesses to choose two- or three-year hardware refresh cycles from a list of available devices. Right now, that list is limited to Microsoft's own Surface hardware -- specifically the Surface Laptop, Surface Pro and Surface Book 2. It plans to expand device offerings with third-party partnerships, the company said.



Quote for the day:


"Scientific knowledge is an enabling power to do either good or bad - but it does not carry instructions on how to use it." -- Richard Feynman


Daily Tech Digest - October 02, 2018

SIE Europe
SIE Europe is co-founded by three international Internet luminaries: Dr. Paul Vixie, Chairman and CEO of Farsight Security; Christoph Fischer, CEO of BFK edv-consulting GmbH; and Peter Kruse, co-founder of CSIS Security Group A/S. “We founded SIE Europe to build a European-based community of Internet defenders who want to make the Internet safer for all users. As part of this initiative, SIE Europe will provide the infrastructure to collect, aggregate and share real-time DNS data in strict compliance with the privacy laws and regulations of the European Union, including the General Data Protection Regulation (GDPR),” said Dr. Paul Vixie, Chairman and CEO of Farsight Security. All online transactions, good or bad, begin with the DNS. By providing visibility into the IP addresses, domain names and other digital artifacts of the DNS used by threat actors, security professionals will be able to accurately identify and map criminal infrastructures in their networks and take preventive measures to protect their networks from future cybercrime activity.



Facebook could face up to $1.6bn fine for data breach


Facebook said the attack exploited the “complex interaction of multiple issues in our code” and stemmed from a change made to the video uploading feature in July 2017. In response, Facebook said it had fixed the vulnerability, informed law enforcement and reset the access tokens of the almost 50 million accounts known to be affected. “We’re also taking the precautionary step of resetting access tokens for another 40 million accounts that have been subject to a “View As” look-up in the last year. As a result, around 90 million people will now have to log back in to Facebook, or any of their apps that use Facebook Login,” said Facebook. The company has also turned off the “View As” feature while it conducts a security review, but admitted it has yet to determine whether accounts were misused or any information accessed. Facebook said it is also still trying to establish the location and identity of the attackers and will reset the access tokens of any other accounts it believes may have been affected.


The CTO role: ‘It’s about planning and business opportunities’

Every CTO role is different, and in this case Hanson focuses on the sales side of the business, whereas other CTOs are more concerned with the development of products. “We have some very intelligent people in our product management division who look after the actual development of products. So I’m not on the product side. I’m more on the sales side,” confirms Hanson. His responsibility centres on finding out how Informatica’s prospects and customers use the company’s technology. He needs to understand their challenges and their governance and compliance issues moving forward, as well as the pressures in their marketplace and how they need to leverage data to be successful and competitive. “It’s really my job to try and collect that information, and think about innovative uses for our products as they currently exist, and what type of initiatives we should try and help our prospects and customers with,” explains Hanson.


Big Data: changing the future of business models

The ability to analyse and make informed decisions from the use of data and its analytical capabilities is vital if a business is to succeed. In an increasingly competitive industry, it is imperative that firms are able to make quick and increasingly complex decisions to cater for the changing demands from customers and evolving market conditions.  By harnessing data, businesses can identify new opportunities within their existing business operations, create more efficient operations, increase profitability and improve customer service. By embracing data, businesses can gain a competitive edge over their rivals, ensuring they don’t lag behind the competition. Over the years, our data team has worked alongside businesses to help them find data-driven solutions and technologies with the aim of fast tracking their objectives and stimulating growth.


How I Lost My Faith in Private Blockchains

The business and legal worlds operate from an aspect of centralized entities, and while that remains the case, any forced attempts at decentralization are likely to fall short. While it is possible that in the future we may see decentralized businesses, they are far more likely to come from the public blockchain world, where they are able to grow organically in an entirely new paradigm. In the meantime, institutions and individuals should be evaluating permissioned blockchains like any other technology: it isn't magic, and it should be assessed as one would assess any other. The benefits of a technology should never be assumed based on buzzwords, hype or fear that "everyone else is doing it so why shouldn't I?" Instead, benefits should be assessed by asking what is the business problem, what are the different technology options available, and what are the quantifiable costs and benefits of each.


LinkedIn the latest to introduce its own server designs

The idea behind the designs is to reduce the amount of work it takes to deploy servers in a data center. Again, this seems to assume people will build their own the way LinkedIn and other hyperscalers do it. It’s all designed to be like building with Lego bricks. LinkedIn also wanted to standardize hardware across both primary and edge data centers, which is likely why Vapor IO is involved. Edge locations don’t have a readily available technician, so if a company sends a technician to an edge container, the last thing it wants to do is make the tech waste time trying to figure out the layout of the equipment. By having common hardware between the two, the technician will work with familiar gear. LinkedIn claims these designs will mean being able to build infrastructure for 1 percent of the cost and six to ten times faster integration time, with greater power efficiency and other cost savings. However, it does not address the issue of IT staff building the hardware. LinkedIn, Google, Facebook, etc., can afford to hire engineers who build servers all day. Your average IT shop does not.


This is how cyber attackers stole £2.26m from Tesco Bank customers

The attackers most likely used an algorithm which generated authentic Tesco Bank debit card numbers and, using those virtual cards, they attempted to make thousands of unauthorised debit card transactions. The FCA said Tesco Bank's failures include the way in which the bank distributed debit card numbers and mistakes made in the reaction to the attack, which meant that no action was taken for almost a day after the incident was first uncovered. A number of deficiencies in the way Tesco Bank handled security left customers vulnerable to cyber attackers in an incident that was "largely avoidable", said the FCA analysis of the incident, which Tesco Bank had to this point been tight-lipped about -- to the frustration of other financial institutions. Poor design of Tesco Bank debit cards played a significant role in creating security vulnerabilities that led to thousands of customers having their accounts emptied. One of these involved the PANs -- the 16-digit primary account numbers used to identify debit cards.
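
For context on why predictably issued card numbers are guessable: the last digit of a PAN is a Luhn checksum (ISO/IEC 7812) over the preceding digits, so an attacker who can guess an issuer's number ranges only needs to enumerate candidates that pass this public check. A standard Luhn validator:

```java
// Standard Luhn check (ISO/IEC 7812): the last digit of a PAN is a checksum
// over the others, so structurally valid numbers are easy to enumerate.
public class Luhn {
    static boolean isValid(String pan) {
        int sum = 0;
        boolean doubleIt = false;
        for (int i = pan.length() - 1; i >= 0; i--) {
            int d = pan.charAt(i) - '0';
            if (doubleIt) {
                d *= 2;
                if (d > 9) d -= 9;  // equivalent to summing the two digits
            }
            sum += d;
            doubleIt = !doubleIt;   // double every second digit from the right
        }
        return sum % 10 == 0;
    }

    public static void main(String[] args) {
        System.out.println(isValid("4539578763621486")); // true: checksum passes
        System.out.println(isValid("4539578763621487")); // false: checksum fails
    }
}
```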


Google Chrome 70 is coming. Are your security certificates in order?

For those unfamiliar with the details of this, in 2017 Google and Mozilla decided to deprecate all Symantec-issued digital certificates based on their assessment that Symantec did not correctly validate its SSL certificates prior to issuing them to customers. Google and Mozilla then decided to put in place a multi-step plan to distrust any certificates issued from the Symantec PKI. This plan phased out Symantec certificates over the next year and a half. Instead of following the Google plan, Symantec elected to sell its certificate business to DigiCert. Despite the transaction, the requirement to replace all certificates issued from the Symantec PKI remained intact, requiring millions of certificates to be replaced during 2018. To assist customers in replacing their certificates, DigiCert contacted each certificate holder, offering free replacement certificates chained to the trusted DigiCert roots. The first major distrust date was on December 1, 2017, when no additional TLS certificates could be issued through the Symantec PKI. Prior to that date, DigiCert cut over all issuance processes to its PKI and validation systems.


Open Compute Project eyes European enterprise adoption with Experience Centre opening


The OCP’s championing of 21-inch server rack designs is often cited as a partial barrier to enterprise adoption of its technologies, as it makes it potentially harder for users to deploy the technology in existing datacentres where smaller server racks are the norm. The centre’s opening is being overseen by datacentre infrastructure manufacturer Rittal and OCP supplier and service provider Circle B, in conjunction with Switch Datacenters, which is in the midst of building a datacentre based on OCP principles. “The three companies have determined that in the technology sector, IT managers at large enterprises and governments in the ... “These principles form the basis on which many hyperscalers operate. By adopting OCP designs in their datacentres, large enterprises and governments can benefit from the same advantages as the hyperscalers: cost reductions, lower energy usage and much more flexibility.”


Building Agile Data Lakes with Robust Ingestion and Transformation Frameworks – Part 1


With the advent of Big Data technologies like Hadoop, there has been a major disruption in the information management industry. The excitement around it is not only about the three Vs – volume, velocity and variety – of data but also the ability to provide a single platform to serve all data needs across an organization. This single platform is called the Data Lake. The goal of a data lake initiative is to ingest data from all known systems within an enterprise and store it in this central platform to meet enterprise-wide analytical needs. However, a few years back Gartner warned that a large percentage of data lake initiatives have failed or will fail - becoming more of a data swamp than a data lake. How do we prevent this? We have teamed up with one of our partners, Clarity Insights, to discuss the data challenges enterprises face, what caused data lakes to become swamps, discuss the characteristics of a robust data ingestion framework and how it can help make the data lake more agile.



Quote for the day:


"One measure of leadership is the caliber of people who choose to follow you." -- Dennis A. Peer


Daily Tech Digest - October 01, 2018

Drone defense -- powered by IoT -- is now a thing

What exactly constitutes a “malicious” drone isn’t entirely clear, but it could range from teenagers using a small drone to peek over a fence to the kinds of military drones that are being used as weapons in several areas around the world. And, in fact, the companies cite “military bases, venues, cities, enterprises, correctional facilities, and more” as potential customers. Further, DroneTracker, Dedrone’s “airspace security platform,” is designed to detect a wide variety of drones, the company notes, “including commercial, consumer and military-grade, as well as autonomous drones.” Dedrone leverages IoT sensor data to detect, classify, mitigate, and localize drone-based threats, the company says, while AT&T provides the LTE connectivity. Once a drone threat is detected, Dedrone notifies security personnel. ... But is the threat really that great for most business and industrial applications? I’m still not sure whether knowing that AT&T and Dedrone — along with many others, I’m sure — are on the case makes me feel safer or more vulnerable.



Understanding Risks to Data Drives Controls Efficiencies

Business professionals and IT practitioners agree that data are a valuable commodity for enterprises in many ways. The notion of using data to help monitor and manage risk tolerances in audit and assurance activities often is overlooked. Data should be considered and analyzed as the enterprise selects, plans and deploys controls, and should also be part of enterprise evaluation of the performance of those controls. This recently was highlighted by ISACA, which has put forth new guidance in partnership with SecurityScorecard titled Continuous Assurance Using Data Threat Modeling. In collaboration with industry experts, practitioners and ISACA subject matter experts, the guidance provides an excellent overview on how to adapt threat modeling to data in transit and data at rest as a strategy to put forth a more holistic, comprehensive and continuous model for understanding data risk and for analyzing potential risk in the supply chain.


Network security challenges remain a top concern for IT pros


More than one-third of respondents ranked network security challenges as their top concern when planning, deploying and managing enterprise networks. As mobile devices continue to expand and redefine the network edge, network security challenges remain a top issue, the study found. Additionally, 83% of respondents identified several types of network and telecom fraud as serious issues. More than half of IT pros cited identity fraud as a primary concern in relation to real-time communications. The expansion of communications channels -- such as voice, email, video, chat and in-app communications -- also affects network complexity, deployment, management and what defines the network edge.  To combat network security challenges and improve control, IT pros said they would consider emerging technologies, such as biometrics, artificial intelligence and blockchain. The survey, dubbed Enterprise Networks in Transition: Taming the Chaos, also highlighted software-defined WAN as a technology that could help enterprise networks evolve. Yet, according to the survey, North America lags other regions in software-defined networking deployments.


Is Blockchain a Universal Platform?

In order to be a responsible prosumer with a micro-grid dedicated to full renewable energy use, blockchain allows you to monitor the exchange of energy from the point of creation – via a solar panel, for example – to its consumption, not just in your home but in another prosumer’s home. Within this digital ledger, energy use can be monitored and maintained in such a way that community members are actively engaged with their utilities in a way that benefits the community as a whole. It would be completely ridiculous to suggest that the insurance industry is an emerging market – in fact, it is the largest market in the world, with a staggering $1.2 trillion in revenue. Despite this position, insurance is caught in a slog deeply rooted in traditional practices. Blockchain can be used to create sub-markets within the industry: peer-to-peer insurance, which cuts out the middlemen and provides greater portions of premiums to the policy holder; and parametric insurance, which uses a smart contract to automatically pay twenty percent of any type of claim


Ransomware Crypto-Locks Port of San Diego IT Systems

The attacker or group of hackers behind the attempted shakedown has also demanded a ransom, payable in bitcoin, in exchange for the promise of a decryption key, port officials say. The port says that while IT systems have been disrupted, much of the port's business continues without interruption. "It is important to note that this is mainly an administrative issue and normal port operations are continuing as usual," says Port of San Diego CEO Randa Coniglio in a statement. "The port remains open, public safety operations are ongoing, and ships and boats continue to access the bay without impacts from the cybersecurity incident." The Port of San Diego - spanning the cities of Chula Vista, Coronado, Imperial Beach, National City and San Diego along the 34 miles of the San Diego Bay - is the fourth largest of California's 11 ports. It includes two maritime cargo terminals, two cruise ship terminals, 22 public parks, the Harbor Police Department and leases for hundreds of businesses, including 17 hotels, 74 restaurants and three retail centers, plus museums and bay tours.


The Right Diagnosis: A Cybersecurity Perspective

No security program is perfect, but some need more attention than others. What are the checkpoints that will help organizations understand where their security programs are ailing, how to make the right diagnosis, and begin the proper treatment? ... Just as the brain controls how the body functions, the leadership of a security organization controls how that organization functions. When looking to evaluate and understand where a security program stands, one of the first diagnostics should be focused on leadership. Do security leaders have a clear vision? Do they have a solid strategy? Are they focused on the right goals and priorities? Do they have the right plan to make their strategy a reality? Do they have the ear of the executives, the board, and other stakeholders? Are they building the right team? ... Security operations could be considered the central function of a security program, analogous to its heartbeat. Just as a healthy, regular heartbeat is critical to the health of the body, a healthy security operations program is critical to the health of a security organization. Is the security operations team properly trained?


Why 5G will disappoint everyone

The wireless carriers hope 5G will enable them to compete with or replace ISPs, cable companies, and satellite internet and TV companies. So that’s nice. But it will probably be more than 15 years before 5G replaces 4G for most users most of the time. 5G won’t be reliable enough anytime soon for companies such as Apple and Samsung to remove the supercomputer-like processing power from smartphones and move everything to the cloud. I’m afraid that $1,000-plus smartphones are here to stay. And because of the way 5G works, rollouts will soon face another huge hurdle. ... The technology comes with a requirement that towers be far greater in number and far closer to users. Some residents in North Potomac say more than 60 5G wireless towers have been installed less than 30 feet from their front doors. It’s possible that definitive, widely accepted proof may emerge that clearly shows a health risk from 5G wireless equipment. It’s likely that debate over the health effects will continue. But it’s not even remotely conceivable that everybody will agree that 5G is harmless.


How the 'human and machine' model will transform customer service

The good news is that automated interactions will only become more tailored and efficient in the future as more businesses turn to AI in order to understand conversations in any language, automate repetitive processes and solve customer problems faster than the competition. Ultimately, it’s not a matter of AI and automation replacing customer support agents but rather enabling them to become ‘super agents’. And, the advantages that these ‘super agents’ pose for business growth are undeniable - from visibility (AI knows everything your users are doing, your customer support team does not), to sheer productivity (AI doesn’t need to sleep, eat or take time off). Inevitably, the decision to add automation to the customer service mix requires smart decisions and a solid understanding as to where and how automation can achieve cost savings while always fostering better, more personalized customer experiences.


Digital transformation in 2019: The big insights and trends

On average, most organizations believe that half of their revenue will come from digital channels by 2020. Furthermore, the World Economic Forum estimates that the overall economic value of digital transformation to business and society will top $100 trillion by 2025. Other similar data are easy to find. These represent vital macroeconomic trends and the most significant attainable new business potential for the typical enterprise. Any way you look at it, the largest growth opportunity that most organizations can access now is to better seize the white space in these rapidly expanding digital markets. The latest trends in digital transformation for next year reflect some particularly hard-won lessons from the past few years, on both the business and technology sides. It's worthwhile taking the time to understand how these insights came about, as organizations earlier in the journey can avoid making many of the same painful, expensive, and time-consuming realizations along the way. As they say, one useful definition of 'smart' is not making all the mistakes oneself.


The Future of Brain Science

Fortunately for neuroscience, mathematicians, data scientists, and computer scientists have been wrestling with their own “information overload” challenges, coping with exponential increases in the volume, variety, and velocity of digital data spawned by the Moore’s Law revolution in digital technology. Google, for instance, ingests unimaginable volumes of data every second, data that it must somehow “monetize” (make money from, because its services are largely “free”) by precisely targeting digital advertisements to people who use Google search or Gmail. Google can only do this with the aid of massive cloud computing systems running complex math and AI algorithms that quickly recognize patterns and act upon these insights to serve up ads in real time. One branch of AI, called “cognitive computing,” holds particular promise for extending Nicolelis' work to humans. Cognitive computing, which goes well beyond simple pattern recognition, achieves deep understanding of the underlying causes of complex patterns, instead of simple recognition that patterns exist.



Quote for the day:


"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has." -- Margaret Mead


Daily Tech Digest - September 30, 2018

How to successfully implement an AI system

Companies should calculate the anticipated cost savings that would be gained with a successful AI deployment, using that as a starting point for investment so that the costs of errors or shortfalls on expectations are minimised if they occur. The cost savings should be based on efficiency gains, as well as the increased productivity that can be harnessed in other areas of the business by freeing up staff from administration tasks. This ensures companies do not over-invest at the beginning before seeing initial results, and if changes are necessary they do not cannibalise potential ROI and companies can still switch to other viable alternative use cases. Before advising companies on what solution they should invest in, it's important to first establish what they want to achieve. Digital colleagues can provide a far superior level of customer service; however, they require greater resource to set up. Most chatbots are not scalable: once deployed, they cannot be integrated into other business areas as they are designed to answer FAQs based on a static set of rules. Unlike digital colleagues, they cannot understand complex questions or perform several tasks at once.


How adidas is Creating a Digital Experience That's Premium, Connected, and Personalized

Take something like a product description. How do we really have the product descriptions and offerings so that if you're interested in sports we will help you find exactly the product that you need for the sport that you're interested in? We will also educate you and bring you back at different points in time to help you find out what you need when you need it, or with an engagement program. Ultimately, like the membership program, it has something that's sticky, something you can give back to; even more, you can participate in events and experiences. For us, a lot of it’s really deepening those experiences but also exploring new technologies and new areas. Omnichannel was kind of the original wave which happened and I said it was the freight train that came past us a couple of years ago. Now we're also looking at what those next freight trains are, whether it's technologies like blockchain or experimenting with picking up a new channel. For example, we're working extensively with Salesforce on automation, how we can automate consumer experiences.


What Deep Learning Can Offer to Businesses


With the capabilities of artificial intelligence, the way words are processed and interpreted can be changed dramatically. It turns out we can define the meaning of a word based on its position in the text without the need to use a dictionary. ... One of the most recent successful applications of deep learning for image recognition came from the Large Scale Visual Recognition Challenge, when Alex Krizhevsky applied convolutional neural networks to organize images from ImageNet, a dataset containing 1.2 million pictures, into 1,000 different classes. In 2012, Krizhevsky’s network, AlexNet, achieved a top-5 test error rate of 15.3%, outperforming traditional computer vision solutions by more than 10 percentage points of accuracy. The experience of Alex Krizhevsky changed the landscape of the data science and artificial intelligence field from both a research and a business-application perspective. In 2012, AlexNet was the only deep learning model at ILSVRC (the ImageNet Large Scale Visual Recognition Competition). Two years later, in 2014, there were no conventional computer vision solutions among the winners.



Can Global Semantic Context Improve Neural Language Models?

Global co-occurrence count methods like LSM lead to word representations that can be considered genuine semantic embeddings, because they expose statistical information that captures semantic concepts conveyed within entire documents. In contrast, typical prediction-based solutions using neural networks only encapsulate semantic relationships to the extent that they manifest themselves within a local window centered around each word (which is all that’s used in the prediction). Thus, the embeddings that result from such solutions have inherently limited expressive power when it comes to global semantic information. Despite this limitation, researchers are increasingly adopting neural network-based embeddings. Continuous bag-of-words and skip-gram (linear) models, in particular, are popular because of their ability to convey word analogies of the type “king is to queen as man is to woman.”
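
That analogy property is literally vector arithmetic: vec(king) minus vec(man) plus vec(woman) lands nearest vec(queen). A toy sketch with hand-made three-dimensional vectors (real embeddings are learned and have hundreds of dimensions):

```java
import java.util.Map;

// Toy word-analogy arithmetic: king - man + woman ≈ queen.
// Hand-crafted 3-d vectors; real embeddings are learned from data.
public class Analogy {
    static final Map<String, double[]> VECS = Map.of(
            "king",  new double[]{0.9,  0.9, 0.3},
            "queen", new double[]{0.9, -0.9, 0.3},
            "man",   new double[]{0.1,  0.9, 0.8},
            "woman", new double[]{0.1, -0.9, 0.8});

    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    public static void main(String[] args) {
        double[] k = VECS.get("king"), m = VECS.get("man"), w = VECS.get("woman");
        double[] target = new double[k.length];
        for (int i = 0; i < k.length; i++) target[i] = k[i] - m[i] + w[i];

        // Which word is closest to king - man + woman? "queen" scores highest.
        VECS.forEach((word, v) ->
                System.out.printf("%-6s cosine %.3f%n", word, cosine(target, v)));
    }
}
```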


Big Data and Machine Learning Won’t Save Us from Another Financial Crisis


Machine learning can be very effective at short-term prediction, using the data and markets we have encountered. But machine learning is not so good at inference, learning from data about underlying science and market mechanisms. Our understanding of markets is still incomplete. And big data itself may not help, as my Harvard colleague Xiao-Li Meng has recently shown in “Statistical Paradises and Paradoxes in Big Data.” Suppose we want to estimate a property of a large population, for example, the percentage of Trump voters in the U.S. in November 2016. How well we can do this depends on three quantities: the amount of data (the more the better); the variability of the property of interest (if everyone is a Trump voter, the problem is easy); and the quality of the data. Data quality depends on the correlation between the voting intention of a person and whether that person is included in the dataset. If Trump voters are less likely to be included, for example, that may bias the analysis.
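
Meng's data-quality point is easy to simulate: when inclusion in the dataset correlates with the property being measured, even a very large sample gives a badly biased estimate. A small simulation with invented response rates:

```java
import java.util.Random;

// Meng's data-quality point in miniature: a large but biased sample can be
// worse than a small random one. The true support rate is 50%, but
// supporters are under-sampled.
public class BiasedSample {
    public static void main(String[] args) {
        Random rng = new Random(42);
        int population = 1_000_000;
        long sampled = 0, sampledSupporters = 0;

        for (int i = 0; i < population; i++) {
            boolean supporter = rng.nextDouble() < 0.5;    // true rate: 50%
            // Inclusion correlates with the property being measured:
            // supporters respond at 5%, non-supporters at 10%.
            double pInclude = supporter ? 0.05 : 0.10;
            if (rng.nextDouble() < pInclude) {
                sampled++;
                if (supporter) sampledSupporters++;
            }
        }
        System.out.printf("sample size %d, estimated rate %.3f (true 0.500)%n",
                sampled, (double) sampledSupporters / sampled);
        // Prints roughly 0.333: ~75,000 respondents, still far from the truth.
    }
}
```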


Spending on cognitive and AI systems to reach $77.6 billion in 2022

Banking and retail will be the two industries making the largest investments in cognitive/AI systems in 2018 with each industry expected to spend more than $4.0 billion this year. Banking will devote more than half of its spending to automated threat intelligence and prevention systems and fraud analysis and investigation while retail will focus on automated customer service agents and expert shopping advisors & product recommendations. Beyond banking and retail, discrete manufacturing, healthcare providers, and process manufacturing will also make considerable investments in cognitive/AI systems this year. The industries that are expected to experience the fastest growth on cognitive/AI spending are personal and consumer services (44.5% CAGR) and federal/central government (43.5% CAGR). Retail will move into the top position by the end of the forecast with a five-year CAGR of 40.7%. On a geographic basis, the United States will deliver more than 60% of all spending on cognitive/AI systems throughout the forecast, led by the retail and banking industries.


5 ways industrial AI is revolutionizing manufacturing

In manufacturing, ongoing maintenance of production line machinery and equipment represents a major expense, having a crucial impact on the bottom line of any asset-reliant production operation. Moreover, studies show that unplanned downtime costs manufacturers an estimated $50 billion annually, and that asset failure is the cause of 42 percent of this unplanned downtime. For this reason, predictive maintenance has become a must-have solution for manufacturers who have much to gain from being able to predict the next failure of a part, machine or system. Predictive maintenance uses advanced AI algorithms in the form of machine learning and artificial neural networks to formulate predictions regarding asset malfunction. This allows for drastic reductions in costly unplanned downtime, as well as for extending the Remaining Useful Life (RUL) of production machines and equipment. In cases where maintenance is unavoidable, technicians are briefed ahead of time on which components need inspection and which tools and methods to use, resulting in very focused repairs that are scheduled in advance.
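
At its simplest, the prediction step is anomaly scoring over sensor streams; production systems learn the normal band with the neural-network methods mentioned above rather than hard-coding it. A toy sketch with hypothetical vibration readings:

```java
// Toy predictive-maintenance signal: flag a machine when a sensor reading
// drifts several standard deviations from its recent rolling mean.
public class VibrationMonitor {
    public static void main(String[] args) {
        double[] vibration = {1.0, 1.1, 0.9, 1.0, 1.2, 1.1, 1.0, 2.7}; // mm/s, hypothetical
        int window = 5;

        for (int i = window; i < vibration.length; i++) {
            // Rolling mean and standard deviation over the previous readings.
            double mean = 0, var = 0;
            for (int j = i - window; j < i; j++) mean += vibration[j];
            mean /= window;
            for (int j = i - window; j < i; j++) var += Math.pow(vibration[j] - mean, 2);
            double std = Math.sqrt(var / window);

            double z = (vibration[i] - mean) / std;
            if (z > 3) {
                System.out.printf("reading %d: %.1f mm/s is %.1f sigma above normal "
                        + "-> schedule inspection%n", i, vibration[i], z);
            }
        }
    }
}
```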


Data Centers Must Move from Reducing Energy to Controlling Water

While it is a positive development that overall energy for data centers is being reduced around the globe, a key component that has — for the most part — been washed over is water usage. One example of this is the continued use of open-cell towers. They take advantage of evaporative cooling to cool the air with water before it goes into the data center. And while this solution reduces energy, the water usage is very high. Raising the issue of water reduction is the first step in creating ways our industry can do something about it. As we experience the continued deluge of the “Internet of Things”—projected to exceed 20 billion devices by 2020, we will only be able to ride this wave if we keep energy low and start reducing water usage. The first question becomes how can cooling systems reject heat more efficiently? Let’s say heat is coming off the server at 100 degrees Fahrenheit. The idea is to efficiently capture heat and bring it to the atmosphere as close to that temperature as possible — but it is all dependent on the absorption system.


AI and Automation to Have Far Greater Effect on Human Jobs by 2022

With the domination of automation in a business framework, the workforce can be extended to new productivity-enhancing roles. More than a quarter of surveyed businesses expect automation to lead to the creation of new roles in their enterprise. Apart from allotting contractors more task-specialized work, businesses plan to engage workers in a more flexible manner, utilizing remote staffing beyond physical offices and decentralization of operations. Among all, AI adoption has taken the lead in terms of automation for the reduction of time and investment in end-to-end processes. “Currently, AI is the most rapidly growing technology and will for sure create a new era of the modern world. It is the next revolution- relieving humans not only from physical work but also mental efforts and simplifies tasks extensively,” opined Kuppa. While human-performed tasks dominate today’s work environment, the frontier is expected to change in the coming years.


Modeling Uncertainty With Reactive DDD

Reactive is a big thing these days, and I'll explain later why it's gaining a lot of traction. What I think is really interesting is that the way DDD was used or implemented, say back in 2003, is quite different from the way that we use DDD today. If you've read my red book, Implementing Domain-Driven Design, you're probably familiar with the fact that the bounded contexts that I model in the book are separate processes, with separate deployments. Whereas, in Evans' blue book, bounded contexts were separated logically, but sometimes deployed in the same deployment unit, perhaps in a web server or an application server. In our modern day use of DDD, I’m seeing more people adopting DDD because it aligns with having separate deployments, such as in microservices. One thing to keep clear is that the essence of Domain-Driven Design is really still what it always was -- it's modeling a ubiquitous language in a bounded context. So, what is a bounded context? Basically, the idea behind bounded context is to put a clear delineation between one model and another model.



Quote for the day:


"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks


Daily Tech Digest - September 29, 2018

Optimizing Multi-Cloud, Cross-DC Web Apps and Sites

Latency, payload, caching and rendering are the key measures when evaluating website performance. Each round trip is subject to the connection latency, and the time from when the user requests the webpage to when its resources finish downloading in the browser is directly related to the weight of the page and its resources. The larger the total content size, the longer it takes to download everything needed for the page to become functional for the user. Using caching and default caching headers may reduce latency, since less content is downloaded and fewer round trips may be needed to fetch resources, although some round trips may still be required to validate that cached content is not stale. Browsers also need to render the HTML page and the resources served to them. Client-side work can cause poor rendering at the browser and a degraded user experience; for example, blocking calls (say, third-party ads) or improper rendering of page resources can delay page load time and hurt the user experience.
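
Those measures combine into a rough lower bound: load time is approximately round trips times latency plus payload divided by bandwidth, which is why both latency and page weight matter. A back-of-the-envelope sketch with illustrative numbers:

```java
// Rough page-load lower bound: round trips are paid in latency, bytes are
// paid in bandwidth. All numbers are illustrative, not measurements.
public class PageLoadEstimate {
    public static void main(String[] args) {
        double rttSeconds = 0.08;            // one round trip on a mobile link
        int roundTrips = 6;                  // DNS, TCP, TLS, HTML, CSS/JS, images
        double payloadBytes = 2_500_000;     // a ~2.5 MB page
        double bandwidth = 5_000_000 / 8.0;  // 5 Mbit/s in bytes per second

        double latencyCost = roundTrips * rttSeconds;
        double transferCost = payloadBytes / bandwidth;
        System.out.printf("latency %.2fs + transfer %.2fs = ~%.2fs to usable%n",
                latencyCost, transferCost, latencyCost + transferCost);
        // Caching removes bytes (and sometimes whole round trips); a cached
        // resource may still cost a validation round trip if it is stale.
    }
}
```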


Lessons from the UK Government's Digital Transformation Journey

So many lessons! Some of my colleagues set out to document the higher-level lessons; the result was an entire book, Digital Transformation at Scale: Why the Strategy Is Delivery, but there's a huge amount more that couldn't be included there. Top of the list is the importance of remaining focused on your purpose and your users' needs. As technologists and agilists, we can too easily be drawn into improving technology or simplifying processes without stepping back and asking why we have those things in the first place, or whether the change we're making is the right one. I've talked to a lot of teams in large organisations who have taken all the right steps in moving to agile but are still having trouble motivating their teams, and the missing piece is almost always being exposed directly to your users. Whether they're end customers or internal users, there's nothing like seeing people use your products to motivate the team to make them better.


MissingLink.ai launches to streamline the deep learning lifecycle

MissingLink.ai has launched this week to streamline and automate the entire deep learning life cycle for data scientists and engineers. “Work on MissingLink began in 2016, when my colleagues Shay Erlichmen [CTO], Rahav Lussato [lead developer], and I set out to solve a problem we experienced as software engineers. While working on deep learning projects at our previous company, we realized we were spending too much time managing the sheer volume of data we were collecting and analyzing, and too little time learning from it,” Yosi Taguri, CEO of MissingLink, wrote in a post. “We also realized we weren’t alone. As engineers, we knew there must be a more efficient solution, so we decided to build it. Around that time, we were joined by Joe Salomon [VP of product], and MissingLink was born.” The team decided to focus on machine learning and deep learning because of the potential to “impact our lives in profound ways.” Machine learning has already been used for detecting diseases, in autonomous vehicles, and in public safety situations, according to the company.


Big data architecture: Navigating the complexity

First, there are the many different engines you might choose to run with your big data. You could choose Splunk to analyze log files, Hadoop for large-file batch processing, or Spark for data stream processing. Each of these specialized big data engines requires its own data universe, and ultimately the data from these universes must come together, which is where the DBA is called in to do the stitching. But that's not all. Organizations are now mixing and matching on-premises and cloud-based big data processing and data storage. In many cases, they are using multiple cloud vendors as well. Once again, data and intelligence from these various repositories must be blended together at some point, as the business requires. "This is a system integration problem that vendors need to help their clients solve," said Anoop Dawar, SVP of product management and marketing for MapR, a converged data platform for big data. "You have to not only be able to provide a platform for all of the different big data processing engines and data stores that are out there, but you must also be able to rapidly provide access to new big data processing engines and data stores as they emerge."


Key Difference Between The Cloud And The Data Center

While the purpose of both is the same (the storage, management, and maintenance of data), there is an evident architectural difference between them. The first key difference is that a data center is land-based and in-house, with a physical setup and a physical presence of IT professionals working together as a team. A cloud, on the other hand, is more like a virtual store with no physical presence of its own: it depends on the internet and is accessible to the user only over the internet. There is also a notable difference in the security each offers. Understandably, cloud computing is less secure than a data center, since the latter is an in-house setup whose operator is liable for protecting your data, whereas cloud computing is internet-based, which puts you at greater risk of data leaks and privacy-invasion threats. Moreover, with cloud computing you are responsible for your own security, because the third-party operator of the cloud is not liable for your data.


5 Easy Ways To Determine If Your Company Needs Blockchain


The purest form of blockchain is in tracking and authenticating a digital asset (music, movies, digital wallets, education certifications, mortgage contracts, and so on) with digital transactions logged against it. Blockchains can also track and authenticate physical assets (gold, organic food, artwork, manufactured parts, and such), though those assets can require checkpoints considered “off-chain.” In such cases, you’ll need trusted sources in your business network to audit and authenticate the physical asset, which can be tricky. Consider a notorious example from the aerospace industry. Some argue that well before the Challenger space shuttle disaster in 1986, some parties knew that the spacecraft’s O-ring seals contained a flaw, but this design and manufacturing problem wasn’t addressed properly. What if an aerospace industry blockchain had been tracking the origin, specification, materials, and testing of that part, along with any known problems? The part could have been used only once its integrity and its required tests had been confirmed by many trusted participants.
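As a hypothetical sketch (not from the article) of the "confirmed by many trusted participants" rule, an N-of-M attestation check over a part record could look like the following; the record fields, the threshold, and all names are invented for illustration:

```python
# Minimal sketch of an N-of-M attestation rule for a tracked physical part.
from dataclasses import dataclass, field

@dataclass
class PartRecord:
    part_id: str
    spec: str
    attestations: set = field(default_factory=set)  # distinct auditor IDs

    def attest(self, auditor_id: str) -> None:
        """A trusted off-chain party vouches for the part's integrity."""
        self.attestations.add(auditor_id)

    def approved(self, required: int = 3) -> bool:
        """The part may be used only after enough distinct auditors attest."""
        return len(self.attestations) >= required

part = PartRecord(part_id="part-001", spec="field-joint seal")
part.attest("auditor-A")
part.attest("auditor-B")
assert not part.approved()   # two attestations are not enough
part.attest("auditor-C")
assert part.approved()       # quorum reached; the part may now be used
```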


Axon Conference Panel: Why Should We Use Microservices?

For Schrijver, it's all about scalability. In terms of teams, it's the ability to have multiple teams work on one product; in terms of operations, it's the ability to independently scale different parts of a system. He thinks that if you build a microservices system the right way, you can have almost unlimited horizontal scalability. Buijze pointed out that, technically, it doesn't matter whether we work with monoliths or microservices; in theory you can scale out a monolith just as well as microservices. What microservices give us is a strong and explicit boundary around every service. Although architects draw limits for communication between components, we as developers are good at ignoring them: if it's technically possible to communicate directly with another component, we will do that, ignoring any rules the architects have set up. Keeping those boundaries intact is much easier when they are explicit, and even more so when a component is managed by another team.


The rise of open source use across state and local government

A simple solution for agencies looking to defend against open source vulnerabilities is to turn to enterprise open source providers. Enterprise-ready solutions undergo rigorous testing to ensure that any defect is detected, prevented, or addressed in a timely manner, mitigating an agency's risk. Further, enterprise solutions protect government networks from these risks throughout the product lifecycle by ensuring the code is up to date, secure, and functioning as expected. Investing in future-oriented, enterprise open source solutions can also lower the total cost of ownership, because agencies can sidestep the costly and painful vendor lock-in that comes with proprietary software. Instead, enterprise open source lets users run software that is platform agnostic, leaving the agency free to make the hardware, operating system, and environment decisions that are optimal for its requirements and mission. At the end of the day, an enterprise open source solution provides government users with the best of both worlds.


Crowdstrike CTO on securing the endpoint and responding to a breach

The first was that a modern security platform had to be built as a cloud-native solution. The cloud was critical not just for ease of management and rapid agent rollouts, but also for protecting off-premises assets and workloads deployed in public and hybrid clouds. The cloud would also be used to dramatically reduce the performance impact an endpoint agent has on a system, since heavy processing work would be offloaded to elastically scalable cloud compute. Finally, the cloud could leverage the power of crowdsourcing: the collection of trillions of security-related events from endpoint agents deployed all over the world, learning from every adversary action and taking away attackers' ability to reuse tradecraft as they launch attacks against new victims. The second principle was to leverage machine learning and artificial intelligence to predictively identify new threats by training algorithms on the largest dataset in the security industry: over a trillion events collected every single week by CrowdStrike Falcon agents protecting organisations in 176 countries.


What is Blockchain Technology? A Step-by-Step Guide For Beginners

Information held on a blockchain exists as a shared, and continually reconciled, database. This way of using the network has obvious benefits. The blockchain database isn't stored in any single location, meaning the records it keeps are truly public and easily verifiable. No centralized version of this information exists for a hacker to corrupt. Hosted by millions of computers simultaneously, its data is accessible to anyone on the internet. ... As revolutionary as it sounds, blockchain truly is a mechanism for bringing everyone to the highest degree of accountability: no more missed transactions, human or machine errors, or exchanges made without the consent of the parties involved. Above all, the most critical way blockchain helps is by guaranteeing the validity of a transaction, recording it not only on a main register but on a connected, distributed system of registers, all linked through a secure validation mechanism.
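As a hypothetical illustration (not from the article) of why such linked registers are tamper-evident, here is a minimal hash-chain sketch; the block structure is a deliberate simplification, not a real blockchain protocol:

```python
# Minimal tamper-evident hash chain: each record commits to its predecessor.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a canonical serialization of the block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Editing any earlier block breaks every later prev_hash link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "Alice pays Bob 5")
append_block(chain, "Bob pays Carol 2")
assert is_valid(chain)
chain[0]["data"] = "Alice pays Bob 500"   # tampering with an old record...
assert not is_valid(chain)                # ...is immediately detectable
```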



Quote for the day:


"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley