Daily Tech Digest - October 07, 2018

Over three years and one global trade war later, the possibility of this scenario has turned from a fringe and ahead-of-its-time concern, to a mainstream and relevant one. As AI continues to advance at a dizzying pace, the real-world applications of AI-related technologies have also increased – and so have concerns about living in a world inundated by intelligent machines capable of performing specific tasks. The dilemma facing lawmakers and leaders today is how developments in Artificial Intelligence will be thoughtfully monitored at a national and global level to protect the interests of man- and womankind, while also allowing enough freedom for citizens, corporations, and governments to leverage the new and rapidly advancing technology to increase efficiencies and generate added value. While there are no easy or glaringly obvious answers to this dilemma, Harvard’s Belfer Center for Science and International Affairs produced a report in 2017 recommending that the National Security Council, DoD, and State Department start studying what internationally agreed-on limits should be imposed on AI.


Rapidly developing technology has not only disrupted industries and business models—there is evidence it is changing consumer behavior and reshaping how companies should view their customers. M&E companies’ outreach campaigns and efforts to make technology user-friendly have paid off with older generations, whose behavior is mimicking younger generations’. Indeed, results from the 12th edition of Deloitte’s Digital Media Trends Survey indicate that the behaviors of Gen Z (ages 14–21), millennials (ages 22–37), and Gen X (ages 38–53) are converging. ... Similarly, half of Gen X respondents reported that they play video games frequently, almost matching Gen Z and millennial respondents. As a result, many M&E providers are struggling to segment media consumption habits based only on generational behavior. Demographic generalizations—such as the assumption that people of the same gender and in a similar age and income bracket will consume products and services the same way, and be engaged by the same marketing ploys—are less accurate than they used to be.
AI for security can help defenders in a myriad of ways. However, there are also downsides to the emergence of AI. For one, the technology has also been leveraged by cybercriminals, and it’s clear that it can be co-opted for various nefarious tasks. These include at-scale scanning for open, vulnerable ports – or the automated composition of emails that have the exact tone and voice of the company’s CEO, learned over time by 24-7 eavesdropping. And in the not-too-distant future, that automatic mimicking could even extend to voice. IBM scientists, for instance, have created a way for AI systems to analyze, interpret and mirror users’ unique speech and linguistic traits – in theory to make it easier for humans to talk to their technology. However, the potential for using this type of capability for malicious spoofing applications is obvious. Meanwhile, the zeal for adopting AI across vertical markets – for cybersecurity and beyond – has opened up a rapidly proliferating new attack surface, one that doesn’t always feature built-in security-by-design.


Bringing cloud intelligence to the edge in connected factories

Many new scenarios are enabled by the ability to run AI that formerly lived only in the cloud directly on local devices. Machine learning can now be used in IoT scenarios that require real-time responses. Solution performance also improves: by eliminating the time it takes to transmit data to the cloud and back, you achieve close-to-instantaneous data analysis, which is vital to making critical operating decisions. Mission- and safety-critical IoT solutions are now resilient to losses of internet connectivity. Azure IoT Edge enables devices to continue operating and transmitting data for analysis even while offline, ensuring reliable production even with intermittent internet connectivity. Finally, IoT solution costs are decreased. Transmitting all your data to the cloud can be expensive, especially if you have facilities in remote places where internet access is costly. By doing your analysis at the edge, you reduce the amount of data that you need to send to the cloud.
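The cost argument above comes down to filtering at the edge so that only meaningful data crosses the network. A minimal sketch of the idea, with hypothetical sensor readings and thresholds (not an Azure IoT Edge API):

```python
# Sketch: keep telemetry that deviates from a known baseline at the edge,
# and send only those readings to the cloud. All values are hypothetical.

def filter_at_edge(readings, baseline, tolerance):
    """Return only readings that deviate from the baseline by more than tolerance."""
    return [r for r in readings if abs(r - baseline) > tolerance]

readings = [20.1, 20.0, 25.7, 19.9, 31.2, 20.2]   # e.g., temperature samples
to_cloud = filter_at_edge(readings, baseline=20.0, tolerance=2.0)
reduction = 1 - len(to_cloud) / len(readings)     # fraction of traffic saved
```

Here two of six samples are forwarded, cutting upstream traffic by roughly two-thirds; real deployments would tune the baseline and tolerance per device.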


Comparing Features of 4 Popular Machine Learning Platforms
In simple words, it is the part of an artificial intelligence program that helps the computer learn and adapt without being programmed for each new change. The commercial world exploits machine learning under the term “predictive analytics.” Predictive analytics allows researchers, data scientists, and engineers to produce reliable results by learning from the history and patterns of input data. ... As the world continues to develop artificial intelligence and machine learning software, India is keeping up with the growth. The government of India has also started to focus on developing its own plan for AI. Software development companies in India are now focusing on creating artificially intelligent computer programs that may be used to assist human intelligence in fields like healthcare, weather and climate, crowd management, space research, and education. App developers, app development companies, and agencies in India are now coming up with applications of machine learning.


The automation imperative

While automation success is possible through either traditional top-down (waterfall) deployment or more flexible agile methods, a systematic approach is key. Only 5 percent of respondents at successful companies say their deployment methods have been ad hoc, compared with 19 percent of peers not reporting success ... What’s more, successful organizations are implementing different automation technologies from the ones other organizations are adopting. Respondents with successful automation efforts are more than twice as likely as others to say their organizations are deploying machine learning. They are also more likely to cite the use of other cognitive-based automation capabilities, such as cognitive agents and natural-language processing. At respondents’ organizations overall, the most commonly adopted automation technology is robotic process automation, which respondents say is deployed in equal shares of successful and other organizations.


A close-up of the underside of the Fitbit Alta HR, which tracks a user's heart rate
Remembering that Navarra was also wearing a Fitbit on her left wrist at the time of her death, the investigators also worked to crack into that data. They ended up getting a search warrant for it. Fitbit Director of Brand Protection Jeff Bonham took custody of Navarra’s device and worked on retrieving data on her heart rate and movements from her final days. The investigators noted that Navarra’s desktop computer was just five to ten feet away from where they found her body in the dining room. Her last recorded movement was on Thursday, September 13, approximately when the coroner removed her body from her home. Before that, her last movement was on Saturday, September 8, the day Tony dropped off the pizza. It was also the last day the device recorded her heart rate. The Fitbit recorded a “significant” heart rate spike at 3:20pm, and it then rapidly declined. By 3:28—while Tony’s car was still parked in her driveway—her heart had stopped beating, according to the device.


A source of controversy due in part to fears for human employment, the presence of robots in our daily lives is nevertheless inevitable, engineers at the conference said. The trick to making them more palatable, they added, is to make them look and act more human so that we accept them into our lives more easily. In ageing societies, "robots will coexist with humans sooner or later", said Hiroko Kamide, a Japanese psychologist who specialises in relations between humans and robots. Welcoming robots into households or workplaces involves developing "multipurpose machines that are capable of interacting" with humans without being dangerous, said Philippe Soueres, head of the robotics department at a laboratory belonging to France's CNRS scientific institute. ... As such, robots must move around "in a supple way" despite their rigid mechanics and stop what they are doing in case of any unforeseen event, he added. That's why people are choosing "modular systems shaped like human bodies" which are meant to easily fit into real-world environments built for humans.
An Empathy Map is just one tool that can help you empathise with and synthesise your observations from the research phase, and draw out unexpected insights about your user’s needs. An Empathy Map allows us to sum up our learning from engagements with people in the field of design research. The map provides four major areas in which to focus our attention, thus providing an overview of a person’s experience. Empathy maps are also great as a background for the construction of the personas that you would often want to create later. An Empathy Map consists of four quadrants. The four quadrants reflect four key traits which the user demonstrated or possessed during the observation/research stage. The four quadrants refer to what the user: Said, Did, Thought, and Felt. It’s fairly easy to determine what the user said and did. However, determining what they thought and felt should be based on careful observation and analysis of how they behaved and responded to certain activities, suggestions, conversations, etc.
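The four-quadrant structure described above maps naturally onto a simple data structure. A minimal sketch (class name and sample entries are illustrative, not from any design tool):

```python
from dataclasses import dataclass, field

# Sketch of an empathy map as a data structure: four quadrants capturing
# what the user Said, Did, Thought, and Felt during the research stage.

@dataclass
class EmpathyMap:
    said: list = field(default_factory=list)      # direct quotes (observed)
    did: list = field(default_factory=list)       # actions (observed)
    thought: list = field(default_factory=list)   # inferred from behavior
    felt: list = field(default_factory=list)      # inferred emotions

m = EmpathyMap()
m.said.append("I never read the manual")
m.did.append("Skipped the onboarding tour")
m.thought.append("Expects the interface to be self-explanatory")
m.felt.append("Frustrated by the multi-step signup")
```

Separating the observed quadrants (said, did) from the inferred ones (thought, felt) mirrors the distinction the excerpt draws: the latter require careful analysis rather than direct recording.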


How to explain containers in plain English

Software containers can nonetheless be a bit challenging to explain, particularly if the audience isn’t technical and doesn’t understand certain fundamentals about how software gets built and operated. ... Containers solve the packaging problem of how to quickly build and deploy applications. They’re akin to virtual machines, but with two notable differences: they’re lightweight and spin up in seconds; and they move reliably from one environment to another (what works on the developer’s computer will work the same in dev/test and production). In the digital era, applications are the business – speed and innovation are creating winners and losers across all industries. The beauty of containers, and why organizations are moving in this direction, is that they dramatically speed up development.




Quote for the day:

"Without growth, organizations struggle to add talented people. Without talented people, organizations struggle to grow." -- Ray Attiyah

Daily Tech Digest - October 06, 2018

Scientists Just Created Quantum Artificial Life For The First Time Ever


This is still an early proof-of-concept prototype, but it opens the door to diving further into the relationship between quantum mechanics and the origins of life. The same principles governing quantum physics may even have had a role to play in forming our genetic code. It's like playing the Sims on a whole new level of physics. Creating artificial life inside computers has been the subject of many a previous experiment, but current software typically takes a classical, Newtonian approach in producing these models – step by step, with logical progressions. We know that the real world adds a dab of quantumness to the mix – strange phenomena happening at the micro and macro level – and the new research aims to add that same unpredictability to computer simulations as well. In other words, the simulations are no longer limited to 1s and 0s, but can introduce some of the randomness we see in everyday life. That promises to open up a whole new field ready to be explored.




If an AI system can create human-readable reports from unstructured internet data, then it can also decipher legislation. It will take time to train AI how to process legislative language effectively, but as ML algorithms become ubiquitous, easily deployable, and more affordable to run, it’s likely that someone will develop AI to make legislation more transparent. AI can transform the legislative process by moving it from lawyers manually reading and writing bills to modeling them. Perhaps, one analyst may read and write bills as another leverages AI, natural-language-processing algorithms, and data visualization to model their impact within existing complex legislative frameworks. AI can help to model, predict, and monitor the impact of legislation that lawmakers pass, but it can also keep the same lawmakers accountable on many other fronts. In a 2018 Gallup poll, only 5 percent of those surveyed had a high degree of confidence in the U.S. Congress. In many countries, simply trying to understand what an elected official or candidate running for public office believes or has historically voted on can be a daunting task.


Has Innovation Just Become An Infectious Disease?


Today's strategic imperative seems to be around the notion of "innovate or die," and that idea might be a little too close to the truth for many. But the whole idea of innovation for some of pharma seems more vague and focused on both an ambiguous endpoint and a fuzzy process. Innovation is served up as an ingredient in a process that offers an expectation of magical transformation. Never in a box, innovation is that unbridled perspective that everyone tries to (paradoxically) put into a package and sell to their customers. So, we have an epidemic. Accelerators, incubators and bean bag chairs give me goose bumps. Could it be that there's just too much innovation? I don't think that's the case. But I do believe that the germ of innovation can grow in different ways that are very powerful--both transformative and malignant! The role of innovation is more a function of applying invention to a marketplace. Amazon and Apple have largely mastered this process and have ignited the flame of consumer-centricity in the life sciences industry.


Data governance in healthcare

The vast amount of data generated and collected by a multitude of stakeholders in healthcare comes in so many different forms — insurance claims, physician notes, medical records, medical images, pharmaceutical R&D, conversations about health in social media, and information from wearables and other monitoring devices. Data is growing faster than ever before and by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet. It is the scale of this data that sits at the very heart of the fourth industrial revolution and the impact it will ultimately have on the way we care for patients and communities in the future. As healthcare environments become increasingly dependent on digital technologies to support care delivery (at a user, organizational and healthcare system level), their ability to use and exchange data becomes a critical enabler of transformation. In healthcare systems around the world, data and analytics (D&A) is re-shaping the way leaders make evidence-based decisions to improve patient outcomes and operational performance.


Why Google is So Sure Their AI Solution Will Beat Out the Rest


Google has mastered search. It has dominated the field in this area, but the company is making its way in cloud computing as well. Another area it seeks to lead? Artificial Intelligence (AI). Technology companies are selling AI as part of their cloud services, and profiting as a result. The capacity of their data centers is sufficient to support it. A technology powerhouse, Google has experience with the cloud. It is now looking to serve external customers in new ways, rather than focus all the attention on its internal operations. Customers are not only the end-user consumers one sees every day on the street. Competitors are big names such as Microsoft and Amazon, as the company aims to draw in customers such as Netflix and Spotify. Why does Google see its AI solution as the one to beat out the rest? The answer may be in the techniques and capabilities the company is investing in. Machine learning is one of them, and has been an interest for a long time. Others include image recognition and also search and video recommendations.


How to Avoid the All-Flash Capacity Glut

When all-flash drives were 1TB or smaller, most organizations needed to buy well over 24 drives, but now 24 drives (384 TB) will more than cover all the needs of production storage in many data centers. For many AFA vendors these are the minimum configurations. Those that offer a 12-drive alternative will see a significant drop in performance, despite the fact that many of these high-capacity drives are rated to deliver 70,000 IOPS or more. Again, that is 70K IOPS PER DRIVE, yet many 12-drive systems can’t deliver more than 30,000 IOPS. Given the raw performance of an SSD, a 24-drive system should deliver about 1.5 million IOPS! The problem is that most storage vendors have built their software code using legacy techniques. They haven’t rethought the algorithms that drive the core of the storage system and they haven’t adapted them to take advantage of multi-core processors. Part of the reason for this development approach is time to market, because by leveraging legacy code and legacy techniques they are able to bring products to market faster.
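The gap the article describes is plain arithmetic: the per-drive rating multiplied out across a shelf, versus what shipping systems actually deliver. A quick sketch of that math:

```python
# The article's arithmetic: per-drive rating vs. the aggregate a shelf of
# SSDs could deliver if the storage software were not the bottleneck.

PER_DRIVE_IOPS = 70_000   # rating cited for high-capacity flash drives

def aggregate_iops(drive_count, per_drive=PER_DRIVE_IOPS):
    return drive_count * per_drive

full_shelf = aggregate_iops(24)   # 1,680,000 raw -- the "about 1.5 million" cited
half_shelf = aggregate_iops(12)   # 840,000 raw, yet many systems cap near 30,000
```

The 12-drive case makes the point starkly: legacy software delivers under 4 percent of the raw aggregate the hardware is rated for.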


What's the Password? Play Some Music and Log In Via Brainwaves

Firstly, we capture the EEG signal from a brain-machine interface: currents that flow during synaptic excitations of the dendrites of many pyramidal neurons in the cerebral cortex. EEG signals have been shown to be sufficiently different between individuals and therefore suitable for use in the authentication process. Then we port that data into our EEG Workbench, which was created by Luis Gutierrez [currently working in the UC Riverside InfoSec department] as part of his master's thesis. ... Our main problem is the lack of available data. For machine learning, you need an enormous amount of data, so that's what we're focusing on right now—building up data stores. Secondly, there aren't many commercially available brain-machine interfaces with reliable data output that we can use for machine learning at this time. Many have an issue with the signal-to-noise ratio. We need to isolate which diode on the BMI gives us the best response.
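The channel-selection problem mentioned at the end — finding which diode gives the best response — amounts to comparing signal-to-noise ratios across channels. A rough sketch, using a simplified SNR estimate (mean amplitude over standard deviation) and hypothetical sample values, not the team's actual workbench:

```python
import statistics

# Sketch: pick the BMI diode (channel) with the best signal-to-noise
# ratio, crudely estimated as mean amplitude / standard deviation.
# Sample values are hypothetical.

def snr(samples):
    return statistics.mean(samples) / statistics.stdev(samples)

channels = {
    "diode_1": [5.0, 5.2, 4.8, 5.1],   # steady signal, low variance
    "diode_2": [5.0, 9.0, 1.0, 6.0],   # same rough mean, far noisier
}
best = max(channels, key=lambda name: snr(channels[name]))
```

Real EEG pipelines estimate SNR in the frequency domain after filtering, but the selection logic — rank channels, keep the cleanest — is the same.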


Blockchain Grows Up as Bankers Take the Place of 'Crypto Cowboys'

Despite its promise in improving business functions -- finance and supply chain management are two of the most often-cited use cases -- there are still a number of hurdles to the commercial adoption of blockchain. One is that the technology is still in its relative infancy; another is how exactly regulators would account for illegal activity amidst a mass of cross-border payments.  There are also economic factors. As long as "get rich quick" crypto fever is still alive, it's that much harder to incentivize blockchain enthusiasts to take on the less sexy work of building protocols for business. "Blockchain is an extremely powerful idea, but it's very far from being a mature technology," said Christian Laang, CEO of the supply chain management platform Tradeshift. "If people are becoming millionaires from ICO [initial coin offerings], they're disincentivized to create the next generation of technology. There's a little bit of a bubble with all the short term-ism."


Resilient Systems in Banking


Having a resilient service for customers means ensuring that when a failure occurs, the part of the system affected by the error is small in comparison to your system as a whole. There are two ways to ensure this. Redundancy is about ensuring the system as a whole extends out beyond the scope of failure. However much is impaired, we've simply got more in reserve. Isolation is about ensuring that the scope of failure remains confined within a small area and cannot spread out to the boundaries of our system. However you design your system, you must have good answers for both of these. With these in mind, you must mentally test your design against every fault you can imagine. To help, consider the following dimensions of failure: faults at an infrastructural level (like network failure), as well as faults at an application level (like uncaught exceptions or panics); and faults that are intrinsic to the software we build (caused by us, i.e. bugs) as well as those that are extrinsic (caused by others e.g. invalid messages). It is not sufficient to assume the intrinsic faults will be shaken out by testing. It is as important to protect yourself from yourself as from the rest of the world.
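The isolation principle above — keep the scope of a failure confined so it cannot spread to the rest of the system — can be sketched as a small fallback wrapper. This is an illustrative pattern only (function and service names are hypothetical), not a production circuit breaker:

```python
# Sketch of isolation: confine a dependency's failure behind a boundary
# with a fallback, so one faulty component cannot take down the whole
# request path. Names below are illustrative.

def isolated(call, fallback):
    """Run `call`; if it raises anything, degrade to `fallback` instead."""
    def wrapped(*args, **kwargs):
        try:
            return call(*args, **kwargs)
        except Exception:
            # The fault (intrinsic bug or extrinsic outage) stops here.
            return fallback
    return wrapped

def flaky_balance_service(account_id):
    raise ConnectionError("downstream outage")   # simulated extrinsic fault

get_balance = isolated(flaky_balance_service, fallback="balance unavailable")
result = get_balance("acct-42")   # failure is contained, not propagated
```

Note that the wrapper catches any exception, which matches the excerpt's point: it protects against your own bugs as well as faults caused by others.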


Digital transformation needs to come from the top

The corporate culture of the modern enterprise should be to embrace change and encourage "failing forward," so that the organization can evolve and learn about new technologies and also identify growth opportunities and/or risks to the business, Canaday said. Not all of this innovation needs to come from within, however. Many organizations are looking to hire outside of their industry or partner with innovative third parties to bring fresh perspectives, ideas, and expertise that can help spark internal ideation and creativity, Canaday said. As organizations undergo the latest evolution in technology, business models have shifted to become focused on mobile channels, and more recently there's been an increase in voice-activated interfaces. What's driving this is customer demand. "Companies need to operate where the client transacts, and need to be hyper-focused on the customer experience" to ensure that those customers will continue doing business with them, Canaday said.




Quote for the day:


"Coaching is unlocking a person's potential to maximize their own performance. It is helping them to learn rather than teaching them." -- John Whitmore


Daily Tech Digest - October 04, 2018


We have to describe the world as it is for us to gain useful insights. Sure, we might then use those to convert that reality to how it ought to be, but our ingoing information, plus its processing, has to be morally blind. There is quite a movement out there to insist that all algorithms, all AIs, must be audited. That there can be no black boxes – we must know the internal logic and information structures of everything. This is so we can audit them to ensure that none of the either conscious or unconscious failings of thought and prejudice that humans are prey to are included in them. But, as above, this fails on one ground – that we humans are prey to such things. Thus a description of, or calculation about, a world inhabited by humans must at least acknowledge, if not incorporate, such prejudices. Otherwise the results coming out of the system aren’t going to be about this world, are they?



Understanding Spring Reactive: Introducing Spring WebFlux


With the introduction of Servlet 3.1, Spring MVC could achieve non-blocking behavior. But, as the Servlet API contains several interfaces that are still blocking (perhaps for backward compatibility), there was always the chance of accidentally using blocking APIs in an application that was intended to be non-blocking. In such scenarios, the use of a blocking API will certainly bring down the application sooner or later. ... The purpose of this series is to demonstrate the evolution of the Servlet/Spring stack from the blocking to the non-blocking paradigm. I am not going into the details of Spring WebFlux in this tutorial, but I will introduce a sample Spring Boot application using Spring WebFlux. One point we should notice in the above diagram is that Spring WebFlux is Servlet-container agnostic. Spring WebFlux works on Servlet containers and also on Netty through the Reactor Netty project. In my Spring Boot application, I have a dependency on WebFlux as spring-boot-starter-webflux, and at server startup, it says that the application is ready with Netty.
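Spring WebFlux itself is a Java stack built on Reactor, but the non-blocking idea it embodies is language-neutral: handlers never park a thread waiting on I/O; instead, work is composed on an event loop. As a rough analogy only (this is Python's asyncio, not WebFlux's actual API):

```python
import asyncio

# Rough analogy to the non-blocking model: no thread sits idle waiting
# for the simulated I/O below; one event loop interleaves both "requests".

async def handle_request(name, delay):
    await asyncio.sleep(delay)   # non-blocking wait, a stand-in for real I/O
    return f"{name}: done"

async def main():
    # Both handlers progress concurrently on a single event loop,
    # the way a reactive server multiplexes requests over few threads.
    return await asyncio.gather(
        handle_request("req-1", 0.01),
        handle_request("req-2", 0.01),
    )

results = asyncio.run(main())
```

The hazard the excerpt warns about maps directly onto this model: one accidental blocking call (say, `time.sleep` instead of `asyncio.sleep`) stalls the shared event loop and, with it, every in-flight request.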


Asking the right questions to define government’s role in cybersecurity

Cyberthreats cross national boundaries, with victims in one jurisdiction and perpetrators in another—often among nations that don’t agree on a common philosophy of governing the internet. And complicating it all, criminal offences vary, legal assistance arrangements are too slow, and operating models for day-to-day policing are optimized for crimes committed by local offenders. ... Each country is addressing the challenge in its own way, just as companies tackle the issue individually. Approaches vary even among leading countries identified by the Global Cybersecurity Index, an initiative of the United Nations International Telecommunications Union. Differences typically reflect political and legal philosophy, federal or national government structures, and how far government powers are devolved to state or local authorities. They also reflect public awareness and how broadly countries define national security—as well as technical capabilities among policy makers.


Iron Ox uses AI and robots to grow 30 times more produce than traditional farms


Iron Ox’s first 1,000-square-foot farm, which is in full production as of this week, taps a robotic arm equipped with a camera and computer vision systems that can analyze plants at sub-millimeter scale and execute tasks like planting and seeding. A 1,000-pound mobile transport system roughly the size of a car, meanwhile, delivers harvested produce — including leafy greens such as romaine, butterhead, and kale and herbs like basil, cilantro, and chives — using sensors and collision avoidance systems “similar to that of a self-driving car.” Cloud-hosted software acts as a sort of brain for the system, ingesting data from embedded sensors and using artificial intelligence (AI) to detect pests, forecast diseases, and “ensure cohesion across all parts.” It might sound like pricey tech, but Alexander and company said they worked to keep costs down by using off-the-shelf parts and implementing a scalable transport system.


From Visibility To Vision: Staying Competitive In An Open Banking Future


One of the reasons the digital experiences of established banks remain so lackluster is a failure by both customers and employees to report instances of slow or faulty systems. Across the board there is a growing apathy toward and acceptance of poorly performing technology, creating a self-perpetuating cycle of unsatisfied users. The first step in rectifying this problem is to give power and visibility back to the IT team and the business by providing them with system monitoring solutions that can quantify “normal” behavior as a benchmark for identifying deviations, so they can truly measure the user’s experience. These solutions would effectively bypass the reliance on the end user to report issues and instead focus on creating more agile capabilities to proactively identify and rectify areas of degrading performance. Once IT departments are equipped with an intelligent and proactive infrastructure, banks can effectively compete by delivering digital services that offer a superior customer experience.
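The core monitoring idea — quantify "normal" as a statistical baseline and flag deviations from it — can be sketched in a few lines. A minimal version using a mean-plus-k-standard-deviations threshold, with hypothetical response times (real monitoring products use far more robust baselining):

```python
import statistics

# Sketch: learn "normal" response times from a baseline window, then
# flag observations more than k standard deviations from the mean.
# All values below are hypothetical, in milliseconds.

def find_deviations(baseline, observed, k=3):
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observed if abs(x - mean) > k * stdev]

baseline_ms = [120, 125, 118, 122, 121, 119, 124]   # normal behavior
observed_ms = [123, 121, 410, 120]                  # 410 ms is a degradation
alerts = find_deviations(baseline_ms, observed_ms)
```

This is exactly the bypass the excerpt describes: the 410 ms outlier is surfaced automatically instead of waiting for a frustrated user to complain.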


Everyone, everywhere is responsible for IIoT cyber security


Cyber security threats are coming at us from every direction, not just from our corporate networks. Operational networks were simply not built for connectivity, and carefully thought-out security protocols are being ignored for the benefit of data access to drive productivity gains. Unfortunately, threat vectors now extend even to base-level assets. Attackers can target anything from a connected thermostat to a wireless field device in order to cause danger. This heralds a new type of aggressive, innovative cyber attack for industrial control systems, which are becoming increasingly accessible over the internet, often inadvertently. The actors, too, have changed, and they are becoming more sophisticated every day. Attack techniques, tools and lessons are readily available on the dark web, which means low-level cyber criminals have access to the information they need to attempt more serious attacks.


How updating an outdated industrial control system can work with fog computing

According to fog computing and automation startup Nebbiolo Technologies – which declined to name the client directly, saying only that it’s a “global” company – the failure of one of those Windows IPCs could result in up to six hours of downtime for said client. They wanted that time cut down to minutes. It’s a tricky issue. If those 9,000 machines were all in a data center, you could simply virtualize the whole thing and call it a day, according to Nebbiolo’s vice president of product management, Hugo Vliegen. But it's a heterogeneous environment, with the aging computers running critical control applications for the production lines – their connections to the equipment can't simply be abstracted into the cloud or a data center. Architecturally, however, the system is a bit simpler. Sure, there are a lot of computers, but they’re all managed remotely. The chief problem is visibility and failover, Vliegen said. “If they fail, they’re looking at six hours downtime,” he said on Tuesday in a presentation at the Fog World Congress in San Francisco.


5 mistakes even the best organizations make with product and customer data

“In 2018, digital business transformation will be played out at scale, sparking shifts in organizational structure, operating models, and technology platforms. CEOs will expect their CIOs to lead digital efforts by orchestrating the enabling technologies, closing the digital skills gap, and linking arms with CMOs and other executive peers better positioned to address the transformational issues across business silos.”  The need to address these business silos has been a key driver in the growth of master data management (MDM). MDM integrates multiple disparate systems across organizations by streamlining the process of aggregating and consolidating information about products, customers, suppliers, employees, assets and reference data from multiple sources and formats. It connects that information to derive actionable insights and publishes it to backend systems as well as online and offline channels.
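The aggregation-and-consolidation step at the heart of MDM can be sketched as merging partial records about the same entity from multiple source systems into one "golden record". A simplified illustration (source names, fields, and the survivorship rule — first source wins — are all hypothetical):

```python
# Sketch of MDM consolidation: merge partial customer records from
# multiple systems into one golden record per ID. The survivorship
# rule here (earlier sources win on conflicts) is illustrative only.

def consolidate(*sources):
    golden = {}
    for source in sources:
        for record in source:
            merged = golden.setdefault(record["id"], {})
            # Later sources fill gaps but don't overwrite existing values.
            for key, value in record.items():
                merged.setdefault(key, value)
    return golden

crm = [{"id": "c1", "name": "Acme Corp", "email": "ops@acme.example"}]
erp = [{"id": "c1", "name": "ACME CORPORATION", "terms": "net-30"}]
records = consolidate(crm, erp)
```

Real MDM platforms add matching across inconsistent keys, data-quality scoring, and configurable survivorship rules, but the shape of the problem — many partial views, one consolidated record — is the one shown here.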


Codefirst: The Future of UI Design


If you look at your laptop, tablet, or mobile phone today, you’ll notice that the latest craze to sweep the industry is flat design. Flat design was a dramatic departure from Apple’s ubiquitous skeuomorphism style to one that celebrated minimalism. This trend boasted a UI that leveraged simplicity, flat surfaces, cleaner edges, and understated graphics. The flat design trend evidences a shift within the industry to make designs scale across many different form factors. Websites, on the other hand, have incorporated polygonal shapes, simple geometric layers, and bold lines that grab the audience’s attention. Tactile designs have also grown in popularity in recent months. This design trend makes objects appear hyper-real. Beyond these current trends, there are many examples of websites without borders, without multiple layers, with purposeful animation, and large images. Going forward, you can undoubtedly expect the bar to be raised within the app and web world to ensure that both UI and UX work seamlessly together to improve user interactions.


Incorporate NIST security and virtualization recommendations


The main goal of following these NIST virtualization recommendations is to ensure the secure execution of the platform's baseline functions. These recommendations primarily target cloud service providers that offer infrastructure as a service and enterprise IT teams planning to implement virtual infrastructures to host line-of-business applications. According to NIST, hypervisor platforms are susceptible to security threats via three primary channels: the enterprise network where the hypervisor host resides, rogue or compromised VMs accessing virtualized resources, and web interfaces for the platform's management services and consoles. NIST breaks down the hypervisor platform into the following five baseline functions: VM process isolation (HY-BF1), device mediation and access control (HY-BF2), direct command execution from guest VMs (HY-BF3), VM lifecycle management (HY-BF4), and hypervisor platform management (HY-BF5).
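The five baseline functions enumerated above form a small, stable vocabulary, which makes them natural to encode as a lookup table, for instance when tagging audit findings by the function they affect. A trivial sketch (the helper function is illustrative; the codes and names are from the NIST breakdown quoted above):

```python
# The five hypervisor baseline functions from the NIST breakdown,
# encoded as a lookup table for use in, e.g., tagging audit findings.

HYPERVISOR_BASELINE_FUNCTIONS = {
    "HY-BF1": "VM process isolation",
    "HY-BF2": "Device mediation and access control",
    "HY-BF3": "Direct command execution from guest VMs",
    "HY-BF4": "VM lifecycle management",
    "HY-BF5": "Hypervisor platform management",
}

def describe(code):
    return HYPERVISOR_BASELINE_FUNCTIONS.get(code, "unknown function")
```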



Quote for the day:


"Great Leaders Focus On Sustainable Success Rather Than Quicker Wins." -- Gordon Tredgold


Daily Tech Digest - October 03, 2018

The problem with many of the standard metrics is that they fail to take into account how different groups might have different distributions of risk. In particular, if there are people who are very low risk or very high risk, then it can throw off these measures in a way that doesn't actually change what the fair decision should be. ... The upshot is that if you end up enforcing or trying to enforce one of these measures, if you try to equalize false positive rates, or you try to equalize some other classification parity metric, you can end up hurting both the group you're trying to protect and any other groups for which you might be changing the policy. ... A layman's definition of calibration would be, if an algorithm gives a risk score—maybe it gives a score from one to 10, and one is very low risk and 10 is very high risk—calibration says the scores should mean the same thing for different groups. We basically say in our paper that calibration is necessary for fairness, but it's not good enough. Just because your scores are calibrated doesn't mean you aren't doing something funny that could be harming certain groups.
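The calibration idea described here can be made concrete with a small sketch. The scores, outcomes, and group labels below are hypothetical, not data from the study; the point is only to show what "scores mean the same thing for different groups" looks like when checked.

```python
from collections import defaultdict

def calibration_by_group(scores, outcomes, groups):
    """For each (group, score) bucket, compute the observed outcome rate.

    Calibration holds when, for a given score, the observed rate of the
    outcome is roughly the same across groups.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for s, y, g in zip(scores, outcomes, groups):
        totals[(g, s)] += 1
        positives[(g, s)] += y
    return {key: positives[key] / totals[key] for key in totals}

# Hypothetical risk scores (1-10), binary outcomes, and group labels.
scores   = [3, 3, 3, 3, 7, 7, 7, 7]
outcomes = [0, 0, 1, 0, 1, 1, 1, 0]
groups   = ["A", "A", "B", "B", "A", "A", "B", "B"]

rates = calibration_by_group(scores, outcomes, groups)
# Here a score of 7 corresponds to a 100% observed rate in group A but
# only 50% in group B -- a sign the scores are not calibrated.
```

As the interviewee notes, passing a check like this is necessary but not sufficient: calibrated scores can still be produced in ways that harm particular groups.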


Here’s a solution to the AI talent shortage: Recruit philosophy students
Who would have thought it? If schools and universities are going to help create a generation that is equipped to support the AI revolution, they might be better off teaching philosophy and psychology. Sport might be a good analogy. If you are trying to hire talent, you might be better off hiring staff while they are young, grabbing them from school or university as part of placements perhaps, an approach Melanie Oldham explains in this piece. It is an approach that sports clubs are fully versed in — football teams with their academies and talent scouts, scouring the playing fields on a Saturday morning. It often works out as a more effective approach than getting the cheque book out and buying players after they emerge. But for Rinku Singh and Dinesh Patel the route to stardom in baseball was not conventional. They joined the American baseball world after entering a talent contest in India. It was an unorthodox recruitment process made famous by the movie ‘Million Dollar Arm.’



What Is Deep Learning AI? A Simple Guide With 8 Practical Examples


AI encompasses machine learning, where machines can learn by experience and acquire skills without human involvement. Deep learning is a subset of machine learning in which artificial neural networks, algorithms inspired by the human brain, learn from large amounts of data. Similar to how we learn from experience, a deep learning algorithm performs a task repeatedly, each time tweaking it a little to improve the outcome. We refer to ‘deep’ learning because the neural networks have many (deep) layers that enable learning. Just about any problem that requires “thought” to figure out is a problem deep learning can learn to solve. The amount of data we generate every day is staggering, currently estimated at 2.6 quintillion bytes, and it’s the resource that makes deep learning possible. Since deep learning algorithms require a ton of data to learn from, this increase in data creation is one reason deep learning capabilities have grown in recent years.
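The "perform a task repeatedly, tweaking a little each time" loop can be illustrated with a single artificial neuron. This is a minimal sketch of the learning procedure only, not a deep network; the data, target relationship, and learning rate are made up for illustration.

```python
import random

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [2.0 * x + 0.5 for x in xs]   # target relationship to be learned

w, b = 0.0, 0.0   # the neuron starts knowing nothing
lr = 0.1          # learning rate: how big each tweak is
n = len(xs)
for _ in range(500):   # repeat the task, tweaking each time
    # Average gradient of the squared error with respect to w and b.
    grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))   # converges to roughly 2.0 and 0.5
```

A deep network follows the same error-driven update loop, just with millions of weights arranged in many layers.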


A CIO forges a data strategy plan for creating actionable data


Information that you don't think is relevant right now can change in value. So wherever we can put a hook to preserve information for the future, we'll do that. Even if we don't take all the content and turn it into actionable data, we may take that data and leave it unstructured. We always like to leave that door open if there's information that the client has but can't think of a business case to use right now. ... It's a way of representing information -- subject, predicate, object. You start with metadata: You pull the information out about the data you're working with. Say I'm working with a journal article, so who is the author? What college did the author go to? That's just raw data. Now you want to relate that to other data. You have this author who attended this university and got this degree. Now you have not just three pieces of data, you have three related pieces of information that give you much more context.
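The subject-predicate-object representation the CIO describes can be sketched in a few lines. The article, author, and university below are hypothetical stand-ins for the journal-article example.

```python
# Hypothetical triples about a journal article, each in
# (subject, predicate, object) form.
triples = [
    ("article-42", "hasAuthor", "Jane Doe"),
    ("Jane Doe", "attended", "State University"),
    ("Jane Doe", "earnedDegree", "PhD in Chemistry"),
]

def objects_of(subject, predicate):
    """Follow one predicate from a subject to its objects."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Chain two hops: from the article to its author, then to the author's
# university -- three raw facts become related, contextual information.
author = objects_of("article-42", "hasAuthor")[0]
print(objects_of(author, "attended"))   # ['State University']
```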


Facebook Breach: Single Sign-On of Doom

"Due to the proliferation of SSO, user accounts in identity providers are now keys to the kingdom and pose a massive security risk. If such an account is compromised, attackers can gain control of the user's accounts in numerous other web services," according to "O Single Sign-Off, Where Art Thou?," a recently published report into "single sign-on account hijacking and session management on the web" authored by five researchers at the University of Illinois at Chicago. In the case of the Facebook breach, for example, its SSO system could have been used for a range of other sites, including its own Instagram, as well as Tinder, Spotify and others. "Our study on the top 1 million websites according to Alexa found that 6.3 percent of websites support SSO. This highlights the scale of the threat, as attackers can gain access to a massive number of web services," the researchers say. ... "Another very critical yet overlooked problem is that the stolen tokens can be used to obtain access to a user's account on other websites that support Facebook SSO *even if the user doesn't use Facebook SSO* to access them," he adds. "This depends on third-party implementations."


Augmented reality, fog, and vision: Duke professor outlines importance of smart architectures

Some of the trade-offs, she said, are already fairly well-known. For instance, many tasks that aren’t terribly demanding from a compute or network perspective are best accomplished at the edge, but the advantages in terms of latency are outweighed by the cloud’s more potent computing capabilities for more complex tasks. “When the task is small, the response time is dominated by the communication time, and the communication time is much smaller for edge systems,” she said. “Once you talk about larger tasks, however, there are more resources in the cloud, so computing time becomes more of a component in response time and the cloud connection will be faster than the edge.” “We also noted that connections to the cloud are much faster in on-campus conditions than they are in nearby residential areas, and this is well-known – connections from campuses to the cloud are optimized.” It’s an important point for academic researchers, she noted. Testing systems in areas that might not have a university laboratory’s optimized network connections yields results that are much more applicable to the real-world challenges faced by businesses.
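Her observation, that response time is the sum of communication time and computing time, can be sketched with a toy model. The link latencies and compute rates below are illustrative assumptions, not her measurements.

```python
def response_time(task_units, comm_time, compute_rate):
    """Total response time = communication time + compute time."""
    return comm_time + task_units / compute_rate

# Assumed numbers: the edge is close (low comm_time) but slow;
# the cloud is far but has far more compute.
edge  = dict(comm_time=0.005, compute_rate=10.0)    # 5 ms link, 10 units/s
cloud = dict(comm_time=0.050, compute_rate=100.0)   # 50 ms link, 100 units/s

for task in (0.1, 50.0):   # a small task and a large one
    t_edge = response_time(task, **edge)
    t_cloud = response_time(task, **cloud)
    print(task, "edge wins" if t_edge < t_cloud else "cloud wins")
```

With these assumptions the small task is dominated by communication time, so the edge wins; the large task is dominated by computing time, so the cloud wins, matching the trade-off she describes.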


Achieving the right balance of data privacy and IT security


A comprehensive data protection strategy must consider the integration of best practices to both security and privacy. Data integrity, retention, and availability are part of the overall data protection goal for an organization, and as such, they are tied directly to individuals’ rights as data subjects. ... Privacy cannot exist without security, but security can exist without privacy – not an ideal situation for anyone concerned. With the continued advance of technology, organizations and individuals must continue to increase awareness and knowledge of data protection, data threats, and the steps required to ensure security and privacy while still maintaining effective business practices and relatable social media interactions. The way to develop a resilient privacy and data protection program is to combine privacy- and security-related thinking into a common approach that makes it easier for employees in all organizational levels to do the right thing. As we continue to move forward in the data-driven world, we must view ourselves as data subjects and strive to attain an agile balance between security and privacy interests.


New details released on Huawei's intent-based network


The new S7530-HI and S6720-HI are fully programmable Ethernet switches based on Huawei's silicon Ethernet Network Processor. The custom application-specific integrated circuit delivers advanced features and is complemented with merchant silicon for standard functions. One of the unique attributes of this intent-based network line is it includes an integrated wireless controller for unified wired and wireless network management. The S7530-HI is equipped with all Gigabit Ethernet ports, and the S6720-HI has 100 Gigabit Ethernet uplinks. That makes the S6720-HI the first programmable, fixed form-factor switch with uplinks of that speed. These switches target the campus network and are designed to work with Huawei's wireless access points, which are ready for the internet of things, because they support a range of wireless protocols, including Bluetooth, Zigbee and radio frequency ID.


How Bank of England is using Splunk for proactive security


The bank is using Splunk to move away from a reactive SOC that only responds to known threats, and is now working towards being more proactive – or, as Pagett calls it, SOC 2.0. “The proactive model is around getting in lots of data and then what we call behavioural profiling or adversary modelling,” he says. “We try to model what our attackers might do from a behavioural point of view, and then we look for those behaviours.” Pagett says hackers can change the technology and techniques they use, but it is difficult for them to change their behaviour, making this the easiest way to spot when an attack is about to happen or is under way. The bank uses Splunk to mine the datasets needed to begin predicting these shifts in behaviour. This could range from a large number of failed password attempts to something more sophisticated, such as a spear-phishing attack with booby-trapped Microsoft Word attachments.
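A behavioural rule of the "large number of failed password attempts" kind can be sketched as follows. The events, field names, and threshold are hypothetical, and a real Splunk deployment would express this as a search query over indexed logs rather than as Python.

```python
from collections import Counter

# Hypothetical authentication events: (user, action) pairs.
events = [
    ("alice", "login_failed"), ("alice", "login_ok"),
    ("bob", "login_failed"), ("bob", "login_failed"),
    ("bob", "login_failed"), ("bob", "login_failed"),
    ("bob", "login_failed"), ("bob", "login_failed"),
]

THRESHOLD = 5  # assumed baseline: more than 5 failures looks anomalous

failures = Counter(user for user, action in events
                   if action == "login_failed")
flagged = [user for user, count in failures.items() if count > THRESHOLD]
print(flagged)   # ['bob']
```

The behavioural-profiling approach described above layers many such rules, comparing current activity against a modelled baseline rather than against a list of known threat signatures.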


IT pros see security win with Microsoft Managed Desktop

Microsoft administrators said they see a clear value to this managed service -- which could potentially remove some tedious aspects of desktop management -- in an age when most users prefer physical devices. "We have folks spread across the country, so we have to wait for a shipment of laptops, and then image them and get them set up for the users," said David Bussey, systems engineer at the nonprofit Public Company Accounting Oversight Board in Washington, D.C. "What [Microsoft Managed Desktop] has to offer fits some of those pain points we're going through." Microsoft Managed Desktop allows businesses to choose two- or three-year hardware refresh cycles from a list of available devices. Right now, that list is limited to Microsoft's own Surface hardware -- specifically the Surface Laptop, Surface Pro and Surface Book 2. It plans to expand device offerings with third-party partnerships, the company said.



Quote for the day:


"Scientific knowledge is an enabling power to do either good or bad - but it does not carry instructions on how to use it." -- Richard Feynman


Daily Tech Digest - October 02, 2018

SIE Europe
SIE Europe is co-founded by three international Internet luminaries: Dr. Paul Vixie, Chairman and CEO of Farsight Security, Christoph Fischer, CEO of BFK edv-consulting GmbH and Peter Kruse, co-founder of CSIS Security Group A/S. “We founded SIE Europe to build a European-based community of Internet defenders who want to make the Internet safer for all users. As part of this initiative, SIE Europe will provide the infrastructure to collect, aggregate and share real-time DNS data in strict compliance with the privacy laws and regulations of the European Union, including the General Data Protection Regulation (GDPR),” said Dr. Paul Vixie, Chairman and CEO of Farsight Security. All online transactions, good or bad, begin with the DNS. By providing visibility to the IP addresses, domain names and other digital artifacts of the DNS used by threat actors, security professionals will be able to accurately identify and map criminal infrastructures in their networks and take preventive measures to protect their networks from future cybercrime activity.



Facebook could face up to $1.6bn fine for data breach


Facebook said the attack exploited the “complex interaction of multiple issues in our code” and stemmed from a change made to the video uploading feature in July 2017. In response, Facebook said it had fixed the vulnerability, informed law enforcement and reset the access tokens of the almost 50 million accounts known to be affected. “We’re also taking the precautionary step of resetting access tokens for another 40 million accounts that have been subject to a “View As” look-up in the last year. As a result, around 90 million people will now have to log back in to Facebook, or any of their apps that use Facebook Login,” said Facebook. The company has also turned off the “View As” feature while it conducts a security review, but admitted it has yet to determine whether accounts were misused or any information accessed. Facebook said it is also still trying to establish the location and identity of the attackers and will reset the access tokens of any other accounts it believes may have been affected.


The CTO role: ‘It’s about planning and business opportunities’

Every CTO role is different, and in this case Hanson focuses on the sales side of the business, whereas other CTOs are more concerned with the development of products. “We have some very intelligent people in our product management division who look after the actual development of products. So I’m not on the product side. I’m more on the sales side,” confirms Hanson. His responsibility centres on finding out how Informatica’s prospects and customers use the company’s technology. He needs to understand their challenges, their governance and compliance issues moving forward, as well as the pressures in their marketplace and how they need to leverage data to stay successful and competitive. “It’s really my job to try and collect that information, and think about innovative uses for our products as they currently exist, and what type of initiatives we should try and help our prospects and customers with,” explains Hanson.


Big Data: changing the future of business models

The ability to analyse and make informed decisions from the use of data and its analytical capabilities is vital if a business is to succeed. In an increasingly competitive industry, it is imperative that firms are able to make quick and increasingly complex decisions to cater for the changing demands from customers and evolving market conditions.  By harnessing data, businesses can identify new opportunities within their existing business operations, create more efficient operations, increase profitability and improve customer service. By embracing data, businesses can gain a competitive edge over their rivals, ensuring they don’t lag behind the competition. Over the years, our data team has worked alongside businesses to help them find data-driven solutions and technologies with the aim of fast tracking their objectives and stimulating growth.


How I Lost My Faith in Private Blockchains

The business and legal worlds operate on the basis of centralized entities, and while that remains the case, any forced attempt at decentralization is likely to fall short. While it is possible that in the future we may see decentralized businesses, they are far more likely to come from the public blockchain world, where they are able to grow organically in an entirely new paradigm. In the meantime, institutions and individuals should evaluate permissioned blockchains like any other technology: it isn't magic, and it should be assessed as one would assess any other. The benefits of a technology should never be assumed based on buzzwords, hype or the fear that "everyone else is doing it, so why shouldn't I?" Instead, benefits should be assessed by asking what the business problem is, what different technology options are available, and what the quantifiable costs and benefits of each are.


LinkedIn the latest to introduce its own server designs

The idea behind the designs is to reduce the amount of work it takes to deploy servers in a data center. Again, this seems to assume people will build their own the way LinkedIn and other hyperscalers do it. It’s all designed to be like building with Lego bricks. LinkedIn also wanted to standardize hardware across both primary and edge data centers, which is likely why Vapor IO is involved. Edge locations don’t have a readily available technician, so if a company sends a technician to an edge container, the last thing it wants to do is make the tech waste time trying to figure out the layout of the equipment. By having common hardware between the two, the technician will work with familiar gear. LinkedIn claims these designs will mean being able to build infrastructure for 1 percent of the cost and six to ten times faster integration time, with greater power efficiency and other cost savings. However, it does not address the issue of IT staff building the hardware. LinkedIn, Google, Facebook, etc., can afford to hire engineers who build servers all day. Your average IT shop does not.


This is how cyber attackers stole £2.26m from Tesco Bank customers

The attackers most likely used an algorithm that generated authentic Tesco Bank debit card numbers and, using those virtual cards, attempted to make thousands of unauthorised debit card transactions. The FCA said Tesco Bank's failures included the way in which the bank distributed debit card numbers and mistakes made in its reaction to the attack, which meant that no action was taken for almost a day after the incident was first uncovered. A number of deficiencies in the way Tesco Bank handled security left customers vulnerable to cyber attackers in an incident that was "largely avoidable", said the FCA analysis of the incident, about which Tesco Bank had until this point been tight-lipped -- to the frustration of other financial institutions. Poor design of Tesco Bank debit cards played a significant role in creating the security vulnerabilities that led to thousands of customers having their accounts emptied. One of these involved PANs -- the 16-digit card number sequence used to identify debit cards.
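One reason algorithmically generated card numbers can look "authentic" is that PANs are not random: each one carries a Luhn check digit, computed by a public algorithm, so an attacker who knows a bank's number ranges can enumerate candidates that pass this basic validity check. A minimal sketch of the check (the PAN shown is a well-known test number, not a real account):

```python
def luhn_valid(pan: str) -> bool:
    """Check the Luhn checksum that every valid 16-digit PAN must pass."""
    digits = [int(d) for d in pan]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9       # equivalent to summing the two digits
        total += d
    return total % 10 == 0

print(luhn_valid("4111111111111111"))   # True  (canonical test PAN)
print(luhn_valid("4111111111111112"))   # False (check digit broken)
```

Because the checksum is public, it offers no secrecy; this is why the FCA focused on how card numbers were distributed and on transaction controls rather than on the numbers themselves.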


Google Chrome 70 is coming. Are your security certificates in order?

For those unfamiliar with the details of this, in 2017 Google and Mozilla decided to deprecate all Symantec-issued digital certificates based on their assessment that Symantec did not correctly validate its SSL certificates prior to issuing them to customers. Google and Mozilla then decided to put in place a multi-step plan to distrust any certificates issued from the Symantec PKI. This plan phased out Symantec certificates over the next year and a half. Instead of following the Google plan, Symantec elected to sell its certificate business to DigiCert. Despite the transaction, the requirement to replace all certificates issued from the Symantec PKI remained intact, requiring millions of certificates to be replaced during 2018. To assist customers in replacing their certificates, DigiCert contacted each certificate holder, offering free replacement certificates chained to the trusted DigiCert roots. The first major distrust date was on December 1, 2017, when no additional TLS certificates could be issued through the Symantec PKI. Prior to that date, DigiCert cut over all issuance processes to its PKI and validation systems.


Open Compute Project eyes European enterprise adoption with Experience Centre opening


The OCP’s championing of 21-inch server rack designs is often cited as a partial barrier to enterprise adoption of its technologies, as it can make it harder for users to deploy the technology in existing datacentres, where smaller server racks are the norm. The centre’s opening is being overseen by datacentre infrastructure manufacturer Rittal and OCP supplier and service provider Circle B, in conjunction with Switch Datacenters, which is in the midst of building a datacentre based on OCP principles. “The three companies have determined that in the technology sector, IT managers at large enterprises and governments in the ... “These principles form the basis on which many hyperscalers operate. By adopting OCP designs in their datacentres, large enterprises and governments can benefit from the same advantages as the hyperscalers: cost reductions, lower energy usage and much more flexibility.”


Building Agile Data Lakes with Robust Ingestion and Transformation Frameworks – Part 1


With the advent of Big Data technologies like Hadoop, there has been a major disruption in the information management industry. The excitement around it is not only about the three Vs of data – volume, velocity and variety – but also about the ability to provide a single platform to serve all data needs across an organization. This single platform is called the Data Lake. The goal of a data lake initiative is to ingest data from all known systems within an enterprise and store it in this central platform to meet enterprise-wide analytical needs. However, a few years back Gartner warned that a large percentage of data lake initiatives have failed or will fail, becoming more data swamp than data lake. How do we prevent this? We have teamed up with one of our partners, Clarity Insights, to discuss the data challenges enterprises face, what caused data lakes to become swamps, the characteristics of a robust data ingestion framework, and how such a framework can help make the data lake more agile.



Quote for the day:


"One measure of leadership is the caliber of people who choose to follow you." -- Dennis A. Peer