Daily Tech Digest - October 08, 2018

A rough guide to your next (or first) fog computing deployment

There’s a hierarchy of storage options for fog computing that runs from cheap but slow to fast but expensive. At the cheap-but-slow end sits network-attached storage. A NAS offers huge storage volumes, particularly over a distributed network, but that can mean latency measured in seconds or minutes. One step up, rotating disks could work well for big media libraries or data archives, according to Byers, while providing substantially better response times. Further up the hierarchy, flash storage in the form of regular SSDs provides much the same functionality as a spinning platter, with the well-known tradeoff of a higher price per GB in exchange for much faster access times. That could work best for fast bulk storage, though Byers also notes concerns about flash wearing out after a large enough number of write cycles. “After you write to a given address in the chip more than about 2,000 times, it starts getting harder to reprogram it, to the point where, eventually, you’ll get write failures on that sector of the flash drive,” he said.



GDPR As Catalyst: Protect Data And Grow the Business (Part 4)

A successful collaboration depends on the ability to share information quickly and easily with third-party companies, working across organizational and geographical boundaries. It is vital, however, to balance giving business partners ready access to enterprise data against safeguarding valuable intellectual property and sensitive corporate information. In addition, organizations must meet many industry- and country-specific compliance requirements – including the General Data Protection Regulation (GDPR) for managing personal data. Data processors and controllers are both responsible for meeting GDPR requirements for personal data, wherever that data may sit in their business network, and they need to be able to share data with partners quickly and securely. By using dynamic attribute-based access controls, they can classify and segregate data based on metadata, content, association, or policy; establish fine-grained, attribute-based access policies; automate access authorization based on those policies; and centralize activity logging and auditing to simplify compliance reporting.
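To make those mechanics concrete, here is a minimal Java sketch of an attribute-based access decision. The attribute names, policy rules, and class names are hypothetical illustrations, not part of GDPR or any product.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical ABAC sketch: access is granted only when every policy rule
// holds for the attributes attached to the request.
public class AbacDemo {

    record AccessRequest(Map<String, String> attrs) {}

    public static void main(String[] args) {
        // Example rules: partners may read data classified as "shared", and
        // personal data may be shared only if it is flagged GDPR-cleared.
        List<Predicate<AccessRequest>> policy = List.of(
            r -> "read".equals(r.attrs().get("action")),
            r -> "shared".equals(r.attrs().get("classification")),
            r -> !"personal".equals(r.attrs().get("dataType"))
                 || "true".equals(r.attrs().get("gdprCleared")));

        AccessRequest request = new AccessRequest(Map.of(
            "action", "read",
            "classification", "shared",
            "dataType", "personal",
            "gdprCleared", "true"));

        boolean allowed = policy.stream().allMatch(rule -> rule.test(request));
        System.out.println(allowed ? "ACCESS GRANTED" : "ACCESS DENIED");
        // A real system would also write every decision to a central audit
        // log to support the compliance reporting described above.
    }
}
```

The design point is that every rule reads attributes rather than user identities, which is what makes such policies fine-grained and centrally auditable.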


Sony Smart TV Bug Allows Remote Access, Root Privileges


The flaws – a stack buffer overflow, a directory traversal and a command-injection bug – were found in March by Fortinet’s FortiGuard Labs team. The most serious of the vulnerabilities is the command-injection bug (CVE-2018-16593), which is tied to a proprietary Sony application called Photo Sharing Plus. The app allows users to share multimedia content from their phones or tablets via Sony TVs. “This application handles file names incorrectly when the user uploads a media file,” wrote Fortinet’s Tony Loi, who found the vulnerability. “An attacker can abuse such filename mishandling to run arbitrary commands on the system, which can result in complete remote code-execution with root privilege.” Fortinet researchers said a compromised TV could be recruited into a botnet or used as a springboard for additional attacks against devices on the same network. To be successful, an adversary would need to be on the same wireless network as the Sony TV.
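The class of bug Loi describes is easy to illustrate generically. The sketch below is not Sony's code; it contrasts a shell-interpolated filename, which lets an attacker smuggle in extra commands, with passing arguments as a list so that no shell ever parses the name (Unix-style paths assumed, payload kept harmless):

```java
import java.io.IOException;
import java.util.List;

public class FilenameInjectionDemo {
    public static void main(String[] args) throws IOException {
        // A crafted "filename" as an attacker might upload it (harmless here).
        String filename = "photo.jpg; echo pwned";

        // UNSAFE: concatenating into a shell string means the shell parses
        // the ';' and runs the attacker's command with the app's privileges.
        new ProcessBuilder("/bin/sh", "-c", "cp " + filename + " /media/").start();

        // SAFER: arguments passed as a list are never parsed by a shell, so
        // the ';' is just an odd character in a (nonexistent) file name.
        new ProcessBuilder(List.of("cp", filename, "/media/")).start();
    }
}
```

If the application runs as root, as on the affected TVs, the injected command runs as root too, which is why filename sanitization alone is a weaker defense than avoiding the shell entirely.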


Separating high value from low value KPIs in data governance efforts

It's not necessarily a bad thing for a business to know how many data quality problems occur in a specified span. The reason this can be a lower-value KPI in many organizations is that it is usually not specific enough. In contrast, a KPI for resolved issues indicates whether a company is making gains in remedying problems. Looking only at the overall number of data quality issues also becomes more problematic if a company has numerous locations, and failing to separate issues into outstanding and resolved categories can promote inaccurate presumptions about performance. It's certainly best to keep the number of data quality issues as low as possible, but it's arguably even more critical for company representatives to ensure they're promptly addressing and thoroughly handling all issues. Viewing only the overall count may not reveal how those problems actually get treated.
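As a toy illustration of that distinction (hypothetical data and field names), splitting issues by status and location makes the higher-value KPIs fall out directly:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DataQualityKpis {
    record Issue(String location, boolean resolved) {}

    public static void main(String[] args) {
        List<Issue> issues = List.of(
            new Issue("Plant A", true), new Issue("Plant A", false),
            new Issue("Plant B", true), new Issue("Plant B", true));

        long outstanding = issues.stream().filter(i -> !i.resolved()).count();
        double resolutionRate = 100.0 * (issues.size() - outstanding) / issues.size();

        // Per-location breakdown avoids masking a struggling site.
        Map<String, Long> outstandingByLocation = issues.stream()
            .filter(i -> !i.resolved())
            .collect(Collectors.groupingBy(Issue::location, Collectors.counting()));

        System.out.printf("Outstanding: %d, resolution rate: %.0f%%, by site: %s%n",
            outstanding, resolutionRate, outstandingByLocation);
    }
}
```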


The first smart display for business: Your Android phone

The new smart display interface will constantly show contextual information such as the time, the weather, battery status, and other data. Google released the third version of its smartwatch operating system, Wear OS, which comes with an improved Google Assistant. The biggest change: proactivity. The Wear OS Google Assistant can offer all kinds of contextual information (some of it based on personal data mined from Gmail). This makes sense, because wristwatches can gather rich contextual data, such as the user's location and whether the user is walking or sitting. I think this is a preview of what’s coming for the docked Android phone version of Google Assistant. Phones have even better contextual information than watches, because placing the phone in the dock says a lot about intention — namely, that the user is not about to leave and go somewhere else, but plans to stay in a single place and may want hands-free notifications and assistance.


Microsoft halts rollout of Windows 10 October 2018 Update: What happens next?

Via email, a Microsoft spokesperson confirmed that announcement: "We have paused the rollout of the update while we continue to investigate reports from some customers." In a tweet, Dona Sarkar, who runs the Windows Insider Program, advised anyone affected by this issue to call Microsoft's support lines: "They have the tools to get you back to a good state." The implication in that tweet (and in the language from the original bulletin) is that the files have not been deleted but are available elsewhere on the system disk. Update: Roughly 36 hours after the initial publication of the support bulletin, Microsoft edited its contents. It now reads, "If you have manually checked for updates and believe you have an issue with missing files after an update, please minimize your use of the affected device and contact us directly..." [emphasis added] In the United States, you can reach Microsoft Support at 1-800-MICROSOFT (1-800-642-7676). For Windows 10 customers in other regions, check the list of local support numbers on the Global Customer Service Phone Numbers page.


Software-defined networking security involves 3 factors


To fully protect confidentiality, it's necessary to encrypt network traffic. IT teams should also consider encrypting the control channel in the environment, which includes the communications between an SDN controller and the data plane devices that actually move packets. Moreover, if an SDN system includes any ability to cache data -- e.g., as part of a network flight recorder feature -- or if it has data compression features, it may be necessary to encrypt data stored in memory, or even on a disk, in data plane devices or the controller. SDN systems can defend themselves from attack, but this requires hardened platforms for both controllers and data plane devices. If the SDN controller is running on a poorly secured Linux server, for example, it doesn't matter how secure the SDN system riding on the nodes is at a high level. Any off-the-shelf SDN system should have a secured base -- whether Linux, CentOS or something else -- when it comes out of the box.
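As a generic sketch of the control-channel principle (not any vendor's actual API), a TLS-wrapped connection from controller to switch might look like the following in plain Java; the hostname is hypothetical, and 6653 is the IANA-assigned OpenFlow port, so without a real switch listening the connection will simply fail:

```java
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

// The control channel carries flow-programming commands, so it is wrapped
// in TLS: an on-path attacker can neither read nor tamper with it, and the
// handshake verifies the switch's certificate before anything is sent.
public class SecureControlChannel {
    public static void main(String[] args) throws Exception {
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        try (SSLSocket socket =
                 (SSLSocket) factory.createSocket("sdn-switch.example", 6653)) {
            socket.startHandshake(); // certificate validation happens here
            socket.getOutputStream().write("FLOW_MOD ...".getBytes());
        }
    }
}
```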


Juniper CEO Rahim talks network, security and multicloud trends

There’s no way to get around the biggest trend, and that is the tectonic shift to cloud and multicloud. I am not just talking about the hyperscale users either. I am talking telcos and enterprises. It’s a sign of the times that every CIO is trying to take advantage of a multicloud environment, whether it’s to build out an infrastructure to handle it or deploy an overlay or underlay – they just cannot do it by themselves. That’s why we have so heavily invested in multicloud connectivity and software services development.  The second one is the move to 5G. Overall we don’t think 5G technologies will go mainstream until next year, but the preparation for it is well underway. Cloud services and providers are developing the infrastructure and capacity to take advantage of 5G now. Security would be the third, and what we are seeing is the trend of customers tying together networks and security technologies to develop more effective policies to block malware and protect the enterprise against threats better than ever before.


Open source is the future, but it will cost you more than you think

Open source has never been known for finishing projects. Open source developers have always gotten a project to a good, solid point where it does 80% of what you want, or it works well enough but without great interfaces. What tends to happen is that commercial companies like Red Hat end up making it usable afterwards. We obviously also see the public cloud beginning to take those open source projects and turn them into managed services as well. Such companies—including system integrators—do the "last mile" work necessary to get open source projects ready for enterprise consumption. Red Hat makes billions on this model, yet it still remains more of an anomaly than it should. We have MongoDB, Elastic, the combined Cloudera and Hortonworks, and other open source companies, but not nearly as many as we should, given how dominant open source has become in enterprise infrastructure.


Be Prepared for Disruption: Thinking the New Unthinkables


The fear is that talk of “purpose” still remains a convenient and fashionable slogan, but empty. It may only be mainstreamed when there is hard evidence that having strong values adds money and social value to the company. Mayer is leading the charge to find rigorous data through his work at Oxford and the British Academy project on the Future of the Corporation to establish a causal link between values and value. “Until that’s the case, it’s going to be extremely difficult to persuade the investment community that they should be moving in this direction in a big way,” he says. This is despite the very clear signals now being sent by the public to both corporate and political leaders that purpose matters. These unthinkable scenarios — such as Collymore’s young customers deserting overnight — must be accepted, not ignored. The challenge is even more acute in a world of digital transformation. Artificial intelligence and biotech are bringing huge changes to society. For leaders, a new clarity of purpose and a moral compass is essential, as is an understanding that huge new disruptions are the hallmark of the new normal.



Quote for the day:


"Open Leadership: the act of engaging others to influence and execute a coordinated and harmonious conclusion." -- Dan Pontefract


Daily Tech Digest - October 07, 2018

Over three years and one global trade war later, the possibility of this scenario has turned from a fringe and ahead-of-its-time concern, to a mainstream and relevant one. As AI continues to advance at a dizzying pace, the real-world applications of AI-related technologies have also increased – and so have concerns about living in a world inundated by intelligent machines capable of performing specific tasks. The dilemma facing lawmakers and leaders today is how developments in Artificial Intelligence will be thoughtfully monitored at a national and global level to protect the interests of man- and womankind, while also allowing enough freedom for citizens, corporations, and governments to leverage the new and rapidly advancing technology to increase efficiencies and generate added value. While there are no easy or glaringly obvious answers to this dilemma, Harvard’s Belfer Center for Science and International Affairs produced a report in 2017 recommending that the National Security Council, DoD, and State Department start studying what internationally agreed-on limits should be imposed on AI.


Rapidly developing technology has not only disrupted industries and business models—there is evidence it is changing consumer behavior and reshaping how companies should view their customers. M&E companies’ outreach campaigns and efforts to make technology user-friendly have paid off with older generations, whose behavior is mimicking younger generations’. Indeed, results from the 12th edition of Deloitte’s Digital Media Trends Survey indicate that the behaviors of Gen Z (ages 14–21), millennials (ages 22–37), and Gen X (ages 38–53) are converging. ... Similarly, half of Gen X respondents reported that they play video games frequently, almost matching Gen Z and millennial respondents. As a result, many M&E providers are struggling to segment media consumption habits based only on generational behavior. Demographic generalizations—such as the assumption that people of the same gender and in a similar age and income bracket will consume products and services the same way, and be engaged by the same marketing ploys—are less accurate than they used to be.
AI for security can help defenders in a myriad of ways. However, there are also downsides to the emergence of AI. For one, the technology has been leveraged by cybercriminals, and it’s clear that it can be co-opted for various nefarious tasks. These include at-scale scanning for open, vulnerable ports – or automated composition of emails that have the exact tone and voice of a company’s CEO, learned over time through 24-7 eavesdropping. And in the not-too-distant future, that automatic mimicking could even extend to voice. IBM scientists, for instance, have created a way for AI systems to analyze, interpret and mirror users’ unique speech and linguistic traits – in theory, to make it easier for humans to talk to their technology. However, the potential for using this type of capability for malicious spoofing is obvious. Meanwhile, the zeal for adopting AI across vertical markets – for cybersecurity and beyond – has opened up a rapidly proliferating new attack surface—one that doesn’t always feature built-in security-by-design.


Bringing cloud intelligence to the edge in connected factories

A ton of new scenarios are enabled by the ability to run AI that formerly lived only in the cloud on local devices. Machine learning can now be used in IoT scenarios that require real-time responses. By eliminating the time it takes to transmit data to the cloud and back, you achieve close to instantaneous data analysis, which is vital to making critical operating decisions. Mission- and safety-critical IoT solutions are now resilient to interruptions in internet connectivity: Azure IoT Edge enables devices to continue operating and transmitting data for analysis even offline, ensuring reliable production despite intermittent internet connectivity. Finally, IoT solution costs are decreased. Transmitting all your data to the cloud can be expensive, especially if you have facilities in remote places where internet access is costly. By doing your analysis at the edge, you reduce the amount of data that you need to send to the cloud.
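A vendor-neutral sketch of that cost argument follows; this is not the Azure IoT Edge SDK, and the threshold and readings are invented. The point is simply that local analysis turns a continuous telemetry stream into a trickle of anomalies:

```java
import java.util.List;

// Edge-side filtering: readings are analyzed locally and only anomalies are
// sent over the uplink, so the volume of cloud traffic shrinks dramatically.
public class EdgeFilter {
    static final double VIBRATION_LIMIT = 7.5; // hypothetical alert threshold

    public static void main(String[] args) {
        List<Double> readings = List.of(3.1, 3.3, 9.8, 3.0, 8.4);
        for (double r : readings) {
            if (r > VIBRATION_LIMIT) {
                sendToCloud(r); // rare: only anomalies leave the site
            }
            // normal readings stay local, aggregated or discarded on-premises
        }
    }

    static void sendToCloud(double reading) {
        System.out.println("uplink -> anomaly: " + reading);
    }
}
```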


Comparing Features of 4 Popular Machine Learning Platforms
In simple words, it is the part of an artificial intelligence program that helps the computer learn and adapt without being programmed for each new change. The commercial world exploits machine learning under the label “predictive analytics.” Predictive analytics allows researchers, data scientists and engineers to produce reliable results by learning from the history and patterns of input data. ... As the world continues to develop artificial intelligence and machine learning software, India is keeping up with the growth. The government of India has also started to focus on developing its own plan for AI. Software development companies in India are now focusing on creating artificially intelligent computer programs that may be used to assist human intelligence in fields like healthcare, weather and climate, crowd management, space research, and education. App developers and app development agencies in India are now coming up with their own applications of machine learning.


The automation imperative

While automation success is possible through either traditional top-down (waterfall) deployment or more flexible agile methods, a systematic approach is key. Only 5 percent of respondents at successful companies say their deployment methods have been ad hoc, compared with 19 percent of peers not reporting success ... What’s more, successful organizations are implementing different automation technologies from the ones other organizations are adopting. Respondents with successful automation efforts are more than twice as likely as others to say their organizations are deploying machine learning. They are also more likely to cite the use of other cognitive-based automation capabilities, such as cognitive agents and natural-language processing. At respondents’ organizations overall, the most commonly adopted automation technology is robotic process automation, which respondents say is deployed in equal shares of successful and other organizations.


Remembering that Navarra was also wearing a Fitbit on her left wrist at the time of her death, the investigators also worked to crack into that data. They ended up getting a search warrant for it. Fitbit Director of Brand Protection Jeff Bonham took custody of Navarra’s device and worked on retrieving data on her heart rate and movements from her final days. The investigators noted that Navarra’s desktop computer was just five to ten feet away from where they found her body in the dining room. Her last recorded movement was on Thursday, September 13, approximately when the coroner removed her body from her home. Before that, her last movement was on Saturday, September 8, the day Tony dropped off the pizza. It was also the last day the device recorded her heart rate. The Fitbit recorded a “significant” heart rate spike at 3:20pm, and it then rapidly declined. By 3:28—while Tony’s car was still parked in her driveway—her heart had stopped beating, according to the device.


A source of controversy due in part to fears for human employment, the presence of robots in our daily lives is nevertheless inevitable, engineers at the conference said. The trick to making them more palatable, they added, is to make them look and act more human so that we accept them into our lives more easily. In ageing societies, "robots will coexist with humans sooner or later", said Hiroko Kamide, a Japanese psychologist who specialises in relations between humans and robots. Welcoming robots into households or workplaces involves developing "multipurpose machines that are capable of interacting" with humans without being dangerous, said Philippe Soueres, head of the robotics department at a laboratory belonging to France's CNRS scientific institute. ... As such, robots must move around "in a supple way" despite their rigid mechanics and stop what they are doing in case of any unforeseen event, he added. That's why people are choosing "modular systems shaped like human bodies" which are meant to easily fit into real-world environments built for humans.
An Empathy Map is just one tool that can help you empathise and synthesise your observations from the research phase, and draw out unexpected insights about your user’s needs. An Empathy Map allows us to sum up our learning from engagements with people in the field of design research. The map provides four major areas on which to focus our attention, thus giving an overview of a person’s experience. Empathy maps are also great as a background for the construction of the personas you will often want to create later. An Empathy Map consists of four quadrants, reflecting four key traits the user demonstrated during the observation/research stage: what the user Said, Did, Thought, and Felt. It’s fairly easy to determine what the user said and did. However, determining what they thought and felt should be based on careful observation and analysis of how they behaved and responded to certain activities, suggestions, conversations, etc.


How to explain containers in plain English

Software containers can nonetheless be a bit challenging to explain, particularly if the audience isn’t technical and doesn’t understand certain fundamentals about how software gets built and operated. ... Containers solve the packaging problem of how to quickly build and deploy applications. They’re akin to virtual machines, but with two notable differences: they’re lightweight and spin up in seconds, and they move reliably from one environment to another (what works on the developer’s computer works the same in dev/test and production). In the digital era, applications are the business – speed and innovation are creating winners and losers across all industries. The beauty of containers, and why organizations are moving in this direction, is that they dramatically speed up development.




Quote for the day:

"Without growth, organizations struggle to add talented people. Without talented people, organizations struggle to grow." -- Ray Attiyah

Daily Tech Digest - October 06, 2018

Scientists Just Created Quantum Artificial Life For The First Time Ever


This is still an early proof-of-concept prototype, but it opens the door to diving further into the relationship between quantum mechanics and the origins of life. The same principles governing quantum physics may even have had a role to play in forming our genetic code. It's like playing the Sims on a whole new level of physics. Creating artificial life inside computers has been the subject of many a previous experiment, but current software typically takes a classical, Newtonian approach in producing these models – step by step, with logical progressions. We know that the real world adds a dab of quantumness to the mix – strange phenomena happening at the micro and macro level – and the new research aims to add that same unpredictability to computer simulations as well. In other words, the simulations are no longer limited to 1s and 0s, but can introduce some of the randomness we see in everyday life. That promises to open up a whole new field ready to be explored.




If an AI system can create human-readable reports from unstructured internet data, then it can also decipher legislation. It will take time to train AI how to process legislative language effectively, but as ML algorithms become ubiquitous, easily deployable, and more affordable to run, it’s likely that someone will develop AI to make legislation more transparent. AI can transform the legislative process by moving it from lawyers manually reading and writing bills to modeling them. Perhaps, one analyst may read and write bills as another leverages AI, natural-language-processing algorithms, and data visualization to model their impact within existing complex legislative frameworks. AI can help to model, predict, and monitor the impact of legislation that lawmakers pass, but it can also keep the same lawmakers accountable on many other fronts. In a 2018 Gallup poll, only 5 percent of those surveyed had a high degree of confidence in the U.S. Congress. In many countries, simply trying to understand what an elected official or candidate running for public office believes or has historically voted on can be a daunting task.


Has Innovation Just Become An Infectious Disease?


Today's strategic imperative seems to be the notion of "innovate or die," and that idea might be a little too close to the truth for many. But the whole idea of innovation for some of pharma seems vague, focused on both an ambiguous endpoint and a fuzzy process. Innovation is served up as an ingredient in a process that offers an expectation of magical transformation. Never in a box, innovation is that unbridled perspective that everyone tries (paradoxically) to put into a package and sell to their customers. So, we have an epidemic. Accelerators, incubators and bean bag chairs give me goose bumps. Could it be that there's just too much innovation? I don't think that's the case. But I do believe that the germ of innovation can grow in different ways that are very powerful--both transformative and malignant! The role of innovation is more a function of applying invention to a marketplace. Amazon and Apple have largely mastered this process and have ignited the flame of consumer-centricity in the life sciences industry.


Data governance in healthcare

The vast amount of data generated and collected by a multitude of stakeholders in healthcare comes in so many different forms — insurance claims, physician notes, medical records, medical images, pharmaceutical R&D, conversations about health in social media, and information from wearables and other monitoring devices. Data is growing faster than ever before and by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet. It is the scale of this data that sits at the very heart of the fourth industrial revolution and the impact it will ultimately have on the way we care for patients and communities in the future. As healthcare environments become increasingly dependent on digital technologies to support care delivery (at a user, organizational and healthcare system level), their ability to use and exchange data becomes a critical enabler of transformation. In healthcare systems around the world, data and analytics (D&A) is re-shaping the way leaders make evidence-based decisions to improve patient outcomes and operational performance.


Why Google is So Sure Their AI Solution Will Beat Out the Rest


Google has mastered search. It has dominated the field in this area, but the company is making its way in cloud computing as well. Another area it seeks to lead? Artificial intelligence (AI). Technology companies are selling AI as part of their cloud services, and profiting as a result, and the capacity of their data centers is sufficient to support it. A technology powerhouse, Google has deep experience with the cloud. It is now looking to serve external customers in new ways, rather than focus all its attention on internal operations. Those customers are not only the end-user consumers one sees every day on the street: the company aims to draw in businesses such as Netflix and Spotify, against competitors as big as Microsoft and Amazon. Why does Google see its AI solution as the one to beat among business services? The answer may lie in the techniques and capabilities the company is investing in. Machine learning is one of them, and has been an interest for a long time. Others include image recognition and also search and video recommendations.


How to Avoid the All-Flash Capacity Glut

When all-flash drives were 1TB or smaller, most organizations needed to buy well over 24 drives, but now 24 drives (384TB at today's 16TB capacities) will more than cover the production storage needs of many data centers. For many AFA vendors these are the minimum configurations. Those that offer a 12-drive alternative will see a significant drop in performance, despite the fact that many of these high-capacity drives are rated to deliver 70,000 IOPS or more. Again, that is 70K IOPS PER DRIVE, yet many 12-drive systems can’t deliver more than 30,000 IOPS in total. Given the raw performance of an SSD, a 24-drive system should deliver about 1.5 million IOPS! The problem is that most storage vendors have built their software using legacy techniques. They haven’t rethought the algorithms at the core of the storage system, and they haven’t adapted them to take advantage of multi-core processors. Part of the reason for this development approach is time to market: by leveraging legacy code and legacy techniques, vendors can bring products to market faster.
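The arithmetic behind that claim is worth spelling out:

```java
// Per-drive capability times drive count gives the theoretical aggregate
// the storage software should be targeting.
public class AfaIops {
    public static void main(String[] args) {
        int drives = 24;
        int iopsPerDrive = 70_000; // rated capability cited above
        System.out.println(drives * iopsPerDrive + " IOPS theoretical aggregate");
        // 1,680,000 IOPS, roughly the "about 1.5 million" cited, versus the
        // ~30,000 IOPS some 12-drive systems actually deliver.
    }
}
```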


What's the Password? Play Some Music and Log In Via Brainwaves

Firstly, we capture the EEG signal from a brain-machine interface: currents that flow during synaptic excitations of the dendrites of many pyramidal neurons in the cerebral cortex. EEG signals have been shown to be sufficiently different between individuals and therefore suitable for use in the authentication process. Then we port that data into our EEG Workbench, which was created by Luis Gutierrez [currently working in the UC Riverside InfoSec department] as part of his master's thesis. ... Our main problem is the lack of available data. For machine learning, you need an enormous amount of data, so that's what we're focusing on right now—building up data stores. Secondly, there aren't many commercially available brain-machine interfaces with reliable data output that we can use for machine learning at this time. Many have an issue with the signal-to-noise ratio. We need to isolate which diode on the BMI gives us the best response.


Blockchain Grows Up as Bankers Take the Place of 'Crypto Cowboys'

Despite its promise in improving business functions -- finance and supply chain management are two of the most often-cited use cases -- there are still a number of hurdles to the commercial adoption of blockchain. One is that the technology is still in its relative infancy; another is how exactly regulators would account for illegal activity amidst a mass of cross-border payments. There are also economic factors. As long as "get rich quick" crypto fever is still alive, it's that much harder to incentivize blockchain enthusiasts to take on the less sexy work of building protocols for business. "Blockchain is an extremely powerful idea, but it's very far from being a mature technology," said Christian Laang, CEO of the supply chain management platform Tradeshift. "If people are becoming millionaires from ICOs [initial coin offerings], they're disincentivized to create the next generation of technology. There's a little bit of a bubble with all the short-termism."


Resilient Systems in Banking


Having a resilient service for customers means ensuring that when a failure occurs, the part of the system affected by the error is small in comparison to your system as a whole. There are two ways to ensure this. Redundancy is about ensuring the system as a whole extends out beyond the scope of failure. However much is impaired, we've simply got more in reserve. Isolation is about ensuring that the scope of failure remains confined within a small area and cannot spread out to the boundaries of our system. However you design your system, you must have good answers for both of these. With these in mind, you must mentally test your design against every fault you can imagine. To help, consider the following dimensions of failure: faults at an infrastructural level (like network failure), as well as faults at an application level (like uncaught exceptions or panics); and faults that are intrinsic to the software we build (caused by us, i.e. bugs) as well as those that are extrinsic (caused by others e.g. invalid messages). It is not sufficient to assume the intrinsic faults will be shaken out by testing. It is as important to protect yourself from yourself as from the rest of the world.
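As a small illustration of the isolation idea (hypothetical service call, standard-library Java only), a flaky dependency can be confined behind a timeout and a fallback so its failure never spreads to the caller:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

// Isolation sketch: the risky call is bounded by a timeout, and any fault,
// intrinsic (a bug throwing an exception) or extrinsic (a slow network),
// degrades to a fallback instead of propagating upward.
public class IsolatedCall {
    public static void main(String[] args) {
        String balance = CompletableFuture
            .supplyAsync(IsolatedCall::queryBalanceService)   // the risky call
            .orTimeout(500, TimeUnit.MILLISECONDS)            // bound the failure
            .exceptionally(fault -> "balance temporarily unavailable")
            .join();
        System.out.println(balance);
    }

    static String queryBalanceService() {
        try { Thread.sleep(2_000); } catch (InterruptedException ignored) {}
        return "OK"; // never returned within the timeout in this demo
    }
}
```

Redundancy is the complementary half: running several such isolated replicas so that the loss of one still leaves capacity in reserve.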


Digital transformation needs to come from the top

The corporate culture of the modern enterprise should be to embrace change and encourage "failing forward," so that the organization can evolve and learn about new technologies and also identify growth opportunities and/or risks to the business, Canaday said. Not all of this innovation needs to come from within, however. Many organizations are looking to hire outside of their industry or partner with innovative third parties to bring fresh perspectives, ideas, and expertise that can help spark internal ideation and creativity, Canaday said. As organizations undergo the latest evolution in technology, business models have shifted to become focused on mobile channels, and more recently there's been an increase in voice-activated interfaces. What's driving this is customer demand. "Companies need to operate where the client transacts, and need to be hyper-focused on the customer experience" to ensure that those customers will continue doing business with them, Canaday said.




Quote for the day:


"Coaching is unlocking a person's potential to maximize their own performance. It is helping them to learn rather than teaching them." -- John Whitmore


Daily Tech Digest - October 04, 2018


We have to describe the world as it is for us to gain useful insights. Sure, we might then use those to convert that reality to how it ought to be, but our ingoing information, plus its processing, has to be morally blind. There is quite a movement out there to insist that all algorithms, all AIs, must be audited. That there can be no black boxes – we must know the internal logic and information structures of everything. This is so we can audit them to ensure that none of the either conscious or unconscious failings of thought and prejudice that humans are prey to are included in them. But, as above, this fails on one ground – that we humans are prey to such things. Thus a description of, or calculation about, a world inhabited by humans must at least acknowledge, if not incorporate, such prejudices. Otherwise the results coming out of the system aren’t going to be about this world, are they?



Understanding Spring Reactive: Introducing Spring WebFlux


With the introduction of Servlet 3.1, Spring MVC could achieve non-blocking behavior. But because the Servlet API contains several interfaces that are still blocking (perhaps to preserve backward compatibility), there was always the chance of accidentally using a blocking API in an application intended to be non-blocking. In such scenarios, that blocking API will certainly bring down the application sooner or later. ... The purpose of this series is to demonstrate the evolution of the Servlet/Spring stack from the blocking to the non-blocking paradigm. I am not going into the details of Spring WebFlux in this tutorial, but I will introduce a sample Spring Boot application using it. One point to notice in the above diagram is that Spring WebFlux is Servlet-container agnostic: it works on Servlet containers and also on Netty through the Reactor Netty project. In my Spring Boot application, I declared the spring-boot-starter-webflux dependency, and at server startup the log reports that the application is ready on Netty.
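For readers who want to see the shape of the code, here is a minimal handler of my own, assuming a Spring Boot project with the spring-boot-starter-webflux dependency mentioned above. Handlers return reactive publishers (Mono for zero-or-one values, Flux for streams) instead of blocking for results, so Netty's event-loop threads are never tied up waiting:

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.time.Duration;

@RestController
public class GreetingController {

    @GetMapping("/greeting")
    public Mono<String> greeting() {
        return Mono.just("Hello from WebFlux"); // 0..1 value, non-blocking
    }

    @GetMapping("/ticks")
    public Flux<Long> ticks() {
        // 0..N values: emits one element per second without holding a thread.
        return Flux.interval(Duration.ofSeconds(1)).take(5);
    }
}
```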


Asking the right questions to define government’s role in cybersecurity

Cyberthreats cross national boundaries, with victims in one jurisdiction and perpetrators in another—often among nations that don’t agree on a common philosophy of governing the internet. And complicating it all, criminal offences vary, legal assistance arrangements are too slow, and operating models for day-to-day policing are optimized for crimes committed by local offenders. ... Each country is addressing the challenge in its own way, just as companies tackle the issue individually. Approaches vary even among leading countries identified by the Global Cybersecurity Index, an initiative of the United Nations International Telecommunications Union. Differences typically reflect political and legal philosophy, federal or national government structures, and how far government powers are devolved to state or local authorities. They also reflect public awareness and how broadly countries define national security—as well as technical capabilities among policy makers.


Iron Ox uses AI and robots to grow 30 times more produce than traditional farms


Iron Ox’s first 1,000-square-foot farm, which is in full production as of this week, taps a robotic arm equipped with a camera and computer vision systems that can analyze plants at sub-millimeter scale and execute tasks like planting and seeding. A 1,000-pound mobile transport system roughly the size of a car, meanwhile, delivers harvested produce — including leafy greens such as romaine, butterhead, and kale and herbs like basil, cilantro, and chives — using sensors and collision avoidance systems “similar to that of a self-driving car.” Cloud-hosted software acts as a sort of brain for the system, ingesting data from embedded sensors and using artificial intelligence (AI) to detect pests, forecast diseases, and “ensure cohesion across all parts.” It might sound like pricey tech, but Alexander and company said they worked to keep costs down by using off-the-shelf parts and implementing a scalable transport system.


From Visibility To Vision: Staying Competitive In An Open Banking Future


One of the reasons the digital experiences of established banks remain so lackluster is a failure by both customers and employees to report instances of slow or faulty systems. Across the board there is a growing apathy and acceptance of poorly performing technology, creating a self-perpetuating cycle of unsatisfied users. The first step in rectifying this problem is to give the power and visibility back to the IT team and business by providing them with system monitoring solutions that can quantify “normal” behavior as a benchmark to identify deviations from normal, so they can truly measure the user’s experience. These solutions would effectively bypass the reliance on the end-user to report issues and instead focus on creating more agile capabilities to proactively identify and rectify areas of degrading performance. Once IT departments are equipped with an intelligent and proactive infrastructure, banks can effectively compete by delivering digital services that offer a superior customer experience.


Everyone, everywhere is responsible for IIoT cyber security


Cyber security threats are coming at us from every direction, not just from our corporate networks. Operational networks were simply not built for connectivity, and carefully thought-out security protocols are being ignored for the benefit of data access to drive productivity gains. Unfortunately, threat vectors now extend even to base-level assets. Attackers can target anything from a connected thermostat to a wireless field device in order to cause danger. This heralds a new type of aggressive, innovative cyber attack for industrial control systems, which are becoming increasingly accessible over the internet, often inadvertently. The actors, too, have changed, and they are becoming more sophisticated every day. Attack techniques, tools and lessons are readily available on the dark web, which means low-level cyber criminals have access to the information they need to attempt more serious attacks.


How updating an outdated industrial control system can work with fog computing

According to fog computing and automation startup Nebbiolo Technologies – which declined to name the client directly, saying only that it’s a “global” company – the failure of one of those Windows IPCs could result in up to six hours of downtime for said client. They wanted that time cut down to minutes. It’s a tricky issue. If those 9,000 machines were all in a data center, you could simply virtualize the whole thing and call it a day, according to Nebbiolo’s vice president of product management, Hugo Vliegen. But it's a heterogeneous environment, with the aging computers running critical control applications for the production lines – their connections to the equipment can't simply be abstracted into the cloud or a data center. Architecturally, however, the system is a bit simpler. Sure, there are a lot of computers, but they’re all managed remotely. The chief problem is visibility and failover, Vliegen said. “If they fail, they’re looking at six hours downtime,” he said on Tuesday in a presentation at the Fog World Congress in San Francisco.


5 mistakes even the best organizations make with product and customer data

“In 2018, digital business transformation will be played out at scale, sparking shifts in organizational structure, operating models, and technology platforms. CEOs will expect their CIOs to lead digital efforts by orchestrating the enabling technologies, closing the digital skills gap, and linking arms with CMOs and other executive peers better positioned to address the transformational issues across business silos.”  The need to address these business silos has been a key driver in the growth of master data management (MDM). MDM integrates multiple disparate systems across organizations by streamlining the process of aggregating and consolidating information about products, customers, suppliers, employees, assets and reference data from multiple sources and formats. It connects that information to derive actionable insights and publishes it to backend systems as well as online and offline channels.


Codefirst: The Future of UI Design


If you look at your laptop, tablet, or mobile phone today, you’ll notice that the latest craze to sweep the industry is flat design. Flat design was a dramatic departure from Apple’s ubiquitous skeuomorphism style to one that celebrated minimalism. This trend boasted a UI that leveraged simplicity, flat surfaces, cleaner edges, and understated graphics. The flat design trend evidences a shift within the industry to make designs scale across many different form factors. Websites, on the other hand, have incorporated polygonal shapes, simple geometric layers, and bold lines that grab the audience’s attention. Tactile designs have also grown in popularity in recent months. This design trend makes objects appear hyper-real. Beyond these current trends, there are many examples of websites without borders, without multiple layers, with purposeful animation, and large images. Going forward, you can undoubtedly expect the bar to be raised within the app and web world to ensure that both UI and UX work seamlessly together to improve user interactions.


Incorporate NIST security and virtualization recommendations


The main goal of following these NIST virtualization recommendations is to ensure the secure execution of the platform's baseline functions. These recommendations primarily target cloud service providers that offer infrastructure as a service and enterprise IT teams planning to implement virtual infrastructures to host line-of-business applications. According to NIST, hypervisor platforms are susceptible to security threats via three primary channels: the enterprise network where the hypervisor host resides, rogue or compromised VMs accessing virtualized resources, and web interfaces for the platform's management services and consoles. NIST breaks down the hypervisor platform into the following five baseline functions: VM process isolation (HY-BF1), device mediation and access control (HY-BF2), direct command execution from guest VMs (HY-BF3), VM lifecycle management (HY-BF4), and hypervisor platform management (HY-BF5).



Quote for the day:


"Great Leaders Focus On Sustainable Success Rather Than Quicker Wins." -- Gordon TredGold


Daily Tech Digest - October 03, 2018

The problem with many of the standard metrics is that they fail to take into account how different groups might have different distributions of risk. In particular, if there are people who are very low risk or very high risk, then it can throw off these measures in a way that doesn't actually change what the fair decision should be. ... The upshot is that if you end up enforcing or trying to enforce one of these measures, if you try to equalize false positive rates, or you try to equalize some other classification parity metric, you can end up hurting both the group you're trying to protect and any other groups for which you might be changing the policy. ... A layman's definition of calibration would be, if an algorithm gives a risk score—maybe it gives a score from one to 10, and one is very low risk and 10 is very high risk—calibration says the scores should mean the same thing for different groups. We basically say in our paper that calibration is necessary for fairness, but it's not good enough. Just because your scores are calibrated doesn't mean you aren't doing something funny that could be harming certain groups.
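In the standard formalization (my notation, which may differ from the paper's), a score S is calibrated across groups G when a given score corresponds to the same outcome probability in every group:

```latex
\Pr\bigl(Y = 1 \mid S = s,\; G = g\bigr) = s
\qquad \text{for every score value } s \text{ and every group } g.
```

The article's point is that satisfying this equation is necessary but not sufficient: a calibrated score can still be constructed in ways that harm particular groups.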


Here’s a solution to the AI talent shortage: Recruit philosophy students
Who would have thought it? If schools and universities are going to help create a generation that is equipped to support the AI revolution, they might be better off teaching philosophy and psychology. Sport might be a good analogy. If you are trying to hire talent, you might be better off hiring staff while they are young, grabbing them from school or university as part of placements perhaps, an approach Melanie Oldham explains in this piece. It is an approach that sports clubs are fully versed in — football teams with their academies and talent scouts, scouring the playing fields on a Saturday morning. It often works out as a more effective approach than getting the cheque book out and buying players after they emerge. But for Rinku Singh and Dinesh Patel the route to stardom in baseball was not conventional. They joined the American baseball world after entering a talent contest in India. It was an unorthodox recruitment process made famous by the movie ‘Million Dollar Arm.’



What Is Deep Learning AI? A Simple Guide With 8 Practical Examples


It encompasses machine learning, where machines can learn by experience and acquire skills without human involvement. Deep learning is a subset of machine learning where artificial neural networks, algorithms inspired by the human brain, learn from large amounts of data. Similarly to how we learn from experience, the deep learning algorithm would perform a task repeatedly, each time tweaking it a little to improve the outcome. We refer to ‘deep learning’ because the neural networks have various (deep) layers that enable learning. Just about any problem that requires “thought” to figure out is a problem deep learning can learn to solve. The amount of data we generate every day is staggering—currently estimated at 2.6 quintillion bytes—and it’s the resource that makes deep learning possible. Since deep-learning algorithms require a ton of data to learn from, this increase in data creation is one reason that deep learning capabilities have grown in recent years.


A CIO forges a data strategy plan for creating actionable data


Information that you don't think is relevant right now can change in value. So wherever we can put a hook to preserve information for the future, we'll do that. Even if we don't take all the content and turn it into actionable data, we may take that data and leave it unstructured. We always like to leave that door open if there's information that the client has but can't think of a business case to use right now. ... It's a way of representing information -- subject, predicate, object. You start with metadata: You pull the information out about the data you're working with. Say I'm working with a journal article, so who is the author? What college did the author go to? That's just raw data. Now you want to relate that to other data. You have this author who attended this university and got this degree. Now you have not just three pieces of data, you have three related pieces of information that give you much more context.
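A toy version of that subject-predicate-object idea, with invented identifiers, shows how three standalone facts become linked, queryable information:

```java
import java.util.List;

public class TripleDemo {
    record Triple(String subject, String predicate, String object) {}

    public static void main(String[] args) {
        List<Triple> graph = List.of(
            new Triple("article-42", "hasAuthor", "Jane Doe"),
            new Triple("Jane Doe", "attended", "State University"),
            new Triple("Jane Doe", "holdsDegree", "PhD"));

        // Follow the links: everything we know about the article's author.
        String author = graph.get(0).object();
        graph.stream()
             .filter(t -> t.subject().equals(author))
             .forEach(t -> System.out.println(
                 author + " " + t.predicate() + " " + t.object()));
    }
}
```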


Facebook Breach: Single Sign-On of Doom

"Due to the proliferation of SSO, user accounts in identity providers are now keys to the kingdom and pose a massive security risk. If such an account is compromised, attackers can gain control of the user's accounts in numerous other web services," according to "O Single Sign-Off, Where Art Thou?," a recently published report into "single sign-on account hijacking and session management on the web" authored by five researchers at the University of Illinois at Chicago. In the case of the Facebook breach, for example, its SSO system could have been used for a range of other sites, including its own Instagram, as well as Tinder, Spotify and others. "Our study on the top 1 million websites according to Alexa found that 6.3 percent of websites support SSO. This highlights the scale of the threat, as attackers can gain access to a massive number of web services," the researchers say. ... "Another very critical yet overlooked problem is that the stolen tokens can be used to obtain access to a user's account on other websites that support Facebook SSO *even if the user doesn't use Facebook SSO* to access them," he adds. "This depends on third-party implementations."


Augmented reality, fog, and vision: Duke professor outlines importance of smart architectures

Some of the trade-offs, she said, are already fairly well-known. For instance, many tasks that aren’t terribly demanding from a compute or network perspective are best accomplished at the edge, but the advantages in terms of latency are outweighed by the cloud’s more potent computing capabilities for more complex tasks. “When the task is small, the response time is dominated by the communication time, and the communication time is much smaller for edge systems,” she said. “Once you talk about larger tasks, however, there are more resources in the cloud, so computing time becomes more of a component in response time and the cloud connection will be faster than the edge.” “We also noted that connections to the cloud are much faster in on-campus conditions than they are in nearby residential areas, and this is well-known – connections from campuses to the cloud are optimized.” It’s an important point for academic researchers, she noted. Testing systems in areas that might not have a university laboratory’s optimized network connections yields results that are much more applicable to the real-world challenges faced by businesses.
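Her observation can be condensed into a simple response-time model (my formalization, not a formula from the talk):

```latex
T_{\text{edge}} \;\approx\; t^{\text{comm}}_{\text{edge}} + \frac{W}{C_{\text{edge}}},
\qquad
T_{\text{cloud}} \;\approx\; t^{\text{comm}}_{\text{cloud}} + \frac{W}{C_{\text{cloud}}}
```

For small task sizes W, the communication term dominates and the edge's shorter round trip wins; as W grows, the cloud's larger capacity C makes its compute term shrink faster, and optimized campus-to-cloud links reduce its communication penalty as well.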


Achieving the right balance of data privacy and IT security


A comprehensive data protection strategy must consider the integration of best practices to both security and privacy. Data integrity, retention, and availability are part of the overall data protection goal for an organization, and as such, they are tied directly to individuals’ rights as data subjects. ... Privacy cannot exist without security, but security can exist without privacy – not an ideal situation for anyone concerned. With the continued advance of technology, organizations and individuals must continue to increase awareness and knowledge of data protection, data threats, and the steps required to ensure security and privacy while still maintaining effective business practices and relatable social media interactions. The way to develop a resilient privacy and data protection program is to combine privacy- and security-related thinking into a common approach that makes it easier for employees in all organizational levels to do the right thing. As we continue to move forward in the data-driven world, we must view ourselves as data subjects and strive to attain an agile balance between security and privacy interests.


New details released on Huawei's intent-based network


The new S7530-HI and S6720-HI are fully programmable Ethernet switches based on Huawei's silicon Ethernet Network Processor. The custom application-specific integrated circuit delivers advanced features and is complemented with merchant silicon for standard functions. One of the unique attributes of this intent-based network line is it includes an integrated wireless controller for unified wired and wireless network management. The S7530-HI is equipped with all Gigabit Ethernet ports, and the S6720-HI has 100 Gigabit Ethernet uplinks. That makes the S6720-HI the first programmable, fixed form-factor switch with uplinks of that speed. These switches target the campus network and are designed to work with Huawei's wireless access points, which are ready for the internet of things, because they support a range of wireless protocols, including Bluetooth, Zigbee and radio frequency ID.


How Bank of England is using Splunk for proactive security


The bank is using Splunk to move away from a reactive SOC that only responds to known threats, and is now working towards being more proactive – or, as Pagett calls it, SOC 2.0. “The proactive model is around getting in lots of data and then what we call behavioural profiling or adversary modelling,” he says. “We try to model what our attackers might do from a behavioural point of view, and then we look for those behaviours.” Pagett says hackers can change the technology and techniques they use, but it is difficult for them to change their behaviour, making this the easiest way to spot when an attack is about to happen or is under way. The bank uses Splunk to mine the datasets needed to begin predicting these shifts in behaviour. This could range from a large number of failed password attempts to something more sophisticated, such as a spear-phishing attack with booby-trapped Microsoft Word attachments.


IT pros see security win with Microsoft Managed Desktop

Microsoft administrators said they see a clear value to this managed service -- which could potentially remove some tedious aspects of desktop management -- in an age when most users prefer physical devices. "We have folks spread across the country, so we have to wait for a shipment of laptops, and then image them and get them set up for the users," said David Bussey, systems engineer at the nonprofit Public Company Accounting Oversight Board in Washington, D.C. "What [Microsoft Managed Desktop] has to offer fits some of those pain points we're going through." Microsoft Managed Desktop allows businesses to choose two- or three-year hardware refresh cycles from a list of available devices. Right now, that list is limited to Microsoft's own Surface hardware -- specifically the Surface Laptop, Surface Pro and Surface Book 2. It plans to expand device offerings with third-party partnerships, the company said.



Quote for the day:


"Scientific knowledge is an enabling power to do either good or bad - but it does not carry instructions on how to use it." -- Richard Feynman