Daily Tech Digest - October 04, 2018


We have to describe the world as it is for us to gain useful insights. Sure, we might then use those insights to convert that reality into how it ought to be, but our incoming information, plus its processing, has to be morally blind. There is quite a movement out there insisting that all algorithms, all AIs, must be audited. That there can be no black boxes – we must know the internal logic and information structures of everything. This is so we can audit them to ensure that none of the conscious or unconscious failings of thought and prejudice that humans are prey to are included in them. But this fails on one ground: we humans are prey to such things. Thus a description of, or calculation about, a world inhabited by humans must at least acknowledge, if not incorporate, such prejudices. Otherwise the results coming out of the system aren’t going to be about this world, are they?



Understanding Spring Reactive: Introducing Spring WebFlux


With the introduction of Servlet 3.1, Spring MVC could achieve non-blocking behavior. But because the Servlet API still contains several blocking interfaces (likely retained for backward compatibility), there was always the chance of accidentally using a blocking API in an application intended to be non-blocking. In such scenarios, the use of a blocking API will sooner or later bring down the application. ... The purpose of this series is to demonstrate the evolution of the Servlet/Spring stack from the blocking to the non-blocking paradigm. I am not going into the details of Spring WebFlux in this tutorial, but I will introduce a sample Spring Boot application using Spring WebFlux. One point worth noticing is that Spring WebFlux is Servlet-container agnostic: it works on Servlet containers and also on Netty through the Reactor Netty project. In my Spring Boot application, I have a dependency on spring-boot-starter-webflux, and at server startup it reports that the application is ready with Netty.
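The blocking-vs-non-blocking distinction the excerpt draws is not Spring-specific. As a minimal sketch (in Python's asyncio rather than Java; WebFlux itself uses Reactor's Mono/Flux types on Netty or a Servlet 3.1+ container), the point is that awaiting I/O frees the event loop, while one accidental blocking call stalls everything:

```python
import asyncio
import time

async def fetch(name: str, delay: float) -> str:
    # Simulates a non-blocking I/O call (e.g. a reactive HTTP request):
    # awaiting yields the event loop so other work can proceed, much as
    # Netty's event-loop threads do under Spring WebFlux.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list[str]:
    start = time.monotonic()
    # Three 0.1 s "requests" run concurrently on a single thread...
    results = await asyncio.gather(
        fetch("a", 0.1), fetch("b", 0.1), fetch("c", 0.1)
    )
    # ...so total wall time stays near 0.1 s, not 0.3 s. A time.sleep(0.1)
    # inside fetch() would block the loop and serialize the calls, which is
    # the accidental-blocking hazard the article warns about.
    assert time.monotonic() - start < 0.3
    return results

print(asyncio.run(main()))
```

Swapping `await asyncio.sleep` for `time.sleep` makes the assertion fail, which is the whole argument for keeping blocking APIs out of a reactive call path.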


Asking the right questions to define government’s role in cybersecurity

Cyberthreats cross national boundaries, with victims in one jurisdiction and perpetrators in another—often among nations that don’t agree on a common philosophy of governing the internet. And complicating it all, definitions of criminal offences vary, legal assistance arrangements are too slow, and operating models for day-to-day policing are optimized for crimes committed by local offenders. ... Each country is addressing the challenge in its own way, just as companies tackle the issue individually. Approaches vary even among leading countries identified by the Global Cybersecurity Index, an initiative of the United Nations’ International Telecommunication Union. Differences typically reflect political and legal philosophy, federal or national government structures, and how far government powers are devolved to state or local authorities. They also reflect public awareness and how broadly countries define national security—as well as technical capabilities among policy makers.


Iron Ox uses AI and robots to grow 30 times more produce than traditional farms


Iron Ox’s first 1,000-square-foot farm, which is in full production as of this week, taps a robotic arm equipped with a camera and computer vision systems that can analyze plants at sub-millimeter scale and execute tasks like planting and seeding. A 1,000-pound mobile transport system roughly the size of a car, meanwhile, delivers harvested produce — including leafy greens such as romaine, butterhead, and kale and herbs like basil, cilantro, and chives — using sensors and collision avoidance systems “similar to that of a self-driving car.” Cloud-hosted software acts as a sort of brain for the system, ingesting data from embedded sensors and using artificial intelligence (AI) to detect pests, forecast diseases, and “ensure cohesion across all parts.” It might sound like pricey tech, but Alexander and company said they worked to keep costs down by using off-the-shelf parts and implementing a scalable transport system.


From Visibility To Vision: Staying Competitive In An Open Banking Future


One of the reasons the digital experiences of established banks remain so lackluster is a failure by both customers and employees to report instances of slow or faulty systems. Across the board there is a growing apathy and acceptance of poorly performing technology, creating a self-perpetuating cycle of unsatisfied users. The first step in rectifying this problem is to give power and visibility back to the IT team and the business by providing them with system-monitoring solutions that can quantify “normal” behavior as a baseline and identify deviations from it, so they can truly measure the user’s experience. These solutions would effectively bypass the reliance on the end user to report issues and instead focus on creating more agile capabilities to proactively identify and rectify areas of degrading performance. Once IT departments are equipped with an intelligent and proactive infrastructure, banks can effectively compete by delivering digital services that offer a superior customer experience.
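The "quantify normal, then flag deviations" idea can be sketched very simply: learn a baseline from historical measurements and flag anything far outside it. This is an illustrative toy (the sample values and 3-sigma threshold are invented), not any particular monitoring product's logic:

```python
from statistics import mean, stdev

def deviations(samples, baseline, threshold=3.0):
    """Flag measurements that deviate from learned 'normal' behavior."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [s for s in samples if abs(s - mu) > threshold * sigma]

# Hypothetical response times in milliseconds during normal operation.
baseline = [120, 130, 125, 118, 122, 128, 124, 126]
# Live traffic containing one badly degraded call.
live = [123, 127, 480, 125]
print(deviations(live, baseline))  # → [480]
```

The degraded call is caught without anyone filing a complaint, which is precisely the shift from user-reported to proactively detected issues the article describes.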


Everyone, everywhere is responsible for IIoT cyber security


Cyber security threats are coming at us from every direction, not just from our corporate networks. Operational networks were simply not built for connectivity, and carefully thought-out security protocols are being ignored for the benefit of data access to drive productivity gains. Unfortunately, threat vectors now extend even to base-level assets: attackers can target anything from a connected thermostat to a wireless field device to cause harm. This heralds a new type of aggressive, innovative cyber attack on industrial control systems, which are becoming increasingly accessible over the internet, often inadvertently. The actors, too, have changed, and they are becoming more sophisticated every day. Attack techniques, tools and lessons are readily available on the dark web, which means low-level cyber criminals have access to the information they need to attempt more serious attacks.


How updating an outdated industrial control system can work with fog computing

According to fog computing and automation startup Nebbiolo Technologies – which declined to name the client directly, saying only that it’s a “global” company – the failure of one of those Windows IPCs could result in up to six hours of downtime for said client. They wanted that time cut down to minutes. It’s a tricky issue. If those 9,000 machines were all in a data center, you could simply virtualize the whole thing and call it a day, according to Nebbiolo’s vice president of product management, Hugo Vliegen. But it's a heterogeneous environment, with the aging computers running critical control applications for the production lines – their connections to the equipment can't simply be abstracted into the cloud or a data center. Architecturally, however, the system is a bit simpler. Sure, there are a lot of computers, but they’re all managed remotely. The chief problem is visibility and failover, Vliegen said. “If they fail, they’re looking at six hours downtime,” he said on Tuesday in a presentation at the Fog World Congress in San Francisco.
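The "visibility and failover" problem Vliegen describes boils down to knowing quickly which of thousands of IPCs has gone silent. A common building block for that is a heartbeat watchdog; the sketch below is a generic illustration (node names, timeout, and timestamps are invented, and this is not Nebbiolo's actual mechanism):

```python
class Watchdog:
    """Tracks heartbeats from industrial PCs; nodes whose last heartbeat
    is older than `timeout` are flagged so a standby can take over in
    minutes rather than hours."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_seen: dict[str, float] = {}

    def heartbeat(self, node: str, now: float) -> None:
        # Each IPC periodically reports in; record the timestamp.
        self.last_seen[node] = now

    def failed(self, now: float) -> list[str]:
        # Any node silent for longer than the timeout is presumed down.
        return [n for n, t in self.last_seen.items()
                if now - t > self.timeout]

wd = Watchdog(timeout=5.0)
wd.heartbeat("ipc-01", now=0.0)
wd.heartbeat("ipc-02", now=0.0)
wd.heartbeat("ipc-01", now=4.0)   # ipc-02 has gone silent
print(wd.failed(now=7.0))         # → ['ipc-02']
```

Detection is only half the story; in production the flagged node's control workload would then be restarted on or migrated to a healthy host, which is where the fog/virtualization layer comes in.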


5 mistakes even the best organizations make with product and customer data

“In 2018, digital business transformation will be played out at scale, sparking shifts in organizational structure, operating models, and technology platforms. CEOs will expect their CIOs to lead digital efforts by orchestrating the enabling technologies, closing the digital skills gap, and linking arms with CMOs and other executive peers better positioned to address the transformational issues across business silos.”  The need to address these business silos has been a key driver in the growth of master data management (MDM). MDM integrates multiple disparate systems across organizations by streamlining the process of aggregating and consolidating information about products, customers, suppliers, employees, assets and reference data from multiple sources and formats. It connects that information to derive actionable insights and publishes it to backend systems as well as online and offline channels.
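The consolidation step at the heart of MDM (merging partial records about the same entity from multiple systems into one "golden record") can be sketched in a few lines. The source systems, field names, and first-non-null survivorship rule below are all invented for illustration; real MDM platforms apply far richer matching and survivorship logic:

```python
# Partial customer records from two hypothetical source systems,
# keyed by a shared customer id.
crm     = {"c1": {"name": "Ada Lovelace", "email": None}}
billing = {"c1": {"email": "ada@example.com", "plan": "pro"}}

def consolidate(*sources):
    """Merge records per key into a single golden record."""
    golden = {}
    for source in sources:
        for cid, record in source.items():
            merged = golden.setdefault(cid, {})
            for field, value in record.items():
                # Survivorship rule: first non-null value wins.
                if value is not None and merged.get(field) is None:
                    merged[field] = value
    return golden

print(consolidate(crm, billing))
```

The merged record combines the CRM's name with the billing system's email and plan, which is the "aggregating and consolidating information from multiple sources" the excerpt describes; the golden record is then what gets published to downstream systems and channels.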


Codefirst: The Future of UI Design


If you look at your laptop, tablet, or mobile phone today, you’ll notice that the latest craze to sweep the industry is flat design. Flat design was a dramatic departure from Apple’s ubiquitous skeuomorphic style to one that celebrated minimalism. This trend boasted a UI built on simplicity, flat surfaces, cleaner edges, and understated graphics. The flat design trend reflects a shift within the industry toward designs that scale across many different form factors. Websites, on the other hand, have incorporated polygonal shapes, simple geometric layers, and bold lines that grab the audience’s attention. Tactile designs have also grown in popularity in recent months; this design trend makes objects appear hyper-real. Beyond these current trends, there are many examples of websites without borders, without multiple layers, with purposeful animation, and with large images. Going forward, you can undoubtedly expect the bar to be raised within the app and web world to ensure that UI and UX work seamlessly together to improve user interactions.


Incorporate NIST security and virtualization recommendations


The main goal of following these NIST virtualization recommendations is to ensure the secure execution of the platform's baseline functions. These recommendations primarily target cloud service providers that offer infrastructure as a service and enterprise IT teams planning to implement virtual infrastructures to host line-of-business applications. According to NIST, hypervisor platforms are susceptible to security threats via three primary channels: the enterprise network where the hypervisor host resides, rogue or compromised VMs accessing virtualized resources, and web interfaces for the platform's management services and consoles. NIST breaks down the hypervisor platform into the following five baseline functions: VM process isolation (HY-BF1), device mediation and access control (HY-BF2), direct command execution from guest VMs (HY-BF3), VM lifecycle management (HY-BF4), and hypervisor platform management (HY-BF5).
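The five baseline functions are effectively a small lookup table, which can be handy when mapping audit findings back to NIST's taxonomy. The IDs and descriptions below come from the excerpt; the helper function is just an illustrative convenience:

```python
# NIST's five hypervisor baseline functions, as quoted in the article.
BASELINE_FUNCTIONS = {
    "HY-BF1": "VM process isolation",
    "HY-BF2": "device mediation and access control",
    "HY-BF3": "direct command execution from guest VMs",
    "HY-BF4": "VM lifecycle management",
    "HY-BF5": "hypervisor platform management",
}

def describe(bf_id: str) -> str:
    """Expand a baseline-function ID into a readable label."""
    return f"{bf_id}: {BASELINE_FUNCTIONS[bf_id]}"

print(describe("HY-BF1"))  # → HY-BF1: VM process isolation
```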



Quote for the day:


"Great Leaders Focus On Sustainable Success Rather Than Quicker Wins." -- Gordon Tredgold

