Daily Tech Digest - June 11, 2019

Intel to Acquire Barefoot Networks, Accelerating Delivery of Ethernet-Based Fabrics

An essential part of the equation is providing data center interconnects that can keep pace with our customers’ extraordinary and growing requirements. This is why interconnect is one of our six technology pillars in which we are investing to serve our customers. With this in mind, Intel has signed an agreement to acquire Barefoot Networks, an emerging leader in Ethernet switch silicon and software for use in the data center, specializing in the programmability and flexibility necessary to meet the performance and ever-changing needs of the hyperscale cloud. Upon close, the addition of Barefoot Networks will support our focus on end-to-end cloud networking and infrastructure leadership, and will allow Intel to continue to deliver on new workloads, experiences and capabilities for our data center customers. Led by Dr. Craig Barratt and based in Santa Clara, California, the Barefoot Networks team is a great complement to our existing connectivity offerings. Barefoot Networks will add deep expertise in cloud network architectures, P4-programmable high-speed data paths, switch silicon development, P4 compilers, driver software, network telemetry and computational networking.


Most code-signing processes insecure, study shows


“The reality is that every organisation is now in the software development business, from banks to retailers to manufacturers,” said Bocek, with the survey indicating that 69% of those polled expect their use of code signing to grow in the next year. “If you’re building code, deploying containers, or running in the cloud, you need to get serious about the security of your code signing processes to protect your business,” he said. The Venafi study found that although security professionals understand the risks of code signing, they are not taking proper steps to protect their organisation from attacks. Specifically, 35% do not have a clear owner for the private keys used in the code-signing processes at their organisations. Code-signing processes are used to secure and assure the authenticity of software updates for a wide range of software products, including firmware, operating systems, mobile applications and application container images. However, more than 25 million malicious binaries are signed with code-signing certificates, and cyber criminals are misusing these certificates in their attacks.
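
To make the mechanics concrete, here is a minimal, self-contained Java sketch of what code signing does: a publisher signs an artifact with a private key, and a consumer verifies it with the matching public key. It uses only the standard java.security API; a real process would keep the private key in an HSM under a clearly assigned owner and distribute the public key in a certificate, rather than generating a key pair inline as this illustration does.

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class CodeSigningSketch {
    public static void main(String[] args) throws Exception {
        byte[] artifact = "release-binary-contents".getBytes(StandardCharsets.UTF_8);

        // Illustration only: in practice the private key lives in an HSM or
        // managed key store with a named owner, never generated inline.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair keys = gen.generateKeyPair();

        // Publisher signs the artifact with the private key.
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(keys.getPrivate());
        signer.update(artifact);
        byte[] signature = signer.sign();

        // Consumer verifies with the public key (normally delivered in a certificate).
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(keys.getPublic());
        verifier.update(artifact);
        System.out.println("Signature valid: " + verifier.verify(signature));
    }
}
```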


Machine Learning has Significant Potential for the Manufacturing Sector


AI doesn’t just affect production and products. It can also allow manufacturers to expand the relationships they have with their customers beyond the point of sale. One company that has successfully incorporated machine learning into its catalog is Cummins Power Generation, an Indiana-based manufacturer of power-generating equipment, including generators and prime and stand-by systems. The company teamed up with Microsoft and Avtex several years ago to develop a remote monitoring system that collects data from Cummins products around the world. This system, known as the Power Command Cloud, “connects to millions of Cummins generators around the world, providing greater visibility into how equipment is performing and enabling refueling and performance maintenance at the exact time to maximize uptime,” Microsoft reported in 2016. This machine learning solution helps Cummins’ customers by monitoring multiple components, alerting users to trouble, and working to minimize the length and frequency of outages.
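
The internals of the Power Command Cloud are not public, so the following Java sketch only illustrates the general shape of such remote-monitoring alert logic: telemetry readings checked against limits, with an alert raised early enough to schedule refueling or maintenance before an outage. All class names, fields and thresholds here are invented, and an ML-driven system would learn its limits from fleet data rather than hard-code them.

```java
public class GeneratorMonitor {
    // Invented thresholds; a learned model would derive these from fleet history.
    private static final double MAX_COOLANT_TEMP_C = 104.0;
    private static final double MIN_FUEL_FRACTION = 0.15;

    public static void check(String unitId, double coolantTempC, double fuelFraction) {
        if (coolantTempC > MAX_COOLANT_TEMP_C) {
            alert(unitId, "Coolant temperature high: " + coolantTempC + " C");
        }
        if (fuelFraction < MIN_FUEL_FRACTION) {
            alert(unitId, "Fuel low: schedule refueling before an outage");
        }
    }

    private static void alert(String unitId, String message) {
        // A production system would raise a ticket or push a notification here.
        System.out.println("[ALERT] " + unitId + ": " + message);
    }

    public static void main(String[] args) {
        check("GEN-0042", 107.5, 0.40); // triggers the temperature alert
        check("GEN-0042", 92.0, 0.08);  // triggers the refueling alert
    }
}
```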


Correctness vs Change: Which Matters More?

The biggest obstacle to change is not the work it requires, but the fear of it. Many modern practices help reduce that fear. Refactoring, with automated tests, helps us build a clear model of the code so that we can change it with confidence. Microservices help us limit the impact of changing a particular piece. Lehman even anticipated microservices: “Each module could be implemented as a program running on its own microprocessor and the system implemented as a distributed system.” He knew the world wasn’t ready for them, though: “Many problems in connection with the design and construction of such systems still need to be solved.” Some of these problems are now solved. We have automation for maintaining software components in quantity; we have APIs for infrastructure, builds and delivery. The crucial question in software architecture is “how will we change it?” We design both the future state of the system and a path from here to there. To keep growing, we need to keep getting better at changing it, too.
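
As a small illustration of the “change it with confidence” point, here is a hedged Java sketch of a characterization test: it pins the current behavior of a hypothetical PriceCalculator (all names and logic invented) so its internals can be refactored freely. Run with java -ea so the assertions are enabled.

```java
public class PriceCalculator {
    // Hypothetical legacy logic whose observable behavior we want to preserve.
    static double total(int units, double unitPrice) {
        double gross = units * unitPrice;
        return units >= 100 ? gross * 0.9 : gross; // 10% bulk discount
    }

    // Characterization tests: pin current behavior before refactoring.
    // Run with `java -ea PriceCalculator` so assertions are enabled.
    public static void main(String[] args) {
        assert total(100, 10.0) == 900.0 : "bulk discount regression";
        assert total(10, 10.0) == 100.0 : "base price regression";
        System.out.println("Behavior pinned; internals can change with confidence.");
    }
}
```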


The Economics Of Artificial Intelligence - How Cheaper Predictions Will Change The World

"As economists studying innovation and technological change, a conventional frame for trying to understand and forecast the impact of new technology would be to think about what the technology really reduces the cost of," he tells me. "And really its an advance in statistical methods – a very big advance – and really not about intelligence at all, in a way a lot of people would understand the term ‘intelligence.' It's about one aspect of intelligence, which is prediction. “When I look up at the sky and see there are grey clouds, I take that information and predict that it’s going to rain. When I’m going to catch a ball, I predict the physics of where it’s going to end up. I have to do a lot of other things to catch the ball, but one of the things I do is make that prediction.” In business, we have to make these predictions many, many times each day. Will we make a higher profit by selling large volumes cheaply, or small volumes at a high price? Who is the best team member to take on a job? Where will we get the best "bang for our buck" out of our marketing budget?


5G and IoT – how to deal with data expansion as you scale

The speed at which organisations can integrate and analyse data will be vital because context is so important. For example, knowing a vehicle is stationary may not mean very much – unless you know it was travelling at 50 mph two seconds earlier. A certain amount of data can be processed at the edge in real time, but this is not suitable for every use case. For example, getting contextual analysis in a matter of seconds will also be vital if organisations are to benefit from IoT, yet this has to be processed centrally in order to provide the right results to the business as a whole. Similarly, in the consumer setting, knowing that a customer has walked into a store is one thing, but to make the information useful, the retailer also needs to know all their online purchases, website clickstreams, service centre calls and so on. Building this Single Customer View is not something that can be achieved at the edge, however much it helps to have data close to the customer. This is why the distribution of data is so important – data might be created in multiple places, but it needs to be managed and used in the right places, where it can provide the most value back to the business.
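
Here is a minimal Java sketch of the edge-versus-central split described above, using the article's stationary-vehicle example: the edge node keeps just enough state for an immediate local decision, while every event is still buffered for central, contextual analysis. The class and event format are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;

public class EdgeNode {
    private double lastSpeedMph = Double.NaN;
    private final List<String> uplinkBuffer = new ArrayList<>();

    public void onSpeedReading(String vehicleId, double speedMph) {
        // Edge decision: a vehicle doing 50 mph two seconds ago that is now
        // stationary is a likely incident, and cannot wait for a round trip.
        if (lastSpeedMph >= 50.0 && speedMph == 0.0) {
            System.out.println("[EDGE] Possible incident for " + vehicleId);
        }
        lastSpeedMph = speedMph;

        // Everything still goes upstream, where it can be joined with other
        // sources (purchases, clickstreams, service calls) for context.
        uplinkBuffer.add(vehicleId + "," + speedMph);
    }

    public List<String> drainForCentral() {
        List<String> batch = new ArrayList<>(uplinkBuffer);
        uplinkBuffer.clear();
        return batch;
    }

    public static void main(String[] args) {
        EdgeNode node = new EdgeNode();
        node.onSpeedReading("V-17", 50.0);
        node.onSpeedReading("V-17", 0.0); // triggers the edge alert
        System.out.println("Forwarded " + node.drainForCentral().size()
                + " events for central analysis");
    }
}
```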


US Border License Plate and Traveler Photos Exposed

The breach comes as CBP has been increasingly using new tools and gathering more biometric information when people enter the United States. President Donald Trump has made border security and immigration key themes of his tenure, and these themes look to figure prominently in the 2020 presidential election. CBP says it has notified Congress and is working with law enforcement and cybersecurity experts. The agency's own Office of Professional Responsibility, which investigates corruption and mismanagement, has also begun an inquiry. The breach alert has already drawn scrutiny from lawmakers. Democratic Rep. Bennie G. Thompson of Mississippi says he plans to convene hearings next month covering the Department of Homeland Security's use of biometric data. "Government use of biometric and personal identifiable information can be valuable tools only if utilized properly," Thompson says. "We must ensure we are not expanding the use of biometrics at the expense of the privacy of the American public."


Cisco software to make networks smarter, safer, more manageable

intelligentnetwork
Together the new software and DNA Center will help customers set consistent policies across their domains and collaborate with others for the benefit of the entire network. Customers can define a policy once, apply it everywhere, and monitor it systematically to ensure it is realizing its business intent, said Prashanth Shenoy, Cisco vice president of marketing for Enterprise Network and Mobility. It will help customers segment their networks to reduce congestion, improve security and compliance and contain network problems, he said. “In the campus, Cisco’s SD-Access solution uses this technology to group users and devices within the segments it creates according to their access privileges. Similarly, Cisco ACI creates groups of similar applications in the data center,” Shenoy said. “When integrated, SD-Access and ACI exchange their groupings and provide each other an awareness into their access policies. With this knowledge, each of the domains can map user groups with applications, jointly enforce policies, and block unauthorized access to applications.” In practical terms, this means Cisco's central domain network controllers can now be unified so that they work together, letting customers drive policies across domains.
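
Stripped of the Cisco-specific machinery, the group-exchange idea reduces to a shared mapping between user groups and application groups that both domains can enforce. The Java sketch below is an abstract illustration only: the group names and policy table are invented, and DNA Center's actual APIs are not shown.

```java
import java.util.Map;
import java.util.Set;

public class CrossDomainPolicy {
    // Invented policy table: which user groups (campus domain) may reach
    // which application groups (data-center domain).
    static final Map<String, Set<String>> allowedApps = Map.of(
        "engineering", Set.of("ci-server", "source-control"),
        "finance", Set.of("ledger")
    );

    // Both domains can evaluate the same rule and block unauthorized access.
    static boolean isAllowed(String userGroup, String appGroup) {
        return allowedApps.getOrDefault(userGroup, Set.of()).contains(appGroup);
    }

    public static void main(String[] args) {
        System.out.println(isAllowed("engineering", "source-control")); // true
        System.out.println(isAllowed("finance", "ci-server"));          // false
    }
}
```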


Governing the onslaught of connected devices – what’s at stake for enterprises?

Smart speakers and digital assistants from the leading tech giants (Amazon Echo/Alexa, Google Home, Siri, and Cortana), often referred to as “killer apps,” are becoming increasingly ubiquitous. Equally, the “Ring” and “Nest” series of products, owned by Amazon and Alphabet respectively, are another set of goods experiencing rapid adoption. As consumers accept more internet-enabled devices into their homes, they welcome not only the novel benefits but also new concerns regarding the data collected. An area that was once unregulated has seen significant enforcement activity since the start of 2019: several U.S. states, following GDPR’s passage last May, have proposed their own data protection laws that provide certain GDPR-like consumer rights. ... This is a multi-faceted problem, and enterprises engaged in IoT use cases need a proper data framework in place. Such an infrastructure requires board-level sponsorship along with grassroots engagement across the entire corporate and IT ecosystems, with individuals taking responsibility and accountability for the way data is used.


Waste-Free Coding: Zero-Cost Abstraction in Java

You will learn where the main areas of waste exist in a Java application and the patterns that can be employed to reduce them. The concept of zero-cost abstraction is introduced, along with the idea that many optimizations can be automated at compile time through code generation; a Maven plugin simplifies the developer workflow. Our goal is not high performance; that comes as a by-product of maximizing efficiency. The solution employs Fluxtion, which uses a fraction of the resources of existing Java event-processing frameworks. Climate change and its causes are currently of great concern to many. Computing is a major source of emissions, producing the same carbon footprint as the entire airline industry. In the absence of regulation dictating computing energy consumption we, as engineers, have to take responsibility for producing efficient systems, balanced against the cost to create them. In a panel session at InfoQ 2019 in London, Martin Thompson spoke passionately about building energy-efficient computing systems. He noted that controlling waste is the critical factor in minimizing energy consumption.
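
As one concrete, if simplified, example of the kind of waste the article targets: generic Java abstractions that box primitives allocate garbage on every call, while primitive-specialized equivalents do not. This sketch is illustrative only; Fluxtion's actual zero-cost approach, generating the event-processing graph at build time, goes much further than this.

```java
import java.util.function.IntUnaryOperator;
import java.util.function.UnaryOperator;

public class BoxingWaste {
    public static void main(String[] args) {
        // Generic abstraction: every call boxes the int into an Integer object.
        UnaryOperator<Integer> boxed = x -> x + 1;
        // Primitive specialization: same abstraction, zero allocation per call.
        IntUnaryOperator unboxed = x -> x + 1;

        System.out.println(boxed.apply(41)); // works, but allocates an Integer

        int acc = 0;
        for (int i = 0; i < 1_000_000; i++) {
            acc = unboxed.applyAsInt(acc);   // stays on primitives, no garbage
        }
        System.out.println(acc);             // 1000000
    }
}
```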



Quote for the day:


"Ninety percent of leadership is the ability to communicate something people want." -- Dianne Feinstein

