Daily Tech Digest - June 13, 2019

The fight to keep open source truly “open” — open source providers need to stand up
The benefits of keeping open source open far outweigh its cons. Allowing developers from all backgrounds and training practices to review and modify code means that it is constantly being improved, in turn allowing the entire industry to benefit from innovation, better security and healthy competition. It also gives developers greater mobility. Not only does open source software mean developers are free to train and practice in any coding language they please, but these non-proprietary languages also become increasingly popular and in demand, granting developers flexibility in their work and careers. Open source platforms — specifically those that remain true to their roots — are so valuable that they can attract hordes of venture capital funding, even if there are no immediate prospects for monetary returns. Jocelyn Goldfein, a partner at venture capital fund Zetta, pointed out: “There’s probably at least two dozen venture firms that invest a lot in open source now.” Nowadays, the default question isn’t why a platform would be open source — rather, why wouldn’t it?


How edge computing makes 5G essential

With 5G, it is really about decreasing latency and increasing bandwidth, and it's being driven more by enterprise applications than by consumer ones, although you see the consumer influences with people sitting on subways watching movies, playing video games or even teleconferencing while they're going to or from work. So those edge applications are where the data latency matters: you can think about eMedicine or any of the mission-critical things that are important with smart cities. You certainly don't want the power to go out while somebody is crossing the street and your smart sensors go down. That need for the infrastructure out at the edge to be resilient and robust is a critical aspect of what's going to happen with 5G. As we start into that transformation, Vertiv, as a company, is really focused on how we can support the critical infrastructure at the edge to ensure that capabilities are always on through battery backup, and that you don't have thermal runaway in hot locations like Phoenix, or in very cold locations, whether that's Alaska or wherever it might happen to be.


How Far Are We From Achieving Artificial General Intelligence?

Artificial General Intelligence (AGI) can be defined as the ability of a machine to perform any task that a human can. Although the aforementioned applications highlight the ability of AI to perform tasks with greater efficacy than humans, they are not generally intelligent, i.e., they are exceedingly good at only a single function while having zero capability to do anything else. Thus, while an AI application may be as effective as a hundred trained humans at performing one task, it can lose to a five-year-old kid at any other. For instance, computer vision systems, although adept at making sense of visual information, cannot translate and apply that ability to other tasks. By contrast, a human, although sometimes less proficient at performing these functions, can perform a broader range of functions than any existing AI application. While an AI has to be trained for any function it needs to perform, using massive volumes of training data, humans can learn from significantly fewer learning experiences. Additionally, humans — and agents with artificial general intelligence — can generalize better, applying the learnings from one experience to other similar experiences.



Tomorrow's Cybersecurity Analyst Is Not Who You Think

First, cybercriminals are becoming much better at penetrating organizations using nontechnical means. With social engineering and phishing techniques, they can bypass organizations' increasingly advanced defenses by manipulating insiders to gain access. Research shows that phishing and social engineering were the most common methods of compromise in 2018, serving as the conduit to the initial point of entry in more than 60% of security breaches in both cloud and point-of-sale environments, as well as in 46% of corporate and internal network breaches. Second, the volume of data in organizations is growing exponentially and is increasingly stored in a more decentralized manner, making it difficult to ensure it's being optimally protected. Research firm IDC predicts the volume of data worldwide will grow tenfold by 2025 to 163 zettabytes, with the majority being created and managed by enterprises. This growth is being driven by the proliferation of artificial intelligence, the Internet of Things, and other machine-to-machine technologies in enterprises across all industries.


Trainline On Track: Innovating And Navigating Change

Talent remains a key challenge for businesses – particularly when the knowledge and expertise required for technical roles is so vast. As Director of Engineering, Midgley is tasked with building and leading a team of 300 tech travel specialists. “The thing that I’m most passionate about is engineering culture. We really seek out the very best talent in each of our locations, and it’s always hard. The flip side is that, once these amazing people are through the door, you have to retain them and make them feel rewarded, satisfied, and highly motivated,” he says. To some extent, retaining talent is about giving people room to grow. This might include sending employees on study days and courses, or encouraging them to take part in enriching or educational activities. At the same time, this brings new learnings, approaches, and technologies into the organisation. In terms of working structures, Trainline has adopted the ‘two-pizza rule’ favoured by Amazon, in which employees form small working groups that could, as the name suggests, be fed by two pizzas. As well as leveraging internal employees as a source of innovation and ideas, the company collaborates with hundreds of international partners.


CAD and PLM: transforming the industrial landscape and shaping how humans work

“When we marry digital with human — which is what AR is all about — we get vastly more productive workers, thanks to the ability of the digital world to monitor, control, and optimise the world of humans,” continued Heppelmann. AR is about virtualising physical infrastructure in the industrial world, with real-time information for safety and productivity. “It should act as a digital mentor.” ... Combining technological innovation with the skills of workers expands human possibilities, and it must be part of business strategy. Humans do not have the capability to sift through the vast amounts of data that factories or, indeed, any businesses create. They need a way to look at the whole environment and know where to run analytics to optimise productivity. ... IoT can gather and analyse data from physical machines, mitigating the problem of unplanned downtime, which drives down equipment availability — in a factory setting, for example. Machine learning can then be used to predict when this problem will next occur, while AR and VR can be used to view the problem up close in a digital environment, on top of the obvious safety and training benefits.


Cisco launches a developer-community cert program

Perhaps one of the biggest additions – rolled out here at the company’s Cisco Live customer event – is the new set of professional certifications for developers utilizing Cisco’s growing DevNet developer community. The Cisco Certified DevNet Associate, Specialist and Professional certifications will cover software development for applications, automation, DevOps, cloud and IoT. They will also target software developers and network engineers who are building the software proficiency to develop applications and automated workflows for operational networks and infrastructure. “This certification evolution is the next step to reflect the critical skills network engineers must have to be at the leading edge of network-enabled business disruption and delivering customer excellence,” said Mike Adams, vice president and general manager of Learning@Cisco. “To perform effectively in this new world, every IT professional needs skills that are broader, deeper and more agile than ever before. And they have to be comfortable working as a multidisciplinary team including infrastructure network engineers, DevOps and automation specialists, and software professionals.”


The Rise of 'Purple Teaming'

Part of what makes Red Teaming and Purple Teaming so valuable is that they provide insight into the specific tactics and approaches that attackers might use. Incorporating deception technology into the testing program can enhance this visibility. The first benefit comes from detecting attackers early by enticing them to engage with decoys or deception lures. The second comes from gathering full indicators of compromise (IOCs) and tactics, techniques, and procedures (TTPs) around lateral movement activity. This significantly enhances visibility into how and when attackers circumvent security controls, enriching the information that typically results from these exercises. Cyber deceptions deploy traps and lures on the network without interfering with daily operations. A basic deployment can easily be completed in under a day, providing the Blue Team an additional detection mechanism that blends in with the operational environment. This creates more opportunities to detect when the Red Team bypasses a defensive control, forcing team members to be more deliberate with their actions and making simulated attack scenarios more realistic.


Agile vs. Top-Down Management: Leadership Must Evolve as an Organization Matures

Agile was first used in software development. Its purpose then was to deliver a more relevant product to customers through smaller iterations on a shorter cycle, which gave developers the opportunity to incorporate user feedback into future releases. As other teams and departments appropriated Agile, its purpose evolved into accelerating growth by reducing the time-to-value of growth initiatives like marketing while also ensuring a project is effective before ramping it up to scale. The problem with Agile at scale is that time-to-value usually isn’t dependent upon a single team’s ability to execute. Even projects that appear small involve multiple teams, and stakeholders within them are juggling competing priorities and relying upon different feedback sources to inform the direction of projects. When an Agile team operates without awareness of the work in the rest of the organization, it can become misaligned from other teams. Real-time communication between teams is necessary to smooth these handoff points; otherwise, it’s like teams are playing hot potato with projects, tossing the ball to someone who may not be ready to catch it.


What is data protection by design and default?

To explain how the approach works, we must first break it into its two component parts. The first is data protection by design, which ensures that organisations address information security and privacy in the planning stage of any system, service, product or process that uses personal data. With cyber attacks on the rise, a growing public interest in data privacy and the strengthened penalties introduced by the GDPR, it makes sense to prioritise information security. If you don’t, you’ll be left trying to tack security controls onto existing set-ups. This could lead to improperly implemented controls that expose vulnerabilities, and expensive restructuring projects. ... Data protection by default ensures that organisations conduct data processing activities only if they are necessary to achieve a specific goal. As such, it links to the GDPR’s principles of data minimisation and purpose limitation. One way to achieve this is to give data subjects the strongest possible privacy settings by default – hence the name. This helps prevent data being collected excessively, and gives the data subject the option to consent to more extensive data practices if they want to use other services.
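The “strongest possible privacy settings by default” idea can be made concrete with a minimal Python sketch. The setting names and values below are invented for illustration and do not come from the article or any particular product:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Data protection by default: every flag starts in its most
    protective state; the data subject must opt in to anything more.
    All field names here are illustrative."""
    share_with_partners: bool = False
    analytics_tracking: bool = False
    marketing_emails: bool = False
    retain_history_days: int = 0  # collect nothing beyond the session

# New accounts get the most private configuration automatically...
default = PrivacySettings()

# ...and broader processing happens only on explicit, granular consent.
opted_in = PrivacySettings(marketing_emails=True)
```

The design choice mirrors data minimisation: opting in changes exactly one flag, so consent for one purpose never silently enables another.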



Quote for the day:


"Wisdom comes from experience. Experience is often a result of lack of wisdom." -- Terry Pratchett


Daily Tech Digest - June 12, 2019

IoT security vs. privacy: Which is a bigger issue?

Predictably, most of the teeth-gnashing has come on the consumer side, but that doesn’t mean enterprise users are immune to the issue. On the one hand, just like consumers, companies are vulnerable to their proprietary information being improperly shared and misused. More immediately, companies may face backlash from their own customers if they are seen as not properly guarding the data they collect via the IoT. Too often, in fact, enterprises shoot themselves in the foot on privacy issues, with practices that range from tone-deaf to exploitative to downright illegal—leading almost two-thirds (63%) of consumers to describe IoT data collection as “creepy,” while more than half (53%) “distrust connected devices to protect their privacy and handle information in a responsible manner.” ... Police in more than 50 cities and towns across the country are apparently offering free or discounted Ring doorbells, and sometimes requiring the recipients to share footage for use in investigations. Many privacy advocates are troubled by this degree of cooperation between police and Ring, but that’s only part of the problem. Last year, for example, Ring workers in Ukraine reportedly watched customer feeds. Amazingly, though, even that only scratches the surface of the privacy flaps surrounding Ring.



Researchers crack digital safe using HSM flaw


The researchers found that the firmware built into the module was signed, but not encrypted. This meant that they could analyze how it worked, and they found that it allowed them to upload and run additional custom code. They used the software development kit (SDK) provided with the HSM to upload a custom firmware module to the unit. This gave them access to a shell inside the HSM that they could use to run a debugger and analyze the inner workings of the unit. From there, they ran a fuzzer, which sends a lot of queries to the HSM’s PKCS #11 API. PKCS #11 is a cryptographic API created by RSA. They hit the API with a large number of parameter combinations, looking for inputs that might throw the HSM into an unstable state. These tests uncovered several buffer overflow bugs that they could trigger by sending the HSM certain commands. The researchers were able to write a module that they could run as unsigned custom firmware on the HSM that enabled them to dump all its secrets. They could recover keys, read secrets directly from the HSM’s memory, and dump the contents of the module’s flash storage, including its decryption key.
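The fuzzing step can be illustrated with a toy Python sketch. The real work targeted the HSM's PKCS #11 API; here a stand-in function with a simulated buffer-overflow-style bug takes its place, and every name, size and threshold below is invented:

```python
import random

def fragile_parse(payload: bytes) -> str:
    """Stand-in for a command handler with a buffer-overflow-style
    bug: any input longer than its fixed 'buffer' crashes it."""
    BUFFER_SIZE = 64
    if len(payload) > BUFFER_SIZE:
        raise MemoryError("buffer overflow")  # simulated unstable state
    return payload.hex()

def fuzz(target, trials: int = 1000, seed: int = 0):
    """Throw randomly sized, random-content payloads at the target
    and collect every input that drives it into an error state."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        payload = bytes(rng.randrange(256) for _ in range(rng.randrange(128)))
        try:
            target(payload)
        except MemoryError:
            crashes.append(payload)
    return crashes

crashes = fuzz(fragile_parse)
```

In the real attack the "crashing inputs" were PKCS #11 calls with malformed parameters; the principle is the same: enumerate inputs, record which ones destabilise the target, then investigate those by hand.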


Combine containers and serverless to optimize app environments


Serverless is a new and misleading label for an old concept: run applications or scripts on demand without provisioning the runtime infrastructure beforehand. SaaS apps, such as Google Docs, might be considered serverless; when users create a document, they don't have to provision the back-end system that runs the application. Serverless takes this concept to application code, which is abstracted from its various infrastructure services, such as storage, databases, machine learning systems and streaming data processing. Google Cloud emphasizes that serverless functions aren't limited to event-driven code execution, but rather include many of its IaaS and PaaS products that instantiate and terminate on demand and don't require prior setup. On cloud serverless platforms, like AWS Lambda and Azure Functions, functions run code in response to an event trigger, such as an event on a message queue or notification service, and are typically used for short-duration jobs that handle tasks such as data acquisition, filtering and transformation, application integration and user input.
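A minimal sketch of the event-driven pattern described above, written as an AWS Lambda-style Python handler. The event shape and the transformation are invented for illustration; a real queue-triggered payload would differ:

```python
import json

def handler(event, context=None):
    """Lambda-style function: triggered by a (hypothetical)
    message-queue event, it filters and transforms the records,
    then returns — a typical short-duration acquisition job."""
    records = event.get("Records", [])
    readings = [json.loads(r["body"]) for r in records]
    # Filtering and transformation: keep valid readings, normalise units.
    cleaned = [
        {"sensor": r["sensor"], "celsius": (r["fahrenheit"] - 32) * 5 / 9}
        for r in readings
        if "fahrenheit" in r
    ]
    return {"statusCode": 200, "body": json.dumps(cleaned)}

# Invoked locally the way the platform's runtime would call it:
event = {"Records": [{"body": json.dumps({"sensor": "s1", "fahrenheit": 212})}]}
result = handler(event)
```

The point of the abstraction is visible in the signature: the function sees only the event, never the servers, queues or scaling machinery that delivered it.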


Ensuring trust in an age of digital banking

First, the bank needs to be sustainable. That includes following a code of conduct: integrating sustainability risk in processes and strengthening policies and enabling transparent reporting, as well as conducting the work that prevents the bank from being used for different types of financial crime. This is our license to operate. Second, we develop financial services with positive climate impact as a response to our customers’ needs. We have a very proud 10-year history of offering green bonds. Last year we launched green mortgages. In January, we launched our first blue bond [for investing in marine conservation projects], and we also offer green car leasing. We are trying to cater to customer demand. We understand that people care about what they do with their money. We have a very ambitious plan to introduce more financial solutions that capture what every single individual cares about. Today there is a good array of different products and services with positive climate impact, but it is still too little to meet the growing demand.


Hybrid Development: The Value at the Intersection of TDD, DDD, and BDD

What is the best way to tackle a large development project? You break it down into smaller, more manageable segments — or in the case of DDD, domains. When you split the project into smaller domains, you can have segregated teams handle the functionality of each domain end-to-end. And to best understand those domains, you enlist the help of domain experts: people who understand the problem and that realm of knowledge better than anyone else. Typically, the domain expert is not the one responsible for developing the solution; rather, DDD collectively is used to help bridge the knowledge gap that usually exists between these experts and the solution being built. Through models, context, and ubiquitous language, all parties involved should have a clear understanding of what the particular problems are and how the ensuing build will be structured. ... As the complexity of your projects grows, the only way to maintain the viability of your build and ensure success is to have your development practices grow with it.


Reaping the benefits of a strong strategy-driven business analytics IQ

Analytics IQ is a measure of an organization’s ability to leverage analytics to support business and IT objectives. Many organizations start their analytics journey eagerly, but without a clear strategy. This approach often leads to failed pilot projects, which have not provided the needed insights to answer business questions. Let us take a step back and first focus on analytics. It is easier to understand analytics when you understand the process that data goes through to become actual, actionable intelligence, rather than unusable numbers and words. I like to think about it in terms of retail. The price of an item is just plain data. However, when we add additional indicators, e.g., the price is attached to a celebrity’s merchandise, and recently, that person was involved in a controversy — then this data becomes information, something of interest to us. The information can then be used to try and predict what will happen to the price of this merchandise in the following days. That is intelligence: When we add context to information, it becomes intelligence.
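The data-to-information-to-intelligence progression in the retail example can be sketched in a few lines of Python. The rule and values are invented for illustration; a real system would use a predictive model rather than a hand-written condition:

```python
# Each stage adds context: a raw price (data), the price plus relevant
# events (information), and a directional forecast (intelligence).
raw_price = 49.99  # data: just a number

information = {
    "price": raw_price,
    "product": "celebrity merchandise",
    "recent_event": "controversy",  # the added indicator
}

def forecast(info: dict) -> str:
    """Toy rule standing in for a model: controversy around the
    endorser suggests downward pressure on the price."""
    if info.get("recent_event") == "controversy":
        return "price likely to fall"
    return "no clear signal"

intelligence = forecast(information)
```

The structure, not the rule, is the point: intelligence is the same data, carried through two enrichment steps that attach context and then act on it.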


Triada backdoors were pre-installed on Android devices


The story of Triada began when Kaspersky Lab researchers discovered it in early 2016, and at that time the main purpose of the Android malware was "to install spam apps on a device that displays ads," according to Google. Last week, Lukasz Siewierski, a reverse engineer on the Android security and privacy team at Google, explained that Triada was much more advanced than previously thought. "The methods Triada used were complex and unusual for these types of apps," Siewierski wrote in a blog post. "Triada apps started as rooting Trojans, but as Google Play Protect strengthened defenses against rooting exploits, Triada apps were forced to adapt, progressing to a system image backdoor." While Google added features to Android to protect against threats like Triada, the threat actors behind the malware took another unusual approach in the summer of 2017 and performed a supply chain attack to get the backdoor malware preinstalled on budget phones.


What Stands Out in Proposed Premera Lawsuit Settlement?

Technology attorney Steven Teppler points to the attention given to "fixing" the health insurer's security problems. The proposed agreement, which was filed on May 31 in a federal court in Oregon, would settle a class action lawsuit that consolidated more than 40 lawsuits filed after the data breach was revealed in March 2015 by the Seattle-based insurer. It awaits court approval. The settlement proposes $32 million for breach victims and related legal costs and would require the health insurer to invest $42 million in bolstering data security. The settlement "not only takes care of victims, but takes care of business internally at the organization to make sure there are resources devoted to fixing or mitigating the security problem, but also that there are ways to establish milestones to make sure what is promised is actually done," Teppler says in an interview with Information Security Media Group. Under the settlement, Premera would spend at least $14 million annually over the next three years on enhanced data security measures.


5 ways to achieve a risk-based security strategy


A risk-based security approach, on the other hand, identifies the true risks to an organization's most valuable assets and prioritizes spending to mitigate those risks to an acceptable level. A security strategy shaped by risk-based decisions enables an organization to develop more practical and realistic security goals and spend its resources in a more effective way. It also delivers compliance, not as an end in itself, but as a natural consequence of a robust and optimized security posture. Although a risk-based security strategy requires careful planning and ongoing monitoring and assessment, it doesn't have to be an overly complex process. There are five key steps to implementing risk-based security, and though time-consuming, they will align security with the goals of the organization. Board-level support is paramount. Input from numerous stakeholders throughout the organization is essential, as risk mitigation decisions can have a serious effect on operations which security teams may not fully appreciate if they make these decisions in isolation.
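The prioritization step can be sketched as a toy risk register in Python: score each asset as likelihood times impact, then rank so that the mitigation budget goes to the largest risks first. The assets and numbers below are invented for illustration:

```python
# Hypothetical risk register: likelihood is an annual probability
# estimate, impact a 1-10 business-damage rating.
assets = [
    {"name": "customer database", "likelihood": 0.4, "impact": 9},
    {"name": "public website",    "likelihood": 0.7, "impact": 3},
    {"name": "build server",      "likelihood": 0.2, "impact": 6},
]

# Score and rank: spend flows to the top of this list first.
for a in assets:
    a["risk"] = a["likelihood"] * a["impact"]

prioritised = sorted(assets, key=lambda a: a["risk"], reverse=True)
```

Note how the ranking differs from a checklist view: the public website is the most likely target, but the customer database tops the list because its impact dominates — which is precisely the point of risk-based prioritization.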


Large firms look to zero-trust security to reduce cyber risk


Essentially, a zero-trust approach is about applying authentication and authorisation to ensure that all traffic within an enterprise is properly authenticated and authorised, whether it is someone coming in from the outside on a VPN connection, an application talking to another application on the network, or a user trying to use an application on the network. “The data from the survey shows many similarities between the various countries in terms of the gaps and threats that large enterprises need to address with respect to secure access,” said Scott Gordon, chief marketing officer at Pulse Secure. “Perhaps the most significant difference in secure access priorities was more focus on improving endpoint security and remediation prior to access in the US (57%) compared with 43% in the UK and just 31% in Germany, Austria and Switzerland. This trend also matches higher IoT adoption in the US, although Europe is catching up fast.” A key takeaway from this report, said Gordon, is that large organisations across Europe are dealing with an increasingly hybrid IT environment.



Quote for the day:


"Though nobody can go back and make a new beginning... Anyone can start over and make a new ending." -- Chico Xavier


Daily Tech Digest - June 11, 2019

Intel to Acquire Barefoot Networks, Accelerating Delivery of Ethernet-Based Fabrics

An essential part of the equation is providing data center interconnects that can keep pace with our customers’ extraordinary and growing requirements. This is why interconnect is one of our six technology pillars in which we are investing to serve our customers. With this in mind, Intel has signed an agreement to acquire Barefoot Networks, an emerging leader in Ethernet switch silicon and software for use in the data center, specializing in the programmability and flexibility necessary to meet the performance and ever-changing needs of the hyperscale cloud. Upon close, the addition of Barefoot Networks will support our focus on end-to-end cloud networking and infrastructure leadership, and will allow Intel to continue to deliver on new workloads, experiences and capabilities for our data center customers. Led by Dr. Craig Barratt and based in Santa Clara, California, the Barefoot Networks team is a great complement to our existing connectivity offerings. Barefoot Networks will add deep expertise in cloud network architectures, P4-programmable high-speed data paths, switch silicon development, P4 compilers, driver software, network telemetry and computational networking.


Most code-signing processes insecure, study shows


“The reality is that every organisation is now in the software development business, from banks to retailers to manufacturers,” said Bocek, with the survey indicating that 69% of those polled expect their usage of code signing to grow in the next year. “If you’re building code, deploying containers, or running in the cloud, you need to get serious about the security of your code signing processes to protect your business,” he said.  The Venafi study found that although security professionals understand the risks of code signing, they are not taking proper steps to protect their organisation from attacks. Specifically, 35% do not have a clear owner for the private keys used in the code-signing processes at their organisations. Code-signing processes are used to secure and assure the authenticity of software updates for a wide range of software products, including firmware, operating systems, mobile applications and application container images. However, more than 25 million malicious binaries are enabled with code-signing certificates, and cyber criminals are misusing these certificates in their attacks.


Machine Learning has Significant Potential for the Manufacturing Sector


AI doesn’t just affect production and products. It can also allow manufacturers to expand the relationships they have with their customers beyond the point of sale. One company that has successfully incorporated machine learning into its catalog is Cummins Power Generation, an Indiana-based manufacturer of power-generating equipment, including generators and prime and stand-by systems. The company teamed up with Microsoft and Avtex several years ago to develop a remote monitoring system that collects data from Cummins products around the world. This system, known as the Power Command Cloud, “connects to millions of Cummins generators around the world, providing greater visibility into how equipment is performing and enabling refueling and performance maintenance at the exact time to maximize uptime,” Microsoft reported in 2016. This machine learning solution helps Cummins’ customers by monitoring multiple components, alerting users to trouble, and working to minimize the length and frequency of outages.


Correctness vs Change: Which Matters More?

The biggest obstacle to movement is not the work going into it, but the fear. Many modern standards help reduce this. Refactoring, with automated tests, helps us build a clear model of the code so that we can change it with confidence. Microservices help us limit the impact of changing a particular piece. Lehman even anticipated microservices: “Each module could be implemented as a program running on its own microprocessor and the system implemented as a distributed system.” He knew the world wasn’t ready for them, though: “Many problems in connection with the design and construction of such systems still need to be solved.” Some of these problems are now solved. We have automation for the maintenance of software components in quantity; we have APIs for infrastructure, builds and delivery. The crucial question in software architecture is “how will we change it?” We design both the future state of the system, and a path from here to there. To keep growing, we need to keep getting better at changing it, too.
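The point about refactoring with automated tests can be made concrete with a small sketch: because the behaviour is pinned by a test, the implementation can be restructured without fear. The function and figures are invented for illustration:

```python
def total_with_discount(prices, threshold=100.0, rate=0.1):
    """Sum the prices; orders over `threshold` get `rate` off.
    The body can be rewritten freely — extracted into helpers,
    vectorised, ported — as long as the test below still passes."""
    subtotal = sum(prices)
    return subtotal * (1 - rate) if subtotal > threshold else subtotal

def test_total_with_discount():
    # These assertions pin the observable behaviour, forming the
    # safety net that makes refactoring a low-fear activity.
    assert total_with_discount([30.0, 40.0]) == 70.0            # under threshold
    assert abs(total_with_discount([60.0, 60.0]) - 108.0) < 1e-9  # 10% off 120

test_total_with_discount()
```

The test encodes the "clear model of the code" the author mentions: it states what must stay true, freeing everything else to change.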


The Economics Of Artificial Intelligence - How Cheaper Predictions Will Change The World

"As economists studying innovation and technological change, a conventional frame for trying to understand and forecast the impact of new technology would be to think about what the technology really reduces the cost of," he tells me. "And really it's an advance in statistical methods – a very big advance – and really not about intelligence at all, in the way a lot of people would understand the term ‘intelligence.' It's about one aspect of intelligence, which is prediction. “When I look up at the sky and see there are grey clouds, I take that information and predict that it’s going to rain. When I’m going to catch a ball, I predict the physics of where it’s going to end up. I have to do a lot of other things to catch the ball, but one of the things I do is make that prediction.” In business, we have to make these predictions many, many times each day. Will we make a higher profit by selling large volumes cheaply, or small volumes at a high price? Who is the best team member to take on a job? Where will we get the best "bang for our buck" out of our marketing budget?


5G and IoT – how to deal with data expansion as you scale

The speed at which organisations can integrate and analyse data will be vital because context is so important. For example, knowing a vehicle is stationary may not mean very much – unless you know it was travelling at 50 mph two seconds earlier. There will be a certain amount of data that can be processed at the edge in real time, but this is not suitable for other use cases. For example, getting contextual analysis in a matter of seconds will also be vital if organisations are to benefit from IoT, yet this has to be processed centrally in order to provide the right results to the business as a whole. Similarly, in the consumer setting, knowing that a customer has walked into a store is one thing, but to make the information useful, the retailer also needs to know all their online purchases, website clickstreams, service centre calls and so on. Building this Single Customer View is not something that can be achieved at the edge, however much it helps to have data close to the customer. This is why the distribution of data is so important – data might be created in multiple places, but it needs to be managed and used in the right places where it can provide the most value back to the business.
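The stationary-vehicle example can be sketched in Python: a speed reading of zero means little on its own, but paired with the previous reading it can signal an incident. The threshold and readings below are invented for illustration:

```python
def classify(readings_mph):
    """Label each transition between consecutive speed readings.
    Context comes from the pair, not the single value: 0 mph after
    50+ mph suggests a sudden stop worth alerting on."""
    alerts = []
    for prev, curr in zip(readings_mph, readings_mph[1:]):
        if prev >= 50 and curr == 0:
            alerts.append("sudden stop")
        else:
            alerts.append("normal")
    return alerts

# A vehicle cruising at highway speed, then abruptly stationary:
alerts = classify([55, 52, 50, 0, 0])
```

Note that the final 0 mph reading is classified as normal — once context shows the vehicle was already stopped, the same raw value carries a different meaning, which is the article's point about why integration speed matters.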


US Border License Plate and Traveler Photos Exposed

The breach comes as CBP has been increasingly using new tools and gathering more biometric information when people enter the United States. President Donald Trump has made border security and immigration key themes of his tenure, and these themes look to figure prominently in the 2020 presidential election. CBP says it has notified Congress and is working with law enforcement and cybersecurity experts. The agency's own Office of Professional Responsibility, which investigates corruption and mismanagement, has also begun an inquiry. The breach alert has already drawn scrutiny from lawmakers. Democratic Rep. Bennie G. Thompson of Mississippi says he plans to convene hearings next month covering the Department of Homeland Security's use of biometric data. "Government use of biometric and personal identifiable information can be valuable tools only if utilized properly," Thompson says. "We must ensure we are not expanding the use of biometrics at the expense of the privacy of the American public."


Cisco software to make networks smarter, safer, more manageable

Together the new software and DNA Center will help customers set consistent policies across their domains and collaborate with others for the benefit of the entire network. Customers can define a policy once, apply it everywhere, and monitor it systematically to ensure it is realizing its business intent, said Prashanth Shenoy, Cisco vice president of marketing for Enterprise Network and Mobility. It will help customers segment their networks to reduce congestion, improve security and compliance and contain network problems, he said. “In the campus, Cisco’s SD-Access solution uses this technology to group users and devices within the segments it creates according to their access privileges. Similarly, Cisco ACI creates groups of similar applications in the data center,” Shenoy said. “When integrated, SD-Access and ACI exchange their groupings and provide each other an awareness into their access policies. With this knowledge, each of the domains can map user groups with applications, jointly enforce policies, and block unauthorized access to applications.” For Cisco, this means its central domain controllers can now be unified to work together, letting customers drive policies across domains.
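As a toy illustration of the group exchange Shenoy describes (hypothetical names only, not Cisco APIs), a policy defined once over exchanged groupings can be checked in both domains:

```python
# Hypothetical sketch: a campus controller's user groups and a data-center
# controller's application groups are exchanged, and one shared policy
# table answers "may this user reach this application?" for both domains.

campus_groups = {"alice": "engineering", "bob": "contractors"}
dc_groups = {"build-server": "eng-apps", "payroll-db": "finance-apps"}

# Policy defined once: which user groups may reach which app groups.
policy = {("engineering", "eng-apps")}

def allowed(user, app):
    """Map user -> group and app -> group, then consult the shared policy."""
    return (campus_groups[user], dc_groups[app]) in policy

print(allowed("alice", "build-server"))  # True
print(allowed("bob", "payroll-db"))      # False
```

The design point is that neither side needs the other's raw inventory; exchanging group labels is enough to enforce a single intent everywhere.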


Governing the onslaught of connected devices – what’s at stake for enterprises?

Often referred to as “killer apps,” smart speakers and digital assistants from the leading tech giants (Amazon Echo/Alexa, Google Home, Siri, and Cortana) are becoming increasingly ubiquitous. Equally, the “Ring” and “Nest” product lines, owned by Amazon and Alphabet respectively, are another set of goods experiencing rapid adoption. As consumers accept more internet-enabled devices into their homes, they welcome not only the novel benefits but also new concerns regarding the data collected. What was once unregulated territory has seen significant enforcement activity since the start of 2019. Several U.S. states, following GDPR’s passage last May, have proposed their own data protection laws that provide certain GDPR-like consumer rights. ... This is a multi-faceted problem, and enterprises engaged in IoT use cases need a proper data framework in place. Such an infrastructure requires board-level sponsorship along with grassroots engagement across the entire corporate and IT ecosystems, with individuals taking responsibility and accountability for the way data is used.


Waste-Free Coding: Zero-Cost Abstraction in Java

You will learn where the main areas of waste exist in a Java application and the patterns that can be employed to reduce them. The concept of zero-cost abstraction is introduced, along with the idea that many optimizations can be automated at compile time through code generation; a Maven plugin simplifies the developer workflow. Our goal is not high performance: that comes as a by-product of maximizing efficiency. The solution employs Fluxtion, which uses a fraction of the resources of existing Java event-processing frameworks. Climate change and its causes are currently of great concern to many. Computing is a major source of emissions, producing the same carbon footprint as the entire airline industry. In the absence of regulation dictating computing energy consumption we, as engineers, have to assume the responsibility for producing efficient systems, balanced against the cost to create them. In a panel session at InfoQ's 2019 conference in London, Martin Thompson spoke passionately about building energy-efficient computing systems. He noted that controlling waste is the critical factor in minimizing energy consumption.



Quote for the day:


"Ninety percent of leadership is the ability to communicate something people want." -- Dianne Feinstein


Daily Tech Digest - June 10, 2019

AI needs a certification process, not legislation


Neither legal regulation nor ethical guidelines will keep AI development from running amok. That doesn’t mean there isn’t a solution, though. In fact, the solution is a lot simpler than you might think: Establish an independent body that can create standards and a program for certification. ... These compliance measures have highly technical standards that require organizations to comply with specific password protection measures, mobile phone security, data segregation, firewall protections, and many more nuanced topics. While there’s no legal penalty for non-certification, certification is often a necessity for businesses wanting to engage with one another. In AI, I propose that technical experts, investors, and policymakers within the space come together to create a global, independent governing body responsible for establishing and enforcing AI standards. The standards — which should be reviewed regularly with annual certification requirements — should spell out specific requirements such as compliance around avoiding bias in data sets, checks to ensure AI is being used ethically and in a way that isn’t discriminatory, controls around automated decision making, and emergency measures to stop an AI machine.




One way predictive analytics is changing transportation is in how it is forcing firms to evaluate how they arrange data sourced from electronic logs, video event recorders, electronic control modules, and other vehicle sensors. Organizing these sources is critical for triaging which transportation challenges to solve, and means finding relationships among the data that can be turned into useful experiences. For an automotive example, think of a Corvette. Specialty versions of the vehicle offer a Performance Data Recorder that enables telemetry overlays of vehicle data on video from a high-definition camera. That data, sourced from various system activities, is used to analyze driver sessions on a race track, fulfilling a customer wish. Exploring data organization will become more important as autonomous vehicle fleets become more prominent on public roads. Vehicles have historically managed this data in one format or another, but until now there was no opportunity to consider data from an entire network, let alone a central repository or local platform to host it. Autonomous vehicles generate data in real time, which can inform managers' logistics decisions as they happen.



Innovation Hubs v Regulatory Sandboxes and the Future of Innovation Facilitators


Regulators gain a better understanding of innovation in financial services, and firms understand better the regulatory and supervisory expectations against the backdrop of rapid technological advancement. “In particular, innovation facilitators can help competent authorities to keep pace with developments by gaining near ‘real time’ insights into emerging technologies (such as distributed ledger technologies, big data analytics, artificial intelligence and machine learning) and their application in the financial sector. Competent authorities can apply these insights for the purposes of anticipating regulatory and supervisory issues and responding proactively.” On the other hand, it makes regulators more accessible to firms, and in particular to start-ups that lack resources and experience in regulatory matters. However, the report also summarises some of the risks perceived by competent authorities regarding the operation of innovation facilitators in general, and regulatory sandboxes in particular.

Cybersecurity insurance: Read the fine print

Ernest Martin Jr. noted that cybersecurity insurance is trying to protect a new and volatile industry; a good example would be determining how to insure a business that locates the company's technology (hardware and/or software) in a third party's data center, which is becoming a common practice. "Even when a cyber policy provides a particular type of coverage, the actual scope of that coverage can be restricted in many ways," Dallas attorney Amy Elizabeth Stewart explains to Bounds. Stewart suggests firms that outsource their digital assets should understand how the coverage works when third-party vendors are involved, if they want to avoid unpleasant surprises. Bounds offers an example from Renee Hornbaker, the now-retired former financial chief for Stream Energy Inc. and Flowserve Corporation. Hornbaker told Bounds she did not look forward to getting cybersecurity insurance, adding, "I found it to be costly, difficult to purchase, and the application process was onerous." Bounds brings up another good point about what could be a problem for some company executives: obtaining insurance likely will entail disclosing a lot of sensitive information to the insurer, such as infrastructure setup and security practices.



Apache Mesos is an open source cluster management tool that abstracts and isolates resources within distributed IT environments. Enterprises use Mesos with, or as an alternative to, Kubernetes for container orchestration in large-scale deployments. ... Readers should expect the build process -- compiling and linking the components of Apache Mesos -- to take about one hour on a two-core machine with 8 GB of memory. Close any servers and end any running tasks on your machine before you begin compiling the Apache Mesos installation. This process can take 100% of the memory and prevent even SSH login attempts. All commands must execute via the sudo command, which enables you to act as the administrative root user. Test frameworks are not critical: It's a complicated process to write a Mesos test framework, and a regular user is unlikely to need one. Instead, IT admins are more likely to use a Mesos framework developed for an established project such as Hadoop, Spark or Cassandra.
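The build flow described above roughly follows the standard Mesos release-tarball procedure; the version number and paths below are illustrative, so check the project's download page for the current release.

```shell
# Sketch of a from-source Mesos build (version and mirror illustrative).
wget https://archive.apache.org/dist/mesos/1.8.0/mesos-1.8.0.tar.gz
tar -zxf mesos-1.8.0.tar.gz && cd mesos-1.8.0

# Mesos builds out-of-tree: configure and compile from a build directory.
mkdir build && cd build
../configure            # checks for a C++ compiler, libcurl, SASL, etc.
make -j2                # roughly an hour on a two-core / 8 GB machine
sudo make install       # installs under /usr/local by default
```

As the article warns, `make` can saturate memory on a small machine, so shut down other services first.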


The Problem with Quantum Computers

The trouble is, quantum mechanics challenges our intuition. So we struggle to figure out the best algorithms for performing meaningful tasks. To help overcome these problems, our team at Los Alamos National Laboratory is developing a method to invent and optimize algorithms that perform useful tasks on noisy quantum computers. Algorithms are the lists of operations that tell a computer to do something, analogous to a cooking recipe. Compared to classical algorithms, the quantum kind are best kept as short as possible and, we have found, best tailored to the particular defects and noise regime of a given hardware device. That enables the algorithm to execute more processing steps within the constrained time frame before decoherence reduces the likelihood of a correct result to nearly zero. In our interdisciplinary work on quantum computing at Los Alamos, funded by the Laboratory Directed Research and Development program, we are pursuing a key step in getting algorithms to run effectively. The main idea is to reduce the number of gates in an attempt to finish execution before decoherence and other sources of errors have a chance to unacceptably reduce the likelihood of success.
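One concrete way shortening helps, sketched below as a toy (this is not the Los Alamos team's actual method), is gate cancellation: two identical self-inverse gates applied back to back on the same qubits have no net effect, so removing the pair shrinks the circuit without changing its result.

```python
# Toy circuit optimizer: drop adjacent pairs of identical self-inverse
# gates. A circuit is a list of (gate_name, qubits) tuples; fewer gates
# means fewer steps to finish before decoherence sets in.

SELF_INVERSE = {"X", "Z", "H", "CNOT"}   # each is its own inverse

def cancel_adjacent(circuit):
    out = []
    for op in circuit:
        if out and out[-1] == op and op[0] in SELF_INVERSE:
            out.pop()          # the pair cancels: remove both gates
        else:
            out.append(op)
    return out

circ = [("H", (0,)), ("X", (1,)), ("X", (1,)), ("CNOT", (0, 1))]
print(cancel_adjacent(circ))   # [('H', (0,)), ('CNOT', (0, 1))]
```

Real compilers apply far richer rewrites (and, as the article notes, tailor them to a device's specific noise), but the goal is the same: an equivalent circuit with fewer gates.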


Everyone possesses the ability to be a good innovator. We are all born this way. But most of us unlearn these abilities by spending much of our lives within the tightly controlled systems that constitute our educational institutions and workplaces. A large 10-year study of 1,600 children tested their creativity—defined as the ability to engage in divergent thinking, i.e. the ability to have original ideas which differ from anything you have seen before—measuring the creativity of children who were 5, 10, and 15 years old. ... If those numbers don’t give you pause, I don’t know what will. These results also explain why our organizations lack innovation power. As adults, we unlearn our skills of divergent thinking, and most of our organizations are built to promote and maintain this state. The organizations may have been founded by people who were creative geniuses, but unless the founders still run the organizations and are very visible bearers of the culture, the organizations quickly change and are left to people who have largely unlearned divergent thinking and have instead learned convergent thinking, which is the ability to be critical.


Was Chase’s Digital-Only Bank Spinoff a Viable Strategy?


Financial institutions need to transform themselves from product-centric to customer-centric, from efficiency to flexibility, and from digital support to digital-only. The winners in the banking industry will find ways to collect and act on insights faster than the competition. This is what Amazon does and what consumers will expect from their financial institution. This can’t be achieved by protecting existing branch-based organizations and processes, or by hoping that increased investments in technology will save the day. This is because the financial support of legacy branches and processes (at least to the degree that is occurring in most organizations) is not sustainable. Alternative providers can provide greater value, better rates and a better experience at a lower cost. According to noted author and futurist Brett King, “We’re likely to see more digital-only offerings from traditional banks fail in the future where banks aren’t truly committed to digital transformation. The problem is that many traditional banks are doing this for PR reasons — not because they believe in digital as a destination. Ultimately they will fail because the traditional organization kills it off or starves it of adequate support.”


Cisco to buy IoT security, management firm Sentryo

Sentryo's ICS CyberVision lets enterprises ensure continuity, resilience and safety of their industrial operations while preventing possible cyberattacks, said Nandini Natarajan, industry analyst at Frost & Sullivan. "It automatically profiles assets and communication flows using a unique 'universal OT language' in the form of tags, which describe in plain text what each asset is doing. ICS CyberVision gives anyone immediate insights into an asset's role and behaviors; it offers many different analytic views leveraging artificial intelligence algorithms to let users deep-dive into the vast amount of data a typical industrial control system can generate. Sentryo makes it easy to see important or relevant information." In addition, Sentryo's platform uses deep packet inspection (DPI) to extract information from communications among industrial assets, Natarajan said. This DPI engine is deployed through an edge-computing architecture that can run either on Sentryo sensor appliances or on network equipment that is already installed. Thus, Sentryo can embed visibility and cybersecurity features in the industrial network rather than deploying an out-of-band monitoring network, Natarajan said.
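The "plain-text tag" idea can be sketched as a lookup from an observed protocol event to human-readable role labels; the rules and tags below are invented for illustration and are not Sentryo's actual engine or vocabulary.

```python
# Hypothetical sketch: map a DPI-extracted protocol event to plain-text
# tags describing what the asset is doing. Rules and tags are made up.

TAG_RULES = {
    "modbus/write": ["PLC", "Write Variable"],
    "modbus/read": ["PLC", "Read Variable"],
    "s7comm/program": ["PLC", "Program Download"],
}

def tag_flow(protocol_event):
    """Return plain-text tags for an event, or a fallback label."""
    return TAG_RULES.get(protocol_event, ["Unknown"])

print(tag_flow("modbus/write"))    # ['PLC', 'Write Variable']
print(tag_flow("http/get"))        # ['Unknown']
```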


Reducing data security complexity: Avoiding endpoint bloat

Whether agents, particularly security control agents, persist over time is the only metric worth our attention, because it puts a spotlight on the greatest hidden danger of all: the naturalness of security decay. Things fall apart. Rust never sleeps. Agents topple over. Decay is the fate of all security agents. But if these serve as the foundation of our security goals or most technical expression of security intent, then what could possibly be more important? It’s also not a question of whether security decay is happening in your environment, you can rest assured it is. What must be asked is, will you persist through it? This question demands an answer. Ideally, organizations reduce their overall security costs by monitoring how their endpoint controls work (or don’t) to reduce endpoint security decay. They validate safeguards and eliminate compliance failures. And they respond to threats and exposures with the confidence to control devices from anywhere.



Quote for the day:


"Leaders are people who believe so passionately that they can seduce other people into sharing their dream." -- Warren G. Bennis


Daily Tech Digest - June 09, 2019

Budweiser's Parent Company Invests In Blockchain For Farmers

While the blockchain industry has swayed over the past year to a focus on how blockchain can save giant enterprises money by removing unnecessary middlemen, the investment by Anheuser-Busch InBev, a member of the inaugural Forbes Blockchain 50 list, and best known as the maker of Budweiser, is a return to blockchain’s roots as a way of empowering the unbanked. “Through this work, we are helping to create a digital ledger of farmers’ transactions that will create an economic identity and enable access to financial services,” said Maisie Devine, a director at AB InBev, in a statement. “This will ultimately allow farmers to grow their business and improve the livelihoods of their families and communities.” Belgium-based AB InBev’s work with BanQu was announced in August 2018 with a pilot in Zambia that served 2,000 of the region’s smallholder cassava farmers, with subsequent services brought to Uganda, India, Brazil, Costa Rica, Indonesia, Jordan, Malawi, Somalia, South Africa, Syria and the United States.



GDPR One Year On: Increasing Demand for “Security by Design”

GDPR’s focus on personal data highlights how software is made and what components are used. Globally, businesses awoke to the reality that open source components are part of their software supply chains. “Security hasn’t caught up to 21st-century software engineering, so that’s being addressed now,” he said. GDPR put pressure on the industry to rethink, and re-engineer, software security at the start. Ilkka emphasized that negative publicity is a key motivating factor. No one wants to be part of the next big breach, meaning security is quickly becoming a mainstream priority, he adds. Simultaneously, a corporate shift is occurring. More software development teams are adopting a DevOps approach to production. This approach, which favors rapid iterations and software releases, produces better software, faster. A consequence is that security must be embedded from the start. A successful, secure design must be automated, repeatable, and scalable.


Tackling bias in artificial intelligence (and in humans)

In many cases, AI can reduce humans’ subjective interpretation of data, because machine learning algorithms learn to consider only the variables that improve their predictive accuracy, based on the training data used. In addition, some evidence shows that algorithms can improve decision making, causing it to become fairer in the process. For example, Jon Kleinberg and others have shown that algorithms could help reduce racial disparities in the criminal justice system. Another study found that automated financial underwriting systems particularly benefit historically underserved applicants. Unlike human decisions, decisions made by AI could in principle (and increasingly in practice) be opened up, examined, and interrogated. To quote Andrew McAfee of MIT, “If you want the bias out, get the algorithms in.” At the same time, extensive evidence suggests that AI models can embed human and societal biases and deploy them at scale.



For two hours, a large chunk of European mobile traffic was rerouted through China

"Today's incident shows that the internet has not yet eradicated the problem of BGP route leaks," Madory said. "It also reveals that China Telecom, a major international carrier, has still implemented neither the basic routing safeguards necessary both to prevent propagation of routing leaks nor the processes and procedures necessary to detect and remediate them in a timely manner when they inevitably occur. "Two hours is a long time for a routing leak of this magnitude to stay in circulation, degrading global communications." But had any other ISP caused this incident, it would likely have been ignored. Alas, it was China Telecom, and there's a backstory. An academic paper published by experts from the US Naval War College and Tel Aviv University in October last year blamed China Telecom for "hijacking the vital internet backbone of western countries." The report argued that the Chinese government was using local ISPs for intelligence gathering by systematically hijacking BGP routes to reroute western traffic through its country, where it can log it for later analysis.
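The core check route-monitoring services run to flag a leak like this can be sketched in a few lines: compare the origin AS seen in a BGP announcement against the expected origin for that prefix. The prefix and AS numbers below are illustrative, not from the incident.

```python
# Minimal origin-validation sketch (illustrative data, not the real event):
# a leak or hijack shows up as a prefix being originated by an unexpected AS.

EXPECTED_ORIGIN = {"185.60.216.0/22": 32934}   # prefix -> expected origin ASN

def is_leak(prefix, as_path):
    """as_path: list of ASNs from the BGP update; last element is the origin."""
    expected = EXPECTED_ORIGIN.get(prefix)
    return expected is not None and as_path[-1] != expected

# Normal announcement vs. the same prefix originated elsewhere:
print(is_leak("185.60.216.0/22", [3356, 32934]))   # False
print(is_leak("185.60.216.0/22", [4134, 64512]))   # True
```

Production systems (and safeguards such as RPKI origin validation) are far more involved, but the two-hour delay the article criticizes is essentially the gap between such an alert firing and operators acting on it.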


Why cryptocurrency’s not quite ready for takeoff


Why? Because crypto remains too far outside of what’s known as the “adjacent possible.” ... The adjacent possible can be illustrated using a coordinate graph. A point on the vertical axis shows how competent the technology is today, and a point on the horizontal axis shows how ready society is to accept and adopt the technology. The curve that connects the two points constantly moves outward over time, as technology gets better and society embraces new innovations. Inside the curve is technology that exists and is accepted. TVs, smartphones, and airliners are all safely inside this zone. Outside the curve is what’s not yet possible or adopted. Either the technology isn’t good enough yet, or the public isn’t ready — or, more typically, both. Holographic entertainment, augmented reality glasses, and consumer space travel sit out in that zone. Build such a product, and it will be too far ahead of its time. The magic happens in the thin band separating the two zones — in the adjacent possible.


Ireland's Priviti and Aussie fintech Accurassi partner for Open Banking

Following the Australian government’s response to the Review into Open Banking in 2018, Australia’s major banks will be required to make data available on credit and debit card, deposit and transaction accounts and mortgages by February 2020. The government’s legislated Consumer Data Right gives Australians greater control over their data and enables them to choose to share their data with trusted recipients for purposes they have authorised. This will first apply to the banking sector, followed by the energy and telecommunications sectors. Seizing the opportunity, Accurassi is launching its marketplace solution with banks and energy suppliers in the next few months and using consumer utility bill data to power personalised energy comparison services. The Priviti API will be embedded into the user experience to provide explicit authorisation to energy retailers for the release of bills to Accurassi.


Meet Kedro, McKinsey’s first open-source software tool

The name Kedro, which derives from the Greek word meaning center or core, signifies that this open-source software provides crucial code for ‘productionizing’ advanced analytics projects. Kedro has two major benefits: it allows teams to collaborate more easily by structuring analytics code in a uniform way so that it flows seamlessly through all stages of a project. This can include consolidating data sources, cleaning data, creating features and feeding the data into machine-learning models for explanatory or predictive analytics. Kedro also helps deliver code that is ‘production-ready,’ making it easier to integrate into a business process. “Data scientists are trained in mathematics, statistics and modeling—not necessarily in the software engineering principles required to write production code,” explains Yetunde. “Often, converting a pilot project into production code can add weeks to a timeline, a pain point with clients. Now, they can spend less time on the code, and more time focused on applying analytics to solving their clients’ problems.”
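The uniform structure the article describes can be illustrated in plain Python: each stage is a small named function, and the pipeline is declared once and run the same way everywhere. This mimics the idea only; it is not Kedro's real API.

```python
# Pure-Python sketch of a node/pipeline structure (not Kedro's actual API):
# small, named steps declared once, then executed uniformly in order.

def clean(raw):
    """Normalize raw text records."""
    return [r.strip().lower() for r in raw]

def featurize(cleaned):
    """Turn cleaned records into feature dicts for a model."""
    return [{"text": c, "length": len(c)} for c in cleaned]

PIPELINE = [clean, featurize]      # the flow, declared in one place

def run(pipeline, data):
    for step in pipeline:
        data = step(data)
    return data

print(run(PIPELINE, ["  Alpha ", "BETA"]))
# [{'text': 'alpha', 'length': 5}, {'text': 'beta', 'length': 4}]
```

Because every project shares this shape, a teammate can find the data-consolidation, cleaning, and feature steps in the same place in any codebase, which is the collaboration benefit Kedro claims.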


Can Artificial Intelligence Save Us From Asteroidal Armageddon?

NASA's Planetary Defense Coordination Office uses the Catalina Sky Survey facility in Tucson, Arizona, to catalog space objects
NASA’s Planetary Defense Coordination Office already uses numerous telescopes to find and monitor NEOs that might have the potential to impact Earth. But the non-profit Aerospace Corporation’s A.I. team is working with NASA on implementing software dubbed NEO AID (Near-Earth Object Artificial Intelligence Detection) to differentiate false positives from asteroids and comets that might be real threats. Nightly, researchers at locations such as the Catalina Sky Survey on Mount Lemmon in Tucson, Ariz. pore over hundreds of images of star fields in search of fast-moving objects that need more scrutiny, according to Aerospace Corporation. It’s here that Aerospace A.I. engineers used 100 terabytes of data to build and train an artificial intelligence model that is now capable of classifying NEO targets of interest. And by Aerospace Corporation’s calculations, this new A.I. tech has already increased the sky survey’s performance by 10 percent, with room for further development. NASA’s Center for Near-Earth Object Studies says that with over 90 percent of NEOs larger than one kilometer already discovered, the NEO program is now focusing on finding 90 percent of those larger than 140 meters.


Machine Learning Is Not Magic: It’s All About Math, Stats, Data, and Programming


One of the main reasons why I kept making a U-turn was the liberal dosage of mathematics found in almost every ML resource that I bookmarked. Despite my determination and commitment, the thought that I needed to learn advanced mathematics kept pushing me away. Let me admit it — I dread dealing with mathematics. I barely managed to pass my math papers in high school. When I was a teen, I rejoiced when I found that it was possible to build a career in IT without a master’s degree in mathematics. The fact that some advanced math became a prerequisite for ML disappointed me and, in many ways, brought back the nightmare of my school days. But as I continued to work with my customers on Internet of Things and data-centric projects, the possible usage of ML kept coming back to us. Meanwhile, the hype around ML reached its peak — so much so that the cloud providers started to push ML more than core IaaS components like VMs, storage, and networking. It also became extremely clear that ML is becoming front and center in many emerging technologies, including Cognitive Computing, Artificial Intelligence, Chatbots, Personal Assistants, and Predictive Maintenance.


Shifting the Conversation to Security by Design

It wasn’t a surprise that healthcare organizations were asking for this as well. We think that asking for changes to a mandate or regulation is a good thing in theory, but it’s tricky; you don’t want to over-mandate or over-regulate, but you also don’t want to under-regulate either. With cyber hygiene, if you are going to have meaningful regulation, you want to make sure it balances the technology side of the equation with the people side. You often hear the individuals in an organization getting blamed as the weakest link. We don’t like to think of it that way. We like to say individuals can be your strongest line of defense if they are adequately trained and have the right tools and resources, both from a technological perspective, but also from a training perspective. Safe harbor would remove penalties for healthcare organizations that suffer a cyber incident if they were in full compliance with HIPAA requirements and any other mandates that could secure the network or data. No matter how much money you spend, there is no protection that will render a system completely secure.



Quote for the day:


"Being responsible sometimes means pissing people off." -- Colin Powell