Daily Tech Digest - August 10, 2019

Blockchain’s real promise: Automating trust

The opportunity for transformation is significant because the cost of establishing trust in a supply chain is incredibly high. Consider the problem of counterfeit goods: The Organisation for Economic Co-operation and Development estimates that $461 billion worth of fake goods are sold annually, amounting to 2.5 percent of global trade. According to the Global Brand Counterfeiting Report 2018, total global counterfeiting is expected to surge to $1.82 trillion by 2020, exposing businesses to revenue loss, quality issues, and potential reputational damage. As companies grapple with how to build trust among their suppliers, they are doling out big money on activities such as duplicative testing, manual auditing, and reconciliation, while investing in extra insurance and legal assistance to backstop any failure to meet contractual obligations. In the airline industry, for example, carriers are grounding planes longer, hoarding an excess of spare parts, and avoiding the use of less-expensive used parts and planes because they don’t fully trust their provenance.



The brain inspires a new type of artificial intelligence

Brain dynamics do not comply with a well-defined clock synchronized for all nerve cells, since the biological scheme has to cope with asynchronous inputs, as physical reality develops. "When looking ahead one immediately observes a frame with multiple objects. For instance, while driving one observes cars, pedestrian crossings, and road signs, and can easily identify their temporal ordering and relative positions," said Prof. Kanter. "Biological hardware (learning rules) is designed to deal with asynchronous inputs and refine their relative information." In contrast, traditional artificial intelligence algorithms are based on synchronous inputs, hence the relative timing of different inputs constituting the same frame is typically ignored. The new study demonstrates that ultrafast learning rates are surprisingly identical for small and large networks. Hence, say the researchers, "the disadvantage of the complicated brain's learning scheme is actually an advantage". Another important finding is that learning can occur without learning steps through self-adaptation according to asynchronous inputs.


Are you blinded by the promise of AI?

Data is being collected by technologies across your business faster than any human can assess, analyze and leverage it. The big leap forward in data analytics has been the machine learning capabilities that result in algorithms that can forecast behavior and offer recommendations and/or potential pathways. From the recommendations that come from content streaming services to shopping options that pop up in online advertising, many of us see these algorithms at work every day. Customer data—when collected with permission, security and high integrity—offers businesses potent customer personalization opportunities, from sending discounts or information around important life events (anniversaries, holidays, etc.) to creating personalized communications. Because AI can collect and analyze large data sets at remarkably fast rates, businesses can use it to predict potential issues, favorable market opportunities or customer needs. To identify these opportunities, organizations need to work across all business groups to assess the numerous places data is collected, and how and when that data can be used.


Software quality issues: not just for Boeing CIOs

It is clearly no exaggeration to describe software as becoming increasingly life-critical. Software keeps planes in the sky, tests the cars we drive, keeps the health systems in our hospitals running and ensures that 1.5 million smart meters keep houses warm across the UK. However, with software enabling some of the most important and life-critical functions, we need to know it can execute, flawlessly, again and again. In Boeing’s case, a functional issue prevented the software from performing as it was expected to. However, other issues, which are coding-related rather than functional, are arguably harder to detect. Coding issues in software can result in poor quality and IT outages similar to Boeing’s. CISQ’s recent research on software quality estimated the cost of poor software quality at $2.8 trillion for the US alone. CIOs, therefore, need the ability to oversee the current state of an organisation’s software. Yet the 2018 Software Intelligence Report found only half (51%) of CIOs claim to have “some” knowledge of their current applications’ software quality. Even worse, less than 50% of CIOs believe their organisations have enough insight into the software to make the best decisions.


Cyberattack Warning As Dangerous Issues Found On Popular Office Printers

In both those disclosures, the primary risk exposed was unpatched devices providing a soft entry point into a would-be secure network. In essence, attackers don't need to try too hard to develop sophisticated TTPs when there are vulnerable IoT devices that, in many cases, are not even on the radar of corporate technology security teams. "IoT devices," Microsoft pointed out, "are purposefully designed to connect to a network and many are simply connected to the internet with little management or oversight. In most cases however, the customers’ IT operation center don’t know they exist on the network." How true is that of network printers, connecting to the open internet to download printer drivers while also appearing on internal networks? In the cyberattacks identified by Microsoft, those devices—including the office printer—became "points of ingress from which the actor established a presence on the network and continued looking for further access."


The Pros and Cons of Emotional AI

Some analysts also worry that if emotional AI gauges how people feel often enough or provides responses with simulated feelings, it could give humanity an excuse not to stay connected. For example, if an AI tool could check on a loved one and send a report that says everything’s fine, a user could decide that’s enough information and not bother confirming it’s true. What if a person has a disability that causes them to have trouble controlling their facial expressions, or perpetually grimace because they’re in pain? Those things don’t have anything to do with the kind of service received. If emotional AI makes the wrong judgment, it could bring unwanted attention to the individual and cause embarrassment. It’s also possible that AI could pick up on a person’s emotions and do something that worsens how they feel. Many people have had at least a few instances where Facebook’s “On This Day” feature showed something they’d rather not recall. Some companies are developing AI that could respond to people’s angry or sad emotions to cheer them up or calm them down.


New wave of smart cities has arrived -- and they’re nothing like science fiction

Successful smart city projects blend disciplines, bringing together experts in behavioral change alongside specialists in artificial intelligence and information technologies. Interdisciplinary work can be messy and difficult, it can take longer and may not always work -- but when it does, it can bring real benefits to cities. For instance, Nottingham City Council and Nottingham Trent University have been part of the Remourban regeneration program, working across sectors with cities around Europe. Homes in the Nottingham suburb of Sneinton have been upgraded with new outside walls and windows, a solar roof and a state-of-the-art heating system -- a process that takes just a few days. The result is improved insulation and reduced energy bills for residents, but also better public health: calculations suggest that bad housing costs the UK’s National Health Service £1.4 billion a year, and improving the quality of homes can cut visits to local doctors almost by half. The German city of Darmstadt has worked with citizens, universities, museums and businesses to plan for the future.


White House Staffer Seeking Threesomes, According to Data Leak

An explanation of how the data, leaked from the group-dating app 3Fun, was obtained was published in a detailed blog post Thursday. Writer Alex Lomas acknowledged it's possible tech-savvy web users could have manipulated the data to make it appear their locations were close to seats of power. The focus of the post wasn’t on the technology-enhanced sex habits of Supreme Court law clerks, but on the continued issues with leaked data and hookup apps. That includes the ready release of personal information allowing users of Grindr and Romeo to be tracked down from great distances. “We think it is utterly unacceptable for app makers to leak the precise location of their customers in this fashion,” Lomas wrote. “It leaves their users at risk from stalkers, exes, criminals, and nation states.” While the differentiating feature of such apps is the ability to locate other nearby users, developers have faced calls to address flaws in the technology that allow people to access private data, pinpoint the precise location of users from significant distances and then target them.


State Farm Investigates Credential-Stuffing Attack

"State Farm discovered a bad actor or actors attempting to gain access to customers' online accounts using a list of user IDs and passwords from other sources," the company spokesperson tells ISMG. "To defend against the attack, we reset passwords for these online accounts in an effort to prevent additional attempts by the bad actor. We have implemented additional controls and continue to evaluate our information security efforts to mitigate future attacks." It's not clear how many customers were affected by the incident, and the State Farm spokesperson did not specify how many notification letters went out. "We encourage customers to regularly change their passwords to a new and unique password, use multifactor authentication whenever possible and review all personal accounts for signs of unusual activity," the spokesperson says. Credential stuffing has emerged as one of the biggest threats to enterprises across the world. A 2018 report by security vendor Akamai found that companies were reporting nearly 13 credential stuffing incidents each month in which the attacker successfully identified valid credentials.
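The pattern State Farm describes, one attacker replaying user IDs and passwords leaked from other breaches, has a telltale signature: a single source trying many *distinct* usernames. A minimal detection sketch (function and threshold names are illustrative, not State Farm's actual controls; real defenses also use rate limiting, MFA and breach-list checks, as the article notes):

```python
from collections import defaultdict

def flag_credential_stuffing(login_events, threshold=5):
    """Flag source IPs that attempt logins for many distinct usernames,
    a hallmark of credential stuffing (vs. one user mistyping a password).

    login_events: iterable of (source_ip, username, success) tuples.
    Returns the set of IPs at or above the distinct-username threshold.
    """
    users_per_ip = defaultdict(set)
    for ip, user, _success in login_events:
        users_per_ip[ip].add(user)
    return {ip for ip, users in users_per_ip.items() if len(users) >= threshold}

# One IP cycling through eight accounts vs. one normal login:
events = [("203.0.113.9", f"user{i}", False) for i in range(8)]
events += [("198.51.100.4", "alice", True)]
print(flag_credential_stuffing(events))  # → {'203.0.113.9'}
```

In practice the flagged IPs would feed a block list or step-up authentication rule rather than a simple print.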


Blockchain’s Role In Collaborative Competition

Blockchain’s immutable record avoids legal disputes by clearly attributing IP. Through self-regulating and decentralised smart contracts, organisations can collaborate with far more confidence. Rather than trusting a competing company, organisations place their trust in technology. Let’s say that one of the manufacturers in the IVC’s data sharing strategy opens up the information they have about the production and performance of a specific tool. If another company wants to access the information, they can, but may need to offer IP or capital in exchange. Once the exchange is made, each party gets something out of it which ultimately benefits the manufacturing industry as a whole. Blockchain is one of many disruptive technologies that has made it easier for companies to stay competitive through collaboration. The IVC’s encouragement of blockchain adoption demonstrates just how much capitalist companies have changed. Instead of locking IP away forever, organisations recognise that data exchange is advisable, if not necessary.



Quote for the day:


"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour


Daily Tech Digest - August 09, 2019

Supercomputer-Powered AI Tackles a Key Fusion Energy Challenge


The disruptions – which happen near-instantly – need to be detected as early as possible. So far, simulations have been unable to deliver fast enough predictions – so the researchers turned to machine learning, which has shown promising results for disruption prediction. The goal: to meet the 95 percent correct disruption prediction threshold required by the under-construction ITER Tokamak, which will be the largest fusion reactor in the world. Julian Kates-Harbeck (lead author on the paper published in Nature) answered this challenge by developing the Fusion Recurrent Neural Network (FRNN), an AI disruption prediction tool. FRNN learns from thousands of experimental runs – tracking plasma current, temperature, density and other variables – and attempts to learn which factors signal imminent disruptions. To meet the level of reliability that ITER will demand, the researchers ran FRNN on powerful machines. After initial runs on Tiger (a cluster at Princeton University), they turned to the (now-decommissioned) Titan supercomputer, where they ran FRNN on 6,000 Nvidia Tesla K20X GPUs.



This Bank Gave Bitcoin to Its Entire Staff. Now It’s Taking Crypto Clients

Quontic Bank opened a checking account for a bitcoin ATM company a few weeks ago and is in the process of completing a contract to deliver banking services to another crypto startup. The bank wouldn’t name either client. “We’re just taking steps so that when the regulatory environment becomes more crypto-friendly, we don’t have a lot of catching up to do,” said Quontic chief executive Steven Schnall, who acquired the bank in 2009. “We’re looking to diversify our product offering and our customer mix by entering into that field.” While Schnall wouldn’t say how big he wants Quontic’s crypto business to be, he claimed the pending contract “could impact millions of Americans.” Crypto-friendly banks are extremely rare, in part because of the extra work they have to do complying with know-your-customer (KYC) and anti-money laundering (AML) regulations. “Banks and other financial institutions have to look out for any suspicious activity,” said Joshua Klayman, head of the blockchain and digital assets practice at law firm Linklaters.


Seven very simple principles for designing more ethical AI


AI systems have already been designed to help or hurt humans. A group at UCSF recently built an algorithm to save lives through improved suicide prevention, while China has deployed facial recognition AI systems to subjugate ethnic minorities and political dissenters. Therefore, it’s impossible to assign valence to AI broadly. It depends entirely on how it’s designed. To date, that’s been careless. AI blossomed with companies like Google and Facebook, which, in order to give away free stuff, had to find other ways for their AI to make money. They did this by selling ads. Advertising has long been in the business of manipulating human emotions. Big data and AI merely allowed this to be done much more effectively and insidiously than before. AI disasters, such as Facebook’s algorithms being co-opted by foreign political actors to influence elections, could and should have been predicted from this careless use of AI. They have highlighted the need for more careful design, including by AI pioneers like Stuart Russell, who now advocates that “standard model AI” should be replaced with beneficial AI.


PCI SSC warns organisations about growing threat of online skimming

Online skimming is a variation of a criminal tactic used to gain access to payment card information. Until recently, it was more commonly associated with physical fraud, in which criminals use a device (‘skimmer’) that interacts with a victim’s payment card. One of the most common skimming methods is to place a duplicate card reader on top of an ATM’s payment card slot. Criminals can then siphon off card details as the card enters the machine. This reader will typically be paired with a pinhole camera or duplicate keypad placed over the machine so that the fraudsters can log the customer’s PIN. Online skimming works in much the same way, except the ATM is replaced by an online payment form and the physical skimming device is replaced by malicious code. Magecart is the umbrella term for the criminal groups exploiting these vulnerabilities, which mostly target Magento-based online stores and content management systems. A number of recent data breaches, such as those at Ticketmaster and British Airways, are believed to have been part of such credit card skimming operations.


What is edge computing? Here's why the edge matters and where it's headed

Edge computing has been touted as one of the lucrative, new markets made feasible by 5G wireless technology. For the global transition from 4G to 5G to be economically feasible for many telecommunications companies, the new generation must open up new, exploitable revenue channels. 5G requires a vast, new network of (ironically) wired, fiber optic connections to supply transmitters and base stations with instantaneous access to digital data (the backhaul). As a result, an opportunity arises for a new class of computing service providers to deploy multiple µDCs adjacent to radio access network (RAN) towers, perhaps next to, or sharing the same building with, telco base stations. These data centers could collectively offer cloud computing services to select customers at rates competitive with, and features comparable to, hyperscale cloud providers such as Amazon, Microsoft Azure, and Google Cloud Platform. Ideally, perhaps after a decade or so of evolution, edge computing would bring fast services to customers as close as their nearest wireless base stations.


Get creative with feature flags in IT operations


DevOps shops can use feature flags in conjunction with other application deployment methods, including what Condo referred to as a progressive release methodology. Rather than the typical pattern of releasing approved code and then deploying that release to production, a progressive release instead deploys latent code to production, where it is then tested or held until a designated time and switched on. A feature flag, in this instance, makes that final changeover as simple as pressing a button. But feature flags also create significant technical debt if left unchecked -- technical debt that IT admins must clean up or otherwise manage during troubleshooting, new releases and other ops tasks. Organizations that rely on feature flags have two options, Condo said: "They could leave the feature flag there, because they have some greater strategy about how they manage things ... or [flag code removal] becomes part of the normal routine, and [admins] remove that code so that it never gets turned off accidentally and [there are] fewer flags to manage."
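The progressive-release idea above, deploy latent code to production dark, then switch it on with a flag, can be sketched in a few lines (the flag store and function names here are illustrative, not from any specific feature-flag product):

```python
# Latent-code pattern: the new flow ships to production but stays dark
# until its flag is flipped, with no redeploy needed at release time.
FEATURE_FLAGS = {"new_checkout": False}  # in practice, loaded from a config service

def checkout(cart_total):
    """Route to the new flow only when its flag is on."""
    if FEATURE_FLAGS.get("new_checkout"):
        return f"new flow: total={cart_total:.2f}"   # latent code, deployed but dark
    return f"legacy flow: total={cart_total:.2f}"

print(checkout(10.0))                  # legacy path while the flag is off
FEATURE_FLAGS["new_checkout"] = True   # the release-time "button press"
print(checkout(10.0))                  # new path switches on instantly
```

The technical-debt point follows directly: once `new_checkout` is permanently on, the conditional and the legacy branch are dead weight that someone must remember to delete.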


The emergence of London’s Silicon Roundabout: why? And lessons for future tech hubs!

Technology shifts such as the iPhone, Android with its emphasis on open source, WordPress slashing the cost of building a website, and the cloud reducing the need for startups to invest in expensive hardware helped create a space for startups. As for London, immigration may have been another factor: as Espinal pointed out, he originally hails from Honduras, worked in the US, and came to the UK from the US, as the UK visa system allowed this. “At the time, immigration regulation favoured highly skilled migrants.” Espinal says you can unpick the tech story in terms of befores and afters: pre- and post-SEIS, pre- and post-entrepreneur relief, EIS, the emergence of accepted ways of providing shareholder agreements, convertibles, cross-border investing. Vidra reckons that the 2012 Olympics was a factor — for the second wave, anyway. What with that and the Royal Wedding, the London brand name was strong. The Olympics illustrated another point; it was said that every nation competing in the games had 10,000 supporters from the local population.



DDoS attacks in Q2 2019

In early June, a powerful DDoS attack hit Telegram. The attack was carried out primarily from Chinese IP addresses, which gave founder Pavel Durov reason to link it to the demonstrations in Hong Kong; in his words, the political opposition there uses Telegram to organize protests, which Beijing takes a very dim view of. The only headline attack this quarter seemingly driven by commercial considerations targeted video game developer Ubisoft on June 18 — just before the release of its new Operation Phantom Sight expansion for the game Rainbow Six Siege. It caused connection problems for many players, and even provoked calls on Reddit for better DDoS protection. The largest would-be DDoS attack in Q2 turned out to be a false alarm. In late June, some segments of the Internet experienced operational issues worthy of a major DDoS offensive, but the actual cause lay elsewhere. As it turned out, a small ISP in Pennsylvania had made a configuration error, turning itself into a priority route for some Cloudflare traffic. The provider could not handle the load, and thousands of websites serviced by Cloudflare went down as a result.


Integrating Technology into Your Supply Chain: Five Questions You Need to Ask

Unfortunately, many companies learn the hard way that when it comes to supply chain technology, it is not “one size fits all” or “technology for technology’s sake.” Stalled projects, unrealized benefits, disrupted operations, and customer and employee frustration point to the importance of selecting the right kind of emerging technology based on your operating profile and future outlook of your business. By far, the biggest driver of disruption for companies is e-commerce and the extraordinarily high service expectations it is creating. In fact, a recent report from DHL Supply Chain, “The E-Commerce Supply Chain: Overcoming Growing Pains,” found that pressure to fulfill customer expectations continues to challenge businesses building out e-commerce offerings and the new supply chains they need. Customers expect a great, painless e-commerce purchase experience with an ever-shortening delivery time. We are noticing profile changes in other market verticals as well, as order sizes decrease and service expectations increase.


Celebrating the Indian CEO

British consumer goods and healthcare giant Reckitt Benckiser Group Plc recently made some waves when it named PepsiCo’s Laxman Narasimhan as its next chief executive officer, looking outside its own ranks for a new leader after a difficult few years. Interestingly, he replaces Rakesh Kapoor, another Indian who had been at the helm for eight years. Narasimhan’s appointment is the latest in a series of appointments of Indian-origin CEOs at top global firms in the last decade or so. Think – Vasant Narasimhan (Novartis), Sundar Pichai (Google), Satya Nadella (Microsoft), Shantanu Narayen (Adobe), Ajay Banga (Mastercard), Ivan Menezes (Diageo), Sonia Syngal (Old Navy), Rajeev Suri (Nokia) and more recently Vivek Sankaran, President & CEO of Albertsons Companies and Nitin Paranjpe, global COO at Unilever. There’s no ignoring the trend of Indians making it to global leadership roles at multinational firms. And while this trend has largely been noticed in the tech firmament and Silicon Valley where the geek background helped in no small measure, the other industries surprisingly are not far behind.



Quote for the day:


"Limitations are what someone else tries to impose on you. Don't accept it. Question it!" -- Elizabeth McCormick


Daily Tech Digest - August 08, 2019

VR is ahead for now, but AR will be a larger market in the long run

Completing any process, whether in the personal or business sphere, usually involves some sort of digital interface. One example would be using Google Maps to navigate to an unfamiliar location: a restaurant or a bar. A business equivalent of this could be in a warehouse, where an employee needs to navigate to the right shelf to pick up or deposit an item. On the consumer side, another example includes performing an oil change on a vehicle and on the business side, an equivalent could be maintaining an elevator system. Fixing the water pressure on your boiler… the list of applications is really endless. All of these processes require a lot of knowledge, sometimes specialist and sometimes not. But, the point is that all of these tasks can be performed with the aid of augmented reality — eventually. Currently, the use of AR in the above scenarios is being held back by the price, the design of the headsets, their ease of use and various cultural hurdles; a lack of understanding, for one. With AR, everything in both a business and a social sense can become a lot more efficient and accurate.



Data and AI Power the Future of Customer Engagement in Financial Services

The beauty of digital engagement driven by data, advanced analytics and AI is that a dialogue can be created across a broad range of topics. Instead of trying to sell a limited range of products in a program-based environment, all products and services can be included in the conversation, based on identified need. Each interaction is based on the individual profile, preferences and behavior of the consumer. With the integration of chatbots, voice and live agents, consumers can provide feedback on each communication and interaction, allowing models to improve and become even more personalized over time. The learnings during the process not only make the engagement more personal, they make it more powerful, because recommendations will be more accurate. While what’s learned will potentially increase the amount of dialogue over time, personalization and contextualization improve, as do results and revenues. More importantly, the learnings will be shared across the organization, making every contact point more intelligent and consistent.


How to Fit Smart Home Technology into Your Business


Smart home technology comes in many forms, and some of the most popular applications include automated lights, locks, and thermostats. Now more consumers are using virtual assistants to tie all of their connected smart devices together into one cohesive smart home. Today, 27% of people in the U.S. currently own and use a virtual assistant, such as the Amazon Echo or Google Home. Consumers primarily use them to connect with and command apps without picking up their phones. However, companies can also utilize virtual assistants to schedule meetings, scan through emails, and (like consumers) get important data and information on the fly. For example, rather than ask your assistant about the weather, you could ask what last quarter’s revenue looked like compared to the quarter before it. At a 2018 data and analytics summit, Gartner analyst Svetlana Sicular pointed out that AI still needs further development before it can emulate human-to-human conversation.


Cloud adoption questions every IT admin should ask


Changing tools or platforms is akin to moving out of a house. You don't recall what's stored in the basement, nor are you prepared for how long it will take to go through every forgotten box. Cloud migration won't be a quick project, and it starts with an inventory of what's in place, if those items are needed and how -- or even if -- they will work on the new platform. While these legacy on-prem deployments are complex and often underdocumented, the IT admin can migrate or even alter components of the system as needed -- not so when a cloud provider controls the physical data center, security and other aspects of IT infrastructure. Cloud service adoption is a struggle as the business is accustomed to IT staff using their collective knowledge and authority to solve any problem. When something breaks at the cloud provider's end, IT must log the issue with the vendor and wait for resolution.


Survival skills for the digital age


People no longer need to be working from a single desk for the sake of presenteeism. Instead, they are tethered to a suite of devices and platforms that were designed by humans but are not human themselves: Intelligent technology is more like a third person in the room with a person and her device, but it should not be confused with another human. My research showed that it is common for people to spend more than 75 percent of the workday on at least three devices, often more. Financial Times journalist Hannah Kuchler, who covers Silicon Valley, told me that “overall, [people] do things dictated by tools they have, and none of the productivity tools in this new collaborative era really encourage proper thinking.” There’s the example of Slack. Answering Slack messages is no different from answering emails, and neither involves deep thinking, which is a prerequisite for coming up with innovative, value-added solutions. The office-less world requires a new set of digital survival skills, which are vital to preventing burnout.


Digital Transformation And The Cognitive Enterprise

The impact of digital transformation on talent strategies brings opportunities, as well as challenges. The digitisation of work and life is driving a need for new types of talent with new skills. Yet, despite the pressing need for digitally fluent employees to deliver innovation at an extraordinary rate and pace, talent with in-demand skills is limited. To ensure we have digitally skilled talent in place to drive organisations forward, there is a critical need for exponential learning. Leaders must also actively create an internal culture that supports the need for agility and constant change management amid ongoing disruption. HR has a vital role to play in supporting the attraction and delivery of digitally fluent talent. An urgent requirement is to offer personalised learning to employees – AI-enabled platforms can really elevate this experience. HR must prioritise current skills development; for enterprises, the skills gap is very real. Of the global executives we spoke to in our recent CHRO research, 60% say they are struggling to keep their workforce current and relevant.


Leadership In A Digital Age Is Fundamentally Different

The research we did showed two powerful forming elements for great leaders in successful corporations thriving with digital transformation. Firstly, the winners (sucking up 72% of the available returns) see everybody in the organization as being mutually responsible to each other all the time. Every moment of every day. Secondly, thriving leaders recognize that customer journeys don't work in a world where customers or consumers can choose which of a thousand moments they want to act in. This means we need to be continuously connected and aware of how each of the moments that might matter is being handled. These organizations have worked out how to do both of these things and how to deliver power to the people and processes that work in each moment. That is super scary for command-and-control cultures, for obvious reasons. The organizations that get this right drive 40% better OPEX performance and see 25%+ gains in total revenue compared to industry peers.


The Challenges in Integrating Cross-Boundary Teams

When it comes to building a team, two prominent approaches exist: one is to map its sequential growth, and the other is to allow the team to confront issues in a non-sequential way. In the sequential growth approach, the team first interacts slowly, learns the tasks and goals, and then progresses competitively towards the goal. It involves the surfacing of conflicts, their resolution, acceptance, and then moving towards performance. On the contrary, a non-sequential team moves in a flexible way; it lacks the slow progressive pattern which we see in sequential teams. It is hence usually an association of people who already know each other: those who have worked together previously or who understand each other deeply. When assembling cross-functional teams, it is essential to understand the role of team mental models and transactive memory in enhancing team effectiveness. Team mental models make the members aware of the team environment, in a shared and organized understanding of the way key elements interact within the team atmosphere. It is the way an individual member perceives these factors.


R. Michael Anderson reveals the key to (tech) leadership

Before a person can become a successful leader, in technology or elsewhere, Anderson advocates improving the “relationship with ourselves. You can’t be a strong leader until you’re really strong with a relationship with yourself.” “People want to be led, and they want to be led by somebody that they respect. If there’s somebody that’s insecure and doesn’t respect themselves, nobody’s going to respect or have loyalty to them. “The type of people who have a leadership presence are assertive and respect themselves and have confidence in themselves; those are the ones who inspire loyalty and confidence. “Leaders need bold goals, a purpose or mission, and the guts to be able to stick with it, even when things are getting some resistance — inner resilience is key.” ... Good programmers are analytical, “sometimes introverted” and hardworking. To be a successful leader, these traits (bar being introverted) are not enough on their own.


Amazon's PartiQL query language eyes all data sources


Amazon created PartiQL to meet internal demand for multi-source data queries across structured, semi-structured and unstructured data. Those needs came from various corners of Amazon's business, including its retail arm. Amazon released PartiQL's tutorial materials, specification and reference implementation under the Apache 2.0 license. AWS has used PartiQL internally for services such as S3 Select, Glacier Select and Redshift Spectrum, and it was adopted as a query language for Quantum Ledger Database, which AWS launched last year. PartiQL is compatible with standard SQL, which means enterprises can use existing queries with PartiQL in conjunction with SQL query processors. It also treats nested data as a first-class citizen, and doesn't require predefined schemas to be placed on a data set. PartiQL does contain SQL extensions, but they are minimal and simple for DBAs and developers to understand, AWS claims. Finally, PartiQL is data format-independent, which means one query applies to JSON, ORC, CSV and other data types.
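To illustrate the schema-less, nested-data behavior described above, the sketch below shows a PartiQL-style query as a comment (illustrative syntax, not taken from the specification) and the equivalent unnesting written in plain Python over a JSON-like document:

```python
# A nested JSON-style document of the kind PartiQL treats as a first-class citizen.
orders = [
    {"id": 1, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]},
    {"id": 2, "items": [{"sku": "A", "qty": 5}]},
]

# Hypothetical PartiQL-style query (for illustration only):
#   SELECT o.id, i.sku FROM orders AS o, o.items AS i WHERE i.qty > 1
# The comprehension below performs the same traversal: it unnests the
# 'items' array without any predefined schema on the data set.
result = [
    {"id": o["id"], "sku": i["sku"]}
    for o in orders
    for i in o["items"]
    if i["qty"] > 1
]
print(result)  # [{'id': 1, 'sku': 'A'}, {'id': 2, 'sku': 'A'}]
```

The point is that the query ranges over values nested inside each row, rather than requiring the data to be flattened into tables first.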




Quote for the day:

"A leader takes people where they would never go on their own." -- Hans Finzel


Daily Tech Digest - August 07, 2019

Diligent Engine: A Modern Cross-Platform Low-Level Graphics Library

The next-generation APIs, Direct3D12 by Microsoft and Vulkan by Khronos, are relatively new and have only started getting widespread adoption and support from hardware vendors, while Direct3D11 and OpenGL are still considered the industry standard. New APIs can provide substantial performance and functional improvements, but may not be supported by older platforms. An application targeting a wide range of platforms has to support Direct3D11 and OpenGL. New APIs will not give any advantage when used with old paradigms. It is totally possible to add Direct3D12 support to an existing renderer by implementing the Direct3D11 interface through Direct3D12, but this will give zero benefits. Instead, new approaches and rendering architectures that leverage the flexibility provided by the next-generation APIs are expected to be developed. There exist at least four APIs (Direct3D11, Direct3D12, OpenGL/GLES, Vulkan, plus Apple's Metal for iOS and macOS platforms) that a cross-platform 3D application may need to support. Writing separate code paths for all APIs is clearly not an option for any real-world application, and the need for a cross-platform graphics abstraction layer is evident.


"The whole goal of IAM is to make it a whole lot simpler for the user, rather than having to log on and configure access on thousands of different applications," Johnson said. "And the person on the user- or employee-enablement side of the house is really thinking about, 'How can I implement all this permissioning in a way that makes the users' lives easier, not harder?'" Organizations can run into trouble with IAM, however, when the right hand doesn't know what the left is doing, Johnson added. While the CISO and cybersecurity team might operate under false assurance that person X does not have access to resource Y, for example, someone in employee enablement might have in fact granted that access -- unaware of the security implications at play. "Then, when something bad occurs, the board might say, 'How could this happen?'" Johnson said. 


Microsoft finds Russia-backed attacks that exploit IoT devices  

Devices compromised in this way acted as back doors to secured networks, allowing the attackers to freely scan those networks for further vulnerabilities, access additional systems, and gain more and more information. The attackers were also seen investigating administrative groups on compromised networks, in an attempt to gain still more access, as well as analyzing local subnet traffic for additional data. STRONTIUM, which has also been referred to as Fancy Bear, Pawn Storm, Sofacy and APT28, is thought to be behind a host of malicious cyber-activity undertaken on behalf of the Russian government, including the 2016 hack of the Democratic National Committee, attacks on the World Anti-Doping Agency, the targeting of journalists investigating the shoot-down of Malaysia Airlines Flight 17 over Ukraine, sending death threats to the wives of U.S. military personnel under a false flag and much more. According to an indictment released in July 2018 by the office of Special Counsel Robert Mueller, the architects of the STRONTIUM attacks are a group of Russian military officers, all of whom are wanted by the FBI in connection with those crimes.


Privacy Attacks on Machine Learning Models


This type of attack is called a Membership Inference Attack (MIA), and it was created by Professor Reza Shokri, who has been working on several privacy attacks over the past four years. In his paper Membership Inference Attacks against Machine Learning Models, which won a prestigious privacy award, he outlines the attack method. First, adequate training data must be collected from the model itself via sequential queries of possible inputs or gathered from available public or private datasets that the attacker has access to. Then, the attacker builds several shadow models -- which should mimic the model (i.e. take similar inputs and outputs of the target model). These shadow models should be tuned for high precision and recall on samples of the training data that was collected. Note: the attack aims to have different training and testing splits for each shadow model, so you must have enough data to perform this step.
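The shadow-model pipeline can be sketched in a few lines of plain Python. This is a toy illustration of the workflow only, not Shokri's actual attack: the synthetic data, the nearest-centroid "shadow model," and the correct/incorrect attack feature are all simplifications introduced here.

```python
import random

random.seed(0)

# Synthetic stand-in for the data an attacker has collected (assumption:
# in a real attack this comes from model queries or a comparable dataset).
def sample(n):
    pts = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
    return [(x, y, 1 if x + y > 0 else 0) for x, y in pts]

def nearest_centroid_model(train):
    # Deliberately simple stand-in for a shadow model.
    def mean(points):
        xs, ys = zip(*points)
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    c0 = mean([(x, y) for x, y, c in train if c == 0])
    c1 = mean([(x, y) for x, y, c in train if c == 1])
    def predict(x, y):
        d0 = (x - c0[0]) ** 2 + (y - c0[1]) ** 2
        d1 = (x - c1[0]) ** 2 + (y - c1[1]) ** 2
        return 1 if d1 < d0 else 0
    return predict

# Build several shadow models, each with its own disjoint train/test split
# (which is why the attacker needs enough data), and record
# (attack feature, membership label) pairs: samples a shadow was trained
# on are labeled 1 ("member"), held-out samples 0 ("non-member").
attack_records = []
for _ in range(3):
    train, test = sample(200), sample(200)
    shadow = nearest_centroid_model(train)
    for split, member in ((train, 1), (test, 0)):
        for x, y, c in split:
            attack_records.append((shadow(x, y) == c, member))

# A final attack model would now be trained on attack_records to predict
# membership in the target model's training set from model behavior.
print(len(attack_records))  # 1200
```

In the real attack, the feature fed to the attack model is the shadow model's full output probability vector rather than a single correct/incorrect bit.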


Unsupervised learning explained

Think about how human children learn. As a parent or teacher you don’t need to show young children every breed of dog and cat there is to teach them to recognize dogs and cats. They can learn from a few examples, without a lot of explanation, and generalize on their own. Oh, they might mistakenly call a Chihuahua “Kitty” the first time they see one, but you can correct that relatively quickly. Children intuitively lump groups of things they see into classes. One goal of unsupervised learning is essentially to allow computers to develop the same ability. As Alex Graves and Kelly Clancy of DeepMind put it in their blog post, “Unsupervised learning: the curious pupil,” ... Mixture models assume that the sub-populations of the observations correspond to some probability distribution, commonly Gaussian distributions for numeric observations or categorical distributions for non-numeric data. Each sub-population may have its own distribution parameters, for example mean and variance for Gaussian distributions. Expectation maximization (EM) is one of the most popular techniques used to determine the parameters of a mixture with a given number of components.
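The expectation maximization procedure mentioned above can be sketched in plain Python for a two-component 1-D Gaussian mixture. This is a minimal illustration; production implementations such as scikit-learn's GaussianMixture add smarter initialization and convergence checks.

```python
import math
import random

random.seed(1)

# Synthetic 1-D data drawn from two Gaussian sub-populations.
data = [random.gauss(-2.0, 0.5) for _ in range(150)] + \
       [random.gauss(3.0, 1.0) for _ in range(150)]

def pdf(x, mu, var):
    # Gaussian probability density.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Initial guesses for the mixture weights, means and variances.
w = [0.5, 0.5]
mu = [min(data), max(data)]
var = [1.0, 1.0]

for _ in range(50):
    # E-step: each component's "responsibility" for each data point.
    resp = []
    for x in data:
        p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: re-estimate parameters from the responsibilities.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        w[k] = nk / len(data)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk

print(sorted(round(m, 1) for m in mu))  # approximately [-2.0, 3.0]
```

After a few dozen iterations the estimated means land close to the true sub-population means of -2.0 and 3.0, without the data ever being labeled.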


Mobile-Only Bank Monzo Warns 480,000 Customers to Reset PINs

Monzo reports that PINs are supposed to be encrypted and stored in the bank's internal systems with limited access, but because a bug allowed the PINs to be stored in plaintext, more employees could have accessed them. The software bug has since been fixed, the company reports. "If we've contacted you to tell you that you've been affected, you should head to a cash machine to change your PIN to a new number as a precaution," according to the company's blog. So far, Monzo's investigation hasn't turned up any cases of fraud stemming from the unsecured PINs, and no one from outside the bank apparently accessed the data, according to the bank's statement. A spokesperson for the company did not immediately reply to Information Security Media Group's request for comment. The Guardian reports, however, that this security vulnerability has persisted for at least the last six months, and that the incident has been referred to the U.K. Information Commissioner's Office, which is Britain's watchdog agency for consumer privacy issues.


CIOs In Banking And Financial Firms Still Grappling With Cybersecurity

When it comes to cybersecurity awareness and practices, CIOs in the banking and financial services industry are at a much higher maturity curve than their peers. Despite their awareness and concerns about online threats, a new study found that banking organizations are struggling to manage cybersecurity risks, with many CIOs acknowledging that they are still not doing enough to protect their systems, networks and data. The Synopsys report, based on a survey of CIOs and IT security practitioners from global financial services organizations conducted by Ponemon Institute, found that more than half of these firms have experienced theft of sensitive customer data or system failure and downtime because of insecure software or technology. Besides, the study shows, banking and financial firms’ CIOs are struggling to manage cybersecurity risk in their supply chain and are failing to assess their software for security vulnerabilities before release. “While the financial services industry is relatively mature in terms of its software security posture, organizations are grappling with a rapidly evolving technology landscape and facing increasingly sophisticated adversaries,” says Drew Kilbourne.


Building Maintainable Software Systems

To keep the code clean and maintainable one can use clean architecture principles. There’s a whole book on that by Robert Martin and the acronym SOLID to go with it, but here I’m going to simplify it as separating what the system does from how it does it, and that the “what” is not dependent on the “how.” What the system does at its core is the domain and the use cases that surround it. How the system does it, relates to its infrastructure, presentation and configuration. ... A key point in determining which architecture, code organization, language, framework, etc. to use is the ability to justify your decisions. If you can’t justify the decision(s), then you are taking a chance that it will just work out. A better approach might be to first justify the decision to yourself, so that you can later justify it before others. A good way is to record those decisions, for example, by using Architecture Decision Records. Writing down your decision(s) helps you identify if it really makes sense, but it also benefits those coming after you to understand why the system is in its current state.
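The "what versus how" separation described above can be shown in a short Python sketch. The names (OrderRepository, PlaceOrder, InMemoryOrderRepository) are illustrative, not taken from the book: the domain use case depends only on an abstraction, while the storage detail is injected from outside.

```python
from abc import ABC, abstractmethod

# The "what": domain logic, written against an abstraction and never
# importing any infrastructure module.
class OrderRepository(ABC):
    @abstractmethod
    def save(self, order: dict) -> None: ...

class PlaceOrder:
    def __init__(self, repo: OrderRepository):
        self.repo = repo

    def execute(self, order: dict) -> dict:
        order = {**order, "status": "placed"}
        self.repo.save(order)
        return order

# The "how": an infrastructure detail. Swapping this for a SQL- or
# file-backed implementation leaves the domain code untouched.
class InMemoryOrderRepository(OrderRepository):
    def __init__(self):
        self.rows = []

    def save(self, order):
        self.rows.append(order)

repo = InMemoryOrderRepository()
result = PlaceOrder(repo).execute({"id": 7})
print(result["status"])  # placed
```

Because the dependency points from "how" to "what," the domain can be tested with a trivial in-memory implementation, and the real infrastructure can change without rippling into the core.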


Best mobile device security policy for loss or theft


The first step to develop a reasonable response procedure for a stolen or lost work phone is to acknowledge what's at stake. Today's business smartphones, and sometimes even tablets, store a huge amount of information and provide broad access, so IT must treat lost or stolen devices as serious threats. ... Another way IT can reduce the damage of a lost work phone is to ensure that users are on board with a mobile device security policy and established best practices. Users must know the exact steps to follow once a loss or theft occurs, such as how to report a lost device and how to help locate it. IT professionals may have listed or documented these steps in a manual, but they must communicate the process to users as well. Finally, IT must evaluate existing controls and processes for lost mobile devices. IT professionals can run tests for these policies on a one-off basis every year via a survey or in a one-on-one meeting.


Facial recognition… coming to a supermarket near you

As with all algorithmic assessment, there is reasonable concern about bias. No algorithm is better than its dataset, and – simply put – there are more pictures of white people on the internet than there are of black people. “We have less data on dark-skinned people,” says Pantic. “Large databases of Caucasian people, not so large on Chinese and Indian, desperately bad on people of African descent.” Davis says there is an additional problem, that darker skin reflects less light, providing less information for the algorithms to work with. For these two reasons algorithms are more likely to correctly identify white people than black people. “That’s problematic for stop and search,” says Davis. Silkie Carlo, the director of the not-for-profit civil liberties organisation Big Brother Watch, describes one situation where a 14-year-old black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”. That said, the Facewatch facial-recognition system is, at least on white men under the highly controlled conditions of their office, unnervingly good. Nick Fisher, Facewatch’s CEO, showed me a demo version; he walked through a door and a wall-mounted camera in front of him took a photo of his face.



Quote for the day:


"Leaders make decisions that create the future they desire." -- Mike Murdock


Daily Tech Digest - August 06, 2019

Evolution of the internet: Celebrating 50 years since Arpanet

Daily traffic on the Internet surpassed 3 million packets in 1974. First measured in terabytes and petabytes, monthly traffic volume is now measured in exabytes; one exabyte is 10^18 bytes. In 2017, the annual run rate for global IP traffic was 122 exabytes per month, or 1.5 zettabytes per year, according to Cisco’s Visual Networking Index. Annual global IP traffic will reach 396 exabytes per month, or 4.8 zettabytes per year, by 2022, Cisco predicts. As traffic volume has grown, so too has the number of devices connected to the internet. Today, the number of devices connected to IP networks is approaching 20 billion. By 2022, there will be 28.5 billion networked devices, up from 18 billion in 2017, Cisco predicts. That’s more than the number of people in the world. Overall, Cisco predicts there will be 3.6 networked devices per person by 2022, up from 2.4 in 2017. Today, smartphone traffic continues to grow and is poised to exceed PC traffic in the coming years. In 2018, PCs accounted for 41% of total IP traffic, but by 2022 PCs will account for only 19% of IP traffic, according to Cisco’s data. Smartphones will account for 44% of total IP traffic by 2022, up from 18% in 2017.
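The monthly and annual figures quoted above are consistent with each other, taking 1 zettabyte = 1,000 exabytes; a quick arithmetic cross-check:

```python
# Cross-checking the Cisco figures quoted above.
eb_per_zb = 1000  # 1 zettabyte = 1,000 exabytes (10^21 vs 10^18 bytes)

for eb_per_month, claimed_zb_per_year in ((122, 1.5), (396, 4.8)):
    zb_per_year = eb_per_month * 12 / eb_per_zb
    print(f"{eb_per_month} EB/month = {zb_per_year:.2f} ZB/year "
          f"(quoted as {claimed_zb_per_year})")
```

Both quoted annual totals round to the computed values (122 EB/month is 1.46 ZB/year, 396 EB/month is 4.75 ZB/year).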


Why Every Developer Should Know a Bit of Technical Writing

First, technical writing can help you communicate more easily with your teammates. If you’re collaborating with other software developers on a regular basis, you know the importance of exchanging ideas, ensuring you’re working for the same high-level goals, and solving problems together. Technical writing abilities help you formally structure these bits of communication so your coworkers can better understand them; with an efficiently written message, you can avoid most misconceptions and ultimately work faster. You can also use your technical writing abilities to communicate with out-groups more efficiently, especially if those groups have limited technical knowledge. Rather than using terms unique to the development field, or describing code directly, you’ll have to find high-level ways to describe the challenges you’re facing, or use metaphors so that other people can grasp what you’re saying. Either way, you’ll be more valuable in client meetings, and you’ll be able to talk to account managers and team leaders in other departments in a way that makes sense to them, while still conveying what you need to convey.


Are developers honestly happy working 60-hour weeks?


The annual Stack Overflow survey is one of the most comprehensive snapshots of how programmers work, with this year's poll being taken by almost 90,000 developers across the globe. Commenting on the data, Robert Pozen, senior lecturer for technological innovation, entrepreneurship, and strategic management at MIT Sloan School of Management, said although many "white-collar professionals" are content to work for longer than the standard 40-hour week, working hours can only be extended so far before it will negatively affect them. "Many professionals are quite happy working 40 to 55 hours per week," he says. "But if professionals work for 70 to 80 hours per week on a regular basis, their productivity will gradually deteriorate on average. They will lose focus, and the long work hours will undermine the rest of their lives. "Of course, professionals can have fruitful work spurts on projects they like or think are important. But that is the exception, rather than the rule." For developers, that fall in productivity is often mapped to an increase in poor quality and buggy code that will need to be fixed at some point, actually costing companies more in the long run.


What Millennials Think Of Boomers & Vice Versa

As with many misunderstandings at work, generational or otherwise, it’s always a good idea to take a step back and look for the upsides. Downsides are easy to find. (It’s why there are so many misunderstandings!) So the next time you find yourself looking across the generational divide with misgivings, here are some upsides to keep in mind about all the generations. Millennials owe a debt of gratitude to Gen X-ers for bringing a new generational identity to the workplace, one in which self-sufficiency and resourcefulness are highly valued, along with minimal management and maximum independence. This, combined with a bit of Gen X cynicism, paved the way for the Millennial perspective. Other Millennial advantages come from the time in history in which they grew up. For example, I’ve been surprised repeatedly by the exposure to other cultures that young people in this generation have had — high school students who spend a summer studying in South Korea, college students who opt for a gap year in Hungary, or who head to Ghana to work construction.


Evolution in action: How datacentre hardware is moving to meet tomorrow’s tech challenges


A demonstration system used separate memory and compute “bricks” (plus accelerator bricks based on GPUs or FPGAs) interconnected by a switch matrix. Another example was HPE’s experimental The Machine. This was built from compute nodes containing a CPU and memory, but instead of being connected directly together, the CPU and memory were connected through a switch chip that also linked to other nodes via a memory fabric. That memory fabric was intended to be Gen-Z, a high-speed interconnect using silicon photonics being developed by a consortium including HPE. But this has yet to be used in any shipping products, and the lack of involvement by Intel casts doubts over whether it will ever feature in mainstream servers. Meanwhile, existing interconnect technology is being pushed faster. Looking at the high performance computing (HPC) world, we can see that the most powerful systems are converging on interconnects based on one of two technologies: InfiniBand or Ethernet.


Developers Are More Remote-Based, Company Connected & Burnt Out


Remote work is the new normal for developers. It's not only something they prefer, but something they increasingly demand from employers. Eighty-six percent of respondents currently work remotely in some capacity, with nearly one-third working from home full time. Forty-three percent say the ability to work remotely is a must-have when considering an offer with a company. The traditional narrative of remote workers as isolated and disengaged from their companies is proving false for many. Seventy-one percent of developers who work remotely said they feel connected to their company’s community. But the issue hasn’t disappeared entirely. The twenty-nine percent who don’t feel connected say they feel excluded from offline team conversations or don’t feel integrated into their company’s culture when working remotely. The burnout problem is real. Two-thirds of all respondents said their stress levels have caused them to feel burnt out or work fatigued, regardless of whether or not they work remotely. Developers expect remote work to improve work-life balance. But the reality doesn’t always line up with that hope.


Think beyond tick-box compliance


According to Holt, compliance, alongside the need to recognise and leverage the business value of data, are data control challenges. In her experience, viewing them in this way makes the alignment of business and compliance objectives much less of a problem. “Organisations can begin to identify existing use cases and processes that depend on this control, and form interdisciplinary teams involving stakeholders from both compliance and other business roles to collaborate on shared outcomes and objectives. From this comes shared processes and workflows, shared technology, and – to some extent – shared budgets. By intertwining compliance goals within the broader enterprise initiative for data control and value realisation, there’s the potential for compliance to cease being a cost centre over time,” says Holt. “Benefits, such as improved customer relations and consumer trust, provide ‘softer’ returns that are often difficult to quantitatively measure over a short-term period, but can be significant and should not be neglected in calculations,” she adds.


The Phantom Menace in Unit Testing

Let me state up front that this is not a rant about unit testing; unit tests are critically important elements of a robust and healthy software implementation. Instead, it is a cautionary tale about a small class of unit tests that may deceive you by seeming to provide test coverage but failing to do so. I call this class of unit tests phantom tests because they return what are, in fact, correct results but not necessarily because the system-under-test (SUT) is doing the right thing or, indeed, doing anything. In these cases, the SUT “naturally” returns the expected value, so doing (a) the correct thing, (b) something unrelated, or even (c) nothing, would still yield a passing test. If the SUT is doing (b) or (c), then it follows that the test is adding no value. Moreover, I submit that the presence of such tests is often deleterious, making you worse off than not having them because you think you have coverage when you do not. When you then go to make a change to the SUT supposedly covered by that test, and the test still passes, you might blissfully conclude that your change did not introduce any bugs to the code, so you go on your merry way to your next task.


Evaluate the COBIT framework 2019 update


ISACA updated every part of the COBIT framework for 2019. The changes and additions to COBIT 2019 are encapsulated within the COBIT document suite, which is available to ISACA members for free. The principal changes include a new publication within the core framework, several new objectives, security practices updates and updated references to other standards, guidelines and regulations. Four core publications express the COBIT framework. The introduction and methodology publication provides definitions, explains management objectives and lays out the COBIT framework's structure. The governance and management objectives publication details the COBIT model and all constituent governance and management objectives, each associated with a specific process. A design publication, which is new in COBIT 2019, offers practical and prescriptive guidance that enables adopters to put COBIT into practice within the specific needs of their organizations.


Lessons Learned From A Year Of Testing The Web Platform

Certain kinds of failures had side-effects that we didn’t anticipate. Even though our fancy automatic recovery mechanisms kicked in, the workers were doomed to fail all subsequent attempts. That’s because the unexpected side-effects persisted across independent work orders. The most common explanation will be familiar to desktop computer users: the machines ran out of disk space. From overflowing logs and temporary web browser profiles, to outdated operating system files and discarded test results, the machines had a way of accumulating useless cruft. It wasn’t just storage, though. Sometimes, the file system persisted faulty state. This entire class of problem can be addressed by avoiding state. This is a core tenet in many of today’s popular web application deployment strategies. The “immutable infrastructure” pattern achieves this by operating in terms of machine images and recovering from failure by replacing broken deployments with brand new ones. The “serverless” pattern does away with the concept of persistence altogether, which can make sense if the task is small enough.



Quote for the day:


"If you want extraordinary results, you must put in extraordinary efforts." -- Cory Booker