Daily Tech Digest - September 14, 2020

The 5 Biggest Technology Trends In 2021 Everyone Must Get Ready For Now

In recent years we have seen the emergence of robots in the care and assisted living sectors, and these will become increasingly important, particularly when it comes to interacting with members of society who are most vulnerable to infection, such as the elderly. Rather than entirely replacing the human interaction with caregivers that is so important to many, we can expect robotic devices to be used to provide new channels of communication, such as access to 24/7 in-home help, as well as to simply provide companionship at times when it may not be safe to be sending nursing staff into homes. Additionally, companies finding themselves with premises that, while empty, still require maintenance and upkeep, will turn to robotics providers for services such as cleaning and security. This activity has already led to soaring stock prices for enterprises involved in supplying robots. Drones will be used to deliver vital medicine and, equipped with computer vision algorithms, used to monitor footfall in public areas in order to identify places where there is an increased risk of viral transmission.


New BlindSide attack uses speculative execution to bypass ASLR

Speculative execution is a performance-boosting feature of modern processors. During speculative execution, a CPU runs operations in advance and in parallel with the main computational thread. When the main CPU thread reaches certain points, speculative execution allows it to pick an already-computed value and move on to the next task, a process that results in faster computational operations. All the values computed during speculative execution are discarded, with no impact on the operating system. Academics say that this very same process that can greatly speed up CPUs can also "[amplify] the severity of common software vulnerabilities such as memory corruption errors by introducing speculative probing." Effectively, BlindSide takes a vulnerability in a software app and exploits it over and over in the speculative execution domain, repeatedly probing memory until the attacker bypasses ASLR. Since the attack takes place inside the realm of speculative execution, failed probes and crashes don't impact the CPU or its stability: they are suppressed and then discarded.


Five things businesses need to think about when implementing AI

The phrase “first time right” rarely applies to implementing AI, and that is especially true in making predictions and forecasts. Achieving an acceptable level of accuracy could take a number of iterations and continuous course corrections. Failure, then, must happen fast in order to learn what to correct. Since the stakes are high and there is always a risk of failure, it is also important to start with a smaller problem or a subsection of a large problem. This helps to reduce the risk associated with the cost of failure. There is no shame in dropping an idea and rethinking the approach. In fact, that willingness to rethink is vital. If the viability of a solution is in doubt, persisting with it – and, by so doing, wasting time and money – is never the right way to go. It is always advisable to change course or, in some cases, drop the idea altogether and pick a new one. Once a smaller problem is resolved and the business can see its value – and associated ROI – the solution can be scaled up to solve a bigger problem. ... IT and AI projects are inherently different. IT projects proceed with a clear idea and a set target for the desired output from day one. AI, by contrast, is mostly used in the quest to understand the unknown. It is therefore impossible to know what the output will be ahead of time.


Edge computing: The next generation of innovation

Edge computing may be relatively new on the scene, but it’s already having a transformational impact. In “4 essential edge-computing use cases,” Network World’s Ann Bednarz unpacks four examples that highlight the immediate, practical benefits of edge computing, beginning with an activity about as old-school as it gets: freight train inspection. Automation via digital cameras and onsite image processing not only vastly reduces inspection time and cost, but also helps improve safety by enabling problems to be identified faster. Bednarz goes on to pinpoint edge computing benefits in the hotel, retail, and mining industries. CIO contributing editor Stacy Collett trains her sights on the gulf between IT and those in OT (operational technology) who concern themselves with core, industry-specific systems – and how best to bridge that gap. Her article “Edge computing’s epic turf war” illustrates that improving communication between IT and OT, and in some cases forming hybrid IT/OT groups, can eliminate redundancies and spark creative new initiatives. One frequent objection on the OT side of the house is that IoT and edge computing expose industrial systems to unprecedented risk of malicious attack. 


5 SMART goals for a QA analyst

Software testers need a basic knowledge of these programming language staples for continued career growth. Successful execution of manual tests and automated scripts is helpful, but testing activities only go so far. It's even more important to know the conditions under which data enters one of these programming structures, and what must happen for that data to exit it. Let's start with if-then-else logic. In this structure, the if tests whether a condition holds. If it does, the then branch executes; otherwise, the else branch executes (or nothing happens). The if-then-else structure works well when a condition is simply true or false. A case structure might be appropriate when a condition falls into one slot in a range of possibilities. A case structure expands on if-then-else by providing multiple functions to execute when certain conditions exist. For example, an if-then-else structure might check whether a number falls in the range 2-10 and, if it does, multiply the number by five. If the number is not in that range, it falls into the else condition and is not multiplied at all. A case structure specifies what to do when a number falls into one of many ranges. In this example, when a number is between 2 and 10, it falls into Case A and is multiplied by five.
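The range example above can be sketched in Python (the language is my choice; the article names none). Python has no dedicated case statement before 3.10, so an if/elif chain stands in for the case structure; the 11-20 range for "Case B" is an illustrative assumption, not from the article.

```python
def scale_if_in_range(n):
    # if-then-else: one condition, two outcomes
    if 2 <= n <= 10:        # the "if" condition
        return n * 5        # the "then" branch: multiply by five
    else:
        return n            # the "else" branch: leave the number unchanged

def scale_by_case(n):
    # case-style structure: each range maps to its own action;
    # each elif plays the role of one "case"
    if 2 <= n <= 10:        # Case A: multiply by five
        return n * 5
    elif 11 <= n <= 20:     # Case B (illustrative range, not from the article)
        return n * 10
    else:                   # default case
        return n
```

For example, `scale_if_in_range(4)` returns 20, while `scale_by_case(15)` falls into Case B and returns 150.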


Q&A on the Book The Art of Leadership

Given that leadership is a career restart, there are daily mistakes. The one I see the most with engineering leaders (but I suspect it applies to all leaders) is the tendency to regress when the stakes are highest. It’s when a new manager thinks he or she is helping during crunch time by finishing the feature, fixing bugs, or otherwise regressing to their prior role. Let’s catalog the reasons they aren’t helping: they’ve put the team in a situation where it appears unable to complete the necessary work (bad planning); they’re doing the work their team should do, sending the unintentional signal that they don’t believe the team can do the work (bad signal); and they’re not giving the team the chance to rise to the occasion, to figure out a creative means to complete the work. That chance might be impossible because of bad planning, but assume it’s not. What does the team think when the leader keeps saving the day by fixing bugs? It’s a safety net, sure, but it’s a net that isn’t allowing others to grow. Leaders often rationalize this behavior as “I want to remain technical.” I want engineering leaders to be deeply technical, too. 


What A Remote Workforce Means For Innovation

Though managers might periodically have doubts about remote workers’ productivity, it’s vital for companies to create a culture of trusting their employees to complete their work and continue innovating even when they’re off-site, Strawmyer added. As personnel continue to work from home, companies might need to reevaluate how they assess productivity. After all, spending a lot of time in the office isn’t an effective productivity indicator. Employees in the office environment are just as, if not more, capable of wasting time, Strawmyer said. In the beginning of the pandemic, workers were focused on getting used to their remote workflow. But now that they are adjusting to the current working conditions, Bang anticipates that more innovation will follow. Depending on their unique situation, removing a lengthy commute or wrapping up other projects has freed some employees up to focus on long-term ideas, he added. However, while working from home does free up time, there’s a big difference between transitioning to remote work under normal circumstances and shifting to remote work during a pandemic, Strawmyer acknowledged. 


5 Best Techniques for Automated Testing

To obtain better results from automated testing, testing must start early and run as frequently as required. The sooner the QA team gets engaged in the project life cycle, the better the testing and the more glitches and anomalies you find. Automated unit tests can be executed from day one, and the tester can then progressively build up the automated test suite. Moreover, bugs detected in an early phase cost less to fix than those identified later in deployment or production. Hence, with the shift-left movement, proficient testers and software developers are now both empowered to create and run tests. The leading automated testing tools let users carry out functional user interface tests for desktop and web apps easily from their preferred IDEs.  ... Automation can't replace manual testers. Automated tests simply mean executing some tests more frequently. The expert tester should start small, attempting the smoke tests first and then covering build acceptance testing. After that they can move on to frequently repeated tests, and then to the time-consuming ones. Beyond this, the QA tester has to ensure that every test they automate saves time for a manual tester to concentrate on other vital things.


Is low-code/ no-code the dark horse in enterprise technology for a Covid-19 afflicted world?

The adoption of low-code by enterprises is still in its infancy on account of a host of issues, ranging from technology complexity and vendor lock-in to a basic lack of understanding of how low-code functions. “Scaling automation is a typical problem, and many companies could have invested in licenses not knowing how to leverage their potential. Secondly, enterprise automation can seem very complex if you’re not using the right tools and technologies,” Persistent’s Dixit said. “To ensure the success of any project, execution requires the right mix of domain, tech and skilled consultants,” he added. LeapLearner’s Ranjan sees a lack of integration using APIs as a hindrance. “This limits the ability to create applications and systems which can solve complex problems and are AI-enabled. Hopefully, as the low-code/no-code ecosystem grows, this will change,” he said. But some low-code platform startups, such as Bengaluru-based Mate Labs, are working towards creating a low-code environment that can address these challenges.


How to approach Agile team organization

Forming a productive Agile team requires a significant commitment of individual energy, time and concentration from each member. When Agile teams first come together, they go through what psychologist Bruce Tuckman termed the stages of forming, storming, norming and performing. At first, everyone must figure out the team dynamics. After a team forms, it establishes hierarchies of decision-making and leadership. Everyone finds their place and function within the team. This process takes time, as team members find their niche within the group's overall norm. In the norming phase, the team functions smoothly, with less arguing or trying to rise above the others. The group becomes a team rather than a collection of co-workers. When a team stabilizes, it's a sign that members have learned how to optimize their skills within working relationships. The most successful Agile teams develop bonds that allow them to be productive and interact on a personal, genuine level. When project managers switch out team members, team development starts all over again: the disrupted team must go through the forming, storming, norming and performing process from scratch.



Quote for the day:

"Leadership is an opportunity to serve. It is not a trumpet call to self-importance." -- J. Donald Walters

Daily Tech Digest - September 13, 2020

The Importance Of Predictive AI In Cybersecurity

The use of AI systems in the realm of cybersecurity can have three kinds of impact, as the work repeatedly states: AI can expand cyber threats (quantity); change the typical character of these threats (quality); and introduce new and unknown threats (quantity and quality). AI could expand the set of actors capable of performing malicious cyber activities, the speed at which those actors can carry them out, and the set of plausible targets. Fundamentally, AI-fueled cyber attacks could also appear in more powerful, finely targeted and advanced operations, thanks to the effectiveness, scalability and adaptability of these solutions: potential targets become more easily identifiable and controllable. Combining defensive techniques with cyber threat detection, AI will move towards predictive techniques underpinning Intrusion Detection Systems (IDS) aimed at recognizing illegal activity within a computer or network, or at catching spam and phishing alongside two-factor authentication systems. The defensive strategic use of AI will also soon focus on automated vulnerability testing, also known as fuzzing.
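As a loose illustration of the predictive, anomaly-based detection the article points toward, here is a toy sketch (my own construction, not from the cited work) that flags time windows whose event counts deviate sharply from a historical baseline:

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=3.0):
    """Flag indices of time windows whose event count deviates from the
    baseline by more than `threshold` standard deviations (a toy
    anomaly-based intrusion detection rule; real IDS models are far richer)."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma > 0 and abs(c - mu) / sigma > threshold]

# 20 quiet windows of ~100 events, then one suspicious burst of 1000
suspicious = flag_anomalies([100] * 20 + [1000])
```

Here only the final window is flagged; the z-score of the burst is well above 3 while the quiet windows sit near the mean.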


Top 9 ways RPA and analytics work together

Increasingly, organizations are using robotic process automation in analytics tasks from assembling data spread across the company to analyzing how business processes work and how they can improve. "RPA is helping streamline the processes that create valuable insights, changing what areas analytics are measuring and helping to find new domains of time-consuming tasks to focus on," said Michael Shepherd, engineer at Dell Technologies Services. When it comes to RPA and analytics, the automation tool should be a complement to, rather than a replacement for, an integrated data platform across the company, said Jonathan Hassell, content director for data and AI at O'Reilly Media. When companies lack an integrated data platform, it makes analytics more difficult overall. "RPA can help in some ways, but the potential for RPA to unlock insights in data and further output and processes requires a good data platform with great health and hygiene," he said. Hassell recommended organizations look at three key ways RPA can change analytics. First, it helps create better data from the outset. Second, the organization can deploy it in the context of machine learning to sift through large quantities of data and identify useful information for humans to look at.


Rethinking AI Ethics - Asimov has a lot to answer for

AI per se is neither ethical nor unethical; what matters is how practitioners apply it. But if an AI model introduced into the market routinely violates ethical norms, what is it? A neutrally ethical artifact produced by unethical people? Think about COMPAS, the judicial system that routinely made bail and sentencing recommendations two to three times more severe for African-Americans. That is clearly an unethical AI application. Instead of teaching people about the ethics of Plato, Aristotle and Kant and trying to draw a line from that to building AI applications, a better approach is to start by identifying developers who are simply good people. People who show forbearance, who have the backbone to resist their organization's directive to deliver the wrong things. People who have prudence can look beyond the results of a model and project how it will affect the world. Only good people should develop AI. A quick search turns up over a hundred AI ethics proclamations from governments, non-profit special interest groups, government committees and the like, and it's consuming too much energy and bandwidth. In fact, the cynical view is that all of this is just a way to keep authorities from issuing rules for AI.


Designing for Privacy — An Emerging Software Pattern

The first thing to do is to separate, and consider differently, the user data that your system needs to “know” from the user data that the system can collect and treat without actually having access to it. We call those two categories Known and Unknown User Data. There is no magic recipe for this separation, as it depends on your system’s functionality. There are systems where all user data can be considered Unknown, where the system has no knowledge of the user to whom it delivers its functionality. Yet most systems need to identify the user in order to charge them for the service delivered. A ride-sharing platform might want to consider the identity of riders and drivers “known” but the destination of the ride and any messages exchanged between drivers and riders “unknown”. A hotel booking platform might perform the split in such a way as to connect users with hotels and collect a fee from hotels, while ignoring the booking dates that reveal the user’s whereabouts. Once you decide which user data to treat as “known” and which as “unknown”, you can adopt a new flavor of client-server architecture: one in which you treat the “known” data as you normally would, but keep the “unknown” user data at the user’s endpoint.
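The known/unknown split can be sketched as follows. The field names mirror the article's ride-sharing example but are assumptions for illustration, not an API from the text; a real system would additionally encrypt the "unknown" data on the device.

```python
# Which fields the server may "know" vs. which stay on the user's endpoint.
KNOWN_FIELDS = {"user_id", "email"}            # needed for identity and billing
UNKNOWN_FIELDS = {"destination", "messages"}   # never sent to the server in the clear

def split_record(record):
    """Split a user record into server-side (known) and
    endpoint-only (unknown) portions."""
    known = {k: v for k, v in record.items() if k in KNOWN_FIELDS}
    unknown = {k: v for k, v in record.items() if k in UNKNOWN_FIELDS}
    return known, unknown

record = {"user_id": 42, "email": "a@b.c",
          "destination": "airport", "messages": ["hi"]}
server_side, client_side = split_record(record)
```

After the split, the server-side dictionary contains only identity and billing fields, while the ride details remain on the client.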


Blockchain may break EU privacy law—and it could get messy

This technology’s transparency and immutability, meaning it cannot be edited later, is one of its biggest selling points. It’s also why massive enterprises have been drawn to using it. But if someone wanted to invoke their right to be forgotten—and order entries about themselves to be erased from the blockchain—networks may be duty bound to comply under court order. Here’s the kicker: in some cases, it may be near impossible to obey these orders because of the sheer levels of computing power required to edit a blockchain. And in others, the decentralized nature of some networks may mean it’s impossible to pin down someone who can be held responsible for fulfilling the court order. As part of her research, Dr Wahlstrom looked at a variation of blockchain technology that is known as Holochain. She concluded that it could be more compatible with the “right to be forgotten” legislation because of how its distributed database breaks the blockchain up—meaning it is easier for a smaller node to prevent contested data from being reshared. “This allows individuals to verify data without disclosing all its details or permanently storing it in the cloud,” she added.
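A minimal hash chain (a sketch of the general idea, not any production blockchain) shows why erasing one entry is so costly: every later block's hash depends on it, so honouring a deletion order means recomputing, and getting the network to accept, the rest of the chain.

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def block_hash(prev_hash, data):
    # each block's hash covers its data AND the previous block's hash,
    # chaining every block to all of its predecessors
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    chain, prev = [], GENESIS
    for data in entries:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["entry about Alice", "entry about Bob"])
# "Erasing" Alice's entry breaks verification of every subsequent block:
chain[0]["data"] = "erased"
```

After the tampering, `verify(chain)` returns False, while a freshly built chain verifies cleanly.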


RBI seeks exemption from data protection law

The banking regulator has also gone a step further and suggested that instead of the Central government, “sectoral regulators be given the power to classify personal data as critical.” Any critical data, according to the proposed act, can be processed only in India. Objecting to the classification of financial data as sensitive personal data, RBI’s note maintained that this would lead to higher compliance burdens and explicit consent requirements, which “would translate to increase in costs for providing services to customers. Financial inclusion efforts rely on lower service charges for offering basic banking services. The increase in costs would compel banks to increase the charges associated with offering banking services.” RBI’s note also pointed out that countries such as the UK, France, Germany and Italy do not make such a classification. Privacy experts said the RBI cannot legally claim an exemption from the obligations that stem from the 2017 Supreme Court ruling known as the Puttaswamy judgment, which upheld privacy as a fundamental right for Indian citizens. “What the bill does is flesh out that right in terms of the actual actions that need to be done. 


Digital transformation takes more than the wave of a wand

Championing digital transformation requires more than a magic wand or “plug and play” approach. Even the best technology is virtually worthless if everyone isn’t able, available and on board to use it. Without the proper talent, employee training and company-wide desire to evolve in place, people will inevitably revert to their old ways, using antiquated and siloed tools like Excel. Plus, the systems you already have in place have to keep running smoothly as you roll out digitization plans. Digital transformation does not happen with the flip of a switch. It requires ongoing strategic efforts to create a balance among new technologies, strategic solutions and traditional systems. This is why the aforementioned culture shift is essential before starting your transformation journey. As Raconteur author Ben Rossi says in the “Digital innovation and the supply chain” report, “A top-down mandate from board level to drive supply chain transformation is critical to getting the rest of the company to collaborate and change their mindset. Without that, heads of supply chain will run into resistance to change, in turn reducing the chance of achieving the broader transformation goals.” 


Genetic Algorithms: a developer perspective

There are various interesting theories regarding convergence in evolutionary algorithms, but these are of no concern to us here. Our interest is in understanding how these algorithms may be used to solve Artificial Intelligence problems, rather than in understanding why they actually work. One important class of evolutionary algorithms used in practical applications is genetic algorithms: these stress the importance of the data representation used to encode possible solutions to our optimization problem. The class name is inspired by an analogy with genetic code – the material that encodes our ‘phenotype’ or physical appearance. The use of the adjective ‘genetic’ reflects the fact that evolving solutions are represented by data structures, usually strings, reminiscent of biological genetic code. ... The goal of a genetic algorithm is to discover a phenotype that maximises fitness, by allowing a certain population to evolve across several generations. The next question is: how does the evolution of individuals happen? Genetic algorithms apply a set of ‘genetic operations’ to chromosomes of each generation to allow them to reproduce and, in the process, introduce random mutations, much as occurs in most living beings.
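The loop described above can be sketched as a minimal genetic algorithm for the toy "OneMax" problem (maximise the number of 1-bits in a chromosome). The operators and parameters below are illustrative choices, not prescriptions from the article.

```python
import random

def fitness(chrom):
    # phenotype score: number of 1-bits (the "OneMax" toy problem)
    return sum(chrom)

def mutate(chrom, rate=0.05):
    # flip each gene with a small probability (random mutation)
    return [1 - g if random.random() < rate else g for g in chrom]

def crossover(a, b):
    # single-point crossover: child takes a prefix from one parent,
    # suffix from the other
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(pop_size=30, length=20, generations=50, seed=0):
    random.seed(seed)  # seeded for reproducibility of this sketch
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]   # selection: keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children         # elitism: parents survive unchanged
    return max(pop, key=fitness)

best = evolve()
```

With elitism, the best fitness never decreases from one generation to the next, so after 50 generations the best chromosome is far fitter than a random 20-bit string (expected fitness 10).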


The Story of Data — Privacy By Design

Every byte of data has a story to tell. The question is whether the story is being narrated accurately and securely. Usually, we focus sharply on the trends around data with a goal of revenue acceleration but commonly forget about the vulnerabilities caused by bad data management. Data possesses immense power, but immense power comes with increased responsibility. In today’s world, collecting data, analyzing it and building prediction models is simply not enough. I keep reminding my students that we are in a generation where the requirements for data security have perhaps surpassed the need for data correctness. Hence the need for Privacy By Design is greater than ever. ... Until recently, businesses have focused on looking at data over long stretches of time, made possible by Big Data. With the advent of the Internet of Things (IoT), analyzing real-time data has gained immense importance. It is very common these days to have devices in our homes that collect personal data and transmit it to external locations for either monitoring or analytical purposes. In many cases the poor consumer is finding it difficult to balance the benefits gained from surrendering their personal data against the risks involved in providing it.


Enterprise Data Literacy: Understanding Data Management

Sandwell believes that the Data Literacy problem stems from specialized information needs and a lack of shared context. He remarked: “Data Literacy affects all organizational levels. Everyone uses data for different reasons, including senior managers and the Chief Data Officer (CDO). The CDO tends to come from the business side and takes that perspective. However, he or she may have a steep learning curve about making technical infrastructure ready to serve and deliver.” On the technical side, workers have a good data inventory; however, they have less of an understanding of what the data contents mean to the business. Meanwhile, the more data-literate data scientists and business analysts put business and technical information together faster, with more direct data querying and manipulation. So, across the enterprise, everyone has a different Data Literacy perspective and talks at cross purposes to one another. Add to the situation the varying data maturity levels across departments and enterprises. Some ask about “the data on-hand, where to access it, and how it gets used and by whom.” Others have figured out these basics and have different questions about how to do Metadata Management and create a data catalog of all the data sets. Since everyone has different data requirements at different times, getting to a uniform Enterprise Data Literacy remains elusive.



Quote for the day:

"Leading people is like cooking. Don't stir too much; it annoys the ingredients and spoils the food." -- Rick Julian

Daily Tech Digest - September 12, 2020

Women in Fintech: How Open Banking Can Help Address Data Bias

A disturbing recent example is the story of Jamie Heinemeier Hansson, who was offered a credit limit on her Apple Card 20 times lower than her husband David's. This was despite her having a better credit score, as well as the couple filing a joint tax return and having an equal share in their property. The Apple Card incident highlighted that computers are not impartial. Artificial intelligence may well be able to digest vast amounts of information and identify patterns far beyond the capability of humans, but the historical data from which such systems “learn” in order to draw conclusions can be biased, even if unintentionally. So a system can make a discriminatory decision about a woman’s credit rating due to inherent bias in its training (for example, because women were historically less likely to be granted credit, the algorithm continues that pattern) despite never having asked her gender. However, many believe that while technology can perpetuate these biases, it could also be used to address them, particularly in the open banking era. “I genuinely believe technology can level the playing field fundamentally,” says Sam Seaton, CEO of Moneyhub. 
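How a model can reproduce historical bias without ever seeing gender can be illustrated with a toy nearest-neighbour rule. The data and the proxy feature (years of continuous employment, which can correlate with gender via career gaps) are invented for illustration; no real credit model works on six rows.

```python
# Biased historical decisions: applicants with employment gaps were
# denied regardless of credit score. Gender never appears as a feature.
history = [
    # (credit_score, continuous_employment_years, approved)
    (700, 10, True), (680, 9, True), (720, 12, True),
    (710, 3, False), (730, 2, False), (690, 4, False),
]

def learned_rule(score, employment_years):
    """Approve if a majority of the 3 most similar historical cases
    were approved (a toy k-nearest-neighbours 'model')."""
    nearest = sorted(history,
                     key=lambda h: abs(h[0] - score)
                                   + 10 * abs(h[1] - employment_years))[:3]
    return sum(h[2] for h in nearest) >= 2
```

An applicant with a strong score but a career gap, `learned_rule(740, 3)`, is denied, while `learned_rule(700, 10)` is approved with a weaker score: the rule has learned the historical bias through the proxy feature alone.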


Simplify agile, devops, and ITSM with Jira automations

Jira automations work like other IFTTT algorithms, except they have access to all the underlying data and workflows within Jira Software. A Jira automation trigger can be one of several types, including Jira issue types, sprints, and versions. You can design automations for when team members add or modify Jira issues, when scrum masters start or complete sprints, or when team leads create, update, or release versions. These triggers are highly useful for scrum masters, product owners, and technical leads who want to simplify the work needed to keep Jira updated with high-quality data. Jira automation also supports triggers tied to devops events such as pull requests, builds, branches, commits, and deployments. These events connect with Bitbucket, GitLab, and GitHub and update Jira issue or version status based on developer activities performed in version control. More advanced triggers can run on a defined schedule or respond to webhooks. Teams using these two triggers can get very creative with integrating Jira workflows with other tools or automating administrative tasks on a schedule. Once you configure the trigger, you have the option to add more filtering conditions or to branch the flow and support different sets of actions.


How trusted data is driving resilience and transformation beyond Covid-19

Over the next three to five years, most business workflows will be disrupted by the application of data and artificial intelligence (AI). Efficiency will be prioritised because it underpins business survival. If we take power and utilities as an example, we can expect disruption of the billing workflow, call centres, customer onboarding, customer service, and distribution. Document intelligence will also be used to glean insights from large volumes of information. Ultimately, data and AI will reinvent the entire end-to-end value chains of industries. Companies that recognise the strategic value of data will be the leaders in digital transformation, giving them a competitive position in the market. ... The pandemic has highlighted the value of data since having and sharing information on individuals will be key to defeating the virus. So, in the evolving normal, we can expect more data-sharing platforms – platforms that allow the public sector to share information with the private sector and platforms that allow different companies within the private sector to share information with each other. Boundaries between sectors will blur over time and regulation will adapt to accommodate data sharing.


Bluetooth Bug Opens Devices to Man-in-the-Middle Attacks

The Bluetooth SIG is recommending that potentially vulnerable Bluetooth implementations introduce the restrictions on CTKD that have been mandated in Bluetooth Core Specification versions 5.1 and later. These restrictions prevent the overwrite of an authenticated key or a key of a given length with an unauthenticated key or a key of reduced length. “The Bluetooth SIG is also broadly communicating details on this vulnerability and its remedies to our member companies and is encouraging them to rapidly integrate any necessary patches,” according to Bluetooth. “As always, Bluetooth users should ensure they have installed the latest recommended updates from device and operating system manufacturers.” Several Bluetooth-based attacks have cropped up over the past year. In May, academic researchers uncovered security vulnerabilities in Bluetooth Classic that could have allowed attackers to spoof paired devices and capture sensitive data. In February, meanwhile, a critical vulnerability in the Bluetooth implementation on Android devices was discovered that could allow attackers to launch remote code-execution (RCE) attacks – without any user interaction.


Australia’s very small step to make the Internet of Things safer

Security flaws in IoT devices are common. Hackers can exploit those vulnerabilities to take control of devices, steal or change data, and spy on us. In recognition of these risks, the Australian government has introduced a new code of practice to encourage manufacturers to make IoT devices more secure. The code provides guidance on secure passwords, the need for security patches, the protection and deletion of consumers’ personal data and the reporting of vulnerabilities, among other things. The problem is the code is voluntary. Experiences elsewhere, such as the United Kingdom, suggest a voluntary code will be insufficient to deliver the protections consumers need. ... A better option would have been a “co-regulatory” approach. Co-regulation mixes aspects of industry self-regulation with both government regulation and strong community input. It includes laws that create incentives for compliance (and disincentives against non-compliance) and regulatory oversight by an independent (and well-resourced) watchdog. The Australian government has, at least, described its new code of practice as “a first step” to improving the security of IoT devices.


Four ways network traffic analysis benefits security teams

The SecOps team will often need the network data and behavior insights for security analytics or compliance audits. This will usually require network metadata and packet data from physical, virtual and cloud-native elements of the network deployed across the data center, branch offices and multi-cloud environments. The easier it is to access, index and make sense out of this data (preferably in a “single pane of glass” solution), the more value it will provide. Obtaining this insight is entirely feasible but will require a mix of physical and virtual network probes and packet brokers to gather and consolidate data from the various corners of the network to process and deliver it to the security tool stack. NDR solutions can also offer the SecOps team the ability to capture and retain network data associated with indicators of compromise (IOCs) for fast forensics search and analysis in case of an incident. This ability to capture, save, sort and correlate metadata and packets allows SecOps to investigate breaches and incidents after the fact and determine what went wrong, and how the attack can be better recognized and prevented in the future.
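The forensic search described above can be sketched in a few lines: retained network metadata is filtered against a list of indicators of compromise. The record format, field names, and indicator values below are illustrative, not those of any particular NDR product:

```python
# Minimal sketch: searching retained network flow metadata for indicators
# of compromise (IOCs). Field names and values are invented for illustration.

IOC_DOMAINS = {"evil-cdn.example", "c2.example"}
IOC_IPS = {"203.0.113.7"}

def find_ioc_hits(flow_records):
    """Return flow records whose destination matches a known IOC."""
    hits = []
    for rec in flow_records:
        if rec.get("dst_ip") in IOC_IPS or rec.get("dst_domain") in IOC_DOMAINS:
            hits.append(rec)
    return hits

flows = [
    {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7", "dst_domain": None},
    {"src_ip": "10.0.0.8", "dst_ip": "198.51.100.2", "dst_domain": "intranet.example"},
]
print(find_ioc_hits(flows))  # only the first flow matches an IOC
```

A real deployment would run this kind of query against indexed packet and metadata stores, which is exactly the capability the "single pane of glass" tooling is meant to provide.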


A Beginner’s Introduction To DevOps Principles

To put it simply, DevOps is all about integrating these two teams together (hence the portmanteau of a name). It isn’t going to make your developers into sysadmins, or vice versa, but it should help them work together. Each aspect and phase is complemented with tools that make this whole process easier. DevOps is more than just tools and automation, and implementing a set of “DevOps tools” won’t automatically make your team work twice as fast, but these tools are a major part of the process, and it’d be hard to be as efficient without some of them. ... Rather than testing and building only once when everything is finished, in a DevOps environment, each developer will ideally submit changes to source control multiple times a day, whenever issues are complete or a minor milestone is reached. This allows the build and testing phases to start early, and makes sure no developer gets too far away from the HEAD of the master branch. This stage is mostly about proper source control management, so having an effective git service like GitHub, Gitlab, or BitBucket is crucial to keeping continuous integration running smoothly. You don’t have to deploy every commit to production right away, but quick automated deployments are a major part of being able to push rapid releases.


It's the biggest job in tech. So why can't they find anyone to do it?

The failure to appoint a senior leader to coordinate the mammoth task of digitizing public services is at odds with the government's rhetoric. Three years ago, the UK reiterated the need to create a "government as a platform" in a brand-new digital strategy, with the objective of harnessing the potential of digital to improve the efficiency of public services. The goal? To enable "digital by default" across government, and use technology and data to better serve citizens with digitally enabled public services that would be easier, simpler and cheaper. Since then, many reports have emerged stressing the difficulty of achieving this digital transformation journey without proper management from the very top. Last year, for instance, a report from the House of Commons' Science and Technology Committee found that the government's digital momentum was slowing, and that the shift was partly due to a lack of senior leadership. These failures have been especially palpable in the past few months. As the global COVID-19 pandemic turned the world upside down, the need for a government that effectively delivers digital services in a time of crisis became ever more important.


Visa Warns of Fresh Skimmer Targeting E-Commerce Sites

The Visa alert does not indicate how Baka is initially delivered to a network. But the report notes that the malicious code is hosted on several suspicious domains, including: jquery-cycle[.]com, b-metric[.]com, apienclave[.]com, quicdn[.]com, apisquere[.]com, ordercheck[.]online and pridecdn[.]com. Once the initial infection takes hold, the skimmer is uploaded through the command-and-control server, but the code loads in memory. This means the malware is never present on the targeted e-commerce firm's server or saved to another device, helping it to avoid detection, according to the alert. "The skimming payload decrypts to JavaScript written to resemble code that would be used to render pages dynamically," according to Visa. Once embedded in an e-commerce site's checkout page, the skimmer begins to collect payment and other customer data from various fields and sends the information to the fraudsters' command-and-control server, Visa notes. Once data exfiltration is complete, Baka performs a "clean-up" function that removes the skimming code from the checkout page, according to the alert. This also helps ensure that JavaScript is not spotted by anti-malware tools.
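As a defender-side illustration, a site owner could check the external scripts loaded by a checkout page against the domains named in the alert. The function below is a hypothetical sketch; a real deployment would normalize URLs more carefully and pull indicators from a maintained threat feed rather than a hard-coded set:

```python
from urllib.parse import urlparse

# Domains listed in the Visa alert as hosting the Baka skimmer.
BAKA_DOMAINS = {
    "jquery-cycle.com", "b-metric.com", "apienclave.com", "quicdn.com",
    "apisquere.com", "ordercheck.online", "pridecdn.com",
}

def flag_suspicious_scripts(script_urls):
    """Return the script URLs whose host matches a known skimmer domain."""
    return [u for u in script_urls if urlparse(u).hostname in BAKA_DOMAINS]

urls = ["https://quicdn.com/loader.js", "https://cdn.shop.example/app.js"]
print(flag_suspicious_scripts(urls))  # flags only the quicdn.com script
```

Because Baka loads in memory and removes itself from the page after exfiltration, this kind of static check is necessarily partial: it catches the loader reference, not the decrypted payload.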


Elon Musk is one step closer to connecting a computer to your brain

While the development of this futuristic-sounding tech is still in its early stages, the presentation was expected to demonstrate the second version of a small, robotic device that inserts tiny electrode threads through the skull and into the brain. Musk said ahead of the event he would “show neurons firing in real-time. The matrix in the matrix.” And he did just that. At the event, Musk showed off several pigs that had prototypes of the neural link implanted in their heads, and machinery that was tracking those pigs’ brain activity in real time. The billionaire also announced the Food and Drug Administration had awarded the company a breakthrough device authorization, which can help expedite research on a medical device. Like building underground car tunnels and sending private rockets to Mars, this Musk-backed endeavor is incredibly ambitious, but Neuralink builds on years of research into brain-machine interfaces. A brain-machine interface is technology that allows a device, like a computer, to interact and communicate with a brain.




Quote for the day:

"The actions of a responsible executive are contagious." -- Joe D. Batton

Daily Tech Digest - September 11, 2020

How this open source test framework evolves with .NET

Fixie v3 is a work in progress that we intend to release shortly after .NET 5 arrives. .NET 5 is the resolution to the .NET Framework vs. .NET Core development lines, arriving at One .NET. Instead of fighting it, we're following Microsoft's evolution: Fixie v3 will no longer run on the .NET Framework. Removing .NET Framework support allowed us to remove a lot of old, slow implementation details and dramatically simplified the regression testing scenarios we had to consider for each release. It also allowed us to reconsider our design. The Big Three requirements changed only slightly: .NET Core does away with the notion of an App.config file closely tied to your executable, instead relying on a more convention-based configuration. All of Fixie's assembly-loading requirements remained. More importantly, the circumstances around the design changed in a fundamental way: we were no longer limited to using types available in both .NET Framework and .NET Core. By promising less with the removal of .NET Framework support, we gained new degrees of freedom to modernize the system.


A 5-step Guide to Building Empathy that can Boost your Development Career

When you reflect on yourself, also analyze your interactions. When you speak, do you ramble on? Do you raise your voice easily, or get easily upset? Do you talk more than listen? How do you come across physically? Do you roll your eyes, or dart them around the room? Do you slouch or bury your hands in your pockets? Think about the language you use during conversations. Do you use habitual phrases that help or hinder your message? Is your language helping others to pay attention or tune you out? Does it encourage conversations and build bridges? Are you making others feel heard and respected, or ignored and underappreciated? To start your self-awareness journey, you can take advantage of a number of tools: DISC, Real Colors, and Myers-Briggs are all great starting points to understanding your own personality. These tools are not there to dictate who you are, but to guide you in understanding who you are. When you take the quiz, you are essentially having a conversation with that quiz. The results are simply telling you how you showed up to that conversation - the outcome is affected by your mood, attitude, energy, recent events, etc.


New CDRThief malware targets VoIP softswitches to steal call detail records

"At the time of writing we do not know how the malware is deployed onto compromised devices," Anton Cherepanov, one of ESET's top malware hunters, wrote in an analysis today. "We speculate that attackers might obtain access to the device using a brute-force attack or by exploiting a vulnerability. Such vulnerabilities in VOS2009/VOS3000 have been reported publicly in the past," Cherepanov added. However, once the malware has a foothold on a Linux server running Linknat VOS2009 or VOS3000, the malware searches for the Linknat configuration files and extracts credentials for the built-in MySQL database, where the softswitch stores call detail records (CDR, aka VoIP calls metadata). "Interestingly, the password from the configuration file is stored encrypted," Cherepanov pointed out. "However, Linux/CDRThief malware is still able to read and decrypt it. Thus, the attackers demonstrate deep knowledge of the targeted platform, since the algorithm and encryption keys used are not documented as far as we can tell. It means that the attackers had to reverse engineer platform binaries or otherwise obtain information about the AES encryption algorithm and key used in the Linknat code."


Open-sourcing TensorFlow with DirectML

TensorFlow is a widely used machine learning framework for developing, training, and distributing machine learning models. Machine learning workloads often involve tremendous amounts of computation, especially when training models. Dedicated hardware such as the GPU is often used to accelerate these workloads. TensorFlow can leverage both Central Processing Units (CPUs) and GPUs, but its GPU acceleration is limited to vendor-specific platforms that vary in support for Windows and across its users’ diverse range of hardware. Bringing the full machine learning training capability to Windows, on any GPU, has been a popular request from the Windows developer community. The DirectX platform in Windows has been accelerating games and compute applications on Windows for decades. DirectML extends this platform by providing high-performance implementations of mathematical operations—the building blocks of machine learning—that run on any DirectX 12-capable GPU. We’re bringing high-performance training and inferencing on the breadth of Windows hardware by leveraging DirectML in the TensorFlow framework. 


Developing a plan for remote work security? Here are 6 key considerations

Training needs to address all aspects of your structure, specifically: information security, data security, cybersecurity, computer security, physical security, IoT security, cloud security, and individual security. Each area of an architecture needs to be tested and hardened regularly for your organization to truly be shielded from security breaches. Be specific about your program: train your staff on how to defend your information around your HR records (SSNs, PII, etc.) and data that could be exposed (shopping cart, customer card numbers), as well as in cyber defense to provide tools against nefarious actors, breaches and threats. Staff must be trained to know how to lock down computers, so individual machines and network servers are safe. This training should also encompass how to ensure physical security, to protect your storage or physical assets. This comes into play more as the IoT plays a larger role in connecting our devices and BYOD policies allow for more connections to be made between personal and corporate assets. Individual security: each employee is entitled to be secure in their work for a company, and that includes privacy concerns and compliance issues.


Phishing attack baits victims by promising access to quarantined emails

As analyzed by the Cofense Phishing Defense Center, this phishing attack is directed toward employees within an organization. Impersonating the technical support team of the user's employer, the campaign pretends to have quarantined three email messages, blocking them from reaching the recipient's inbox. Clicking on a link promises access to these messages but instead directs the person to a phishing page. The user is then prompted to sign in with their email account credentials, which are then captured by the attacker. The campaign seems convincing in a variety of ways, according to Cofense. By spoofing the account of the internal support staff, the phishing email appears to come from a trusted source. The quarantine notice sounds real, even claiming that the quarantined messages failed to process and must be reviewed to confirm their validity. Further, the notice has an air of immediacy by saying that two of the messages are considered valid and will be deleted in three days unless action is taken. Such a notice could convince the recipient that these are messages of importance to their organization, requiring a quick response to review them before they're gone.


Laying The Groundwork For ‘Fintech 2.0’ With Digital Assets

Increasingly, government entities are interested in stablecoin technology as well. While it's a promising development in the world of digital assets, Woodford said he doesn't expect state-backed initiatives to go live and take off anytime soon. Rather, the biggest value in these efforts is in validating digital assets as a whole. "If you look at what has caused the shift in mentality in the last 12-18 months, it went from, 'No, we don't want this,' to, 'No, but this is interesting' to the point now where it's interesting and people are actively engaging in this space," he explained. "One of the reasons for that is because of the sentiment, caused by those government announcements. It's one driver, but it's more important and meaningful now in terms of how it's adjusted the attitude." The fact is, any dramatic change in the world's payments landscape isn't going to happen overnight — certainly not a shift from fiat currency toward digital assets like bitcoin. It's part of the reason why stablecoin technology is so popular; it's a blend between fiat and digital currency, and that mix is critical to driving traction. As such, Zero Hash, which recently announced the close of its Series C funding round, is planning to not only augment its lending offering, but to integrate ACH processing capabilities within its infrastructure.


Smart contact lens prototype raises eyebrows

The human iris controls pupil size in response to light, a critical function that allows the retina to take in appropriate sensory information. Too much light and the world is washed out, too little and it's veiled in darkness. A host of eye diseases and deficiencies inhibit the iris from responding appropriately, including aniridia and keratoconus. Light sensitivity, similarly, is a painful debilitation and is often associated with chronic migraine. Researchers at Imec, an innovation hub based in Belgium, along with partners like CMST, a Ghent University-affiliated research group, the Instituto de Investigación Sanitaria Fundación Jiménez Díaz in Madrid, Spain, and Holst Centre have been developing a low-power wearable solution. The contact lens's iris aperture is tunable thanks to an integrated liquid crystal display (LCD) that manipulates concentric rings. "By combining our expertise on miniaturized flexible electronics, low-power ASIC design and hybrid integration, we have demonstrated the capacity to develop a solution for people who suffer from iris deficiencies, higher order aberrations and photophobia, a common yet debilitating symptom seen in many neuro-ophthalmic disorders," says researcher Prof. Andrés Vásquez.


3 tips for supercharging your remote workforce with AI and automation

Organisations today are facing numerous pressures to enable a remote workforce, particularly in the IT function, since we have entered the post-Covid era. At a time when the traditional modus operandi is constantly being tested, there are some ‘new’ approaches that have actually been in use in other parts of the market for a while now. We can take several lessons from the consumer tech world and how it leverages automation and AI to reduce maintenance and ease operation. Let’s take the Nest thermostat as an example. A single thermostat changes temperature about 1,500 times per year, so a large house with 3 thermostats changes temperature about 4,500 times per year. ... Make sure you have a single API endpoint in the cloud to enumerate & automate all of your storage assets on-prem. Having a cloud-managed platform provides the visibility and orchestration of your assets across sites, servers and applications and you can take advantage of a single API in the cloud to then automate all or a portion of those as needed. You get an aggregated view, or you can filter by data centre or application, server group, etc. Then ask interesting questions like, where is there available capacity for a new project?
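The aggregated-view idea can be sketched with plain data: one call to a (hypothetical) cloud endpoint returns every storage asset, and filtering by data centre or application is then just a query over that list. The field names and figures below are invented:

```python
# Sketch of the "aggregated view" pattern: a single cloud API returns all
# storage assets; capacity questions become filters over one dataset.

ASSETS = [  # stand-in for the JSON a single cloud API call might return
    {"site": "dc-east", "app": "billing", "capacity_free_tb": 12},
    {"site": "dc-east", "app": "archive", "capacity_free_tb": 40},
    {"site": "dc-west", "app": "billing", "capacity_free_tb": 3},
]

def free_capacity(assets, site=None, app=None):
    """Total free capacity, optionally filtered by data centre or application."""
    return sum(a["capacity_free_tb"] for a in assets
               if (site is None or a["site"] == site)
               and (app is None or a["app"] == app))

print(free_capacity(ASSETS))                  # fleet-wide: 55
print(free_capacity(ASSETS, site="dc-east"))  # one data centre: 52
```

The "where is there capacity for a new project?" question from the paragraph above is then just one more filter over the same response.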


Plan for change but don’t leave security behind

The best advice is to plan for change – technical, process and culture – but do not, whatever you do, leave security till last. It has to be front and centre of any plans you make. One concrete change that you can make immediately is taking your security people off just “fire-fighting duty”, where they have to react to crises as they come in: businesses can consider how to use them in a more proactive way. People don’t scale, and there’s a global shortage of security experts. So, you need to use the ones that you have as effectively as you can, and, crucially, give them interesting work to do, if you plan to retain them. It’s almost guaranteed that there are ways to extend their security expertise into processes and automation which will benefit your broader teams. At the same time, you can allow those experts to start preparing for new issues that will arise, and investigating new technologies and methodologies which they can then reapply to business processes as they mature. ... One of the main mistakes we see businesses make is attempting to deploy Kubernetes without the appropriate level of in-house expertise. Kubernetes is an ecosystem, rather than a one-off executable, that relies on other services provided by open source projects.



Quote for the day:

"Leadership flows from the minds of followers more than from the titles of leaders, more from the perception of willing followers than from anointment." -- Lane Secretan

Daily Tech Digest - September 10, 2020

Does An Analytics Head Require A Doctoral Degree?

Obviously, researchers in business are not expected to publish papers or guide students as their academic counterparts do. They are relied upon to analyze complex business problems methodically, as a scientist does. They are expected to make suitable approximations, identify simpler parts within the complex whole, and attack them using known, repeatable, robust principles and techniques. ... Let us say a large IT services company wants to fill leadership roles in the data science consulting practice. This person should have enough technical depth and the ability to identify the business gaps, communicate with the clients and, most importantly, build solutions that provide measurable business value (interestingly, this last skill is never considered a core competency in any traditional PhD in AI or other master's and bachelor's courses). Let us say an IT product company decides to smarten its application and wants leadership that can take it to the market quickly and profitably. The leaders should have the skill to define the product, design the technicalities, and lead the data science and DevOps teams compassionately and efficiently for rapid design and development. Hence, a leader in data science is not necessarily a technical expert who has worked in the company long enough, nor a business leader who is a taskmaster!


Ripple20 Malware Highlights Industrial Security Challenges

Since availability is critical to ICS systems, and since the systems themselves can be fragile and quirky, these are generally the responsibility of operational technology (OT) teams. The information technology (IT) team usually manages the corporate network. OT employees are familiar with process technology and the systems they manage, but they do not generally know a great deal about information security, which can lead to insecure deployments. One fairly common situation for manufacturers is a divide, sometimes adversarial, between the IT and OT staff within a company.  OT employees do not want the IT staff to tamper with their systems out of fear of downtime that can cost the company. From what we have seen, these relationships often resemble red team versus blue team attitudes at many organizations. The blue team can resent the efforts of the red team because those efforts create more work for the blue team and can be considered a criticism of their work. OT employees also often don't want to consult with their IT counterparts when making arrangements such as remote access, leading to situations such as RDP on control networks commonly being exposed to the public Internet.


India can soon be the tech garage of the world

The government has a crucial role to play in positioning India as the Tech Garage of the World. It should act as a catalyst, and bring together the synergies of the private sector with the aim of innovating for India and the world. It has the potential to provide an enabling environment and a favourable regulatory ecosystem for the development of technology products and provide the size and scale necessary for their rollout. The product development should ideally be undertaken through private entrepreneurship, with the government acting as a facilitator. The key principles of product design should incorporate transparency, security and ease of access. The products must have open architecture, should be portable to any hosting environment and should be available in official and regional languages. The irrevocable shift brought about by covid-19 presents opportunities to develop new technology platforms. In this process, data integrity, authenticity and privacy should be embedded into the design of a product. A balance needs to be struck between regulation and product design through a dynamic collaboration between the government and technology entrepreneurs.


The State of Chatbots: Pandemic Edition

Generally speaking, there are two types of chatbots right now. The first kind is the more primitive kind that is based on simple question and answer rules. This kind is the easiest to deploy quickly, in response to some catastrophic event, like, for instance, a pandemic. It has a scripted set of answers. The problem with this kind of chatbot is that it is very limited, and it can't be enhanced or expanded. It's a one-trick chatbot. "The deterministic-rules based approach chatbots are easy to stand up quickly," Ian Jacobs, a principal analyst at Forrester Research, told InformationWeek. That means there was a huge number of these deployed during the pandemic. "There was an increase in call volume, and you were doing anything you could to get answers to customers without hiring another thousand call center agents," he said. These bots were doing very simple things, but "We are getting to the point where the value that brands are getting out of those very simple bots has already been achieved." One example of this type of bot was deployed by a credit union in the northwestern United States in April when stimulus checks were on the way, Jacobs said. This organization stood up a simple bot designed to answer basic questions that people were asking about the checks.
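A deterministic, rules-based bot of this kind is little more than a lookup table with a fallback answer. The sketch below uses invented questions and answers in the spirit of the stimulus-check example:

```python
# Minimal sketch of a deterministic, rules-based chatbot: a scripted
# question-and-answer table plus a fallback. The Q&A content is invented.

RULES = {
    "when will my stimulus check arrive":
        "Checks are being mailed over the next several weeks.",
    "how much will i receive":
        "Amounts depend on your most recent tax filing.",
}

FALLBACK = "Sorry, I can only answer a fixed set of questions."

def reply(message):
    """Normalize the message and look it up in the scripted rule table."""
    return RULES.get(message.strip().lower().rstrip("?"), FALLBACK)

print(reply("When will my stimulus check arrive?"))
print(reply("Can you extend my loan?"))  # outside the script -> fallback
```

The limitation the paragraph describes is visible immediately: anything outside the scripted table hits the fallback, and the only way to "improve" the bot is to hand-edit the table.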


Digital Transformation Success Elusive For Financial Institutions

When financial institution executives were asked about the importance of alternative digital transformation strategies, improving the overall customer experience was considered to be of high or very high importance by 88% of organizations. The importance of improving the customer experience was followed closely by the need to improve the use of data, AI and advanced analytics (76% rated high or very high). Illustrating the perceived broad scope of digital transformation initiatives at most financial institutions, the majority of the other possible digital transformation strategies were each rated almost identically by financial institution executives in the Digital Banking Report research. Innovation agility, improving marketing and sales, improved efficiency, improved risk management and reducing costs were each rated high or very high by roughly six in ten executives. It is a bit concerning that the need to change the existing business model and to transform legacy core systems were considered the least important strategies, despite research that indicates these strategies are of significant importance for transformation success.


Organizations must rethink traditional IT strategy to succeed in the new normal

This newfound self-confidence, combined with IT pros’ achievements during this time, will completely transform how IT is viewed by the business in the future. IT may earn a more prominent voice in the C-suite, as 40% of surveyed IT pros believe they will now be involved in more business-level meetings. Likewise, IT’s role will be up-leveled due to the vast upskilling 26% of IT pros underwent during this experience. With 31% admitting there’s a need to rethink internal processes to better accommodate the rapid change of pace required post-COVID, it’s highly likely a focus on IT pros’ upskilling will continue into the future. “As always, with new responsibilities comes the need for new skills. While almost half of survey respondents felt they received the training required to adapt to changing IT requirements, nearly one-third experienced the opposite, and are at risk of being left behind as IT teams continue to grapple with how best to support the new normal,” said Johnson. IT pros said they’ve gained an increased sense of confidence in their expanded roles, responsibilities, and ability to adapt to unexpected change in the future, despite contending with more challenging working conditions over the course of the pandemic.


Why Linux still needs a flagship distribution

Now, imagine a single distribution has been chosen, from the hundreds of currently available distributions, to represent Linux to hardware manufacturers, vendors, and software companies. That one Linux distribution would be used by hardware manufacturers and software companies to create computers and software guaranteed to run on Linux. That distribution would have only one desktop environment, one package manager, one init system, and the current stable version of the Linux kernel. Users could also download this Linux distribution and use it at will, but the primary purpose of "Flagship Linux" would be to make things easier for manufacturers and developers. Set aside your affinity for the Linux distribution you use and ponder this for a moment: Would you rather argue over which distribution is the best, or would you rather see Linux enjoy massive growth in the desktop and laptop arenas? We've already seen a number of manufacturers start the rollout of preinstalled Linux laptops. Lenovo, Dell, and HP are all joining in on the fun, but the process hasn't been easy. As you can see, those manufacturers are, for the most part, all winnowing down the selection of Linux distributions available.


Federated Machine Learning for Loan Risk Prediction

A model is only as strong as the data it’s provided, but what happens when data isn’t readily accessible or contains personally identifying information? In this case, can data owners and data scientists work together to create models on privatized data? Federated learning shows that it is indeed possible to pursue advanced models while still keeping data in the hands of data owners. This new technology is readily applicable to financial services, as banks have extremely sensitive information ranging from transaction history to demographic information for customers. In general, it’s very risky to give data to a third party to perform analytical tasks. However, through federated learning, the data can be kept in the hands of financial institutions and the intellectual property of data scientists can also be preserved. In this article, we will demystify the technology of federated learning and touch upon one of the many use cases in finance: loan risk prediction. Federated Learning, in short, is a method to train machine learning (ML) models securely via decentralization. That is, instead of aggregating all the data necessary to train a model, the model is instead sent to each individual data owner.
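The core loop — clients train locally, the server averages the returned weights — can be sketched in a few lines. The "model" here is just a weight vector and the gradients are stand-ins; a real system would run many rounds and typically add secure aggregation on top:

```python
# Sketch of one round of federated averaging (FedAvg): each data owner
# computes an update on its own private data and returns only weights;
# the server averages them. Values are illustrative.

def local_update(weights, local_gradient, lr=0.5):
    """One gradient step on a client's private data (gradient is a stand-in)."""
    return [w - lr * g for w, g in zip(weights, local_gradient)]

def federated_average(client_weights):
    """Server-side: element-wise mean of the clients' weight vectors."""
    n = len(client_weights)
    return [sum(ws) / n for ws in zip(*client_weights)]

global_weights = [0.0, 0.0]
# Each bank trains on its own data; the raw records never leave the bank.
updates = [local_update(global_weights, g) for g in ([1.0, 2.0], [3.0, -2.0])]
global_weights = federated_average(updates)
print(global_weights)  # [-1.0, 0.0]
```

Only the weight vectors cross the wire, which is the property that lets financial institutions keep transaction and demographic data in-house while still contributing to a shared loan-risk model.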


How to Protect Chatbots from Machine Learning Attacks

Chatbots are particularly vulnerable to machine learning attacks due to their constant user interactions, which are often completely unsupervised. We spoke to Scanta to get an understanding of the most common cyber attacks that chatbots face. Scanta CTO Anil Kaushik tells us that one of the most common attacks they see are data poisoning attacks through adversarial inputs. Data poisoning is a machine learning attack in which hackers contaminate the training data of a machine learning model. They do this by injecting adversarial inputs, which are purposefully altered data samples meant to trick the system into producing false outputs. Systems that are continuously trained on user-inputted data, like customer service chatbots, are especially vulnerable to these kinds of attacks. Most modern chatbots operate autonomously and answer customer inquiries without human intervention. Often, the conversations between chatbot and user are never monitored unless the query is escalated to a human staff member. This lack of supervision makes chatbots a prime target for hackers to exploit. To help companies protect their chatbots and virtual assistants, Scanta is continuously improving their ML security system, VA Shield.
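A toy example makes the poisoning mechanism concrete. The keyword-count "classifier" below is deliberately simplistic and the intent labels are invented; the point is only that repeated mislabeled submissions shift what the retrained model predicts:

```python
# Toy illustration of data poisoning: a chatbot intent classifier is
# continuously retrained on user-submitted (text, label) pairs, and an
# attacker injects mislabeled samples so a malicious phrase looks benign.
from collections import Counter

def train(samples):
    """Count, per label, how often each word appears in the training data."""
    counts = {}
    for text, label in samples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def predict(counts, text):
    """Pick the label whose word counts best match the text."""
    words = text.lower().split()
    return max(counts, key=lambda lab: sum(counts[lab][w] for w in words))

clean = [("reset my password", "account_help"),
         ("wire all funds now", "fraud_alert")]
poison = [("wire all funds now", "account_help")] * 5  # adversarial inputs

print(predict(train(clean), "wire all funds now"))           # fraud_alert
print(predict(train(clean + poison), "wire all funds now"))  # account_help
```

Five mislabeled submissions are enough to flip the prediction here precisely because the training data is unsupervised user input — the same property that makes production chatbots attractive targets.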


The Expanding Role of Metadata Management, Data Quality, and Data Governance

After the data has been accurately defined, it is important to put in place procedures to assure the accuracy of the data. Imposing controls on the wrong data does no good at all. Which raises the question: How good is your data quality? Estimates show that, on average, data quality is an overarching industry problem. According to data quality expert Thomas C. Redman, payroll record changes have a 1% error rate; billing records have a 2% to 7% error rate; and the error rate for credit records is as high as 30%. But what can a DBA do about poor quality data? Data quality is a business responsibility, but the DBA can help by instituting technology controls. By building constraints into the database, overall data quality can be improved. This includes defining referential integrity in the database. Additional constraints should be defined in the database as appropriate to control uniqueness, as well as data value ranges using check constraints and triggers. Another technology tactic that can be deployed to improve data quality is data profiling. Data profiling is the process of examining the existing data in the database and collecting statistics and other information about that data.
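Both kinds of control mentioned here — referential integrity and value-range checks — can be declared directly in the schema. A minimal sketch using SQLite (table and column names illustrative):

```python
# Enforcing data quality in the database itself: a foreign key for
# referential integrity and a CHECK constraint for a value range.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only if enabled
con.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY)")
con.execute("""
    CREATE TABLE emp (
        id      INTEGER PRIMARY KEY,
        dept_id INTEGER NOT NULL REFERENCES dept(id),
        salary  INTEGER CHECK (salary BETWEEN 1 AND 1000000)
    )""")
con.execute("INSERT INTO dept VALUES (10)")
con.execute("INSERT INTO emp VALUES (1, 10, 50000)")   # passes both controls

for bad in ("INSERT INTO emp VALUES (2, 99, 50000)",   # no such department
            "INSERT INTO emp VALUES (3, 10, -5)"):     # salary out of range
    try:
        con.execute(bad)
    except sqlite3.IntegrityError as exc:
        print("rejected:", exc)
```

Bad rows are rejected at write time rather than discovered later by profiling, which is the practical payoff of pushing these controls down into the database.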



Quote for the day:

"Concentrate all your thoughts upon the work in hand. The Sun's rays do not burn until brought to a focus." -- A.G. Bell