Daily Tech Digest - March 19, 2020

Microsoft: .NET 5 preview for Windows 10, iPhone, Android Surface Duo apps is out


Ahead of the final version of .NET 5, Microsoft has a clear message for developers: ".NET Core and then .NET 5 is the .NET you should build all your NEW applications with." "Having a version 5 that is higher than both .NET Core and .NET Framework also makes it clear that .NET 5 is the future of .NET, which is a single unified platform for building any type of application," said Scott Hunter, director of program management for .NET at Microsoft. The first preview includes support for Windows Arm64 and the .NET Core runtime; the second preview will include an SDK with ASP.NET Core but not WPF or Windows Forms, which should arrive in a subsequent preview. The preview should allow developers to update existing projects simply by updating the target framework. The main goals for .NET 5 include providing a unified .NET SDK with a single Base Class Library (BCL) across all .NET 5 applications, with Xamarin moving to the .NET Core BCL. Because Xamarin is integrated into .NET 5, the .NET SDK will support mobile. Microsoft's ongoing work on Blazor should also mean web application support across platforms, including browsers, on mobile devices and as a native desktop application for Windows 10 and Windows 10X.
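In practice, retargeting an existing project means editing its project file; a minimal sketch of what that might look like (the framework moniker `net5.0` is the one used by the previews, and everything else here is a generic illustration, not a specific preview build):

```xml
<!-- Hypothetical project file: moving an existing .NET Core app to the
     .NET 5 preview by changing only the target framework. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- Previously: <TargetFramework>netcoreapp3.1</TargetFramework> -->
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
</Project>
```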



IR35 reform delay: how tech companies and contractors should respond

Paul Wright, head of the technology practice at Odgers Interim, has some very important advice on how companies should respond to the regulatory respite: revoke any blanket bans on contractors. He says: “Businesses have now been given some breathing room to get their houses in order and I cannot stress enough how important it is for them to take this time to revoke any blanket assessment statuses they have enforced and re-evaluate their contingent workforce needs. “As the impact of Covid-19 steers the economy into uncharted waters, the UK’s freelance, independent and contractor workforces will be more important than ever for tech firms – which already rely heavily on this industry.” Wright also sees contractors and freelancers as the solution to absences in the permanent workforce caused by Covid-19. “Many organisations will not only need to procure the specialist skillsets of contractors and independents to help guide them through increasing levels of disruption but will also need to call upon their support to fill in for permanent staff who are either self-isolating or having to look after family members.”


Data Governance: How to Tackle 3 Key Issues

Some security practitioners argue that larger organizations should designate different accountable parties for protecting the privacy of customer, product and financial data - or even designate those in charge in each region. But organizations need someone at the top of the chain, such as a chief data officer, so that federated ownership can be kept in check, Deb says. Deb has also implemented a RACI - responsible, accountable, consulted and informed - matrix that helps him assign data owners. "So respective business units or their heads own the data and the accountability," he says. "For instance, IT is the data custodian, assurance functions are the data governors and so on. That way, an entire RACI matrix is built for every application, platform and data we process internally." One of the major roadblocks in the data governance process is the problem of shadow IT, Deb says. Shadow IT is where development happens either in-house or through an outsourced partner without the supervision and governance of the IT InfoSec and privacy teams.
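A RACI matrix of the kind Deb describes can be modeled very simply; a minimal sketch in Python, where every asset, role and party name is hypothetical rather than taken from any real deployment:

```python
# Toy RACI matrix for data assets: each asset records who is
# responsible, accountable, consulted and informed. A single
# accountable party at the top keeps federated ownership in check.
RACI = {
    "customer_data": {
        "responsible": "Business Unit Head",          # day-to-day owner
        "accountable": "Chief Data Officer",          # top of the chain
        "consulted":   "Assurance (data governors)",
        "informed":    "IT (data custodian)",
    },
    "financial_data": {
        "responsible": "Finance Head",
        "accountable": "Chief Data Officer",
        "consulted":   "Assurance (data governors)",
        "informed":    "IT (data custodian)",
    },
}

def accountable_party(asset: str) -> str:
    """Return the single accountable party for a data asset."""
    return RACI[asset]["accountable"]
```

Built out per application, platform and data set, a structure like this makes it mechanical to answer "who owns this data?" during an audit.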


9 Cybersecurity Takeaways as COVID-19 Outbreak Grows

Security experts cite phishing attacks as being one of the biggest threats in this new environment, and warn that existing efforts to safeguard employees are too often inadequate. "Phishing attacks are on the rise, and employees at home might be especially vulnerable," attorneys Jonathan Armstrong and André Bywater say in a client note. "We've expressed concerns before that a lot of 'off-the-shelf' phishing training is not fit for purpose. It's important to make sure employees are trained and that they have regular reminders. Organizations using [Office 365] may be especially vulnerable at this time." To help, many organizations are releasing materials for free. For example, the SANS Institute has released large parts of its commercial awareness materials. But with phishing attacks that prey on coronavirus fears already surging, many organizations are playing catchup. "Like many phishing scams, these emails are preying on real-world concerns to try and trick people into doing the wrong thing," the U.K.'s National Cyber Security Center says, noting that shipping, transport and retail industries were being targeted.


Reasons For Transitioning To Cloud Computing In 2020


Cloud computing has now become a common term that all of us have heard of. However, unfortunately, many of us still don’t understand the complete potential of cloud computing. It is high time for all of us to understand how it can make our lives easier. Instead of storing data on a computer or hard drive, cloud computing stores programs and data over the internet. In other words, in order to access your data, you must be connected to the internet. In fact, many of us already use cloud computing unknowingly, while listening to our favorite tunes on Spotify or using Google Drive for data storage. The flexibility and functionality of cloud computing have already proven to be a lifesaver for businesses. However, cloud computing for a business is entirely different from the personal use of the cloud. Before implementing cloud computing, businesses need to choose between Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS). In a nutshell, PaaS allows users the freedom to come up with customized applications as per their requirements. On the other hand, SaaS requires users to subscribe to a chosen application.


IT Priorities 2020: Digitisation drives IT modernisation growth


Opening up APIs, with access controlled via an API management platform, is one of the ways IT departments can minimise the effort needed to modernise applications. The survey reported that 47% of IT professionals said they planned to increase the use of cloud infrastructure to support digital transformation initiatives in 2020. Applications can be replatformed from on-premise servers to public cloud-hosted infrastructure-as-a-service (IaaS) platforms. In fact, 38% of the respondents said they would increase their cloud budgets in 2020. This potentially shifts spending from a capital expenditure model for on-premise datacentre hardware to pay-as-you-go in the public cloud. Many of the legacy applications that are migrated to the cloud can only run in virtual machines (VMs). VMs in the public cloud replace physical servers or on-premise VMs. But as organisations move along their journey to become cloud-native, in some instances, IT professionals are looking at splitting legacy code into functional building blocks.


AI adoption in the enterprise 2020

AI adoption is proceeding apace. Most companies that were evaluating or experimenting with AI are now using it in production deployments. It’s still early, but companies need to do more to put their AI efforts on solid ground. Whether it’s controlling for common risk factors—bias in model development, missing or poorly conditioned data, the tendency of models to degrade in production—or instantiating formal processes to promote data governance, adopters will have their work cut out for them as they work to establish reliable AI production lines. Survey respondents represent 25 different industries, with “Software” (~17%) as the largest distinct vertical. The sample is far from tech-laden, however: the only other explicit technology category—“Computers, Electronics, & Hardware”—accounts for less than 7% of the sample. The “Other” category (~22%) comprises 12 separate industries. One-sixth of respondents identify as data scientists, but executives—i.e., directors, vice presidents, and CxOs—account for about 26% of the sample. The survey does have a data-laden tilt, however: almost 30% of respondents identify as data scientists, data engineers, AIOps engineers, or as people who manage them.


Electronics should sweat to cool down, say researchers

Computing devices should sweat when they get too hot, say scientists at Shanghai Jiao Tong University in China, where they have developed a materials application they claim will cool down devices more efficiently and in smaller form-factors than existing fans. It’s “a coating for electronics that releases water vapor to dissipate heat from running devices,” the team explains in a news release. “Mammals sweat to regulate body temperature,” so should electronics, they believe. The group’s focus has been on studying porous materials that can absorb moisture from the environment and then release water vapor when warmed. MIL-101(Cr) checks the boxes, they say. The material is a metal organic framework, or MOF, which is a sorbent, a material that stores large amounts of water. The higher a material's water capacity, the more heat it dissipates when warmed. MOF projects have been attempted before. “Researchers have tried to use MOFs to extract water from the desert air,” says refrigeration-engineering scientist Ruzhu Wang, who is senior author of a paper on the university’s work that has just been published in Joule.


Silverlight Reborn? Check Out 'C#/XAML for HTML5'

Now ... comes C#/XAML for HTML5 from Userware, which today announced its Silverlight-replacement project, also called CSHTML5, has reached release candidate status after a lengthy beta program. The tool comes as a Visual Studio extension in the Visual Studio Marketplace, promising to create HTML5 apps using only C# and XAML -- or migrate existing Silverlight apps to the Web. "Developers are now able to use C# and XAML to write apps that run in the browser," the French company said. "Absolutely no knowledge of HTML5 or JavaScript is required to use the extension, as it compiles your files to HTML5 and JavaScript for you. That means you can now build Web apps with static typing and all the strengths of C# and XAML, and make sure your code is ready when WebAssembly comes out." WebAssembly is an upcoming experimental technology presented as an open standard that lets developers write low-level assembly-like code for the browser in non-JavaScript languages like C, C++ and even .NET languages like C# for improved performance over JavaScript. Until WebAssembly is fully supported in the Web ecosystem, CSHTML5 might be seen as an alternative for .NET-centric developers.


More Business Websites Hit by Credit-card Skimming Malware

A malicious script planted on the NutriBullet website's payment page stole credit card numbers, expiry dates, CVV codes, names, and addresses of unsuspecting blender buyers and sent them to a server under the control of cybercriminals. According to the report, the sensitive data was then sold to other criminals on underground forums. RiskIQ says that although NutriBullet has attempted to clean up the poisoned webpages, the attackers continue to break back in and plant malicious code - suggesting that the attackers continue to exploit a way of compromising the blender maker's infrastructure. Peter Huh, the CIO of NutriBullet, confirmed that a security breach had occurred and said that a forensic investigation into the incident had been initiated. There is no word yet as to what plans NutriBullet has to inform affected customers. In both cases it feels like the companies at the centre of the security breaches should be responding more transparently with their users, ensuring that they are informed promptly and given as much detail as possible about what has occurred.



Quote for the day:


"Leaders must encourage their organizations to dance to forms of music yet to be heard." -- Warren G. Bennis


Daily Tech Digest - March 17, 2020

How Biometric Identity Will Drive Personal Security In Smart Cities


While smart cities can offer unprecedented levels of convenience to improve our everyday lives, they also rely on vast networks of data, including personal customer information to predict our preferences. This has led to concerns around the high levels of data used and stored by smart systems, and the security provided to our digital identity. We know that existing personal and unique identifiers, such as passwords and PINs, are no longer secure enough to protect our systems, and this is even more important in hyper-connected cities as, once a city becomes ‘smart’, the inter-connected networks widen and the potential for cyberattacks or data breaches grows. So as this trend continues, how can we develop smart cities that are both convenient and secure? To resolve this, providers of smart city networks need to establish a chain of trust in their technology. This is a process common in cybersecurity, where each component in a network is validated by a secure root. In wide connected networks, this is vital to protect sensitive personal or business data and ensure consumer trust in the whole system.
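The chain-of-trust idea described above can be sketched in a few lines. This toy uses plain hashing so the chaining is easy to see; real roots of trust use asymmetric signatures and hardware key storage, and the component names here are invented for the example:

```python
import hashlib

def link(parent_digest: str, component: str) -> str:
    """Bind a component to its parent by hashing; tampering anywhere
    in the chain changes every digest downstream of the change."""
    return hashlib.sha256((parent_digest + component).encode()).hexdigest()

# Hypothetical smart-city components chained back to a secure root.
root = hashlib.sha256(b"secure-root-key").hexdigest()
chain = ["traffic-sensor-fw-1.2", "gateway-fw-3.0"]

expected = root
for component in chain:
    expected = link(expected, component)

def validate(root_digest: str, components: list, claimed: str) -> bool:
    """Re-derive the chain from the root and compare to the claim."""
    digest = root_digest
    for c in components:
        digest = link(digest, c)
    return digest == claimed
```

Swapping in a tampered component name makes `validate` fail, which is the property that lets each node in a wide network be checked against the root.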


Coronavirus challenges remote networking


The security of home Wi-Fi networks is also an issue, Nolle said. IT pros should require workers to submit screenshots of their Wi-Fi configurations in order to validate the encryption being used. "Home workers often bypass a lot of the security built into enterprise locations," he said. Education of new home workers is also important, said Andrew Wertkin, chief strategy officer with DNS software company BlueCat. "There will be remote workers who have not substantially worked from home before, and may or may not understand the implications to security," Wertkin said. "This is especially problematic if the users are accessing the network via personal home devices versus corporate devices." An unexpected increase in remote corporate users using a VPN can also introduce cost challenges. "VPN appliances are expensive, and moving to virtualized environments in the cloud often can turn out to be expensive when you take into account compute cost and per-seat cost," Farmer said. A significant increase in per-seat VPN licenses has likely not been budgeted for.


Implementing CQRS Pattern with Vue.js & ASP.NET Core MVC

If you’re a software professional, then you’re familiar with software enhancement and maintenance work. This is part of the software development life cycle; it is how you correct faults and remove or enhance existing features. Software maintenance costs can be minimised if you use a software architectural pattern, choose the right technologies, stay aware of industry trends for the future, consider resource reliability and availability now and in the future, use design patterns and principles in your code, re-use your code and keep your options open for future extension. In any case, if you use a known software architectural pattern in your application, it will be easier for others to understand the structure and component design of your application. I’ll explain a sample project implementation of the CQRS pattern using MediatR in ASP.NET Core MVC with Vue.js. ... The main goal of this project is to explain the CQRS architectural pattern. I’m planning to implement a tiny Single-Page Application (SPA) project. The choice of technology is important, and you should choose it according to your requirements.
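The core of CQRS is that writes (commands) and reads (queries) travel through separate request types, each routed to its own handler. A language-neutral sketch of that idea in Python, with a toy in-process mediator loosely analogous to how MediatR dispatches requests (all class and handler names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class CreateTodoCommand:      # writes go through commands
    title: str

@dataclass
class GetTodosQuery:          # reads go through queries
    pass

class Mediator:
    """Routes each request object to the handler registered for its type."""
    def __init__(self):
        self._handlers = {}
    def register(self, request_type, handler):
        self._handlers[request_type] = handler
    def send(self, request):
        return self._handlers[type(request)](request)

todos = []                    # stand-in for the write/read stores
mediator = Mediator()
mediator.register(CreateTodoCommand, lambda cmd: todos.append(cmd.title))
mediator.register(GetTodosQuery, lambda q: list(todos))

mediator.send(CreateTodoCommand("write sample"))
print(mediator.send(GetTodosQuery()))  # ['write sample']
```

Because controllers only ever talk to the mediator, command handlers and query handlers can evolve (or be backed by different stores) independently, which is the point of the pattern.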


What does 'network on demand' mean for enterprises?


Network on demand -- or on-demand networking -- can be delivered as either a managed network service or as cloud-based networking. In a managed network service model, a third party manages, meters and bills the infrastructure. In a cloud-based networking model, a business contracts directly with the cloud provider and makes all the decisions about its network. In either model, on-demand networking changes the dynamics from a Capex model in which customers pay upfront and amortize to a consumption-based model where users pay monthly for what they consume. Network on-demand options can be more flexible, enabling businesses to scale their network bandwidth and provision up and down to match business needs. In the on-demand world, burdens shift toward more planning and monitoring of service-level agreements and consumption versus hardware and traffic. The most logical customers for on-demand managed networking services are smaller businesses that don't have the internal resources to adequately handle networking.
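The Capex-to-consumption shift above is easy to make concrete with a little arithmetic; every figure in this sketch is made up for the example:

```python
# Illustrative comparison of upfront (Capex) vs consumption-based
# network spending.
def capex_monthly(hardware_cost: float, amortization_months: int) -> float:
    """Upfront hardware spend, amortized over its useful life."""
    return hardware_cost / amortization_months

def on_demand_monthly(base_fee: float, gb_consumed: float,
                      price_per_gb: float) -> float:
    """Pay-as-you-go: a recurring fee plus metered consumption."""
    return base_fee + gb_consumed * price_per_gb
```

With, say, $90,000 of hardware amortized over 36 months, the Capex cost is $2,500 every month regardless of usage, while the on-demand bill scales up and down with actual traffic, which is what makes the model attractive to businesses whose bandwidth needs fluctuate.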


Data is your best defence against a coronavirus downturn

Remember, good information in its many forms, including analytics, insights, predictions, diagnoses, prescriptions, and so forth, often is a lower-cost substitute for inventory, property and even money. Uber and Lyft for example have substituted information about who needs a ride and who has a car for fleets of taxis. Airbnb and HomeAway have done the same for bedrooms. Even most traditional retailers and manufacturers have been able to reduce their inventory levels, some to just-in-time inventory, based on detailed, near real-time supply and demand information. Moreover, more than 30% of companies today exchange information they collect or generate in return for goods and services from others. And this merely represents one of several ways to monetize your data. Investors themselves even seem to favor organisations that make significant investments in data and analytics. Public companies with chief data officers, data governance programs, and data science organizations command a nearly 2x market-to-book valuation over the rest of the market.


Needed: A Cybersecurity Good Samaritan Law

As the US becomes more sophisticated in protecting the digital world, physical systems are becoming a target — one with an attack surface that's relatively easy to penetrate. Gaining physical access is one of the easiest ways to hack into a network. This could include accessing paper records, installing equipment or software on the network, or simply putting in covert backdoor systems. The concept of combining physical attacks and cyberattacks to test a system is nothing new. The term "red teaming" is used in the industry to describe a method of system testing based on thinking and acting like a bad guy. Red teams help businesses to see how break-ins and business disruptions occur, to test strength and durability of their defenses, to identify where vulnerabilities exist, and to expose weaknesses that could be considered negligent and contributing to a breach. The risks of conducting red teaming increase as more bad guys hide themselves in cyberspace. Law enforcement and the legal system have the power to interpret the legality of our work.


CIO interview: Malcolm Lowe, head of IT, Transport for Greater Manchester


“The organisation has a lot of data and information,” he says. “It was in lots of pockets; people were using all sorts of different tools and techniques. We recognised there was a great opportunity for the organisation to really embrace analytics.” Lowe says his initial efforts were focused on getting people from across the organisation to understand what opportunities data might provide. He focused on showing business stakeholders what he calls “the art of the possible” through a proof of concept. “We had some spare capacity, we had some spare licences and we got a couple of data engineers to create an alpha,” he says. “I’ve got some bright people in my team. I tasked them to get as much data as they could from across the organisation for a single month. We put that data into an Azure SQL Server Data Warehouse and put Power BI over the top of it. “We found a couple of use cases across the organisation for people who were really interested in our ideas. We built something for them, they got to use it and they really liked it. I’m a big believer in people seeing something tangible...."


What is natural language processing? The business benefits of NLP explained

Natural language processing (NLP) is the branch of artificial intelligence (AI) that deals with communication: How can a computer be programmed to understand, process, and generate language just like a person? While the term originally referred to a system’s ability to read, it’s since become a colloquialism for all computational linguistics. Subcategories include natural language generation (NLG) — a computer’s ability to create communication of its own — and natural language understanding (NLU) — the ability to understand slang, mispronunciations, misspellings, and other variants in language. ... Machine translation is one of the better NLP applications, but it’s not the most commonly used. Search is. Every time you look something up in Google or Bing, you're feeding data into the system. When you click on a search result, the system sees this as confirmation that the results it has found are right and uses this information to better search in the future. Chatbots work the same way: They integrate with Slack, Microsoft Messenger, and other chat programs where they read the language you use, then turn on when you type in a trigger phrase. Voice assistants such as Siri and Alexa also kick into gear when they hear phrases like “Hey, Alexa.”
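The trigger-phrase behaviour described above can be sketched in a few lines; the wake phrases and example utterances here are illustrative only, and real assistants do this with on-device acoustic models rather than string matching:

```python
# Toy wake-phrase detector: the assistant stays idle until an
# utterance starts with one of its trigger phrases.
TRIGGERS = ("hey, alexa", "hey alexa")

def wakes(utterance: str) -> bool:
    """Return True if the utterance begins with a wake phrase."""
    text = utterance.lower().strip()
    return any(text.startswith(trigger) for trigger in TRIGGERS)
```

So `wakes("Hey, Alexa, play some music")` fires while ordinary speech is ignored, which is the same on/off gating a chatbot applies to its trigger phrase in a chat channel.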


Keeping machine learning algorithms humble and honest in the ‘ethics-first’ era


Removing the complexity of the data science procedure will help users discover and address bias faster – and better understand the expected accuracy and outcomes of deploying a particular model. Machine learning tools with built-in explainability allow users to demonstrate the reasoning behind applying ML to tackle a specific problem, and ultimately justify the outcome. First steps towards this explainability would be features in the ML tool to enable the visual inspection of data – with the platform alerting users to potential bias during preparation – and metrics on model accuracy and health, including the ability to visualise what the model is doing. Beyond this, ML platforms can take transparency further by introducing full user visibility, tracking each step through a consistent audit trail. This records how and when data sets have been imported, prepared and manipulated during the data science process. It also helps ensure compliance with national and industry regulations – such as the European Union’s GDPR ‘right to explanation’ clause – and helps effectively demonstrate transparency to consumers.
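A consistent audit trail of the kind described above is, at its simplest, an append-only log of data-prep steps; a minimal sketch, with dataset names and actions invented for the example:

```python
import datetime

# Minimal audit trail: every data-prep step is recorded with a
# timestamp so the lineage of a model's training data can be replayed.
class AuditTrail:
    def __init__(self):
        self.entries = []

    def record(self, action: str, dataset: str, detail: str = ""):
        self.entries.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "action": action,     # e.g. imported / prepared / manipulated
            "dataset": dataset,
            "detail": detail,
        })

trail = AuditTrail()
trail.record("imported", "loans.csv", "source: data lake")
trail.record("prepared", "loans.csv", "dropped rows with missing income")
```

In a real platform the log would be tamper-evident and tied to user identities, but even this shape is enough to answer "how and when was this data set manipulated?" for an audit.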


Decipher the true meaning of cloud native


The definition of cloud native has become more confusing as organizations and IT professionals incorporate it into their everyday usage, despite defining the term in different ways. The most oft-cited definition is the murky CNCF definition that was introduced in 2018. That cloud native definition mostly reiterates the points that the CNCF made when it launched in 2015, though it does emphasize some concepts not included at the CNCF launch, such as automation, observability and resiliency. Still, the current CNCF definition doesn't explain exactly what counts as cloud native and what doesn't. That is, unless you think any type of application that uses containers and microservices or relies on automation or resiliency counts as cloud native. ... At a high level, certain technologies, like containers and microservices, form an important part of what many people consider to be cloud native. Yet, there is virtually no specific guidance from any organization regarding how, exactly, these technologies need to be used in order for an app to meet the requirements of the cloud native definition.



Quote for the day:


"What great leaders have in common is that each truly knows his or her strengths - and can call on the right strength at the right time." -- Tom Rath


Daily Tech Digest - March 16, 2020

How Machine Learning, A.I. Might Change Education


One area in which A.I. intersects with student learning is in ethics. Some studies are already exploring the ethical issues of replacing teachers with bots. However, although bots can enhance education, they can’t replace teachers, according to Bernhardt L. Trout, professor of chemical engineering and director of Society, Engineering, and Ethics at the Massachusetts Institute of Technology. Trout argues that A.I. can enrich the learning of students as they master skills, languages and basic math, but it can’t help students learn creativity or critical thinking. “Bots will not be able to decide for us what is good, although they might be able to help us learn better the issues around the decision of what is good,” he said. “Bots are limited in making certain choices about education in ways that human beings are not limited, so this is where we get into the more ethical issues.” Trout sees bots teaching themes or the usage of certain words, for example, but they may be limited in helping students critique literature. He believes a bot is unable to teach the essential concepts needed to understand the work of philosophers such as Plato or Dante, or painters such as Michelangelo: “That’s where I think there is an intrinsic limitation.”



Rethinking change control in software engineering


When organizations mandate that their ops teams focus solely on stability, change control can quickly become change prevention, much to the chagrin of development teams that are mandated to continuously update and deliver new features. With DevOps now inverting the traditional IT delivery model, the question becomes: Can change control still work in the way it was intended? It's likely that small, software-focused organizations running in the cloud won't use the term change control. They may just execute deployments when it makes sense, especially if the team doesn't yet charge for their services, or they have a way to turn a new service on for only a limited number of users. On the other end, large organizations that still run COBOL tend to use monolithic ticketing systems to manage permissions and change approvals. However, most teams probably find themselves somewhere in the middle of these two extremes, leaving them in a place where they need to find a realistic balance between both the resiliency and flexibility of feature deployments.
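The "turn a new service on for only a limited number of users" approach mentioned above is commonly done with a deterministic percentage rollout keyed on user ID; a sketch, with the flag name and thresholds purely illustrative:

```python
import hashlib

# Deterministic percentage rollout: hash the flag and user ID into a
# stable bucket 0-99, then compare against the rollout percentage.
def enabled(flag: str, user_id: str, rollout_percent: int) -> bool:
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100      # same user -> same bucket, always
    return bucket < rollout_percent
```

Because a given user always lands in the same bucket, a team can deploy continuously and widen exposure from 1% to 100% (or snap back to 0%) without a ticketing system, which is the middle ground between change prevention and uncontrolled change.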


What is the internet backbone and how it works

Like any other network, the internet consists of access links that move traffic to high-bandwidth routers that move traffic from its source over the best available path toward its destination. This core is made up of individual high-speed fiber-optic networks that peer with each other to create the internet backbone. The individual core networks are privately owned by Tier 1 internet service providers (ISP), giant carriers whose networks are tied together. These providers include AT&T, CenturyLink, Cogent Communications, Deutsche Telekom, Global Telecom and Technology (GTT), NTT Communications, Sprint, Tata Communications, Telecom Italia Sparkle, Telia Carrier, and Verizon. By joining these long-haul networks together, Tier 1 ISPs create a single worldwide network that gives all of them access to the entire internet routing table so they can efficiently deliver traffic to its destination through a hierarchy of progressively more local ISPs. In addition to being physically connected, these backbone providers are held together by a shared network protocol, TCP/IP. TCP/IP is actually two protocols, the Transmission Control Protocol and the Internet Protocol, which set up connections between computers, ensuring that the connections are reliable and formatting messages into packets.
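The TCP/IP split above is visible in ordinary socket code: IP moves packets between addresses, while TCP layers a reliable, ordered connection on top. A self-contained sketch over the loopback interface:

```python
import socket

# TCP over IPv4 (SOCK_STREAM + AF_INET). The server binds port 0 so
# the OS picks any free port; the client then connects, which performs
# the TCP three-way handshake before any data flows.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))     # connection setup (handshake)
conn, _ = server.accept()

client.sendall(b"hello backbone")       # TCP segments the bytes into
data = conn.recv(1024)                  # packets and delivers them
                                        # reliably and in order
conn.close(); client.close(); server.close()
```

The application never sees packets, retransmissions, or routing; that division of labour between TCP and IP is exactly what lets independently owned backbone networks interoperate.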


Working from home: Your common challenges and how to tackle them


Interruptions come from outside, like a knock at the door from a delivery driver asking you to take in a parcel for a neighbour. Other potential interruptions: family, pets and friends who fail to understand that even though you are at home, you are still working. Closed doors, do-not-disturb signs and noise-cancelling headphones all come in handy. Distractions are slightly different. These are mostly the result of being in a different environment to the one you are used to, which means habits are disrupted and priorities get muddled. In the office your priorities are (mostly) well defined – you're there to work. At home your priorities are different: having fun, cooking, eating, cleaning, watching TV – almost by definition everything not work-related. Bringing work into the home, especially if it's for the first time, especially now, confuses all of this. It also makes you think you can combine the two, which is why you'll try to wash the dishes while on a conference call (and yes, everyone will know). Here the solution is around building a new work routine so that focusing is easier. That's why every set of remote working tips talks about getting up and getting dressed, and attempting to work regular hours.


Telehealth and Coronavirus: Privacy, Security Concerns

Keith Fricke, principal consultant at tw-Security, notes that it's critical for healthcare entities to take a number of critical security measures when using telemedicine applications. That includes ensuring the transmission of information over the internet is encrypted and making sure that the endpoints where telehealth transmissions begin and end are secured, he notes. "I don't think these risks are heightened by the coronavirus," he says. "However, a rush to establish new telehealth applications or a rush to expand existing ones to meet demands driven by COVID-19 can lead to overlooking important controls necessary to maintain security and privacy of information. "As with any technology deployment involving the storage, processing or transmission of PHI or other confidential information, it is important to implement telehealth services with the appropriate technical, physical and administrative controls." As the use of telemedicine expands in dealing with the outbreak, new risks will also evolve, Fricke adds.


When Will 100% Remote Be an Accepted Norm?

Picture yourself graduating from college in the 1980s or 1990s, ready to change the world with your college degree and your freshly polished programming skills. Depending on the year you started working in the industry, you might have to share a terminal to write the program code required to complete your job. The idea of having a computer at your desk wasn't a reality. For those starting a little later, you might have a computer at your desk, but it is merely a client to a host system housing your program logic and processing power. The system you programmed on was in near proximity to you. Later, that idea was broken into an application server and some type of data store or database. There wasn't a cloud option to host your system, but some did have custom connectivity to align data centers across private corporate networks. Remember, the internet wasn't a "thing" we could rely on, yet. There was a fleet of programmers who grew up in this reality. 


How SIT and UAT differ
With user acceptance testing, the development organization gives the build or release to the target consumer of the system, late in the software development lifecycle. Either the end users or that organization's product development team perform expected activities to confirm or accept the system as viable. The UAT phase is typically one of the final steps in an overall software development project. ... Testers who evaluate functionality as it's delivered are usually prepared to also check application functionality as a whole, integrated solution. SIT is often a more technical testing process than UAT. Testers design and execute SIT, as they've become familiar with the types of defects common in the application throughout the SDLC. The SIT phase precedes UAT. Because the technical expertise between users and testers varies significantly, the two demographics are likely to find vastly different defects between UAT and SIT. SIT often uncovers bugs unit tests didn't catch -- defects that rely on a workflow or interaction between two application components, such as a sequential order of operations.
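The "defects that rely on a workflow or interaction between two application components, such as a sequential order of operations" point is worth a concrete illustration. In this toy example, every name is hypothetical: each function passes its unit tests in isolation, but the integrated checkout wires them together in the wrong order, a bug only an integration test over the whole flow would catch:

```python
def apply_coupon(total: float) -> float:
    """Flat $5 off. Correct when unit-tested alone."""
    return total - 5.0

def add_tax(total: float) -> float:
    """20% tax. Correct when unit-tested alone."""
    return total * 1.2

def checkout_wrong(total: float) -> float:
    # Integration bug: tax is applied first, so the coupon
    # escapes taxation and the totals drift apart.
    return apply_coupon(add_tax(total))

def checkout_right(total: float) -> float:
    # Required order of operations: coupon first, then tax.
    return add_tax(apply_coupon(total))
```

On a $100 basket the wrong order yields $115 and the right order about $114; both components are individually correct, so only system integration testing of the composed workflow exposes the defect.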


How Red Hat tackles security


In the just-released Red Hat Product Security Report 2019, Red Hat said it's seeing more customers than ever trying to grapple with ever-mounting security issues by using third-party scanners. But, the report noted, "While scanning tools can provide a useful 'single pane of glass' view of vulnerabilities across an enterprise-wide environment, they generally do a poor job of articulating risks specific to a technology or implementation." So, Red Hat Engineering and Red Hat Product Security both work to explain exactly what each security issue means and to make Red Hat's "upstream packages enterprise-ready by regression testing, hardening, and tweaking the package to meet our customers' unique business demands and our release standards." To help improve this process, Red Hat made a fairly sizable change to the Red Hat Enterprise Linux (RHEL) support life cycle. Because "RHEL is the foundation of all of our products and services, we felt it was important to expand the scope of what we supported." So, RHEL now includes patches and fixes for Important-rated issues, which typically cover the largest share of issues. Previously, Red Hat was more selective about which Important-rated issues were addressed in RHEL's Extended Update Support.


Banks are adopting account aggregator framework on data

Account aggregators are responsible for transferring, but not storing, client data. An AA ecosystem, as envisaged by the Reserve Bank of India (RBI), would be a platform for financial services companies to reach out to the consumer to seek consent before using their personal data to optimise their product offerings. "All the work is going in that direction. We are coordinating between the ecosystems. The scale-up will see a hockey stick effect. These banks and AA companies are part of the first wave. Many are waiting to join the second wave," said BG Mahesh, a cofounder of Sahamati, a non-profit collective of account aggregators. So far, Cams Finserv, FinSec AA Solutions and Cookiejar Technologies have received operating licences from the Reserve Bank of India. Kotak Mahindra Group said it was launching a pilot among 50,000 employees to test use cases for the AA framework in banking, broking, wealth management, and insurance. "As we speak, we are launching a pilot with our employees before we launch it for our customers."


Making Your Code Faster by Taming Branches


Most software code contains conditional branches. In code, they appear in if-then-else clauses, loops, and switch-case constructs. When encountering a conditional branch, the processor checks a condition and may jump to a new code path, if the branch is taken, or continue with the following instructions. Unfortunately, the processor may only know whether a jump is necessary after executing all instructions before the jump. For better performance, modern processors predict the branch and execute the following instructions speculatively. It is a powerful optimization. There are some limitations to speculative execution, however. For example, the processor must discard the work done after the misprediction and start anew when the branch is mispredicted. Thankfully, processors are good at recognizing patterns and avoiding mispredictions, when possible. Nevertheless, some branches are intrinsically hard to predict and they may cause a performance bottleneck. Programmers can be misled into underestimating the cost of branch mispredictions when benchmarking code with synthetic data that is either too short or too predictable.
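One common way to tame a hard-to-predict branch is to rewrite it as straight-line arithmetic, so the hot loop contains no data-dependent jump at all (compilers often do this for you via conditional-move instructions). The sketch below only illustrates the transformation itself; in Python the hardware effect is masked by interpreter overhead, so measuring a real speedup requires compiled code.

```python
def clamped_sum_branchy(values, threshold):
    # Data-dependent branch: hard for the predictor when values are random.
    total = 0
    for v in values:
        if v >= threshold:
            total += v
    return total


def clamped_sum_branchless(values, threshold):
    # Branchless form: the comparison becomes a 0/1 multiplier, so there is
    # no conditional jump on the data inside the loop body.
    total = 0
    for v in values:
        total += v * (v >= threshold)
    return total
```

Both functions compute the same result; the difference is only in whether the loop body contains a conditional jump that depends on the data.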



Quote for the day:


"You can't lead anyone else further than you have gone yourself." -- Gene Mauch


Daily Tech Digest - March 15, 2020

The rising threat of drones to cybersecurity: What you need to know

While it may seem impossible for a drone to affect cybersecurity, there are several factors that make it entirely possible for drones to carry out many malicious cybercrimes. For instance, drones equipped with cameras have been associated with spying. In fact, there have been many arrests for drone spying — and that’s not all a drone can do. In addition to taking bird’s-eye pictures and video, drones can also be used to spy on networks, capture data and block communications, making them a huge threat to cybersecurity as a whole. The fact that drones carry this type of threat to cybersecurity is due to their vast capabilities. In addition to cameras, many drones come equipped with GPS, USB ports, and other means that can easily allow them to be hijacked. Hackers can use tools to easily tap into drones if the owner doesn’t install certain security measures. This leaves many commercial drones at risk of exploitation due to the fact that they communicate with their operators via WiFi and GPS, which often tend to be unencrypted. With all that a drone can do, it comes as no surprise that they pose such a risk to cybersecurity. In addition to the privacy issue and the fact that drones are vulnerable to hackers, previous incidents prove how risky the small aircraft can be.



The report also highlighted that just 9% of security professionals are neurodivergent, although meaningful and reliable comparison of this measure against the wider industry is not yet possible – DCMS nevertheless said it found a concerning lack of awareness of neurodiversity in the sector. The research process highlighted a number of barriers and challenges to increasing the diversity of Britain’s cyber security workforce. DCMS said that while diversity was seen as more important, there remain pockets of scepticism, with some interviewees claiming the topic was overemphasised, or no worse than in other digital sectors, and therefore not a problem. Many respondents also said they did not view a diverse workforce as a means to help tackle the skills shortage in security, focusing instead on non-specific benefits. This is in spite of a growing and substantial body of evidence that proves diverse teams are a hugely important factor in building a responsible organisation.


Learning Data Science Skills Is Easier Than You Think

Data skills are valuable across all industries and job functions as decision making is becoming more and more data-driven, and gaining these skills isn’t as challenging as originally thought. The Burning Glass report states that “the demand for metrics — and the growing ease of measuring and visualizing them — is reshaping business practices across industries,” citing marketing and business analysis as examples. It also highlights the demand for data science and analytics skills in decision-making roles, including managers across a range of industries. So, where to start? IBM Data Scientist Joseph Santarcangelo, Ph.D., shared his expertise on getting started in data science, which starts with learning Python: “Today with data science, for a lot of it you don’t have to have a Ph.D. anymore. You don’t have to spend years and years studying something. The runway is a lot shorter this year for data science...now all you really have to know is Python and have a basic understanding of what’s going on and it’s pretty remarkable where you can go.”
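A taste of that "basic Python" entry point, using only the standard library — no PhD, no heavyweight framework. The sales figures below are made-up example data.

```python
from statistics import mean, median, stdev

monthly_sales = [120, 135, 128, 160, 155, 170, 180, 165, 190, 210, 205, 230]

# Descriptive statistics: the first step in almost any analysis.
print(f"mean:   {mean(monthly_sales):.1f}")
print(f"median: {median(monthly_sales):.1f}")
print(f"stdev:  {stdev(monthly_sales):.1f}")

# A first "insight": average month-over-month growth rate.
growth = [(b - a) / a for a, b in zip(monthly_sales, monthly_sales[1:])]
print(f"average monthly growth: {mean(growth):.1%}")
```

From here, the same questions scale up naturally to libraries like pandas and NumPy, but the underlying reasoning is already visible in a dozen lines.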



Data Experts Say New Sources Must Not Replace Traditional Data


Participants added that new institutional frameworks, including legal guidelines, are needed to manage the influx of new technologies. Lisa Bersales, the first National Statistician of the Philippines, called for quality-assurance frameworks for big data and citizen-generated data. Gero Carletto, World Bank, noted that risks arise from the lack of standards for integrating different data sources. The speakers also advised caution about the “recent boom” in public-private data partnerships, suggesting that they must be managed carefully. Fredy Rodriguez, Cepei, explained the need for partnerships to establish an effective institutional framework in order to share data and determine how shared data will be used. Finally, the discussion drew attention to the evolving role of the National Statistics Offices (NSOs). Experts said NSOs’ mandate has evolved significantly in the past few years; no longer just producers of data, they are now responsible for coordinating a broad data ecosystem of entities across government, civil society, and the private sector, and for brokering new partnerships to produce, clean, compile, and analyze data to produce official statistics. In effect, NSOs have become “data stewards.”


Digital transformation: 3 ways to ease the fear factor

Convey what the state of the business could be like without digital transformation. Understanding that the company’s future could be at risk and that their skills will become obsolete with antiquated legacy systems will likely have a significant impact on everyone. Remind employees that digital change is about designing and delivering better products and services and that this is why many people get involved with IT in the first place – to make a positive change. Positioning change in this way can help everyone see it through a different lens. Be direct and honest in all your communications, especially with employees who actively oppose change. State what the goals are, what the rollout will look like, and what the benefits will be for customers and partners as well as employees. Create a conversation and openly acknowledge concerns. Don’t shy away from difficult conversations – these are the ones employees will focus on, and failing to engage in them will drive the message that change is unpalatable.


How to use digital twins to reduce risk

In the last few years, the term "digital twin" has entered the lexicon, likely as a result of overzealous consultants applying a complicated name to a simple concept. A digital twin is nothing more than a computer simulation of something in the physical world. The Cessna I careened through the skies of Chicago on my monochrome monitor as a youth was a digital twin, just as a spreadsheet predicting next year's sales can also be a digital twin, as they both aim to simulate a future outcome using data and logic. Digital twins are incredibly valuable for the rather obvious reason that they can help you gather key insights and model potential future outcomes at a fairly low cost, thus de-risking larger investments. Consider my early experiments in flying an aircraft. For $40 or so I was able to crash my "digital twin" of a $200,000 aircraft multiple times before gathering the critical insight that I needed to pull back on the stick instead of pushing forward. In a more relevant recent example, I worked with a client who was trying to determine if the logistical costs of a complex distribution network could be sustained at a price customers were willing to pay. 
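In the spreadsheet-simulation sense described above, a digital twin can be as small as a few dozen lines. The toy model below simulates a distribution network's per-order cost under uncertain fuel prices and order volumes; every number in it is invented for illustration, not drawn from the client engagement mentioned in the article.

```python
import random


def simulate_cost_per_order(n_runs=10_000, seed=42):
    """Monte Carlo twin of a simple distribution network's daily cost."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        fuel_price = rng.gauss(mu=3.50, sigma=0.40)    # $/gallon, uncertain
        orders_per_day = rng.gauss(mu=1200, sigma=150)  # demand, uncertain
        fixed_daily_cost = 8000                         # $ (warehouses, fleet)
        fuel_cost = fuel_price * 900                    # gallons burned per day
        results.append((fixed_daily_cost + fuel_cost) / max(orders_per_day, 1))
    return results


costs = sorted(simulate_cost_per_order())
print(f"median cost/order: ${costs[len(costs) // 2]:.2f}")
print(f"95th percentile:   ${costs[int(len(costs) * 0.95)]:.2f}")
```

Crash the twin, not the business: if the 95th-percentile cost exceeds what customers will pay, that insight cost a simulation run rather than a real network.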


For many industrial networks, the highest standard of security is an "air gap," a physical disconnect between the inner sanctum of software connected to physical equipment and the less sensitive, internet-connected IT systems. But very few private-sector firms, with the exception of highly regulated nuclear power utilities, have implemented actual air gaps. Many companies have instead attempted to restrict the connections between their IT networks and their so-called OT or operational technology networks—the industrial control systems where the compromise of digital computers could have dangerous effects, such as giving hackers access to an electric utility's circuit breakers or a manufacturing floor's robots. Those restricted connections create choke points for hackers, but also for remote workers. Rendition InfoSec founder and security consultant Jake Williams describes one manufacturing client that carefully separated its IT and OT systems. Only "jump boxes," servers that bridge the divide between sensitive manufacturing control systems and nonsensitive IT systems, connected them. Those jump boxes run very limited software to prevent them from serving as in-roads for hackers.


Zero trust: Taking back control of IT security


They say: “Zero trust changes the traditional model of ‘trust, but verify’ – where you assume that any device or asset attached to your internal network is likely to be permitted and safe to access internal-only resources, but still verify that this is the case. Instead, that becomes ‘never trust, always verify’ – where every device must pass authentication and security policy checks to access any corporate resources, and to control access only to the extent required.” Trust involves an interplay between people and technology. According to Walsh and Grannells, the starting point for these trust factors is a well-thought-out and up-to-date set of policies, standards, procedures and work practices, supplemented by detailed, up-to-date network documentation and asset inventories covering information, software licences and hardware. The pair believe zero trust enables IT security to regain control. “The shift to zero trust is where information security is taking back control of the many new perimeters of the corporate ecosystem,” they say.
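The "never trust, always verify" rule can be pictured as policy code: every request is evaluated on its own merits, and being inside the network grants nothing. The attributes and thresholds below are invented for illustration; real zero-trust products evaluate far richer signals (device posture, user risk scores, geolocation, and more).

```python
from dataclasses import dataclass


@dataclass
class AccessRequest:
    user_authenticated: bool
    device_compliant: bool      # e.g. patched, disk encrypted, EDR running
    mfa_passed: bool
    resource_sensitivity: str   # "low" or "high"


def authorize(req: AccessRequest) -> bool:
    # Every request is verified; there is no "trusted internal network"
    # shortcut anywhere in this function.
    if not (req.user_authenticated and req.device_compliant):
        return False
    if req.resource_sensitivity == "high" and not req.mfa_passed:
        return False
    return True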


How do we stay smarter than our smart home devices?


It’s difficult to argue with the statement that connected devices do already enrich our lives and will continue to do so more impressively in the near and distant future. The not-so-great news? IoT manufacturers really need to step up their cybersecurity game. Many are already working tirelessly to do so, but as many are pretty much starting from scratch, they have their work cut out for them. With more than half of companies failing to require third-party security and privacy compliance, it’s no surprise that in the past couple of years we’ve seen connected-device data breaches almost double, going from 15% to 26%. Furthermore, some of these incidents encroached on people’s privacy in a very alarming way. Remember when Amazon’s Alexa recorded a private conversation and sent the content to a random contact of the user? Or when we learned that our BFF, iRobot’s Roomba, can actually map our homes and share this information? But with incredible devices like smart thermostats that can save us money – and even save our lives by turning off the stove if it’s on for too long – giving up on IoT because of its cybersecurity flaws is not an option.


FortiGuard Labs’ Derek Manky Talks Swarm Attacks, War of Deception

Using swarm technology, intelligent swarms of bots can share information and learn from each other in real time. They could target a network, attacking multiple systems at the same time, and overwhelming the network because of the sheer number of attacks and speed at which they occur. “This is a way they could weaponize it, particularly with 5G being rolled out, which means a lot of devices can communicate really quickly together and that’s when you have a swarm,” Manky said. “You have connected devices that communicate, and if you hook up an AI system to that, those devices can launch an attack on their own. It looks quite scary.” On the bright side, organizations can still get ahead of these types of attacks, Manky said. This starts with basic cybersecurity hygiene, which, unfortunately, is something many companies still struggle with. “You need a proper security architecture, segmentation,” which reduces a company’s attack surface by essentially sealing off workloads from the rest of the network, thus preventing hackers from gaining access to the wider system.



Quote for the day:


"Trust is the highest form of human motivation." -- Stephen R. Covey


Daily Tech Digest - March 14, 2020

Data Science Is Now Bigger Than 'Big Data'

The now-ubiquitous term “big data” begins its meteoric rise in lockstep with cloud computing’s fall, suggesting that the public’s focus on hardware rental was rapidly replaced with how all of that computing power was being used: to analyze massive datasets. In contrast, “data science” and “deep learning” both take off in 2013 and accelerate over 2014. Interestingly, despite deep learning’s Cambrian Explosion over the past few years, search interest appears to have leveled off as of last January, perhaps suggesting that we are now searching more for the individual applications of deep learning rather than the phrase itself. Most significantly, as of January of this year, “data science” has surpassed “big data” in total search volume. Just as cloud computing’s hardware focus gave way to big data’s emphasis on what we do with all that hardware, so too has the focus shifted now from assembling huge piles of data to the people and processes making sense of all of that data. While it may be entirely coincidental, it is interesting to note that data science and deep learning burst into popularity in the immediate aftermath of Edward Snowden’s June 2013 disclosures, raising questions of whether vastly increased public awareness of data mining led to increased interest in those fields.



How to write a business continuity plan: the easy way

The most obvious reason to implement a BCP is to ensure that your organisation remains productive in the event of a disruption. Customers must still be able to use your services, employees must be able to continue doing their job and you can’t allow yourself to face a huge backlog of work as delays continue. But business continuity isn’t only about short-term goals. The cyber security landscape has become increasingly volatile in recent years, with cyber crime continuing to spiral and organisations’ reliance on technology leading to vast numbers of accidental and deliberate data breaches. As a result, organisations need to prove to customers and stakeholders that they are prepared for anything. Business continuity is especially important for OES (operators of essential services) and DSPs (digital service providers), as disruption to their services could be widespread and cause major headaches. To ensure that such organisations are sufficiently prepared for risks, the EU adopted the NIS Directive, which was transposed into UK law as the NIS (Network and Information Systems) Regulations 2018.


New Flat Lens Enables Focus-Free Cameras With Drastically Reduced Weight


“Our flat lenses can drastically reduce the weight, complexity and cost of cameras and other imaging systems, while increasing their functionality,” said research team leader Rajesh Menon from the University of Utah. “Such optics could enable thinner smartphone cameras, improved and smaller cameras for biomedical imaging such as endoscopy, and more compact cameras for automobiles.” In Optica, The Optical Society’s (OSA) journal for high impact research, Menon and colleagues describe their new flat lens and show that it can maintain focus for objects that are about 6 meters apart from each other. Flat lenses use nanostructures patterned on a flat surface rather than bulky glass or plastic to achieve the important optical properties that control the way light travels. “This new lens could have many interesting applications outside photography such as creating highly efficient illumination for LIDAR that is critical for many autonomous systems, including self-driving cars,” said Menon. Conventional cameras, whether used in smartphones or for microscopy, require focusing to ensure that the details of an object are sharp. If there are multiple objects at different distances from the camera, each object must be focused separately.


Open-source security: This is why bugs in open-source software have hit a record high


A large source of newly found bugs comes from Google's open-source fuzzing tools, such as OSS-Fuzz, which by 2018 had helped find 9,000 flaws in two years. As of January 2020, it's helped find 16,000 bugs in 250 open-source projects. WhiteSource found that 85% of open-source vulnerabilities are disclosed and have a fix already available. However, it notes that some users are not aware of these fixes because only 84% of known open-source bugs make it to the National Vulnerability Database (NVD). "Information about vulnerabilities is not published in one centralized location, rather scattered across hundreds of resources, and sometimes poorly indexed – often making searching for specific data a challenge," it notes. WhiteSource last year brought its vulnerability database to GitHub to support its security-alerts service. GitHub scans project dependencies for vulnerabilities in projects written in PHP, Java, Python, .NET, JavaScript and Ruby. It's helped developers find and fix millions of known flaws in dependencies. "Our concern is that, while these tools will help to report vulnerability issues in a proper manner, they will probably only aggravate the issue with software developers who are already struggling to keep up with the increased rate," WhiteSource notes.
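At its core, the dependency scanning described above is a matching problem: compare a project's pinned dependencies against a database of known advisories. The sketch below shows that core idea in miniature; the package names, versions, and advisory IDs are all made up, and real scanners also handle version ranges, transitive dependencies, and multiple advisory feeds.

```python
# Hypothetical advisory database: package -> (vulnerable version, ID, fix).
KNOWN_VULNS = {
    "examplelib": [("1.2.0", "EXAMPLE-2020-001", "1.2.1")],
    "othertool":  [("0.9.4", "EXAMPLE-2020-002", "0.10.0")],
}


def scan(dependencies):
    """Return advisories matching exactly pinned dependency versions."""
    findings = []
    for pkg, version in dependencies.items():
        for vuln_version, advisory, fixed_in in KNOWN_VULNS.get(pkg, []):
            if version == vuln_version:
                findings.append(f"{pkg} {version}: {advisory} (fix: {fixed_in})")
    return findings


project = {"examplelib": "1.2.0", "safelib": "3.0.0"}
for finding in scan(project):
    print(finding)
```

The hard part in practice is not the matching loop but the data: as the article notes, advisory information is scattered across hundreds of poorly indexed sources rather than one centralised feed.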


Phishing attacks exploit YouTube redirects to catch the unwary

Attackers are increasingly exploiting the fact that email gateways turn a blind eye to links to popular sites such as YouTube, in order to phish passwords from unsuspecting computer users. Researcher Ashley Trans of Cofense highlighted the threat in a blog post describing a recent phishing campaign. In the attack, an unsuspecting user receives an email which purports to come from SharePoint, claiming that a new file has been uploaded to his company’s SharePoint site. ... Closer examination reveals that although the link in the email does indeed point initially at YouTube (youtube.com), it also sends a series of parameters telling YouTube to redirect any traffic to a URL at <companyname>[.]sharepointonline-ert[.]pw, which in turn ultimately takes the user’s browser to its final destination: a phishing page hosted on a legitimate Google site, googleapis.com. ... The disappointing truth is that YouTube provides a method for anyone to create a link at youtube.com, which automatically redirects browsers to third-party phishing sites without any warning.
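From the defender's side, the lesson is that a link's domain alone proves nothing: mail-filter logic has to look past a trusted domain and inspect redirect parameters carried in the query string. The sketch below shows that generic check; the parameter names it looks for are common examples rather than an exhaustive list, and the malicious URL is invented.

```python
from urllib.parse import parse_qs, urlparse

# Query parameters commonly used to carry a redirect destination.
REDIRECT_PARAMS = {"q", "url", "redirect", "redirect_uri", "next", "u"}


def embedded_redirect_targets(link):
    """Return any full URLs smuggled inside the query string of `link`."""
    query = parse_qs(urlparse(link).query)
    targets = []
    for param, values in query.items():
        if param.lower() in REDIRECT_PARAMS:
            for value in values:
                # Only flag values that are themselves complete URLs.
                if urlparse(value).scheme in ("http", "https"):
                    targets.append(value)
    return targets


# A link that "points at YouTube" but carries its real destination along:
link = "https://www.youtube.com/redirect?q=https://evil.example/login"
print(embedded_redirect_targets(link))  # ['https://evil.example/login']
```

A gateway that whitelists youtube.com without running a check like this waves the phishing link straight through.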


CIO interview: Miguel Rio Tinto, Emirates NBD


It has been an enormous effort and a challenging journey, with the organisation completely changed to adopt agile practices. The company is between 65% and 75% of the way through its digital transformation cycle. “We still have some important milestones to achieve but by the end of 2020, we will be working and using the same technologies as cloud-natives and we will be comprised of 100% cloud-enabled, agile teams who collaborate with the business.” To achieve his mission, Rio Tinto has a Dubai-headquartered technology operation of around 1,200 internal and external staff. “We brought in managers from banks in Australia, Canada, the US, Turkey, Europe, India and Dubai. Half of the managers are new to the organisation and over the course of 18 months, half of our engineers were also replaced,” he says. The 1,200 IT staff are arranged into 60 different sets of “squads”, which directly collaborate with business units, including retail, wholesale and enterprise.


Want To Be A Cyber Security Pro? It Goes Way Beyond Learning To Code

Learning Linux basics, such as terminal usage, SSH (Secure Shell), users and permissions, processes, networking and databases, can be very handy as well. Those not accustomed to the Linux environment and its command line can first learn them using web resources and tutorials. Core Linux commands, input/output redirection and piping, file manipulation, basic network configuration and user account management are some of the key things to focus on here, and they can be incredibly useful for security expertise later on. But as cybersecurity is a broad field, experts also need a solid grasp of networking, and for this, learners may have to spend hundreds of hours studying the nitty-gritty of company networks and how hackers break into them to gain access to sensitive data. According to many security experts, professionals in the space may also choose to learn more about how networks and systems operate and less about programming. Network security specialists identify, anticipate and fix security threats to computer networks. They additionally perform an essential function in protecting the integrity and secrets of a company’s data and information systems.


Microsoft: WSL2's Linux kernel will be delivered to Windows 10 users via Windows Update


Specifically, Microsoft has decided to remove the Linux kernel from the Windows OS image with WSL2. Instead, the company will deliver it to users' machines using Windows Update. Users will be able to manually check for new kernel updates by clicking the "Check for Updates" button or by waiting for Windows to do this automatically. "Our end goal is for this change to be seamless, where your Linux kernel is kept up to date without you needing to think about it. By default this will be handled entirely by Windows, just like regular updates on your machine," said Microsoft Program Manager Craig Loewen in a blog post today outlining the coming change. Loewen noted that initially, Windows 10 2004 users and Insider testers using Slow Ring preview builds will temporarily need to manually install the Linux kernel. They'll receive within "a few months" an update that will add automatic install and servicing capabilities. (In fact, Slow Ring testers just got today, March 13, a new Windows 10 2004 test build, 19041.153, which includes this servicing change to WSL2.)


Commission Calls for Revamping US Cybersecurity


The commission, which was mandated under the 2019 National Defense Authorization Act, is co-chaired by Sen. Angus King, I-Maine, and Rep. Mike Gallagher, R-Wis. It also includes Trump Administration officials. Its mission is to develop "a consensus on a strategic approach to defending the United States in cyberspace against cyberattacks of significant consequences," according to the report. The report lists China, Russia, Iran and North Korea as major threats to cybersecurity in the U.S., pointing at intellectual property theft carried out by Chinese operators and the election meddling carried out by Russian actors that has damaged public trust in the integrity of American elections. The report puts much of its emphasis on election security and how other countries are attempting to manipulate the vote through hacking and disinformation. "If we don't get election security right, deterrence will fail and future generations will look back with longing and regret on the once powerful American Republic and wonder how we screwed the whole thing up," King and Gallagher note in the report.


Maintaining Mental Health on Software Development Teams

Work-related anxiety and mental disorders are becoming a common challenge among tech companies. According to the International Journal of Social Sciences, software developers have a considerably higher chance of experiencing fatigue, burnout, anxiety, and stress, compared to their colleagues who perform mechanical tasks. Deteriorating mental health not only threatens the wellbeing of employees, but the companies’ overall productivity. Researchers from the Institute of Software Technologies in Stuttgart found that mentally-exhausted or depressed developers produce a lower quality of code and tend to miss deadlines. Today, tech companies are realizing the importance of mental health and taking action to ensure their dedicated development teams stay healthy and sane. Here, at Beetroot, we strive to create a homely and comfortable atmosphere that minimizes the pressure felt on our teams. However, despite our best efforts, there are still challenging times. We recently spoke with our HR representative and psychologist, Vova Vovk, about mental health.



Quote for the day:


"If you are not willing to give a less experienced qualified professional a chance, don't complain you are charged double for a job worth half." -- Mark W. Boyer