Daily Tech Digest - February 06, 2020

Is your CISO stressed? According to Nominet, they are

Overworked CISOs would sacrifice their salary for a better work-life balance, according to the research. Investigating the causes of CISO stress, the research found that almost all CISOs are working beyond their contracted hours, on average by 10 hours per week. And the report suggests that even when they are not at work, many CISOs feel unable to switch off. As a result, CISOs reported missing family birthdays, holidays, weddings and even funerals. They’re also not taking their annual leave, sick days or time for doctor appointments — contributing to physical and mental health problems. The key findings: 71% of CISOs said their work-life balance is too heavily weighted towards work; 95% work more than their contracted hours — on average, 10 hours longer a week — which means CISOs are giving organisations $30,319 (£23,503) worth of extra time per year; only 2% of CISOs said they were always able to switch off from work outside of the office, with the vast majority (83%) reporting that they spend half or more of their evenings and weekends thinking about work.



This latest phishing scam is spreading fake invoices loaded with malware


The attachment claims the user needs to 'enable content' in order to see the document; if this is done, malicious macros and malicious URLs deliver Emotet to the machine. Because Emotet is such a prolific botnet, the malicious emails don't come from any one particular source, but rather from infected Windows machines around the world. If a machine falls victim to Emotet, not only does the malware provide a backdoor into the system, allowing attackers to steal sensitive information, it also allows the attackers to use the machine to spread additional malware – or lets other hackers exploit compromised PCs for their own gain. The campaign spiked towards the end of January and, while activity has dropped for now, financial institutions are still being targeted with Emotet phishing campaigns. "We are continuing to see Emotet traffic, though the intensity has reduced considerably," Krishnan Subramanian, researcher at Menlo Labs, told ZDNet. To protect against Emotet, it's recommended that users be wary of documents asking them to enable macros, especially if they come from an untrusted or unknown source. Businesses can also disable macros by default.
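
As an illustration of the 'enable content' lure, a defensive filter can often spot macro-bearing attachments before a user ever opens them. A minimal sketch in Python (the function name and heuristic are illustrative, not taken from any particular security product):

```python
import io
import zipfile

def has_vba_macros(doc_bytes: bytes) -> bool:
    """Heuristic macro check for modern Office documents.

    .docx/.docm files are ZIP archives; documents carrying VBA macros
    include a vbaProject.bin entry. Legacy binary .doc files are not
    ZIPs, so we flag them for manual review rather than guessing.
    """
    try:
        with zipfile.ZipFile(io.BytesIO(doc_bytes)) as zf:
            return any(name.endswith("vbaProject.bin") for name in zf.namelist())
    except zipfile.BadZipFile:
        return True  # unknown/legacy format: treat as suspicious
```

A mail gateway could quarantine any attachment for which this returns True, complementing the "disable macros by default" policy mentioned above.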


Research network for ethical AI launched in the UK


The initiative is being led by the Ada Lovelace Institute, an independent data and AI think tank, in partnership with the Arts and Humanities Research Council (AHRC), and will also seek to inform the development of policy and best practice around the use of AI. “The Just AI network will help ensure the development and deployment of AI and data-driven technologies serves the common good by connecting research on technical solutions with understanding of social and ethical values and impact,” said Carly Kind, director of the Ada Lovelace Institute. “We’re pleased to be working in partnership with the AHRC and with Alison Powell, whose expertise in the interrelationships between people, technology and ethics make her the ideal candidate to lead the Just AI network.” Powell, who works at the London School of Economics (LSE), specifically researches how people’s values influence how technology is built, as well as how it changes the way we live and work.


How Can We Make Election Technology Secure?

Simplified view of the chain of voting devices.  
Graphic by Ives Brant, TrustiPhi
Let's start with some common problems presented by modern-day election machines:

- Single point of failure. A compromise or malfunction of election technology could decide a presidential election.
- Between elections. Election devices might be compromised while they are stored between elections.
- Corrupt updates. Any pathway for installing new software in voting machines before each election, including USB ports, may allow corrupt updates to render the system untrustworthy.
- Weak system design. Without clear guidelines and thorough, expert evaluation, the election system is likely susceptible to many expected and unexpected attacks.
- Misplaced trust. Technology is not a magic bullet. Even voting equipment from leading brands has delivered wildly wrong results in real elections.

Election administrators need to safeguard the election without relying too heavily on third parties or technologies they don't control. It takes a lot of work to lock down a complex voting system to the point where you'd bet the children's college fund — or the future of society — on its safety.


The Human-Powered Companies That Make AI Work

Machine learning models require human labor for data labeling
Machine learning is what powers today’s AI systems. Organizations are implementing one or more of the seven patterns of AI, including computer vision, natural language processing, predictive analytics, autonomous systems, pattern and anomaly detection, goal-driven systems, and hyperpersonalization across a wide range of applications. However, in order for these systems to be able to create accurate generalizations, these machine learning systems must be trained on data. The more advanced forms of machine learning, especially deep learning neural networks, require significant volumes of data to be able to create models with desired levels of accuracy. It goes without saying, then, that the machine learning data needs to be clean, accurate, complete, and well-labeled so the resulting machine learning models are accurate. Whereas it has always been the case that garbage in is garbage out in computing, it is especially the case with regard to machine learning data.
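
One common human-in-the-loop step behind "well-labeled" data is aggregating several annotators' labels and routing disagreements back for review. A minimal majority-vote sketch (the function name and threshold are illustrative, not any vendor's API):

```python
from collections import Counter

def consensus_label(annotations):
    """Majority-vote aggregation over several human annotators.

    Returns the label a strict majority agreed on, or None when there
    is no clear majority, in which case the item goes back into the
    labeling queue for another annotator.
    """
    counts = Counter(annotations)
    label, votes = counts.most_common(1)[0]
    if votes * 2 > len(annotations):
        return label
    return None  # no majority: route to another annotator
```

In practice labeling platforms layer weighting by annotator track record on top of this, but the majority-vote core is the same.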


There are multiple IaC frameworks and technologies; based on Palo Alto's data collection, the most common are Kubernetes YAML (39%), Terraform by HashiCorp (37%) and AWS CloudFormation (24%). Of these, 42% of identified CloudFormation templates, 22% of Terraform templates and 9% of Kubernetes YAML configuration files had a vulnerability. Palo Alto's analysis suggests that half the infrastructure deployments using AWS CloudFormation templates will have an insecure configuration. The report breaks this down further by type of impacted AWS service -- Amazon Elastic Compute Cloud (Amazon EC2), Amazon Relational Database Service (RDS), Amazon Simple Storage Service (Amazon S3) or Amazon Elastic Container Service (Amazon ECS). ... The absence of database encryption and logging, which are important to protect data and investigate potential unauthorized access, was also a commonly observed issue in CloudFormation templates. Half of the templates did not enable S3 logging, and half did not enable S3 server-side encryption.
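
A template audit of the kind the report describes can be approximated with a small script. This sketch checks CloudFormation S3 bucket resources for the two gaps highlighted above; `BucketEncryption` and `LoggingConfiguration` are the standard CloudFormation property names, but the audit logic itself is an illustrative sketch, not a production scanner:

```python
import json

def audit_s3_buckets(template: dict) -> dict:
    """Flag S3 buckets in a parsed CloudFormation template that lack
    server-side encryption or access logging."""
    findings = {}
    for name, resource in template.get("Resources", {}).items():
        if resource.get("Type") != "AWS::S3::Bucket":
            continue
        props = resource.get("Properties", {})
        issues = []
        if "BucketEncryption" not in props:
            issues.append("no server-side encryption")
        if "LoggingConfiguration" not in props:
            issues.append("no access logging")
        if issues:
            findings[name] = issues
    return findings

# Hypothetical template: one compliant bucket, one with both gaps.
template = json.loads("""
{
  "Resources": {
    "Logs": {"Type": "AWS::S3::Bucket", "Properties": {}},
    "Data": {"Type": "AWS::S3::Bucket",
             "Properties": {"BucketEncryption": {},
                            "LoggingConfiguration": {}}}
  }
}
""")
```

Running such a check in CI before deployment is one way to catch the misconfigurations the report counts.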


Serverless computing: Ready or not?

By nature, serverless computing architectures tend to be more cost-effective than alternative approaches. "A core capability of serverless is that it scales up and down to zero so that when it’s not being used you aren’t paying for it," Austin advises. With serverless technology, the customer pays for consumption, not capacity, says Kevin McMahon, executive director of mobile and emerging technologies at consulting firm SPR. He compares the serverless model to owning a car versus using a ride-sharing service. "Prior to ride sharing, if you wanted to get from point A to B reliably you likely owned a car, paid for insurance and had to maintain it," he explains. "With ride-sharing, you no longer have to worry about the car, you can just pay to get from A to B when you want—you simply pay for the job that needs to be done instead of the additional infrastructure and maintenance." Serverless computing can also help adopters avoid costs related to the overallocation of resources, ensuring that expenses are in line with actual consumption, observes Craig Tavares, head of cloud at IT service management company Aptum.
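
The consumption-versus-capacity distinction is easy to see in a toy cost model. The per-GB-second and per-request rates below resemble published serverless list prices but should be treated as placeholders, not a quote from any provider:

```python
def serverless_monthly_cost(invocations, avg_duration_ms, memory_gb,
                            gb_second_price=0.0000166667,
                            request_price=0.0000002):
    """Pay-for-consumption model: billed per request and per GB-second."""
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * memory_gb
    return gb_seconds * gb_second_price + invocations * request_price

def dedicated_monthly_cost(instances, hourly_rate, hours=730):
    """Pay-for-capacity model: billed whether or not requests arrive."""
    return instances * hourly_rate * hours
```

At low traffic the serverless bill is pennies while an always-on instance costs the same every month; at very high sustained traffic the comparison can flip, which is why the "scales down to zero" point matters most for bursty workloads.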


Oops! Microsoft gets 'black eye' from Teams outage

“This is definitely a black eye for Microsoft, especially when it has touted its reliability in the wake of some high-profile Slack outages in the last couple of years,” said Irwin Lazar, vice president and Service Director at Nemertes Research. “It is surprising that Microsoft didn't renew its certificate, and it shows that as Teams rapidly grows they will have to ensure they are addressing operational issues to prevent further downtime.” Indeed, the prompt reaction to the outage is an indication of the growing importance of Teams as more and more office workers rely on team messaging tools. “There is nothing like taking a service down to illustrate its popularity and importance. However, this is not a best practice we recommend,” Larry Cannell, a research director at Gartner, dryly noted. An SSL certificate enables a secure connection between a web browser or app and a server, and is required for HTTPS-enabled sites. It helps protect users against security risks such as man-in-the-middle attacks by allowing data to be encrypted. When a certificate expires, the server can’t be identified and information cannot be sent. That was the case with Teams on Monday.
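
A routine check like the following is how many teams avoid exactly this failure mode; it is a generic Python sketch, not Microsoft's tooling. `ssl.SSLSocket.getpeercert()` returns the expiry as a `notAfter` string, which can be parsed and compared against the current time:

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the notAfter field as returned by getpeercert(),
    e.g. 'Jun 15 12:00:00 2030 GMT'."""
    dt = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def days_until_expiry(hostname: str, port: int = 443) -> float:
    """Connect over TLS and report days until the server cert expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    remaining = parse_not_after(cert["notAfter"]) - datetime.now(timezone.utc)
    return remaining.total_seconds() / 86400
```

Alerting when the result drops below, say, 30 days turns a certificate expiry from an outage into a ticket.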


Looking to hire a '10x developer'? You can try, but it probably won't boost productivity


As Nichols notes in a blog, various studies since the original 1968 study have estimated that top-performing developers are between four and 28 times more productive than average performers. But Nichols says his study found evidence to contradict the idea that some programmers are inherently far more skilled or productive than others. Performance differences are partly attributable to the skill of an individual, he writes, but each person's productivity also varies every day, depending on the task and other factors. "First, I found that most of the differences resulted from a few, very low performances, rather than exceptional high performance. Second, there are very few programmers at the extremes. Third, the same programmers were seldom best or worst," he explains. He argues that these findings should change the way a software project manager approaches recruitment. For example, they shouldn't necessarily just focus on getting the top programmers to boost organizational productivity, but find "capable" programmers and develop that talent. The study involved 494 students with an average of 3.7 years' industry experience. The students used the C programming language and were tasked with programming solutions through a set of 10 assignments.
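
Nichols' first finding, that the spread comes from a few very slow performances rather than exceptionally fast ones, is easy to illustrate with hypothetical numbers:

```python
# Hypothetical hours ten programmers took on the same assignment.
hours = [4, 5, 5, 6, 6, 7, 7, 8, 30, 40]

ratio_all = max(hours) / min(hours)          # headline "10x" spread
trimmed = sorted(hours)[:-2]                 # drop the two slowest runs
ratio_trimmed = max(trimmed) / min(trimmed)  # spread collapses to 2x
```

The "10x" ratio here is produced entirely by two outliers at the bottom, not by a star performer at the top, which is why recruiting only for the top end does little for the average.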


5 steps to creating a strong data archiving policy

Suppose you decide to archive data that hasn't been modified or accessed in three years. That decision leads to a number of other questions related to data management. For example, should all the data that meets the three-year criteria be archived, or can some types of data simply be deleted rather than archived? Likewise, will data remain in your archives forever or will the data be purged at some point? You must have specific plans that address the exact circumstances under which data should be archived, as well as a plan for what will eventually happen to archived data. Many companies assume that having a data archiving policy means they have a deletion policy, and they eventually wind up wishing they had spelled out the specifics of deletion and archival. ... Regulatory compliance is also critical. Not every organization is subject to federal regulatory requirements surrounding data retention policy, but those that are can face severe penalties if they fail to properly retain required data. Multinational companies also must be aware of varying regulatory policies.
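
The decision logic such a policy implies can be captured in a few lines. The thresholds and file types below are illustrative assumptions, not recommendations:

```python
from datetime import datetime, timedelta, timezone

ARCHIVE_AFTER = timedelta(days=3 * 365)  # the three-year criterion above
PURGE_AFTER = timedelta(days=7 * 365)    # hypothetical retention ceiling
DELETE_INSTEAD = {".tmp", ".bak"}        # types not worth archiving

def disposition(last_touched: datetime, suffix: str, now: datetime) -> str:
    """Decide what happens to one item under the policy:
    keep, delete, archive, or purge from the archive."""
    age = now - last_touched
    if age < ARCHIVE_AFTER:
        return "keep"
    if suffix in DELETE_INSTEAD:
        return "delete"
    if age >= PURGE_AFTER:
        return "purge"
    return "archive"
```

Writing the policy down as code like this forces the questions in the paragraph above (delete vs. archive, archive forever vs. purge) to get explicit answers.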



Quote for the day:


"Leadership does not always wear the harness of compromise." -- Woodrow Wilson


Daily Tech Digest - February 05, 2020

G Suite vs. Office 365: What's the best office suite for business?

Both suites work well with a range of devices. Because it’s web-based, G Suite works in most browsers on any operating system, and Google also offers apps for Android and iOS. Microsoft provides Office client apps for Windows, macOS, iOS and Android, and its web-based apps work across browsers. The suites also offer the same basic core applications. Each has word processing, spreadsheet, presentation, email, calendar and contacts programs, along with videoconferencing, messaging and note-taking software. Each has cloud storage associated with it. But those individual applications are quite different from one suite to the other, as are the management tools for taking care of them in a business environment. And both suites offer scads of additional tools as well. So it can be exceedingly difficult to decide which suite is better for your business. That’s where this piece comes in. We offer a detailed look at every aspect of the office suites, from an application-by-application comparison to how well each suite handles collaboration, how well their apps integrate, their pricing and support and more. Our focus here is on how the suites work for businesses, rather than individual use.



How remote work rose by 400% in the past decade

The report found that the rise of remote work popularity is thanks to the evolution of supporting technologies including powerful mobile devices, ultra-fast internet connections, and proliferation of cloud-based storage and SaaS solutions. "The rise of cloud-based SaaS software has been instrumental to the growth of remote work," de Lataillade said. "Employees can now instantly connect and collaborate with colleagues around the world at any time." Employees definitely took advantage: The majority (78%) of employees said they work remotely some of the time; more than half (58%) said they work remotely at least once a month; and, 36% of respondents said they work remotely at least once a week, the report found. While 36% might not seem like a huge percentage, it's a significant jump from 10 years ago. In 2010, the US Census Bureau found that only 9.5% of employees worked remotely at least once a week, indicating that the number of people working remotely on a weekly basis has grown by nearly 400% in the last decade, according to the report.
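
A quick sanity check on the arithmetic behind the headline figure (percentages taken from the report, which treats the roughly fourfold multiple as "nearly 400%"):

```python
weekly_2010 = 9.5   # % of employees working remotely weekly (Census, 2010)
weekly_2020 = 36.0  # % working remotely at least weekly, per the report

multiple = weekly_2020 / weekly_2010  # about 3.8x over the decade
```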


Social media targeting algorithms need regulation, says CDEI


“Platforms should be required to maintain online advertising archives, to provide transparency for types of personalised advertising that pose particular societal risks. These categories include politics, so that political claims can be seen and contested and to ensure that elections are not only fair but are seen to be fair; employment and other ‘opportunities’, where scrutiny is needed to ensure that online targeting does not lead to unlawful discrimination; and age-restricted products.” The report acknowledged, however, that personalisation of users’ online experiences increased the usability of many aspects of the internet. “It makes it easier for people to navigate an online world that otherwise contains an overwhelming volume of information. Without automated online targeting systems, many of the online services people have come to rely on would become harder to use,” it said.


NIST Drafts Guidelines for Coping With Ransomware

The proposed guidance offers a "how to" guide to implementing best practices. For example, it includes tips on vulnerability management and using backups to protect data. The second draft, "Data Integrity: Detecting and Responding to Ransomware and Other Destructive Events," offers advice on improving the detection and mitigation of ransomware and other security issues within an organization's infrastructure. It also delves into how integrity monitoring, event detection, vulnerability management, reporting capabilities and mitigation and containment can be implemented to improve network defenses. Much like the NIST Cybersecurity Framework, these guidelines offer best practices that organizations can pick and choose based on their own network architectures, says Jennifer Cawthra, the National Cybersecurity Center of Excellence lead for data security and healthcare. "We put together a reference architecture to demonstrate that you can solve a cybersecurity challenge," Cawthra tells ISMG. "Now this is not the only way to solve a problem; it's just an example. ..."


From Legacy to Hybrid Data Platforms: Managing Disparate Data in the Cloud


Specifically, as part of an overall adaptive analytics fabric (the virtualized data and associated tools to aid analytics speed, accuracy, and ease of use), virtualization empowers companies to treat all their disparate data repositories as a single, unified data source that's extensible to support future technologies. A fabric provides a bridge across data warehouses, data marts, and data lakes, delivering a single view of an organization's data without having to physically integrate, engineer, or re-architect it. This abstraction enables enterprises to instantly surface usable data, no matter where it's actually stored, to produce fast, timely insights. The ability to merge data from different sources reveals another advantage. Rather than combining data into a single system that necessitates formatting data for the lowest common denominator of capability, adaptive analytics fabrics enable enterprises to store data in the data structures that best fit its use.
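
The single-view idea can be sketched without any specific product: map each store's native field names onto one shared schema and chain rows lazily, with no physical copy. Everything below, names included, is a hypothetical illustration of the virtualization concept:

```python
def virtual_view(*sources):
    """Lazily chain rows from disparate stores, mapping each store's
    native field names onto one shared schema (no physical copy)."""
    def rows():
        for fetch, field_map in sources:
            for record in fetch():
                yield {std: record[native] for std, native in field_map.items()}
    return rows

# Two hypothetical stores with different native schemas.
warehouse = lambda: [{"CUST_ID": 1, "REV": 120.0}]
lake = lambda: [{"customer": 2, "revenue": 75.5}]

customers = virtual_view(
    (warehouse, {"customer_id": "CUST_ID", "revenue": "REV"}),
    (lake, {"customer_id": "customer", "revenue": "revenue"}),
)
```

Each store keeps its own format, yet a consumer iterating `customers()` sees one uniform schema, which is the "bridge" the fabric provides at enterprise scale.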


Eclipse Foundation Launches Edge-Native Working Group


The Foundation unveiled the new working group at the EDGE Computing World conference, currently underway at the Computer History Museum in Mountain View, CA. The distributed architecture called edge computing is transforming the way data is handled, processed, and delivered from millions of devices that are distant from a datacenter by bringing the compute power and storage physically closer to the application. The industry watchers at Allied Market Research expect the edge computing market to be worth $16.5 billion within the next five years. This isn't the Foundation's first work with computing at the edge. The independent, not-for-profit steward of the Eclipse open-source software development community already hosts production-ready code designed to enable devs to build, deploy, and manage edge apps at enterprise scale. But the working group provides an organized focus on the edge. The new working group already has two flagship projects: Eclipse ioFog, which provides a complete edge computing platform, including all the pieces needed to build and run apps at the edge at enterprise scale;


MIT's blockchain-based 'Spider' offers 4X faster cryptocurrency processing

The Spider topology allows cryptocurrency network users to invest only a fraction of funds in each account associated with a network and process roughly four times more transactions “off chain” before rebalancing on the blockchain. The Spider routing scheme "packetizes" transactions and uses a multi-path transport protocol to achieve high-throughput routing in PCNs. Packetization allows Spider to complete even large transactions on low-capacity payment channels over time, while the multi-path congestion control protocol ensures balanced use of channels and fairness across flows, the researchers said in their research paper. Ultimately, the more balanced the routing of PCNs, the smaller the capacity required — meaning, overall funds across all joint accounts — for high-transaction throughput, the school said. “The MIT researchers’ network performance improvement techniques are akin to packet switching used commonly in the telecommunications systems and queue management used by many system/network management solutions to alleviate network congestion and traffic at data centers and other data aggregation points,” said Avivah Litan, a vice president of research at Gartner.
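
The paper's two core mechanics, packetizing a payment and spreading the packets across channels, can be sketched as follows. This is a simplified greedy illustration, not the researchers' actual multi-path congestion-control protocol:

```python
def packetize(amount: int, unit: int):
    """Split one payment into fixed-size packets plus a remainder,
    so large transactions can cross low-capacity channels over time."""
    packets = [unit] * (amount // unit)
    if amount % unit:
        packets.append(amount % unit)
    return packets

def route(packets, channels):
    """Greedy multi-path routing: each packet takes the channel with
    the most remaining capacity, keeping channel balances roughly even."""
    remaining = dict(channels)
    plan = []
    for p in packets:
        path = max(remaining, key=remaining.get)
        if remaining[path] < p:
            raise ValueError("insufficient total channel capacity")
        remaining[path] -= p
        plan.append((path, p))
    return plan
```

A payment larger than any single channel's capacity still completes because its packets are spread across several channels, which is the intuition behind Spider's higher off-chain throughput.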


5G will bring smart cities to life in unexpected ways

There's an interesting trend that we see developing. When you look into the smart city — I like to call it an intelligent urban ecosystem — we started to look into not just government engagement with citizens but also all the private-sector stakeholders: the vehicle manufacturers, real estate developers, construction companies, lifestyle and leisure companies, tourism. If you start to connect all these data streams and experiences together through a smartphone — say a semi-autonomous car looking for the right parking, close to the museum, for an elderly person who doesn't speak the language — you see how that processing comes together, and 5G may well be a very good venue to allow that technology and that process to deliver a very good service experience for that elderly person. From a location perspective, we have cities with smart streets or smart districts that are starting to develop strategies around a smart post, for instance, where you have multiple sensors for security,


How Data Will Drive the Transportation Industry in 2020


The advent of new technologies in transportation continues to evolve industry practices at a rapid pace. As network connectivity continues to improve, customers are benefiting from convenience and speed. The eventual culmination of this evolution is the autonomous roadway. In 2020, the industry will experience the rise of 5G, which will drive the framework for the connected roadway. The increased bandwidth of 5G will allow for the placement of advanced sensors on roadways and traffic signals. The sensors will kick-start real-time data collection that allows for living 3-D maps, affording a safe environment for autonomous vehicles. The growth of 5G technology will allow for all fundamental components of an autonomous roadway to be built out; regulatory standards and safety practices can be tested and put into practice, eventually leading to the construction of smart infrastructures on a global scale. The more companies can educate themselves now about 5G’s capabilities and its adoption, the more successful they will be in 2020 and beyond.


Setting Up a Virtual Office for Remote Teams

Once you’re ready to go virtual, one of the best ways to ensure flawless collaboration between office and remote workers is having all employees use the same technology stack and digital tools. This will eliminate the merry-go-round of apps and software for individual developers, teams, and their managers. At MightyCall, we have an open culture regarding remote work, allowing our developers, product designers, and other employees to work from their virtual office whenever they need to. Regardless of the specific tech niche your company serves, we found there are several components to creating a productive environment for remote teams. ... Building a proactive remote working environment today is key to seeing your company thrive in the future. According to expert reports, by 2028 an overwhelming 73% of teams will have remote workers. This will result not only in greater workplace autonomy but demographically diverse and more inclusive hiring. While technology plays an essential role in birthing the idea of virtual offices, the success of the remote work experiment for each team depends on the human factor.



Quote for the day:


"I believe that the capacity that any organisation needs is for leadership to appear anywhere it is needed, when it is needed." -- Margaret J. Wheatley


Daily Tech Digest - February 04, 2020

What to know about software development security — why it’s still so hard and how to tackle it

When it comes to securing applications, consider threat modelling, a process that identifies and prioritises potential threats from an attacker’s perspective. Questions to ask might include: what kind of data would an attacker be seeking? One popular threat model is STRIDE, which can be used in tandem with the DREAD risk assessment framework: this helps work out how likely the threat is to happen, the threat’s potential consequences, and whether the risk can be tolerated. For web applications, the Open Web Application Security Project (OWASP) Foundation has published its top 10 list of the most common and critical security risks, and this is an excellent reference source. Each threat is ranked by its agents, exploitability, prevalence, detectability, technical impact, and business impact. The OWASP top 10 helps with API security, too. There is a shifting emphasis towards securing APIs at every part of the lifecycle, starting with the development stage.
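
DREAD itself is just an average of five ratings (Damage, Reproducibility, Exploitability, Affected users, Discoverability), each typically scored 0-10, which makes it straightforward to automate triage. A minimal sketch, with hypothetical threats:

```python
def dread_score(damage, reproducibility, exploitability,
                affected_users, discoverability):
    """Average the five DREAD factors into one 0-10 risk score."""
    factors = [damage, reproducibility, exploitability,
               affected_users, discoverability]
    if not all(0 <= f <= 10 for f in factors):
        raise ValueError("each DREAD factor is rated 0-10")
    return sum(factors) / len(factors)

def triage(threat_list):
    """Rank modelled threats highest-risk first."""
    return sorted(threat_list,
                  key=lambda t: dread_score(**t["dread"]), reverse=True)

# Hypothetical threats from a STRIDE session.
threats = [
    {"name": "SQL injection",
     "dread": dict(damage=9, reproducibility=8, exploitability=7,
                   affected_users=9, discoverability=6)},
    {"name": "Verbose error pages",
     "dread": dict(damage=2, reproducibility=9, exploitability=5,
                   affected_users=3, discoverability=8)},
]
```

The score is only as good as the ratings, of course; the value of the exercise is forcing the team to argue about each factor per threat.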



Top 10 underused SD-WAN features

For enterprises that do business with the federal government, such as aerospace and defense companies, or enterprises with PCI compliance responsibilities, which includes just about everybody else, encryption keys need to be rotated on a regular basis (typically every 90 days). This can be a tedious manual process that entails complex change control policies and can require planned downtime. SD-WAN platforms can replace conventional VPN-based key rotations with an automated system that can be programmed to make the rotations as frequently as every minute without any interruption to data plane traffic. The result is better security, no downtime and no need for manual intervention. ... There are many scenarios in which companies need to keep different types of traffic separated from each other. For example, in the case of a merger or acquisition, the combined company might be a single entity on paper, but for business or compliance or security reasons, each business unit continues to operate independently. If the company then decides to upgrade to SD-WAN, it might be considering the purchase of two sets of physical devices.
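
The grace-window trick that makes zero-downtime rotation possible, keeping the previous key valid briefly so in-flight traffic is not dropped, can be sketched in a few lines. This is an illustrative model with an injectable clock for testing, not any vendor's SD-WAN implementation:

```python
import secrets

class KeyRotator:
    """Automated key rotation with a grace window: the previous key
    stays accepted briefly after rotation so the data plane is never
    interrupted by a key change."""

    def __init__(self, interval_s: float, grace_s: float, clock):
        self.interval_s = interval_s
        self.grace_s = grace_s
        self.clock = clock  # injectable time source for testing
        self.current = secrets.token_bytes(32)
        self.previous = None
        self.rotated_at = clock()

    def maybe_rotate(self):
        """Rotate when the interval has elapsed (call periodically)."""
        if self.clock() - self.rotated_at >= self.interval_s:
            self.previous = self.current
            self.current = secrets.token_bytes(32)
            self.rotated_at = self.clock()

    def valid_keys(self):
        """Keys a receiver should accept right now."""
        keys = [self.current]
        if self.previous and self.clock() - self.rotated_at < self.grace_s:
            keys.append(self.previous)
        return keys
```

Shrinking `interval_s` from 90 days to a minute is then purely a configuration change, which is the point of automating it.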



Programming languages: Go and Python are what developers most want to learn


Do developers need a degree? Apple CEO Tim Cook recently said the skills needed to code could be taught at an earlier stage in high school. Job seekers without a degree can also get jobs at Google, IBM, Home Depot, and Bank of America. HackerRank found that most developers hired at companies of all sizes do have a degree. But it found that small businesses with between one and 49 employees are the biggest source of employment for developers without a degree. It found that 32% of developers at small companies lack a degree compared with 9% of developers who work for firms with more than 10,000 employees. The top recruiting priority, with 38%, for hiring managers in 2020 is finding full-stack developers. The second and third most commonly sought after categories are back-end developers and data scientists. However, full-stack developers face more pressure than other groups, with 60% tasked with learning a completely new framework and 45% required to learn a new language last year. That proportion is higher than all other categories, including front-end developers, back-end developers, data scientists, DevOps engineers, and quality-assurance engineers.


Brexit messes up IT project plans


Bartels said in the report: “The slowdown in tech purchases by business and government is real and reflects the fact that business executives are generally reluctant to grow their tech budgets faster than the growth in their revenues.” Referencing the UK’s exit from the European Union on 31 January, Bartels noted that prime minister Boris Johnson and the Conservative majority in Parliament will need to negotiate a transition agreement that still leaves many key issues to be negotiated. “Those continued uncertainties will depress UK economic growth,” he said. Forrester expects tech purchasers to cut their budgets in response to these uncertainties, and forecast that the UK will experience a 1.4% decline in tech spending in 2020. Forrester noted that tech market growth across Western Europe “was not too bad in 2019”, with the region’s constant currency growth of 4.5% actually higher than the global growth rate of 3.9%. But Bartels noted in the report that Brexit uncertainties will knock down the UK’s tech spending.


Multi-cloud adoption is the future, so be prepared


Multi-cloud architectures add complexity. Each platform has its own set of rules for operations and management, which makes it harder to cross-train staff and give IT teams a holistic view of a company's cloud and on-premises assets. "Most organizations are pretty clear that they do not want broad silos operating independently in perpetuity," Johnston Turner said. Enterprises want to be able to optimize these different assets within their own constraints around security, cost and performance. That's why IDC predicts 70% of enterprises will deploy unified VMs, Kubernetes and multi-cloud management processes by 2022 to facilitate standardized governance and to provide a single view across environments. "You're just always going to have to do the talking to the lower level systems," she said. "But increasingly, the level of [automated extraction] -- the policy and controls that you can put on top of those controllers -- is really what matters."


Smart Cities: Accelerating the Path to Digital Transformation


Accessing data and achieving interoperability from various smart city IoT services are also critical steps on any digital transformation journey. Otherwise, data remains locked in siloes, making it difficult and expensive to develop smart city applications. Unfortunately, the smart city market is still young and lacks standard data models for sensors and applications. Cisco Kinetic for Cities addresses this and reduces deployment complexity by normalizing data to consistent, well-defined data models by integrating an ecosystem of over 90 pre-integrated partners. This allows CKC to bring multiple services and vendors together in a single-pane-of-glass dashboard to enable better operation and smarter correlated policies. We also offer a series of CKC API training modules and CKC sandbox on Cisco DevNet where cities can join or leverage over 600,000 Cisco developers to develop new applications and services. At Cisco, we believe smart city technologies must be secure, scalable, and interoperable — not just to meet today’s needs, but also to enable cities to undergo a sustainable journey towards digital transformation.


The 5 Hottest Technologies In Banking For 2020

APIs are about speed, agility, and personalization. You’re dead in the water if: 1) It takes nine to 12 months to integrate partners’ products and/or data, or 2) The partnership process requires significant time and resources to negotiate legal matters, revenue sharing, pricing, etc. And for all the talk about personalization in banking, nothing that exists today comes close to what’s possible in an environment with a robust set of partial-stack fintech providers and smart full-stack banks integrated through APIs. ... For all the hype surrounding chatbots and machine learning (ML), few community-based financial institutions have deployed these technologies. Going into 2020, just 4% of the institutions surveyed by Cornerstone have already deployed chatbots—twice as many as had deployed them going into 2019. But going into 2019, 13% of the survey respondents said they would be making investments in chatbots—and most ended up not investing. There was a big jump in the percentage of institutions who have deployed machine learning from 2% in 2019 to 8% in 2020. And for 2020, another 17% expect to deploy ML tools. If history is any guide, however, fewer will actually invest.


Tech spending slowing along with world economy

The one bright spot in their forecast is software. Forrester expects spending on software and cloud computing to grow during the forecast period, by 5% in 2020 and 5.1% in 2021. Still, these numbers are significantly lower than 2018's 8.3% growth in software spending. "Any transformation of business operations or processes today involves the purchase of software," the report states. "The investments that firms are making to replace on-premises software with cloud alternatives is another factor driving the growth in software." After surging in 2017 and 2018 because of better economic activity globally and tax incentives in the US, however, spending on computer and telecom equipment is expected to fall. Even with growth slowing, spending on technology goods and services by businesses and governments will be robust in the coming years, going from $3.09T in 2016 to $3.71T in 2021. As usual, the US outpaces the rest of the world by a massive margin. Anticipated US spending on technology for 2020 is $1.50T. At $289B, China, the next biggest spender, is expected to spend about a fifth of that amount.
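As a quick arithmetic check on the totals above, the implied compound annual growth rate from $3.09T in 2016 to $3.71T in 2021 can be computed directly:

```python
# Implied compound annual growth rate (CAGR) over the 5-year span
# from $3.09T (2016) to $3.71T (2021).
cagr = (3.71 / 3.09) ** (1 / 5) - 1
print(f"{cagr:.1%}")   # roughly 3.7% per year
```

That works out to roughly 3.7% per year, which is consistent with the article's picture of robust but slowing growth.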



Tesla and other autopilot-driven cars tricked with 2D projections

Researchers say that objects can be projected in a variety of ways, using cheap $300 projectors available on Amazon. These projectors can be handheld or installed on flying drones. In addition, the research team says that projecting rogue 2D objects doesn't necessarily mean the projections need to be visible for long periods of time. A few hundred milliseconds is enough, they said. Short-burst projections would be invisible to the human eye, but they'd still be picked up by the powerful sensors and video cameras used by ADAS and autopilot systems. This opens the door to real-world scenarios where human drivers wouldn't even spot the projections, but the car would suddenly brake or steer towards oncoming traffic. This is an important observation because most car makers advise drivers to use autopilot features only under direct supervision. Car vendors say the systems should be used to assist drivers, but hands should always be kept on the wheel and eyes on the road.


Digital strategy for 2020-2030 sets out police technology plans


The strategy’s five ambitions are underpinned by seven enablers, which will provide the foundation for the nationwide digital transformation. These include data, strategic alignment and design, modernised core technology, connected technology, risk and security, talent, and transforming procurement. The enablers primarily focus on the need to develop common standards, approaches and structures across UK policing organisations, as well as to deliver better value for money. For example, the strategy recommends creating a national data management guide to drive data quality and consistency, while also developing a holistic data and technology framework to enable more consistent risk decisions. The strategy also recommends defining a “technology blueprint” for the next decade that avoids “the creation of bespoke solutions in favour of commercial off-the-shelf (Cots) applications.” The strategy claims that using Cots products, which it recommends setting specific procurement frameworks for, will ensure the standardisation of procurement and enhance value for money.



Quote for the day:


"Take time to deliberate; but when the time for action arrives, stop thinking and go in." -- Andrew Jackson


Daily Tech Digest - February 03, 2020

Why UK's Huawei decision leaves the fate of global 5G wireless in US hands

"The UK has been doing business with Huawei for a long time through Openreach. They had been operating, with oversight, in the country for years," noted Doug Brake, who directs broadband and spectrum policy for Washington, DC-based Information Technology & Innovation Foundation. Openreach, to which Brake refers, is the division of top British telco BT responsible for deploying fiber optic infrastructure. It had been partnering mainly with Huawei until last November, when it began an evaluation process in search for additional partners. "So for the UK to come out and publicly brand them as a high-risk vendor, cordon them off to only 35 percent of the access network — not even let them into the core network," said Brake, "really puts Huawei in a tight box." For its part, Huawei did what it could Tuesday to thwart any possible interpretation of tightness, or a box. Omitting any mention of security or exploiting back doors in the infrastructure, Huawei Vice President Victor Zhang issued a statement, reading in part: "This evidence-based decision will result in a more advanced, more secure, and more cost-effective telecoms infrastructure that is fit for the future..."



Lex: An Optimizing Compiler for Regular Expressions

This perhaps isn't the fastest C# NFA regex engine around yet, but it does support Unicode and lazy expressions, and it is getting faster thanks to the optimizing compiler. A Pike virtual machine is a technique for running regular expressions that relies on input programs to dictate how to match. The VM is an interpreter that runs bytecode which executes the matching operation; the bytecode itself is compiled from one or more regular expressions. Basically, a Pike VM is a little cooperatively scheduled concurrent VM that runs that bytecode. It has some cleverness in it to avoid backtracking. It's potentially extremely powerful and very extensible, but this one is still a baby and very much a work in progress. The VM itself is solid, but the regular expressions could use a little shoring up, and it could use some more regex features, like anchors.
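The core mechanism can be sketched in a few lines. The example below is illustrative Python, not the article's C# engine: the bytecode ops drive a set of "threads" that advance in lockstep over the input, and a per-step seen-set deduplicates threads, which is exactly the cleverness that avoids backtracking.

```python
# Minimal Pike-style VM sketch. Bytecode ops:
#   ('char', c)      match literal c and advance
#   ('split', a, b)  fork a thread at pc a and pc b
#   ('jmp', a)       jump to pc a
#   ('match',)       accept

def add_thread(prog, threads, pc, seen):
    """Follow jmp/split transitively; dedupe so each pc runs once per step."""
    if pc in seen:
        return
    seen.add(pc)
    op = prog[pc]
    if op[0] == 'jmp':
        add_thread(prog, threads, op[1], seen)
    elif op[0] == 'split':
        add_thread(prog, threads, op[1], seen)
        add_thread(prog, threads, op[2], seen)
    else:
        threads.append(pc)

def pike_match(prog, text):
    """All threads consume each input character in lockstep (no backtracking)."""
    threads, seen = [], set()
    add_thread(prog, threads, 0, seen)
    for ch in text:
        next_threads, seen = [], set()
        for pc in threads:
            op = prog[pc]
            if op[0] == 'char' and op[1] == ch:
                add_thread(prog, next_threads, pc + 1, seen)
        threads = next_threads
    return any(prog[pc][0] == 'match' for pc in threads)

# Hand-compiled program for the regex a+b (full match):
#   0: char a; 1: split 0,2; 2: char b; 3: match
prog = [('char', 'a'), ('split', 0, 2), ('char', 'b'), ('match',)]
print(pike_match(prog, 'aaab'))   # True
print(pike_match(prog, 'b'))      # False
```

Because the thread set per step is bounded by the program size, the whole match runs in time proportional to program length times input length, regardless of how pathological the pattern is.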


Google launches open-source security key project, OpenSK


FIDO is a standard for secure online access via a browser that goes beyond passwords. There are three modern flavours of it: Universal Second Factor (U2F), Universal Authentication Framework (UAF), and FIDO2. UAF handles biometric authentication, while U2F lets people authenticate themselves using hardware keys that you can plug into a USB port or tap on a reader. That works as an extra layer on top of your regular password. FIDO2 does away with passwords altogether by using a hardware key with an authentication protocol called WebAuthn. This uses the digital token on your security key to log straight into a compatible online service. To date, Yubico and Google have both been popular providers of FIDO-compatible keys, but they’ve done so using their own proprietary hardware and software. Google hopes that by releasing an open-source version of FIDO firmware, it will accelerate broader adoption of the standard. Google has designed the OpenSK firmware to work on a Nordic dongle, which is a small uncased board with a USB connector on it.
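The challenge-response idea at the heart of WebAuthn can be sketched as follows. This is a heavily simplified toy, not the real protocol: FIDO2 uses public-key signatures, whereas an HMAC over a shared secret stands in here only so the sketch runs with the standard library. The essential shape is the same: the server issues a fresh random challenge, the authenticator signs it, and the server verifies, so there is no password to phish or replay.

```python
# Toy challenge-response sketch (NOT real WebAuthn; HMAC stands in for
# the asymmetric signature a FIDO2 security key would produce).
import hashlib
import hmac
import secrets

credential_key = secrets.token_bytes(32)   # stand-in for the key material

def authenticator_sign(challenge: bytes) -> bytes:
    """What the security key does: sign the server's challenge."""
    return hmac.new(credential_key, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, signature: bytes) -> bool:
    """What the service does: check the signature against the challenge."""
    expected = hmac.new(credential_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

challenge = secrets.token_bytes(16)        # fresh per login, defeats replay
assert server_verify(challenge, authenticator_sign(challenge))
# A signature over an old challenge fails against a new one:
assert not server_verify(secrets.token_bytes(16), authenticator_sign(challenge))
```

Because each login uses a fresh challenge, a captured response is useless to an attacker, which is the property that lets FIDO2 drop passwords entirely.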


Early use of AI for finance focused on operations, analytics


Anecdotal evidence suggests AI excels at financial processes that involve repetitive operations on large volumes of data. "It will eliminate the need for people to do a lot of the boring, repetitive work that they're doing today," Kugel said. "It will make it possible for systems to wrap themselves around the habits and requirements of the user, as opposed to the user having to adapt how they work within the limitations of technology." Data quality will also improve and, with it, the quality of analytics, as AI gets better at flagging errors for people to correct, Kugel said. AI is also helping with tedious accounts payable tasks, such as confirming that goods were received and that an invoice contains the right items, Tay said. Companies that use automated payments are deploying machine learning to scan payment patterns for deviations. "If the machine learning algorithm tells them that the probability [is high] of the goods having been received and everything being good with that specific invoice, they'll pay that immediately," Tay said.
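The payment-pattern scan described above can be illustrated simply. The sketch below is hedged: a plain z-score stands in for whatever production machine learning model a vendor would actually use, and the invoice history is invented.

```python
# Illustrative deviation check on a vendor's invoice amounts.
# A z-score threshold stands in for a real ML model.
from statistics import mean, stdev

def flag_deviation(amounts, new_amount, threshold=3.0):
    """Flag new_amount if it lies more than `threshold` std devs from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return abs(new_amount - mu) > threshold * sigma

history = [1020, 980, 1010, 995, 1005, 990]    # typical monthly invoices
print(flag_deviation(history, 1000))   # False: pay automatically
print(flag_deviation(history, 9500))   # True: hold for human review
```

The operational logic is the same as in the quote: invoices that look like every previous invoice get paid immediately, and only the deviations consume a person's time.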


SaaS, PaaS, IaaS: The differences between each and how to pick the right one

In theory, PaaS, IaaS and SaaS are designed to do two things: cut costs and free organizations from the time and expense of purchasing equipment and hosting everything on-premises, DiDio said. "However, cloud computing services are not a panacea. Corporate enterprises can't just hand everything off to a third-party cloud provider and forget about them. There's too much at stake." Internal IT departments must remember what DiDio calls the "three Cs: communication, collaboration and cooperation," which she said are all essential for successful business outcomes and uninterrupted, smooth, efficient daily operational transactions. "When properly deployed and maintained, IaaS is highly flexible and scalable,'' DiDio said. "It's easily accessed by multiple users. And it's cost effective." IaaS is beneficial to businesses of all types and sizes, she said. "It provides complete and discretionary control over infrastructure… Many organizations find that they can slash their hardware costs by 50% or more using IaaS." However, IaaS "requires a mature operations model and rigorous security stacks including understanding cloud provider technologies,'' noted Vasudevan. IaaS also "requires skill and competency in resource management."


Startup uses machine learning to support GDPR’s right to be forgotten

“Every user has over 350 companies holding sensitive data on them, which is quite shocking,” says Ringel. “Not only that, but this number is growing by eight new companies a month, which means our personal footprint is highly dynamic and changing all the time.” According to Ringel, the conversation about data privacy needs to focus much more on data ownership. “Privacy is all about putting fences around us, preventing our personal information being shared with other people,” he says. “But the problem with that is that we miss out on the fun – every day we use online services and share our data with companies because it is convenient and efficient. Now, with GDPR, we can actually take our data back whenever we choose.” Once users know where their data is, Mine helps them reclaim it by submitting automated right-to-be-forgotten requests to the companies at the click of a button. For users on the trial version of Mine, the startup will email the request to the company and copy the user in on follow-up communications.
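An automated erasure request of the kind described can be sketched as follows. This is an illustration only, not Mine's actual implementation: the wording, names, and addresses are invented, and a real service would look up each company's privacy contact and track the response deadline.

```python
# Illustrative sketch: building a GDPR Article 17 ("right to be forgotten")
# request email on a user's behalf, with the user copied in for follow-up.
from email.message import EmailMessage

def build_erasure_request(user_name, user_email, company, privacy_email):
    msg = EmailMessage()
    msg["From"] = user_email
    msg["To"] = privacy_email
    msg["Cc"] = user_email          # copy the user in on follow-up comms
    msg["Subject"] = f"GDPR Article 17 erasure request - {user_name}"
    msg.set_content(
        f"Dear {company} data protection team,\n\n"
        f"Under Article 17 of the GDPR, I request the erasure of all "
        f"personal data you hold about me ({user_email}).\n\n"
        f"Regards,\n{user_name}\n"
    )
    return msg

msg = build_erasure_request("Jane Doe", "jane@example.com",
                            "ExampleCorp", "privacy@example.com")
print(msg["Subject"])
```

Generating the message is the easy part; the value of a service like Mine lies in knowing which 350-odd companies to send it to.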


Serverless Cloud Computing Will Drive Explosive Growth In AI-Based Innovation

As cloud computing has advanced, more companies have made the transition to the cloud-based platform-as-a-service (PaaS) model, which delivers computing and software tools over the internet. PaaS can be scaled up or down as needed, which reduces up-front costs and allows you to focus on developing software applications instead of dealing with hardware-oriented tasks. To support this shift toward the PaaS cloud, public cloud companies have begun heavily investing in building or acquiring serverless components that have pre-built unit functionality. These out-of-the-box tools allow organizations to test new concepts, iterate and evaluate without taking on high risk or expense. In the past, only large companies with considerable resources could afford to experiment with AI-based innovation. Now startups or small teams within larger enterprises have access to cloud-based, prepackaged algorithms offering different AI models that can fast-track innovation. Let’s explore practical examples of how this trend helps democratize innovation in artificial intelligence by minimizing the time, money and resources needed to get started.


The Past, Present And Future Of Oracle’s Multi-Billion Dollar Cloud Bet

Larry had more confidence than I did. He was sure of it. I was more cautiously optimistic. We started running our little business on QuickBooks because we hadn’t built our system yet. When our system got to the point where we could run our own business on it, I imported our QuickBooks file and saw our business in a browser at home. I was at home looking at all the key metrics of how we were spending, and how we were growing, and who our employees were, all there in the browser. That’s when I was sure it was going to work, because I knew we were first to do that. I felt that with Larry’s strong backing we’d be able to reach a lot of companies, and that’s what happened. He was sure from the very beginning. It really was his idea to do it as a web-based application. He was the pioneer, and this was before Salesforce.com started, which he was also involved with. He wanted to do accounting, and I encouraged us to move beyond just accounting, and together we came up with this concept of the suite, and thus the name of the company, ultimately, became NetSuite.


Rogue IoT devices are putting your network at risk from hackers


Security standards for IoT devices aren't as stringent as they are for other products such as smartphones or laptops, so in many cases IoT manufacturers have been known to ship highly insecure devices – and sometimes these products never receive any sort of patch, either because the user isn't aware of how to apply it or because the company never issues one. A large number of connected devices are also easily discoverable with the aid of the IoT search engine Shodan. Not only does this leave IoT products potentially vulnerable to being compromised and roped into a botnet; insecure IoT devices connected to corporate networks could enable attackers to use something as trivial as a fitness tracker or a smartwatch as an entry point into the network, and use it as a means of further compromise. "Personal IoT devices are easily discoverable by cybercriminals, presenting a weak entry point into the network and posing a serious security risk to the organisation. Without a full view of the security policies of the devices connected to their network, IT teams are fighting a losing battle to keep the ever-expanding network perimeter safe," said Malcolm Murphy, Technical Director for EMEA at Infoblox.


Europe’s new API rules lay groundwork for regulating open banking


The EU and the U.K. have both passed laws that explicitly require their banks to create application programming interfaces and open those APIs to third-party developers. And banks in the U.S. should take notice. These new laws are paving the way to standardization for open banking, which could lead to rapid innovation and a competitive advantage for the European banking system. They are also friendlier to fintech companies, as they streamline access to a growing network of bank data. Fintechs in the U.S. must create individual data-sharing agreements with each bank partner, and the negotiations for each partnership can be resource-intensive. In the EU, however, a fintech can get access to all bank APIs by registering as an account information service provider (AISP) or payment initiation service provider (PISP). This could create a situation where the U.S. loses out on technology investments and sees innovative financial professionals leave the nation to work in the rapidly advancing open-banking environment within the EU.



Quote for the day:


"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr


Daily Tech Digest - February 02, 2020

Just how big a deal is Google’s new Meena chatbot model?

Meena can chat, over a few turns of a conversation, believably. Meena, however, cannot reliably teach you anything. Meena is not trying to help you finish a task or learn something new specifically. It converses with no explicit goal or purpose. While we probably spend too much of our time chatting about not much of importance, we tend to be looking for something specific when interacting with a bot-powered digital service. We want to get a ticket booked or a customer support issue resolved. Or we want accurate information about a particular domain, or emotional or psychological support for a challenge we are facing. Conversational products have a purpose, and even if they fail at more open-ended questions, they are trying to work with you to complete a task. Meena places the human-likeness of the conversation above all. However, there is much for us to learn about what an appropriate conversational approach is for different types of tasks. There is research showing that more “robot”-like responses are preferable in certain situations (especially where sensitive personal information is involved) and that being human-like is not the be-all and end-all of bots. Where does Meena, with the conversations it has learned from social media interactions, find a role?



Environmental IoT is one area they say could benefit. In smart cities, for example, bacteria could be programmed to sense for pollutants. Microbes have good chemical-sensing functions and could turn out to work better than electronic sensors. In fact, the authors say that microbes share some of the same sensing, actuating, communicating and processing abilities that the computerized IoT has. In the case of sensing and actuating, bacteria can detect chemicals, electromagnetic fields, light, mechanical stress and temperature — just what’s required in a traditional printed circuit board-based sensor. Plus, the microbes respond. They can produce colored proteins, for example. And not only that, they respond in a more nuanced way compared to the chip-based sensors. They can be more sensitive, as one example. ... Bacteria should become a “substrate to build a biological version of the Internet of Things,” the scientists say. Interestingly, similar to how traditional IoT has been propelled forward by tech hobbyists mucking around with Arduino microcontrollers and Raspberry Pi educational mini-computers, Kim and Posland reckon it will be do-it-yourself biology that will kick-start IoBNT.


AI still doesn’t have the common sense to understand human language


The test was originally designed with the idea that such problems couldn’t be answered without a deeper grasp of semantics. State-of-the-art deep-learning models can now reach around 90% accuracy, so it would seem that NLP has gotten closer to its goal. But in their paper, which will receive the Outstanding Paper Award at next month’s AAAI conference, the researchers challenge the effectiveness of the benchmark and, thus, the level of progress that the field has actually made. They created a significantly larger data set, dubbed WinoGrande, with 44,000 of the same types of problems. To do so, they designed a crowdsourcing scheme to quickly create and validate new sentence pairs. (Part of the reason the Winograd data set is so small is that it was hand-crafted by experts.) Workers on Amazon Mechanical Turk created new sentences with required words selected through a randomization procedure. Each sentence pair was then given to three additional workers and kept only if it met three criteria: at least two workers selected the correct answers, all three deemed the options unambiguous, and the pronoun’s references couldn’t be deduced through simple word associations.
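The three-way validation rule can be expressed directly. The function below is a sketch with invented parameter names; a boolean flag stands in for the separate association-based filtering the researchers applied, which in reality involves running baseline models rather than a per-pair label.

```python
# Sketch of the WinoGrande-style keep/discard rule described above:
# keep a candidate sentence pair only if at least two of three workers
# answered correctly, all three judged the options unambiguous, and the
# pronoun's referent cannot be recovered by simple word association.

def keep_pair(worker_answers, correct, unambiguous_votes, solved_by_association):
    correct_count = sum(a == correct for a in worker_answers)
    return (correct_count >= 2
            and all(unambiguous_votes)
            and not solved_by_association)

# Kept: 2 of 3 workers correct, unanimous on clarity, no association shortcut.
print(keep_pair(["A", "A", "B"], "A", [True, True, True], False))   # True
# Discarded: one worker found the options ambiguous.
print(keep_pair(["A", "A", "A"], "A", [True, True, False], False))  # False
```

Filtering on all three criteria at once is what keeps the crowdsourced pairs hard: easy or ambiguous items never make it into the data set.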



A new bill could punish web platforms for using end-to-end encryption


The bill doesn’t lay out specific rules. But the committee — which would be chaired by the Attorney General — is likely to limit how companies encrypt users’ data. Large web companies have moved toward end-to-end encryption (which keeps data encrypted for anyone outside a conversation, including the companies themselves) in recent years. Facebook has added end-to-end encryption to apps like Messenger and WhatsApp, for example, and it’s reportedly pushing it for other services as well. US Attorney General William Barr has condemned the move, saying it would prevent law enforcement from finding criminals, but Facebook isn’t required to comply. Under the EARN IT Act, though, a committee could require Facebook and other companies to add a backdoor for law enforcement. Riana Pfefferkorn, a member of the Stanford Law School’s Center for Internet and Society, wrote a detailed critique of the draft. She points out that the committee would have little oversight, and the Attorney General could also unilaterally modify the rules. The Justice Department has pushed encryption backdoors for years, citing threats like terrorism, but they haven’t gotten legal traction. Now, encryption opponents are riding the coattails of the backlash against big tech platforms and fears about child exploitation online.


Technologies of the future, but where are AI and ML headed to?


The fluid nature of data science allows people from multiple fields of expertise to come and crack it. Shantanu believes that if JRR Tolkien, brilliant linguist that he was, had pursued data science to develop NLP models, he would have been the greatest NLP expert ever, and that is the kind of liberty and scope data science offers. ... For a country like India, acquiring new skills is not a luxury but a necessity, and the trends of upskilling and reskilling are rising to meet it. But data science, machine learning, and artificial intelligence are fields where mere book-reading and formulaic interpretation and execution do not cut it. For a competitive career in these futuristic technologies, machine learning and data science demand a fundamental understanding of probability, statistics, and mathematics. To break the myth that this market is only for programmers and software developers: machine learning involves an understanding of basic programming languages (Python, SQL, R), linear algebra and calculus, as well as inferential and descriptive statistics.


Why Cybersecurity Training Needs a Serious Refresh

It’s easy to understand that if the technology market moves very fast, the security segments of it move even faster. This is the very definition of a dynamic environment—new dangers appear on the threat matrix every day, which means the ground is always shifting. It’s also easy to see how good security technology meets this challenge by constantly updating itself to combat new incoming threats. But here’s where it gets murky: Can we as individuals keep pace with the threats? And if we can’t, can even the most sophisticated tools ward off all dangers? No, we can’t, and that’s a big reason why the bad guys are usually ahead. Think of it as the human factor. The tools keep getting better, but inside this swirling vortex of innovation and sophistication, we as people—consumers, business professionals, and security specialists—have to scramble to understand new dangers and newer defenses. Even for tech teams dedicated to protecting the network, it’s a constant nightmare. For the rest of us, the reality is that while the threat matrix changes by the hour, IT security sessions take place maybe a few times a year, and it’s hard to even fit those into a busy schedule.


10 Key Challenges for Fintech Startups Worth Your Attention

Fintech Regulatory Issues
The financial services industry is arguably the most regulated in the world. Laws are enacted to safeguard financial systems from abuse. The emergence of fintech has changed the way we view and handle money, creating a grey area for regulation. This issue has drawn the attention of regulators and lawmakers. Fintech startups therefore have to contend with different regulatory hiccups on a day-to-day basis because of their unstructured operating models. Besides, regulations on fintech operations vary from one jurisdiction to another, so startups should fully understand the legal complications before operating in a particular country. While fintech has brought much disruption to the financial industry, banks will not just sit back and watch as they lose their market share. Also, fintech ventures don’t only compete with existing financial powerhouses, such as PayPal; they will soon have to contend with new players, such as Amazon and other technology behemoths foraying into financial services. Due to their strong asset base, banks wield clout and can either buy out fintech companies or partner with them. As a venture, you should decide whether to confront the big guys head-on or instead explore greener pastures.


Why we’re failing to regulate the most powerful tech we’ve ever faced

Given the force of this technology, shouldn’t governments be bracing for its effect with robust regulations? The U.S. government so far is taking a mostly hands-off approach. U.S. Chief Technology Officer Michael Kratsios warned federal agencies against over-regulating companies developing artificial intelligence. There are views, too, that the U.S. government doesn’t want to issue meaningful regulation, that the administration finds regulation antithetical to its core beliefs. There is greater movement underway by the European Union (EU), which will issue a paper in February proposing new AI regulations for “high-risk sectors,” such as healthcare and transport. These rules could inhibit AI innovation in the EU, but officials say they want to harmonize and streamline rules in the region. China is pursuing a different strategy designed to tilt the playing field to its advantage as exemplified by its standards efforts for facial recognition. Ultimately, it is in the worldwide public interest for the AI superpowers, the U.S. and China, to collaborate on common AI principles.


Great Powers Must Talk to Each Other About AI


While the dynamics of artificial intelligence and machine learning, or ML, research remain open and often collaborative, the military potential of AI has intensified competition among great powers. In particular, Chinese, Russian and American leaders hail AI as a strategic technology critical to future national competitiveness. The military applications of artificial intelligence have generated exuberant expectations, including predictions that the advent of AI could disrupt the military balance and even change the very nature of warfare. At times, the enthusiasm of military and political leaders appears to have outpaced their awareness of the potential risks and security concerns that could arise with the deployment of such nascent, relatively unproven technologies. In the quest to achieve comparative advantage, military powers could rush to deploy AI/ML-enabled systems that are unsafe, untested or unreliable. As American strategy reorients toward strategic competition, critical considerations of surety, security and reliability around AI/ML applications should not be cast aside.


Burn, drown, or smash your phone: Forensics can extract data anyway

JTAG stands for Joint Test Action Group, the industry association formed to create a standard for testing integrated circuits. The NIST study only included Android devices because most Android devices are "J-taggable," while iOS devices aren't. The forensic technique takes advantage of taps, short for test access ports, which manufacturers usually use to test their circuit boards. By soldering wires onto taps, investigators can access the data from the chips. To perform a JTAG extraction, Reyes-Rodriguez first broke the phone down to access the printed circuit board (PCB). She carefully soldered wires as thin as a human hair onto small metal components called taps, which are about the size of the tip of a thumbtack. "JTAG is very tedious and you do need a lot of training," says Ayers. "You need to have good eyes and very steady hand." The researchers compared JTAG to the chip-off method, another forensic technique. While the JTAG work was done at NIST, the chip-off extraction was conducted by the Fort Worth Police Department Digital Forensics Lab and VTO Labs, a private forensics company in Colorado.



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman