Daily Tech Digest - June 23, 2020

Four Steps Public-Sector CIOs Should Take To Break Down Silos Impeding Innovation

Government agencies, almost by design, are large and slow-moving. When something goes wrong, the response is often to add another policy and another layer of approvals and reviews. This slows things down even more and frustrates efforts by CIOs and other decision-makers to make informed and timely choices. Further inhibiting—and complicating—operations, individual mission centers facing bureaucratic barriers often create their own duplicative capabilities, delivered quickly and effectively, but just for their own use. These silos are especially common when it comes to information technology and are given the pejorative label of “Shadow IT” by CIOs and others at the enterprise level who want to assert control over all agency technology. ... Don’t reinvent solutions just because that’s the way it’s been done. Resist the urge to customize. Change your policies and practices, if you can, so you can set and use standards that break down application, data and user silos. Push back internally on those policies that exist for the lowest common denominator. Challenge your technologists to leverage these standards and build tools that can solve enterprise problems at speed and scale. 


Italian Banking Association ready to trial Central Bank Digital Currency

The announcement read: "Italian banks are available to participate in projects and experiments of a digital currency of the European Central Bank, contributing, thanks to the skills acquired in the creation of infrastructure and distributed governance, to speed up the implementation of a European-level initiative in a first nation." A year ago the Association of Italian Banks set up a working group dedicated to deepening its understanding of digital coins and crypto assets. The group issued 10 recommendations, which include: monetary stability and full respect for the European regulatory framework must be preserved as a matter of priority; Italian banks are already operating on a distributed ledger technology (DLT) infrastructure with the Spunta project, and they intend to be part of the change brought about by an important innovation such as digital coins; and a programmable digital currency represents an innovation in the financial field capable of profoundly revolutionizing money and exchange. This is a transformation capable of bringing significant potential added value, in particular in terms of the efficiency of operating and management processes. ...


The next software disruption: How vendors must adapt to a new era

The rise of PaaS has changed what it takes to be a successful enterprise-software vendor. As PaaS services become more sophisticated, software application vendors have a tougher time justifying a price premium for products that could be delivered with a thin user interface on top of generic PaaS services. With PaaS tools giving competitors and customers themselves the means to develop new applications quickly, software vendors that do not innovate in kind will face increased risk. Software vendors need to defend their share of the profit pool by taking a clear look at where they have the best and most defensible opportunities to differentiate themselves. Rather than going head-to-head with the Big Three, one strategy is to specialize and tailor solutions to the needs of targeted verticals and use cases. This strategy proved successful in the early 2010s, when SaaS disruptors first entered the market. The legacy-software vendors that were closest to the customer and had a high degree of industry and domain expertise protected their market share and maintained their enterprise value-to-revenue multiples, while vendors that stressed differentiation on the basis of their technology were more vulnerable.



How Manufacturers Can Address Cybercrime in the Ongoing Pandemic

Security has never been a top priority for manufacturers. Security features and best practices are often not taken into account when new products are purchased. With COVID-19 requiring companies across all industries to explore remote workforce options, manufacturing companies prioritized, and invested in, automation systems that make it easier for their employees to do their jobs from the safety of their homes. Although it is encouraging to see companies making investments to support their employees, many automation tools are being purchased without considering their security features. Standard security best practices, such as checking for previously reported vulnerabilities, changing factory settings and passwords, and training employees in secure ways to use the new solutions, are not happening. With fewer guards and controls in place, it's easy for industrial control systems to be hacked simply through accident or user error. Despite the challenges plaguing the industry -- outdated technology, a disconnect between safety and security, and vulnerabilities associated with remote work operations -- there are small steps that manufacturers can take to significantly improve their security posture.


IoT Security Is a Mess. Privacy 'Nutrition' Labels Could Help

At the IEEE Symposium on Security & Privacy last month, researchers from Carnegie Mellon University presented a prototype security and privacy label they created based on interviews and surveys of people who own IoT devices, as well as privacy and security experts. They also published a tool for generating their labels. The idea is to shed light on a device's security posture but also explain how it manages user data and what privacy controls it has. For example, the labels highlight whether a device can get security updates and how long a company has pledged to support it, as well as the types of sensors present, the data they collect, and whether the company shares that data with third parties. “In an IoT setting, the amount of sensors and information you have about users is potentially invasive and ubiquitous," says Yuvraj Agarwal, a networking and embedded systems researcher who worked on the project. "It’s like trying to fix a leaky bucket. So transparency is the most important part. This work shows and enumerates all the choices and factors for consumers." Nutrition labels on packaged foods have a certain amount of standardization around the world, but they're still more opaque than they could be. And security and privacy issues are even less intuitive to most people than soluble and insoluble fiber.


Smart Devices: How Long Will Security Updates Be Issued?

Europe's automobile industry is bound by regulations for supporting vehicle components to ensure consumers have access to critical parts, says Brad Ree, CTO of ioXt and board member with the ioXt Alliance, which is a trade group dedicated to securing IoT devices. But Ree says with connected devices, no regulator has yet made the leap to ensure that the software is supported for an extended period. "Right now, consumers really don't know how long the product is going to be supported," Ree says. That's critical because smart devices cost more than devices without software control features. The U.S. is trying to nudge manufacturers in the right direction. Two years ago, the National Telecommunications and Information Administration created a document about what type of information companies should clearly communicate to consumers before they buy a smart device. The voluntary recommendations include describing whether and how a device receives security updates and the anticipated timeline for the end of security support. 



Why the open source DBaaS market is hot

"The good news is that there's a lot of open source database choice for organizations," said James Curtis, senior research analyst at S&P Global. "The bad news is that there's a lot of open source choice, and that can cause some confusion." While a growing number of vendors support open source database products, the public cloud vendors also offer versions of many popular open source databases, Curtis noted. For example, AWS boasts a managed Cassandra service, as well as support for MySQL and PostgreSQL with its Relational Database Service (RDS). When they are ready to decide which route to take, Curtis said, organizations need to choose a vendor that provides the support they are looking for. For open source database vendors, DBaaS might also represent a threat, as it has the potential to replace or cannibalize existing on-premises deployments. Among DBaaS benefits, one of the most important is reducing the time organizations need to spend managing infrastructure. "What will happen in the future is that database workloads will gravitate to the right environment in which it makes sense to run that workload," Curtis said. "Some workloads are best suited to run on premises and perhaps always will."


Organizations Must Reset Expectations to Spring Back from Pandemic

The first step is identifying an organization’s critical assets and the missions they support. The SEI's foundational process improvement approach to operational resilience management, the CERT Resilience Management Model (CERT-RMM), defines four asset types: people, facilities, technology, and information. "The COVID-19 crisis has impaired our people and our facilities, so it’s akin to a natural disaster," said Butkovic. However, most disaster plans did not anticipate that the event would affect everyone, everywhere. "Typically, you don’t have fires at all of your facilities at the same time, with little notion of when they’ll be put out. In that way, there are lessons to be learned from cyber events, which can affect all locations simultaneously." During a cyber attack, an organization might keep its technology assets out of harm's way by modifying firewall rules. During the COVID-19 pandemic, most human assets are keeping out of harm’s way by staying away from the workplace. But not all safeguards can remain in place forever. 


The Future of Work: Best Managed with Agility, Diversity, Resilience

While the future is uncertain, one clear trend is that remote work will play a larger role during and after the pandemic. After experiencing several weeks of office closures, organizational leaders are questioning the wisdom of maintaining the same amount of office space because, in most cases, employees have proved they can be productive and collaborate effectively while working remotely. On the flip side, some employees have discovered they prefer working at home, at least part-time. To effect social distancing in the short term, employers must rethink space utilization. Interestingly, they may find they've stumbled upon their longer-term strategy, which is some version of a partly remote, partly on-site workforce. With digital transformation, more tasks and processes are aided or facilitated by software. Meanwhile, organizations' tech stacks are becoming increasingly virtual (cloud-based), intelligent (machine learning and AI), and diverse (including IoT). However, digital transformation isn't just about technology implementation; it's also about cultural transformation, which reflects greater diversity and cross-departmental collaboration.


Building Resiliency in the Age of Disruption and Uncertainty

Attendees discussed how risk needs to be managed holistically. James Fong, Regional Business Director at RSA, highlighted the need to view risk in the context of four pillars, namely operations, workforce, supply chain and cybersecurity. Fong said that "Operational risk management, IT and security risk management, regulatory and corporate compliance, business resiliency, third party governance and audit management, need to be part of an integrated risk management plan." Fong continued: "Risk data needs to be shared on customised dashboards for executives, CISOs and others. The data needs to give a clear understanding of the monetary cost associated with the risk. For example, how much is a risk worth? What is the cost of the threat?" Importantly, organisations need to understand the risk associated with third party suppliers. A common view expressed was that no matter how much you prepare, there will always be instances when organisations need to react to situational change -- for example, incoming threats that can choke or change content in the media industry.



Quote for the day:

"Challenges in life always seek leaders and leaders seek challenges." -- Wayde Goodall

Daily Tech Digest - June 22, 2020

Future in Fintech in 2020: a revolution in financial sectors

Experts expect 2020 to bring tremendous change to the future of the Fintech industry. Goldman Sachs predicts that by the end of 2020, the worldwide Fintech pie will total $4.7 trillion. Parenthetically, an interesting fact is that nearly one-third of Goldman Sachs' employees are engineers, a higher share than at Twitter or Facebook. We identified the five main trends in Fintech banking that are going to disrupt the industry and drive immense growth. The age of fully digital banking is approaching. The majority of existing banks already offer global payments and transfers virtually, and those that don't yet will join the trend. Trading currencies online, along with Bitcoin and Ethereum, will become an everyday activity, and according to the forecasts, it will lead to a 36% drop in physical bank visits by 2020. Though blockchain technology became a widely discussed topic in 2019, its adoption in financial services was relatively slow compared to other spheres. The future of Fintech in 2020 is intimately tied to blockchain technology, chiefly because of the transparency and trust it guarantees, significantly decreasing the time needed for transactions and improving cash flow. 77% of surveyed incumbents expect to adopt blockchain by 2020.


New taskforce to push cyber security standards

It follows earlier reports on Monday that the federal government is crafting minimum cyber security standards for businesses, including critical infrastructure, as part of its next cyber security strategy.  The taskforce will focus its efforts on “harmonising baseline standards and providing clarity for sector specific additional standards and guidance” and improving interoperability. It also aims to enhance "competitiveness standards by sector for both supplier and consumers” and support Australian cyber security companies to seize opportunities globally. ... “We know that the current plethora of different security standards make it difficult for government and industry to know what they’re buying when it comes to cyber security,” he said. “By bringing together industry to identify relevant standards and provide other practical guidance, we aim to make government more secure, whilst providing direction for industry to build their cyber resilience. “This will realise our ambition for NSW to become the leading cyber security hub in the Southern Hemisphere.”  


4 edge computing use cases delivering value in the enterprise

However, all that data doesn't need to be handled in centralized servers; as in the healthcare edge computing use cases, every temperature reading from every connected thermometer, for example, isn't important. Rather, most organizations only need to bring aggregate data or average readings back to their central systems, or they only need to know when such readings indicate a problem, such as a temperature on a unit that's out of normal range. Edge computing enables organizations to capture and understand the data near those endpoint devices, thereby limiting the cost and complexity of sending reams of often unneeded data points to central systems, while still gaining the benefits of understanding the performance of their equipment. The ROI of this is critical: The insights into the data generated by endpoint devices enable remote monitoring, so organizations can identify performance problems and safety issues early, even when no one is on-site. Using edge computing with predictive and prescriptive analytics can deliver even bigger ROI, as they enable organizations to predict the optimal time to service their equipment.
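As a rough sketch of the pattern described above (the unit names, thresholds, and units are invented for illustration), an edge node might reduce each batch of raw readings to a summary plus any out-of-range alerts before anything crosses the network:

```python
# Hypothetical sketch: an edge node aggregates raw sensor readings locally
# and forwards only summaries and out-of-range alerts to the central system.
from statistics import mean

NORMAL_RANGE = (2.0, 8.0)  # e.g., refrigeration unit temperature in Celsius

def process_batch(unit_id, readings):
    """Reduce a batch of raw readings to what headquarters actually needs."""
    summary = {
        "unit": unit_id,
        "avg": round(mean(readings), 2),
        "min": min(readings),
        "max": max(readings),
        "count": len(readings),
    }
    low, high = NORMAL_RANGE
    out_of_range = [r for r in readings if not (low <= r <= high)]
    if out_of_range:
        summary["alert"] = f"{len(out_of_range)} reading(s) outside {NORMAL_RANGE}"
    return summary  # only this summary crosses the network, not every reading

print(process_batch("cooler-7", [4.1, 4.3, 9.2, 4.0]))
```

Here the central system receives one small dictionary per batch instead of every data point, which is exactly the cost-and-complexity reduction the excerpt describes.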


Why There Are Silos And Gaps In SOCs… And What To Do About It

A detection is an event that looks anomalous or malicious. The issue today in a modern security operations center (SOC) is that detections can bubble up from many siloed tools. For example, you have a firewall and network detection and response (NDR) for your network protection, endpoint detection and response (EDR) for your endpoints' protection, and a cloud access security broker (CASB) for your SaaS applications. Correlating those detections to paint a bigger picture is the issue, since hackers are now using more complex techniques to access your applications and data across increased attack surfaces. Your team is either chasing false positives or unable to see through these detections and get a sense of what is critical vs. noise. The main purpose of SIEMs is to collect and aggregate data, such as logs from different tools and applications, for activity visibility and incident investigation. That said, a lot of manual tasks are still needed, like transforming the data, including data fusion to create context for it, i.e., enrichment with threat intelligence, location, asset and/or user information.
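To make that enrichment step concrete, here is a minimal, hypothetical sketch of the data fusion described: joining a raw detection with asset and threat-intelligence context so an analyst can separate critical from noise. The lookup tables, field names, and triage rule are all invented for illustration:

```python
# Illustrative sketch (all names are hypothetical): enriching raw detections
# with asset and threat-intelligence context, the kind of manual "data fusion"
# SIEM pipelines still require.
ASSET_DB = {"10.0.0.5": {"owner": "finance", "criticality": "high"}}
THREAT_INTEL = {"203.0.113.9": {"reputation": "known C2 server"}}

def enrich(detection):
    """Attach context so analysts can judge critical vs. noise."""
    enriched = dict(detection)
    enriched["asset"] = ASSET_DB.get(detection["dst_ip"], {"criticality": "unknown"})
    enriched["intel"] = THREAT_INTEL.get(detection["src_ip"], {"reputation": "none"})
    # Simple triage rule: prioritize if the asset is critical or the source is known-bad
    enriched["priority"] = (
        "high"
        if enriched["asset"].get("criticality") == "high"
        or enriched["intel"]["reputation"] != "none"
        else "low"
    )
    return enriched

alert = {"src_ip": "203.0.113.9", "dst_ip": "10.0.0.5", "signature": "beaconing"}
print(enrich(alert)["priority"])  # -> high
```

A real pipeline would pull this context from threat-intel feeds and a CMDB rather than hard-coded dictionaries, but the join-and-triage shape is the same.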


Intel Tiger Lake processors to feature built-in malware protection

Intel CET deals with the order in which operations are executed inside the CPU. Malware can use vulnerabilities in other apps to hijack their control flow and insert malicious code into the app, so that the malware runs as part of a valid application, which makes it very hard for software-based anti-virus programs to detect. These are in-memory attacks: unlike ransomware, the malicious code is never written to disk. Intel cited Trend Micro's Zero Day Initiative (ZDI), which said 63.2% of the 1,097 vulnerabilities disclosed by ZDI from 2019 to today were related to memory safety. "It takes deep hardware integration at the foundation to deliver effective security features with minimal performance impact," wrote Tom Garrison, vice president of the client computing group and general manager of security strategies and initiatives at Intel, in a blog post announcing the products. "As our work here shows, hardware is the bedrock of any security solution. Security solutions rooted in hardware provide the greatest opportunity to provide security assurance against current and future threats. Intel hardware, and the added assurance and security innovation it brings, help to harden the layers of the stack that depend on it," Garrison wrote.
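CET's shadow stack is a hardware mechanism, but the detection idea behind it can be modeled in a few lines. The sketch below is purely conceptual (the class and the addresses are invented): every call pushes a second copy of the return address to a protected region, and every return checks that the two copies still agree.

```python
# Conceptual model only: Intel CET's shadow stack is implemented in hardware.
# On every call the return address is also pushed to a protected shadow stack;
# on every return the two copies must match, or control flow has been hijacked.
class ShadowStackViolation(Exception):
    pass

class ShadowStack:
    def __init__(self):
        self._stack = []  # stands in for the hardware-protected region

    def on_call(self, return_address):
        self._stack.append(return_address)

    def on_return(self, return_address_from_normal_stack):
        expected = self._stack.pop()
        if return_address_from_normal_stack != expected:
            raise ShadowStackViolation("return address mismatch")

shadow = ShadowStack()
shadow.on_call(0x401000)
shadow.on_return(0x401000)  # legitimate return: the copies match
shadow.on_call(0x401000)
try:
    shadow.on_return(0x666)  # overwritten return address: mismatch detected
except ShadowStackViolation:
    print("hijack attempt blocked")
```

In hardware, the malware cannot tamper with the shadow copy, which is why this check defeats in-memory control-flow hijacks that software scanners miss.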


Why We Need DevOps for ML Data

We are now starting to see MLOps bring DevOps principles and tooling to ML systems. MLOps platforms like SageMaker and Kubeflow are heading in the right direction of helping companies productionize ML. They require a fairly significant upfront investment to set up, but once properly integrated, can empower data scientists to train, manage, and deploy ML models. Unfortunately, most tools under the MLOps banner tend to focus only on workflows around the model itself (training, deployment, management) -- which represents a subset of the challenges of operational ML. ML applications are defined by code, models, and data. Their success depends on the ability to generate high-quality ML data and serve it in production quickly and reliably... otherwise, it's just "garbage in, garbage out." The following diagram, adapted and borrowed from Google's paper on technical debt in ML, illustrates the "data-centric" and "model-centric" elements in ML systems.
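One small, concrete piece of "DevOps for ML data" is validating feature data before it reaches training or serving, to stop garbage-in-garbage-out at the door. The sketch below is illustrative only; the feature names and rules are invented:

```python
# Hedged sketch: declarative expectations checked against each incoming row
# of feature data, before the data is allowed into training or serving.
EXPECTATIONS = {
    "age": {"type": int, "min": 0, "max": 120},
    "income": {"type": float, "min": 0.0},
}

def validate_row(row):
    """Return a list of violations; an empty list means the row is usable."""
    problems = []
    for feature, rules in EXPECTATIONS.items():
        value = row.get(feature)
        if value is None:
            problems.append(f"{feature}: missing")
            continue
        if not isinstance(value, rules["type"]):
            problems.append(f"{feature}: expected {rules['type'].__name__}")
            continue
        if "min" in rules and value < rules["min"]:
            problems.append(f"{feature}: below {rules['min']}")
        if "max" in rules and value > rules["max"]:
            problems.append(f"{feature}: above {rules['max']}")
    return problems

print(validate_row({"age": 150, "income": 52_000.0}))  # -> ['age: above 120']
```

Running checks like these continuously on production data, not just once at training time, is the data-centric discipline the excerpt argues most MLOps tooling still lacks.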


Data Lake vs Data Warehouse

Although data warehouses can handle unstructured data, they cannot do so efficiently. When you have a large amount of data, storing all of it in a database or data warehouse can be expensive. In addition, data that comes into a data warehouse must be processed before it can be stored in some shape or structure; in other words, it must have a data model. In response, businesses began to adopt Data Lakes, which store all structured and unstructured enterprise data at large scale in the most cost-effective way. A Data Lake stores raw data and can operate without the structure and layout of the data being determined beforehand. In the case of the Data Lake, the information is structured at the output, when you need to extract and analyze the data. At the same time, the process of analysis does not affect the data in the lake: it remains unstructured, so that it can be conveniently stored and used for other purposes. This gives us a flexibility that the Data Warehouse doesn't have. Thus, the Data Lake differs significantly from the Data Warehouse. However, the LSA architectural approach can also be used in the construction of a Data Lake (my representation).
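That schema-on-read behavior can be shown in a few lines. In this minimal sketch (the record fields are invented), raw records sit in the "lake" untouched, and structure is imposed only at the moment a query needs it:

```python
# Minimal illustration of schema-on-read: raw, heterogeneous records are
# stored as-is, and a structured view is built only at read time.
import json

raw_lake = [  # stored with no upfront data model
    '{"user": "a", "amount": 12.5, "ts": "2020-06-21"}',
    '{"user": "b", "amount": "N/A"}',  # a messy record stays messy in the lake
]

def read_with_schema(raw_records):
    """Apply structure at read time, skipping records that don't fit."""
    for line in raw_records:
        record = json.loads(line)
        try:
            yield {"user": record["user"], "amount": float(record["amount"])}
        except (KeyError, ValueError):
            continue  # the raw data is untouched; only this view filters it

print(list(read_with_schema(raw_lake)))  # -> [{'user': 'a', 'amount': 12.5}]
```

A warehouse would reject or transform the second record at load time (schema-on-write); the lake keeps it, so a different query tomorrow can interpret the same raw bytes another way.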


Lessons learned: Strategies to adjust IT operations in a crisis

Some IT organizations already had a robust VPN setup, as well as sufficient laptops for staff to continue their work from home. Others, particularly those without remote work policies already in place, had to rush to adjust. But even after all the firefighting, some IT organizations have used the pandemic as an opportunity to identify potential improvements within their environments -- whether through automation or AI, security updates or a streamlined help desk. Use the below synopses of five recent SearchITOperations articles by TechTarget senior news writer Beth Pariseau to explore the adjustments organizations have made to maintain, manage and even optimize IT operations during a crisis. After a decade of overwhelmingly localized work environments in the 2010s, many IT organizations in 2020 scrambled to accommodate new restrictions and complications related to COVID-19. In-person, impromptu discussions and weekly co-located meetings became impossible with all staff offsite, which created bottlenecks and, in some cases, a slowdown in productivity.


Millions of Connected Devices Have Exploitable TCP/IP Flaws

Treck says in a statement that it has updated its TCP/IPv4/v6 software to fix the issues. JSOF notes that organizations should use Treck's stack version 6.0.1.67 or higher. JSOF dubbed the flaws Ripple20 to reflect how a single vulnerable component can have a ripple effect on "a wide range of industries, applications, companies, and people." The company is due to present its findings at Black Hat 2020, which will be a virtual event. Four of the flaws are rated critical, and two of them could be exploited to remotely take control of a device. Others require an attacker to be on the same network as the targeted device, which makes these flaws more difficult - but not impossible - to exploit. "The risks inherent in this situation are high," JSOF says. "Just a few examples: Data could be stolen off of a printer, an infusion pump behavior changed, or industrial control devices could be made to malfunction. An attacker could hide malicious code within embedded devices for years." Simpson of Armis says that patching will be time consuming since administrators may have to manually update every make and model of vulnerable device.


Introducing The Fourth Generation of Core Banking System

The cracks began to show in the wake of the global financial crisis, as banks were faced with a difficult challenge: they had to drive down the costs of their IT infrastructure to maintain competitive banking products in the market, while also adapting to shifting consumer expectations and increasingly stringent regulator demands. A notable example of the latter is the introduction of the third installment of the Basel accord, Basel III, which places increasing demands on banks to adapt their core systems in ways that seem to directly clash with the traditional model of end-of-day batch-style processing, such as the requirement of intraday liquidity management. All of a sudden the banks found themselves facing two key challenges that seemed to conflict with each other. To drive the cost of their infrastructure down they needed to get rid of their mainframes and run on leaner infrastructure, which would lower the glass ceiling on the amount of processing power in the system. At the same time, to adapt to regulatory changes they had to increase the frequency of their batch processing jobs, which would require more processing power.



Quote for the day:

"Management is efficiency in climbing the ladder of success; leadership determines whether the ladder is leaning against the right wall." -- Stephen Covey

Daily Tech Digest - June 21, 2020

Core systems strategy for banks

There are two main options (with a few variations) for banks that conclude that they need to replace their core banking system: a traditional enterprise core banking system (self-hosted or as a utility) and a next-generation cloud-based core banking system. Most current implementations are still of the traditional variety. But we are seeing an increase in banks of all sizes putting off traditional core implementations with the aim of experimenting with next-gen systems. There is some evidence to suggest that banks will try to shift en masse to a cloud-based microservice architecture in the next few years. The core method of communication between machines will be APIs. Armed with a microservice-based architecture, the new core banking applications will become core enablers of the shift to this architecture. Traditional core banking providers have become aware of the need and potential inherent in a cloud-based microservice architecture; banking leaders should keep a close watch on developments here. We also expect to see some M&A activity between traditional and next-gen core banking system providers.


Cybersecurity In The M&A Process: A CISO's Strategy

IT departments and information security professionals are traditionally not included in the discussions leading into a merger or acquisition and are usually not given the liberty to conduct their own assessments prior to M&A execution. This can lead to a dramatic increase in cyber risks or, even worse, inheriting compromised networks. With the rapid scaling of organizations in the world of M&A, it can become exponentially more difficult to control cybersecurity risks when information security departments are already struggling to keep attackers at bay with the limited personnel and resources they have. However, there are strategies that can help get information security professionals into business conversations regarding M&As. If the cards are played correctly, this can lead to positive financial and cybersecurity outcomes. Develop a proactive plan within your organization to leverage cybersecurity as a tool at the negotiation table for the M&A process. The equation is simple: If your organization inherits a compromised network or an organization that has a poor security posture, this will cost you extra dollars that are unseen through the lens of traditional M&A cost calculations.


North Korean state hackers reportedly planning COVID-19 phishing campaign targeting 5M across six nations

SingCERT confirmed it received "information regarding a potential phishing campaign" and, in response, posted an advisory on its website Friday. It said there were "always" ongoing phishing attempts by various cybercriminals that used different themes and baits and spoofed different entities. This tactic remained a common and effective technique used to gain access to individuals' accounts, deliver malware, or trick victims into revealing confidential data, said SingCERT, which sits under the Cyber Security Agency (CSA). ZDNet asked the government agency several questions, including whether there had been a database breach and what tools the Manpower Ministry had adopted to prevent its email accounts from spoofing attacks. It did not respond specifically to any of the questions and, instead, issued a response confirming that the CSA had reached out to relevant parties to notify them about the potential phishing campaign. "Opportunistic cybercriminals have been using the COVID-19 situation to conduct malicious cyber activities and with the increasing reliance on the internet during this period, it is important to be vigilant," the agency said.


CIA Finds It Failed to Secure Its Own Systems

The report calls out the CIA's Center for Cyber Intelligence for not prioritizing internal cybersecurity and focusing, instead, on developing offensive cyber weapons. This lax attitude toward preventive cybersecurity measures within the CIA continued even after previous high-profile data breaches of the agency and other intelligence departments, the report states. On Tuesday, Wyden wrote to John Ratcliffe, the director of national intelligence, demanding to know if the U.S. intelligence community planned to implement better cybersecurity practices and questioning why the CIA did not do more to protect its internal security operations from both outside attacks and internal threats. "The lax cybersecurity practices documented in the CIA's WikiLeaks Task Force report do not appear to be limited to just one part of the intelligence community," Wyden writes. "The Office of the Inspector General of the Intelligence Community revealed in a public summary of a report it published last year that it found a number of deficiencies in the intelligence community's cybersecurity practices."


Cyber Security Careers Germany – Finding New Roles in a Burgeoning Sector

From machine learning to autonomous response, cyber security is a burgeoning space and this is creating opportunities across Germany, from Berlin and Frankfurt to Cologne, Munich and Hamburg. Whether local markets are largely comprised of businesses still in lockdown or those that have returned to socially distanced office environments, Glocomms Germany expert consultants are able to ensure that organisations are able to meet their recruitment needs and individuals can begin planning career-defining moves. As the business world continues to adapt to the impact of COVID-19 on networks and systems, cyber security remains at the top of the agenda across sectors. Luis Rolim, Chief Marketing Officer at Glocomms commented "As the world emerges from the COVID-19 pandemic, Glocomms remains at the forefront of delivering quality talent to the technology sector. We're in this together and we look forward to helping businesses across Germany with their recruitment and talent acquisition." Glocomms Germany is part of the Phaidon International group and is a trusted recruitment partner in Europe and beyond.


What is emotion AI and why should you care?

One of the areas of emotion AI is sentiment analysis, a field that has existed since at least the early 2000s. Sentiment analysis is usually conducted on textual data, be it emails, chats, social media posts, or survey responses. It uses NLP, computational linguistics, and text analytics to infer the positive or negative attitude (aka "orientation") of the text's writer: Do they say good or bad things about your brand and your products or services? The obvious applications of sentiment analysis have been brand/reputation management (especially on social media), recommender systems, content-based filtering, semantic search, understanding user/consumer opinions to inform product design, triaging customer complaints, etc. Several of the conference presentations were devoted to this topic, which, despite all the recent progress in NLP and related fields, is still hard. Not least because there is little agreement among researchers on even what constitutes basic human emotions and how many of them there are, said Bing Liu, Professor of Computer Science at the University of Illinois at Chicago. Emotions are also notoriously hard to identify and code (label), since they are ambiguous, shifting, overlapping, and adjacent. For example, one can feel anger, sadness, and disgust at the same time. Moreover, emotions are not always easy to pin down.
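The simplest form of the textual sentiment analysis described above is a lexicon-based scorer. The toy sketch below is illustrative only; real systems use NLP models, and the word lists here are invented:

```python
# Toy lexicon-based sentiment scorer: count positive vs. negative words
# and report the overall orientation of the text.
POSITIVE = {"good", "great", "love", "excellent", "reliable"}
NEGATIVE = {"bad", "terrible", "hate", "broken", "slow"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' for a piece of text."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this brand but support is slow and broken"))  # -> negative
```

Even this crude approach shows why the problem is hard: negation ("not good"), sarcasm, and mixed emotions in one sentence all defeat simple word counting, which is where modern NLP models come in.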


Security surprise: Four zero-days spotted in attacks on researchers' fake networks

To examine the security threats to industrial systems, the researchers used a network of 120 high-interaction honeypots – fake industrial infrastructure – in 22 countries to mimic programmable logic controllers and remote terminal units. Over a period of 13 months, there were 80,000 interactions with the honeypots – mostly scans – and nine interactions that made malicious use of an industrial protocol. While that might sound like a small number, four of the nine interactions also featured previously unknown attacks, or zero-days, one being the first use of a previously identified proof-of-concept attack in the wild. The attack types include denial-of-service and command-replay attacks. These vulnerabilities and associated exploits were disclosed to the device manufacturers. "While the yield was small, the impact was high, as these were skilled, targeted exploits previously unknown to the ICS community," the researchers said. The research was presented at a NATO-backed cybersecurity conference.


Revised DOJ compliance guidance offers risk-management lessons for cybersecurity leaders

“One of the reasons the DOJ puts this out is to help compliance officers and security teams and people who are worried about bribery and corruption to ensure that the board and leadership give enough attention to these issues and properly fund them to mitigate risk,” Penman says. Regardless of whether civil or criminal litigation is involved, the kind of guidance DOJ puts out is devoured by compliance officers across all organizations, Penman says, and when it comes to compliance, cybersecurity is top of mind for those executives. “We’re just about to publish results of a survey of around 1,400 compliance officers. The highest priority or concern for risk compliance programs in that survey was enhancing data privacy and cybersecurity and data protection.” Compliance programs are more critical than ever given the COVID-19 crisis, Alison Furneaux, vice president of marketing for cybersecurity compliance management company CyberSaint, tells CSO. “The attack surface has expanded dramatically. Organizations are being forced to innovate. They’re being forced to put into place processes that they didn’t have before. They’re being forced to document and prepare for audits in a much more proficient way.”


The Difference Between Enterprise Architecture and Solutions Architecture

Perhaps it’s misleading to use “versus” to describe the difference between enterprise architecture and solutions architecture. They are very much collaborators in the organization and should not be looked at as competitive in terms of which provides more value. A better way of highlighting the difference between the two is through their focus on strategy vs. technology. A focus on strategy implies a broad, rather than deep, understanding of the mechanics of any given technology. This is because there is a lot more to strategy than just the technology needed to implement it. A skewed focus on technology would mean that the processes, people and other variables required to inform strategy are ignored. Conversely, a focus on technology is necessary to ensure implementations and operations can run smoothly. By its nature, it is more “in the weeds” and so the necessary holistic perspective of the organization can be harder to understand and/or account for. With their holistic view of the organization, enterprise architects take on the strategy. They then use their strategic planning perspective to inform and delegate to solutions architects.


Police ties to Ring home surveillance come under scrutiny

The idea of cameras in police investigations isn’t new. Grainy black-and-white footage has been used for surveillance for years. But newer products that cost as little as $100 and connect with a cellphone make the market much more accessible. And the more people have the cameras, the more appealing their potential becomes for police and government officials. More localities are joining the registry trend. At least 75 police departments and municipalities in 21 states announced programs since 2018, according to a Stateline review. “I do think for law enforcement it’s easy to understand the appeal,” said Lior Strahilevitz, a professor at the University of Chicago’s Law School. “There are a lot of instances where if only there had been a bystander on that corner at that time, the crime could have been solved.” The registries come in a variety of forms — some a simple spreadsheet, others a more sophisticated account with vendors such as a Motorola-run program called CityProtect. (A Motorola spokeswoman declined to give a specific number but said “hundreds” of police agencies use its CityProtect service for registering cameras and/or reporting crime.) The registries can include any kind of camera from Ring to Nest to lesser known brands.



Quote for the day:

"The highest reward for a man's toil is not what he gets for it but what he becomes by it." -- John Ruskin

Daily Tech Digest - June 20, 2020

Linux Foundation and Harvard announce Linux and open-source contributor security survey

Here's how it works: The Core Infrastructure Initiative (CII) Best Practices badge shows a project follows security best practices. The badges let others quickly assess which projects are following best practices and are more likely to produce higher-quality secure software. Over 3,000 projects are taking part in the badging project. There are three badge levels: passing, silver, and gold. Each level requires that the OSS project meet a set of criteria; for silver and gold that includes meeting the previous level. The "passing" level captures what well-run OSS projects typically already do. A passing score requires the programmers to meet 66 criteria in six categories. For example, the passing level requires that the project publicly state how to report vulnerabilities to the project, that tests are added as functionality is added, and that static analysis is used to analyze software for potential problems. As of June 14, 2020, there were 3,195 participating projects, and 443 had earned a passing badge. The silver and gold level badges are intentionally more demanding. The silver badge is designed to be harder but possible for one-person projects.


The startup making deep learning possible without specialized hardware

It didn’t take long for the AI research community to realize that this massive parallelization also makes GPUs great for deep learning. Like graphics rendering, deep learning involves simple mathematical calculations performed hundreds of thousands of times. In 2011, in a collaboration with chipmaker Nvidia, Google found that a computer vision model it had trained on 2,000 CPUs to distinguish cats from people could achieve the same performance when trained on only 12 GPUs. GPUs became the de facto chip for model training and inferencing—the computational process that happens when a trained model is used for the tasks it was trained for. But GPUs also aren’t perfect for deep learning. For one thing, they cannot function as a standalone chip. Because they are limited in the types of operations they can perform, they must be attached to CPUs for handling everything else. GPUs also have a limited amount of cache memory, the data storage area nearest a chip’s processors. This means the bulk of the data is stored off-chip and must be retrieved when it is time for processing. The back-and-forth data flow ends up being a bottleneck for computation, capping the speed at which GPUs can run deep-learning algorithms.


Company boards aren't ready for the AI revolution

Beyond governance of Big Data and AI, there’s a second bottleneck and that’s talent. The well-worn phrase is true: every business is a technology company now; soon, though, most will also be AI companies. So when it comes to hiring good data scientists and AI experts, these businesses will have to compete not only with their peers but also tech giants like Facebook, Amazon and Google. Instead of attempting to raid the physics and mathematics departments of their local universities for talent, I therefore recommend that companies look elsewhere for AI experts - on their own payroll. Most businesses have incredible talent in-house. All they have to do is provide their staff with the necessary training and support, which can be done with the help of technology partners, provided these are platform-agnostic so that they can support a wide range of technologies and use cases. Training will have to be delivered on two levels. The first is AI enablement, by training staff to program and handle the technical aspects of AI and machine learning; they need to understand how to use bots, deploy robotic process automation and use machine learning to harness big data.


The digital divide: Not everyone has the same access to technology

As we exit the immediate crisis here, the health crisis, and move into a period of economic recovery, we're certainly going to see tremendous amounts of job loss, transitions in needed skills, and our labor force is going to be dramatically affected around the world by what's happening now. We do have an opportunity to think about re-skilling in a new way. Can we provide certain swaths of the economy with educational resources that will help them participate in the technology economy in ways that were not permissible or possible before? Can we think through an infrastructure build that will enable schools, for example, in rural areas or in parts of the world that haven't traditionally had access to technology, to train their students in these kinds of skills? I think there is an opportunity to think systemically about changes that are needed, that have been needed for a long time, quite frankly, and to use this recovery period as an opportunity to bridge that divide and to ensure that we're providing opportunities for everyone. 


How Decentralization Could Alleviate Data Biases In Artificial Intelligence

A few projects are also exploring the potential for blockchain-based federated learning, so to speak, in improving AI outcomes. Federated learning makes it possible for AI algorithms to amass experience from a wide range of siloed data. Instead of having the data moved to the computation venue, the computation happens at the data location. Federated learning allows data providers to retain control over their data. However, privacy risks lurk whenever federated learning is employed. Blockchain is able to alleviate this risk thanks to its superior traceability and transparency. Also, a smart contract could be used to discourage malicious players by requiring a security deposit, which is only refundable if the algorithm doesn’t violate the network’s privacy standards. Ocean Protocol and GNY are two projects exploring blockchain-based federated learning. Ocean recently launched a product, called Compute-to-Data, which allows data providers and data consumers to securely buy and sell data on the blockchain. The Singapore-based startup already counts some enterprise names among its users, including Roche Diagnostics, the diagnostics division of multinational healthcare company F. Hoffmann-La Roche AG.
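The core federated learning idea — computation travels to the data, only model updates travel back — is usually implemented with federated averaging (FedAvg). The toy sketch below fits a one-parameter linear model across two private data silos; the data, learning rate, and silo split are invented for illustration, and none of the blockchain machinery discussed above is modeled here.

```python
# Sketch of federated averaging (FedAvg): each data holder trains locally and
# only model weights leave the silo -- the raw (x, y) samples never move.
# Toy setting: fit y = w * x by gradient descent on mean squared error.

def local_update(w, data, lr=0.01, steps=100):
    """One client's local training pass over its private (x, y) pairs."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, silos):
    """Average the locally trained weights, weighted by each silo's size."""
    n = sum(len(d) for d in silos)
    return sum(local_update(w_global, d) * len(d) for d in silos) / n

# Two silos whose data both follow y = 3x; neither ever shares raw samples.
silos = [[(1.0, 3.0), (2.0, 6.0)],
         [(3.0, 9.0), (4.0, 12.0), (5.0, 15.0)]]
w = 0.0
for _ in range(5):
    w = federated_round(w, silos)
print(round(w, 2))  # converges toward the true slope, 3.0
```

The privacy risk the article mentions survives this scheme: the exchanged weights can still leak information about the underlying data, which is what the proposed blockchain-based traceability and deposit mechanisms aim to police.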


Democratizing artificial intelligence is a double-edged sword

At one end of the spectrum is data, and the ingestion of data into data warehouses and data lakes. AI systems, and in particular ML, run on large volumes of structured and unstructured data — it is the material from which organizations can generate insights, decisions, and outcomes. In its raw form, it is easy to democratize, enabling people to perform basic analyses. Already, a number of technology providers have created data explorers to help users search and visualize openly available data sets. Next along the spectrum come the algorithms into which the data is fed. Here the value and complexity increase, as the data is put to work. At this point, democratization is still relatively easy to achieve, and algorithms are widely accessible; open source code repositories such as GitHub (purchased by Microsoft in 2018) have been growing significantly over the past decade. But understanding algorithms requires a basic grasp of computer science and a mathematics or statistics background. As we continue to move along the spectrum to storage and computing platforms, the complexity increases. During the past five years, the technology platform for AI has moved to the cloud with three major AI/ML providers: Amazon Web Services (AWS), Microsoft Azure, and Google Compute Engine.


What Will Happen When Robots Store All Our Memories

Mostly, though, Memory Bots became routine and part of the social fabric of the future as controversies faded, laws and regulations were refined to curb abuses and maximize safe usage, and people became intrigued and distracted by the latest new gadget that was going to wow them, then scare them, and then become routine. In the old Shlain Goldberg house in Marin County, you could still find Ken, or the essence and memories of Ken, captured inside an eight-inch-tall black cylindrical tube on the kitchen counter that looked remarkably like an ancient Alexa. (Sadly, Ken, as well as Tiffany, had just missed the advent of longevity tech that allowed their daughter to live thousands of years and counting.) Except that Ken-Alexa had a swivel head that was constantly recording everything, with the positive-negative filter still set right where Ken had left it, in the middle of the dial. Even when Odessa was centuries old but still looked the same as she did when she was 25, she could talk to her dad, and ask him questions, and hear him laugh.


Applying Observability to Ship Faster

We needed to learn to think in monitoring terms, learn more about monitoring tooling, and how best to monitor. Most monitoring systems are set up for platform and operations monitoring. Using these for application monitoring is taking them and engineering somewhere new. Early on, we got some weirdness out of our monitoring. The system was telling us we had issues when we didn’t. It sounds silly now, but reading and re-reading the monitoring system documentation until we really got it helped. Digging deeper into how different types of metrics and monitors were designed to be used allowed us to build a more stable monitoring system. We also found that there were things we wanted to do that we couldn’t do with out-of-the-box monitoring. Our early application monitoring was noisy and misfired. Too frequently it told us we had problems that we didn’t have. We kept iterating. We ended up building more of the monitoring in code than we expected, but it was well worth the time. We got the bare bones of a monitoring system early, and by using it in the real world, we worked out what we really needed.
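One common way to tame noisy, misfiring application monitors is to alert only when a metric stays over its threshold for several consecutive checks. The sketch below is a hypothetical illustration of that pattern, not the team's actual implementation; the class name, window sizes, and thresholds are all made up.

```python
# Toy application monitor: tracks a rolling error rate and only alerts after
# the threshold is breached several checks in a row, so a single blip does
# not page anyone. (Illustrative sketch; parameters are invented.)
from collections import deque

class ErrorRateMonitor:
    def __init__(self, window=100, threshold=0.05, patience=3):
        self.outcomes = deque(maxlen=window)  # True means the request failed
        self.threshold = threshold            # acceptable error rate
        self.patience = patience              # consecutive breaches required
        self.breaches = 0

    def record(self, failed: bool) -> bool:
        """Record one request outcome; return True when an alert should fire."""
        self.outcomes.append(failed)
        rate = sum(self.outcomes) / len(self.outcomes)
        self.breaches = self.breaches + 1 if rate > self.threshold else 0
        return self.breaches >= self.patience

mon = ErrorRateMonitor(window=10, threshold=0.3, patience=3)
# Every second request fails: sustained trouble, so alerts eventually fire.
alerts = [mon.record(failed=(i % 2 == 0)) for i in range(10)]
print(alerts.count(True))
```

Production alerting systems express the same idea declaratively (for example, a duration condition on an alert rule) rather than in hand-rolled code, but the trade-off is identical: patience reduces false alarms at the cost of slower detection.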


What’s Next for Self-Driving Cars?

The machine vision systems in cars today are excellent at recognizing obstacles like other vehicles and pedestrians. Anticipating how they’ll act is another issue entirely. People behave irrationally by running red lights or jaywalking, and that kind of behavior is hard for an AI to react to or expect. These AI systems will get better with more training data, but collecting that data can be complicated. Right now, putting an autonomous car on the road can be dangerous, but they need to be out there to gather data. As a result, the process of getting all the necessary training may be a long one. Autonomous cars may not be ready to disrupt the industry, but implementation is still possible. Public transportation is an ideal application for today’s self-driving vehicles because it’s a more predictable form of driving. By driving pre-defined routes at slower speeds, autonomous public transports can start to gather that all-important training data. Some companies have already started taking advantage of this area. A business called May Mobility has been running self-driving shuttles to train stops since May 2019. 


4 roles responsible for data management and security

Including a section in apps that provides transparency on how it uses data can help ease security concerns. Zoom, which has been in the news due to its increased use amid COVID-19 and security concerns, recently brought in leaders in the security space and a new acquisition to help. Having a strong opt-in strategy is also important. Apple and Google have a good approach with their work on contact tracing. But opting in is not going to give you all – or even enough – of the data. ... The CDO should set strategy for managing all of an organization's data – both from a defensive standpoint (addressing compliance regulations, data privacy, good data hygiene, etc.) and from an offensive one (making data more easily consumable for those who want and need it). Some key agencies do plan to have specialist CDOs. The Department of Defense has been working to recruit candidates for its CDO position. And at the end of March, the Centers for Disease Control and Prevention (CDC) published the official job post for its CDO opening. ... Consumers are grappling with data collection, something they've struggled with for a while. People are trying to become more educated about application data collection and personal data privacy and security.




Quote for the day:

"Experience is a hard teacher because she gives the test first, the lesson afterwards." -- Vernon Law

Daily Tech Digest - June 19, 2020

The Rise of TensorFlow

IT teams working with deep learning initiatives can enhance that innovation through production quality management. Many aspects of software development, such as Test-Driven Development (TDD) and Continuous Integration/Continuous Delivery (CI/CD), are being incorporated into DataOps, and consequently, MLOps. IT teams can seek opportunities to establish robust data pipelines created from MLOps practices. The instances can provide clues for translating lessons learned that could potentially fit the machine learning concepts applied to quantum computing. Quantum computing research is very nascent, with many theories and calculations that feel more at home in a Star Trek episode than in a real-world application. But the TensorFlow community is growing with encouragement from Google. Google offers a few notebook tutorials that users can demo, along with an installation guide. During the Google I/O19 Summit, TensorFlow advocate Josh Gordon shared that 1,800 developers had been contributing trial and production-ready projects using TensorFlow. The high interest in the developer community to explore TensorFlow capabilities holds even higher potential to yield valuable insights in quantum computing research and applications.


Coming Soon: 'Trust Mark' Certification for IoT Devices

The program proved "that you can have an independent, agnostic, vendor-sponsored certification program provided you have the right checks and balances in there," Tett says. The Trust Mark program will have an independent decision authority that decides whether products are approved, Tett says. The products will be evaluated by separate, independent testing facilities. In other countries, a host country IoT association can promote and market evaluated products that have gone through the program, Tett says. "IoT is not just an Australian problem," Tett says. "It's the world's problem." The testing program will have two phases. First, manufacturers will develop a statement of claims, describing security, safety and privacy aspects of a device. Those could include "baseline" aspects, such as policies on default passwords, how encryption is used and how the device can be patched. They also could include information about specific security features, such as how the device securely transmits personal data. In the second phase, an accredited test facility will verify the manufacturer's claims and issue a letter of recommendation. But a decision authority will decide whether a device passes or fails. If it passes, it will be certified.


5 Things You Can Do Right Now to Prepare for the Post-Coronavirus Business World

Cybersecurity is already an important topic to large businesses, and with the EU's General Data Protection Regulation, California's Consumer Privacy Act and other privacy laws, as well as countless news stories about the cost and impact of data breaches, it is something smaller businesses are being forced to confront head on. With the surge in employees working remotely during the virus outbreak, we have seen more and more data breaches and cyberattacks. Employees using unsecured infrastructure and third-party tools are two of the leading causes of potential breaches. Combine this with data storage and access practices that violate privacy laws — for example, telemedicine on non-HIPAA-compliant platforms — and suddenly the need for secure solutions takes center stage. ... With the unprecedented business shutdown across America, businesses will be increasingly looking at ways to have a greater degree of control over their expenses. These will include businesses requesting shorter contract durations, emergency clauses and provisions in agreements, ways to have a more easily scalable workforce utilizing temporary workers and temporary agencies, and an overall desire to lower expenses, especially recurring expenses.


How to Secure Machine Learning

Building security in for machine learning presents an interesting set of challenges. Primary among these is the fact that in any machine learning system data plays an outsized role in system security. In fact, my view is that the datasets an ML system is trained, tested, and ultimately operated on account for 60% or more of overall security risk, while the learning algorithms and other technical aspects of the system (including source code) account for the rest. For that reason, in my work with BIML, I have focused my attention on architectural risk analysis, sometimes called an ARA (touchpoint number two for software security), as the most effective approach to get started with. This stands in contrast to starting with touchpoint one (code review), but the reasons why should be mostly obvious. In a January 2020 report titled "An Architectural Risk Analysis of Machine Learning Systems: Toward More Secure Machine Learning," BIML published an ARA as an important first step in its mission to help engineers and researchers secure ML systems. In the report, we painstakingly identified 78 risks. Of those 78 risks, I present the top five here.


PowerPoint 2016 and 2019 cheat sheet

The most important feature that launched with PowerPoint 2016 for those who work with others is live collaboration that lets people work on presentations together from anywhere in the world with an internet connection. To do it, you must be logged into your Microsoft or Office 365 account, and the presentation must be stored in OneDrive, OneDrive for Business or SharePoint Online. However, while Office 365 subscribers or anyone using PowerPoint Online can see the changes that other users of those versions make to a shared presentation in real time as they happen, PowerPoint 2016 and 2019 users have to save their presentations periodically to see and share changes. So while it is live collaboration, it’s not real-time visibility into that collaboration. Still, it does allow you to work with others on the same presentation at the same time. To collaborate on a presentation, open it, then click the Share icon in the upper-right part of the screen. If you haven’t yet saved your file in OneDrive, OneDrive for Business or SharePoint Online, you’ll be prompted to do so. Clicking the Share button opens the Share pane on the right-hand side of the screen. Think of the pane as command central for collaboration.


Microsoft wants developers to be Quantum-inspired

A Quantum Developer Kit (QDK) and new language Q# fill out the Microsoft quantum computing portfolio and are available on the open source GitHub repository. As Computer Weekly has previously reported, quantum computing is a technique that promises to solve problems that cannot be programmed using a traditional algorithm run on a classical binary computer design. Whereas traditional or classical computer architectures are very good at dealing in binary decisions, and solve problems by making discrete “yes” and “no” decisions, the complexity of some problems rises exponentially. This effectively means the problem cannot be solved in a traditional way. Giving an update to the company’s strategy, Ben Porter, director of business development at Microsoft, said: “Having spoken to customers across every industry, there is a need to study algorithms to solve complex problems.” But developing novel quantum algorithms is just the first part of Microsoft’s strategy. 


Technology in Banking: The developing role of biometrics

Multifactor authentication will soon be a requirement of industry regulation. A crucial element of the impending PSD2 regulation, and something that’s critical to the Open Banking scheme that’s just launched in the UK, is the need for banks to provide ‘Strong Customer Authentication’ (SCA) to protect users against external threats whilst not compromising their experience. Banks will need to verify an identity using at least two different authenticators. The regulation states this as ‘something you have’, ‘something you know’ and ‘something you are’, which could be translated as your device, your PIN and a biometric feature. Since the introduction of fingerprint readers into phones, several leading banks integrated the technology into their apps. Now, the multifactor authentication requirements of PSD2 mean that all remaining European banks should at least be considering doing the same. By supporting the use of biometric authentication via a mobile device – whether fingerprint or other methods such as facial recognition – banks can provide a solution that combines security with usability, creating a better user experience.


Learning Progressive Web Apps - Book Review and Q&A

The use of “progressive” in the name is a shout out to the progressive enhancement approach to web development popularized in the early 2000s. In this context it means that a web app is progressively enhanced based on how much of the technology is supported in the browser. When your PWA runs on an older browser, one that doesn't support all or any of the PWA capabilities, it works just like any non-PWA. However, when the browser supports the core PWA technologies, additional capabilities unlock in the app depending on what code you have therein. PWAs allow you to register a chunk of JavaScript code called a service worker (SW) that runs in the browser context (as opposed to the app context). With a SW in place, your web app can do things a normal web app (a non-PWA) can't. Things like receive push notifications, enable an app to work while offline, even sync updated data with the server in the background when the app isn't running. SWs, along with a web app manifest file, also enable a more streamlined, app-controlled approach to installing the app on the user's desktop or home screen.


The new strategy for banks: Create value instead of competing

Banks should focus on creating new value instead of trying to catch up with existing and new competitors or simply contributing to apps and services that are a replication of what is already out there. Finding, developing and nurturing a new market space is a far easier thing to do, and one can become a disruptor in their industry. A clear understanding of ongoing trends and the industry situation at large is key for any player in this space. One needs to study this constantly evolving business landscape in order to make better-informed and more accurate business decisions. Once banks unearth the problem areas of the industry, they can use this information to reimagine market boundaries and identify potential new customers for the industry. Before the plan has been put into action, it would be imperative to construct a roadmap which aims to deliver new value to customers. Data is crucial for each of the steps mentioned above. Fortunately, we are currently in a digital age where the growth of data is on an exponential rise. IDC predicts that the collective sum of the world’s data will grow from 33 zettabytes in 2018 to 175ZB by 2025, for a compounded annual growth rate of 61 percent. However, only 43 percent of ASEAN companies in 2019 were using big data to enhance their businesses, according to AOPG Insights’ report titled “The ASEAN Appetite for Data in Motion”.


Maze Ransomware Gang Continues Data-Leaking Spree

By having good security defenses in place, and up-to-date backups stored offline - so they cannot be crypto-locked by ransomware - victims can wipe and restore systems. This still takes time and energy, and doesn't address the root cause of how attackers infected systems in the first place, which organizations must also ascertain. But this strategy avoids victims having to even consider whether or not they might pay criminals. The U.S. Cybersecurity and Infrastructure Security Agency offers a detailed list of additional best practices for defending against ransomware. For organizations or individuals that fall victim, it recommends reporting the incident immediately to CISA, or a local FBI or U.S. Secret Service field office, to potentially receive help for dealing with that particular strain. Seeing gangs such as Maze continuing to notch new victims is a reminder to all organizations to get a ransomware-response plan in place - including training employees - immediately if they don't already have one. Security firm Kaspersky surveyed 2,000 business employees in the U.S. and another 1,000 in Canada last November and found that 45% said they didn't know what to do if they got hit by ransomware.



Quote for the day:

"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann

Daily Tech Digest - June 18, 2020

Return to the office: This company is giving workers beeping wristbands to keep them socially distancing

Getting back into the swing of things will by no means be easy. With remote workers having been confined to their homes for so long, sharing a workspace with others will require employees to be mindful of their surroundings, not to mention curb their desire to reform old office huddles. "As the trial began, it became clear that many people were undercutting the correct physical distance," Renner admits. "But, as they got used to wearing the sensors, the trial participants got a better feel for the distance they needed to keep – and the number of beeps heard around the office quickly fell." Renner makes a point that some of the challenges of re-entering office life are things that businesses could very easily overlook. "It may seem straightforward, but one of our challenges has been to work out how people can safely bring in, prepare and eat their own lunch in the office," he says. "Initially, employees were asked not to use the microwaves as a lot of people touch these appliances. But quite a few people wanted to bring in their own food. So, we changed the rules and allowed people to use the kitchen again and, to make things safer, we moved cutlery and plates outside of drawers, so people don't have to touch so many handles and surfaces."


R&D in the Banking Sector: Making the case for Innovation Data Labs

As the BFSI industry turns its attention to Fintechs to meet their digitization challenges, the eventual target areas for their R&D efforts do not waver: develop and deploy new technologies to better serve B2B banking customers, increase profits, improve compliance and security preparedness, and reduce infrastructure costs. If the end-goals are similar, then where does the Banking R&D differentiation come from? In one word: Reliability. To infuse reliability as a core rubric in its R&D paradigm means Banks have to check a number of boxes. Firstly, Banks and Technology teams need to embed ‘reliability-as-a-yardstick’ in their partnerships; across vendors, across geographies, across platforms. Secondly, Reliability is built over time by adopting a divergent approach. The traditional ‘hire-and-instruct-engineers-on-a-project’ mode does not produce optimum test results, because harnessing advanced technologies necessitates an experimental mindset as opposed to the erstwhile engineering approach. Finally, reliability comes at a cost. To experiment with production in real time comes with a sizable expense – One, the cost of errors can be high, and Two, the multifarious skill base


We are at a critical point for mental health in the tech sector

We are now entering a new phase, with lockdown easing and more aspects of life beginning to move towards something more like normality. While instinctively one might expect this to reduce the mental strain on people, for a significant proportion the easing is in fact ushering in a whole new phase of worry – about catching the virus, or passing it on to members of their household. This underlines the fact that the Covid-19 crisis and its effects are not a one-off shock but a long-term shift into a new normal. It is essential for tech businesses to be responsive to this and provide their staff with the support they need, for example by creating online resources, supporting the creation of mental health networks and discussion forums and, potentially, offering staff access to counselling services too. Encouragingly, we found that 56% of companies have increased the level of personal and emotional support to staff since the crisis began. However, half of businesses still don’t offer any formal support for mental health issues. The difference this makes is visible: three quarters of those working for unsupportive companies are concerned about their mental health now, or have been in the past.


Ethics in AI – Responsibilities a business has to the consumer

The software may be foolproof, but the same cannot be said of the data. Biases in the data a program learns from will quickly spread to its outputs. Amazon had to scrap its recruitment AI tool because it started penalising CVs for containing the word “women’s”. In the male-dominated IT industry, men had been recruited at a higher rate than women, so words unique to women’s CVs appeared far less often in successful applications than general words like “leadership”. The AI concluded that these words must be of low value and started penalising them. The lesson from this example is to identify gaps in the data and apply weightings so that demographics are equally represented. The fidelity with which an AI can classify massive amounts of data can even discourage looking for errors. Who’s going to argue with a program that can classify thousands of people’s faces with 98% accuracy via impenetrable mathematics? This is compounded by so-called black-box AIs that never show their workings. Typically, the software projects the data across high-dimensional mathematical spaces to extract distinguishing features, but the process is very abstract. Resist the temptation to outsource your thinking to the program, or to assume it knows what it’s doing.
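The reweighting fix described above can be made concrete. Here is a minimal sketch, in pure Python, of computing per-example weights inversely proportional to group frequency, so that every demographic contributes equal total weight to training; the group labels and toy dataset are made up for illustration:

```python
from collections import Counter

def balanced_sample_weights(groups):
    """Weight each example inversely to its group's frequency,
    so every group contributes equal total weight to training."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    # Each group's total weight becomes n / k regardless of its size.
    return [n / (k * counts[g]) for g in groups]

# Toy CV dataset: 8 examples from group "m", 2 from group "f".
groups = ["m"] * 8 + ["f"] * 2
weights = balanced_sample_weights(groups)
# Each "m" example gets weight 0.625, each "f" example 2.5;
# both groups now sum to 5.0 of the total weight of 10.
```

Most training libraries accept such weights directly (commonly via a `sample_weight` argument), so the under-represented group is no longer drowned out by sheer volume.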


How UK arts CIOs are keeping the show going on during the pandemic

“Any kind of technology is a tool that theatre can use, both creatively and to make itself visible. Just as limelight, gas lighting or the Victorian illusion of Pepper’s ghost were once new technologies, digital is another technology that can benefit theatre,” says theatre critic Lyn Gardner. How can the arts survive an uncertain future? Returning to the old ways of visiting museums or mingling in crowds at shows hardly seems viable, now that social distancing looks set to stay at least over the medium term. Although there may come a day when we can again enjoy the unique experiences only the live arts can offer, in the meantime CIOs and CTOs have an essential role to play in supporting the existing alternatives, and imagining new ones. Augmented reality, mixed reality or virtual reality could be one way forward for this sector, so urgently in need of sustainable recovery. Solomon Rogers, founder and CEO of immersive content studio REWIND and chairman of both the BAFTA Immersive Entertainment Advisory Group and Immerse UK, is of the opinion that these technologies present the arts with limitless opportunities.


For digital transformation success, get serious about open source

"Software is powering almost every business and they want to use that as a competitive advantage…. [Companies] need [the] ability to move quickly and they need to be able to change direction quickly to respond to new threats or seize new opportunities." Similar sentiments were expressed on the earnings calls of Fastly, Elastic, and Twilio, and no doubt will continue to be voiced by others. However, you can't really talk about the importance of software without calling out just how central open source is to the software that every organization on earth builds and uses. While you can absolutely pay others to support open source for you, the companies that want the most control over their digital futures will be those that also contribute strategically to open source projects. ... The first way is simply to provide funding to a particular project, either to help defray development costs or for something else, such as staging an (almost certainly online) event. The second is to commit your own developers to the project. This can be the most effective way, because the more code they contribute, the more influence you earn over the direction of the project.


Accurate data in, better insights out

“Ensure data is checked for quality as close to the source as possible,” he says. “The more accurate it is upstream, the less correction will be needed at the time of analysis – at which point the corrections are time-consuming and fragile. You should ensure data quality is consistent all the way through to consumption.” This means carrying out ongoing reviews of existing upstream data quality checks. “By establishing a process to report data quality issues to the IT team or data steward, the data quality will become an integral part of building trust and confidence in the data. Ensure users are the ones who advise on data quality,” says Cotgreave. “When you clean data, you often have to find inaccurate data values that represent real-world entities like country or airport names. This can be a tedious and error-prone process as you validate data values manually or bring in expected values from other data sources,” he adds. “There are now tools that validate the data values and automatically identify invalid values for you to clean your data.” 
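The kind of validation Cotgreave describes — comparing field values against known real-world entities and reporting the rows that don't match — can be sketched in a few lines of plain Python. The reference list of countries and the sample rows below are invented for illustration; in practice the valid values would come from a curated upstream source or a data-preparation tool:

```python
# Hypothetical reference set of valid values (in practice, pulled
# from a trusted upstream source or a validation tool).
VALID_COUNTRIES = {"Germany", "Italy", "United Kingdom", "France"}

def find_invalid(records, field, valid_values):
    """Report (row index, offending value) for every record whose
    field is not in the set of expected real-world values."""
    return [(i, r[field]) for i, r in enumerate(records)
            if r[field] not in valid_values]

rows = [
    {"customer": "A", "country": "Italy"},
    {"customer": "B", "country": "Itly"},      # typo caught here
    {"customer": "C", "country": "Germany"},
]
issues = find_invalid(rows, "country", VALID_COUNTRIES)
# issues == [(1, "Itly")]
```

The same pattern extends to fuzzy matching (for example, the standard library's `difflib.get_close_matches`) to suggest corrections for near-misses such as "Itly", which is essentially what the automated validation tools mentioned above do at scale.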


Machine learning in Palo Alto firewalls adds new protection for IoT, containers

“It is very important for us to apply ML when you start collecting huge amounts of data about your network,” said Sreeni Kancharla, vice president and CISO of Cadence Design Systems, an electronic design-automation software and engineering-services company, speaking at the Palo Alto PAN-OS 10.0 introduction. It’s important to get a faster response to threats without making the security environment more complex, Kancharla said. On the IoT front, PAN-OS 10.0 supports a subscription service that targets IoT systems. “IoT devices present unique challenges for security teams. They are connected to an enterprise’s central network, yet they are generally unmanaged,” Oswal said. “For the most part, they are also unregulated, shipped with unknown or unpatched vulnerabilities, and often their useful life exceeds their supported life.” Oswal cited a recent Palo Alto Unit 42 IoT threat report which found that 57% of IoT devices are vulnerable to medium- or high-severity attacks, and that 98% of all IoT-device traffic is unencrypted. Unit 42 is the vendor’s threat-research arm.


.NET Core: Interaction of Microservices via Web API

Almost everyone who has worked with microservices in .NET Core probably knows Christian Horsdal’s book “Microservices in .NET Core: with examples in Nancy”. The ways of building an application based on microservices are well described there, and monitoring, logging, and access control are discussed in detail. The only thing missing is a tool for automating the interaction between microservices. In the usual approach, a web client for a microservice is developed in parallel with the microservice itself, and every time the microservice’s web interface changes, additional effort has to be expended on the corresponding changes in the web client. The idea of generating a web-api/web-client pair using OpenAPI is also quite laborious; I would like something more transparent for the developer. So, in an alternative approach to developing our application, I would like the following: the microservice structure is described by a .NET interface, using attributes that describe the method type, route and way of passing parameters, as is done in MVC; the microservice functionality is developed exclusively in a .NET class implementing this interface...
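The approach above is .NET-specific (interfaces plus attributes), but the underlying idea — declare each route once on an interface, then generate the web client from that declaration so the two can never drift apart — is language-neutral. A rough Python analogue follows, with decorators standing in for attributes; the service name, routes and base URL are hypothetical, and the generated client merely builds request descriptions rather than performing real HTTP:

```python
def route(method, path):
    """Attach HTTP metadata to an interface method, in the spirit
    of the MVC-style attributes described in the article."""
    def mark(fn):
        fn._route = (method, path)
        return fn
    return mark

class UserService:
    """The 'interface': declarative metadata only, no transport code."""
    @route("GET", "/users/{user_id}")
    def get_user(self, user_id): ...

    @route("POST", "/users")
    def create_user(self, name): ...

def make_client(interface, base_url):
    """Generate a web client from the interface, so changes to the
    declared routes propagate automatically. For illustration each
    generated method returns (HTTP verb, full URL) instead of
    actually issuing the request."""
    class Client:
        pass
    client = Client()
    for name in dir(interface):
        meta = getattr(getattr(interface, name), "_route", None)
        if meta is None:
            continue
        method, path = meta
        def call(_m=method, _p=path, **params):
            return (_m, base_url + _p.format(**params))
        setattr(client, name, call)
    return client

client = make_client(UserService, "https://api.example.com")
# client.get_user(user_id=7) -> ("GET", "https://api.example.com/users/7")
```

In .NET the same effect is typically achieved with reflection over the interface's attributes (or with source generators); the point is that the client is derived, never hand-maintained.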


AI: A Remedy for Human Error

An employee might follow the instructions in a phishing email not only because it looks authentic, but also because it conveys urgency (usually from a manager or someone else of importance). Employee training can help reduce the likelihood of error, but solving the technological shortcoming is more effective: if a phishing email is blocked from delivery in the first place, the human error factor is mitigated. This is where artificial intelligence can be a game-changer. We already use AI to simplify our home lives, employing it for a variety of tasks, from turning on lights to playing our favourite music. If AI solutions are deployed in the workplace, we can address the biggest elephant in the IT room: data security. Data security is a major area of concern, and it is likely the leading cause of lost hours – and lost sleep – for security and IT professionals. According to a recent survey of over 500 IT professionals in the financial services industry, a whopping 94% said that they lack confidence in the ability of employees, consultants, and partners to safeguard customer data. And because cybersecurity is a complex domain – with many unknowns and moving parts – rigid, conventional solutions can’t be effective. AI solutions, however, can learn, adapt, and dynamically react to an organisation’s cybersecurity needs.
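To make "solutions that can learn" slightly less abstract, here is a toy learned filter: a multinomial Naive Bayes text classifier in pure Python, trained on a handful of invented emails. It is a sketch of the idea only — real phishing detection draws on far richer signals (headers, URLs, sender reputation) than the message text:

```python
import math
from collections import Counter

def train(emails):
    """Fit multinomial Naive Bayes (Laplace smoothing) on
    (text, label) pairs. The sample emails below are made up."""
    word_counts = {"phish": Counter(), "ok": Counter()}
    label_counts = Counter()
    for text, label in emails:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = set(word_counts["phish"]) | set(word_counts["ok"])
    return word_counts, label_counts, vocab

def classify(model, text):
    """Pick the label maximising log P(label) + sum log P(word|label)."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_score = None, -math.inf
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

samples = [
    ("urgent verify your account password now", "phish"),
    ("your account is suspended click this link immediately", "phish"),
    ("quarterly report attached for review", "ok"),
    ("meeting moved to tuesday see agenda", "ok"),
]
model = train(samples)
# classify(model, "urgent click to verify your password") -> "phish"
```

Because the model is re-trained from data rather than hand-coded rules, adding newly observed phishing messages to the training set is what lets the filter "learn and adapt" in the sense the article means.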



Quote for the day:

"A leader is judged not by the length of his reign but by the decisions he makes." -- Klingon Proverb