Daily Tech Digest - January 09, 2020

The importance of wearable hardware in the enterprise

“A large oil and gas firm is using geolocation wearables, connected via an IoT network, for site workers across multiple fields and rigs,” said Didier Pagnoux, director for IoT solutions at Altran. “It is also adding ‘wearable trackers’ for spare parts so workers can find them faster during emergencies, such as leakages. This is especially useful given how vast and extensive some fields, rigs and mines can be. Wearables are starting to play a major role in the oil and gas industry. This is significant because much of the oil and gas industry is rooted deeply in 20th century methodologies, systems and processes.” Pagnoux continued by explaining how wearable hardware helps oil and gas companies gain real-time insight into the environment in which their employees work, and gauge whether or not conditions are safe enough. “Embedded sensors within safety jackets and helmets are also being used within mines and rigs to feed a range of data on the conditions workers experience,” he said. “This is to monitor the air quality and to prevent accidents.”



Operationalizing Threat Intelligence at Scale in the SOC

The period of time for which threat data is valid is limited. Organizations need current information about vulnerabilities and malware being used in attacks before they are targeted. Intelligence feeds have shifting levels of urgency, and simplifying the prioritization process is a complex task. In the past, security practitioners shared Word documents, PDFs, or simple file formats like CSV tables and Excel sheets of indicators of compromise. These were difficult to operationalize due to taxonomy and formatting differences, lack of integration, and the time-sensitive nature of the data. It is also difficult to describe and share a more complex behavioral indicator, such as a threat actor tactic, in a standardized format. The cyber community has tried — and failed — to institute an effective culture of sharing. Taxonomies and standards have been created, but none have caught on at scale, leaving access to CTI fragmented. As a result, most sharing doesn't go beyond domains. And even though security analysts across industries share common goals, the organization often does not see it that way, so sharing and collaborating stay hidden from management.
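For context, the standardization problem described here is what specifications like STIX and TAXII aim to solve. As a rough sketch (not tied to any particular vendor feed), here is what a single machine-readable indicator of compromise might look like when built as STIX 2.1-style JSON in Python; the hash value is a placeholder:

```python
# Sketch: one indicator of compromise as STIX 2.1-style JSON, the kind of
# standardized, machine-readable record meant to replace ad hoc CSVs and PDFs.
import json
import uuid
from datetime import datetime, timezone

now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "Malware sample seen in recent attacks (placeholder hash)",
    "pattern": "[file:hashes.'SHA-256' = '0000...placeholder...0000']",
    "pattern_type": "stix",
    "valid_from": now,  # threat data is only valid for a limited period
}
print(json.dumps(indicator, indent=2))
```

A consistent taxonomy like this is what lets a SOC ingest feeds from many sources and prioritize them automatically instead of re-keying spreadsheets.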


Grilling the answers: How businesses need to show how AI decides


“It became very obvious that if you are going to be using these machine learning algorithms to inform, or guide some really important decisions in our lives, then you really need to have this confidence or trust,” she says. But explaining machine learning decision-making to a data scientist is one thing; explaining it to consumers or the public will require a great deal more creative thinking, says Mojsilovic. “Fairness may be a complex ethical issue, but in a way, explainability is even more difficult,” she says. “Think about how humans explain things, how we navigate the world around us and how we communicate. We do it in so many different ways. We look for examples and counterexamples and summarise things, and so on. We thought about how to take that expressiveness of human interaction and create the methods to communicate [the way AI reaches conclusions].”


A California student has filed a suit against China-based TikTok, which she accuses of retrieving her data without permission
The vulnerabilities, as per the cybersecurity firm, could allow people with malicious intent to access user accounts and do a lot of things, such as steal their confidential information, delete their videos, make their private videos public, and so on. The vulnerabilities could also allow attackers to upload unauthorized videos to compromised accounts. The firm found that the app's subdomain was vulnerable to a type of attack in which seemingly benign or “innocent” websites can be used to hack accounts and steal information. These cross-site scripting (XSS) attacks allow hackers to insert malicious scripts into trusted websites. Attackers could leverage this vulnerability to send TikTok users spoofed messages containing links, made to look like legitimate messages from TikTok. If a person clicks or taps on the links, the attacker can then gain access to their TikTok account for whatever purpose they may have in mind. Check Point looked into TikTok's vulnerability to XSS attacks and successfully retrieved confidential user information, including private email addresses and birthdates. The cybersecurity firm informed TikTok of the vulnerabilities on Nov. 20 last year, and by December, the app company had fixed them.
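As a generic illustration of this class of bug (not Check Point's actual TikTok finding), reflected XSS happens when a site echoes user-controlled input into a page without escaping it. A minimal sketch in Python with Flask, using hypothetical routes:

```python
# Minimal reflected-XSS illustration: the unsafe route echoes user input
# verbatim, so a crafted link can run script in the victim's browser; the
# safe route escapes the input so the payload renders as inert text.
from flask import Flask, request
from markupsafe import escape

app = Flask(__name__)

@app.route("/greet-unsafe")
def greet_unsafe():
    # /greet-unsafe?name=<script>stealSession()</script> executes in the browser
    name = request.args.get("name", "")
    return f"<h1>Hello {name}</h1>"

@app.route("/greet-safe")
def greet_safe():
    # escape() turns < > & " into HTML entities, defusing the script tag
    name = request.args.get("name", "")
    return f"<h1>Hello {escape(name)}</h1>"
```

The spoofed-message attack described above works by getting the victim to click a link whose parameters carry such a script payload into a vulnerable page.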


CES 2020 car show features liquid crystal sun visors, EyeLocks, and smart LiDAR

At CES 2020, Cerence showed how voice recognition and head tracking can be used together to open windows and doors. These button-free controls use voice recognition, gaze detection, touch, and gesture to create a natural, human-like in-car experience. The demo also included intelligent voice traffic notifications that leverage natural language generation to assist drivers with route selection. Bosch has made the sun visor smart with a camera and a transparent liquid crystal display. The Bosch Virtual Visor blocks only the portion of the visor where the sun would strike the driver's eyes while leaving the rest of the visor transparent. This improves visibility for the driver and automates adjustments to the visor, allowing the driver to focus on the road. Also at CES 2020, EyeLock announced that SiriusXM will use the company's iris authentication tech to safeguard its new mobile e-wallet. The in-car platform lets drivers pay tolls, purchase gas, or stop at the drive-through without reaching for a wallet. Drivers use voice commands or a touch screen to start an e-wallet transaction, and an iris scan then verifies the request. The custom EyeLock prototype will be placed in the visor of the car, allowing authentication of the driver and of any other passengers enrolled in the system.


New Iranian data wiper malware hits Bapco, Bahrain's national oil company

At the time of writing, Bapco appears to be the only victim of an attack with the Dustman malware, although this doesn't mean the malware was not deployed on the networks of other targets. According to the CNA report, attackers don't seem to have planned to deploy Dustman at the time they did, but appear to have triggered the data-wiping process as a last-ditch effort to hide forensic evidence after they made a series of mistakes that would have revealed their presence on the hacked network. Sources who spoke with ZDNet on the condition of anonymity claimed the Bahrain company was compromised over the summer. Saudi CNA officials, along with our sources, confirmed the point of entry was the company's VPN servers. The CNA report cites "remote execution vulnerabilities in a VPN appliance that was disclosed in July 2019" as the attackers' point of entry into Bapco's network. While officials didn't blame any specific appliance, they are most likely referring to a Devcore report published over the summer that disclosed remote execution bugs in a wealth of enterprise-grade VPN servers, such as those from Fortinet, Pulse Secure, and Palo Alto Networks.


The case for change: New world, new skills


By upskilling, we mean giving people the opportunity to gain the knowledge, tools, and abilities they need to use and understand advanced and ever-changing technologies in the workplace and their daily lives. Not everyone has to learn to code, but many people need to understand and manage artificial intelligence, data analytics, autonomous vehicles, and other technologies that can’t even be predicted — those emerging now and those that will be created in the future. But upskilling is not simply a matter of teaching people how to use a new device. That device may be obsolete by the following year. It involves learning how to think, act, and thrive in a digital world in a way that is sustainable over time. Each nation will need its own approach, and each will need to consider the demographics of its citizens, its level of tech maturity, and the makeup of its economy to develop its own upskilling solution. A territory with a developed economy, an aging population, and a strong service sector will have different priorities than a region with a developing, mostly rural economy and a population in which most people are under 30.


How to create data literacy: 3 keys

A data literacy program creates associate development opportunities. Say you take three classes in Portuguese and you learn the vocabulary and the basic rules of grammar. You gain an appreciation for the language, you can read it, and you can make basic sense out of what others are communicating. To this end, we offer classes to help Red Hat associates develop their data literacy skills in a way that’s appropriate based on their role in the organization. Whether they are just starting their data literacy journey, are data practitioners, or are data leaders/advocates, everyone can grow their skills. Not everyone will have the same end goal, but everyone can learn from seeing real data stories of business value gained. For example, we have courses ranging from “The Power of Data Visualization” to “Data Storytelling.” This is a great start, but who hasn’t taken a class and walked out the door (or logged off) with the best of intentions but no real plan for using the new knowledge? What happens? You never really feel confident in speaking the language.


Add Augmented Analytics to Your Business Data Practices


As we've said, the best endorsement of our instructions for adding augmented analytics into your business's data strategy is the independent feedback of our peers. In a survey of G2 Crowd reviewers, 75 percent of those surveyed favored Oracle Analytics for "Predictive Analytics Feature Satisfaction," compared to the 68 percent who favored either Microsoft Power BI or Tableau desktop products. Similarly, when asked about services, 78 percent of survey respondents put Oracle at the top of the list of vendors who they felt satisfied their needs for Big Data Features. Contrast that with Microsoft and Tableau, which scored 76 percent and 73 percent respectively. By connecting with big data sources such as those that leverage Hadoop, users can analyze unstructured data like text, videos, and image data sets, among others. This enables businesses to monitor and dig insights out of nontraditional data sets—like social media posts, emails, or IoT sensors, to name a few—that provide streaming data. Not only do these advanced features provide previously undiscovered insights, they offer relief to organizations that are not able to hire large teams of data analysts, through true self-service functionality delivered by natural language.


Improving digital quotient through digital skilling

When building teams for any end-to-end process, the skills for any particular role typically look very different from their traditional definitions. A system admin may need to know a lot more about development, a developer may need to know about user experience, and a business executive may need to know about cloud computing. The skills to support a digital enterprise comprise these new skills along with foundational literacies and character qualities. Improving a firm’s digital quotient means supporting staff who do not see themselves as living inside silos. New talent should see themselves as working across departments, continuously staying abreast of the latest disruptions in the landscape so they can engage with their counterparts effectively. ... Fear is the biggest threat to the digital quotient. While the agenda for digital quotient is far too large to be implemented all at once, it is also far too important not to be pursued. When digital quotients rise, so does business performance, in both technology-focused and traditional business firms.



Quote for the day:


"Remember teamwork begins by building trust. And the only way to do that is to overcome our need for invulnerability." -- Patrick Lencioni


Daily Tech Digest - January 08, 2020

Why data and analytics is so significant for Wells Fargo

Enterprise analytics was brought into the company firstly to offer a better experience and secondly, given the many advancements in AI and machine learning, because Wells Fargo wanted to create a centre of excellence to make sure that it is bringing the “latest and greatest” into the bank. In order to do that, it is looking into machine learning use cases. “The first step was to create what we call an Artificial Intelligence Program Bank. It comprised three different teams that were put together to do this. The first team is the business team, which is part of our innovation team, and their mandate was to identify the big use cases that we want to go after and what are the big focus areas, and to figure out the areas that they want to understand and see where they can apply AI and machine learning. The second team was my team, which is all about data and data science. We ensure that we bring the right data, identify the problems, and then make sure that we have the right team members to be able to do the model development. The third team in the group was related to technology. We decided to bring these three groups together, and drive forward the application of AI in the bank,” informs Thota.



Jupyter Notebooks is a popular tool for data-science pros, allowing them to create and share notebooks containing code, visualizations, and other useful information. Microsoft enabled native editing of Jupyter notebooks in VS Code in its October release of the Python extension, allowing data scientists to manage source control, open multiple files, and use the auto code-completion feature IntelliSense. In the January release, VS Code Python extension users can now see the current kernel that the notebook is using as well as the status of the kernel, such as whether it is idle or busy. Users can change to other Python kernels from the VS Code kernel selector. Microsoft also promises that this release brings performance improvements for Jupyter in VS Code, in both the Notebook editor and the Interactive Window. The improvements are the result of caching previous kernels and optimizing the search for Jupyter, according to Microsoft. Microsoft says the initial start of the Jupyter server is faster and that subsequent starts are more than twice as fast. Users should experience a noticeably faster process when creating a new blank Jupyter notebook and when opening Jupyter Notebooks with a large file size.


Why Analytics Alone is No Longer Enough


Self-service analytics has been on the agenda for a long time, and has brought answers closer to the business users, enabled by “modern BI” technology. That same agility hasn’t happened on the data management side – until now. “DataOps” has come onto the scene as an automated, process-oriented methodology aimed at improving the quality and reducing the cycle time of data management for analytics. It focuses on continuous delivery, and does this by leveraging on-demand IT resources and automating the testing and deployment of data. Technology like real-time data integration, change data capture (CDC) and streaming data pipelines are the enablers. ... Demand for data catalogues is soaring as organizations continue to struggle with finding, inventorying and synthesizing vastly distributed and diverse data assets. In 2020, we’ll see more AI-infused metadata catalogues that will help shift this gargantuan task from manual and passive to active, adaptive and changing. This will be the connective tissue and governance for the agility that DataOps and self-service analytics provide.
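As a rough illustration of the CDC idea (toy event shape, not any particular product), a change data capture pipeline forwards each insert, update and delete as an event, so the analytics copy stays current without bulk reloads:

```python
# Toy CDC consumer: apply a stream of change events to a target store
# incrementally, instead of re-copying the whole source on a schedule.
target = {}  # stand-in for the analytics copy, keyed by primary key

def apply_change(event: dict) -> None:
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        target[key] = event["row"]          # upsert the changed row
    elif op == "delete":
        target.pop(key, None)               # propagate the deletion

stream = [
    {"op": "insert", "key": 1, "row": {"id": 1, "status": "new"}},
    {"op": "update", "key": 1, "row": {"id": 1, "status": "shipped"}},
    {"op": "delete", "key": 1},
]
for event in stream:
    apply_change(event)
print(target)  # {} - every change reached the target with low latency
```

Real CDC tools read these events from database transaction logs and feed streaming pipelines; the principle is the same.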



Evaluating data loss impact in the cloud

Following a data loss incident, organisations can see a decline in the value of competitively differentiating assets. The value of individual data sets within large organisations is something that should be assessed and measured by individual data owners within each team (engineering, product, marketing, HR etc). These data owners understand the life cycle, value, and use of their specific data and should be working in collaboration with the information security team to ensure appropriate risk practices are followed. For cloud, in addition to the data itself, competitive advantage components may include algorithms tuned by the data for business intelligence and data analytics purposes. ... The scale of reputational damage depends on the organisational business model, the details of any incident, and on the category of data itself. Customer data loss can lead to long-term reputational damage, especially if the organisation has been clearly critiqued for poor organisational and technical controls in protecting the data.


Amid privacy and security failures, digital IDs advance

Self-sovereign identity envisions consumers and businesses eventually taking control of their identifying information on electronic devices and online, enabling them to provide validation of credentials without relying on a central repository, as is done now. Self-sovereign identity technology also takes the reins away from the centralized ID repositories held by the social networks, banking institutions and government agencies. A person’s credentials would be held in an encrypted digital wallet for documenting trusted relationships with the government, banks, employers, schools and other institutions. But it’s important to note that self-sovereign ID systems are not self-certifying. The onus on whom to trust depends on the other party. Whoever you present your digital ID to has to decide whether the credentials in it are acceptable. "For example, if I apply for a job…, and they require me to prove I graduated from a specific school and need to see my diploma, I can present that in digital form," said Ali.
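The trust model described here (an issuer signs a credential once; any relying party can verify it later without consulting a central repository) can be sketched with ordinary public-key signatures. A minimal, hypothetical illustration using the Python cryptography package; real self-sovereign identity stacks add DIDs, revocation and selective disclosure on top:

```python
# Sketch: a school signs a diploma credential; an employer verifies it
# offline with only the school's public key - no central ID repository.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer (the school) signs the credential once.
school_key = Ed25519PrivateKey.generate()
credential = json.dumps({"subject": "alice", "claim": "BSc, Example University"}).encode()
signature = school_key.sign(credential)

# The holder keeps (credential, signature) in a digital wallet and presents
# both to a verifier, who decides whether to trust this school's public key.
try:
    school_key.public_key().verify(signature, credential)
    print("credential accepted")
except InvalidSignature:
    print("credential rejected")
```

This is exactly the point in the excerpt: the system is not self-certifying; the verifier still chooses whose signatures to accept.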


Security Think Tank: Hero or villain? Creating a no-blame culture

In the general business IT world, all too often the end-user is identified as the point of blame for an intrusion, resulting in a culture of fear with people afraid to report anything suspicious, especially if they have clicked on a link they shouldn’t have. If there is one thing we should have learned, it is that nobody is immune to social engineering. There are numerous examples of security experts and senior managers of security companies being duped, so we must accept it is going to happen. Just as in the aviation example, this comes down to education and appropriate reporting mechanisms. Reporting must be easy, quick and provide positive feedback. Ideally, for phishing emails there should be a button to click to send the suspicious email to an automated analysis, which gives the user instant feedback on whether the email was safe or not and which automatically alerts the security operations team of any unsafe email. For other suspicious activity, feedback could be via a web portal linked to a ticketing system.
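A sketch of the one-click reporting loop described above (automated analysis, instant user feedback, automatic SOC alert) might look like this in outline; the verdict logic and blocklist are toy stand-ins, purely illustrative:

```python
# Toy phishing-report handler: analyze a reported email, give the reporter
# instant feedback, and open a ticket for security operations when unsafe.
import re

KNOWN_BAD_DOMAINS = {"phish.example", "evil.example"}  # stand-in threat intel

def analyze_reported_email(body: str) -> str:
    domains = re.findall(r"https?://([\w.-]+)", body)
    return "unsafe" if any(d in KNOWN_BAD_DOMAINS for d in domains) else "safe"

def handle_report(reporter: str, body: str) -> None:
    verdict = analyze_reported_email(body)
    print(f"feedback to {reporter}: this email looks {verdict}")  # instant feedback
    if verdict == "unsafe":
        print("ticket opened for the security operations team")   # automatic alert

handle_report("alice", "Reset your password: http://phish.example/reset")
```

The positive, immediate feedback is the point: reporting becomes rewarding rather than something to fear.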


An autonomous, laser-guided mosquito eradication machine

A three-year-old startup called Bzigo is developing a device that accurately detects and locates mosquitoes. Once a mosquito is detected, the device sends a smartphone notification while the mosquito is marked by a laser pointer. ... An autonomous laser marker that keeps a bead on the bloodsuckers might just even the playing field. This might strike you as kind of a silly idea, but the tech behind it is pretty intriguing. The device is composed of an infrared LED, a hi-res wide-angle camera, custom optics, and a processor. The innovation lies in several computer vision algorithms that can differentiate between a mosquito and other pixel-size signals (such as dust or sensor noise) by analyzing their movement patterns. A broad patent covering the device and its technologies was recently approved, giving Bzigo a leg up in the high-stakes world of mosquito sport hunting. It's also worth noting that Bzigo is hardly the first company to try to build a better mosquito solution using technology.


How to Build a Microservices Architecture With Node.Js to Achieve Scale?

Building real-world applications in JavaScript requires dynamic programming, and the size of a JavaScript application can grow uncontrollably. New features and updates get released, and you need to fix bugs to maintain the code. To keep up, new developers need to be added to the project, which becomes complicated. The structure of modules and packages alone cannot downsize and simplify the application. To run the application smoothly, it is essential to convert the large, homogeneous structure into small, independent pieces of programs. Such complexities can be resolved effortlessly when JavaScript applications are built on microservices, more so with the Node.js ecosystem. In software application development, microservices are a style of service-oriented architecture (SOA) where the app is structured as an assembly of interconnected services. With microservices, the application architecture is built with lightweight protocols, and the services are finely grained within the architecture. Microservices break the app into smaller services and enable improved modularity.
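The article's context is Node.js, but the pattern it describes, small independent services exposing lightweight protocols, is language-agnostic. As a minimal sketch (shown in Python with Flask for brevity; the route and data are hypothetical), one such service owns its own data and speaks JSON over HTTP:

```python
# One small, self-contained "users" service: it owns its own data store and
# exposes a lightweight JSON-over-HTTP interface, independent of other services.
from flask import Flask, abort, jsonify

app = Flask(__name__)
USERS = {1: {"id": 1, "name": "Ada"}}  # stand-in for this service's own database

@app.route("/users/<int:user_id>")
def get_user(user_id: int):
    user = USERS.get(user_id)
    if user is None:
        abort(404)
    return jsonify(user)

if __name__ == "__main__":
    app.run(port=5001)  # each service is deployed and scaled on its own
```

An orders service, a payments service and so on would each look similar, be owned by a small team, and be replaceable without touching the rest of the system.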


Passive optical LAN: Its day is dawning

The concept of using passive optical LANs in enterprise campuses has been around for years, but hasn’t taken off because most businesses consider all-fiber networks to be overkill for their needs. I’ve followed this market for the better part of two decades, and now I believe we’re on the cusp of seeing POL go mainstream, starting in certain verticals. The primary driver of change from copper to optical is that the demands on the network have evolved. Every company now considers its network to be business critical where just a few years ago, it was considered best effort in nature. Downtime or a congested network meant inconvenienced users, but today they mean the business is likely losing big money. ... The early adopters of POL are companies that are highly distributed with large campuses and need to get more network services in more places. This includes manufacturing organizations, universities, hospitality, cities and airports. Although I’ve highlighted a few verticals, the fact is that any business can take advantage of POL.


Decision Strategies for a Micro Frontends Architecture

To define a micro frontend, we must first identify it — is it a horizontal or vertical split? In a horizontal split, different teams work on different frontends within the same view. This means they have to coordinate on the composition of a page. In a vertical split, each team is responsible for a specific view which it fully controls. Mezzalira thinks that adhering to vertical slicing generally simplifies many of the coming decisions. It minimizes the coordination between teams, and he believes it’s closer to how a frontend developer is used to working. In an earlier blog post, he described how to identify micro frontends in more detail. When composing micro frontends, there are three options: client-side, edge-side and server-side composition. Client-side means implementing a technique like an application shell for loading single page applications. In edge-side composition, Edge Side Includes (ESI) or some similar technique is used on or near the edge.
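As a toy illustration of the server-side option (fragment URLs are hypothetical; edge-side composition does the same thing at the CDN with ESI tags), a composition layer fetches each team's fragment over HTTP and stitches them into one page:

```python
# Toy server-side composition: fetch each team's HTML fragment and stitch
# them into a single page before it is sent to the browser.
import urllib.request

FRAGMENTS = {
    "header": "http://localhost:5001/fragment",   # hypothetical team A service
    "catalog": "http://localhost:5002/fragment",  # hypothetical team B service
}

def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=2) as resp:
        return resp.read().decode("utf-8")

def compose_page() -> str:
    parts = {name: fetch(url) for name, url in FRAGMENTS.items()}
    return f"<html><body>{parts['header']}{parts['catalog']}</body></html>"
```

Client-side composition moves the same stitching into an application shell running in the browser.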



Quote for the day:


"Leaders must encourage their organizations to dance to forms of music yet to be heard." -- Warren G. Bennis


Daily Tech Digest - January 07, 2020

Wi-Fi 6 will slowly gather steam in 2020

Making sure devices are compliant with modern Wi-Fi standards will be crucial in the future, though it shouldn’t be a serious issue that requires a lot of device replacement outside of fields that are using some of the aforementioned specialized endpoints, like medicine. Healthcare, heavy industry and the utility sector all have much longer-than-average expected device lifespans, which means that some may still be on 802.11ac. That’s bad, both in terms of security and throughput, but according to Shrihari Pandit, CEO of Stealth Communications, a fiber ISP based in New York, 802.11ax access points could still prove an advantage in those settings thanks to the technology that underpins them. “Wi-Fi 6 devices have eight radios inside them,” he said. “MIMO and beamforming will still mean a performance upgrade, since they’ll handle multiple connections more smoothly.” A critical point is that some connected devices on even older 802.11 versions – n, g, and even b in some cases – won’t be able to connect to 802.11ax radios, so they won’t be able to benefit from the numerous technological upsides of the new standard. Making sure that a given network is completely cross-compatible will be a central issue for IT staff looking to upgrade access points that service legacy gear.


Cloud storage solutions gaining momentum through disruption by traditional vendors

This disruption, where traditional on-prem vendors have brought their offerings out into the public cloud, has led to the emergence of more innovative cloud storage solutions. “This has given customers flexibility in how they approach storage in a cloud environment, with more enterprise-style services being offered,” continued Beale. On-prem vendors want to have storage apps in the cloud. In part, this is a marketing positioning exercise where they want to be seen as new and innovative vendors, by working in the cloud. But there’s a second, more technical and practical reason for this changing cloud storage landscape. “Around 80% of the organisations that we speak to have some sort of cloud presence. That means they’re using cloud-based technologies and, at some point, multiple cloud providers, along with an on-prem solution,” explained Beale. Having a ubiquitous big data plane across an organisation is appealing to customers, because they don’t have to spend a lot of time, money or resources on disparate platforms across multiple vendors.


Why flexible work and the right technology may just close the talent gap

Increasingly what we see is that freelancers become full-time freelancers, meaning it’s their primary source of income. Usually, as a result of that, they tend to move. And when they move, it is out of big cities like San Francisco and New York. They tend to move to smaller cities where the cost of living is more affordable. And so that’s true for the freelance workforce, if you will, and that’s pulling the rest of the workforce with it. What we see increasingly is that companies are struggling to find talent in the top cities where the jobs have been created. Because they already use freelancers anyway, they are also allowing their full-time employees to relocate to other parts of the country, as well as hiring people away from their headquarters, people who essentially work from home as full-time employees, remotely. ... And along the way, companies realized two things. Number one, they needed different skills than they had internally. So the idea of the contingent worker or freelance worker who has that specific expertise becomes increasingly vital.


Life on the edge: A new world for data


For many CIOs, a strategy for edge computing will be entirely new. Sunil urges CIOs to assess what parts of edge computing can be achieved in-house and what should be done through a consulting firm. “A system integrator will play a big role in bringing it all together,” he says. Chris Lloyd-Jones, emerging technology, product and engineering lead at Avanade, says large enterprises are starting to build IoT platforms to centrally manage edge computing devices and provide connectivity across geographic regions. “Edge computing is no longer just about an on-board computer where data from the device is uploaded via a USB cable,” he says. “Edge computing now handles 4G and 5G connectivity with periodic connectivity, and support for full-scale machine learning and computationally intensive workloads. Data can be transmitted to and from the cloud. This provides centralised management.” Lloyd-Jones says the cloud can be used to train machine learning models, which can then be deployed to edge devices and managed like any other IT equipment.


Microsoft: RDP brute-force attacks last 2-3 days on average


Usually, these attacks use combinations of usernames and passwords that have been leaked online after breaches at various online services, or are simplistic in nature and easy to guess. Microsoft says that the RDP brute-force attacks it recently observed last 2-3 days on average, with about 90% of cases lasting for one week or less, and less than 5% lasting for two weeks or more. The attacks lasted days rather than hours because attackers were trying to avoid getting their attack IPs banned by firewalls. Rather than try hundreds or thousands of login combos at a time, they were trying only a few combinations per hour, prolonging the attack across days, at a much slower pace than RDP brute-force attacks had been observed before. "Out of the hundreds of machines with RDP brute force attacks detected in our analysis, we found that about 0.08% were compromised," Microsoft said. "Furthermore, across all enterprises analyzed over several months, on average about 1 machine was detected with high probability of being compromised resulting from an RDP brute force attack every 3-4 days," the Microsoft research team added.
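Because the campaigns are deliberately low and slow, an hourly failed-login threshold misses them. One hedged way to surface them (illustrative threshold and window, not Microsoft's actual method) is to aggregate failed logins per source IP over a multi-day window:

```python
# Sketch: detect low-and-slow RDP brute force by counting failed logins per
# source IP over a sliding multi-day window, not per hour.
from collections import defaultdict
from datetime import timedelta

WINDOW = timedelta(days=3)   # matches the multi-day campaigns described
THRESHOLD = 50               # illustrative cutoff; tune to your environment

def flag_slow_bruteforce(failed_logins):
    """failed_logins: iterable of (timestamp, source_ip) for failed RDP logins."""
    by_ip = defaultdict(list)
    for ts, ip in failed_logins:
        by_ip[ip].append(ts)
    flagged = set()
    for ip, times in by_ip.items():
        times.sort()
        start = 0
        for end in range(len(times)):        # slide a window over the timestamps
            while times[end] - times[start] > WINDOW:
                start += 1
            if end - start + 1 >= THRESHOLD:  # too many failures in one window
                flagged.add(ip)
                break
    return flagged
```

A few attempts per hour stays under hourly alarms but still accumulates past the threshold across days.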


AI, privacy and APIs will mold digital health in 2020

Interoperability is a major player in health tech innovation: patients will always receive care across multiple venues, and secure data exchange is key to providing continuity of care. Standardized APIs can provide the technological foundations for data sharing, extending the functionality of EHRs and other technologies that support connected care. Platforms like Validic Inform leverage APIs to share patient-generated data from personal health devices to providers, while giving them the ability to configure data streams to identify actionable data and automate triggers. In the upcoming year, look for major players like Apple and Google to make strides toward interoperability and breaking down data silos. Apple’s Health app already is capable of populating with information from other apps on your phone. Add your calorie intake to a weight loss app? Time your miles with a running app? Monitor your bedtime habits with a sleep tracking app? You’ll find that info aggregating in your Health app. Apple is uniquely positioned to be the driver of interoperability, and Google is not far behind.
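In health tech, "standardized APIs" usually means FHIR-style REST endpoints. A hedged sketch of what pulling patient observations from such an API could look like (the base URL and patient ID are hypothetical; real servers also require authentication):

```python
# Sketch of a FHIR-style REST query: fetch Observation resources for a patient.
import json
import urllib.request

BASE = "https://fhir.example.org/r4"          # hypothetical FHIR R4 endpoint
url = f"{BASE}/Observation?patient=123&_count=10"

with urllib.request.urlopen(url) as resp:
    bundle = json.load(resp)                  # FHIR search results arrive as a Bundle

for entry in bundle.get("entry", []):
    obs = entry["resource"]
    print(obs.get("code", {}).get("text"),
          obs.get("valueQuantity", {}).get("value"))
```

Because every conformant server exposes the same resource shapes, an app written against one EHR's FHIR endpoint can, in principle, read from another's.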


Capitalizing on the promise of artificial intelligence


Remarkably, a majority of early adopters within each country believe that AI will substantially transform their business within the next three years. However, as pointed out in Is the window for AI competitive advantage closing for early adopters? — part of Deloitte’s Thinking Fast series of quick insights — the early adopters also believe that the transformation of their industry is following close on the heels of their own AI-powered business transformation. Globally, there’s a sense of urgency among adopters that now is the time to capitalize on AI, before the window for competitive advantage closes. However, comparing AI adopters across countries reveals notable differences in AI maturity levels and urgency. While many nations regard AI as crucial to their future competitiveness, these comparisons indicate that some countries are adopting AI aggressively, while others are proceeding with considerable caution — and may be at risk of being left behind. Consider Canada.


Building the ‘Intelligent Bank’ of the Future

Of high awareness within the banking industry, but not yet understood by consumers, is the evolving nature of open banking, which has proceeded in stages in Europe and elsewhere, but not yet in the U.S. From the consumer perspective, people want easier ways to manage their money and make their daily life easier. Many financial institutions, on the other hand, are somewhat overwhelmed by the prospect of delivering on the open banking promise. The paradox lies between the desire to deliver more integrated solutions and the need to be transparent around the sharing of data between multiple organizations. Most of the concerns around open banking revolve around the collection and sharing of data with third parties and the inherent risks of such sharing. There is also the need to educate both the consumer and the employee on data security. The end result is less-than-clear regulations around open banking, and very few organizations actually prepared to deliver on what has been promised to consumers. That said, it is interesting that more than four in ten financial institutions (41%) are looking beyond just offering banking products in the future.


Backdoors and Breaches incident response card game makes tabletop exercises fun

Backdoors & Breaches  >  Incident Response Card Game
Unlike some tabletop exercises that can take months to prepare and last for days, Backdoors and Breaches makes it simple to role-play thousands of possible security incidents, and to do so even as a weekly exercise. The game can be played just by blue teamers but could also involve a member of the legal team, management, or a member of the public relations team. The ideal game involves no more than six players to ensure that everyone is engaged and participating. "This game can be played every Thursday at lunch," Blanchard tells CSO. If the upside of the B&B card deck is the ability to instantly create thousands of scenarios from generic attack methods, the downside is that it lacks cards for specific industries or for company-specific issues. Black Hills plans expansion decks in 2020, including one for industrial control system (ICS) security and another for web application security. The B&B deck launched at DerbyCon 2019, and Blanchard says they plan to give away free decks at every infosec conference they attend in 2020. The decks are also available on Amazon for $10 plus shipping, which, he says, just covers their costs.


An Introduction to Blazor and Web Assembly

Blazor is a new framework that lets you build interactive web UIs using C# instead of JavaScript. It is an excellent option for .NET developers looking to take their skills into web development without learning an entirely new language. Currently, there are two ways to work with Blazor: running on an ASP.NET Core server with a thin client, or completely in the client’s web browser using WebAssembly instead of JavaScript. ... UI component libraries were created long before Blazor. Most existing frameworks that target web applications are based on JavaScript. They are still compatible with Blazor due to its ability to interoperate with JavaScript. Components that are primarily based on JavaScript are called wrapped JavaScript controls, as opposed to components written entirely in Blazor, which are referred to as native Blazor controls. Native Blazor controls parse the Razor syntax to generate a render tree that represents the UI and behavior of that control. The render tree is why it’s possible to run server-side Blazor: the tree is parsed and used to generate HTML on the server that’s sent to the client for rendering. In the case of Blazor WebAssembly, the render tree is parsed and rendered entirely in the client.



Quote for the day:


"No great manager or leader ever fell from heaven, its learned not inherited." -- Tom Northup


Daily Tech Digest - January 06, 2020

Deep learning vs. machine learning: Understand the differences

Dimensionality reduction is an unsupervised learning problem that asks the model to drop or combine variables that have little or no effect on the result. It is often used in combination with classification or regression. Dimensionality reduction techniques include removing variables with many missing values, removing variables with low variance, Decision Tree, Random Forest, removing or combining variables with high correlation, Backward Feature Elimination, Forward Feature Selection, Factor Analysis, and PCA (Principal Component Analysis). Training and evaluation turn supervised learning algorithms into models by optimizing their parameter weights to find the set of values that best matches the ground truth of your data. The algorithms often rely on variants of steepest descent for their optimizers, for example stochastic gradient descent, which is essentially steepest descent performed on randomly sampled subsets of the data rather than on the full dataset at each step.
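A minimal sketch tying those two ideas together: PCA for dimensionality reduction feeding a linear model trained with stochastic gradient descent, using scikit-learn on a bundled dataset:

```python
# Sketch: PCA reduces the feature space, then a linear classifier is fitted
# with stochastic gradient descent on the reduced features.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    PCA(n_components=0.95),          # keep components explaining 95% of variance
    SGDClassifier(random_state=0),   # linear model optimized by SGD
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

The pipeline illustrates the combination the excerpt mentions: unsupervised reduction first, supervised training and evaluation after.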



Why enterprises should care about DevOps


The old days of manually doing everything as an IT person are gone, and companies that are still operating that way are undergoing transformation. But I don’t think we’re ever going to get rid of operational concerns. It’s just going to be that rather than doing things manually, or through graphical consoles, you're going to work via APIs, scripting languages and automation tools like Puppet. And in many ways – and I say this quite a lot – DevOps has made operations people feel that they must become developers to get their job done. But it’s more about embracing software engineering principles. It’s about version control, release management, branching strategies, and continuous integration and delivery. We’ve seen this repeatedly, and that’s why we added features to Puppet Enterprise around continuous delivery, because the most successful customers were those that were adopting infrastructure as code.


Legal engineering: A growing trend in software development

Legal engineers come from incredibly diverse backgrounds and collectively have years of experience and insights that benefit our customers tremendously. They include former attorneys from top law schools and some of the country's best law firms, experts in contract law, and a former civil rights trial attorney. We have other legal engineers who came to us from top-tier management consulting firms and several who gained considerable experience at some of Silicon Valley's best SaaS companies. These diverse backgrounds and responsibilities mean that the role of legal engineering can seem very different depending on who you ask. To our customers, they are thought partners, advising on best practices for building a modern legal team. To our product team, they are the voice of the user, listening and synthesizing valuable feedback. Sometimes, we even refer to them internally as our in-house S.W.A.T. team, because they are ready and able to jump in and help fix any situation. Ultimately, legal engineers are at the forefront of the modernization of in-house legal. As legal technology continues to evolve, so will legal engineering.


Fragmentation by Country
In this post, we look at how Fragmentation varies across the globe and key statistics you should keep in mind if you have a presence in these markets. The growth mantra of online businesses is scale — reach more users, fast. However, as you scale across countries, it’s important to ensure that your app/website is compatible with your users’ devices and browsers. Compatibility is to online businesses what distribution is to brick and mortar ones. You might have the best product in the world, but it counts for nothing if your customers don’t have the experience you designed for them. For instance, being compatible with the top 20 devices will help you cover 70% of the US audience. In India, not only will the devices be different, the coverage provided will be less than 35%. Similarly, if your mobile website doesn’t load properly in the Opera browser, you would have ignored almost half of the Nigerian market!


Industry 4.0 / Industrial IoT / Smart Factory
“This consolidation will strengthen the ability of the IIC to provide guidance and advance best practices on the uses of distributed-ledger technology across industries, and boost the commercialization of these products and services,” said 451 Research senior blockchain and DLT analyst Csilla Zsigri in a statement. Gartner vice president and analyst Al Velosa said that it’s possible the move to team up with TIoTA was driven in part by a new urgency to reach potential customers. Where other players in the IoT marketplace, like the major cloud vendors, have raked in billions of dollars in revenue, the IIoT vendors themselves haven’t been as quick to hit their sales targets. “This approach is them trying to explore new vectors for revenue that they haven’t before,” Velosa said in an interview. The IIC, whose founding members include Cisco, IBM, Intel, AT&T and GE, features 19 different working groups, covering everything from IIoT technology itself to security to marketing to strategy.



Up to half of developers work remotely; here's who's hiring them

It is estimated that there are between 18 million and 21 million developers across the globe. Of this, only about one million -- or five percent -- are in the United States, so you can see how an employer in the US, or anywhere else for that matter, needs to spread its recruiting and staffing wings. It's in the best interest of tech-oriented employers, then, to be open to this global pool of talent. There are a number of companies leading the way, actively hiring globally distributed tech workforces. Glassdoor recently published a list of leading companies that encourage remote work, which includes some prominent tech companies, and Remotive has been compiling a comprehensive list of more than 2,500 companies of all sizes that hire remote IT workers. Survey data from Stack Overflow, analyzed by Itoro Ikon, finds that out of almost 89,000 developers participating in its most recent survey, 45% work remotely at least part of the time, and 10% indicated they are full-time remote workers.


The Fundamental Truth Behind Successful Development Practices: Software is Synthetic


Look across the open plan landscape of any modern software delivery organization and you will find signs of it, this way of thinking that contrasts sharply with the analytic roots of technology. Near the groves of standing desks, across from a pool of information radiators, you might see our treasured artifacts - a J-curve, a layered pyramid, a bisected board - set alongside inscriptions of productive principles. These are reminders of agile training past, testaments to the teams that still pay homage to the provided materials, having decided them worthy and made them their own. What makes these new ways of working so successful in software delivery? The answer lies in this fundamental yet uncelebrated truth - that software is synthetic. Software systems are creative compounds, emergent and generative; the product of elaborate interactions between people and technology.


5G is poised to transform manufacturing

Today, many manufacturers use fiber, Wi-Fi and 4G LTE rather than 5G because 5G infrastructure, standards, and devices are yet to be available and proven. “But many people are starting to look at 5G today, looking at it as a more future-proof strategy than adopting 4G,” said Dan Hays, principal and head of US corporate strategy practice at PricewaterhouseCoopers LLP. “4G LTE has been around for a little over a decade.” 5G devices available today are very early ones. “They are not yet at the mass-production level and have not come down the cost curve to drive large-scale adoption,” he said. According to Erik Josefsson, vice president and head of advanced industries at Ericsson, which makes underlying 5G technology, 5G is currently at Release 15, which offers high data rates, extended coverage, and low latency compared to 4G – but doesn’t get down to the goal of 1-millisecond latency. "You can get 10 milliseconds," he said. "But you're not down to 1 millisecond yet. Release 16 is ultra-reliable low latency, down below 10 milliseconds, for more complex use cases."


These five tech trends will dominate 2020


The constant drip-drip of data leaks and privacy catastrophes shows that security is still, at best, a work in progress for many organisations. And security is still a minor consideration for many business leaders too. Perhaps that's because there have been so many leaks that they think the risk to their reputation is low. It's a dangerous assumption to make. More apps and more devices mean security teams are already spread too thinly. Add in new risks like Internet of Things projects, 5G devices and deepfakes, and the challenges mount unless companies take the broadest possible view of security. Organised crime and ransomware will still be the most consistent threats to most businesses; state-sponsored attacks and cyber-espionage will remain an exotic but potentially high-profile threat to a minority. For all this, the biggest risks will still be the basic ones: staff falling for phishing emails, or using their pets' names as passwords, and poorly configured cloud apps. There will always be new threats, so prepare for the strangest while not forgetting the basics.


Three Surprising Ways Archiving Data Can Save Serious Money


Until recently, backup solutions for enterprises typically fell into two strategies: tape or disk-to-disk (D2D) replication. Both of these solutions come with significant price tags for backing up a single terabyte of primary data. The common misconception is that tape backup is cheap. While an actual tape might be cheap, backing up primary data with tape also requires tape libraries, servers, software, data center space, power, cooling, and management overhead. These costs add up very quickly. Our research shows that backing up a single terabyte of primary data with tape could cost $138-$1,731 per year, depending on how frequently you are completing a full backup. The other common backup solution – replication – requires backup workflows that replicate data from the primary NAS system to a secondary storage platform from the same vendor. In most cases, this means that the secondary storage system is architecturally similar to the primary NAS device, requiring hardware, software, data center space, power, cooling, and management.



Quote for the day:


"There are many elements to a campaign. Leadership is number one. Everything else is number two.
 -- Bertolt Brecht


Daily Tech Digest - January 05, 2020

Overcoming Racial Bias In AI Systems And Startlingly Even In AI Self-Driving Cars

The algorithm that’s doing pattern matching might computationally begin to calculate that if someone is tall, then they are a basketball player. Of course, being tall doesn’t always mean that a person is a basketball player, and thus the pattern matching is already creating potential issues as to what it will do when presented with new pictures and asked to classify what the person does for a living. Realize too that there are two sides to that coin. A new picture of a tall person gets a suggested classification of being a basketball player. In addition, a new picture of a person that is not tall will be unlikely to get a suggested classification of being a basketball player (the classification approach is therefore over-inclusive for tall people and exclusionary for everyone else). In lieu of using height, the pattern matching might calculate that if someone is wearing a sports jersey, they are a basketball player.
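The spurious correlation the author describes is easy to reproduce with a toy model. In this deliberately skewed sketch, a decision tree trained on height alone learns "tall means basketball player" and excludes everyone else:

```python
# Toy illustration: a skewed training sample teaches the model a biased rule.
from sklearn.tree import DecisionTreeClassifier

# Feature: [height_cm]; label: 1 = basketball player, 0 = not. Every tall
# person in this tiny sample happens to be a player - a biased data set.
heights = [[200], [198], [205], [170], [165], [172]]
labels  = [1,     1,     1,     0,     0,     0]

clf = DecisionTreeClassifier().fit(heights, labels)
print(clf.predict([[201]]))  # [1]: any tall person is classified as a player
print(clf.predict([[168]]))  # [0]: shorter people are excluded, as the text notes
```

Swap height for a sports jersey and the same failure repeats, which is exactly the jersey example the paragraph ends on.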



How SwissLife France’s EAs Used Lean to Raise Their Level of Influence

From what we knew about Lean, we felt it was something that could help us get a grip again on this flow to better deliver on our mission. But we also knew that Lean applied well to activities that were already processed with an important flow of "pieces" and short cycle times. And we were completely aware that Enterprise Architecture is different in essence, since it is essentially an upstream activity where you produce abstract artifacts like plans and designs, and no concrete items. Furthermore, there is also some fuzzy logic in what architects deliver, because measuring the "quality" of an architecture is a challenge and can involve very long cycle times. At first sight, none of this was very compatible with Lean. But other IT teams at SwissLife had already conducted successful Lean projects in the past, and we had good contacts with the Lean coaches who had led them. So we decided to give Lean a go!


Oddly enough, the AI that can drive the explosive growth of a digital firm often isn’t even all that sophisticated. To bring about dramatic change, AI doesn’t need to be the stuff of science fiction—indistinguishable from human behavior or simulating human reasoning, a capability sometimes referred to as “strong AI.” You need only a computer system to be able to perform tasks traditionally handled by people—what is often referred to as “weak AI.” With weak AI, the AI factory can already take on a range of critical decisions. In some cases it might manage information businesses (such as Google and Facebook). In other cases it will guide how the company builds, delivers, or operates actual physical products (like Amazon’s warehouse robots or Waymo, Google’s self-driving car service). But in all cases digital decision factories handle some of the most critical processes and operating decisions. Software makes up the core of the firm, while humans are moved to the edge. Four components are essential to every factory. The first is the data pipeline, the semiautomated process that gathers, cleans, integrates, and safeguards data in a systematic, sustainable, and scalable way. The second is algorithms, which generate predictions about future states or actions of the business.



Meritocracy, Ethics and Enterprise Architecture

The big problem for all of us is that, if such an organization turns strong enough to take control of the recruiting market, we may have to join, pay and play by its rules to be able to profess at all. This is also the case with some standards today which, while they provide no returns, have monopolized the training and certification market, reducing them to worthless diploma mills. Having jumped at an apparently good cause, delivering standards to the profession, an organization may cause a lot of grief later on to all of us. Think of the cost to you of refusing to adopt the standard, be trained and certified in it. It's bad we cannot do anything about this. The good old detective question "cui bono" illustrates, if not solves, the dilemma of such standards, showing that the organizations promoting the standards win much more than the EA community and the ultimate customers, the companies of this world. Now, does EA warrant, as such, a code of ethics and an associated organization to police entry to and execution of the profession?


2020 will be a challenging year for challenger banks


There are a few basic features that separate challenger banks from legacy retail banks. Signing up is extremely simple and only requires a mobile app. The mobile app itself is usually much more polished than traditional banking apps. Users receive a Mastercard or Visa debit card that communicates with the company’s server for each transaction. This way, users can receive instant notifications, block and unblock their cards and turn off some features, such as foreign payments, ATM withdrawals and online transactions. Challenger banks usually promise customers no markup fees on transactions in foreign currencies, but there are sometimes limits on this feature. So how do these companies make money? When you pay with your card, banks generate a tiny interchange fee on each transaction. It’s really small, but it could become serious revenue at scale, with tens of millions or hundreds of millions of users. Challenger banks also offer other financial services like insurance products, foreign exchange or consumer credit.


Fresh Cambridge Analytica leak ‘shows global manipulation is out of control’

An explosive leak of tens of thousands of documents from the defunct data firm Cambridge Analytica is set to expose the inner workings of the company that collapsed after the Observer revealed it had misappropriated 87 million Facebook profiles. More than 100,000 documents relating to work in 68 countries that will lay bare the global infrastructure of an operation used to manipulate voters on “an industrial scale” are set to be released over the next months. It comes as Christopher Steele, the ex-head of MI6’s Russia desk and the intelligence expert behind the so-called “Steele dossier” into Trump’s relationship with Russia, said that while the company had closed down, the failure to properly punish bad actors meant that the prospects for manipulation of the US election this year were even worse. The release of documents began on New Year’s Day on an anonymous Twitter account, @HindsightFiles, with links to material on elections in Malaysia, Kenya and Brazil.


Neuro-symbolic A.I. is the future of artificial intelligence. Here’s how it works

Neuro-symbolic A.I. is not, strictly speaking, a totally new way of doing A.I. It’s a combination of two existing approaches to building thinking machines; ones which were once pitted against each other as mortal enemies. The “symbolic” part of the name refers to the first mainstream approach to creating artificial intelligence. From the 1950s through the 1980s, symbolic A.I. ruled supreme. To a symbolic A.I. researcher, intelligence is based on humans’ ability to understand the world around them by forming internal symbolic representations. They then create rules for dealing with these concepts, and these rules can be formalized in a way that captures everyday knowledge. If the brain is analogous to a computer, this means that every situation we encounter relies on us running an internal computer program which explains, step by step, how to carry out an operation, based entirely on logic. Provided that this is the case, symbolic A.I. researchers believe that those same rules about the organization of the world could be discovered and then codified, in the form of an algorithm, for a computer to carry out.
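A tiny sketch of what "rules formalized as an algorithm" means in practice: facts as symbols, one if-then rule, and forward chaining that applies the rule until no new facts emerge (a classic toy example, not any specific system):

```python
# Minimal symbolic reasoning: explicit facts, one if-then rule, and forward
# chaining that applies the rule until a fixpoint is reached.
facts = {("socrates", "is_a", "human")}

def rule_mortality(facts):
    """If X is_a human, conclude X is_a mortal."""
    return {(x, "is_a", "mortal") for (x, rel, obj) in facts
            if rel == "is_a" and obj == "human"}

new = rule_mortality(facts) - facts
while new:              # keep firing rules until nothing new is derived
    facts |= new
    new = rule_mortality(facts) - facts

print(("socrates", "is_a", "mortal") in facts)  # True, derived step by step
```

Every conclusion is traceable to an explicit rule, which is the symbolic camp's great strength and, as the hand-written rules pile up, its great weakness.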


Common Coding Mistakes You Should Avoid


According to the single responsibility principle, a function should only be responsible for doing one thing. And one thing only. I’ve seen way too many functions that fetch, process, and present data all in one place. It’s considered better programming to split this up: one function that fetches the data, one that processes it, and another one that presents the data. The reason it is important to keep a function focused on a single concern is that it makes the code more robust. Let’s say that in the foregoing example the data got fetched from an API. If there is a change to the API—for example, there is a new version—there is a greater danger that the processing code will break if it is part of the same function. This will most likely cause the presentation of the data to break as well. ... We’ve all seen entire blocks of code containing multiple functions being commented out. No one knows why it’s still there. And no one knows if that block of commented-out code is still relevant. Yet, no one deletes that block of code, which is what you should really do with it.
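The refactoring the author recommends looks like this in outline (the endpoint and fields are hypothetical, sketched in Python): one function per concern, so an API change only touches the fetch step:

```python
# Single responsibility in practice: fetch, process and present are separate,
# so a change to the API only forces edits in fetch_data.
import json
import urllib.request

def fetch_data(url: str) -> list:
    """Fetch raw records from an API (hypothetical endpoint)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def process_data(records: list) -> list:
    """Reduce raw records to just the fields the presentation needs."""
    return [r["name"] for r in records if r.get("active")]

def present_data(names: list) -> None:
    """Render the processed data; knows nothing about the API."""
    for name in names:
        print(f"- {name}")

present_data(process_data(fetch_data("https://api.example.com/users")))
```

If the API ships a new version, only fetch_data changes; processing and presentation keep working untouched.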


9 policies and procedures you need to know about if you’re starting a new security program

A change management policy refers to a formal process for making changes to IT, software development and security services/operations. The goal of a change management program is to increase the awareness and understanding of proposed changes across an organization, and to ensure that all changes are conducted methodically to minimize any adverse impact on services and customers. A good example of an IT change management policy available for fair use is at SANS. An organization’s information security policies are typically high-level policies that can cover a large number of security controls. The primary information security policy is issued by the company to ensure that all employees who use information technology assets within the breadth of the organization, or its networks, comply with its stated rules and guidelines. I have seen organizations ask employees to sign this document to acknowledge that they have read it (which is generally done with the signing of the AUP policy).


Why 2019 Was Actually A Secret Success For Blockchain In Financial Services

As interest in digital assets grows, the infrastructure for securely holding bitcoin and other cryptocurrencies in a regulated and compliant manner is considered one of the major challenges for any institutional newcomer, regardless of size and trading volume. In 2019, we saw the entrance of significant players, as both Bakkt (backed by ICE and the NYSE) and Fidelity Digital Assets became able to provide safekeeping and custodian services on top of other services. Additionally, custody was a hot topic throughout 2019, as we saw startups like Trustology trying to get in and hope for market share while traditional asset managers and custodians like Vanguard, State Street, and Northern Trust were slowly building products, solutions, and partnerships. In retail banking, and especially in the back-office services of settlement, reconciliation, transaction audit, and visibility, the most interesting developments in 2019 centered on stablecoins and CBDCs. We saw the launch of projects like J.P. Morgan’s own stablecoin and Utility Settlement Coin/Fnality, which is backed by banks like UBS, Barclays, and BNY Mellon, among others.



Quote for the day:


"Be with a leader when he is right, stay with him when he is still right, but, leave him when he is wrong." -- Abraham Lincoln