Daily Tech Digest - January 10, 2020

The smart cities challenge: How tech will update antiquated infrastructures


In terms of transportation initiatives, "Yes, we have to think of transportation," Lightman said, but for smart cities to operate optimally, she continued, we need to "look holistically as a system of systems," one that includes issues of "climate change and the critical thread of citizen engagement which runs through it." She cited an example in her home state: 16 years ago, Pittsburgh went bankrupt and lost half of its population. Now stable and growing, it's a city poised to become an ideal smart city (Lightman acknowledged that losing half the population put considerably less stress on the city's infrastructure). Carnegie Mellon, she said, is looking to address issues with "the infrastructure that's been neglected for almost 20 years; there are a lot of bridges and roads crumbling, and we have 40 active landslides." This is where emerging technologies like artificial intelligence (AI) and machine learning shine. Lightman stressed how important AI is in predicting natural disasters such as landslides. "AI," she said, "will solve many problems."


5 Tips on How to Build a Strong Security Metrics Framework

Know your audience. This advice applies to many areas, including metrics. The first step toward building a strong metrics framework is to understand who you're building it for. The metrics reported to the board and executives will differ from those you use to make operational improvements and tactical adjustments, and the metrics provided to customers to show that their data is protected will differ from those security management needs to make well-informed decisions. A good metrics framework provides the right metrics to each audience, even when there are several. ... If you've ever had your home or car inspected, you know that there are acceptable levels for radon in a home or emissions from a car. It isn't black or white, on or off. There is a range of levels within which the home or car passes the test, and outside of which it fails. The same should be true for metrics.
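The pass/fail-range idea can be sketched as a small threshold check. The metric names and bands below are hypothetical, invented purely to illustrate the point, not taken from the article:

```python
# Toy sketch of range-based metric evaluation: a metric passes while it
# stays inside an acceptable band, the way a radon or emissions test does.
# Metric names and thresholds here are illustrative assumptions.

THRESHOLDS = {
    # metric name: (acceptable minimum, acceptable maximum)
    "patch_latency_days": (0, 30),       # days to patch critical vulnerabilities
    "phishing_report_rate": (0.2, 1.0),  # fraction of simulated phish reported
}

def evaluate(metric: str, value: float) -> str:
    low, high = THRESHOLDS[metric]
    return "pass" if low <= value <= high else "fail"

print(evaluate("patch_latency_days", 12))     # within the band
print(evaluate("phishing_report_rate", 0.1))  # below the band
```

A framework like this makes the "acceptable range" explicit per audience: the same raw numbers can be rolled up against board-level bands or tighter operational bands.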


U.S. Funds Program With Free Android Phones For The Poor — But With Permanent Chinese Malware

The affected device is a UMX phone shipped by Assurance Wireless, and one of the preinstalled pieces of malware, according to Malwarebytes senior analyst Nathan Collier, is the creation of a Chinese entity known as Adups. Though the tool looks and operates as a Wireless Update program, it’s capable of auto-installing apps without any user consent, which it starts doing immediately, according to a Malwarebytes analysis of a device shared with Forbes ahead of publication. Adups hadn’t responded to a request for comment at the time of publication. “This opens the potential for malware to unknowingly be installed in a future update to any of the apps added by Wireless Update at any time,” Collier wrote in a blog post published Thursday. Historically, Adups tools have been caught siphoning off private data from phones, including the full body of text messages, contact lists and call histories with full telephone numbers. A second piece of malware comes preloaded on the Assurance Wireless-supplied device—the phone’s own Settings app, Collier claimed.


4 habits of effective DevOps engineers


Having an understanding of technology foundations will go a long way in DevOps, says Dempers. Enterprise deployments are accelerating, and there isn't enough time to dig into the weeds of every new technology, he says. "Learn about the underlying fundamentals of a technology, rather than how to use the technology, and how to apply the technology." For example, "instead of just learning how to run a Docker container, dive into the Linux features that make containerization work, and learn about those features. It makes it really easy to understand how Docker works. You can then move on to technologies like Kubernetes that use the same Linux kernel features." With an understanding of the underlying technology, it will be easier to communicate across the organization and understand how the technologies interact, Dempers adds. "You basically learn how to put all the pieces of the puzzle together and paint a picture in your head about the technologies. Then you can focus on the gaps of the things you're missing, rather than just focusing on how to use a technology."
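One of those underlying Linux features is kernel namespaces, the isolation primitive container runtimes are built on. A minimal sketch, assuming a Linux host (on other systems it simply returns nothing): every process's namespaces are visible as symlinks under `/proc/<pid>/ns`, and listing them shows the same mechanism Docker uses.

```python
# Peek at the kernel namespaces of the current process. On Linux this
# reveals entries like mnt, net, pid, and uts -- the building blocks of
# containerization. On non-Linux systems /proc does not exist, so the
# function returns an empty dict.

import os

def list_namespaces(pid: str = "self") -> dict:
    """Map namespace name -> identifier for a process (empty on non-Linux)."""
    ns_dir = f"/proc/{pid}/ns"
    if not os.path.isdir(ns_dir):
        return {}
    result = {}
    for name in sorted(os.listdir(ns_dir)):
        try:
            # Each entry is a symlink such as "mnt:[4026531840]".
            result[name] = os.readlink(os.path.join(ns_dir, name))
        except OSError:
            pass
    return result

for name, ident in list_namespaces().items():
    print(name, ident)
```

Two processes sharing an identifier share that namespace; a container runtime gives a container fresh identifiers, which is exactly the isolation Dempers suggests studying directly.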


The US just released 10 principles that it hopes will make AI safer

The newly proposed plan signifies a remarkable U-turn from the White House’s stance less than two years ago, when people working in the Trump administration said there was no intention of creating a national AI strategy. Instead, the administration argued that minimizing government interference was the best way to help the technology flourish. But as more and more governments around the world, and especially China, invest heavily in AI, the US has felt significant pressure to follow suit. During the press briefing, administration officials offered a new line of logic for an increased government role in AI development.  “The US AI regulatory principles provide official guidance and reduce uncertainty for innovators about how their own government is approaching the regulation of artificial intelligence technologies,” said US CTO Michael Kratsios. This will further spur innovation, he added, allowing the US to shape the future of the technology globally and counter influences from authoritarian regimes. There are a number of ways this could play out.


Learning from the Travelex cyber attack: Failing to prepare is preparing to fail


The key lesson we can take from the Travelex breach is that an effective response to a breach is a critical business function and is no longer the sole province of the IT department. Rather, it should be a core business competency supported by senior management with input from other business areas, such as HR, legal and compliance, public relations, customer support and the data protection team. As demonstrated by the Travelex breach, an incident can disrupt your business, with critical systems taken offline. To minimise the levels of disruption a cyber attack can inflict on your business, your incident response plan should be integrated closely with your business continuity plans. Finally, practice makes perfect, so regularly test how effective your processes are. Better to discover weaknesses in how you can respond to an incident during an exercise rather than in the midst of a real crisis.


The Bank of the Future Will Have Data Vaults and Money Vaults


Think about Google Assistant and Google Live on Google. These are next-generation digital services that can learn from their users, and can get better as their users use them. In the banking world, almost all banks are trying to build such services on their digital channels – next-generation concierge services that can understand the needs of their users and can adapt and give the right information to the right user at the right time. That’s what we refer to as “context-aware computing” or “contextualization.” Building these types of capabilities in the past required a lot of I.T. processes, algorithmic expertise, understanding things such as statistical modeling and predictive modeling. Flybits has really simplified that process for banking institutions. Instead of expecting the institution to hire data scientists and algorithmic experts, we have built platforms that even a marketing intern can be trained on, allowing them to focus more on use cases and creativity rather than worrying about I.T. complexities. This allows the bank or credit union to bring these next-generation predictive use cases to the market faster and in more efficient ways.


Restart Data and AI Momentum This Year

Starting small is the right way to tackle such a project, Bean agrees. "Companies need to demonstrate quick wins and measurable results to establish credibility and build momentum," he said. "We believe that those firms that start small, focus on a key business question or two, and show quick results, are most successful at creating a foundation for future success." IT's contribution to these steps comes in a few key ways. Davenport said that IT plays an important role in helping the business leaders understand what's possible with a particular technology. "They need to educate and build relationships as much as they need to build technology infrastructure," he said. The partnership between IT and line-of-business owners is key to the success of projects, according to Bean. ... One key role that remains in flux in 2020, according to the survey, is Chief Data Officer or Chief Analytics Officer. A growing number of organizations are hiring for this role from outside the firm.


Google details its three-year fight against the Bread (Joker) malware operation

In a blog post detailing its fight against the Bread gang published last night, Google said that the operators "have at some point used just about every cloaking and obfuscation technique under the sun in an attempt to go undetected." Google's security team said the malware was not what someone would call sophisticated, but just more persistent than others. "Sheer volume appears to be the preferred approach for Bread developers," Google said. "At different times, we have seen three or more active variants using different approaches or targeting different carriers," Google added. "At peak times of activity, we have seen up to 23 different apps from this family submitted to Play in one day." Google also said that Bread malware strains have been spotted on the Play Store, suggesting this malware operation knew what and who to target from the get-go and never deviated from its path even when it wasn't initially successful.


The 5 Biggest Cybersecurity Trends In 2020 Everyone Should Know About

While AI is undoubtedly being researched and developed as a means of crippling an enemy state’s civil and defense infrastructure during war, it’s also easily deployable by criminal gangs and terrorist organizations. So rather than between nations, today’s race is between hackers, crackers, phishers and data thieves, and the experts in cybersecurity whose job it is to tackle those threats before they cause us harm. Just as AI can “learn” to spot patterns of coincidence or behavior that can signal an attempted attack, it can learn to adapt in order to disguise the same behavior and trick its way past our defenses. This parallel development of offensive and defensive capabilities will become an increasingly present theme as AI systems become more complex and, importantly, more available and simpler to deploy. Everything from spam email attempts to trick us into revealing our credit card details to denial-of-service attacks designed to disable critical infrastructure will grow in frequency and sophistication.



Quote for the day:


"Nobody in your organization will be able to sustain a level of motivation higher than you have as their leader." -- Danny Cox


Daily Tech Digest - January 09, 2020

The importance of wearable hardware in the enterprise

“A large oil and gas firm is using geolocation wearables, connected via an IoT network, for site workers across multiple fields and rigs,” said Didier Pagnoux, director for IoT solutions at Altran. “It is also adding ‘wearable trackers’ for spare parts so workers can find them faster during emergencies, such as leakages. This is especially useful given how vast and extensive some fields, rigs and mines can be. “Wearables are starting to play a major role in the oil and gas industry. This is significant because much of the oil and gas industry is rooted deeply in 20th century methodologies, systems and processes.” Pagnoux continued by explaining how wearable hardware helps oil and gas companies to gain real-time insight into the environment in which their employees work, and gauge whether or not conditions are safe enough. “Embedded sensors within safety jackets and helmets are also being used within mines and rigs to feed a range of data on the conditions workers experience,” he said. “This is to monitor the air quality and to prevent accidents.”



Operationalizing Threat Intelligence at Scale in the SOC

The period of time for which threat data is valid is limited. Organizations need current information about vulnerabilities and malware being used in attacks before they are targeted. Intelligence feeds will have shifting levels of urgency, and simplifying the prioritization process is a complex task. In the past, security practitioners shared Word documents, PDFs, or simple file formats like CSV tables and Excel sheets of indicators of compromise. These were difficult to operationalize due to taxonomy and formatting differences, lack of integration, and the time-sensitive nature of the data. Also, it is difficult to describe and share a more complex behavioral indicator, such as a threat actor tactic, in a standardized format. The cyber community has tried — and failed — to institute an effective culture of sharing. Taxonomies and standards have been created, but none have caught on at scale, leaving access to CTI fragmented. As a result, most sharing doesn't go beyond domains. And even though security analysts across industries share common goals, often the organization does not see it that way, and sharing and collaboration are hidden from management.
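The taxonomy-and-formatting problem described above can be made concrete with a small normalization sketch. The column names, type aliases, and target schema below are hypothetical, standing in for the way different feeds label the same indicator types differently:

```python
# Minimal sketch of operationalizing ad-hoc IOC feeds: parse a CSV and map
# each feed's own labels onto one internal schema. Real pipelines would use
# a standard such as STIX; field names here are invented for illustration.

import csv
import io

# Different feeds label the same concepts differently.
TYPE_ALIASES = {
    "ip": "ipv4-addr", "ip-dst": "ipv4-addr", "ipv4": "ipv4-addr",
    "domain": "domain-name", "hostname": "domain-name",
    "md5": "file-hash", "sha256": "file-hash",
}

def normalize_feed(raw_csv: str) -> list:
    """Parse a CSV feed and normalize each row to {'type', 'value'}."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    indicators = []
    for row in rows:
        raw_type = (row.get("type") or row.get("indicator_type") or "").lower()
        value = (row.get("value") or row.get("indicator") or "").strip()
        if raw_type in TYPE_ALIASES and value:
            indicators.append({"type": TYPE_ALIASES[raw_type], "value": value})
    return indicators

feed = "type,value\nip,203.0.113.7\nDomain,evil.example\nunknown,???\n"
print(normalize_feed(feed))
```

Rows whose type the pipeline cannot map are dropped rather than guessed at, which is one reason unstandardized sharing loses data at every hop.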


Grilling the answers: How businesses need to show how AI decides


“It became very obvious that if you are going to be using these machine learning algorithms to inform, or guide some really important decisions in our lives, then you really need to have this confidence or trust,” she says. But explaining machine learning decision-making to a data scientist is one thing; explaining it to consumers or the public will require a great deal more creative thinking, says Mojsilovic. “Fairness may be a complex ethical issue, but in a way, explainability is even more difficult,” she says. “Think about how humans explain things, how we navigate the world around us and how we communicate. We do it in so many different ways. We look for examples and counterexamples and summarise things, and so on. We thought about how to take that expressiveness of human interaction and create the methods to communicate [the way AI reaches conclusions].


A California student has filed a suit against Chinese-based TikTok, which she accuses of retrieving her data without permission
The vulnerabilities, as per the cybersecurity firm, could allow people with malicious intent to access user accounts and do a lot of things, such as steal their confidential information, delete their videos, make their private videos public, and so on. The vulnerabilities can also allow attackers to upload unauthorized videos to compromised accounts. The firm found that the app's subdomain was vulnerable to a type of attack where seemingly benign or “innocent” websites can be used to hack accounts and steal information. These, called XSS attacks, allow hackers to insert malicious scripts into trusted websites. Attackers can leverage this vulnerability to send TikTok users spoofed messages that contain links. These messages are made to look legitimate, as if they are from TikTok. If a person clicks or taps on the links, the attacker can then gain access to their TikTok account for whatever purpose the attacker may have in mind. Check Point looked into TikTok's vulnerability to XSS attacks and successfully retrieved confidential user information, which included private email addresses and birthdates. The cybersecurity firm informed TikTok of the vulnerabilities on Nov. 20 last year, and by December, the app company was able to fix them.
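The core of an XSS flaw is attacker-controlled text being interpreted as script by the browser, and the standard defense is escaping untrusted input before embedding it in HTML. A minimal illustration using Python's standard library (not TikTok's actual fix, just the general technique):

```python
# XSS in one line: if untrusted text is dropped into a page verbatim, the
# browser executes it as script. Escaping turns the markup characters into
# harmless entities so the text renders literally instead.

from html import escape

untrusted = "<script>steal(document.cookie)</script>"
safe = escape(untrusted, quote=True)
print(safe)  # &lt;script&gt;steal(document.cookie)&lt;/script&gt;

# Embedded in a page, the escaped text can no longer execute.
page = f"<p>Latest comment: {safe}</p>"
assert "<script>" not in page
```

The same principle applies to the spoofed-link scenario in the article: any value that crosses a trust boundary must be encoded for the context it lands in.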


CES 2020 car show features liquid crystal sun visors, EyeLocks, and smart LiDAR

At CES 2020, Cerence showed how voice recognition and head tracking can be used together to open windows and doors. These button-free controls use voice recognition, gaze detection, touch, and gesture to create a natural, human-like in-car experience. The demo also included intelligent voice traffic notifications that leverage natural language generation to assist drivers with route selection. Bosch has made the sun visor smart with a camera and a transparent liquid crystal display. The Bosch Virtual Visor blocks only the portion of the visor where the sun would strike the driver's eyes while leaving the rest of the visor transparent. This improves visibility for the driver and automates adjustments to the visor, allowing the driver to focus on the road.  Also at CES 2020, EyeLock announced that SiriusXM will use the company's iris authentication tech to safeguard its new mobile e-wallet. The in-car platform lets drivers pay tolls, purchase gas, or stop at the drive-through without reaching for a wallet. Drivers use voice commands or a touch screen to start an e-wallet transaction and then an iris scan verifies the request. The custom EyeLock prototype will be placed in the visor of the car, allowing for the authentication of the driver, and other passengers that are enrolled in the system.


New Iranian data wiper malware hits Bapco, Bahrain's national oil company

At the time of writing, Bapco appears to be the only victim of an attack with the Dustman malware, although this doesn't mean the malware was not deployed on the networks of other targets. According to the CNA report, attackers don't seem to have planned to deploy Dustman at the time they did, but appear to have triggered the data-wiping process as a last-ditch effort to hide forensic evidence after they made a series of mistakes that would have revealed their presence on the hacked network. Sources who spoke with ZDNet on the condition of anonymity claimed the Bahrain company was compromised over the summer. Saudi CNA officials, along with our sources, confirmed the point of entry was the company's VPN servers. The CNA report cites "remote execution vulnerabilities in a VPN appliance that was disclosed in July 2019" as the attackers' point of entry into Bapco's network. While officials didn't blame any specific appliance, they are most likely referring to a Devcore report published over the summer that disclosed remote execution bugs in a wealth of enterprise-grade VPN servers, such as those from Fortinet, Pulse Secure, and Palo Alto Networks.


The case for change: New world, new skills


By upskilling, we mean giving people the opportunity to gain the knowledge, tools, and abilities they need to use and understand advanced and ever-changing technologies in the workplace and their daily lives. Not everyone has to learn to code, but many people need to understand and manage artificial intelligence, data analytics, autonomous vehicles, and other technologies that can’t even be predicted — those emerging now and those that will be created in the future. But upskilling is not simply a matter of teaching people how to use a new device. That device may be obsolete by the following year. It involves learning how to think, act, and thrive in a digital world in a way that is sustainable over time. Each nation will need its own approach, and each will need to consider the demographics of its citizens, its level of tech maturity, and the makeup of its economy to develop its own upskilling solution. A territory with a developed economy, an aging population, and a strong service sector will have different priorities than a region with a developing, mostly rural economy and a population in which most people are under 30.


How to create data literacy: 3 keys

A data literacy program creates associate development opportunities. Say you take three classes in Portuguese and you learn the vocabulary and the basic rules of grammar. You gain an appreciation for the language, you can read it, and you can make basic sense out of what others are communicating. To this end, we offer classes to help Red Hat associates develop their data literacy skills in a way that’s appropriate based on their role in the organization. Whether they are just starting their data literacy journey, are data practitioners, or are data leaders/advocates, everyone can grow their skills. Not everyone will have the same end goal, but everyone can learn from seeing real data stories of business value gained. For example, we have courses ranging from “The Power of Data Visualization” to “Data Storytelling.” This is a great start, but who hasn’t taken a class and walked out the door (or logged off) with the best of intentions but no real plan for using the new knowledge? What happens? You never really feel confident in speaking the language.


Add Augmented Analytics to Your Business Data Practices


As we've said, the best endorsement of our instructions for adding augmented analytics into your business's data strategy is the independent feedback of our peers. In a survey of G2 Crowd reviewers, 75 percent of those surveyed favored Oracle Analytics for "Predictive Analytics Feature Satisfaction," compared to 68 percent who favored either Microsoft Power BI or Tableau desktop products. Similarly, when asked about services, 78 percent of survey respondents put Oracle at the top of the list of vendors who they felt satisfied their needs for Big Data Features. Contrast that with Microsoft and Tableau, which scored 76 percent and 73 percent respectively. By connecting with big data sources such as those that leverage Hadoop, users can analyze unstructured data like text, videos, and image data sets, among others. This enables businesses to monitor and dig insights out of nontraditional data sets—like social media posts, emails, or IoT sensors, to name a few—that provide streaming data. Not only do these advanced features provide previously undiscovered insights, they offer relief to organizations that are not able to hire large teams of data analysts, through true self-service functionality delivered by natural language.


Improving digital quotient through digital skilling

When building teams for any end-to-end process, the skills for any particular role typically look a lot different. A system admin may need to know a lot more about development, a developer may need to know about user experience, and a business executive may need to know about cloud computing. The skills to support a digital enterprise consist of these new skills along with foundational literacies and character qualities. Improving a firm’s digital quotient means supporting staff who do not see themselves as living inside of silos. New talent should see themselves as working across departments and continuously staying abreast of the latest disruptions in the landscape so they can engage with their counterparts effectively. ... Fear is the biggest threat to the digital quotient. While the agenda for digital quotient is far too large to be implemented all at once, it is also far too important not to be pursued. When digital quotients rise, so does business performance, in both technology-focused and traditional business firms.



Quote for the day:


"Remember teamwork begins by building trust. And the only way to do that is to overcome our need for invulnerability." -- Patrick Lencioni


Daily Tech Digest - January 08, 2020

Why data and analytics is so significant for Wells Fargo

Enterprise analytics was brought into the company firstly to offer a better experience and secondly because, with the many advancements in AI and machine learning, Wells Fargo wanted to create a centre of excellence to make sure that it is bringing the “latest and greatest” into the bank. In order to do that, it is looking into machine learning use cases. “The first step was to create what we call an Artificial Intelligence Program Bank. It comprised three different teams that were put together to do this. The first team is the business team, which is part of our innovation team, and their mandate was to identify the big use cases that we want to go after and what are the big focus areas, and to figure out the areas that they want to understand and see where they can apply AI and machine learning. The second team was my team, which is all about data and data science. We ensure that we bring the right data, identify the problems, and then make sure that we have the right team members to be able to do the model development. The third team in the group was related to technology. We decided to bring these three groups together, and drive forward the application of AI in the bank,” informs Thota.



Jupyter Notebook is a popular tool for data-science pros, allowing them to create and share notebooks containing code, visualizations, and other useful information. Microsoft enabled native editing of Jupyter notebooks in VS Code in its October release of the Python extension, allowing data scientists to manage source control, open multiple files, and use the auto code-completion feature IntelliSense. In the January release, VS Code Python extension users can now see the current kernel that the notebook is using as well as the status of the kernel, such as whether it is idle or busy. Users can change to other Python kernels from the VS Code kernel selector. Microsoft also promises that this release brings performance improvements for Jupyter in VS Code in both the Notebook editor and the Interactive Window. The improvements are the result of caching previous kernels and optimizing the search for Jupyter, according to Microsoft. Microsoft says the initial start of the Jupyter server is faster and that subsequent starts are more than twice as fast. Users should experience a noticeably faster process when creating a new blank Jupyter notebook and when opening Jupyter Notebooks with a large file size.


Why Analytics Alone is No Longer Enough


Self-service analytics has been on the agenda for a long time, and has brought answers closer to the business users, enabled by “modern BI” technology. That same agility hasn’t happened on the data management side – until now. “DataOps” has come onto the scene as an automated, process-oriented methodology aimed at improving the quality and reducing the cycle time of data management for analytics. It focuses on continuous delivery and does this by leveraging on-demand IT resources and automating the testing and deployment of data. Technology like real-time data integration, change data capture (CDC) and streaming data pipelines are the enablers. ... Demand for data catalogues is soaring as organizations continue to struggle with finding, inventorying and synthesizing vastly distributed and diverse data assets. In 2020, we’ll see more AI-infused metadata catalogues that will help shift this gargantuan task from manual and passive to active and adaptive. This will be the connective tissue and governance for the agility that DataOps and self-service analytics provide.
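The change-data-capture idea mentioned above is easy to sketch: instead of re-copying whole tables, a CDC pipeline emits only the deltas. Production CDC tools read the database transaction log; this toy snapshot-diff version exists only to show the shape of the change events, and the records in it are invented:

```python
# Toy CDC: diff two snapshots keyed by primary key into change events.
# Real CDC reads the transaction log rather than comparing snapshots,
# but the output -- a stream of insert/update/delete events -- is the same idea.

def capture_changes(old: dict, new: dict) -> list:
    """Return (kind, key, row) events describing how old became new."""
    events = []
    for key in new:
        if key not in old:
            events.append(("insert", key, new[key]))
        elif new[key] != old[key]:
            events.append(("update", key, new[key]))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events

yesterday = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
today = {1: {"name": "Ada L."}, 3: {"name": "Edsger"}}
for event in capture_changes(yesterday, today):
    print(event)
```

Downstream consumers (a streaming pipeline, a data catalogue refresh) subscribe to this event stream rather than polling full tables, which is where the cycle-time reduction comes from.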



Evaluating data loss impact in the cloud

Following a data loss incident, organisations can see a decline in the value of competitively differentiating assets. The value of individual data sets within large organisations is something that should be assessed and measured by individual data owners within each team (engineering, product, marketing, HR etc). These data owners understand the life cycle, value, and use of their specific data and should be working in collaboration with the information security team to ensure appropriate risk practices are followed. For cloud, in addition to the data itself, competitive advantage components may include algorithms tuned by the data for business intelligence and data analytics purposes. ... The scale of reputational damage depends on the organisational business model, the details of any incident, and on the category of data itself. Customer data loss can lead to long-term reputational damage, especially if the organisation has been clearly critiqued for poor organisational and technical controls in protecting the data.


Amid privacy and security failures, digital IDs advance

Self-sovereign identity envisions consumers and businesses eventually taking control of their identifying information on electronic devices and online, enabling them to provide validation of credentials without relying on a central repository, as is done now. Self-sovereign identity technology also takes the reins away from the centralized ID repositories held by the social networks, banking institutions and government agencies. A person’s credentials would be held in an encrypted digital wallet for documenting trusted relationships with the government, banks, employers, schools and other institutions. But it’s important to note that self-sovereign ID systems are not self-certifying: the decision about whom to trust rests with the relying party. Whoever you present your digital ID to has to decide whether the credentials in it are acceptable. "For example, if I apply for a job…, and they require me to prove I graduated from a specific school and need to see my diploma, I can present that in digital form," said Ali.
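The issue-hold-verify flow in the diploma example can be sketched in a few lines. Real self-sovereign identity systems use asymmetric public-key signatures (so the issuer never shares a secret with verifiers) and standards like W3C Verifiable Credentials; this toy uses an HMAC purely to keep the sketch inside the standard library, and the claim fields are invented:

```python
# Toy credential flow: an issuer signs a claim, the holder keeps it in a
# wallet, and a verifier checks the signature without asking a central
# repository. NOTE: HMAC is a stand-in; real SSI uses asymmetric signatures
# (e.g. Ed25519) so verification needs only the issuer's public key.

import hashlib
import hmac
import json

ISSUER_KEY = b"university-signing-key"  # stands in for the issuer's private key

def issue_credential(claim: dict) -> dict:
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": sig}

def verify_credential(cred: dict) -> bool:
    payload = json.dumps(cred["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["signature"])

diploma = issue_credential({"subject": "alice", "degree": "BSc", "year": 2019})
print(verify_credential(diploma))   # the signature checks out
diploma["claim"]["degree"] = "PhD"  # any tampering breaks the signature
print(verify_credential(diploma))
```

The point the article makes survives the toy: verification says only that the issuer vouched for the claim; whether to accept that issuer remains the relying party's decision.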


Security Think Tank: Hero or villain? Creating a no-blame culture

In the general business IT world, all too often the end-user is identified as the point of blame for an intrusion, resulting in a culture of fear with people afraid to report anything suspicious, especially if they have clicked on a link they shouldn’t have. If there is one thing we should have learned, it is that nobody is immune to social engineering. There are numerous examples of security experts and senior managers of security companies being duped, so we must accept it is going to happen. Just as in the aviation example, this comes down to education and appropriate reporting mechanisms. Reporting must be easy, quick and provide positive feedback. Ideally, for phishing emails there should be a button to click to send the suspicious email to an automated analysis, which gives the user instant feedback on whether the email was safe or not and which automatically alerts the security operations team of any unsafe email. For other suspicious activity, feedback could be via a web portal linked to a ticketing system.


An autonomous, laser-guided mosquito eradication machine

A three-year-old startup called Bzigo is developing a device that accurately detects and locates mosquitoes. Once a mosquito is detected, the device sends a smartphone notification while the mosquito is marked by a laser pointer. ... An autonomous laser marker that keeps a bead on the bloodsuckers might just even the playing field. This might strike you as kind of a silly idea, but the tech behind it is pretty intriguing. The device is composed of an infrared LED, a hi-res wide-angle camera, custom optics, and a processor. The innovation lies in several computer vision algorithms that can differentiate between a mosquito and other pixel-size signals (such as dust or sensor noise) by analyzing their movement patterns. A broad patent covering the device and its technologies was recently approved, giving Bzigo a leg up in the high-stakes world of mosquito sport hunting. It's also worth noting that Bzigo is hardly the first company to try to build a better mosquito solution using technology.


How to Build a Microservices Architecture With Node.Js to Achieve Scale?

Building real-world applications in JavaScript requires dynamic programming, and the size of a JavaScript application can grow uncontrollably. New features and updates get released, and you need to fix bugs to maintain the code. To execute this, new developers need to be added to the project, which becomes complicated. The structure of modules and packages alone cannot downsize and simplify the application. To run the application smoothly, it is essential to convert the large, monolithic structure into small, independent pieces of programs. Such complexities can be resolved effortlessly when JavaScript applications are built on microservices, more so with the Node.js ecosystem. In software application development, microservices are a style of service-oriented architecture (SOA) where the app is structured as an assembly of interconnected services. With microservices, the application architecture is built with lightweight protocols, and the services are fine-grained within the architecture. Microservices decompose the app into smaller services and enable improved modularity.
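The decomposition described above can be sketched with two tiny single-purpose HTTP services and an aggregator. The article's context is Node.js; the same shape is shown here in Python with only the standard library, and the service names and payloads are invented for illustration:

```python
# Two independent "microservices", each owning one concern, plus a gateway
# that composes their responses. In production each would be its own
# deployable process; here they run on ephemeral ports in one script.

import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def start_service(payload: dict):
    """Run an independent, single-purpose HTTP service on a free port."""
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep the demo quiet
            pass

    server = HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]

def fetch(port: int) -> dict:
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        return json.loads(resp.read())

# Each service could be developed, deployed, and scaled by its own team.
users_srv, users_port = start_service({"users": ["alice", "bob"]})
orders_srv, orders_port = start_service({"orders": [101, 102]})

# A gateway composes the independent services into one response.
combined = {**fetch(users_port), **fetch(orders_port)}
print(combined)  # {'users': ['alice', 'bob'], 'orders': [101, 102]}

users_srv.shutdown()
orders_srv.shutdown()
```

The lightweight protocol between the pieces (plain HTTP and JSON here) is what lets each service be replaced or scaled without touching the others.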


Passive optical LAN: Its day is dawning

The concept of using passive optical LANs in enterprise campuses has been around for years but hasn’t taken off, because most businesses consider all-fiber networks to be overkill for their needs. I’ve followed this market for the better part of two decades, and now I believe we’re on the cusp of seeing POL go mainstream, starting in certain verticals. The primary driver of the change from copper to optical is that the demands on the network have evolved. Every company now considers its network to be business-critical, whereas just a few years ago it was considered best-effort in nature. Downtime or a congested network once meant inconvenienced users, but today it means the business is likely losing big money. ... The early adopters of POL are companies that are highly distributed with large campuses and need to get more network services in more places. This includes manufacturing organizations, universities, hospitality, cities and airports. Although I’ve highlighted a few verticals, the fact is that any business can take advantage of POL.


Decision Strategies for a Micro Frontends Architecture

To define a micro frontend, we must first identify it — is it a horizontal or vertical split? In a horizontal split, different teams work on different frontends within the same view, which means they have to coordinate on the composition of a page. In a vertical split, each team is responsible for a specific view over which it has full control. Mezzalira thinks that adhering to vertical slicing in general simplifies many of the decisions that follow. It minimizes coordination between teams, and he believes it’s closer to how a frontend developer is used to working. In an earlier blog post, he described how to identify micro frontends in more detail. When composing micro frontends, there are three options: client-side, edge-side and server-side composition. Client-side composition means implementing a technique like an application shell for loading single-page applications. Edge-side composition uses Edge Side Includes (ESI), or some similar technique, on or near the edge.



Quote for the day:


"Leaders must encourage their organizations to dance to forms of music yet to be heard." -- Warren G. Bennis


Daily Tech Digest - January 07, 2020

Wi-Fi 6 will slowly gather steam in 2020

Making sure devices are compliant with modern Wi-Fi standards will be crucial in the future, though it shouldn’t be a serious issue that requires a lot of device replacement outside of fields that are using some of the aforementioned specialized endpoints, like medicine. Healthcare, heavy industry and the utility sector all have much longer-than-average expected device lifespans, which means that some may still be on 802.11ac. That’s bad, both in terms of security and throughput, but according to Shrihari Pandit, CEO of Stealth Communications, a fiber ISP based in New York, 802.11ax access points could still prove an advantage in those settings thanks to the technology that underpins them. “Wi-Fi 6 devices have eight radios inside them,” he said. “MIMO and beamforming will still mean a performance upgrade, since they’ll handle multiple connections more smoothly.” A critical point is that some connected devices on even older 802.11 versions – n, g, and even b in some cases – won’t be able to connect to 802.11ax radios, so they won’t be able to benefit from the numerous technological upsides of the new standard. Making sure that a given network is completely cross-compatible will be a central issue for IT staff looking to upgrade access points that service legacy gear.


Cloud storage solutions gaining momentum through disruption by traditional vendors

This disruption, where traditional on-prem vendors have brought their offerings out into the public cloud, has led to the emergence of more innovative cloud storage solutions. “This has given customers flexibility in how they approach storage in a cloud environment, with more enterprise-style services being offered,” continued Beale. On-prem vendors want to have storage apps in the cloud. In part, this is a marketing and positioning exercise: by working in the cloud, they want to be seen as new and innovative vendors. But there’s a second, more technical and practical reason for this changing cloud storage landscape. “Around 80% of the organisations that we speak to have some sort of cloud presence. That means they’re using cloud-based technologies and, at some point, multiple cloud providers, along with an on-prem solution,” explained Beale. Having a ubiquitous big data plane across an organisation is appealing to customers, because they don’t have to spend a lot of time, money or resources on disparate platforms across multiple vendors.


Why flexible work and the right technology may just close the talent gap

Increasingly, what we see is that freelancers become full-time freelancers, meaning it’s their primary source of income. Usually, as a result of that, they tend to move. And when they move, it is out of big cities like San Francisco and New York. They tend to move to smaller cities where the cost of living is more affordable. And so that’s true for the freelance workforce, if you will, and that’s pulling the rest of the workforce with it. What we see increasingly is that companies are struggling to find talent in the top cities where the jobs have been created. Because they already use freelancers anyway, they are also allowing their full-time employees to relocate to other parts of the country, as well as hiring people away from their headquarters, people who essentially work from home as full-time employees, remotely. ... And along the way, companies realized two things. Number one, they needed different skills than they had internally. So the idea of the contingent worker or freelance worker who has that specific expertise becomes increasingly vital.


Life on the edge: A new world for data


For many CIOs, a strategy for edge computing will be entirely new. Sunil urges CIOs to assess what parts of edge computing can be achieved in-house and what should be done through a consulting firm. “A system integrator will play a big role in bringing it all together,” he says. Chris Lloyd-Jones, emerging technology, product and engineering lead at Avanade, says large enterprises are starting to build IoT platforms to centrally manage edge computing devices and provide connectivity across geographic regions. “Edge computing is no longer just about an on-board computer where data from the device is uploaded via a USB cable,” he says. “Edge computing now handles 4G and 5G connectivity with periodic connectivity, and support for full-scale machine learning and computationally intensive workloads. Data can be transmitted to and from the cloud. This provides centralised management.” Lloyd-Jones says the cloud can be used to train machine learning models, which can then be deployed to edge devices and managed like any other IT equipment.


Microsoft: RDP brute-force attacks last 2-3 days on average


Usually, these attacks use combinations of usernames and passwords that have been leaked online after breaches at various online services, or are simplistic in nature and easy to guess. Microsoft says that the RDP brute-force attacks it recently observed last 2-3 days on average, with about 90% of cases lasting one week or less and fewer than 5% lasting two weeks or more. The attacks lasted days rather than hours because attackers were trying to avoid getting their attack IPs banned by firewalls. Rather than trying hundreds or thousands of login combinations at a time, they attempted only a few combinations per hour, prolonging the attack across days at a much slower pace than previously observed RDP brute-force attacks. "Out of the hundreds of machines with RDP brute force attacks detected in our analysis, we found that about 0.08% were compromised," Microsoft said. "Furthermore, across all enterprises analyzed over several months, on average about 1 machine was detected with high probability of being compromised resulting from an RDP brute force attack every 3-4 days," the Microsoft research team added.
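The evasion math behind those multi-day attacks is easy to sketch. As a hypothetical illustration (the threshold and function below are assumptions for this sketch, not Microsoft's or any firewall's actual detection logic), a naive detector that bans IPs by hourly attempt rate never fires on a low-and-slow campaign:

```python
# Hypothetical sketch: why a slow, multi-day brute-force attack can slip
# under a naive per-hour rate limit. The threshold is an assumed value.
FIREWALL_HOURLY_BAN_THRESHOLD = 100  # assumed: ban an IP exceeding this rate


def evades_rate_limit(total_attempts: int, duration_hours: float) -> bool:
    """True if the attack's average hourly rate stays under the ban threshold."""
    attempts_per_hour = total_attempts / duration_hours
    return attempts_per_hour < FIREWALL_HOURLY_BAN_THRESHOLD


# A fast attack: 50,000 guesses in 2 hours (25,000/hour) trips the limit.
fast_attack_survives = evades_rate_limit(50_000, 2)
# A slow attack: 5 guesses per hour stretched across 3 days (72 hours).
slow_attack_survives = evades_rate_limit(5 * 72, 72)
```

Spread thinly enough, the same guessing campaign stays invisible to any detector that only looks at short-window rates, which is why the analysis above relies on multi-day signals instead.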


AI, privacy and APIs will mold digital health in 2020

Interoperability is a major factor in health tech innovation: patients will always receive care across multiple venues, and secure data exchange is key to providing continuity of care. Standardized APIs can provide the technological foundations for data sharing, extending the functionality of EHRs and other technologies that support connected care. Platforms like Validic Inform leverage APIs to share patient-generated data from personal health devices with providers, while giving them the ability to configure data streams to identify actionable data and automate triggers. In the upcoming year, look for major players like Apple and Google to make strides toward interoperability and breaking down data silos. Apple’s Health app is already capable of populating with information from other apps on your phone. Add your calorie intake to a weight-loss app? Time your miles with a running app? Monitor your bedtime habits with a sleep-tracking app? You’ll find that info aggregated in your Health app. Apple is uniquely positioned to be the driver of interoperability, and Google is not far behind.


Capitalizing on the promise of artificial intelligence


Remarkably, a majority of early adopters within each country believe that AI will substantially transform their business within the next three years. However, as pointed out in "Is the window for AI competitive advantage closing for early adopters?" (part of Deloitte's Thinking Fast series of quick insights), the early adopters also believe that the transformation of their industry is following close on the heels of their own AI-powered business transformation. Globally, there's a sense of urgency among adopters that now is the time to capitalize on AI, before the window for competitive advantage closes. Yet comparing AI adopters across countries reveals notable differences in AI maturity levels and urgency. While many nations regard AI as crucial to their future competitiveness, these comparisons indicate that some countries are adopting AI aggressively, while others are proceeding with considerable caution and may be at risk of being left behind. Consider Canada.


Building the ‘Intelligent Bank’ of the Future

The evolving nature of open banking is well known within the banking industry but not yet understood by consumers; it has proceeded in stages in Europe and elsewhere, but not yet in the U.S. From the consumer perspective, people simply want easier ways to manage their money and simplify their daily lives. Many financial institutions, on the other hand, are somewhat overwhelmed by the prospect of delivering on the open banking promise. A paradox exists between the desire to deliver more integrated solutions and the need to be transparent about the sharing of data between multiple organizations. Most of the concerns around open banking revolve around the collection and sharing of data with third parties and the inherent risks of such sharing. There is also the need to educate both the consumer and the employee on data security. The end result is less-than-clear regulations around open banking, and very few organizations actually being prepared to deliver on what consumers have been promised. That said, it is interesting that more than four in ten financial institutions (41%) are looking beyond just offering banking products in the future.


Backdoors and Breaches incident response card game makes tabletop exercises fun

Unlike some tabletop exercises that can take months to prepare and last for days, Backdoors and Breaches makes it simple to role-play thousands of possible security incidents, and to do so even as a weekly exercise. The game can be played just by blue teamers but could also involve a member of the legal team, management, or a member of the public relations team. The ideal game involves no more than six players to ensure that everyone is engaged and participating. "This game can be played every Thursday at lunch," Blanchard tells CSO. If the upside of the B&B card deck is the ability to instantly create thousands of scenarios from generic attack methods, the downside is that it lacks cards for specific industries, or company-specific issues. Black Hills plans for expansion decks in 2020, including one for industrial control system (ICS) security and another for web application security. The B&B deck launched at DerbyCon 2019, and Blanchard says they plan to give away free decks at every infosec conference they attend in 2020. The decks are also available on Amazon for $10 plus shipping, which, he says, just covers their costs.


An Introduction to Blazor and Web Assembly

Blazor is a new framework that lets you build interactive web UIs using C# instead of JavaScript. It is an excellent option for .NET developers looking to take their skills into web development without learning an entirely new language. Currently, there are two ways to work with Blazor: running on an ASP.NET Core server with a thin-client, or completely on the client’s web browser using WebAssembly instead of JavaScript. ... UI component libraries were created long before Blazor. Most existing frameworks that target web applications are based on JavaScript. They are still compatible with Blazor due to its ability to interoperate with JavaScript. Components that are primarily based on JavaScript are called wrapped JavaScript controls, as opposed to components written entirely in Blazor, which are referred to as native Blazor controls. Native Blazor controls parse the Razor syntax to generate a render tree that represents the UI and behavior of that control. The render tree is why it’s possible to run server-side Blazor. The tree is parsed and used to generate HTML on the server that’s sent to the client for rendering. In the case of Blazor WebAssembly, the render tree is parsed and rendered entirely in the client.



Quote for the day:


"No great manager or leader ever fell from heaven; it's learned, not inherited." -- Tom Northup


Daily Tech Digest - January 06, 2020

Deep learning vs. machine learning: Understand the differences

Dimensionality reduction is an unsupervised learning problem that asks the model to drop or combine variables that have little or no effect on the result. This is often used in combination with classification or regression. Dimensionality reduction algorithms include removing variables with many missing values, removing variables with low variance, Decision Tree, Random Forest, removing or combining variables with high correlation, Backward Feature Elimination, Forward Feature Selection, Factor Analysis, and PCA (Principal Component Analysis). Training and evaluation turn supervised learning algorithms into models by optimizing their parameter weights to find the set of values that best matches the ground truth of your data. The algorithms often rely on variants of steepest descent for their optimizers, for example stochastic gradient descent, which is essentially steepest descent performed multiple times from randomized starting points.
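The "steepest descent from randomized starting points" idea can be shown in a minimal, self-contained sketch (an illustration of the concept, not any particular library's optimizer): stochastic gradient descent fitting a one-parameter model, with several random starting points converging to the same ground truth.

```python
import random

# Noiseless ground-truth data for y = 3 * x.
data = [(x, 3.0 * x) for x in range(1, 11)]


def sgd(start_w: float, lr: float = 0.005, epochs: int = 200) -> float:
    """Fit w in y = w * x by stochastic gradient descent on squared error."""
    w = start_w
    for _ in range(epochs):
        x, y = random.choice(data)      # one randomly chosen sample per step
        grad = 2 * (w * x - y) * x      # d/dw of (w*x - y)^2
        w -= lr * grad                  # step in the steepest-descent direction
    return w


random.seed(0)
# Several randomized starting points all converge toward the true weight, w = 3.
estimates = [sgd(random.uniform(-10.0, 10.0)) for _ in range(3)]
```

The "stochastic" part is the `random.choice`: each update uses the gradient of a single sample rather than the whole dataset, trading per-step accuracy for cheap iterations.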



Why enterprises should care about DevOps


The old days of manually doing everything as an IT person are gone, and companies that are still operating that way are undergoing transformation. But I don’t think we’re ever going to get rid of operational concerns. It’s just going to be that rather than doing things manually, or through graphical consoles, you're going to work via APIs, scripting languages and automation tools like Puppet. And in many ways – and I say this quite a lot – DevOps has made operations people feel that they must become developers to get their job done. But it’s more about embracing software engineering principles. It’s about version control, release management, branching strategies, and continuous integration and delivery. We’ve seen this repeatedly, and that’s why we added features to Puppet Enterprise around continuous delivery, because the most successful customers were those that were adopting infrastructure as code.


Legal engineering: A growing trend in software development

Legal engineers come from incredibly diverse backgrounds and collectively have years of experience and insights that benefit our customers tremendously. They include former attorneys from top law schools and some of the country's best law firms, experts in contract law, and a former civil rights trial attorney. We have other legal engineers who came to us from top-tier management consulting firms and several who gained considerable experience at some of Silicon Valley's best SaaS companies. These diverse backgrounds and responsibilities mean that the role of legal engineering can seem very different depending on who you ask. To our customers, they are thought partners, advising on best practices for building a modern legal team. To our product team, they are the voice of the user, listening and synthesizing valuable feedback. Sometimes, we even refer to them internally as our in-house S.W.A.T. team, because they are ready and able to jump in and help fix any situation. Ultimately, legal engineers are at the forefront of the modernization of in-house legal. As legal technology continues to evolve, so will legal engineering.


Fragmentation by Country
In this post, we look at how fragmentation varies across the globe and the key statistics you should keep in mind if you have a presence in these markets. The growth mantra of online businesses is scale — reach more users, fast. However, as you scale across countries, it’s important to ensure that your app/website is compatible with your users’ devices and browsers. Compatibility is to online businesses what distribution is to brick-and-mortar ones. You might have the best product in the world, but it counts for nothing if your customers don’t have the experience you designed for them. For instance, being compatible with the top 20 devices will help you cover 70% of the US audience. In India, not only will the devices be different, but the coverage provided will be less than 35%. Similarly, if your mobile website doesn’t load properly in the Opera browser, you will have ignored almost half of the Nigerian market!
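The arithmetic behind those coverage figures is just a cumulative sum of device market shares. The share numbers below are invented for illustration (not real US or India data); the point is how sharply the same top-N cutoff diverges between a concentrated and a fragmented market:

```python
def coverage(device_shares, top_n):
    """Percent of the audience covered by supporting the top_n devices."""
    ranked = sorted(device_shares, reverse=True)  # most popular first
    return sum(ranked[:top_n])


# Concentrated market (hypothetical): a handful of devices dominate.
concentrated = [40.0, 20.0, 10.0, 5.0, 5.0, 4.0, 3.0, 3.0]
# Fragmented market (hypothetical): share is spread thinly across many devices.
fragmented = [6.0, 5.0, 5.0, 4.0, 4.0, 4.0, 3.0, 3.0, 3.0, 3.0]

top3_concentrated = coverage(concentrated, 3)  # 70.0
top3_fragmented = coverage(fragmented, 3)      # 16.0
```

In the fragmented case you must test on far more devices to reach the same audience coverage, which is exactly the compatibility burden the post describes.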


Industry 4.0 / Industrial IoT / Smart Factory
“This consolidation will strengthen the ability of the IIC to provide guidance and advance best practices on the uses of distributed-ledger technology across industries, and boost the commercialization of these products and services,” said 451 Research senior blockchain and DLT analyst Csilla Zsigri in a statement. Gartner vice president and analyst Al Velosa said that it’s possible the move to team up with TIoTA was driven in part by a new urgency to reach potential customers. Where other players in the IoT marketplace, like the major cloud vendors, have raked in billions of dollars in revenue, the IIoT vendors themselves haven’t been as quick to hit their sales targets. “This approach is them trying to explore new vectors for revenue that they haven’t before,” Velosa said in an interview. The IIC, whose founding members include Cisco, IBM, Intel, AT&T and GE, features 19 different working groups, covering everything from IIoT technology itself to security to marketing to strategy.



Up to half of developers work remotely; here's who's hiring them

It is estimated that there are between 18 and 21 million developers across the globe. Of these, only about one million -- or five percent -- are in the United States, so you can see how an employer in the US, or anywhere else for that matter, needs to spread its recruiting and staffing wings. It's in the best interest of tech-oriented employers, then, to be open to this global pool of talent. There are a number of companies leading the way, actively hiring globally distributed tech workforces. Glassdoor recently published a list of leading companies that encourage remote work, which includes some prominent tech companies, and Remotive has been compiling a comprehensive list of more than 2,500 companies of all sizes that hire remote IT workers. Survey data from Stack Overflow, analyzed by Itoro Ikon, finds that of almost 89,000 developers participating in its most recent survey, 45% work remotely at least part of the time, and 10% indicated they are full-time remote workers.


The Fundamental Truth Behind Successful Development Practices: Software is Synthetic


Look across the open plan landscape of any modern software delivery organization and you will find signs of it, this way of thinking that contrasts sharply with the analytic roots of technology. Near the groves of standing desks, across from a pool of information radiators, you might see our treasured artifacts - a J-curve, a layered pyramid, a bisected board - set alongside inscriptions of productive principles. These are reminders of agile training past, testaments to the teams that still pay homage to the provided materials, having decided them worthy and made them their own. What makes these new ways of working so successful in software delivery? The answer lies in this fundamental yet uncelebrated truth - that software is synthetic. Software systems are creative compounds, emergent and generative; the product of elaborate interactions between people and technology.


5G is poised to transform manufacturing

5G mobile wireless network technology
Today, many manufacturers use fiber, Wi-Fi and 4G LTE rather than 5G, because 5G infrastructure, standards, and devices are not yet available and proven. “But many people are starting to look at 5G today, looking at it as a more future-proof strategy than adopting 4G,” said Dan Hays, principal and head of the US corporate strategy practice at PricewaterhouseCoopers LLP. “4G LTE has been around for a little over a decade.” The 5G devices available today are very early ones. “They are not yet at the mass-production level and have not come down the cost curve to drive large-scale adoption,” he said. According to Erik Josefsson, vice president and head of advanced industries at Ericsson, which makes underlying 5G technology, 5G is currently at Release 15, which offers high data rates, extended coverage, and low latency compared to 4G – but doesn’t get down to the goal of 1-millisecond latency. "You can get 10 milliseconds," he said. "But you're not down to 1 millisecond yet. Release 16 is ultra-reliable low-latency, down below 10 milliseconds, for more complex use cases."


These five tech trends will dominate 2020


The constant drip-drip of data leaks and privacy catastrophes shows that security is still, at best, a work in progress for many organisations. And security is still a minor consideration for many business leaders too. Perhaps that's because there have been so many leaks that they think the risk to their reputation is low. It's a dangerous assumption to make. More apps and more devices mean security teams are already spread too thinly. Add in new risks like Internet of Things projects, 5G devices and deepfakes, and the challenges mount unless companies take the broadest possible view of security. Organised crime and ransomware will still be the most consistent threats to most businesses; state-sponsored attacks and cyber-espionage will remain an exotic but potentially high-profile threat to a minority. For all this, the biggest risks will still be the basic ones: staff falling for phishing emails, using their pets' names as passwords, and poorly configured cloud apps. There will always be new threats, so prepare for the strangest while not forgetting the basics.


Three Surprising Ways Archiving Data Can Save Serious Money


Until recently, backup solutions for enterprises typically fell into two strategies: tape or disk-to-disk (D2D) replication. Both of these solutions come with significant price tags for backing up a single terabyte of primary data. The common misconception is that tape backup is cheap. While an actual tape might be cheap, backing up primary data with tape also requires tape libraries, servers, software, data center space, power, cooling, and management overhead. These costs add up very quickly. Our research shows that backing up a single terabyte of primary data with tape could cost $138-$1,731 per year, depending on how frequently you are completing a full backup. The other common backup solution – replication – requires backup workflows that replicate data from the primary NAS system to a secondary storage platform from the same vendor. In most cases, this means that the secondary storage system is architecturally similar to the primary NAS device, requiring hardware, software, data center space, power, cooling, and management.
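The point that "cheap" tape media hides expensive surroundings is just addition over the line items. Every line item and dollar amount below is invented for illustration; only the $138-$1,731 per-terabyte-per-year range comes from the research quoted above:

```python
def annual_cost_per_tb(line_items: dict) -> int:
    """Total yearly cost per terabyte once everything around the tape is counted."""
    return sum(line_items.values())


# Hypothetical breakdown for infrequent full backups (low end of the range).
infrequent_fulls = {
    "tape media": 10,
    "tape library amortization": 40,
    "backup server + software": 50,
    "datacenter space, power, cooling": 20,
    "management overhead": 18,
}

# Hypothetical breakdown for weekly full backups (high end of the range).
weekly_fulls = {
    "tape media": 150,
    "tape library amortization": 400,
    "backup server + software": 600,
    "datacenter space, power, cooling": 300,
    "management overhead": 281,
}

low_end = annual_cost_per_tb(infrequent_fulls)   # 138
high_end = annual_cost_per_tb(weekly_fulls)      # 1731
```

Note that the tape media itself is the smallest line in both hypothetical breakdowns; the infrastructure and labor around it dominate the total.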



Quote for the day:


"There are many elements to a campaign. Leadership is number one. Everything else is number two." -- Bertolt Brecht