Daily Tech Digest - October 14, 2020

Financial crime group FIN11 pivots to ransomware and stolen data extortion

Despite casting a wide net with its phishing campaigns, FIN11 chooses to perform deeper compromises on only a small subset of its victims, which are likely selected based on their size, industry and likelihood of paying. Like several other sophisticated ransomware gangs, FIN11 uses manual hacking to move laterally through networks and deploy its ransomware, so the group might not have enough manpower to do this on a large scale. If a victim looks interesting, after the initial intrusion the FIN11 attackers deploy multiple backdoors with the goal of moving laterally and obtaining domain administrator privileges. Even though exclusive tools such as FlawedAmmyy and MIXLABEL are used to gain the initial foothold, the lateral movement activity involves many publicly available tools, which is similar to how an increasing number of hacker groups operate. Once domain admin credentials have been obtained, the attackers use various tools to disable Windows Defender and deploy the CLOP ransomware to hundreds of computers using Group Policy Objects. FIN11's ransom notes include only an email address for victims to contact and do not specify a ransom amount, suggesting the ransom is later customized based on who the victim organization is.


How to ignite a mainframe transformation with three key mindset changes

There’s often a misconception that IT departments have to plan their entire mainframe transformation at the same time, which usually leads to delays and pushback from teams who believe the effort is simply too ambitious, or fear it will take too long to achieve. It’s important to remember that mainframe teams usually have a backlog of essential, customer-impacting work to complete, so it’s difficult to take resources away from those tasks to support an internal transition project. It’s far more effective to break the transformation down into smaller steps, using Agile thinking to enable incremental change, and establish continuous feedback and improvements. Instead of trying to build a complete environment for Agile delivery on the mainframe, it’s better to break the process down into steps, using shorter sprints to manage the transition and mitigate any risk and resource constraints. Start by modernising a single aspect of mainframe delivery, such as improving the developer experience with an integrated development environment (IDE), then add automated testing processes, or application analysis and visualisation in stages, to avoid overwhelming teams with a major transition project all at once. It also helps get more people on board, by allowing them to see the benefits of each step before they take the next one.


Agile resilience in the UK: Lessons from COVID-19 for the ‘next normal’

Alongside establishing a guiding purpose, the most effective organizations focused on more frequent communications, taking an adult-to-adult tone that explained decisions and shared a realistic assessment. During the COVID-19 pandemic at UK Power Networks, for example, the CEO shared daily video messages showing the rationale behind corporate decisions. Feedback from employees demonstrated the positive effect of this clear communication and transparency. For organizations that have found a new focus during the COVID-19 crisis, the next key step should be to consider if they can enhance and develop their common purpose to hold true in more normal times, giving employees the same clarity of decision making and ability to act as during the COVID-19 crisis. Agile organizations often speak of a shared purpose and vision—the “North Star”—which helps people feel personally and emotionally invested in the organization. This North Star allows employees to individually and proactively watch for changes in customer preferences and the external environment, and then, act upon them. ... The second shared practice we found was that organizations created new forums and structures, or repurposed existing ones, to act as rapid-decision-making bodies.


Build Next-Generation Cloud Native Applications with the SMOKE Stack

Enterprise technology needs to help organizations take action in real time. Doing this effectively means modernizing application architecture from batch processing to event-driven. Serverless computing is an event-driven architecture that abstracts infrastructure, so developers can focus on writing the application code. With serverless, application teams don’t need to worry about the complexity of maintaining, patching, supporting and paying for infrastructure that they need on an elastic basis. This makes serverless perfect as the glue to integrate services from anywhere. At TriggerMesh, we think serverless is only the beginning. The real power comes from what serverless enables. Serverless architectures allow even the largest enterprises with years or decades of legacy code to break out of the constraints of their own data centers and a single cloud. Open source, standards and specifications free enterprise developers to mash up services from on-premises and any cloud, to rapidly compose event-driven applications that support high velocity — so that you can bring new features and products to market fast.


Ransomware: It’s time to bring cybersecurity audits up to GDPR status

According to Check Point, the number of daily ransomware attacks worldwide has increased by half over the past three months -- close to doubling in the United States alone -- as threat actors take advantage of the operational disruption and rapid shift to home working caused by COVID-19. Ezat Dayeh, Senior Engineer Manager UK&I at Cohesity, told ZDNet in an interview that the company has seen a recent and "dramatic" increase in the volumes of ransomware incidents. As more people are working from home due to COVID-19, this may have introduced new risk factors -- but the increasing sophistication of such attacks is of concern, too. "When we think about two or three years ago, when people were hit with ransomware, nine out of ten times they would basically say, 'it's definitely impacted production, we've got issues, but we can go back to our backups, and worst-case scenario, we will just do a restore,'" Dayeh said. "But now, with that sophistication, the bad guys know this. Ransomware can come into a network [and] it won't do anything but it will start looking around and see what it can access on the network."


Facebook’s New Open Source Framework For Training Graph-Based ML Models

The use of the WFST data structure is prevalent among speech recognition, natural language processing, and handwriting recognition applications. WFSTs, especially in speech recognition systems, provide a common and natural representation for hidden Markov models (HMMs), context-dependency, grammars, pronunciation dictionaries, and weighted determinization algorithms to optimise time and space requirements. One of the most popular WFST-based products is the Kaldi toolkit for speech recognition, which is used to decode speech. Kaldi heavily relies on OpenFst, an open-source WFST toolkit. To understand the importance of the GTN framework for a WFST graph, consider a general speech recogniser. A speech recogniser consists of an acoustic model that predicts the letters in the speech and a language model that identifies the words that may follow. These models are represented as WFSTs and are trained separately before being combined to output the most likely transcription. It is at this juncture that the GTN library steps in to train the different models together, which in turn provides better results. Before GTN, the use of the individual graphs at training time was implicit, and the graph structure needed to be hard-coded in the software.
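To make the WFST idea concrete, here is a minimal, self-contained Python sketch (this is an illustration of the concept only, not the GTN or OpenFst API): a tiny weighted acceptor whose arcs carry labels and weights, scored to find the best-weighted path, the way a decoder picks the most likely transcription.

```python
# Minimal weighted finite-state acceptor (WFSA) sketch.
# Arcs: (src_state, dst_state, label, weight); higher weight = more likely.

def best_path(arcs, start, final, num_states):
    """Viterbi-style search: return (best total weight, label sequence).

    Works by repeated relaxation, which converges because the toy
    graph below is acyclic.
    """
    best = {start: (0.0, [])}  # state -> (score, labels) of best path so far
    for _ in range(num_states):
        for src, dst, label, w in arcs:
            if src in best:
                score, labels = best[src]
                cand = (score + w, labels + [label])
                if dst not in best or cand[0] > best[dst][0]:
                    best[dst] = cand
    return best[final]

# Toy "recognizer": two competing transcriptions of the same audio,
# with invented acoustic-model weights.
arcs = [
    (0, 1, "c", 2.0),
    (1, 2, "a", 1.5),  # "cat" hypothesis
    (1, 2, "u", 0.5),  # "cut" hypothesis
    (2, 3, "t", 1.0),
]
score, labels = best_path(arcs, start=0, final=3, num_states=4)
print("".join(labels), score)  # → cat 4.5
```

Frameworks like GTN make graphs of this kind differentiable, so the weights can be learned jointly during training rather than fixed by hand as here.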


What will quantum computing mean for business?

There are four main areas that are already a focus of attention. Cybersecurity is the obvious first one, because if quantum computers render existing encryption worthless, they can also be used to produce more secure algorithms, random number generators and keys that can’t be defeated by their own processing prowess. The other areas revolve around the capacity quantum computing has for comparing lots of different possibilities and finding the optimum, or best fit, among them. For example, in financial services this could provide portfolio optimisation, high-frequency trading advantages, and more efficient fraud detection. Goldman Sachs, RBS and Citigroup are already recruiting to take advantage of these possibilities. Logistics is another obvious beneficiary. Traffic management, delivery route optimisation, and other traffic-related problems are finding potential quantum solutions, with Daimler and Honda already aiming to acquire quantum computers for these kinds of activities. Similarly, manufacturing, pharmaceuticals, and materials science can optimise their processes, such as the manufacturing supply chain. Existing quantum computers with just 50 qubits are delivering good results for applications such as protein folding and new drug formula discovery.
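Why optimisation problems like route planning attract quantum interest becomes clear if you sketch the classical approach: the brute-force search below (with an invented distance matrix) must check every ordering of stops, a number that grows factorially with the number of stops.

```python
# Classical brute-force route optimisation: try every ordering of stops.
# The candidate count grows factorially, which is why logistics firms
# are eyeing quantum (and quantum-inspired) optimisers for large instances.
from itertools import permutations

# Hypothetical symmetric distance matrix: depot (0) plus three stops.
dist = [
    [0, 4, 9, 5],
    [4, 0, 3, 7],
    [9, 3, 0, 2],
    [5, 7, 2, 0],
]

def shortest_round_trip(dist):
    """Return (best_distance, best_route), starting and ending at depot 0."""
    n = len(dist)
    best = (float("inf"), None)
    for order in permutations(range(1, n)):  # (n-1)! candidate routes
        route = (0, *order, 0)
        total = sum(dist[a][b] for a, b in zip(route, route[1:]))
        best = min(best, (total, route))
    return best

print(shortest_round_trip(dist))  # → (14, (0, 1, 2, 3, 0))
```

With 3 stops there are only 6 candidate routes; with 20 stops there are over 10^17, which is the scale at which exhaustive classical search breaks down.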


Windows “Ping of Death” bug revealed – patch now!

Interestingly, the bug you see triggering in the video above that provokes the BSoD is caused by a buffer overflow. TCPIP.SYS doesn’t correctly check the size of one of the data fields that can optionally appear in IPv6 ICMP packets, so you can shove too much data at it and corrupt the system stack. Bang! Down it goes. Two decades ago, almost any stack-based buffer overflow on Windows could be used not only to crash a system, but also, with a bit of care and planning, to take over the processor’s flow of execution and divert it into a program fragment – known as shellcode – of your own choosing. In other words, Windows stack overflows in networking software almost always used to lead to so-called remote code execution exploits, where attackers could trigger the bug from afar with specially crafted network traffic, run code of their own choosing, and thereby inject malware without you even being aware. But numerous security improvements in Windows, from Windows XP SP3 onwards, have made stack overflows harder and harder to exploit, and these days they can often only be used to force crashes, not to take over completely. Nevertheless, a malcontent on your network who could crash any computers at will, servers and laptops alike, could cause plenty of harm just through what’s known as a denial of service attack, especially because recovering from each crash requires a complete reboot.
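The underlying bug class is a parser that trusts a length field inside the packet instead of validating it against the data actually received. The sketch below shows the missing check in a safe language; the (type, length, value) layout is a generic TLV illustration, not the exact ICMPv6 wire format.

```python
# Bug class illustration: options carry their own length byte, and a
# parser must verify that claimed length against the real buffer size
# before copying. Skipping that check is what lets a crafted packet
# overrun a fixed-size buffer in a C-based network stack.
import struct

def parse_tlv_options(buf):
    """Parse (type, length, value) options, validating every length."""
    options, offset = [], 0
    while offset < len(buf):
        if len(buf) - offset < 2:
            raise ValueError("truncated option header")
        opt_type, opt_len = struct.unpack_from("!BB", buf, offset)
        # The critical check: the claimed length must fit in the buffer.
        if opt_len < 2 or offset + opt_len > len(buf):
            raise ValueError("option length exceeds packet data")
        options.append((opt_type, buf[offset + 2 : offset + opt_len]))
        offset += opt_len
    return options

# Well-formed packet: one option of type 1 carrying b"AB".
print(parse_tlv_options(b"\x01\x04AB"))  # → [(1, b'AB')]
# Malicious packet: claims 255 bytes but carries only 2.
try:
    parse_tlv_options(b"\x01\xffAB")
except ValueError as e:
    print("rejected:", e)
```

In Python an over-long slice merely truncates, but in kernel C the equivalent trusting copy writes past the end of a stack buffer, which is exactly the crash described above.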


The CISO’s newest responsibility: Building trust

As part of this evolution, CISOs have had to build confidence among all stakeholders—customers, partners, employees, board members and other executives—that they and their security teams have the organization’s best interests in mind when it comes to cybersecurity decisions. ... “Things are all upside down now. No one is working the same, and there’s a lot of discomfort out there. So as a security person you have to build that trust. It’s part of your job, and it’s what you get paid to do,” says Gene Fredriksen, a veteran security executive now serving as executive director of the National Credit Union Information Sharing & Analysis Organization (NCU-ISAO) and cybersecurity principal for Pure IT Credit Union Services. ... The CISO’s capacity to cultivate trust is more than an esoteric discussion or business-school exercise: Experts say it’s an essential element for any CISO who wants to be successful in the role because it enables him or her to enact the policies, procedures and technologies needed to secure the organization and, thus, prove to others—including customers—that their interactions with the company are safe.


Data Analytics Without a Plan is Like Panning for Gold

Of the many lessons COVID-19 has to teach, data analysis is one of the least appreciated. A lack of quality data has led to unanswerable questions about the availability of ventilators, hospital beds, and personal protective equipment. Poor data collection has hindered contact tracing efforts. In a pandemic, collecting the right data and applying it in the right way can save lives. A hospital in Boston was lauded for using a forecasting model to anticipate how many bags of blood it would need. Singapore, one of the countries with the slowest spread of COVID-19, uses blockchain and analytics to reduce exposures through contact tracing. Many of the economy’s heavy hitters, like Amazon and Facebook, were designed from the outset to apply data. If a shopper looks repeatedly at an item on Amazon, the site will show similar items, adjust the price, or offer promotions to prod a purchase. Facebook’s Cambridge Analytica scandal demonstrates what can happen when data is applied indiscriminately. People felt violated by the depth of information the company was able to glean from their internet use. 



Quote for the day:

"Leaders lead when they take positions, when they connect with their tribes, and when they help the tribe connect to itself." -- Seth Godin

Daily Tech Digest - October 13, 2020

MLOps: More Than Automation

For MLOps to learn from DevOps, we must center the needs of data scientists and the people that are impacted by their models first. It isn’t enough to say that practicing MLOps means advocating for automation and monitoring at all steps to do things faster. Without this focus, we will see an increase in the deployment of models that have uninspected and unintended consequences that often disproportionately impact marginalized communities. So, as a data scientist, what is it that I need? Keeping up with the latest and greatest event streaming services, distributed systems or methods of continuous integration and deployment isn’t where my mind lights up. I would like to spend most of my time understanding the domain space of the model I’m about to build, the nuanced impact of that model and whether it’s going to meet the needs of my customers and the people they serve. There are a few ways to notice if you’re applying MLOps basically as a Band-Aid, a way to just go faster, that will ultimately break down. When looking for a solution to automate, consider if you’re only reducing the work required for manual processes or if you’re also enabling data scientists to focus on the hard problems they’re trained to tackle.


6 Signs DevSecOps Maturity Has a Long Way to Go

Nevertheless, AppSec teams still struggle on many fronts to bake security into the process of delivering software, and the vast majority of organizations are early on in their DevSecOps journey. According to another recent study conducted by WhiteSource, only 20% of organizations believe they’ve reached full DevSecOps maturity. And 73% of respondents say they feel forced to compromise on security to meet short development lifecycles. Which is fine in a lot of situations, because what is risk management but a constant exercise in compromise? It’s all about weighing the risks against the benefits of a certain activity, and coming up with a balance in action and controls that minimize the risk while maximizing the benefits. The problem for DevSecOps today is that the indicators show there’s still little rigor or due diligence to come up with a disciplined method for determining that balance, let alone executing on it. ... The disconnect on what DevOps pros prioritize over time—security work versus innovation and feature delivery—ultimately comes down to how they’re measured and incentivized by their bosses. Many executive teams may pay lip service to the need for better cooperation between security and development teams (44%, according to security pros interviewed in the Ponemon study).


Half of all virtual appliances have outdated software and serious vulnerabilities

"Poor processes account for the product age problem in many cases," Orca said in its report. "Out-of-date products remain available after they’ve reached their end-of-life. The overall product is no longer supported, the operating systems may be unsupported, and/or updates and patches are no longer being applied. As a result of Orca Security’s research, 39 products have been removed from distribution." Commercial appliances scored about the same on average as free and open-source ones, with the latter having a slight advantage. However, hardened virtual appliances whose operating systems and software stacks had been stripped down to minimize attack surface, scored much higher than all other appliances -- 94.2 on average. Over half of tested appliances came from system integrators. These images have all the necessary components to run certain Web applications -- for example an image with WordPress, but also the Apache Web server and MySQL database and the OpenSSL security library. Their average score was 77.6, which is close to the overall average score for all appliances, but lower than those from security vendors.


CPRA: More opportunity than threat for employers

The CPRA is actually a lot more lenient than the GDPR in regard to how it polices the relationship between employers and employees’ data. Unlike for its EU equivalent, there are already lots of exceptions written into the proposed Californian law acknowledging that worker-employer relations are not like consumer-vendor relations. Moreover, the CPRA extends the CCPA exemption for employers, set to end on January 1, 2021. This means that if the CPRA passes into law, employers would be released from both their existing and potential new employee data protection obligations for two more years, until January 1, 2023. This exemption would apply to most provisions under the CPRA, including the personal information collected from individuals acting as job applicants, staff members, employees, contractors, officers, directors, and owners. However, employers would still need to provide notice of data collection and maintain safeguards for personal information. It’s highly likely that during this two-year window, additional reforms would be passed that might further ease employer-employee data privacy requirements. While the CPRA won’t change much overnight, impacted organizations shouldn’t wait to take action, but should take this time to consider what employee data they collect, why they do so, and how they store this information.


Digital transformation: 3 hard truths

Digital transformation projects that are born as “IT initiatives” run the risk of being viewed as changes for the sake of new technology. Digital transformations must be viewed as business transformations, with business leaders not only buying into the proposed plans and value but driving the organizational and process changes that are needed to be successful. The widespread adoption of technologies means an organization doesn’t gain a competitive edge simply by using them; the edge comes from how it uses them. Success lies in creating balanced IT-business partnerships that provide experts from both technical and business domains so new technologies can be integrated deep into the business. Intel’s AI projects are a perfect example of this in practice. Together, IT and the business were able to achieve over $500 million in business value in 2019. Digital transformation isn’t a “from->to” process that reaches a static, determined “end state.” Today’s competitive pressures and the pace of technological change are simply too great to allow for a transformation to ever be “finished.” We need to view digital transformation as always evolving, always underway – with leaders and businesses embracing a dynamic state of constant disruption.


Ransomware operators now outsource network access exploits to speed up attacks

"Since the start of 2020 and the emergence of the now-popular "ransomware with data theft and extortion" tactics, ransomware gangs have successfully utilized dark web platforms to outsource complicated aspects of a network compromise," the researchers say. "A successful ransomware attack hinges on the development and maintenance of stable network access which comes with a higher risk of detection and requires time and effort. Access sellers fill this niche market for ransomware groups." As of September this year, Accenture has tracked over 25 persistent network access sellers -- alongside the occasional one-off -- and more are entering the market on a "weekly basis." Many of the sellers are active on the same underground forums haunted by ransomware groups including Maze, NetWalker, Sodinokibi, Lockbit, and Avaddon. Sellers have now begun touting their offerings on single forum threads, rather than separate posts, and RDP remains a popular option for network access. In an interesting twist, rather than selling off a zero-day vulnerability to a single buyer, some traders are using these unpatched bugs to exploit numerous corporate networks and sell access to threat actors in separate bundles to generate additional revenue.


What 5G brings to IoT today and tomorrow

IoT devices today are mostly connected via cabled technologies, Engarto says. These include both shielded twisted-pair LAN and coaxial cables. “In some limited areas Wi-Fi may have some usage,” but is not always ideal, she says. “5G enables many more sensors to be put in place without a need for cable and conduit for each cable,” Engarto says. But the newer wireless technology “will be one of many networking solutions designed to address IoT’s full needs,” says Patrick Filkins, senior research analyst, IoT and mobile network infrastructure, at research firm International Data Corp. (IDC). “For example, 5G can address endpoints that require any breadth of latency, reliability, and security,” Filkins says. “While 5G will be a Swiss-army knife solution to IoT, all from a single platform, some enterprises may not need the full breadth of 5G’s capabilities. In many cases, such as LPWAN [low-power WAN], you can achieve connectivity through alternatives such as LoRaWAN.” Wi-Fi 6 and Wi-Fi HaLow will also play a role in dense, shorter-range IoT use cases, Filkins says, although with a potential loss in reliability. “5G is an uplift from LTE when it comes to promising zero downtime communications, by baking in new technologies enabling near-zero packet loss,” Filkins says.


Why India’s Proposed Data Protection Authority Needs Constitutional Entrenchment

The DPA has been entrusted with the role of a fourth branch institution, primarily due to its overarching role in protecting the fundamental right to privacy of citizens against not only possible transgressions of such privacy by the private sector but also possibly by the government itself. As opposed to a sectoral regulator, it is a sector-agnostic body and has wide powers cutting across sectors and economic spheres. It is empowered to penalise both Central and state governments when they fail to protect an individual’s personal data. In fact, it is also empowered to monitor sensitive data processed by other fourth branch watchdogs such as the CAG and the EC, and, even more significantly, the Legislature and the Judiciary itself. As such, the DPA carries out crucial fourth branch oversight and accountability functions against almost all institutions of governance in our system. Why does the DPA, in its current form, lack the independence needed to be a strong fourth branch institution and ward off attempts of political interference? This is primarily attributable to the fact that its structure and composition were inspired by sectoral regulators such as SEBI, IRDA and TRAI, based on the recommendation of the Financial Sector Legislative Reforms Commission as mentioned in the Justice B.N. Srikrishna committee report.


Automation and AI: Challenges and Opportunities

Today, it is widely acknowledged that automation and AI technologies will gradually transform the global workplace, with intelligent machines performing human tasks in some cases and aiding humans in others. The presence of robotic machines in the workplace will ultimately increase efficiency and reduce costs. As a result, many human occupations will disappear, while others will adapt to technology-enabled roles. ... Although businesses have shown a recent trend of hiring AI developers at a breakneck speed to fulfill their in-house automation needs, few understand the fundamental challenges that this technology brings with it. As a result, the “AI comfort zone” is still missing in enterprise business circles, and business operators are still doubtful about the cost benefits associated with AI. Everywhere you look today, you come across automated machines or systems driven by powerful computers, multi-channel data, and very smart algorithms. Modern society is grappling with chatbots, PDAs, self-driving vehicles on roads, and automated check-outs in grocery stores. ... Although Data Governance is still a concern among most business operators, it is widely accepted that augmented intelligence has the capability of emulating the human decision-making process.


Microsoft India Announces Public Preview of Power Automate Desktop Solution

Power Automate Desktop is a part of the Microsoft Power Automate service and is claimed to enable coders and non-coders alike to automate processes and tasks across desktop and web applications with minimal effort from a single intelligent platform. According to sources, the design environment allows non-coders to automate processes quickly without writing a single line of code. It also provides complete control and flexibility for advanced users, programmers and developers in a scalable and secure environment. It further democratises the RPA capabilities within Power Automate by providing a desktop automation option for citizen developers and business users. Irina Ghose, Executive Director of Cloud Solutions, Microsoft India, stated, “Organisations and IT departments are seeking ways to quickly adapt to the unprecedented pace of change across every industry around the world. With Microsoft Power Automate Desktop, we aim to empower organisations to automate tasks across the desktop and web, using an integrated platform to complete tasks at speed and scale.”



Quote for the day:

"You get in life what you have the courage to ask for." -- Nancy D. Solomon

Daily Tech Digest - October 12, 2020

How building bricks could store electricity

A conducting polymer called PEDOT, which is used in traditional battery-substitute supercapacitors, works well with the porous structure of bricks: "In this work, we have developed a coating of the conducting polymer PEDOT, which is comprised of nanofibers that penetrate the inner porous network of a brick; a polymer coating remains trapped in a brick and serves as an ion sponge that stores and conducts electricity," D'Arcy said in the university publication. The red pigment in bricks — bricks are made from clay that contains iron oxide, or rust — is essential for triggering the polymerization reaction, the researchers explain. D'Arcy writes in The Conversation: "We fill the pores in bricks with an acid vapor that dissolves the iron oxide and converts it to a reactive form of iron that makes our chemical syntheses possible. We then flow a different gas through the cavities to fill them with a sulfur-based material that reacts with iron. This chemical reaction leaves the pores coated with an electrically conductive plastic, PEDOT." The bricks could be connected to solar panels in lieu of batteries, D'Arcy suggests. Powering IoT sensors could be a possible use-case.


What Is GPT-3 And Why Is It Revolutionizing Artificial Intelligence?

In terms of where it fits within the general categories of AI applications, GPT-3 is a language prediction model. This means that it is an algorithmic structure designed to take one piece of language (an input) and transform it into what it predicts is the most useful following piece of language for the user. It can do this thanks to the training analysis it has carried out on the vast body of text used to “pre-train” it. Unlike other algorithms that, in their raw state, have not been trained, OpenAI has already expended the huge amount of compute resources necessary for GPT-3 to understand how languages work and are structured. The compute time necessary to achieve this is said to have cost OpenAI $4.6 million. To learn how to build language constructs, such as sentences, it employs semantic analytics - studying not just the words and their meanings, but also gathering an understanding of how the usage of words differs depending on other words also used in the text. It's also a form of machine learning termed unsupervised learning because the training data does not include any information on what is a "right" or "wrong" response, as is the case with supervised learning.
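The "language prediction" idea can be shown in miniature. The sketch below is a bigram counter, orders of magnitude simpler than GPT-3's transformer, but it shares the two properties the paragraph describes: it learns purely from raw text (unsupervised, no right/wrong labels), and at use time it takes a piece of language and predicts the most likely continuation. The tiny corpus is invented for illustration.

```python
# A language prediction model in miniature: count which word follows
# which in the training text, then predict the most frequent continuation.
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Unsupervised 'pre-training': tally next-word counts from raw text."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent continuation seen after `word`."""
    return model[word.lower()].most_common(1)[0][0]

corpus = [
    "the cat sat on the mat",
    "the cat chased the mouse",
    "the dog sat on the rug",
]
model = train_bigram(corpus)
print(predict_next(model, "the"))  # → cat
print(predict_next(model, "cat"))  # → sat
```

GPT-3 replaces these raw counts with 175 billion learned parameters and conditions on long stretches of context rather than a single word, which is where the semantic behaviour described above comes from, but the prediction objective is the same in spirit.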


Biology and healthcare's AI moment, ethics, predictions, and graph neural networks

Hogarth mentioned that the speculation phase in AI for biology and healthcare is starting, with lots of capital flowing. There are going to be some really amazing companies that come out of it, and we will start to see a real deployment phase kick in. But it's equally certain, he went on to add, there are going to be instances that will be revealed to be total frauds. So, what about AI ethics? Benaich and Hogarth cite work by pioneers in the field, touching upon issues such as commercial gender classification, unregulated police facial recognition, the ethics of algorithms, and regulating robots. For the most part, the report focuses on facial recognition. Facial recognition is widespread the world over and has led to controversy, as well as wrongful arrests. More thoughtful approaches seem to be gathering steam, Benaich and Hogarth note. The duo's report cites examples such as Microsoft deleting its database of 10 million faces (the largest available) collected without consent, Amazon announcing a one-year pause on letting the police use its facial recognition tool Rekognition to give "congress enough time to put in place appropriate rules," and IBM announcing it would sunset its general-purpose facial recognition products.


How to build up cybersecurity for medical devices

The easy answer to this is “yes,” since many MDMs in the medical device industry perform “competitive analysis” on their competitors’ products. It is much easier and cheaper for them to have a security researcher spend a few hours extracting an algorithm from a device for analysis than to spend months or even years of R&D work to pioneer a new algorithm from scratch. Also, there is a large, hundreds-of-millions-of-dollars industry of companies who “re-enable” consumed medical disposables. This usually requires some fairly sophisticated reverse-engineering to return the device to its factory default condition. Lastly, the medical device industry, when grouped together with the healthcare delivery organizations, constitutes part of critical national infrastructure. Other industries in that class (such as nuclear power plants) have experienced very directed and sophisticated attacks targeting safety backups in their facilities. These attacks seem to be initial testing of a cyber weapon that may be used later. While these are clearly nation-state level attacks, you have to wonder if these same actors have been exploring medical devices as a way to inhibit our medical response in an emergency.


Generating Photons for Communication Between Processors in a Quantum Computing System

“The entanglement between the photons can then be transferred into the processors for use in quantum communication or interconnection protocols.” While the researchers said they have not yet implemented those communication protocols, their ongoing research is aimed in that direction. “We did not yet perform the communication between processors in this work, but rather showed how we can generate photons that are useful for quantum communication and interconnection,” Kannan says. Previous work by Kannan, Oliver, and colleagues introduced a waveguide quantum electrodynamics architecture using superconducting qubits that are essentially a type of artificial giant atom. That research demonstrated how such an architecture can perform low-error quantum computation and share quantum information between processors. This is accomplished by adjusting the frequency of the qubits to tune the qubit-waveguide interaction strength so the fragile qubits can be protected from waveguide-induced decoherence to perform high-fidelity qubit operations, and then readjusting the qubit frequency so the qubits are able to release their quantum information into the waveguide in the form of photons.


Why the Serverless Revolution Has Stalled

Most serverless platforms only allow you to run applications that are written in particular languages. This severely limits the agility and adaptability of these systems. Admittedly, most serverless platforms support most mainstream languages. AWS Lambda and Azure Functions also provide wrapper functionality that allows you to run applications and functions in non-supported languages, though this often comes with a performance cost. So for most organizations, most of the time, this limitation will not make that much difference. But here's the thing. One of the advantages of serverless models is supposed to be that obscure, infrequently used programs can be utilized more cheaply, because you are only paying for the time they are executing. ... The second problem with serverless platforms, or at least with the way they are implemented at the moment, is that few of these platforms resemble one another at an operational level. There is little standardization across platforms when it comes to the way functions should be written, deployed, and managed, which means that migrating functions from one vendor-specific platform to another is extremely time-consuming.


Crypto banks are going to swallow fiat banks in 3 years — or even less

You probably already know that Kraken, a cryptocurrency exchange based out of San Francisco, is now the first-ever cryptocurrency business in the United States to become a bank. For now, being an officially chartered bank means that Kraken will be able to offer more banking and funding options to existing customers. It also means Kraken Financial is going to be able to operate in multiple jurisdictions without having to deal with state-by-state compliance plans. Kraken is currently working with Silvergate Bank to offer SWIFT and FedWire funding options to U.S. customers. More and more of these kinds of partnerships will become the status quo in the near future. That’s why now is the time for traditional banks that are lagging behind to start paying attention. Silvergate Bank is a step ahead of the rest at the moment. The company boasts 880 digital asset companies as clients. Those clients have deposited more than $1.5 billion with the bank. That’s still a small amount of money relative to the market capitalizations of most major banks or even most major cryptocurrencies for that matter.


A basic overview of micro front ends

The concept for micro front ends, as described by Cam Jackson, consultant at ThoughtWorks, is similar to that of microservice design: "Slicing up big and scary things into smaller, more manageable pieces, and then being explicit about the dependencies between them." This architectural choice frees up the team to make independent choices for the technology, codebase and release processes. If you've used any leading e-commerce store lately, you may have noticed the visual format of the webpage sometimes changes after you log in. Instead of a single home page with static text and buttons, you are more likely to encounter a navigable series of clickable boxes that adjust their size in relation to the size of the browser window. These boxes are all designed to guide you to a particular purchase decision, as well as align it with your recently viewed items, past orders, recommendations, discounts and so on. In the website design, a controller is responsible for knowing how much screen resolution is available at a given time, and how much space each visual component will take. It has to optimize those spaces and call the services that will populate them. If a service is down, the controller will also need to adjust the screen in response to the failed call, or call a different service that can provide the needed function.
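The controller behaviour described here (fill each slot from its service, degrade gracefully when a call fails) can be sketched in a few lines. This is a framework-agnostic Python sketch of the pattern, not any particular micro-front-end library's API; the slot names, services, and fallback are all hypothetical:

```python
def render_dashboard(slots, fallback):
    """Controller sketch: fill each screen slot by calling its service,
    substituting fallback content when a call fails so one broken
    service doesn't take down the whole page."""
    page = {}
    for name, service in slots.items():
        try:
            page[name] = service()
        except Exception:
            page[name] = fallback(name)  # collapse or replace the failed slot
    return page

def broken_service():
    raise RuntimeError("service down")

# Hypothetical slots: one healthy service, one that is down.
slots = {
    "recommendations": lambda: ["item-1", "item-2"],
    "recent-orders": broken_service,
}
page = render_dashboard(slots, fallback=lambda name: f"{name} unavailable")
```

The healthy slot renders normally while the failed one is replaced, which is the behaviour the article describes when a backing service is down.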


Project management: How to cope with massive uncertainty and get stuff done

Steve Bates, principal at consultancy KPMG, also recognises the need for agility and says this requirement has gone mainstream: we're at a point now where the whole enterprise – not just the IT department – has to be adaptive and agile. "And that's going to take time; you can't just do that overnight. What you're going to see is the tendency for large-scale, long-term investments to be broken into smaller chunks. That allows IT and the business to work together to demonstrate quick value and then assess continuously if they're on the right track," he says. Bates says the future in tech is likely to be about fewer multi-year investments in platform technology: "I think business and IT both want quick, modular services, and then continuous assessment and alignment of both the market and the condition of the technology estate. I think over-planning would be a mistake; doing small horizons is probably better." For IT, that's not a big deal – that's simply an agile way of working that most organisations have already embraced. But for the rest of the business, the impact of that shift is significant. Bates says non-IT executives traditionally look to line up capital and then execute on it. Now, the trend will be to invest in smaller chunks because of macro-economic volatility.


Inside Job: Former Worker Allegedly Holds Records for Ransom

Unity Health has not indicated whether the accused former Nuance employee worked on-premises at the hospital, in a Nuance office or remotely from home or another location. "The challenging part of a situation such as this is that technology-based [access] controls only go so far," says Keith Fricke, principal consultant at security consultancy tw-Security. "Options may exist to prevent someone from printing patient information from within an application or even preventing a screen print. However, nothing can stop someone with authorized access to patient information from using a smartphone to take pictures of patient data displayed on a computer monitor," he notes. Worker behavior cannot be fully controlled, Fricke says. "When someone makes poor choices, the best a healthcare provider can often do is provide evidence that workers receive training on policies and understand expected behaviors and responsibilities as part of their employment." Auditors investigating such infractions look for evidence that the affected organization "did all the right things" in making investments in people, processes and tools to protect sensitive information, Fricke adds.



Quote for the day:

"Leadership is the creation of an environment in which others are able to self-actualize in the process of completing the job." -- John Mellecker

Daily Tech Digest - October 11, 2020

Could Microsoft be en route to dumping Windows in favor of Linux?

Microsoft has been doing everything in its power to migrate users from standard client-based software to cloud and other hosted solutions, and its software cash cow has become web-centric and subscription-based. All of those Linux users could still work with Microsoft 365 and any other Software as a Service (SaaS) solution it has to offer, all from the comfort and security of the Linux operating system. That's a win-win for Microsoft and consumers: Microsoft no longer has the headache of bug hunting and security patching its proprietary operating system, and consumers get a more reliable solution without missing out on anything. If Microsoft plays its cards right, the company could re-theme KDE or just about any Linux desktop in such a way that it's not all that different from the Windows 10 interface. Lay this out right, and consumers might not even know the difference: a "Windows 11" would simply be the next evolution of the Microsoft desktop operating system. Speaking of winning, IT pros would spend less time dealing with viruses, malware, and operating system issues and more time keeping the network running and secure.


Open sourcer Camunda buffs up RPA platform in an overcrowded market

Speaking to The Register, Camunda CEO and co-founder Jakob Freund said: “We have seen from Camunda customers such as Deutsche Telekom and National Westminster Bank that RPA is a great short-term solution, however it's a highly maintenance-intensive technology. RPA is essentially reading and automating the same user interface that a user would use.” A front-end UI might be in a spreadsheet, a scanned document or even a CRM or ERP system, and any changes to it could break bots. “This makes RPA labour intensive,” he said. While Camunda's solution may be a step in the right direction, the journey to bot nirvana is a long one. Craig Le Clair, Forrester Research veep, said: “There are many BPM vendors like Camunda that are acquiring or positioning in some way for the RPA or for what we call now Intelligent Automation market." He said Camunda's approach was to see RPA thus far as simple task automation without stronger rules management, which also usually failed to take account of a broader API and UI integration approach.


Google Search As a Vishing Tool: Big Cyber Security Challenge For Bank Customers

The article is based on an analysis of Google search and of various cybercrime cases registered with cybercrime cells across India, as well as cases reported in banks across India. Google search is the key focus area of this article. Cybercriminals update the name and mobile number shown in Google search results by logging into their Gmail accounts. When a bank customer searches for the contact number of a bank branch, merchant or payment intermediary to make an enquiry or resolve a problem, they end up calling the fake mobile number planted by the fraudsters. The criminal then fraudulently extracts sensitive financial and secret information from the customer over the phone, under the pretence of resolving the problem, and dupes the customer by debiting their bank account. Banks, merchants and payment intermediaries always provide genuine contact numbers on their respective websites so that customers can easily reach them in case of any issue. But being unaware of phishing and vishing scams, bank customers are easily duped by cybercriminals.


How visualizing my Agile backlog improved productivity

One of the key lessons that I’ve had to learn as a UX Designer is how to work with the Agile backlog. Part of that is to understand how User Research and Business requirements can fit together. What I mean by that is that during the discovery phases of projects, you’re likely to also do User Interviews or other research while the Business is forming their backlog. You may have a lot of insights that help inform different backlog items, but these may be things that you find out on your research that no one else on the product team may have encountered. For example, there may be user workarounds to the official process that people don’t necessarily want to advertise, but it’s still useful user insight. So how do you combine these two groups? I initially tried to argue from the backlog standpoint, creating specific backlog items, but it was hard for members of the team to get on board with these things. Because even if they trust you that you did good research, it’s something that they’re not familiar with. And that’s exactly what was happening. People would view an isolated business requirement and change the wording or not realize what we were trying to say, without realizing how it might impact other items or the big picture.


Curiosity Artificial Intelligence: A Know-All Guide

If we take humans as an example, curiosity is what makes us learn. From an early age, humans become curious about the movement around them and learn from it. Now think about AI getting curious. AI has already surpassed human abilities in some areas, making predictions and decisions in a split second by working through all of the available data. When we compare human curiosity with AI curiosity, AI emulates the behaviour through an algorithm that enhances the potential for self-directed machine learning, so that an AI system is driven to seek out or develop solutions to unfamiliar problems. In artificial intelligence, reinforcement learning (RL) is the process of motivating an AI to perform desired behaviours and punishing it for undesired ones. RL relies on a feedback signal that tells the AI it is moving a step closer to its goal, and it learns through observation and experience. Whether the AI is trained with positive or negative feedback is up to the person designing the system.
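As a concrete illustration of the reward-and-punish loop, here is a minimal epsilon-greedy bandit in Python. It is a sketch of the simplest form of RL, not of any particular system; the arm probabilities, step count and epsilon are illustrative:

```python
import random

def train_bandit(arm_probs, steps=2000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: each pull of an 'arm' yields reward 1 with
    that arm's probability, else 0. The agent learns which behaviour is
    rewarded purely from this feedback signal."""
    rng = random.Random(seed)
    counts = [0] * len(arm_probs)
    values = [0.0] * len(arm_probs)   # running estimate of each arm's reward
    for _ in range(steps):
        if rng.random() < epsilon:    # explore: try a random arm (curiosity)
            arm = rng.randrange(len(arm_probs))
        else:                         # exploit: pick the best-looking arm
            arm = max(range(len(arm_probs)), key=values.__getitem__)
        reward = 1.0 if rng.random() < arm_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return values

values = train_bandit([0.2, 0.8, 0.5])  # the middle arm pays off most often
```

After training, the agent's value estimates single out the most-rewarded arm, which is the "step closer to its goal" signal the paragraph describes.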


TypeScript creator: How the programming language beat Microsoft's open-source fears

But as TypeScript matured and open source gained acceptance at Microsoft in its pivot from Windows to the cloud, the company would in 2014 shift TypeScript to a model of "open development" via a public repository on GitHub. Using GitHub allowed the community of TypeScript and JavaScript developers to influence its future. Because of that different approach, his team now has "zero distance" to its customers – the developers who use either JavaScript or TypeScript. "There's open source in the technical sense in that you give people your source code and give away your intellectual property rights, and technically that's open source," explains Hejlsberg. "But then there's open development, where you actually do your entire development process in the open, which we've been doing now since TypeScript moved to GitHub in 2014." Today, GitHub, which Microsoft acquired in 2018 for $7.5bn, is where the TypeScript team of about 20 Microsoft engineers does all of its daily work, allowing for a "closeness to our customers like nothing I've seen before".


The Disruption Era: The Future Of Coworking

Indeed, technology is unquestionably going to be a crucial point. A short report from CBInsights shows the “office of the future” will probably have voice tech systems to avoid touching, as well as autonomous cleaning solutions; air-improving systems to make us more productive; and sensors to count people in rooms, along with a wellness check at the entry. One other important point is sustainability. Since workers spend most of their time indoors, it's crucial to make that environment livable and sustainable. According to Accenture, just 11% of employees are completely satisfied with their workspace. Also, many governmental initiatives plan to cut carbon footprints by developing nearly zero-energy buildings. Some companies have also come up with solutions to reduce energy usage, such as solar panels, systems that harness rain or water energy, and innovative ideas like converting workers' footsteps into data and energy. According to the global coworking community GCUC, while the U.S. and Europe are expected to grow slowly, China is catching up fast and is expected to overtake them within the next few years, as the global number of coworking spaces increases by over one-third by 2022.


Visa Alert: POS Malware Attacks Persist

The three POS malware variants that targeted one hospitality company in June were identified as RtPOS, MMon and PwnPOS, according to the Visa report. "There is evidence to suggest that the actors employed various remote access tools and credential dumpers to gain initial access, move laterally and deploy the malware in the POS environment," according to the report. The malware variants are designed to scrape payment card data from Windows-based POS devices, but each performs its functions differently, according to the report. The RtPOS malware uses a specialized algorithm to check for payment card data before bundling the information into a file that the fraudsters later exfiltrate through a command-and-control server, the report notes. The MMon malware, on the other hand, deploys a command-line memory-scraping technique that collects payment card data from a POS device's memory. The Visa report notes this malicious code, in use since 2010, is frequently customized. The PwnPOS malware creates persistence within POS devices and attempts to scrape payment card data from memory.


Quantum is years away, but business case can be made today

Working with D-Wave, Accenture ran a test using quantum annealing to demonstrate that there are benefits to using quantum techniques over traditional methods when hundreds of assets and/or factors are involved in the calculation. According to BBVA, the promising results have convinced the team to continue its investigation of this case with other technologies. Discussing the potential impact of quantum computing on financial services, Carlos Kuchkovky, global head of research and patents at BBVA, said: “Although this technology is still in an early stage of development, its potential to impact the sector is already a reality. Our research is helping us identify the areas where quantum computing could represent a greater competitive advantage, once the tools have matured sufficiently. We believe this will be, for certain concrete tasks, in the next two to five years.” Alan Baratz, CEO of D-Wave, said: “Quantum computing is poised to fundamentally transform the way businesses – especially large-scale enterprises – solve critical problems. As enterprise leaders and decision-makers rethink business processes to become more agile and innovative, they need the tools and support to turn their ideas into quantum applications that have a real impact on their business.”


Dr Lal Pathlabs data leak: Legal opinions on liabilities, punishments ...

The company now claims that the loophole has been patched, but there is no explanation of the magnitude of the information leaked online. With sensitive personal data leaks happening every now and then, the The420 team spoke to legal and cyber experts to find out the legal liabilities and punishments in such cases. All the cyber experts unanimously said it is the companies’ responsibility to ensure the safety of their customers’ data; patching the loophole is the bare minimum expected of them. Experts highlighted that private companies are not serious about data protection, which can also be gauged by the amount they spend on security. Explaining the legal position in such cases, Delhi-based senior lawyer Karnika Seth said, “Section 43A of the IT Act, 2000 requires companies to safeguard personal data and sensitive personal data. Health records are sensitive data. ISO 27001 certification is one of the standards required to comply with extant data protection measures.” Explaining other legal provisions in such cases, Mumbai-based cyber and privacy lawyer Dr Prashant Mali said, “Section 85 of the IT Act, 2000 deals with offences by companies and Section 72A deals with contractual data breaches. The law is stringent, but its implementation after 19 years in force is pathetic, and there is no awareness.”



Quote for the day:

"Make sure you have finished speaking before your audience has finished listening." -- Dorothy Sarnoff

Daily Tech Digest - October 10, 2020

Internet of Things Applications

IoT applications can transform reactive medical-based systems into active wellness-based systems. The resources used in current medical research lack important real-world information, relying instead on controlled environments, leftover data and volunteers for clinical trials. The Internet of Things improves devices' power, precision and availability, and it focuses on building systems rather than just tools. Here's how the IoT-enabled care device works. ... Most of you have heard the term smart city. A smart city uses technology to provide services: improving transportation and social services, promoting stability and giving citizens a voice. The problems faced by Mumbai are very different from those faced by Delhi. Even global issues, such as clean drinking water, declining air quality and increasing urban density, occur with varying intensity from city to city, and therefore affect every city differently. Governments and engineers use the Internet of Things to analyze the complex factors of each town and city. IoT applications help in the areas of water management, waste control and emergencies. ... By the year 2050, the world's growing population is estimated to reach about 10 billion. To feed such a large population, agriculture needs to marry technology and get the best results.


Learning from Bugs and Testers: Testing Boeing 777 Full Flight Simulators

Every failure is an opportunity to learn something. Finding and fixing a bug is great, but understanding why the bug is there is where you learn even more. It might be a requirement that wasn’t properly formulated, implemented, tested, or anything else. As a software tester, imagine that a bug you have reported was caused by an untrapped exception in a division by zero. If you saw that bug several times, or if your software is mission-critical, then it is worth asking every team to check whether they have encapsulated all code where they perform a division with a “try catch” statement. The Ariane 5 rocket blew up because of untrapped exceptions: there go half a billion dollars and a dent in reputation. The US Air Force almost lost an F-22 squadron (state-of-the-art fighter jets valued at $350 million a pop) when they flew over the International Date Line (IDL), leaving the pilots with only flight controls (dedicated and separate computers). When you cross the IDL, the “time” skips by one day depending on your direction of travel. A space probe crashed on Mars because the software testers forgot to test what would occur when sensors send erroneous data to the mission computer.
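The article doesn't name a language, but in Python terms the blanket guard it suggests looks like this (the function name and the choice of a default return value are illustrative, and a real system would log the event rather than silently substitute):

```python
def safe_divide(numerator, denominator, default=0.0):
    """Guard a division so an unexpected zero can't crash the system;
    the untrapped version of this exception is exactly the Ariane 5
    class of failure the text describes."""
    try:
        return numerator / denominator
    except ZeroDivisionError:
        return default  # fall back to a safe default instead of crashing

ratio = safe_divide(10, 0)  # returns the default rather than raising
```

The tester's question then becomes a checklist item: is every division in the codebase wrapped like this, or otherwise proven to have a nonzero denominator?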


55 New Security Flaws Reported in Apple Software and Services

A team of five security researchers analyzed several Apple online services for three months and found as many as 55 vulnerabilities, 11 of which are critical in severity. The flaws — including 29 high severity, 13 medium severity, and 2 low severity vulnerabilities — could have allowed an attacker to "fully compromise both customer and employee applications, launch a worm capable of automatically taking over a victim's iCloud account, retrieve source code for internal Apple projects, fully compromise an industrial control warehouse software used by Apple, and take over the sessions of Apple employees with the capability of accessing management tools and sensitive resources." The flaws meant a bad actor could easily hijack a user's iCloud account and steal all the photos, calendar information, videos, and documents, in addition to forwarding the same exploit to all of their contacts. The findings were reported by Sam Curry along with Brett Buerhaus, Ben Sadeghipour, Samuel Erb, and Tanner Barnes over a three month period between July and September. After they were responsibly disclosed to Apple, the iPhone maker took steps to patch the flaws within 1-2 business days, with a few others fixed within a short span of 4-6 hours.


An introduction to Windows 10’s new PowerToys

Over the past few months, PowerToys has really cleaned up its installation and update capabilities. Versions 0.18.2 and earlier would require users to jump into Task Manager and terminate various processes and applications before its installation could complete. Via Twitter, one of team leader Clint Rutkas’ associates informed me this was because of the program’s inclusion of the Core .NET DLLs into PowerToys. But as of release v0.20.1, PowerToys handles all open applications without issue. Now, it simply asks for permission to restart the Windows Explorer (process name: explorer.exe) as it finishes up updating or installing itself. Speaking of updates, Rutkas and his programmers are working hard to push frequent updates to PowerToys. A look at the Releases page shows eight releases since early June, or about two per month. ... Pressing and holding the Windows key causes an overlay showing a list of Windows-key shortcuts to appear on the primary Windows display. Once shown, that menu — a.k.a. the Shortcut Guide — persists as long as the winkey remains depressed. These shortcuts, or key combos, require that the Windows key and the named key be depressed together.


3 ways criminals use artificial intelligence in cybersecurity attacks

Generative Adversarial Networks (GANs) are basically two AI systems pitted against each other—one that simulates original content and one that spots its mistakes. By competing against each other, they jointly create content convincing enough to pass for the original. Nvidia researchers trained a unique AI model to recreate PAC-MAN simply by observing hours of gameplay, without a game engine, as Stephanie Condon explained on ZDNet. Bandos said that attackers are using GANs to mimic normal traffic patterns, to divert attention away from attacks, and to find and exfiltrate sensitive data quickly. "They're in and out within 30-40 minutes thanks to these capabilities," he said. "Once attackers start to leverage artificial intelligence and machine learning, they can automate these tasks." GANs also can be used for password cracking, evading malware detection, and fooling facial recognition, as Thomas Klimek described in the paper, "Generative Adversarial Networks: What Are They and Why We Should Be Afraid." A PassGAN system built by machine learning researchers was trained on an industry standard password list and was eventually able to guess more passwords than several other tools trained on the same dataset.
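To make the "two systems pitted against each other" idea concrete, here is a deliberately tiny caricature in pure Python. It has no neural networks or gradients (a real GAN trains both sides with backpropagation); it only shows the adversarial loop: a generator adjusts its output until a discriminator that has profiled the real data can no longer tell the difference. All numbers are illustrative:

```python
import random

def toy_gan(real_mean=10.0, steps=300, seed=0):
    """Caricature of GAN training: a 'generator' hill-climbs on a
    'discriminator' score until its fakes match the real data profile."""
    rng = random.Random(seed)
    real = [rng.gauss(real_mean, 1.0) for _ in range(500)]
    center = sum(real) / len(real)   # discriminator's learned profile of real data

    def realism(x):                  # higher score = more convincing fake
        return -abs(x - center)

    g = 0.0                          # generator parameter: mean of its fakes
    for _ in range(steps):
        # gradient-free update: nudge g in whichever direction fools
        # the discriminator more
        g += 0.1 if realism(g + 0.1) >= realism(g - 0.1) else -0.1
    return g

g = toy_gan()  # ends up close to the real data's mean
```

Swap "numbers around a mean" for "network traffic patterns" or "candidate passwords" and this loop is, in miniature, the mechanism behind the attacks described above.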


Serverless Horror Stories

One of the well-known challenges of event-driven serverless architectures is the difficulty in tracing requests end-to-end, in order to investigate performance issues. A case in point is when the Segment company encountered a problem with a popular serverless service, DynamoDB. Segment was experiencing a serious performance issue with their DynamoDB instances that was slowing down their entire system. To mitigate this, the company had to increase the provisioned throughput of DB instances, but this in turn vastly increased their AWS bill. When Segment’s own troubleshooting efforts failed to uncover the problem, they asked AWS support for help. Using internal tools, AWS generated a partition heatmap of DynamoDB instances. Although the heatmap was not very readable, they were able to spot a single DB partition that was having performance issues, clearly indicating that their workload was not distributed evenly across partitions. It was still, however, not clear which records or keys were problematic. So Segment continued to investigate the issue and found a relatively trivial bug that was very hard to spot but which, when fixed, reduced their DynamoDB capacity by a factor of four and saved them $300,000 annually.
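The hot-partition problem Segment hit is easy to reproduce in miniature. This Python sketch is not DynamoDB's actual internals (the key format, hash function and partition count are illustrative); it simply hashes keys to partitions and tallies the load, which is essentially what the AWS heatmap visualized:

```python
import zlib
from collections import Counter

def partition_heatmap(keys, num_partitions=8):
    """Hash each item key to a partition and tally the load, in the
    spirit of how a partitioned store spreads items. A lopsided tally
    is the 'hot partition' the heatmap made visible."""
    return Counter(zlib.crc32(k.encode()) % num_partitions for k in keys)

# Illustrative skewed workload: one key receives 90% of the writes.
keys = ["user#hot"] * 900 + [f"user#{i}" for i in range(100)]
load = partition_heatmap(keys)
```

With this workload, one partition absorbs at least 900 of the 1,000 writes no matter how the other keys scatter; provisioning more total throughput, as Segment first did, cannot fix that kind of imbalance.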


Travelex Cyber-attack Timeline

We have compiled a detailed timeline of the Travelex cyber-attack based on information that's available freely on the internet and in media reports. Our objective is to simply present this information in an easy-to-consume visual guide that can help cybersecurity practitioners and enthusiasts to get further clarity on what went wrong and how. ... The idea of us creating this timeline is not to vilify/defame any business or victims of a cyber-attack. However, from every cyber incident there is something all of us can learn about covering our bases when it comes to being truly cyber-resilient.  In this case, it appears that the cyber-criminals managed to attack Travelex thanks to the unpatched critical vulnerabilities in its Pulse Secure VPN servers. Hence, the lesson here would be to always ensure that your cybersecurity infrastructure is as updated and foolproof as possible. Regular review of the IT infrastructure is also imperative to ensure that your business is as secure as is possible. ... Amar insists that this initiative isn't aimed at attacking Travelex or any organisation. Our objective to create these attack timelines is purely for educational purposes. Amar has been in the thick of many cyber-attacks and he absolutely understands the pressure, the chaos and the collective desire to do the right thing when in the midst of a major crisis.


How to Build, Deploy, and Operationalize AI Assistants

Some of the issues with non-linear conversations, where the user introduces a new topic in the middle of the conversation or modifies a previous statement, remained. These types of multi-turn conversations are particularly challenging, and they also happen to be the way that most users actually talk. In an effort to resolve some of these issues, the team experimented with Rasa’s TED (Transformer Embedding Dialogue) policy. Using a transformer architecture, the TED policy can selectively pick which conversation turns to pay attention to, and which conversation turns to ignore. Additionally, and perhaps distinctively in comparison to recurrent neural network architectures, transformers use a self-attention mechanism, by which they are able to choose which elements in a conversation to pay attention to in order to make an accurate prediction. In other words, transformers are uniquely equipped to handle non-linear conversations where a user might change topics or engage in chitchat in the middle of a conversation, because they are less likely to become perplexed when a user does something unexpected. In addition, the TED policy provides hyperparameters that can be used to fine-tune the model.
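A minimal sketch of the scaled dot-product attention the paragraph refers to, in pure Python (toy two-dimensional vectors, not Rasa's actual TED implementation): each conversation turn receives a weight from its similarity to the current query, so an irrelevant turn can get near-zero weight and be effectively ignored:

```python
import math

def softmax(scores):
    """Turn raw relevance scores into attention weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attend(query, keys, values):
    """Scaled dot-product attention over a list of 'turns'.

    Each turn is a (key, value) vector pair; the weight given to a turn
    reflects its relevance to the query, so off-topic chitchat in the
    middle of a conversation can be down-weighted rather than confusing
    the prediction.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Query aligned with the first turn: that turn dominates the output.
out = self_attend([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]],
                  [[1.0, 0.0], [0.0, 1.0]])
```

A transformer layer does exactly this, with learned projections for queries, keys and values and many attention heads in parallel.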


The next frontier for risk tech should be in the cloud

Risk management technology moving to the cloud allows organizations to better incorporate their entire technology stack into their GRC processes. With more available integrations, data can be exchanged more securely and can result in more impactful business insights. And when information moves seamlessly between applications and platforms, it is also easier to incorporate more employees and managers into a company's risk culture and processes. Investing in agile technology gives companies the opportunity to scale, and scale quickly. In a space like GRC and risk management, which relies on data and the insights garnered from it, it's important not only to invest in agile technology but also in software built on a graph database. Graph databases, compared to relational databases, are much more flexible and offer greater user visibility. Information is more easily stored, and the infrastructure generates relationships between data sets so solutions can unlock more insights and functionality. Now, a cloud-based risk management solution can align with and improve pre-existing processes. As a result, Gartner predicted 100% annual growth for graph databases through 2022 when it named them one of its biggest data trends.
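A toy illustration of why relationships are cheap to traverse in a graph model (the node names are invented for a GRC flavour; a real graph database adds indexing, persistence and a query language on top of this idea): a multi-hop question like "which systems does this policy ultimately touch?" is a simple walk over edges, rather than a chain of relational JOINs.

```python
# Tiny adjacency-list "graph": relationships between data sets are
# stored directly as edges, so traversal follows pointers.
graph = {
    "policy:privacy":        ["control:encryption", "control:access-review"],
    "control:encryption":    ["system:crm"],
    "control:access-review": ["system:crm", "system:billing"],
}

def reachable(graph, start):
    """All nodes connected to `start`, e.g. every control and system
    a given policy touches, found by a depth-first walk."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Calling `reachable(graph, "policy:privacy")` returns both controls and both downstream systems in one pass, the kind of cross-entity insight the paragraph attributes to graph-backed GRC tooling.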


Data localisation in emerging markets: The case of Turkey

Data localisation requirements under Turkish law have so far been sector-specific and, hence, limited in scope. Several sectors are already under obligation to keep primary and secondary data in Turkey: banking, e-sim technology companies, finance, healthcare and energy. When the social media law discussed above was formally enacted in July 2020, the Turkish ID number requirement for social media logins was dropped and the data localisation measure became a request to the outlets “to take the necessary measures towards hosting Turkey-based users’ data in Turkey” without enforcement. With no clear enforcement mechanism for the data localisation requirement in the Turkish social media law, Turkey diverged from the Russian approach of forced data localisation in social media and approximated to the social media regulation approaches of other G20 member emerging markets such as South Africa and India. Yet another blow towards forced data localization came from the “Personal Data Protection Board” in September 2020, rendering the Convention No. 108 of the Council of Europe (to which Turkey is a signatory) on cross-border transfers of personal data not applicable for Turkey.



Quote for the day:

"Leadership is intangible, and therefore no weapon ever designed can replace it." -- Omar N. Bradley