Daily Tech Digest - March 10, 2020

How can companies thrive under CCPA regulation?

One major challenge companies face with data management under CCPA, Manley says, is dealing with consumer data that’s located across a number of devices and software infrastructures. He explained: “I think the biggest challenge we see a lot of people having is that they often don’t understand how many places are holding customer data. They were thinking very much about the data that they had on their premises, maybe things that are in file servers, databases, corporate laptops, that sort of thing, and it takes a while to then realise that they’ve got a number of SaaS applications, whether it’s Salesforce, Slack or Office 365.” ... According to Manley, the companies that manage to succeed while staying within CCPA boundaries use the regulation as an opportunity to reflect on their operations. “Regulations like CCPA are a good baseline for what your company should be doing anyway,” he said. “For a lot of the better organisations, we see them saying that the goal isn’t just to hit the baseline, but it’s to use this as a starting point for discussion about what we want to be as a business.”



Impactful, but Overhyped AI

Many companies struggle with how to successfully integrate AI into their businesses. Lux Research released a report called “Artificial Intelligence: A Framework to Identify Challenges and Guide Successful Outcomes” that analyzes the market, outlines several challenges companies face in integrating AI, and homes in on several factors businesses should consider before investing in AI. The four factors the research firm suggests to help businesses make wise AI investments and decisions are: clearly understanding the outcomes implementing AI will provide for their businesses; focusing on an AI product’s capabilities instead of flashy marketing; knowing when the technology is mature enough to mitigate risk; and identifying practical challenges to both implementation and maintenance of the technology once it is in place. There’s no doubt that AI technologies can be impactful in helping companies achieve digital transformation, but there is also a lot of hype that is not necessarily helping the space and the players within it.


Multiple nation-state groups are hacking Microsoft Exchange servers

These state-sponsored hacking groups are exploiting a vulnerability in Microsoft Exchange email servers that Microsoft patched last month, in the February 2020 Patch Tuesday. The vulnerability is tracked under the identifier CVE-2020-0688. ... This Exchange vulnerability is not, however, straightforward to exploit. Security experts don't see this bug being abused by script kiddies (a term used to describe low-level, unskilled hackers). To exploit the CVE-2020-0688 Exchange bug, hackers need the credentials for an email account on the Exchange server, something that script kiddies don't usually have. The CVE-2020-0688 security flaw is a so-called post-authentication bug: hackers first need to log in and then run the malicious payload that hijacks the victim's email server. But while this limitation will keep script kiddies away, it will not stop APTs and ransomware gangs, experts said. APTs and ransomware gangs often spend much of their time launching phishing campaigns, through which they obtain email credentials for a company's employees.


3 cloud architecture problems that need solutions


Many enterprises push as much as they can to the edge, but doing so means moving away from a centralized system (the public cloud) to many decentralized systems (the edge devices or servers). You need to understand that you must maintain these edge systems, and they are much more difficult to monitor, govern, secure, update, and configure. Multiply that effort by hundreds of edge computing devices and you've got an operational nightmare. Second, what to containerize? Many enterprises say containers are their strategy and not just an enabling technology. This almost religious belief in the power of containers has pushed many an application to the cloud in containers, but that's really not how businesses should be moving there. The issue is that there are no hard and fast rules as to what can, and should, exist in a container. Legacy applications that would take a great deal of effort to refactor (rewrite) for containers are not likely candidates; however, in many instances, the cloud migration team attempts to move them first. This means that enterprises will fail to find value in containers for some of the applications they move to the cloud.


How to break down data silos: 4 obstacles and solutions

With the growth of shadow IT, vendor software and databases can come through virtually any departmental door. Systems from different vendors that departments independently buy don't necessarily interact well with each other. When this occurs, systemic data silos can arise because of cross-system and data integration failures. The best way to address this issue is to require interoperability and a full set of application programming interfaces (APIs) in the requests for proposal (RFP) that IT and individual business departments issue to vendors. One way to assure that system and data interoperability is a front-page requirement on RFPs is for IT to create a standard RFP that is required by purchasing or whichever department authorizes tech purchases. This standardized form can be used by IT and end-user departments. Most systems and databases sold by vendors have some type of APIs for data integration; however, totally seamless integration and the ability to easily aggregate data from disparate systems can never be assumed.


How CIOs can limit the business disruption of the coronavirus — Gartner

With various quarantine measures and travel restrictions undertaken by organisations, cities and countries, uncertainties and disruptions are beginning to have more of an impact on businesses and their workforces. This increases the chance of business operations either being suspended or run in a limited capacity. In response, Gartner is promoting the use of AI to automate some tasks, particularly basic customer service protocols and candidate screenings. In its report, Gartner also recommends that in organisations where remote working capabilities have not yet been established, CIOs work out interim solutions, including using instant messaging for general communication, file sharing/meeting solutions, and access to enterprise applications such as enterprise resource planning (ERP) and customer relationship management (CRM). ... If it isn’t possible for organisations to meet their clients face to face, Gartner recommends using digital channels such as video calls and live streaming solutions that can serve various customer engagement and selling scenarios.


What Does A Typical Day Of A Data Scientist Look Like?

Like every other professional, a data scientist will have a day dotted with emails to answer and meetings to attend. But this is where the similarities end. Unlike in most jobs, each day throws up new challenges and unique problems for a data scientist. These come in the form of varied projects, which in turn change with the industry they operate in. But despite the flux, what ties their workdays together is data-related tasks. Depending on your profile, you will, broadly speaking, be pulling data, shaping it, merging it, or analysing it, all with the end goal of solving problems for businesses. This is accomplished by using a wide variety of tools that look for patterns or trends within a given data set, and by trying to simplify data problems. ... As emphasised in the first point, the primary task of data scientists is to be problem-solvers, and that cannot be achieved in silos. A typical day would involve engaging with stakeholders at multiple levels to determine the questions that need pointed answers. Beyond that, it is their job to come up with different approaches to solve these problems.


Introducing Alpine.js: A Tiny JavaScript Framework

Ever built a website and reached for jQuery, Bootstrap, Vue.js or React to achieve some basic user interaction? Alpine.js is a fraction of the size of these frameworks because it involves no build steps and provides all of the tools you need to build a basic user interface. Like most developers, I have a bad tendency to over-complicate my workflow, especially if there’s some new hotness on the horizon. Why use CSS when you can use CSS-in-JS? Why use Grunt when you can use Gulp? Why use Gulp when you can use Webpack? Why use a traditional CMS when you can go headless? Every so often, though, the new hotness makes life simpler. Recently, the rise of utility-based tools like Tailwind CSS has done this for CSS, and now Alpine.js promises something similar for JavaScript. In this article, we’re going to take a closer look at Alpine.js and how it can replace jQuery or larger JavaScript libraries to build interactive websites. If you regularly build sites that require a sprinkling of JavaScript to alter the UI based on some user interaction, then this article is for you.


Job Trends For Data Scientists In The Next 5 Years

A trend that has emerged in recent times is that companies which earlier identified themselves as ‘non-tech’ are beginning to position themselves as tech companies, and this is likely to continue. A case in point is banks. For instance, the ‘analyst’ of this industry might now be called a ‘data scientist’, as long as they are seeking to monetise the company’s data assets. One of the main drivers of this trend is the copious amount of data available today, which has been increasing exponentially. What is more, fuelled by the rise of the Internet of Things (IoT) and social media, this growth is not expected to slow down anytime soon. The IoT market in India alone is reportedly likely to reach a whopping 2 billion connections by 2022. This is buttressed by the fact that not only are more devices coming online, but with greater improvements in hardware, the type of data delivered will be more diverse. The same goes for social media. According to Hootsuite, the number of social media users worldwide rose to 3.484 billion in 2019, an increase of 9% year-on-year.


Huawei P40 Pro expected to have 7 cameras, 10x optical zoom, and 5G support


According to known Apple leaker Ming-Chi Kuo, a 10x optical zoom camera could be included as one of the sensors in the P40 Pro's camera system, making it the world's first phone to achieve such a feat. The Mate 30 Pro featured a quad-camera set-up, and included a 50x digital zoom and a 5x optical zoom, which catapulted it into the mobile hall of fame. Optical zoom is achieved by switching from a wide-angle camera to a telephoto camera; the magnification number is the ratio of those two focal lengths. Using the telephoto camera without "pinching in" results in a higher-quality image than digital zoom, which is what happens when you pinch the screen of your phone while using the main camera, or when you try to zoom in beyond the telephoto camera's capabilities. According to GizChina, the P40 Pro's rear camera will come with a 52-megapixel Sony IMX700 sensor, 10 megapixels more than the P30 Pro's rear camera. The 52-megapixel sensor is significantly lower in resolution than the Galaxy S20 Ultra's 108-megapixel sensor, but reports suggest the new sensor can deliver bigger pixels and better low-light image quality.
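That zoom arithmetic can be checked directly: the optical zoom factor is simply the telephoto focal length divided by the wide-angle focal length. The focal lengths below are illustrative round numbers, not the P40 Pro's actual specifications.

```python
# Optical zoom factor = telephoto focal length / wide-angle focal length.
# Focal lengths here are illustrative, not real device specifications.

def optical_zoom(telephoto_mm: float, wide_mm: float) -> float:
    """Return the optical zoom factor between two lenses."""
    return telephoto_mm / wide_mm

# A hypothetical 270 mm-equivalent periscope lens paired with a 27 mm
# wide-angle lens yields 10x optical zoom.
print(optical_zoom(270.0, 27.0))  # -> 10.0
```

Anything beyond that ratio has to come from digital zoom, i.e. cropping and upscaling, which is why image quality drops past the telephoto lens's reach.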



Quote for the day:


"The captain of a ship can run a great ship, but he can't do anything about the tides." -- Matthew Norman


Daily Tech Digest - March 09, 2020

Can Continuous Intelligence and AI Predict the Spread of Contagious Diseases?


Did past efforts to model the spread of contagious diseases make false assumptions about the data they relied on? Does the fact that many people in one geographic region search for the name of an emerging contagious disease mean the disease is present and growing? Perhaps, perhaps not. The danger is relying on coincidences and not linking cause to effect. Did past and current efforts have all the data they needed? One issue with forecasting the spread of a disease is that models might not have accurate data. The issue is especially relevant at the onset of new diseases. Flu-like symptoms are easily confused across illnesses: doctors may not know the symptoms of a disease at its onset, or they may make inaccurate diagnoses. Are the models based on the right science? At the early stage of investigating a newly found disease, even basic information, like how a disease spreads, is unknown. Is it airborne? Does it spread via exposure to blood or other bodily fluids? What’s the incubation period? Such mechanisms need to be nailed down before predictions can be made.
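The sensitivity to these unknowns shows up even in a toy compartmental model. The sketch below runs a minimal SIR (susceptible-infected-recovered) simulation with two illustrative transmission rates; all parameter values are arbitrary for demonstration and are not estimates for any real disease.

```python
# Toy SIR model: small changes in the transmission rate (how a disease
# spreads) produce very different epidemic curves. Parameters are
# illustrative only.

def sir_peak_infected(beta, gamma=0.1, days=365, n=1_000_000, i0=10):
    """Euler-step SIR simulation; return the peak number infected at once."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return round(peak)

# Doubling the transmission rate does far more than double the peak.
print(sir_peak_infected(beta=0.2), sir_peak_infected(beta=0.4))
```

Getting beta wrong, because the transmission mechanism is still unknown, is exactly the kind of error that invalidates an early forecast.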



Out at Sea, With No Way to Navigate: Admiral James Stavridis Talks Cybersecurity

We're still figuring out how this is going to work. To shift metaphors to the oceans, it's as though we're out at sea, we're in a bunch of boats, but we haven't really put in place buoys and navigational aids, and we haven't really defined who's going to protect us. So if I'm a commercial ship at sea, I know the US Navy is going to come and defend me if I'm an American ship and I'm under attack. And in fact, we actively discourage merchant ships from mounting their own defenses. The defense requirements, I think, ought to be vested in the state. But in the world of cyber, realistically, if you're a commercial entity, particularly a target-rich kind of environment like financials or critical infrastructure, say the electric grid, the government so far has not really stepped up to that task of broadly protecting you. Yeah, you can get some help from the NSA and some help from the FBI and some help from the CIA. But broadly speaking, you are going to have to have some mechanisms, at least on the detection and on the defensive side.


Containers march into the mainstream

In 2013, Solomon Hykes’ invention of Docker containers had an analogous effect: with a dab of packaging, any Linux app could plug into any Docker container on any Linux OS, no fussy installation required. Better yet, multiple containerized apps could plug into a single instance of the OS, with each app safely isolated from the others, talking only to the OS through the Docker API. That shared model yielded a much lighter-weight stack than the VM (virtual machine), the conventional vehicle for deploying and scaling applications in cloudlike fashion across physical computers. So lightweight and portable, in fact, that developers could work on multiple containerized apps on a laptop and upload them to the platform of their choice for testing and deployment. Plus, containerized apps start in the blink of an eye, as opposed to VMs, which typically take the better part of a minute to boot. To grasp the real impact of containers, though, you need to understand the microservices model of application architecture. Many applications benefit from being broken down into small, single-purpose services that communicate with each other through APIs, so that each microservice can be updated or scaled independently.


Democratizing data, thinking backwards and setting North Star goals

Essentially, the database is a fairly old technology, but it has always been about three things. One is value: how do you get the best out of your data? What are the features that you provide, the power of querying the data, of updating it, of correlating it, and doing things with the data? The second has been security: how do you make sure that the data stays under your control, that you own it and determine what happens with it? And the third, I would call it cost or performance: making sure that you don’t overpay for the data, that it gets more and more affordable to do what you want to do with your data and control it. ... The best way to process data is if it’s really structured and you know exactly what it is, and you have a schema, essentially. And I spent a lot of time working on semi-structured data, which has some structure that you extract, and that is about getting good value out of all data, not just your structured data like your bank accounts, but also your email, the books you write, the Word documents you write, getting some value out of that.
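As a small illustration of what extracting value from semi-structured data can look like, the sketch below applies a "schema on read" query to JSON records that share some fields but not a fixed schema. The field names and records are invented for the example.

```python
# Semi-structured data: records share some fields but have no fixed schema.
# A query extracts the structure it needs and tolerates what is missing.

import json

docs = [
    '{"type": "email", "from": "alice@example.com", "subject": "Q3 numbers"}',
    '{"type": "doc", "title": "Draft chapter", "words": 4200}',
    '{"type": "email", "from": "bob@example.com"}',  # no subject field
]

# "Schema on read": pull sender addresses out of whatever is an email.
senders = [
    rec["from"]
    for rec in map(json.loads, docs)
    if rec.get("type") == "email" and "from" in rec
]
print(senders)  # -> ['alice@example.com', 'bob@example.com']
```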


Artificial intelligence and machine learning an essential part of cybersecurity


World Wide Technology also plans to use AI and ML this year as part of its cybersecurity plans, according to chief technology advisor Rick Pina. "In today's digital age, the security of data, applications, and processes is of the utmost importance; and AI and ML now play an integral part in this cybersecurity process. AI and ML have brought enticing new prospects for speed, accuracy, and connectivity to the public and private sectors, allowing government agencies and corporate organizations to make great strides in governed self-service access, alongside data security and reliability," Pina said. ... Michael Hanken, vice president of IT at Multiquip, said he isn't planning to use AI and ML yet, but he is researching its benefits and limits to see how it might work in conjunction with cybersecurity in the future. Dan Gallivan, director of IT for Payette, said, "AI and ML are not part of the official plan this year but I do feel they are in the not too distant future as we learn more about artificial intelligence and machine learning development capabilities and then experiment with them in cybersecurity."


7 Cloud Attack Techniques You Should Worry About

As organizations transition to cloud environments, so too do the cybercriminals targeting them. Learning the latest attack techniques can help businesses better prepare for future threats. "Any time you see technological change, I think you certainly see attackers flood to either attack that technological change or ride the wave of change," said Anthony Bettini, CTO of WhiteHat Security, in a panel at last week's RSA Conference. It can be overwhelming for security teams when organizations rush headfirst into the cloud without consulting them, putting data and processes at risk. Attackers are always looking for new ways to leverage the cloud. Consider the recently discovered "Cloud Snooper" attack, which uses a rootkit to bring malicious traffic through a victim's Amazon Web Services environment and on-prem firewalls before dropping a remote access Trojan onto cloud-based servers. As these continue to pop up, many criminals rely on tried-and-true methods, like brute-forcing credentials or accessing data stored in a misconfigured S3 bucket. There's a lot to keep up with, security pros say.


Robotic Process Automation Implementation Choices


The first step in implementing RPA is identifying tasks that lend themselves to automation. There are some common characteristics to look for, even though RPA application areas cut across broad swaths of organizations. Specifically, IBM notes that an “RPA-ready” application is one that involves: simple, consistent, and repeatable work; repetitive low-skill tasks that create human issues such as high error rates and low worker morale; existing or planned processes where stripping off routine tasks can free humans and deliver significant productivity, efficiency, or cost benefits; and tasks that offer meaningful opportunities to improve customer and worker experiences by speeding up existing processes. Some tasks may meet many of these criteria but still not be suitable for RPA. For example, a task may meet every criterion, but if it requires additional data capture capabilities or a redesign of the process, RPA may not be the right fit. RPA can be applied to a very broad range of tasks across most industries.
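As a rough illustration, the criteria above could be encoded as a screening checklist. The field names, the two disqualifiers, and the three-of-four threshold are assumptions made for this sketch; a real assessment would rely on process analysis, not booleans.

```python
# A minimal sketch of IBM's "RPA-ready" criteria as a screening checklist.
# Criteria names, disqualifiers, and threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TaskProfile:
    simple_and_repeatable: bool
    repetitive_low_skill: bool
    frees_humans_for_higher_value_work: bool
    improves_customer_or_worker_experience: bool
    needs_extra_data_capture: bool  # disqualifier from the article
    needs_process_redesign: bool    # disqualifier from the article

def rpa_ready(task: TaskProfile) -> bool:
    # A task can meet every criterion yet still be a poor RPA fit.
    if task.needs_extra_data_capture or task.needs_process_redesign:
        return False
    criteria = [
        task.simple_and_repeatable,
        task.repetitive_low_skill,
        task.frees_humans_for_higher_value_work,
        task.improves_customer_or_worker_experience,
    ]
    return sum(criteria) >= 3  # assumed threshold for this sketch
```

The point of the disqualifier check is exactly the article's caveat: meeting the criteria is necessary but not sufficient.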


Android security warning: One billion devices no longer getting updates


All of the phones in the tests were successfully infected by Joker (also known as Bread) malware. Every device tested was also successfully attacked using BlueFrag, a critical vulnerability in the Bluetooth component of Android. Which? said there should be greater transparency around how long updates for smart devices will be provided, so that consumers can make informed buying decisions, and that customers should get better information about their options once security updates are no longer available. The watchdog also said that smartphone makers have questions to answer about the environmental impact of phones that can only be supported for three years or less. Google told ZDNet: "We're dedicated to improving security for Android devices every day. We provide security updates with bug fixes and other protections every month, and continually work with hardware and carrier partners to ensure that Android users have a fast, safe experience with their devices." When operating systems and security updates are delivered varies depending on the device, manufacturer and mobile operator. Because smartphone makers tweak bits of the Android operating system, they often deploy patches and updates at a slower pace than Google does on its own devices, or not at all.


The Dark Side of Microservices

From a technical perspective, microservices are strictly more difficult than monoliths. From a human perspective, however, microservices can improve the efficiency of a large organization. They allow different teams within a large company to deploy software independently. This means that teams can move quickly without waiting for the lowest common denominator to get their code QA’d and ready for release. It also means that there’s less coordination overhead between engineers, teams, and divisions within a large software engineering organization. While microservices can make sense, the key point here is that they aren’t magic. Like nearly everything in computer science, there are tradeoffs; in this case, trading technical complexity for organizational efficiency. A reasonable choice, but you had better be sure you need that organizational efficiency for the technical challenges to be worth it. Yes, of course, most clocks on earth aren’t moving anywhere near the speed of light. Furthermore, several modern distributed systems rely on this fact by using extremely accurate atomic clocks to sidestep the consensus issue.


Essential things to know about container networking

Choosing the right approach to container networking depends largely on application needs, deployment type, use of orchestrators and underlying OS type. "Most popular container technology today is based on Docker and Kubernetes, which have pluggable networking subsystems using drivers," explains John Morello, vice president of product management, container and serverless security at cybersecurity technology provider Palo Alto Networks. "Based on your networking and deployment type, you would choose the most applicable driver for your environment to handle container-to-container or container-to-host communications." "The network solution must be able to meet the needs of the enterprise, scaling to potentially large numbers of containers, as well as managing ephemeral containers," Letourneau explains. The process of defining initial requirements, determining the options that meet those requirements, and then implementing the solution can be as important as choosing the right orchestration agent to provision and load balance the containers. "In today's world, going with a Kubernetes-based orchestrator is a pretty safe decision," Letourneau says.



Quote for the day:


"Leadership without mutual trust is a contradiction in terms." -- Warren Bennis


Daily Tech Digest - March 08, 2020

Navigating the compliance minefield with robotic process automation

RPA can assist with compliance by helping create more robust and effective compliance programs. From reducing the volume of legal issues to improving retention of employees and customers and streamlining business operations, RPA can help in many ways. It enables organisations to take greater control over their own operations and deal with compliance issues more easily if they arise. It also offers higher levels of compliance because, once a process is established as an automated workflow with RPA, it is executed in the same way every time without errors, regardless of whether the process concerns data transfer and migration, invoice processing, or purchase order issuing. This means that RPA empowers companies to establish unparalleled levels of process accuracy, especially compared to the work that can be done by human employees. Consequently, businesses can better maintain higher levels of compliance across all business processes.



For self-driving cars, winter is coming


For Richard Porter, director of technology and innovation at UK self-driving hub organization Zenzic, project CAVForth is evidence enough that the government will effectively meet its deadline for 2021. "We will have an automated bus service commercially carrying a large number of passengers," he told ZDNet. "It will be up and running by 2021, and it will be the main project through which we will be delivering on that deadline." "We interpreted the government's deadline as a commitment to prove by 2021 that the technology can actually start to deliver commercial services. Then, we can start delivering those services at a significant, visible scale." Is the smart car anticlimax simply due to misinterpretation of the government's commitments? Perhaps. But it is worth noting that the industrial strategy's vision does not include human safety operators monitoring the vehicle, and that none of CAVForth's buses will have such a degree of autonomy. Whether experts agree or not on the politics of the government's promise, there is one point that brings about consensus across the industry: even if connected car technology is looking like it will be ready to go by 2021, the UK, and other countries for that matter, is still a long way from having all the necessary frameworks to make sure autonomous cars can be deployed safely.


Microsoft Warns Of 'Devastating' Cybersecurity Threat To Windows Users

The critical message to digest from the Microsoft deep dive into this threat is that not all ransomware is the same. The automated, bot-driven worm-like ransomware that spits out across the interwebs like a cyber-blunderbuss is damaging enough, for sure. However, the Microsoft threat protection intelligence team is warning about the type of hands-on, human-operated, highly targeted threat that is more commonly associated with the credential-stealing and data exfiltration antics of nation-state actors. Indeed, there is a similarity beyond the targeting; some of these ransomware attack methodologies have evolved to exfiltrate as well as encrypt data. DoppelPaymer, which recently hit the headlines when I reported how Lockheed Martin, SpaceX and Tesla had all been caught in the crossfire of one cyber-attack on a business in their supply chains, is an excellent example of the breed. More of that in a moment, though. First, let's look at the attack tactics and techniques Microsoft is alerting users to.


SLIDE algorithm for training deep neural nets faster on CPUs than GPUs


SLIDE doesn’t need GPUs because it takes a fundamentally different approach to deep learning. The standard “back-propagation” training technique for deep neural networks requires matrix multiplication, an ideal workload for GPUs. With SLIDE, Shrivastava, Chen and Medini turned neural network training into a search problem that could instead be solved with hash tables. This radically reduces the computational overhead for SLIDE compared to back-propagation training. For example, a top-of-the-line GPU platform like the ones Amazon, Google and others offer for cloud-based deep learning services has eight Tesla V100s and costs about $100,000, Shrivastava said. ... Deep learning networks were inspired by biology, and their central feature, artificial neurons, are small pieces of computer code that can learn to perform a specific task. A deep learning network can contain millions or even billions of artificial neurons, and working together they can learn to make human-level, expert decisions simply by studying large amounts of data.
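A toy version of the underlying idea looks like this: hash every neuron's weight vector into buckets with locality-sensitive hashing (here, random hyperplanes), then for each input compute activations only for the neurons in the input's bucket. This is a simplified illustration of the concept, not SLIDE's actual algorithm or code.

```python
# Toy sketch of SLIDE's core idea: replace a dense matrix multiply with a
# hash-table lookup that selects only the neurons likely to activate.

import random
from collections import defaultdict

random.seed(0)
DIM, NEURONS, BITS = 16, 1000, 8

# Random-hyperplane (SimHash) LSH: the hash is the sign pattern of dot products.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def lsh(vec):
    return tuple(int(sum(p * v for p, v in zip(plane, vec)) > 0)
                 for plane in planes)

# Hash every neuron's weight vector into a bucket table once, up front.
weights = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NEURONS)]
buckets = defaultdict(list)
for idx, w in enumerate(weights):
    buckets[lsh(w)].append(idx)

# For a given input, only the neurons in its bucket are "active".
x = [random.gauss(0, 1) for _ in range(DIM)]
active = buckets[lsh(x)]
print(f"touched {len(active)} of {NEURONS} neurons")
```

With 8 hash bits there are up to 256 buckets, so each input touches only a small fraction of the 1,000 neurons, which is the source of the computational savings on a CPU.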


Why is agile so much more successful than waterfall?

The reasons why waterfall methodology is not as successful as agile seem clear. But the underlying causes are not necessarily down to reckless approaches to managing the software development project. The waterfall approach does not arrogantly dismiss early and frequent integration testing. Everyone would love to be able to detect significant risks as early as possible. The issue is the inability to integrate services and components that are not ready for testing (yet). As we progress on a project, we prefer to utilize a divide-and-conquer approach. Instead of doing the development and building sequentially (one thing at a time), we naturally prefer to save time by doing as many things as possible in parallel. So, we split the teams into smaller units that specialize in performing dedicated tasks. As those specialized teams work, they become aware of, or discover, various dependencies. However, as Michael Nygard says in Architecture without an end state: "The problem with dependencies is that you can't depend on them." So the project starts slowing down as it gets bogged down by dependencies that are not available for integration testing.


The state of container-based cloud development

Enter containers, with new challenges and opportunities regarding state retention. In the world of containers, we are taught to be stateless. In container design, including courses I’ve taught, the idea is that a container emerges as an instance, does what it’s programmed to do, and goes away without maintaining state. If it does work on data from some external source, it’s handed the data by another process or service, and returns the data to another process before being removed from memory. Still, no state is maintained. The core issue is that containers, as originally invented, simply could not save state information. There was no notion of persistent storage, so maintaining state was impossible. We were taught early on that containers were for operations that did not require state retention. Some people still argue the need for statelessness when building container-based applications, contending that it’s the cleanest approach and that thinking stateful means thinking in outmoded ways. However, that may not be acceptable to most enterprise developers who are using containers. Traditional applications are not purpose-designed and built for containers.


Take advantage of these 5 benefits of server-side rendering


A key benefit of server-side processing is that it doesn't offload data processing to the client. Instead, the browser does what it's designed to do best, which is rendering static HTML. This removes the variability of the user's device processing power from the equation, and server-side processing performance becomes more predictable. Single-page applications and responsive web apps that rely heavily on client-side rendering significantly minimize the number of round trips to the server, because most of the state management and page transitions happen on the client. Unfortunately, when a page relies heavily on client-side state management, the server is no longer informed as the end user moves from page to page, clicks on buttons or otherwise interacts with the site. This means key metrics such as time on page, exit page counts and bounce rate are either impossible to collect, or are calculated incorrectly.
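A minimal sketch of the idea, using only Python's standard library: the server assembles the finished HTML, so every page view is a request the server sees and can log. The page structure and data are invented for the example.

```python
# Minimal server-side rendering sketch: the server builds the final HTML
# string, and the browser only has to paint static markup.

from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = "<html><body><h1>Orders</h1><ul>{items}</ul></body></html>"

def render(orders):
    """Assemble the complete page on the server."""
    items = "".join(f"<li>{o}</li>" for o in orders)
    return PAGE.format(items=items)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every navigation hits the server, so metrics like exit pages and
        # bounce rate can be computed from the access log.
        html = render(["widget", "gadget"])
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(html.encode())

# To actually serve:
# HTTPServer(("localhost", 8000), Handler).serve_forever()
```

A client-rendered app would instead ship a JSON payload plus JavaScript, and the server would never hear about subsequent page transitions.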


Disruptive Defenses Are The Key To Preventing Data Breaches

Is this the new normal? Can there be any expectation of security and privacy when even the most stringent of data privacy regulations appear to have little effect? Companies, government agencies and consumers must change their behavior if they expect to stem this tide. They must adopt disruptive defenses to make it extremely hard for attackers to compromise data. What is a disruptive defense? It is an uncommon defense, based on existing industry standards, that raises application security to higher levels than what is currently used by most applications. There are six disruptive defenses that, when deployed, create significant barriers to attackers. ... Cryptography represents the last bastion of defense when protecting sensitive data. As such, cryptographic keys are the only objects standing between an attacker and a major headache for your company. While convenient, keys stored in files are protected only by passwords and are subject to the same attacks that compromise user passwords. By using cryptographic hardware -- present in all modern systems -- applications create major barriers to attacks.


Build Great Native CLI Apps in Java with GraalVM and Picocli


Picocli is a modern library and framework for building command line applications on the JVM. It supports Java, Groovy, Kotlin and Scala. It is less than 3 years old but has become quite popular with over 500,000 downloads per month. The Groovy language uses picocli to implement its CliBuilder DSL. Picocli aims to be "the easiest way to create rich command line applications that can run on and off the JVM". It offers colored output, TAB autocompletion, subcommands, and some unique features compared to other JVM CLI libraries such as negatable options, repeating composite argument groups, repeating subcommands and sophisticated handling of quoted arguments. Its source code is in a single file so it can optionally be included as source to avoid adding a dependency. Picocli prides itself on its extensive and meticulous documentation. Picocli uses reflection, so it is vulnerable to GraalVM’s Java native image limitations, but it offers an annotation processor that generates the configuration files that address this limitation at compile time. ... We can make our application more user-friendly by using colors on supported platforms. This doesn’t just look good, it also reduces the cognitive load on the user: the contrast makes the important information like commands, options, and parameters stand out from the surrounding text. The usage help message generated by a picocli-based application uses colors by default.


AT&T, Palo Alto Networks and Broadcom develop virtual firewall framework

The framework is an expansion to the Distributed Disaggregated Chassis (DDC) white box architecture that AT&T submitted to the Open Compute Project last September. The expansion delivers a dynamically programmable fabric with embedded security at the edge of the network, AT&T said. Specifically, the framework embeds AI and machine learning in the network fabric to prevent attacks. "Security has always been at the forefront of AT&T's network initiatives," said Michael Satterlee, VP of network infrastructure and services for AT&T. "Traditionally, we have had to rely on centralized security platforms or co-located appliances which are either not directly in the path of the network or are not cost effective to meet the scaling requirements of a carrier. This new design embeds security on the fabric of our network edge that allows control, visibility and advanced threat protection." AT&T said the framework -- which uses an open hardware and software design to support flexible deployment models -- also represents its white box approach to network design and deployment.



Quote for the day:


"Superlative leaders are fully equipped to deliver in destiny; they locate eternally assigned destines." -- Anyaele Sam Chiyson


Daily Tech Digest - March 07, 2020

Hybrid-cloud management requires new tools, skills

A complex hybrid cloud requires constant oversight as well as a way to intuitively and effectively manage an array of operations, including network performance, workload management, security and cost control. Not surprisingly, given the large number of management tasks needed to run an efficient and reliable hybrid cloud environment, adopters can select from a rapidly growing array of management tools. "There’s a dizzying array of options from vendors, and it can be difficult to sort through them all," says R. Leigh Henning, principal network architect for data center operator Markley Group. "Vendors don’t always do the best job at making their differentiators clear, and a lot of time and effort is wasted as a result of this confusion. Companies are getting bogged down in an opaque field of choices." The current hybrid cloud management market is both immature and evolving, declares Paul Miller, vice president of hybrid cloud at Hewlett Packard Enterprise. Vendors are still getting a handle on the types of management tools their customers need. "Offerings are limited and may not be supported across all public, on-premises and edges," Miller adds.



The tech foundations supporting financial services in Asia

“We measure our success based on our contribution to the overall strategy of the organisation,” Angelin-Linker explained. “This is achieved either through the introduction of automation, digitalisation or simply by providing effective solutions to allow our customers and staff to be productive. “Consumers will have a very different expectation on how they do their banking and how they want to get access to their financial information. “Business initiatives will be centred around mobility, flexibility and accessibility of services so our focus will be on how we can provide the information our customers want quickly and securely to help them make right decisions.” Angelin-Linker also emphasised the importance of investing in process improvements to deliver significant productivity improvements within the business, a task which will require a great deal of prioritisation.


3 microservices resiliency patterns for better reliability


While the retry pattern works for transient failures, teams still need a reliable microservices resiliency pattern that handles larger, long-term, permanent faults. If a retry mechanism accidentally invokes a severely damaged service several times until it gets the desired result, it could result in cascading service failures that become increasingly difficult to identify and fix. The circuit breaker pattern creates a component that resembles a traditional electric circuit breaker. This component sits between requesting services and the services' endpoints. As long as these services communicate normally, the circuit breaker delegates messages between them in a closed state. When a retried service request travelling through the closed circuit fails a predetermined number of times, the breaker opens the message circuit to halt service execution. During this open state, the breaker stops service execution and returns error messages to the requesting service for each failed transaction.
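The closed/open state machine described above is straightforward to sketch. This is a toy illustration, not a production implementation (names and thresholds are arbitrary): after a fixed number of consecutive failures the breaker opens and fails fast, and after a timeout it allows one trial call through.

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: closed -> open after `max_failures`
    consecutive failures; while open, calls fail fast without reaching
    the damaged service."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None   # None means the circuit is closed

    def call(self, func, *args):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None   # half-open: allow one trial call
        try:
            result = func(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()   # trip the breaker
            raise
        self.failures = 0   # a success closes the circuit again
        return result

breaker = CircuitBreaker(max_failures=2, reset_timeout=60.0)

def flaky():
    raise TimeoutError("service down")

for _ in range(2):              # two consecutive failures trip the breaker
    try:
        breaker.call(flaky)
    except TimeoutError:
        pass

try:
    breaker.call(flaky)         # now fails fast, sparing the damaged service
except RuntimeError as exc:
    print(exc)
```

The key design point is that once the breaker is open, the failing service gets breathing room to recover instead of being hammered by retries, which is what prevents the cascading failures described above.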


Building intelligent school security systems of tomorrow

While greater application of video-based tools for improving school and campus security is a positive sign, it is only the tip of the iceberg. Administrations are just beginning to scratch the surface of their video systems’ capabilities beyond security monitoring. When integrated with video analytics, schools can use their security cameras for proactive crime prevention and smarter operational planning. The data that video solutions yield is invaluable and can substantially improve public safety and the overall campus experience for staff and students. From a perimeter security perspective, cameras with analytics, such as facial, object and motion detection, act as a force multiplier for threat detection. For example, individuals who are not permitted to enter campus, such as known sex offenders or criminally wanted persons, can be identified just by walking into the view of a camera, in which case security personnel and police can be immediately notified. Security directors can track suspicious bags or other objects, dispatching an officer to the scene for further investigation.


Forget foldable phones. Large rollable displays are the way to go


The mockup had no electronics, and the display was little more than a thin, flimsy-feeling plastic sheet printed with what the display would look like. There were multiple instances when I had trouble prying the thing open. And the entire time, I was fearful it would break. TCL said it has a working prototype, including a mechanism that automatically opens and closes the phone. We saw footage of the device and the moving screen. It moved slower than we would've liked, and still looked rough as a prototype. But the idea is enough to get me excited, and aside from the wow factor, there are a few reasons why. Much of the attention around foldables has been focused on the hinge and getting the display to fold down completely. That's why the Mate X folds outward, and why the Galaxy Fold has an unsightly gap in the middle. The clamshell Galaxy Z Flip and Motorola Razr use different hinges to minimize the actual turn radius, but they don't actually fold flat. A scrollable phone would avoid that issue. Because it would roll out, there'd be no need for it to fold completely shut or for a fancy hinge to get around the crease issue. There wouldn't be creases.


It’s time to stop calling every firm that uses technology a tech company


It’s important to draw a distinction between what a tech company is and isn’t not only because mislabeling could portend another stock market bubble, but because management’s attention, like other resources, is limited. A tech-bedazzled management is liable to spend too much time, money, and energy on the underlying technology, or the touting of it, rather than on what will ultimately determine whether the company can grow, scale, and prosper. So how do we decide which companies get to be called tech companies and which are using the term as sleight of hand? Defining a company begins first by looking at what it produces and sells. Second, going deeper, it means asking not just what the company sells, but what the customer buys. Finally, defining the company means asking who the competitors are and why customers choose one company over another. The first element is what a company sells; the second is what customers want; the third is what they want from that specific company. The sum of those answers will tell you what that company is and what it should be designed to be.


Enterprises being won over by speed, effectiveness of network automation

Network automation is designed to streamline the maintenance of physical and virtual network devices. Enterprises are looking to reduce their dependency on manual methods, and automation can simplify repetitive IT processes, improve consistency across branches and geographies, lower operational costs, and reduce human errors. Enterprises are deploying automation technologies in various types of networks, across data centers, wide area networks (WAN) and cloud environments. Major players in the market include traditional network vendors such as Cisco Systems and VMware; IT management players including SolarWinds, Forward Networks and Micro Focus; and automation specialists and startups such as AppViewX and NetBrain Technologies. It's a burgeoning field: MarketsandMarkets Research reports that the global network automation market is on track to grow from $2.3 billion in 2017 to an estimated $16.9 billion by 2022. "It’s a really exciting topic in the networking industry right now because the scale and complexity of networks is really greater than it ever was before," says Brandon Butler, senior research analyst covering enterprise networks at IDC, a Framingham, Mass.-based industry analyst firm.


Is IT the Good Guy or Bad Guy in Upskilling?

How can CIOs lead change and drive digital training to close the skills gap across the entire organization? Instead of compartmentalizing learning to only technical skills, CIOs should lead by example to enable a culture of workforce-led innovation, an approach built around leadership and employee crowdsourcing. They are in the prime position to demonstrate different ways of applying technology in real business scenarios. Consider data: In HR, mixed data sources perform predictive data analytics to identify future skills, and challenges related to both old and new data warehousing pop up. Data governance becomes hypercritical, and IT is distinctively positioned to support this function. As for the hub: The need for a virtual lab environment is logical to support citizen-led digital innovation. Who has the best timesheet bots? Who wrote the best demand pipeline visualization? The trick to scaling all of this is interoperability. And IT is especially primed to address this, since it has been making workforce technology interoperable dating back to when PCs entered the workplace in large numbers.


How do software developers and architects work together?


Developers should ask lots of questions when they work with architects, Holnes said. Understand why the architect makes certain decisions, and repeat back information to ensure a shared understanding. Developers and architects collaborate best when the thought process is public. Context matters. For example, the architect announces that the team will build a web app feature for online shoppers with Python, even though two developers are experts in Go. Go would ensure high performance, but the whole team doesn't use it yet, and the retailer they work for wants the capability ready for a major product launch. In this case, Python is the right choice to balance benefits and constraints. On another project, however, the architect might ask developers to choose the language. Just as code-savvy architects can benefit a project, architecture-savvy developers are an asset.


How agile teams can support incident management

During an incident, software developers should aid in fixing the issue and restoring service in minimal time. Once the developers are called in, the assumption must be that operational engineers have already reviewed and possibly ruled out infrastructure-related concerns, and that site reliability engineers have already explored a list of common problems with the application. When there is a major incident, incident managers will often set up bridge calls, chat sessions, and physical war rooms to assemble a multidisciplinary team to work through the problem collaboratively. Developers who are called in should know and follow the incident response and communications protocols established for these war rooms. In the war room, developers should be application experts. After reviewing monitors, log files, and other alerts, they should make recommendations on courses of action. It’s essential to use specific language and separate fact from speculation.



Quote for the day:


"Leadership is a dynamic process that expresses our skill, our aspirations, and our essence as human beings." -- Catherine Robinson-Walker


Daily Tech Digest - March 05, 2020

CISO Imperatives in the Age of Digital Transformation

With the proliferation of open source, enterprises need not only to secure commercial software, but also to invest in securing open source software. Every member in a connected ecosystem -- from vendors, service providers and practitioners to end consumers -- needs to be secure. Any weak link can put the entire ecosystem at risk. Open source usage is increasingly seen in categories like cloud management, security, analytics and storage, which have historically been dominated by proprietary products. Some of the key emerging open source technologies are open source firewalls, instantaneous serverless workloads, trustworthy AI, blockchain, quantum computing, etc. Fueled by open methodologies and peer production, employees from enterprises are contributing to open source communities and collaborating better, thus forcing management to rethink their strategies. 5G, the next generation of wireless technology, will enable enhanced speed and performance, lower latency and better efficiency. It is expected to be broadly used for IoT communications and video, while controls/automation, fixed wireless access, high-performance edge analytics and location tracking are the second-tier uses for 5G-capable networks.



Verizon: Companies will sacrifice mobile security for profitability, convenience

"For a number of reasons, mobile today is a smaller issue than many others," Zumerle said via email. "Among other factors, the operating system is more hardened, and mobile devices have less access to critical enterprise infrastructure and data." The Verizon report found that 39% of organizations admitted to suffering a security compromise involving a mobile device — up from 33% in the 2019 report and 27% in 2018. Of those that suffered a compromise, 66% said the impact was major and 36% said it had lasting repercussions. Twenty-percent of organizations that suffered a mobile compromise said a rogue or insecure Wi-Fi hotspot was involved. "Although the risks of public Wi-Fi are becoming well known, convenience trumps policy – even common sense — for many users. Some organizations are trying to prevent this by implementing Wi-Fi-specific policies, but inevitably, rules will be broken," Verizon said. According to MobileIron, 7% of protected devices detected a man-in-the-middle (MitM) attack in the past year.


Report: Most IoT transactions are not secure

Zscaler is a bit generous in what it defines as enterprise IoT, counting devices such as data-collection terminals, digital signage media players, industrial control devices and medical devices, as well as decidedly non-business devices like digital home assistants, TV set-top boxes, IP cameras, smart home devices, smart TVs, smart watches and even automotive multimedia systems. “What this tells us is that employees inside the office might be checking their nanny cam over the corporate network. Or using their Apple Watch to look at email. Or working from home, connected to the enterprise network, and periodically checking the home security system or accessing media devices,” the company said in its report. Which is typical, to be honest, and let (s)he who is without sin cast the first stone in that regard. What’s troubling is that roughly 83% of IoT-based transactions are happening over plaintext channels, while only 17% are using SSL. The use of plaintext is risky, opening traffic to packet sniffing, eavesdropping, man-in-the-middle attacks and other exploits. And there are a lot of exploits.


Envision The Future To Unlock Business Value

While we were busy applying service packs and working out how to prevent “dumb users” from getting themselves into trouble at work, those same people were beginning to enjoy the spoils of the 21st century. Armed increasingly with high-speed domestic and even mobile broadband, as well as a wide range of tactile consumer tech devices, they were gradually starting to enjoy a dizzying array of consumer services that were transforming their daily lives. From building stronger relationships with friends and family through social networking, through to the transformation of their retail and lifestyle habits, for the first time ever, normal, everyday people (not just nerds like me and my colleagues) were beginning to enjoy the opportunity of a world where technology is something that lifts our capability, helping us to achieve more in all aspects of our lives. Slowly, the centre of gravity of people’s use of technology shifted from the world of work to their personal lives to the point where, certainly by the end of the last decade, most people had access to better technology in their domestic lives than they did at work.


5 big microservices pitfalls to avoid during migration


Rushing into microservices adoption is one of the most common mistakes software teams make. Even though microservices provide a chance to deploy new applications and updates quickly, the distributed architecture's inherent complexity means it's not ideal for certain types of organizations or applications. Teams should review the state of their existing development culture to see if management skills are in place. They should also examine existing applications to determine whether they are suitable and ready for a migration to microservices. Agile and DevOps principles should be in place, as microservices tend not to play well with a Waterfall development approach. Teams also need diligent training and access to documentation before they begin a migration of monolith-based workloads. Performance issues soon arise when a microservices migration starts without a proper plan and appropriate infrastructure investments in place. Teams can mitigate these issues if they ensure services are strictly independent from each other but can still communicate normally, as is the target for a loosely coupled architecture.


AI, Azure and the future of healthcare with Dr. Peter Lee

What’s interesting about AI for Health is that it’s the first pillar in the AI for Good program that actually overlaps with a business at Microsoft and that’s Microsoft Healthcare. One way that I think about it is, it’s an outlet for researchers to think about, what could AI do to advance medicine? When you talk to a lot of researchers in computer science departments, or across Microsoft research labs, increasingly you’ll see more and more of them getting interested in healthcare and medicine and the first things that they tend to think about, if they’re new to the field, are diagnostic and therapeutic applications. Can we come up with something that will detect ovarian cancer earlier? Can we come up with new imaging techniques that will help radiologists do a better job? Those sorts of diagnostic and therapeutic applications, I think, are incredibly important for the world, but they are not Microsoft businesses. So the AI for Health program can provide an outlet for those types of research passions. And then there are also, as a secondary element, four billion people on this planet today that have no reasonable access to healthcare.


Why Unsupervised Machine Learning is the Future of Cybersecurity


There are two types of unsupervised learning: discriminative models and generative models. A discriminative model is only capable of telling you that if you give it X, then the consequence is Y, whereas a generative model can tell you the total probability that you’re going to see X and Y at the same time. So the difference is as follows: the discriminative model assigns labels to inputs, and has no predictive capability. If you gave it a different X that it has never seen before, it can’t tell what the Y is going to be because it simply hasn’t learned that. With generative models, once you set one up and find the baseline, you can give it any input and ask it for an answer. Thus, it has predictive ability – for example, it can generate a possible network behavior that has never been seen before. So let’s say some person sends a 30 megabyte file at noon; what is the probability that he would do that? If you asked a discriminative model whether this is normal, it would check to see if the person had ever sent such a file at noon before… but only specifically at noon. A generative model would look at the context of the situation and check whether they had ever sent a file like that at 11:59 a.m. or 12:30 p.m. too, basing its conclusions on the surrounding circumstances in order to be more accurate with its predictions.
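The joint-versus-conditional distinction above can be shown with a toy count-based example. This is an illustration only (the data is made up, and real generative models fit smoothed distributions rather than raw counts): each observation is an (hour, file size in MB) pair for one user's transfers.

```python
from collections import Counter

# Fabricated observations: (hour of day, file size in MB) per transfer.
observations = [(11, 30), (12, 30), (12, 30), (12, 1), (13, 30), (12, 30)]

joint = Counter(observations)                  # basis for P(hour, size)
by_hour = Counter(h for h, _ in observations)  # basis for P(size | hour)

def p_joint(hour, size):
    """Generative view: total probability of seeing this hour AND size."""
    return joint[(hour, size)] / len(observations)

def p_size_given_hour(hour, size):
    """Discriminative view: probability of the size, given exactly this hour."""
    return joint[(hour, size)] / by_hour[hour] if by_hour[hour] else 0.0

print(p_joint(12, 30))             # 3/6 = 0.5
print(p_size_given_hour(12, 30))   # 3/4 = 0.75
```

The conditional view can only answer for hours it has seen, one slice at a time; the joint distribution scores any (hour, size) pair against the full baseline, which is the property a fitted generative model exploits to flag behavior that is unusual in context rather than merely unusual at one exact timestamp.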


Advanced Tech Needs More Ethical Consideration & Security

The recent confrontation between the US and Iran is a case in point. Threats of cyber warfare along with conventional military action put security executives at every major organization on high alert and questioning what to do in the event of a breach. There are worries of vulnerabilities to the infrastructure and that attackers could be impossible to identify. Very few organizations are fully prepared to respond to an incident at an enterprise or organizational level. An effective response to a major cyber incident requires current, effective IT-focused cyber plans, but also participation from all lines of business and operational support areas to ensure a successful integrated, orchestrated recovery. The benefits of advanced technologies to industry and commerce are manifold. In healthcare, robotic surgeries improve recovery rates and reduce days spent in the hospital. AI and machine learning boost productivity in the data-dependent financial services industry, increasing analytical efficiency while reducing manual work and human errors. The same goes for most industries. 


IoT-specific regulations aren’t the only ones that can have an impact on the marketplace. Depending on the type of information a given device handles, it could be subject to the growing list of data-privacy laws being implemented around the world, most notably Europe’s General Data Protection Regulation, as well as industry-specific regulations in the U.S. and elsewhere. The U.S. Food and Drug Administration, noted Maxim, has been particularly active in trying to address device-security flaws. For example, last year it issued security warnings about 11 vulnerabilities that could compromise medical IoT devices that had been discovered by IoT security vendor Armis. In other cases it issued fines against healthcare providers. But there’s a broader issue with devising definitive regulation for IoT devices in general, as opposed to prescriptive ones that simply urge manufacturers to adopt best practices, he said. Particular companies might have integrated security frameworks covering their vertically integrated products – such as an industrial IoT company providing security across factory floor sensors – but that kind of security is incomplete in the multi-vendor world of IoT.



Intel CSME bug is worse than previously thought

At the time, the CVE-2019-0090 vulnerability was only described as a firmware bug that allowed an attacker with physical access to the CPU to escalate privileges and execute code from within the CSME. Other Intel technologies, like Intel TXE (Trusted Execution Engine) and SPS (Server Platform Services), were also listed as impacted. But in new research published today, Ermolov says the bug can be exploited to recover the Chipset Key, which is the root cryptographic key that can grant an attacker access to everything on a device. Furthermore, Ermolov says that this bug can also be exploited via "local access" -- by malware on a device, and not necessarily by having physical access to a system. The malware will need to have OS-level (root privileges) or BIOS-level code execution access, but this type of malware has been seen before and is likely not a hurdle for determined and skilled attackers that are smart enough to know to target the CSME.



Quote for the day:


"The problem with being a leader is that you're never sure if you're being followed or chased." -- Claire A. Murray


Daily Tech Digest - March 04, 2020

A Cyber View Of Smart Cities

No single cybersecurity solution on the market today provides automated remediation, and while options such as SOAR attempt to orchestrate responses, the reality is that most are simple isolation and reactive patching routines. While cyber vendors tout machine learning and AI systems, those efforts are focused on cleaning out noise from incoming information and attempting to find anomalies. None provides any level of remediation that does not require a human to directly run that effort. Not only are these cybersecurity tools not providing automated remediation, but they are also architected in such a way that they disrupt when they make changes and are unable to move into a full remediation capability down the road. For modern cybersecurity, smart cities are a zero-sum game that will never reach the levels of protection that will be required. The final insult is the future wherein AI, already much faster than humans, will be used to attack these already improperly protected smart cities. 



The platform has been tested with private developers and startups in the US and in France, Joubert said. So far, the feedback has been good, with two suggested areas of improvement, he said. Testers said they want to see enhanced coverage so the platform can generate more specific unit tests, and they want to see an increase in the number of languages Ponicode supports, according to Joubert. "We're trying to make it very smooth and integrated for developers," he said. "It's really, really important that the developer keeps control." Generating unit tests is complex because developers need to first understand the function and what the intention is inside the code. Then they have to generate a test case and give some values to tell the function what to do, he said. The third task is generating specific values to test properly. "We created an algorithm that trains the AI to generate unit tests," Joubert said. With Ponicode, developers can run their app in VS Code because the platform will understand how it can be tested; choose easily among the suggestions generated by the platform; and increase coverage in a click, without writing a single line of code, he said.
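The three tasks Joubert describes -- understand the function, generate inputs, and attach concrete values as assertions -- can be sketched very roughly. This is a naive, hypothetical illustration, not Ponicode's algorithm (which is AI-driven and far more sophisticated): it inspects a function's signature, feeds it random values, and records the observed outputs as assert statements.

```python
import inspect
import random

def generate_tests(func, n=3, seed=0):
    """Naive test generation: inspect the signature, generate candidate
    inputs, run the function, and emit its observed outputs as assertions."""
    rng = random.Random(seed)
    params = list(inspect.signature(func).parameters)
    lines = []
    for _ in range(n):
        args = [rng.randint(-10, 10) for _ in params]
        result = func(*args)   # observed behaviour becomes the oracle
        call = f"{func.__name__}({', '.join(map(str, args))})"
        lines.append(f"assert {call} == {result!r}")
    return "\n".join(lines)

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

suite = generate_tests(clamp)
print(suite)    # three generated assert statements
exec(suite)     # the generated tests pass against the function itself
```

Even this toy version shows why "the developer keeps control" matters: generated assertions lock in observed behaviour, so a human still has to judge whether that behaviour is actually the intended one.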


The Missing Piece In Quantum Computing And IoT

Using the key principles of quantum computing mentioned earlier, we can create quantum key distribution, the most secure way to encrypt and decrypt information – and thereby send messages securely – that has been developed to date. This is true for several reasons. For one, quantum cryptology such as this utilises a property of quantum physics called entanglement. Maria Korolov explains this process as when ‘two particles become entangled so that they have the same state, and then one of these particles is sent to someone else. When the recipient looks at the particle, it’s guaranteed to be the same state as its twin…the state of the two entangled particles, while identical, is also random.’ As such, entanglement allows you to send an encryption key in the form of two ‘identical, random particles’, which can be used to send messages using symmetric encryption. This method doesn’t require a means of transmission and, as such, it becomes more difficult for information to leak. Encryption is therefore made considerably stronger.
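The workflow Korolov describes -- both parties end up holding identical, random bits, which then drive a symmetric cipher -- can be mimicked classically. To be clear, this toy contains no quantum physics at all: the "entangled" pair is just one random byte string copied twice, and XOR stands in for the symmetric encryption step.

```python
import secrets

def entangled_key(n_bytes: int):
    """Stand-in for entanglement: both 'particles', when measured,
    yield the same random value."""
    shared = secrets.token_bytes(n_bytes)
    return shared, shared   # Alice's copy and Bob's copy are identical

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style symmetric cipher; the same call decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
alice_key, bob_key = entangled_key(len(message))

ciphertext = xor_cipher(message, alice_key)   # Alice encrypts
recovered = xor_cipher(ciphertext, bob_key)   # Bob decrypts with his copy
assert recovered == message
```

What the classical simulation cannot capture is the interesting part: in real quantum key distribution, any eavesdropper measuring the particles in transit disturbs their state and is thereby detectable, which is why the scheme resists key interception in a way no classical channel can.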



Cryptocurrency Bourses Win India Case Against Central Bank Curbs


A three-judge bench headed by Justice Rohinton F. Nariman agreed with petitions by cryptocurrency exchanges, startups and industry bodies that had challenged the Reserve Bank of India’s April 2018 decision to ban banks from offering any services to support digital currencies. The court struck down the RBI’s curbs on Wednesday. The ruling is an opportunity for virtual currency investors and businesses in India to push against stricter rules being planned by a skeptical government, and potentially raises hope for projects such as Facebook Inc.’s Libra cryptocurrency. The Supreme Court is separately hearing another case, in which it will decide on regulations for digital currencies, and Wednesday’s judgment weakens the case for strict norms. “Cryptocurrencies are an exciting technology that needs to be carefully studied,” said Vaibhav Kakkar, a partner at law firm L&L Partners. “With this order, there is a likelihood of more mature and balanced regulation of cryptocurrencies and the fintech sector as a whole.”


What is the difference between LoRa and LoRaWAN?


LoRa, or Long Range, is a proprietary, low-power and long-range wireless technology that uses license-free wireless spectrum -- much like Wi-Fi uses the unlicensed 2.4 GHz and 5 GHz frequencies. The exact frequency LoRa uses depends on the physical location of a deployment. For example, LoRa uses the 915 MHz band in North America and the 868 MHz band in Europe. Thus, it's important to know which frequencies can be legally used in each LoRa deployment location. From a range perspective, LoRa can communicate up to 10 km away under optimal, line-of-sight conditions. ... LoRaWAN is an open, cloud-based protocol -- designed and maintained by the LoRa Alliance -- that enables devices to communicate wirelessly with LoRa. Essentially, LoRaWAN takes LoRa wireless technology and adds a networking component to it, while also incorporating node authentication and data encryption for security. From an enterprise IT deployment perspective, LoRaWAN networks are ideal for IoT devices that continuously monitor the status of something and then trigger alerts back to gateways when the monitored data surpasses a specified threshold.
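The monitor-and-alert pattern described above can be sketched in a few lines. This is an illustrative node-side sketch only (the threshold value and sensor readings are hypothetical, and the actual LoRaWAN transmission would go through a device stack and gateway, not shown here):

```python
ALERT_THRESHOLD_C = 30.0  # hypothetical temperature threshold for this node

def should_alert(reading_c: float, threshold: float = ALERT_THRESHOLD_C) -> bool:
    """Return True when a monitored value surpasses the configured threshold."""
    return reading_c > threshold

# A LoRaWAN node reads its sensor periodically but only transmits to the
# gateway when the threshold is crossed, conserving battery and airtime.
readings = [24.5, 27.1, 31.8, 29.9]
alerts = [r for r in readings if should_alert(r)]  # only 31.8 triggers an uplink
```

Transmitting only on threshold crossings, rather than streaming every reading, is what makes this class of device a good fit for LoRaWAN's low-bandwidth, low-power design.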


'Malware-free' attacks now most popular tactic amongst cybercriminals


The increasing popularity of malware-free attacks underscored the need for organisations not to rely solely on antivirus tools, said CrowdStrike. The security vendor defined malware-free attacks as those in which files or file fragments are not written to disk. These could be attacks where code is executed from memory or where stolen credentials are used to enable remote logins. It added that malware-free attacks typically require various detection techniques to identify and intercept, such as behavioural detection and human threat hunting. The 2020 threat report also saw more incidents of ransomware and ransom demands from cybercriminals who, increasingly, conducted data exfiltration, which enabled them to exploit sensitive data that was proprietary information or potentially embarrassing for victims. In addition, nation-state adversaries last year targeted a range of industries, but were especially interested in the telecommunications sector, which saw increased attack frequency from nations such as China and North Korea, noted CrowdStrike. State actors from China, in particular, were keen to target the industry in a bid to steal intellectual property and competitive intelligence, said the US security vendor.


How IT Leaders Can Attract and Retain the Right Talent

Beyond looking to recent graduates, consider untapped pools of talent to diversify your workforce. While often overlooked because of “lack of relevant technical experience,” veterans offer skills that could greatly impact your existing teams, including strong leadership, productivity and decision-making capabilities. We can look to companies like Salesforce for inspiration: Its veteran program Vetforce connects the military community with open IT positions. Another pool of talent often left behind is those who have taken time off and want to restart their careers, including parents with new children or those who had to care for a loved one in a time of need. For example, we partnered with Path Forward to offer returnship programs. These programs help professionals with five or more years of work experience, and who have been out of the paid workforce for a minimum of two years, to bridge their transition back into the workforce. We have found excellent, talented employees through this channel. Once you have a candidate in mind, ask the right interview questions to determine their potential fit on your team.


Could Crypto Exchanges, Wallets Be Targeted With Banking Trojans?


Using Remote Access Trojans (RATs), hackers can reportedly bypass security infrastructure on smartphones, enabling cybercriminals to carry out transactions directly from the infected mobile devices. According to the report, hackers are already using banking trojans like Hydra and Gustuff to attack crypto exchanges and wallets. Using Hydra’s screencast capabilities, cybercriminals can remotely monitor real-time activities on the infected mobile devices. Hydra also allows hackers to clone the infected device, providing access to stored financial information. As part of its report, ThreatFabric revealed that rogue actors are using Hydra to hack crypto wallets on platforms like Binance, Bitfinex, and Coinbase, among others. With Gustuff, hackers have access to keylogging and browser overlay attack vectors, allowing rogue actors to trick victims into entering their financial details on fake websites that closely resemble their real banking or crypto exchange platforms. According to ThreatFabric, Gustuff’s list of potential targets is also expanding to include crypto wallets like Electrum, Blockchain.com, and Xapo.



AI for Payment Optimization: Current Practices and Use Cases

Fraud is a major problem in the financial world, as detecting it slows down payment processing. Furthermore, it can be difficult to detect using standard methods in accounts with a large number of payments per day. A good example of how AI is used in fraud detection comes from VISA, one of the largest digital payment processors in the world. The company has been using AI systems for the last 25 years, allowing them to improve and learn as the technology has matured. Its artificial intelligence system for payment authorization and fraud detection learns user behavior and understands patterns, so whenever an activity does not match a user’s profile, it is flagged as suspicious. Once a transaction is considered suspicious, VISA’s AI connects with the bank that issued the card to let them know about the situation. From there, the bank will either block the transaction (based on the risk assessment made by VISA) or send a text message asking the account owner to confirm that they initiated the transaction.
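The core idea of flagging activity that falls outside a user's profile can be illustrated with a very simple statistical check. This is a minimal sketch of the general technique, not VISA's actual system: it models a user's profile as the mean and standard deviation of past transaction amounts (the history values and z-score cutoff are hypothetical), and flags amounts that deviate strongly:

```python
from statistics import mean, stdev

def is_suspicious(amount: float, history: list[float], z_cutoff: float = 3.0) -> bool:
    """Flag a transaction whose amount deviates strongly from the user's history."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # No variation in history: anything different from the norm is unusual.
        return amount != mu
    return abs(amount - mu) / sigma > z_cutoff

# Hypothetical transaction history for one cardholder, in dollars.
history = [20.0, 25.0, 22.0, 30.0, 18.0, 27.0]
print(is_suspicious(24.0, history))    # typical amount, not flagged
print(is_suspicious(5000.0, history))  # far outside the profile, flagged
```

Production systems learn far richer profiles (merchant categories, locations, timing patterns), but the principle is the same: score each transaction against learned behaviour and escalate outliers to the issuing bank.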


Parliament: New cyber security label for smart devices

From robot vacuum cleaners to smart light bulbs, connected devices are poised to surge in popularity.
Announced by Singapore's Senior Minister of State for Communications and Information (MCI) Janil Puthucheary in Parliament on Tuesday (March 2), the initiative aims to address this "growing area of concern". "The scheme will raise consumer awareness of more secure products and aims to encourage manufacturers to adopt additional cyber security safeguards," said Dr Janil during the debate on MCI's budget. To be launched later this year, the scheme will initially be voluntary, administered by the Cyber Security Agency of Singapore. Singapore's labelling scheme will follow the European Union's standard for IoT devices, which spells out the minimum standards for manufacturers, including having no default passwords and ensuring that there are regular software updates over the air without user supervision. Singapore is among the first group of countries to adopt the standard. CSA said that the labels will indicate the security provisions present in the smart devices. More details will be announced later.



Quote for the day:


"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward