Daily Tech Digest - September 21, 2020

Ransomware attacks growing in number, severity: Why experts believe it will get worse

The attackers are also changing their tactics, making it difficult to defend against them. “Those tactics include things like not just encrypting victims’ computer networks but also exfiltrating data on those victim networks,” Holmes said. “Now when victims realize they’ve become a victim of a ransomware attack, it’s not just that their data is encrypted. It’s also that they’ve lost all their — or much of their — information. The ransomware actors are holding that data at risk and telling victims if they don’t pay the ransom, they’re going to leak that data to the public.” Perhaps not surprisingly, ransomware attackers are starting to band together, Holmes said. “We’ve seen some of these ransomware actors entering into a cartel with one another. Under this cartel model, they’ll share information amongst the group members and share intelligence and share techniques. It’s a bit concerning to us as well [because] it shows a major shift change among the ransomware actors.” The ransomware cartels are forming into business-like organizations with specialization of labor, Jason Conboy of DHS’s Investigations division said. “You have your malware office… and they have effectively built skills to write that malware. You’ve got your money mules.


Cloud banking: More than just a CIO conversation

The bank of 2030 will look very different from today. Facing changing consumer expectations, emerging technologies, and alternative business models, banks need to start putting strategies in place now to help them prepare for this future. An important indicator of the shifting landscape? Cloud computing is moving to the forefront as a focus for the chief information officer, C-suite executives, and board members. Banking and capital markets leaders increasingly recognize that cloud is more than a technology; it is a destination for banks and other financial services firms to store data and applications and access advanced software applications via the internet. The leading public cloud providers offer an array of innovative products-as-a-service that can be accessed on their platforms and help banks implement business and operating models to improve revenue generation, increase customer insights, contain costs, deliver market-relevant products quickly and efficiently, and help monetize enterprise data assets. The cloud also offers a huge opportunity to synchronize the enterprise; to break down operational and data silos across risk, finance, regulatory, customer support, and more. Once massive data sets are combined in one place, the organization can apply advanced analytics for integrated insights.


Machine Learning Biases Might Define Minority Health Outcomes

Algorithmic biases present marginalized patients as healthier than they are, disqualifying them from the specialty care they need. ... AI bias issues are in no way new to the computer science community; however, it is unclear whether the medical community even recognizes the problem. Algorithm-driven patient care results in less adequate care for people striving to push back against the tide of systemic racism, given that the United States federal government has done little to address the lack of diversity in big data or to regulate automated decision-making. In contrast, European data regulations state that machine-automation developers must use appropriate mathematical and statistical techniques to ensure both that the risk of error is minimized and that discriminatory effects are prevented. Regardless of geography, it is undeniable that managing these prejudices requires careful attention to data, the use of artificial intelligence to help detect bias, and the building of diverse teams. The federal government must ensure that those behind the automation are ethically and legally obliged to keep AI on the side of fairness.
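The bias detection the article calls for can begin with something very simple: comparing the rate of favorable model outcomes (for instance, referral to specialty care) across demographic groups. A minimal sketch, with entirely hypothetical audit data and group labels not drawn from the article:

```javascript
// Demographic parity check: compute each group's rate of favorable
// outcomes, then report the largest gap between any two groups.
function favorableRates(records) {
  const totals = {};
  for (const { group, favorable } of records) {
    totals[group] = totals[group] || { favorable: 0, count: 0 };
    totals[group].count += 1;
    if (favorable) totals[group].favorable += 1;
  }
  const rates = {};
  for (const [group, t] of Object.entries(totals)) {
    rates[group] = t.favorable / t.count;
  }
  return rates;
}

function parityGap(records) {
  const rates = Object.values(favorableRates(records));
  return Math.max(...rates) - Math.min(...rates);
}

// Hypothetical audit sample: group A is favored 3/4, group B only 1/4.
const audit = [
  { group: "A", favorable: true },
  { group: "A", favorable: true },
  { group: "A", favorable: true },
  { group: "A", favorable: false },
  { group: "B", favorable: true },
  { group: "B", favorable: false },
  { group: "B", favorable: false },
  { group: "B", favorable: false },
];
console.log(parityGap(audit)); // 0.5 — a large gap worth investigating
```

A real audit would use far larger samples, confidence intervals, and additional fairness metrics (equalized odds, calibration), but even this crude gap measure makes disparities visible early.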


Ireland’s data watchdog slammed for letting adtech carry on ‘biggest breach of all time’

Responding to the ICCL submission, the DPC’s deputy commissioner Graham Doyle sent this statement: “Extensive recent updates and correspondence on this matter, including a meeting, have been provided by the DPC. The investigation has progressed and a full update on the next steps provided to the concerned party.” However, in a follow-up to Doyle’s remarks, Ryan told TechCrunch he has “no idea” what the DPC is referring to when it mentions a “full update”. On “next steps” he said the regulator informed him it will produce a document setting out what it believes the issues are — within four weeks of its letter, dated September 15. Ryan expressed particular concern that the DPC’s enquiry does not appear to cover security — which is the crux of the RTB complaints, since GDPR’s security principle puts an obligation on processors to ensure data is handled securely and protected against unauthorized processing or loss. ... The regulator also pointed to another related open enquiry — into adtech veteran Quantcast, also begun in May 2019. The DPC has said the Quantcast enquiry is examining the lawful basis claimed for processing Internet users’ data for ad targeting purposes, as well as considering whether transparency and data retention obligations are being fulfilled.


Write Better Automated Tests With Cypress in Angular

Cypress is much like Protractor for Angular applications, but Cypress is much faster to run and easier to debug. Cypress is not only good at automated end-to-end testing that is independent of our applications, but is also capable of unit testing against methods in model classes, service classes, and so on. Cypress provides a complete end-to-end testing experience in your browser where you can watch your tests run in an automated way. Cypress provides its own test runner where you can run the tests locally. Cypress also offers some other cool features like Time Travel, Debuggability, Real-time reloads, and Automatic waiting. These brilliant features set this tool apart, and we will see them in action. Enough talk. If you've already cloned this repo then you're good to go, but if not then you probably have an Angular project and you just have to add Cypress as a dependency. ... The cypress package includes a desktop app and the Cypress binary. It may take a few minutes to run this command for the first time because the package installation needs to download the Cypress binary. The Cypress binary is saved in a global cache directory, so installing this package in the future will be much faster for the same version.
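The install flow described above boils down to two commands, assuming an existing Angular project managed with npm:

```shell
# Add Cypress as a dev dependency; the first install downloads the
# Cypress binary, which is then kept in a global cache so later installs
# of the same version are much faster.
npm install cypress --save-dev

# Launch the Cypress desktop test runner from the project root.
npx cypress open
```

The first `cypress open` scaffolds a `cypress/` folder with example specs; a spec typically starts with `cy.visit()` against your local dev server (for Angular, usually `http://localhost:4200` after `ng serve`) followed by `cy.contains()` or `cy.get()` assertions.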


Connecting the dots in an uncertain world

A tendency to underestimate the unexpected — both the chance of it happening and the potential in it — is a common one. To combat this bias, instead of doing “risk management,” which tends to focus on known unknowns, you can build a muscle for planning for and finding potential in the unexpected. This will be what counts during periods of crisis and uncertainty. One way of doing this is to develop a North Star you can use to connect unexpected events to your mission. Take Kaan Terzioglu, CEO of Turkish telecom company Turkcell from 2015 to 2019. When there was an outbreak of serious violence in Turkey in 2016 following an attempted coup d’état, Kaan and his team responded by making the Internet free for a month so people could communicate with those they cared about without worrying about bills. And during a regional refugee crisis, the company’s quick and proactive development of an app to help Syrian refugees find crucial health and financial information in the country calmed tensions in the camps. As Turkcell has built on this initial app, adding more services, customer loyalty has given Turkcell a major market share among the 3 million Syrian refugees in Turkey.


What advances in HPC and AI mean for business, and what they do not

What businesses do need to be mindful of when looking to deploy AI is that it is not a magic bullet for answering business-critical questions. What’s more, as we’ve seen in several recent unfortunate examples, if your AI is not trained with a wide set of data, it can end up amplifying a wrong supposition, rendering the end product useless. For example, if you’re only training your facial recognition programme with pictures of white men, you’re going to get some biased and potentially discriminatory outcomes. As with all forms of modelling, the findings you get out are only as good as the data you put in. There are no shortcuts around this simple fact, and any organisation looking for one will quickly come unstuck. The key take-away from these missteps is not that AI has failed or is incapable of carrying out a task; it’s that humans have applied, and often still do apply, the technology poorly. Today, AI is transforming almost every industry and vertical: from pathology, where it’s used to detect cancer, to AI picking up the phone to answer contact centre customer queries, from water-controlled indoor farming to driving autonomous vehicles in warehouses and on public roads.


Interview with Event Modeling Founder - Adam Dymitruk

The event model contains independent units of work (slices). This removes the need for stand-ups, sprints and many other processes that are not effective with remote work. The hours saved are reinvested in implementing the solution. With the guidance event modeling brings, the transition to remote work is barely felt. Because the patterns are prescribed, event modeling takes the guesswork out of many aspects of writing software. The design allows a fixed and flat cost curve, so the complexity of the system doesn't make a project end in a "death march". Event Modeling is simple. Whether you're looking at BPMN or UML, the traditional practices for design are incredibly heavy. The main benefit is that you get the same value for a much smaller investment of time to create a blueprint for how your current or potential solution works or will work. The goal is to sweep aside the mountain of books and get an organization going quickly. It takes 15 minutes to explain Event Modeling. The other main benefit is having a reliable plan before starting implementation. Most new software projects today head into development blind. An important test the event model provides is that it shows whether you are "information complete".


The 4 Top Artificial Intelligence Trends For 2021

Advances in medical technology and standards of care, together with advances in communication technology, have enabled outbreaks to be spotted more quickly and lockdowns imposed. Over the next year, AI will be added to the list of technological developments that are enabling us to deal more effectively with pandemics. The growth in the amount of scientific and medical literature alone is enormous, with more than 28,000 papers relating to Covid-19 published by April this year. A dedicated search engine powered by natural language processing (NLP) algorithms has already been made available, so anyone can get AI assistance when investigating this massive dataset. Work is also ongoing to develop AI solutions to help deal with the huge backlog of other medical issues, such as cancer, where treatment has been affected as resources are diverted to fight Covid-19. Over the next year, we are likely to see the accelerated adoption of AI across many other areas of healthcare, not only those related to tackling viruses. By developing our ability to apply machine learning problem-solving to these massive, real-time global datasets, we will spot outbreaks more easily, track contact between infected people, enable more accurate diagnoses, and, by predicting ways that a virus might evolve in the future, develop more effective and lasting vaccinations.
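The core idea behind an NLP-powered literature search can be illustrated with a toy ranking function: score each document by TF-IDF-weighted overlap with the query. The corpus and query below are invented for illustration; real systems over the Covid-19 literature use far richer language models:

```javascript
// Toy search over a small "literature" corpus: rank documents by
// TF-IDF-weighted overlap with the query terms.
const tokenize = (text) => text.toLowerCase().match(/[a-z]+/g) || [];

function rank(docs, query) {
  const n = docs.length;
  const docTokens = docs.map(tokenize);

  // Document frequency: how many documents contain each term.
  const df = {};
  for (const tokens of docTokens) {
    for (const t of new Set(tokens)) df[t] = (df[t] || 0) + 1;
  }
  // Inverse document frequency; terms absent from the corpus score 0.
  const idf = (t) => Math.log(n / (df[t] || n));

  return docs
    .map((doc, i) => {
      const tf = {};
      for (const t of docTokens[i]) tf[t] = (tf[t] || 0) + 1;
      const score = tokenize(query).reduce(
        (s, t) => s + (tf[t] || 0) * idf(t), 0);
      return { doc, score };
    })
    .sort((a, b) => b.score - a.score);
}

const papers = [
  "vaccine trial results for the novel coronavirus",
  "cancer treatment backlog during the pandemic",
  "contact tracing apps and outbreak detection",
];
console.log(rank(papers, "vaccine coronavirus")[0].doc);
// -> "vaccine trial results for the novel coronavirus"
```

The IDF weighting is what makes rare, discriminative terms count for more than common ones — the same intuition, scaled up with neural embeddings, underlies production literature search.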


New US digital identity legislation promises more secure verification

The Improving Digital Identity Act would create an Improving Digital Identity Task Force within the executive office of the president. Its mission is to establish a government-wide effort to develop secure methods for federal, state and local government agencies to validate identity attributes and support interoperable digital identity verification in both the public and private sectors. The task force would comprise cabinet secretaries, heads of other federal agencies, state and local government officials, members designated by congressional committees, and a position appointed by the president. Additionally, the National Institute of Standards and Technology (NIST) would develop a standards framework for digital identity verification to guide federal, state and local governments in selecting their digital identity solutions. NIST would have one year to publish a final version of the framework. The legislation requires the task force to publish a report with recommendations on research and development in systems that enable digital identity verification. Upon its completion, and with the consent of the individual, the framework will enable government agencies to securely vouch for citizens in real time when online.



Quote for the day:

"Power should be reserved for weightlifting and boats, and leadership really involves responsibility." -- Herb Kelleher
