Daily Tech Digest - August 20, 2019

Blockchain is not a magic bullet for security. Can it be trusted?

Bitcoin's vulnerabilities have already been successfully exploited in significant hacks.
As with any technology, security issues arise as developers translate requirements into products and services. The lines of code, consensus mechanisms, communication protocols and so on all have the potential to host vulnerabilities that can be exploited maliciously. But blockchain remains, for the moment, a divergent technology: multiple protocols and programming languages are being developed in parallel. As a result, it is difficult for developers to acquire the experience needed to secure their code, and most are under stringent time pressure to deliver. Because blockchain relies heavily on cryptography, the practice of secure communication, many get the impression that it is a self-securing technology. This could not be further from the truth: blockchains are built on top of communication networks and equipment that themselves need to be secured, so traditional information security challenges apply to blockchain too. Furthermore, cryptography, like any other security discipline, is a changing field: quantum computers are expected to break a number of today's cryptographic algorithms.
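
To make the security point concrete, here is a minimal Python sketch of the hash-linking at the core of a blockchain. It is an illustration only, not any production protocol: the cryptography protects the chain's integrity, but the network, the nodes and the code that builds the blocks all still need securing, exactly as the excerpt argues.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash the block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Build a block that commits to its predecessor's hash."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Tampering with any block changes its hash and breaks every later link,
# but nothing here secures the surrounding network or node software.
genesis = make_block("genesis", "0" * 64)
second = make_block("pay Alice 5", block_hash(genesis))
assert second["prev_hash"] == block_hash(genesis)
```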



How to unlock the true value of data


Central to a hub architecture will be the technologies used to get data flowing into it from applications and other data sources, and then provisioning it outward to consumers, internal as much as external. These might include extract, transform and load (ETL) tools that support bulk or batch movement of data, data replication and data virtualisation. They can also include app-integration middleware, such as the enterprise service bus, and message-oriented technologies that move data around in the form of message constructs. Whatever tools are used, both on-premises and cloud service versions are available, and there are still other elements to consider, such as governance tools to help with data compliance and metadata management tools to tag and manage data flows better. One of the big headaches for those tasked with developing a business's data architecture is control.
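
As a rough illustration of the ETL pattern mentioned above, the sketch below moves data from a source file into a SQLite table standing in for the hub. The file name and column names are invented for the example; real deployments would use the dedicated tools the paragraph lists.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from a source file (hypothetical orders.csv)."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: normalise fields before they enter the hub."""
    for row in rows:
        yield (row["order_id"], row["customer"].strip().lower(), float(row["amount"]))

def load(rows, db_path="hub.db"):
    """Load: write the cleaned rows into the hub's store (SQLite for brevity)."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

# Chain the three stages; assumes the hypothetical orders.csv exists.
load(transform(extract("orders.csv")))
```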


Texas Pummeled by Coordinated Ransomware Attack

In an updated statement released Saturday, DIR said the total victim count stood at 23 organizations. The Texas Military Department as well as Texas A&M University System's Cyberresponse and Security Operations Center teams "are deploying resources to the most critically impacted jurisdictions," it added. The U.S. Department of Homeland Security as well as FBI's cyber division, among others, have also been assisting with the response. "At this time, the evidence gathered indicates the attacks came from one single threat actor. Investigations into the origin of this attack are ongoing; however, response and recovery are the priority at this time," DIR said. "It appears all entities that were actually or potentially impacted have been identified and notified." Systems and networks run by the state of Texas have not been disrupted, DIR says. Officials in Austin said their systems were unaffected by the attack. "We are monitoring the situation," Bryce Bencivengo, a spokesman for Austin's Office of Homeland Security and Emergency Management, told local NPR member station KUT.


Value Engineering: The Secret Sauce for Data Science Success

If your organization seeks to exploit the potential of data science to power its business models, then the Data Science Value Engineering Framework provides the "how." The framework starts with the identification of a key business initiative that not only determines the sources of value, but also provides a laser focus on delivering business value and relevance. A diverse set of stakeholders is beneficial because it provides more perspectives on the key decisions upon which the data science effort needs to focus. The heart of the framework is collaboration with the different stakeholders to identify, validate, value and prioritize the key decisions (use cases) that they need to make in support of the targeted business initiative. After gaining a thorough understanding of the top-priority decisions (use cases), the analytics, data, architecture and technology conversations have a frame within which to work, by understanding what's important AND what's not important.
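
In the simplest possible terms, the identify, value and prioritize step might look like the sketch below. The use cases and scores are invented purely for illustration; a real workshop would weight the dimensions to suit the targeted business initiative.

```python
# Hypothetical use cases scored by stakeholders on business value and
# implementation feasibility (1-10); all numbers are illustrative only.
use_cases = [
    {"decision": "Which customers are likely to churn?", "value": 9, "feasibility": 6},
    {"decision": "Which claims should be audited first?", "value": 7, "feasibility": 8},
    {"decision": "Which stores need restocking tomorrow?", "value": 6, "feasibility": 9},
]

# Rank by a simple value-times-feasibility score to surface the
# decisions the data science effort should tackle first.
for uc in sorted(use_cases, key=lambda u: u["value"] * u["feasibility"], reverse=True):
    print(f'{uc["value"] * uc["feasibility"]:>3}  {uc["decision"]}')
```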


People will usually follow those who have the most positional authority and concomitant control of resources, but also will follow those with other forms of power, such as eloquence, passion, sincerity, commitment, and charisma. In the teams I work with, people tend to pay the most attention to and be most influenced by those with both types of power. But sometimes people will even choose to follow less senior individuals if they have inspiring ideas and energy. No matter how powerful they are, though, when people show themselves to be untrustworthy, through something they do inside or outside the team, their influence vanishes. Others might still pay attention to them, but now only for transactional purposes. Those “leaders” are no longer really leading. If you want to be a real leader, one with voluntary followers, remember that you must earn and keep your people’s trust. They will carefully assess your attitude and actions, in particular whether you look out for others in addition to yourself. If their assessment is that you are trustworthy, they’ll stick with you.


These robot snakes designed by AI could be the next big thing in the operating theatre


In order to create snakebots that work in the confines of each individual's anatomy, the QUT team generates tens of virtual versions of the snakebot and sets evolutionary algorithms to work on them, in a survival-of-the-fittest contest designed to create the best available bot. First, the patient's knee is scanned by CT or MRI and its internal anatomy is modelled. Alongside the surgeon, the QUT system then delineates which parts of the knee the surgeon will need to reach during the operation, as well as the parts they need to avoid. ... Afterwards, the candidate designs are ranked according to their performance. The evolutionary algorithm then refines the better-performing designs according to the results of the simulation, running the simulation over and over, tweaking the winning bots and rejecting those that aren't up to scratch. "This is copying what's been observed in nature and the process of evolution but you do it inside the computer... we kill off the ones that didn't do very well and we mate the ones that do well. Mating, in this case, means you combine half the characteristics of one and half of the other. You do random mutations on some of them - change a few little bits during the mating."
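
The selection, mating and mutation the researcher describes is the classic genetic-algorithm loop. The Python sketch below shows its shape with a trivial stand-in fitness function; in the QUT system, fitness would instead come from running each candidate design through the patient-specific knee simulation.

```python
import random

GENOME_LEN = 16  # stand-in for a snakebot design's parameters

def fitness(genome):
    """Toy score; the real system simulates each design in the patient's knee."""
    return sum(genome)

def mate(a, b):
    """Combine half the characteristics of one parent and half of the other."""
    cut = len(a) // 2
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):
    """Randomly change a few little bits during the mating."""
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]  # "kill off" the designs that did poorly
    children = [mutate(mate(random.choice(survivors), random.choice(survivors)))
                for _ in range(10)]
    population = survivors + children

print(max(fitness(g) for g in population))
```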


What Cybersecurity Trends Should We Expect From The Rest Of 2019?

Currently, the industry standard for security relies on two-factor authentication when users log into software. While many email services and social media sites still ask for only one form of authentication, two-factor authentication is the future. However, by the time companies adopt it, multifactor authentication will have taken off. Most data breaches are caused by compromised credentials: weak, stolen or default passwords are usually the biggest culprits in a data leak. Single-factor authentication allows this to happen because a password is limited to just something you know. By giving out a dongle or integrating an app with temporary passwords that expire, you can ensure that only verified users get access. Since more people than ever are worried about stolen identities, we should see this kind of authentication process take off in the coming years. ... Companies are even finding ways to deceive potential hackers. By imitating your company's more critical data and assets, this bait can act as a trap for anyone trying to get hold of your data.
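
An "app with temporary passwords that expire" typically means time-based one-time passwords (TOTP). The simplified sketch below shows how such a code is derived from a shared secret and the current time; a real deployment would use a vetted library rather than hand-rolled code, and the secret here is purely illustrative.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6):
    """Simplified RFC 6238-style time-based one-time password."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval            # code rotates every 30s
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative shared secret; a real service generates one per user.
print(totp("JBSWY3DPEHPK3PXP"))
```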


Case Study: Improving ID and Access Management

A few years ago, Molina Healthcare was using a homegrown solution to onboard and offboard users daily, in batches, from the company's HR system into Active Directory, Sankepally says in an interview with Information Security Media Group. But the company was growing quickly, so the mostly manual process of provisioning and de-provisioning access to Molina's systems was time-consuming, she says. "With the increasing demands, we couldn't complete all the business processes involved, and there was a lack of standards," she says. "Our onboarding process was taking 10 to 20 days." As a result, the company moved to standardize and automate its ID and access management platform, choosing to implement technology from SailPoint Technologies. "Today we have more than 15,000 active identities supporting 15 different states with different lines of business ... including caregivers on the ground." For onboarding users, the company now has a "near real-time integration" with its cloud-based HR system that has automated the onboarding and offboarding process, she says.
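
The shape of such a near real-time joiner/leaver integration might look like the sketch below. The event fields and the directory structure are hypothetical, invented for illustration; they are not SailPoint's actual API.

```python
from dataclasses import dataclass

@dataclass
class HREvent:
    employee_id: str
    action: str       # "hire" or "terminate" (hypothetical event shape)
    department: str

def on_hr_event(event, directory):
    """Apply an HR change to the identity store as soon as it arrives,
    replacing the slow daily batch process described above."""
    if event.action == "hire":
        directory[event.employee_id] = {
            "enabled": True,
            "groups": [f"dept-{event.department}"],
        }
    elif event.action == "terminate":
        account = directory.get(event.employee_id)
        if account:
            account["enabled"] = False   # disable first; delete after review
            account["groups"] = []

directory = {}
on_hr_event(HREvent("e1001", "hire", "claims"), directory)
on_hr_event(HREvent("e1001", "terminate", "claims"), directory)
print(directory)
```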


GDPR faces growing pains across Europe


European countries have clearly demonstrated different strategies on penalties. They have also set up different structures for implementing the regulation. In Germany, for example, DPAs are organised at the state level, but there is also a separate DPA at the federal level, with jurisdiction over telecoms and postal service companies. The result is that Germany has 17 data protection authorities instead of just one. Another area where European countries disagree is in their interpretation of some of the finer points of GDPR. For example, Austria's DPA ruled that all a data controller has to do in response to a request for data deletion is to remove individual references to that data. Nations have also demonstrated differences of opinion on how to calculate fines. For example, some local legal authorities in Germany have argued that GDPR fines imposed in that country should be calculated according to German law, which would result in much lower fines than those imposed at the European level.


Visa Adds New Fraud Disruption Measures

Visa is now adding fraud disruption to supplement its transaction fraud detection and remediation efforts. At the Visa US Security Summit 2019 in San Francisco today, the company outlined five new capabilities it uses to prevent fraudulent transactions. "We're looking to identify and disrupt fraud before it happens," says David Capezza, senior director of payment fraud disruption at Visa. "We want to take a more proactive approach and identify these attacks and shut them down before they occur." Rivka Gewirtz Little, research director for global payment strategies at IDC, says Visa's new approach blends its cyber and fraud units. "Typically, organizations are focused on the transaction," Gewirtz Little says. "What's interesting here is that Visa is creating a true cyber fraud system where the cyber team and fraud teams are integrated: the cyber team focuses on the attack against the enterprise and the fraud team looks at ways of preventing the attack. It's not always the same set of tools, the same team and objectives."



Quote for the day:


"Leadership offers an opportunity to make a difference in someone's life, no matter what the project." -- Bill Owens

