Rapid7 penetration tests reveal multitude of software flaws, network misconfigurations
People are simply too predictable when it comes to creating passwords, even when an organization enforces password length and complexity standards. For example, “Summer2018!” meets the requirements of a policy that demands at least one uppercase letter, one lowercase letter, one number, and one special character. Yet Rapid7 noted that it is one of the worst passwords a person can choose; seasonal passwords came in as the third most common type. ... What do organizations most care about protecting? Despite the almost-daily data breach announcements, Rapid7 found that organizations are more concerned with protecting their own sensitive data, such as internal communications and financial metrics, than with protecting the sensitive data of their customers or employees. Among organizations’ top five priorities for protecting information, sensitive internal data came first at 21 percent, PII second at 20 percent, authentication credentials third at 14 percent, payment card data fourth at 7.8 percent, and bank account data fifth at 6.5 percent.
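To see how little the complexity rules actually constrain, here is a minimal Python sketch of a naive policy check; the function name and the seasonal-pattern test are illustrative, not Rapid7's tooling. A predictable password like “Summer2018!” passes the policy while matching the very season-plus-year pattern the report flags.

```python
import re

# Minimal sketch of a naive complexity policy: minimum length plus one
# uppercase letter, one lowercase letter, one digit, and one special
# character. Function name and rules are illustrative, not Rapid7's tooling.
def meets_complexity_policy(password: str, min_length: int = 8) -> bool:
    return (
        len(password) >= min_length
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

# A hypothetical season-plus-year check catches the predictable class
# of passwords the report flags, even though they pass the policy above.
SEASONAL = re.compile(r"^(Spring|Summer|Fall|Autumn|Winter)\d{4}\W?$", re.IGNORECASE)

print(meets_complexity_policy("Summer2018!"))  # True: the policy is satisfied
print(bool(SEASONAL.match("Summer2018!")))     # True: still worth rejecting
```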
Three AI And Machine Learning Predictions For 2019
The U.S. Army is currently using machine learning to predict when combat vehicles need repair. Think about it: there are millions of pieces of equipment that our Army uses each and every day. To keep track of the data involved, they are recruiting the help of an AI assistant. For the first implementation, a few dozen armored infantry transports will receive sensors inside the vehicles’ engines. These sensors will record temperature and RPM and transmit that data to the software. Machine learning capabilities will look for patterns in the data that match engine failures in similar vehicles. What if your car did this? AAA might become obsolete if your car could tell you that the transmission is about to crap out on you. If the Army is using the technology, I'm sure it won't be long until we see it in the civilian world. Automotive isn't the only industry seeing potential new uses for this tech; healthcare is about to see some changes too. As if Google wasn’t already on the AI map, they have begun to predict the likelihood of a patient’s death using machine learning – with a staggering 95% accuracy.
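As a rough illustration of the pattern-matching step described here (not the Army's actual system), the sketch below trains a classifier on synthetic temperature/RPM readings labeled with known engine failures, then scores a new reading; scikit-learn and all of the numbers are stand-in assumptions.

```python
# Hypothetical sketch of the pattern-matching idea, not the Army's actual
# system: train a classifier on engine temperature/RPM readings labeled
# with known failures, then score new readings. Data is synthetic and
# scikit-learn is an assumption standing in for whatever software is used.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Each row is one sensor reading: [engine temperature (deg C), RPM].
healthy = rng.normal([90, 2200], [5, 150], size=(500, 2))
failing = rng.normal([115, 2600], [8, 250], size=(500, 2))  # runs hot, revs high
X = np.vstack([healthy, failing])
y = np.array([0] * 500 + [1] * 500)  # 1 = reading preceded an engine failure

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A high failure probability on a fresh reading would trigger a repair alert.
reading = np.array([[112.0, 2550.0]])
print(f"estimated failure risk: {model.predict_proba(reading)[0, 1]:.2f}")
```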
NVMe is a protocol for accessing high-speed storage media that’s designed to reduce latency and increase system and application performance. It's optimized for all-flash storage systems and is aimed at enterprise workloads that require low latency and top performance, such as real-time data analytics and high-performance relational databases. Storage vendors have been re-tooling their systems to support the faster interconnect protocol, and IBM is no exception. A key change in the FlashSystem 9100 is the use of small form factor NVMe drives. IBM redesigned its FlashCore technology to fit into a standard 2.5-inch SSD form factor with NVMe interfaces – a move that reduced the physical size of the drives by more than half. That redesign made an impression on Owen Morley, director of infrastructure at online dating platform Plenty Of Fish. Morley is among a group of users of IBM's all-flash storage who came together at an event in Mexico City to share their thoughts on the new 9100 system and the potential for NVMe-accelerated storage in their own enterprises.
Edge computing will be vital for even simple IoT devices
The evolution of wearables required each generation to monitor and collate a greater number of measurements (raw data). Developers found optimal ways of doing this by processing raw data locally (on the edge of the application, using the Bluetooth chips’ increasingly powerful onboard processors) and then forwarding to a smartphone app and the cloud (for data sharing and tracking) only the essential information (desired data). The technology enabled continuous (low-latency) monitoring, and the modest Bluetooth wireless throughput was sufficient to update apps and cloud servers with the key tracking information without requiring the extended on-air duration that would otherwise be needed to stream raw data. Sending only the key information also minimized the impact on the user’s cellphone data allowance (data cost).
Things go wrong, hackers never quit
Because users didn’t always carry their smartphones, wearables had to operate autonomously when not connected. Resiliency was built into the systems. They didn’t depend on a continuous network or internet connection for successful operation (redundancy).
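A minimal Python sketch of the two patterns above, assuming hypothetical names throughout (this is not any wearable vendor's firmware API): raw samples are reduced to a small summary on the device, and summaries queue locally whenever the phone is unreachable, so operation never depends on a live connection.

```python
# Illustrative sketch: reduce raw samples to a small summary on-device,
# and queue summaries locally whenever the phone is out of reach so
# nothing depends on a continuous connection. All names are hypothetical.
from collections import deque
from statistics import mean

class WearableEdgeNode:
    def __init__(self) -> None:
        self.pending = deque()  # summaries awaiting upload

    def summarize(self, raw_samples_bpm: list[int]) -> dict:
        """Collapse a window of raw readings into the few fields worth sending."""
        return {"avg_bpm": round(mean(raw_samples_bpm)),
                "max_bpm": max(raw_samples_bpm)}

    def record_window(self, raw_samples_bpm: list[int], phone_reachable: bool) -> None:
        summary = self.summarize(raw_samples_bpm)  # raw data never leaves the device
        self.pending.append(summary)
        if phone_reachable:
            while self.pending:                    # flush the backlog in order
                self.upload(self.pending.popleft())

    def upload(self, summary: dict) -> None:
        print("sent over Bluetooth:", summary)     # stand-in for the radio link

node = WearableEdgeNode()
node.record_window([72, 75, 74, 139, 76], phone_reachable=False)  # buffered locally
node.record_window([71, 70, 73, 72, 74], phone_reachable=True)    # flushes both
```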
Nation-State Spear Phishing Attacks Remain Alive and Well
The trouble with phishing is that it relies on social engineering - meaning it's designed to trick users - and it can potentially be used to compromise any online account. Unfortunately, we humans are both easy to trick - at least some of the time - and fallible. And attackers can pummel would-be victims with phishing attacks until one succeeds. The scale of the phishing challenge is reflected by the number of video interviews touching on phishing that I recently conducted at the London Infosecurity Europe conference. Experts described everything from the increasingly targeted nature of phishing attacks and the importance of never forgetting the human factor and training users, to using technology to extract data from emails and attachments and tracking malicious domains to better block phishing campaigns. But as this patchwork of practices, procedures and technology demonstrates, there's no single fix for the phishing problem. Furthermore, with more of our business and personal lives now lived in the cloud, the impact of falling victim to a phishing attack continues to increase.
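Two of the practices mentioned above, extracting data from emails and tracking malicious domains, can be combined in a few lines. The sketch below is illustrative only: the blocklist contents are invented, and real defenses would use curated threat-intelligence feeds rather than a hard-coded set.

```python
# Minimal sketch: pull URLs out of an email body and check each hostname
# against a list of tracked malicious domains. The blocklist entries here
# are made up for illustration.
import re
from urllib.parse import urlparse

TRACKED_MALICIOUS_DOMAINS = {"examp1e-login.com", "secure-update.net"}  # hypothetical

URL_PATTERN = re.compile(r"https?://[^\s\"'<>]+")

def flag_suspicious_urls(email_body: str) -> list[str]:
    """Return any embedded URLs whose host is on the tracked-domain list."""
    hits = []
    for url in URL_PATTERN.findall(email_body):
        host = urlparse(url).hostname or ""
        if host.lower() in TRACKED_MALICIOUS_DOMAINS:
            hits.append(url)
    return hits

body = "Your account is locked. Verify at https://examp1e-login.com/reset now."
print(flag_suspicious_urls(body))  # ['https://examp1e-login.com/reset']
```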
Privacy pros gaining control of technology decision-making over IT
“This global survey is critical in our efforts to better understand how privacy professionals are addressing compliance challenges and the technologies that are being deployed now and in the near future,” said Chris Babel, CEO of TrustArc. “Though security budgets remain larger, we’re seeing a marked shift in privacy teams’ influence over technology purchasing decisions. This trend confirms what we’re seeing among our customers – that they have a growing need for technology solutions to help them manage privacy compliance at scale on a global basis.” The EU GDPR and other global and domestic legal reforms, combined with technological advancements, have made the task of operationalizing privacy and data protection vastly more complicated. Businesses now must account for how data is entering the organization, how it is being used, what permissions are attached to it and who has the responsibility for managing it. To address these challenges, the demand for privacy technology continues to grow rapidly.
Measuring Tech Performance: You’re Probably Doing It Wrong
First, velocity is a relative and team-dependent measure, not an absolute one. Teams usually have significantly different contexts which make comparing velocities inappropriate. (Seriously, don’t do this.) Second, when velocity is used as a productivity measure, teams are very likely to game it: they inflate their estimates and focus on completing as many stories as possible at the expense of collaboration with other teams (which might decrease their velocity and increase the other team's velocity, making them look bad). Not only does this destroy the utility of velocity for its intended purpose, it also inhibits collaboration between teams. Velocity as a productivity metric violates our guidelines by focusing on local measures rather than global ones. This is particularly obvious in the second critique above: by (understandably) making choices to optimize their own velocity, teams will often decline to collaborate with other teams. This often leaves the organization with subpar solutions because there isn't a focus on global measures.
How to spot bad data, and know the limitations when it's good
A 2016 survey of CEOs found 84 percent of them felt concerned about the quality of data they used while making decisions. And they have valid reasons for feeling wary — bad data could cause financial repercussions if business leaders put too much trust in material that’s ultimately lacking. It’s also crucial to consider the time bad data wastes. When professionals engage in data-driven marketing, they may be relying on content filled with non-human influences such as bots or malware. If that happens, they could get false perceptions of customers’ journeys on websites or the factors that cause them to linger on certain pages versus others. There are reputational risks, too. If a company releases public research that later gets proven inaccurate, it’ll be difficult for that entity to encourage trust in future material. When business leaders blindly trust data — especially when making decisions — they inevitably set the stage for problems. Staying aware of the characteristics of bad data discussed here is an excellent first step in being proactive.
Law firms failing to meet their clients’ digital expectations, according to study
Martin Flick, CEO of Olive Communications, said: “Today’s busy, always-on and mobile-first consumer wants to buy goods and services, and communicate with sellers whenever, wherever, and however they choose.” “Increasingly this is through digital interaction. When it comes to their lawyer or solicitor, they want to engage in the same way, without the frustration of having to wait days for paper documents to arrive in the post or for an email to come through with the answer to a question that could be easily resolved with an instant message or automated response.” “Consumers want more control over their legal affairs, sometimes with little or no human intervention, and with the speed, efficiency, and security that multi-channel, web-based communications offer.” The study found that a significant portion of law firms are embracing new technology internally; for example, 69% are using IM and chat to communicate with each other. However, few of these firms are extending the use of technology externally to enhance the client experience.
Backup best practices: A NAS is not enough
The idea of 3-2-1 is to have three copies of every file, two of which are on different physical devices, and one of which is located off-site. Our guy didn't have that. He counted entirely on one NAS for all his backups. He had an offsite backup, but it hadn't been updated. The "off" part of my strategy is to have at least one full backup air-gapped from the Internet. I do this for my stuff by keeping one backup server shut down, except for a once-a-week quick incremental backup nibble ... The point of this article, though, is to remind you of the 3-2-1-off-and-away strategy and to not be dumb. A single NAS as your backup strategy is not enough. As a rule, I have two NAS boxes running all the time. One is my hot, live working environment. Another is an offline backup. In my case, I was fortunate that the ioSafe folks sent me their flood-and-fire-proof ioSafe 1515+, so my backup NAS isn't just a second NAS, it's an armored, bomb-proof bunker of a backup NAS. At some point in the future, I'll take you through my whole storage architecture.
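For readers who want a feel for that weekly "nibble", here is a bare-bones Python stand-in for an incremental pass that copies only files changed since the last run. The paths are hypothetical, and real setups would more likely use rsync or snapshot tooling; the air gap comes from powering the backup server on only for this window.

```python
# Bare-bones stand-in for the weekly incremental pass described above.
# All paths are hypothetical; the air gap comes from the backup server
# being powered on only while this runs.
import shutil
import time
from pathlib import Path

SOURCE = Path("/data/working")   # the hot, live working environment
DEST = Path("/backup/weekly")    # the normally-offline backup target
STAMP = DEST / ".last_backup"    # records when the last pass finished

DEST.mkdir(parents=True, exist_ok=True)
last_run = STAMP.stat().st_mtime if STAMP.exists() else 0.0

for src in SOURCE.rglob("*"):
    if src.is_file() and src.stat().st_mtime > last_run:
        dst = DEST / src.relative_to(SOURCE)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)   # copy2 preserves timestamps for the next pass

STAMP.touch()                    # mark this run's completion time
print(f"incremental pass finished at {time.ctime()}")
```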
Quote for the day:
"You may not control all the events that happen to you, but you can decide not to be reduced by them." -- Maya Angelou