January 26, 2016

Is Persistent Storage Good For Containers?

Despite this conventional wisdom, there persists (pun intended) a desire to bring persistent storage to containers. The reasons for this vary. In some cases, an application needs data to persist and its performance requirements cannot be met by backends like object stores or network file systems; typically, this is a SQL database like MySQL or Postgres that isn't designed to scale out the way a NoSQL database might. In other cases, a company moving to containers and cloud-native apps may want to leverage existing technology where possible, such as a storage array.
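
For the common case described here, a SQL database whose files must outlive any single container, the usual building block is a volume. Below is a minimal sketch using the Docker SDK for Python; the volume name, container name and password are invented for illustration, not taken from the article.

```python
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# A named volume lives outside any one container, so the database files
# survive container restarts, upgrades and re-creation.
client.volumes.create(name="mysql-data")  # hypothetical volume name

client.containers.run(
    "mysql:8",
    name="orders-db",                                  # hypothetical name
    environment={"MYSQL_ROOT_PASSWORD": "change-me"},  # placeholder secret
    volumes={"mysql-data": {"bind": "/var/lib/mysql", "mode": "rw"}},
    detach=True,
)
```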


Save the ransom: How being prepared and proactive foils the plot

Having solid, working backups is one of the most important choices an organization can make. Maintaining more than one backup plan, both offline and offsite, is crucial. Always check backups and test-restore on a regular basis; valid, working backups are part and parcel of the proactive process. Since ransomware targets and encrypts visible files, including mapped network drives and network shares, utilizing an offline backup strategy ensures that your organization will not come to a grinding halt or have to cough up any bitcoins. Remember to physically disconnect the hardware backup device from the network after each backup is complete.
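
The "always check backups and test-restore" advice can be partly automated. Here is a rough sketch, not from the article, in which a script records a checksum for every file at backup time and re-verifies the set later; the manifest format and paths are assumptions.

```python
import hashlib
import json
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large backups don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(backup_dir: Path, manifest: Path) -> None:
    """Record a checksum for every file in the backup set."""
    checksums = {str(p.relative_to(backup_dir)): sha256(p)
                 for p in sorted(backup_dir.rglob("*")) if p.is_file()}
    manifest.write_text(json.dumps(checksums, indent=2))

def verify_backup(backup_dir: Path, manifest: Path) -> list:
    """Return files that are missing or whose contents have changed."""
    expected = json.loads(manifest.read_text())
    bad = []
    for name, checksum in expected.items():
        path = backup_dir / name
        if not path.is_file() or sha256(path) != checksum:
            bad.append(name)
    return bad

# Typical use: write_manifest() right after the backup completes, then
# verify_backup() on a schedule, before you ever need the restore.
```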


The Ten Commandments of Microservices

With the emergence of containers, the unit of deployment gradually started to shift away from the VM model. Linux container technologies such as LXC, Docker, runC and rkt make it possible to run multiple containers within the same VM, allowing DevOps teams to package each app component or app module in a dedicated container. Each container holds everything the component needs to run as a standalone unit, from the OS to the runtime, framework and code. The composition of these containers can logically form an application, and the focus of building an application becomes orchestrating multiple containers to achieve the desired output.
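
To make the composition idea concrete, here is a minimal sketch using the Docker SDK for Python: two containers, each packaging one component, joined on a shared network to form one logical application. The image names, network name and port are hypothetical.

```python
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# A shared network is what lets independently packaged containers
# compose into one logical application.
client.networks.create("shop-net", driver="bridge")  # hypothetical name

client.containers.run(
    "postgres:13", name="shop-db",
    environment={"POSTGRES_PASSWORD": "change-me"},   # placeholder secret
    network="shop-net", detach=True,
)
client.containers.run(
    "shop/api:latest",                    # hypothetical application image
    name="shop-api",
    environment={"DB_HOST": "shop-db"},   # resolves over the shared network
    network="shop-net", detach=True,
    ports={"8080/tcp": 8080},
)
```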


Spear Phishing: Real Life Examples

A spear phishing case involving the RSA security unit of data-storage giant EMC Corp shows how even a company known for security in the cyber realm can become the target and victim of an attack. In 2011, RSA was attacked using a Flash object embedded in an Excel (.XLS) file attached to an e-mail with the subject line "2011 Recruitment Plan". Small groups of employees were targeted, and the e-mail was filtered and landed in the users' junk mail folders. Unfortunately, all it takes is for one person to fall victim to the scam. As explained by the RSA FraudAction Research Labs, this holds regardless of the state-of-the-art perimeter and end-point security controls, security procedures and high-end technology used by a company.


Employee Off-Boarding: How to Keep your Data Safe During Employee Turnover

Companies should make it clear that all data on a device is the property of the organization. We ran through some of the specifics behind remote wipe features in this blog, but as a refresher, here's more insight: employees should have signed a policy disclosing that if e-mail is configured on a personal smartphone, tablet or computer and they leave without notice and take their devices, the company will remote-wipe the device. This means the employee will lose not only the e-mails on that device but also all of their personal data. To avoid this, encourage employees to be forthcoming about leaving your organization and to present personal devices to your IT department, which can wipe company data without destroying their personal data.


Diana Larsen on Agile Fluency Model

Maybe the most surprising thing that's being added, in terms of larger chunks, is complex adaptive systems. One of the things this model has taught back to us is how it actually reflects the foundations, the conditions, the underpinnings of complex adaptive systems: teams as complex adaptive systems. We're adding information about how complex adaptive systems work when they are teams and when they manifest as a team, and what you can look for to help that system flourish. More of that kind of understanding and material; that's just fascinating to me. We had a tiny bit of that in the first edition. We're going to have quite a bit more in the second. And we'll have a couple of new stories. That will be fun.


The CISO's Role in Fighting Extortion

To mitigate risks to corporate data, organizations need to use network segmentation to "ensure that sensitive data is only available within the network to the parts of the network that actually need to be able to access that data," Miller says. They should also use air-gapping to help ensure that sensitive data is not accessible from the public Internet. "Of course there are many other steps that organizations can take to secure sensitive databases and other information as well. ... One of the helpful measures that companies can take is ensuring that any employee who has a public-facing role and could be contacted by an extortionist is aware of what to do."


Testing the test: How reliable are risk model backtesting results?

This blog shows that the shape of the profit and loss (P&L) distribution being modelled can have a significant impact on the previously known factors driving Type II errors. ... Risk models are not expected to produce reliable and robust risk estimates 100% of the time. Indeed, when specifying a model, users build in expectations about its accuracy, often defined by the number of breaches it produces (i.e. occasions on which the loss on a portfolio is greater than that predicted by the model). The Kupiec-POF test therefore attempts to assess model performance by comparing the number of breaches a user would expect a model to produce with the number it actually produces.
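
For reference, the Kupiec-POF statistic is a likelihood ratio comparing the breach probability the model implies with the breach rate actually observed, and it is asymptotically chi-squared with one degree of freedom. A minimal sketch in Python (the function name and the 250-day example are illustrative, not from the blog):

```python
import math
from scipy.stats import chi2

def kupiec_pof(breaches: int, observations: int, coverage: float = 0.99):
    """Kupiec proportion-of-failures (POF) likelihood-ratio test.

    breaches:     days on which the loss exceeded the model's VaR
    observations: length of the backtest window in days
    coverage:     VaR confidence level; expected breach rate is 1 - coverage
    Assumes 0 < breaches < observations (the log terms degenerate otherwise).
    """
    p = 1.0 - coverage          # breach probability the model implies
    x, t = breaches, observations
    rate = x / t                # breach rate actually observed

    # Log-likelihood under the model's breach rate vs. the observed rate.
    log_l0 = (t - x) * math.log(1.0 - p) + x * math.log(p)
    log_l1 = (t - x) * math.log(1.0 - rate) + x * math.log(rate)
    lr = -2.0 * (log_l0 - log_l1)

    return lr, chi2.sf(lr, df=1)  # statistic and its p-value

# Example: 9 breaches in 250 trading days against a 99% VaR model.
lr, p_value = kupiec_pof(9, 250, 0.99)
print(f"LR = {lr:.2f}, p = {p_value:.4f}")  # a clear rejection at 5%
```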


Alleviate data wrangling pain points with visualization tools

"The tool presents a visual representation of the data," said Alon Bartur, Trifacta's principle product manager. "It makes certain assumptions concerning the structuring of this data, and the user sees these assumptions by indicators that assess what the likely quality level is of each piece of data. Users know immediately from the indicators whether the data that they are seeing is of high quality or whether it is questionable and might require additional investigation. The user interface is designed for point and click interactions and the system gives the users suggestions of how to organize data reports, as well as certain data transforms that the user can run and what the likely outcomes of these transforms are."


Does Anyone Really Want the govt Deciding Encryption Policy?

What is the best way to keep everyone safe from the various bad guys out there? In one limited sense, this shares an argument with the U.S. gun debates: is it safer for an individual to have a gun, or is it more likely that the bad guy would simply take that gun and use it against the citizen? In the encryption argument, the question is whether it's safer to let the government have full access, or whether that just makes it easier for the bad guys to steal that full access. Framed in that "which truly makes us safer" perspective, I think there are good arguments on both sides. But if that technology-oriented question is going to be answered by any individual, I'm somehow more comfortable with the Tim Cooks of the world making that call than some politician. At least Tim Cook is honest about his motivation.



Quote for the day:


"You have to think anyway, so why not think big?" -- Donald Trump

