November 25, 2015

Russian financial cybercrime: how it works

With online financial transactions becoming more common, the organizations supporting such operations are becoming more attractive to cybercriminals. Over the last few years, cybercriminals have increasingly attacked not just the customers of banks and online stores, but the banks and payment systems themselves. The story of the Carbanak cybergroup, which specializes in attacking banks and was exposed earlier this year by Kaspersky Lab, is a clear confirmation of this trend. ... Information on the number of attacks may indicate the extent of the problem, but it does not reveal anything about who carries them out and how. We hope that our review will help to shed light on this aspect of financial cybercrime.

The State of Millennials Worldwide

As the survey’s authors noted, aspiring to the freedom of self-employment while still living with or being supported by family is an age-old contradiction. They found that in Asia and Latin America these arrangements align with cultural norms, but their spread to North American cities, where this historically hasn’t been the case, is a strong signal for municipal leaders to find ways to support this growing segment of their constituents. "Young people may respond positively to policies or programs that foster a mind-set of measured risk for personal or global growth, while laying the groundwork for long-term stability," the authors found.

In Machine Learning, What Is Better: More Data or Better Algorithms?

“In machine learning, is more data always better than better algorithms?” No. There are times when more data helps, and times when it doesn’t. Probably the most famous quote defending the power of data is that of Google’s Research Director Peter Norvig: “We don’t have better algorithms. We just have more data.” This quote is usually linked to the article “The Unreasonable Effectiveness of Data”, co-authored by Norvig himself (you should be able to find the PDF on the web, although the original is behind the IEEE paywall). The last nail in the coffin of better models is when Norvig is misquoted as saying that “All models are wrong, and you don’t need them anyway”.
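The point that more data helps some models and not others can be seen in a toy regression: a high-bias model (always predict the training mean) gains nothing from extra data, while a flexible 1-nearest-neighbour model keeps improving. Everything here — the noise-free data y = 2x, the two models, and the class name — is an illustrative sketch, not from the article.

```java
// Toy illustration: more data helps a flexible model (1-NN regression)
// but not a high-bias one (predict the training mean). Data is y = 2x.
public class DataVsModel {
    static double[] grid(int n) {                 // n points evenly spaced in [0, 1]
        double[] xs = new double[n];
        for (int i = 0; i < n; i++) xs[i] = (double) i / (n - 1);
        return xs;
    }

    static double meanModelMse(double[] trainX, double[] testX) {
        double mean = 0;
        for (double x : trainX) mean += 2 * x;    // training labels are y = 2x
        mean /= trainX.length;
        double mse = 0;
        for (double x : testX) mse += (2 * x - mean) * (2 * x - mean);
        return mse / testX.length;
    }

    static double nnMse(double[] trainX, double[] testX) {
        double mse = 0;
        for (double x : testX) {
            double best = trainX[0];
            for (double t : trainX)               // nearest training point
                if (Math.abs(t - x) < Math.abs(best - x)) best = t;
            double pred = 2 * best;               // predict its label
            mse += (2 * x - pred) * (2 * x - pred);
        }
        return mse / testX.length;
    }

    public static void main(String[] args) {
        double[] test = grid(101);
        // Mean model: test error barely moves as training data grows.
        System.out.printf("mean model, n=5:   %.4f%n", meanModelMse(grid(5), test));
        System.out.printf("mean model, n=500: %.4f%n", meanModelMse(grid(500), test));
        // 1-NN: test error drops sharply with more data.
        System.out.printf("1-NN,       n=5:   %.4f%n", nnMse(grid(5), test));
        System.out.printf("1-NN,       n=500: %.4f%n", nnMse(grid(500), test));
    }
}
```

Running it shows the mean model stuck at roughly the same error regardless of training-set size, while the 1-NN error shrinks with more data — more data is only "better" when the model is flexible enough to use it.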

Artificial Intelligence: 10 Things To Know

"We are trying to make a system which at first sight looks like it might be behaving in some manner that we might ascribe to intelligence," said Moore. "Everything, however, with 'artificial' in the label is actually just a really, really, really fancy calculator, all the way from chess programs to software in cars, to credit-scoring systems, to systems that are monitoring pharmaceutical sales for signs of an outbreak." ... "And people are making bad decisions, which are costing huge numbers of lives every year, by not going to physicians under some circumstances or not letting a doctor know about something important or mismanaging their medications."

Composable Infrastructure: Cutting Through the Noise

By separating the physical components of the server, those resources can be pooled and programmatically composed into a logical server, then subsequently decomposed, returning the elements to the pools for reuse. Breaking the server down in this way means not only that the most efficient use of resources can be made, but also that the lifecycle management of those resources is decoupled. So, in the case of M-Series, a next-generation CPU that would force a complete replacement of a traditional rack-mounted server requires replacing only the CPU, and possibly the DIMMs, to achieve an upgrade. Subsystems such as the local storage, RAID controller, network adapter, power supplies, fans, and cabling are preserved until upgrading those respective elements would yield benefit to the business.

eBook: Foundations of Data Science, by Microsoft Research

The field of algorithms has traditionally assumed that the input data to a problem is presented in random access memory, which the algorithm can repeatedly access. This is not feasible for modern problems. The streaming model and other models have been formulated to better reflect this. In this setting, sampling plays a crucial role and, indeed, we have to sample on the fly. In Chapter ?? we study how to draw good samples efficiently and how to estimate statistical as well as linear-algebraic quantities with such samples. One of the most important tools in the modern toolkit is clustering, dividing data into groups of similar objects. After describing some of the basic methods for clustering, such as the k-means algorithm, we focus on modern developments in understanding these, as well as newer algorithms.
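The "sample on the fly" idea from the excerpt is classically done with reservoir sampling (Algorithm R): a single pass over a stream of unknown length keeps a uniform random sample of size k in O(k) memory. This is a generic sketch of the technique, not code from the book; the class and method names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Reservoir sampling (Algorithm R): maintain a uniform random sample of
// size k over a stream whose length is unknown in advance, in O(k) memory.
public class Reservoir {
    public static List<Integer> sample(Iterable<Integer> stream, int k, long seed) {
        Random rng = new Random(seed);
        List<Integer> reservoir = new ArrayList<>(k);
        int seen = 0;
        for (int x : stream) {
            seen++;
            if (reservoir.size() < k) {
                reservoir.add(x);           // fill the reservoir with the first k items
            } else {
                int j = rng.nextInt(seen);  // uniform in [0, seen)
                if (j < k) {
                    reservoir.set(j, x);    // item replaces a slot with probability k/seen
                }
            }
        }
        return reservoir;
    }

    public static void main(String[] args) {
        List<Integer> stream = new ArrayList<>();
        for (int i = 0; i < 1000; i++) stream.add(i);
        // After one pass, every stream element is in the sample with probability k/n.
        System.out.println(Reservoir.sample(stream, 10, 42L).size()); // 10
    }
}
```

The key invariant is that after processing n items, each item is in the reservoir with probability exactly k/n, which is why a single pass suffices even when n is not known up front.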

Jai Ranganathan on architecting big data applications in the cloud

There are some fundamental design principles behind the original HDFS implementation which don’t actually work in the cloud. For example, the notion that data locality is fundamental to the system design starts to change in the cloud: the large cloud providers are doing all these software-defined networking tricks and can offer bisectional bandwidth on the order of 40 gigabits per second across their data centers … suddenly, you’re talking about moving hundreds of terabytes of data back and forth between a storage and a compute layer without any huge performance penalty. The performance disadvantage of separating storage from compute is not as bad as you think.

Security is the common theme in 2016 top IT projects

The heightened interest doesn't come as a surprise to experts. "Everyone's concerned with security issues due to the nature of what's been happening recently," said Turner, who works for a non-profit organization in western New York that's striving to better connect Medicaid patients with health care providers. After another turbulent year of high-profile breaches, including Ashley Madison, CVS and the Office of Personnel Management, security threats are top-of-mind for board members and CEOs alike, putting a spotlight on CIOs and senior IT leaders. For Vlasich, security and cloud computing, which ranked as a top IT project for the second year in a row, are intertwined thanks, in part, to rogue IT.

Java: The Missing Features

Java’s import syntax is quite limited. The only two options available to the developer are the import of a single class or of an entire package. This leads to cumbersome multi-line imports if we want some but not all of a package, and necessitates IDE features such as import folding for most large Java source files. ... Java’s arrays aren’t collections, and the "bridge methods" provided in the helper class Arrays also have some major drawbacks. For example, the Arrays.asList() helper method returns an ArrayList, which seems entirely reasonable, until closer inspection reveals that it is not the usual java.util.ArrayList but rather the fixed-size nested class Arrays.ArrayList.
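The Arrays.asList() pitfall the excerpt describes is easy to demonstrate: the returned list is a fixed-size view backed by the original array, so set() works but add() and remove() throw UnsupportedOperationException. A minimal sketch (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class AsListPitfall {
    public static void main(String[] args) {
        List<String> list = Arrays.asList("a", "b", "c");

        // set() is allowed: the list is a fixed-size view over the array.
        list.set(0, "z");

        // add() throws: the internal Arrays.ArrayList does not support resizing.
        try {
            list.add("d");
        } catch (UnsupportedOperationException e) {
            System.out.println("add() is unsupported on Arrays.asList()");
        }

        // Copying into a real java.util.ArrayList yields a fully mutable list.
        List<String> copy = new java.util.ArrayList<>(list);
        copy.add("d");
        System.out.println(copy); // [z, b, c, d]
    }
}
```

The common fix is exactly that last step: wrap the result in `new ArrayList<>(...)` whenever a resizable list is needed.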

Key Methods for Ensuring FRCP Data Preservation Compliance

The new FRCP amendments introduce the notion of “reasonable” effort to preserve data across all forms of enterprise communication. In court, organizations must prove they made reasonable efforts to prevent communications data, in any form, from being destroyed. Failure to do so may lead the court to presume that the information not preserved was harmful to your defense. By some estimates, eDiscovery costs U.S. organizations around $41 billion annually. Not only is this expensive, it can also be a time-intensive exercise. So, how can organizations demonstrate “reasonable” preservation efforts?

Quote for the day:

"Speaking about it and doing it are not the same thing." -- Gordon Tredgold
