June 18, 2015

Security—A Perpetual War: Lessons from Nature
A phishing website’s main goal is to masquerade as a legitimate website and trick users into giving out their secrets (passwords, credit card numbers, and the like). Thus, the essence of this attack technique is to attract victims and fool them into swallowing the bait. Many predators in the animal and plant kingdoms have long used the same approach. For example, the anglerfish (Lophius piscatorius), sometimes referred to as the “sea-devil,” has three long filaments along the middle of its head; the most important is the first and longest, which terminates in a lappet that can move in every direction. This lure attracts other fish, which the anglerfish then seizes with its enormous jaws as they approach.


Why VMware may fall victim to virtualization cost cutting
The report's authors define shadow data as all of the "potentially risky data exposures lurking in sanctioned cloud apps, due to lack of knowledge of the type of data being uploaded, and how it is being shared." Based in San Jose, Calif., Elastica provides cloud application security services built on data science algorithms. According to Elastica, it is not enough to understand shadow IT: evaluating cloud apps at enterprise scale requires data science methods that analyze files and cloud transactions in order to classify data and identify threats to security and compliance. A set of sophisticated analysis tools is probably called for, since the report found that the average enterprise uses an eye-popping 774 cloud apps.


Companies Should Heed DOJ’s New Cybersecurity Guidance to Minimize Liability
In releasing its “Best Practices for Victim Response and Reporting of Cyber Incidents,” the DOJ's Cybersecurity Unit called upon law enforcement and private industry to share in the effort to improve systems that protect consumer information. The Guidance sets forth detailed steps to improve cybersecurity and breach response at all stages within the breach lifecycle, ranging from preparation and deterrence to incident notification, response and, ultimately, remediation. The DOJ standards are being viewed by many industry observers as the new benchmark against which corporate cyber-incident preparedness and response efforts may be measured. Although the proposed standards may not apply to all organizations in all instances, companies of all sizes would be ill-advised to ignore them.


Google is taking a page from Facebook and starting to talk about its homegrown hardware
Historically, Google has treated its homegrown hardware as a trade secret, unwilling to discuss it. But this week, Google took a big step and started talking more openly, particularly about the networking tech it has invented. Two things caused Google to change its mind: one is that its rival down the road, Facebook, has not only been talking about its own technology but has also created an open-source hardware foundation to give those designs away to anyone for free. The Open Compute Project allows anyone to use those designs, modify them, and share improvements that Facebook can use in turn. Contract manufacturers are standing by to build the hardware.


6 Survival Strategies for CIOs
Companies often talk about “IT” and “the business” as if they were totally separate entities, but information technology now touches almost every facet of the organization. Leadership and digital leadership must become one and the same, but this doesn’t happen easily when business and IT professionals have spent their careers isolated from each other. A survey conducted by CSC’s Leading Edge Forum (2014 Outside-In Barometer) shows that most business executives still view IT as a back-office function, known for stability rather than disruption. As a result, we’ve seen other leaders emerge to challenge the CIO for dominance. In this type of environment, how can a CIO stay relevant? As investments for digital innovation increase, how can CIOs ensure this money is allocated to them? Here are six strategies for doing so:


Is Complexity the Downfall of IT Security?
The problem with an extremely complex security system is reasonably obvious if you think about it, but it may help to consider a somewhat similar situation: reliability. When building an airplane, for instance, engineers add redundancy to the various systems to ensure that if one fails, a standby system is ready to take over. One might think, at first glance, that the engineers could achieve almost any reliability level they wanted simply by adding more and more redundancy. The problem is that in addition to the redundant system itself (say, rudder control), there must also be a system that manages the transfer in the event of a failure. But that system is itself subject to failure and may require its own redundancy. The gist of the matter is that beyond a certain point, additional redundancy can actually harm reliability, contrary to what intuition would dictate.
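The diminishing, and eventually negative, returns described above can be sketched with a toy reliability model. All the numbers here are illustrative assumptions, not figures from the article: each control channel works with probability r, and the changeover mechanism is assumed to have failure modes (probability f) that take down the system even when the primary channel is healthy.

```python
def no_redundancy(r):
    """System works iff the single channel works."""
    return r

def with_redundancy(r, f):
    """Two channels plus a changeover mechanism.

    The switch must work (probability 1 - f); a switch fault is
    assumed to disable the system even if the primary is healthy
    (a common-mode failure). At least one channel must also work.
    """
    return (1 - f) * (1 - (1 - r) ** 2)

r = 0.99  # reliability of one control channel (illustrative)
f = 0.02  # failure probability of the changeover mechanism (illustrative)

print(no_redundancy(r))       # 0.99
print(with_redundancy(r, f))  # 0.979902 -- the "redundant" design is worse
```

With a very reliable switch (f well below 1 - r) redundancy helps; once the switch's own failure probability exceeds the reliability it adds, the redundant design is worse than the simple one, which is exactly the counterintuitive result the paragraph describes.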


Tomomi Imura on Mobile Web, Future of CSS
Currently many developers depend on preprocessors such as Sass or Less because so many features we want to use are missing from the current web standard. First of all, with so many different browsers we need browser-specific prefixes: if we want to support new features like animations, we have to add a vendor prefix for each browser, and that can get really long, so we want to get rid of those by using a preprocessor. Another example is variables: if we assign certain colors to variables, we can reuse the same variable instead of changing every occurrence each time the design changes. This is not doable yet with current CSS, but a new standard is coming; there is a proposal for CSS variables and other features that close the gap between the current standard and what preprocessors do, so that would be really wonderful news for us.
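As a concrete illustration of the gap she describes, here is a minimal sketch (selector and color names are made up for the example): the Sass variable is compiled away before the browser ever sees it, while the proposed native CSS custom property is resolved by the browser itself.

```css
/* Sass today -- the preprocessor substitutes $brand at compile time:
   $brand: #3f51b5;
   .button { background: $brand; } */

/* Proposed CSS variables -- resolved natively by the browser: */
:root {
  --brand: #3f51b5;
}
.button {
  background: var(--brand); /* change --brand once; every use updates */
}
```

The practical difference is that native custom properties can change at runtime (for example, per media query), whereas preprocessor variables are frozen into the generated stylesheet.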


Lawmaker Urges U.S. Personnel Office Chief to Quit Over Hacking
In testimony delivered before questioning began, Archuleta said the agency fends off an average of 10 million hacking attempts a month and that the attacks will increase. “Government and non-government entities are under constant attack by evolving and advanced persistent threats and criminal actors,” she said. Archuleta said the detection of the attacks was an example of improved security monitoring by the agency. “We discovered these intrusions because of our increased efforts in the last 18 months to improve cybersecurity at OPM, not despite them,” Archuleta said. However, lawmakers cited a report from OPM’s inspector general last year that recommended Archuleta shut down computer systems that lacked security validations. Archuleta said she didn’t disable the systems because doing so could have negatively affected other databases and records.


Cut big data blending time from several months to several hours
"There are two approaches when it comes to preparing big data for analytics," said Merritt. "The first approach is building a data warehouse, which is defined and designed by business users and IT. This data warehouse is usually built from systems of record and transactional data. The data is also cleaned and checked for quality with an ETL (extract, transform, load) process before it is blended. The second approach is what we focus on. This is a self-service data preparation approach designed especially for business users who need to prepare and query big data without support from IT. They can pull in data from different sources and organize and work with the data in formats that are already familiar to them."
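A minimal sketch of the blending step the quote alludes to: joining records from two different sources on a shared key. The sources, field names, and records below are invented for illustration; real self-service tools do this at big-data scale with quality checks along the way.

```python
# Records pulled from two hypothetical sources: a CRM export and a
# transaction log, blended on the shared customer_id key.
crm = [
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex"},
]
transactions = [
    {"customer_id": 1, "amount": 120.0},
    {"customer_id": 1, "amount": 75.5},
    {"customer_id": 2, "amount": 300.0},
]

def blend(left, right, key):
    """Inner-join two lists of records on a shared key."""
    by_key = {}
    for row in left:
        by_key.setdefault(row[key], []).append(row)
    joined = []
    for row in right:
        for match in by_key.get(row[key], []):
            merged = dict(match)   # copy the left record ...
            merged.update(row)     # ... and fold in the right one
            joined.append(merged)
    return joined

blended = blend(crm, transactions, "customer_id")
print(len(blended))  # 3 rows: each transaction paired with its customer
```

Conceptually, the warehouse/ETL approach does this join (plus cleansing) up front under IT's control, while the self-service approach lets the business user perform it on demand.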



Intelligent machines part 2: Big data, machine learning and its challenges
Although deep learning has proven to be a powerful form of machine learning in recent years, its computational expense might not yield much higher performance on certain tasks, says Robin Anil, an ex-Googler who left the company this year to work on the startup Tock with other former Google staff. “The places where deep learning has given large improvements are on things like image recognition, where traditional algorithms like logistic regression did not do well. You might be able to get small improvements by applying deep learning to an existing problem that has already been solved using logistic regression, but that small improvement and the amount of compute power that you use may not be worth it,” Anil points out.



Quote for the day:

"If you just focus on the smallest details, you never get the big picture right." -- Leroy Hood