June 25, 2014

Data Distribution Network (DDN) vs. Content Distribution Network (CDN)
The difference between a content distribution network (CDN) and a data distribution network (DDN) is that live conversational data is cached in real time rather than content being cached at regular intervals, so the cache is typically much smaller and much more up to date. The data is cached in a hierarchy of topics to allow easy subscription to subsets of data (topic branches). The data comes from an originating server, typically called a data source, such as a database or an enterprise service bus. Instead of requesting the data, applications (used by customers, employees or machines) subscribe to it. If the data is already cached, the end user or machine gets the current version (or state) of the data, and any subsequent updates are pushed as the data changes.
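To make the subscription model concrete, here is a minimal sketch in Python of a hierarchical topic cache, where a new subscriber first receives the current cached state of its branch and then every subsequent update. The class and method names (TopicCache, subscribe, publish) are illustrative assumptions, not taken from any particular DDN product:

# Minimal sketch of a hierarchical-topic data cache with push-based updates.
# All names here (TopicCache, subscribe, publish) are illustrative only.
from collections import defaultdict

class TopicCache:
    def __init__(self):
        self._state = {}                       # topic path -> latest value
        self._subscribers = defaultdict(list)  # topic prefix -> callbacks

    def subscribe(self, prefix, callback):
        # Subscribe to a branch of the topic tree, e.g. "prices/fx".
        self._subscribers[prefix].append(callback)
        # A new subscriber immediately receives the current cached state...
        for topic, value in self._state.items():
            if topic.startswith(prefix):
                callback(topic, value)

    def publish(self, topic, value):
        # A data source pushes an update; only the change is fanned out.
        self._state[topic] = value
        # ...and every subsequent change is pushed as it happens.
        for prefix, callbacks in self._subscribers.items():
            if topic.startswith(prefix):
                for cb in callbacks:
                    cb(topic, value)

cache = TopicCache()
cache.publish("prices/fx/EURUSD", 1.36)
cache.subscribe("prices/fx", lambda t, v: print(t, "->", v))  # receives current state
cache.publish("prices/fx/EURUSD", 1.37)                       # receives pushed update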


CanaryRelease
Canary release is an application of ParallelChange, where the migrate phase lasts until all the users have been routed to the new version. At that point, you can decommission the old infrastructure. If you find any problems with the new version, the rollback strategy is simply to reroute users back to the old version until you have fixed the problem. A benefit of using canary releases is the ability to do capacity testing of the new version in a production environment with a safe rollback strategy if issues are found. By slowly ramping up the load, you can monitor and capture metrics about how the new version impacts the production environment. This is an alternative approach to creating an entirely separate capacity testing environment, because the environment will be as production-like as it can be.
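As an illustration only, a canary rollout can be approximated by routing a slowly increasing fraction of users to the new version, with rollback amounting to setting that fraction back to zero. The router, version labels and percentages below are assumptions for the sketch, not part of the pattern's definition:

# Illustrative canary router: a stable hash of the user ID decides which
# version serves the request, so each user stays on one version while the
# canary percentage is ramped up. Names and thresholds are assumptions.
import hashlib

class CanaryRouter:
    def __init__(self, canary_percent=0):
        self.canary_percent = canary_percent  # share of users on the new version

    def route(self, user_id):
        # Stable bucket in [0, 100) derived from the user ID.
        bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
        return "v2" if bucket < self.canary_percent else "v1"

router = CanaryRouter(canary_percent=5)   # start by exposing 5% of users
print(router.route("user-42"))

router.canary_percent = 50   # ramp up while monitoring production metrics
router.canary_percent = 0    # rollback: everyone is routed back to v1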


Leading Innovation is the Art of Creating ‘Collective Genius’
Collective Genius shows how Bill Coughran, Google's then senior vice president of engineering, created an environment where engineers could figure out on their own how to best address the company's massive storage challenges in 2006. The problem: storage issues were created by the huge amount of data processed by the Google File System (GFS), which was designed for Google web searches. One team, called Big Table, argued for adding systems on top of GFS; the other team, called Build from Scratch, wanted to replace GFS entirely. Coughran decided to give the two teams space to defend their ideas, letting them collect data and test rigorously.


5 best practices for a world-class SAS environment
SAS administration requires specialized knowledge that typical IT teams do not have on hand. Over the last 10 years, my colleagues and I have found that SAS support requires IT skills, knowledge of the company’s data and knowledge of how that data gets applied to solve specific business problems.  Companies that want a world-class SAS environment need to have dedicated resources who can proactively maintain SAS. With a dedicated resource, you'll be well-positioned to increase performance, minimize downtime and ultimately maximize your investment in SAS software.


A checklist for defining your mobile application architecture
Given the wide range of technology available in the mobile space and the rapidly evolving nature of a mobile enterprise, it is important to go through a process to define the application architecture blueprint. ... A robust architecture is not just for the current release; it will help you build a long-term mobile foundation. There are many other architectural decisions you will have to make around integration, testing and hosting for your mobile solution. In this post, I focused on the key components of the mobile application architecture that will serve as a guideline to the application development team.


eBook: De-identification Protocols: Essential for Protecting Privacy
Recent reports, including those emanating out of John Podesta’s Big Data and Privacy Workshops, have further fuelled this misunderstanding. ... We again submit that these views are an over-simplification, inconsistent with current evidence, and largely based on the re-identification of poorly de-identified information. The purpose of this paper is to clarify what it means to properly de-identify personal information, to underscore the value of strong de-identification, to interpret recent research which has been used to call into question the value of de-identification in the protection of privacy, and to emphasize the conclusions that may properly be drawn from this research.
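As a rough illustration of the difference between poor and proper de-identification (the field names and generalisation rules below are assumptions chosen for the example, not the protocols discussed in the eBook), strong de-identification removes direct identifiers and generalises quasi-identifiers such as dates of birth and postcodes, rather than merely deleting the name column:

# Illustrative de-identification of a single record: direct identifiers are
# dropped and quasi-identifiers are generalised. Field names and rules are
# assumptions for the example, not the protocols described in the eBook.
record = {
    "name": "Jane Doe",            # direct identifier -> remove entirely
    "date_of_birth": "1974-03-18", # quasi-identifier -> generalise to year
    "zip_code": "90210",           # quasi-identifier -> truncate
    "diagnosis": "asthma",         # the analytic payload -> retain
}

def deidentify(rec):
    return {
        "birth_year": rec["date_of_birth"][:4],
        "zip3": rec["zip_code"][:3] + "**",
        "diagnosis": rec["diagnosis"],
    }

print(deidentify(record))
# Deleting only "name" while keeping the full date of birth and ZIP code is
# the kind of poorly de-identified data that re-identification studies target.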


Verizon Virtual Visits Enters Telehealth Market
Medical services are provided through a relationship Verizon forged with a third-party provider network, which the company declines to name. Virtual Visits matches patients to the next available participating clinician in their state. Organizations such as health systems also can use their own healthcare professionals, or a hybrid model of internal clinicians augmented by external clinicians for after-hours support, says Kling. "Some may want more nurse practitioners; some may want more physicians," Kling says of Virtual Visits' contracts with payers. "Verizon did its own market research on that. Customers and consumers are fine with both." Typical visits take 30 minutes and can occur almost anywhere, according to Verizon Enterprise Solutions.


Casework for Data Governance
If data governance simply reacts in an ad hoc manner to the services requested of it, then it is likely to be limited in its effect and will have difficulty demonstrating how it is making a positive difference. However, if data governance can manage its service requests via a casework approach then it can be much more successful. Casework involves using a standard process to record, assign, prioritize, manage, report on and close out service requests. We often think of casework as something that the police, or social workers, doctors or elected representatives do, but it is quite feasible - and actually quite necessary - for successful data governance units to adopt casework principles and apply them to their everyday activities.
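A minimal sketch of what a casework record might look like for a data governance team follows; the lifecycle statuses and fields are assumptions for the example, not taken from the article. The point is simply that every service request is recorded, assigned, prioritised and closed out through the same standard process:

# Illustrative casework record for a data governance service request.
# Statuses and fields are assumptions chosen for the example.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class GovernanceCase:
    case_id: str
    description: str
    priority: str = "medium"   # e.g. low / medium / high
    status: str = "recorded"   # recorded -> assigned -> closed
    assignee: Optional[str] = None
    opened: date = field(default_factory=date.today)
    closed: Optional[date] = None

    def assign(self, person, priority):
        self.assignee, self.priority, self.status = person, priority, "assigned"

    def close(self):
        self.status, self.closed = "closed", date.today()

case = GovernanceCase("DG-061", "Clarify ownership of the customer master data")
case.assign("CRM data steward", priority="high")
case.close()
print(case)   # closed-out cases are what make the unit's impact reportable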


Gartner’s top 10 security technologies for 2014
Gartner yesterday highlighted the top ten technologies for information security and their implications for security organisations in 2014. Analysts presented their findings during the Gartner Security & Risk Management Summit, taking place this week in the US. “Organisations are dedicating increasing resources to security and risk,” said Neil MacDonald, vice president and Gartner Fellow. “Nevertheless, attacks are increasing in frequency and sophistication. “Security and risk leaders need to fully engage with the latest technology trends if they are to define, achieve and maintain effective security and risk management programmes that simultaneously enable business opportunities and manage risk.”


Encrypted Web traffic can reveal highly sensitive information
Almost all websites that exchange sensitive data rely on SSL/TLS (Secure Sockets Layer/Transport Layer Security) technology, which encrypts data exchanged between a person's computer and a server. The data is unreadable, but the researchers developed a traffic analysis attack that makes it possible to identify which individual pages within a website a person has browsed with about 80 percent accuracy. Previous research had shown it was possible to do such analysis, but only with an accuracy rate of about 60 percent. They evaluated the effectiveness of the attack using 6,000 web pages within 10 websites: the Mayo Clinic, Planned Parenthood, Kaiser Permanente, Wells Fargo, Bank of America, Vanguard, the ACLU, Legal Zoom, Netflix and YouTube.
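The underlying idea, sketched very loosely below, is that encrypted pages still leak their size and transfer pattern, so a fingerprint built from observed packet lengths can be matched against fingerprints of known pages. The feature choice, distance metric and page names are assumptions; the researchers' actual attack is considerably more sophisticated:

# Very loose sketch of a traffic-analysis fingerprinting attack: each known
# page is represented by the sequence of encrypted packet sizes it produces,
# and an observed trace is matched to the nearest fingerprint. Features,
# metric and page names are assumptions, not the published attack.
known_fingerprints = {
    "clinic.example/condition-a": [1500, 1500, 420, 1500, 310],
    "clinic.example/condition-b": [640, 1500, 1500, 1500, 1500],
    "bank.example/transfer":      [380, 520, 1500, 260, 140],
}

def distance(a, b):
    # Simple L1 distance over equal-length packet-size vectors.
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(observed_trace):
    return min(known_fingerprints,
               key=lambda page: distance(observed_trace, known_fingerprints[page]))

# The eavesdropper never decrypts anything, yet the size pattern is telling:
print(classify([1500, 1480, 430, 1500, 300]))   # -> clinic.example/condition-a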



Quote for the day:

"'Emergencies' have always been the pretext on which the safeguards of individual liberty have been eroded." -- Friedrich August von Hayek
