April 10, 2015

SDDC adoption on a 'slow roll'
"It is a very new model, especially on the [software-defined networking] side," Dennehy said. "Customers are being extra careful about how they go down this road." In addition to the changes that SDDC brings to hardware and software, it will also usher in changes for IT staff. Tasks previously performed by highly skilled employees can be performed by software, according to Forrester's brief, "The Software-Defined Data Center Is Still A Work In Progress," by Richard Fichera. ... "The adoption of SDN is really concentrated in telecom and the very big data centers such as Google, Amazon and Facebook," Dennehy said. As for software-defined storage, it's not "plug and play," said Stanley Stevens, also a senior analyst at TBR.

Technology is turning genealogy on its head
The search for identity is often rooted in the past, which is why genealogy remains so popular. Technology has helped in many ways, from making it easy for home family-tree builders to create diagrams and search local council records, to powerful servers crunching data to find geographic correlations that might imply family connections. And then there's our DNA. Watson and Crick's discovery of the double-helix DNA 'code' didn't immediately change the world. But as computing power has increased, so too has the scope of DNA analysis. Sequencing that once cost tens of thousands of dollars now costs much less than one percent of that – and it's sequencing that tells us who we are biologically, or at least what we're made of.

Lambda Complexity: Why Fast Data Needs New Thinking
Rather than address the flaws directly, you simply run both the batch and streaming systems in parallel. Lambda refers to the two systems as the “Speed Layer” and the “Batch Layer”. The Speed Layer can serve responses in seconds or milliseconds. The Batch Layer serves both as a long-term record of historical data and as a backup and consistency check for the Speed Layer. Proponents also argue that engineering work is easier to divide between teams when there are two discrete data paths. It’s easy to see the appeal. But there’s no getting around the complexity of a Lambda solution. Running both layers in parallel, doing the same work, may add some redundancy, but it also adds more software, more hardware and more places where the two systems need to be glued together.
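The dual data path described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the pattern, not code from any Lambda framework; the names (`master_log`, `speed_view`, `batch_view`) and the page-view-counting workload are assumptions for the sake of the example.

```python
from collections import Counter

master_log = []         # batch layer input: immutable, append-only event history
speed_view = Counter()  # speed layer: incremental counts since the last batch run

def ingest(event):
    """Every incoming event feeds both layers in parallel."""
    master_log.append(event)        # batch layer records history
    speed_view[event["page"]] += 1  # speed layer updates in milliseconds

def run_batch():
    """Periodic batch job: recompute the view from the full history,
    then reset the speed layer deltas it now supersedes."""
    batch_view = Counter(e["page"] for e in master_log)
    speed_view.clear()
    return batch_view

def query(batch_view, page):
    """Serving side: merge the (possibly stale) batch view with the
    recent deltas still held only by the speed layer."""
    return batch_view[page] + speed_view[page]
```

Even in this toy version, the glue the article warns about is visible: `run_batch` and `query` must agree on exactly when speed-layer deltas are superseded, and both layers duplicate the same counting logic.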

HP Spectre x360 review: A sexy convertible that just can't take the heat
This configuration is actually fairly competitive. Outfitted with similar components, Dell’s XPS 13, for example, is $800—but it’s not a convertible, and it even lacks a touchscreen at that price. Also, the XPS 13’s smaller, lighter form factor feels great until you touch the keyboard. The Spectre x360’s keyboard is far more comfortable to type on than the XPS 13’s. Frankly, I’d trade the XPS 13’s compact size for the Spectre x360’s keyboard in a second if it were my daily driver. Other details of the Spectre x360 also impressed me. The tiny power button on the left side of the frame is a bit annoying—you have to hunt for it. However, it takes just enough pressure that you can’t easily activate it by accident. On the convertible Yoga 3 Pro, I’d put the machine to sleep all the time just by picking up the chassis.

Why heresy is good business strategy: Dell’s Armughan Ahmad
Ahmad said that Dell Blueprints – which optimize Dell integration with partner ecosystems – are a critical part of the company's strategy. Dell Blueprints comprise five separate disciplines: Unified Communications and Collaboration, like Skype for Business (formerly Microsoft Lync); Enterprise Applications such as OLTP, CRM and databases; VDI; Big Data analytics; and high-performance computing. “Underneath these, we have these vendor partnerships, and all these companies power these solutions for us,” Ahmad said. “We let them put their hooks deep into our products, and we are willing to democratize the IT for that.” Here’s where the heresy comes into play. Dell’s model embraces a willingness to wipe Dell’s own technology off the partnered products, sacrificing short-term CAPEX profits in the interest of longer-term benefits from reducing customers' OPEX.

A Data Scientist's Advice to Business Schools
Any business graduate is expected to be able to speak a middle language between the priorities of a business and the deep domain knowledge of a company's experts. They should carry that 'generalist's touch' and be able to synthesize myriad high-level approaches into real-world utility for their organization. To produce such graduates, a business school must find ways to teach the general high-level approaches used by domain experts across a company's departments. Graduates should understand how an expert's deep expertise in their field adds value to the overall strategic direction of the company. Only then can value-producing conversations and disruptive ideation take place between the business graduate and the domain expert.

AT&T's data breach settlement called a 'slap on the wrist'
It's "alarming" that AT&T allowed contractor workers to have access to unencrypted customer records, Blech added. "There should no longer be any debate as to whether sensitive customer data should be encrypted or not," he said. It's interesting that the data breach settlement came through the FCC, when the U.S. Federal Trade Commission has been the agency that often pursues companies for data breaches, said Robert Cattanach, a partner at law firm Dorsey & Whitney focusing on cybersecurity and other regulatory litigation. The FCC settlement, the largest in agency history for a data breach, "ups the ante" for penalties, but the FCC may still have been a better option for AT&T, Cattanach said.

Q&A: Marcus Ranum chats with Privacy Professor CEO Rebecca Herold
Identify the risks those vendors present to the organization based on a variety of factors, including the types of information they are accessing, whether or not they are storing sensitive and personal information within their own systems, and the types of safeguards they have in place for those systems. Document it. Determine which vendors are high, medium and low risk; then dedicate attention appropriately. Perform regular security and privacy reviews -- there are many ways to do this -- for the high-risk vendors, as well as appropriate checks for the medium- and low-risk vendors. Keep an eye out for any published reports of breaches involving the vendors you are using.
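The tiering step Herold describes can be sketched as a simple scoring function. The factors mirror the ones she lists (data accessed, where it is stored, safeguards in place), but the weights, thresholds, and review cadences below are illustrative assumptions, not part of her process.

```python
# Assumed review cadences per tier -- illustrative, not prescriptive.
REVIEW_CADENCE = {"high": "quarterly", "medium": "semiannual", "low": "annual"}

def risk_tier(vendor):
    """Score a vendor on the factors named above and bucket it into
    high / medium / low risk. Weights and cutoffs are assumptions."""
    score = 0
    if vendor["accesses_personal_data"]:
        score += 2  # what kind of information they can touch
    if vendor["stores_data_on_own_systems"]:
        score += 2  # sensitive data held on systems you don't control
    if not vendor["has_documented_safeguards"]:
        score += 1  # weak or undocumented protections raise the risk
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

vendor = {
    "accesses_personal_data": True,
    "stores_data_on_own_systems": True,
    "has_documented_safeguards": False,
}
tier = risk_tier(vendor)  # a vendor like this lands in the high tier
```

The point of the sketch is the structure, not the numbers: once every vendor is scored the same way, "dedicate attention appropriately" becomes a lookup in a cadence table rather than an ad-hoc judgment each time.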

Internet of Things must learn interoperability lessons from history
“IoT is a whole myriad of different ways of connecting things,” he says. “It could be fixed, Wi-Fi, NFC, cellular, ultra-narrow band or even ZigBee - there are so many, and they have different uses. You have to mix and match what is best to make connections work.” In the early days of Ubiquisys, Franks encountered similar issues. There were, he says, a number of proprietary wireless technologies that wouldn’t talk to each other, making it impossible to roam from town to town, let alone country to country. The solution was to get all the technologies into the same room and try to thrash out an interoperability plan.

You’ve Completed Unit Testing; Your Testing has Just Begun
Stopping just after unit testing the code is akin to starting mass production of automobiles after testing each nut and bolt of a car. Of course nobody would ever take such a huge risk; in real life, the car would first be taken on many test drives to check that not just every nut and bolt, but every other part, performs in coordinated orchestration as intended. In the software development world, the test drive translates into what we affectionately refer to as integration testing. Integration testing verifies that the collaboration of classes works. In the Java world, both the Spring framework and the Java EE platform act as containers that provide APIs over available services, for example JDBC for database access.
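The distinction is easiest to see in code. The article speaks in Java terms (Spring, JDBC), but the same idea can be sketched in Python: instead of mocking the database connection as a unit test would, the test below exercises a small data-access class together with a real (in-memory) SQLite database. `UserDao` and its methods are illustrative names invented for this example.

```python
import sqlite3
import unittest

class UserDao:
    """Minimal data-access class; the name and schema are hypothetical."""
    def __init__(self, conn):
        self.conn = conn

    def save(self, name):
        self.conn.execute("INSERT INTO users(name) VALUES (?)", (name,))

    def count(self):
        return self.conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

class UserDaoIntegrationTest(unittest.TestCase):
    """The 'test drive': unlike a unit test with a mocked connection,
    this runs the DAO against an actual database engine, so schema
    mismatches and bad SQL surface here rather than in production."""
    def setUp(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE users(name TEXT)")
        self.dao = UserDao(self.conn)

    def test_save_then_count(self):
        self.dao.save("ada")
        self.dao.save("grace")
        self.assertEqual(self.dao.count(), 2)
```

A unit test of `UserDao` alone would pass even if the `users` table never existed; only the integrated run proves the parts work together.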

Quote for the day:

"Anyone can hold the helm when the sea is calm."  -- Publilius Syrus
