August 17, 2015

Five Critical Layers of Next-Gen Data Center Automation and Orchestration

If you look at the modern data center and cloud landscape, you’ll notice a lot more interconnectivity and new capabilities to dynamically share resources. Some solutions even allow cross-connects for an easier flow of data. The interesting piece here is how all of these technologies, which are currently influencing the end user and the corporation, are directly pushing the evolution of the modern data center through data center automation. Cloud computing, Big Data and IT consumerization have transformed the data center into the central hub for everything. Today, entire organizations are born from a cloud model that resides within the data center.


Polyera Wove Band Boasts Flexible Display

At a basic level, a display is made up of two parts: The first is a frontplane, which is the layer that makes the image you see. The second is a backplane, which is an array of TFTs that control which pixels in the frontplane turn on and off. With Polyera Flexible TFTs, the backplane can be made flexible, making the full display flexible, in contrast to traditional display technologies where the backplane has previously been a constraint. "Most attempts to make flexible displays have relied on depositing traditional electronic materials, such as silicon, on plastic substrates," the company explains on its website. "This approach allows the creation of products with displays in a fixed curve, but the brittleness of these electronics layers makes them unsuitable for products which are dynamically flexible."


Key findings from the 2015 US State of Cybercrime Survey

It’s no wonder, then, that we found rising concern among the 500 US executives, security experts, and others from the public and private sectors who participated in the 2015 US State of Cybercrime Survey. In fact, 76% of respondents said they are more concerned about cybersecurity threats this year than in the previous 12 months, up from 59% the year before. Organizations must summon the vision, determination, skills, and resources to build a risk-based cybersecurity program that can quickly detect, respond to, and limit fast-moving threats. The US State of Cybercrime Survey is a collaborative effort with PwC, CSO, the CERT® Division of the Software Engineering Institute at Carnegie Mellon University, and the U.S. Secret Service.


Target launches new beacon program using a system in need of change

This is the world we live in now. With each new evolution of the Internet of Things, we have to face the fact that every step forward will be met with a step or two backward, as those that can... do. Unfortunately, there is no getting around it. Hacks will happen. Should you create something of interest, something on a network, it will be found and it will be cracked. The more you advertise something, the more people will know about it. And if you think there isn't a monetary gain to be had from hacking your system... think twice. Your swell new app or system has a user base, and that means there's a database of users that can be sold to the highest bidder.


Are comatose servers your next big IT headache?

To get a sense of the cost of the problem, think about how much you could save if you just turned off a third of the hardware that you manage – got rid of or re-used the licensing, unplugged the hardware, and liquidated the rest of it. It’s a problem with an enormous cost, and even if the study is half wrong, at 15 percent the waste is still significant. Why does this happen? Fundamentally, it comes down to not knowing what you have and what it is doing. It used to be a little easier to keep track of things: in order to roll out new servers, you had to requisition one, send a PO, receive it, inventory it, and mark it, so at least you knew what type of silicon you had in your server closet racks. The operating system and software were another story, but at least you had a fighting chance.
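
As a rough illustration of the "knowing what you have" problem, here is a minimal Python sketch that flags likely-comatose machines from an exported utilization report. The file name, column names, and thresholds are all hypothetical, not drawn from the article or from any particular monitoring product.

    # Hypothetical sketch: flag likely-comatose servers from a utilization export.
    # Assumes a CSV named "server_metrics.csv" with columns hostname,
    # avg_cpu_percent, network_kbps, last_login_days -- illustrative names only.
    import csv

    CPU_IDLE_THRESHOLD = 2.0   # percent; below this, the CPU is essentially idle
    NET_IDLE_THRESHOLD = 5.0   # kbps; near-zero traffic over the sample window
    STALE_LOGIN_DAYS = 180     # no interactive login in six months

    def find_comatose(path="server_metrics.csv"):
        suspects = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                idle_cpu = float(row["avg_cpu_percent"]) < CPU_IDLE_THRESHOLD
                idle_net = float(row["network_kbps"]) < NET_IDLE_THRESHOLD
                stale = int(row["last_login_days"]) > STALE_LOGIN_DAYS
                if idle_cpu and idle_net and stale:
                    suspects.append(row["hostname"])
        return suspects

    if __name__ == "__main__":
        for host in find_comatose():
            print("candidate for decommissioning:", host)

In practice the thresholds would be tuned per environment, and a flagged host is a candidate for investigation, not for immediate shutdown.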


For CIOs invested in digital transformation, think platform

It's the technology, the data, the people and the processes that hardwire some capability into your organization. One place where many companies attempted this was the implementation of an ERP system and the associated transformation. Companies had all of these messy, disjointed processes, and they suddenly said, 'We ought to have standard processes around finance, around supply chain.' So they built platforms. Some of them were really good -- this is why I've so enjoyed studying Lego; they went after their supply chain and fixed it. Campbell's Soup went after theirs; Nordstrom has an unbelievable supply chain platform.


3 security mistakes developers make with online services

One big danger with online source code comes from the data that is committed there. For example, if developers use a public repository to host the code of an internal application, they could be inadvertently exposing corporate infrastructure details on the internet. Sometimes configuration files refer to internet-facing test systems, user names, passwords, or hidden interfaces. Other intellectual property includes proprietary algorithms, undocumented APIs, or even production data that is being used as test data. Private keys (for SSH, for TLS web servers, for mobile app signing, and more) are found frequently in online repositories, where it is easy to search for them. Small to medium-size businesses are especially likely to leverage free or nearly-free online services this way.
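
Because private keys and hard-coded credentials follow recognizable patterns, even a crude scan of a checkout can catch many of them before they are pushed. The following is a minimal sketch; the regular expressions and the scanning approach are illustrative assumptions, and purpose-built secret scanners are the better choice in practice.

    # Illustrative only: a minimal scan for the kinds of secrets the article
    # mentions (private keys, passwords and API keys in config files).
    import os
    import re

    SECRET_PATTERNS = [
        re.compile(r"-----BEGIN (?:RSA |EC |OPENSSH )?PRIVATE KEY-----"),
        re.compile(r"(?i)password\s*[:=]\s*\S+"),
        re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"]?[A-Za-z0-9]{16,}"),
    ]

    def scan_tree(root="."):
        findings = []
        for dirpath, dirnames, filenames in os.walk(root):
            dirnames[:] = [d for d in dirnames if d != ".git"]  # skip git metadata
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        for lineno, line in enumerate(f, 1):
                            for pat in SECRET_PATTERNS:
                                if pat.search(line):
                                    findings.append((path, lineno, pat.pattern))
                except OSError:
                    continue  # unreadable file; skip it
        return findings

    if __name__ == "__main__":
        for path, lineno, pattern in scan_tree():
            print("%s:%d: matches %s" % (path, lineno, pattern))

Running a scan like this as a pre-commit hook keeps the check where the mistake happens: before the secret ever reaches a public repository.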


Can big databases be kept both anonymous and useful?

This is a true dilemma. People want both perfect privacy and all the benefits of openness. But they cannot have both. The stripping of a few details as the only means of assuring anonymity, in a world choked with data exhaust, cannot work. Poorly anonymised data are only part of the problem. What may be worse is that there is no standard for anonymisation. Every American state, for example, has its own prescription for what constitutes an adequate standard. Worse still, devising a comprehensive standard may be impossible. Paul Ohm of Georgetown University, in Washington, DC, thinks that this is partly because the availability of new data constantly shifts the goalposts. “If we could pick an industry standard today, it would be obsolete in short order,” he says.
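
One way to see why stripping a few details fails is to measure k-anonymity: the size of the smallest group of records sharing the same combination of quasi-identifiers, such as postcode, birth year, and sex. Here is a toy Python sketch with made-up records (none of this data is from the article):

    # Count how many records share each quasi-identifier combination.
    # A record that is unique on (zip, birth_year, sex) can potentially be
    # re-identified by linking those fields to another dataset.
    from collections import Counter

    records = [  # hypothetical "anonymised" rows: (zip, birth_year, sex, diagnosis)
        ("02138", 1945, "F", "hypertension"),
        ("02138", 1945, "F", "diabetes"),
        ("02139", 1971, "M", "asthma"),
        ("02140", 1980, "F", "flu"),
    ]

    quasi_ids = [(zip_, yob, sex) for zip_, yob, sex, _ in records]
    group_sizes = Counter(quasi_ids)

    k = min(group_sizes.values())  # the dataset is k-anonymous for this k
    print("dataset is %d-anonymous on (zip, birth_year, sex)" % k)
    for combo, size in group_sizes.items():
        if size == 1:
            print("uniquely identifiable record:", combo)

Here k = 1: two of the four "anonymised" records are unique on just three fields, which is exactly the linkage risk that ever-growing data exhaust keeps making worse.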


Hackers Find Infiltrating Government Computers is not Mission Impossible

“You would think the federal government would have better safeguards, but ultimately they are only as strong as their weakest employee,” says Daugherty, who has spoken at cybersecurity gatherings. “That boils down to knowledge and training.” Daugherty says security risks are one reason there are concerns about Hillary Clinton using a private server for her e-mail when she was secretary of state. “The potential for sensitive e-mails to be lost is the issue,” he says. “Whether they actually were or were not lost is not the issue, so Hillary's e-mail headache isn't going away anytime soon.” He suggests tips that both government agencies and private businesses need to remember to defend against hackers.


Optimization Analytics Comes to the Mass Market

As computing capabilities became increasingly affordable, companies could use more complex algorithms to handle ever more difficult optimization problems. For instance, the airline industry used it to determine how best to route aircraft between two cities and to staff flight crews. Not only can software find the best solution for scheduling these assets in advance, it also can rapidly re-optimize the solution when weather or mechanical issues force a change in how aircraft and crews are deployed. Airlines were also in the vanguard in the 1980s when they started using revenue management techniques. In this case, the optimization process was designed to enable established airlines to compete against low-cost startups.
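
At its simplest, the revenue-management piece reduces to a linear program: choose how many seats to sell in each fare class to maximize revenue, subject to capacity and forecast demand. Below is a deliberately tiny sketch with made-up numbers, using SciPy's generic LP solver rather than any airline's actual system:

    # Toy fare-class allocation: one 180-seat aircraft, two fare classes.
    from scipy.optimize import linprog

    # Decision variables: x0 = full-fare seats sold, x1 = discount seats sold.
    # Fares: $400 full, $150 discount. linprog minimizes, so negate revenue.
    c = [-400, -150]

    # Total seats sold cannot exceed the 180-seat capacity.
    A_ub = [[1, 1]]
    b_ub = [180]

    # Forecast demand caps each fare class: at most 60 full-fare, 200 discount.
    bounds = [(0, 60), (0, 200)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    full, discount = res.x
    print("sell %.0f full-fare and %.0f discount seats for $%.0f revenue"
          % (full, discount, -res.fun))

When weather or a mechanical problem changes the picture, the same model is simply re-solved with updated constraints, which is what makes the rapid re-optimization the article describes practical.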


Quote for the day:

"The Internet, mobile and social media have ushered in a new normal when it comes to customers' expectations." -- Paul Cho
