May 16, 2014

Towards Agile CMMI Level 3: Requirement Development and Verification
Requirement grooming is a practice widely used in the agile community; here we'll focus on verifying requirements. Grooming ensures that backlog items are consistent, aren't duplicated, and don't become obsolete. In non-agile environments this is usually done with a checklist, and there is no reason not to use a checklist with product-backlog items too, as a reminder of what a good user story should be. Since the same artefact is checked in formal lifecycles, why not apply the same checklist within an agile practice?
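As a rough illustration, a grooming checklist can even be automated against a backlog. This is a minimal sketch, not a standard: the field names (`title`, `acceptance_criteria`, `estimate`, `obsolete`) and the specific checks are assumptions for the example.

```python
def groom(backlog):
    """Return the stories that fail basic grooming checks
    (duplicates, missing acceptance criteria, no estimate, obsolete)."""
    seen_titles = set()
    issues = []
    for story in backlog:
        problems = []
        title = story.get("title", "").strip().lower()
        if not title:
            problems.append("missing title")
        elif title in seen_titles:
            problems.append("duplicate of an existing item")
        seen_titles.add(title)
        if not story.get("acceptance_criteria"):
            problems.append("no acceptance criteria (not testable)")
        if story.get("estimate") is None:
            problems.append("not estimated")
        if story.get("obsolete"):
            problems.append("marked obsolete; remove from backlog")
        if problems:
            issues.append((story.get("title", "<untitled>"), problems))
    return issues

backlog = [
    {"title": "Export report as PDF", "acceptance_criteria": ["PDF opens"], "estimate": 3},
    {"title": "Export report as PDF", "estimate": None},
]
for title, problems in groom(backlog):
    print(title, "->", problems)
```

The same checks a reviewer would run by hand in a formal lifecycle become a quick pass over the backlog before each planning session.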

Remedies for Data Migration Pain
Data governance is a powerful pill: it not only knocks out the causes of common data headaches but also helps prevent them. Data migration headaches are no exception. A finely tuned data governance program can reduce duplicate data throughout the organization, reduce errors in reporting and coding, and reduce costs associated with poor data quality. The administrative framework and council of a well-defined data governance program support decisions, especially cross-functional decisions, with objectivity and consistency, allowing decisions to be made quickly with greater accuracy and confidence. The data governance charter also supports and encourages business and IT collaboration, as well as interdepartmental data sharing.

Microsoft CEO Nadella aces first-100-day test
"I'd definitely give him an A on selecting his team," said Ottinger. "From the March 3 announcements on, Nadella made his decisions quickly." That day, Microsoft announced a major shake-up of its senior leadership. Out was Tony Bates, who had led business development -- and been in the running for the CEO spot -- and Tami Reller, marketing boss and once co-chief of Windows. Mark Penn, the creator of the "Scroogled" attack ads, was shifted to chief strategy officer; Chris Capossela was made chief marketing officer. "All of that within a month," said Ottinger, who ticked off other personnel changes, including Stephen Elop's return to Microsoft and Scott Guthrie's promotion to lead the cloud group Nadella had left.

Why Machine Learning Matters When Choosing a Big Data Vendor
Machine learning is the proper way to make use of all the big data that companies are collecting and analyzing. In fact, machine learning is a far more effective way to analyze data since, unlike other methods, it is designed to work with vast amounts of different types of data that are constantly changing. Machine learning can analyze an entire set of data, not just a small portion of it, allowing for more accurate results. The very nature of machine learning also allows its analytics to operate at a faster pace.
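One reason machine learning copes with constantly changing data is that many learners update incrementally per record instead of re-scanning the whole set. A minimal, pure-Python sketch of that idea is an online perceptron; the data stream below is fabricated for illustration, and real systems would use far richer models.

```python
def perceptron_update(weights, bias, features, label, lr=0.1):
    """One streaming update for a linear classifier; label is +1 or -1."""
    activation = sum(w * x for w, x in zip(weights, features)) + bias
    if label * activation <= 0:  # misclassified: nudge the boundary
        weights = [w + lr * label * x for w, x in zip(weights, features)]
        bias += lr * label
    return weights, bias

# A stream of (features, label) pairs, separable on the first feature's sign.
stream = [([1.0, 0.2], 1), ([-1.0, 0.1], -1),
          ([0.8, -0.3], 1), ([-0.9, 0.4], -1)] * 5

w, b = [0.0, 0.0], 0.0
for x, y in stream:
    w, b = perceptron_update(w, b, x, y)   # each record is seen once, in order

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

print(predict([0.7, 0.0]), predict([-0.7, 0.0]))
```

Because each update touches only one record, the model keeps pace with data that arrives continuously, which is the property the excerpt is pointing at.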

Improve the Performance of Your Existing Storage Systems
Obviously, solid state storage is the king of the hill when it comes to raw performance, so it's no surprise that adding some solid state storage to an existing environment would improve the overall performance of the system as a whole. In general, PCIe-based solid state storage devices are leveraged for this purpose. They are installed inside a host machine and configured as a mega cache for the SAN. With a large cache at its disposal, a server can much more quickly read and write data from and to storage. The ultra-fast cache acts as the intermediary in this case so that the server itself doesn't have to deal with constantly slow reads and writes from the legacy storage environment. This solution is not low cost, though.
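The caching tier described above can be modeled in a few lines. This is a toy sketch under stated assumptions: a small, fast cache (standing in for the PCIe flash) in front of a slow backing store (the SAN), with a made-up size and a simple LRU eviction policy rather than anything a real caching product would use.

```python
from collections import OrderedDict

class CachedStore:
    def __init__(self, backing, cache_size=2):
        self.backing = backing          # slow tier (the "SAN")
        self.cache = OrderedDict()      # fast tier (the "flash"), LRU order
        self.cache_size = cache_size
        self.hits = self.misses = 0

    def read(self, key):
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)         # mark most recently used
            return self.cache[key]
        self.misses += 1
        value = self.backing[key]               # slow path to the SAN
        self._fill(key, value)
        return value

    def write(self, key, value):
        self.backing[key] = value               # write-through for safety
        self._fill(key, value)

    def _fill(self, key, value):
        self.cache[key] = value
        self.cache.move_to_end(key)
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)      # evict least recently used

store = CachedStore({"a": 1, "b": 2, "c": 3})
store.read("a"); store.read("a"); store.read("b")
print(store.hits, store.misses)   # the repeated read of "a" hits the cache
```

The payoff is exactly the one in the excerpt: repeated reads are served from the fast tier, and the slow legacy storage is touched only on misses and write-through.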

Mobile collapse and emerging markets cause cut in IT spending growth
"This volatility, coupled with the macroeconomic uncertainty in many emerging markets, is somewhat masking a more positive underlying foundation for enterprise IT spending, with firms continuing to invest in working off that pent-up demand to replace old servers, storage and network gear," said Stephen Minton, vice president in IDC's Global Technology & Industry Research Organisation. "Some of that spending is also driving IT services, despite the fact that an increasing number of businesses are moving more of their traditional IT budget to the cloud," he added. IDC expects 10 percent of software spending to be on cloud services by the end of 2014, while infrastructure-as-a-service should soak up 15 percent of spend on servers and storage.

Cloud bursting: Better tools are needed to live up to its promise
The promise of cloud bursting is to do a better job of fitting capacity to demand. Organizations can't squander cash on kit that spends half of its time idle (if they could, no one would have bothered replacing their internal telephone networks with VoIP). It would be so much easier to cope with variable computing demand by maintaining only enough hardware for predictable loads and shipping the rest -- the spikes in demand -- to the cloud. The process of deciding how much hardware to buy for next year would rely more on metrics and less on gambling.
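The underlying arithmetic is simple to sketch: size the in-house hardware for the predictable baseline and send only what exceeds it to the cloud. The demand figures and the capacity number below are made-up examples, not a sizing method.

```python
ONPREM_CAPACITY = 100  # units of work per hour the in-house kit can absorb

def split_demand(hourly_demand, capacity=ONPREM_CAPACITY):
    """Return (on-prem work, burst-to-cloud work) for each hour."""
    return [(min(d, capacity), max(0, d - capacity)) for d in hourly_demand]

demand = [60, 80, 140, 220, 90]          # a spiky day
plan = split_demand(demand)
burst = sum(cloud for _, cloud in plan)

print(plan)
print("cloud-burst units:", burst)
```

Run over a year of real metrics instead of five made-up hours, the same split is what would let next year's hardware purchase rely on data rather than gambling.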

AT&T to activate HD Voice over 4G LTE in four states on May 23
To work, HD Voice requires that both parties in a call have an HD-capable handset and a cellular base station (usually located at the base of a tower) between them that's equipped to pass the HD signal. To enable HD in the handset, the AT&T spokeswoman said the carrier will begin selling "very soon" the HD Voice-capable Samsung Galaxy S4 mini with a 4.3-in. Super AMOLED display. Both T-Mobile and Sprint started rolling out HD Voice to their 3G networks last year. T-Mobile sold its first version of the iPhone, the iPhone 5, with HD Voice capability last year.

Want 'perfect' security? Then threat data must be shared
Most of the big anti-malware companies not only understand who is doing the crime and what they are after, but know within minutes whenever one of these groups initiates a new "campaign" (such as using a new malware program or a new phishing strategy) or when they are initiating from new IP addresses. There are literally a hundred companies and thousands of people who have a pretty good understanding of the badness on the Internet. They can see the new trends as they are happening. Individually, none of the groups has all the information. But if all these groups were put together, sharing information, we'd have a pretty good lock on all the bad guys.
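The sharing argument reduces to simple set arithmetic: each firm sees only part of the picture, but pooling indicator feeds covers more, and overlap between independent feeds corroborates an indicator. The feeds below are fabricated (using RFC 5737 documentation IP ranges) purely to illustrate the point.

```python
from collections import Counter

feeds = {
    "vendor_a": {"203.0.113.5", "198.51.100.7"},
    "vendor_b": {"198.51.100.7", "192.0.2.44"},
    "vendor_c": {"192.0.2.44", "203.0.113.9"},
}

pooled = set().union(*feeds.values())          # everything anyone has seen
counts = Counter(ip for feed in feeds.values() for ip in feed)
corroborated = {ip for ip, n in counts.items() if n >= 2}

print(len(pooled), "pooled indicators")        # more than any single feed has
print("corroborated:", sorted(corroborated))
```

No single feed here has more than two indicators, but the pool has four, and two of them are independently reported twice: a toy version of "none of the groups has all the information."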

Security Trends in Cloud Computing Part 4: Public sector
Data classification is an important method of increasing the protection and proper management of information by separating it into categories based on sensitivity (high, medium or low, for example). More sensitive or critical information can be given greater protection, while lower risk information can be made more accessible, as appropriate. This process can improve compliance efforts and help safeguard important records, allowing other data and resources to be managed more smoothly. Conversely, weak or incomplete data classification strategies could result in improper handling of sensitive private data, raising the risks of accidental release or corruption, or the potential exposure of confidential information to employees whose jobs do not require access.
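The tiered scheme described above maps naturally to code: each sensitivity class gets a handling rule, and access checks compare tiers. This is an illustrative sketch assuming the three tiers named in the text (high, medium, low); the handling rules themselves are invented for the example.

```python
HANDLING = {
    "high":   {"encrypt_at_rest": True,  "access": "need-to-know"},
    "medium": {"encrypt_at_rest": True,  "access": "department"},
    "low":    {"encrypt_at_rest": False, "access": "organization"},
}

TIERS = ["low", "medium", "high"]   # ascending sensitivity

def may_access(record_class, clearance):
    """True if a user's clearance tier covers the record's tier."""
    return TIERS.index(clearance) >= TIERS.index(record_class)

print(may_access("medium", "high"))   # higher clearance covers lower tiers
print(may_access("high", "low"))      # low clearance can't read high
```

The failure mode the excerpt warns about shows up directly: if a record is classified too low (or not at all), the check happily grants access to employees whose jobs don't require it.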

Quote for the day:

"In a crisis if I had only an hour I'd spend the first 50 minutes defining the problem and the last 10 minutes solving it." -- Albert Einstein
