June 29, 2015

Blending humans and technology in the workforce
A key enabler behind this co-operation lies in the interfaces. Advances in natural language processing (NLP) and speech recognition are making it much easier for humans to interact with machines in real time. Speech recognition is becoming more effective thanks to the growing capability of machines to “understand” unstructured conversation, aided by their ability to run instant internet searches and draw on contextual clues. Machines are also getting better at incorporating user feedback to improve their accuracy. Such is the interest in NLP technologies that the market for this application is expected to grow quickly, from $3.7bn in 2013 to $10bn by 2018.

DevOps & Product Teams - Win or Fail?
Developers are expected to deliver product features or user stories, preferably in a predictable way. When unforeseen problems cause delays, developers - keeping the release date in sight - scramble to compensate, releasing incomplete features (although some would argue that there’s no such thing as releasing too early). Operations, by contrast, is usually judged on availability. MTTR may be more DevOps-friendly than MTBF, but regardless of how availability is measured, outages are harder to prevent in the face of constant change. This can make operations engineers over-cautious and too conservative. If lots of new product features are deployed to production, the developers get the credit; but if any of those shiny new features causes an outage, it’s the operations team that will be woken up to fix it.
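To make the availability metrics above concrete, here is a minimal sketch of computing MTTR and MTBF from outage records. The incident data and observation window are invented for illustration; timestamps are hours since an arbitrary epoch.

```python
# Compute MTTR (mean time to repair) and MTBF (mean time between failures)
# from a list of outage records. All figures are illustrative.

outages = [
    {"start": 100.0, "end": 101.5},   # 1.5 h outage
    {"start": 340.0, "end": 340.5},   # 0.5 h outage
    {"start": 700.0, "end": 704.0},   # 4.0 h outage
]
observation_hours = 1000.0

downtime = sum(o["end"] - o["start"] for o in outages)
uptime = observation_hours - downtime

mttr = downtime / len(outages)    # average time spent repairing each outage
mtbf = uptime / len(outages)      # average uptime between successive failures
availability = uptime / observation_hours

print(f"MTTR = {mttr:.2f} h, MTBF = {mtbf:.2f} h, availability = {availability:.3%}")
```

The tension in the text falls out of the arithmetic: MTBF rewards having as few failures as possible, while MTTR rewards recovering quickly when failures do happen, which is more compatible with frequent change.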

UC San Diego Researchers Amp Up Internet Speeds
The results of the experiment, performed at UC San Diego's Qualcomm Institute by researchers from the Photonics Systems Group and published in the June edition of the journal Science, indicate that fiber information capacity can be increased well beyond previous estimates by pre-empting the distortion effects that arise in the optical fiber. The paper is titled "Overcoming Kerr-induced capacity limit in optical fiber transmission." "Today's fiber optic systems are a little like quicksand," Nikola Alic, a research scientist at the Qualcomm Institute, corresponding author on the Science paper and a principal of the experimental effort, said in a June 25 statement.

Urgency of Present and Past in IoT Analytics
The most basic obstacle to extracting value from this OT-generated data is connecting it to traditional Information Technology (IT) systems. This integration is problematic because IT and OT evolved in different environments and use different data types and architectures. While IT evolved from the top down, primarily focusing on serving business and financial needs, OT grew from the bottom up, with many different proprietary systems designed to control specific equipment and processes. IT systems are based on well-established standards that can integrate large volumes of information across different applications. In the world of OT, no single industry standard yet exists for Machine to Machine (M2M) connectivity.
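In the absence of a single M2M standard, a common integration pattern is a translation layer that normalizes each controller's proprietary output into one IT-friendly record format. The sketch below illustrates the idea; the vendor names, message layouts, and units are entirely invented.

```python
# Toy translation layer: normalize readings from two hypothetical
# proprietary OT controllers into a common record format.

def parse_vendor_a(raw: str) -> dict:
    # Hypothetical Vendor A emits "ID:TEMP_C:PRESSURE_KPA", e.g. "P1:72.5:101.3"
    device, temp, pressure = raw.split(":")
    return {"device": device, "temp_c": float(temp), "pressure_kpa": float(pressure)}

def parse_vendor_b(raw: dict) -> dict:
    # Hypothetical Vendor B emits dicts with Fahrenheit and PSI;
    # convert both to the units used by the common format.
    return {
        "device": raw["tag"],
        "temp_c": (raw["tempF"] - 32) * 5 / 9,
        "pressure_kpa": raw["psi"] * 6.89476,
    }

readings = [
    parse_vendor_a("P1:72.5:101.3"),
    parse_vendor_b({"tag": "P2", "tempF": 160.0, "psi": 14.7}),
]
```

Once every reading is in the common shape, downstream IT systems (databases, analytics, dashboards) can treat all equipment uniformly, which is exactly the integration step the paragraph describes as missing.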

Why is Virtualization creating Storage Sprawl?
Storage sprawl has become worse in organizations that have made a significant investment in virtualization technologies. For example, most virtual desktop infrastructures (VDI) use an all-flash array to handle desktop images, preventing boot and logout storms and maintaining acceptable performance throughout the day. But most VDI environments also need a file store for user home directories. There is little to be gained by placing this data on the all-flash array, yet data centers certainly need to provide storage to support user-created data. As a result, most organizations end up buying a separate Network Attached Storage (NAS) device for user home directories and other types of unstructured data.

Dealing With Data Privacy in the Cloud
“If the single biggest concern centres around the security of placing data in multi-tenant public clouds, then this is a misjudgement," he said. "If anything, providers of public cloud managed hosting services know a lot more about system security than most individual firms. Second, privacy policy controls stipulated upon instances of private cloud may be harder to update than those held in public environments, where service layers are stronger.” Indeed, confidence in cloud security is growing, according to the 2014 IDG Enterprise Cloud Computing Study. The survey found that the vast majority of enterprises were “very” or “somewhat” confident that the information assets placed in the cloud are secure.

Overcoming the business and technology barriers to DevOps adoption
"There was a degree of customer dissatisfaction with the service we provided, in the sense that we couldn’t keep up with the demand from developers, which meant there were long lead times in providing them with environments; and they were created in a manual way with semi-automated scripts," says Watson. "So, not only did it take us a long time to provide these individual environments, sometimes they weren’t always the same." This often led to disagreements between the software development and infrastructure building teams, as the environments they delivered didn’t always quite fit the bill. To rectify this, Watson created small groups of developers and infrastructure architects, while doing away with the ticketing system used to communicate requests between these groups.

Why is a cloud provider like a restaurant?
In my experience, the thing that marks the unsuccessful restaurant is the belief that having the best kitchen equipment (or the cheapest, depending on the business model) is the defining factor in its success. What rubbish! Any experienced restaurant patron can tell you that it’s all about the customer experience – and how the customer perceives the value delivered by the restaurant. The customer only sees the ‘front-office’ experience – the location, parking valet, Maitre d’, bar, seating arrangement, ambience and, of course, the menu and service. The restaurant might only be a drive-through fast-food outlet, but the same principles apply. The customer chooses a provider based on their required value proposition and expects that value to be delivered.

The Road Ahead for Architectural Languages
MDE is a possible technological solution for successfully supporting the requirements of next-generation ALs. In MDE, architects use domain-specific modeling languages (DSMLs) to describe the system of interest. The concepts of a DSML - its first-class entities, relationships, and constraints - are defined by its metamodel. Accordingly, every model must conform to a specific metamodel, much as a program conforms to the grammar of its programming language. In MDE, it’s common to have a set of transformation engines and generators that produce various types of artifacts. Practitioners can take advantage of transformation engines to obtain source code, alternative model descriptions, deployment configurations, inputs for analysis tools, and so on.
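The conformance-then-generation pipeline described above can be sketched in a few lines. This is a deliberately tiny, invented example - a one-concept metamodel, a two-element model, and a model-to-text generator - not a real MDE toolchain.

```python
# Minimal MDE sketch: check a model against a metamodel, then generate code.
# The metamodel, model, and generated artifact shape are all invented.

METAMODEL = {"Component": {"required": {"name", "port"}}}

model = [
    {"kind": "Component", "name": "Sensor", "port": 9001},
    {"kind": "Component", "name": "Logger", "port": 9002},
]

def conforms(model, metamodel):
    """A model conforms if every element instantiates a metamodel concept
    and carries that concept's required attributes - the analogue of a
    program conforming to its language's grammar."""
    for element in model:
        concept = metamodel.get(element["kind"])
        if concept is None or not concept["required"] <= element.keys():
            return False
    return True

def generate(model):
    """Model-to-text transformation: emit one class stub per component."""
    return "\n".join(
        f"class {e['name']}:\n    PORT = {e['port']}\n" for e in model
    )

if conforms(model, METAMODEL):
    code = generate(model)
```

The same model could feed other generators - deployment configurations, analysis-tool inputs - which is the payoff the paragraph attributes to transformation engines: one validated model, many artifacts.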

Cloud computing may make IT compliance auditing even cloudier
There may be several issues at work that reduce IT's preparedness for compliance audits. Time is likely the leading culprit, as tending to compliance reporting is one more thing to be squeezed into a busy day. Related to that is lack of resources -- short-staffed IT departments may find it difficult to put someone on the case more than a few hours a week. Some industry observers say cloud computing has made the matter of compliance even, well, cloudier. The results are "not surprising when you consider the degree to which cloud systems and mobile access have penetrated most enterprises," says Gerry Grealish, CMO of Perspecsys. "Cloud SaaS systems and BYOD policies take the control of enterprise data -- including sensitive data -- away from enterprise IT teams and put it in the hands of third party vendors."

Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton
