July 29, 2016

Fixing the perception that enterprise IT is irrelevant

“The business has more choices than ever before when it comes to sourcing its technology needs, and enterprise IT isn’t necessarily their first choice,” the report says. Executives have grown accustomed to on-demand services, so IT needs to recognize that shift and adopt an entrepreneurial, “service-first” mindset, delivering legacy IT or as-a-service offerings, whichever is appropriate. When IT fails to do so, Accenture explains, business executives start “working around IT,” a practice that needs to stop, chiefly because it increases risk, even though it is now a reality. Strategies for this transformation include self-testing the IT organization, for example by examining problem-resolution response times, and taking a “greenfield” approach: starting afresh, in other words.


In a broader sense, businesses are making it a priority to achieve more value from outsourcing. They’re seeing cost savings, but they’re increasingly looking for benefits such as process consistency across business units and better data that can drive operational improvement and customer insight. Getting better data is also the key to more sustainable outsourcing relationships. Clients recognize that the overall strength of a partnership depends on trust, but that trust requires having fact-based information around the various functions and processes covered by each area of vendor management. Without that solid foundation of good information, organizations tend to be guided by emotions and end up focused on day-to-day firefighting.


Remove IT Systems from All Branch Offices to Harden Your Security Posture

Consolidating infrastructure at the edge is the critical first step, but it is only the first step. Simply mashing disparate pieces of hardware together into one appliance will not solve short- or long-term performance, data-security, and management issues. You also need to make the edges “stateless.” If you’re a storage professional, you know that “state” at the remote or branch office (ROBO) means daily operational challenges in managing and protecting data that is vulnerable to loss and theft. A lost storage device at the ROBO can take hours, days, or in some cases longer to bring back online, with no guarantee of success, particularly when you have to resort to older backups. Decoupling storage from compute by moving data storage from the edges to the central data center creates stateless data stores, and in ideal scenarios this can be done without compromising the user experience.
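As a minimal sketch of that decoupling, assuming a hypothetical S3-compatible object store in the central data center (the endpoint, bucket, and key names below are illustrative assumptions, not details from the article), an edge application might write data straight through to the center rather than persisting anything locally:

```python
import io
import boto3

# Hypothetical central, S3-compatible object store; the endpoint and
# bucket are illustrative assumptions, not values from the article.
CENTRAL_ENDPOINT = "https://storage.central-dc.example.com"
BUCKET = "robo-branch-042"

s3 = boto3.client("s3", endpoint_url=CENTRAL_ENDPOINT)

def write_through(key: str, payload: bytes) -> None:
    """Persist edge-generated data directly in the central data center.

    Nothing durable is written at the branch, so the edge appliance is
    stateless: losing it loses compute capacity, not data.
    """
    s3.upload_fileobj(io.BytesIO(payload), BUCKET, key)

def read_back(key: str) -> bytes:
    """Fetch data on demand from the central store (local caching optional)."""
    buf = io.BytesIO()
    s3.download_fileobj(BUCKET, key, buf)
    return buf.getvalue()

if __name__ == "__main__":
    write_through("pos/2016-07-29/receipt-0001.json", b'{"total": 42.50}')
    print(read_back("pos/2016-07-29/receipt-0001.json"))
```

In this pattern the only thing worth stealing at the branch is hardware; whether the user experience holds up depends on WAN latency and any caching layered on top.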


Robotic Process Automation Slashes IT Costs, Alleviates Complexity

To the extent software robots take on activities employees have traditionally carried out, CIOs can significantly reduce—and potentially eliminate—the portion of their budgets they allocate to making usability enhancements to systems, thereby freeing up cash for innovation or other value-producing opportunities. Large RPA deployments aimed at reducing labor costs also offer a compelling ROI. A company that deploys 500 bots at a fully loaded cost of $20 million (including software licenses, planning, process reengineering, programming, testing, and implementation) could potentially realize $100 million in savings, assuming the bots replace 1,000 employees. Given the costs and benefits of RPA, some routine back-office functions that companies outsource, such as claims processing and certain data center operations, may now be more cost-effectively performed in-house with bots.
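To make the arithmetic behind those figures concrete, here is a small back-of-the-envelope calculation. The $20 million cost, 500 bots, 1,000 replaced employees, and $100 million savings come from the article; the derived per-bot and per-employee numbers, and the assumption that the $100 million is gross savings, are mine:

```python
# Back-of-the-envelope ROI for the RPA deployment described above.
bots = 500
deployment_cost = 20_000_000      # licenses, planning, reengineering, testing, rollout
employees_replaced = 1_000
gross_savings = 100_000_000       # assumed gross labor savings (per the article's figure)

net_savings = gross_savings - deployment_cost
roi = net_savings / deployment_cost
implied_cost_per_employee = gross_savings / employees_replaced
cost_per_bot = deployment_cost / bots

print(f"Net savings:            ${net_savings:,}")                    # $80,000,000
print(f"ROI:                    {roi:.0%}")                           # 400%
print(f"Implied labor cost/FTE: ${implied_cost_per_employee:,.0f}")   # $100,000
print(f"Fully loaded cost/bot:  ${cost_per_bot:,.0f}")                # $40,000
```

On those assumptions, one bot at roughly $40,000 fully loaded displaces two $100,000 employees, which is why even routine outsourced work can pencil out as cheaper to bring back in-house.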


The Digital CEO (Part II of our interview with Alex Clyne)

What will cause problems for some will be a lack of intellectual horsepower – and also a simple lack of knowledge. That’s why I stressed the importance of CEOs really getting to grips with digital. Generally, those who reach the top of big companies are not short on grey matter, but they can’t assume that they will always stay ahead of the curve. The fact that the railway and shipping companies once dominated the transport industry didn’t stop the airline industry becoming the major player in long-distance travel. The people who ran the railways thought they were in the railway business; they didn’t realise they were in the transport business. Wrong decisions have been made, and will continue to be made, for the right reasons, because they are being made with ‘bad’ knowledge.


Go Leads Strong Big Data Showing in IEEE Programming Language Ranking

The open source code repository GitHub is one source of metrics, and Diakopoulos said GitHub activity is the main reason for Go's ascendance, along with activity on the Reddit news and information site, which features a programming category. Google's open source Go language (sometimes referred to as Golang) also featured prominently in a ranking published in May by PayScale Inc. and Millennial Branding, which said "Scala and Go are the emerging skills with the biggest pay boosts." Along with Go, other languages such as Julia, R, Scala and Python "are riding the number-crunching wave," Diakopoulos said about Tuesday's IEEE report. ... "Julia was added to the list of languages we track in 2015, and in the past year it's moved from rank 40 to 33, still a marginal player but clearly possessing some momentum in its growth."


Five tips for accelerating your continuous delivery journey

Teams need to be able to mask production data and subset it for testing so that personally identifiable information (PII) never makes its way into testing practices. You need to shift testing left: start API and back-end request-and-response testing before the UIs are developed, simulate the environments you depend on, and begin testing much earlier in the process against those simulations. Steps like these will help make testing agile enough to match the speed of development, and will help developers improve the testing of their own code. ... Look to partner with a continuous delivery vendor that can help you leverage your current investments and tools of choice while enabling you to move your continuous delivery journey forward.
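As a minimal sketch of the masking-and-subsetting step, assuming production records land in a pandas DataFrame (the column names and masking rules below are hypothetical, not from the article):

```python
import hashlib
import pandas as pd

# Hypothetical production extract; columns and values are illustrative.
prod = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "email": ["a@example.com", "b@example.com", "c@example.com", "d@example.com"],
    "ssn": ["111-22-3333", "444-55-6666", "777-88-9999", "000-11-2222"],
    "order_total": [19.99, 250.00, 7.50, 89.95],
})

def mask_pii(df: pd.DataFrame) -> pd.DataFrame:
    """Deterministically pseudonymize PII columns so joins still work,
    while the real values never reach the test environment."""
    masked = df.copy()
    masked["email"] = masked["email"].map(
        lambda v: hashlib.sha256(v.encode()).hexdigest()[:12] + "@test.invalid")
    masked["ssn"] = "XXX-XX-" + masked["ssn"].str[-4:]
    return masked

def subset(df: pd.DataFrame, frac: float = 0.5, seed: int = 42) -> pd.DataFrame:
    """Take a reproducible sample so test data stays small and fast."""
    return df.sample(frac=frac, random_state=seed)

test_data = subset(mask_pii(prod))
print(test_data)
```

Deterministic hashing, rather than random substitution, is the design choice worth noting: the same production value always masks to the same test value, so referential integrity across tables survives the masking.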


Attack attribution does little to improve enterprise security

When laws are broken in the physical world, there’s irrefutable evidence that links the guilty party to the crime. Maybe it’s fingerprints or a strand of hair or surveillance footage from a security camera. Whatever the evidence, it’s tangible and hard to manipulate. In the cyber world, however, evidence can be easily altered, making the task of figuring out who pulled off an attack much more difficult and sometimes impossible. To understand why attribution does not work, think like the people who are behind the operation. They have invested significant time and resources masking their identity prior to the operation’s start. They employ basic precautions like making sure their tools never communicate with a server based in the country where the attack originated. Instead, they’ll make the communication appear to originate from another nation and buy domain names in different countries.


IBM's Cloud CTO: 'We're in this game to win'

What IBM calls "cognitive" computing and its Watson artificial-intelligence services are another of Rometty's pillars, Comfort said. The third is a razor-sharp focus on industries. "What you've seen so far is our evolution and morphing into a cloud company," he said. "Now you're going to see that emerge more and more strongly through an industry lens." That industry focus is one of the key factors IBM is betting will set it apart from cloud competitors including Google, Microsoft and Amazon Web Services as cloud hype diminishes and attention shifts to innovation and industry transformation. In financial services, IBM is using its Bluemix Garages to bring developers together to work on blockchain-related technologies including new mobile banking and wealth management apps.


Benefits and Challenges of Data Mining in E-Commerce

The only way to get the most out of this data is to mine it to improve decision making or to enable business intelligence. In e-commerce data mining there are three important processes that data must pass through before it turns into knowledge or applications. The first and easiest is data pre-processing, which is actually a step before the mining itself: the data is cleaned by removing unwanted records that have no relation to the required analysis. This boosts the performance of the entire data mining process, raises the accuracy of the data, and reasonably reduces the time needed for the actual mining.
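As a minimal sketch of that pre-processing step, assuming raw e-commerce events sit in a pandas DataFrame (the columns and cleaning rules below are illustrative assumptions), the unwanted records are dropped before any mining begins:

```python
import pandas as pd

# Hypothetical raw e-commerce events; columns and values are illustrative.
raw = pd.DataFrame({
    "session_id": ["s1", "s2", "s2", "s3", "s4", None],
    "event":      ["view", "view", "purchase", "bot_ping", "purchase", "view"],
    "amount":     [None, None, 59.99, None, -5.00, None],
})

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Pre-processing: remove data unrelated to the analysis before mining.

    Drops rows with no session, discards events irrelevant to a purchase
    analysis, and removes records with impossible (negative) amounts.
    """
    cleaned = df.dropna(subset=["session_id"])
    cleaned = cleaned[cleaned["event"].isin(["view", "purchase"])]
    cleaned = cleaned[~(cleaned["amount"] < 0)]   # keeps NaN amounts for views
    return cleaned.reset_index(drop=True)

print(preprocess(raw))
```

Every row removed here is a row the actual mining step never has to touch, which is where the performance and accuracy gains described above come from.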



Quote for the day:


"The greatest value of a picture is when it forces us to notice what we never expected to see." -- John Tukey

