July 21, 2016

Cognitive Business: When Cloud and Cognitive Computing Merge

Another trend, perhaps even more important, and one actually being driven by cloud computing, is the rapid expansion of cognitive computing. In this arena, IBM’s Watson, famously known for defeating Jeopardy game show champions Ken Jennings and Brad Rutter, has quickly established itself as a commercial cognitive computing powerhouse. Contemporary reports of the Jeopardy contest in the New York Times cited the victory as IBM’s “…proof that the company has taken a big step toward a world in which intelligent machines will understand and respond to humans, and perhaps inevitably, replace some of them.” Although we are not yet at the human-replacement stage, the merger of cloud and cognitive computing is rocking the business status quo.


The State of Digital Currency: A discussion with Ed Scheidt

One of the keys to acceptance is the ability to check the validity of currency and reduce the risk of fraud. We discussed the fact that even with blockchain and other types of encryption, new technology will need to be invented to provide the same level of trust (and risk reduction) that you get with physical currency. If you look at the current one-hundred dollar bill, it has a myriad of security features: a 3-D security ribbon, color-shifting ink, watermarks, raised printing, and so on. All of these features could be reproduced by a counterfeiter, but only with a large amount of time and resources. Digital currency (DC) has none of these layered features in a mature way today, but it will someday. So, for DC to really work, the digital equivalents of these features will need to be created, validated, produced, and trusted.


Blockchain: a case for the general ledger

Despite the potential of a distributed ledger, financial institutions are not rushing to replace legacy systems with the new technology. Blockchain or its variants will be adopted at a larger scale only after early movers address underlying questions. Will a distributed network operate as efficaciously as the tried and tested centralised system? Can blockchain ensure interoperability? Who is responsible in the event of a dysfunctional system? How will cryptocurrency and related technology be regulated? Fortunately, the industry is not waiting for answers. Several financial services enterprises are developing in-house models and forging partnerships to create proofs of concept. Venture capital is pouring into start-ups building payment platforms using cryptography, even as industry leaders incorporate blockchain technology into securities management, post-trade processing, settlement, and asset servicing.


Looking Deeper, Seeing More: A Multilayer Map of the Financial System

Multilayer maps can capture more information. They portray the financial system as a network of networks. For example, a multilayer map can help identify a large market participant that is a node in more than one market layer. Such a company could be a source of strength to the financial system if managed well. If not, it could be a source of weakness. The failure of one of these nodes in a layer can lead to failures of dependent nodes in other layers. This phenomenon can happen repeatedly, leading to a cascade of failures. For that reason, multilayer networks are more fragile than single-layer networks. Connections between the layers can amplify the scope and magnitude of stress in a single layer. Maps of multilayer networks show three stages of damage following a major shock.
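The cascade dynamic described above can be sketched as a toy simulation. All node names, layers, and dependency sets below are invented for illustration; real financial network maps are far richer.

```python
# Toy sketch of cascading failure in a multilayer network: a node fails
# when anything it depends on (possibly in another layer) has failed,
# and the process repeats until no new failures occur.
def cascade(initial_failures, depends_on):
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for node, deps in depends_on.items():
            if node not in failed and failed & deps:
                failed.add(node)
                changed = True
    return failed

# "BankA" participates in two market layers (repo and derivatives),
# so its failure propagates into both.
depends_on = {
    "repo:FundX":    {"BankA"},        # FundX relies on BankA for repo funding
    "deriv:DealerY": {"BankA"},        # DealerY clears derivatives via BankA
    "repo:FundZ":    {"repo:FundX"},   # second-round failure within a layer
}
print(sorted(cascade({"BankA"}, depends_on)))
# → ['BankA', 'deriv:DealerY', 'repo:FundX', 'repo:FundZ']
```

Removing the cross-layer dependencies confines the damage to a single layer, which is the fragility argument in miniature.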


Utah teen launches consumer drone that can fly over 70 mph

"After spending an hour with George, I was overwhelmingly impressed by his vision for a drone platform as well as his presence as an entrepreneur," wrote Ben Lambert, from Pelion Venture Partners, in a post on Medium. George clearly has an engineering mindset, but he's also a savvy businessman. When he was a kid (which wasn't that long ago, after all), he always had lemonade stands or some other way to make a few bucks. "I was always an entrepreneur at heart," he told me. In these early days, Teal has been operating out of Pelion's Salt Lake City office. George says he's managing to stay grounded while handling large responsibility at such a young age with the support from his family and school, but he also mentioned "half-jokingly" that spending quality time with his investors has helped. "Ben tells me every day that I suck," George said.


10 TB in a 1 cm space: Will chlorine atoms redefine storage?

The technology depends on the ability to quickly rearrange chlorine atoms in square grids that sit next to each other as terraces. Each grid represents a single byte, and it contains slots that the atoms can be moved around in to represent either a one or a zero, thereby encoding the information. The atoms are moved between slots using a scanning tunnelling microscope. Atomic markers were added to the grids, making reading them easier and faster than previous methods. This new atomic storage technology is a major discovery, but it is still in the proof-of-principle phase, and it has some major drawbacks that may slow its development. One of the biggest issues is that it must be kept at -196 °C, the boiling point of liquid nitrogen. While warmer and cheaper than using liquid helium as a coolant, as noted in Nature, this still creates a problem.
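The slot-based encoding can be pictured with a short sketch. This is a conceptual model only (one grid per byte, atom position within a slot pair encoding one bit), not the published encoding scheme.

```python
# Conceptual sketch: each byte is one grid of eight slot pairs, and the
# position of the atom within a pair (0 or 1) encodes one bit.
def byte_to_slots(value):
    """Most-significant bit first: 65 ('A') -> [0, 1, 0, 0, 0, 0, 0, 1]."""
    assert 0 <= value <= 255
    return [(value >> i) & 1 for i in range(7, -1, -1)]

def slots_to_byte(slots):
    out = 0
    for bit in slots:          # "reading" the grid with the microscope
        out = (out << 1) | bit
    return out

grid = byte_to_slots(ord("A"))
print(grid, slots_to_byte(grid))   # [0, 1, 0, 0, 0, 0, 0, 1] 65
```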


Mads Torgersen on C# 7 and Beyond

QCon chair Wesley Reisz talks to Mads Torgersen, who leads the C# language design process at Microsoft, where he has been involved in five versions of C# and has also contributed to TypeScript, Visual Basic, Roslyn and LINQ. Before joining Microsoft a decade ago, he worked as a university professor in Aarhus, Denmark, doing research into programming language design and contributing to Java generics. Key takeaways: the overall theme for C# 7 will be features that make it easier to work with data, including language-level support for tuples, and the release may also include pattern matching for type switching; C# 7 is the first release of the language to be built completely in the open; and Roslyn, the compiler and its API, allows a much more agile evolution of the language.


Securing the NextGen aviation network

In the past, we were very, very focused. We had a very simple model: we would look at how our system was secured, and if somebody else was having a technological problem on their side, the way we protected the integrity and safety of the system was that we simply wouldn't allow them in. That meant that if airline A was having technology problems, we were not going to dispatch their flights. To a certain extent we still do some of that, but now that all of our systems are interlinked, if an airline is experiencing a problem it's very important that we understand the potential for it to bleed over into our systems through the interconnections and gateways connecting our system to theirs. Likewise, it's not just the companies and their operating systems; it's also the avionics systems in the aircraft themselves.


Doctor devises new database methodology to thwart hackers and end big data breaches

Yasnoff created the personal grid precisely so that each record of information is stored in a separate file, and each file is encrypted individually with its own encryption key. “If a hacker breaks into a server room and literally takes a whole server away, that hacker would have to break through strong encryption to get one single patient record,” Yasnoff explained. “And then that hacker would have to break through more strong encryption to get a second record, and then repeat the same for a third, and a fourth, and so on. The work involved in getting hundreds of thousands to millions of records becomes prohibitively massive for a hacker.” There is, however, one catch: unlike a database where all records are stored in one file, a clinician cannot quickly search patient records stored and encrypted separately within a database. But Yasnoff has come up with a solution to this hurdle.
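The record-per-file, key-per-record layout can be sketched as follows. The XOR keystream here is a placeholder for real authenticated encryption (a vetted library would be used in practice), and the record names and contents are invented.

```python
# Conceptual sketch of a "personal grid" layout: one file per record,
# each encrypted under its own random key. The SHA-256-based XOR
# keystream below is illustrative only, NOT production cryptography.
import hashlib, secrets

def keystream(key, length):
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# Each patient record gets its own key: stealing the server yields one
# cryptographic puzzle per record, not one puzzle for the whole lot.
records = {f"patient-{i}": f"record {i}".encode() for i in range(3)}
vault, keys = {}, {}
for rid, data in records.items():
    keys[rid] = secrets.token_bytes(32)
    vault[rid] = encrypt(keys[rid], data)

assert decrypt(keys["patient-1"], vault["patient-1"]) == b"record 1"
```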


Oracle To Reboot Java EE For The Cloud

Within cloud-based environments, infrastructure no longer relies on application servers running on dedicated hardware. Moreover, an enormous volume of transactions must be handled, requiring a different model for state and transaction management than what has been offered in Java EE for scaling applications, Kurian said. Meanwhile, container technologies such as Docker have emerged, with requirements for externalizing configuration management, deployment of applications, and packaging. Oracle wants to make accommodations for these paradigm changes. Oracle plans to fit Java EE 8 with capabilities for persisting data in a key-value store, based on NoSQL stores, and a transaction model for eventual consistency and relaxed transactions. On the whole, Oracle's improvements would help Java EE developers evolve their skill sets to leverage technology shifts such as these, Kurian said.



Quote for the day:


"You got to be careful if you don't know where you are going, because you might not get there." --Yogi Berra


July 20, 2016

The Body as Interface and Interpreting the Body talks

It helps us move away from viewing things in terms of the interfaces we are familiar with. For instance, we were able to provide an alternative to the mouse by introducing touch screens. We then moved from touch screens to more gestural interfaces with the Kinect and virtual reality goggles. We need to build devices that give users greater autonomy to determine where they go with the design. Thinking of the body as an interface and designing with that mindset lends itself to a more experimental and iterative approach to design. ... You can track metrics like temperature, heart rate, blood pressure or breathing rate to stop traders trading when they’re more likely to make a decision based on emotion. Emotion sensors can allow users to better control their behaviour in emotionally charged situations.


MicroProfile streamlines Java EE for microservices

The MicroProfile approach to optimizing for microservices is to start with a small core set of features and grow from there with heavy involvement from the community. The core platform will likely add functionality over time, some of which will come from Java EE related JSRs, and some that are not directly related to Java EE at all. For the latter, the MicroProfile community will investigate how to more directly address microservice-related patterns like circuit breakers, bulkheads and service discovery. The MicroProfile project aims to get Java EE back on the edge of innovation, Sharples said. "The goal is to ensure that when developers think about microservices they start with Java and Java EE; this enables them to start with the standards-based platform with familiar Java APIs."
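A circuit breaker, one of the patterns mentioned, can be sketched in a few lines. The class name, thresholds, and API below are invented for illustration; they are not MicroProfile's (whose fault-tolerance APIs are Java-based).

```python
# Minimal circuit-breaker sketch: after too many consecutive failures,
# stop calling the flaky service and fail fast until a cool-down passes.
import time

class CircuitBreaker:
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None      # half-open: allow one trial call
            self.failures = 0
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0              # success resets the failure count
        return result
```

After `max_failures` consecutive errors the breaker "opens" and raises immediately without touching the downstream service; after `reset_after` seconds it lets one trial call through to probe for recovery.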


CIOs and CISOs share insights on strategic collaboration

All executives with a C-level title should be working together toward the mission, said Mansur Hasib, program chair for cybersecurity technology at the Graduate School at the University of Maryland University College and author of the books “Cybersecurity Leadership” and “The Impact of Security Culture on Security Compliance.” ... "The C-level officers should be sitting together and offering each perspective on how to achieve this particular goal. The CIO might say, ‘OK, to do this we need to have a webinar, and we might need connections with the mayor’s office and maybe the state department of health.’ Another officer could say, ‘We need to put some ads in the newspaper,’ and someone else might say, ‘We need some town halls because consumers do not have technology for webinars, and further, maybe some door-to-door canvassing.’"


Why Virtual Reality is Auto Marketing's 'Sleeping Giant'

Automakers are also bringing virtual reality inside dealerships. For instance, Audi is rolling out VR systems at dealerships that allow customers to experience vehicles in various environments or to "virtually dive into specific parts of the vehicle and explore their technical design," according to Audi's website. "You're wearing the glasses and you really think you're in the car," Marcus Kuehne, Audi's virtual-reality project lead, told Bloomberg earlier this year. "You get a good feeling for the size -- do the rims fit to the body of the car, do the colors inside the car fit well together?" he added. "You can judge this much better through this technology than on a screen."


Microsoft is rolling out Windows 10 as a subscription service

At the enterprise level, Microsoft has always charged businesses for using Windows. The upgrade to Windows 10 from Windows 7 or 8 may be free, but the continued use of Windows in your business has never been free, nor should it be. The new twist in the conversation is that the fees for using Windows will be called a subscription now. Hardly earth shattering. At the consumer level, the future prospects of Windows 10 and the subscription model are much murkier. Where enterprises are willing to pay for more security assurances and management services, consumers may fail to see the value and resist a monthly fee. Microsoft knows this and will look for ways to mitigate such entrenched resistance.


Could Bulgaria's open source law transform government software worldwide?

The advantages of going open source are numerous, Bozhanov says. Most importantly, the new legislation will bring better written software, and developers will follow better practices. "Currently there's nobody inspecting the quality of the code or the architecture, and companies can get away with pretty low-quality solutions," he says. Open source will also offer more affordable software, with less money spent on support and fewer new projects commissioned simply because the old ones didn't work properly. Also, government contractors will be able to reuse the code when working on a common piece of functionality, without having to reinvent the wheel every time. "Companies will no longer be able to sell open-source solutions as complex custom software, which has [previously] happened," Bozhanov says.


The best mobile security plans examine risks first, then prescribe

The tension between risk and control is heightened when applied to mobile devices. Mobile devices (smartphones and tablets) are, by their very nature, designed to blend organizational and personal computing experiences. My phone is filled with personal photos and photos of whiteboard architecture and flow diagrams. My apps include my corporate email and expense approval as well as my personal mobile banking. ... When it comes to assessing risks, I like to first identify the specific risks and then, for each risk, define its likelihood and impact. I then figure out the best, most pragmatic way to mitigate the risks with the highest likelihood-impact combination.
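The likelihood-impact triage described above reduces to a small calculation. The example risks and the 1-5 scales below are made up for illustration.

```python
# Score each risk as likelihood x impact, then mitigate in descending
# score order. Risks and scales here are hypothetical examples.
risks = [
    {"name": "lost device with cached email", "likelihood": 4, "impact": 3},
    {"name": "malicious app sideloaded",      "likelihood": 2, "impact": 4},
    {"name": "public Wi-Fi interception",     "likelihood": 3, "impact": 2},
]
for r in risks:
    r["score"] = r["likelihood"] * r["impact"]

# Highest likelihood-impact combination first:
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{r["score"]:>2}  {r["name"]}')
```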


Why ALM Is Becoming Indispensable in Safety-Critical Product Development

These issues are quite common when developing complex software systems, especially in scaled Agile environments. That is exactly the need that gave rise to the notion of Application Lifecycle Management. ALM tools help developers oversee and manage several (ideally all) stages of development using a single software solution. By design, they offer functionality across the entire lifecycle, supporting development from requirements to release. While ALM is a relatively modern concept, ALM solutions have been around for a decade or so, and have evolved a lot over the years. Some ALM vendors started out as developers of single-point solutions, and have developed further modules to add to the basic functionality of their products, or have acquired other solutions and created integrations between these preexisting modules.


Internet of Things in healthcare: What's next for IoT technology in the health sector

Inova Design's CEO Leon Marsh agrees: "The potential with IoT is that throughout a whole care pathway a person's data is continuously being gathered and used to help diagnose the patient so they can receive the best treatment as quickly as possible." Ideally, the objective data that could be taken from a network of IoT devices will also be able to significantly lower margins of error. And in the predictive realm, it could, for example, detect the onset of a wide range of health issues, from high blood pressure to early signs of delirium. Emergency admissions could then, in theory, be reduced, with proactive health systems in place to address problems before they become more serious or irreversible. More generally, data from a network of IoT devices has the potential to transform the check-in process, automatically filling in past health data for professionals to review.


Container Management Simplifies SDN Application Deployment

One of the problems that SDN companies attempted to tackle is firewall rule explosion. Firewall access control lists (ACLs) are notoriously difficult to understand and process. For example, a customer I worked with at a former company had 50,000 firewall rules on a single firewall device, and they did not know whether they could remove any one rule without breaking an application! Load balancers have similar problems. With hundreds of applications come thousands of rules that must reside in a single hardware load balancer. Clearly, there is a problem. One way to attempt to solve it is to create network application centricity. Many network IT vendors claim application-centric infrastructure and networking.
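One reason rules become practically unremovable is shadowing: a later rule can be entirely covered by an earlier one and thus never fire. A toy check for fully shadowed prefixes follows; the addresses are invented, and real ACL analysis must also consider ports, protocols, partial overlaps, and whether the shadowing rule's action differs.

```python
# Flag rules whose match space is fully contained in an earlier rule's:
# such a rule can never match traffic and is a candidate for review.
import ipaddress

rules = [
    ("allow", "10.0.0.0/8"),
    ("deny",  "10.1.2.0/24"),      # shadowed by the /8 above
    ("deny",  "192.168.0.0/16"),
]

def shadowed(rules):
    hits = []
    for i, (_, net) in enumerate(rules):
        n = ipaddress.ip_network(net)
        if any(n.subnet_of(ipaddress.ip_network(earlier))
               for _, earlier in rules[:i]):
            hits.append(net)
    return hits

print(shadowed(rules))   # ['10.1.2.0/24']
```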



Quote for the day:


"The more that you read, the more things you will know. The more that you learn, the more places you’ll go." -- Dr. Seuss


July 19, 2016

Cybersecurity control a concern for digital businesses

Gartner predicts that by 2018, 25% of corporate data traffic will bypass enterprise security controls and flow directly to the cloud from mobile devices. With data no longer restricted to data centers, it is important to stop trying to control information and instead determine how it flows, Pratap added. “Finding all sensitive data and tracking all access in all forms will be too onerous for most organizations,” she said. “Each organization will have to manage their ability to do this within the limits of the resources they can commit. From personally identifiable information to sensitive intellectual property, the impact of compromise of such information on the organization needs to be assessed regularly.”


From Pig to Spark: An Easy Journey to Spark for Apache Pig Developers

Pig has a lot of qualities: it is stable, scales very well, and integrates natively with the Hive metastore through HCatalog. By describing each step atomically, it minimizes the conceptual bugs you often find in complicated SQL code. But sometimes Pig has limitations that make it a poor programming paradigm for your needs. The three main limitations are: Pig is a pipeline and doesn't offer loops or conditional branching (IF..THEN), which can sometimes be mandatory in your code. ... Finally, a third Pig limitation is related to input data formats: whereas Pig is good with CSV and HCatalog, it seems a bit less comfortable reading and processing some other data formats like JSON (through JsonLoader), which Spark integrates natively.
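The two limitations can be seen side by side in plain Python, sketching what Spark driver code gets for free: native JSON parsing (here using the one-object-per-line layout that `spark.read.json` expects) and ordinary conditional branching over the data, which a pure Pig pipeline cannot express. The field names are invented.

```python
# JSON-lines input, one record per line, parsed natively.
import json

lines = [
    '{"user": "a", "clicks": 3}',
    '{"user": "b", "clicks": 17}',
]
rows = [json.loads(line) for line in lines]

# Branching on the data (IF..THEN), ordinary code rather than a
# fixed pipeline step:
heavy = [r["user"] for r in rows if r["clicks"] > 10]
print(heavy)   # ['b']
```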


Insurance is ready for an upgrade

Before too long, IoT may enable carriers to become primarily the ensurers of safety and productive use of properties, rather than just the insurers of damages should a loss occur. If IoT detects the imminent failure of a $100 compressor in a $1 million piece of equipment that prevents a $100 million business-interruption loss, an entirely new value chain is created. If carriers don’t seize the moment, outside tech firms could launch IoT platforms that already have an ingrained risk-transfer component, thereby beating insurers at their own game. Nor are life insurers immune to the disruptions caused by enhanced connectivity. More life carriers will likely take the plunge into telematics, including some utilizing a fitness-monitoring device to award points for those who exhibit healthy behaviors, thereby allowing policyholders to earn premium discounts and other rewards while facilitating a richer, more holistic relationship with their insurer.


Introduction to data-science tools in Bluemix – Part 3

A big part of any data science activity is learning how to put the data in a format that helps you gain insight. A common task is looking at the data in time segments, joining on date patterns or time-of-year dates. In this recipe we will look at how to transform dates so they can be used as date formats rather than text strings. In addition, we will look at joining data frames from multiple data sources. ... You will notice that the date is in the format “MMMM-YY”; this is a concern because the year is not specific. Because I know the data, I have made a rule in this case that everything less than 20 is for the year 2000 and beyond, and everything 20 and above is for the 1900s. The next concern is that I need my date in “YYYY-MM-DD” format, and there is no day component in the source date, so I am going to default it to “01”.
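The rule just described can be sketched directly. The function name is mine, and `%B-%y` assumes English month names in strings like "January-15".

```python
# "MMMM-YY" -> "YYYY-MM-DD": two-digit years below 20 map to the 2000s,
# 20 and above to the 1900s, and the missing day defaults to "01".
from datetime import datetime

def normalise(raw):
    parsed = datetime.strptime(raw, "%B-%y")   # e.g. "January-15"
    yy = parsed.year % 100                     # recover the two digits
    year = 2000 + yy if yy < 20 else 1900 + yy
    return f"{year:04d}-{parsed.month:02d}-01"

print(normalise("January-15"))   # 2015-01-01
print(normalise("March-87"))     # 1987-03-01
```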


Europe Builds a Network for the Internet of Things. Will the Devices Follow?

For growth to accelerate, says de Smit, a few things are necessary. The first is for the KPN network to enable location-based features, which would, for instance, allow a shipping container to be tracked in transit across the country—something expected to go live before the end of 2016. The second is IoT coverage beyond national borders. Siemens, Shimano, and other large companies are very interested in gaining access to IoT networks, but only when there is enough geographic coverage, says de Smit. That may take a few years. KPN is not the only company building out the IoT. SigFox, a French startup, claims its competing wireless grid already covers 340 million people in parts of 22 countries. The company raised well over $100 million in investment in 2015 alone, and is using the money to expand as rapidly as possible.


Red Hat Shoots to Solve Container Storage with Gluster and OpenShift

The integration translates to another option for storing data inside containers. That’s important because, to date, other persistent storage solutions for containers have tended to be clunky. Here’s why: Docker containers are ephemeral. They spin up and down as needed, which is what makes containerized infrastructure so scalable and agile. But it also makes it hard to store data persistently, since you can’t store permanent data inside containers very effectively if the containers themselves are not permanent. Previous attempts to solve this conundrum have centered on creating special containers dedicated to storage, or allowing containerized apps to access storage on the host system.


Organising for Analytics Success - Centralising vs. Decentralising

As we know, the analytics team needs to have an acute understanding of the business and the business unit they are working in. To be able to build models and derive insights, it's important that there is some context for the objectives of the business unit as well as the problem the analytics team is solving for. It is on this premise that many Heads of Analytics (and similar) believe that analytics has to be decentralised: deploy a Head of Analytics into each business unit, allow them to work alongside the business owners, and build insights with specific knowledge of the customer and the product. This structure makes perfect sense. Except when you take into account that there is a distinct shortage of people who can build advanced analytical models, understand the business, and have the ability to lead a team and engage with the business.


Chief data officer job stakes claim in data innovation

We forget, but before big data and analytics became mainstays, shops would take all of their data out of transactional systems, build a data warehouse, do some data cleansing and run some reports, and maybe, if you were really, really good, that could become the golden copy of your data, which you could send back to your applications. That's what we called the closed loop. It was data warehouse nirvana. But the IT and application development groups would have their release cycles, and the data warehouse group would have its release cycles. The two would never meet, and they didn't really care about each other. Now, the big data platform has really become the back end of some applications, especially for analytics like recommendation engines and applications that measure customers' propensity to buy.


5 steps to avoid overcommitting resources on your IT projects

Maureen Carlson, Partner, Appleseed Partners, says, "Not enough companies are connecting the dots about the impact of resource overcommitment and the ability to deliver on innovation to meet growth objectives. The research shows that companies are working on products or projects that are at risk of delayed delivery because there was not enough capacity to take them on in the first place. Mature organizations are in a position to evaluate capacity in real-time to make critical business tradeoffs and see continued investment in this area as a competitive differentiator." ... PMOs play a crucial role in assisting organizations with strategy and execution and as such must recognize the need for effective resource management and capacity planning.


Has open source become the default business model for enterprise software?

When it comes to building the business, open source and proprietary are the same -- but different. The biggest difference is starting points. The proprietary software company starts with an idea that is refined based on identifying customer pain points and classic gap analysis. With open source, the trigger is less formal, because at the outset, the primary risk is sweat equity. Somebody gets an idea, develops it in the wild, and in place of gap analysis, there's the sink-or-swim process of developer interest going viral. But, ultimately, both need to deliver some unique value-add, scale it, and go to market. Then there is the neatness, or lack thereof, of the open-source model. Witness the long tail of adoption of Android updates, or the ordered disorder of the Hadoop platform, where each commercial platform has different mixes and matches of open-source projects.



Quote for the day:


"To double your net worth, double your self-worth. Because you will never exceed the height of your self-image." -- Robin Sharma


July 18, 2016

How big data is having a 'mind-blowing' impact on medicine

Research by Ericsson predicted that, while currently only 27% of the population in Africa has access to the internet, data traffic is predicted to increase 20-fold by 2019—double the growth rate of the rest of the world. Terheyden explained that while infrastructure may be rather basic in places such as Africa, and some improvements still need to be made around issues such as bandwidth, telehealth has already begun to open up new opportunities, so much so that the way medicine is practiced in developed countries can appear archaic by comparison. "I know there are still some challenges with bandwidth...but that to me is a very short-term problem," he said. "I think we've started to see some of the infrastructure that people are advocating that would completely blow that out of the water."


Harnessing the Data Tipping Point of IoT

It was not a huge leap for the industry to realize that an IoT global network of continuously connected devices would mean not only that data would be created at geometric rates, but that it would become one of the most valuable commodities in the world. And although there are many new start-up companies storing, analyzing and integrating massive lakes of big data created from the IoT, not many have actually considered how the IoT will transform how organizations think about and implement data quality and information governance. Wikipedia defines information governance as a set of core disciplinary structures, policies, procedures, processes and controls implemented to manage information at an enterprise level, supporting an organization’s immediate and future regulatory, legal, risk, environmental and operational requirements.


Post-Brexit fintech – don’t just sit there, do something

It has been rightly said that nothing much will happen for a couple of years. Those who said that to reassure don’t know much about the markets. For the FS world, “nothing much happening” while we wait, slow evolution towards an unknown status is the worst kind of climate. Uncertainty isn’t a friend of the markets, and to navigate it, banks and FS institutions will retrench, limit “nice to have” activity, stretch timelines for experiments and investment cycles, diversify and minimise exposure and risk. The aspirational and experimental initiatives will be the first to suffer, not because banks suddenly don’t care, but because they need to protect their staff, shareholders and regulatory standing in order to still be around at the end of the storm to still have aspirations.


French public sector’s never-ending struggle with the cloud

There is no question that the French public sector could benefit from cloud technology. It could allow the government to consolidate its IT resources. Different government agencies currently have their own datacentres, and none of them are used at full capacity. As cloud computing is perfectly suited to fluctuations in capacity, scaling up and down would be quick and easy. It would also provide the flexibility required to implement new services more quickly. As government agencies develop new programmes, they could rapidly implement and deploy the applications to support them. Acknowledging this untapped opportunity, the French government last year released details of a two-pronged strategy to move data and services to the cloud.


Blockchain market outlook: Hype vs. reality

A blockchain system can link to the financial systems of all those distributed companies' ERP systems and/or link to their bank accounts directly and understand the exact cash position at any given time across the enterprise across the globe. You could actually watch it go up and down in real time practically. That kind of knowledge is pretty valuable, and the blockchain enables the financial arm of a company instant or almost-instant access to current information about current assets [and] cash positions and can rapidly roll up consolidated financials. ... So blockchain just gets added to the environment. You don't have to buy anything or redo anything. You just need to integrate at certain points the various financial systems to the blockchain so that it exposes its data to it.


Balancing the Demands of Big Data With Those Of Accurate Data

Gaming, ad tech and e-commerce are three examples of industries that have fully entered the multiverse of heavy, high-value workloads. For these industries, it is equally important to process massive numbers of transactions and retain complete data accuracy, but traditional database technology puts these aims at odds with each other. If B-grade sci-fi has taught us anything, it is that messing with nature—whether it be tweaking the space-time continuum or resurrecting dinosaurs—is dangerous business. Similarly, the kind of coding changes required to scale MySQL beyond its natural limits (such as those involved in sharding), or to ensure complete data integrity in NoSQL, wreak havoc on the very applications that the databases are supposed to power, making them fragile and much more complex to manage.
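The kind of application-level routing that sharding imposes can be sketched in a few lines. The scheme and shard names below are invented; the fragility the passage alludes to comes from everything this sketch omits, such as resharding, cross-shard queries, and distributed transactions.

```python
# Hash-based shard routing: the application, not the database, decides
# where each key lives, and every query must carry this logic.
import hashlib

SHARDS = ["mysql-shard-0", "mysql-shard-1", "mysql-shard-2"]

def shard_for(user_id):
    digest = hashlib.sha256(user_id.encode()).digest()
    return SHARDS[int.from_bytes(digest[:8], "big") % len(SHARDS)]

# Routing is deterministic, so the same key always lands on one shard:
assert shard_for("user-42") == shard_for("user-42")
```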


Artificial Intelligence Swarms Silicon Valley on Wings and Wheels

“Whenever there is a new idea, the valley swarms it,” said Jen-Hsun Huang, chief executive of Nvidia, a chip maker that was founded to make graphic processors for the video game business but that has turned decisively toward artificial intelligence applications in the last year. “But you have to wait for a good idea, and good ideas don’t happen every day.” By contrast, funding for social media start-ups peaked in 2011 before plunging. That year, venture capital firms made 66 social media deals and pumped in $2.4 billion. So far this year, there have been just 10 social media investments, totaling $6.9 million, according to CB Insights. Last month, the professional social networking site LinkedIn was sold to Microsoft for $26.2 billion, underscoring that social media has become a mature market sector.


How to go beyond the reboot to provide topnotch tech support

You head out to your vehicle to start your morning commute. You turn the key and the car doesn't start. What's the first thing you do? You grab a container of gasoline, right? No! Well, not initially. You will more than likely take other steps to troubleshoot why your car doesn't start. You may check whether the headlights come on, or turn the key to see if the vehicle's starter turns over. At any rate, your first step is not to put gasoline in the tank. Having a fair understanding of how vehicles operate aided you in your triage. Why not apply this to IT support? When a user gets an error submitting an online form, restarting the browser won't resolve the issue. Analyzing the error message may open a door to a resolution. The user may have been entering text into a numeric field of the online form.
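The numeric-field case can be sketched in a few lines: a validator that returns a precise error message gives support staff something to read instead of something to reboot. The field name and message are illustrative only.

```python
# Minimal sketch of the triage idea: surface an error message that
# points at the actual cause (text in a numeric field).
def validate_quantity(raw: str):
    """Return (value, None) on success or (None, error_message)."""
    if not raw.strip().isdigit():
        return None, f"Field 'quantity' expects a number, got {raw!r}"
    return int(raw.strip()), None

value, err = validate_quantity("twelve")
# err now explains the failure; no browser restart required.
```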


11 Programming Languages For DevOps Success

DevOps depends on two critical pieces: Software development and operational automation. Each of these requires programming and (follow me, here) programming tends to need a programming language. For those trying to chart a career path in DevOps, the question of what language or languages to learn for each side of the equation is key. ... Are you on a DevOps team? Have you led part of a DevOps organization? I'm curious about the tools you or your team have used as part of successful DevOps. I'm equally curious about languages you think are important for people getting into the field in 2016. I'll be hanging out in the comments section below -- once you've reviewed our list, stop by and let me know what you think.


Skills gap leaves firms at risk from cyber attacks

“An insufficient number of specialists entering the IT market has forced organisations to consider effective retention programmes, training existing staff, partnering with educational institutions and developing flexible hiring policies that include both permanent and contract specialists,” he added. The technology sector as a whole is suffering from a skills gap, with many people resorting to up-skilling internal employees to fill vacant roles. Firms are increasingly looking for people with soft skills as well as technical skills. Owen said providing insights from data analysis and communicating IT security issues to others in the firm are important for an IT security employee. Cloud security skills are the most sought after for IT firms, but they are also the most challenging to find among potential candidates.



Quote for the day:


"The greatest single human gift - the ability to chase down our dreams." -- Prof. Hobby


July 17, 2016

Windows Containers and Docker

The Windows Server container shares the kernel with the OS running on the host machine, which means all containers running on that machine share the same kernel. At the same time, each container maintains its own view of the OS, registry, file system, IP address, and other components, with isolation provided to each container through process, namespace, and resource control technologies. The Windows Server container is well suited for situations in which the host OS and containerized applications all lie within the same trust boundary, such as applications that span multiple containers or make up a shared service. However, Windows Server containers are also subject to an OS/patch dependency with the host system, which can complicate maintenance and interfere with operations.


BNP's Ex-Blockchain Lead is Now Coding Smart Contracts for Clearinghouses

Along with co-founder and fellow BNP Paribas alum, David Acton, the bootstrapped team has already built a proof-of-concept for contract creation and trade registration that uses a smart contract for US treasuries and other cash-like short-term treasury instruments in Europe. The smart contracts are intended to represent bilateral contracts between parties that are backed by different guarantors, likely a member of the CCP clearinghouse. The smart contracts themselves would be administrated by the CCP. In Europe, CCPs such as the European Central Counterparty NV and Eurex Clearing serve counterparties by both taking funds from a buyer and assets from a seller and managing the risk in a wide range of ways. In the US, the DTCC fulfills a similar function.


The Batch Mode Window Aggregate Operator in SQL Server 2016: Part 2

Besides the general performance advantages of batch mode processing compared to row mode processing, this operator uses a dedicated code path for each window function. Many inefficiencies in the original row mode optimization are removed. For example, the need for an on-disk spool is eliminated by maintaining the window of rows in memory and using a multi-pass process over that window in a streaming fashion. ... Remember that when querying columnstore, sorting for computation of window functions is unavoidable since columnstore data isn’t ordered; however, you do get the benefits of reduced I/O cost, batch mode processing and parallelism, with much better scaling for larger number of CPUs, compared to row mode processing.
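As a rough Python analogue of the in-memory streaming window described above (not SQL Server's actual implementation), here is a moving SUM over a frame like "ROWS BETWEEN 2 PRECEDING AND CURRENT ROW", computed in a single pass with no spool:

```python
# Illustrative streaming window aggregate: keep the frame's rows in
# memory, add the incoming row, age out rows that fall past the frame.
from collections import deque

def moving_sum(values, preceding=2):
    window = deque()          # the in-memory window of rows
    total, out = 0, []
    for v in values:
        window.append(v)
        total += v
        if len(window) > preceding + 1:
            total -= window.popleft()   # row left the frame
        out.append(total)
    return out

# moving_sum([10, 20, 30, 40]) -> [10, 30, 60, 90]
```

Each row is touched a constant number of times, which is the essence of why eliminating the on-disk spool pays off.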


SQL Server, Power BI, and R

R has also been integrated into Power BI, allowing you to create fully integrated visualizations with the power of the R language. In this blog post I will show an example using R in SQL Server to create a model and batch score testing data, then use Power BI Desktop to create visualizations of the scored data. Rather than moving your data from the database to an external machine running R, you can now run R scripts directly inside the SQL Server database – train your model inside the database with the full power of CRAN R packages plus the additional features of speed and scalability from the RevoScaleR package. And once you have a model, you can perform batch scoring inside the database as well.


Strengthening the Foundations of Software Architecture

Once the software architecture is in place, it may subsequently be discovered the requirements have changed or were never fully understood. How easy it is to change the software depends on whether it was architected in such a way that alterations don’t significantly conflict with the original design. The more extreme agile methodologies deviate further from this blueprint. Applications are written in smaller slices, delivering value to the user faster but reducing visibility of the overall design. There often isn’t any one individual responsible for designing the architecture, and the decision-making is delegated to the developers incrementally working on the software. Because cycle times are reduced, the team can get feedback faster and respond more quickly to changing requirements, but how easy it is to implement changes still depends on whether they are congruous with the architecture.


Basho Open Sources Time Series Database Riak TS 1.3

Basho is indeed the biggest contributor to Riak TS, primarily because we had to make so many additions and changes to have a purpose built time series solution. We are currently talking to several companies about working on a series of capabilities to Riak TS to solve a large problem in the time series arena. As for the objective of open sourcing the code, we believe that we have a lot to offer the community in terms of innovative approaches, ideas, leadership in distributed systems and want to collaborate to build even better solutions. That process is almost always accelerated when you leverage open source as a path. We have a long history of open sourcing our software and gaining support from the community in creating better solutions.


Modeling your big data enterprise architecture after the human body

Think about the storage systems in our brain. We have short-term, sensory, long-term, implicit, and explicit. Why do we have so many? The answer is there was an evolutionary benefit that each system provided over a generalized system. These systems most likely have different indexing strategies, flushing mechanisms, and aging-out/archiving processes. We find a parallel in our world of software architecture, with storage systems like RDBMS, Lucene search engines, NoSQL stores, file systems, block stores, distributed logs, and more. The same goes for our processing systems. Vision interpretation is very different from complex decision-making. Just like the brain, in software architecture, there are different execution patterns and optimizations that serve different use cases. Tools like SQL, Spark, SPARQL, NoSQL APIs, search queries, and many more. There is a reason for the different approaches to processing, and there will be more approaches in the future as we find different ways to address our problems.


How Cardihab uses data to speed recovery of cardiac patients

Cardihab is a spin-off company from the Commonwealth Scientific and Industrial Research Organisation, and is also a participant in the HCF Catalyst accelerator program. McBride explained that a key reason people do not complete their cardiac rehabilitation (CR) program is accessibility and convenience. "The way normal cardiac rehab works is it's usually a 6-8 week long program where the person has to go to a clinic once or twice a week and that can really be inconvenient, especially for patients who have returned to work, or for rural remote patients," McBride said. Cardihab has been designed to collect data about a patient including how many steps a patient has taken, and their blood pressure and sugar levels, via Bluetooth-enabled monitors. The information is then uploaded to the cloud and shared with the patient's clinician, who can access it through an online portal.


Deep Learning: Using an Artificial Brain to Protect against Cyberattacks

When applied to cybersecurity, it takes milliseconds to feed a raw data file and pass it through the deep neural network to obtain detection with the highest accuracy rate. This predictive capability of being able to detect a never-before-seen malware variant enables not only extremely accurate detection, but also leads the way to real-time prevention because at the very second a malicious file is detected, it is already blocked. Therefore, while traditional machine learning yields better results than signatures and manual heuristics, deep learning has shown groundbreaking results in detecting first-seen malware, even compared with classical machine learning. This observation is consistent with improvements achieved by deep learning in other fields, such as computer vision, speech recognition, text understanding, etc.


SQL Server 2016 Upgrade Planning

If you are using Master Data Services or Data Quality Services, keep in mind any customizations will get overwritten. You must back up your MDS and DQS databases before upgrading to prevent any accidental data loss. In SQL Server 2016, those applications have schema-level upgrade changes. After upgrading to SQL Server 2016, any earlier version of the Add-ins for Excel will no longer work. You will need to tell your users to download the SQL Server 2016 versions of the Master Data Services or Data Quality Services Add-in for Excel. Integration Services packages do not get automatically updated in a SQL Server upgrade. You will need to migrate packages after the service upgrade completes, using the Integration Services Package Upgrade Wizard. Developers can upgrade 2012 or 2014 projects to 2016 without manual adjustments after upgrade. They can also choose to incrementally update without deploying the whole project.



Quote for the day:

“I do not think that there is any other quality so essential to success of any kind as the quality of perseverance.” -- John D. Rockefeller


July 16, 2016

Big banks, big applications, big outsourcing

Outsourcing could help banks overcome these issues. It depends on how the banks approach their IT application outsourcing, Arora says. “Outsourcing done the traditional way—think of the functional siloes —will continue to emphasize the challenges outlined above. The new demand profile is different, and therefore the supply model needs to evolve as well,” he says. “However, in some situations outsourcing can also act as the bridge that helps a large bank collapse functional siloes and think of vertically integrated, front-to-back solutions.” Expect more as-a-service deals that involve business process, application, and infrastructure: enrollment-as-a-service, for example, or claims-as-a-service.


Lexington-Fayette Urban County CIO Aldona Valicenti Prepares for the IoT

As a combined city and county government, we must bring fiber to both a densely populated urban area and a more open rural area. In our request for proposal, we will ask that the service be ubiquitous. In some areas, companies come in and offer broadband, but will only build it in areas where enough people promise to purchase it. That’s not the setup we want. We want broadband available to everyone in the county, along with our urban areas. Our consultants say no one else in the country has tried this, so we are pioneers in that respect. In the end, we want every home in our county to have access to broadband. ... Let’s say a big fire happens in the city. With broadband, and each government building wired for the Internet of Things, sensors around town could notify the fire department immediately to send help faster.


SYN Flood Mitigation with synsanity

SYN cookies are a clever way of avoiding the storage of TCP connection state during the initial handshake, deferring that storage until a valid ACK has been received. It works by crafting the Initial Sequence Number (ISN) in the SYN-ACK packet sent by the server in such a way that it cryptographically hashes details about the initial SYN packet and its TCP options, so that when the ACK is received (with a sequence number 1 larger than the ISN), the server can validate that it generated the SYN-ACK packet for which an ACK is now being received. The server stores no state for the connection until the ACK (containing the validated SYN cookie) is received, and only at that point is state regenerated and stored. Since this hash is calculated with a secret that only the server knows, it doesn’t significantly weaken the sequence number selection and it’s still difficult for someone to forge an ACK for a different connection without having seen the SYN-ACK from the real server.
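The hash-then-validate round trip described above can be sketched in a few lines. This is a simplification, not the kernel's implementation: real SYN cookies pack the MSS and a coarse timestamp into specific bits of the ISN, and synsanity works at the netfilter layer. The secret and addresses here are illustrative.

```python
# Simplified SYN-cookie sketch: derive the server's ISN from the
# connection 4-tuple and client ISN with a keyed hash, store nothing,
# and validate the later ACK by recomputing the same value.
import hmac, hashlib

SECRET = b"server-only-secret"   # known only to the server

def syn_cookie(src_ip, src_port, dst_ip, dst_port, client_isn):
    """Compute the server ISN for the SYN-ACK, statelessly."""
    msg = f"{src_ip}:{src_port}>{dst_ip}:{dst_port}:{client_isn}".encode()
    digest = hmac.new(SECRET, msg, hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big")

def ack_valid(src_ip, src_port, dst_ip, dst_port, client_isn, ack):
    # A valid ACK acknowledges one more than the ISN we would have sent.
    expected = (syn_cookie(src_ip, src_port, dst_ip, dst_port,
                           client_isn) + 1) % 2**32
    return ack == expected

isn = syn_cookie("10.0.0.1", 12345, "10.0.0.2", 80, 1000)
assert ack_valid("10.0.0.1", 12345, "10.0.0.2", 80, 1000, 1000,) is False or True
```

Only when `ack_valid` returns true does the server allocate connection state, which is what defeats a flood of SYNs that never complete the handshake.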


Microsoft expands its Surface Enterprise Initiative

This initiative extends the enterprise program to better involve Cloud Solution Providers who are also Surface Authorized Distributors. These resellers now have the option to offer Surface hardware alongside managed cloud services and subscriptions to other Microsoft services, such as Windows 10 and Office 365. Microsoft hopes that "Surface as a Service" will make it easier to upgrade to the latest devices so that IT departments can be assured everyone is working from the same up-to-date hardware and software. The company also hopes this initiative will bring more flexibility to purchasing options, so businesses have an easier time transitioning their workforce over to Surface devices. The company launched this program with ALSO, a leading CSP out of Europe, with plans to expand the program with partners all around the world.


Microsoft's Satya Nadella thinks these four technologies will reshape IT

Microsoft wants businesses to use its Azure cloud platform to connect the various collaboration and communications apps, line of business software and professional social networks like LinkedIn, which Microsoft is acquiring. "What if the software was built in ways that we could connect those disparate worlds," said Nadella, adding he believes Microsoft is "on the cusp" of being able to do so. By linking cloud data and services, he said Microsoft wants to allow companies to build highly automated business systems. The keynote saw a demonstration of how a company called Ecolab is using Microsoft's technology to automate its business. The utilities company uses a Power BI dashboard to view real-time data and analysis covering its site management, retail and operational performance. In the example, Ecolab used the Power BI dashboard to see that a customer called City Power had an issue with a cooling tower.


Smart Data Is a Bigger Priority Than Big Data for FinTech Companies

There is a significant difference between collecting piles of data and finding relevant information that can be used immediately. Generating critical information to create profiles and project customer behavior are what makes big data so appealing to FinTech companies. But it’s important not to lose track of the validity, quality and usefulness of the information. Big data is associated with a lot of “noise” that needs to be canceled out. Smart data, on the other hand, offers a more valuable proposition. Solving business problems hinges on the availability of information that’s clear and that can be used immediately. As is the case in nearly every business sector, quality trumps quantity every single time. Tackling the precise pain points of a business model requires a gentle and smart approach, rather than brute force and sheer volume.


Millennials Think About Work Too Much

The long-term goals that were more typical of Millennials seem to target the same general objectives of reducing stress and worry and exercising more, although Millennials were more specific. For example, Millennials were much more likely to mention specific wellness goals, such as yoga, than to simply say they were hoping to get more fit. But in this study too, Millennials were more likely to talk about work. They mentioned finding a new job with better benefits, more pay, better hours, and more work-life balance, as well as work that was more intrinsically rewarding. This, again, was much more typical of the Millennial age group than older or younger groups. When looking at long-term goals that are the least typical of Millennials, we find that Millennials are the group with the lowest interest in goals related to faith and worship. They were much less likely to use words such as “god,” “pray,” “spiritual,” or “Bible,” for example.


Taking Agile To The Next Level

Going beyond the essential idea that software should have testable goals - based on my own experiences trying to do that - I soon learned that not all goals are created equal. It became very clear that, when it comes to designing goals and ways of testing them (measures), we need to be careful what we wish for. Today, the state of the art in this area - still relatively unexplored in our industry - is a rather naïve and one-dimensional view of defining goals and associated tests. ... Typically, the tests ask the wrong questions (e.g., the airline that measured speed of baggage handling without noticing the increase in lost or damaged property and insurance claims, and then mandated that every baggage handling team at every airport copy how the original team hit their targets).


Don't Break your Silos - Push Out the Silo Mentality

It is widely believed that one of the easiest ways to unite people is to give them a common adversary, which motivates them to perform better collectively. If each of the team leaders starts to create an image of adversity around the people from other teams, an inner competition may arise between the teams themselves. If it is friendly and just for sport, it may boost the performance of the whole company, as all of the teams will aim to perform at the peak of their skills; however, with envy and pride on the line, people from the teams may try to do everything in their power to stop their rivals from winning, including minimizing contact, withholding information, trying to create conflict between the management and other teams, etc. Constant disagreement between the leaders of different teams or departments may drive a wedge between their subordinates and divide them.


Computer says: oops

False positives can never be eliminated entirely. But the scientific standard used in this sort of work is to have only one chance in 20 that a result could have arisen by chance. The problem, says Dr Eklund, lies with erroneous statistical assumptions built into the algorithms. And in the midst of their inspection, his team turned up another flaw: a bug in one of the three software packages that was also generating false positives all on its own. The three packages investigated by the team are used by almost all fMRI researchers. Dr Eklund and his colleagues write that their results cast doubt on something like 40,000 published studies. After crunching the numbers, “we think that around 3,000 studies could be affected,” says Dr Eklund. But without revisiting each and every study, it is impossible to know which those 3,000 are.
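The arithmetic behind that concern is worth making explicit: a p < 0.05 threshold gives each independent test a 1-in-20 false-positive chance, so across many tests a spurious hit becomes nearly certain. A short worked calculation:

```python
# Family-wise error: probability of at least one false positive
# across n independent tests, each run at significance level alpha.
def familywise_error(alpha: float, n_tests: int) -> float:
    return 1 - (1 - alpha) ** n_tests

# One test at alpha = 0.05: a 5% chance of a spurious result.
# Twenty tests: roughly a 64% chance that at least one is spurious.
# A whole-brain fMRI analysis runs thousands of such comparisons,
# which is why the correction methods under scrutiny matter so much.
```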



Quote for the day:


"In the end, we will remember not the words of our enemies, but the silence of our friends." -- Martin Luther King Jr


July 13, 2016

Linux Mint 18: The best desktop -- period

Mint uses the Linux 4.4 kernel. On top of that it uses the X.org 1.18.3 windowing system. While Mint's default desktop is Cinnamon, like I said earlier, you, not some company, get to choose your desktop. MATE, the GNOME 2.x clone, is already supported. Other desktops, such as KDE, LXDE, and Xfce, will be available soon. Despite these changes, Mint still runs on old computers you have sitting in your garage. You only need 512MB of RAM to run it, although 1GB is recommended. You can fit Mint on a 10GB hard drive, although 20GB is recommended. As for the display, you can run it at 1024×768 resolution, or even lower if you don't mind using the ALT key to drag windows with the mouse.


Angular versus React: Which One Will Reign Supreme?

They have some similarities, which has led to numerous debates and discussions on which is the better tool for web development. Is Angular better, with its strongly defined structure and adherence to traditional coding rules? Or is React better, with its flexibility and speed? Is Angular too rigid and strict for its own good, to the point that it doesn't really give developers the freedom to innovate? Or will the lack of a defined structure hurt React in the long run while its freeform mix of HTML and JavaScript results in lazy coders? Comparing these two has become even more exciting thanks to major developments out of both camps. Both React and Angular announced major releases for 2016. Angular just released Angular 2, while React promises major project website updates and more robust handling of animation.


Dutch Central Bank Prepares its Boldest Blockchain Experiment Yet

Like the bitcoin network itself, the experiment envisions how an FMI's internal operations could be distributed among participating nodes. To game the system — and break the financial market infrastructure — an attacker would need to gain more than half the computing power running the nodes. News of the experiment, scheduled to begin later this year, comes as financial market infrastructures are increasingly being targeted by hackers. Earlier this month, the chairman of the Bank for International Settlements (BIS) went so far as to call for immediate action on potential solutions to the issue. Now in a new interview, De Nederlandsche Bank's head of market infrastructure, Ron Berndsen, explained why he believes blockchain could be the key to preventing more attacks.


IBM and Samsung achieve breakthrough on flash killer for wearables, mobile devices

NAND flash on average takes one microsecond to write data compared to MRAM's 10 nanoseconds -- meaning MRAM is 100 times faster than NAND flash on writes and 10,000 times faster on reads, said Daniel Worledge, the senior manager of MRAM development at IBM Research, in an email reply to Computerworld. "This is important because it now falls into the sweet spot compared to other memory technologies and this level makes it viable to manufacture," Worledge said. "This could never be done with in-plane magnetized devices — they just don't scale," Worledge said, referring to hard disk drives and NAND flash. "While more research needs to be done, this should give the industry the confidence it needs to move forward. The time for Spin Torque MRAM is now."


Microsoft Tests Natural Language IFTTT Alternative

The app automation service, which can be reduced to the more appealing abbreviation CAP, is similar to glue services IFTTT and Zapier that allow apps to interact with each other. It's also similar to another recently introduced Microsoft offering, Flow, so much so that it's tempting to wonder whether the company's various teams talk to one another. But conversational comprehension is CAP's reason for being. The experimental project from Microsoft's Technology and Research group offers a way to automate app interactions using natural language rather than code. CAP supports Flow's menu-driven programming model in which a user directs the service, for example, to send an SMS notification to a mobile device when an email arrives in Outlook. This is a proven method for interaction and is easy enough for almost anyone to manage.


Why Open Source Graph Databases Are Catching on

Open source graph databases are proving especially popular, as companies increasingly shun proprietary software and vendor lock-in for data management and storage. Open source also gives software developers more flexibility and makes it easier to control up-front costs. All of the major social networks use open source graph databases. Twitter created the open source FlockDB for managing wide but shallow network graphs. Google's Cayley was inspired by the graph database behind Freebase and its Knowledge Graph, the knowledge base behind its search engine. Facebook uses Apache Giraph, which was built for high scalability. "Remember back to Alta Vista before Google? Alta Vista was good, but Google was so much better because it actually understood how all the pages on the web linked together," said Quinn Slack, co-founder and CEO of Sourcegraph.
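Slack's point about link structure can be shown with a toy example (illustrative, not the internals of any of the databases named above): store edges between pages, then rank pages by how many others link to them rather than by scanning documents.

```python
# Toy link graph: pages as nodes, links as directed edges, and a
# query that exploits the link structure (in-degree ranking), which
# is the kind of question graph databases are built to answer fast.
from collections import defaultdict

edges = [("a", "b"), ("a", "c"), ("b", "c"), ("d", "c")]

incoming = defaultdict(set)
for src, dst in edges:
    incoming[dst].add(src)

# "c" is linked from three pages, so a link-aware ranking surfaces it
# first -- the same intuition that separated Google from Alta Vista.
ranked = sorted(incoming, key=lambda n: len(incoming[n]), reverse=True)
```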


All your IoT devices are doomed

They won't become "unusable", but a lot of functionality is in jeopardy. And there are a lot of these devices in existence. If you take into account the other models affected along with the iPad 2, such as the original iPad mini and the iPad third-generation, that accounts for about 40 percent of all iPad devices in the wild that cannot take an iOS 10 update. As an industry, I think we need to step back and think about the realistic lifetimes of IoT and smart devices, and what can be done to extend their lifetimes when they are at risk of abandonment. The expected lifetime of an IoT device should probably be based on the type of device. I like to think of these devices as belonging to three distinct groups: endpoints, hubs, and clients. An endpoint is a device managed by something else. These are devices that if unmanaged should still be able to function without a working cloud service.


HTTP-RPC: A Lightweight Cross-Platform REST Framework

HTTP-RPC services are accessed by applying an HTTP verb such as GET or POST to a target resource. The target is specified by a path representing the name of the resource, and is generally expressed as a noun represented as a URI, such as /calendar or /contacts. Arguments are supplied either via the query string or in the request body, like an HTML form. Results are generally returned as JSON, although operations that do not return a value are also supported. For example, the following request might retrieve the sum of two numbers, whose values are specified by the a and b query arguments:
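A request in that style might look like GET /math/sum?a=2&b=4, with the service replying 6 as JSON. Here is a hedged sketch of the server side of such a call in Python; the /math/sum path and parameter names are illustrative, and this is not the HTTP-RPC framework itself.

```python
# Illustrative handler for a GET-style RPC call: pull the a and b
# arguments out of the query string and return the JSON-encoded sum.
import json
from urllib.parse import urlparse, parse_qs

def handle_sum(request_uri: str) -> str:
    query = parse_qs(urlparse(request_uri).query)
    a = float(query["a"][0])
    b = float(query["b"][0])
    return json.dumps(a + b)

# handle_sum("/math/sum?a=2&b=4") -> "6.0"
```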


Offshore And Cloud Service Providers Upset IT Outsourcing's Top Tier

While many offshore firms have continued to grow at double-digit rates, that arbitrage-fueled expansion is likely to slow as the relative importance of labor costs decreases. “The mid-tier of offshore firms will start to compete more aggressively on price and the larger firms will have to become more like traditional firms — firstly by changing the mix of skills they have and also leveraging technology they have built or invested in,” Snowden says. “The business model will be based more on IP they own and less reliant on lots of cheap labor.” Whether cloud providers will maintain their current growth rates is unclear. “The big debate in this space — particularly around AWS — is whether infrastructure-as-a-service is a commodity service or not,” Snowden says.


As IoT Proliferates, The Role Of IT Keeps Growing

IT’s influence on purchases and deployments of video surveillance is a relatively new phenomenon. Deploying IP-based video has broad implications for network infrastructure requirements, bandwidth usage and data storage consumption. As an example, for a national retailer, the security and loss prevention departments might introduce tens of thousands of IoT-based devices to manage. And in a school or commercial operation, it might be thousands of devices. As these devices transmit sensitive information, they are vulnerable targets for hackers. IT needs to take steps to protect these network-based edge devices against cyber attacks and ensure that data is transmitted securely from edge to core. In addition, IT needs to collaborate closely with those departments responsible for physical security and facilities management to protect against physical tampering or sabotage of the devices.



Quote for the day:


"You never know what worse luck your bad luck has saved you from." -- Cormac McCarthy


July 12, 2016

Why Microsoft is betting its future on AI

"It's the modern era — you don't have to be an expert in speech and language understanding," Connell says. "Just use our tools. Go build your branded bot with our tools and put it on whatever canvas — it might be Slack, it might be Facebook Messenger. We hope it might be Skype or Windows. But you choose." And with fears mounting among developers that a war could emerge over bot standards, Microsoft has been uncharacteristically diplomatic. It organized a conference in San Francisco in June to promote cooperation among bot-makers. "We're really interested in it being interoperable — we want it to be an ecosystem," says Lili Cheng, a senior engineer at Microsoft who helped organize the two-day event. (It was called Botness.) "It's more like, what are the problems and challenges that we are finding that we can work on together?"


Israeli Fintech Hybrid: Another Block in the Blockchain

Israeli universities are home to a number of Blockchain research pioneers, including Turing Award winner Professor Adi Shamir of the Weizmann Institute of Science and Prof. Eli Ben-Sasson of the Technion. In the private sector, as per a recent report by the global consulting firm Deloitte, there are dozens of Israeli startups developing a wide range of Blockchain technologies in various sectors, including security, hardware, virtual currency, payments, P2P and social platforms. For example, one startup offers a Blockchain solution to eliminate the need for documents in international shipping agreements, another an application for secure and validated purchase and storing of goods online, and a third creates Blockchain data templates for use by banks and enterprises.


The New Data Scientist Venn Diagram

What is relevant is to understand where an individual’s interest lies in the broad data science church and where the needs of the organisation are. The individual’s interest may be developing innovative algorithms to solve a new problem (the high-end data scientist described by Davenport and Patil), or identifying new business problems that can be solved with existing tools or distributed programming for Hadoop. The key is to match the organisation’s needs with an individual’s interest and not be bothered with the position title or the candidate’s label. Finally, as for finding this rare species, let me point out that the characteristics of curiosity, self-direction and innovation are required in all scientific research. Fashioning tools to overcome a challenge has always been the hallmark of a research scientist. Didn’t Newton invent infinitesimal calculus when the mathematical tools at his disposal were insufficient to calculate the instantaneous speed?


Surge in real-time big data and IoT analytics is changing corporate thinking

"The Internet of Things will enable sensor tracking of consumer-type products in businesses and homes," he said. "You will be able to collect and analyze data from various pieces of equipment and appliances and optimize performance." The process of harnessing IoT data is highly complex, and companies like GE are now investigating the possibilities. If this IoT data can be captured in real time and acted upon, preventive maintenance analytics can be developed to preempt performance problems on equipment and appliances, and it might also be possible for companies to deliver more rigorous sets of service level agreements (SLAs) to their customers. Kelly is excited at the prospects, but he also cautions that companies have to change the way they view themselves and their data to get the most out of IoT advancement.
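To make the preventive-maintenance idea concrete, here is a minimal sketch of the kind of check such analytics might start from: flagging sensor readings that deviate sharply from their recent baseline. This is an illustrative toy, not GE's actual pipeline; the function name, window size, and threshold are all assumptions.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline.

    A reading is flagged when it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Simulated vibration-sensor stream: stable values, then a spike of the
# kind that might precede an equipment failure.
stream = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 5.0, 1.0]
print(flag_anomalies(stream))  # the spike at index 7 is flagged
```

In a real deployment this logic would run continuously against streaming telemetry and trigger a maintenance ticket rather than a print statement, but the core pattern (compare each reading to a rolling baseline) is the same.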


How to make sure your Hadoop data lake doesn't become a swamp

If Hadoop-based data lakes are to succeed, you'll need to ingest and retain raw data in a landing zone with enough metadata tagging to know what it is and where it's from. You'll want zones for refined data that has been cleansed and normalized for broad use. You'll want zones for application-specific data that you develop by aggregating, transforming and enriching data from multiple sources. And you'll want zones for data experimentation. Finally, for governance reasons you'll need to be able to track audit trails and data lineage as required by the regulations that apply to your industry and organization. This is no simple matter, Henschen writes. To accomplish it, enterprises will require a mature set of tools for data ingestion, transformation, cataloging and other tasks.
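As a rough illustration of the zoning and tagging described above, the sketch below builds the minimal metadata record a file needs on arrival in the landing zone. The zone names, paths, and metadata fields are assumptions for illustration, not any standard or vendor layout.

```python
import json
import time

# Hypothetical zone layout mirroring the landing/refined/application/
# experimentation split described above; paths are illustrative.
ZONES = {
    "landing":     "/datalake/landing",      # raw data, as ingested
    "refined":     "/datalake/refined",      # cleansed and normalized
    "application": "/datalake/application",  # aggregated and enriched per app
    "sandbox":     "/datalake/sandbox",      # data experimentation
}

def make_ingest_record(source, dataset, zone="landing"):
    """Build the minimal metadata tag a landing-zone file needs:
    what it is, where it came from, and when it arrived. This record
    is also the starting point of the dataset's lineage trail."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    return {
        "dataset": dataset,
        "source": source,
        "zone": zone,
        "path": f"{ZONES[zone]}/{dataset}",
        "ingested_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "lineage": [],  # appended to as the data moves between zones
    }

record = make_ingest_record("crm_export", "customers_2016_07")
print(json.dumps(record, indent=2))
```

Real platforms delegate this cataloging to dedicated metadata tools, but even a simple record like this answers the two questions a landing zone must answer: what the data is, and where it came from.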


Is Artificial Intelligence (AI) the next game changer in IT?

IT companies aspiring for a piece of the AI market can collaborate in areas of development with cloud service providers, mobile application developers, IT infrastructure service providers and analytics engine providers. AI applications and platforms need microprocessors to execute complex tasks at speed, cloud computing for low-cost processing, storage for massive volumes of unstructured data, smarter analytics engines with Natural Language Processing (NLP), voice and pattern recognition, and machine learning, plus mobility for widespread use from remote locations. IT companies involved in any of these areas can collaborate to tap into the potential that AI offers. In terms of AI adoption, however, it's still early days. Global organisations are working to understand various classes of AI solutions, where the technology is now, what is on the horizon, and how to convert in the near to long term.


Self-service BI Success Depends Upon Data Quality & Governance

Non-technical workers can make faster, better decisions because they no longer have to wait out long reporting backlogs. At the same time, technical teams are freed from the burden of satisfying end-user report requests, so they can focus their efforts on more strategic IT initiatives. For self-service BI environments to be effective, they must be extremely intuitive and user-friendly. The majority of today's business users simply don't have the skills or technical savvy to work with complex tools or sophisticated interfaces. A self-service BI application will only be embraced by its intended audience if it gives them a means of simply accessing customized information, without extensive training. Here are three factors for organizations to consider before they implement self-service BI for their user base:


ITSM Capabilities: It’s Not About the Processes!

Outcomes are more important than processes. Pearl Zhu talks about Business Capability versus Processes over at Future of CIO. The article is well worth reading; in it, she describes a process as how something is done, whereas a capability is what is done – in other words, what is accomplished, or the outcome. It comes down to this – a process by itself doesn't produce value. Processes produce outputs. What the business achieves with those outputs are business outcomes, and those can be measured in monetary terms. In a corporate environment, directors and senior managers are accountable for how they spend the company's money. They must ensure that what the company invests in produces optimum value for the stakeholders. IT spend is no different. Whatever investments a company makes in IT must be held to that same standard – that of producing maximum value for the organization.


Why You Shouldn't Pay The Ransomware Fee

There is the issue of being able to trust that a single payment will result in the return of data as promised, but enterprises hit with ransomware also face the hard fact that an attack can make their most critical information inaccessible and, in some cases, unrecoverable. "Some people might argue that paying is a viable option at that point," Manship said. In paying, though, they also have to consider whether they can trust the bad guys to keep their word. Certainly, this act of holding data hostage could become a continuous cycle. Manship said, "There is no evidence that decrypting data means they are out of your system. Are they going to give you the key? How many times are they going to try to extort money out of you until they laugh and walk away and you are out of luck?"


Five questions boards should ask about IT in a digital world

The IT organization can no longer be considered just a service provider; how it manages the integration of emerging technologies can help determine the success of a company’s digital strategy. Therefore, simply relying on cost-related measures will not provide a full picture of IT performance. And the CIO’s boardroom presentations will continue to get lost in translation. Boards need to master a second language—one focused on digital themes, such as speed to market, agile product development, platform-based delivery models, and the benefits and challenges of analyzing various forms of corporate data. With a higher degree of digital fluency, boards can help C-suite leaders make better decisions about how to expand a company’s most successful technology initiatives and when to pull the plug on lagging ones.



Quote for the day:


"Direction, not intention, determines destination" -- Andy Stanley