“What’s very interesting now is that years ago you had to explain what open source was,” says Jagielski. “That’s certainly not the case anymore.” “We never thought it would catch on as quickly or as deeply as it has,” he adds. “People who loved open source thought it had the potential to change the world – as much of a cliché as that is – but never thought it would be realised.” In fact, these days every corporate entity – and his wife – is keen to stress its involvement with open source. All enterprise software is heading in this direction, one way or another, but in many ways the flavour has changed, and Apache is definitely part of the less commercial old guard. It has always attracted very loyal fans.
There are several reasons for this evolution. On top of the concerns already mentioned, robo-advisers seem less attractive than they were at the start, when the robo-adviser offering was relatively straightforward and easy to understand. With ever more players hitting the market, and various types of robo-advisers with different target groups and functionalities, the picture has become more confusing. Add to that reigning fears, glitches and, last but not least, challenges regarding customer and user experience, plus too much focus on costs (they are “cheaper” than human advisers) rather than on the overall benefit picture, and the challenges become clearer.
So as the Architect, you’ve always done your projects, always carefully facilitated the discussions and guided decisions when defining solutions, and now you find yourself in a rapidly changing world where business people are building solutions themselves. You find yourself increasingly useless and no longer relevant. On the other hand, if you pick up a role that articulates the value of these new technologies in the emerging business contexts, you really have to change your job a lot to remain meaningful. The fundamental value of architecting has not changed, but the spectrum of choices, the moving parts, and the building blocks have greatly increased, and all against a background where everybody wants things very quickly and very cheaply.
As indicated in the five levels of data infrastructure, data infrastructure is a necessary component for business growth, much as physical infrastructure is necessary for the growth of a community. When an organization has only a few employees, word of mouth can be a workable way of managing very rudimentary forms of data, such as whether a particular customer has paid its bill. As it grows, the organization needs more definitive and easily accessible answers on more and more topics, which means it needs more efficient ways to gather, use, and disseminate data. This is what data infrastructure allows you to do. ... Data infrastructure can be roughly divided into four elements that give the company access to the data it needs to solve customer and business problems.
Over the last few years, LOD has slowly become an accepted way of exposing data on the internet. One could also say that LOD, together with the IoT, is one of the key requirements for smart cities. If governments opened up their datasets, and in particular their sensor networks, over the internet using LOD, this could ‘enable’ smart cities. The ‘things’, such as sensors, expose their data in a structured way, linked to other datasets. This in turn may lead to applications that are not yet foreseen, as the data has not previously been available in this manner. For example, a traffic intensity sensor exposes not only the traffic intensity itself to the internet but also information about the road on which it is located. Using that road location, the reading can be combined with other data, such as road maintenance information, an air pollution sensor in the vicinity, and/or meteorological information.
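To make the traffic-sensor example concrete, here is a minimal sketch of how such a reading might be published as Linked Open Data in JSON-LD. The sensor ID, road URI, and the exact vocabulary terms are illustrative assumptions (loosely modelled on the SOSA and schema.org vocabularies), not a real government endpoint:

```python
import json

# A hypothetical traffic-intensity observation published as JSON-LD.
# The URIs below are made-up examples; the point is that the reading
# links to the road it was taken on, so consumers can follow that URI
# and combine the data with maintenance or air-quality datasets.
observation = {
    "@context": {
        "sosa": "http://www.w3.org/ns/sosa/",
        "schema": "http://schema.org/",
    },
    "@id": "https://example.org/sensors/ti-042/observations/1",
    "@type": "sosa:Observation",
    "sosa:hasSimpleResult": 1280,           # vehicles per hour
    "schema:unitText": "vehicles/hour",
    # The link that makes this "linked" data: the road segment URI.
    "sosa:hasFeatureOfInterest": "https://example.org/roads/A12/segment/7",
}

doc = json.dumps(observation, indent=2)
print(doc)
```

Because the observation points at a dereferenceable road URI rather than embedding a free-text road name, any third party can join this feed with other datasets about the same road without prior coordination.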
Cybercriminals harnessed the internet of things (IoT) to carry out the attack, highlighting the vulnerability of the billions of connected devices around the world. Recent attacks on Three Mobile, Tesco Bank, Yahoo, and TalkTalk prove that every company is a potential target, and should be a real wake-up call to global business leaders. Whether you are a small local business or a multinational brand, the question isn’t “will you be attacked?”, because you probably have been already. With vast networks of connected devices plugged into global digital infrastructure, and business value increasingly defined by intangible assets, this is a threat that knows no borders, time zones, or limits. In the insurance business, it is classed as one of the most complex and challenging man-made threats out there.
An average data scientist deals with loads of data daily. Some say 60–70% of that time is spent cleaning and munging data into a format to which machine learning models can be applied. This post focuses on the second part, i.e., applying machine learning models, including the preprocessing steps. The pipelines discussed in this post come as a result of over a hundred machine learning competitions that I’ve taken part in. The discussion here is quite general but very useful; far more complicated methods also exist and are practised by professionals. ... Before applying machine learning models, the data must be converted to a tabular form. This whole process is the most time-consuming and difficult part and is depicted in the figure below.
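The "preprocess, then model" flow described above can be sketched as a scikit-learn pipeline. The column names and synthetic data here are illustrative assumptions; the structure (impute and scale numerics, one-hot encode categoricals, then fit a model) is the generic pattern, not the author's exact competition pipeline:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Tabular data: one row per sample, with numeric and categorical columns
# (and some missing values, as real data usually has).
df = pd.DataFrame({
    "age": [25, 32, None, 51, 46, 29],
    "income": [40_000, 52_000, 61_000, None, 83_000, 47_000],
    "city": ["ams", "utr", "ams", "rtm", "rtm", "utr"],
})
y = [0, 1, 0, 1, 1, 0]

numeric = ["age", "income"]
categorical = ["city"]

preprocess = ColumnTransformer([
    # Numerics: fill missing values with the median, then standardize.
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    # Categoricals: one-hot encode, ignoring unseen categories at predict time.
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

model = Pipeline([("prep", preprocess),
                  ("clf", LogisticRegression())])
model.fit(df, y)
preds = model.predict(df)
print(preds)
```

Bundling preprocessing and estimator into one `Pipeline` means the same transformations are applied identically at training and prediction time, which avoids the leakage bugs that plague hand-rolled preprocessing.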
The technology works to identify the ownership of energy as it is generated and then to manage multiple trading agreements between consumers who buy excess solar, ... Perth-based Power Ledger is testing various applications of blockchain across residential, retail and wholesale electricity markets in three different pilot projects across the country, and one in New Zealand ... “It’s a win for the people who have been able to afford to invest in roof-top solar, but also a win for customers who haven’t: they will be able to access clean, renewable energy at effectively a ‘wholesale’ rate. Everyone wins.” Well, perhaps not everyone. As the BNEF report notes, while most blockchain software and business models are currently at a proof-of-concept or trial stage of development, their potential to “rapidly disrupt traditional energy market structures” cannot be ignored.
IT organizations turn to enterprise vendors to help with the speed of innovation needed to support business initiatives. In my discussions, IT organizations are finding a problem similar to the FANG group’s. Enterprise IT vendors are not positioned to innovate at the pace, or at the granularity, of specific industry or organizational requirements. Out of necessity, IT organizations are beginning to embrace open source solutions to meet the challenge. Open source projects tend to move at the pace required by the project contributors, and those contributors are starting to look much different from those of many traditional open source projects. Many forward-thinking end-user enterprises are dedicating developer resources to contribute to these projects directly, or taking a project’s base solution and adding capability.
“The microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and independently deployable by fully automated deployment machinery”. This post is not about the pros and cons of microservices and assumes that you’re simply interested in the underlying technology to support it. Another post worth checking out for those issues covers the main challenges in debugging microservices — which is a common pitfall that many don’t think of when considering microservice architectures.
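The definition quoted above can be made concrete with a minimal sketch of one such service: a single business capability (here, “orders”) running in its own process and exposed through a lightweight HTTP resource API. This uses only the Python standard library; the resource path and payload are illustrative assumptions, not part of the quoted article:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A toy in-memory "orders" store standing in for the service's own database.
ORDERS = {"1": {"id": "1", "status": "shipped"}}

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route GET /orders/<id> to the store; anything else is a 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for this demo

# Port 0 asks the OS for any free port; a real service would bind a known one.
server = HTTPServer(("127.0.0.1", 0), OrderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another service (or a test) consumes the resource over plain HTTP.
url = f"http://127.0.0.1:{server.server_port}/orders/1"
with urllib.request.urlopen(url) as resp:
    order = json.loads(resp.read())
print(order)
server.shutdown()
```

The point of the sketch is the shape, not the framework: the service owns one capability, speaks a plain HTTP/JSON contract, and can be deployed and restarted independently of whatever consumes it.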
Quote for the day:
"Don't envy what people have, emulate what they did to have it." -- Tim Fargo