Daily Tech Digest - January 01, 2017

7 IT Recruiting Predictions For 2017

In 2016, recruiters grappled with an evolving job market, an incredibly competitive hiring landscape, a shortage of IT talent and sky-high salaries for in-demand roles. Those trends are likely to continue into 2017 as recruiters keep adapting to this high-demand, low-supply hiring landscape, says Dave Morgan ... "We're very optimistic about what 2017 will hold. We don't see the IT market slowing down at all -- we saw a slight tick downward in the fall, but that demand has picked back up along with the urgency from clients around that demand," Morgan says. The three hottest areas of demand are security, big data and cloud technologies, Morgan says, as incidents like December 2016's Yahoo breach dominate headlines, organizations struggle to manage and make sense of a tsunami of data, and companies leverage the cloud for more efficient, cost-effective computing power.


IoT, mobility, big data, analytics and imaging intelligence to impact healthcare in 2017

The next-generation hospital management system is a long-term transformation plan with a three-year horizon. ... But we are trying to see how we can stage it into two phases, so that phase one can be done by the end of 2017 and the second phase by 2018. So we are trying to make sure we stay focused, because beyond three years you kind of lose oversight of it and the project becomes really unviable. That's the first priority we are holding to ourselves. On the cybersecurity side, with the recent ransomware news coming in, where patient records have been attacked by hackers and ransomware, and since we are also moving towards a web-managed model with payment gateways for websites and a patient portal that will hold patients' medical records, we thought we should be upfront from a cybersecurity standpoint rather than doing a post-mortem. So that is becoming core in our focus area.


AngularJS vs. React: Two JavaScript Technologies & How to Use Them

React shines when you have lots of dynamic content changing within the view. Most client-side solutions on the web today struggle with rendering large lists of items within a single view. This "struggle" may be on the order of milliseconds, but in this increasingly digital world, a half-second delay is all it takes to kill your user experience. It's why sites with a lot of dynamic, constantly changing, data-intensive content, like Instagram or Facebook, choose to power their applications with React. Because React is a library, you can also use it as the view component of AngularJS or other frameworks, since it does not tie you to a specific technology stack. That said, there's a quickly growing community and a number of existing libraries and add-ons that can help you build an app from scratch.


Big Data, Big Self-Service

Data self-service became much more prevalent. It is not entirely trivial to organize: best practices demand that effective access management, security and, where necessary, encryption are in place. There may also be a need for metadata capture software and data cleansing software. However, the pay-off is significant. The main dynamic of this is that the user no longer needs to go cap-in-hand to some IT developer to get access to data. In most organizations, there are limits to what can be held in a data warehouse, and there may even be onerous procedures for getting at that data. Adding new data sources to the data warehouse is often prohibitive. The difference with a data lake can be startling. The data lake is, or should be, a single staging area for new data within the organization, and it is extensible.
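As a rough illustration of that self-service dynamic, the sketch below shows an analyst pulling a newly staged source straight out of the lake with pandas, with no warehouse change request involved. The staging path and column names are hypothetical placeholders, not a reference to any specific platform.

```python
# Minimal self-service sketch: read a newly staged source directly from the
# data lake instead of waiting for a warehouse ETL change. The path and
# column names below are hypothetical placeholders.
import pandas as pd

# New source files dropped into the lake's staging area as Parquet.
clicks = pd.read_parquet("/data-lake/staging/web_clicks/2017-01/")

# Lightweight cleansing before exploratory analysis.
clicks = clicks.dropna(subset=["user_id"]).drop_duplicates()

# Quick exploratory view: unique visitors per page.
print(clicks.groupby("page")["user_id"]
            .nunique()
            .sort_values(ascending=False)
            .head(10))
```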


Information architecture: The key to governance, integration and automation

To develop a successful information architecture and meaningful insights, we need to enforce collaboration across business units, IT, the CDO office and other parts of the organization. And perhaps more importantly, we need to change the culture to get people thinking about how new technologies help eliminate barriers and create an environment in which everything is automated and transparent. Thinking about information taxonomy and classification alone isn't enough. Tools are just tools, and we will continue to live in the past without proper collaboration, information sharing, knowledge-base building and the adoption of a new way of thinking about information. Technology is an enabler; people need to understand and change the processes, and do so with technology's help.


Want to make better decisions? Break down the wall between data and IT

There is a lot of hype around the deficit of data scientists for hire, but I believe you do not need an army of data scientists to make meaningful discoveries. With the tools and systems available today, it is possible for even one or two talented, motivated data scientists to create game-changing innovation. The key is to hire carefully, looking for strong training in quantitative disciplines (pick your favorite, it really does not matter), allied with the mindset of a restless tinkerer. As a leader, I try to cultivate growth, autonomy and teamwork throughout the company. This allows the company and the data team to attract talent who want to work with us and who share the same goals. As a company, we aren't just looking for people who have a degree in analytics or experience in the field. In fact, we have over 30 different graduate degrees on our team.


Using data science for predictive maintenance

Early identification of these potential issues helps organizations deploy maintenance teams more cost-effectively and maximize parts and equipment up-time. All the critical factors that help predict failure may be deeply buried in structured data such as equipment year, make, model and warranty details, and in unstructured data covering millions of log entries, sensor data, error messages, odometer readings, speed, engine temperature, engine torque, acceleration and repair and maintenance reports. Predictive maintenance, a technique to predict when an in-service machine will fail so that maintenance can be planned in advance, encompasses failure prediction, failure diagnosis, failure type classification, and recommendation of maintenance actions after failure.
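As a hedged sketch of the failure-prediction part of this, the Python snippet below trains a classifier on historical maintenance records so in-service units can be ranked by failure risk. The file name, column names and the 30-day failure label are hypothetical assumptions, not taken from the article.

```python
# Minimal failure-prediction sketch using scikit-learn. The file and column
# names (equipment_age_years, engine_temp_mean, failed_within_30d, ...) are
# hypothetical; substitute whatever your maintenance history actually records.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Structured equipment attributes joined with aggregated sensor/log features.
records = pd.read_csv("maintenance_history.csv")

features = ["equipment_age_years", "odometer", "engine_temp_mean",
            "engine_torque_max", "error_count_30d"]
target = "failed_within_30d"  # 1 if the unit failed within the next 30 days

X_train, X_test, y_train, y_test = train_test_split(
    records[features], records[target], test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate, then use predict_proba on in-service units to rank them by risk
# so maintenance teams can be dispatched to the highest-risk equipment first.
print(classification_report(y_test, model.predict(X_test)))
```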


Outsourcing Trends To Watch in 2017

This year, we saw outsourcing integration challenges multiply, production workloads and enterprise systems hit the cloud, and security rise to the top of the agenda. So what's ahead for 2017? Uncertainty, for one thing. Industry watchers expect a number of shifts in the IT and business process services space -- not least of which will be the introduction of more flexible outsourcing terms as the world watches and waits to see what happens once president-elect Donald Trump takes office and Brexit takes hold. We also expect to witness maturation in cloud computing, robotic process automation (RPA) and cognitive capabilities, while entities like the call center and business models based solely on labor arbitrage fade into history.


How to Use AOP in C# with NConcern .NET

Most AOP frameworks require compromises due to technical limitations (changes needed in the source code, or in the compilation or deployment process). Few of them are simple to handle, and the coupling with the framework is often too strong. This is one of the reasons why too few people are interested in aspect-oriented programming. However, applying the concepts of aspect-oriented programming is very simple, provided they are well understood and the right tools are available. That is why I would like to share with you a basic AOP scenario using NConcern .NET, a dedicated open source framework (under MIT licence).
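NConcern itself is a C# framework, so the snippet below is only a language-agnostic sketch of the underlying AOP idea, written in Python: cross-cutting "advice" (here, logging and timing) is woven around an existing method at runtime without touching the business code. The class and method names are invented for illustration and have nothing to do with NConcern's actual API.

```python
# Conceptual AOP sketch (not NConcern's C# API): a logging/timing aspect is
# woven around an existing method at runtime, keeping the cross-cutting
# concern out of the business code.
import functools
import time

def logging_aspect(cls, method_name):
    """Replace cls.method_name with a wrapper that adds before/after advice."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def advice(self, *args, **kwargs):
        start = time.perf_counter()
        print(f"-> {method_name} called with {args}")
        try:
            return original(self, *args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            print(f"<- {method_name} finished in {elapsed_ms:.2f} ms")

    setattr(cls, method_name, advice)

class OrderService:
    # Plain business code: no logging, timing or framework coupling here.
    def place_order(self, order_id):
        return f"order {order_id} placed"

# "Weave" the aspect without modifying OrderService's source.
logging_aspect(OrderService, "place_order")
print(OrderService().place_order(42))
```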


Java Performance Mythbusters

The JVM can be more aggressive in its optimizations through its knowledge of exactly which classes are loaded at any point in time. Even the problem of application warmup can be alleviated with technologies like Azul's Zing ReadyNow!, which stores a profile of an application during execution. At startup, the profile can be used to substantially reduce the time required to analyze and compile frequently used sections of code. The problem is that, when looking at Java performance, some people still believe Java works the way it did twenty years ago. One of my colleagues, when I was at Sun, had an excellent way of illustrating this. When talking to people with Java performance problems, his approach was something like this ...



Quote for the day:


“The secret to a rich life is to have more beginnings than endings.” -- Dave Weinbaum

