Daily Tech Digest - November 30, 2016

10 ways to gracefully kill an IT project

The factors behind terminating a project vary: the complexity involved, limited staff resources, unrealistic project expectations, a naive and underdeveloped project plan, the loss of key stakeholders, higher priorities elsewhere, or some other element, but most likely a combination of several of these. ...  Halfway through implementation it becomes clear the backup process will consume more bandwidth than expected, will take an inordinate amount of time to back up servers with hefty storage levels, and will cost more in the long run than the existing tape solution, which can be refactored for cheaper, more efficient and less complicated administration. Clearly this is the proverbial "record coming off the needle" moment at which a harsh truth must be acknowledged.
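A quick back-of-envelope check makes the "inordinate time" concrete. The numbers below are purely illustrative, not from the article; they simply show how a full-backup window can be estimated up front from data volume and effective throughput, before committing to implementation.

```java
// Illustrative sketch: estimate how long a full backup takes at a given
// sustained throughput. All figures are made up for the example.
public class BackupWindow {
    // dataTerabytes: total data to back up (decimal TB)
    // throughputMBps: effective sustained throughput in MB/s
    static double backupHours(double dataTerabytes, double throughputMBps) {
        double totalMB = dataTerabytes * 1_000_000; // decimal TB -> MB
        double seconds = totalMB / throughputMBps;
        return seconds / 3600;
    }

    public static void main(String[] args) {
        // 50 TB over a link sustaining 200 MB/s: roughly 69 hours
        System.out.printf("%.1f hours%n", backupHours(50, 200));
    }
}
```

Run against realistic volumes, a check like this can surface the "harsh truth" months before the halfway point.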


Do You Really Have To Migrate To The Cloud?

Call it “digital Darwinism.” Cloud brings big benefits to application vendors. On-premises customers using previous versions require maintenance — and that’s a big drain on vendor resources. And they’re not as happy as new customers, because they don’t have the latest innovations. Salespeople have to spend lots of time trying to help justify upgrades — time they could use to sell to new customers. Extra costs and slower innovation mean that, over the long term, vendors offering cloud solutions will win. ... It’s time to remind her that her email — perhaps the most sensitive data you have in the company — already flies freely across public networks, protected only by security protocols. Remind her that money in your company’s bank accounts is really just data stored in ethereal databases around the globe.


4 things you should never say if you want to innovate successfully

For us, realistic innovation means starting with our customers, not our engineering department. Our product team creates bare-bones prototypes in low-tech tools like PowerPoint and shops them around. We learn what resonates with our target market before engaging expensive development resources, which ensures that engineers only work on projects with real legs. The products that pass the PowerPoint test do go to development, but customers remain involved. We ask them not only if the technology fills a need, but whether we are communicating the value proposition in a meaningful way. Successful innovation requires both: the right offering and the right pitch. When you’re lucky and smart enough to find both, pour fuel on the fire. The nature of that fuel will be different for different companies and products.


How the convergence of automotive and tech will create a new ecosystem

The operating models of the two sides differ dramatically. For example, automakers reengineer their core products approximately once every seven years, with noticeable updates every three years, but do not update existing products. Tech companies redo their core products about every two years, make noticeable updates every two months, and provide continual updates for existing products. The OEMs’ systematic “waterfall” approach to product development tends to slow down innovation; the average time to market is about five years. Most tech players depend on agile operating models that enable a time to market of roughly two years.


Branding: A Strategic Imperative For High-Tech Marketers

Unless a company has just hired a brand-new CMO who wants to leave a mark, is going through major changes such as a merger, or is departing from a product line, branding is usually a tool better left to consumer companies. In a world of big data marketing and mandatory ROI, branding spend is difficult to justify because its impact can appear intangible. It takes time for a new brand initiative to bring results, since building a brand does not happen overnight. The question is: why would you engage in a branding initiative in an industry where investors and shareholders have little patience? Well, here are a few reasons why we should consider branding a key strategic imperative for enterprise software or cloud services, especially in a high-growth environment.


Technology Isn’t the Answer to Surviving Digital Disruption

The goal is not to become a tech company. The goal must be to embed technology so ubiquitously and so deeply within the culture and operating model of the company that it becomes transparent, allowing you to enhance the customer experience and deepen your relationships. The challenge is that most traditional organizations have a self-view that is essentially synonymous with the product they sell or the service they provide. The required shift is not to become a tech company, but rather to make the organization synonymous with the value it provides and the relationships it creates. The organization can only re-envision its business model from that perspective. The real goal of digital transformation, therefore, is to leverage technology to reshape and enhance the value you deliver to your customers.


A new Digital CIO must emerge from digital economy disruption

Technology has certainly affected the way we do business and the roles companies play. The time is now to refocus and turn those initiatives into a full-fledged digital transformation strategy. That means CIOs must reinvent themselves: they must understand how the disruptions that digital transformation will bring to businesses as we know them can create opportunities for growth, and establish themselves as the strategic digital leaders companies expect and need them to be. Before we explore what’s coming with the new digital reality, it is important to look at the present and understand how companies and CIOs are dealing with their IT departments. Reinventing the IT function to support digital transformation requires far-reaching changes, from talent to infrastructure, and takes multiple years to complete.


What is email security and how can SMEs get it right?

“Email was never intended to be used in the way it is now. It’s not really kitted out for all of the risks associated with the internet; it was designed for a more trusting environment,” he explains. And it’s a mistake to think that SMEs don’t present a worthwhile target. In fact, they present attractive opportunities. ... “What does worthwhile mean?” asks Mr Bauer. “It’s relative to the cost of putting on an attack, and to the downside of getting caught.” Both are low when it comes to an attack on an SME, which makes them more appealing than larger corporations. Each time an attempt is made to hack your company via email, one of two aims is at play: to steal money or to gain information.
Small businesses should bear those purposes in mind, because they can be key to spotting – and stopping – hacks.


Why Now Is the Ideal Time for the CIO to Work with Graphs

There’s clearly enormous market growth taking place. Forrester Research estimates that one in four enterprises will be using such technology by 2017, while Gartner reports that 70% of leading companies will pilot a graph database project of some significance by 2018. Graph databases aren’t applicable or helpful for all problems; there are transactional and analytical processing needs for which relational technology will probably always be the correct option, and there are NoSQL database alternatives that handle other types of large datasets well. But graphs make sense for any organisation seeking to make the most of its connected data. That is why I would recommend that any CIO look to NoSQL, including graph databases, as a powerful new tool to supplement their RDBMS investment and deal with the growing data tsunami.
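To see why connected data suits graphs, consider a toy sketch (made-up data, plain Java rather than an actual graph database): a "friends of friends" query becomes a short local traversal of an adjacency list, where the relational equivalent would need repeated self-joins.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

// Toy adjacency-list traversal illustrating a connected-data query.
// Names and relationships are invented for the example.
public class FriendOfFriend {
    static Set<String> friendsOfFriends(Map<String, List<String>> graph, String user) {
        Set<String> result = new TreeSet<>();
        for (String friend : graph.getOrDefault(user, List.of())) {
            for (String fof : graph.getOrDefault(friend, List.of())) {
                if (!fof.equals(user)) result.add(fof);
            }
        }
        // keep only people the user doesn't already know directly
        result.removeAll(graph.getOrDefault(user, List.of()));
        return result;
    }

    public static void main(String[] args) {
        Map<String, List<String>> graph = Map.of(
            "ann", List.of("bob", "cat"),
            "bob", List.of("ann", "dan"),
            "cat", List.of("ann", "eve"));
        System.out.println(friendsOfFriends(graph, "ann")); // [dan, eve]
    }
}
```

Graph databases generalise exactly this pattern: each hop stays cheap regardless of how large the overall dataset grows.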


A new way to anonymize data might actually work

This is definitely a good thing, as EHR databases are now popular targets for cybercriminals due to the amount of data available in one location, as well as the fact that data—unlike financial information—cannot be changed or canceled. The paper's authors explain that the PEP framework consists of two components: polymorphic encryption and polymorphic pseudonymisation. The researchers begin with polymorphic encryption by explaining how it differs from more traditional encryption processes: "In traditional encryption, one encrypts for some chosen recipient who then holds the decryption key; whereas in polymorphic encryption one encrypts in a general manner and at a later time the encryption can be transcribed to multiple recipients with different keys."



Quote for the day:


"You have all the reason in the world to achieve your grandest dreams. Imagination plus innovation equals realization." -- Denis Waitley


Daily Tech Digest - November 29, 2016

CEOs See More Value In Technology Than People

There is a clear trend among CEOs to magnify the relative importance of technology in the future of work with 67 percent saying they believe that technology will create greater value in the future than human capital will. Another 63 percent of CEOs said they perceive that technology will become their firm’s greatest source of future competitive advantage. But the economic reality differs sharply, with human capital, not physical capital, creating the greatest value for organizations. CEOs’ distorted perceptions demonstrate the extent to which people are being painted out of the future of work ... A full 44 percent of leaders in large global businesses told Korn Ferry that they believe that the prevalence of robotics, automation, and artificial intelligence (AI) will make people “largely irrelevant” in the future of work.


Every company is a technology company, but most don’t behave like one

An interesting anecdote from The Lean Startup, one of the manifestos for startup founders, is that Intuit holds itself accountable for being innovative and agile by using two key metrics: (1) the number of customers using products that didn’t exist three years ago and (2) the percentage of revenue coming from offerings that did not exist three years ago. Historically, it took a new Intuit product an average of 5.5 years to reach $50 million in revenue; at the time the book was written, the company had multiple products generating $50 million in revenue that were less than a year old. Particularly as the world moves towards cloud computing, continuous development and continuous updates are the name of the game.
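Intuit's second metric is straightforward to compute once each offering's launch year is known. A minimal sketch with invented figures:

```java
import java.util.Map;

// Sketch of the "share of revenue from offerings less than three years old"
// metric. Product names, launch years and revenue figures are made up.
public class NewRevenueShare {
    // each map value: {launchYear, revenueInMillions}
    static double newOfferingShare(Map<String, double[]> products, int currentYear) {
        double total = 0, fromNew = 0;
        for (double[] p : products.values()) {
            total += p[1];
            if (currentYear - p[0] < 3) fromNew += p[1];
        }
        return total == 0 ? 0 : 100.0 * fromNew / total;
    }

    public static void main(String[] args) {
        Map<String, double[]> products = Map.of(
            "legacy-suite", new double[]{2008, 400},
            "cloud-app",    new double[]{2015, 100});
        System.out.println(newOfferingShare(products, 2016)); // 20.0
    }
}
```

Tracking a number like this over time is what turns "we are innovative" from a slogan into something a company can be held accountable to.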


5 Expensive Traps Of DIY Hadoop Big Data Environments

“Hadoop is known to be self-healing, so if a node goes down on a server, it’s not a problem,” Dijcks says. “But if you buy inexpensive servers, you’re more likely to have nodes down and spend more time fixing hardware. And when you have a chunk of nodes that aren’t working, you’ve lost that capacity.” ... IT departments figure, “‘We’ve invested a lot of time, we’ve worked on this very hard, and now we need to put it into production,’” Dijcks says. “You can learn on throwaway servers, because if [the environment] goes down, no worries—just restart it. But in production, the cluster needs to stay up through hardware failures, human interaction failures, and whatever can happen.”


A fast data architecture whizzes by traditional data management tools

"Knowledge is power, and knowledge of yesterday is not as valuable as knowledge about what's happening now in many -- but not all -- circumstances," said W. Roy Schulte, vice president and analyst at Gartner. Businesses want to analyze information in real time, a practice dubbed fast data. Traditionally, acting on large volumes of data instantly was viewed as impossible because the hardware needed to support such applications is expensive. ... The use of commodity servers and the rapidly decreasing cost of flash memory now make it possible for organizations to process large volumes of data without breaking the bank, giving rise to the fast data architecture. In addition, new data management techniques enable firms to analyze information instantly.


The Financial Impact of NOT having Data Governance

During a recent customer visit, we got to discussing the financial impact of Data Governance. To help explain the point, I thought I’d share some of the more common problems associated with NOT having data governance. By looking at it from this point of view, we can get an idea of what the business is doing to overcome these issues, and can then associate some value with that effort. ... This isn’t meant to be an exhaustive paper on the subject; it is more a sharing of thoughts and ideas. I’d also add that the ideas presented in this blog aren’t suggesting these impacts will happen; rather, they reflect some common challenges we see in the world of Financial Services and a way to try to understand the potential financial impact they might cause. These challenges should be seen from the perspective of potentially being part of a broader Data Governance initiative.


Social Media Is Killing Discourse Because It’s Too Much Like TV

It makes us feel more than think, and it comforts more than challenges. The result is a deeply fragmented society, driven by emotions, and radicalized by lack of contact and challenge from outside. This is why Oxford Dictionaries designated “post-truth” as the word of 2016: an adjective "relating to circumstances in which objective facts are less influential in shaping public opinion than emotional appeals." ... Social media, in contrast, uses algorithms to encourage comfort and complacency, since its entire business model is built upon maximizing the time users spend inside of it. Who would want to hang around in a place where everyone seems negative, mean, and disapproving? The outcome is a proliferation of emotions, a radicalization of those emotions, and a fragmented society.


Will 'Digital Fingerprint' Technology Prevent Data Thieves?

The virtual intelligent eye works by generating a digital "fingerprint," based on behavior, for every single login by every single user in every single application and database across the organization. This information is a recording of the "who, what, when, where, why and how" of data access within an organization. Once a baseline for behavior is established, the system can easily identify anomalies in user activity and immediately send out the appropriate alerts when there are deviations from normal behavior. The cost of this technology will be positively impacted by the continuing decline in the cost of storage and processing power -- from cloud computing giants like Amazon, Microsoft and Alphabet. The healthcare data security war can be won, but it will require action and commitment from the industry.
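A minimal, hypothetical sketch of the baseline-and-deviation idea: learn a user's typical login hours from history, then flag logins that sit far outside that baseline. Real systems would model many more dimensions than the hour of day, and the threshold here is arbitrary.

```java
import java.util.Arrays;

// Toy behavioral baseline: flag a login whose hour-of-day deviates from the
// user's historical mean by more than three standard deviations.
public class LoginAnomaly {
    static double mean(double[] xs) {
        return Arrays.stream(xs).average().orElse(0);
    }

    static double stdDev(double[] xs) {
        double m = mean(xs);
        double var = Arrays.stream(xs).map(x -> (x - m) * (x - m)).average().orElse(0);
        return Math.sqrt(var);
    }

    static boolean isAnomalous(double[] baselineHours, double loginHour) {
        double sd = stdDev(baselineHours);
        if (sd == 0) return loginHour != mean(baselineHours);
        return Math.abs(loginHour - mean(baselineHours)) / sd > 3.0;
    }

    public static void main(String[] args) {
        double[] usual = {9, 9.5, 10, 9, 8.5, 9, 10, 9.5}; // typical morning logins
        System.out.println(isAnomalous(usual, 9.5)); // false: within the baseline
        System.out.println(isAnomalous(usual, 3.0)); // true: a 3am login stands out
    }
}
```

The same pattern, applied per user and per application across "who, what, when, where, why and how", is what produces the alerts the article describes.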


Data Socialization: How to Achieve “Data Utopia” and Expedite Outcomes

Data socialization is an evolution in data accessibility and self-service across individuals, teams and organizations that is reshaping the way organizations think about, and employees interact with, their business data. Data socialization involves a data management platform that unites self-service visual data preparation, data discovery and cataloging, automation and governance features with key attributes common to social media platforms, such as the ability to leverage user ratings, recommendations, discussions, comments and popularity to make better decisions about which data to use. It enables groups of data scientists, business analysts and even novice business users across a company to search for, share and reuse prepared, managed data to achieve true enterprise collaboration and agility.


What's to blame for every single data breach? People, not technology

“Every breach occurs because someone in that company did something they were not supposed to do, or because someone in that company failed to do something they were supposed to do,” Abagnale said. “There is not a master hacker sitting in Russia who will get through the company. The hacker will say, ‘I am not getting into JP Morgan Chase because they spend a fortune every year on cybersecurity, but they employ 200,000 people worldwide, so all I am looking for is one of those people who failed to do something they were supposed to do, or did something they were not supposed to do.’” Abagnale said he will explain the weaknesses and soft spots in companies and instill in attendees that the most important job they have is to keep the information entrusted to them safe.


Reactor By Example

Reactor's two main types are Flux<T> and Mono<T>. A Flux is the equivalent of an RxJava Observable, capable of emitting 0 or more items and then optionally either completing or erroring. A Mono, on the other hand, can emit at most one item; it corresponds to both the Single and Maybe types on the RxJava side. Thus an asynchronous task that just wants to signal completion can use a Mono<Void>. This simple distinction between the two types makes things easy to grasp while providing meaningful semantics in a reactive API: just by looking at the returned reactive type, one can know whether a method is more of a "fire-and-forget" or "request-response" (Mono) kind of thing, or is really dealing with multiple data items as a stream (Flux).
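Since reactor-core is an external dependency, the cardinality distinction can be illustrated with JDK types alone: Stream plays the 0..N role of Flux, and Optional the 0..1 role of Mono. This is only an analogy (the JDK types are synchronous and pull-based, unlike Reactor's asynchronous, push-based publishers), but it shows how the return type alone communicates cardinality; the names below are invented for the example.

```java
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Analogy for Reactor's cardinality split, using only JDK types.
public class CardinalityDemo {
    // "Flux-like": the return type advertises a stream of 0..N items.
    static Stream<String> userNames() {
        return Stream.of("alice", "bob", "carol");
    }

    // "Mono-like": at most one result, as in a request-response lookup.
    static Optional<String> findUser(String id) {
        return "42".equals(id) ? Optional.of("alice") : Optional.empty();
    }

    public static void main(String[] args) {
        List<String> all = userNames().collect(Collectors.toList());
        System.out.println(all);                           // [alice, bob, carol]
        System.out.println(findUser("42").orElse("none")); // alice
        System.out.println(findUser("99").orElse("none")); // none
    }
}
```

In Reactor the same signatures would return Flux<String> and Mono<String>, and a completion-only task would return Mono<Void>.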



Quote for the day:


"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it." -- Edsger W. Dijkstra


Daily Tech Digest - November 28, 2016

Ultimate Kanban: Scaling Agile without Frameworks at Ultimate Software

Ultimate started experimenting with Agile principles (namely, Scrum) in 2005. This initial transition to Scrum gave Ultimate better visibility into the progress of teams towards wider business goals. However, there were some common sources of interruption that Scrum did not handle very well. Regulatory changes that required immediate attention often forced teams to throw out their sprint plans and start work on the new requirements. The ideal small Scrum team size (7-9 members) led to arbitrarily small teams with very high cross-team coordination costs. Most importantly, though, after trying our hand at Scrum for a while, we did not see any major improvement in productivity.


The next big job in tech: Robot programmer

If your business is interested in bringing robot programmers in, Mass said it's important to integrate them with other engineers. "Don't isolate them," he said. "From my experience, some problems in robotics can only be solved by a clever combination of software, electronics and mechanical design. Sometimes, changing the surface or angle around a sensor can make all the difference to making it work reliably. Make sure all of your engineers are working closely together and are talking to each other about their problems. Sometimes a solution can come from an unexpected direction." How do you go about training to be a robot programmer? There are many books that teach programming, and you can also get your hands on a robotics kit. Also, Mass said, "you shouldn't be afraid of reading data sheets or using an oscilloscope."


Information Architecture: What Is It and Where Did it Come From?

In order to understand IA, we first need to know where it originated. The term first started appearing in the 1970s. In 1970, a group at the Xerox Palo Alto Research Center was tasked with developing technology that could support the ‘architecture of information’. That group was responsible for many important contributions to what is today known as human-computer interaction. They introduced the first personal computer with a user-friendly interface, laser printing, and the first WYSIWYG text editor. Modern use of the term IA, strictly related to the design of information, was officially introduced in the mid-1970s at the American Institute of Architects conference, where a man named Richard Saul Wurman introduced an idea he called ‘the architecture of information’.


Upcoming bank rules could serve as a model for money management firms

Mr. Jacco believes banking regulations on cybersecurity will eventually apply to money managers. “It will be harder for them,” he said. “Some of them don't have big external websites; maybe they just have trading sites. Now on top of that they need a risk management function.” The regulations also will create a compliance change and organizational shift at money managers, Mr. Jacco said. The federal regulations, once established, “could create a new market standard for cybersecurity in general. The market may force everyone — managers, regulators — into that direction. But this phenomenon could take a long time to play itself out,” said Morgan Lewis' Mr. Horn.


The Internet of Things is making hospitals more vulnerable to hackers

Unfortunately, IoT start-ups often consider security to be a low priority, or an expensive headache that can be dealt with later on. That's a problem when those systems can potentially make the difference between life and death. "When implementing IoT solutions the components are chosen for their low cost and specific capabilities; however, the capabilities are significantly below what might be justified when the assets protected are human life, and security costs may be a significant portion of the cost, or even greater than the cost of the components. Prevalent vulnerabilities, however, do not only facilitate malicious actions, they may also increase the likelihood and impact of human errors and system failures," the report warns.


Six key principles for efficient cyber investigations

Even the largest companies appear ill-equipped to deal with more sophisticated cyberattacks, like the latest IoT-based Mirai DDoS attack or attacks detected months or years after the initial breach, such as the Yahoo and Dropbox attacks. Inundated by alerts, analysts lack the automated and intelligence-driven processes to home in on attacks across the kill chain, and breaches continue far too long. To address this fundamental mismatch, organizations need a new perspective on the way they detect and respond to attacks. Like police investigations in the real world, every cyber investigation starts with a lead upon which a hypothesis is built. As more evidence is gathered in the field, the case continues to build until investigators can confirm or refute the direction of the investigation.


Q&A on the ​Practice of System and Network Administration

The key is to get information as early as possible. Discovering a problem on launch day is the worst. A simple technique is to have a beta launch to find problems early. Everyone knows that, but people don’t think to do it for internal systems or system administration tools. We take this even further. Can you launch a single feature to validate assumptions months ahead of the real launch? I like to launch a service with no features, just the welcome-page, months ahead of the actual system launch. This gives us time to practice software upgrades, develop the backup procedures, document and test our runbook, and so on. Meanwhile the developers flesh out the system by adding features. When the system is ready for real users, there are very few surprises because the system has been running for months. Best of all, users get access to new features faster.


Whatever you're doing in Linux, Windows 10 will soon do it too

"Whatever it is that you normally do on Linux to build an application: whether it's in Go, in Erlang, in C, whatever you use, please, give it a try on Bash WSL, and importantly file bugs on us. It really makes our life a lot easier and helps us build a product that we can all use and be far more productive with." The pledge to improve Windows' support for Linux tools reflects a recent change in Microsoft's rhetoric towards open-source software. While Microsoft's then CEO Steve Ballmer described open-source software as a cancer in 2001, in 2014 Microsoft CEO Satya Nadella proclaimed that "Microsoft loves Linux". Nadella's declaration may be simplistic, and ignore Microsoft's desire to stop organizations switching from Microsoft to open-source desktop software, as seen in Munich, but the tech giant has changed its hardline approach—even if only for pragmatic reasons.


Fault injection destined to be a must-have technique for software pros

Purposefully creating situations that can cause services and software to crash or malfunction is called fault injection. It is a QA paradigm that two software engineers from Microsoft believe can mitigate the risks associated with modern software deployment and management, especially for applications and services in the cloud, by helping engineers observe and fix these failures in a controlled manner rather than dealing with them for the first time at an unexpected moment. ... Fault injection can be compared to the testing method known as "stress testing," Zervos added -- creating more traffic or putting more stress on a service externally. But even this type of test will not provide the kind of information or insight fault injection can, including a look at how dependencies will behave in a given situation.
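A fault-injection harness can start as small as a wrapper that fails a call with a configurable probability, letting tests observe how callers cope with a misbehaving dependency. The sketch below is illustrative (the class and rate are invented, not from the article):

```java
import java.util.Random;
import java.util.function.Supplier;

// Minimal fault-injection wrapper: with probability failureRate, a wrapped
// service call throws instead of returning, so resilience logic can be
// exercised deliberately rather than discovered in production.
public class FaultInjector {
    private final Random rng;
    private final double failureRate;

    FaultInjector(long seed, double failureRate) {
        this.rng = new Random(seed); // seeded for reproducible test runs
        this.failureRate = failureRate;
    }

    <T> T call(Supplier<T> service) {
        if (rng.nextDouble() < failureRate) {
            throw new RuntimeException("injected fault");
        }
        return service.get();
    }

    public static void main(String[] args) {
        FaultInjector injector = new FaultInjector(42L, 0.3);
        int ok = 0, failed = 0;
        for (int i = 0; i < 1000; i++) {
            try {
                injector.call(() -> "response");
                ok++;
            } catch (RuntimeException e) {
                failed++;
            }
        }
        System.out.println("ok=" + ok + " failed=" + failed); // roughly a 70/30 split
    }
}
```

Unlike stress testing, which only pushes load from the outside, injecting the failure at the dependency boundary shows exactly how the caller's retries, timeouts and fallbacks behave.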


2017 Predictions: Mobile Is The Face Of Digital

There is no question that mobile moments are the battleground to win, serve and retain your customers. What a mobile moment is and where it surfaces, however, will become amorphous as it extends beyond smartphones to platforms and connected devices and then eventually lives in a consumer’s personal ecosystem. App usage as we know it has likely peaked. In 2017, platforms will expand in importance as consumers continue to consolidate their time into fewer places on the smartphone. Already, they spend 84% of their time in just five apps. These experiences, which we still loosely refer to as mobile (but not for much longer), will live as fragments on third-party platforms.



Quote for the day:


"Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won't come in." -- Alan Alda


Daily Tech Digest - November 27, 2016

Stretching Agile in Offshore Development

Stretching Agile is not easy, especially when factors like distributed development and teams with different cultures come into play. In my opinion, how you communicate with people is fundamental if there is a need to create awareness and self-responsibility. Another piece of advice is to put robust working agreements in place, created and reviewed with the involved parties, not enforced or imposed, so everybody will understand their benefits. ... First I would say it is essential to understand the culture and how other people behave, react and talk. Second, it is important to have frequent face-to-face meetings and trips between sites. After getting to know people, we tend to work together in a more collaborative way; that's how humans behave.


Not That Bright: Japanese Robot Fails Top-Ranked University Exam

It turns out the robot is not good at grasping "meaning in a broad spectrum," said Noriko Arai, a professor at the National Institute of Informatics, who heads the team behind Torobo-kun. Torobo-kun, for instance, did not perform well in English, where it had to link phrases to come to logical conclusions. It received scores of 36.2 in listening and 50.5 in written exams. "As the robot scored about the same as last year, we were able to gauge the possibilities and limits of artificial intelligence," she said. Torobo-kun received scores of 45.1, 47.3 and 57.8 from 2013 to 2015, according to the Asahi Shimbun. This year's score was lower than last year's. However, the machine showed progress in some areas, such as physics and world history.


What makes an awesome business analyst?

Awesome Business Analysts must learn how to operate well in the fog of projects. There's always going to be ambiguity at the start of a project, but it's also the business analyst's job to assist with removing it. Often ambiguity is hidden in project assumptions. Start by capturing an exhaustive list of all of the assumptions you've heard, both explicit and implicit, then attack them aggressively by doing what you can to clarify, validate and remove them. Project managers will be particularly happy if these investigations help to provide more clarity on the scope of the project. From the beginning of a project, it's important to try to gain consensus around what success might look like. That way, when lost in the fog, there is a compass bearing that everyone knows, so that they can course-correct throughout the project and ensure that everyone is moving in the right direction.


Artificial Intelligence (AI) and Biotechnology: Striking a “Balance of Power”

Nations must look upon AI not merely as a novelty or an economic asset, but as a central component of national security. Nations depending on Facebook and Google to prop up their IT infrastructure, rather than brewing their own national alternatives, are akin to nations during the age of empires inviting British gunboats into their harbors. Artificial intelligence, biotechnology, and other forms of emerging technology must be viewed by each nation, state, community, and individual not as a mere novelty or potential industry, but as a potential means to grant those who develop and monopolize it an economic, political, and even military superiority that, history has taught us, they most certainly will abuse.


Is big data really the future of marketing?

Capturing all the data you need is great but what you do with it is far more important. Making your data work hard is not restricted to delivering campaigns. You now have the ability to extract that data to feed other elements of marketing; feed behavioural information to or from your mobile app, your email service provider or your corporate CRM in real time. By bringing these different elements together you will be one step closer to a single customer view, the Holy Grail for marketers. It would seem that the future for the marketer will continue to change and at a pace too. The challenge to make sense of the data you capture becomes increasingly difficult on relational databases as volumes and variety increase.


How machine learning can make humans better managers

Machine learning can help humans become better managers by removing any biases a manager might have. With machine learning, employee performance is backed up by raw, inarguable data that shows how employees are actually performing. By taking advantage of this rich repository of data, managers can better recognize which employees are achieving important goals. In turn, they can provide appropriate feedback without relying on their personal opinions. With its ability to eliminate bias and prompt a data-driven approach to feedback and recognition from managers, machine learning can completely transform the workplace by making coming to work an engaging experience for every employee — no matter their age, race or gender. Employees shouldn’t have to worry about the personal biases of their managers.


10 predictions for the Internet of Things and big data in 2017

“Test/dev and disaster recovery will be the main components of a company’s environment that will be moved to the cloud, with production continuing to remain on premises,” says Marc Clark, director of cloud strategy and deployment at Teradata. ... Deep learning is getting massive buzz recently. Unfortunately, many people are once again making the mistake of thinking that it is a magic, cure-all bullet for all things analytics, according to Bill Franks, chief analytics officer at Teradata. “The fact is that deep learning is amazingly powerful for some areas such as image recognition,” says Franks. “However, that doesn’t mean it can apply everywhere. While deep learning will be in place at a large number of companies in the coming year, the market will start to recognise where it really makes sense and where it does not.”


Visibility into the DevOps Value Chain

The culture of the DevOps community seven to ten years ago was very motivated toward open source. Open source tools are almost, by definition, point solutions. I think a lot of the automation solutions, even the commercial automation solutions, have been designed to solve a very specific or narrow problem. So there are tools that solve the deployment problem. There are tools that solve the configuration problem. There are tools that solve testing problems. And so on… There is no such thing as a standard DevOps tool chain. They’re like snowflakes. So developers gravitate toward their tool of choice and the DevOps culture encourages experimentation. Enterprises haven’t bought into the giant, does-everything kind of tool. Instead, enterprises are choosing very specific point solutions and then weaving them all together to generate efficiencies across the value stream.


Underpinning Enterprise Data Governance with Machine Intelligence

One of the more valuable benefits of strengthening enterprise data governance with machine intelligence capabilities is an expeditious efficiency that is otherwise difficult to match. Semantic technologies allow for machine-readable data, which can accelerate most processes involving those data, decreasing time spent on data modeling and other facets of data preparation. “The ability for data to be discoverable and linkable through an adoption of identifiers in a consistent way allows that data to move and to be reached more rapidly,” Hodgson said. “Whether you get the data into a machine learning environment is another matter. But at least you’re assured of its integrity, and that’s a big issue as well.”
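A minimal sketch of the point Hodgson makes about consistent identifiers (the URIs, fields, and datasets here are hypothetical): once two datasets adopt the same identifier scheme, records become discoverable and linkable without bespoke matching logic.

```python
# Two hypothetical datasets that share a consistent URI identifier.
customers = {
    "urn:example:customer:1001": {"name": "Acme Pty Ltd"},
}
invoices = [
    {"customer": "urn:example:customer:1001", "amount": 1200.0},
]

def link(customers, invoices):
    """Join the two datasets on the shared identifier."""
    return [
        {**inv, "name": customers[inv["customer"]]["name"]}
        for inv in invoices
        if inv["customer"] in customers
    ]
```

In a real semantic-technology stack the identifiers would be resolvable URIs in an RDF store rather than dictionary keys, but the linking principle is the same.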


How Business and IT Can Find Middle Ground for a Data Governance Framework

There’s great value in establishing a liaison and mediator between business and IT team leads. This helps business teams work with IT to maintain information protection, governance, and data quality while also working with business representatives to create value from data assets faster. The governance protocol then moves down the ladder to all aspects of the business where data is involved. Each business unit needs a representative to make sure that their team is up-to-speed on the process for inputting and drawing data and trained with the technology that enables them to do so. Data governance is not just about technology. It’s about key stakeholders and employees creating processes and best practices to properly organize, validate, and derive business value from their own information.



Quote for the day:


"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan


Daily Tech Digest - November 26, 2016

Jim Jagielski: Open source pendulum will swing back towards community

“What’s very interesting now is that years ago you had to explain what open source was,” says Jagielski. “That’s certainly not the case anymore.” “We never thought it would catch on as quickly or as deeply as it has,” he adds. “People who loved open source thought it had the potential to change the world – as much of a cliché as that is – but never thought it would be realised.” In fact, these days every corporate entity – and his wife – is keen to stress its involvement with open source. All enterprise software is heading in this direction, one way or another, but in a lot of ways the flavour has changed and Apache is definitely part of the less commercial old guard. It has always attracted very loyal fans.


Artificial intelligence and Big Data to manage your wealth: robo-advisers in evolution

There are several reasons for this evolution. On top of some of the concerns we already mentioned, robo-advisers are not as attractive as they were in the beginning, when the robo-adviser offering was relatively straightforward and easy to understand. With ever more players hitting the market, and various types of robo-advisers aimed at various target groups and functionalities, the picture has become more confusing. Add to that reigning fears, glitches and, last but not least, challenges regarding customer and user experience, plus too much focus on the costs (they are “cheaper” than human advisers) rather than on the overall benefit picture, and the challenges become clearer.


How Architects Can Survive and Thrive in the Digital Era

So as the Architect, you’ve always done your projects, you’ve always carefully facilitated the discussions and guided decisions when defining solutions, and now you find yourself in a rapidly changing world where business people are building solutions themselves. You find yourself increasingly useless and no longer relevant. On the other hand, if you want to pick up a role that articulates the value of these new technologies in the new business contexts that are emerging, you really have to change your job a lot to remain meaningful. The fundamental value of architecting has not changed, but the spectrum of choices, the moving parts, the building blocks have greatly increased, and it is against a background where everybody wants things very quick and very cheap.


Data Infrastructure: Leveraging Information

As indicated in the five levels of data infrastructure, data infrastructure is a necessary component for business growth in a similar way to how physical infrastructure is necessary for the growth of a community. When an organization only has a few employees, word of mouth can be a workable solution for managing very rudimentary forms of data, such as whether a particular customer has paid its bill. As the organization grows, it needs more definitive and easily accessible answers on more and more topics, which means it needs more efficient ways to gather, use, and disseminate data. This is what data infrastructure allows you to do. ... Data infrastructure can be roughly divided into four data infrastructure elements that will give the company access to the data it needs to solve customer and business problems.


Smart Cities and Linked Data

Over the last few years, LOD has slowly become an accepted way of exposing data to the internet. One could also say that LOD, together with the IoT, is one of the key requirements for smart cities. If governments were to open up their datasets, and in particular their sensor networks, over the internet using LOD, then this could ‘enable’ smart cities. The ‘things’, such as sensors, expose their data in a structured way, linked to other datasets. This in turn may lead to applications that are not yet foreseen, as the data is not yet available in this manner. For example, a traffic intensity sensor not only exposes the traffic intensity itself to the internet but also information about the road on which it is located. Using that road location, the information can be combined with other data such as road maintenance information, an air pollution sensor in the vicinity and/or meteorological information.
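The traffic-sensor example above can be sketched in a few lines (all URIs, field names, and values here are hypothetical): the sensor’s payload carries a URI for the road it sits on, so any dataset keyed by that same road URI can be combined with the reading.

```python
# A hypothetical linked-data payload from a traffic sensor. The "road" field
# is a URI shared with the other datasets below.
traffic_reading = {
    "@id": "urn:example:sensor:t17",
    "road": "urn:example:road:a4",
    "vehicles_per_hour": 640,
}
air_quality = {"urn:example:road:a4": {"no2_ug_m3": 38.5}}
maintenance = {"urn:example:road:a4": {"roadworks": True}}

def enrich(reading, *datasets):
    """Merge every dataset entry that shares the reading's road URI."""
    combined = dict(reading)
    for dataset in datasets:
        combined.update(dataset.get(reading["road"], {}))
    return combined
```

In a real LOD deployment these would be RDF triples queried over SPARQL endpoints rather than Python dictionaries; the sketch only shows why a shared identifier makes the combination trivial.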


Could your connected fridge be the newest cybercriminal recruit?

Cybercriminals harnessed the internet of things (IoT) to carry out the attack, highlighting the vulnerability of the billions of connected devices around the world. Recent attacks on Three Mobile, Tesco Bank, Yahoo, and TalkTalk prove that every company is a potential target, and should be a real wake-up call to global business leaders. Whether you are a small, local business, or a multinational brand – the question isn’t “will you be attacked?”, because you probably have been already. With vast networks of connected devices plugged into global digital infrastructure, and business value increasingly defined by intangible assets, this is a threat that knows no borders, time zones, or limits. In the insurance business, it is classed as one of the most complex and challenging man-made threats out there.


Approaching (Almost) Any Machine Learning Problem

An average data scientist deals with loads of data daily. Some say over 60-70% of their time is spent in data cleaning, munging and bringing data to a suitable format such that machine learning models can be applied to that data. This post focuses on the second part, i.e., applying machine learning models, including the preprocessing steps. The pipelines discussed in this post come as a result of over a hundred machine learning competitions that I’ve taken part in. It must be noted that the discussion here is very general but very useful; far more complicated methods also exist and are practised by professionals. ... Before applying the machine learning models, the data must be converted to a tabular form. This whole process is the most time consuming and difficult process and is depicted in the figure below.
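A small sketch of the tabular-conversion step described above, using hypothetical records (the column names and values are made up for illustration): categorical fields are one-hot encoded so the whole frame becomes numeric and ready for a model.

```python
import pandas as pd

# Hypothetical raw records mixing numeric and categorical fields, plus a
# target column ("churned") that the model should predict.
raw = pd.DataFrame({
    "age": [34, 28, 45],
    "city": ["Perth", "Sydney", "Perth"],
    "churned": [0, 1, 0],
})

# Separate the target, then one-hot encode the categorical column so every
# feature is numeric.
y = raw["churned"]
X = pd.get_dummies(raw.drop(columns="churned"), columns=["city"])
```

Real pipelines layer imputation, scaling, and feature selection on top of this, but getting to a purely numeric table is the prerequisite for nearly every model.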


Blockchain network disruption coming, and Australia among pioneers

The technology works to identify the ownership of energy as it is generated and then to manage multiple trading agreements between consumers who buy excess solar, ... Perth-based Power Ledger is testing various applications of blockchain across residential, retail and wholesale electricity markets in three different pilot projects across the country, and one in New Zealand ... “It’s a win for the people who have been able to afford to invest in roof-top solar, but also a win for customers who haven’t: they will be able to access clean, renewable energy at effectively a ‘wholesale’ rate. Everyone wins.” Well, perhaps not everyone. As the BNEF report notes, while most blockchain software and business models are currently at a proof-of-concept or trial stage of development, their potential to “rapidly disrupt traditional energy market structures” cannot be ignored.


Why digital transformation is forcing IT to evolve

IT organizations turn to enterprise vendors to help with the speed of innovation needed to support the business initiatives. In my discussions, IT organizations are finding a similar problem as the FANG group. Enterprise IT vendors are not positioned to innovate at the pace or the granularity of specific industry or organizational requirements. Out of necessity, IT organizations are beginning to embrace open source solutions to meet the challenge. Open source projects tend to move at the pace required by the project contributors. The contributors are starting to look much different than many traditional open source projects. Many forward-thinking end-user enterprises are dedicating developer resources to contribute to these projects directly, or take the base solution of a project and add capability.


Java Microservices: The Cake Is a Lie But You Can't Ignore It

“The microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and independently deployable by fully automated deployment machinery”. This post is not about the pros and cons of microservices and assumes that you’re simply interested in the underlying technology to support it. Another post worth checking out for those issues covers the main challenges in debugging microservices — which is a common pitfall that many don’t think of when considering microservice architectures.
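As a toy illustration of the definition quoted above, here is a hypothetical "orders" service exposing a lightweight HTTP resource API (in a real deployment each such service runs in its own process and is deployed independently; the endpoint and payload are invented for the example).

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class OrderHandler(BaseHTTPRequestHandler):
    """One tiny service built around a single business capability: orders."""

    def do_GET(self):
        if self.path == "/orders/42":
            body = json.dumps({"id": 42, "status": "shipped"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

def serve(port):
    """Start the service on a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), OrderHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Other services would consume this over plain HTTP, which is exactly the "lightweight mechanism" the definition refers to; no shared database or in-process call is needed.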



Quote for the day:


"Don't envy what people have, emulate what they did to have it." -- Tim Fargo


Daily Tech Digest - November 25, 2016

Corporate governance is more than a good infosec policy

When it comes to governance, many CISOs begin with policy and procedures as their first action items. Undoubtedly, corporate security policies are “must have” pieces of any organizational security posture, and after all, most C-level executives pride themselves on their ability to communicate and convey important topics to an audience. As such, they feel policy is a great fit for them to introduce and remind employees about the importance of security. The policies might contain, for instance, rules about data storage / sharing, password complexity, access control permissions (who may access data, where they may access it from, and who manages the controls and data storage beyond them), and other items whose consistent application or adherence is required to ensure that the IT infrastructure is kept secure.


7 open source security predictions for 2017

The flip side of the open source coin is that if you’re using open source, the chances are good that you’re also including vulnerabilities known to the world at large. Since 2014, the National Vulnerability Database (NVD) has reported over 8,000 new vulnerabilities in open source software. Vulnerabilities in open source are particularly attractive to attackers. The ubiquity of the affected components, the public disclosure of vulnerabilities (often with sample exploits) and access to the source code make the attacker’s job simpler. In addition, without a traditional support model, users are typically unaware of new updates and vulnerabilities in the open source they’re using. Putting on my prognosticator’s hat, here are some events around open source and open source security that I wouldn’t be surprised to see in the coming year.


The Open Group - A CIO-Level View of IT4IT

Dan Warfield, Principal, CC&C Europe – A CIO-Level View of IT4IT™. Dan is an entrepreneur, strategist, innovator and enterprise architect, whose recent experience includes creating the IT4IT-based reference architecture / operating model for a Fortune 50 company. In more than 30 years of IT leadership experience, he has been a solution executive, innovation leader and product manager for five global IT software / services companies including IBM and CSC, and worked as an independent strategy adviser.


Women in Information Security: Jess Dodson

For women who are currently in the field or wanting to be in the field, it’s about flexible work arrangements, paid maternity and carer leave, and management that’s understanding. Because while it sounds so very “old school,” women are still the primary carers in families. Also, they’re the ones who have to carry around a baby for ten months and need time to recuperate after all that! But I think it starts much earlier than that. I think we need to get into schools. We need to teach young girls that computers and math and science aren’t just for boys. They’re for girls. They’re fun and cool, and if that’s what they like to do, then they should do it. I’m trying to find a way to become a mentor or a spokesperson locally for young girls to show them that you can be a girl and be good at computers.


What is Intercloud?

Intercloud, as the name suggests, is a network of clouds that are connected with each other in some form. This includes private, public, and hybrid clouds that come together to provide a seamless exchange of data, infrastructure, and computing capabilities. In many ways, it is similar to the Internet – the network of networks that powers the world today. This concept of Intercloud was started as a research project in 2008 at Cisco, and it was later taken over by the Institute of Electrical and Electronics Engineers (IEEE). It is based on the idea that no single cloud can provide all the infrastructure and computing capability needed for the entire world. Also, if a cloud does not have a presence in a particular geographic region, but gets a request for storage or computation, it should still be in a position to fulfill it.


Navigating legacy: Charting the course to business value

The CIO is well positioned to influence and support the whole digital iceberg and to help create the right strategy, platforms, and services to realize a holistic digital enterprise rather than a collection of disjointed departmental investments. If we are correct in our hypothesis that many business priorities are related to the digital agenda, then CIOs can be more responsive to bridging current gaps.  ... Globally, CIOs as a group are surprisingly similar in many of their personality traits and working styles (figure 2). Some of the top seven traits among CIOs may seem counterintuitive if one views the CIO simply as a technology steward. But, above and beyond their role as IT leader, CIOs are business leaders, and all seven traits are important in helping them succeed in their business leadership role.


The Laws of Cyber Threat: Diamond Model Axioms

Many confuse the purpose of the Diamond Model. Most believe the Diamond Model exists for analysts, but that is an ancillary benefit. Instead, think of the Diamond Model like a model airplane used to study the principles of aerodynamics. It is not an exact copy but rather a good approximation of the full-scale airplane being studied. The model exposes elements to test and study in a controlled environment improving the performance of the plane in an operational environment. The Diamond Model does the same, except for cyber threat analysis. When describing the Diamond Model to others, I usually start with, “we didn’t create the Diamond Model, we simply expressed some fundamental elements which always existed.” Surprisingly, I learned while writing the Diamond Model how exposing this fundamental nature improved cyber threat intelligence.


The new cybersecurity war takes shape

The stakes could not be higher. With financial data, medical records, intellectual property, and even military information in constant motion around the globe, our entire way of life depends on the security of our data. The expanding internet of things opens a new realm of vulnerable systems, and raises for the first time the prospect that hackers and spies can inflict immediate physical damage on their targets. The news gives us little cause for optimism. Recent data breaches demonstrate the ability of hackers to steal information on hundreds of millions of people at once (Yahoo) and to compromise data with implications for national security (US Office of Personnel Management). Anyone with the right technical skills and an agenda—activist hackers, corporations, nation states, terrorist cells—has the potential to wreak havoc on a worldwide scale.


Data manipulation heralds a new era of hacking

The repercussions of this sort of hack can be devastating for a business. If future planning, investments or purchases are made based on incorrect information, then not only could those decisions be wrong for the business, but there may be legal and financial consequences if it appeared that fraudulent behaviour had taken place. An example of this would be if the data that farmers use to determine soil pH levels, and therefore which crops to plant, were to be manipulated. Investors and businesses spend considerable amounts of money supporting the forecasted crop yields and, should that be based on altered data, then it could be financially crippling for the farmer and local businesses – while hackers could use this to purchase stocks and make a profit.


Open source talent in Europe – in great demand, and hard to find

Europeans, it turns out, are even more confident than their global counterparts in the open source job market. Of over one thousand European respondents, 60 percent said they believed it would be fairly or very easy to find a new position this year – as opposed to only 50 percent saying it would be easy globally. In fact, half of the Europeans reported receiving more than 10 calls from recruiters in the six months prior to the survey, while only 22 percent of respondents worldwide reported this level of engagement. While worldwide, 27 percent of respondents received no calls at all from recruiters, only five percent of Europeans said the same. Companies and organisations know that they need to establish, build and sustain open source projects; they also know that for such projects to be successful, they must possess a level of sophistication that solicits support from developers.



Quote for the day:


"Positive thinking will let you do everything better than negative thinking will." -- Zig Ziglar


Daily Tech Digest - November 24, 2016

Is the speedy Surface Book i7 too expensive for IT?

This device is a workhorse wrapped up in an attractive design and it will undoubtedly serve anyone well in the enterprise. ... It's likely that this device will be cost prohibitive for businesses and reserved for those who run heavy software day in and day out, or work with video and photo editing software on a regular basis. It's a great alternative to the Apple devices that companies often turn to for creative workers, especially if IT wants to keep everyone in the same ecosystem. Even though the Surface Book i7 might not find massive popularity in the average workplace, Microsoft hopes it will find an eager audience in industries like engineering and design. The Performance Base was specifically built with these types of workers in mind and it can handle professional design and editing software, including 3D CAD software, which is typically used in engineering disciplines and architecture.


Behavioral threat assessment means real-time threat detection

BTA tools create a behavioral threat assessment by plugging into security information and event management tools, intrusion detection systems and intrusion prevention systems and others -- like firewalls -- and importing their log information. They then perform correlation analysis on that information to determine what behavior is normal for users, devices and systems. The next step for developing a behavioral threat assessment is additional analysis to determine whether anomalous behavior is just that -- anomalous, but harmless -- or represents a true threat. BTA products do all this by applying machine learning to the data streams so that security analysts don't need to program in rules about what comprises normal behavior.
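The learn-a-baseline-then-flag-deviations idea described above can be sketched in a few lines. This is an illustrative toy, not any vendor's algorithm: real BTA products apply machine learning across many correlated log streams, whereas this sketch uses a simple per-user z-score over hypothetical event counts.

```python
import statistics

def flag_anomalies(event_counts, threshold=3.0):
    """Return users whose latest event count deviates from their baseline.

    event_counts maps each user to a list of daily event counts; the last
    entry is the most recent day, and the rest form the learned baseline.
    """
    flagged = []
    for user, counts in event_counts.items():
        baseline, today = counts[:-1], counts[-1]
        mean = statistics.mean(baseline)
        spread = statistics.pstdev(baseline) or 1.0  # avoid dividing by zero
        if abs(today - mean) / spread > threshold:
            flagged.append(user)
    return flagged
```

The second analysis step the article mentions (deciding whether an anomaly is harmless or a true threat) is where the commercial products differ most; a z-score only gets you to the shortlist.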


Why CXOs should understand the assumptions behind predictive analytics

Proponents of predictive algorithms also argue that algorithms, ultimately a series of mathematical functions, are inherently unbiased. The designers of these algorithms may have included assumptions and shortcuts to model complex environments, or over- or under-represented some variables, but these can ultimately be tweaked and improved with relative ease. Like any system, a predictive algorithm is only as good as its model and the data that are available, once again validating the old computing axiom of GIGO. Proponents ultimately argue that any "bias" inherent in an algorithm is the fault of the creators, not the math itself. In the case of Chicago's gun violence predictive toolkit, proponents also argue that any flaws in the system are ultimately outweighed by the benefit of saving lives.


How security collaboration will prove vital in 2017

What’s needed is a platform through which the cybersecurity community can create and share vendor-neutral security orchestration models (defense strategies) which can then be internally rated by community members and updated as needed, rendering them ready for adaptation by organizations – no matter which security products they use. If an organization is lacking a security function that the model requires, the organization can be alerted and the gap filled. Orchestration models can also be created for specific verticals and tailored to the needs of specific organization types such as banks, retail, healthcare, or critical infrastructure, for example, or developed to specifically combat known hacker groups and their attack patterns, or both.


Desperately seeking cybersecurity help

Security intelligence requires the real-time collection and analysis of massive amounts of information, and it’s easy to miss clues. To take one example, a study by the Ponemon Institute found that it took organizations an average of 256 days to detect advanced persistent threats already residing in their systems. ... Advances in the related domains of artificial intelligence, data mining, machine learning and cognitive computing are feeding new optimism about the battle against cybercrime. Earlier this spring, computer scientists demonstrated how adaptive cybersecurity technologies can filter through millions of log lines each day to flag only the suspicious items. Over the course of a recent three-month-long test, an MIT system logged data from an unnamed e-commerce platform and successfully detected 85 percent of the threats without even needing human assistance.


10 most difficult IT jobs for employers to fill

The breakneck pace of technological innovation in an era of digital transformation has made it difficult for companies to find and land talent with the right mix of cutting-edge skills and experience. ... "In IT, most mid- to senior-level folks currently in the market have advanced to where they are because of their technical skills, not based on their management and soft skills. What that means is that certain roles are incredibly hard to fill, as they need both the technical savvy, as well as domain- and industry-specific expertise and leadership skills. Whenever you're asking a candidate to wear two different hats -- in this case, technical and management -- you're inherently making these roles harder to fill as the pool of qualified candidates becomes smaller," Sigelman says.


How to get more from Windows Defender by using its command-line tool

Since Windows Defender has a Windows UI and performs most of its operations in the background, you may be wondering why anyone would want to use it from the command line. Well, the truth is that the command-line version is useful in situations where you want to be able to automate and customize Windows Defender's standard operations. Furthermore, as I mentioned, there are some advanced operations you can only perform from the command-line version. You may not use some of those operations often, but it's nice to know that they are available. To find the command-line version of Windows Defender, just open File Explorer and navigate to C:\Program Files\Windows Defender. When you get there, look for a file by the name of MpCmdRun.exe.
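A few of the commonly documented MpCmdRun.exe switches illustrate the kind of automation the tool enables. The available options vary by Windows version, so check `MpCmdRun.exe -?` on your build before scripting against them; the folder path below is only an example.

```
:: Check for and install the latest definition updates
MpCmdRun.exe -SignatureUpdate

:: Run a quick scan (use -ScanType 2 for a full system scan)
MpCmdRun.exe -Scan -ScanType 1

:: Scan a specific file or folder (example path)
MpCmdRun.exe -Scan -ScanType 3 -File C:\Users\Public\Downloads

:: List items that Windows Defender has quarantined
MpCmdRun.exe -Restore -ListAll
```

Because these are plain commands, they can be dropped into Task Scheduler jobs or batch scripts, which is the automation use case the article describes.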


Internet trust at all time low: 5 ways of regaining it

“Everyone knows that data security is a major issue for both consumers and businesses, yet companies are not doing everything they could to prevent breaches.” “According to the Online Trust Alliance, 93% of breaches are preventable. And steps to mitigate the cost of breaches that do occur are not taken – attackers cannot steal data that is not stored, and cannot use data that is encrypted.” “This status-quo isn’t good enough anymore. As more and more of our lives migrate online, the cost and risk of a data breach is greatly increased, and will lead to lost revenues and a lack of trust.” With a reported 1,673 breaches and 707 million exposed records occurring in 2015, organisations must change their stance.


The Uncertain Future of Fintech

Traditional financial institutions continue to face challenges, with less than half (44.0%) of executives at legacy financial firms confident in their fintech strategy. This is not surprising given only about one-third (34.7%) affirmed they have a well-structured or proactive innovation strategy in place that is embedded culturally. The risk-averse nature of traditional firms also makes it difficult for them to create cultures that prioritize innovation, and 40.3% of executives said that theirs is not conducive to innovation. “Financial services senior executives are seeing fintech firms in a whole new light as they see greater opportunities to collaborate, but are also making significant headways in building more agile, in-house fintech capabilities.” said Thierry Delaporte, Head of Capgemini’s Global Financial Services Business Unit.


Machine Learning: The More Intelligent Artificial Intelligence | Part 2

So, how smart is machine learning compared to AI? Baikalov insists that it is a lot smarter because science-fiction style AI, or the capability of a machine to imitate intelligent human behavior, doesn't exist. “Machine Learning is a subset of AI, along with knowledge, perception, reasoning, planning and other good stuff,” says Baikalov. “And there's a lot to learn, and as the machine learns something, we say ‘Well, if the machine can do it, it doesn't require intelligence, and therefore it's not AI.’” “The core problem with AI is that it's defined relative to human intelligence, which in turn is not well defined,” explains Baikalov. “AI is created by humans, and if the humans don't understand what the intelligence is, how can they program the machine to imitate it? And does AI even need to imitate every aspect of human intelligence?”


Microsoft and Linux: Growing ties could benefit CIOs

Microsoft's ongoing cultural shift from Windows-everywhere monolith to a more open company suits Ted Ross, general manager and CIO for the city of Los Angeles. He said Microsoft has been seen as hostile to open source, but noted that the company is changing its proprietary ways. The open source direction "reflects Microsoft expanding to an understanding of what the new economy looks like," Ross said. That new economy, he said, is "very API-driven" and characterized by a cross-platform approach in which CIOs select among numerous technology tools for the best option for a given IT workload. He said he can't rely on a single vendor, or a single operating system, to deliver the optimum digital services for his customers. He said the city's 41 departments use a number of operating systems, with Windows and Linux at the top of the list.



Quote for the day:


"Those not chasing their dreams should stay out of the way of those who are." -- Tim Fargo


Daily Tech Digest - November 23, 2016

What Are The Differences Between Python, R, and Julia?

R, Python, and the relative newcomer Julia are currently three of the most popular programming languages chosen for Big Data projects in industry today. Not only were they originally designed with statistical purposes in mind, but a broad developer ecosystem has also evolved around them. This means there are extensions, libraries, and tools out there for performing just about any analytics functions you might need. They have a lot in common, but there are important differences that have to be considered when deciding which one will get the job done for you. Here’s a brief introduction to each of them, as well as some ideas about applications where one may be more suitable than the others.


Millennials are twice as bored at work as baby boomers, report says

"Millennials report higher levels of boredom at work because they are the most disengaged generation in the workforce globally," said Dan Schawbel, research director at Future Workplace and author of Promote Yourself. "They require constant feedback, training, mentoring and new career opportunities. If they aren't challenged at work, they immediately start looking at new jobs and will continue to job hop until their needs are satisfied." ... "Millennials are the largest generation in the US workforce now," said Jason Dorsey, cofounder and researcher at the Center for Generational Kinetics. "Engagement is not just about more money or the latest tech or a new yoga room—it's about understanding what your employees want, and being able to give it to them in a feasible way that makes them feel valued."


5 ways physical security breaches can threaten your network

If someone has access to this room without authorisation, your network is extremely vulnerable. When there are layered security measures offering your server room further protection inside your business, it can be easier to see if the area is accessed. Without locks on the server room doors or surveillance footage, however, it will be difficult to know if the hardware was sabotaged. With physical access to the server room, criminals can do an immense amount of damage to the network. Remote access can be set up so that the criminals will have access to the servers and their information at any time; backdoors can be left for all types of remote viewing and even control; information can simply be loaded onto a third-party device.


The Fast-Moving CIO: A Race For Transformation

The rationale for bimodal IT is that change is hard. Well, so is losing. Adapting to a rapidly changing market is difficult and complex but necessary to compete and win when customer expectations are rising. More importantly, customer tolerance for sub-par experiences is declining. Customers expect experiences that enable them to easily transact when and how they want. Here are a handful of examples that are sure to frustrate your customers: Delays in product availability due to the inability of supply chain systems to keep up with changing SKUs; issues with new orders because customer information in the system of record did not appropriately sync or update; and the lack of integration across legacy systems causing customers to re-enter data.


“Would you like us to email you a receipt?”

On the surface it’s a simple question increasingly being asked by high street retailers. But sometimes this simple question doesn’t tell the full story. An e-receipt can be more convenient at times, but it is also a way for shops to collect personal data about their customers and send them marketing. In the run-up to the busy Christmas season, the ICO is reminding retailers that people have the right to know what happens to their personal data. Retailers need to be aware of their obligations under data protection and privacy laws. Here are the key questions you need to be asking before you start to collect information.


Facebook Said to Create Censorship Tool to Get Back Into China

Facebook does not intend to suppress the posts itself. Instead, it would offer the software to enable a third party — in this case, most likely a partner Chinese company — to monitor popular stories and topics that bubble up as users share them across the social network, the people said. Facebook’s partner would then have full control to decide whether those posts should show up in users’ feeds. ... A Facebook spokeswoman said in a statement, “We have long said that we are interested in China, and are spending time understanding and learning more about the country.” She added that the company had made no decisions on its approach into China. Facebook’s tricky position underscores the difficulties that many American internet companies have had gaining access to China.


Cerber Ransomware Expands Database Encryption Attacks

Overall the expectation from Trend Micro is that Cerber ransomware will continue to evolve as the attackers adjust their delivery methods, infection vectors and ransom demands. ... There are a number of things that organizations and end-users can do to help mitigate the risk of being the victim of a Cerber ransomware attack. Clay commented that, as with most ransomware attacks, part of the problem appears to be an over-reliance on endpoint security to detect malware. "If endpoint security is utilized as a primary defense, then a cross-generational approach that includes both traditional and newer technologies like high-fidelity machine learning can improve detection of ransomware," Clay said.


Cognitive Hack: The New Battleground In Cybersecurity

The question of weighing the risks versus the rewards is an appropriate one. Consider this: The federal government has standards for regulating the food we eat, the drugs we take, the cars we drive and a host of other consumer goods and services, but the single most important tool the world increasingly depends on has no gatekeeper to ensure that the products and services connected to the Internet don’t endanger national security or pose a risk to its users. At a minimum, manufacturers of IoT must put measures in place to detect these threats, disable IoT devices once an attack starts and communicate the risks of IoT more transparently. Lastly, the legal community has also not kept pace with the development of IoT; however, this is an area that will be ripe for class action lawsuits in the near future.


5 Technologies Your Business Should Adopt Right Now

Business technology is always changing, and if you don’t do your best to stay ahead of the curve you could find yourself playing a very dangerous game of catch-up. Automating business processes and incorporating new methods of payment and customer service are integral to staying competitive as a retailer and employer. Clients want to purchase from businesses that can respond to their needs quickly, and employees want to work in an environment that is efficient and secure. To continue to attract talented workers and retain tech-savvy clients, your business should incorporate these 5 business technologies as soon as possible.


Docker Alternatives, Orchestration, and Implications for Microservices

Plenty of container alternatives and corresponding cloud services are available on the market that orchestrate microservices (and therefore the underlying containers). Container technologies and orchestration engines are usually used closely together; often, they are built into the same tooling. Cloud offerings where “users pay only for the resources - such as compute instances, load balancing and scheduling capabilities - that they use” are called CaaS (Container as a Service). The following list compares the feature sets of the main container platforms, each with its own pros and cons. Also note that the following is – of course – not a complete list of container and orchestration offerings (but hopefully shows most of the currently relevant options):
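To make the pairing of container runtime and orchestration engine concrete, here is a minimal command-line sketch. It assumes Docker and a Kubernetes cluster (with `kubectl` configured) are available; the image and names are illustrative, not from the article.

```shell
# Container layer: run one nginx container on a single host with plain Docker.
docker run -d --name web -p 8080:80 nginx

# Orchestration layer: hand the same image to Kubernetes, which takes over
# scheduling, restarts and scaling across the cluster.
kubectl create deployment web --image=nginx
kubectl scale deployment web --replicas=3
```

In a CaaS offering, the second layer is what you pay for: the provider runs the scheduler and load balancing, and you supply only the container images and desired state.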



Quote for the day:


"Integrity is built when you do what you promise to do." -- S. Chris Edmonds

Daily Tech Digest - November 22, 2016

Cyber Security Recommendations from the C-Suite

Security executives have a lot on their plate. They’re grappling with a new breed of cyber-attacks, financially-motivated cyber assailants, and a bevy of new, connected devices that bring unintended security risks to their organization. But it’s not all doom and gloom. C-level executives are relying on new technologies and best practices to fight fire with fire. They’re turning to former enemies for help, getting more bang for the buck, and relying on automation to safeguard their organization’s most critical information assets. To garner the best practices of security leaders, Radware conducted a survey of more than 200 C-level security executives from the U.S. and United Kingdom. The Security and the C-Suite: Threats and Opportunities Report unearthed a series of top recommendations that organizations should heed carefully.


Startups and enterprises can leverage Big Data Analytics to optimise workforce

There are several tools in the market that can take the hassle off HR departments. Tools like AppDynamics and Workforce Analytics reduce the burden in several ways. They not only assess and predict whether a potential candidate would accept a job offer or whether the prospect is only in exploration mode; they also track other significant feeds such as social media. For example, culling information on the frequency of a potential candidate’s visits to LinkedIn, how often their LinkedIn page is updated, whether the candidate is exploring other options, and whether they are asking for recommendations from other LinkedIn users. The tools also provide information on aspects like the cultural fit of a candidate for the organisation, their personality with respect to organisational values, etc.


The biggest threat to banks? Legacy systems, not fintech

While new financial technology (fintech) is permanently changing how financial institutions operate, it could very well be that the biggest threat to Canadian financial institutions is not fintech challengers, but the legacy systems that prevent them from adapting. In fact, this could be the next major problem that the Big Five banks have to tackle. Dave McKay, the CEO of Royal Bank of Canada, has publicly stated that the biggest threat to financial services is not from without, but from within: "Regulation is not the problem. The biggest barrier to adapting is the incredible legacy systems." The legacy problem becomes more confounding when you consider that banks have some of the smartest leaders, and some of the biggest budgets of any type of business in Canada. So what gives?


Are Humans Already Obsolete?

With intelligent services, the system knows how to react to this chain of events, and can even “recommend” other actions to take before the leave starts. HR is notified when the recruiting manager submits the leave request. With intelligent services, the SuccessFactors system will automatically reschedule the learning course after the recruiting manager returns, suggest that the recruiting manager update their appraisals and goals, and reroute any pending job candidates to other members of the team. This is not just a win for the manager, her team and candidates; it’s a chance for HR to get out of the administrative and spend more time focused on the strategic.


Leveraging the power of nature to enhance Internet security

Quantum effects are being leveraged to generate random numbers at high rates and in ways that make guessing keys impossible, removing an important attack avenue for cyber criminals. Until this quantum effect was used, every other accepted method was not truly random, or was too slow to deliver the security really needed. This vulnerability has been the subject of years of research and community collaboration, including production of standards overseen by the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST). Since 1997 NIST has coordinated community-wide participation in a Random Number Generation Technical Working Group to help improve the ability of encryption solutions to leverage increasingly hard-to-break keys.
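The point about randomness quality can be illustrated in a few lines of Python. This is a sketch, not the quantum technique the article describes: it contrasts a deterministic PRNG (whose output an attacker can reproduce from the seed) with `secrets`, which draws from the operating system's entropy pool.

```python
import random
import secrets

# Python's `random` is a Mersenne Twister: seeded with a known value,
# its output is fully reproducible, so keys derived from it can be guessed.
random.seed(42)
predictable_key = random.getrandbits(256).to_bytes(32, "big")

random.seed(42)
same_key_again = random.getrandbits(256).to_bytes(32, "big")
assert predictable_key == same_key_again  # deterministic: knowing the seed
                                          # means knowing the key

# `secrets` uses the OS cryptographic random source (os.urandom), which can
# mix in physical noise; its output is not reproducible from any seed.
secure_key = secrets.token_bytes(32)  # 256-bit key suitable for crypto use
print(len(secure_key))  # 32
```

Quantum random number generators push this further by making the entropy source itself physically unpredictable rather than merely hard to observe.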


For Healthcare Organizations, Futureproofing Is Possible — and Necessary

Hospitals and other large facilities, many of which shift computer equipment between rooms on a routine basis, could see particular benefits from this setup. Giving IT staff direct control of organization-owned machines without requiring physical access means making changes at the snap of a finger, relatively speaking; this could be a boon when updating against the latest malware or virus, or changing settings to reflect new rules and regulations. Best of all, desktop virtualization is the perfect complement for an organization concerned with futureproofing. A growing facility could provision a new fleet of laptops in hours instead of days, easily push a new EHR system (with department-specific configurations) to remote and local devices across all its locations, and make compliance-related changes in the blink of an eye.


Partnering to shape the future–IT’s new imperative

IT organizations are under increasing pressure to deliver better performance—as the partners already do—partly because of the growing availability and capabilities of third-party services such as cloud computing, infrastructure as a service, and software as a service. About one-third of business executives see third-party providers as a significant or complete substitute for the IT function’s services. Another source of pressure is the expansion of digital programs. Nearly all respondents (91 percent) say their companies are already pursuing a digital agenda, suggesting that the partnership between business and IT will become only more important over time—especially with so many organizations in the early days of their digital efforts.


Cybersecurity must be open, replaceable

Assume that, if the price is right, your system will be hacked. Take a lesson from the Great Wall of China – eventually the invading hordes will get through. The only solution is to design the system so that the security can be replaced once it is hacked. For web-based systems, this is fairly easy, since the security algorithms exist in software on a central web server that you can easily update. For pay TV systems, security algorithms are encoded in hardware and software on a smart card that is inserted into the TV set top box. When the system is hacked, the broadcaster can simply replace the smart card, which sends the hackers back to square one, trying to break a brand new combination of security hardware/software.
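The "replaceable security" design the article describes can be sketched as code: the rest of the system depends only on an interface, so a compromised algorithm can be swapped out without touching anything else. Everything here is hypothetical (the class names, the deliberately insecure XOR stand-in) and only illustrates the architecture, the software analogue of mailing out a new smart card.

```python
from typing import Protocol

class Cipher(Protocol):
    """Minimal interface every replaceable security module must satisfy."""
    def encrypt(self, plaintext: bytes, key: bytes) -> bytes: ...
    def decrypt(self, ciphertext: bytes, key: bytes) -> bytes: ...

class XorCipher:
    """Stand-in scheme (NOT secure) -- pretend this is the deployed algorithm."""
    def encrypt(self, plaintext: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))
    def decrypt(self, ciphertext: bytes, key: bytes) -> bytes:
        return self.encrypt(ciphertext, key)  # XOR is its own inverse

class SecurityModule:
    """The system talks only to the Cipher interface, so once the deployed
    algorithm is hacked, replace_cipher() sends attackers back to square one."""
    def __init__(self, cipher: Cipher) -> None:
        self._cipher = cipher
    def replace_cipher(self, new_cipher: Cipher) -> None:
        self._cipher = new_cipher  # deploy the replacement scheme
    def protect(self, data: bytes, key: bytes) -> bytes:
        return self._cipher.encrypt(data, key)

module = SecurityModule(XorCipher())
sealed = module.protect(b"account data", b"k3y")
```

The pay-TV smart card works the same way: the set-top box is the `SecurityModule`, and the card is the swappable `Cipher`.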


Who Has The Most Impact In Driving Security?

"Most business directors would never dream of ignoring risk when it comes to funds, but there is a disconnect there in terms of data," Drystek continued. That's why the communication needs to happen directly with the risk owner. Those enterprises that understand that risk is directly connected to business are the ones that are paving the way with sophisticated security programs. ... Those layers of both formal and informal communication most often enable security teams to get information into the right hands. "What I use as a prod is data quality, both integrity and availability. Security risk is business risk. Compliance is a weak form of security where it becomes an insurance issue," Drystek said.


Half of surveyed U.S. businesses admitted to suffering a ransomware attack

Getting hit with ransomware would be bad enough, but imagine paying the ransom and then having the attacker come back and demand a second ransom? It happens; more and more people pay, but a cybercriminal’s promise to decrypt upon receiving the first ransom is hardly a sterling guarantee that the victim’s files will be decrypted. Grossman believes that unlockers – the decryption keys to unlock ransomware-encrypted files which are released to the public by security experts – may not be something people can hope for in the future. Right now, some crooks reuse the same key for all their ransomware infections; once a security researcher gets hold of the key, they release it to the public, since it can decrypt files for other victims of the same ransomware.



Quote for the day:


"Sometimes the questions are complicated and the answers are simple." -- Dr. Seuss