January 09, 2016

Antivirus software could make your company more vulnerable

Antivirus vendors don't seem too concerned about the potential for widespread attacks against their consumer products. For the most part, researchers agree that such attacks are unlikely for now because typical cybercriminal gangs have other, more popular targets to attack, such as Flash Player, Java, Silverlight, Internet Explorer or Microsoft Office. However, the creators of those widely used applications have increasingly added exploit mitigations to them in recent years, and as more people update to newer and better-protected versions, attackers might be forced to find new targets. Therefore, future attacks against antivirus products used by tens of millions or hundreds of millions of consumers can't be ruled out, especially if cybercriminals get their hands on previously unknown -- zero-day -- vulnerabilities, as they have done from time to time.


XL Catlin Analytics Strategy: Quality Over Quantity

Getting internally and externally sourced data ready for modeling is the bulk of the work, she explains. “Coming up with a cohesive data set probably takes four times longer than creating the model,” she says. “I would say we spend 45 percent of our time on the data, 10 percent on the model, and 45 percent on change management.” The data is housed in SAS and SQL databases, linked through ODBC connections. The team writes code in the SAS programming language to run the data analysis, relying on a code library. SQL and R are also used, to a lesser extent: SQL to extract data from source systems, and R, a language and environment for statistical computing and graphics, for exploratory data analysis.
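As a language-neutral sketch of the extract-then-explore workflow described above (the team actually uses SAS and R over ODBC; the table, column names, and figures below are hypothetical, standing in for a real source system), in Python with its built-in SQLite driver:

```python
import sqlite3
import statistics

# Hypothetical source table standing in for an operational system;
# the team pulls comparable extracts from SAS/SQL stores via ODBC.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (region TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("east", 1200.0), ("east", 800.0), ("west", 500.0)])

# Step 1: SQL extracts the modeling data set from the source system.
rows = conn.execute("SELECT amount FROM claims WHERE region = 'east'").fetchall()
amounts = [amount for (amount,) in rows]

# Step 2: exploratory analysis on the extract (the team would use R or SAS;
# simple summary statistics illustrate the step).
summary = {"n": len(amounts),
           "mean": statistics.mean(amounts),
           "stdev": statistics.stdev(amounts)}
print(summary)
```

The point of the sketch is the division of labor the article describes: most of the effort goes into assembling a cohesive extract before any modeling happens.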


The transition from cloud back to a data center migration

Groupon began its move out of the cloud in 2011, three years after the online deal website was launched. "The biggest driver was cost," Chatha said. "It was not economically feasible for us to stay in the cloud." A motivating factor for some companies to move to the cloud is the ability to pay for it as an operating expense, rather than a capital expense. Today, base rent and utilities payments for colocation space can be treated as Opex, and hardware can be financed and paid as Opex, too, he said. Groupon's needs are more diverse than Netflix, for example, which has generated headlines about its complete move to Amazon Web Services (AWS). Netflix has its greatest need for storage, Chatha said, while Groupon needs everything, from virtual machine hosts to databases.


One-on-One Coaching Misses the Mark

Traditional coaching works with an executive one-on-one and helps him find new approaches. Believing this approach too limited, we facilitated a meeting with the executive and his team to share the feedback we gathered. This eliminated secrecy and impressed the team. The executive had made himself vulnerable, and the team began thinking about how they could help him. Then we moved the conversation away from the executive to how the team could improve. It began discussing how better to define its collective goals, redesign meetings to make them more productive, and address issues before they became problems. The challenges the team identified, and the solutions proffered to improve performance, never would have emerged in private, one-on-one coaching sessions.


DDoS attack on BBC may have been biggest in history

A group calling itself New World Hacking said that the attack reached 602Gbps. If accurate, that would put it at almost twice the size of the previous record of 334Gbps, recorded by Arbor Networks last year. "Some of this information still needs to be confirmed," said Paul Nicholson, director of product marketing at A10 Networks, a security vendor that helps protect companies against DDoS attacks. "If it's proven, it would be the largest attack on record. But it depends on whether it's actually confirmed, because it's still a relatively recent attack."


Banks, don’t wait for your competition to become data driven

First, the upside of leveraging the potential of data science and analytics and developing data-driven business models is not only a measure to increase internal process efficiency but especially a way to attract customers and maintain a sustainable business. Second, the risk of a “sit tight and wait” strategy is truly suicidal. Establishing a data-driven business culture cannot be done overnight and needs time for people training and development, leaving aside the effort and time needed to choose and set up the systems and infrastructure. Recall how Google disrupted the search industry. Yahoo, Lycos and all those almost forgotten dinosaurs could never catch up or come even close to Google’s success after they had been disrupted.


The Dying Technologies Of 2016

Thinking of antique technologies, vinyl has made a comeback but CDs, DVDs, and Blu-Ray? They’re all marching to the media graveyard. Today, we stream everything we can. I still buy and own CD and DVD players, but I’m an old guy. Also, call me a Luddite, but I like having my music, videos and books in my hand, not in some distant cloud. There aren’t many of us left. Fewer and fewer PCs and laptops come with a CD/DVD player. We used to use CD/DVD drives to install software too. I rarely do that anymore. That’s not just because we download almost all our software today. It’s also because stand-alone PC software is on its way out. Accounting, office suites, customer-relationship management — you name it, we do it on the cloud now.


Project Alignment, Hiring Shortfall Top 2016 Big Data Challenges

This isn’t a new problem. Looking back even a few years ago, when it became clear that data was essentially currency, people predicted significant shortfalls in data scientists. In 2015, companies throughout the industry felt the sting acutely. Every day, new job postings go up looking for qualified data scientists. It takes time to find candidates, and not every data genius is the perfect fit for every company. There may be some relief coming, with specialty certificate and alternative education programs for big data popping up from universities and other educational institutions, but it’s not an immediate fix for 2016. If companies want to fill their teams with more data scientists, my advice is to hire people in accordance with the nature of the problems they want to solve; not all problems require advanced data science.


Why your cyber insurance investment may not pay off

If you are considering cyber insurance, you are in my opinion doing the right thing. The cost of a data breach can be staggering, and many small and medium companies suffering one will not even survive. That being said, the purchase of a policy without establishing and following appropriate information security policies and procedures may well be a waste of money. Attorney Eran Kahana, a guest on episode 172 of the Down the Security Hole podcast, put it quite simply: "If you don't do security well, the courts will kill you." Since a strong security posture is necessary anyway to protect your business, the ability to meet the requirements for cyber insurance is just a bonus. The following are some of the general things you will need to have in place prior to seeking insurance.


The Search For The Killer Bot

As 2016 dawns, there’s a sense in Silicon Valley that the decades-old fantasy of a true digital assistant is due to roar back into the mainstream. If the trend in past years has been assistants powered by voice — Siri, Alexa, Cortana — in 2016 the focus is shifting to text. And if the bots come, as industry insiders are betting they will, there will be casualties: with artificial intelligence doing the searching for us, Google may see fewer queries. Our AI-powered assistants will manage more and more of our digital activities, eventually diminishing the importance of individual, siloed apps, and the app stores that sell them. Many websites could come to feel as outdated as GeoCities pages — and some companies might ditch them entirely.



Quote for the day:


"Deal with the world the way it is, not the way you wish it was." -- John Chambers


January 07, 2016

Connecting Big Data Project Management with Enterprise Data Strategy

Ideally a portfolio of projects will support an organization’s strategic plan and the goals or missions the organization is charged with pursuing. We may also need to “get tactical” by delivering value to the customer or client as quickly as possible, perhaps by focusing on better-controlled and better-understood product centric data early on via a “data lake” approach. Doing so will be good for the customer and will help create a relationship of trust moving forward. Such a relationship will be needed when complications or uncertainties arise and need to be dealt with. In organizations that are not historically “data centric” or in organizations where management and staff have a low level of data literacy, an early demonstration of value from data analysis is especially important. 


Hybrid Cloud, Microservices and The API Economy: Looking To 2016

The drive toward cloud and the drive toward hybrid environments. If you look back, containers are not surprising because the need for portability became very critical. We ended up with a truly hybrid environment. Along with that, you see this movement towards an API Economy, a movement towards microservices, the movement of DevOps, of rapid transmission, of rapid delivery of small batches of changes–all those made containers very attractive. To me, not only is this something we’ll look back on and say 2015 is where the traction began, but it’s going to gain even more traction and transformation in 2016.


Evaluating your need for a data warehouse platform

A data warehouse platform is typically based on a relational DBMS, and the data in it is structured and generally originates from an organization's operational and transactional systems. Data warehouses are accessed by business executives and analysts using BI dashboards, OLAP and reporting tools, and ad hoc SQL queries. Big data analytics, on the other hand, is typically supported by nonrelational technologies such as Hadoop, Spark and NoSQL DBMSes. The data can be both structured and unstructured, and can originate from every type of internal system plus external data sources, such as social media. Analytics are performed on big data for discovery and insight.


Best practices in HDFS authorization with Apache Ranger

Apache Ranger offers a federated authorization model for HDFS. The Ranger plugin for HDFS checks for Ranger policies, and if a policy exists, access is granted to the user. If a policy doesn’t exist in Ranger, then Ranger defaults to the native permissions model in HDFS (POSIX or HDFS ACL). This federated model is applicable to the HDFS and YARN services in Ranger. ... The federated authorization model enables customers to safely implement Ranger in an existing cluster without affecting jobs which rely on POSIX permissions. We recommend enabling this option as the default model for all deployments. Ranger’s user interface makes it easy for administrators to find the permission (Ranger policy or native HDFS) that provides access to the user.
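The federated decision flow described above can be sketched in a few lines (illustrative Python only; the real Ranger plugin runs inside the NameNode, and the policy structure, paths, and user names here are simplified assumptions):

```python
# Sketch of Ranger's federated authorization for HDFS: Ranger policies
# are consulted first; only when no policy covers the path does the
# decision fall back to native HDFS (POSIX/ACL) permissions.
def is_access_allowed(user, path, ranger_policies, hdfs_native_perms):
    # 1. If a Ranger policy covers the path, Ranger alone decides.
    for policy in ranger_policies:
        if path.startswith(policy["path"]):
            return user in policy["allowed_users"]
    # 2. No Ranger policy: fall back to native HDFS permissions.
    return user in hdfs_native_perms.get(path, set())

policies = [{"path": "/secure", "allowed_users": {"alice"}}]
native = {"/data/etl": {"bob"}}

print(is_access_allowed("alice", "/secure/pii", policies, native))  # Ranger grants
print(is_access_allowed("bob", "/data/etl", policies, native))      # POSIX fallback grants
print(is_access_allowed("bob", "/secure/pii", policies, native))    # Ranger denies
```

The fallback in step 2 is what lets existing POSIX-reliant jobs keep running when Ranger is introduced into a cluster.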


Why the Cloud Is Taking Over Traditional IT Systems

Data is flowing over the unsecured public data highway, so security is critical, particularly as more workers switch to remote and mobile work. Infrastructure and applications are exposed to the outside world. At this point in the cloud evolution, most new cloud 2.0 applications are architected specifically for the cloud. This means that the performance and response time is higher than the first generation of cloud applications, which were just old client/server applications retrofitted with web interfaces. Around 2013, Moore’s Law started to run into the constraints of the laws of physics. Approaching very small size, transistors are less reliable. Consumers of computing power have enjoyed riding the wave of inexpensive computing power in increasingly smaller devices.


Data Center Design: Which Standards to Follow?

Best practices mean different things to different people and organizations. This series of articles will focus on the major best practices applicable across all types of data centers, including enterprise, colocation, and internet facilities. We will review codes, design standards, and operational standards. We will discuss best practices with respect to facility conceptual design, space planning, building construction, and physical security, as well as mechanical, electrical, plumbing, and fire protection. Facility operations, maintenance, and procedures will be the final topics for the series. Following appropriate codes and standards would seem to be an obvious direction when designing new or upgrading an existing data center.


Design Thinking Is Taking Hold At IBM

IBM's adoption of Design Thinking is important for the company's sheer market heft, but there is another reason, said Coleman. "IBM exists in the gap between the reality of a situation ('We've always succeeded this way.') and what could be," he said. "Design Thinkers say, 'Sure, that's great, but I have a vision,'" Coleman said. "Most people fear [a vision] because there's risk involved." He said that if Design Thinking succeeds at IBM, a company that for decades has typified how big business in the US works, large numbers of companies will likely follow suit. Indeed, IBM's Cutler said the company gives tours of its studio in Austin three times a week. Pushback still happens among employees and customers, of course. "Anytime something smells like a new process," Cutler admitted, some people are going to get defensive.


The 10 biggest startup opportunities in 2016

This year, venture capitalists and industry observers say the tech world should expect more of the same. "Most hot startups in 2016 won't be trying to lead revolutions or usher in whole new industries," says Igor Shoifot, an investment partner with TMT Investments. "Instead, they'll be enhancing existing technologies, products, services, or transactional ecosystems by saving users time, money, effort, and helping them make better choices more easily." However, the New Year has a few potential technology surprises in store, including the "Uberization" of manufacturing and mobile ecommerce in emerging markets. Here are 10 of the hottest technology startup categories, trends and opportunities (ranked in no particular order) experts expect to see in 2016.


How Goldman Sachs and Bank of America use the cloud and containers

For Thomas, containers represent a way to get the company’s developers and infrastructure workers to focus on the highest-value work. Too much time is spent on managing middleware systems and messaging buses that don’t add value for the bank. “Simplifying that and really flipping ratios of people who are just maintaining, supporting, managing applications, to people who are pushing the applications forward and bringing more value for our customers is the foundation of the goal,” he says. “It’s not about cost reduction, it’s about reinvesting the people and the talent we have to really business value added things for our customers.” Simplification means consolidation too. Thomas says Bank of America has condensed from 64 data centers last year to 31 this year. It plans to have only eight data centers by the end of 2016.


Governance Challenges When Gatekeepers are “Chilled”

The primary board concern is that for certain potentially controversial initiatives, some gatekeepers may become “gun-shy;” i.e., may engage in self-protective conduct that frustrates valid board strategic initiatives and other appropriate efforts. This, despite the fiduciary or employment risks a gatekeeper may assume by acting in what may be perceived as his/her own interests, as opposed to the legitimate business interests of the company. Note that this is a concern separate and distinct from the concern, expressed by some knowledgeable observers, that the new DOJ policy will have a chilling effect on employees’ willingness to cooperate in their companies’ internal investigations. We’re talking here about a different kind of “chill.” Such self-protective conduct may manifest itself in both obvious and subtle ways.



Quote for the day:



"The sharpest criticism often goes hand in hand with the deepest idealism and love of country." -- Robert F. Kennedy


January 06, 2016

5 Predictions for Trends in Data, Analytics and Machine Learning in 2016

Applications will be designed to discover self-improvement strategies as a new breed of log and machine data analytics, at the cloud layer, using predictive algorithms, enables continuous improvement, continuous integration and continuous deployment. The application will learn from its users; in this sense the users will become the system architects, teaching the system what they, the users, want and how the system is to deliver it to them. Gartner views advanced machine learning among the top trends to emerge in 2016, with “advanced machine learning, where deep neural nets move beyond classic computing and information management to create systems that can autonomously learn to perceive the world, on their own … this is what makes smart machines appear ‘intelligent.’”


CES 2016: Sneak Peek At Emerging Trends

CES 2016 in Las Vegas came to life for media attendees with a preview event -- CES Unveiled -- on Monday night. The event was set up for vendors to show off their products in hopes of attaining media attention. It also served as a glimpse of the broader products and trends we'll be talking about throughout 2016 and beyond. After walking around a crowded ballroom for a couple of hours checking out gadgets of all shapes, sizes, and functionality, here are the major trends I observed that I think are likely to have long-term impact on our lives and businesses.


EU privacy watchdog to set up ethics advisory group

Outlining his plans for an ethics advisory group, Buttarelli said the group will “advise on a new digital ethics that allows the EU to realise the benefits of technology for society, whether for security or economic reasons, in ways that reinforce the rights and freedoms of individuals while retaining the value of human dignity”.  Buttarelli said that as the understanding that dignity is important spreads, people will want more opportunities to protect their privacy. “But we also need to be clear about exchanging personal data for incentives, whether those incentives relate to increased security or consumer benefits,” he said. According to Buttarelli, the internet has evolved such that the tracking of people’s behaviour has become routine for many intelligence agencies and an essential revenue stream for some of the most successful companies.


How IBM's Watson Takes On The World

Cognitive computing, according to Vice President of IBM Watson Steve Gold, essentially marks the arrival of a new “era” in computing. What started with his own company’s development of tabulation computing, to process US census data at the dawn of the 20th century, developed into programmatic computing in the middle of the century, with the arrival of transistors, relational databases, magnetic storage and eventually microprocessors. Now, the enormous growth in unstructured data we have experienced in recent years, and the sophisticated methods that have been developed to help us make sense of, understand and learn from this data, has given rise to cognitive computing. Cognitive computers don’t need to be programmed – they can learn for themselves.


Manage Your Emotional Culture

This playful spirit at the top permeates Vail. Management tactics, special outings, celebrations, and rewards all support the emotional culture. Resort managers consistently model joy and prescribe it for their teams. During the workday they give out pins when they notice employees spontaneously having fun or helping others enjoy their jobs. Rather than asking people to follow standardized customer service scripts, they tell everyone to “go out there and have fun.” Mark Gasta, the company’s chief people officer, says he regularly sees ski-lift operators dancing, making jokes, doing “whatever it takes to have fun and entertain the guest” while ensuring a safe experience on the slopes.


Data centers seek creative skills to drive innovation in IT

Creativity and innovation have more to do with the hierarchical, logical IT world than people may think, said James Stanger, senior director of products for CompTIA Inc., a nonprofit IT industry association involved in training and certifications. Innovation and creativity enhance an IT pro's ability to troubleshoot, design architectures and optimize performance to meet traditionally important metrics, such as reliability, stability and efficiency of IT operations. Stanger calls it the ability to make an informed choice. Combine an ability to see "the spaces between the systems" -- how everything interconnects and works in your environment -- with a deep knowledge of the protocols and procedures in use, and the IT worker can create the best architecture and operations possible for their business, he said.


Apple’s convergence will be about input not interface

It’s not the interface that’s changed, it’s the input. A Surface-style touch keyboard was a given as soon as the iPad Pro was confirmed, but Apple did more than move the keys to a more comfortable position. Snapping a Smart Keyboard to the iPad Pro instantly creates a bridge between the desktop and mobile realms, not just with the quick keystrokes and onscreen shortcut bar, but also in how integral it is to the whole experience. With the Air and the mini, Bluetooth keyboards are highly optional and occasional accessories that add little more than convenient typing, but the Smart Keyboard is absolutely necessary to the iPad Pro, so much so that I’m surprised Apple didn’t charge $200 extra and just include it in the box. It may be a baby step, but it’s an important one in the evolution of iOS.


“Just 14% said the IT operations department was the main sponsor of the migration project, and 11% said a business leader with no knowledge of the cloud was the main instigator,” said Rackspace’s Anatomy of a Cloud Migration report. “Overall, this means that CEOs, business leaders and boards of directors drive six out of 10 (61%) cloud migrations,” it added. In cases where the move to cloud was being led by a business leader rather than a tech one, it is far more common to see companies employ a third-party organisation to oversee the process, the report said. As for the reason why business leaders, rather than IT decision-makers, are leading organisations’ cloud charge, it could be because adopting cloud is seen as a way of cutting costs from the business.


Microsoft's New Security Approach

Nadella emphasized that the tools to protect, detect and respond to threats have existed for many years. The seeds for this were planted more than a year ago as Microsoft combined Intune, Azure Rights Management and Azure Active Directory Premium into its Enterprise Mobility Suite and the company doubled down on technologies such as authentication and identity management. "What is new is that posture," Nadella said. ... What Microsoft is trying to build, he said, is an "intelligent security graph" that brings together virtually all of the company's security intelligence from streams throughout Microsoft, its customers, partners and security operations centers throughout the world in real time and that of select partners tied into that graph.


Are Data Scientists Doing the Job They Were Hired For?

In a perfect world, data scientists would be free to access and manipulate all enterprise content quickly and fluidly without impairment. But the reality of the data environment for most businesses is a scattered and messy ecosystem of multiple systems and software, each used for different content and management functions. Data is duplicated, disconnected, and disjointed. There is no single portal or platform for search. When data scientists are forced to gather content from IT systems that are sprawled across innumerable platforms and departments, they are left grasping at straws and with little more than a flawed convenience sample. Garbage in, garbage out. The data scientist won’t likely answer all of our problems, but data management just might – if given enough time and planning. As 2016 starts to dawn upon us, business leadership is starting to realize that analytics skills alone will do little to make sense of enterprise-scale content.



Quote for the day:


"Executive ability is deciding quickly and getting somebody else to do the work." -- John G. Pollard


January 05, 2016

The Open Trusted Technology Provider™ Standard (O-TTPS)

Information Technology supply chains depend upon complex and interrelated networks of component suppliers across a wide range of global partners. Suppliers deliver parts to OEMs, or component integrators who build products from them, and in turn offer products to customers directly or to system integrators who integrate them with products from multiple providers at a customer site. This complexity leaves ample opportunity for malicious components to enter the supply chain and leave vulnerabilities that can potentially be exploited. As a result, organizations now need assurances that they are buying from trusted technology providers who follow best practices every step of the way.


Another Step Toward an Open NFV Ecosystem

ETSI NFV ISG Chairman Steven Wright of AT&T, commenting on the continued momentum, observed: “Among our most important 2015 goals was to foster interoperable implementations rather than creating new standards activity. Exiting our final meeting for the year, we are pleased with the progress and entertaining proposals for 2016 work items, which will continue to guide the entire industry on the direction for NFV.” A significant outcome resulting from NFV#12 was the completion of “Report on SDN Usage in NFV Architecture Framework.” This study was conducted over the past 12 months, with 40 contributors from across the industry, and motivated 35 recommendations for the ISG. The report analyzed SDN use cases for NFV, highlighting lessons learned from 14 ETSI NFV PoCs using SDN and NFV, along with open source SDN controllers.


Is Agile Costing You Too Much?

There is a question of whether the product owner is a value-adding role. In many organizations it appears that it is not: the product owner is a middle-man with authority over prioritization. While sequencing and scheduling, when done well, will generate value, the act of sequencing one item over another does not add value to the items themselves, and from a Lean value stream mapping perspective it's a non-value-adding role. It is even more curious, then, that Agile requires you to create non-value-adding positions. So in some extreme cases, Agile methods appear to have added 2 new people in non-value-adding positions for every 6 in value-adding positions. Put another way, after the Agile transition, 25% of the workforce is additional "Agile" overhead for operating the Agile method.
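The 25% figure follows directly from the ratio the author cites; as a quick arithmetic check:

```python
# 2 non-value-adding roles added per 6 value-adding ones:
# the overhead share of the resulting 8-person team is 2/8 = 25%.
value_adding = 6
overhead = 2
overhead_share = overhead / (value_adding + overhead)
print(f"{overhead_share:.0%}")  # 25%
```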


Can Your Business Survive a Data Center Outage?

ECS’s geo-replication ensures that the data is protected against site failures and disasters. ECS gives customers the option to link geographically dispersed systems and bi-directionally replicate data among these sites across WAN. Several smart strategies such as geo-caching are used to reduce WAN traffic for data access. That leads to the next natural question: If data is replicated to multiple sites, will I incur a large storage overhead? In order to reduce the number of copies in a multi-site deployment, ECS implements a data chunk contraction model, which dramatically reduces storage overhead in multi-site ECS environments. In fact, storage efficiency increases as more sites are added for geo-replication!


Spark and big data discovery: An evolutionary perspective

To enable a workflow that truly leverages the advantages of a Hadoop-based data lake, businesses need a set of tools that can open up all the assets in the data lake to everyone in the organization who needs them. They need to make analysis accessible and iterative. And they need a workflow that reduces the need for many specialized resources, placing core analytical capability into the hands of power business analysts. Businesses also need to empower these analysts to be citizen data scientists, who can free actual data scientists to pursue complex analysis rather than spending their time performing data preparation. With Spark, a data lake can become a true big data discovery environment. Spark’s emergence as a processing framework for big data is a game changer because its advanced analytics capabilities allow for large-scale data analysis across the enterprise.


Big Data Predictions For 2016

This past year was no exception. Everybody talks about the promise and the potential of big data. Yet there's a sense of disenchantment as CIOs search for use-cases to inspire change inside their own companies. They want to be shown, not told. They want the signal and not the noise. We noticed that 2015 was a noisy year, and 2016 seems like it will be equally loud. It's not something that CIOs can afford to tune out. With digital transformations and pure-play startups disrupting established industries -- Uber is the example everyone mentions first -- the pressure is on to leverage data in new ways for competitive advantage. CIOs need to straddle two different worlds -- satisfying their existing customer base while moving fast to deliver instant, data-driven services to customers -- or they risk losing ground to market upstarts.


Public vs. Private Cloud: How to Integrate Your Data Across Both

Another way to avoid data issues across public and private clouds is to simply choose one or another based on workload type and not have any particular workload straddle both. Some workloads have steady demand or sensitive data, which makes them better suited for the firewalled, fixed capacity confines of a private cloud. Financial analytics and Human Resources workloads are good examples. Other workloads see wide variations in demand and have publicly viewable data that make them a great fit for the elasticity of the public cloud. A customer-facing marketing website or customer analytics that have been sanitized to remove Personally Identifiable Information are typical candidates.
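The routing rule described above amounts to a simple per-workload decision (a hedged sketch; the field names, workload names, and criteria are illustrative, not from any real cloud-management tool):

```python
# Route each workload wholly to one cloud based on its demand profile
# and data sensitivity, so no single workload straddles both.
def place_workload(w):
    if w["sensitive_data"] or w["demand"] == "steady":
        return "private"   # firewalled, fixed-capacity private cloud
    return "public"        # elastic public cloud for bursty, public-facing work

workloads = [
    {"name": "hr-payroll", "sensitive_data": True, "demand": "steady"},
    {"name": "marketing-site", "sensitive_data": False, "demand": "bursty"},
]
placement = {w["name"]: place_workload(w) for w in workloads}
print(placement)
```

Keeping the decision at workload granularity is what sidesteps the cross-cloud data integration problems the article discusses.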


Measuring Change Readiness

One other concept we should stop and discuss briefly is the idea of change saturation. This concept captures the idea that organizations in general, and certain individuals in particular, can only absorb so much change at one time. One frequent occurrence with change efforts is the situation where more than one project or larger change effort may require the same human, financial, physical, information or other resources at the same time. To become aware of this situation, and to mitigate the effects of change saturation, you will want to build a heat map identifying the timing, duration, and intensity of the demands that all of the different projects and change efforts will place on the different types of resources within the organization. This, too, is a prerequisite.
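A minimal sketch of such a heat map, assuming made-up projects, periods, demand figures, and a per-period capacity threshold:

```python
from collections import defaultdict

# Sum each project's demand on each resource type per period,
# then flag the cells where combined demand exceeds capacity
# (i.e., where change saturation is likely).
demands = [
    ("CRM rollout", "Q1", "IT staff", 4),
    ("ERP upgrade", "Q1", "IT staff", 5),
    ("CRM rollout", "Q2", "IT staff", 2),
    ("ERP upgrade", "Q1", "budget", 3),
]

heat = defaultdict(int)
for project, period, resource, load in demands:
    heat[(period, resource)] += load

CAPACITY = 6  # assumed per-period capacity for any one resource type
saturated = [cell for cell, load in heat.items() if load > CAPACITY]
print(saturated)  # Q1 IT staff demand (4 + 5 = 9) exceeds capacity
```

In practice the same aggregation is usually done in a spreadsheet, with cell shading standing in for the threshold check.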


Ransom32: First-of-its-kind JavaScript-based ransomware spotted in the wild

“NW.js allows for much more control and interaction with the underlying operating system, enabling JavaScript to do almost everything ‘normal’ programming languages like C++ or Delphi can do.” Ransom32 is being sold as a service, but ransomware-as-a-service is not new; for example, the Tox ransomware developer wanted 30% of the ransom payment and the FAKBEN Team requested a 10% cut of the profit. Ransom32 falls somewhere in between, with the crypto-malware authors wanting a 25% cut for customized versions of its currently undecryptable ransomware. Like other crypto-malware campaigns, wannabe bad guys sign up on a hidden server on the Tor network and can get their own customized Ransom32 ransomware after inputting the Bitcoin address where the ransom is to be delivered.


NVIDIA To Equip Self-Driving Cars With Water-Cooled Super Computer

Right now the focus of PX 2 is to detect and recognize objects, but Nvidia wants self-driving cars to also recognize circumstances. For example, a self-driving car may be able to distinguish an ambulance from a truck, and slow down. A car may also recognize snowy conditions and operate on a road in which the lanes are hidden. But such learning patterns are complex, and it could be a while until self-driving cars can handle such situations. The Drive PX 2 has 12 CPU cores, offers 8 teraflops of floating-point performance, has two Pascal GPUs and draws 250 watts. It is the equivalent of "150 MacBook Pros in your trunk," said Jen-Hsun Huang, Nvidia's CEO, during the press conference at CES.



Quote for the day:


"Products are made in the factory, but brands are created in the mind." -- Walter Landor


January 04, 2016

Why DevOps Needs To Embrace the Database

With all the new innovation, technology and methods that enable IT to move at the speed of the digital enterprise, DBAs are being asked to keep up using the same old methods. The life of a DBA has become a constant state of barely treading water if staying above it at all. As the pressure mounts, DBAs risk deploying changes that perpetuate the cardinal sin of application release – breaking the application. When erroneous database changes are not detected, the application(s) can go down and cause an unenviable ripple effect. DBAs have to track down and remedy the change, developers are annoyed that they have to go back and fix their app, C-level execs are likely livid that the business is losing money, and end users are thrown into a tailspin when they can’t access the apps they rely on.


GM and Lyft Are Building a Network of Self-Driving Cars

The partnership with Lyft, though, signifies ambitions far beyond Super Cruise. While we have no details on the proposed “network of on-demand autonomous vehicles”—such as how it will work or when it will arrive—we can assume it will require a far more advanced take on autonomous driving than Super Cruise will offer. Lyft, like other ride-sharing services, does the bulk of its work in cities, which are devilishly hard for robots to navigate. Urban areas are full of complicated intersections, pedestrians, cyclists, and other hard-to-predict variables. More to the point, it’s hard to see the benefit Lyft gets from partnering with GM on a car that sometimes needs a human driver: If you’re going to pay a person to drive, you might as well have them drive all the time.


Tech Salary Guide 2016

Technology remains a fast-growing and competitive field, and the salary changes for 2016 reflect that. Robert Half Technology reported on the salary changes for job titles across 10 tech verticals, and the average salary range has increased for every title. Some of the titles with the highest raises in salary include chief security officer, developer, business analyst, big data engineer, data scientist and wireless network engineer. Check out the salary ranges for popular technology jobs across a number of verticals and see how much they've increased since 2015.


Are You Blind To Your Organization’s Culture?

Creating and changing a culture is a lot like losing weight. It takes hours of exercise, weeks of eating right and getting plenty of rest to drop 10-15 pounds. But it seems like those same 10 pounds (and then some) can get packed right back on in only one weekend of too much pizza, nachos, and beer. Culture can be like this. It may take a year or more for leaders to nurture the baseline of a constructive, healthy culture. And, just like that (he said with a snap) things can change. It starts to change when the CEO starts to talk badly about one of the members of their leadership team in the presence of the rest of the team. From there it continues to devolve when the members of that leadership team start doing the same to some of their own managers. In a matter of weeks, a new climate of back-stabbing and disengagement has emerged.


DBaaS to be the backbone of business transformation: Oracle

DBaaS is a paradigm where end users (database administrators, developers, QA engineers, project leads, etc.) can request database services, use them for the lifetime of a project, and then have them automatically de-provisioned and returned to the resource pool. This provides companies with a shared, consolidated platform from which organisations can easily provision database services. It also has the elasticity to scale database resources up and down, and chargeback based on database usage, thus increasing cost efficiency. DBaaS follows the definition laid down by the National Institute of Standards and Technology (NIST) for cloud computing: self-service, rapid elasticity, measured service (metering and chargeback) and resource pooling.


With the Calendar Turned to 2016, Will 'CIO' Become a Part of History?

IT can lead the creation and rollout of an online portal for employees to do everything from submitting IT help desk requests and requesting a contract review from legal to selecting healthcare benefits. This is why the CIO is the logical person to assume the role of CPO. The combination of increasing adoption of cloud computing and the analysis of Big Data to help the enterprise reach its broad business objectives will enable the CIO-turned-CPO to lead these new service-oriented initiatives. According to the new Verizon Enterprise Solutions’ “2016 State of the Market: Enterprise Cloud” report, 84 percent of businesses surveyed said their cloud use has increased in the past year, and half of enterprises say they will use the cloud for at least 75% of their workloads by 2018.


IoT Trends for 2016. Everything connected everywhere

Smart Cities are the powerhouse of the IoT, within which all the other technology settings usually take place. All successful Smart Cities tend to have three aspects in common: non-partisan long-term objectives, combined public and private effort, and open data based on open standards. Successful open ecosystems rely on cost-effective cloud-based open infrastructures, which are becoming the chosen solutions in many regions that are already working or experimenting with these platforms. To ensure that Smart City development is deployed in an orderly manner, local governments are attracting major technological partners through funding to put in place best-in-class Smart City services and achieve the long-term success that is key in this setting.


Ideas for WebRTC Implementation

WebRTC (Web Real-Time Communication) is an open source technology for implementing real-time multimedia communication capabilities directly in your web browser. It sets up a peer-to-peer connection between two or more people, which is well suited to transferring media (audio and video streams). This technology is supported by the following browsers: Google Chrome, Mozilla Firefox and Opera. You do not need any additional plug-ins for these browsers; just open a web page and start a conversation. There is no native support for those using Safari and IE, but it is possible to add special plug-ins. The idea is simple. First, the browser sends a signal to the WebRTC server that the user wants to initiate a call. After getting the link from the server, the user sends it to his companion.
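The call-setup flow described above can be sketched as a toy in-memory signaling relay. This is an illustration only: `SignalingServer`, the room "link", and the message shapes are hypothetical stand-ins for a real signaling channel (which would carry SDP offers/answers and ICE candidates, typically over WebSockets); the media itself never touches this server.

```python
# Toy sketch of the signaling flow: the caller asks the server to initiate
# a call, receives a link (room id), and shares that link with the callee
# so both peers can exchange offer/answer messages. Only signaling is
# modeled here; actual media flows peer-to-peer via the browser's WebRTC
# stack.
import uuid

class SignalingServer:
    def __init__(self):
        self.rooms = {}  # room id -> list of pending messages

    def create_room(self):
        room = uuid.uuid4().hex[:8]
        self.rooms[room] = []
        return room  # the "link" the caller shares with the callee

    def post(self, room, message):
        self.rooms[room].append(message)

    def poll(self, room):
        messages, self.rooms[room] = self.rooms[room], []
        return messages

server = SignalingServer()
link = server.create_room()                          # caller initiates a call
server.post(link, {"type": "offer", "sdp": "..."})   # caller's (placeholder) SDP offer
offer = server.poll(link)[0]                         # callee fetches the offer
server.post(link, {"type": "answer", "sdp": "..."})  # callee replies with an answer
```

In a real application each `post`/`poll` pair would be an HTTP or WebSocket exchange, and the browsers would feed the SDP payloads into their own peer connection objects.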


Insurers Look to Tighten Cybersecurity Before Innovation

Insurers have realized that not only do they have a responsibility to monitor their activities, but they need to make sure that basic housekeeping is being carried out, says Steve Durbin, a former Gartner analyst and managing director of the Information Security Forum. “You may not be able to detect the malware, but you should be able to spot unusual activity; you may not be able to protect every piece of data, but you should have implemented encryption.” In a state filled with banks and insurers, it’s no surprise that New York regulators are currently considering a variety of cybersecurity requirements for banks and insurers. In a letter to other regulators, New York financial services superintendent Anthony Albanese said his agency has surveyed more than 150 banks and 43 insurers since 2013 and has concluded that “robust regulation” is needed.


The missing piece of the cloud security jigsaw

The adoption of the cloud presents additional issues that are snowballing into the unmanageable, specifically with respect to identity management. It used to be that my Active Directory allowed me to control access to most of my systems through domain and system credential access. However, the cloud doesn't necessarily conform to my existing network standards. In addition, with smaller cloud apps, IT often isn't even consulted about user access, application operation or controls. In general control audits we do assess whether users are properly activated and deactivated, but there still exists a lack of visibility, which makes me feel uncomfortable in terms of who has what access when. Standards such as SAML are good but are not necessarily always available.



Quote for the day:


"Truly successful decision making relies on a balance between deliberate and instinctive thinking." -- Malcolm Gladwell


January 03, 2016

Enterprise Architecture - Guiding Principles

The usefulness of principles is in their general orientation and perspective; they do not prescribe specific actions. A given principle applies in some contexts but not all contexts. Different principles may conflict with each other, such as the principle of accessibility and the principle of security. Therefore, applying principles in the development of EA requires deliberation and often tradeoffs. The selection of principles to apply to a given EA is based on a combination of the general environment of the enterprise and the specifics of the goals and purpose of the EA. The application of appropriate principles facilitates grounding, balance, and positioning of an EA. Deviating from the principles may result in unnecessary and avoidable long-term costs and risks.


How to Flush DNS

There is a wide array of DNS issues that can arise at the network administrator or power user level. For the end user, however, the majority of DNS problems arise from either bad configuration entries or the local computer’s DNS cache requiring a flush. Independent of the type of operating system, many home computer users will input the DNS server for their respective Internet Service Provider (ISP) incorrectly, resulting in a failed Internet connection. Each ISP will have a slightly different configuration process; however, the IP address of the DNS server for your home network to use will be provided on registration for service. In many cases the ISP will use the address of its actual DNS server, while in others it will be the same as the gateway IP for the service.
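As a hedged sketch of the "flush the local DNS cache" step, the platform-to-command mapping below selects (but does not run) the usual flush command. Exact commands vary by OS version — for example, newer Linux distributions use `resolvectl flush-caches`, and newer macOS versions also require restarting `mDNSResponder` — and they typically need an elevated shell.

```python
# Pick the DNS cache flush command for the current platform. This sketch
# only selects the command; actually running it (e.g. via subprocess)
# generally requires administrator privileges.
import sys

FLUSH_COMMANDS = {
    "win32":  ["ipconfig", "/flushdns"],
    "darwin": ["dscacheutil", "-flushcache"],         # newer macOS also restarts mDNSResponder
    "linux":  ["systemd-resolve", "--flush-caches"],  # resolvectl on newer systemd releases
}

def flush_command(platform=None):
    platform = platform or sys.platform
    for prefix, cmd in FLUSH_COMMANDS.items():
        if platform.startswith(prefix):
            return cmd
    raise ValueError(f"no known flush command for {platform!r}")
```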


The Disciplined Agile Framework

IT departments are complex adaptive organizations. What we mean by that is that the actions of one team will affect the actions of another team, and so on and so on. For example, the way that your agile delivery team works will have an effect on, and be affected by, any other team that you interact with. If you’re working with your operations teams, perhaps as part of your overall DevOps strategy, then each of those teams will need to adapt the way they work to collaborate effectively with one another. Each team will hopefully learn from the other and improve the way that they work. These improvements will ripple out to other teams. The challenge is that every area within IT has one or more bodies of knowledge, and in some cases published “books of knowledge”, that provide guidance for people working in those areas.


Designing the Business of IT

One of the core benefits that organisations can expect is a more cost-efficient IT environment. Senior IT leaders from MunichRe, Shell and Achmea, as well as research from Gartner, predict that IT4IT will help organisations manage an increasingly complex IT estate in a more cost-effective fashion. It will also free up time and budget for innovation and new products. They feel the Reference Architecture provides a strong framework for managing multi-sourcing approaches, which are becoming more prominent in organisations around the world. Another key benefit of IT4IT is that it is not being introduced as an alternative to methodologies or frameworks such as TOGAF and ITIL.


Google's 'Lego' Smartphone, Smarter TVs: What We're Excited About In 2016

The Internet of Things should continue to provide the foundation for the technology industry's ambitions next year, framed by machine learning, analytics, networking, and ever-smaller devices. Connected sensors will proliferate. Intelligent software agents will learn new tricks that automate discrete tasks in a way that's similar to Gmail's Smart Reply service. Robots will emerge from private businesses to begin grocery deliveries on public sidewalks. If regulatory approval can be secured, drones will begin lawful package deliveries, following in the footsteps of flying contraband couriers.


TLS Client Authentication

Why TLS client authentication? Because that’s the most standard way to authenticate a user who owns a certificate. Of course, smartcard certificates are not the only application – organizations may issue internal certificates to users that they store on their machines. The point is to have an authentication mechanism that is more secure than a simple username/password pair. There is a usability problem, especially with smartcards, but that’s beyond the scope of this post. So, with TLS clientAuth, in addition to the server identity being verified by the client, the client identity is also verified by the server. This means the client has a certificate that is issued by an authority the server explicitly trusts.
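Server-side, requiring a client certificate issued by an explicitly trusted authority looks roughly like this with Python's standard `ssl` module (file paths are placeholders; this is a configuration sketch, not a complete server):

```python
# Minimal server-side TLS client-authentication setup: the server presents
# its own certificate and *requires* the client to present one issued by a
# CA the server explicitly trusts. With CERT_REQUIRED, the handshake fails
# if the client offers no certificate or an untrusted one.
import ssl

def make_clientauth_context(server_cert, server_key, trusted_ca):
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile=server_cert, keyfile=server_key)
    ctx.load_verify_locations(cafile=trusted_ca)  # the issuing CA the server trusts
    ctx.verify_mode = ssl.CERT_REQUIRED           # demand a client certificate
    return ctx
```

The context would then be used to wrap the listening socket; the verified client certificate is afterwards available via `getpeercert()` on the wrapped connection.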


Market Police Deploy New Algorithm Weapons Against Spoofers

“We have to capture every trade now,” O’Brien said. “In today’s markets it’s all about analyzing patterns and contexts.” Yet given how rapidly fraudsters can change their methods to hoodwink human beings, outwitting surveillance software could be even easier. Algorithms are sophisticated but they’re incapable of determining whether a flurry of buy and sell orders are legitimate or unlawful. “The surveillance tools are merely the first line of defense,” said Haim Bodek, founder of Decimus Capital Markets, a New York-based algorithmic investing firm. “These tools can help bring suspicious activity to the attention of regulators, trading venues and brokers, but they’re a poor substitute for a compliance program that monitors activity across affiliated accounts and groups of traders.”


2025: the five key attributes for your business surviving the next ten years in tech

The two make-or-break traits that rose to the top for these leaders were being able to spot new opportunities predictively and being able to innovate in an agile way. The survey also asked these leaders how prepared they believe their organisations are in each of these two dimensions. The gaps were quite remarkable. While 62% of those surveyed identified predictively spotting opportunities as being very important for their businesses, only 12% thought that their businesses had this capability. And only 9% believed their organisations were capable of innovating extremely well in an agile way.


Podcast: Portfolio Management & The Agile Extension

In agile, we need to be prepared to constantly adapt our plans. That approach works extremely well at the project or initiative level, but at an organizational level, budgets and plans tend to be longer term and less adaptable. The current rate of change often means that those plans are negated and organizations find it difficult to adapt quickly to changing market conditions. We need to take the concept of backlog management and apply it at a higher level to programs and portfolios so that we are able to adaptively respond to changes in the world around us. The traditional definition of project success has been on time, on scope, and on budget. Those constraints still exist, but they are not the driving factors today.


Cybersecurity in 2016: will it come down to luck or leadership?

Unfortunately in most respects, 2016 won’t change much: users will still unknowingly click on malicious links; IT departments will still be bad at staying up to date with patching; the bad guys will continue to attack; and the tide of misery from breaches will persist. What matters most is whether your organisation will be a victim or not. Of course you could do nothing, and be lucky. But the only way to control your fate is to lead your organisation to the high ground based on a well-considered, security-first strategy. It is important to remember that, despite their claims, most security vendors cannot help you. Within the market we see too many 'me too' vendors, whose main focus is on the staple of detection.



Quote for the day:

"It is literally true that you can succeed best and quickest by helping others to succeed." -- Napoleon Hill

January 02, 2016

5 Aspects of Cloud Computing to Watch Out For in 2016

The best part is that there is no need to buy expensive software licenses. Any software application upgrades and security features are the sole obligations of the cloud service provider. When it comes to efficiency, flexibility, reliability, and scalability, the cloud undoubtedly triumphs over the rest. Staying competitive in today’s world demands high levels of investment in technology in its various avatars. But enter cloud computing and all your worries go away, thanks to the ‘rent, don’t buy’ policy so strikingly espoused by it. It is easy to stay on top of market opportunities and retain your competitive edge with cloud computing. The future of cloud computing will only get brighter.


Technology Will Create New Models for Privacy Regulation

The average cost per user of a data breach is now $240 … think of businesses looking at that cost and saying “What if I can find a way to not hold that data, but the value of that data?” When we do that, our concept of privacy will be different. Our concept so far is that we should give people control over copies of data. In the future, we will not worry about copies of data, but about using data. The paradigm of required use will develop once we have really simple ways to hold data. If I were king, I would say it’s too early. Let’s muddle through the next few years. The costs are real, but the current model of privacy will not make sense going forward. If I ping a service, and it tells me someone is over 18, I don’t need to hold that fact.


Artificial Intelligence Finally Entered Our Everyday World

Ng’s prototype relies on a technology called deep learning. Inside the massive computer data centers that underpin Baidu’s online services, the company runs massive neural networks—networks of hardware and software that approximate the web of neurons in the human brain. By analyzing enormous collections of digital images, these networks can learn to identify objects, written words, even human faces. Feed enough cat photos into a neural net, and it can learn to identify a cat. Feed it enough photos of a cloud, and it can learn to identify a cloud.


The CEO of Google's £400 million AI startup is going to discuss ethics

Defending why Google hasn't revealed the members of its AI ethics committee yet, Hassabis said "it's very early days" and "there's lots of scrutiny on this". He said he'd like to get everyone "up to speed" on artificial intelligence first. "We wanted to have a calm, collected debate first," he said. "At some point we will reveal who these people [on the ethics committee] are and what issues are being discussed." Hassabis also assured the audience that he will not allow DeepMind technology to be used in military applications. Hassabis also revealed that he spoke with Hawking on the topic of AI a few months ago. "I think [Hawking] was quite reassured about how we specifically were approaching AI," said Hassabis.


Ten digital trust challenges

It’s not just the challenge of keeping up to speed with technological developments. It’s whether and how the current design of our private and public institutions needs to adapt to cope with these changes and to restore the trust of society – digital trust. In June, to mark the 800th anniversary of the Magna Carta, we looked at how institutions (organised and purposeful interactions of people based on contract, law or culture) must create and maintain trust through legitimacy, effectiveness and transparency, and how global megatrends like technology are driving the need for a design transformation and a bold new charter for our digital world. Here we take a look at 10 digital trust issues that in our view institutions must grow new capabilities to address.


Security threats, hackers and shadow IT still plague health IT

As a starting point, Hopfer suggests that CIOs take an inventory of the cloud services running within their organizations to assess their security posture. The exercise of evaluating what types of applications employees are running can shed light on the tools they need to support the business objectives of the enterprise. ... As hackers grow more sophisticated and attacks mount, security is a primary concern for CIOs in all industries, but it carries a special importance in healthcare owing to the sensitivity of the data involved. Moreover, much of the information contained in health records is unalterable, and, taken in composite, makes for a remarkably full profile that criminals can put to use for all manner of fraudulent ends.


How to defend your business against the worst hackers have to offer in the New Year

When it comes to crystal ball gazing in the tech world, the rule of thumb is not to do it as you only end up looking like an idiot in 12 months’ time. This time, though, the sad truth of the matter is that predicting the shape of the IT security threatscape for next year really isn't that hard: 'more of the bloody same' pretty much sums it up. But while exploit kits, DDoS attacks, and ransomware will all continue marching into the enterprise and doing damage, these threats will also evolve to become more dangerous. Here are five IT security predictions to mull over as you recover from your New Year's Eve party.


Big Data Industry Predictions for 2016

Automated personalization will be a critical business benefit that big data analytics begins to deliver in 2016. Companies will continue to seek competitive advantage by adopting new big data technologies and allowing machines to interpret subjective ‘squishy’ data – including human communication cues such as nonverbal behavior, facial expressions, and tone of voice. Big data analytics makes this possible by assimilating vast amounts of information, including the types of data that were too slow and expensive to collect and analyze in the past, such as communications and case records for knowledge workers. As the machines get better at interpreting a variety of data types and collating them with vast quantities of structured data, they can begin to improve and accelerate both employee-owned business processes and customer-facing experiences.


Using Redis As a Time Series Database: Why and How

One of the first questions brought up when talking about Redis and its use as a time-series database is “what is the use or purpose of a time-series database?” The use-cases around time series databases are more related to the data involved - specifically that your data is structured as a series of events or samples of one or more values or metrics over time. A few examples include (but are not limited to): Sell price and volume of a traded stock; Total value and delivery location of an order placed at an online retailer; Actions of a user in a video game; and Data gathered from sensors embedded inside IoT devices. We could keep going, but basically if something happened or you made a measurement, you can record that with a timestamp.
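A common way to realize this in Redis is a sorted set whose scores are event timestamps, queried by time range with ZRANGEBYSCORE. The snippet below simulates that pattern with a plain sorted list so it runs without a Redis server; with the real redis-py client the equivalent calls would be `r.zadd(key, {member: timestamp})` and `r.zrangebyscore(key, start, end)`. The stock-price members here are made-up data.

```python
# The classic Redis time-series pattern: each sample goes into a sorted
# set with the event timestamp as the score, and range queries by time
# use ZRANGEBYSCORE. This stand-in models the sorted set locally.
import bisect

class FakeSortedSet:
    def __init__(self):
        self._items = []  # (score, member), kept sorted by score

    def zadd(self, score, member):
        bisect.insort(self._items, (score, member))

    def zrangebyscore(self, lo, hi):
        return [m for s, m in self._items if lo <= s <= hi]

prices = FakeSortedSet()
prices.zadd(1451606400, "1451606400:102.50")  # member encodes "timestamp:price"
prices.zadd(1451610000, "1451610000:103.10")
prices.zadd(1451613600, "1451613600:101.80")

# All trades in the first hour (both endpoints inclusive, as in Redis).
first_hour = prices.zrangebyscore(1451606400, 1451610000)
```

Encoding the timestamp into the member string keeps members unique even when two samples share a value, which is why the pattern stores `"timestamp:value"` rather than the bare value.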



Quote for the day:


"Unless you try to do something beyond what you have already mastered, you will never grow." --Ralph Waldo Emerson


January 01, 2016

Happy New Year, techies!


Algorithms: Turning Data Into Products


Algorithms rely on data. If you can get results from clear data, you can spot patterns, such as why a query is returning the wrong results, according to Surabhi Gupta, engineering manager at Airbnb. For companies such as Google and Airbnb, algorithms are their "product." In the simplest terms, a Google search algorithm has to produce an effective search result. That search result is the company's product. Airbnb's algorithms have to connect travelers with those offering accommodations. Practical expediency governs how algorithms are used and referenced. In the business world, algorithms are built for speed. "You are working with people with the expectation of immediate answers," said Google's Marshall. "Your algorithm has another constraint on it." In the end, a developer is taking a sub-optimal solution that runs fast and comparing it to the optimal solution to see how close the two are, he added.
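The "fast sub-optimal versus optimal" comparison described above can be sketched on a toy 0/1 knapsack: a greedy value-density heuristic runs quickly, brute force gives the true optimum, and the ratio between the two measures how close the fast answer gets. The items and capacity are invented for illustration.

```python
# Fast greedy heuristic vs. exact brute force on a toy 0/1 knapsack,
# measuring how close the quick answer comes to the optimal one.
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight), illustrative data
capacity = 50

def greedy_value(items, capacity):
    # Take items in order of value density until nothing more fits.
    total_v = total_w = 0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if total_w + w <= capacity:
            total_v, total_w = total_v + v, total_w + w
    return total_v

def optimal_value(items, capacity):
    # Exhaustively try every subset -- exponential, but exact.
    best = 0
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            if sum(w for _, w in combo) <= capacity:
                best = max(best, sum(v for v, _ in combo))
    return best

fast = greedy_value(items, capacity)    # 160: greedy grabs the densest items
exact = optimal_value(items, capacity)  # 220: the true optimum skips the densest item
ratio = fast / exact                    # how close the quick answer gets
```

On real problem sizes the exhaustive check is only feasible on small sampled instances, which is exactly how such quality ratios are usually estimated in practice.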


The Impact of IoT on Data Science and Analytics

Of all the trends that will shape data science and analytics over the next few years, the Internet of Things (IoT) promises to have the most profound impact of all. In keeping with the publishing tradition of saying goodbye to one year, and hello to a new one, with top prediction lists, I have compiled my thoughts on what is in store for the IoT in 2016 (and beyond). ... Disintermediation will be the word of 2016, as the tipping point has been reached and companies start reaping the benefits of their investments in automated decisioning systems, which will evolve into AI (yet another way in which robots will emerge). By removing human touch points from especially administrative processes, massive layoffs will make headlines. And public debate will rise on the ethical aspects of robotization in our society.


Will new regulations affect fintech startups in Hong Kong?

Fintech, or financial technology: the ubiquitous yet mysterious word usually references startups that are using technology to provide alternative or “disruptive” financial solutions. Fintech has been a buzzword of late, rising as one of the most compelling sectors in Hong Kong’s startup scene. Attend any startup event or conference touting creativity and you’ll hear something along the lines of how Hong Kong is well positioned to rise as a fintech hub akin to Singapore and Australia. This would be exciting, as fintech can ease payment processes and money transfers, generally moving a traditional industry forward. Hong Kong has already seen an increase in fintech-centric accelerators:



The Biggest Security Threats We’ll Face in 2016

Data sabotage can be much more difficult to detect than the kind of physical destruction caused by Stuxnet. That’s because data alterations can be so slight yet have enormous consequences and implications. Anyone remember the Lotus 1-2-3 bug back in the 90s that would produce accounting miscalculations in spreadsheets under certain conditions? That was an unintentional error. But attackers could get into financial and stock-trading systems to alter data and force stock prices to rise or fall, depending on their aim. ... There’s no evidence yet that the Juniper backdoor was installed by the NSA; it’s more likely that an NSA spying partner—possibly the UK or Israel—or a US adversary installed it.


Continuously Improving Your Lean-Agile Coaching

Rachel Davies describes a compatible approach in her book Agile Coaching. She was an original member of one of the first, most prolific, longest-running extreme-programming (XP) teams at ConneXtra, and she is now an agile coach at Unruly. Joseph Pelrine, a computer and social-complexity scientist and leading XP expert, has a similar perspective. He uses complexity science to explain the theory behind agile and Scrum and uses social complexity to guide and coach self-organising teams. This view aligns with the coach role that Kent Beck, creator of XP, gives in the book Extreme Programming Explained. ... While each of these authors expands on some aspect of lean-agile coaching, they all describe a role with all the aspects I mentioned before: the hard and soft skills, teaching/training to support growth and development, and guidance and support during delivery.


Cybersecurity and the Twenty-First Century Board of Directors

Just as boards require directors with expertise in finance, today they must also have directors with experience in cybersecurity. It takes qualified board members to provide oversight by knowing what questions to raise and how to assess management's responses. Symantec recognized this a few years ago when they recruited U.S. Air Force General Suzanne Vautrinot to the board. During her 31-year tenure with the USAF, "she led large-scale, diverse, global organizations that operate, extend, maintain, and defend global networks." Not only does General Vautrinot have the advanced degrees and board experience, she also "oversaw a multi-billion dollar global cyber enterprise with 14,000 military, civilians, and contractors."


Simpler management motivates Ethernet SAN investments

"It encouraged the SAN and Ethernet networking teams to work more closely together and closely with the systems team," Masseth said. Before, the FC guys thought the Ethernet team didn't understand lossless, or the importance of uptime. Now, Masseth said, whenever UA has an outage, incident or complex build to tackle, fingers aren't pointing at the other team. "They roll up their sleeves and find and fix the problems." Ethernet-based storage protocols offer attractive cost profiles through reuse of existing network infrastructure and management expertise, said IDC's Casemore. Lower cost when scaling out storage infrastructure was important to 35% of survey respondents when considering Ethernet SAN options. Masseth found that FCoE did reduce costs for UA, but this might not be the case for other builds, he warned.


Async await with Web Forms over multiple postbacks

Support for asynchronous procedures (using async / await) is great in the C# and VB languages, and works very well in desktop (WinForms, WPF, console and other) applications. But there are web applications, like those based on ASP.NET Web Forms, where support for asynchronous procedures is much less exposed. Microsoft itself states that support for asynchrony (using async / await or other methods) is limited to offloading worker threads and increasing throughput, and that all asynchronous procedures must complete their work before the rendering phase (generating HTML). But there are also much more useful scenarios, where an asynchronous procedure is started in one postback and completed in a subsequent postback, so its execution spans multiple postbacks. This allows for UI-driven asynchronous processing, which is useful for many web apps.
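The article's pattern is specific to ASP.NET Web Forms (C#/VB), but the "start work in one request, collect the result in a later one" idea can be sketched language-agnostically. The asyncio snippet below simulates two postbacks against one long-running task; all names here are illustrative, not part of any ASP.NET API.

```python
# Language-agnostic analog of the multi-postback pattern: "postback 1"
# starts a long-running task and returns immediately; a later "postback"
# checks whether the task has finished and, if so, renders the result.
import asyncio

async def long_running_work():
    await asyncio.sleep(0.05)  # stands in for a slow backend call
    return "report ready"

async def simulate_postbacks():
    # Postback 1: kick off the work without awaiting it, respond right away.
    task = asyncio.create_task(long_running_work())
    first_response = task.result() if task.done() else "page rendered, work still running"

    await asyncio.sleep(0.1)   # time passes between postbacks

    # A subsequent postback: the work has had time to finish.
    second_response = task.result() if task.done() else "still running"
    return first_response, second_response

first, second = asyncio.run(simulate_postbacks())
```

In the Web Forms case the task handle would have to be stashed somewhere that survives between requests (session state or a server-side cache), which is the hard part the article goes on to address.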


Thriving On Data #4 - Data Apart Together

The modern reality of technical evolution is that the conservative approach is the one that embraces delivery and operation changes more aggressively and focuses less on individual technologies. This view on information underpins the Business Data Lake which Capgemini co-innovated with Pivotal in 2013 and has since been adopted by both Informatica and EMC. Data Apart Together is not simply about how you combine data; it’s about recognizing that information often is apart for a reason. It is partner data they don’t want to share in the raw, it’s personal information that has to be kept in a specific geography, or it’s separate due to an acquisition, and the sheer volume of information makes it unreasonable to coalesce into a single environment. Data Apart Together is therefore about how you enable different business units to derive insight across all the available information and not simply that which is directly available.


With potential Ford tie-up, Google looks to take back self-driving car lead from Tesla

For Google, the move would get its technology into more vehicles without having to invest large amounts of capital in manufacturing. And it's clear that Google wants to move the project forward: next year it will structure its self-driving vehicle division as its own company under the new parent corporation, Alphabet, potentially including a carpooling competitor to Lyft or Uber. When asked what he thought of the potential partnership, Gartner analyst Ken Dulaney said that it makes "lots of sense." "Google needs the experience of production distribution and marketing, and Ford needs the breakthrough talents of Silicon Valley," Dulaney said.



Quote for the day:


"The result of long-term relationships is better and better quality, and lower and lower costs." -- W. Edwards Deming


December 31, 2015

7 Technology Resolutions For a Better 2016

It's 2016 and there's a feeling of hope and renewal in the air. That can mean only one thing: It's time for some New Year's resolutions. What did you vow to change this year? Are you going to learn a new skill? Pay off your credit card debt? Lose 40 pounds? Whatever your plans are, don't forget to throw in a few resolutions that involve the technology in your life. The best part of tech resolutions is they're fairly easy to keep and can improve your life almost right away. We've got seven suggestions below on how to make technology central to your plans for an awesome 2016.


Immutable Layers, Not (Just) Infrastructure

Immutable infrastructure is an effective application delivery workflow, but the deployment process is slow and resources are often underutilized. Building images and creating new hosts on every application update leads to a deploy process that can take 10 minutes or more. If there's only one service per host, it usually does not use all the available CPU, memory, and disk resources on the host. If only 20% of resources are used, the remaining 80% is wasted expense. Schedulers like Nomad, Mesos, and Kubernetes allow organizations to split applications and infrastructure into separate immutable layers, which speeds up deployment times and increases resource density, while still maintaining the benefits of immutable infrastructure.


Software Licensing Audits: Is Your Company Prepared?

Violating license agreements can be expensive. Six defendants recently pled guilty in a software piracy case worth more than $100 million. While it wasn't an enterprise company left holding the bag, the outcome illustrates the consequences. In enterprise settings, most noncompliance is unintentional, which usually means the company did a poor job of managing its software licenses. If a company does not go to court, at minimum it will likely have to "true up," which means paying any licensing fees owed for software overuse. That can easily mean six or seven figures in large organizations. Licensees may also be subject to fines and penalties outlined in the license agreement. If the matter goes to litigation, the causes of action usually include breach of contract and copyright infringement, whether the noncompliance was willful or negligent.


How will blockchain technology transform financial services?

Bitcoin’s open source blockchain, described as a “permissionless” system, means it is decentralised and open to anyone. UBS and Microsoft are both working with blockchain start-up Ethereum, which runs a similar open source technology, and allows for the smart contracts that can execute trades automatically. But many in banking, wary of losing their grip over operations or of upsetting regulators, see the future in closed, or permissioned-only, networks. Almost two dozen of the world’s largest banks, including JPMorgan, UBS and Barclays, have thrown their weight behind R3 CEV, a start-up venture, to set up a private blockchain open only to invited participants who between them maintain and run the network. It forms part of an effort to build an industry-wide platform to standardise use of the technology.


Waterfall-to-Agile transition: Five tips from Bose

Bose adopted Agile development in 2003, after having become frustrated with the Waterfall methodology. CIO Rob Ramrath said Bose had trouble with Waterfall's "unknown unknowns," or ambiguity inherent in the process. He said the company "made a decision to burn the ships," and move to Agile development, which, he said, provides a structured way to avoid ambiguity in the development process and to pull business customers into the development process. Here are five recommendations that came out of the panel for a Waterfall-to-Agile transition:


Metatheory and enterprise-architecture

By contrast, a metatheory provides a consistent description of the context-space itself – the parameters and trade-offs underlying that context-space, much as above. As a theory-of-theory, a metatheory provides a frame in which to identify where each type of theory would work well – and where it wouldn't. Hence on complexity, for example, Roger Sessions argues that we should aim to eliminate all complexity; John Seddon argues instead that we should aim to embrace complexity, and that trying to eliminate it only makes things worse. Which of them is right? If we were looking for a single consistent theory, one of them surely must be wrong? But actually they're both right – in the right types of context. Equally, both of them are wrong – for the wrong type of context. To use a well-worn architects' expression, "It depends"…


Microsoft outlines its cloud and server integration roadmap for 2016

Microsoft is planning to make generally available its tenth release of BizTalk Server in the fourth quarter of 2016. Before that, in Q2 of next year, Microsoft will release a Community Technology Preview of BizTalk Server 2016, followed by a beta of that product in Q3. BizTalk Server 2016 will align with Windows Server 2016 (due out in Q3 next year), SQL 2016, Office 2016 and the latest version of Visual Studio. The latest BizTalk release will support SQL 2016's AlwaysOn Availability Groups both on-premises, as well as hosted on Azure. Microsoft plans to tighten the integration between BizTalk Server and various application programming interface connectors, such as the ones to Salesforce.com and Office 365, to enhance hybrid on-prem/cloud scenarios.


Hybrid Cloud – Taming the Digital Dragon

Often the somewhat rigid structures and processes of IT departments, such as fixed employment or procurement via RFP, are not the most innovation-encouraging. This means CIOs can easily source their functional requirements, such as delivery and support of an SAP system, but they struggle to engage personnel who can devise innovative new ways in which it might be used to extend a successful digital strategy. That is a much rarer skill set, and it is often concentrated in small startup businesses, where domain experts seek to exploit their considerable knowledge assets. They in turn urgently need reference clients and problem statements to build their businesses around, and so a very dynamic fusion can be achieved around shared goals of digital innovation.


An Evaluation Guide to Application Lifecycle Management Software Solutions

Application Lifecycle Management tools can help improve software quality, cut costs, shorten time to market, and enhance collaboration by clearly outlining workflows and helping you stay on top of your artifacts and processes throughout the lifecycle. What's more, advanced integrated ALM solutions also offer simple ways to export reports, greatly facilitating compliance audits in the above-mentioned safety-critical industries. Right, so there's a solution (or at least significant help) available for some of the most pressing difficulties you're facing. Should you just run to the store, grab an ALM platform off the shelf, and sit back to watch the extra money flowing in? Well, here's the thing: Application Lifecycle Management software comes in all shapes and sizes, and finding the one that perfectly suits your processes can be difficult.


The Tech That Will Change Your Life in 2016

Voice-operated electronics are poised for a quantum leap in accuracy and intelligence in 2016. Talking offers a more natural way to interact with devices that need complex input but aren’t exactly keyboard-friendly, such as TVs, sound systems and household electronics. Voice arrived in a big way in 2015 when Microsoft’s Cortana virtual assistant came to Windows 10, while Siri and Google Now turned up in cars and TVs. This year, expect voice control on more computers and an even wider range of gadgets, including the CogniToys Dino, a toy that uses IBM’s Watson to help answer questions, and Jibo, a talking family robot.



Quote for the day:



"Be a leader to be remembered, make people feel good about themselves and increase their belief in their own abilities" -- Gordon Tredgold