Showing posts with label ebook. Show all posts

Daily Tech Digest - September 30, 2017

Securing Applications: Why ECC and PFS Matter

Many of us are familiar with Hypertext Transfer Protocol Secure (HTTPS), which uses the cryptographic protocol Transport Layer Security (TLS) to secure our communication on the Internet. In simple terms, there are two keys: one available to everyone via a certificate, called the public key, and one available only to the recipient of the communication, called the private key. When you want to send encrypted communication to someone, you use the receiver’s public key to secure that communication channel. ... The benefit of securing our communication to prevent snooping of sensitive data is obvious; however, encrypting the communication has its downsides: it is computationally expensive, requiring significant CPU processing, and encrypted channels may also be abused to send proprietary information out undetected.
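The "forward secrecy" part of PFS comes from both sides generating a fresh, throwaway key pair per session, so a later compromise of a server's long-term key cannot decrypt past traffic. As a rough sketch of that idea, here is a toy Diffie-Hellman exchange in Python. The parameters are illustrative only (a Mersenne prime and a small base, nothing cryptographically vetted); real TLS with PFS uses ECDHE over curves such as P-256 or X25519.

```python
import secrets

# Toy parameters -- illustration only, NOT secure. 2**521 - 1 is a known
# Mersenne prime; real deployments use standardized groups or elliptic curves.
P = 2**521 - 1
G = 5

def ephemeral_keypair():
    # A fresh private exponent per session is what makes the secrecy "forward":
    # compromising a long-term key later reveals nothing about past sessions.
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_keypair()   # e.g. the browser
b_priv, b_pub = ephemeral_keypair()   # e.g. the server

# Each side combines its own private key with the peer's public key;
# both arrive at the same shared secret without it ever crossing the wire.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```

Because the private exponents are discarded after the session, an eavesdropper who records the traffic and later steals the server's certificate key still cannot reconstruct the shared secret.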


DNSSEC key signing key rollover: Are you ready?

“There may be multiple reasons why operators do not have the new key installed in their systems: some may not have their resolver software properly configured and a recently discovered issue in one widely used resolver program appears to not be automatically updating the key as it should, for reasons that are still being explored,” ICANN says. It could also be an awareness issue—that enough operators were not aware of the deployment process. “ICANN is on schedule to begin using the private portion [for signing domains] shortly,” Vixie says. The most challenging part of this multistep, multi-year process was overseeing the plan’s development, seeking broad review and approval, and obtaining approvals from multiple internet governance organizations to execute the plan, Vixie says.


Finally, a Driverless Car with Some Common Sense

A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway. A human driver would have likely quickly and safely figured out what was going on. Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says. iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this.


Banking on machine learning

Machine learning refers to the use of mathematical and statistical models to teach machines about new phenomena. It involves ingesting raw information in large datasets, understanding patterns and correlations and drawing inferences. While this may seem similar to how humans learn, machine learning algorithms ‘learn’ at much faster speeds with the ability to adapt from mistakes and course-correct. Needless to say, there are numerous applications of ML in any banking field that requires repetitive work, high-accuracy tasks or even informed decision-making. Take data security, which is a key concern for banks. Deep Instinct, a cyber security company that leverages deep learning for enterprise security, states that new malware often contains code that is similar to previous versions.
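The "ingest raw data, find the pattern, course-correct from mistakes" loop described above can be illustrated with the simplest possible learner: gradient descent fitting a line to noisy synthetic data. This is a minimal sketch, not any bank's actual pipeline; the data and learning rate are invented for illustration.

```python
import random

random.seed(0)

# Synthetic "raw information": noisy samples of an underlying pattern y = 3x + 1.
data = [(x, 3.0 * x + 1.0 + random.gauss(0, 0.1))
        for x in [i / 10 for i in range(50)]]

# Start with no knowledge and repeatedly course-correct from the error.
w, b = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w, b = w - lr * gw, b - lr * gb

# The algorithm has "learned" the pattern: slope near 3, intercept near 1.
assert abs(w - 3.0) < 0.2
assert abs(b - 1.0) < 0.2
```

The same correct-from-error loop, scaled up to millions of examples and far richer models, is what lets ML systems adapt at speeds no human analyst can match.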


The business case for digital supply networks in life sciences


Unlike traditional supply chains, which are linear and siloed, digital supply networks are dynamic, interconnected systems that can more readily incorporate ecosystem partners and evolve over time. This shift from linear, sequential supply chain operations to an interconnected, open system of supply operations could lay the foundation for how life sciences companies compete in the future. Digital supply networks in life sciences can address challenges with optimal management of inventories, reliability, and visibility of products moving across the supply chain, or operations efficiencies and product yields. In view of the forces affecting life sciences—pricing pressures, the emergence of value-based and personalized medicine, and the expectations of customers and regulators—creating a life sciences digital supply network can be a logical new opportunity to deliver value.


6 ways to make sure AI creates jobs for all and not the few

Whenever I talk to people about the potential impact of artificial intelligence (AI) and robotics, it’s clear there is a lot of anxiety surrounding these developments. And no wonder: these technologies already have a huge impact on the world of work, from AI-powered algorithms that recommend optimal routes to maximize Lyft and Uber drivers’ earnings; to machine learning systems that help optimize lists of customer leads so salespeople can be more effective. We’re on the verge of tremendous transformations to work. Millions of jobs will be affected and the nature of work itself may change profoundly. We have an obligation to shape this future — the good news is that we can. It’s easier to see the jobs that will disappear than to imagine the jobs that will be created in the future but are as yet unknown.


Free ebook: Data Science with Microsoft SQL Server 2016


SQL Server 2016 was built for this new world, and to help businesses get ahead of today’s disruptions. It supports hybrid transactional/analytical processing, advanced analytics and machine learning, mobile BI, data integration, Always Encrypted query processing capabilities and in-memory transactions with persistence. It integrates advanced analytics into the database, providing revolutionary capabilities to build intelligent, high-performance transactional applications. Imagine a core enterprise application built with a database such as SQL Server. What if you could embed intelligence, i.e., advanced analytics algorithms plus data transformations, within the database itself, to make every transaction intelligent in real time? That’s now possible for the first time with R and machine learning built into SQL Server 2016.


Cloud Computing Security: Provider & Consumer Responsibilities

The first step cloud service providers take is to secure the data center where they host the IT hardware for the cloud: securing it against unauthorized access, interference, theft, fire, flood and so on. The data center is also built with redundancy in essential supplies (for example, backup power and air conditioning) to minimize the possibility of service disruption. In most cases, providers offer cloud applications from ‘world-class’ data centers. The cloud provider also ensures that its infrastructure and services comply with applicable laws and standards such as data protection laws, the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), Criminal Justice Information Services (CJIS) requirements, the Sarbanes-Oxley Act, the Federal Information Security Management Act of 2002 (FISMA) and so on.


Want to be a better security leader? Embrace your red team

Successful business leaders understand the power of disruption as a pathway to anticipating unstated future customer needs. The concept of disruption as a force for innovation is powerful in the field of cybersecurity and often pushes business leaders to problem solve in new or unexpected ways. Proactively simulating attacks on your own organization is an excellent example.  With now-broad acceptance that attackers will get in and that compromise is expected, there are distinct advantages to being “productively paranoid.” Security leaders who are productively paranoid fully embrace the idea that the best way to play defense is to start playing offense. This doesn’t mean companies should “attack back,” but they need to understand the mindset and pathways attackers take to infiltrate organizations.


The digital workplace: 8 steps to greater agility, productivity

What is the digital workplace? It is a business strategy aimed at boosting employee engagement and agility through consumerization of the work environment, Rozwell says. Think of your one-size-fits-all-users ERP or expense management applications and imagine the opposite user experience. Your digital workplace should help individuals and teams work more productively without compromising operations. It should include computers, mobile devices and productivity and collaboration applications that are web-based and sync in real time. Such tools should, for example, mimic the ease of use of Uber and Airbnb and the social aspects of Facebook and Instagram. IBM, for one, has undertaken a massive transformation of its workplace to lure new tech talent.



Quote for the day:


"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements." -- Brian Kernighan


March 20, 2016

Cyber-Insurance: Is It Right for Your Business?

As a result of actual and threatened events, the insurance market has responded with a new product to protect businesses from data breaches: cyber-insurance. Traditionally, businesses sought coverage for losses from data breaches under commercial property, commercial general liability, and business interruption policies for first-party losses, and under commercial liability and directors and officers liability policies for third-party losses. However, in the late 1990s, insurers began offering cyber-insurance in the form of standalone policies. Yet, despite recent data breaches, only 20 to 30 percent of American firms purchase cyber-insurance. The case law interpreting these policies is scarce, as courts struggle to define the parameters of cyber-liability. Courts are increasingly allowing plaintiffs to file creative claims against businesses in the wake of data breaches.


The future of computing

Moore’s law was never a physical law, but a self-fulfilling prophecy—a triumph of central planning by which the technology industry co-ordinated and synchronised its actions. Its demise will make the rate of technological progress less predictable; there are likely to be bumps in the road as new performance-enhancing technologies arrive in fits and starts. But given that most people judge their computing devices on the availability of capabilities and features, rather than processing speed, it may not feel like much of a slowdown to consumers. For companies, the end of Moore’s law will be disguised by the shift to cloud computing. Already, firms are upgrading PCs less often, and have stopped operating their own e-mail servers. 


Big Data for Governance - Implications for Policy, Practice and Research

This predicted growth is expected to have a significant impact on all organizations, whether small, medium or large, including exchanges, banks, brokers, insurers, data vendors and technology and services suppliers. It also extends beyond the organization, with increasing focus on rules and regulations designed to protect a firm’s employees, customers and shareholders as well as the economic wellbeing of the state in which the organization resides. This pervasive use and commercialization of big data analytical technologies is likely to have far-reaching implications for meeting regulatory obligations and governance-related activities.


XGIMI projects innovation in Android entertainment

The device is ideally suited for a number of use cases, including families or students who are space-constrained or cannot accommodate a permanently wall-mounted television. It incorporates a 700 ANSI lumens OSRAM LED projector element which is fully capable of both 1080p and 4K output, and can render 3D video, with up to a 300-inch diagonal projection area. The device can also act in a "Business" mode where data from a cloud storage service or from local USB can be presented using a built-in Microsoft Office-compatible viewer. Of course, with Office 365 for Android, it's also possible to run the real thing, provided you have the right packages installed. In its pre-production configuration the device was shipped to me with a modified version of Android 4.3 using AOSP-based sources. The device uses a 1.5 GHz quad-core ARMv7-based SoC, comparable to what might be used in a high-end smartphone.


Beyond Bitcoin: The blockchain revolution in financial services

Interest in the technology exploded when it became clear that blockchain can be used to document the transfer of any digital asset, record the ownership of physical and intellectual property, and establish rights through smart contracts, among other applications. By reordering and automating complex, labor-intensive processes, the technology can enable organizations to operate both faster and more cheaply. Financial institutions are exploring a variety of opportunities to use blockchain, including applications to improve and enhance currency exchange, supply chain management, trade execution and settlement, remittance, peer-to-peer transfers, micropayments, asset registration, correspondent banking and regulatory reporting.


Who Are the Bad Guys and What Do They Want?

Recent breaches at the Internal Revenue Service are a stark reminder that cyber crime is alive and well. According to the Center for Strategic and International Studies, cyber crime and cyber espionage cost the global economy between $375 billion and $575 billion annually, or roughly 1% of global income. So who are those guys and what do they want? Based on interviews with several cyber security experts, this O’Reilly report provides a concise and highly informative look into the various actors who populate this murky world. You’ll explore some of their methods and motivations, as well as new approaches from both the US government and the private sector to help organizations manage cyber security more aggressively. ... Get a copy of this report and find out what your organization can do to deal with this ongoing threat.


Defend against ransomware with 3 easy steps

The fight to secure your business is a never-ending battle. Ransomware is a particular strain of malware that quietly works in the background to encrypt user documents with a secret cryptographic key kept at a remote location, and threatens to release this key only upon payment to the perpetrators. What has changed most about this malware is its increasing sophistication and prevalence, along with its use of robust encryption schemes that offer little hope of recovery by the time the nefarious encrypting work is completed. According to Software Advice, businesses are taking note of the risks surrounding this malware. Sixty-seven percent of business decision-makers claim they'd never pay a ransom to regain access to infected files, yet only 23 percent say they're "very confident" their data is secure from ransomware attacks.
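One of the standard defenses implied above is keeping versioned, verifiable copies of important files, so that a mass encryption event is both detectable and recoverable. The sketch below (illustrative only, not a product) snapshots a directory and records content hashes; any file later modified, say by hostile re-encryption, shows up as changed.

```python
import hashlib
import os
import shutil
import tempfile

def snapshot(src_dir, dest_dir):
    """Copy each file to a backup directory and record its SHA-256 digest."""
    manifest = {}
    os.makedirs(dest_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        if os.path.isfile(src):
            shutil.copy2(src, os.path.join(dest_dir, name))
            with open(src, "rb") as f:
                manifest[name] = hashlib.sha256(f.read()).hexdigest()
    return manifest

def changed_files(manifest, src_dir):
    """Report files whose content no longer matches the recorded digest."""
    out = []
    for name, digest in manifest.items():
        with open(os.path.join(src_dir, name), "rb") as f:
            if hashlib.sha256(f.read()).hexdigest() != digest:
                out.append(name)
    return out

# Demo in throwaway temp directories.
work, backup = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(work, "doc.txt"), "w") as f:
    f.write("quarterly report")
m = snapshot(work, backup)
with open(os.path.join(work, "doc.txt"), "w") as f:
    f.write("ENCRYPTED!!")           # simulate hostile modification
print(changed_files(m, work))        # -> ['doc.txt']
```

In practice the backup destination must be offline or otherwise unreachable from the infected host, since ransomware that can see a mounted backup drive will encrypt it too.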


Spark in Action Book Review & Interview

Project Tungsten is one of these efforts under the “get Spark as close to bare metal as possible” umbrella, where the goal is to remove any general-purpose software between Spark and the operating system (Tungsten allows Spark to bypass the JVM and do memory management by itself). Tungsten makes a lot of sense, mainly because it makes a large class of JVM-related problems go away, garbage collection being the main one. Since end users are not managing memory manually, there’s no risk of segmentation faults, so the full potential is there to give Spark arbitrarily large chunks of off-heap memory, with significant performance improvements and no downsides visible from the end-user perspective.
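For a concrete sense of what "off-heap" means operationally: Spark exposes configuration properties that let it allocate memory outside the JVM heap (and thus outside the garbage collector's reach). A spark-defaults.conf fragment might look like the following; the size shown is purely illustrative.

```properties
# Allow Spark to allocate memory outside the JVM heap (Tungsten-managed).
spark.memory.offHeap.enabled   true
# Upper bound for off-heap allocation; tune to the workload and host.
spark.memory.offHeap.size      2g
```

Because this memory never enters the heap, large cached datasets stop inflating GC pause times, which is exactly the class of JVM problem the excerpt describes Tungsten eliminating.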


Why Central Banks Should Start Issuing Electronic Money

The Bank of England currently issues central bank money reactively: it issues banknotes in whatever quantities are needed to meet demand from the public, and issues central bank reserves in order to meet demand from the banks. It could choose to issue digital cash in the same way, by providing the infrastructure for Digital Cash Accounts but letting the public determine how to split their holdings of money between bank deposits and digital cash. ... Alternatively, by taking a proactive approach to issuance, the Bank of England could use digital cash as a monetary policy tool to stimulate aggregate demand and influence the economy. If every citizen had a Digital Cash Account at the Bank of England (either directly or indirectly), then it would be a simple process for the Bank of England to make small and occasional ‘helicopter drops’ of newly created digital cash to every citizen.


What Should Data Scientists Know About Psychology?

How data is collected informs what we can conclude from it. Many methodological confounds limit what can be extrapolated from data, and they must be addressed to maximize the ecological validity of whatever is concluded. Implementing quality assurance in data collection, so that what is supposed to be measured is indeed being measured, requires manipulation checks, quality testing and research. How the data is then coded and quantified creates another lens of possible distortion: poor measurement cannot be fixed post hoc in already-collected data. Furthermore, because statistics requires a calculated assumption of error (unlike formal mathematics), how one implements data mining and management, decides on appropriate statistical analyses and interprets the results is of utmost importance in a field of scientific inquiry.
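The claim that poor measurement cannot be fixed after the fact has a well-known statistical face: attenuation. Noise in how a variable is measured systematically weakens the correlations you observe, and no post-hoc analysis of the noisy data recovers the lost signal. A small simulation (synthetic data, invented noise levels) makes the effect visible:

```python
import random
import statistics

random.seed(1)

def corr(xs, ys):
    """Pearson correlation, computed directly from the definition."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

n = 5000
truth = [random.gauss(0, 1) for _ in range(n)]          # the real quantity
y = [t + random.gauss(0, 0.5) for t in truth]           # an outcome driven by it

clean = corr(truth, y)                                   # well-measured predictor
noisy = corr([t + random.gauss(0, 2) for t in truth], y) # badly measured predictor

# Measurement error attenuates the observed relationship.
assert noisy < clean
```

The relationship in the data-generating process is identical in both cases; only the measurement quality differs, which is why quality assurance at collection time matters more than any later statistical cleverness.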



Quote for the day:


"Only those who attempt the absurd will achieve the impossible." -- M.C. Escher

November 25, 2015

Russian financial cybercrime: how it works

With online financial transactions becoming more common, the organizations supporting such operations are becoming more attractive to cybercriminals. Over the last few years, cybercriminals have been increasingly attacking not just the customers of banks and online stores, but the enabling banks and payment systems directly. The story of the Carbanak cybergroup, which specializes in attacking banks and was exposed earlier this year by Kaspersky Lab, is a clear confirmation of this trend. ... Information on the number of attacks may indicate the extent of the problem but does not reveal anything about who creates them and how. We hope that our review will help to shed light on this aspect of financial cybercrime.


The State of Millennials Worldwide

As the survey’s authors noted, aspiring to the freedom brought by self-employment while still living with or being supported by family is an age-old contradiction. They found that in Asia and Latin America these aspirations align with cultural norms, but their spread to North American cities, where this historically hasn’t been the case, is a strong indicator for municipal leaders to find ways to support this growing segment of their constituents. "Young people may respond positively to policies or programs that foster a mind-set of measured risk for personal or global growth, while laying the groundwork for long-term stability," the authors found.


In Machine Learning, What is Better: More Data or better Algorithms

“In machine learning, is more data always better than better algorithms?” No. There are times when more data helps, and there are times when it doesn’t. Probably one of the most famous quotes defending the power of data is that of Google’s research director Peter Norvig, claiming that “We don’t have better algorithms. We just have more data.” This quote is usually linked to the article “The Unreasonable Effectiveness of Data”, co-authored by Norvig himself (you should probably be able to find the PDF on the web, although the original is behind the IEEE paywall). The last nail in the coffin of better models is when Norvig is misquoted as saying that “All models are wrong, and you don’t need them anyway.”
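One regime where more data reliably helps is simple estimation: sampling error shrinks roughly like 1/sqrt(n). The toy simulation below (synthetic data, invented parameters) shows the effect, and also hints at its limit: once sampling error is far below the noise floor or the model's own bias, extra data stops paying and a better algorithm is the only way forward.

```python
import random
import statistics

random.seed(0)

def estimate_error(n):
    """Error of estimating a known population mean (10.0) from n noisy samples."""
    sample = [random.gauss(10.0, 3.0) for _ in range(n)]
    return abs(statistics.fmean(sample) - 10.0)

# Average the estimation error over many repetitions at two sample sizes.
small = statistics.fmean(estimate_error(20) for _ in range(200))
large = statistics.fmean(estimate_error(2000) for _ in range(200))

# More data -> closer estimates, on average (roughly a 10x improvement here).
assert large < small
```

The dispute Norvig's quote feeds is really about which regime you are in: data-starved problems where n dominates, versus bias-dominated problems where only a better model moves the needle.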


Artificial Intelligence: 10 Things To Know

"We are trying to make a system which at first sight looks like it might be behaving in some manner that we might ascribe to intelligence," said Moore. "Everything, however, with 'artificial' in the label is actually just a really, really, really fancy calculator, all the way from chess programs to software in cars, to credit-scoring systems, to systems that are monitoring pharmaceutical sales for signs of an outbreak." ... "And people are making bad decisions, which are costing huge numbers of lives every year, by not going to physicians under some circumstances or not letting a doctor know about something important or mismanaging their medications."


Composable Infrastructure: Cutting Through the Noise

By separating the physical components of the server, those resources can be pooled and programmatically composed into a logical server, then subsequently decomposed, returning the elements to the pools for reuse. This breaking down of the server means not only that the most efficient and optimal use of resources can be made, but also that the lifecycle management of those resources is decoupled. So, in the case of M-Series, a next-generation CPU that with a traditional rack-mounted server would drive a complete replacement of the server requires only the replacement of the CPU, and possibly the DIMMs, to achieve an upgrade. Subsystems like the local storage, RAID controller, network adapter, power supplies, fans and cabling are preserved until upgrades of those respective elements would yield benefit to the business.
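The pool/compose/decompose lifecycle described above can be sketched as a tiny bookkeeping model. All names here are invented for illustration; real composable-infrastructure APIs are vendor-specific.

```python
class ResourcePool:
    """Tracks disaggregated components available for composition."""

    def __init__(self, **resources):
        self.free = dict(resources)          # e.g. cpus=8, nics=4

    def compose(self, **wanted):
        """Reserve resources and return them as a 'logical server'."""
        if any(self.free.get(k, 0) < v for k, v in wanted.items()):
            raise ValueError("insufficient resources")
        for k, v in wanted.items():
            self.free[k] -= v
        return dict(wanted)

    def decompose(self, server):
        """Return a logical server's resources to the pool for reuse."""
        for k, v in server.items():
            self.free[k] += v

pool = ResourcePool(cpus=8, nics=4)
server = pool.compose(cpus=4, nics=1)        # build a logical server
assert pool.free == {"cpus": 4, "nics": 3}
pool.decompose(server)                       # tear it down; nothing is wasted
assert pool.free == {"cpus": 8, "nics": 4}
```

The point of the model is the decoupling: replacing one resource type (say, swapping the CPU entry for a newer generation) changes nothing about the pooled NICs, storage or power, mirroring the M-Series upgrade story above.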


eBook: Foundations of Data Science, by Microsoft Research

The field of algorithms has traditionally assumed that the input data to a problem is presented in random access memory, which the algorithm can repeatedly access. This is not feasible for modern problems. The streaming model and other models have been formulated to better reflect this. In this setting, sampling plays a crucial role and, indeed, we have to sample on the fly. In Chapter ?? we study how to draw good samples efficiently and how to estimate statistical, as well as linear algebra, quantities with such samples. One of the most important tools in the modern toolkit is clustering, dividing data into groups of similar objects. After describing some of the basic methods for clustering, such as the k-means algorithm, we focus on modern developments in understanding these, as well as newer algorithms.
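Since the excerpt singles out k-means, here is a minimal one-dimensional version of the classic Lloyd's-algorithm formulation: assign each point to its nearest centroid, recompute centroids as cluster means, and repeat until nothing moves. This is a teaching sketch; production implementations add smarter initialization (e.g. k-means++) and handle higher dimensions.

```python
import random

random.seed(0)

def kmeans(points, k, iters=100):
    """Lloyd's algorithm on 1-D data; returns the sorted final centroids."""
    centroids = random.sample(points, k)     # naive initialization
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - centroids[i]))].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:                 # stable assignments -> done
            break
        centroids = new
    return sorted(centroids)

# Two well-separated groups of similar values.
data = [0.9, 1.0, 1.1, 1.2, 9.8, 10.0, 10.1, 10.3]
centroids = kmeans(data, 2)
assert abs(centroids[0] - 1.05) < 1e-9     # mean of the low group
assert abs(centroids[1] - 10.05) < 1e-9    # mean of the high group
```

On well-separated data like this, any starting pair of points converges to the two group means; the "modern developments" the book alludes to largely concern when such convergence is, and is not, guaranteed on harder inputs.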


Jai Ranganathan on architecting big data applications in the cloud

There are some fundamental design principles behind the original HDFS implementation which don’t actually work in the cloud. For example, the notion that data locality is fundamental to the system design starts changing in the cloud: when you look at the large cloud providers, they are doing all these software-defined networking tricks and can deliver bisection bandwidth of, like, 40 gigs per second across their data center. Suddenly, you’re talking about moving hundreds of terabytes of data back and forth between a storage and a compute layer without any huge performance penalty. Data locality loses its performance advantage, but that’s not as bad as you might think.


Security is the common theme in 2016 top IT projects

The heightened interest doesn't come as a surprise to experts. "Everyone's concerned with security issues due to the nature of what's been happening recently," said Turner, who works for a non-profit organization in western New York that's striving to better connect Medicaid patients with health care providers. After another turbulent year of high-profile breaches, including Ashley Madison, CVS and the Office of Personnel Management, security threats are top-of-mind for board members and CEOs alike, putting a spotlight on CIOs and senior IT leaders. For Vlasich, security and cloud computing, which ranked as a top IT project for the second year in a row, are intertwined thanks, in part, to rogue IT.


Java: The Missing Features

Java’s import syntax is quite limited. The only two options available to the developer are either the import of a single class or of an entire package. This leads to cumbersome multiline imports if we want some but not all of a package, and necessitates IDE features such as import folding for most large Java source files. ... Java’s arrays aren’t collections, and the "bridge methods" provided in the helper class Arrays also have some major drawbacks. For example, the Arrays.asList() helper method returns an ArrayList, which seems entirely reasonable, until closer inspection reveals that it is not the usual ArrayList but rather Arrays.ArrayList.


Key Methods for Ensuring FRCP Data Preservation Compliance

The new FRCP amendments introduce the notion of “reasonable” preservation effort to preserve data across all forms of enterprise communication. In court, organizations must prove they made reasonable efforts to prevent communications data, in any form, from being destroyed. Failure to do so will lead the court to the assumption that the information not preserved is harmful to your defense. By some estimates, eDiscovery costs U.S. organizations around $41 billion annually. Not only is this expensive, but it can also be a time-intensive exercise. So, how can organizations demonstrate “reasonable” preservation efforts? 



Quote for the day:


"Speaking about it and doing it are not the same thing." -- Gordon Tredgold


July 04, 2015

Recovering from a Storage System Failure
Protecting from storage system failure requires more than just copying data from point A to point B. To meet any reasonable recovery expectation requires that a secondary storage system be available to start the application on. The good news is secondary storage is very affordable today, and these systems can play a larger role than just being a standby for the primary storage system. But before implementing the secondary storage system, the IT planner needs to understand the acceptable recovery point objective (RPO) and recovery time objective (RTO) for the applications counting on that system. Once the RPO and RTO are understood, IT planners will know what method they should use to get data to the secondary storage system and what type of secondary storage system they should buy.
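The RPO-to-method decision can be sketched as a simple lookup. The thresholds and method names below are hypothetical, for illustration of the reasoning only; real choices depend on application criticality, budget and the vendor's actual capabilities.

```python
def replication_method(rpo_minutes):
    """Map an acceptable data-loss window (RPO) to an illustrative data-movement method."""
    if rpo_minutes == 0:
        return "synchronous replication"     # zero data loss tolerated
    if rpo_minutes <= 15:
        return "asynchronous replication"    # near-continuous copy
    if rpo_minutes <= 24 * 60:
        return "scheduled snapshots"         # periodic point-in-time copies
    return "nightly backup"                  # a day of loss is acceptable

assert replication_method(0) == "synchronous replication"
assert replication_method(5) == "asynchronous replication"
assert replication_method(120) == "scheduled snapshots"
```

RTO then constrains the secondary system itself: the shorter the allowed downtime, the closer the standby must be to production-grade performance, which is where the cost of that "affordable" secondary tier actually gets decided.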


When Big Businesses Collaborate in the Cloud, Rapid Innovation Will Follow
The industry can also expect to witness accelerated technological advancements and innovation — when big businesses collaborate, rapid innovation will inevitably follow. Competitive divides between managed cloud and commodity cloud are growing; businesses need to understand which is right for their needs. Those who don’t have either the in-house expertise or the inclination to develop it are wise to choose a managed cloud solution, for efficiency and to guarantee uptime. Competitive collaboration plus partnerships through channel programs up and down the stack drive value for end customers. In order to drive success with the majority of the SMB through Enterprise market, we all need to consider partnerships.


Coffee Shops and Home Routers Could Offer Nearby Phones a 4G Data Connection
It encodes data in the same way as the LTE technology used by cellular networks today, but is designed to be used over the same part of the radio spectrum as Wi-Fi and has roughly the same range. The company says it can provide faster, less glitchy connections than Wi-Fi because LTE was developed for cellular networks where performance and reliability are more crucial. MuLTEfire opens new possibilities because the radio bands that Wi-Fi uses are not reserved for the exclusive use of any company. LTE is used today only by cellular networks on radio bands licensed from governments at costs of millions or billions of dollars. MuLTEfire is also different in that—as with Wi-Fi—a MuLTEfire hotspot can serve any device, regardless of which cellular carrier it subscribes to.


What’s the Difference between M2M and IoT?
To quote Mark Andreessen, “Software is eating the world” in industry after industry and it’s no different in ours. Old school SCADA (Supervisory Control and Data Acquisition) hardware is being consumed by Internet of Things software. Often people ask, “What’s the difference between M2M and IoT?” Besides the scope of the networks – M2M lives on a local area network or no network while IoT lives on a wide area network – the other major difference is SCADA consists of proprietary hardware like programmable logic controllers whereas the heart of IoT is software.


Why the Internet of Things won't be about the 'things'
“If you think, ‘Oh, I’ll never need that,’” McQuivey said, pointing to a slide depicting the Apple Watch, “get that out of your system.” Forrester surveyed consumers about the Apple Watch upon its release for preorder in April, and 26%—representing about 50 million people in the United States—said they expect to own a smartwatch someday. What’s more, “8% want their next car to be a self-driving car, and they don’t even exist yet,” McQuivey said. “This is a consumer base that has learned the lesson of the last 10 years: ‘New stuff is going to come out, and it’s going to benefit me,’” he added. “We are all early adopters now.”


Planning Cloud Analytics: conceptual architecture alternatives
Alternative architectures will result in cost variations for a cloud analytics solution. Choosing an optimal Cloud analytics architecture requires consideration of key tradeoffs described below. In addition, it requires consideration of factors such as skills, licensing, complexity, and maintenance. ...
In contrast to placing all essential components on a single CSP’s platform, distributing the components among multiple CSP platforms allows leveraging the strengths of product offerings available through each separate CSP. This approach necessitates consideration of costs associated with multiple CSPs as well as careful evaluation of architecture due to the potential for increased complexity.


Singapore wants to be a living lab for smart startups
“Singapore has always positioned itself as a great base for global companies,” he says when we meet for coffee on his visit to London. “We’re working hard to make it a great base for startups too, and we’re talking about companies that have the opportunity to be big, because people often comingle ‘SME’ and ‘startup’ and really they’re night and day.” Leonard says that this is a great time to foster startups: “You can be anywhere in the world and you don’t need to fill it up with infrastructure.” However, he says he’s not interested in building some California clone convoy. “One thing we’re not trying to do is mimic Silicon Valley. We have some good raw materials such as the intersection of universities and businesses, investment capital and high mobile use, and that gives us nice young talent and companies that want to tap into that.”


Why trust, rather than security, could be the bigger barrier to Google cloud adoption
“Once in the Google cloud, you have no idea where your data is. You don’t and won’t know when it’s being moved as Google looks to ensure that data is stored efficiently and regularly shifts data from one location to another to maximise resources,” he added. This worries some customers, Hall said, as there is a risk their data could fall under the jurisdiction of another country with all the shifting around. Another concern, aired by a Computer Weekly reader, was that a company the size of Google is likely to be a top target for hackers, despite its assurances about the security resources it has in place to protect its platforms. “I have no doubt Google will have better security resources, but it will also be one of the top companies in the world on that target list,” the reader said.


Conditional Compilation in Universal Apps
Conditional compilation is the process of defining compiler directives that cause different parts of the code to be compiled and others to be ignored. In cross-platform development, this technique can be used to mark parts of the code to be compiled only for a particular platform. The #if, #elif, #else and #endif directives add these conditions to parts of a source file, letting the compiler skip sections based on the conditional clauses, so a single source file can serve all platforms.


eBook: Problem Management
So what is problem management and what does it aim to achieve? And how is it different from incident management? This extract from Michael G Hall’s book, Problem Management, an implementation guide for the real world, explains how problem management differs from incident management and looks at the differences between reactive and pro-active problem management. The book draws on the principles of ITIL.



Quote for the day:

“Being a leader is making the people you love hate you a little more each day.” -- Patrick Ness

May 24, 2015

6 Psychological Triggers That Make UX Design Persuasive
You must learn about human psychology to design compelling user experiences. If you understand how the human mind works, it’s easier to get people’s attention and keep it. It’s also easier to get them to take some form of action (like subscribing or buying). But how do you find out what goes on inside the mind of your users? Well, that’s where psychological triggers come in. They’re invisible forces that influence and persuade people. And when you use them in your design you can get more people to say yes to what you’re asking. In this post I’ll break down psychological experiments and academic research into simple, actionable steps that can help you design better experiences that lead to more sales online.


5 Smart Ways to Convince your CEO to Go Mobile
A common (and surprising) complaint I have been hearing is the difficulty CIOs and CTOs face in persuading CEOs to extend their business to mobile platforms. Today the mobile revolution seems to be pretty obvious just by looking at Apple’s performance. However, I realized that some are still reluctant to change their success formula. ... As mobile technology is evolving, we are finding newer ways to interact. Apple recently updated touch screens with force touch. Voice assistants are getting smarter, wearable devices have gained interest, fingerprint technology has become a lot better and new payment methods like Apple Pay are available. All these advancements have significant business applications. It’s important to be proactive and be a company that innovates instead of waiting for your industry to change dramatically and then reacting.


Security Concerns Extend to ‘Big Data Life Cycle’
The security flaws in Hadoop are well known. Apache Hadoop was an open source development project with little initial regard for security. As Hadoop’s security problems emerged, distributors and the Apache community began offering security add-ons for access control and authentication (Apache Knox), authorization (Apache Sentry), encryption (Cloudera’s Project Rhino) along with security policy management and user monitoring (the proposed Apache Argus based on Hortonworks‘ XA Secure acquisition). “Hadoop itself is very weak in security. You can be a Linux user and take all the data from Hadoop,” Manmeet Singh, co-founder and CEO of Dataguise, a provider of data masking and encryption tools for Hadoop, told Datanami last November. “The problem is the insider threat. Anybody can walk away with billions of credit card numbers.”


The City of Burnaby’s CIO offers an Internet of Things reality check
“What we’d like to get to is really to start thinking of these sensors more out of the optimization of the business process to how we can do things better as a city,” she said. This could include smarter traffic flows, remote proactive information before infrastructure fails, or being able to email citizens or send out tweets about something important happening in their area. Other use cases for sensor-based technologies today include the City of Burnaby’s pump stations, which Wallace said are remotely monitored for things like pressure, flow and depth. “What really interests me is the education piece: if your water usage has gone up by 20 percent, for example, this is what it means for our reservoir,” she said. “Some of these things we’re doing were not called the Internet of Things. They were just things good cities did.”


Banking on IT Governance: Benefits and Practices
In banking today, more systems, applications and services are exposed to the customer through self-service channels, which have a direct bearing on customer experience. They can create significant opportunities but increase the risk of poor performance. Thus, the quality of IT governance has become an important tool for managing risk and marketplace effectiveness. However, IT governance comes with a slew of risks, and the distinctions among them are blurred where people, process and technology merge. This can lead to a serious impact on operational effectiveness. There is a need for security governance within banks, which entails building a robust framework and laying down a comprehensive information security policy. In addition, it relates to creating a data loss prevention framework for minimizing data breaches.


Halamka and Branzell Urge CIOs To Be “Revolutionaries”
It is particularly difficult now that CIOs are pelted daily with new requests and demands from inside and outside their organizations, Halamka said. “People say, ‘OK, I get it, we need to be prepared for the accountable care future, we need to prepare for care management and care in the home, and even though there’s this cool project that some stakeholder wants, we really don’t have the bandwidth for that.’ And so what not to do” as a CIO “is as important as what to do, because each of us gets this laundry list of hundreds of things that stakeholders want.” He said with a bit of humor, “The technique I usually use is not to say ‘no’; ‘no’ is such a negative word, so loaded with emotion. So, I say, ‘not now.’” Meanwhile, he added, “My role on the resource side is not to create fear, uncertainty and doubt, but to explain to the board what we need to do.”


DAM and the Art of Governance
A good DAM manager, like a librarian who is differentiating between reference-only items and circulating materials, will keep records. These may take the form of spreadsheets or flowcharts in a secure location delineating user group permissions, asset restrictions, metadata fields both required and optional, workflows, controlled vocabulary terms and taxonomy structure. The most important aspects of the governance strategy are the organizational buy-in on the policies for digital asset management and the documentation of these rules. The benefit will be the ease of decision-making enabled by an established governance plan. Don’t worry about how formal or official these policies may be – the value is in having the discussions leading to the creation of the governance plan and simply in having it all written down.


Three Ways Data Breaches Are Reshaping Data Governance
With the public increasingly cognizant of the amount of personal data they share with businesses, the organizations that collect this information will need to do a better job of determining how much stored data could be potentially exposed in a breach. Businesses need more context around stored data and a stronger understanding of the type of personal information that is collected and how it is protected. Metadata analysis enables businesses to take stock and identify which systems interact with what data, where that data is stored, how much of it is personally identifiable data (PID), and more. This can reveal gaps in data security or material risk factors – a crucial capability for businesses that desire proactive breach mitigation.
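As a toy sketch of the metadata-analysis idea described above (the system names, field names and the PII list are all hypothetical), a business can inventory which systems hold what personal data without touching the data itself:

```python
# Hypothetical catalogue of which fields each system stores.
systems = {
    "crm":     ["name", "email", "purchase_history"],
    "billing": ["name", "card_number", "address"],
    "logs":    ["ip_address", "timestamp"],
}

# Fields this (hypothetical) organisation classifies as personally identifiable.
PII_FIELDS = {"name", "email", "card_number", "address", "ip_address"}

# For each system, which personally identifiable fields does it expose?
exposure = {system: sorted(set(fields) & PII_FIELDS)
            for system, fields in systems.items()}

# A system whose every field is PII is a high-value breach target.
risk_ranking = sorted(exposure, key=lambda s: len(exposure[s]), reverse=True)
```

Even this crude field-name intersection reveals where a breach would hurt most — here the billing system carries three PII fields versus one in the logs — which is the kind of gap analysis the excerpt argues for.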


Is MDM BI?
There is certainly a reliance on each other; however, a solid BI strategy cannot exist without MDM. Let's face it, a report is only as good as the data from which it is drawn. ... MDM ensures the data you present in your BI layer is clean, complete, consistent and de-duplicated. These data issues arise when you are combining several data sources, e.g. CRM, ERP, billing, stock, helpdesk, etc. Duplicates also arise as a result of fast-growing companies which, while on the acquisition path, acquire many new ERP and CRM systems along the way. ... If all this data is pulled into a data warehouse, the same customers will appear as different records and be counted as different customers. When creating BI views or reports, the data will be incorrect because of underlying problems in the source systems.
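A minimal Python sketch of the duplicate problem, using hypothetical customer records from two source systems — the same customer appears under slightly different spellings, so a naive count is wrong until a master-data rule merges the records:

```python
# Hypothetical customer records pulled from two source systems.
crm = [{"name": "ACME Ltd.", "email": "info@acme.com"},
       {"name": "Globex", "email": "sales@globex.com"}]
billing = [{"name": "Acme Limited", "email": "info@acme.com"},
           {"name": "Globex Corp", "email": "sales@globex.com"}]

# Naive union: every record looks distinct, so the customer count is inflated.
naive_count = len(crm) + len(billing)  # 4 "customers"

# A simple master-data rule: treat the normalised email as the golden key.
def golden_key(record):
    return record["email"].strip().lower()

master = {golden_key(r): r for r in crm + billing}
true_count = len(master)  # 2 real customers
```

Real MDM uses far richer matching rules (fuzzy name matching, address standardisation, survivorship rules for which field values win), but the principle is the same: resolve duplicates to a golden record before the data reaches the BI layer.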


e-Book: Managing Third-Party Risks
This e-Book, produced by Compliance Week in cooperation with ProcessUnity, reviews the latest thinking in vendor and third-party risk. It provides compliance professionals with everything they need to know about third-party risk management and how to avoid regulatory complications. In the first article, we explore topics from a recent executive forum, which discussed vendor risks and why building a systematic approach is important. Next, in “Four Keys to Creating a Vendor Risk Management Program That Works,” ProcessUnity deconstructs the idea of vendor risk management and provides four principles that compliance practitioners should follow. Then we examine what happens when third parties engage in bribery and corruption. “Mapping Third-Party Risks” discusses the size and scope of the third-party universe and why companies should have a plan to monitor their vendors.




Quote for the day:

"If you can’t handle others’ disapproval, then leadership isn’t for you." -- Miles Anthony Smith

May 21, 2015

Q and A on The Scrum Culture
Bluntly speaking, command and control is not compatible with Scrum. As soon as you allow Scrum to spread throughout the command and control enterprise, there is a clash of cultures and only one will survive. On the one hand command and control is more effective in a production line environment, and it is usually also the dominant approach in the organization. So it has the home field advantage and is the primary source of "gravity", drawing people back to the old way of doing things. The Scrum Culture on the other hand is more effective in development and research environments and is what more and more people demand from their employers.


Can OpenStack free up enterprise IT to support software-driven business?
Although it is often considered as a way to build a private cloud, OpenStack can also be used to provision datacentre hardware directly. Subbu Allamaraju, chief engineer for cloud at eBay, said he would like to use OpenStack as the API for accessing all datacentre resources at the auction site, but the technology is not yet mature enough. Walmart's Junejan added: "We aim to move more markets onto OpenStack and eventually offer datacentre as a service." OpenStack can also be used to manage physical, bare metal server hardware. James Penick, cloud architect at Yahoo, said the internet portal and search engine had been using bare metal OpenStack alongside virtualisation.


Certification, regulation needed to secure IoT devices
Xie explained in an interview with ZDNet that in traditional networks where components such as switches and routers were wired, there were well-established architecture frameworks that outlined where and how firewalls should be connected to switches, be it redundantly or as a single connection. These guidelines would no longer be effective with SDNs, where these "wires" are now defined by software and where switches can be "relocated" by the stroke of a key, he said. Firewalls, for instance, would need to continue to enforce the necessary policies to secure a database within an SDN, even when that database is virtually relocated to a different city. "So all that becomes more intangible, and the big challenge is for security to be able to adapt to that kind of architecture," he noted.


Net Neutrality Rules Forcing Companies To Play Fair, ... Giant ISPs Absolutely Hate It
While the FCC's rules on interconnection are a bit vague, the agency has made it clear they'll be looking at complaints on a "case by case basis" to ensure deals are "just and reasonable." Since this is new territory, the FCC thought this would be wiser than penning draconian rules that either overreach or contain too many loopholes. This ambiguity obviously has ISPs erring on the side of caution when it comes to bad behavior, which is likely precisely what the FCC intended. ... And by "well functioning private negotiation process," the ISPs clearly mean one in which they were able to hold their massive customer bases hostage in order to strong arm companies like Netflix into paying direct interconnection fees. One in which regulators were seen but not heard, while giant monopolies and duopolies abused the lack of last mile competition.


Leaderless Bitcoin Struggles to Make Its Most Crucial Decision
The technical problem, which most agree is solvable, is that Bitcoin’s network now has a fixed capacity for transactions. Before he or she disappeared, Bitcoin’s mysterious creator, Satoshi Nakamoto, limited the size of a “block,” or group of transactions, to one megabyte. The technology underlying Bitcoin works because a network of thousands of computers contribute the computational power needed to confirm every transaction and record them all in a permanent, publicly accessible ledger called the blockchain (see “What Bitcoin Is and Why It Matters”). Every 10 minutes, an operator of one of those computers wins the chance to add a new block to the chain and receives freshly minted bitcoins as a reward. That process is called mining.
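The capacity ceiling the excerpt describes follows from simple arithmetic; the average transaction size below is an assumption for illustration, but the block size and interval are the protocol parameters the article cites:

```python
BLOCK_SIZE_BYTES = 1_000_000     # Satoshi's one-megabyte block-size limit
BLOCK_INTERVAL_SECONDS = 600     # a new block roughly every 10 minutes
AVG_TX_SIZE_BYTES = 500          # assumed average transaction size

# How many transactions fit in one block, and per second on average?
tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES        # 2000
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS       # roughly 3.3
```

A global payment network handling only a few transactions per second is the bottleneck behind the block-size debate; conventional card networks process orders of magnitude more.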


Machine learning as a fluid intelligence harvesting service
Developers are only human. They have limited capabilities, attention spans and so on. But data and the knowledge that can be gained from it are seemingly unlimited. Even the world’s data scientists and domain experts have to prioritize their efforts to extract insights from some relevant portion of the vast ocean of information that surges around them.  With only so many hours in the day, data scientists and analysts need to leverage every big data acceleration, automation and productivity tool in their arsenals to sift, sort, search, infer, predict and otherwise make sense of the data that’s out there. As a result, many of these professionals have embraced machine learning.


Software development skills for data scientists
You should learn a principle called DRY, which stands for Don't Repeat Yourself. The basic idea is that many tasks can be abstracted into a function or piece of code that can be reused regardless of the specific task. This is more efficient from a "lines of code" perspective, but also in terms of your time. It can be taken to an illogical extreme, where code becomes very difficult to follow, but there is a happy medium to strive for. A good rule of thumb: if you find yourself writing the same line of code with only minor changes each time, think about how you can turn that code into a function that takes the changes as parameters. Avoid hard-coding values into your code. It is also good practice to revisit code you've written in the past to see if the code can be made cleaner, more efficient, or more modular and reusable. This is called refactoring.
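The rule of thumb above can be shown in a few lines of Python; the repeated lines in the comment are hypothetical, but they differ only in one value, so they collapse into a single parameterised function:

```python
# Before: the same logic repeated with only the threshold changed.
#   high_risk = [c for c in customers if c["score"] > 90]
#   med_risk  = [c for c in customers if c["score"] > 50]

# After applying DRY: the varying value becomes a parameter.
def filter_by_score(customers, threshold):
    """Return the customers whose score exceeds `threshold`."""
    return [c for c in customers if c["score"] > threshold]

customers = [{"name": "a", "score": 95}, {"name": "b", "score": 60}]
high_risk = filter_by_score(customers, 90)  # only the first record
med_risk = filter_by_score(customers, 50)   # both records
```

When the filtering logic later needs a fix, it now changes in one place instead of everywhere the line was pasted — which is the real payoff of DRY, beyond the saved lines.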


Marketing vs. IT: Data Governance Bridges the Gap
The key is to first understand how to govern information in the modern data era – not going back to the stone ages where marketers – and for that matter all business users -- had to follow naming conventions, put everything into schemas and build their work into models. Today, IT teams can empower the data-driven marketing organization by providing better tools and automation across the entire analytic process, including a new class of self-service data preparation solutions, which simplify, automate and reduce the manual steps of the analytic process. This new self-service data preparation “workbench” empowers marketing, sales, finance and business operations analysts with a shared environment that captures how they work with data, where they get it from and ultimately what BI tool they use to analyze it.


Full Stack Web Development Using Neo4j
Neo4j is a Graph database which means, simply, that rather than data being stored in tables or collections it is stored as nodes and relationships between nodes. In Neo4j both nodes and relationships can contain properties with values. ... While Neo4j can handle "big data" it isn't Hadoop, HBase or Cassandra and you won't typically be crunching massive (petabyte) analytics directly in your Neo4j database. But when you are interested in serving up information about an entity and its data neighborhood (like you would when generating a web-page or an API result) it is a great choice. From simple CRUD access to a complicated, deeply nested view of a resource.
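As a rough in-memory sketch of the property-graph model — not the Neo4j API, just the data shape the excerpt describes, with hypothetical node and relationship data — both nodes and relationships carry properties, and a "data neighborhood" is a node plus everything one hop away:

```python
# Nodes: each has a label and a dict of properties.
nodes = {
    1: {"label": "Person", "props": {"name": "Alice"}},
    2: {"label": "Movie",  "props": {"title": "The Matrix"}},
}

# Relationships: (start, type, end) plus their own properties.
rels = [(1, "REVIEWED", 2, {"rating": 5})]

def neighborhood(node_id):
    """Bundle a node with its outgoing relationships and their targets."""
    hood = {"node": nodes[node_id], "out": []}
    for start, rtype, end, props in rels:
        if start == node_id:
            hood["out"].append({"type": rtype, "props": props, "to": nodes[end]})
    return hood

# neighborhood(1) returns Alice, the REVIEWED edge and the movie in one lookup --
# the shape a web page or API result typically needs.
```

In Neo4j itself this one-hop lookup is a single Cypher pattern along the lines of `MATCH (p:Person)-[r:REVIEWED]->(m:Movie)`, which is why the excerpt calls it a great fit for serving an entity and its surroundings.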


Executive's guide to the hybrid cloud (free ebook)
Hybrid strategies have begun making inroads in several industries, including the financial sector, healthcare, and retail sales. In a widely cited report, Gartner predicted that nearly 50 percent of enterprises will have hybrid cloud deployments by 2017. Hybrid clouds can help ensure business continuity, allow provisioning to accommodate peak loads, and provide a safe platform for application testing. At the same time, they give companies direct access to their private infrastructure and let them maintain on-premise control over mission-critical data. Is hybrid an ideal strategy for all companies — or a panacea for all cloud concerns? ... This ebook will help you understand what hybrid clouds offer, and where their potential strengths and liabilities exist.



Quote for the day:

“It’s what you do in your free time that will set you free—or enslave you.” -- Jarod Kintz

April 16, 2015

5 Factors to Retrospect after Every Sprint while Developing a Product
The essence of agile is to thrive for continuous improvement through empirical process control. True agile teams find ways to improve through experimentation, finding sustainability, and delivering business value earlier. It is a never-ending journey, and a sprint retrospective emerges as an opportunity to further accelerate this improvement process. It is a great time to allocate and analyse extraneous factors in detail, which otherwise may distract the team’s focus. In this post, we highlight 5 factors which every agile team should retrospect after each sprint. Let’s have a look.


Combining SIAM and DevOps for Digital Reimagination
Some of the most important aspects of the SIAM role are the coordination of people, processes, technology and data, and the governance across multiple suppliers, to ensure effective and efficient operations of the end-to-end service delivery to the business user. DevOps and SIAM converge in addressing current business and IT challenges and targeting people and attitude as primary drivers of performance and value. Whilst DevOps addresses the cons of functional specialisation and the spread of responsibilities across different IT teams, SIAM deals with the additional challenge of spreading services across multiple vendors.


Free ebook: Microsoft Azure Essentials: Azure Machine Learning
This ebook will present an overview of modern data science theory and principles, the associated workflow, and then cover some of the more common machine learning algorithms in use today. We will build a variety of predictive analytics models using real world data, evaluate several different machine learning algorithms and modeling strategies, and then deploy the finished models as machine learning web services on Azure within a matter of minutes. The book will also expand on a working Azure Machine Learning predictive model example to explore the types of client and server applications you can create to consume Azure Machine Learning web services.


Lack of skilled infosec pros creates high-risk environments
A portrait of the ideal cybersecurity professional emerges from this list of shortfalls: the top three attributes are a formal education, practical experience and certifications. The study reveals that organizations are experiencing attacks that are largely deliberate, and they lack confidence in the ability of their staff. The top four threat actors exploiting organizations in 2014 were cybercriminals (46 percent), non-malicious insiders (41 percent), hackers (40 percent) and malicious insiders (29 percent). 64 percent are very concerned or concerned about the Internet of Things, and less than half feel their security teams are able to detect and respond to complex incidents.


How The Internet of Things Is a Transformational Opportunity
The Internet of Things looks like a massive opportunity over the years ahead; there are already many practical and valuable applications, and everything seems to indicate that we are just in the first stages of what could be a game-changing series of innovations. However, opportunity attracts competition, and IBM will need to compete against several big players trying to get a piece of the pie. In January 2014 Google invested $3.2 billion in the acquisition of Nest Labs, a leading player in smart thermostats and smoke alarms. This means Google invested more in a single purchase than IBM will over the coming four years in its whole Internet of Things initiative.


Will containers kill the virtual machine?
Containers are not a new technology: the earliest iterations of containers have been around in open source Linux code for decades. But in the past year they've captured the hearts and minds of many developers for building and running applications. Containers isolate specific code, applications or processes. Doing so gives whatever is inside the container a neat envelope for managing it, including moving it across various hosts. Whereas you can think of a virtual machine slicing up a server into multiple operating systems, containers run atop the OS so unlike a VM, they don't require an OS to boot up when they're created. In essence they can virtualize an operating system to provide a more lightweight package of an application compared to a VM.


SSL/TLS/HTTPS: Keeping the public uninformed
Perhaps the most important thing to understand about the SSL/TLS/HTTPS system that secures websites is that you are not supposed to understand it. ... If SSL/TLS/HTTPS was really designed for security, this would have been done long ago. But secure websites are security theater. They seem to be secure, techies say they are secure (at least in public), but the system is flawed. That it took so long to expose Superfish was because the system is rigged against normal folks. Jonathan Zdziarski recently made another simple suggestion that, like mine, will never see the light of day. He points out that HTTPS interception, such as Superfish, can be detected if the web browser notices that the last X "secure" websites were all vouched for by the same Certificate Authority.
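Zdziarski's detection idea can be sketched in a few lines: track the issuing Certificate Authority for each recent HTTPS site and flag the suspicious case where many distinct sites are all vouched for by a single issuer. The issuer names, window size and threshold below are hypothetical choices for illustration:

```python
from collections import Counter

def interception_suspected(issuer_history, window=10, threshold=0.9):
    """Flag likely HTTPS interception: one CA vouching for nearly every
    recent site is abnormal on the open web."""
    recent = issuer_history[-window:]
    if len(recent) < window:
        return False  # not enough evidence yet
    _, top_count = Counter(recent).most_common(1)[0]
    return top_count / len(recent) >= threshold

# Normal browsing: certificates chain to many different public CAs.
normal = ["DigiCert", "Let's Encrypt", "GlobalSign", "DigiCert", "Sectigo",
          "Let's Encrypt", "DigiCert", "GoDaddy", "Sectigo", "GlobalSign"]

# Superfish-style interception: one local root signs everything.
intercepted = ["Superfish Inc."] * 10
```

A real browser implementation would need care around corporate proxies that legitimately re-sign traffic, but the heuristic itself is this simple — which is the author's point about how little it would take.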


SEC’s Stein touches all the bases in discussion on data, technology
With a goal of collecting an estimated 58 million records per day, there is little doubt that CAT will require a tremendous amount of industry cooperation. However, Stein pointed out that a proposal that might seem like a regulatory reform wrought with headaches for the industry might eventually simplify the work of compliance professionals. “Only though CAT can we develop regulations that are driven by the facts,” Stein explained. Stein touched on how the Flash Crash and the lengthy investigation that followed highlighted the need for CAT and lamented the slow march to implementation, which remains years away. “We need the CAT as soon as possible,” Stein said.


Infosec taking the strain as threats evolve and skills gap widens
Davis added it may also indicate that information security professionals in Germany have a higher level of top executive support than in the UK and elsewhere in Europe. Despite budgets allowing for more personnel, 62% of respondents reported that their organisations have too few information security professionals – up from 56% in 2013. Frost & Sullivan estimates that the global workforce shortage will widen to 1.5 million in five years, while the variety and sophistication of cyber threats are expected to continue. The situation is exacerbated by the broadening footprint of systems and devices requiring security oversight. Signs of strain, including configuration mistakes and oversights, were identified as a significant concern, and recovery time following system or data compromises was found to be getting steadily longer.


Why CIOs can’t sell enterprise collaboration tools
One of the biggest challenges is determining how to implement enterprise collaboration in a cross-functional manner, says John Abel, senior vice president of IT at Hitachi Data Systems. “Teams are pretty good at communicating within their own group but when it comes to integrating across departments silos tend to happen, which ultimately becomes problematic when each team needs to align on certain campaigns or key topics,” he says. NetScout’s CIO and Senior Vice President of Services Ken Boyd says the landscape of collaboration tools available today makes it difficult to pick the best ones for a specific workforce. “Locating a collaboration tools provider that can offer the right balance for the needs of our enterprise users can be a significant challenge,” he says.



Quote for the day:

“...A man can only stumble for so long before he either falls or stands up straight.” -- Brandon Sanderson

March 11, 2015

Big Data: A Brief(ish) History Everyone Should Read
Long before computers (as we know them today) were commonplace, the idea that we were creating an ever-expanding body of knowledge ripe for analysis was popular in academia. Although it might be easy to forget, our increasing ability to store and analyze information has been a gradual evolution – although things certainly sped up at the end of the last century, with the invention of digital storage and the internet. With Big Data poised to go mainstream this year, here’s a brief(ish) look at the long history of thought and innovation which have led us to the dawn of the data age.


Baidu ends support for Android platform
"Because of a company business adjustment, we have no choice but to painfully decide to suspend updates and support to the Baidu Cloud OS," the team said in an official forum posting. While the rest of Baidu's consumer cloud business will still be in operation, the Cloud OS and ROM design unit has been folded into a new company, the team added in another posting. It did not offer details about the new company. The Baidu Cloud OS, launched in 2012, marked the Chinese search giant's attempt to bring more company services to smartphones.


Most Innovative Companies 2015: A Q&A With SmartThings Founder Alex Hawkinson
The developer community has really continued to grow quickly, which is so incredibly exciting to me and for everyone involved with SmartThings. The last numbers we disclosed in January during my keynote at CES were that the number of developers had more than doubled since the time of our merger with Samsung last August. More than 10,000 developers had actually published a running app (we call them SmartApps) or integrated device (we call them Device Handlers) in our platform by that time. ... Usage is very diverse, as we've found that smart homes, as they deepen, end up reflecting the personality and uniqueness of the person who lives in the home. However, there are some big patterns where we see use cases that are popular across much or all of the customer base.


Free eBook! Software Defined Storage for Dummies
Software defined storage is a relatively new concept in the computing and storage industry and can refer to many different technologies and implementations. Software defined storage is part of a larger industry trend that includes software defined networking (SDN) and software defined data centers (SDDC). At its most basic level, software defined storage is enterprise class storage that uses standard hardware with all the important storage and management functions performed in intelligent software. Software defined storage delivers automated, policy-driven, application-aware storage services through orchestration of the underlining storage infrastructure in support of an overall software defined environment. Standard hardware includes:


Anatomy of a Successful SAP Implementation
Dougherty knows that a major reason the SAP program was a success was how they trained people on the system. “We took 55 volunteers from across the business and put them through Airgas SAP boot camp,” he says. “They each did more than 120 classroom hours of training and became our dedicated trainers. We took them out of their day jobs, and they went from business unit to business unit six weeks prior to each rollout to train those associates. They were also on the ground for two to four weeks during post-go-live to ensure the associates were using the system properly.”


A Journey to IoT w/Father, Son, a Laser and Cats…Phase One
... Kitten Interaction Terminal - Nano Edition: this version would employ much smaller and less expensive components and come in a convenient casing. Happy to say that we made our goal of showing our Phase One project at the IoTPhx meet up and received great feedback. Our question to the group was how to get to Phase Two and connect KIT to the Internet. There were suggestions about doing it connected to a computer and/or doing it all on the board. There is some new code coming soon to provide TCP/IP connectivity within the ChipKit boards that looks promising and could make it stand-alone…Good Times!


Keep It Simple With New Backup Technologies
If you are still on the fence over whether or not to move from legacy backup technologies to the new, virtualized technologies, I like to use this analogy: think of your backup system as a kitchen that needs to be remodeled. In one type of remodeling project, you replace the appliances and cupboards and maybe lay down new flooring. The kitchen looks very different, but the functionality is exactly the same, and any of the old frustrations (like a lack of cooking prep space or no room for a table) go unaddressed. That’s your legacy backup system.


Cybersecurity has a talent shortage
The demand for information security professionals is quickly exceeding the number of people who are capable of doing the job, said Peter W. Singer, former director of the Center for 21st Century Security and Intelligence at Brookings Institute and a strategist at the New America Foundation, a public policy institute. "We don't have enough expertise in the right places now," said Singer, co-author of a recent book "Cybersecurity and Cyberwar". "We often frame cybersecurity as a technology problem. It is a human problem." While there isn't a single best solution to a complex shortage of candidates, Singer said, education should be a top priority in meeting anticipated needs. Many experts and policymakers also see institutional reform as a place to start.


10 surprising skills that will give IT job seekers the edge
Companies around the world are engaged in a fierce fight for talent. Especially in IT, the growth of new disciplines like big data, a need to understand the business and to be malleable in the face of change, and the impending retirement of legions of highly skilled baby boomers are presenting companies with unique IT hiring challenges. This is forcing companies to reevaluate what they look for in IT job candidates. Here are 10 emerging skills and qualities companies are looking for.


The Often Overlooked Skills and Responsibilities of a Technical Team Leader
Knowing the responsibilities, we may determine the necessary characteristics of such a key person in software development. First, I will point out several deficient views of a technical team leader, and why these views are incomplete and may not lead to team success. Then, I will categorize all the necessary responsibilities to be carried out by a technical team leader. Finally, I will discuss other functions in a typical software organization, and will explain why we shouldn’t overwhelm the team (and its leader) with such responsibilities.



Quote for the day:

"Never give an order that can't be obeyed." -- General Douglas MacArthur

January 28, 2015

Big Data: 5 Top Companies and Their Plans for 2015
Expect new product and service announcements from the established big names, as well as a flood of innovative start-ups hitting the headlines over the next 12 months. This is the first part of my run-through of big data companies I expect to hear great things from in 2015. I've started with the "big data giants" – established names which have made data the foundation of their business model. In another post I will focus on the newcomers and start-ups snapping at their heels.


Cloud ERP: 9 Emerging Options
The good news for CIOs and their teams considering moving some or all ERP functions online: Vendors have been prepping for this shift, and there's already plenty of choice. The conventional ERP heavyweights -- Microsoft, Oracle, and SAP -- are also in on the trend. You may have noticed how much all three, each in their own way, talked up cloud across the board in 2014. Oracle in particular spent a good bit of time discussing -- at the highest executive levels -- its cloud endeavors and future plans, sure signs that the entrenched on-premises approaches to ERP are getting a cloud makeover, even if such shifts will take much more time than, say, getting off that old Exchange server for email.


World's Largest DDoS Attack Reached 400Gbps, says Arbor Networks
Increasingly, the culprit is Network Time Protocol (NTP), an important but otherwise totally ignored way for the Internet to keep its routers and server infrastructure synchronised with UTC. Not long after an infamous attack on Spamhaus in early 2013, which used something called DNS amplification to summon up potentially vast amounts of traffic, someone worked out that other protocols were open to the same trick. NTP turned out to be a good candidate for the same spoofing/amplification treatment, notably during the almost-as-infamous attack on CloudFlare a year ago, the one Arbor mentions as hitting 325Gbps.
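The economics of reflection attacks come down to simple arithmetic: the victim receives many more bytes than the attacker sends. A minimal sketch of that calculation — the byte counts here are rough, commonly cited figures for NTP's monlist response, not measurements from the Arbor report:

```python
# Rough sketch of why spoofed NTP "monlist" queries amplify traffic.
# The request/response sizes below are illustrative, not exact.

def amplification_factor(request_bytes: int, response_bytes: int) -> float:
    """Bandwidth amplification: bytes the victim receives per byte the attacker sends."""
    return response_bytes / request_bytes

def reflected_traffic_gbps(attacker_gbps: float, factor: float) -> float:
    """Traffic arriving at the spoofed victim for a given attacker send rate."""
    return attacker_gbps * factor

# A monlist request is ~234 bytes; the response can run to hundreds of
# 482-byte entries, so amplification factors in the hundreds are possible.
factor = amplification_factor(234, 100 * 482)
print(f"amplification: {factor:.0f}x")
print(f"2 Gbps of spoofed queries -> ~{reflected_traffic_gbps(2.0, factor):.0f} Gbps at the victim")
```

This is why a modest botnet plus a list of misconfigured NTP servers can produce the hundreds of gigabits per second Arbor describes, and why the fix is as much about disabling monlist and deploying source-address validation as it is about absorbing traffic.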


Samsung's $100 million Internet of Things Bet
Samsung has thrown its weight behind an effort called Thread, which also has the backing of Nest, processor designer ARM, and a few other industry players. As Parks Associates analyst Tom Kerber explains, Thread works by assigning every device its own IP address, and brings numerous benefits including end-to-end encryption and low power consumption. "If you think about longer-term, it's very likely that a lot of the intelligence is going to be in the end devices rather than in a central controller, so these end devices need to be addressable, and IP is kind of predominant," Kerber says.


7 Corporate blogging blunders to avoid
Yes, blogging has some huge advantages. And yes, I believe that many companies would greatly benefit from an ongoing blogging initiative. But that means taking the time to do it right.
Blogging requires a strong strategy, good optimization and fantastic content. Without those three elements, your blog will fizzle out before it starts to sizzle. It won’t help you boost search positions. It won’t engage your readers. And it won’t help your company make money. That’s never good. Want to keep your blog on the straight and narrow?


Microsoft unveils a great distraction
Can we all please calm down and look at this product glimpse rationally? Sure, HoloLens will find a home in some markets. I can certainly see design verticals such as CAD and CAM embracing the technology. And I’m sure that gamers will love it. Halo in 3D? I’m there! But outside of those niche markets, does this new headlining feature really offer anything to the ordinary home or corporate user? I don’t think so. My word processor and spreadsheet won’t work any better for being in 3D. As a writer, I want people to become immersed in my prose, but I don’t want to rely on a 3D trick to accomplish that.


Crooks Start Encrypting Websites And Demanding Thousands Of Dollars From Businesses
“The next step might well be the modern equivalent of protection rackets – threatening companies with being either taken offline or having their databases frozen unless they pay a regular fee.” Brian Honan, security consultant, said the modus operandi of the RansomWeb hackers was similar to ransomware attacks against a number of SMBs he had worked with, whereby the criminals broke into the server of the victim, overwrote backups with either the encrypted data or blank data, and at a later date returned to encrypt the server. “At this stage the backups are no longer useful as they contain no workable data to restore the systems, thus leaving the victim companies with the choice of either losing all their data and rebuilding it from scratch, or paying the ransom.”


Data scientists: How to hire and how to get the best from them
Hand says the rise of the data scientist is unsurprising, especially as it has been long predicted that the industry would see a massive shortage of people with a high level of analytical skills. Any individual with data science in their LinkedIn profile can expect to be bombarded with emails from recruiters - yet smart organisations, Hand says, also focus on two other sources of data scientists. "The first is to develop a close working relationship with a university, and not just the computer science school as many of the most successful analysts are coming from other schools of science," he says. "The second is to look inside the organisation for core analytical skills and to be prepared to retrain those people in advanced data analytics."


Building Massively-Scalable Distributed Systems using Go and Mesos
Apache Mesos uses an idiom known as a framework to delegate task scheduling to client code running on the cluster. Spark, for example, was originally a Mesos framework written to the Mesos API using the Scala language bindings. The original version of Mesos was built in 2009, before Go was popular. Today, Go is one of the most popular languages, and many of the key components that integrate with Mesos are written in it. For example, Kubernetes-Mesos, the framework for running Kubernetes workloads on Mesos, is written in Go. Go is also popular among the many infrastructure tools in and around the Docker container format, which Mesos supports natively.
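The framework idiom is easy to caricature: the Mesos master makes resource offers, and the framework's scheduler callback decides which tasks to launch on them. A toy sketch of that callback pattern follows — in Python for brevity, and with invented names; this is not the real Mesos API or the mesos-go bindings:

```python
# Toy sketch of the Mesos "framework" idiom: the cluster manager makes
# resource offers, and scheduling decisions live in client callback code.
# All names here are invented for illustration; this is not the Mesos API.
from dataclasses import dataclass

@dataclass
class Offer:
    agent: str
    cpus: float
    mem_mb: int

class EchoFramework:
    """A framework scheduler: accepts any offer big enough for its task shape."""
    def __init__(self, task_cpus: float = 1.0, task_mem_mb: int = 256):
        self.task_cpus = task_cpus
        self.task_mem_mb = task_mem_mb
        self.launched = []

    def resource_offers(self, offers):
        # The framework, not the master, decides what runs where.
        for offer in offers:
            if offer.cpus >= self.task_cpus and offer.mem_mb >= self.task_mem_mb:
                self.launched.append(f"task on {offer.agent}")

fw = EchoFramework()
fw.resource_offers([Offer("agent-1", 4.0, 1024), Offer("agent-2", 0.5, 128)])
print(fw.launched)  # only agent-1 has enough resources
```

The inversion is the point: Mesos stays a generic resource broker, while each framework (Spark, Kubernetes-Mesos, and so on) embeds its own placement logic in the offer callback.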


eBook: How to Adopt Microservices
Microservices architecture is emerging as the new standard for building applications. This approach to software design breaks complex applications into small, nimble, independent components to speed up time to market, simplify maintenance, and enable continuous integration. This new ebook explains how to adopt the approach and optimize your applications and development processes.



Quote for the day:

"Leadership is the art of getting someone else to do something you want done because he wants to do it." -- Dwight D. Eisenhower

January 02, 2015

Relating the IoT to Enterprise Business Strategy
According to Porter and Heppelmann, the key element of "smart, connected products" is that they take advantage of ubiquitous wireless connectivity to unleash an era where competition is increasingly about the size of the business problem solved. Porter and Heppelmann claim that as smart, connected products take hold, the idea of industries being defined by physical products or services alone will cease to have meaning. What sense does it make to talk about a "tractor industry" when tractors represent just a piece of an integrated system of products, services, software, and data designed to help farmers increase their crop yield?


Zero Day Weekly: ISC hacked, SS7 mobile security, Windows privilege escalation
This week the Internet Systems Consortium site was hacked, a Lizard Squad member was caught (and released), a privilege escalation bug was revealed in Windows, SS7 research nuked mobile privacy beliefs, The Interview became an Android malware vector, post-breach perceptions of Sony and Staples were analyzed, and more.


Wearables Carve New Path To Health In 2015
"The wearables market is starting to see technology that produces richer and more precise user data than ever before. The problem we're seeing is that most fitness trackers are offering a flat world of data, without much insight beyond what an accelerometer can capture," Kenzen CEO Sonia Sousa told InformationWeek. "This is why wearable fatigue is so high. After about six months, you stop caring because the number of steps doesn't really change." Use of health-oriented wearables will almost triple between 2014 and 2018, according to Juniper Research.


Why the Software Defined Data Center is the Future
“[IO’s] approach to the data centre has been to build a physical data centre layer that is modular in its approach. Our modules can be componentised, delivered in separate pieces, at the right size to meet changing needs. Also it is configured and managed by stacking a software layer on top of the components so that you create a smart data centre – a data centre that has a path to connect to the application layer and react in a dynamic fashion. The application layer is changing, and the physical data centre can change the way it behaves to support that.”


Why Involving CFOs in Innovation Is No Longer Optional
CFOs can bring to the innovation discussion finance’s expertise and depth in data and analysis pertaining to core business metrics, especially when an innovation initiative is occurring in core business functions. In cases where an organization is considering more breakthrough or disruptive types of innovation, CFOs and finance should be at the table and thinking more broadly about the risks the company might be taking on and how those risks might impact, or relate to, other elements of risk. In addition, any time an organization assesses strategy and how innovation could contribute to the overall corporate growth agenda, the CFO should have a prominent voice in the discussion.


eBook: Securing Tomorrow – The Road To Business Resiliency
EMC’s new Business Resiliency eBook outlines why firms need to rethink their ability to consistently and systemically anticipate significant interruptions and failures while fulfilling all business commitments and requirements. Isn’t it time to prepare your business to withstand both the expected and unexpected? Read the blogs by security experts or jump to an eBook chapter to learn more.


Why you don't need a SAN any more
Scale-Out File Server is the logical endpoint of Windows Server 2012 R2's software-defined storage. It's fast, flexible, and cheaper than the SAN alternatives. It might not be for every network, but it's also something you can build up to, as you start to use Storage Spaces and then add clustering to your network. The end result is a storage fabric, much like that used by Azure -- and ready for your own private cloud. While it's not suitable for all workloads in this version (especially not SharePoint and other document-centric services), Scale-Out File Server is ideal for hosting virtual machine images and virtual hard disks, for handling databases and for hosting web content.


Preparing for the data center of the future
The data center of 2020 will look vastly different from today's data center in a variety of ways. As application silos are broken down and resource tiers consolidated, the result will be data centers that consist of three hardware tiers -- for processing, memory and storage. Applications will dynamically allocate resources from each of the tiers, providing the required elasticity to respond to changing demands. With the advent of cheaper memory, more federal agencies will adopt in-memory computing technology to reduce application response times. That approach has the added benefit of transferring the load from transactional databases, which can further reduce licensing and operating costs.


SQL Stored Procedure Performance improvement
In this article we will focus on the basics of improving the performance of stored procedures that fetch or retrieve data. We will look at the precautions to take while creating such a stored procedure. ... A stored procedure that fetches or retrieves data from a database may take a long time to execute. The following points will help improve the performance of this type of stored procedure


ScALeD – Scaled Agile and Lean Development
ScALeD – Scaled Agile and Lean Development – is not another scaling framework. We see ScALeD primarily as a practitioner-driven movement to help organizations find a sound and balanced approach to agile transition and scaling questions – inspired by Lean and agile values, driven by principles, and completed through various practices and frameworks. Our main mission is to create awareness about what agility can mean for an organization. The core of ScALeD is a set of 13 principles, structured into 5 pillars. The pillars or outline of ScALeD resemble the main lean values:



Quote for the day:

"Always do right. This will gratify some people and astonish the rest.” -- Mark Twain