April 20, 2015

5 Easy Tips to Deal with Conflicts Within Distributed Agile Teams
After being a buzzword for years, Agile has now become the go-to development methodology for most entrepreneurs. However, if your development team is remote, working in agile is a challenge. In this article, we highlight some conflicts which arise while you are working with distributed agile teams. Let’s identify these issues and understand how you can deal with them. ... Communicate regularly and effectively. Having face-to-face interactions helps. If you meet them personally, say once or twice a year, it works wonders in fostering the connection between you and your team. Building socialization platforms and creating opportunities for informal conversations is also a good idea. Team-building sessions also help: gaming sessions and co-worker trivia can come in handy when establishing a better relationship with your remote team.


How to make more successful enterprise software purchases
By the time the requirements analysis is complete, an organization should know what they need in detail, and why they need it. An inadequate requirements analysis sets the stage for a troubled implementation project with ballooning costs. Part of the problem of ballooning costs is caused by waiting for the implementation stage to flesh out requirements in sufficient detail. However, there are many benefits to doing this work early on in the project, namely at the requirements analysis stage. ... It is difficult to estimate accurately the time and resources needed for implementing the software when the requirements are written at too high a level. A one-line requirement can encapsulate weeks or even months of work for the unwary. Far better to provide the detail needed to get more accurate implementation estimates.


Five silver linings of the public cloud
one of the most significant positives has been the speed at which Seaco can recover from system failures. Its recovery point objective - the maximum period of data loss the business can tolerate following a major incident - is down from 12 hours to one hour and its recovery time objective - the time it takes to restore a business process after IT-related disruption - was reduced from three days to two hours. ... In its early days, the firm spent millions of dollars building 1,000 foot datacentres in London and Washington. But as demand for its services increased it found it became "very expensive to keep every feature of the platform behaving with globally consistent performance".


With data analytics, no more Pontiac Azteks
Analytics exponentially expands the zone of what can be known. For-profit executives and hard-working public servants no longer need to make stuff up as they try to achieve organizational objectives. Nowhere is this truer than in the world of product development, especially with respect to bringing insights about customers to that process. In the middle of the last century, during the Mad Men era on Madison Avenue, the focus group was the cutting-edge method of doing this. Consumers would be brought in to spend a few hours in a conference room at a company’s marketing department or ad agency, and they would be asked things like how they used a product and what they wanted from a category of products.


Protecting infrastructure secrets with Keywhiz
To protect secrets stored on the server side, every secret is AES-GCM encrypted with a unique key before being stored in a database. This unique key is generated using HKDF. Square uses hardware security modules to contain derivation keys. Services get access to secrets through KeywhizFs. At Square, each service on every host has a directory where a KeywhizFs filesystem is mounted. Services merely have to open a read-only “file” in that directory to access a secret. Performing a directory listing shows which secrets are accessible. Local access control is straightforward; traditional Unix file permissions are used for the secret “files.” The advantage of a file-based representation is that nearly all software is compatible with reading secrets from files.
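The per-secret key derivation described here can be sketched with Python's standard library. Below is a minimal HKDF (RFC 5869) over SHA-256; the master key, salt and secret names are illustrative stand-ins, not Keywhiz's actual parameters, and the AES-GCM encryption step that would follow is omitted:

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF (RFC 5869): extract a pseudorandom key from the input key
    # material, then expand it into the requested number of bytes.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

master = os.urandom(32)  # stand-in for the derivation key kept in an HSM
k1 = hkdf_sha256(master, b"\x00" * 32, b"secret/db-password")
k2 = hkdf_sha256(master, b"\x00" * 32, b"secret/api-token")
assert k1 != k2 and len(k1) == 32  # every secret gets its own unique key
```

Using the secret's name as the `info` input ties each derived key to one secret, so compromising one stored ciphertext's key reveals nothing about the others.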


Meet the Cybersecurity Company Helping Sony Fend Off Hackers
Though not a household name, the Milpitas, California-based company has become a go-to security firm when big companies fall victim to cyberattacks. ... So when Sony's Los Angeles security team realized the studio's network had been breached, they asked FireEye to help figure out exactly what had happened and where the systems were vulnerable. That's the first step for many FireEye clients, most of which then ask the company to repair and improve their data defenses. "We were founded on the idea that cyberattacks would ultimately overrun all existing defenses. Now this has been overwhelmingly demonstrated," says Ashar Aziz, the company's founder, chief strategy officer, and vice chairman.


IT consulting: Is moving out on your own the right move?
It's understandable why you might be considering going down the consulting path. For some, a full-time position can grow stale from working in the same environment, seeing the same people and dealing with the same problems day after day. "There can be an inherent lack of diversity, more limited exposure to different approaches. You may only experience certain types of projects once and only have one shot at success -- for instance, a major CRM application implementation," says Levine. Many times in your career you may find yourself at a crossroads. Neither direction is the right or wrong path, but if you consider the pros and cons carefully, you should be able to make the smarter choice. To help you get closer to the answer, we spoke with C-level tech experts to find out what you need to consider.


The VR growth cycle: What’s different this time around
Long story short, high-end VR would get crushed under its own weight long before it hits mass-market size. On the low end, total cost of ownership is lovely: $20 for a drop-in viewer and you have access to loads of two-minute, snack-sized VR that is cheap enough to produce that developers can create free, free-to-play, $0.99 and ad-supported VR all day long. Now, the danger at the low end is that it passes from novelty into fad, instead of into a must-have, transcendent part of our everyday experience. I personally think we need to come at this from both ends to fully explore the potential of this as a business. And if I had to bet on one, I would bet on something closer to the low end. Maybe not Cardboard, maybe a cheaper edition of Gear VR. But something affordable to consume and produce. That will get the market to bigger numbers, faster.


Microsoft readies first developer preview of its new microservices Service Fabric
Using this Service Fabric, Azure applications can be decomposed into smaller components, a.k.a. microservices, that can be updated and maintained independently of the underlying infrastructure. The Service Fabric enables the various microservices to communicate with one another via programming interfaces. Russinovich said last year that Microsoft was using the Service Fabric technology to run pieces of the Azure core, as well as services including Skype for Business (Lync) and the Azure SQL Database. Microsoft officials said today that the company also has used Service Fabric in building/deploying Intune, Event Hubs, DocumentDB and Cortana. Customers will get the exact same Azure Service Fabric framework technology that Microsoft uses internally, not a subset or different version of it, according to an April 20 blog post announcing the coming service.


Interview: The software processes behind Hailo's success
“Our developers provision their components and the system takes care of placing the service where it needs to be running, routing traffic to it and bringing feedback to the developers, who can control how much traffic is routed to the new service,” he says. This allows the development team to see if they have built something that does not work. “It is important for us to get the services we develop into production as quickly as possible, so we have automated testing, starting with the Hailo application and going back through integration testing of all its constituent components,” he says. One of the challenges a traditional software development team faces with DevOps is how testing and quality assurance fits in with continuous development and rollout.
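The controlled traffic shifting Hailo describes is essentially a canary rollout. A toy sketch of the idea, with made-up weights and labels rather than anything from Hailo's system:

```python
import random

def route(traffic_weight_new: float, rng: random.Random) -> str:
    # Route one request: send the given fraction of traffic to the new
    # service version, the rest to the old one (a canary rollout).
    return "new" if rng.random() < traffic_weight_new else "old"

rng = random.Random(42)  # seeded so the demo is reproducible
counts = {"new": 0, "old": 0}
for _ in range(10_000):
    counts[route(0.1, rng)] += 1
# roughly 10% of requests reach the new version; as feedback from the
# new service looks healthy, the weight is ramped up toward 1.0
```

The developer-controlled weight is what lets a team discover "they have built something that does not work" while only a small slice of production traffic is affected.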



Quote for the day:

"The art of leadership is saying no, not yes. It is very easy to say yes." -- Tony Blair

April 19, 2015

The business architect role and the enterprise architecture of tomorrow
To sum up, this enterprise business architect should operate higher up in the enterprise hierarchy, covering the business architecture and integrating it with the technology architecture. He will ensure that the full blueprint of the enterprise is delivered, rather than the IT blueprint, and he will make sure that the audience is the whole enterprise rather than IT. This blueprint would enable stakeholders to model their own parts with the same conventions and constraints in an enterprise-wide context. This would unite the enterprise in one coherent operation and development effort. The EA would be the collective cross-enterprise design where everybody contributes to the same plan and goals, in synchronization.


The Value of Data Platform-as-a-Service (dPaaS)
dPaaS provides enterprise-class scalability enabling users to work with rapidly growing and increasingly complex data sets, including big data. Users have the flexibility to deploy any analytics tool on top of the platform to facilitate analyses in different environments and scenarios. The platform provides data stewards full transparency and control over data to ensure adherence to GRC (governance, regulatory, compliance) programs. dPaaS allows enterprises to reduce the burden of maintenance requirements for hardware and software. Companies can shift IT budgets from capex to more predictable opex, while freeing up IT teams to work on higher-return projects using market-leading technologies in collaboration with business units.


5 Unusual Ways Businesses Are Using Big Data
Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.


How a Toronto prof changed artificial intelligence
In quick succession, neural networks, rebranded as “deep learning,” began beating traditional AI in every critical task: recognizing speech, characterizing images, generating natural, readable sentences. Google, Facebook, Microsoft and nearly every other technology giant have embarked on a deep learning gold rush, competing for the world’s tiny clutch of experts. Deep learning startups, seeded by hundreds of millions in venture capital, are mushrooming. Hinton now spends three-quarters of his time at Google and the rest at U of T. Machine learning theories he always knew would work are not only being validated but are finding their way into applications used by millions. At 67, when he might be winding down a long and distinguished career, he is just now entering its most exciting phase.


6 Wearables That Will Enhance The Wearable Revolution In 2015
The hearing aids continuously scan the acoustic environment and activate the most optimal settings for that particular listening situation. For example, if you are at a noisy family gathering, the smart hearing aids home in on speech coming from the front while softening speech and noise from other directions. Later, if you are out walking the dog, they automatically adjust so you can enjoy the sounds of nature. ... The FitLinxx AmpStrip is a thin, waterproof device that tracks heart rate and activity around the clock with accuracy – all within a device as discreet and comfortable as a Band-Aid. It can be comfortably worn all day, every day. It easily sticks to your torso and automatically tracks heart rate, activity, exercise load, skin temperature and posture.


10 reasons to buy a Windows tablet for work instead of an iPad or Android
Tablets are going to work instead of laptops in some cases or to augment them in others. They can do a lot in the enterprise, some more than others. While the iPad and Android tablets are capable workmates, the tablets of choice are those running Windows. Windows has enjoyed a long reign as king of the workplace and that hasn't changed. There are a number of solid reasons why that is, and these reasons contribute to making Windows tablets the choice to take to work. ... Since Windows tablets provide more options to the enterprise when it comes to accessories, there is more cost flexibility. Also, business professionals will benefit from the app selection and the wide range of accessories.


Podcast: How to Architect for IoT
Some excerpts from this podcast: IoT data is messy. Devices get cut off mid-transmission. How do you detect this, and clean it up, as data arrives? IoT data is also of incredibly high volume. By 2020, we will have 4x more sensor and IoT data than enterprise data; we already get more data today from sensors than we do from PCs. How do we scale to consume and use it? In addition, connected devices are not always smart or fault-tolerant. How do you ensure you are always ready to catch all that data? Finally, IoT and sensor data in and of itself is not terribly useful. It is rarely in a format that an analyst would even be able to read, and it would be incredibly wasteful to store it all as-is in a business warehouse, Dropbox repo, etc.
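The "detect truncated transmissions and clean the data as it arrives" step might look something like this minimal sketch, assuming newline-delimited JSON records with made-up field names:

```python
import json

def clean_record(raw: str):
    # Reject records that were cut off mid-transmission or are missing
    # required fields, and normalize value types on the rest.
    try:
        rec = json.loads(raw)
    except json.JSONDecodeError:
        return None  # device was disconnected mid-transmission
    if "device_id" not in rec or "value" not in rec:
        return None  # incomplete record
    return {"device_id": rec["device_id"], "value": float(rec["value"])}

stream = [
    '{"device_id": "t1", "value": "21.5"}',
    '{"device_id": "t2", "val',  # truncated in transit
    '{"value": 3}',              # missing its device id
]
cleaned = [r for r in (clean_record(s) for s in stream) if r]
# only the first record survives
```

In a real pipeline this filtering happens at ingest, so the raw warehouse never fills up with unreadable fragments.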


Digital Reasoning Goes Cognitive: CEO Tim Estes on Text, Knowledge, and Technology
Tim Estes founded Digital Reasoning in 2000, focusing first on military/intelligence applications and, in recent years, on financial markets and clinical medicine. Insight in these domains requires synthesis of facts from disparate sources. Context is key. The company sees its capabilities mix as providing a distinctive interpretive edge in a complex world, as will become clear as you read Tim's responses in an interview I conducted in March, to provide material for my recent Text Analytics 2015 state-of-the-industry article. Digital Reasoning has, in the past, identified as a text analytics company. Maybe not so much any more.


BI Industry Going Through Midlife Crisis
Seriously, all the chatter about old, slow BI approaches being left behind for rapid data discovery with little governance, about one version of the truth being tossed to the wind in a new BI world driven by the business, and even a short opening keynote flick created by the Gartner team showing a middle-aged woman leaving her husband, tired of waiting, disappointed by empty promises, did send a message and a warning signal. ... Data discovery tools are becoming totally irresistible to the business because they are fast, easy to use and visually drop-dead gorgeous. However, I can’t help but think a bit more BI sanity may return in a few years after the business realizes there is much more to a successful BI implementation than quickly connecting to data and creating pretty charts.


A Tester’s Perspective on Agile Snags
The true agile QA is also often responsible for non-unit-test tools, test environments, and test data. People in this role will find themselves weighing conflicting choices. The choices resemble those in non-agile projects, but the short timescales of an agile project make the problems particularly acute. The responsibility for test management is often delegated to one or two members of an agile team, rather than taken on by the team as a whole. Although working in agile keeps you on your toes, distributed responsibilities and better time management make your work easier as well as more efficient. Estimations also challenge agile testers.



Quote for the day:

"I believe you have to be willing to be misunderstood if you're going to innovate." -- Jeff Bezos

April 18, 2015

Six cyber security startups kick off with CyLon accelerator
CyLon is supported by sponsorship from technology defence and security specialist Raytheon. CEO of Cyberlytic Stuart Laidlaw said his team – one of the six teams selected to form the first cohort – are looking forward to starting at CyLon. “We are delighted to have been selected for the first CyLon programme, which offers us a fantastic platform to grow our business into a leading global cyber security provider," he said. According to Iain Lobban, director of GCHQ until November 2014, cyber security is one of the most challenging issues in this generation.


Data science demands elastic infrastructure
The problem with trying to run big data projects within a data center revolves around rigidity. As Matt Wood told me in a recent interview, this problem "is not so much about absolute scale of data but rather relative scale of data." ... In a separate conversation, he elaborates: "Those that go out and buy expensive infrastructure find that the problem scope and domain shift really quickly. By the time they get around to answering the original question, the business has moved on. You need an environment that is flexible and allows you to quickly respond to changing big data requirements. Your resource mix is continually evolving--if you buy infrastructure, it's almost immediately irrelevant to your business because it's frozen in time. It's solving a problem you may not have or care about any more."


Anticipating the digital future
AI will more aggressively support decision making. The resulting information will be presented in a way that allows it to be absorbed through multiple senses. OK, that’s new. Privacy will increasingly be a problem/opportunity and while this will likely vary greatly across age groups, consumer-directed tools should help close the gap on privacy fairness. To net out much of this, the future will require a vastly changed set of tools and skills, and only by focusing on remaining agile and keeping your eye on the trends, problems, and related technology advancements will you have a hope of keeping up. The good news is that most clearly won’t be able to, so if you can keep up you’ll stand out sharply in a crowd of underperformers.


The Non-parametric Bootstrap as a Bayesian Model
Still, the bootstrap produces something that looks very much like draws from a posterior and there are papers comparing the bootstrap to Bayesian models (for example, Alfaro et al., 2003). Some also wonder which alternative is more appropriate: Bayes or bootstrap? But these are not opposing alternatives, because the non-parametric bootstrap is a Bayesian model. In this post I will show how the classical non-parametric bootstrap of Efron (1979) can be viewed as a Bayesian model. I will start by introducing the so-called Bayesian bootstrap and then I will show three ways the classical bootstrap can be considered a special case of the Bayesian bootstrap. So basically this post is just a rehash of Rubin’s The Bayesian Bootstrap from 1981.
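The contrast the post draws can be made concrete in a few lines. The classical bootstrap resamples observations with replacement; Rubin's Bayesian bootstrap instead draws observation weights from a flat Dirichlet(1, ..., 1), sampled below as normalized unit-rate exponentials. A plain-Python sketch on made-up data:

```python
import random

def classical_bootstrap_mean(data, reps=1000, seed=0):
    # Efron (1979): resample n observations with replacement each time;
    # the implied weights are Multinomial(n, 1/n) / n.
    rng = random.Random(seed)
    n = len(data)
    return [sum(rng.choice(data) for _ in range(n)) / n for _ in range(reps)]

def bayesian_bootstrap_mean(data, reps=1000, seed=0):
    # Rubin (1981): draw weights from a flat Dirichlet(1, ..., 1),
    # obtained by normalizing unit-rate exponentials.
    rng = random.Random(seed)
    n = len(data)
    draws = []
    for _ in range(reps):
        g = [rng.expovariate(1.0) for _ in range(n)]
        total = sum(g)
        draws.append(sum(w / total * x for w, x in zip(g, data)))
    return draws

data = [2.1, 3.4, 1.9, 4.2, 2.8]  # illustrative sample, mean 2.88
cb = classical_bootstrap_mean(data)
bb = bayesian_bootstrap_mean(data)
# both sets of draws are centered near the sample mean; the classical
# version's discrete weights are a special case of the Bayesian one
```

The only difference between the two functions is where the weights come from, which is exactly why the classical bootstrap can be read as a Bayesian model.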


5 Things To Know About The Rise Of Open Source
If you still think open source technology is less reliable than proprietary software, or less secure, it’s time to learn more about the private sector’s digital revolution. During the past year major tech brands such as Google, Facebook and Microsoft have adopted a more open source philosophy, evident in their latest software releases. Similarly, more large companies are utilizing open source solutions alongside proprietary software to tap into open source’s diverse, creative, cooperative community of developers, thought leaders and users. If you want to expand the use of open source in your own business, there are a few things you should know.


What’s slowing down your network and how to fix it
The all-too-obvious answer is to see bandwidth as the problem, but on investigation the bottleneck is often not within the LAN environment, where plenty of bandwidth is available. More likely, the problem lies within the WAN, where capacity is more finite and expensive. Problems with slow networks in a WAN environment are more likely to result from not employing quality-of-service software, according to Jason Peach, principal consultant at Networks First. “Rather than throwing more bandwidth at the problem, using more intelligent analysis to optimise bandwidth is often a better way to solve a bandwidth contention – the problem in any network scenario – LAN, WAN or WLAN, for example,” he says.


How wearables and mobile health tech are reshaping clinical trials
The average cost of bringing a drug from development to FDA approval is over $2.5 billion, according to a recent study by The Tufts Center for the Study of Drug Development. This figure includes costs for the drugs that don’t make it through to the approval phase, and the Tufts Center notes that higher drug failure rates contribute significantly to increases in R&D costs. But there’s a big opportunity here: If life science companies can get enough insight early in development, they can create a more efficient drug development process and prioritize resources for the most promising therapies. Big data analytics and new clinical technology — such as mobile health solutions and wearable devices — promise to significantly change how trials are conducted and increase the value of the data and insights that come out of these trials.


Hollywood movies vs. the real future of AI
AI is a supremely complex technology to understand, let alone create, and oftentimes Hollywood blockbusters stretch the technology's limitations to fit some desired scenario. In other words, the AI popularized and propagated by Hollywood seldom reflects the direction the technology is actually headed. "AI is nowhere near able to take over the world in the next few years," said Charlie Ortiz, senior principal manager of the Artificial Intelligence and Reasoning Group within Nuance's Natural Language and AI Laboratory. "And given the distance to that point, there are lots of other futures that could evolve. It could very well evolve into something that is more helpful and collaborative and could teach us if necessary."


Designing an Impediment Removal Process for Your Organization
Instead of trying to find and eliminate waste as a means of improving efficiency, I find it more natural to focus on the flow of work as a means of improving effectiveness. From that perspective, two questions become central. The first question is “how does work flow through our system”? It can be very revealing for people to see the end-to-end picture of how work flows through the entire system, and not just their nominal area of functional responsibility. Managers and leaders from across the organization need to work together to create this picture. The second question is “what impedes the flow of work through the system”? Or, asked a different way, “what opportunities exist to improve the flow of work through the system?”


John Zachman on gaining synergies among the major EA frameworks
Friends of mine wanted me to change the name of this to Zachman Ontology, because if you recognize this, this is not a methodology; this is an ontology. This does not say anything about how you do Enterprise Architecture—top-down, bottom-up, left to right, right to left, where it starts. It says nothing about how you create it. This just says this is a total set of descriptive representations that are relevant for describing a complex object. ... A framework is a structure. A structure defines something. In contrast, a methodology is a process, a process to transform something. And a structure is not a process, and a process is not a structure. You have two different things going on here.



Quote for the day:

"If you genuinely want something, don't wait for it--teach yourself to be impatient." -- Gurbaksh Chahal

April 17, 2015

Cyber extortion: A growth industry
Jody Westby, CEO of Global Cyber Risk, also said in her experience, cyber extortionists have kept their side of the deal. She said for most of her clients, it comes down to a business decision. “I have seen IT guys say, ‘No way, we aren't negotiating or paying a dime,’” she said. “But then the CFO or another C-suite executive gets involved, evaluates the amount of money requested, and says it is a no-brainer: They are going to pay and keep the business running. It would cost more to have the system down.” Of course, not all extortionists are so “honorable”. According to Saengphaibul, “if you look hard enough, you’ll find numerous victims’ experiences showing hackers not upholding their end of the deal by not unlocking computers after ransom is paid.”


How To Build Better Products by Building Stronger Teams
So what is “great culture?” Too often, the visible trappings of culture -- free food, free drinks, yoga classes, Aeron chairs, video games, office Nerf guns -- are equated with culture, but this is a mistake. Yes, a lot of companies, especially in the technology business, are offering these things, and they are great, but they have nothing to do with culture. Culture is how we talk, work, organize, win, and lose together. It is not something that you can pinpoint, but it leads to happier, more productive employees. All the free food and office perks in the world are useless if people feel afraid of failure, trapped in a rigid hierarchy, or that their employer values profits over people.


How one company is using artificial intelligence to develop a cure for cancer
Thanks to partnerships formed with universities, hospitals, and even the U.S. Department of Defense, Berg and its supercomputers have been able to analyze thousands of patient records and tissue samples to find possible new drug targets and biomarkers. All this data crunching has led to the development of Berg’s first drug, BPM 31510, which is in clinical trials. The drug acts by essentially reprogramming the metabolism of cancer cells, re-teaching them to undergo apoptosis, or cell death. In doing so, the cancer cells die off naturally, without the need for harmful and expensive chemotherapy.


Big data is easier than ever with Google Cloud Dataflow
Big data applications can provide extremely valuable insights, but extracting that value often demands high overhead – including significant deployment, tuning, and operational effort – diverse systems, and programming models. As a result, work other than the actual programming and data analysis dominates the time needed to build and maintain a big data application. The industry has come to accept these pains and inefficiencies as an unavoidable cost of doing business. We believe you deserve better. In Google’s systems infrastructure team, we’ve been tackling challenging big data problems for more than a decade and are well aware of the difference that simple yet powerful data processing tools make. We have translated our experience from MapReduce, FlumeJava, and MillWheel into a single product, Google Cloud Dataflow.
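Dataflow itself is a managed service, but the MapReduce lineage the post mentions is easy to illustrate. Here is a toy transform-and-aggregate word count in plain Python; this is the general idea, not the Dataflow API:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    # Toy MapReduce: the mapper emits (key, value) pairs for each input
    # record, values are grouped by key, and the reducer folds each group.
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return {k: reducer(vs) for k, vs in groups.items()}

lines = ["big data", "data flow", "big flow"]  # made-up input
counts = map_reduce(
    lines,
    mapper=lambda line: [(w, 1) for w in line.split()],
    reducer=sum,
)
# counts == {"big": 2, "data": 2, "flow": 2}
```

Systems like FlumeJava and Dataflow generalize this pattern into pipelines of such transforms, handling the deployment, tuning, and scaling that the post describes as overhead.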


What's the real key to building a great tech team?
"Successful IT management is all about the people," he says, suggesting CIOs must understand the motivations of individuals both inside and outside the workplace. "I personally spend fifteen minutes with everyone that's about to join the organisation - and that's before we make an offer. Whether it's a help desk employee or an infrastructure director, it's crucial that I understand what they're like as an individual and what their interests are, and not just what they're like in a workplace," he says. Harley says his checks help ensure the HR team have explained to candidates the nature of the role and the likely pressures. "We're a very driven organisation and we're very busy. So, I reinforce that message. I want people to be resilient. People need to be a good fit culturally," he says.


IT's cloud security concerns do not correlate to actual failures
But in reality, cloud security is much different than what these surveys indicate. Indeed, the larger cloud service providers are doing a good job. Because cloud computing is still a fairly new technology, the providers use current approaches and mechanisms, such as identity-based security and advanced encryption for data at rest and in flight -- mechanisms many enterprises don't use internally. I suspect that most of the worries are driven by the natural fear that comes from not having direct control over your systems and data. To adopt the cloud, you must put your trust in other organizations. The cloud providers perhaps have not done a great job of explaining their true competence when it comes to security.


Why businesses need self-service business intelligence
Self-service business intelligence is not just for business leaders. Rather than limit access to data to senior management, organisations are finding it is crucial to properly equip all employees with intelligence they can act on. This is particularly so for small to medium-sized businesses, where investing in larger enterprise-level solutions that require multiple resources may not be a viable option. For small companies where employees wear many hats, to the largest of enterprises, it’s about making data analytics fit simply into the day-to-day. Rather than data belonging to IT, it’s about real people in business, who understand the topic and the environment, using data to get insight that’s actionable, and will positively impact their bottom line.


Business Rewritten By IT: A Mass Requirement for Automation
The disruptive impact that IT has had on almost every business can be traced back to its ability to deliver on those principles – efficiency, agility, better products. IT-led businesses must be agile in order to disrupt slow-moving market leaders and take advantage of the business opportunity differentiation offers. Technology-driven startups have to be efficient so they can battle with the balance sheets of the Fortune 500. (These balance sheets and huge investments often are in parallel to lethargy in reacting to the changing business landscape). IT-driven companies must be able to deliver ultimately better solutions to spark such dramatic market change in a relatively short period of time, and to drive businesses to incorporate IT into their offerings and infrastructures.


Big Data Processing with Apache Spark - Part 2: Spark SQL
Spark SQL, part of the Apache Spark big data framework, is used for structured data processing and allows running SQL-like queries on Spark data. We can perform ETL on data from different formats (JSON, Parquet, databases) and then run ad-hoc queries. In this second installment of the article series, we'll look at the Spark SQL library and how it can be used for executing SQL queries against data stored in batch files, JSON data sets, or Hive data stores. Spark 1.3, released last month, is the latest version of the big data framework. Prior to this version, the Spark SQL module had been in “Alpha” status, but now the team has removed that label from the library.
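A Spark cluster isn't needed to illustrate the register-then-query workflow described here. In this sketch, Python's stdlib sqlite3 stands in for a SQLContext: a small JSON data set is loaded, registered as a table, and queried with ad-hoc SQL (schema and data are invented for illustration):

```python
import json
import sqlite3

# A tiny "JSON data set", one record per line.
records = [json.loads(s) for s in (
    '{"name": "alice", "age": 34}',
    '{"name": "bob", "age": 29}',
)]

# Register the data as a queryable table, then run an ad-hoc SQL query,
# the same shape of workflow Spark SQL offers at cluster scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [(r["name"], r["age"]) for r in records])
rows = conn.execute("SELECT name FROM people WHERE age > 30").fetchall()
# rows == [("alice",)]
```

Spark SQL does the ETL and query execution in a distributed engine, but the programming model, structured data in, SQL out, is what this miniature shows.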



Quote for the day:

"Leading by example yields loyalty, leading by position yields frustration." -- @RichMcCourt

April 16, 2015

5 Factors to Retrospect after Every Sprint while Developing a Product
The essence of agile is to strive for continuous improvement through empirical process control. True agile teams find ways to improve through experimentation, finding sustainability, and delivering business value earlier. It is a never-ending journey, and a sprint retrospective emerges as an opportunity to further accelerate this improvement process. It is a great time to identify and analyse, in detail, extraneous factors which may otherwise distract the team’s focus. In this post, we highlight 5 factors which every agile team should retrospect on after each sprint. Let’s have a look.


Combining SIAM and DevOps for Digital Reimagination
Some of the most important aspects of the SIAM role are the coordination of people, processes, technology and data, and the governance across multiple suppliers, to ensure effective and efficient operations of the end-to-end service delivery to the business user. DevOps and SIAM converge in addressing current business and IT challenges and targeting people and attitude as primary drivers of performance and value. Whilst DevOps addresses the cons of functional specialisation and the spread of responsibilities across different IT teams, SIAM deals with the additional challenge of spreading services across multiple vendors.


Free ebook: Microsoft Azure Essentials: Azure Machine Learning
This ebook will present an overview of modern data science theory and principles, the associated workflow, and then cover some of the more common machine learning algorithms in use today. We will build a variety of predictive analytics models using real-world data, evaluate several different machine learning algorithms and modeling strategies, and then deploy the finished models as machine learning web services on Azure within a matter of minutes. The book will also expand on a working Azure Machine Learning predictive model example to explore the types of client and server applications you can create to consume Azure Machine Learning web services.


Lack of skilled infosec pros creates high-risk environments
A portrait of the ideal cybersecurity professional emerges from this list of shortfalls: the top three attributes are a formal education, practical experience and certifications. The study reveals that organizations are experiencing attacks that are largely deliberate, and they lack confidence in the ability of their staff. The top four threat actors exploiting organizations in 2014 were cybercriminals (46 percent), non-malicious insiders (41 percent), hackers (40 percent) and malicious insiders (29 percent). 64 percent are very concerned or concerned about the Internet of Things, and less than half feel their security teams are able to detect and respond to complex incidents.


How The Internet of Things Is a Transformational Opportunity
The Internet of Things looks like a massive opportunity for the years ahead: there are already many practical and valuable applications, and everything seems to indicate that we are just in the first stages of what could be a game-changing series of innovations. However, opportunity attracts competition, and IBM will need to compete against several big players trying to get a piece of the pie. In January 2014 Google invested $3.2 billion in the acquisition of Nest Labs, a leading player in smart thermostats and smoke alarms. That means Google invested more in a single purchase than IBM plans to invest over the coming four years in its whole Internet of Things initiative.


Will containers kill the virtual machine?
Containers are not a new technology: the earliest iterations of containers have been around in open source Linux code for decades. But in the past year they've captured the hearts and minds of many developers for building and running applications. Containers isolate specific code, applications or processes. Doing so gives whatever is inside the container a neat envelope for managing it, including moving it across various hosts. Whereas a virtual machine slices up a server into multiple operating systems, containers run atop the OS, so unlike a VM they don't require an OS to boot up when they're created. In essence they virtualize the operating system to provide a more lightweight way of packaging an application than a VM.


SSL/TLS/HTTPS: Keeping the public uninformed
Perhaps the most important thing to understand about the SSL/TLS/HTTPS system that secures websites is that you are not supposed to understand it. ... If SSL/TLS/HTTPS were really designed for security, this would have been done long ago. But secure websites are security theater: they seem to be secure, techies say they are secure (at least in public), but the system is flawed. It took so long to expose Superfish because the system is rigged against normal folks. Jonathan Zdziarski recently made another simple suggestion that, like mine, will never see the light of day. He points out that HTTPS interception, such as Superfish, can be detected if the web browser notices that the last X "secure" websites were all vouched for by the same Certificate Authority.
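Zdziarski's heuristic is simple enough to sketch. The Python fragment below (CA names purely illustrative, and a simplification of what a browser would actually track) flags the case where the last N "secure" sites were all vouched for by a single Certificate Authority:

```python
from collections import deque

def make_detector(window=5):
    """Track the issuing CA of the last `window` HTTPS sites visited.
    If every one of them was vouched for by the same CA, that hints at
    a local interception proxy (like Superfish) re-signing everything
    with its own root certificate."""
    recent = deque(maxlen=window)
    def observe(issuing_ca):
        recent.append(issuing_ca)
        return len(recent) == window and len(set(recent)) == 1
    return observe

# A healthy mix of CAs never trips the detector...
observe = make_detector()
alarms_normal = [observe(ca) for ca in
                 ["DigiCert", "Let's Encrypt", "GlobalSign",
                  "DigiCert", "Sectigo"]]

# ...but five sites in a row all signed by one CA does.
observe = make_detector()
alarms_intercepted = [observe("Superfish CA") for _ in range(5)]

print(any(alarms_normal), alarms_intercepted[-1])  # False True
```

A real implementation would need to allow for legitimate single-CA streaks (e.g. browsing many properties of one company), which is presumably part of why the idea never shipped.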


SEC’s Stein touches all the bases in discussion on data, technology
With a goal of collecting an estimated 58 million records per day, there is little doubt that CAT will require a tremendous amount of industry cooperation. However, Stein pointed out that a proposal that might seem like a regulatory reform fraught with headaches for the industry might eventually simplify the work of compliance professionals. “Only through CAT can we develop regulations that are driven by the facts,” Stein explained. Stein touched on how the Flash Crash and the lengthy investigation that followed highlighted the need for CAT, and lamented the slow march to implementation, which remains years away. “We need the CAT as soon as possible,” Stein said.


Infosec taking the strain as threats evolve and skills gap widens
Davis added that it may also indicate that information security professionals in Germany have a higher level of top executive support than in the UK and elsewhere in Europe. Despite budgets allowing for more personnel, 62% of respondents reported that their organisations have too few information security professionals – up from 56% in 2013. Frost & Sullivan estimates that the global workforce shortage will widen to 1.5 million in five years, while the variety and sophistication of cyber threats are expected to continue growing. The situation is exacerbated by the broadening footprint of systems and devices requiring security oversight. Signs of strain, including configuration mistakes and oversights, were identified as a significant concern, and recovery time following system or data compromises was found to be getting steadily longer.


Why CIOs can’t sell enterprise collaboration tools
One of the biggest challenges is determining how to implement enterprise collaboration in a cross-functional manner, says John Abel, senior vice president of IT at Hitachi Data Systems. “Teams are pretty good at communicating within their own group, but when it comes to integrating across departments, silos tend to happen, which ultimately becomes problematic when each team needs to align on certain campaigns or key topics,” he says. NetScout’s CIO and Senior Vice President of Services Ken Boyd says the landscape of collaboration tools available today makes it difficult to pick the best ones for a specific workforce. “Locating a collaboration tools provider that can offer the right balance for the needs of our enterprise users can be a significant challenge,” he says.



Quote for the day:

“...A man can only stumble for so long before he either falls or stands up straight.” -- Brandon Sanderson

April 15, 2015

GoodData analytics developers on what they look for in a big data platform
Far and away, the most exciting is about real-time personalized analytics. This allows GoodData to show a new kind of BI in the cloud. ... It's for telling you about what’s going on in your electric smart meter, that FitBit that you're wearing on your wrist, or even your cell-phone plan or personal finances. A few years ago, Vertica was blazing fast, telling you what a million people are doing right now and looking for patterns in the data, but it wasn’t as fast in telling you about my data. So we've changed that. With this new feature, Live Aggregate Projections, you can actually get blazing fast analytics on discrete data. That discrete data is data about one individual or one device. It could be that a cell phone company wants to do analytics on one particular cell phone tower or one meter.


Security risk potential linked to young, mobile users
The public sector was the least likely to report lost or stolen data, although that does not mean the public sector is not losing data. Attitudes were also lax among people working in high-tech industries, who were more likely than average to give up their device password if asked for it by IT, and in education, where teachers revealed a tendency to write their passwords down on a piece of paper. ... “Corporations have thought about security historically as very much a perimeter solution and put a big firewall at the gateway,” he said. “We’ve been eroding that for a good 10 years as information becomes more fluid, but we have not yet moved away from the idea that security sits only at the perimeter of the network.”


4 data wrangling tasks in R for advanced beginners
With great power comes not only great responsibility, but often great complexity -- and that sure can be the case with R. The open-source R Project for Statistical Computing offers immense capabilities to investigate, manipulate and analyze data. But because of its sometimes complicated syntax, beginners may find it challenging to improve their skills after learning some basics. If you're not even at the stage where you feel comfortable doing rudimentary tasks in R, we recommend you head right over to Computerworld's Beginner's Guide to R. But if you've got some basics down and want to take another step in your R skills development -- or just want to see how to do one of these four tasks in R -- please read on.


Report: Internet of Evil Things is your next nightmare
"Virtually every organization has some sort of rogue wireless access point or printer," Paget said. Worst of all, many companies don't know what devices are on their networks because employees can easily go out and buy them and install them themselves -- or bring them from home as part of corporate Bring Your Own Device programs. Employee-owned devices are a particular concern, Paget added, because there are limits to what a company can do to secure them. Overall, he said, when scanning corporate systems, Pwnie discovered that companies typically had two to three times more devices than they thought they did.


Intel & Ingenico Announce Secure Payment Agreement for the Internet of Things
“This is a great example of how innovation can simplify the purchasing experience and further enhance the merchant-consumer relationship. Bringing secure payment into connected devices will root our payment acceptance expertise in the Internet of Things.” “The shift in liability this October will be a major milestone in the United States for banks and credit card companies, but especially for retailers,” said Doug Davis, senior vice president and general manager, Internet of Things Group, Intel. “Intel and Ingenico Group are working to bridge the retail experience and security gap while also making sure devices are easy to deploy and manage so we don’t create new burdens for the merchants.”


Navigating An Internet of Things Legal Minefield
This article explores how big data and the rights of data subjects can coexist. With the help of Amor Esteban, an attorney who helps companies navigate these murky and often dangerous waters, we explore the balance that may be struck between a company’s legitimate business interests and respect for the individual’s right to data privacy. ... He currently chairs that group and is editor in chief of The Sedona Conference International Principles on Discovery, Disclosure & Data Protection: Best Practices, Recommendations & Principles for Addressing the Preservation & Discovery of Protected Data in U.S. Litigation. Together we will delve a little deeper into the development of IoT, the role of analytics in a complex IoT environment, and what companies should be considering before embarking on a project.


A 21st Century Way of Life: From 20th Century Work-Life Balance to Lifeworking
The reason that organizations have been slow to truly rethink the concept of work-life is due more to cultural inertia than any other factor. The industrial-age assumptions about technology, organization and processes have become deeply ingrained within society, and have been reinforced through general and business education and the media. In most organizations these deeply entrenched assumptions have become orthodoxy, and this is why the question of work-life balance remains. Some enlightened organizations have made progress in some areas, especially with regard to virtual working and flexible working time, but in most cases these initiatives only patch over the much deeper underlying problems.


Nearly 1 million new malware threats released every day
Directed attacks and data breaches also grew, according to Symantec. Five out of six large companies were targeted by cybercriminals, a 40% rise on the previous year. The mining industry was the world's most targeted sector. Samir Kapuria, a Symantec executive, recalled one case in which hackers snuck into an energy company's computer network and stole a draft report. The report detailed the secret discovery of a potentially lucrative energy drilling spot. Hackers were trying to sell the information on a black market website to stock traders, Kapuria said. But they were foiled when the energy company (operating under a pseudonym) told prospective black market buyers that the information was false. Kapuria declined to mention the name of the company.


Data breaches may cost less than the security to prevent them
In a March 2015 column on The Conversation, Dean provided a hard-to-disagree-with defense of why things security-wise "ain't gonna change" soon. "When we examine the evidence, though, the actual expenses from the recent breaches at Sony, Target and Home Depot amount to less than 1% of each company's annual revenues," wrote Dean. "After reimbursement from insurance and minus tax deductions, the losses are even less." Dean then administered the knockout punch: "This indicates that the financial incentives for companies to invest in greater information security are low and suggests that government intervention might be needed."


The Hybrid IT Enterprise Demands an End to Network Guessing Games
As visibility, control, and optimization are brought to hybrid networks it will become increasingly important to construct an analytics-driven infrastructure that can take action when problems occur anywhere in the network. We’re already seeing more IT organizations instrumenting network architectures with predictive analytics to create self-correcting, self-generating networks that respond to business needs and intents. Well-instrumented infrastructures provide the foundation for introducing automation. Such automation helps infrastructures react to changing demands without requiring manual intervention. Visibility tools can help to discover and map dependencies in application workloads, a necessary element for true workload portability.



Quote for the day:

"Courage is to never let your actions be influenced by your fears." -- Arthur Koestler

April 14, 2015

Enough With the Silos – Connect, Connect, Connect
It was the year that interest in Service-Oriented Architecture (SOA) exploded and began to influence the way developers built software. It was the year that virtual machines took off like a rocket. From a technology perspective, it was a busy year. Before then it usually made sense to drop applications into silos, no matter whether the silo was a cluster of powerful machines or a single server. You could provision enough hardware to ensure reasonable performance, configure the application for backup and recovery, wrap it all up in a bow and dump it in a dark corner of the data center. Windows and Linux both encouraged the silo approach because neither operating system shared resources efficiently between co-residing apps. It had become a one-app-per-server world.


Cloud machine learning wars heat up
Machine learning is the next frontier in Big Data innovation. And the cloud is the next frontier within that frontier. Almost five years ago, Google launched its Prediction API cloud-based machine learning service. This past July, Microsoft launched its Azure Machine Learning (Azure ML) service as a preview, and brought it into general availability in February. That service had (and has) surprisingly good integration with code written in the open source R programming language. ... They also provide APIs for developers to send input variable values and receive a predicted value for the target variable. The attraction of putting this all in the cloud is that any client application can run a prediction by making a single web service call.
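The attraction described here – any client running a prediction with a single web service call – can be sketched end to end with Python's standard library. Everything below is a toy stand-in, not the Azure ML or Prediction API interface: a hypothetical scoring function plays the role of a trained cloud-hosted model, served over HTTP and invoked with one POST request.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Hypothetical scoring function standing in for a trained model.
def predict(features):
    return 2.0 * features["x"] + 1.0

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the input variables, score them, return the prediction.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)
        result = json.dumps({"prediction": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

    def log_message(self, *args):  # keep the example's output quiet
        pass

# Serve the "model" on an ephemeral local port, in the background.
server = HTTPServer(("127.0.0.1", 0), ScoreHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: one web service call yields one prediction.
req = Request(f"http://127.0.0.1:{server.server_port}/score",
              data=json.dumps({"x": 4}).encode(),
              headers={"Content-Type": "application/json"})
response = json.loads(urlopen(req).read())
print(response["prediction"])  # 9.0
server.shutdown()
```

The cloud services wrap exactly this request/response shape around models trained at scale, which is why any language with an HTTP client can consume them.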


Multi Threaded PowerShell Cookbook
I had the idea to try to directly leverage the TPL from within PowerShell and effectively tackle the problem in exactly the same way as one would if writing multithreaded code in .NET, e.g., instantiating Task objects, etc. ... My preference was to use the TPL, but I quickly found that things didn't quite work. Although we can write .NET code directly from within PowerShell, that doesn't mean we should try to follow the same patterns in both. They are markedly different, and at the thread level I found that trying to instantiate and manipulate threads from within a PowerShell script was a recipe for disaster. That left me using the System.Management.Automation.Runspaces namespace, and the results were quite pleasing.
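The pool-of-workers pattern the author landed on with runspace pools is language-neutral: create a bounded pool, queue work items into it, and collect results as each completes. As a rough sketch of that shape (in Python's standard library rather than PowerShell, with an invented per-host task):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def probe(host):
    # Stand-in for real per-host work (pings, WMI queries, etc.).
    return host, len(host)

hosts = ["web01", "web02", "db01", "cache01"]
results = {}
# A bounded pool of workers, like a RunspacePool of size 4.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(probe, h) for h in hosts]
    # Harvest each result as soon as its worker finishes.
    for future in as_completed(futures):
        host, value = future.result()
        results[host] = value

print(sorted(results))  # ['cache01', 'db01', 'web01', 'web02']
```

The same division of labour applies in the PowerShell version: the pool caps concurrency, and the collection loop is where per-item errors surface.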


Pivotal sets the stage for open-source in-memory computing
Releasing the code is the first step in Pivotal's plan, formulated earlier this year, to open-source components of the company's Big Data Suite, which includes GemFire. Later this year, the company plans to release the code for its Pivotal Hawq SQL engine for Hadoop and the Pivotal Greenplum Database. Not all of GemFire is being open-sourced. The company is holding back some advanced features for its commercial edition, such as the ability to stage continuous queries and establish wide-area network connectivity between clusters. Those who pay for the commercial edition will also receive enterprise-level support.


Digital Lumens: Why CIOs should 'lean in' to the IoT
The first thing that CIOs need to do is lean forward into IoT. I think in many cases CIOs are watching it happen without their control and management. I think that engenders fear, engenders fear about management of data, engenders fear about products and organization, I'm sure engenders fear about security. ... It's the role of the CIO to lean forward, talk about the security and policy procedures of the company but then say, 'Well, once you have those in our building, how can we help you? How can we think about that data flow? How can we store that reliably for you? What are other integration points?'


Wearable devices - now a reality for the workplace
The primary reason for wearable devices is to gain access to IT resources without encumbering the user or getting in the way of the task at hand. So many other items of technology involve varying degrees of significant physical commitment – sitting down to use a desktop or laptop, two hands to use a tablet while standing – and even cradling a smartphone requires a hand and at least one eye or ear. Something worn on the wrist, accessed by a glance, tap or spoken word, not only fits a Dick Tracy wish-list, it also frees up the hands, is out of sight and allows the user to be 'footloose'.


Metadata-Driven Design: Designing a Flexible Engine for API Data Retrieval
From plain flat files to structured XML files to the more esoteric ones (like ISO 2709), developers and administrators have been shuffling these files and ingesting their data for decades. There are both advocates and naysayers on the time-honored practice of ingesting data files. Critics point out that data files are not real-time sources of information, and depending on the chosen format, it may require a certain amount of coordination and finesse in order for them to be handled properly. Advocates, on the other hand, would make the argument that data files have been used for decades, and as a result, the accrued cornucopia of libraries and commands for handling them can empower even the untrained novice.


3 best practices for bootstrapping an open source business
That open source startups are hard to find in the investment-first ecosystem is not surprising, because they're usually started by people who actually build the product. Most of the time, seeking early stage investment for an open source product doesn't make financial sense. On the other hand, there's much to be gained from the business and marketing knowledge in local startup communities, so being sequestered from them can put open source developers at a disadvantage. If you're bootstrapping your open source company, here are three tips to help you prepare for that ultimate transition from development project to fully fledged business.


IBM Creates Watson Health to Analyze Medical Data
The Watson Health announcement is also the latest in a flurry of initiatives IBM has announced this year that include new corporate partnerships as well as moves in cloud computing, data analytics and Watson. They are evidence that IBM is intent on investing for future growth, and showing it is doing so, in a year when its financial performance is likely to lag. IBM has reported disappointing earnings recently, and Virginia M. Rometty, IBM’s chief executive, has told industry analysts and investors that 2015 would be a transition year in which new growth businesses like Watson did not yet overcome the profit erosion in some of its traditional hardware and software products.


Government IT over the last five years – the good, the bad and the digital
“The landscape has changed significantly under the Government Digital Service. GDS has had a significant impact, and what’s happened which has been good is the dynamic and disruptive leadership shown by GDS in tech and digital and IT,” said Adam Thilthorpe, director of professionalism at BCS, the Chartered Institute for IT. “Some of the things they’ve done have had real impact on people’s lives and have made things better. Some of the things that they’ve done would actually be a great lesson to be listened to in the private sector.”



Quote for the day:

"It is always safe to assume, not that the old way is wrong, but that there may be a better way." -- Henry F Harrower

April 12, 2015

Balance exploitation & exploration within your organization with TOGAF
Every organization is confronted with ambidexterity. Ambidexterity is about achieving a healthy balance between the management of operations – the daily work, exploitation – and the management of innovation – discovering, incubating and accelerating new products and services, exploration. Ambidexterity within organizations means "exploiting the present and exploring the future". Consider financial services providers, for example consumer banks. As illustrated in the figure below, the daily operations of a consumer bank involve activities such as – and certainly not limited to – the management of:


A community distribution of OpenStack
It's worth pointing out that RDO is a community effort, so when it comes to support, the project's mailing lists, IRC channels and ask.openstack.org site are your best options. If you need professional support for your production environment, a commercial distribution like Red Hat Enterprise Linux OpenStack Platform (RHEL-OSP) would be the way to go. ... The rest of the work performed in RDO is done within the community boundaries. We follow most of the OpenStack and Fedora development conventions and practices, so sometimes the line between one and the other is blurred. Needless to say, everything done in RDO is open and committed to public repositories as it's being developed.


Data Viz Pioneer Nicholas Felton: "There Is A Real Shadow Over Data"
Ryan and I went out to California for some meetings about Daytum and about starting this pursuit of getting funding so we could work on it full-time. We went and talked to Mark and found out they were working on Timeline. We were especially interested in Open Graph, which was basically the ability to plug anything into Facebook. This included data sources that we were pretty interested in, like music, being able to visualize what you were listening to, or things that you’re watching from Netflix. At that point, the question for us was, "Do we want to work on Daytum and try and bring it to a grand scale, or have even a tiny influence on what 600 or 700 million people are using?" That was a hard conversation.


The Battle For Your Wrist Has Begun: Android Wear Versus Apple Watch
On the bright side, improvements to security could be coming in short order for Android Wear devices. Liviu Arsene, Senior Security Analyst at Bitdefender, explains, “These security risks could easily be fixed with stronger or better methods for ensuring the safety of the entire communication.” His suggestions include the use of Near Field Communication (NFC) to safely transmit a PIN code during pairing, but he warns that using NFC “would likely increase the cost and complexity of the devices.” An alternative method would be to “supersede the entire Bluetooth encryption between Android device and smartwatch and use a secondary layer of encryption at the application level.”


Containers Explained: 9 Essentials You Need To Know
At the most basic level, containers let you pack more computing workloads onto a single server, and let you rev up capacity for new computing jobs in a split second. In theory, that means you can buy less hardware, build or rent less data center space, and hire fewer people to manage that gear. Containers are different from virtual machines - which you probably already run using VMware or Microsoft Hyper-V virtualization software, or open source options KVM or Xen. Specifically, Linux containers give each application running on a server its own, isolated environment to run, but those containers all share the host server's operating system. Since a container doesn't have to load up an operating system, you can create containers in a split-second, rather than minutes for a virtual machine.


Strategic Torque: Enterprise Architecture & Portfolio Management
Practical application of integration based on theoretical foundations shows that the implementation of portfolio management is facilitated by enterprise architecture practices, and in doing so contributes to the realisation of strategic planning and the overall improvement of cross-competency IT effectiveness. This discussion will show that there is a history of risk aversion, opportunity cost and siloed ‘think’ in the IT departments of tertiary educational institutions. ... This optimisation of organisation and organisational change combines service-based, value-add client interaction with streamlined process (through silo integration) and the reduction of opportunity cost and waste. This is in part an effect of risk appetite and tolerance: the ability to influence outside one’s silo is perceived as riskier because control seems to be lacking.


Lean Documentation
People use documentation to find answers to the questions they have. The quality of the documentation can be measured by the time it takes to find the answers. We used Google Earth as a model. Have you ever tried to find your house on Google Earth (drilling down, not searching on address)? How long did it take? Maybe 30 to 60 seconds? Finding your house on the surface of the Earth is like finding 1 answer among 1.5 trillion (1.5 × 10¹²) answers. If you are looking for an answer it shouldn’t take more than 60 seconds, even if your system is complex and huge. How does this apply to documentation? We followed a hierarchy analogous to moving through the levels in Google Earth: moon level, satellite level, airplane level, helicopter level and so on.
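The Google Earth analogy works because hierarchy depth grows only logarithmically with the number of answers. A quick check on the article's own figure of 1.5 trillion:

```python
import math

# Depth of hierarchy needed to isolate one answer among 1.5 trillion,
# for a given branching factor (choices offered at each level).
total_answers = 1.5e12
levels = {b: math.ceil(math.log(total_answers, b)) for b in (10, 100)}
print(levels)  # {10: 13, 100: 7}
```

With around 100 choices visible at each level, roughly 7 drill-down steps suffice, so even several seconds per step stays comfortably under the 60-second budget.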


Deep Gooses MySQL Performance with New Database Math
Instead of continually writing data to disk, CASSI uses machine learning algorithms to better predict the optimal moment to write data to disk, based on the particular configuration and capability of a computer, says Chad Jones, the chief strategy officer for Deep IS. “As things come in we’ll say, ‘What’s the best way to handle this by splitting up the in-memory and disk structures,'” Jones says. “We’re able to put an adaptive layer in between. It allows us to say ‘I’m not going to write this down right now because the data hasn’t quiesced. I keep seeing a lot of changes in this one column of data, so let’s defer writing until we know it’s ready to be written and then write it, so we eliminate a lot of extra work in the database.'”


You can’t have Big Data until you have Good Data
Rather than rushing in and trying to learn big data analytics by searching through irrelevant data collected by separate IT systems, companies should prepare the ground and start organising their data – show it some respect. Capturing data from lots of different places – emails, forms on the website, even manual entry – can introduce mistakes, so that when it comes to analysing data, companies are not always analysing the correct information; it might be old data or based on false inputs. Companies must stop measuring the wrong data, stop deceiving themselves about the accuracy of their data, and go back to basics. There are many data capture solutions available on the market. For example, in the finance department, accounts processing today should include scanning paper-based invoices as standard and adding them to your PDF invoices from email.


What Are the Legal Concerns in a HIPAA Risk Assessment?
“There are handfuls of different reasons to have security folks look at your systems and audit you and give you various reports, and that’s fine,” Rostolsky said. “Ultimately, you need to have something that’s specifically looking at the security requirements and speaks and uses HIPAA language in the assessment.” Essentially, healthcare organizations should not rely on a false sense of security. It’s important that when their data systems and safeguards are being reviewed, that facilities try and keep in mind what the OCR would be looking for so no areas are missed. Having current physical safeguards, administrative safeguards, and technical safeguards is not only required by the Security Rule, but they work together to protect health information, according to Spencer.



Quote for the day:

"The old mantra of ‘be everywhere’ will quickly be replaced with ‘be where it matters to our business’." -- Mike Stelzner

April 11, 2015

Big Data Platforms: How To Migrate From Relational Databases to NoSQL
With our discussion scope sufficiently narrowed, we'll start by tackling a relatively simple relational structure. The very first thing we'll need to do is to evaluate which entities can be de-normalized to become what I call super-classes. "Super-class" is not a standard big data term. It's my term and I find it makes things easier for the initial discussion. I'll explain why later. Each of these super-classes will be used to help define the new composite structure (an actual Big Data term). We'll be using the following Entity Relationship Diagram (ERD) to lay out the steps needed to identify our super-classes.
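The de-normalization the author describes – folding a parent entity and its dependent rows into one self-contained "super-class" document – can be sketched concretely. The orders/line-items pair below is a hypothetical relational structure invented for the sketch, not the article's ERD:

```python
# Hypothetical relational rows: a parent table and a child table
# joined on order_id.
orders = [
    {"order_id": 1, "customer": "alice"},
    {"order_id": 2, "customer": "bob"},
]
line_items = [
    {"order_id": 1, "sku": "A-100", "qty": 2},
    {"order_id": 1, "sku": "B-200", "qty": 1},
    {"order_id": 2, "sku": "A-100", "qty": 5},
]

def denormalize(parents, children, key):
    """Fold child rows into their parent row, producing one
    self-contained document per parent (the 'super-class' that
    defines the composite structure)."""
    docs = {p[key]: {**p, "items": []} for p in parents}
    for child in children:
        row = {k: v for k, v in child.items() if k != key}
        docs[child[key]]["items"].append(row)
    return list(docs.values())

documents = denormalize(orders, line_items, "order_id")
print(documents[0])
# {'order_id': 1, 'customer': 'alice',
#  'items': [{'sku': 'A-100', 'qty': 2}, {'sku': 'B-200', 'qty': 1}]}
```

Each resulting document can be read or written in one operation, which is the payoff that makes the join-free composite structure attractive in a NoSQL store.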


5 Competitive Strategies of Successful (and Ethical) Companies
Ethics becomes part of the competitive advantage that enables them to succeed. When I talk about a conscious strategy incorporating ethics, I am not thinking of a formal (written) strategic plan. Many organizations do not have formal strategic plans. But whether or not there is a formal plan, successful companies employ certain strategies to compete effectively. It is among these competitive strategies that ethics finds a place. I identify five competitive strategies common to companies that are successful and ethical on a sustained basis. None of these strategies considered alone guarantees ethical success. However, each strategy increases your chances of combined ethical and market success.


The App That’s a Breath of Fresh Air
Like many other innovations, BreezoMeter was born out of frustration. Its CEO, Ran Korber, was frustrated by the lack of centralized air quality information available when he was seeking a place clear of air pollution for his new home in Israel. As an environmental engineer with a pregnant wife, he was particularly concerned about the air quality. Finding nothing on the market that provided all the answers he sought, he created his own solution. The app proved successful in Israel, where 300 sensors sufficed to cover the most populated parts of a country roughly the size of New Jersey. Scaling up to cover an area hundreds of times bigger was a challenge for the startup. BreezoMeter’s CMO, Ziv Lautman, said it took half a year to collect air quality data from thousands of sensors scattered around the United States.


IS Audit Basics: Auditor: About Yourself (And How Others See You)
Technical expertise is necessary, but not sufficient to be or become a successful auditor. That is, a successful auditor is one who is credible, respected and personable enough to be considered a valuable source of information and advice. Having a good knowledge of oneself and the soft skills that facilitate human interaction is just as important as professional knowledge and, probably, harder to acquire. Being sensitive to how others perceive us is at least as important. “O would some Power with vision teach us to see ourselves as others see us! It would from many a blunder free us, and foolish notions.”


10 minutes with… Two-Factor Authentication author Mark Stanislav
By combining different ‘factor classes’ (e.g. something you have, something you know, something you are), account security is greatly strengthened, as getting past two factors is a difficult hurdle for a criminal. Because passwords are often poorly created, easily stolen, and commonly reused, their ability to protect our most important systems and services isn’t well matched to the needs and risks facing people today. Through the book I am able to educate my readers about not just what two-factor authentication is, but what choices they have to do it, what the upsides and downsides are to different methods, and what they should think about to make sound decisions regarding their security needs.
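The most common "something you have" factor is a time-based one-time password (TOTP), specified in RFC 6238. As a minimal sketch of how both the server and the user's device derive the same short-lived code from a shared secret, using only the Python standard library (real deployments should use a vetted library; the secret here is a made-up example):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    counter = struct.pack(">Q", for_time // step)          # 30-second window index
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Both sides compute the code independently; no code travels over the wire
# until the user types it in, and it expires with the time window.
code = totp(b"example-shared-secret", int(time.time()))
```

Stealing the password alone is no longer enough: the attacker would also need the device holding the secret, which is exactly the "two factor classes" point above.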


Intuitive Reasoning, Effective Analytics & Success: Lessons from Dr. Jonas Salk
To perceive something differently or even to know something as being true is of little or no value if you’re not willing to stand apart from the crowd. It’s very clear from his interview this was never an issue for Dr. Salk. He was extraordinarily thick-skinned, and had an exceptionally healthy attitude regarding criticism and rejection. And yet, he was fully willing to follow the hard road necessary for a new truth to be recognized and accepted. People lacking these high-EQ attributes are unfortunately likely to keep intuitive reasoning to themselves or just give up. ... the greatest insights, advances and innovations using big data will come from people with unique subject matter expertise and high intuitive reasoning skills – enabling them to “see” challenges very differently. And they will probably not be formally trained in data science or programming.


Burn Rate Doesn’t Matter
Too bad burn rate doesn’t matter. More specifically, burn rate (net cash outflow per month) is a vanity metric. Just as top-line revenue doesn’t tell you much about the health of a DJIA blue-chip, burn rate says very little about whether a startup is on track. Only by evaluating a company’s use of cash and long-term strategy can high burn be diagnosed as good or bad. In many cases, the low burn ideal is actually dangerous. At Founders Fund we avoid investing in companies unless they are consuming cash. We’re here to invest when doing so will bring about positive progress faster, which often manifests as the conversion of cash into assets and increased burn. Cash-flow-positive businesses are usually past this inflection point, or simply don’t have enough ideas about what valuable things to do with more money.


Self Service: A Data Scientist Productivity Boost
There are no fewer than six new and emerging roles within any organization, with data developers/engineers and business analysts being two of those, according to a recent Forrester webcast. The pool of data developers and engineers is roughly three million worldwide. These individuals count data modeling as a core skill; data is in their DNA and the IT department is their home. Data developers have Excel, SQL, Microsoft Access and declarative dataflow diagrams down cold. They can work in declarative programming metaphors, drawing dataflow mapping diagrams of what they want the system to do, but don’t necessarily do a lot of coding. The challenges this group faces are similar to those of the data scientist.


Surveys: Employees at fault in majority of breaches
"Security awareness is a must, but it's a slow and difficult task, and as CompTIA study shows human error is still the largest factor behind security breaches," said Igor Baikalov, chief scientist at Los Angeles-based Securonix, Inc. "The game changer," he said, "is continuous risk monitoring through automated analytics." It can detect human error, reduce false positives, and lower incidence response times, he said. "Humans were always considered to be the weakest point of the IT security chains -- and the more privileges they have, the more risk they pose to the corporate network," said Péter Gyöngyösi, product manager at Luxembourg-based BalaBit IT Security.


Asynchronous Programming in .Net with QnA
The Task-based Asynchronous Pattern (TAP) is based on the concept of a task, represented by the Task type in the System.Threading.Tasks namespace. A task represents an asynchronous operation whose completion you can wait for, which you can cancel, or to which you can attach a continuation that executes when the operation completes. It provides an object-oriented approach to writing asynchronous code. This frees the developer from worrying about the semantics of the language or execution environment when running an asynchronous operation, letting them focus instead on the functional aspects of the application. The core idea is to let the developer execute methods on a separate thread seamlessly.
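The excerpt describes .NET's pattern; the same three ideas (await completion, cancel, attach a continuation) appear in Python's asyncio, sketched here purely as a language-neutral analogy rather than as the .NET API itself:

```python
import asyncio

async def fetch_data() -> str:
    # Stand-in for real asynchronous I/O (a web request, a DB call, ...).
    await asyncio.sleep(0.01)
    return "payload"

async def main() -> str:
    task = asyncio.create_task(fetch_data())     # start the operation
    task.add_done_callback(lambda t: None)       # attach a continuation
    # task.cancel() would be the cancellation path; here we just await it.
    return await task                            # wait for completion

result = asyncio.run(main())
```

As in TAP, the calling code reads top-to-bottom like synchronous code while the runtime handles scheduling, which is the "focus on functional aspects" benefit the excerpt describes.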



Quote for the day:

“Stories are the single most powerful weapon in a leader’s arsenal” -- Howard Gardner

April 10, 2015

SDDC adoption on a 'slow roll'
"It is a very new model, especially on the [software-defined networking] side," Dennehy said. "Customers are being extra careful about how they go down this road." In addition to the changes that SDDC brings to hardware and software, it also will usher in changes to IT staff. Tasks previously performed by highly skilled employees can be performed by software, according to Forrester's brief, "The Software-Defined Data Center Is Still A Work In Progress" by Richard Fichera. ... "The adoption of SDN is really concentrated in telecom and the very big data centers such as Google, Amazon and Facebook," Dennehy said. As for software-defined storage, it's not "plug and play" said Stanley Stevens, also a senior analyst at TBR.


Technology is turning genealogy on its head
The search for identity is often rooted in the past, which is why genealogy remains so popular. Technology has helped in many ways, from making it easy for home family-tree builders to create diagrams and search local council records, to powerful servers crunching data to find geographic correlations that might imply family connections. And then there's our DNA. Watson and Crick's discovery of the double-helix DNA 'code' didn't immediately change the world. But as computing power has increased, so too has the scope of DNA analysis. Sequencing that once cost tens of thousands of dollars now costs much less than one percent of that – and it's sequencing that tells us who we are biologically, or at least what we're made of.


Lambda Complexity: Why Fast Data Needs New Thinking
Rather than address the flaws directly, you simply run both the batch and streaming systems in parallel. Lambda refers to the two systems as the “Speed Layer” and the “Batch Layer”. The Speed Layer can serve responses in seconds or milliseconds. The Batch Layer can be both a long-term record of historical data as well as a backup and consistency check for the speed layer. Proponents also argue that engineering work is easier to divide between teams when there are two discrete data paths. It’s easy to see the appeal. But there’s no getting around the complexity of a Lambda solution. Running both layers in parallel, doing the same work, may add some redundancy, but it’s also adding more software, more hardware and more places where two systems need to be glued together.
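The duplication the excerpt criticizes can be seen in a toy sketch: a slow-but-complete batch view and a fast-but-partial speed view computed over the same kind of events, merged at query time. Event names and counts here are invented for illustration.

```python
# Historical events already processed by the Batch Layer.
batch_events = [("page_view", 1), ("page_view", 1), ("signup", 1)]
# Recent events only the Speed Layer has seen so far.
speed_events = [("page_view", 1)]

def count_view(events):
    """The same aggregation logic, which Lambda forces you to run twice."""
    counts = {}
    for key, n in events:
        counts[key] = counts.get(key, 0) + n
    return counts

def query(key):
    # The serving layer merges both views to answer a query.
    return (count_view(batch_events).get(key, 0)
            + count_view(speed_events).get(key, 0))
```

Note that `count_view` had to exist in both layers (here it is shared only because this is a single toy script); in a real Lambda deployment the batch and streaming systems are separate codebases, which is precisely the "two systems glued together" complexity described above.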


HP Spectre x360 review: A sexy convertible that just can't take the heat
This configuration is actually fairly competitive. Outfitted with similar components, Dell’s XPS 13, for example, is $800—but it’s not a convertible and it even lacks the touchscreen at that price. Also, the XPS 13’s smaller, lighter form factor feels great until you touch the keyboard. The Spectre x360’s keyboard is far more comfortable to type on than the XPS 13’s. Frankly, I’d probably trade the XPS 13’s compact size for the Spectre x360’s keyboard in a second if it were my everyday driver. Other details of the Spectre x360 also impressed me. The tiny power button on the left side of the frame is a bit annoying—you have to hunt for it. However, it takes just enough pressure that you can’t easily activate it by accident. On the convertible Yoga 3 Pro, I’d put the machine to sleep all the time just by picking up the chassis.


Why heresy is good business strategy: Dell’s Armughan Ahmad
Ahmad said that Dell Blueprints – which optimize Dell integration with partner ecosystems – are a critical part of their strategy. Dell Blueprints comprise five separate disciplines: Unified Communications and Collaboration, like Skype for Business (formerly Microsoft Lync); Enterprise Applications such as OLTP, CRM and databases; VDI; Big Data analytics; and high performance computing. “Underneath these, we have these vendor partnerships, and all these companies power these solutions for us,” Ahmad said. “We let them put their hooks deep into our products, and we are willing to democratize the IT for that.” Here’s where the heresy comes into play. Dell’s model embraces a willingness to wipe Dell’s own technology off the partnered products, sacrificing short term CAPEX profits, in the interest of longer term benefits from reducing customer OPEX costs.


A Data Scientist's Advice to Business Schools
The expectation of any business graduate is that they can speak a middle language between the priorities of a business and the deep domain knowledge of a company's experts. They should carry that 'generalist's touch' and be able to synthesize myriad high-level approaches into real-world utility for their organization. To produce graduates like this, a business school must find ways to teach the general high-level approaches used by domain experts across a company's departments. Graduates should have an understanding of how an expert's deep expertise in their field adds value to the overall strategic direction of the company. Only then can value-producing conversations and disruptive ideation exist between the business graduate and the domain expert.


AT&T's data breach settlement called a 'slap on the wrist'
It's "alarming" that AT&T allowed contractor workers to have access to unencrypted customer records, Blech added. "There should no longer be any debate as to whether sensitive customer data should be encrypted or not," he said. It's interesting that the data breach settlement came through the FCC, when the U.S. Federal Trade Commission has been the agency that often pursues companies for data breaches, said Robert Cattanach, a partner at law firm Dorsey & Whitney focusing on cybersecurity and other regulatory litigation. The FCC settlement, the largest in agency history for a data breach, "ups the ante" for penalties, but the FCC may still have been a better option for AT&T, Cattanach said.


Q&A: Marcus Ranum chats with Privacy Professor CEO Rebecca Herold
Identify the risks those vendors present to the organization based on a variety of factors, including the types of information they are accessing, whether or not they are storing sensitive and personal information within their own systems, and the types of safeguards they have in place for those systems. Document it. Determine which vendors are high, medium and low risk; then dedicate attention appropriately. Perform regular security and privacy reviews -- there are many ways to do this -- for the high-risk vendors, as well as appropriate checks for the medium- and low-risk vendors. Keep an eye out for any published reports of breaches for the vendors they are using.


Internet of Things must learn interoperability lessons from history
“IoT is a whole myriad of different ways of connecting things,” he says. “It could be fixed, Wi-Fi, NFC, cellular, ultra-narrow band or even ZigBee - so many but they have different uses. You have to mix and match what is best to make connections work.” In the early days of Ubiquisys, Franks encountered similar issues. There were, he says, a number of proprietary wireless technologies that wouldn’t talk to each other, making it impossible to roam from town to town, let alone country to country. The solution was to get all the technologies into the same room and try to thrash out an interoperability plan.


You’ve Completed Unit Testing; Your Testing has Just Begun
Stopping just after unit testing the code is akin to starting mass production of automobiles after testing each nut and bolt of a car. Of course nobody would ever take such a huge risk; in real life, the car would first be taken on many test drives to check that not just every nut and bolt, but every other part performs in coordinated orchestration as intended. In the software development world, test driving translates into what we affectionately refer to as integration testing. Integration testing guarantees that the collaboration of classes works. In the Java world, both the Spring framework and the Java EE platforms are containers that provide APIs over available services, for example JDBC for database access.
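The distinction can be made concrete with a small sketch, here in Python with the built-in sqlite3 module standing in for the JDBC example above; the repository class and table are invented for illustration.

```python
import sqlite3

class UserRepository:
    """A data-access class whose unit tests might mock the connection,
    but whose integration test must collaborate with a real database."""

    def __init__(self, conn):
        self.conn = conn

    def add(self, name):
        self.conn.execute("INSERT INTO users(name) VALUES (?)", (name,))

    def count(self):
        return self.conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

# Integration-style check: the class, the SQL, and an actual (in-memory)
# database engine working together -- the "test drive", not the nut and bolt.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
repo = UserRepository(conn)
repo.add("alice")
```

A unit test of `add` in isolation could pass even if the table name or schema were wrong; only the integration test, which exercises the real collaboration, would catch that.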



Quote for the day:

"Anyone can hold the helm when the sea is calm."  -- Publilius Syrus