June 30, 2015

How three years of Yammer has changed Microsoft for the better
Early Yammer use at Microsoft wasn't all positive, but even the arguments could be productive, Pisoni noted when we met last year. "Someone at Microsoft made a comment about a product in a mean-spirited way and the person who made the product popped up and said, 'Hey I make that product,' and they had a conversation. I think there were parts of the company so used to being in a silo that they had to realise they could communicate and get used to having constructive conversations and people got more constructive with each other." Fast forward to 2014 and if you worked at Microsoft and you wanted a new laptop, you had to use Yammer to request it. But it wasn't just a different way of filling in a form, Coplin told us. "It's not just another inbox, it's a culture. You're now measured on your contribution across the company."


The C-Suite Needs a Chief Entrepreneur
So if the CEO isn’t someone who can innovate, then who should? It’s a question that I’ve discussed with Lean Startup pioneer Steve Blank and business thinkers such as Yves Pigneur, Henry Chesbrough, and Rita McGrath. We believe that CEOs need a partner for innovation inside their companies, someone who will create and defend processes, incentives, and metrics that encourage radical ideas and find new areas for growth. It’s an executive who can help large companies reinvent themselves while they’re still successful. And this new role needs to sit in the C-suite. You could call this person the Chief Entrepreneur (CE) — someone who can lead the future of the company while the CEO takes care of running the existing business. This is a huge divergence from the traditional norm for chief roles, but the CE is a necessary position of power to ensure that a company innovates.


Avoid culture shock in your next cloud computing project
"An engineer thinks, 'Oh, if you automate this… you're not going to have any work for me,'" said Jason Cornell, manager of cloud and infrastructure automation at Cox Automotive, an Atlanta-based provider of vehicle remarketing services. Enterprises must emphasize that cloud's automation will allow engineers to perform other, higher-value tasks for the business. "What I've been trying to do within my organization is to get us to the point where we can work higher and higher up the stack and get further and further away from managing things like patching [or the] maintenance of servers," Maddison said. "Those kinds of things we want to throw out the window, and get us to focus on stuff that is actually going to add business value."


CIOs warned of IT overspending risks as US dollar strength grows
“We’re looking at the US dollar being high through 2017, but the effect we’re seeing now is around product vendors re-pricing, using obfuscation and trying to keep the revenue they generate in line [with their in-house projections],” he said. “Next year it will be about recovering margin and the year after that, there will be a whole new world out there.” The latter point is in reference to the long-term effect the strong dollar will have on import and export arrangements between different countries, which could benefit some suppliers and harm others. “Printers are an interesting market right now, particularly in Europe, as it’s the only market where we have a vendor who has non-US based products and that’s Fujitsu,” said Lovelock.


Announcing the R Consortium
While the R Foundation continues its role as the maintainer of the core R language engine, the R Consortium will initiate projects to help the user community make even better use of R, and to help the R developer community further extend R via packages and other ancillary software projects. Projects already proposed include: building and maintaining mirrors for downloading R; testing and quality assurance platforms; financial support for the annual useR! Conference; and promotion and support of worldwide user groups. In general, the Consortium will seek the input of its members and the R Community at large for projects that foster the continuing growth of R and the community of people that drives its evolution.


OPM Chief's new Cyber Defense operation has potential
"Within almost any organization, there is a tendency for structure to drive behavior and for execution toward goals to be the ones that are measured by management," Harkins said. "By publicly demonstrating the leadership of accountability," OPM will surely "be able to stay on top of future risks because they will have the structure to drive prevention of issues and learn from incidents that may occur." Cylance late last year published an analysis labeling Iran a rising power in cyberspace, comparable to China, and specifically cited a campaign dubbed Operation Cleaver. On Friday, The Hill reported the group behind that series of attacks provided WikiLeaks with about 70,000 confidential cables from Saudi Arabia’s Foreign Ministry.


Cisco fleshes out its Internet of things system, portfolio
Cisco's IoT approach revolves around a portfolio of products, a reference architecture and an ecosystem of partners such as Rockwell Automation and GE. Not surprisingly, Cisco's IoT rollout includes a heavy dose of infrastructure, ranging from networking gear to security cameras. But Cisco also added tools for analytics and application management, as well as "fog computing." Fog computing is an extension of the cloud designed to manage data from sensors and edge devices. For instance, a temperature reading taken every second doesn't need to be uploaded to the cloud. Fog computing techniques would take that real-time data, average it out based on parameters and upload it to the cloud every half hour or so. If the temperature went out of range, the sensor would have enough intelligence to act quickly.
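
A minimal sketch of that fog pattern, assuming a Java gateway process sitting next to the sensor; the uploadToCloud and triggerAlarm methods are placeholders invented for illustration, not part of any Cisco API:

```java
import java.util.ArrayList;
import java.util.List;

// Fog-computing pattern described above: a node close to the sensor keeps raw
// readings local, pushes only periodic averages to the cloud, and reacts
// immediately when a reading crosses a threshold.
public class FogTemperatureNode {

    private static final double MAX_SAFE_TEMP = 85.0;              // degrees C
    private static final long UPLOAD_INTERVAL_MS = 30 * 60 * 1000; // every half hour

    private final List<Double> buffer = new ArrayList<>();
    private long lastUpload = System.currentTimeMillis();

    public void onReading(double tempCelsius) {
        if (tempCelsius > MAX_SAFE_TEMP) {
            triggerAlarm(tempCelsius);    // act locally, no round trip to the cloud
        }
        buffer.add(tempCelsius);

        long now = System.currentTimeMillis();
        if (now - lastUpload >= UPLOAD_INTERVAL_MS) {
            double avg = buffer.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
            uploadToCloud(avg);           // only the aggregate leaves the edge
            buffer.clear();
            lastUpload = now;
        }
    }

    private void triggerAlarm(double temp) {
        System.out.println("Local alarm: temperature " + temp + " C out of range");
    }

    private void uploadToCloud(double averageTemp) {
        System.out.println("Uploading half-hour average: " + averageTemp + " C");
    }
}
```

Only the aggregate and the alarms ever leave the edge, which is the bandwidth and latency win the fog model is after.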


Re-imagine Master Data Management -- With Graph Databases
Current MDM solutions typically store their data in a Relational Database Management System (RDBMS), which makes it hard to see relationships within the data and leverage those insights in real time for competitive advantage. The new MDM needs to use a different type of data store, one optimized to quickly discover new insights in existing data, provide a 360-degree view of master data, and answer questions about data relationships in real time. The good news is that this is not only possible but happening today, thanks to new technologies and approaches that transform the concept and execution of MDM, enabling companies to consolidate data from many channels into one and offer a highly related, true view of this data.
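
As a toy illustration of the relationship-centric view a graph store gives MDM, here is a minimal Java sketch; the record names and relationship labels are invented for the example, and a real deployment would of course use a graph database rather than an in-memory map:

```java
import java.util.*;

// Master records as nodes, labelled edges as relationships; a "360-degree view"
// of a customer is simply a traversal from that node.
public class MasterDataGraph {

    private final Map<String, List<String[]>> edges = new HashMap<>();

    public void relate(String from, String relation, String to) {
        edges.computeIfAbsent(from, k -> new ArrayList<>())
             .add(new String[] { relation, to });
    }

    // Everything directly connected to a master record, grouped by relationship.
    public Map<String, List<String>> view360(String node) {
        Map<String, List<String>> view = new HashMap<>();
        for (String[] edge : edges.getOrDefault(node, Collections.emptyList())) {
            view.computeIfAbsent(edge[0], k -> new ArrayList<>()).add(edge[1]);
        }
        return view;
    }

    public static void main(String[] args) {
        MasterDataGraph mdm = new MasterDataGraph();
        mdm.relate("customer:42", "PLACED", "order:1001");
        mdm.relate("customer:42", "USES_CHANNEL", "channel:web");
        mdm.relate("customer:42", "HOUSEHOLD_OF", "customer:43");
        System.out.println(mdm.view360("customer:42"));
    }
}
```

A production graph database adds persistence, indexing and multi-hop query languages on top of the same idea, which is what makes real-time relationship questions practical at master-data scale.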


Cybersecurity’s future will require humans and machines to work symbiotically
Companies are building their own security solutions to detect and mitigate attacks at the earliest possible stages, but as time goes on, more devices are shared across contexts by multiple users. That means the methods by which attacks are perpetrated will multiply. The modern enterprise lives across the cloud, mobile devices, and the Internet of Things, which means the approaches we previously used to defend against cyber threats are no longer viable. ... In most enterprise settings, security data gets collected and correlated in SIEM (Security Incident and Event Management) products made by Splunk, LogRhythm, and others, and it ends up overwhelming the security analysts tasked with making sense of it.


Compliance Is Now Calling the Shots, and Bankers Are Bristling
“These people are in great demand,” says Maurice Gilbert, founder of Conselium, a headhunting firm in Dallas. Gilbert used to do executive searches for all sorts of positions. “Then, about eight or nine years ago, we got a compliance search,” he says. “And then we got another one. And we said, ‘Is this the tip of the iceberg?’” It was. Now, compliance is all Gilbert does. His biggest payday usually comes when he places a chief compliance officer. These people sometimes report straight to the board of directors, and they make really good money, Gilbert says. In April, a very large pharmaceutical company had him looking for a compliance head to come aboard at $1.5 million a year.



Quote for the day:

"People not only notice how you treat them, they also notice how you treat others." -- Gary L. Graybill

June 29, 2015

Blending humans and technology in the workforce
A key enabler behind this co-operation lies in the interfaces. Advances in natural language processing (NLP) and speech recognition are making it a lot easier for humans to interact with machines in real time. Speech recognition is becoming more effective thanks to the growing capability of machines to “understand” unstructured conversations. This is aided by the ability of machines to make instant internet searches and to use contextual clues. At the same time they are becoming more effective at incorporating user feedback to improve their accuracy. Such is the interest in NLP technologies that the market for this application is expected to grow quickly to reach $10bn by 2018 from $3.7bn in 2013.


DevOps & Product Teams - Win or Fail?
Developers are expected to deliver product features or user stories, preferably in a predictable way. When unforeseen problems cause delays, developers - keeping the release date in sight - struggle frantically to compensate, releasing incomplete features (although some would argue that there’s no such thing as releasing too early). Operations, on the other hand, is usually judged on availability. MTTR may be more DevOps-friendly than MTBF, but regardless of how it's measured, outages are more difficult to prevent in the face of constant change. This can cause engineers in operations to be over-cautious and too conservative. If lots of new product features are deployed to production, the developers get the credit, but if any of those shiny new features cause an outage, the operations guys will be waking up to fix it.


UC San Diego Researchers Amp Up Internet Speeds
The results of the experiment, performed at UC San Diego's Qualcomm Institute by researchers from the Photonics Systems Group and published in the June edition of the research journal Science, indicate that fiber information capacity can be notably increased over previous estimates by pre-empting the distortion effects that will happen in the optical fiber. The official name of the paper is "Overcoming Kerr-induced capacity limit in optical fiber transmission." "Today's fiber optic systems are a little like quicksand," Nikola Alic, a research scientist from the Qualcomm Institute, the corresponding author on the Science paper, and a principal of the experimental effort, wrote in a June 25 statement.


Urgency of Present and Past in IoT Analytics
The most basic obstacle to extracting value from this OT-generated data is connecting it to traditional Information Technology (IT) systems. This integration is problematic because IT and OT evolved in different environments and use different data types and architectures. While IT evolved from the top down, primarily focusing on serving business and financial needs, OT grew from the bottom up, with many different proprietary systems designed to control specific equipment and processes. IT systems are based on well-established standards that can integrate large volumes of information across different applications. In the world of OT, no single industry standard yet exists for Machine to Machine (M2M) connectivity.


Why is Virtualization creating Storage Sprawl?
Storage sprawl has become worse in organizations that have made a significant investment in virtualization technologies. For example, most virtual desktop infrastructures (VDI) have an all-flash array to handle desktop images, preventing boot and logout storms and maintaining acceptable performance throughout the day. But most VDI environments also need a file store for user home directories. There is little to be gained if this data is placed on the all-flash array, but data centers certainly need to provide storage to their users to support user-created data. As a result, most organizations end up buying a separate Network Attached Storage (NAS) device to support user home directories and other types of unstructured data.


Dealing With Data Privacy in the Cloud
“If the single biggest concern centres around the security of placing data in multi-tenant public clouds then this is a misjudgement," he said. "If anything, providers of public cloud managed hosting services know a lot more about system security than most individual firms. Second, privacy policy controls stipulated upon instances of private cloud may be harder to update than those held in public environments where service layers are stronger.” Indeed, confidence in cloud security is growing, according to the 2014 IDG Enterprise Cloud Computing Study. The survey found that the vast majority of enterprises were “very” or “somewhat” confident that the information assets placed in the cloud are secure.


Overcoming the business and technology barriers to DevOps adoption
"There was a degree of customer dissatisfaction with the service we provided, in the sense that we couldn’t keep up with the demand from developers, which meant there were long lead times in providing them with environments; and they were created in a manual way with semi-automated scripts," says Watson. "So, not only did it take us a long time to provide these individual environments, sometimes they weren’t always the same." This often led to disagreements between the software development and infrastructure building teams, as the environments they delivered didn’t always quite fit the bill. To rectify this, Watson created small groups of developers and infrastructure architects, while doing away with the ticketing system used to communicate requests between these groups.


Why is a cloud provider like a restaurant?
In my experience, the thing that marks the unsuccessful restaurant is the belief that having the best kitchen equipment (or the cheapest, depending on the business model) is the defining factor in its success. What rubbish! Any experienced restaurant patron can tell you that it’s all about the customer experience – and how the customer perceives the value delivered by the restaurant. The customer only has the ‘front-office’ experience – the location, parking valet, Maitre d’, bar, seating arrangement, ambience and of course, the menu and service. The restaurant might only be a drive-through fast-food outlet, but the same principles apply. The customer makes a choice of provider based on their required value proposition and expects that to be delivered.


The Road Ahead for Architectural Languages
MDE is a possible technological solution for successfully supporting the requirements of next-generation ALs. In MDE, architects use domain-specific modeling languages (DSMLs) to describe the system of interest. The concepts of a DSML - its first-class entities, relationships, and constraints - are defined by its metamodel. According to this, every model must conform to a specific metamodel, similar to how a program conforms to the grammar of its programming language. In MDE, it’s common to have a set of transformation engines and generators that produce various types of artifacts. Practitioners can take advantage of transformation engines to obtain source code, alternative model descriptions, deployment configurations, inputs for analysis tools, and so on.
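
A minimal Java sketch of the model-to-metamodel conformance idea, assuming an invented DSML whose metamodel allows only two element kinds and two relationship kinds; real MDE toolchains derive this sort of validation (and the transformation engines mentioned above) from the metamodel rather than hand-coding it:

```java
import java.util.*;

// A model "conforms" to its metamodel much as a program conforms to a grammar:
// only declared element kinds and declared relationship kinds are allowed.
public class ConformanceCheck {

    // Metamodel: allowed first-class entities and allowed (source -> target) relations.
    static final Set<String> ELEMENT_KINDS =
            new HashSet<>(Arrays.asList("Component", "Connector"));
    static final Set<String> ALLOWED_RELATIONS =
            new HashSet<>(Arrays.asList("Component->Connector", "Connector->Component"));

    static class Element {
        final String name, kind;
        Element(String name, String kind) { this.name = name; this.kind = kind; }
    }

    static class Relation {
        final Element source, target;
        Relation(Element source, Element target) { this.source = source; this.target = target; }
    }

    static boolean conforms(List<Element> elements, List<Relation> relations) {
        for (Element e : elements) {
            if (!ELEMENT_KINDS.contains(e.kind)) return false;   // unknown entity kind
        }
        for (Relation r : relations) {
            if (!ALLOWED_RELATIONS.contains(r.source.kind + "->" + r.target.kind)) {
                return false;                                    // constraint violated
            }
        }
        return true;
    }
}
```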


Cloud computing may make IT compliance auditing even cloudier
There may be several issues at work that reduce IT's preparedness for compliance audits. Time is likely the leading culprit, as tending to compliance reporting is one more thing to be squeezed into a busy day. Related to that is lack of resources -- short-staffed IT departments may find it difficult to put someone on the case more than a few hours a week. Some industry observers say cloud computing has made the matter of compliance even, well, cloudier. The results are "not surprising when you consider the degree to which cloud systems and mobile access have penetrated most enterprises," says Gerry Grealish, CMO of Perspecsys. "Cloud SaaS systems and BYOD policies take the control of enterprise data -- including sensitive data -- away from enterprise IT teams and put it in the hands of third-party vendors."



Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton

June 28, 2015

8 Ways Business Intelligence Software Improves the Bottom Line
"With quick access to your internal data, you can more efficiently use your time to analyze internal information and make decisions," says Ryan Mulholland, president, Connotate, a provider of Web data monitoring and extraction solutions. For example, "as president of my company, I can look at all trends in our sales cycle and see which are going to affect our business," he says. "Then I can decide our course of action much more quickly and efficiently." ... "A strong BI system, if well-configured, can help eliminate the time spent copying and pasting data and performing calculations," says Max Dufour


Student Privacy at Risk in the Age of Big Data
The fear is that the multi-billion-dollar education technology (or “ed-tech”) industry that seeks to individualize learning and reduce drop-out rates could also pose a threat to privacy, as a rush to commercialize student data could leave children tagged for life with indicators based on their childhood performance. “What if potential employers can buy the data about you growing up and in school?” asks mathematician Cathy O’Neil, who’s finishing a book on big data and blogs at mathbabe.org.


The Future Of Algorithmic Personalization
Personalization should bring together collective intelligence and artificial intelligence. The connections become faster and the computers smarter and more efficient. To decrease the computing gap, the focus is on enhancing the information flow between humans and machines. Humans are (still) the best pattern-recognition systems in the known universe. We can help each other to find and discover meaningful signals. Artificial intelligence should empower this sense-making by powering adaptive interfaces and predictive learning systems. Human-centered personalization brings together human-curated signals and adaptive machine-learning solutions.


The Limits of 3D Printing
Creating printable files involves two steps: creating a three-dimensional volume model that can be printed, and “slicing” that volume model in the best possible way to avoid material wastage and prevent printing errors. Both steps require tacit knowledge. Following the printing, the parts produced have to be recovered, cleaned, washed (or sanded and polished, in the case of metal prints), and inspected. This, in turn, means that using 3D printing for the aftermarket services — an application where it makes a lot of sense — requires making a significant upfront investment in generating the printable files of the spare parts that would likely be needed.


How Big Data Affects Us Through the Internet of Things
Although big data is ever-present in our lives, it can be difficult to understand how much it really has changed our day-to-day living. Let’s take a closer look at how big data has weaved its way into the lives of many consumers today, via the Internet of Things (IoT). The Internet of Things can be thought of as the interconnectivity of everyday objects that use network connectivity to send and receive data. Whether we consciously realize it or not, we are surrounded by objects like these that depend on big data to make our lives better.


Coming soon: An API for the human genome
The genome is in many senses a database that we have constructed and curated and built new interfaces to. As a result, it will soon be the latest addition to what has been referred to as the API Economy. Computer programs themselves will be increasingly able to accommodate genomics, perhaps in ways no more remarkable than how Mint.com pulls together your bank balances. It’s this piece that I’m most interested in because a whole generation of software and hardware developers will soon be able to think about personalization at a molecular level, without the need for a bioinformatics team or a PhD. This will be the fourth and perhaps final wave.


Professionalism, waivers and the hard questions
Our role as enterprise-architects is mainly one of decision-support, not decision-making – we can, do and should give advice on architectural concerns, developed to the best of our ability, but unless we are explicitly asked to make decisions, the final decisions are not ours to make. And that distinction is crucial (not least for our continued employment…). If we’ve done our job well, we should have a pretty clear idea of what would work in the enterprise, and what won’t – what will support ‘things-working-together’, and what won’t – and our advice should indicate and describe that overall understanding, in terms appropriate for the respective audience.


Microsoft Cloud Meets Cisco’s Application Centric Infrastructure (ACI)
The path to fast and efficient IT requires more than just technology. Technology must enable a new process model for speeding up workflows across siloed organizations within the IT function. This session will introduce Cisco’s Application Centric Infrastructure and its tight integration into Microsoft Azure clouds. We'll show how you can deliver new tenant services while transforming your IT organization and workflows with a common policy model, centralized control, and simplified operational visibility across your data center. We’ll demonstrate how your applications, network and security teams can leverage a new operational model to generate compelling business outcomes for your enterprise.


Universal Limits on Computation
In 1965 Gordon Moore speculated that the number of transistors on a chip, and with that the computing power of computers, would double every year [10]. Subsequently this estimate was revised to between 18 months and 2 years, and for the past 40 years this prediction has held true, with computer processing speeds actually exceeding the 18-month prediction. Our estimate for the total information processing capability of any system in our Universe implies an ultimate limit on the processing capability of any system in the future, independent of its physical manifestation, and implies that Moore’s Law cannot continue unabated for more than 600 years for any technological civilization.
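
As a rough check on where a figure of that order comes from, assuming (as such analyses conclude) a universal bound of roughly 10^120 total operations, extrapolate a doubling every 18 months over 600 years:

```latex
\frac{600\ \text{years}}{1.5\ \text{years per doubling}} = 400\ \text{doublings},
\qquad 2^{400} \approx 2.6 \times 10^{120}
```

After about 400 more doublings the extrapolated capability would exceed a bound of that scale, which is consistent with the 600-year limit quoted above.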


The Difference Between Culture and Values
If a company’s values are its bedrock, then a company’s culture is the shifting landscape on top of it. Culture is the current embodiment of the values as the needs of the business dictate. Landscapes change over time — sometimes temporarily due to a change in seasons, sometimes permanently due to a storm or a landslide, sometimes even due to human events like commercial development or at the hand of a good gardener. So what does it mean that culture is the current embodiment of the values as the needs of the business dictate? Let’s go back to the value of Transparency. When you are 10 people in a room, Transparency means you as CEO may feel compelled to share that you’re thinking about pivoting the product, collect everyone’s point of view on the subject, and make a decision together.



Quote for the day:

"Computer technology is so built into our lives that it's part of the surround of every artist." -- Steven Levy

June 27, 2015

Evolution of Continuous Delivery and DevOps
Ultimately engineering teams are employed to provide solutions to business problems. If your organisational structure is built around your technical platforms and your engineering teams are built around your organisational structure, the solutions they provide will in part solve business problems but mostly will try to compensate for organisational problems - this is by no means a new problem. When it comes to delivering value quickly and consistently there's nothing better than having everyone you need working together (say in the form of a cross-functional team) - as long as they have a consistent flow of work that they can all contribute to. The potential problem however is ensuring the work keeps everyone busy. Unless you have multi-skilled engineers (or engineers who are willing to simply muck in) things can become uneven and inefficient.


The Business Processes of Information Governance
Regardless of industry or line of business, an analogous quote can easily be made, be it selling shirts, producing pharmaceutical drugs, or buying and reselling products. The business processes that organizations put in place determine how efficiently and effectively companies conduct their work. These processes are often unique by industry, tailored to specific companies, and studied and improved via major initiatives and strategic investments. The business processes of information governance should be added to this list of differentiating, business value-driving processes. Indeed, highly effective organizations run highly effective business processes -- and information governance is no exception. To succeed, the business processes of information governance can be divided into six key areas:


How Will Businesses Change with Virtual Reality?
Virtual reality could also be meaningful to people with limited physical mobility as it would allow them to tour places they might not have otherwise been able to visit and do so by viewing actual video footage of real places. However, everyone might be interested in the ability of virtual reality to provide the realistic sights and sounds of faraway places. Combined with other sensory stimuli, such as a beach scent, light breeze, and some heat to simulate sunlight, virtual reality could allow you to create a fairly immersive, multi-sensory simulation of sitting on your favorite beach. While it would not be a real substitute, it might just be the next best thing.


TCS's Ignio can predict problems & automate: CEO N Chandrasekaran
It is a neuroscience-based self-learning platform. So if you take Ignio and you install it then the moment you put it into an environment, it sucks up the information about that environment and creates a context. If you take infrastructure context, it will know how many different kinds of hardware are there, how many different kinds of software are there. So it will learn about all the things about your infrastructure from the data that's already present in your network. From an infrastructure point of view it has knowledge of the different infrastructure pieces that are there. Then what it does, is that it can automate.


Big Data and the Rise of Augmented Intelligence
Each year computers are getting faster, but at the same time we as humans are getting better at using them. The top chess players in the world are not humans OR computers, but combinations of humans AND computers. In this talk, Sean Gourley examines this world of augmented intelligence and shows how our understanding of the human brain is shaping the way we visualize and interact with big data. Gourley argues that the world we are living in is too complex for any single human mind to understand and that we need to team up with machines to make better decisions.


The rise of SSDs over hard drives, debunked
If there's one upgrade a consumer can make to a desktop or laptop computer that will make the greatest difference in performance, it's swapping in an SSD. NAND flash manufacturers such as Samsung, Toshiba, Micron, and Intel have continued to shrink the lithography technology for making flash transistors. Last fall, at the Flash Memory Summit, Toshiba revealed its smallest lithography process for NAND flash with a 15-nanometer, 16GB MLC NAND wafer. The 15nm wafer was developed in partnership with SanDisk. Flash makers have also increased the number of bits -- from one to three -- that can be stored per NAND flash cell, all of which has increased density and reduced manufacturing costs.


Forget China, There’s An E-Commerce Gold Rush In Southeast Asia
The paucity of payment systems in Southeast Asia is a case in point. Thanks to Alibaba and WeChat, COD in China decreased from more than 70 percent of total payments in 2008 to less than 21 percent during last year’s Singles’ Day mega sales (think Cyber Monday / Black Friday for China). Southeast Asia, however, doesn’t have an Alibaba or WeChat yet. Due to their size and reach, Alibaba and Tencent were able to turn their online payment methods into the de-facto standard for e-commerce in China. Southeast Asia is still fragmented. Until local financial institutions and/or e-commerce players get their act together in terms of payment platforms, COD will remain the dominant payment method, covering over 80 percent of total payments.


The Internet was a mistake, now let’s fix it
In fact, with government regulation and support, mathematically secure communication is eminently possible. Crypto theory says that a truly random key that is as long as the message being sent cannot be broken without a copy of the key. Imagine a world where telecommunication providers working under appropriate regulations issued physical media similar to passports containing sufficient random digital keys to transmit all of the sensitive information a household would share in a year or even a decade. We would effectively be returning to the model of traditional phone services where telecommunication companies managed the confidentiality of the transmission and government agencies could tap the conversations with appropriate (and properly regulated) court supervision.
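
What the passage describes is essentially a one-time pad: a truly random key as long as the message, never reused and exchanged out of band. A minimal Java sketch of the idea, purely for illustration (a real deployment would need hardware randomness, strict key handling and authentication on top):

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// One-time pad: XOR the message with an equally long random key.
// With a truly random, single-use key, the ciphertext reveals nothing
// about the message to anyone who does not hold a copy of the key.
public class OneTimePad {

    public static byte[] randomKey(int length) {
        byte[] key = new byte[length];
        new SecureRandom().nextBytes(key);       // in practice, a hardware RNG
        return key;
    }

    public static byte[] xor(byte[] data, byte[] key) {
        byte[] out = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = (byte) (data[i] ^ key[i]);  // same operation encrypts and decrypts
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] message = "meter reading: 4821 kWh".getBytes(StandardCharsets.UTF_8);
        byte[] key = randomKey(message.length);
        byte[] ciphertext = xor(message, key);
        byte[] decrypted = xor(ciphertext, key);
        System.out.println(new String(decrypted, StandardCharsets.UTF_8));
    }
}
```

The scheme is only as secure as the key distribution, which is exactly why the author ties it to physical media issued by regulated telecommunication providers.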


The Pitfalls that You Should Always Avoid when Implementing Agile
An agile transformation is only likely to succeed if heavily supported on an ongoing basis by the management. It is also a prerequisite for a robust agile implementation that the management level is acting according to agile principles, and not only the project teams. ... Adopting agile will impose significant changes to an organization that may have worked according to traditional principles for maybe even 10 – 20 years or more. It may look simple but it is not. ... Top-level managers often do not see the necessity to involve themselves in projects on an ongoing basis. Yes, they allocate (a lot of) money to projects, and yes, they do get upset, when projects overspend or get delayed, but usually they do not see themselves as more than an escalation point for their Project Managers.


Will Google Artificial Intelligence Run An IT Help Desk?
The system can respond to questions, and have long, complex conversations with people. In tests, it was able to help users diagnose and fix computer problems, including Internet browser and password issues. The AI also taught itself how to respond to inquiries about morality and philosophy. The answers were coherent enough that you might mistake them for something your stoner roommate from college once said. ...  The machine is able to do this because it was designed to come up with an appropriate response based on context. “These predictions reflect the situational context because it's seen the whole occurrence of words and dialogue that's happened before it,” Jeff Dean, a Google senior fellow, said at a conference in March.



Quote for the day:

"The signs of outstanding leadership are found among the followers." -- Max DePree

June 26, 2015

The Wait-for-Google-to-Do-It Strategy
When it comes to the current state of innovation and the economy, the implications of Google Fiber are complicated. On the one hand, it is a testament to the power of competition. Google’s willingness to invest the money in a new network threatened cable and telecommunications companies’ dominance, and took customers away from them. That shifted the economic calculus. It’s no coincidence that the cities and regions where cable companies first announced they were building fiber, and offering high-speed connections at affordable prices, have been the places where Google Fiber either is or is going. At the same time, though, it’s depressing that ensuring competitive broadband markets required the intervention of an outsider like Google.


Are you suffering from a cloud hangover?
Recent research from Sungard Availability Services found that the hangover is costing European businesses an average of more than £2 billion; with the overwhelming majority of businesses (81%) in the UK, Ireland, France and Sweden having encountered some form of unplanned cloud spending. Not only is each organisation within these countries paying an average of £240,000 per year to ensure cloud services run effectively, but they have also spent an additional £320,000 over the last five years thanks to unforeseen costs such as people, internal maintenance and systems integration. And despite being hyped as a way to reduce IT complexity, many early adopters of the cloud (43%) have found that the complexity of their IT estate has in fact increased since their initial cloud investment.


Why datacentre zero is an impossible dream, even with help from the cloud
Rob Fraser, CTO for cloud services at Microsoft UK, agrees with the assessment that eventually the price of cloud services will fall to a point where it becomes near impossible for firms to compete with the public cloud on computing cost. "Fundamentally, from an economic point of view, there must come a point at which the cost per unit of compute, unit of storage, unit of analysis becomes hard to compete with the scale of public cloud. Economically look at all the forces of commoditisation and that point will have to occur," he said. But to believe that businesses will move wholesale to the cloud on the basis of cost alone, he said, is to ignore a swathe of issues beyond price. "There are still going to be hugely valid reasons why on-premise infrastructure needs to run its own level of scale, even if it might be more pricey, because of other issues around the business."


How CISOs can create security KPIs and KRIs
"If I don't know what you're doing, how can I help you? I'm going to make some assumptions about what you're doing and I could be completely wrong," Durbin says. "Security guys are always talking about cost. If we realign this, the security guys can now go to the business and say, 'look, if this is what is important to you, this is the role I can play in helping you protect that, but I don't have the funding for a variety of reasons.' The business can then make the call as to whether to find the funding for that problem. It's no longer the security guy's problem, it's the business's problem."


Commodity Data Center Storage: Building Your Own
New architectures focusing on hyperscale and hyper-convergence allow you to directly abstract all storage and let you manage it at the virtual layer. These new virtual controllers can reside as a virtual machine on a number of different hypervisors. From there, it acts as an enterprise storage controller spanning your data center and the cloud. This kind of hyper-converged virtual storage architecture delivers pretty much all of the enterprise-grade storage features out there including dedup, caching, cloning, thin provisioning, file replication, encryption, HA, and DRBC. Furthermore, REST APIs can directly integrate with proprietary or open source cloud infrastructure management systems.


Red Hat builds on its open source storage portfolio
This is the first version of the software that does not require RAID (redundant array of inexpensive disks) technologies to ensure data integrity, meaning organizations could save on storage hardware by as much as 75 percent, given they would not have to buy the additional storage to make duplicate copies of the data. Gluster can now also guard against bit rot, or the gradual decay of files on disk that can, over time, render them impossible to read. Gluster can now also take the place of hierarchical storage management (HSM) systems, which offload old or less frequently consulted data to less costly, slower storage systems. Gluster now offers operators fine-grained control of where to store data.


Don't be afraid to give your strategic goals a tuneup
As is often the case with startups, strategic goals and objectives will change based on reasons such as changes in technology, shifts in customers' desires and needs -- or just a shift in the end-game vision itself. For us, our long-term strategic goals were to provide our customers with a host of offerings that effect a total digital transformation. Yet with our current and growing client list, we were not landing the larger jobs that fit into this end-state. Instead, we were working on many tactical projects related to Web presence redesign and re-definition (modernization) and customer engagement. What my astute business partner had realized was that our corporate messaging was saying one thing, yet what our new clients needed was something a little closer to the ground.


Computers Are Getting a Dose of Common Sense
Making computers better at understanding everyday language could have significant implications for companies such as Facebook. It could provide a much easier way for users to find or filter information, allowing them to enter requests written as normal sentences. It could also enable Facebook to glean meaning from the information its users post on their pages and those of their friends. This could offer a powerful way to recommend information, or to place ads alongside content more thoughtfully. The work is a sign of ongoing progress toward giving machines better language skills. Much of this work now revolves around an approach known as deep learning, which involves feeding vast amounts of data into a system that performs a series of calculations to help identify abstract features in, say, an image or an audio file.


Who is responsible for digital leadership in the boardroom?
Soon, every member of the C-suite will need to be leading digital. ... there needs to be a concerted effort to raise the digital IQ of the whole senior leadership team which, in turn, will ensure the broader organisation appreciates emerging digital opportunities and imperatives. Developing and, where necessary, recruiting digital talent across the functions will also be required to shape and deliver the desired transformation agenda. The alternative is not attractive. There are still too many firms where the status quo prevails – an enterprise IT organisation that is disconnected from or can’t keep up with emerging digital business activities, a C-suite where “technology is not my job” attitudes are still deemed acceptable, and isolated pockets of digital activity in marketing, engineering and elsewhere that have yet to coalesce into a real digital strategy.


The Rise and Risk of BYOD (INFOGRAPHIC)
The BYOD trend presents its own concerns when it comes to protecting confidential workplace data. As the trend continues to gain steam, CIOs and IT departments are faced with a whole different set of variables due to the variety of devices and the relative level of security that comes with each device. The push to be able to bring your own mobile device to work is also a push to house personal information and apps alongside company data. It can be a risky endeavor. Although convenient, the conglomeration of data in this way is like a treasure trove for potential thieves.



Quote for the day:

"To do one must set goals; to set goals one must have a dream; to dream one must have an opportunity." -- ‏@Orrin_Woodward

June 25, 2015

Refactoring with Loops and Collection Pipelines
A common task in programming is processing a list of objects. Most programmers naturally do this with a loop, as it's one of the basic control structures we learn with our very first programs. But loops aren't the only way to represent list processing, and in recent years more people are making use of another approach, which I call the collection pipeline. This style is often considered to be part of functional programming, but I used it heavily in Smalltalk. As OO languages support lambdas and libraries that make first-class functions easier to program with, collection pipelines become an appealing choice.
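
For a concrete contrast, here is the same task written both ways in Java 8, using an Article class invented for the example:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

// Loop vs. collection-pipeline style for the same task: collecting the
// names of authors who wrote technical articles.
public class PipelineExample {

    static class Article {
        final String author;
        final boolean technical;
        Article(String author, boolean technical) {
            this.author = author;
            this.technical = technical;
        }
    }

    // Loop style: explicit iteration and accumulation.
    static List<String> technicalAuthorsLoop(List<Article> articles) {
        List<String> result = new ArrayList<>();
        for (Article a : articles) {
            if (a.technical) {
                result.add(a.author);
            }
        }
        return result;
    }

    // Pipeline style: the same logic as a chain of filter and map steps.
    static List<String> technicalAuthorsPipeline(List<Article> articles) {
        return articles.stream()
                       .filter(a -> a.technical)
                       .map(a -> a.author)
                       .collect(Collectors.toList());
    }
}
```

Both methods return the same result; the pipeline version simply expresses the filter-then-map logic as a chain of operations rather than as explicit control flow.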


Security should be enabling, says HP strategist Tim Grieveson
“Information security professionals have an important role in helping organisations build a culture of security, so that everyone can make a contribution because they understand the value of data and the associated security risks,” he said. By raising people’s situational awareness, he said, they are more likely to be self-policing when dealing with company data and wary of things like “shoulder surfing” or revealing personal and business information during phone calls on trains. “Like health and safety, information security should be a concern for everyone, but changing an organisation’s culture is challenging. Security needs to be built into every new product, application and business process,” he said.


Can LibreOffice successfully compete with Microsoft Office?
"If a customer has a problem opening your files (created in LibreOffice) then they can always download a free copy of LibreOffice," he says. But while this is true in theory, in practice it's probably more likely that a customer will expect you to get a copy of Office if you want to do business with them, rather than download some software they’ve probably never heard of themselves. Meeks also works at Collabora – a U.K.-based company that provides commercial support and maintenance for LibreOffice – and he says the company uses LibreOffice software without any interoperability problems. "We run a multimillion dollar business using LibreOffice and we routinely exchange documents with lawyers – and it works fine," he maintains.


Office 365 – Common Exchange Online Hybrid Mail Flow Issues
There are a number of symptoms that might indicate that hybrid mail flow is not working properly even when messages are routing. If messages between environments occasionally end up in a user’s junk mail, that’s definitely a good sign there is a misconfiguration. Issues booking conference rooms or receiving the wrong out-of-office message are also symptoms. It basically comes down to whether the messages between environments appear as “internal” or “external”. So just like a conference room is not going to accept a booking request from a random Internet user, it won’t accept a booking from your cloud user if that message appears as external. The quick way to check is to look at a message received in each environment and check the message headers.


Succeeding with Automated Integration Tests
Automated tests that are never or seldom executed can even be a burden on a development team that still tries to keep that test code up to date with architectural changes. Even worse, automated tests that are not constantly executed are not trustworthy because you no longer know if test failures are real or just because the application structure changed. Assuming that your automated tests are legitimately detecting regression problems, you need to determine what recent change introduced the problem — and it’s far easier to do that if you have a smaller list of possible changes and those changes are still fresh in the developer’s mind. If you are only occasionally running those automated tests, diagnosing failing tests can be a lot like finding the proverbial needle in the haystack.


Dropbox Is Struggling and Competitors Are Catching Up
Stability may be in short supply in the company’s executive ranks. Ilya Fushman, the head of product for Dropbox for Business, became a venture capitalist in June. The company’s head of design, Gentry Underwood, has stepped down, too, though he remains with Dropbox in an unspecified capacity. Woodside hasn’t been able to hire an overall head of product management, the person who’d be trying to match the security and other features in place at Microsoft and Box. Box, which went public in January, is something of a cautionary tale for Houston and Woodside. Its total 2014 revenue was about 60 percent of Dropbox’s, according to IDC, but its market value is now only one-fifth of Dropbox’s private valuation, suggesting that the office cloud market may not grow fast enough to bridge the gap between investor fantasy and reality.


To 'fail fast,' CIOs need to think strategically from the get-go
It's almost like building out the framework. But unlike a house, where once you put a lot of the infrastructure in place, it's not so easy to tear apart and put new stuff in, with software, it is. We built an actual working product pretty darn quickly, and then we've iterated. As opposed to a pure design phase and build phase and then a customer test phase, it's almost been bunches of loops of that. Of course we had to think strategically in the beginning. What's going to be the broad framework for the technologies that we need to have in place to support what we want to do? But then when I saw that framework, I felt pretty confident that we could achieve the implementation. One of the challenges in failing fast and what's important is that you find that different languages need to be in place: the business language, the clinical language, the technology language.


Digital government 'a chance to build a new state'
Hancock also echoed his predecessor, Francis Maude, in emphasising the importance of building digital government around the “user” – that is, the citizen – rather than the mechanics of government. “In Finland, town planners will visit a local park immediately after a snowfall, because the footprints reveal the paths that people naturally choose to take. These ‘desire paths’ are then paved over the following summer. We too must pave the paths people travel,” he said. “Let’s take a small example. 'Registry offices' are officially known as 'register offices'. But everyone in real life calls them 'registry offices', so no-one ever really searches for ‘register offices’ online. We’ve paved the path that people travel, so the Gov.uk page comes top of the search results, even if you search ‘registry office’.”


Vitaly Kamluk tells how Interpol catches cyber criminals and other stories
Very often we use common techniques and tools for computer forensic examination: Encase, Sleuthkit, various data carvers, data format recognizers, and even standard binutils. We develop a lot of scripts and tools ourselves, sometimes just for a single case: unpackers, deobfuscators, custom debuggers, dumpers, decryptors, etc. Reverse engineering binaries takes quite a lot of time as well. We also may do infrastructure mapping and network and port scanning. Developing sinkholing software and log parsers is yet another important part of quality research ... The Internet is not owned by a single entity — it’s a network of equal participants. The solution is the union of all participants of the global network against cybercrime.


Phil Zimmermann speaks out on encryption, privacy, and avoiding a surveillance state
At a recent private viewing of the exhibition that features the Blackphone, Zimmermann pondered what the emergence of whistleblowers like Snowden says about the current state of privacy. "The moral problems with the behaviour of our intel agencies should give us pause, should get us to step back and question, 'What are we getting our intel agencies to do?' We should take another look at this. We should try to restrain them more," he told the audience. "This has been my motivation for my entire career in cryptography," he says. "The driving force is the human rights aspect of privacy and cryptography and ubiquitous surveillance, pervasive surveillance... We live in a pervasive surveillance society."



Quote for the day:

“Being confident and believing in your own self-worth is necessary to achieving your potential.” -- Sheryl Sandberg

June 24, 2015

Oracle's biggest database foe: Could it be Postgres?
Gartner, for example, forecasts that more than 70% of new in-house applications will be developed on an open-source database by 2018, and that 50% of existing commercial RDBMS instances will have been converted to open-source databases or will be in process. In other words, open-source databases are almost certainly cutting off Oracle's oxygen when it comes to new applications, but they may also be cutting into its hegemony within existing workloads. If true, that's new. Though from a biased source, an EnterpriseDB survey of Postgres users certainly suggests that Postgres users are running the venerable open-source database for increasingly mission-critical workloads, including those that used to pay the Oracle tax:


Infographic: Must Read Books in Analytics / Data Science
There are two attributes all the members of our team at Analytics Vidhya share: we are all voracious readers, and we all love to share our knowledge with people in a simplified manner, so that everyone gets access to it. These two attributes lead us to naturally gravitate towards sharing some of the best reads we come across. You can think of this infographic as an ideal list of books to have on the bookshelf of every data scientist / analyst. These books cover a wide range of topics and perspectives (not only technical knowledge), which should help you become a well-rounded data scientist.


Snowflake Launches Virtual Data Warehouses On AWS
Snowflake isn't a data warehouse of big data dimensions or routine enterprise data dimensions. Rather, it's a virtual data warehouse that will be sized to match the job sent to it. When the analytical tasks are finished, the warehouse shuts itself off to save overhead. "In other cloud data warehouses, you would have to unload the data to turn it off and then reload it [to use it again]," he said. Snowflake avoids that data movement task. Although Snowflake runs on AWS at its US West facility in Oregon, customers may use Snowflake without an AWS account. They also don't need to understand the ins and outs of Amazon virtual machine selection.


Report Template for Threat Intelligence and Incident Response
When handling a large-scale intrusion, incident responders often struggle with obtaining and organizing the intelligence related to the actions taken by the intruder and the targeted organization. Examining all aspects of the event and communicating with internal and external constituents is quite a challenge in such strenuous circumstances. The following template for a Threat Intelligence and Incident Response Report aims to ease this burden. It provides a framework for capturing the key details and documenting them in a comprehensive, well-structured manner. This template leverages several models in the cyber threat intelligence (CTI) domain, such as the Intrusion Kill Chain, Campaign Correlation, the Courses of Action Matrix and the Diamond Model.


Startup’s Lightbulbs Also Stream Music
The speaker bulb, which twists into a standard-size light socket, contains white and yellow LEDs, the brightness or dimness of which coördinates wirelessly with other Twist bulbs that contain just LEDs. Astro, which plans to ship the gadgets early next year, says a starter pack with two LED bulbs, a speaker bulb, and a handheld dimmer switch will cost $399, reduced to $249 for two months to encourage people to sign up. While companies like Philips Hue focus on automating and customizing the lights themselves, Twist is among a handful of companies thinking of the lightbulb as a conduit for wireless audio, too. The company says it plans to add additional functions in the future as well.


Why It's Worth Divorcing Information Security From IT
Too often, when Security reports to IT, we find the IT mentality interferes with security processes and priorities. These days, there is little to no common ground between keeping IT systems up and running for authorized users and monitoring them for signs of compromise by smart, stealthy criminals. Identifying and securing an already compromised system requires the capability to differentiate malicious activity from normal behavior, and hackers are very good at making their activity look normal. The only way to find them is through a combination of new technologies and human judgment. Being a subdivision of the IT department makes security blind to important business processes and to decision making at the corporate and department level.


Aligning Private Cloud and Storage: 4 Considerations
Firstly, the private cloud offers a greater degree of control than the public cloud, especially with data. When you build a private cloud, you’re able to keep your data at your fingertips, establish performance levels that your organization demands to best serve end-users and customers and set security policies that align with your customer responsibilities or industry regulations. Secondly, private cloud gives you more control of applications. Most public clouds require apps to fit their cloud mould, but a lot of businesses have unique, custom-made applications and recoding these applications to fit the public cloud is not a good solution.


Finance Hit by 300 Times More Attacks Than Other Industries
As can be expected, cyber-criminals are working hard to ensure their attacks are as successful as possible, firing a large volume of low level threats at their targets in order to distract IT security professionals while the main targeted attack is launched, Websense said. Obfuscation, malicious redirection and black hat SEO have become popular of late, although patterns apparently shift on a month-by-month basis – again to improve success rates. Targeted typosquatting is also making a comeback in the sector, usually in combination with social engineering as part of spear phishing attacks designed to compromise a host or trick a user into instigating a payment or transfer of money, the report claimed.


Mobile app testing for fun and profit
"If you're doing testing for a mobile website you can more or less use the same tools as you would when just testing out a normal website with your browser," Prusak said. "Ultimately, it should still work with your browser, and there are plug-ins and extensions which work with today's browsers which you can modify HTTP headers or even the resolution and make the backend still think you're connecting on a mobile device." "I'm aware of a plethora of different solutions and all of them require either jail-breaking the device or installing software on your computer and then pointing your phone or device to your computer and using that as a proxy," Prusak said. He sees these solutions as rife with issues, inefficient, and too complex.


Why You Should Definitely Migrate Existing Apps to the Cloud
100% security is an illusion. If you have to make a decision based on the available choices, cloud services are in no way less secure than any of the existing systems in place. Cloud service providers are known for their innovations. It is apparent that at any point in time they would implement better physical and logical security practices than a standalone on-premise data center operation. Many cloud providers are now certified against ISO, PCI DSS, the EU Model Clauses and other global security standards. Moreover, not all applications require bank-grade security, do they? If you have highly sensitive data, or your app is subject to specific security and privacy regulations (such as HIPAA and HITECH), you can opt for a hybrid cloud service.



Quote for the day:

"The time is always right to do what is right." -- Martin Luther King Jr.

June 23, 2015

Lack of trust in tech is damaging smarthome industry
Although younger people are twice as interested in smarthome technology as the older generation, they do not necessarily have the chance to adopt these technologies because many are living at home longer or are renting from a landlord and therefore have less control than a homeowner over energy consumption. “Younger people seem to be missing out,” Wetherall said. “Younger people are more excited by the novelty, by the ability to play with these technologies.” Smart metering should fuel engagement with young people by helping them to understand energy consumption, and give them the means to engage with their landlord about making their property better and more efficient.


Bitnation Pangea Releases Alpha of Governance System Based on the Blockchain
Bitnation Pangea wants to be the world’s first blockchain powered Virtual Nation, able to provide all services that traditional governments provide and replace the nation state system with a voluntary form of governance. ... “The alternative the world is currently pivoting towards is U.N.-style global organizations, which would be an even worse ‘one-fit-all’ type governance model than what we currently have,” says Tarkowski Tempelhof. “Bitnation aims to prevent that, through setting a precedent for voluntary competing service providers, powered by the Bitcoin blockchain technology, effectively creating an open source cryptonation protocol.”


Cyber Security in Aviation
The increase in technology does not match the increase in technology security. Duggal said, “technology moves so fast, security sometimes gets left behind because you’re trying to get to the consumer, you’re trying to give them what they want, and sometimes when you try to address security after the fact you add complexity to the mix.” The threat level rises when systems are not secured prior to installation: security is often overlooked in the rush to satisfy consumers with rapid implementation and deployment, and shipping the latest and greatest technology without first securing it only increases the threat.


Simple is beautiful: Useful questions to cut through process complexity
As we elicit more information about the process, they may tell us about every logical branch and every exception, and we’ll gain a really rich understanding of the existing situation. This is very useful and will aid our analysis – after all, we’ll need to ensure that our processes can cater for the real environment and are useful in practice. However, it’s also important that we understand (and in some cases challenge) the need for each layer of complexity. In some cases, we may find that particular branches and steps are no longer relevant, and we may be able to simplify the overall process by eliminating them. Doing so may well make our customers’ lives easier, and ensure that the process is as quick, slick and cost-effective as possible. It can be a real win/win.


Vert.x 3, the Original Reactive, Microservice Toolkit for the JVM
Vert.x 3 also has built-in support for RxJava – we provide Rx-ified versions of all our APIs, so if you don't like a callback-based approach, which can sometimes be hard to reason about, especially if you're trying to coordinate multiple streams of data, then you can use the Rx API, which allows you to combine and transform the streams using functional-style operations. We're also looking into an experimental new feature for Vert.x which allows you to write your application in a classic synchronous style that doesn't actually block any OS thread – the idea being you get the scalability advantages of not blocking OS threads without the callback hell of programming against asynchronous APIs, i.e. you have your cake and eat it. We think this could be a killer feature, if we get it right.
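The "synchronous style that doesn't block an OS thread" idea maps closely to coroutines. As a hedged analogy only (this is Python asyncio, not Vert.x code), the sketch below shows straight-line code that awaits rather than blocks, and coordinates two asynchronous results without nested callbacks.

```python
# Python asyncio analogy (not Vert.x code) for "synchronous style, no blocked
# OS thread": the code reads top-to-bottom, but awaits yield control instead
# of blocking, so one thread can interleave many tasks.
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.1)            # stands in for a non-blocking I/O call
    return {"id": user_id, "name": f"user-{user_id}"}

async def fetch_orders(user_id: int) -> list:
    await asyncio.sleep(0.1)
    return [{"user": user_id, "item": "widget"}]

async def main() -> None:
    # Reads like sequential, blocking code...
    user = await fetch_user(42)
    # ...and coordinating two results needs no nested callbacks.
    orders, profile = await asyncio.gather(fetch_orders(user["id"]),
                                           fetch_user(user["id"]))
    print(user, orders, profile)

asyncio.run(main())
```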


Spark at the Center of a Technology Revolution
All of our connected devices are fueling a growth in data that is completely new to everyone. Starting 3 years ago, we generated more data than we created in the 199,997 years of human history leading up to that point. What this starburst of data means is that how we think about data and technology needs to change at the most fundamental level. It’s not just a question of scale—the types of data and the potential for the way they impact human life and the globe are different at the core. Traditional approaches are either not going to function with the new, massive amounts of data, or they are not going to produce results that are relevant in a world where real-time feedback from devices wired into everything from human heartbeats to interstellar data is flowing constantly and at an increasing rate.


Top 4 Strategies for NFV Success
Just think of the benefits: replacing dedicated hardware appliances across the network with standard servers, general-purpose storage, and standardized software applications – not to mention virtualization to deliver any network function end-to-end. There’s no doubt that NFV can deliver tremendous rewards for network operators in terms of flexibility, scalability, and cost-efficiency. What service providers can’t afford to overlook, however, is how NFV may affect their connectivity infrastructures and network topologies. There are ramifications for transport networks that must be considered, in order to maximize the benefits and revenue opportunities that NFV promises. One such area is virtual customer premises equipment (vCPE). NFV has the potential to radically disrupt this space.


Christina Page explains how Yahoo keeps its datacentres green and clean
“It’s a low-tech design and is cheaper to build because you’re not installing these big chiller systems inside. It’s also more reliable as there are fewer moving parts to fail,” she says. Facilities built according to YCC specifications have a long and narrow “chicken coop-style” design, says Page, to encourage outside air to circulate inside and ensure just 1% of a building's total energy consumption is being drawn on to cool it. “What we’ve done is site them in places where there are few enough hot and humid days of the year that this design really works. “At the time we were being conservative, and what we’ve concluded is that there are other locations with more hot and humid days that work just as well with this technology,” she says.


The False Dichotomy Between Planned and Improvisational Projects
Another way to look at the difference is the costs and benefits of individual innovation in the two environments. If everyone building a WalMart was constantly trying out radical new ideas, the result would be chaos. There is certainly innovation in commercial construction, but it has to be managed centrally to avoid interference. An electrical contractor doing things differently might make a small improvement but risk large downstream costs. By contrast, what was the cost and value to Facebook of someone going off and implementing photo tagging? The cost was small and the consequences for the rest of engineering were also small.


Q&A on Fifty Quick Ideas to Improve Your Tests
Like beauty, quality is in the eye of the beholder. This innate subjectivity can lead to wide-ranging opinions on what good quality is and what attributes display those qualities. To ground understanding it is essential to quantify and visualise quality. This works on several levels. At a story or feature level we quantify a quality target in the form of acceptance criteria; we can also set a holistic picture of quality at a product level. Many teams use acceptance criteria for stories these days, but criteria are often still ambiguous, like ‘must be fast’ or ‘must be reliable’, which leaves vast potential for error in the suitability of the solution. We’ve found it useful to quantify quality at both feature and product level. Then there is a clear target for discussing feature acceptance, and also a higher-level vision of quality that the feature falls within and that directs testing.



Quote for the day:

"To lead the people, walk behind them." -- Lao-Tzu

June 22, 2015

The one which offers 10 answers to the question: ‘What purpose does this biochip serve?’
The State Road Safety Inspection long ago abandoned any hope that beacons, “flags”, licenses and badges cannot be faked (no hologram would protect against that, really). That’s why, once a vehicle is pulled over by the police, an officer checks the driver’s license and the vehicle certificate against their database to find out whether the piece of plastic is legitimate (and whether the bearer is a good guy). How would the entire procedure look with a biochip in play? The officer presents the reader through the windshield, I touch it with my hand – and that’s it.


Big data log analysis thrives on machine learning
Clearly, automation is key to finding insights within log data, especially as it all scales into big data territory. Automation can ensure that data collection, analytical processing, and rule- and event-driven responses to what the data reveals are executed as rapidly as the data flows. Key enablers for scalable log-analysis automation include machine-data integration middleware, business rules management systems, semantic analysis, stream computing platforms, and machine-learning algorithms. Among these, machine learning is the key for automating and scaling distillation of insights from log data. But machine learning is not a one-size-fits-all approach to log-data analysis.
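As one hedged example of what "machine learning over log data" can look like in practice (this is an illustrative sketch, not the article's recommended stack), the snippet below runs unsupervised anomaly detection over a few per-request features parsed from logs, using scikit-learn's IsolationForest. The features and contamination setting are made-up examples.

```python
# Hedged example of machine learning over log data: unsupervised anomaly
# detection on simple per-request features. Features and thresholds are
# illustrative; the article lists several other enablers as well.
import numpy as np
from sklearn.ensemble import IsolationForest

# Pretend each row was parsed from a log line: [response_ms, bytes_sent, status_is_error]
events = np.array([
    [120, 5_000, 0],
    [110, 4_800, 0],
    [130, 5_200, 0],
    [125, 5_100, 0],
    [4_500, 90_000, 1],   # an unusually slow, failing request
])

model = IsolationForest(contamination=0.2, random_state=0).fit(events)
flags = model.predict(events)           # -1 = anomaly, 1 = normal
for row, flag in zip(events, flags):
    if flag == -1:
        print("anomalous log event:", row)
```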


How a grocery delivery service became a red hot robotics company
"The ultimate aim is for humans to end up relying on collaborative robots because they have become an active participant in their daily tasks," says Dr Graham Deacon, Robotics Research Team Leader at Ocado Technology. "In essence, the SecondHands robot will know what to do, when to do it and how to do it in a manner that a human can depend on." To get a sense of what these collaborative robot helpers will be doing, imagine an Ocado warehouse. Conveyor belts zip colorful baskets to and fro along diverging paths, placing them in front of an army of human workers who pack them full of groceries. The warehouse is full of machinery, and all of it requires careful and constant maintenance.


Decision Boundaries for Deep Learning and other Machine Learning classifiers
Using {h2o} in R, we can in principle implement a “Deep Belief Net”, the original version of Deep Learning. I know it’s no longer the state-of-the-art style of Deep Learning, but it is still helpful for understanding how Deep Learning works on actual datasets. If you have read this blog before, please remember a previous post arguing that decision boundaries tell us how each classifier works in terms of overfitting or generalization. With the given dataset, generated from 4 sets of fixed 2D normal distributions, it is fairly simple to tell which models overfit and which generalize well. My points are: 1) if decision boundaries look smooth, the models are well generalized; 2) if they look too complicated, they are overfitting, because the underlying true distributions can be clearly divided into 4 quadrants by 2 perpendicular axes.
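For readers without an {h2o}/R setup, here is a hedged scikit-learn re-creation of the same experiment (not the author's code): four fixed 2D Gaussians, one per quadrant, a small neural network classifier, and a crude text rendering of its decision regions so smoothness versus raggedness can be eyeballed.

```python
# A scikit-learn re-creation (not the author's R/{h2o} code) of the setup:
# four fixed 2D Gaussians, one per quadrant, and a classifier whose decision
# boundary we inspect for smoothness vs. overfitting.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
centers = [(2, 2), (-2, 2), (-2, -2), (2, -2)]          # one cluster per quadrant
X = np.vstack([rng.normal(c, 0.8, size=(100, 2)) for c in centers])
y = np.repeat(np.arange(4), 100)

clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                    random_state=0).fit(X, y)

# Sample the prediction on a grid; an ASCII rendering stands in for a plot.
xs = np.linspace(-4, 4, 30)
grid = np.array([(x, yv) for yv in xs[::-1] for x in xs])
labels = clf.predict(grid).reshape(30, 30)
for row in labels:
    print("".join(".x+o"[int(v)] for v in row))
# Smooth, axis-aligned regions suggest good generalisation; ragged islands
# suggest overfitting, since the true classes split cleanly into quadrants.
```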


The Advantages Of An Agile Company Culture
The real change comes from the company culture. Is the company still a command-and-control type of environment? Agile is about quickly adapting to change, and not being afraid to fail. As a leader, you need to create the type of environment where failure is not only accepted, but actively encouraged. Agile is more about how your team approaches problems, not the tools used to solve them. In an agile environment, employees are expected to communicate frequently, because internal feedback is important to improving the team. The constant learning and iterative nature of agile means that you need to embrace failure and allow that learning to occur.


Can We Design Trust Between Humans and Artificial Intelligence?
What is it that makes getting on a plane or a bus driven by a complete stranger something people don’t even think twice about, while the idea of getting into a driverless vehicle causes anxiety? Part of this is that we generally perceive other people to be reasonably competent drivers—something that machines can probably manage—but there is more to it than that. We understand why people behave the way they do on an intuitive level, and feel like we can predict how they will behave. We don’t have this empathy for current smart systems.


Who Will Own the Robots?
It is notoriously hard to determine the factors that go into job creation and earnings, and it is particularly difficult to isolate the specific impact of technology from that of, say, globalization, economic growth, access to education, and tax policies. But advances in technology offer one plausible, albeit partial, explanation for the decline of the middle class. A prevailing view among economists is that many people simply don’t have the training and education required for the increasing number of well-paying jobs requiring sophisticated technology skills. At the same time, software and digital technologies have displaced many types of jobs involving routine tasks such as those in accounting, payroll, and clerical work, forcing many of those workers to take more poorly paid positions or simply abandon the workforce.


A Manifesto for Creating Extraordinary Teams
Well, there's a name for that state of mind, it's called "flow" and a good friend of mine, Dr. Judy Glick-Smith, has been studying it for years. She recently wrote an article about it that captures perfectly what flow is all about and how to create teams that sustain a flow-state. I'm borrowing heavily from it here because it is a manifesto that I believe every leader should know by heart. Yes, I'm looking at you! ... Creativity and innovation are the inevitable results of unfettered team-flow. If all of these components are in place, each individual in the organization becomes a leader. Change is integrated into the fabric of the culture. Your people will embrace change, because they are creating it on a moment-by-moment basis.


Do the mobile developers you hire thoroughly understand the internet of things?
When you hire mobile app developers to create modern apps that work across multiple devices, they should have a clear understanding of IoT and its user experiences, as well as the intricacies involved. For mobile app designers, human-computer interaction now takes place in a variety of contexts because of the mobility of these devices. Designers have to deal with different resolutions and scale designs accordingly. They have to address everything from the tiny screens of wearable smartwatches at one end, through smartphones, tablets and desktops, to TV user interfaces.


DockerCon 2015: Game On
The bug that is being put in our ear is that enterprises are worried about security. Well, yeah, enterprises are always worried about security, but that’s not the point. While on the one hand, Docker does not present a conventional “attack surface” for the typical malicious user, it also does not present a conventional platform for the typical security vendor or security service. All security now, whether containerized or virtualized or on Facebook’s bare metal servers, is no longer a matter of hardening endpoints, but rather of maintaining the desired state of connections in the network. At this moment, even after a few years of rapid development, we don’t really know what a containerized network will look like, once the architectural debates get settled.



Quote for the day:

“No great manager or leader ever fell from heaven; it’s learned, not inherited.” -- Tom Northup

June 21, 2015

Nest keeps smart home portfolio neat and tidy with latest upgrades
The Nest team on the show floor demonstrated how each of these products (and more made by others) can work together on the connected fabric. For example, the Dropcam can communicate with the Nest Thermostat to automatically turn on motion alerts when the thermostat is set to "Away." Dropcam can also record clips when Nest Protect detects smoke. But rather than trotting out even more connected appliances throughout the home, Nest is working harder with what it already has, which not only offers the possibility of reducing clutter but also of cutting the long-run cost of buying multiple gadgets. With Nest, Fadell elaborated, consumers don't have to choose a bundle of products or a platform -- they only have to start with one, which can be accessed, monitored and managed from anywhere worldwide through a mobile device.


The Startup Illusion
Contemporary entrepreneurs no longer adhere to the traditional model of professional success: work hard for many years and, one day, you’ll “make it.” Instead, today’s collegiate youth have bought into the Zuckerberg model. Young, aspiring entrepreneurs believe that with a brilliant idea and a little bit of luck they, too, can be billionaires, potentially overnight. Entrepreneurs are choosing to embrace the lie of probable success whilst ignoring the daunting statistics that contradict such thinking, such as the fact that 80 percent of startups fail within 18 months. I, too, turned a blind eye. It’s difficult not to buy your own hyperbole. However, after an entire year’s worth of work crumbled in a matter of hours, I opened my eyes and saw through the startup illusion.


What the FCC's new robocall rules mean for your company's marketing efforts
Telemarketing efforts are widely used in both business-to-consumer and business-to-business marketing efforts, and the stakes are high. Earlier this year, Twitter called on the FCC to rule that those who call or text a wireless phone number for which consent was previously given should not be held accountable if that’s no longer the case when it is reassigned. Twitter did not respond to a request to comment for this story. “My hope would be that robocalling wouldn’t be part of any corporation’s communication strategy in 2015,” said author and Internet marketing consultant Brian Carter. The marketing world has moved toward opt-in communication, Carter noted.


Beyond Automation
Intelligent machines, Nicita thinks—and this is the core belief of an augmentation strategy—do not usher people out the door, much less relegate them to doing the bidding of robot overlords. In some cases these machines will allow us to take on tasks that are superior—more sophisticated, more fulfilling, better suited to our strengths—to anything we have given up. In other cases the tasks will simply be different from anything computers can do well. In almost all situations, however, they will be less codified and structured; otherwise computers would already have taken them over. We propose a change in mindset, on the part of both workers and providers of work, that will lead to different outcomes—a change from pursuing automation to promoting augmentation.


The Benefits of a Cloud Integrated Hyper-converged Architecture
The key benefit of an HCA though is its inherent simplicity. This is especially true if the architecture is delivered in a turnkey fashion that includes hardware and software, allowing the architecture to be scaled out as easily as adding additional bricks to a stack of Lego blocks. The result is a quicker time to value, since implementation is far simpler, and thanks to the integration there are fewer components to manage. The end result is a reduced total cost of ownership that allows the business to more rapidly extract value from their IT investments. HCA allows an organization to deliver IT services in the same way that large public cloud providers do, essentially creating a private cloud.


Elon Musk To Build A Hyperloop Test Track, Puts Out Call For Pod Designs
SpaceX says it is not getting into the loop business, merely that “it is interested in helping to accelerate development of a functional hyperloop prototype,” says a spokesman. There are still scores of engineering and mechanical issues to resolve around safety mechanisms, costs, propulsion and suspension systems and manufacturing techniques. Teams are welcome to submit entire pod designs, individual subsystems or safety features. SpaceX says it will also likely build its own pod, which will not be eligible to win the competition. Criteria for winning the competition come out in August.


What role does artificial intelligence play in Big Data?
Analysing large data sets requires developing and applying complex algorithms. To date, humans have had to come up with hypotheses, identify relevant variables and then write algorithms to test these theories against the information collected in big data sets. However, as data sets become larger, it becomes harder for humans to make sense of it all, which limits the insights that can be gained from all this information. AI allows organisations to add a level of intelligence to their Big Data analytics and understand complex issues faster than humans are able to. It can also serve to fill the gap left by not having enough human data analysts available.


DevOps Deep Dive: Infrastructure as code for developer environments
By taking the Infrastructure as Code approach to this problem, you gain flexibility and extensibility for your solution and overcome the limitations of image-based configuration. Specifically, because we defined our desired end state in code, we can modify the configuration attributes, change the versions of the software being installed, change the location of the software repository, change the plugins required, or modify any other aspect of the desired configuration, all by changing the code. Contrast that flexibility against the static nature of an image, and the volume of work required to make a small change to an image-based configuration.
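As a toy, hedged illustration of "desired end state in code" (all names and versions below are hypothetical, and a real environment would use a tool such as Chef, Puppet or Ansible), the sketch shows a declarative description that a converge function consumes, so changing a version or adding a plugin is a one-line, reviewable code change rather than a rebuilt image.

```python
# Toy illustration of "desired end state as code" (hypothetical names/values;
# a real environment would use a tool such as Chef, Puppet, or Ansible).
DESIRED_STATE = {
    "jdk_version": "8u45",
    "ide": {"name": "eclipse", "version": "4.4"},
    "repo_url": "https://artifacts.example.com/tools",   # change here, not in an image
    "plugins": ["checkstyle", "findbugs"],
}

def apply_state(state: dict) -> None:
    """Pretend to converge a workstation to the declared state."""
    print(f"ensure JDK {state['jdk_version']} from {state['repo_url']}")
    print(f"ensure {state['ide']['name']} {state['ide']['version']}")
    for plugin in state["plugins"]:
        print(f"ensure plugin {plugin} is installed")

apply_state(DESIRED_STATE)
# Bumping jdk_version or adding a plugin is a one-line code change, versionable
# and reviewable, unlike rebuilding and redistributing a golden image.
```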


Probabilistic Project Planning Using Little’s Law
Little's Law helps us take an "outside view" on the project we are forecasting, based on knowledge about actual system performance on a reference class of comparable projects. The method can help any team that uses user stories for planning and tracking project execution, no matter the development process used. ... Little's Law deals with averages. It can help us calculate the average waiting time of an item in the system or the average lead time for a work item. In product development, we break the project delivery into a batch (or batches) of work items. Using Anderson’s formula, we can forecast how much time it will take for the batch to be processed by the development system.
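A hedged worked example of the arithmetic, with made-up numbers: Little's Law gives average lead time as average work in progress divided by average throughput, and the same reference-class throughput can be used to forecast a batch.

```python
# Worked example of the arithmetic (numbers are made up, not from the article).
# Little's Law: avg lead time = avg work in progress / avg throughput.
avg_wip = 12            # work items in progress, on average
throughput = 4          # work items finished per week, on average
avg_lead_time = avg_wip / throughput
print(f"average lead time per item: {avg_lead_time:.1f} weeks")

# Forecasting a batch using the same reference-class throughput:
batch_size = 60         # user stories in the next release
forecast_weeks = batch_size / throughput
print(f"forecast for the batch: {forecast_weeks:.0f} weeks")
```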


SAS automates data modeling for fast analysis
SAS Factory Miner can use any source of data, as long as the data itself can be formatted into a table. The software, run from a server and accessed with a browser, offers a graphical point-and-click interface. It comes with a set of customizable templates for creating baseline models. Analysts can fine-tune or revise any of the computer-generated models. To help pick the best models, the software uses a number of machine learning algorithms that, through repeated testing of the models, can recognize patterns to anticipate future performance. One unnamed customer used an early version of the software to build 35,000 different models in order to find the best approach for a marketing campaign.
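SAS's product is proprietary, but the general pattern it automates can be sketched in a few lines of scikit-learn (a hedged illustration, not Factory Miner code): fit several templated baseline models and keep the one with the best cross-validated score.

```python
# Not SAS Factory Miner code - just a sketch of the general pattern: fit
# several templated models and keep the one with the best validation score.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

templates = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in templates.items()}
best = max(scores, key=scores.get)
print(scores)
print("best baseline model:", best)
```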



Quote for the day:

"Success is the result of good judgement, which is the result of experience & experience is often the result of bad judgement" -- tony robbins

June 20, 2015

The APIs.json Discovery Format: Potential Engine in the API Economy
The goal of APIs.json is to provide a simple, common format that can be used to index APIs and the supporting elements of API operations. APIs.json works much like the Sitemap XML format. But instead of indexing websites, APIs.json is designed to index APIs and offer that index at a well-known location where API providers can publish an index of their API resources. APIs.json is designed to give API providers an easy way to update their own index but also allow other search engines, directories, and API service providers access to that local index, making all API resources within the domain discoverable.
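To make the format concrete, here is a minimal index in the spirit of APIs.json, built and serialised with Python's json module. The field names approximate the specification and should be checked against apisjson.org rather than treated as authoritative.

```python
# A minimal index in the spirit of APIs.json, serialised with Python's json
# module. Field names approximate the format; treat apisjson.org as the
# authoritative schema, not this sketch.
import json

index = {
    "name": "Example, Inc.",
    "description": "APIs offered at this domain",
    "url": "https://example.com/apis.json",        # the well-known location
    "apis": [
        {
            "name": "Example Orders API",
            "description": "Create and query orders",
            "baseURL": "https://api.example.com/v1/orders",
            "properties": [
                {"type": "Swagger",
                 "url": "https://api.example.com/v1/orders/swagger.json"},
            ],
        }
    ],
}

print(json.dumps(index, indent=2))
```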


Can You Really Define Culture? 4 Lessons From a Growing Startup
Culture is a common theme these days; every startup CEO talks about their amazing culture and how it drives them and inspires their team. Research shows that companies with a high-performance culture have a distinct competitive advantage in part because competitors cannot duplicate your culture like they can copy your technology. Investors are known to invest in the team and often, its underlying culture. One of the key components of a winning team from a venture capital perspective is the clear articulation and proof of that amazing culture. So I find myself now wondering what it really is for our company. How do I define it? And more importantly: How on earth am I going to institutionalize it as we grow?


How to stop the Internet of Things overwhelming your network
The internet can be unreliable and disconnect and reconnect with very little warning. Internet connection speeds can also vary between different clients and devices. The problem is that the IoT assumes the internet is reliable and able to transmit information in real-time. However, this isn’t the case. As human beings, we are notoriously impatient and this is true when it comes to our apps as we want the information we require straight away – internet connections are easily dropped and can often take a while to reconnect. The IoT doesn’t account for this. This is particularly important when it comes to banking apps on a smartphone.
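One common, hedged mitigation for this kind of flaky connectivity is to retry with exponential backoff instead of assuming the network is always there; the sketch below shows the idea, with a hypothetical endpoint and made-up retry limits.

```python
# A hedged sketch (not from the article) of retrying with exponential backoff
# when a connection drops. Endpoint and limits are hypothetical.
import socket
import time
import urllib.error
import urllib.request

def fetch_with_backoff(url: str, attempts: int = 5, base_delay: float = 0.5) -> bytes:
    """Fetch url, retrying with exponential backoff on network errors."""
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()
        except (urllib.error.URLError, socket.timeout):
            if attempt == attempts - 1:
                raise                                   # out of retries
            time.sleep(base_delay * (2 ** attempt))     # 0.5s, 1s, 2s, 4s...
    raise RuntimeError("unreachable")

data = fetch_with_backoff("https://example.com/device/telemetry")
print(len(data), "bytes received")
```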


How to structure an outsourced IT project for less risk, more leverage
“This comes into play when implementing a software-as-a-service platform,” says Alpert. “In these implementations there is typically a much smaller software development and testing lifecycle and more focus on agile configuration and testing.” An IT organization may also like the clarity that can accompany working with a sole provider. Unfortunately, “the perceived accountability benefits of ‘one throat to choke’ are typically unrealized due to poor commercial structure and provider unwillingness to accept real risk,” explains Alpert. “With a single provider, future phases of work are often overpriced due to lack of competitive leverage, and the project scope is not yet well defined to determine the discrete schedule, deliverables, requirements, and timeline to hold the provider accountable.”


Three of the worst responses to cyber security threats
A large part of cyber security is monitoring; without monitoring your network, it’s damn near impossible to know which threats you’re facing and what they’re targeting. So, if you get a red flag about a possible intrusion, or several members of staff raise concerns, then you listen, gather all the evidence you can, and come to a conclusion about whether to do something. Or you can do what the following three organizations did ... “Backing up data is one thing, but it is meaningless without a recovery plan – and not just any recovery plan, but one that is well-practiced and proven to work time and time again,” Code Spaces said. “Code Spaces has a full recovery plan that has been proven to work and is, in fact, practiced.”


IT staff should be embedded in business
“It is an exceptionally lean approach to IT, but it is also extremely flexible in growth and changing situations,” says Alppi. The core IT team also gets some outside help. Beyond Alppi’s five specialists, Rovio has 20 to 30 employees (excluding games developers) with IT-related job descriptions. Instead of having IT as a separate bastion, they work for different units in the company. “Most of our business IT people work inside business units and are our major internal stakeholders. It allows them to be very hands-on with what is happening there. “Typically, anyone with even the slightest association with IT is put into the IT department and then you assign an IT manager to every business unit, but in our model those in charge of business IT also work in the business,” says Alppi.


Q&A on Test Driven Development and Code Smells with James Grenning
TDD leads to code that does what the programmer thinks the code is supposed to do. Modules are developed with an executable specification of the module, the test cases. The test cases document very precisely what the code is supposed to do. If the code starts to violate the specification, a test fails. One of the big problems with code is that unwanted side effects are very difficult to anticipate. I make a change to one part of the code, and a seemingly unrelated other part of the code breaks. ... Simply, if you cannot identify with some precision a problem in the code’s structure, how can you fix it? I recall that code reviews in my career were usually just a matter of opinion. “I don’t like that, I would have done this”, totally unsupported. Any programmer can announce “this code stinks”, but that is not good enough.
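As a tiny, hypothetical illustration of "test cases as executable specification" (the function and its rules are invented for this example), the unittest sketch below fails the moment the code drifts from what the tests say it should do.

```python
# Hypothetical illustration of tests as executable specification: if
# discount() ever violates the rules below, a test fails and points at the break.
import unittest

def discount(order_total: float) -> float:
    """Apply 10% off to orders of 100 or more; smaller orders are unchanged."""
    if order_total >= 100:
        return round(order_total * 0.9, 2)
    return order_total

class DiscountSpec(unittest.TestCase):
    def test_orders_of_100_or_more_get_ten_percent_off(self):
        self.assertEqual(discount(200.0), 180.0)

    def test_small_orders_are_unchanged(self):
        self.assertEqual(discount(40.0), 40.0)

if __name__ == "__main__":
    unittest.main()
```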


Information Security - Reducing Complexity
There has been a drastic change in the threat landscape since the 1980s and 1990s. Between 1980 and 2000, a good anti-virus and firewall solution was considered good enough for an organization. Now those alone are not enough, and hackers are using sophisticated tools, technology and skills to attack organizations. The motive behind hacking has also evolved, and on that front we see that hacking, though illegal, is a commercially viable profession or business. ... The driver behind adopting these evolving technologies is business need. As businesses want to stay ahead of the competition, they leverage the evolving technologies and surge ahead. With a shorter time to market, all departments, including the security organization, should be capable of accepting and implementing such changes at a faster pace.


IT Professionals lack confidence in board’s cyber security literacy
“There’s a big difference between cyber security awareness and cyber security literacy,” said Dwayne Melancon, chief technology officer for Tripwire. “If the vast majority of executives and boards were really literate about cyber security risks, then spear phishing wouldn’t work. I think these results are indicative of the growing awareness that the risks connected with cyber security are business critical, but it would appear the executives either don’t understand how much they have to learn about cyber security, or they don’t want to admit that they don’t fully understand the business impact of these risks.”


EBay's security chief says collaboration key to keeping data safe from cyberattacks
On a high level there are primarily three reasons that drive hacker activity. The first one is kind of the category that Sony fell into, and that is state-sanctioned or government-authorized hacks. In that scenario they're usually trying to send a message, but it's something that allegedly is authorized by a state or a government. The second category is hackers that are looking to monetize their hacks. They're out there hoping to get something they can sell and make money. The third one is really your activist hacker. Those are the ones that want to deface a website to put their message up. They don't do anything really to extract money. They're just trying to send a message, which also falls into your Sony example.



Quote for the day:

“No one can make you feel inferior without your consent.” -- Eleanor Roosevelt