Daily Tech Digest - December 05, 2016

Should you go with Google's Go? 7 pros and cons

Go’s rise coincides with a rapid collapse of interest in C. Yes, C remains second on Tiobe’s list, but it has lost about 40 percent of programmer investment as computed by Tiobe’s complex metric. Built to be a stripped-down, efficient language for writing low-level code, Go shares many features with C, including much of the syntax. It’s hard not to conclude that a good part of Go’s newfound support is likely made up of former C programmers migrating to a new home. The Tiobe list isn’t about lines of code or job advertisements; instead, it tries to capture the pulse of the programming world by counting web searches and other behavioral metrics. It’s clear from Go’s large leap that people are starting to talk about Go for real-world projects, not merely fringe one-offs from startups.


Reality Check: Getting Serious About IoT Security

To determine the severity of the problem, I wanted to see how quickly an IoT device would be attacked once it was connected to the Internet. Would a user who bought an IoT webcam or printer have enough time to set up and securely configure the device before an attacker would compromise the device? ... The vast majority of the devices targeted by Mirai are running a stripped-down version of the Linux operating system, developed for multiple architectures (MIPS, ARM, x86, etc.). These machines generally run a tool called BusyBox — "The Swiss Army knife of embedded Linux," as developers refer to it. This single binary allows for the execution of more than 300 commands, cutting down on the space required of an operating system on an embedded device.


Respect and the Agile Workplace (a.k.a. 5 Failings of Your Humble Agile Architect)

It's quite common for me to be in a discussion when my mind races ahead to a solution for a problem that we're still spit-balling. And once I arrive at my solution, I'm anxious to get the conversation caught up to that point so we can just get on with it, dammit! But, of course, that doesn't work. Knowing this, I take a deep breath to calm myself, a technique I learned and have used since the sixth grade, and patiently help move the conversation forward at a more reasonable pace. And, of course, at this point I've made two mistakes. The first one, waiting patiently to get to my solution rather than helping the group get to some solution or a range of possible solutions, and the second one being the deep breath that's misinterpreted by others as a sigh of disinterest or impatience with them rather than my own frustration with myself.


What's Hot in Hiring: Data Security Consulting!

Information security can be broken down into two main areas: hardware and software. A data security consultant may be expected to have a wide understanding of their industry, but in reality they will specialize in only a few key areas. This means that employers need to be specific about who they’re looking for and the technologies that they use. It also means that jobseekers need to be upfront about their expertise, or they may risk finding themselves in a position that is beyond their current skillset, which could lead to career-impacting underperformance. As a consultant, the role is to advise on, develop, and implement change. This change is usually to address a problem that already exists. In the case of data security, this could mean that a security threat has already been identified, or it could mean mitigating possible threats with new technologies.


Why cybersecurity companies fail at selling to CISOs... and what to do about it

Why is Hayslip, who is also author of the book 'CISO Desk Reference Guide: A practical guide for CISOs', ranting at vendors? He likes them, he wants to help them do a better job at selling to CISOs, and he decided to offer them some hard-core advice. Cybersecurity software companies and solution providers ought to listen to what this CISO has to say in his manifesto, even if some of it may be hard to swallow. Hayslip tells it like it is. He isn't singling out particular vendors or sales reps. He has no vendetta against them. To be clear, Hayslip is heavily engaged in the cyber vendor community and he's an Advisory Board Member at the San Diego Cyber Center of Excellence (CCOE), a non-profit founded by local cybersecurity companies dedicated to accelerating the region's cyber economy.


Intel is Winning Over Blockchain Critics By Reimagining Bitcoin's DNA

The main critique to emerge is that participants would need to use Intel hardware like SGX to execute code in a protected area that can't be inspected or tampered with. That's how you "know" — in theory — that the blocks filled with transactions will be dispensed at a certain interval, and that those transactions are correct. And you know that it can't be tampered with because of the cryptography involved. "PoET uses this special processor capability to regulate block frequency rather than computation," Sawtooth Lake project manager Dan Middleton said, explaining that by using the protected area of the chip, the code is executed as designed. "This is what enables the return to one-cpu-one-vote," he continued, echoing an idea invoked in Satoshi Nakamoto's bitcoin white paper.


Alexa and Google Home Record What You Say. But What Happens to That Data?

Google users can find everything they’ve asked for by visiting myactivity.google.com while they’re logged into their account. This query museum doesn’t just include voice requests. It also includes any Google searches, YouTube videos, and apps you’ve launched on Android, among other things. It’s all presented in a neat, searchable chronological stack. There are user benefits to these personal audio catalogs. For cases where spoken-word answers aren’t very useful—recipes and search results, for example—Amazon and Google provide links to written content in the Alexa and Home apps. Both companies say these audio databases help each system serve up personalized content and learn the intricacies of your Maine accent.


CNN’s Quest Discusses Cyber Breaches, an “Existential Threat”

No institution, however big or grand, is safe. The global payments system SWIFT has embarrassingly admitted $100 million was stolen from one of its members who had been careless with authentication details. Even the US government has admitted data on millions of employees has been compromised. What makes cyber security breaches most worrying for companies is the existential threat that comes with them. Rob a bank branch and you only get the money inside the vault. Compromise a bank’s trading or transfer systems and, as the SWIFT CEO admitted recently, you create a threat to the very existence of the institution itself. Cyber attackers frequently squat in compromised systems for months before launching their attacks. It creates a huge challenge for companies.


The digital opportunity for CIOs

Left to their own devices, functional leaders will likely tackle each of the three opportunities in independent ways. For example, the chief marketing officer might just concentrate on the customer, the chief financial officer might just concentrate on the use of analytics for management insight or financial reporting, and the chief operating officer might just look at digitising parts of the supply chain. But while digital might help that leader’s particular function, overall it can lead to poor investments and jeopardise broader adoption across the business. Yet all these areas share a strong technology underpinning. The CIO is therefore well positioned to visualise the digital “big picture” and help guide investments that build the right mix of technology skills, architectures and delivery models.


Ransomware as a Service Fuels Explosive Growth

Orla Cox, director of security intelligence delivery at Symantec, said not only has the number of attacks increased, but the demanded ransom has as well. “The average ransom demand has more than doubled, and is now $679 (US dollars), up from $294 at the end of 2015,” she said. She added that 2016 “has also seen a new record in terms of ransom demands, with a threat known as 7ev3n-HONE$T (Trojan.Cryptolocker.AD),” which demands a ransom of 13 Bitcoin per computer, or $5,083 at the time of its discovery in January. One reason for that explosive growth is probably that, even with headlines and continuous warnings about it, most individuals and organizations remain woefully vulnerable. Even when protection is available, they don’t always use it.
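As a quick sanity check on those figures, the ransom demand implies a Bitcoin price of about $391 at the time of discovery, consistent with early-2016 exchange rates:

```python
# Implied USD/BTC rate behind the 7ev3n-HONE$T ransom figures above.
ransom_btc = 13
ransom_usd = 5_083
usd_per_btc = ransom_usd / ransom_btc
print(usd_per_btc)  # 391.0
```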



Quote for the day:


"Fear causes hesitation and hesitation will cause your worst fears to come true." -- Patrick Swayze


Daily Tech Digest - December 04, 2016

Dive Deep Into Deep Learning

The most remarkable thing about deep learning is that we don't program these systems to perform any of the acts described above. Rather, we feed the deep learning algorithm tons of data, such as images or speech, to train it, and the algorithm figures out for itself how to recognize the desired targets. The ability of deep learning methods to learn complex nonlinear relations by churning through large amounts of data and creating features by themselves makes them stand out from traditional machine learning techniques. To understand how a standard deep learning algorithm works, we have to look at its predecessor, the neural network. Some practitioners also refer to deep learning as deep neural networks.


Machine learning: A new cyber security weapon, for good and ill

Darktrace claims its self-learning approach has been “inspired by the biological principles of the human immune system, identifying never-seen-before anomalies in real time, including insider threats and sophisticated attackers - without using rules, signatures or assumptions.” Modesty is not the company’s strong point. It claims to be “the only technology capable of detecting and responding to emerging cyber-threats, from within the network,” and that its self-learning software has been “recognised as the de facto standard for defending organisations of all sizes from constantly-evolving threats.” Darktrace announced Telstra as a customer in February, saying that the telco had decided to deploy the Darktrace Enterprise Immune System across its enterprise network “because of its unique capability to spot emerging abnormal behaviours in real time within the organisation.”


What is the Blockchain – part 5 – ICOs and DAOs

An ICO is increasingly being used by cryptocurrency and Blockchain startups to raise money by distributing a percentage of the initial coin supply. ... The tokens, or cryptocoins, which are sold during the crowdsale will be used on the platform to pay for transactions. ‘Investors’ that purchase these coins during the ICO do not get a share in the startup, but they hope that the price of the coin will rise so that they can get a (substantial) return on their investment. ... A DAO is a grouping of smart contracts connected together, possibly in combination with IoT devices, AI/machine learning and big data analytics. It is run by irreversible computer code, under the control only of a set of irrevocable business rules. As a result, a DAO has no governance by management or people; it is governed by code.


Growth Drivers, Trends, and Developments in UK Fintech Market

There is a move away from free float revenue models or paid subscriptions to alternative models based on monitoring and advertising, or the reselling of data to third-party firms. This is due to the data richness of financial services and the development of a liquid and sophisticated market for digital leads. Identity and fraud protection is another development in the UK fintech market. A connected world is complicated and makes the protection of personal financial details challenging. As start-ups come up with new and untested business models, security is often viewed as a secondary focus. Infrastructure replacement is also a development in the UK fintech market. Emergent fintech players are unsatisfied with current infrastructures and are side-stepping them. Infrastructures developed to replace the old ones include cryptographic currencies like Bitcoin and peer-to-peer networks.


Trump presidency could sound death knell for offshore outsourcing

“Any Trump-inspired reform of the U.S. immigration laws will likely make it harder to move employees into the U.S. market,” says Peter Bendor-Samuel, CEO of outsourcing research firm Everest Group. “This will likely take the form of fewer H-1Bs, higher costs for visas, and caps on the number of visas the firms can utilize. That would likely result in IT services firms having to hire more U.S.-based resources, raising operating costs and reducing the labor cost advantages of offshore outsourcing.” ... Industry observers expect the corporate lobby to push back on populist proposals. “Politics is still very much a money sport,” says Bendor-Samuel. “Trump is likely to quickly find that campaigning and governing are far different, with members of Congress being much more concerned about corporate welfare than the average voter.”


Enterprise architecture model helps to maximize mobile empowerment

The biggest problem with mobile empowerment is that typical strategies don't account for mobility; they account for mobile devices only. A worker supported by a mobile device doesn't need the same information again, simply reformatted for mobile display; they need different information, because the availability of IT support at their activity points changes how they work. Ideally, an enterprise architecture model could step back to the business processes and then define their implementation in a mobility-optimized way. ... The value many enterprise architects deliver in driving mobility empowerment will be reduced if pure business requirements are lost or confused. In every enterprise architecture model, there is an implicit or explicit boundary between abstract business process requirements and explicit methods dictated by available IT tools.


How today’s tech tools take marketing automation to the next level

It’s no longer sufficient to send the same message to thousands of people at once. Businesses realize they must reach out to customers on an individual basis for them to pay attention. This starts with creating a subject line that will connect with them as they’re casually scrolling through their overstuffed inboxes, but it continues to the message itself. When an email contains information that specifically speaks to a customer’s preferences, that customer is more likely to take action. Many of today’s top email marketing tools offer the opportunity to direct email messages to certain audience segments. You can deploy one set of emails to customers who have purchased from you before, for instance, and another for customers who have shown an interest but never bought anything.
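The segmentation described above amounts to partitioning a contact list on purchase history; a minimal Python sketch (the field names and segment labels are invented for illustration, not any particular email tool's API):

```python
# Partition a contact list into audience segments by purchase history,
# so each segment can receive a differently targeted message.
def segment_contacts(contacts):
    segments = {"buyers": [], "prospects": []}
    for contact in contacts:
        key = "buyers" if contact.get("purchases", 0) > 0 else "prospects"
        segments[key].append(contact["email"])
    return segments

contacts = [
    {"email": "a@example.com", "purchases": 3},
    {"email": "b@example.com", "purchases": 0},
]
print(segment_contacts(contacts))
# {'buyers': ['a@example.com'], 'prospects': ['b@example.com']}
```

Real platforms expose this as saved segments or lists, but the underlying idea is the same filter.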


16 high-tech features you need in your next car

Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication is basically exactly what it sounds like: a type of communication technology that lets cars talk to other vehicles, as well as surrounding infrastructure like traffic lights. Why is this important? Because as cars become more autonomous they will need to be able to communicate with other cars on the road in order to operate more safely. ... More automakers are beginning to offer WiFi and LTE 4G connectivity in their newer vehicles. However, you'll still have to pay for whatever data plan you opt for. WiFi and LTE 4G basically turn your car into a hotspot, allowing you to connect several devices to the network. This means passengers can easily stream music and video, and surf the web, without having to worry about killing their mobile devices' batteries.


Internet Archive Seeks Emergency Backup - in Canada

Presumably, the Internet Archive has backups in place. But war and natural disasters aside, Kahle says deeper, intentional actions have previously affected libraries, citing in particular "legal regimes" and "institutional failure." "Throughout history, libraries have fought against terrible violations of privacy - where people have been rounded up simply for what they read," he writes. "At the Internet Archive, we are fighting to protect our readers' privacy in the digital world." Never before have humans had so much access to information as they do through the internet. And never before have governments, spies, cybercriminals and others been able to exploit it for profit, surveillance and influence.


Best practices for lowering cyber insurance costs and cyber risk

With cybersecurity threats on the rise, companies are increasingly taking advantage of cybersecurity insurance. And while cyber insurance can be worth it, it’ll cost you. Last year, U.S. insurers earned $1B in cyber premiums. You can minimize your premiums by showing your insurance company you’re actively mitigating cyber risks, which is a win-win: you lower your risk and secure a more cost-effective insurance plan. Purchasing cyber insurance for the first time can be intimidating because every insurance vendor has unique offerings, but here are two best practices on how to approach cyber insurance to ensure it’s a good fit and cost-effective for your company.



Quote for the day:


"Men who are in earnest are not afraid of consequences." -- Marcus Garvey


Daily Tech Digest - December 03, 2016

Inside the black box: Understanding AI decision-making

Sometimes, bias can be introduced via the data on which neural network-based algorithms are trained. In July this year, for example, Rachael Tatman, a National Science Foundation Graduate Research Fellow in the Linguistics Department at the University of Washington, found that Google's speech recognition system performed better for male voices than female ones when auto-captioning a sample of YouTube videos, a result she ascribed to 'unbalanced training sets' with a preponderance of male speakers. As Tatman noted, a few incorrect YouTube captions aren't going to cause any harm, but similar speech recognition biases in medical or connected-car applications, for example, would be another matter altogether.


New workplace is agile and nonstop

“Work has changed, and everyone needs more expertise, more consultation,” said Pamela Hinds, a professor of management science and engineering at Stanford. “There’s more speed with which projects have to get out, because of competition, and people are pulled on and off projects much more.” At the Museum of Applied Arts and Sciences in Sydney, a government-mandated transition from traditional computers to cloud-computing systems has everyone planning exhibitions and raising money on Jira, a software development tool for managing cloud projects quickly. “We change light bulbs on Jira. It’s how we plan all our exhibitions,” said Dan Collins, head of digital and media at the museum. “Things move a lot faster, with fewer meetings. Tools are more important than organizational charts.”


The Top 7 Big Data Trends for 2017

The most well-known platform for smart contracts is Ethereum. Ethereum is a decentralised platform for applications (DApps) that run exactly as programmed without any chance of fraud, censorship or third-party interference. Although Ethereum is still a very young platform, and has some challenges with involuntary hard forks, the opportunities of irreversible smart contracts linked together on a platform like Ethereum are enormous. Multiple startups are developing similar platforms such as Synereo, Maidsafe or the latest platform Ardor. They are all trying to build the decentralised internet. 2017 will see these platforms growing up, although we will probably also see some issues related to these platforms. However, slowly the technology of a decentralised internet is growing up and smart contracts will be an important part of Blockchain 2.0.


Big Data Poised to Get Much Bigger in 2017

Businesses today have more data than ever, and it is growing rapidly, but if they do not know how to leverage that data, it becomes almost impossible to demonstrate the value of any Big Data project. “This could be due to the fact that many Big Data projects don’t have a tangible return on investment (ROI) that can be determined upfront,” said Heudecker. “Another reason could be that the Big Data initiative is a part of a larger funded initiative. This will become more common as the term ‘Big Data’ fades away, and dealing with larger datasets and multiple data types continues to be the norm.” “That is the very reason why companies like Xavient exist,” said Sabharwal. He added: “Xavient is committed to providing customers with tailored capabilities and solution flexibility and making our real-time data analysis solutions ubiquitous in an enterprise.”


Augmented reality, AI, and autonomous delivery -- is this the future of food?

Just Eat wants to continue harnessing the power of technology to ensure it continues to grow its customer base and keeps them as satisfied as possible -- and not just with their food, but with the whole online ordering experience as the takeaway industry grows. "Technology is at the heart of everything we do at Just Eat. We are always seeking ways to help our restaurant partners grow and ensure new and existing customers have a reliable, convenient and, increasingly, fun experience when they order from us," said David Buttress, chief executive of Just Eat, at the event. The company's development team is working on projects involving augmented reality, virtual reality, chat bots, voice communication, and even robots as it looks towards meeting the demands of the customer of tomorrow.


Blockchain Technology – What Is It and How Will It Change Your Life?

Blockchain means that we may no longer have to use layers of bureaucracy to reduce uncertainty. Warburg sees the potential of blockchain as an extension of Nobel Prize-winning economist Douglass North’s ‘New Institutional Economics’. Institutions, in this context, are just the rules, together with the organisations (whether informal or formal) that implement them: the law, for example, or even bribery. “As Douglass North saw it, institutions are a tool to lower uncertainty so that we can connect and exchange all kinds of value in society. And I believe we are now entering a further and radical evolution of how we interact and trade, because for the first time, we can lower uncertainty not just with political and economic institutions, like our banks, our corporations, our governments, but we can do it with technology alone.”


The 6 Ds of Tech Disruption: A Guide to the Digital Economy

The structure of organizations is changing. Instead of thousands of employees and large physical plants, modern start-ups are small organizations focused on information technologies. ... It no longer takes a huge corporation to have a huge impact. Technology is disrupting traditional industrial processes, and they’re never going back. This disruption is filled with opportunity for forward-thinking entrepreneurs. The secret to positively impacting the lives of millions of people is understanding and internalizing the growth cycle of digital technologies. This growth cycle takes place in six key steps, which Peter Diamandis calls the Six Ds of Exponentials: digitization, deception, disruption, demonetization, dematerialization, and democratization.


Preparing your enterprise for IoT and automation in the workplace

Fog computing is a distributed computing approach in which some application services are controlled at the network edge, in a smart device, and others in a remote data center or cloud environment. Fog computing allows a considerable amount of processing to occur at the edge of the network, in a smart router or other gateway device. See also Mobile Edge Computing. The ability to process and analyze data at the edge becomes even more important as it reduces latency, provides for real-time analytics and quick decision-making, and works best where we have a high volume of sensors or connected devices. Specific industries where this makes the most sense include industrial verticals, smart cities, intelligent buildings, and oil and gas or energy.


Technologies for the Future of Software Engineering

Continuous delivery requires all teams to communicate through the codebase by doing continuous integration to the trunk. Teams keep the software always production-ready; if that’s not the case you have to stop and make it so. While deployment is continuous, release is incremental, by toggle or switch, whenever a useful increment or capability is ready. Continuous delivery provides essential end-to-end feedback, argued Poppendieck. Research indicates that product managers are wrong half the time, and that two-thirds of the features and functions in a specification are unnecessary. This is a consequence of trying to decide what to build in detail before trying experiments to see if a feature really addresses the problem at hand.
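The toggle-based release Poppendieck describes can be sketched in a few lines of Python; the flag and function names here are hypothetical:

```python
# Deployed code ships dark; flipping a toggle releases the increment
# without a new deployment.
FEATURE_FLAGS = {"new_checkout": False}

def legacy_checkout(cart):
    return sum(cart)

def new_checkout(cart):
    return round(sum(cart) * 0.9, 2)  # e.g. a new discount capability

def checkout(cart):
    if FEATURE_FLAGS.get("new_checkout"):
        return new_checkout(cart)  # released increment
    return legacy_checkout(cart)   # current behaviour

print(checkout([10, 20]))              # 30: flag off, legacy path
FEATURE_FLAGS["new_checkout"] = True   # the "switch" that releases it
print(checkout([10, 20]))              # 27.0: new capability live
```

The point is that deploy and release become separate decisions: code reaches production continuously, while the toggle controls when users see it.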


Balancing Employee Privacy with Company Security in Mobile Policies

The figures are shocking: a recent study from the Ponemon Institute found 70 percent of respondents believe that the failure to secure company data on mobile devices results in data breaches. The study also found 67 percent believe it’s certain or likely that data breaches are caused by employees using mobile devices to access sensitive and confidential company information. Only 33 percent of respondents believe their organization is vigilantly protecting sensitive or confidential data from unauthorized employee access. In addition to lax monitoring of employee usage, there are other ways employees can invite hackers and breaches into company systems. Accessing or using unsecured Wi-Fi in public places, such as airports or hotels, can allow hackers to view everything employees work on and download.



Quote for the day:


"All progress takes place outside the comfort zone." -- Michael John Bobak


Daily Tech Digest - December 02, 2016

Travel Security Tips for Personal and Business Trips

While you may not have much say in when and where you travel, understanding your trip’s goals can help determine the best business security practices. A quick, one-day trip to meet a business partner might mean you can leave your computer at home, for example. A month-long globe trot to multiple satellite offices, client meetings and a little R&R would require a more rigorous approach to securing all of your devices. It is equally important to know the purpose of your trip, the systems and access you will require while traveling, the sensitivity of information you will be handling and the available security resources. These points will determine what travel security precautions you should take before you even pull out your suitcase.


Major cybercrime network Avalanche dismantled in global takedown

To shut down Avalanche, law enforcement agencies embarked on an investigation that lasted longer than four years and involved agents and prosecutors in more than 40 countries, according to the U.S. Department of Justice. Europol said 39 servers supporting Avalanche were seized, and another 221 were forced offline with notifications sent to their hosting providers. Investigators used a method known as sinkholing to infiltrate the cybercriminals' computer infrastructure and disrupt their activities. This involved redirecting the internet traffic from Avalanche's infected computers to servers controlled by law enforcement. "The operation marks the largest-ever use of sinkholing to combat botnet infrastructures and is unprecedented in its scale," Europol said in a statement.
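Conceptually, sinkholing replaces the name-to-address mapping that infected machines depend on, so bot traffic lands on servers the investigators control. A toy Python sketch (all domains and addresses here are made up, drawn from the reserved documentation ranges):

```python
# Toy model of DNS sinkholing: queries from infected machines for seized
# C2 domains are answered with an address investigators control.
SINKHOLE_IP = "192.0.2.1"  # documentation-range address standing in for a law-enforcement server
seized_domains = {"evil-c2.example", "avalanche-node.example"}

def resolve(domain, dns_table):
    if domain in seized_domains:
        return SINKHOLE_IP          # redirected: traffic can now be observed
    return dns_table.get(domain)    # normal resolution for everything else

dns_table = {"evil-c2.example": "203.0.113.9", "shop.example": "198.51.100.7"}
print(resolve("evil-c2.example", dns_table))  # 192.0.2.1
print(resolve("shop.example", dns_table))     # 198.51.100.7
```

In the real operation this rerouting happened at registrars and name servers rather than in application code, but the effect is the same: the botnet's rendezvous points answer to law enforcement.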


Why Small Businesses Should Get Smart About Information Security

In many ways, small businesses have even more to lose than large ones simply because an event—whether a hacking, natural disaster, or business resource loss—can be incredibly costly. The report begins by noting that while cybersecurity improvements by some businesses have rendered them more difficult attack targets, this has led hackers and cyber criminals to focus more of their attention on less secure businesses. One reason for this is that small businesses, including startups, often lack the resources to invest in information security as larger businesses can. Many fall victim to cyber-crime. In a later comment on the report, author Pat Toth stated, "[s]mall businesses may even be seen as easy targets to get into bigger businesses through the supply chain or payment portals."


Mobile Device Security is the need of the hour

One of the biggest apprehensions when it comes to using Android devices in any government & enterprise environment is the lack of security for the mobile device & the data on it. Google recently unleashed one of its biggest marketing campaigns and product launches outside the US. It is with the launch of Android One that it wants to capture the other billion. Android has been a major success for Google which, in spite of worldwide dominance in terms of Android users, is still not able to tightly manage its ecosystem. Fragmentation of software, screen sizes and resolutions was hurting the app developer ecosystem. Android One as a strategy comes like a knight in shining armor for Google, one that will reduce fragmentation by strongly controlling what goes into the phone.


Six must-haves for IT's mobile security checklist

Let's face it -- there is no such thing as absolute security, and there likely never will be, simply because allowing even restricted access to any resource means that someone might compromise that access. Hackers can be bright but misguided; professional information thieves are like any other spies on a critical mission, with the goal of stealing information or disrupting an organization's operations, often with devastating results. Since it's impossible to guarantee absolute security, the mission for IT administrators is to make any compromise of enterprise mobile security so difficult that all but a handful of hackers with access to nation-state-level resources will simply give up. The basics of good security practices are the same, regardless of organizational mission, size or the specific infrastructure and tools.


2017 security predictions

Cybersecurity professionals will struggle to protect critical infrastructure, connected systems and remotely accessed systems and devices while weak password practices remain the norm, but it's not just external threats that are a problem. Mitigating insider threats can also be accomplished through better password management, he says. The best way to do so is to implement a solution that securely stores passwords that remain unknown to users, and then regularly validates and rotates those passwords to ensure safety and security, he says. "What we're talking about is credential vaults. In an ideal world, a user would never actually know what their password was -- it would be automatically populated by the vault, and rotated and changed every week. Look -- hackers are intrinsically lazy, and they have time on their side. ..," Dircks says.
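The credential-vault idea Dircks describes, where users never handle their own passwords and rotation happens automatically, can be sketched with Python's standard library (a conceptual toy, not a production vault):

```python
import secrets
import string

class CredentialVault:
    """Toy vault: stores passwords the user never sees and rotates them on demand."""
    _alphabet = string.ascii_letters + string.digits

    def __init__(self):
        self._store = {}

    def rotate(self, account):
        # Replace the stored secret with a fresh random one; no human ever reads it.
        self._store[account] = "".join(
            secrets.choice(self._alphabet) for _ in range(24)
        )

    def fill_login(self, account):
        # Auto-populate the credential at login time (a real vault would inject
        # it straight into the target system or session, not return it).
        return self._store[account]

vault = CredentialVault()
vault.rotate("db-admin")
before = vault.fill_login("db-admin")
vault.rotate("db-admin")  # e.g. a weekly rotation invalidates stolen copies
print(vault.fill_login("db-admin") != before)  # True
```

Because the user never knows the secret, it cannot be phished from them, and regular rotation bounds the useful life of any credential that does leak.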


Cybersecurity: Steps To Manage Cyber Risks Effectively

Hackers are targeting organizations from all industries, including not-for-profits and charities, by using techniques ranging from Advanced Persistent Threats ("APT") to sophisticated spear phishing campaigns. In such an environment, how should organizations prepare for the unexpected? While the challenge is significant, it is not insurmountable. The impact of a cyberattack on an organization can be significant. In many instances, an organization can lose the trust of its internal and external stakeholders if it comes to light that it had not put sufficient time, resources and energy into preparing for a cyberattack. On the other hand, organizations that invest in planning for the likely eventuality of a cyberattack are much better positioned to deal effectively with and limit any negative consequence.


Implantable medical devices can be hacked to harm patients

At least 10 different types of pacemaker are vulnerable, according to the team, who work at the University of Leuven and University Hospital Gasthuisberg Leuven in Belgium, and the University of Birmingham in England. Their findings add to the evidence of severe security failings in programmable and connected medical devices such as ICDs. ... Previous studies of such devices had found all communications were made in the clear. "Reverse-engineering was possible by only using a black-box approach. Our results demonstrated that security by obscurity is a dangerous design approach that often conceals negligent designs," they wrote, urging the medical devices industry to ditch weak proprietary systems for protecting communications in favor of more open and well-scrutinized security systems.


Should application development have greater security-based regulation?

While he admits the likes of PCI compliance or the incoming GDPR are starting to help, none of them goes deep enough into the code level for O’Sullivan’s liking, and instead he would like to see new rules that focus on secure code development. “If the regulations just went a little bit deeper - to kind of look at a granular level where the problems really are - and mandated using certain types of frameworks and using certain types of controls at a code level, that would help.” “There's all sorts of controls built into your code, they're out there; OWASP [a non-profit repository of security information] is a great resource for that type of thing. There's cheat sheets for avoiding certain vulnerability types. Use them, put them in your code. Mandate that they get used, build that into regulations.”


Data Science Up and Down the Ladder of Abstraction

If you're thinking of developing your skills in data science, you've probably already considered Python or R. Python is an especially popular choice for those coming from a programming background since it's a good general-purpose scripting language which also provides access to excellent statistical and machine learning libraries. When I first started out in data science I used Python and scikit-learn to tackle a clustering project. I had some data gathered from social media on users' interests and I was trying to determine if there were cohorts of users within the whole. I chose spectral clustering because it could identify non-globular clusters (so must be better, I reasoned), and the first results were promising. My confidence quickly evaporated when I re-ran the clustering and got different results.
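(scikit-learn's clustering estimators expose a `random_state` argument for exactly this reason: without a fixed seed, results can change between runs.) As a dependency-free illustration of the underlying issue, here is a toy 1-D k-means whose output depends entirely on its random initialisation; the function and data are invented for the example.

```python
import random

def kmeans_1d(values, k, seed, iters=20):
    """Tiny 1-D k-means; the result is a function of the random init."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        # Recompute centroids (keep the old one if a cluster emptied).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9, 9.1, 9.0, 8.8]
# Pinning the seed makes the run reproducible; leaving it unpinned is
# how you end up with "different results" on a re-run.
assert kmeans_1d(data, 3, seed=42) == kmeans_1d(data, 3, seed=42)
```

The practical lesson is to record the seed alongside the results whenever a stochastic algorithm is part of an analysis.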



Quote for the day:

"Leadership is not about making all the decisions. It's about clarifying decisions to be made and supporting your people to make them." -- @NextNate

Daily Tech Digest - December 01, 2016

‘Cybersecurity has become a full-time job’ in healthcare

“Cybersecurity has become a full-time job,” Karl West, CISO of Intermountain Healthcare in Utah, said at AEHIX, an adjunct conference to the College of Healthcare Information Management Executives (CHIME) Fall CIO Summit this month in Phoenix. “There is a call for all of us to do better,” West said. He said that healthcare may only be at 30 percent to 50 percent of compliance with the required security regulations. Healthcare trails other industries in this area because it has spent so much money on transforming care with IT, while cybersecurity has ended up taking a back seat. At the annual U.S. News and World Report Healthcare of Tomorrow summit held earlier this month in Washington, D.C., Dr. Brian Jacobs, CMIO of Children’s National Medical Center, said that the hospital now dedicates 19 percent of its IT budget to security, Politico reported.


Destructive Hacks Strike Saudi Arabia, Posing Challenge to Trump

The ferocity of the attacks appears to have caught Saudi officials by surprise. Thousands of computers were destroyed at the headquarters of Saudi’s General Authority of Civil Aviation, erasing critical data and bringing operations there to a halt for several days, according to the people familiar with the investigation. There have been no reports of widespread transportation interruptions at the King Khalid International Airport in Riyadh or the other major airports. A spokesman for the aviation authority in Riyadh didn’t immediately respond to phone calls and e-mails requesting comment. The people familiar with the probe didn’t identify the other targets, but one said they were all inside Saudi Arabia and included other government ministries in the kingdom, a country where information is highly controlled.


Most Organizations Not Adequately Prepared for Cyber Attacks: Marsh Cyber Handbook

“While cyber breaches are one of the most likely and expensive threats to corporations, few companies can quantify how great their cyber risk exposure is, which prevents them from protecting themselves,” according to an article in the handbook titled “Can You Put a Dollar Amount on Your Company’s Cyber Risk?” “Most managers rely on qualitative guidance from ‘heat maps’ that describe their vulnerability as ‘low’ or ‘high’ based on vague estimates that lump together frequent small losses and rare large losses,” the article adds. ... The challenge is “to build a smart, well-designed cyber risk model that’s able to analyze potential direct revenue, liability, and brand loss scenarios.”


IoT to Get Security, Gateway Benchmarks

The working group for the gateway benchmark aims to deliver system-level benchmarks measuring overall throughput, latency and energy consumption for node-to-cloud communications. It will probably start with an industrial profile but has not yet specified what parameters it will measure. The group currently includes members from ARM, Dell, Flex and Intel and hopes to deliver a complete spec by next fall. It will use workloads generated across multiple physical ports to test multiple system components including the processor, physical and wireless interfaces and the operating system. “Today, without a standardized methodology, IoT gateway benchmarking is not realistic,” said Paul Teich, a principal analyst at Tirias Research and technical advisor to EEMBC.
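Although the EEMBC spec is still being written, the quantities it targets, throughput and latency, are straightforward to define. A minimal, hypothetical harness for the two might look like the sketch below; the function names and the stand-in workload are invented, and energy measurement (which needs instrumented hardware) is omitted.

```python
import time

def measure(workload, n_messages=1000):
    """Toy node-to-cloud benchmark: per-message latency and overall throughput."""
    latencies = []
    start = time.perf_counter()
    for i in range(n_messages):
        t0 = time.perf_counter()
        workload(i)                      # one message through the gateway
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    latencies.sort()
    return {
        "throughput_msgs_per_s": n_messages / elapsed,
        "median_latency_s": latencies[n_messages // 2],
        "p99_latency_s": latencies[int(n_messages * 0.99)],
    }

# Stand-in for a gateway hop (parse, transform, forward).
stats = measure(lambda i: sum(range(100)))
assert stats["p99_latency_s"] >= stats["median_latency_s"]
```

A standardized methodology would pin down exactly this kind of detail: which percentiles to report, how many ports to load simultaneously, and how long to warm up before measuring.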


MongoDB-as-a-Service on Pivotal Cloud Foundry

Mallika Iyer and Sam Weaver give a brief overview of Pivotal Cloud Foundry and take a deep dive into running MongoDB as a managed service on this platform. The MongoDB service for Pivotal Cloud Foundry leverages the capabilities of Bosh 2.0 for on-demand dynamic provisioning of services while maintaining an integration with MongoDB's Cloud Ops Manager, providing the best of both PCF and MongoDB. Mallika Iyer is a Principal Software Engineer at Pivotal and spends a lot of time building Bosh-managed services that run on Pivotal Cloud Foundry. She is a cloud architect with an extensive background in NoSQL and large-scale search. Sam Weaver is the Product Manager for Developer Experience at MongoDB, based in New York.


Data Breach Preparation and Response: Breaches are Certain, Impact is Not

It is a good practice to map out what you believe to be the Breach Breakdown in some visual manner so that you can more clearly define your working hypothesis. You should also include a timeline of events that represents the chronological progression of the attack. This will be of particular interest to executives and general counsel as they prepare statements about what happened and when. In addition, you should maintain a companion list of the impacted systems represented in the diagram. This list should include additional system details such as IP address, hostname, OS, system function (i.e., web server, database, workstation), and method of compromise.
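The impacted-systems list and timeline described above map naturally onto a small structured record. A hypothetical sketch (all field, class and event names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class ImpactedSystem:
    ip: str
    hostname: str
    os: str
    function: str          # e.g. web server, database, workstation
    compromise_method: str

@dataclass
class BreachBreakdown:
    systems: list = field(default_factory=list)
    timeline: list = field(default_factory=list)  # (timestamp, event) pairs

    def add_event(self, when, what):
        self.timeline.append((when, what))
        self.timeline.sort()  # keep chronological order for exec reporting

case = BreachBreakdown()
case.systems.append(ImpactedSystem(
    "10.0.4.17", "db-prod-02", "Ubuntu 14.04", "database",
    "credential reuse from phished workstation"))
case.add_event("2016-11-02T09:14", "lateral movement to db-prod-02")
case.add_event("2016-11-01T22:40", "phishing e-mail opened")
assert case.timeline[0][1] == "phishing e-mail opened"
```

Keeping the record machine-readable from day one means the executive timeline and the technical system list stay consistent as the investigation evolves.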


The real effect Google's Pixel phone is having on Android

Features unique to the Pixel, such as the Google Assistant, the Pixel camera, and Daydream ... plus the smartphone's deeper app integration [and] increased prominence of Android Pay ... will ultimately lead to users spending more money on Android, according to the research note. Morgan Stanley's analysts also predict that these features could see the Pixel driving higher mobile search monetization for Google as advertisers will spend more to reach the consumers who spend the most on their mobiles. And there you have it. The Pixel is ultimately a vessel for Google to bring its own mobile vision directly to mainstream users. That benefits Google as a company, and it benefits us as consumers who carry Android phones.


Disaster recovery testing: A vital part of the DR plan

The cost of implementing disaster recovery is directly affected by the level of recovery required, so, to contain costs, applications have to be prioritised against a set of metrics that determine recovery requirements. Recovery time objective (RTO) describes the amount of time a business application can tolerate being unavailable, usually measured in hours, minutes or seconds. We can imagine that applications delivering core banking for financial organisations have an RTO of zero, whereas some back-end reporting functions may have an RTO of up to four hours. Recovery point objective (RPO) describes the previous point in time from which an application should be recovered. To use our banking example again, an RPO of zero will be expected for most applications – we don’t want to accept any lost transactions.
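Both metrics can be checked mechanically against a proposed DR design. A small illustrative calculation, assuming periodic backups (so worst-case data loss equals the backup interval); the function names and figures are invented:

```python
def max_data_loss_minutes(backup_interval_minutes):
    # Worst case: failure occurs just before the next backup completes.
    return backup_interval_minutes

def meets_objectives(rto_min, rpo_min, restore_time_min, backup_interval_min):
    """Does a DR design satisfy an application's recovery metrics?"""
    return (restore_time_min <= rto_min
            and max_data_loss_minutes(backup_interval_min) <= rpo_min)

# Back-end reporting: RTO 4 h, RPO 1 h, hourly backups, 90-minute restore.
assert meets_objectives(rto_min=240, rpo_min=60,
                        restore_time_min=90, backup_interval_min=60)

# Core banking: RTO 0, RPO 0 -- periodic backups can never satisfy this;
# only synchronous replication (effectively a zero interval) would pass.
assert not meets_objectives(rto_min=0, rpo_min=0,
                            restore_time_min=5, backup_interval_min=60)
```

Running every application through a check like this is one way to turn the prioritisation exercise into a concrete gap list.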


How is runtime as a service different from PaaS or IaaS?

RaaS differs from platform as a service (PaaS) in that many PaaS environments are long-running, although they automatically scale the application up or down as RaaS does. Additionally, a traditional PaaS deployment limits developers to a specific application framework. With many RaaS concepts, developers essentially deploy code in a container that starts on demand. The major thing to focus on when building an application using RaaS is minimal bootstrapping, so the runtime can start up, execute and close down quickly. Infrastructure as a service (IaaS) is a traditional cloud computing service where companies pay by the hour for compute environments, whether they're actively used or idle. While it's the least efficient form of cloud computing, IaaS is still the go-to for most companies, primarily because it's the most similar to traditional programming.


The Hardest Part About Microservices

The journey to microservices is just that: a journey. It will be different for each company. There are no hard and fast rules, only tradeoffs. Copying what works for one company just because it appears to work at this one instant is an attempt to skip the process and journey, and it will not work. And the point to make here is that your enterprise is not Netflix. In fact, I’d argue that for however complex the domain is at Netflix, it’s not as complicated as it is at your legacy enterprise. Searching for and showing movies, posting tweets, updating a LinkedIn profile, etc., are all a lot simpler than your insurance claims processing systems. These internet companies went to microservices because of speed to market, sheer volume, and scale.



Quote for the day:


"I think we ought to read only the kind of books that wound and stab us. If the book we are reading doesn't wake us up with a blow on the head, what are we reading it for?" -- Franz Kafka,


Daily Tech Digest - November 30, 2016

10 ways to gracefully kill an IT project

The factors behind terminating a project may vary: the complexity involved, limited staff resources, unrealistic project expectations, a naive and underdeveloped project plan, the loss of key stakeholders, higher priorities elsewhere, or some other element - but likely it will be a combination of some or many of these possibilities. ...  Halfway through implementation it becomes clear the backup process will consume more bandwidth than expected, will take an inordinate time to complete backing up servers with hefty storage levels, and will cost more in the long run than the existing tape solution, which can be refactored for cheaper, more efficient and less complicated administration. Clearly this is the proverbial "record coming off the needle" moment at which a harsh truth must be acknowledged.
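The bandwidth problem described here is a back-of-the-envelope calculation worth doing before a backup project starts, not halfway through. A hypothetical sketch (the figures and function name are invented):

```python
def backup_window_hours(total_tb, usable_mbps):
    """Rough feasibility check: how long does a full backup actually take?"""
    total_megabits = total_tb * 8 * 1_000_000   # 1 TB ~= 8,000,000 Mb
    return total_megabits / usable_mbps / 3600

# 40 TB of server storage over a 1 Gb/s link at ~60% effective utilisation:
hours = backup_window_hours(40, 1000 * 0.6)
assert hours > 24  # the nightly window is blown before the project ships
```

Five minutes of arithmetic like this can surface the "record coming off the needle" moment while the project is still cheap to cancel.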


Do You Really Have To Migrate To The Cloud?

Call it “digital Darwinism.” Cloud brings big benefits to application vendors. On-premise customers using previous versions require maintenance — and that’s a big drain on vendor resources. And they’re not as happy as new customers because they don’t have the latest innovations. Salespeople have to spend lots of time trying to help justify upgrades — time they could use to sell to new customers. Extra costs and slower innovation mean that over the long term, vendors offering cloud solutions will win. ... It’s time to remind her that her email — perhaps the most sensitive data you have in the company — already flies freely across public networks, protected only by security protocols. Remind her that money in your company’s bank accounts is really just data stored in ethereal databases around the globe.


4 things you should never say if you want to innovate successfully

For us, realistic innovation means starting with our customers, not our engineering department. Our product team creates bare-bones prototypes in low-tech tools like PowerPoint and shops them around. We learn what resonates with our target market before engaging expensive development resources, which ensures that engineers only work on projects with real legs. The products that pass the PowerPoint test do go to development, but customers remain involved. We ask them not only if the technology fills a need, but whether we are communicating the value proposition in a meaningful way. Successful innovation requires both: the right offering and the right pitch. When you’re lucky and smart enough to find both, pour fuel on the fire. The nature of that fuel will be different for different companies and products.


How the convergence of automotive and tech will create a new ecosystem

The operating models of the two sides differ dramatically. For example, automakers reengineer their core products approximately once every seven years, with noticeable updates every three years, but do not update existing products. Tech companies redo their core products about every two years, make noticeable updates every two months, and provide continual updates for existing products. The OEMs’ systematic “waterfall” approach to product development tends to slow down innovation; the average time to market is about five years. Most tech players depend on agile operating models that enable a time to market of roughly two years.


Branding: A Strategic Imperative For High-Tech Marketers

Unless a company has just hired a brand-new CMO who wants to leave a mark, goes through major changes such as a merger, or undergoes a departure from a product line, branding is usually a tool better left to consumer companies. In a world of big-data marketing and mandatory ROI, branding spending is difficult to justify, as its impact can appear intangible. It takes time for a new brand initiative to bring results, since the building of a brand does not occur overnight. The question is: why would you engage in a branding initiative in an industry where investors and shareholders have little patience? Well, here are a few reasons why we should consider branding a key strategic imperative for enterprise software or cloud services, especially in a high-growth environment.


Technology Isn’t the Answer to Surviving Digital Disruption

The goal is not to become a tech company. The goal must be to embed technology so ubiquitously and so deeply within the culture and operating model of the company that it becomes transparent - allowing you to enhance the customer experience and deepen your relationships. The challenge is that most traditional organizations have a self-view that is essentially synonymous with the product they sell or the service they provide. The required shift is not to become a tech company, but rather to make the organization synonymous with the value they provide and the relationship they create. The organization can only re-envision its business model from that perspective. The real goal of digital transformation, therefore, is to leverage technology to reshape and enhance the value you deliver to your customers.


A new Digital CIO must emerge from digital economy disruption

Technology has certainly affected the way we do business and companies’ roles. The time is now to refocus and turn those initiatives into a full-fledged digital transformation strategy. That means CIOs must reinvent themselves in order to understand how the disruptions digital transformation will bring to businesses as we know them can create opportunities for growth, and to establish themselves as the strategic digital leaders companies expect and need them to be. Before we explore what’s coming with the new digital reality, it is important to look at the present and understand how companies and CIOs are dealing with their IT departments. Reinventing the IT function to support the digital transformation requires far-reaching changes, from talent to infrastructure, and takes multiple years to complete.


What is email security and how can SMEs get it right?

“Email was never intended to be used in the way it is now. It’s not really kitted out for all of the risks associated with the internet; it was designed for a more trusting environment,” he explains. And it’s a mistake to think that SMEs don’t present a worthwhile target. In fact, they present attractive opportunities. ... “What does worthwhile mean?” asks Mr Bauer. “It’s relative to the cost of putting on an attack, and to the downside of getting caught.” Both are low when it comes to an attack on an SME, which makes them more appealing than larger corporations. Each time an attempt to hack your company is made via email, one of two aims is at play: to steal money, or to gain information.
Small businesses should bear those purposes in mind, because they can be key to spotting – and stopping – hacks.


Why Now Is the Ideal Time for the CIO to Work with Graphs

There’s clearly enormous market growth taking place. Forrester Research estimates that one in four enterprises will be using such technology by 2017, while Gartner reports that 70% of leading companies will pilot a graph database project of some significance by 2018. Graph databases aren’t applicable or helpful for all problems; there are transactional and analytical processing needs for which relational technology will probably always be the correct option, and there are NoSQL database alternatives that handle other types of large datasets well. But graphs make sense for any organisation seeking to make the most of its connected data. That is why I would recommend that any CIO look to NoSQL, including graph databases, as a powerful new tool to supplement their RDBMS investment and deal with the growing data tsunami.
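The "connected data" advantage is easiest to see in a variable-depth traversal, the kind of query that is awkward as chained SQL joins but natural in a graph store. A minimal sketch using a plain adjacency map rather than any particular graph database (the data is invented):

```python
from collections import deque

# Connected data as an adjacency map: who follows whom.
follows = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": ["alice"],
}

def within_hops(graph, start, max_hops):
    """Everything reachable in <= max_hops via breadth-first traversal."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen - {start}

assert within_hops(follows, "alice", 1) == {"bob", "carol"}
assert within_hops(follows, "alice", 2) == {"bob", "carol", "dave", "erin"}
```

In a relational schema, each extra hop means another self-join; in a graph model, depth is just a parameter of the traversal.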


A new way to anonymize data might actually work

This is definitely a good thing, as EHR databases are now popular targets for cybercriminals due to the amount of data available in one location, as well as the fact that data—unlike financial information—cannot be changed or canceled. The paper's authors explain that the PEP framework consists of two components: polymorphic encryption and polymorphic pseudonymisation. The researchers begin with polymorphic encryption by explaining how it differs from more traditional encryption processes: "In traditional encryption, one encrypts for some chosen recipient who then holds the decryption key; whereas in polymorphic encryption one encrypts in a general manner and at a later time the encryption can be transcribed to multiple recipients with different keys."
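The actual PEP construction is built on re-randomisable (ElGamal-style) encryption and is far more involved than anything shown here. As a much simpler stand-in for the *idea* of per-recipient pseudonyms, keyed hashing already exhibits the key property: the same identity yields different, unlinkable identifiers at different parties (the keys and IDs below are invented).

```python
import hashlib
import hmac

def pseudonym(recipient_key: bytes, patient_id: str) -> str:
    """Derive a per-recipient pseudonym: the same patient gets a different,
    unlinkable identifier at each party holding a different key."""
    return hmac.new(recipient_key, patient_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

hospital_key = b"key-held-for-hospital"   # illustrative only
insurer_key = b"key-held-for-insurer"

p_hospital = pseudonym(hospital_key, "patient-12345")
p_insurer = pseudonym(insurer_key, "patient-12345")

# Stable within one party, different (and unlinkable) across parties:
assert p_hospital == pseudonym(hospital_key, "patient-12345")
assert p_hospital != p_insurer
```

Unlike this sketch, polymorphic pseudonymisation lets a trusted party transcribe one encrypted record to many recipients without ever decrypting it, which is what makes PEP suitable for EHR sharing.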



Quote for the day:


"You have all the reason in the world to achieve your grandest dreams. Imagination plus innovation equals realization." -- Denis Waitley


Daily Tech Digest - November 29, 2016

CEOs See More Value In Technology Than People

There is a clear trend among CEOs to magnify the relative importance of technology in the future of work, with 67 percent saying they believe that technology will create greater value in the future than human capital will. Another 63 percent of CEOs said they perceive that technology will become their firm’s greatest source of future competitive advantage. But the economic reality differs sharply, with human capital, not physical capital, creating the greatest value for organizations. CEOs’ distorted perceptions demonstrate the extent to which people are being painted out of the future of work ... A full 44 percent of leaders in large global businesses told Korn Ferry that they believe that the prevalence of robotics, automation, and artificial intelligence (AI) will make people “largely irrelevant” in the future of work.


Every company is a technology company, but most don’t behave like one

An interesting anecdote from The Lean Startup, one of the manifestos for startup founders, is that Intuit holds themselves accountable to being innovative and agile by using two key metrics: (1) the number of customers using products that didn’t exist three years ago and (2) the percentage of revenue coming from offerings that did not exist three years ago. Historically for Intuit, it took a new product an average of 5.5 years to reach $50 million in revenue; at the time the book was written, they had multiple products generating $50 million in revenue that were less than a year old. Particularly as the world moves towards cloud computing, continuous development and continuous updates are the name of the game.


5 Expensive Traps Of DIY Hadoop Big Data Environments

“Hadoop is known to be self-healing, so if a node goes down on a server, it’s not a problem,” Dijcks says. “But if you buy inexpensive servers, you’re more likely to have nodes down and spend more time fixing hardware. And when you have a chunk of nodes that aren’t working, you’ve lost that capacity.” ... IT departments figure, “‘We’ve invested a lot of time, we’ve worked on this very hard, and now we need to put it into production,’” Dijcks says. “You can learn on throwaway servers, because if [the environment] goes down, no worries—just restart it. But in production, the cluster needs to stay up through hardware failures, human interaction failures, and whatever can happen.”


A fast data architecture whizzes by traditional data management tools

"Knowledge is power, and knowledge of yesterday is not as valuable as knowledge about what's happening now in many -- but not all -- circumstances," said W. Roy Schulte, vice president and analyst at Gartner. Businesses want to analyze information in real time, an emerging term dubbed fast data. Traditionally, acting on large volumes of data instantly was viewed as impossible; the hardware needed to support such applications is expensive. ... The use of commodity servers and the rapidly decreasing cost of flash memory now make it possible for organizations to process large volumes of data without breaking the bank, giving rise to the fast data architecture. In addition, new data management techniques enable firms to analyze information instantly.


The Financial Impact of NOT having Data Governance

During a recent customer visit, we got to discussing the financial impact of Data Governance. To help explain this point, I thought I’d share some of the more common problems associated with NOT having data governance. By looking at it from this point of view we can get an idea of what the business is doing to overcome these issues, against which we can then associate some value. ... This isn’t meant to be an exhaustive paper on the subject, more a sharing of thoughts and ideas. I’d also add that the ideas presented in this blog aren’t suggesting these impacts will happen; rather, they share some common challenges we see in the world of Financial Services and a way to try to understand the potential financial impact they might cause. These challenges should be seen from the perspective of potentially being part of a broader Data Governance initiative.


Social Media Is Killing Discourse Because It’s Too Much Like TV

It makes us feel more than think, and it comforts more than challenges. The result is a deeply fragmented society, driven by emotions, and radicalized by lack of contact and challenge from outside. This is why Oxford Dictionaries designated “post-truth” as the word of 2016: an adjective "relating to circumstances in which objective facts are less influential in shaping public opinion than emotional appeals." ... Social media, in contrast, uses algorithms to encourage comfort and complaisance, since its entire business model is built upon maximizing the time users spend inside of it. Who would like to hang around in a place where everyone seems to be negative, mean, and disapproving? The outcome is a proliferation of emotions, a radicalization of those emotions, and a fragmented society.


Will 'Digital Fingerprint' Technology Prevent Data Thieves?

The virtual intelligent eye works by generating a digital "fingerprint," based on behavior for every single login by every single user in every single application and database across the organization. This information is a recording of the "who, what, when, where, why and how" data is being accessed within an organization. Once a baseline for behavior is established, the system can easily identify anomalies in user activity and send out the appropriate alerts immediately when there are deviations from normal behavior. The cost of this technology will be positively impacted by the continuing decline in the cost of storage and processing power -- from cloud computing giants like Amazon, Microsoft and Alphabet. The healthcare data security war can be won, but it will require action and commitment from the industry.
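The baseline-then-deviation approach described here is the classic anomaly-detection pattern. A toy illustration over login hours only, assuming a simple mean/standard-deviation fingerprint (a real system would profile many more dimensions: location, application, query volume, and so on):

```python
from statistics import mean, stdev

def build_baseline(login_hours):
    """Fingerprint a user's normal behaviour from historical login hours."""
    return mean(login_hours), stdev(login_hours)

def is_anomalous(hour, baseline, threshold=3.0):
    # Flag logins more than `threshold` standard deviations from the mean;
    # the floor on sigma avoids over-alerting on very regular users.
    mu, sigma = baseline
    return abs(hour - mu) > threshold * max(sigma, 0.5)

history = [9, 10, 9, 8, 10, 9, 11, 9, 10]   # a nine-to-five user
baseline = build_baseline(history)
assert not is_anomalous(10, baseline)       # normal morning login
assert is_anomalous(3, baseline)            # 3 a.m. deviates from the fingerprint
```

The hard part in practice is not the statistics but establishing a trustworthy baseline before an attacker's activity contaminates it.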


Data Socialization: How to Achieve “Data Utopia” and Expedite Outcomes

Data socialization is an evolution in data accessibility and self-service across individuals, teams and organizations that is reshaping the way organizations think about, and employees interact with, their business data. Data socialization involves a data management platform that unites self-service visual data preparation, data discovery and cataloging, automation and governance features with key attributes common to social media platforms, such as the ability to leverage user ratings, recommendations, discussions, comments and popularity to make better decisions about which data to use. It enables groups of data scientists, business analysts and even novice business users across a company to search for, share and reuse prepared, managed data to achieve true enterprise collaboration and agility.


What's to blame for every single data breach? People, not technology

“Every breach occurs because someone in that company did something they were not supposed to do or because someone in that company failed to do something they were supposed to do,” Abagnale said. “There is not a master hacker sitting in Russia who will get through the company. The hacker will say, ‘I am not getting into JP Morgan Chase because they spend a fortune every year on cybersecurity, but they employ 200,000 people worldwide, so all I am looking for is one of those people who failed to do something they were supposed to or did something they were not supposed to do.’” Abagnale said he will explain the weaknesses and soft spots in companies and instill in attendees that the most important job they have is to keep the information entrusted to them safe.


Reactor By Example

Reactor's two main types are the Flux<T> and Mono<T>. A Flux is the equivalent of an RxJava Observable, capable of emitting 0 or more items, and then optionally either completing or erroring. A Mono, on the other hand, can emit at most one item; it corresponds to both the Single and Maybe types on the RxJava side. Thus an asynchronous task that just wants to signal completion can use a Mono<Void>. This simple distinction between the two types makes things easy to grasp while providing meaningful semantics in a reactive API: by just looking at the returned reactive type, one can know if a method is more of a "fire-and-forget" or "request-response" (Mono) kind of thing, or is really dealing with multiple data items as a stream (Flux).
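Reactor is a Java library, but the Mono-versus-Flux distinction maps neatly onto the difference between a coroutine (one awaited result) and an async generator (a stream of results) in other languages. A rough Python analogy, not Reactor's API; the function names are invented:

```python
import asyncio

async def fetch_user(user_id):
    """'Mono'-like: at most one value -- a request-response shape."""
    await asyncio.sleep(0)              # stand-in for async I/O
    return {"id": user_id}

async def stream_events(n):
    """'Flux'-like: zero or more values, then completion -- a stream shape."""
    for i in range(n):
        await asyncio.sleep(0)
        yield {"seq": i}

async def main():
    user = await fetch_user(7)                     # single emission
    events = [e async for e in stream_events(3)]   # stream of emissions
    return user, events

user, events = asyncio.run(main())
assert user == {"id": 7}
assert [e["seq"] for e in events] == [0, 1, 2]
```

As in Reactor, the return type alone tells the caller whether to expect one value or a stream, which is exactly the semantic signal the excerpt describes.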



Quote for the day:


"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it." -- Edsger W. Dijkstra


Daily Tech Digest - November 28, 2016

Ultimate Kanban: Scaling Agile without Frameworks at Ultimate Software

Ultimate started experimenting with Agile principles (namely, Scrum) in 2005. This initial transition to Scrum provided Ultimate with better visibility into the progress of teams towards wider business goals. However, there were some common sources of interruption that Scrum did not handle very well. Regulatory changes that required immediate attention often forced teams to throw out plans for their sprints and start work on the new requirements. The ideal small Scrum team size (7-9 members) led to arbitrarily small teams with very high cross-team coordination costs. Most importantly, though, after trying our hand at Scrum for a while, we did not see any major improvement in productivity.


The next big job in tech: Robot programmer

If your business is interested in bringing robot programmers in, Mass said it's important to integrate them with other engineers. "Don't isolate them," he said. "From my experience, some problems in robotics can only be solved by a clever combination of software, electronics and mechanical design. Sometimes, changing the surface or angle around a sensor can make all the difference in making it work reliably. Make sure all of your engineers are working closely together and are talking to each other about their problems. Sometimes a solution can come from an unexpected direction." How do you go about training to be a robot programmer? There are many books that teach programming, and you can also get your hands on a robotics kit. Also, Mass said "you shouldn't be afraid of reading data sheets or using an oscilloscope."


Information Architecture: What Is It and Where Did it Come From?

In order to understand IA, we first need to know where it originated. The term first started appearing in the 1970s. In 1970, a group of people at the Xerox Palo Alto Research Center were responsible for developing technology that could support the ‘architecture of information’. They made many important contributions to what is today known as human-computer interaction: they introduced the first personal computer with a user-friendly interface, laser printing, and the first WYSIWYG text editor. Modern use of the term IA, strictly related to the design of information, was officially introduced in the mid-1970s at the American Institute of Architects conference, where a man named Richard Saul Wurman introduced an idea that he called ‘the architecture of information’.


Upcoming bank rules could serve as a model for money management firms

Mr. Jacco believes banking regulations on cybersecurity will eventually apply to money managers. “It will be harder for them,” he said. “Some of them don't have big external websites; maybe they just have trading sites. Now on top of that they need a risk management function.” The regulations also will create a compliance change and organizational shift at money managers, Mr. Jacco said. The federal regulations, once established, “could create a new market standard for cybersecurity in general. The market may force everyone — managers, regulators — into that direction. But this phenomenon could take a long time to play itself out,” said Morgan Lewis' Mr. Horn.


The Internet of Things is making hospitals more vulnerable to hackers

Unfortunately, IoT start-ups often consider security to be a low priority, or an expensive headache that can be dealt with later on. That's a problem when those systems can potentially make the difference between life and death. "When implementing IoT solutions the components are chosen for their low cost and specific capabilities; however, the capabilities are significantly below what might be justified when the assets protected are human life, and security costs may be a significant portion of the cost, or even greater than the cost of the components. Prevalent vulnerabilities, however, do not only facilitate malicious actions, they may also increase the likelihood and impact of human errors and system failures," the report warns.


Six key principles for efficient cyber investigations

Even the largest companies appear to be ill-equipped to deal with more sophisticated cyberattacks, like the latest IoT-based Mirai DDoS attack or the attacks detected months or years after the initial breach, such as the Yahoo and Dropbox incidents. Inundated by alerts, analysts lack the automated and intelligence-driven processes to home in on attacks across the kill chain, and breaches persist far too long. To address this fundamental mismatch, organizations need a new perspective on the way they detect and respond to attacks. Like police investigations in the real world, every cyber investigation starts with a lead upon which a hypothesis is built. As more evidence is gathered in the field, the case continues to build until investigators can confirm or refute the direction of the investigation.


Q&A on the ​Practice of System and Network Administration

The key is to get information as early as possible. Discovering a problem on launch day is the worst. A simple technique is to have a beta launch to find problems early. Everyone knows that, but people don’t think to do it for internal systems or system administration tools. We take this even further. Can you launch a single feature to validate assumptions months ahead of the real launch? I like to launch a service with no features, just a welcome page, months ahead of the actual system launch. This gives us time to practice software upgrades, develop the backup procedures, document and test our runbook, and so on. Meanwhile the developers flesh out the system by adding features. When the system is ready for real users, there are very few surprises because the system has been running for months. Best of all, users get access to new features faster.
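The "welcome page only" launch described above can be surprisingly small. Here is a minimal sketch using Python's standard-library HTTP server; the handler class, `run` function, port, and page text are illustrative, not from the interview:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class WelcomeHandler(BaseHTTPRequestHandler):
    """Serves a single placeholder page so operational work can start early."""
    def do_GET(self):
        body = b"Welcome! Features arrive later; the service is live now.\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port=8080):
    # Upgrades, backups, and runbook drills can now be rehearsed against
    # this stub exactly as they will be against the finished system.
    HTTPServer(("", port), WelcomeHandler).serve_forever()
```

The stub is deliberately feature-free: what it buys you is months of realistic practice with deployment, monitoring, and recovery before any real user depends on it.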


Whatever you're doing in Linux, Windows 10 will soon do it too

"Whatever it is that you normally do on Linux to build an application: whether it's in Go, in Erlang, in C, whatever you use, please give it a try on Bash WSL and, importantly, file bugs on us. It really makes our life a lot easier and helps us build a product that we can all use and be far more productive with." The pledge to improve Windows' support for Linux tools reflects a recent change in Microsoft's rhetoric towards open-source software. While Microsoft's then CEO Steve Ballmer described open-source software as a cancer in 2001, in 2014 Microsoft CEO Satya Nadella proclaimed that "Microsoft loves Linux". Nadella's declaration may be simplistic, and ignore Microsoft's desire to stop organizations from switching from Microsoft to open-source desktop software, as seen in Munich, but the tech giant has changed its hardline approach, even if only for pragmatic reasons.


Fault injection destined to be a must-have technique for software pros

Purposefully creating situations that can cause services and software to crash or malfunction is called fault injection. This is a QA paradigm that two software engineers from Microsoft believe can mitigate the risks associated with modern software deployment and management, especially in relation to applications and services in the cloud, by helping engineers observe and find fixes for these failures in a controlled manner rather than dealing with them for the first time at an unexpected moment. ... Fault injection could be compared to the testing method known as "stress testing," Zervos added -- creating more traffic or putting more stress on a service externally. But even this type of test will not provide the kind of information or insight fault injection can provide, including a look at how dependencies will behave in a given situation.
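The idea of purposefully triggering failures can be sketched in a few lines. The decorator below, a minimal illustration whose names (`inject_faults`, `InjectedFault`, `fetch_with_retry`) are invented for this example rather than taken from any Microsoft tooling, makes a call fail at a configurable rate so the caller's error-handling path can be exercised deliberately instead of waiting for a real outage:

```python
import random

class InjectedFault(Exception):
    """Raised in place of a real failure during a fault-injection run."""

def inject_faults(rate, rng=random.random):
    """Decorator: make the wrapped call fail with probability `rate`.

    `rng` is injectable so tests can force failures deterministically.
    """
    def decorate(fn):
        def wrapper(*args, **kwargs):
            if rng() < rate:
                raise InjectedFault(f"injected failure in {fn.__name__}")
            return fn(*args, **kwargs)
        return wrapper
    return decorate

@inject_faults(rate=1.0)  # always fail: exercises the caller's error path
def fetch_quote():
    return "service response"

def fetch_with_retry(fn, attempts=3):
    """Caller-side resilience logic that the injected faults exercise."""
    for _ in range(attempts):
        try:
            return fn()
        except InjectedFault:
            continue
    return "fallback response"

print(fetch_with_retry(fetch_quote))  # injection always fires -> "fallback response"
```

Setting `rate=1.0` forces the failure on every call, which is how you verify the retry-and-fallback path actually works; dialing the rate down approximates the intermittent dependency failures the article describes, in a controlled rather than unexpected setting.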


2017 Predictions: Mobile Is The Face Of Digital

There is no question that mobile moments are the battleground to win, serve and retain your customers. What a mobile moment is and where it surfaces, however, will become amorphous as it extends beyond smartphones to platforms and connected devices and then eventually lives in a consumer’s personal ecosystem. App usage as we know it has likely peaked. In 2017, platforms will expand in importance as consumers continue to consolidate their time into fewer places on the smartphone. Already, they spend 84% of their time in just five apps. These experiences that we still loosely refer to as mobile (but not for much longer) will live as fragments on third-party platforms.



Quote for the day:


"Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won't come in." -- Alan Alda