Daily Tech Digest - April 06, 2018

If you develop software with Microsoft, you now own the rights

There has been confusion over who owns newly created intellectual property and concern that without an approach that ensures customers own key patents to their solutions, technology companies like Microsoft will enter those customers’ markets and compete against them with the very technology they co-developed. Microsoft’s initiative puts the company ahead of the curve on this issue, said Patrick Moorhead, president of the analyst firm Moor Insights & Strategy. “The reality is, most major companies will become [intellectual property] creators in the future, but they don’t know it yet,” said Moorhead. “What Microsoft announced helps those companies protect their [intellectual property] and Microsoft’s in a very open and consistent way. This will likely reduce buyer’s remorse and lawsuits.” Analyst Stephen O’Grady of RedMonk concurred. “As more enterprises have begun to embrace software as a core to their business rather than simply a cost of doing business, the likelihood that they create potentially valuable [intellectual property] as part of their efforts increases.”



Mirai Variant Botnet Takes Aim at Financials

According to the researchers, the botnet involved in the first company attack was 80% compromised MikroTik routers and 20% various IoT devices. Those devices range from Apache and IIS web servers to webcams, DVRs, TVs, and routers. Manufacturers of the recruited devices include companies from the very small up to Cisco and Linksys. Irfan Saif is cyber risk services principal for Deloitte Risk and Financial Advisory. In an interview with Dark Reading he points out that the IoT devices brought into the botnets have processing, communication, and networking capabilities, so it's not surprising that they're being recruited for nefarious purposes. "It will be a continuing problem and the intricacies and complexities will continue to evolve," he says. "There's an ever-increasing set [of IoT applications] in industries and for facilities management that will broaden the set of devices that can be taken," Saif says, adding, "The complexity of devices that can be taken will continue to increase."


Open Source Isn't The Community You Think It Is

The interesting thing is just how strongly the central “rules” of open source engagement have persisted, even as open source has become standard operating procedure for a huge swath of software development, whether done by vendors or enterprises building software to suit their internal needs. While it may seem that such an open source contribution model that depends on just a few core contributors for so much of the code wouldn’t be sustainable, the opposite is true. Each vendor can take particular interest in just a few projects, committing code to those, while “free riding” on other projects for which it derives less strategic value. In this way, open source persists, even if it’s not nearly as “open” as proponents sometimes suggest. Is open source then any different from a proprietary product? After all, both can be characterized by contributions from very few, or even just one, vendor. Yes, open source is different. Indeed, the difference is profound. In a proprietary product, all the engagement is dictated by one vendor.


Google employees demand end to company's AI work with Defense Department

Both Google and the Pentagon have stressed that the technology is not ready to be used in combat situations, with Marine Corps Col. Drew Cukor telling the audience at the 2017 Defense One Tech Summit that "AI will not be selecting a target [in combat] ... any time soon. What AI will do is [complement] the human operator." But Col. Cukor also said that he believes the Defense Department is "in an AI arms race," and acknowledged that "the big five Internet companies are pursuing this heavily." Cukor later added: "Key elements have to be put together...and the only way to do that is with commercial partners alongside us." According to the Wall Street Journal, the Defense Department spent $7.4 billion on technology involving AI last year, and Google, Microsoft, and Amazon are openly battling for a variety of defense contracts involving cloud computing and other software. But the employee letter argues that Google is damaging its brand by working on Project Maven and contributing to "growing fears of biased and weaponized AI."


GDPR will give Dutch privacy watchdog its teeth


Recent research showed that many small companies in the Netherlands are not ready for the GDPR. Another important link between the privacy watchdog and the business world is data protection officers (DPOs), who must be appointed by government institutions and companies working with “special personal data”, such as people’s social security numbers or medical data. “We rely heavily on DPOs to update us on how companies handle data protection,” says Wolfsen. The presence of a DPO in organisations is one of the first things the AP will check when the GDPR comes into effect, he says. “From day one, it’s going to be simple – we will check whether companies have a DPO if they are required to. If they don’t, we’re going to take action.” Wolfsen declines to say what kind of action that might be. Fines are a possibility, but the AP is known to show leniency in such matters, warning a company rather than fining it immediately. This has led to some criticism from both opponents and privacy groups.


MPLS explained

ATM and frame relay are distant memories, but MPLS lives on in carrier backbones and in enterprise networks. The most common use cases are branch offices, campus networks, metro Ethernet services and enterprises that need quality of service (QoS) for real-time applications. There’s been a lot of confusion about whether MPLS is a Layer 2 or Layer 3 service, but MPLS doesn’t fit neatly into the OSI seven-layer hierarchy. In fact, one of the key benefits of MPLS is that it separates forwarding mechanisms from the underlying data-link service. In other words, MPLS can be used to create forwarding tables for any underlying protocol. Specifically, MPLS routers establish a label-switched path (LSP) – a pre-determined path to route traffic in an MPLS network – based on the criteria in the forwarding equivalence class (FEC). It is only after an LSP has been established that MPLS forwarding can occur. LSPs are unidirectional, which means that return traffic is sent over a different LSP. When an end user sends traffic into the MPLS network, an MPLS label is added by an ingress MPLS router that sits on the network edge.
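The label-swapping mechanics can be sketched as a toy lookup. Router names and label values below are invented for illustration; real label-switching routers build these tables with signaling protocols such as LDP.

```python
# Toy illustration of MPLS label switching along a unidirectional LSP.
# Per-router label forwarding tables: {in_label: (next_hop, out_label)}
TABLES = {
    "ingress": {None: ("lsr1", 100)},   # classify into an FEC, push label 100
    "lsr1":    {100: ("lsr2", 200)},    # swap label 100 -> 200
    "lsr2":    {200: ("egress", 300)},  # swap label 200 -> 300
    "egress":  {300: (None, None)},     # pop the label, deliver via plain IP
}

def forward():
    """Trace one packet through the LSP; return (router, in, out) hops."""
    router, label, hops = "ingress", None, []
    while router is not None:
        next_hop, out_label = TABLES[router][label]
        hops.append((router, label, out_label))
        router, label = next_hop, out_label
    return hops

hops = forward()
for step in hops:
    print(step)
```

Note that forwarding consults only the label, never the IP header, which is why the mechanism works over any underlying data-link service.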


Microsoft’s AI lets bots predict pauses and interrupt conversations


The new way to talk debuts with Microsoft’s Xiaoice in China and Rinna in Japan. Xiaoice can chat through Xiaomi’s Yeelight, a smart speaker that looks identical to Amazon’s Echo Dot released two months ago. Microsoft plans to extend the conversational feature to additional devices within the next six months, Zo AI director Ying Wang told VentureBeat in an email. In the U.S., Microsoft’s Zo will receive the new feature for Skype soon, and it will also be expanded to Ruuh in India and the Rinna bot in Indonesia. No specific date or time period was provided for when the capabilities would be made available to additional bots. The more natural way of speaking is called “full duplex voice sense” by Microsoft and gives bots that communicate via voice the ability to carry on a continuous conversation with just a single use of a wake word like “Hey, Cortana.” This enables people to speak with machines in a way that feels more like a phone call or conversation.


Unpatched Vulnerabilities the Source of Most Data Breaches

Patching software security flaws by now should seem like a no-brainer for organizations, yet most organizations still struggle to keep up with and manage the process of applying software updates. "Detecting and prioritizing and getting vulnerabilities solved seems to be the most significant thing an organization can do [to prevent] getting breached," says Piero DePaoli, senior director of marketing at ServiceNow, of the report. "Once a vuln and patch are announced, the race is on," he says. "How fast can a hacker weaponize it and take advantage of it" before organizations can get their patches applied, he says. Most of the time, when a vuln gets disclosed, there's a patch for that. Some 86% of vuln reports came with patches last year, according to new data from Flexera, which also tallied a 14% increase in flaws compared with 2016. The dreaded zero-day flaw that gets exploited prior to an available patch remains less of an issue, according to Flexera. Only 14 of the nearly 20,000 known software flaws last year were zero-days, and that's a decrease of 40% from 2016.
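The Flexera figures above put the zero-day share into perspective; a quick check of the arithmetic:

```python
# Quick arithmetic on the Flexera figures quoted above.
total_flaws = 20000            # "nearly 20,000 known software flaws last year"
zero_days = 14                 # flaws exploited before a patch was available

zero_day_share = zero_days / total_flaws * 100
print(f"Zero-days were about {zero_day_share:.2f}% of disclosed flaws")
```

In other words, well under a tenth of a percent of disclosed flaws were zero-days; the overwhelming majority had a patch available at disclosure.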


NGINX Debuts App Server For Microservices

Nginx, makers of the popular Nginx open source web server, will begin shipping on April 12 a multilingual application server called Nginx Unit. It has also upgraded its Nginx Plus application server and announced a new control plane. Configured via a dynamic API, Nginx Unit 1.0 is an open source application server. Unlike the Nginx web server, which is designed for serving web pages and websites, Nginx Unit is a web server that can also run application code, such as the services found in a microservices environment. Application-level logic is supported. Supported languages in the initial release include Go, Perl, PHP, Python, and Ruby. Support for Java and JavaScript is due soon. Nginx Unit simplifies microservices because a single instance can simultaneously serve multiple application types, the company said. Nginx Unit also has networking capabilities such as reverse proxying.
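The "dynamic API" means Unit is configured by sending JSON to its control socket rather than by editing config files and reloading. A minimal sketch of what such a config might look like follows; the app name, port, and filesystem paths are hypothetical, so consult the Unit documentation for the full schema.

```python
import json

# Sketch of a minimal Nginx Unit configuration: one listener routed to a
# Python application. App name, port, and paths are invented for illustration.
config = {
    "listeners": {
        "*:8300": {"application": "hello"},
    },
    "applications": {
        "hello": {
            "type": "python",
            "path": "/srv/hello",   # directory containing the app
            "module": "wsgi",       # wsgi.py exposing a WSGI `application`
        },
    },
}

# In practice this JSON is PUT to Unit's control socket, e.g.:
#   curl -X PUT -d @config.json --unix-socket /var/run/control.unit.sock \
#        http://localhost/config
print(json.dumps(config, indent=2))
```

Because the same `applications` map can hold entries of different types, one Unit instance can serve, say, a PHP app and a Go service side by side.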


Patterns for Microservice Developer Workflows and Deployment


In the prototyping phase, there is a lot of emphasis on putting features in front of users quickly, and because there are no existing users, there is relatively little need for stability. In the production stage, you are generally trying to balance stability and velocity. You want to add enough features to grow your user base, but you also need things to be stable enough to keep your existing users happy. In the mission-critical phase, stability is your primary objective. If the people in your organization are divided along these lines (product, development, QA, and operations), it becomes very difficult to adjust how many resources you apply to each activity for a single feature. This can show up as new features moving really slowly because they follow the same process as mission-critical features, or it can show up as mission-critical features breaking too frequently in order to accommodate the faster release of new features. By organizing your people into independent feature teams, you can enable each team to find the ideal stability versus velocity tradeoff to achieve its objective, without forcing a single global tradeoff for your whole organization.



Quote for the day:


"A person must have the courage to act like everybody else, in order not to be like anybody." -- Jean-Paul Sartre


Daily Tech Digest - April 05, 2018

How to protect your PC from the Meltdown and Spectre CPU flaws

A pair of nasty CPU exploits have serious ramifications for home computer users. Meltdown and Spectre let attackers access protected information in your PC’s kernel memory, potentially revealing sensitive details like passwords, cryptographic keys, personal photos and email, or anything else you’ve used on your computer. These are serious flaws. Fortunately, CPU and operating system vendors pushed out patches fast, and you can protect your PC from Meltdown and Spectre to some degree. It’s not a quick one-and-done deal, though. They’re two very different CPU flaws that touch everything from your hardware to your software to the operating system itself. Check out PCWorld’s Meltdown and Spectre FAQ for everything you need to know about the vulnerabilities themselves. We’ve cut through the technical jargon to explain what you need to know in clear, easy-to-read language. We’ve also created an overview of how the Spectre CPU bug affects phones and tablets.




“While it is perhaps no surprise to learn that use of Facebook both inside and outside the workplace would be affected, this story certainly appears to have had a broader impact,” wrote Chris Ross, international senior vice-president at Barracuda, in a blog post. Almost two-thirds of respondents (62%) said that as a result of the revelations about Facebook, they had reviewed their corporate policy for allowing user access to non-business-related sites and apps, either in terms of providing new guidance or restricting access. Only 20% planned to maintain a policy to allow free access to non-business sites and apps. “While restricting access to non-business apps from the workplace can improve productivity, it may not impact Facebook’s ability to collect and share the personal information of users,” said Ross. “However, these privacy concerns have raised some strong feelings in the business community around Facebook’s viability as a business tool. Our recommendation is that organisations that continue to leverage Facebook as a business platform should review some basic controls.”



What should define an enterprise encryption strategy?

This year’s statistics are encouraging, but the report does show areas of challenge. Data discovery was rated the top data encryption planning/execution challenge by 67% of respondents, a figure 8% higher than in 2017. Respondents from the UK, Germany, the US and France have the most challenges, which likely points to activities associated with preparation and compliance of data privacy regulations such as GDPR, which comes into effect in May this year. When considering that the majority of organisations polled are using more than one public cloud provider, the report also raises questions about how organisations are enforcing consistent encryption and key management policies across multiple cloud vendors. Securing data in a multi-cloud environment can be especially problematic for organisations seeking compliance, particularly if they are attempting to instantiate a single organisational policy using different native tools from multiple cloud providers.


How Will Artificial Intelligence Define the Future of Network Security?

To spot a potential threat, a cybersecurity team must have a deep, nuanced understanding of its organization’s standard IT protocols, including the behavior of privileged users, accounts, and access points and the normal flow of authentication attempts. Simply put, a threat only appears as a threat if it deviates from standard practices. An AI cybersecurity platform could do a great deal to minimize the number of false positives and enable IT teams to focus their energies on combating real threats. When an AI algorithm is given access to an organization’s internal log and monitoring systems, it can evaluate the usage patterns of each individual employee, create a series of baseline activity profiles, and keep an eye on all network activity 24 hours a day. AI is tremendously useful as this type of catch-all mechanism, but it becomes truly invaluable once it starts to recognize threats in micro-deviations that are all but invisible to the human eye. As an AI tool is fed more and more data over time, it becomes capable of maintaining a constantly moving standard by which to judge potential threats.
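The "baseline activity profile" idea can be sketched very simply: learn a per-user mean and spread from historical counts (say, authentication attempts per hour), then flag observations that deviate strongly. The data and the 3-sigma threshold below are invented for illustration; production systems use far richer behavioral features.

```python
import statistics

def build_baseline(history):
    """Summarize a user's historical activity as (mean, standard deviation)."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean, stdev = baseline
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

history = [4, 5, 6, 5, 4, 6, 5, 5]      # typical logins per hour for one user
baseline = build_baseline(history)

print(is_anomalous(5, baseline))        # ordinary activity
print(is_anomalous(60, baseline))       # a burst worth investigating
```

The payoff described in the excerpt comes from maintaining one such baseline per user and per resource, and letting the model keep updating as new data arrives.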


Generation Z Is Already Bored by the Internet


To a parent or the casual observer, a phone-bored teen may appear engaged. After all, they’re on their phone, which many people consider an inherently engaging activity. In reality, they’re bored out of their mind. “I can be in my bed for hours on my phone, and that’s me being bored,” said Maxine Marcus, a 17-year-old and founder of The Ambassadors Company, a teen consulting business. “You think that we’re so entertained because we’re on our phones all the time, but just because we’re on it, doesn’t mean we’re engaged or excited. I get bored on my phone all the time.” “When you’re bored on your phone, you’re just sitting with your own thoughts. You’re on it, but it’s just an action so your brain still goes wherever it wants to go. You get bored and you start thinking and daydreaming,” she added. It’s important to note that the majority of time users spend on their phones, they spend engaged. Tech companies go to extraordinary lengths to keep users active and attentive.


How artificial intelligence and machine learning can revolutionize ecommerce

To start with, it is a high-technical-debt undertaking and requires high-quality data, timely decisioning at scale, and a hypothesis-based approach to marketing. Before any machine learning can be successfully applied, it needs data. And not just any type of data. Firstly, the data must be useful, in a way that reflects first-party behavior on digital channels, such as web pages and single-page applications. Secondly, the data must be processed quickly. Thanks to the latest cloud technology, data can be processed in an almost real-time fashion, with latency of a couple of minutes rather than hours. Finally, and by definition, personalization is personal, which means it’s unlikely that a specific set of machine learning approaches for one brand will translate to another. In order to achieve success with this, marketing teams need to employ a hypothesis-based approach to marketing, where they use the inferred signals from machine learning in conjunction with creative brand experiences.


Malicious IoT hackers have a new enemy

IoT security is about the farthest thing from a laughing matter in the world of technology today, threatening global trade, privacy and the basic infrastructure of modern society. So you could be forgiven for being taken aback that the newest defender of vulnerable systems against bad actors looks a little like Johnny 5 from the movie Short Circuit. Researchers at Georgia Tech’s School of Electrical and Computer Engineering rolled out the HoneyBot robot late last week. In essence, it’s a canary in the digital coal mine, offering an early warning that someone is trying to compromise an organization’s systems. HoneyBot is designed to look like a perfectly ordinary remote-controlled robot to anyone attempting to access it remotely, providing sensor data and movement information to that remote user. Where it differs, however, is if a user tries to get it to do something the owner doesn’t want it to do – HoneyBot can provide simulated responses to those commands without enacting them in the real world.


So What's Microsoft SQL Operations Studio?


From the information provided so far in a PASS keynote and accompanying blog post, it appears to be a blending of those two tools, with some container and DevOps functionality thrown in along with some Visual Studio Code goodness -- and apparently built on Electron. In the blog post, Microsoft SQL Operations Studio (MSOS from here on, for brevity) was described as a cross-platform, lightweight tool for modern database development and operations. Along with SQL Server, it will be used to work with other Microsoft data offerings like Azure SQL Database and Azure SQL Data Warehouse. Like VS Code and other editors, MSOS will provide easy access to code snippets (in the T-SQL language, in this case) and dashboards to monitor performance, whether on-premises or in the Azure cloud. As for more of that VS Code goodness, Microsoft said "You'll be able to leverage your favorite command line tools like Bash, PowerShell, sqlcmd, bcp and ssh in the Integrated Terminal window. Users can contribute directly to SQL Operations Studio via pull requests from the GitHub repository."


The dream job that's all the rage across America

There are several factors fueling the growth of these fully virtual companies, experts say. The most obvious is technology. Tools such as Slack, Zoom, Dropbox and Quip, a document-sharing and editing platform, make it easier than ever to communicate with far-flung employees and track their performance and workflow more accurately, said Trina Hoefling, author of Working Virtually: Transforming the Mobile Workplace. "Technology is the enabler," she added, "so people starting businesses are realizing that they can launch a company without a physical location quite easily." But perhaps the bigger driver in this new way of working is the demand from employees for a better quality of life. According to Gallup's "State of the American Workplace" survey, more than one-third of the respondents said they would change jobs in order to be able to work remotely some of the time. Younger employees — so-called millennials — especially start their careers fully expecting to find a position that offers more flexibility in how and where they work.


Google killing Chrome extensions for mining cryptocurrency

Up until now, Google has allowed Chrome extensions that mine cryptocurrency as long as the user is informed and the extension's only purpose is to mine cryptocurrency. That only accounts for 10% of Chrome extensions that mine cryptocurrency, however. The other 90% are doing it behind the scenes, not informing users, or both. Google said it has rejected many of the 90% of guilty extensions, and beginning now it is no longer accepting new Chrome extensions that mine cryptocurrency. The 10% of legitimate extensions aren't long for the world either: They'll be delisted from the Chrome web store starting in late June. In short, if mining cryptocurrency is any part of a Chrome extension you use or develop it's game over. Chrome extensions that use blockchain technology for other purposes, like cryptocurrency wallets, are unaffected. Just because an extension is removed from the official Chrome Web Store doesn't mean you can't install and use it—you just have to perform a few steps to load Chrome extensions manually.



Quote for the day:


"Always bear in mind that your own resolution to succeed is more important than any one thing." -- Abraham Lincoln


Daily Tech Digest - April 04, 2018

Most users of Office 365 or Salesforce or Slack, or any other SaaS app, engage via the software to get their work done. But they also generally have control – and a lot of control in some cases – over settings that would previously have been handled by an IT admin. They also might have influenced, or have even made, the decision to sign up for an app in the first place. In other words, in addition to using SaaS apps, end users have also assumed roles in assessing and administering them. This shows how the “democratization of IT” has not only put technology into more hands, but has also expanded responsibilities. In many ways, we all need to be virtual CIOs now, if not virtual CISOs. In the remainder of this article, I will raise some basic questions about the security of SaaS apps – in particular, authentication, encryption and administration. In a follow-up article, I will discuss the security profile of SaaS companies and more about their own infrastructure. The SaaS security topic closest to end users is passwords and authentication, but the challenges are numerous. Users not only continue to be careless; they have reason to be confused.


Stateful stream processing with Apache Flink

Virtually all business-relevant data is produced as a stream of events. Sensor measurements, website clicks, interactions with mobile applications, database modifications, application and machine logs, stock trades and financial transactions… all of these operations are characterized by continuously generated data. In fact, there are very few bounded data sets that are produced all at once instead of being recorded from a stream of data. If we look at how data is accessed and processed, we can identify two classes of problems: 1) use cases in which the data changes faster than the processing logic and 2) use cases in which the code or query changes faster than the data. While in the first scenario we are dealing with a stream processing problem, the latter case indicates a data exploration problem. Examples of data exploration use cases include offline data analysis, data mining, and data science tasks. The clear separation of data streaming and data exploration problems leads to the insight that the majority of all production applications address stream processing problems.
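The "stateful" part of stateful stream processing is the key: an operator keeps per-key state and updates it as each event arrives, rather than running a batch job over a bounded data set. Flink makes such operator state fault-tolerant and distributed; this toy sketch just uses a dict, with invented click events.

```python
# Conceptual sketch of stateful stream processing: a per-key running
# aggregate updated as each event arrives.

def running_totals(events):
    state = {}                        # key -> running sum (the operator state)
    for key, value in events:
        state[key] = state.get(key, 0) + value
        yield key, state[key]         # emit an updated result per event

clicks = [("page_a", 1), ("page_b", 1), ("page_a", 1)]
results = list(running_totals(clicks))
print(results)
```

Note that the generator never needs the whole input at once; the same logic keeps producing updated totals for as long as events keep arriving, which is the defining property of the stream processing problems described above.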


Now That You Are A Soldier In The Cyber War You Must Know Your Cognitive Biases

Even before the cyber war we were being overwhelmed with data. The average citizen is surrounded with information from TV, radio, entertainment, the Internet, social media, co-workers, neighbors, family, schools and the government, as well as old sources like books, magazines, newsletters and newspapers. This overwhelming deluge of information is a mix of reporting that includes both valid insights and specious reporting meant to appear valid. Yes, some of what we are being subjected to is total baloney! But now, on top of that, we are being actively attacked by both cyber attacks and deceptive propaganda from hostile foreign powers. So, like it or not, you are a soldier in this fight. In fact, you are the first line of defense. Your weapon in this war, your brain, is our greatest source of hope. Your brain is what can keep you from getting deceived and what can help you configure your systems at home and work to thwart cyber attacks. It is also the only real way to make sense of this world of information overload and hostile foreign power deception operations.


Thanks to Facebook, expect GDPR to spread beyond the EU


While Facebook is not technically required to comply with the GDPR for data on Americans residing in the US, several of its current policies fall significantly short of the upcoming GDPR protections. Consumers have the right to be forgotten. This means that if asked, companies must erase all personal data, in their own databases and in the databases of any third parties to which information has passed. Not only does Facebook acknowledge the difficulty of deleting data once it leaves their platform, but it also places the onus on consumers to ensure erasure. In the case of third-party apps accessed by a Facebook login (the cause of the current Facebook nightmare), consumers are given a user number and instructed to contact the app developer directly to delete any personal information collected from the Facebook platform. Under GDPR, Facebook would be directly responsible for the deletion of information from all databases, internal or external. The inability to accomplish this would be a clear violation.


The Cloud Is Rising To The Cybersecurity Challenge


Preventing malicious insiders and skilled attackers that manage to get in through the front door from walking back out the door with a company’s crown jewels has gained renewed emphasis, with Google’s DLP API removing many of the barriers to companies being able to implement enterprise-grade filtering, from OCR’ing of image content to contextual detection. One-click statistical outlier detection makes it easier for companies to identify inadvertent holes in their anonymization workflows. Third party partnerships offer countless additional services, while improved auditing allows total visibility into all access of a company’s data. Amazon and Microsoft have similarly invested heavily in helping their customers build security-conscious applications and infrastructures that are designed for today’s world, rather than the quaint naïve blind trust of yesteryear’s web. Moreover, the major cloud vendors’ global footprints mean companies can mitigate their physical risk as well by distributing their applications geographically, allowing for seamless continuity of operations even in the face of natural or human disasters.


From pranks to nuclear sabotage, this is the history of malware

Malware creation then went through one of its periodic developmental droughts. But that all changed in 1982, when Elk Cloner made its appearance, and a new wave of viruses began to rise. “With the invention of the PC, people started writing boot sector viruses that were spread on floppies,” Zone Alarm’s Skyler King told Digital Trends. “People who were pirating games or sharing them on floppies [were being infected].” Elk Cloner was the first to use that attack vector, though it was completely benign, and not thought to have spread far. Its mantle was picked up four years later by the Brain virus. That piece of software was technically an anti-piracy measure created by two Pakistani brothers, though it had the effect of making some infected disks unusable due to timeout errors. “Those were kind of the first viruses as we would consider them,” King said. “And they were propagating so that if you put in a floppy, they could copy to it, and spread that way.” The change in attack vector was noteworthy, because targeting a system from a different angle would become the hallmark of new malware in the years that followed.


Top 6 Features in Windows Server 2019

With the release of Windows Server 2019, Microsoft rolls up three years of updates for its HCI platform. That’s because the gradual upgrade schedule Microsoft now uses includes what it calls Semi-Annual Channel releases – incremental upgrades as they become available. Then every couple of years it creates a major release called the Long-Term Servicing Channel (LTSC) version that includes the upgrades from the preceding Semi-Annual Channel releases. The LTSC Windows Server 2019 is due out this fall, and is now available to members of Microsoft’s Insider program. While the fundamental components of HCI (compute, storage and networking) have been improved with the Semi-Annual Channel releases, for organizations building datacenters and high-scale software-defined platforms, Windows Server 2019 is a significant release for the software-defined datacenter. With the latest release, HCI is provided on top of a set of components that are bundled in with the server license. This means a backbone of servers running Hyper-V to enable dynamic increase or decrease of capacity for workloads without downtime.


3 Steps To Beef Up Your SD WAN Security


Software-defined wide-area networks (SD-WANs) are becoming widespread, and for good reason. SD-WAN products are cheaper than standard network equipment, as are the operational costs associated with adding new sites to the network. In addition, the benefits of intelligently managed traffic also increase both business operational efficiency and user experience. However, as onsite IT infrastructure becomes a thing of the past, business owners and CTOs still need to stay on top of their game when it comes to security issues. Although SD-WANs use 256-bit encryption as standard (i.e. protecting data with a key that would be too long for hackers to crack, even with the most powerful computer), they are not immune to being breached by sophisticated cyberattacks. If you haven’t already, you should speak to your SD-WAN provider to find out what specific security is in place on your network. Keep in mind, different vendors will provide slightly different security technologies.
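A back-of-envelope calculation shows why a 256-bit key resists brute force. The guess rate below is an assumption for illustration, not a fact about any particular attacker:

```python
# Even at an (assumed) rate of one trillion key guesses per second,
# searching half of a 256-bit keyspace takes an astronomically long time.
keyspace = 2 ** 256
guesses_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365

years = (keyspace / 2) / guesses_per_second / seconds_per_year
print(f"~{years:.1e} years on average to brute-force")
```

That is why, as the paragraph notes, realistic breaches come from implementation flaws, misconfiguration, or stolen credentials rather than from guessing the key.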


How to detect and prevent crypto mining malware

Enterprises are very much on the lookout for any signs of critical data being stolen or encrypted in a ransomware attack. Cryptojacking is stealthier, and it can be hard for companies to detect. The damage it causes is real but isn't always obvious. The damage can have an immediate financial impact if the crypto mining software infects cloud infrastructure or drives up the electric bill. It can also hurt productivity and performance by slowing down machines. "With CPUs that are not specifically made for crypto mining, it could be detrimental to your hardware," says Carles Lopez-Penalver, intelligence analyst at Flashpoint. "They can burn out or run more slowly." Cryptojacking is in the early stages, he added. If a company spots one type of attack, there are four or five others that will get by. "If there's something that could potentially stop crypto miners, it would be something like a well-trained neural network," Lopez-Penalver says. That's just what some security vendors are doing — using machine learning and other artificial intelligence (AI) technologies to spot the behaviors that indicate crypto mining, even if that particular attack has never been seen before.
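One of the simpler cryptojacking tells is CPU utilization pinned high for a sustained stretch, which a naive monitor can check directly. Real detectors, including the machine-learning approaches mentioned above, use far richer features; the threshold, window, and sample data here are invented for illustration.

```python
def sustained_high_cpu(samples, threshold=90, min_run=5):
    """True if `min_run` consecutive samples exceed `threshold` percent."""
    run = 0
    for pct in samples:
        run = run + 1 if pct > threshold else 0
        if run >= min_run:
            return True
    return False

normal = [12, 35, 80, 22, 95, 30, 15, 40]   # bursty but ordinary load
mining = [96, 97, 99, 98, 97, 99, 98, 96]   # pegged near 100%

print(sustained_high_cpu(normal))   # False
print(sustained_high_cpu(mining))   # True
```

The obvious limitation, and the reason the article points toward trained models, is that miners can throttle themselves below any fixed threshold to stay hidden.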


Take Responsibility for Your Security

Though suppliers are building secure systems, that’s just one step along the way, Snitkin noted. “That’s where these small companies in particular are hurting,” he added. “There’s no way those small companies can get the expertise to maintain these things.” To be as secure as big companies, the small guys need to adopt a different strategy in which they rely more heavily on outside services, he argued. “Vulnerability could be completely outside the scope of what these companies are doing,” added Sami Nassar, vice president of cybersecurity at NXP Semiconductors. “Small companies don’t have a chance at all to get the internal competence to a level they need,” Bosch said, adding that the same is true to some extent for larger organizations. Part of the effort to improve security comes through collaboration among vendors, customers and others; it requires an ecosystem rather than a one-vendor solution, Nassar commented.



Quote for the day:


"Leadership is the art of giving people a platform for spreading ideas that work," -- Seth Godin


Daily Tech Digest - April 03, 2018

At Nvidia's GPU Technology Conference (GTC), Pure Storage announced a turnkey solution to simplify the deployment of AI infrastructure. The product, known as AIRI (AI Ready Infrastructure), is a validated, optimized solution that includes Pure Storage’s FlashBlade, 100 Gig-E switches from Arista, four DGX-1 servers from Nvidia, and all the software required to operationalize AI at scale. The product is supported by the Nvidia GPU Cloud deep learning stack and Pure Storage AIRI Scaling Toolkit, enabling data scientists to get to work in a few hours instead of months. The product is similar to other converged infrastructure products, such as Cisco’s FlexPod and Dell EMC's VxBlock, that offer a turnkey way for businesses to stand up a private cloud in under a day. I’ve talked to customers of both products, and they told me that the converged products take all the complexity out of the deployment so companies can start using the infrastructure immediately. Converged infrastructure was a huge leap forward for private clouds, and I expect it to have a similar impact on AI.


How To Disaster Proof Your Business IT

Ransomware has become a major risk in the past few years. It scrambles the data on computers and backups and then demands a payment for the code to restore the files. Making the payment doesn’t always get the mangled files back. Most small businesses have no disaster recovery plan. Nationwide Insurance reports that only 18 percent of companies in the United States with fewer than 50 employees have such a plan. The report estimates that 25 percent of businesses don’t reopen after a major disaster. Small businesses are at greater risk, since the loss of one server won’t wipe out a huge enterprise but could ruin a small shop. Having a plan significantly improves the chances of survival. Insurance can cover the replacement cost of lost physical assets, but money won’t get back lost data. Replacing a computer requires setting it up with the software and data from the old machine, and this is very hard if you’re caught unprepared. The good news is that there are cost-effective ways for small businesses to protect their data from catastrophic loss and get running again quickly.


Smart cities need the cloud—and vice versa

The real advantage of using mostly public clouds to create and run smart cities is not the capabilities of the various clouds to host basic compute and storage in support of city automation. It’s the ability to reuse common smart city services across cities, services that will be sold and managed by the public cloud providers. The fundamental larger role of public cloud providers is to create sets of cloud services that will deliver best practices via services to all cities that want to become smart cities. The public cloud providers will essentially be the vehicle for sharing this technology. And they need cities to help define those services. The larger piece of the puzzle is cost reduction. There is no real reason to become a smart city unless it’s going to reduce city operations costs, as well as deliver citizen services better than before. In other words, it’s not enough to become “smart”; you need to spend tax dollars in more effective ways. Some cities are now successfully evolving into smart cities, paving the way for other city governments to follow.


Is Security Accelerating Your Business?

Small, departmental SaaS footprints expanded enterprise-wide and eventually evolved into cross-functional application and services platforms. To a certain degree, shortcomings and delays in application deployments also gave rise to "shadow IT" — a complete circumvention of the security process. This trend of embracing simplicity and speed has progressed into the consumerization of the infrastructure, as shown by the continued growth of infrastructure-as-a-service (IaaS) offerings such as Amazon Web Services and Microsoft Azure. With the increased adoption and simplicity of cloud deployment, the barrier to entry to spin up and deploy services has dropped dramatically, and this has increased the security gap in visibility and controls for these types of deployments. A recent survey from RightScale on cloud adoption revealed that less than half of application or business owners plan to delegate authority to central IT for the selection of public cloud services, which supports the notion that business leaders are opting for the easiest path forward when it comes to application deployment.


SD-WAN best path to virtualized networks of the future


SD-WAN, not SDN or NFV, is the virtualized network strategy with the most going for it: it has the most fundamental role in virtualization and the broadest base of interested parties. SD-WAN could be the transformational strategy of the current age. Not only does it disconnect service from the details of infrastructure, SD-WAN opens the door for infrastructure to change by adopting other virtualization technologies, including SDN and NFV. Because SD-WAN runs above infrastructure, and because it can be deployed by enterprises, network operators and managed services providers, fear of change or high levels of legacy infrastructure cost can't stall it out. Ten years from now, most business services probably will be SD-WAN-based, and cloud service delivery will be dominated by SD-WAN, too. Over the next decade, SD-WAN will both pave the way to change and define how network changes are matched to consumer and business services.


Are legacy technologies a threat to EU’s telecom infrastructure?

Mobile networks worldwide still depend on SS7 and Diameter for controlling communications, as well as on sets of protocols that were designed decades ago without adequate regard for modern security implications. In this respect, the interconnected environment has become perilous. As today’s society becomes more and more digital, such vulnerabilities might inhibit the proper functioning of mobile networks, thereby impacting the operation of digital markets. A full range of new services is being developed on, or relies on, the primary infrastructure offered by electronic communication providers. “In this context, ENISA has developed a study, which has examined a critical area of electronic communications: the security of interconnections in electronic communications, also known as signalling security. An EU level assessment of the current situation has been developed, so that we better understand the threat level, measures in place and possible next steps to be taken,” said Udo Helmbrecht, ENISA’s Executive Director.


Microsoft Announces 'Windows ML' Platform for AI-Assisted Windows 10 Apps


Microsoft today said Windows 10 developers will soon be able to more easily infuse artificial intelligence (AI) functionality into their apps with the help of a new machine learning platform coming in the next update of the OS. The announcement was made at today's Windows Developers Day event. Further information was provided in a post titled "AI Platform for Windows Developers," in which the company said the AI expertise it developed in products like Cortana and Bing Search will be made available to Windows app coders. "With the next major update to Windows 10, we begin to deliver the advances that have been built into our apps and services as part of the Windows 10 platform," Microsoft said. "Every developer that builds apps on Windows 10 will be able to use AI to deliver more powerful and engaging experiences." Initially, the AI functionality will focus on machine learning, as different partners like Qualcomm Technologies and AMD will help out with the "Windows ML" platform.


Microsoft wants you on Edge, even if it has to trick you

Is Microsoft doing something to address one of the browser’s biggest problems, a paucity of extensions compared to competing browsers? Once again, the answer is no. As I write this, a total of 99 extensions are available for Edge, compared to many thousands for Chrome and Firefox. So what is Microsoft’s grand plan for convincing you and millions of others to switch from your current browser to Edge? It’s this: In the fall update to Windows 10, Windows’ built-in email app will open all links in Edge rather than the browser you’ve set as your default. So you may want to use Chrome, for example, but if you use Windows Mail to open a link, you’ll use Edge whether you like it or not. And clearly, given Edge’s dismal market share, you won’t like it. Microsoft is currently testing this feature in the latest version of the public preview of the Windows 10 update that will be released in the fall. And if you believe the company, Microsoft isn’t doing it because it wants you to switch to Edge.


How Integrative Thinking Promotes Innovative Problem-Solving

The most successful organizational leaders I’ve encountered, including the one in the case study, are relentless about seeking out the counter-intuitive approach to their business challenges. They resist the rush to choose between traditional tactics and instead strive to reframe difficult situations as, “What if?” type opportunities. In every circumstance where I’ve observed the application of integrative thinking, it didn’t happen without the individual silencing the reflexive, pattern-matching portion of the brain and creating opportunities for new ideas to flood the system. The GM in the case study indicated to me that she would never have conceived of the idea without breaking the habit of asking customers about their satisfaction with her unit’s products. Instead, she and her team members stepped back and just observed. What they saw convinced them the customers had much bigger fish to fry than worrying about her unit’s offerings. While her unit was focused on creating the next version, running the latest price promotion, or building a new, low-cost product, the customers were barely treading water trying to tie systems together and use the data to serve their customers.


Automation and gamification key to cyber security

By pairing human intelligence with automated tasks and putting human-machine teaming into practice, the report said, automated programs handle basic security protocols while practitioners have their time freed up to proactively address unknown threats. Most respondents (81%) believe their organisation’s cyber security would be stronger if it implemented greater automation; a quarter said that automation frees up time to focus on innovation and value-added work; and nearly a third (32%) of those not investing in automation say it is due to a lack of in-house skills. Gamification, the concept of applying elements of game-playing to non-game activities, is growing in importance as a tool to help drive a higher performing cyber security organisation, the survey found. Within organisations that hold gamification exercises, hackathons, capture-the-flag, red team-blue team or bug bounty programs are the most common, and almost all (96%) of those that use gamification in the workplace report seeing benefits.



Quote for the day:



"Data is the oil of the 21st century. But oil is just useless thick goop until you refine it into fuel." -- @ValaAfshar


Daily Tech Digest - April 02, 2018

Augmented reality, outer space and emerging technology

The publishing industry is in an interesting place. Monetization has been a challenge since the dawn of digital and Facebook’s recent algorithm change, prioritizing posts from your actual friends, demonstrates just how much publishers need to look beyond ad revenue. While many try to figure out how to create subscription models, USA TODAY is doing something anomalous for the industry: investing in emerging technology. Last week, USA TODAY debuted an augmented reality (AR) app called 321 Launch. ... “We didn’t do this just so we can say we’re doing AR,” says Ray Soto, the USA TODAY NETWORK’s Director of Emerging Technologies. “When you consider the future of what AR could be, it’s taking these building blocks and leveraging location-based data to really change storytelling.” 321 Launch is the result of a partnership between USA TODAY and the network’s Florida Today newspaper, which is based in Brevard County, home of Cape Canaveral. USA TODAY had been wanting to experiment with AR anyway and Florida Today reached out with an idea about collaborating around space rocket launches.



Main Street Cybersecurity: Can Email Be Safe?

Can email be safe? Yes, but it will require new thinking across the board. Processes and technologies will continue to improve their ability to identify malicious emails, but attackers will improve their delivery methods and capabilities as well. For Main Street, the most affected recipients of potentially malicious emails, better education on how to discern good from bad emails (on their own) may be their first and best defense. What follows is an initial checklist for people on Main Street who read email to keep in mind before opening that next unknown email attachment. Keeping all your software and operating systems up to date is one of the easiest things you can do to protect your computers. Almost all current operating systems can automatically download and apply patches on a regular basis; enable this capability. Malware delivered by email often attempts to exploit OS vulnerabilities, so if your system is up to date patch-wise, you are less exposed.


The Invisible Hand Of Financial Services


Like most industries, banking changed dramatically due to rapid globalization and digitalization. The scale of banks grew and matured into what we see now: the modern global financial-services industry—a dynamic, interwoven system of global capital movement and interdependent technologies. With increased regulatory pressure and capital requirements, the number of global financial institutions has fallen dramatically. For example, the number of banks in the United States went from 14,400 in first-quarter 1984 to 4,938 in third-quarter 2017. This increased industry consolidation coupled with enterprise expansion has resulted in more efficient institutions, but only a handful of banks dominate nearly every geographic market. This is not necessarily good for financial customers. Large publicly traded financial-services firms have suffered from short-term thinking and strategies driven by fiscal quarters and financial results. Collective industry decisions around banking solutions dominated purely by a profit motive are rarely in the public interest.


Top 10 Testing Frameworks and Libraries for Java Developers

Testing is one of the disciplines that separates professional developers from amateur ones. It's not about following TDD, BDD, or whatever testing methodology, but at the very minimum level, you must write code to test your code automatically. Many Java developers write unit tests and integration tests that automatically run during build time, mostly by using continuous integration tools like Jenkins or TeamCity. If some of you are wondering why a programmer should focus on automation testing, then let me tell you that the importance of automation testing is growing exponentially due to more awareness and the emergence of DevOps. Companies generally prefer programmers who are good at writing unit tests and show good knowledge of various unit testing frameworks, libraries, and tools, e.g. JUnit, Selenium, REST-Assured, Spock framework, etc. As Java developers, we work across very different areas, from writing core Java code to creating JSP pages, writing REST APIs, and sometimes even creating Groovy scripts for build automation.
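The core idea behind every framework the paragraph names is small: exercise one unit of code in isolation and assert on the result. A dependency-free sketch of that pattern (the discount method is hypothetical; with JUnit 5 each check would be an @Test method using Assertions.assertEquals instead of the hand-rolled throw):

```java
public class DiscountTest {
    // Hypothetical code under test
    static int applyDiscount(int priceCents, int percent) {
        return priceCents - (priceCents * percent) / 100;
    }

    // One check per behavior; a test framework replaces this boilerplate
    // with annotations, assertion helpers, and reporting.
    static void check(int expected, int actual) {
        if (expected != actual) throw new AssertionError(expected + " != " + actual);
    }

    public static void main(String[] args) {
        check(900, applyDiscount(1000, 10));  // 10% off 10.00 is 9.00
        check(1000, applyDiscount(1000, 0));  // 0% off changes nothing
        System.out.println("all tests passed");
    }
}
```

What JUnit and its peers add on top of this skeleton is exactly what makes tests CI-friendly: automatic discovery, isolation between tests, and machine-readable results for tools like Jenkins or TeamCity.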


SD-Branch market expected to reach $3 billion by 2022

SD-Branch solutions are just reaching the market during 2018 — so spending will remain small this year. Expect SD-Branch adoption to accelerate during 2019-2021 as many suppliers introduce new products and as distributed organizations achieve CAPEX and OPEX benefits. Doyle Research forecasts that worldwide expenditures on SD-Branch solutions will reach $3 billion by 2022. SD-Branch is defined as having SD-WAN, routing, network security, and LAN/Wi-Fi functions all in one platform with integrated, centralized management. Software-based networking technologies such as software-defined networking (SDN), software-defined WAN (SD-WAN), and network functions virtualization (NFV) have abstracted network intelligence from the integrated network appliance (aka black box). The concept of the SD-Branch is to leverage network virtualization to run several discrete functions on a single platform. Advances in silicon from Intel, ARM, and Broadcom enable the network horsepower to run routing, SD-WAN, network security, and Wi-Fi functionality on one hardware platform.


Samsung’s Big Dreams For India

It has been a hard fought battle to win this market. Perceptions, market wars, opportunities, competition — sometimes from their traditional foreign rivals, sometimes from local upstarts, and lately, the Chinese. But every single time Samsung has been able to stave off the threats and hold its ground. It has continued to be the market leader in the television segment for over 12 years and in the mobile business for six years, after it toppled Nokia in 2012. Since then, there has been no looking back. Or so it seemed till now.  Perhaps it is their Korean culture of doggedness to be No. 1, or their pursuit for product excellence as borne out in their bitter rivalry with the Japanese in each business — from electronics to automobiles, that keeps them going. Or is it something else? “There is more to our story and central to that is our consumer-centric approach to whatever we do especially in India. We call it ‘Make for India’,” says Asim Warsi, Senior Vice President, Samsung India, with the exuberance of today’s millennials whom he is out to impress every day to encourage buying Samsung smartphones, TVs and home appliances.


Neuroevolution Will Push AI Development to the Next Level

Any serious effort to develop “artificial general intelligence” must at some point recapitulate the evolutionary process within which neural networks took shape and became attuned to the world around them. Artificial intelligence researchers have been developing more sophisticated “neuroevolution” approaches for many years. Now it would seem that the time is right for these to enter the mainstream of commercialized AI in a big way. As AI becomes the driving force behind robotics, more developers are exploring alternative approaches for training robots to master the near-endless range of environmental tasks for which they’re being designed. There is fresh interest in approaches that can train robots to walk as well as humans, swim like dolphins, swing from trees like gibbons, and maneuver with the aerial agility of bats. As I noted here, the robotics revolution has spurred AI researchers to broaden the scope of intelligence to encompass any innate faculty that enables any entity to explore, exploit, adapt, and survive in some environment.


HPE Aruba launches AI solution for autonomous enterprise networking

According to the tech giant, this new set of partners will "create modern workplaces that uniquely pair end-user mobility, secure connectivity, and location with the sensory context of enterprise IoT," ranging from personalized workspaces to predictive maintenance and fully automated conference rooms. "Companies are reevaluating their real estate strategies to better align with and enable the future of work," said Francisco Acoba, Managing Director of Deloitte Consulting. "The smart digital workplace is now a major point of differentiation for organizations as employers look to attract, retain and grow today's talent." "The future of work will be defined by smart workplace experiences, and corporate real estate leaders along with their IT counterparts should consider embracing how mobility and enterprise IoT will transform the physical office," the executive added. According to research agency Gartner, almost half of CIOs are planning to pilot AI projects in the near future, while four percent are already exploring the potential value of AI-based solutions.


Facial Recognition Tech Moves From Smartphones To Boardroom

"There's a whole lot of data carried in your face: your age, your gender, even your emotional state at the time. And, those are things that could be useful outside of simply authentication," Aley said. Ever AI's software was developed using a massive store of video and images contained in a consumer photo and video cloud storage service, EverAlbum (now simply called Ever). The app, available in the Apple App and Android Play stores, lets users organize their image and video albums from multiple services. Ever AI collected a data set of 13 billion images that were tagged by users – similar to how photos are tagged in Facebook – with the names of those in the photos, Aley said. From that point, on, all future photos of those people are tagged automatically (with the user's permission) so users can find those photos easily. "This creates an 'identity' for those people," Aley said. The data store was used to train facial recognition algorithms, which resulted in a more accurate consumer application; Aley claims the company's enterprise facial recognition software is 99.8% accurate.


The ROI of Being Data-Driven

Every company tries to provide value to its customers. The more value it provides, the more the value of the company goes up (at least in theory). But one may ask: how do you measure the value of a business? The financial industry has come up with various ways. Two of the most common measures are EV (enterprise value) and EBITDA (earnings before interest, taxes, depreciation, and amortization). The EV/EBITDA ratio is commonly used when comparing a firm's fiscal performance. The lower the ratio, the more attractive the company is to private and individual investors; put another way, the best way to lower the ratio is to have a high EBITDA. For all non-financial purposes, calculating EBITDA is the same as EBIT, which is revenues minus expenses. Therefore, the higher the revenues and the lower the expenses, the higher the value of the business, which completely makes sense!
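The ratio described above is a one-line calculation. A sketch using the article's own simplification (EBITDA approximated as revenues minus expenses; the figures are hypothetical):

```java
public class ValuationSketch {
    // EV/EBITDA, with the article's simplification: EBITDA ~= revenues - expenses.
    static double evToEbitda(double enterpriseValue, double revenues, double expenses) {
        return enterpriseValue / (revenues - expenses);
    }

    public static void main(String[] args) {
        // Hypothetical firm, figures in $M: EV 500, revenues 200, expenses 150.
        // A lower ratio means a more attractively priced company.
        System.out.println(evToEbitda(500, 200, 150)); // 500 / 50 = 10.0
    }
}
```

Holding EV fixed, anything that raises revenues or cuts expenses raises EBITDA and pushes the ratio down, which is the mechanism behind the article's "higher revenues, lower expenses" conclusion.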



Quote for the day:


"Be The Kind Of Leader You Would Want To Follow." -- Gordon TredGold


Daily Tech Digest - April 01, 2018

Roubini States That Blockchain Is Pure Hype

To begin with, blockchain technology is less efficient than existing databases. When someone says their project runs on a blockchain, what is usually meant is a program replicated across many other devices. In that case, the storage space and processing power requirements are much higher, and the transaction speed much lower, than with a centralized program. Blockchains using proof-of-stake or zero-knowledge technologies require all transactions to be verified cryptographically, and this slows them down. Blockchains using proof-of-work, as many popular cryptocurrencies do, raise another problem: they require huge amounts of energy to operate. That is why Bitcoin mining operations in Iceland may this year begin to consume more electricity than all Icelandic households put together. Blockchain makes sense only in cases where trading speed for quality of verification is really needed, but the technology is rarely promoted in this capacity.
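The energy argument follows directly from how proof-of-work is defined: miners repeat hashing until they find a value meeting a difficulty target, and the expected number of attempts grows exponentially with that target. A minimal sketch of the search (a toy; real Bitcoin mining uses double SHA-256 over a block header and a far higher difficulty):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class PowSketch {
    // SHA-256 of a string as lowercase hex.
    static String hashHex(String s) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest(s.getBytes(StandardCharsets.UTF_8)))
                sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // Find a nonce whose hash of (data + nonce) starts with `difficulty` zero
    // hex digits. Expected work grows as 16^difficulty: this brute search,
    // repeated across a whole network, is where proof-of-work's energy bill
    // comes from.
    static long mine(String data, int difficulty) {
        String target = "0".repeat(difficulty);
        for (long nonce = 0; ; nonce++) {
            if (hashHex(data + nonce).startsWith(target)) return nonce;
        }
    }

    public static void main(String[] args) {
        long nonce = mine("block-data", 3); // roughly 4,096 hashes on average
        System.out.println("nonce " + nonce + " -> " + hashHex("block-data" + nonce));
    }
}
```

Bumping the difficulty by one hex digit multiplies the expected work by 16, which is why verification stays cheap (one hash) while production stays expensive, exactly the speed-for-verification trade the paragraph describes.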



Blockchain will make AI smarter by feeding it better data


The current challenge to smaller businesses isn’t the cost of AI systems — they’re increasingly more affordable and accessible. The barrier is gaining access to enough high quality data about customers to adequately power those systems. Few retailers can recognize their customers across multiple channels and devices, and they often rely on third-party, behavioral data that doesn’t give them a complete understanding of what products customers want to buy. Moreover, most retailers handle only a small slice of each of their customer’s purchases, hardly enough to make these AI systems work well. That’s why blockchain technology is so transformative. Its key innovation is to create a database that is open and decentralized, yet with strict controls over privacy. Shoppers could authorize all the stores they patronize to contribute data about their purchases to a blockchain ledger that protects the privacy of both consumers and retailers.
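The "open and decentralized, yet tamper-evident" property comes from each entry committing to the hash of the one before it. A toy sketch of that chaining (real ledgers add consensus, digital signatures, and the privacy controls the article mentions; the purchase strings are hypothetical):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

public class PurchaseLedger {
    record Entry(String data, String prevHash, String hash) {}

    private final List<Entry> entries = new ArrayList<>();

    // Each entry commits to the previous entry's hash, so rewriting any past
    // record invalidates every later link in the chain.
    void append(String data) {
        String prev = entries.isEmpty() ? "genesis" : entries.get(entries.size() - 1).hash();
        entries.add(new Entry(data, prev, sha256(prev + data)));
    }

    // Walk the chain and recompute every hash; false means tampering.
    boolean verify() {
        String prev = "genesis";
        for (Entry e : entries) {
            if (!e.prevHash().equals(prev) || !e.hash().equals(sha256(prev + e.data()))) return false;
            prev = e.hash();
        }
        return true;
    }

    static String sha256(String s) {
        try {
            StringBuilder sb = new StringBuilder();
            for (byte b : MessageDigest.getInstance("SHA-256").digest(s.getBytes(StandardCharsets.UTF_8)))
                sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        PurchaseLedger ledger = new PurchaseLedger();
        ledger.append("alice bought sneakers, store A");
        ledger.append("alice bought a jacket, store B");
        System.out.println(ledger.verify()); // true
    }
}
```

In the retail scenario above, each participating store would append entries the shopper authorizes, and any party can run the verification walk without trusting the others.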



Dividing frontend from backend is an antipattern

Contemporary frontend work has evolved in complexity to the extent that we should no longer separate frontend from backend roles. Frontend engineers now solve the same kinds of problems as their backend counterparts, using the same kinds of solutions, and it is harmful to continue arbitrarily dividing us. To be clear, I’m not saying that we all need to be experts in everything. That would be impossible. Today’s technology stack goes down a long way, so being a genuinely balanced full-stack dev is probably not the most realistic of goals — but staying open-minded is. While it is perfectly valid to dislike a particular technology, such as CSS, the industry's culture of contempt for frontend work feeds into the outdated divide between frontend and backend, and detracts from building fast-moving, competitive teams and companies. Think of yourself as a developer first. Investigate frontend technologies, pair with UI specialists, evangelize your colleagues.


Man vs machine: How each responds to deception defenses

This analysis shows that if you use the same password for multiple systems, you should abandon the practice: migrate to unique, long passphrases with less rotation, and always consider multi-factor authentication when available. In general, human attackers are attracted to files that may contain configuration instructions for an application with a username and password for a specific individual or a shared account. Another popular file example is technical documents such as those providing information on how to use a corporate VPN service. Personal files with confidential information, IT/corporate files, logs, databases, and recent-files lists for Windows or Office are popular with human attackers and make good breadcrumbs and traps. Poisoned data within files, including fake, planted credentials, provides a valuable lure to detect attackers as they reuse them. On the other hand, malware, due to its machine automation, prefers structured data found in applications. Examples include session apps, web browsers, and uninstall information for applications.
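The planted-credential lure works because the decoy is never used legitimately: any authentication attempt with it is, by construction, an attacker replaying harvested data. A minimal sketch of the alerting side (usernames and the set itself are hypothetical):

```java
import java.util.Set;

public class HoneytokenMonitor {
    // Hypothetical planted credentials that no legitimate user ever holds;
    // they exist only inside the decoy files described in the article.
    private static final Set<String> DECOY_USERS = Set.of("svc_backup_2016", "vpn_admin_shared");

    // Any login attempt with a decoy name is a high-confidence intrusion signal.
    static boolean isDecoyLogin(String username) {
        return DECOY_USERS.contains(username);
    }

    public static void main(String[] args) {
        System.out.println(isDecoyLogin("alice"));            // false: normal user
        System.out.println(isDecoyLogin("svc_backup_2016"));  // true: raise an alert
    }
}
```

The false-positive rate of such a check is essentially zero, which is what makes poisoned data so much cheaper to operate than behavioral detection.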


Big Blue Is Finally Getting Serious About Cryptocurrency

Lund breaks down the demand IBM is seeing into three main kinds of tokens: securities tokens that give owners a stake in the issuing company, utility tokens that give users access to a service such as phone minutes and commodities tokens that represent precious metals and other physical assets. "We're actually seeing a move toward the issuance of tokens that have a higher velocity that represent, for example, a claim on a portion of gold bullion sitting in a vault somewhere," he said. Beyond the obvious potential interest in this work from commodities exchanges, Lund said IBM is being approached by retail companies, beverage providers and energy companies looking to tokenize various aspects of their business offerings. A fourth category of companies Lund said is approaching IBM are startups looking to raise capital, though he admits these opportunities have proved less enticing.


Artificial Intelligence: Optimising Your Recruitment & Avoiding Bias

The machine learning technology at the heart of Textio means that the more it is used, the better its analysis becomes. Conclusions drawn from previous job adverts enable the algorithms to give posts a score based on the tone and gendered nature of language used. Textio can predict if a post is likely to attract female or male respondents, and even how long it will take for the position to be filled. It can offer guidance on how to improve writing, enabling businesses to attract better qualified and more diverse talent in less time. It seems to be a win-win for HR. Critics of this kind of system may point out the potentially homogenising effect it could have upon the written word. Will technology like Textio make all job adverts the same? Could it lead to a process of levelling down, where only the most vanilla of workers are able to gain employment? Hesitancy around this technology is understandable: writing is a fundamental part of human expression, and it’s not clear that we want machines to start meddling in it. In fact, job adverts provide a unique opportunity for businesses to convey their ethos to potential employees.
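At its simplest, scoring the gendered tone of an advert can be reduced to counting coded terms. A toy lexicon-based sketch (the word lists are hypothetical and tiny; Textio's actual models are learned from outcome data across millions of adverts, not hand-picked lists):

```java
import java.util.List;
import java.util.Locale;

public class AdvertScore {
    // Hypothetical coded-language lexicons for illustration only.
    static final List<String> MASCULINE_CODED = List.of("dominant", "ninja", "aggressive", "rockstar");
    static final List<String> FEMININE_CODED = List.of("collaborative", "supportive", "nurture", "together");

    // Score in [-1, 1]: negative leans masculine-coded, positive feminine-coded.
    static double genderTone(String advert) {
        String text = advert.toLowerCase(Locale.ROOT);
        long m = MASCULINE_CODED.stream().filter(text::contains).count();
        long f = FEMININE_CODED.stream().filter(text::contains).count();
        return (m + f == 0) ? 0.0 : (double) (f - m) / (m + f);
    }

    public static void main(String[] args) {
        System.out.println(genderTone("We need an aggressive rockstar ninja"));  // -1.0
        System.out.println(genderTone("Join a collaborative, supportive team")); // 1.0
    }
}
```

The gap between this sketch and the real product is the machine learning loop the paragraph describes: instead of fixed lists, the system keeps refitting its signals against who actually applied and how fast roles were filled.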


Microsoft inches closer to commercially-viable quantum computing

Microsoft is slowly making headway in the race toward commercially-viable quantum computing, tapping into the unique properties of a certain particle to address issues engineers at many tech companies have been struggling with for decades. Alphabet, IBM, and a number of smaller companies are all competing for "quantum supremacy," a disputed term referring to the point at which quantum computers will be able to handle calculations beyond the capacity of the world's best supercomputers. "[Quantum supremacy] is very catchy, but it's a bit confusing and oversells what quantum computers will be able to do," Simon Benjamin, a quantum expert at Oxford University, told MIT's Technology Review. He added that even as the abilities of quantum computers improve, classical computers will still be faster and cheaper for most tasks. "Using a quantum computer would be like chartering a jumbo jet to cross the road," Benjamin said.


How Nvidia is helping autonomous cars simulate their way to safety

To further improve simulations, Nvidia and some of its partners are using data from the sensors of autonomous vehicles to build higher-definition maps. When autonomous vehicles hit the road, these machines will not only rely on the data available through training, but will also contribute to data collection by sharing what they capture from their LIDAR, IR, radar, and camera arrays. When this newly captured data is combined through deep learning with existing low-quality data sets, it will make streets and roads look more photo-realistic. Cognata claims that its algorithms can process the data in a way that brings out details in shadows and highlights, much like an HDR photo from your smartphone’s camera, to create a high-quality scene. While simulation is an excellent tool, Atsmon noted it has its own flaws. It’s too simple, and for autonomous driving to be realistic, it must learn from edge cases. Cognata claims that it only takes a few clicks to program in an edge case to validate autonomous vehicles for more unusual driving scenarios.


Disruption vs. Innovation: Defining Success

Innovation, or rapid evolutionary innovation as I define it, is turning your dreams into reality, or manifesting what you envision. Disruptive companies are those whose innovations or innovative processes completely change the market they serve. They might use an innovation to accomplish their goals, but not all innovations are disruptive. In other words, not all innovations cause a business or market to rapidly evolve. I firmly believe that all businesses must evolve over time in order to stay competitive in the marketplace, and that has proven true when it comes to disruption. Companies need appreciable time for their services to evolve and react to the needs of the market. Disruption does not happen overnight; neither does success. Many so-called "overnight successes" were actually around for decades before finally reaching the tipping point and achieving mass appeal or nationwide, even worldwide, recognition.


5 Things You Should Know About the Cloud, But Were Afraid to Ask

Depending on whom you believe, cloud computing goes back as far as the early 1960s, with J.C.R. Licklider and the introduction of the Advanced Research Projects Agency Network (a.k.a. ARPANET), or only as far as 2006, when former Google CEO Eric Schmidt purportedly coined the term "cloud computing" at an industry conference. Whichever origin story you buy into, the cloud has clearly taken off, and with it, business, IT, and marketing leaders are clamoring to assess where things are now and where they may be headed. What follows are five fundamental observations about the cloud today (in no particular order). I hope these thoughts from the front line are useful and maybe even a bit of a provocative look at the cloud. ... The good news is that the agility that comes with being on the cloud pays off in the intermediate to long run. What's more, it's becoming clear from the urgent communication we see from organizations not yet on the cloud that if you don't make the move, you'll fall behind, which creates its own kind of cost.



Quote for the day:


"Change is the end result of all true learning." -- Leo Buscaglia