Daily Tech Digest - July 31, 2017

Without enterprise architecture, the Internet of Things is just... things

The rise of IoT is shifting software and network requirements within organizations moving forward with IoT efforts, the GAO report also states. The report's authors predict more emphasis on "analysis programs that can condense large volumes of IoT data into actionable information," as well as "'smart' programs that can augment or replace a human operator. Aggregated data gathered from IoT devices can undergo sophisticated data analysis techniques, or analytics, to find patterns and to extract information and knowledge, enhancing decision-making." There will be changes in business structures as a result as well. "IoT software developments permitting automation may reduce the need for human operators in certain capacities," particularly software that "relies on augmented intelligence and behavior to substitute for human judgement and actions, respectively. ..."

DevOps Security & the Culture of 'Yes'

The new way brings the teams together, which makes problem resolution part of the process. Iterations are small, and we have tools to get real visibility into what's happening in production. If there is a problem, we can point to the data instead of each other. This focus on communication, collaboration, and the use of production data to drive decisions is the key to making security work in a DevOps world. The principal issue that security teams face when working with other organizations is how to effectively communicate risk, priority, and tradeoffs. Just because we think something is important doesn't mean our developers and ops guys do, too. Moreover, the old approach of enforcing process and exercising veto power over releases is no longer viable. With DevOps, one thing we can be certain of is that the release is going out the door. No more C-S-No. Today, we can't just say "no." We can't even say "no, but..." We need to find ways to say "Yes, here's how we can do it."

When blockchain meets big data, the payoff will be huge

You can expect to see decentralized solutions like Storj, Sia, MaidSafe, and FileCoin start to gain some initial traction in the enterprise storage space. One enterprise pilot-phase rollout indicates this decentralized approach could reduce the costs of storing data by 90 percent compared to AWS. As for blockchain-driven AI, you can expect to see a three-phase rollout. First, within the existing enterprise. Then, within the ecosystem. Finally, totally open systems. The entire industry might be termed blockchains for big data. Longer term, we will see an expansion of the concept of big data, as we move from proprietary data silos to blockchain-enabled shared data layers. In the first epoch of big data, power resided with those who owned the data. In the blockchain epoch of big data, power will reside with those who can access the most data and who can gain the most insights most rapidly.

Security Investments Consume SWIFT's Profits

Banks that collectively own SWIFT saw their profits vanish last year as the organization increased its investments in information security, even as the interbank messaging service handled record volumes of money-moving messages. The investment followed the $81 million heist from the central bank of Bangladesh in February last year, accomplished by attackers who issued fraudulent SWIFT money-moving messages from a compromised Bangladesh Bank system. News of the attack sparked a public relations disaster for the Brussels-based cooperative, formally known as the Society for Worldwide Interbank Financial Telecommunication, calling into question the integrity of its messaging service and whether the organization was doing enough to police members' information security practices.

For CISOs Blockchain Will Improve Internet of Things (IoT) Security

Within the realm of IoT, blockchain has huge potential with home automation systems, connected thermostats, autonomous vehicles, etc. Blockchain helps reduce security threats at the edge. The long-term value is with interactive appliances such as refrigerators or washing machines that can intuitively restock, order, pay for and have items shipped without user interaction. Industries like aviation, financial services, healthcare, mining, public sector and supply chain/logistics companies have all begun transforming to support blockchain. For example, aviation and manufacturing are using blockchain to track, move and locate replacement parts across multiple companies and suppliers. The financial services industry is investigating blockchain to ensure transactional integrity, faster clearing and settlement with lowered costs, especially at scale.

Banking Must Move From Mobile-First to AI-First

The number one trend around artificial intelligence is underscored by the fact that bankers believe artificial intelligence (AI) will ‘revolutionize the way banks gather information and interact with customers.’ This is in line with the findings from other industries, where the application of big data and machine learning is expected to provide a better understanding of customer beliefs and intentions, enabling enhanced customer experiences and better competitive positioning. ... “With advances in artificial intelligence, the Internet of Things and big data analytics, humans can now design technology that’s capable of learning to think more like people and to constantly align to and help advance their wants and needs. This human-centered technology approach pays off for businesses, as leading companies will transform relationships from provider to partner — simultaneously transforming internally.”

The Role of Machine Learning Technology in the Hotel Sector

Machine learning uses algorithms that iteratively learn from data, allowing technology to glean more actionable insights from the available data. Examples of machine learning outside of the hotel sector include credit scoring and the targeting of marketing advertisements. In hotel revenue technology, machine learning is often used in conjunction with statistical methods to produce cutting-edge forecasting and decision optimisation. High-performance technology can use machine-learning processes to better understand the relationship between price and demand, and generate room rates that adapt to and anticipate market fluctuations. In the age of big data, machine learning systems are critical. Any revenue manager working without the support of an analytical revenue management solution will find themselves overwhelmed by the sheer volume and complexity of data.
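One common way to model that price-demand relationship is a constant-elasticity curve, demand = a * price^b, fitted by ordinary least squares in log-log space. The sketch below is a minimal, self-contained illustration of that idea; the room rates and booking counts are entirely hypothetical, and a real revenue system would use far richer features and models.

```python
import math

def fit_price_demand(prices, demands):
    """Fit a constant-elasticity demand curve demand = a * price**b
    by ordinary least squares in log-log space."""
    xs = [math.log(p) for p in prices]
    ys = [math.log(d) for d in demands]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the log-log regression is the price elasticity b
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical historical room rates and nightly bookings
prices = [100, 120, 140, 160, 180]
demands = [80, 64, 53, 45, 39]
a, b = fit_price_demand(prices, demands)
print(round(b, 2))  # elasticity estimate, roughly -1.2
```

A negative elasticity near -1.2 says demand drops a little faster than price rises, which is the kind of signal a rate optimiser would feed into its pricing decisions.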

How smart cities like New York City will drive enterprise change

Smart cities are built on citywide fiber networks, which can eventually (as with the case of ZenFi's network) connect 5G wireless antennas on every street corner and every floor of every office building back to the core network. This densification of the wireless networks is the true hero of the smart cities revolution, enabling not only smart-city kiosks, but capacity for high-speed wireless applications on smartphones and tablets, widespread IoT deployments, mobile augmented reality applications, self-driving cars and more. It's also amazing that New York is leading the smart city charge. Because if the concept can make it there, it can make it anywhere. Dark-fiber deployments in New York typically cost far more than in just about any other city because of heavy unionization and the scale of any disruption when streets have to be closed for fiber installation.

Everything You Need To Know About Wireless Mesh Networks

Mesh networks are resilient, self-configuring, and efficient. You don’t need to mess with them after the often minimal work required to set them up, and they provide arguably the highest throughput you can achieve in your home. These advantages have led to several startups and existing companies introducing mesh systems contending for the home and small business Wi-Fi networking dollar. Mesh networks solve a particular problem: covering a relatively large area, more than about 1,000 square feet on a single floor, or a multi-floor dwelling or office, especially where there’s no ethernet already present to allow easier wired connections of non-mesh Wi-Fi routers and wireless access points. All the current mesh ecosystems also offer simplicity. You might pull out great tufts of hair working with the web-based administration control panels on even the most popular conventional Wi-Fi routers.

How Rust can replace C, with Python's help

Proponents of Rust, the language engineered by Mozilla to give developers both speed and memory safety, are stumping for the language as a long-term replacement for C and C++. But replacing software written in these languages can be a difficult, long-term project. One place where Rust could supplant C in the short term is in the traditionally C libraries used in other languages. Much of the Python ecosystem for statistics and machine learning is written in C, via modules that could be replaced or rewritten incrementally. It isn’t difficult to expose Rust code to Python. A Rust library can expose a C ABI (application binary interface) to Python without too much work. Some Rust crates (as Rust packages are called) already expose Python bindings to make them useful in Python. But there is always opportunity for closer integrations between both languages.
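The Python side of that C-ABI handoff can be seen with the standard library's ctypes module. The sketch below loads a shared library through its C ABI and declares a function signature; the system math library stands in for a Rust cdylib here, since a Rust crate built with `crate-type = ["cdylib"]` and `#[no_mangle] pub extern "C"` functions is loaded in exactly the same way. This assumes a Unix-like system where the math library can be located.

```python
from ctypes import CDLL, c_double
from ctypes.util import find_library

# Load a shared library through its C ABI. A Rust cdylib exposing
# extern "C" functions would be loaded the same way, by path or name.
libm = CDLL(find_library("m"))

# Declare the C signature: double cos(double)
libm.cos.restype = c_double
libm.cos.argtypes = [c_double]

print(libm.cos(0.0))  # 1.0
```

Crates such as PyO3 wrap this plumbing in higher-level bindings, but the underlying contract is the same C ABI shown here.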

Quote for the day:

"Beware that the detours in your life don't turn into destinations." -- Tim Fargo

Daily Tech Digest - July 30, 2017

How CIOs can use machine learning to transform customer service

Machine learning means your company’s programs can make use of this data without being explicitly programmed to do so, as programs collect, learn, and adapt by utilizing ever-greater sources of data. Your human employees who do relatively simple and often mundane tasks, such as answering phones, will soon be replaced by much more efficient machines. Workers who remain employed will find themselves working alongside greater numbers of machines, too. Employees will find it much easier to comb through databases of information, retrieve specific solutions to customer issues, and detect what kind of customer they’re on the line with by utilizing advanced software. Machine learning also enhances your company’s knowledge-retention capacity in the long run.

The New Enterprise Business Integration Approach In Banking

Some ‘trendsetter’ banks, after having established maturity in mobile banking, have now begun to focus on an omni-channel strategy. Omni (Latin for “universal”) channel is a digital transformation paradigm that provides a common customer experience across the multiple channels through which a customer interacts with the bank. The omni-channel implementation path often starts with the convergence of customer-facing capabilities in online, tablet, mobile and new disruptive channels such as wearables, while also developing a transition roadmap to further integrate capabilities such as alerts, notifications and a 360° customer view into other self-service channels such as IVR (Interactive Voice Response), kiosks and ATMs, as well as assisted channels.

Preparing For Disruption: Fintech And The Fortune 500

Startups usually concentrate on one area of the financial business and do it well, whereas most Fortune 500 financial companies have diverse lines of business. As a result, large financial companies are fending off assaults on their bottom line from multiple fronts. For instance, PayPal, Stripe and Square are homing in on payment processing on one side, while robo-advisors like Betterment and Wealthsimple are looking to take over a chunk of the wealth management sector. And it doesn't stop there. The latest fintech players have ventured into the lending industry where companies like LendingClub and SoFi attack the consumer lending market, and Kabbage and OnDeck Capital look to become leaders in small business lending – a traditionally under-served market due to the high cost of processing loans.

The battle between banks and disruptors is just beginning

There are companies that do similar things in lending, savings, investments and other specific areas of financial services based upon internet technologies. These companies have names like Zopa, SmartyPig, Nutmeg and eToro, and have fun branding and cool offices. They are very different from banks. They all share many of the same attributes, in terms of being young, aspirational, visionary and capable. This is why, collectively, they have seen investments from venture capital and other funds averaging $25 billion for the last four years, according to figures published by KPMG. However, there is a possible impasse here. The most successful fintech firms are not replacing banks, or at least not yet. They are serving markets that were underserved. But none of them have replaced a bank.

SmartTechnologies to SmartLiving

Closed-ecosystem IoT refers to a fully integrated system of several network types, including machine-to-machine (M2M), machine-to-human (M2H) and machine-to-data-system (M2D), connected through an application gateway. Additionally, these networks provide pre-connected and situational relationships dependent upon tasks, locations and users. An example would be a home ecosystem, as the home is the most likely place to attract investment at this point in human society. All possible actions and interactions within a home, including disallow rules and policies related to sensors and personal ecosystems, are defined and can be extended by users with the correct rights. Every sensor, device, task, activity and item can be included in the ecosystem, alongside personal ecosystems and an adoption/attribution ecosystem.

How to stop stakeholders from sabotaging projects

"It is always best to have the stakeholders be included in critical meetings. Here they can voice their concerns early and request needed information. This also allows everyone to agree on a timeline for when critical decisions need to be made," said Lane. To address any form of stakeholder sabotage, Brzychcy recommended that leaders "maintain a holistic approach to projects and carefully game out several courses of action for any undertaking." ... Brzychcy also said that project managers need to be aware of what they don't know and spend the time filling in any knowledge gaps. Continual communication is an easy way to keep stakeholders emotionally invested and interested, said Nolan. There may be times when, despite all efforts, a stakeholder remains uninterested in the project. "It may become necessary to ask for a new lead or point of contact on the stakeholder's team; this can re-energize the project and keep things moving forward," he said.

AML - A Paradigm Shift That Accelerates Data Scientist Productivity

There is a growing community around creating tools that automate the tasks outlined above, as well as other tasks that are part of the machine learning workflow. The paradigm that encapsulates this idea is often referred to as automated machine learning, which I will abbreviate as “AML” for the rest of this post. There is no universally agreed upon scope of AML; however, the folks who routinely organize the AML workshop at the annual ICML conference define a reasonable scope on their website, which includes automating all of the repetitive tasks defined above. The scope of AML is ambitious; however, is it really effective? The answer is that it depends on how you use it. Our view is that it is difficult to perform a wholesale replacement of a data scientist with an AML framework, because most machine learning problems require domain knowledge and human judgement to set up correctly.
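As a toy illustration of what "automating the repetitive tasks" means in practice, the sketch below hand-rolls the smallest possible AML loop: a grid search over the hyperparameter k of a 1-D nearest-neighbour classifier, scored on a held-out validation split. Everything here (the synthetic dataset, the grid, the model) is invented for illustration; real AML frameworks search over preprocessing steps, model families and hyperparameters simultaneously.

```python
import random

def knn_predict(train, k, x):
    """Classify x by majority vote among the k nearest training points (1-D)."""
    neighbours = sorted(train, key=lambda pt: abs(pt[0] - x))[:k]
    votes = sum(label for _, label in neighbours)
    return 1 if votes * 2 >= k else 0

def evaluate(train, valid, k):
    """Accuracy of a k-NN model on a held-out validation split."""
    correct = sum(1 for x, y in valid if knn_predict(train, k, x) == y)
    return correct / len(valid)

# Synthetic 1-D dataset: class 0 clusters near 0, class 1 near 10
random.seed(0)
data = [(random.gauss(0, 1), 0) for _ in range(30)] + \
       [(random.gauss(10, 1), 1) for _ in range(30)]
random.shuffle(data)
train, valid = data[:40], data[40:]

# The "automated" part: search the hyperparameter grid, keep the best model
best_k, best_acc = max(
    ((k, evaluate(train, valid, k)) for k in (1, 3, 5, 7, 9)),
    key=lambda pair: pair[1],
)
print(best_k, best_acc)  # the selected k and its validation accuracy
```

The loop itself is mechanical, which is exactly why it can be automated; choosing the dataset, features and evaluation criterion is where the domain knowledge and human judgement come in.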

Artificial Intelligence Can Be a Catalyst Across Most Cycles of the IoT

Overall statistics aside, individual enterprises have their own stories about data growth, and how they intend to handle it. Surely, this is what cloud computing in all its forms is all about. But simply processing it, transmitting it, and storing it is not enough. Data is not simply water or electricity. It’s a core asset of any company. The third wave of artificial intelligence (AI) is upon us. A survey by Pega investigated what organizations think about AI. The capabilities that seem attractive include speech recognition, replication of human interaction, problem solving, etc. The capabilities organizations said they are actively developing within AI were game playing, running surveillance, automating manufacturing, etc. What’s the urgency? We’ve been hearing about the IoT for several years now, with focuses on making sense (if possible) of the protocols involved, the security of data, and how to handle it in its many varieties, velocities, and volumes.

On business-architecture and enterprise-architecture

The key problem here is that what most people call ‘enterprise-architecture’ is actually a contraction of an older and more accurate term: ‘enterprise-wide IT-architecture’ [EWITA]. Which no doubt seems fair enough at first – after all, ‘enterprise-architecture’ is a useful shorthand for EWITA. The catch is that that contraction becomes dangerously misleading when we move beyond an IT-only domain, and outward towards the enterprise itself. ... The point here, that way too many people still miss, is that we cannot run a BDAT-stack backwards: it is always base-relative. The mistake that is made time and again by users of TOGAF et al. is that they assume we can start anywhere in the stack – but if we do that from anywhere other than the base, the result gives us a scope that can be dangerously incomplete.

INDEA: Catalysing One Nation One Govt In India with EA

The vision of IndEA is “to establish best-in-class architectural governance, processes and practices with optimal utilisation of ICT infrastructure and applications to offer ONE Government experience to the citizens and businesses through cashless, paperless and faceless services enabled by Boundaryless Information Flow™.” IndEA comprises eight distinct yet inter-related reference models, each covering a unique and critical architecture view or perspective ... Integration of government business processes and services across the breadth of the enterprise is needed to deliver services conveniently to citizens on a sustainable basis. Data interchange in an e-Government setup is a primary need.

Quote for the day:

"It takes courage to believe that the best is yet to come." -- Robin Roberts

Daily Tech Digest - July 29, 2017

How Machine Learning Helps With Fraud Detection

Machine learning works on the basis of large, historical datasets that have been created using a collection of data across many clients and industries. Even companies that only process a relatively small number of transactions are able to take full advantage of the data sets for their vertical, allowing them to get accurate decisions on each transaction. This aggregation of data provides a highly accurate set of training data, and the access to this information allows businesses to choose the right model to optimize the levels of recall and precision that they provide: out of all the transactions the model predicts to be fraudulent, what proportion actually are (precision), and out of all the genuinely fraudulent transactions, what proportion does the model catch (recall)? Once the accuracy of the models is deemed acceptable it is time to start predicting, but where do such predictions come from?
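Both metrics are straightforward to compute from a model's predictions. The sketch below uses made-up binary labels (1 = fraud) purely to make the definitions concrete.

```python
def precision_recall(y_true, y_pred):
    """Compute precision and recall for binary fraud labels (1 = fraud)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# 10 transactions: 4 actually fraudulent; the model flags 5, 3 correctly
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 1, 0, 0, 0, 0]
p, r = precision_recall(y_true, y_pred)
print(p, r)  # precision = 3/5 = 0.6, recall = 3/4 = 0.75
```

In fraud detection the trade-off between the two matters: high recall catches more fraud but, at low precision, floods analysts with false alarms.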

Can Cordova Fit Your Target Industry?

AppBrain statistics show that Cordova, with all its limitations, is persistently used in development of complex apps for quality-demanding industries. In other words, developers are simply making fish climb trees, as they use Cordova for projects it wasn’t meant for from the very start. While Cordova takes up only 1.40% of U.S. top-500 apps, its installs are even lower - a mere 0.49% - which suggests that the results of the developers’ attempts aren’t satisfying the users in the least. The low adoption figure also hints at the fact that most complex apps created with Cordova turn out to be unsuccessful and disappoint users, who quickly abandon them due to underperformance. Then, most likely, developers blame it all on the tool instead of accepting that they asked too much from it.

5 Steps to Becoming a Major League Digital Influencer

It doesn't matter how much content you create and share; if you don't have a tribe of your own, that content of yours is just never going to get seen. You can't rely on organic audience growth, either, not unless you're already famous. You have to build your own following. You have to find people interested in your area of expertise, in the niche in which you want to become an influencer, and connect with them. If you're building a Twitter following, follow people in your target audience so that if your profile and content resonate, they will follow you back. The same goes for Facebook, Instagram, and Linkedin. What's more, you have to do all this on a daily basis. You have to be relentless about building your following until you get to that point where organic growth kicks in; and even then, I would continue building it.

Digital Transformation Myths and How They Hold You Back

The term “digital transformation” has a nice ring to it, but few organizations understand the true meaning of the words. Most believe it simply refers to moving away from inefficient, outdated technologies. However, the textbook definition of digital transformation necessitates a fundamental shift or evolution of your business model, changing the very way in which your business is conducted. Few organizations complete full-fledged digital transformations. For instance, updating an outdated mainframe system to a modern, service-oriented architecture in the cloud is not a digital transformation. A brick-and-mortar bank shifting its focus to an electronic online presence with new products and services is. It’s important to acknowledge that while many organizations either believe that they need a digital transformation or are in the midst of one, few actually require a large business transformation.

How to Evaluate Business Intelligence Tools

When selecting business intelligence tools, a company can browse the CVs of its employees for experience in the field of BI; these employees can help evaluate BI products. It is easier to adopt software that even employees with basic skills can operate: less friction, and less money spent on training. ... Choosing software according to company requirements is the priority. Many BI providers offer different versions of their software packages, priced according to the features enabled within each package. Hence, it is wise to decide on the basis of the prime requirements for the software. Software with more advanced features is not necessarily more useful; you might end up with a higher capital cost to the company as well as increased maintenance costs due to additional hardware.

Recommendation System Algorithms

In the last 10 years, neural networks have made a huge leap in growth. Today they are applied in a wide range of applications and are gradually replacing traditional ML methods. I’d like to show you how the deep learning approach is used by YouTube. Undoubtedly, it’s a very challenging task to make recommendations for such a service because of the big scale, dynamic corpus, and a variety of unobservable external factors. According to the study “Deep Neural Networks for YouTube Recommendations”, the YouTube recommendation system algorithm consists of two neural networks: one for candidate generation and one for ranking. In case you don’t have enough time, I’ll leave a quick summary of this research here. Taking events from a user’s history as input, the candidate generation network significantly decreases the number of videos and selects a group of the most relevant ones from a large corpus.
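Once candidate generation boils down to scoring items against a user embedding, the retrieval step itself is simple to sketch. The embeddings and video ids below are invented for illustration; a production system learns them with a network and uses approximate nearest-neighbour search rather than a full sort over the corpus.

```python
def top_candidates(user_vec, video_vecs, n=3):
    """Score every video by dot product with the user embedding and
    return the n highest-scoring video ids (a toy candidate generator)."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    scored = sorted(video_vecs.items(),
                    key=lambda kv: dot(user_vec, kv[1]), reverse=True)
    return [vid for vid, _ in scored[:n]]

# Hypothetical learned embeddings (in practice produced by the network)
user = [0.9, 0.1, 0.0]
videos = {
    "cooking_101": [1.0, 0.0, 0.0],
    "cat_compilation": [0.0, 1.0, 0.0],
    "knife_skills": [0.8, 0.1, 0.1],
    "news_daily": [0.1, 0.2, 0.9],
}
print(top_candidates(user, videos, n=2))  # ['cooking_101', 'knife_skills']
```

The ranking network then re-scores only this small candidate set with richer features, which is what makes the two-stage design tractable at YouTube's scale.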

6 Predictions for the Convergence of IoT and Digital Marketing

Since IoT technology connects the internet with objects that are ubiquitous in our daily lives, marketers in almost every industry will be able to engage consumers throughout every phase of the customer journey. The term “Big Data” is an understatement for the amount of data IoT devices will produce. According to the Ericsson Mobility Report, IoT devices and sensors will exceed mobile phones as the largest category of connected devices in 2018 and generate a staggering 400 zettabytes of data per year. IoT's surge will overjoy marketers because they can leverage these massive data sets to integrate consumer behavioral signals into their marketing stack. This will allow them to capture interactions, conversion metrics, and consumer behavior predictions and link them to purchase-intent data.

Benefits Of Telecommuting For The Future Of Work

According to the State of Work Productivity Report, 65% of full-time employees think a remote work schedule would increase productivity. This is backed up by more than two-thirds of managers reporting an increase in overall productivity from their remote employees. Where do telecommuters find this extra boost of productivity? With none of the distractions of a traditional office setting, telecommuting drives up employee efficiency. It allows workers to retain more of their time in the day and adjust to the personal mental and physical well-being needs that optimize productivity. Removing something as simple as a twenty-minute commute to work can make a world of difference. If you are ill, telecommuting allows you to recover faster without being forced to be in the office. It also benefits your overall health.

Artificial Intuition  –  A Breakthrough Cognitive Paradigm

The first big conceptual leap that we have to make is to understand that learning systems evolve in non-equilibrium settings. ... Stated in a different way, researchers should be very cautious about employing statistical or alternatively bulk thermodynamic metrics in their analysis of these systems. It is my belief that one of the most glaring inappropriate tools in the study of AI is the use of Bayesian methods. ... The second conceptual leap is to understand that our notion of what “Generalization” means is quite grossly inadequate. The use of the term in Machine Learning is extremely liberal. Furthermore, the Machine Learning approach of ‘curve fitting’, and thus interpolation and therefore generalization between adjacent points in the fitted curve, breaks down under the recently discovered rote memorization behavior of Deep Learning.

Q&A on The Digital Quality Handbook

Teams today are adopting cloud services that reduce the pain of managing local device/desktop browser labs. In addition, teams are either developing overlays on top of open-source frameworks like Appium/Protractor and such to close test automation coverage issues, or are using a mix of tools as part of their tool stack to get proper testing capabilities. In addition, keeping an eye on analytics and market trends as a way to understand how the market is moving, and which devices are trending up or down, can also help. With analytics in mind, having a well-structured testing strategy that learns from previous executions and provides insights back to the team can help focus the testing on the most valuable tests - quality and value over quantity. As an example, identify the tests that found the most defects per platform.

Quote for the day:

"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis

Daily Tech Digest - July 28, 2017

The Virtues Of Digital: Creating A Truly Digital Bank

From Uber to Buzzfeed, Spotify and Netflix, we’ve been finding the new paradigms that best fit digital capabilities to replace and disrupt the digitised analogue products and services that came before. Banking is no exception to this rule. We’ve moved from a passbook to a printed statement that was first mailed to us, then put on a PC screen and eventually shrunk to your phone. Digitised banking – tick! Truly digital banking – we haven’t seen it yet! When I co-founded Monzo, a new digital challenger bank in the UK, that was the challenge that faced me. As Chief Customer Officer I led product and proposition, and I knew that I didn’t want to just digitise banking, I wanted to find the new paradigm, an approach that delivered digital banking rather than just digitising what had come before.

Cyber security not a priority for most sectors, study finds

Above the hospitality and food sector on the lower end are manufacturing (31%); admin and real estate (28%); construction (23%); transport and storage (23%); and entertainment and service (21%). This is despite the fact that cyber security has featured increasingly in mainstream media because of several high-profile data breaches and the fact that millions of UK firms are being hit by cyber attacks. According to research by business internet service provider Beaming, 2.9 million UK firms suffered cyber security breaches in 2016, costing them an estimated total of £29.1bn. According to security professionals consulted by networking hardware company Cisco, the operations of an organisation (36%) are most likely to be affected by any potential cyber attack.

The seven V’s of a data fabric

A data fabric must support the modernization of storage and data management, and move away from the proliferation of data silos. But a data fabric must also integrate with legacy systems, without requiring their presence for the long-term. To work effectively a data fabric must be broad and support a vast array of applications and data types at scale across locations. While data fabrics are a significant change from the assumptions that usually surround data storage and processing, the requirements have their roots in big data. The big data era was driven by the three V’s – Volume, Variety and Velocity. A data fabric does encompass these requirements but goes well beyond. In fact, an interesting way to summarize the requirements of a data fabric is the seven V’s – Volume, Variety, Velocity, Veracity, Vicinity, Visibility, and Value.

Design Patterns for Deep Learning Architectures

Pattern languages are an ideal vehicle for describing and understanding Deep Learning. One would like to believe that Deep Learning has a solid fundamental foundation based on advanced mathematics. Most academic research papers will conjure up high-falutin math such as path integrals, tensors, Hilbert spaces, measure theory etc., but don't let the math distract you from the reality that our collective understanding remains minimal. Mathematics, you see, has its inherent limitations. Physical scientists have known this for centuries. We formulate theories in such a way that the expressions are mathematically convenient. Mathematical convenience means that the math expressions we work with can be conveniently manipulated into other expressions.

Scale up DevOps processes against the odds

Start with absolute buy-in from all the teams involved and a good architectural footprint. Embrace the minimum viable product, and build on it. For example, if you build servers manually, develop processes to create and deploy a golden image in your virtualization platform of choice. The next step: Implement a secure and compliant base image across all Windows systems and another across all Linux systems, then generalize the application stack. Entrenched organizations can have 50,000 servers with as many different configurations, Herz pointed out, so iterative platform changes must happen before DevOps processes can translate to, "I can push a button and get my application stack." "Ultimately, you want to get [to end-to-end automation], but don't go in expecting it," said Herz. "Do a little bit, make that little bit better and move up the stack."

7 steps to baking AI into your business

Cognitive resources, algorithms, and learning models are now widely available through APIs and cloud services. IBM, which has been in the AI game since the ’50s, leans on Watson and Bluemix, but Amazon, Google, and Microsoft are also heavily invested and have strong offerings. And it’s not a game for traditional big tech only. Marketing platforms like Salesforce and Adobe are also starting to offer AI as a platform, and a vast number of specialized startups are popping up. OpenAI and related nonprofit initiatives also offer resources and training tools. There’s not going to be one right answer for all. Like a web services stack, your cognitive stack will be guided by your needs, learning models that fit the job, expertise of your scientists and technologists, data sources and formats, and integration needs.

The CEO’s guide to competing through HR

Many organizations have already built extensive analytics capabilities, typically housed in centers of excellence with some combination of data-science, statistical, systems-knowledge, and coding expertise. Such COEs often provide fresh insights into talent performance, but companies still complain that analytics teams are simple reporting groups—and even more often that they fail to turn their results into lasting value. What’s missing, as a majority of North American CEOs indicated in a recent poll, is the ability to embed data analytics into day-to-day HR processes consistently and to use their predictive power to drive better decision making. In today’s typical HR organization, most talent functions either implicitly or explicitly follow a process map; some steps are completed by business partners or generalists, others by HR shared services, and still others by COE specialists.

HR in the age of disruption

First, HR needs to truly understand the flexible work environment. Employees of the future are no longer bound to stay in the office, sit in a cubicle and work from 9 to 5. Rather, they will become location-independent and will be able to work when and where they want as long as they can get access to WiFi and get the job done. This is because the internet and mobile devices have transformed the way people work, interact and collaborate. Next, the use of new methods to communicate and collaborate must be promoted immediately. Email will no longer be considered the most effective or efficient way to communicate or collaborate. Instead, technologies such as internal collaboration platforms are going to replace email in many situations.

Microsoft explores 'safe' manual memory management in .Net

Microsoft’s model for manual memory management builds on the notion of unique owners of manual objects, with locations in the stack or heap holding only the reference to an object allocated on the manual heap. The concept of shields is introduced to enable safe concurrent sharing of manual objects. The shield creates state in local thread storage to prevent de-allocation while the object is being used. While garbage collectors such as the .Net GC offer high throughput through fast thread-local bump allocation and collection of young objects, studies show that GC can introduce performance overhead compared to manual memory management, the researchers said. These overheads are amplified in big data analytics and real-time stream processing applications, partly due to the need to trace through large heaps, they explained.

Social Intelligence: How To Mine Social Media For Business Results

“There will always be a need for people to read, interpret and understand what customer needs are and how the brand should react,” he says. Nissan North America has six to eight analysts that review data aggregated in queues by the social media management tool Sprinklr, which monitors corporate Twitter handles, Facebook pages, Instagram and Google Plus. The analysts, Long says, are the ones who decide when to respond. “An individual instance of concern might not be enough to warrant a look, but when you get into a top 10 or top 20 ranked concern, you have to start paying attention,” he says. Long considers social intelligence a very important data point that, when coupled with satisfaction surveys and other customer feedback, can help inform and shape the organization’s products, advertising and customer experience.

Quote for the day:

"The quality of a leader is reflected in the standards they set for themselves." -- Ray Kroc

Daily Tech Digest - July 26, 2017

Digital transformation failure is a business failure

The challenge is constant and unrelenting, as the survey found: 80% of IT leaders are under pressure to be constantly improving their organisation’s customer experience through digital innovation, but 90% of digital projects fail to meet expectations and only deliver incremental improvements. Databases are currently a clear handicap to this improvement, as 84% have had digital projects cancelled, delayed, or reduced in scope because of the limitations of their legacy database. “Our study puts a spotlight on the harsh reality that despite allocating millions of dollars towards digital transformation projects, most companies are only seeing marginal returns and realising this trajectory won’t enable them to compete effectively in the future,” said Matt Cain, CEO of Couchbase.

Artificial intelligence is not as smart as you (or Elon Musk) think

There are in fact many cases of AI algorithms not being quite as smart as we might think. One infamous example of AI out of control was the Microsoft Tay chatbot, created by the Microsoft AI team last year. It took less than a day for the bot to learn to be racist. Experts say that it could happen to any AI system when bad examples are presented to it. In the case of Tay, it was manipulated by racist and other offensive language, and since it had been taught to “learn” and mirror that behavior, it soon ran out of the researchers’ control. A widely reported study conducted by researchers at Cornell University and the University of Wyoming found that it was fairly easy to fool algorithms that had been trained to identify pictures. The researchers found that when presented with what looked like “scrambled nonsense” to humans, algorithms would identify it as an everyday object like “a school bus.”

CISO: To achieve security in IoT devices, remember the fundamentals

When you are talking about smart homes, the primary responsibility of a CISO is to promote the consumerization of the smart home by getting rid of the fear factor that smart home devices can affect your privacy. ... Irrespective of the IoT or IT field, the biggest challenge every security officer faces today is weighing the business value against the risks. You should be able to support the business in a way that the product can be launched quickly so that the market can be captured appropriately, but at the same time the risks should be articulated. Being able to articulate the risks in the language of business will always be a learning exercise for every security professional. Sometimes businesses will make decisions based on the risks and you should be ready to flow with it. Sometimes the decisions will be made in favor of security. Either way, security should not be a blocker to business.

5 Reasons To Take A Fresh Look At Your Security Policy

The golden rules for writing security policy still apply, such as making sure the process is shared with all stakeholders who will be affected by it, using language that everyone can understand, avoiding rigid policies that might limit business growth, and ensuring the process is pragmatic by testing it out. Just because policies are intended to be evergreen doesn’t mean they can’t become stale, says Jay Heiser, research VP in security and privacy at Gartner. Particularly at the standards levels, one level below policy, guidance may need to be updated for different lines of business, or for jurisdictions that may be driven by different regulatory rules or geographic norms. Security and risk experts offer five reasons why companies should take a fresh look at security policies.

Security Think Tank: Avoiding the blame game

The traditional wisdom of “don’t open suspicious links or attachments” does not prevent a user clicking on an email that has been specifically designed and crafted not to be suspicious. We have also seen attacks that exploit web pages people are known to browse or access, forums they use and other aspects of what could be described as “normal use of corporate and personal IT systems”. This extends to the use of cloud-based applications and file storage/sharing systems, travel booking services, tech support and chat applications – not reckless or naive behaviour, just normal. ... If you assume that some users are going to fall for these scams and that not all systems are going to have patches applied, then there is a need for better controls that can filter this use/exploitation and detect/prevent the inevitable people/process/technology security failures.

Cybersecurity skills shortage hurts security analytics, operations

Cybersecurity skills are especially important when it comes to security analytics and operations. It takes highly experienced professionals to investigate security incidents, synthesize threat intelligence, or perform proactive hunting exercises. Unfortunately, this skill set is particularly lacking. In a recently published ESG research report, Cybersecurity Analytics and Operations in Transition, 412 cybersecurity and IT professionals were asked about the size and skill set of their organization’s cybersecurity team. As it turns out, 54 percent of survey respondents said the skill level for cybersecurity analytics and operations was inappropriate for an organization of their size, and 57 percent said the staff size for cybersecurity analytics and operations was inappropriate for an organization of their size.

What does 'hybrid cloud' mean? It depends on whom you ask

A new survey conducted by the cloud infrastructure company Stratoscale finds that C-level executives define hybrid cloud slightly differently than IT specialists. When asked, "What does hybrid cloud mean to you?" the plurality of executives (44 percent) said it means that different workloads belong in different environments. Another 33 percent said it means the ability to move workloads between private and public cloud. By comparison, IT specialists were effectively split between the two answers (39 percent and 38 percent, respectively). Additionally, the survey suggests that executives experience a shift in their perception of "hybrid cloud" as their company's cloud adoption level increases. In enterprises with adoption levels below 20 percent, hybrid is most often defined as the ability to move workloads between private and public cloud.

When it comes to cybersecurity, why is healthcare so behind?

Despite improving slightly, healthcare organizations have to stay on top of keeping their security measures in check. Organizations “that feel they are in a good place with their security program are the ones that do an annual external risk assessment,” Hall said. It’s crucial to conduct such assessments on a yearly basis, Hall noted, because of the evolving nature of cyberattacks. Rather than being reactive about their security efforts, hospitals must strive to be proactive when it comes to protecting their valuable data. And the data is indeed precious — gaining access to protected health information means big money for hackers. “Healthcare data is both accidental and intentional targets of attacks,” Beth Musumeci, vice president of cybersecurity for GE Healthcare, said.

The very real dangers of betting on hybrid cloud

On one hand, you have the expense and capital investment of operating your own data centers, but without the complete control of your data and processes that is one of the key benefits of on-premise technology stacks. On the other hand, you’ll have to deal with the complexity and potential pitfalls of working with third-party cloud providers, without gaining much of the cloud’s promised benefit of freeing your team to focus on innovation instead of infrastructure. ... Companies that really want to move to the cloud would be well served to move as much as possible—if not all—of their data center activities to the cloud as soon as practical. But, if for whatever reason they’re not ready to fully commit to the public cloud for the next few years, there’s a strong argument for doubling down on running their own data centers, and using the cloud only for simple, completely separate applications.

The Machines Are Getting Ready to Play Doctor

The automated approach could prove important to everyday medical treatment by making the diagnosis of potentially deadly heartbeat irregularities more reliable. It could also make quality care more readily available in areas where resources are scarce. The work is also just the latest sign of how machine learning seems likely to revolutionize medicine. In recent years, researchers have shown that machine-learning techniques can be used to spot all sorts of ailments, including, for example, breast cancer, skin cancer, and eye disease from medical images. “I've been encouraged by how quickly people are accepting the idea that deep learning can diagnose at an accuracy superior to doctors in select verticals,” Ng said via e-mail. He adds that it’s encouraging to see researchers looking beyond imaging to other forms of data such as ECG.

Quote for the day:

"Don't be buffaloed by experts and elites. Experts often possess more data than judgement." -- Colin Powell

Daily Tech Digest - July 25, 2017

10 Old-School IT Principles That Still Rule

The technology you buy is a long-term commitment on your part. You need it to be a long-term commitment on the supplier’s part, too. To play it safe, IT used to buy from big vendors. Now? Not only can open source be just as safe, sometimes you can get it from IBM or other big vendors. Not every open source technology has a broad enough base of support, but many do. If PHP, for example, will do the job, would you look at Java twice given its awful security track record? And yet Java is supported (perhaps “provided” would be more accurate) by Oracle, one of the biggest software companies in the world. This isn’t entirely new, either. The open-source-like SHARE library dates to the 1970s, after all.

Embrace the heat: Data center tips for summer operations

It’s not quite as blistering as that hot yoga class, but data center managers wrestling with their energy bills should seriously consider embracing the heat and asking techs to bring their shorts to work. Running data centers in the 80 to 82 degree Fahrenheit range as opposed to 70 degrees or below can save up to two percent per degree, per bill. That’s a significant cost savings, especially if we’re talking a full 10 degree swing. Even during peak workloads, your data center should be able to take the heat. It may seem to go against conventional wisdom, but running a server “hot,” or operating that data center in a high temperature ambient (HTA) state, boosts the inlet temperature of that unit while still staying well below component specifications. This is another way of allowing crafty (and probably now sweaty) data center managers to keep their cooling costs under control.
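The savings claim above can be sanity-checked with a little arithmetic. A minimal sketch, assuming a hypothetical monthly cooling bill and treating the two-percent-per-degree figure as compounding:

```javascript
// Worked example of the rule of thumb above: roughly 2% of the cooling
// bill saved per degree Fahrenheit the set point is raised.
// The bill amount is an invented figure for illustration only.
function coolingCost(baseCost, degreesRaised, savingsPerDegree = 0.02) {
  // Compound the per-degree savings rather than adding them,
  // so a large swing doesn't overstate the benefit.
  return baseCost * Math.pow(1 - savingsPerDegree, degreesRaised);
}

const monthlyBill = 100000; // hypothetical monthly cooling spend, dollars
const raisedTen = coolingCost(monthlyBill, 10); // the full 10-degree swing
console.log(raisedTen.toFixed(0)); // about 18% lower than the 70°F bill
```

Even treated as a rough estimate, an 18 percent reduction on a six-figure cooling bill explains why the tradeoff gets serious consideration.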

Understand the multicloud management trade-off

In order to make multicloud work best for an enterprise you need to place a multicloud management tool, such as a CMP (cloud management platform) or a CSB (cloud services broker) between you and the plural clouds. This spares you from having to deal with the complexities of the native cloud services from each cloud provider. Instead you deal with an abstraction layer, sometimes called a “single pane of glass” where you are able to leverage a single user interface and sometimes a single set of APIs to perform common tasks among the cloud providers you’re leveraging. Tasks may include provisioning storage or compute, auto-scaling, data movement, etc.  While many consider this a needed approach when dealing with complex multicloud solutions, there are some looming issues.
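The abstraction-layer idea can be sketched in a few lines. Everything below is illustrative: the broker class and the provider adapters are invented stand-ins for a real CMP or CSB API, not any actual product:

```javascript
// Minimal sketch of a "single pane of glass" over multiple clouds:
// one uniform call, with per-provider adapters hiding native services.
class CloudBroker {
  constructor() { this.providers = new Map(); }
  register(name, adapter) { this.providers.set(name, adapter); }
  // Common task (e.g. provisioning storage) expressed once,
  // regardless of which cloud actually does the work.
  provisionStorage(provider, sizeGb) {
    const adapter = this.providers.get(provider);
    if (!adapter) throw new Error(`unknown provider: ${provider}`);
    return adapter.provisionStorage(sizeGb);
  }
}

// Hypothetical adapters translating the common call into native terms.
const broker = new CloudBroker();
broker.register('cloudA', { provisionStorage: gb => `cloudA-bucket-${gb}gb` });
broker.register('cloudB', { provisionStorage: gb => `cloudB-volume-${gb}gb` });

console.log(broker.provisionStorage('cloudA', 50)); // cloudA-bucket-50gb
```

The design choice is the classic tradeoff the article alludes to: the uniform interface simplifies operations, but anything exposed only through one provider's native API is hidden behind the abstraction.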

Network transformation is the next big IT initiative

To support the ever-growing data needs of the end-users, agencies must enhance their network infrastructure. However, in the current environment of IT budget cuts, procuring high-performance routers and firewalls is not feasible for many agencies. They must therefore explore other avenues to enhance their network infrastructure and capacity. Software-defined networking offers a potential solution for agencies that are looking to modernize their network environment without incurring much capital investment. Leveraging the principles of compute and storage virtualization, SDN allows agencies to virtualize their network infrastructure and services. Similar to data center virtualization where applications run on virtual machines, SDN enables network services (routing, firewall and WANX) to run on virtual machines hosted on general-purpose hardware.

Tweaking Internet Explorer to only use TLS 1.2

Out of the box, IE 11 conforms to the current standard, which is that it supports TLS 1.0, 1.1 and 1.2. This should be true on any up-to-date copy of Windows 7, 8.1 or 10. The nice thing about Internet Explorer is that the configuration options for supported TLS versions are right where they should be. They can be found with: Tools -> Internet Options -> Advanced tab. Among the advanced options, they are at the very bottom. Changing these options is even easier than finding them. There is a simple, obvious checkbox for each version of SSL and TLS that you would like to include or exclude. Compare this to Firefox, where you had to know the secret handshake to remove support for TLS 1.0 and 1.1. After limiting IE 11 to just TLS 1.2, the Qualys SSL Client Test should confirm that the tweaking actually works.

The paranoid Android traveler’s data-protection checklist

Changes to Android in more recent releases have bolstered security, so if you are traveling with an older device that does not support Nougat, you may want to seriously consider a hardware upgrade. Among other improvements, Nougat introduced new — and potentially more secure — device and file encryption; newer devices should have adequate hardware to handle encryption effectively (more details below). These tips are in roughly increasing order of difficulty and complexity, with the simplest and quickest first. In general, these tips involve a tradeoff between security and ease of use (making it harder to search your device can also make it a little harder for you to use it). So you may want to use some of these options only when traveling.

Cashing in on the Internet of Things

The practical (or impractical) reality of smart connected products in the home suggested there was a need for them to work together, so key industry players began to jockey for dominance. This pertained to the communications standards, as well as the ultimate command and control platforms ranging from Apple HomeKit to Amazon Echo to Google Home, Samsung SmartThings, and others. The Allseen Alliance (primarily driven by Qualcomm) got involved to broker standards for consumer IoT as well. And while the focus today in most elements of IoT is still largely on smart connected products, the progression to product systems is clearly happening. Larger players, like GE and Hitachi, bringing forward solutions like GE/Predix and Hitachi Lumada, further demonstrate this. 

10 Essential Performance Tips For MySQL

The best way to understand how your server spends its time is to profile the server’s workload. By profiling your workload, you can expose the most expensive queries for further tuning. Here, time is the most important metric because when you issue a query against the server, you care very little about anything except how quickly it completes. The best way to profile your workload is with a tool such as MySQL Enterprise Monitor’s query analyzer or the pt-query-digest from the Percona Toolkit. These tools capture queries the server executes and return a table of tasks sorted by decreasing order of response time, instantly bubbling up the most expensive and time-consuming tasks to the top so that you can see where to focus your efforts. Workload-profiling tools group similar queries together, allowing you to see the queries that are slow, as well as the queries that are fast but executed many times.
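The grouping these profilers perform can be sketched roughly as follows. This is an illustrative approximation of what a tool like pt-query-digest does, with invented sample queries, not its actual implementation:

```javascript
// Sketch of workload profiling: normalize each query to a fingerprint
// (literals stripped), then aggregate and sort by total response time.
function fingerprint(sql) {
  return sql.toLowerCase()
    .replace(/'[^']*'/g, '?')   // string literals -> placeholder
    .replace(/\d+/g, '?')       // numeric literals -> placeholder
    .replace(/\s+/g, ' ').trim();
}

function profile(log) {
  const groups = new Map();
  for (const { sql, ms } of log) {
    const key = fingerprint(sql);
    const g = groups.get(key) || { count: 0, totalMs: 0 };
    g.count += 1;
    g.totalMs += ms;
    groups.set(key, g);
  }
  // Most expensive query classes bubble to the top.
  return [...groups.entries()].sort((a, b) => b[1].totalMs - a[1].totalMs);
}

const report = profile([
  { sql: "SELECT * FROM orders WHERE id = 7", ms: 3 },
  { sql: "SELECT * FROM orders WHERE id = 42", ms: 4 },
  { sql: "SELECT SLEEP(2)", ms: 2000 },
]);
console.log(report[0][0]); // the slow query class ranks first
```

Note how the two order lookups collapse into one query class: that grouping is what surfaces queries that are individually fast but expensive in aggregate.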

Don’t let cybercrime hold your innovation to ransom

It’s no secret that innovation is vital to stay ahead of the competition. However, it cannot come at the expense of business continuity. As a result, modern IT systems have to be more complex. While businesses work hard to make them as robust as possible, when you’re constantly innovating that complexity introduces an element of fragility and unpredictability that can be difficult to manage. The best way for CIOs to achieve these objectives is to effectively create and deploy innovative business services that are built on the organisation’s existing IT foundation and layered with new delivery models and platforms. In practice, it’s bridging the old and the new, enabling an organisation to innovate faster at a lower risk. Thankfully, this can be done without the need to rip and replace legacy applications.

Big Data Ingestion: Flume, Kafka, and NiFi

Flume is a distributed system that can be used to collect, aggregate, and transfer streaming events into Hadoop. It comes with many built-in sources, channels, and sinks, for example, Kafka Channel and Avro sink. Flume is configuration-based and has interceptors to perform simple transformations on in-flight data. It is easy to lose data using Flume if you’re not careful. For instance, choosing the memory channel for high throughput has the downside that data will be lost when the agent node goes down. A file channel will provide durability at the price of increased latency. Even then, since data is not replicated to other nodes, the file channel is only as reliable as the underlying disks. Flume does offer scalability through multi-hop/fan-in fan-out flows. For high availability (HA), agents can be scaled horizontally.
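Since Flume is configuration-based, a flow like the one described is declared in an agent properties file. A minimal, hypothetical example choosing the durable file channel over the faster memory channel (agent name, ports and paths are illustrative):

```
# Hypothetical Flume agent: one source, one durable channel, one HDFS sink.
agent.sources = netcatSrc
agent.channels = fileCh
agent.sinks = hdfsSink

agent.sources.netcatSrc.type = netcat
agent.sources.netcatSrc.bind = localhost
agent.sources.netcatSrc.port = 44444
agent.sources.netcatSrc.channels = fileCh

# File channel: survives an agent restart, at the cost of latency.
# (A memory channel here would be faster but lose in-flight events.)
agent.channels.fileCh.type = file
agent.channels.fileCh.checkpointDir = /var/flume/checkpoint
agent.channels.fileCh.dataDirs = /var/flume/data

agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = /flume/events/%Y-%m-%d
agent.sinks.hdfsSink.channel = fileCh
```

Swapping `type = file` for `type = memory` is exactly the throughput-versus-durability tradeoff the paragraph above describes.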

Quote for the day:

"Fear causes hesitation and hesitation will cause your worst fears to come true." -- Patrick Swayze

Daily Tech Digest - July 24, 2017

The Skills And Traits Of A Next Generation CIO

"Back then, when you searched for 'customer experience officer' on LinkedIn, mine was the only name that showed up," says Lindberg, who was recently appointed president of Kobie Marketing, a provider of loyalty program solutions. "Now there's something like 37,000 of us." Over the past ten years the number of digital customer touchpoints -- and the data associated with them -- has exploded. CIOs who see their primary function as managing internal IT systems are not in a position to deliver the information businesses need to improve the customer experience, she says. "If you're a CIO who hasn't made the realization that we are multiple years into the age of the customer, then it's time to shop for a new job," says Lindberg. "You have to understand the customer's wants and needs. That's why one of the first things I do upon walking into an organization is figure out how to connect the CIO to the live voice of the customer on an ongoing basis."

Cisco Security Report: 34% of Service Providers Lost Revenue from Attacks

DeOS attacks’ “aim is not just to attack, but to destroy in a way that prevents defenders from restoring systems and data,” writes David Ulevitch, SVP and GM of Cisco’s security business, in a blog post. Security researchers watched the evolution of malware during the first half of 2017. Attackers increasingly require victims to activate threats by clicking on links or opening files, the report says. Additionally, they are developing fileless malware that lives in memory and is harder to detect or investigate as it is wiped out when a device restarts. Adversaries are also relying on anonymized and decentralized infrastructure, such as a Tor proxy service, to obscure command and control activities. The report notes an increase in spam volumes, in which attackers use email to distribute malware and generate revenue. This coincides with a decline in exploit kit activity since mid 2016.

Consumers Welcome AI, Despite Lingering Privacy Concerns

In a world where more than four billion records of personal information were stolen or lost during 2016 and data breaches at large corporations dominate news headlines, privacy has become a hot-button issue for any new technology, including AI. Although consumers remain concerned about protecting their privacy and the vulnerability of their personal information, most are more interested in the potential for positive societal impact. When asked about the importance of AI being used to solve today’s bigger issues for the benefit of our society, consumers told us that they would be willing to share their personal information if it meant doing so could further medical breakthroughs (57%), relieve city traffic and improve infrastructure (62%), or solve cybersecurity and privacy issues (68%).

Quest for AI Leadership Pushes Microsoft Further Into Chip Development

Bringing chipmaking in-house is increasingly in vogue as companies conclude that off-the-shelf processors aren't capable of fully unleashing the potential of AI. Apple is testing iPhone prototypes that include a chip designed to process AI, a person familiar with the work said in May. Google is on the second version of its own AI chips. To persuade people to buy the next generation of gadgets—phones, VR headsets, even cars—the experience will have to be lightning fast and seamless. "The consumer is going to expect to have almost no lag and to do real-time processing," says Jim McGregor, an analyst at Tirias Research. "For an autonomous car, you can't afford the time to send it back to the cloud to make the decisions to avoid the crash, to avoid hitting a person. The amount of data coming out of autonomous vehicles is tremendous; you can't send all of that to the cloud."

OAuth 2.0 Threat Landscapes

It’s neither a flaw of OAuth 2.0 nor of how Google implemented it. Phishing is a prominent threat in cyber security. Does that mean there is no way to prevent such attacks, other than proper user education? There are basic things Google could do to prevent such attacks in the future. Looking at the consent screen, ‘Google Docs’ is the key phrase used there to win the user’s trust. When creating an OAuth 2.0 app in Google, you can pick any name you want. This immensely helps an attacker mislead users. Google could easily filter out the known names and prevent app developers from picking names to trick the users. Another key issue is that Google does not show the domain name of the application (but just the application name) on the consent page. Having the domain name prominently displayed on the consent page would give the user some hint of where he is heading.

AI Cyber Wars: Coming Soon To A Bank Near You

We are beginning to see both offense and defense using automation, machine learning and artificial intelligence (AI) to counter each other’s moves. For example, as firms adopt voice biometrics to make customers’ access to their accounts and information more secure, cyber-criminals can use the same machine learning algorithms to mimic voices and gain unauthorized access. Lyrebird, a Montreal-based AI startup, has developed a voice generator that can imitate almost any person’s voice, and can even add emotional elements missing from computer generated personas such as Siri and Cortana. Staying one step ahead of the threat is difficult, but forward-thinking financial institutions realize it’s imperative. As financial institutions up their game to protect their assets, three AI priorities have emerged: focusing resources, visualizing the threat, and accelerating response time.

What is Node.js? The Javascript Runtime Explained

Node.js takes a different approach. It runs a single-threaded event loop registered with the system to handle connections, and each new connection causes a JavaScript callback function to fire. The callback function can handle requests with non-blocking I/O calls, and if necessary can spawn threads from a pool to execute blocking or CPU-intensive operations and to load-balance across CPU cores. Node’s approach to scaling with callback functions requires less memory to handle more connections than most competitive architectures that scale with threads, including Apache HTTP Server, the various Java application servers, IIS and ASP.NET, and Ruby on Rails. Node.js turns out to be quite useful for desktop applications in addition to servers. Also note that Node applications aren’t limited to pure JavaScript. You can use any language that transpiles to JavaScript, for example TypeScript and CoffeeScript.
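The single-threaded, callback-driven model described above can be illustrated with a small sketch; here the slow I/O is simulated with a timer, since nothing in the main script ever blocks:

```javascript
// Sketch of Node's event loop: the main script registers callbacks
// and runs straight through; callbacks fire later, as "I/O" completes.
const order = [];

function fakeIo(name, ms, callback) {
  // Hands work to the event loop and returns immediately.
  setTimeout(() => callback(null, name), ms);
}

order.push('start');
fakeIo('disk read', 20, (err, name) => order.push(`${name} done`));
fakeIo('db query', 10, (err, name) => order.push(`${name} done`));
order.push('end of main script');

// The main script finishes before any callback runs, and callbacks
// then complete in I/O-finish order, not call order.
setTimeout(() => console.log(order.join(' | ')), 50);
```

Synchronously, right after the last line of the main script, `order` contains only the two `push` entries; the "db query" callback then fires before the "disk read" one because its simulated I/O finishes first.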

Four Tips for Working with Angular Components

If you want to improve the quality of your applications, you need to improve the quality of your code. That may mean tackling a new concept, or it might simply mean approaching existing concepts in a better and more efficient way. Learning to use components in Angular in the most efficient way possible, for instance, can help you to create applications that are more upgradable, that run more smoothly and that will be more future proof. Components have been a part of Angular since version 1.5 of AngularJS and provide a convenient and handy way to organize and recycle code. Angular (the shorthand for Angular 2) is not so much an upgrade to Angular 1.x as much as a ‘sequel’, being entirely rewritten with mobile support and other features in mind. Here, the controllers used in 1.x are completely replaced with components.
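In the AngularJS 1.5 style, a component is just a configuration object pairing a controller with a template. A minimal sketch with invented names; the registration call is shown in a comment, since it needs the angular library itself:

```javascript
// Sketch of an AngularJS 1.5-style component definition.
// "heroDetail" and its bindings are illustrative names, not a real app.
const heroDetail = {
  bindings: { hero: '<' },          // one-way input binding
  controller: function HeroDetailController() {
    this.describe = () => `Hero: ${this.hero.name}`;
  },
  template: '<div>{{ $ctrl.describe() }}</div>',
};

// In a real app the object is registered on a module:
//   angular.module('app', []).component('heroDetail', heroDetail);

// The controller is an ordinary constructor, easy to test in isolation:
const ctrl = new heroDetail.controller();
ctrl.hero = { name: 'Windstorm' };
console.log(ctrl.describe()); // Hero: Windstorm
```

That isolation is a large part of the appeal: the same definition-object shape carries over conceptually to Angular 2 components, which is what makes component-based 1.5 code easier to upgrade.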

The Database’s Role in Speeding Application Delivery

Among databases there is considerable feature variance, even among relational databases, and this may impact time to value. Some databases have a significant overhead in respect of database administration, usually because of the need for performance tuning – partitioning, adding indexes and so on. Products that are largely self-tuning have a cost advantage here, and it can be argued, improve time to value by that alone, although the more significant cost involved is likely to be the cost of the DBA or, alternatively, the business cost of poor database performance. Some practically useful database features improve time to value simply because you do not have to spend time building the capability that is missing or designing around it. A particular case in point here is distributed capability.

In 2017, the pressure is on to be secure. Are you feeling the squeeze?

Executives will be leaning on CSOs to ensure and demonstrate that company data is adequately protected – and their jobs are well and truly on the line, with another recent Trustwave survey suggesting that a data breach that becomes public is a fireable offence at 38 percent of companies. Other concerning vectors for breaches included ransomware and intellectual property theft, with practitioners most concerned about their responsibilities to identify vulnerabilities and stop the spread of malware. Advanced security threats and a shortage of security skills were the areas applying the most operational pressure on respondents, with cloud, Internet of Things (IoT) and social media presenting the biggest technological security challenges.

Quote for the day:

"Don't be afraid of your fears. They're not there to scare you. They're there to let you know that something is worth it." -- C. JoyBell C

Daily Tech Digest - July 23, 2017

Natural Language Processing: The What, Why, and How

Business managers have a Big Data problem. They puzzle over dashboards and spreadsheets, drowning in too much data and trying to compile it all together into meaningful information. Arria, a company based in London, has come up with a solution. The Arria NLG Platform is a form of Artificial Intelligence, specialized in communicating information extracted from complex data sources in natural language (i.e. as if written by a human). It literally takes an organization’s data and transforms it into language, not standard computer-generated text that is overly technical and difficult to read, but natural human language that reads like a literate and well-educated person wrote it. Arria’s software can take a spreadsheet full of data that is dragged and dropped in and automatically turn it into a written description of the contents, complete with trends, essentially providing business reports.

Real Time Data Integration on Hadoop

This very quick and focused data integration is often referred to as “streaming data enrichment”. In the insurance example, the company wants each recommendation to be based on the full context of the customer’s relationship with the company. Data integration in near real time is required because the first call provides a critical part of the context for the second call or website visit. My colleague, NoSQL expert Bryce Cottam, suggests using a low latency NoSQL database, such as HBase, as the repository for the integrated data in this case. Apache HBase is an open source database included in, or available with, most Hadoop distributions. Integration can be further simplified by designing the solution around a specific data integration requirement. For the insurance example, the problem is to integrate the data by customer.
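The customer-keyed integration can be sketched with a plain map standing in for the low-latency store (row key = customer id, in the HBase style). The event shapes below are invented for illustration:

```javascript
// Sketch of streaming data enrichment keyed by customer. A Map stands
// in for a low-latency NoSQL table such as HBase; in production each
// customer id would be a row key and this would be an upsert.
const customerContext = new Map();

function enrich(event) {
  // Merge the new event into whatever context we already hold,
  // so the second touchpoint sees the first.
  const row = customerContext.get(event.customerId) || { events: [] };
  row.events.push({ channel: event.channel, detail: event.detail });
  customerContext.set(event.customerId, row);
  return row;
}

enrich({ customerId: 'c-17', channel: 'phone', detail: 'asked about claim' });
const ctx = enrich({ customerId: 'c-17', channel: 'web', detail: 'quote page' });

// The website visit now carries the context of the earlier call.
console.log(ctx.events.length); // 2
```

Designing the solution around one integration key, as the paragraph suggests, is what keeps this fast: each new event needs only a single keyed read and write, not a join.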

Why public cloud is more expensive than you think

“If you were to go out and rent a car from Budget for one day a week, no problem,” he says. “If you want to use that car 24/7, 365 days a year then you’re going to pay for it twice over.” So anyone that’s looking to run an application that has predictable traffic levels and must always be available should avoid public cloud options for that, he says. “That’s very expensive under the Azure and AWS pricing model,” MacDonald explains. “Which it should be, because if you have these virtualized server banks and you’re doing pay-as-you-go, then you have to charge a lot to make a profit, because it’s not going to be used all the time.” Canada15Edge has been in business for about two years, operating one data centre on a colocation model for its clients. MacDonald says he’s hosting a number of managed service providers in his building.
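The rental-car point reduces to simple arithmetic. The rates below are made up purely for illustration; the shape of the comparison, not the numbers, is what matters.

```python
# Back-of-the-envelope comparison with hypothetical rates: pay-as-you-go
# is attractive for bursty use but costly for an always-on workload.

on_demand_per_hour = 0.20   # hypothetical pay-as-you-go rate
flat_monthly = 90.0         # hypothetical colocation / reserved rate

hours_per_month = 730
always_on = on_demand_per_hour * hours_per_month   # runs 24/7
bursty = on_demand_per_hour * 8 * 22               # 8 h/day, 22 days

print(f"24/7 on-demand: ${always_on:.0f}/mo, "
      f"bursty: ${bursty:.0f}/mo, flat rate: ${flat_monthly:.0f}/mo")
```

Under these invented numbers the always-on workload costs well over the flat rate while the bursty one costs far less, which is exactly the split MacDonald describes.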

Architecting the digital enterprise

To be nimble requires an organisation to empower those architects closest to the business needs – those with domain expertise. To maintain consistency amidst this new autonomy, an enterprise’s domain architects need to operate with a consensus around the approach to key architecture “plays” – such as cloud, security and analytics. The enterprise architect of the future needs to be able to grasp and manage risk: understanding what to solve now and what to solve iteratively. As the dominance of the biggest players has eroded, architects must construct fluid ecosystems of software, where a product may be used to deliver a business outcome for one or two years until enterprise toolsets evolve. This is a different mentality for architects – one which tolerates risk and even sprawl so long as it is managed and iteratively resolved.

Attack and response: Cloud-native cybersecurity vs. virtual machine security

Most vulnerabilities lie at the application level, and understanding the specific application well enough to protect against relevant threats is hard to do on an ongoing basis. Cloud-native security addresses this problem with whitelisting and protection from known threats. For the first time, you can automatically whitelist which traffic should and shouldn’t reach your application. VM security is completely blind to application-specific elements and to the larger context of the application, especially in orchestrated systems where the IPs of the application might change on an hourly basis. Regarding protection from known threats, one of the major issues with existing web application firewalls (WAFs) is that it is very hard to configure them correctly for every exposed service.
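The key idea — whitelisting by service identity rather than by IP, since IPs churn constantly in orchestrated systems — can be sketched as follows. The service names and allowed pairs are hypothetical.

```python
# Minimal sketch of application-level traffic whitelisting: decisions
# are keyed on service identity (labels), not IP addresses, which may
# change hourly in an orchestrated cluster.

ALLOWED = {
    ("frontend", "orders-api"),
    ("orders-api", "payments"),
}

def is_allowed(src_service, dst_service):
    """Permit only traffic patterns that appear on the whitelist."""
    return (src_service, dst_service) in ALLOWED

print(is_allowed("frontend", "orders-api"))  # permitted pattern
print(is_allowed("frontend", "payments"))    # not whitelisted: blocked
```

An IP-based rule set would need rewriting every time a pod rescheduled; the identity-based rule survives any amount of churn.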

What’s the Big Deal about China’s First Open-Source Blockchain Platform NEO?

Erik Zhang, core developer of NEO, introduced Smart Contracts 2.0 to the audience and explained the major differences between NEO and Ethereum. Ethereum uses its own language, Solidity, for programming, whereas NEO supports common programming languages via a compiler, including those on Microsoft .NET as well as Java, Kotlin, Go and Python. By allowing common programming languages to be used on its platform, NEO hopes to attract a vast community of developers. NEO will have the Nest Fund, a project similar to Ethereum’s The DAO, and Tony Tao will soon release a white paper on the project. The Nest Fund aims to improve on The DAO’s shortcomings and will be released after being audited by a worldwide peer review.

Open hybrid cloud enables government IT service delivery

An open hybrid cloud solution enables government IT shops to provide flexible and agile service delivery with minimal disruption using existing infrastructure. At the same time, it establishes a fast, flexible and agile service-delivery environment supporting today’s traditional workloads and tomorrow’s cloud-based applications. Open hybrid cloud leverages innovation, economics and flexibility by providing access to the best service providers, vendors and technologies without getting locked in. Open source solutions are leading the industry in rapid innovation and delivering secure open hybrid cloud. “If you automate your way into the cloud, you can automate your way across to another cloud and start making spot-market decisions about what cloud you want to be in based on what you’re trying to do,” says Adam Clater.

11 Things Every CEO Must Know About Disruption

The first thing to remember about disruption is that it's a two-way street: either you are the disrupter, or you are being disrupted. This means that if you aren't making things happen for you or your company, then someone is probably going to put you out of business right under your nose with a lower price point and a better business plan. ... This isn't meant to be fear-based, but it's the reality of the situation. Between the pace at which technology advances and the rate at which ideas are generated, disruption is simply a natural consequence of the times. If you're comfortable in your business, you need to be thinking about who knows you're comfortable and how they are planning to make you uncomfortable. There is a constant ebb and flow of disrupting and being disrupted.

The Jobs that will be Orchestrated, not Automated

With the help of Robotic Service Orchestration (RSO) technology, we can orchestrate services across a human and digital workforce to get the right worker to do the right task at the right time. As we move to an increasingly automated workforce, this is going to become ever more important. While there are jobs that will absolutely and positively remain in the human realm, these jobs will likely benefit from some sort of robot interaction, which will have to be managed. RSO can also be used to ease the transition and effectively "install airbags" in the automation process. RSO can help ensure that it's easy to switch back from digital to human if there are any unexpected side effects from moving to an automated agent instead of a human one.
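The "airbag" idea — route work to an automated agent but fall back to a human queue when the robot can't cope — can be sketched like this. The agent, task fields, and failure condition are all hypothetical.

```python
# Illustrative sketch of automation with a human fallback: tasks go to
# a robot agent first; any it cannot handle are escalated to a queue
# for human workers instead of failing outright.

def robot_agent(task):
    """A stand-in automated worker that rejects ambiguous input."""
    if task.get("ambiguous"):
        raise ValueError("cannot handle ambiguous input")
    return f"robot completed {task['id']}"

def orchestrate(task, human_queue):
    """Try the robot; on failure, switch the task back to a human."""
    try:
        return robot_agent(task)
    except ValueError:
        human_queue.append(task)  # the "airbag": a human picks it up
        return f"escalated {task['id']} to human"

queue = []
print(orchestrate({"id": "T1"}, queue))
print(orchestrate({"id": "T2", "ambiguous": True}, queue))
```

Real RSO platforms add scheduling, SLAs, and skills-based routing, but the core guarantee is the same: a failed automation hand-off degrades to a human task rather than a lost one.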

Maximizing the Potential of Open Source Threat Intelligence Feeds

Open source threat intelligence feeds are appealing for a number of reasons. One of the more obvious is their price: absolutely nothing. This is critical for smaller organizations that lack the resources for robust sources of intelligence. Cost aside, open source threat intelligence is also appealing because it provides a wide scope of information across different industries, topics, and locations. With the collaborative efforts of many contributors, users can benefit from intelligence without the hassle of contracts and data limits. Open source threat intelligence is also popular because much of it derives from honeypots, which are decoy entities used to study invasive behaviors. These open- and closed-source applications register anomalies and problematic activity that can then be turned into feeds, software patches, and studies of adversarial behavior.
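Consuming such a feed is often as simple as parsing a list of indicators and checking traffic against it. The feed contents below are invented (documentation-range IPs), standing in for a downloaded indicator list.

```python
# Minimal sketch of using an open-source threat feed: parse plain-text
# indicators (one IP per line, '#' for comments) into a set, then flag
# any connection whose source matches a known-bad indicator.

feed_text = """# sample indicator feed (made-up entries)
203.0.113.7
198.51.100.23
"""

indicators = {line.strip() for line in feed_text.splitlines()
              if line.strip() and not line.startswith("#")}

def is_suspicious(src_ip):
    """True if the source IP appears in the indicator feed."""
    return src_ip in indicators

print(is_suspicious("203.0.113.7"))  # known-bad from the feed
print(is_suspicious("192.0.2.10"))   # not in the feed
```

Real feeds add indicator types, confidence scores, and expiry; the set-membership check at the core stays the same.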

Quote for the day:

"If the road is easy, you're likely going the wrong way." -- Terry Goodkind