Daily Tech Digest - July 30, 2017

How CIOs can use machine learning to transform customer service

Machine learning means your company’s programs can make use of this data without being explicitly programmed to do so, as programs collect, learn, and adapt by utilizing ever-greater sources of data. Your human employees who do relatively simple and often mundane tasks, such as answering phones, will soon be replaced by much more efficient machines. Workers who remain employed will find themselves working alongside greater numbers of machines, too. Employees will find it much easier to comb through databases of information, retrieve specific solutions to customer issues, and detect what kind of customer they’re on the line with by utilizing advanced software. Machine learning also enhances your company’s knowledge-retention capacity in the long run.


The New Enterprise Business Integration Approach In Banking

Some ‘trendsetter’ banks, after having established maturity in mobile banking, have now begun to focus on the Omni-channel strategy. Omni-channel (from the Latin omnis, “all” or “every”) is a digital transformation paradigm to provide a common customer experience across the multiple channels through which a customer interacts with the bank. The Omni-channel implementation path often starts with the convergence of customer-facing capabilities in Online, Tablet, Mobile and new disruptive channels such as Wearables; while also developing a transition roadmap to further integrate capabilities such as alerts, notifications and a 360° customer view into other self-service channels such as IVR (Interactive Voice Response), Kiosk and ATM, as well as assisted channels.


Preparing For Disruption: Fintech And The Fortune 500

Startups usually concentrate on one area of the financial business and do it well, whereas most Fortune 500 financial companies have diverse lines of business. As a result, large financial companies are fending off assaults on their bottom line from multiple fronts. For instance, PayPal, Stripe and Square are homing in on payment processing on one side, while robo-advisors like Betterment and Wealthsimple are looking to take over a chunk of the wealth management sector. And it doesn't stop there. The latest fintech players have ventured into the lending industry, where companies like LendingClub and SoFi attack the consumer lending market, and Kabbage and OnDeck Capital look to become leaders in small business lending – a traditionally under-served market due to the high cost of processing loans.


The battle between banks and disruptors is just beginning

There are companies that do similar things in lending, savings, investments and other specific areas of financial services based upon internet technologies. These companies have names like Zopa, SmartyPig, Nutmeg and eToro, and have fun branding and cool offices. They are very different from banks. They all share many of the same attributes, in terms of being young, aspirational, visionary and capable. This is why, collectively, they have seen investments from venture capital and other funds averaging $25 billion for the last four years, according to figures published by KPMG. However, there is a possible impasse here. The most successful fintech firms are not replacing banks, or at least not yet. They are serving markets that were underserved. But none of them have replaced a bank.


SmartTechnologies to SmartLiving

Closed Ecosystem IoT relates to a fully integrated system of several types of network, including machine-to-machine (M2M), machine-to-human (M2H) and machine-to-data-system (M2D), connected through an application gateway. Additionally, these networks provide pre-connected and situational relationships dependent upon tasks, locations and users. An example would be a Home Ecosystem, again as this is the most likely location to get investment at this point in human society. All possible actions and interactions within a home, including disallow rules and policies related to sensors and personal ecosystems, are defined and can be added to by users with the correct rights. Every sensor, device, task, activity and item can be included in the ecosystem. ... Personal ecosystems and an Adoption / Attribution Ecosystem


How to stop stakeholders from sabotaging projects

"It is always best to have the stakeholders be included in critical meetings. Here they can voice their concerns early and request needed information. This also allows everyone to agree on a timeline for when critical decisions need to be made," said Lane. To address any form of stakeholder sabotage, Brzychcy recommended that leaders "maintain a holistic approach to projects and carefully game out several courses of action for any undertaking." ... Brzychcy also said that project managers need to be aware of what they don't know and spend the time filling in any knowledge gaps. Continual communication is an easy way to keep stakeholders emotionally invested and interested, said Nolan. There may be times when, despite all efforts, a stakeholder remains disinterested in the project. "It may become necessary to ask for a new lead or point of contact on the stakeholder's team; this can re-energize the project and keep things moving forward," he said.


AML - A Paradigm Shift That Accelerates Data Scientist Productivity

There is a growing community around creating tools that automate the tasks outlined above, as well as other tasks that are part of the machine learning workflow. The paradigm that encapsulates this idea is often referred to as automated machine learning, which I will abbreviate as “AML” for the rest of this post. There is no universally agreed-upon scope of AML; however, the folks who routinely organize the AML workshop at the annual ICML conference define a reasonable scope on their website, which includes automating all of the repetitive tasks defined above. The scope of AML is ambitious, but is it really effective? The answer is that it depends on how you use it. Our view is that it is difficult to perform a wholesale replacement of a data scientist with an AML framework, because most machine learning problems require domain knowledge and human judgement to set up correctly.
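
To make the idea concrete, here is a minimal, illustrative sketch of one ingredient that AML frameworks automate: model selection against held-out data. The candidate "models", the data, and all names are invented for illustration; real AML systems also automate feature engineering, hyperparameter tuning and ensembling.

```python
# Toy automated model selection: score each candidate model on a
# validation set and keep the one with the lowest error.
def mse(model, data):
    # Mean squared error of a model (a callable x -> prediction).
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# Synthetic data generated by y = 2x + 1.
train = [(x, 2 * x + 1) for x in range(10)]
valid = [(x, 2 * x + 1) for x in range(10, 15)]

# A tiny "search space" of candidate models.
candidates = {
    "constant": lambda x: 10.0,       # always predict a fixed value
    "identity": lambda x: float(x),   # predict y = x
    "linear":   lambda x: 2 * x + 1,  # predict y = 2x + 1
}

# The automated part: pick the candidate with the lowest validation error.
best = min(candidates, key=lambda name: mse(candidates[name], valid))
print(best)  # -> linear
```

Even this toy version shows why human judgement still matters: someone has to decide what the candidate space and the validation data should look like.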


Artificial Intelligence Can Be a Catalyst Across Most Cycles of the IoT

Overall statistics aside, individual enterprises have their own stories about data growth, and how they intend to handle it. Surely, this is what cloud computing in all its forms is all about. But simply processing it, transmitting it, and storing it is not enough. Data is not simply water or electricity. It’s a core asset of any company. Well, the third wave of artificial intelligence (AI) is upon us. A survey by Pega investigated what organizations think about AI. The capabilities that seem attractive include speech recognition, replication of human interaction, problem solving, etc. The capabilities organizations mentioned as actively developing within AI were game playing, running surveillance, automating manufacturing, etc. What’s the urgency? We’ve been hearing about the IoT for several years now, with focuses on making sense (if possible) of the protocols involved, the security of data, and how to handle it in its many varieties, velocities, and volumes.


On business-architecture and enterprise-architecture

The key problem here is that what most people call ‘enterprise-architecture’ is actually a contraction of an older and more accurate term: ‘enterprise-wide IT-architecture’ [EWITA]. Which no doubt seems fair enough at first – after all, ‘enterprise-architecture’ is a useful shorthand for EWITA. The catch is that that contraction becomes dangerously misleading when we move beyond an IT-only domain, and outward towards the enterprise itself. ... The point here, that way too many people still miss, is that we cannot run a BDAT-stack backwards: it is always base-relative. The mistake that is made time and again by users of TOGAF et al. is that they assume we can start anywhere in the stack – but if we do that from anywhere other than the base, the result gives us a scope that can be dangerously incomplete.


INDEA: Catalysing One Nation One Govt In India with EA

The vision of IndEA is “to establish best-in-class architectural governance, processes and practices with optimal utilisation of ICT infrastructure and applications to offer ONE Government experience to the citizens and businesses through cashless, paperless and faceless services enabled by Boundaryless Information Flow™.” IndEA comprises eight distinct yet inter-related reference models, each covering a unique and critical architecture view or perspective ... Integration of government business processes and services across the breadth of the enterprise is needed for delivering the services conveniently to the citizens on a sustainable basis. Data interchange in an e-Government setup is a primary need.



Quote for the day:


"It takes courage to believe that the best is yet to come." -- Robin Roberts


Daily Tech Digest - July 29, 2017

How Machine Learning Helps With Fraud Detection

Machine learning works on the basis of large, historical datasets that have been created using a collection of data across many clients and industries. Even companies that only process a relatively small number of transactions are able to take full advantage of the data sets for their vertical, allowing them to get accurate decisions on each transaction. This aggregation of data provides a highly accurate set of training data, and the access to this information allows businesses to choose the right model to optimize the levels of recall and precision that they provide: out of all the transactions the model predicts to be fraudulent, what proportion actually are (precision), and out of all the genuinely fraudulent transactions, what proportion does the model catch (recall)? Once the accuracy of the models is deemed acceptable it is time to start predicting, but where do such predictions come from?
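
The two metrics are easy to confuse, so here is a small, self-contained sketch of how they are computed from a model's predictions. The labels and predictions below are made up for illustration (1 = fraud, 0 = legitimate).

```python
# Precision: of the transactions flagged as fraud, how many really are?
# Recall: of the truly fraudulent transactions, how many did we flag?
def precision_recall(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 1, 1, 0, 0, 0, 1, 0]
p, r = precision_recall(actual, predicted)
print(p, r)  # -> 0.75 0.75
```

Tuning a fraud model is usually a trade-off between the two: flagging more transactions raises recall but tends to lower precision.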


Can Cordova Fit Your Target Industry?

AppBrain statistics show that Cordova, with all its limitations, is persistently used in development of complex apps for quality-demanding industries. In other words, developers are simply making fish climb trees, as they use Cordova for projects it wasn’t meant for from the very start. While Cordova takes up only 1.40% of U.S. top-500 apps, its installs are even lower - a mere 0.49% - which only proves that the results of the developers’ attempts aren’t satisfying the users in the least. The low adoption figure also hints at the fact that most complex apps created with Cordova turn out to be unsuccessful and disappoint users, who quickly abandon them due to underperformance. Then, most likely, developers blame it all on the tool instead of accepting that they asked too much from it.


5 Steps to Becoming a Major League Digital Influencer

It doesn't matter how much content you create and share; if you don't have a tribe of your own, that content of yours is just never going to get seen. You can't rely on organic audience growth, either, not unless you're already famous. You have to build your own following. You have to find people interested in your area of expertise, in the niche in which you want to become an influencer, and connect with them. If you're building a Twitter following, follow people in your target audience so that if your profile and content resonate, they will follow you back. The same goes for Facebook, Instagram, and LinkedIn. What's more, you have to do all this on a daily basis. You have to be relentless about building your following until you get to that point where organic growth kicks in; and even then, I would continue building it.


Digital Transformation Myths and How They Hold You Back

The term “digital transformation” has a nice ring to it, but few organizations understand the true meaning of the words. Most believe it simply refers to moving away from inefficient, outdated technologies. However, the textbook definition of digital transformation necessitates a fundamental shift or evolution of your business model, changing the very way in which your business is conducted. Few organizations complete full-fledged digital transformations. For instance, updating an outdated mainframe system to a modern, service-oriented architecture in the cloud is not a digital transformation. A brick-and-mortar bank shifting its focus to an electronic online presence with new products and services is. It’s important to acknowledge that while many organizations either believe that they need a digital transformation or are in the midst of one, few actually require a large business transformation.


How to Evaluate Business Intelligence Tools

For the selection of business intelligence tools, one can browse the CVs of their employees for some experience in the field of BI. These employees can help them evaluate BI products. It is easier to have software which even employees with basic skills can operate: easier work, and less money spent on training more people! ... Choosing software according to company requirements is the priority. Many BI providers have different versions of software packages. These versions are priced according to the features enabled within each package. Hence, it is wise to decide on the basis of the prime requirements of the software. It is not necessary that software with more advanced features is always useful. You might end up with more capital cost to the company as well as an increase in maintenance cost due to an increase in hardware.


Recommendation System Algorithms

In the last 10 years, neural networks have made a huge leap in growth. Today they are applied in a wide range of applications and are gradually replacing traditional ML methods. I’d like to show you how the deep learning approach is used by YouTube. Undoubtedly, it’s a very challenging task to make recommendations for such a service because of the big scale, dynamic corpus, and a variety of unobservable external factors. According to the study “Deep Neural Networks for YouTube Recommendations”, the YouTube recommendation system algorithm consists of two neural networks: one for candidate generation and one for ranking. In case you don’t have enough time, I’ll leave a quick summary of this research here. Taking events from a user’s history as input, the candidate generation network significantly decreases the number of videos and makes a group of the most relevant ones from a large corpus.
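
The two-stage retrieve-then-rank pattern the paper describes can be sketched without any neural networks at all. Everything below is a toy stand-in: the corpus, the "taste bucket" heuristic for candidate generation, and the ranking score are all invented, but the shape of the pipeline (a cheap filter narrowing a large corpus, then a more expensive ranker ordering the survivors) mirrors the paper's architecture.

```python
# Toy corpus: video id -> crude relevance bucket (stand-in for embeddings).
corpus = {f"video{i}": i % 7 for i in range(1000)}

def generate_candidates(user_history, k=50):
    # Stage 1 (cheap): keep up to k videos whose bucket matches the
    # user's dominant taste. In the real system this is a neural
    # retrieval model over a corpus of millions.
    taste = max(set(user_history), key=user_history.count)
    matches = [v for v, bucket in corpus.items() if bucket == taste]
    return matches[:k]

def rank(candidates):
    # Stage 2 (expensive): score and order the small candidate set.
    # Stand-in score: prefer "newer" uploads, i.e. higher numeric ids.
    return sorted(candidates, key=lambda v: int(v[len("video"):]), reverse=True)

history = [3, 3, 5, 3]                # taste buckets of recently watched videos
cands = generate_candidates(history)  # hundreds of thousands -> ~50
ranked = rank(cands)                  # ~50 -> an ordered list to show
print(len(cands), ranked[0])
```

The key design point survives the simplification: the ranker can afford richer features precisely because candidate generation has already cut the corpus down by orders of magnitude.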


6 Predictions for the Convergence of IoT and Digital Marketing

Since IoT technology connects the internet with objects that are ubiquitous in our daily lives, marketers in almost every industry will be able to engage consumers throughout every phase of the customer journey. The term “Big Data” is an understatement for the amount of data IoT devices will produce. According to the Ericsson Mobility Report, IoT devices and sensors will exceed mobile phones as the largest category of connected devices in 2018 and generate a staggering 400 zettabytes of data per year. IoT's surge will overjoy marketers because they can leverage these massive data sets to integrate consumer behavioral signals into their marketing stack. This will allow them to capture interactions, conversion metrics, and consumer behavior predictions and link them to purchase-intent data.


Benefits Of Telecommuting For The Future Of Work

According to the State of Work Productivity Report, 65% of full-time employees think a remote work schedule would increase productivity. This is backed up by more than two-thirds of managers reporting an increase in overall productivity from their remote employees. Where do telecommuters find this extra boost of productivity? With none of the distractions from a traditional office setting, telecommuting drives up employee efficiency. It allows workers to retain more of their time in the day and adjust to the personal mental and physical well-being needs that optimize productivity. Removing something as simple as a twenty-minute commute to work can make a world of difference. If you are ill, telecommuting allows you to recover faster without being forced to be in the office. It also improves the impact on our overall health.


Artificial Intuition  –  A Breakthrough Cognitive Paradigm

The first big conceptual leap that we have to make is to understand that learning systems evolve in non-equilibrium settings. ... Stated in a different way, researchers should be very cautious about employing statistical or alternatively bulk thermodynamic metrics in their analysis of these systems. It is my belief that one of the most glaring inappropriate tools in the study of AI is the use of Bayesian methods. ... The second conceptual leap is to understand that our notion of what “Generalization” means is quite grossly inadequate. The use of the term in Machine Learning is extremely liberal. Furthermore, the Machine Learning approach of ‘curve fitting’, and thus interpolation and therefore generalization between adjacent points in the fitted curve, breaks down under the recently discovered notion of rote memorization in Deep Learning.


Q&A on The Digital Quality Handbook

Teams today are adopting cloud services that reduce the pain of managing local device/desktop browser labs. In addition, teams are either developing overlays on top of open-source frameworks like Appium/Protractor and such to close test automation coverage issues, or are using a mix of tools as part of their tool stack to get proper testing capabilities. In addition, keeping an eye on analytics and market trends as a way to understand how the market is moving, and which devices are trending up or down, can also help. With analytics in mind, having a well-structured testing strategy that learns from previous executions and provides insights back to the team can help focus the testing on the most valuable tests - quality and value over quantity. As an example, identify the tests that found the most defects per platform.
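
That closing example (tests that found the most defects per platform) is a simple aggregation over execution history. Here is a hedged sketch; the field names and the sample log are illustrative, not from any particular test-management tool.

```python
# Aggregate defect counts per (platform, test) from an execution log,
# then report the most defect-prone test on each platform.
from collections import defaultdict, Counter

executions = [
    {"test": "login_flow", "platform": "iOS",     "defects": 3},
    {"test": "checkout",   "platform": "iOS",     "defects": 1},
    {"test": "login_flow", "platform": "Android", "defects": 0},
    {"test": "search",     "platform": "Android", "defects": 4},
    {"test": "login_flow", "platform": "iOS",     "defects": 2},
]

per_platform = defaultdict(Counter)
for run in executions:
    per_platform[run["platform"]][run["test"]] += run["defects"]

# Top defect-finding test per platform: (test name, total defects found).
top = {plat: counts.most_common(1)[0] for plat, counts in per_platform.items()}
print(top)  # -> {'iOS': ('login_flow', 5), 'Android': ('search', 4)}
```

Feeding a report like this back into planning is what lets a team prioritize the handful of tests that actually earn their runtime on each platform.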



Quote for the day:


"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis


Daily Tech Digest - July 28, 2017

The Virtues Of Digital: Creating A Truly Digital Bank

From Uber to Buzzfeed, Spotify and Netflix, we’ve been finding the new paradigms that best fit digital capabilities to replace and disrupt the digitised analogue products and services that came before. Banking is no exception to this rule. We’ve moved from a passbook to a printed statement that was first mailed to us, then put on a PC screen and eventually shrunk to your phone. Digitised banking – tick! Truly digital banking – we haven’t seen it yet! When I co-founded Monzo, a new digital challenger bank in the UK, that was the challenge that faced me. As Chief Customer Officer I led product and proposition, and I knew that I didn’t want to just digitise banking, I wanted to find the new paradigm, an approach that delivered digital banking rather than just digitising what had come before.


Cyber security not a priority for most sectors, study finds

Above the hospitality and food sector on the lower end are manufacturing (31%); admin and real estate (28%); construction (23%); transport and storage (23%); and entertainment and service (21%). This is despite the fact that cyber security has featured increasingly in mainstream media because of several high-profile data breaches and the fact that millions of UK firms are being hit by cyber attacks. According to research by business internet service provider Beaming, 2.9 million UK firms suffered cyber security breaches in 2016, costing them an estimated total of £29.1bn. According to security professionals consulted by networking hardware company Cisco, the operations of an organisation (36%) are most likely to be affected by any potential cyber attack.


The seven V’s of a data fabric

A data fabric must support the modernization of storage and data management, and move away from the proliferation of data silos. But a data fabric must also integrate with legacy systems, without requiring their presence for the long-term. To work effectively a data fabric must be broad and support a vast array of applications and data types at scale across locations. While data fabrics are a significant change from the assumptions that usually surround data storage and processing, the requirements have their roots in big data. The big data era was driven by the three V’s – Volume, Variety and Velocity. A data fabric does encompass these requirements but goes well beyond. In fact, an interesting way to summarize the requirements of a data fabric is the seven V’s – Volume, Variety, Velocity, Veracity, Vicinity, Visibility, and Value.


Design Patterns for Deep Learning Architectures

Pattern languages are an ideal vehicle for describing and understanding Deep Learning. One would like to believe that Deep Learning has a solid fundamental foundation based on advanced mathematics. Most academic research papers will conjure up highfalutin math such as path integrals, tensors, Hilbert spaces, measure theory, etc., but don't let the math distract you from the reality that our collective understanding remains minimal. Mathematics, you see, has its inherent limitations. Physical scientists have known this for centuries. We formulate theories in such a way that the expressions are mathematically convenient. Mathematical convenience means that the math expressions we work with can be conveniently manipulated into other expressions.


Scale up DevOps processes against the odds

Start with absolute buy-in from all the teams involved and a good architectural footprint. Embrace the minimum viable product, and build on it. For example, if you build servers manually, develop processes to create and deploy a golden image in your virtualization platform of choice. The next step: Implement a secure and compliant base image across all Windows systems and another across all Linux systems, then generalize the application stack. Entrenched organizations can have 50,000 servers with as many different configurations, he pointed out, so iterative platform changes must happen before DevOps processes can translate to, "I can push a button and get my application stack." "Ultimately, you want to get [to end-to-end automation], but don't go in expecting it," said Herz. "Do a little bit, make that little bit better and move up the stack."


7 steps to baking AI into your business

Cognitive resources, algorithms, and learning models are now widely available through APIs and cloud services. IBM, which has been in the AI game since the ’50s, leans on Watson and Bluemix, but Amazon, Google, and Microsoft are also heavily invested and have strong offerings. And it’s not a game for traditional big tech only. Marketing platforms like Salesforce and Adobe are also starting to offer AI as a platform, and a vast number of specialized startups are popping up. OpenAI and related nonprofit initiatives also offer resources and training tools. There’s not going to be one right answer for all. Like a web services stack, your cognitive stack will be guided by your needs, learning models that fit the job, expertise of your scientists and technologists, data sources and formats, and integration needs.


The CEO’s guide to competing through HR

Many organizations have already built extensive analytics capabilities, typically housed in centers of excellence with some combination of data-science, statistical, systems-knowledge, and coding expertise. Such COEs often provide fresh insights into talent performance, but companies still complain that analytics teams are simple reporting groups—and even more often that they fail to turn their results into lasting value. What’s missing, as a majority of North American CEOs indicated in a recent poll, is the ability to embed data analytics into day-to-day HR processes consistently and to use their predictive power to drive better decision making. In today’s typical HR organization, most talent functions either implicitly or explicitly follow a process map; some steps are completed by business partners or generalists, others by HR shared services, and still others by COE specialists.


HR in the age of disruption

First, HR needs to truly understand the flexible work environment. Employees of the future are no longer bound to stay in the office, sit in a cubicle and work from 9 to 5. Rather, they will become location-independent and will be able to work when and where they want as long as they can get access to WiFi and manage to get the jobs done. This is because internet and mobile devices have transformed the way people work, interact and collaborate. Next, the use of new methods to communicate and collaborate must be promoted immediately. Email will no longer be considered the most effective or efficient way to communicate or collaborate. Instead, technologies such as internal collaboration platforms are going to replace email in many situations.


Microsoft explores 'safe' manual memory management in .Net

Microsoft’s model for manual memory management builds on the notion of unique owners of manual objects, with locations in the stack or heap holding only the reference to an object allocated on the manual heap. The concept of shields is introduced to enable safe concurrent sharing of manual objects. The shield creates state in local thread storage to prevent de-allocation while the object is being used. While garbage collectors such as the .Net GC offer high throughput through fast thread-local bump allocation and collection of young objects, studies show that GC can introduce performance overhead compared to manual memory management, the researchers said. These overheads are amplified in big data analytics and real-time stream processing applications, partly due to the need to trace through large heaps, they explained.
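
The "unique owner" idea in Microsoft's proposal has a well-known compile-time analogue in Rust, which can make the concept concrete. The sketch below is an analogy only, not the researchers' .NET mechanism (their shields handle concurrent sharing at runtime, which Rust instead addresses through its borrow rules); the function names are invented for illustration.

```rust
// Illustrative analogy: a Box<T> has exactly one owner. Passing it by
// value transfers ownership, and the compiler rejects any later use of
// the moved-from variable - so the buffer can be freed deterministically
// when its sole owner goes out of scope, with no GC and no dangling use.
fn consume(buf: Box<Vec<u8>>) -> usize {
    // `consume` now uniquely owns `buf`; it is deallocated here, at the
    // end of this scope, not at some later collection pause.
    buf.len()
}

fn main() {
    let data = Box::new(vec![1u8, 2, 3]);
    let n = consume(data); // ownership moves into `consume`
    // println!("{:?}", data); // would not compile: `data` was moved
    println!("{}", n);
}
```

The appeal for big-data and streaming workloads mentioned above is the same in both settings: deallocation cost is paid predictably at ownership boundaries instead of in heap-wide tracing passes.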


Social Intelligence: How To Mine Social Media For Business Results

“There will always be a need for people to read, interpret and understand what customer needs are and how the brand should react,” he says. Nissan North America has six to eight analysts that review data aggregated in queues by the social media management tool Sprinklr, which monitors corporate Twitter handles, Facebook pages, Instagram and Google Plus. The analysts, Long says, are the ones who decide when to respond. “An individual instance of concern might not be enough to warrant a look, but when you get into a top 10 or top 20 ranked concern, you have to start paying attention,” he says. Long considers social intelligence a very important data point that, when coupled with satisfaction surveys and other customer feedback, can help inform and shape the organization’s products, advertising and customer experience.



Quote for the day:


"The quality of a leader is reflected in the standards they set for themselves." -- Ray Kroc


Daily Tech Digest - July 26, 2017

Digital transformation failure is a business failure

The challenge is constant and unrelenting, as the survey found: 80% of IT leaders are under pressure to be constantly improving their organisation’s customer experience through digital innovation, but 90% of digital projects fail to meet expectations and only deliver incremental improvements. Databases are currently a clear handicap to this improvement, as 84% have had digital projects cancelled, delayed, or reduced in scope because of the limitations of their legacy database. “Our study puts a spotlight on the harsh reality that despite allocating millions of dollars towards digital transformation projects, most companies are only seeing marginal returns and realising this trajectory won’t enable them to compete effectively in the future,” said Matt Cain, CEO of Couchbase.


Artificial intelligence is not as smart as you (or Elon Musk) think

There are in fact many cases of AI algorithms not being quite as smart as we might think. One infamous example of AI out of control was the Microsoft Tay chatbot, created by the Microsoft AI team last year. It took less than a day for the bot to learn to be racist. Experts say that it could happen to any AI system when bad examples are presented to it. In the case of Tay, it was manipulated by racist and other offensive language, and since it had been taught to “learn” and mirror that behavior, it soon ran out of the researchers’ control. A widely reported study conducted by researchers at Cornell University and the University of Wyoming found that it was fairly easy to fool algorithms that had been trained to identify pictures. The researchers found that when presented with what looked like “scrambled nonsense” to humans, algorithms would identify it as an everyday object like “a school bus.”


CISO: To achieve security in IoT devices, remember the fundamentals

When you are talking about smart homes, the primary responsibility of a CISO is to promote the consumerization of the smart home by getting rid of the fear factor that smart home devices can affect your privacy. ... Irrespective of the IoT or IT field, the biggest challenge every security officer faces today is weighing the business value against the risks. You should be able to support the business in a way that the product can be launched quickly so that the market can be captured appropriately, but at the same time the risks should be articulated. Being able to articulate the risks in the language of business will always be a learning exercise for every security professional. Sometimes businesses will make decisions based on the risks and you should be ready to flow with it. Sometimes the decisions will be made in favor of security. Either way, security should not be a blocker to business.


5 Reasons To Take A Fresh Look At Your Security Policy

The golden rules for writing security policy still apply, such as making sure the process is shared with all stakeholders who will be affected by it, using language that everyone can understand, avoiding rigid policies that might limit business growth, and ensuring the process is pragmatic by testing it out. Just because policies are intended to be evergreen doesn’t mean they can’t become stale, says Jay Heiser, research VP in security and privacy at Gartner. Particularly at the standards levels, one level below policy, guidance may need to be updated for different lines of business, or for jurisdictions that may be driven by different regulatory rules or geographic norms. Security and risk experts offer five reasons why companies should take a fresh look at security policies.


Security Think Tank: Avoiding the blame game

The traditional wisdom of “don’t open suspicious links or attachments” does not prevent a user clicking on an email that has been specifically designed and crafted not to be suspicious. We have also seen attacks that exploit web pages people are known to browse or access, forums they use and other aspects of what could be described as “normal use of corporate and personal IT systems”. This extends to the use of cloud-based applications and file storage/sharing systems, travel booking services, tech support and chat applications – not reckless or naive behaviour, just normal. ... If you assume that some users are going to fall for these scams and that not all systems are going to have patches applied, then there is a need for better controls that can filter this use/exploitation and detect/prevent the inevitable people/process/technology security failures.


Cybersecurity skills shortage hurts security analytics, operations

Cybersecurity skills are especially important when it comes to security analytics and operations. It takes highly experienced professionals to investigate security incidents, synthesize threat intelligence, or perform proactive hunting exercises. Unfortunately, this skill set is particularly lacking. In a recently published ESG research report, Cybersecurity Analytics and Operations in Transition, 412 cybersecurity and IT professionals were asked about the size and skill set of their organization’s cybersecurity team. As it turns out, 54 percent of survey respondents said the skill level for cybersecurity analytics and operations was inappropriate for an organization of their size, and 57 percent said the staff size for cybersecurity analytics and operations was inappropriate for an organization of their size.


What does 'hybrid cloud' mean? It depends on whom you ask

A new survey conducted by the cloud infrastructure company Stratoscale finds that C-level executives define hybrid cloud slightly differently than IT specialists. When asked, "What does hybrid cloud mean to you?" the plurality of executives (44 percent) said it means that different workloads belong in different environments. Another 33 percent said it means the ability to move workloads between private and public cloud. By comparison, IT specialists were effectively split between the two answers (39 percent and 38 percent, respectively). Additionally, the survey suggests that executives experience a shift in their perception of "hybrid cloud" as their company's cloud adoption level increases. In enterprises with adoption levels below 20 percent, hybrid is most often defined as the ability to move workloads between private and public cloud.


When it comes to cybersecurity, why is healthcare so behind?

Despite improving slightly, healthcare organizations have to stay on top of keeping their security measures in check. Organizations “that feel they are in a good place with their security program are the ones that do an annual external risk assessment,” Hall said. It’s crucial to conduct such assessments on a yearly basis, Hall noted, because of the evolving nature of cyberattacks. Rather than being reactive about their security efforts, hospitals must strive to be proactive when it comes to protecting their valuable data. And the data is indeed precious — gaining access to protected health information means big money for hackers. “Healthcare data is both accidental and intentional targets of attacks,” Beth Musumeci, vice president of cybersecurity for GE Healthcare, said.


The very real dangers of betting on hybrid cloud

On one hand, you have the expense and capital investment of operating your own data centers, but without the complete control of your data and processes that is one of the key benefits of on-premises technology stacks. On the other hand, you’ll have to deal with the complexity and potential pitfalls of working with third-party cloud providers, without gaining much of the cloud’s promised benefit of freeing your team to focus on innovation instead of infrastructure. ... Companies that really want to move to the cloud would be well served to move as much as possible—if not all—of their data center activities to the cloud as soon as practical. But, if for whatever reason they’re not ready to fully commit to the public cloud for the next few years, there’s a strong argument for doubling down on running their own data centers, and using the cloud only for simple, completely separate applications.


The Machines Are Getting Ready to Play Doctor

The automated approach could prove important to everyday medical treatment by making the diagnosis of potentially deadly heartbeat irregularities more reliable. It could also make quality care more readily available in areas where resources are scarce. The work is also just the latest sign of how machine learning seems likely to revolutionize medicine. In recent years, researchers have shown that machine-learning techniques can be used to spot all sorts of ailments, including, for example, breast cancer, skin cancer, and eye disease from medical images. “I've been encouraged by how quickly people are accepting the idea that deep learning can diagnose at an accuracy superior to doctors in select verticals,” Ng said via e-mail. He added that it’s encouraging to see researchers looking beyond imaging to other forms of data such as ECG.



Quote for the day:


"Don't be buffaloed by experts and elites. Experts often possess more data than judgement." -- Colin Powell


Daily Tech Digest - July 25, 2017

10 Old-School IT Principles That Still Rule

The technology you buy is a long-term commitment on your part. You need it to be a long-term commitment on the supplier’s part, too. To play it safe, IT used to buy from big vendors. Now? Not only can open source be just as safe, sometimes you can get it from IBM or other big vendors. Not every open source technology has a broad enough base of support, but many do. If PHP, for example, will do the job, would you look at Java twice given its awful security track record? And yet Java is supported (perhaps “provided” would be more accurate) by Oracle, one of the biggest software companies in the world. This isn’t entirely new, either. The open-source-like SHARE library dates to the 1970s, after all.


Embrace the heat: Data center tips for summer operations

It’s not quite as blistering as that hot yoga class, but data center managers wrestling with their energy bills should seriously consider embracing the heat and asking techs to bring their shorts to work. Running data centers in the 80 to 82 degree Fahrenheit range, as opposed to 70 degrees or below, can cut the energy bill by up to two percent per degree. That’s a significant cost savings, especially if we’re talking a full 10 degree swing. Even during peak workloads, your data center should be able to take the heat. It may seem to go against conventional wisdom, but running a server “hot,” or operating that data center in a high temperature ambient (HTA) state, raises the inlet temperature of that unit while still staying well below component specifications. This is another way for crafty (and probably now sweaty) data center managers to keep their cooling costs under control.
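The arithmetic behind that “two percent per degree” rule of thumb is worth sketching, since the cumulative figure depends on whether you read the savings as linear or as compounding. A minimal back-of-envelope calculation, treating the two-percent figure purely as the article's illustrative rule of thumb:

```python
# Estimate cooling-cost savings from raising the data center set point
# from 70°F to 80°F, assuming ~2% saved per degree (a rule of thumb
# from the article; actual savings depend on the facility).

PER_DEGREE = 0.02

def linear_savings(degrees: int) -> float:
    """Naive linear reading: 2% of the bill for each degree raised."""
    return PER_DEGREE * degrees

def compounded_savings(degrees: int) -> float:
    """Compounded reading: each degree trims 2% of the remaining bill."""
    return 1 - (1 - PER_DEGREE) ** degrees

print(linear_savings(10))                 # 0.2  -> 20% of the bill
print(round(compounded_savings(10), 4))   # 0.1829 -> about 18.3%
```

Either way, a 10-degree swing lands in the 18–20 percent range, which is why the article calls it a significant saving.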


Understand the multicloud management trade-off

In order to make multicloud work best for an enterprise, you need to place a multicloud management tool, such as a CMP (cloud management platform) or a CSB (cloud services broker), between you and the individual clouds. This spares you from having to deal with the complexities of the native cloud services from each cloud provider. Instead you deal with an abstraction layer, sometimes called a “single pane of glass,” where you are able to leverage a single user interface and sometimes a single set of APIs to perform common tasks among the cloud providers you’re leveraging. Tasks may include provisioning storage or compute, auto-scaling, data movement, etc. While many consider this a needed approach when dealing with complex multicloud solutions, there are some looming issues.
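The “single pane of glass” idea can be sketched as a thin dispatch layer in front of per-provider adapters. Everything below is hypothetical and invented for illustration — real CMPs expose far richer APIs and would call the actual provider SDKs where the stubs are:

```python
# Hypothetical sketch of a multicloud abstraction layer: callers use one
# interface and never touch a provider SDK directly. All class and
# method names here are invented for illustration.
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    @abstractmethod
    def provision_storage(self, gb: int) -> str: ...

class AwsLike(CloudProvider):
    def provision_storage(self, gb: int) -> str:
        return f"aws-bucket-{gb}gb"        # a real adapter would call the AWS SDK

class AzureLike(CloudProvider):
    def provision_storage(self, gb: int) -> str:
        return f"azure-container-{gb}gb"   # a real adapter would call the Azure SDK

class MultiCloudManager:
    """The 'single pane of glass': one API across every registered cloud."""
    def __init__(self, providers: dict):
        self.providers = providers

    def provision_storage(self, cloud: str, gb: int) -> str:
        return self.providers[cloud].provision_storage(gb)

mgr = MultiCloudManager({"aws": AwsLike(), "azure": AzureLike()})
print(mgr.provision_storage("aws", 100))    # aws-bucket-100gb
print(mgr.provision_storage("azure", 50))   # azure-container-50gb
```

The looming issue the article alludes to is visible even here: the abstraction can only expose the features common to all providers, so provider-specific capabilities tend to get lost behind the pane of glass.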


Network transformation is the next big IT initiative

To support the ever-growing data needs of the end-users, agencies must enhance their network infrastructure. However, in the current environment of IT budget cuts, procuring high-performance routers and firewalls is not feasible for many agencies. They must therefore explore other avenues to enhance their network infrastructure and capacity. Software-defined networking offers a potential solution for agencies that are looking to modernize their network environment without incurring much capital investment. Leveraging the principles of compute and storage virtualization, SDN allows agencies to virtualize their network infrastructure and services. Similar to data center virtualization where applications run on virtual machines, SDN enables network services (routing, firewall and WANX) to run on virtual machines hosted on general-purpose hardware.
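The core mechanism that lets routing and firewalling run as software on commodity hardware is the match/action flow table. A toy illustration of that idea (field names and rule format invented here; real SDN data planes such as Open vSwitch match on many packet fields and are driven by a controller):

```python
# Toy "network service as software": a flow table evaluated in order,
# first match wins, combining a routing rule and a firewall rule.
from dataclasses import dataclass

@dataclass
class Rule:
    dst_prefix: str   # e.g. "10.0." matches any IP starting with that
    action: str       # "forward:<port>" or "drop"

class SoftSwitch:
    def __init__(self, rules):
        self.rules = rules  # ordered; first matching rule applies

    def handle(self, dst_ip: str) -> str:
        for rule in self.rules:
            if dst_ip.startswith(rule.dst_prefix):
                return rule.action
        return "drop"  # default-deny, as a firewall would

switch = SoftSwitch([
    Rule("10.0.", "forward:1"),   # internal traffic -> port 1 (routing)
    Rule("192.168.", "drop"),     # blocked subnet (firewall)
])
print(switch.handle("10.0.3.7"))     # forward:1
print(switch.handle("192.168.1.1"))  # drop
print(switch.handle("8.8.8.8"))      # drop (default)
```

Because the table is just data, a controller can repopulate it on the fly — which is exactly the flexibility the article credits SDN with, versus reconfiguring fixed-function routers.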


Tweaking Internet Explorer to only use TLS 1.2

Out of the box, IE 11 conforms to the current standard, which is that it supports TLS 1.0, 1.1 and 1.2. This should be true on any up-to-date copy of Windows 7, 8.1 or 10. The nice thing about Internet Explorer is that the configuration options for supported TLS versions are right where they should be. They can be found with: Tools -> Internet Options -> Advanced tab. Among the advanced options, they are at the very bottom. Changing these options is even easier than finding them. There is a simple, obvious checkbox for each version of SSL and TLS that you would like to include or exclude. Compare this to Firefox, where you had to know the secret handshake to remove support for TLS 1.0 and 1.1. After limiting IE11 to just TLS 1.2, the Qualys SSL Client Test should confirm that the tweaking actually works.
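For anyone scripting this rather than clicking checkboxes: those settings are commonly stored as a bitmask in the `SecureProtocols` DWORD under `HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings`. The bit values below are the widely documented ones, but verify them against your own Windows build before pushing registry changes. A small sketch of how the mask is composed:

```python
# SecureProtocols bitmask values as commonly documented for the
# Internet Settings registry key (verify on your own Windows build).
SSL2, SSL3 = 0x08, 0x20
TLS10, TLS11, TLS12 = 0x080, 0x200, 0x800

default = TLS10 | TLS11 | TLS12   # IE 11 out of the box: TLS 1.0/1.1/1.2
tls12_only = TLS12                # the hardened, TLS 1.2-only setting

print(hex(default))    # 0xa80
print(tls12_only)      # 2048
```

Writing 2048 (0x800) to that DWORD, then restarting IE, should match the effect of unchecking everything except TLS 1.2 in the Advanced tab.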


The paranoid Android traveler’s data-protection checklist

Changes to Android in more recent releases have bolstered security, so if you are traveling with an older device that does not support Nougat, you may want to seriously consider a hardware upgrade. Among other improvements, Nougat introduced new — and potentially more secure — device and file encryption; newer devices should have adequate hardware to handle encryption effectively (more details below). These tips are in roughly increasing order of difficulty and complexity, with the simplest and quickest first. In general, these tips involve a tradeoff between security and ease of use (making it harder to search your device can also make it a little harder for you to use it). So you may want to use some of these options only when traveling.


Cashing in on the Internet of Things

The practical (or impractical) reality of smart connected products in the home suggested there was a need for them to work together, so key industry players began to jockey for dominance. This pertained to the communications standards, as well as the ultimate command and control platforms ranging from Apple HomeKit to Amazon Echo to Google Home, Samsung SmartThings, and others. The AllSeen Alliance (primarily driven by Qualcomm) got involved to broker standards for consumer IoT as well. And while the focus today in most elements of IoT is still largely on smart connected products, the progression to product systems is clearly happening. Larger players like GE and Hitachi, bringing forward solutions such as GE Predix and Hitachi Lumada, further demonstrate this.


10 Essential Performance Tips For MySQL

The best way to understand how your server spends its time is to profile the server’s workload. By profiling your workload, you can expose the most expensive queries for further tuning. Here, time is the most important metric because when you issue a query against the server, you care very little about anything except how quickly it completes. The best way to profile your workload is with a tool such as MySQL Enterprise Monitor’s query analyzer or pt-query-digest from the Percona Toolkit. These tools capture queries the server executes and return a table of tasks sorted by decreasing order of response time, instantly bubbling up the most expensive and time-consuming tasks to the top so that you can see where to focus your efforts. Workload-profiling tools group similar queries together, allowing you to see the queries that are slow, as well as the queries that are fast but executed many times.
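The grouping step those tools perform can be sketched in a few lines: normalize each query to a fingerprint by collapsing its literals, then aggregate response time per fingerprint and sort descending. This is a simplified illustration of the approach, not pt-query-digest's actual fingerprinting, which is considerably more sophisticated:

```python
# Minimal sketch of workload profiling: fingerprint queries so similar
# ones group together, then rank fingerprints by total response time.
import re
from collections import defaultdict

def fingerprint(sql: str) -> str:
    """Collapse literals so structurally identical queries match."""
    sql = re.sub(r"\d+", "?", sql)       # numeric literals -> ?
    sql = re.sub(r"'[^']*'", "?", sql)   # string literals  -> ?
    return sql.lower().strip()

def profile(samples):
    """samples: iterable of (sql, response_time_seconds) pairs."""
    totals = defaultdict(lambda: [0.0, 0])  # fingerprint -> [time, count]
    for sql, t in samples:
        fp = fingerprint(sql)
        totals[fp][0] += t
        totals[fp][1] += 1
    # Most expensive fingerprints first, by total time consumed
    return sorted(totals.items(), key=lambda kv: kv[1][0], reverse=True)

log = [
    ("SELECT * FROM orders WHERE id = 1", 0.002),
    ("SELECT * FROM orders WHERE id = 2", 0.003),
    ("SELECT * FROM reports WHERE year = 2017", 1.9),
]
for fp, (total, count) in profile(log):
    print(f"{total:.3f}s x{count}  {fp}")
```

Note how the two `orders` lookups collapse into one line with a count of 2 — that is exactly how profilers surface queries that are individually fast but expensive in aggregate.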


Don’t let cybercrime hold your innovation to ransom

It’s no secret that innovation is vital to stay ahead of the competition. However, it cannot come at the expense of business continuity. As a result, modern IT systems have to be more complex. While businesses work hard to make them as robust as possible, when you’re constantly innovating, that complexity introduces an element of fragility and unpredictability that can be difficult to manage. The best way for CIOs to achieve these objectives is to effectively create and deploy innovative business services that are built on the organisation’s existing IT foundation and layered with new delivery models and platforms. In practice, it’s bridging the old and the new, enabling an organisation to innovate faster at a lower risk – and, thankfully, without the need to rip and replace legacy applications.


Big Data Ingestion: Flume, Kafka, and NiFi

Flume is a distributed system that can be used to collect, aggregate, and transfer streaming events into Hadoop. It comes with many built-in sources, channels, and sinks, for example, Kafka Channel and Avro sink. Flume is configuration-based and has interceptors to perform simple transformations on in-flight data. It is easy to lose data using Flume if you’re not careful. For instance, choosing the memory channel for high throughput has the downside that data will be lost when the agent node goes down. A file channel will provide durability at the price of increased latency. Even then, since data is not replicated to other nodes, the file channel is only as reliable as the underlying disks. Flume does offer scalability through multi-hop/fan-in fan-out flows. For high availability (HA), agents can be scaled horizontally.
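The memory-versus-file channel trade-off described above comes down to what survives when an agent node dies. A toy illustration of that failure mode (a real Flume file channel uses a write-ahead log with checkpoints, not the naive append-a-line scheme sketched here):

```python
# Toy durability comparison: an event buffered only in RAM is gone
# after a "crash", while one written to disk first can be recovered.
import os
import tempfile

class MemoryChannel:
    def __init__(self):
        self.events = []
    def put(self, event: str):
        self.events.append(event)   # fast: high throughput, RAM only
    def recover_after_crash(self):
        return []                   # everything in flight is lost

class FileChannel:
    def __init__(self, path: str):
        self.path = path
    def put(self, event: str):
        with open(self.path, "a") as f:   # slower: disk I/O on every put
            f.write(event + "\n")
    def recover_after_crash(self):
        with open(self.path) as f:        # events survive the restart
            return [line.strip() for line in f]

log_path = os.path.join(tempfile.mkdtemp(), "channel.log")
mem, dur = MemoryChannel(), FileChannel(log_path)
mem.put("event-1")
dur.put("event-1")
print(mem.recover_after_crash())   # []          -> data lost
print(dur.recover_after_crash())   # ['event-1'] -> data recovered
```

Even the file-backed version only models single-node durability, which mirrors the article's caveat: without replication, the file channel is only as reliable as the underlying disks.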



Quote for the day:


"Fear causes hesitation and hesitation will cause your worst fears to come true." -- Patrick Swayze