Daily Tech Digest - January 17, 2018

The Neuroscience of Intelligence: An Interview with Richard Haier


Neuroscience approaches have already made intelligence research more mainstream and ready for inclusion in policy discussions. For example, the single most important factor that predicts school success, by far, is the student’s intelligence. Socioeconomic status, family resources, and school and teacher quality all pale in comparison. The data showing this are overwhelming. Yet the word “intelligence” is virtually absent from discussions about education policy in the United States and many other countries. Even if intelligence is mostly influenced by genes, all that means for education is that each student comes to school with a different set of strengths for learning. Teachers all know this, and the common goal is to maximize each student’s potential. Attempts to create policies to do this without paying attention to what we know about intelligence have failed for decades, especially with respect to closing achievement gaps.



Why Your Data Could Be At Risk Without Decentralized Computing

According to industry experts, it will take decades for CPUs to be properly redesigned to resolve these issues and replaced. What should the world do to protect itself in the meantime? The answer is decentralization. This is a form of “trustless” computing that assumes from the start that no single machine can be relied upon, instead spreading information out across many different computers or “nodes.” In this framework, even though each individual entity has the potential to be compromised, the decentralized collective will always perform the work safely and correctly. Bitcoin, Ethereum, and blockchain technology in general offer notable examples of decentralized computing. Decentralization achieves two goals. First, no single machine is making all the decisions, so no single machine can unilaterally make bad decisions that affect individual users.
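The "no single machine decides" property can be sketched as simple majority voting among nodes. This is a toy Python illustration of the idea only; real networks such as Bitcoin and Ethereum reach consensus through proof-of-work and far more elaborate protocols:

```python
from collections import Counter

def collective_result(node_results):
    """Toy trustless consensus: accept the answer reported by a strict
    majority of nodes, so no single compromised machine can dictate
    the outcome on its own."""
    tally = Counter(node_results)
    answer, votes = tally.most_common(1)[0]
    if votes > len(node_results) // 2:
        return answer
    raise ValueError("no majority; the collective cannot agree")

# Four honest nodes outvote one compromised node reporting a bad value.
print(collective_result([42, 42, 42, 42, 999]))  # -> 42
```

The compromised node's answer is simply outvoted; an attacker would need to control a majority of the nodes to change the result.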


5 Ways SD-WAN Equips Enterprises to Improve Network Security


While the headlines have been alarming, overall industry trends are mixed. According to a recent report by the Ponemon Institute, the average cost of a data breach dropped by about 10 percent to $3.62 million in 2017. This is most likely tied to a reduction in the cost per record stolen, which declined from $158 in 2016 to $141 in 2017. However, the average size of data breaches rose 1.8 percent to more than 24,000 records. Clearly, this is not the time for enterprises to neglect network security. With the rapid expansion of the cloud, followed by what is likely to be an equally rapid move to the Internet of Things, wide-area infrastructure is in need of more flexible and robust protection. One of the most significant enhancements in this field is the advent of the software-defined wide-area network (SD-WAN). By abstracting regional connectivity from the underlying hardware, SD-WAN offers enterprises a number of benefits over traditional hardware-centric architectures.


6 things that prevent Blockchain from ruling the world

Generally speaking, the internet is fairly efficient when it comes to the transmission of data. The user requests information, and the server transmits back the piece of data requested with only a small amount of additional data required to get it there. The blockchain, however, in order to be preserved and to resist tampering, needs multiple copies distributed across many nodes. The blockchain therefore requires a large amount of storage – for example, Bitcoin’s blockchain was nearly 150GB in size as of last month, and it’s getting bigger all the time. Furthermore, transmitting so much data for the blockchain each time also consumes additional electricity, making the blockchain quite inefficient. At a time when efforts are being made to compress video further to decrease the data required for a download, blockchain’s bulkiness makes little sense.
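The storage overhead described here is easy to quantify: because every full node keeps its own complete copy of the chain, network-wide storage grows linearly with the number of nodes. A back-of-the-envelope sketch, where the node count is an illustrative assumption rather than a figure from the article:

```python
def total_replicated_storage_gb(chain_size_gb, node_count):
    # Every full node stores the entire chain, so total storage
    # across the network is chain size times node count.
    return chain_size_gb * node_count

# Illustrative figures only: a ~150 GB chain (Bitcoin's approximate
# size in late 2017) replicated across an assumed 10,000 full nodes.
print(total_replicated_storage_gb(150, 10_000))  # -> 1500000 GB, i.e. ~1.5 PB
```

Compare that with a conventional client-server design, where the same 150 GB would be stored once (plus a handful of backups).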


Financial savings just the beginning for CIOs who understand code quality


It is not just about cutting costs, but improving development productivity and code quality. In the past year, NCOI has fed code into the Cast system four times, but is moving to a contract to enable it to do so monthly to keep up with more regular software updates. “This is so we can refresh our portal every month,” said van Eeden. Ironically, since using the Cast system NCOI has been using more developers because it is doing more development. “For our core ERP application, we have doubled software development productivity,” said van Eeden. “My output doubled, and the quality in the sense of downtime and the number of bugs also improved dramatically.” Van Eeden said he knows there have been no software outages since the company has been using the software intelligence platform, whereas previously it “didn’t even look at the robustness of systems”.


The role of trust in security: Building relationships with management and employees

In reality, security processes must constantly evolve based on discussions between the chief security officer, management, and employees in every business unit, accounting for emerging risks, new technologies, and recently uncovered vulnerabilities. Chief security officers need to first and foremost ensure that a solid understanding exists between the security team and the business units. There is no way that anyone could understand the nuances of a business unit’s capabilities, processes, assets, and services to the extent the unit itself does, so it is tremendously important for a chief security officer to meet with each unit and develop a comprehensive security plan, one that is aligned at the corporate level. Only by gaining a more complete understanding of the unique needs of a business unit can a chief security officer develop safeguards that reduce risks.


Demystifying DynamoDB Streams


In order to build something even as simple as a master-slave replication, there are several primitives to understand. The first and foremost is ordering. Imagine if two transactions were to be applied sequentially to a database — the first writes a new entry and the second deletes this entry, which ultimately results in no data persisting in the database — but if the ordering is not guaranteed, the delete transaction could be processed first (causing no effect) and then the write transaction applied, which results in data incorrectly persisting in the database. The second core primitive is duplication: each single transaction should appear exactly once within the log. Failure to enforce ordering or prevent duplication within a log can result in the master and slave becoming inconsistent. ... There are multiple strategies to checkpointing, each of which is a trade-off between specificity and throughput.
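The two primitives above, strict ordering and exactly-once application, can be sketched in a few lines of Python. The record format `(sequence_number, txn_id, op, key, value)` is hypothetical, chosen only to make the example concrete:

```python
def apply_log(records):
    """Replay a replication log on a replica, enforcing the two core
    primitives: strict ordering by sequence number, and exactly-once
    application via a set of already-seen transaction IDs."""
    state, seen = {}, set()
    for seq, txn_id, op, key, value in sorted(records, key=lambda r: r[0]):
        if txn_id in seen:        # duplicate delivery: skip it
            continue
        seen.add(txn_id)
        if op == "put":
            state[key] = value
        elif op == "delete":
            state.pop(key, None)  # deleting a missing key is a no-op
    return state

# Out-of-order, duplicated delivery of "write then delete":
log = [
    (2, "t2", "delete", "user:1", None),
    (1, "t1", "put",    "user:1", "alice"),
    (1, "t1", "put",    "user:1", "alice"),  # duplicate
]
print(apply_log(log))  # -> {} : after reordering, the delete correctly wins
```

Without the sort, the delete would be applied first (a no-op) and the write would incorrectly persist; without the `seen` set, a redelivered transaction could be applied twice. Either failure leaves master and slave inconsistent, exactly as the excerpt describes.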


How AI Would Have Caught the Forever 21 Breach

As a first step, we must recognize that the days of the desktop/server model are over. In the case of Forever 21, the POS devices served as ground zero — not a laptop, a server, or even a corporate printer. In the age of the Internet of Things, we increasingly rely on "nontraditional" devices to optimize efficiency and boost productivity. But what constitutes a nontraditional device, and how do we look for it? Is it a device without a monitor? A device without a keyboard? Today a nontraditional device could be anything from heating and cooling systems to Internet-connected coffee machines to a rogue Raspberry Pi hidden underneath the floorboards. Protecting registered corporate devices is not enough — criminals will look for the weakest link. As our businesses grow in digital complexity, we have to monitor the entire infrastructure, including the physical network, virtual and cloud environments, and nontraditional IT, to ensure we can spot irregularities as they emerge.


What is identity management? IAM definition, uses, and solutions

Compromised user credentials often serve as an entry point into an organization’s network and its information assets. Enterprises use identity management to safeguard their information assets against the rising threats of ransomware, criminal hacking, phishing and other malware attacks. Global ransomware damage costs alone are expected to exceed $5 billion this year, up 15 percent from 2016, Cybersecurity Ventures predicted. In many organizations, users sometimes have more access privileges than necessary. A robust IAM system can add an important layer of protection by ensuring a consistent application of user access rules and policies across an organization.  Identity and access management systems can enhance business productivity. The systems’ central management capabilities can reduce the complexity and cost of safeguarding user credentials and access.
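The "consistent application of user access rules" that IAM provides boils down to routing every access decision through one central policy check instead of scattering ad hoc checks across applications. A minimal sketch, with hypothetical role and permission names:

```python
# Hypothetical role-to-permission tables, for illustration only.
ROLE_PERMISSIONS = {
    "engineer": {"repo:read", "repo:write"},
    "analyst":  {"reports:read"},
    "admin":    {"repo:read", "repo:write", "reports:read", "users:manage"},
}

def is_allowed(user_roles, permission):
    """Central policy check: every access decision flows through this
    one function, so rules are applied the same way everywhere."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

print(is_allowed(["analyst"], "repo:write"))   # -> False
print(is_allowed(["engineer"], "repo:write"))  # -> True
```

Because roles, not individuals, carry the permissions, revoking a compromised account or trimming excess privileges is a single table change rather than a hunt through every system.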


Mental Models & Security: Thinking Like a Hacker

Although we cannot predict the future with great certainty, we often subconsciously make decisions based on probabilities. For example, when crossing the road, we believe there's a low risk of being hit by a car. The risk exists, but if you've looked for traffic, you are confident that you can cross. The Bayesian method says that one should consider all prior relevant probabilities and then incrementally update them as newer information arrives. This method is especially productive given the fundamentally nondeterministic world we experience: we must use both prior odds and new information to arrive at our best decisions. While there may not be a simple answer to what it means to "think like a hacker," the use of mental models to build frameworks of thought can help avoid the pitfalls associated with approaching every problem from the same angle.
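Bayes' rule is easiest to apply in odds form: posterior odds equal prior odds times the likelihood ratio of the new evidence. A small sketch with made-up numbers from a security-alert scenario:

```python
def update_odds(prior_odds, likelihood_ratio):
    # Bayes' rule in odds form:
    # posterior odds = prior odds * likelihood ratio
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    return odds / (1 + odds)

# Illustrative numbers: suppose suspicious-login alerts are truly
# malicious 1 time in 99 (prior odds 1:99), and this alert pattern is
# 20x more likely during real attacks than during benign activity.
posterior = update_odds(1 / 99, 20)
print(round(odds_to_probability(posterior), 3))  # -> 0.168
```

Even strong new evidence (a 20x likelihood ratio) only raises the probability of a real attack to about 17 percent, because the prior was so low. Ignoring priors is exactly the pitfall the Bayesian method guards against.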



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer


Daily Tech Digest - January 16, 2018

Data Management 2018
A lot is happening in the market today, as data continues to explode at unprecedented levels. Compliance is no longer an option, but a requirement. The push to cut costs and embrace the multi-cloud – yet still maintain visibility of your data – has never been more critical. At the same time, the need to safeguard against data breaches is an absolute must. And the necessity to gather as many insights from your data as possible could be the difference between success and failure for many organisations. So, what does all of this mean for today’s CIOs and IT decision makers? Here are the top five predictions for the coming year. For those who see challenges as opportunities, this could be an exciting year for you. ... New data valuation techniques are expected to get a boost from AI, reshaping information lifecycle management through the automation of policy enforcement and more intelligent data management actions.


What is Zero Trust? A model for more effective security

The Zero Trust model of information security basically kicks to the curb the old castle-and-moat mentality that had organizations focused on defending their perimeters while assuming everything already inside didn’t pose a threat and therefore was cleared for access. Security and technology experts say the castle-and-moat approach isn’t working. They point to the fact that some of the most egregious data breaches happened because hackers, once they gained access inside corporate firewalls, were able to move through internal systems without much resistance. “One of the inherent problems we have in IT is we let too many things run way too openly with too many default connections. We essentially trust way too much,” Cunningham says. “That’s why the internet took off – because everyone could share everything all the time. But it’s also a key fail point: If you trust everything, then you don’t have a chance of changing anything security-wise.”


Microsoft Stresses Security, Responsible AI in Cloud Policy Updates


In the 2018 update, Microsoft is tackling some of the negative consequences of using AI and other technologies based on the tumultuous year the IT industry experienced in 2017. "We continue to witness cyber-attacks by nation-states on citizens, critical infrastructure and the institutions of democracy. We read on an almost daily basis about the criminal hacking of companies and governments to steal private and sensitive information of customers," wrote Smith in the 2018 update to the e-book. "We listen to the concerns about the loss of jobs to automation and the disruptive impact of artificial intelligence (AI) on entire sectors of the economy." In terms of cyber-security, Microsoft continues to honor its commitment to spend $1 billion in the IT security field each year. If necessary, the company is poised to use legal means to disrupt nation-state attacks.


Cloud computing: Three strategies for making the most of on-demand

"It's still complicated," says Marks, before suggesting more organisations will continue to move from an exploratory stage through to full adoption. "The difference today is that the cloud is understood for its different dimensions, be that at the level of the infrastructure, the platform or services. The migration of applications and data from the server downstairs to the public cloud is a shift that continues. The key point is that it's hard to think of a sensible reason why a CIO would buy hardware ever again -- and that should be your starting point." ZDNet speaks with three IT leaders at different stages of the cloud adoption process: exploring, transforming, and pioneering. Evidence from these three stages suggests cloud-led change remains a work in progress, where smart IT leaders assess their business context and provide an on-demand solution that can flex with future requirements.


Enterprise software spending set to grow thanks to AI and digital boost


“Looking at some of the key areas driving spending over the next few years, Gartner forecasts $2.9tn in new business value opportunities attributable to AI by 2021, as well as the ability to recover 6.2 billion hours of worker productivity,” he said. “Capturing the potential business value will require spending, especially when seeking the more near-term cost savings. “Spending on AI for customer experience and revenue generation is likely to benefit from AI being a force multiplier – the cost to implement will be exceeded by the positive network effects and resulting increase in revenue.” Gartner forecast a slight increase of 0.6% in datacentre spending in 2018 compared with 2017, but predicted a decline of 0.2% in 2019. As Computer Weekly has reported previously, this may be related to the increase in SaaS and cloud-based services.


Lessons in Becoming an Effective Data Scientist

The first skill that I look for when engaging with or hiring a data scientist is humility. I look for the ability to listen and engage with others who may not seem as smart as them. And as you can see from our DEPP methodology, humility is the key to driving collaboration between the business stakeholders (who will never understand data science to the level that a data scientist does) and the data scientist. Humility is critical to our DEPP methodology because you can’t learn what’s important for the business if you aren’t willing to acknowledge that you might not know everything. Humility is one of the secrets to effective collaboration. Nowhere does the importance of the business/data science collaboration play a more important role than in hypothesis development. If you get the hypothesis and the metrics against which you are going to measure success wrong, everything the data scientist does to support that hypothesis doesn’t matter.


7 Acquisitions that Point to Cloud Maturity


Over the better part of a decade, cloud computing mergers and acquisitions painted a picture of cloud service providers on the hunt to accumulate as many customers as possible. Thus, you saw massive build-outs of facilities and a haphazard set of mergers and acquisitions that had no real rhyme or reason. We also witnessed incredible price wars when it came to commodity cloud resources in the IaaS and PaaS space. In a nutshell, the cloud service provider market has always been about growth by any means necessary. If you contrast previous years' cloud acquisitions with those that occurred in the latter part of 2017 and into 2018, we start to see a new pattern forming. Sure, there are still signs of significant growth in the cloud space for those looking for bleeding-edge services. Yet at the same time, you see a trend toward stability, with cloud acquisition dollars following what the customer wants in a cloud service -- as opposed to the other way around.


As the cloud’s popularity grows, so does the risk to sensitive data

Despite the prevalence of cloud usage, the study found that there is a gap in awareness within businesses about the services being used. Only a quarter (25%) of IT and IT security practitioners revealed they are very confident they know all the cloud services their business is using, with a third (31%) confident they know. Looking more closely, shadow IT may be continuing to cause challenges. Over half of Australian (61%), Brazilian (59%) and British (56%) organizations are not confident they know all the cloud computing apps, platform or infrastructure services their organization is using. Confidence is higher elsewhere, with only around a quarter in Germany (27%), Japan (27%) and France (25%) not confident. Fortunately, the vast majority (81%) believe that having the ability to use strong authentication methods to access data and applications in the cloud is essential or very important.


Big Data 2018: 4 Reasons To Be Excited, 4 Reasons To Be Worried

Figure 1. TensorFlow Playground offers an interactive sandbox for exploring the foundations of TensorFlow. (Source: Google)
Machine-learning models can accurately perform recognition of specific patterns in data streams. In environments already inundated with data, this capability provides high value and distinct advantages, and the industry has responded accordingly. Data scientists can take advantage of a growing number of open-source machine-learning frameworks including Google’s TensorFlow, Apache MXNet, Facebook Caffe2, and Microsoft Cognitive Toolkit, among others. Most important, the task of building models has never been easier. For example, Amazon Web Services (AWS) offers deep learning AMIs (Amazon Machine Images) with the leading ML frameworks already built in and ready for use on the AWS cloud. For those just starting, Google’s TensorFlow Playground helps users learn more about the neural networks underlying machine learning frameworks, using simple data sets and pre-trained models.
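Under the hood, all of these frameworks automate the same basic loop: compute the gradient of a loss with respect to the model's parameters and nudge the parameters against it. A bare-bones sketch of that loop in plain Python (no framework), fitting the toy model y = w*x by gradient descent:

```python
# Training data where the true relationship is y = 2x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w, lr = 0.0, 0.05  # initial weight and learning rate
for _ in range(200):
    # Gradient of mean squared error (w*x - y)^2 with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # -> 2.0, the weight converges to the true slope
```

Frameworks like TensorFlow compute such gradients automatically for models with millions of parameters and run the updates on GPUs, but the conceptual loop is the one shown here.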


Container infrastructure a silver lining amid Intel CPU flaw fixes

Meltdown and Spectre loom over containers
"Most folks running containers have something like [Apache] Mesos or Kubernetes, and that makes it easy to do rolling upgrades on the infrastructure underneath," said Andy Domeier, director of technology operations at SPS Commerce, a communications network for supply chain and logistics businesses based in Minneapolis. SPS uses Mesos for container orchestration, but it is evaluating Kubernetes, as well. Containers are often used with immutable infrastructures, which can be stood up and torn down at will and present an ideal means to handle the infrastructure changes on the way, due to these specific Intel CPU flaws or unforeseen future events. "It really hammers home the case for immutability," said Carmen DeArdo, technology director responsible for the software delivery pipeline at Nationwide Mutual Insurance Co. in Columbus, Ohio.



Quote for the day:


"When we lead from the heart, we don't need to work on being authentic we just are!" -- Gordon Tredgold


Daily Tech Digest - January 15, 2018

Blockchain Company Wants to Create Alternative Decentralized Digital Economy

One recently proposed solution to this is Pocketinns, which aims to disrupt this space by acting as a collection of marketplaces. Nearly anything you can think of would be available on the platform – its goal is to turn all the current monopolies on their heads by providing a safer, more secure alternative platform offering the same quality promised by these giant corporations. The company already has a home sharing and vacation rental marketplace active and live in Europe with 50,000 properties, is looking at adding another 250,000 properties in the next few months, and all of this happens at zero percent commission. Pocketinns looks to follow the monthly subscription model by offering multiple services on one single platform. In addition, the future vision includes building an internal financial network to support internal transactions, including payment processors, remittances, banking, etc.


20 years on, open source hasn’t changed the world as promised
This chicken-and-egg conundrum is starting to resolve itself, thanks to the forward-looking efforts of Google, Facebook, Amazon, and other web giants that are demonstrating the value of open-sourcing code. Although it’s unlikely that a State Farm or Chevron will ever participate in the same way as a Microsoft, we are starting to see companies like Bloomberg and Capital One get involved in open source in ways they never would have considered back when the term “open source” was coined in 1998, much less in 2007. It’s a start. Let’s also not forget that although we have seen companies use more open source code over the past 20 years, the biggest win for open source since its inception is how it has changed the narrative of how innovation happens in software. We’re starting to believe, and for good reason, that the best, most innovative software is open source.


Why you’ll fire Siri and do the job yourself

ObEN’s PAI approach is one answer to the question of how virtual assistants with agency might function. We’ve assumed for years that virtual assistants will do more than just answer our questions, which is mostly what they do today. Future virtual assistants should buy things, negotiate fees, automatically remind co-workers of their deadlines and more. Consider Amy, the x.ai virtual assistant. Amy is A.I. that interacts via email and schedules meetings. Amy has a personality and can make decisions in an email conversation, such as negotiating available meeting times with the participants. Amy is a virtual person, and many people who encounter Amy assume they’re interacting with a real human. If our virtual assistants are to be “personalities” like Amy, they could also be virtual representations of ourselves. This approach is actually more transparent than the A.I. that’s currently used.


Safeguarding your biggest cybersecurity target: Executives

“Executives need to internalize that they are targets,” says Bill Thirsk, vice president of IT and CIO at Marist College. “Cyber attackers take time to watch, plan, practice, hone, and harden their art before going after a high-value target. Attackers have the luxury of stealth, time, duplicity, and multiple platforms for designated random attacks — all of which work against normal human behavior, curiosity, and the need for connectedness.” An executive’s “digital footprint” needs to be understood and gaps must be closed as a matter of practice, Thirsk says. Social accounts should be registered, confirmed, and monitored, he says. But getting executives to buy into protection is a challenge. “Every statistic I’ve seen shows that executives are the least likely to adhere to policies that they expect everyone else to follow,” says Paul Boulanger, vice president and chief security consultant at SoCal Privacy Consultants. “In part, this is because they are the people most willing to sacrifice security for convenience.”


IT service management effectiveness hampered by lack of metrics


According to the study, the increasing demand placed on IT operations is resulting in teams taking on more work than they can handle. Axelos found that this could be having a negative effect on their reputation. “Despite struggling to keep up with demand and working beyond realistic expectations, they are still perceived as delivering poor performance,” the report stated. IT operations and development teams said they wanted to eliminate inefficient practices. The study found that 55% of ITSM professionals who took part in the survey showed an interest in identifying and eliminating wasteful work through the use of continuous service improvement, DevOps and agile practices. Axelos found that larger organisations tend to recognise lack of visibility as a problem, while smaller organisations struggle more with inefficient processes and understanding customer needs.


AI Begins to Infiltrate the Enterprise

The data that feeds AI systems can also present obstacles. "The gathering and curation of data is a key challenge," said Patience. "We see that over and over again, where either organizations don't have enough data, or they have it and can't get access to it.” Then there are the problems with the technology itself. While AI research has advanced incredibly quickly in recent years, we still don't have a general artificial intelligence that truly thinks and learns the way humans do. As a result, human interactions with AI are sometimes less than satisfactory. "The 'klutziness,' if you will, of a computer itself is a serious challenge," Hadley said, adding, "The opportunities for mistakes and disasters from the point of view of the customer experience are much more likely." That leads to a bigger issue: trust. "The overarching issue in the whole development of the field is do people trust the results that they get out of a machine?" Reynolds said.


Spectre and Meltdown explained: What they are, how they work, what's at risk

The problem arises because the protected data is stored in CPU cache even if the process never receives permission to access it. And because CPU cache memory can be accessed more quickly than regular memory, the process can attempt to access certain memory locations to find out if the data there has been cached — it still won't be able to access the data, but if the data has been cached, its attempt to read it will be rejected much more quickly than it otherwise would. Think of it as knocking on a box to see if it's hollow. Because of the way computer memory works, just knowing the addresses where data is stored can help you deduce what the data is. ... Spectre and Meltdown both open up possibilities for dangerous attacks. For instance, JavaScript code on a website could use Spectre to trick a web browser into revealing user and password information. Attackers could exploit Meltdown to view data owned by other users and even other virtual servers hosted on the same hardware, which is potentially disastrous for cloud computing hosts.
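The cache-timing trick can be modeled in a few lines: the attacker never reads the protected value itself, only measures how quickly each candidate address responds. This is a deliberately crude Python simulation of the concept; real attacks time actual CPU cache hits versus misses with nanosecond precision, not sleep calls:

```python
import time

SECRET_CACHED = {0x200}  # addresses whose data ended up in the "cache"

def probe(address):
    """Toy model of the side channel: a cached address answers almost
    instantly, while an uncached one 'fetches from memory' slowly.
    The return value is elapsed time, not the data itself."""
    start = time.perf_counter()
    if address not in SECRET_CACHED:
        time.sleep(0.01)  # simulated slow fetch from main memory
    return time.perf_counter() - start

# The attacker learns which address the speculative access touched
# purely from response times, without ever reading the data.
guess = min((0x100, 0x200, 0x300), key=probe)
print(hex(guess))  # -> 0x200
```

This is the "knocking on a box to see if it's hollow" idea from the excerpt: the answer to the read request is always "no", but how fast the "no" arrives leaks which box is empty.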


Don't Use a Blockchain Unless You Really Need One

Most CoinDesk readers are probably familiar with the usefulness of decentralization in a monetary context (and if you're not, take a look at recent articles about cryptocurrency adoption in Iran, Venezuela, Russia and, ahem, the alt-right). The neutrality, censorship-resistance and openness of a permissionless network mean it will attract the odious along with the oppressed, and the software doesn't decide which is which. But why is decentralization worthwhile in the data use case? "Today, every piece of content and media you have is living somewhere owned by somebody," Ravikant explained. ... Then the conversation took a turn that I have to admit made me roll my eyes at first. "If someone creates a new Pokemon card game or a Magic the Gathering card game" online, he continued, the characters are "living and owned by a certain company in a certain format. You can't just go and reuse those assets."


IOT Security Needs A White Knight

A lack of quality control and the presence of a host of very old devices on IoT networks might be the most critical security threats, however. Decades-old hardware, which may not have been designed to be connected to the Internet in the first place, let alone stand up to modern-day security threats, creates a serious issue. “You have over 10 billion IoT devices out there already … and a lot of these devices were created in 1992,” noted Sarangan. Moreover, the huge number of companies making IoT-enabled hardware makes for a potentially serious problem where quality control is concerned. Big companies like Amazon and Microsoft and Google make headlines for their smart home gizmos, but the world of IoT is a lot broader than that. China, in particular, is a major source of lower-end IoT devices – speakers, trackers, refrigerators, bike locks and so on – and it’s not just the Huaweis and Xiaomis of the world providing the hardware.


Forget the CES hype, IoT is all about industry

In addition to all the new product previews, this year’s CES is full of summits, seminars, presentations and other sessions devoted to helping consumer products companies make, sell, deploy and monetize everything from smart cars and smart homes to smart cities. But I’m here to tell you that none of that really matters much to the future of the Internet of Things (IoT). Nope, despite the CES hype, the IoT is really all about industrial and business devices, networks and applications. Here’s the thing: The consumer side of IoT is consumed by the faddish and spectacular, not the everyday and useful. Just consider the kind of IoT products that have been featured at CES in previous years: There was the infamous smart toothbrush (expect more of those this year, too), not to mention smart hairbrushes and refrigerators. Not exactly must-haves for most people.



Quote for the day:


"An approximate answer to the right problem is worth a good deal more than an exact answer to an approximate problem." -- John Tukey


Daily Tech Digest - January 14, 2018

Strategy and Innovation Roadmapping Tools

EIRMA Roadmapping View
The ways of doing roadmapping have existed for some time, but supporting software has not. Motorola is credited with the development of roadmapping in the 1970s to support integrated product and technology strategic planning. Unfortunately, many have struggled with drawing and coloring boxes and connecting lines in different tools for over 20 years. This doesn’t have to be the case today. ... Added analytical capabilities in this group of strategy and innovation roadmapping tools may cover techniques such as scenario planning, Delphi, Blue Ocean and more. Such analysis may be disconnected from in-flight efforts, but it may also allow for future plans to be considered in light of current-state results. Analysis techniques for the marketplace are also integral to creating and delivering strategy and innovation in many organisations. The marketplace we’ve profiled in our Market Guide for Strategy and Innovation Roadmapping Tools supports many of these techniques.


NASA Awarded A Grant For Ethereum Blockchain-Related Research

Among the goals of the program are measures to protect NASA vehicles from collisions with space junk orbiting the earth, which can damage or completely incapacitate them, and the processing of highly complex data. At the helm of the research project is Dr. Jin Wei (sometimes credited as Jin Kocsis or Jin Wei Kocsis), an assistant professor with the University of Akron's Department of Electrical and Computer Engineering.  A write-up published by the Collier Report of US Government Spending, which shares a significant amount of language with a project summary ostensibly penned by Wei, describes plans to develop a "data-driven resilient and cognitive networking management architecture." Wei's team will also conduct research into decentralized computing mechanisms that could prove instrumental in processing "the massive amount of high-dimensional data" often collected by NASA spacecraft.



By 2020 83% Of Enterprise Workloads Will Be In The Cloud

Digitally transforming enterprises (63%) is the leading factor driving greater public cloud computing engagement or adoption today. 66% of IT professionals say security is their most significant concern in adopting an enterprise cloud computing strategy. 50% of IT professionals believe artificial intelligence and machine learning are playing a role in cloud computing adoption today, growing to 67% by 2020. Artificial Intelligence (AI) and Machine Learning will be the leading catalyst driving greater cloud computing adoption by 2020. These insights and findings are from LogicMonitor’s Cloud Vision 2020: The Future of the Cloud Study. The survey is based on interviews with approximately 300 influencers conducted in November 2017. Respondents include Amazon Web Services AWS re:Invent 2017 attendees, industry analysts, media, consultants and vendor strategists. The study’s primary goal is to explore the landscape for cloud services in 2020.


Blockchain: can the law keep up?

The legal implications of a disruptive technology such as blockchain vary as the technology is applied to different sectors and applications. Some of the key considerations are as follows: ... Distributed ledger technology is, just as described, distributed. There is no fixed location of a transaction, a registry or an application. It is therefore critical that the parties to any arrangement involving this technology have expressly agreed and recorded the jurisdiction and governing law which is to apply to the arrangement. Some jurisdictions are starting to address the legal and regulatory matters around blockchain. ... Contractual and legal issues must be seen from a different angle with blockchain technologies. How are service levels and performance defined? What is the liability position? In particular, the enforceability of an arrangement involving blockchain should be considered carefully.


The Future of Humans - Intersection of HR & AI


Artificial intelligence is transforming our lives at home and at work. At home, you may be one of the 1.8 million people who use Amazon's Alexa to control the lights, unlock your car, and receive the latest stock quotes for the companies in your portfolio. In total, Alexa is touted as having more than 3,000 skills, with more added daily. In the workplace, artificial intelligence is evolving into an intelligent assistant to help us work smarter. Artificial intelligence is not the future of the workplace; it is the present, happening today. The time is not far off when AI will contribute to every business function to make transactions effective and efficient. Human Resources will not be able to stay away from it for long, and all HR professionals should embrace this change gracefully and make the most of it. Artificial intelligence is all about analysing, breaking down and transforming data into a humanized format that is easy to interpret and study. A good example of AI is the suggestions and predictions we get from our smartphones without having to ask for them.


Why the Organisation of the Tomorrow is a Data Organisation

A car company should no longer see itself as a car manufacturer, but as a software company that is in the business of helping move people from A to B. It should look at how the company can do so in the most reliable, comfortable and safe way. Whether it produces cars or self-flying taxis, or develops an Uber-like app, are then questions that can be asked. The same goes for, for example, a bank. A bank is not a financial institution, but a data company that enables people to store money and make transactions safely. Whether this is done using a cryptocurrency or as a mobile-only bank are then questions that can be asked. Nowadays, any company, regardless of the industry, should see itself as a data company. Seeing an organisation as a data company allows you to remove any inhibitors that prevent the business from delivering its product or service in the most efficient, effective and customer-friendly way.


Machine Learning's Greatest Potential Is Driving Revenue In The Enterprise


These and many other insights are from the recently published study, Global CIO Point of View. The entire report is downloadable here (PDF, 24 pp., no opt-in). ServiceNow and Oxford Economics collaborated on this survey of 500 CIOs in 11 countries on three continents, spanning 25 industries. In addition to the CIO interviews, leading experts in machine learning and its impact on enterprise performance contributed to the study. For additional details on the methodology, please see page 4 of the study and an online description of the CIO Survey Methodology here. Digital transformation is a cornerstone of machine learning adoption. 72% of CIOs have responsibility for digital transformation initiatives that drive machine learning adoption. The survey found that the greater the level of digital transformation success, the more likely machine learning-based programs and strategies would succeed. IDC predicts that 40% of digital transformation initiatives will be supported by machine learning and artificial intelligence by 2019.


Key considerations of AI, IoT and digital transformation

IoT is a key driver of both the machine learning and AI craze. With the volume of data produced by machines and people on a daily basis becoming unmanageable, it has become increasingly difficult to make use of this information — and the proliferation of connected sensors only serves to up the ante further. Without AI and machine learning, making heads or tails of this data is downright difficult and creates problems for businesses looking to use their data. With digital transformation at the forefront of many business initiatives, applying AI to IoT can help drive the innovation and business effectiveness that many companies are hoping to achieve. Applied correctly, these digital technologies can change the way companies operate and realize key competitive differentiators. Some industries are more proactive in this regard than others, which is best observed in Constellation Research’s recently released “Business Transformation 150.”


In 2018, AI will be listening and watching us more than ever: Is our privacy under threat?


Finding Alexa or Google Assistant (or even both) inside a television or speaker is no longer a surprise. But as the smart home of the future finally becomes an attainable reality, artificial intelligence is appearing everywhere. At CES it was in fridges and ovens, washing machines, dryers and even light switches. Yes, we are at a stage where even the most humble of household devices — the light switch — has been given a microphone, speakers, and a blue pulsating light to indicate when it is listening and thinking. Somehow, while our backs were turned, our light switches became intelligent. The revelations of widespread surveillance efforts by the NSA and Britain's GCHQ are still a recent memory. Yet the giants of Silicon Valley are fitting microphones and cameras in every room of our house. The Amazon Echo Spot is designed as a bedside alarm clock — yet it has a small camera, an always-listening microphone and an always-on internet connection.


Blockchain and the Rise of Transaction Technology

Administrations transact with citizens to provide them with trusted public services. They transact with businesses and governments, too. Sometimes citizens transact with government through business. Within strategic sectors, like the energy or utility business, transacting is key. In an increasingly data-focused economy, transacting data can even be said to be a special type of virtualized critical infrastructure. This is why states and businesses need to focus on assuring trusted data structures. Blockchains and distributed ledgers, then, can be considered a tool for ensuring data integrity, immutability and trust. This does not mean we need to port everything to blockchain. But it can mean providing an additional transaction layer on top of existing data structures, a robust audit trail of what happens on our critical infrastructure. Even so, the possible role of distributed ledgers within digital state infrastructure too often goes unrecognized.



Quote for the day:

"Strength is when you have so much to cry for but you prefer to smile instead." -- Unknown

Daily Tech Digest - January 11, 2018

Cybersecurity on a network
Industrial control systems (ICS) are everywhere. These systems play a critical role in nearly every industry around the world, including electric, water and wastewater, oil and natural gas, and transportation, as the smart technology of today and tomorrow is driven by these systems. This same widespread use and importance of ICS, especially those found in critical infrastructure, also makes them a primary target for bad actors, and the increasing use of the internet is only serving to magnify the potential for issues. According to Industrial Control Systems Vulnerabilities Statistics from Kaspersky, there were only two ICS vulnerabilities detailed in 1997 (the first year this information was recorded); however, these vulnerabilities are now much more commonplace, with 189 reported in 2015.


5 ways to establish dependable edge computing

Edge computing is set to grow dramatically in 2018 and beyond. As IoT devices continue to come online by the tens of billions, edge data centers will grow in prevalence too, in order to collect, process and manage data when and where it’s being created. IT departments should expect to see tremendous growth and demand for reliable computing at the edge over the next few years. As edge data centers continue to play an increasingly important role in both the business and IT landscapes, we’ll see the standards for their level of predictability and uptime grow to match those that enterprises and consumers have come to expect from traditional, large data centers. So, how can you build an edge data center that’s reliable and generates value for your company? Here are five important ways IT departments should be building their edge data centers to help ensure end-to-end reliability and resiliency.


An intro to Studio 3T, a MongoDB IDE

A good indication of whether a technology is in the plateau of productivity in Gartner’s hype cycle is when someone asks “Is MongoDB dead?” on that bastion of, um, sane discussion, Quora. A second good indication is when there are productivity tools and at least a nascent third-party market around your technology. A third indication is when a third party creates an IDE for it: the growing third-party market is a key indication that MongoDB has moved from mere maturity to being one of the dominant players in this market. Enter Studio 3T, a small European firm with its own sea mammal mascot and a reputation for being “the MongoDB GUI.” Its eponymous product is the successor to its MongoDB Chef. According to Studio 3T marketing chief Richard Collins, the company’s direction is to become a full-fledged IDE for MongoDB. Studio 3T lets teams collaborate on MongoDB charts across roles and skill levels, from the developer to the analyst to the DBA.


Report Discusses How to Approach Botnets, Cybersecurity Threats

Botnets and automated attacks include distributed denial of service (DDoS) attacks, ransomware attacks, and computational propaganda campaigns, the report noted. “Traditional DDoS mitigation techniques, such as network providers building in excess capacity to absorb the effects of botnets, are designed to protect against botnets of an anticipated size,” report authors wrote. “With new botnets that capitalize on the sheer number of ‘Internet of Things’ (IoT) devices, DDoS attacks have grown in size to more than one terabit per second, outstripping expectations. As a result, recovery time from these types of attacks may be too slow, particularly when mission-critical services are involved.” Stakeholders in all industries must be willing to coordinate and collaborate together to combat these threats. Problems must be proactively addressed “to enhance the resilience of the future Internet and communications ecosystem.”


Look at full security development lifecycle to reduce web threats


For smaller companies without a security team, this can help with patching and general security maintenance. However, it’s important to check the security measures they offer, especially in terms of security monitoring, distributed denial of service (DDoS) attack mitigation, their responsiveness to security incidents and their processes for dealing with incidents. If you need to host the website yourself, perhaps because you are delivering a web-based service that requires close coupling to your own systems, then you will need to make sure your own systems are protected and separated from the web server itself, to prevent the web server being used as a Trojan horse to attack your operational systems. Whichever approach you take, supply chain security also needs to be considered. There have been instances over the past few years where an attacker has attacked a website tool developer and modified the code of the tools used to build a website so that every site they build includes backdoors open to the attacker, or pre-placed malware.


'Back to Basics' Might Be Your Best Security Weapon

Despite an influx of best-in-breed security technologies, organizations around the world are seeing a continued rise in cyber attacks. There are big implications. Financial consequences include immediate costs of investigating the breach and extend longer-term to include lawsuits and regulatory fines. Loss of customer trust can translate into declines in business. Perhaps most damaging is the impact of shutting down entire systems, which can grind operations to a halt. This is especially dangerous when the target is a critical healthcare, government, or utility provider. From the high-profile Equifax breach to payment compromises at hotel chains and retailers, security teams are increasingly under pressure to not only determine why this is happening but what can be done to fix or prevent these problems. For many companies, getting "back to basics" could be one of the most effective weapons in the war on cyberattacks.


How Cisco’s newest security tool can detect malware in encrypted traffic

ETA collects metadata about traffic flows using a modified version of NetFlow and searches for characteristics that indicate the traffic could be malicious. It inspects the initial data packet, which is transmitted in the clear even in encrypted traffic. It also records the size, shape and sequence of packets and how long they take to traverse the network, and it monitors for other suspicious characteristics such as a self-signed certificate, or whether the traffic has command-and-control identifiers on it. All of this data can be collected even if the traffic is encrypted. “ETA uses network visibility and multi-layer machine learning to look for observable differences between benign and malware traffic,” Cisco explains in a blog post announcing ETA. If characteristics of malicious traffic are identified in any packets, they are flagged for further analysis through deep packet inspection and potential blocking by an existing security appliance such as a firewall.
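
The flow-metadata idea can be sketched in a few lines. The features, thresholds, and sample flows below are illustrative inventions, not Cisco's actual model; the point is only that packet sizes and timing alone, without decrypting payloads, can help separate beacon-like traffic from ordinary browsing.

```python
# Hedged sketch of flow-metadata classification: we only look at
# packet sizes and timing, never payload contents.

def flow_features(packets):
    """packets: list of (size_bytes, timestamp_seconds) tuples."""
    sizes = [s for s, _ in packets]
    gaps = [t2 - t1 for (_, t1), (_, t2) in zip(packets, packets[1:])]
    return {
        "mean_size": sum(sizes) / len(sizes),
        "mean_gap": sum(gaps) / len(gaps) if gaps else 0.0,
        "uniform_sizes": len(set(sizes)) == 1,  # identical packet sizes
    }

def looks_suspicious(features):
    # Identically sized packets sent at regular, slow intervals
    # resemble command-and-control beaconing.
    return features["uniform_sizes"] and features["mean_gap"] > 1.0

# A beacon-like flow: same size every 5 seconds.
beacon = [(512, 0.0), (512, 5.0), (512, 10.0), (512, 15.0)]
print(looks_suspicious(flow_features(beacon)))  # → True
```

A real classifier would of course learn its decision boundary from labeled traffic rather than use a hand-picked threshold, but the inputs are the same kind of cleartext metadata.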


Researchers uncover major security vulnerabilities in ICS mobile applications

Specifically, the research revealed the top five security weaknesses were: code tampering (94% of apps), insecure authorization (59% of apps), reverse engineering (53% of apps), insecure data storage (47% of apps) and insecure communication (38% of apps). “The flaws we found were shocking, and are evidence that mobile applications are being developed and used without any thought to security,” said Bolshev. “It’s important to note that attackers don’t need to have physical access to the smartphone to leverage the vulnerabilities, and they don’t need to directly target ICS control applications either. If the smartphone users download a malicious application of any type on the device, that application can then attack the vulnerable application used for ICS software and hardware. What this results in is attackers using mobile apps to attack other apps.”


Embark upon ITSM asset management with systematic tagging


Tags are just text strings attached to devices and infrastructure as part of an ITSM asset management strategy. A tag is a key-value pair, treated as metadata. It assigns common attributes to assets so they can be logically defined and grouped. A simple key-value pair example that would prove useful for ITSM asset management is the following: stack = production. In this example, the IT infrastructure team applies the stack tag to all production servers when they are built. When the administrator needs to perform system management, such as an update, he puts the tag production into an update query to restrict any operations to those servers tagged with production. Other key-value pair examples include owner = QAteam and location = LosAngeles. Using this set of tags in an update rollout, the global IT manager can apply changes only to servers owned by the quality assurance (QA) team and located in the Los Angeles data center.
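
The tag-based selection described above can be sketched as a simple filter. The inventory, tag names, and `select_assets` helper here are hypothetical; real ITSM tools expose equivalent queries through their own APIs.

```python
# Minimal sketch of tag-based asset selection over an in-memory inventory.

servers = [
    {"name": "web-01", "tags": {"stack": "production", "owner": "QAteam", "location": "LosAngeles"}},
    {"name": "web-02", "tags": {"stack": "staging", "owner": "QAteam", "location": "LosAngeles"}},
    {"name": "db-01", "tags": {"stack": "production", "owner": "DBAteam", "location": "NewYork"}},
]

def select_assets(inventory, **criteria):
    """Return assets whose tags match every key-value pair in criteria."""
    return [a for a in inventory
            if all(a["tags"].get(k) == v for k, v in criteria.items())]

# Restrict an update rollout to production servers owned by the QA team
# in the Los Angeles data center:
targets = select_assets(servers, stack="production", owner="QAteam", location="LosAngeles")
print([a["name"] for a in targets])  # → ['web-01']
```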


Humans and machines in one loop—collaborating in roles and new talent models

In the near future, human workers and machines will work together seamlessly, each complementing the other’s efforts in a single loop of productivity. And, in turn, HR organizations will begin developing new strategies and tools for recruiting, managing, and training a hybrid human-machine workforce. Notwithstanding sky-is-falling predictions, robotics, cognitive, and artificial intelligence (AI) will probably not displace most human workers. Yes, these tools offer opportunities to automate some repetitive low-level tasks. Perhaps more importantly, intelligent automation solutions may be able to augment human performance by automating certain parts of a task, thus freeing individuals to focus on more “human” aspects that require empathic problem-solving abilities, social skills, and emotional intelligence.



Quote for the day:


"The struggle you're in today is developing the strength you need for tomorrow." -- Robert Tew


Daily Tech Digest - January 10, 2018

How Founders Kill Their Own Start-Ups


First, micro-managing founders have a hard time scaling up their ventures. A lack of empowerment inevitably creates decision bottlenecks, which hamper speed of execution, a key ingredient of scalability. Such bottlenecks also foster “firefighting”, which is when small issues keep grabbing more attention and resources than they should. Ultimately, high-level, high-impact decisions suffer neglect and progress grinds to a halt. Second, micro-management is a talent drain. Often those who are attracted to start-ups want to build something, to be part of something bigger. Millennials in particular want to have an impact. Micro-managers clip wings and can’t retain top contributors, especially creative ones. ... Third, time spent micro-managing is time spent away from a founder’s most important tasks: Thinking about the big picture, drumming up business and finding resources to make everything run smoothly. A founder’s micro-management, evidenced by high turnover, may even put off potential advisors and investors.



How the Threat Landscape Will Shift This Year

Defending against supply chain attacks will be tough because each software vendor has a different distribution mechanism and signing infrastructure, says Weston. In the past, companies could put software on a "trusted list" if it had a history of being secure. However, he says, businesses have to realize anything can change from good to bad at any time. "Getting your software sources from centralized locations where possible is one of the practical means for protecting against supply chain attacks," Weston adds. Cryptocurrency will be a growing security issue as more people adopt it. Attackers will target machines to cannibalize their resources and focus on cryptocurrencies, which are getting harder to mine in legitimate ways. Wallets will also become popular among hackers.


The 12 biggest issues IT faces today
Most organizations struggle with finding qualified tech staff, says Todd Thibodeaux, president and CEO of CompTIA. Training them up on the clock feels equally daunting. “The good news for employers is that the majority of IT pros like what they’re doing,” Thibodeaux says. “Their jobs provide them with a sense of personal accomplishment. Their skills and talents are put to good use. They see opportunities to grow and develop in their careers — and they’re generally satisfied with their compensation and benefits.” While IT staff may enjoy their work, retraining goes a long way in keeping it that way, says Thibodeaux. “IT pros would like more resources for training and development, and more career path guidance and career advancement opportunities,” he says. “They’re also interested in having access to more tools and engaging with more technologies and applications. And they’d welcome the opportunity to work on new technology initiatives.”


Code Refactoring Techniques

Red-green refactor is the Agile engineering pattern which underpins Test Driven Development. Characterized by a “test-first” approach to design and implementation, it lays the foundation for all forms of refactoring. You incorporate refactoring into the test-driven development cycle by starting with a failing “red” test, writing the simplest code possible to get the test to pass “green,” and finally working on improving and enhancing your code while keeping the test “green.” This approach is about how you can seamlessly integrate refactoring into your overall development process and work toward keeping code clean. There are two distinct parts to this: writing code that adds a new function to your system, and improving the code that performs this function. The important thing is to remember not to do both at the same time during the workflow.
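
A minimal sketch of the cycle, with hypothetical names: the test is written first and fails (red), the simplest passing implementation appears commented out (green), and the final version is the refactored code that keeps the same test green.

```python
# Step 1 ("red"): write the failing test first.
def test_total_price():
    assert total_price([2.0, 3.5]) == 5.5

# Step 2 ("green"): the simplest code that makes the test pass,
# shown here commented out because it is superseded by step 3:
# def total_price(prices):
#     return prices[0] + prices[1]

# Step 3 (refactor): improve the code while keeping the test green.
def total_price(prices):
    return sum(prices)

test_total_price()  # the same test still passes after the refactor
```

The discipline is in the sequencing: the test never changes while the implementation improves, so the test proves the refactor preserved behavior.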


What is DevSecOps? Developing more secure applications

“DevOps has become second nature for agile, high-performing enterprises and a foundation for the success of their online business,” says Pascal Geenens, a security evangelist and researcher at Radware. “Continuous change in technology and consumer demand means there is a continuous cycle of updates to run that will keep a very varied set of functions from page upload times to shopping and search features up to date and running at their best.” “However, application security was mostly an afterthought, and at times perceived as a roadblock to staying ahead of the competition,” says Geenens. “Given the reliance of applications to keep operations running, bypassing security must be considered a high-risk strategy -- a distributed or permanent denial of service attack could easily catch you out. You just need to look at the implications of failing to update the Apache Struts framework as suffered by Equifax. The DevSecOps movement is designed to change this.”


The symphonic enterprise

After a decade of domain-specific transformation, one question remains unanswered: How can disruptive technologies work together to achieve larger strategic and operational goals? We are now seeing some forward-thinking organizations approach change more broadly. They are not returning to “sins of the past” by launching separate, domain-specific initiatives. Instead, they are thinking about exploration, use cases, and deployment more holistically, focusing on how disruptive technologies can complement each other to drive greater value. For example, blockchain can serve as a new foundational protocol for trust throughout the enterprise and beyond. Cognitive technologies make automated response possible across all enterprise domains. Digital reality breaks down geographic barriers between people, and systemic barriers between humans and data. Together, these technologies can fundamentally reshape how work gets done, or set the stage for new products and business models.


Aadhaar sitting duck for cyber criminals, says RBI-backed research

"Aadhaar faces a number of challenges over the short and long term. The primary challenge is to protect the data from the prying and excessive profit-seeking of the business world. It is well known that businesses are increasingly operating in a highly competitive world in which ethical boundaries are rapidly being pulled down. The problem is compounded because they have to satisfy their shareholders in a competitive business environment that rarely looks beyond quarterly profits and the operational dynamics of stock market listing," it says. However, the paper says, cyber vulnerabilities of Aadhaar are a bigger concern than the possible commercial misuse of data. "In an era when cyber threats are frequent, the major challenge for UIDAI is to protect the data under its control, since the biometrics is now an important national asset which has huge ramifications for various government programmes and the banking system," it says.


3 top trends that will drive ERP software development

Organizations are continuing to see the many advantages of developing a roadmap and strategy that leads to digital transformation (i.e. the ability to leverage the Industrial Internet of Things (IIoT), machine learning, AI, big data or analytics). This movement will continue to drive their approach through 2018; organizations that fail to keep up will face the consequences of being left behind by competitors. From an ERP perspective, the combined driving force of these innovative technologies will continue to reshape how enterprises utilize their IT solutions to provide increased efficiency, process improvement and productivity inside their organization – something that, for many, has been left untapped. While the industry has been discussing many of these leading-edge technologies, here are the “big three” technologies which will play a significant role in reshaping how we think of ERP in 2018.


How network verification differs from monitoring, and what it’s good for

In a network verification system, the intent is explicitly declared – in this case, that the external partner network should be connected to the demilitarized zone but isolated from the rest of the data center. The network verification system can then explore all possible data flows that could occur and determine if some flows will violate the intent, thus spotting the vulnerability well before the attack. Now suppose you are called early Saturday morning to fix this vulnerability by locking down firewall rules. ... Depending on the application and the access-control mistake, such a slip-up might result in an immediate red-alert outage, or it might not show up until Monday morning. Either way, traffic monitoring will see the problem only after it has already affected users. A company with a network verification system could incorporate the proposed change into its network model pre-deployment and predict that the change would violate the connectivity intent, saving you from causing an outage.
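
The isolation check at the heart of such a system can be illustrated with a toy reachability search. The zone names, link table, and `reachable` helper below are invented for illustration; a real verifier models actual device configurations and explores every flow they permit.

```python
from collections import deque

# Toy network model: adjacency list of permitted flows between zones.
links = {
    "partner": ["dmz"],
    "dmz": ["partner", "datacenter"],  # a proposed change opens this path
    "datacenter": ["dmz"],
}

def reachable(graph, src, dst):
    """Breadth-first search: can any flow get from src to dst?"""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Declared intent: partner reaches the DMZ but is isolated from the datacenter.
assert reachable(links, "partner", "dmz")
if reachable(links, "partner", "datacenter"):
    print("Intent violated: partner network can reach the datacenter")
```

Run against the proposed change pre-deployment, the check flags the transitive partner-to-datacenter path before any traffic (or attacker) ever uses it.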


Will 2018 be the year of the chatbot? Not without human help

"A chatbot is like a baby—it needs to be nurtured and taught—you can't just set up a chatbot and let it go," Harles said. Developers should avoid getting too attached to a certain technology, and think about the reasoning for a chatbot, Harles said. The bot should set out to solve a problem for the customer, making it a useful, uncomplex tool. "Some organizations think chatbots will automate a process or avoid call center calls—but the truth is chatbots are like the advent of the ATM," Harles said. "Many thought the ATM would replace banks and tellers, but in reality, it simply created a new channel. That is what chatbots will deliver for brands—a new communication channel." But a new communication channel can come with its own issues, including potential abuse of the channel with inappropriate or unrelated questions.



Quote for the day:


"It's not what you look at that matters, it's what you see." -- Henry David Thoreau


Daily Tech Digest - January 08, 2018


Software architecture has traditionally been associated with big design up front and waterfall-style delivery, where a team would ensure that every last element of the software design was considered before any code was written. In 2001, the "Manifesto for Agile Software Development" suggested that we should value "responding to change over following a plan," which when taken at face value has been misinterpreted to mean that we shouldn’t plan. The net result, and I’ve seen this first hand, is that some software development teams have flipped from doing big design up front to doing no design up front. Both extremes are foolish, and there’s a sweet spot somewhere that is relatively easy to discover if you’re willing to consider that up front design is not necessarily about creating a perfect end-state. Instead, think about up front design as being about creating a starting point and setting a direction for the team. This often missed step can add a tremendous amount of value to a team by encouraging them to understand what they are going to build and whether it is going to work.



"Disruption happens when small, nimble companies challenge incumbents with technology," said Conrad Burke, vice president of New Ventures at Intellectual Ventures and head of ISF Incubator, an incubator and accelerator within Intellectual Ventures, which creates and licenses intellectual property (IP). "Timing is everything. You have to be able to execute well and quickly because things can change rapidly." On the other hand, speed can kill. Many great ideas haven't taken off as expected simply because the timing was wrong. For example, at the turn of the millennium, mobile marketing was supposed to change the world. Now, 18 years later, marketers are advancing the same idea as if it's novel. The difference is that today we have smartphones that deliver slick experiences, as opposed to flip phones. In addition, the networks to which our phones are connected provide coverage just about everywhere and their significantly greater bandwidth supports many content types, not just text.



Why Microsoft’s Cosmos DB may displace AWS’s cloud databases

The reason for Cosmos DB’s ascendance may stem from declining developer interest in “polyglot persistence.” Coined by Thoughtworks’ Martin Fowler back in 2011, polyglot persistence suggests that “any decent sized enterprise will have a variety of different data storage technologies for different kinds of data.” Rather than forcing data to fit a relational data model, for example, an enterprise will more likely embrace wide-column data for some parts of an application, a graph database for others, and relational for still others. The popularity of databases like MongoDB is a clear sign that, in fact, we do live in an increasingly polyglot world. Microsoft’s genius with Cosmos DB is that developers may want to have their polyglot persistence cake and eat it too—all in one place. As InfoWorld’s Serdar Yegulalp has written, “With Cosmos DB, Microsoft offers multiple consistency models in the same database, so the choice of model can be a function of the workload rather than the product.” That’s huge.


Fundamental skills for a bright future in IT
“Going forward, I believe you’ll see a basic need for talent that understands the fundamentals of algorithms to create AI systems, as well as general design and even some psychology and understanding of human behavior,” Fermin says. “People who are tasked with designing and building, say, chatbots will need to understand how to give those technologies human characteristics and make the interactions indistinguishable from interacting with another human being — how can it show empathy, compassion, creativity, communication,” he says. But how can job seekers emphasize these kinds of “middle ground” skills on their resume? You can’t simply list them as you would harder tech skills; you should take the same approach you would when highlighting your soft skills, says Quizlet’s Glotzbach. “Emphasize how you’ve learned on the job, or how you’ve invested in a post-graduate education through online courses, microcertifications, bootcamps — anything like that,” Glotzbach says.



Why A Cyberattack Could Cause Infrastructure To Fall Like Dominoes

The biggest motive for cyberattacks over the past few years has been financial gain. Hackers shut down a network and demand a ransom before halting an attack or restoring the victim’s access. With profit in mind, the typical targets are companies that, presumably, have the cash to pay ransoms. But infrastructure operators can be victims of hackers with any number of motivations, including money, politics or vandalism. There are strong indications that bigger and more organized actors — in some cases nation-states — have probed U.S. nuclear power plants, a dam in New York and a network that sits at the center of the global banking system. Fear of retaliation is likely the best explanation for why a major attack hasn’t occurred: those who have the means are likely just as vulnerable themselves.


Apple confirms all devices affected by Meltdown and Spectre


While all Mac systems and iOS devices are affected, Apple said there were no known exploits impacting Apple device users and recommended downloading software only from trusted sources. Apple has released mitigations in iOS 11.2, macOS 10.13.2, and tvOS 11.2 to help defend against Meltdown, and said the Apple Watch was unaffected. The impact of the Meltdown mitigations has been estimated at as much as a 30% reduction in performance, but Apple claimed that the updates it had released so far resulted in “no measurable reduction in the performance” of macOS and iOS. Apple also plans to release mitigations in its Safari browser to help defend against Spectre “in the coming days” and said testing indicated that the Safari mitigations would have little or no measurable performance impact.
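To see what the Spectre mitigations are defending against, here is a minimal sketch of the variant-1 pattern (bounds-check bypass) following the example in the original Spectre paper. It is not an exploit; it only shows the code shape in which a mispredicted branch can leave a secret-dependent cache footprint, which browser and OS mitigations aim to defuse.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative Spectre variant-1 gadget (bounds-check bypass), after the
 * pattern in the original Spectre paper. Array names follow that example. */

uint8_t  array1[16];          /* attacker-indexed array                     */
uint8_t  array2[256 * 512];   /* probe array usable as a cache side channel */
unsigned array1_size = 16;

uint8_t victim(size_t x) {
    /* The CPU may speculatively execute the body even when x is out of
     * bounds, performing a load whose address depends on secret memory
     * beyond array1 and leaving a measurable trace in the cache. */
    if (x < array1_size) {
        return array2[array1[x] * 512];
    }
    return 0;  /* architecturally, out-of-bounds calls simply return 0 */
}
```

Architecturally the bounds check always holds, which is why mitigations target the speculative path (e.g., reduced timer precision in Safari) rather than the visible program logic.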


France goes after companies for deliberately shortening life of hardware

Printer manufacturers “deliberately shorten the life of printers and cartridges,” a French environmental and consumer protection group claims. That's against the law in France, and government prosecutors have agreed to investigate the claims. If the lawsuit against the printer company, Japan-based Epson, is proven, the firm could be found guilty of breaking a little-known French law that stipulates vendors can’t purposefully lower the lifespan of a product to ramp up replacement rates. A conviction could be significant for tech hardware manufacturing overall. Nabbing Epson would likely affect not only how hardware is built and sold in France; it could also prompt similar laws in other European countries, given how closely individual nations’ regulations shape the EU bloc overall.


5 steps to becoming a global IT leader

An IT leader in a multicultural environment must also be aware of his or her own cultural framework, observes Annalisa Nash Fernandez, a New York-based intercultural strategist who advises international executives on cross-cultural communication. "For an American CIO, that may mean a high degree of individualism and personal accountability, a task-based versus relationship-based approach and a linear view of project timing, which is the lens through which the diverse cultures and business styles of the teams abroad will be understood and processed," she explains. Begin the journey with face-to-face contacts to better know your customers, business partners and team, suggests Keith Collins, executive vice president and CIO of SAS. "Language and cultural differences make conference calls difficult at best," Collins says. "Practicing empathy is critically important in this environment."


China to block SD-WAN and VPN traffic by Jan. 11

Millions of Internet users have relied on virtual private networks (VPNs) to circumvent the Chinese censorship system, dubbed the Great Firewall of China. In the past, VPNs worked intermittently but were invariably blocked, forcing users to jump to another VPN. The new regulations will block VPN access to unregistered services. The Great Firewall — the world’s most sophisticated state-censorship operation — employs at least 2 million online censors, and crackdowns on reaching the Internet beyond it are intensifying. The news highlights how the world’s second largest economy is struggling to balance authoritarianism with its business leadership aspirations. In addition, a strict new cybersecurity law came into effect in June. In July China Telecom, the nation’s biggest Internet service provider, sent a letter to corporate clients saying that in the future, VPNs would only be allowed to connect to a company’s headquarters abroad.


Four Age-Old Business Problems Machine Learning Will Soon Solve


The hype surrounding machine learning has been accelerating and expanding for years. Supporters talk about the potential of this technology to improve every process and eliminate any issue. In many cases, the levels of optimism and excitement have reached a fever pitch. Less enthusiastic observers, however, have noted that the promise of machine learning is often described in broad terms and abstract assertions. The combination of technical jargon, hazy use cases, and vague details leaves many wondering how accessible and advantageous machine learning really is. That skepticism is valid, even important. But at the startup level, the machine learning market is innovating and advancing fast. Better still, the technologies in the pipeline are intended neither for the biggest companies nor the most complex processes. Rather, they are being designed to solve age-old business problems that could plague any enterprise.



Quote for the day:



"A mistake is a crushing weight to a pessimist and a trampoline to an optimist." -- Tim Fargo