Daily Tech Digest - July 10, 2018

The value of visibility in your data centre

Keeping key enterprise applications up and running is an absolute requirement for modern business. As estimated by Gartner, IDC and others, the cost of IT downtime averages around £4,200 per minute. A simple infrastructure failure might cost around £75,000, while the failure of a critical, public-facing application costs more like £378,000 to £755,000 per hour. When failures disrupt large-scale global logistics and cause widespread inconvenience to customers, as in the British Airways operations systems failure last May, costs can quickly become staggering. BA estimated losing $102.19 million (£77.08 million) in hard costs, including airfare refunds to stranded passengers, plus incalculable damage to its reputation. BA's parent company, IAG, subsequently lost $224 million (£170 million) in value, based on its then-current stock valuation. Preventing such disasters, or intervening rapidly and effectively when they occur, means giving developers and operations staff (DevOps) visibility into IT infrastructure, networks, and applications.



Entrepreneurs think differently about risk


What is the worst thing that can happen? This is where a lot of people start, and it's why they don't even bother evaluating the rest of it. A person who hates their job and doesn't want to work for anyone again might shrug off becoming a freelancer because of the risk involved: quitting a 9–5, losing the steady paycheque and benefits, potentially not having clients and needing to find a new job. However, I like to ask, "Then what?" Will that kill me? No, it just means you may need to find a new job. Plenty of people get laid off and need to find new jobs; that's not the end of the world. If you're worried that raising money for your startup will cause you to give away too much ownership in your company, and that your investors will one day take control and oust you from the company, that's a real fear. But even then, you would still own your shares in the company. And if they do oust you, maybe it's because you're doing an abysmal job as CEO and they need someone who can grow the company. You would still own a big chunk of that company.


APT Trends Report Q2 2018


We also observed some relatively quiet groups coming back with new activity. A noteworthy example is LuckyMouse (also known as APT27 and Emissary Panda), which abused ISPs in Asia for waterhole attacks on high-profile websites. We wrote about LuckyMouse targeting national data centers in June. We also discovered that LuckyMouse unleashed a new wave of activity targeting Asian governmental organizations just around the time they had gathered for a summit in China. Still, the most notable activity during this quarter is the VPNFilter campaign, attributed by the FBI to the Sofacy and Sandworm (Black Energy) APT groups. The campaign targeted a large array of domestic networking hardware and storage solutions. The malware is even able to inject itself into traffic in order to infect computers behind the infected networking device. We have provided an analysis of the EXIF-to-C2 mechanism used by this malware. This campaign is one of the most relevant examples we have seen of how networking hardware has become a priority for sophisticated attackers. The data provided by our colleagues at Cisco Talos indicates this campaign was truly global in scale.


Organizations must act to safeguard 'the right to be forgotten'

The immediate need is clear—the capability to delete accounts and any associated personal data. But this is not as simple as it might first appear. Organizations are loath to give up data—it helps them improve their own business models, and quite frankly, it is profitable. One need only look at the recent reselling of user information to third parties to realize its value. Enterprises, then, would need to be compelled to part with what they perceive as valuable—and governments are attempting this with legislation such as GDPR. Beyond the necessary business case, however, lie technological challenges. While many online services have built-in deletion and removal options, lingering personal data is a different matter. If this personal information is located in an application or structured database, then the process is relatively straightforward—eliminate the associated account and its data is also removed. If the sensitive data is in files—detached from applications governed by the business—then it behaves like abandoned satellites orbiting the earth, forever floating in the void of network-based file shares and cloud-based storage.
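Finding that lingering file-based data is a discovery problem before it is a deletion problem. The sketch below is a deliberately minimal illustration of the idea, not any product's method: it walks a file share and counts e-mail addresses as a crude proxy for personal data left in unmanaged text files. The function name, the `.txt`-only scope, and the simple regex are all assumptions for illustration.

```python
import re
from pathlib import Path

# Hypothetical sketch: scan a file share for lingering e-mail addresses,
# one crude proxy for personal data left behind in unmanaged files.
# Real data-discovery tooling goes far beyond a single regex.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_personal_data(root: str) -> dict[str, int]:
    """Map each text file under `root` to the number of e-mail addresses found."""
    hits: dict[str, int] = {}
    for path in Path(root).rglob("*.txt"):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than fail the whole scan
        count = len(EMAIL_RE.findall(text))
        if count:
            hits[str(path)] = count
    return hits
```

A real right-to-be-forgotten workflow would feed such an inventory into a review-and-delete process rather than deleting blindly.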


Big Data Is A Huge Boost To Emerging Telecom Markets

Big data in telecommunications is playing the biggest role by increasing the reach of major telecommunication brands in these markets. This is especially evident in Africa, where telecommunications market growth has been the strongest. In 2004, only 6% of African consumers owned a mobile device. This figure has grown sharply over the past 14 years; there are now over 82 million mobile users throughout the continent. In some regions of Africa, the growth has been faster than even the most ambitious technology economists could have predicted. The number of people in Nigeria who own mobile devices has been doubling every year. Pairing big data and telecom has helped spur growth in the telecommunications industry in several ways; according to an analysis by NobelCom, it will likely lead to cheaper telephone calls between consumers in various parts of the world. Here are some of the biggest: a growing number of telecommunications providers are investing more resources in trying to reach consumers throughout Africa and other emerging telecom markets.


Be smart about edge computing and cloud computing

Edge computing is a handy trick. It's the ability to place processing and data retention at a system that's closer to the target system it's collecting data for, as well as to provide autonomous processing. The architectural advantages are plenty, including not having to transmit all the data to the back-end systems—typically in the cloud—for processing. This reduces latency and can provide better security and reliability as well. But, and this is a big "but," edge computing systems don't stand alone. Indeed, they work with back-end systems to collect master data and provide deeper processing. This is how edge computing and cloud computing provide a single symbiotic solution. They are not, and will never be, mutually exclusive. Some best practices are emerging around edge computing that allow enterprises to make better use of both platforms. ... The edge computing hype will drive confusion in the next few years. To avoid that confusion, you need to understand what role each type of system plays, and you need to understand that very few technologies fully replace existing ones.
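The division of labour described above can be sketched in a few lines. This is a hedged illustration under assumed names (the `EdgeNode` class and its `outbox` stand-in for the cloud uplink are invented for this example): the edge node keeps raw sensor readings local and ships only compact summaries upstream, which is the latency-and-bandwidth win the passage describes.

```python
from statistics import mean

# Hypothetical sketch: an edge node buffers raw readings locally and
# sends only a windowed summary to the back end, rather than every sample.
class EdgeNode:
    def __init__(self, window: int = 10):
        self.window = window
        self.buffer: list[float] = []
        self.outbox: list[dict] = []  # stands in for the uplink to the cloud

    def ingest(self, reading: float) -> None:
        self.buffer.append(reading)
        if len(self.buffer) >= self.window:
            self.flush()

    def flush(self) -> None:
        # Only the aggregate crosses the network, not every raw reading.
        self.outbox.append({
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        })
        self.buffer.clear()
```

The cloud side would then combine these summaries with master data for the deeper processing the article mentions; the two tiers are complementary, not competing.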


Can Cybersecurity be Entrusted with AI?

While the technology can help fill cybersecurity skill gaps, it is a powerful tool for hackers as well. In short, AI can act as guard and threat at the same time. What matters is who uses it, and for what purpose. In the end, it all depends on natural intelligence to make good or bad use of artificial intelligence. There are paid and free tools available that attempt to modify malware to bypass machine-learning antivirus software; the question is how to detect and stop them. Cyberattacks such as phishing and ransomware are said to be much more effective when powered by AI. On the other hand, AI is extremely good at recognizing patterns and anomalies in behaviour, which makes it an excellent tool for threat hunting. Will AI be the bright future of security, given that the sheer volume of threats is becoming very difficult for humans to track alone? Or might AI usher in a darker era? It all depends on natural intelligence. Natural intelligence is needed to develop AI and machine-learning tools. Despite popular belief, these technologies cannot replace humans (in my personal opinion); using them requires human training and oversight.
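To make the threat-hunting point concrete, here is the simplest possible form of the anomaly spotting the excerpt attributes to AI: flag time buckets whose event counts sit far from the historical norm. This is an illustrative z-score sketch only (the function name and threshold are assumptions), not any vendor's detection method; production systems use far richer models.

```python
from statistics import mean, stdev

# Illustrative only: flag hours whose login-failure counts deviate
# strongly from the historical norm -- the most basic form of the
# pattern/anomaly recognition the article attributes to ML-based
# threat hunting.
def anomalous_hours(counts: list[int], threshold: float = 3.0) -> list[int]:
    """Return indices whose value lies more than `threshold` standard
    deviations from the mean of `counts`."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []  # perfectly flat history: nothing stands out
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]
```

Even this toy version shows why oversight matters: the threshold is a human judgment call, and every flagged hour still needs a human (or a much smarter model) to decide whether it is an attack or a quirk.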


How to Adopt a New Technology: Advice from Buoyant on Utilising a Service Mesh


Adopting technology and deploying it into production requires more than a simple snap of the fingers. Making the rollout successful and delivering real improvements is even tougher. When you're looking at a new technology, such as service meshes, it is important to understand that the organizational challenges you'll face are just as important as the technology. But there are clear steps you can take in order to navigate the road to production. To get started, it is important to identify what problems a service mesh will solve for you. Remember, it isn't just about adopting technology; once the service mesh is in production, there need to be real benefits. This is the foundation of your road to production. Once you've identified the problem that will be solved, it is time to go into sales mode. No matter how small the price tag is, getting a service mesh into production requires real investment. And that investment will be required from more than just you: changes impact coworkers in ways that range from learning a new technology to disruption of their mission-critical tasks.


How Businesses Can Navigate the Ethics of Big Data

The laws regarding data protection and privacy differ from country to country across the world. The EU has a distinct set of laws pertaining to this matter, and they are visibly different from what the United States has. Privacy within the EU is often said to be stronger than it is in the U.S. Although the myths may exaggerate the difference, the EU is miles ahead of the U.S. when it comes to stringent data and privacy protection. Privacy is considered a fundamental right for all individuals living in the EU, and details of privacy and data protection are debated there as intensely as gun control is in the U.S. The U.S. does have privacy protections, but the crux of the matter is that the two jurisdictions govern privacy under separate bodies of law. The diversity of data protection laws across countries puts forward the notion that there is a need for globally accepted norms governing how privacy and protection are provided to users and their data. Such globally accepted norms would set the standards and a pathway for others to follow when it comes to data protection.


Selling tech initiatives to the board: Eight success tips for IT leaders

Too many IT leaders, especially if they are busy running multiple projects, underestimate how much time it takes to put together a really persuasive presentation. Some of the most compelling presentations are those built around a demo of the technology being discussed or those with a strong video presentation that draws the audience into the topic. However, demos and videos aren't going to help if you don't have a clear and cogent message for board members who are charged with ensuring that the company is well run, is making the right kinds of investments, and is positioning itself for the future. If what you present doesn't check all of these boxes, it won't succeed. ... It's easy for a technology leader to get mired in tech talk and lose an audience. The board already knows that you know tech. What it wants to know is how well you understand the business and how tech can advance it. The best way to show them that you're focused on the business is to present a clear message in plain English and to avoid technology buzzwords and levels of detail that are extraneous to the business decision that has to be made.



Quote for the day:


"Growth happens when you fail and own it, not until. Everyone who blames stays the same." -- Dan Rockwell


Daily Tech Digest - July 08, 2018

Why Big Data and AI are the Next Digital Disruptions?

Big data and Artificial Intelligence are two inextricably linked technologies, to the point that we can talk about Big Data Intelligence. Artificial Intelligence has become ubiquitous in companies in all industries where decision making is transformed by intelligent machines. The need for smarter decisions and Big Data management are the criteria that drive this trend. The convergence between Big Data and AI seems inevitable as the automation of smart decision-making becomes the next evolution of Big Data. Rising agility, smarter business processes and higher productivity are the most likely benefits of this convergence. The evolution of data management did not go smoothly. Much of the data is now stored on a computer, but there is still a lot of information on paper, despite the possibility of scanning paper information and storing it on disks or in databases. You just have to go to a hospital, an administration, a doctor’s office or any business to realize that a lot of information about customers, vendors, or products is still stored on paper. However, it is impossible to store terabytes of data produced by streaming video, text and images on paper.



Why today’s leaders need to know about the power of narratives

Effective narratives articulate the 'why' - a higher purpose or common goal that helps form a shared identity
Effective narratives are defined by two characteristics. Firstly, they articulate the "why" - a higher purpose or common goal that helps actors overcome vested interests and form a shared identity. The first line in Satoshi Nakamoto’s eminent white paper that launched Bitcoin describes how "a purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution". Secondly, effective narratives establish cause-effect relationships that form the basis for working towards this goal. Chinese electric vehicle manufacturers and Tesla are rivals in retail markets, but also partners in propagating the idea that electric passenger vehicles are the best means for lowering carbon emissions. Narratives interact with the real world in that actors combine normative beliefs (the "why") and positive beliefs (the "how") into decisions which result in perceived outcomes that potentially trigger a change of the narrative itself. As such, narratives are categorically different from stories. Stories are self-contained, whereas narratives are open-ended.


Announcing Microsoft Research Open Data – Datasets

The goal is to provide a simple platform to Microsoft researchers and collaborators to share datasets and related research technologies and tools. Microsoft Research Open Data is designed to simplify access to these datasets, facilitate collaboration between researchers using cloud-based resources and enable reproducibility of research. We will continue to shape and grow this repository and add features based on feedback from the community. We recognize that there are dozens of data repositories already in use by researchers and expect that the capabilities of this repository will augment existing efforts. ... Datasets in Microsoft Research Open Data are categorized by their primary research area, as shown in Figure 4. You can find links to research projects or publications with the dataset. You can browse available datasets and download them or copy them directly to an Azure subscription through an automated workflow. To the extent possible, the repository meets the highest standards for data sharing to ensure that datasets are findable, accessible, interoperable and reusable; the entire corpus does not contain personally identifiable information. The site will continue to evolve as we get feedback from users.


Augmented Reality in Manufacturing is Ready for Its Closeup

Virtual and augmented reality (VR and AR), using technology to see something that literally is not there, are coming to a manufacturing facility near you. In fact, they're already there: according to PwC, more than one in three manufacturers will implement virtual or augmented reality in manufacturing processes in 2018. Perhaps it will be something relatively simple, like what logistics giant DHL recently accomplished by introducing "Vision Picking," pilot programs in which workers wear smart glasses with visual displays of order-picking instructions along with information on where items are located and where they need to be placed on a cart. The smart glasses freed pickers' hands of paper instructions and allowed them to work more efficiently and comfortably. ... "Digitalization is not just a vision or program for us at DHL Supply Chain, it's a reality for us and our customers, and is adding value to our operations on the ground," says Markus Voss, Chief Information Officer & Chief Operating Officer, DHL Supply Chain.


The Golden Record: Explained


Where ‘big data’ appears to be the skeleton key that will unlock everything you want to know about your business, there’s more than meets the eye when it comes to understanding your data. Yes, clean data will unlock incredible value for your enterprise; inaccurate records, on the other hand, are a significant burden on productivity. This is why we all seek the “Golden Record”. The Golden Record is the ultimate prize in the data world: a fundamental concept within Master Data Management (MDM), defined as the single source of truth, one record that captures all the necessary information we need to know about a member, a resource, or an item in our catalogue, assumed to be 100% accurate. Its power is undeniable.  ... The complexity of implementing a Master Data Management solution stems from defining the workflow that will connect our disparate data sets. First, we have to identify every data source that feeds into the dataset. Then, we must consider which fields we find to be the most reliable depending on their source. Finally, we must define the criteria that determine when data from one source should overwrite conflicting data from a secondary source in our MDM system.
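The three steps above (identify sources, rank their reliability, define overwrite criteria) can be sketched as a simple survivorship rule. This is a hedged illustration, not a real MDM engine: the source names, the trust ranking, and the "most trusted non-empty value wins" criterion are all assumptions made up for this example.

```python
# Hypothetical sketch of the survivorship step described above: each
# field of the golden record takes its value from the most trusted
# source that actually supplies one.
SOURCE_RANK = {"crm": 0, "billing": 1, "web_signup": 2}  # lower = more trusted

def golden_record(records: dict[str, dict]) -> dict:
    """Merge per-source records into one, honouring SOURCE_RANK precedence."""
    merged: dict = {}
    # Visit sources from least to most trusted so trusted values overwrite.
    for source in sorted(records, key=lambda s: SOURCE_RANK[s], reverse=True):
        for field, value in records[source].items():
            if value not in (None, ""):
                merged[field] = value
    return merged
```

Real MDM systems add matching (deciding that three records describe the same entity in the first place), field-level rather than source-level trust, and audit trails, but the overwrite criterion at the core looks much like this.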



Some IoT experts, taking a practical view, think the only requirement at the end points should be to deliver secure identity, with no other complexity. Amir Haleem, CEO of Helium, which is building a decentralized network of wide-range wireless protocol gateways and a token to connect edge IoT devices, said adding complexity to end devices "is like a gigantic hurdle to people actually building things." Apart from anything else, there's the cost. "People get very sensitive about the bill of materials (BoM) when you start talking at a scale of millions or tens of millions," said Haleem. "You start proposing like a 60 cent addition to a BoM and all of a sudden that's a meaningful number." Haleem said it makes no sense for end devices, like sensors that track and monitor medicine or food supply chains, to actively participate in a blockchain, because these have to be power-efficient and cheap in an IoT setting. But delivering strong identity in the form of hardware-secured keys is essential, particularly in the face of recurring widespread vulnerabilities, botnets and the like.
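What "strong identity without blockchain participation" can look like is worth a small sketch. The example below is illustrative only and uses HMAC-SHA256 as a software stand-in for what a secure element would do with a per-device key that never leaves the chip; the function names and payload format are assumptions, not Helium's design.

```python
import hmac
import hashlib

# Illustrative sketch: a per-device secret key (in practice burned into
# a secure element, never leaving the chip) authenticates each reading,
# giving strong identity without requiring the device to run a
# blockchain node.
def sign_reading(device_key: bytes, payload: bytes) -> bytes:
    """Produce an authentication tag binding `payload` to this device."""
    return hmac.new(device_key, payload, hashlib.sha256).digest()

def verify_reading(device_key: bytes, payload: bytes, tag: bytes) -> bool:
    """Check the tag in constant time; a tampered payload or wrong key fails."""
    return hmac.compare_digest(sign_reading(device_key, payload), tag)
```

The design point Haleem makes survives intact here: signing a message costs microjoules and cents, whereas participating in consensus costs power, memory and BoM dollars that sensor-class devices do not have.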


Can we have ethical artificial intelligence?

“Generally, the idea that needs to be adopted by the industry is ethical design right from the very start. So, it’s no longer useful just to have ethical approval of a system once it’s done and deployed – it has to be considered from the beginning and it has to be continuously considered.” It’s clear that the problem with intelligent machines is people. Without careful checks and balances, we could find ourselves using data that is inherently biased to feed machines which would themselves become biased. And without serious consideration and action, we might also find ourselves at the whim of corporations and governments. Francois Chollet, an artificial intelligence researcher at Google, wrote in a recent blog post that AI poses a threat given the possibility of ‘highly effective, highly scalable manipulation of human behaviour.’ He also stated that continued digitization gives social media companies an ever-increasing insight into our minds, and ‘casts human behaviour as an optimization problem, as an AI problem: it becomes possible for [them] to iteratively tune their control vectors in order to achieve specific behaviours.’


The Answer to Disruptive Technology is “Education”

When disruptive technologies are addressed in education, they are usually considered in isolation. I increasingly come across discussions about “artificial intelligence,” “blockchain,” or “robots.” But the world is revolving more and more around these technologies working together. Disruptive technologies are accelerating each other’s development, creating new societal, economic, legal and commercial realities. For instance, disruptive digital technologies (operating together) are transforming the way business works. Instead of hierarchical and asset-heavy companies, we see flatter organizations and platforms with fewer assets and employees. Coordination of the assets and workers isn’t done by traditional managers, but by digital technologies, sensors, and data analytics. Some even predict the end of the firm. ... Disruptors create growth by redefining performance in a way that either brings a simple, cheap solution to the low end of a traditional market or enables “non-consumers” to solve pressing problems in their everyday lives. Employing “old world” ideas seems unlikely to work when pursuing the new.


8 Deep Learning Frameworks for Data Science Enthusiasts

The machine learning paradigm is continuously evolving. The key is the shift towards developing machine learning models that run on mobile in order to make applications smarter and far more intelligent. Deep learning is what makes solving complex problems possible. As put in ​this ​article, deep learning is basically machine learning on steroids: there are multiple layers to process features, and generally, each layer extracts some piece of valuable information. Given that deep learning is the key to executing tasks of a higher level of sophistication, building and deploying such models successfully proves to be quite the Herculean challenge for data scientists and data engineers across the globe. Today, we have a myriad of frameworks at our disposal that allow us to develop tools offering a better level of abstraction, along with simplification of difficult programming challenges. Each framework is built in a different manner for different purposes. Here, we look at 8 deep learning frameworks to give you a better idea of which one will be the perfect fit, or come in handy, in solving your business challenges.


Process Simulation with the Free DARL Online Service

Trading is about transferring funds from one financial instrument to another and back again, in such a way as to have more of the first financial instrument when you've finished. In this case, the two financial instruments we will use are the pound sterling and the dollar. I'm going to start with a simulated £10,000 and trade in and out of the dollar. ... This data contains a date and the open, close, high and low exchange rates for each day. We're going to simulate trading at the close; in the file, this value is called "price". When you trade anything through an exchange, or use a high street foreign exchange, there are two sources of cost. There's normally a transaction fee, and there's a "spread", which is an offset to the central rate. These are the sources of guaranteed profit to the brokers. Our simulation will have values for both of these. Simulating trading will require us to respond to a trading signal and buy whichever currency we are told, to calculate and subtract charges, and to keep track of the value of our holding. In currency-trading parlance, you can be "long" or "short" a particular currency. If we are holding sterling, we are long sterling and short the dollar; if we are holding dollars, we are short sterling and long the dollar.
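The accounting described above (a flat fee plus a spread either side of the mid rate on every conversion) can be sketched directly. All parameter values and function names below are illustrative assumptions, not the article's actual figures or the DARL service's API.

```python
# Minimal sketch of the charge accounting described above: converting
# between sterling and dollars at the daily close, paying a flat fee
# plus a spread either side of the mid "price". Values are illustrative.
FEE = 1.0        # flat transaction fee, charged in the currency received
SPREAD = 0.0005  # offset applied either side of the mid rate

def gbp_to_usd(gbp: float, price: float) -> float:
    """Sell sterling at the close; the broker quotes below mid and takes a fee."""
    return gbp * (price - SPREAD) - FEE

def usd_to_gbp(usd: float, price: float) -> float:
    """Buy sterling back; the broker quotes above mid and takes a fee."""
    return usd / (price + SPREAD) - FEE

def round_trip(gbp: float, price: float) -> float:
    """Sterling left after going long the dollar and back at the same close."""
    return usd_to_gbp(gbp_to_usd(gbp, price), price)
```

One property falls out immediately: a round trip at an unchanged rate always loses money, which is exactly the brokers' "guaranteed profit" the passage mentions. A trading signal only pays off if the rate moves by more than the combined fee and spread.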



Quote for the day:


"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy


Daily Tech Digest - July 07, 2018

The Re-Permissioning Dilemma Under GDPR

There seem to be divergent opinions on the requirement to re-permission data subject consent under GDPR. Article 4(11) of GDPR makes it clear that if the basis for the collection of personally identifiable information is consent, as required under Article 6, then such consent must be “freely given, specific, informed and an unambiguous indication of the data subject’s wishes…by a clear affirmative action…” Accordingly, obtaining positive and affirmative consent is mandatory; otherwise data controllers and processors may be infringing upon data subject rights and may be subject to legal remedies, liabilities and penalties. However, Recital 171 of the GDPR appears to obviate the need to obtain positive data subject consent anew by affirming that “Where processing is based on consent pursuant to Directive 95/46/EC, it is not necessary for the data subject to give his or her consent again if the manner in which the consent has been given is in line with the conditions of this Regulation…” This provision may give comfort to organizations that have made significant investments in building out their contact databases based on implied or opt-out consent as the basis for their collection.
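In practice, the Recital 171 question becomes a triage over existing consent records: which ones were already gathered by a clear affirmative opt-in (and can stand), and which were implied or opt-out (and need re-permissioning)? The sketch below is a purely hypothetical data model for that triage; the field names and method labels are invented for illustration and are not legal advice.

```python
from dataclasses import dataclass

# Hypothetical data model for the triage the passage describes:
# consent collected by a clear affirmative opt-in may stand under
# Recital 171; implied or opt-out consent would need re-permissioning.
@dataclass
class ConsentRecord:
    subject_id: str
    method: str  # e.g. "opt_in", "opt_out", "implied"

# Only affirmative opt-in matches the Article 4(11) standard in this sketch.
GDPR_COMPLIANT_METHODS = {"opt_in"}

def needs_repermissioning(record: ConsentRecord) -> bool:
    """True if this record's consent method falls short of the GDPR standard."""
    return record.method not in GDPR_COMPLIANT_METHODS
```

A real audit would also record when and in what wording consent was captured, since "in line with the conditions of this Regulation" turns on those details, not just the opt-in/opt-out distinction.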


Are We Experiencing a ‘Fintech Moment’?

Many banks – mostly the larger banks – invest large amounts of resources in digital technologies and have for the last decade. While smaller banks and credit unions lag in these investments, and attention on what is coming, their technology providers continue to focus on emerging technologies. The crux of the Kodak situation was the speed of the customer behavior shift, and their failure to adjust strategy accordingly. Digital technology has and continues to impact banking customers’ behavior and expectations as well. Luckily, the banking customers’ behavior has changed much slower than in the case of Kodak, allowing financial institutions to move at their speed thus far. Complacency, in this situation, should worry banking executives. The industry now lags in rethinking customer experiences compared to other industries. Technology innovators and early adopters move to fintech firms or the more tech-savvy banks. As the early majority begins this jump, banks will have to be ready to serve them.


What US FinTech can learn from UK FinTech?

Stricter regulations appear to be the biggest hurdle for US digital banks in growing and innovating. Many US digital banks are finding it simpler to partner with existing institutions, rather than forge their own path. Moven and Simple did just that and avoided getting their own charters. Varo Money followed this tactic too, but it has applied for its own bank charter in order to become the first app-based bank in the US. It is much more difficult to get a bank charter (the main banking licence) in the US. Bank charters are more complicated and expensive than in the UK, where only one regulator has to issue them. In fact, US regulators haven’t approved a new bank charter in 10 years, effectively prohibiting any startup from acquiring one. Furthermore, deposit insurance requirements benefit UK bank startups too, as they scale with the size of the bank. Another huge advantage for UK bank challengers is the implementation of “open banking.” Starting this past January, open banking has forced the UK’s nine largest banks to share their data with licensed startups (with account holder approval).


Singapore Has Fintech Dreams, But It's Short on Tech Talent

“It would be great if the government can consider experimental schemes,” including a one-year employment pass for technology workers who are vetted by industry associations, “in order to address the immediate talent shortage gap,” he said. “This would be a balanced approach to protect jobs for locals, which should always be a priority, while still allowing Singapore to capture emerging global opportunities in the area of fintech and blockchain,” he said. Ravi Menon, managing director of the Monetary Authority of Singapore, who is spearheading the nation’s fintech efforts, said the talent crunch is a global problem and the biggest challenge for the city state’s burgeoning industry. The central bank is building capacity in the industry through upskilling programs, but that’s not enough. “We have to admit and acknowledge that there are some talents or skill sets we just don’t have and we have to remain open to foreign talent,” Menon said at a fintech event in May.


7 Features That Make Kubernetes Ideal For CI / CD

Kubernetes is a powerful, next generation, open-source platform for automating the deployment, scaling and management of application containers across clusters of hosts. It can run any workload. Kubernetes provides exceptional developer user experience (UX), and the rate of innovation is phenomenal. From the start, Kubernetes’ infrastructure promised to enable organizations to deploy applications rapidly at scale and roll out new features easily while using only the resources needed. With Kubernetes, organizations can have their own Heroku running in their own Google Cloud, AWS or on-premises environment. In years past, think about how often development teams wanted visibility into operations deployments. Developers and operations teams have always been nervous about deployments because maintenance windows had a tendency to expand, causing downtime. Operations teams, in turn, have traditionally guarded their territory so no one would interfere with their ability to get their job done. Then containerization and Kubernetes came along, and software engineers wanted to learn about it and use it. It’s revolutionary.


Memo to the CFO: Get in front of digital finance—or get left back

Digitization is now a realistic goal for the finance function because of a range of technological advances. These include the widespread availability of business data; teams’ ability to process large sets of data using now-accessible algorithms and analytic methods; and improvements in connectivity tools and platforms, such as sensors and cloud computing. CFOs and their teams are the gatekeepers for the critical data required to generate forecasts and support senior leaders’ strategic plans and decisions—among them, data relating to sales, order fulfillment, supply chains, customer demand, and business performance as well as real-time industry and market statistics. ... CFOs may decide to champion and pursue investments in one or all of these areas. Much will depend on the company’s starting point—its current strategies, needs, and capabilities and its existing technologies and skill sets. It is important to note that digital transformation will not happen all at once, and companies should not use their legacy enterprise resource planning and other backbone systems as excuses not to start the change.


Make the Leap from IT 'Pro' to IT 'Manager'

In many technology-focused jobs, employees don’t necessarily need to have a complete grasp regarding what the business does, how it operates or what direction it is going in. Yet, if you really want to make your mark in IT management, having a thorough understanding of the inner-workings of the business is crucial. Therefore, when time permits, make sure you read and fully understand the organization's mission statement. Talk to other managers and get their opinions on the current and future state of the company. Then armed with that knowledge, begin applying it toward your tech-specific role. Doing so will help you to better communicate with end users about why they need specific apps and how things like digital transformation of the business impacts their ability to work. ... In traditional enterprise organizations, an IT department is broken into multiple teams. For example, teams commonly are split into service desk, infrastructure, DevOps, database and data security, to name a few. If you want to stand out as a leader in the IT department, a good place to start is within your specific team.


Cyber resilience in Scotland: combating cyber crime

The Scottish government unveiled its cyber resilience strategy in 2015, with the aim of helping Scotland’s people, businesses and public sector improve their ability to use technology securely, and understand and address cyber crime. It launched more detailed cyber resilience plans for the public sector in November 2017, and for the private and third sectors in June 2018. ... Part of this investment involves supporting the wider adoption of the Cyber Essentials scheme, with the aim of “at least [doubling] the number of organisations across the public, private and third sectors holding Cyber Essentials or Cyber Essentials Plus certification in Scotland during Financial Year 18-19”. All Scottish public-sector bodies are expected to achieve certification to the Cyber Essentials scheme by the end of October 2018. Cyber Essentials was developed by the UK government to provide five cyber security controls that all organisations can implement to achieve a baseline of cyber security.


Ericsson UDN, Mode Move Needle


Managing multiple networks on a global scale is no small task. And while the reliability and security of MPLS is certainly comforting, it is quite rigid. Service providers and enterprises can leverage this private core network to avoid scaling issues as well as the costly and unbending nature of antiquated edge solutions. Marcus Bergström, VP and GM of Ericsson UDN, noted: “We’re proud to have discovered and on-boarded Mode to help add value immediately to our existing customer base. Mode saw immediate value and scale of running its breakthrough routing technology on our modern edge compute platform that is built in partnership with service providers globally.” Specifically, Ericsson UDN (Unified Delivery Network) is now melded with Mode HALO to reinvigorate the private core network. Mode’s routing algorithms play a central role in the cloud service solution. The self-service solution enables migration into the modern age of performance, while introducing flexibility and a reduction in cost.


Management AI: Types Of Machine Learning Systems


“Hopefully” is the key difference between a heuristic and a deterministic algorithm. When deterministic algorithms are run, you will get a solution. Heuristics are “rules of thumb”: rules that can make predictions but do not have certainty. Heuristic algorithms often use probabilistic methods when applying rules and producing results. Two terms relevant to expert systems are forward chaining and backward chaining. Forward chaining starts from evidence and drives to a conclusion. Backward chaining starts with a conclusion and then checks to see if the evidence supports that conclusion. Think about cause and effect. ... Analytics as ML is a sensitive and controversial idea to many. As mentioned in an earlier article, machine learning is moving past a purely AI origin. In the last decade, business intelligence (BI) has increasingly incorporated more advanced analytics. BI analytics include deterministic algorithms that process massive amounts of data to identify patterns as well as make predictive and prescriptive suggestions. Those algorithms are the foundation of analytics used in the BI sector.
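The two inference directions can be sketched with a toy rule set. The rules and facts below are invented purely for illustration; real expert systems use far richer rule languages:

```java
import java.util.*;

// Forward vs. backward chaining over a tiny, invented rule base.
public class Chaining {
    // A rule: if all premises hold, the conclusion holds.
    record Rule(Set<String> premises, String conclusion) {}

    static final List<Rule> RULES = List.of(
        new Rule(Set.of("fever", "cough"), "flu"),
        new Rule(Set.of("flu"), "stay-home")
    );

    // Forward chaining: start from evidence, fire rules until nothing new is derived.
    static Set<String> forward(Set<String> facts) {
        Set<String> known = new HashSet<>(facts);
        boolean changed = true;
        while (changed) {
            changed = false;
            for (Rule r : RULES) {
                if (known.containsAll(r.premises()) && known.add(r.conclusion())) {
                    changed = true;
                }
            }
        }
        return known;
    }

    // Backward chaining: start from a goal and recursively check whether
    // the evidence (directly or via some rule) supports it.
    static boolean backward(String goal, Set<String> facts) {
        if (facts.contains(goal)) return true;
        for (Rule r : RULES) {
            if (r.conclusion().equals(goal)
                    && r.premises().stream().allMatch(p -> backward(p, facts))) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Set<String> evidence = Set.of("fever", "cough");
        System.out.println(forward(evidence));               // derives flu, then stay-home
        System.out.println(backward("stay-home", evidence)); // true
    }
}
```

Note that neither direction involves probability here; a heuristic system would attach confidence scores to rules instead of treating every match as certain.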



Quote for the day:


"Leaders don't inflict pain. They bear pain." -- Max DePree


Daily Tech Digest - July 06, 2018

Globally, businesses are expected to invest $3.1bn in blockchain-based systems in 2018 according to IDC, more than double the figure from the previous year. If these predictions are correct, RSA warns that security teams could be left blind to cyber attacks because many traditional SIEM tools are unable to baseline the ‘new normal’ behaviours associated with blockchain and could allow hackers to gain entry to corporate networks. “Opinions are mixed on whether blockchain is a flash in the pan, or the next major disruptor. However, there is evidence – particularly in financial services – that blockchain adoption is gaining momentum,” said Azeem Aleem, global director of RSA’s Advanced Cyber Defence Practice. “If this is the case, then organisations need to be prepared for the impact this could have on their security operations teams,” he said. As with any new technology, Aleem said cyber attackers will look for vulnerabilities in how businesses implement blockchain, adding that any disruption or security breach due to a blockchain-related vulnerability could have a serious impact on operations.


PKO launches blockchain-based documentation verification platform

Trudatum has been piloted by PKO BP for over a year, as the result of the “Let’s Fintech” accelerator programme, and alongside a number of other successful Coinfirm pilots with clients across Western Europe, the United States and Japan. The first stage of implementing Trudatum across the bank will focus on integrating it with PKO’s current systems and providing a solution which makes it possible to verify the authenticity of various bank documents. Every document recorded in the blockchain (e.g. proof of a transaction, or the bank’s terms and conditions for a given product) will be issued in the form of an irreversible digest (a “hash”) signed with the bank’s private key. This will allow a client to verify remotely whether the files they received from a business partner or from the bank are genuine, or whether a modification of the document was attempted. Thanks to Coinfirm’s solution, PKO BP can now provide more efficient supervision of transaction history and data verification, which will be beneficial both in terms of time savings and costs of managing these processes. Trudatum is not only a solution for the challenges above, but it also permits cryptographic security for digital signatures.
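The client-side check described above amounts to recomputing the document's digest and comparing it with the one recorded on the chain. A minimal sketch of that idea, with the private-key signature step omitted; the class and method names here are illustrative, not PKO's or Coinfirm's actual API:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HexFormat;

// Hash-based document verification: any change to the bytes, however small,
// produces a completely different SHA-256 digest.
public class DocCheck {
    static String sha256(byte[] document) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        return HexFormat.of().formatHex(md.digest(document));
    }

    // A received file matches the recorded document iff the digests are equal.
    static boolean verify(byte[] received, String recordedHash) throws Exception {
        return sha256(received).equals(recordedHash);
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "Terms and conditions v1".getBytes(StandardCharsets.UTF_8);
        String recorded = sha256(original);  // this is what would live on-chain
        byte[] tampered = "Terms and conditions v2".getBytes(StandardCharsets.UTF_8);
        System.out.println(verify(original, recorded)); // true
        System.out.println(verify(tampered, recorded)); // false
    }
}
```

Signing the digest with the bank's private key (as the article describes) would additionally prove *who* recorded it, not just *what* was recorded.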


IT infrastructure management software learns analytics tricks


While some users remain cautious or even skeptical of AIOps, IT infrastructure management software of every description -- from container orchestration tools to IT monitoring and incident response utilities -- now offers some form of analytics-driven automation. That ubiquity indicates at least some user demand, and IT pros everywhere must grapple with AIOps, as tools they already use add AI and analytics features. PagerDuty, for example, has concentrated on data analytics and AI additions to its IT incident response software in 2017 and 2018. A new AI feature added in June 2018, Event Intelligence, identifies patterns in human incident remediation behavior and uses those patterns to understand service dependencies and communicate incident response suggestions to operators when new incidents occur. "The best predictor of what someone will do in the future is what they actually do, not what they think they will do," said Rachel Obstler, vice president of products at PagerDuty, based in San Francisco.


BOE tells U.K. banks cyber attacks are coming, now get ready

Financial regulators told firms to come up with a detailed plan for restoring services such as payments, lending and insurance after a disruption, and to invest in the staff and technology to make it work. The plan should include time limits on how long an outage could last. “Boards and senior management should assume that individual systems and processes that support business services will be disrupted, and increase the focus on back-up plans, responses and recovery options,” the Bank of England and the Financial Conduct Authority said. The discussion paper published on Thursday is part of the regulators’ effort to bolster the resilience of financial firms in response to a rising number of operational failures. The focus is on ensuring continuity of business services that are essential for the economy. The regulators underlined the role that firms’ senior officials have to play in improving their ability to bounce back in a crisis. Thursday’s paper is intended to spark a debate with industry and consumers on how best to respond to inevitable disruptions.


Collaborative Intelligence: Humans and AI Are Joining Forces

As AIs increasingly reach conclusions through processes that are opaque (the so-called black-box problem), they require human experts in the field to explain their behavior to nonexpert users. These “explainers” are particularly important in evidence-based industries, such as law and medicine, where a practitioner needs to understand how an AI weighed inputs into, say, a sentencing or medical recommendation. Explainers are similarly important in helping insurers and law enforcement understand why an autonomous car took actions that led to an accident—or failed to avoid one. And explainers are becoming integral in regulated industries—indeed, in any consumer-facing industry where a machine’s output could be challenged as unfair, illegal, or just plain wrong. For instance, the European Union’s new General Data Protection Regulation (GDPR) gives consumers the right to receive an explanation for any algorithm-based decision, such as the rate offer on a credit card or mortgage. This is one area where AI will contribute to increased employment: Experts estimate that companies will have to create about 75,000 new jobs to administer the GDPR requirements.


Nasdaq CIO Puts AI to Work

“There’s not an industry that I can see that won’t benefit (from AI),” he said. Technology executives at Nasdaq and other firms say the big value in AI comes when it’s paired with human workers, in what’s known as “AI augmentation.” In 2021, AI augmentation will generate $2.9 trillion in business value and recover 6.2 billion hours of worker productivity, according to forecasts from Gartner Inc. Last year, Nasdaq’s in-house team of data scientists and data engineers built an AI system that helps analysts write change-of-ownership reports. Such reports typically include information for chief executives and investor relations officers about institutional activity, including top buyers and sellers, as well as shareholder analysis, price performance and valuation. In the past, the reports were higher quality when the humans writing them had a lot of experience. But when those analysts moved on to other jobs, it took time to train new employees to write the reports, and in turn, to ramp up the quality, Mr. Peterson said. The AI system, currently in pilot phase, helps generate some portions of the report quickly and at a high quality, freeing up human analysts to spend more time providing deeper context and advising clients, Nasdaq said.




While no one was looking, California passed its own GDPR

California Consumer Privacy Act of 2018
What happens now? If you do business in California, you have to comply with the law, and so does any company to which you sell customer data. If they violate the law, you are on the hook for it. And you have to add a “Do Not Sell My Personal Information” link to your site. No doubt the law will be challenged, and the ballot can always come back if the law is weakened or overturned. If you are potentially impacted by GDPR in any way, you should have already done some compliance work. Now, if you do business in California you will have to, even if you aren’t in the state. Basically, all the best practices for GDPR apply here. This means making sure all of your data is accurate. Now would be a really good time to revisit customer and mailing lists, because finding any inaccuracies now will save the trouble of fixing them later. Old, outdated or obsolete data can be removed. Make sure all data collection channels know of the new rules and adjust accordingly to take in correct data and quickly get at it to make changes or removals. Make sure to document data handling rules so everyone who handles data, either for intake, editing or management, knows what is expected.


Reactive or Proactive? Making the Case for New Kill Chains

Organizations won't see these employees looking at job search websites. Instead, they will visit websites where they can circumvent web proxies. These are websites that allow them to hide, and then jump to the Dark Web, for example, to move data and bypass controls. The next stage of the chain is when they persistently try logging into systems to which they typically do not have access. They quietly "jiggle doors" looking for sensitive data that is outside the scope of their, their peers', and overall team's role. Taken together, these two stages — visiting suspicious websites and jiggling doors — are good indicators that a person may be a persistent threat. The next stage is when the person acts. For example, on a regular basis, s/he may encrypt small amounts of sensitive data and exfiltrate it outside the network. By breaking the data down into small amounts, the person aims to evade detection, and by encrypting it, makes it even more difficult because the company cannot see what's inside. Obviously, the goal is to stop the person before getting to the final stage of exfiltration.


You can no longer afford to indulge cloud blockers

Many enterprises are today highly successful with cloud computing, and the evidence clearly shows that the cloud is more secure than on-premises systems, costs less to operate, and provides key strategic capabilities such as agility and reduced time to market. But there are still those people who have kept cloud computing out of their companies for the last decade, at first through active resistance and dismissal, now by being quietly passive-aggressive. Today, they are faced with a boss, board of directors, and staffers who are all looking at new information, and perhaps facing competition that is faster and more agile with cloud computing. These cloud resisters are in a full-blown state of cognitive dissonance. This cognitive dissonance is bad for both them and their companies. Many of these people are seen as blockers, and so they lose their jobs; CIOs top the list. What a waste of talent! Worse, they also end up wasting their companies’ time and money trying to prove to everyone that they were indeed right about something they are not right about.


The Generational Shift in IT Drives Change for IT Pros

Instead of focusing on the challenges that emerging technologies bring, focus on the new opportunities they offer, just as IT professionals did when the Internet arrived and mobile devices became more commonplace. IT professionals can play a key role in using technology-driven creativity to bring innovation, standardization, and simplicity to the business, helping the whole organization get ahead of the curve. In order to do this, IT has to move away from patching and backups to value-creating activities such as design thinking, application development, user adoption and learning management. Even the smallest step, such as creating a chatbot that serves as an IT helpdesk, can transform organizational performance and individual productivity. Further, emerging technologies like artificial intelligence, natural language processing, blockchain and the Internet of Things are being built on cloud technology. Understanding these emerging technologies, the data they rely on, and how they can be applied to the business will be critical as IT professionals become strategic partners in deploying these technologies in the enterprise.



Quote for the day:


"The greatest single problem in communication is the illusion that it has occurred." -- G.B. Shaw


Daily Tech Digest - July 05, 2018

Testing machine learning interpretability techniques

Originally, researchers proposed testing machine learning model explanations by their capacity to help humans identify modeling errors, find new facts, decrease sociological discrimination in model predictions, or to enable humans to correctly determine the outcome of a model prediction based on input data values. Human confirmation is probably the highest bar for machine learning interpretability, but recent research has highlighted potential concerns about pre-existing expectations, preferences toward simplicity, and other bias problems in human evaluation. Given that specialized human evaluation studies are likely impractical for most commercial data science or machine learning groups anyway, several other automated approaches for testing model explanations are proposed here (and probably other places too): we can use simulated data with known characteristics to test explanations; we can compare new explanations to older, trusted explanations for the same data set; and we can test explanations for stability.


As the leader “learns” more about her/his team/member(s), thanks to observation and coaching results, he/she will start to gain (or maybe lose) confidence in the progress of her team/member(s). Depending on the progress magnitude, the leader can adjust the level of enablement of the team and/or the individuals. Good progress means more enablement, until the team becomes what we call Autonomous; the best of the breed in the industry, acknowledged by many since the time of Hirotaka Takeuchi and Ikujiro Nonaka, the authors of “The New New Product Development Game”. But what if progress is slow, or below the acceptable norm? Practically, this is the tough part of the story. The answer will certainly depend on a deeper look into the reasons, as well as the organizational ability, to practice patience in developing its people. But in all cases, there is a given threshold for those who neither have the guts nor the desire to improve.



For a strategy to be sound, it should be preceded by a warts-and-all look at the effectiveness and maturity of the as-is position and a clear line of sight of where it needs to get to. This requires a deep understanding of the business within which security operates, alongside measuring the effects of the myriad security jigsaw pieces across the organisation. This almost never happens. If it did, security teams would recognise that investment needs to be made primarily and almost solely on fixing the crap that is already there. How can this be? Well, let’s go through some of the jigsaw pieces that just about every organisation will have in its security picture. Policy – we all have policy. If you work in government, you will have more policy than you can shake a stick at, and in other organisations or industries, hopefully less so. However, almost every policy is the equivalent of the Ten Commandments: thou shalt not commit adultery; thou shalt not share thy password.


'It's going to create a revolution': how AI is transforming the NHS

Computer engineers are fond of asserting that data is the fuel of AI. It is true: some modern approaches to AI, notably machine learning, are powerful because they can divine meaningful patterns in the mountains of data we gather. If there is a silver lining to the fact that everyone falls ill at some point, it is that the NHS has piles of data on health problems and diseases that are ripe for AI to exploit. Tony Young, a consultant urological surgeon at Southend University hospital and the national clinical lead for innovation at NHS England, believes AI can make an impact throughout the health service. He points to companies using AI to diagnose skin cancer from pictures of moles; eye disorders from retinal scans; heart disease from echocardiograms. Others are drawing on AI to flag up stroke patients who need urgent care, and to predict which patients on a hospital ward may not survive. “I think it’s going to create a revolution,” he says.


The art of finding a good data scientist

In the heated competition for data science talent, it’s important to fish in the ponds where not everyone else is fishing, so we’ve found ourselves focusing less on the expected targets like those Stanford and MIT computer science types and more on schools that seem to produce graduates with a robust outlook on applying science in daily life. Carnegie Mellon University and the University of California, Berkeley, are among the institutions that have particularly impressed us. In fact, on May 10, Carnegie Mellon announced it would launch the nation’s first Bachelor of Science program in AI this fall. Many U.S. universities offer an AI track within their computer science or engineering programs, but Carnegie Mellon is establishing a distinct undergraduate major, with a practical focus. Meanwhile, the University of California, San Diego, announced it will begin limiting enrollment in the data science major it started in fall 2017 due to overwhelming demand. What a terrific indication of the soaring interest in data science and a much-needed boost for the pipeline of data science expertise.


Nokia to build & test 5G apps in China with Tencent

5G presents an opportunity to revisit Nokia’s role once again, both as a network services provider as well as a developer of services to run on those networks. “This collaboration with Tencent is an important step in showing webscale companies around the globe how they can leverage the end-to-end capabilities of Nokia’s 5G Future X portfolio,” said Marc Rouanne, president of Mobile Networks at Nokia. “Working with them we can deliver a network that will allow them to extend their service offer to deliver myriad applications and services with the high-reliability and availability to support ever-growing and changing customer demands.” For Tencent, the company already has a huge number of users, and last year it was part of a consortium (with Alibaba, Didi and Baidu) that took a $12 billion stake in mobile operator China Unicom. That partnership will give the company — which has made its fortune in software: messaging apps, games and other services — a stronger place in building services that are more tightly integrated with networks. And this deal with Nokia will extend that kind of work specifically in the area of 5G.


Reskilling facilitates agile IT in the digital era

Reskilling happens organically around agile software development at John Hancock, says Derek Plunkett, who runs application development for the financial services firm's retirement plan services. There, application developers, engineers, quality assurance analysts, cybersecurity talent and other IT staffers work with an array of business workers in small, nimble teams to build various digital products and services, including the company's websites and retirement calculators, says Plunkett. Key to this endeavor is ensuring that IT's culture is aligned around building the best business outcomes for the company's plan participants. "We want to be strategic partners and in order to do that, we need to understand the goals of the business,” Plunkett says, adding that he doesn’t employ a formal rotational program. John Hancock’s IT is moving toward a more engineering-focused, startup culture, which includes pair programming, where two developers code from one keyboard and computer.


UK announces creation of London cybercrime court

The purpose-built court will deal with civil, business, and property cases. Lord Chancellor David Gauke said the deal represents a "message to the world that Britain both prizes business and stands ready to deal with the changing nature of 21st-century crime." "This is a hugely significant step in this project that will give the Square Mile its second iconic courthouse after the Old Bailey," added Catherine McGuinness, Policy Chairman of the City of London Corporation. "I'm particularly pleased that this court will have a focus on the legal issues of the future, such as fraud, economic crime, and cybercrime." According to the Office for National Statistics' latest Crime Survey for England and Wales (CSEW), 4.7 million incidents of criminal fraud and cybercrime were experienced by UK residents in the past year, with bank and credit card fraud forming the majority of cases. Norton suggests that in 2017, £130 billion was stolen from the general public by cybercriminals, of which £4.6 billion in losses were experienced specifically by British consumers.


Data Citizens: Why We All Care About Data Ethics


In the world of data citizenship, these mechanisms are less well defined. Even discovering that bias exists can be challenging since so many data science outcomes are proprietary knowledge. It may not be obvious to anyone who does not have the resources to conduct a large-scale study that hiring algorithms are unintentionally leading to vicious poverty cycles, or that criminal risk assessment software is consistently poor at assessing risk, but great at categorising people by race, or that translation software imposes gendered stereotypes even when translating from a non-gendered language. These are, of course, all examples that have been discovered and investigated publicly, but many others exist unnoticed or unchallenged. In her book “Weapons of Math Destruction,” Cathy O'Neil describes one young man who is consistently rejected from major employers on the basis of a common personality test.


Top six security and risk management trends

New detection technologies, activities and authentication models require vast amounts of data that can quickly overwhelm current on-premises security solutions. This is driving a rapid shift toward cloud-delivered security products. These are more capable of using the data in near real time to provide more-agile and adaptive solutions. “Avoid making outdated investment decisions,” advised Mr. Firstbrook. “Seek out providers that propose cloud-first services, that have solid data management and machine learning (ML) competency, and that can protect your data at least as well as you can.” ... The shift to the cloud creates opportunities to exploit ML to solve multiple security issues, such as adaptive authentication, insider threats, malware and advanced attackers. Gartner predicts that by 2025, ML will be a normal part of security solutions and will offset ever-increasing skills and staffing shortages. But not all ML is of equal value. “Look at how ML can address narrow and well-defined problem sets, such as classifying executable files, and be careful not to be suckered by hype,” said Mr. Firstbrook.



Quote for the day:


"Leadership does not always wear the harness of compromise." -- Woodrow Wilson


Daily Tech Digest - July 04, 2018

Understanding Blockchain Fundamentals, Part 3: Delegated Proof of Stake


The gist is that PoW provides the most proven security to date, but at the cost of consuming an enormous amount of energy. PoS, the primary alternative, removes the energy requirements of PoW, and replaces miners with “validators”, who are given the chance to validate (“mine”) the next block with a probability proportional to their stake. Another consensus algorithm that is often discussed is Delegated Proof of Stake (DPoS) — a variant of PoS that provides a high level of scalability at the cost of limiting the number of validators on the network. ... DPoS is a system in which a fixed number of elected entities (called block producers or witnesses) are selected to create blocks in a round-robin order. Block producers are voted into power by the users of the network, who each get a number of votes proportional to the number of tokens they own on the network (their stake). Alternatively, voters can choose to delegate their stake to another voter, who will vote in the block producer election on their behalf.
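The two DPoS mechanics described above — stake-weighted election of a fixed producer set, then round-robin block production among the winners — can be sketched as follows. The candidates, stakes and slot count are invented, and vote delegation is left out for brevity:

```java
import java.util.*;

// Toy DPoS model: elect the top-N candidates by stake-weighted votes,
// then schedule block production among them in round-robin order.
public class Dpos {
    // Each candidate's total votes, where a voter's weight equals its stake.
    static List<String> electProducers(Map<String, Long> votesByCandidate, int slots) {
        return votesByCandidate.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(slots)                 // fixed number of block producers
                .map(Map.Entry::getKey)
                .toList();
    }

    // Producer responsible for the block at a given height.
    static String producerFor(List<String> producers, long height) {
        return producers.get((int) (height % producers.size()));
    }

    public static void main(String[] args) {
        Map<String, Long> votes = Map.of(
                "alice", 500L, "bob", 300L, "carol", 800L, "dave", 100L);
        List<String> producers = electProducers(votes, 3); // carol, alice, bob
        for (long h = 0; h < 4; h++) {
            System.out.println("block " + h + " -> " + producerFor(producers, h));
        }
    }
}
```

The scalability gain comes from the `limit(slots)` step: with only a small, known set of producers, blocks can be confirmed quickly, which is exactly the trade-off against decentralization that the article notes.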



Cryptocurrency Theft Drives 3x Increase in Money Laundering

"We're now seeing, in the last probably eight to 12 months, a real influx of new criminals that are highly technically sophisticated," he explains. There's a major difference between seasoned threat actors and those who have been dabbling in cybercrime for less than 12 months: operational security. It isn't a question of technical prowess so much as lack of experience, Jevans continues. Cybercrime's newest threat actors can craft advanced malware designed to target cryptocurrency addresses and inject similar addresses, under their control, to receive funds. Their malware is designed to target digital funds in a way traditional malware isn't, created by people who grew up learning about virtual currencies and can exploit them in new ways. The problems start when they secure the money. ... "It's clear these people really understand cryptocurrency and crypto assets really, really well," he explains. "What they don't understand is old-school operational security … they're just not sophisticated that way. Legacy folks, they definitely have better operational security. They're better at how they interface with it, how they distribute malicious code, how they manage user handles on different forums."


Dell New XPS 13 vs. HP Spectre x360 13t: Which laptop is better

With completely refreshed models at hand, we're putting these two dream machines through an old-fashioned smackdown. We're comparing them on everything from design and features to price and performance, declaring a winner in each category. Keep reading to see who comes out ahead. ... Both laptops are extremely portable for what they offer in capability and performance. In pure weight contests, our scale put the New XPS 13 at 2 pounds, 10.5 ounces, and the Spectre x360 13t at 2 pounds, 11.7 ounces. Unless you’re looking for a true featherweight-class device that's closer to two pounds, it’s going to be hard to beat these two. Where it might matter to some is how large the actual body is, which can affect the size of your laptop bag or your comfort on a cramped airplane. While we think this is a pretty close battle, the nod obviously goes to the New XPS 13, which is just incredibly small despite having a 13.3-inch screen. ... It’s interesting that both the XPS 13 and Spectre x360 13t are the last refuge of “good keyboards.” There's no marketing to make you believe that less key travel is better.


4 reasons why CISOs must think like developers

Developers are constantly looking for ways to extend services and share data using APIs and microservices. Microservices help weave a digital fabric through a set of loosely-coupled services stitched together as a platform. Platform-centric architectures provide for extensibility with the ability to plug-and-play new tools and services using APIs with open data formats like JSON. CISOs similarly must start thinking of ways to break down data silos and integrate the data from various tools and sub-systems. The list of “sensors” generating security data is endless and keeps growing every day. Anti-virus scan reports, firewall logs, vulnerability scan data, server access logs, authentication logs and threat profiles are just some of the sources of critical security information. All this data only makes sense when integrated into one single view and analyzed using AI models. The volume, velocity and variety of data make it impossible for human beings to analyze and react. AI-driven models help discern anomalous behavior from regular patterns and are the only scalable approach for detecting threats in near real time. Security operations, automation, analytics and incident response as an integrated platform is the way to go.
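The "single integrated view" idea reduces to a normalize-then-group pipeline: map each sensor's records onto one common shape, then correlate them per asset. A minimal sketch; the sources, hosts and field names are illustrative, not any specific vendor's schema:

```java
import java.util.*;
import java.util.stream.Collectors;

// Merge events from several security "sensors" into one per-host view,
// the shape an analytics layer would consume.
public class SecurityView {
    // Common record shape every feed is normalized into.
    record Event(String source, String host, String detail) {}

    static Map<String, List<Event>> unifiedView(List<List<Event>> perToolFeeds) {
        return perToolFeeds.stream()
                .flatMap(List::stream)                        // break down the silos
                .collect(Collectors.groupingBy(Event::host)); // one view per asset
    }

    public static void main(String[] args) {
        List<Event> avScans = List.of(
                new Event("antivirus", "web-01", "trojan quarantined"));
        List<Event> fwLogs = List.of(
                new Event("firewall", "web-01", "outbound connection blocked"),
                new Event("firewall", "db-02", "port scan detected"));

        Map<String, List<Event>> view = unifiedView(List.of(avScans, fwLogs));
        // Everything known about web-01, across tools, in one place.
        System.out.println(view.get("web-01"));
    }
}
```

In practice the feeds would arrive as JSON over APIs and the grouped view would feed an anomaly-detection model rather than a println, but the integration step is the same.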


Network professionals should think SD-Branch, not just SD-WAN

Doyle defines the SD-Branch as having SD-WAN, routing, network security, and LAN/Wi-Fi functions all in one platform with integrated, centralized management. An SD-Branch can be thought of as the next step after SD-WAN, as the latter transforms the transport and the former focuses on things in the branch, such as optimizing user experience and improving security. ... Most SD-WAN solutions focus on WAN transport, but apps continue on inside the branch. Aruba’s SD-Branch provides fine-grained contextual awareness and QoS across the WAN, but also inside the branch, and can be extended to mobile users. This is an important step in breaking down the management silos of remote networks, in office, and WAN. Network engineers should think of the end-to-end network instead of discrete places. Apps don’t care about network boundaries, and it’s time for network operations to think that way, as well. From an operations perspective, Aruba’s SD-Branch would enable IT organizations to manage more branches with fewer people. The automated capabilities and zero-touch provisioning (ZTP) take care of many of the tasks that were historically done manually.


Open source isn’t the community you think it is

The interesting thing is just how strongly the central “rules” of open source engagement have persisted, even as open source has become standard operating procedure for a huge swath of software development, whether done by vendors or by enterprises building software to suit their internal needs. While it may seem that an open source contribution model that depends on just a few core contributors for so much of the code wouldn’t be sustainable, the opposite is true. Each vendor can take a particular interest in just a few projects, committing code to those, while “free riding” on other projects from which it derives less strategic value. In this way, open source persists, even if it’s not nearly as “open” as proponents sometimes suggest. Is open source then any different from a proprietary product? After all, both can be characterized by contributions from very few vendors, or even just one. Yes, open source is different. Indeed, the difference is profound. In a proprietary product, all the engagement is dictated by one vendor.


Java Parallel Streams
A stream is a sequence of elements. An array is a data structure that stores a sequence of values. Is a stream, then, an array? Well, not really; let's look at what a stream actually is and see how it works. First of all, streams don't store elements; an array does. So, no, a stream is not an array. Also, while collections and arrays have a finite size, streams need not. But if a stream doesn't store elements, how can it be a sequence of elements? Streams are actually sequences of data being moved from one point to another, and they are computed on demand. So they have at least one source, such as an array, a list, or an I/O resource. Take a file as an example: when a file is opened for editing, all or part of it is held in memory to allow changes, and only when it is closed is there a guarantee that no data will be lost or damaged. Fortunately, a stream can read or write data chunk by chunk, without buffering the whole file at once. Just so you know, a buffer is a region of physical memory storage (usually RAM) used to temporarily store data while it is being moved from one place to another.
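The section title promises parallel streams, so here is a minimal sketch of the idea the excerpt builds toward: the same on-demand pipeline can run sequentially or be split across the common fork-join pool with a single `parallel()` call. (This is a generic illustration, not code from the article.)

```java
import java.util.stream.IntStream;

public class ParallelStreamDemo {
    public static void main(String[] args) {
        // The source is an on-demand IntStream, not a stored array:
        // elements are produced as the pipeline consumes them.
        int sequential = IntStream.rangeClosed(1, 100).sum();

        // The same pipeline, split across worker threads in the
        // common fork-join pool and recombined at the terminal operation.
        int parallel = IntStream.rangeClosed(1, 100).parallel().sum();

        System.out.println(sequential + " " + parallel); // prints "5050 5050"
    }
}
```

Because summation is associative, both variants produce the same result; parallel streams only pay off when the per-element work or the element count is large enough to outweigh the coordination overhead.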


Cloud computing concept
The marketplace consists of suppliers and consumers that either rent out or purchase computing power to perform their tasks. Consumers who connect to the virtual space can either select a rental time or buy available power for their projects, and then calculate the cost accordingly. When the power resource is theirs, consumers can then take advantage of SONM’s capabilities to render videos, host apps and websites, make scientific calculations, manage data storage, or work with machine learning. Suppliers — the computing power owners — earn SNM tokens when they sell computer resources to consumers. SONM is completely decentralized, which means the platform is transparent and free from ownership, and the company claims it is less expensive than centralized competitors. “Blockchain enables the creation of a genuinely open decentralized system without a single control center,” Antonio said. “Additionally, using blockchain to manage settlements on-platform with the help of the SNM cryptocurrency allows the interests of participants to be protected.”


An urban scene.
Economic viability is important, says Sharma, because of the public policy imperative to find cost-effective solutions to the problems facing urban areas. “In general, cities are stretched in terms of their budgets,” he says. “They are thinking about how to efficiently utilize all of the assets they have. For example, better traffic management can be an economic alternative to building a new highway. The ultimate goal is not necessarily to build roads, it’s to improve mobility, and do a better job of getting people from point A to point B.” Sharma says that social media and awareness of new technology are increasing the motivation of urban planners and politicians to implement smarter solutions to problems such as traffic congestion, parking shortages, security, and first-responder response times. “Citizens are demanding more from their leaders,” he says. “I think this will motivate policymakers, and result in the right decisions when it comes to using digital technology.” A recently released report from Juniper Research, sponsored by Intel, looks at the evolution of smart cities in the context of mobility, healthcare, public safety and productivity.


Facial Recognition: Big Trouble With Big Data Biometrics

Amazon Web Services, for example, began offering biometric capabilities via Amazon Rekognition in 2016, and it's ready to highlight positive use cases. "We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft)," Matt Wood, general manager for deep learning and artificial intelligence at Amazon Web Services, said in a blog post last month. ... As data breach expert Troy Hunt has written as well as extensively documented: "Sooner or later, big repositories of data will be abused. Period." Hunt was specifically writing about India's Aadhaar implementation, which is the world's largest biometric system, storing about 1.2 billion individuals' details, and which has not been a security success story.



Quote for the day:


"The essence of leadership is the willingness to make the tough decisions. Prepared to be lonely." -- Colin Powell