Daily Tech Digest - July 06, 2018

Globally, businesses are expected to invest $3.1bn in blockchain-based systems in 2018 according to IDC, more than double the figure from the previous year. If these predictions are correct, RSA warns that security teams could be left blind to cyber attacks because many traditional SIEM tools are unable to baseline the ‘new normal’ behaviours associated with blockchain and could allow hackers to gain entry to corporate networks. “Opinions are mixed on whether blockchain is a flash in the pan, or the next major disruptor. However, there is evidence – particularly in financial services – that blockchain adoption is gaining momentum,” said Azeem Aleem, global director of RSA’s Advanced Cyber Defence Practice. “If this is the case, then organisations need to be prepared for the impact this could have on their security operations teams,” he said. As with any new technology, Aleem said cyber attackers will look for vulnerabilities in how businesses implement blockchain, adding that any disruption or security breach due to a blockchain-related vulnerability could have a serious impact on operations.


PKO launches blockchain-based documentation verification platform

Trudatum has been piloted by PKO BP for over a year, as the result of the “Let’s Fintech” accelerator programme, and alongside a number of other successful Coinfirm pilots with clients across Western Europe, the United States and Japan. The first stage of implementing Trudatum across the bank will focus on integrating it with PKO’s current systems and providing a solution which makes it possible to verify the authenticity of various bank documents. Every document recorded in the blockchain (e.g. proof of a transaction, or the bank’s terms and conditions for a given product) will be issued in the form of an irreversible digest (“hash”) signed with the bank’s private key. This will allow a client to remotely verify whether the files received from a business partner or from the bank are authentic, or whether a modification of the document was attempted. Thanks to Coinfirm’s solution, PKO BP can now provide more efficient supervision of transaction history and data verification, which will be beneficial both in terms of time savings and costs of managing these processes. Trudatum is not only a solution for the challenges above, but it also provides cryptographic security for digital signatures.
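The hash-and-sign scheme described here can be sketched with the JDK's built-in crypto APIs. This is an illustrative sketch, not Trudatum's actual implementation: the class and method names (`DocumentNotary`, `newKeyPair`, and so on) are hypothetical, and a real deployment would additionally anchor the signed digest in the blockchain so the record itself is tamper-evident.

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.MessageDigest;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;

public class DocumentNotary {

    // Generate the bank's signing key pair (in production this would live in an HSM).
    public static KeyPair newKeyPair() {
        try {
            KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
            gen.initialize(2048);
            return gen.generateKeyPair();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // Irreversible digest ("hash") of a document; only this, not the document,
    // would be recorded on chain.
    public static byte[] hash(byte[] document) {
        try {
            return MessageDigest.getInstance("SHA-256").digest(document);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // The bank signs the digest with its private key.
    public static byte[] sign(byte[] digest, PrivateKey bankKey) {
        try {
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initSign(bankKey);
            sig.update(digest);
            return sig.sign();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // Anyone holding the bank's public key can re-hash a received file and
    // check it against the signed digest, remotely and offline.
    public static boolean verify(byte[] digest, byte[] signature, PublicKey bankKey) {
        try {
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initVerify(bankKey);
            sig.update(digest);
            return sig.verify(signature);
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        KeyPair bank = newKeyPair();
        byte[] doc = "Terms and conditions, product X, v1".getBytes(StandardCharsets.UTF_8);
        byte[] signature = sign(hash(doc), bank.getPrivate());

        // Unmodified file: digest matches, signature checks out.
        System.out.println(verify(hash(doc), signature, bank.getPublic()));      // true

        // Tampered file: digest differs, verification fails.
        byte[] tampered = "Terms and conditions, product X, v2".getBytes(StandardCharsets.UTF_8);
        System.out.println(verify(hash(tampered), signature, bank.getPublic())); // false
    }
}
```

Because only the digest is published, clients can detect any modification without the bank ever exposing the document contents on chain.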


IT infrastructure management software learns analytics tricks


While some users remain cautious or even skeptical of AIOps, IT infrastructure management software of every description -- from container orchestration tools to IT monitoring and incident response utilities -- now offers some form of analytics-driven automation. That ubiquity indicates at least some user demand, and IT pros everywhere must grapple with AIOps, as tools they already use add AI and analytics features. PagerDuty, for example, has concentrated on data analytics and AI additions to its IT incident response software in 2017 and 2018. A new AI feature added in June 2018, Event Intelligence, identifies patterns in human incident remediation behavior and uses those patterns to understand service dependencies and communicate incident response suggestions to operators when new incidents occur. "The best predictor of what someone will do in the future is what they actually do, not what they think they will do," said Rachel Obstler, vice president of products at PagerDuty, based in San Francisco.


BOE tells U.K. banks cyber attacks are coming, now get ready

Financial regulators told firms to come up with a detailed plan for restoring services such as payments, lending and insurance after a disruption, and to invest in the staff and technology to make it work. The plan should include time limits on how long an outage could last. “Boards and senior management should assume that individual systems and processes that support business services will be disrupted, and increase the focus on back-up plans, responses and recovery options,” the Bank of England and the Financial Conduct Authority said. The discussion paper published on Thursday is part of the regulators’ effort to bolster the resilience of financial firms in response to a rising number of operational failures. The focus is on ensuring continuity of business services that are essential for the economy. The regulators underlined the role that firms’ senior officials have to play in improving their ability to bounce back in a crisis. Thursday’s paper is intended to spark a debate with industry and consumers on how best to respond to inevitable disruptions.


Collaborative Intelligence: Humans and AI Are Joining Forces

As AIs increasingly reach conclusions through processes that are opaque (the so-called black-box problem), they require human experts in the field to explain their behavior to nonexpert users. These “explainers” are particularly important in evidence-based industries, such as law and medicine, where a practitioner needs to understand how an AI weighed inputs into, say, a sentencing or medical recommendation. Explainers are similarly important in helping insurers and law enforcement understand why an autonomous car took actions that led to an accident—or failed to avoid one. And explainers are becoming integral in regulated industries—indeed, in any consumer-facing industry where a machine’s output could be challenged as unfair, illegal, or just plain wrong. For instance, the European Union’s new General Data Protection Regulation (GDPR) gives consumers the right to receive an explanation for any algorithm-based decision, such as the rate offer on a credit card or mortgage. This is one area where AI will contribute to increased employment: Experts estimate that companies will have to create about 75,000 new jobs to administer the GDPR requirements.


Nasdaq CIO Puts AI to Work

“There’s not an industry that I can see that won’t benefit (from AI),” he said. Technology executives at Nasdaq and other firms say the big value in AI comes when it’s paired with human workers, in what’s known as “AI augmentation.” In 2021, AI augmentation will generate $2.9 trillion in business value and recover 6.2 billion hours of worker productivity, according to forecasts from Gartner Inc. Last year, Nasdaq’s in-house team of data scientists and data engineers built an AI system that helps analysts write change-of-ownership reports. Such reports typically include information for chief executives and investor relations officers about institutional activity, including top buyers and sellers, as well as shareholder analysis, price performance and valuation. In the past, the reports were higher quality when the humans writing them had a lot of experience. But when those analysts moved on to other jobs, it took time to train new employees to write the reports, and in turn, to ramp up the quality, Mr. Peterson said. The AI system, currently in pilot phase, helps generate some portions of the report quickly and at a high quality, freeing up human analysts to spend more time providing deeper context and advising clients, Nasdaq said.




While no one was looking, California passed its own GDPR

California Consumer Privacy Act of 2018
What happens now? If you do business in California, you have to comply with the law, and so does any company to which you sell customer data. If they violate the law, you are on the hook for it. And you have to add a “Do Not Sell My Personal Information” link to your site. No doubt the law will be challenged, and the ballot measure can always come back if the law is weakened or overturned. If you are potentially impacted by GDPR in any way, you should have already done some compliance work. Now, if you do business in California you will have to, even if you aren’t in the state. Basically, all the best practices for GDPR apply here. This means making sure all of your data is accurate. Now would be a really good time to revisit customer and mailing lists, because fixing any inaccuracies you find now will save the trouble of doing it later. Old, outdated or obsolete data can be removed. Make sure all data collection channels know of the new rules and adjust accordingly to take in correct data and quickly get at it to make changes or removals. Make sure to document data handling rules so everyone who handles data, whether for intake, editing or management, knows what is expected.


Reactive or Proactive? Making the Case for New Kill Chains

Organizations won't see these employees looking at job search websites. Instead, they will visit websites where they can circumvent web proxies. These are websites that allow them to hide, and then jump to the Dark Web, for example, to move data and bypass controls. The next stage of the chain is when they persistently try logging into systems to which they typically do not have access. They quietly "jiggle doors" looking for sensitive data that is outside the scope of their role, their peers' roles, and the overall team's role. The combination of these two stages — visiting suspicious websites and jiggling doors — is a good indication that a person may be a persistent threat. The next stage is when the person acts. For example, on a regular basis, he or she may encrypt small amounts of sensitive data and exfiltrate it outside the network. By breaking the data down into small amounts, the person aims to evade detection, and by encrypting it, makes detection even more difficult because the company cannot see what's inside. Obviously, the goal is to stop the person before he or she gets to the final stage of exfiltration.
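The "small encrypted chunks" stage lends itself to a simple heuristic: encrypted data looks nearly random, so repeated small uploads with near-maximal Shannon entropy from one host are worth flagging. The class below is a hypothetical sketch of that idea, not a tool from the article; the thresholds are illustrative and would need tuning against real traffic.

```java
import java.util.HashMap;
import java.util.Map;

public class ExfilHeuristic {
    // Illustrative thresholds, not tuned values.
    static final int SMALL_UPLOAD_BYTES = 64 * 1024; // "small amounts" of data
    static final double ENTROPY_THRESHOLD = 7.5;     // near 8 bits/byte => likely encrypted
    static final int ALERT_AFTER = 3;                // flag on repeated occurrences

    private final Map<String, Integer> suspiciousCounts = new HashMap<>();

    // Shannon entropy in bits per byte (8.0 for uniformly random data,
    // ~0 for highly repetitive plaintext).
    public static double entropy(byte[] payload) {
        int[] freq = new int[256];
        for (byte b : payload) freq[b & 0xFF]++;
        double h = 0.0;
        for (int f : freq) {
            if (f == 0) continue;
            double p = (double) f / payload.length;
            h -= p * (Math.log(p) / Math.log(2));
        }
        return h;
    }

    // Returns true once a host has repeatedly sent small, high-entropy payloads.
    public boolean observe(String host, byte[] payload) {
        if (payload.length <= SMALL_UPLOAD_BYTES && entropy(payload) >= ENTROPY_THRESHOLD) {
            int count = suspiciousCounts.merge(host, 1, Integer::sum);
            return count >= ALERT_AFTER;
        }
        return false;
    }
}
```

A heuristic like this only covers one stage of the chain; in practice it would be combined with the proxy-evasion and door-jiggling signals described above before raising an alert.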


You can no longer afford to indulge cloud blockers

Many enterprises are today highly successful with cloud computing, and the evidence clearly shows that the cloud is more secure than on-premises systems, costs less to operate, and provides key strategic capabilities such as agility and reduced time to market. But there are still those people who have kept cloud computing out of their companies for the last decade, at first through active resistance and dismissal, now by being quietly passive-aggressive. Today, they are faced with a boss, board of directors, and staffers who are all looking at new information, and perhaps facing competition that is faster and more agile with cloud computing. These cloud resisters are in a full-blown state of cognitive dissonance. This cognitive dissonance is bad for both them and their companies. Many of these people are seen as blockers, and so they lose their jobs; CIOs top the list. What a waste of talent! Worse, they also end up wasting their companies’ time and money trying to prove to everyone that they were indeed right about something they are not right about.


The Generational Shift in IT Drives Change for IT Pros

Instead of focusing on the challenges that emerging technologies bring, focus on the new opportunities they offer, just as IT did when the Internet arrived and mobile devices became more commonplace. IT professionals can play a key role in using technology-driven creativity to bring innovation, standardization, and simplicity to the business, helping the whole organization get ahead of the curve. In order to do this, IT has to move away from patching and backups to value-creating activities such as design thinking, application development, user adoption and learning management. Even the smallest step, such as creating a chatbot that serves as an IT helpdesk, can transform organizational performance and individual productivity. Further, emerging technologies like artificial intelligence, natural language processing, blockchain and the Internet of Things are being built on cloud technology. Understanding these emerging technologies, the data they rely on, and how they can be applied to the business will be critical as IT professionals become strategic partners in deploying these technologies in the enterprise.



Quote for the day:


"The greatest single problem in communication is the illusion that it has occurred." -- G.B. Shaw


Daily Tech Digest - July 05, 2018

Testing machine learning interpretability techniques

Originally, researchers proposed testing machine learning model explanations by their capacity to help humans identify modeling errors, find new facts, decrease sociological discrimination in model predictions, or to enable humans to correctly determine the outcome of a model prediction based on input data values. Human confirmation is probably the highest bar for machine learning interpretability, but recent research has highlighted potential concerns about pre-existing expectations, preferences toward simplicity, and other bias problems in human evaluation. Given that specialized human evaluation studies are likely impractical for most commercial data science or machine learning groups anyway, several other automated approaches for testing model explanations are proposed here (and probably other places too): we can use simulated data with known characteristics to test explanations; we can compare new explanations to older, trusted explanations for the same data set; and we can test explanations for stability.
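The first of those automated approaches, testing explanations against simulated data with known characteristics, can be sketched in a few lines. Here a deliberately trivial "explanation" (absolute correlation used as a feature-importance score, a stand-in for a real technique such as LIME or SHAP) is checked against data where x1 is known to drive the outcome and x2 is pure noise. The class and method names are hypothetical.

```java
import java.util.Random;

public class ExplanationCheck {

    // Absolute Pearson correlation, used here as a toy feature-importance score.
    public static double absCorrelation(double[] x, double[] y) {
        int n = x.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        double cov = 0, vx = 0, vy = 0;
        for (int i = 0; i < n; i++) {
            cov += (x[i] - mx) * (y[i] - my);
            vx  += (x[i] - mx) * (x[i] - mx);
            vy  += (y[i] - my) * (y[i] - my);
        }
        return Math.abs(cov / Math.sqrt(vx * vy));
    }

    // Simulate data where we *know* x1 drives y and x2 is irrelevant, then
    // check that the explanation ranks x1 above x2. A real test suite would
    // repeat this over many seeds and data shapes (the stability check).
    public static boolean explanationPassesSanityCheck(long seed) {
        Random rng = new Random(seed);
        int n = 500;
        double[] x1 = new double[n], x2 = new double[n], y = new double[n];
        for (int i = 0; i < n; i++) {
            x1[i] = rng.nextGaussian();
            x2[i] = rng.nextGaussian();                  // noise feature
            y[i]  = 3.0 * x1[i] + 0.1 * rng.nextGaussian(); // x1 is the true driver
        }
        return absCorrelation(x1, y) > absCorrelation(x2, y);
    }
}
```

The same harness generalizes to the other proposals in the excerpt: swap in the real explanation technique, and compare its output against a trusted baseline explanation or against itself under perturbed inputs.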


As the leader “learns” more about his or her team members, thanks to observation and coaching results, he or she will start to gain (or maybe lose) confidence in their progress. Depending on the magnitude of that progress, the leader can calibrate the level of enablement of the team and/or the individuals. Good progress means more enablement, until the team becomes what we call Autonomous; the best of the breed in the industry, acknowledged by many since the time of Hirotaka Takeuchi and Ikujiro Nonaka, the authors of “The New New Product Development Game”. But what if progress is slow, or below the acceptable norm? Practically, this is the tough part of the story. The answer will certainly depend on a deeper look into the reasons, as well as on the organization's ability to practice patience in developing its people. But in all cases, there is a given threshold for those who have neither the guts nor the desire to improve.



For a strategy to be sound, it should be preceded by a warts-and-all look at the effectiveness and maturity of the as-is position and a clear line of sight to where it needs to get to. This requires a deep understanding of the business within which security operates, alongside measuring the effects of the myriad security jigsaw pieces across the organisation. This almost never happens. If it did, security teams would recognise that investment needs to be made primarily and almost solely on fixing the crap that is already there. How can this be? Well, let’s go through some of the jigsaw pieces that just about every organisation will have in its security picture. Policy – we all have policy. If you work in government, you will have more policy than you can shake a stick at, and in other organisations or industries, hopefully less so. However, almost every policy is the equivalent of the Ten Commandments: thou shalt not commit adultery; thou shalt not share thy password.


'It's going to create a revolution': how AI is transforming the NHS

Computer engineers are fond of asserting that data is the fuel of AI. It is true: some modern approaches to AI, notably machine learning, are powerful because they can divine meaningful patterns in the mountains of data we gather. If there is a silver lining to the fact that everyone falls ill at some point, it is that the NHS has piles of data on health problems and diseases that are ripe for AI to exploit. Tony Young, a consultant urological surgeon at Southend University hospital and the national clinical lead for innovation at NHS England, believes AI can make an impact throughout the health service. He points to companies using AI to diagnose skin cancer from pictures of moles; eye disorders from retinal scans; heart disease from echocardiograms. Others are drawing on AI to flag up stroke patients who need urgent care, and to predict which patients on a hospital ward may not survive. “I think it’s going to create a revolution,” he says.


The art of finding a good data scientist

In the heated competition for data science talent, it’s important to fish in the ponds where not everyone else is fishing, so we’ve found ourselves focusing less on the expected targets like those Stanford and MIT computer science types and more on schools that seem to produce graduates with a robust outlook on applying science in daily life. Carnegie Mellon University and the University of California, Berkeley, are among the institutions that have particularly impressed us. In fact, on May 10, Carnegie Mellon announced it would launch the nation’s first Bachelor of Science program in AI this fall. Many U.S. universities offer an AI track within their computer science or engineering programs, but Carnegie Mellon is establishing a distinct undergraduate major, with a practical focus. Meanwhile, the University of California, San Diego, announced it will begin limiting enrollment in the data science major it started in fall 2017 due to overwhelming demand. What a terrific indication of the soaring interest in data science and a much-needed boost for the pipeline of data science expertise.


Nokia to build & test 5G apps in China with Tencent

5G presents an opportunity to revisit Nokia’s role once again, both as a network services provider as well as a developer of services to run on those networks. “This collaboration with Tencent is an important step in showing webscale companies around the globe how they can leverage the end-to-end capabilities of Nokia’s 5G Future X portfolio,” said Marc Rouanne, president of Mobile Networks at Nokia. “Working with them we can deliver a network that will allow them to extend their service offer to deliver myriad applications and services with the high reliability and availability to support ever-growing and changing customer demands.” For Tencent, the company already has a huge number of users, and last year it was part of a consortium (with Alibaba, Didi and Baidu) that took a $12 billion stake in mobile operator China Unicom. That partnership will give the company, which has made its fortune in software (messaging apps, games and other services), a stronger place in building services that are more tightly integrated with networks. And this deal with Nokia will extend that kind of work specifically in the area of 5G.


Reskilling facilitates agile IT in the digital era

Reskilling happens organically around agile software development at John Hancock, says Derek Plunkett, who runs application development for the financial services firm's retirement plan services. There, application developers, engineers, quality assurance analysts, cybersecurity talent and other IT staffers work with an array of business workers in small, nimble teams to build various digital products and services, including the company's websites and retirement calculators, says Plunkett. Key to this endeavor is ensuring that IT's culture is aligned around building the best business outcomes for the company's plan participants. "We want to be strategic partners and in order to do that, we need to understand the goals of the business,” Plunkett says, adding that he doesn’t employ a formal rotational program. John Hancock’s IT is moving toward a more engineering-focused, startup culture, which includes pair programming, where two developers code from one keyboard and computer.


UK announces creation of London cybercrime court

The purpose-built court will deal with civil, business, and property cases. Lord Chancellor David Gauke said the deal represents a "message to the world that Britain both prizes business and stands ready to deal with the changing nature of 21st-century crime." "This is a hugely significant step in this project that will give the Square Mile its second iconic courthouse after the Old Bailey," added Catherine McGuinness, Policy Chairman of the City of London Corporation. "I'm particularly pleased that this court will have a focus on the legal issues of the future, such as fraud, economic crime, and cybercrime." According to the Office for National Statistics' latest Crime Survey for England and Wales (CSEW), 4.7 million incidents of criminal fraud and cybercrime were experienced by UK residents in the past year, with bank and credit card fraud forming the majority of cases. Norton suggests that in 2017, £130 billion was stolen from the general public by cybercriminals, of which £4.6 billion in losses were experienced specifically by British consumers.


Data Citizens: Why We All Care About Data Ethics


In the world of data citizenship, these mechanisms are less well defined. Even discovering that bias exists can be challenging, since so many data science outcomes are proprietary knowledge. It may not be obvious to anyone who does not have the resources to conduct a large-scale study that hiring algorithms are unintentionally leading to vicious poverty cycles, or that criminal risk assessment software is consistently poor at assessing risk, but great at categorising people by race, or that translation software imposes gendered stereotypes even when translating from a non-gendered language. These are, of course, all examples that have been discovered and investigated publicly, but many others exist unnoticed or unchallenged. In her book “Weapons of Math Destruction,” Cathy O'Neil describes one young man who is consistently rejected from major employers on the basis of a common personality test.


Top six security and risk management trends

New detection technologies, activities and authentication models require vast amounts of data that can quickly overwhelm current on-premises security solutions. This is driving a rapid shift toward cloud-delivered security products. These are more capable of using the data in near real time to provide more-agile and adaptive solutions. “Avoid making outdated investment decisions,” advised Mr. Firstbrook. “Seek out providers that propose cloud-first services, that have solid data management and machine learning (ML) competency, and that can protect your data at least as well as you can.” ... The shift to the cloud creates opportunities to exploit ML to solve multiple security issues, such as adaptive authentication, insider threats, malware and advanced attackers. Gartner predicts that by 2025, ML will be a normal part of security solutions and will offset ever-increasing skills and staffing shortages. But not all ML is of equal value. “Look at how ML can address narrow and well-defined problem sets, such as classifying executable files, and be careful not to be suckered by hype,” said Mr. Firstbrook.



Quote for the day:


"Leadership does not always wear the harness of compromise." -- Woodrow Wilson


Daily Tech Digest - July 04, 2018

Understanding Blockchain Fundamentals, Part 3: Delegated Proof of Stake


The gist is that PoW provides the most proven security to date, but at the cost of consuming an enormous amount of energy. PoS, the primary alternative, removes the energy requirements of PoW, and replaces miners with “validators”, who are given the chance to validate (“mine”) the next block with a probability proportional to their stake. Another consensus algorithm that is often discussed is Delegated Proof of Stake (DPoS) — a variant of PoS that provides a high level of scalability at the cost of limiting the number of validators on the network. ... DPoS is a system in which a fixed number of elected entities (called block producers or witnesses) are selected to create blocks in a round-robin order. Block producers are voted into power by the users of the network, who each get a number of votes proportional to the number of tokens they own on the network (their stake). Alternatively, voters can choose to delegate their stake to another voter, who will vote in the block producer election on their behalf.
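The election-plus-round-robin mechanics can be sketched in a few lines. This is a simplified, hypothetical model (real DPoS chains such as EOS or BitShares add producer-set shuffling, missed-slot handling, stake delegation, and continuous re-election):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DposSchedule {

    // The top `slots` candidates by stake-weighted votes become block producers.
    public static List<String> electProducers(Map<String, Long> votes, int slots) {
        return votes.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(slots)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    // Elected producers create blocks in round-robin order over the slots.
    public static String producerForSlot(List<String> producers, long slot) {
        return producers.get((int) (slot % producers.size()));
    }

    public static void main(String[] args) {
        // Votes proportional to each voter's token stake (hypothetical numbers).
        Map<String, Long> votes = Map.of(
                "alice", 900L, "bob", 700L, "carol", 400L, "dave", 100L);

        List<String> producers = electProducers(votes, 3);
        System.out.println(producers);                 // [alice, bob, carol]
        System.out.println(producerForSlot(producers, 0)); // alice
        System.out.println(producerForSlot(producers, 4)); // bob
    }
}
```

The scalability trade-off in the excerpt is visible here: because only the small elected set ever produces blocks, scheduling is trivial and fast, but the network's validator count is capped at `slots`.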



Cryptocurrency Theft Drives 3x Increase in Money Laundering

"We're now seeing, in the last probably eight to 12 months, a real influx of new criminals that are highly technically sophisticated," he explains. There's a major difference between seasoned threat actors and those who have been dabbling in cybercrime for less than 12 months: operational security. It isn't a question of technical prowess so much as lack of experience, Jevans continues. Cybercrime's newest threat actors can craft advanced malware designed to target cryptocurrency addresses and inject similar addresses, under their control, to receive funds. Their malware is designed to target digital funds in a way traditional malware isn't, created by people who grew up learning about virtual currencies and can exploit them in new ways. The problems start when they secure the money. ... "It's clear these people really understand cryptocurrency and crypto assets really, really well," he explains. "What they don't understand is old-school operational security … they're just not sophisticated that way. Legacy folks, they definitely have better operational security. They're better at how they interface with it, how they distribute malicious code, how they manage user handles on different forums."


Dell New XPS 13 vs. HP Spectre x360 13t: Which laptop is better

With completely refreshed models at hand, we're putting these two dream machines through an old-fashioned smackdown. We're comparing them on everything from design and features to price and performance, declaring a winner in each category. Keep reading to see who comes out ahead. ... Both laptops are extremely portable for what they offer in capability and performance. In pure weight contests, our scale put the New XPS 13 at 2 pounds, 10.5 ounces, and the Spectre x360 13t at 2 pounds, 11.7 ounces. Unless you’re looking for a true featherweight-class device that's closer to two pounds, it’s going to be hard to beat these two. Where it might matter to some is how large the actual body is, which can affect the size of your laptop bag or your comfort on a cramped airplane. While we think this is a pretty close battle, the nod obviously goes to the New XPS 13, which is just incredibly small despite having a 13.3-inch screen. ... It’s interesting that both the XPS 13 and Spectre x360 13t are the last refuge of “good keyboards.” There's no marketing to make you believe that less key travel is better.


4 reasons why CISOs must think like developers

Developers are constantly looking for ways to extend services and share data using APIs and microservices. Microservices help weave a digital fabric through a set of loosely-coupled services stitched together as a platform. Platform-centric architectures provide for extensibility with the ability to plug-and-play new tools and services using APIs with open data formats like JSON. CISOs similarly must start thinking of ways to break down data silos and integrate the data from various tools and sub-systems. The list of “sensors” generating security data is endless and keeps growing every day. Anti-virus scan reports, firewall logs, vulnerability scan data, server access logs, authentication logs and threat profiles are just some of the sources of critical security information. All this data only makes sense when integrated into one single view and analyzed using AI models. The volume, velocity and variety of data make it impossible for human beings to analyze and react. AI-driven models help discern anomalous behavior from regular patterns and are the only scalable approach for detecting threats in near real time. Security operations, automation, analytics and incident response as an integrated platform is the way to go.


Network professionals should think SD-Branch, not just SD-WAN

Doyle defines the SD-Branch as having SD-WAN, routing, network security, and LAN/Wi-Fi functions all in one platform with integrated, centralized management. An SD-Branch can be thought of as the next step after SD-WAN, as the latter transforms the transport and the former focuses on things in the branch, such as optimizing user experience and improving security. ... Most SD-WAN solutions focus on WAN transport, but apps continue on inside the branch. Aruba’s SD-Branch provides fine-grained contextual awareness and QoS across the WAN, but also inside the branch, and can be extended to mobile users. This is an important step in breaking down the management silos of remote networks, the office, and the WAN. Network engineers should think of the end-to-end network instead of discrete places. Apps don’t care about network boundaries, and it’s time for network operations to think that way, as well. From an operations perspective, Aruba’s SD-Branch would enable IT organizations to manage more branches with fewer people. Automated capabilities and zero-touch provisioning (ZTP) take care of many of the tasks that were historically done manually.


Open source isn’t the community you think it is

The interesting thing is just how strongly the central “rules” of open source engagement have persisted, even as open source has become standard operating procedure for a huge swath of software development, whether done by vendors or enterprises building software to suit their internal needs. While it may seem that such an open source contribution model that depends on just a few core contributors for so much of the code wouldn’t be sustainable, the opposite is true. Each vendor can take particular interest in just a few projects, committing code to those, while “free riding” on other projects for which it derives less strategic value. In this way, open source persists, even if it’s not nearly as “open” as proponents sometimes suggest. Is open source then any different from a proprietary product? After all, both can be categorized by contributions by very few, or even just one, vendor. Yes, open source is different. Indeed, the difference is profound. In a proprietary product, all the engagement is dictated by one vendor.


Java Parallel Streams
A stream is a sequence of elements. An array is a data structure that stores a sequence of values. So is a stream an array? Well, not really - let's look at what a stream really is and see how it works. First of all, streams don't store elements; an array does. So, no, a stream is not an array. Also, while collections and arrays have a finite size, streams don't have to. But if a stream doesn't store elements, how can it be a sequence of elements? Streams are actually a sequence of data being moved from one point to another, and they're computed on demand. So they have at least one source, such as an array, a list, an I/O resource, and so on. Let's take a file as an example: when a file is opened for editing, all or part of it remains in memory, thus allowing for changes, so only when it is closed is there a guarantee that no data will be lost or damaged. Fortunately, a stream can read/write data chunk by chunk, without buffering the whole file at once. Just so you know, a buffer is a region of physical memory storage (usually RAM) used to temporarily store data while it is being moved from one place to another.
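Both points, on-demand (lazy) evaluation and chunked processing, can be shown directly with the standard `java.util.stream` API. Intermediate operations like `map` only build a pipeline; nothing runs until a terminal operation pulls elements through, and calling `parallel()` lets the runtime split the source into chunks for worker threads:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.LongStream;
import java.util.stream.Stream;

public class StreamDemo {

    // Lazy evaluation: map() only describes the computation; the elements
    // flow through the pipeline when collect() (a terminal operation) runs.
    public static List<Integer> squares(List<Integer> nums) {
        Stream<Integer> pipeline = nums.stream().map(n -> n * n); // not evaluated yet
        return pipeline.collect(Collectors.toList());             // evaluated here
    }

    // Parallel processing: the runtime splits the range into chunks and
    // sums them on worker threads, then combines the partial results.
    public static long parallelSum(long n) {
        return LongStream.rangeClosed(1, n).parallel().sum();
    }

    public static void main(String[] args) {
        System.out.println(squares(List.of(1, 2, 3, 4, 5))); // [1, 4, 9, 16, 25]
        System.out.println(parallelSum(1_000));              // 500500
    }
}
```

The chunk-by-chunk file example from the text works the same way: `Files.lines(path)` returns a lazy `Stream<String>` that reads the file as the terminal operation consumes it, rather than buffering the whole file at once.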


The marketplace consists of suppliers and consumers that either rent out or purchase computing power to perform their tasks. Consumers who connect to the virtual space can either select a rental time or buy available power for their projects, and then calculate the cost accordingly. When the power resource is theirs, consumers can then take advantage of SONM’s capabilities to render videos, host apps and websites, make scientific calculations, manage data storage, or work with machine learning. Suppliers — the computing power owners — earn SNM tokens when they sell computer resources to consumers. SONM is completely decentralized, which means the platform is transparent and free from ownership, and the company claims it is less expensive than centralized competitors. “Blockchain enables the creation of a genuinely open decentralized system without a single control center,” Antonio said. “Additionally, using blockchain to manage settlements on-platform with the help of the SNM cryptocurrency allows the interests of participants to be protected.”


Economic viability is important, says Sharma, because of the public policy imperative to find cost-effective solutions to the problems facing urban areas. “In general, cities are stretched in terms of their budgets,” he says. “They are thinking about how to efficiently utilize all of the assets they have. For example, better traffic management can be an economic alternative to building a new highway. The ultimate goal is not necessarily to build roads, it’s to improve mobility, and do a better job of getting people from point A to point B.” Sharma says that social media and awareness of new technology are increasing the motivation of urban planners and politicians to implement smarter solutions to problems such as traffic congestion, parking shortages, security, and first-responder response times. “Citizens are demanding more from their leaders,” he says. “I think this will motivate policymakers, and result in the right decisions when it comes to using digital technology.” A recently released report from Juniper Research, sponsored by Intel, looks at the evolution of smart cities in the context of mobility, healthcare, public safety and productivity.


Facial Recognition: Big Trouble With Big Data Biometrics

Amazon Web Services, for example, in 2016 began to offer biometric capabilities via Amazon Rekognition, and it's ready to highlight positive use cases. "We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft)," Matt Wood, general manager for deep learning and artificial intelligence at Amazon Web Services, said in a blog post last month. ... As data breach expert Troy Hunt has written and extensively documented: "Sooner or later, big repositories of data will be abused. Period." Hunt was specifically writing about India's Aadhaar implementation, which is the world's largest biometric system, storing about 1.2 billion individuals' details, and which has not been a security success story.



Quote for the day:


"The essence of leadership is the willingness to make the tough decisions. Be prepared to be lonely." -- Colin Powell


Daily Tech Digest - July 03, 2018

Facebook releases its load balancer as open-source code

Google is known to fiercely guard its data center secrets, but not Facebook. The social media giant has released two significant tools it uses internally to operate its massive social network as open-source code. The company has released Katran, the load balancer that keeps the company data centers from overloading, as open source under the GNU General Public License v2.0 and available from GitHub. In addition to Katran, the company is offering details on its Zero Touch Provisioning tool, which it uses to help engineers automate much of the work required to build its backbone networks. This isn’t Facebook’s first foray into open-sourcing the software that runs its network. Last month, the company open-sourced PyTorch, the software used for its artificial intelligence (AI) and machine learning projects. PyTorch is a Python-based package for writing tensor computation and deep neural networks using GPU acceleration. Facebook has to develop these kinds of software packages because while there are plenty of off-the-shelf software products out there, none of them is made for a global social media company that has 2 billion users.



If you thought GDPR was bad – Just wait for ePrivacy Regulation

GDPR delineates rules for obtaining clear and unambiguous consent for collection and use of personal information. ePR follows the same definition of what constitutes valid consent but makes it the “central legal ground” for the processing of electronic communications data, direct marketing communications and the access to end users’ terminal devices (phones, wearable devices, gaming consoles, etc.). One area of concern under ePR is that while GDPR also includes legitimate interest (as long as the consumers are aware of this and have consented to it) and contractual necessity as allowable factors for collecting and processing personal data, ePR lacks these exemptions to consent. This adds ambiguity, brings into question the alignment and relationship between GDPR and ePR and may effectively narrow how companies can process electronic communications data as well as what they can collect. ... Or on the flip side, does it mean that companies have to alter account origination processes to obtain consent for processing necessary to set up the account? Both questions seem to extend the need for obtaining consent beyond what the GDPR established.


Jabra Elite 65t true wireless earphones review: A true AirPod alternative

They’ll remain safe during a shower, and they’ll be fine if you get caught in the rain. But just don’t take them swimming. With their IP55 rating, they’ll stand up to a blast from a jet of water. But submersion? Not so much. ... If you remove one of the buds from your ear, it’ll pause whatever you’re listening to. Put it back in, and the music continues—a nice touch of sophistication. And staying true to Jabra’s roots as a Bluetooth headset maker, the Elite 65t can also be used with just one earbud, the right one, pushed into your ear canal. This makes the earphones a good choice for anyone looking to use their phone hands-free while driving. Oh, and should you lose one of your earphones, Jabra makes it easy to buy a replacement through its accessory site. Jabra says the 65t can run for up to five hours on a single charge. I found this estimate to be reasonably accurate. The earphone’s slim charging case, while larger than what you’ll see with a set of AirPods, has enough juice to provide two additional five-hour charges. Users will appreciate the fact that 15 minutes of charging in the case will provide about 90 minutes of music.


Ransomware: Not dead, just getting a lot sneakier

Ransomware may no longer be flavour of the month but it still remains a significant threat. The short-term damage means business can't be done while files are encrypted, and the longer-term impact may be a loss of trust from customers and users who no longer feel the victim can be trusted to keep their data secure. There's also the possibility that a victim who pays the ransom could easily become infected again as attackers realise they've got an easy target on their hands. For cybercriminals ransomware still offers a big payday, quickly, unlike malicious cryptocurrency mining which requires patience to realise a pay-off. Behind much of the potency of ransomware is the EternalBlue SMB vulnerability which allowed WannaCry, NotPetya and other ransomware attacks to self-perpetuate around networks. It's over a year since the NSA vulnerability was leaked by hackers but there are plenty of organisations which, despite the clear demonstrations of the damage attacks exploiting EternalBlue can do, still haven't patched their networks.


The pros and cons of serverless architecture


Fundamentally, serverless lets developers focus on writing code. There are still servers somewhere in the stack, but the developer doesn't need to worry about managing those underlying resources. While services like Amazon Elastic Compute Cloud (EC2) require you to provision resources for the OS and the application, a serverless architecture simply asks how many resources a single invocation of your function requires. For example, a web testing suite might require 128 MB of RAM for any single website. Even if you deploy 10 million copies of that function, each individual one needs only 128 MB. They can even all run at the same time. Serverless focuses on what each individual request requires and then scales automatically. There are several different approaches to serverless development. Most developers who transition from a traditional framework, such as Flask, Rails or Express, might choose to use a serverless framework, such as Chalice for Python or Serverless for Node.js. These frameworks are similar to the traditional ones, which helps ease the transition for those developers.
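As a sketch of the model, here is a hypothetical Lambda-style handler in Python; the event shape and the "website test" logic are illustrative, not a real testing suite or a specific provider's API:

```python
# A minimal Lambda-style handler: the platform invokes this function once per
# request and allocates only the memory configured for it (e.g. 128 MB),
# no matter how many copies run concurrently.

def handler(event, context=None):
    """Hypothetical function that 'tests' a single website per invocation."""
    url = event.get("url", "")
    # Illustrative check only -- a real suite would fetch and inspect the page.
    result = {"url": url, "ok": url.startswith("https://")}
    return {"statusCode": 200, "body": result}

# Each invocation is independent; the platform scales by running more copies.
resp = handler({"url": "https://example.com"})
```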


How careless app developers are risking data of millions of sensitive users

Poorly secured backend databases in thousands of apps are leaking sensitive user data. Many app developers have put millions of users' sensitive medical and financial records at risk through poor coding practices. A recently released report by the mobile security firm Appthority describes the data leaks, pinning the blame on app developers who failed to properly configure Google’s Firebase cloud database. The platform, acquired by Google in 2014, is used for authenticating user details; Firebase is intended to make app development much easier by doing much of the manual authentication work for coders. Appthority’s report lists more than 3,000 apps that leaked user details. Most of these apps are Android-based, with only about 600 on iOS. These incorrectly configured Firebase databases have exposed many users on the internet. Many of the apps record sensitive information such as financial data, employee medical records, and plain-text passwords.
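For context, access to a Firebase Realtime Database is governed by its security rules. A minimal rules fragment like the following (simplified) denies reads and writes to unauthenticated clients, which is precisely the safeguard the misconfigured apps lacked:

```json
{
  "rules": {
    ".read": "auth != null",
    ".write": "auth != null"
  }
}
```

Real apps would scope these rules per path and per user, but even this blanket version prevents the anonymous internet-wide access the report describes.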

Why accounting matters to your cloud computing plans
While cloud computing can save you millions of dollars a year, it may actually cost you money, at least in the short term. That’s something that I’ve run into from time to time with clients over the years. At issue is that you need to consider net savings. That means looking at the all-in cost of the cloud, including dealing with tax and other accounting implications. Although cloud computing is typically a superior model, walking away from traditional hardware and software has a cost as well. Indeed, in a few cases I’ve found that a cloud computing solution that will save $10 million a year actually will cost $15 million considering the impact of taxes. The gross savings made sense for cloud, but the net savings did not. So, how are cloud geeks supposed to deal with these accounting issues? By using business analysts to work up cloud ROI models. It’s not uncommon for these business analysts to be CPAs. Even more complex is the fact that most companies are multinational these days, and so you have to figure out the net cost impact not only for a single country, but for dozens of countries that have some pretty odd laws when it comes to accounting, especially tax issues.
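The arithmetic is simple but easy to overlook. A toy calculation with hypothetical figures (the cost categories and numbers are illustrative, not drawn from the article's client cases):

```python
# Gross savings look great; net savings must subtract the all-in cost of
# leaving the old platform behind (migration, write-offs, tax effects).

def net_savings(gross_savings, migration_cost, asset_write_off, tax_impact):
    return gross_savings - (migration_cost + asset_write_off + tax_impact)

# Hypothetical all-in costs that turn a $10M gross saving into a net loss.
result = net_savings(
    gross_savings=10_000_000,
    migration_cost=6_000_000,
    asset_write_off=4_000_000,
    tax_impact=5_000_000,
)
# A negative result means the cloud move costs money despite the gross saving.
```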


The modern CSO: Future-proofing your organization in a disruptive world

Thinking well in advance about the risk involved in moving to new IT platforms should allow CSOs to make sure that some things (e.g., privacy by design) are taken into account from the start and the emphasis on security and compliance is kept. “It’s also worth keeping up with what is taking place on the security side by looking at the low hanging fruit for security problems. Patching machines, keeping software updated, managing access control – these are all well-understood issues that keep getting exploited,” he notes. “The big problems like WannaCry in 2017 were all due to known issues. Understanding those breaches and patching vulnerabilities quickly should keep companies ahead of the large majority of potential attacks.” New technologies such as containers should also make this easier. “Rather than having to build upon that existing IT infrastructure and keep updating it, you can use a clean container build each time that is up to date. You keep the containers as up to date as possible, you audit any third-party software or plug-ins that get used within those containers continuously, and you focus on those images in your library,” he explains.


Pulse Secure VPN enhanced to better support hybrid IT environments

Pulse Connect Secure is fully mobile-aware, with features such as certificate-based authentication with an embedded certificate authority and integrated endpoint container. Support for SAML authentication allows enterprises to blend data center and cloud resources into a robust user experience.  Pulse Connect Secure simplifies network administration and compliance management with a centralized web-based console, end user self-provisioning, and integration with EMM policy management platforms. Centralized appliance management delivers an IT administration experience that enables proactive and rapid responses to security threats and network events. Administrators are able to replicate configuration and policies from one appliance to others and perform bulk operations for firmware updates and policy changes. An administrative dashboard provides appliance status and unified compliance reporting with context-aware visibility of devices and users. Pulse Connect Secure can be deployed as a hardware, virtual, or cloud appliance. Pulse Secure recently announced a new release of Pulse Connect Secure aimed at simplifying connectivity and security in cloud and hybrid IT environments.


Cybersecurity remains non-core competency for most C-suite executives

Whilst cybersecurity has now become a critical business function, it remains a non-core competence for a significant number of boards. CISOs have become increasingly common in recent years (recent research suggests that nearly two-thirds of large US companies now have a CISO position), but the majority do not report directly to the CEO, which reduces their effectiveness. Cyrus Mewawalla, Head of Thematic Research at GlobalData commented, “The frequency of cyberattacks is only likely to accelerate over the coming years, therefore it is vital that senior executives have a full understanding of the inherent risks and implications. The losers will be those companies whose boards do not take cybersecurity seriously, as they run a higher risk of being hacked.” It is hard to assess a company’s exposure to cybersecurity risk, but the composition of the board often provides clues: CEOs who do not have a CISO reporting directly to them present a high risk.



Quote for the day:


"The leadership team is the most important asset of the company and can be its worst liability." -- Med Jones


Daily Tech Digest - July 02, 2018

Microsoft Surface Studio: A cheat sheet

From the point of view of artists and designers, the Studio offers a high-end computer built around their creative needs, which does away with having to use a separate drawing tablet and computer. Even if creatives ignore the Surface Studio, its release is good news, likely to prompt incumbents like Apple and Wacom to spec up and cut the prices of new machines — in particular for the iMac, which the Studio has been compared to many times, despite the iMac lacking a touchscreen. By following up the immaculately designed Surface Book laptop with a striking machine like the Surface Studio, Microsoft also appears to be trying to establish itself as a competitor to Apple on the design front. The Surface Studio garnered good reviews but with sizable caveats. TechRepublic's sister site ZDNet praised its attractive high-resolution screen and snappy performance but criticized its high price, limited build-to-order and upgradeability options, as well as the fact the Surface Dial is not included by default. CNET had similar concerns, and also highlighted limitations of the GPU choice and lack of front-mounted USB ports and Thunderbolt connection.



UK government cyber security standard welcomed


The standard outlines a set of cyber security outcomes for government departments to achieve in the areas of identification, protection, detection, response and recovery. The outcomes-based approach is aimed at allowing government departments flexibility in how the standards are implemented, “dependent on their local context”, the document states, adding that “compliance with the standards can be achieved in many ways, depending on the technology choices and business requirements in question.” Some of the key requirements include clear lines of responsibility and accountability to named individuals for the security of sensitive information, training and guidance for senior accountable individuals, strict access control, use of secure configurations, regular patching, attention to email and web application security, developing an incident response and management plan and the testing of contingency mechanisms to ensure continued delivery of essential services. One of the few prescriptive uses of technologies is the use of Transport Layer Security version 1.2 (TLS v1.2) to protect email and data in transit.
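In Python, for instance, enforcing a TLS v1.2 floor on a client connection is a one-line setting on an `ssl` context; a minimal sketch of the standard's one prescriptive technology requirement:

```python
import ssl

# Build a client context that refuses anything older than TLS 1.2, in line
# with the standard's requirement for protecting email and data in transit.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Any socket wrapped with this context will fail the handshake against a
# server that only offers TLS 1.1 or older.
```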


How to Write Better Code

For the first of these points, great books help you to read code. Some books that I strongly recommend are Clean Code, Implementation Patterns, Refactoring, The Art of Agile, Pragmatic Programmer, and Practices of an Agile Developer. I've enjoyed reading all of these books immensely. These books will teach you considerations such as low coupling, high cohesion, and simple design. They will teach you useful principles like the Single Responsibility Principle and the Open-Closed Principle. The patterns and principles give you new ideas and a shared vocabulary for discussing code with your team. For the second of these points, Test-Driven Development is one great way to learn how to write code. I enjoy doing coding katas myself and often use them for teaching. But the most valuable skill when writing code is one I learned at code retreats: learning when to delete the code you write. I don't just mean refactoring it to be smaller. For coding exercises, I mean highlighting all the files and pressing the delete button. For production code, I mean using git reset --hard HEAD after spending a few hours working on a task.
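To give a flavour of the kata workflow: write the assertions first, write just enough code to make them pass, and then feel free to throw it all away. A minimal FizzBuzz sketch in Python:

```python
# Classic TDD kata: the asserts below were written before the function.

def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# The 'tests' that drove the implementation.
assert fizzbuzz(3) == "Fizz"
assert fizzbuzz(5) == "Buzz"
assert fizzbuzz(15) == "FizzBuzz"
assert fizzbuzz(7) == "7"
```

The point of the exercise is the red-green rhythm, not the artifact, which is why deleting the result afterwards costs nothing.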


How a robot vacuum navigates your home

Newer, higher-end robot vacuums include self-navigation systems that use mapping technology. Each manufacturer implements its own particular spin on mapping, but current systems are built around one of two slightly different methods. One uses an onboard digital camera to take pictures of walls, ceilings, doorways, furniture and other landmarks. A version of this type of mapping is used in Roomba’s 900 series vacuums and Samsung’s Powerbots. The other method, employed in vacuums like Neato's Botvac series, uses a laser range finder (also called LIDAR, for Light Detection and Ranging) that measures the distance to objects in the vacuum’s path. In either case, the robot vacuum uses the data it collects in combination with information from its other sensors to gradually build a map of the room during its initial cleaning. Mapping delivers significant advantages. Armed with a floor plan, the robot vacuum can plot the most efficient route through the room, which is why mapping models seem to move in more orderly straight lines than their non-mapping counterparts. Mapping also allows the robot vac to localize itself within the map, which tells it where it's been and where it still needs to go.
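The range-finder method reduces to simple trigonometry: each reading is an angle plus a distance, which converts to a point on the map. A toy sketch (the readings and units are made up for illustration):

```python
import math

# A laser range finder reports (angle, distance) pairs; converting them to
# x/y coordinates is the first step in building a map of the room.

def ranges_to_points(readings):
    """readings: iterable of (angle_degrees, distance_m) pairs."""
    return [
        (dist * math.cos(math.radians(ang)), dist * math.sin(math.radians(ang)))
        for ang, dist in readings
    ]

# A wall 2 m straight ahead and another 1 m to the left (90 degrees).
points = ranges_to_points([(0, 2.0), (90, 1.0)])
```

Repeating this as the robot moves, and stitching the point sets together, is (in very rough outline) how the vacuum accumulates its floor plan.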


Slack outages raise reliability concerns

In an interview last month, a Slack representative acknowledged the company's rapid growth has been challenging to keep up with at times. Since January 2015, the company has grown from 1.1 million daily users to 8 million regular users today. "To be frank, we're still learning as we go," said Julia Grace, senior director of infrastructure engineering at Slack, based in San Francisco. "This is such a complex piece of software. We're operating at a global scale. We're learning and evolving and growing and making the service better along the way." Some analysts pointed out that Slack's performance was much worse when it was starting out. "Once upon a time, in the very, very, very early days of Slack, they were built on a model that couldn't scale," said Michael Facemire, an analyst at Forrester Research. "You remember those outages; you remember the old days when [Slack] would be down, and it would be down for very perceivable amounts of time." Nevertheless, with tech powerhouses Cisco and Microsoft as competitors, Slack can no longer afford to look weak. Companies are unlikely to standardize on a collaboration vendor with an uptime record significantly less than rivals.


IEEE Sets Fog Computing Standard

“We now have an industry-backed and -supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” Helder Antunes, chairman of the OpenFog Consortium and senior director at Cisco, said in a statement.  According to the OpenFog website: “The sheer breadth and scale of IoT, 5G and AI applications requires collaboration at a number of levels, including hardware, software across edge and cloud, as well as the protocols and standards that enable all of our ‘things’ to communicate. "Existing infrastructures simply can’t keep up with the data volume and velocity created by IoT devices, nor meet the low-latency response times required in certain use cases such as emergency services and autonomous vehicles. "By extending the cloud closer to the edge of the network, fog enables latency-sensitive computing to be performed in proximity to the data-generating sensors, resulting in more efficient network bandwidth and more functional and efficient IoT solutions. Fog computing also offers greater business agility through deeper and faster insights, increased security and lower operating expenses.”


HealthEngine's Latest Problem: A Data Breach

Embattled Australian medical appointment booking service HealthEngine said late Friday it has notified 75 users of a data breach that may have exposed some identifying information. The data breach is the latest in a string of problems for HealthEngine, which has fallen under scrutiny for tampering with patient reviews and for its third-party marketing activities, which underpin its free medical booking service. The breach involved HealthEngine's Practice Recognition system, which allows patients to write reviews of practices. It is unclear when the breach occurred. More than 59,600 patient feedback entries may have been improperly accessed, and 75 of those contained "identifying information," HealthEngine says in a notice on its website. "Due to an error in the way the HealthEngine website operated, hidden patient feedback information within the code of the webpage was improperly accessed," the company says on its website. "The information is ordinarily not visible to users of the site."


Shadow IT: When employees venture to the dark side

While the factors leading to shadow IT today are the same, the outcome (and risks) are largely new. Equipped with a company credit card and a web browser, users are willing to go outside the scope of IT to get the apps they need to work productively, jeopardizing security and corporate compliance in the process. Employees who play fast and loose with the rules of IT can lead to renegade apps stirring up all sorts of trouble, and everyone can contribute to the problem. Executives who store sensitive notes and documents within apps like Evernote and Dropbox put company secrets at risk. The marketing department can cause financial headaches by, for example, purchasing unsanctioned Salesforce licenses for their team members. When shadow apps run amok outside the scope of IT, a lot can go wrong. Most importantly, without knowledge or control of the apps workers are using, IT admins cannot guarantee corporate or user privacy. Employee workflow and productivity are also at risk. Individual teams that use competing apps (for example, sales uses Slack while engineering uses Microsoft Teams) can make collaboration more difficult, if not impossible. And then there are the costs associated with paying for separate software licenses, or worse, paying double for the same software license across different teams.


Enabling stakeholders to boldly support data governance

Decisions on data governance programs ultimately affect many different stakeholders, some of whom are unknown when those decisions are made. Understanding the full range of people affected is where a lot of the change management aspects come into play. Widening the net to engage as many people as possible helps ensure the program is effective. Creating a data governance program behind closed doors with just a select few participants can be a recipe for disaster. Engagement depends on knowledge and buy-in, so don’t short-change your efforts by limiting participation. Communicate with as many data stakeholders as you can, especially with those who have taken on data community leadership roles. Data stakeholders want to be part of the process of creating, implementing and sustaining a data governance program. They want to know that their knowledge is appreciated and taken into account when decisions are made about policies, procedures, business rules, metadata and tools.


Dispatch From The Super Internet

The Super Internet is the sum total of how the internet operates when you’re running a very large number of Chrome extensions. It’s a different and better internet, where all the normal complaints don’t apply. On the Super Internet, you don’t enter passwords or see advertising. You don’t get tracked. Every page is HTTPS. And if you go to a page where your registered password has been leaked to the dark web, the Super Internet will tell you. Cloud applications on the Super Internet are ten times better than those available to most users. The Super Internet version of Gmail can send and receive SMS, do advanced mail merge, send recurring and scheduled emails, send PGP-encrypted emails, apply follow-up and due-date reminders to incoming emails, edit outgoing emails using HTML or Google Docs, block notification of senders when you open an email — the list goes on and on. The Super Internet has social networking features most users can’t even imagine. Twitter, for example, is enhanced with auto-refreshing streams, one-button account switching, instant and automatic following and unfollowing, the ability to remove any component of Twitter, including promoted tweets, and hundreds of additional features.



Quote for the day:


"The weak can never forgive. Forgiveness is the attribute of the strong." -- Mahatma Gandhi


Daily Tech Digest - July 01, 2018


There’s a lot of interest in becoming a data scientist, and for good reasons: high impact, high job satisfaction, high salaries, high demand. A quick search yields a plethora of possible resources that could help -- MOOCs, blogs, Quora answers to this exact question, books, Master’s programs, bootcamps, self-directed curricula, articles, forums and podcasts. Their quality is highly variable; some are excellent resources and programs, some are click-bait laundry lists. Since this is a relatively new role and there’s no universal agreement on what a data scientist does, it’s difficult for a beginner to know where to start, and it’s easy to get overwhelmed. ... Many of these resources follow a common pattern: 1) here are the skills you need and 2) here is where you learn each of these. Learn Python from this link, R from this one; take a machine learning class and “brush up” on your linear algebra. Download the iris data set and train a classifier (“learn by doing!”). Install Spark and Hadoop. Don’t forget about deep learning -- work your way through the TensorFlow tutorial.
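The "learn by doing" step really can start this small. Here is a toy nearest-centroid classifier in pure Python, with made-up flower measurements standing in for the iris data set (the numbers and labels are illustrative only):

```python
import math

# A minimal 'train a classifier' exercise: compute one centroid per class,
# then classify a new sample by its nearest centroid.

train = {
    "setosa":     [(1.4, 0.2), (1.3, 0.2), (1.5, 0.3)],   # petal length, width
    "versicolor": [(4.5, 1.5), (4.1, 1.3), (4.7, 1.4)],
}

def centroid(points):
    n = len(points)
    return tuple(sum(coords) / n for coords in zip(*points))

centroids = {label: centroid(pts) for label, pts in train.items()}

def classify(sample):
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# A short-petaled sample should land with setosa.
prediction = classify((1.4, 0.25))
```

Swapping in the real iris data and a scikit-learn model is a natural next step, but the core loop of fit-then-predict is already all here.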



Crypto Coin Graveyard Fills Up Fast as ICOs Meet Their Demise

Blockchain startups are faring worse than their counterparts in other industries. Of 103 companies that received initial seed or angel funding in 2013 and 2014, only 28 percent managed to raise additional funding, according to CB Insights’s October report. That compares with 46 percent of the 1,098 tech companies that raised a second round in the U.S. between 2008 and 2010. Among tech companies, 14 percent went on to a fourth round, while only 2 percent of the blockchain companies did, the researcher found. "I don’t think we found the killer app yet," said Arieh Levi, an analyst at CB Insights. "It just seems like there’s been a lot of projects tried, but there aren’t really many users of blockchain protocols beyond speculators and traders." The failed projects have cost investors billions. Barring outliers like BitConnect, which saw its market cap shrink to about $4 million from nearly $3 billion in December, most of the ICOs that birthed these coins were relatively small, but investors may have still lost as much as $500 million, estimated Lex Sokolin, global director of fintech strategy at Autonomous Research LLP.


Why adopt cloud technology in the financial services industry?

Financial firms should undertake a shift in thinking and put technology – rather than finance – at the core of their business. UEBA (User and Entity Behaviour Analytics) and CASB (Cloud Access Security Broker) technologies together provide solutions to these challenges. UEBA tracks what users are doing and how data is moving, flagging if user or data behaviour differs from what could be considered normal and safe. Whether authorised or not, employees can put data and systems at risk, even if they stay within the security policies managed by a CASB. For example, a hacker that’s tricked an employee into divulging their credentials can move cloud data laterally from different applications to a cloud system, designed to surreptitiously withdraw the data afterwards. A recent survey found that hackers can exit a network within an hour, armed with prized data, so it’s vital to spot a compromised account before it’s too late. CASB helps financial firms get the rules of engagement just right, as CASB security keeps users in line with an organisation’s cyber security policies.
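UEBA's core idea, flagging behaviour that deviates from a learned baseline, can be sketched in a few lines; the volumes and the three-sigma threshold below are made up for illustration:

```python
import statistics

# UEBA in miniature: baseline a user's daily download volume, then flag a
# day that deviates far from the norm (here, more than 3 standard deviations).

baseline_mb = [120, 130, 110, 125, 118, 122, 128]  # hypothetical daily volumes

mean = statistics.mean(baseline_mb)
stdev = statistics.stdev(baseline_mb)

def is_anomalous(value, threshold=3.0):
    return abs(value - mean) > threshold * stdev

# A normal day passes; a sudden exfiltration-sized day is flagged.
normal_flag = is_anomalous(124)
exfil_flag = is_anomalous(5000)
```

A real UEBA product models many signals per user and entity, but this captures why a compromised account moving unusual volumes of data stands out even when it stays within policy.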


The Blockchain ecosystem v3: six months after the hype


The crypto world nowadays is not a safe haven. New businesses appear, evolve and rest on their laurels, while others default or fail to meet investors’ expectations. In the nine months since the first version of our map, many projects have soared, but some have terminated all activities. So we have added a special section for discontinued projects and proven frauds, to remind you to triple-check before getting into adventurous endeavors. Nevertheless, positive news has also struck the crypto universe. We have placed on the map many new interesting projects with successfully closed ICOs over the past nine months, as well as companies that we think deserve our readers’ attention. Companies that haven’t released their product yet are painted a pale gray, to make the map even more informative. Although the number of projects we highlight here has more than doubled, half of the newcomers still haven’t launched a product, thereby falling behind those who have. As might be expected, a decent portion of these newcomers come from the financial services sector.



Using Topological Data Analysis to Understand the Behavior of Convolutional Neural Networks


There is a particular class of neural networks that are well adapted to databases of images, called convolutional neural networks. In this case, the input nodes are arranged in a square grid corresponding to the pixel array for the format of the images that comprise the data. The nodes are composed in a collection of layers, so that all edges whose initial node is in the i-th layer have their terminal node in the (i+1)-st layer. A layer is called convolutional if it is made up of a collection of square grids identical to the input layer, and it is understood that the weights at the nodes in each such square grid (a) involve only nodes in the previous layer that are very near to the corresponding node and (b) obey a certain homogeneity condition, so that for each square grid in layer i, the weights attached to a given node are identical to those for another node in the same grid, but translated to its surrounding neurons. Sometimes intermediate layers called pooling layers are introduced between convolutional layers, and in this case the higher convolutional layers are smaller square grids. Here is a schematic picture that summarizes the situation.
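The weight-sharing (homogeneity) condition is what makes a layer convolutional: one small kernel of weights is applied at every position of the grid. A minimal sketch in plain Python, with a made-up 3x3 input and 2x2 kernel (no deep learning library assumed):

```python
# Weight sharing in miniature: the same 2x2 kernel is applied at every
# position of the input grid, so identical weights detect the same local
# pattern everywhere -- exactly conditions (a) locality and (b) homogeneity.

def conv2d(grid, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(grid) - kh + 1
    out_w = len(grid[0]) - kw + 1
    return [
        [
            sum(
                grid[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

image = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
]
diag_kernel = [[1, 0], [0, 1]]   # responds to the diagonal pattern

feature_map = conv2d(image, diag_kernel)
```

The output grid is smaller than the input, which is also why pooling layers between convolutional layers yield progressively smaller square grids.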


‘Moneyball’ing data – A closer look at how churn and propensity models work


As machine learning, deep learning, artificial intelligence, etc. become mainstream words that are taught in primary schools these days, it pays to fully understand how the system truly makes predictions and prescribes actions that a business should take. In this article, let's look at how churn models and propensity-to-buy models can help you 'moneyball' your data. First things first: to 'moneyball' your data, you first need to have data. It can be anything from sales data, customer demographics, visits, social profiles, customer feedback, etc. This data, which forms the basis your models are trained on, is called 'training data'. Models and algorithms are either pre-built or can be customized for a specific use case. For example, if you want to understand which segment of customers is going to churn in the next quarter, you can build a churn model that expresses the churn risk of a specific customer or set of customers as a percentage. You can then get an output along the lines of 'Top 100 customers that are going to churn in Q2 18' and use that report to engage those customers better.
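A minimal sketch of that workflow, assuming a simple logistic scoring model: the feature names, weights, and customer records below are hypothetical placeholders (a real churn model would learn its weights from the training data described above), but the shape of the output, a per-customer churn probability ranked into a report, is the same:

```python
import math

# Hypothetical features and hand-picked weights, for illustration only.
# In practice these weights are learned from historical training data.
WEIGHTS = {"months_since_last_visit": 0.45,
           "support_tickets": 0.30,
           "tenure_years": -0.25}
BIAS = -1.0

def churn_probability(customer):
    """Score one customer with a logistic model: output in (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * customer[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def top_at_risk(customers, n=2):
    """Rank customers by churn probability, highest risk first."""
    scored = [(c["id"], churn_probability(c)) for c in customers]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:n]

customers = [
    {"id": "A", "months_since_last_visit": 6, "support_tickets": 4, "tenure_years": 1},
    {"id": "B", "months_since_last_visit": 1, "support_tickets": 0, "tenure_years": 5},
    {"id": "C", "months_since_last_visit": 3, "support_tickets": 2, "tenure_years": 2},
]
report = top_at_risk(customers)  # the 'top N customers likely to churn' report
```

The returned list is the machine-readable equivalent of the 'Top 100 customers' report: each entry pairs a customer ID with its churn probability, ready to hand to a retention team.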


EU Report Says Cryptocurrencies 'Unlikely' to Challenge Central Banks

In the latest Monetary Dialogue report issued on June 26, the European Parliament's Committee on Economic and Monetary Affairs said that while cryptocurrencies have made financial transactions "relatively safe, transparent, and fast," they pose no threat to sovereign currencies around the world. The analysis, which was conducted by the Center for Social and Economic Research, a non-profit research institute based in Warsaw, first recognized the positive changes cryptocurrencies have brought to financial transactions, noting that they now "are used globally, disregarding national borders." Cryptocurrencies "respond to real market demand," the analysis claimed, and they have the potential to become "full-fledged private money" or even a permanent element of the global economy. However, the researchers said it is "unlikely" cryptocurrencies will threaten central banks and sovereign currencies and dismantle the existing monetary structures, especially in countries where the sovereign currency is widely circulated.


The Spooky World of Quantum Computation


Quantum theories are counter-intuitive, because we all know that's not how the world works. The theories were unsettling even to the scientists who developed them, who struggled with their strange implications. Einstein hated the idea that the world was fundamentally non-deterministic, which led to his famous pronouncement that 'God does not play dice.' Schrödinger hoped his later theoretical work would eliminate what he called 'the quantum jump nonsense' and expressed regret at ever having contributed to quantum theory. To show how absurd superposition was, Schrödinger devised a thought experiment in which a cat was isolated from the outside world in a box. Also in the box were a radioactive rock, a bottle of cyanide, and a Geiger counter wired to a hammer, which would smash the bottle if any radioactivity was detected. The emission of particles from a radioactive source is a probabilistic event, which cannot be predicted ahead of time. If a particle was emitted, the poison would be released, killing the occupant of the box. Quantum theory says that until a measurement was made, the decaying particle would be in both a decayed and a non-decayed state.
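The probabilistic rule at the heart of the thought experiment can be sketched numerically. This is a classical simulation of a single measurement, not a quantum computation; the amplitudes below are illustrative (any pair whose squares sum to 1 would do), and it shows only the Born rule: repeated measurement yields each outcome with frequency equal to the squared amplitude:

```python
import random

# Illustrative amplitudes for the two-state system |decayed> and |intact>.
# By the Born rule, P(decayed) = 0.6**2 = 0.36 and P(intact) = 0.8**2 = 0.64.
AMP_DECAYED = 0.6
AMP_INTACT = 0.8

def measure(rng):
    """Collapse the superposition: before this call the atom is treated as
    being in both states at once; measurement yields one definite outcome."""
    return "decayed" if rng.random() < AMP_DECAYED ** 2 else "intact"

rng = random.Random(42)  # fixed seed for a reproducible run
outcomes = [measure(rng) for _ in range(10_000)]
freq_decayed = outcomes.count("decayed") / len(outcomes)
# freq_decayed converges toward |AMP_DECAYED|^2 = 0.36 as samples grow
```

No single measurement can be predicted, exactly as the passage says of the radioactive emission; only the long-run frequencies are determined by the theory.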


A Deep Dive Into Cloud-Agnostic Container Deployments

Container orchestration refers to the automated organization, linkage, and management of software containers. These concepts are conventional for most of the tools mentioned above. This article aims to take a deep dive into a comparison between the two dominant players. ... Kubernetes necessitates a set of manual configurations to tie its components to the Docker engine, and it comes with unique installations for every operating system. Before installation, Kubernetes requires information such as node IP addresses, their roles, and their number. There are many tools available to simplify the install and configuration process, though. Kubernetes is considered relatively white-box: you can get a lot more out of it, but you really need a deep understanding of what makes Kubernetes tick to achieve this. The platform is not designed for novices or the faint of heart to navigate. Looking through the pros and cons of Docker Swarm, you can see that Swarm's focus is on ease of adoption and integration with Docker. Kubernetes, on the other hand, stands open and flexible.
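The white-box-versus-simplicity trade-off shows up directly in the declarative objects each orchestrator consumes. A minimal sketch, expressed as Python dicts for side-by-side comparison (the service name, image tag, and replica count are hypothetical placeholders): a Kubernetes Deployment spells out selectors, labels, and a pod template, while a Swarm stack reuses the much terser Compose format:

```python
import json

# Kubernetes Deployment: verbose, but every layer (selector, pod template,
# container spec) is an explicit object you can inspect and extend.
k8s_deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "web"}},
        "template": {
            "metadata": {"labels": {"app": "web"}},
            "spec": {"containers": [{"name": "web",
                                     "image": "nginx:1.25"}]},
        },
    },
}

# Docker Swarm stack: the same intent in the Compose file format --
# far less to configure, at the cost of Kubernetes' richer object model.
swarm_stack = {
    "services": {
        "web": {"image": "nginx:1.25", "deploy": {"replicas": 3}},
    },
}

manifest = json.dumps(k8s_deployment, indent=2)  # serialize for submission
```

Both objects ask for the same thing, three replicas of one container image, which makes the difference in surface area easy to see: the flexibility the article attributes to Kubernetes comes from exactly those extra explicit layers.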


2018 State of Testing Report


Open questions can sometimes be tricky, but they are also incredibly interesting, as they provide an open platform for individual testers to express themselves and provide answers we could not foresee ahead of time. Specifically, in the question about non-testing tasks, we saw a number of recurring answers pointing towards testers working either closer to customers (organizing beta testing programs, or briefing customers directly on the functionality of the product) or representing these customers while serving as product owners within their teams. We also saw a number of answers stating that testers are now writing product code as part of their day-to-day tasks - aligned with the philosophy that teams are uniform and every member can and should be able to perform all actions. Open questions are also an opportunity for respondents to release some of the tension and frustration they feel as part of their work... like the respondent who said one of their non-testing tasks was to serve as a ZOO KEEPER - something I am sure many of us have felt at one time or another in our testing careers.



Quote for the day:


"Leadership happens at every level of the organization and no one can shirk from this responsibility." -- Jerry Junkins