Daily Tech Digest - July 29, 2018

C# 8 Ranges and Recursive Patterns


Ranges make it easy to define a sequence of data. They are a replacement for Enumerable.Range(), except that a range defines the start and stop points rather than a start and a count, which helps you write more readable code. ... Pattern matching is a powerful construct available in many functional programming languages such as F#. Furthermore, pattern matching provides the ability to deconstruct matched objects, giving you access to parts of their data structures. C# offers a rich set of patterns that can be used for matching. Pattern matching was initially planned for C# 7, but the .NET team found that it needed more time to finish this feature. For this reason, the work was divided into two main parts: basic pattern matching, which was already delivered with C# 7, and advanced pattern matching for C# 8. C# 7 gave us the constant pattern, the type pattern, the var pattern and the discard pattern. In C# 8, we will see more patterns, such as the recursive pattern, which consists of multiple sub-patterns like the positional pattern and the property pattern.
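The excerpt describes the range syntax only in prose. As a rough cross-language analogy (this is Python, not C#, and the list values are invented for illustration), slice notation captures the same idea of naming start and stop points instead of a start and a count:

```python
data = [10, 20, 30, 40, 50, 60]

# Enumerable.Range-style selection: a start index plus a COUNT of items
start, count = 1, 3
by_count = data[start:start + count]

# Range-style selection: explicit START and STOP points (stop exclusive)
by_range = data[1:4]

print(by_count)  # [20, 30, 40]
print(by_range)  # [20, 30, 40]
```

In C# 8 itself this is written with the `..` operator, e.g. `data[1..4]`, which is likewise end-exclusive.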


There are also several issues to consider in terms of both PCI and HIPAA compliance when working with CSPs. Among them is the requirement that comes up during compliance audits regarding where your data resides and what protective measures are in place. With cloud services, that’s sometimes easier said than done. Many CSPs employ a network of data centers that work together to provide high availability and security of your data. As a result, the data may be moved to different data centers across large geographic spans based on service levels, resource demand, cost, latency, disaster recovery and business continuity needs. For security reasons, CSPs may be reluctant to divulge the location of their data centers or where data is specifically located at any one time. Things can become more complex in the case of global providers. With the European Union’s implementation of the General Data Protection Regulation (GDPR), it is important to know where your data resides. Almost every business is touched by the impact of the GDPR.


API Governance Models in the Public and Private Sector: Part 5

Beyond the technology, the legal department should have a significant influence over APIs going from development to production, providing a structured framework that applies generally across all services, but also granular control over the fine-tuning of legal documents for specific services and use cases, with a handful of clear building blocks in use to help govern the delivery of APIs from the legal side of the equation. ... The legal department will play an important role in governing APIs as they move from development to production, and there needs to be clear guidance for all developers regarding what is required. Similar to the technical and business elements of delivering services, the legal components need to be easy to access, understand, and apply, but must also protect the interests of everyone involved with the delivery and operation of enterprise services.


Data Analytics Is The New Co-Pilot For Every CIO

Measurements are important – you won’t be motivated to improve what you’re not measuring. Organizations must always understand their security posture in order to know how quickly they can react to potential breach events. ... Compliance has become a much bigger deal over the last few years. Where organizations were once concerned about doing only as much as required to tick the box, they are now concerned about doing as much as they practically can to align to both the letter and spirit of the regulations. ... Rely on the experts. There are software packages available that bring you coverage for multiple compliance regimes in a single application. It makes sense to leverage these pre-built tools, as most compliance regime requirements are similar and processing data multiple times to attain related outcomes is not efficient. ... Leverage analytics. As I stated above, ticking the box is no longer enough. You have to put your best efforts into solving for the spirit of the regulation, and analytics can help you improve your ability to identify the conditions and incidents that the regulations are really aiming at.


Here’s how to make AI inclusive


Governments around the world should prioritize preparing citizens for the proliferation of AI. Leaders should create a game plan that addresses the trajectory of job loss by asking difficult questions, including: should we slow down the evolution of technology to buy time to reskill the workforce? We know that certain jobs are going to become extinct. But can we minimize the impact by mapping out the “glide path” and helping prepare workers for new jobs before their current ones reach their natural conclusion? Think about AI like a car. There are two ways that the driver can reach a speed of 200 miles per hour. First, the driver can slam on the accelerator and go from 0 to 200 in a matter of seconds. Or, the driver can control the acceleration by gradually applying pressure and monitoring the speedometer. The second scenario is much safer, of course. The same is true when it comes to monitoring the acceleration of AI. Governments should not try to stop its progression.


7 Ways IoT Is Changing Retail in 2018

With IoT, you can set up sensors around the store that send loyalty discounts to certain customers’ smartphones when they stand near products, provided those customers have signed up for a loyalty program in advance. Additionally, you can use IoT to track items a customer has been looking at online, and send that customer a personalized discount when she’s in-store. Imagine if your customer perused your purses online and then, in-store, received a discount on her favorite purse. Rather than offering general discounts on a wide variety of products, you can tailor each discount using IoT to maximize your conversion rates. Ultimately, finding ways to incorporate IoT devices into your day-to-day business requires creativity and foresight, but the benefits of IoT in retail -- as outlined above -- can help your business discover innovative solutions to attract more valuable and loyal long-term customers.


The rise of autonomous systems will change the world

“It’s tough to make predictions, especially about the future,” as physicist Niels Bohr is said to have quipped. One thing, though, will surely change the world as we know it today, and that is the rise of autonomous systems. I would expect major progress in the synthesis of symbolic logic for (explicit) knowledge representation in combination with (implicit) deep neural networks. This development will lead to autonomous systems that learn while interacting with their environment, that are able to generalize, to draw deductions, and to adapt to new, previously unknown situations. ... One of my favourite application areas is exploratory search, i.e. searching where you don’t know exactly where the search process might lead you. Sometimes you might not be able to explicitly phrase your search intention, perhaps because you lack the vocabulary or are not an expert in the domain in which you are looking for information. Then you first have to gather information about your domain before you are able to perform pinpoint retrieval.


AI and Jobs: What’s The Net Effect?


Among workers mostly engaged in non-routine tasks, one in five will soon be using AI to some extent, according to Gartner’s research. However, those excited about using AI systems at work may need to temper their excitement. AI-powered systems will likely help with mundane tasks such as creating a work schedule, or systems might be able to prioritize emails so employees can focus on the most important tasks. However, these systems are likely to evolve into virtual secretaries over time, and those using these systems will find them to be valuable time-saving tools covering an ever-increasing range of tasks. AI seems to perpetually be viewed as a future technology. Research from O’Reilly, however, shows that the foundation for AI-empowered companies already exists. A total of 28 percent of the leaders polled in the early 2018 survey report already using deep learning, which is viewed as perhaps the most important AI technology for typical businesses. Furthermore, 54 percent of respondents plan on using deep learning for future projects.


Understanding Software System Behaviour With ML and Time Series Data


Because they own the memory of the simulation, they have access to the complete state of the system. Theoretically, this means it is possible to analyse the data to try to reverse-engineer what is going on at a higher level of understanding, just from looking at the underlying data. Although this tactic can provide small insights, it is unlikely that only looking at the data will allow you to completely understand the higher level of Donkey Kong. This analogy becomes really important when you are using only raw data to understand complex, dynamic, multiscale systems. Aggregating the raw data into a time-series view makes the problem more approachable. A good resource for this is the book Site Reliability Engineering, which can be read online for free. Understanding complex, dynamic, multiscale systems is especially important for an engineer who is on call. When a system goes down, he or she has to go in to discover what the system is actually doing. For this, the engineer needs both the raw data and the means to visualise it, as well as higher-level metrics that are able to summarise the data.
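As a concrete sketch of what "aggregating raw data into a time-series view" can look like, here is a minimal Python example. The event log and field names are invented for illustration: raw per-request latency events are bucketed into per-minute count and max metrics.

```python
from collections import defaultdict
from datetime import datetime

# hypothetical raw events from a low-level log: (timestamp, latency_ms)
raw_events = [
    ("2018-07-28T10:00:03", 120),
    ("2018-07-28T10:00:41", 95),
    ("2018-07-28T10:01:07", 310),
    ("2018-07-28T10:01:52", 88),
    ("2018-07-28T10:01:59", 102),
]

# aggregate into a per-minute time series: request count and max latency
buckets = defaultdict(list)
for ts, latency in raw_events:
    minute = datetime.fromisoformat(ts).replace(second=0)
    buckets[minute].append(latency)

for minute in sorted(buckets):
    vals = buckets[minute]
    print(minute.isoformat(), "count:", len(vals), "max_ms:", max(vals))
```

The on-call engineer scans the per-minute view first (the 310 ms spike at 10:01 stands out), then drills back into the raw events only for the suspicious window.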


The top security and risk management trends facing organizations

“Customer data is the lifeblood of ever-expanding digital business services. Incidents such as the recent Cambridge Analytica scandal or the Equifax breach illustrate the extreme business risks inherent to handling this data,” Gartner noted. “Moreover, the regulatory and legal environment is getting ever more complex, with Europe's GDPR the latest example. At the same time, the potential penalties for failing to protect data properly have increased exponentially.” In the U.S., the number of organizations that suffered data breaches due to hacking increased from under 100 in 2008 to over 600 in 2016. "It's no surprise that, as the value of data has increased, the number of breaches has risen too," said Firstbrook. "In this new reality, full data management programs — not just compliance — are essential, as is fully understanding the potential liabilities involved in handling data."



Quote for the day:

"Be a solution provider and not a part of the problem to be solved" -- Fela Durotoye

Daily Tech Digest - July 28, 2018


Trading has changed significantly with the introduction of computers. In the coming future, blockchain technology will not only exclude intermediaries but also make the stock exchange decentralized, without a need for a central system to bring supply and demand together. Since a blockchain is shared by all participants, it is easy to prevent double-spending and verify who owns tokens at some particular point in time. Blockchains can be implemented in this sector by using a digital currency like Bitcoin that can be stored and carried in the form of cryptographic tokens. For instance, since Bitcoin uses a peer-to-peer network to broadcast information about any transactions taking place, those transactions can be added to blocks that are cryptographically secured, forming an immutable blockchain. Also, tokens can be tracked and ‘colored’ to distinguish them, and can be associated with the ownership of certain assets like stocks, bonds etc. In this way, many different assets can be transferred using the Bitcoin blockchain, but there are also other cryptocurrency networks that are authorized for exchanging multiple assets, such as Ripple.



Data-Driven? Think again

When the analysis is complex or the data are hard to process, a pinch of tragedy finds its way into our comedy. Sometimes boiling everything down to arrive at that 4.2 number takes months of toil by a horde of data scientists and engineers. At the end of a grueling journey, the data science team triumphantly presents the result: it’s 4.2 out of 5! The math was done meticulously. The team worked nights and weekends to get it in on time. What do the stakeholders do with it? Yup, same as our previous 4.2: look at it through their confirmation bias goggles, with no effect on real-world actions. It doesn’t even matter that it’s accurate—nothing would be different if all those poor data scientists just made some numbers up. Using data like that to feel better about actions we’re going to take anyway is an expensive (and wasteful) hobby. Data scientist friends, if your organization suffers from this kind of decision-maker, then I suggest sticking to the most lightweight and simple analyses to save time and money. Until the decision-makers are better trained, your showy mathematical jiu jitsu is producing nothing but dissipated heat.


How Big Data Can Play An Essential Role In Fintech Evolution

In the banking and fintech industry, as in many others, offering personalised services is one of the greatest marketing tools available. Fintech companies like Contis Group claim that more and more of their customers search for personalised and flexible fintech services and packages. The pressure to create personalised services in the industry is also driven by the increasing number of companies that adopt such strategies, creating keen competition. Alternative banking institutions began to use the services of fintech companies to improve their services and offer more personalised packages, but also a better, more comprehensive, faster infrastructure, which contributes to creating a more personalised and facile experience for the final consumer. Not only can fintech companies identify spending patterns to make banking recommendations, but they can also use those patterns to help the final user save more money, if this is one of their goals.
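A toy sketch of the kind of spending-pattern analysis described here. All transaction data, category names and budget figures are invented, and no real fintech API is shown; the point is just the shape of the logic: aggregate by category, then compare against the user's own goals.

```python
from collections import Counter

# hypothetical month of card transactions: (category, amount)
transactions = [
    ("dining", 42.0), ("dining", 18.5), ("groceries", 96.2),
    ("dining", 55.0), ("transport", 12.0), ("groceries", 71.4),
]

# identify spending patterns: total spend per category
totals = Counter()
for category, amount in transactions:
    totals[category] += amount

# personalised nudge: flag categories that exceed the user's own budget
budgets = {"dining": 80.0, "groceries": 200.0, "transport": 50.0}
for category, spent in totals.most_common():
    if spent > budgets.get(category, float("inf")):
        print(f"You spent {spent:.2f} on {category}, "
              f"over your {budgets[category]:.2f} budget")
```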


DevOps for Data Scientists: Taming the Unicorn

Developers have their own chain of command (i.e. project managers) who want to get features out for their products as soon as possible. For data scientists, this would mean changing model structure and variables. They couldn’t care less what happens to the machinery. Smoke coming out of a data center? As long as they get their data to finish the end product, they couldn’t care less. On the other end of the spectrum is IT. Their job is to ensure that all the servers, networks and pretty firewall rules are maintained. Cybersecurity is also a huge concern for them. They couldn’t care less about the company’s clients, as long as the machines are working perfectly. DevOps is the middleman between developers and IT. ... Imagine pushing your code to production. And it works! Perfect. No complaints. Time goes on and you keep adding new features and keep developing it. However, one of these features introduces a bug to your code that badly messes up your production application. You were hoping one of your many unit tests might have caught it.


The Democratization of Data Science

Once an organization is delivering the access and education needed to democratize data among its employees, it may be time to adjust roles and responsibilities. At a minimum, teams should be able to access and understand the data sets most relevant to their own functions. But by equipping more team members with basic coding skills, organizations can also expect non–data science teams to apply this knowledge to departmental problem solving — leading to greatly improved outcomes. If your workforce is data-literate, for example, your centralized data team can shift its focus from “doing everyone else’s data work” to “building the tools that enable everyone to do their data work faster.” Our own data team doesn’t run analyses every day. Instead, it builds new tools that everyone can use so that 50 projects can move forward as quickly as one project moved before.


Success With AI in Banking Hinges on the Human Factor

The reason banking operations aren’t relying on AI isn’t an unwillingness to adapt to change. Rather, the industry lacks the right talent to drive that change. There is a significant disconnect between the recognition of a need and an appropriate response. The Accenture research found that while executives believe that most of their employees are not ready to work with AI, only 3% of executives are planning to increase investments in retraining workers in the next three years. This is unfortunate, since employees indicate that they are not only impatient to thrive in an intelligent enterprise that can disrupt markets and improve their working experience; they are also eager to acquire the new skills required to make this happen. “Banks’ lack of commitment to upskilling and reskilling employees to learn how to collaborate with intelligent technologies will significantly hinder their ability to deploy and benefit from them,” McIntyre explained.


The Connected Vehicle Environment is expected to deliver situational awareness for traffic management and operations based on data from connected-vehicle equipment installed in vehicles and on a select group of roadways and intersections where the technology can reduce the number of accidents. It will also support truck platooning, which involves electronically linking groups of trucks to drive close to one another and accelerate or brake simultaneously. The city will install 113 roadside units that will contain some or all of the following: a traffic signal controller, a Global Navigation Satellite System (GNSS) receiver to pinpoint locations, a wireless dedicated short-range communications (DSRC) radio and a message processing unit. Meanwhile, 1,800 onboard units will be installed in city fleet vehicles and volunteer citizen vehicles that will communicate with the roadside units and one another. The units will contain a GNSS receiver, a vehicle data bus, a DSRC radio, a processing unit, a power management system, software applications and a display.


IoT and data governance – is more necessarily better?


Organizations have realized that data is a strategic asset and a lot of them are trying to commoditize it. In the case of IoT, not all data is created equal. Simply hoarding data because it may be useful one day may create a much higher risk than making decisions about data that make sense for a specific organization. In the case of IoT, this has become a huge challenge because smart devices can gather unimaginable amounts of data. However, the fact that they can doesn’t mean that they should. I will not get into the details of risks around cybersecurity because that has been debated ad nauseam. I am interested in discussing the other side of the coin: business opportunities. What does having a clear strategy for the collection and use of data gathered from IoT devices mean in terms of revenue and profitability? How can data governance help achieve that goal? Data governance is the framework under which data is managed within an organization to ensure the appropriate collection (the “what to use”), processing (the “how to use”), retention (the “until when to use”) and relevance (the “why to use”) of data.
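The four questions in this definition of data governance can be captured as a simple record per data stream. A hypothetical sketch (the stream name and field values are invented, and this is a thinking aid rather than any standard governance schema):

```python
from dataclasses import dataclass

@dataclass
class DataGovernancePolicy:
    # one record per IoT data stream; fields mirror the four questions
    stream: str
    collection: str      # the "what to use"
    processing: str      # the "how to use"
    retention_days: int  # the "until when to use"
    purpose: str         # the "why to use"

policy = DataGovernancePolicy(
    stream="thermostat-telemetry",
    collection="temperature and setpoint only, no occupancy audio",
    processing="aggregated hourly, anonymized before analytics",
    retention_days=90,
    purpose="energy-usage product recommendations",
)
print(policy)
```

A stream that cannot fill in the purpose field is a candidate for not being collected at all, which is exactly the "can doesn't mean should" argument above.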



Raspberry Pi gets supercharged for AI by Google's Edge TPU Accelerator

Machine-learning models will still need to be trained using powerful machines or cloud-based infrastructure, but the Edge TPU will accelerate the rate at which these trained models can run and be used to infer information from data, for example, to spot a specific make of car in a video or to perform speech recognition. While AI-related tasks like image recognition used to be run in the cloud, Google is pushing for machine-learning models to also be run locally on low-power devices such as the Pi. In recent years Google has released both vision and voice-recognition kits for single-board computers under its AIY Projects program. Trained machine-learning models available to run on these kits include face/dog/cat/human detectors and a general-purpose image classifier. Google is also releasing a standalone board that includes the Edge TPU co-processor and that bears a close resemblance to the Raspberry Pi. The credit-card-sized Edge TPU Dev Board is actually smaller than the Pi, measuring 40x48mm, but, like the Pi, it packs a 40-pin expansion header that can be used to wire it up to homemade electronics.


Data Analytics or Data Visualizations? Why You Need Both

Depending upon the level of detail that stakeholders need to draw actionable conclusions, as well as the need to interact with or drill down into the data, traditional data analytics might not be sufficient for businesses to excel in today’s competitive marketplace. Additional tools are needed to help extract more timely, more nuanced, and more interactive insights than data analysis alone can provide. Those tools are data visualization tools. The reason data analytics is limited might be simple enough. Data analytics helps businesses understand the data they have collected. More precisely, it helps them become cognizant of the performance metrics within the collected data that are most impactful to the business. And it can provide a clearer picture of the business conditions that are of greatest concern to decision-makers. But analytics does not do what data visualization can do: help to communicate and explain that picture with precision and brevity, in a format that the brain consumes exceedingly quickly. The data itself isn’t changed by data viz; no further analysis is done. But two-dimensional tables of data are not very amenable to learning; the mind tends to gloss over a large amount of it, scan for the highest and lowest values, and miss the details in between.
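The point about tables versus visuals can be demonstrated even without a charting library. In this toy sketch (the data and function are invented for illustration), a spike that is easy to gloss over in a column of numbers jumps out immediately once rendered as bars:

```python
def ascii_bars(series, width=40):
    """Render a dict of label -> value as horizontal ASCII bars."""
    peak = max(series.values())
    lines = []
    for label, value in series.items():
        bar = "#" * max(1, round(width * value / peak))
        lines.append(f"{label:>10} | {bar} {value}")
    return "\n".join(lines)

# same data, two presentations: the Thursday anomaly is obvious below
weekly_orders = {"Mon": 120, "Tue": 135, "Wed": 128, "Thu": 510, "Fri": 140}
print(ascii_bars(weekly_orders))
```

The visualization changes nothing about the data; it only changes how quickly the brain finds the outlier.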



Quote for the day:


"Tomorrow's leaders will not lead dictating from the front, nor pushing from the back. They will lead from the centre - from the heart" -- Rasheed Ogunlaru


Daily Tech Digest - July 27, 2018

Mastering Spring framework 5, Part 1: Spring MVC

Spring MVC is the Spring framework's traditional library for building Java web applications. It is one of the most popular web frameworks for building fully functional Java web applications and RESTful web services. In this tutorial, you'll get an overview of Spring MVC and learn how to build Java web applications using Spring Boot, Spring Initializr, and Thymeleaf. We'll fast-track our Spring MVC web application with the help of Spring Boot and Spring Initializr. Given input for the type of application to be built, Spring Initializr uses the most common dependencies and defaults to set up and configure a basic Spring Boot application. You can also add custom dependencies, and Spring Initializr will include and manage them, ensuring version compatibility with both third-party software and Spring. Spring Boot applications run standalone, without requiring you to provide a runtime environment. In this case, since we're building a web application, Spring Boot will automatically include and configure Tomcat as part of the app's runtime. We can also customize the app by adding an H2 database driver to our Maven POM file.



5 Keys to Creating a Data Driven Culture

With businesses reconstructing their entire model to accommodate the need for digital change, one has to wonder what is causing this disruption. The need for digital change starts with data. Data has become the need of the hour, and to manage and extract it properly, organizations need to go where the customers are: digital. Data is being generated by customers in the digital world, and organizations are willing to incorporate this digital change in a bid to get hold of this data. IoT devices and smartphones are playing an important role in data generation, curating data important to all organizations. Customers are not the only ones generating this data. From smart city technologies such as connected cars, trains, and video surveillance, to businesses themselves, data is generated at a meteoric rate. The digital interactions that every business has with its customers are one of the major sources of data, and businesses often ponder how they could use these data sources to reach meaningful insights that help them in real time.


New NetSpectre Attack Can Steal CPU Secrets via Network Connections

Although the attack is innovative, NetSpectre also has its downsides (or upsides, depending on which side of the academics/users barricade you are on). The biggest is the attack's woefully slow exfiltration speed, which is 15 bits/hour for attacks carried out via a network connection and targeting data stored in the CPU's cache. Academics achieved higher exfiltration speeds —of up to 60 bits/hour— with a variation of NetSpectre that targeted data processed via a CPU's AVX2 module, specific to Intel CPUs. Nonetheless, both NetSpectre variations are too slow to be considered valuable to an attacker. This makes NetSpectre just a theoretical threat, and not something that users and companies should be planning for with immediate urgency. But as we've seen in the past with Rowhammer attacks, as academics spend more time probing a topic, exfiltration speeds will eventually go up, while the technical limitations that prevent such attacks from working will slowly go down and dissipate.
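The reported rates make the "too slow to be practical" claim easy to quantify. A quick back-of-the-envelope calculation (the rates are the ones in the article; the choice of target key sizes is mine, for illustration):

```python
def hours_to_leak(secret_bits, rate_bits_per_hour):
    """How long a NetSpectre-style leak would take at a given rate."""
    return secret_bits / rate_bits_per_hour

# 15 bits/hour is the cache variant, 60 bits/hour the AVX2 variant
for name, bits in [("AES-128 key", 128), ("RSA-2048 key", 2048)]:
    for rate in (15, 60):
        h = hours_to_leak(bits, rate)
        print(f"{name} at {rate} bits/hour: {h:.1f} hours ({h / 24:.1f} days)")
```

Even a small 128-bit secret takes hours of sustained, undetected measurement, and a 2048-bit key takes days, which is why the article files NetSpectre under theoretical threats for now.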


Embracing RPA - Opportunities and Challenges for Accountancy Profession

Once IT and security teams are satisfied that the IT architecture addresses their risks, the process is documented in detail and can be carried forward for implementation. Key sectors where RPA is playing a significant role in bringing in process efficiencies include highly regulated verticals such as healthcare, banking, financial services and insurance. Other major sectors include telecommunications, utilities, mining, travel and retail. ... Business users of the organisation review the work of the robots, resolve any exceptions, and escalate to the identified stakeholders for resolution if required. In the long run, the bots can become self-learning, taking RPA to the level of decision-making. RPA is believed to be set to revolutionise and redefine the way we work, making our processes smarter and quicker. RPA deployments have commenced in most large businesses, will continue to grow, and are expected to become cognitive within the next five years. Further, many predict that RPA will develop into a machine-learning platform, probably by 2025-2026.


Containers Provide the Key to Simpler, Scalable, More Reliable App Development


Kubernetes originally came out of Google, and it’s basically an orchestration layer around containers. For example, if I’m writing a containerized application, I can run it on top of Kubernetes, and Kubernetes will handle a lot of the underlying infrastructure orchestration—specifically, things like scaling up to meet demand or scaling down when demand is light. If servers crash, it will spin up more. The application developer simply says, “Hey, here are my containers. This is what they look like. Run them,” and then Kubernetes manages and orchestrates all of the underlying capacity. Kubernetes works whether you’re developing an application for three people or a global enterprise. What you’re doing is applying good architectural structure around a large-scale application whether you need it or not. So, you’re getting inherent reliability and scaling abilities along with capabilities to address and handle failures. For example, let's say I deploy a cluster within an on-prem or cloud infrastructure region and it is spread across three different physical availability domains.
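As an illustration of "here are my containers, run them", a hypothetical Kubernetes Deployment manifest looks like the following (the name and image are placeholders, not from the article). The developer declares the desired state, and Kubernetes keeps reality matching it: scaling is one field, and crashed pods are rescheduled automatically.

```yaml
# Hypothetical Deployment: declare the containers and the desired count;
# Kubernetes handles placement, restarts and scaling.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app                # placeholder name
spec:
  replicas: 3                 # scale up or down by changing this one field
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: example.com/my-app:1.0   # placeholder image
        ports:
        - containerPort: 8080
```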


CCTV and the GDPR – an overview for small businesses

The GDPR requires data controllers and processors to implement “appropriate technical and organisational measures” to protect personal data. This entails an approach based on regular assessments to ensure that all risks are appropriately addressed. For instance, access to CCTV systems must be limited to authorised personnel, which is especially important where systems are connected to the Internet or footage is stored in the Cloud, and there is a greater risk of unauthorised access. Surveillance systems should also incorporate privacy-by-design features, including the ability to be switched on or off, and the option to switch off image or sound recordings independently where it would be excessive to capture both. CCTV equipment must also be of a sufficient quality and standard to achieve its stated purpose. The international standard for information security management, ISO 27001, is an excellent starting point for implementing the technical and organisational measures necessary under the GDPR.


Why a product team’s experimentation goes wrong

The only thing worse than not running experiments is running experiments that are misinterpreted. There are several ways in which companies misunderstand the statistics behind experiments. Firstly, companies are overly reactive to early returns. Early on during experiments there are few conversions and experiment results swing wildly. When teams “peek early” at results, they frequently overvalue the data and end experiments prematurely. It is very common for the direction of a metric to swing over the course of an experiment, and teams that do not have the patience to wait are at the mercy of random chance. Secondly, stakeholders often create arbitrary pressures and deadlines to get answers early. In many business processes, management can improve productivity by introducing pressure and deadlines for teams. However, in the realm of science, this behaviour causes the opposite of the intended effect. Ordering teams to give results by a certain date often causes teams to interpret insignificant data through gut feel. While these decisions can feel scientific to executives, they are all too often incorrect and give a false certainty about the wrong direction.
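The peeking problem can be made concrete with a small simulation. This is an illustrative sketch (the function names and parameters are my own, not from the article): an A/A test where both variants are identical, so every "significant" result is a false positive. A team that checks a z-test at 20 interim looks declares a spurious winner far more often than the nominal 5% of a single fixed-horizon test.

```python
import math
import random

def z_stat(c_a, n_a, c_b, n_b):
    """Two-proportion z statistic for conversion counts."""
    p = (c_a + c_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return 0.0 if se == 0 else (c_b / n_b - c_a / n_a) / se

def run_aa_test(n=2000, looks=20, p=0.05):
    """A/A test: both arms convert at the same rate p, so any
    'significant' result is a false positive. Returns whether a peeking
    team and a patient team would each declare significance."""
    c_a = c_b = 0
    step = n // looks
    peeked = False
    for i in range(1, n + 1):
        c_a += random.random() < p
        c_b += random.random() < p
        if i % step == 0 and abs(z_stat(c_a, i, c_b, i)) > 1.96:
            peeked = True  # an early look would have "found" an effect
    final = abs(z_stat(c_a, n, c_b, n)) > 1.96
    return peeked, final

random.seed(42)
trials = 400
peek_hits = final_hits = 0
for _ in range(trials):
    peeked, final = run_aa_test()
    peek_hits += peeked
    final_hits += final

print(f"false positives when peeking at every look: {peek_hits / trials:.1%}")
print(f"false positives at the fixed horizon only: {final_hits / trials:.1%}")
```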


With Today’s Technology, Do You Really Need A Financial Advisor?

Finding an asset to invest in is one thing, but understanding how to implement it is another. Before investing in any funds, it is very important to study the historical data of the asset class. Sure, most of the time, past performance doesn’t necessarily correlate with future performance, but it is reasonable to think that some historical risk-reward relationships are likely to persist (i.e., long-term, stocks could be expected to outperform bonds, but with a higher degree of volatility). The financial advisor will look at all these and present you with an implementation plan that is likely to benefit you the most. While choosing assets to invest in, another aspect that clients usually overlook is taxes. If the future returns on an asset turn out to be average while the taxes on them are high, then the overall return for an investor would be negatively affected. This is why tax management is important, as tax-conscious financial planning and tax-efficient portfolio construction can lead to higher returns.
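The drag that taxes put on otherwise identical returns is easy to quantify. A simplified sketch (the numbers are invented, and taxing each year's gain in full is a deliberate simplification of real tax treatment):

```python
def after_tax_growth(principal, annual_return, tax_rate, years):
    """Compound a principal, taxing each year's gain at tax_rate.
    Hypothetical yearly-taxed account, for illustration only."""
    for _ in range(years):
        gain = principal * annual_return
        principal += gain * (1 - tax_rate)
    return principal

# same 6% gross return over 20 years, different tax treatment
low_tax = after_tax_growth(10_000, 0.06, 0.15, 20)
high_tax = after_tax_growth(10_000, 0.06, 0.40, 20)
print(f"15% tax: {low_tax:,.0f}   40% tax: {high_tax:,.0f}")
```

Because the tax bite compounds too, the gap between the two accounts widens every year, which is the case the article makes for tax-efficient portfolio construction.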


This company changes the DNA of investing — through machine learning

Simply put, a computer can be taught what ‘successful trading’ looks like, and combine such information from various users to build an investment portfolio that draws from their cumulative wisdom. It is no wonder, then, that financial giants such as JPMorgan Chase and Goldman Sachs are openly utilizing machine learning for their investing practices. After all, they have the resources and the data to make it work. However, this power is not reserved for these giant corporations. There are instances in which machine learning can benefit the ‘little guy’ as well. eToro’s declared mission is to disrupt the traditional financial industry and break down the barriers between private investors and professional-level practices. One such instance can be seen in eToro’s CopyFunds Investment Strategies, which are managed thematic portfolios, powered by advanced machine learning algorithms. This means private individuals now have access to technology previously reserved for giant corporations.


The Commercial HPC Storage Checklist – Item 2 – Start Small, Scale Large

As the HPC project moves into full-scale production, the organization then faces the opposite problem: making sure the system can scale large enough to continue to meet the capacity demands of the project. Scaling out means meeting several challenges. First, the system has to integrate new nodes into the cluster successfully, since additional nodes provide the needed capacity and performance. However, adding another node is not always as straightforward as it should be. Many systems require adding the node manually, as well as manually rebalancing data from other nodes to the new node. The commercial HPC storage customer should look for an HPC storage system that can grow with them as their needs evolve. It should start small during the initial phases of development but scale large as the environment moves into production. The system should make the process of adding nodes as simple as possible: automatically discovering available nodes, adding them to the cluster, and rebalancing cluster data without impacting storage performance.
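The scale-out behavior the checklist calls for can be illustrated with a toy rebalancer; the function, node labels and object names here are invented for illustration, and real systems rebalance at the block or object level while throttling copy traffic so production I/O is not impacted:

```python
# Toy illustration of automatic rebalancing when a new storage node joins.
def rebalance(nodes):
    """Redistribute objects so every node holds a near-equal share."""
    objects = sorted(obj for held in nodes.values() for obj in held)
    result = {name: [] for name in nodes}
    names = sorted(result)
    for i, obj in enumerate(objects):
        result[names[i % len(names)]].append(obj)  # round-robin placement
    return result

cluster = {"node1": ["a", "b", "c", "d"], "node2": ["e", "f", "g", "h"]}
cluster["node3"] = []          # new node discovered and added to the cluster
cluster = rebalance(cluster)
sizes = sorted(len(v) for v in cluster.values())
# No node now holds more than one object over any other.
```

The same eight objects end up spread 3/3/2 across the three nodes, which is the even distribution a production system would converge toward in the background.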



Quote for the day:


"Ever tried. Ever failed. No matter. Try again. Fail again. Fail better." -- Samuel Beckett


Daily Tech Digest - July 25, 2018

Are Initial Coin Offerings leaking money?
The bottom line is that ICOs are being constructed with serious holes in them. Worse still, as the numbers from EY show, cyber criminals are taking advantage. Companies running ICOs are drawing huge sums of money in a very narrow window of time. If something goes wrong once the ICO is live, there is little room for manoeuvre and precious little legal recourse that can realistically be taken. These are the perfect conditions for cyber criminals to exploit. The financial motivation is high, and attackers have been drawn to ICOs like sharks to churn in the water. The consequence of an attack? There are two parties that could be affected: the ICO organisers and the investors. Just one vulnerability is enough for attackers to steal investors’ money and do irreparable damage to the corporate reputation of the ICO organiser. The need to patch these holes is apparent, but organisations are working on short time frames and might not realise where they are most vulnerable. So what are the main points of weakness?



Rolls-Royce Is Building Cockroach-Like Robots to Fix Plane Engines


Rolls-Royce believes these tiny insect-inspired robots will save engineers time by serving as their eyes and hands within the tight confines of an airplane’s engine. According to a report by The Next Web, the company plans to mount a camera on each bot to allow engineers to see what’s going on inside an engine without having to take it apart. Rolls-Royce thinks it could even train its cockroach-like robots to complete repairs. “They could go off scuttling around reaching all different parts of the combustion chamber,” Rolls-Royce technology specialist James Cell said at the airshow, according to CNBC. “If we did it conventionally it would take us five hours; with these little robots, who knows, it might take five minutes.” Rolls-Royce has already created prototypes of the little bot with the help of robotics experts from Harvard University and the University of Nottingham. But they are still too large for the company’s intended use. The goal is to scale the roach-like robots down to stand about half an inch tall and weigh just a few ounces, which a Rolls-Royce representative told TNW should be possible within the next couple of years.


While tight integration is desirable, these systems have many of the same networking challenges as other data center deployments, including requirements for scalability, automation, security and management of traffic flows. Additionally, they need to link to other data center resources inside the data center, at remote data centers and in the cloud. Software-defined networking architecture can ease some of the scaling, automation, security and connectivity challenges of hyper-converged system deployments. Hyper-converged systems integrate storage, computing and networking into a single system -- a box or pod -- in order to reduce data center complexity and ease deployment challenges associated with traditional data center architectures. A hyper-converged system comprises a hypervisor, software-defined storage and internal networking, all of which are managed as a single entity. Multiple pods can be networked together to create pools of shared compute and storage.


Big Tech is Throwing Money and Talent at Robots for the Home

Whether or not the robots catch on with consumers right away is almost beside the point because they’ll give these deep-pocketed companies bragging rights and a leg up in the race to build truly useful automatons. “Robots are the next big thing,” said Gene Munster, co-founder of Loup Ventures, who expects the U.S. market for home robots to quadruple to more than $4 billion by 2025. “You know it will be a big deal because the companies with the biggest balance sheets are entering the game.” Many companies have attempted to build domestic robots before. Nolan Bushnell, a co-founder of Atari, introduced the 3-foot-tall, snowman-shaped Topo Robot back in 1983. Though it could be programmed to move around by an Apple II computer, it did little else and sold poorly. Subsequent efforts to produce useful robotic assistants in the U.S., Japan and China have performed only marginally better. iRobot Corp.’s Roomba is the most successful, having sold more than 20 million units since 2002, but it only does one thing: vacuum.



“Enterprise Architecture As A Service” - How To Reach For The Stars


Educate those implementing your value chain in best open practices. To deliver EA As A Service, one would do well to ensure services are delivered through best practices that are open, because this enables an organization to train easily, hire selectively, and produce consistently. Of course, one might ask about differentiation – the secret sauce for differentiation will be in your proven ability to deliver fast and on target! Apply the best-in-class tools proven to improve production capability. Similar to the above, deciding upon and utilizing a consistent set of best-in-class tools helps ensure that deliverables are consistent among clients and enables reuse, which can improve speed and quality of delivery. Tools that support the best open practices add even more. Collaborate with partners to evolve the best open practices. Keeping in mind that differentiation comes in how well you deliver EA As A Service, collaboration on the best open practices provides an avenue to improve the best practices based on real experiences, improves market perception, and helps keep the bar raised for the industry.



Micropsia Malware


Controlled by Micropsia operators, the malware is able to register for an event of USB volume insertion to detect newly connected USB flash drives. This functionality is detailed in an old blog post. Once an event is triggered, Micropsia executes a RAR tool to recursively archive files based on a predefined list of file extensions ... Most of the malware capabilities mentioned above have outputs written to the file system which are later uploaded to the C2 server. Each module writes its own output in a different format, but surprisingly in a non-compressed and non-encrypted fashion. Micropsia’s developers decided to solve these issues by implementing an archiver component that executes the WinRAR tool. The malware first looks for an already-installed WinRAR tool on the victim’s machine, searching in specific locations. In the event a WinRAR tool is not found, Micropsia drops the RAR tool found in its Windows Portable Executable (PE) resource section to the file system.
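The archiver lookup described above (prefer an installed WinRAR, otherwise fall back to a bundled copy) is a common pattern in both malware and legitimate installers. A benign sketch of just the lookup, with hypothetical paths and none of the archiving or upload behavior:

```python
# Benign sketch of the "use an installed archiver, else fall back to a
# bundled copy" lookup. The paths are hypothetical examples, not the
# specific locations Micropsia probes.
import os

CANDIDATE_PATHS = [
    r"C:\Program Files\WinRAR\Rar.exe",
    r"C:\Program Files (x86)\WinRAR\Rar.exe",
]

def locate_archiver(candidates=CANDIDATE_PATHS, fallback="bundled_rar.exe"):
    """Return the first installed archiver found, else the bundled fallback."""
    for path in candidates:
        if os.path.isfile(path):
            return path
    # Micropsia would drop the bundled tool from its PE resource section
    # at this point; this sketch simply returns the fallback name.
    return fallback
```

From a detection standpoint, this is why defenders watch for RAR binaries written to disk by unrelated processes: the fallback branch leaves an artifact the "already installed" branch does not.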


FinTech’s road to financial wellness


It’s one thing to build up a pot of money (saving), but it’s also vital to make that money work hard for you (investing). Investment platforms like Moneybox and Nutmeg are giving everyday people the ability to make their money go further. Robo-advice in particular is making it considerably easier for consumers to invest their money in a way that matches their circumstances and attitude to risk. A key benefit of these start-ups is that they often have low minimum investment limits, which has led to younger generations and those with small savings pots being able to invest. ... A recent report found the insurance sector lags behind only the utilities sector when it comes to disappointing customers with a poor online customer experience. These bad experiences are putting consumers off dealing with insurance and insurers, meaning those consumers often aren’t financially protected. InsurTech companies like Lemonade, however, are using behavioural economics and new technology to create aligned incentives between the insurer and the customer.


Securing Our Interconnected Infrastructure

While it's encouraging that the House is leaning forward on industrial cybersecurity and committed to authorizing and equipping the Department of Homeland Security to protect our critical infrastructure, this still remains largely a private sector problem. After all, over 80% of America's critical infrastructure is privately owned and the owners and operators of these assets are best positioned to address their risks. In doing so, one of the questions companies are asking themselves is how to reconcile the risks and rewards of the interconnected world. Should we simply retreat into technological isolationism and eschew the benefits of connectivity in the interest of security, or is there a better way to manage the risk? The former is gaining a growing chorus, especially among security researchers. The latest call comes from Andy Bochman of the Department of Energy's Idaho National Labs. Bochman argued this past May in Harvard Business Review that the best way to address the cyber-risk to critical infrastructure is "to reduce, if not eliminate, the dependency of critical functions on digital technologies and their connections to the Internet."


The race to build the best blockchain


Things move incredibly fast in the blockchain world. Ethereum is three years old. Projects like Cardano and EOS, sometimes called "blockchain 2.0" projects, are already considered to be giants in the space. They have a combined token market cap of roughly $11.8 billion despite barely being operational. Cardano, which focuses on a slow and steady approach, with every iteration of the software being peer reviewed by scientists, is promising, but it hasn't fully launched its smart contract platform yet. EOS, an incredibly well-funded startup that launched in June, is another huge contender. However, EOS has a complicated governance process which caused a fair amount of trouble right after the launch, together with a slew of freshly discovered bugs. With an estimated $4 billion in pocket, EOS has the means to do big things, but it will take some time to see whether it can live up to the promise.  But there's already a new breed of blockchain startups coming. They've been working, often in the shadows, to develop new concepts and technologies that may make the promise of a fast, decentralized app platform a reality.


Serverless vs. containers: What's best for event-driven apps?


Event processing is very different from typical transaction processing. An event is a signal that something is happening, and it often requires only a simple response rather than complex edits and updates. Transactions are also fairly predictable, since they come from specific sources in modest numbers. Events, however, can originate anywhere, and the frequency of events can range from nothing at all to tens of thousands per second. These important differences between transactions and events launched the serverless trend and also precipitated the strategy called functional programming. Functional programming is pretty simple. A function -- or lambda, as it is often called -- is a software component whose outputs depend only on its inputs. If Y is a function of X, then Y varies only as X does. For practical reasons, functions don't internally store data that could change their outputs. Therefore, any copy of a function can process the same input and produce the same output. This facilitates highly resilient and scalable applications.
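The "output depends only on input" property is easy to demonstrate; a minimal sketch contrasting a pure function with an impure one that hides state (both functions are invented examples):

```python
# A pure function: output depends only on the input, so any copy running
# on any serverless node returns the same result for the same call.
def tax(amount, rate=0.2):
    return round(amount * rate, 2)

# An impure variant, for contrast: hidden state changes its output.
_calls = 0

def impure_tax(amount):
    global _calls
    _calls += 1                       # hidden state leaks into the result
    return round(amount * 0.2, 2) + _calls
```

Calling `tax(100)` twice yields the same value; calling `impure_tax(100)` twice does not, which is exactly why platforms can freely replicate and restart pure functions but not stateful ones.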



Quote for the day:


"Rarely have I seen a situation where doing less than the other guy is a good strategy." -- Jimmy Spithill


Daily Tech Digest - July 24, 2018

Rapid7 penetration tests reveal multitude of software flaws, network misconfigurations

People are simply too predictable when it comes to creating passwords, and that’s even if an organization enforces password length and complexity standards. For example, “Summer2018!” meets the objectives of a password that is required to have at least one uppercase letter, one lowercase letter, one number, and one special character. But Rapid7 noted that it is one of the worst passwords a person can choose. Seasonal passwords came in as the third most common type of password. ... What do organizations most care about protecting? Despite the almost-daily data breach announcements, Rapid7 found that organizations are more concerned with protecting their own sensitive data, such as internal communications and financial metrics, than protecting the sensitive data of their customers or employees. As for organizations’ top five priorities for protecting information, sensitive internal data is at the top at 21 percent, PII second at 20 percent, authentication credentials third at 14 percent, payment card data fourth at 7.8 percent, and bank account data fifth at 6.5 percent.
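Rapid7's point about "Summer2018!" is easy to reproduce: the password satisfies a typical naive complexity policy while matching the seasonal pattern penetration testers try first. A minimal sketch of such a check (the rules and season list are a generic example, not Rapid7's tooling):

```python
# A generic "one upper, one lower, one digit, one special, 8+ characters"
# complexity policy, alongside a predictability check it fails to catch.
import re

def meets_naive_policy(pw):
    return (len(pw) >= 8
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"\d", pw) is not None
            and re.search(r"[^A-Za-z0-9]", pw) is not None)

SEASONS = ("spring", "summer", "fall", "autumn", "winter")

def looks_seasonal(pw):
    """Flag the season+year pattern attackers guess first."""
    return any(pw.lower().startswith(s) for s in SEASONS)

# "Summer2018!" passes the complexity rule yet is trivially guessable.
```

This is why modern guidance favors banning known-predictable patterns and breached passwords over pure character-class rules.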



Three AI And Machine Learning Predictions For 2019


The U.S. Army is currently using machine learning to predict when combat vehicles need repair. Think about it: there are millions of pieces of equipment that our Army uses each and every day. To keep track of the data involved, they are recruiting the help of an AI assistant. For the first implementation, a few dozen armored infantry transports will receive sensors inside the vehicles’ engines. These sensors will record temperature and RPM and transmit them to the software. Machine learning capabilities will look for patterns in the data that match engine failures in similar vehicles. What if your car did this? AAA might become obsolete if your car could tell you that the transmission is about to crap out on you. If the Army is using the technology, I'm sure it won't be long till we see it in the civilian world. Automotive isn't the only industry seeing potential new uses for this tech; healthcare is about to see some changes too. As if Google wasn’t already on the AI map, they have begun to predict the likelihood of a patient’s death using machine learning – with a staggering 95% accuracy.
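A toy version of the kind of pattern check described, flagging engine readings that deviate from a healthy baseline, might look like the following; the baseline values, vehicle IDs and z-score approach are all invented for illustration (real systems learn failure signatures from labeled fleet data rather than a simple threshold):

```python
# Toy anomaly check: flag vehicles whose engine temperature deviates from
# a healthy baseline by more than k standard deviations.
from statistics import mean, stdev

def flag_anomalies(baseline, readings, k=3.0):
    mu, sigma = mean(baseline), stdev(baseline)
    return [vid for vid, temp in readings if abs(temp - mu) > k * sigma]

baseline = [90.0, 92.0, 91.0, 89.0, 90.5]     # healthy-engine temperatures
readings = [("v1", 90.5), ("v2", 160.0)]      # v2 is running far too hot
```

The value of the real system is in what replaces this threshold: models trained on sensor traces that preceded actual engine failures in similar vehicles.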



NVMe is a protocol for accessing high-speed storage media that’s designed to reduce latency and increase system and application performance. It's optimized for all-flash storage systems and is aimed at enterprise workloads that require low latency and top performance, such as real-time data analytics and high-performance relational databases. Storage vendors have been re-tooling their systems to support the faster interconnect protocol, and IBM is no exception. A key change in the FlashSystem 9100 is the use of small form factor NVMe drives. IBM redesigned its FlashCore technology to fit into a standard 2.5-inch SSD form factor with NVMe interfaces – a move that reduced the physical size of the drives by more than half. That redesign made an impression on Owen Morley, director of infrastructure at online dating platform Plenty Of Fish. Morley is among a group of users of IBM's all-flash storage who came together at an event in Mexico City to share their thoughts on the new 9100 system and the potential for NVMe-accelerated storage in their own enterprises.



Edge computing will be vital for even simple IoT devices

The evolution of wearables required each generation to monitor and collate a greater number of measurements (raw data). Developers found optimal ways of doing this by processing raw data locally (on the edge of the application, using the Bluetooth chips' increasingly powerful onboard processors) and then forwarding only the essential information (desired data) to a smartphone app and the cloud (for data sharing and tracking). The technology enabled continuous (low-latency) monitoring, and the modest Bluetooth wireless throughput was sufficient to update apps and cloud servers with the key tracking information without the extended on-air duration that would otherwise be needed to stream raw data. Sending only the key information also minimized the impact on the user's cellphone data allowance (data cost).

Things go wrong, hackers never quit

Because users didn't always carry their smartphones, wearables had to operate autonomously when not connected. Resiliency was built into the systems. They didn't depend on a continuous network or internet connection for successful operation (redundancy).
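The edge-processing pattern described, collapsing raw samples into a few derived fields before transmission, can be sketched as follows (the field names and sample values are illustrative, not any wearable's actual telemetry format):

```python
# Collapse a window of raw samples into the few derived fields worth
# transmitting off the device.
def summarize_window(raw_samples):
    return {
        "min": min(raw_samples),
        "max": max(raw_samples),
        "avg": round(sum(raw_samples) / len(raw_samples), 1),
        "n": len(raw_samples),
    }

window = [72.0, 75.0, 74.0, 73.0, 90.0, 76.0]   # raw heart-rate samples
summary = summarize_window(window)
# Six raw samples shrink to four fields before ever leaving the device.
```

At real sampling rates the savings compound: a window of hundreds of samples still reduces to the same handful of fields, keeping on-air time and cellular data cost low.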


Nation-State Spear Phishing Attacks Remain Alive and Well

The trouble with phishing is that it relies on social engineering - meaning it's designed to trick users - and it can potentially be used to compromise any online account. Unfortunately, we humans are both easy to trick - at least some of the time - as well as fallible. And attackers can pummel would-be victims with phishing attacks until one succeeds. The scale of the phishing challenge is reflected by the number of video interviews touching on phishing that I recently conducted at the London Infosecurity Europe conference. Experts described everything from the increasingly targeted nature of phishing attacks and the importance of never forgetting the human factor, to training users, using technology to extract data from emails and attachments, and tracking malicious domains to better block phishing campaigns. But as this patchwork of practices, procedures and technology demonstrates, there's no single fix for the phishing problem. Furthermore, with more of our business and personal lives now in the cloud, the impact of falling victim to a phishing attack continues to increase.



Privacy pros gaining control of technology decision-making over IT

“This global survey is critical in our efforts to better understand how privacy professionals are addressing compliance challenges and the technologies that are being deployed now and in the near future,” said Chris Babel, CEO of TrustArc. “Though security budgets remain larger, we’re seeing a marked shift in privacy teams’ influence over technology purchasing decisions. This trend confirms what we’re seeing among our customers – that they have a growing need for technology solutions to help them manage privacy compliance at scale on a global basis.” The EU GDPR and other global and domestic legal reforms, combined with technological advancements, have made the task of operationalizing privacy and data protection vastly more complicated. Businesses now must account for how data is entering the organization, how it is being used, what permissions are attached to it and who has the responsibility for managing it. To address these challenges, the demand for privacy technology continues to grow rapidly.


Measuring Tech Performance: You’re Probably Doing It Wrong


First, velocity is a relative and team-dependent measure, not an absolute one. Teams usually have significantly different contexts, which makes comparing velocities inappropriate. (Seriously, don’t do this.) Second, when velocity is used as a productivity measure, teams are very likely to game it: they inflate their estimates and focus on completing as many stories as possible at the expense of collaboration with other teams (which might decrease their velocity and increase the other team's velocity, making them look bad). Not only does this destroy the utility of velocity for its intended purpose, it also inhibits collaboration between teams. Velocity as a productivity metric violates our guidelines by focusing on local measures rather than global ones. This is particularly obvious in the second critique above: by (understandably) making choices to optimize their own velocity, teams will often not collaborate with other teams. This often results in scenarios where subpar solutions prevail across the organization because there isn't a focus on global measures.


How to spot bad data, and know the limitations when it's good

A 2016 survey of CEOs found 84 percent of them felt concerned about the quality of data they used while making decisions. And they have valid reasons for feeling wary — bad data could cause financial repercussions if business leaders put too much trust in material that’s ultimately lacking. It’s also crucial to consider the wasted time from bad data. When professionals engage in data-driven marketing, they may be relying on content filled with non-human influences such as bots or malware. If that happens, they could get false perceptions of customers’ journeys at websites or the factors that cause them to linger on certain pages versus others. There are reputational risks, too. If a company releases public research that later gets proven inaccurate, it’ll be difficult for that entity to encourage trust in future material. When business leaders blindly trust data — especially when making decisions — they inevitably set the stage for problems. Staying aware of the characteristics of bad data discussed here is an excellent first step in being proactive.


Law firms failing to meet their client’s digital expectations, according to study

Martin Flick, CEO of Olive Communications, said: “Today’s busy, always on and mobile first consumer wants to buy goods and services, and communicate with sellers whenever, wherever, and however they choose.” “Increasingly this is through digital interaction. When it comes to their lawyer or solicitor, they want to engage in the same way, without the frustration of having to wait days for paper documents to arrive in the post or for an email to come through with the answer to a question that could be easily resolved with an instant message or automated response.” “Consumers want more control over their legal affairs with sometimes, little or no human intervention, and with the speed, efficiency, and security that multiple channel web-based communications offer.” The study found that a significant portion of law firms are embracing new technology internally, for example, 69% are using IM and chat to communicate with each other. However, few of these firms are extending the use of technology externally to enhance the client experience.


Backup best practices: A NAS is not enough

The idea of 3-2-1 is to have three copies of every file, two of which are on different physical devices, and one of which is located off-site. Our guy didn't have that. He counted entirely on one NAS for all his backups. He had an offsite backup, but it hadn't been updated. The "off" part of my strategy is to have at least one full backup air-gapped from the Internet. I do this for my stuff by keeping one backup server shut down, except for a once-a-week quick incremental backup nibble ... The point of this article, though, is to remind you of the 3-2-1-off-and-away strategy and to not be dumb. A single NAS as your backup strategy is not enough. As a rule, I have two NAS boxes running all the time. One is my hot, live working environment. The other is an offline backup. In my case, I was fortunate that the ioSafe folks sent me their flood-and-fire-proof ioSafe 1515+, so my backup NAS isn't just a second NAS, it's an armored bomb-proof bunker of a backup NAS. At some point in the future, I'll take you through my whole storage architecture.
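A quick way to sanity-check the 3-2-1 rule is to enumerate where each backup copy lives; a toy checker with invented device labels:

```python
# Toy 3-2-1 checker: at least three copies, on at least two distinct
# devices, with at least one copy off-site.
def satisfies_321(copies):
    devices = {c["device"] for c in copies}
    offsite = any(c["offsite"] for c in copies)
    return len(copies) >= 3 and len(devices) >= 2 and offsite

# The reader's setup: everything on one NAS, stale off-site copy ignored.
single_nas = [{"device": "nas1", "offsite": False}]
good = [
    {"device": "workstation", "offsite": False},  # live working copy
    {"device": "nas1", "offsite": False},         # second local device
    {"device": "cloud", "offsite": True},         # off-site copy
]
```

The single-NAS setup fails all three tests at once, which is exactly the failure mode the article is warning about.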



Quote for the day:


"You may not control all the events that happen to you, but you can decide not to be reduced by them." -- Maya Angelou


Daily Tech Digest - July 23, 2018

Most of AI’s Business Uses Will Be in Two Areas


The business areas that traditionally provide the most value to companies tend to be the areas where AI can have the biggest impact. In retail organizations, for example, marketing and sales has often provided significant value. Our research shows that using AI on customer data to personalize promotions can lead to a 1-2% increase in incremental sales for brick-and-mortar retailers alone. In advanced manufacturing, by contrast, operations often drive the most value. Here, AI can enable forecasting based on underlying causal drivers of demand rather than prior outcomes, improving forecasting accuracy by 10-20%. This translates into a potential 5% reduction in inventory costs and revenue increases of 2-3%. While applications of AI cover a full range of functional areas, it is in fact in these two cross-cutting ones—supply-chain management/manufacturing and marketing and sales—where we believe AI can have the biggest impact, at least for now, in several industries. Combined, we estimate that these use cases make up more than two-thirds of the entire AI opportunity.



How SD-WAN Will Make The Cloud Much Much Bigger

The need to be connected to the mother ship is what brings the Cloud into its meaningful existence, because we live and work at the edges of the Cloud. SD-WAN is not just a market but also a platform, one that will eventually evolve into user-defined WAN (UD-WAN). To clarify, the term applies to enterprise users and not consumers. And the purpose of SD-WAN is to connect and fully integrate the very edges of the enterprise – be it corporate headquarters, branch/remote offices or the mobile millions. In other words, us, the users. But if we look at the concept of the cloud, it is pretty clear that it is referenced in an abstract form. After all, what is this cloud thing? Some physical space in a non-descript windowless warehouse? Without its tentacles, the cloud is nothing more than a collection of computers, storage and cooling systems created by geeks, and to what end? It is those very tentacles, in the form of wide-area networks (WAN), that give the Cloud its purpose. And given the explosive adoption of cloud-based applications (Box, Dropbox, Salesforce, SAP, Slack, etc.), cloud computing is not a fad; it is here to stay. However, that is just the beginning.


The value of superior UX? Priceless, but awfully hard to measure

The problem, Cooper continues, is that managers and executives outside of the bubble remain skeptical about investing any more than they have to in UX -- to them, it's a dark art. So, they ask: "What is the ROI of UX?" Asking about ROI, of course, is a manager's way of expressing doubts. "They aren't seeking enlightenment," Cooper says. ... In UX design, he continues, "ROI is often about eliminating poor design." Some industry specialists have attempted to put a monetary value on superior UX design. A recent report from CareerFoundry estimates that UX design work delivers a 100-fold return on investment, without even counting the soft benefits. Every $1 invested in UX translates to returns of at least $100, the report's authors illustrate -- mainly through e-commerce and customer-facing interactions. Add to this the softer, but just as important, ancillary benefits: "fewer support calls, increased customer satisfaction, reduced development waste, and lower risk of developing the wrong idea."


Why tech matters – the challenge for everyone in the UK tech community


If there is a magic recipe for digital innovation, then the UK surely has all the ingredients. We have created and attracted some of the world’s best and most diverse digital talent. We have world-leading businesses, universities and powerful ecosystems that enable expertise to spill over from one part of the economy to another. In almost every sector, I can point to world leaders on the cutting edge of digital transformation. Above all, we have ambition and we have each other. What sets us apart from any other country is that in the UK technology community, we stand on the shoulders of each other. But to really thrive, three things are important. We must stay focused on making tech work for people and our economy. We must not underestimate our international competitors. And, perhaps most importantly, we must accept the enormous responsibility that comes with developing powerful technology. We do have great people in this sector – but we simply don’t have enough of them. And we don’t have the depth of skills and talent that the economy needs as a whole. This, surely, is our biggest challenge.


Why Artificial Intelligence Is Not a Silver Bullet for Cybersecurity

While AI is likely to work quite well over a strictly controlled network, the reality is much more colorful and much less controlled. AI's Four Horsemen of the Apocalypse are the proliferation of shadow IT, bring-your-own-device programs, software-as-a-service systems, and, as always, employees. Regardless of how much big data you have for your AI, you need to tame all four of these simultaneously — a difficult or near-impossible task. There will always be a situation where an employee catches up on Gmail-based company email from a personal laptop over an unsecured Wi-Fi network and boom! There goes your sensitive data without AI even getting the chance to know about it. In the end, your own application might be protected by AI that prevents you from misusing it, but how do you secure it for the end user who might be using a device that you weren't even aware of? Or, how do you introduce AI to a cloud-based system that offers only smartphone apps and no corporate access control, not to mention real-time logs? There's simply no way for a company to successfully employ machine learning in this type of situation.


Unsecured server exposes 157 GB of highly sensitive data from Tesla, Toyota and more

The unsecured trade secrets and corporate documents had been exposed via the file transfer protocol rsync. UpGuard wrote, “The rsync server was not restricted by IP or user, and the data set was downloadable to any rsync client that connected to the rsync port. The sheer amount of sensitive data and the number of affected businesses illustrate how third- and fourth-party supply chain cyber risk can affect even the largest companies. The automation and digitization of manufacturing has transformed the industry, but it has also created a new area of concern for industries, and one that must be taken seriously for organizations to thrive in a healthy digital ecosystem.” Not only could anyone connect to Level One’s rsync server, but it was also “publicly writable, meaning that someone could potentially have altered the documents there, for example replacing bank account numbers in direct deposit instructions, or embedding malware.” The exposed rsync server was discovered on July 1. Attempts to contact Level One started on July 5, but contact wasn’t established until July 9. The exposure was closed within a day, by July 10.
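For comparison, the exposure described is avoidable with standard rsync daemon configuration directives; a minimal hardened module sketch (the module name, paths and network range are examples):

```ini
# /etc/rsyncd.conf -- illustrative hardening for an rsync daemon
[engineering]                       ; example module name
    path = /srv/engineering
    read only = true                ; reject writes from clients
    hosts allow = 10.0.0.0/8        ; restrict access by source IP
    hosts deny = *
    auth users = backup             ; require a named rsync user
    secrets file = /etc/rsyncd.secrets
```

The Level One server lacked all three controls the directives above provide: IP restriction, authentication, and read-only enforcement.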


Organizations Need IT Experts Who Know Basic LAN/WAN Switching and Routing

Responsiveness, security, and reliability are the new hallmarks of networking. Automation, analytics, IoT, policy-based network management, programmability, and virtualization are enabling these changes. The technologies, and the ways they’re being applied, are new. So, IT and networking professionals need new skills to make them work for businesses. In order to appeal to hiring managers, boost their careers, and bring greater value to employers, there are fundamental skills that IT and networking professionals need. At a very fundamental level, it’s critical that IT experts know the basics of LAN and WAN switching and routing. These skills will help network engineers configure, verify, troubleshoot, and secure today’s networks. In addition, the evolution of the network creates a growing need for IT professionals who can implement and manage software-centric networks. This involves using APIs, controllers, policies, and virtualization. These technologies and tools allow for greater automation, network intelligence, and agility.
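As an example of those routing basics, a router's forwarding decision comes down to longest-prefix match, which can be demonstrated with the standard library (the routing table and next-hop names are invented):

```python
# Longest-prefix match: a router picks the most specific route that
# contains the destination address.
import ipaddress

ROUTES = {
    "0.0.0.0/0": "isp-gw",        # default route
    "10.0.0.0/8": "core-rtr",
    "10.1.0.0/16": "branch-rtr",
}

def next_hop(dst):
    addr = ipaddress.ip_address(dst)
    matches = [(ipaddress.ip_network(p), hop)
               for p, hop in ROUTES.items()
               if addr in ipaddress.ip_network(p)]
    return max(matches, key=lambda m: m[0].prefixlen)[1]
```

A destination inside 10.1.0.0/16 matches all three routes but is forwarded via the /16, the longest prefix; anything unmatched falls through to the default route.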


The Engineer’s guide to the future


If AR is hyped, AI is basically the buzzword of the century. Lots of people aren’t really sure what it means, but they know it’s important and that their business needs it. The first thing to know is that modern-day Artificial Intelligence doesn’t actually mean a computer being intelligent — it’s basically a catch-all term for computer programs that can “learn” in order to improve their operational efficiency or their success rate. Even so, lots of applications that say they use AI actually don’t. A chatbot that has a big decision tree in the background isn’t AI, it’s just a big decision tree. If you ask “What is Ragnarok?” and get back the answer “It is simultaneously a great action movie and the ruin of a good character” — it’s probably not artificial intelligence, just quite wise. However, there is plenty of amazing work being done with proper AI and Machine Learning, for a whole heap of use-cases. We don’t need a crystal ball to say that knowing about AI will be beneficial for a future engineering career. Similar to Apple and Google releasing tools to “democratise” Augmented Reality development, each year there are more tools available to enable developers to build AI solutions.
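The decision-tree chatbot the author dismisses can be sketched in a few lines, which makes the distinction concrete: every answer is hand-written, and nothing in the program improves with use. This is an illustrative toy, not any real chatbot framework.

```python
# A "chatbot" that is just a hand-written lookup/branching structure.
# No data, no learning — by the article's definition, not AI.
RESPONSES = {
    "what is ragnarok?": "It is simultaneously a great action movie "
                         "and the ruin of a good character",
}

def tree_bot(question: str) -> str:
    # Fixed branching: behavior is identical on the millionth query.
    return RESPONSES.get(question.strip().lower(), "I don't understand.")

print(tree_bot("What is Ragnarok?"))
```

A machine-learning system, by contrast, would adjust internal parameters from example conversations, so its answers change as it sees more data — that capacity to learn is what the "AI" label is supposed to mark.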


In the wake of GDPR, college IT security programs need to evolve

While U.S. universities that offer information security programs typically cover a range of compliance concepts related to U.S. regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) or Sarbanes-Oxley (SOX), the GDPR is something of a game changer because it is not a regulation enacted by a U.S. agency, yet it requires compliance on the part of U.S. entities. The GDPR is only the first of several proposed global regulations governing data privacy. Before 2015, data exchanges between the U.S. and the EU were governed by the Safe Harbor program, which allowed the personal data of EU citizens to be exchanged with U.S. providers as long as both sides of the transaction complied loosely with the EU Data Protection Directive. The directive wasn’t as tightly defined as the GDPR and lacked teeth in the form of significant fines or penalties. As a result, until now U.S. businesses have not had to unduly concern themselves with regulations enacted outside U.S. borders. GDPR demands a change in that mindset.


Can businesses use blockchain to solve the problem of data management?

Since the nodes are distributed and operate peer-to-peer, there is no single central bottleneck. One of the most important features of blockchain systems, however, is immutability: once an entry is appended to the database, it cannot be removed. Using blockchain for databases seems like a logical step forward. There’s definitely an emerging movement seeking to lay the foundations for a decentralised architecture across industries. With blockchain, a marketplace akin to Airbnb or Uber can materialise for storing data – nodes on the network can be incentivised to replicate and retain information using a blockchain protocol’s inbuilt payment layer. This concept can be taken a step further with the use of sharding and swarming. Sharding offers a greater degree of privacy whereby, instead of sending a file to other nodes, you distribute fragments of said file. In this way, the owner can be sure that those in possession of their data cannot access it, as they will only hold a small (and unreadable) piece – much like torrenting.
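The sharding idea above can be sketched with plain Python: split a file into fragments and tag each with a hash so the owner can verify integrity on reassembly. This is only a sketch of the splitting step; real decentralised-storage protocols also encrypt the file before sharding (so a fragment is cryptographically unreadable, not just incomplete) and disperse fragments to distinct nodes.

```python
import hashlib

def shard(data: bytes, n: int) -> list[tuple[str, bytes]]:
    """Split data into roughly n fragments, each tagged with its SHA-256.

    Any single fragment reveals only a small slice of the file —
    the privacy property sharding is meant to provide.
    """
    size = -(-len(data) // n)  # ceiling division
    frags = [data[i:i + size] for i in range(0, len(data), size)]
    return [(hashlib.sha256(f).hexdigest(), f) for f in frags]

def reassemble(frags: list[tuple[str, bytes]]) -> bytes:
    # Verify each fragment against its hash, then concatenate in order.
    assert all(hashlib.sha256(f).hexdigest() == h for h, f in frags)
    return b"".join(f for _, f in frags)

doc = b"example document contents to be dispersed across nodes"
pieces = shard(doc, 4)
assert reassemble(pieces) == doc
```

The hashes double as content addresses, which is how a payment layer could verify that a node is actually retaining the fragment it is being paid to store.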



Quote for the day:


"Authentic leaders are not afraid to make mistakes, but they fix them faster than they make them." -- George Bernard Shaw