Today, however, many enterprise blockchains run in the cloud, often in a single datacenter. This goes against the philosophy that blockchains should be fully decentralized, and it opens up the security risks associated with managing data in a single location. Cisco believes the next generation of enterprise blockchain systems should be fully decentralized and offer hybrid deployment models, where blockchain nodes can sit on-premises behind firewalls, run on public cloud nodes, or use some combination of the two. In addition, business leaders need to think holistically about end-to-end security. A common misconception is that the distributed nature of blockchains makes them inherently secure, and in many ways it does. However, without the correct design measures, they can be prone to multiple threats, for example through the exploitation of infrastructure-level vulnerabilities. Our approach involves an end-to-end security architecture, leveraging analytics that span the infrastructure layer through to the application layer.
What has changed in recent years in the promise of AI is the coming together of very advanced computing power with highly sophisticated algorithms and networks that collaborate to recognize patterns, discern meaning from vast sets of data, and train themselves to get better at these tasks. For the enthusiasts, the big question in AI might be when we will achieve machine super-intelligence, or the Singularity, the point at which machine intelligence explodes to vastly surpass human intelligence. It has been chillingly described as the last invention humans will make. But for a country at India’s level of socio-economic development, the suite of AI technologies can be applied effectively ... The NITI Aayog report proposes an umbrella organisation to shape and implement India’s AI policies—from stitching together global partnerships to picking specialised teams to pursue audacious moonshot projects. It identifies five focus areas and a two-tiered institutional structure: government-led CORE, or Centre of Research Excellence, to focus on core research, and private sector-led ICTAI, or International Centers of Transformational AI, to focus on application-based AI research.
Because architects are often visually oriented, we tend to show the fruit of our efforts by means of diagrams. But this is not always the best option; it mainly works for 1) designing and 2) visually oriented people. In management, we often find people with a legal background. They are ill-served by diagrams and other pictures, but instead require text, while those with a background in finance need numbers, tables, and charts. Speaking their “language” is an important part of effecting the change you want to see. The information contained in architecture models can be displayed in many forms – not just diagrams. In particular, the various dependencies between the elements of your architecture can be exploited to create cross-cutting insights that serve the needs of various stakeholder groups. Displaying this information in multiple ways may help different types of stakeholders gain the understanding they need to move forward. Below, you see the technology obsolescence risk of (part of) the business capabilities of an organization. The end-of-life dates of various technologies, as provided by Technopedia, are aggregated across the entire architecture via, in this instance, applications and business processes supporting each capability.
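As a rough illustration of how such an aggregation might work, the sketch below rolls up technology end-of-life dates through applications to the business capabilities they support. All names and dates here are invented; a real analysis would draw the end-of-life data from a repository such as Technopedia.

```python
from datetime import date

# Hypothetical data: technologies with end-of-life dates, the applications
# that depend on them, and the capabilities those applications support.
eol = {
    "Java 8": date(2030, 12, 31),
    "Windows Server 2012": date(2023, 10, 10),
}
app_tech = {
    "CRM": ["Java 8"],
    "Billing": ["Windows Server 2012", "Java 8"],
}
capability_apps = {
    "Customer Management": ["CRM"],
    "Invoicing": ["Billing"],
}

def capability_risk(capability):
    """Earliest end-of-life date among all technologies the capability
    depends on, via its supporting applications."""
    dates = [eol[tech]
             for app in capability_apps[capability]
             for tech in app_tech[app]]
    return min(dates)

for cap in capability_apps:
    print(cap, capability_risk(cap))
```

The same dependency chains could of course be aggregated differently, for example counting the number of at-risk technologies per capability rather than taking the earliest date.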
This article will focus on working with Ethereum on the Microsoft Azure platform. Azure is Microsoft's cloud platform and can be used for many other purposes as well. We'll use a consortium from Azure. A consortium, in Ethereum's context, means a collaboration between the supporting members of a blockchain and its underlying infrastructure. The consortium helps us configure the blockchain and its controlling parts on Azure. We create a genesis block and a default account with initial Ether allocations. The consortium we set up here is a private blockchain and is not part of the public Ethereum blockchain, which means Ether from the private blockchain cannot be transferred or exchanged directly with Ether from the public blockchain. Azure helps us set up the baseline blockchain infrastructure by providing templates. We'll first set up a blockchain consortium on Azure, and Azure will create a genesis block for us. If you do not have a paid Azure account, you can take advantage of Azure's new-account benefit of $200 in credits; that is, if you are new to Azure and want to play around with its free trial, you'll get $200 in credits to explore the platform. If you are new to Azure and do not have an account, follow the sign-up process below; otherwise, log in to your portal directly.
Companies may have good reasons for wanting to avoid implementing public blockchain infrastructure. Chief among their concerns is that they want to keep their information and transactions private. Firms need to protect their intellectual property and safeguard critical pieces of information. A private blockchain can function like its public counterparts, with the addition of a permissioning system under which only authorized users can interact with the blockchain. External parties such as accountants, auditors, and regulators could be brought into the structure to monitor business activities in real time. As such, blockchain creates the possibility for what many are calling a triple-entry accounting system: when companies do business with one another, transactions are not just recorded as debits and credits on separate books, but can also be easily verified on the shared blockchain. This has the potential to eliminate some of the chief mechanisms for fraud, even if eliminating fraud completely will prove challenging.
For banks globally, 2018 could be a pivotal year in accelerating the transformation into more strategically focused, technologically modern, and operationally agile institutions, so that they may remain dominant in a rapidly evolving ecosystem. This metamorphosis is far from easy as most banks grapple with multiple challenges: complex and diverging regulations, legacy systems, disruptive models and technologies, new competitors, and, last but not least, an often restive customer base with ever-higher expectations. ... Although banking has undoubtedly improved in many ways in the last couple of decades, most organizations have not gone through the customer-centric transformation that other industries have undergone. With widespread digital disruption, banks may even risk losing control over customer experience. Of course many banks, global and local, large and small, have changed their market and customer strategies since the financial crisis. Many of these decisions may have been forced upon them by regulatory expectations, and perhaps are not necessarily grounded in a refined understanding of markets and customers.
Even before machine learning and deep learning, image analysis software was available that could identify edges, boundaries, geometric shapes, and density distributions, and perform texture analysis, as aids for the pathologist. Subsequently, machine learning methods of various types, such as random forests and support vector machines, were able to learn from these features and identify lesions. With the advent of deep learning with multiple connected processing layers, it was no longer necessary to extract such features from an image before using them as representations of the image to input into the algorithm. The most widely used deep learning model is the convolutional neural network (CNN). A CNN can do feature extraction automatically and analyse that information directly. All that a CNN needs is a (suitably labelled) training set of the images themselves. Rather than working with the (large number of) pixels in the image directly, the initial layers of the CNN generate increasingly complex representations of those images which, though much smaller than the original, capture important features that the rest of the network uses for analysis and classification.
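To make the idea of convolutional feature extraction concrete, here is a minimal NumPy sketch of a single convolution layer followed by ReLU and max pooling, the building blocks a CNN stacks. In a real CNN the kernels are learned from labelled training images; the edge-detector kernel below is hand-set purely for illustration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image and take
    the elementwise product-sum at each position."""
    h, w = kernel.shape
    H, W = image.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

def max_pool(fmap, size=2):
    """Downsample by taking the max of each size x size block."""
    H, W = fmap.shape
    return (fmap[:H - H % size, :W - W % size]
            .reshape(H // size, size, W // size, size)
            .max(axis=(1, 3)))

image = np.random.rand(28, 28)        # a toy grayscale image patch
edge_kernel = np.array([[1.0, -1.0]]) # hand-set horizontal-edge detector
fmap = np.maximum(conv2d(image, edge_kernel), 0.0)  # convolution + ReLU
pooled = max_pool(fmap)
print(image.shape, fmap.shape, pooled.shape)
```

Note how each stage shrinks the representation: the 28x28 input becomes a 28x27 feature map and then a 14x13 pooled map, while preserving where the strong edges are, which is exactly the "smaller but more informative" behaviour described above.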
The original blockchain architecture for Bitcoin was not built for Turing-complete smart contracts; we need another architecture for large-scale decentralized applications that fulfills the requirements of business and economic use cases. As such, PoW and PoS are both unsuitable. To meet these requirements, the NEO blockchain is built on delegated Byzantine Fault Tolerance (dBFT), an adapted BFT algorithm by Erik Zhang. The advantage of dBFT is that the chain cannot branch: there is always one and only one block proposed at any single time. ... A transaction is completed directly after it is included in a block, in other words in one single confirmation. It also means that the transaction time is effectively equal to the block time; if the block time is 5 seconds, then the transaction is received in 5 seconds. Moreover, no risk management is needed when sending and receiving large-volume, high-value transactions. With no branching during operation, all transactions are 100% complete once they are included in a block, so no risk analysis on finality has to be made.
To show the value of stream processing, let’s pick a motivating example that is applicable across many different industries. Imagine that you need to create and maintain, in real-time, 360-degree profiles for your customers. This would be useful for many reasons such as: To create a better customer experience. For instance, "This premium customer tried several times in the past five minutes to checkout his shopping cart but failed because our latest website update was buggy, let’s offer him a discount immediately and apologize for the bad user experience."; and To minimize risk. For instance, "This new payment seems fraudulent because the payment originated outside the US, yet the customer’s mobile phone app reports she is in New York City. Let’s block the payment and contact the customer immediately." What is needed for this use case is to bring together, in real-time, data from a variety of internal and possibly also external channels, and then consolidate this information in comprehensive customer profiles (sometimes called customer 360-degree profiles) that are continuously updated as soon as new information is available in any of the channels.
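A minimal sketch of this consolidation step might fold each incoming channel event into an in-memory profile keyed by customer. The event and profile fields here are invented for illustration; a production system would consume a live stream (e.g. from a message broker) rather than a list.

```python
from collections import defaultdict

# Hypothetical events arriving from different channels, each tagged
# with the customer they concern.
events = [
    {"customer": "c42", "channel": "web",      "type": "checkout_failed"},
    {"customer": "c42", "channel": "web",      "type": "checkout_failed"},
    {"customer": "c42", "channel": "payments", "type": "payment", "origin": "DE"},
    {"customer": "c42", "channel": "mobile",   "type": "location", "city": "New York"},
]

# One continuously updated 360-degree profile per customer.
profiles = defaultdict(lambda: {
    "checkout_failures": 0,
    "last_city": None,
    "payment_origins": [],
})

def update_profile(event):
    """Fold one incoming event into the customer's profile."""
    p = profiles[event["customer"]]
    if event["type"] == "checkout_failed":
        p["checkout_failures"] += 1
    elif event["type"] == "location":
        p["last_city"] = event["city"]
    elif event["type"] == "payment":
        p["payment_origins"].append(event["origin"])
    return p

for e in events:  # in production, this loop would consume a live stream
    update_profile(e)

print(profiles["c42"])
```

After these four events, the profile already supports both examples above: repeated checkout failures can trigger a discount offer, and a payment originating outside the customer's reported location can trigger a fraud check.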
The most common is supervised learning. This is where you have collected multiple examples of the inputs and outputs of some process you want to learn, recorded in a database or in some code representation like XML or JSON, and the machine learning algorithm tries to create a model that reproduces the outputs when presented with the inputs. If you remember from the previous article, DARL inputs and outputs can be Textual, Categorical, Numeric or Temporal. Machine learning here is limited to categorical and numeric inputs and outputs. Learning is limited to a single output at a time. If that output is categorical, then classification has been performed; if numeric, then prediction. The data used to train the model is the training set, and some of the data may be put aside to form a test set. With this machine learning algorithm you specify the percentage of the data to train on, and the system will randomly split the data into the two groups. Although problems that have an existing analytic solution are sometimes used to test ML algorithms, for instance getting a model to copy some logical relationship, in the real world no one in their right mind would use a machine learning algorithm to learn something for which an analytic model already exists.
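The percentage-based random split described above can be sketched as follows. The dataset and the `train_test_split` helper are illustrative only, not part of DARL or any specific product.

```python
import random

def train_test_split(rows, train_pct, seed=0):
    """Randomly split rows into a training set and a held-out test set,
    keeping train_pct percent of the data for training."""
    rows = rows[:]                      # copy so the caller's list is untouched
    random.Random(seed).shuffle(rows)   # seeded for reproducibility
    cut = int(len(rows) * train_pct / 100)
    return rows[:cut], rows[cut:]

# Toy supervised dataset: numeric input, categorical output,
# so a model trained on it would be performing classification.
data = [(x, "high" if x > 50 else "low") for x in range(100)]
train, test = train_test_split(data, train_pct=80)
print(len(train), len(test))  # → 80 20
```

The held-out test set is then used only to estimate how well the trained model generalizes, never for training itself.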
Quote for the day:
"Obstacles can't stop you. Problems can't stop you. Most of all, other people can't stop you. Only you can stop you." -- Jeffrey Gitomer