July 17, 2016

Windows Containers and Docker

The Windows Server container shares the kernel with the OS running on the host machine, which means all containers running on that machine share the same kernel. At the same time, each container maintains its own view of the OS, registry, file system, IP address, and other components, with isolation provided to each container through process, namespace, and resource control technologies. The Windows Server container is well suited for situations in which the host OS and containerized applications all lie within the same trust boundary, such as applications that span multiple containers or make up a shared service. However, Windows Server containers are also subject to an OS/patch dependency with the host system, which can complicate maintenance and interfere with operations.
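The isolation choice the excerpt describes is something you select when you start the container. As a rough sketch (not from the article), and assuming the Docker SDK for Python plus an illustrative Windows Server Core image, the shared-kernel versus Hyper-V decision might look like this:

```python
# Minimal sketch: choosing the isolation mode for a Windows container with the
# Docker SDK for Python. The image name and setup are illustrative assumptions;
# this must run against a Docker engine on a Windows container host.
import docker

client = docker.from_env()

# "process" isolation shares the host kernel (the Windows Server container model);
# switch to isolation="hyperv" when host and container sit in different trust
# boundaries, since Hyper-V isolation gives the container its own kernel.
container = client.containers.run(
    "microsoft/windowsservercore",   # hypothetical image tag for illustration
    command="cmd /c ver",
    isolation="process",
    detach=True,
)
container.wait()
print(container.logs().decode())
```

Because process-isolated containers share the host kernel, they also inherit the OS/patch dependency mentioned above; Hyper-V isolation trades some density and startup speed for a stronger boundary.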


BNP's Ex-Blockchain Lead is Now Coding Smart Contracts for Clearinghouses

Along with co-founder and fellow BNP Paribas alum David Acton, the bootstrapped team has already built a proof of concept for contract creation and trade registration that uses a smart contract for US Treasuries and other cash-like short-term treasury instruments in Europe. The smart contracts are intended to represent bilateral contracts between parties that are backed by different guarantors, likely members of the CCP clearinghouse, while the smart contracts themselves would be administered by the CCP. In Europe, CCPs such as European Central Counterparty NV and Eurex Clearing serve counterparties by taking funds from the buyer and assets from the seller and managing the risk in a wide range of ways. In the US, the DTCC fulfills a similar function.


The Batch Mode Window Aggregate Operator in SQL Server 2016: Part 2

Besides the general performance advantages of batch mode processing compared to row mode processing, this operator uses a dedicated code path for each window function. Many inefficiencies in the original row mode optimization are removed. For example, the need for an on-disk spool is eliminated by maintaining the window of rows in memory and using a multi-pass process over that window in a streaming fashion. ... Remember that when querying columnstore, sorting for the computation of window functions is unavoidable since columnstore data isn’t ordered; however, you do get the benefits of reduced I/O cost, batch mode processing, and parallelism, with much better scaling for larger numbers of CPUs compared to row mode processing.
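To make the scenario concrete, here is a hedged sketch of the kind of query the batch mode Window Aggregate operator targets: a running-total window function over a table with a columnstore index, issued from Python via pyodbc. The table and column names (dbo.Transactions, actid, tranid, val) and the connection string are assumptions, not taken from the article.

```python
# Minimal sketch: a window aggregate over a clustered-columnstore table,
# executed from Python with pyodbc. Server, database, table, and column names
# are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};SERVER=localhost;"
    "DATABASE=TSQLV4;Trusted_Connection=yes"
)

# With a columnstore index on dbo.Transactions, SQL Server 2016 can choose the
# batch mode Window Aggregate operator for this running total instead of the
# row mode Segment/Spool/Stream Aggregate plan.
sql = """
SELECT actid, tranid, val,
       SUM(val) OVER (PARTITION BY actid
                      ORDER BY tranid
                      ROWS UNBOUNDED PRECEDING) AS runningtotal
FROM dbo.Transactions;
"""

for row in conn.cursor().execute(sql):
    print(row.actid, row.tranid, row.val, row.runningtotal)
```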


SQL Server, Power BI, and R

R has also been integrated into Power BI, allowing you to create fully integrated visualizations with the power of the R language. In this blog post I will show an example using R in SQL Server to create a model and batch score testing data, then use Power BI Desktop to create visualizations of the scored data. Rather than moving your data from the database to an external machine running R, you can now run R scripts directly inside the SQL Server database – train your model inside the database with the full power of CRAN R packages plus the additional features of speed and scalability from the RevoScaleR package. And once you have a model, you can perform batch scoring inside the database as well.
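As an illustration of the in-database pattern the post describes (train and score where the data lives), the sketch below calls SQL Server R Services through sp_execute_external_script from Python via pyodbc. The table name, the model formula, and the use of plain glm instead of the RevoScaleR functions the post relies on are assumptions made for brevity.

```python
# Minimal sketch: running an R script inside SQL Server 2016 (R Services) via
# sp_execute_external_script, invoked from Python with pyodbc. Connection
# details, the dbo.NYCTaxiSample table, and the model are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};SERVER=localhost;"
    "DATABASE=MLDemo;Trusted_Connection=yes"
)

sql = """
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        model <- glm(tipped ~ trip_distance + passenger_count,
                     data = InputDataSet, family = binomial)
        OutputDataSet <- data.frame(coefficient = coef(model))',
    @input_data_1 = N'SELECT tipped, trip_distance, passenger_count
                      FROM dbo.NYCTaxiSample'
WITH RESULT SETS ((coefficient FLOAT));
"""

for row in conn.cursor().execute(sql):
    print(row.coefficient)
```

Batch scoring follows the same shape: persist the serialized model in a table, then have a second sp_execute_external_script call load it and score new rows, so neither the data nor the model leaves the database.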


Strengthening the Foundations of Software Architecture

Once the software architecture is in place, it may subsequently be discovered that the requirements have changed or were never fully understood. How easy it is to change the software depends on whether it was architected in such a way that alterations don’t significantly conflict with the original design. The more extreme agile methodologies deviate further from this blueprint: applications are written in smaller slices, delivering value to the user faster but reducing visibility of the overall design. There often isn’t any one individual responsible for designing the architecture, and decision-making is delegated to the developers incrementally working on the software. Because cycle times are reduced, the team can get feedback faster and respond more quickly to changing requirements, but how easy it is to implement changes still depends on whether they are congruous with the architecture.


Basho Open Sources Time Series Database Riak TS 1.3

Basho is indeed the biggest contributor to Riak TS, primarily because we had to make so many additions and changes to have a purpose-built time series solution. We are currently talking to several companies about adding a series of capabilities to Riak TS to solve a large problem in the time series arena. As for the objective of open sourcing the code, we believe that we have a lot to offer the community in terms of innovative approaches, ideas, and leadership in distributed systems, and we want to collaborate to build even better solutions. That process is almost always accelerated when you leverage open source as a path. We have a long history of open sourcing our software and gaining support from the community in creating better solutions.


Modeling your big data enterprise architecture after the human body

Think about the storage systems in our brain. We have short-term, sensory, long-term, implicit, and explicit. Why do we have so many? The answer is that each system provided an evolutionary benefit over a generalized system. These systems most likely have different indexing strategies, flushing mechanisms, and aging-out/archiving processes. We find a parallel in our world of software architecture, with storage systems like RDBMS, Lucene search engines, NoSQL stores, file systems, block stores, distributed logs, and more. The same goes for our processing systems. Vision interpretation is very different from complex decision-making. Just like the brain, software architecture has different execution patterns and optimizations that serve different use cases: SQL, Spark, SPARQL, NoSQL APIs, search queries, and many more. There is a reason for the different approaches to processing, and there will be more approaches in the future as we find different ways to address our problems.


How Cardihab uses data to speed recovery of cardiac patients

Cardihab is a spin-off company from the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and is also a participant in the HCF Catalyst accelerator program. McBride explained that a key reason people do not complete their cardiac rehabilitation (CR) program is a lack of accessibility and convenience. "The way normal cardiac rehab works is it's usually a 6-8 week long program where the person has to go to a clinic once or twice a week and that can really be inconvenient, especially for patients who have returned to work, or for rural remote patients," McBride said. Cardihab has been designed to collect data about a patient, including how many steps they have taken and their blood pressure and sugar levels, via Bluetooth-enabled monitors. The information is then uploaded to the cloud and shared with the patient's clinician, who can access it through an online portal.


Deep Learning: Using an Artificial Brain to Protect against Cyberattacks

When applied to cybersecurity, it takes milliseconds to feed in a raw data file and pass it through the deep neural network to obtain detection with the highest accuracy rate. This predictive capability of detecting a never-before-seen malware variant enables not only extremely accurate detection, but also leads the way to real-time prevention, because at the very second a malicious file is detected, it is already blocked. Therefore, while traditional machine learning yields better results than signatures and manual heuristics, deep learning has shown groundbreaking results in detecting first-seen malware, even compared with classical machine learning. This observation is consistent with the improvements achieved by deep learning in other fields, such as computer vision, speech recognition, and text understanding.
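For a sense of the mechanics, here is a deliberately tiny sketch, not the vendor's system: raw bytes are reduced to a fixed-length feature vector and scored by a small neural network. The byte-histogram featurization, the synthetic data, and the scikit-learn MLP are all illustrative assumptions; a production detector would use far deeper models trained on large labeled corpora of malware and benign files.

```python
# Minimal sketch of byte-level malware classification. Data and labels below are
# synthetic placeholders used only to make the example runnable.
import numpy as np
from sklearn.neural_network import MLPClassifier

def byte_histogram(data: bytes) -> np.ndarray:
    """Reduce a raw file to a fixed-length (256,) vector of byte frequencies."""
    counts = np.bincount(np.frombuffer(data, dtype=np.uint8), minlength=256)
    return counts / max(len(data), 1)

rng = np.random.default_rng(0)
files = [rng.integers(0, 256, size=4096, dtype=np.uint8).tobytes() for _ in range(200)]
labels = rng.integers(0, 2, size=200)   # 1 = malicious, 0 = benign (fabricated)

X = np.vstack([byte_histogram(f) for f in files])
clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300).fit(X, labels)

# Scoring a new file is a single forward pass and takes milliseconds, which is
# what makes "block at the moment of detection" feasible.
print(clf.predict_proba(X[:1]))
```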


SQL Server 2016 Upgrade Planning

If you are using Master Data Services or Data Quality Services, keep in mind that any customization will get overwritten. You must back up your MDS and DQS databases before upgrading to prevent any accidental data loss. In SQL Server 2016, those applications include schema-level upgrade changes. After upgrading to SQL Server 2016, any earlier version of the Add-ins for Excel will no longer work, so you will need to tell your users to download the SQL Server 2016 versions of the Master Data Services or Data Quality Services Add-in for Excel. Integration Services packages do not get automatically updated during a SQL Server upgrade; you will need to migrate packages after the upgrade completes, using the Integration Services Package Upgrade Wizard. Developers can upgrade 2012 or 2014 projects to 2016 without manual adjustments after the upgrade, and they can also choose to update incrementally without deploying the whole project.
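Since the one hard requirement in the excerpt is backing up the MDS and DQS databases before the upgrade, here is a minimal sketch of doing that from Python via pyodbc. The database names (MDS plus the default DQS databases), the backup path, and the connection string are assumptions to adjust for your environment.

```python
# Minimal sketch: take full backups of the MDS and DQS databases ahead of a
# SQL Server 2016 upgrade. Names and paths are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};SERVER=localhost;"
    "DATABASE=master;Trusted_Connection=yes",
    autocommit=True,   # BACKUP DATABASE cannot run inside a user transaction
)

for db in ("MDS", "DQS_MAIN", "DQS_PROJECTS", "DQS_STAGING_DATA"):
    cur = conn.cursor()
    cur.execute(
        f"BACKUP DATABASE [{db}] "
        f"TO DISK = N'C:\\Backups\\{db}_preupgrade.bak' WITH INIT, CHECKSUM;"
    )
    # Drain any remaining result sets so each backup finishes before the next starts.
    while cur.nextset():
        pass
```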



Quote for the day:

“I do not think that there is any other quality so essential to success of any kind as the quality of perseverance.” -- John D. Rockefeller

