Daily Tech Digest - July 23, 2017

Natural Language Processing: The What, Why, and How

Business managers have a Big Data problem. They puzzle over dashboards and spreadsheets, drowning in data while trying to compile it all into meaningful information. Arria, a company based in London, has come up with a solution. The Arria NLG Platform is a form of artificial intelligence that specializes in communicating information extracted from complex data sources in natural language (i.e. as if written by a human). It literally takes an organization’s data and transforms it into language, not the overly technical, hard-to-read text computers typically generate, but natural human language that reads as if a literate, well-educated person wrote it. Drag and drop a spreadsheet full of data into Arria’s software and it automatically produces a written description of the contents, complete with trends, essentially providing a business report.
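As a rough illustration of the data-to-text idea (not Arria’s actual engine, whose internals aren’t described here), a minimal template-based sketch in Python might summarize one numeric column from a spreadsheet; the column label and thresholds are assumptions for the example:

```python
# Minimal data-to-text sketch: turn a numeric series into a one-sentence summary.
# Illustrative only; a commercial NLG engine does far more (discourse planning,
# aggregation, lexical choice), and the column label below is hypothetical.

def describe_trend(label, values):
    """Return a natural-language sentence describing the trend in `values`."""
    first, last = values[0], values[-1]
    change = (last - first) / first * 100 if first else 0.0
    if change > 5:
        direction = "rose"
    elif change < -5:
        direction = "fell"
    else:
        direction = "held roughly steady"
    return (f"{label} {direction} over the period, "
            f"moving from {first:,.0f} to {last:,.0f} ({change:+.1f}%).")

# Example usage with made-up monthly sales figures:
print(describe_trend("Monthly sales", [12000, 12500, 13100, 14200, 15050]))
# -> "Monthly sales rose over the period, moving from 12,000 to 15,050 (+25.4%)."
```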


Real Time Data Integration on Hadoop

This very quick and focused data integration is often referred to as “streaming data enrichment”. In the insurance example, the company wants each recommendation to be based on the full context of the customer’s relationship with the company. Data integration in near real time is required because the first call provides a critical part of the context for the second call or website visit. My colleague, NoSQL expert Bryce Cottam, suggests using a low-latency NoSQL database, such as HBase, as the repository for the integrated data in this case. Apache HBase is an open source database included in, or available with, most Hadoop distributions. Integration can be further simplified by designing the solution around a specific data integration requirement. For the insurance example, the problem is to integrate the data by customer.
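As a sketch of what that HBase-backed enrichment store might look like (the table name, column family, and customer key are illustrative assumptions, not details from the article), a Python client such as happybase can upsert each new interaction under the customer ID so the next event sees the full context:

```python
# Sketch: keep a per-customer context row in HBase so each new interaction
# (call, web visit) is enriched with everything seen before.
# The table name, column family 'ctx', and field names are hypothetical;
# the table must already exist and an HBase Thrift gateway must be running.
import json
import time

import happybase  # Thrift-based HBase client

connection = happybase.Connection('hbase-thrift-host')  # assumed Thrift host
table = connection.table('customer_context')            # assumed table

def record_interaction(customer_id, channel, details):
    """Append an interaction and return the customer's full integrated context."""
    row_key = customer_id.encode()
    column = f'ctx:{channel}_{int(time.time())}'.encode()
    table.put(row_key, {column: json.dumps(details).encode()})
    # Read back every column for this customer: the per-customer integrated view.
    return {k.decode(): json.loads(v) for k, v in table.row(row_key).items()}

# The first call creates the context; the follow-up website visit sees it.
record_interaction('cust-1042', 'phone', {'topic': 'claim status'})
context = record_interaction('cust-1042', 'web', {'page': 'quote'})
```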


Why public cloud is more expensive than you think

“If you were to go out and rent a car from Budget for one day a week, no problem,” he says. “If you want to use that car 24/7, 365 days a year, then you’re going to pay for it twice over.” So anyone who’s looking to run an application with predictable traffic levels that must always be available should avoid public cloud options for it, he says. “That’s very expensive under the Azure and AWS pricing model,” MacDonald explains. “Which it should be, because if you have these virtualized server banks and you’re doing pay-as-you-go, then you have to charge a lot to make a profit, because it’s not going to be used all the time.” Canada15Edge has been in business for about two years, operating one data centre on a colocation model for its clients. MacDonald says he’s hosting a number of managed service providers in his building.
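To make the rental-car analogy concrete, a back-of-the-envelope calculation (all figures below are hypothetical placeholders, not prices from the article) shows how occasional pay-as-you-go use stays cheap while always-on use can roughly double the cost of a fixed alternative:

```python
# Rough arithmetic behind "pay for it twice over" -- all prices are hypothetical.
DAILY_RENTAL = 50.0        # assumed cost to rent the car (or VM) per day
FIXED_YEARLY = 9000.0      # assumed all-in yearly cost of owning (or colocating)

occasional = DAILY_RENTAL * 52    # one day a week:  $2,600/year
always_on = DAILY_RENTAL * 365    # 24/7, 365 days: $18,250/year

print(f"Occasional use:    ${occasional:,.0f}/year")
print(f"Always-on rental:  ${always_on:,.0f}/year")
print(f"Fixed alternative: ${FIXED_YEARLY:,.0f}/year "
      f"(always-on rental costs {always_on / FIXED_YEARLY:.1f}x as much)")
```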


Architecting the digital enterprise

To be nimble requires an organisation to empower those architects closest to the business needs – those with domain expertise. To maintain consistency amidst this new autonomy, an enterprise’s domain architects need to operate with a consensus around the approach to key architecture “plays” – such as cloud, security and analytics. The enterprise architect of the future needs to be able to grasp and manage risk: understanding what to solve now and what to solve iteratively. As the dominance of the biggest players has eroded, architects must construct fluid ecosystems of software, where a product may be used to deliver a business outcome for one or two years until enterprise toolsets evolve. This is a different mentality for architects – one which tolerates risk and even sprawl so long as it is managed and iteratively resolved.


Attack and response: Cloud-native cybersecurity vs. virtual machine security

Most vulnerabilities lie at the application level, and understanding a specific application well enough to protect it against relevant threats is hard to do on an ongoing basis. Cloud-native security addresses this problem with whitelisting and protection from known threats. For the first time ever, you can automatically whitelist which traffic should and shouldn’t reach your application. VM security is completely blind to these application-specific elements, and to the larger context of the application, especially in orchestrated systems where the IPs of the application might change on an hourly basis. Regarding protection from known threats, one of the major issues with existing web application firewalls (WAFs) is that they are very hard to configure correctly for every exposed service.
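As a sketch of the difference, an identity-based (rather than IP-based) whitelist stays valid even as pod or container IPs churn; the service names and allowed-caller map below are assumptions for illustration, not any particular vendor’s API:

```python
# Sketch: whitelist traffic by workload identity instead of by IP address.
# In an orchestrated cluster the IPs behind a service change constantly,
# so the policy is keyed on stable service identities (names/labels).
# The service names and the allowed-caller map are hypothetical.

ALLOWED_CALLERS = {
    "payments-api": {"checkout-frontend", "billing-worker"},
    "user-db-proxy": {"payments-api"},
}

def is_allowed(source_service: str, dest_service: str) -> bool:
    """Return True if the source workload may talk to the destination."""
    return source_service in ALLOWED_CALLERS.get(dest_service, set())

# An enforcement point would resolve each connection's current IP back to a
# workload identity (e.g. via the orchestrator's API) before checking:
assert is_allowed("checkout-frontend", "payments-api")
assert not is_allowed("checkout-frontend", "user-db-proxy")
```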


What’s the Big Deal about China’s First Open-Source Blockchain Platform NEO?

Erik Zhang, core developer of NEO, introduced Smart Contracts 2.0 to the audience and explained the major differences between NEO and Ethereum. Ethereum uses its own programming language, Solidity, whereas NEO supports all programming languages via a compiler, including those on Microsoft .NET as well as Java, Kotlin, Go and Python. By allowing common programming languages to be used on its platform, NEO hopes to attract a vast community of developers. NEO will have the Nest Fund, a project similar to Ethereum’s The DAO, and Tony Tao will soon release a white paper on the project. The Nest Fund aims to improve on The DAO’s shortcomings and will be released after being audited through worldwide peer review.


Open hybrid cloud enables government IT service delivery

An open hybrid cloud solution enables government IT shops to provide flexible and agile service delivery with minimal disruption, using existing infrastructure. At the same time, it establishes a fast, flexible and agile service-delivery environment supporting today’s traditional workloads and tomorrow’s cloud-based applications. Open hybrid cloud leverages innovation, economics and flexibility by providing access to the best service providers, vendors and technologies without getting locked in. Open source solutions are leading the industry in rapid innovation and in delivering a secure open hybrid cloud. “If you automate your way into the cloud, you can automate your way across to another cloud and start making spot market decisions about what cloud you want to be in based on what you’re trying to do,” says Adam Clater.


11 Things Every CEO Must Know About Disruption

The first thing to remember about disruption is that it's a two-way street. Either you are the disrupter, or you are being disrupted. This means that if you aren't making things happen for you or your company, then someone is probably going to put you out of business right under your nose with a lower price point and a better business plan. ... This isn't meant to be fear-based, but it's the reality of the situation. Between the pace at which technology advances and the rate at which ideas are generated, disruption is truly a natural consequence of the times. If you're comfortable in your business, you need to be thinking about who knows you're comfortable and how they are planning to make you uncomfortable. There is a constant ebb and flow of disrupting and being disrupted.


The Jobs that will be Orchestrated, not Automated

With the help of Robotic Service Orchestration (RSO) technology, we can orchestrate services across a human and digital workforce to get the right worker doing the right task at the right time. As we move to an increasingly automated workforce, this will only become more important. While there are jobs that will absolutely and positively remain in the human realm, even these jobs will likely benefit from some sort of robot interaction, which will have to be managed. RSO can also be used to ease the transition and effectively "install airbags" in the automation process: it can help ensure that it’s easy to switch back from a digital worker to a human one if there are unexpected side effects from moving to an automated agent.
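As a rough sketch of that “airbag” idea (the task types, workers, and fallback rule below are assumptions for illustration, not features of any specific RSO product), an orchestrator can route each task to a digital worker first and fall back to a human when the bot cannot handle it:

```python
# Sketch: route tasks to a digital worker, fall back to a human on failure.
# Task types, workers, and the failure condition are hypothetical.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Task:
    kind: str
    payload: dict

def bot_worker(task: Task) -> Optional[dict]:
    """Digital worker: returns a result, or None when it cannot handle the task."""
    if task.kind == "data_entry":
        return {"status": "done", "by": "bot"}
    return None  # unexpected task type -> signal failure

def human_worker(task: Task) -> dict:
    """Human queue: always accepted (handled asynchronously in a real system)."""
    return {"status": "queued_for_human", "by": "human"}

def orchestrate(task: Task, bot: Callable, human: Callable) -> dict:
    """The 'airbag': try the automated agent, switch back to a human if it fails."""
    result = bot(task)
    return result if result is not None else human(task)

print(orchestrate(Task("data_entry", {}), bot_worker, human_worker))     # handled by bot
print(orchestrate(Task("claims_review", {}), bot_worker, human_worker))  # falls back
```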


Maximizing the Potential of Open Source Threat Intelligence Feeds

Open source threat intelligence feeds are appealing for a number of reasons. One of the more obvious reasons is their price: absolutely nothing. This is critical for smaller organizations that lack the resources for robust sources of intelligence. Cost aside, open source threat intelligence is also appealing because it provides a wide scope of information on different industries, topics, and locations. With the collaborative efforts of many contributors, users can benefit from intelligence without the hassle of contracts and data limits. Open source threat intelligence is also popular because much of it derives from honeypots, which are decoy systems used to study invasive behaviors. These open- and closed-source honeypots register anomalies and problematic activity that can then be turned into feeds, software patches, and studies of adversarial behavior.
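As a minimal sketch of consuming such a feed (the feed URL is a placeholder and the plain-text, one-indicator-per-line format is an assumption; real feeds vary in format and licensing), a script can pull a public blocklist and check observed IPs against it:

```python
# Sketch: pull an open-source threat feed and check observed indicators against it.
# The URL and the one-IP-per-line format are assumptions for illustration.
import urllib.request

FEED_URL = "https://example.org/open-threat-feed/ip-blocklist.txt"  # placeholder

def load_feed(url: str) -> set[str]:
    """Download a plain-text feed, skipping comments and blank lines."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        lines = resp.read().decode("utf-8", errors="replace").splitlines()
    return {line.strip() for line in lines if line.strip() and not line.startswith("#")}

def check_indicators(observed_ips, feed: set[str]) -> list[str]:
    """Return the observed IPs that appear in the feed."""
    return [ip for ip in observed_ips if ip in feed]

feed = load_feed(FEED_URL)
hits = check_indicators(["203.0.113.7", "198.51.100.23"], feed)  # documentation IPs
print(f"{len(hits)} observed address(es) matched the feed: {hits}")
```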



Quote for the day:


"If the road is easy, you're likely going the wrong way." -- Terry Goodkind

