Daily Tech Digest - May 02, 2019

AI is already changing how cancer is diagnosed

Thanks to screening programmes, scientific breakthroughs, and technological advances in areas such as genetics and medical imaging, cancer is far more likely to be diagnosed at an early stage than it was several decades ago. However, accuracy in medical imaging diagnosis remains low, with professionals seeing 20-30 percent false negatives in chest X-rays and mammograms. False positive diagnoses (wrongly stating that there is a problem) are also common. AI can help counteract this, and the fact that healthcare is data-rich is an added bonus. The vast majority of AI applications within healthcare leverage machine learning algorithms: the more data they are exposed to, the more likely they are to unearth hidden patterns that can then be used to perform a task without being explicitly programmed to do so. ... “One of the biggest challenges that we wish to address when it comes to a cancer diagnosis is ‘early detection.’ If a patient is diagnosed early, the chance of survival increases exponentially.”


The Zero Server Web Framework Allows Developers to Create Web Applications With No Configuration

The Zero Server web framework allows developers to build web applications with server-side rendering and little to no configuration. The recently released major iteration of Zero accepts a mix of Node.js, React, Vue, HTML, MDX, and static files, with support for Svelte poised to follow suit in upcoming versions. Zero 1.0 features automatic configuration, file-system based routing, automatic dependency resolution, and more. With Zero 1.0, developers organize the pieces of a web application into folders whose structure and content map to the routes served by Zero Server. The file-system based routing maps files to routes according to the file extension. Following old PHP conventions, content that resides in ./api/login.js is exposed at http://<SERVER>/api/login. This holds for any file with a .js extension. Zero thus allows developers to define their API endpoints as individual functions. Zero additionally incorporates a route rewrite mechanism to allow for nested routes. Files with a .jsx extension are expected to contain React code exporting the React component to display at the corresponding route.
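Under this convention, an API endpoint is simply a function exported from a file. The sketch below is illustrative only: it assumes the exported handler receives the standard Node.js request and response objects, which may not match Zero's exact handler shape, and it is written in TypeScript for clarity.

```typescript
// Sketch of a Zero-style endpoint. Zero 1.0 maps ./api/login.js to
// /api/login; the handler signature here (Node's req/res objects) is an
// assumption, and the login logic is purely illustrative.
import { IncomingMessage, ServerResponse } from "http";

export default function login(req: IncomingMessage, res: ServerResponse) {
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify({ route: "/api/login", ok: true }));
}
```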


Microservices introduce hidden security complexity, analyst warns


Microservices security is something that needs to be tackled urgently, but this is challenging, said Balaganski, because there are almost no established design patterns, best practices or standards for the design, implementation and maintenance of microservices. “It is important for organisations to first realise that there is a problem that they were not previously aware of and that they need to start asking the right questions and looking for the answers,” he said. “If organisations are not aware of the problems, they won’t be looking for solutions.” Understanding the basics of how microservices work and the security implications of using this architecture is a good place to start, said Balaganski. “If you don’t know the basics, you can’t plan your further strategy based on an informed risk assessment,” he said. “In terms of finding out what questions to ask, they should be looking at the draft special publication from NIST [the US National Institute of Standards and Technology] on Security Strategies for Microservices-based Application Systems, which is basically a list of things that need to be considered.”


Tips for creating a successful big data lake

Most data collected by enterprises today is thrown away. Some small percentage is aggregated and kept in a data warehouse for a few years, but most detailed operational data, machine-generated data, and old historical data is either aggregated or discarded altogether. That makes it difficult to do analytics. For example, if an analyst recognizes the value of some data that was traditionally thrown away, it may take months or even years to accumulate enough history of that data to do meaningful analytics. The promise of the data lake, therefore, is to be able to store as much data as possible for future use. The data lake is sort of like a piggy bank: you often don’t know what you are saving the data for, but you want it in case you need it one day. Moreover, because you don’t know how you will use the data, it doesn’t make sense to convert or process it prematurely. You can think of it like traveling with your piggy bank through different countries, adding money in the currency of the country you happen to be in at the time and keeping the contents in their native currencies until you decide what country you want to spend the money in.
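In code, the piggy-bank principle amounts to persisting records exactly as they arrive and deferring all interpretation to read time. A minimal sketch, with a hypothetical lake layout:

```typescript
// Minimal sketch of "store raw, convert later": incoming bytes are written
// untouched, partitioned only by source and arrival date. The lake/raw path
// layout is a hypothetical choice, not a standard.
import { promises as fs } from "fs";
import * as path from "path";

async function landRaw(source: string, payload: Buffer): Promise<string> {
  const day = new Date().toISOString().slice(0, 10); // e.g. "2019-05-02"
  const dir = path.join("lake", "raw", source, day);
  await fs.mkdir(dir, { recursive: true });
  // No parsing or aggregation: the data stays in its "native currency"
  // until an analyst decides how to spend it.
  const file = path.join(dir, `${Date.now()}.bin`);
  await fs.writeFile(file, payload);
  return file;
}
```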


Is it still worth becoming a data scientist?

With slowing salary growth among data scientists and signs there may be a glut of junior talent, should aspiring data scientists pause for thought? Boykis’ advice is to consider getting into the field by the “back door”: starting out in a tangentially related role, such as junior developer or data analyst, and working your way towards becoming a data scientist, rather than aiming straight for data scientist as a career. Stack Overflow’s Silge has a slightly different interpretation of why salaries are levelling out and believes people shouldn’t necessarily be deterred from entering the industry. “I think that what we’re seeing is a little bit of the standardization and the professionalization of data science,” she said. “The past ten years have been a bit of the Wild West when it comes to data science. ‘How do you become a data scientist?’ has been a really open question. “I see the industry moving towards some consensus around ‘What does it mean to be a data engineer?’ and ‘What does it mean to be a data scientist?’. “When you get to that stage it becomes easier to hire for those roles, and when these roles are easier to hire for you don’t have the crazy salary situation we had before.”


CIO interview: Mark Holt, CTO, Trainline

“It’s just an amazing group of people,” he says. “We’ve gone from a quite slow-moving environment to one where we operate at e-commerce pace. We do more than 300 production releases every week. We have a team who are able to operate at that pace – and that requires a particular group of individuals, with the right skillset, attitude and approach.” Holt says it is not easy to find such talented professionals. He recognises that these highly skilled individuals are the types of people that Google or Facebook are looking to hire, and says the key to success is to continually think about how people work, the roles they fulfil and the supportive environment the business needs to create. “We focus on culture,” he says. “I like the phrase ‘intentional’ – we pay attention to our culture, we care about it and we nurture it on a daily basis. A lot of my conversations with my direct reports will be about culture and the cultural impact of doing something. If we make a change or move something around, what will happen? How does it feel to be in the development and infrastructure teams at Trainline?”


Revolutionary data compression technique could slash compute costs

In Zippads, as the new system is called, stored object hierarchical levels (called “pads”) are located on-chip and are directly accessed. The different levels (pads) have different speed grades, with newly referenced objects placed in the fastest pad. As a pad fills up, it begins evicting older, less active objects, ultimately recycling data that occupies desirable fast space but isn’t being used. Cleverly, at the fast level, objects aren’t even compressed, but as they prove less useful they get kicked down to compressed, slower-to-access, lower-importance pads, and are brought back up as necessary. Zippads would “see computers that can run much faster or can run many more apps at the same speeds,” an MIT News article says. “Each application consumes less memory, it runs faster, so a device can support more applications within its allotted memory.” Bandwidth is freed up, in other words. “All computer systems would benefit from this,” Daniel Sanchez, a professor of computer science and electrical engineering, says in the article. “Programs become faster because they stop being bottlenecked by memory bandwidth.”
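As a rough software analogy (Zippads itself is a hardware memory hierarchy, not a library), one can picture a two-level cache whose fast level holds objects uncompressed and whose eviction path compresses them into a slower level:

```typescript
// Toy analogy for the pad hierarchy: a small fast "pad" of uncompressed
// objects backed by a slower pad of compressed evictees. A conceptual
// sketch only, not the MIT design.
import { deflateSync, inflateSync } from "zlib";

class Pads {
  private fast = new Map<string, Buffer>(); // uncompressed, LRU order
  private slow = new Map<string, Buffer>(); // compressed
  constructor(private fastCapacity: number) {}

  put(key: string, value: Buffer): void {
    this.fast.delete(key);
    this.fast.set(key, value); // most recently used goes last
    while (this.fast.size > this.fastCapacity) {
      // Evict the least recently used object, compressing it on the way down.
      const oldKey = this.fast.keys().next().value as string;
      const oldVal = this.fast.get(oldKey)!;
      this.fast.delete(oldKey);
      this.slow.set(oldKey, deflateSync(oldVal));
    }
  }

  get(key: string): Buffer | undefined {
    const hit = this.fast.get(key);
    if (hit) {
      this.put(key, hit); // refresh recency
      return hit;
    }
    const cold = this.slow.get(key);
    if (!cold) return undefined;
    this.slow.delete(key);
    const value = inflateSync(cold);
    this.put(key, value); // promote back up to the fast pad
    return value;
  }
}
```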


4 best practices for improving governance strategies

As the role of technology in corporate America has evolved over the past 30-plus years, the term “IT governance” has risen to prominence. In the most basic sense, IT governance is a formal framework that gives organizations structure for ensuring all IT investments and systems support core business objectives. In other words, it helps align IT strategy with business strategy. “As changes rapidly occur, it is essential to have a well-defined IT governance framework, a state of compliance within regulatory requirements, and a preemptive approach to IT business risks,” Arbour Group explains. For large organizations that have dozens of priorities and millions of dollars invested in various strategies at any point in time, IT governance is an absolute necessity. Failure to articulate the correct approach to IT governance can result in costly mistakes that prevent the organization from being successful. For business leaders who haven’t empowered their companies with IT governance, or even those who have but know they aren’t taking full advantage, there’s ample room for improvement.


Automating trust with new technologies


The need for trust starts when a product or component leaves the factory or farm. A manufacturer that has implemented automated trust creates a digital “birth certificate” with specifications, provenance, cost, and other relevant data. It then enters this birth certificate (usually carried on an IoT tag) into its existing ERP system, integrated with blockchain to create a secure, immutable, cryptographically sealed record. This record is instantly available, in identical form, on the servers of all the participants in the supply chain, such as the manufacturer, logistics providers, distributors, and wholesalers. Next come IoT sensors, which record location, temperature, ambient vibration, and other measures to provide continuous end-to-end provenance. The logistics provider scans the sensors to connect them to the blockchain and to the digital birth certificate. As assets change location and condition, the IoT sensors gather the data, and the blockchain stores it, securely and immutably, with a timestamp on the servers of all of the participants.
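Conceptually, each event, whether the birth certificate or a later sensor reading, becomes a timestamped entry whose hash covers the previous entry; that chaining is what makes the shared record tamper-evident. A simplified sketch follows (the shapes and field names are illustrative, not any specific blockchain platform's API):

```typescript
// Simplified hash-chained record: altering any past entry breaks every
// hash after it. Field names and record shapes are illustrative assumptions.
import { createHash } from "crypto";

interface Entry {
  timestamp: string;
  data: Record<string, unknown>; // birth certificate or sensor reading
  prevHash: string;
  hash: string;
}

function append(chain: Entry[], data: Record<string, unknown>): Entry[] {
  const prevHash = chain.length ? chain[chain.length - 1].hash : "GENESIS";
  const timestamp = new Date().toISOString();
  const hash = createHash("sha256")
    .update(prevHash + timestamp + JSON.stringify(data))
    .digest("hex");
  return [...chain, { timestamp, data, prevHash, hash }];
}

// Birth certificate first, then readings as the asset moves through the chain.
let ledger = append([], { type: "birth", sku: "PUMP-7", origin: "Plant A" });
ledger = append(ledger, { type: "reading", tempC: 4.2, location: "Rotterdam" });
```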


Using TypeScript with the MySQL Database

TypeScript, introduced in 2012, has seen a recent surge in popularity. A recent JavaScript and Web Development InfoQ Trends Report notes that "TypeScript has had a dramatic rise in popularity, now in the top 10 most popular programming languages on GitHub...". In June 2018, TypeScript made its debut in the top 100 of the TIOBE Index of programming languages at #93, and the following month it was ranked in the top 50. More recently, TypeScript is ranked 44th on the TIOBE index. TypeScript has emerged as a powerful environment for authoring web applications, providing significant improvements over standard JavaScript while remaining consistent with the language. In this article we'll explore the details necessary to use TypeScript with Node.js, MySQL, and TypeORM to create a powerful solution for managing database access with server-side TypeScript. We'll build an example CRUD application to provide a complete end-to-end solution. The example application will model a journal catalog; we assume familiarity with JavaScript.
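As a preview of the approach, here is a minimal TypeORM entity and connection for a journal catalog; the entity fields and connection settings are illustrative guesses, not the article's exact listing:

```typescript
// Illustrative TypeORM (0.2.x-era) sketch: a Journal entity persisted to
// MySQL. Field names and credentials are assumptions for the example.
import "reflect-metadata";
import {
  Entity, PrimaryGeneratedColumn, Column, createConnection,
} from "typeorm";

@Entity()
export class Journal {
  @PrimaryGeneratedColumn()
  id!: number;

  @Column()
  title!: string;

  @Column()
  publisher!: string;
}

async function main() {
  const conn = await createConnection({
    type: "mysql",
    host: "localhost",
    username: "root",
    password: "secret",
    database: "catalog",
    entities: [Journal],
    synchronize: true, // dev-only convenience: auto-creates the table
  });
  const repo = conn.getRepository(Journal);
  await repo.save(repo.create({ title: "JOSS", publisher: "Open Journals" }));
  console.log(await repo.find());
  await conn.close();
}

main().catch(console.error);
```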



Quote for the day:


"Knowledge is like underwear. It is useful to have it, but not necessary to show it off." -- Bill Murray

