Daily Tech Digest - June 27, 2018

The Future of Decentralization


Just as alternative, renewable, natural and sustainable energy can contribute to a cleaner environment, so too must the infrastructure of politics, economics and the motives of the web be remade with more intention, so that technology truly serves humanity and not the other way around. This will take time, likely decades if not longer. It will take the help of whatever artificial intelligence becomes. ... Decentralization requires a human-AI hybrid civilization. It requires automation, robots, and new kinds of jobs, roles and values not yet invented, implemented or even imagined. Radical decentralization also requires leadership: humans with a different consciousness that is more inclusive, more augmented with data and wiser than any human leaders in history. The future of decentralization is global; it's not something Scandinavia or China arrives at before others, though they may certainly embody some elements of it first. A decentralized internet, like solutions on the blockchain, will require upgrades, better iterations, improved software, smarter smart contracts, quantum infrastructures, and AIs that police them to make sure they remain what they were intended to be.


IT chiefs keep obsolete systems running just to keep data accessible

One of the chief problems with keeping aging systems running relates to security, as the research highlights. 87 per cent of the IT decision makers in the survey sample agree that legacy applications on older operating systems are more vulnerable to security threats. At the same time, 82 per cent recognize that old or legacy systems are rarely compatible with modern security and authentication methods. “On older systems some security vulnerabilities are harder – or even impossible – to resolve. If available at all, patches for new threats could be delayed because legacy apps are considered less of a priority,” says Jim Allum. “As legacy applications pre-date the latest security innovations, there is a clear security risk to having a lot of legacy within your application portfolio.” A related issue is compliance, with 84 per cent of the sample agreeing that on old/legacy applications it is harder to accurately track and control access to sensitive data in line with stricter data privacy regulations such as the GDPR.



Global IoT security standard remains elusive


Despite the cacophony of approaches towards IoT security, Kolkman noted that most are underpinned by common IT security principles. “If you look at the different IoT security frameworks, there seems to be consensus on things like upgradability and data stewardship – even if there’s no global standard that describes it all,” he said. These principles are reflected in a set of enterprise IoT security recommendations released by the Internet Society this week. Among them is the need for companies to closely follow the lifecycle of IoT devices, which should be decommissioned once they are no longer updatable or secure. Meanwhile, the Internet Society’s Internet Engineering Task Force is also working on IoT standards in areas including authentication and authorisation, cryptography for IoT use cases and device lifecycle management. With cyber security at the top of most national security agendas today, Kolkman said the Internet Society has reached out to policy makers to provide recommendations about what they can do, such as setting minimum standards of IoT security and accountability.
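
As a minimal illustration of that lifecycle recommendation, the sketch below flags fleet devices for decommissioning once vendor update support has lapsed. The inventory, device IDs and dates are hypothetical, not drawn from the Internet Society's recommendations; a real deployment would pull this data from an asset-management system.

    from datetime import date

    # Hypothetical device inventory with each unit's update-support cutoff.
    inventory = [
        {"id": "cam-01",  "update_support_ends": date(2018, 1, 31)},
        {"id": "lock-07", "update_support_ends": date(2020, 6, 30)},
    ]

    def to_decommission(devices, today):
        """A device that can no longer receive updates is no longer securable."""
        return [d["id"] for d in devices if d["update_support_ends"] < today]

    print(to_decommission(inventory, date(2018, 6, 27)))  # ['cam-01']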


A CIO on Carrying the Burden of Medical Device Cybersecurity

The situation "has created significant challenges, because ... those devices sit in our networks and infrastructures from the technology side, and we're now held responsible to remediate those issues," says Earle, who is also chair of the College of Healthcare Information Management Executives - or CHIME - board of trustees. "Many of those devices are very proprietary and it's very difficult to manage them because you would need to put in some kind of solution that ... monitors devices - and the proprietary nature of those devices makes that very challenging to do," he says in an interview with Information Security Media Group. "It's a lack of standards as well as a lack of characterization of those standards that makes this challenging. There's no true vulnerability disclosure associated with these devices. Suppliers should provide documentation of the vulnerabilities of their products like they would normally do for anything else in a situation like that. We need to ask for greater risk sharing."


Know your enemy: Understanding insider attacks


When an enterprise establishes an insider threat program, executives need to be aware of the potential negative effects it can have on employee morale and sensitivity to loss of privacy. Implementing an insider threat program requires increased communication with staff to explain the program, show how they can help, and place frequent emphasis on program wins. The 2016 Ponemon Institute report "Tone at the Top and Third Party Risk" noted that "If management is committed to a culture and environment that embraces honesty, integrity and ethics, employees are more likely to uphold those same values. As a result, such risks as insider negligence and third party risk are minimized." An insider threat program should also include a steering board or committee. Ideally, such a committee should include representatives from law, intellectual property, the office of internal governance, global privacy, human resources, information technology, corporate communications and security.


The future of consumer MDM: Cloud, referential matching and automation

Current MDM technologies typically use “probabilistic” and “deterministic” matching algorithms to match and link consumer records across an enterprise and to ensure there is only one master record for each consumer. These algorithms match records by comparing the demographic data contained in those records—data such as names, addresses, and birthdates. But demographic data is notoriously error-prone, frequently incomplete and constantly falling out of date. And because probabilistic and deterministic matching algorithms are only as accurate as the underlying demographic data they compare, their accuracy is fundamentally limited by the availability and quality of that data. But there is a new paradigm in identity matching technology called “referential matching” that is not subject to these same fundamental limits. Rather than directly comparing the demographic data of two consumer records to see if they match, referential matching technologies instead compare the demographic data from those records to a comprehensive and continuously updated reference database of identities.
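
The distinction is easier to see in code. The sketch below is a toy illustration under assumed data: the reference database, identity IDs and demographic fields are all made up, and production referential matching engines work against far larger, continuously updated identity stores.

    def deterministic_match(rec_a, rec_b):
        """Exact field-by-field comparison: only as good as the data itself."""
        fields = ("name", "address", "birthdate")
        return all(rec_a.get(f) == rec_b.get(f) for f in fields)

    # A reference database maps a stable identity ID to the many demographic
    # variants (old addresses, name changes, typos) observed for that identity.
    REFERENCE_DB = {
        "id-001": [
            {"name": "Jane Doe",   "address": "12 Oak St",  "birthdate": "1980-01-02"},
            {"name": "Jane Smith", "address": "98 Elm Ave", "birthdate": "1980-01-02"},
        ],
    }

    def referential_match(rec_a, rec_b):
        """Two records match if both resolve to the same reference identity."""
        def resolve(rec):
            for identity, variants in REFERENCE_DB.items():
                if any(deterministic_match(rec, v) for v in variants):
                    return identity
            return None
        id_a, id_b = resolve(rec_a), resolve(rec_b)
        return id_a is not None and id_a == id_b

    # Direct comparison fails (different surname and address), but both
    # records resolve to the same reference identity, so they still match.
    a = {"name": "Jane Doe",   "address": "12 Oak St",  "birthdate": "1980-01-02"}
    b = {"name": "Jane Smith", "address": "98 Elm Ave", "birthdate": "1980-01-02"}
    print(deterministic_match(a, b))  # False
    print(referential_match(a, b))    # True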


The ML.Net project version 0.2 is available for .Net Core 2.0 and .Net Standard 2.0, with support for the x64 architecture only (Any CPU will not compile right now). It should thus be applicable in any framework where .Net Standard 2.0 (e.g. .Net Framework 4.6.1) is applicable. The project is currently under review, and APIs may change in the future. Learning the basics of machine learning has not been easy if you want to use an object-oriented language like C# or VB.Net, because most of the time you have to learn Python before anything else, and then you have to find tutorials with sample data that can teach you more. Even looking at object-oriented projects like Accord.Net, TensorFlow, or CNTK [1] is not easy, because each of them comes with its own API and its own way of implementing the same things. I was thrilled by the presentations at Build 2018 [2] because they indicated that we can use a generic workflow approach that allows us to evaluate the subject with local data, local .Net programs, local models, and local results, without having to use a service or another programming language like Python.


Could blockchain be the missing link in electronic voting?

To bolster the security, accuracy and efficiency of elections, some suggest the implementation of blockchain technology. Blockchain is a decentralised, distributed, electronic ledger used to record transactions in such a way that they can't subsequently be altered without the agreement of all parties. Thousands of network nodes are needed to reach consensus on the order of ledger entries. Most famously, blockchain is used for bitcoin transactions, but it's finding use cases in everything from storing medical records to authenticating physical transactions. Such is the level of interest in blockchain technology that governments are even examining its potential use cases. Blockchain-enabled elections have already taken place: in March, Sierra Leone voted in its presidential elections, and votes in the West Districts were registered on a blockchain ledger by Swiss-based firm Agora. By storing the data in this way, election data was "third-party verifiable and protected against any possibility of tampering," the company said, with the results publicly available to view.
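
A toy hash-chained ledger shows why tampering is detectable. This sketch is illustrative only and assumes a single node with made-up ballots; systems such as Agora's additionally distribute the ledger across many nodes that must reach consensus on its contents.

    import hashlib
    import json

    def make_block(prev_hash, vote):
        """Each block's hash covers its vote and the previous block's hash."""
        payload = json.dumps({"prev": prev_hash, "vote": vote}, sort_keys=True)
        return {"prev": prev_hash, "vote": vote,
                "hash": hashlib.sha256(payload.encode()).hexdigest()}

    def verify(chain):
        """Any altered vote changes its block hash and breaks every later link."""
        for i, block in enumerate(chain):
            payload = json.dumps({"prev": block["prev"], "vote": block["vote"]},
                                 sort_keys=True)
            if hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
                return False
            if i > 0 and block["prev"] != chain[i - 1]["hash"]:
                return False
        return True

    chain = [make_block("genesis", {"ballot": 1, "choice": "A"})]
    chain.append(make_block(chain[-1]["hash"], {"ballot": 2, "choice": "B"}))
    print(verify(chain))              # True
    chain[0]["vote"]["choice"] = "B"  # tamper with a recorded vote
    print(verify(chain))              # False: the chain exposes the change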


Secure by Default Is Not What You Think

Once a product is built to be secure by default, it still needs to remain that way once deployed in its environment, which is increasingly complex and interconnected. That’s why the first responder — the person installing the product, application, or database — is ever more important. To keep the organization and users safe, the first responder needs to apply general principles, such as configuring controls to be as secure as possible, enabling encryption at rest and SSL/TLS secure communication channels, restricting access to applications or data to only those people who need it, and requiring authentication that relies on trusted identity sources. Certificate- or key-based authentication is also a consideration. General principles can guide administrators, yet one size does not fit all; administrators also have to tailor approaches to specific environments. What banks need from their databases, applications, and other technologies, for instance, is different from what oil companies or intelligence agencies need. Whatever the industry, someone needs to watch the whole picture. For instance, a database sits between an application above it and an operating system below it.
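
As one concrete example of those first-responder duties, the sketch below refuses plaintext and legacy protocols when opening a channel to a back-end service, using Python's standard ssl module. The host name, port and CA bundle path are placeholders, not settings from any particular product.

    import socket
    import ssl

    # Require a trusted CA, hostname verification and TLS 1.2 or newer.
    # create_default_context already verifies certificates; restating the
    # settings documents the secure baseline explicitly.
    context = ssl.create_default_context(cafile="/etc/pki/ca-bundle.pem")
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    context.verify_mode = ssl.CERT_REQUIRED
    context.check_hostname = True

    with socket.create_connection(("db.example.internal", 8443)) as sock:
        with context.wrap_socket(sock, server_hostname="db.example.internal") as tls:
            print(tls.version())  # negotiated protocol, e.g. 'TLSv1.3'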


Underground vendors can reliably obtain code signing certificates from CAs

The researchers were also surprised to find that all vendors opt to sell the anonymous code signing certificates to malware developers outright rather than provide a signing service for a fee. “All vendors claim that their certificates are freshly issued, that is, that they have not been stolen in any way but obtained directly from a CA. Further, all vendors claimed that they sell one certificate into the hands of just one client, and some even offered free or cheap one-time reissue if the certificate was blacklisted too soon. Vendors did not appear to be concerned with revocation, often stating that it usually ‘takes ages’ until a CA revokes an abused certificate,” they shared. “Some vendors even claim to obtain the certificate on demand, having the certificate issued once a customer pays half of the price. Interestingly, [one vendor] even claims that he always has a few publisher identities prepared and the customer can then choose which of these publisher names he wants to have his certificate issued on.”
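
For illustration, the sketch below uses the Python cryptography library to check whether a certificate file is even entitled to sign code. The file path is a placeholder; real abuse detection would also need CRL/OCSP revocation checks, the very step the vendors above count on being slow.

    from cryptography import x509
    from cryptography.x509.oid import ExtendedKeyUsageOID

    # Load a PEM-encoded certificate from a hypothetical path.
    with open("suspect_cert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    # Raises ExtensionNotFound if the cert has no extended key usage at all.
    eku = cert.extensions.get_extension_for_class(x509.ExtendedKeyUsage).value
    print("Publisher:   ", cert.subject.rfc4514_string())
    print("Code signing:", ExtendedKeyUsageOID.CODE_SIGNING in eku)
    print("Valid until: ", cert.not_valid_after)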



Quote for the day:


"The highest reward for a man's toil is not what he gets for it but what he becomes by it." -- John Rushkin

