Daily Tech Digest - December 15, 2020

Blockchain Vs Relational Database: What’s The Difference?

So, what is blockchain technology? At its core, it is a decentralized, distributed ledger system that also offers data integrity, transparency, and more. In simple terms, data in the ledger takes on a chain-like structure: imagine blocks that are interlinked, with each block linked to the block before it and the block after it. Together, the blocks form a chain of blocks, hence the name. Every block on the ledger holds data about transactions, and that transactional data is secured with cryptography. Another notable property is that each block carries a cryptographic hash ID that no one can feasibly reverse-engineer. You might think of blockchain as just a database that stores information, but the difference is immense; in reality, the two are quite different, and we’ll get into that shortly in the blockchain vs relational database comparison. Blockchain is immutable by default, which means no one can modify any data once it is recorded. Any information that enters the system can never be altered or deleted, so it stays in the ledger forever.
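To make the chain-of-blocks idea concrete, here is a minimal illustrative sketch in Python; the `Block` class and its fields are assumptions for illustration, not a real blockchain implementation. Each block stores the hash of its predecessor, so tampering with any earlier block changes its hash and detectably breaks every link after it:

```python
import hashlib
import json
import time

class Block:
    """Minimal block: links to its predecessor via that block's hash."""
    def __init__(self, data, prev_hash):
        self.timestamp = time.time()
        self.data = data            # e.g. transaction records
        self.prev_hash = prev_hash  # hash of the previous block
        self.hash = self.compute_hash()

    def compute_hash(self):
        payload = json.dumps(
            {"ts": self.timestamp, "data": self.data, "prev": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny chain: each block points at the hash of the one before it.
genesis = Block("genesis", prev_hash="0" * 64)
block1 = Block({"from": "alice", "to": "bob", "amount": 5}, genesis.hash)
block2 = Block({"from": "bob", "to": "carol", "amount": 2}, block1.hash)

# Tampering with block1 changes its hash, so block2's stored prev_hash
# no longer matches -- the break in the chain is detectable.
block1.data = {"from": "alice", "to": "mallory", "amount": 500}
assert block2.prev_hash != block1.compute_hash()
```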


6 Cloud Native Do’s and Don’ts for Developers

It’s easy to get so caught up in the question of what technologies you’re using that you forget why you’re using them in the first place. But remember that adopting cloud infrastructure — whether it’s a Kubernetes cluster in your own data center or a serverless API in the public cloud — isn’t the goal. The goal is to help your organization build more scalable and flexible applications, and to do it more quickly. If you’re not actually taking into account the advantages and disadvantages of cloud infrastructure when you build applications, there’s a good chance you’re not meeting your organization’s real goals. ... Nodes crash. Networks fail. Remote APIs give unexpected results. Cloud native development requires you to handle these problems gracefully. Applications need to give users some sort of response, even if a component, or several components, are broken or unresponsive. You also need to think about how to recover once the broken or unavailable component is working again. Check out the Reactive Principles for additional guidance and techniques for getting started. ... Cloud native applications have unique compliance and security challenges.
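As a hedged illustration of handling a flaky remote dependency gracefully, the sketch below retries a call with exponential backoff and degrades to a safe default instead of failing outright. The endpoint URL, fallback payload, and `fetch_recommendations` function are illustrative assumptions, not from the article:

```python
import time
import urllib.error
import urllib.request

def fetch_recommendations(user_id, retries=3, backoff=0.5):
    """Call a (hypothetical) remote API, retrying transient failures."""
    url = f"https://api.example.com/recommendations/{user_id}"
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError):
            time.sleep(backoff * 2 ** attempt)  # exponential backoff
    # Degrade gracefully: serve a safe default instead of an error.
    return b'{"recommendations": []}'
```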


Security considerations for OTA software updates for IoT gateway devices

Security is a process and a mindset. There is no magic switch we can toggle to make a system secure. It is important to stay vigilant, reviewing existing security flaws and adapting your workflow to account for them. New classes of attacks appear seemingly every day, and engineering teams must prepare for this in order to remain secure. The white hats have to get it right every time, while the black hats only need to get it right once. You need to identify which resources are worth protecting. A database of weather readings is unlikely to contain proprietary information, whereas a customer database most certainly does. You will want to tailor the security to match the severity of a breach. The objective of most security measures is to increase the cost of an attack or reduce the value of any successful breach. It is important to realize that the OTA update system is generally only concerned with potential attacks on, and vulnerabilities in, the update process itself. It does not provide any protection against attacks that happen outside of that process. For those kinds of attacks, you need to rely on other components provided by your operating system. One extremely important general security consideration is the principle of least privilege.
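One common way to raise the cost of attacking the update process itself is to sign update images and have devices verify them before installation. Below is a minimal sketch of that idea using Ed25519 signatures and the Python `cryptography` package; the `apply_update` function and the partition detail are illustrative assumptions, not any specific OTA product’s API:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def apply_update(image: bytes, signature: bytes, public_key) -> bool:
    """Install an OTA image only if its signature checks out."""
    try:
        public_key.verify(signature, image)  # raises on tampering
    except InvalidSignature:
        return False  # reject the update; leave the device untouched
    # ... write image to the inactive partition, then reboot into it
    return True

# Demo: the vendor signs the image at build time ...
vendor_key = Ed25519PrivateKey.generate()
image = b"firmware-v2.bin contents"
signature = vendor_key.sign(image)

# ... and the device, which ships only the public key, verifies it.
assert apply_update(image, signature, vendor_key.public_key())
assert not apply_update(image + b"!", signature, vendor_key.public_key())
```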


Microsoft and the State of Quantum: Q&A With Mariia Mykhailova

The existing quantum hardware is just not mature enough to run quantum algorithms that solve real-world problems, both in terms of the number of qubits in the devices and their quality. However, quantum computing can have an impact today – it just requires some extra creativity! We call these solutions “quantum-inspired algorithms” – algorithms that were developed with quantum processes in mind but run on classical hardware. ... Microsoft Quantum’s mission is to develop a scalable and open quantum system and the ecosystem around it. This means that we’re working on building a full-stack quantum system, and that stack has a lot of layers. Some of these get a lot of publicity, such as the Microsoft Quantum Development Kit, or the quantum hardware and the fundamental physics research required to implement our vision for it, the topological qubits. But there are other, lesser-known but no less important layers of the stack between these two, such as the qubit control technology that has to support scaling quantum systems to millions of qubits, way beyond the physical limitations of current systems. That being said, solving the world’s intractable problems is certainly not a single-company effort!


An Introduction to Blockchain + NoSQL Databases

Despite the benefits, distributed computing is not pervasive; even within modern enterprises, centralization of many systems is still quite common. This includes industries that you would expect to be designed with more resiliency in mind, like global financial systems or supply chain management, which have tended to be centralized around mainframe computing. By the way, you can always tell when there is a centralized system, because when it fails, it fails absolutely! When all data or services run on a single machine, it is quite easy to know when it goes down, because everything completely stops. The outage may drag on because it takes time to start up a replacement machine, or to notice the failure before re-routing users, or for a myriad of other devastating engineering reasons. A centralized system is the opposite of the peer-to-peer networks we aspire to. However, with the introduction of platforms like Bitcoin, the next generation of digital currencies and “ledgers” is slowly being proven out. Now there are thousands of different cryptocurrencies and dozens of blockchain backends that take advantage of decentralized technology. As an aside, note that “distributed ledger” does not equate to the proof-of-work schemes that many cryptocurrencies use.
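To make that aside concrete: proof-of-work is a costly consensus puzzle that some cryptocurrencies layer on top of a ledger; a distributed ledger as such does not require it. A minimal illustrative sketch, not any particular cryptocurrency’s scheme:

```python
import hashlib

def proof_of_work(block_header: bytes, difficulty: int = 4) -> int:
    """Find a nonce so the block hash starts with `difficulty` hex zeros.

    Lower difficulty = cheaper puzzle; real networks tune it so the
    brute-force search stays expensive for attackers.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_header + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # proof that work was spent on this header
        nonce += 1

print(proof_of_work(b"example-header"))  # brute-force search for a valid nonce
```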


Ethical design thinking: empowering designers to drive ethical change

Designers have started to recognise that some of what they have created is harming people. They are now starting to look at the use of technology and its long-term impact, with ethical design at the centre of their thinking. Whatever their motivation, companies have accepted that AI bias exists and are changing how they harvest and use people’s data — and designers are central to this change in strategy. “The core is really around pivoting from what can be done, with the designer coming in at a later stage, to thinking about what should be done, with the designer coming in at the beginning of the process,” says Woodley. “The designer represents the human. They create what is consumed by the person, and so they should be the ones influencing the line between what the business wants, what is possible from a technology perspective and what is responsible from an ethical perspective,” she continues. Design thinking, which starts with empathy, or the understanding of the human, needs to be at the forefront of future technology innovations and services. We need to flip the current model: instead of leveraging technology to achieve business goals without taking the human impact into consideration, we need to put the human at the centre of our technology endeavours.


What’s at stake in the Computer Fraud and Abuse Act (CFAA)

Intended as the United States’ first anti-hacking law, the CFAA was enacted almost thirty-five years ago, long before lawyers and technologists had any sense of how the Internet would proliferate and evolve. In fact, the Act is outdated enough that it specifically excludes typewriters and portable handheld calculators as types of computer. Since its inception, it has been robustly applied to everything from basic terms-of-service breaches, like the infamous case of Aaron Swartz downloading articles from the digital library JSTOR, to indicting nation-state hackers and extraditing Julian Assange. The core of the problem lies in the vague, perhaps even draconian, description of “unauthorized” computer use. While the law has been amended several times, including to clarify the definition of a protected computer, the ambiguity of unauthorized access puts the average consumer at risk of breaking federal law. According to the Ninth Circuit, you could potentially be committing a felony by sharing subscription passwords. The stakes are particularly high for security researchers who identify vulnerabilities for companies without safe-harbor or bug-bounty programs. White-hat hackers, who act in good faith to report vulnerabilities to a company before it is breached, face the same legal risks as cybercriminals who actively exploit and profit from those vulnerabilities.


Take any open source project — its contributors cut across national, religious and racial lines

“Open source is not all technical, and there is a strong community angle to this. During my college days, I was involved in small ways with local user groups, where I used to conduct classes and tutorials on various topics. Once I moved to Bengaluru to work, I got heavily involved in the Python community and organised the first Python conference in India in 2009. PyCon India was probably one of the first language-specific tech conferences in India, and it has since grown to be one of the largest PyCons in the world. This year, due to the coronavirus situation, we’re conducting the conference online. I’m also an active contributor to the Stack Overflow website, where I rank among the top 0.29 per cent worldwide for answering questions.” Ibrahim feels that a lot of people don’t seem to realise that contributing something significant to a project requires a large amount of work. The main challenge is to develop the patience and dedication to spend enough time understanding a project so that one can contribute to it. There are smaller problems, such as some projects not having enough contributors to help with technical questions, but overall, the main problem is the lack of discipline to put in the time necessary to achieve some level of proficiency.


Hear the Music Behind Your Data

When faced with the troves of data piling up daily, companies can quickly become overwhelmed. They’re unsure of where to begin an analysis of the connections between data points. Data science is about exploring and seeking patterns within data, so it plays a pivotal role in getting companies started with their analyses. Oftentimes, data scientists won’t even know the question before they explore; instead, they’ll use their tools to identify emerging trends and patterns. Capturing and interpreting those patterns can provide tremendous benefits to a company. For example, data can help you catch bots that sign up and then spam your product. Human interaction with a product produces certain patterns — behavior forms a shape. You can compare that behavior shape to potentially anomalous datasets and determine whether a user is human. That gives your team confidence in disconnecting likely bots from your system, which can save a fair amount of server space and money. Music is all about patterns, too. Composing a musical piece requires understanding how notes and the spaces between them fit together to create cohesive patterns. Every song you’ve ever heard has a particular waveform derived from unique patterns of notes and spaces.
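As a hedged sketch of the “behavior forms a shape” idea, one could represent each session as a vector of behavioral features and flag sessions whose shape diverges sharply from a human baseline. The features, numbers, and threshold below are illustrative assumptions, not the article’s method:

```python
import math

def cosine_similarity(a, b):
    """How closely two behavior 'shapes' align, ignoring magnitude."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Illustrative features per session: mean seconds between actions,
# fraction of actions that are form submissions, pages visited per minute.
human_baseline = [4.2, 0.05, 3.1]

def looks_like_bot(session_features, threshold=0.9):
    """Flag sessions whose behavioral shape diverges from the baseline."""
    return cosine_similarity(session_features, human_baseline) < threshold

print(looks_like_bot([3.8, 0.07, 2.9]))   # human-like pace -> False
print(looks_like_bot([0.1, 0.95, 60.0]))  # rapid-fire spam -> True
```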


The Private Sector Needs a Cybersecurity Transformation

Fundamentally, the current approach to security is focused on the past — even if it’s just a few milliseconds ago. Identifying a threat that already occurred and stopping the next one is not protection. And with the advances in technology available today, it should not be the accepted protocol for our industry. When a time-consuming analysis results in the conclusion “we can block this attack next time,” you are nowhere close to secure. Simply put, this approach does nothing to account for the agile adversaries that we know exist. Staying agile in this fight means looking forward, not back. For that to be a reality, however, time plays a crucial role. Research from the Ponemon Institute shows that security teams spend at least 25% of their time chasing false positives. I’d argue it’s even higher. Defense cannot continue to be about uncovering threats that have already happened while trying to block them again. Time has to be spent on truly preventing what’s coming next. ... While hygiene is important, there is very little prevention going on at the threat level. Well-meaning employees have been stretched so thin that they find post-event response acceptable and equate it to cybersecurity. Sometimes hygiene equates to patching, but often there is a good reason why you can’t patch.



Quote for the day:

“The real voyage of discovery consists not in seeking new landscapes but in having new eyes.” -- Marcel Proust
