
Daily Tech Digest - May 03, 2022

5 Elements of High-Quality Software

The software architecture lays the foundation for a software project and lets programmers keep working on that project for many years. The entire software system's lifetime, maintainability, and market success depend on this foundation. Late architectural changes are usually time-consuming and costly for any software development team; in other words, it's practically impossible to change the foundation once a house is built on top of it. Therefore, we should strive to select the optimal architectural pattern on the first implementation attempt. It's better to spend more time on early architectural decisions than to spend it later fixing the side effects of hasty, non-optimal ones. First, work on the software system's core functionality and stabilize it; at this stage, architectural changes are still relatively cheap. Next, build features on top of the core's functionality. Even with a monolithic pattern, you can detach the core logic from features at the source-code level, if not at the architectural level.
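
As a minimal sketch of what that source-level separation could look like in a Go monolith (all package, type, and function names here are hypothetical, not taken from the article), feature code depends only on a narrow interface exposed by the core rather than reaching into its internals:

```go
package main

import "fmt"

// Order is part of the core: stable domain logic with no knowledge
// of any particular feature.
type Order struct {
	ID    string
	Total float64
}

// OrderStore is the narrow seam the core exposes to feature code.
type OrderStore interface {
	Save(o Order) error
}

// memoryStore is a feature-level implementation; it can change or be
// extracted into a separate service later without touching the core.
type memoryStore struct{ orders map[string]Order }

func (m *memoryStore) Save(o Order) error {
	m.orders[o.ID] = o
	return nil
}

// placeOrder is a feature workflow built only on the core abstraction.
func placeOrder(store OrderStore, o Order) error {
	return store.Save(o)
}

func main() {
	store := &memoryStore{orders: map[string]Order{}}
	if err := placeOrder(store, Order{ID: "42", Total: 19.99}); err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println("orders stored:", len(store.orders))
}
```

If a feature later needs to become its own service, only the implementation behind the interface has to move; the core stays untouched.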


SSE kicks the ‘A’ out of SASE

Now comes security service edge (SSE), which pulls back the security functions in SASE into a unified services offering that includes CASB, zero-trust network access (ZTNA) and secure web gateway (SWG). SSE came in the wake of the COVID-19 pandemic, which sent most employees home to work and put in motion the ongoing trend toward hybrid work. With many people working from home at least part of the time, the role of branch offices is lessened and the need for security features that follow workers wherever they are, with work days starting from home and then moving to offices or other locations, is growing. What role SSE plays in the larger network security space and what it means for the future of SASE are subjects of some debate in the industry. However, it puts a spotlight on the ongoing evolution of networking as the definition of work continues to change and the focus of IT shifts from the traditional central data center to data and workloads in the cloud and at the edge. Once the pandemic hit, "it was no longer about branch offices," said John Spiegel, director of strategy at Axis Security, which in April launched Atmos, its SSE platform.


What Is Zero Trust Network Access (ZTNA)?

To begin, the idea behind zero trust network access starts with the assumption that cybersecurity attacks can come from anyone, whether internal or external to the network. A traditional IT network trusts pretty much everything by default, while a zero trust architecture literally means “trust no one,” including systems, users, software, and machines. Zero trust network access verifies a user’s identity and privileges and requires both users and devices to be continuously monitored and re-verified to maintain access. For example, let’s say that you log in to your bank account via a mobile device or even your laptop computer. Once you check your balance, you open a new tab to continue something else outside of the bank account screen. After a while, that tab will produce a pop-up with a timeout warning asking if you want to continue or log out. If you don’t reply in time, it will automatically log you out, and you will be forced to log back in if you want to access your bank account details again.


Determining “nonnegotiables” in the new hybrid era of work

Skill development is another function that takes place at the group level. So, in the hybrid era, it’s also important to avoid losing those opportunities. As Degreed’s Chief Learning and Talent Officer Kelly Palmer wrote for the World Economic Forum, it’s helpful to use hybrid employees’ time at the office for “collaborative projects in which their new skills can be put to work,” while “fully remote companies can organize virtual collaborations.” Prioritizing development on both the individual and team levels is also a nonnegotiable because of the challenges presented to organizations by skill gaps. “Half of all employees around the world will need reskilling by 2025—and that number does not include all the people who are currently not in employment,” PwC Global Chairman Robert E. Moritz and World Economic Forum Managing Director Saadia Zahidi wrote in the 2021 report Upskilling for Shared Prosperity.


Things that will remain in the inkwell in the new European regulation of artificial intelligence

A year ago the European Union took a step forward and presented a proposal for a regulation that is pioneering worldwide, dividing AI technologies into four categories based on the risk they may pose to citizens. But some experts point out that there are complex applications, such as health, autonomous cars and weapons, that under the current wording could be left outside the regulation. The EU is debating the final details of the AI rules, which could be ready in 2023. It is a regulation that is “unique in the world” due to its characteristics, although it leaves important aspects in the shadows, says Lucía Ortiz de Zárate, Researcher in Ethics and Governance of Artificial Intelligence at the Autonomous University of Madrid. Ortiz de Zárate has submitted, together with the Fundación Alternativas, comments on the Commission’s proposal. ... The researcher laments that sensitive sectors such as health are not included among the most closely watched artificial intelligence categories.


How To Re-Architect Four Business Components With Digital Transformation

Going paperless and modernizing IT won't drive digital transformation on their own. On the contrary, true digital transformations encompass reevaluating current business processes and re-architecting them from the ground up to effectuate radical change. The key to successful digital transformation is to establish and seamlessly intertwine four core pillars: technology and infrastructure, business processes and models, customer experience and organizational culture. In my experience as an entrepreneur operating a digital transformation agency, high-performing organizations and digital leaders are able to continuously re-evaluate their core, identify weaknesses and opportunities and guide their teams through the ongoing transformation of all four pillars simultaneously to achieve defined goals. Whether it's an implementation of AI-driven analytics or a new customer portal, all components of the four pillars need to be considered and transformed in unison to achieve transformation goals and deliver tangible results. Initiatives that touch only the technology or infrastructure may drive improvement, but they're rarely transformative.


Deep Dive: Protecting Against Container Threats in the Cloud

Container technology, like other types of infrastructure, can be compromised in a number of different ways – however, misconfiguration reigns atop the initial-access leaderboard. According to a recent Gartner analysis, through 2025, more than 99 percent of cloud breaches will have a root cause of customer misconfigurations or mistakes. “Containers are often deployed in sets and in very dynamic environments,” Nunnikhoven explained. “The misconfiguration of access, networking and other settings can lead to an opportunity for cybercriminals.” Trevor Morgan, product manager at comforte AG, noted that companies, especially smaller ones, generally stick with default configuration settings rather than using more sophisticated and granular configuration capabilities: “Basic misconfigurations or accepting default settings that are far less secure than customized settings.” That can lead to big (and expensive) problems. For instance, last June the “Siloscape” malware was discovered, the first known malware to target Windows containers. It breaks out of Kubernetes clusters to plant backdoors, raid nodes for credentials or even hijack an entire database hosted in a cluster. ...
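
As a toy illustration of the "insecure defaults" problem (not tied to any particular orchestrator's API; the field and function names are made up for the example), a pre-deployment audit might flag containers that keep risky default settings:

```go
package main

import "fmt"

// ContainerConfig is a simplified, hypothetical view of the settings
// an audit might inspect before deployment.
type ContainerConfig struct {
	Name        string
	Privileged  bool
	RunAsRoot   bool
	HostNetwork bool
}

// auditContainer returns the insecure defaults left in place.
func auditContainer(c ContainerConfig) []string {
	var findings []string
	if c.Privileged {
		findings = append(findings, "runs privileged")
	}
	if c.RunAsRoot {
		findings = append(findings, "runs as the root user")
	}
	if c.HostNetwork {
		findings = append(findings, "shares the host network namespace")
	}
	return findings
}

func main() {
	c := ContainerConfig{Name: "web", Privileged: true, RunAsRoot: true}
	for _, f := range auditContainer(c) {
		fmt.Printf("%s: %s\n", c.Name, f)
	}
}
```

Real environments would run checks like this continuously in the CI/CD pipeline rather than once at deployment time, since container settings drift as workloads change.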


DAOs: A blockchain-based replacement for traditional crowdfunding

Digital crowdfunding platforms like GoFundMe, Patreon and Kickstarter have enjoyed massive patronage over the past 10 years. This growth can be attributed primarily to the nature of crowdfunding, which carries minimal risk for any single backer: the risk is spread across all contributors to a particular idea or startup. Start-ups with financial needs will find that getting funding from traditional institutions is no easy feat, since those institutions take on quite a lot of the risk involved in financing business ideas that could end badly. With a global economy still reeling from the pandemic, the accessibility and far less bureaucratic nature of DAOs as a tool for crowdfunding have been a primary factor in their growth. Digitalized crowdfunding in the form of DAOs has eliminated some of the traditional limits of this form of financing, and that simplicity makes it a disruptive force against traditional crowdfunding methods. Emmet Halm dropped out of Harvard to found DAOHQ, which bills itself as the first marketplace for DAOs, where users can find information about any DAO.


A regular person’s guide to the mind-blowing world of hybrid quantum computing

Quantum computers allow us to harness the power of entanglement. Instead of waiting for one command to execute, as binary computers do, quantum computers can come to all of their conclusions at once. In essence, they’re able to come up with (nearly) all the possible answers at the same time. The main benefit to this is time. A simulation or optimization task that might take a supercomputer a month to process could be completed in mere seconds on a quantum computer. The most commonly cited example of this is drug discovery. In order to create new drugs, scientists have to study their chemical interactions. It’s a lot like looking for a needle in a never-ending field of haystacks. There are near-infinite possible chemical combinations in the universe, and sorting out their individual combined chemical reactions is a task no supercomputer can do within a useful amount of time. Quantum computing promises to accelerate these kinds of tasks and make previously impossible computations commonplace. But it takes more than just expensive, cutting-edge hardware to produce these ultra-fast outputs.


Go Language Riding High with Devs, But Has a Few Challenges

Among the most significant technical barriers to increased Go language adoption are missing features and lack of ecosystem/library support. “We asked for more details on what features or libraries respondents were missing and found that generics was the most common critical missing feature — we expect this to be a less significant barrier after the introduction of generics in Go 1.18,” wrote Alice Merrick, a user experience researcher at Google, in a post on the Go Blog discussing the 2021 survey. “The next most common missing features had to do with Go’s type system.” The Go community added generics to the language in release 1.18. Release 1.18, delivered last month, provides new features to enhance security and developer productivity, and improve the performance of Go. Steve Francia, Google Cloud’s Product & Strategic Lead for Go, called the new update “monumental” and said generics was the most sought-after feature by developers. “With generics, this specific feature has been the most sought-after feature in Go for the last 10 years,” Francia said.
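
For readers unfamiliar with the feature, here is a minimal sketch of what Go 1.18 generics look like in practice (the names are illustrative, not from the survey or release notes):

```go
package main

import "fmt"

// Number constrains the type parameter to common numeric types.
type Number interface {
	~int | ~int64 | ~float64
}

// Sum works for any slice whose element type satisfies Number,
// instead of needing one copy of the function per type.
func Sum[T Number](values []T) T {
	var total T
	for _, v := range values {
		total += v
	}
	return total
}

func main() {
	fmt.Println(Sum([]int{1, 2, 3}))           // 6
	fmt.Println(Sum([]float64{1.5, 2.5, 3.0})) // 7
}
```

Before 1.18, a function like Sum had to be duplicated per numeric type or fall back to interface{} values and runtime type assertions, which is part of why generics topped the survey's missing-features list.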



Quote for the day:

"It takes an influential leader to excellently raise up leaders of influence." -- Anyaele Sam Chiyson

Daily Tech Digest - May 01, 2022

The metaverse is a transformational opportunity for the business world

The idea of the metaverse gives enterprise software developers a roadmap to build software based on a single digital identity where companies join a network to connect with other companies who have likewise joined. This is not a system that belongs to any one company, but an environment where all companies are equal. Why do businesses want to connect? Because that’s the nature of a business at its essence, connecting with customers, suppliers, and any stakeholder to establish an expectation of value from the relationship, and then measure the realization of this value over time. This thinking is not inconsistent with the idea of a VR environment where a business user engages with some part of their business in an immersive environment. But we are setting the aperture much wider to say the entire business should be thought of as being part of the metaverse, and all of the data that exists about that business can be aimed at that digital identity to create a digital twin for the entire business. Then this digital business can connect with other businesses to do what businesses do — exchange value — but is now supported by a persistent, interoperable, collaborative digital space that is co-created and co-owned by those companies who have joined the metaverse.


Cognitive Biases About Leadership and How to Survive Them

We develop cognitive biases based on our life experience. Just as we expect teachers to be good with kids and surgeons to have a steady hand, we also hold behavioral expectations for our leaders. Today’s emphasis on servant leadership has us all believing that leaders are heroes who exist to serve the people and whose every action should be a selfless gesture. Then, when they fail to act in accordance with our beliefs, we become disillusioned — the hero has fallen and everything they ever did, good or bad, gets lumped into one big giant disappointment. That’s a lot of burden for a leader to bear. Instead of looking at leaders as one whole unit, we need to see them as a collection of basic human traits. We forget that within every leader is a person, with flaws and imperfections. Instead of putting the whole person on a pedestal as some kind of one-size-fits-all embodiment of goodness, just admire them for their strengths. Unpack what you like about them without discarding the whole leader. Take the good they accomplished for what it is, but don’t blame humans for not being angels.
 

Data Is The New Business Fuel, But It Requires Sound Risk Management

Today’s remote or hybrid work model poses a whole new set of security challenges. Many companies can minimize risk by leveraging a multicloud strategy, but the risk associated with malware or ransomware can compromise crucial corporate and customer data. Despite this, according to a report from Menlo Security, only 27% of organizations have advanced threat protection in place for all endpoint devices with access to company data. It’s crucial that companies deploy advanced cybersecurity software and also train employees on acceptable use of public or home-based Wi-Fi. While enterprise data provides the fuel that drives accurate AI, it’s important that data scientists ensure that bias doesn’t creep into the algorithms that are developed. Data should be analyzed to ensure that it is diverse and doesn’t lead to any decisions that could provide an unfair advantage to certain populations. As an example, AI that helps to determine the best suppliers to work with should be trained with diverse supplier data. Speaking of suppliers, it’s not enough that data has proper governance within the organization.


How Aurora Serverless made the database server obsolete

Amazon Aurora Serverless v1 changed everything by enabling customers to resize their VMs without disrupting the database. It would look for gaps in transaction flows that would give it time to resize the VM. It would then freeze the database, move to a different VM behind the scenes, and then start the database again. This was a great starting point, explains Biswas, but finding transaction gaps isn't always easy. "When we have a very chatty database, we are running a bunch of concurrent transactions that overlap," he explains. "If there's no gap between them, then we can't find the point where we can scale." Consequently, the scaling process could take between five and 50 seconds to complete. It could sometimes end up disrupting the database if an appropriate transaction gap could not be found. That restricted Aurora Serverless instances to sporadic, infrequent workloads. "One piece of feedback that we heard from customers was that they wanted us to make Aurora Serverless databases suitable for their most demanding, most critical workloads," explained Biswas.


The ever-expanding cloud continues to storm the IT universe

VMware Inc. several years ago cleaned up its fuzzy cloud strategy and partnered up with everyone. And you see above, VMware Cloud on AWS doing well, as is VMware Cloud, its on-premises offering. Even though it’s somewhat lower on the X-axis relative to last quarter, it’s moving to the right with a greater presence in the data set. Dell and HPE are also interesting. Both companies are going hard after as-a-service with APEX and GreenLake, respectively. HPE, based on the survey data from ETR, seems to have a lead in spending momentum, while Dell has a larger presence in the survey as a much bigger company. HPE is climbing up on the X axis, as is Dell, although not as quickly. And the point we come back to often is that the definition of cloud is in the eye of the customer. AWS can say, “That’s not cloud.” And the on-prem crowd can say, “We have cloud too!” It really doesn’t matter. What matters is what the customer thinks and in which platforms they choose to invest. That’s why we keep circling back to the idea of supercloud. You are seeing it evolve and you’re going to hear more and more about it. 


Solving Business Problems With Blockchain

Smart contracts are one of the applications of blockchain that can vastly help companies in securing a deal. By using smart contracts, companies can encode an agreement in an electronic contract that lets organizations carry out a venture in a conflict-free manner. Unlike in a traditional arrangement, if a company tries to change the terms of the contract or refuses to release a payment, everybody on the network can leverage the technology’s transparency to see it, and the contract’s code automatically freezes the deal. The agreement does not proceed until the company pays what is due or returns to the agreed terms. This smart management of contracts helps businesses keep operations running without friction. As blockchain is a technology that increases transparency, everyone on the network can efficiently keep track of the products coming into and leaving a site. Every time a product halts at a specific gateway, the event gets documented and inserted into the blockchain ledger. This documentation increases transparency on cargo status and ensures shipments reach retailers on time and intact.
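
Real smart contracts run on-chain in languages such as Solidity; purely to illustrate the escrow-style logic described above, here is a sketch in Go with hypothetical names, where the deal stays frozen until the full payment is made:

```go
package main

import (
	"errors"
	"fmt"
)

// EscrowDeal sketches the state a contract might track for a deal.
type EscrowDeal struct {
	AmountDue float64
	Paid      float64
	Released  bool
}

// Pay records a payment toward the agreed amount.
func (d *EscrowDeal) Pay(amount float64) {
	d.Paid += amount
}

// Release only succeeds once the full amount due has been paid;
// otherwise the deal stays frozen, mirroring the "freeze until
// obligations are met" behaviour described in the excerpt.
func (d *EscrowDeal) Release() error {
	if d.Paid < d.AmountDue {
		return errors.New("deal frozen: outstanding payment")
	}
	d.Released = true
	return nil
}

func main() {
	deal := &EscrowDeal{AmountDue: 100}
	deal.Pay(40)
	if err := deal.Release(); err != nil {
		fmt.Println(err) // deal frozen: outstanding payment
	}
	deal.Pay(60)
	if err := deal.Release(); err == nil {
		fmt.Println("deal released")
	}
}
```

On an actual blockchain this logic would be deployed once and executed identically by every node, which is what makes the freeze credible to all parties.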


The Future of Health Data Management: Enabling a Trusted Research Environment

TRE is becoming a commonly used acronym among the science and research community. In general, a TRE is a centralized computing database that securely holds data and allows users to gain access for analysis. TREs are only accessed by approved researchers and no data ever leaves the location. Because data stays put, the risk to patient confidentiality is reduced. ... TREs are becoming the architectural backbone for health data in many research organizations. While this is a step in the right direction, many TREs still can’t speak to colleagues from other organizations, or even other departments within their own organization. ... As the genomic sector continues to grow, the capability of TREs to communicate will allow researchers and scientists to collaborate effectively on overcoming life-threatening diseases and improving diagnosis by breaking down the “silos” of health data. That doesn’t mean moving data. Life sciences data sets are too large to move efficiently – and to complicate matters, many data security regulations forbid data to leave an organization, state or nation.


Designing Societally Beneficial Reinforcement Learning Systems

As an RL agent collects new data and the policy adapts, there is a complex interplay between current parameters, stored data, and the environment that governs evolution of the system. Changing any one of these three sources of information will change the future behavior of the agent, and moreover these three components are deeply intertwined. This uncertainty makes it difficult to back out the cause of failures or successes. In domains where many behaviors can possibly be expressed, the RL specification leaves a lot of factors constraining behavior unsaid. For a robot learning locomotion over an uneven environment, it would be useful to know what signals in the system indicate it will learn to find an easier route rather than a more complex gait. In complex situations with less well-defined reward functions, these intended or unintended behaviors will encompass a much broader range of capabilities, which may or may not have been accounted for by the designer. ... While these failure modes are closely related to control and behavioral feedback, Exo-feedback does not map as clearly to one type of error and introduces risks that do not fit into simple categories. 


Don’t Fear Artificial Intelligence; Embrace it Through Data Governance

Data-centric AI is evolving, and should include relevant data management disciplines, techniques, and skills, such as data quality, data integration, and data governance, which are foundational capabilities for scaling AI. Further, data management activities don’t end once the AI model has been developed. To support this, and to allow for malleability in the ways that data is managed, HPE has launched a new initiative called Dataspaces, a powerful cloud-agnostic digital services platform aimed at putting more control into the hands of data producers and curators as they build intelligent systems. Addressing, head on, the data gravity and compliance considerations that exist for critical datasets, Dataspaces gives data producers and consumers frictionless access to the data they need, when they need it, supporting better integration, discovery, and access, enhanced collaboration, and improved governance to boot. This means that organisations can finally leverage an ecosystem of AI-centric data management tools that combine both traditional and new capabilities to prepare the enterprise for success in the era of decision intelligence.


How DAOs Are Changing Leadership

Traditionally, top-down leadership comes to those who either already have power or the ability to purchase it. Since everyone has equal shares in a DAO, authority is not "given" to anyone. Instead, it's earned by the merits of the proposals made. This creates an organization in which people follow a leader's guidance voluntarily. This always yields better results, whether through growth, innovation or higher profits. This style of leadership is something all good leaders can practice. Even if they didn't "earn" their role in the same way, they can earn the trust and loyalty of their team through their actions. ... Modern corporations are like enormous ships that require huge amounts of time and effort to change course. There is endless red tape and bureaucracy to navigate before any real change can be implemented. Because DAOs are more democratic, changes can be proposed and implemented with relatively little hassle. While DAOs are primarily based on the division of funds, leaders can still note how the process works and see how efficient it is. The level of efficiency DAOs create is something that great leaders can seek to replicate in their own organizations.
 


Quote for the day:

"Challenges in life always seek leaders and leaders seek challenges." -- Wayde Goodall

Daily Tech Digest - April 10, 2022

Robots Developing The Unique Sixth Sense

For the senses of smell and taste, robots with chemical sensors could be far more precise than humans, but building in proprioception, the robot’s awareness of itself and its body, is far more challenging and is a big reason why humanoid robots are so tough to get right. Tiny modifications can make a big difference in human-robot interaction, wearable robotics, and sensitive applications like surgery. In the case of rigid robotics, this is usually solved by putting a number of strain and pressure sensors in each joint, which allow the robot to figure out where its limbs are. This is fine for rigid robots with a limited number of joints, but it is insufficient for softer, more flexible robots. Roboticists are torn between having a large, complicated array of sensors for every degree of freedom in a robot’s mobility and having limited proprioception skills. This challenge is being addressed with new solutions, which often involve new arrays of sensory material and machine-learning algorithms to fill in the gaps. A recent study in Science Robotics discusses the use of soft sensors spread at random through a robotic finger.


The Rise of Enterprise Data Inflation

Data inflation ensues when spending on data rises without deriving proportional enterprise value from that spending. Surprisingly, digital transformation and application modernization have created fertile ground for data inflation to run rampant. As enterprises refactor applications and ever-expanding datasets aren’t managed carefully, enterprises experience data sprawl. Moving to the cloud to deliver more capability and use can inadvertently lead to data inflation. Often, a dataset is helpful across multiple areas of a business. Different development groups or people with unrelated objectives might make numerous copies of the same data. They often change a dataset’s taxonomy or ontology for their software or business processes, making it harder for others to identify it as a duplicate. This occurs because the average data scientist trying to hone in on a particular data insight has different priorities than the data engineers responsible for pipelining that data and creating new features. And the typical IT person has little visibility into the use of the data at all. The result is that the enterprise pays for many extra copies without getting any new value – a core driver of data inflation.


Will Apple build its own blockchain?

One thing that is pretty clear is that if Apple creates a specific carve-out for NFTs in its own App Store rules, it’s going to be on its own terms. They could take a number of different paths; I could see a world where Apple could only allow certain assets on certain blockchains or even build out their own blockchain. But Apple’s path toward controlling the user experience will most likely rely on Apple taking a direct hand in crafting their own smart contracts for NFTs, which developers might be forced to use in order to stay compliant with App Store rules. This could easily be justified as an effort to ensure that consumers have a consistent experience and can trust NFT platforms on the App Store. These smart contracts could send Apple royalties automatically and lead to a new in-app payment fee pipeline, one that could even persist in transactions that took place outside of the Apple ecosystem(!). More complex functionality could be baked in as well, allowing Apple to handle workflows like reversing transactions. Needless to say, any of these moves would be highly controversial among existing developers.


A Microservice Overdose: When Engineering Trends Meet the Startup Reality

Microservices are not the only big engineering trend that is happening right now. Another big trend that naturally comes together with microservices, is using a multi-repo version control approach. The multi-repo strategy enables the microservice team to maintain a separate and isolated repository for each responsibility area. As a result, one group may own a codebase end to end, developing and deploying features autonomously. Multi-repo seems like a great idea, until you realize that code duplication and configuration duplication are still not solved. Apart from the code duplication that we already discussed, there is a whole new area of repository configurations – access, permissions, branch protection, and so on. Such duplications are expected with a multi-repo strategy because multi-repo encourages a segmented culture. Each team does its own thing, making it challenging to prevent groups from solving the same problem repeatedly. In theory, a better alternative could be the mono-repo approach. In a mono-repo approach, all services and codebase are kept in a single repository. But in practice, mono-repo is fantastic if you’re Google / Twitter / Facebook. Otherwise, it doesn’t scale very well.


Talking Ethical AI with SuperBot’s Sarvagya Mishra

AI is the most transformative technology of our era. But it brings to the fore some fundamental issues as well. One, a rapidly expanding and pervasive technology powered by mass data may bring about a revolutionary change in society; two, the nature of AI is to process voluminous raw information, which can be used to automate decisions at scale; three, all of this is happening while the technology is still at a nascent stage. If we think about it, AI is a technology that can impact our lives in multiple ways – from being the backbone of devices that we use to how our economies function and even how we live. AI algorithms are already deployed across every major industry for every major use case. Since AI algorithms are essentially sets of rules that can be used to make decisions and operate devices, they could make judgement calls that harm an individual or a larger population. For instance, consider the AI algorithm for a self-driving car. It’s trained to be cautious and follow traffic rules, but what happens if it suddenly decides that breaking the rules is more beneficial? It could lead to a lot of accidents.


Data Science: How to Shift Toward More Transparency in Statistical Practice

A common misconception about statistics is that it can give us certainty. However, statistics only describe what is probable. Transparency can be best achieved by conveying the level of uncertainty. By quantifying the uncertainty in research inferences, a greater degree of trust can be achieved. Some researchers have done studies of articles in physiology, the social sciences, and medicine. Their findings demonstrated that error bars, standard errors, and confidence intervals were not always presented in the research. In some cases, omitting these measures of uncertainty can have a dramatic impact on how the information is interpreted. Areas such as health care have stringent database compliance requirements to protect patient data. Patients could be further protected by including these measures, and researchers can convey their methodology and give readers insights into how to interpret their data. As for assessing data preprocessing choices, data scientists are often confronted with massive amounts of unorganized data.
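
For reference, the kind of uncertainty measure the authors have in mind is the standard 95% confidence interval for a sample mean, which (assuming approximate normality of the estimate) is

```latex
\bar{x} \;\pm\; 1.96 \cdot \frac{s}{\sqrt{n}}
```

where \bar{x} is the sample mean, s the sample standard deviation, and n the sample size; reporting an interval like this alongside a point estimate tells readers how precise the estimate actually is.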


DAO regulation in Australia: Issues and solutions, Part 2

So, the role of the government is to introduce regulations and standards, to make sure that people understand that when they publish a record — say, on Ethereum — it will become immutable and protected by thousands of running nodes all around the globe. If you publish it on some private distributed ledger network controlled by a cartel, you basically need to rely on its goodwill. The conclusion for this part of the discussion is the following. With blockchain, you don’t need any external registry database, as blockchain is the registry, and there is no need for the government to maintain this infrastructure, as the blockchain network is self-sustainable. Users can publish and manage records on a blockchain without a registrar, and there must be standards that allow us to distinguish reliable blockchain systems. ... The difference is that this must be designed as a standard requirement for the development of a compliant DAO. Those who desire to work under the Australian jurisdiction must develop the code of their decentralized applications and smart contracts compliant with these standards.


Data Governance Adoption: Bob Seiner on How to Empower Your People to Participate

When you consider the ADKAR model for change, any program adoption requires personal activation. “You need to find a way to make that connection with people,” Bob says. “ADKAR relies on personal traits and things that people need to adjust to and adopt to further the way they’re able to govern and steward data in their organization. Make it personable, make it reasonable, and help them understand they play a big role in data governance.” But even the most energized workforce can’t participate in active data governance without the right tools — your drivers won’t win their race without cars, after all. Like most large organizations, Fifth Third has a very divided data platform ecosystem, with several dozen tools employing both old and new technology. But as their vice president of enterprise data, Greg Swygart, notes, where data consumption starts and ends — curation and interaction — “the first step in the data marketplace is always Alation.” “Implementing an effective data governance program really requires getting people involved,” Bob concludes. 


AI Regulatory Updates From Around the World

Under the proposed ‘Artificial Intelligence Act,' all AI systems in the EU would be categorized in terms of their risk to citizens' privacy, livelihoods, and rights. ‘Unacceptable risk' covers systems that are deemed to be a "clear threat to the safety, livelihoods, and rights of people.” Any product or system which falls under this category will be banned. This category includes AI systems or applications that manipulate human behavior to circumvent users' free will and systems that allow ‘social scoring' by governments. The next category, 'High-risk,' includes systems for critical infrastructure which could put life or health at risk, systems for law enforcement that may interfere with people's fundamental rights, and systems for migration, asylum-seeking, and border control management, such as verification of the authenticity of travel documents. AI systems deemed to be high-risk will be subject to “strict obligations” before they can be put on the market, including risk assessments, high quality of the datasets, ‘appropriate’ human oversight measures, and high levels of security.


SEC Breach Disclosure Rule Makes CISOs Assess Damage Sooner

The central question facing CISOs who've experienced a security incident will be around how materiality is determined. The easiest way to assess whether an incident is material is by looking at the impact to sales as a percentage of the company's overall revenue or by tracking how many days a company's systems or operations are down as the result of a ransomware attack, Borgia says. But the SEC has pressured companies to consider qualitative factors such as reputation and the centrality of a breach to the business, he says. For instance, Pearson paid the SEC $1 million to settle charges that it misled investors about a breach involving millions of student records. Though the breach might not have been financially material, he says it put into doubt Pearson's ability to keep student data safe. The impact of the proposed rule will largely come down to how much leeway the SEC provides breach victims in determining whether an incident is material. If the SEC goes after businesses for initially classifying an incident as immaterial and then changing their minds weeks or months later when new facts emerge, he says, companies will start putting out vague and generic disclosures that aren't helpful.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - April 07, 2022

Researchers Identify ‘Master Problem’ Underlying All Cryptography

In the absence of proofs, cryptographers simply hope that the functions that have survived attacks really are secure. Researchers don’t have a unified approach to studying the security of these functions because each function “comes from a different domain, from a different set of experts,” Ishai said. Cryptographers have long wondered whether there is a less ad hoc approach. “Does there exist some problem, just one master problem, that tells us whether cryptography is possible?” Pass asked. Now he and Yanyi Liu, a graduate student at Cornell, have shown that the answer is yes. The existence of true one-way functions, they proved, depends on one of the oldest and most central problems in another area of computer science called complexity theory, or computational complexity. This problem, known as Kolmogorov complexity, concerns how hard it is to tell the difference between random strings of numbers and strings that contain some information. ... The finding suggests that instead of looking far and wide for candidate one-way functions, cryptographers could just concentrate their efforts on understanding Kolmogorov complexity. “It all hinges on this problem,” Ishai said. 
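
For readers who want the formal objects behind this summary, the usual textbook definitions have roughly the following shape (the Liu–Pass result itself concerns a time-bounded variant of Kolmogorov complexity; these are informal statements of the standard definitions, not the paper's exact formulation):

```latex
% One-wayness: f is computable in polynomial time, yet every
% probabilistic polynomial-time adversary A inverts it only with
% negligible probability over a random n-bit input x.
\Pr_{x \leftarrow \{0,1\}^n}\!\left[\, f\!\left(A\bigl(1^n, f(x)\bigr)\right) = f(x) \,\right] \le \mathrm{negl}(n)

% Kolmogorov complexity: the length of the shortest program p that
% makes a fixed universal machine U output the string x.
K_U(x) = \min\{\, |p| : U(p) = x \,\}
```

Intuitively, a string with low Kolmogorov complexity contains structure (a short program can reproduce it), while a truly random string does not, which is why distinguishing the two connects to building functions that are easy to compute but hard to invert.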


4 Reasons Decentralized Business Management Is Booming

Organizations face employee churn all the time, whether due to a lack of challenging work or dissatisfaction with the company's overall direction. Both of these reasons are interconnected. An inflexible organizational hierarchy leaves employees fighting to impress their managers instead of creating revenue-generating assets. With power consolidated in the hands of a few, leadership skills are scarce. Thus, when top-level executives move on, the company faces a tough time replacing those who departed and must engage resources to locate and vet suitable leadership. Promoting from within is ideal because long-term employees understand the company and its products well. They've witnessed the company's processes from the ground up, which makes them ideal leaders. However, centralized organizations don't provide low-level employees with the opportunity to ascend to leadership roles. A decentralized organization forces employees to act as leaders. Thanks to greater autonomy and priority on responsiveness, employees must act decisively. Intrapreneurship increases, promoting creativity, and the organization is energized.


DeFi can breathe new life into traditional assets

Tokenization of commodities enables blockchain-based ownership of a physical asset, which is essentially just a decentralized version of an already-existing practice in traditional finance. Tokenized precious metals are somewhat similar conceptually to a share in a gold exchange-traded fund (ETF), as they represent the investor’s stake in physical gold stored elsewhere and largely work toward the same purpose. Projects like VNX offer digital ownership of tokenized commodities that are backed by physical assets including gold, giving the investor the same benefits as investing in physical gold but with the versatility of a crypto asset on top of that. Stablecoins are also a viable option, allowing investors to reap the benefits of decentralization while maintaining the security of traditional finance. Backing from fiat and other real-world assets removes the common fear that crypto has no basis. Stablecoins like TrustToken (TUSD) grant investors more certainty and flexibility, lowering the stakes for any user by enabling easy redeeming of their funds at any given moment.


Chinese APT Targets Global Firms in Monthslong Attack

The campaign, which began in October 2019, targeted Japanese firms and their subsidiaries in 17 locations across the world, Symantec said in its report. The focus of the campaign was to exfiltrate data, particularly from automotive organizations, as part of an industrial cyberespionage effort. The APT group was then using a custom malware variant called Backdoor.Hartup as well as "living off the land" tools to target its victims. Once the victim's network was compromised, the hackers remained active for up to a year to exfiltrate data. Cicada then used a Dynamic Link Library side-loading technique to compromise the victims' domain controllers and file servers. "Various tools (were) deployed in this campaign, and Cicada’s past activity indicates that the most likely goal of this campaign is espionage. Cicada activity was linked by U.S. government officials to the Chinese government in 2018," the latest report says. Upon successfully gaining access to victim machines, the Symantec researchers observed APT actors deploying a custom loader and the SodaMaster backdoor. 


First malware targeting AWS Lambda serverless platform disclosed

The researchers have dubbed the malware “Denonia” — the name of the domain that the attackers communicated with — and say that it was utilized to enable cryptocurrency mining. But the arrival of malware targeting AWS Lambda suggests that more damaging cyberattacks against the service are inevitable as well. Cado Security said it has reported its findings to AWS. In a statement in response to an inquiry about the reported malware discovery, AWS said that “Lambda is secure by default, and AWS continues to operate as designed.” ... Cado Security cofounder and CTO Chris Doman said that businesses should expect that serverless environments will follow a similar threat trajectory to that of container environments, which he noted are now commonly impacted by malware attacks. Among other things, that means that threat detection in serverless environments will need to catch up, Doman said. “The new way of running code in serverless environments requires new security tools, because the existing ones simply don’t have that visibility. They won’t see what’s going on,” Doman said. “It’s just so different.”


Why We’re Porting Our Database Drivers to Async Rust

Similar to the way Python relies on modules compiled in C to make other modules faster, our CQL drivers could benefit from a Rust core. A lightweight API layer would ensure that the drivers are still backward compatible with their previous versions, but the new ones will delegate as much work as possible straight to the Rust driver, trusting that it’s going to perform the job faster and safer. Rust’s asynchronous model is a great fit for implementing high-performance, low-latency database drivers because it’s scalable and allows high concurrency in your applications. Contrary to what other languages implement, Rust abstracts away the layer responsible for running asynchronous tasks. This layer is called the runtime. Being able to select, or even implement, your own runtime is a powerful tool for developers. After careful research, we picked Tokio as our runtime due to its active open source community, its focus on performance, its rich feature set (including a complete implementation of network streams, timers, etc.), and lots of fantastic utilities like tokio-console.


How David Chaum Went From Inventing Digital Cash to Pioneering Digital Privacy

Shocked by the surveillance operations exposed by Edward Snowden, Chaum refined the mixing technologies developed at the end of the 1970s to provide untraceable message sending, using sophisticated cryptography not only to encrypt the content of messages but to hide the identity of the user by eliminating the "metadata" of who sends messages to whom, how often and from where. Chaum is horrified by the promises of “end-to-end” message content encryption offered by companies such as Meta (formerly Facebook). It leaves user metadata intact, which means it can still be harvested and sold, he warns. “It's criminal. It's exploitative of the public in the worst way,” says Chaum. “Because the real value in the information is the traffic data,” and “the sender's social graph and its relation to the timing of events,” he says — it could be used to predict our behavior and to further political ends (as was the case in the Cambridge Analytica scandal).


Reproducibility in Deep Learning and Smooth Activations

The Smooth reLU (SmeLU) activation function is designed as a simple function that addresses the concerns with other smooth activations. It connects a 0 slope on the left with a slope 1 line on the right through a quadratic middle region, constraining continuous gradients at the connection points (as an asymmetric version of a Huber loss function). SmeLU can be viewed as a convolution of ReLU with a box. It provides a cheap and simple smooth solution that is comparable in reproducibility-accuracy tradeoffs to more computationally expensive and complex smooth activations. The figure below illustrates the transition of the loss (objective) surface as we gradually transition from a non-smooth ReLU to a smoother SmeLU. A transition of width 0 is the basic ReLU function for which the loss objective has many local minima. As the transition region widens (SmeLU), the loss surface becomes smoother. If the transition is too wide, i.e., too smooth, the benefit of using a deep network wanes and we approach the linear model solution — the objective surface flattens, potentially losing the ability of the network to express much information.
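
For reference, the piecewise form described above can be written as follows, with β denoting the half-width of the quadratic transition region (the symbol is assumed here; the paper may use different notation):

```latex
\mathrm{SmeLU}(x) =
\begin{cases}
0, & x \le -\beta \\[4pt]
\dfrac{(x+\beta)^2}{4\beta}, & -\beta \le x \le \beta \\[4pt]
x, & x \ge \beta
\end{cases}
```

The quadratic piece matches both the value and the slope of the flat and linear pieces at x = -β and x = β, which is what makes the gradient continuous. Taking β toward 0 recovers the ordinary ReLU, while larger β widens and smooths the transition, matching the loss-surface behavior the excerpt describes.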


The security implications of the hybrid working mega-trend

Ultimately, any high-level security model really breaks down into a trust issue: Who and what can I trust? – the employee, the devices, and the applications the employee is trying to connect to. In the middle is the network, but today, more often than not, the network is the internet. Think about it. Employees sit in coffee shops and log onto public browsers to access their email. So now what organisations are looking for is a secure solution for their applications, devices, and users. Every trusted or ‘would-be trusted’ end-user computing device has security software installed on it by the enterprise IT department. That software makes sure the device and the user who is on the device is validated, so the device becomes the proxy to talk to the applications on the corporate network. So now the challenge lies in securing the application itself. Today’s cloud infrastructure connects the user directly to the application, so there is no need to have the user connect via an enterprise server or network. The client is always treated as an outsider, even while sitting in a corporate office.


The Principles of Test Automation

The only way to reliably find errors is to build a comprehensive automated test suite. Tests can check the whole application from top to bottom. They catch errors before they can do any harm, find regressions, and run the application on various devices and environments at a scale that is otherwise prohibitively expensive to attempt manually. Even if everyone on the team was an exceptionally clever developer that somehow never made a mistake, third-party dependencies can still introduce errors and pose risks. Automated tests can scan every line of code in the project for errors and security issues. ... Some tests start their lives as manual tests and get automated down the road. But, more often than not, this results in overcomplicated, slow, and awkward tests. The best results come when tests and code have a certain synergy. The act of writing a test nudges developers to produce more modular code, which in turn makes tests simpler and more granular. Test simplicity is important because it’s not practical to write tests for tests. Code should also be straightforward to read and write. Otherwise, we risk introducing failures with the test themselves, leading to false positives and flakiness.
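
As a small illustration of the granular, readable tests the article argues for, here is a table-driven test using Go's built-in testing package (the function under test is hypothetical):

```go
package discount

import "testing"

// ApplyDiscount is the hypothetical unit under test: small and
// modular, which is what keeps its tests simple and granular.
func ApplyDiscount(price, percent float64) float64 {
	return price - price*percent/100
}

func TestApplyDiscount(t *testing.T) {
	cases := []struct {
		name           string
		price, percent float64
		want           float64
	}{
		{"no discount", 100, 0, 100},
		{"ten percent", 100, 10, 90},
		{"full discount", 50, 100, 0},
	}
	for _, c := range cases {
		got := ApplyDiscount(c.price, c.percent)
		if got != c.want {
			t.Errorf("%s: got %v, want %v", c.name, got, c.want)
		}
	}
}
```

Run with "go test"; each case documents one expectation, so a failure points straight at the behavior that regressed, and the same file scales naturally as new edge cases are discovered.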



Quote for the day:

"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward

Daily Tech Digest - April 04, 2022

What are Governance Tokens? How Token Owners Shape a DAO's Direction

Governance tokens represent ownership in a decentralized protocol. They provide token holders with certain rights that influence a protocol’s direction. This could include which new products or features to develop, how to spend a budget, which integrations or partnerships should be pursued, and more. Generally speaking, exercising this influence can take two forms. First, governance token holders can propose changes through a formal proposal submission process. If certain criteria are met and the proposal goes to a vote, governance token holders can use their tokens to vote on the proposed changes. The specific mechanisms and processes through which these rights are exercised differ across protocols. ... In traditional corporations, a concentrated executive body—typically some combination of a C-Suite, board of directors, and shareholders—has sole discretion over decisions pertaining to the organization’s strategic direction. DAOs differ from traditional corporations in that they don’t have a centralized group of decision-makers; but they still need to make decisions that influence the organization’s future.
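
As a toy sketch of the token-weighted voting mechanism described above (all names and thresholds are hypothetical; real protocols enforce this on-chain with their own quorum and proposal rules):

```go
package main

import "fmt"

// Proposal tallies token-weighted votes for and against a change.
type Proposal struct {
	Title   string
	For     uint64
	Against uint64
}

// Vote adds a holder's token balance to one side of the tally.
func (p *Proposal) Vote(tokens uint64, support bool) {
	if support {
		p.For += tokens
	} else {
		p.Against += tokens
	}
}

// Passes applies a simple majority rule once a quorum of voting
// tokens has participated.
func (p *Proposal) Passes(quorum uint64) bool {
	total := p.For + p.Against
	return total >= quorum && p.For > p.Against
}

func main() {
	p := &Proposal{Title: "Fund new integration"}
	p.Vote(1_500, true) // holder with 1,500 tokens votes yes
	p.Vote(900, false)  // holder with 900 tokens votes no
	fmt.Println("passes:", p.Passes(2_000)) // true: quorum met, majority in favor
}
```

The key difference from one-person-one-vote systems is visible in the Vote method: influence scales with token holdings, which is exactly the property that different protocols then tune with their own quorum, delegation, and threshold rules.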


Remote work vs office life: Lots of experiments and no easy answers

"It's important that it's an iterative process because we're going to find out things that we didn't necessarily expect in our assumptions around how the styles of work that we will be carrying out may well change as we start to reach a balance," he says. Lloyds is examining the work that takes place in offices, branches and homes, and is thinking about how the bank will connect people across these spaces in what Kegode refers to as "a mindful way". Developing that understanding involves constant conversations and an analysis of the crossover between business demands, individual needs and team requirements. "It's always about looking at how we can use technology as an enabler to make us more human," he says. "How can we use technology to enhance our human traits and the things that make us unique that machines can't do?" Lloyds started introducing Microsoft Teams just before the pandemic, which served the bank well when lockdown began. While video-conferencing tech has kept workers productive during the past two years, the future of the workplace will require careful conversations about how tools are adopted and adapted.


PCI SSC Releases Data Security Standard Version 4.0

The PCI Security Standards Council on Thursday released the Payment Card Industry Data Security Standard version 4.0. The Council says that the latest version's improvements are intended to counter evolving threats and technologies, and the new version will enable innovative methods to combat new threats. Organizations currently use PCI DSS version 3.2.1. The council is allowing two years - until March 31, 2024 - for the industry to conduct training and provide education regarding implementation of the changes and updates in version 4.0. While the new standard will be considered best practice, the current version of PCI DSS will remain active during this time. After March 31, 2024, it will be retired over the next year, and the new requirements will become effective after March 31, 2025. The global payments industry received feedback on the latest changes over the course of three years, during which more than 200 organizations provided more than 6,000 suggestions to ensure the standard continues to meet the ever-changing landscape of payment security, the council says.


Building Trust with Responsible AI

User-centered, reliable AI systems should be created using basic best practices for software systems and methods that address machine-learning-specific problems. The following points should be kept in mind while designing reliable and responsible AI. Consider augmenting and assisting users with a variety of options, and use a human-centered design approach; this includes building a model with appropriate disclosures, clarity, and control for the users. Engage a wide range of users and use-case scenarios, and incorporate comments before and during the project’s development. Rather than using a single metric, use a combination to better understand the tradeoffs between different types of errors and experiences, and make sure your metrics are appropriate for the context and purpose of your system; for example, a fire alarm system should have high recall, even if that implies a false alarm now and then. ML models will reflect the data they are trained on, so make sure you understand your raw data; if this isn’t possible, such as with sensitive raw data, try to comprehend your input data as much as possible while still maintaining privacy. Understand the limitations of your dataset and communicate them to the users whenever possible.
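
To make the fire-alarm example concrete, the two standard metrics being traded off are

```latex
\mathrm{precision} = \frac{TP}{TP + FP}, \qquad \mathrm{recall} = \frac{TP}{TP + FN}
```

where TP, FP, and FN are true positives, false positives, and false negatives. Tuning a fire-alarm model for high recall accepts some false alarms (lower precision) so that real fires are rarely missed, which is why no single metric can capture the right tradeoff on its own.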


The CISO as brand enabler, customer advocate, and product visionary

Quantifying the value of a corporate brand is tough. But it’s clear that your organization’s brand is as much an asset as the devices and networks that the CISO is charged with protecting – in fact, the brand may be your organization’s largest single asset. A recent Forbes/MASB report states that brand assets drive approximately 20% of enterprise value on average. Doesn’t that sound like something worth protecting? Yes, the creation and growth of the brand is typically the responsibility of the marketing organization and the CMO (chief marketing officer). But it’s not unusual for marketing to feel like it’s outracing the other business functions, including the CISO, and they are anxious for everyone to “catch up” and join them. The CISO can act as a useful counterweight to help marketing achieve its goals safely, in good times and bad. For example, isn’t it important to fully coordinate a breach response between these two groups in a way that best preserves the value of your brand? Those brands that emerge out of a high-profile information security incident stronger don’t get there by accident.


Introducing the MeshaVerse: Next-Gen Data Mesh 2.0

When designing MeshaVerse, our primary focus was on preserving decentralization while ensuring data reliability, data quality, and scale. Our novel approach includes implementing Dymlink, a symlink in the data lakehouse, and a new SlinkSync (Symbolic link Sync), a symlink that links Dymlinks together – similar to a linked list. By establishing which symlinks can be composed as a set – using either a direct probable or indirect inverse probable match – we are able to infer the convergence criteria of a nondivergent series (i.e the compressed representation of the data) while always ensuring we stay within the gradient of the curve. As a result, we’re able to prevent an infinite recursion that can potentially stale all data retrieval from the Data Mesh. Stay tuned for a future blog, where we’ll dive deeper into this approach. The integrity of this virtual data is ensured in real-time and at scale using a more recent implementation of Databricks Brickchain, taking advantage of all global compute power and therefore offering the potential to store the entire planet’s data with a fraction of the footprint.


DAOs could revolutionize how startups are run

Blockchain technology has ushered in the creation of businesses that allow users greater control over the services they choose to use. These emerging services turn the top-down approach of traditional tech firms on its head, allowing patrons to have a say in the development of a new generation of Web3-based games, apps, and companies. VCs currently have a monopoly on decision-making in their chosen investments, giving them the power to dictate critical judgments and the direction of these companies. While this sounds fair in theory — given the money they provide — this can also mean that critical decisions get slowed, or the original vision for the company diverges entirely. However, under the Web3 model, it makes sense that key business decisions should be as decentralized as the infrastructure that underpins them. Decentralized voting via a token governance structure means that anyone — regardless of their ethnicity, creed or financial status — can get involved and benefit from being part of a like-minded community of peers, removed from the hierarchical structure of the standard business model.


5 things CIOs should know about cloud service providers

While cloud service providers may offer similar capabilities, they are not actually the same. Determining the best one for your unique requirements and goals is another critical piece of your strategy. “When working with cloud service providers, it’s important to align the platform with the company’s unique business objectives,” says Scott Gordon, enterprise architect director, cloud infrastructure services at Capgemini Americas. “Every organization has its own situation, and the cloud strategy must be catered to solve those customized business challenges to create value and results.” While there might be some plain-vanilla workloads where the choice of cloud service provider might not have overwhelming implications, most organizational realities are more complex. Thinking back to the advice from Haff and LaCour, this is again where specific motivations or goals have a big impact. Gordon notes, for example, the importance of evaluating the end-to-end life cycles of your on-premises applications and determining which ones will require modernization and/or migration at some point.


General Catalyst’s Clark Talks Opportunistic Investing in Tech

We have to balance thematic with what we refer to as opportunistic work. We have to pay attention and engage with companies that get referred to us through our founders and other parts of our network. There are other incubator functions--that is important for us to engage in because we don’t necessarily see everything as we view things thematically. It’s just impossible. We do some of our very best work when we are being more intentional. ... Another area is dynamic workforce, which is a little fuzzy. I fit things like Remote.com, Awardco, and Hopin into these things, as well as things like Loom and Glean where it’s not just the tools end users are using because they are much more project-based than they used to be. Now it’s like, “You’re going to do this project and when that’s done, there’s another one. Maybe you do two at once and the teams you work with are different.” It’s a different system that we’ve put in place. Distributed work is permanent now. We will get back in the office one, two, three days a week -- or not. 


Improving open source software governance stature

The first line of defense against vulnerable open source libraries is to scan a project's dependencies for libraries known to have security vulnerabilities. OWASP Dependency-Check is a tool that returns a report identifying vulnerable dependencies, along with their common vulnerabilities and exposures (CVEs). There are different ways to run OWASP Dependency-Check, such as via a command-line interface, an Apache Maven plugin, an Ant task or a Jenkins plugin, which enables easy integration into any CI/CD pipeline. A tool that creates actionable reports is only as useful as the process enforced around it. Run OWASP Dependency-Check on a consistent schedule so the codebase is scanned against the latest updates of newly discovered CVEs, and dedicate time to plan for remediating the CVEs it identifies. When using open source dependencies, also consider the licenses that govern their use. Licenses for open source projects define how the software may be used, copied and distributed, and depending on the application's software and distribution types, certain open source licenses might not be compatible with how the application is distributed.
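
As a rough illustration of wiring such a scan into a pipeline step, the sketch below shells out to the OWASP Dependency-Check CLI and fails the build when findings exceed a CVSS threshold; the project name, paths, and threshold are assumptions, and the flag names should be verified against your installed CLI version:

    # Sketch of a CI step that runs the OWASP Dependency-Check CLI and
    # propagates a non-zero exit code so the job fails on vulnerable
    # dependencies. Flag names match recent CLI releases but should be
    # checked with `dependency-check.sh --help` for your version.

    import subprocess
    import sys

    def run_dependency_check(project: str, scan_path: str, report_dir: str) -> int:
        cmd = [
            "dependency-check.sh",
            "--project", project,
            "--scan", scan_path,
            "--format", "HTML",
            "--out", report_dir,
            "--failOnCVSS", "7",   # exit non-zero if any CVE scores >= 7.0 (assumed threshold)
        ]
        return subprocess.run(cmd).returncode

    if __name__ == "__main__":
        rc = run_dependency_check("my-service", ".", "dependency-check-report")
        sys.exit(rc)  # a failing scan turns the pipeline red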



Quote for the day:

"Brilliant strategy is the best route to desirable ends with available means." -- Max McKeown

Daily Tech Digest - April 03, 2022

With Identity Management, Start Early for Less Tech Debt

Starting with a robust identity and access management (IAM) solution will give new projects a head start on the competition. Users will have access to more features earlier, and avoiding IAM growing pains also means avoiding tech debt. Any new project has challenges right from the start. Finishing the MVP is a high priority, and planning meetings to outline necessary features and requirements can suffer from scope creep. Every shortcut taken to deliver on time borrows against the future. Tech debt is a known cost, and many startups take on a significant amount. Since any app needs users, planning eventually comes down to the features and structures those users need. Everyone is a user themselves, so it’s easy to come up with a variety of useful features. Single sign-on, social logins and multifactor authentication are all conventional IAM features included in the project scope and planned out for customers. Features and domain knowledge are designed around what the team thinks a user will need. A user’s footprint within your app gets built out in forms and user profile pages, and business data and user data end up stored together.
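
One common way to avoid that last form of debt, sketched below under assumed names, is to reference users only by the stable subject identifier issued by the IAM provider, so business records never absorb credentials or profile data:

    # Hypothetical sketch: keep business records decoupled from identity data
    # by storing only the stable subject ID issued by the IAM provider
    # (e.g., the OIDC "sub" claim). Table and field names are illustrative.

    from dataclasses import dataclass

    @dataclass
    class Order:                # business data: no passwords, emails, or profile fields
        order_id: str
        subject_id: str         # opaque identifier from the identity provider
        total_cents: int

    def handle_checkout(id_token_claims: dict, cart_total_cents: int) -> Order:
        # Authentication (SSO, social login, MFA) already happened at the IdP;
        # the application only consumes verified claims and never stores credentials.
        return Order(
            order_id="ord_123",
            subject_id=id_token_claims["sub"],
            total_cents=cart_total_cents,
        )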


Enterprise Architects Can Be Indispensable in the Boardroom

Data is enterprise currency, and executive management discussions in the boardroom are data-driven. A knowledgeable enterprise architect can show the board how business requirements are translated into technological specifications. EA can supply timely reports on the status of the current application landscape and IT inventory, providing data that addresses crucial boardroom evaluations and decision-making. Use reports to tie EA into business processes during regular meetings. Data can be used to illustrate real issues with simple diagrams and use cases, demonstrating options and concrete results. EA overlays on top of the business model can help boardroom members visualize cost, revenue, risk, and performance metrics to support decisions and track alignment with initiatives. The enterprise architect is the data guru of the boardroom. ... If you want to play the game in the boardroom, you must get to know the players. You need the sponsorship of executives who wield real influence and can promote engagement with EA initiatives.


Europe’s AI Act contains powers to order AI models destroyed or retrained, says legal expert

The European Commission put out its proposal for an AI Act just over a year ago — presenting a framework that prohibits a tiny list of AI use cases, considered too dangerous to people’s safety or EU citizens’ fundamental rights to be allowed, while regulating other uses based on perceived risk — with a subset of “high risk” use cases subject to a regime of both ex ante (before) and ex post (after) market surveillance. In the draft Act, high-risk systems are explicitly defined as: Biometric identification and categorisation of natural persons; Management and operation of critical infrastructure; Education and vocational training; Employment, workers management and access to self-employment; Access to and enjoyment of essential private services and public services and benefits; Law enforcement; Migration, asylum and border control management; Administration of justice and democratic processes. Under the original proposal, almost nothing is banned outright — and most use cases for AI won’t face serious regulation under the Act, as they would be judged to pose “low risk” and so would be largely left to self-regulate — with a voluntary code of standards and a certification scheme to recognize compliant AI systems.


Why a ruling on digital ID by Kenya's Supreme Court has global implications for online privacy

Kenya’s digital ID programme, called the National Integrated Identity Management System (NIIMS), was ruled illegal by the highest court because there was no clear documentation of the data privacy risks, nor was there a clear strategy for measuring, mitigating and dealing with those risks. Related concerns about data privacy and security have arisen in other digital ID platforms as well. For example, India’s Aadhaar is the world’s largest biometric digital ID system. Registration is linked to biometrics and demographics, and can connect to services including SIM cards, bank accounts, and government aid programmes, making financial systems more inclusive. Despite these advantages, Aadhaar has seen pushback regarding feasibility and privacy. ... A major risk surrounding biometrics in particular is that if, and when, an attacker obtains these credentials for a victim, they may be able to impersonate the victim indefinitely, since a user’s biometrics do not change. These risks can be mitigated using emerging technologies like computation over encrypted data with rotating keys. 


Why did AI pioneer Marvin Minsky oppose neural networks?

The Dartmouth Summer Research Project on Artificial Intelligence in 1956 is widely considered the founding moment of artificial intelligence as a field: John McCarthy, Marvin Minsky, Claude Shannon, Ray Solomonoff and others attended the eight-week workshop held in New Hampshire. On the fiftieth anniversary of the conference, the founding fathers of AI returned to Dartmouth. When Minsky took the stage, Salk Institute professor Terry Sejnowski told him some AI researchers view him as the devil for stalling the progress of neural networks. “Are you the devil?” Sejnowski asked. Minsky brushed him off and went on to explain the limitations of neural networks, pointing out that neural networks hadn’t delivered the goods yet. But Sejnowski was persistent. He asked again: “Are you the devil?” A miffed Minsky retorted: “Yes, I am.” Turing Award winner Marvin Minsky made major contributions to cognitive psychology, symbolic mathematics, artificial intelligence, robot manipulation, and computer vision. As an undergraduate student at Harvard, Minsky built SNARC, considered the ‘first neural network’ by many, using over 3000 vacuum tubes and a few components from the B-52 bomber.


Is the Future of Digital Identity Safe?

Although multifactor authentication is crucial for preventing a great percentage of attacks, it is not enough – not in today’s rapidly changing threat landscape. Enterprises need to evolve their identity and access management policy towards a modernized authentication solution. As Uri and I agreed, we need to leverage multiple data layers that allow us to map legitimate behavior against malicious behavior. Not only do we need to examine contextual data like location and device, but we also need to consider behavioral insights and look at micro behaviors such as hesitation, distraction, and rest. With all these data layers, we can then leverage machine learning to aggregate them into a coherent analysis that flags abnormal behavior. Besides enabling artificial intelligence and machine learning to enhance our security posture, it is equally important to consider customer experience. For example, the best authentication tools today rely on mobile applications. What happens if some of your employees cannot use a mobile phone, or are reluctant to install their employer’s app on a personal device?
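
As a hedged illustration of that aggregation step (not any vendor’s actual model), the sketch below turns a few contextual and behavioral signals into a feature vector and flags outliers with an unsupervised detector; the feature choices and threshold are assumptions:

    # Illustrative only: aggregate contextual and behavioral login signals and
    # flag anomalies with scikit-learn's IsolationForest. Feature columns and
    # the contamination rate are assumptions, not a production model.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Columns: [new_device, geo_distance_km, typing_hesitation_ms, session_idle_s]
    historical_logins = np.array([
        [0,   5, 120, 30],
        [0,   8, 110, 45],
        [0,   3, 130, 20],
        [0,  10, 125, 60],
        [0,   6, 115, 40],
    ])

    model = IsolationForest(contamination=0.1, random_state=0).fit(historical_logins)

    candidate = np.array([[1, 4200, 15, 2]])   # new device, far away, no hesitation
    if model.predict(candidate)[0] == -1:       # -1 means "anomaly" in scikit-learn's API
        print("step-up authentication required")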


Metaverses and DAO: Are Crypto Enthusiasts Ready to Usher Them In? 

There are already many who see the metaverse as a tremendous and thrilling possibility. According to many observers, it will open new opportunities for economies, working environments, and new forms of interaction. However, the metaverse, like any technology, requires rigorous research and careful use to be sustainable. Cryptocurrencies came under fire last year over environmental concerns, and the metaverse has to counteract this to come out on top. Some principles underlie the metaverse: data sovereignty, privacy and governance, and honesty, along with a focus on diversity and respect for users. To stay loyal to the metaverse's values, those who work on its future need to follow specific rules, a move that also brings long-term benefits such as environmental sensitivity, social responsibility, and fiscal prudence. The future of the metaverse looks like many different things to different people. For some, the ability to create virtual worlds and draw people in is a lucrative new career. Furthermore, NFTs can be incorporated to give value to virtual space in the metaverse and allow users to earn income.


Application-Layer Encryption Basics for Developers

Consider application-layer encryption when you work across multiple infrastructures and HTTPS only covers a small part of the data flow inside your infrastructure, when the data is sensitive or may travel outside a specific infrastructure and needs an extra layer of protection, and, most importantly, when you need to enforce access control with encryption. For example, think of end-to-end encryption in a chat app: the access control is that the sender and receiver are really the only people who can access that data. That's not enforced just with a bit on a server saying who's allowed to do what; it's enforced through control of cryptographic key material. It's very clear how to use that in chat, but it's actually a generalizable capability that you can use across lots of different types of use cases. As in that use case, application-layer encryption improves privacy, and in some cases it improves privacy substantially. It's also significantly harder for developers than just implementing something like HTTPS.
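
A minimal sketch of field-level, application-layer encryption is shown below using the Python cryptography package's Fernet; in practice the key would be issued and guarded by a KMS, and it is the key distribution, not the database, that enforces who can read the field:

    # Minimal sketch of application-layer (field-level) encryption with the
    # Python "cryptography" package. In production the key would come from a
    # KMS/HSM and be handed only to parties allowed to read the field; that
    # key distribution is what turns encryption into access control.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # in practice: fetched from a KMS, not generated inline
    f = Fernet(key)

    # Encrypt before the value leaves the application boundary...
    ciphertext = f.encrypt(b"patient-ssn: 123-45-6789")
    stored_value = ciphertext        # the database and transport only ever see ciphertext

    # ...and only services holding the key can recover the plaintext.
    plaintext = f.decrypt(stored_value)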


What the media is missing about decentralized autonomous organizations

While we’ve only scratched the surface of the potential DAOs have to create a radically more transparent and equitable financial system, we’ve already seen projects emerge that are delivering real value to real people in the real world today. One example is the war in Ukraine, where UkraineDAO, set up by Russian art collective Pussy Riot and Trippy Labs, raised over $6.75 million worth of Ether (ETH) donated directly to Ukrainian defense efforts against Russia. While this amount may not shift the balance of the war, the rapid creation and scaling-up of UkraineDAO demonstrate the power of decentralized financial technologies to coordinate a disparate global group of individuals around a single cause to deliver tangible results. But, the value of DAOs goes beyond just raising funds for noble causes under duress. In fact, many DAOs are already providing sustainable value to participants across the world and even harnessing blockchain technology to take on some of the most pressing challenges of our time such as climate change. 


The Evolution to Service-Based Networking

As application delivery evolved, orchestrators such as Kubernetes, Mesos and Docker Swarm integrated discovery functionality to reduce the need for those manual scripts. And while that’s great, what does it mean for the evolution of networking? A lot, actually. Networking still needs to be based on service identity because that’s how orchestrators track things, and the shift away from static, IP-based networking toward the service-based networking that these service discovery features provided was perhaps the most impactful change to networking, making application identity the basis for networking policies. The transition to service-identity-based networking also has cascading effects on other workflows. The first, and arguably the most important, is on security. While service discovery may solve for tracking changes more dynamically, it doesn’t help you apply consistent security policies to those applications. As I mentioned earlier, enforcing security and access to sensitive data is a core networking requirement.
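
As a generic sketch (not any specific product's policy engine), the snippet below expresses an allow-list keyed on service identities rather than IP addresses, so the policy survives instances being rescheduled and re-addressed; the service names are invented for illustration:

    # Generic sketch: a network policy keyed on service identity instead of
    # IP addresses, so it stays valid as instances move and addresses change.

    ALLOWED = {
        ("web", "checkout"),        # source service -> destination service
        ("checkout", "payments"),
        ("checkout", "inventory"),
    }

    def connection_allowed(source_service: str, dest_service: str) -> bool:
        return (source_service, dest_service) in ALLOWED

    # The caller resolves an instance's identity (e.g., from its service
    # registration or a workload certificate), never from its IP address.
    print(connection_allowed("web", "payments"))       # False: web may not call payments directly
    print(connection_allowed("checkout", "payments"))  # True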



Quote for the day:

"To make a decision, all you need is authority. To make a good decision, you also need knowledge, experience, and insight." -- Denise Moreland