
Daily Tech Digest - August 20, 2023

Central Bank Digital Currency (CBDC) and blockchain enable the future of payments

CBDC has the potential to transform the future of payments. It can be used to create programmable money that can be spent only on specific things. For example, a government could issue a stimulus package that can only be spent on certain goods and services. This would ensure that the money is spent in the intended manner and would reduce the risk of fraud. CBDC can also improve financial inclusion. According to the World Bank, around 1.7 billion people do not have access to basic financial services. CBDC can address this problem by providing a digital currency that anyone with a smartphone can use, without the need for a bank account. When a CBDC holder uses their phone as a medium for transactions, it becomes crucial to establish a strong link between their digital identity and the device they are using. This link is essential to ensure that the right party is involved in the transaction, mitigating the risk of fraud and promoting trust in the digital financial ecosystem. In this way, CBDC and digital identity can work together to improve financial inclusion.
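The "programmable money" idea above can be made concrete with a small sketch. This is purely illustrative: the class, category names, and rules are assumptions for the example, not part of any real CBDC design.

```python
# Hypothetical sketch of programmable money: a stimulus balance that can
# only be spent at merchants in approved categories. All names and rules
# here are illustrative, not part of any real CBDC system.

APPROVED_CATEGORIES = {"groceries", "utilities", "public_transport"}

class StimulusWallet:
    def __init__(self, balance):
        self.balance = balance

    def spend(self, amount, merchant_category):
        # Reject spending outside the approved categories or beyond the balance.
        if merchant_category not in APPROVED_CATEGORIES:
            raise ValueError(f"category '{merchant_category}' not eligible")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

wallet = StimulusWallet(100)
wallet.spend(30, "groceries")      # allowed by the spending rule
try:
    wallet.spend(10, "gambling")   # rejected by the spending rule
except ValueError as err:
    print(err)
```

In a real deployment the restriction would be enforced by the currency's ledger rather than by the wallet software, but the effect is the same: the money carries its own spending rules.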


A statistical examination of utilization trends in decentralized applications

Decentralized applications (dApps) have proliferated in recent years, but their long-term viability is a topic of debate. For dApps to be sustainable and suitable for integration into larger service networks, they need to attract users and promise reliable availability; assessing their longevity is therefore crucial. Analyzing the utilization trajectory of a service is, however, challenging due to several factors, such as demand spikes, noise, autocorrelation, and non-stationarity. In this study, we employ robust statistical techniques to identify trends in currently popular dApps. Our findings demonstrate that a significant proportion of dApps, across a range of categories, exhibit statistically significant positive overall trends, indicating that success in decentralized computing can be sustainable and transcends specific fields. However, a substantial number of dApps show negative trends, with a disproportionately high share coming from the decentralized finance (DeFi) category.


How SaaS Companies Can Monetize Generative AI

Rather than building these models from scratch, many companies elect to leverage OpenAI’s APIs to call GPT-4 (or other models), and serve the response back to customers. To obtain complete visibility into usage costs and margins, each API call to and from OpenAI tech should be metered to understand the size of the input and the corresponding backend costs, as well as the output, processing time and other relevant performance metrics. By metering both the customer-facing output and the corresponding backend actions, companies can create a real-time view into business KPIs like margin and costs, as well as technical KPIs like service performance and overall traffic. After creating the meters, deploy them to the solution or application where events are originating to begin tracking real-time usage. Once the metering infrastructure is deployed, begin visualizing usage and costs in real time as usage occurs and customers leverage the generative services. Identify power users and lagging accounts and empower customer-facing teams with contextual data to provide value at every touchpoint.
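The metering workflow described above can be sketched in a few lines. The price constants, class, and field names below are placeholders invented for this example; real per-token prices and token counts would come from the model provider's pricing page and API response.

```python
import time
from dataclasses import dataclass, field

# Illustrative sketch of per-call metering for an LLM-backed feature.
# The price constants below are placeholders, not real OpenAI rates.
PRICE_PER_1K_INPUT = 0.03
PRICE_PER_1K_OUTPUT = 0.06

@dataclass
class Meter:
    events: list = field(default_factory=list)

    def record(self, customer, input_tokens, output_tokens, latency_s):
        # One event per API call: size of the input, size of the output,
        # processing time, and the backend cost derived from token counts.
        cost = (input_tokens * PRICE_PER_1K_INPUT +
                output_tokens * PRICE_PER_1K_OUTPUT) / 1000
        self.events.append({"customer": customer,
                            "input_tokens": input_tokens,
                            "output_tokens": output_tokens,
                            "latency_s": latency_s,
                            "cost": cost})

    def cost_by_customer(self):
        # Roll events up into a per-customer cost view (a business KPI).
        totals = {}
        for e in self.events:
            totals[e["customer"]] = totals.get(e["customer"], 0.0) + e["cost"]
        return totals

meter = Meter()
start = time.monotonic()
# ... call the model here and capture token counts from its response ...
meter.record("acme", input_tokens=1200, output_tokens=400,
             latency_s=time.monotonic() - start)
print(meter.cost_by_customer())
```

Aggregating these events over time is what feeds the real-time margin, cost, and traffic dashboards the article describes.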


“Auth” Demystified: Authentication vs Authorization

There are two technical approaches to modern authorization, each with a growing ecosystem around it: policy-as-code and policy-as-data. They are similar in that both approaches advocate decoupling authorization logic from the application code, but they also have differences. In policy-as-code systems, the authorization policy is written in a domain-specific language, and stored and versioned in its own repository like any other code. OPA is one well-known example of this approach. It is a CNCF-graduated project that is mostly used in infrastructure authorization use cases, such as Kubernetes admission control. It provides a general-purpose decision engine to enforce authorization logic, and a language called Rego to define that logic as policy. The policy-as-data approach determines access based on relationships between users and the underlying application data. Rather than relying on rules in a policy, these systems use the relationships between subjects (users/groups) and objects (resources) in the application.
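The policy-as-data idea can be shown with a minimal sketch: access is derived from stored relationship tuples rather than from rules in a policy file. The tuples, relation names, and permission mapping below are invented for illustration.

```python
# Minimal sketch of the policy-as-data (relationship-based) approach:
# access is decided by looking up (subject, relation, object) tuples
# stored alongside application data. All names here are illustrative.

relations = {
    ("alice", "owner", "doc:readme"),
    ("bob", "viewer", "doc:readme"),
}

# Which stored relations grant which permission.
PERMISSIONS = {
    "read": {"viewer", "editor", "owner"},
    "write": {"editor", "owner"},
}

def check(subject, permission, obj):
    # Authorized if any relation granting the permission exists in the data.
    return any((subject, rel, obj) in relations
               for rel in PERMISSIONS[permission])

print(check("bob", "read", "doc:readme"))    # True: viewers can read
print(check("bob", "write", "doc:readme"))   # False: viewers cannot write
```

Note the contrast with policy-as-code: here nothing resembling a rule language is evaluated at decision time; changing who can do what means writing or deleting tuples, not editing policy text.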


Redefining Software Resilience: The Era of Artificial Immune Systems

Artificial Immune Systems (AIS), inspired by the vertebrate immune system, provide an innovative approach to designing self-healing software. By emulating the biological immune system's ability to adapt, learn, and remember, AIS can empower software systems to detect, diagnose, and fix issues autonomously. AIS offers a framework that enables the software to learn from each interaction, adapt to system changes, and remember past faults and their resolutions, leading to a more robust, resilient system capable of tackling an array of unpredictable errors and vulnerabilities. The vertebrate immune system consists of innate immunity and adaptive immunity. Innate immunity protects us against known pathogens; it is non-specific and general, and present self-healing software models closely resemble it. Adaptive immunity learns from current threats and applies that knowledge to future situations. At their core, AIS mimic the vertebrate immune system's differentiation of self and non-self entities.
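One classical AIS technique that embodies this self/non-self distinction is negative selection: random detectors are kept only if they do not match normal ("self") behavior, and anything a surviving detector matches is flagged as non-self. The matching rule and the binary "behavior signatures" below are simplified assumptions for the sketch.

```python
import random

# Sketch of negative selection, a classic Artificial Immune Systems
# technique: detectors are kept only if they do NOT match normal ("self")
# behavior; anything a surviving detector matches is treated as anomalous.

def matches(detector, sample, threshold=3):
    # Simplified matching rule: a detector matches a sample if they agree
    # on `threshold` consecutive positions.
    run = 0
    for d, s in zip(detector, sample):
        run = run + 1 if d == s else 0
        if run >= threshold:
            return True
    return False

def generate_detectors(self_set, n, length, seed=0):
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n:
        candidate = "".join(rng.choice("01") for _ in range(length))
        # Negative selection: discard candidates that match any self string.
        if not any(matches(candidate, s) for s in self_set):
            detectors.append(candidate)
    return detectors

def is_anomalous(sample, detectors):
    return any(matches(d, sample) for d in detectors)

self_set = ["010101", "010110"]   # normal behavior signatures (illustrative)
detectors = generate_detectors(self_set, n=20, length=6)
print(is_anomalous("010101", detectors))   # False: self is never flagged
```

Because detectors are built to avoid everything in the self set, normal behavior is guaranteed not to trigger alarms; the trade-off, as in adaptive immunity, is coverage of the non-self space.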


Europe’s Business Software Startups Prove Resilient: Why?

So what are the factors underpinning the resilience of Europe’s business software sector? One key element of the picture is demand from other tech companies. “Europe’s tech ecosystem is maturing,” says Windsor. “And as the sector matures, companies need tools. Those tools are being supplied by business software companies.” And of course, there is demand from companies outside the tech sector. From banking and financial services to manufacturing, digital transformation is continuing across the economy as a whole, creating opportunities for new B2B software providers. But how do European companies take advantage of those opportunities in a market that has been dominated by North American rivals? This isn’t captured in the data, but Windsor sees a home-market-first approach, widening out to include new countries and territories as businesses grow. “Anecdotally, companies start by selling to their domestic market, then they look at the continent. After that, they expand to other regions.” There is, Windsor adds, a preference for the Asia-Pacific region. The U.S., on the other hand, remains a difficult market.


Open RAN Testing Expands in the US Amid 5G Slowdown

To be clear, open RAN technology in the US has a number of backers. Dish Network is perhaps the most vocal, having built an open RAN-based 5G network covering 70% of the US population. Further, other operators have hinted at their own initial open RAN aspirations, including AT&T and Verizon. Interestingly, the US government has also emerged as a leading proponent of open RAN. For example, the US military continues to fund open RAN tests and deployments. And the Biden administration's NTIA is doling out millions of dollars in the pursuit of open RAN momentum. Broadly, US officials hope to use open RAN technologies to encourage the production of 5G equipment domestically and among US allies, as a lever against China. But open RAN continues to face struggles. For example, US-based open RAN vendors like Airspan and Parallel Wireless have hit hurdles recently. And research and consulting firm Dell'Oro recently reported that open RAN revenue growth slowed to the 10 to 20% range in the first quarter, after more than doubling in 2022.


Low-Code and AI: Friends or Foes?

Although it may appear that AI will replace low-code, there are actually many opportunities for symbiosis between the two. Rather than eradicating low-code platforms entirely, LLMs will likely become more embedded within them. We’ve already seen this occur as low-code providers like Mendix and OutSystems integrated ChatGPT connectors. Microsoft has also embedded ChatGPT into its Power Platform and integrated GPT-driven Copilots into various developer environments. “Low-code and AI on their own are powerful tools to increase enterprise efficiency and productivity,” said Dinesh Varadharajan, the chief product officer at Kissflow. “But there is potential for the combination of both to unlock game-changing automation for almost every industry. The power comes from the congruence between low-code/no-code and AI.” There is also the opportunity to train bespoke LLMs on the inner workings of specific software development platforms, which could generate fully built templates from natural-language prompts.

Cloud cost optimization should begin by measuring the drivers of cloud spend at a granular level and then providing full visibility to the teams and organizations that are behind the spend, says Tim Potter, principal, technology strategy and cloud engineering with Deloitte Consulting. “Near-real-time dashboards showing cloud resource utilization, routine reports of cloud consumption, and predictive spend reports will provide application teams and business units with the data needed to take action to optimize cloud costs,” he notes. ... Rearchitecting applications is a frequently overlooked way to achieve the cost and other benefits of transitioning to a cloud model. “Organizations also need to understand the various discount models and select one that optimizes costs yet also provides flexibility and predictability into spending,” says Mindy Cancila, vice president of corporate strategy for Dell Technologies. Cancila adds that organizations should not only consider current workload costs, but also how to manage costs for workloads as they scale over time.


Warning: Attackers Abusing Legitimate Internet Services

Cloud storage platforms, and Google Cloud in particular, are the most exploited, followed by messaging services - most often Telegram, including via its API - as well as email services and social media, the researchers found. Examples of other services being abused by attackers include OneDrive, Discord, Gmail SMTP, Mastodon profiles, GitHub, bitcoin blockchain data, the project management tool Notion, the malware analysis site VirusTotal, YouTube comments and even profiles on the movie review site Rotten Tomatoes. "It is important to note that ransomware campaigns use legitimate cloud storage tools such as mega.io or MegaSync for exfiltration purposes as well," the report says, although the crypto-locking malware itself may not be coded to work directly with legitimate tools. Criminals' choice of service depends on desired functionality. Anyone using an info stealer such as Vidar needs a place to store large amounts of exfiltrated data. The researchers said cloud services' easy setup for less technically sophisticated users makes them a natural fit for such use cases.



Quote for the day:

"We're all passionate about something, the secret is to figure out what it is, then pursue it with all our hearts" -- Gordon Tredgold

Daily Tech Digest - July 16, 2023

The engines of AI: Machine learning algorithms explained

Machine learning algorithms train on data to find the best set of weights for each independent variable that affects the predicted value or class. The algorithms themselves have variables, called hyperparameters. They’re called hyperparameters, as opposed to parameters, because they control the operation of the algorithm rather than the weights being determined. The most important hyperparameter is often the learning rate, which determines the step size used when finding the next set of weights to try when optimizing. If the learning rate is too high, gradient descent may overshoot the minimum and oscillate or diverge, or settle on a suboptimal point; if it is too low, descent may proceed so slowly that it never completely converges. Many other common hyperparameters depend on the algorithms used. Most algorithms have stopping parameters, such as the maximum number of epochs, the maximum time to run, or the minimum improvement from epoch to epoch. Specific algorithms have hyperparameters that control the shape of their search.
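The effect of the learning rate, along with two of the stopping parameters mentioned above (maximum epochs and minimum improvement), can be seen on a toy one-dimensional problem. The function and rates below are chosen only for illustration.

```python
# Gradient descent on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3),
# illustrating the learning-rate hyperparameter and two stopping parameters.

def gradient_descent(lr, w=0.0, max_epochs=100, tol=1e-6):
    for epoch in range(max_epochs):     # stopping parameter: max epochs
        grad = 2 * (w - 3)
        step = lr * grad
        if abs(step) < tol:             # stopping parameter: min improvement
            break
        w -= step
    return w

print(gradient_descent(lr=0.1))   # converges close to the optimum w = 3
print(gradient_descent(lr=1.1))   # too high: iterates overshoot and diverge
```

With lr=0.1 the distance to the optimum shrinks by a factor of 0.8 each epoch; with lr=1.1 each step overshoots so far that the distance grows by a factor of 1.2, so the run never converges.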


How to Build a Cyber-Resilient Company From Day One

Despite your best proactive measures, some cyber threats will infiltrate your defenses. Reactive defenses, such as firewalls and antivirus software, help to minimize damage when these incidents occur. Firewalls monitor and control incoming and outgoing network traffic based on predetermined security rules, forming the first line of defense against cyber threats. Antivirus software complements firewalls by detecting, preventing and removing malicious software. Intrusion Detection and Prevention Systems (IDS/IPS) monitor your network for suspicious activities and potential threats, alerting you to a potential attack and, in some cases, taking action to mitigate the threat. Encryption is another valuable reactive measure that involves making your sensitive data unreadable to anyone without the appropriate decryption key, thus protecting it even if it falls into the wrong hands. Security Information and Event Management (SIEM) systems provide real-time analysis and reporting of security alerts generated by applications and network hardware. They help detect incidents early and respond promptly.
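The "predetermined security rules" a firewall applies can be pictured as a first-match rule table. The rules, ports, and actions below are invented for illustration and are far simpler than a real firewall's configuration.

```python
# Toy sketch of first-match firewall filtering: each rule names a traffic
# direction, an optional port, and an action; the first rule that matches
# a packet decides its fate. Rules here are illustrative only.

RULES = [
    {"direction": "inbound",  "port": 22,   "action": "allow"},  # SSH
    {"direction": "inbound",  "port": 443,  "action": "allow"},  # HTTPS
    {"direction": "inbound",  "port": None, "action": "deny"},   # default deny in
    {"direction": "outbound", "port": None, "action": "allow"},  # default allow out
]

def filter_packet(direction, port):
    for rule in RULES:
        # A rule with port None matches any port in its direction.
        if rule["direction"] == direction and rule["port"] in (None, port):
            return rule["action"]
    return "deny"  # fail closed if no rule matches

print(filter_packet("inbound", 443))    # allow
print(filter_packet("inbound", 8080))   # deny
```

The ordering matters: the catch-all "default deny" must come after the specific allow rules, which is why rule order is a common source of firewall misconfiguration.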


Quantum Algorithms vs. Quantum-Inspired Algorithms

Quantum-inspired algorithms usually refer to one of two things: (i) classical algorithms based on linear algebra methods, often known as tensor networks, that were developed in the recent past; or (ii) methods that attempt to use a classical computer to simulate the behavior of a quantum computer, thus letting the classical machine run algorithms that exploit the same laws of quantum mechanics that benefit real quantum computers. On (i), while the physics community has leveraged these methods to address problems in quantum mechanics since the 70s [Penrose], tensor networks also have an independent origin in neuroscience as far back as the 80s, since there is nothing really quantum behind them; it really is just linear algebra. For (ii), the process of emulating a quantum system runs into the limitations of classical hardware. It is very hard to classically emulate the full dynamics of a large quantum system for the exact same reasons that one wants to actually build a real one! So, does this mean that quantum-inspired algorithms are bogus? Not really.
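The "just linear algebra" point can be made concrete: contracting a chain of tensors, as in a matrix-product-state calculation, is nothing more than repeated matrix multiplication. The tiny matrices below are arbitrary stand-ins for per-site tensors; real tensor-network code would use optimized libraries, but the operations are the same.

```python
# "Just linear algebra": contracting a tensor-network chain left to right
# is repeated matrix multiplication. Pure-Python multiply for small matrices.

def matmul(a, b):
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][t] * b[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

# A "train" of small matrices standing in for per-site tensors.
chain = [
    [[1, 2], [0, 1]],
    [[1, 0], [3, 1]],
    [[2, 1], [1, 1]],
]

result = chain[0]
for site in chain[1:]:
    result = matmul(result, site)   # contract one bond at a time
print(result)
```

Contracting the chain pairwise keeps every intermediate object small, which is precisely the trick that lets tensor-network methods handle systems whose full state would be exponentially large.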


Operator survey: 5G services require network automation

"Private 5G" and "network slicing" rank second and third, respectively. Heavy Reading expects their importance and popularity to increase as additional operators deploy 5G SA and can support full autonomy. "Performance SLAs for enterprise services" is currently the lowest ranking (fifth) of all service choices but is likely to be a valuable market, especially for network slicing and private 5G. "Connected devices (e.g., cars, watches, other IoT devices)" ranks just above performance SLAs in fourth. Internet of Things (IoT) is a sizeable market within 4G, but the massive machine-type communications (mMTC) use case has yet to be realized in 5G, as technologies such as RedCap remain underdeveloped. Smaller operators have a different opinion from larger operators on the revenue growth question. For mobile operators with less than 9 million subscribers, private 5G ranks first. This result perhaps indicates that smaller operators feel they are already exploiting eMBB services and see little scope for further revenue growth with SA.


Top 5 Features your ITSM Solution Should Have

Addressing the root causes of recurring incidents and preventing them from happening again is the core of what a problem management module is designed for. Robust problem management functionality helps investigate, analyze, and identify underlying causes, leading to effective problem resolution. A reliable ITSM solution should include features such as root cause analysis, trend identification, and proactive problem identification. This should provide a structured approach to change requests, reduce the impact of incidents, and improve the overall stability of your IT environment. A comprehensive knowledge management system is a necessary asset for any IT service desk. It serves as a centralized repository of information, providing users with self-help resources, troubleshooting guides, and best practices from within the organization. A well-organized and searchable knowledge base allows users to access relevant articles and documentation for independent issue resolution. Knowledge bases reduce reliance on IT support and enable faster problem resolution. When choosing an ITSM solution with a knowledge base, look for user-friendly interfaces, easy personalization, and collaborative features.


No cyber resilience without open source sustainability

Open source sustainability is a problem: maintainers of popular software projects are often overwhelmed by issues and pull requests to the point of burnout. Donations have emerged as one solution, and are regularly provided by governments, foundations, companies, and individuals. Yet, as excerpts of recent drafts indicate, the CRA could threaten to undermine sustainability by introducing a burdensome compliance regime and potential penalties if a maintainer decides to accept donations. The result will be fewer resources flowing to already under-resourced maintainers. Open source projects are often multi-stakeholder: they receive contributions from developers building as individuals, volunteering in foundations, and working for companies large and small. The current text would regulate open source projects unless they have “a fully decentralised development model.” Any project where a corporate employee has commit rights would need to comply with CRA obligations. This turns the win-wins of open source on their head. Projects may ban maintainers or even contributors from companies, and companies may ban their employees from contributing to open source at all.


Building Trust in a Trustless World: Decentralized Applications Unveiled

In a DApp, smart contracts store the program code and state of the application, replacing the traditional server-side component of a regular application. However, there are some important differences to consider. Computation in smart contracts can be costly, so it's crucial to keep it minimal. It's also essential to identify which parts of the application require a trusted and decentralized execution platform. With Ethereum smart contracts, you can create architectures where multiple smart contracts interact with each other, exchanging data and updating their own variables; the complexity is limited only by the block gas limit. Once you deploy your smart contract, other developers may use your business logic in the future. There are two key considerations when designing smart contract architecture. First, once a smart contract is deployed, its code cannot be changed in any way; the only exception is complete removal, which is possible only if the contract was programmed with an accessible SELFDESTRUCT opcode.


How the upcoming Cyber Resilience Act will impact privacy

The Cyber Resilience Act has several positive implications for privacy. First, by enforcing strict standards of cybersecurity in the development and production of new devices, the Act creates an ecosystem where security is ingrained in the product development cycle. Second, by creating reporting obligations, the Act ensures that vulnerabilities are addressed promptly, reducing the risk of personal data breaches and protecting the privacy of individuals. Third, the Act empowers consumers by ensuring they are informed about the vulnerabilities in their devices and the measures they can take to protect their personal data. From the perspective of data controllers, particularly those who serve as manufacturers of devices regulated by the Act, compliance requirements are raised to an even higher threshold. ... Additionally, they will have to comply with reporting obligations regarding vulnerabilities, even those that have already been fixed, regardless of whether personal data was affected. Neglecting to fix known vulnerabilities may also result in reputational consequences for data controllers.


Crafting a cybersecurity resilience strategy: A comprehensive IT roadmap

In recent years, there has been a significant increase in the demand for cybersecurity professionals due to the growing importance of protecting sensitive information and systems from cyber threats. Organizations are allocating larger budgets to enhance their cybersecurity measures, resulting in a surge in the number of job opportunities in this field. According to the latest Cyber Security Report by Michael Page, companies are actively seeking skilled cybersecurity talent to address their security challenges. The report reveals that globally, more than 3.5 million cybersecurity jobs are expected to remain unfilled in 2023 due to a shortage of qualified professionals. This shortage has created a sense of desperation among companies, as they struggle to find suitable candidates to fill these critical roles. India is projected to have over 1.5 million vacant cybersecurity positions by 2025, underscoring the immense potential for career growth in this field. To effectively address the ever-changing risks of digitalization and increasing cyberthreats, it is crucial for organizations to implement a continuous security program. 


The rise of OT cybersecurity threats

There is a need for a separate security program for OT that includes different tools, governance, and processes. Companies can’t simply extend their IT security program to OT, as the differences between the two domains are too great. It may require two security operation centers (SOCs), which adds to the complexity and costs of cybersecurity management. Bellack explains that some CEOs or CIOs underestimate the risks associated with an OT attack. “It’s a relatively new set of risks and a lot of executives don’t understand that they are indeed in danger,” Bellack says. “Companies build smarter, faster, cheaper factories using digital technologies because it’s great for business. But it also expands their attack surface, and many people in charge don’t realize the impacts or what they need to do to protect themselves.” ... “Machines are components in a complex, revenue producing infrastructure that is a mix of physical, digital, and human elements. Safety and availability are the key focus, and security is sometimes forced to take a back seat if either of those may be compromised,” explains Boals.



Quote for the day:

"Practice isn't the thing you do once you're good. It's the thing you do that makes you good." -- Malcolm Gladwell