Daily Tech Digest - March 26, 2023

What Is Decentralized Identity?

Decentralized identities are not hosted on centralized servers by large entities such as Google or Meta Platforms (formerly Facebook). Instead, they are often hosted on decentralized file-sharing platforms such as the InterPlanetary File System (IPFS). These open-source protocols store data on decentralized networks that are difficult to shut down and give users ownership of their online data. In addition, decentralized identities share information with other parties only when and if the user chooses. This means that, unlike with centralized identities, personal data cannot be stored or shared without the user's knowledge or consent. According to the Ethereum Foundation, decentralized identities can serve many purposes: as a universal login that reduces the need for separate usernames and passwords, as a way to streamline know-your-customer (KYC) checks, and as a foundation for online communities free of bots and fake accounts.
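The property that makes IPFS-hosted data tamper-evident is content addressing: an object's identifier is derived from a hash of its bytes, so changing the data changes its address. The sketch below illustrates that principle with a hypothetical DID document; real IPFS CIDs use multihash encoding rather than a bare SHA-256 digest, and the document fields here are invented for illustration.

```python
import hashlib
import json

def content_address(document: dict) -> str:
    """Derive a stable content identifier from a document's bytes.

    Real IPFS CIDs are multihash-encoded; a bare SHA-256 hex digest
    is used here purely to show the principle.
    """
    canonical = json.dumps(document, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# Hypothetical decentralized identity (DID) document.
did_document = {
    "id": "did:example:123",
    "authentication": ["did:example:123#key-1"],
}

cid = content_address(did_document)

# Any change to the document yields a different address,
# which is what makes tampering detectable.
tampered = dict(did_document, id="did:example:456")
assert content_address(tampered) != cid
```

Because the address is a function of the content, anyone holding the identifier can verify that the document they fetched is the one the owner published.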


Why we need to care about responsible AI in the age of the algorithm

The rapid pace of AI development does not appear to be slowing down. Breakthroughs come fast, quickly outpacing regulation. In the past year alone, we have seen a range of developments, from deep learning models that generate images from text to large language models capable of answering nearly any question you can think of. Although the progress is impressive, keeping pace with the potential harms of each new breakthrough is a relentless challenge. The trouble is that many companies cannot even see that they have a problem to begin with, according to a report released by MIT Sloan Management Review and Boston Consulting Group. ... Responsible AI is more than a check-box exercise or the development of an add-on feature. Organizations will need to make substantial structural changes in anticipation of AI implementation to ensure that their automated systems operate within legal, internal and ethical boundaries.


Uncovering new opportunities with edge AI

Edge AI and edge ML present unique and complex challenges that require the careful orchestration and involvement of many stakeholders with a wide range of expertise, from systems integration, design, operations and logistics to embedded, data, IT and ML engineering. Edge AI implies that algorithms must run on purpose-specific hardware, ranging from gateways or on-prem servers on the high end to energy-harvesting sensors and MCUs on the low end. Ensuring the success of such products and applications requires that data and ML teams work closely with product and hardware teams to understand and consider each other's needs, constraints and requirements. While the challenges of building a bespoke edge AI solution aren't insurmountable, platforms for edge AI algorithm development exist that can help bridge the gap between the necessary teams, ensure higher levels of success in a shorter period of time, and validate where further investment should be made.
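Fitting a model onto an MCU-class device usually means quantizing its weights to small integers. As a minimal sketch of one common technique (symmetric int8 quantization), assuming nothing about any particular edge toolchain:

```python
def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Symmetric int8 quantization: map floats into [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize(quantized: list[int], scale: float) -> list[float]:
    """Recover approximate float weights from int8 values."""
    return [q * scale for q in quantized]

# Toy weight vector standing in for one layer of a model.
weights = [0.82, -1.27, 0.05, 0.64]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Quantization error is bounded by half a quantization step.
assert all(abs(a - b) <= scale / 2 + 1e-9 for a, b in zip(weights, restored))
```

The payoff is a 4x smaller weight footprint (int8 vs float32) and integer arithmetic that low-end MCUs can execute efficiently, at the cost of a bounded precision loss.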


IT Automation vs. Orchestration: What's the Difference?

IT automation refers to the use of technology to automate tasks and processes that would otherwise be done by someone on your team. This includes everything from communication to security tasks. Today, the appeal of this automation is greater than it has ever been in the corporate world. One study shows that more than 30% of organizations have five or more departments that automate tasks. ... Orchestration coordinates tasks and processes into workflows: it is the process of automating and managing the end-to-end flow of IT services, from initial request to final delivery. This can include everything from provisioning new servers to deploying applications and monitoring performance. The benefits of orchestration are similar to those of IT automation, but they extend beyond simple task execution. Orchestration enables organizations to coordinate and manage complex workflows across multiple systems, tools, and teams, improving efficiency and reducing the chance of errors at a larger scale.
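The distinction can be sketched in a few lines: each function below stands in for one automated task, and the orchestrator is the layer that wires those automations into the end-to-end workflow the excerpt describes (provision, then deploy, then monitor). The task names and server name are made up for illustration; real orchestration tools add retries, rollback, and approvals on top of this ordering.

```python
# Each function stands in for a single automated task.
def provision_server(name: str) -> str:
    return f"{name}: provisioned"

def deploy_application(name: str) -> str:
    return f"{name}: deployed"

def monitor_performance(name: str) -> str:
    return f"{name}: monitoring"

def orchestrate(name: str, steps) -> list[str]:
    """Coordinate individual automations into one ordered workflow."""
    results = []
    for step in steps:
        results.append(step(name))
    return results

workflow = [provision_server, deploy_application, monitor_performance]
log = orchestrate("web-01", workflow)
```

Automation is the individual functions; orchestration is `orchestrate` deciding what runs, in what order, against which target.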


Critical flaw in AI testing framework MLflow can lead to server and data compromise

MLflow is written in Python and is designed to automate machine-learning workflows. It has multiple components that allow users to deploy models from various ML libraries; manage their lifecycle including model versioning, stage transitions and annotations; track experiments to record and compare parameters and results; and even package ML code in a reproducible form to share with other data scientists. MLflow can be controlled through a REST API and command-line interface. All these capabilities make the framework a valuable tool for any organization experimenting with machine learning. Scans using the Shodan search engine reinforce this, showing a steady increase of publicly exposed MLflow instances over the past two years, with the current count sitting at over 800. However, it's safe to assume that many more MLflow deployments exist inside internal networks and could be reachable by attackers who gain access to those networks.
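The REST API mentioned above is plain HTTP, which is also why exposed instances are so easy to find. The sketch below builds a request against the `experiments/search` endpoint of a tracking server assumed to be at `localhost:5000` (both the address and the use of that endpoint are assumptions; check your MLflow version's API reference), and only sends it if something is actually listening:

```python
import json
import urllib.request
from urllib.error import URLError

# Assumed address of a local MLflow tracking server.
TRACKING_URI = "http://localhost:5000"

def search_experiments_request(base: str) -> urllib.request.Request:
    """Build (but do not send) a REST call listing experiments."""
    body = json.dumps({"max_results": 10}).encode("utf-8")
    return urllib.request.Request(
        f"{base}/api/2.0/mlflow/experiments/search",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = search_experiments_request(TRACKING_URI)

try:
    with urllib.request.urlopen(req, timeout=2) as resp:
        print(json.load(resp))
except URLError:
    print("no tracking server reachable at", TRACKING_URI)
```

By default this API requires no authentication, which is exactly what makes an internet-facing instance a liability: anyone who can reach the port can enumerate experiments and models the same way.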


Can Security Keep Up With ChatGPT Evolutions?

As with most technological developments, there are two sides of the coin. ChatGPT may present businesses with a never-ending pool of opportunities, but the same resource is available to those with more malicious intent. While ChatGPT itself cannot be directly targeted by cybersecurity threats like malware, hacking or phishing, it can be exploited to help criminals infiltrate systems more effectively. The platform's developers have taken steps to try to reduce this as much as possible, but it takes just one attacker wording their question in the right way to get the desired response. The best example here is phishing. Asking the platform to generate a phishing template directly will result in the chatbot refusing. However, if someone with malicious intent rewords their question ever so slightly, the AI may not detect any issue. For example, if you ask it to create a 'gophish' template, it will comply. The advanced capabilities of ChatGPT throw up several red flags for security teams, but it isn't time to hit the doomsday button just yet.


Creating Strong ROI for Multi-Cloud Solutions Through Compliance & Security

When it comes to budget, storing data in the cloud eliminates the need to pay upfront for physical hardware and services. Predictable subscription service fees without capital expenses mean organizations can lower their overall costs and invest the savings in other areas that drive innovation. Take, for example, a healthcare organization that moves its critical on-premises infrastructure into the cloud. In doing so, the organization immediately saves enough on its capital expense budget to add much-needed additional healthcare staff ready to serve patients. With regard to gaining intelligence, the data that can be gathered in a single or multi-cloud environment makes it far easier to analyze and gain actionable insights that would otherwise be unavailable. This level of data-driven analytics and intelligence is powerful, as it can be directly applied to customer service and operational performance improvements. Multi-cloud solutions also make scaling up and down to meet demand extremely simple and efficient.


6 Myths About Leadership That May Be Holding You Back

While it is true that leaders often hold positions of authority and are responsible for making important decisions, leadership is not limited to those in formal leadership positions. Leadership can be demonstrated by anyone who takes the initiative, inspires others and creates positive change, regardless of their official role or title. Some of the most influential leaders do not hold formal leadership positions but still manage to influence others and make a difference. ... True leaders often face uncertain and unpredictable situations and may not always have all the answers. In these situations, it's natural for a leader to feel some degree of uncertainty or doubt. The key difference between a leader and someone who merely appears confident is that a leader can acknowledge their limitations and vulnerabilities while still maintaining their focus and determination. They are not afraid to ask for help or admit when they don't know something. Leaders who are open and honest about their struggles can inspire greater trust and respect from their team.


API Gateways: The Doorway to a Microservices World

While microservices are beneficial, they also create significant new challenges. These challenges include:

- Increased complexity: A microservices architecture introduces additional complexity, as each service needs to communicate with other services through well-defined interfaces. This can result in increased development and management overhead, as well as challenges with testing and debugging.
- Distributed systems management: A microservices architecture is a distributed system, which means it can be challenging to monitor and manage individual services, especially when there are multiple instances of the same service running in different environments.
- Data consistency: Maintaining data consistency across multiple services can be challenging, as changes to one service can impact other services that depend on that data. This requires careful planning and management to ensure that data remains consistent and up-to-date across the system.
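An API gateway addresses part of that complexity by giving clients one stable entry point and hiding service topology behind it. Its core job, routing by path prefix, can be sketched in a few lines; the service names and addresses below are invented for illustration, and real gateways layer authentication, rate limiting, and retries on top of this lookup.

```python
# Hypothetical mapping of public path prefixes to internal services.
ROUTES = {
    "/users": "http://user-service:8080",
    "/orders": "http://order-service:8080",
    "/inventory": "http://inventory-service:8080",
}

def route(path: str) -> str:
    """Resolve an incoming request path to a backend service URL."""
    for prefix, backend in ROUTES.items():
        if path == prefix or path.startswith(prefix + "/"):
            return backend + path
    raise LookupError(f"no service registered for {path}")

# Clients see one doorway; services stay independently deployable.
target = route("/orders/42")
```

Because clients only ever talk to the gateway, services can be split, moved, or scaled behind it without any client-visible change, which is precisely what softens the distributed-systems-management challenge above.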


An open data lakehouse will maintain and grow the value of your data

So here's how to take advantage of all the data flowing through your organization's digital transformation pipelines and bring together open-source systems and the cloud to maximize the utility of that data: use an open data lakehouse designed to meld the best of data warehouses with the best of data lakes. That means storage for any data type, suitable for both data analytics and ML workloads, cost-effective, fast, flexible and with a governance or management layer that provides the reliability, consistency and security needed for enterprise operations. Keeping it "open" (using open-source technologies and standards like PrestoDB, Parquet and Apache Hudi) not only saves money on license costs, but also gives your organization the reassurance that the technology backing these critical systems is being continuously developed by companies that use it in production and at scale. And as technology advances, so will your infrastructure. Remember, you've already invested mightily in data transformation initiatives to remain competitively nimble and power your long-term success.
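The warehouse-like reliability that table formats such as Apache Hudi add on top of a data lake comes from key-based upserts and a commit timeline. The toy class below illustrates just that idea in plain Python; it is a conceptual sketch only, and real formats (Hudi, Iceberg, Delta Lake) persist these snapshots as Parquet files and metadata on object storage rather than in memory.

```python
import copy

class ToyLakehouseTable:
    """Conceptual sketch of commit-versioned upserts on a table."""

    def __init__(self):
        self.snapshots = []   # one immutable snapshot per commit
        self._live = {}       # record key -> current row

    def upsert(self, rows: list[dict]) -> int:
        """Insert or update rows by key, producing a new commit."""
        for row in rows:
            self._live[row["id"]] = row
        self.snapshots.append(copy.deepcopy(self._live))
        return len(self.snapshots) - 1   # commit number

    def read(self, commit: int = -1) -> list[dict]:
        """Query the table as of a given commit (time travel)."""
        return list(self.snapshots[commit].values())

table = ToyLakehouseTable()
c0 = table.upsert([{"id": 1, "status": "new"}])
c1 = table.upsert([{"id": 1, "status": "shipped"},
                   {"id": 2, "status": "new"}])

# Each commit is independently queryable: that is the "time travel"
# capability a plain file dump on a data lake cannot offer.
assert table.read(c0) == [{"id": 1, "status": "new"}]
```

Upserts keyed by record id are what distinguish a lakehouse table from an append-only pile of files: the second commit updates record 1 in place from the reader's point of view, while the first snapshot remains queryable for audits and reproducible ML training runs.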



Quote for the day:

"Leadership matters more in times of uncertainty." -- Wayde Goodall
