Confidential computing – the next frontier in security
“Confidential computing is a technique for securing data while in use by
creating secure spaces that users, rather than administrators, control,” says
Martin O’Reilly, director of research engineering at the Alan Turing Institute.
“The idea is to create a trusted execution environment (TEE) or secure enclave,
where the data is only accessible by a specific application or user, and only as
the data is being processed.” ... Confidential computing’s ‘360-degree
protection’ enables data to be processed within a limited part of the computing
environment, giving organisations the ability to reduce exposure to sensitive
data while also providing greater control and transparency, even allowing
businesses to share data for joint processing securely. This represents a
significant change, says O’Reilly, pointing out that the ability to create
secure spaces where the user controls who has access to the data effectively
replicates the trust companies might have in their own IT departments. He notes,
however, that the advantages should be weighed against the complexities involved
in setting up and managing these technologies.
Ethical AI – Can India be the leader?
The warning signs that have emerged should make the leadership teams of
financial institutions sit up and take serious notice. With enough success
achieved on customer acquisition, it is now time to focus energies on creating
systems that serve financial institutions' best interests and use AI as an
effective tool for customer engagement rather than for coercive intrusion. AI
can indeed be moulded into a great tool for enhanced customer success, although
the possibilities of this have not been fully explored. Ethical AI is the
practice of evaluating the ethical quality of a model's predictions and their
impact on human life.
Debt collection is a good place to develop this practice because human debt
collectors with performance targets typically find it hard to navigate the moral
quagmire involved in dealing with financial stress. India, the land of
spirituality, is well-positioned to become a leader in the practice of Ethical
AI. The day is not far when the practice of Ethical AI will be a key
differentiator for AI Platforms. On the policy side, countries across the globe
are also working on Data Protection laws modelled on the EU's GDPR, which would
give customers the legal Right to Explanation.
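To make the "Right to Explanation" idea a little more concrete, here is a minimal sketch (not from the article) of how an automated lending decision could be accompanied by a per-feature explanation; the dataset, feature names, and model are hypothetical and purely illustrative.

# Minimal sketch of a "right to explanation" for an automated lending decision.
# The data, feature names, and model here are hypothetical, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: [monthly_income, existing_debt, missed_payments]
X = np.array([
    [5000, 200, 0], [1200, 900, 3], [3500, 400, 1],
    [800, 1100, 4], [4200, 150, 0], [1500, 700, 2],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = loan approved, 0 = declined

model = LogisticRegression(max_iter=1000).fit(X, y)

def explain(applicant):
    """Print the decision and each feature's contribution to it (in log-odds)."""
    features = ["monthly_income", "existing_debt", "missed_payments"]
    contributions = model.coef_[0] * applicant
    decision = "approved" if model.predict([applicant])[0] == 1 else "declined"
    print(f"Decision: {decision}")
    for name, c in sorted(zip(features, contributions), key=lambda t: -abs(t[1])):
        print(f"  {name}: {c:+.2f}")

explain(np.array([1300, 850, 3]))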
The Progressive Web Apps Guide You’ve Been Looking For
PWAs, or Progressive Web Apps, are one of the hottest topics in web development
these days. They promise users a native-app-like experience and features, yet
are still built with common HTML, CSS, and JavaScript (or you can go even
further by using any JavaScript framework). How cool is that! ... In simpler
words, progressive web apps are typical web apps, built using the same old HTML,
CSS, and JavaScript but with additional capabilities such as offline mode, A2HS
(add to home screen), background syncing, push notifications, etc. They look and
feel like native apps, but they ain't! ... There are a few things to note about
service workers: a service worker is a JavaScript Worker, so it can't access the
DOM directly. Instead, it communicates with the pages it controls by responding
to messages sent via the postMessage interface, and those pages can manipulate
the DOM if needed. Service workers also make heavy use of promises, since they
are scripts that run in the background. What a service worker mainly does is
listen for events that happen in the browser and provide a callback for each
event we are interested in.
The path to .NET 5 and Blazor WebAssembly with some fun sprinkled in
One extremely interesting announcement was the new release of Blazor
WebAssembly. Blazor lets you build interactive web UI with C# instead of
JavaScript. Blazor WebAssembly allows users to build a completely isolated
application entirely in C# that can run in nearly every web environment,
including environments that only support static sites (think only HTML, CSS
and JavaScript). Blazor WebAssembly does this by running all the C# code needed
by your application (your code and .NET libraries) on a WebAssembly-based .NET
runtime in the browser, not on a server somewhere. This is valuable in
scenarios when your app needs to run offline, or completely decoupled from a
server, and your app only needs a server if it requests data outside of the
application (similar to how many JavaScript-based applications work). Because
the application runs in the browser, render times of the UI are
near instantaneous, allowing for a great end-user experience. To see these
benefits for ourselves, we decided to port a heavily used application to .NET
5 and Blazor WebAssembly, and not only reap these benefits but also document
the process one would take to move an existing .NET Core application using
Blazor Server to .NET 5 with Blazor WebAssembly.
Are Data Engineering Jobs Getting More Popular Than Data Science Jobs?
While some of their activities might overlap, data engineers are primarily about
moving and transforming data into pipelines for the data science team. Put
simply, data engineers have three critical tasks to perform: design, build and
arrange data pipelines. In contrast, data scientists analyse, test, aggregate
and optimise data. ... Data engineers essentially collect, generate, store,
enrich and process data in real-time or in batches. Data engineering involves
building data infrastructure and data architecture. Data engineers require
experience in software engineering, programming languages, and a firm grip on
core technical skills. Understanding ETL, SQL, and programming languages such as
Java, Scala, C++, and Python is desired. ... The data science strategy in an
organisation deals with data infrastructure, data warehousing, data mining, data
modelling, data crunching, and metadata management, most of which are carried
out by data engineers. Studies suggest most data science projects fall through
as data engineers and data scientists find themselves at cross purposes. Many
companies fail to recognise the importance of hiring data engineers. While most
companies are starting to realise the importance of data engineers, the talent
shortage is all too real.
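To ground the ETL piece of that skill set, here is a minimal extract-transform-load sketch (not from the article) using only the Python standard library; the file name, fields, and table schema are hypothetical.

# Minimal ETL sketch: extract raw CSV events, transform them, load into SQLite.
# File name, columns, and schema are hypothetical and for illustration only.
import csv
import sqlite3
from datetime import datetime

def extract(path):
    """Read raw event rows from a CSV file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Clean and enrich rows: parse timestamps, normalise amounts, drop bad rows."""
    for row in rows:
        try:
            yield {
                "event_time": datetime.fromisoformat(row["event_time"]).isoformat(),
                "user_id": row["user_id"].strip(),
                "amount_cents": int(round(float(row["amount"]) * 100)),
            }
        except (KeyError, ValueError):
            continue  # skip malformed records

def load(rows, db_path="warehouse.db"):
    """Write transformed rows into a SQLite table in one batch."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS events"
        " (event_time TEXT, user_id TEXT, amount_cents INTEGER)"
    )
    con.executemany(
        "INSERT INTO events VALUES (:event_time, :user_id, :amount_cents)",
        list(rows),
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("raw_events.csv")))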
Using Machine Learning in Testing and Maintenance
With machine learning, we can reduce maintenance efforts and improve the quality
of products. It can be used in various stages of the software testing
life-cycle, including bug management, which is an important part of the chain.
We can analyze large amounts of data for classifying, triaging, and prioritizing
bugs in a more efficient way by means of machine learning algorithms. Mesut
Durukal, a test automation engineer at Rapyuta Robotics, spoke at Aginext 2021
about using machine learning in testing. Durukal uses machine learning to
classify and cluster bugs. Bugs can be classified according to severity levels
or responsible team or person. Severity assignment is called triage and is
important for prioritization, while assigning bugs to the correct team or person
prevents wasted time. Clustering bugs helps to see
whether they heap together on specific features. Exploring the available data on
bugs with machine learning algorithms gave him more insight into the health of
their products and the effectiveness of the processes that were used.
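As a rough illustration of the kind of pipeline described above (not Durukal's actual code), the sketch below classifies invented bug reports by severity with a TF-IDF text model and clusters them with k-means using scikit-learn.

# Illustrative sketch: classify bug reports by severity and cluster them by topic.
# The bug texts and severity labels below are invented for demonstration.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

bugs = [
    "app crashes on startup when config file is missing",
    "typo in settings page label",
    "payment request times out under heavy load",
    "button colour slightly off on dark theme",
    "data loss when sync is interrupted mid-transfer",
    "tooltip overlaps icon on small screens",
]
severity = ["critical", "minor", "critical", "minor", "critical", "minor"]

# Severity triage: a text classifier suggests a severity for new reports.
triage = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
triage.fit(bugs, severity)
print(triage.predict(["crash while saving user data"]))  # e.g. ['critical']

# Clustering: group reports to see whether bugs heap together on specific features.
features = TfidfVectorizer().fit_transform(bugs)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
for text, label in zip(bugs, clusters):
    print(label, text)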
Digital Transformation Requires Redefining Role of Data Governance
Today’s data governance and data management practices must be redefined to
support the organization’s business needs and ultimately underpin the
organization’s data monetization strategy. A Data Monetization practice must:
Evangelize a compelling vision regarding the economic potential of data and
analytic assets to power an organization’s digital transformation; Educate
senior executives, business stakeholders and strategic customers on how to
“Think Like a Data Scientist” in identifying where and how data and analytics
can deliver material business value; Apply Design Thinking and Value
Engineering concepts in collaborating with business stakeholders to identify,
validate, value and prioritize the organization’s high-value use cases that will
drive the organization’s data and analytics development roadmap; Champion a
Data Science team to “engineer” reusable, continuously-learning and adapting
analytic assets that support the organization’s high priority use
cases; Develop an analytics culture that synergizes the AI / ML
model-to-human collaboration that empowers teams at the point of customer
engagement and operational execution.
AI and IoT: Transforming Business
The execution of modern technologies such as Artificial Intelligence and IoT has
been changing the entire business world. A recent survey of almost 500 IT
professionals found that AI and IoT are among the most important emerging
technologies, remodeling business operations and compelling companies to invest
more to gain a
competitive advantage. And the reasons are simple. The amalgamation of IoT and
AI can create smart strategies that can read human preferences and help
management make informed decisions with zero error. Not convinced? Let’s check
it out with a real-life example. One of the world’s renowned car manufacturers,
BMW, has started using AI and IoT in its manufacturing process. It uses
sensor-equipped robots on its premises which help workers while producing
innovative cars. It is also leveraging AI for future driverless cars. In fact,
AI and IoT are affecting the entire transportation industry. Interactive maps
and smart route optimization are making it easy for drivers to reach the
destination early. This saves fuel costs and reduces journey time. This is why
you might have heard that entrepreneurs embrace AI solutions in their taxi app
clone development: the AI plans routes based on peak hours and road construction.
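As a toy illustration of that sort of route planning (not drawn from the article), the sketch below runs Dijkstra's algorithm over a small made-up road graph whose edge travel times are inflated during peak hours.

# Toy route-planning sketch: shortest travel time on a small, made-up road graph,
# with edge times inflated during peak hours. Roads and times are hypothetical.
import heapq

# base travel time in minutes, and a multiplier applied during peak hours
ROADS = {
    "depot": {"market": (10, 1.8), "highway": (15, 1.1)},
    "market": {"airport": (20, 2.0)},
    "highway": {"airport": (18, 1.2)},
    "airport": {},
}

def travel_time(start, end, peak_hour=False):
    """Dijkstra's algorithm; returns the fastest time from start to end in minutes."""
    best = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        time, node = heapq.heappop(queue)
        if node == end:
            return time
        if time > best.get(node, float("inf")):
            continue
        for neighbour, (base, peak_factor) in ROADS[node].items():
            new_time = time + base * (peak_factor if peak_hour else 1.0)
            if new_time < best.get(neighbour, float("inf")):
                best[neighbour] = new_time
                heapq.heappush(queue, (new_time, neighbour))
    return float("inf")

print(travel_time("depot", "airport"))                  # off-peak: 30.0 via market
print(travel_time("depot", "airport", peak_hour=True))  # peak: 38.1 via highway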
10 Ways Enterprises Can Use the Edge
Edge use cases are expanding across industries as companies move compute and
analytics capabilities to the edge. Some companies want to reduce latency.
Others want to gain greater insights into what's happening in the field, whether it's
people, crops, or oil rigs. "Edge computing enables companies and other types of
organizations to analyze large amounts of data on site or on devices in real
time," said Shamik Mishra, CTO for Connectivity in the Engineering and R&D
Business at global consulting firm Capgemini. "This can enable several new
opportunities in terms of new sources of revenue, improved productivity, and
decreased costs." In fact, there's an entire world of Internet of Things (IoT)
innovation happening that makes edge use cases even more compelling, including
smart homes, wearables, AR video games and increasingly intelligent
vehicles. Gartner expects the IoT platform market to grow to $7.6 billion by
2024, which represents both on-premises and cloud deployments. The company
considers PaaS a key enabler of digital scenarios. Allied Market Research sees
the broader opportunity worth $16.5 billion by 2025, driven by the desire to
avoid network latency problems and restrictions on bandwidth usage for storing
data in the cloud.
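As a rough sketch of the "analyze on site or on devices" idea (not from the article), the example below filters a simulated sensor stream locally and forwards only anomalous readings, so far less data has to travel to the cloud.

# Edge-processing sketch: analyse sensor readings locally and forward only
# anomalies, reducing latency and data sent to the cloud. Readings are simulated.
import random
import statistics
from collections import deque

WINDOW = 50          # number of recent readings kept on the device
THRESHOLD_SIGMA = 3  # how far from the local mean counts as an anomaly

def send_to_cloud(reading):
    # Placeholder for an upload; a real device might batch these or use MQTT.
    print(f"ALERT forwarded to cloud: {reading:.2f}")

def run_edge_loop(readings):
    window = deque(maxlen=WINDOW)
    forwarded = 0
    for value in readings:
        if len(window) >= 10:
            mean = statistics.fmean(window)
            stdev = statistics.pstdev(window) or 1e-9
            if abs(value - mean) > THRESHOLD_SIGMA * stdev:
                send_to_cloud(value)
                forwarded += 1
        window.append(value)
    return forwarded

# Simulated temperature stream with a few spikes mixed in.
stream = [random.gauss(40.0, 0.5) if i % 200 else 95.0 for i in range(1, 1001)]
sent = run_edge_loop(stream)
print(f"Forwarded {sent} of {len(stream)} readings to the cloud")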
How to Maintain a Healthy Codebase While Shipping Fast
To determine what proportion of your sprint to allocate to tech debt, simply
find the overlap between the parts of your codebase you'll modify with your
feature work and the parts of your codebase where your worst tech debt lives.
You can then scope out the tech debt work and allocate resources accordingly.
Some teams even increase the scope of their feature work to include the relevant
tech debt clean-up. More on this in the article 'How to stop wasting time on tech
debt.' For this to work, individual contributors need to track medium-sized debt
whenever they come across it. It is then the Team Lead's responsibility to
prioritize this list of tech debt, and to discuss it with the Product Manager
prior to sprint planning so that engineering resources can be allocated
effectively. Every once in a while, your team will realize that some of the
medium-sized debt they came across is actually due to a much larger piece of
debt. For example, they may realize that the reason the front-end code is
under-performing is that they should be using a different framework for the job.
Left unattended, these large pieces of debt can cause huge problems, and — like
all tech debt — get much worse as time goes by.
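One minimal way to operationalise that "find the overlap" advice (my sketch, not the author's tooling) is to intersect the files an upcoming feature will touch with the files flagged in a simple debt register; the paths and notes below are invented.

# Sketch of the "find the overlap" step: intersect the files a feature will touch
# with the files flagged as tech debt, to scope clean-up for the sprint.
# The file paths and debt register below are invented for illustration.

feature_files = {
    "src/checkout/cart.py",
    "src/checkout/payment.py",
    "src/users/profile.py",
}

# A simple debt register an individual contributor might maintain: file -> note.
debt_register = {
    "src/checkout/payment.py": "retry logic duplicated three times",
    "src/legacy/reports.py": "no tests, unclear ownership",
    "src/users/profile.py": "mixes DB access with presentation",
}

overlap = feature_files & debt_register.keys()

print(f"{len(overlap)} of {len(feature_files)} feature files carry known debt:")
for path in sorted(overlap):
    print(f"  {path}: {debt_register[path]}")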
Quote for the day:
"No man is good enough to govern another
man without that other's consent." -- Abraham Lincoln