When did Data Science Become Synonymous with Machine Learning?
Many folks just getting started with data science have an illusory idea of the
field as a breeding ground where state-of-the-art machine learning algorithms
are produced day after day, hour after hour, second after second. While it is
true that getting to push out cool machine learning models is part of the work,
it’s far from the only thing you’ll be doing as a data scientist. In reality,
data science involves quite a bit of not-so-shiny grunt work to even make the
available data corpus suitable for analysis. According to a Twitter poll
conducted in 2019 by data scientist Vicki Boykis, fewer than 5% of respondents
claimed to spend the majority of their time on ML models [1]. The largest
percentage of data scientists said that most of their time was spent cleaning up
the data to make it usable. ... Data science is a burgeoning field, and reducing it to a single concept is a misrepresentation that is at best inaccurate and at worst dangerous. To excel in the field as a whole, it’s necessary to shed the pop-culture tunnel vision that notices only machine learning.
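To make that grunt work concrete, here is a minimal, hypothetical sketch of the kind of cleanup that typically has to happen before any modeling; the file name, column names, and rules are invented for illustration only.

    import pandas as pd

    # Hypothetical raw export: mixed casing, text-typed numbers, duplicates.
    df = pd.read_csv("raw_sales_export.csv")

    # Normalize column names and string fields.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["region"] = df["region"].str.strip().str.title()

    # Parse dates and numeric fields that arrived as text.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

    # Drop exact duplicates and rows missing the fields the analysis needs.
    df = df.drop_duplicates().dropna(subset=["order_date", "amount"])

None of this is glamorous, but it is where much of the time reported in that poll goes.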
NaaS adoption will thrive despite migration challenges
The pandemic has also played a significant role in spurring NaaS adoption,
Chambers says. "During the early days of COVID-19 there was a rapid push for
users to be able to connect quickly, reliably, and securely from anywhere at any
time," he says. "This required many companies to make hardware/software
purchases and rapid implementations that accelerated an already noticeable
increase in overall network complexity over the last several years."
Unfortunately, many organizations faced serious challenges while trying to keep
pace with suddenly essential changes. "Companies that need to quickly scale up
or down their network infrastructure capabilities, or those that are on the cusp
of major IT infrastructure lifecycle activity, have become prime NaaS-adoption
candidates," Chambers says. It’s easiest for organizations to adopt small-scale
NaaS offerings to gain an understanding of how to evaluate potential risk and
rewards and determine overall alignment to their organization’s requirements.
Securing DevOps amid digital transformation
The process of requesting a certificate from a CA, receiving it, manually
binding it to an endpoint, and self-managing it can be slow and lack
visibility. Sometimes, DevOps teams avoid established quality practices by
using less secure means of cryptography or issuing their own certificates from
a self-created non-compliant PKI environment – putting their organizations at
risk. However, PKI certificates from certified and accredited globally trusted
CAs offer the best way for engineers to ensure security, identity and
compliance of their containers and the code stored within them. A certificate management platform that is built to scale and manages large volumes of PKI certificates is a natural fit for DevOps teams and their environments.
Organizations can now automate the request and installation of compliant
certificates within continuous integration/continuous deployment (CI/CD)
pipelines and applications to secure DevOps practices and support digital
transformation. Outsourcing your PKI to a CA means developers have a single
source to turn to for all certificate needs and are free to focus on core
competencies.
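As a minimal sketch of what that automation can look like inside a pipeline, the snippet below generates a private key and a certificate signing request (CSR) as a CI/CD step; the hostname, organization name, and the later submission to a CA’s issuance API are assumptions for illustration, not any particular vendor’s workflow.

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generate a private key for the service endpoint (hypothetical hostname).
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Build a CSR; the SAN covers the hostname the certificate will protect.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([
            x509.NameAttribute(NameOID.COMMON_NAME, "api.example.internal"),
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Corp"),
        ]))
        .add_extension(
            x509.SubjectAlternativeName([x509.DNSName("api.example.internal")]),
            critical=False,
        )
        .sign(key, hashes.SHA256())
    )

    # Hand the artifacts to the next pipeline stage, which would submit the CSR
    # to the trusted CA's API and install the returned certificate.
    with open("service.key", "wb") as f:
        f.write(key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.TraditionalOpenSSL,
            serialization.NoEncryption(),
        ))
    with open("service.csr", "wb") as f:
        f.write(csr.public_bytes(serialization.Encoding.PEM))

A certificate management platform replaces the manual steps around this: approving the request, returning a compliant certificate, and tracking it for renewal.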
Reprogramming banking infrastructure to deliver innovation at speed
Fintech firms typically apply digital technology to processes that legacy institutions find difficult, time consuming, or costly to undertake, and they often focus on getting a single use case, such as payments or alternative lending, right. In contrast, neobanks, or challenger banks, deliver their services
primarily through phone apps that often aim to do many things that a bank can
do, including lending money and accepting deposits. A key advantage for both
is that they don’t have to spend time, money, and organisational capital to
transform into something new. They were born digital. Likewise, they both
claim convenience as their prime value proposition. However, while customers
want convenience, many still see banking as a high-touch service. If their
bank has survived decades of consolidation and has served a family for
generations, familiarity can be a bigger draw than convenience. That said, the
COVID-19 pandemic has accelerated the online trend. More and more of us
auto-pay our bills and buy our goods as well as our entertainment and services
via e-commerce.
No free lunch theorem in Quantum Computing
The no free lunch theorem states that a machine learning algorithm’s average performance depends on the amount of training data it has.
“Industry-built quantum computers of modest size are now publicly accessible
over the cloud. This raises the intriguing possibility of quantum-assisted
machine learning, a paradigm that researchers suspect could be more powerful
than traditional machine learning. Various architectures for quantum neural
networks (QNNs) have been proposed and implemented. Some important results
for quantum learning theory have already been obtained, particularly
regarding the trainability and expressibility of QNNs for variational
quantum algorithms. However, the scalability of QNNs (to scales that are
classically inaccessible) remains an interesting open question,” the authors
write. This also raises the possibility that, in order to model a quantum system, the amount of training data might need to grow exponentially. That would threaten to eliminate the edge quantum computing has over classical computing. The authors have found a way to remove this potential overhead via a newly derived quantum version of the no free lunch theorem.
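For context, one commonly cited classical statement of the theorem bounds the risk of any learner trained on t input-output pairs, averaged over all possible target functions f: X → Y; the form below is a hedged paraphrase of that standard bound, not the quantum result derived in the paper.

    \mathbb{E}_{f}\left[ R_{f} \right] \;\ge\; \left(1 - \frac{1}{|\mathcal{Y}|}\right)\left(1 - \frac{t}{|\mathcal{X}|}\right)

Read this way, no amount of algorithmic cleverness helps on average; only growing t relative to the size of the problem does, which is exactly the dependence the quantum version re-examines when the underlying state space is exponentially large.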
IT Talent Shortage: How to Put AI Scouting Systems to Work
The most likely people to leave a company are highly skilled employees who
are in high demand (e.g., IT). Employees who feel they are underutilized and
who want to advance their careers, and employees who are looking for work
that can more easily balance with their personal lives, are also more likely
to leave. It’s also common knowledge that IT employees change jobs often,
and that IT departments don't do a great job retaining them for the long
haul. HR AI can help prevent attrition if you provide it with internal employee and departmental data, so that it can assess your employees, their talents, and their needs based on the search criteria you give it. For
instance, you can build a corporate employee database that goes beyond IT,
and that lists all the relevant skills and work experiences that employees
across a broad spectrum of the company possess. Using this method, you might
identify an employee who is working in accounting, but who has an IT
background, enjoys data analytics, and wants to explore a career change. Or
you could identify a junior member of IT who is a strong communicator and
can connect with end users in the business.
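As a rough sketch of that idea, the snippet below filters a hypothetical skills inventory to surface non-IT employees who have IT-relevant skills and are open to a move; the schema, names, and skill list are invented for illustration.

    import pandas as pd

    # Hypothetical company-wide skills inventory.
    employees = pd.DataFrame([
        {"name": "A. Rivera", "department": "Accounting",
         "skills": ["sql", "data analytics", "excel"], "open_to_move": True},
        {"name": "B. Chen", "department": "IT",
         "skills": ["python", "networking"], "open_to_move": False},
        {"name": "C. Okafor", "department": "Marketing",
         "skills": ["copywriting"], "open_to_move": True},
    ])

    # Search criteria: IT-adjacent skills, held outside IT, by people open to moving.
    it_skills = {"sql", "python", "data analytics", "networking"}
    matches = employees[
        (employees["department"] != "IT")
        & employees["open_to_move"]
        & employees["skills"].apply(lambda s: bool(it_skills.intersection(s)))
    ]
    print(matches[["name", "department", "skills"]])

An HR AI layer adds ranking and natural-language search on top, but the underlying move is the same: query the whole workforce, not just the IT org chart.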
Automation and digital transformation: 3 ways they go together
All sorts of automation get devised and implemented for specific purposes or
tasks, sometimes for refreshingly simple reasons, like “Automating this makes
our system more resilient, and automating this makes my job better.” This is
the type of step-by-step automation long done by sysadmins and other
operations-focused IT pros; it’s also common in DevOps and site reliability
engineering (SRE) roles. IT automation happens for perfectly good reasons on
its own, and it has now spread deep and wide in most, if not all, of the
traditional branches of the IT family tree: development, operations, security,
testing/QA, data management and analytics – you get the idea. None of this
needs to be tethered to a digital transformation initiative; the benefits of a
finely tuned CI/CD pipeline or security automation can be both the means and
the end. There’s no such thing as digital transformation without automation,
however. This claim may involve some slight exaggeration, and reasonable
people can disagree. But digital transformation of the ambitious sort that
most Fortune 500 boardrooms are now deeply invested in requires (among other
things) a massive technology lever to accomplish, and that lever is
automation.
The best way to lead in uncertain times may be to throw out the playbook
Organizations also used the sensing-responding-adapting model to combat
misinformation and confusion about masks and vaccines. With conflicting
guidance from the Centers for Disease Control and Prevention (CDC) in the US
and the World Health Organization (WHO), one organization we studied opted for
“full transparency” with a “fully digital” solution. The company built an app
that included data from sources the company considered reliable, and it
updated policies, outlined precautions, and offered ways to report vaccination
status. The app turbocharged the company’s sense-respond-adapt capabilities by
getting quality information in everyone’s hands and opening a new channel for
regular two-way communication. There was no waiting for an “all hands” meeting
to get meaningful questions and feedback. Reflecting on the results of the
study, one takeaway became clear: it’s worthwhile for leaders of any team to
absorb the lessons of sense-respond-adapt, even if there is no emergency at
hand. Here are three ways to employ each step of the model.
The Three Building Blocks of Data Science
Data is worthless without the context for understanding it properly —
context which can only be obtained by a domain expert: someone who
understands the field where the data stems from and can thus provide the
perspectives needed to interpret it correctly. Let’s consider a toy
example to illustrate this. Imagine we collect data from a bunch of
different golf games from recent years of the PGA Tour. We obtain all the
data, we process and organize it, we analyze it, and we confidently
publish our findings, having triple-checked all our formulas and
computations. And then, we become laughingstocks of the media. Why? Well,
since none of us has ever actually played golf, we didn’t realize that
lower scores correspond to a better performance. As a result, all our
analyses were based on the reverse, and therefore incorrect. This is
obviously an exaggeration, but it gets the point across. Data only makes
sense in context, and so it is essential to consult with a domain expert
before attempting to draw any conclusions.
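A tiny, hypothetical sketch of how that misreading plays out: ranking the same scores with the wrong sort order silently crowns the wrong winner. The player names and scores are invented.

    # Hypothetical tournament scores relative to par: in golf, lower is better.
    scores = {"Player A": -12, "Player B": -5, "Player C": 3}

    # Naive reading: "bigger number = better" puts the worst round on top.
    wrong_ranking = sorted(scores, key=scores.get, reverse=True)

    # Domain-informed reading: sort ascending, because lower scores win.
    right_ranking = sorted(scores, key=scores.get)

    print("Without domain knowledge:", wrong_ranking)  # Player C first
    print("With domain knowledge:   ", right_ranking)  # Player A first

Every formula downstream of that first assumption inherits the mistake, which is why the domain check has to come before the analysis, not after.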
Surprise! The metaverse could be great news for the enterprise edge
Metaverse latency control is more than just edge computing; it’s also edge connectivity, meaning consumer broadband. Faster broadband offers lower latency, but there’s more to latency control than just speed. You need to minimize the handling: the number of hops or devices between the user who’s pushing an avatar around a metaverse and the software that works out what that movement means for what the user “sees” and what others see as well. Think fiber and cable TV, and a fast path between the user and the
nearest edge, which is likely to be in a nearby major metro area. And
think “everywhere” because, while the metaverse may be nowhere in a strict
reality sense, it’s everywhere that social-media humans are, which is
everywhere. Low latency, high-speed, universal consumer broadband? All the
potential ad revenue for the metaverse is suddenly targeting that goal. As
it’s achieved, the average social-media junkie could well end up with 50
or 100 Mbps or even a gigabit of low-latency bandwidth. There are corporate headquarters that don’t have it that good.
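As a back-of-the-envelope illustration of why hop count matters as much as raw speed, here is a minimal one-way latency budget; every number in it is an assumption, not a measurement.

    # Hypothetical one-way latency budget from a user to a nearby metro edge site.
    PROPAGATION_US_PER_KM = 5       # roughly 5 microseconds per km in fiber
    DISTANCE_KM = 80                # user to nearest metro edge (assumed)
    PER_HOP_MS = 0.5                # switching/queuing cost per device (assumed)

    def one_way_latency_ms(hops: int) -> float:
        propagation_ms = DISTANCE_KM * PROPAGATION_US_PER_KM / 1000
        return propagation_ms + hops * PER_HOP_MS

    # Same distance and link speed, different amounts of "handling".
    for hops in (4, 10, 20):
        print(f"{hops:2d} hops -> {one_way_latency_ms(hops):.1f} ms one way")

With these assumed numbers, propagation is well under a millisecond; the hops dominate, which is the argument for fewer devices between the user and the nearest edge.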
Quote for the day:
"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr