Pay-per-use pricing model can boost digital initiatives
One of the dominant reasons organizations do not take advantage of as-a-service and pay-per-use offerings is that their current budget model will not allow it.
Another common critique, however, centers on the assumption that if you
implement a pay-per-use model, you will pay more over time than if you procured
the system upfront. In other words, usage-based pricing enables you to pay less
upfront, but over the life of the product, those who choose to pay more upfront
will pay less overall. Each offering is different. But even if you assume that a
pay-per-use option would require a larger expense versus an upfront purchase,
the ability to defer payments has value. In addition, pay-per-use models reduce
the personnel, time and risk associated with forecasting the environment. The
cost equation ultimately comes down to two questions: How confident are you with
your ability to forecast application needs over the next three to four years?
And are you sure there isn't something you could do that would be a better use
of your time? The research data on the rising interest in a pay-per-use pricing
model often focuses predominantly on consumption-based pricing, which is
different from a true as-a-service model. Organizations can achieve similar
benefits with each approach, however.
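The "value of deferring payments" mentioned above can be made concrete with a toy present-value calculation. The dollar figures and discount rate below are illustrative assumptions, not numbers from the article:

```python
# Hypothetical comparison of an upfront purchase vs. pay-per-use,
# accounting for the time value of deferred payments.

def present_value(payment: float, annual_rate: float, year: int) -> float:
    """Discount a future payment back to today's dollars."""
    return payment / ((1 + annual_rate) ** year)

# Illustrative numbers: a $400k upfront purchase versus
# $120k/year pay-per-use over four years.
upfront_cost = 400_000.0
per_year_usage = 120_000.0
discount_rate = 0.06  # assumed value of deferring cash outlays

pay_per_use_pv = sum(
    present_value(per_year_usage, discount_rate, year)
    for year in range(1, 5)
)

print(f"Upfront: ${upfront_cost:,.0f}")
print(f"Pay-per-use (present value): ${pay_per_use_pv:,.0f}")
```

In this sketch the nominal pay-per-use total ($480k) exceeds the upfront price, but discounting narrows the gap to roughly $16k, and the pay-per-use buyer also avoids the forecasting risk and personnel costs the article describes.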
Malware Exploits Security Teams' Greatest Weakness: Poor Relationships With Employees
Teams that are willing to brave this task manually will find a high mountain to
climb. Approaching an employee about this forces all sorts of uncomfortable
topics front and center. Inquisitive users may now be curious about the scope
and veracity of the company's monitoring. Now that they are working from home and surrounded by family, they wonder where the line is drawn on collecting personal data, and whether there is an audit log for the surveillance. For many teams,
the benefits of helping end users are not worth the risk of upsetting the already wobbly apple cart. So extensions like The Great Suspender linger,
waiting for the right moment to siphon data, redirect users to malicious
websites, or worse. This seems like a significant weakness in how IT and
security teams operate. Because too few security teams have solid relationships
built on trust with end users, malware authors can exploit this reticence,
become entrenched, and do some real damage. Forward-looking teams are rolling up
their sleeves and developing tools to solve this. Dan Jacobson, who leads
corporate IT at Datadog, told me about a tool his team built to handle this type
of conundrum.
Technology & Water: How 5G and IoT Can Update Our Water Infrastructure
Droughts lower reservoir levels, which creates water-quality challenges and
means that every last drop of water in a system is critical. A company named
Ketos has developed a suite of interconnected devices and tools that rely on
sensors to get near real-time analytics on water quality and supply
management. The sensors can flag toxins like lead and copper and report on
water flow and pressure. Because every drop counts during a drought, there are also sensor-based leak detection technologies from two
companies, Echologics and Syrinix, that allow utilities to monitor their
networks in near real-time and be proactive with incident or maintenance
responses. Las Vegas Valley used the Syrinix system to cut the share of water lost to leaks from an average of 20% to 40% down to just 5%. High-speed 5G networks like
Verizon’s 5G Ultra Wideband could allow all this technology to work in tandem,
permitting the processing and transferring of vast amounts of data from the
water utility’s equipment to decision makers. In addition, more sensors mean
more real-time information so that managers can be proactive instead of just
reactive. On the other end of the spectrum, extreme rainfall events can cause
flash flooding when rainwater fills up the sewer systems.
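As a rough illustration of the kind of anomaly flagging a sensor-based leak-detection network performs, here is a minimal sketch built on minimum night flow, the period when legitimate demand is lowest and leaks stand out. The zone names, readings, and simple z-score rule are hypothetical; real products such as Echologics' and Syrinix's rely on far more sophisticated acoustic and pressure-transient analysis:

```python
# Flag zones whose latest minimum night flow deviates sharply from
# that zone's historical baseline (a leak shows up as elevated flow
# when legitimate demand is lowest). Data and thresholds are invented.

from statistics import mean, stdev

def flag_leaks(night_flows: dict, z_threshold: float = 3.0) -> list:
    """Return zone IDs whose latest reading is a statistical outlier
    relative to that zone's earlier readings."""
    flagged = []
    for zone, flows in night_flows.items():
        baseline, latest = flows[:-1], flows[-1]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (latest - mu) / sigma > z_threshold:
            flagged.append(zone)
    return flagged

readings = {
    "zone-a": [12.1, 11.8, 12.3, 12.0, 11.9, 12.2],  # stable flow
    "zone-b": [8.0, 8.2, 7.9, 8.1, 8.0, 14.5],       # sudden jump: likely leak
}
print(flag_leaks(readings))  # -> ['zone-b']
```

The point of the 5G angle in the article is that such per-zone readings arrive continuously from thousands of sensors, so a utility can run this kind of check in near real time instead of waiting for leaks to surface.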
Finding the Great Predictors for Machine Learning
Factor analysis is a statistical process for expressing variables in terms of
latent variables called factors. Factors represent two or more variables that
are highly correlated with each other. In short, factors are proxies for the model variables, capturing the common variance that exists because the variables correlate with each other. The benefit of factor analysis is to eliminate
variables that are not influencing the model. Factors developed when
transforming the dimensionality of a dataset present a more economical way to describe influential variables. The result is a reduced number of parameters for a statistical model, be it a regression or a machine learning model. An analyst can plan a more efficient computation of training data, allowing a machine learning model to be developed faster. Factor analysis is
particularly useful for surveys that contain a broad variety of comments and
categorical responses. Survey responses are typically categorized on a scale, such as a Likert scale, in which respondents rate a question statement from 1 (very strongly agree) to 10 (very strongly disagree). But establishing which answers influence a sought outcome can be tricky.
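The dimensionality reduction described here can be sketched with a principal-component-style extraction from the correlation matrix, a common first step in factor analysis. The survey data, factor names, and loadings below are synthetic assumptions for illustration:

```python
# Recover a small number of latent factors from correlated survey
# variables via eigendecomposition of the correlation matrix.

import numpy as np

rng = np.random.default_rng(0)

# Simulate 200 respondents answering 6 questions that are actually
# driven by 2 latent factors (say, "satisfaction" and "workload").
latent = rng.standard_normal((200, 2))
loadings = np.array([
    [0.9, 0.0], [0.8, 0.1], [0.7, 0.0],   # questions loading on factor 1
    [0.0, 0.9], [0.1, 0.8], [0.0, 0.7],   # questions loading on factor 2
])
responses = latent @ loadings.T + 0.2 * rng.standard_normal((200, 6))

# Eigenvalues of the correlation matrix measure how much common
# variance each candidate factor explains.
corr = np.corrcoef(responses, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser criterion: retain only factors with eigenvalue > 1.
n_factors = int(np.sum(eigvals > 1.0))
est_loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(n_factors, est_loadings.shape)
```

Here six correlated survey variables collapse to two retained factors, which is exactly the "reduced number of parameters" benefit described above: a downstream regression or machine learning model can be trained on the two factor scores instead of all six raw responses.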
Facilitating remote working and improving talent engagement using AI
There are of course many significant benefits to remote working. Employees can
work where they want, when they want, and employers have access to a global
pool of talent. In order for both parties to achieve maximum productivity,
businesses need to not only embrace new ways of working, but also facilitate
and improve the experience for their teams and enable a better work-life balance. And AI supports this shift. On the face of it, AI, which conjures images of robots and opaque black-box algorithms, seems incompatible with these challenges. However, there are viable opportunities
for creative and impactful AI applications that help address these risks head
on. The key to allowing this to happen is for companies to embrace cloud-based
work management and collaboration systems. With such a system in place and
under constant use, AI technologies can learn more about your team, the work
they are doing and how they are interacting with their colleagues. This is
what I call being AI-ready. Naturally, it takes an insightful party who can
identify an issue or a pattern of issues that relate to remote working. This
could relate to employee productivity, efficiency, or satisfaction.
What the COVID-19 vaccine rollout means for returning to the workplace
The Challenger survey found that mask requirements will be nearly universal;
almost 93% of companies said they will provide and/or require workers to wear
masks. And masks will not be limited to just workers: almost 72% of companies
said they will provide and/or require visitors to wear masks, the company
said. In terms of other policies, the same number of companies also said they
will be limiting or prohibiting gatherings in shared spaces, such as
conference rooms, break rooms, lunchrooms, and bathrooms. The same percentage
also said they would be maintaining social distancing protocols, with fewer
people in workspaces and not allowing workers to come within six feet of each
other, the Challenger survey said. And the same percentage of survey
respondents planned to provide sanitizing products. Only 14% of companies in
the Challenger survey said they would be providing and/or requiring workers to
wear gloves. Among other precautions planned, 89% of companies said they would
conduct regular deep-cleaning of all workstations/worksites, and 82% would
limit or exclude visitors. Elevator use will be limited for 57% of companies
and the same number said they will take the temperature of workers when they
arrive at work and will survey workers to see if they have had any risk of
exposure.
Microsoft asks government to stay out of its cyber attack response in Australia
It's a concept that has been applied to cyber incident responses, where
additional risk is introduced during the initial phases of an ongoing crisis
because the ability of subject matter experts and network defenders to
adequately respond is hampered by an onslaught of information requests,
speculation, and well-intended ideas from individuals or organisations when
the malicious activity is yet to be fully understood by anyone. It said
further complicating any such operation is the fact that the government would
be doing so without a thorough understanding of the specific resources and
protocols available for deployment, and that the "resources required to obtain
such knowledge would be prohibitively expensive, logistically complicated, and
amount to an extremely invasive governmental intervention". "As such, the
danger of having a government direct a private sector entity's response
without complete knowledge of the situation and the technology cannot be
understated," Microsoft said. "Moreover, individual organisations are not only
best positioned to respond; they also have as equal an incentive as the
government to protect their own networks and maintain the trust of their
customers."
Why the ‘experience of things’ lies at the heart of digital transformation
Right now, customer behaviors are changing, so customer experience leaders
have to recognize the shift and keep pace with the changing dynamics. That means brands have to stop trying to create static customer journey maps
and realize that just like it’s the customer’s experience, it’s also the
customer’s journey. And customers are on multiple journeys with multiple
channels and want to engage with brands exactly how they want to engage. So,
don’t try to force your journey maps on them. Remove complexity and friction
in every interaction. And own it when you mess up or fail to do so. Be
authentic! When it comes to employees, organizations have to transform digital
workplace experiences for people with unified technology solutions that
actually get down into the flow of how they work. Look at technology solutions
from the perspective of what experiences they will enable or offer for people.
Again, you have to remove complexity and friction. For example, communications
and collaboration platforms have to enable a set of seamless and frictionless
experiences so people can connect, co-create, collaborate, and build
community. The pandemic seriously drove home the point of how critical
real-time collaboration solutions like video and messaging are. But for
employees, beyond the tech should be a focus on how they are experiencing
work.
Boosting science and engineering in the cloud
Hard problems — like autonomous vehicles, rockets, and supersonic transport —
benefit from engineers and scientists being able to flexibly mold
infrastructure to the questions they’re hoping to answer. Boiled down, smart
companies have learned that the best way to attract and nurture developer
talent is not only to compensate them well, but also, and more important, to
remove obstacles in their work. The rise of SaaS (with an API for whatever
back-end function you need), Jamstack, Kubernetes, and all these other new
technologies spreading across the enterprise software stack free developers to
focus on the logic of the new application or service they are developing. They
can forget about the infrastructure. Time-to-market cycles speed up. More and
better services delivered much faster lead to happier, stickier customers.
And more top-line revenue. In sum, it’s a partnership between developers and
engineers/scientists. Developers abstract away all the infrastructure hassles
and suddenly your engineers and scientists can help your business beat the
competition and grab market share. It’s a match made in heaven. Or Hacker
News.
Just How Much Is That Cloud Data Warehouse in the Window?
One common hybrid data warehouse scenario involves shifting specific workloads
– typically, test-dev, disaster recovery, and analytic discovery – to the
cloud context. An organization that employs a hybrid-multi-cloud scenario
might seek to complement its on-premises data warehouse system by exploiting
the desirable features or capabilities of two or more PaaS data warehouses.
These might include inexpensive on-demand capacity – useful not only for
analytic discovery, but for data scientists, machine learning (ML) engineers,
data engineers, and other technicians who design pipelines that entail
scheduling distributed data processing and data movement operations – or
integration with cloud-adjacent software development, data integration, ML,
artificial intelligence (AI), etc. services. Extending the data warehouse in
this way does not preclude moving a large share or even a majority of
on-premises workloads to the cloud, with the result that, over time, the PaaS
data warehouse could draw the on-premises data warehouse (along with a
constellation of multi-cloud data warehouse resources) into its orbit.
Quote for the day:
"When a man assumes leadership, he
forfeits the right to mercy." -- Gennaro Angiulo