Welcome to the client-serverless revolution
As this trend intensifies, a new paradigm of connected internet applications has
come to the forefront. This approach is known as client-serverless computing. It
delivers consistently dynamic, interactive application experiences from any
smartphone or edge device, no matter where a user happens to be, or where the
resources they’re accessing are being served from. The widespread adoption of rich-client devices and the global availability of distributed cloud services have fueled the client-serverless computing trend even more, but the shift also demands
more from developers. No longer can developers assume that their program code
will primarily access databases, app servers, and web servers that are located
within a single data center or cloud region. Instead, developers must build
server-side business logic and markup, as well as the client-side JavaScript
that will render the user interface on myriad client devices. They must code
applications that are optimized for high-quality, browser-side interactivity over industry-standard interfaces such as REST (for remote APIs) and JSON (for data formats). Client-serverless has roots in the old-guard, three-tier application architectures that sprang up around PCs and local area networks, connecting a client-side GUI to a back-end SQL database.
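To make the interface contract concrete, here is a minimal sketch of the server-side half of such an application: a serverless-style function handler that exposes a piece of business logic over REST and returns JSON for client-side JavaScript to render. The route, event shape, and product data are hypothetical illustrations and not tied to any particular cloud provider.

```python
import json

# Hypothetical in-memory store standing in for a managed cloud database.
PRODUCTS = {
    "42": {"id": "42", "name": "Widget", "price_usd": 19.99},
}

def handler(event, context=None):
    """Serverless-style REST handler: GET /products/{id} returns a JSON body.

    The event and response shapes loosely follow the proxy-integration convention
    several cloud providers use; adapt them to whichever platform you deploy on.
    """
    product_id = (event.get("pathParameters") or {}).get("id")
    product = PRODUCTS.get(product_id)
    if product is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(product),  # the client-side JavaScript renders this JSON
    }

if __name__ == "__main__":
    # Simulates the request a browser would make, e.g. fetch('/products/42').
    print(handler({"pathParameters": {"id": "42"}}))
```

On the client, a browser call to the same route would parse the returned JSON and render it, which is the division of labor the passage describes.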
Strengthening Zero-Trust Architecture
First, it's helpful to consider zero trust in terms of the need for controlled
access management that does not negatively affect the business. Specifically,
organizations must establish a zero-trust environment that limits access to
individuals with the proper authority but doesn't interfere with daily
operations. One way to accomplish this is through a data-trust lens. Rather than
granting blanket access to validated users, organizations should hide specific
files and data from those who don't have the authorization to access them,
strengthening data protection beyond user-level permissions without impacting
authorized users. When objects such as files, folders, or mapped network and cloud shares are hidden, attackers cannot find or access the data they seek. This capability can serve as a powerful defense against data theft and ransomware attacks.
Application trust likewise takes security beyond user privileges. Merely
focusing on whether a query is authorized isn't enough — it's also vital to
consider the application invoking that query. Doing so can prevent unauthorized
access from applications such as Windows command line or PowerShell, which
regular users wouldn't typically use to access data. Application trust can also help identify and deflect attackers probing for open ports and services to compromise.
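As a rough sketch of the application-trust idea described above, the example below grants access only when both the user is authorized and the requesting application is trusted for that class of data. The application names and policy table are hypothetical; a real implementation would rely on signed binaries and OS-level process telemetry rather than simple name matching.

```python
# Hypothetical policy: which applications are trusted to read which classes of data.
TRUSTED_APPS = {
    "finance_reports": {"excel.exe", "reporting_service"},
    "source_code": {"git", "ide_backend"},
}

def is_access_allowed(user_authorized: bool, data_class: str, requesting_app: str) -> bool:
    """Application-trust check: user authorization alone is not enough; the invoking
    application must also be trusted for this class of data."""
    if not user_authorized:
        return False
    return requesting_app in TRUSTED_APPS.get(data_class, set())

# An authorized user reading finance data from PowerShell is still refused.
print(is_access_allowed(True, "finance_reports", "powershell.exe"))  # False
print(is_access_allowed(True, "finance_reports", "excel.exe"))       # True
```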
How can tech leaders take people with them on their digital transformation journey?
Leaders need to make it personal for their employees and make it clear that the new digital tool will make their lives easier and their work more productive. Leaders can look to do this by winning hearts and
minds through demonstrations and simple, clear communication. If, for example,
a business is introducing a new collaborative tool, they need to make it clear
how that will benefit employees. Will it reduce email traffic? Make instant
communication more effective? Or free up more time in their day to focus on
other priorities? Demonstrating these benefits will help to put people in the
right mind-set from the start. It’s also important to ask for instant feedback
on transformational change programmes. Ensuring people are involved from the
start will promote engagement throughout the process and help leaders to
understand how their employees feel about the change and impacts within their
teams.
Identify champions AND advocates
Digital change champions are nothing new but are critical to support the rollout of digital transformation at the
frontline of a business. These people can answer frequently asked questions,
provide an additional avenue of communication to leaders and encourage
employees to make best use of the new tools being made available to them.
AI No Silver Bullet for Cloud Security, but Here’s How It Can Help
One of the most promising – and certainly most developed – applications of AI in cybersecurity is using it to trawl through historical data to identify attack patterns. Some AI algorithms are very effective at this
task, and can inform otherwise oblivious cybersecurity teams that they have,
in fact, been hacked many times. The primary value of this kind of system is
seen when it comes to managing employee access to systems and files. AI
systems are extremely good at tracking what individual users are doing and at
comparing this with what they do typically. This allows administrators (or
automated security systems, explored below) to easily identify unusual
activity and block users’ access to files or systems before any real damage is
done. This kind of functionality is now widespread in many industries. Some
cloud providers even ship it with their basic cloud storage systems. In many cases, in fact, an organization is not even aware that an AI is collecting data on the way it uses its cloud service in order to scan for unusual activity. This type of tool, however, also represents the limit of what AI can
do, in terms of cloud security, at the moment. Most organizations lack the
tools to use AI systems in a more complex way than this.
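A minimal sketch of the baseline-comparison idea described above, assuming a hypothetical log of per-user daily file-access counts: it flags activity that deviates sharply from a user's own history, which is roughly what the commercial systems mentioned here do at far greater scale and with far more sophisticated models.

```python
from statistics import mean, stdev

# Hypothetical history: files accessed per day by one user over the last ten days.
history = [12, 9, 15, 11, 10, 13, 14, 12, 11, 10]

def is_anomalous(todays_count: int, baseline: list, threshold: float = 3.0) -> bool:
    """Flag activity more than `threshold` standard deviations above the user's own baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return todays_count > mu  # flat baseline: treat any increase as unusual
    return (todays_count - mu) / sigma > threshold

print(is_anomalous(13, history))   # False: within the user's normal range
print(is_anomalous(400, history))  # True: worth blocking access and alerting an admin
```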
How do I select a PAM solution for my business?
Before choosing a PAM solution for their business, the first question a CISO should ask themselves is what they aim to protect. Adopting PAM is
as much about mindset and approach as it is about technology. Thousands of PAM
programme engagements with the world’s largest organizations have cemented our
view that the best way to protect the business is first to identify critical
data and assets, then assess the paths that an attacker might take to
compromise them. This sounds obvious, but it is not yet the common practice that it should be. Privileged identities, credentials, secrets and accounts
are found throughout IT infrastructure, whether this be on-premises,
multi-cloud or a mix thereof. The ones that allow access to your critical data
and assets are what the initial focus should be on. Once these are determined, there are a number of essential features that apply:
- Ease of implementation, ease of use, and ease of integration. The latter is essential: look for integrations with your existing vendor stack;
- Cloud readiness is key. You are likely going to be moving applications into the cloud, and their privileged access needs to be secured;
- Session management and recording;
- Credential management for humans, applications, servers and machines;
- Audit and reporting features; and
- Privileged threat alerting.
Reported Data Breaches Rise 5% in Australia
The Office of the Australian Information Commissioner received 539
notifications between July and December, up from 512 in the first half of the
year, according to its new report. Healthcare providers reported 133 breaches, followed by finance with 80; education with 40; legal, accounting and management services with 33; and the federal government with 33. This marked the first time
the Australian government entered the top five list of sectors reporting the
most breaches, displacing the insurance industry. The federal government’s
breach tally does not include intelligence agencies or state and local
government agencies, public hospitals and public schools. Under Australia’s
notifiable data breaches law, organizations covered by the Privacy Act 1988
are required to report within 30 days breaches that are likely to result in
“serious harm.” Fines for noncompliance can range up to 2.1 million Australian
dollars ($1.6 million). The breach notification law went into effect in 2018
(see: Australia Enacts Mandatory Breach Notification Law). Although breach
notifications increased by 5%, the OAIC characterized that as a “modest”
increase given the rising cybersecurity risks introduced by the rapid shift in
early 2020 to working from home due to the COVID-19 pandemic.
‘Weird new things are happening in software,’ says Stanford AI professor Chris Re
To handle the subtleties of which he spoke, Software 2.0, Re suggested, is laying out a path to turn AI into an engineering discipline, as he put it: one with a new systems approach, different from how software systems were built before, and attention to the new "failure modes" of AI, different from how software traditionally fails. It is a discipline,
ultimately, he said, where engineers spend their time on more valuable
things than tweaking hyper-parameters. Re's practical example was a system
he built while he was at Apple, called Overton. Overton allows one to
specify forms of data records and the tasks to be performed on them, such as
search, at a high level, in a declarative fashion. Overton, as Re described
it, is a kind of end-to-end workflow for deep learning. It preps the data, picks a neural-net model, tweaks its parameters, and deploys the program.
Engineers spend their time "monitoring the quality and improving
supervision," said Re, the emphasis being on "human understanding" rather
than data structures. Overton, and another system, Ludwig, developed by Uber
machine learning scientist Piero Molino, are examples of what can be called
zero-code deep learning. "The key is what's not required here," Re said.
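Overton's internal schema has not been published in detail, so the sketch below is a hypothetical illustration of the declarative, zero-code style Re describes: the engineer specifies the shape of the records, the task, and the supervision sources, and a framework (stubbed out here) would handle data prep, model selection, tuning, and deployment.

```python
# Hypothetical declarative specification in the spirit of zero-code deep learning:
# describe the records, the task, and the supervision; the framework does the rest.
spec = {
    "schema": {
        "query":    {"type": "text"},
        "document": {"type": "text"},
        "label":    {"type": "category", "classes": ["relevant", "not_relevant"]},
    },
    "tasks": [
        {"name": "search_relevance", "inputs": ["query", "document"], "target": "label"},
    ],
    "supervision": ["weak_labels.csv", "editor_judgments.csv"],  # what engineers improve
}

def build_pipeline(spec):
    """Stand-in for the framework: a real system would prep the data, pick a neural
    architecture, tune hyper-parameters, and deploy the resulting model."""
    for task in spec["tasks"]:
        print(f"Compiling task '{task['name']}' over inputs {task['inputs']}")

build_pipeline(spec)
```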
Hunting and anti-hunting groups locked in tit-for-tat row over data gathering
The data collection practices of the Hunting Office (HO), a central
organisation delegated to run the administrative, advisory and supervisory
functions of the UK’s hunting associations, and the Countryside Alliance
(CA), a campaign organisation with over 100,000 members that promotes rural
issues, have been questioned by activists running a website called Hunting
Leaks. The website owners said that a monthly round-up of anti-hunting
activity – which appears to have been shared via email with hunts across the
UK – was passed on to Hunting Leaks by an undisclosed animal rights group.
The leaked document, a report on saboteur activity between 14 November and
12 December 2020, lists the names of anti-hunting groups, the names of 30 activists (some of whom are referred to multiple times) and information
about their vehicles, including registration numbers. It also includes
information on the number of anti-hunting activists in attendance, details
about their movements and activity on a given hunt day, as well as guidance
for how hunt members should approach collecting information and video
footage.
6 ways to bring your spiraling cloud costs under control
The best way to avoid overspending on cloud resources is to know what you
need ahead of time. “Scalable cloud services, in theory, have made
overprovisioning unnecessary, but old behaviors used in traditional data
centers lead to [cloud] resources that are often underutilized or completely
idle, which result in unnecessary spend,” wrote Gartner analysts in a
December 2020 research note. This may not be music to the ears of anyone who
has already made sizable commitments in the scramble to react to the
challenges of the pandemic, but it does highlight the importance of
right-sizing your cloud environment where possible. “Start with knowing what
you spend—not just the invoice you get—but what are you spending on, where
are you spending the most, and where are you seeing growth,” said Eugene
Khvostov, vice president of product engineering at cost-management software
specialist Apptio. For larger organizations, a proven approach is to
establish a dedicated cloud center of excellence, tasked with monitoring and
governing cloud usage and establishing best practices. For smaller
organizations, this responsibility falls on senior members of the IT team,
who will be tasked with establishing budgetary guardrails, often linked to
longer-term ROI requirements.
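As a small, hypothetical illustration of "knowing what you spend" beyond the invoice total, the sketch below groups billing line items by service and shows where spend is growing fastest month over month. The data format is invented; in practice the input would come from your provider's billing export or a cost-management tool.

```python
from collections import defaultdict

# Hypothetical billing export: (month, service, cost in USD).
line_items = [
    ("2021-01", "compute", 8200.0), ("2021-01", "storage", 1900.0), ("2021-01", "egress", 400.0),
    ("2021-02", "compute", 9700.0), ("2021-02", "storage", 2000.0), ("2021-02", "egress", 950.0),
]

spend = defaultdict(dict)  # service -> {month: total cost}
for month, service, cost in line_items:
    spend[service][month] = spend[service].get(month, 0.0) + cost

for service, by_month in spend.items():
    months = sorted(by_month)
    prev, curr = by_month[months[-2]], by_month[months[-1]]
    growth = (curr - prev) / prev * 100
    print(f"{service:8s} {curr:8.0f} USD this month ({growth:+.0f}% month over month)")
```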
Looking beyond Robotic Process Automation
There are a whole host of reasons why a process might not be suitable for
automation, but you should consider things such as the time it will take to
automate and how many steps in the process require human intervention.
Generally speaking, the more logical and easier to define the process is,
the faster and easier it is to automate. With a holistic view of processes
in your organisation, you will be able to pinpoint which processes can and
should be automated, as well as those where people are the key drivers. This
will be crucial not only in achieving greater efficiencies, but also in demonstrating the benefits to employees and giving them an understanding of where they fit into this new way of working. Consider where upskilling or knowledge
sharing might be needed to ensure employees are equipped to support
automation. It’s all well and good having technology in place, but it won’t run effectively without the right people and buy-in alongside it. The relationship between people and technology is going to become even more important as the capabilities of RPA and other machine learning-based technologies advance over the next few years. Just because you can’t fully automate a process doesn’t mean greater efficiencies can’t be achieved.
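As a purely illustrative way to compare candidate processes on the criteria mentioned earlier (how well defined the steps are, how much human judgment they need, and the effort to automate), here is a hypothetical scoring sketch; the fields and weights are invented, and a real assessment would weigh many more factors.

```python
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    total_steps: int
    steps_needing_human_judgment: int
    rules_fully_documented: bool
    estimated_weeks_to_automate: int

def automation_suitability(p: Process) -> float:
    """Higher is better: favours well-defined, low-judgment, quick-to-automate processes."""
    rule_based_share = 1 - p.steps_needing_human_judgment / p.total_steps
    documentation_bonus = 0.2 if p.rules_fully_documented else 0.0
    effort_penalty = min(p.estimated_weeks_to_automate / 52, 1.0) * 0.3
    return round(rule_based_share + documentation_bonus - effort_penalty, 2)

candidates = [
    Process("invoice matching", 20, 2, True, 6),
    Process("customer complaint triage", 15, 10, False, 12),
]
for p in sorted(candidates, key=automation_suitability, reverse=True):
    print(p.name, automation_suitability(p))
```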
Quote for the day:
"Ninety percent of leadership is the
ability to communicate something people want." --
Dianne Feinstein