How to apply design thinking in data science
Observing end-users and recognizing different stakeholder needs is a learning
process. Data scientists may feel the urge to dive right into problem-solving
and prototyping, but design thinking principles require a problem-definition
stage before jumping into any hands-on work. “Design thinking was created to
better solutions that address human needs in balance with business opportunities
and technological capabilities,” says Matthew Holloway, global head of design at
SnapLogic. To develop “better solutions,” data science teams must collaborate
with stakeholders to define a vision statement outlining their objectives,
review the questions they want analytics tools to answer, and capture how to
make answers actionable. Defining and documenting this vision up front is a way
to share workflow observations with stakeholders and capture quantifiable goals,
which supports closed-loop learning. Equally important is to agree on
priorities, especially when stakeholder groups may have common objectives but
seek to optimize department-specific business workflows.
The role of big data in auditing and assurance services
Conventionally, audit judgements rely solely on evidence sourced from structured
datasets in an organization’s financial records. But technological advances in
data storage, processing power and analytic tools have made it easier to
obtain unstructured data to support audit evidence. Big data can be used for
prediction by applying complex analytics to glean audit evidence from datasets
and other sources encompassing organizations, industries, nature, internet
clicks, social media, market research and much more. ...
An innovative system will not only enable the application of AI-embedded
natural language processing (NLP) to streamline unstructured data but also
ensure its integration with optical character recognition (OCR). These and
other cutting-edge capabilities will help convert both structured and
unstructured data into meaningful insights to drive the audit. Big data thus
makes it easier to eliminate human errors, flag risks in time and spot
fraudulent transactions, in effect modernizing audit operations and improving
the efficiency and accuracy of the financial reporting process.
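As a rough illustration of the kind of automated risk flagging described above, the sketch below surfaces journal entries whose amounts sit far outside the population’s norm. The transaction shape, the 10x-median rule, and the example figures are assumptions for illustration, not any audit standard or vendor API.

```typescript
// Hypothetical sketch: surface journal entries whose amounts are far outside
// the population's norm. Field names and the 10x-median rule are illustrative
// assumptions, not a real audit standard or any specific vendor's API.
interface Transaction {
  id: string;
  amount: number;
  postedBy: string;
}

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0 ? (sorted[mid - 1] + sorted[mid]) / 2 : sorted[mid];
}

// Flag entries whose absolute amount exceeds `factor` times the median amount.
function flagUnusualAmounts(transactions: Transaction[], factor = 10): Transaction[] {
  const med = median(transactions.map(t => Math.abs(t.amount)));
  return transactions.filter(t => Math.abs(t.amount) > factor * med);
}

const flagged = flagUnusualAmounts([
  { id: "J-001", amount: 1200, postedBy: "clerk-a" },
  { id: "J-002", amount: 1350, postedBy: "clerk-b" },
  { id: "J-003", amount: 98000, postedBy: "clerk-a" },
]);
console.log(flagged.map(t => t.id)); // ["J-003"]
```

In practice, a rule like this would be only one of many signals feeding an auditor’s review queue, alongside NLP- and OCR-derived evidence.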
Operators unprepared for high gains from low-power IoT roaming
A key feature supported by the latest technology is passive or ambient IoT,
which aims to connect sensors and devices to cellular networks without a power
source and could dramatically increase the number of cellular IoT devices.
This capability is becoming increasingly appealing to several enterprise
verticals. NB-IoT and LTE-M are backed by major mobile operators, offering
standardised connectivity with global reach. Yet Juniper warned that a key
technical challenge for operators is their inefficiency in detecting low-power
devices roaming on their networks, meaning that they lose potential revenue
from these undetected devices. Because of their low data usage and
intermittent connectivity, these devices require constant network monitoring
to maximise roaming revenue. ... “Operators must fully
leverage the insights gained from AI-based detection tools to introduce
premium billing of roaming connections to further maximise roaming revenue,”
said research author Alex Webb. “This must be done by implementing roaming
agreements that price roaming connectivity on network resources used and time
connected to the network.”
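A minimal sketch of the pricing model the quote points to, charging roaming connectivity on network resources used and time connected, is shown below. The rates, the premium multiplier, and the session fields are hypothetical illustrations, not actual operator tariffs or Juniper’s model.

```typescript
// Hypothetical sketch of usage- and time-based roaming billing for IoT devices.
// The rates and premium multiplier below are assumptions for illustration.
interface RoamingSession {
  deviceId: string;
  megabytesUsed: number;   // network resources consumed
  hoursConnected: number;  // time attached to the visited network
  isLowPowerIoT: boolean;  // e.g. an NB-IoT / LTE-M device detected via analytics
}

const RATE_PER_MB = 0.05;        // assumed wholesale rate per megabyte
const RATE_PER_HOUR = 0.002;     // assumed rate per hour attached
const LOW_POWER_PREMIUM = 1.25;  // assumed premium billing multiplier

function roamingCharge(session: RoamingSession): number {
  const usageCharge = session.megabytesUsed * RATE_PER_MB;
  const timeCharge = session.hoursConnected * RATE_PER_HOUR;
  const base = usageCharge + timeCharge;
  // Detected low-power devices are billed at a premium; undetected devices
  // generate no charge at all, which is the revenue leak described above.
  return session.isLowPowerIoT ? base * LOW_POWER_PREMIUM : base;
}

// A metering sensor that sends 2 MB a month but stays attached for 720 hours.
console.log(
  roamingCharge({ deviceId: "sensor-42", megabytesUsed: 2, hoursConnected: 720, isLowPowerIoT: true })
); // 1.925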
How to improve cyber resilience by evaluating cyber risk
The biggest challenge in evaluating cyber risk is that we always underestimate
it. The impact is almost always worse than what was estimated. A lot of us are
professional risk mitigators and managers, and we still get it wrong. Going
back to the MGM Resorts cyber attack, I refuse to believe that MGM believed
that their ransomware breach was going to cost them US$1 billion between lost
revenues, lost valuation and loss of confidence from both the market and
customers. That, to me, is the biggest issue. There is a huge gap there. Even
though there are a lot of numbers surrounding the cost of a data breach, they
still all significantly underestimate it. So that, I think, is the biggest
area. ... We are spending a lot of time talking about the tools that
these actors use, whether it is artificial intelligence (AI), ransomware,
hacking, national security threats and so on. To make an impact against this
threat, we must focus on resilience and on what we can tolerate, then
understand what we can withstand and under what conditions we can withstand
it.
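One way to make the estimation gap concrete is a classic annualised-loss-expectancy calculation with an explicit correction for underestimation. The sketch below uses hypothetical figures and a hypothetical correction factor, not MGM’s numbers or any published model.

```typescript
// Minimal annualized-loss-expectancy (ALE) sketch with a correction factor for
// the systematic underestimation discussed above. All figures are hypothetical.
interface RiskScenario {
  singleLossExpectancy: number;   // estimated cost of one incident (USD)
  annualRateOfOccurrence: number; // expected incidents per year
}

function annualizedLossExpectancy(scenario: RiskScenario): number {
  return scenario.singleLossExpectancy * scenario.annualRateOfOccurrence;
}

const ransomware: RiskScenario = {
  singleLossExpectancy: 50_000_000, // what the model says one breach costs
  annualRateOfOccurrence: 0.1,      // one expected breach per decade
};

// If real incidents routinely cost several times the modelled figure,
// the budgeted exposure should be scaled accordingly.
const UNDERESTIMATION_FACTOR = 5; // assumed correction, tune from incident data
console.log(annualizedLossExpectancy(ransomware));                          // 5,000,000
console.log(annualizedLossExpectancy(ransomware) * UNDERESTIMATION_FACTOR); // 25,000,000
```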
What Sam Altman's move to Microsoft means for ChatGPT's future: 3 possible paths forward
Microsoft acquires what's left of OpenAI and kicks OpenAI's current board of
directors to the curb. Much of OpenAI's current technology runs on Azure
already, so this might make a lot of sense from an infrastructure point of
view. It also makes a lot of sense from a leadership point of view, given that
Microsoft now has OpenAI's spiritual and, possibly soon, technical leadership.
Plus, if OpenAI employees were already planning to defect, it makes a lot of
sense for Microsoft to simply fold OpenAI into the company's gigantic
portfolio. I think this may be the only practical way forward for OpenAI to
survive. If OpenAI were to lose the bulk of its innovation team, it would be a
shell operating on existing technology in a market that's running at warp
speed. Competitors would rapidly outpace it. But if it were brought into
Microsoft, it could keep moving at pace, under the guidance of leadership
it is already comfortable with, and continue executing on plans it already
has.
Kaspersky’s Advanced Persistent Threats Predictions for 2024
Botnets are typically more prevalent in cybercrime activities than in APT
operations, yet Kaspersky expects the latter to start using them more. The
first reason is to create more confusion for defenders. Attacks leveraging botnets might
“obscure the targeted nature of the attack behind seemingly widespread
assaults,” according to the researchers. In that case, defenders might find it
more challenging to attribute the attack to a threat actor and might believe
they face a generic widespread attack. The second reason is to mask the
attackers’ infrastructure. A botnet can act not only as a network of proxies
but also as intermediate command-and-control servers. ... The global increase in
using chatbots and generative AI tools has been beneficial in many sectors
over the last year. Cybercriminals and APT threat actors have started using
generative AI in their activities, with large language models explicitly
designed for malicious purposes. These generative AI tools lack the ethical
constraints and content restrictions inherent in authentic AI
implementations.
Alternative data investing: Why connected vehicle data is the future
One of the most promising subsectors of the alternative data realm is
geolocation, standing at an impressive valuation of $400 million. Geolocation
is prized for its ability to correlate ground-level activity with consumer
trends, business health and revenue. But within this sphere, the real
game-changer is ‘connected vehicle’ data. Connected vehicle data, a subset of
geolocation, is an invaluable resource for investors. It enables analysis of
both passenger car and truck activities across almost any location. This opens
a window into consumer trends, helping investors decipher current demand
dynamics before company earnings calls. Moreover, tracking truck activity
provides insights into a company’s supply chain health. By monitoring truck
traffic at key economic areas – be it manufacturing facilities, warehouses,
distribution centers or seaports – investors can gauge a company’s production,
distribution and supply chain efficiencies. This level of detail can provide a
holistic view of a company’s operations and its future revenue potential.
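A rough sketch of the truck-traffic signal described above might count connected-vehicle pings within a radius of a facility of interest. The ping shape, facility coordinates, and one-kilometre radius below are illustrative assumptions, not any data vendor’s schema.

```typescript
// Hypothetical sketch: count distinct trucks seen near a facility as a crude
// proxy for inbound/outbound activity at a warehouse, plant, or seaport.
interface VehiclePing {
  vehicleId: string;
  lat: number;
  lon: number;
  timestamp: string;
}

// Great-circle distance in kilometres via the haversine formula.
function haversineKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 6371 * 2 * Math.asin(Math.sqrt(a));
}

// Count distinct vehicle IDs observed within `radiusKm` of the facility.
function trucksNearFacility(
  pings: VehiclePing[],
  facility: { lat: number; lon: number },
  radiusKm = 1
): number {
  const seen = new Set(
    pings
      .filter(p => haversineKm(p.lat, p.lon, facility.lat, facility.lon) <= radiusKm)
      .map(p => p.vehicleId)
  );
  return seen.size;
}
```

Aggregated week over week, a count like this becomes the trend line investors would compare against prior quarters ahead of earnings calls.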
The Potential Impact of Quantum Computing on Data Centre Infrastructure
All kinds of use cases involving complex algorithms are candidates for being
addressed with quantum computers. Financial modelling and risk analysis at the
macro level, environmental analysis and climate modelling (especially for
development projects in ecologically sensitive areas), supply chain
optimisation, life sciences, AI-based drug discovery and drug repurposing, and
customised treatment for complex diseases would all be candidates for using
quantum computing in data centres. Apart from the above, one key area where
quantum computing will impact everybody is deepfakes. In a very short time,
generative AI has shown its capability to create fake videos of anybody with
little training material. ... Quantum computing will play a key role in
providing the infrastructure to support algorithms that can identify such fake
videos and stop them before they go viral and create law-and-order problems in
societies. Players like Facebook (including WhatsApp) and Instagram have
strong requirements for quantum computing to address the menace of fake news
and fake videos.
7 steps for turning shadow IT into a competitive edge
A formalized and transparent prioritization process is also important. CIOs
need a way to capture lightweight business cases or forecast business value to
help prioritize new opportunities. At the same time, CIOs, CISOs, and
compliance officers need to establish a risk management framework to quantify
when shadow IT creates business issues or significant risks. CIOs should
partner with CFOs in this endeavor because when departments procure their own
technologies without IT, there are often higher procurement costs and
implementation risks. CIOs should also elicit their enterprise architect’s
guidance on where reusable platforms and common services yield cost and other
business benefits. “Shadow IT often wastes resources by not generating
documentation for software that would make it reusable,” says Anant Adya, EVP
at Infosys Cobalt. “Insightful and far-reaching governance coupled with
detailed application privileges discourages shadow IT and helps build
collaborative operating models.” Creating technology procurement controls that
require CIO and CISO collaboration on technology spending is an important step
to reduce shadow IT.
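One lightweight way to capture the prioritization and risk trade-off described above is a simple weighted score per opportunity. The fields, weights, and 1-5 scales below are assumptions for illustration, not a published framework.

```typescript
// Hypothetical sketch: score shadow-IT-discovered opportunities on forecast
// business value, reuse potential, and quantified compliance risk.
interface ShadowItCase {
  name: string;
  forecastValue: number;    // 1-5: expected business value
  complianceRisk: number;   // 1-5: regulatory / security exposure
  reuseAcrossDepts: number; // 1-5: fit with reusable platforms and services
}

function priorityScore(c: ShadowItCase): number {
  // Value and reuse raise the score; compliance risk lowers it.
  return c.forecastValue * 0.5 + c.reuseAcrossDepts * 0.3 - c.complianceRisk * 0.2;
}

const backlog: ShadowItCase[] = [
  { name: "Marketing analytics SaaS", forecastValue: 4, complianceRisk: 2, reuseAcrossDepts: 3 },
  { name: "Unvetted file-sharing tool", forecastValue: 2, complianceRisk: 5, reuseAcrossDepts: 1 },
];

backlog
  .sort((a, b) => priorityScore(b) - priorityScore(a))
  .forEach(c => console.log(c.name, priorityScore(c).toFixed(1)));
```

The weights are where CIOs, CISOs, and CFOs would encode their shared view of value versus risk, which is exactly the collaboration the article calls for.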
Running Automation Tests at Scale Using Cypress
With Cypress, teams can easily create web automation tests, debug them
visually, and automatically run them in CI/CD pipelines, thus supporting
continuous integration (CI) and development. Though Cypress is often compared
with Selenium WebDriver, it is fundamentally and architecturally different: it
does not use Selenium WebDriver for automation testing, enabling users to
write faster, easier, and more reliable tests. Installation and setup are also
easier than with other test automation frameworks because Cypress is a Node
package; you just need to run npm install cypress and then use the framework.
... Cypress offers some out-of-the-box features for running automation tests
at scale. Time travel: Cypress allows you to "time travel" through the web
application, showing what happens at each step as the tests execute. We can
step forward, step backward, and even pause test execution at run time, giving
us the flexibility to inspect the application’s state during test execution.
Auto waits: Cypress has built-in auto-wait functionality that automatically
waits for commands and assertions to complete before moving to the next step.
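A minimal Cypress spec illustrating the auto-wait behaviour might look like the following; the URL, selectors, and expected text are hypothetical placeholders for an application under test.

```typescript
// Hypothetical spec file, e.g. cypress/e2e/login.cy.ts
describe("login form", () => {
  it("logs a user in", () => {
    cy.visit("https://example.com/login");

    // No explicit sleeps: cy.get() automatically retries until the element
    // exists (or the default timeout elapses) before typing.
    cy.get("input[name=email]").type("user@example.com");
    cy.get("input[name=password]").type("s3cret");
    cy.get("button[type=submit]").click();

    // The assertion also retries until it passes or times out, so the test
    // tolerates normal page-load and rendering delays.
    cy.contains("Welcome back").should("be.visible");
  });
});
```

Because cy.get() and .should() retry automatically up to their timeouts, the test needs no hand-written waits even on slower pages, which is what keeps Cypress suites stable when run at scale in CI.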
Quote for the day:
"Success is how high you bounce when
you hit bottom." -- Gen. George Patton