Where IT Consultancies Expect to Focus in 2025
“Much of what’s driving conversations around AI today is not just the technology
itself, but the need for businesses to rethink how they use data to unlock new
opportunities,” says Chaplin. “AI is part of this equation, but data remains the
foundation that everything else builds upon.” West Monroe also sees a shift
toward platform-enabled environments where software, data, and platforms
converge. “Rather than creating everything from scratch, companies are focusing
on selecting, configuring, and integrating the right platforms to drive value.
The key challenge now is helping clients leverage the platforms they already
have and making sure they can get the most out of them,” says Chaplin. “As a
result, IT teams need to develop cross-functional skills that blend software
development, platform integration and data management. This convergence of
skills is where we see impact -- helping clients navigate the complexities of
platform integration and optimization in a fast-evolving landscape.” ... “This
isn’t just about implementing new technologies, it’s about preparing the
workforce and the organization to operate in a world where AI plays a
significant role. ...”
How Is AI Shaping the Future of the Data Pipeline?
AI’s role in the data pipeline begins with automation, especially in handling
and processing raw data – a traditionally labor-intensive task. AI can
automate workflows and allow data pipelines to adapt to new data formats with
minimal human intervention. With this in mind, Harrisburg University is
actively exploring AI-driven tools for data integration that leverage LLMs and
machine learning models to enhance and optimize ETL processes, including web
scraping, data cleaning, augmentation, code generation, mapping, and error
handling. These adaptive pipelines, which automatically adjust to new data
structures, allow companies to manage large and evolving datasets without the
need for extensive manual coding. ... Beyond immediate operational
improvements, AI is shaping the future of scalable and sustainable data
pipelines. As industries collect data at an accelerating rate, traditional
pipelines often struggle to keep pace. AI’s ability to scale data handling
across various formats and volumes makes it ideal for supporting industries
with massive data needs, such as retail, logistics, and telecommunications. In
logistics, for example, AI-driven pipelines streamline inventory management
and optimize route planning based on real-time traffic data.
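Tying back to the LLM-assisted ETL work described above, the following minimal sketch shows how a pipeline might ask a model to map an unfamiliar source schema onto a known target schema and then apply that mapping. Everything here (the `call_llm` helper, `TARGET_SCHEMA`, the prompt wording) is a hypothetical illustration, not any specific product's API.

```python
# Minimal sketch of an LLM-assisted schema-mapping step in an adaptive ETL pipeline.
# `call_llm`, `TARGET_SCHEMA`, and the prompt wording are hypothetical placeholders.
import json

TARGET_SCHEMA = ["customer_id", "order_date", "total_amount"]

def call_llm(prompt: str) -> str:
    """Hypothetical wrapper around whatever LLM client the pipeline uses."""
    raise NotImplementedError("plug in your model client here")

def propose_mapping(source_columns: list[str]) -> dict[str, str]:
    """Ask the model to map unfamiliar source columns onto the target schema."""
    prompt = (
        "Map each source column to one of the target columns, or to null.\n"
        f"Source: {source_columns}\nTarget: {TARGET_SCHEMA}\n"
        'Answer with a JSON object of the form {"source_column": "target_column"}.'
    )
    mapping = json.loads(call_llm(prompt))
    # Guardrail: never trust the model blindly; keep only mappings onto known targets.
    return {src: dst for src, dst in mapping.items() if dst in TARGET_SCHEMA}

def transform(record: dict, mapping: dict[str, str]) -> dict:
    """Rename fields according to the proposed mapping; unmapped fields are dropped."""
    return {mapping[key]: value for key, value in record.items() if key in mapping}
```

The point of the guardrail is that the model only proposes the mapping; deterministic code still decides what actually flows into the pipeline.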
Innovating with Data Mesh and Data Governance
Companies choose a data mesh to overcome the limitations of “centralized and
monolithic” data platforms, as noted by Zhamak Dehghani, the director of
emerging technologies at Thoughtworks. Technologies like data lakes and
warehouses try to consolidate all data in one place, but enterprises can find
that the data gets stuck there. A company might have only one centralized data
repository, typically run by a single team such as IT, that serves the data up
to everyone else in the company. That central handoff creates bottlenecks which
slow down data access. For example, after taking days to get HR privacy
approval, the finance department’s data access requests might then sit in the
inbox of one or two people in IT for several more days. Instead, a data mesh puts data control in
the hands of each domain that serves that data. Subject matter experts (SMEs)
in the domain control how this data is organized, managed, and delivered. ...
Data mesh with federated Data Governance balances expertise, flexibility, and
speed with data product interoperability among different domains. With a data
mesh, the people with the most knowledge about their subject matter take
charge of their data. In the future, organizations will continue to face
challenges in providing good, federated Data Governance to access data through
a data mesh.
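As a rough illustration of how a domain might take ownership of its data while still fitting into federated governance, here is a minimal sketch of a per-domain data product contract. The `DataProduct` structure, its fields, and the `publish_contract` helper are all hypothetical names chosen only to make the ownership and interoperability idea concrete.

```python
# Illustrative sketch of a per-domain data product contract; all names here
# (DataProduct, publish_contract, the example fields) are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class DataProduct:
    domain: str               # owning domain, e.g. "finance"
    name: str                 # e.g. "monthly_revenue"
    owner: str                # SME accountable for the product
    schema: dict              # column name -> type: the interoperability contract
    access_policy: str        # e.g. "internal", "restricted-pii"
    sla_hours: int = 24       # freshness promise to downstream consumers

def publish_contract(product: DataProduct) -> dict:
    """Serialize the contract so a federated governance catalog can validate it."""
    return asdict(product)

# The finance domain owns and publishes its own product; no central team in the path.
revenue = DataProduct(
    domain="finance",
    name="monthly_revenue",
    owner="finance-data-sme",
    schema={"month": "date", "revenue_eur": "decimal"},
    access_policy="internal",
)
catalog_entry = publish_contract(revenue)
```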
The Agile Manifesto was ahead of its time
A fundamental idea of the agile methodology is to alleviate this rigidity and allow for
flexibility and changing requirements. The software development process should
ebb and flow as features are developed and requirements change. The software
should adapt quickly to these changes. That is the heart and soul of the whole
Agile Manifesto. However, when the Agile Manifesto was conceived, the state of
software development and software delivery technology was not flexible enough
to fulfill what the manifesto was espousing. But this has changed with the
advent of the SaaS (software as a service) model. It’s all well and good to
want to maximize flexibility, but for many years, software had to be delivered
all at once. Multiple features had to be coordinated to be ready for a single
release date. Time had to be allocated for bug fixing. The limits of the
technology forced software development teams to be disciplined, rigid, and
inflexible. Delivery dates had to be met, after all. And once the software was
delivered, changing it meant delivering all over again. Updates were often a
cumbersome and arduous process. A Windows program of any complexity could be
difficult to install and configure. Delivering or upgrading software at a site
with 200 computers running Windows could be a major challenge.
Improving the Developer Experience by Deploying CI/CD in Databases
Characteristically less mature than CI/CD for application code, CI/CD for
databases enables developers to manage schema updates such as changes to table
structures and relationships. This management ability means developers can
execute software updates to applications quickly and continuously without
disrupting database users. It also helps improve quality and governance,
creating a pipeline everyone follows. The CI stage typically involves
developers working on code simultaneously, helping to fix bugs and address
integration issues in the initial testing process. With the help of
automation, businesses can move faster, with fewer dependencies and errors and
greater accuracy — especially when backed up by automated testing and
validation of database changes. Less human intervention is needed, resulting in
fewer hours spent on change management. ... Deploying CI/CD for databases
empowers developers to focus on what they do best: building better
applications. Businesses today should decide when, not if, they plan to
implement these practices. For development leaders looking to start deploying
CI/CD in databases, standardization — such as how certain things are named and
organized — is a solid first step and can set the stage for automation in the
future.
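As an example of what that standardization-first step might look like in practice, here is a minimal sketch of a CI gate that rejects database migration files which do not follow an agreed naming convention. The `V<number>__<description>.sql` pattern and the `migrations/` directory are assumptions for illustration, not a prescribed standard.

```python
# Sketch of a CI gate that enforces a migration naming convention before any
# automated deployment step runs. The V<number>__<description>.sql pattern and
# the migrations/ directory are illustrative assumptions.
import re
import sys
from pathlib import Path

MIGRATION_PATTERN = re.compile(r"^V\d{3}__[a-z0-9_]+\.sql$")

def check_migrations(directory: str = "migrations") -> list[str]:
    """Return the migration files that violate the agreed naming convention."""
    return [
        path.name
        for path in sorted(Path(directory).glob("*.sql"))
        if not MIGRATION_PATTERN.match(path.name)
    ]

if __name__ == "__main__":
    offenders = check_migrations()
    if offenders:
        print("Migration files violating the naming convention:", ", ".join(offenders))
        sys.exit(1)  # fail the CI stage so the change never reaches deployment
    print("All migration files follow the convention.")
```

Once names and locations are predictable, automated testing and deployment of schema changes becomes a much smaller step.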
To Dare or not to Dare: the MVA Dilemma
Business stakeholders must understand the benefits of technology experiments
in terms they are familiar with, regarding how the technology will better
satisfy customer needs. Operations stakeholders need to be satisfied that the
technology is stable and supportable, or at least that stability and
supportability are part of the criteria that will be used to evaluate the
technology. Wholly avoiding technology experiments is usually a mistake: it
forgoes opportunities to solve business problems in better ways, leaving
solutions less effective than they could otherwise be. Over time, this can
increase technical debt. ... These trade-offs
are constrained by two simple truths: the development team doesn’t have much
time to acquire and master new technologies, and they cannot put the business
goals of the release at risk by adopting unproven or unsustainable technology.
This often leads the team to stick with tried-and-true technologies, but this
strategy also has risks, most notably those of the hammer-nail kind in which
old technologies that are unsuited to novel problems are used anyway, as in
the case where relational databases are used to store graph-like data
structures.
2025 API Trend Reports: Avoid the Antipattern
Modern APIs aren’t all durable, full-featured products, and don’t need to be.
If you’re taking multiple cross-functional agile sprints to design an API
you’ll use for less than a year, you’re wasting resources building a system
that will probably be overspecified and bloated. The alternative is to use
tools and processes centered around an API developer’s unit of work, which is
a single endpoint. No matter the scope or lifespan of an API, it will consist
of endpoints, and each of those has to be written by a developer, one at a
time. It’s another way that turning back to the fundamentals can help you
adapt to new trends. ... Technology will keep evolving, and the way we employ
AI might look quite different in a few years. Serverless architecture is the
hot trend now, but something else will eventually overtake it. No doubt,
cybercriminals will keep surprising us with new attacks. Trends evolve, but
underlying fundamentals — like efficiency, the need for collaboration, the
value of consistency and the need to adapt — will always be what drives
business decisions. For the API industry, the key to keeping up with trends
without sacrificing fundamentals is to take a developer-centric approach.
Developers will always create the core value of your APIs.
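Returning to the endpoint as the developer's unit of work, here is a minimal sketch of a single endpoint designed and shipped on its own. The framework choice (FastAPI) and the `/orders/{order_id}` route are illustrative assumptions, not something prescribed by the source.

```python
# Minimal sketch of the "endpoint as the unit of work" idea. The framework
# (FastAPI) and the /orders/{order_id} route are illustrative assumptions.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Hypothetical in-memory store standing in for a real data source.
ORDERS = {"1001": {"id": "1001", "status": "shipped"}}

@app.get("/orders/{order_id}")
def get_order(order_id: str) -> dict:
    """One endpoint, designed and shipped on its own, without a months-long API program."""
    order = ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order
```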
The targeted approach to cloud & data - CIOs' need for ROI gains
AI and DaaS are part of the pool of technologies Pacetti draws on, and the
company also uses AI provided by Microsoft, both with ChatGPT and
Copilot. Plus, AI has been integrated into the e-commerce site to support
product research and recommendations. But there’s an even more essential area
for Pacetti. “With the end of third-party cookies, AI is now essential for
exploiting the little data we can still capture from internet users who accept
tracking while browsing,” he says. “We use Google’s GA4 to compensate for missing
analytics data, for example, by exploiting data from technical cookies.” ...
CIOs discuss sales targets with CEOs and the board, cementing the IT and
business bond. But another, even more innovative, step is not only to make IT
a driver of revenue but also to measure IT itself with business indicators.
This is a form of advanced convergence achieved by following specific
methodologies. Sondrio People’s Bank (BPS), for example, adopted business
relationship management, which deals with translating requests from
operational functions to IT and, vice versa, bringing IT into operational
functions. BPS also adopts proactive thinking, a risk-based framework for
strategic alignment and compliance with business objectives.
Hidden Threats Lurk in Outdated Java
How important are security updates? After all, Java is now nearly 30 years
old; haven’t we eliminated all the vulnerabilities by now? Sadly not, and
realistically, that will never happen. OpenJDK contains 7.5 million lines of
code and relies on many external libraries, all of which can be subject to
undiscovered vulnerabilities. ... Since Oracle changed its distributions and
licensing, there have been 22 updates. Of these, six patch set updates (PSUs)
required a modification and a new release to address a regression that had been
introduced. The time to create the new update has varied from just under two
weeks to over five weeks. At no time have any of the critical patch updates
(CPUs) been affected like this. Access to a CPU is essential to maintain the
maximum level of security for your applications. Since all free binary
distributions of OpenJDK provide only the PSU version, some users may consider
a couple of weeks’ delay before being able to deploy an acceptable risk. ...
When an update to the JDK is released, all
vulnerabilities addressed are disclosed in the release notes. Bad actors now
have information enabling them to try and find ways to exploit unpatched
applications.
How to defend Microsoft networks from adversary-in-the-middle attacks
Depending on the impact of the attack, begin the cleanup process. Start by
forcing a password change on the user account, ensuring that you have revoked
all tokens to block the attacker’s fake credentials. If the consequences of
the attack were severe, consider disabling the user’s primary account and
setting up a new temporary account as you investigate the extent of the
intrusion. You may even consider quarantining the user’s devices and
potentially taking forensic-level backups of workstations if you are unsure of
the original source of the intrusion so you can best investigate. Next, review
all app registrations, changes to service principals, enterprise apps, and
anything else the user may have changed or impacted since the time the
intrusion was noted. You’ll want to do a deep investigation into the mailbox’s
access and permissions. Mandiant has a PowerShell-based script that can assist
you in investigating the impact of the intrusion. “This repository contains a
PowerShell module for detecting artifacts that may be indicators of UNC2452
and other threat actor activity,” Mandiant notes. “Some indicators are
‘high-fidelity’ indicators of compromise, while other artifacts are so-called
‘dual-use’ artifacts.”
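As a rough sketch of the first containment steps (revoking tokens and disabling the primary account), the following example calls Microsoft Graph directly from Python. It assumes an access token with sufficient privileges has already been acquired elsewhere; the `user_id` and token handling are placeholders, and this is not the Mandiant tooling mentioned above.

```python
# Rough sketch of the first containment steps via Microsoft Graph: revoke the
# user's sign-in sessions, then disable the account. The access token is assumed
# to be acquired elsewhere with sufficient privileges; user_id is a placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def contain_account(user_id: str, access_token: str) -> None:
    headers = {"Authorization": f"Bearer {access_token}"}

    # Invalidate refresh and session tokens so stolen tokens stop working.
    resp = requests.post(f"{GRAPH}/users/{user_id}/revokeSignInSessions", headers=headers)
    resp.raise_for_status()

    # Disable the primary account while the investigation runs.
    resp = requests.patch(
        f"{GRAPH}/users/{user_id}",
        headers={**headers, "Content-Type": "application/json"},
        json={"accountEnabled": False},
    )
    resp.raise_for_status()
```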
Quote for the day:
"To think creatively, we must be able
to look afresh at what we normally take for granted." --
George Kneller