The Agile Manifesto 20 years on: agility in software delivery is still a work in progress
What's missing from many agile initiatives is "ways to manage what you do based
on value and outcomes, rather than on measuring effort and tasks," says Morris.
"We've seen the rise of formulaic 'enterprise agile' frameworks that try to help
you to manage teams in a top-down way, in ways that are based on everything on
the right of the values of the Agile Manifesto. The manifesto says we value
'responding to change over following a plan,' but these frameworks give you a
formula for managing plans that don't really encourage you to respond to change
once you get going." ... Ritchie agrees that there's too much of a tendency to
pigeonhole agile into rigid processes. "The first and most-common mistake is the
interpretation of agile as simply a process, or something you can just buy and
do to immediately call yourself agile," says Ritchie. "This more often than not
results in process for the sake of process, frustration, and - contradictory to
the intent of agile - an even further disconnect between business outcomes and
the IT professionals chartered to deliver them." Related to this, he says, there
can often be a "dogmatic agile zealot approach, where everything a particular
framework says must be taken as gospel ..."
Combining edge computing and IoT to unlock autonomous and intelligent applications
Edge computing isn’t limited to sensors and other IoT devices; it can also involve
traditional IT devices, such as laptops, servers, and handheld systems.
Enterprise applications such as enterprise resource planning (ERP), financial
software, and data management systems typically don’t need the level of
real-time data processing most commonly associated with autonomous
applications. Edge computing has the most relevance in the world of enterprise
software in the context of application delivery. Employees don’t need access to
the whole application suite or all of the company’s data. Providing them just
what they need with limited data generally results in better performance and
user experience. Edge computing also makes it possible to harness AI in
enterprise applications, such as voice recognition. Voice recognition
applications need to work locally for fast response, even if the algorithm is
trained in the cloud. “For the first time in history, computing is moving out of
the realm of abstract stuff like spreadsheets, web browsers, video games, et
cetera, and into the real world,” Thomason said. Devices are sensing things in
the real world and acting based on that information.
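To make the "trained in the cloud, run locally" point concrete, here is a minimal
sketch of on-device inference in Python (the model file name, the keyword-spotting
task, and the input shape are illustrative assumptions, not details from the
article): a model exported from cloud training is loaded with ONNX Runtime and
scored entirely on the edge device, so the audio never leaves it and responses
stay fast.

    import numpy as np
    import onnxruntime as ort  # pip install onnxruntime

    # Hypothetical speech model trained in the cloud, exported to ONNX,
    # and shipped to the edge device alongside the application.
    session = ort.InferenceSession("keyword_spotter.onnx",
                                   providers=["CPUExecutionProvider"])

    # One second of 16 kHz audio from the local microphone
    # (random data stands in for a real capture here).
    audio = np.random.randn(1, 16000).astype(np.float32)

    input_name = session.get_inputs()[0].name
    scores = session.run(None, {input_name: audio})[0]
    print("predicted keyword index:", int(scores.argmax()))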
From The Vault: Top Statistical Ideas Behind The Data Science Boom
Improved data collection strategies (think sensors, Internet) have resulted in
enormous datasets. But data collection and curation still consume nearly 80% of a
data engineer’s typical day. Data is still a problem; it was even more so a couple
of decades ago. The idea behind the bootstrap distribution is to use it as an
approximation of the data’s sampling distribution. According to researchers,
parametric bootstrapping, prior and posterior predictive checking, and
simulation-based calibration allow replication of datasets from a model instead
of directly resampling from the data. Calibrated simulation in the face of
uncertain data volumes is a standard procedure rooted in statistics and helps in
analysing complex models or algorithms. Gelman and Vehtari believe future
research will lean more towards inferential methods, taking ideas such as unit
testing from software engineering and applying them to problems of learning from
noisy data. “As our statistical methods become more advanced, there will be a
continuing need to understand the links between data, models, and substantive
theory,” concluded the authors. The ideas mentioned above have laid the
foundation for modern-day deep learning and other such tools.
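As an illustration of the bootstrap idea described above, here is a minimal sketch
in Python (the toy data, the choice of the mean as the statistic, and the resample
count are assumptions for the example): the observed data is resampled with
replacement, and the distribution of the statistic across resamples serves as an
approximation of its sampling distribution.

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy sample standing in for real observations.
    data = rng.normal(loc=5.0, scale=2.0, size=200)

    def bootstrap_distribution(sample, statistic, n_resamples=5000):
        """Approximate the sampling distribution of `statistic`
        by resampling `sample` with replacement."""
        n = len(sample)
        return np.array([statistic(rng.choice(sample, size=n, replace=True))
                         for _ in range(n_resamples)])

    boot_means = bootstrap_distribution(data, np.mean)
    print("bootstrap standard error of the mean:", boot_means.std(ddof=1))
    print("95% percentile interval:", np.percentile(boot_means, [2.5, 97.5]))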
Driving innovation with emotional intelligence
EQ is increasingly recognized as a competitive advantage, according to a survey
by Harvard Business Review Analytic Services. It found that emotionally
intelligent organizations get an innovation premium. These organizations
reported more creativity, higher levels of productivity and employee engagement,
significantly stronger customer experiences, and higher levels of customer
loyalty, advocacy, and profitability. Organizations that did not focus on
emotional intelligence had “significant consequences, including low
productivity, lukewarm innovation, and an uninspired workforce,” said the
report. ... Verizon surveyed senior business leaders both before and after
COVID-19. Before the pandemic, less than 20% of respondents said EQ would be an
important skill for the future. But since COVID-19, EQ has increased in significance
for 69% of respondents. ... “A sure way to stifle innovation is to not have the
emotional maturity to recognize that innovation and creativity can come from
many sources,” says Steele. “I think that our agency has hugely benefited from
research institutes, large businesses, small businesses, and individual
contributors.” She continues, “The capacity to recognize untapped sources of
innovation, then bringing them together in a system, is a great ability to
have.”
Three flaws that sat in Linux kernel since 2006 could deliver root privileges to attackers
While the vulnerabilities “are in code that is not remotely accessible, so this
isn’t like a remote exploit,” said Nichols, they are still troublesome. They
take “any existing threat that might be there. It just makes it that much
worse,” he explained. “And if you have users on the system that you don’t really
trust with root access, it breaks them as well.” Referring to the theory that
‘many eyes make all bugs shallow,’ Linux code “is not getting many eyes or the
eyes are looking at it and saying that seems fine,” said Nichols. “But, [the
bugs] have been in there since the code was first written, and they haven’t
really changed over the last 15 years.” As a matter of course, GRIMM researchers
try “to dig in” and see how long vulnerabilities have existed when they can – a
more feasible proposition with open source. That the flaws escaped detection for
so long has a lot to do with the sprawl of the Linux kernel. It “has gotten
so big” and “there’s so much code there,” said Nichols. “The real strategy is
make sure you’re loading as little code as possible.”
Master Data Management: Much More Than Technology
Industry experts define data governance as the “authority over the management of
data assets” and assigning “accountability for the quality of your
organization’s data.” Having authority over data assets is the function of data
ownership. Being accountable for the quality of these data assets is the
function of data stewardship. Data is a business asset, and business assets are
controlled by business people. Therefore, data owners and data stewards should
be business people. They must be careful not to manage their data within the
narrow focus of their own business unit (department or division); instead, they
must ensure that their data is managed from an enterprise perspective so that it
can be used and shared by all business units. Enterprise information management
(EIM) is about the administration of data. One industry expert describes EIM as
“a function, typically dedicated to an organization in IT, for maintaining,
cataloging, and standardizing corporate data.” This is done with the help of
data stewards under the umbrella of a data strategy, and by establishing
data-related standards, policies, and procedures.
Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI?
Light travels faster than electrons. The concept of using light as a substitute
for carrying out heavy tasks (aka photonic computing/optical computing) dates
back to the 1980s, when Bell Labs (now Nokia Bell Labs), an American industrial research and
scientific development company, tried to develop a light-based processor.
However, due to the impracticality of creating a working optical transistor, the
concept didn’t take off. We experience optical technology in cameras, CDs, and
even in Blu-ray discs. But the photons are usually converted into electrons
before being processed in chips. Four decades later, photonic computing gained
momentum when IBM and researchers from the Universities of Oxford and Münster
developed a system
that uses light instead of electricity to perform several AI model-based
computations. Alongside this, Lightmatter’s new AI chip has created a buzz in the
industry. According to the company website, Envise can run the largest neural
networks at three times the inferences per second of the Nvidia DGX-A100, with
seven times the inferences per second per watt on BERT-Base with the SQuAD dataset.
Why Data Management Needs An Aggregator Model
The traditional approach to managing unstructured data has been storage-centric;
you move data to a storage system, and the storage system manages your data and
gives you some tools to search it and report on it. This approach worked and
made things easier when data volumes were small and all of an enterprise's data
could fit in a single storage solution. As enterprises shift to a hybrid
multicloud architecture, they can no longer afford to manage data within each
storage silo, search for data silo by silo, and pay a heavy cost to move data
from one silo to another. As GigaOm analyst Enrico Signoretti pointed out: "The
trend is clear: The future of IT infrastructures is hybrid ... [and] it requires
a different and modern approach to data management." Another key reason an
aggregator model for data management is needed is that customers want to extract
value from their data. To analyze and search unstructured data, systems rely on
vital information stored in what is called "metadata" — information about the data
itself. Metadata is like an electronic fingerprint of the data. For example, a
photo on your phone might have information about the time and location when it
was taken as well as who was in it. Metadata is very valuable, as it is used to
search, find and index different types of unstructured data.
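As a small illustration of the kind of metadata described above, the sketch below
reads a photo's embedded EXIF tags with the Pillow library (the file name is a
placeholder, and which tags are present depends on the device that took the
picture):

    from PIL import Image, ExifTags  # pip install Pillow

    # Placeholder path; most phone-camera JPEGs carry EXIF metadata
    # such as the capture time and camera model.
    photo = Image.open("vacation.jpg")
    exif = photo.getexif()

    # Map numeric EXIF tag IDs to readable names and print the file's "fingerprint".
    for tag_id, value in exif.items():
        tag_name = ExifTags.TAGS.get(tag_id, tag_id)
        print(f"{tag_name}: {value}")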
3 Ways To Improve Board-Level Focus on Third-Party Risk Management
Assessing the risks that third parties bring to your business shouldn’t begin
once you have signed the contract. Instead, security and procurement teams
should be reviewing known risks in potential vendors during the sourcing and
selection stage of the vendor lifecycle. Unfortunately, though, only 31% of
companies conduct thorough pre-contract due diligence, indicating there is a
long way to go to overcome this obstacle. ... Third-party risk management can’t
be a one-and-done task. It needs to be a continuous process built into the risk
DNA of the enterprise. However, most organizations can easily get tripped up
performing vendor risk assessments, since half are still using manual
spreadsheets to manage their vendors, and a further 34% say it takes over a
month to complete an assessment of a top-tier vendor. This traditional static
annual assessment approach must give way to a more dynamic process that
incorporates real-time risk metrics. Agility should be the order of the day in
assessing third parties. ... Effectively reducing vendor risk requires an
understanding of how vendors are performing against expectations – both security
and performance-related.
Compromised devices and data protection: Be prepared or else
Encryption alone isn’t fully sufficient to secure data; multiple layers of
encryption are often necessary to ensure that any exposed data is rendered
unreadable and unusable. For example, an encryption
tool like BitLocker, if used on its own, can leave data vulnerable in certain
scenarios such as if a power failure interrupts the encryption process, or if a
system administrator’s credentials are compromised. In the wrong hands, a system
administrator account can be used to view all files decrypted, in clear text.
However, deploying a solution like the Encrypting File System (EFS) as a
secondary encryption layer on top of BitLocker will provide additional
file-level encryption. In this way, EFS makes it possible to ensure the
encryption of sensitive data, even if an attacker has gained access to device
hardware and has powerful credentials in hand. This approach has the added
benefit of making it possible to service devices without allowing data access or
presenting any risk of exposure. By implementing a layered encryption strategy
with protection at both the full-drive and file levels, organizations can rest
assured that the loss of a particular device is hardly a loss at all.
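As a minimal sketch of the file-level layer described above, assuming a Windows
host whose volume is already protected by BitLocker (the file path is a
placeholder): the Win32 EncryptFileW call asks EFS to encrypt an individual file,
adding per-file protection on top of the full-disk layer.

    import ctypes
    from ctypes import wintypes

    advapi32 = ctypes.WinDLL("advapi32", use_last_error=True)
    advapi32.EncryptFileW.argtypes = [wintypes.LPCWSTR]
    advapi32.EncryptFileW.restype = wintypes.BOOL

    def efs_encrypt(path: str) -> None:
        """Apply EFS (file-level) encryption to the given file."""
        if not advapi32.EncryptFileW(path):
            raise ctypes.WinError(ctypes.get_last_error())

    # Placeholder file on a BitLocker-protected volume.
    efs_encrypt(r"C:\Users\alice\Documents\payroll.xlsx")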
Quote for the day:
"Becoming a leader is synonymous with
becoming yourself. It is precisely that simple, and it is also that
difficult." -- Warren G. Bennis