Everything You Need to Know About Enterprise Architecture vs. Project Management
Even though both disciplines call for their own specialized skills, they still overlap in certain areas. Sometimes different teams are working on separate initiatives or parts of a landscape, and in the middle of a project they find out that each team needs to work on the same piece of software or service ... Handling such a situation without any mishap requires coordination and a good system for foreseeing these dependencies, since it is hard to keep track of them all and some may come back to bite you later. This is where enterprise architecture is needed. Enterprise architects are usually well aware of these relationships, and with their expertise in architecture models they can uncover dependencies that are usually unknown to project or program managers. This is where enterprise architecture and project management intersect: enterprise architecture is about managing the coherence of your business, whereas project management is responsible for planning and managing, usually from a financial and resource perspective.
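As a loose illustration (not from the article), a simple landscape model is enough to surface hidden dependencies: if each initiative is mapped to the components it touches, overlaps fall out of the data before teams collide mid-project. The initiative and component names below are hypothetical.

from collections import defaultdict

# Hypothetical mapping of initiatives to the landscape components they touch.
initiative_components = {
    "customer-portal-revamp": {"identity-service", "billing-api", "web-frontend"},
    "invoice-automation":     {"billing-api", "erp-connector"},
    "mobile-app-launch":      {"identity-service", "web-frontend", "push-gateway"},
}

# Invert the map: which initiatives depend on each component?
component_initiatives = defaultdict(set)
for initiative, components in initiative_components.items():
    for component in components:
        component_initiatives[component].add(initiative)

# Any component touched by more than one initiative is a coordination point
# the architect can flag to the project and program managers up front.
for component, initiatives in sorted(component_initiatives.items()):
    if len(initiatives) > 1:
        print(f"{component} is shared by: {', '.join(sorted(initiatives))}")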
A Minimum Viable Product Needs a Minimum Viable Architecture
In short, as the team learns more about what the product needs to be, they build only as much of the product, and make only as few architectural decisions, as are absolutely essential to meet the needs they know about now; the product continues to be an MVP, and the architecture continues to be an MVA supporting the MVP. The reason for both of these actions is simple: teams can spend a lot of time and effort implementing features and QARs in products, only to find that customers don’t share their opinion of their value; beliefs about what is valuable are merely assumptions until they are validated by customers. This is where hypotheses and experiments are useful. In simplified terms, a hypothesis is a proposed explanation for some observation that has not yet been proven (or disproven). In the context of requirements, it is a belief that doing something will lead to something else, such as delivering feature X will lead to outcome Y. An experiment is a test designed to confirm or reject a hypothesis.
In Search of Coding Quality
The major difference between good- and poor-quality code is maintainability, states Kulbir Raina, Agile and DevOps leader at enterprise advisory firm Capgemini. Therefore, the best direct measurement indicator is operational expense (OPEX). “The lower the OPEX, the better the code,” he says. Other variables that can be used to differentiate code quality are scalability, readability, reusability, extensibility, refactorability, and simplicity. “Code quality can also be effectively measured by identifying technical debt (non-functional requirements) and defects (how well the code aligns to the laid-out specifications and functional requirements),” Raina says. “Software documentation and continuous testing provide other ways to continuously measure and improve the quality of code using faster feedback loops,” he adds. ... The impact development speed has on quality is a question that's been hotly debated for many years. “It really depends on the context in which your software is running,” Bruhmuller says. His organization constantly deploys to production, relying on testing and monitoring to ensure quality.
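To make the feedback-loop idea concrete, here is a minimal sketch (not from the article) of a quality gate that turns a few of the signals mentioned above, defects, technical debt, and test feedback, into a fast pass/fail result on every build. The metric names and thresholds are hypothetical.

# Hypothetical thresholds; tune them to your own codebase and OPEX goals.
QUALITY_THRESHOLDS = {
    "test_coverage_pct": 80.0,   # continuous-testing feedback
    "open_defects": 5,           # misalignment with functional requirements
    "tech_debt_hours": 40.0,     # estimated remediation effort
}

def quality_gate(metrics: dict) -> bool:
    """Pass only if every measured value stays inside its threshold."""
    ok = (
        metrics["test_coverage_pct"] >= QUALITY_THRESHOLDS["test_coverage_pct"]
        and metrics["open_defects"] <= QUALITY_THRESHOLDS["open_defects"]
        and metrics["tech_debt_hours"] <= QUALITY_THRESHOLDS["tech_debt_hours"]
    )
    print("quality gate:", "pass" if ok else "fail", metrics)
    return ok

# Example run with hypothetical numbers from a nightly build.
quality_gate({"test_coverage_pct": 83.5, "open_defects": 2, "tech_debt_hours": 18.0})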
A chip that can classify nearly 2 billion images per second
While current consumer-grade image classification technology on a digital chip can perform billions of computations per second, making it fast enough for most applications, more sophisticated image classification tasks, such as identifying moving objects, recognizing 3D objects, or classifying microscopic cells in the body, are pushing the computational limits of even the most powerful technology. The current speed limit of these technologies
is set by the clock-based schedule of computation steps in a computer
processor, where computations occur one after another on a linear schedule.
To address this limitation, Penn Engineers have created the first scalable
chip that classifies and recognizes images almost instantaneously. Firooz
Aflatouni, Associate Professor in Electrical and Systems Engineering, along
with postdoctoral fellow Farshid Ashtiani and graduate student Alexander J.
Geers, have removed the four main time-consuming culprits in the traditional
computer chip: the conversion of optical to electrical signals, the need for
converting the input data to binary format, a large memory module, and
clock-based computations.
Scrum, Remote Teams, & Success: Five Ways to Have All Three
Agile teams have long made use of team agreements (or team working
agreements). These set ground rules for the team, created by the team and
enforced by the team. When our working environment shifts as much as it has
recently, consider establishing some new team agreements specifically
designed to address remote work. Examples? On-camera expectations, team core
working hours (especially if you’re spread across multiple time zones) and
setting aside focus time during which interruptions are kept to a minimum.
... One of the huge disadvantages of a remote team is the lack of personal connections that are made just by grabbing a cup of coffee or standing around the water cooler. Remote teams need to be deliberate about counteracting isolation. Consider taking the first few minutes of a meeting to talk about anything non-work related. Set up a time for a team show-and-tell in which each team member can share something from their home, or from their home-office background, that matters to them. Find excuses for the team to share
anything that helps teammates get to know each other more—as human beings,
not just co-workers.
Cisco introduces innovations driving new security cloud strategy
Ushering in the next generation of zero trust, Cisco is building solutions
that enable true continuous trusted access by constantly verifying user and
device identity, device posture, vulnerabilities, and indicators of
compromise. These intelligent checks take place in the background, leaving
the user to work without security getting in the way. Cisco is introducing
less intrusive methods for risk-based authentication, including the
patent-pending Wi-Fi fingerprint as an effective location proxy without
compromising user privacy. To evaluate risk after a user logs in, Cisco is
building session trust analysis using the open Shared Signals and Events
standards to share information between vendors. Cisco unveiled the first
integration of this technology with a demo of Cisco Secure Access by Duo and
Box. “The threat landscape today is evolving faster than ever before,” said
Aaron Levie, CEO and Co-founder of Box. “We are excited to strengthen our
relationship with Cisco and deliver customers with a powerful new tool that
enables them to act on changes in risk dynamically and in near real-time.”
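As a rough illustration only (the article contains no code, and the signal names below are hypothetical, not Cisco’s product model or the Shared Signals and Events schema), continuous trusted access amounts to re-scoring a session whenever a new risk signal arrives, rather than trusting a single login event.

from dataclasses import dataclass

@dataclass
class SessionRisk:
    """Hypothetical risk signals observed for an active session."""
    user: str
    device_posture_ok: bool = True
    known_vulnerabilities: int = 0
    indicators_of_compromise: int = 0
    unusual_location: bool = False

def trust_score(s: SessionRisk) -> int:
    """Re-evaluate trust each time a new signal arrives, not just at login."""
    score = 100
    if not s.device_posture_ok:
        score -= 40
    score -= 10 * s.known_vulnerabilities
    score -= 30 * s.indicators_of_compromise
    if s.unusual_location:
        score -= 20
    return max(score, 0)

def enforce(s: SessionRisk) -> str:
    score = trust_score(s)
    if score >= 70:
        return "allow"          # checks stay invisible to the user
    if score >= 40:
        return "step-up auth"   # ask the user to re-authenticate
    return "revoke session"

session = SessionRisk(user="alice", indicators_of_compromise=1, unusual_location=True)
print(enforce(session))  # -> "step-up auth"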
10 key roles for AI success
The domain expert has in-depth knowledge of a particular industry or subject
area. This person is an authority in their domain, can judge the quality of
available data, and can communicate with the intended business users of an
AI project to make sure it has real-world value. These subject matter
experts are essential because the technical experts who develop AI systems
rarely have expertise in the actual domain the system is being built to
benefit, says Max Babych, CEO of software development company SpdLoad. ...
When Babych’s company developed a computer-vision system to identify moving
objects for autopilots as an alternative to LIDAR, they started the project
without a domain expert. Although research proved the system worked, what
his company didn’t know was that car brands prefer LIDAR over computer
vision because of its proven reliability, and there was no chance they would
buy a computer vision–based product. “The key advice I’d like to share is to
think about the business model, then attract a domain expert to find out if
it is a feasible way to make money in your industry — and only after that
try to discuss more technical things,” he says.
Be Proactive! Shift Security Validation Left
When security testing only kicks in at the end of the SDLC, the deployment delays caused by uncovered critical security gaps create rifts between DevOps and SOC teams. Security often gets pushed to the back of the line, and there is not much collaboration when introducing a new tool or method, such as launching occasional simulated attacks against the CI/CD pipeline. Conversely, once a comprehensive continuous security validation approach is baked into the SDLC, invoking attack-technique emulations daily through the automation built into XSPM technology identifies misconfigurations early in the process, incentivizing close collaboration between DevSecOps and DevOps. With inter-team collaboration built into both the security and software development lifecycles, and with immediate visibility into security implications, the goal alignment of both teams eliminates the strife and friction once born of internal politics. Shifting extreme left with comprehensive continuous security validation enables you to begin mapping and understanding the investments made in various detection and response technologies, and to implement findings that preempt attack techniques across the kill chain and protect real functional requirements.
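As a minimal sketch of that shift-left step (not from the article), a daily pipeline job can read the report produced by an attack-technique emulation run and fail the build when misconfigurations are found, surfacing the gap to both teams before deployment. The report layout and file name are hypothetical, not a real XSPM vendor format.

import json
import sys

def count_misconfigurations(report_path: str) -> int:
    """Count findings in the emulation report and print them for the team."""
    with open(report_path) as fh:
        report = json.load(fh)
    findings = report.get("misconfigurations", [])
    for finding in findings:
        print(f"[{finding['severity']}] {finding['technique']}: {finding['description']}")
    return len(findings)

if __name__ == "__main__":
    # A non-zero exit fails the pipeline, so gaps appear early in the SDLC
    # instead of at deployment time, in front of both DevOps and security.
    sys.exit(1 if count_misconfigurations("emulation-report.json") > 0 else 0)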
Unlocking the ‘black box’ of education data
Technology enables education leaders to understand a child’s learning
journey in a way that hasn’t previously been possible, whether through logging the time a child spends on a certain task, recording areas in which students consistently do well or poorly, or noting hours spent in extra-curricular programmes. Edtech allows the collection and centralisation
of data on a child across their years spent in school. This data can then be
used to build up a holistic picture of the student’s learning to share with
everyone who supports that pupil, from teachers, parents and carers to
learning support assistants, all of whom can contribute to the discussion of a pupil’s areas for focus and improvement. Artificial intelligence (AI) data analytics can be a valuable tool, allowing teachers to visualise and assess the most effective ways of learning in the classroom and the metacognition processes occurring, and to intervene if needed to support learning. Beyond the classroom, education leaders and policy makers
can aggregate data to develop strategies and policies.
How to Retain Talent in Uncertain Circumstances
“There was confusion and uncertainty, which led to a willingness for those
professionals in those organizations to listen to the opportunities we had,”
Sasson says. “There was no visibility whatsoever, which created an
environment where they were more open to hearing what else was out there.”
In some cases a company may be planning downsizing after a merger, and they
may be allowing that uncertainty to linger because they want some employees
to voluntarily find new jobs, Sasson says. However, in other cases
organizations may want to retain their valuable talent, particularly in this
tight job market. Just because there’s a merger or acquisition doesn’t necessarily mean that everyone will stampede for the door. ...
Sasson’s team asked the employees at Proofpoint why they weren’t interested
in new opportunities. “From what we understand, the CEO at Proofpoint and
the Thoma Bravo team -- they seemed to do an excellent job of communicating
the value of the acquisition and limiting the jitters that would typically
be felt by the rank and file,” Sasson said.
Quote for the day:
"A leader should demonstrate his
thoughts and opinions through his actions, not through his words." --
Jack Weatherford