The DevOps Journey: Continuous Mindset Starts With Cultural Change
A continuous mindset is intertwined throughout all three of the above
approaches. To reach the maximum value in each area, a continuous mindset
needs to be enabled, matured and applied to disciplines in other DevOps areas.
The changes span people, processes, technology and culture. Like DevOps
itself, gaining a continuous mindset is a journey that will take time, many
steps and constant learning. A continuous mindset changes the way people
achieve outcomes for your business. This requires an emphasis on changing the
way people think at every level of the organization, the way people are
connected and the way they work. Achieving the full ROI of a continuous
mindset requires significant cultural adoption and collaboration. A key
component is leveraging strong organizational change management (OCM). ...
Ultimately, a continuous mindset changes the organizational engagement model
and requires organizations to align on how and when they will work together.
This requires shifting responsibilities and empowering teams and individuals
to contribute to the improvement of DevOps capabilities.
What does a real economist think of cryptocurrencies?
It’s a computing question, but obviously intersects with economic issues and
the core question is how will all this stuff pay for itself, which is still
unclear. Economists do or at least could have a lot to say about that. I
wouldn’t say the economists are worthless. I would say they haven’t been
valuable yet. They may come to the party very late. Cryptocurrencies are not
fundamentally new monies, but you will find people in the sector who will
still argue these things will serve as literal currencies. I think the name
cryptocurrency has become unfortunate; clearly they can be used as currencies
for some purposes, sometimes black- or gray-market purposes, but I don’t
think that’s fundamentally what they are. You see this with NFTs, which are
their own thing and closely related to crypto. But how does that relate to a
currency? Is an NFT an artwork? Within the language of cryptography it all
makes more sense. It’s more of a unified development, and that’s a better way
to think about them than thinking of them as currencies.
Hybrid working in the metaverse
Employees would be represented by avatars that used non-fungible tokens
(NFTs) and cryptocurrencies to buy goods and services, and accessed
applications, such as Slack or Dropbox, within this virtual space in order
to communicate and collaborate. Operating as a platform-as-a-service
offering, this virtual office would be based on technology ranging from
augmented reality (AR) and virtual reality (VR) to digital twins, and would integrate
with third-party tools and applications to create a truly immersive
environment. But as to whether such a concept is likely to take off any time
soon, Iain Fisher, director of the Northern European digital strategy and
solutions practice at research and advisory firm ISG, is not convinced –
although he does believe it could have a role to play in certain,
predominantly creative, industries. For instance, he sees computer gaming
being a “huge” market, while the immersive nature of the technology means it
could appeal to retailers, entertainment providers and advertisers keen to
offer new customer experiences.
The Long Road to Quantum: Are We There Yet?
In the past few years, we have witnessed that access to quantum computing
hardware has catalyzed an entire ecosystem involving algorithms, middleware,
firmware and control systems. Indeed, there's an entire supply chain of quantum
computing-relevant hardware and services. It includes Zapata, QCWare,
Riverlane, Q-Control, and many others. The same will happen with the larger
quantum ecosystem when access to other categories of quantum hardware
systems is made available over the cloud: quantum simulators, quantum
emulators, analog quantum machines, and programmable but targeted-purpose
quantum systems. Consider quantum sensing, quantum signal processing,
quantum analog machine learning, and quantum communications. The list goes on. The
barrier to entry to any one of those applications is enormous. Speaking
specifically to cold atom-based quantum technology, which is what I know, it
takes something like three or four Ph.D. physicists, two or more years, and
$2M or so to establish a credible effort that involves hardware. Now,
suppose the barrier to creating the hardware is removed; hardware expertise,
the development time, and the capital costs of hardware now go away.
How the metaverse could shape cybersecurity in 2022
Ever since the idea of the Metaverse hit the news, a flurry of cybercriminal
activity has been evident through rising NFT scams. Since these scams deploy
social engineering tactics, it’s safe to say that social engineering attacks
are not going away any time soon. In fact, there will likely be a rise in
attacks as the metaverse continues to take shape. Because the metaverse is
going to house an extensive collection of sensitive data, a rise in hacking
attacks is likely. Along with that comes an evident impact on data privacy.
If things remain vulnerable,
there could be frequent hacks and data theft, harming all users. With that
comes the imminent threat of scams and malware invasions. However, what is
probably most deeply concerning is that the metaverse is built on
blockchain technology. While this technology is secure, it is not immune to
vulnerabilities altogether. Moreover, it is decentralized, with no
designated admin or moderator to keep charge or control. With such an
absence of authority, there will be no possible way to retrieve stolen or
illegally obtained assets. Since the Metaverse will operate through avatars,
there will be no concrete method to identify cybercriminals.
Why microservices need event-driven architecture
Moving data across wide area networks, which are often unstable and
unpredictable, can be tricky and time-consuming. Add to that the challenges
created by the Internet of Things, big data and mobile devices, and the
result is significant risk to microservices initiatives. Older
systems are not quick and easy to update, but on the flip side microservices
need to be swift and flexible. Older implementations rely on aging
communication protocols, while microservices rely on APIs and open
protocols. Most legacy systems will be deployed on premise, while most
microservices live in the cloud. Newer systems such as IoT networks use
highly specialized protocols, but most microservices APIs and frameworks do
not support them as standard. Event-driven architecture addresses these
mismatches between legacy systems and microservices. ... EDA takes data from
static to fluid: instead of sitting stuck at rest in a database, locked
underneath an API, data is fully in motion, consumable as business-critical
events happen in real time. RESTful microservices alone are not enough.
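To make that contrast concrete, here is a minimal Python sketch of the event-driven pattern. Everything in it is hypothetical: the in-memory EventBus stands in for a real message broker (Kafka, an AMQP broker, or similar), and the topic name, legacy adapter and consumer handlers are invented for illustration. The point it shows is that the legacy system pushes an event the moment something happens and each microservice reacts, instead of every service polling a REST API across an unreliable network.

```python
# Minimal event-driven sketch: an in-memory bus stands in for a real broker.
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, DefaultDict, Dict, List


@dataclass
class Event:
    topic: str
    payload: Dict[str, object]
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class EventBus:
    """Stands in for a message broker: producers publish, consumers subscribe."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Event], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Event], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, event: Event) -> None:
        # Data is "in motion": every interested consumer reacts as the event
        # happens, rather than polling a REST endpoint over an unreliable WAN.
        for handler in self._subscribers[event.topic]:
            handler(event)


bus = EventBus()


def legacy_order_created(order_id: str, amount: float) -> None:
    # A thin adapter at the boundary of a hypothetical legacy system: it
    # translates an internal state change into a published event.
    bus.publish(Event("orders.created", {"order_id": order_id, "amount": amount}))


# Two independent microservices consume the same event without knowing about
# each other or about the legacy system's internal protocol.
bus.subscribe("orders.created", lambda e: print(f"billing: invoice for {e.payload['order_id']}"))
bus.subscribe("orders.created", lambda e: print(f"analytics: recorded {e.payload['amount']}"))

legacy_order_created("A-1001", 49.90)
```

In a real system the bus would be a durable, asynchronous broker, which is what lets the network and protocol mismatches described above be absorbed by the middleware rather than by each individual service.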
Performance Vs. Scalability
Sometimes architects divide systems into smaller parts. They may separate a
data store into two physically different parts. This means that one part of
the system can scale up or down separately from the other parts of the
system. This can be useful when one part of the system receives more traffic
than other parts. For example, the menu part of the system may receive
thousands of requests per second, while the ordering part may only receive a
few transactions per second. If the menu part of the system maxes out
resources, it may slow down ordering even though ordering is not doing much
work. Independent scaling would allow you to throw resources at the menu so
that ordering doesn’t suffer performance degradation. On its face, this
seems like a nice-to-have feature of the architecture. However, this may
massively increase the system’s complexity, and it may actually increase
costs. Database instances are often expensive, and you often pay a flat
monthly fee whether you use them heavily or not. Moreover, the busier part
of the system should only affect the performance of the less active part if
the busier part actually maxes out its resources.
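As a rough illustration of what independent scaling buys, here is a toy Python sketch. The menu and ordering handlers, pool sizes and sleep times are all invented; the point is only that giving each workload its own worker pool lets a burst of menu traffic run hot without queueing ordering requests behind it, at the cost of running (and paying for) two pools even when one sits idle.

```python
# Toy sketch: separate worker pools so the hot path cannot starve the quiet one.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_menu_request(item_id: int) -> str:
    time.sleep(0.01)  # simulated read-heavy lookup
    return f"menu item {item_id}"

def handle_order(order_id: int) -> str:
    time.sleep(0.05)  # simulated transactional write
    return f"order {order_id} accepted"

# Scaled independently: many workers for the busy menu path, few for ordering.
menu_pool = ThreadPoolExecutor(max_workers=16, thread_name_prefix="menu")
order_pool = ThreadPoolExecutor(max_workers=2, thread_name_prefix="order")

# A burst of menu traffic...
menu_futures = [menu_pool.submit(handle_menu_request, i) for i in range(1000)]
# ...does not queue in front of, or steal workers from, the ordering pool.
order_futures = [order_pool.submit(handle_order, i) for i in range(5)]

print(order_futures[0].result())  # ordering latency is unaffected by the menu burst
menu_pool.shutdown(wait=True)
order_pool.shutdown(wait=True)
```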
Inside the blockchain developers’ mind: Can EOS deliver a killer social DApp?
On Ethereum, users have just addresses, similar to Bitcoin addresses, which
are long strings of numbers and letters that are free to create because
they don’t take up any network storage. This is critical because anything
that takes up network storage or uses some of the network’s computational
resources has a real-world cost that must be paid by someone. Steem wanted
to be a social blockchain and so, the theory went, users needed a centralized
account that would be easy to remember, which they would use to manage their
frequent interactions. So, it made perfect sense for these accounts to have
human-readable names, but that also meant that they took up network storage.
But this centralized account also makes you a
target. If you have a single private key that you regularly use to access an
account and that account holds valuable tokens, then hackers are going to do
their best to gain access to your computer so that they can steal your money
and anything else of value you might have on there.
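A purely conceptual Python sketch of that storage trade-off follows. The hashing below is a stand-in (Ethereum's real derivation uses Keccak-256 over the public key, not SHA-256), and derive_address, register_name and the name_registry dict are invented names; the dict simply represents on-chain storage that someone has to pay for.

```python
# Conceptual sketch: addresses are free to derive, named accounts cost storage.
import hashlib
import secrets

def derive_address(public_key: bytes) -> str:
    # A pure function of the key: nothing is stored on the network until the
    # address is actually used in a transaction. (Stand-in hashing, not
    # Ethereum's real Keccak-256 scheme.)
    return "0x" + hashlib.sha256(public_key).hexdigest()[-40:]

# A named-account system instead keeps a name -> key mapping on chain.
name_registry: dict[str, bytes] = {}  # every entry occupies network storage

def register_name(name: str, public_key: bytes) -> None:
    if name in name_registry:
        raise ValueError(f"{name} is already taken")
    name_registry[name] = public_key  # this write is what consumes storage

alice_key = secrets.token_bytes(33)  # placeholder public key
print(derive_address(alice_key))     # free to compute, hard to remember
register_name("alice", alice_key)    # memorable, but paid for in storage
```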
No more transistors: The end of Moore’s law
The problem with Moore’s Law in 2022 is that the size of a transistor is now
so small that there just isn’t much more we can do to make them smaller. The
transistor gate, the part of the transistor through which electrons flow as
electric current, is now approaching a width of just 2 nanometers, according
to the Taiwan Semiconductor Manufacturing Company’s production roadmap for
2024. A silicon atom is 0.2 nanometers wide, which puts the gate length of 2
nanometers at roughly 10 silicon atoms across. At these scales, controlling
the flow of electrons becomes increasingly difficult as all kinds of
quantum effects play themselves out within the transistor itself. With
larger transistors, a deformation of the crystal on the scale of atoms
doesn’t affect the overall flow of current, but when you only have about 10
atoms distance to work with, any changes in the underlying atomic structure
are going to affect this current through the transistor. Ultimately, the
transistor is approaching the point where it is simply as small as we can
ever make it and have it still function. The way we’ve been building and
improving silicon chips is coming to its final iteration.
Embracing Agile Values as a Tech and People Lead
A key agile principle to me is “Embrace Change”, the subtitle of Extreme Programming Explained
by Kent Beck. Change is continuous in our world and also at work. Accepting
this fact makes it easier to let go of a decision that was taken once under
different circumstances, and find a new solution. Changing something is also
easier if there is already momentum from another change. So I like to
understand where the momentum is and then facilitate its flow. We had a
large organizational change at the beginning of 2020. Some teams were newly
created and everyone at MOIA was allowed to self-select into one of around
15 teams. That was very exciting. Some team formations went really well.
Others didn’t. There were two frontend developers who had self-selected into
a team that had less frontend work to do than expected. These two tried to
make it work by taking over more responsibility in other fields, thus
supporting their team, but after a year they were frustrated and felt stuck.
Recognizing the right moment that they needed support from the outside to
change their team assignment was very important.
Quote for the day:
"Successful leadership requires positive self-regard fused with optimism
about a desired outcome." -- Warren Bennis