PaaS is back: Why enterprises keep trying to resurrect self-service developer platforms

As ever in enterprise IT, it’s a question of control. Or, really, it’s an
attempt by organizations to find the right balance between development and
operations, between autonomy and governance. No two enterprises will land
exactly the same on this freedom continuum, which is arguably why we see every
enterprise determined to build its own PaaS/cloud. Hearkening back to Coté’s
comment, however, the costs associated with being a snowflake can be high. One
solution is simply to enable developer freedom … up to a point. As Leong
stressed: “I talk to far too many IT leaders who say, ‘We can’t give developers cloud self-service because we’re not ready for “You build it, you run it”!’
whereupon I need to gently but firmly remind them that it’s perfectly okay to
allow your developers full self-service access to development and testing
environments, and the ability to build infrastructure as code (IaC) templates
for production, without making them fully responsible for production.” In other
words, maybe enterprises needn’t give their developers the keys to the kingdom;
the garage will do.
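As a concrete illustration of that "garage, not the kingdom" split, here is a minimal, hypothetical access-policy sketch in Python; the role names, environments, and actions are illustrative, not taken from the article:

```python
# Hypothetical "freedom up to a point" policy: full self-service in dev/test,
# IaC template authoring for production, but no direct production operation.
ENVIRONMENT_POLICY = {
    "developer": {
        "dev":  {"provision", "deploy", "destroy"},   # full self-service
        "test": {"provision", "deploy", "destroy"},
        "prod": {"submit_iac_template"},              # authoring only
    },
    "platform_ops": {
        "prod": {"review_iac_template", "apply_iac_template", "operate"},
    },
}

def allowed(role: str, env: str, action: str) -> bool:
    """Return True if the role may perform the action in that environment."""
    return action in ENVIRONMENT_POLICY.get(role, {}).get(env, set())

assert allowed("developer", "test", "destroy")              # keys to the garage
assert allowed("developer", "prod", "submit_iac_template")  # can author IaC
assert not allowed("developer", "prod", "operate")          # not to the kingdom
```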
Why EA as a subject is a "must have" now more than ever
Enterprise architecture as a subject, along with knowledge of a reference architecture like IT4IT™, would help EA aspirants appreciate the tools for managing a digital enterprise. As students, we know that various organizations are undergoing digital transformation, but we hardly understand where to start the journey, or how to go about it, if left on our own. Knowledge of the TOGAF® Architecture Development Method (ADM) is a fantastic starting point for answering that question. An as-is assessment followed by a to-be assessment (or vice versa, depending on context) across business, data, application, and technology is a practical place to begin. The “Opportunities and Solutions” phase then helps produce a roadmap of initiatives an enterprise can choose from for its digital transformation. Enterprise architecture as a b-school subject would cut across various disciplines and give students a holistic view.
5 steps to minimum viable enterprise architecture

At Carrier Global Corp., CIO Joe Schulz measures EA success by business metrics
such as how employee productivity is affected by application quality or service
outages. “We don’t look at enterprise architecture as a single group of people
who are the gatekeepers, who are more theoretical in nature about how something
should work,” says Schulz. He uses reports and insights generated by the EA tool LeanIX to describe the interconnectivity of the ecosystem, as well as the system capabilities across the portfolio, to identify redundancies or gaps. This allows
the global provider of intelligent building and cold chain solutions to
“democratize a lot of the decision-making…(to) bring all the best thinking and
investment capacity across our organization to bear.” George Tsounis, chief
technology officer at bankruptcy technology and services firm Stretto,
recommends using EA to “establish trust and transparency” by informing business
leaders about current IT spending and areas where platforms are not aligned to
the business strategy. That makes future EA-related conversations “much easier
than if the enterprise architect is working in a silo and hasn’t got that
relationship,” he says.
3 strategies to launch an effective data governance plan
Develop a detailed lifecycle for access that covers employees, guests, and
  vendors. Don’t delegate permission setting to an onboarding manager as they
  may over-permission or under-permission the role. Another risk of handling identity governance only at onboarding is that it doesn’t address the changes in access needed as employees change roles or leave the company. Instead, leaders of every part of the organization should determine in advance what access each position needs to do its job, no more and no less. Then, your IT
  and security partner can create role-based access controls for each of these
  positions. Finally, the compliance team owns the monitoring and reporting to
  ensure these controls are implemented and followed. When deciding what data
  people need to access, consider both what they’ll need to do with the data and
  what level of access they need to do their jobs. For example, a salesperson
  will need full access to the customer database, but may need only read access
  to the sales forecast, and may not need any access to the accounts payable
  app.
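To make the salesperson example concrete, here is a minimal sketch of such predefined, role-based access in Python; the roles, resources, and access levels are illustrative, not prescriptive:

```python
# Illustrative role-based access map mirroring the salesperson example above:
# each role lists exactly the access its job requires, no more and no less.
from enum import Enum

class Access(Enum):
    NONE = 0
    READ = 1
    FULL = 2

ROLE_PERMISSIONS = {
    "salesperson": {
        "customer_database": Access.FULL,   # needs full access to do the job
        "sales_forecast":    Access.READ,   # read-only is sufficient
        "accounts_payable":  Access.NONE,   # no business need at all
    },
}

def can(role: str, resource: str, needed: Access) -> bool:
    """Check whether a role's predefined access covers the requested level."""
    granted = ROLE_PERMISSIONS.get(role, {}).get(resource, Access.NONE)
    return granted.value >= needed.value

assert can("salesperson", "customer_database", Access.FULL)
assert can("salesperson", "sales_forecast", Access.READ)
assert not can("salesperson", "accounts_payable", Access.READ)
```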
The Profound Impact of Productivity on Your Soul

Finishing what you set out to do feels great. Have you ever had a rush of
  satisfaction after checking off that last item on your to-do list? Feeling
  satisfied and fulfilled about what you are doing is the essence of great
  productivity. Of course, it means you are getting stuff done, but it also means that what you are getting done is actually important and meaningful. ... When we “do,” we
  share a piece of ourselves with the world. Our work can speak volumes about
  ourselves. Every time we decide to be productive and take action to complete
  something, we are embracing our identity and who we are. Being able to choose
  our efforts and be who we want to be is a rewarding feeling. However, it is
  also essential to ensure you are doing it for yourself and are not trying to
  meet someone else’s expectations of you. For example, some younger kids will
  play sports that they hate to ensure the happiness of their parents. The kids
  are doing it for their parents, rather than for themselves. What happens when you don’t do it for yourself is twofold: first, you become dependent on someone else’s validation.
Apple and Meta shared data with hackers pretending to be law enforcement officials
Apple and Meta handed over user data to hackers who forged the emergency data requests typically sent by law enforcement, according to a report by Bloomberg. The slip-up happened in mid-2021, with both companies falling for
  the phony requests and providing information about users’ IP addresses, phone
  numbers, and home addresses. Law enforcement officials often request data from
  social platforms in connection with criminal investigations, allowing them to
  obtain information about the owner of a specific online account. While these
  requests require a subpoena or search warrant signed by a judge, emergency
  data requests don’t — and are intended for cases that involve life-threatening
  situations. Fake emergency data requests are becoming increasingly common, as
  explained in a recent report from Krebs on Security. During an attack, hackers
  must first gain access to a police department’s email systems. The hackers can
  then forge an emergency data request that describes the potential danger of
  not having the requested data sent over right away, all while assuming the
  identity of a law enforcement official. 
New algorithm could be quantum leap in search for gravitational waves

Grover's algorithm, developed by computer scientist Lov Grover in 1996,
  harnesses the unusual capabilities and applications of quantum theory to make
  the process of searching through databases much faster. While quantum
  computers capable of processing data using Grover's algorithm are still a
  developing technology, conventional computers are capable of modeling their
  behavior, allowing researchers to develop techniques which can be adopted when
  the technology has matured and quantum computers are readily available. The
  Glasgow team are the first to adapt Grover's algorithm for the purposes of
  gravitational wave search. In the paper, they demonstrate how they have
  applied it to gravitational wave searches through software they developed
  using the Python programming language and Qiskit, a tool for simulating
  quantum computing processes. The system the team developed reduces the number of search operations to a figure proportional to the square root of the number of templates, where a classical search scales linearly with that number. Current quantum processors are much slower at performing
  basic operations than classical computers, but as the technology develops,
  their performance is expected to improve.
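For intuition on that square-root scaling: a classical scan over N templates needs on the order of N checks, while Grover's algorithm needs on the order of √N oracle calls. Below is a minimal, illustrative Grover search in Python using Qiskit (not the Glasgow team's code; it assumes the qiskit and qiskit-aer packages are installed), marking one of four "templates" encoded in two qubits:

```python
# Minimal Grover search over 4 "templates" (2 qubits), marking |11>.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h([0, 1])       # uniform superposition over all four templates
qc.cz(0, 1)        # oracle: flip the phase of the marked template |11>
qc.h([0, 1])       # diffusion operator: inversion about the mean...
qc.x([0, 1])
qc.cz(0, 1)
qc.x([0, 1])
qc.h([0, 1])       # ...which amplifies the marked amplitude
qc.measure_all()

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)      # all shots land on '11' after a single Grover iteration
```

With four candidates, one Grover iteration finds the marked state with certainty; in general the number of iterations grows only as the square root of the search space, which is the scaling the excerpt describes.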
ID.me and the future of biometric zero trust architecture

Although their system was poorly executed and architected, ID.me and the IRS were on the right path: biometrics is a great way to verify identity and to deter fraud. But the second part, the part they missed, is that biometrics only
  fights fraud if it is deployed in a way that preserves user privacy and
  doesn’t itself become a new data source to steal. Personal data fraud has
  become the seemingly unavoidable penalty for the convenience of digital
  services. According to consumer reporting agency Experian, fraud has increased
  33 percent over the past two years, with fraudulent credit card applications
  being one of the main infractions. Cisco’s 2021 Cybersecurity Threat Trends
  report finds that at least one person clicked a phishing link in 86 percent of
  organizations and that phishing accounts for 90 percent of data breaches. It’s hard not to think that storing the personal and biometric data of the entire United States tax-paying population in one database would become a catalyst for the mother of all data breaches.
GitOps Workflows and Principles for Kubernetes

In essence, GitOps combines the advantages of Git with the practicality and reliability of DevOps best practices. By applying version control, collaboration, and compliance to infrastructure, teams manage infrastructure with the same approach they use for software code, enabling greater collaboration, release speed, and accuracy. ... Just like Kubernetes, GitOps is declarative: Git declares the desired state, while GitOps works to achieve and maintain that state. As mentioned above, GitOps creates a single source of truth, because everything from your app code to cluster configurations is stored, versioned, and controlled in Git. GitOps focuses on automation: the approved desired state can be applied automatically, with no hands-on intervention. Built-in automated environment testing (the same way you test app code) leverages a familiar workflow to ensure software quality initiatives are met before merging to production. Finally, GitOps is, in a way, self-regulating: if the application deviates from the desired state, an alert can be raised, as the sketch below illustrates.
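A minimal sketch of that self-regulating loop in Python; this is illustrative only (real GitOps controllers such as Argo CD or Flux do this work), and the fetch_* functions are hypothetical stand-ins for reading Git and querying the cluster:

```python
# Illustrative GitOps reconciliation loop: compare the desired state declared
# in Git against the live state and flag any drift.

def fetch_desired_state() -> dict:
    # In practice: pull manifests from the Git repo, the single source of truth.
    return {"app": "web", "replicas": 3, "image": "web:1.4.2"}

def fetch_live_state() -> dict:
    # In practice: query the cluster API for what is actually running.
    return {"app": "web", "replicas": 2, "image": "web:1.4.2"}

def reconcile() -> None:
    desired, live = fetch_desired_state(), fetch_live_state()
    drift = {key: {"live": live.get(key), "desired": value}
             for key, value in desired.items() if live.get(key) != value}
    if drift:
        # Self-regulating: alert (or automatically re-apply) on deviation.
        print(f"Drift detected: {drift} -- re-applying desired state")
    else:
        print("Cluster in sync with Git")

reconcile()  # controllers run this continuously on a poll/watch loop
```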
Running legacy systems in the cloud: 3 strategies for success

Teams are capable of learning, but may not be familiar with cloud at the onset
  of the project. This impacts not only the initial migration but also Day 2
  operations and beyond, especially given the velocity of change and new
  features that the hyperscale platforms — namely Amazon Web Services, Google
  Cloud Platform, and Microsoft Azure — roll out on a continuous basis. Without
  the necessary knowledge and experience, teams struggle to optimize their
  legacy system for cloud infrastructure and resources — and then don’t attain
  the full capabilities of these platforms. ... No one gains a competitive
  advantage from worrying about infrastructure these days; they win with a laser
  focus on transforming their applications and their business. That’s a big part
  of cloud’s appeal – it allows companies to do just that because it effectively
  takes traditional infrastructure concerns off their plates. You can then shift
  your focus to business impacts of the new technologies at your disposal, such
  as the ability to extract data from a massive system like SAP and integrate
  with best-of-breed data analytics tooling for new insights.
Quote for the day:
"A friend of mine characterizes
    leaders simply like this : "Leaders don't inflict pain. They bear pain." --
    Max DePree