What is a Data Lake? It is not a Data Swamp
A data lake is a place for storing large amounts of data that originate from
various sources and are kept in their raw form. Importantly, the heterogeneous
data is neither cleaned nor transformed before it is loaded. Once loading is
complete, the data is available in a single system. In addition to structured
data, a data lake also stores and manages semi-structured data (CSV, logs, JSON),
unstructured data (e-mails, documents, reports), and binary data (video, audio,
images). The list of possible formats is of course incomplete, but I think you
know what I mean. The goal is to gather all company data in one place in order
to be able to access the entire data inventory quickly. Users should be able to
immediately create visualizations, reports, and analyses from the data. ... In
order for the data lake principle to work efficiently for you and not result in
a data swamp in which data can no longer be found, the collected data must offer
business value for the future. It is very difficult for analysts to extract
information from such a volume of data, especially when no metadata or tags are
used. Without them, it is hardly possible for analysts to place the data in
context.
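To make the metadata point concrete, here is a minimal Python sketch of an ingestion step that stores raw objects untouched while writing a catalog record with source, format, and business tags. The paths, field names, and tag values are illustrative assumptions, not a reference to any specific data lake product.

```python
# Minimal sketch: tagging raw objects at ingestion so a data lake stays searchable.
# Paths, tag names, and the catalog layout are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

LAKE_ROOT = Path("lake/raw")          # hypothetical landing zone
CATALOG = Path("lake/catalog.jsonl")  # hypothetical metadata catalog (one JSON record per object)

def ingest(payload: bytes, source: str, fmt: str, tags: list[str]) -> str:
    """Store a raw object untouched and record metadata so analysts can find it later."""
    object_id = hashlib.sha256(payload).hexdigest()[:16]
    LAKE_ROOT.mkdir(parents=True, exist_ok=True)
    (LAKE_ROOT / object_id).write_bytes(payload)   # raw data, no cleaning or transformation

    record = {
        "object_id": object_id,
        "source": source,                 # e.g. "crm-export", "web-logs"
        "format": fmt,                    # e.g. "csv", "json", "jpeg"
        "tags": tags,                     # business context that keeps the lake from becoming a swamp
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
    CATALOG.parent.mkdir(parents=True, exist_ok=True)
    with CATALOG.open("a") as f:
        f.write(json.dumps(record) + "\n")
    return object_id

if __name__ == "__main__":
    ingest(b"order_id,amount\n42,19.99\n", source="shop-db", fmt="csv",
           tags=["sales", "orders", "2021"])
```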
How to Get Developer and Security Teams Aligned
The concept of policies replacing security standards builds on the idea of
culture shifts. Security standards are typically just a piece of documentation
saved on Confluence or GSuite somewhere. They may get examined by a developer
during a mandatory annual training session, or occasionally for reference, but
they aren’t dynamic and are rarely top of mind. Those responsible for enforcing
such standards are normally compliance or security operations specialists, who
are logically distanced from developers. Aside from low adoption rates and
disruptions to Agile workflows, security standards often lead to the ‘enforcer’
becoming the bad guy. This drives even more of a wedge between dev and security,
making security feel a bit like doing your taxes (and no one wants that). If the
expertise of the traditional ‘enforcer’ is shared with developers and dynamic,
adaptable policies are adopted in place of rigid standards, then security simply
becomes part of the workflow. Zero-trust networking is a great example of this.
Zero-trust networking is probably the best way to secure your infrastructure,
and it relies on expertly defined and managed policies underpinning each of its
10 principles.
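As a rough illustration of what "policy as code" can look like in practice, versus a standard sitting in a wiki, the sketch below shows a deny-by-default check that developers could run in CI. The rules and field names are invented for the example and are not taken from any particular zero-trust framework.

```python
# Hedged sketch of "policy as code": a deny-by-default check developers can run in CI,
# instead of a standards document sitting on Confluence. The rules here are invented examples.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_trusted: bool
    mfa_passed: bool
    resource: str

def evaluate(request: Request) -> bool:
    """Zero-trust style evaluation: nothing is allowed unless every rule explicitly passes."""
    rules = [
        request.device_trusted,              # only managed / attested devices
        request.mfa_passed,                  # every request re-authenticated
        request.resource.startswith("svc:"), # only known, named resources
    ]
    return all(rules)   # deny by default: a single failing rule blocks access

if __name__ == "__main__":
    ok = evaluate(Request(user="dev1", device_trusted=True,
                          mfa_passed=True, resource="svc:billing"))
    print("allow" if ok else "deny")
```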
The evolution of the chief information officer (CIO)
Future success relies on leaders’ digital ability as much as their aptitude for
uniting teams and encouraging people to embrace new technology and new ways of
working at every level of the organisation. Without the leadership to make new
systems and processes work and deliver against business objectives, outlay in
innovation quickly becomes a source of future technical debt. Developing
leadership behaviours including emotional intelligence (EQ) will help CIOs to
build empathy; understanding the human impact of transition ensures teams feel
heard and valued, reduces resistance to change, and enables CIOs to build trust.
Equally, strong communication skills will enable CIOs to speak in both the
languages of data and business and use storytelling to share their vision and
secure buy-in from teams and shareholders. CIOs are also protectors. With
greater concern over business threats, CIOs safeguard their organisations’
assets and future. As well as managing data governance and cyber security, they
can add business value by anticipating the opportunities and risks presented by
disruption.
It’s time to shift from verifying data to authenticating identity
One of the more complex vishing schemes is the man-in-the-middle attack, in
which a fraudster sets up two parallel conversations between a business and its
customer. The business believes it is connecting with the customer, and the
customer thinks they are talking to the business — but in reality, it is the
fraudster interacting with both. The fraudster might initiate the scheme by
requesting the issuance of a one-time passcode via a session on the business’s
website. In parallel, posing as the business, the fraudster calls the unwitting
customer and, using social engineering, convinces the individual to read off the
one-time passcode sent by the business. The fraudster then uses this information
to log in to the customer’s account and perform unauthorized transactions.
Because the fraudster is able to provide all requested data at each point in the
verification process, access is granted. With synthetic identity fraud,
criminals combine real and fake information to create a fictitious identity,
which they use to open up financial accounts and make fraudulent purchases.
While a false identity might seem easy to spot, the reality is much more
challenging.
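A small sketch can make the gap concrete: a check that only verifies the data (the passcode value) passes no matter who relays it, whereas a device-bound approval ties the decision to an enrolled identity. The device-key approach shown is a common hardening pattern assumed here for illustration; it is not described in the article.

```python
# Sketch contrasting "verifying data" with "authenticating identity" for the relay
# attack described above. The device-bound approval is an assumed illustration.
import hashlib
import hmac
import secrets

# --- Verifying data: a readable one-time passcode -----------------------------
def send_otp() -> str:
    return f"{secrets.randbelow(1_000_000):06d}"   # sent to the customer, who can read it aloud

def verify_otp(submitted: str, issued: str) -> bool:
    # Passes for *anyone* who knows the code, including a fraudster who
    # social-engineered the customer into reading it out over the phone.
    return hmac.compare_digest(submitted, issued)

# --- Authenticating identity: device-bound approval ----------------------------
DEVICE_KEY = secrets.token_bytes(32)   # provisioned on the customer's registered device at enrolment

def device_approve(challenge: bytes, key: bytes) -> bytes:
    # The registered device signs the bank's challenge; there is no short code the
    # customer could read out to a caller, so the approval cannot be relayed by voice.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify_identity(challenge: bytes, approval: bytes) -> bool:
    expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(approval, expected)

if __name__ == "__main__":
    issued = send_otp()
    print(verify_otp(issued, issued))            # True even when submitted by the fraudster

    challenge = secrets.token_bytes(16)
    approval = device_approve(challenge, DEVICE_KEY)
    print(verify_identity(challenge, approval))  # True only for the enrolled device
```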
Serverless Computing Brings New Security Risks
Given the distributed nature of serverless functions – essentially the reason
for their flexibility and scalability – many existing security tools provide
little to no visibility into, or control over, these computing environments.
Many of the security attacks that will occur in
serverless functions will be a result of misconfigurations and mistakes that
happen outside the purview of the security team, and of legacy solutions that
don’t translate to serverless architectures. Further, because abstracted
workloads create blind spots, attackers will have more room to maneuver
undetected. Serverless functions will even render some traditional DevSecOps
tools less useful. Scanning tools must monitor hundreds of individual
repositories instead of a single monolithic repository, while application
performance monitoring (APM) tools lack security proficiency and cannot
protect from the OWASP Serverless Top 10 risks. ... For many organizations,
serverless architecture is a fundamentally different computing environment –
unlike anything they’ve experienced or had to protect before. That reality
means that organizations need a fresh approach to securing these environments
and will need to look beyond the traditional tools they have in their tech
stack today.
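As one concrete example of the visibility gap, the hedged sketch below uses boto3 to flag Lambda functions whose resource policy allows invocation by any principal – the kind of misconfiguration that tends to happen outside the security team’s purview. It assumes AWS credentials are configured; the region and the single check are illustrative, not a complete serverless security tool.

```python
# Hedged sketch: flag Lambda functions whose resource policy allows invocation by
# any principal ("*"). Assumes boto3 and AWS credentials are configured.
import json

import boto3
from botocore.exceptions import ClientError

def publicly_invokable_functions(region: str = "us-east-1") -> list[str]:
    lam = boto3.client("lambda", region_name=region)
    flagged = []
    paginator = lam.get_paginator("list_functions")
    for page in paginator.paginate():
        for fn in page["Functions"]:
            name = fn["FunctionName"]
            try:
                policy = json.loads(lam.get_policy(FunctionName=name)["Policy"])
            except ClientError:
                continue  # no resource policy attached to this function
            for stmt in policy.get("Statement", []):
                principal = stmt.get("Principal")
                if principal == "*" or principal == {"AWS": "*"}:
                    flagged.append(name)   # anyone can invoke this function
    return flagged

if __name__ == "__main__":
    for name in publicly_invokable_functions():
        print("over-permissive:", name)
```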
Center for Internet Security: 18 security controls you need
The Center for Internet Security has updated its set of safeguards for warding
off the five most common types of attacks facing enterprise
networks—web-application hacking, insider and privilege misuse, malware,
ransomware, and targeted intrusions. In issuing its CIS Controls V8 this
month, the organization sought to present practical and specific actions
businesses can take to protect their networks and data. These range from
making an inventory of enterprise assets to account management to auditing
logs. In part the new version was needed to address changes to how businesses
operate since V7 was issued three years ago, and those changes guided the
work. “Movement to cloud-based computing, virtualization, mobility,
outsourcing, work-from-home, and changing attacker tactics have been central
in every discussion,” the new controls document says. CIS changed the format
of the controls a bit, describing actions that should be taken to address
threats and weaknesses without saying who should perform those tasks. That put
the focus on the tasks without tying them to specific teams within the
enterprise. The controls each come with detailed procedures for implementing
them, along with links to related resources.
Advantages of Cloud Computing In Banking Can’t Be Ignored
A key part of successful digital banking transformation is embracing the
cloud. While there have been reservations in the past around cloud security
and regulation, cloud computing solutions are becoming prevalent in the
marketplace for both traditional and non-traditional financial institutions.
The use of data and deployment of advanced analytics, machine learning, and
artificial intelligence requires more processing power than all but the
largest financial institutions possess. The good news is that there are several
cloud-based solution providers, like IBM, that have created industry-specific
solutions for the banking industry. According to IBM, “Organizations have an
enormous opportunity to leverage cloud computing to drive innovation and
improve their competitive position. Cloud computing – whether private, hybrid
or public – enables organizations to be far more agile while reducing IT costs
and operational expenses. In addition, cloud models enable organizations to
embrace the digital transformation necessary to remain competitive in the
future.”
The cybersecurity industry is guarding against all the wrong threats
Smart technology itself, increasingly being deployed across government and
private sector systems, may soon create new webs of vulnerability, according
to a number of leading cybersecurity researchers. The problem is twofold:
Hackers will ultimately begin using artificial intelligence against systems
and there is concern that an inability to quickly spot flaws in machine
learning models could create even more vulnerabilities. It would be naïve to
believe that criminal hackers, who have already built help desk support
operations and a vast marketplace for “plug and play” intrusion tools, would
not find a way to use AI for attacks. “My guess is this isn’t very far off,
and we had better start thinking about its implications,” said security
technologist Bruce Schneier. “As AI systems get more capable, society will
cede more and more important decisions to them, which means that hacks of
those systems will become more damaging.” There is also concern within the
cybersecurity community that growing use of machine learning could be
opening new avenues of exploitation for threat actors. Adi Shamir, professor at
the Weizmann Institute in Rehovot, Israel, and a co-founder of RSA, has been
analyzing the fragile state of neural networks and recently published a
paper on his findings.
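To illustrate the fragility being discussed, the toy sketch below flips a linear classifier’s decision with a small, uniform perturbation of the input using the generic fast gradient sign method. This is not Shamir’s analysis; the model, data, and epsilon are arbitrary values chosen for the example.

```python
# Toy fast-gradient-sign example of classifier fragility (illustrative, not Shamir's work).
import numpy as np

rng = np.random.default_rng(0)
d = 50
w = rng.normal(size=d)                      # weights of a tiny linear classifier

def predict_proba(x):
    return 1.0 / (1.0 + np.exp(-(w @ x)))   # sigmoid probability of class 1

# An input the model classifies as class 1 with a solid margin (logit = +2).
x = 2.0 * w / (w @ w)
y = 1

# FGSM: gradient of the cross-entropy loss w.r.t. the input is (p - y) * w;
# step a small epsilon in the direction of its sign.
eps = 0.1
p = predict_proba(x)
grad_x = (p - y) * w
x_adv = x + eps * np.sign(grad_x)

print(f"clean prediction:       {predict_proba(x):.3f}")     # ~0.88, class 1
print(f"per-feature perturbation: {eps}")
print(f"adversarial prediction: {predict_proba(x_adv):.3f}")  # well below 0.5, class 0
```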
AIOps Has a Data(Ops) Problem
There are two issues with data collection. The first is proper
instrumentation, which sounds easier than it is. The entire observability,
monitoring, and AIOps ecosystem depends on properly instrumenting your
observable sources. If your systems, devices, services, and infrastructure
are not properly instrumented, you will have data blind spots. No matter
how much data you collect from certain areas, if you do not have a holistic
view of all the telemetry components, you will only ever get a partial view of
any system. Obviously, instrumentation depends mostly on developers. The
second issue is integration. As any AIOps vendor will tell you, this is
probably the most difficult part of getting your AIOps solution going. The
more input from varying telemetry sources, the better the insights will be.
Any good AIOps solution will be able to integrate easily with the basic
golden telemetry – logs, metrics, and traces. In addition, integrating with
notification systems (such as OpsGenie, PagerDuty, etc.), and perhaps event
streams (such as Kafka), is useful as well. However, I quite often see
major enterprises struggle to integrate AIOps solutions with their existing
enterprise systems.
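To show why that integration step is hard, here is a minimal sketch of the normalization layer an AIOps pipeline typically needs: log lines and metric samples arrive with different schemas and must be mapped into one event format before anything can be correlated. All field names here are assumptions for illustration, not any vendor’s schema.

```python
# Hedged sketch of telemetry normalization: map differently shaped log and metric
# records into one event format before correlation. Field names are illustrative.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any

@dataclass
class TelemetryEvent:
    timestamp: datetime
    source: str        # "logs" | "metrics" | "traces" | "alerts"
    service: str
    kind: str          # e.g. "error", "p99_latency_ms", "span"
    attributes: dict[str, Any]

def from_log_line(raw: dict) -> TelemetryEvent:
    return TelemetryEvent(
        timestamp=datetime.fromisoformat(raw["ts"]),
        source="logs", service=raw["app"], kind=raw["level"].lower(),
        attributes={"message": raw["msg"]},
    )

def from_metric_sample(raw: dict) -> TelemetryEvent:
    return TelemetryEvent(
        timestamp=datetime.fromtimestamp(raw["time"], tz=timezone.utc),
        source="metrics", service=raw["service"], kind=raw["name"],
        attributes={"value": raw["value"], "unit": raw.get("unit", "")},
    )

if __name__ == "__main__":
    events = [
        from_log_line({"ts": "2021-06-01T10:15:00+00:00", "app": "checkout",
                       "level": "ERROR", "msg": "payment gateway timeout"}),
        from_metric_sample({"time": 1622542500, "service": "checkout",
                            "name": "p99_latency_ms", "value": 4200}),
    ]
    for e in sorted(events, key=lambda e: e.timestamp):
        print(e.timestamp.isoformat(), e.source, e.service, e.kind)
```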
Agile Transformation: an Integral Approach
"An integral approach incorporates all of the essential perspectives, schools
of thought, and methods into a unified, comprehensive and accurate
framework" is a simple definition from the book. The main leverage of
Integral Theory is that it provides a meta-framework onto which other
techniques, approaches, and frameworks can be mapped. The fundamental premise of
integral thinking is that any school of thought or method that has been
around for any length of time must have some truth to it -- "all
perspectives are true, but partial" is a Wilber quote. Integral helps us
take multiple perspectives on situations, which is key for change and
adaptability in a complex world, instead of getting stuck in our own,
limited perspective. As Ken Wilber said to us when we interviewed him for
the book -- there are two things that, above all else, make real
transformation possible -- the ability to take the perspective of others,
and the ability to see one’s own "seer". Both of these are fostered by using
integral thinking. Doing this cuts through our confusion when we run into
the challenges of existing culture and leadership mindsets while implementing
agile.
Quote for the day:
"Open Leadership: the act of
engaging others to influence and execute a coordinated and harmonious
conclusion." -- Dan Pontefract