4 Reasons to Shift Left and Add Security Earlier in the SDLC
Collaboration is critical for the security and development teams, especially
when timelines have to change. The security operations center (SOC) team may
need to train on cloud technologies and capabilities, while the cloud team may
need help understanding how the organization performs risk management.
Understanding the roles and responsibilities of these teams and the security
functions each fulfills is critical to managing security risks. In some
scenarios, security teams can act as enablers for cloud engineering, teaching
teams how to be self-sufficient in performing threat-modeling exercises. In
other situations, security teams can act as escalation paths during security
incidents. Last, security teams can also own and operate underlying platforms or
libraries that provide contextual value to more stream-oriented cloud
engineering teams, such as IaC scanning capabilities, shared libraries for authentication and monitoring, and support of workload constructs, such as
secure service meshes.
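As one illustration of the kind of shared capability described above, a platform-owning security team might wrap an IaC scanner in a small CI gate so that cloud engineering teams get a simple pass/fail signal without having to become scanning experts. The sketch below is a minimal, hypothetical Python example; the scanner command (iac-scan) and its JSON output shape are assumptions for illustration, not any specific product's interface.
    """Minimal CI gate around an IaC scanner (illustrative sketch only).
    Assumes a CLI named `iac-scan` that prints JSON shaped like
    {"findings": [{"severity": "HIGH", "rule": "...", "resource": "..."}]};
    substitute your real scanner and its actual output format."""
    import json
    import subprocess
    import sys

    BLOCKING_SEVERITIES = {"HIGH", "CRITICAL"}  # policy owned by the security team

    def scan(path: str) -> list:
        # Run the (hypothetical) scanner against an IaC directory and parse its JSON.
        proc = subprocess.run(
            ["iac-scan", "--format", "json", path],
            capture_output=True, text=True, check=False,
        )
        return json.loads(proc.stdout or "{}").get("findings", [])

    def main() -> int:
        findings = scan(sys.argv[1] if len(sys.argv) > 1 else ".")
        blocking = [f for f in findings if f.get("severity") in BLOCKING_SEVERITIES]
        for f in blocking:
            print(f"BLOCKING: {f.get('rule')} on {f.get('resource')}")
        # A non-zero exit fails the pipeline; otherwise the gate stays out of developers' way.
        return 1 if blocking else 0

    if __name__ == "__main__":
        sys.exit(main())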
We have bigger targets than beating Oracle, say open source DB pioneers
The pitching of open source against Oracle's own proprietary database has
shifted as the market has moved on and developers lead database strategy, building a wide range of applications in the cloud, rather than a narrower set
of business applications. Zaitsev pointed out that if you look at the rankings
on DB-Engines, which combines mentions, job ads and social media data, Oracle is
always the top RDBMS. But a Stack Overflow survey would not even put Oracle in the top five. So as far as developers are concerned, the debate about whether Oracle
is the enemy is over. "The reality is, the majority of developers — especially
good developers — prefer open source," he said. ... "There's a lot of companies
now who are basically saying, 'Forget the Oracle API, I want to standardise on
the PostgreSQL API.' They don't even want a non-PostgreSQL API because they see
it is a growing market and opportunity with additional cost savings,
flexibility, and continual innovation," he said, also speaking at Percona Live.
"Years ago, if you had to rewrite your application from Oracle to PostgreSQL,
that was a negative, that was a cost to you. ..."
Ultrafast Computers Are Coming: Laser Bursts Drive Fastest-Ever Logic Gates
The researchers’ advances have opened the door to information processing at the
petahertz limit, where one quadrillion computational operations can be processed
per second. That is almost a million times faster than today’s computers
operating with gigahertz clock rates, where 1 petahertz is 1 million gigahertz.
“This is a great example of how fundamental science can lead to new
technologies,” says Ignacio Franco, an associate professor of chemistry and
physics at Rochester who, in collaboration with doctoral student Antonio José
Garzón-Ramírez ’21 (PhD), performed the theoretical studies that led to this
discovery. ... The ultrashort laser pulse sets in motion, or “excites,” the
electrons in graphene and, importantly, sends them in a particular
direction—thus generating a net electrical current. Laser pulses can produce
electricity far faster than any traditional method—and do so in the absence of
applied voltage. Further, the direction and magnitude of the current can be
controlled simply by varying the shape of the laser pulse (that is, by changing
its phase).
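For scale, a quick back-of-the-envelope check of the figures quoted above (a rough illustration only, using the standard unit definitions rather than anything from the paper itself):
    # Rough arithmetic behind the "almost a million times faster" comparison.
    petahertz = 1e15  # cycles (operations) per second at 1 PHz
    gigahertz = 1e9   # today's processors clock at a few GHz

    print(petahertz / gigahertz)  # 1000000.0 -> 1 PHz is a million times 1 GHz
    print(1 / petahertz)          # 1e-15 s: one femtosecond per cycle at 1 PHz
    print(1 / gigahertz)          # 1e-09 s: one nanosecond per cycle at 1 GHz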
A computer cooling breakthrough uses a common material to boost power 740 percent
Researchers at the University of Illinois at Urbana-Champaign (UIUC) and the
University of California, Berkeley (UC Berkeley) have recently devised an
invention that could cool down electronics more efficiently than other
alternative solutions and enable a 740 percent increase in power per unit,
according to a press release by the institutions published Thursday. Tarek
Gebrael, the lead author of the new research and a UIUC Ph.D. student in
mechanical engineering, explained that current cooling solutions have three
specific problems. "First, they can be expensive and difficult to scale up," he
said. He brought up the example of heat spreaders made of diamonds, which are obviously very expensive. Second, he described how conventional heat-spreading approaches generally place the heat spreader and a heat sink (a device for
dissipating heat efficiently) on top of the electronic device. Unfortunately,
"in many cases, most of the heat is generated underneath the electronic device,"
meaning that the cooling mechanism isn't where it is needed most.
Tech firms are making computer chips with human cells – is it ethical?
Cortical Labs believes its hybrid chips could be the key to the kinds of complex
reasoning that today’s computers and AI cannot produce. Another start-up making
computers from lab-grown neurons, Koniku, believes their technology will
revolutionise several industries including agriculture, healthcare, military
technology and airport security. Other types of organic computers are also in
the early stages of development. While silicon computers transformed society,
they are still outmatched by the brains of most animals. For example, a cat’s
brain contains 1,000 times more data storage than an average iPad and can use
this information a million times faster. The human brain, with its trillion
neural connections, is capable of making 15 quintillion operations per second.
This can only be matched today by massive supercomputers using vast amounts of
energy. The human brain only uses about 20 watts of energy, or about the same as
it takes to power a lightbulb. It would take 34 coal-powered plants generating
500 megawatts per hour to store the same amount of data contained in one human
brain in modern data storage centres.
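Taking the excerpt's own figures at face value, the efficiency gap is easy to quantify (a rough illustration only; the 15-quintillion and 20-watt numbers come from the article above, not from an independent source):
    # Back-of-the-envelope efficiency using the figures quoted in the excerpt.
    ops_per_second = 15e18  # "15 quintillion operations per second"
    brain_watts = 20        # "about 20 watts", roughly a lightbulb

    ops_per_joule = ops_per_second / brain_watts
    print(f"{ops_per_joule:.1e} operations per joule")  # ~7.5e17 ops/J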
SolarWinds: Here's how we're building everything around this new cybersecurity strategy
Now, SolarWinds uses a system of parallel builds, where the location keeps
changing, even after the project has been completed and shipped. Much of this
access is only provided on a need-to-know basis. That means if an attacker was
ever able to breach the network, there's a smaller window to poison the code
with a malicious build. "What we're really trying to achieve from a security
standpoint is to reduce the threat window, providing the least amount of time
possible for a threat actor to inject malware into our code," said Ramakrishna.
But changing the process of how code is developed, updated and shipped isn't
going to help prevent cyberattacks alone, which is why SolarWinds is now
investing heavily in many other areas of cybersecurity. These areas include the
likes of user training and actively looking for potential vulnerabilities in
networks. Part of this involved building up a red team, cybersecurity personnel
who have the job of testing network defences and finding potential flaws or
holes that could be abused by attackers – crucially before the attackers find
them.
How to stop your staff ignoring cybersecurity advice
While regular reminders are great, if you deliver the same message repeatedly,
there is a danger that staff will zone out and ultimately become disengaged with
the process. We’ve seen clear evidence of this over the past year, with
awareness of key phrases falling, sometimes significantly. In this year’s State
of the Phish Report, just over half (53%) of users could correctly define
phishing, down from 63% the previous year. Recognition also fell across common
terms like malware (down 2%) and smishing (down 8%). Ransomware was the only term to see an increase in understanding, yet only 36% could
correctly define the term. ... Cybersecurity training may not sound like most
people’s idea of fun, but there are plenty of ways to keep it positive and even
enjoyable. Deliver training in short, sharp modules, and don’t be afraid to use
different approaches such as animation or humor if it fits well into your
company culture. Making security training competitive and turning it into a game
can also aid the process. The gamification of training modules has been shown to
increase engagement and motivation, as well as improve attainment scores in
testing.
Why are current cybersecurity incident response efforts failing?
A risk-based approach to incident response enables enterprises to prioritize
vulnerabilities and incidents based on the level of risk they pose to an
organization. The simplest way of framing risk is a calculation based on frequency of
occurrence and severity. Malware frequently reaches endpoints, and response and
clean-up can cost thousands of dollars (both directly and in lost productivity).
Furthermore – and security teams all over the world would agree on this –
vulnerabilities on internet-facing systems must be prioritized and remediated
first. Those systems are continuously under attack, and as the rate of
occurrence starts to approach infinity, so does risk. Similarly, there have been
many threat groups that have cost enterprises millions directly, and in some
cases tens of millions in lost operations and ERP system downtime. Large
enterprises measure the cost of simple maintenance windows in ERP systems in
tens of millions. Thus, it’s hard to even imagine the cost calculations for a business-critical application breach. As severity increases to that order
of magnitude, so does risk.
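A minimal sketch of the risk framing described above, treating risk as frequency of occurrence times severity and triaging findings in that order; the categories, frequencies and dollar figures below are illustrative assumptions, not values from the article:
    from dataclasses import dataclass

    @dataclass
    class Finding:
        name: str
        frequency_per_year: float  # expected occurrences per year
        severity_usd: float        # estimated cost per occurrence

        @property
        def risk(self) -> float:
            # Risk framed as frequency of occurrence x severity, per the article.
            return self.frequency_per_year * self.severity_usd

    # Illustrative numbers only.
    findings = [
        Finding("Commodity malware on endpoints", 50, 5_000),
        Finding("Unpatched internet-facing service", 200, 50_000),
        Finding("Business-critical ERP application breach", 0.1, 20_000_000),
    ]

    # Triage order: highest expected annual loss first.
    for f in sorted(findings, key=lambda f: f.risk, reverse=True):
        print(f"{f.name}: ${f.risk:,.0f} expected annual loss")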
3 Must-Have Modernization Competencies for Application Teams
To decide the best path forward, leverage Competency #1. Architects and
decision-makers should begin with automated architectural assessment tools to
assess the technical debt of their monolithic applications, accurately identify
the source of that debt, and measure its negative impact on innovation. These
insights will help teams early in the cloud journey to determine the best
strategy moving forward. Using AI-based modernization solutions, architects can
exercise Competency #2 and automatically transform complex monolithic
applications into microservices — using both deep domain-driven observability
via a passive JVM agent and sophisticated static analysis, analyzing flows,
classes, usage, memory and resources to detect and unearth critical business
domain functions buried within a monolith. Whether your application is still
on-premises or you have already lifted and shifted to the cloud (Competency #3),
the world’s most innovative organizations are applying vFunction on their
complex “megaliths” to untangle complex, hidden and dense dependencies for
business-critical applications that often total over 10 million lines of code
and consist of thousands of classes.
The surprising upside to provocative conversations at work
To be sure, supporting and encouraging sensitive conversations isn’t easy.
However, leaders can create the right conditions by establishing norms, offering
resources, and helping ensure that these conversations happen in safe
environments, with ground rules about avoiding judgment or trying to persuade
people to change their minds. Critically, employees should always have the
option to just show up and listen to better understand how colleagues are
impacted by something happening in the world. The objective of these
conversations should definitely not be to reach solutions or generate consensus.
In that way, fostering these conversations is a growth opportunity for senior
executives as well, who are often much more comfortable in problem-solving mode.
The leader’s role here is to help the company bring meaning, humanity, and
social impact to the workforce—not to deliver answers. The main takeaway for
senior leaders is that you can’t isolate employees from the issues of the world.
You can, however, help them sort through those issues and create a more
welcoming, inclusive environment in which people are free to be their authentic
selves—and maybe even learn from their colleagues.
Quote for the day:
"Cream always rises to the top...so do
good leaders" -- John Paul Warren