The digital future may rely on ultrafast optical electronics and computers
Our system of ultrafast optical data transmission is based on light rather than
voltage. Our research group is one of many working with optical communication at
the transistor level – the building blocks of modern processors – to get around
the current limitations of silicon. Our system controls reflected light to
transmit information. When light shines on a piece of glass, most of it passes
through, though a little bit might reflect. That is what you experience as glare
when driving toward sunlight or looking through a window. We use two laser beams
from two sources, both passing through the same piece of glass. One beam
is constant, but its transmission through the glass is controlled by the second
beam. By using the second beam to shift the properties of the glass from
transparent to reflective, we can start and stop the transmission of the
constant beam, switching the optical signal from on to off and back again very
quickly. With this method, we can switch the glass properties much more quickly
than current systems can send electrons. So we can send many more on and off
signals – zeros and ones – in less time.
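To make the bit-rate claim concrete, here is a minimal sketch (the switch times are hypothetical placeholders, not figures from this research) showing how the time it takes to flip the glass between transparent and reflective bounds how many zeros and ones can be sent per second:

```typescript
// Illustrative only: how the on/off switching time bounds the achievable bit rate.
// The switch times below are hypothetical placeholders, not values from the study.

// One full on/off cycle must complete before the next bit can be sent,
// so the maximum bit rate is roughly 1 / switchTime.
function maxBitRate(switchTimeSeconds: number): number {
  return 1 / switchTimeSeconds;
}

const electronicSwitchTime = 1e-9; // ~1 nanosecond, a hypothetical electronic switch
const opticalSwitchTime = 1e-12;   // ~1 picosecond, a hypothetical optical switch

console.log(`Electronic: ~${(maxBitRate(electronicSwitchTime) / 1e9).toFixed(0)} Gbit/s`);
console.log(`Optical:    ~${(maxBitRate(opticalSwitchTime) / 1e9).toFixed(0)} Gbit/s`);
// A thousand-fold faster switch means roughly a thousand times more
// zeros and ones in the same amount of time.
```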
SEO Poisoning Attacks on Healthcare Sector Rising, HHS Warns
Some threat actors also use targeted types of SEO poisoning, including
spear-phishing, to go after specific users, such as IT administrators and other
privileged users. "The technique enables attackers to target and customize their
attacks to specific audiences, making them more challenging to identify and
defend against," HHS HC3 wrote. Common SEO poisoning methods also include
typosquatting, which targets users who might open their browser and input a
website address that has an inadvertent typo or click on a link with a
misspelled URL, HHS HC3 said. Attackers often register domain names that are
similar to legitimate ones but contain minor spelling errors. ... Other tactics
include cloaking, which involves showing search engine crawlers different
material than what is presented to the user when the link is clicked;
manipulating search rankings by artificially inflating a website's click-through
rate; and using private link networks, which connect groups of otherwise
unrelated websites to build a network of backlinks pointing to a main website.
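Since typosquatting hinges on domains that are only a character or two away from legitimate ones, one common defensive heuristic is to compare observed domains against a trusted list by edit distance. The sketch below is a minimal illustration under that assumption; the domain names are invented examples, not indicators from the HHS HC3 advisory:

```typescript
// Minimal illustration of spotting typosquatted domains by edit distance.
// Domain names below are invented examples, not real indicators of compromise.

function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost);
    }
  }
  return dp[a.length][b.length];
}

const trusted = "examplehealth.org";                      // hypothetical legitimate domain
const observed = ["exampleheelth.org", "unrelated.net"];  // hypothetical observed domains

for (const domain of observed) {
  const distance = editDistance(trusted, domain);
  // One or two character edits away from a trusted name is a classic typosquat pattern.
  if (distance > 0 && distance <= 2) {
    console.log(`${domain} looks like a possible typosquat of ${trusted} (distance ${distance})`);
  }
}
```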
Democratizing data to fuel data-driven business decisions
Gone are the days when all an organization’s data lived in just one place.
Business and technical users alike must be able to fully leverage data that
spans cloud, distributed, and mainframe infrastructures. Because of this,
effective data intelligence tools are those that can be applied at scale and
support various technology connections to successfully plug and play into large
organizations’ complex environments. This is increasingly critical as a growing
number of organizations turn to hybrid solutions to leverage the benefits of
both the cloud and the mainframe. According to a recent Rocket Software survey,
an overwhelming 93% of respondents strongly agree with the statement, “I believe
my organization needs to embrace a hybrid infrastructure model that spans from
mainframe to cloud.” Today’s data intelligence tools need to remove the barriers
that prevent organizations from leveraging their data assets to the fullest.
From being able to find and use the right data to increasing data use and trust,
maintaining a competitive edge requires organizations to leverage trusted data
to make informed and strategic business decisions.
I'm done with Red Hat (Enterprise Linux)
This past week, Red Hat took that knife and twisted it hard, when they published
this blog post. Let there be no mistake: this was meant to destroy the
distributions the community built to replace what Red Hat took away. There were
only two things that kept me around after Red Hat betrayed us the first time:
First, instead of attacking the community of open source users, many Red Hatters
reached out and asked, "How can we do better?" It didn't heal the wound, but it
meant something, knowing someone at Red Hat would at least listen. Second—and
more importantly—Rocky Linux and AlmaLinux stepped in. They prevented a
mass-exodus from the Red Hat ecosystem, giving developers like me a stable
target for my open source work. But Rocky and Alma relied on Red Hat sharing
their source code. Here's how it used to work: Red Hat would grab a copy of
Linux; they would add magic sauce that makes it Red Hat Enterprise Linux; they
would release a new version; and they would update a source code repository with
all the data required to build it from scratch.
What’s Next for JavaScript: New Features to Look Forward to
TypeScript was developed to make JavaScript developers more productive, rather
than to replace JavaScript, but it’s also been a source of improvements to the
language. Currently, you use TypeScript to make types explicit in your code
while you’re writing it, but then you remove them when your code runs. Though
still some way off, the stage 1 Type Annotations proposal, which would let type
information be included in JavaScript code but treated as comments by JavaScript
engines, is important because it converges TypeScript and JavaScript in a way
that keeps them aligned while making it clear that they work at different
layers. Developers could use first-class syntax for types, whether that’s
TypeScript or Flow syntax, rather than long JSDoc comment blocks,
and know that their code is still compatible with JavaScript engines and
JavaScript tooling — avoiding the complexity of needing a build step to erase
the types before their code will run, Palmer pointed out. “There’s huge value
just in having static types that only exist during development and are fully
erased during runtime,” he explained.
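As a rough sketch of the direction the proposal points in (the exact grammar is still being worked out at stage 1, so treat this as illustrative rather than final syntax), code like the following could keep its type annotations inline while a JavaScript engine simply skips over them at runtime:

```typescript
// Type annotations written inline; under the stage 1 proposal a JavaScript
// engine would ignore them at runtime, so no build step is needed to erase them.

interface User {
  name: string;
  visits: number;
}

function greet(user: User, enthusiastic: boolean = false): string {
  const punctuation: string = enthusiastic ? "!" : ".";
  return `Hello, ${user.name}${punctuation}`;
}

// Today this runs only after a tool strips the types (tsc, Babel, esbuild, etc.);
// if the proposal advances, the same file could run directly in the engine.
console.log(greet({ name: "Ada", visits: 3 }, true));
```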
Human brain-inspired computer memory design set to increase energy efficiency and performance
The researchers focused on hafnium oxide, an insulating material commonly used
in the semiconductor industry. However, there was one significant obstacle to
overcome: hafnium oxide lacks structure at the atomic level, making it
unsuitable for memory applications. But the team found an ingenious solution by
introducing barium into thin films of hafnium oxide, resulting in the formation
of unique structures within the composite material. These novel structures,
known as vertical barium-rich "bridges," allowed electrons to pass through while
the surrounding hafnium oxide remained unstructured. At the points where these
bridges met the device contacts, an adjustable energy barrier was created. This
barrier influenced the electrical resistance of the composite material and
enabled multiple states to exist within it. ... One remarkable aspect of this
breakthrough is that the hafnium oxide composites are self-assembled at low
temperatures, unlike other composite materials that require expensive
high-temperature manufacturing methods.
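One way to see why multiple stable resistance states matter is that each memory cell can then hold more than one bit. The sketch below is purely illustrative; the resistance thresholds are invented placeholders rather than measured properties of the barium-doped hafnium oxide films:

```typescript
// Illustrative only: mapping a multi-level cell's resistance to stored bits.
// The threshold values are hypothetical, not measured properties of the material.

const thresholdsInOhms = [1_000, 10_000, 100_000]; // boundaries between four states

// Quantize a resistance reading into a 2-bit symbol: 0, 1, 2, or 3.
function readCell(resistanceOhms: number): number {
  let state = 0;
  for (const threshold of thresholdsInOhms) {
    if (resistanceOhms > threshold) state++;
  }
  return state;
}

// Four distinguishable states per cell means log2(4) = 2 bits per cell,
// double the density of a binary (on/off) memory cell.
console.log(readCell(500));      // -> 0
console.log(readCell(50_000));   // -> 2
console.log(readCell(500_000));  // -> 3
```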
Enhancing Security With Data Management Software Solutions
Implementing a data management platform offers a multitude of advantages.
Foremost, it ensures heightened data integrity, furnishing dependable and
accurate information crucial for making well-informed decisions. Furthermore, it
mitigates the risk of data breaches, shielding the reputation of your business
and preserving the trust of your customers. Lastly, it streamlines regulatory
compliance, which proves invaluable considering the stringent data regulations
prevalent in numerous jurisdictions. This feature serves as a lifeline, aiding
in the avoidance of potential legal entanglements and financial repercussions
that may arise from non-compliance. By embracing a comprehensive data management
platform, businesses can enjoy the assurance of data accuracy, fortify their
security measures, and navigate the complex landscape of regulatory requirements
with ease, ultimately fostering growth, resilience, and long-term success. When
confronted with a myriad of data management solutions, selecting the ideal one
for your business requires careful consideration.
Wi-Fi 7 is coming — here's what to know
Here's how the Wi-Fi Alliance explains the upcoming standard: “Based on the
developing IEEE 802.11be standard, Wi-Fi 7 will be the next major generational
Wi-Fi technology evolution. Wi-Fi 7 focuses on physical (PHY) and medium access
control (MAC) improvements capable of supporting a maximum throughput of at
least 30Gbps to increase performance, enable Wi-Fi innovations, and expand use
cases. Additional Wi-Fi 7 enhancements will support reduced latency and jitter
for time sensitive networking applications including AR/VR, 4K and 8K video
streaming, automotive, cloud computing, gaming, and video applications, as well
as mission critical and industrial applications. As with other Wi-Fi
generations, Wi-Fi 7 will be backward compatible and coexist with legacy devices
in the 2.4, 5, and 6 GHz spectrum bands.” The alliance promises peak data rates
of 46Gbps, almost five times the 9.6Gbps peak of Wi-Fi 6 (802.11ax) and 6E, and
more than six times that of Wi-Fi 5 (802.11ac). Wi-Fi 7 is also known as IEEE
802.11be Extremely High Throughput (EHT). It works in the 2.4GHz, 5GHz, and 6GHz
bands.
Hackers Targeting Linux and IoT Devices for Cryptomining
These bots are also instructed to download and execute additional scripts to
brute-force every host in the hacked device's subnet and backdoor any vulnerable
systems using the Trojanized OpenSSH package. The bots' purpose is to maintain
persistence and deploy mining malware crafted for Hiveon OS, a Linux-based,
open-source operating system designed for cryptomining.
Microsoft attributed the campaign to a user named "asterzeu" on the
cardingforum.cx hacking forum. The user offered multiple tools for sale on the
platform, including an SSH backdoor, Microsoft said. Microsoft's disclosure
comes two days after a report on a similar campaign was published by the AhnLab
Security Emergency Response Center. The attack campaign consists of the Tsunami
DDoS bot, also known as Kaiten, being installed on inadequately managed Linux
SSH servers, the report said. As observed in Microsoft's analysis, Tsunami also
installed various other malware, cryptominers, and obfuscation tools, such as
ShellBot, XMRig CoinMiner, and Log Cleaner.
Most popular generative AI projects on GitHub are the least secure
The OpenSSF Scorecard is a tool created by the OpenSSF to assess the security of
open-source projects and help improve them. The metrics it bases the assessment
on are facts about the repository, such as the number of vulnerabilities it has,
how actively it is maintained, and whether it contains binary files. Running
Scorecard on a project checks different parts of its software supply chain,
including the source code, build dependencies, testing, and project maintenance.
The purpose of the checks is to ensure adherence to
security best practices and industry standards. Each check has a risk level
associated with it, representing the estimated risk associated with not adhering
to a specific best practice. Individual check scores are then compiled into a
single aggregate score to gauge the overall security posture of a project.
Currently, there are 18 checks that can be divided into three themes: holistic
security practices, source code risk assessment, and build process risk
assessment. The Scorecard assigns each check a score on an ordinal scale from 0
to 10, along with a risk level.
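As a rough illustration of how per-check results might roll up into one number (the check names, scores, and weights below are invented placeholders, not the actual weighting used by the OpenSSF Scorecard), an aggregate can be formed as a risk-weighted average of the individual 0-10 check scores:

```typescript
// Illustrative only: combining per-check scores into one aggregate score.
// Check names, scores, and risk weights below are hypothetical placeholders,
// not output from the actual OpenSSF Scorecard tool.

interface CheckResult {
  name: string;
  score: number;  // 0-10, higher is better
  weight: number; // heavier weight for checks tied to higher risk
}

const checks: CheckResult[] = [
  { name: "Vulnerabilities",  score: 4,  weight: 10 },
  { name: "Maintained",       score: 8,  weight: 7 },
  { name: "Binary-Artifacts", score: 10, weight: 5 },
];

// A weighted average keeps the aggregate on the same 0-10 scale as each check.
function aggregateScore(results: CheckResult[]): number {
  const totalWeight = results.reduce((sum, c) => sum + c.weight, 0);
  const weighted = results.reduce((sum, c) => sum + c.score * c.weight, 0);
  return weighted / totalWeight;
}

console.log(`Aggregate: ${aggregateScore(checks).toFixed(1)} / 10`);
```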
Quote for the day:
"Becoming a leader is synonymous with becoming yourself. It is precisely that
simple, and it is also that difficult." -- Warren G. Bennis