Preparing The Office For The AI Revolution
Continuous learning should already be a key part of your culture. Training and
developing awareness of AI capabilities and benefits is simply another string to
the bow. It helps your business stay competitive, but it also feeds into
employee engagement, retention and attracting new talent. You might organize
on-site workshops or, for those who want it, subsidize an advanced degree
program. Every employee is different, so it’ll take some mixing and
matching to tailor your efforts to their preferred learning environments, but
harnessing this learning and applying these new skills within your business not
only nurtures talent, but it also fosters forward-thinking, equipping your
people with the skills and the mentality — resilience even — to evolve with the
ever-changing world of work. Your people are your most important asset — we all
know this. Finding the right people for your business can be the difference
between success and failure. However, finding the right combination of people
and technology can give you an even greater competitive advantage, one that you
never even considered.
5 years until enterprise quantum, but your prep begins now
Taking a longer view of the enterprise quantum computing journey will yield
greater benefit and business value. That means defining the link between your
organization’s quantum program and broader business strategies. Ideally, set up
a feedback loop where business strategies help to prioritize quantum program
outcomes and the quantum program informs new strategies. ... Navigating the
rapidly evolving quantum ecosystem is difficult, with an increasing variety of
vendors, academic institutions, startups, and cloud-based service providers.
Selecting the right company requires identifying which ones can understand your
specific business use case(s) and provide the type of quantum technology you
need for your specific strategy. One challenge can be around making long-term
commitments to quantum partners. Hardware and software providers have until
recently been focused on finding a few high-value strategic collaborators, as
reflected in their access contracts or licensing terms, lengthy subscription
models, or research programs.
Ransomware attacks show we're getting clobbered on cybersecurity
The Colonial Pipeline and JBS attacks disrupt the traditional construct that
makes IT attacks the playground of criminals and OT attacks the purview of
hostile intelligence services. Criminals now have a taste of how lucrative an OT
attack might be. Will we start to see criminals target OT infrastructure more
often? That is certainly the concern of law enforcement and the intelligence
community. Attacks on infrastructure by criminals who aren’t deterred by
diplomatic niceties change the game. This concern was reflected in
announcements on Friday by the Department of Justice that ransomware extortions
will immediately move up the priority stack to equal terrorism. FBI Director
Christopher Wray followed up by likening the surge in ransomware hacks to 9/11.
These are startling statements that wouldn’t have been issued without the
Colonial Pipeline attack. Interesting what some gas station waiting lines will
trigger. Lurking behind all this is a scenario that has our protectors reaching
for antacids. Are enemy intelligence services making moves to leverage criminal
organizations to attack OT infrastructure, with all its debilitating effects, in
a kind of proxy cyber war?
Linux Variant of HelloKitty Ransomware Targets VMware ESXi Servers
The days when Linux, Unix and other Unix-like computer operating systems weren’t
typically targeted by malware authors are long gone. It might well have been the
case that attackers used to prefer bedeviling Windows systems, given that
Windows instances are far more widespread than Linux instances. As well, Linux
instances are generally well-protected against vulnerabilities, thanks to a
tight-knit user base that delivers fast security updates. Andrew Barratt,
managing principal of solutions and investigations at cybersecurity advisory
firm Coalfire, told Threatpost on Friday that we said goodbye to the days when
malware didn’t target Linux “a long time ago,” but that change was typically
server-side and hence not particularly visible to the public. “With the rise of
Mac OS on the desktop and its underlying infrastructure being based on BSD –
everyone’s favourite ‘hard nix’ – there has been a correlation in *nix based
malware as attackers target the Apple end user,” Barratt said in an email, *nix
being shorthand for any Unix, Linux or other Unix-like systems.
Assessing the Feasibility of Research and Data Science Projects
Estimating the probability of technical success — designated as P(T) from now
on — is an exercise in identifying the salient risk factors, then scoring each
individually on a consistent scale before plugging the values into a
mathematical function. Easy. But what exactly is “technical success”? The
definition of P(T) is as follows: a likelihood that, given existing data,
available technologies, and required research and development, the prototype will
be completed and will perform the defined function in a curated laboratory or
constrained production environment. P(T) is given as a percentage. The
definition is important as it sets expectations for the deliverables. We limit
the scope of P(T) to the prototype designed to test the technologies and
convince a moderate sceptic that the approach works. The tests should run in a
limited but representative case and use a vetted data set. Why the
limitations? They decrease the time needed to develop the prototype without
compromising the evaluation. They reduce the chance of misalignment with final
goals by facilitating minor, continual adjustments.
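The excerpt doesn't specify the mathematical function used to combine the factor scores, so the sketch below is an assumption for illustration: it treats each risk factor as an independent probability of success in [0, 1] and takes their product, reporting P(T) as a percentage. The factor names are hypothetical examples, not ones from the article.

```python
# Hypothetical sketch of P(T) scoring. ASSUMPTION: the article does not
# give the combining function; here each factor score is treated as an
# independent success probability and P(T) is their product.
from math import prod


def probability_of_technical_success(factor_scores: dict) -> float:
    """Return P(T) as a percentage, given per-factor scores in [0, 1]."""
    for name, score in factor_scores.items():
        if not 0.0 <= score <= 1.0:
            raise ValueError(f"score for {name!r} must be in [0, 1]")
    return 100.0 * prod(factor_scores.values())


# Example scoring for a hypothetical project (illustrative values only).
scores = {
    "data_availability": 0.9,    # a vetted data set already exists
    "technology_maturity": 0.8,  # existing tooling covers most of the work
    "research_novelty": 0.7,     # some open research questions remain
}
print(f"P(T) = {probability_of_technical_success(scores):.0f}%")  # P(T) = 50%
```

A multiplicative combination has the useful property that a single very weak factor drags P(T) down sharply, which matches the intuition that one unmitigated risk can sink a prototype regardless of how strong the other factors are.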
Top 10 Low-Code App Development Best Practices to Follow
It’s better to put an imperfect application into production than to wait
until you can release a fully baked one. Succeeding with low-code means dividing your
application into meaningful chunks and publishing them as often as you can.
The idea is to constantly get immediate field feedback from users and make
consistent small improvements. Conduct regular feedback sessions with your
team members and see how the app operates and what it lacks. ... Low-code
platforms provide prepared UI libraries with consistent components. They’re
easy to use, and it’s a good practice to create a simple interface that
borrows the best features from market leaders. It will take less time and
money than creating a unique UI/UX and will allow you to release your app
faster. In our experience, UX specialists bring value at the beginning of
the project, but their role decreases significantly later on; if UX and
visual design support is needed again, experts can
contribute on an as-needed basis. ... To continue developing a good product,
you should always communicate with the low-code community and your
users.
The ICE Blockchain
The purpose of this new blockchain is the much-needed addition of EVM
compatibility to our ecosystem. We believe EVM compatibility is paramount for
the growth of the ICON Network (see Polygon’s recent success as a good
indicator of its importance). It enables developers to iterate and innovate
much faster using existing tooling and code from the Ethereum ecosystem and
allows for increased growth prospects of ICON given easier portability of
applications. Additionally, with this new chain, we are also adding eWASM
support that puts us ahead of the game, preparing us for ETH 2.0
compatibility. ... The ICE blockchain will start as a Proof of Authority
network, with trusted validators operating the chain. After the network is
stabilized, it will migrate to NPoS, the consensus mechanism commonly used by
other Substrate SDK networks. NPoS is a variation of DPoS (ICON Network’s
current consensus mechanism), where token holders vote for validators to
produce blocks and govern the network. Smart contracts on the ICE Network will
be written in Solidity, Rust, and C++. Fees for smart contract interactions,
smart contract deployments, and token transfers will be paid in ICE tokens.
Physical threats increase as employees return to the office
“As corporations advance the digital transformation of their physical security
operations, it has never been more important to seize the opportunity to align
their cybersecurity operations and infrastructure,” said Lukas Quanstrom, CEO
of Ontic. “Bringing together all threat data and intelligence in an always-on,
technology-driven approach to security is the most effective way to advance
business continuity in today’s increasingly hyper-connected, hyper-violent
environment.” ... Of the physical threats that resulted in harm or death at
companies in 2021, 49% of respondents think most or almost all could have been
avoided if cybersecurity and physical security intelligence were unified so
threats could be shared and actioned by cross-functional teams. 55% of
physical security and IT leaders say their CEO believes training employees so
they are better prepared for potential workplace violence will create a
culture of fear. 26% of those surveyed say their company has never addressed
the potential for workplace violence and employees would not know what to do
if an active shooter entered their facilities.
First, repeated coding vulnerabilities caught by the security team create
frustration for both groups, causing dissension that can impede innovation.
Further, the cycle of coding, testing, fixing and then retesting for
vulnerabilities delays application releases and extends time to value. And when
vulnerabilities lead to successful cyberattacks, everyone is frustrated. The
legal and compliance teams face regulatory repercussions. Customers and partners
lose trust in the organization. And the C-suite and board see the full scope of
the ramifications: time and money wasted, privacy violations and fines, loss of
intellectual property, customer and partner attrition, and lawsuits and brand
damage. As a result, ensuring secure code from the very beginning of the
software development life cycle cannot be ignored as an essential part of a
comprehensive cybersecurity defense strategy. So how can you close this security
gap, eliminate application vulnerabilities at the source and make everyone
happy? First, recognize there are no bad guys on your defense team. This is a
problem of process — that is, ensuring the right people have the right skills
and access to the right information.
Decentralized Finance: The Next Big Threat for the Finance Sector
Decentralized finance, or DeFi, is a fast-growing segment of the financial
markets. Based on a blockchain platform, DeFi provides software services that
can cut out intermediaries in financial transactions, thereby allowing for
financial services, such as mortgages and investment, to be delivered at lower
costs. The question is: Will it take off, or will the financial sector push
back? ... First of all, there are a host of technical risks and concerns about
attacks and hacks that have been very significant in DeFi. There have been
hundreds of millions of dollars lost because these systems are not sufficiently
mature, robust and resilient. ... There are legal risks as well, where
regulators appropriately have concerns about things like money laundering and
fraud that are going on in the larger blockchain and cryptocurrency world, as
well as in DeFi specifically. The value of these DeFi services is that they are
decentralized, so there’s not one actor that is responsible for all of the
transactional activity.
Quote for the day:
"It's not about how smart you are--it's
about capturing minds." -- Richie Norton