Researchers develop new learning algorithm to boost AI efficiency
A working group led by the two computer scientists Wolfgang Maass and Robert
Legenstein of TU Graz has adopted this principle in the development of the new
machine learning algorithm e-prop (short for e-propagation). Researchers at
the Institute of Theoretical Computer Science, which is also part of the
European lighthouse project Human Brain Project, use spikes in their model for
communication between neurons in an artificial neural network. The spikes only
become active when they are needed for information processing in the network.
Learning is a particular challenge for such sparsely active networks, since
longer observation is needed to determine which neuron connections improve
network performance. Previous methods achieved too little learning success or
required enormous storage space. E-prop now solves this problem by means of a
decentralized method copied from the brain, in which each neuron documents
when its connections were used in a so-called e-trace (eligibility trace). The
method is roughly as powerful as the best and most elaborate known learning
methods.
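To make the idea of an eligibility trace more concrete, here is a minimal Python sketch, not the authors' published e-prop implementation: each connection keeps a decaying record of recent input spikes, and a learning signal that arrives later is combined with that local trace to update the weight, so no long activity history has to be stored.

```python
import numpy as np

# Illustrative sketch of an eligibility-trace update (assumed toy setup, not
# the published e-prop algorithm). Each synapse keeps a decaying trace of how
# recently its input neuron spiked; a later learning signal combines with that
# trace locally to update the weight.

rng = np.random.default_rng(0)
n_in, n_out = 5, 3
weights = rng.normal(0.0, 0.1, size=(n_out, n_in))
traces = np.zeros((n_out, n_in))   # one e-trace per connection
decay, lr = 0.9, 0.01              # trace decay and learning rate

for t in range(100):
    # Sparse input spikes: each input neuron fires with low probability.
    pre_spikes = (rng.random(n_in) < 0.1).astype(float)
    # Trace decays over time and is bumped whenever the input neuron fires.
    traces = decay * traces + pre_spikes[np.newaxis, :]
    # A per-output learning signal (e.g. a broadcast error) arrives online.
    learning_signal = rng.normal(0.0, 1.0, size=(n_out, 1))
    # Weight update = learning signal x eligibility trace, computed locally.
    weights += lr * learning_signal * traces
```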
Data Leadership Book Review and Interview
The Data Leadership Framework is first about acknowledging that there is a whole
bunch of stuff an organization needs to do to make the most of data. The five
DLF Categories are where we evaluate an organization’s data capabilities and
figure out where they are struggling most amid the complexity. The twenty-five
DLF Disciplines are where we then focus energy (i.e., invest our limited
resources) to achieve the biggest outcomes. By creating relative balance across the
DLF Categories, we maximize the overall impact of our data efforts. This is what
we need to be doing all the time with data, but without something like the Data
Leadership Framework, the problems can feel overwhelming and people have trouble
figuring out where to start, or what to do next. This is true of everybody, from
data architects and developers to the CEO. If we can use the Data Leadership
Framework to make sense amidst the chaos, the individual steps themselves are
much less daunting. Data competency is no longer a “nice-to-have” item. From
data breaches to analytics-driven disruptors in every industry, this is as big
a deal to businesses as cash flow.
Enterprise Architecture for Managing Information Technology Standards
While globalization is excellent for business as it extends opportunities
for markets that were previously closed and permits the sharing of ideas and
information across different platforms, it could threaten the budgetary
plans of SMBs. Investments in licensing, infrastructure, and global
solutions, in general, hit this segment harshly. Lack of Talent Pool: This
problem is primarily limited to the technology segment. Around half of
employees lack the critical thinking skills that would qualify them to grow
further in this field. The most significant hurdle IT teams have faced so far
is having members who lack the skills to put a general hardware and software
security environment in place cost-effectively. IT Policy Compliance
Failure: Specific technologies used by IT projects don’t comply with the
policy rules as defined by their departments. IT departments are sometimes
unaware of techniques used by their teams and business stakeholders,
increasing the risk of uncontrolled data flows and non-compliance. In addition,
these technologies are sometimes incompatible with the existing portfolio.
This increases IT debt, primarily if technology standards are not enforced.
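As an illustration of the kind of standards enforcement the article alludes to, here is a minimal sketch in which a project's proposed technologies are checked against an approved-standards catalog before they enter the portfolio. The catalog entries and project data are hypothetical.

```python
# Hypothetical sketch: flag project technologies that are not in the approved
# standards catalog, so non-compliant choices surface before they add IT debt.

APPROVED_STANDARDS = {
    "database": {"PostgreSQL", "SQL Server"},
    "messaging": {"Kafka"},
    "runtime": {"Java 17", "Python 3.11"},
}

project = {
    "database": "MongoDB",      # not in the approved catalog
    "messaging": "Kafka",
    "runtime": "Python 3.11",
}

def compliance_report(project, standards):
    """Return (category, technology) pairs that violate the catalog."""
    violations = []
    for category, tech in project.items():
        if tech not in standards.get(category, set()):
            violations.append((category, tech))
    return violations

for category, tech in compliance_report(project, APPROVED_STANDARDS):
    print(f"Non-compliant {category}: {tech} is not an approved standard")
```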
IoT Architecture: Topology and Edge Compute Considerations
Network engineers often have experience with a particular topology and may
assume it can be used in any setting, but sometimes another choice would be
a better fit for a different use case. To determine whether a mesh
networking topology is a good choice for your application, it is important
to understand the pros and cons of this strategy. A critical factor to
analyze is your system's timing requirements. Mesh networking topologies
route data from node to node across the network, so each "hop" adds latency
that must be accounted for. Do you need the data back in 100 ms, or can you
live with once a second? ... Wireless
point-to-point (PTP) and point-to-multipoint (PTMP) are topologies used for
connectivity in a wide range of applications, such as use cases where you
want to replace cables with wireless communication. These protocols
communicate between two devices (point-to-point) or from one device to many
(point-to-multipoint). There are a few factors to consider, such as
distance, timing, and battery power, that may indicate whether a PTP network
or a mesh network is needed.
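To make the latency trade-off concrete, here is a small back-of-the-envelope sketch that estimates end-to-end delay for a multi-hop mesh route versus a direct point-to-point link. The per-hop and link delays are assumed values for illustration, not measurements from any particular radio.

```python
# Rough latency estimate: mesh route (latency accumulates per hop) vs. a
# direct PTP link. The millisecond figures below are assumptions.

def mesh_latency_ms(hops, per_hop_ms=25.0):
    """Total one-way latency when data is relayed node to node."""
    return hops * per_hop_ms

def ptp_latency_ms(link_ms=10.0):
    """Single direct wireless link between two devices."""
    return link_ms

deadline_ms = 100.0   # "do you need the data back in 100 ms?"
for hops in (1, 3, 5, 8):
    total = mesh_latency_ms(hops)
    verdict = "meets" if total <= deadline_ms else "misses"
    print(f"{hops} hops -> {total:.0f} ms ({verdict} the {deadline_ms:.0f} ms deadline)")
print(f"direct PTP link -> {ptp_latency_ms():.0f} ms")
```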
An introduction to confidential edge computing for IoT security
Recent attacks, even outside of IoT, showed that hackers exploited weak
configurations of public cloud services to access sensitive data. The reason
that hackers succeeded in obtaining sensitive information stored on a public
cloud had nothing to do with the security mechanisms implemented by the
cloud provider; it was rather the result of small mistakes made by the end
users, typically in the Web Application Firewall (WAF) that controls
access to the cloud network or in leaving credentials unprotected. These
small mistakes are almost inevitable for companies that have a cloud-only
infrastructure. However, demarcating sensitive and non-sensitive
information can help their IT teams set up cloud services with safer
security practices. Those mistakes emphasize the need for a
broader security expertise aiming at defining the security architecture to
be enforced on the overall system and at finding out whether the security
features of the cloud provider need to be complemented by additional protection
mechanisms. A first logical step consists of demarcating sensitive and
non-sensitive information, to help the IT team establish appropriate
priorities.
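As a sketch of that first step, the snippet below tags records as sensitive or non-sensitive so the IT team can decide what needs stronger protection before it reaches the cloud. The field names and classification rule are hypothetical.

```python
# Hypothetical sketch: demarcate sensitive vs. non-sensitive fields so that
# cloud configuration and extra protection can be prioritized accordingly.

SENSITIVE_FIELDS = {"ssn", "credit_card", "medical_record", "password"}

def classify_record(record):
    """Split one record into sensitive and non-sensitive portions."""
    sensitive = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
    public = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
    return sensitive, public

record = {"name": "device-42", "ssn": "123-45-6789", "firmware": "1.0.3"}
sensitive, public = classify_record(record)
print("needs extra protection:", list(sensitive))
print("standard handling:", list(public))
```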
How IoT Devices are Rapidly Revolutionizing the World of Small Businesses
Small business owners may want to take some time to look through a list of
the top IoT software rankings before they decide on a single platform. It
can be difficult to migrate to another one after your firm has become
heavily invested in a certain type of technology. This is especially true of
those who plan to primarily use consumer-grade equipment that often goes
through various revisions as market pressures force engineers to redesign
certain aspects of their builds. Keep in mind that all Internet of Things
devices include some sort of embedded general purpose computer. This means
that each piece of smart equipment is free to share information collected
from onboard peripherals. That makes it easy to learn more about how
different circumstances impact your business. Think of a hotel or restaurant
that has multiple rooms. Each of these has an adjustable thermostat. If
some of them are set too high or low, then the business in question may end
up losing thousands by using too much energy. A number of service providers
in the hospitality industry now use IoT software to monitor energy usage
throughout entire buildings.
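A minimal sketch of that kind of monitoring could flag thermostats whose set points fall outside an allowed band; the room names, readings, and threshold below are invented for illustration.

```python
# Hypothetical sketch: flag room thermostats set outside an allowed band,
# the kind of check a hospitality IoT energy dashboard automates.

ALLOWED_RANGE_C = (18.0, 24.0)   # assumed comfort/efficiency band in Celsius

readings = {                      # room -> reported thermostat set point
    "room-101": 21.5,
    "room-102": 28.0,
    "room-103": 16.0,
    "lobby": 22.0,
}

low, high = ALLOWED_RANGE_C
for room, set_point in readings.items():
    if not (low <= set_point <= high):
        print(f"{room}: set point {set_point}°C outside {low}-{high}°C, "
              "likely wasting energy")
```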
The Journey to Effective Data Management in HPC
High Performance Computing (HPC) continues to be a major resource investment
of research organizations worldwide. Large datasets are used and generated
by HPC, and these make data management a key component of effectively using
the expensive resources that underlie HPC infrastructure. Despite this
critical element of HPC, many organizations do not have a data management
plan in place. As an example of data generation rates, the total storage
footprint worldwide from DNA sequencing alone is estimated at over 2
Exabytes by 2025, most of which will be processed and stored in an HPC
environment. This growth rate causes an immense strain on life science
organizations. It is not only big data from life sciences that is
stressing HPC infrastructure; research institutions like Lawrence
Livermore National Labs (LLNL) also generate 30 TB of data a day. This data
serves to support their research and development efforts applied to national
security, and these daily data volumes can also be expected to increase. As
the HPC community continues to generate massive amounts of file data,
drawing insights, making that data useful, and protecting the data become a
considerable effort with major implications.
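To put the quoted rates in perspective, a quick back-of-the-envelope calculation, assuming steady output and decimal units, shows how fast daily volumes compound.

```python
# Back-of-the-envelope growth of HPC data volumes, assuming a steady 30 TB/day
# (the LLNL figure quoted above) and decimal units (1 PB = 1,000 TB).

DAILY_TB = 30

for years in (1, 3, 5):
    total_tb = DAILY_TB * 365 * years
    print(f"{years} year(s): {total_tb:,} TB  (~{total_tb / 1000:.1f} PB)")
# Roughly 11 PB after one year and about 55 PB after five, before any
# replication or backup copies are counted.
```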
Taking A Deep Look at DLT (Distributed Ledger Technology)
A great deal of effort and investment is continuously going into mitigating
blockchain’s scalability issues. One of the headline motivations for
this effort is to level up the user experience on blockchain networks to
accommodate a diverse range of concurrent activity without compromising any
of the blockchain elements. When this is achieved – blockchain architects
and companies will have a more comprehensive suite of blockchain tools to
meet new and growing needs in the market. For a long time blockchain has
been unfairly subjected to pessimistic scrutiny that undermines its value.
Unfair in a sense that blockchain is brilliant, revolutionary and still
young. But then again, nothing exists in a vacuum totally free from
pessimistic sentiments. Everything in existence has some criticism attached
to it. Even so – blockchain is resilient! It is here for good – and so is
DLT. If you look at DLT you will see that many DLT based start-ups offer
business-to-business solutions. Distributed ledgers are well positioned for
companies because they address multiple infrastructural issues that plague
their industries. One of them is databases. Given how disparate and complex
organizations have grown, legacy databases have fallen victim to
inefficiencies and security loopholes.
Adapting online security to the ways we work, remotely and post-coronavirus
Not only were many companies unprepared for the mass transition to remote
work, but they were also caught off guard by the added technology and
security needs. According to CNBC, 53 senior technology executives say their
firms have never stress-tested their systems for a crisis like this. For
example, when companies are working from the office, it is easier for IT
teams to identify threat actors attempting to break into systems, since
hackers’ locations are clearly removed from those offices. However, with
employees dispersed at their homes, these external breaches are harder to
recognize. Companies have also been caught flat-footed during this crisis
by relying on employees to use their personal devices instead of providing a
separate work device. This prevents IT teams from identifying suspicious
activity. To keep employee and company information secure, it is up to the
CISO and IT decision-makers to create and strictly enforce a regular
practice for accessing, editing and storing their data. Most employees value
productivity over security. This is problematic. Employees gravitate towards
tools and technology they prefer to get their work done effectively.
Is Your Approach to Data Protection More Expensive than Useful?
Now more than ever, data is the lifeblood of an organization – and any
incidence of data loss or application unavailability can take a significant
toll on that business. With the recent rise in cyberattacks and exponential
data growth, protecting data has become job #1 for many IT organizations.
Their biggest hurdle: managing aging infrastructure with limited resources.
Tight budgets should not discourage business leaders from modernizing data
protection. Organizations that hang onto older backup technology don’t have
the tools they need to face today’s threats. Rigid, siloed infrastructures
aren’t agile or scalable enough to keep up with fluctuations in data
requirements, and they are based on an equally rigid backup approach.
Traditional backup systems behave like insurance policies, locking data away
until you need it. That’s like having an extra car battery in the garage,
waiting for a possible crisis. The backup battery might seem like a reasonable
preventive measure, but most of the time it’s a waste of space, and if the
crisis never arises it’s an unnecessary upfront investment, more expensive
than useful. In the age of COVID-19 where cash is king and onsite resources
are particularly limited, some IT departments are postponing data protection
modernization, looking to simplify overall operations and lower infrastructure
cost first.