Safeguarding the use of complex algorithms and machine learning
The immediate fallout of algorithmic risks can include inappropriate and
potentially illegal decisions, and these risks can affect a range of functions,
such as finance, sales and marketing, operations, risk management, information
technology, and human resources. Algorithms operate at high speed in fully
automated environments, and they become increasingly volatile as they
interact with other algorithms or social media platforms. Therefore,
algorithmic risks can quickly get out of hand. Algorithmic risks can also
carry broader and long-term implications across a range of risks, including
reputation, financial, operational, regulatory, technology, and strategic
risks. Given the potential for such long-term negative implications, it’s
imperative that algorithmic risks be appropriately and effectively managed.
... A good starting point for implementing an algorithmic risk management
framework is to ask important questions about the preparedness of your
organization to manage algorithmic risks. For example: Does your organization
have a good handle on where algorithms are deployed? Have you evaluated
the potential impact should those algorithms function improperly? Does senior
management within your organization understand the need to manage algorithmic
risks?
How Decision Transformation is Essential to Digital Transformation
The human is better at telling the system that the customer isn't happy. And
the fact that the customer is unhappy is a crucial determinant in how the
decision should be made. So instead of ignoring it in the automation, throwing
up an answer, and then having the person go, "Well, that was a stupid answer,
because this customer is unhappy," go ahead and ask the person: is the
customer unhappy? And whether they say yes or no, use that as part of the
decision making. So we find that there's often a role for humans in decision
making, but it's often not this supervisory, "You make a suggestion, I'll
override it if I feel like it kind of thing." And so we find you have to
really understand the structure of your decision making before you can make
those judgments. So we encourage people, when we're working with them—Look,
let's understand the decision making first and let's understand all of it,
automated pieces and the manual pieces. And once we understand all of it, then
we can draw a suitable automation boundary to figure out which pieces to
digitize, which technologies to use, and make it an integrated whole.
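The point can be made concrete with a trivial sketch (the function, names, and rules here are hypothetical, purely for illustration): the human-supplied signal is an input to the decision, not a veto after it.

```python
# Minimal sketch of human-in-the-loop decision making: rather than letting
# a person override an automated answer after the fact, ask for the human
# signal up front and feed it into the decision logic itself.
# The refund rules below are invented for illustration.

def decide_refund(order_value: float, customer_unhappy: bool) -> str:
    """Combine an automated rule with a human-supplied signal."""
    # Automated part: small orders are refunded without question.
    if order_value < 20:
        return "refund"
    # Human-supplied part: the agent's judgment that the customer is
    # unhappy changes how the rule applies to larger orders.
    if customer_unhappy:
        return "refund"
    return "escalate"
```

Here the automation boundary is explicit: the rule engine owns the thresholds, while the human owns the one judgment ("is the customer unhappy?") that the machine is worse at making.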
3 Daunting Ways Artificial Intelligence Will Transform The World Of Work
Even in seemingly non-tech companies (if there is such a thing in the future),
the employee experience will change dramatically. For one thing, robots and
cobots will have an increasing presence in many workplaces, particularly in
manufacturing and warehousing environments. But even in office environments,
workers will have to get used to AI tools as “co-workers.” From how people are
recruited, to how they learn and develop in the job, to their everyday working
activities, AI technology and smart machines will play an increasingly
prominent role in the average person's working life. Just as we've all got
used to tools like email, we'll also get used to routinely using tools that
monitor workflows and processes and make intelligent suggestions about how
things could be done more efficiently. Tools will emerge to carry out more and
more repetitive admin tasks, such as arranging meetings and managing a diary.
And, very likely, new tools will monitor how employees are working and flag up
when someone is having trouble with a task or not following procedures
correctly. On top of this, workforces will become decentralized – which
means the workers of the future can choose to live anywhere, rather than going
where the work is.
Facebook open-sources one of Instagram's security tools
While most static analyzers look for a wide range of bugs, Pysa was
specifically developed to look for security-related issues. More particularly,
Pysa tracks "flows of data through a program." How data flows through a
program's code is very important. Most security exploits today take advantage
of unfiltered or uncontrolled data flows. For example, a remote code execution
(RCE) vulnerability, one of today's worst classes of bugs, is, when stripped
down, basically user input that reaches unwanted portions of a codebase. Under
the hood, Pysa
aims to bring some insight into how data travels across codebases, and
especially large codebases made up of hundreds of thousands or millions of
lines of code. This concept isn't new and is something that Facebook has
already perfected with Zoncolan, a static analyzer that Facebook released in
August 2019 for Hack -- the PHP-like language variation that Facebook uses for
the main Facebook app's codebase. Both Pysa and Zoncolan look for "sources"
(where data enters a codebase) and "sinks" (where data ends up). Both tools
track how data moves across a codebase, and find dangerous "sinks," such as
functions that can execute code or retrieve sensitive user data.
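The "sources and sinks" idea can be illustrated with a toy taint tracker. This is a deliberately simplified sketch of the concept, not Pysa's actual algorithm, and the source and sink names are assumptions chosen for illustration:

```python
# Toy illustration of source-to-sink taint tracking, the core idea behind
# Pysa and Zoncolan. Each statement is modeled as (target, inputs), and
# taint propagates from "sources" through assignments until it hits a
# "sink" such as a code-executing function.

SOURCES = {"request.GET"}           # where untrusted data enters the codebase
SINKS = {"eval", "subprocess.run"}  # where tainted data must never arrive

def find_tainted_flows(statements):
    """statements: ordered list of (target, inputs) pairs."""
    tainted = set(SOURCES)
    flows = []
    for target, inputs in statements:
        if any(i in tainted for i in inputs):
            if target in SINKS:
                # Unfiltered flow from a source into a dangerous sink.
                flows.append((sorted(set(inputs) & tainted), target))
            else:
                # Taint propagates through ordinary assignments.
                tainted.add(target)
    return flows

# The classic RCE shape: user input flows, unfiltered, into code execution.
program = [
    ("user_input", ["request.GET"]),  # source is read
    ("command", ["user_input"]),      # taint propagates
    ("subprocess.run", ["command"]),  # tainted data reaches a sink
]
```

Running `find_tainted_flows(program)` reports the flow into `subprocess.run`; real analyzers additionally model sanitizers, which cut a flow when data is validated on the way from source to sink.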
Google’s New TF-Coder Tool Claims To Achieve Superhuman Performance
TF-Coder uses two ML models in order to predict the needed operations from
features of the input/output tensors and a natural language description of the
task. These predictions are then combined within a general framework to modify
the weights to customise the search process for the given task. The
researchers introduced three key ideas in the synthesis algorithm. Firstly,
they introduced per-operation weights to the prior algorithm, allowing
TF-Coder to enumerate over TensorFlow expressions in order of increasing
complexity. Secondly, they introduced a novel, flexible, and efficient type-
and value-based filtering system that handles arbitrary constraints imposed by
the TensorFlow library, such as “the two tensor arguments must have
broadcastable shapes.” Finally, they developed a framework to combine
predictions from multiple independent machine learning models that choose
operations to prioritise during the search, conditioned on features of the
input and output tensors and a natural language description of the task. The
researchers evaluated TF-Coder on 70 real-world tensor transformation tasks
from StackOverflow and from an industrial setting.
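The first of those ideas, enumerating candidate programs in order of increasing weight with value-based pruning, can be sketched in miniature. This is a simplified illustration of the search strategy, not TF-Coder's implementation: the operations, weights, and filter below are invented, and it works on plain integers rather than tensors.

```python
# Toy weighted enumerative synthesis in the spirit of TF-Coder. Candidate
# programs (sequences of operations) are explored in order of increasing
# total weight, and a value-based filter prunes implausible intermediate
# results. All operations and weights are made up for illustration.

import heapq

OPS = {  # name: (function, weight); lower weight = tried earlier
    "double": (lambda x: x * 2, 1),
    "square": (lambda x: x * x, 2),
    "negate": (lambda x: -x, 1),
}

def synthesize(inp, out, max_weight=6):
    """Return a list of op names transforming inp into out, or None."""
    heap = [(0, inp, [])]  # (total weight, current value, program so far)
    seen = set()
    while heap:
        weight, value, prog = heapq.heappop(heap)
        if value == out:
            return prog
        if weight >= max_weight or value in seen:
            continue
        seen.add(value)  # cheaper program producing this value already found
        for name, (fn, w) in OPS.items():
            result = fn(value)
            # Value-based filter: prune results outside a plausible range,
            # standing in for TF-Coder's type- and value-based constraints.
            if abs(result) > 10**6:
                continue
            heapq.heappush(heap, (weight + w, result, prog + [name]))
    return None
```

For example, `synthesize(3, 36)` finds `double` then `square` (total weight 3) before the heavier `square`-first alternatives. TF-Coder's learned models play the role of the hand-set weights here, reranking which operations to try first for a given task.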
Microservice Architecture in ASP.NET Core with API Gateway
A traditional approach would be to build a single solution in Visual Studio
and then separate the concerns via layers. Thus you would probably have
projects like eCommerce.Core, eCommerce.DataAccess and so on. Now these
separations exist only at the level of code organization and are efficient
only while developing. When you are done with the application, you will have
to publish it to a single server, where you can no longer see the separation
in the production environment, right? Now, this is still a cool way to build
applications. But let's take a practical scenario. Our eCommerce API has,
let's say, endpoints for customer management and product management, pretty
common, yeah? Now down the road, there is a small fix / enhancement in the
code related to the Customer endpoint. If you had built using the Monolith
Architecture, you would have to re-deploy the entire application and go
through several tests that guarantee the new fix / enhancement did not
break anything else. A DevOps engineer would truly understand this pain. But
if you had followed a Microservice Architecture, you would have made separate
components for Customers, Products and so on.
Granularity Decision of Microservice Splitting in View of Maintainability ...
In practical service applications, challenges come from both the service and
the technique. This section gives a detailed summary of the features of the
four architectures and analyzes their key distinctions (Table 1). In terms of
hierarchy, monolithic and vertical architectures centralize the functional
modules of each hierarchy with a high degree of coupling. SOA decouples
multiple functional modules across vertical and horizontal hierarchies of
three or more tiers, but public modules can only be shared on horizontal
hierarchies, leading to incomplete decoupling. The fully self-service
flexibility achieved by simultaneous decoupling on both vertical and
horizontal hierarchies is the main characteristic of microservice
architecture. However, when putting large projects into practice, development
teams cannot comply with all of these features: they must consider the
integration of irreplaceable systems and pursue the flexibility of full
decoupling within an acceptable rate of change. The core role of microservice
architecture is to cope with the growing service capability within the system
and the increasingly complex interaction demands between systems.
Global Cybercrime Surging During Pandemic
The stress and uncertainty caused by the COVID-19 crisis are creating the ideal
environment for cybercriminals looking to cash in or create chaos. "Given the
impact and scale of COVID-19, cyberattacks related to organizations involved
in COVID-19 research or those firms providing relief services have continued
to evolve, morph and expand," says Stanley Mierzwa, director of the Center for
Cybersecurity at Kean University in Union, New Jersey. "Threat actors will
continue to look for areas of vulnerability, and this could potentially reside
in 'local' or 'satellite' offices of larger global for-profit, non-profit and
non-governmental organizations that may not be utilizing centrally managed or
administered systems," Mierzwa says. Craig Jones, who leads the global
cybercrime program for Interpol, said in a recent interview with Information
Security Media Group: "Certainly in relation to the COVID-19 pandemic, we're
seeing a unique combination of events that have led to a whole range of
specific criminal opportunities." Criminals haven't shied away from attempting
to seize those opportunities, as demonstrated by their rush to rebrand attacks
and even "fake news" campaigns to give them a COVID-19 theme, as well as
unleash scams involving personal protective equipment, he told ISMG.
Fixing the Biggest IoT Issue — Data Security
By removing the latency and bandwidth scaling issues of cloud-based
intelligence, the AIoT paves the way for a far more natural human–machine
interaction. In the smart home, for example, it brings a whole new
dimension to home control. By coupling voice with human sensing technology,
such as presence detection and biometrics, we can build a multi-modal
interaction that delivers an energy efficient and seamless, personalised
experience. The TV will know when you’re in the room and ‘wake’ to a standby
mode; it will know who you are and, on hearing the wake word, greet you with
familiarity and deliver your preferred settings. This kind of interaction also
has clear applications across smart cities. Multi-modal sensing opens the path
for significant steps forward in safety, security and energy efficiency. Let’s
take the humble streetlight: the inclusion of human presence detection would
enable it to light up only when a pedestrian or cyclist is in the vicinity.
Add in voice control and the lamppost can detect a cry for help — or even the
sound of glass breaking, triggering a call to the emergency services for
assistance. In offices and public buildings, we won’t need to push buttons on
elevators or hunt in our bags for a lift pass, instead our biometrics will
form our signature for access, enabling a secure and convenient experience.
Exploring the Forgotten Roots of 'Cyber'
"What does 'cyber' even mean? And where does it come from?" writes Thomas Rid
in "Rise of the Machines," his book-length quest to unravel cyber's origin
story. Everyone from military officers and spies, to bankers, hackers and
scholars "all slapped the prefix 'cyber' in front of something else ... to
make it sound more techy, more edgy, more timely, more compelling - and
sometimes more ironic," writes Rid, who's a professor of political science at
Johns Hopkins University. Cyber has cachet. Cyber inevitably seems to always
be pointing to the future. But as Rid writes in his book, "the future of
machines has a past," and cyber has long stood not just for a future, utopian
merging of humans and machines, but a potential dystopia as well. On the good
side exists the potential offered by cyborg-like technologies that might one
day, for example, enable humans with spinal injuries to walk again. Such
technology may even facilitate the human colonization of Mars. For a view of
the flip side, however, take "The Matrix's" rendering of a postapocalyptic
hellhole in which humans have been made to unthinkingly serve machines.