IoT security: Development and defense
While IoT adoption continues to grow, the standards, compliance requirements and
secure coding practices surrounding IoT have not advanced at the same rate.
Recent high-profile software supply chain attacks have brought the issue of
secure coding into sharp focus, prompting the Biden administration to issue an
executive order requiring federal agencies to purchase and deploy only secure
software. This pivotal shift will have an immediate
impact on global software development processes and lifecycles, especially when
you consider the vast reach of U.S. federal procurement. Virtually all device
manufacturers and software companies will be impacted directly as the
administration begins to increase obligations on the private sector and
establish new security standards across the industry. Specific to IoT, the order
directs the federal government to initiate pilot programs to educate the public
on the security capabilities of IoT devices, and to identify IoT cybersecurity
criteria and secure software development practices for a consumer-labeling
program.
Efficient unit-testing with a containerised database
The real problem is mixing two languages in one body of code. The dbUtil handle
is just a boilerplate reduction device here. The raw SQL is still there. We
still can’t test the complex individual statements separate from the simple yet
crucial control logic captured in the if-statements, which depend solely on the
state of the person object, not on the database. Sure, we can test this control
logic fine if we mock out the calls to the database. The mock for dbUtil returns
a prepared list of person objects, and we can verify the correct invocation of
it for the two different conditions. That unavoidably leaves the SQL untested.
If we want to test the execution of these statements, we need to run the entire
code inside the for loop, this time using a real database. That test needs to
set up the conditions for all three execution paths (condition 1, 1 and 2, or
none), as well as verify what happened to the state after the void statements
were executed. It can be done, but we are of necessity testing both the
Java and SQL realms here. That’s hardly the lean unit testing we’re looking for.
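To make the distinction concrete, here is a minimal sketch of exercising the SQL
on its own against a real, containerised database. It assumes a hypothetical
person table, JUnit 5 and the Testcontainers PostgreSQL module; the dbUtil
handle and the surrounding control logic stay out of the picture:

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertEquals;

class PersonSqlTest {

    @Test
    void insertStatementWorksAgainstRealDatabase() throws Exception {
        // Spin up a throwaway PostgreSQL instance that lives only for this test.
        try (PostgreSQLContainer<?> db = new PostgreSQLContainer<>("postgres:15-alpine")) {
            db.start();
            try (Connection conn = DriverManager.getConnection(
                     db.getJdbcUrl(), db.getUsername(), db.getPassword());
                 Statement stmt = conn.createStatement()) {
                // Create the schema the statements under test expect (hypothetical).
                stmt.execute("CREATE TABLE person (id INT PRIMARY KEY, name VARCHAR(100))");
                // Exercise the raw SQL in isolation from the Java control logic.
                stmt.execute("INSERT INTO person (id, name) VALUES (1, 'Alice')");
                try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM person")) {
                    rs.next();
                    assertEquals(1, rs.getInt(1));
                }
            }
        }
    }
}

The if-statement control logic can then be covered separately with a mocked
dbUtil that returns a prepared list of person objects, which is exactly the
split described above.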
Ansible vs Docker: A Detailed Comparison Of DevOps Tools
Ansible is an open-source automation engine for DevOps that helps improve the
scalability, consistency, and reliability of your technology environment. It is
mainly used for rigorous IT tasks such as configuration management, application
deployment, intraservice orchestration, and provisioning. In recent times,
Ansible has become the top choice for software automation in many
organizations. Automation is one of the most crucial aspects of industry these
days. Unfortunately, many IT environments are too complex, and often need to
scale too quickly, for system administrators and developers to keep up
manually. ... Docker is
an open-source platform for developing, shipping, and running applications. It
enables developers to package applications into containers: standardized,
executable components that combine the application source code with the
operating system libraries and dependencies required to run that code in any
environment. Containers can even be created without Docker, but the platform
and user interface make it easier, simpler, and safer to build, deploy, and
manage containers.
Delegation and Scale: How Remote Work Affected Various Industries
The basic goal of delegation of authority is to enable efficient organization.
Just as no single individual in a company can do all of the tasks required to
achieve a group's goals, it becomes arduous for management to wield all
decision-making authority as a business expands. This is because there is a
limit to the number of people a manager can successfully monitor and make
decisions for. When this threshold is reached, authority must be handed to
subordinates. While centralization was still a possibility before the
pandemic, this was no longer the case after back-to-back lockdowns and
economic slowdowns. In such a situation, delegation came as a boon that not
only kept workflows active but also helped scale growth. ...
Delegating gives your team greater confidence, makes them feel important, and
allows them to demonstrate their abilities. This results in mutual
appreciation, with colleagues motivating one another to work harder and stay
devoted to attaining their goals.
Seeking a Competitive Edge vs. Chasing Savings in the Cloud
If companies do not make changes to their IT operations in response to a
migration, finding savings can be more difficult, L’Horset says. “In the
industry, there’s a lot of debate: Is cloud saving you money or not? Our
research indicates that even at the basic level, yes it does,” he says. “The
difference between the cost-savings, which you can get through cloud, and the
value of innovation that you absolutely can and should get through cloud, is
the fundamental reason you should go.” Roy Illsley, chief analyst with Omdia,
the research arm of Informa Tech, says the cost benefits of cloud can be
positive if the workload is variable in its resource requirements, its
resource requirements match the cloud provider’s packaging of resources, or it
requires high availability. “If the workload is stable in its resource
requirements, then on-premises is more cost-effective,” he says. Companies
responding to the Accenture survey that did not list cloud as a top priority
still saw significant cost-savings, says Jim Wilson, managing director of
information technology and business research at Accenture Research.
7 Ways AI and ML Are Helping and Hurting Cybersecurity
AI/ML is used in network traffic analysis, intrusion detection systems,
intrusion prevention systems, secure access service edge, user and entity
behavior analytics, and most technology domains described in Gartner's Impact
Radar for Security. In fact, it's hard to imagine a modern security tool
without some kind of AI/ML magic in it. ... Through social engineering and
other techniques, ML is used for better victim profiling, and cybercriminals
leverage this information to accelerate attacks. For example, in 2018,
WordPress websites experienced massive ML-based botnet infections that granted
hackers access to users' personal information; ... Ransomware is experiencing
an unfortunate renaissance. Examples of criminal success stories are numerous;
one of the nastiest incidents led to Colonial Pipeline's six-day shutdown and
$4.4 million ransom payment; ... ML algorithms can create fake messages
that look like real ones and aim to steal user credentials. In a Black Hat
presentation, John Seymour and Philip Tully detailed how an ML algorithm
produced viral tweets with fake phishing links that were four times more
effective than human-created phishing messages.
Electronic signatures: please sign on the digital line
First, let’s look at the importance of content to a business. In simple terms,
content is the inherent value of a company. It’s NASA’s designs for their new
space station, AstraZeneca’s highly regulated pharmaceutical patents, and
Oxfam’s humanitarian aid records. It’s the clinical trial results for the next
breakthrough vaccine, or the blueprint for an innovative new approach to
tackling flooding. Content is the entire work of an organisation, and it is
unique to every company. Content is the database of its most
valuable insights. But to effectively realise this value, organisations need
to find a single place for their content. Separating content between different
silos and applications creates friction, which can stand in the way of
employees accessing and sharing information, inhibiting innovation and
productivity. Applications in today’s content-driven world are often judged by
their ease of integration with other technologies. As a result, businesses are
turning to single platforms where content can be securely stored and managed,
while all compliance requirements are met and all teams have the opportunity
to collaborate on the content, both internally and externally.
Protect your smartphone from radio-based attacks
An IMSI catcher is equipment designed to mimic a real cell tower so that a
targeted smartphone will connect to it instead of the real cell network.
Various techniques may be employed to do it, such as masquerading as a
neighboring cell tower or jamming the competing 5G/4G/3G frequencies with
white noise. After capturing the targeted smartphone’s IMSI (the ID number
linked to its SIM card), the IMSI catcher situates itself between the phone
and its cellular network. From there, the IMSI catcher can be used to track
the user’s location, extract certain types of data from the phone, and in some
cases even deliver spyware to the device. Unfortunately, there’s no surefire
way for the average smartphone user to know that they’re connected to a
fake cell tower, though there may be some clues: perhaps a noticeably slower
connection or a change in band in the phone’s status bar (from LTE to 2G, for
example). Thankfully, 5G in standalone mode promises to make IMSI catchers
obsolete, since the Subscription Permanent Identifier (SUPI) – 5G’s IMSI
equivalent – is never disclosed in the handshake between smartphone and cell
tower.
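As a rough illustration of why that matters, here is a conceptual sketch of
identifier concealment. It is not the actual 5G scheme (real networks derive a
SUCI from the SUPI using ECIES with the home network’s public key, provisioned
in the SIM); RSA-OAEP and the sample value stand in purely to show that a fake
tower only ever sees an opaque blob:

import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Base64;
import javax.crypto.Cipher;

public class SupiConcealmentSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for the home network's key pair (in 5G, the public key is
        // provisioned in the SIM profile).
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair homeNetworkKeys = kpg.generateKeyPair();

        // The permanent identifier the phone must never send in the clear
        // (hypothetical value).
        String supi = "imsi-001010123456789";

        // The phone encrypts the identifier before the handshake, so an IMSI
        // catcher sitting in the middle only ever sees this opaque ciphertext.
        Cipher enc = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        enc.init(Cipher.ENCRYPT_MODE, homeNetworkKeys.getPublic());
        byte[] concealed = enc.doFinal(supi.getBytes(StandardCharsets.UTF_8));
        System.out.println("Sent over the air: "
                + Base64.getEncoder().encodeToString(concealed));

        // Only the home network, holding the private key, can recover the SUPI.
        Cipher dec = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        dec.init(Cipher.DECRYPT_MODE, homeNetworkKeys.getPrivate());
        System.out.println("Home network recovers: "
                + new String(dec.doFinal(concealed), StandardCharsets.UTF_8));
    }
}

Because only the home network holds the matching private key, an IMSI catcher
that intercepts the handshake learns nothing that permanently identifies the
subscriber.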
The value of data — a new structural challenge for data scientists
Some companies with data scientists in place have difficulty operationalising
their skills. Depending on the volumes of data an organisation processes, and
on the different structures and architectures involved, it is not always
imperative to have a data scientist in its ranks of data experts. For companies
managing an astronomical amount of data, across multiple channels and with a
complex structure, the expertise of a data scientist will prove beneficial in
modelling data, querying it and making predictions. One of the first questions
to ask therefore relates to data and business needs, and to how the data
function should be organised according to the organisation’s structure and its
data strategy. Companies have also realised that having a data scientist is not
the answer to their data value problems. This is partly due to a lack of
understanding of the environment surrounding data. A data scientist may
understand the data, but
not its purposes and environments or business applications. Let’s take the
example of a marketing department working on implementing AI to accelerate its
web ROI.
Interview With Prof B Ravindran, Head, Robert Bosch Centre For Data Science & AI
Interpretability of deep learning models is essential for widespread adoption
of these techniques in the medical image diagnosis community. Deep learning
models have been phenomenally successful at beating the state of the art in common
medical image diagnosis tasks like segmentation and screening applications,
e.g. classification of diabetic retinopathy and chest X-ray scans, among
others. While these successes have created huge interest in adopting these
techniques in clinical practice, a huge barrier in adoption is the lack of
interpretability of these models. Convolutional Neural Networks with hundreds
of layers are the workhorse of medical image diagnosis. While the initial
layers are typically edge detectors and shape detectors, it is nearly
impossible to explain or interpret the feature maps as one goes deeper into
the network. In order for clinicians to trust the output from these networks,
it is essential that a mechanism for explaining the output be present. In
addition, black-box techniques will make it hard for clinicians to justify the
diagnosis and follow-up procedures.
Quote for the day:
"Honor bespeaks worth. Confidence
begets trust. Service brings satisfaction. Cooperation proves the quality of
leadership." -- James Cash Penney