19 Realistic Habits To Improve Software Development
When you finish writing a fragment of code and see that it works, take some time
to reread it and see if you can improve it. Think that you are going to show it
to someone else who is going to evaluate your code. Would you leave it the same?
One of the best code refactoring techniques is the red/green process used in
Agile test-driven development. To use this technique, your code must be covered
with tests. If a refactoring breaks something, a test will fail, and
you will know that something is wrong with your refactor. ... Plan a
time interval without distractions or interruptions. Interruptions will make
your mind lose track of what it is developing, and you will have to start again
when you resume the activity, costing you extra work time and making you
more prone to mistakes. It helps to leave only the IDE open and a browser
with a maximum of two tabs. ... Don’t try to write clever code that only
you understand. Write code that someone else can read and understand. It doesn’t
matter if your code has a few more lines if they're necessary to make it
easier to understand. Remember that in a few months, you or someone else on your
team may have to modify the code, and if it is not easy to understand, it will
not be easy to modify.
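The red/green safety net described above can be sketched with Python's `unittest`; the `total_price` functions and the values here are illustrative, not taken from the article:

```python
import unittest

def total_price(items):
    """Original, working implementation: sum of price * quantity."""
    total = 0
    for item in items:
        total += item["price"] * item["qty"]
    return total

def total_price_refactored(items):
    """Refactored version; the tests below must stay green."""
    return sum(item["price"] * item["qty"] for item in items)

class TotalPriceTest(unittest.TestCase):
    """Red/green safety net: if a refactor changes behavior, a test fails."""

    def test_total(self):
        items = [{"price": 2.0, "qty": 3}, {"price": 1.5, "qty": 2}]
        self.assertEqual(total_price(items), 9.0)
        self.assertEqual(total_price_refactored(items), 9.0)

    def test_empty_order(self):
        self.assertEqual(total_price_refactored([]), 0)
```

Run with `python -m unittest <file>`; if the refactored version diverges from the original behavior, the suite goes red before the change ships.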
Clear & Present Danger: Data Hoarding Undermines Better Security
Even though there is overlap between the users of big companies' services and
the customers of small businesses, the big companies aren't sharing their data.
As a result, customers who use smaller businesses are left to fend for
themselves. A few companies are trying to change that.
Deduce (disclosure, another company I've
consulted for) created a data collective through which companies can share
information about users' security-related behavior and logins. In exchange for
sharing data with the platform, companies get access to Deduce's repository of
identity data from over 150,000 websites. They can use this shared data to
better detect suspicious activity and alert their users, just like Microsoft and
Google do using their own data. In a different approach to helping businesses
identify suspicious users, LexisNexis created unique identifiers for their
clients' customers. Using these identifiers, their clients can share trust
scores that indicate if a particular user is suspicious. If a suspicious user
attempts to log in to a website, the site can block that user to keep itself
and its legitimate users safer.
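The trust-score gating described here can be sketched as follows; the identifiers, scores, and threshold are invented for illustration and do not reflect LexisNexis's or Deduce's actual APIs:

```python
# Hypothetical sketch: gate logins on a trust score shared across clients.
SHARED_TRUST_SCORES = {
    # unique identifier -> trust score in [0.0, 1.0]
    "user-7f3a": 0.92,
    "user-19bc": 0.12,  # flagged as suspicious by another client
}

TRUST_THRESHOLD = 0.5

def allow_login(user_id):
    """Block logins whose shared trust score falls below the threshold.
    Users with no shared history default to a neutral score."""
    score = SHARED_TRUST_SCORES.get(user_id, 0.5)
    return score >= TRUST_THRESHOLD
```

A site consulting this shared repository would admit `user-7f3a`, reject `user-19bc`, and treat unknown users neutrally.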
Optimizing the CIO and CFO Relationship
“CIOs are more likely to be pioneers and/or integrators, while CFOs are more
likely to be guardians and drivers,” according to consultancy Deloitte in a
description of different corporate personality types. “Pioneers are
novelty-seeking, they like having a variety of possibilities, generating new
ideas….On the other hand, the guardian personality values structure and loyalty,
are much more methodical, detail-oriented, and perhaps a little more
risk-averse.” ... “CFOs understand that they have to change and expand their
skills,” said Mastanuono. “The modern CFO understands technology and how it can
transform the business. He or she also needs to understand the future of what
finance will look like, and be a transformer of people, processes, and systems.
The CFO must move from being a reactive to a proactive collaborator so the end
business can be positioned to have the right systems and data at the right time.
Breaking down silos and developing empathy and cross-functional collaboration
are requirements, and the CFO-CIO relationship is a critical piece.” ... If CFOs
and CIOs can develop a common approach to IT investments that looks at strategic
risks as well as benefits, it creates common ground for project discussions and
evaluations.
How to address post-pandemic infrastructure pain points
Managing workforce transformation is already challenging enough for employees
who need to access on-premises resources. It becomes even more difficult if
these employees work in regulated sectors, as medical and financial
organizations need to track their employees’ identities, access requests, and
usage to an even greater degree. Moreover, because there’s no one set of
global standards, IT teams will need to account for many different compliance
frameworks that vary based on where an employee is sitting, what information
they’re accessing, and what sector they’re working in. On top of that, as
businesses build new infrastructures that can accommodate and monitor
permanently remote workers, they must be mindful of how certain regulations
affect what personally identifiable information they can record about their
own employees. GDPR, CCPA, and other privacy laws predate the pandemic, but
like workforce transformation, they’ve become even starker and more
commonplace challenges now. Different jurisdictions will have different
mandates, and your IT teams will need to account for them all.
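The jurisdiction-and-sector lookup that IT teams face can be sketched as a simple mapping; the framework lists below are simplified examples for illustration, not legal guidance:

```python
# Illustrative sketch: resolve which compliance frameworks apply to an
# access request, based on where the employee sits and the sector they
# work in. Real mappings are far larger and change over time.

PRIVACY_BY_LOCATION = {
    "EU": ["GDPR"],
    "California": ["CCPA"],
}

RULES_BY_SECTOR = {
    "medical": ["HIPAA"],
    "financial": ["SOX"],
}

def applicable_frameworks(location, sector):
    """Union of location-based privacy laws and sector-based rules."""
    return sorted(set(PRIVACY_BY_LOCATION.get(location, []))
                  | set(RULES_BY_SECTOR.get(sector, [])))
```

A remote medical worker sitting in the EU would fall under both GDPR and HIPAA in this toy model, which is exactly the overlapping-mandate problem the excerpt describes.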
12 steps towards a secure project management framework
Cyber security is a tech-heavy domain, and project/program management is
essential to deliver successful projects. However, cyber security requires a
few tweaks in regular management practices as it comes with a different set of
requirements. Cyber security management is complex in nature and entails
systematic processes. It deals with all aspects of a
company’s operations, from mapping and recruiting skilled security
professionals to vendor risk management. It involves protecting and securing
computer systems, networks, and data from theft or damage, thereby ensuring
business continuity. A project manager usually has to oversee many one-time
and recurring cyber security tasks while handling usual responsibilities and
priorities. A good project management framework will ensure that projects are
delivered smoothly, without exceeding budgets, and are carried out in the
timeframe decided. For any project management program to be successful, it’s
important to define roles and responsibilities, a detailed plan of action, and
milestones to be achieved. While most of the standard project management
practices hold good in cyber security programs, there are a few cyber
security-specific aspects that need to be taken care of with absolute
diligence and strict adherence.
Information Relativity
Relativity was introduced at the beginning of the last century when Einstein
proved that reality is fundamentally different depending on your frame of
reference, a distortion of the spacetime continuum. The concept has led to the
discovery of black holes, gravitational lenses, time dilation, and all kinds
of other fantastic things. Relativity is not at all what one would expect
based on our regular day-to-day lives that operate according to classic laws
of physics. It changes what it means to observe and to be an observer: how we
experience the world itself differs, not just how we interpret it.
There are circumstances where the world I experience is inconsistent with
yours. It turns out that communication behaves in the same peculiar way.
Information is distorted depending on the
location of the observer. Mark Burgess calls this “information relativity”:
messages can take multiple paths and interfere with one another, a message's
contents can arrive reordered along a single path, and the speed of
communication on one path can differ from the speed on another.
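Burgess's point about path-dependent ordering can be illustrated with a toy simulation; the message timings and per-path latencies are invented for the example:

```python
# Toy simulation of "information relativity": the same messages sent over
# two paths with different latencies arrive in different orders, so an
# observer at the end of each path sees a different history of events.

def arrival_order(messages, latency):
    """messages: list of (send_time, msg); latency: per-message delay.
    Returns the messages sorted by arrival time."""
    arrivals = [(send + latency(i), msg)
                for i, (send, msg) in enumerate(messages)]
    return [msg for _, msg in sorted(arrivals)]

messages = [(0, "A"), (1, "B"), (2, "C")]

# Path 1: constant latency -> the send order is preserved.
fast_path = arrival_order(messages, lambda i: 1)   # ['A', 'B', 'C']
# Path 2: earlier messages take longer -> the observer sees them reversed.
slow_path = arrival_order(messages, lambda i: 5 - 2 * i)  # ['C', 'B', 'A']
```

Two observers of the same three events thus disagree about their order, which is the communication analogue of frame-dependent observation.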
The Role of EiPaaS in Enterprise Architecture: Part 1
When discussing enterprise architecture, a diagram of the IT landscape comes
to mind because that is the standard approach to defining an architecture.
However, during our work with a number of enterprise architecture teams
worldwide, we discovered that enterprise architecture has a larger strategic
scope than what typical IT diagrams capture. Fundamentally, enterprise
architecture converts business strategy into a value generation outcome by
creating a foundation to execute various IT initiatives and processes. It is
about gaining a long-term view for the organization, including the
integration and standardization of various elements involved in the
business. ... At the initial stages, an enterprise architecture will define
the systems and subsystems required for each organization’s function. It
starts with purchasing core systems, such as human resource management
(HRM), customer relationship management (CRM) and/or enterprise resource
planning (ERP) based on the business domain of the organization. In
addition, subsystems will be built around the core systems by in-house or
outsourced development teams. Systems and subsystems that belong to each
function operate independently with limited or no information exchange.
Nvidia announces Morpheus, an AI-powered app framework for cybersecurity
Morpheus essentially enables compute nodes in networks to serve as
cyberdefense sensors — Nvidia says its newly announced BlueField-3 data
processing units can be specifically configured for this purpose. With
Morpheus, organizations can analyze packets without information replication,
leveraging real-time telemetry and policy enforcement, as well as data
processing at the edge. Thanks to AI, Morpheus can ostensibly analyze more
security data than conventional cybersecurity app frameworks without
sacrificing cost or performance. Developers can create their own Morpheus
skills using deep learning models, and Nvidia says “leading” hardware,
software, and cybersecurity solutions providers are working to optimize and
integrate datacenter security offerings with Morpheus, including Aria
Cybersecurity Solutions, Cloudflare, F5, Fortinet, Guardicore, Canonical, Red
Hat, and VMware. Morpheus is also optimized to run on a number of
Nvidia-certified systems from Atos, Dell, Gigabyte, H3C, HPE, Inspur,
Lenovo, QCT, and Supermicro. Businesses are increasingly placing their faith
in defensive AI like Morpheus to combat the growing number of
cyberthreats.
Automation will accelerate decentralization and digital transformation
As the vaccinated population grows, doors reopen, and more people come
together again, the reality we find ourselves in will not be the one left
behind in 2019. Many long for a return to in-person experiences, but at the
same time, have grown accustomed to the flexibilities of a decentralized,
digital-first world. As we emerge from lockdown, hitting "rewind" will not
satisfy customer and employee needs. Instead, companies must create hybrid
experiences that integrate both digital and in-person modalities. In
addition, the growing expectations of stakeholders have created unprecedented
demand for IT innovation and a greater sense of urgency in the post-pandemic
world. Even as more offline activities resume, 2020's rapid digitalization
will have a large and lasting impact on both customer and employee
experiences. For example, analysis of global research from Salesforce shows
customers anticipate engaging online with companies just as much in 2021 as
they did in 2020. That customers expect to maintain this substantial
departure from their 2019 patterns suggests that the swing to digital at the
height of the pandemic wasn't purely due to unavailability of in-person
channels.
How data poisoning attacks corrupt machine learning models
The main problem with data poisoning is that it's not easy to fix. Models
are retrained with newly collected data at certain intervals, depending on
their intended use and their owner's preference. Since poisoning usually
happens over time, and over some number of training cycles, it can be hard
to tell when prediction accuracy starts to shift. Reverting the poisoning
effects would require a time-consuming historical analysis of inputs for the
affected class to identify all the bad data samples and remove them. Then a
version of the model from before the attack started would need to be
retrained. When dealing with large quantities of data and a large number of
attacks, however, retraining in such a way is simply not feasible and the
models never get fixed, according to F-Secure's Patel. "There's this whole
notion in academia right now that I think is really cool and not yet
practical, but we'll get there, that's called machine unlearning," Hyrum
Anderson, principal architect for Trustworthy Machine Learning at Microsoft,
tells CSO. "For GPT-3 [a language prediction model developed by OpenAI], the
cost was $16 million or something to train the model once."
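The remediation loop described in the excerpt (audit the affected class's historical inputs, drop the poisoned samples, retrain from a pre-attack model) can be sketched as follows; the flagging rule and model interface are placeholders, not F-Secure's or Microsoft's actual tooling:

```python
# Hedged sketch of data-poisoning remediation. The poison detector and
# the retraining step are stand-ins for a real historical analysis and
# a real training pipeline.

def clean_training_history(history, affected_class, is_poisoned):
    """Remove flagged samples of the affected class; keep everything else.
    history: list of (features, label) pairs collected over time."""
    return [(x, y) for x, y in history
            if not (y == affected_class and is_poisoned(x))]

def retrain(pre_attack_model, cleaned_history):
    """Placeholder: real retraining would start from the pre-attack
    checkpoint and fit it on the cleaned history."""
    return {"base": pre_attack_model, "data": cleaned_history}

history = [([0.1], "spam"), ([9.9], "spam"), ([0.2], "ham")]
# Illustrative flag: poisoned "spam" samples have implausibly large features.
cleaned = clean_training_history(history, "spam", lambda x: x[0] > 5)
model = retrain("model-v1", cleaned)
```

The expensive part is exactly what the excerpt highlights: for a model the size of GPT-3, the final `retrain` step is what makes this loop impractical at scale.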
Quote for the day:
"It's not about how smart you
are--it's about capturing minds." -- Richie Norton