The Chaos Mindset: Teaching Your Code to Cope
Like Agile, chaos engineering is more than a set of activities and
workflows—it’s also a state of mind. Your people and your culture must be ready
and able to adopt chaos principles, as well as chaos processes. For the DevOps
leader, adopting a new mindset might sound a little, well, vague. But this shift
is based on concrete actions, not just philosophical musings. Consider an
example from the world of cloud infrastructure: a mission-critical application
that is hosted within a cloud service could be at risk for failure if, say, that
cloud service is centralized in a single location, or within a limited number of
microservices within the cloud infrastructure. But if the app is hosted in a
distributed way, you can create greater opportunity for application-level
availability and resilience, and you can test for that resilience within the
existing production environment. This kind of distributed architecture isn’t
brand-new for most enterprises, so the process of developing
applications in a way that tests for availability in a variety of infrastructure
scenarios also shouldn’t be a foreign concept. As a DevOps leader, you can build
a culture of resilience-centric thinking by empowering your teams with the tools
they need to adopt chaos-style testing, and then showing them how to build that
thinking into every sprint and every standup.
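That testing habit can start small. The sketch below is a minimal, hypothetical fault-injection harness in Python, not any specific chaos tool; the `chaos` context manager, the failure rate, and the retrying client are all illustrative assumptions. It randomly injects latency or connection errors into a call and checks that a retry-capable client still gets an answer:

```python
import random
import time
from contextlib import contextmanager

random.seed(0)  # deterministic behavior for this example

@contextmanager
def chaos(failure_rate=0.3, max_latency=0.05):
    """Hypothetical fault injector: before the wrapped call, either
    sleep (latency fault) or raise (availability fault) at random."""
    if random.random() < failure_rate:
        if random.random() < 0.5:
            time.sleep(random.uniform(0, max_latency))  # inject delay
        else:
            raise ConnectionError("injected fault")     # inject outage
    yield

def resilient_fetch(fetch, retries=5):
    """A client built for resilience retries through injected faults."""
    for _ in range(retries):
        try:
            with chaos():
                return fetch()
        except ConnectionError:
            continue  # tolerate the injected failure and try again
    raise RuntimeError("service unavailable after retries")

# Chaos-style test: the call should survive whatever chaos() injects.
print(resilient_fetch(lambda: "ok"))  # prints ok
```

The point of the exercise is cultural as much as technical: once a harness like this runs in every sprint, "what happens when this dependency fails?" becomes a routine question rather than a postmortem one.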
Intel Outside: How The Chip Giant Lost Its Edge
For Intel, 2020 was a roller-coaster ride, with more lows than highs. While
Apple delivered the much-dreaded news that it would move its Macs to its own
silicon, rivals NVIDIA and AMD chipped in with more bad news in the form of mega
acquisitions and technology advances. Intel's woes didn't end there. Last year,
rockstar chip architect Jim Keller, who had been hired to put Intel on top
again, resigned after a brief stint, his shortest tenure yet after spells at
Apple and Tesla. Then there was chief engineer Venkata Murthy Renduchintala, who
promised in 2019 that Intel's next-gen 7nm chips were on track to start
production in 2021. That didn't happen, and Intel parted ways with
Renduchintala in a technical-team shake-up. Constant engineering hiccups, and
internal debate over whether Intel should outsource manufacturing, further
delayed the arrival of next-gen CPUs. The churn in the company's top brass also
signals leadership vulnerabilities: current chief Bob Swan, who will soon be
replaced, was himself appointed only a couple of years ago, tasked with
restructuring the company for disruptive technologies like AI and cloud.
North Korea-Sponsored Hackers Attack with Bad-Code Visual Studio Projects
Microsoft reported a battle with North Korean-sponsored hackers who attacked
security researchers with a most innovative technique: compromised Visual Studio
projects. The attack was attributed to a group called ZINC, said to be
associated with the Democratic People's Republic of Korea (DPRK). A Jan. 28 post
titled "ZINC attacks against security researchers" described the organization as
a DPRK-affiliated and state-sponsored group. That determination was based on
"observed tradecraft, infrastructure, malware patterns, and account
affiliations." "This ongoing campaign was reported by Google’s Threat Analysis
Group (TAG) earlier this week, capturing the browser-facing impact of this
attack," Microsoft said. "By sharing additional details of the attack, we hope
to raise awareness in the cybersecurity community about additional techniques
used in this campaign and serve as a reminder to security professionals that
they are high-value targets for attackers." While such battles between hackers
and security organizations are common and ongoing, one unusual aspect of this
encounter was the choice of payload for the bad code.
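What makes a Visual Studio project a viable payload carrier is that project files can run arbitrary shell commands through build events, so a trojanized project executes code the moment a researcher builds it. The sketch below is a crude, illustrative audit heuristic, not a detector for ZINC's actual tooling; the file name, the tag list, and the suspicious-utility list are all assumptions:

```python
import re
from pathlib import Path

# Windows utilities commonly abused in build-event payloads (illustrative).
SUSPICIOUS = ("powershell", "rundll32", "regsvr32", "certutil", "mshta")

def audit_project(path):
    """Return build-event blocks in an MSBuild project file that mention
    commonly abused utilities. A heuristic triage aid, not a detector."""
    text = Path(path).read_text(errors="ignore")
    findings = []
    for tag in ("PreBuildEvent", "PostBuildEvent", "PreLinkEvent"):
        for block in re.findall(rf"<{tag}>(.*?)</{tag}>", text, re.S):
            if any(tok in block.lower() for tok in SUSPICIOUS):
                findings.append((tag, block.strip()))
    return findings

# Demo with an inline trojanized snippet (invented for illustration):
sample = ("<PreBuildEvent><Command>powershell -w hidden payload.ps1"
          "</Command></PreBuildEvent>")
Path("demo.vcxproj").write_text(sample)
print(audit_project("demo.vcxproj"))  # flags the PreBuildEvent block
```

A real review would also have to handle conditioned build events and obfuscated commands; the point here is simply that the "code" in a shared project is not limited to its source files.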
AI Ethics Really Come Down To Security
Innovating trustworthy AI/ML depends on the design, development and distribution
of AI systems that learn from and work collaboratively with humans in a
comprehensive and meaningful fashion. Security and privacy must be considered at
the start of any new technology's architecture; they cannot be properly added as
an afterthought. The highest required level of security and data protection must
be built into both hardware and software, so that it is configured into every
step of the development and supply chain, from design all the way through to the
technology's business and utilization model. The Charter of Trust initiative for
IoT cybersecurity (of which we're a partner) has also provided excellent
guidelines for a risk-based methodology and verification that should be
incorporated as core requirements throughout that supply chain. After we
identify the core principles that will govern AI development, we must then
determine how to ensure these ethical AI systems are not compromised. Machine
learning can monitor data and pinpoint anomalies, but unfortunately it can also
be used by attackers to amplify the impact of their cyberattacks.
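The "pinpoint anomalies" half of that claim can be illustrated without any ML library at all. The sketch below is a minimal statistical stand-in (z-score thresholding), not a production monitoring system; the latency numbers and the 2.5 threshold are invented for illustration:

```python
def find_anomalies(values, z_threshold=2.5):
    """Flag indices whose values sit far from the mean in standard
    deviations (population std). A classic z-score outlier rule."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    if std == 0:
        return []  # a perfectly flat series has no outliers
    return [i for i, v in enumerate(values)
            if abs(v - mean) / std > z_threshold]

# Steady login latencies (ms) with one spike: the spike is pinpointed.
latencies = [102, 99, 101, 100, 98, 103, 100, 950, 101, 99]
print(find_anomalies(latencies))  # prints [7]
```

The dual-use point in the passage holds here too: the same statistics that surface an intrusion for defenders can help an attacker tune activity to stay just under the threshold.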
Use social design to help your distributed team self-organize
For those on the front lines, a restructuring can feel more like something done
to them than with them. Managers might overlook the experience and insights of
those expected to innovate, collaborate, and satisfy customers within the new
structure. And there is often an explicit or implicit power dynamic that
distorts functional considerations as executives jostle for prominence and
resources. An alternative to the top-down approach is to let
function drive form, supporting those most directly connected to creating value
for customers. Think of it as bottom-up or outside-in. One discipline useful in
such efforts is social design, a subspecialty of design that aspires to solve
complex human issues by supporting, facilitating, and empowering cultures and
communities. Its practitioners design systems, not simply beautiful things. I
spoke with one of the pioneers in this area, Cheryl Heller, author of The
Intergalactic Design Guide: Harnessing the Creative Potential of Social Design.
Her current work at Arizona State University centers on integrating design
thinking and practice into functions that don't typically utilize design
principles. “People’s work is often their only source of stability right now,”
she told me. “You have to be careful, because people are brittle.”
How to improve Wi-Fi roaming
The initial tendency may be to install more APs in hopes of finding an easy fix,
but doing so without careful analysis can make the situation even worse. Proper
roaming requires more than just good signal strength throughout coverage areas;
it takes a careful balance of coverage between each AP on both the 2.4GHz and
5GHz bands. ... Getting the coverage overlap just right
between all the APs in your network is one of the most important things you can
do to improve roaming. At the same time, it is one of the toughest. You
have to check coverage throughout the service areas and analyze the
overlap. If you find issues, figure out how to address them,
apply the fix, and then verify that it is actually fixed. Keep in mind
you want about a 15% to 20% coverage overlap between AP cells, using -67dBm as
the signal boundary for each cell. You want to look at both bands, too, keeping
in mind 2.4GHz naturally provides longer range than 5GHz. Less overlap can
result in spots with bad signals. If you have too much overlap between AP cells
in either band, it can cause co-channel interference and “sticky” clients that
don’t roam, which can result in APs that become overloaded with clients.
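Those numbers translate directly into a survey check. The sketch below is a simplified heuristic, not a site-survey tool: it counts how many APs a single survey spot hears above the -67dBm cell edge cited above, rather than computing true area overlap, and the AP names and RSSI readings are invented for illustration:

```python
CELL_EDGE_DBM = -67  # cell-boundary signal level from the article

def classify_point(readings, edge=CELL_EDGE_DBM):
    """readings: {ap_name: rssi_dbm} heard at one survey spot, one band.
    Returns a rough roaming verdict for that spot."""
    in_cell = [ap for ap, rssi in readings.items() if rssi >= edge]
    if len(in_cell) == 0:
        return "coverage hole"    # no AP above the cell edge
    if len(in_cell) == 1:
        return "no overlap"       # nowhere for a client to roam to
    if len(in_cell) == 2:
        return "healthy overlap"  # exactly one roam candidate
    return "excess overlap"       # co-channel interference risk

# One 5GHz survey spot: two APs above the edge, one below it.
survey_5ghz = {"ap-1": -61, "ap-2": -66, "ap-3": -74}
print(classify_point(survey_5ghz))  # prints healthy overlap
```

Running the same classification over every survey point, on each band separately, is one way to approximate the 15% to 20% overlap target before resorting to adding or moving APs.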
UK's leading AI startup and scaleup founders highlight the main pain points of running an AI business
Looking specifically at financial institutions, Hodgson says that they must
ensure that their data foundations are fit for purpose. “Data is the raw
material of our industry, and without it, the benefits and potential of AI are
stunted and capped before the system even gets switched on. Many financial
institutions already sit atop mountains of their own data in addition to buying
more from vendors — yet they do not have the time, the resources or the staff
expertise to sift through it,” Hodgson explains. Dr Richard Ahlfeld, founder and
CEO at Monolith AI — a startup that builds new machine learning software to help
engineers improve the product development process — echoes this view. He says:
“Any pain points tend to boil down to the data: getting the data, ensuring data
security, making sure that you can trust the data. “There’s no standardisation
of what makes data ‘valuable’ across the industry either, and not all engineers
follow the same protocols and practices. For example, deciding what data to keep
can be tricky as it’s hard to anticipate what might or might not be useful to
have in the future. Even saving data from failed ventures (a practice which is
often overlooked) can have its value, as it acts as a reference for future
experiments.”
Ransomware payments are going down as more victims decide not to pay up
While it's positive that a higher percentage of these victims are choosing not
to pay cyber criminals, there's still a large number of organisations that do
give in – allowing ransomware to continue to be successful, even if those behind
attacks have been making slightly less money. It might, however, be enough to
make some ransomware operators question whether the effort is worth it. "When fewer
companies pay, regardless of the reason, it causes a long-term impact, that
compounded over time can make a material difference in the volume of attacks,"
said a blog post by Coveware. The rise in organisations refusing to give in to
ransomware extortion has also led the gangs to change their
tactics, as shown by the increase in ransomware attacks where criminals threaten
to leak stolen data if the victim doesn't pay. According to Coveware, these
accounted for 70% of ransomware attacks in the final three months of 2020 – up
from 50% during the previous three months. However, while almost three-quarters
of organisations threatened with data being published between July and September
paid ransoms, that dropped to 60% for organisations who fell victim between
October and December.
Measuring Crop Health Using Deep Learning – Notes From Tiger Analytics
Agrochemical companies are already experimenting with advanced data science
techniques to overcome these challenges: they employ drones to capture
high-resolution aerial images of the farms and apply computer vision techniques
and other complex algorithms to process the images. However, challenges persist;
leaf characteristics such as orientation, alignment, length, shape and twists
are difficult to discern when viewed from above, particularly in crops that grow
tall and narrow, such as maize. Further complexities are introduced by
variability in ambient light conditions, soil terrain, cloud refraction,
occlusion and other environmental factors. Finally, all these factors vary over
time, which means that to get a clear picture of plant health and treatment
performance, regular measurement is required. As deep learning and computer
vision fields mature, scientists are beginning to use these technologies for
such LAI measurements, and more. Tiger Analytics has collaborated with leading
agrochemical companies to develop such solutions. In this article, we outline
the possible approaches and challenges. The primary challenge in developing a
deep learning solution is the near nonexistence of training data.
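A standard first response to that scarcity is to stretch what little labelled data exists. The sketch below shows plain geometric augmentation with NumPy; the tile size and the transform set are illustrative assumptions, not Tiger Analytics' actual pipeline:

```python
import numpy as np

def augment(image):
    """Return flipped/rotated variants of an (H, W, C) image array.
    Aerial crop imagery has no preferred orientation, so these
    transforms yield plausible extra training samples."""
    variants = [image]
    variants.append(np.fliplr(image))        # horizontal mirror
    variants.append(np.flipud(image))        # vertical mirror
    for k in (1, 2, 3):
        variants.append(np.rot90(image, k))  # 90/180/270-degree turns
    return variants

# A single 64x64 RGB "aerial tile" becomes six training samples.
tile = np.zeros((64, 64, 3), dtype=np.uint8)
print(len(augment(tile)))  # prints 6
```

Geometric augmentation only goes so far; covering the lighting, terrain, and occlusion variability described above usually also requires photometric perturbations or synthetic imagery.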
Contemporising Data Protection Legislation
Providing a blanket exemption for government agencies from the application of
the data protection law and its processing obligations (Section 35, PDP Bill)
poses a challenge to reforming and upgrading the data-access and surveillance
regime.
The importance of procedural safeguards, the right to effective recourse, and
necessary and proportionate access principles has been reiterated by numerous
Supreme Court judgments like PUCL v. Union of India and K.S. Puttaswamy v. Union
of India. Such an exemption might inadvertently curtail the government’s stated
vision of becoming the data processing and analytics hub of the world, and dent
digital economy goals. According to the updated draft of the Standard
Contractual Clauses (SCCs) by the European Commission on personal data transfers
outside the European region, data exporters must take into account the laws and
overall regime that enable public authorities to access personal data through
binding requests in the destination country, and gauge if they meet “necessary
and proportionate” requirements expected from a “democratic society”. If
governments and businesses find the exemption under Section 35 of the PDP Bill
excessive, digital trade and investments, and the ability to forge agreements,
might be impacted.
Quote for the day:
"Trust is one of the greatest gifts
that can be given and we should take great care not to abuse it." --Gordon Tredgold