How phishing attacks are exploiting Google's own tools and services
Armorblox's co-founder and head of engineering, Arjun Sambamoorthy, explains
that Google is a ripe target for exploitation due to the free and democratized
nature of many of its services. Adopted by so many legitimate users, Google's
open APIs, extensible integrations, and developer-friendly tools have also
been co-opted by cybercriminals looking to defraud organizations and
individuals. Specifically, attackers are using Google's own services to sneak
past binary security filters that look for traffic based on keywords or URLs.
... cybercriminals spoof an organization's security administration team with
an email telling the recipient that they've failed to receive some vital
messages because of a storage quota issue. A link in the email asks the user
to verify their information in order to resume email delivery. The link in the
email leads to a phony login page hosted on Firebase, Google's mobile platform
for creating apps, hosting files and images, and serving up user-generated
content. This link goes through one redirection before landing on the Firebase
page, confusing any security product that tries to follow the URL to its final
location. As it's hosted by Google, the parent URL of the page will escape the
notice of most security filters.
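The filter-evasion trick described above can be made concrete with a small sketch. The allowlist below is hypothetical (no real security product is this simple), but it shows why a binary check on the hosting domain waves a Firebase-hosted phishing page straight through:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of "trusted" parent domains, as a naive
# binary filter might use. Real products are more sophisticated,
# but the reputation shortcut is the same.
TRUSTED_SUFFIXES = (".google.com", ".firebaseapp.com", ".web.app")

def naive_filter_allows(url: str) -> bool:
    """Return True if the URL's host ends in a trusted suffix.

    This is exactly the kind of check attackers exploit: a phishing
    page hosted on Firebase inherits Google's domain reputation.
    """
    host = urlparse(url).hostname or ""
    return any(host.endswith(s) for s in TRUSTED_SUFFIXES)

# A phony login page hosted on Firebase sails through:
print(naive_filter_allows("https://fake-login.firebaseapp.com/verify"))  # True
# ...while an unknown attacker-owned domain would be flagged:
print(naive_filter_allows("https://evil.example.net/verify"))            # False
```

Following redirects to the final landing page, rather than judging the first hop's domain, is one way filters try to close this gap, which is why the attackers add a redirection step.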
Women in Data: How Leaders Are Driving Success
Next-gen analytics have helped to shift perception and enable the business to
accelerate the use of data, according to panelist Barb Latulippe, Sr. Director
Enterprise Data at Edwards Lifesciences, who emphasized the trend toward
self-service in enterprise data management. The days of the business going to
IT are gone—a data marketplace provides a better user experience. Coupled with
an effort to increase data literacy throughout the enterprise, such data
democratization empowers users to access the data they need themselves, thanks
to a common data language. This trend was echoed by panelist Katie Meyers,
senior vice president at Charles Schwab responsible for data sales and service
technologies. A data leader for 25 years, Katie focused on the role cloud
plays in enabling new data-driven capabilities. Katie emphasized that we’re
living in a world where data grows faster than our ability to manage the
infrastructure. By activating data science and artificial intelligence (AI),
Charles Schwab can leverage automation and machine learning to enable both the
technical and business sides of the organization to more effectively access
and use data.
Developer experience: an essential aspect of enterprise architecture
Code that provides the structure and resources to allow a developer to meet
their objectives with a high degree of comfort and efficiency is indicative of
a good developer experience. Code that is hard to understand, hard to use,
fails to meet expectations and creates frustration for the developer is
typical of a bad developer experience. Technology that offers a good developer
experience allows a programmer to get up and running quickly with minimal
frustration. A bad developer experience—one that is a neverending battle
trying to figure out what the code is supposed to do and then actually getting
it to work—costs time, money, and, in some cases, can increase developer
turnover. When working with a company’s code is torturous enough, a talented
developer who has the skills to work anywhere will take one of their many
other opportunities and leave. There is only so much friction users will
tolerate. While a good developer experience is widely recognized as essential
close to the end user of a piece of software, it is often
overlooked at the architectural design level. However, this oversight is
changing. Given the enormous demand for more software at faster rates of
delivery, architects are paying attention.
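The contrast the article draws can be illustrated with a small sketch. Both functions below are invented for illustration; they parse the same hypothetical config format, but one leaves the developer guessing while the other documents itself and fails with actionable errors:

```python
def parse_cfg(s):
    # Poor developer experience: cryptic name, no docs, and a bare,
    # confusing error if any entry lacks an '='.
    return dict(p.split("=") for p in s.split(";"))

def parse_config(text: str) -> dict:
    """Parse 'key=value;key=value' configuration strings.

    Better developer experience: a docstring, clear naming, and
    errors that say what went wrong and how to fix it.
    """
    pairs = {}
    for i, part in enumerate(filter(None, text.split(";"))):
        if "=" not in part:
            raise ValueError(
                f"entry {i} ({part!r}) is missing '='; "
                "expected 'key=value' entries separated by ';'"
            )
        key, _, value = part.partition("=")
        pairs[key.strip()] = value.strip()
    return pairs

print(parse_config("host=db1; port=5432"))  # {'host': 'db1', 'port': '5432'}
```

The same design pressure applies at the architectural level: interfaces between services and layers either guide developers or fight them.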
ISP Security: Do We Expect Too Much?
"The typical Internet service provider is primarily focused on delivering
reliable, predictable bandwidth to their customers," Crisler says. "They value
connectivity and reliability above everything else. As such, if they need to
make a trade-off decision between security and uptime, they will focus on
uptime." To be fair, demand for speed and reliable connections was crushing
many residential ISPs in the early days of the pandemic. For some, it remains a
serious strain. "In the early weeks of the pandemic, when people started using
their residential connections at once, ISPs were faced with major outages as
bandwidth oversubscription and increased botnet traffic created serious
bottlenecks for people working at home," says Bogdan Botezatu, director of
threat research and reporting at Bitdefender. ISPs' often aging and
inadequately protected home hardware presents many security vulnerabilities as
well. "Many home users rent network hardware from their ISP. These devices are
exposed directly to the Internet but often lack basic security controls. For
example, they rarely if ever receive updates and often leave services like
Telnet open," says Art Sturdevant, VP of technical operations at Internet
device search engine Censys. "And on devices that can be configured using a
Web page, we often see self-signed certificates, a lack of TLS for login
pages, and default credentials in use."
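The exposures Sturdevant describes are easy to check for on your own equipment. The sketch below is a minimal TCP probe, not how Censys scans; the gateway address is a placeholder, and it should only ever be pointed at devices you own:

```python
import socket

# Hypothetical sketch: check whether a home router leaves legacy
# services such as Telnet, or an unencrypted admin page, reachable.
# Only probe devices you own.

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (uncomment to check your own gateway; 192.168.1.1 is a
# common private default, not a confirmed address):
# for port, service in [(23, "telnet"), (80, "http admin"), (443, "https admin")]:
#     state = "OPEN" if port_open("192.168.1.1", port) else "closed/filtered"
#     print(f"192.168.1.1:{port} ({service}) -> {state}")
```

An open port 23, or an admin page served only over port 80, is exactly the kind of finding the article warns about on ISP-rented hardware.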
Can private data as a service unlock government data sharing?
Data as a service (DaaS), a scalable model where many analysts can access a
shared data resource, is commonplace. However, privacy assurance about that
data has not kept pace. Data breaches occur by the thousands each year, and
insider threats to privacy are commonplace. De-identification of data can
often be reversed and has little in the way of a principled security model.
Data synthesis techniques can only model correlations across data attributes
for unrealistically low-dimensional schemas. What is required to address the
unique data privacy challenges that government agencies face is a
privacy-focused service that protects data while retaining its utility to
analysts: private data as a service (PDaaS). PDaaS can sit atop DaaS to
protect subject privacy while retaining data utility to analysts. Some of the
most compelling work to advance PDaaS can be found with projects funded by the
Defense Advanced Research Projects Agency’s Brandeis Program, ... According to
DARPA, “[t]he vision of the Brandeis program is to break the tension between:
(a) maintaining privacy and (b) being able to tap into the huge value of data.
Rather than having to balance between them, Brandeis aims to build a third
option – enabling safe and predictable sharing of data in which privacy is
preserved.”
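One building block a PDaaS layer might use to break that tension is differential privacy, which the article does not name but which directly addresses the weakness of reversible de-identification. The sketch below answers a counting query with calibrated Laplace noise, so no single subject's presence can be inferred from the answer; the dataset and epsilon value are invented for illustration:

```python
import random

def dp_count(records, predicate, epsilon: float = 0.5) -> float:
    """Answer a counting query with Laplace noise of scale 1/epsilon.

    A counting query has sensitivity 1 (one subject changes the count
    by at most 1), so this noise level masks any individual's presence
    while keeping the aggregate useful to analysts.
    """
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) as the difference of two exponentials.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# Invented subject records; the true count of ages 40+ is 3.
people = [{"age": a} for a in (23, 35, 41, 52, 67, 29)]
print(f"noisy count: {dp_count(people, lambda r: r['age'] >= 40):.1f}")
```

Averaged over many queries the answers stay near the truth, which is the "third option" Brandeis describes: utility to the analyst without exposing the individual.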
How to Create High-Impact Development Teams
Today’s high-growth, high-scale organizations must have well-rounded tech
teams in place -- teams that are engineered for success and longevity.
However, the process of hiring for, training and building those teams requires
careful planning. Tech leaders must ask themselves a series of questions
throughout the process: Are we solving the right problem? Do we have the right
people to solve these problems? Are we coaching and empowering our people to
solve all aspects of the problem? Are we solving the problem the right way?
Are we rewarding excellence? Is 1+1 at least adding up to 2 if not 3? ... When
thinking of problems to solve for the customers -- don’t constrain yourself by
the current resources. A poor path is to first think of solutions based on
resource limitations and then find the problems that fit those solutions. An
even worse path is to lose track of the problems and simply start implementing
solutions because “someone” asked for it. Instead, insist on understanding the
actual problems/pain points. Development teams who understand the problems
often come back with alternate, and better, solutions than the initially
proposed ones.
Apstra arms SONiC support for enterprise network battles
“Apstra wants organizations to reliably deploy and operate SONiC with
simplicity, which is achieved through validated automation...Apstra wants to
abstract the switch OS complexity to present a consistent operational model
across all switch OS options, including SONiC,” Zilakakis said. “Apstra wants
to provide organizations with another enterprise switching solution to enable
flexibility when making architecture and procurement decisions.” The company’s
core Apstra Operating System (AOS), which supports SONiC-based network
environments, was built from the ground up to support intent-based networking
(IBN). Once running, it
keeps a real-time repository of configuration, telemetry and validation
information to constantly ensure the network is doing what the customer wants
it to do. AOS includes automation features to provide consistent network and
security policies for workloads across physical and virtual infrastructures.
It also includes intent-based analytics to perform regular network checks to
safeguard configurations. AOS is hardware-agnostic and integrates with
products from Cisco, Arista, Dell, Juniper, Microsoft and Nvidia/Cumulus.
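The intent-based loop described above — keep a repository of intended state, gather telemetry, and validate the two against each other — can be sketched in a few lines. AOS internals are not public here, so the device names and attributes below are invented:

```python
# Intended state: what the operator declared the network should be.
intended = {
    "leaf1": {"vlan": 100, "mtu": 9000, "bgp_peers": 2},
    "leaf2": {"vlan": 100, "mtu": 9000, "bgp_peers": 2},
}

# Observed state: what telemetry currently reports. leaf2 has drifted.
observed = {
    "leaf1": {"vlan": 100, "mtu": 9000, "bgp_peers": 2},
    "leaf2": {"vlan": 100, "mtu": 1500, "bgp_peers": 1},
}

def validate(intended: dict, observed: dict) -> list:
    """Return one anomaly string per attribute that deviates from intent."""
    anomalies = []
    for device, want in intended.items():
        have = observed.get(device, {})
        for key, value in want.items():
            if have.get(key) != value:
                anomalies.append(f"{device}: {key} is {have.get(key)!r}, "
                                 f"intent says {value!r}")
    return anomalies

for a in validate(intended, observed):
    print("ANOMALY:", a)
```

A real IBN system runs this comparison continuously and can drive remediation, which is what "constantly ensure the network is doing what the customer wants" amounts to in practice.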
New EU laws could erase its legacy of world-leading data protection
As the European Union finalises new digital-era laws, its legacy of
world-leading privacy and data protection is at stake. Starting next week, the
European Commission will kick off the introduction of landmark legislative
proposals on data governance, digital market competition, and artificial
intelligence. The discussions happening now and over the next few months have
implications for the future of the General Data Protection Regulation and the
rights this flagship law protects. With Google already (predictably) meddling
in the debate, it is imperative that regulators understand what the pitfalls
are and how to avoid them. ... The first new legislation out of the gate will
be the Data Governance Act, which the European Commission is set to publish on
November 24. According to Commissioner Thierry Breton, the new Data Strategy
aims to ensure the EU “wins the battle of non-personal data” after losing the
“race on personal data”. We strongly object to that narrative. While countries
like the US have fostered the growth of privacy-invasive data harvesting
business models that have led to repeated data breaches and scandals such as
Cambridge Analytica, the EU stood against the tide, adopting strong data
protection rules that put people before profits.
The journey to modern data management is paved with an inclusive edge-to-cloud Data Fabric
We want everything to be faster, and that’s what this Data Fabric approach
gets for you. In the past, we’ve seen edge solutions deployed, but you weren’t
processing a whole lot at the edge. You were pushing all the data back
to a central, core location -- and then doing something with that data. But we
don’t have the time to do that anymore. Unless you can change the laws of
physics -- last time I checked, they haven’t done that yet -- we’re bound by
the speed of light for these networks. And so we need to keep as much of the
data and systems as we can local at the edge. Yet we still need to take some of
that information back to one central location so we can understand what’s
happening across all the different locations. We still want to make the
rearview reporting better globally for our business, as well as allow for more
global model management. ... Typically, we see a lot of data silos still out
there today with customers – and they’re getting worse. By worse, I mean
they’re now all over the place between multiple cloud providers. I may use
some of these cloud storage bucket systems from cloud vendor A, but I may use
somebody else’s SQL databases from cloud vendor B, and those may end up having
their own access methodologies and their own software development kits (SDKs).
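The edge pattern described above — process raw data locally, ship only a compact summary to the core for global reporting — can be sketched simply. The site names and sensor readings below are invented for illustration:

```python
def summarize(readings: list) -> dict:
    """Reduce raw local readings to a summary small enough to ship."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Each edge site keeps its raw data local...
edge_sites = {
    "plant-a": [20.1, 20.4, 21.0, 19.8],
    "plant-b": [35.2, 36.0, 35.5],
}

# ...and only the summaries travel to the central location, where
# rearview reporting and global model management happen.
central = {site: summarize(r) for site, r in edge_sites.items()}
for site, s in central.items():
    print(site, s)
```

The speed-of-light constraint in the interview is exactly why the raw readings never leave the site: only the few bytes of summary cross the network.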
Rebooting AI: Deep learning, meet knowledge graphs
"Most of the world's knowledge is imperfect in some way or another. But there's
an enormous amount of knowledge that, say, a bright 10-year-old can just pick up
for free, and we should have RDF be able to do that. Some examples are, first of
all, Wikipedia, which says so much about how the world works. And if you have
the kind of brain that a human does, you can read it and learn a lot from it. If
you're a deep learning system, you can't get anything out of that at all, or
hardly anything. Wikipedia is the stuff that's on the front of the house. On the
back of the house are things like the semantic web that label web pages for
other machines to use. There's all kinds of knowledge there, too. It's also
being left on the floor by current approaches. The kinds of computers that we
are dreaming of that can help us to, for example, put together medical
literature or develop new technologies are going to have to be able to read that
stuff. We're going to have to get to AI systems that can use the collective
human knowledge that's expressed in language form and not just as a spreadsheet
in order to really advance, in order to make the most sophisticated systems."
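The "back of the house" knowledge referred to here is typically expressed as subject-predicate-object triples that machines can query directly. The sketch below uses invented facts as stand-ins for semantic-web style annotations; real systems use RDF stores and SPARQL rather than a Python set:

```python
# A tiny knowledge graph: facts as (subject, predicate, object) triples.
triples = {
    ("Wikipedia", "isA", "Encyclopedia"),
    ("Encyclopedia", "contains", "Knowledge"),
    ("Aspirin", "isA", "Drug"),
    ("Aspirin", "treats", "Headache"),
}

def query(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# What do we know about Aspirin?
for t in sorted(query(s="Aspirin")):
    print(t)
```

A deep learning system reading raw Wikipedia text gets none of this structure for free; the argument in the interview is that future AI systems will need to consume both the prose and the triples.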
Quote for the day:
"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley