To begin your company's data privacy digital transformation, you should do two main things. First, define your company's privacy requirements. Create a clear list of the current needs you have. Do you need help managing and fulfilling users' privacy requests? Do you need a consent management tool? Do you want to automate your data mapping efforts? Do you need third-party risk assessment? Make sure you clearly define your desired set of requirements based on your user base size, business assets and countries of operation. Depending on where your business and customers reside, you will need to research the requirements for data privacy compliance in each of those countries. ... A digital transformation will help the data privacy field make strides as it progresses. With privacy technology and automation, companies can seamlessly integrate data privacy into their businesses, products and customer experiences. Data ownership marks a new era in the digital world, and to make it possible and successful, we have to welcome this change with smart technologies and an open mind.
BigLake is a unified storage engine that simplifies data access for data warehouses and lakes by providing uniform fine-grained access control across multi-cloud storage and open formats. BigLake extends BigQuery's fine-grained row- and column-level security to tables over data residing in object stores such as Amazon S3, Azure Data Lake Storage Gen2, and Google Cloud Storage. BigLake decouples access to the table from the underlying cloud storage data through access delegation. This feature helps you to securely grant row- and column-level access to users and pipelines in your organization without providing them full access to the table. After you create a BigLake table, you can query it like other BigQuery tables. BigQuery enforces row- and column-level access controls, and every user sees only the slice of data that they are authorized to see. Governance policies are enforced on all access to the data through BigQuery APIs. For example, the BigQuery Storage API lets users access authorized data using open source query engines such as Apache Spark ... For data administrators, BigLake lets you abstract access management on data lakes from files to tables, and it helps you manage users' access to data on lakes.
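To make the "every user sees only their authorized slice" idea concrete, here is a toy model of row- and column-level filtering of the kind BigLake enforces at query time. This is only an illustration of the concept: real BigLake policies are defined and enforced inside BigQuery, and all of the names, users, and policies below are hypothetical.

```python
# Toy model of row- and column-level access control. In BigLake, these
# policies live in BigQuery and are enforced on every query; here they
# are ordinary Python objects purely to illustrate the filtering idea.

TABLE = [
    {"region": "us", "customer": "acme", "revenue": 100},
    {"region": "eu", "customer": "globex", "revenue": 250},
    {"region": "us", "customer": "initech", "revenue": 75},
]

# Per-user policy: a row predicate plus the set of visible columns.
POLICIES = {
    "us_analyst": {"rows": lambda r: r["region"] == "us",
                   "columns": {"region", "customer"}},
    "admin":      {"rows": lambda r: True,
                   "columns": {"region", "customer", "revenue"}},
}

def query(user, table):
    """Return only the slice of `table` the user is authorized to see."""
    policy = POLICIES[user]
    return [{k: v for k, v in row.items() if k in policy["columns"]}
            for row in table if policy["rows"](row)]

# The analyst gets only US rows, with the revenue column masked out.
print(query("us_analyst", TABLE))
```

The key design point this models is that the filter sits between the caller and the storage, so no user ever needs direct access to the underlying files.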
The serious lesson from that is to acknowledge but forgive errors. "He's said, many times, that he knew at that moment it was going to be OK," Ellis says. "Creating a safe culture requires a lot of practices, and one of them is closure. Humor is a great way to provide closure because you rarely laugh about something that is still creating tension." There isn't a lot to laugh about in cybersecurity, with security teams fighting off a growing number of cyberattacks and deploying protective measures for a fast-evolving environment. But security shouldn't be about browbeating people into doing the right thing or scaring them with the prospect of punishment. For security to be a team sport, you need to make people want to play. It's vitally important to your business to create a security culture — that is, an atmosphere in which someone who messes up and breaks something feels they can report it without getting blasted for their actions. This idea isn't new, but considering recent analysis about how some companies aren't backing up their source code, sometimes stories need to be repeated.
The discrepancy in effort between multiplying two known primes together, and splitting that product back into its two factors, is pretty much the computational basis of a lot of modern online security…so if quantum computers ever do become both reliable and powerful enough to work their superpositional algorithmic magic on 3072-bit products of large primes, then breaking into messages we currently consider uncrackable in practice may become possible in theory. Even if you’d have to be a nation state to have even the tiniest chance of succeeding, you’d have turned a feat that everyone once considered computationally unfeasible into a task that might just be worth having a crack at. This would undermine a lot of existing public-key crypto algorithms to the point that they simply couldn’t be trusted. Even worse, quantum computers that could crack new problems might also be used to have a go at older cryptographic puzzles, so that data we’d banked on keeping encrypted for at least N years because of its high value might suddenly be decryptable in just M years, where M < N, perhaps less by an annoyingly large amount.
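The multiply-versus-factor asymmetry is easy to see even at toy scale. Multiplying two primes is a single machine-level operation, while recovering them by trial division takes on the order of the square root of the product in steps, which grows exponentially with the bit length of the number. The primes below are tiny by cryptographic standards (real RSA factors are hundreds of digits long), so this is only a sketch of the shape of the problem:

```python
def trial_factor(n):
    """Factor n = p * q by trial division.

    Cost grows roughly with sqrt(n), i.e. exponentially in the bit
    length of n -- which is exactly why large semiprimes are hard.
    """
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return n, 1  # n itself is prime

p, q = 1_000_003, 1_000_033   # two small primes just above one million
n = p * q                     # multiplying: effectively instant

# Factoring the same number back costs ~500,000 division attempts here;
# double the bit length of p and q and the cost squares.
print(trial_factor(n))
```

Shor's algorithm on a large quantum computer would collapse this exponential cost to polynomial time, which is the threat the excerpt describes.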
“Managing productivity is one of the most complex things any one person or organization can aspire to do,” says Dr. Sahar Yousef, a cognitive neuroscientist at University of California—Berkeley. The first step, though, is to define what you mean by productive, she says. “You can’t improve or change something that is not measurable.” And you can’t trust your team if you can’t also verify that they are working productively. If, in the past, you measured how hard people were working by noting who was at their desk or who spoke up in meetings, you’ll have to find a new way. Those things aren’t available anymore and they were never a good measure of productivity anyway. “We measure baselines around productivity, not hours worked,” says Andi Mann, CTO at Qumu. Because tracking how many hours someone worked doesn’t tell you much about productivity, even when you could tell the difference between work and home. “I spent nine hours at work,” says Mann. “Does that mean I accomplished something? Not necessarily. So that’s not the measure I’m looking for. My team are grownups — coders, engineers, smart people. I measure metrics that matter — outputs and accomplishments.”
Given the need for software companies to constantly grow their customer bases, the relatively low cost of cash for the past decade and a half, and the ability to cross-sell and upsell, it is natural for software conglomerates to form. And so it was only a matter of time before Puppet and its peers, Ansible, Chef and SaltStack, were acquired once they built up sufficient momentum to demonstrate their likely longevity across service providers, smaller clouds, and enterprises that do not build their own DevOps software stacks. So Red Hat bought Ansible in October 2015 for around $100 million, and Ansible was absolutely one of the reasons why IBM was compelled to pay $34 billion to acquire Red Hat in October 2018. ... And then VMware paid an undisclosed sum to buy SaltStack in that same month. HashiCorp, which has built a big following with its Terraform and Vagrant configuration management tools, has gone all the way and built a complete DevOps and container platform and has also gone public – but HashiCorp is the exception, not the rule, and it will have to keep expanding its platform and adding more tools if it hopes to keep growing its business.
Developers’ interest in security has been a long time coming. Google search data shows that queries for terms like “what is DevSecOps” and “DevSecOps vs. DevOps” first popped up in 2014 and have been steadily rising since 2017. The cloud, microservices, containerization and APIs are responsible for this burgeoning interest. These innovative technologies aren’t only changing the way applications are built and operated; they’re also changing what’s needed from a security perspective. In a modern environment, developers, engineers and architects need to think about data privacy and security because today’s applications benefit from having security measures baked into discrete components. Before the cloud became as ubiquitous as it is today, traditional cybersecurity relied on a perimeter-based model. Measures like firewalls and browser isolation systems essentially “surrounded” on-premises networks and systems. Applications and data were secure because they were hosted on physically isolated infrastructure.
Data democratization strategies ensure that company data is easily accessible by all employees, regardless of their position, without the involvement of the IT department. As valuable company data is placed in the hands of more individuals, cybercriminals can broaden the scope of potential targets to hack. Now an entire organization’s employee population theoretically faces an increased risk of malware penetration, and IT departments have a more difficult time deciphering when an unauthorized user has infiltrated the cloud-based systems where the data lives. Many organizations have implemented traditional detection-based security technology to thwart these threats, yet these solutions are only able to detect threats with known malware signatures. As enterprises work to secure their cloud infrastructures, they need to consider that solutions that focus on detecting threats are unable to protect against sophisticated attacks. As mentioned, proper security is critical for data democratization. Yet, in order for data democratization to work and make an impact, productivity has to be a critical focus.
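The limitation of signature-based detection described above comes down to exact matching: a scanner keyed to known malware signatures catches only samples it has already seen. The toy detector below makes the point; the payloads, the signature database, and the hashing choice are all made up for illustration and are far simpler than real signature engines:

```python
import hashlib

# Hypothetical signature database: the SHA-256 hashes of samples the
# vendor has already seen and catalogued.
KNOWN_MALWARE = {hashlib.sha256(b"evil-payload-v1").hexdigest()}

def is_flagged(payload: bytes) -> bool:
    """Flag a payload only if its hash matches a known signature."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_MALWARE

# The catalogued sample is caught...
print(is_flagged(b"evil-payload-v1"))   # flagged
# ...but a trivially modified variant produces a different hash and
# sails straight past the detector.
print(is_flagged(b"evil-payload-v2"))   # not flagged
```

This is why the excerpt argues that purely detection-based tools struggle against novel or sophisticated attacks: any change to the payload defeats an exact-signature match.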
High-performing CISOs are taking strategic business objectives and efforts into account and adapting their security programs to deliver results that multiply business velocity and revenue, instead of hindering the business by basing a security program on threats and vulnerabilities alone. This means CISOs are also having to become more business-savvy, helping promote a security culture through shared values, trust, and accountability, often more through influencing skills than with the security and compliance hammer. “We're seeing the CISO role being elevated out from underneath the CIO's IT umbrella and becoming a direct report to the CEO,” explains John Hellickson, field CISO executive advisor for Coalfire. “This means they are expected to bring a high degree of business acumen in how they represent risk to their business peers and stakeholders.” He said the need for establishing business-aligned cybersecurity programs that go beyond typical control frameworks is now table stakes -- the ability to demonstrate positive business outcomes and ROI of security risk management activities and investments will continue to be expected in the years to come.
"A security gap forms when quality updates that protect against new threats aren't adopted in a timely fashion. A productivity gap forms when feature updates that enhance users' ability to create and collaborate aren't rolled out. As gaps widen, it can require more effort to catch up," Bela says. In a separately released Windows Autopatch FAQ, Microsoft says the updates will be applied to a small initial set of devices, evaluated and then graduated to increasingly larger sets, with an evaluation period at each progression. "This process is dependent on customer testing and verification of all updates during these rollout stages. The outcome is to assure that registered devices are always up to date and disruption to business operations is minimized, which will free an IT department from that ongoing task," Microsoft says. In addition, Microsoft says that in case of an issue, the Autopatch service can be paused by the customer or the service itself. "When applicable, a rollback will be applied or made available," it says.
Quote for the day:
"The secret of a leader lies in the tests he has faced over the whole course of his life and the habit of action he develops in meeting those tests." -- Gail Sheehy