RBAC is a policy-neutral access control solution built around roles and privileges. Also known as role-based security, RBAC restricts access to authorized users only, and it can implement both discretionary and mandatory access controls as business requirements dictate. Its features, which include permission groups, role permissions, and user-role or role-role relationships, block or restrict users from performing unauthorized actions or accessing unauthorized data storage. Without an enforcing access control system, employees can do almost anything. For example, an employee could send a modified invoice or quote containing his own bank account information, stealing the payment from the organization’s clients. Or he could grant access to third-party persons or organizations, allowing them to infiltrate your organization, view or steal your sensitive data, and more. ... With a role-based access control system, you can reduce the paperwork for onboarding employees, changing passwords, switching roles, and so on. You can use the control system to add or switch roles quickly, apply roles and permissions to multiple employees or globally, and more. Since the complete access control settings sit under one platform, assigning roles and permissions to employees produces fewer errors and greater efficiency.
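The user-role and role-permission relationships described above can be sketched as a minimal in-memory model. This is only an illustration with hypothetical names; production systems add persistence, role hierarchies, and constraint checking on top of the same idea:

```python
# Minimal RBAC sketch: roles group permissions, and users acquire
# permissions only indirectly, through their assigned roles.

class RBAC:
    def __init__(self):
        self.role_permissions = {}   # role -> set of permissions
        self.user_roles = {}         # user -> set of roles

    def grant(self, role, permission):
        self.role_permissions.setdefault(role, set()).add(permission)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def is_allowed(self, user, permission):
        # A user may perform an action only if some assigned role grants it.
        return any(permission in self.role_permissions.get(role, set())
                   for role in self.user_roles.get(user, ()))

rbac = RBAC()
rbac.grant("accountant", "invoice:read")
rbac.grant("finance_admin", "invoice:modify")
rbac.assign("alice", "accountant")

print(rbac.is_allowed("alice", "invoice:read"))    # True
print(rbac.is_allowed("alice", "invoice:modify"))  # False
```

Because every check routes through roles, revoking a role (or switching an employee to a new one) updates all of that user's effective permissions in one place, which is where the onboarding and role-switching savings come from.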
Cybersecurity began as an effort to wall off companies from the outside world, protecting trade secrets, customer data, and other sensitive information from unauthorized people. Since then, the world has grown far more complicated. Data has become increasingly important even as it has moved to the "cloud" and is accessed through the internet. No longer do just employees need access to that data; customers do, too. And no longer do just people need access to that data; other computer systems do, too. Corporate computer systems are no longer isolated forts; they are interconnected hives with information passing back and forth in myriad ways. The result has been a steady increase in ways for criminals to get that data, and a steady drumbeat of increasingly spectacular breaches, with criminals stealing everything from credit card and social security numbers to the blueprints for nuclear power plants. With virtual private networks that were built to handle modest numbers of workers now facing hordes, the threat vectors are proliferating.
One of the benefits of using AI is that it can improve data quality. This improvement is needed within any analytics-driven organisation, where the proliferation of personal, public, cloud, and on-premise data has made it nearly impossible for IT to keep up with user demand. Companies want to improve data quality by taking the advanced design and visualisation concepts typically reserved for the final product of a BI solution, namely dashboards and reports, and putting them to work at the very beginning of the analytics lifecycle. AI-based data visualisation tools, such as Qlik’s Sense platform and Google Data Studio, are enabling enterprises to identify the critical data sets that need attention for business decision-making, reducing human workloads. In an effort to speed time-to-market for custom-built AI tools, technology vendors are introducing pre-enriched, machine-readable data specific to given industries. Intended to help data scientists and AI engineers, these kits include the industry-specific data needed to build AI models, speeding their creation. For example, the IBM Watson Data Kit for food menus includes 700,000 menus from across 21,000 US cities and dives into menu dynamics like price, cuisine, ingredients, etc.
The independent report, “Weathering the Perfect Storm: Securing the Cyber-Physical Systems of Critical Infrastructure,” queried over 400 C-level executives from critical infrastructure organisations across North America, Europe and Asia/Pacific and found: 52% say employees are the biggest threat to operational security; cyber incursion into IT data systems accounted for 53% of attacks in the last 12 months; 85% of security incursions made their way into Operational Technology networks – of those, 36% started in IT/data systems and 32% involved physical incursion into OT; nearly two-thirds (64%) say it took a cyber or physical security breach to motivate them to move toward a more holistic approach to cyber security; and only a quarter believe their existing security is adequate. “The perfect storm of increasing cyber threats, digital transformation and IT/OT convergence means organisations must move swiftly to gain visibility and enhance cybersecurity into their OT and IoT networks,” said Kim Legelis, CMO, Nozomi Networks.
Researchers from FireEye who have been tracking the activity said APT41 attacked as many as 75 of its customers between January 20 and March 11 alone. The targeted organizations are scattered across 20 countries, including the US, UK, Canada, Australia, France, Japan, and India. Organizations from nearly 20 sectors have been impacted, including those in the government, defense, banking, healthcare, pharmaceutical, and telecommunication sectors. Though only a handful of the attacks resulted in an actual security compromise, FireEye described APT41's activity as one of the broadest malicious campaigns by a Chinese threat actor in recent years. Chris Glyer, chief security architect at FireEye, says the reason for APT41's sudden burst of activity is unclear. Based on FireEye's current visibility, the attacks appear to be targeted, but it is hard to ascribe a specific motive or intent behind APT41's behavior, he says. But likely triggers include the ongoing trade war between the US and China and the unfolding COVID-19 pandemic.
Asked why they've been moving to cloud-based security, 29% of the respondents cited improvements in the monitoring and tracking of attacks, while 22% pointed to reduced maintenance. Other reasons included reductions in capital expenditures and access to the latest features. But organizations also have specific fears about switching their security tools to cloud-based variants. Asked about their concerns, 30% of the respondents pointed to the privacy of their data, 16% to unauthorized access, 14% to server outages, 14% to integration with other security tools, and 13% to the sovereignty of their data. Further, some 32% said they thought it would be too hard or too risky to migrate their security tools to the cloud. Another 32% said they didn't know what concerns their organization had about this type of migration. Among the organizations that have moved to cloud-based security tools, 22% cited email as the most widely protected type of data, 21% customer information, 20% file sharing, and 18% personnel files. Only 12% of the respondents said they're using cloud-based security to protect corporate financial data.
Today’s challenges with data are heterogeneous. Data is scattered and unstructured in mixed storage and computing environments – endpoints, edge, on-premises, cloud, or a hybrid, which uses a mix of these. Data is also accessible across different architectures, including file-based, database, object, and containers. There are also issues of data duplication and conflict. 5G will surely add more complexity to today’s existing challenges. With 5G, even more data will be generated from endpoints and IoT devices, with more metadata and contextual data produced and consumed. As a result, there will be more demand for real-time processing, and more edge computing, analysis, and data storage scattered throughout the network. Each application and use case is unique and has different storage requirements and challenges, including performance, data integrity, workloads, data retention, and environmental restrictions. In the past, the capabilities of general-purpose storage greatly exceeded the requirements of networks, data, and applications.
Linus Torvalds might be best known as the creator of Linux, but Git, the distributed version control system of his invention, is arguably even more important. Torvalds has said that “Git proved I could be more than a one-hit wonder,” but this is an understatement in the extreme. While there were version control systems before Git (e.g., Subversion), Git has revolutionized how developers build software since its introduction in 2005. Today Git is a “near universal” ingredient of software development, according to studies pulled together by analyst Lawrence Hecht. How “near universal”? Well, Stack Overflow surveys put it at 87 percent in 2018, while JetBrains data has it jumping from 79 percent (2017) to 90 percent (2019) adoption. Because so much code sits in public and, even more, in private Git repositories, we’re in a fantastic position to wrap operations around Git. To quote Weaveworks CEO Alexis Richardson, “Git is the power option, [and] we would always recommend it if we could, but it is very wrong to say that GitOps requires expertise in Git. Using Git as the UI is not required. Git is the source of truth, not the UI.” Banks, for example, have old repositories sitting in Subversion or Mercurial. Can they do GitOps with these repositories?
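The "Git is the source of truth" idea can be sketched in a few lines: desired state is committed to a repository, and tooling reads it back from HEAD rather than from any live system. This is only an illustration (the file name and commit message are hypothetical, and it requires the `git` CLI on the PATH); in a real GitOps setup an operator such as Flux or Argo CD would do the reconciling:

```python
# Sketch of Git as the source of truth for declared state.
import os
import subprocess
import tempfile

def run(*args, cwd):
    """Run a git command and return its stdout."""
    return subprocess.run(args, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

repo = tempfile.mkdtemp()
run("git", "init", "-q", cwd=repo)
run("git", "config", "user.email", "demo@example.com", cwd=repo)
run("git", "config", "user.name", "demo", cwd=repo)

# Declare desired state as a versioned file and commit it.
with open(os.path.join(repo, "deployment.yaml"), "w") as f:
    f.write("replicas: 3\n")
run("git", "add", "deployment.yaml", cwd=repo)
run("git", "commit", "-qm", "declare desired state", cwd=repo)

# A GitOps operator would reconcile the running system against HEAD;
# here we simply read the committed state back.
print(run("git", "show", "HEAD:deployment.yaml", cwd=repo))
```

As for legacy repositories, nothing in this flow depends on where the history came from: a Subversion or Mercurial repository mirrored into Git (for example with the `git svn` extension) can serve as the source of truth in exactly the same way.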
Quote for the day:
"All organizations are perfectly designed to get the results they are now getting. If we want different results, we must change the way we do things." -- Tom Northup