Daily Tech Digest - September 17, 2021

How CISOs and CIOs should share cybersecurity ownership

While the CISO handles the day-to-day elements of cybersecurity and much of its forward planning, in most organizations the buck stops with the CIO, who reports to the CEO and the board of directors, Finch says. “As a result, the CIO cannot hand responsibility to the CISO entirely. Instead, they need to retain awareness of security strategy and ensure that it isn’t putting the organization’s overall strategy in danger—or vice versa.” Brad Pollard, CIO at Tenable, says today's CIOs have a range of security accountabilities grounded in availability, performance, budget, and the timely delivery of projects. “CIOs enable and support every business unit within an organization. In doing so, they inherit the information security requirements for each business unit.” For example, the CISO may well be charged with defining security parameters such as service level agreements for vulnerability remediation or access controls, but it falls to the CIO to deliver on those requirements across all business units and all the company’s technologies, Pollard says.

A Guide to DataOps: The New Age of Data Management

In a data-driven competitive landscape, ignoring the benefits of data, or failing to extract its full potential, can only mean a disastrous end for organizations. To be sure, many of these organizations are collecting plenty of data. They simply lack the will, the know-how, or the processes to use it. Part of the problem is legacy data pipelines. As data moves from source to target in the pipeline, each stage has its own idea of what that data means and how it can be put to use. This disconnected view renders the pipelines brittle and resistant to change, in turn making organizations slow to react when conditions shift. ... DataOps, short for data operations, is a collaborative data management approach that emphasizes communication, integration, and automation of data pipelines within organizations. Unlike data storage management, DataOps is not primarily concerned with ‘storing’ the data. It is more concerned with ‘delivery’, i.e., making the data readily available, accessible, and usable for all stakeholders.
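The automation DataOps emphasizes often begins with quality gates between pipeline stages, so that broken records are caught at the boundary rather than passed silently downstream. A minimal sketch of one such gate (the function and field names are purely illustrative, not taken from any specific DataOps tool):

```python
def validate(records, required_fields):
    """Split records into valid and rejected, instead of letting
    incomplete rows flow on to downstream consumers."""
    valid, rejected = [], []
    for rec in records:
        # dict.keys() supports set comparison, so this checks that
        # every required field is present in the record.
        if required_fields <= rec.keys():
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected


# The second record is missing "amount" and gets quarantined.
raw = [{"id": 1, "amount": 9.5}, {"id": 2}]
good, bad = validate(raw, {"id", "amount"})
```

In a real pipeline, the rejected records would be logged or routed to a dead-letter store so the upstream stage can be fixed, which is exactly the fast feedback loop DataOps is after.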

How To Reduce Context Switching as a Developer

Developers often struggle to balance timely communication against context switching. As we already know, context switching hurts your productivity because it prevents you from reaching a deep state of work. On the other hand, when colleagues ask a question, you want to help them promptly. For example, a developer who asks for your assistance might be blocked until you respond. But should you sacrifice your flow state to help your colleague? The answer is a balance: respond in a timely way, but prioritise your own work. Asynchronous communication has become a popular approach to this problem. Instead of calling a meeting for each issue, communicate with the people involved and resolve it over a text-based channel such as Slack. It also helps to block time in your calendar to reach a flow state and to leave slots open for meetings or colleagues' questions. For instance, you can block two slots of three hours for deep work and leave two slots of one hour for asynchronous communication.

Stop Using CSVs for Storage — Pickle is an 80 Times Faster Alternative

Storing data in the cloud can cost you a pretty penny. Naturally, you’ll want to stay away from the most widely known data storage format — CSV — and pick something a little lighter. That is, if you don’t need to view and edit data files on the fly. ... In Python, you can use the pickle module to serialize objects and save them to a file, then deserialize that file to load the objects back when needed. Pickle has one major advantage over other formats — you can use it to store any Python object. That’s right: you’re not limited to tabular data. One of the most common uses is saving machine learning models after training is complete, so you don’t have to retrain the model every time you run the script. I’ve also used Pickle numerous times to store NumPy arrays. It’s a no-brainer solution for setting checkpoints of some sort in your code. Sounds like a perfect storage format? Well, hold your horses.
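The round trip the article describes comes down to two standard-library calls, `pickle.dump` and `pickle.load`. A minimal checkpoint-style sketch:

```python
import os
import pickle
import tempfile

# Any Python object can be pickled -- not just tabular data.
checkpoint = {"weights": [0.1, 0.2, 0.7], "epoch": 42}

path = os.path.join(tempfile.mkdtemp(), "checkpoint.pkl")

# Serialize the object to a binary file.
with open(path, "wb") as f:
    pickle.dump(checkpoint, f)

# Deserialize it back; the round trip preserves Python types exactly.
with open(path, "rb") as f:
    restored = pickle.load(f)

assert restored == checkpoint
```

Note that pickle files are binary and Python-specific, and unpickling data from an untrusted source can execute arbitrary code, which is part of the "hold your horses" caveat.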

Data Management Strategy Is More Strategic than You Think

CXOs are in some ways the most visible representatives inside large enterprises of what is, after all, a deeply felt human need to make sense of the world. We try to satisfy that need in all parts of our lives, including our professional careers. It’s far more satisfying emotionally to work in an organization that uses data effectively to chart the way forward. But there are pressing contemporary drivers of urgency too, not just an inherent human need. The pandemic radically accelerated awareness of data’s importance for both social and commercial resilience, especially in the face of repeated supply chain shocks and disruptions. There was another factor too: most of the enterprise world has taken to working from home, operating complex organizations from the relative safety of social isolation, despite the additional challenges such isolation creates. The future of work has been thrown into question across the enterprise, and that raises questions and urgency around the information strategies that must support it.

UN Calls For Moratorium On Artificial Intelligence Tech That Threatens Human Rights

The report, which was called for by the UN Human Rights Council, looked at how countries and businesses have often hastily implemented AI technologies without properly evaluating how they work and what impact they will have. It found that AI systems are used to determine who has access to public services, to screen job candidates, and to shape what information people see and can share online, Bachelet said. Faulty AI tools have led to people being unfairly denied social security benefits, while innocent people have been arrested due to flawed facial recognition. "The risk of discrimination linked to AI-driven decisions – decisions that can change, define or damage human lives – is all too real," Bachelet said. The report highlighted how AI systems rely on large data sets, with information about people collected, shared, merged and analysed in often opaque ways. The data sets themselves can be faulty, discriminatory or out of date, and can thus contribute to rights violations, it warned. For instance, they can erroneously flag an individual as a likely terrorist. The report raised particular concern about the increasing use of AI by law enforcement, including as a forecasting tool.

How Do Authentication and Authorization Differ?

As a user, you can usually see authentication happening (although it might be persistent, like staying logged into a website even after you close the browser tab), and you can often do things like change your password or choose which second factor to use. Authorization is different: users can’t change their authorization options and won’t see authorization happening. But you might see another authentication request if you try to do something considered important enough that your identity must be verified again before you are authorized to do it. Some banks will let you log in and make payments to existing payees with just your username and password, but ask you to use 2FA to set up a new payee. Conversely, authentication systems that use conditional access policies can recognize that you’re accessing the same file share from the same device, IP address, location, and network connection as you do every day, and skip the extra authentication challenge so your productivity isn’t interrupted.
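The distinction can be sketched in a few lines. The user store, permission table, and action names below are purely illustrative (a real system would hash passwords and delegate both checks to an identity provider):

```python
# Hypothetical stores, for illustration only.
USERS = {"alice": "s3cret"}
PERMISSIONS = {"alice": {"view_account", "pay_existing_payee"}}


def authenticate(username, password):
    """Authentication: verify the user is who they claim to be."""
    return USERS.get(username) == password


def authorize(username, action):
    """Authorization: check what an already-verified user may do."""
    return action in PERMISSIONS.get(username, set())


assert authenticate("alice", "s3cret")
assert not authenticate("alice", "wrong")
assert authorize("alice", "view_account")
# Sensitive actions are simply absent from the permission set,
# so they fail authorization even for an authenticated user:
assert not authorize("alice", "add_new_payee")
```

The bank example maps directly onto this split: logging in is `authenticate`, while the step-up 2FA prompt for a new payee is the system demanding fresh authentication before granting an authorization the user doesn't yet hold.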

Data Loss Protection Could Be Industry’s Next Big Trend

Data is an organisation’s most precious asset, and understanding the various data states can help determine the best security measures to deploy. Technology like DLP gives both IT and security personnel an overall perspective on the location, distribution, and use of information within an organisation. It reduces the risk of data theft, along with the fines and lost income that can follow. If you’re worried about your upcoming audit and want to keep your data compliant with complex regulations, DLP is a great option. For companies wanting to protect their sensitive data from security breaches caused by increased worker mobility and the emergence of new communication channels, the technology is a godsend. Success with cloud and virtual models has opened up new possibilities for DLP. Using business rules, these software tools identify and protect confidential and sensitive data, preventing unauthorized end-users from disclosing information that could endanger the firm.
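The business rules such tools apply are often content-inspection patterns. A minimal illustrative sketch of one such rule (flagging card-number-like strings), not a depiction of how any particular DLP product works:

```python
import re

# Hypothetical content rule: flag 16-digit, card-number-like
# sequences, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b")


def contains_sensitive(text):
    """Return True if the text matches the card-number pattern."""
    return bool(CARD_RE.search(text))


assert contains_sensitive("card: 4111-1111-1111-1111")
assert not contains_sensitive("order #1234 shipped")
```

Real DLP systems layer many such detectors (regexes, checksums, fingerprints of known documents) and apply them at rest, in motion, and in use, blocking or quarantining content before it leaves the organisation.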

Cloud Native Driving Change in Enterprise and Analytics

There is a democratization underway of data embedded into workflows and Slack, he said, but being able to expose data from applications or natively integrated in applications is the province of developers. Tools exist, Stanek said, for developers to make such data analytics more accessible and understandable by users. “We want to help people make decisions,” he said. “We also want to get them data at the right time, with the right context and volume.” Stanek said he sees more developers owning business applications, insights, and intelligence up to the point where end users can make decisions. “This industry is heading away from an isolated industry where business people are copying data into visualization tools and data preparation tools and analytics tools,” he said. “We are moving into a world where we will be providing all of this functionality as a headless functionality.” The rise of headless compute services, which do not have local keyboards, monitors, or other means of input and are controlled over a network, may lead to different composition tools that allow business users to build their own applications with low-code/no-code resources, Stanek said.

Is the Net Promoter Score ripe for replacement?

How can businesses measure the success of their marketing efforts? How does their current and future performance benchmark against competitors? How can they work out, for example, the levels of satisfaction and loyalty felt by their customers? The rise of social media over the last decade has made these questions simultaneously easier and, in many ways, harder to answer. On the one hand, the internet is bristling with the data required to determine how a given business is performing, as customers willingly – even eagerly – share thoughts and opinions that provide insight into vital issues such as customer satisfaction. On the other hand, the sheer volume of data available can make it challenging to separate the essential from the non-essential. ... Clearly, then, the stage is set for new ways to measure performance: methods which are up-to-the-minute, capable of leveraging AI and machine learning to sift through swathes of data, and able to articulate actionable KPIs in a simple and accessible format.

Quote for the day:

"You think you can win on talent alone? Gentlemen, you don't have enough talent to win on talent alone." -- Herb Brooks, Miracle
