Daily Tech Digest - November 28, 2023

How a digital design firm navigated its SOC 2 audit

One of the more intense aspects of the audit was the testing of our incident response plan. We had to provide records of past incidents, how they were handled, and the lessons learned. Moreover, the auditors conducted tabletop exercises to assess our preparedness for potential future security events. After weeks of evaluation, the auditors presented their findings. We excelled in some areas, such as in our encryption of sensitive data and our robust user authentication systems. However, they also identified areas for improvement, like the need for more granular access controls and enhanced monitoring of system configurations. Post-audit, we were given a roadmap of sorts--a list of recommendations to address the identified deficiencies. This phase was dedicated to remediation, where we worked diligently to implement the auditors’ suggestions and improve our systems. Reflecting on the transformative impact of SOC 2 certification, L+R has discerned a profound shift in the dynamics of client engagement and internal processes. SOC 2 certification transcends the realm of compliance, fostering enriched dialogues, bolstering trust, and catalyzing decision-making at the executive level.


Is anything useful happening in network management?

The first of these is a management take on something that's already becoming visible in a broader way: absorbing the network into something else. Companies have said for years that the data center network, the LAN, is really driven by data center hardware/software planning and not by network planning. They're now finding that a broader use of hybrid cloud, where the cloud becomes the front-end technology for application access, is pulling the WAN inside the cloud. The network, then, is becoming less visible, and thus explicit network management is becoming less important. ... The second development gaining attention is being proposed by a number of vendors, the largest being Nokia. It envisions using "digital twin" technology, something most commonly associated with IoT and industrial metaverse applications, to construct a software model of the network based on digital twins of network devices. With this approach, the network becomes in effect an industrial system, and potentially could then be monitored and controlled by tools designed for industrial IoT and industrial metaverse deployments. 


The Basis for Business and Solution Architecture

The Business Architect, just like the Solution Architect, is a business technology strategist. The delivery of technology driven business value is core to their professional capability and career. So for that purpose they share a set of skills/competencies, language, professional goals, and experiences with each other. Any other method of architecture has been shown to fail. Without the shared capabilities and focus the team quickly begins to cannibalise its own value proposition to the enterprise and argue about baseline definitions, purpose and ownership. The level of team synergy and shared community is one of the most important first steps to a mature architecture practice. With that in place the Business and Solution Architects work very well together and in alignment with the EA from strategy through execution to measured value. Business Architects must focus on program level outcomes, those that are scoped at the business capability and/or department or region level. These levels are where real business goals and measurements occur and stand closest to the customer while retaining executive authority.


Is the Future of Data Centers Under the Sea?

One of the most appealing aspects of underwater data centers is their proximity to large population centers. Around half of the world’s population lives within 125 miles of a coastal area. Situating data centers near coastal population centers would allow for lower latency and more efficient handling of data. This would increase speeds for various digital services. Perhaps counterintuitively, the servers themselves might also benefit from being dropped in the drink. The inert gases and liquids used to fill underwater data centers are less corrosive than ambient air, leading to longer lifespans for the equipment. The servers are also protected from possible human damage incurred by everyday movement -- people banging into them, dropping them, or accidentally unplugging them. Placing a data center pod or retrieving it for maintenance is fairly simple, according to Subsea Cloud’s Williams. “Let's say the water is 100 meters deep. It's just an hour an hour job. If it’s 3,000 meters deep, it will probably take five or six hours to get the pod down.”


What you don’t know about data management could kill your business

Contributing to the general lack of data about data is complexity. There are many places in the enterprise where data spend happens. Individual business units buy data from third parties, for example. Taking enterprise-wide inventory of all the data feeds being purchased and getting an accurate picture of how all that purchased data is being put to use would be a good first step. The reality is that a significant portion of the data sloshing about modern enterprises is replicated in multiple locations, poorly classified, idiosyncratically defined, locked in closed platforms, and trapped in local business processes. Data needs to be made more liquid in the way of an asset portfolio — that is, transformed to ease data asset reuse and recombination. ... Traditionally business schools have avoided data as a topic, pumping out business leaders who erroneously feel that data is someone else’s job. I recall the mean-spirited dig at early career Harvard Business School alums expecting their assistants to bring in the day’s work arrayed as a case study — that is, a crisp 20-page synopsis of all the relevant issues.


Stop panic buying your security products and start prioritizing

Look inward and optimize. Companies need to understand what inside their networks and data is most attractive and most vulnerable to attackers. Get visibility into what you have, calculate the value of your tools, and use the information to move forward. Understanding risk by gaining full visibility into what you already have can allow companies to communicate better with investors and the public in the case of an attack or breach. For example, they will be able to give clear information about the impact (or lack of impact) on the business when an attack occurs and lay out clear steps for remediation, not having to guess the next best course of action. ... It is important to remember that the goal is not to buy more tools to chase the growing number of vulnerabilities that experts find every day, but to protect the assets that are most relevant to overall vital business operations and limit the fallout of inevitable cyber incidents. By attaching a dollar value to the cyber risks the organization is up against, you will be in a much better position to discuss your security plan and budgetary needs.


US, UK Cyber Agencies Spearhead Global AI Security Guidance

The guidance, the agencies say, is good for all AI developers but is particularly aimed at AI providers who use models hosted by a third party or who use APIs to connect to a model. Risks include adversarial machine learning threats stemming from vulnerabilities in third-party software and hardware applications. Hackers can exploit those flaws to alter model classification and regression performance and to corrupt training data and carry out prompt injection or data poisoning attacks that influence the AI model's decisions. Hackers can also target vulnerable systems to allow unauthorized actions as well as extract sensitive information. The guidance describes cybersecurity as a "necessary precondition" for AI safety and resilience. CISA, in particular, has been on a protracted campaign to evangelize the benefits of secure by design while also warning tech companies that the era of releasing products to the public containing security flaws must come to an end (see: US CISA Urges Security by Design for AI). The guidelines represent a "strong step" in providing universal standards and best practices for international AI security operations and maintenance, according to Tom Guarente, vice president of external and government affairs for the security firm Armis. "But the devil is in the details."


Data De-Identification: Balancing Privacy, Efficacy & Cybersecurity

There are two primary laws guiding online privacy: the General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA), although many countries and states have started to write their own. Among the various safeguard measures, data de-identification is a prime one. Both define data de-identification as the process of making PII anonymized in a way that any piece of secondary information, when associated with the personal data, cannot identify the individual. The industry unanimously agrees on some entities as personal data including a name, address, email address, and phone number. Others, such as an IP address (and versions of it) are based on interpretation. These laws neither explicitly list the attributes that are personal nor do they mention how and when to anonymize, beyond sharing a few best practices. ... However, full anonymization of personal data and the data linked to it is useless to businesses in this ever-digital world. Every new technological breakthrough demands massive input of data sets — both personal and aggregated. 


NCSC publishes landmark guidelines on AI cyber security

The set of guidelines has been designed to help the developers of any system that incorporates AI make informed decisions about cyber security during the development process – whether it is being built from scratch or as an add-on to an existing tool or service provided by another entity. The NCSC believes security to be an “essential pre-condition of AI system safety” and integral to the development process from the outset and throughout. In a similar way to how the secure-by-design principles alluded to by CISA’s Easterly are increasingly being applied to software development, the cognitive leap to applying the same guidance to the world of AI should not be too difficult to make. The guidelines, which can be accessed in full via the NCSC website, break down into four main tracks – Secure Design, Secure Development, Secure Deployment and Secure Operation and Maintenance, and include suggested behaviours to help improve security. These include taking ownership of security outcomes for customers and users, embracing “radical transparency”, and baking secure-by-design practices into organisational structures and leadership hierarchies.


AI partly to blame for spike in data center costs

The research firm attributes the year-over-year jump in absorption to the impact of AI requirements, as well as the growth of traditional cloud computing needs across the region. However, the industry is facing challenges, not the least of which is securing enough power. "Securing power continues to be a challenge for developers, pushing some to the outskirts of primary markets, as well as fueling growth in secondary and tertiary markets. This has led to an uptick in land-banking by companies hoping to secure space and power for future growth," DatacenterHawk said in its report. ... Cloud providers continue to consume most of the capacity in North America. AI and machine learning also continue to drive activity across US and Canadian markets, though the full impact of these rapidly growing industries has yet to be seen, the report noted. Submarkets within major markets will continue to grow from hyperscale users and data center operators, DatacenterHawk predicts. Older, enterprise data centers will be targets for AI companies that need bridge power quickly, providing an environment that allows them to grow larger over time.



Quote for the day:

"Amateurs look for inspiration; the rest of us just get up and go to work." -- Chuck Close

No comments:

Post a Comment