Daily Tech Digest - January 05, 2023

Singh calls herself a business information security partner, but the title most commonly employed for this role is business information security officer (BISO). People in these roles are responsible for one or more areas of the business and they usually report to the CISO or CTO, based on job descriptions found online and those laid out by multiple sources interviewed for this article. The people holding these roles also come from diverse educational and experiential backgrounds, at the core of which are strong familiarity with compliance regulations, solid cybersecurity foundations, and business acumen. ... Renee Guttmann, who’s been CISO to several Fortune 50 companies, says that the most important thing she looks for in a BISO is a thorough understanding of the business unit they support, which includes identifying the company’s “crown jewels”: what the most important assets are, where they are, and the targeted attacks to which they are potentially vulnerable. The BISO should be able to identify the risks and work with others, such as architecture and infrastructure managers, to prioritize risks.


DevOps: 3 steps to plan and execute a successful project

The preliminary stages of a DevOps project are crucial. Without clear direction and a shared understanding across the team, the initiative is doomed to failure. The team and the client must therefore be willing to dedicate the time necessary to understand each other’s goals and ensure their visions align. This can be done through meetings and workshops, where participants identify objectives and team members establish a clear goal for how the final product should look. When executed correctly, the DevOps team will exit the project’s first phase with a well-defined brief and a clear understanding of the client’s goals. If this step is rushed, engineers will be hindered by a lack of direction, increasing the likelihood that the finished product will not meet the client’s requirements. ... Phase two is when the development of the app begins. This is usually facilitated by using a cloud-based solution, where the team begins preparing the environment’s architecture, working out the components it should contain, and understanding how they should be configured to maximize efficiency.


5 Key Kubernetes Trends to Follow in 2023

Multi-cluster Kubernetes is important because it makes it feasible to separate workloads using not just namespaces, but entirely distinct clusters. Doing so provides more security and performance protections than you can get using namespaces, which offer only virtualized segmentation between workloads. As a result, multi-cluster Kubernetes makes Kubernetes more valuable for use cases that involve very stringent security requirements, or where it's critical to avoid the "noisy neighbor" problems that can happen when multiple workloads share the same hosting infrastructure. ... No one has ever accused Kubernetes of being an easy platform to use. On the contrary, you'll find plenty of warnings on the internet that Kubernetes is "hard," or even "damn complicated." But you could have said the same thing about Linux in the 1990s, or the Amazon cloud in the mid-2000s, when those platforms were new. Like other major technologies that preceded it, Kubernetes is still growing up, and there remains plenty of room to improve it by improving the platform's usability.
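The cluster-level separation described above shows up directly in a kubeconfig file: each sensitive workload group gets its own cluster and context rather than just a namespace. Below is a minimal sketch; all cluster names, server URLs, and user entries are hypothetical.

```yaml
# Hypothetical kubeconfig: two fully separate clusters for hard isolation,
# versus namespaces, which only virtually segment workloads in one cluster.
apiVersion: v1
kind: Config
clusters:
  - name: prod-payments          # stringent-security workloads get their own cluster
    cluster:
      server: https://payments.example.internal:6443
  - name: prod-general           # everything else shares this cluster
    cluster:
      server: https://general.example.internal:6443
contexts:
  - name: payments
    context:
      cluster: prod-payments
      user: admin
  - name: general
    context:
      cluster: prod-general
      user: admin
current-context: general
users:
  - name: admin
    user: {}
```

With a config like this, `kubectl config use-context payments` switches every subsequent command to an entirely distinct control plane and node pool, so a noisy or compromised workload in one cluster cannot affect the other.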


Moving Beyond Security Awareness to Security Education

“Unlike awareness, application security education is based on central principles or ‘big ideas’. If key security concepts are part of a continuous and programmatic education initiative, development teams can learn to apply knowledge, skills, and experience to novel situations and better secure applications,” said Baker. When application security principles are understood, developers can then not only identify when code isn’t quite right or spot something that creates risk, but also effectively design against it. “Awareness doesn’t go far enough for security-critical roles such as software developers, product and UX managers, quality assurance and scrum masters who are all responsible for delivering safe applications,” said Baker. “What’s needed is deeper education, and there are several ways that this can be incorporated into the awareness training mix.” First and foremost, Baker stated, the concept of continuous and programmatic security education—and why it matters for security-critical roles—requires buy-in from everyone within the organization. 


Blow for Meta: it must stop serving personalised ads until it's GDPR compliant

According to noyb, ten confidential meetings took place between Meta and the DPC during the course of the proceedings, over which time the DPC came down on the side of the company and its bypassing of the standard GDPR rules for consent. Schrems has launched multiple successful legal campaigns against technology companies and their misuse of personal data. He said: "This case is about a simple legal question. Meta claims that the 'bypass' happened with the blessing of the DPC. For years the DPC has dragged out the procedure and insisted that Meta may bypass the GDPR, but was now overruled by the other EU authorities. It is overall the fourth time in a row the Irish DPC got overruled." Schrems claimed the DPC had refused to release the details of the decision to noyb and accused the regulator of playing "a very diabolic public relations game". He added: "By not allowing noyb or the public to read the decision, it tries to shape the narrative of the decision jointly with Meta. It seems the cooperation between Meta and the Irish regulator is well and alive - despite being overruled by the EDPB".


What is data ingestion?

At its simplest, data ingestion is the process of shifting or replicating data from a source and moving it to a new destination. Some of the sources from which data is moved or replicated are databases, files or even IoT data streams. The data moved and/or replicated during data ingestion is then stored at a destination that can be on-premises. ... Data ingestion uses software automation to move large amounts of data efficiently, as the operation requires little manual effort from IT. Data ingestion is a mass means of data capture from virtually any source. It can deal with the extremely large volumes of data that are entering corporate networks on a daily basis. Data ingestion is a “mover” technology that can be combined with data editing and formatting technologies such as ETL. By itself, data ingestion only ingests data; it does not transform it. For many organizations, data ingestion is a critical tool that helps them manage the front end of their data pipeline: data just entering the enterprise. A data ingestion tool enables companies to immediately move their data into a central data repository without the risk of leaving any valuable data “out there” in sources that may later no longer be accessible.
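The "mover, not transformer" distinction above can be sketched in a few lines of shell: files land in a source location and are replicated verbatim into a central repository, with no editing or reformatting along the way. All paths and file names here are hypothetical.

```shell
# Minimal sketch of "mover"-style ingestion: replicate source files into a
# central repository without transforming them (transformation is ETL's job).
# Paths and file contents are hypothetical.
set -e
work=$(mktemp -d)
mkdir -p "$work/source" "$work/repository"

# Pretend these are exports landing from an operational system.
printf 'id,amount\n1,9.99\n2,4.50\n' > "$work/source/orders.csv"
printf 'id,name\n1,alice\n' > "$work/source/users.csv"

# Ingest: copy every file byte-for-byte; skip anything already ingested,
# so the job can run repeatedly as new data arrives.
for f in "$work"/source/*.csv; do
  dest="$work/repository/$(basename "$f")"
  [ -e "$dest" ] || cp "$f" "$dest"
done

ls "$work/repository"
```

A real ingestion tool adds scheduling, connectors, and monitoring on top, but the core contract is the same: the repository copy is identical to the source copy.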


The Most Futuristic Tech at CES 2023

TCL's RayNeo X2 AR glasses are available for demo at CES 2023, and CNET's Scott Stein was able to use them to translate a conversation with a Chinese speaker in real time. The frames on the RayNeo X2 AR glasses are slightly bulkier than regular eyeglass frames, but prescription inserts eliminate the need to wear other glasses underneath, and the expected introduction of Qualcomm's AR1 chipset should reduce the size further. The RayNeo X2 AR glasses will be released to the developer community at the end of the first quarter of 2023, with a commercial launch set for later in the year. ... One of the more unusual prototypes shown at CES 2023 is a wearable neckband from a Japanese startup company called Loovic. The device hangs around your neck, sort of like studio headphones when not in use, and provides audio and tactile directions to help you navigate without looking at your phone. The device was inspired by Loovic CEO Toru Tamanka's son, who suffers from a cognitive impairment that makes following directions difficult. It will work for anyone who wants to receive navigation while keeping their head up. 


Kubernetes must stay pure, upstream open-source

Vendors may modify code for their custom distributions or the supporting applications you need to make Kubernetes run in production. While a modified version of Kubernetes will work with a particular vendor’s application stack and management tools, these proprietary modifications lock you into customized component builds and prevent you from integrating with other upstream open-source projects. And if their stack comprises multiple products, it’s very hard to achieve interoperability, which can cause lots of downstream issues as you scale. ... It’s incredibly difficult to merge back a fork that has diverged drastically over the years from the upstream. This is called technical debt – the cost of maintaining source code caused by deviation from the main branch where joint development happens. The more changes to forked code, the more money and time it costs to rebase the fork to the upstream project.
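The rebase cost the passage describes can be seen in miniature with git itself: a fork that carries its own patches must replay them on top of upstream every time upstream moves. All repository names and commits below are hypothetical.

```shell
# Sketch of fork-maintenance cost: a vendor fork that diverges from
# upstream must periodically rebase its patches back on top.
# Repo names, files, and commits are hypothetical.
set -e
dir=$(mktemp -d)
cd "$dir"

# Upstream project where joint development happens.
git init -q -b main upstream
cd upstream
git config user.email dev@example.com && git config user.name dev
echo "core" > core.txt && git add . && git commit -qm "core feature"
cd ..

# Vendor fork carrying a proprietary customization.
git clone -q upstream fork
cd fork
git config user.email dev@example.com && git config user.name dev
echo "custom build" > vendor.txt && git add . && git commit -qm "vendor patch"

# Meanwhile, upstream keeps moving.
cd ../upstream
echo "new api" > api.txt && git add . && git commit -qm "upstream api"

# The fork pays down its technical debt by rebasing onto latest upstream.
# The longer the fork drifts, the more conflicts this step produces.
cd ../fork
git fetch -q origin
git rebase -q origin/main
git log --oneline
```

Here the single vendor patch replays cleanly; a fork with years of accumulated proprietary changes hits conflicts at this step, which is exactly the maintenance cost that keeps growing with divergence.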


Managing Remote Workforces: IT Leaders Look to Expanded Suite of Tools

Ramin Ettehad, co-founder of Oomnitza, says organizations must connect their key systems and orchestrate rules, policies, and workflows across the technology and employee lifecycle, not with tickets and manual workloads, but rather with conditional rule-based automation of all tasks across teams and systems. He notes that one example of a key business process challenged in a remote working world is employee onboarding and offboarding. “Companies want to make a positive first impression by offering new employees a consumer-esque onboarding experience,” he says. “This effort will maximize employee experience and time to productivity.” From the perspective of Dan Wilson, vice president analyst in the Gartner IT practice, some key tech tools for remote workforce management include remote control for IT support, to remotely see and interact with computers, as well as Unified Endpoint Management (UEM). “UEM lets IT departments discover, manage, and configure devices, and deploy software and operating system updates without having to connect to the corporate network or VPN,” he explains.


Social Engineering Attacks: Preparing for What’s Coming in 2023

Impersonation and comment spam have exploded over the past year and will likely be some of the most prominent forms of phishing in 2023. This type of social engineering attack exploits the trust and recognition associated with influencers. Attackers create an account on a social media site that looks nearly identical to an influencer’s. The posts are often giveaway announcements, declaring that fans just need to “click this link” or “DM this account on Telegram” to collect their winnings. Instead, people are tricked into giving away money and are ghosted by the fake account. Impersonation and comment spam have become so serious on YouTube that prominent creators have asked the platform to address the issue. The scam results in monetary theft and hurts the reputation of the creators being impersonated. ... One peculiar new form of social engineering on the rise is reputation ransomware. This scare tactic exploits the headline nature of data breach announcements. The cybercriminal will demand ransom from the victim organization, threatening to “leak” news of a fictional data breach if they do not pay.



Quote for the day:

"Without continual growth and progress, such words as improvement, achievement, and success have no meaning." -- Benjamin Franklin
