Daily Tech Digest - December 24, 2022

3 takeaways to boost your enterprise architect career in 2023

Many people confuse business architecture with IT systems architecture, but they are distinct practices. Business architects translate business ideas and potential projects into initiatives that align with an organization's strategies and shape the company's performance as a market leader, says Renee Biggs. IT architects contribute to the technology components of business architecture or enterprise architecture. ... Finding the right balance between documentation and implementation is part of an architect's job. But what do you do if your culture values performance a little too much, asks Evan Stoner, a senior specialist solutions architect. His advice is to "deliver what is needed," which means uncovering the current needs and designing for them. Change doesn't happen overnight, and because the future is uncertain, architectures are constantly evolving. Software development is a key example, say Neal Fishman and Paul Homan: business applications must be continually redeveloped to address new opportunities. But constant change can produce substandard solutions that require frequent revision.


Risk and resilience: compliance in 2023

David Tattam, Chief Research & Content Officer and co-founder of Protecht, expects that, by starting to prepare early, “businesses will start to realise the tangible benefits a holistic actionable view of risk provides. Smart businesses will track a measurable baseline of risk efficiencies over time, in line with their profit and returns strategy, to demonstrate the ROI of their risk management program”. Some legislation may have complicated and far-reaching impacts. Lee Biggenden, COO and Co-Founder of Nephos Technologies, has noticed that “there are ongoing discussions in the European Union about open data platforms which, if passed, could revolutionise how data is used, shared and owned. Anticipated to come into force in 2023, it will have a huge impact on businesses who will need to put the controls and visibility in place over their data regardless of what industry they're in. Although on paper this may seem like a step in the right direction, it does raise concerns about personal privacy as third-party data sharing is a key part of the proposed act. We are all guilty of clicking privacy boxes without reading the full terms and conditions”.


The Metaverse Doesn’t Have a Leg to Stand On

Zuckerberg’s not the only mark here. Microsoft also placed a bet on the metaverse (the avatars in its iteration, Mesh, also lack legs). In the past few years, a comically wide variety of companies have hired their own “chief metaverse officer,” from Disney and Procter & Gamble to the Creative Artists Agency and the accounting firm Prager Metis. Meta placed its bet on the metaverse in the flashiest way, changing its name, spending all that dough, et cetera, but it’s not alone in its conviction that these virtual worlds are the inevitable future. Even writer Neal Stephenson, who coined the word metaverse in his 1992 novel Snow Crash, founded an actual metaverse company in 2022. In the past few years, metaverse startups like Decentraland and the Sandbox grabbed venture capital interest by hyping themselves as hubs for a new NFT-fueled economy. Despite these companies’ hefty valuations, they have remained decidedly niche. (Refuting a third-party report that it had only 38 active users one day, Decentraland said it had an average of 8,000 daily active users—which is still tiny.) Why did Zuckerberg gamble his business on something so wobbly, so literally legless?


The year 2022 for Women in Tech

The results of the Toptal survey cannot clearly show that progress has been made in the last year. They suggest that a bigger change still needs to happen, and there are steps that everyone involved in the ecosystem should take to overcome the existing gender inequality in the world at large and in technology and STEM specifically. The first step is to break the stereotypes: stereotypical thinking is at the root of the misconception of women's role in society and of which jobs and activities are ‘appropriate’ for them. The second step, bridging the gender gap, is the next big shift that must be pushed for. The gap is most notable in career opportunities, which is easy to see by comparing the number of women studying and graduating in STEM fields with the number who manage to land tech jobs and build lasting professional careers in the field. A significant part of the gender gap is the pay gap, which demonstrably persists even in the most advanced countries.


The Tech:Forward recipe for a successful technology transformation

Having an approach that is both this comprehensive and detailed was instrumental in aligning one large OEM’s tech-transformation goals. Previous efforts had stalled, often because of competing priorities across various business units, which frequently led to a narrow focus on each unit’s needs. One unit might want to push hard for cloud, for example, while another wanted to cut costs. Each unit would develop its own KPIs and system diagnostics, which made it virtually impossible to make thoughtful decisions across units, and technical dependencies between units would often grind progress to a halt. The company was determined to avoid making that mistake again. So it invested the time to educate stakeholders on the Tech:Forward framework, detail the dependencies within each part of the framework, and review exactly how different sequencing models would impact outcomes. In this way, each function developed confidence that the approach was both comprehensive and responsive to its needs.


3 cloud architecture best practices for industry clouds

Make no assumptions about the security of industry-specific clouds. Those sold by the larger cloud providers may be secure as stand-alone services; however, they could become a security vulnerability when integrated and operated directly within your solution. The best practice here is to build and design security into your custom applications that leverage industry clouds. Also, do so with integration in mind so that no new vulnerabilities are opened. Two components that are each assumed to be secure independently can, once you add dependencies between them, have an entirely different security profile. ... However, you’ll often find the best-of-breed option is on another cloud or perhaps from an independent industry cloud provider that decided to go it alone. The best practice here is to not limit the industry-specific services under consideration. As time goes on, there will be dozens of services to do tasks such as risk analytics for investment banking, for example. Picking the less optimized choice lowers the value returned to the business. In other words, less optimized decisions mean less ROI.


Benefits of A Technology-Enabled Risk Assessment Process

Organizations need effective risk assessments to manage resources and make informed business decisions that enable growth. Performing an effective risk assessment means going beyond an annual, check-the-box activity by implementing a risk assessment process that can yield actionable results and findings and serve as a business intelligence tool to inform risk management strategies. In today’s rapidly evolving market, banks, financial services companies, payment services providers, and fintechs are focused on improving their processes to create dynamic and efficient digital experiences for their customers. But the improvements do not extend just to customers. ... Technology-enabled solutions – which support the automated or semi-automated collection of data, scoring of inherent risk, mapping of controls, and scoring of residual risk – can help organizations streamline and add efficiencies to their risk assessments and provide a better understanding of real-time risk than the frequently outdated, once-a-year process can. Organizations then can use the valuable business intelligence obtained through the risk assessment process to increase revenue and identify new business opportunities for further growth.
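The scoring workflow described here – inherent risk, control mapping, residual risk – can be sketched in a few lines. This is a minimal illustration, not any vendor's methodology; the rating scales, weights, and the residual-risk formula (inherent risk reduced by control effectiveness) are all assumptions for the example.

```python
# Minimal sketch of a semi-automated risk-scoring step. The scales and the
# formula (residual = inherent * (1 - control effectiveness)) are
# illustrative assumptions, not a standard.

def inherent_risk(likelihood: float, impact: float) -> float:
    """Score inherent risk on a 1-25 scale from 1-5 likelihood/impact ratings."""
    return likelihood * impact

def residual_risk(inherent: float, control_effectiveness: float) -> float:
    """Reduce inherent risk by the mapped controls' effectiveness (0.0-1.0)."""
    return inherent * (1.0 - control_effectiveness)

# Example: a payments process rated likelihood 4, impact 5,
# with mapped controls judged 60% effective.
inherent = inherent_risk(likelihood=4, impact=5)   # 20.0
residual = residual_risk(inherent, 0.6)            # 8.0
print(inherent, residual)
```

Run repeatedly (for instance quarterly, with scores collected automatically), this kind of calculation is what turns the once-a-year exercise into a trackable baseline.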


Making the case for an Enterprise Architect in Digital Transformation programs

Large transformation work needs to be addressed in 3 buckets – Plan, Build & Run. While “Build” takes the largest portion of investment in any transformation program, organizations are obviously required to safeguard whatever has been built with minimal “Run” budgets. “Build” and “Run” are cyclical in nature: you build, then you maintain (run), then you build more or build something new, and then maintain (run) that in turn. So it is natural for leaders responsible for large transformations to think of Build as the starting point and the transition to Run as the end. Senior leaders, however, need to see the things being built (and run) as building blocks of a vision or journey. This is the Planning function (which is strategic). The “Plan” function needs to be executed by someone who understands the business vision and maps the journey to that vision. S/he does that by identifying business capabilities and the value chain (not business processes), and then developing the strategy as building blocks. The skill required to do this is called “Enterprise Architecture”.


EU Cyber Resilience Act: Good for Software Supply Chain Security, Bad for Open Source?

With all of the good that the CRA brings in evolving the regulatory conversations past SBOMs, the current draft has some problematic language that could actually hurt the future of open source. But first, what it gets right about open source. Page 15, Paragraph 10 attempts to exempt, or carve out, open source software (OSS) from the regulations, saying: In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation. This is in particular the case for software, including its source code and modified versions, that is openly shared and freely accessible, usable, modifiable and redistributable. This is good, even great. OSS and project maintainers should be exempt from these regulations that apply liability, as this will have the effect of quashing innovation and sharing of ideas via code. However, in the same paragraph, the CRA attempts to draw a line between commercial and non-commercial use of open source software:


Finding the Right Data Governance Model

It is critical to distinguish “governance” from “management” in the context of Data Governance. The principal difference is that governance refers to the decisions that must be made, and who must make them, to ensure effective resource allocation and management of data operations. Data Management, on the other hand, involves implementing those decisions, which arise from assessing and monitoring either existing controls or an environment that includes advances in technology and the market. The activities required for Data Governance can therefore be distinguished from those needed for Data Management, since management is guided by governance. Data Governance is the oversight of Data Management activities, ensuring that data policy and ownership are enforced in the organization. The emphasis is on formalizing the Data Management function and the associated data ownership roles and responsibilities.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - December 23, 2022

Why the industrial metaverse will eclipse the consumer one

The industrial metaverse is further ahead on the 3D front, with simulations and digital twins. It is also ahead on the standards front, with companies like Nvidia pushing potential standards such as Universal Scene Description (USD) through its Omniverse platform. USD has been characterized as doing for the metaverse what HTML did for the internet. In this regard, USD can lead to greater interoperability, [connecting] formerly disparate applications or ecosystems … to make workflows more seamless. ... Digital assets, similarly, are typically locked to a particular ecosystem, servicer or game. Many of the most transformative opportunities in the consumer space will also come with mainstream smart glasses, which are still years away, so it will be some time before we see a stronger impact. The enterprise and industrial metaverses are also better grounded in ROI, meaning more trials and initial deployments have a higher potential to succeed or lead to more adoption compared to consumer efforts, which have seen more pushback, such as the addition of NFTs in games in Western markets [gaining] limited traction.


Surviving the Incident

The next step in the IR playbook is to identify the "crown jewels" of the organization — the critical systems, services, and operations that, if impacted by a cyber event, would disrupt business operations and cause a loss of revenue. Similarly, the types of data collected, how they are transmitted and stored, and who should have access to them must be mapped to ensure data security. Identifying and mapping critical systems can be accomplished through penetration tests, risk assessments, and threat modeling. A risk assessment is often the first tool used to identify potential attack vectors and prioritize security events. However, to achieve a proactive stance, organizations are increasingly leveraging threat intelligence and threat modeling to identify and address vulnerabilities and security gaps before a known attack occurs. The primary goal is to identify weaknesses or vulnerabilities in assets to reduce the attack surface and close the security gaps. This guide will focus on web application security as our attack scenario. Why web application security?


Not everything we call AI is actually 'artificial intelligence'. Here's what you need to know

Most of what we know as AI today has narrow intelligence – where a particular system addresses a particular problem. Unlike human intelligence, such narrow AI intelligence is effective only in the area in which it has been trained: fraud detection, facial recognition or social recommendations, for example. AGI, however, would function as humans do. For now, the most notable example of trying to achieve this is the use of neural networks and “deep learning” trained on vast amounts of data. Neural networks are inspired by the way human brains work. Unlike most machine learning models that run calculations on the training data, neural networks work by feeding each data point one by one through an interconnected network, each time adjusting the parameters. As more and more data are fed through the network, the parameters stabilise; the final outcome is the “trained” neural network, which can then produce the desired output on new data – for example, recognising whether an image contains a cat or a dog. The significant leap forward in AI today is driven by technological improvements in the way we can train large neural networks, readjusting vast numbers of parameters in each run thanks to the capabilities of large cloud-computing infrastructures.
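As a sketch of the training loop described here – one data point at a time, parameters nudged at each step until they stabilise – the following trains a single artificial neuron by gradient descent on a toy two-cluster problem. The data, learning rate, and one-neuron architecture are illustrative assumptions, vastly smaller than the large networks and cloud infrastructures discussed.

```python
import math, random

# Toy illustration of neural-network training: feed data points one by one
# through a single neuron and adjust its parameters (weights, bias) each time.
random.seed(0)

# Two 2-D clusters: label 0 around (0, 0), label 1 around (3, 3).
data = [((random.gauss(0, 0.5), random.gauss(0, 0.5)), 0) for _ in range(50)] + \
       [((random.gauss(3, 0.5), random.gauss(3, 0.5)), 1) for _ in range(50)]

w = [0.0, 0.0]   # weights, the trainable parameters
b = 0.0          # bias
lr = 0.1         # learning rate: how far to nudge parameters per point

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation, output in (0, 1)

for _ in range(20):                 # a few passes over the data
    for x, y in data:               # one data point at a time
        err = predict(x) - y        # gradient of the log loss w.r.t. z
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print(round(predict((0, 0))))  # → 0
print(round(predict((3, 3))))  # → 1
```

After the parameters stabilise, the "trained" neuron classifies new points it never saw during training, which is the essence of the cat-versus-dog example in the text, just in two dimensions instead of millions.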


Metaverse Security Concerns Coming Into Focus as Businesses Plan For “Virtual Reality” Futures

Organizations smell potential here, with 23% responding that they are already developing initiatives even as basic specifications are still firming up. Of the respondents that expressed a desire to do business in the metaverse, the leading interest (44%) was customer engagement opportunities. Other popular areas are learning/training measures and workplace collaboration. But when asked about their concerns about expanding into this new area, respondents said that metaverse security was item #1 on the list. By and large, today’s security solutions have not yet considered the prospect of metaverse integration. Nevertheless, 86% of the respondents said that they would feel comfortable sharing user personal information between different metaverse services. Security providers may be waiting to see what users settle on in the metaverse before tailoring their products accordingly. Of the products available thus far, online games are the only ones drawing mass amounts of users (particularly the pre-existing Roblox and Fortnite) along with simple 3D world chat apps that allow users to appear as an avatar.


What’s next for AI

The big companies that have historically dominated AI research are implementing massive layoffs and hiring freezes as the global economic outlook darkens. AI research is expensive, and as purse strings are tightened, companies will have to be very careful about picking which projects they invest in—and are likely to choose whichever have the potential to make them the most money, rather than the most innovative, interesting, or experimental ones, says Oren Etzioni, the CEO of the Allen Institute for AI, a research organization. That bottom-line focus is already taking effect at Meta, which has reorganized its AI research teams and moved many of them to work within teams that build products. But while Big Tech is tightening its belt, flashy new upstarts working on generative AI are seeing a surge in interest from venture capital funds. Next year could be a boon for AI startups, Etzioni says. There is a lot of talent floating around, and often in recessions people tend to rethink their lives—going back into academia or leaving a big corporation for a startup, for example.


How to Innovate by Introducing Product Management in SMB and Non-Tech Companies

It’s common to find product managers and product owners in SaaS, technology, ecommerce, retail, and other B2C companies. Leadership in these companies realized long ago that understanding markets, determining product-market fit, defining customer personas, and understanding value propositions are all key to developing minimum viable solutions and delivering ongoing product enhancements. But identifying product managers and owners in non-tech companies, B2B businesses, SMBs, and the government remains a long-running work in progress. Innovation starts with transforming stakeholder-led backlogs into product-managed, market-driven roadmaps. Tech, media, and ecommerce companies figure this out right away, because chasing stakeholder-driven features often yields subpar results. More traditional businesses are likely to misdiagnose the problems with stakeholder-driven backlogs as a technology execution or platform issue. But there are a few secrets to making product management work even in the most traditional businesses.


IT Job Market: 2022's Wild Ride and What to Expect for 2023

Even as those layoff announcements were rolling in, the US Bureau of Labor Statistics job report for October showed a strong job market for tech pros and continued growth for remote jobs. In November that growth continued with IT industry association CompTIA reporting that US tech companies added 14,400 workers during the month, marking two consecutive years of monthly job growth in the sector. Tech jobs in all industry sectors increased by 137,000 positions. And while job postings for future hiring slipped in November, they still totaled nearly 270,000. As the tech sector heads into a changed 2023 employment market, it’s unclear how all these mixed signals will play out, although experts are starting to weigh in on best practices. Employers are likely looking carefully at budgets and head counts. But it will be a challenging line to walk. Employers have spent the past few years investing in employee experience programs and focusing on retaining their valuable talent. An abrupt change in direction such as mass layoffs will likely sour companies’ reputations as employers.


Inside the Next-Level Fraud Ring Scamming Billions Off Holiday Retailers

Besides the operation being stacked with technology know-how, Michael Pezely, Signifyd's director of risk intelligence, tells Dark Reading that the e-commerce threat group has sheer speed and volume of scam transactions on its side. "E-commerce orders — particularly at the enterprise level — arrive at dizzying speed," Pezely says. "Signifyd, for instance, processed as much as $42 million an hour in orders during Cyber Week. It would be virtually impossible for a human team to review that volume of orders for signs of fraud." Pezely added that merchants are on the lookout for goods being shipped to a foreign country, but this group of scammers places orders that appear to originate from the US and ship to US addresses. "Furthermore, if a merchant is relying on only its own transaction data, there likely will be a lag between the time a fraud attack begins and when it is recognized," Pezely explains. "Without having the benefit of seeing millions of transactions across thousands of merchants, a novel fraud attack might not be in plain sight for some time."


Protecting your organization from rising software supply chain attacks

The reason for the continued bombardment, said Moore, is increasing reliance on third-party code (including Log4j). This makes distributors and suppliers ever more vulnerable, and vulnerability is often equated with a higher payout, he explained. Also, “ransomware actors are increasingly thorough and use non-conventional methods to reach their targets,” said Moore. For example, using proper segmentation protocols, ransomware agents target IT management software systems and parent companies. Then, after breaching, they leverage this relationship to infiltrate the infrastructure of that organization’s subsidiaries and trusted partners. “Supply chain attacks are unfortunately common right now in part because there are higher stakes,” said Moore. “Extended supply chain disruptions have placed the industry at a fragile crossroads.” Supply chain attacks are low cost and can be minimal effort and have potential for high reward, said Crystal Morin, threat research engineer at Sysdig. And, tools and techniques are often readily shared online, as well as disclosed by security companies, who frequently post detailed findings.


Why User Journeys Are Critical to Application Detection

The first generation of cybersecurity detection technology is rules, but rules only detect known patterns, and individualized rules require expensive experts to maintain: each application is unique, and one must be extremely familiar with its business logic, log formats, usage patterns, and so on in order to write and manage rules for detecting application breaches. ... Over a decade ago, the security market adopted statistical analysis to augment rule-based solutions in an attempt to provide more accurate detection for the infrastructure and access layers. However, UEBA failed to deliver its promised dramatic increase in accuracy and reduction in false-positive alerts, due to a fundamentally mistaken assumption: that user behavior can be characterized by statistical quantities, such as the average daily number of activities. ... The main criterion for success in a detection solution is accuracy, which is dictated by the number of false positives and the number of false negatives. The evolution of detection solutions led to a third generation that analyzes sequences of activity, i.e., journeys, to contextualize activity and improve detection accuracy.
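The contrast between counting activities and analyzing their sequence can be made concrete with a small sketch. This is an illustrative toy, not any vendor's detection engine: the journeys, the action names, and the choice of a simple bigram model (flagging any consecutive pair of actions never seen in normal traffic) are all assumptions for the example.

```python
# Toy sketch of sequence-based ("journey") detection: learn which consecutive
# pairs of actions occur in normal user journeys, then flag journeys that
# contain an unseen pair. All data and model choices here are illustrative.

normal_journeys = [
    ["login", "view_account", "pay_bill", "logout"],
    ["login", "view_account", "download_statement", "logout"],
]

# Learn the set of action bigrams observed in normal journeys.
known_pairs = {
    (j[i], j[i + 1])
    for j in normal_journeys
    for i in range(len(j) - 1)
}

def journey_is_anomalous(journey):
    """Flag a journey containing a consecutive pair never seen in training."""
    return any(
        (journey[i], journey[i + 1]) not in known_pairs
        for i in range(len(journey) - 1)
    )

# The same activities a daily-count model might pass, but in an odd order:
suspicious = ["login", "pay_bill", "pay_bill", "logout"]
print(journey_is_anomalous(["login", "view_account", "pay_bill", "logout"]))  # False
print(journey_is_anomalous(suspicious))  # True
```

Note that a statistical model of per-day activity counts would see nothing unusual in the suspicious journey – same user, same four actions – while the sequence view catches that "pay_bill" directly after "login" never happens in legitimate traffic.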



Quote for the day:

"Before you revel in the anticipation of tomorrow, toil in the preparation of today." -- Tim Fargo

Daily Tech Digest - December 22, 2022

Data forecast for 2023: Time to extract more value

Using data effectively relies in large part on being able to properly manage and control how data is used. That's where data governance comes into play, with tools and technologies that help organizations govern the data they use. Data governance will have an expanded role in 2023, according to Eckerson Research analyst Kevin Petrie. There will be a growing use of ML technologies to improve data governance technology by helping to automate processes and policies for data. Petrie said he also expects a rising number of data governance platforms to help organize, document and apply policies to ML models alongside other data assets in 2023. Benefitting from data to improve business outcomes entails collecting product and service data. That's where the concept of data as a product -- also referred to as data product -- will have growing relevance in 2023. Barr Moses, CEO of data observability vendor Monte Carlo, predicted that nearly every product will become a data product as organizations seek to optimize operations. "In 2023, more and more companies will seek to integrate ways to track and monetize data generated by their products as part of their core offerings to drive competitive advantage," Moses said.


The Future of Skills: Preparing for Industry 4.0 and Beyond

Industry 4.0—the Industrial Internet of Things, or the 4th Industrial Revolution, as it is popularly known—has arrived with opportunities and challenges that have the potential to transform the marketplace completely. Industry 4.0 refers to the “smart” and connected production systems that are designed to sense, predict and interact with the physical world so as to make decisions that support production in real time, increasing productivity, energy efficiency and sustainability. McKinsey estimates that IoT has the potential to unlock economic value somewhere between US$5.5 trillion and $12.6 trillion by 2030. Therefore, with so many changes happening so quickly, neither employers nor employees (both employed and yet to be employed) can afford to ignore them or to stay in their comfort zone following the same old practices or skills. A report by the World Economic Forum states that 84 percent of employers are set to rapidly digitalize working processes with the potential to move 44 percent of their workforce to operate remotely, and the top skills needed as we lead up to 2025 are critical thinking and analysis, problem solving, active learning, resilience, stress tolerance and flexibility.


What is DataOps? Collaborative, cross-functional analytics

Enterprises today are increasingly injecting machine learning into a vast array of products and services and DataOps is an approach geared toward supporting the end-to-end needs of machine learning. “For example, this style makes it more feasible for data scientists to have the support of software engineering to provide what is needed when models are handed over to operations during deployment,” Ted Dunning and Ellen Friedman write in their book, Machine Learning Logistics. “The DataOps approach is not limited to machine learning,” they add. “This style of organization is useful for any data-oriented work, making it easier to take advantage of the benefits offered by building a global data fabric.” ... Because DataOps builds on DevOps, cross-functional teams that cut across “skill guilds” such as operations, software engineering, architecture and planning, product management, data analysis, data development, and data engineering are essential, and DataOps teams should be managed in ways that ensure increased collaboration and communication among developers, operations professionals, and data experts.


Amplified security trends to watch out for in 2023

Cybercriminals target employees across different industries to surreptitiously recruit them as insiders, offering them financial enticements to hand over company credentials and access to systems where sensitive information is stored. This approach isn’t new, but it is gaining popularity. A decentralized work environment makes it easier for criminals to target employees through private social channels, as the employee does not feel that they are being watched as closely as they would in a busy office setting. Aside from monitoring user behavior and threat patterns, it’s important to be aware of and be sensitive about the conditions that could make employees vulnerable to this kind of outreach – for example, the announcement of a massive corporate restructuring or a round of layoffs. Not every employee affected by a restructuring suddenly becomes a bad guy, but security leaders should work with Human Resources or People Operations and people managers to make them aware of this type of criminal scheme, so that they can take the necessary steps to offer support to employees who could be affected by such organizational or personal matters.


How deep learning will ignite the metaverse in 2023 and beyond

Currently, the digital realities being developed by different companies have their own attributes and integrated functionalities, and are at different development levels. Many of these multiverse platforms are expected to converge, and this junction is where AI and data science domains, such as deep learning, will be critical in taking users to a new stage in their metaverse journey. Success in these endeavors will be contingent upon understanding vital elements of the algorithmic models and their metrics. Deep learning-based software is already being integrated into virtual worlds; some examples include autonomously driving chatbots and other forms of natural language processing to ensure seamless interactions. For another example, in AR technology, deep learning-enabled AI is used in camera pose estimation, immersive rendering, real-world object detection and 3D object reconstruction, helping to guarantee the variety and usability of AR applications. ... “Companies have an interesting opportunity for their customers and community to interact with their brand(s) in new and exciting ways, and deep learning-based artificial intelligence plays a major role in facilitating those experiences,” said Stephenson.


Introducing Cadl: Microsoft’s concise API design language

Microsoft has begun to move much of its API development to a language called Cadl, which helps you define API structures programmatically before compiling to OpenAPI definitions. The intent is to do for APIs what Bicep does for infrastructure, providing a way to repeatably deliver API definitions. By abstracting design away from definition, Cadl can deliver much more concise outputs, ensuring that the OpenAPI tool in platforms like Visual Studio can parse it quickly and efficiently. What is Cadl? At first glance it’s a JavaScript-like language with some similarities to .NET languages. Microsoft describes it as “TypeScript for APIs,” intending it to be easy to use for anyone familiar with C#. Like Microsoft’s other domain-specific languages, Cadl benefits from Microsoft’s long history as a development tools company, fitting neatly into existing toolchains. You can even add Cadl extensions to the language server in Visual Studio and Visual Studio Code, ensuring that you get support from built-in syntax highlighting, code completion, and linting. Making Cadl a language makes a lot of sense; it allows you to encapsulate architectural constraints into rules and wrap common constructs in libraries. 


CIOs in 2023: Guiding Business Strategies Through Data-Driven Decisions

“CIOs need to take on a data mindset by first understanding the data, and then determining how critical the data architecture and data governance is,” he says. For understanding the business process, they need to think about how they can move the needle for the company, prioritize the projects that drive business, and implement or evolve the systems they already have. “The third important thing is building business partnerships across the organization,” Kancharla adds. “Having all levels of relationships will go a long way for the CIOs to be successful. The last thing is really thinking of what optimizations they can bring to the company, especially next year.” He points out that next year, every company will have to bring down costs, which means streamlining and optimizing the software within the company and deploying the tools they already have to the full potential. Segovia adds effective CIOs must also be able to understand the tech and recommendations their teams are executing on. “They need to understand areas in a reasonably deep manner in order to lead teams of wide technical and digital acumen,” he says.


Social media use can put companies at risk: Here are some ways to mitigate the danger

The concern is that foreign-owned applications might share the information they collect with government intelligence agencies. That information includes personally identifiable information (PII), keystroke patterns, location information based on SIM card or IP address, app activity, browser and search history, and biometric information. Personal use of social media by employees can impact the company’s brand as well as endanger the firm or employees themselves—bad actors could use social media to identify where a person works, the division in which they work, and possibly their physical location. The potential harm is higher for high-risk employees such as senior executives or those with authority to execute financial transactions. Of course, there are plenty of good reasons for employees to use social media. It can enhance marketing campaigns, announce news or critical information, and otherwise raise the profile of an organization. Social media channels can be used to monitor risks and threats against a government or critical infrastructure. 


The power of generosity in ecosystems

A traditional approach to competition, rooted in the business mindset of one company gaining an advantage over another, can make it difficult to play in an ecosystem as a participant. For example, one of the risks of being part of an ecosystem is the dependency on its orchestrator. Increased reliance on Big Tech and the consolidation of many industries have created an increased risk of a few powerful cash-generator businesses that need to reward shareholders with consistent, attractive margins and will not think twice about burdening their partners to keep those margins—for example, by asking for discounts in exchange for participating in the ecosystem. But what if there was more of a sense of mutual collaboration? Benjamin Gomes-Casseres of Brandeis University has published research with Harvard Business Review Press on different business combinations (his term for business ecosystems). He states that for an ecosystem to logically exist, the players within an ecosystem must fairly share the benefits, creating added value for the entire ecosystem that exceeds the level of value each company could create independently.


6 BI challenges IT teams must address

There can be obstacles, however, to taking the self-service approach. Having too much access across many departments, for example, can result in a kitchen full of inexperienced cooks running up costs and exposing the company to data security problems. And do you want your sales team making decisions based on whatever data it gets, and having the autonomy to mix and match to see what works best? Central, standardized control over tool rollout is key. And to do it correctly, IT needs to govern the data well. Because of these tradeoffs, organizations must ensure they select the BI approach best suited to the business application at hand. “We have more than 100,000 associates in addition to externals working for us, and that’s quite a large user group to serve,” says Axel Goris, global visual analytics lead at Novartis, the multinational pharmaceutical corporation based in Basel, Switzerland. “A key challenge was organization around delivery — how do you organize delivery, because a pharmaceutical company is highly regulated.” An IT-managed BI delivery model, Goris explains, requires a lot of effort and process, which wouldn’t work for some parts of the business.



Quote for the day:

"Nothing so conclusively proves a man's ability to lead others as what he does from day to day to lead himself." -- Thomas J. Watson

Daily Tech Digest - December 21, 2022

The Cybersecurity Industry Doesn't Have a Stress Problem — It Has a Leadership Problem

Many of the cybersecurity issues raised in the CIISec survey point to a need for strong leadership that proactively identifies and resolves issues. But cybersecurity teams need servant leaders, not those who lead by establishing command and control structures. Servant leaders create authority by — you guessed it — serving their employees. Cybersecurity executives of this ilk are concerned about the well-being of the team, regularly checking in with team members on how they are doing, and removing roadblocks that harm operational performance. They'll go to bat with upper management to get an increased budget for new tools and additional staff to smooth out workloads for teams. Servant leaders take turns serving on call to understand work conditions from analysts' perspectives and hold regular team meetings to discuss key trends and issues. They're also likely to look ahead to anticipate market and business developments and reposition their organization to get ready to meet them. As a result, these leaders' teams feel supported. Analysts are not afraid to share problems or new ideas, as they know their leaders will listen, consider them carefully and, most importantly, respond.


Cybersecurity: What is Changing and What Isn’t

A lot of things have changed, but a lot remain the same. Adversaries have gotten smarter, so defense has had to do the same. Every piece of technology has a computer embedded in it nowadays – cars, fridges, thermostats, cameras, speakers, and of course, the ubiquitous mobile phones – resulting in a vastly increased attack surface, and the need for trained professionals to protect this Internet of Things (IoT). The general migration to the cloud has also encouraged the growth of professionals seeking to protect data outside the confines of on-prem systems. However, some core tenets still hold true – restricting user access, limiting system functionality, backing up critical data, planning for disruptions, and of course, security awareness training. Even the best of security controls can be overcome by a user clicking on the wrong link (phishing), visiting the wrong website (drive-by download), connecting to the wrong network (rogue access point), opening the wrong attachment (malicious macro), letting in the wrong person in a secured area (tailgating), or just simply, disclosing the right information to the wrong person (vishing).


Intro to the Observable design pattern

The Observable design pattern is used in many important Java APIs. One well-known example is a JButton that uses the ActionListener API to execute an action. In this example, we have an ActionListener listening or observing on the button. When the button is clicked, the ActionListener performs an action. The Observable pattern is also used with reactive programming. The use of observers in reactive applications makes sense because the essence of reactive is reaction: something happens when another process occurs. Observable is a behavioral design pattern. Its function is to perform an action when an event happens. Two common examples are button clicks and notifications, but there are many more uses for this pattern. ... By using the Observable pattern, the notification would happen only once to all of your subscribers. It's a huge performance gain as well as an effective code optimization. This code can easily be extended or changed. The reactive programming paradigm uses the Observable pattern everywhere. If you have ever worked with Angular, you will know that using Observable components is very common. 
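The notify-once, fan-out-to-all-subscribers behavior described above can be sketched in plain Java; the class and method names below are illustrative rather than taken from any particular API (the JDK's own `java.util.Observable` is deprecated):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal sketch of the Observable pattern: a subject keeps a list of
// subscribers and a single publish call notifies each of them exactly once.
class NewsFeed {
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    void subscribe(Consumer<String> subscriber) {
        subscribers.add(subscriber);
    }

    // One event, one pass over the subscriber list; callers never poll.
    void publish(String headline) {
        for (Consumer<String> s : subscribers) {
            s.accept(headline);
        }
    }
}
```

This is the same shape you see in reactive libraries: subscribers register once, and the subject pushes each event to all of them, instead of every consumer repeatedly asking whether anything has changed.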


How to Embed Gen Z in Your Organization’s Security Culture

Providing the most cutting-edge instruction will engage Gen Zers and provide them with meaningful security best practices for work and home. The threat landscape is more dangerous than it was when Gen Zers were coming of age. Current threats extend beyond traditional scams. They may be lurking in the unsecured WiFi available at a coffee shop. All the threat actor needs is someone desperate for free internet and tired of clicking checkboxes. With that ever-changing threat landscape in mind, your organization’s security program needs the resilience to adapt. The IBM Security X-Force Cyber Range provides a variety of experiences to prepare organizations for a cyber incident. The team can also cater content to different audiences, such as the C-suite or the board of directors. Gen Z may not be a part of those groups yet, but the X-Force Cyber Range offers a range of experiences for professionals at all levels. The X-Force Cyber Range team tailors immersive experiences to your organization’s industry and context to provide the most realistic scenario. 


Intelligence and Efficiency Will Guide Unstructured Data Management in 2023

Smarter edge data management will avoid overspending on storing extraneous data in cloud data lakes and warehouses by filtering and deleting non-valuable data at the edge first. Edge analytics tools will quickly process the data without the need to send large files back and forth to cloud or on-premises data centers, saving time and money. The right edge analytics and data management program can deliver real-time insights to improve customer experiences or detect issues quickly, such as a manufacturing defect or a ransomware breach. ... Storage and IT managers will need to prepare by getting full visibility into data across silos, understanding data characteristics and metadata to enable rapid classification and search, and then moving it into the optimal storage tier to feed the data lake and analytics platforms preferred by their end users. IT will need to work closely with stakeholders from security, legal, data governance, research, and data science teams, as well as business unit leaders, to fulfill the requirements of new, unstructured data analytics programs.


The FBI is worried about a wave of cyber crime against America’s small businesses

Small and medium-sized businesses face a big threat from cyberattacks and hackers, according to a special agent in the FBI’s cyber division. “The large businesses continue to invest in their cybersecurity and enhance their cybersecurity posture,” FBI Supervisory Special Agent Michael Sohn said at CNBC’s Small Business Playbook virtual event on Wednesday. “So what the cybercriminals are doing is they’re pivoting, they’re evolving and targeting the soft targets, which are the small and medium businesses.” In 2021, the FBI’s Internet Crime Complaint Center (IC3) received 847,376 complaints from the American public regarding cyberattacks and malicious cyber activity, a 7% year-over-year increase. In total, potential losses from those attacks exceeded $6.9 billion, a 64% increase compared to the previous year. “Unfortunately, the majority of those victims were small businesses,” Sohn told CNBC’s Frank Holland. But even as small businesses are increasingly being targeted by hackers and cyber criminals, CNBC and SurveyMonkey data has shown that most small business owners are not concerned.


Healthcare: Essential Defenses for Combating Ransomware

From a defensive standpoint, Siegel says organizations can employ a long list of tactics. Leading up to ransomware, the biggest weakness he sees is a cultural issue, centered on failing to take the risk seriously and make appropriate investments to prevent such incidents. "These are the times we live in, and it's just the cost of doing business," he says. "You have to make these investments." Ransomware attackers gain remote access to a victim's network and typically linger, studying the network and gaining greater access, before deploying crypto-locking malware. Thus, it's imperative to spot those activities before files start getting encrypted. "Most groups now will also want to steal large amounts of data before they launch the ransomware, and then they'll actually plan out how they're going to deploy the ransomware to all of your servers, all of your machines or whichever ones they choose," says Peter Mackenzie, director of incident response at Sophos. "That's not something that happens instantly. That can take days or weeks of preparation."


Engineering AI-Enabled Computer Vision Systems: Lessons From Manufacturing

While traditional non-AI software acts as a tool to execute preset rules, an AI-enabled system makes decisions based on (past) data and probabilistic outcomes, which constitutes a paradigm shift—especially within traditional manufacturing organizations. Therefore, proven software development approaches need to be extended to build and further evolve systems that contain ML components. One example is DevOps, which needs to be extended into DataOps or MLOps when developing AI solutions to meet specific requirements of handling the ever-changing data. Engineering AI-enabled computer vision systems goes beyond merely building AI algorithms. To build industrial solutions, these AI algorithms need to be embedded into mature software products, which also poses novel challenges for software engineers. To provide an overview of challenges and success factors in engineering AI-enabled computer vision systems, we analyzed corresponding manufacturing use cases, shadowed project meetings, and incorporated our own expertise.


IT Industry Outlook 2023: Trends Likely to Impact the Industry and Tech Pros

Employers are no longer restricted to hiring candidates who are within a commutable distance of local offices, giving job hunters an opportunity to apply for roles that may not have been open to them previously. “I believe with the continued prevalence of remote working, hiring decisions will become less based on culture fit and similar criteria, and more focused on skills and performance,” Finnigan says. “This will open the door to a much more globally diverse workforce, provided skills gaps continue to close.” ... Replacing early interview screenings with skills-based assessments that mimic a company's tech stack allows hiring managers to assess candidates’ compatibility quickly and accurately, moving only the best through the pipeline. “With this approach, hiring managers can spend more time with candidates who are truly qualified, which can lead to a more accurate decision and a faster time-to-hire,” Finnigan says. Westfall says that smaller organizations may be able to offer IT pros looking for a change of pace an assortment of unique perks, as well as a close-knit company culture and a greater impact on local communities.


APIs are placing your enterprise at risk

Stolen API keys are the culprit behind some of the largest cyberattacks to date. We see the headlines and we read the news stories, but we often fail to realize the broad consequences – particularly the notable impacts on enterprise mobile security. Consider the news earlier this year of 3,000+ mobile applications leaking Twitter’s API keys, meaning bad actors could compromise thousands of individual accounts and conduct a slew of nefarious activities. Imagine if this were your company, the roles were reversed, and hundreds or even thousands of mobile applications were leaking the API keys to your corporate Gmail, Slack or OneDrive accounts. If this or a similar scenario were to happen, employee devices and sensitive company data would be at extreme risk. The recent push to focus on API security comes at a critical time when more enterprises are relying on enterprise mobility, meaning an increasing reliance on mobile app connectivity. A recent survey of US and UK-based security directors and mobile applications developers found that 74% of respondents felt mobile apps were critical to business success.
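One common mitigation, sketched below in Java with hypothetical names, is to resolve credentials from the runtime environment instead of compiling them into the shipped application, so that a decompiled or inspected app has no key to leak:

```java
// Hypothetical sketch: fetch a service key from the environment at runtime
// rather than hardcoding it in source, and fail loudly if it is absent.
class ApiKeyProvider {
    static String requireKey(String envVar) {
        String key = System.getenv(envVar);
        if (key == null || key.isEmpty()) {
            // Failing fast beats silently shipping an unauthenticated client.
            throw new IllegalStateException("Missing credential: " + envVar);
        }
        return key;
    }
}
```

For mobile apps specifically, even environment-style injection is not enough on its own; keys that must live on the device are better replaced by short-lived tokens issued by a backend the app authenticates to.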



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick

Daily Tech Digest - December 20, 2022

Ransomware: It’s coming for your backup servers

Backup and recovery systems are at risk for two types of ransomware attacks: encryption and exfiltration – and most on-premises backup servers are wide open to both. This makes backup systems themselves the primary target of some ransomware groups, and warrants special attention. Hackers understand that backup servers are often under-protected and administered by junior personnel who are less well versed in information security. And it seems no one wants to do something about it lest they become the new backup expert responsible for the server. This is an age-old problem that can allow backup systems to pass under the radar of sound processes that protect most servers. It should be just the opposite. Backup servers should be the most up-to-date and secure systems in the data center. They should be the hardest to log in to as Administrator or root. And they should require jumping through the most hoops to log in to remotely. An important role backup servers play is providing the means to recover from a ransomware attack without paying the ransom. 


How We Improved Application’s Resiliency by Uncovering Our Hidden Issues Using Chaos Testing

The goal of chaos engineering is to educate and inform the organization of unknown vulnerabilities and previously unanticipated outcomes of a computer system. A primary focus of these complex testing procedures is to identify hidden problems in production environments before they cause an outage failure outside of the organization’s control. Only then can the disaster recovery team address systematic weaknesses and enhance the system’s overall fault-tolerance and resiliency. Hence, chaos testing is carried out at various levels. ... Chaos testing is a new concept, but we always had the mindset to perform it, and we sometimes did perform it without knowing that it was chaos testing. It has its own principles, benefits and pitfalls. However, I would advise all teams to weigh the pros and cons of conducting these tests before formulating a plan. You should be very clear as to what you want to achieve from these disruptive tests. Get permission from your bosses and convince them why it is important to carry out these tests. 


How Our Behavioral Bad Habits Are a Community Trait and Security Problem

Internal naming groups and conventions become exposed to the outside world in a variety of ways. They're buried in website code, detailed in technical documentation or as part of APIs, or just simply published in public system information. Admittedly, this is a very large haystack, but finding the needles is exactly what the patent I was involved in (US Patent 10,515,219) endeavors to do. Site-scanning tools collect a range of information, and unsurprisingly, an overload of information. My approach strips out all the technical programming information (such as markup, JavaScript, etc.), and leaves just words. It then compares results with lists of English words. The algorithm then identifies groupings of words or abbreviations not present in the selected language that, presumably, may signify an internal naming convention or credentials. As is common with brute-force campaigns, it may not, but as the axiom goes, the attacker only needs to be right once, so the ability to generate context-sensitive word lists may make or break your next campaign. This is when the picture may start to become clearer and the shape of things such as user groups, system names, etc., manifest.
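The patented approach itself is not public code, so the following Java sketch, with hypothetical names, only illustrates the general idea described above: strip the markup, tokenize what remains, and flag tokens that are absent from an English word list as candidate internal names:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical sketch of dictionary-based leak spotting: anything word-like
// that survives markup removal but is not an English word may be an internal
// naming convention (host names, group names, credentials).
class NamingLeakScanner {
    private final Set<String> dictionary;

    NamingLeakScanner(Set<String> dictionary) {
        this.dictionary = dictionary;
    }

    List<String> suspects(String pageSource) {
        // Crude markup removal: replace tags with spaces, keep the text.
        String text = pageSource.replaceAll("<[^>]*>", " ");
        List<String> flagged = new ArrayList<>();
        for (String token : text.toLowerCase().split("[^a-z0-9_-]+")) {
            if (token.length() > 2 && !dictionary.contains(token)) {
                flagged.add(token); // not an English word: possible internal name
            }
        }
        return flagged;
    }
}
```

A production version would need a real English word list, smarter markup and script stripping, and abbreviation handling, but the core comparison is this simple.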


The Agile Compromise Calls for Courage

We cannot eliminate the risk of building the wrong thing with better design and more focus groups. We can only do it by shorter and more frequent iterations of a working product that is incomplete and maybe not that great yet. That’s the Agile compromise. Shorter cycles decrease the risk of building the wrong thing but increase the risk of degrading the process. Accruing technical debt is one such risk. Not a necessary consequence, just a standard price to pay for quicker deliveries. No pundit with stories about the constant commitment to quality will convince me otherwise. If you want greater speed, accept more risks. The Agile compromise towards risk-taking also recognizes that software, as a creative discipline, by nature exposes black swans: the risks we didn’t know we would ever run into. No engineering approach provides full reassurance against them, nor can testing and validation ever give you full peace of mind. It’s a little bit scary, but if you pride yourself on an Agile mindset, you must embrace it. Software is complex rather than complicated. Its many moving parts behave in unpredictable ways when unleashed on the world. Risks are a natural part of that property.


GPT: High-tech parlor trick or the first real AI for everyday use?

In many cases, such as ChatGPT, AI is still a parlor trick that will enthrall us until the next trick comes along. In some cases, it’s a useful technology that can augment both human and machine activities through incredibly fast analysis of huge volumes of data to propose a known reaction. You can see the promise of that in the GPT-fueled Copysmith.AI even as you experience the Potemkin village reality of today. At a basic level, AI is pattern matching and correlation done at incredible speeds that allow for fast reactions — faster than what people can do in some cases, like detecting cyberattacks and improving many enterprise activities. The underlying algorithms and the training models that form the engines of AI try to impose some sense onto the information and derived patterns, as well as the consequent reactions. AI is not simply about knowledge or information, though the more information it can successfully correlate and assess, the better AI can function. AI is also not intelligent like humans, cats, dogs, octopi, and so many other creatures in our world.


How you can stop corporate login credential theft

Organizations should take a layered approach to credential management. The goal is to reduce the number of sites users have to put passwords into. Organizations should endeavor to implement single sign-on (SSO) for all reputable necessary work applications and websites. All SaaS providers should support SSO. If there are logins that require different credentials, a password manager would be helpful in the interim. This also provides a way for employees to know if a login page can be trusted, as the password manager won’t offer credentials up for a site it does not recognize. Organizations should also enable multi-factor authentication (MFA) to secure logins. FIDO2 is also gaining adoption. It will provide a more robust solution than traditional authenticator apps, although those apps are still better than codes sent via text messages. Not all of this is foolproof, and risky login pages could slip through the net. A last resort is needed for flagging risky login pages to employees. This can be done by analyzing, in real time, threat intelligence metrics, webpage similarities, domain age and how users got to a login page. 


Security Risks, Serious Vulnerabilities Rampant Among XIoT Devices in the Workplace

The potential intent of assorted Chinese hardware manufacturers (such as Huawei and ZTE) led to a 2018 ban on use of their equipment by federal agencies. These devices remain widely in use in private organizations, however, and sometimes banned devices slip through dragnets via the process of “white labeling”. Organizations also often do not have visibility into the code that XIoT devices run on. When these devices draw on third-party firmware libraries, several possible security risks emerge. One is simply that the vendor will abandon support for the device, no longer issuing security patches to address emerging vulnerabilities. Another is that the code may be maintained by open source developers, who have the capability to insert malicious elements or even abandon or spike the project unexpectedly. A simple problem that has dogged XIoT devices from the very beginning also remains; the manufacturers are often not tech outfits and thus are not familiar with security by design elements, and/or do not have the budget in place to add them and still come in at their desired price points in competitive markets.


Understanding e-signatures: the key differences and requirements

A QES is considered to have more probative value than an AES, which means that courts will give it more weight as evidence. The first key difference is that qualified signatures offer a higher level of security than an AES. This is because qualified signatures are created using a qualified signature creation device (QSCD), which stores the signing key. Examples of physical QSCDs include smart cards, SIM cards or USB tokens. It’s also possible for signatories to create a QES without having a physical device in their hands. In this instance, signatories remotely access a signing key, which is stored in a trusted service provider’s data centre. This is often the preferred choice for organisations since it streamlines device management. A QES must also be based on a ‘qualified certificate for electronic signatures’, which is another key difference between an AES and a QES. Only ‘Qualified trust service providers’ (QTSPs) listed on the European Union’s trusted provider database can issue this certificate. To become a QTSP, organisations must successfully complete a series of evaluations and audits that ensure compliance with eIDAS regulations.


Top cloud strategy mistakes CIOs can’t help making

“Not architecting for the cloud,” says IDC analyst Dave McCarthy, when asked where CIOs commonly go wrong when building their cloud strategy. “While it is possible to ‘lift and shift’ existing workloads, enterprises often experience less than desirable costs and performance with this approach. You need to adapt applications to cloud-native concepts to realize the full value.” CIOs also often make the mistake of “not implementing enough automation,” says McCarthy, who is research vice president of cloud and edge infrastructure services for IDC. “Best practices in cloud include automating everything from the deployment of infrastructure and applications to management and security. Most outages or security breaches are the result of manual misconfigurations.” But perhaps the worst sin CIOs can make, analysts across the spectrum agree, is failing to plan for the shift in culture and skills required to devise and implement a successful cloud strategy. The cloud functions differently than traditional IT systems, and a cloud strategy requires not only new skills but a change in thinking about how to design and manage the environment, McCarthy says.


What is a business architect and how do you become one?

An enterprise architecture is composed of different kinds of components (business strategy and outcome, technology platforms and infrastructure, and security), and a business architecture encompasses how all these things come together to best serve the business. It is a component of enterprise architecture. My responsibility as a business architect is to manage the business architecture practice and its governance. I primarily focus on establishing standards and best practices for our team's deliverables and developing relationships within our organization. I also collect information on our business and map domains (including capabilities, value streams, information, and organization) according to the business architecture framework to gain insights. I think business architecture is foundational to organizations today. A strategy is a plan of action to achieve a goal. My team receives business ideas and potential projects that align with our organization's strategies and influence our performance as a leader in our market. 



Quote for the day:

"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer

Daily Tech Digest - December 19, 2022

7 ways CIOs can build a high-performance team

“People want to grow and change, and good business leaders are willing to give them the opportunity to do so,” adds Cohn. Here, you can get HR involved, encouraging them to bring their expertise and ideas to the table to help you come up with the right approach to training and employee development. In addition, it’s important to remember that an empathetic leader understands that people come from different places and therefore won’t grow and develop in the same manner. Modern CIOs must approach upskilling and training with this reality in mind, advises Benjamin Marais, CIO at financial services company Liberty Group SA. You also need to create opportunities that expose your employees to what’s happening outside the business, suggests van den Berg. This is especially true where it pertains to future technologies and skills because if teams know what’s out there, they better understand what they need to do to keep up. Given the rise in competition for skills in the market, you have to demonstrate your best when trying to attract top talent and retain them, stresses Cohn. 


10 Trends in DevOps, Automated Testing and More for 2023

Developers and QA professionals are some of the most sought-after skilled laborers who are acutely aware of the value they provide to organizations. As we head into next year, this group will continue to leverage the demand for their skills in pursuit of their ideal work environment. Companies that do not consider their developer experience and force pre-pandemic systems onto a hybrid-first world set themselves up for failure, especially when tools for remote and virtual testing and quality assurance are readily available. Developer teams also need to be equally equipped for success through the tools and opportunities that can help ensure an innate sense of value to the organization – and if they don’t have the tools they need, these developers will find them elsewhere. ... We’re starting to see consolidation in both the market and in the user personas we’re all chasing. Testing companies are offering monitoring, and monitoring companies are offering testing. This is a natural outcome of the industry’s desire to move toward true observability: deep understanding of real-world user behavior, synthetic user testing, passively watching for signals and doing real-time root cause analysis—all in service of perfecting the customer experience.


The beautiful intersection of simulation and AI

Simulation models can synthesize real-world data that is difficult or expensive to collect into good, clean and cataloged data. While most AI models run using fixed parameter values, they are constantly exposed to new data that may not be captured in the training set. If unnoticed, these models will generate inaccurate insights or fail outright, causing engineers to spend hours trying to determine why the model is not working. ... Businesses have always struggled with time-to-market. Organizations that push a buggy or defective solution to customers risk irreparable harm to their brand, particularly startups. The opposite is true as “also-rans” in an established market have difficulty gaining traction. Simulations were an important design innovation when they were first introduced, but their steady improvement and ability to create realistic scenarios can slow perfectionist engineers. Too often, organizations try to build “perfect” simulation models that take a significant amount of time to build, which introduces the risk that the market will have moved on.


What is VPN split tunneling and should I be using it?

The ability to choose which apps and services use your VPN of choice and which don't is incredibly powerful. Activities like remote work, browsing your bank's website, or online shopping via public Wi-Fi can definitely benefit from the added security of a VPN, but other pursuits, like playing online games or streaming readily available content, can be hurt by the slight delay VPNs may add to your traffic. The modest decrease to your connection speed is barely noticeable for browsing, but can be disastrous for online games. Being able to simultaneously connect to sensitive sites and services through your secure VPN, and to non-sensitive games and apps means you won't constantly need to enable and disable your VPN connection when switching tasks. This is important as forgetting to enable it at the wrong time could leave you exposed to security risks. ... Split tunneling divides your network traffic in two. Your standard, unencrypted traffic continues to flow unimpeded down one path, while your sensitive and secured data gets encrypted and routed through the VPN's private network. It's like having a second network connection that's completely separate, a tiny bit slower, but also far more secure.
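Conceptually, the per-app routing decision is just a lookup against a sensitive-apps list; this toy Java sketch, with hypothetical names, illustrates the split:

```java
import java.util.Set;

// Toy illustration of split tunneling: apps on the sensitive list are sent
// through the encrypted VPN tunnel; everything else keeps the faster
// direct connection.
class SplitTunnelPolicy {
    private final Set<String> tunneledApps;

    SplitTunnelPolicy(Set<String> tunneledApps) {
        this.tunneledApps = tunneledApps;
    }

    String routeFor(String app) {
        return tunneledApps.contains(app) ? "vpn" : "direct";
    }
}
```

Real VPN clients implement this at the network layer (per-app, per-domain, or per-subnet rules), but the decision each packet faces is the same two-way choice.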


Why don’t cloud providers integrate?

Although it’s not an apples-to-apples comparison, Google’s Anthos enables enterprises to run applications across clouds and other operating environments, including ones Google doesn’t control. As with Amazon DataZone, it is entirely possible to manage third-party data sources. One senior IT executive from a large travel and hospitality company told me on condition of anonymity, “I’m sure [cloud vendors] can integrate with third-party services, but I suspect that’s not a choice they’re willing to make. For instance, they could publish some interfaces for third parties to integrate with their control plane as well as other means in the data plane.” Integration is possible, in other words, but vendors don’t always seem to want it. This desire to control sometimes leads vendors down roads that aren’t optimal for customers. As this IT executive said, “The ecosystem is being broken. Instead of interoperating with third-party services, [cloud vendors often] choose to create API-compatible competing services.” He continued, “There is a zero-sum game mindset here.” Namely, if a customer runs a third-party database and not the vendor’s preferred first-party database, the vendor has lost.
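The published control-plane interfaces the executive imagines don't exist today. Purely as a hypothetical, such an interface might look like the sketch below; every name and method here is invented for illustration and corresponds to no real cloud API.

```python
# Hypothetical sketch only: the article notes vendors *could* publish
# control-plane interfaces for third parties. Nothing here maps to a real
# cloud provider's API; all names are invented.
from abc import ABC, abstractmethod

class ControlPlaneHook(ABC):
    """Contract a cloud vendor could expose to third-party services."""

    @abstractmethod
    def register_service(self, name: str, endpoint: str) -> None:
        """Announce a third-party service to the vendor's catalog."""

    @abstractmethod
    def health_status(self, name: str) -> str:
        """Report the service's health back to the vendor's console."""

class ThirdPartyDatabase(ControlPlaneHook):
    """A third-party vendor implementing the published contract."""
    def __init__(self):
        self.registry = {}

    def register_service(self, name, endpoint):
        self.registry[name] = endpoint

    def health_status(self, name):
        return "healthy" if name in self.registry else "unknown"

db = ThirdPartyDatabase()
db.register_service("third-party-db", "https://db.example.com")
print(db.health_status("third-party-db"))  # healthy
```

The point of the abstraction is exactly the one the executive makes: once the contract is published, interoperating services and first-party services compete on merit rather than on access to the control plane.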


How RegTech helps financial services providers overcome regulation challenges

Two main types of RegTech capabilities are helping financial service institutions stay compliant: software that encompasses the whole system — for example a full client onboarding cycle — and software that manages a particular process, such as reporting or document management. Hugo Larguinho Brás explains: “The technologies that handle the whole process from A to Z are typically heavier to deploy, but they will allow you to cover most of your needs. These are also more expensive and often more difficult to adapt in line with a company’s specificities.” “Meanwhile, those technologies that handle part of the process can be combined with other tools. While this brings more agility, the need to find and combine several tools can also make your target model more complex to run.” “We see more and more cloud and on-premises solutions available to asset management and securities companies, from software-as-a-service (SaaS) and platform-as-a-service (PaaS) deployed in-house, to solutions combined with outsourced capabilities ...”


What You Need to Know About Hyperscalers

Current hyperscaler adopters are primarily large enterprises. “The speed, efficiencies, and global reach hyperscalers can provide will surpass what most enterprise organizations can build within their own data centers,” Drobisewski says. He predicts that the partnerships being built today between hyperscalers and large enterprises are strategic and will continue to grow in value. “As hyperscalers maintain their focus on lifecycle, performance, and resiliency, businesses can consume hyperscaler services to thrive and accelerate the creation of new digital experiences for their customers,” Drobisewski says. ... Many adopters begin their hyperscaler migration by selecting the software applications that are best suited to run within a cloud environment, Hoecker says. Over time, these organizations continue to migrate workloads to the cloud as their business goals evolve, he adds. Many hyperscaler adopters, as they become increasingly comfortable with the approach, are beginning to establish multi-cloud estates. “The decision criteria are typically based on performance, cost, security, access to skills, and regulatory and compliance factors,” Hoecker notes.


UID smuggling: A new technique for tracking users online

Researchers at UC San Diego have for the first time sought to quantify the frequency of UID smuggling in the wild, by developing a measurement tool called CrumbCruncher. CrumbCruncher navigates the Web like an ordinary user, but along the way, it keeps track of how many times it has been tracked using UID smuggling. The researchers found that UID smuggling was present in about 8 percent of the navigations that CrumbCruncher made. The team is also releasing both their complete dataset and their measurement pipeline for use by browser developers. The team’s main goal is to raise awareness of the issue with browser developers, said first author Audrey Randall, a computer science Ph.D. student at UC San Diego. “UID smuggling is more widely used than we anticipated,” she said. “But we don’t know how much of it is a threat to user privacy.” ... UID smuggling can have legitimate uses, the researchers say. For example, embedding user IDs in URLs can allow a website to realize a user is already logged in, which means they can skip the login page and navigate directly to content.
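One simple heuristic for spotting smuggled identifiers, offered here in the spirit of the research rather than as CrumbCruncher's actual pipeline, is to flag URL query parameters whose values look like opaque, high-entropy user IDs. The length and character-class rules below are illustrative assumptions.

```python
# Illustrative UID-smuggling heuristic (not CrumbCruncher's real method):
# flag query parameters whose values look like opaque identifiers rather
# than ordinary words or small numbers.
from urllib.parse import urlparse, parse_qs

def uid_like(value: str) -> bool:
    """Long, alphanumeric, and mixing letters with digits: plausibly a UID."""
    return len(value) >= 16 and value.isalnum() and not value.isalpha()

def smuggled_uids(url: str) -> dict:
    """Return query parameters from the URL whose values look like UIDs."""
    params = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in params.items() if v and uid_like(v[0])}

url = "https://news.example.com/story?id=42&visitor=a81f9c0e77b24d3e9f01"
print(smuggled_uids(url))  # {'visitor': 'a81f9c0e77b24d3e9f01'}
```

A real measurement tool additionally has to check that the same value recurs across navigations and origins, since, as the researchers note, some URL-embedded IDs serve legitimate purposes like keeping a user logged in.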


Bring Sanity to Managing Database Proliferation

How can you avoid being a victim of the bow wave of database proliferation? Recognize that you can allocate your resources in a way that benefits both your bottom line and your stress level by consolidating how you run and manage modern databases. Investing heavily in self-managing the legacy databases used in high volume by many of your people makes a lot of sense. Database workloads that are typically used for mission-critical transaction processing, such as IBM DB2 in financial services, are subject to performance tuning, regular patching and upgrading by specialized database administrators in a kind of siloed sanctum sanctorum. Many organizations will hire an in-house Oracle or SAP Hana expert and create a team, ... But what about the 40 other highly functional, highly desirable cloud databases in your enterprise that aren’t used as often? Do you need another 20 people to manage them? Open source databases like MySQL, MongoDB, Cassandra, PostgreSQL and many others have gained wide adoption, and many of their use cases are considered mission-critical. 


An Ode to Unit Tests: In Defense of the Testing Pyramid

What does the unit in unit tests mean? It means a unit of behavior. There's nothing in that definition dictating that a test has to focus on a single file, object, or function. Why is it difficult to write unit tests focused on behavior? A common problem with many types of testing comes from a tight connection between software structure and tests. That happens when the developer loses sight of the test goal and approaches it in a clear-box (sometimes referred to as white-box) way. Clear-box testing means testing with the internal design in mind to guarantee the system works correctly. This is really common in unit tests. The problem with clear-box testing is that tests tend to become too granular, and you end up with a huge number of tests that are hard to maintain due to their tight coupling to the underlying structure. Part of the unhappiness around unit tests stems from this fact. Integration tests, being more removed from the underlying design, tend to be impacted less by refactoring than unit tests. I like to look at things differently. Is this a benefit of integration tests or a problem caused by the clear-box testing approach? What if we had approached unit tests in an opaque-box approach?
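The clear-box versus opaque-box distinction is easy to show in code. The cart module below is a hypothetical example: the test asserts only on observable behavior (the unit being the behavior "total the cart, then discount it"), so refactoring how the total is computed internally would not break it.

```python
# Sketch of an opaque-box, behavior-focused unit test. The cart module is
# a hypothetical example invented for illustration.
import unittest

def cart_total(prices, discount_rate=0.0):
    """Unit of behavior: total the cart, then apply any discount."""
    subtotal = sum(prices)
    return round(subtotal * (1 - discount_rate), 2)

class CartBehaviorTest(unittest.TestCase):
    # Opaque-box: assert on what the caller observes, never on how the
    # summing or rounding is implemented internally.
    def test_total_without_discount(self):
        self.assertEqual(cart_total([10.00, 5.50]), 15.50)

    def test_discount_applies_to_subtotal(self):
        self.assertEqual(cart_total([100.00], discount_rate=0.1), 90.00)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CartBehaviorTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

A clear-box version of the same tests might instead mock the internal summing step or assert on intermediate values, and would then break on every refactoring of `cart_total`, exactly the maintenance burden the passage describes.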



Quote for the day:

"Strategy is not really a solo sport even if you_re the CEO." -- Max McKeown