
Daily Tech Digest - August 24, 2025


Quote for the day:

"To accomplish great things, we must not only act, but also dream, not only plan, but also believe." -- Anatole France



Creating the ‘AI native’ generation: The role of digital skills in education

Boosting AI skills has the potential to drive economic growth and productivity and create jobs, but ambition must be matched with effective delivery. We must ensure AI is integrated into education in a way that encourages students to maintain critical thinking skills, skeptically assess AI outputs, and use it responsibly and ethically. Education should also inspire future tech talent and prepare them for the workplace. ... AI fluency is only one part of the picture. Amid a global skills gap, we also need to capture the imaginations of young people to work in tech. To achieve this, AI and technology education must be accessible, meaningful, and aspirational. That requires coordinated action from schools, industry, and government to promote the real-world impact of digital skills, create clearer, more inspiring pathways into tech careers, and expose students to how AI is applied in various professions. Early exposure to AI can do far more than build fluency: it can spark curiosity, confidence and career ambition towards high-value sectors like data science, engineering and cybersecurity—areas where the UK must lead. ... Students who learn how to use AI now will build the competencies that industries want and need for years to come. But this will form only the first stage of a broader AI learning arc, in which learning and upskilling become a lifelong mindset, not a single milestone.


What is the State of SIEM?

In addition to high deployment costs, many organizations grapple with implementing SIEM. A primary challenge is SIEM configuration -- given that the average organization has more than 100 different data sources that must plug into the platform, according to an IDC report. It can be daunting for network staff to do the following when deploying SIEM: choose which data sources to integrate; set up SIEM correlation rules that define what will be classified as a security event; and determine the alert thresholds for specific data and activities. It's equally challenging to manage the information and alerts a SIEM platform issues. If the rules are tuned too aggressively, the result might be false positives as the system triggers alarms about events that aren't actually threats. This is a time-stealer for network techs and can lead to staff fatigue and frustration. In contrast, if the calibration is too liberal, organizations run the risk of overlooking something that could be vital. Network staff must also coordinate with other areas of IT and the company. For example, what if data safekeeping and compliance regulations change? Does this change SIEM rule sets? What if the IT applications group rolls out new systems that must be attached to SIEM? Can the legal department or auditors tell you how long to store and retain data for eDiscovery or for disaster backup and recovery? And which data noise can you discard as waste?
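To make the correlation-rule and threshold ideas concrete, here is a minimal sketch of a sliding-window rule in Python. The event fields ('type', 'user', 'ts') and the five-failures-in-five-minutes policy are invented for illustration; they do not reflect any particular SIEM's schema or defaults:

```python
from collections import defaultdict, deque

# Hypothetical correlation rule: alert when one account produces more than
# THRESHOLD failed logins within WINDOW_SECONDS.
WINDOW_SECONDS = 300
THRESHOLD = 5

failed_logins = defaultdict(deque)  # user -> timestamps of recent failures

def raise_alert(user: str, count: int) -> None:
    print(f"ALERT: {count} failed logins for {user} within {WINDOW_SECONDS}s")

def process_event(event: dict) -> None:
    # 'type', 'user', and 'ts' are invented field names for a normalized event.
    if event["type"] != "login_failure":
        return
    window = failed_logins[event["user"]]
    window.append(event["ts"])
    while window and event["ts"] - window[0] > WINDOW_SECONDS:
        window.popleft()  # drop failures that fell out of the sliding window
    if len(window) > THRESHOLD:
        raise_alert(event["user"], len(window))

# Example: the sixth failure inside five minutes trips the alert.
for second in range(0, 300, 50):
    process_event({"type": "login_failure", "user": "alice", "ts": second})
```

Tightening THRESHOLD or widening WINDOW_SECONDS is exactly the calibration trade-off described above: more alerts and more false positives on one side, missed incidents on the other.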


AI Data Centers: A Popular Term That’s Hard to Define

The tricky thing about trying to define AI data centers based on characteristics like those described above is that none of those features is unique to AI data centers. For example, hyperscale data centers – meaning very large facilities capable of accommodating more than a hundred thousand servers in some cases – existed before modern AI debuted. AI has made large-scale data centers more important because AI workloads require vast infrastructures, but it’s not as if no one was building large data centers before AI rose to prominence. Likewise, it has long been possible to deploy GPU-equipped servers in data centers. ... Likewise, advanced cooling systems and innovative approaches to data center power management are not unique to the age of generative AI. They, too, predated AI data centers. ... Arguably, an AI data center is ultimately defined by what it does (hosting AI workloads) more than by how it does it. So, before getting hung up on the idea that AI requires investment in a new generation of data centers, it’s perhaps healthier to think about how to leverage the data centers already in existence to support AI workloads. That perspective will help the industry avoid the risk of overinvesting in new data centers designed specifically for AI – and as a bonus, it may save money by allowing businesses to repurpose the data centers they already own to meet their AI needs as well.


Password Managers Vulnerable to Data Theft via Clickjacking

Tóth showed how an attacker can use DOM-based extension clickjacking and the autofill functionality of password managers to exfiltrate sensitive data stored by these applications, including personal data, usernames and passwords, passkeys, and payment card information. The attacks demonstrated by the researcher require 0-5 clicks from the victim, with a majority requiring only one click on a harmless-looking element on the page. The single-click attacks often involved exploitation of XSS or other vulnerabilities. DOM, or Document Object Model, is an object tree created by the browser when it loads an HTML or XML web page. ... Tóth’s attack involves a malicious script that manipulates user interface elements injected by browser extensions into the DOM. “The principle is that a browser extension injects elements into the DOM, which an attacker can then make invisible using JavaScript,” he explained. According to the researcher, some of the vendors have patched the vulnerabilities, but fixes have not been released for Bitwarden, 1Password, iCloud Passwords, Enpass, LastPass, and LogMeOnce. SecurityWeek has reached out to these companies for comment. Bitwarden said a fix for the vulnerability is being rolled out this week with version 2025.8.0. LogMeOnce said it’s aware of the findings and its team is actively working on resolving the issue through a security update.


Iskraemeco India CEO: ERP, AI, and the future of utility leadership

We see a clear convergence ahead, where ERP systems like Infor’s will increasingly integrate with edge AI, embedded IoT, and low-code automation to create intelligent, responsive operations. This is especially relevant in utility scenarios where time-sensitive data must drive immediate action. For instance, our smart kits – equipped with sensor technology – are being designed to detect outages in real time and pinpoint exact failure points, such as which pole needs service during a natural disaster. This type of capability, powered by embedded IoT and edge computing, enables decisions to be made closer to the source, reducing downtime and response lag.  ... One of the most important lessons we've learned is that success in complex ERP deployments is less about customisation and more about alignment, across leadership, teams, and technology. In our case, resisting the urge to modify the system and instead adopting Infor’s best-practice frameworks was key. It allowed us to stay focused, move faster, and ensure long-term stability across all modules. In a multi-stakeholder environment – where regulatory bodies, internal departments, and technology partners are all involved – clarity of direction from leadership made all the difference. When the expectation is clear that we align to the system, and not the other way around, it simplifies everything from compliance to team onboarding.


Experts Concerned by Signs of AI Bubble

"There's a huge boom in AI — some people are scrambling to get exposure at any cost, while others are sounding the alarm that this will end in tears," Kai Wu, founder and chief investment officer of Sparkline Capital, told the Wall Street Journal last year. There are even doubters inside the industry. In July, recently ousted CEO of AI company Stability AI Emad Mostaque told banking analysts that "I think this will be the biggest bubble of all time." "I call it the 'dot AI’ bubble, and it hasn’t even started yet," he added at the time. Just last week, Jeffrey Gundlach, billionaire CEO of DoubleLine Capital, also compared the AI craze to the dot com bubble. "This feels a lot like 1999," he said during an X Spaces broadcast last week, as quoted by Business Insider. "My impression is that investors are presently enjoying the double-top of the most extreme speculative bubble in US financial history," Hussman Investment Trust president John Hussman wrote in a research note. In short, with so many people ringing the alarm bells, there could well be cause for concern. And the consequences of an AI bubble bursting could be devastating. ... While Nvidia would survive such a debacle, the "ones that are likely to bear the brunt of the correction are the providers of generative AI services who are raising money on the promise of selling their services for $20/user/month," he argued.


OpenCUA’s open source computer-use agents rival proprietary models from OpenAI and Anthropic

Computer-use agents are designed to autonomously complete tasks on a computer, from navigating websites to operating complex software. They can also help automate workflows in the enterprise. However, the most capable CUA systems are proprietary, with critical details about their training data, architectures, and development processes kept private. “As the lack of transparency limits technical advancements and raises safety concerns, the research community needs truly open CUA frameworks to study their capabilities, limitations, and risks,” the researchers state in their paper. ... The tool streamlines data collection by running in the background on an annotator’s personal computer, capturing screen videos, mouse and keyboard inputs, and the underlying accessibility tree, which provides structured information about on-screen elements.  ... The key insight was to augment these trajectories with chain-of-thought (CoT) reasoning. This process generates a detailed “inner monologue” for each action, which includes planning, memory, and reflection. This structured reasoning is organized into three levels: a high-level observation of the screen, reflective thoughts that analyze the situation and plan the next steps, and finally, the concise, executable action. This approach helps the agent develop a deeper understanding of the tasks.
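As a rough illustration of that three-level structure, a single augmented step in a trajectory might look like the Python sketch below. The field names and action format are assumptions made for illustration, not OpenCUA's actual schema:

```python
# Illustrative only: a plausible shape for one CoT-augmented step in a
# computer-use trajectory. Field names are assumptions, not OpenCUA's schema.
step = {
    # Level 1: high-level observation of the screen.
    "observation": "A save dialog is open with the filename field focused.",
    # Level 2: reflective thought that analyzes the situation and plans ahead.
    "thought": (
        "The task is to save the report as 'q3.pdf'. The dialog is already "
        "open, so I should type the filename and then click Save."
    ),
    # Level 3: the concise, executable action.
    "action": {"type": "type_text", "text": "q3.pdf"},
}

def render_training_example(step: dict) -> str:
    """Flatten one step into the text an agent model could be trained on."""
    return (
        f"Observation: {step['observation']}\n"
        f"Thought: {step['thought']}\n"
        f"Action: {step['action']}"
    )

print(render_training_example(step))
```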


How to remember everything

MyMind is a clutter-free bookmarking and knowledge-capture app without folders or manual content organization. There are no templates, manual customizations, or collaboration tools. Instead, MyMind recognizes and formats the content type elegantly. For example, songs, movies, books, and recipes are displayed differently based on MyMind’s detection, regardless of the source, as are pictures and videos. MyMind uses AI to auto-tag everything and allows custom tags. Every word, including those in pictures, is indexed. You can take pictures of information, upload them to MyMind, and find them later by searching a word or two found in the picture. Copying a sentence or paragraph from an article will display the quote with a source link. Every data chunk is captured in a “card.” ... Alongside AI-enabled lifelogging tools like MyMind, we’re also entering an era of lifelogging hardware devices. One promising direction comes from a startup called Brilliant Labs. Its new $299 Halo glasses, available for pre-order and shipping in November, are lightweight AI glasses. The glasses have a long list of features — bone conduction sound, a camera, light weight, etc. — but the lifelogging enabler is an “agentic memory” system called Narrative. It captures information automatically from the camera and microphones and places it into a personal knowledge base.


From APIs to Digital Twins: Warehouse Integration Strategies for Smarter Supply Chains

Digital twins create virtual replicas of warehouses and supply chains for monitoring and testing. A digital twin ingests live data from IoT sensors, machines, and transportation feeds to simulate how changes affect outcomes. For instance, GE’s “Digital Wind Farm” project feeds sensor data from each turbine into a cloud model, suggesting performance tweaks that boost energy output by ~20% (worth ~$100M over the life of a wind farm). In warehousing, digital twins can model workflows (layout changes, staffing shifts, equipment usage) to identify bottlenecks or test improvements before physical changes. Paired with AI, these twins become predictive and prescriptive: companies can run thousands of what-if scenarios (like a port strike or demand surge) and adjust plans accordingly. ... Today’s warehouses are not just storage sheds; they are smart, interconnected nodes in the supply chain. Leveraging IIoT sensors, cloud APIs, AI analytics, robotics, and digital twins transforms logistics into a competitive advantage. Integrated systems reduce manual handoffs and errors: for example, automated picking and instant carrier booking can shorten fulfillment cycles from days to hours. Industry data bear this out: deploying these technologies can improve on-time delivery by ~20% and significantly lower operating costs.
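As a toy version of the what-if idea, the Python sketch below runs many randomized simulations of a warehouse day with and without a disruption; every rate and number here is invented purely for illustration, not drawn from any real twin:

```python
import random

def on_time_rate(orders: int, pickers: int, disruption: bool) -> float:
    """Fraction of orders shipped on time in one simulated day.
    All numbers are invented for illustration."""
    staff = int(pickers * (0.6 if disruption else 1.0))  # e.g., staffing shortfall
    capacity = 0.0
    for _ in range(staff):
        # Each picker handles a random number of orders per day.
        capacity += random.gauss(60, 10)
    return min(capacity / orders, 1.0)

def scenario(disruption: bool, runs: int = 1000) -> float:
    """Average on-time rate over many randomized runs of the same scenario."""
    return sum(on_time_rate(1200, 20, disruption) for _ in range(runs)) / runs

print(f"baseline on-time rate:  {scenario(False):.1%}")
print(f"disrupted on-time rate: {scenario(True):.1%}")
```

A real digital twin would replace the random draws with live sensor feeds and a calibrated process model, but the structure, namely running the same day thousands of times under different assumptions, is the same.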


Enterprise Software Spending Surges Despite AI ROI Shortfalls

AI capabilities increasingly drive software purchasing decisions. However, many organizations struggle with the gap between AI promise and practical ROI delivery. The disconnect stems from fundamental challenges in data accessibility and contextual understanding. Current AI implementations face significant obstacles in accessing the full spectrum of contextual data required for complex decision-making. "In complex use cases, where the exponential benefits of AI reside, AI still feels forced and contrived when it doesn't have the same amount and depth of contextual data required to read a situation," Kirkpatrick explained. Effective AI implementation requires comprehensive data infrastructure investments. Organizations must ensure AI models can access approved data sources while maintaining proper guardrails. Many IT departments are still working to achieve this balance. The challenge intensifies in environments where AI needs to integrate across multiple platforms and data sources. Well-trained humans often outperform AI on complex tasks because their experience allows them to read multiple factors and adjust contextually. "For AI to mimic that experience, it requires a wide range of data that can address factors across a wide range of dimensions," Kirkpatrick said. "That requires significant investment in data to ensure the AI has the information it needs at the right time, with the proper context, to function seamlessly, effectively, and efficiently."

Daily Tech Digest - August 01, 2023

Is generative AI mightier than the law?

The FTC hasn’t been shy in going after Big Tech. And in the middle of July, it took its most important step yet: It opened an investigation into whether Microsoft-backed OpenAI has violated consumer protection laws and harmed consumers by illegally collecting data, violating consumer privacy and publishing false information about people. In a 20-page letter sent by the FTC to OpenAI, the agency said it’s probing whether the company “engaged in unfair or deceptive privacy or data security practices or engaged in unfair or deceptive practices relating to risks of harm to consumers.” The letter made clear how seriously the FTC takes the investigation. It wants vast amounts of information, including technical details about how ChatGPT gathers data, how the data is used and stored, the use of APIs and plugins, and information about how OpenAI trains, builds, and monitors the Large Language Models (LLMs) that fuel its chatbot. None of this should be a surprise to Microsoft or ChatGPT. In May, FTC Chair Lina Khan wrote an opinion piece in The New York Times laying out how she believed AI must be regulated.


Four Pillars of Digital Transformation

The first principle was understanding your customers and your customer segments. That's number one. Second is aligning with your customers and your functional teams. Because you cannot do any digital transformation in a silo. You can say, "Oh, Asif and Shane want to digitally transform this company and forget about what people A and people B are thinking. Shane and I are going to go and make that happen." We will fail. Not going to happen. That's where the cross-functional team alignment comes in. The third is influencing and understanding what the change is, why we want to do it, how we are going to do it. And what's in it, not for Shane, not for Asif. What's in it for you as a customer or as an organization? Again, showing the empathy and explaining the why behind it. And finally, communicating, communicating, communicating, over-communicating and celebrating success. To me, those are the big four pillars that we use to sell the idea of what the digital transformation is at any level, from the C-level all the way to people at the store level. Can I explain to them what's in it for them?


Scientists Seek Government Database to Track Harm from Rising 'AI Incidents'

Faced with mounting evidence of such harmful AI incidents, the FAS noted the government database could complement other trackers and efforts. "The database should be designed to encourage voluntary reporting from AI developers, operators, and users while ensuring the confidentiality of sensitive information," the FAS said. "Furthermore, the database should include a mechanism for sharing anonymized or aggregated data with AI developers, researchers, and policymakers to help them better understand and mitigate AI-related risks. The DHS could build on the efforts of other privately collected databases of AI incidents, including the AI Incident Database created by the Partnership on AI and the Center for Security and Emerging Technologies. This database could also take inspiration from other incident databases maintained by federal agencies, including the National Transportation Safety Board's database on aviation accidents." The group further recommended that the DHS collaborate with NIST to design and maintain the database, including setting up protocols for data validation, categorization, anonymization, and dissemination.


Generative AI: Headwind or tailwind?

It's topping everyone's wish list, with influences converging from various directions: customers, staff members and corporate boards, all applying pressure to harness its potential in their respective markets. On the bright side, there's a unified objective: to make progress. The challenge, however, is that, like most early-stage technologies, the path forward with generative AI isn't straightforward — there's a lot of ambiguity about what to do, how to do it or even where to start. The potential of generative AI surpasses mere cost-effectiveness and efficiency. It can fuel the generation of new ideas, fine-tune designs and facilitate the launch of new products. It could serve as your catalyst for innovation if you're bold enough to step into this new frontier. But where do you step first? Our approach is first to identify a problem or "missing." In the simplest explanation possible, envision a Venn diagram where one circle represents the new tech wave (generative AI) and the other represents your customer, their challenges, opportunities, tasks, pains and gains.


Hackers: We won’t let artificial intelligence get the better of us

Hackers who have adopted or who plan to adopt generative AI are most inclined to use OpenAI's ChatGPT ... Those that have taken the plunge are using generative AI technology in a wide variety of ways, with the most commonly used functions being text summarisation or generation, code generation, search enhancement, chatbots, image generation, data design, collection or summarisation, and machine learning. Within security research workflows specifically, hackers said they found generative AI most useful to automate tasks, analyse data, and identify and validate vulnerabilities. Less widely used applications included conducting reconnaissance, categorising threats, detecting anomalies, prioritising risk and building training models. Many hackers who are not native English speakers or not fluent in English are also using services such as ChatGPT to translate or write reports and bug submissions, and to fuel more collaboration across national borders.


Your CDO Does More Than Just Protect Data

Influential CDOs who can collaborate without being perceived as aloof or arrogant stand out in the field. Balancing visionary thinking with practical implementation strategies is vital, and CDOs who instill purpose and forward-looking excitement within their teams create a culture of innovation and continuous improvement. These qualities are essential for unlocking the full potential of data leadership. ... With boards needing more depth of tech knowledge to oversee strategy, CDOs can be valuable directors. CDOs who can demonstrate experience in making data core to the company’s strategy or informing a transformational pivot for the business would bring a high amount of value to boardroom discussions. The opportunity to understand and see risks and opportunities through a board member’s eyes is an invaluable experience for a CDO, which not only helps the CDO to prepare for future board service but also gives your board members additional education about the future of data and what it can bring to your organization.


Data Warehouse Telemetry: Measuring the Health of Your Systems

At the heart of the data warehouse system, there is a pulse. This is the set of measures that indicate the system's performance -- its heartbeat, so to speak. This includes the measurements of system resources, such as disk reads and writes, CPU and memory utilization, and disk usage. These metrics are an indicator of how well the overall system is performing. It is important to monitor these metrics to make sure they do not go too low or too high. When they go too low, it is an indicator that the system has been oversized and resources are being wasted. When they go too high, it is an indicator that the system is undersized and resources are nearing exhaustion. As the resources hit a critical level, overall performance can grind to a halt, freezing processes and negatively impacting the user experience. When a medical practitioner sees that a patient's heart rate/pulse is too fast or too slow, they will provide several recommendations, including ongoing monitoring to see if the situation improves, or changes to diet or exercise.
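For a sense of what taking that pulse can look like in practice, here is a minimal Python sketch using the third-party psutil library; the warning thresholds are arbitrary examples, not recommended values:

```python
import psutil  # third-party: pip install psutil

def sample_pulse() -> dict:
    """One 'heartbeat' sample of host-level resource metrics."""
    io = psutil.disk_io_counters()
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
        "disk_reads": io.read_count,
        "disk_writes": io.write_count,
    }

sample = sample_pulse()
for metric, value in sample.items():
    print(f"{metric}: {value}")

# Naive checks mirroring the "too high / too low" idea; thresholds are examples.
if sample["cpu_percent"] > 90 or sample["memory_percent"] > 90:
    print("WARNING: resources nearing exhaustion (system may be undersized)")
elif sample["cpu_percent"] < 5 and sample["memory_percent"] < 20:
    print("NOTE: sustained low utilization may mean the system is oversized")
```

In a real deployment these samples would be shipped to a time-series store so the trend, not a single reading, drives the sizing decision.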


Reducing Generative AI Hallucinations and Trusting Your Data

Data has two dimensions. One is the actual value of the data and the parameter that it represents; for example, the temperature of an asset in a factory. Then, there is also the relational aspect of the data that shows how the source of that temperature sensor is connected to the rest of the other data generators. This value-oriented aspect of data and the relational aspect of that data are both important for quality, trustworthiness, and the history, revision, and versioning of the data. There's obviously the communication pipeline, and you need to make sure that where the data sources connect to your data platform there is enough reliability and security. Make sure the data travels with integrity and the data is protected against malicious intent. ... Generative AI is one of those foundational technologies, like how software changed the world. Marc [Andreessen, a partner in the Silicon Valley venture capital firm Andreessen Horowitz] in 2011 said that software is eating the world, and software already ate the world. It took 40 years for software to do this.


10 Reasons for Optimism in Cybersecurity

The new National Cybersecurity Strategy announced by the Biden Administration this year emphasizes the importance of public-private collaboration. Google Cloud's Venables anticipates that knowledge sharing between the public and private sectors will help enhance transparency around cyber threats and improve protection. "As public and private sector collaboration grows, in the next few years we'll see deeper coordination between agencies and big tech organizations in how they implement cyber protections," he says. The public and private sectors also have the opportunity to join forces on cybersecurity regulation. ... As the cybersecurity product market matures, it will embrace secure-by-design and -default principles. XYPRO's Tcherchian is also optimistic about the consolidation of cybersecurity solutions. "Cybersecurity consolidation integrates multiple cybersecurity tools and solutions into a unified platform, addressing the crowded and complex nature of the cybersecurity market," he explains.


Keeping the cloud secure with a mindset shift

Organizations developing software through cloud-based tools and environments must take additional care to adapt their processes. Adopting a "shift-left" approach for the continuous integration and continuous deployment (CI/CD) pipeline is particularly important. Traditionally, security checks were often performed towards the end of the development cycle. However, this reactive approach can allow vulnerabilities to slip through the cracks and reach production stages. The shift-left approach advocates for integrating security measures earlier in the development cycle. By doing so, potential security risks can be identified and mitigated early, preventing malware infiltration and reducing the cost and complexity of addressing security issues at later stages. This proactive approach aligns with the dynamic nature of cloud environments, ensuring robust security without hindering agility and innovation. Businesses should consider how they can mirror the shift-left ethos across their other cloud operations.
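As one concrete shift-left pattern, a pipeline stage can run a static scanner on every commit and fail the build on serious findings. The sketch below wraps Bandit, a real static analysis tool for Python; the 'src' path and the fail-on-high-severity policy are assumptions for illustration:

```python
import json
import subprocess
import sys

# Minimal CI gate: run Bandit over the source tree and fail the build if any
# high-severity issue is found. Path and policy are illustrative choices.
result = subprocess.run(
    ["bandit", "-r", "src", "-f", "json", "-q"],
    capture_output=True, text=True,
)
report = json.loads(result.stdout or "{}")
high = [i for i in report.get("results", []) if i.get("issue_severity") == "HIGH"]
for issue in high:
    print(f"{issue['filename']}:{issue['line_number']}: {issue['issue_text']}")
sys.exit(1 if high else 0)  # a non-zero exit fails the CI stage
```

Because the gate runs on every commit rather than before release, a vulnerability is caught while the offending change is still small and cheap to fix.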



Quote for the day:

"Leadership offers an opportunity to make a difference in someone's life, no matter what the project." -- Bill Owens

Daily Tech Digest - July 22, 2023

All-In-One Data Fabrics Knocking on the Lakehouse Door

The fact that IBM, HPE, and Microsoft made such similar data fabric and lakehouse announcements indicates there is strong market demand, Patel says. But it's also partly a result of the evolution of data architecture and usage patterns, he says. "I think there are probably some large enterprises that decide, listen, I can't do this anymore. You need to go and fix this. I need you to do this," he says. "But there's also some level of just where we're going… We were always going to be in a position where governance and security and all of those types of things just become more and more important and more and more intertwined into what we do on a daily basis. So it doesn't surprise me that some of these things are starting to evolve." While some organizations still see value in choosing the best-of-breed products in every category that makes up the data fabric, many will gladly give up having the latest, greatest feature in one particular area in exchange for having a whole data fabric they can move into and be productive from day one.


Shift Left With DAST: Dynamic Testing in the CI/CD Pipeline

The integration of DAST in the early stages of development is crucial for several reasons. First, by conducting dynamic security testing from the onset, teams can identify vulnerabilities earlier, making them easier and less costly to fix. This proactive approach helps to prevent security issues from becoming ingrained in the code, which can lead to significant problems down the line. Second, early integration of DAST encourages a security-focused mindset from the beginning of the project, promoting a culture of security within the team. This cultural shift is crucial in today’s cybersecurity climate, where threats are increasingly sophisticated, and the stakes are higher than ever. DAST doesn’t replace other testing methods; rather, it complements them. By combining these methods, teams can achieve a more comprehensive view of their application’s security. In a shift-left approach, this combination of testing methods can be very powerful. By conducting these tests early and often, teams can ensure that both the external and internal aspects of their application are secure. This layered approach to security testing can help to catch any vulnerabilities that might otherwise slip through the cracks.
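A DAST stage in the pipeline might look something like the following sketch, which drives OWASP ZAP through its Python client against a staging deployment. The target URL, API key, and gating policy are placeholders, and ZAP must already be running in daemon mode for this to work:

```python
import time
from zapv2 import ZAPv2  # third-party: pip install python-owasp-zap-v2.4

# Placeholder target and credentials; ZAP daemon assumed on localhost:8080.
target = "https://staging.example.com"
zap = ZAPv2(apikey="changeme",
            proxies={"http": "http://127.0.0.1:8080",
                     "https": "http://127.0.0.1:8080"})

scan_id = zap.spider.scan(target)            # crawl the running application
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

scan_id = zap.ascan.scan(target)             # active scan for vulnerabilities
while int(zap.ascan.status(scan_id)) < 100:
    time.sleep(5)

alerts = zap.core.alerts(baseurl=target)
high = [a for a in alerts if a["risk"] == "High"]
print(f"{len(high)} high-risk findings")     # a CI gate could fail on these
```

Running this against every merge to the staging environment is what turns DAST from a pre-release audit into a genuinely shift-left control.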


First known open-source software attacks on banking sector could kickstart long-running trend

In the first attack detailed by Checkmarx, which occurred on 5 April and 7 April, a threat actor leveraged the NPM platform to upload packages that contained a preinstall script that executed its objective upon installation. To appear more credible, the attacker created a spoofed LinkedIn profile page of someone posing as an employee of the victim bank. Researchers originally thought this may have been linked to legitimate penetration testing services commissioned by the bank, but the bank revealed that not to be the case and said it was unaware of the LinkedIn activity. The attack itself followed a multi-stage approach which began with running a script to identify the victim’s operating system – Windows, Linux, or macOS. Once identified, the script then decoded the relevant encrypted files in the NPM package, which then downloaded a second-stage payload. Checkmarx said that the Linux-specific encrypted file was not flagged as malicious by online virus scanner VirusTotal, allowing the attacker to “maintain a covert presence on the Linux systems” and increase its chances of success.


From data warehouse to data fabric: the evolution of data architecture

By introducing domain-oriented data ownership, domain teams become accountable for their data and products, improving data quality and governance. Traditional data lakes often encounter challenges related to scalability and performance when handling large volumes of data. However, data mesh architecture solves these scalability issues through its decentralized and self-serve data infrastructure. With each domain having the autonomy to choose the technologies and tools that best suit their needs, data mesh allows teams to scale their data storage and processing systems independently. ... Data Fabric is an integrated data architecture that is adaptive, flexible, and secure. It is an architectural approach and technology framework that addresses data lake challenges by providing a unified and integrated view of data across various sources. Data Fabric allows faster and more efficient access to data by abstracting the technological complexities involved in data integration, transformation, and movement so that anybody can use it.


What Is the Role of Software Architect in an Agile World?

It has become evident that there is a gap between the architecture team and those who interact with the application on a daily basis. Even in the context of microservice architecture, failing to adhere to best practices can result in a tangled mess that may force a return to monolithic structures, as we have seen with Amazon Web Services. I believe that it is necessary to shift architecture left and provide architects with better tools to proactively identify architecture drift and technical debt buildup, injecting architectural considerations into the feature backlog. With few tools available to understand the architecture or identify architecture drift, the role of the architect has become a topic of extensive discussion. Should every developer be responsible for architecture? Most companies have an architect who sets standards, goals, and plans. However, this high-level role in a highly complex and very detailed software project will often become detached from the day-to-day reality of the development process.


Rapid growth without the risk

The case for legacy modernization should today be clear: technical debt is like a black hole, sucking up an organization’s time and resources and preventing it from developing the capabilities needed to evolve, adapt and drive growth. But while legacy systems can limit and inhibit business growth, from large-scale disruption to subtle but long-term stagnation, changing them doesn’t have to be a painful process of “rip-and-replace.” In fact, rather than changing everything only to change nothing, an effective program enacts change in people, processes and technology incrementally. It focuses on those areas that will make the biggest impact and drive the most value, making change manageable in the short term yet substantial in its effect on an organization's future success and sustainable in the long term. In an era where executives often find themselves in FOMU (fear of messing up) mode, they would be wise to take the same focused, incremental approach to legacy modernization.


Data Fabric: How to Architect Your Next-Generation Data Management

The data fabric encompasses a broader concept that goes beyond standalone solutions such as data virtualization. Rather, the architectural approach of a data fabric integrates multiple data management capabilities into a unified framework. The data fabric is an emerging data management architecture that casts a net to stitch together multiple heterogeneous data sources and types through automated data pipelines. ... For business teams, a data fabric empowers nontechnical users to easily discover, access, and share the data they need to perform everyday tasks. It also bridges the gap between data and business teams by including subject matter experts in the creation of data products. ... Implementing an efficient data fabric architecture is not accomplished with a single tool. Rather, it incorporates a variety of technology components such as data integration, data catalog, data curation, metadata analysis, and augmented data orchestration. Working together, these components deliver agile and consistent data integration capabilities across a variety of endpoints throughout hybrid and multicloud environments.


Data Lineage Tools: An Overview

Modern data lineage tools have evolved to meet the needs of organizations that handle large volumes of data. These tools provide a comprehensive view of the journey of data from its source to its destination, including all transformations and processing steps along the way. They enable organizations to trace data back to its origins, identify any changes made along the way, and ensure compliance with regulatory requirements. One key feature of modern lineage tools is their ability to automatically capture and track metadata across multiple systems and platforms. This capability removes the need for manual, time-consuming documentation. Another important aspect of modern data lineage tools is their integration with other technologies such as metadata management systems, Data Governance platforms, and business intelligence solutions. This enables organizations to create a unified view of their data landscape and make informed decisions based on accurate, up-to-date information.


The Impact of AI Data Lakes on Data Governance and Security

One of the primary concerns with AI data lakes is the potential for data silos to emerge. Data silos occur when data is stored in separate repositories or systems that are not connected or integrated with one another. This can lead to a lack of visibility and control over the data, making it difficult for organizations to enforce data governance policies and ensure data security. To mitigate this risk, organizations must implement robust data integration and management solutions that enable them to maintain a comprehensive view of their data landscape and ensure that data is consistently and accurately shared across systems. Another challenge associated with AI data lakes is the need to maintain data quality and integrity. As data is ingested into the data lake from various sources, it is essential to ensure that it is accurate, complete, and consistent. Poor data quality can lead to inaccurate insights and decision-making, as well as increased security risks. 


AppSec Consolidation for Developers: Why You Should Care

Complicated and messy AppSec programs are yielding a three-fold problem: unquantifiable or unknowable levels of risk for the organization, ineffective resource management and excessive complexity. This combined effect leaves enterprises with a fragmented picture of total risk and little useful information to help them strengthen their security posture. ... An increase in the number of security tools leads to an increase in the number of security tests, which in turn translates to an increase in the number of results. This creates a vicious cycle that adds complexity to the AppSec environment that is both unnecessary and avoidable. Most of the time, these results are stored in their respective point tools. As a result, developers frequently receive duplicate issues as well as remediation guidance that is ineffective or lacking context, causing them to waste critical time and resources. Without consolidated and actionable outcomes, it is impossible to avoid duplication of findings and remediation actions.



Quote for the day:

"There is no substitute for knowledge." -- W. Edwards Deming

Daily Tech Digest - June 02, 2023

A Data Scientist’s Essential Guide to Exploratory Data Analysis

Analyzing the individual characteristics of each feature is crucial as it will help us decide on their relevance for the analysis and the type of data preparation they may require to achieve optimal results. For instance, we may find values that are extremely out of range and may point to inconsistencies or outliers. We may need to standardize numerical data or perform a one-hot encoding of categorical features, depending on the number of existing categories. Or we may have to perform additional data preparation to handle numeric features that are shifted or skewed, if the machine learning algorithm we intend to use expects a particular distribution. ... For Multivariate Analysis, best practices focus mainly on two strategies: analyzing the interactions between features, and analyzing their correlations. ... Interactions let us visually explore how each pair of features behaves, i.e., how the values of one feature relate to the values of the other.
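A minimal pandas sketch of these univariate and multivariate checks might look like the following; the file name and the cardinality cutoff for one-hot encoding are placeholders:

```python
import pandas as pd

df = pd.read_csv("data.csv")  # placeholder dataset

# Univariate checks: ranges, missing values, skew.
print(df.describe(include="all"))
print(df.isna().sum())
numeric = df.select_dtypes("number")
print(numeric.skew())  # heavily skewed features may need a log transform

# One-hot encode the low-cardinality categorical features.
categorical = df.select_dtypes("object")
low_card = [c for c in categorical if df[c].nunique() <= 10]
df = pd.get_dummies(df, columns=low_card)

# Multivariate check: pairwise correlations between numeric features.
print(numeric.corr())
```

The outputs of describe() and skew() are what flag the out-of-range values and shifted distributions discussed above, before any model sees the data.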


Resilient data backup and recovery is critical to enterprise success

So, what must IT leaders consider? The first step is to establish data protection policies that include encryption and least privilege access permissions. Businesses should then ensure they have three copies of their data – the production copy already exists and is effectively the first copy. The second copy should be stored on a different media type, not necessarily in a different physical location (the logic behind it is to not store your production and backup data in the same storage device). The third copy could or should be an offsite copy that is also offline, air-gapped, or immutable (Amazon S3 with Object Lock is one example). Organizations also need to make sure they have a centralized view of data protection across all environments for greater management, monitoring and governance, and they need orchestration tools to help automate data recovery. Finally, organizations should conduct frequent backup and recovery testing to make sure that everything works as it should.
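As a sketch of that immutable offsite copy, the boto3 snippet below writes a backup object to S3 with Object Lock in compliance mode. The bucket name, key, and 30-day retention are placeholders, and the bucket must have been created with Object Lock enabled:

```python
import datetime
import boto3

s3 = boto3.client("s3")

# Compliance-mode objects cannot be deleted or overwritten until the
# retain-until date passes, even by the account root user.
retain_until = (datetime.datetime.now(datetime.timezone.utc)
                + datetime.timedelta(days=30))

with open("db.dump", "rb") as body:  # the backup artifact to protect
    s3.put_object(
        Bucket="my-backup-bucket",           # placeholder bucket name
        Key="backups/2023-06-02/db.dump",
        Body=body,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=retain_until,
    )
```

This is the "third copy" of the 3-2-1 pattern: even if production credentials are stolen, a ransomware actor cannot destroy the locked copy before the retention window expires.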


Data Warehouse Architecture Types

Different architectural approaches offer unique advantages and cater to varying business requirements. In this comprehensive guide, we will explore different data warehouse architecture types, shedding light on their characteristics, benefits, and considerations. Whether you are building a new data warehouse or evaluating your existing architecture, understanding these options will empower you to make informed decisions that align with your organization’s goals. ... Selecting the right data warehouse architecture is a critical decision that directly impacts an organization’s ability to leverage its data assets effectively. Each architecture type has its own strengths and considerations, and there is no one-size-fits-all solution. By understanding the characteristics, benefits, and challenges of different data warehouse architecture types, businesses can align their architecture with their unique requirements and strategic goals. Whether it’s a traditional data warehouse, hub-and-spoke model, federated approach, data lake architecture, or a hybrid solution, the key is to choose an architecture that empowers data-driven insights, scalability, agility, and flexibility.


What is federated Identity? How it works and its importance to enterprise security

FIM has many benefits, including reducing the number of passwords a user needs to remember, improving the user experience and strengthening security infrastructure. On the downside, federated identity does introduce complexity into application architecture. This complexity can also introduce new attack surfaces, but on balance, properly implemented federated identity is a net improvement to application security. In general, we can see federated identity as improving convenience and security at the cost of complexity. ... Federated single sign-on allows for sharing credentials across enterprise boundaries. As such, it usually relies on a large, well-established entity with widespread security credibility, such as Google, Microsoft, or Amazon. In this case, applications gain not just a simplified login experience for their users but also the credibility of, and actual reliance on, high-level security infrastructure. Put another way, even a small application can add “Sign in with Google” to its login flow relatively easily, giving users a simple login option while keeping sensitive information in the hands of the big organization.
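On the application side, adding "Sign in with Google" largely reduces to verifying the ID token Google issues. A minimal backend sketch using the google-auth library follows; the client ID is a placeholder for your app's own OAuth client ID:

```python
from google.oauth2 import id_token
from google.auth.transport import requests

# Placeholder: your application's OAuth 2.0 client ID from Google Cloud.
CLIENT_ID = "1234567890-example.apps.googleusercontent.com"

def verify(token: str) -> dict:
    """Verify a Google-issued ID token and extract the federated identity.
    verify_oauth2_token checks the signature, expiry, and audience, and
    raises ValueError if the token is invalid."""
    claims = id_token.verify_oauth2_token(token, requests.Request(), CLIENT_ID)
    # 'sub' is Google's stable user identifier; use it as the federated key,
    # since emails can change.
    return {"user_id": claims["sub"], "email": claims.get("email")}
```

The application never sees the user's password; it only trusts a signed assertion from the identity provider, which is the core of the federated model described above.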


Millions of PC Motherboards Were Sold With a Firmware Backdoor

Given the millions of potentially affected devices, Eclypsium’s discovery is “troubling,” says Rich Smith, who is the chief security officer of supply-chain-focused cybersecurity startup Crash Override. Smith has published research on firmware vulnerabilities and reviewed Eclypsium’s findings. He compares the situation to the Sony rootkit scandal of the mid-2000s. Sony had hidden digital-rights-management code on CDs that invisibly installed itself on users’ computers and in doing so created a vulnerability that hackers used to hide their malware. “You can use techniques that have traditionally been used by malicious actors, but that wasn’t acceptable, it crossed the line,” Smith says. “I can’t speak to why Gigabyte chose this method to deliver their software. But for me, this feels like it crosses a similar line in the firmware space.” Smith acknowledges that Gigabyte probably had no malicious or deceptive intent in its hidden firmware tool. But by leaving security vulnerabilities in the invisible code that lies beneath the operating system of so many computers, it nonetheless erodes a fundamental layer of trust users have in their machines. 


Minimising the Impact of Machine Learning on our Climate

There are several things we can do to mitigate the negative impact of software on our climate. They will be different depending on your specific scenario, but what they all have in common is that they should strive to be energy-efficient, hardware-efficient and carbon-aware. GSF is gathering patterns for different types of software systems; these have all been reviewed by experts and agreed on by all member organisations before being published. In this section we will cover some of the patterns for machine learning as well as some good practices which are not (yet?) patterns. If we divide the actions according to the ML life cycle, or at least a simplified version of it, we get four categories: Project Planning, Data Collection, Design and Training of ML model and finally, Deployment and Maintenance. The project planning phase is the time to start asking the difficult questions: think about what the carbon impact of your project will be and how you plan to measure it. This is also the time to think about your SLA; overcommitting to strict latency or performance metrics that you actually don't need can quickly become a source of emissions you could avoid.
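One concrete practice for the design-and-training phase is simply measuring a training run's footprint so the planning estimates are grounded in data. A minimal sketch using the codecarbon library, where train_model() is a stand-in for your actual training loop:

```python
from codecarbon import EmissionsTracker  # third-party: pip install codecarbon

def train_model():
    # Placeholder for your actual training loop.
    pass

tracker = EmissionsTracker(project_name="demo-model")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
    print(f"training emitted ~{emissions_kg:.4f} kg CO2eq")
```

Tracking every run makes it possible to compare architectures or hyperparameter settings on emissions as well as accuracy, which is what carbon-aware design asks for.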


5 ways AI can transform compliance

Compliance is all about controls. Data must be classified according to multiple rules, and the movement of and access to that data recorded. It’s the perfect task for AI. Ville Somppi, vice president of industry solutions at M-Files, says: “Thanks to AI, organisations can automatically classify information and apply pre-defined compliance rules. In the case of choosing the right document category from a compliance perspective, the AI can be trained quickly with a small sample set categorised by people. This is convenient, especially when people can still correct wrong suggestions in the beginning of the learning process.” ... Data pools are too big for humans to comb through. AI is the only way. In some sectors, adoption of AI has been delayed owing to regulatory issues. However, full deployment ought now to be possible. Gabriel Hopkins, chief product officer at Ripjar, says: “Banks and financial services companies face complex responsibilities when it comes to compliance activities, especially with regard to combatting the financing of terrorism and preventing the laundering of criminal proceeds.”
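A toy version of training a document classifier quickly from a small human-labeled sample might look like the sketch below; the texts, labels, and model choice are invented for illustration, and in practice humans would keep correcting the model's suggestions as Somppi describes:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A handful of human-labeled documents bootstraps the classifier.
texts = [
    "Customer agreement governed by the laws of England",
    "Invoice number 4411, payment due in 30 days",
    "Employee handbook: leave policy and working hours",
    "Master services agreement, limitation of liability",
]
labels = ["contract", "invoice", "hr_policy", "contract"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

# Classify a new document; corrected mistakes are added back as new labels.
print(clf.predict(["Statement of account, amount payable within 14 days"]))
```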


Former Uber CSO Sullivan on Engaging the Security Community

CISO is a lonely role. There's a really amazing camaraderie between security executives that I'm not sure exists in any other kind of leadership role. The CISO role is pretty new compared to the other leadership roles. It's far from settled what kind of background is ideal for the role. It's far from settled where the person in the role should report. It’s far from settled what kind of a budget you're going to get. It's far from settled in terms of what type of decision-making power you're going to have. So, as a result, I think security leaders often feel lonely and on an island. They have an executive team above them that expects them to know all the answers about security, and then they have a team underneath them that expects them to know all the answers about security. So, they can't betray ignorance to anybody without undermining their role. And so, the security leader community often turns to each other for support, for guidance. There are a good number of Slack channels and conferences that are just CISOs talking through the role and asking for best practices and advice on how to deal with hard situations.


Google Drive Deficiency Allows Attackers to Exfiltrate Workspace Data Without a Trace

Mitiga reached out to Google about the issue, but the researchers said they have not yet received a response, adding that Google's security team typically doesn't recognize forensics deficiencies as a security problem. This highlights a concern when working with software-as-a-service (SaaS) and cloud providers, in that organizations that use their services "are solely dependent on them regarding what forensic data you can have," Aspir notes. "When it comes to SaaS and cloud providers, we’re talking about a shared responsibility regarding security because you can't add additional safeguards within what is given." ... Fortunately, there are steps that organizations using Google Workspace can take to ensure that the issue outlined by Mitiga isn't exploited, the researchers said. This includes keeping an eye out for certain actions in their Admin Log Events feature, such as events about license assignments and revocations, they said.
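For teams that want to watch those admin events programmatically rather than in the console, a sketch against the Workspace Admin Reports API follows. It assumes 'creds' is an already-authorized delegated credential with the audit readonly scope, and the filters shown are illustrative:

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

def recent_admin_events(creds):
    """Print recent admin-console audit events (e.g., license changes).
    'creds' must carry the admin.reports.audit.readonly scope."""
    service = build("admin", "reports_v1", credentials=creds)
    response = service.activities().list(
        userKey="all",
        applicationName="admin",  # admin console audit events
        maxResults=50,
    ).execute()
    for activity in response.get("items", []):
        for event in activity.get("events", []):
            print(activity["id"]["time"], event.get("name"))
```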


How defense contractors can move from cybersecurity to cyber resilience

We’re thinking way too small about a coordinated cyberattack’s capacity for creating major disruption to our daily lives. One recent, vivid illustration of that fact happened in 2022, when the Russia-linked cybercrime group Conti launched a series of prolonged attacks on the core infrastructure of the country of Costa Rica, plunging the country into chaos for months. Over a period of two weeks, Conti tried to breach different government organizations nearly every day, targeting a total of 27 agencies. Soon after that, the group launched a separate attack on the country’s health care system, causing tens of thousands of appointments to be canceled and patients to experience delays in getting treatment. The country declared a national emergency and eventually, with the help of allies around the world including the United States and Microsoft, regained control of its systems. The US federal government’s strict compliance standards often impede businesses from excelling beyond the most basic requirements. 



Quote for the day:

"Uncertainty is not an indication of poor leadership; it underscores the need for leadership." -- Andy Stanley

Daily Tech Digest - November 29, 2022

Cloud-Native Goes Mainstream as CFOs Seek to Monitor Costs

There's interest from the CFO organization in third-party tools for cloud cost management and optimization that can give them a vendor-neutral tool, especially in multicloud environments, according to Forrester analyst Lee Sustar. "The cost management tools from cloud providers are generally fine for tactical decisions on spending but do not always provide the higher level views that the CFO office is looking for," he added. As organizations move to a cloud-native strategy, Sustar said the initiative will often come from the IT enterprise architects and the CTO organization, with backing from the office of the CIO. "Partners of various sorts are often needed in the shift to cloud-native, as they help generalize the lessons from the early adopters," he noted. "Today, organizations new to the cloud are focused not on lifting and shifting existing workloads alone, but modernizing on cloud-native tech." Multicloud container platform vendors offer a more integrated approach that can be tailored to different cloud providers, Sustar added.


Financial services increasingly targeted for API-based cyberattacks

APIs are a core part of how financial services firms are changing their operations in the modern era, Akamai said, given the growing desire for more and more app-based services among the consumer base. The pandemic merely accelerated a growing trend toward remote banking services, which led to a corresponding growth in the use of APIs. With every application and every standardization of how various app functions talk to one another, which creates APIs, the potential target surface for an attacker increases, however. Only high-tech firms and e-commerce companies were more heavily targeted via API exploits than the financial services industry. “Once attackers launch web applications attacks successfully, they could steal confidential data, and in more severe cases, gain initial access to a network and obtain more credentials that could allow them to move laterally,” the report said. “Aside from the implications of a breach, stolen information could be peddled in the underground or used for other attacks. This is highly concerning given the troves of data, such as personal identifiable information and account details, held by the financial services vertical.”


The future of cloud computing in 2023

Gartner research estimates that we exceeded one billion knowledge workers globally in 2019. These workers are defined as those who need to think creatively and deliver conclusions for strategic impact. These are the very people that cloud technology was designed to facilitate. Cloud integrations in many cases can be hugely advanced and mature from an operational standpoint. Businesses have integrated multi-cloud solutions, containerization and continuously learning AI/ML algorithms to deliver truly cutting-edge results, but those results are often not delivered at the scale or speed necessary to make split-second decisions needed to thrive in today’s operating environment. For cloud democratization to be successful, companies need to upskill their knowledge workers and upskill them with the right tools needed to deliver value from cloud analytics. Low-code and no-code tools reduce the experiential hurdle needed to deliver value from in-cloud data, whilst simultaneously delivering on the original vision of cloud technology — giving people the power they need to have their voices heard.


What Makes BI and Data Warehouses Inseparable?

Every effective BI system has a potent DWH at its core. That's because a data warehouse is a platform used to centrally gather, store, and prepare data from many sources for later use in business intelligence and analytics. Consider it a single repository for all the data needed for BI analyses. Historical and current data are kept structured in a data analytics DWH, ideal for sophisticated querying. Once connected, it produces reports with forecasts, trends, and other visualizations that support practical insights using business intelligence tools. ETL (extract, transform, and load) tools, a DWH database, DWH access tools, and reporting layers are all parts of the business analytics data warehouse. These technologies are available to speed up the data science procedure and reduce or completely do away with the requirement for writing code to handle data pipelines. The ETL tools assist in data extraction from source systems, format conversion, and data loading into the DWH. Structured data for reporting is stored and managed by the database component.
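A minimal Python sketch of that ETL flow, using a local SQLite file to stand in for the warehouse (the file, table, and column names are placeholders):

```python
import sqlite3
import pandas as pd

# Extract: read raw orders from a source system export.
raw = pd.read_csv("orders.csv")

# Transform: normalize types and aggregate to the grain BI tools will query.
raw["order_date"] = pd.to_datetime(raw["order_date"])
daily = (raw.groupby(raw["order_date"].dt.date)["amount"]
            .sum()
            .reset_index(name="revenue"))

# Load: write the prepared table into the "warehouse" for reporting.
with sqlite3.connect("warehouse.db") as conn:
    daily.to_sql("daily_revenue", conn, if_exists="replace", index=False)
```

A BI tool pointed at daily_revenue can now chart trends and forecasts without touching the raw source systems, which is exactly the separation the DWH provides.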


Covering Data Breaches in an Ethical Way

Ransomware and extortion groups usually publicly release stolen data if a victim doesn't pay. In many cases, the victim organization hasn't publicly acknowledged it has been attacked. Should we write or tweet about that? ... These are victims of crime, and not every organization handles these situations well, but the media can make it worse. Are there exceptions to this rule? Sure. If an organization hasn't acknowledged an incident but numerous media outlets have published pieces, then the incident could be considered public enough. But many people tweet or write stories about victims as soon as their data appears on a leak site. I think that is unfair and plays into the attackers' hands, increasing pressure on victims. ... Using leaked personal details to contact people affected by a data breach is a touchy area. I only do this in very limited circumstances. I did it with one person in the Optus breach. The reason was that at that point there were doubts about whether the data had originated with Optus. The person also lived down the road from me, so I could talk to them in person.


EU Council adopts the NIS2 directive

NIS2 will set the baseline for cybersecurity risk management measures and reporting obligations across all sectors that are covered by the directive, such as energy, transport, health and digital infrastructure. The revised directive aims to harmonise cybersecurity requirements and implementation of cybersecurity measures in different member states. To achieve this, it sets out minimum rules for a regulatory framework and lays down mechanisms for effective cooperation among relevant authorities in each member state. It updates the list of sectors and activities subject to cybersecurity obligations and provides for remedies and sanctions to ensure enforcement. The directive will formally establish the European Cyber Crises Liaison Organisation Network, EU-CyCLONe, which will support the coordinated management of large-scale cybersecurity incidents and crises. While under the old NIS directive member states were responsible for determining which entities would meet the criteria to qualify as operators of essential services, the new NIS2 directive introduces a size-cap rule as a general rule for identification of regulated entities.


Cybersecurity: How to do More for Less

When assessing your existing security stack, several important questions need to be asked: Are you getting the most out of your tools? How are you measuring their efficiency and effectiveness? Are any tools dormant? And how much automation is being achieved? The same should be asked of your IT stack–is there any bloat and technical debt? Across your IT and security infrastructure, there are often unnecessary layers of complexity in processes, policies and tools that can lead to waste. For example, having too many tools leads to high maintenance and configuration overheads, draining both resources and money. Similarly, technologies that combine on-premises infrastructure and third-party cloud providers require complex management and processes. IT and cybersecurity teams, therefore, need to work together with a clear shared vision to find ways to drive efficiency without reducing security. This requires clarity over roles and responsibilities between security and IT teams for asset management and deployment of security tools. It sounds straightforward but often is not, due to historic approaches to tool rollout.


Being Agile - A Success Story

To better understand the Agile methodology and its concepts, it is crucial to first understand Waterfall, another well-known Software Development Life Cycle (SDLC) methodology. Waterfall is a strict, linear approach to software development that aims to deliver the whole project as one significant outcome. Agile, by contrast, is an iterative method that delivers results in short intervals and relies on a feedback loop to drive the next iteration of work. The diagram below describes other significant differences between these methodologies. In Waterfall, we define and fix the scope and estimate the resources and time to complete the task. In Agile, the time and resources are fixed (called an "iteration"), and the work is estimated for every iteration. Agile helps estimate and evaluate the work that brings value to the product and the stakeholders. It is always a topic of debate as to which methodology to use for a project: some projects are better managed with Waterfall, while others are an excellent fit for Agile.


User Interface Rules That Should Never Be Overlooked

The most important user interface design rule that should never be overlooked is the rule of clarity. Clarity is critical when it comes to user interfaces, says Zeeshan Arif, founder and CEO of Whizpool, a software and website development company. “When you're designing an interface, you need to make sure your users understand what they can do at all times,” Arif advises. This means making sure that buttons are correctly labeled and that there aren't any unexpected changes or surprises that might confuse users. “If a button says ‘delete’, then it should delete whatever it's supposed to delete -- and only that thing,” he says. “If you have a button that does something else, then either make it a different color or label it differently, but don't put in something that looks like a delete button but doesn't actually delete anything.” Don't perplex users by designing a user interface crammed with superfluous options and/or features. “If you have too many buttons on one page, and none of them are labeled well enough for someone who isn't familiar with them, [users will] probably just give up before they even get started using your product, service, app, or website,” Arif says.
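As a small illustration of the clarity rule, here is a hedged Tkinter sketch in which the destructive button is labeled with exactly what it does, looks visibly different from other actions, and confirms before acting; the window title, labels, and colors are illustrative assumptions.

```python
# A sketch of a clearly labeled destructive action; all names are illustrative.
import tkinter as tk
from tkinter import messagebox

def delete_report():
    # Confirm before acting, so the button never surprises the user.
    if messagebox.askyesno("Confirm", "Delete the selected report?"):
        messagebox.showinfo("Done", "Report deleted.")

root = tk.Tk()
root.title("Reports")
# The destructive button says exactly what it does and is styled distinctly.
tk.Button(root, text="Delete report", fg="white", bg="#c0392b",
          command=delete_report).pack(padx=20, pady=20)
root.mainloop()
```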


6 non-negotiable skills for CIOs in 2023

CIOs need to think about both internal integrations and external opportunities. They need to have strong relationships and be able to pull the business leaders together. For example, I’m working with an entrepreneurial organization that runs different lines of business, each very strong, with heads of those businesses who are also very strong. One of their challenges, however, is that their clients can be customers of multiple businesses. Between the seams, the client experiences the organizational structure of the business, which is a problem – a client should never experience your organizational structure. The person best equipped to identify and close those seams and integration points is the CIO. ... In the past, most organizations operated with a business group that sat between technology and the clients. The movement around agile, however, has knocked those walls down and today allows IT to become client-obsessed – we’re cross-functional teams that are empowered and organized around business and client outcomes. As a CIO, you need to spend time with clients and have a strong internal mission, too. You have to develop great leaders and motivate and engage an entire organization.



Quote for the day:

"A leader has the vision and conviction that a dream can be achieved._ He inspires the power and energy to get it done." -- Ralph Nader

Daily Tech Digest - November 25, 2022

Ripe For Disruption: Artificial Intelligence Advances Deeper Into Healthcare

The challenges and changes needed to advance AI go well beyond technology considerations. “With data and AI entering in healthcare, we are dealing with an in-depth cultural change, that will not happen overnight,” according to Pierron-Perlès and her co-authors. “Many organizations are developing their own acculturation initiatives to develop the data and AI literacy of their resources in formats that are appealing. AI goes far beyond technical considerations.” There has been great concern about too much AI de-humanizing healthcare. But AI, once carefully considered and planned, may prove to augment human care. “People, including providers, imagine AI will be cold and calculating without consideration for patients,” says Garg. “Actually, AI-powered automation for healthcare operations frees clinicians and others from the menial, manual tasks that prevent them from focusing all their attention on patient care. While other AI-based products can predict events, the most impactful are incorporated into workflows in order to resolve issues and drive action by frontline users.”


Extinguishing IT Team Burnout Through Mindfulness and Unstructured Time

Mindfulness is fundamentally about awareness. For it to grow, begin by observing your state of mind, especially when you find yourself in a stressful situation. Instead of fighting emotions, observe your mental state as negative ones arise. Think about how you’d conduct a deep root cause analysis on an incident and apply that same rigor to yourself. The key to mindfulness is paying attention to your reaction to events without judgment. This can unlock a new way of thinking because it accepts your reaction while still enabling you to do what is required for the job. This contrasts with being stuck behind frustration or avoiding new work as it rolls in. ... Mindfulness is an individual pursuit, while creativity is an enterprise pursuit, and providing space for employees to be creative is another key to preventing burnout. But there are other benefits as well: there is a direct correlation between creativity and productivity. Teams that spend all their time working on specific processes and problems struggle to develop creative solutions that could move a company forward.


Overcoming the Four Biggest Barriers to Machine Learning Adoption

Some businesses hit the first hurdles of adopting AI and ML before they even begin. Machine learning is a vast field that pervades most aspects of AI and paves the way for a wide range of potential applications, from advanced data analytics and computer vision to Natural Language Processing (NLP) and Intelligent Process Automation (IPA). A general rule of thumb for selecting a suitable ML use case is to “follow the money”, in addition to the usual recommendations on framing the business goals – what companies expect machine learning to do for their business, such as improving products or services, improving operational efficiency, and mitigating risk. ... The biggest obstacle to deploying AI-related technologies is corporate culture. Top management is often reluctant to take investment risks, and employees worry about losing their jobs. To assure stakeholder and employee buy-in, businesses must start with small-scale ML use cases that demand realistic investments, achieve quick wins, and persuade executives. By providing workshops, corporate training, and other incentives, they can promote innovation and digital literacy.


Fixing Metadata’s Bad Definition

A bad definition has practical implications. It makes misunderstandings much more likely, which can infect important processes such as data governance and data modeling. Thinking about this became an annoying itch that I couldn’t scratch. What follows is my thought process working toward a better understanding of metadata and its role in today’s data landscape. The problem starts with language: our lexicon hasn’t kept up with modern data’s complexity and nuance. There are three main issues with our current discourse about metadata. Vague language: we talk about data in terms of “data” or “metadata”, but one category encompasses the other, which makes it very difficult to differentiate between them; these broad, self-referencing terms leave the door open to being interpreted differently by different people. A gap in data taxonomy: we don’t have a name for the category of data that metadata describes, which creates a gap at the top of our data taxonomy that needs to be filled with a name for the data that metadata refers to. Metadata is contextual: the same data set can be both metadata and not metadata depending on the context, so we need to treat metadata as a role that data can play rather than a fixed category.
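A tiny sketch can make that last point concrete: the same records play the role of data in one query and of metadata in another. The dataset and field names below are illustrative assumptions.

```python
# The same table of file records serves two roles; all names are illustrative.
file_records = [
    {"name": "sales_2022.csv", "owner": "finance", "rows": 10_000},
    {"name": "staff.csv", "owner": "hr", "rows": 250},
]

# Role 1: data. A file-inventory report treats the records as the subject itself.
large_files = [r["name"] for r in file_records if r["rows"] > 1_000]

# Role 2: metadata. The very same records describe sales_2022.csv, the dataset
# an analyst is actually interested in.
about_sales = next(r for r in file_records if r["name"] == "sales_2022.csv")
print(f"sales_2022.csv is owned by {about_sales['owner']}")
```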


Addressing Privacy Challenges in Retail Media Networks

The top reason that consumers cite for mistrusting how companies handle their data is a lack of transparency. Customers know at this point that companies are collecting their data. And many of these customers won’t mind that you’re doing it, as long as you’re upfront about your intentions and give them a clear choice about whether they consent to have their data collected and shared. What’s more, recent privacy laws have increased the need for companies to shore up data security or face the consequences. In the European Union, there’s the General Data Protection Regulation (GDPR). In the U.S., laws vary by state, but California currently has the most restrictive policies thanks to the California Consumer Privacy Act (CCPA). Companies that have run afoul of these laws have incurred fines as big as $800 million. Clearly, online retailers that already have – or are considering implementing – a retail media network should take notice and reduce their reliance on third-party data sources that may cause trouble from a compliance standpoint.
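As a rough illustration of that consent-first posture, the sketch below filters first-party profiles down to those who explicitly opted in before anything reaches a retail media integration; the field names and the downstream function are assumptions, not a real API.

```python
# A hedged sketch of consent gating; records and the downstream call are assumed.
customers = [
    {"id": 1, "email": "a@example.com", "ads_consent": True},
    {"id": 2, "email": "b@example.com", "ads_consent": False},
]

def share_with_media_network(audience):
    # Placeholder for the real integration with a retail media platform.
    print(f"Sharing {len(audience)} consenting profiles")

# Only customers who explicitly opted in are ever passed downstream,
# which is the transparency-and-choice posture GDPR and the CCPA expect.
audience = [c for c in customers if c["ads_consent"]]
share_with_media_network(audience)
```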


For Gaming Companies, Cybersecurity Has Become a Major Value Proposition

Like any other vertical industry, gaming companies are tasked with protecting their organizations from all manner of cybersecurity threats to their business. Many of them are large enterprises with the same concerns for the protection of internal systems, financial platforms, and employee endpoints as any other firm. "Gaming companies have the same responsibility as any other organization to protect customer privacy and preserve shareholder value. While not specifically regulated like hospitals or critical infrastructure, they must comply with laws like the GDPR and CCPA," explains Craig Burland, CISO for Inversion6, a managed security service provider and fractional CISO firm. "Threats to gaming companies also follow similar trends seen in other segments of the economy — intellectual property (IP) theft, credential theft, and ransomware." IP issues are heightened for these firms, as for many in the broader entertainment category, since content leaks for highly anticipated new games or updates can give a brand a black eye at best and, at worst, hit them more directly in the financials.


Driving value from data lake and warehouse modernisation

To achieve this, Data Lakes and Data Warehouses need to grow alongside the business requirements in order to be kept efficient and up to date. Go Reply is a leading Google Cloud Platform service integrator (SI) that is helping companies across multiple sectors along this vital journey. Part of the Reply Group, Go Reply is a Google Cloud Premier Partner focusing on areas including Cloud Strategy and Migration; Big Data; Machine Learning; and Compliance. With Data Modernisation capabilities in the GCP environment constantly evolving, businesses can become overwhelmed and unsure of not only the next steps, but more importantly the next steps for them, particularly if they don't have in-house Google expertise. Companies often need to use Data Lakes and Data Warehouses simultaneously, so guidance on how to do this, and on driving value from both kinds of storage, is vital. The Go Reply leadership team advises that Google Cloud Platform, the hyperscale cloud of choice for these workloads, brings technology for Data Lake and Data Warehouse efficiency, along with security superior to other market offerings.


Three tech trends on the verge of a breakthrough in 2023

The second big trend is around virtual reality, augmented reality and the metaverse. Big tech has been spending big here, and there are some suggestions that the basic technology is reaching a tipping point, even if the broader metaverse business models are, at best, still in flux. Headset technologies are starting to coalesce and the software is getting easier to use. But the biggest issue is that consumer interest and trust are still low, if only because the science fiction writers got there long ago with their dystopian view of a headset future. Building that consumer trust and explaining why people might want to engage is just as high a priority as the technology itself. One technology trend that's perhaps closer, even though we can't see it, is ambient computing. The concept has been around for decades: the idea is that we don't need to carry tech with us because the intelligence is built into the world around us, from smart speakers to smart homes. Ambient computing is designed to vanish into the environment around us – which is perhaps why it's a trend that has remained invisible to many, at least until now.


CIOs beware: IT teams are changing

The role of IT is shifting to be more strategy-oriented, innovative, and proactive. No longer can days be spent responding to issues – instead, issues must be addressed before they impact employees, and solutions should be developed to ensure they don’t return. What does this look like? Rather than waiting for an employee to flag an issue within their system – such as recurring issues with connectivity, slow computer start time, etc. – IT can identify potential threats to workflows before they happen. They plug the holes, then they establish a strategy and framework to avoid the problem entirely in the future. In short, IT plays a critical role in successful workplace flow in both a proactive and reactive way. For those looking to start a career in IT, the onus falls on them to make suggestions and changes that look holistically at the organization and how employees interact within it. IT teams are making themselves strategic assets by thinking through how to make things more efficient and cost-effective in the long term.


A Comprehensive List of Agile Methodologies and How They Work

Extreme Programming (or XP) offers some of the best buffers against unexpected changes or late-stage customer demands. Feedback gathering takes place within sprints and from the very start of development, and it's this feedback that informs everything. This means the entire team becomes accustomed to a culture of pivoting on real-world client demands and outcomes that would otherwise threaten to derail a project and seriously distort production lead times. Any organization with a client-based focus will understand the tightrope that can exist between external demands and internal resources. Continuously orienting those resources based on external demands as they appear is the single most efficient way to achieve harmony. This is something that XP does organically once integrated into your development culture. ... Trimming the fat from the development process is what this method is all about. If something doesn't add immediate value, or tasks within tasks seem to be piling up, the laser focus of Lean Development steps in.



Quote for the day:

"Confident and courageous leaders have no problems pointing out their own weaknesses and ignorance. " -- Thom S. Rainer