DATE: February 19, 2025
TIME: 8:00 AM – 12:45 PM Pacific / 11:00 AM – 3:45 PM Eastern – This event has concluded; a recording of the sessions and the slides will be sent to all registrants within a couple of business days.
PRICE: Free to all attendees
Welcome to DATAVERSITY Demo Day, an online exhibit hall. We’ve had many requests from our data-driven community for a vendor-driven online event that gives you an opportunity to learn, directly from the vendors, about the tools and services that could contribute to your Data Governance program’s success.
Register to attend one or all of the sessions in order to receive the follow-up email with links to the slides and recordings.


Session 1: Ensuring Data Quality in Streaming Data Pipelines at Scale
Maintaining high-quality data is crucial in today’s fast-paced, data-driven world. As more organizations leverage real-time streaming data for use cases with low-latency processing requirements, ensuring data quality becomes increasingly challenging.
Join us for a live demo to discover how IBM’s data integration capabilities can help you streamline data quality and governance for streaming data by ingesting data from a wide range of streaming sources, executing in-flight transformations, continuously and proactively monitoring data pipelines, and quickly adapting to data and schema changes. You’ll also see how IBM’s streaming data integration and data observability capabilities can ensure the health of streaming data pipelines, enabling reliable data delivery.
Learn how to:
- Process streaming data of any format, including structured, unstructured, and semi-structured data
- Detect changes in data distribution, schema, or quality in real time, and proactively adjust the pipeline to ensure data integrity
- Get real-time visibility into workflows to identify potential issues before they impact downstream processes
- Manage high-volume and unpredictable data workloads with a highly scalable parallel processing engine
By ensuring data quality in your streaming data pipelines, you can trust your data to drive real-time decision-making and achieve better business outcomes.
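As a generic illustration of the idea behind the first two bullets above — validating records in flight and flagging schema drift before it reaches downstream consumers — here is a minimal sketch in Python. The schema, field names, and routing are hypothetical examples, not IBM's API or product behavior:

```python
# Minimal sketch of in-flight quality checks on a record stream.
# EXPECTED_SCHEMA, the field names, and the sample records are
# illustrative assumptions, not tied to any vendor product.

EXPECTED_SCHEMA = {"id": int, "temperature": float, "sensor": str}

def validate(record):
    """Return a list of quality issues found in a single record."""
    issues = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            issues.append(
                f"type drift in {field}: got {type(record[field]).__name__}"
            )
    for field in record:
        if field not in EXPECTED_SCHEMA:
            # A new field often signals an upstream schema change.
            issues.append(f"unexpected field: {field}")
    return issues

def monitor(stream):
    """Yield (record, issues) pairs so a pipeline can route bad records."""
    for record in stream:
        yield record, validate(record)

# One clean record and one record exhibiting schema/type drift:
stream = [
    {"id": 1, "temperature": 21.5, "sensor": "a1"},
    {"id": 2, "temperature": "hot", "sensor": "a2", "unit": "C"},
]
results = [issues for _, issues in monitor(stream)]
```

In a real streaming pipeline the `(record, issues)` pairs would feed an alerting or quarantine step rather than a list, but the check-as-the-data-flows pattern is the same.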
Session 2: Not Boring, Data Quality Scoring
Transform your data quality with AI-powered scorecards. Are you ready to become the hero your data deserves?
Join us for a fast-paced, game-changing discussion unveiling Data Quality Scorecards powered by our AI-driven Open Source DataOps Data Quality TestGen software. This is your chance to harness influence within your organization using a free, powerful tool that connects to your data, learns from it, scans for dozens of DQ issues, and prepares actionable results—all with just a click.
What You’ll Learn:
- Create Custom Scorecards for different data types, user groups, programs, and goals
- Simplify Compliance & Reporting with automated insights tailored for stakeholders
- Drive Immediate Action—generate an intelligent scorecard in just one hour
- Boost Your Influence by using AI-driven data quality improvements
Whether you’re a seasoned data pro or new to DataOps, this session will equip you with the ultimate cape, shield, and secret weapon for data quality success.
Don’t miss this opportunity to turn data challenges into triumphs! Join us and walk away with actionable insights to quickly elevate your organization’s data quality.
Session 3: Unified Data Reliability with iceDQ: Identify & Fix Data Issues Early
Ensuring data reliability across the entire data lifecycle is critical for any data project. iceDQ is the only unified platform that integrates data testing, data monitoring, and AI-based data observability, providing full coverage from DEV to QA to PROD. By combining Proactive Testing, Pre-Emptive Monitoring, and Predictive Observability, iceDQ enables organizations to detect and resolve data issues early, preventing costly errors and improving data quality.
From Data Pipeline Testing, ETL & Data Warehouse Testing, and Data Migration Testing to Big Data Lake Testing, BI Report Validation, and System Migration Testing, iceDQ ensures data reliability across environments. Join our session to explore how a unified approach to data reliability can help your organization identify and fix data issues before they impact production environments.