
Great Expectations

Always know what to expect from your data.

What is GX?

Great Expectations (GX) helps data teams build a shared understanding of their data through quality testing, documentation, and profiling.

Data practitioners know that testing and documentation are essential for managing complex data pipelines. GX makes it possible for data science and engineering teams to quickly deploy extensible, flexible data quality testing into their data stacks. Its human-readable documentation makes the results accessible to technical and nontechnical users.

See Down with Pipeline Debt! for an introduction to our philosophy of pipeline data quality testing.

Key features

Seamless operation

GX fits into your existing tech stack, and can integrate with your CI/CD pipelines to add data quality exactly where you need it. Connect to and validate your data wherever it already is, so you can focus on honing your Expectation Suites to perfectly meet your data quality needs.

Start fast

Get useful results quickly, even for large data volumes. GX’s Data Assistants provide curated Expectations for different domains, so you can accelerate your data discovery and rapidly deploy data quality throughout your pipelines. Auto-generated Data Docs ensure your data quality documentation is always up to date.

[Image: Data Assistant plot of expectations and metrics]

Unified understanding

Expectations are GX’s workhorse abstraction: each Expectation declares an expected state of the data. The Expectation library provides a flexible, extensible vocabulary for data quality—one that’s human-readable, meaningful for technical and nontechnical users alike. Bundled into Expectation Suites, Expectations are the ideal tool for characterizing exactly what you expect from your data.

  • expect_column_values_to_not_be_null
  • expect_column_values_to_match_regex
  • expect_column_values_to_be_unique
  • expect_column_values_to_match_strftime_format
  • expect_table_row_count_to_be_between
  • expect_column_median_to_be_between
  • ...and many more

Secure and transparent

GX doesn’t ask you to trade security for insight. It processes your data in place, on your systems, so your security and governance procedures maintain control at all times. And because GX’s core is and always will be open source, its complete transparency is the opposite of a black box.

Data contracts support

Checkpoints are a transparent, central, and automatable mechanism for testing Expectations and evaluating your data quality. Every Checkpoint run produces human-readable Data Docs reporting the results. You can also configure Checkpoints to take Actions based on the results of the evaluation, like sending alerts and preventing low-quality data from moving further in your pipelines.
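As a hedged sketch, a minimal Checkpoint configuration in GX's YAML format might look like the following; the datasource, asset, and suite names are hypothetical placeholders, and the exact keys depend on your GX version:

```yaml
name: my_checkpoint           # hypothetical Checkpoint name
config_version: 1.0
class_name: SimpleCheckpoint  # bundles validation with common Actions
validations:
  - batch_request:
      datasource_name: my_datasource   # placeholder datasource
      data_connector_name: default_inferred_data_connector_name
      data_asset_name: orders          # placeholder data asset
    expectation_suite_name: orders.warning  # placeholder suite
```

Running this Checkpoint validates the named batch against the named Expectation Suite, renders Data Docs, and fires any configured Actions.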

[Image: data contract support]

Readable for collaboration

Everyone stays on the same page about your data quality with GX’s inspectable, shareable, and human-readable Data Docs. You can publish Data Docs to the locations where you need them in a variety of formats, making it easy to integrate Data Docs into your existing data catalogs, dashboards, and other reporting and data governance tools.

[Image: Data Docs]

Quick start

To see Great Expectations in action on your own data:

You can install it using pip:

pip install great_expectations

and then run:

great_expectations init

(We recommend deploying within a virtual environment. If you’re not familiar with pip, virtual environments, notebooks, or git, you may want to check out the Supporting Resources, which will teach you how to get up and running in minutes.)

For full documentation, visit https://docs.greatexpectations.io/.

If you need help, hop into our Slack channel, where there are always contributors and other users around. You can also post in our GitHub Discussions.

Integrations

Great Expectations works with the tools and systems that you're already using with your data, including:

| Integration | Notes |
| --- | --- |
| DataHub | Data Catalog |
| AWS Glue | Data Integration |
| Athena | Data Source |
| AWS Redshift | Data Source |
| AWS S3 | Data Source |
| BigQuery | Data Source |
| Databricks | Data Source |
| Deepnote | Collaborative data notebook |
| Google Cloud Platform (GCP) | Data Source |
| Microsoft Azure Blob Storage | Data Source |
| Microsoft SQL Server | Data Source |
| MySQL | Data Source |
| Pandas | Data Source |
| PostgreSQL | Data Source |
| Snowflake | Data Source |
| Spark | Data Source |
| SQLite | Data Source |
| Trino | Data Source |
| Apache Airflow | Orchestrator |
| Flyte | Orchestrator |
| Meltano | Orchestrator |
| Prefect | Orchestrator |
| ZenML | Orchestrator |
| Slack | Plugin |
| Jupyter Notebooks | Utility |

What is GX not?

Great Expectations is not a pipeline execution framework. Instead, it integrates seamlessly with DAG execution tools like Spark, Airflow, dbt, Prefect, Dagster, Kedro, and Flyte. GX carries out your data quality pipeline testing while these tools execute the pipelines.

Great Expectations is not a database or storage software. It processes your data in place, on your existing systems. Expectations and Validation Results that GX produces are metadata about your data.

Great Expectations is not a data versioning tool. If you want to bring your data itself under version control, check out tools like DVC and Quilt.

Great Expectations is not a language-agnostic platform. Instead, it follows the philosophy of “take the compute to the data” by using the popular Python language to support native execution of Expectations in pandas, SQL (via SQLAlchemy), and Spark environments.

Great Expectations is not exclusive to Python programming environments. It can be invoked from the command line without a Python environment. However, if you’re working in another ecosystem, you may want to explore ecosystem-specific alternatives such as assertR (for R environments) or TFDV (for TensorFlow environments).

Who maintains Great Expectations?

Great Expectations OSS is under active development by GX Labs and the Great Expectations community.

What's the best way to get in touch with the Great Expectations team?

If you have questions, comments, or just want to have a good old-fashioned chat about data quality, please hop on our public Slack channel or post in our GitHub Discussions.

Can I contribute to the library?

Absolutely. Yes, please. Start here and please don't be shy with questions.
