great_expectations.datasource.types

Package Contents

Classes

BatchKwargs()

dict() -> new empty dictionary

PandasDatasourceBatchKwargs()

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SparkDFDatasourceBatchKwargs()

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SqlAlchemyDatasourceBatchKwargs()

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

PathBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

S3BatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

InMemoryBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

PandasDatasourceInMemoryBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SparkDFDatasourceInMemoryBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SqlAlchemyDatasourceTableBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SqlAlchemyDatasourceQueryBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

SparkDFDatasourceQueryBatchKwargs(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch kwargs object is of an allowed type for a particular datasource.

BatchSpec()

dict() -> new empty dictionary

BatchMarkers(*args, **kwargs)

A BatchMarkers is a special type of BatchSpec (so that it has a batch_fingerprint), but it generally does NOT require specific keys; instead it captures information about the OUTPUT of a datasource's fetch process.

PandasDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's used to test whether a batch spec object is of an allowed type for a particular datasource.

SparkDFDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's used to test whether a batch spec object is of an allowed type for a particular datasource.

SqlAlchemyDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's used to test whether a batch spec object is of an allowed type for a particular datasource.

PathBatchSpec(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch spec object is of an allowed type for a particular datasource.

S3BatchSpec(*args, **kwargs)

This is an abstract class and should not be instantiated. It's used to test whether a batch spec object is of an allowed type for a particular datasource.

RuntimeDataBatchSpec(*args, **kwargs)

dict() -> new empty dictionary

DatasourceTypes()

Generic enumeration.

SupportedDatabases()

Generic enumeration.

class great_expectations.datasource.types.BatchKwargs

Bases: great_expectations.core.id_dict.IDDict

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object's (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:

    d = {}
    for k, v in iterable:
        d[k] = v

dict(**kwargs) -> new dictionary initialized with the name=value pairs in the keyword argument list. For example: dict(one=1, two=2)
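
Because BatchKwargs is an IDDict, it behaves like a plain dictionary while also supporting a stable fingerprint for the batch it describes. A minimal sketch; the example keys are illustrative, and the to_id method is assumed here from IDDict rather than documented on this page:

    from great_expectations.datasource.types import BatchKwargs

    # BatchKwargs is a dict subclass, so it accepts the usual dict constructors.
    kwargs = BatchKwargs(path="/data/titanic.csv", datasource="my_pandas_source")
    assert kwargs["path"] == "/data/titanic.csv"

    # As an IDDict, it can also produce a stable fingerprint identifying
    # this batch definition (to_id is an assumption based on IDDict).
    print(kwargs.to_id())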

exception great_expectations.datasource.types.InvalidBatchKwargsError(message)

Bases: great_expectations.exceptions.exceptions.GreatExpectationsError

Raised when batch kwargs are missing required keys or are otherwise invalid.

great_expectations.datasource.types.logger
class great_expectations.datasource.types.PandasDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SparkDFDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SqlAlchemyDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property limit(self)
property schema(self)
class great_expectations.datasource.types.PathBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property path(self)
property reader_method(self)
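
The dual inheritance above is what makes the marker classes useful: an isinstance check can confirm that a path-style batch is acceptable to either a pandas or a Spark datasource. A hedged sketch; the path and reader method values are illustrative, and passing a mapping with a "path" key is an assumption based on the documented path property:

    from great_expectations.datasource.types import (
        PandasDatasourceBatchKwargs,
        PathBatchKwargs,
        SparkDFDatasourceBatchKwargs,
    )

    batch_kwargs = PathBatchKwargs(
        {"path": "/data/events.csv", "reader_method": "read_csv"}
    )

    # PathBatchKwargs subclasses both marker types, so it passes either check.
    assert isinstance(batch_kwargs, PandasDatasourceBatchKwargs)
    assert isinstance(batch_kwargs, SparkDFDatasourceBatchKwargs)

    # The documented properties expose the underlying dict keys.
    print(batch_kwargs.path, batch_kwargs.reader_method)

S3BatchKwargs, PathBatchSpec, and S3BatchSpec follow the same pattern with their respective keys.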
class great_expectations.datasource.types.S3BatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property s3(self)
property reader_method(self)
class great_expectations.datasource.types.InMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property dataset(self)
class great_expectations.datasource.types.PandasDatasourceInMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.InMemoryBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SparkDFDatasourceInMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.InMemoryBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SqlAlchemyDatasourceTableBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SqlAlchemyDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property table(self)
class great_expectations.datasource.types.SqlAlchemyDatasourceQueryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SqlAlchemyDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property query(self)
property query_parameters(self)
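
A sketch of the two SqlAlchemy batch kwargs flavors, assuming the documented properties (table, query, query_parameters) map directly to dict keys of the same names; the table name, SQL text, and parameter values are illustrative:

    from great_expectations.datasource.types import (
        SqlAlchemyDatasourceQueryBatchKwargs,
        SqlAlchemyDatasourceTableBatchKwargs,
    )

    # Table-style: identify a batch by table name.
    table_kwargs = SqlAlchemyDatasourceTableBatchKwargs(table="events")
    print(table_kwargs.table)

    # Query-style: identify a batch by a (possibly parameterized) query.
    query_kwargs = SqlAlchemyDatasourceQueryBatchKwargs(
        query="SELECT * FROM events WHERE ds = :ds",
        query_parameters={"ds": "2021-01-01"},
    )
    print(query_kwargs.query, query_kwargs.query_parameters)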
class great_expectations.datasource.types.SparkDFDatasourceQueryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch kwargs object is of an allowed type for a particular datasource (for example, via an isinstance check).

property query(self)
class great_expectations.datasource.types.BatchSpec

Bases: great_expectations.core.id_dict.IDDict

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object's (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:

    d = {}
    for k, v in iterable:
        d[k] = v

dict(**kwargs) -> new dictionary initialized with the name=value pairs in the keyword argument list. For example: dict(one=1, two=2)

exception great_expectations.datasource.types.InvalidBatchIdError(message)

Bases: great_expectations.exceptions.exceptions.GreatExpectationsError

Raised when a batch id is invalid.

exception great_expectations.datasource.types.InvalidBatchSpecError(message)

Bases: great_expectations.exceptions.exceptions.GreatExpectationsError

Raised when a batch spec is missing required keys or is otherwise invalid.

class great_expectations.datasource.types.BatchMarkers(*args, **kwargs)

Bases: great_expectations.core.id_dict.BatchSpec

A BatchMarkers is a special type of BatchSpec (so that it has a batch_fingerprint) but it generally does NOT require specific keys and instead captures information about the OUTPUT of a datasource’s fetch process, such as the timestamp at which a query was executed.

property ge_load_time(self)
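
A sketch of building markers by hand, assuming a "ge_load_time" key backs the documented property; in normal use a datasource populates this during its fetch process, and the timestamp format shown is illustrative:

    import datetime

    from great_expectations.datasource.types import BatchMarkers

    markers = BatchMarkers(
        {
            "ge_load_time": datetime.datetime.now(datetime.timezone.utc).strftime(
                "%Y%m%dT%H%M%S.%fZ"
            )
        }
    )
    print(markers.ge_load_time)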
class great_expectations.datasource.types.PandasDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch spec object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SparkDFDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch spec object is of an allowed type for a particular datasource (for example, via an isinstance check).

class great_expectations.datasource.types.SqlAlchemyDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch spec object is of an allowed type for a particular datasource (for example, via an isinstance check).

property limit(self)
property schema(self)
class great_expectations.datasource.types.PathBatchSpec(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_spec.PandasDatasourceBatchSpec, great_expectations.datasource.types.batch_spec.SparkDFDatasourceBatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch spec object is of an allowed type for a particular datasource (for example, via an isinstance check).

property path(self)
property reader_method(self)
class great_expectations.datasource.types.S3BatchSpec(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_spec.PandasDatasourceBatchSpec, great_expectations.datasource.types.batch_spec.SparkDFDatasourceBatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a batch spec object is of an allowed type for a particular datasource (for example, via an isinstance check).

property s3(self)
property reader_method(self)
class great_expectations.datasource.types.RuntimeDataBatchSpec(*args, **kwargs)

Bases: great_expectations.core.id_dict.BatchSpec

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object's (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:

    d = {}
    for k, v in iterable:
        d[k] = v

dict(**kwargs) -> new dictionary initialized with the name=value pairs in the keyword argument list. For example: dict(one=1, two=2)

_id_ignore_keys
property batch_data(self)
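
The _id_ignore_keys attribute suggests that batch_data is excluded when the spec's fingerprint is computed, which is sensible for arbitrary in-memory objects. A sketch, assuming batch_data is accepted as a constructor keyword; the DataFrame is illustrative:

    import pandas as pd

    from great_expectations.datasource.types import RuntimeDataBatchSpec

    df = pd.DataFrame({"a": [1, 2, 3]})
    batch_spec = RuntimeDataBatchSpec(batch_data=df)

    # The documented property returns the in-memory data that was passed in.
    assert batch_spec.batch_data is df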
class great_expectations.datasource.types.DatasourceTypes

Bases: enum.Enum

Generic enumeration.

Derive from this class to define new enumerations.

PANDAS = 'pandas'
SPARK = 'spark'
SQL = 'sql'
class great_expectations.datasource.types.SupportedDatabases

Bases: enum.Enum

Generic enumeration.

Derive from this class to define new enumerations.

MYSQL = 'MySQL'
POSTGRES = 'Postgres'
REDSHIFT = 'Redshift'
SNOWFLAKE = 'Snowflake'
BIGQUERY = 'BigQuery'
OTHER = 'other - Do you have a working SQLAlchemy connection string?'
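
Both enumerations behave like standard enum.Enum classes, so members can be looked up by value or iterated in the usual way:

    from great_expectations.datasource.types import DatasourceTypes, SupportedDatabases

    # Look up a member by its value.
    assert DatasourceTypes("pandas") is DatasourceTypes.PANDAS

    # Iterate all supported databases.
    for db in SupportedDatabases:
        print(db.name, "->", db.value)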