great_expectations.datasource.types

Package Contents

Classes

BatchKwargs()
    A dict subclass (via IDDict); see the full class entry below.

PandasDatasourceBatchKwargs()
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SparkDFDatasourceBatchKwargs()
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SqlAlchemyDatasourceBatchKwargs()
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

PathBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

S3BatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

InMemoryBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

PandasDatasourceInMemoryBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SparkDFDatasourceInMemoryBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SqlAlchemyDatasourceTableBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SqlAlchemyDatasourceQueryBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

SparkDFDatasourceQueryBatchKwargs(*args, **kwargs)
    This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

DatasourceTypes()
    Generic enumeration.

SupportedDatabases()
    Generic enumeration.

class great_expectations.datasource.types.BatchKwargs

Bases: great_expectations.core.id_dict.IDDict

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object’s (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:
    d = {}
    for k, v in iterable:
        d[k] = v
dict(**kwargs) -> new dictionary initialized with the name=value pairs in the keyword argument list. For example: dict(one=1, two=2)
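BatchKwargs derives from IDDict, whose name suggests a dict that can compute a stable identifier from its own contents. The sketch below illustrates that idea with the standard library only; the `to_id` name and the hashing scheme (md5 over a sorted JSON dump) are assumptions for illustration, not necessarily great_expectations' actual implementation.

```python
import hashlib
import json


class IDDict(dict):
    """Illustrative ID-carrying dict: a plain dict that can derive a
    stable fingerprint from its contents (scheme is an assumption)."""

    def to_id(self) -> str:
        # Serialize with sorted keys so logically equal dicts
        # produce the same fingerprint regardless of insertion order.
        serialized = json.dumps(self, sort_keys=True)
        return hashlib.md5(serialized.encode("utf-8")).hexdigest()


class BatchKwargs(IDDict):
    pass


kw_a = BatchKwargs(path="/data/file.csv", reader_method="read_csv")
kw_b = BatchKwargs(reader_method="read_csv", path="/data/file.csv")
assert kw_a.to_id() == kw_b.to_id()  # insertion order does not matter
```

Because equal contents yield equal IDs, such a fingerprint can be used to recognize when two batch descriptions refer to the same batch of data.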

exception great_expectations.datasource.types.InvalidBatchKwargsError(message)

Bases: great_expectations.exceptions.exceptions.GreatExpectationsError

Common base class for all non-exit exceptions.

great_expectations.datasource.types.logger

class great_expectations.datasource.types.PandasDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

class great_expectations.datasource.types.SparkDFDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

class great_expectations.datasource.types.SqlAlchemyDatasourceBatchKwargs

Bases: great_expectations.core.id_dict.BatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property limit(self)
property schema(self)
class great_expectations.datasource.types.PathBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property path(self)
property reader_method(self)
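Note that PathBatchKwargs inherits from both PandasDatasourceBatchKwargs and SparkDFDatasourceBatchKwargs: a path-based batch can be consumed by either engine, and the abstract marker classes exist so code can test this with isinstance. The minimal mirror below is illustrative only; the property bodies are assumptions, and only the class names and bases come from the listing above.

```python
# Minimal mirror of the documented hierarchy (illustrative sketch).
class BatchKwargs(dict):
    pass


class PandasDatasourceBatchKwargs(BatchKwargs):
    """Marker: consumable by a pandas datasource."""


class SparkDFDatasourceBatchKwargs(BatchKwargs):
    """Marker: consumable by a Spark DataFrame datasource."""


class PathBatchKwargs(PandasDatasourceBatchKwargs, SparkDFDatasourceBatchKwargs):
    @property
    def path(self):
        return self.get("path")

    @property
    def reader_method(self):
        return self.get("reader_method")


kwargs = PathBatchKwargs(path="/data/events.csv", reader_method="read_csv")
# A path-based batch is acceptable to either datasource family:
assert isinstance(kwargs, PandasDatasourceBatchKwargs)
assert isinstance(kwargs, SparkDFDatasourceBatchKwargs)
assert kwargs.path == "/data/events.csv"
```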
class great_expectations.datasource.types.S3BatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property s3(self)
property reader_method(self)
class great_expectations.datasource.types.InMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.PandasDatasourceBatchKwargs, great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property dataset(self)
class great_expectations.datasource.types.PandasDatasourceInMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.InMemoryBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

class great_expectations.datasource.types.SparkDFDatasourceInMemoryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.InMemoryBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

class great_expectations.datasource.types.SqlAlchemyDatasourceTableBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SqlAlchemyDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property table(self)
class great_expectations.datasource.types.SqlAlchemyDatasourceQueryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SqlAlchemyDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property query(self)
property query_parameters(self)
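The two SQLAlchemy subclasses distinguish how a batch is located: by a table name or by a query with bound parameters. A hedged sketch of that split, again with property bodies assumed for illustration:

```python
class BatchKwargs(dict):
    pass


class SqlAlchemyDatasourceBatchKwargs(BatchKwargs):
    """Marker: consumable by a SQLAlchemy datasource."""


class SqlAlchemyDatasourceTableBatchKwargs(SqlAlchemyDatasourceBatchKwargs):
    @property
    def table(self):
        return self.get("table")


class SqlAlchemyDatasourceQueryBatchKwargs(SqlAlchemyDatasourceBatchKwargs):
    @property
    def query(self):
        return self.get("query")

    @property
    def query_parameters(self):
        return self.get("query_parameters")


by_table = SqlAlchemyDatasourceTableBatchKwargs(table="events")
by_query = SqlAlchemyDatasourceQueryBatchKwargs(
    query="SELECT * FROM events WHERE ds = :ds",
    query_parameters={"ds": "2020-01-01"},
)
assert by_table.table == "events"
assert by_query.query_parameters == {"ds": "2020-01-01"}
```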
class great_expectations.datasource.types.SparkDFDatasourceQueryBatchKwargs(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_kwargs.SparkDFDatasourceBatchKwargs

This is an abstract class and should not be instantiated. It’s relevant for testing whether a subclass is allowed

property query(self)
class great_expectations.datasource.types.DatasourceTypes

Bases: enum.Enum

Generic enumeration.

Derive from this class to define new enumerations.

PANDAS = pandas
SPARK = spark
SQL = sqlalchemy
class great_expectations.datasource.types.SupportedDatabases

Bases: enum.Enum

Generic enumeration.

Derive from this class to define new enumerations.

MYSQL = MySQL
POSTGRES = Postgres
REDSHIFT = Redshift
SNOWFLAKE = Snowflake
BIGQUERY = BigQuery
OTHER = other - Do you have a working SQLAlchemy connection string?
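Both enumerations are plain enum.Enum subclasses whose member values are strings. The stand-ins below reproduce the member values from the listings above so the lookup behavior can be seen in isolation, without importing great_expectations:

```python
from enum import Enum


class DatasourceTypes(Enum):
    PANDAS = "pandas"
    SPARK = "spark"
    SQL = "sqlalchemy"


class SupportedDatabases(Enum):
    MYSQL = "MySQL"
    POSTGRES = "Postgres"
    REDSHIFT = "Redshift"
    SNOWFLAKE = "Snowflake"
    BIGQUERY = "BigQuery"
    OTHER = "other - Do you have a working SQLAlchemy connection string?"


# Enum value lookup returns the canonical member:
assert DatasourceTypes("sqlalchemy") is DatasourceTypes.SQL
assert SupportedDatabases("Postgres") is SupportedDatabases.POSTGRES
```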