dlt.sources.sql_database.helpers
SQL database source helpers
TABLE_LOADER_REGISTRY
Maps backend name to the table loader class that handles it.
BaseTableLoader Objects
class BaseTableLoader(ABC)
Base class for SQL table loaders.
Provides query building infrastructure including incremental loading support.
Subclasses must implement load_rows to handle the actual data retrieval.
For loaders that use a SQLAlchemy connection, subclass TableLoader instead
to reuse the result-conversion helpers.
Arguments:
engine - SQLAlchemy Engine used for query compilation and (optionally) execution.
backend - Requested output format ("sqlalchemy", "pyarrow", "pandas", "connectorx").
table - Reflected SQLAlchemy Table object.
columns - dlt column schema hints for the table.
chunk_size - Number of rows per batch.
incremental - Optional incremental loading state.
query_adapter_callback - Optional callback to modify the generated query.
limit - Optional row/time limit from the resource.
compile_query
def compile_query(query: SelectClause) -> str
Compile a SQLAlchemy query into a SQL string with literal binds.
Useful for backends that execute raw SQL strings (ConnectorX, ADBC, etc.).
Raises:
NotImplementedError - When the query cannot be compiled to a string.
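Compilation with literal binds is standard SQLAlchemy; a minimal sketch of the idea, assuming an illustrative Postgres engine and a "chat_message" table (both are examples, not part of the helpers API):

import sqlalchemy as sa

engine = sa.create_engine("postgresql+psycopg2://loader:secret@localhost:5432/dlt_data")
metadata = sa.MetaData()
chat = sa.Table("chat_message", metadata, autoload_with=engine)

query = sa.select(chat).where(chat.c.id > 100)
# literal_binds inlines parameter values into the SQL text; parameter types that
# cannot be rendered as literals are what surfaces the NotImplementedError above
sql_text = str(query.compile(dialect=engine.dialect, compile_kwargs={"literal_binds": True}))
print(sql_text)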
get_connection_url
def get_connection_url() -> str
Return a plain database connection URL derived from the engine.
Strips the SQLAlchemy driver portion (e.g. +psycopg2) so the URL
is suitable for non-SQLAlchemy backends such as ConnectorX or ADBC.
Override this method when a backend requires a different connection
string format (e.g. MSSQL ODBC → go-mssqldb conversion).
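Conceptually this amounts to replacing the URL's driver name with the bare backend name. A hedged sketch of that idea in plain SQLAlchemy (the actual implementation may differ; the connection string is illustrative):

import sqlalchemy as sa

engine = sa.create_engine("postgresql+psycopg2://loader:secret@localhost:5432/dlt_data")
# drop the "+psycopg2" driver suffix, keep credentials, host and database
plain_url = engine.url.set(drivername=engine.url.get_backend_name()).render_as_string(hide_password=False)
print(plain_url)  # postgresql://loader:secret@localhost:5432/dlt_data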
load_rows
@abstractmethod
def load_rows(backend_kwargs: Optional[Dict[str, Any]] = None) -> Iterator[TDataItem]
Load rows from the table and yield them as data items.
Arguments:
backend_kwargs - Backend-specific keyword arguments passed through from the resource configuration.
Yields:
Data items in the format determined by the loader implementation (dicts, Arrow tables, DataFrames, etc.).
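A minimal sketch of a custom loader. It assumes the constructor arguments documented above are stored as attributes of the same name (self.engine, self.table, self.chunk_size); the class name and the query construction are illustrative, not taken from the module:

from typing import Any, Dict, Iterator, Optional

import sqlalchemy as sa
from dlt.common.typing import TDataItem
from dlt.sources.sql_database.helpers import BaseTableLoader


class DictTableLoader(BaseTableLoader):
    """Hypothetical loader that yields batches of plain dicts."""

    def load_rows(self, backend_kwargs: Optional[Dict[str, Any]] = None) -> Iterator[TDataItem]:
        query = sa.select(self.table)  # assumption: reflected Table stored as self.table
        with self.engine.connect() as conn:  # assumption: engine stored as self.engine
            result = conn.execution_options(stream_results=True).execute(query)
            while True:
                rows = result.fetchmany(self.chunk_size)  # assumption: chunk_size stored as self.chunk_size
                if not rows:
                    break
                yield [dict(row._mapping) for row in rows]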
TableLoader Objects
class TableLoader(BaseTableLoader)
Default table loader using a SQLAlchemy connection.
Supports "sqlalchemy", "pyarrow" and "pandas" backends.
Override _load_rows to customise query execution (e.g. pagination)
while reusing _convert_result for backend format conversion.
ConnectorXTableLoader Objects
class ConnectorXTableLoader(BaseTableLoader)
Table loader using ConnectorX for data retrieval. Yields Arrow tables.
register_table_loader_backend
def register_table_loader_backend(backend_name: str,
loader_class: Type[BaseTableLoader]) -> None
Register a custom table loader backend.
After registration the backend_name can be used as the backend
argument of sql_table / sql_database.
Arguments:
backend_name - Name for the backend (used in the backend parameter).
loader_class - A subclass of BaseTableLoader that handles data loading.
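Registration and use, continuing the hypothetical DictTableLoader sketched above (pipeline name, credentials and table name are illustrative):

import dlt
from dlt.sources.sql_database import sql_table
from dlt.sources.sql_database.helpers import register_table_loader_backend

register_table_loader_backend("dicts", DictTableLoader)

pipeline = dlt.pipeline(pipeline_name="sql_to_duckdb", destination="duckdb")
chat_messages = sql_table(
    credentials="postgresql://loader:secret@localhost:5432/dlt_data",
    table="chat_message",
    backend="dicts",  # resolved through TABLE_LOADER_REGISTRY
)
pipeline.run(chat_messages)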
get_table_loader_class
def get_table_loader_class(backend_name: str) -> Type[BaseTableLoader]
Look up the table loader class for the given backend name.
unwrap_json_connector_x
def unwrap_json_connector_x(field: str) -> TDataItem
Creates a transform function, to be added with add_map, that unwraps JSON columns ingested via ConnectorX. Such columns arrive additionally quoted, and SQL NULL is translated into the JSON "null" value.
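Usage sketch, attaching the transform to a ConnectorX-backed resource with add_map (credentials, table and column names are illustrative):

from dlt.sources.sql_database import sql_table
from dlt.sources.sql_database.helpers import unwrap_json_connector_x

events = sql_table(
    credentials="postgresql://loader:secret@localhost:5432/dlt_data",
    table="events",
    backend="connectorx",
)
events.add_map(unwrap_json_connector_x("payload"))  # "payload" is an assumed JSON column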
remove_nullability_adapter
def remove_nullability_adapter(table: Table) -> Table
A table adapter that removes nullability from columns.
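Usage sketch, passing the adapter through the table_adapter_callback argument of sql_table so reflected columns lose their nullability hints before the dlt schema is inferred (credentials and table name are illustrative):

from dlt.sources.sql_database import sql_table
from dlt.sources.sql_database.helpers import remove_nullability_adapter

orders = sql_table(
    credentials="postgresql://loader:secret@localhost:5432/dlt_data",
    table="orders",
    table_adapter_callback=remove_nullability_adapter,
)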
SqlTableResourceConfiguration Objects
@configspec
class SqlTableResourceConfiguration(BaseConfiguration)
incremental
default_engine_adapter_callback
def default_engine_adapter_callback(engine: Engine,
metadata: MetaData) -> None
Applies default engine adaptations for known dialects.
For the Oracle dialect, registers an event listener on the provided MetaData that forces NUMBER columns to be reflected as Python Decimal, preserving numeric precision.
Arguments:
engine - The SQLAlchemy engine whose dialect is inspected.
metadata - The MetaData instance to register the listener on.
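The underlying mechanism is SQLAlchemy's "column_reflect" event. A hedged sketch of that pattern, illustrating the described behaviour rather than the exact code of the callback:

import sqlalchemy as sa
from sqlalchemy import event


def force_decimal_on_oracle(engine: sa.engine.Engine, metadata: sa.MetaData) -> None:
    # Only adapt Oracle; other dialects are left untouched
    if engine.dialect.name != "oracle":
        return

    @event.listens_for(metadata, "column_reflect")
    def _coerce_number(inspector, table, column_info):
        # Oracle NUMBER reflects as a Numeric subtype; asdecimal=True makes the
        # driver return Python Decimal values instead of floats
        col_type = column_info["type"]
        if isinstance(col_type, sa.Numeric) and not isinstance(col_type, sa.Float):
            column_info["type"] = sa.Numeric(
                precision=col_type.precision,
                scale=col_type.scale,
                asdecimal=True,
            )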