

SQLGlot is a no-dependency SQL parser, transpiler, optimizer, and engine. It can be used to format SQL or translate between 20 different dialects like DuckDB, Presto / Trino, Spark / Databricks, Snowflake, and BigQuery. It aims to read a wide variety of SQL inputs and output syntactically and semantically correct SQL in the targeted dialects.

It is a very comprehensive generic SQL parser with a robust test suite. It is also quite performant, while being written purely in Python.

You can easily customize the parser, analyze queries, traverse expression trees, and programmatically build SQL.

Syntax errors are highlighted and dialect incompatibilities can warn or raise depending on configurations. However, it should be noted that SQL validation is not SQLGlot’s goal, so some syntax errors may go unnoticed.

Learn more about the SQLGlot API in the documentation.

Contributions are very welcome in SQLGlot; read the contribution guide to get started!


Install

From PyPI:

pip3 install sqlglot

Or with a local checkout:

make install

Requirements for development (optional):

make install-dev

Versioning

Given a version number MAJOR.MINOR.PATCH, SQLGlot uses the following versioning strategy:

  • The PATCH version is incremented when there are backwards-compatible fixes or feature additions.
  • The MINOR version is incremented when there are backwards-incompatible fixes or feature additions.
  • The MAJOR version is incremented when there are significant backwards-incompatible fixes or feature additions.

Get in Touch

We'd love to hear from you. Join our community Slack channel!

Examples

Formatting and Transpiling

Easily translate from one dialect to another. For example, date/time functions vary between dialects and can be hard to deal with:

import sqlglot
transpile("SELECT EPOCH_MS(1618088028295)", read="duckdb", write="hive")[0]
'SELECT FROM_UNIXTIME(1618088028295 / 1000)'

SQLGlot can even translate custom time formats:

import sqlglot
transpile("SELECT STRFTIME(x, '%y-%-m-%S')", read="duckdb", write="hive")[0]
"SELECT DATE_FORMAT(x, 'yy-M-ss')"

As another example, let's suppose that we want to read in a SQL query that contains a CTE and a cast to REAL, and then transpile it to Spark, which uses backticks for identifiers and FLOAT instead of REAL:

import sqlglot

sql = """WITH baz AS (SELECT a, c FROM foo WHERE a = 1) SELECT f.a, b.b, baz.c, CAST("b"."a" AS REAL) d FROM foo f JOIN bar b ON f.a = b.a LEFT JOIN baz ON f.a = baz.a"""
print(transpile(sql, write="spark", identify=True, pretty=True)[0])
WITH `baz` AS (
  SELECT
    `a`,
    `c`
  FROM `foo`
  WHERE
    `a` = 1
)
SELECT
  `f`.`a`,
  `b`.`b`,
  `baz`.`c`,
  CAST(`b`.`a` AS FLOAT) AS `d`
FROM `foo` AS `f`
JOIN `bar` AS `b`
  ON `f`.`a` = `b`.`a`
LEFT JOIN `baz`
  ON `f`.`a` = `baz`.`a`

Comments are also preserved on a best-effort basis when transpiling SQL code:

sql = """
/* multi
   line
   comment
*/
SELECT
  tbl.cola /* comment 1 */ + tbl.colb /* comment 2 */,
  CAST(x AS INT), # comment 3
  y               -- comment 4
FROM
  bar /* comment 5 */,
  tbl #          comment 6
"""

print(sqlglot.transpile(sql, read='mysql', pretty=True)[0])
/* multi
   line
   comment
*/
SELECT
  tbl.cola /* comment 1 */ + tbl.colb /* comment 2 */,
  CAST(x AS INT), /* comment 3 */
  y /* comment 4 */
FROM bar /* comment 5 */, tbl /*          comment 6 */

Metadata

You can explore SQL with expression helpers to do things like find columns and tables:

from sqlglot import parse_one, exp

# print all column references (a and b)
for column in parse_one("SELECT a, b + 1 AS c FROM d").find_all(exp.Column):
    print(column.alias_or_name)

# find all projections in select statements (a and c)
for select in parse_one("SELECT a, b + 1 AS c FROM d").find_all(exp.Select):
    for projection in select.expressions:
        print(projection.alias_or_name)

# find all tables (x, y, z)
for table in parse_one("SELECT * FROM x JOIN y JOIN z").find_all(exp.Table):
    print(table.name)

Parser Errors

When the parser detects an error in the syntax, it raises a ParseError:

import sqlglot
transpile("SELECT foo( FROM bar")
sqlglot.errors.ParseError: Expecting ). Line 1, Col: 13.
  select foo( FROM bar
              ~~~~

Structured syntax errors are accessible for programmatic use:

import sqlglot
try:
    transpile("SELECT foo( FROM bar")
except sqlglot.errors.ParseError as e:
    print(e.errors)
[{
  'description': 'Expecting )',
  'line': 1,
  'col': 16,
  'start_context': 'SELECT foo( ',
  'highlight': 'FROM',
  'end_context': ' bar',
  'into_expression': None,
}]

Unsupported Errors

Presto APPROX_DISTINCT supports the accuracy argument which is not supported in Hive:

import sqlglot
transpile("SELECT APPROX_DISTINCT(a, 0.1) FROM foo", read="presto", write="hive")
APPROX_COUNT_DISTINCT does not support accuracy
'SELECT APPROX_COUNT_DISTINCT(a) FROM foo'
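If a lossy rewrite like this is not acceptable, the warning can be escalated to an exception through the generator's unsupported_level option (a minimal sketch; ErrorLevel is re-exported at the package level):

import sqlglot
sqlglot.transpile(
    "SELECT APPROX_DISTINCT(a, 0.1) FROM foo",
    read="presto",
    write="hive",
    unsupported_level=sqlglot.ErrorLevel.RAISE,
)
# raises sqlglot.errors.UnsupportedError: APPROX_COUNT_DISTINCT does not support accuracy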

Build and Modify SQL

SQLGlot supports incrementally building SQL expressions:

from sqlglot import select, condition

where = condition("x=1").and_("y=1")
select("*").from_("y").where(where).sql()
'SELECT * FROM y WHERE x = 1 AND y = 1'
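Builder calls can keep being chained; for example, a join can be added in the same style (a small sketch, with the expected output shown as a comment):

from sqlglot import select

select("*").from_("y").join("z", on="y.a = z.a").where("y.b = 1").sql()
# 'SELECT * FROM y JOIN z ON y.a = z.a WHERE y.b = 1'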

You can also modify a parsed tree:

from sqlglot import parse_one
parse_one("SELECT x FROM y").from_("z").sql()
'SELECT x FROM z'

There is also a way to recursively transform the parsed tree by applying a mapping function to each tree node:

from sqlglot import exp, parse_one

expression_tree = parse_one("SELECT a FROM x")

def transformer(node):
    if isinstance(node, exp.Column) and node.name == "a":
        return parse_one("FUN(a)")
    return node

transformed_tree = expression_tree.transform(transformer)
transformed_tree.sql()
'SELECT FUN(a) FROM x'

SQL Optimizer

SQLGlot can rewrite queries into an "optimized" form. It performs a variety of techniques to create a new canonical AST. This AST can be used to standardize queries or provide the foundations for implementing an actual engine. For example:

import sqlglot
from sqlglot.optimizer import optimize

print(
    optimize(
        parse_one("""
            SELECT A OR (B OR (C AND D))
            FROM x
            WHERE Z = date '2021-01-01' + INTERVAL '1' month OR 1 = 0
        """),
        schema={"x": {"A": "INT", "B": "INT", "C": "INT", "D": "INT", "Z": "STRING"}}
    ).sql(pretty=True)
)
SELECT
  (
    "x"."a" <> 0 OR "x"."b" <> 0 OR "x"."c" <> 0
  )
  AND (
    "x"."a" <> 0 OR "x"."b" <> 0 OR "x"."d" <> 0
  ) AS "_col_0"
FROM "x" AS "x"
WHERE
  CAST("x"."z" AS DATE) = CAST('2021-02-01' AS DATE)

AST Introspection

You can see the AST version of the SQL by calling repr:

from sqlglot import parse_one
print(repr(parse_one("SELECT a + 1 AS z")))
(SELECT expressions:
  (ALIAS this:
    (ADD this:
      (COLUMN this:
        (IDENTIFIER this: a, quoted: False)), expression:
      (LITERAL this: 1, is_string: False)), alias:
    (IDENTIFIER this: z, quoted: False)))

AST Diff

SQLGlot can calculate the difference between two expressions and output the changes as a sequence of actions needed to transform a source expression into a target one:

from sqlglot import diff, parse_one
diff(parse_one("SELECT a + b, c, d"), parse_one("SELECT c, a - b, d"))
[
  Remove(expression=(ADD this:
    (COLUMN this:
      (IDENTIFIER this: a, quoted: False)), expression:
    (COLUMN this:
      (IDENTIFIER this: b, quoted: False)))),
  Insert(expression=(SUB this:
    (COLUMN this:
      (IDENTIFIER this: a, quoted: False)), expression:
    (COLUMN this:
      (IDENTIFIER this: b, quoted: False)))),
  Move(expression=(COLUMN this:
    (IDENTIFIER this: c, quoted: False))),
  Keep(source=(IDENTIFIER this: b, quoted: False), target=(IDENTIFIER this: b, quoted: False)),
  ...
]

See also: Semantic Diff for SQL.

Custom Dialects

Dialects can be added by subclassing Dialect:

from sqlglot import exp
from sqlglot.dialects.dialect import Dialect
from sqlglot.generator import Generator
from sqlglot.tokens import Tokenizer, TokenType


class Custom(Dialect):
    class Tokenizer(Tokenizer):
        QUOTES = ["'", '"']
        IDENTIFIERS = ["`"]

        KEYWORDS = {
            **Tokenizer.KEYWORDS,
            "INT64": TokenType.BIGINT,
            "FLOAT64": TokenType.DOUBLE,
        }

    class Generator(Generator):
        TRANSFORMS = {exp.Array: lambda self, e: f"[{self.expressions(e)}]"}

        TYPE_MAPPING = {
            exp.DataType.Type.TINYINT: "INT64",
            exp.DataType.Type.SMALLINT: "INT64",
            exp.DataType.Type.INT: "INT64",
            exp.DataType.Type.BIGINT: "INT64",
            exp.DataType.Type.DECIMAL: "NUMERIC",
            exp.DataType.Type.FLOAT: "FLOAT64",
            exp.DataType.Type.DOUBLE: "FLOAT64",
            exp.DataType.Type.BOOLEAN: "BOOL",
            exp.DataType.Type.TEXT: "STRING",
        }

print(Dialect["custom"])
<class '__main__.Custom'>

SQL Execution

One can even interpret SQL queries using SQLGlot, where the tables are represented as Python dictionaries. Although the engine is not very fast (it's not supposed to be) and is in a relatively early stage of development, it can be useful for unit testing and running SQL natively across Python objects. Additionally, the foundation can be easily integrated with fast compute kernels (arrow, pandas). Below is an example showcasing the execution of a SELECT expression that involves aggregations and JOINs:

from sqlglot.executor import execute

tables = {
    "sushi": [
        {"id": 1, "price": 1.0},
        {"id": 2, "price": 2.0},
        {"id": 3, "price": 3.0},
    ],
    "order_items": [
        {"sushi_id": 1, "order_id": 1},
        {"sushi_id": 1, "order_id": 1},
        {"sushi_id": 2, "order_id": 1},
        {"sushi_id": 3, "order_id": 2},
    ],
    "orders": [
        {"id": 1, "user_id": 1},
        {"id": 2, "user_id": 2},
    ],
}

execute(
    """
    SELECT
      o.user_id,
      SUM(s.price) AS price
    FROM orders o
    JOIN order_items i
      ON o.id = i.order_id
    JOIN sushi s
      ON i.sushi_id = s.id
    GROUP BY o.user_id
    """,
    tables=tables
)
user_id price
      1   4.0
      2   3.0

See also: Writing a Python SQL engine from scratch.


Documentation

SQLGlot uses pdoc to serve its API documentation.

A hosted version is on the SQLGlot website, or you can build locally with:

make docs-serve

Run Tests and Lint

make style  # Only linter checks
make unit   # Only unit tests
make check  # Full test suite & linter checks

Benchmarks

Benchmarks run on Python 3.10.5, in seconds; the number in parentheses is each library's time relative to sqlglot.

Query   sqlglot           sqlfluff           sqltree            sqlparse           moz_sql_parser     sqloxide
tpch    0.01308 (1.0)     1.60626 (122.7)    0.01168 (0.893)    0.04958 (3.791)    0.08543 (6.531)    0.00136 (0.104)
short   0.00109 (1.0)     0.14134 (129.2)    0.00099 (0.906)    0.00342 (3.131)    0.00652 (5.970)    8.76E-5 (0.080)
long    0.01399 (1.0)     2.12632 (151.9)    0.01126 (0.805)    0.04410 (3.151)    0.06671 (4.767)    0.00107 (0.076)
crazy   0.03969 (1.0)     24.3777 (614.1)    0.03917 (0.987)    11.7043 (294.8)    1.03280 (26.02)    0.00625 (0.157)

Optional Dependencies

SQLGlot uses dateutil to simplify literal timedelta expressions. The optimizer will not simplify expressions like the following if the module cannot be found:

x + interval '1' month
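A quick way to check the behavior (a sketch; with dateutil installed the literal arithmetic is folded, otherwise the expression is returned unchanged):

from sqlglot import parse_one
from sqlglot.optimizer.simplify import simplify

print(simplify(parse_one("date '2021-01-01' + interval '1' month")).sql())
# with dateutil available: CAST('2021-02-01' AS DATE)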

  1"""
  2.. include:: ../README.md
  3
  4----
  5"""
  6
  7from __future__ import annotations
  8
  9import logging
 10import typing as t
 11
 12from sqlglot import expressions as exp
 13from sqlglot.dialects.dialect import Dialect as Dialect, Dialects as Dialects
 14from sqlglot.diff import diff as diff
 15from sqlglot.errors import (
 16    ErrorLevel as ErrorLevel,
 17    ParseError as ParseError,
 18    TokenError as TokenError,
 19    UnsupportedError as UnsupportedError,
 20)
 21from sqlglot.expressions import (
 22    Expression as Expression,
 23    alias_ as alias,
 24    and_ as and_,
 25    case as case,
 26    cast as cast,
 27    column as column,
 28    condition as condition,
 29    except_ as except_,
 30    from_ as from_,
 31    func as func,
 32    intersect as intersect,
 33    maybe_parse as maybe_parse,
 34    not_ as not_,
 35    or_ as or_,
 36    select as select,
 37    subquery as subquery,
 38    table_ as table,
 39    to_column as to_column,
 40    to_identifier as to_identifier,
 41    to_table as to_table,
 42    union as union,
 43)
 44from sqlglot.generator import Generator as Generator
 45from sqlglot.parser import Parser as Parser
 46from sqlglot.schema import MappingSchema as MappingSchema, Schema as Schema
 47from sqlglot.tokens import Tokenizer as Tokenizer, TokenType as TokenType
 48
 49if t.TYPE_CHECKING:
 50    from sqlglot._typing import E
 51    from sqlglot.dialects.dialect import DialectType as DialectType
 52
 53logger = logging.getLogger("sqlglot")
 54
 55
 56try:
 57    from sqlglot._version import __version__, __version_tuple__
 58except ImportError:
 59    logger.error(
 60        "Unable to set __version__, run `pip install -e .` or `python setup.py develop` first."
 61    )
 62
 63
 64pretty = False
 65"""Whether to format generated SQL by default."""
 66
 67schema = MappingSchema()
 68"""The default schema used by SQLGlot (e.g. in the optimizer)."""
 69
 70
 71def parse(
 72    sql: str, read: DialectType = None, dialect: DialectType = None, **opts
 73) -> t.List[t.Optional[Expression]]:
 74    """
 75    Parses the given SQL string into a collection of syntax trees, one per parsed SQL statement.
 76
 77    Args:
 78        sql: the SQL code string to parse.
 79        read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql").
 80        dialect: the SQL dialect (alias for read).
 81        **opts: other `sqlglot.parser.Parser` options.
 82
 83    Returns:
 84        The resulting syntax tree collection.
 85    """
 86    return Dialect.get_or_raise(read or dialect).parse(sql, **opts)
 87
 88
 89@t.overload
 90def parse_one(sql: str, *, into: t.Type[E], **opts) -> E:
 91    ...
 92
 93
 94@t.overload
 95def parse_one(sql: str, **opts) -> Expression:
 96    ...
 97
 98
 99def parse_one(
100    sql: str,
101    read: DialectType = None,
102    dialect: DialectType = None,
103    into: t.Optional[exp.IntoType] = None,
104    **opts,
105) -> Expression:
106    """
107    Parses the given SQL string and returns a syntax tree for the first parsed SQL statement.
108
109    Args:
110        sql: the SQL code string to parse.
111        read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql").
112        dialect: the SQL dialect (alias for read)
113        into: the SQLGlot Expression to parse into.
114        **opts: other `sqlglot.parser.Parser` options.
115
116    Returns:
117        The syntax tree for the first parsed statement.
118    """
119
120    dialect = Dialect.get_or_raise(read or dialect)
121
122    if into:
123        result = dialect.parse_into(into, sql, **opts)
124    else:
125        result = dialect.parse(sql, **opts)
126
127    for expression in result:
128        if not expression:
129            raise ParseError(f"No expression was parsed from '{sql}'")
130        return expression
131    else:
132        raise ParseError(f"No expression was parsed from '{sql}'")
133
134
135def transpile(
136    sql: str,
137    read: DialectType = None,
138    write: DialectType = None,
139    identity: bool = True,
140    error_level: t.Optional[ErrorLevel] = None,
141    **opts,
142) -> t.List[str]:
143    """
144    Parses the given SQL string in accordance with the source dialect and returns a list of SQL strings transformed
145    to conform to the target dialect. Each string in the returned list represents a single transformed SQL statement.
146
147    Args:
148        sql: the SQL code string to transpile.
149        read: the source dialect used to parse the input string (eg. "spark", "hive", "presto", "mysql").
150        write: the target dialect into which the input should be transformed (eg. "spark", "hive", "presto", "mysql").
151        identity: if set to `True` and if the target dialect is not specified the source dialect will be used as both:
152            the source and the target dialect.
153        error_level: the desired error level of the parser.
154        **opts: other `sqlglot.generator.Generator` options.
155
156    Returns:
157        The list of transpiled SQL statements.
158    """
159    write = (read if write is None else write) if identity else write
160    write = Dialect.get_or_raise(write)
161    return [
162        write.generate(expression, copy=False, **opts) if expression else ""
163        for expression in parse(sql, read, error_level=error_level)
164    ]
logger = <Logger sqlglot (WARNING)>
pretty = False

Whether to format generated SQL by default.

schema = <sqlglot.schema.MappingSchema object>

The default schema used by SQLGlot (e.g. in the optimizer).
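Both module-level attributes act as defaults. For instance, flipping pretty makes generated SQL pretty-printed unless an explicit pretty argument is passed (a small sketch):

import sqlglot

sqlglot.pretty = True
print(sqlglot.transpile("SELECT a, b FROM t WHERE a = 1")[0])
# the output is now formatted across multiple lines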

def parse( sql: str, read: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, dialect: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, **opts) -> List[Optional[sqlglot.expressions.Expression]]:

Parses the given SQL string into a collection of syntax trees, one per parsed SQL statement.

Arguments:
  • sql: the SQL code string to parse.
  • read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql").
  • dialect: the SQL dialect (alias for read).
  • **opts: other sqlglot.parser.Parser options.
Returns:

The resulting syntax tree collection.
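For example, a string containing several statements yields one syntax tree per statement (a small illustration):

import sqlglot

trees = sqlglot.parse("SELECT 1; SELECT 2", read="duckdb")
print(len(trees))
# 2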

def parse_one( sql: str, read: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, dialect: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, into: Union[str, Type[sqlglot.expressions.Expression], Collection[Union[str, Type[sqlglot.expressions.Expression]]], NoneType] = None, **opts) -> sqlglot.expressions.Expression:

Parses the given SQL string and returns a syntax tree for the first parsed SQL statement.

Arguments:
  • sql: the SQL code string to parse.
  • read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql").
  • dialect: the SQL dialect (alias for read).
  • into: the SQLGlot Expression to parse into.
  • **opts: other sqlglot.parser.Parser options.
Returns:

The syntax tree for the first parsed statement.
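For example, a single statement comes back as one Expression rather than a list (a small illustration):

import sqlglot
from sqlglot import exp

tree = sqlglot.parse_one("SELECT a FROM t")
print(isinstance(tree, exp.Select))
# True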

def transpile( sql: str, read: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, write: Union[str, sqlglot.dialects.dialect.Dialect, Type[sqlglot.dialects.dialect.Dialect], NoneType] = None, identity: bool = True, error_level: Optional[sqlglot.errors.ErrorLevel] = None, **opts) -> List[str]:

Parses the given SQL string in accordance with the source dialect and returns a list of SQL strings transformed to conform to the target dialect. Each string in the returned list represents a single transformed SQL statement.

Arguments:
  • sql: the SQL code string to transpile.
  • read: the source dialect used to parse the input string (eg. "spark", "hive", "presto", "mysql").
  • write: the target dialect into which the input should be transformed (eg. "spark", "hive", "presto", "mysql").
  • identity: if set to True and if the target dialect is not specified, the source dialect will be used as both the source and the target dialect.
  • error_level: the desired error level of the parser.
  • **opts: other sqlglot.generator.Generator options.
Returns:

The list of transpiled SQL statements.
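For example, with write omitted and identity left at its default, each statement is parsed and regenerated in the read dialect, one output string per input statement (a small illustration):

import sqlglot

print(sqlglot.transpile("SELECT 1; SELECT 2", read="mysql"))
# ['SELECT 1', 'SELECT 2']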