SQLGlot is a no-dependency SQL parser, transpiler, optimizer, and engine. It can be used to format SQL or translate between 19 different dialects like DuckDB, Presto, Spark, Snowflake, and BigQuery. It aims to read a wide variety of SQL inputs and output syntactically correct SQL in the targeted dialects.
It is a very comprehensive generic SQL parser with a robust test suite. It is also quite performant, while being written purely in Python.
You can easily customize the parser, analyze queries, traverse expression trees, and programmatically build SQL.
Syntax errors are highlighted, and dialect incompatibilities can warn or raise depending on configuration. Note, however, that SQL validation is not SQLGlot's goal, so some syntax errors may go unnoticed.
Contributions are very welcome in SQLGlot; read the contribution guide to get started!
Table of Contents
- Install
- Versioning
- Get in Touch
- Examples
- Used By
- Documentation
- Run Tests and Lint
- Benchmarks
- Optional Dependencies
Install
From PyPI:
pip3 install sqlglot
Or with a local checkout:
make install
Requirements for development (optional):
make install-dev
Versioning
Given a version number MAJOR.MINOR.PATCH, SQLGlot uses the following versioning strategy:
- The PATCH version is incremented when there are backwards-compatible fixes or feature additions.
- The MINOR version is incremented when there are backwards-incompatible fixes or feature additions.
- The MAJOR version is incremented when there are significant backwards-incompatible fixes or feature additions.
Get in Touch
We'd love to hear from you. Join our community Slack channel!
Examples
Formatting and Transpiling
Easily translate from one dialect to another. For example, date/time functions vary across dialects and can be hard to deal with:
import sqlglot
sqlglot.transpile("SELECT EPOCH_MS(1618088028295)", read="duckdb", write="hive")[0]
'SELECT FROM_UNIXTIME(1618088028295 / 1000)'
SQLGlot can even translate custom time formats:
import sqlglot
sqlglot.transpile("SELECT STRFTIME(x, '%y-%-m-%S')", read="duckdb", write="hive")[0]
"SELECT DATE_FORMAT(x, 'yy-M-ss')"
As another example, let's suppose that we want to read in a SQL query that contains a CTE and a cast to REAL, and then transpile it to Spark, which uses backticks for identifiers and FLOAT instead of REAL:
import sqlglot
sql = """WITH baz AS (SELECT a, c FROM foo WHERE a = 1) SELECT f.a, b.b, baz.c, CAST("b"."a" AS REAL) d FROM foo f JOIN bar b ON f.a = b.a LEFT JOIN baz ON f.a = baz.a"""
print(sqlglot.transpile(sql, write="spark", identify=True, pretty=True)[0])
WITH `baz` AS (
SELECT
`a`,
`c`
FROM `foo`
WHERE
`a` = 1
)
SELECT
`f`.`a`,
`b`.`b`,
`baz`.`c`,
CAST(`b`.`a` AS FLOAT) AS `d`
FROM `foo` AS `f`
JOIN `bar` AS `b`
ON `f`.`a` = `b`.`a`
LEFT JOIN `baz`
ON `f`.`a` = `baz`.`a`
Comments are also preserved on a best-effort basis when transpiling SQL code:
sql = """
/* multi
line
comment
*/
SELECT
tbl.cola /* comment 1 */ + tbl.colb /* comment 2 */,
CAST(x AS INT), # comment 3
y -- comment 4
FROM
bar /* comment 5 */,
tbl # comment 6
"""
print(sqlglot.transpile(sql, read='mysql', pretty=True)[0])
/* multi
line
comment
*/
SELECT
tbl.cola /* comment 1 */ + tbl.colb /* comment 2 */,
CAST(x AS INT), /* comment 3 */
y /* comment 4 */
FROM bar /* comment 5 */, tbl /* comment 6 */
Metadata
You can explore SQL with expression helpers to do things like find columns and tables:
from sqlglot import parse_one, exp
# print all column references (a and b)
for column in parse_one("SELECT a, b + 1 AS c FROM d").find_all(exp.Column):
print(column.alias_or_name)
# find all projections in select statements (a and c)
for select in parse_one("SELECT a, b + 1 AS c FROM d").find_all(exp.Select):
for projection in select.expressions:
print(projection.alias_or_name)
# find all tables (x, y, z)
for table in parse_one("SELECT * FROM x JOIN y JOIN z").find_all(exp.Table):
print(table.name)
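These helpers compose, so a search can be scoped to a sub-expression such as the WHERE clause. A small sketch using find and find_all:
from sqlglot import parse_one, exp
# find returns the first matching node, find_all yields every match;
# scoping the search to the WHERE clause only inspects that subtree
where = parse_one("SELECT a FROM d WHERE b = 1 AND c > 2").find(exp.Where)
print([column.name for column in where.find_all(exp.Column)])
# ['b', 'c']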
Parser Errors
When the parser detects an error in the syntax, it raises a ParseError:
import sqlglot
sqlglot.transpile("SELECT foo( FROM bar")
sqlglot.errors.ParseError: Expecting ). Line 1, Col: 13.
select foo( FROM bar
~~~~
Structured syntax errors are accessible for programmatic use:
import sqlglot
try:
sqlglot.transpile("SELECT foo( FROM bar")
except sqlglot.errors.ParseError as e:
print(e.errors)
[{
'description': 'Expecting )',
'line': 1,
'col': 13,
'start_context': 'SELECT foo( ',
'highlight': 'FROM',
'end_context': ' bar'
}]
Unsupported Errors
Presto APPROX_DISTINCT supports the accuracy argument, which is not supported in Hive:
import sqlglot
sqlglot.transpile("SELECT APPROX_DISTINCT(a, 0.1) FROM foo", read="presto", write="hive")
APPROX_COUNT_DISTINCT does not support accuracy
'SELECT APPROX_COUNT_DISTINCT(a) FROM foo'
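Such incompatibilities only produce a warning by default. As a hedged sketch, passing the generator option unsupported_level through transpile should turn them into errors instead:
import sqlglot
from sqlglot import ErrorLevel
# unsupported_level=ErrorLevel.RAISE should raise an UnsupportedError
# instead of emitting the warning above (behavior may vary by version)
sqlglot.transpile(
    "SELECT APPROX_DISTINCT(a, 0.1) FROM foo",
    read="presto",
    write="hive",
    unsupported_level=ErrorLevel.RAISE,
)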
Build and Modify SQL
SQLGlot supports incrementally building SQL expressions:
from sqlglot import select, condition
where = condition("x=1").and_("y=1")
select("*").from_("y").where(where).sql()
'SELECT * FROM y WHERE x = 1 AND y = 1'
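The built expression can also be rendered for a specific dialect by passing dialect to sql(); a minimal sketch:
from sqlglot import select, condition
# Expression.sql accepts a dialect argument, so the same builder output
# can target any supported engine
select("*").from_("y").where(condition("x=1")).sql(dialect="spark")
'SELECT * FROM y WHERE x = 1'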
You can also modify a parsed tree:
from sqlglot import parse_one
parse_one("SELECT x FROM y").from_("z").sql()
'SELECT x FROM y, z'
There is also a way to recursively transform the parsed tree by applying a mapping function to each tree node:
from sqlglot import exp, parse_one
expression_tree = parse_one("SELECT a FROM x")
def transformer(node):
if isinstance(node, exp.Column) and node.name == "a":
return parse_one("FUN(a)")
return node
transformed_tree = expression_tree.transform(transformer)
transformed_tree.sql()
'SELECT FUN(a) FROM x'
SQL Optimizer
SQLGlot can rewrite queries into an "optimized" form. It applies a variety of techniques to produce a new, canonical AST. This AST can be used to standardize queries or as the foundation for implementing an actual engine. For example:
import sqlglot
from sqlglot.optimizer import optimize
print(
optimize(
sqlglot.parse_one("""
SELECT A OR (B OR (C AND D))
FROM x
WHERE Z = date '2021-01-01' + INTERVAL '1' month OR 1 = 0
"""),
schema={"x": {"A": "INT", "B": "INT", "C": "INT", "D": "INT", "Z": "STRING"}}
).sql(pretty=True)
)
SELECT
(
"x"."a" <> 0 OR "x"."b" <> 0 OR "x"."c" <> 0
)
AND (
"x"."a" <> 0 OR "x"."b" <> 0 OR "x"."d" <> 0
) AS "_col_0"
FROM "x" AS "x"
WHERE
CAST("x"."z" AS DATE) = CAST('2021-02-01' AS DATE)
AST Introspection
You can see the AST version of the parsed SQL by calling repr:
from sqlglot import parse_one
print(repr(parse_one("SELECT a + 1 AS z")))
(SELECT expressions:
(ALIAS this:
(ADD this:
(COLUMN this:
(IDENTIFIER this: a, quoted: False)), expression:
(LITERAL this: 1, is_string: False)), alias:
(IDENTIFIER this: z, quoted: False)))
AST Diff
SQLGlot can calculate the difference between two expressions and output changes as a sequence of actions needed to transform a source expression into a target one:
from sqlglot import diff, parse_one
diff(parse_one("SELECT a + b, c, d"), parse_one("SELECT c, a - b, d"))
[
Remove(expression=(ADD this:
(COLUMN this:
(IDENTIFIER this: a, quoted: False)), expression:
(COLUMN this:
(IDENTIFIER this: b, quoted: False)))),
Insert(expression=(SUB this:
(COLUMN this:
(IDENTIFIER this: a, quoted: False)), expression:
(COLUMN this:
(IDENTIFIER this: b, quoted: False)))),
Move(expression=(COLUMN this:
(IDENTIFIER this: c, quoted: False))),
Keep(source=(IDENTIFIER this: b, quoted: False), target=(IDENTIFIER this: b, quoted: False)),
...
]
See also: Semantic Diff for SQL.
Custom Dialects
Dialects can be added by subclassing Dialect:
from sqlglot import exp
from sqlglot.dialects.dialect import Dialect
from sqlglot.generator import Generator
from sqlglot.tokens import Tokenizer, TokenType
class Custom(Dialect):
class Tokenizer(Tokenizer):
QUOTES = ["'", '"']
IDENTIFIERS = ["`"]
KEYWORDS = {
**Tokenizer.KEYWORDS,
"INT64": TokenType.BIGINT,
"FLOAT64": TokenType.DOUBLE,
}
class Generator(Generator):
TRANSFORMS = {exp.Array: lambda self, e: f"[{self.expressions(e)}]"}
TYPE_MAPPING = {
exp.DataType.Type.TINYINT: "INT64",
exp.DataType.Type.SMALLINT: "INT64",
exp.DataType.Type.INT: "INT64",
exp.DataType.Type.BIGINT: "INT64",
exp.DataType.Type.DECIMAL: "NUMERIC",
exp.DataType.Type.FLOAT: "FLOAT64",
exp.DataType.Type.DOUBLE: "FLOAT64",
exp.DataType.Type.BOOLEAN: "BOOL",
exp.DataType.Type.TEXT: "STRING",
}
print(Dialect["custom"])
<class '__main__.Custom'>
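Once defined, the new dialect is registered under its lowercased class name and can be used like any built-in dialect. A small sketch, assuming the Custom class above is in scope (the result reflects its type mappings):
import sqlglot
# "custom" resolves to the Custom dialect defined above
sqlglot.transpile("SELECT CAST(a AS SMALLINT) FROM t", write="custom")[0]
'SELECT CAST(a AS INT64) FROM t'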
SQL Execution
One can even interpret SQL queries using SQLGlot, where the tables are represented as Python dictionaries. Although the engine is not very fast (it's not supposed to be) and is in a relatively early stage of development, it can be useful for unit testing and running SQL natively across Python objects. Additionally, the foundation can be easily integrated with fast compute kernels (arrow, pandas). Below is an example showcasing the execution of a SELECT expression that involves aggregations and JOINs:
from sqlglot.executor import execute
tables = {
"sushi": [
{"id": 1, "price": 1.0},
{"id": 2, "price": 2.0},
{"id": 3, "price": 3.0},
],
"order_items": [
{"sushi_id": 1, "order_id": 1},
{"sushi_id": 1, "order_id": 1},
{"sushi_id": 2, "order_id": 1},
{"sushi_id": 3, "order_id": 2},
],
"orders": [
{"id": 1, "user_id": 1},
{"id": 2, "user_id": 2},
],
}
execute(
"""
SELECT
o.user_id,
SUM(s.price) AS price
FROM orders o
JOIN order_items i
ON o.id = i.order_id
JOIN sushi s
ON i.sushi_id = s.id
GROUP BY o.user_id
""",
tables=tables
)
user_id price
1 4.0
2 3.0
See also: Writing a Python SQL engine from scratch.
Used By
Documentation
SQLGlot uses pdoc to serve its API documentation:
make docs-serve
Run Tests and Lint
make check # Set SKIP_INTEGRATION=1 to skip integration tests
Benchmarks
Benchmarks run on Python 3.10.5 in seconds.
Query | sqlglot | sqlfluff | sqltree | sqlparse | moz_sql_parser | sqloxide |
---|---|---|---|---|---|---|
tpch | 0.01308 (1.0) | 1.60626 (122.7) | 0.01168 (0.893) | 0.04958 (3.791) | 0.08543 (6.531) | 0.00136 (0.104) |
short | 0.00109 (1.0) | 0.14134 (129.2) | 0.00099 (0.906) | 0.00342 (3.131) | 0.00652 (5.970) | 8.76E-5 (0.080) |
long | 0.01399 (1.0) | 2.12632 (151.9) | 0.01126 (0.805) | 0.04410 (3.151) | 0.06671 (4.767) | 0.00107 (0.076) |
crazy | 0.03969 (1.0) | 24.3777 (614.1) | 0.03917 (0.987) | 11.7043 (294.8) | 1.03280 (26.02) | 0.00625 (0.157) |
Optional Dependencies
SQLGlot uses dateutil to simplify literal timedelta expressions. The optimizer will not simplify expressions like the following if the module cannot be found:
x + interval '1' month
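dateutil is published on PyPI as python-dateutil (pip3 install python-dateutil). With it installed, the standalone simplify rule can fold such literal date arithmetic; a hedged sketch (the exact folded output may vary by version):
import sqlglot
from sqlglot.optimizer.simplify import simplify
# with dateutil available, literal date arithmetic is folded
simplify(sqlglot.parse_one("SELECT DATE '2021-01-01' + INTERVAL '1' month")).sql()
"SELECT CAST('2021-02-01' AS DATE)"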
1""" 2.. include:: ../README.md 3 4---- 5""" 6 7from __future__ import annotations 8 9import logging 10import typing as t 11 12from sqlglot import expressions as exp 13from sqlglot.dialects.dialect import Dialect as Dialect, Dialects as Dialects 14from sqlglot.diff import diff as diff 15from sqlglot.errors import ( 16 ErrorLevel as ErrorLevel, 17 ParseError as ParseError, 18 TokenError as TokenError, 19 UnsupportedError as UnsupportedError, 20) 21from sqlglot.expressions import ( 22 Expression as Expression, 23 alias_ as alias, 24 and_ as and_, 25 cast as cast, 26 column as column, 27 condition as condition, 28 except_ as except_, 29 from_ as from_, 30 func as func, 31 intersect as intersect, 32 maybe_parse as maybe_parse, 33 not_ as not_, 34 or_ as or_, 35 select as select, 36 subquery as subquery, 37 table_ as table, 38 to_column as to_column, 39 to_identifier as to_identifier, 40 to_table as to_table, 41 union as union, 42) 43from sqlglot.generator import Generator as Generator 44from sqlglot.parser import Parser as Parser 45from sqlglot.schema import MappingSchema as MappingSchema, Schema as Schema 46from sqlglot.tokens import Tokenizer as Tokenizer, TokenType as TokenType 47 48if t.TYPE_CHECKING: 49 from sqlglot._typing import E 50 from sqlglot.dialects.dialect import DialectType as DialectType 51 52logger = logging.getLogger("sqlglot") 53 54 55try: 56 from sqlglot._version import __version__, __version_tuple__ 57except ImportError: 58 logger.error( 59 "Unable to set __version__, run `pip install -e .` or `python setup.py develop` first." 60 ) 61 62 63pretty = False 64"""Whether to format generated SQL by default.""" 65 66schema = MappingSchema() 67"""The default schema used by SQLGlot (e.g. in the optimizer).""" 68 69 70def parse(sql: str, read: DialectType = None, **opts) -> t.List[t.Optional[Expression]]: 71 """ 72 Parses the given SQL string into a collection of syntax trees, one per parsed SQL statement. 73 74 Args: 75 sql: the SQL code string to parse. 76 read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql"). 77 **opts: other `sqlglot.parser.Parser` options. 78 79 Returns: 80 The resulting syntax tree collection. 81 """ 82 dialect = Dialect.get_or_raise(read)() 83 return dialect.parse(sql, **opts) 84 85 86@t.overload 87def parse_one( 88 sql: str, 89 read: None = None, 90 into: t.Type[E] = ..., 91 **opts, 92) -> E: 93 ... 94 95 96@t.overload 97def parse_one( 98 sql: str, 99 read: DialectType, 100 into: t.Type[E], 101 **opts, 102) -> E: 103 ... 104 105 106@t.overload 107def parse_one( 108 sql: str, 109 read: None = None, 110 into: t.Union[str, t.Collection[t.Union[str, t.Type[Expression]]]] = ..., 111 **opts, 112) -> Expression: 113 ... 114 115 116@t.overload 117def parse_one( 118 sql: str, 119 read: DialectType, 120 into: t.Union[str, t.Collection[t.Union[str, t.Type[Expression]]]], 121 **opts, 122) -> Expression: 123 ... 124 125 126@t.overload 127def parse_one( 128 sql: str, 129 **opts, 130) -> Expression: 131 ... 132 133 134def parse_one( 135 sql: str, 136 read: DialectType = None, 137 into: t.Optional[exp.IntoType] = None, 138 **opts, 139) -> Expression: 140 """ 141 Parses the given SQL string and returns a syntax tree for the first parsed SQL statement. 142 143 Args: 144 sql: the SQL code string to parse. 145 read: the SQL dialect to apply during parsing (eg. "spark", "hive", "presto", "mysql"). 146 into: the SQLGlot Expression to parse into. 147 **opts: other `sqlglot.parser.Parser` options. 
148 149 Returns: 150 The syntax tree for the first parsed statement. 151 """ 152 153 dialect = Dialect.get_or_raise(read)() 154 155 if into: 156 result = dialect.parse_into(into, sql, **opts) 157 else: 158 result = dialect.parse(sql, **opts) 159 160 for expression in result: 161 if not expression: 162 raise ParseError(f"No expression was parsed from '{sql}'") 163 return expression 164 else: 165 raise ParseError(f"No expression was parsed from '{sql}'") 166 167 168def transpile( 169 sql: str, 170 read: DialectType = None, 171 write: DialectType = None, 172 identity: bool = True, 173 error_level: t.Optional[ErrorLevel] = None, 174 **opts, 175) -> t.List[str]: 176 """ 177 Parses the given SQL string in accordance with the source dialect and returns a list of SQL strings transformed 178 to conform to the target dialect. Each string in the returned list represents a single transformed SQL statement. 179 180 Args: 181 sql: the SQL code string to transpile. 182 read: the source dialect used to parse the input string (eg. "spark", "hive", "presto", "mysql"). 183 write: the target dialect into which the input should be transformed (eg. "spark", "hive", "presto", "mysql"). 184 identity: if set to `True` and if the target dialect is not specified the source dialect will be used as both: 185 the source and the target dialect. 186 error_level: the desired error level of the parser. 187 **opts: other `sqlglot.generator.Generator` options. 188 189 Returns: 190 The list of transpiled SQL statements. 191 """ 192 write = (read if write is None else write) if identity else write 193 return [ 194 Dialect.get_or_raise(write)().generate(expression, **opts) 195 for expression in parse(sql, read, error_level=error_level) 196 ]
Whether to format generated SQL by default.
The default schema used by SQLGlot (e.g. in the optimizer).
def parse(sql: str, read: DialectType = None, **opts) -> t.List[t.Optional[Expression]]:
    dialect = Dialect.get_or_raise(read)()
    return dialect.parse(sql, **opts)

Parses the given SQL string into a collection of syntax trees, one per parsed SQL statement.

Arguments:
- sql: the SQL code string to parse.
- read: the SQL dialect to apply during parsing (e.g. "spark", "hive", "presto", "mysql").
- **opts: other sqlglot.parser.Parser options.

Returns:
The resulting syntax tree collection.
def parse_one(
    sql: str,
    read: DialectType = None,
    into: t.Optional[exp.IntoType] = None,
    **opts,
) -> Expression:
    dialect = Dialect.get_or_raise(read)()

    if into:
        result = dialect.parse_into(into, sql, **opts)
    else:
        result = dialect.parse(sql, **opts)

    for expression in result:
        if not expression:
            raise ParseError(f"No expression was parsed from '{sql}'")
        return expression
    else:
        raise ParseError(f"No expression was parsed from '{sql}'")

Parses the given SQL string and returns a syntax tree for the first parsed SQL statement.

Arguments:
- sql: the SQL code string to parse.
- read: the SQL dialect to apply during parsing (e.g. "spark", "hive", "presto", "mysql").
- into: the SQLGlot Expression to parse into.
- **opts: other sqlglot.parser.Parser options.

Returns:
The syntax tree for the first parsed statement.
def transpile(
    sql: str,
    read: DialectType = None,
    write: DialectType = None,
    identity: bool = True,
    error_level: t.Optional[ErrorLevel] = None,
    **opts,
) -> t.List[str]:
    write = (read if write is None else write) if identity else write
    return [
        Dialect.get_or_raise(write)().generate(expression, **opts)
        for expression in parse(sql, read, error_level=error_level)
    ]

Parses the given SQL string in accordance with the source dialect and returns a list of SQL strings transformed to conform to the target dialect. Each string in the returned list represents a single transformed SQL statement.

Arguments:
- sql: the SQL code string to transpile.
- read: the source dialect used to parse the input string (e.g. "spark", "hive", "presto", "mysql").
- write: the target dialect into which the input should be transformed (e.g. "spark", "hive", "presto", "mysql").
- identity: if set to True and the target dialect is not specified, the source dialect will be used as both the source and the target dialect.
- error_level: the desired error level of the parser.
- **opts: other sqlglot.generator.Generator options.

Returns:
The list of transpiled SQL statements.