author | Daniel Baumann <daniel.baumann@progress-linux.org> | 2024-02-20 09:37:57 +0000
---|---|---
committer | Daniel Baumann <daniel.baumann@progress-linux.org> | 2024-02-20 09:37:57 +0000
commit | 5c70c63284a8ff61607db1a51ac2829b74f71c1c (patch) |
tree | 10a81ffbd8da8cae58e292848cbdd0550d08721d /README.md |
parent | Adding upstream version 21.1.1. (diff) |
Adding upstream version 21.1.2. (upstream/21.1.2)
Signed-off-by: Daniel Baumann <daniel.baumann@progress-linux.org>
Diffstat (limited to 'README.md')
-rw-r--r-- | README.md | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
```diff
@@ -74,15 +74,15 @@ We'd love to hear from you. Join our community [Slack channel](https://tobikodat
 
 I tried to parse SQL that should be valid but it failed, why did that happen?
 
-* You need to specify the dialect to read the SQL properly, by default it is SQLGlot's dialect which is designed to be a superset of all dialects `parse_one(sql, dialect="spark")`. If you tried specifying the dialect and it still doesn't work, please file an issue.
+* Most of the time, issues like this occur because the "source" dialect is omitted during parsing. For example, this is how to correctly parse a SQL query written in Spark SQL: `parse_one(sql, dialect="spark")` (alternatively: `read="spark"`). If no dialect is specified, `parse_one` will attempt to parse the query according to the "SQLGlot dialect", which is designed to be a superset of all supported dialects. If you tried specifying the dialect and it still doesn't work, please file an issue.
 
 I tried to output SQL but it's not in the correct dialect!
 
-* You need to specify the dialect to write the sql properly, by default it is in SQLGlot's dialect `parse_one(sql, dialect="spark").sql(dialect="spark")`.
+* Like parsing, generating SQL also requires the target dialect to be specified, otherwise the SQLGlot dialect will be used by default. For example, to transpile a query from Spark SQL to DuckDB, do `parse_one(sql, dialect="spark").sql(dialect="duckdb")` (alternatively: `transpile(sql, read="spark", write="duckdb")`).
 
-I tried to parse invalid SQL and it should raise an error but it worked! Why didn't it validate my SQL.
+I tried to parse invalid SQL and it worked, even though it should raise an error! Why didn't it validate my SQL?
 
-* SQLGlot is not a validator and designed to be very forgiving, handling things like trailing commas.
+* SQLGlot does not aim to be a SQL validator - it is designed to be very forgiving. This makes the codebase more comprehensive and also gives more flexibility to its users, e.g. by allowing them to include trailing commas in their projection lists.
 
 ## Examples
 
```
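The reworded FAQ answers both come down to passing an explicit dialect when parsing and when generating SQL. A minimal sketch of that workflow against the public sqlglot API follows; the sample query and alias are illustrative and are not taken from the commit:

```python
import sqlglot

# Parse a query written in Spark SQL. Omitting `dialect` would fall back to the
# permissive "SQLGlot dialect", a superset of all supported dialects.
expression = sqlglot.parse_one("SELECT ARRAY(1, 2) AS nums", dialect="spark")

# Generate SQL for an explicit target dialect (here: DuckDB).
print(expression.sql(dialect="duckdb"))

# Equivalently, transpile in one step by naming both source and target dialects.
print(sqlglot.transpile("SELECT ARRAY(1, 2) AS nums", read="spark", write="duckdb")[0])
```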