SQL Parsing in Go: A Comprehensive Guide for 2023
SQL is the lingua franca of the database world. Whether you're using MySQL, PostgreSQL, SQL Server, or one of countless other relational databases, SQL is the common interface for querying and manipulating data.
As a result, being able to programmatically parse and analyze SQL queries can be extremely useful for a variety of applications – database drivers, query builders, ORMs, proxies, monitoring tools, and more. Rather than treat SQL as an opaque string, parsing it into a structured format opens up a world of possibilities.
Fortunately, if you're working in Go, there are a number of excellent open source libraries available for parsing SQL. In this post, we'll dive into the world of SQL parsing in Go – understanding how these parsers work, popular use cases, and tips for using them effectively.
What is an SQL Parser?
An SQL parser is a piece of software that reads in a string containing an SQL statement and breaks it down into its component parts based on the SQL grammar and syntax rules. The output is typically an abstract syntax tree (AST) – a hierarchical tree structure where each node represents a syntactic element of the SQL query.
For example, take this simple SQL query:
SELECT id, name
FROM users
WHERE age > 18
After lexical analysis to break it into tokens and then syntactic analysis to determine the grammatical structure, a parser would output an AST similar to:
- Select
  - ProjectionItems
    - id
    - name
  - FromClause
    - users
  - WhereClause
    - BinaryExpr (>)
      - age
      - 18
The AST contains all the information from the original query, but in a format that is easy to traverse and manipulate. With an AST, it becomes possible to:
- Validate that a query follows proper SQL syntax
- Analyze which tables and columns a query references
- Modify or rewrite the query
- Convert between different SQL dialects
- Calculate query metrics and complexity
- Build graphical representations of the query plan
Popular SQL Parsers for Go
When it comes to choosing an SQL parser to use in your Go applications, there are a few key options to consider:
- vitess/sqlparser – This is the SQL parsing library used by Vitess, the MySQL clustering system originally developed at YouTube. It parses a large subset of MySQL syntax and is used heavily in production.
- pingcap/parser – Created for the TiDB project, this parser supports MySQL syntax and outputs an AST that is very similar to MySQL's internal parser. It is used by several other Go MySQL tools.
- xwb1989/sqlparser – A standalone extraction of vitess/sqlparser, stripped of the rest of the Vitess dependencies so the parser can be pulled into a project on its own.
- dolthub/go-mysql-server – A MySQL server implementation in Go that includes a full SQL parser. It aims to be compatible with the entire MySQL syntax.
All of these libraries work in essentially the same way under the hood. They use a lexer (sometimes called a tokenizer or scanner) to break the SQL string into tokens, and then a parser that matches those tokens against the grammar rules of the SQL dialect to build the AST.
Most offer an API to parse a query string and get back the AST, along with helper functions for working with the AST. They are straightforward to integrate into any Go project via Go modules.
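To make the lexing step concrete, here is a minimal hand-rolled tokenizer sketch. It is not how any of the libraries above are implemented (they use generated or hand-tuned lexers that cover the full MySQL grammar); it only recognizes identifiers, numbers, single-quoted strings, and single-character symbols, which is enough to show what "breaking the SQL string into tokens" means:

package main

import (
    "fmt"
    "strings"
    "unicode"
)

// tokenize is a toy lexer: it splits a SQL string into identifiers/keywords,
// numbers, single-quoted strings, and single-character symbols. Real lexers
// also handle comments, escape sequences, multi-character operators like >=,
// and much more.
func tokenize(sql string) []string {
    var tokens []string
    runes := []rune(sql)
    for i := 0; i < len(runes); {
        r := runes[i]
        switch {
        case unicode.IsSpace(r):
            i++
        case unicode.IsLetter(r) || r == '_':
            j := i
            for j < len(runes) && (unicode.IsLetter(runes[j]) || unicode.IsDigit(runes[j]) || runes[j] == '_') {
                j++
            }
            tokens = append(tokens, string(runes[i:j]))
            i = j
        case unicode.IsDigit(r):
            j := i
            for j < len(runes) && unicode.IsDigit(runes[j]) {
                j++
            }
            tokens = append(tokens, string(runes[i:j]))
            i = j
        case r == '\'':
            j := i + 1
            for j < len(runes) && runes[j] != '\'' {
                j++
            }
            if j < len(runes) {
                j++ // include the closing quote
            }
            tokens = append(tokens, string(runes[i:j]))
            i = j
        default:
            tokens = append(tokens, string(r))
            i++
        }
    }
    return tokens
}

func main() {
    fmt.Println(strings.Join(tokenize("SELECT id, name FROM users WHERE age > 18"), " | "))
    // SELECT | id | , | name | FROM | users | WHERE | age | > | 18
}

The parser stage then consumes this token stream and, by matching it against grammar productions, builds the tree structure shown earlier.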
Common Use Cases for SQL Parsers
So what are some of the most common reasons you might want to parse SQL queries in your Go application? Let's look at a few major use cases:
Database Drivers and ORMs
One of the most fundamental uses of SQL parsing is in database drivers and ORMs. Whenever you use a library like gorm on top of database/sql to interact with a database, under the hood it constructs SQL queries from your method calls.
To do this reliably, such libraries parse or at least tokenize SQL: to validate queries, to rewrite placeholders, and to emit the exact dialect the target database expects. Working from a parsed representation rather than raw strings is also what lets a single library support multiple databases, since the same query structure can be rendered into each database-specific dialect.
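As a toy illustration of the kind of dialect translation a driver layer performs, the sketch below rewrites MySQL-style ? placeholders into PostgreSQL-style $1, $2 placeholders, skipping over quoted string literals. It is a simplified example for illustration only (it does not handle escaped quotes or comments), not code from any particular driver:

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// rewritePlaceholders converts MySQL-style "?" placeholders into
// PostgreSQL-style "$1, $2, ..." placeholders, skipping single-quoted string
// literals so a literal question mark inside a string is left untouched.
func rewritePlaceholders(sql string) string {
    var b strings.Builder
    n := 0
    inString := false
    for _, r := range sql {
        switch {
        case r == '\'':
            inString = !inString
            b.WriteRune(r)
        case r == '?' && !inString:
            n++
            b.WriteString("$" + strconv.Itoa(n))
        default:
            b.WriteRune(r)
        }
    }
    return b.String()
}

func main() {
    fmt.Println(rewritePlaceholders("SELECT id FROM users WHERE name = ? AND note = 'what?' AND age > ?"))
    // SELECT id FROM users WHERE name = $1 AND note = 'what?' AND age > $2
}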
Query Builders
Another common application for SQL parsing is in query builders. These are libraries that provide a fluent, idiomatic interface for constructing SQL queries in code, without having to resort to manually concatenating strings.
Under the hood, query builders work by maintaining a structured representation of the query – effectively a small AST of their own. They then serialize that structure into a valid SQL string when it's time to execute the query, which is what allows them to guarantee that the generated SQL will be syntactically correct.
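Here is a deliberately tiny sketch of that idea: a fluent builder that records the pieces of a SELECT in its own small structure and only renders SQL at the end. Real query builders are far more complete; the type and method names below are invented purely for illustration:

package main

import (
    "fmt"
    "strings"
)

// selectQuery is a toy "AST" for a SELECT statement: just enough structure to
// show how a fluent builder can assemble a query without string concatenation
// in user code.
type selectQuery struct {
    columns []string
    table   string
    where   []string
    args    []interface{}
}

func Select(columns ...string) *selectQuery {
    return &selectQuery{columns: columns}
}

func (q *selectQuery) From(table string) *selectQuery {
    q.table = table
    return q
}

// Where records a condition with a placeholder; the value is kept separately
// so it can be passed as a bind parameter rather than spliced into the SQL.
func (q *selectQuery) Where(cond string, arg interface{}) *selectQuery {
    q.where = append(q.where, cond)
    q.args = append(q.args, arg)
    return q
}

// SQL renders the structure back into a SQL string plus its arguments.
func (q *selectQuery) SQL() (string, []interface{}) {
    var b strings.Builder
    b.WriteString("SELECT " + strings.Join(q.columns, ", "))
    b.WriteString(" FROM " + q.table)
    if len(q.where) > 0 {
        b.WriteString(" WHERE " + strings.Join(q.where, " AND "))
    }
    return b.String(), q.args
}

func main() {
    sql, args := Select("id", "name").From("users").Where("age > ?", 18).SQL()
    fmt.Println(sql, args) // SELECT id, name FROM users WHERE age > ? [18]
}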
Query Analysis and Optimization
For more advanced use cases, SQL parsing enables detailed analysis and optimization of queries. By looking at the AST structure, you can determine things like:
- Which tables and columns does a query access?
- What are the relationships between tables (joins, subqueries, unions)?
- Are there any Cartesian products or redundant joins?
- How many expressions are in the select list and where clause?
- What indexes could be used to speed up the query?
- Are there any expensive constructs like ORDER BY or GROUP BY?
With this information, you can build tools to help developers write more efficient queries, or even automatically rewrite queries to optimize performance. You can also feed query ASTs into engine-specific tools like EXPLAIN analyzers to get even more insight into bottlenecks.
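As a sketch of this kind of analysis, the example below collects the tables and columns a query touches by walking its AST. It assumes the xwb1989/sqlparser fork; older vitess releases expose essentially the same Walk helper, though type names differ between versions:

package main

import (
    "fmt"

    "github.com/xwb1989/sqlparser"
)

func main() {
    sql := "SELECT id, total FROM users JOIN orders ON orders.user_id = users.id WHERE total > 100"

    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        panic(err)
    }

    tables := map[string]bool{}
    columns := map[string]bool{}

    // Walk visits every node in the AST; we collect table and column
    // references as we go.
    _ = sqlparser.Walk(func(node sqlparser.SQLNode) (bool, error) {
        switch n := node.(type) {
        case sqlparser.TableName:
            if n.Name.String() != "" {
                tables[n.Name.String()] = true
            }
        case *sqlparser.ColName:
            columns[n.Name.String()] = true
        }
        return true, nil
    }, stmt)

    fmt.Println("tables:", tables)
    fmt.Println("columns:", columns)
}

From counts like these you can start deriving rough complexity metrics, such as the number of joined tables or the number of expressions in the WHERE clause.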
Monitoring and Observability
Instrumenting your application to log and trace database queries can be very useful for monitoring and debugging. But without parsing those queries, the information you can extract is limited.
By parsing queries, you can generate meaningful metrics like which tables are accessed most frequently, what the slowest queries are, and whether any queries return large result sets. You can even aggregate similar queries together by comparing their ASTs.
Parsing also enables you to redact sensitive information from query strings before logging them. You can scan the AST for literals and replace them with placeholders to avoid leaking things like personally identifiable information.
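Here is one way that redaction could look, again assuming the xwb1989/sqlparser API: walk the AST and rewrite every literal value in place as a generic bind-variable placeholder before logging the formatted query. Treat it as a sketch; some sqlparser releases ship their own normalization helpers for this kind of task.

package main

import (
    "fmt"

    "github.com/xwb1989/sqlparser"
)

// redact replaces every literal value in the query with a generic bind-variable
// placeholder so the query shape can be logged without leaking data.
func redact(sql string) (string, error) {
    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        return "", err
    }
    _ = sqlparser.Walk(func(node sqlparser.SQLNode) (bool, error) {
        if val, ok := node.(*sqlparser.SQLVal); ok {
            // Rewrite the literal in place as a named bind variable.
            val.Type = sqlparser.ValArg
            val.Val = []byte(":redacted")
        }
        return true, nil
    }, stmt)
    return sqlparser.String(stmt), nil
}

func main() {
    out, err := redact("SELECT name FROM users WHERE email = 'alice@example.com' AND age > 18")
    if err != nil {
        panic(err)
    }
    fmt.Println(out)
    // Something like: select name from users where email = :redacted and age > :redacted
}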
Database Proxies
SQL parsing is a key ingredient in database proxies and middleware. These are tools that sit between the application and the database, intercepting and potentially modifying queries as they flow through.
By parsing queries, a proxy can route queries to different database instances based on rules like isolating write traffic from reads. It can reject or rewrite queries that don't conform to security or performance standards. It can even transform queries to be compatible with a totally different database engine.
Without the ability to parse and understand the content of queries, proxies would be limited to making decisions based only on metadata like the query length or duration. Parsing allows them to be much more intelligent and useful.
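A minimal sketch of read/write splitting, one of the routing decisions described above: parse the statement and switch on its type. It assumes the xwb1989/sqlparser fork and ignores real-world wrinkles such as SELECT ... FOR UPDATE or statements inside transactions, which a production proxy would have to send to the primary:

package main

import (
    "fmt"

    "github.com/xwb1989/sqlparser"
)

// routeQuery decides which backend a statement should go to: plain SELECTs can
// be served by a read replica, everything else goes to the primary.
func routeQuery(sql string) (string, error) {
    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        return "", err
    }
    switch stmt.(type) {
    case *sqlparser.Select:
        return "replica", nil
    case *sqlparser.Insert, *sqlparser.Update, *sqlparser.Delete:
        return "primary", nil
    default:
        // DDL, SET, SHOW, etc. – be conservative and use the primary.
        return "primary", nil
    }
}

func main() {
    for _, q := range []string{
        "SELECT id FROM users WHERE id = 1",
        "UPDATE users SET name = 'bob' WHERE id = 1",
    } {
        target, err := routeQuery(q)
        if err != nil {
            panic(err)
        }
        fmt.Printf("%-45s -> %s\n", q, target)
    }
}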
How to Parse SQL in Your Go Application
Now that we've seen some of the use cases for parsing SQL, let's walk through a quick example of how to actually do it in Go code. We'll use the vitess/sqlparser library, but the general concepts apply to any of the parser options.
First, install the library:
go get vitess.io/vitess/go/vt/sqlparser
Then you can parse a query string by calling the sqlparser.Parse function:
import (
    "fmt"

    "vitess.io/vitess/go/vt/sqlparser"
)

func main() {
    sql := "SELECT id, name FROM users WHERE age > 18"

    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        // Handle parsing errors
        return
    }

    // Print out the type of the statement
    fmt.Printf("%T\n", stmt)
}
In this case, stmt will be a pointer to a sqlparser.Select struct, which contains all the information about the query. You can then traverse the AST using the various struct fields and methods.
For example, to print out the table and column names referenced:
import (
    "fmt"

    "vitess.io/vitess/go/vt/sqlparser"
)

func main() {
    sql := "SELECT id, name FROM users WHERE age > 18"

    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        return
    }
    sel := stmt.(*sqlparser.Select)

    // Print table names by walking the FROM clause
    for _, t := range sel.From {
        fmt.Printf("Table: %s\n", sqlparser.String(t))
    }

    // Print column names from the select list
    for _, e := range sel.SelectExprs {
        switch expr := e.(type) {
        case *sqlparser.AliasedExpr:
            col, ok := expr.Expr.(*sqlparser.ColName)
            if !ok {
                continue
            }
            fmt.Printf("Column: %s\n", col.Name.String())
        }
    }
}
This would print out:
Table: users
Column: id
Column: name
You can modify the query by manipulating the AST and then use sqlparser.String to format it back into SQL. (The exact field names and helper constructors below vary a little between sqlparser versions, so check the version you have installed.)
sel.Where = &sqlparser.Where{
    Type: "where",
    Expr: &sqlparser.ComparisonExpr{
        Operator: ">",
        Left:     &sqlparser.ColName{Name: sqlparser.NewColIdent("age")},
        Right:    sqlparser.NewIntLiteral("21"),
    },
}

updatedSQL := sqlparser.String(sel)
fmt.Println(updatedSQL)
This would print out the updated query string (note that the formatter normalizes keywords to lower case):
select id, name from users where age > 21
By comparing the original and updated AST, you can also compute a diff to see what changed. This is useful for things like schema migration tools.
Tips for SQL Parsing in Production
When using SQL parsers for anything beyond basic validation, there are a few things to keep in mind:
Parsing is expensive – Building the AST requires traversing the entire query string and allocating lots of small structs. For this reason, you generally want to avoid parsing the same query over and over. Parse once and cache the AST if you need to analyze the same query multiple times.
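A sketch of such a cache, using a plain map guarded by a mutex and keyed by the raw SQL text. In production you would want to bound its size (for example with an LRU) and treat cached ASTs as read-only, since they are shared between callers:

package main

import (
    "fmt"
    "sync"

    "github.com/xwb1989/sqlparser"
)

// parseCache memoizes parsed statements keyed by the raw SQL text so that hot
// queries are only parsed once. Cached ASTs should be treated as read-only.
type parseCache struct {
    mu    sync.RWMutex
    stmts map[string]sqlparser.Statement
}

func newParseCache() *parseCache {
    return &parseCache{stmts: make(map[string]sqlparser.Statement)}
}

func (c *parseCache) Parse(sql string) (sqlparser.Statement, error) {
    c.mu.RLock()
    stmt, ok := c.stmts[sql]
    c.mu.RUnlock()
    if ok {
        return stmt, nil
    }
    stmt, err := sqlparser.Parse(sql)
    if err != nil {
        return nil, err
    }
    c.mu.Lock()
    c.stmts[sql] = stmt // unbounded; use an LRU in real code
    c.mu.Unlock()
    return stmt, nil
}

func main() {
    cache := newParseCache()
    for i := 0; i < 3; i++ {
        if _, err := cache.Parse("SELECT id FROM users WHERE id = 1"); err != nil {
            panic(err)
        }
    }
    fmt.Println("cached statements:", len(cache.stmts))
}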
Watch out for dialect differences – The SQL language has a standard, but every database implements its own quirks and extensions. If you're using a parser that was built for a different database than your target, you may run into compatibility issues with things like data types, built-in functions, and hint syntax. Be sure to test against the full grammar of your specific database.
Beware arbitrary input – It should go without saying, but never use string concatenation or interpolation to build SQL queries. Injection is just as big a risk in your Go code as it is in the database! Use parameter placeholders for any user-supplied values.
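For example, with database/sql and a MySQL driver, user-supplied values go through bind parameters rather than into the query text. The driver import and the DSN in main are placeholders for whatever your project uses:

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // example driver choice
)

func findAdults(db *sql.DB, minAge int) error {
    // The value is sent as a bind parameter; it is never spliced into the SQL
    // text, so it cannot change the structure of the query.
    rows, err := db.Query("SELECT id, name FROM users WHERE age > ?", minAge)
    if err != nil {
        return err
    }
    defer rows.Close()

    for rows.Next() {
        var id int
        var name string
        if err := rows.Scan(&id, &name); err != nil {
            return err
        }
        log.Printf("id=%d name=%s", id, name)
    }
    return rows.Err()
}

func main() {
    // Placeholder DSN; substitute your own connection string.
    db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/appdb")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    if err := findAdults(db, 18); err != nil {
        log.Fatal(err)
    }
}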
Consider streaming interfaces – Some libraries, such as the pgx PostgreSQL driver, can stream queries and results rather than requiring everything to be buffered into memory at once. For very large queries and result sets, this can be a major performance win.
The Future of SQL Parsing
SQL has been the workhorse of the database world for decades, and that's unlikely to change anytime soon. While trendy NoSQL datastores come and go, the relational model and SQL still reign supreme for most workloads.
This means SQL parsing will remain a critical capability, and we can expect to see continued investment in parser performance, compliance with new SQL standards, and integration with cloud-native deployment models.
One exciting frontier is machine learning applied to query analysis and optimization. By collecting massive corpuses of queries, we can train models to predict things like resource utilization and index compatibility, allowing what used to be rules-based systems to become self-optimizing.
As data workloads continue to grow in scale and complexity, technologies like SQL parsing that provide visibility into application-database interaction will become even more valuable. Instrumentation, governance, and automation will be critical to keep database infrastructure performant and secure.
While SQL has sometimes been dismissed as a clunky, outdated technology, it's proven remarkably resilient and adaptable to modern needs. By providing an abstraction over physical storage and enabling declarative data manipulation, SQL will continue to be the lingua franca that ties together our data systems.
Conclusion
In this post, we've taken a whirlwind tour through the world of SQL parsing in Go. We've seen how SQL parsers work under the hood, common use cases ranging from query building to database proxies, and tips for using parsers effectively in your applications.
Next time you find yourself working with SQL queries, consider how parsing them might open up new possibilities. By taking advantage of the fantastic open source SQL parsing libraries available in the Go ecosystem, you can add powerful new capabilities to your database tooling.
Here are the key takeaways:
- SQL parsing enables treating queries as structured data rather than opaque strings, which unlocks a huge range of analysis and transformation use cases
- Go has a number of mature, well-tested SQL parsing libraries, most of them focused on MySQL syntax
- Parsers work by tokenizing SQL and constructing an abstract syntax tree (AST) following the rules of the SQL grammar
- Common applications of SQL parsing include database drivers/ORMs, query builders, monitoring, and proxy middleware
- Adding a query parsing layer to your application can enable better observability, security, and performance optimization
- When parsing in a production environment, be mindful of performance overhead, dialect compatibility, and injection risks
- SQL is not going away anytime soon, so SQL parsing will continue to be a critical tool in the data engineer's toolbox
For further reading, check out these resources:
- How a SQL Database Works: https://blog.codemagic.io/how-a-sql-database-works/
- Writing an SQL Parser in Go: https://spapas.github.io/2018/09/23/go-parse-sql/
- Building an SQL-Based DB Proxy in Go: https://dev.to/pliutau/building-an-sql-based-database-proxy-in-go–part-1-motivation–design-3mi1
Happy parsing!