Go Database Guide: database/sql and GORM Tutorial

Go Database Connectivity: Quick Answer
Go connects to databases via the database/sql standard library package combined with a database-specific driver. sql.DB manages a connection pool automatically. For productivity, the GORM ORM wraps database/sql with a clean, chainable API for CRUD, associations, and migrations. Use raw database/sql for maximum performance and control; use GORM for faster development with less boilerplate.
Database Connectivity: SQL & GORM
For any application to be useful in the long term, it must persist data. Go provides a very powerful, driver-agnostic foundation called database/sql in its standard library. This allows you to write database code once and swap out the underlying engine (PostgreSQL, MySQL, SQLite) with minimal changes.
In this module, we will explore both raw SQL interactions and the most popular Object-Relational Mapper (ORM) in the Go world: GORM.
Pool Management is Automatic: sql.DB is not a single connection but a managed pool. It opens, reuses, and closes connections behind the scenes, is safe for concurrent use, and should be created once and shared across your application.
The Standard Way: database/sql
To use the standard library, you need a "Driver" for your specific database. For this example, we'll use PostgreSQL.
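A minimal sketch of opening a handle with the lib/pq driver. The DSN values, the users table, and its columns are placeholders; adjust them for your environment:

```go
package main

import (
	"context"
	"database/sql"
	"fmt"
	"log"
	"time"

	_ "github.com/lib/pq" // registers itself as the "postgres" driver
)

func main() {
	// Placeholder DSN; adjust host, credentials, and database name.
	dsn := "host=localhost port=5432 user=app password=secret dbname=appdb sslmode=disable"

	// sql.Open validates the DSN but does not connect yet.
	db, err := sql.Open("postgres", dsn)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// PingContext forces a real connection, verifying the server is reachable.
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	if err := db.PingContext(ctx); err != nil {
		log.Fatal(err)
	}

	// With the PostgreSQL driver, placeholders are $1, $2, ...
	var name string
	err = db.QueryRowContext(ctx, "SELECT name FROM users WHERE id = $1", 1).Scan(&name)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(name)
}
```

Note that sql.Open only prepares the handle; the first real network round-trip happens on Ping or the first query.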
The Developer's Favorite: GORM
While raw SQL gives you the most control, many developers prefer the productivity of an ORM. GORM is the industry standard for Go, offering a fantastic balance of features and performance.
Defining Your Model
GORM uses structs as the blueprint for your database tables.
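A sketch of a model, assuming a hypothetical User type. Embedding gorm.Model adds the conventional ID, CreatedAt, UpdatedAt, and DeletedAt fields; GORM maps this struct to a users table by default (pluralised, snake_cased):

```go
package main

import (
	"fmt"

	"gorm.io/gorm"
)

// User maps to a "users" table. Field tags here are illustrative.
type User struct {
	gorm.Model        // embeds ID, CreatedAt, UpdatedAt, DeletedAt
	Name       string `gorm:"size:100;not null"`
	Email      string `gorm:"uniqueIndex;not null"`
	Age        int
}

func main() {
	fmt.Println(User{Name: "Alice"}.Name)
}
```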
CRUD Operations with GORM
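A sketch of the four basic operations, assuming the User model above and a PostgreSQL DSN (all connection details are placeholders):

```go
package main

import (
	"log"

	"gorm.io/driver/postgres"
	"gorm.io/gorm"
)

type User struct {
	gorm.Model
	Name  string
	Email string `gorm:"uniqueIndex"`
}

func main() {
	// Placeholder DSN; adjust for your environment.
	dsn := "host=localhost user=app password=secret dbname=appdb port=5432"
	db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
	if err != nil {
		log.Fatal(err)
	}

	// Create: INSERT, back-filling the primary key into user.ID.
	user := User{Name: "Alice", Email: "alice@example.com"}
	if err := db.Create(&user).Error; err != nil {
		log.Fatal(err)
	}

	// Read: First finds by primary key; Where adds conditions.
	var found User
	db.First(&found, user.ID)
	db.Where("email = ?", "alice@example.com").First(&found)

	// Update: change a single column on the fetched record.
	db.Model(&found).Update("Name", "Alice Smith")

	// Delete: a soft delete (sets DeletedAt), because the model embeds gorm.Model.
	db.Delete(&found)
}
```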
Database Best Practices
- Select only the columns you need (SELECT name, email): be explicit about which columns you fetch. Fetching * is slower and wastes memory.
- Use transactions (db.Begin(), db.Commit()): crucial for operations that involve multiple tables, to ensure data integrity if part of the process fails.
- Handle NULLs explicitly: Go's standard types aren't nullable. Use specialized types like sql.NullString to handle nullable columns correctly.
| Task / Feature | Standard Library (sql) | GORM (ORM) |
|---|---|---|
| Performance | Highest (No abstraction overhead) | Slightly Slower (Reflection overhead) |
| Developer Speed | Moderate (More boilerplate) | High (Clean, chaining API) |
| Relationship Handling | Complex (Requires manual joins) | Automatic (HasMany, BelongsTo, etc.) |
Using Transactions
Transactions are critical when multiple database operations must succeed or fail together. A payment that deducts from one account and credits another must be atomic — if the credit fails, the deduction must be rolled back.
Transactions with database/sql
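A sketch of the account-transfer example with the standard library. The accounts table and its columns are illustrative; the key pattern is deferring Rollback unconditionally, which is a harmless no-op once Commit has succeeded:

```go
package main

import (
	"context"
	"database/sql"
)

// Transfer moves an amount between two accounts atomically.
func Transfer(ctx context.Context, db *sql.DB, from, to int64, amount int) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return err
	}
	// Rollback after a successful Commit returns sql.ErrTxDone and does nothing,
	// so deferring it unconditionally is safe and covers every early return.
	defer tx.Rollback()

	if _, err := tx.ExecContext(ctx,
		"UPDATE accounts SET balance = balance - $1 WHERE id = $2", amount, from); err != nil {
		return err
	}
	if _, err := tx.ExecContext(ctx,
		"UPDATE accounts SET balance = balance + $1 WHERE id = $2", amount, to); err != nil {
		return err
	}
	return tx.Commit()
}

func main() {} // wiring a real *sql.DB is environment-specific
```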
Transactions with GORM
GORM's Transaction helper automatically commits on nil return and rolls back on error — a cleaner API than managing tx.Rollback() manually.
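The same transfer expressed with GORM's helper. The Account model and column names are illustrative; the point is the control flow: returning nil commits, returning any error rolls the whole transaction back:

```go
package main

import "gorm.io/gorm"

// Account is an illustrative model.
type Account struct {
	ID      uint
	Balance int
}

// transfer runs both updates inside one transaction.
func transfer(db *gorm.DB, fromID, toID uint, amount int) error {
	return db.Transaction(func(tx *gorm.DB) error {
		if err := tx.Model(&Account{}).Where("id = ?", fromID).
			Update("balance", gorm.Expr("balance - ?", amount)).Error; err != nil {
			return err // any error triggers rollback
		}
		if err := tx.Model(&Account{}).Where("id = ?", toID).
			Update("balance", gorm.Expr("balance + ?", amount)).Error; err != nil {
			return err
		}
		return nil // nil return commits
	})
}

func main() {}
```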
Connection Pool Configuration
The sql.DB connection pool has sensible defaults but should be tuned for production:
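A sketch of the tuning knobs, applied right after sql.Open. The numbers are illustrative starting points, not recommendations for every workload:

```go
package main

import (
	"database/sql"
	"time"
)

// configurePool applies typical production settings to an open handle.
func configurePool(db *sql.DB) {
	db.SetMaxOpenConns(25)                 // cap total concurrent connections
	db.SetMaxIdleConns(25)                 // keep warm connections ready for reuse
	db.SetConnMaxLifetime(5 * time.Minute) // recycle before server/load-balancer timeouts
	db.SetConnMaxIdleTime(2 * time.Minute) // release idle connections under low load
}

func main() {}
```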
Under-provisioning MaxOpenConns causes request queuing under load. Over-provisioning can exhaust database connection limits. For PostgreSQL, the max_connections setting (default 100) is the hard ceiling — your pool size across all application instances must stay below it.
Preventing SQL Injection
Always use parameterised queries. Never concatenate user input into SQL strings:
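A side-by-side sketch (table and column names are illustrative). In the safe version, the driver sends the value separately from the query text, so user input can never be interpreted as SQL:

```go
package main

import (
	"context"
	"database/sql"
)

// UNSAFE: user input is spliced into the SQL string.
// An email of  x' OR '1'='1  would match every row.
func findUserUnsafe(ctx context.Context, db *sql.DB, email string) *sql.Row {
	return db.QueryRowContext(ctx,
		"SELECT id FROM users WHERE email = '"+email+"'")
}

// SAFE: the value travels as a bound parameter, not as query text.
// ($1 is PostgreSQL placeholder syntax; MySQL uses ? instead.)
func findUserSafe(ctx context.Context, db *sql.DB, email string) *sql.Row {
	return db.QueryRowContext(ctx,
		"SELECT id FROM users WHERE email = $1", email)
}

func main() {}
```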
GORM also uses parameterised queries by default through its API methods. Take care only with Raw() and Exec(), where you may end up constructing query strings manually; pass user input through ? placeholders there as well.
For more on SQL patterns and query structure, see our SQL query examples guide. For securing your entire backend, see Go security best practices.
GORM Associations
GORM's relationship handling is one of its biggest productivity advantages. Defining associations between models automatically handles JOIN queries and eager loading:
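A sketch of a has-many association and eager loading. The User/Order models are illustrative; the UserID foreign key follows GORM's default naming convention:

```go
package main

import "gorm.io/gorm"

// One User owns many Orders.
type User struct {
	gorm.Model
	Name   string
	Orders []Order // has-many association
}

type Order struct {
	gorm.Model
	UserID uint // foreign key back to User, by convention
	Total  int
}

// loadUsersWithOrders fetches all users and eagerly loads their orders.
func loadUsersWithOrders(db *gorm.DB) ([]User, error) {
	var users []User
	err := db.Preload("Orders").Find(&users).Error
	return users, err
}

func main() {}
```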
The Preload clause generates an efficient WHERE user_id IN (...) query rather than N+1 individual queries.
External Resources
- database/sql — Go standard library documentation
- GORM official documentation — gorm.io
- Opening a Database Handle — go.dev
Next Steps
Persistence is the final piece of the structural puzzle. Now we enter the final phases of our course. In our next tutorial, we will explore Middleware Patterns, learning how to layer security, logging, and metrics onto our web servers in a clean and reusable way.
Common Database Mistakes in Go
1. Not closing rows after a query
rows, err := db.QueryContext(ctx, query) — if you forget defer rows.Close(), the connection is held open until garbage collection, exhausting the connection pool under load.
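The full iteration pattern looks like this (table and column names are illustrative): Close is deferred immediately after the error check, each Scan is checked, and rows.Err() is checked after the loop:

```go
package main

import (
	"context"
	"database/sql"
)

// listNames demonstrates the complete query-iterate-close pattern.
func listNames(ctx context.Context, db *sql.DB) ([]string, error) {
	rows, err := db.QueryContext(ctx, "SELECT name FROM users")
	if err != nil {
		return nil, err
	}
	defer rows.Close() // returns the connection to the pool on every exit path

	var names []string
	for rows.Next() {
		var name string
		if err := rows.Scan(&name); err != nil {
			return nil, err
		}
		names = append(names, name)
	}
	// rows.Err reports any error that terminated the loop early.
	return names, rows.Err()
}

func main() {}
```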
2. Using db.Query instead of db.QueryContext
The non-context variants (Query, Exec, QueryRow) cannot be cancelled. In a web server, always use the Context variants and pass r.Context() so database calls are cancelled when the client disconnects.
3. Not using db.Prepare for repeated queries
Executing the same query thousands of times without a prepared statement re-parses it on every call. Use db.PrepareContext for queries in hot paths to save the round-trip to the database server. See the database/sql documentation.
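One common shape for this is preparing the statement once at startup; the hypothetical UserStore below holds the prepared statement for its hot-path query:

```go
package main

import (
	"context"
	"database/sql"
)

// UserStore prepares its hot-path query once and reuses it on every call.
type UserStore struct {
	getByID *sql.Stmt
}

func NewUserStore(ctx context.Context, db *sql.DB) (*UserStore, error) {
	// Parsed and planned once; later executions skip that round-trip.
	stmt, err := db.PrepareContext(ctx, "SELECT name FROM users WHERE id = $1")
	if err != nil {
		return nil, err
	}
	return &UserStore{getByID: stmt}, nil
}

func (s *UserStore) Name(ctx context.Context, id int64) (string, error) {
	var name string
	err := s.getByID.QueryRowContext(ctx, id).Scan(&name)
	return name, err
}

func main() {}
```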
4. GORM AutoMigrate in production
db.AutoMigrate(&User{}) is convenient for development but dangerous in production — it can add columns and indexes but never drops them, and may lock tables on large datasets. Use proper migration tools (golang-migrate, goose) for production schema changes.
5. Ignoring sql.ErrNoRows
db.QueryRowContext(...).Scan(...) returns sql.ErrNoRows when no record is found. This is not a fatal error — handle it explicitly: if errors.Is(err, sql.ErrNoRows) { return nil, ErrNotFound }.
Frequently Asked Questions
Should I use GORM or raw database/sql?
GORM reduces boilerplate for CRUD operations and is excellent for rapid development. Raw database/sql with sqlx or pgx gives more control and is faster for complex queries. Many production services use GORM for simple operations and drop to raw SQL for performance-critical paths.
How do I manage database connection pool settings?
Set db.SetMaxOpenConns(n), db.SetMaxIdleConns(n), and db.SetConnMaxLifetime(d) immediately after sql.Open. A typical starting point for a web server: max open = number of CPU cores × 2, max idle = same, lifetime = 5 minutes.
How do I run database migrations in Go?
golang-migrate is the most widely used migration tool for Go — it supports PostgreSQL, MySQL, SQLite, and more, and integrates with both CLI and application code. Write migrations as plain SQL files for maximum portability.
