Transaction Management with PostgreSQL in C# .NET: Practical Experiences Using Dapper

Last night I slipped into coding mode again. As you know, the weather in Bursa has turned chilly, so I opened the window for some fresh air, sat down at the computer, and got to work on a REST API project. Data integrity is crucial, of course, especially when you perform multiple operations that should all succeed or fail together. That's when I started working with transactions, which I handle with Dapper on PostgreSQL; it has always been practical for me. I've been developing projects like this for years, yet some small detail comes up every time.

Anyway, shall we review what a transaction is? Simply put, it groups a series of database operations so that they either all succeed or none of them do. If an error occurs at any point, you perform a rollback to undo everything. In PostgreSQL, this is managed with the BEGIN, COMMIT, and ROLLBACK commands. Wiring this up in C# with Dapper takes a bit of code, but it's nothing to be afraid of, and it's indispensable for someone who codes daily, like me.

By the way, I remembered that last week I forgot to use a transaction in a project. I was writing data to two tables; one insert succeeded while the other failed, leaving the database in an inconsistent state. Fortunately it was a testing environment, so I noticed before deploying to production. Since then, I always include transactions.

Let's get into the topic. First, install the NuGet packages for Dapper and Npgsql; Npgsql is the ADO.NET provider for PostgreSQL. Then configure your connection string; I usually keep it in appsettings.json. Open the connection and start the transaction. Dapper works with the standard IDbTransaction interface. Here's an example code snippet:

using (var connection = new NpgsqlConnection(connectionString))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    {
        try
        {
            // Operations go here
            connection.Execute("INSERT INTO users (name) VALUES (@name)", new { name = "Ali" }, transaction);
            connection.Execute("INSERT INTO orders (user_id, amount) VALUES (@user_id, @amount)", new { user_id = 1, amount = 100 }, transaction);
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }
}

This code performs two inserts and, in case of an error, rolls everything back. Isn't it simple? Just be careful not to forget to dispose of the connection; the using blocks handle that. I once forgot to close connections, which leaked them from the pool and slowed the application down. Learned my lesson from that mistake.

Now, imagine a more complex scenario. Suppose an API endpoint registers a user and sends an email. If sending the email fails you can roll back the transaction, but the reverse does not hold: the email lives outside the database, so once it has gone out, no rollback can take it back. This is where distributed transactions come into play, but with PostgreSQL it's usually better to keep things simple. I either send the email after the commit, or I use a compensation pattern, meaning I clean up explicitly if a later step fails.
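
As a rough sketch of that "commit first, then send" idea: the IEmailSender, its SendWelcome method, the newUser object, and the email_verified column below are hypothetical placeholders I'm using for illustration, not part of Dapper or Npgsql.

using (var connection = new NpgsqlConnection(connectionString))
{
    connection.Open();
    int userId;
    using (var transaction = connection.BeginTransaction())
    {
        try
        {
            // Database work happens inside the transaction as usual
            userId = connection.QuerySingle<int>(
                "INSERT INTO users (name, email) VALUES (@Name, @Email) RETURNING id",
                newUser, transaction);
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }

    // The side effect runs only after a successful commit; if it fails,
    // compensate with a follow-up statement instead of a rollback.
    try
    {
        emailSender.SendWelcome(newUser.Email);
    }
    catch
    {
        // Hypothetical compensation: flag the user so a retry job can pick it up later
        connection.Execute("UPDATE users SET email_verified = false WHERE id = @Id", new { Id = userId });
    }
}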

Practical Example: Usage in REST API

In my projects, I develop REST APIs with C# on .NET Core. The controller action stays thin and delegates to the service layer, where the transaction is opened and the Dapper calls run. For example, in a UserService class:

public bool CreateUserWithOrder(UserDto user, OrderDto order)
{
    using var connection = _dbContext.CreateConnection();
    connection.Open();
    using var transaction = connection.BeginTransaction();
    try
    {
        // RETURNING id gives us the new primary key in the same round trip
        var userId = connection.QuerySingle<int>(
            "INSERT INTO users (name, email) VALUES (@Name, @Email) RETURNING id",
            user, transaction);
        order.UserId = userId;
        // Parameter names must match the DTO's properties (UserId, Amount), not the column names
        connection.Execute(
            "INSERT INTO orders (user_id, amount) VALUES (@UserId, @Amount)",
            order, transaction);
        transaction.Commit();
        return true;
    }
    catch (Exception ex)
    {
        transaction.Rollback();
        _logger.LogError(ex, "Transaction failed");
        return false;
    }
}

Here, the RETURNING clause fetches the inserted ID from PostgreSQL—it’s very practical. Remember to use dependency injection for the connection instead of hardcoding it. Based on my experience, this keeps the code clean.
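
For reference, _dbContext in the snippet above is just a thin factory around the connection string. Here's a minimal sketch of what such a class might look like; the class name DapperContext and the "DefaultConnection" key are my assumptions, not a fixed convention.

using System;
using System.Data;
using Microsoft.Extensions.Configuration;
using Npgsql;

public class DapperContext
{
    private readonly string _connectionString;

    public DapperContext(IConfiguration configuration)
    {
        _connectionString = configuration.GetConnectionString("DefaultConnection")
            ?? throw new InvalidOperationException("Connection string 'DefaultConnection' is missing.");
    }

    // A fresh connection per unit of work; Npgsql's built-in pooling keeps this cheap.
    public IDbConnection CreateConnection() => new NpgsqlConnection(_connectionString);
}

// Registered once in Program.cs: builder.Services.AddSingleton<DapperContext>();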

Sometimes, isolation levels confuse people. The default in PostgreSQL is Read Committed, but if you need stronger consistency, set the isolation level explicitly, e.g. connection.BeginTransaction(IsolationLevel.Serializable). It increases the chance of serialization failures and deadlocks, but gives you better data integrity. In one project I forgot to set this, and concurrent users caused data conflicts; I spent 2-3 hours debugging before spotting the issue.
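
A minimal sketch of how that can look, assuming the same usings as the examples above (Dapper, Npgsql, System.Data). PostgreSQL reports a serialization failure with SQLSTATE 40001, which is safe to retry:

const int maxRetries = 3;
for (var attempt = 1; ; attempt++)
{
    using var connection = new NpgsqlConnection(connectionString);
    connection.Open();
    using var transaction = connection.BeginTransaction(IsolationLevel.Serializable);
    try
    {
        // Illustrative statement against the orders table used earlier
        connection.Execute("UPDATE orders SET amount = amount + @Delta WHERE user_id = @UserId",
            new { Delta = 10, UserId = 1 }, transaction);
        transaction.Commit();
        break;
    }
    catch (PostgresException ex) when (ex.SqlState == "40001" && attempt < maxRetries)
    {
        // Serialization failure: roll back and retry the whole transaction
        transaction.Rollback();
    }
}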

Additionally, Dapper's transaction support is excellent, but if you are using Entity Framework, DbContext.Database.BeginTransaction() gives you equivalent control on the EF side. I prefer Dapper because it's lightweight, but don't mix the two inside the same unit of work. PostgreSQL also supports savepoints, which act as partial rollback points inside a transaction: create one with transaction.Save("point1") and undo back to it with transaction.Rollback("point1") when only part of the work fails.
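
Here's a small savepoint sketch; the audit_log table is a hypothetical "nice to have" step I made up for the example:

using var connection = new NpgsqlConnection(connectionString);
connection.Open();
using var transaction = connection.BeginTransaction();

connection.Execute("INSERT INTO users (name) VALUES (@Name)", new { Name = "Ali" }, transaction);

transaction.Save("afterUser"); // SAVEPOINT afterUser
try
{
    connection.Execute("INSERT INTO audit_log (message) VALUES (@Message)",
        new { Message = "user created" }, transaction);
}
catch
{
    // Undo only the audit insert; the user insert remains part of the transaction
    transaction.Rollback("afterUser"); // ROLLBACK TO SAVEPOINT afterUser
}

transaction.Commit();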

Why is Transaction Important?

Suppose you're building a banking app and transferring money: you deduct from one account and credit another, and if an error occurs in between, money could simply vanish. Transactions give you the ACID properties: Atomicity, Consistency, Isolation, and Durability. PostgreSQL is very strong in this regard, using write-ahead logging (WAL) to ensure durability. I consider transactions essential for any multi-statement operation; a transfer looks roughly like the sketch below.
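
A minimal sketch of such a transfer, assuming a hypothetical accounts(id, balance) table:

using var connection = new NpgsqlConnection(connectionString);
connection.Open();
using var transaction = connection.BeginTransaction();
try
{
    connection.Execute("UPDATE accounts SET balance = balance - @Amount WHERE id = @From",
        new { Amount = 100m, From = 1 }, transaction);
    connection.Execute("UPDATE accounts SET balance = balance + @Amount WHERE id = @To",
        new { Amount = 100m, To = 2 }, transaction);
    transaction.Commit(); // both legs apply together
}
catch
{
    transaction.Rollback(); // or neither does
    throw;
}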

When coding, I’m always paranoid, and I simulate transactions in tests. I use an in-memory database for unit tests, but it’s hard to test transactions there. In integration tests, I typically run real database instances. It takes time but prevents issues in production. Last week, I encountered a fault in a project logging data for embedded systems, where missing a transaction led to partial data on reset. Luckily, I caught it early and fixed it quickly.

Now, let's look at alternatives. You can push the logic into a stored procedure in PostgreSQL and call it from C# with Dapper, e.g. connection.Execute("CALL my_proc(@param)", param, transaction);. Inside the procedure you write a PL/pgSQL BEGIN ... END block; keep in mind that this is a code block, not a transaction, and a procedure can only issue its own COMMIT or ROLLBACK when it isn't called inside an explicit client-side transaction. Useful for complex logic, but harder to maintain.
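
As a rough sketch (the procedure name create_user_with_order and its body are my own invention, shown only to illustrate the shape), the DDL can be executed once and the procedure then called through Dapper:

// One-time DDL for the hypothetical procedure
const string createProc = @"
CREATE OR REPLACE PROCEDURE create_user_with_order(p_name text, p_amount numeric)
LANGUAGE plpgsql
AS $$
DECLARE
    v_user_id integer;
BEGIN
    INSERT INTO users (name) VALUES (p_name) RETURNING id INTO v_user_id;
    INSERT INTO orders (user_id, amount) VALUES (v_user_id, p_amount);
END;
$$;";

using var connection = new NpgsqlConnection(connectionString);
connection.Open();
connection.Execute(createProc);

// The CALL itself is atomic: if the second insert fails, the first is rolled back too
connection.Execute("CALL create_user_with_order(@Name, @Amount)", new { Name = "Ali", Amount = 100m });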

I think the best approach is hybrid—use simple Dapper transactions for straightforward tasks, and stored procedures for complex operations. Microsoft’s documentation provides Dapper examples for basic transaction management, which are a good starting point. They’re not detailed but sufficient for beginners.

The world of coding is filled with such details; sometimes you spend hours on a single line. When I’m with my family or doing mountaineering around Bursa, I disconnect, but I continue immediately when I return.

Potential Errors and Solutions

Common errors include command timeouts on long-running transactions, which you can mitigate by raising Npgsql's CommandTimeout. Deadlocks and concurrent-access issues are often mitigated by optimizing the indexes on the tables involved. I once forgot to create an index, and a transaction dragged on for 10 seconds before timing out.
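
With Dapper, the timeout can be raised per call through the commandTimeout parameter (in seconds; Npgsql's default command timeout is 30 seconds). A minimal sketch, reusing the connection and transaction variables from the earlier examples and an illustrative bulk statement:

// Give a long-running statement more room than the default command timeout
connection.Execute(
    "INSERT INTO orders (user_id, amount) SELECT id, 0 FROM users",
    transaction: transaction,
    commandTimeout: 120);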

When using PostgreSQL in the cloud, like AWS RDS, connection pooling matters. Npgsql pools connections automatically, but setting the maximum pool size explicitly is recommended; the default is 100, and the right value depends on your load and on how many connections the database server can actually handle. A site I came across discussed this in depth; search for 'npgsql connection pool' for more info.
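
Pooling is configured through the connection string; here's a minimal sketch with illustrative values (host, database, and credentials are placeholders):

var connectionString =
    "Host=my-db.example.com;Database=appdb;Username=app;Password=secret;" +
    "Maximum Pool Size=50;Minimum Pool Size=5;Timeout=15"; // Timeout = seconds to wait when opening/acquiring a connection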

Now, a practical tip: async transactions. Use async/await in C#, for example await connection.ExecuteAsync(…, transaction); followed by await transaction.CommitAsync();. It's essential to avoid blocking threads in API calls. I always use async in my REST APIs, which improves performance.

Here’s an example of async code:

await using var connection = new NpgsqlConnection(connectionString);
await connection.OpenAsync();
await using var transaction = await connection.BeginTransactionAsync();
try
{
    await connection.ExecuteAsync("INSERT …", param, transaction);
    await transaction.CommitAsync();
}
catch
{
    await transaction.RollbackAsync();
    throw;
}

It works very well. Just remember to use ‘await using’ so that the disposal is asynchronous. I’m not fully sure of the details, but Microsoft docs cover this.

In summary, transaction management is foundational for data security. I always include it in my code; don’t forget. For beginners, search for ‘c# dapper transaction postgresql example’ on Google, plenty of samples are available. On Reddit, the community in r/csharp also discusses this; it’s helpful.

And one more thing: in electronics circuit design, there’s a similar principle—if a signal chain encounters an error, reset the system. I sometimes draw parallels between coding and this perspective. Anyway, I hope you find this helpful. Feel free to share your experiences in comments, maybe I’ll learn something new too.

Ultimately, Dapper with PostgreSQL transactions is practical and effective. Start with small projects, and once you get used to it, it becomes indispensable. Keep coding and enjoy it! 🙂