
C# .NET with PostgreSQL JSON Operations: Practical Examples Using Dapper

Recently I found myself managing JSON data in a database, and you know how complex data structures can sometimes be a headache. I was developing a C# .NET REST API and turned to PostgreSQL because its JSON support feels more natural than MySQL's. I've been using Dapper for years; I love how lightweight and fast it is, which lets me avoid the heaviness of Entity Framework. While I was thinking about how to query the JSON, a past project came to mind where my mistakes led to data mix-ups.

PostgreSQL's JSON features are really powerful, aren't they? For instance, adding a JSON column that holds nested objects gives you NoSQL-like flexibility within a relational setup. I think this is essential for modern applications, especially when APIs deal with dynamic data. Prefer JSONB over plain JSON: it is stored in a decomposed binary format, which makes it faster to process and lets you index it (with GIN, for example). What's your opinion on how much room relational databases should give JSON?
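For reference, here is the table shape I'll assume throughout the post; a sketch with names of my own choosing:

using Dapper;
using Npgsql;

// connectionString setup is shown in the next section.
using var conn = new NpgsqlConnection(connectionString);
// jsonb rather than json: binary storage, faster operators, GIN-indexable.
conn.Execute(@"CREATE TABLE IF NOT EXISTS users (
    id      serial PRIMARY KEY,
    profile jsonb  NOT NULL DEFAULT '{}'::jsonb
);");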

Integrating with Dapper is actually very simple. First, install the Npgsql and Dapper packages from NuGet. Then configure your connection string, typically stored in appsettings.json. After that, you query JSON with PostgreSQL's built-in operators, such as ->> for extracting a value as text.
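A minimal setup sketch, assuming a DefaultConnection entry in appsettings.json (the key name and file layout are my choices, not fixed conventions):

using Microsoft.Extensions.Configuration;
using Npgsql;

// Read the connection string from appsettings.json
// (requires the Microsoft.Extensions.Configuration.Json package).
var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json")
    .Build();
var connectionString = config.GetConnectionString("DefaultConnection");

using var conn = new NpgsqlConnection(connectionString);
conn.Open();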

Let me give a practical example. Suppose there's a 'users' table with a 'profile' JSONB column containing fields like 'name' and 'age'. The query would be: SELECT id, profile->>'name' AS name FROM users WHERE (profile->>'age')::int > 30; Note the cast: ->> always returns text, so compare ages as integers rather than strings. To run this with Dapper, you define a model class and call connection.Query<YourModel>(sql); the results map automatically, as in the sketch below.
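A minimal sketch of that mapping; UserNameRow and its properties are my illustrative assumptions, matching the selected columns:

using Dapper;
using Npgsql;

// connectionString as loaded in the setup sketch above.
using var conn = new NpgsqlConnection(connectionString);
// The cast matters: ->> yields text, so age must be compared as an integer.
var sql = "SELECT id, profile->>'name' AS name FROM users WHERE (profile->>'age')::int > @minAge";
var rows = conn.Query<UserNameRow>(sql, new { minAge = 30 });

// Hypothetical model whose property names match the selected columns/aliases.
public class UserNameRow
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}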

Adding and Updating JSON Data

For insertion, you can write: INSERT INTO users (profile) VALUES (@profile::jsonb); passing a JSON string as the parameter. Dapper makes the parameterization easy: serialize your object with JsonSerializer.Serialize(userProfile) and pass the resulting string; PostgreSQL parses it during the ::jsonb cast. Beware that large documents can hurt performance, so consider creating an index: CREATE INDEX idx_profile ON users USING gin (profile); a GIN index is built for jsonb containment and key-existence lookups.
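A quick insert sketch along those lines; the userProfile payload is a hypothetical example of mine:

using System.Text.Json;
using Dapper;
using Npgsql;

// connectionString as configured earlier.
using var conn = new NpgsqlConnection(connectionString);
// Serialize to a JSON string; PostgreSQL parses it during the ::jsonb cast.
var userProfile = new { name = "Ayşe", age = 34, city = "Bursa" };
conn.Execute("INSERT INTO users (profile) VALUES (@profile::jsonb)",
             new { profile = JsonSerializer.Serialize(userProfile) });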

Nested JSON paths can trip you up. '$.key.subkey' is JSONPath syntax, meant for functions like jsonb_path_query; feed it to the arrow operators and you silently get no data back. With the operators you chain steps instead: profile->'key'->>'subkey'. I spent hours debugging that. For updates, you can use: UPDATE users SET profile = profile || @newData::jsonb WHERE id = @id; the || operator does a shallow merge, with top-level keys from @newData overwriting existing ones. Do you also use such merge operations?
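Here is what that merge update looks like through Dapper, a sketch reusing the assumed table and names from above:

using System.Text.Json;
using Dapper;
using Npgsql;

// connectionString as configured earlier.
using var conn = new NpgsqlConnection(connectionString);
// || merges shallowly: top-level keys in @newData overwrite existing ones.
var sql = "UPDATE users SET profile = profile || @newData::jsonb WHERE id = @id";
conn.Execute(sql, new { newData = JsonSerializer.Serialize(new { city = "Bursa" }), id = 1 });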

(Check the PostgreSQL documentation on JSON functions and operators for more detailed examples.)

Why I Prefer Dapper

I love Dapper because it's a micro-ORM: you write your own SQL, so you stay in control. The SQL that Entity Framework generates from LINQ is sometimes unexpected, which makes optimization difficult. Managing PostgreSQL JSON with Dapper is both fast and safe; Dapper parameterizes your queries, which protects against SQL injection. Honestly, I don't abandon it even in large projects.

Here's a sample C# code snippet. You'll need using Dapper; and using Npgsql; at the top (plus using System.Text.Json; for the deserialization step below):

using (var conn = new NpgsqlConnection(connectionString))
{
    await conn.OpenAsync();
    // ->> extracts the field as text, so this is a plain string comparison.
    var sql = "SELECT * FROM users WHERE (profile->>'city') = @city";
    var users = await conn.QueryAsync<User>(sql, new { city = "Bursa" });
    return users.ToList();
}

It's that simple. The User class exposes profile as a string, which you can then deserialize yourself; a sketch follows below. Everything worked fine until one day the connection pool was exhausted and queries started timing out. Increasing Max Pool Size in the connection string resolved it, though an exhausted pool often points to connections not being disposed, so check your using blocks first.
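For completeness, here is the shape I'm assuming for User and for the stored document; both classes are illustrative, not prescribed:

using System.Text.Json;
using Dapper;
using Npgsql;

// connectionString as configured earlier.
using var conn = new NpgsqlConnection(connectionString);
var user = conn.QueryFirst<User>("SELECT id, profile FROM users");

// Deserialize on demand; PropertyNameCaseInsensitive bridges 'name' vs 'Name'.
var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
var data = JsonSerializer.Deserialize<ProfileData>(user.Profile, options);

// Hypothetical POCO matching the users table columns.
public class User
{
    public int Id { get; set; }
    public string Profile { get; set; } = "{}";
}

// Hypothetical shape of the document stored in the profile column.
public class ProfileData
{
    public string? Name { get; set; }
    public int Age { get; set; }
    public string? City { get; set; }
}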

By the way, you can find similar discussions online; searching for 'c# dapper postgresql json query examples' turns up plenty of forum threads and Stack Overflow answers, so I won't link specific ones here.

You can filter JSON with more advanced operators, like WHERE profile ? 'key' to check whether a top-level key exists, or containment with @>, e.g., WHERE profile @> '{"age": 30}'::jsonb to match rows whose profile contains that key/value pair. These are powerful for data analysis, though I typically keep it simple to avoid excessive complexity.
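A sketch of the containment filter from Dapper, reusing the User model from above; the literal is passed as a text parameter and cast to jsonb server-side:

using Dapper;
using Npgsql;

// connectionString as configured earlier.
using var conn = new NpgsqlConnection(connectionString);
// The @> predicate can be served by the GIN index created earlier.
var sql = "SELECT id, profile FROM users WHERE profile @> @filter::jsonb";
var matches = conn.Query<User>(sql, new { filter = "{\"age\": 30}" });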

In summary, using PostgreSQL JSON features with Dapper in C# .NET REST APIs boosts efficiency, especially for dynamic form data. My advice is to try it on small projects; once you get used to it, you’ll never want to go back. (Performance is critical, after all.)

PostgreSQL can also validate JSON against a schema through extensions, but I prefer manual checks in application code. Anyway, I won't go further into that here.

Check the Microsoft docs for C# JSON handling; they're quite helpful. I've also seen community approaches that map JSON columns straight to objects through Dapper's custom type handlers, but I stick to vanilla.
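If you're curious, here is a minimal sketch of such a type handler using Dapper's SqlMapper.TypeHandler; ProfileData is the assumed POCO from earlier:

using System.Data;
using System.Text.Json;
using Dapper;

// Register once at startup; Dapper then maps jsonb columns to ProfileData directly.
SqlMapper.AddTypeHandler(new JsonTypeHandler<ProfileData>());

// Generic handler that (de)serializes T to and from a column's JSON text.
public class JsonTypeHandler<T> : SqlMapper.TypeHandler<T>
{
    public override void SetValue(IDbDataParameter parameter, T value)
        => parameter.Value = JsonSerializer.Serialize(value); // writes may still need a ::jsonb cast in SQL

    public override T Parse(object value)
        => JsonSerializer.Deserialize<T>((string)value)!;
}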

Overall, I'm satisfied with this setup, but sometimes JSON size becomes an issue: PostgreSQL caps a single field, jsonb included, at roughly 1GB. Do you split very large JSON documents into separate tables?

Practical tip: use AS aliases in queries for easier mapping, like profile->>'name' AS fullName. Unquoted identifiers fold to lowercase in PostgreSQL, but Dapper matches column names to properties case-insensitively, so fullname still binds to a FullName property.

On a tangent, last week in Bursa, it was raining while I was coding. I looked outside at the mountains and dreamed of mountain climbing. Then I got back to work, designed a microcontroller board, and accidentally swapped pins on an I2C connection. Eventually, I fixed it. Camping and coding breaks are both necessary; they help clear the mind 🙂