When I had to sync thousands of complex orders with nested items from an external ERP system, my classic Dapper solution hit a wall.
The result? 40+ seconds of runtime and massive CPU load.
That’s when I switched to Dapper Plus - and dropped execution time to just under 2 seconds.
This article gives you everything: a real problem, working .NET 9 code, full schema, and in-depth explanations so you can learn how to use Dapper Plus effectively and confidently.
Dapper excels at micro-ORM read operations. But when you need to perform thousands of inserts, updates, or deletes, the number of round-trips to the database becomes a major bottleneck.
The bigger your data, the worse the performance.
We get a CSV export or ERP API payload every hour with updated order data. Each order has:
- Header fields: customer, status, and a total amount
- A list of order items, each with a product, quantity, and price

We want to:
- Insert new orders and update existing ones (an upsert/merge)
- Replace each order's items with the latest version from the payload
- Keep the whole sync fast enough to run every hour

Let's dive into the structure.
Here's the table definition we use to model orders and their related items - a structure you'll find in many systems.

CREATE TABLE Orders (
    Id INT PRIMARY KEY,
    Customer NVARCHAR(100),
    Status NVARCHAR(50),
    TotalAmount DECIMAL(18,2),
    CreatedAt DATETIME2
);

CREATE TABLE OrderItems (
    Id INT PRIMARY KEY,
    OrderId INT FOREIGN KEY REFERENCES Orders(Id),
    Product NVARCHAR(100),
    Quantity INT,
    Price DECIMAL(18,2)
);
Why does OrderItems have its own primary key? Dapper Plus requires a primary key for operations like BulkUpdate, BulkMerge, and BulkDelete. That's why we don't rely solely on the (OrderId, Product) combination.
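As a minimal sketch of what that key configuration looks like (this assumes the standard Dapper Plus fluent mapping API and is typically done once at application startup):

```csharp
// Hypothetical one-time mapping setup. Dapper Plus uses these
// key mappings to identify rows during BulkUpdate, BulkMerge,
// and BulkDelete.
DapperPlusManager.Entity<Order>()
    .Table("Orders")
    .Key(x => x.Id);

DapperPlusManager.Entity<OrderItem>()
    .Table("OrderItems")
    .Key(x => x.Id); // the surrogate key discussed above
```

With an explicit surrogate key mapped, every item row can be targeted individually, regardless of how many items share the same OrderId and Product.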
These are our strongly typed C# models for orders and order items.
public class Order
{
    public int Id { get; set; }
    public string Customer { get; set; } = default!;
    public string Status { get; set; } = default!;
    public decimal TotalAmount { get; set; }
    public DateTime CreatedAt { get; set; }
    public List<OrderItem> Items { get; set; } = new();
}

public class OrderItem
{
    public int Id { get; set; }
    public int OrderId { get; set; }
    public string Product { get; set; } = default!;
    public int Quantity { get; set; }
    public decimal Price { get; set; }
}
To simulate a real sync, we generate 500 orders with 20 items each (10,000 rows in total).

List<Order> GenerateOrders(int count)
{
    var rng = new Random();
    var orders = new List<Order>();

    for (int i = 1; i <= count; i++)
    {
        var order = new Order
        {
            Id = i,
            Customer = $"Customer {i}",
            Status = "Pending",
            CreatedAt = DateTime.UtcNow,
            Items = new List<OrderItem>()
        };

        const int itemsCount = 20;
        for (int j = 1; j <= itemsCount; j++)
        {
            order.Items.Add(new OrderItem
            {
                Id = i * 100 + j, // unique ID across orders (supports up to 99 items per order)
                Product = $"Product {j}",
                Quantity = rng.Next(1, 10),
                Price = rng.Next(10, 100)
            });
        }

        orders.Add(order);
    }

    return orders;
}
This creates the data we'll use for both benchmarking and testing Dapper vs Dapper Plus.
Let’s look at what a naive Dapper implementation might look like:
public async Task SyncOrdersWithClassicDapperAsync(List<Order> orders)
{
    foreach (var order in orders)
    {
        // Recalculate total before insert
        order.TotalAmount = order.Items.Sum(x => x.Price * x.Quantity);

        // Try UPDATE Order first
        var rowsAffected = await _connection.ExecuteAsync(
            @"UPDATE Orders
              SET Customer = @Customer,
                  Status = @Status,
                  TotalAmount = @TotalAmount,
                  CreatedAt = @CreatedAt
              WHERE Id = @Id",
            order);

        // If not updated, INSERT Order
        if (rowsAffected == 0)
        {
            await _connection.ExecuteAsync(
                @"INSERT INTO Orders (Id, Customer, Status, TotalAmount, CreatedAt)
                  VALUES (@Id, @Customer, @Status, @TotalAmount, @CreatedAt)",
                order);
        }

        // Delete existing OrderItems (optional if recreating)
        await _connection.ExecuteAsync(
            "DELETE FROM OrderItems WHERE OrderId = @OrderId",
            new { OrderId = order.Id });

        // Set OrderId and insert new items
        foreach (var item in order.Items)
        {
            item.OrderId = order.Id;

            await _connection.ExecuteAsync(
                @"INSERT INTO OrderItems (Id, OrderId, Product, Quantity, Price)
                  VALUES (@Id, @OrderId, @Product, @Quantity, @Price)",
                item);
        }
    }
}
Why it's slow:
- Every order fires at least two commands (UPDATE, then possibly INSERT), plus one DELETE and one INSERT per item.
- With 500 orders and 10,000 items, that adds up to well over 10,000 round-trips to the database.
- Each round-trip pays network latency and command overhead, and nothing is batched.

Time: ~20s
public async Task SyncOrdersWithDapperPlusAsync(List<Order> orders)
{
    DapperPlusManager.Entity<Order>().Table("Orders");
    DapperPlusManager.Entity<OrderItem>().Table("OrderItems");

    using var transaction = _connection.BeginTransaction();

    foreach (var order in orders)
    {
        order.TotalAmount = order.Items.Sum(x => x.Price * x.Quantity);

        foreach (var item in order.Items)
        {
            item.OrderId = order.Id; // Make sure FK is set
        }
    }

    // Build all OrderItems (flattened)
    var allItems = orders.SelectMany(o => o.Items).ToList();

    // Build dummy OrderItems just to delete by OrderId
    var itemsToDelete = orders.Select(o => new OrderItem { OrderId = o.Id }).ToList();

    // Delete old OrderItems (must match key + FK mapping in config)
    await transaction.BulkDeleteAsync(itemsToDelete);

    // Insert new OrderItems
    await transaction.BulkInsertAsync(allItems);

    // Merge Orders
    await transaction.BulkMergeAsync(orders);

    transaction.Commit();
}
Why it's better:
- Instead of one command per row, Dapper Plus batches the rows into a handful of bulk operations.
- The delete, insert, and merge each run as a single set-based operation inside one transaction.
- BulkMergeAsync replaces the manual UPDATE-then-INSERT dance with a single upsert.

Time: ~1.5s
Dapper Plus leverages:
- Provider-level bulk mechanisms (such as SqlBulkCopy on SQL Server) instead of row-by-row commands
- Batched statements that cut round-trips from thousands to a handful
- Your existing Dapper connection and entity classes, with no extra mapping layer

Dapper Plus is not just a faster Dapper - it's a higher-level abstraction that makes bulk workflows:
- Simpler to write and read
- Faster by an order of magnitude or more
- Safer, thanks to transactional, set-based operations
Sometimes you don’t want to update the entire row - just one or two fields (like status flags or timestamps). Instead of updating all columns, you can tell Dapper Plus exactly which ones to include, reducing locking and improving write efficiency.
await connection.BulkUpdateAsync(orders, opts => opts.ColumnInputExpression = x => new { x.Status });
When inserting or updating large numbers of rows, you may want to limit how many records are sent per round-trip to the database. This avoids timeouts or memory pressure in constrained environments.
await connection.BulkInsertAsync(orders, opts => opts.BatchSize = 500);
Need to track which entities are being processed? Dapper Plus gives you hooks like BeforeBulkAction and AfterBulkAction, so you can log or modify behavior dynamically.
DapperPlusManager.Entity<Order>().BeforeBulkAction = (e, t) => Console.WriteLine($"{t} - {((Order)e).Id}");
If your table doesn’t use a single primary key but rather a combination of fields (e.g., TenantId + Code), Dapper Plus lets you configure composite keys so it knows how to identify rows during merge and delete operations.
DapperPlusManager.Entity<YourType>().Key(x => new { x.TenantId, x.Code });
All Dapper Plus methods - BulkInsertAsync, BulkUpdateAsync, BulkDeleteAsync, BulkMergeAsync - are fully asynchronous, letting you integrate smoothly with modern .NET async workflows for high scalability.
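As a sketch of what that looks like in practice (this assumes an open connection plus the entity mappings and the `itemsToDelete`/`allItems` collections from the sync example above; the rollback-on-failure shape is my addition, not something the library prescribes):

```csharp
// Hypothetical async sync flow: every bulk call is awaited, and
// the transaction is rolled back if any step fails, so the sync
// is all-or-nothing.
using var transaction = connection.BeginTransaction();
try
{
    await transaction.BulkDeleteAsync(itemsToDelete);
    await transaction.BulkInsertAsync(allItems);
    await transaction.BulkMergeAsync(orders);
    transaction.Commit();
}
catch
{
    transaction.Rollback();
    throw;
}
```

Wrapping the three bulk calls in one transaction means a failed hourly sync leaves the previous hour's data intact.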
Bulk data operations are a reality in most serious applications - from importing data and syncing systems to cleaning up large datasets.
While Dapper gives you raw performance and control, it quickly becomes inefficient and verbose when you move beyond a few hundred records.
That's where Dapper Plus shines. It keeps everything familiar and low-friction for Dapper users, but gives you:
- Bulk insert, update, delete, and merge in a few lines of code
- Dramatically fewer database round-trips
- Fine-grained control through options like column selection, batch size, and composite keys
If your application handles thousands of inserts, updates, or deletes - Dapper Plus will save you time, headaches, and infrastructure costs.
That's all from me today.
P.S. Follow me on YouTube.
Stop arguing about code style. In this course you get a production-proven setup with analyzers, CI quality gates, and architecture tests — the exact system I use in real projects. Join here.
Not sure yet? Grab the free Starter Kit — a drop-in setup with the essentials from Module 01.
Design Patterns that Deliver — Solve real problems with 5 battle-tested patterns (Builder, Decorator, Strategy, Adapter, Mediator) using practical, real-world examples. Trusted by 650+ developers.
Just getting started? Design Patterns Simplified covers 10 essential patterns in a beginner-friendly, 30-page guide for just $9.95.
Every Monday morning, I share 1 actionable tip on C#, .NET & Architecture that you can use right away. Join here.
Subscribe to TheCodeMan.net and join 20,000+ subscribers gaining actionable tips on C#, Software Architecture & Best Practices to enhance your .NET expertise.