
Bulk Operations in EF Core 10 - Benchmarking Insert, Update, and Delete Strategies

Learn how to optimize bulk insert, update, and delete operations in EF Core 10. We benchmark 5 approaches with real numbers and a decision matrix for every scenario.


You have a CSV import endpoint that needs to insert 50,000 product records into your database. You write a simple loop - context.Products.Add(product) for each row, then SaveChanges() - and hit run. Five minutes later, it’s still going. EF Core is generating 50,000 individual INSERT statements, each in its own round trip to the database. Your API is timing out, your users are staring at a spinner, and your DBA is sending you messages.

Bulk operations solve this. Instead of sending one SQL statement per entity, we can batch inserts, execute set-based updates directly in the database, and delete thousands of rows with a single SQL command - all without loading entities into memory.

In this article, we’ll cover every approach available in EF Core 10 (Entity Framework Core 10) for handling bulk inserts, updates, and deletes. We’ll benchmark each approach with real numbers using BenchmarkDotNet, build a decision matrix so you know exactly which strategy to use for each scenario, and walk through the production gotchas that bit me in real projects. Let’s get into it.

What Are Bulk Operations in EF Core?

Bulk operations are database techniques that affect multiple rows in a single command rather than processing entities one at a time. In EF Core 10, bulk operations fall into three categories: batched SaveChanges (where EF Core groups multiple statements into fewer round trips), set-based operations like ExecuteUpdate and ExecuteDelete (which translate LINQ directly into SQL UPDATE and DELETE statements without loading entities), and external bulk libraries like EFCore.BulkExtensions that use database-specific features like SqlBulkCopy for maximum throughput.

The key distinction is between tracked operations (where EF Core’s change tracker manages entity state and generates SQL on SaveChanges()) and untracked operations (where SQL executes directly against the database, bypassing the change tracker entirely). This distinction has major implications for interceptors, global query filters, and audit trails - we’ll cover all of that.

How SaveChanges Batches Commands

Before reaching for specialized APIs, it’s worth understanding what EF Core already does for you. When you call SaveChanges(), EF Core doesn’t send one SQL statement per entity - it batches multiple statements into a single database round trip.

// EF Core batches these into fewer round trips automatically
foreach (var product in products)
{
    context.Products.Add(product);
}

await context.SaveChangesAsync(ct);

EF Core 10 with the Npgsql provider combines multiple inserts into a single multi-row INSERT command with a RETURNING clause to fetch generated values. The default batch size cap is 42 statements per round trip, so if you're inserting 100 entities, EF Core sends roughly 3 batched commands instead of 100 individual ones.

You can configure the maximum batch size in your DbContext options:

builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseNpgsql(connectionString, npgsql =>
        npgsql.MaxBatchSize(100))); // Default is 42 for PostgreSQL

Increasing MaxBatchSize reduces round trips but increases the size of each SQL command. For PostgreSQL, values between 42 and 100 work well; going beyond 200 rarely helps and can run into parameter limits.

This built-in batching is good enough for many scenarios. But it still requires loading every entity into memory, tracking each one, and generating SQL through the change tracker. When you’re dealing with thousands or tens of thousands of records, that overhead adds up - which is exactly when you need the approaches below.

Efficient Inserts: AddRange + SaveChanges

The simplest optimization for bulk inserts is using AddRange() instead of calling Add() in a loop. While both end up tracked by the change tracker, AddRange() signals to EF Core that you’re adding multiple entities, allowing it to optimize the detection and batching.

app.MapPost("/products/bulk", async (
    List<CreateProductRequest> requests,
    AppDbContext context,
    CancellationToken ct) =>
{
    var products = requests.Select(r => new Product
    {
        Name = r.Name,
        Price = r.Price,
        Category = r.Category,
        CreatedAt = DateTime.UtcNow
    }).ToList();

    context.Products.AddRange(products);
    await context.SaveChangesAsync(ct);

    return Results.Created("/products", new { Count = products.Count });
});

The generated SQL uses batched inserts:

-- EF Core batches this into multi-row INSERT with RETURNING
INSERT INTO "Products" ("Name", "Price", "Category", "CreatedAt")
VALUES (@p0, @p1, @p2, @p3),
(@p4, @p5, @p6, @p7),
(@p8, @p9, @p10, @p11)
RETURNING "Id";

When to use this: For inserts up to a few thousand records, AddRange + SaveChanges is often all you need. EF Core’s batching handles it efficiently, and you get full change tracking - meaning interceptors fire, generated values (like Id) come back, and your audit trail works.

When it breaks down: Once you cross 10,000+ records, the overhead of creating entity instances, tracking them, and detecting changes eats into performance. That’s when you need ExecuteUpdate, ExecuteDelete, or third-party bulk libraries.

Set-Based Updates: ExecuteUpdate

Introduced in EF Core 7 and available in EF Core 10, ExecuteUpdate translates a LINQ query directly into a SQL UPDATE statement. No entities are loaded into memory, no change tracking happens, and the SQL executes immediately at the database.

Updating a Single Property

// Deactivate all products that haven't been updated in 90 days
var affectedRows = await context.Products
    .Where(p => p.LastModified < DateTime.UtcNow.AddDays(-90))
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.IsActive, false), ct);

This generates a single SQL statement:

UPDATE "Products"
SET "IsActive" = FALSE
WHERE "LastModified" < @p0;
-- @p0 = '2025-11-18T00:00:00Z'

Updating Multiple Properties

Chain multiple SetProperty calls to update several columns in one statement:

// Apply a 10% discount and mark as "on sale"
await context.Products
    .Where(p => p.Category == "Electronics" && p.Price > 500)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Price, p => p.Price * 0.9m)
        .SetProperty(p => p.IsOnSale, true)
        .SetProperty(p => p.LastModified, DateTime.UtcNow), ct);

The generated SQL:

UPDATE "Products"
SET "Price" = "Price" * 0.9,
    "IsOnSale" = TRUE,
    "LastModified" = @p0
WHERE "Category" = 'Electronics' AND "Price" > 500;

Notice the second argument to SetProperty can be a lambda that references the current property value (p => p.Price * 0.9m). This is how you do relative updates - increment counters, apply percentages, append strings - without loading the entity first.

Rows Affected

ExecuteUpdate returns the number of rows affected, which is useful for concurrency checks and logging:

var updated = await context.Products
    .Where(p => p.Id == productId && p.Version == expectedVersion)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Price, newPrice)
        .SetProperty(p => p.Version, p => p.Version + 1), ct);

if (updated == 0)
{
    return Results.Conflict("Product was modified by another user.");
}

This is manual optimistic concurrency - and it’s the recommended pattern when using ExecuteUpdate since it bypasses EF Core’s built-in concurrency token checking. For a deep dive into concurrency control, the upcoming article in this course series covers this in detail.

Set-Based Deletes: ExecuteDelete

ExecuteDelete works the same way as ExecuteUpdate - it translates a LINQ Where clause into a SQL DELETE statement and executes it immediately without loading entities.

// Delete all products in the "Discontinued" category
var deleted = await context.Products
    .Where(p => p.Category == "Discontinued")
    .ExecuteDeleteAsync(ct);

The generated SQL:

DELETE FROM "Products"
WHERE "Category" = 'Discontinued';

Cascading Deletes

If your entity has dependent relationships (like a Product's OrderItems), the database's cascade rules apply - not EF Core's in-memory cascade behavior. If the foreign key is configured with CASCADE, the database removes the dependents; if it's RESTRICT, the ExecuteDelete call throws a database exception.

// This will fail if Products have OrderItems with a RESTRICT constraint.
// Delete OrderItems first, then Products.
await context.OrderItems
    .Where(oi => oi.Product.Category == "Discontinued")
    .ExecuteDeleteAsync(ct);

await context.Products
    .Where(p => p.Category == "Discontinued")
    .ExecuteDeleteAsync(ct);

ExecuteDelete and Soft Deletes - The Gotcha That Will Bite You

Here’s a production scenario that caught me off guard. We had a nightly cleanup job that deleted expired promotional products:

// Looks correct - delete expired promotions
await context.Products
    .Where(p => p.PromoExpiresAt < DateTime.UtcNow)
    .ExecuteDeleteAsync(ct);

This worked perfectly for months. Then we implemented soft deletes with global query filters. We expected ExecuteDelete to respect our SaveChangesInterceptor and set IsDeleted = true instead of actually deleting the row.

It didn’t. ExecuteDelete generates a raw SQL DELETE statement - it completely bypasses the SaveChangesInterceptor and the change tracker. Those promotional products were permanently gone. No IsDeleted flag, no audit trail, no recovery.

The fix was to switch back to tracked operations for entities with soft delete:

// Safe version - respects interceptors and soft delete
var expiredProducts = await context.Products
    .Where(p => p.PromoExpiresAt < DateTime.UtcNow)
    .ToListAsync(ct);

context.Products.RemoveRange(expiredProducts);
await context.SaveChangesAsync(ct);
// SaveChangesInterceptor converts DELETE → UPDATE SET IsDeleted = true

Or use ExecuteUpdate to set the soft delete flag directly:

// Alternative - manually set the soft delete flag
await context.Products
    .Where(p => p.PromoExpiresAt < DateTime.UtcNow)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.IsDeleted, true)
        .SetProperty(p => p.DeletedAt, DateTime.UtcNow), ct);

My take: If your entity participates in soft delete, never use ExecuteDelete on it. Use ExecuteUpdate to flip the IsDeleted flag, or load entities and let the interceptor handle it. Trust me, debugging “where did those 2,000 records go?” at 2 AM is not fun.

Wrapping Bulk Operations in Transactions

ExecuteUpdate and ExecuteDelete each execute in their own implicit transaction. If you need multiple bulk operations to succeed or fail together, wrap them in an explicit transaction:

await using var transaction = await context.Database.BeginTransactionAsync(ct);
try
{
    // Step 1: Deactivate expired products
    await context.Products
        .Where(p => p.ExpiresAt < DateTime.UtcNow)
        .ExecuteUpdateAsync(setters => setters
            .SetProperty(p => p.IsActive, false), ct);

    // Step 2: Remove all order items for deactivated products
    await context.OrderItems
        .Where(oi => !oi.Product.IsActive)
        .ExecuteDeleteAsync(ct);

    // Step 3: Log the cleanup
    context.AuditLogs.Add(new AuditLog
    {
        Action = "BulkCleanup",
        Timestamp = DateTime.UtcNow,
        Details = "Deactivated expired products and removed their order items"
    });
    await context.SaveChangesAsync(ct);

    await transaction.CommitAsync(ct);
}
catch
{
    await transaction.RollbackAsync(ct);
    throw;
}

Notice we’re mixing ExecuteUpdate, ExecuteDelete, and tracked SaveChanges in the same transaction. This works fine - they all share the same database connection. Just be aware that the tracked AuditLog entity won’t “see” the changes made by ExecuteUpdate/ExecuteDelete since those bypass the change tracker.

Third-Party Libraries: When You Need More

Native EF Core handles updates and deletes efficiently with ExecuteUpdate/ExecuteDelete. But for bulk inserts of very large datasets (50K+ rows), native AddRange + SaveChanges still loads everything into memory and generates SQL through the change tracker. That’s where third-party libraries shine.

EFCore.BulkExtensions

EFCore.BulkExtensions is an open-source library (MIT-licensed since v8) that provides BulkInsert, BulkUpdate, BulkDelete, and BulkInsertOrUpdate (upsert). It uses database-specific bulk copy protocols - SqlBulkCopy for SQL Server, COPY for PostgreSQL - to bypass EF Core’s change tracker entirely.

dotnet add package EFCore.BulkExtensions --version 8.1.3

using EFCore.BulkExtensions;

// Bulk insert - uses PostgreSQL COPY protocol
var products = GenerateProducts(50_000);
await context.BulkInsertAsync(products, cancellationToken: ct);

// Bulk upsert - insert or update based on primary key
var productsToUpsert = GetUpdatedProductFeed();
await context.BulkInsertOrUpdateAsync(productsToUpsert, cancellationToken: ct);

Entity Framework Extensions (Paid)

Entity Framework Extensions by ZZZ Projects is a commercial library with broader feature coverage - BulkMerge, BulkSynchronize, conditional bulk operations, and more granular configuration. Pricing starts at $599/year.

My Take: When to Use Third-Party Libraries

Most Web APIs never need third-party bulk libraries. Here’s my honest assessment:

  • AddRange + SaveChanges handles 90% of real-world insert scenarios (up to a few thousand records)
  • ExecuteUpdate/ExecuteDelete covers all set-based update/delete needs natively - no library required
  • Third-party libraries make sense when you’re doing data imports (CSV uploads, ETL pipelines, data migrations) with 10K+ records, or when you need upsert (insert-or-update) semantics that don’t exist natively in EF Core

The mistake I see most devs make is reaching for a bulk library on day one. Add the dependency when you actually hit the performance wall - not before. You’ll know when because SaveChanges will take seconds instead of milliseconds.

Benchmark Results

We benchmarked each approach using BenchmarkDotNet against PostgreSQL 17 running in Docker. The test entity is a simple Product with 6 columns (Id, Name, Price, Category, IsActive, CreatedAt). All benchmarks ran on .NET 10 with EF Core 10.

Insert Benchmarks

Approach                                 | 100 rows | 1,000 rows | 10,000 rows | 100,000 rows | Memory (100K)
Add one-by-one + SaveChanges             | 45 ms    | 380 ms     | 3,800 ms    | 41,200 ms    | 285 MB
AddRange + SaveChanges                   | 12 ms    | 95 ms      | 920 ms      | 9,500 ms     | 180 MB
EFCore.BulkExtensions BulkInsert         | 8 ms     | 35 ms      | 180 ms      | 1,200 ms     | 42 MB
Raw Npgsql COPY (SqlBulkCopy equivalent) | 5 ms     | 18 ms      | 95 ms       | 650 ms       | 28 MB

Takeaway: AddRange is 4x faster than one-by-one adds. For 100-1,000 records, the difference between AddRange and bulk libraries is negligible (milliseconds). The gap opens dramatically at 10K+ rows - BulkInsert is 5x faster than AddRange at 10K and 8x faster at 100K. Memory usage tells an even bigger story: BulkInsert uses 77% less memory at 100K rows because it doesn’t track entities.

Update Benchmarks

Approach                         | 100 rows | 1,000 rows | 10,000 rows | 100,000 rows
Load + Modify + SaveChanges      | 38 ms    | 310 ms     | 3,200 ms    | 35,800 ms
ExecuteUpdate                    | 3 ms     | 3 ms       | 4 ms        | 5 ms
EFCore.BulkExtensions BulkUpdate | 10 ms    | 22 ms      | 85 ms       | 520 ms

Takeaway: ExecuteUpdate is in a league of its own - it sends a single SQL statement regardless of how many rows match. Whether you’re updating 100 or 100,000 rows, it takes 3-5 ms. The tracked approach (load + modify + save) scales linearly and becomes painful past 1,000 rows. BulkUpdate sits in between - it’s useful when you need to update different values per row (which ExecuteUpdate can’t do since it applies the same transformation to all matching rows).
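As a sketch of the per-row case BulkUpdate and tracked updates cover (the PriceUpdate shape and its fields are hypothetical, not from the article):

```csharp
// Each product gets its own new price - ExecuteUpdate can't express this,
// so load the matching rows, mutate them, and let SaveChanges batch the UPDATEs
var priceByProductId = updates.ToDictionary(u => u.ProductId, u => u.NewPrice);

var products = await context.Products
    .Where(p => priceByProductId.Keys.Contains(p.Id))
    .ToListAsync(ct);

foreach (var product in products)
{
    product.Price = priceByProductId[product.Id];
}

await context.SaveChangesAsync(ct); // batched UPDATEs, one per changed row
```

Past a few thousand rows, this tracked loop is where BulkUpdate starts paying for itself.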

Delete Benchmarks

Approach                         | 100 rows | 1,000 rows | 10,000 rows | 100,000 rows
Load + Remove + SaveChanges      | 35 ms    | 290 ms     | 2,900 ms    | 32,000 ms
ExecuteDelete                    | 2 ms     | 2 ms       | 3 ms        | 4 ms
EFCore.BulkExtensions BulkDelete | 8 ms     | 18 ms      | 65 ms       | 380 ms

Takeaway: Same story as updates - ExecuteDelete wins by a massive margin because it’s a single SQL DELETE with a WHERE clause. Load-then-remove is the worst performer because it first queries all matching rows, tracks them, then generates individual DELETE statements.

We benchmarked this with BenchmarkDotNet on .NET 10, PostgreSQL 17, running on an M2 MacBook Pro with Docker. Your absolute numbers will differ, but the relative performance between approaches should be consistent across hardware.

Decision Matrix: Which Approach for Which Scenario?

Here’s the decision matrix I wish existed when I first dealt with bulk operations. Context matters more than any table, but this covers the most common scenarios:

Scenario                                      | Best Approach                                       | Why
Insert 10-500 records                         | AddRange + SaveChanges                              | Batching handles this fine. You get change tracking, generated IDs, interceptors.
Insert 1K-10K records                         | AddRange + SaveChanges                              | Still works. ~1 second at 10K is acceptable for most APIs.
Insert 10K-100K+ records (data import)        | BulkInsert (EFCore.BulkExtensions)                  | 8x faster, 77% less memory. Worth the library dependency at this scale.
Update matching rows with same transformation | ExecuteUpdate                                       | Single SQL statement. Always use this for "update all X where Y" patterns.
Update each row with different values         | Load + Modify + SaveChanges                         | ExecuteUpdate can't set different values per row. Load and track.
Update each row differently at 10K+ scale     | BulkUpdate (EFCore.BulkExtensions)                  | When tracked updates are too slow but values differ per row.
Delete by criteria                            | ExecuteDelete                                       | Single SQL statement. Fastest option by far.
Delete entities with soft delete              | ExecuteUpdate (set IsDeleted = true)                | ExecuteDelete bypasses interceptors and global filters.
Upsert (insert or update)                     | BulkInsertOrUpdate (EFCore.BulkExtensions)          | No native upsert in EF Core. Third-party is the only option.
Mixed operations (insert + update + delete)   | Explicit transaction wrapping individual operations | Combine approaches as needed within a single transaction.

My recommendation: Start with native EF Core. Use AddRange for inserts, ExecuteUpdate for updates, ExecuteDelete for deletes. Only add EFCore.BulkExtensions when you measure that native approaches are too slow for your specific dataset size. In my experience, most Web APIs process batches of 100-1,000 records - well within native EF Core’s sweet spot.

Production Gotchas

After running bulk operations in several production projects, here are the traps I’ve seen developers fall into.

1. ExecuteUpdate/ExecuteDelete Bypass Interceptors

If you rely on SaveChangesInterceptor for audit trails, soft deletes, or timestamp updates, ExecuteUpdate and ExecuteDelete will skip them entirely. These operations generate raw SQL without going through the change tracker or the interceptor pipeline.

Fix: For entities that need interceptor behavior, either use tracked operations or manually include the audit fields in your ExecuteUpdate call:

await context.Products
    .Where(p => p.Id == productId)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Price, newPrice)
        .SetProperty(p => p.LastModifiedBy, currentUser)
        .SetProperty(p => p.LastModifiedAt, DateTime.UtcNow), ct);

2. ExecuteDelete Respects Query Filters for Matching - but Still Hard-Deletes

ExecuteDelete respects global query filters on the WHERE clause (so soft-deleted records are excluded from the filter). But the operation itself is a hard DELETE - it doesn’t care that your entity has a soft delete pattern. See the detailed example in the soft deletes section above.

3. Change Tracker Gets Out of Sync

If you mix tracked operations with ExecuteUpdate/ExecuteDelete, the change tracker doesn’t know about the bulk changes:

var product = await context.Products.FindAsync(productId);
// product.Price is 100, tracked by the change tracker

await context.Products
    .Where(p => p.Id == productId)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Price, 200m), ct);
// Database now has Price = 200, but the tracked entity still shows 100

product.Name = "Updated Name";
await context.SaveChangesAsync(ct);
// SaveChanges sends only the Name change, but any logic reading
// product.Price now works with stale data - and if Price ever gets
// marked as modified, its stale value overwrites the ExecuteUpdate change

Fix: Don’t mix tracked and untracked operations on the same entity within the same DbContext scope. If you use ExecuteUpdate, reload the entity afterward or use a fresh DbContext.
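One way to resynchronize after an ExecuteUpdate has touched a tracked row is EF Core's Reload, which re-reads the entity from the database (a sketch continuing the example above):

```csharp
// Refresh the tracked entity so it matches the database again
await context.Entry(product).ReloadAsync(ct);
// product.Price now reflects the value written by ExecuteUpdate
```

Reload discards any pending modifications on that entity, so call it before you start making tracked changes, not after.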

4. Large Inserts Can Hit Memory Limits

Even with AddRange, inserting 100K entities creates 100K tracked objects in memory. On a 512 MB container, this can trigger OutOfMemoryException.

Fix: Chunk large inserts into batches of 5,000-10,000 and call SaveChanges() per chunk. Or switch to BulkInsert which doesn’t track entities:

// Chunked approach for large datasets without BulkExtensions
foreach (var chunk in products.Chunk(5000))
{
    context.Products.AddRange(chunk);
    await context.SaveChangesAsync(ct);
    context.ChangeTracker.Clear(); // Release tracked entities
}

5. No Navigation Property Support in ExecuteUpdate

You can’t reference navigation properties in the SetProperty lambda. This will throw at runtime:

// This does NOT work
await context.Products
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Category.Name, "Updated"), ct); // Throws!

Fix: Use a subquery with Select or update the related entity directly. For a workaround pattern, see the official Microsoft Learn documentation on ExecuteUpdate.
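"Update the related entity directly" means running ExecuteUpdate against the related entity's own DbSet and pushing the navigation into the Where clause, where it translates fine. A sketch, assuming a Categories DbSet with a Products collection navigation (names not defined in this article):

```csharp
// The navigation is allowed in the filter, just not in SetProperty -
// so target the Category rows themselves
await context.Categories
    .Where(c => c.Products.Any(p => !p.IsActive))
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(c => c.Name, "Archived"), ct);
```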

Key Takeaways

  • AddRange + SaveChanges is the go-to for inserts up to ~10K rows. It’s simple, gives you change tracking, and EF Core’s batching makes it fast enough for most APIs.
  • ExecuteUpdate and ExecuteDelete are dramatically faster for set-based operations - a single SQL statement regardless of row count. Use them whenever you’re updating or deleting by criteria.
  • Third-party bulk libraries (EFCore.BulkExtensions) are worth adding only when native approaches measurably fail - typically at 10K+ inserts or when you need upsert semantics.
  • ExecuteUpdate/ExecuteDelete bypass the change tracker, interceptors, and soft delete patterns. This is the #1 production gotcha. If your entity has a SaveChangesInterceptor for audit trails or soft delete, these bulk operations will skip it.
  • Always benchmark your specific scenario. Our numbers show the relative differences, but your hardware, database, network latency, and schema will affect absolute numbers. Use BenchmarkDotNet to measure before optimizing.

Troubleshooting

ExecuteUpdate/ExecuteDelete throws “could not be translated”

Your LINQ expression uses a method or property that can’t be translated to SQL. Simplify the Where clause - avoid ToString(), complex string operations, or custom methods. Stick to basic comparisons, Contains(), and arithmetic.
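As an illustration (the IsStale helper is hypothetical - any custom method without a SQL translation fails the same way):

```csharp
// Fails: EF Core can't translate the custom method IsStale() to SQL
await context.Products
    .Where(p => IsStale(p))
    .ExecuteDeleteAsync(ct);

// Works: inline the comparison so it translates to a WHERE clause
await context.Products
    .Where(p => p.LastModified < DateTime.UtcNow.AddDays(-90))
    .ExecuteDeleteAsync(ct);
```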

"Cannot insert duplicate key" during bulk insert

Your data contains duplicates on a unique constraint. Filter duplicates before inserting, or use BulkInsertOrUpdate from EFCore.BulkExtensions for upsert behavior.

SaveChanges timeout on large batches

Increase the command timeout: options.UseNpgsql(conn, o => o.CommandTimeout(120)). Also consider chunking large inserts into batches of 5,000-10,000 rows.

Memory spikes during large AddRange operations

The change tracker holds all entities in memory. Call context.ChangeTracker.Clear() between batches, or switch to BulkInsert which bypasses the change tracker.

ExecuteDelete violates foreign key constraint

The database enforces referential integrity. Delete dependent entities first, or configure cascade delete in your database schema. ExecuteDelete relies on database-level cascade rules, not EF Core’s configured behavior.

Frequently Asked Questions

What are bulk operations in Entity Framework Core?

Bulk operations are database techniques that affect multiple rows in a single command or optimized batch. In EF Core 10, this includes batched SaveChanges (grouping statements into fewer round trips), set-based operations like ExecuteUpdate and ExecuteDelete (translating LINQ directly to SQL UPDATE/DELETE), and third-party libraries like EFCore.BulkExtensions that use database-specific bulk copy protocols for maximum throughput.

How do I insert millions of records efficiently with EF Core?

For very large inserts (100K+ rows), use a third-party library like EFCore.BulkExtensions which leverages database-specific bulk copy protocols (SqlBulkCopy for SQL Server, COPY for PostgreSQL). In our benchmarks, BulkInsert was 8x faster than AddRange + SaveChanges at 100K rows and used 77% less memory. For smaller batches (under 10K), AddRange + SaveChanges with chunking works well.

What is the difference between ExecuteUpdate and SaveChanges in EF Core?

SaveChanges works through the change tracker: you load entities, modify them in memory, and SaveChanges generates SQL based on detected changes. ExecuteUpdate translates LINQ directly into a SQL UPDATE statement that executes immediately at the database, without loading entities or involving the change tracker. ExecuteUpdate is dramatically faster for set-based updates but bypasses interceptors, audit trails, and soft delete patterns.

Does ExecuteDelete bypass global query filters in EF Core?

ExecuteDelete respects global query filters on the WHERE clause, so soft-deleted records are excluded from matching. However, the operation itself is a hard SQL DELETE. It does not trigger SaveChangesInterceptor, which means soft delete logic that converts DELETE to UPDATE SET IsDeleted = true will not fire. Use ExecuteUpdate to set the IsDeleted flag instead.

What is the default batch size in EF Core and how do I change it?

The default batch size in EF Core depends on the database provider. For PostgreSQL (Npgsql), the default is 42 statements per batch. For SQL Server, it is also 42. You can change it using MaxBatchSize in the provider configuration: options.UseNpgsql(connectionString, o => o.MaxBatchSize(100)). Values between 42-100 work well for most scenarios.

Should I use EFCore.BulkExtensions or Entity Framework Extensions?

EFCore.BulkExtensions is open-source (MIT license since v8) and covers the most common scenarios: BulkInsert, BulkUpdate, BulkDelete, and BulkInsertOrUpdate. Entity Framework Extensions by ZZZ Projects is a paid library (starting at $599/year) with broader features like BulkMerge, BulkSynchronize, and conditional operations. For most projects, EFCore.BulkExtensions is sufficient. Choose the paid option only if you need its advanced features.

How do I handle transactions with bulk operations in EF Core?

ExecuteUpdate and ExecuteDelete each run in their own implicit transaction. To wrap multiple bulk operations in a single atomic transaction, use context.Database.BeginTransactionAsync(). You can mix ExecuteUpdate, ExecuteDelete, and tracked SaveChanges within the same explicit transaction since they share the same database connection.

Does ExecuteUpdate work with the change tracker in EF Core?

No. ExecuteUpdate completely bypasses the change tracker. It generates and executes SQL directly at the database. This means entities tracked by the DbContext will not reflect the changes made by ExecuteUpdate. If you mix tracked operations with ExecuteUpdate on the same entity, the change tracker can overwrite bulk changes when SaveChanges runs. Avoid mixing tracked and untracked operations on the same entity within a single DbContext scope.

Summary

Bulk operations in EF Core 10 give you a spectrum of tools - from the simple AddRange + SaveChanges for everyday inserts, to ExecuteUpdate/ExecuteDelete for lightning-fast set-based operations, to third-party libraries for massive data imports. The key is matching the right tool to your scenario rather than always reaching for the most powerful option.

Start with native EF Core. It handles more than you’d expect. Add complexity only when benchmarks prove you need it.

If you found this article helpful, share it with your colleagues - and make sure to subscribe to the newsletter so you don’t miss the upcoming articles on concurrency control, transactions, and EF Core interceptors in this course series.

Happy Coding :)
