
It seems as if the DbContext in Entity Framework keeps getting slower the more operations (add, delete, modify, query) you execute on it. Periodically calling SaveChanges() after a few operations doesn't fix the problem. The only workaround is to dispose the context and create a new one.

To give you an idea: a process that uses a single DbContext needs about 4 hours, whereas the same code with the workaround needs only about 45 minutes, so the difference is really significant! Is there a reason for this, or a switch I don't know about?

  • See [Why is inserting so slow](http://stackoverflow.com/questions/5943394/why-is-inserting-entities-in-ef-4-1-so-slow-compared-to-objectcontext/5943699#5943699) and [Fastest way of inserting](http://stackoverflow.com/questions/5940225/fastest-way-of-inserting-in-entity-framework/5942176#5942176) – Mark Oreta Feb 27 '13 at 01:16
  • Thanks Mark, but I need DetectChanges. And if I dispose and re-instantiate the context periodically, DetectChanges doesn't cause performance problems. I'm just wondering why SaveChanges() is no solution, since after calling it the context should be in the same starting position as a new context?! Or am I wrong? After that, the context should only have to detect changes on entities that were modified, added or removed after SaveChanges()... – Uli Armbruster Feb 27 '13 at 02:17
  • Are you using change tracking proxy POCOs? –  Feb 27 '13 at 03:57
  • Thanks for response. With the concept of Identity Map everything is clear. Since I need Change Tracking I have to recreate the context... – Uli Armbruster Apr 04 '13 at 10:27

2 Answers


Here's a useful link for performance considerations for EF5.

Are you using change tracking proxies? If not, you may be able to speed things up. From the link:

When a POCO entity does not have a change tracking proxy, changes are found by comparing the contents of your entities against a copy of a previous saved state. This deep comparison will become a lengthy process when you have many entities in your context, or when your entities have a very large amount of properties, even if none of them changed since the last comparison took place.
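To illustrate the difference: EF will only create a change-tracking proxy for an entity when the class is public and non-sealed and every mapped property is declared virtual. A hypothetical entity shaped for proxy creation might look like this (the names are illustrative, not from the question):

```csharp
// EF can wrap this class in a change-tracking proxy because every
// mapped property is virtual; changes are then reported immediately
// instead of being found by snapshot comparison in DetectChanges().
public class Order
{
    public virtual int Id { get; set; }
    public virtual string Customer { get; set; }
    public virtual decimal Total { get; set; }
    public virtual ICollection<OrderLine> Lines { get; set; }
}
```

If even one mapped property is non-virtual, EF falls back to snapshot change tracking for that entity.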

Otherwise you can set DbContextConfiguration.AutoDetectChangesEnabled = false as suggested in the comments and links. You can still explicitly call DetectChanges() after you have done some intensive DbSet/DbContext method calls that would normally auto call it.
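As a sketch of that approach (MyContext and the loop body are hypothetical; EF 5-style DbContext API):

```csharp
// Disable automatic change detection during a bulk operation, then
// detect changes once explicitly before saving. Note: with
// AutoDetectChangesEnabled = false, SaveChanges() no longer calls
// DetectChanges() for you, so the explicit call is required.
using (var context = new MyContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;
    try
    {
        foreach (var order in ordersToImport)
            context.Orders.Add(order); // no per-Add DetectChanges scan now

        context.ChangeTracker.DetectChanges(); // one explicit scan
        context.SaveChanges();
    }
    finally
    {
        context.Configuration.AutoDetectChangesEnabled = true;
    }
}
```

The try/finally matters if the context outlives the bulk operation; with a short-lived context like this it is mostly defensive.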

Could you also reduce the number of entities in your context? Perhaps use AsNoTracking() queries if you have some entities that do not need to be tracked by the ObjectStateManager.
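For read-only data, that could look like the following (query shape is illustrative):

```csharp
// AsNoTracking() keeps these entities out of the ObjectStateManager,
// so they never participate in DetectChanges() snapshot comparisons.
var recentOrders = context.Orders
                          .AsNoTracking()
                          .Where(o => o.Created > cutoff)
                          .ToList();
```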


It seems as if the DbContext in the EntityFramework keeps getting slower the more operations (add, delete, modify, querying) you execute on it.

It absolutely does: your DbContext (and the ObjectContext it's built on) keeps a reference to every entity it ever touches, loads, updates, or saves.

Quick Excerpt from an MSDN blog post:

The more you use an ObjectContext, generally the bigger it gets. This is because it holds a reference to all the Entities it has ever known about, essentially whatever you have queried, added or attached. So you should reconsider sharing the same ObjectContext indefinitely.

Since you said the process takes about 4 hours, I assume you have thousands of entities being modified. SaveChanges() doesn't dump the object graph; you're just creating more and more for it to track. Every time you make a change, it has to traverse the graph, so the bigger the graph, the longer it takes.

Is your workaround really bad code? Could there be a way to split your process so that each section creates and disposes of its own DbContext rather than sharing one?
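One way to split the work (all names here are illustrative, not from the question): process the items in fixed-size batches and give each batch its own short-lived context.

```csharp
// Each batch gets a fresh context, so the tracked object graph
// never grows beyond one batch's worth of entities.
const int batchSize = 500;
for (int i = 0; i < items.Count; i += batchSize)
{
    using (var context = new MyContext())
    {
        foreach (var item in items.Skip(i).Take(batchSize))
            ProcessItem(context, item); // hypothetical per-item work

        context.SaveChanges();
    } // context disposed here; the next batch starts with an empty graph
}
```

The batch size is a tuning knob: larger batches amortize context creation, smaller ones keep each DetectChanges() pass cheap.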

Mark Oreta
  • Like you said: I have to do bulk operations, and because of the Identity Map pattern everything is "cached". Since EF can recreate the context very quickly, it's fine for me to implement it this way. Of course, there's also a reasonable argument for using a new context for every single operation. – Uli Armbruster Apr 04 '13 at 11:07