In one of my application methods I'm dealing with a lot of entries that I have to save at the same time:
foreach (CategoryType category in apiCall.CategoryList)
{
    EBAY_CATEGORY itemToAdd = new EBAY_CATEGORY
    {
        CATEGORY_ID = int.Parse(category.CategoryID),
        CATEGORY_LEVEL = category.CategoryLevel,
        NAME = category.CategoryName,
        LEAF = category.LeafCategory,
        EXPIRED = category.Expired,
        VIRTUAL = category.Virtual
    };

    // Int32.Parse returns a non-nullable int (and throws on bad input),
    // so a null check on its result never fires; TryParse covers both cases.
    int parentId;
    if (int.TryParse(category.CategoryParentID[0], out parentId))
    {
        itemToAdd.PARENTID = parentId;
    }

    db.EBAY_CATEGORY.Add(itemToAdd); // Add already marks the entity as Added,
                                     // so setting EntityState.Added by hand is redundant.
    db.SaveChanges();
}
(Yup, I'm currently dealing with eBay.) By the time this loop starts, I have downloaded up to about 20,000 items into a collection of complex objects. As you can see, I use a foreach to make sure each entry is saved in my database, but since that is a lot of operations it also takes a lot of time. What would be the best way to deal with this collection of items?
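For context on the direction I'm exploring: a common pattern is to add everything first and commit once, instead of one SaveChanges per entity. A minimal sketch, assuming the same MyContext and EBAY_CATEGORY types as above, with a hypothetical MapToEntity helper standing in for the mapping shown earlier:

```csharp
// Sketch: batch the inserts instead of committing one entity at a time.
// MapToEntity is a hypothetical helper wrapping the object initializer above.
using (var db = new MyContext())
{
    // DetectChanges runs on every Add and scans all tracked entities,
    // so disabling it keeps a bulk Add from degrading quadratically.
    db.Configuration.AutoDetectChangesEnabled = false;

    foreach (CategoryType category in apiCall.CategoryList)
    {
        db.EBAY_CATEGORY.Add(MapToEntity(category));
    }

    // One SaveChanges: a single transaction instead of ~20,000 separate commits.
    db.SaveChanges();
}
```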
EDIT: So, following some very useful suggestions below, here's my new code!
using (TransactionScope scope = new TransactionScope())
{
    MyContext db = null;
    try
    {
        db = new MyContext();
        db.Database.Connection.Open();
        db.Configuration.AutoDetectChangesEnabled = false;
        db.Configuration.ValidateOnSaveEnabled = false;

        int count = 0;
        ApiContext context = GeteBayApiContext();
        GetCategoriesCall apiCall = new GetCategoriesCall(context)
        {
            EnableCompression = true,
            ViewAllNodes = true
        };
        apiCall.DetailLevelList.Add(DetailLevelCodeType.ReturnAll);
        apiCall.GetCategories();

        foreach (CategoryType category in apiCall.CategoryList)
        {
            EBAY_CATEGORY itemToAdd = new EBAY_CATEGORY
            {
                CATEGORY_ID = int.Parse(category.CategoryID),
                CATEGORY_LEVEL = category.CategoryLevel,
                NAME = category.CategoryName,
                LEAF = category.LeafCategory,
                EXPIRED = category.Expired,
                VIRTUAL = category.Virtual
            };

            // TryParse instead of Int32.Parse: the parse result can never be
            // null, and TryParse avoids an exception on a malformed ID.
            int parentId;
            if (int.TryParse(category.CategoryParentID[0], out parentId))
            {
                itemToAdd.PARENTID = parentId;
            }

            count++;
            db = AddToContext(db, itemToAdd, count, 2000, true);
        }
        db.SaveChanges();
    }
    finally
    {
        if (db != null)
        {
            db.Dispose();
        }
    }
    scope.Complete();
}
And the AddToContext method:
private static MyContext AddToContext(MyContext context, EBAY_CATEGORY itemToAdd,
    int count, int commitCount, bool recreateContext)
{
    context.Set<EBAY_CATEGORY>().Add(itemToAdd);
    if (count % commitCount == 0)
    {
        context.SaveChanges();
        if (recreateContext)
        {
            // Recreating the context discards the entities tracked so far,
            // so change tracking doesn't slow down as the import grows.
            context.Dispose();
            context = new MyContext();
            context.Configuration.AutoDetectChangesEnabled = false;
            context.Configuration.ValidateOnSaveEnabled = false;
        }
    }
    return context;
}
It works fine and is a lot faster, but each time SaveChanges
is called after a batch of data has been inserted, I get the following error:
The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
This occurs after a batch of 2,000 entries, or sometimes after a few batches of 100 entries. I don't understand what's wrong.
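One thing I'm looking into: that exact error message is often a sign that the TransactionScope timed out rather than anything being wrong with the inserts themselves. The parameterless TransactionScope constructor uses the default transaction timeout (one minute unless configured otherwise), and once it elapses the transaction completes (aborts) while the connection is still executing statements. A sketch of passing an explicit timeout, assuming the import from the EDIT above goes inside the scope:

```csharp
// Sketch: give the ambient transaction enough time for the whole import.
// Note: the effective timeout is still capped by the machine-wide
// maxTimeout setting in machine.config (10 minutes by default).
using (var scope = new TransactionScope(TransactionScopeOption.Required,
                                        TimeSpan.FromMinutes(10)))
{
    // ... perform the batched inserts here ...
    scope.Complete();
}
```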