
In our ASP.NET application we've implemented a data layer on top of MySQL, using Connector/NET as the Entity Framework provider.

This works perfectly well for most of our CRUD actions, but when we run an INSERT statement with several hundred records it throws a "Too many connections" exception.

I accept that we are making a lot of separate read/write calls (we query for duplicate entries prior to every insert, and an audit log is also written to with a separate query for every insert), but shouldn't the connection pool take care of this?

Also, in this similar thread the OP was told to use using statements, which I fully agree with, and in our case every query is already wrapped in a using statement, which I assume manages the connections in an efficient manner: i.e., not destroying them if another request has been queued.
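For clarity, each of our read/write calls follows a pattern roughly like this (a simplified sketch; `MyDbContext` and the entity/property names are illustrative stand-ins, not our actual code):

```csharp
// One short-lived context per operation; Dispose() should return the
// underlying pooled connection rather than destroy it.
using (var context = new MyDbContext())
{
    // Duplicate check before every insert.
    bool exists = context.Records.Any(r => r.Key == record.Key);
    if (!exists)
    {
        context.Records.Add(record);
        context.SaveChanges();
    }
} // connection goes back to the pool here
```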

And anyway, the connect timeout is currently 50, which is already pretty small, right? Moving it up to 500 made no difference, while moving it down to 10 gave me about 50% more inserts before the same exception was thrown.
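(One thing worth noting, in case it helps anyone else: as far as I understand it, Connect Timeout only controls how long a caller waits to obtain a connection before failing; the number of connections the pool is allowed to hold is governed by Max Pool Size, which defaults to 100 in Connector/NET. The two are set independently in the connection string; server, database, and credentials below are placeholders:)

```
Server=db.example.com;Database=mydb;Uid=appuser;Pwd=secret;
Pooling=true;Connect Timeout=50;Max Pool Size=100;
```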

So, given that every instance of the DbContext is wrapped in a using statement, why are we getting a "Too many connections" exception?

UPDATE: We found that the root of our problem was actually caused by one of our repositories opening a DB connection in its constructor, even though this connection was never used. Sorry guys, this turned out to be a red herring. Thanks for your suggestions.
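For anyone who hits the same thing, the bug looked roughly like this (heavily simplified; `RecordRepository` and `connectionString` are illustrative names):

```csharp
public class RecordRepository
{
    private readonly MySqlConnection _connection;

    public RecordRepository()
    {
        // Bug: a connection was opened on every construction but was
        // never used and never closed, so each repository instance
        // leaked one connection from the pool.
        _connection = new MySqlConnection(connectionString);
        _connection.Open();
    }

    // ... actual queries each created their own connection via the
    // DbContext, so this one was pure waste.
}
```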

awj
  • What's the value of max_connections? Try to read this post, maybe it can help you: http://www.mysqlperformanceblog.com/2010/03/23/too-many-connections-no-problem/ It's interesting this: "You would usually have one connection reserved for SUPER user, but that does not help if your application is connecting as a SUPER user (which is a bad idea anyway)." – ᗩИᎠЯƎᗩ Jul 30 '13 at 10:39
  • max_connections = 100. I don't want to increase this because I don't feel that's the answer. It's just pushing the problem a little farther down the road. – awj Jul 30 '13 at 13:05

2 Answers


Not exactly the answer you are looking for... but:

Is there a particular reason why you aren't using the same connection for all your actions?

As an example:

If you are performing 50 inserts in a row, and consequently 50 queries (one for each insert), using the same connection for all of them would greatly improve your system's performance.
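As a sketch of what I mean (`MyDbContext` and the entity names are placeholders; an EF-style API is assumed):

```csharp
// One context, and therefore one pooled connection, shared across the
// whole batch, instead of a new context per insert.
using (var context = new MyDbContext())
{
    foreach (var record in batch)
    {
        // Duplicate check and insert both reuse the same connection.
        if (!context.Records.Any(r => r.Key == record.Key))
        {
            context.Records.Add(record);
            context.SaveChanges(); // one save per insert, if the audit log needs it
        }
    }
}
```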

Sergio
  • Are you asking why we're not collecting 50 inserts, and then calling SaveChanges()? If so, we can't do this because of our audit log which requires the object with properties populated by the DB. To rewrite this would require a major rewrite of our data layer. I suppose it *is* possible, but yeah, not the remedy that I was looking for. – awj Jul 30 '13 at 08:35
  • Plus, we also need to check for duplicates before we know whether to include that entity or not. – awj Jul 30 '13 at 09:06
  • I just suggested this because, from what I took from your problem, you are wasting too many resources opening and closing connections to the database. Have you tried closing the connection inside the using block? – Sergio Jul 30 '13 at 13:14

What about the average execution time of your typical insert? Is there a chance that the connection for a new insert request (with all its around-insert subqueries) is acquired before the previous one has been released? Did you try to profile/log how your DAL insert functions perform under this heavy insert load? Sometimes performance (if that turns out to be the issue) can be greatly improved by moving part of the logic from "pure entity queries" into triggers for the audit logging (see How to program a MySQL trigger to insert row into another table?) and stored procedures for the duplicate check plus the insert itself, as you will have fewer round trips to the server and more control over your queries.
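For example, the audit-log write could be moved server-side with a trigger along these lines (table and column names here are illustrative, not taken from your schema):

```sql
-- Hypothetical schema: inserts into `records` are audited into `audit_log`.
DELIMITER //

CREATE TRIGGER records_audit
AFTER INSERT ON records
FOR EACH ROW
BEGIN
    -- Runs inside the same statement, so no extra round trip from the app.
    INSERT INTO audit_log (record_id, action, logged_at)
    VALUES (NEW.id, 'INSERT', NOW());
END//

DELIMITER ;
```

This removes one query per insert from the application side, since the audit row is written by the server as part of the insert itself.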

And, as a last option, you can try digging into the MySQL configuration and increasing the connection limit, which is 100 connections by default. But this idea typically just postpones the search for bottlenecks in your DAL.
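If you do want to inspect or raise the limit while investigating, the standard commands are:

```sql
-- Check the configured limit and how close the server has come to it.
SHOW VARIABLES LIKE 'max_connections';
SHOW STATUS LIKE 'Max_used_connections';

-- Raise the limit at runtime (also set max_connections in my.cnf so
-- the change survives a server restart).
SET GLOBAL max_connections = 200;
```

Comparing Max_used_connections against max_connections at least tells you whether you are genuinely exhausting the limit or leaking connections somewhere.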

Dmytro