
We created a Web API for querying an Oracle database. The query returns a very large result set, and we cannot use pagination because the client application requires the entire result in one call. With a huge amount of data, the call throws an out-of-memory exception. As suggested in other forums, I tried adding the tags below to the web.config:

    <requestLimits maxAllowedContentLength="2147483647" />

placed inside <system.webServer><security><requestFiltering>, and

    <httpRuntime targetFramework="4.6.1" maxRequestLength="2147483647" executionTimeout="1600" requestLengthDiskThreshold="2147483647" />

placed inside <system.web>.

But after adding them, a call to the API kept loading forever (I waited for almost an hour) and then threw the same System.OutOfMemoryException. Can anyone please help me with this? This is my first Web API. We cannot create a stored procedure for this because we only have read access to the database.

    public HttpResponseMessage Getdetails([FromUri] string[] id)
    {
        string connStr = ConfigurationManager.ConnectionStrings["ProDataConnection"].ConnectionString;
        using (OracleConnection dbconn = new OracleConnection(connStr))
        {
            // Builds the IN list by concatenating the raw ids into the SQL text.
            var inconditions = id.Distinct().ToArray();
            var srtcon = string.Join(",", inconditions);
            var strQuery = @"SELECT * from STCD_PRIO_CATEGORY where STPR_STUDY.STD_REF IN(" + srtcon + ")";
            using (OracleCommand selectCommand = new OracleCommand(strQuery, dbconn))
            {
                using (OracleDataAdapter adapter = new OracleDataAdapter(selectCommand))
                {
                    // Loads the entire result set into a DataTable in memory.
                    DataTable selectResults = new DataTable();
                    adapter.Fill(selectResults);
                    var returnObject = new { data = selectResults };
                    var response = Request.CreateResponse(HttpStatusCode.OK, returnObject, MediaTypeHeaderValue.Parse("application/json"));
                    ContentDispositionHeaderValue contentDisposition = null;
                    if (ContentDispositionHeaderValue.TryParse("inline; filename=ProvantisStudyData.json", out contentDisposition))
                    {
                        response.Content.Headers.ContentDisposition = contentDisposition;
                    }
                    return response;
                }
            }
        }
    }
I even added breakpoints, and it is the adapter.Fill(selectResults) call that takes the longest. If the row count is around 45,000, the result is returned without any issue. The ID I tried that is supposed to return 600,000 rows is where we get the memory exception. Can anyone please help me with this?
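
For reference, a minimal sketch of a streaming alternative to adapter.Fill, assuming ODP.NET (Oracle.ManagedDataAccess.Client), Json.NET, and Web API's PushStreamContent are available; the method name GetdetailsStreamed and the bind-variable handling are illustrative, not part of the original code. It writes each row to the response as it is read, so the 600,000 rows never sit in memory at once:

    // Assumed usings: System.Net.Http, System.Web.Http, System.IO, System.Linq,
    // System.Configuration, Newtonsoft.Json, Oracle.ManagedDataAccess.Client.
    public HttpResponseMessage GetdetailsStreamed([FromUri] string[] id)
    {
        string connStr = ConfigurationManager.ConnectionStrings["ProDataConnection"].ConnectionString;

        // One bind variable per id (:id0, :id1, ...) instead of concatenated values.
        var ids = id.Distinct().ToArray();
        var placeholders = string.Join(",", ids.Select((x, i) => ":id" + i));
        var strQuery = "SELECT * FROM STCD_PRIO_CATEGORY WHERE STPR_STUDY.STD_REF IN (" + placeholders + ")";

        var response = Request.CreateResponse(HttpStatusCode.OK);
        response.Content = new PushStreamContent((outputStream, content, context) =>
        {
            // This callback runs when the framework starts sending the response body.
            using (var dbconn = new OracleConnection(connStr))
            using (var command = new OracleCommand(strQuery, dbconn))
            {
                command.BindByName = true; // ODP.NET binds by position by default
                for (int i = 0; i < ids.Length; i++)
                    command.Parameters.Add(new OracleParameter(":id" + i, ids[i]));

                dbconn.Open();
                using (var reader = command.ExecuteReader())
                using (var writer = new JsonTextWriter(new StreamWriter(outputStream)))
                {
                    // Emits { "data": [ {...}, {...}, ... ] } one row at a time.
                    writer.WriteStartObject();
                    writer.WritePropertyName("data");
                    writer.WriteStartArray();
                    while (reader.Read())
                    {
                        writer.WriteStartObject();
                        for (int col = 0; col < reader.FieldCount; col++)
                        {
                            writer.WritePropertyName(reader.GetName(col));
                            writer.WriteValue(reader.IsDBNull(col) ? null : reader.GetValue(col));
                        }
                        writer.WriteEndObject();
                    }
                    writer.WriteEndArray();
                    writer.WriteEndObject();
                } // disposing the writer closes outputStream and completes the response
            }
        }, "application/json");
        return response;
    }

Streaming avoids the single giant allocation, but the client still has to consume all 600,000 rows in one response, so the call will be long-running either way.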

user4912134
  • That tag is for the maximum HTTP request size; it is not related to the DB. – jonju Aug 03 '16 at 18:49
  • Please do yourself a favor and *use SQL parameters* instead of concatenating your SQL string together (a minimal sketch appears after these comments). – hometoast Aug 03 '16 at 18:51
  • @jonju doesn't that help? I am new to ASP.NET and this is my first application. – user4912134 Aug 03 '16 at 18:52
  • @hometoast sorry, I am not getting you. I am receiving those IDs as an array of input parameters. – user4912134 Aug 03 '16 at 18:54
  • You're leaving yourself open to SQL injection attacks: http://stackoverflow.com/questions/601300/what-is-sql-injection. It is slightly off topic because it's not your direct question, but it's vitally important that you address it as well. – hometoast Aug 03 '16 at 18:56
  • You are attempting to retrieve, process, serialize, and transmit over 2 GB of data. That is insane. "We cannot go for pagination as the client application requires the entire result in one call": that requirement is not realistic with this amount of data and indicates a fatal design flaw. – DVK Aug 03 '16 at 18:57
  • The tags you've added are not related to your problem. Those tags are for maximum URL length, maximum file upload size, etc. Besides, you should suggest that your client allow the data to be loaded partially, describing possible issues such as the one you are facing now. – jonju Aug 03 '16 at 18:58
  • @DVK how would I change this to pagination? I will try checking with them again. – user4912134 Aug 03 '16 at 18:59
  • @hometoast so shouldn't I be giving the IDs as input parameters?! – user4912134 Aug 03 '16 at 19:01
  • Keeping the huge amount of data in mind, you should also check out [Sql INDEX](http://www.w3schools.com/sql/sql_create_index.asp). Try not to use `Select *`; it has a huge impact on performance. Instead, select using column names (a bit boring, I know). – jonju Aug 03 '16 at 19:05
  • In the actual code I am giving all the column names directly. The number of columns is huge, so I just used `Select *` here. – user4912134 Aug 03 '16 at 19:10
  • I missed mentioning that what we are querying is an Oracle DB. – user4912134 Aug 03 '16 at 19:12
  • Use IEnumerable for the DB rows and write all the data to disk as soon as you receive it. Then do manual serialization piece by piece, like an external merge sort, and append the pieces. Finally, open a file stream to send back the JSON (see the spool-to-disk sketch after these comments). – Steve Aug 03 '16 at 20:38
  • @Steve do you have any samples? I understand using the IEnumerable, but I am not sure how we serialize piece by piece. I am very new to ASP.NET. – user4912134 Aug 03 '16 at 21:11
  • @user4912134 the idea of IEnumerable is to retrieve one element at a time without loading all of them into memory. The actual piece-by-piece serialization is done by your own implementation of a serialization and merge algorithm; it is purely algorithmic, nothing to do with ASP.NET. – Steve Aug 04 '16 at 13:22
  • @Steve should I be changing the entire code? – user4912134 Aug 04 '16 at 16:00
  • Pretty much. There is no built-in function for external serialization and merge, as far as I know. – Steve Aug 04 '16 at 16:07
  • @Steve are there any samples of this? I tried looking in other forums, but I am not sure about the serialization you mentioned. – user4912134 Aug 05 '16 at 16:32
  • @user4912134 No. It's very situational; I don't think you will find any example of this. Currently you are trying to build the entire object and serialize it in memory. What I'm suggesting is that you build the object on disk, similar to an external merge sort. If you really want an example, look up how external merge sort works; that's the concept behind it. – Steve Aug 05 '16 at 16:42
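
Regarding hometoast's point above, a minimal sketch of the same IN list built with bind variables, assuming ODP.NET; only the placeholder names end up in the SQL text, so the incoming values can no longer alter the statement:

    // Hypothetical rework of the query construction from the question.
    var ids = id.Distinct().ToArray();
    var placeholders = string.Join(",", ids.Select((x, i) => ":id" + i));
    var strQuery = "SELECT * FROM STCD_PRIO_CATEGORY WHERE STPR_STUDY.STD_REF IN (" + placeholders + ")";

    using (OracleCommand selectCommand = new OracleCommand(strQuery, dbconn))
    {
        selectCommand.BindByName = true; // ODP.NET binds by position by default
        for (int i = 0; i < ids.Length; i++)
            selectCommand.Parameters.Add(new OracleParameter(":id" + i, ids[i]));
        // ... fill or stream the results exactly as before.
    }

Note that Oracle limits an IN list to 1,000 entries (ORA-01795), which applies to bind placeholders as well.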

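And a rough sketch of the spool-to-disk idea Steve describes, under the same assumptions (ODP.NET, Json.NET) and with hypothetical names: rows are written one at a time to a temporary JSON file, and the response then streams that file back, so memory holds only one row plus file buffers:

    // Fragment for inside the controller action, after dbconn has been opened
    // and selectCommand set up with bind variables as sketched above.
    string tempPath = Path.GetTempFileName();
    using (var reader = selectCommand.ExecuteReader())
    using (var writer = new JsonTextWriter(new StreamWriter(tempPath)))
    {
        // Same row-by-row { "data": [...] } shape as the streaming sketch.
        writer.WriteStartObject();
        writer.WritePropertyName("data");
        writer.WriteStartArray();
        while (reader.Read())
        {
            writer.WriteStartObject();
            for (int col = 0; col < reader.FieldCount; col++)
            {
                writer.WritePropertyName(reader.GetName(col));
                writer.WriteValue(reader.IsDBNull(col) ? null : reader.GetValue(col));
            }
            writer.WriteEndObject();
        }
        writer.WriteEndArray();
        writer.WriteEndObject();
    }

    // DeleteOnClose removes the temp file once the response stream is disposed.
    var fileStream = new FileStream(tempPath, FileMode.Open, FileAccess.Read,
                                    FileShare.Read, 4096, FileOptions.DeleteOnClose);
    var response = Request.CreateResponse(HttpStatusCode.OK);
    response.Content = new StreamContent(fileStream);
    response.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");
    return response;

Compared with pushing straight to the response, the detour through disk costs extra I/O but lets the query and the transfer each run at their own speed, which is essentially the external-merge idea above minus the sorting.
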
0 Answers