
EDIT: Iterating over the reader is what takes the time; `ExecuteReader` itself returns in a few seconds. Is there a faster way than iterating row by row?

I am trying to fetch data from MySQL based on filters. Previously I was using EF, which was very slow, so I decided to switch to reading the data with `MySqlDataReader`. It works fine when the result contains a few records, but it is terribly slow on large results, say a million rows. I am using the following function to formulate the query.

    public string GetCorrespondingQuery()
    {
        if ((_zips.Count == 0) && (_states.Count == 0) && (_counties.Count == 0) && (_cities.Count == 0))
            throw new Exception("Alert! No Filters Selected.");

        string zips = "\"" + string.Join("\",\"", _zips) + "\"";
        string counties = "\"" + string.Join("\",\"", _counties) + "\"";
        string states = "\"" + string.Join("\",\"", _states) + "\"";
        string cities = "\"" + string.Join("\",\"", _cities) + "\"";

        // "where 1=1" lets every filter clause be appended with "and",
        // so the query stays valid even when the zip filter is empty
        // but another filter is present.
        string query = "select * from dbcustomer JOIN dbzipcode ON dbcustomer.ZIPCODE = dbzipcode.ZIP where 1=1";
        if (_zips.Count > 0)
            query += " and dbzipcode.ZIP in (" + zips + ")";
        if (_states.Count > 0)
            query += " and dbzipcode.STATE in (" + states + ")";
        if (_cities.Count > 0)
            query += " and dbzipcode.City in (" + cities + ")";
        if (_counties.Count > 0)
            query += " and dbzipcode.County in (" + counties + ")";

        return query + ";";
    }
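
As a side note, concatenating the filter values straight into the SQL string is open to SQL injection. A minimal sketch of a parameterized alternative (the `BuildInClause` helper and the parameter names are invented for illustration; it requires building the `MySqlCommand` together with the query text):

    // Hypothetical helper: expands a filter list into named parameters
    // (@zip0, @zip1, ...) so values never appear in the SQL text itself.
    private static string BuildInClause(MySqlCommand cmd, string column,
                                        string prefix, List<string> values)
    {
        var names = new List<string>();
        for (int i = 0; i < values.Count; i++)
        {
            string name = "@" + prefix + i;
            cmd.Parameters.AddWithValue(name, values[i]);
            names.Add(name);
        }
        return " and " + column + " in (" + string.Join(",", names) + ")";
    }

It would be used as `query += BuildInClause(cmd, "dbzipcode.ZIP", "zip", _zips);` in place of each concatenated clause.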

The query above takes only a few seconds when executed in MySQL Workbench, but several minutes from C#.

The following code fetches the data from the MySQL database.

    public List<CustomerDTO> FetchCustomersUsingQuery(CustomersFilter filter)
    {
        string query = filter.GetCorrespondingQuery();
        List<CustomerDTO> customers = new List<CustomerDTO>();
        using (MySqlConnection con = new MySqlConnection(_connectionString))
        using (MySqlCommand cmd = new MySqlCommand(query, con))
        {
            cmd.CommandTimeout = 0; // no timeout
            cmd.CommandType = CommandType.Text;

            con.Open();
            using (MySqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Every row is materialized into a DTO and buffered in memory.
                    var customer = new CustomerDTO()
                    {
                        PHONE = reader.GetString(0),
                        FIRSTNAME = reader.GetString(1),
                        LASTNAME = reader.GetString(2),
                        ADDRESS = reader.GetString(3),
                        CStatus = reader.GetString(4),
                        Campaign = reader.GetString(5),
                        ListName = reader.GetString(6),
                        Email = reader.GetString(7),
                        Recording = reader.GetBoolean(8),
                        ZIP = reader.GetString(9),
                        CITY = reader.GetString(11),
                        COUNTY = reader.GetString(12),
                        STATE = reader.GetString(13),
                    };
                    customers.Add(customer);
                }
            }
        }
        return customers;
    }
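
Since the method above buffers every row into a `List` before returning, memory pressure and GC work grow with the result size. A minimal sketch of a streaming variant (a hypothetical `StreamCustomers` method; `yield return` hands each row to the caller while the reader is still open), assuming the same query and column order:

    public IEnumerable<CustomerDTO> StreamCustomers(CustomersFilter filter)
    {
        string query = filter.GetCorrespondingQuery();
        using (MySqlConnection con = new MySqlConnection(_connectionString))
        using (MySqlCommand cmd = new MySqlCommand(query, con))
        {
            cmd.CommandTimeout = 0;
            con.Open();
            using (MySqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Each row is yielded as soon as it is read, so no
                    // million-element list is ever materialized.
                    yield return new CustomerDTO()
                    {
                        PHONE = reader.GetString(0),
                        FIRSTNAME = reader.GetString(1),
                        LASTNAME = reader.GetString(2),
                        ADDRESS = reader.GetString(3),
                        CStatus = reader.GetString(4),
                        Campaign = reader.GetString(5),
                        ListName = reader.GetString(6),
                        Email = reader.GetString(7),
                        Recording = reader.GetBoolean(8),
                        ZIP = reader.GetString(9),
                        CITY = reader.GetString(11),
                        COUNTY = reader.GetString(12),
                        STATE = reader.GetString(13),
                    };
                }
            }
        }
    }

The caller can then write each record out as it arrives (for example to CSV, as discussed in the comments below) instead of waiting for the full list.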

I have tried whatever I could. Can anyone guide me on how to speed it up? I have been struggling to improve this for the past few weeks.

  • [How to output MySQL query results in CSV format?](https://stackoverflow.com/questions/356578/how-to-output-mysql-query-results-in-csv-format) – Lukasz Szozda Dec 28 '18 at 17:49
  • @LukaszSzozda Can it be done using C#? – Faizan khan Dec 28 '18 at 17:52
  • I propose changing strategy: instead of pulling 1M+ rows into the application and concatenating there, you could do the work directly on the DB server and just grab the results – Lukasz Szozda Dec 28 '18 at 17:53
  • How am I supposed to do that? By using `Into OutFile "filename.csv"`? But what if the server is not running on localhost? – Faizan khan Dec 28 '18 at 17:56
  • There is certainly no good reason to allocate 1 million items in a collection - write out the CSV directly from the datareader. – Ňɏssa Pøngjǣrdenlarp Dec 28 '18 at 17:56
  • If you're not doing anything other than writing to csv with the list of customer objects you're building, I would suggest just writing them directly to csv. If the query runs fast in Workbench, and has the same results, the problem probably stems from the overhead of storing all results locally. – Uueerdo Dec 28 '18 at 17:57
  • The reason I was doing this is that I am using CsvHelper, which writes a List to a file directly. But you guys might be right, so I am now working on writing directly to the file. – Faizan khan Dec 28 '18 at 18:06
  • I have tried writing records directly to the file, but performance has not improved. – Faizan khan Dec 28 '18 at 18:19
  • Did you use BufferedStream to write to the file? – Mark Feldman Dec 28 '18 at 20:15
  • Can you show the output code you are using? – NetMage Dec 28 '18 at 21:42
  • Instead of adding each customer to the list, I tried writing each record directly to the file by passing it to the function below. `public void WriteCustomerToCSV(CustomerDTO customer, string filepath) { using (TextWriter textWriter = new StreamWriter(filepath, append:true)) { var csv = new CsvWriter(textWriter); { csv.WriteRecord(customer); }; } }` – Faizan khan Dec 29 '18 at 13:40
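
The `WriteCustomerToCSV` method in the last comment opens a new `StreamWriter` (and so reopens the file) for every single record, which on a million rows can easily dominate the runtime. A minimal sketch that opens the file and the `CsvWriter` once and streams all records through them (assuming the same CsvHelper version as in the comment; `WriteCustomersToCsv` is a hypothetical name):

    public void WriteCustomersToCsv(IEnumerable<CustomerDTO> customers, string filepath)
    {
        using (var textWriter = new StreamWriter(filepath))
        using (var csv = new CsvWriter(textWriter))
        {
            // WriteRecords streams the whole sequence (header included);
            // paired with a streaming source such as the StreamCustomers
            // sketch above, the full result set is never held in memory.
            csv.WriteRecords(customers);
        }
    }

`StreamWriter` already buffers internally, so a separate `BufferedStream` (as asked about above) is usually unnecessary.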

0 Answers