
I have a MongoDB database in which one collection has 2,300,000 documents and is still growing. Until the collection had about 1,000,000 documents the API response time was quick and the webpage loaded fast, but as soon as it crossed the 2,000,000 mark it started having issues and took about 100 seconds to find and return the data. I don't know what to do about this sudden surge in data. Are there any practices I should follow to manage it and reduce the response time of the APIs? The data I am trying to fetch is filtered by date, and the query has to scan the entire collection just to find the data for a single day. I have searched a lot but have not been able to find a solution.

1 Answer


[Not enough reputation to comment]

An index is probably the solution for you. Can you provide an example of both a typical document and the query you run? Are you retrieving (or do you really need) the whole documents, or just some of their fields?

Typically I would suggest creating an index on your date field, in descending (inverse) order; it will definitely improve your searches if they mostly concern the more recent documents, because the query can jump straight to the matching date range instead of scanning the whole collection. I can help you set it up if you need; see the sketch below.
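As a minimal sketch in mongosh, assuming your collection is called orders and the date field is called createdAt (adjust both names to match your schema), it would look something like this:

```javascript
// Create a descending index on the date field (one-time operation).
// Direction barely matters for a single-field index, but -1 matches
// the "newest first" access pattern described above.
db.orders.createIndex({ createdAt: -1 })

// Query one day's worth of data as a range, so the index can be used.
// Only return the fields the page actually needs (projection) instead
// of whole documents, to cut down on data transferred.
db.orders.find(
  {
    createdAt: {
      $gte: ISODate("2024-01-15T00:00:00Z"),   // start of the day
      $lt:  ISODate("2024-01-16T00:00:00Z")    // start of the next day
    }
  },
  { _id: 0, createdAt: 1, status: 1, total: 1 }  // hypothetical fields
).sort({ createdAt: -1 })
```

The important part is querying the date as a range ($gte / $lt) on the indexed field rather than computing or formatting dates per document, which would prevent the index from being used.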

The MongoDB documentation on indexes will help you understand how they work and how to optimize your queries.
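To confirm the index is actually being used, you can look at the query plan with explain. This is a sketch using the same assumed names as above:

```javascript
// "executionStats" shows which plan was chosen and how many documents
// were examined. Before the index you should see a COLLSCAN stage;
// after creating it, an IXSCAN stage with far fewer docsExamined.
db.orders.find({
  createdAt: {
    $gte: ISODate("2024-01-15T00:00:00Z"),
    $lt:  ISODate("2024-01-16T00:00:00Z")
  }
}).explain("executionStats")
```

If you still see a collection scan after adding the index, post the exact query and a sample document and I can take another look.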
