
Currently I work on an application that can send and retrieve arbitrarily large files. In the beginning we decided to use JSON for this because it is quite easy to handle and store. This works until images, videos or larger files in general come in, and that is exactly where our current approach breaks down.

So we have at least the following problems with the current approach:

  • The 1 MB request body size limit of Express (solved)
  • The 10 MB file size limit of axios (solved)
  • The 16 MB document size limit of MongoDB (no solution currently)
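For reference, the first two limits can be raised through configuration. Below is a minimal sketch, assuming an Express JSON endpoint and an axios client; the 50 MB values are arbitrary examples, not values from the original setup:

```javascript
const express = require('express');
const axios = require('axios');

const app = express();

// Express: raise the JSON body parser's limit (the default is small).
app.use(express.json({ limit: '50mb' }));

// axios: raise the client-side limits when sending large payloads.
// maxBodyLength caps the outgoing request body,
// maxContentLength caps the incoming response body.
const client = axios.create({
  maxBodyLength: 50 * 1024 * 1024,
  maxContentLength: 50 * 1024 * 1024,
});
```

This only moves the ceiling, though; the MongoDB document limit remains.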

So currently we are trying to overcome the limit of MongoDB, but in general it feels like we are on the wrong path. As files get bigger there will be more and more limits that are harder to overcome, and maybe MongoDB's limit is not solvable at all. So is there a more efficient way to do this than what we currently do?

There is one thing left to say: on the server side we need to load the whole object back together to verify that its structure is the one we expect and to hash the whole object. So we have not considered splitting it up so far, but maybe that is the only option left. Even then, how would you send videos or similarly big chunks?

NemesisFLX

1 Answer


If you need to store files bigger than 16 MB in MongoDB, you can use GridFS.
GridFS works by splitting your file into smaller chunks of data (255 kB by default) and storing them separately. When the file is needed, it gets reassembled and becomes available.
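A minimal sketch of what that looks like with the official `mongodb` driver's `GridFSBucket`; the connection string, database, bucket and file names here are placeholders, not values from the question:

```javascript
const { MongoClient, GridFSBucket } = require('mongodb');
const fs = require('fs');

async function storeVideo(path) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('mydb');
  const bucket = new GridFSBucket(db, { bucketName: 'uploads' });

  // Stream the file into GridFS; the driver splits it into chunks
  // automatically, so the 16 MB document limit no longer applies.
  await new Promise((resolve, reject) => {
    fs.createReadStream(path)
      .pipe(bucket.openUploadStream('video.mp4'))
      .on('finish', resolve)
      .on('error', reject);
  });

  // Reading it back is the mirror image, e.g.:
  //   bucket.openDownloadStreamByName('video.mp4').pipe(destination);

  await client.close();
}
```

Because both upload and download are streams, the file never has to be held in memory in one piece on the server.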

philoez98
  • Sounds nice, but is it even a best practice to, let's say, encode a video into JSON and send and store it that way? – NemesisFLX Jun 23 '19 at 14:21
  • Well, if you want to use MongoDB then yes, it is. It's pretty efficient and it's the recommended approach in such cases. Of course you could store big files like videos in a different db, but that is another story. I'd go with GridFS. – philoez98 Jun 23 '19 at 16:11