
I am trying to upload files using an approach similar to the one in HttpClient: How to upload multiple files at once in windows phone.

using (var content = new MultipartFormDataContent())
{
    content.Add(CreateFileContent(imageStream, "image.jpg", "image/jpeg"));
    content.Add(CreateFileContent(signatureStream, "image.jpg.sig", "application/octet-stream"));

    var response = await httpClient.PostAsync(_profileImageUploadUri, content);
    response.EnsureSuccessStatusCode();
}

private StreamContent CreateFileContent(Stream stream, string fileName, string contentType)
{
    var fileContent = new StreamContent(stream);
    fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data") 
    { 
        Name = "\"files\"", 
        FileName = "\"" + fileName + "\""
    }; // the extra quotes are key here
    fileContent.Headers.ContentType = new MediaTypeHeaderValue(contentType);            
    return fileContent;
}

This works fine when uploading small files. But if I try to upload a larger file (say > 50 MB) on a low-end device (512 MB of memory), it throws System.OutOfMemoryException. I used the diagnostic tools to monitor memory consumption and noticed that memory grows sharply during the PostAsync call. It seems the entire content is being copied into memory. Right now we don't have chunking support in the API.

What is the best strategy for uploading a large file using HttpClient on a low-memory Windows Phone device?

RP.
  • Here's a solution: http://stackoverflow.com/questions/26223902/window-phone-8-submit-post-form-with-an-image/26243886#26243886 – Scott Nimrod Jan 19 '15 at 06:32
  • What about `Windows.Networking.BackgroundTransfer.BackgroundUploader`? Look here: http://stackoverflow.com/a/27430331/27211 – kiewic Jan 21 '15 at 01:22
  • All API calls for download & upload are implemented in an SDK (Portable Class Library supporting .NET 4.5.1, Windows Phone 8.1 & Windows Store 8.1). As far as I know, support for Windows.Networking.BackgroundTransfer.BackgroundUploader is not available in a PCL; correct me if I am wrong. I have to implement this using HttpClient. – RP. Jan 22 '15 at 00:44

3 Answers


Do multipart POST manually - without help from MultipartFormDataContent

If you must send it multi-part, then you could POST it more manually, reading from the source file in 4k buffer blocks.

You don't necessarily need to do this with async methods; the essence of the solution is manual control over the 4k buffering. But async would be ideal, being the most thread/CPU efficient.

Here's another recommended link to understand how to code multipart POSTs, and another for understanding the protocol; here's an example of what gets sent out over the stream, illustrating the boundary markers.
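
As a rough sketch of what "manual" could mean here (my own illustration, not the answerer's exact code): a custom HttpContent subclass that writes the multipart body straight onto the request stream in 4k blocks, so the whole file is never buffered in memory at once. The class name and boundary value are made up, and the field name "files" is carried over from the question's code:

// Sketch only: a custom HttpContent that streams a single file part manually.
// Assumes: using System; using System.IO; using System.Net;
// using System.Net.Http; using System.Net.Http.Headers;
// using System.Text; using System.Threading.Tasks;
public class ManualMultipartFileContent : HttpContent
{
    private const string Boundary = "----ManualBoundary1234"; // illustrative boundary
    private readonly Stream _file;
    private readonly string _fileName;

    public ManualMultipartFileContent(Stream file, string fileName)
    {
        _file = file;
        _fileName = fileName;
        Headers.ContentType = MediaTypeHeaderValue.Parse(
            "multipart/form-data; boundary=" + Boundary);
    }

    protected override async Task SerializeToStreamAsync(Stream stream, TransportContext context)
    {
        // Part header for a single file field named "files".
        var header = Encoding.UTF8.GetBytes(
            "--" + Boundary + "\r\n" +
            "Content-Disposition: form-data; name=\"files\"; filename=\"" + _fileName + "\"\r\n" +
            "Content-Type: application/octet-stream\r\n\r\n");
        await stream.WriteAsync(header, 0, header.Length);

        // Copy the file in 4k blocks so only one block is in memory at a time.
        var buffer = new byte[4096];
        int read;
        while ((read = await _file.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            await stream.WriteAsync(buffer, 0, read);
        }

        // Closing boundary.
        var footer = Encoding.UTF8.GetBytes("\r\n--" + Boundary + "--\r\n");
        await stream.WriteAsync(footer, 0, footer.Length);
    }

    protected override bool TryComputeLength(out long length)
    {
        length = -1;
        return false; // unknown length; the request is sent with chunked transfer encoding
    }
}

Posting would then be along the lines of await httpClient.PostAsync(uri, new ManualMultipartFileContent(fileStream, "image.jpg")). Whether the underlying handler really streams without buffering can vary per platform, so it's worth verifying with the memory profiler.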

Also, architecturally, I tend to prefer uploading files separately from any (form) data. This avoids multipart posting completely, making your APIs atomic and simple. You may have a service which simply stores an uploaded file and returns its URL or ID. That URL or ID can then be referenced in your data and posted subsequently.
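
For illustration, the two-step flow could look roughly like this; the endpoint URLs and the "fileId" field are made up for the example:

// Sketch of the two-step approach: 1) stream the file up on its own,
// 2) post the form data that references the returned ID.
// Assumes: using System.Collections.Generic; using System.IO; using System.Net.Http;
using (var fileStream = File.OpenRead(filePath))
{
    var upload = await httpClient.PostAsync(
        "https://example.com/api/files",              // hypothetical file endpoint
        new StreamContent(fileStream));
    upload.EnsureSuccessStatusCode();
    string fileId = await upload.Content.ReadAsStringAsync();

    var formData = new FormUrlEncodedContent(new[]
    {
        new KeyValuePair<string, string>("fileId", fileId),
        new KeyValuePair<string, string>("description", "profile image")
    });
    var post = await httpClient.PostAsync(
        "https://example.com/api/profile",            // hypothetical data endpoint
        formData);
    post.EnsureSuccessStatusCode();
}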

Todd
  • I don't have control over the server-side APIs. I am trying to follow the second approach. Also trying with http://www.strathweb.com/2013/01/asynchronously-streaming-video-with-asp-net-web-api/. – RP. Jan 25 '15 at 09:21
  • That link doesn't relate to multipart, and may confuse your manual implementation if you start with their async architecture. You should start with my link, and learn from your link to apply the async methods. There are many ways of implementing a manual 4k buffered POST multipart stream, but my overall answer is essentially to do it manually and give up on MultipartFormDataContent. – Todd Jan 26 '15 at 06:40
  • @Todd why don't you recommend using MultipartFormDataContent? – gusmally supports Monica Nov 08 '19 at 23:58
  • @gusmally I think the full reason why is captured in my answer. 1) Multi-part is added complexity, for both the client and the server. Complexity leads to bugs, and code that's harder to maintain. 2) Where possible, functions should only have one purpose. 3) Architecturally, it's very easy to separate file-upload data from form-data. Therefore, there is no good reason to use multi-part unless it's the API for an old system, or a newer third-party system. – Todd Nov 16 '19 at 00:21

I'm not really an expert on MultipartFormDataContent (and it might split the content under the hood), but a tip would be to divide the data you want to send into smaller blocks.

E.g. split the image into blocks of, say, 10 MB or less (depending on memory usage), send those one by one, and reconstruct the file on the receiving end.

That might result in a for loop to traverse the blocks, something like this:

// Note: MultipartFormDataContent.Add expects an HttpContent instance,
// so each byte[] block is wrapped in a ByteArrayContent here.
foreach (byte[] block in dividedContent)
{
    using (var content = new MultipartFormDataContent())
    {
        content.Add(new ByteArrayContent(block), "files", "image.jpg");

        var response = await httpClient.PostAsync(_profileImageUploadUri, content);
        response.EnsureSuccessStatusCode();
    }
}

perhaps something like that would solve your problem :)
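
For completeness, dividedContent could be produced with a small helper along these lines (a sketch; the method name and the 10 MB default block size are just examples):

// Hypothetical helper: reads the source stream in fixed-size blocks so the
// whole file never has to sit in memory at once.
// Assumes: using System; using System.Collections.Generic; using System.IO;
private static IEnumerable<byte[]> ReadBlocks(Stream source, int blockSize = 10 * 1024 * 1024)
{
    var buffer = new byte[blockSize];
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        var block = new byte[read];
        Array.Copy(buffer, block, read);
        yield return block;
    }
}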

Blaatz0r
  • I don't have any control over the server-side APIs. I have to find a way to upload until the APIs support chunked upload. – RP. Jan 25 '15 at 09:25
  • Divide into dividedContent and join it together like a 7z multi-volume archive? Good idea, but I have not tried it yet. – toha Sep 09 '16 at 11:10

You could use a FileStream together with the StreamContent class:

    using (var fileStream = File.OpenRead(filepath))
    {
        var response = await _httpClient
            .PostAsync(requestUri, new StreamContent(fileStream))
            .ConfigureAwait(false);
    }

This seems to do all the necessary chunking of reading in the file and POSTing it in the background, plus you can set the buffer size in the StreamContent constructor as needed. I did not see my application memory footprint grow more than a MB with this method, even with very large files.
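
For example, a smaller read buffer can be passed as the second StreamContent constructor argument; the 4 KB value and the content type header below are just illustrative choices:

    // Sketch: StreamContent(Stream, int) lets you control the read buffer size.
    // Assumes: using System.IO; using System.Net.Http; using System.Net.Http.Headers;
    using (var fileStream = File.OpenRead(filepath))
    {
        var content = new StreamContent(fileStream, 4096);   // 4 KB read buffer
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

        var response = await _httpClient.PostAsync(requestUri, content).ConfigureAwait(false);
        response.EnsureSuccessStatusCode();
    }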