Sending and receiving big files using Egnyte.API NuGet package

Handling big files can be a problem when sending them over the web. Simple REST calls are enough for small or medium files, but their limitation is the size of a single request, which cannot be larger than 2GB. For files bigger than that, you have to send or download the file in chunks or as a stream.

In this post I’ll describe how to send and download really big files, bigger than 2GB, to and from Egnyte cloud storage with the Egnyte.Api NuGet package. I have written an introduction to the Egnyte API here and about using the Egnyte.Api NuGet package here.

Sending big files in chunks

The Egnyte API exposes a dedicated method for sending big files, which is described here: Egnyte file chunked upload. First you need to install the Egnyte.Api NuGet package. Simple code can look like this:
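
The listing below is only a rough sketch: the OAuth token, the Egnyte domain and both file paths are placeholders, and ChunkUploadFile is the helper shown in the next listing.

using System.IO;
using System.Threading.Tasks;
using Egnyte.Api;

class Program
{
    static async Task Main()
    {
        // Placeholders - put your own OAuth token and Egnyte domain here
        var client = new EgnyteClient("YOUR_TOKEN", "yourdomain");

        // Open the local file as a stream, so it is read from disk
        // piece by piece instead of being loaded into memory all at once
        using (var fileStream = File.OpenRead("C:/test/big-file.zip"))
        {
            await ChunkUploadFile(client, fileStream, "Shared/Documents/big-file.zip");
        }
    }
}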

And the ChunkUploadFile asynchronous helper method looks like this:
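
This helper is meant to sit in the same class as the previous listing and is also a sketch: the exact parameter order of ChunkedUploadFirstChunk, ChunkedUploadNextChunk and ChunkedUploadLastChunk is an assumption here, so verify it against the package. The loop itself just reads the file 10 MB at a time and hands each chunk to the right call.

private static async Task ChunkUploadFile(EgnyteClient client, Stream fileStream, string serverFilePath)
{
    // 10 MB chunks - the API accepts anything between 10 MB and 1 GB
    // (this sketch assumes the file is bigger than a single chunk)
    const int chunkSize = 10485760;
    var buffer = new byte[chunkSize];

    // First chunk - the response carries the UploadId that identifies this upload
    var bytesRead = await fileStream.ReadAsync(buffer, 0, chunkSize);
    var firstChunkResponse = await client.Files.ChunkedUploadFirstChunk(
        serverFilePath,
        new MemoryStream(buffer, 0, bytesRead));
    var uploadId = firstChunkResponse.UploadId;
    var chunkNumber = 2;

    while ((bytesRead = await fileStream.ReadAsync(buffer, 0, chunkSize)) > 0)
    {
        var chunk = new MemoryStream(buffer, 0, bytesRead);
        if (fileStream.Position < fileStream.Length)
        {
            // Middle chunks - the UploadId and chunk number must be passed along
            await client.Files.ChunkedUploadNextChunk(serverFilePath, chunkNumber, uploadId, chunk);
        }
        else
        {
            // Last chunk - this call completes the upload
            await client.Files.ChunkedUploadLastChunk(serverFilePath, chunkNumber, uploadId, chunk);
        }

        chunkNumber++;
    }
}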

Notice that this code uses three methods that map to three web requests, used for sending the first, next and last data chunk. The response of ChunkedUploadFirstChunk gives you an UploadId that identifies the upload and must be provided in the other two methods. The buffer size I used is 10485760 bytes, that is 10 megabytes, but you can use whatever suits you between 10 MB and 1 GB. Memory usage of the sample console application looks like this:

Downloading big files

Downloading is much simpler than uploading. The important thing is to use streams the right way, so that the application does not allocate too much memory.
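
A sketch of the download side is below. It assumes the Files client exposes a method such as DownloadFileAsStream whose result carries the response body in a Data stream (both names are assumptions, so check them against the package); the point is that the response stream goes straight into a FileStream on disk, and the paths are placeholders.

private static async Task DownloadBigFile(EgnyteClient client, string serverFilePath, string localFilePath)
{
    // Assumed API: a call that hands back the file content as a stream
    // instead of a byte array, so the whole file never sits in memory
    var downloadedFile = await client.Files.DownloadFileAsStream(serverFilePath);

    using (var localStream = new FileStream(localFilePath, FileMode.Create, FileAccess.Write))
    {
        // Copy the response stream to disk in small chunks
        await CopyStream(downloadedFile.Data, localStream);
    }
}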

And the CopyStream helper method looks like this:
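
A plain buffered copy is enough here; a minimal sketch:

private static async Task CopyStream(Stream input, Stream output)
{
    // A small fixed-size buffer keeps memory usage flat
    // regardless of how big the downloaded file is
    var buffer = new byte[16 * 1024];
    int bytesRead;

    while ((bytesRead = await input.ReadAsync(buffer, 0, buffer.Length)) > 0)
    {
        await output.WriteAsync(buffer, 0, bytesRead);
    }
}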

I tested this code by sending and downloading 2.5GB files, as well as many smaller ones, and it works great.

All posted code is available in my public GitHub repository: https://github.com/mikuam/Blog.

If you’d like to see other examples of using Egnyte.Api, let me know.

2 thoughts on “Sending and receiving big files using Egnyte.API NuGet package”

  1. Hi,

    Thanks for this valuable post, which saves many developers a lot of time and effort.
    I am having a problem when I use this code: I am facing an “Out Of Memory” exception.
    I think it is due to the line “MemoryStream(File.ReadAllBytes(“C:/test/big-file.zip”));”

    Can you post another version which has both read and write in chunks?

    It would be a complete solution then.

    Thanks

    1. It seems that you have a problem with reading the file from your local directory. I tested my code and showed that it didn’t use more than 100 MB of memory, while the file was 2.5GB. Check how much memory your process allocates in Task Manager. What is your file size?
