Sending and receiving big files using the Egnyte.Api NuGet package

Handling big files can be a problem when sending them over the web. Simple REST calls are enough for small or medium files, but their limitation is the size of a single request, which cannot be larger than 2 GB. For files larger than that, you have to send or download the file in chunks or as a stream.

In this post I'll describe how to send and download really big files, bigger than 2 GB, connecting to the Egnyte cloud storage with the Egnyte.Api NuGet package. I have written an introduction to the Egnyte API here and wrote about using the Egnyte.Api NuGet package here.

Sending big files in chunks

The Egnyte API exposes a dedicated method for sending big files, which is described here: Egnyte file chunked upload. First, you need to install the Egnyte.Api NuGet package, for example from the Package Manager Console:
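
    Install-Package Egnyte.Api

With the package in place, simple usage can look like this: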

    var client = new EgnyteClient(Token, Domain);

    // open the file as a stream instead of loading it into memory -
    // File.ReadAllBytes would allocate the whole file as a single byte
    // array, which fails with an OutOfMemoryException for very big files
    using (var fileStream = File.OpenRead("C:/test/big-file.zip"))
    {
        var response = await ChunkUploadFile(client, "Shared/MikTests/Blog/big-file.zip", fileStream);
    }

And the ChunkUploadFile asynchronous helper method looks like this:

    private async Task<UploadedFileMetadata> ChunkUploadFile(
        EgnyteClient client,
        string serverFilePath,
        Stream fileStream)
    {
        // Egnyte accepts chunks between 10 MB and 1 GB; a file that fits
        // in a single chunk should go through the regular upload instead
        const int defaultChunkLength = 10485760;
        if (fileStream.Length <= defaultChunkLength)
        {
            throw new ArgumentException(
                "File fits in a single chunk - use the regular upload instead.",
                nameof(fileStream));
        }

        // first chunk
        long bytesRead = defaultChunkLength;
        var buffer = new byte[defaultChunkLength];
        FillBuffer(fileStream, buffer);

        var response = await client.Files.ChunkedUploadFirstChunk(serverFilePath, new MemoryStream(buffer))
            .ConfigureAwait(false);
        int number = 2;

        while (bytesRead < fileStream.Length)
        {
            var nextChunkLength = defaultChunkLength;
            bool isLastChunk = false;
            if (bytesRead + nextChunkLength >= fileStream.Length)
            {
                // cast after subtracting - casting Length itself to int
                // would overflow for files bigger than 2 GB
                nextChunkLength = (int)(fileStream.Length - bytesRead);
                isLastChunk = true;
            }

            buffer = new byte[nextChunkLength];
            FillBuffer(fileStream, buffer);

            if (!isLastChunk)
            {
                await client.Files.ChunkedUploadNextChunk(
                    serverFilePath,
                    number,
                    response.UploadId,
                    new MemoryStream(buffer)).ConfigureAwait(false);
            }
            else
            {
                return await client.Files.ChunkedUploadLastChunk(
                    serverFilePath,
                    number,
                    response.UploadId,
                    new MemoryStream(buffer)).ConfigureAwait(false);
            }

            number++;
            bytesRead += nextChunkLength;
        }

        throw new InvalidOperationException("Chunked upload ended without sending the last chunk.");
    }

    // Stream.Read may return fewer bytes than requested, so keep reading
    // until the buffer is completely filled
    private static void FillBuffer(Stream stream, byte[] buffer)
    {
        var offset = 0;
        while (offset < buffer.Length)
        {
            var read = stream.Read(buffer, offset, buffer.Length - offset);
            if (read == 0)
            {
                throw new EndOfStreamException("Unexpected end of stream.");
            }

            offset += read;
        }
    }

Notice that this code uses three methods that correspond to three web requests, used for sending the first, next, and last data chunk. The response of ChunkedUploadFirstChunk gives you an UploadId that identifies the upload and must be provided in the other two methods. The buffer size I used is 10485760 bytes, which is 10 megabytes, but you can use whatever suits you between 10 MB and 1 GB. Since only one chunk is buffered at a time, memory usage of the sample console application stays low: in my tests it did not exceed 100 MB, even for a 2.5 GB file.

Downloading big files

Downloading is much simpler than uploading. The important thing is to use streams the right way, so that the application does not allocate too much memory.

    var client = new EgnyteClient(Token, Domain);

    var responseStream = await client.Files.DownloadFileAsStream("Shared/MikTests/Blog/big-file.zip");

    // FileMode.Create truncates the file if it already exists, so leftover
    // bytes from a previous, longer download cannot corrupt the result
    using (FileStream file = new FileStream("C:/test/big-file01.zip", FileMode.Create, FileAccess.Write))
    {
        CopyStream(responseStream.Data, file);
    }

And CopyStream helper method looks like this:

    /// <summary>
    /// Copies the contents of input to output. Doesn't close either stream.
    /// </summary>
    public static void CopyStream(Stream input, Stream output)
    {
        byte[] buffer = new byte[8 * 1024];
        int len;
        while ((len = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, len);
        }
    }
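
Alternatively, if you are on .NET 4.5 or newer, the framework's built-in Stream.CopyTo and Stream.CopyToAsync do the same buffered copy, so the helper isn't strictly necessary. A minimal sketch, assuming the same client and paths as above:

    var responseStream = await client.Files.DownloadFileAsStream("Shared/MikTests/Blog/big-file.zip");

    using (var file = new FileStream("C:/test/big-file01.zip", FileMode.Create, FileAccess.Write))
    {
        // CopyToAsync reads and writes in buffered chunks under the hood
        await responseStream.Data.CopyToAsync(file);
    }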

I tested this code by sending and downloading 2.5 GB files and many smaller ones, and it works great.
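
If you want to check the round trip yourself, a simple way is to compare checksums of the original and the downloaded file. Here is a minimal sketch using SHA256 from System.Security.Cryptography; the FileChecksum helper is just an illustrative name, and the file paths are the examples used above:

    using System;
    using System.IO;
    using System.Security.Cryptography;

    public static class FileChecksum
    {
        // computes the SHA256 hash of a file, reading it as a stream
        // so even very big files are not loaded into memory
        public static string Compute(string path)
        {
            using (var sha = SHA256.Create())
            using (var stream = File.OpenRead(path))
            {
                return BitConverter.ToString(sha.ComputeHash(stream));
            }
        }
    }

    // usage - both hashes should match after the upload and download:
    // var identical = FileChecksum.Compute("C:/test/big-file.zip")
    //     == FileChecksum.Compute("C:/test/big-file01.zip");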

All posted code is available in my public github repository: https://github.com/mikuam/Blog.

If you'd like to see other examples of using Egnyte.Api, let me know.
