
How to send many requests in parallel in ASP.Net Core

I want to make 1000 requests! How can I make it really fast? Let’s have a look at 4 approaches and compare their speed.

Preparations

In order to test different methods of handling requests, I created a very simple ASP.Net Core API that returns a user by id. It fetches users from a plain old MSSQL database.
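The endpoint itself is not the focus of this post, but for context it could look more or less like this (a sketch of my setup; UsersController and IUsersRepository are assumed names, not code from the repository):

    [Route("api/[controller]")]
    public class UsersController : Controller
    {
        // IUsersRepository is a hypothetical abstraction over the MSSQL database
        private readonly IUsersRepository _usersRepository;

        public UsersController(IUsersRepository usersRepository)
        {
            _usersRepository = usersRepository;
        }

        // GET api/users/5 - returns a single user by id
        [HttpGet("{id}")]
        public async Task<UserDto> Get(int id)
        {
            return await _usersRepository.GetUser(id);
        }
    }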

I deployed it quickly to Azure using App services and it was ready for testing in less than two hours. It’s amazing how quickly a .net core app can be deployed and tested in a real hosting environment. I was also able to debug it remotely and inspect its behavior in Application Insights.

Here is my post on how to build an app and deploy it to Azure: https://www.michalbialecki.com/2017/12/21/sending-a-azure-service-bus-message-in-asp-net-core/

And a post about custom data source in Application Insights: https://www.michalbialecki.com/2017/09/03/custom-data-source-in-application-insights/

The API in swagger looks like this:

So the task here is to write a method that would call this endpoint and fetch 1000 users by their ids as fast as possible.

I wrapped a single call in a UsersClient class:

    public class UsersClient
    {
        private HttpClient client;

        public UsersClient()
        {
            client = new HttpClient();
        }

        public async Task<UserDto> GetUser(int id)
        {
            var response = await client.GetAsync(
                "http://michalbialeckicomnetcoreweb20180417060938.azurewebsites.net/api/users/" + id)
                .ConfigureAwait(false);
            var user = JsonConvert.DeserializeObject<UserDto>(await response.Content.ReadAsStringAsync());

            return user;
        }
    }

#1 Let’s use asynchronous programming

Asynchronous programming in C# is very simple: you just use the async / await keywords in your methods and magic happens.

    public async Task<IEnumerable<UserDto>> GetUsersSynchronously(IEnumerable<int> userIds)
    {
        var users = new List<UserDto>();
        foreach (var id in userIds)
        {
            users.Add(await client.GetUser(id));
        }

        return users;
    }

Score: 4 minutes 51 seconds

This is because although we use asynchronous programming, it doesn’t mean requests are done in parallel; each request is awaited before the next one starts. Asynchronous only means requests will not block the calling thread, which can carry on with other work. If you look at how requests are executed in time, you will see something like this:

#2 Let’s run requests in parallel

Running in parallel is the key here, because you can make many requests in roughly the same time that one request takes. The code can look like this:

    public async Task<IEnumerable<UserDto>> GetUsersInParallel(IEnumerable<int> userIds)
    {
        var tasks = userIds.Select(id => client.GetUser(id));
        var users = await Task.WhenAll(tasks);

        return users;
    }

WhenAll is a beautiful creation that awaits a collection of tasks of the same type and returns an array of their results. A drawback here is exception handling: when something goes wrong, the combined task holds an AggregateException with possibly multiple inner exceptions, but you would not know which request caused which of them.
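If you do need to know which call failed, one workaround (a sketch, not part of the original sample) is to keep the id-to-task mapping and inspect it after awaiting:

    var tasks = userIds.ToDictionary(id => id, id => client.GetUser(id));
    try
    {
        await Task.WhenAll(tasks.Values);
    }
    catch
    {
        // the dictionary tells us exactly which ids failed
        var failedIds = tasks.Where(t => t.Value.IsFaulted).Select(t => t.Key).ToList();
        // log or retry just those ids
    }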

Score: 28 seconds

This is way better than before, but it’s still not impressive. The thing that slows down the process is the overhead of concurrency itself: firing 1000 requests at the same time means the framework has to manage 1000 outstanding connections and schedule all their continuations, and that management has a cost. The timeline looks like this:

#3 Let’s run requests in parallel, but smarter

The idea here is to do parallel requests, but not all at the same time. Let’s do it in batches of 100.

    public async Task<IEnumerable<UserDto>> GetUsersInParallelFixed(IEnumerable<int> userIds)
    {
        var users = new List<UserDto>();
        var batchSize = 100;
        int numberOfBatches = (int)Math.Ceiling((double)userIds.Count() / batchSize);

        for(int i = 0; i < numberOfBatches; i++)
        {
            var currentIds = userIds.Skip(i * batchSize).Take(batchSize);
            var tasks = currentIds.Select(id => client.GetUser(id));
            users.AddRange(await Task.WhenAll(tasks));
        }
            
        return users;
    }

Score: 20 seconds

This is a slightly better result, because the framework needs to handle fewer concurrent requests at the same time and is therefore more effective. You can manipulate the batch size and figure out what is best for you. The timeline looks like this:
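An alternative to fixed batches, sketched below under the assumption that around 100 concurrent requests is the sweet spot, is to cap concurrency with a SemaphoreSlim. Unlike batching, a new request starts as soon as any slot frees up, so one slow request does not hold back the 99 others:

    public async Task<IEnumerable<UserDto>> GetUsersThrottled(IEnumerable<int> userIds)
    {
        // allow at most 100 requests in flight at any given moment
        using (var throttle = new SemaphoreSlim(100))
        {
            var tasks = userIds.Select(async id =>
            {
                await throttle.WaitAsync();
                try
                {
                    return await client.GetUser(id);
                }
                finally
                {
                    throttle.Release();
                }
            });

            return await Task.WhenAll(tasks);
        }
    }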

#4 The proper solution

The proper solution needs some modifications in the API. You won’t always have the ability to change the API you are calling, but only changes on both sides can take you even further. It is not effective to fetch users one by one when we need to fetch thousands of them. To further enhance performance we need to create an endpoint suited to our use case – in this case, fetching many users at once. Now swagger looks like this:

and code for fetching users:

    public async Task<IEnumerable<UserDto>> GetUsers(IEnumerable<int> ids)
    {
        var response = await client
            .PostAsync(
                "http://michalbialeckicomnetcoreweb20180417060938.azurewebsites.net/api/users/GetMany",
                new StringContent(JsonConvert.SerializeObject(ids), Encoding.UTF8, "application/json"))
            .ConfigureAwait(false);

        var users = JsonConvert.DeserializeObject<IEnumerable<UserDto>>(await response.Content.ReadAsStringAsync());

        return users;
    }

Notice that the endpoint for getting multiple users is a POST. This is because the payload we send can be big and might not fit in a query string, so it is a good practice to use POST in such a case.
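For completeness, the server side of such an endpoint could look more or less like this (a sketch; the repository call is an assumption, not the actual implementation):

    [HttpPost]
    [Route("api/users/GetMany")]
    public async Task<IEnumerable<UserDto>> GetMany([FromBody] IEnumerable<int> ids)
    {
        // hypothetical repository method that fetches all requested users in one database query
        return await _usersRepository.GetUsersByIds(ids);
    }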

Code that would fetch users in batches in parallel looks like this:

    public async Task<IEnumerable<UserDto>> GetUsersInParallelWithBatches(IEnumerable<int> userIds)
    {
        var tasks = new List<Task<IEnumerable<UserDto>>>();
        var batchSize = 100;
        int numberOfBatches = (int)Math.Ceiling((double)userIds.Count() / batchSize);

        for (int i = 0; i < numberOfBatches; i++)
        {
            var currentIds = userIds.Skip(i * batchSize).Take(batchSize);
            tasks.Add(client.GetUsers(currentIds));
        }
            
        return (await Task.WhenAll(tasks)).SelectMany(u => u);
    }

Score: 0.38 seconds

Yes, less than one second! On a timeline it looks like this:

Compared to the other methods on a chart, it’s barely even visible:

How to optimize your requests

Keep in mind that every case is different, and what works for one service does not necessarily work for the next one. Try different things and approaches, and find methods to measure your efforts.

Here are a few tips from me:

  • Remember that the biggest cost is not processor cycles, but rather IO operations. This includes SQL queries, network operations and message handling. Find improvements there.
  • Don’t start with parallel processing, as it brings complexity. First try to optimize your service, for example by using hashsets or dictionaries instead of lists
  • Use the smallest DTOs possible and serialize only the fields you actually use
  • Implement an endpoint suited to your needs
  • Use caching if applicable
  • Try different serializers instead of JSON, for example Protobuf
  • When it is still not enough… – try a different architecture, like a push model architecture or maybe actor-model programming, like Microsoft Orleans: https://www.michalbialecki.com/2018/03/05/getting-started-microsoft-orleans/

You can find all code posted here in my github repo: https://github.com/mikuam/Blog.

Optimize and enjoy 🙂

Sending and receiving big files using Egnyte.API nuget package

Handling big files can be a problem when sending them through the web. Simple REST calls are enough for small or medium files, but their limitation is the size of a request, which cannot be larger than 2GB. For files larger than that, you have to send or download the file in chunks or as a stream.

In this post I’ll describe how to send and download really big files, bigger than 2GB, by connecting to Egnyte cloud storage with the Egnyte.Api nuget package. I have written an introduction to the Egnyte API here and about using the Egnyte.Api nuget package here.

Sending big files in chunks

The Egnyte API exposes a dedicated method for sending big files, described here: Egnyte file chunked upload. First you need to install the Egnyte.Api nuget package. Simple code can look like this:

    var client = new EgnyteClient(Token, Domain);

    // File.OpenRead streams the file instead of loading it into memory -
    // File.ReadAllBytes would fail for files bigger than 2GB anyway
    var fileStream = File.OpenRead("C:/test/big-file.zip");
    var response = await ChunkUploadFile(client, "Shared/MikTests/Blog/big-file.zip", fileStream);

And ChunkUploadFile asynchronous helper method looks like this:

    private async Task<UploadedFileMetadata> ChunkUploadFile(
        EgnyteClient client,
        string serverFilePath,
        Stream fileStream)
    {
        // first chunk
        var defaultChunkLength = 10485760;
        var firstChunkLength = defaultChunkLength;
        if (fileStream.Length < firstChunkLength)
        {
            firstChunkLength = (int)fileStream.Length;
        }

        var bytesRead = firstChunkLength;
        var buffer = new byte[firstChunkLength];
        fileStream.Read(buffer, 0, firstChunkLength);

        var response = await client.Files.ChunkedUploadFirstChunk(serverFilePath, new MemoryStream(buffer))
            .ConfigureAwait(false);
        int number = 2;

        while (bytesRead < fileStream.Length)
        {
            var nextChunkLength = defaultChunkLength;
            bool isLastChunk = false;
            if (bytesRead + nextChunkLength >= fileStream.Length)
            {
                nextChunkLength = (int)fileStream.Length - bytesRead;
                isLastChunk = true;
            }

            buffer = new byte[nextChunkLength];
            fileStream.Read(buffer, 0, nextChunkLength);

            if (!isLastChunk)
            {
                await client.Files.ChunkedUploadNextChunk(
                    serverFilePath,
                    number,
                    response.UploadId,
                    new MemoryStream(buffer)).ConfigureAwait(false);
            }
            else
            {
                return await client.Files.ChunkedUploadLastChunk(
                    serverFilePath,
                    number,
                    response.UploadId,
                    new MemoryStream(buffer)).ConfigureAwait(false);
            }
            number++;
            bytesRead += nextChunkLength;
        }

        throw new Exception("Something went wrong - unable to enumerate to next chunk.");
    }

Notice that this code uses three methods that map to three web requests, for sending the first, next and last data chunk. The response of ChunkedUploadFirstChunk gives you an UploadId that identifies the upload and must be provided in the other two methods. The buffer size I used is 10485760 bytes, that is 10 megabytes, but you can use whatever suits you between 10 MB and 1 GB. Memory usage of the sample console application looks like this:

Downloading big files

Downloading is much simpler than uploading. The important thing is to use streams the right way, so that the application does not allocate too much memory.

    var client = new EgnyteClient(Token, Domain);

    var responseStream = await client.Files.DownloadFileAsStream("Shared/MikTests/Blog/big-file.zip");

    using (FileStream file = new FileStream("C:/test/big-file01.zip", FileMode.OpenOrCreate, FileAccess.Write))
    {
        CopyStream(responseStream.Data, file);
    }

And CopyStream helper method looks like this:

    /// <summary>
    /// Copies the contents of input to output. Doesn't close either stream.
    /// </summary>
    public static void CopyStream(Stream input, Stream output)
    {
        byte[] buffer = new byte[8 * 1024];
        int len;
        while ((len = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, len);
        }
    }
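As a side note, since .NET 4 the framework ships an equivalent built-in method, Stream.CopyTo, so the helper can be replaced with a one-liner:

    // built-in alternative to the CopyStream helper above
    responseStream.Data.CopyTo(file);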

I tested this code by sending and downloading 2.5GB files and many smaller ones and it works great.

All posted code is available in my public github repository: https://github.com/mikuam/Blog.

If you’d like to see other examples of using Egnyte.Api, let me know.

How to handle error 0x800703E3, when a user cancels a file download

Recently at work I came across a difficult error, one with an error message that would lead me nowhere.

The remote host closed the connection. The error code is 0x800703E3.

I’ll give you more context – the error occurs in a micro-service that serves big files across the web with a REST interface. The service was working perfectly and none of our clients raised issues. But something was wrong. After some hours I finally managed to reproduce it. The error occurred when a client was downloading a file, but intentionally canceled the download. How do you handle such a situation? The exception did not have any distinct type that could be handled separately.

In order to handle it properly in ASP.NET Web API I added an exception logger:

public static class RegisterFilters
{
    public static void Execute(HttpConfiguration configuration)
    {
        configuration.Services.Add(typeof(IExceptionLogger), new WebExceptionLogger());
    }
}

And WebExceptionLogger class implementation:

public class WebExceptionLogger : ExceptionLogger
{
    private const int RequestCancelledByUserExceptionCode = -2147023901;

    public override void Log(ExceptionLoggerContext context)
    {
        var dependencyScope = context.Request.GetDependencyScope();
        var loggerFactory = dependencyScope.GetService(typeof(ILoggerFactory)) as ILoggerFactory;
        if (loggerFactory == null)
        {
            throw new IoCResolutionException<ILoggerFactory>();
        }

        var logger = loggerFactory.GetTechnicalLogger<WebExceptionLogger>();
        if (context.Exception.HResult == RequestCancelledByUserExceptionCode)
        {
            logger.Info($"Request to url {context.Request.RequestUri} was cancelled by user.");
        }
        else
        {
        logger.Error("An unhandled exception has occurred", context.Exception);
        }
    }
}

I noticed that this specific error type has HResult = -2147023901, so this is what I’m filtering by.
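The constant is just the 0x800703E3 code from the error message seen as a signed 32-bit integer, which you can verify yourself:

int code = unchecked((int)0x800703E3); // equals -2147023901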

Hope this helps you.

Implementing OData in ASP.Net API

OData is a protocol that allows creating custom queries against simple REST services. Using OData query parameters you can filter, sort or transform the output you’re getting to fit your needs, without any implementation changes on the API side. It sounds groundbreaking and innovative, and it actually is, but it’s not a new thing – Microsoft introduced it in 2007.

Is it a lot of work to introduce OData to existing API?

No! It is surprisingly easy. Let’s try it on a simple WebApi controller in the ASP.NET framework. First you need to install a nuget package: Microsoft.Data.OData. Let’s say we have such a REST API controller:

public class FoldersController : ApiController
{
    private IFoldersAndFilesProvider _provider;

    public FoldersController(IFoldersAndFilesProvider provider)
    {
        _provider = provider;
    }

    [Route("api/Folders")]
    public IHttpActionResult GetFolders()
    {
        var folders = _provider.GetFolders();
        return Ok(folders);
    }
}

This is a very simple controller that returns a list of folders in a tree structure. To make this endpoint OData friendly, all we need to do is change the endpoint’s attributes and return an IQueryable result.

[Route("odata/Folders")]
[EnableQuery]
public IQueryable<Folder> GetFolders()
{
    var folders = _provider.GetFolders();
    return folders.AsQueryable();
}

And this is it! So…

Let’s see some magic

A plain old REST endpoint would return all folders, but with OData we can query that output.

http://localhost:51196/odata/Folders

Will return the same full result.

http://localhost:51196/odata/Folders?$orderby=Size

This query will sort the output by folder size.

http://localhost:51196/odata/Folders?$top=5

This returns only the first five results.

http://localhost:51196/odata/Folders?$skip=10&$top=5

Or skip and top can be combined to return a partial result, or even do paging.

http://localhost:51196/odata/Folders?$filter=Folders/all(folder: folder/Size ge 10000)

A more complex query can get folders whose subfolders are all above a certain size.

http://localhost:51196/odata/Folders?$filter=Folders/all(f: f/Hidden eq false)

Or only those where none of the subfolders are hidden.

You can find more examples like this here.

Not only getting data

OData is perfect for querying data, but it can also be used for adding, updating, patching and deleting entities. In Visual Studio you can add an ODataController and it will prepare a controller for you with pre-generated CRUD operations that you can use.

odata-add-controller
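If you go the ODataController route, you also need to register an EDM model and an OData route. A sketch of what that registration typically looks like with the Microsoft.AspNet.OData package (details differ between versions, so treat it as a starting point, not the exact code for this sample):

// in WebApiConfig.Register(HttpConfiguration config)
var builder = new ODataConventionModelBuilder();
builder.EntitySet<Folder>("Folders");
config.MapODataServiceRoute(
    routeName: "odata",
    routePrefix: "odata",
    model: builder.GetEdmModel());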

There are good developer articles about OData here.

This post is just scratching the surface. Microsoft’s implementation offers a lot, but only a subset of OData features. Work on this subject seems to have stopped a few years ago, but there’s new hope on the horizon: Microsoft is working on OData support for .Net Core APIs. You can track progress in this github repository. And here you can find some guidelines on how to start using this new package.

What can I use it for

OData offers query options for simple REST APIs that would normally require a lot of developer work to handle all the cases. In my opinion OData is perfect for scenarios where you serve data to many clients that need different data. It could be a perfect API for automation tests, which can fetch the data they need at the moment, without hardcoding it. It can also be a nice addition to APIs that you don’t intend to maintain actively.

All code posted here is also available in my github repo here.

Getting started with CosmosDB in Azure with .NET Core

CosmosDB is Microsoft’s new way of storing data in the cloud, compared to the good old MSSQL Server. It is a globally distributed, multi-model database. An interesting fact is that it offers multiple models of storing data: key-value, column-family, documents and graph, as shown in this picture:

azure-cosmos-db

Image from https://docs.microsoft.com/en-us/azure/cosmos-db/media/introduction/

First you need a Cosmos DB account

Create a Cosmos DB account, then go to the Keys tab – you will need the PrimaryKey and EndpointUri.

cosmos-db-keys

Now go to Data Explorer and create a database and collection. I created a Documents database and a Messages collection.

cosmos-db-data-explorer

Connecting to Cosmos DB

I’m developing my app in .NET Core, and for that I need to install the Microsoft.Azure.DocumentDB.Core nuget package. Then I created a DocumentDbService class that will connect the application to the Cosmos DB API.

public class DocumentDbService
{
    private const string DatabaseName = "Documents";

    private const string CollectionName = "Messages";

    public async Task SaveDocumentAsync(DocumentDto document)
    {
        try
        {
            // note: for production use, Microsoft recommends creating the DocumentClient once and reusing it
            var client = new DocumentClient(new Uri(ConfigurationHelper.GetCosmosDbEndpointUri()), ConfigurationHelper.GetCosmosDbPrimaryKey());
            await client.UpsertDocumentAsync(UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName), document);
        }
        catch (Exception e)
        {
            Console.WriteLine("Error: {0}, Message: {1}", e.Message, e.GetBaseException().Message);
        }
    }
}

The code above will create a new document in the Documents database and Messages collection. The ConfigurationHelper class is just a static class that returns the EndpointUri and PrimaryKey as strings, so you can just paste them there directly.
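For reference, a minimal version could be as simple as this, with your own values pasted from the Keys tab:

public static class ConfigurationHelper
{
    // values copied from the Azure portal, Keys tab of the Cosmos DB account
    public static string GetCosmosDbEndpointUri() => "https://your-account.documents.azure.com:443/";

    public static string GetCosmosDbPrimaryKey() => "your-primary-key";
}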

DocumentDto is just a simple object that will be saved as json:

public class DocumentDto
{
    public string StockId { get; set; }

    public string Name { get; set; }

    public float Price { get; set; }

    public DateTime UpdatedAt { get; set; }
}

In order to use it in ASP.NET Core I created a controller:

public class MessagesController : Controller
{
    [HttpPost]
    public async Task<IActionResult> Save([FromBody]SendMessageDto message)
    {
        try
        {
            var document = new DocumentDto
            {
                StockId = message.StockId,
                Name = message.Name,
                Price = message.Price,
                UpdatedAt = DateTime.UtcNow
            };

            await new DocumentDbService().SaveDocumentAsync(document);

            return StatusCode(200);
        }
        catch (Exception e)
        {
            Console.WriteLine(e);
            return StatusCode(500, e.Message);
        }
    }
}

Usage is very simple – it creates a DocumentDto and stores it in the Cosmos DB database. To see the result you need to go to Azure’s Data Explorer and fetch Messages like in the screen above.

Getting data from Cosmos DB with the SQL API

Microsoft’s new storage API has the ability to store data in multiple formats. Let’s try getting the latest updates from the Messages collection. In the DocumentDbService class we need a piece of code to get the data:

public IQueryable<DocumentDto> GetLatestDocuments()
{
    try
    {
        var client = new DocumentClient(new Uri(ConfigurationHelper.GetCosmosDbEndpointUri()), ConfigurationHelper.GetCosmosDbPrimaryKey());
        return client.CreateDocumentQuery<DocumentDto>(
            UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName),
            "SELECT * FROM Messages ORDER BY Messages.UpdatedAt desc",
            new FeedOptions { MaxItemCount = 10 });
    }
    catch (Exception e)
    {
        Console.WriteLine("Error: {0}, Message: {1}", e.Message, e.GetBaseException().Message);
        return null;
    }
}

This is where the magic happens. As you can see I used a plain old SQL query as if Messages were a table, but instead I queried JSON documents that do not necessarily need to have an UpdatedAt field.

Code in the controller is very simple.

[HttpGet]
public IQueryable<DocumentDto> GetTenLatestUpdates()
{
    try
    {
        var documents = new DocumentDbService().GetLatestDocuments();

        return documents;
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        return null;
    }
}

Notice that the GetTenLatestUpdates controller method returns the IQueryable interface, which on the web will be presented as JSON, but there is also a way to efficiently filter that data with OData.

Sending an Azure Service Bus message in ASP.NET Core

ASP.NET Core is an open-source web framework that everyone is so excited about recently. There are some good reasons to be excited: the ability to run on Windows, macOS and Linux, the ability to host a website in IIS, Nginx, Apache or Docker, and it’s fast.

Can it be used for Service Bus scenarios?

Yes, it certainly can. Let’s create a project that will send a Service Bus message triggered by a web request. I’ll create the simplest ASP.NET Core Web Application on the .Net Core 2.0 framework.

net-core-create-new-api

Now let’s create a helper class to connect to Service Bus.

IMPORTANT: Install the Microsoft.Azure.ServiceBus nuget package instead of WindowsAzure.ServiceBus, which will not work with .NET Core.

My ServiceBusHelper class looks like this:

public class ServiceBusHelper
{
    public static QueueClient GetQueueClient(ReceiveMode receiveMode = ReceiveMode.ReceiveAndDelete)
    {
        const string queueName = "stockchangerequest";
        var queueClient = new QueueClient(ConfigurationHelper.ServiceBusConnectionString(), queueName, receiveMode, GetRetryPolicy());
        return queueClient;
    }

    public static TopicClient GetTopicClient(string topicName = "stockupdated")
    {
        var topicClient = new TopicClient(ConfigurationHelper.ServiceBusConnectionString(), topicName, GetRetryPolicy());
        return topicClient;
    }

    private static RetryExponential GetRetryPolicy()
    {
        return new RetryExponential(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(30), 10);
    }
}

The Microsoft.Azure.ServiceBus nuget package differs just a bit from WindowsAzure.ServiceBus: for creating a topic client you won’t use a static CreateFromConnectionString method, but rather the TopicClient constructor, where you can directly pass a custom retry policy.

You probably noticed that I created a ConfigurationHelper class to read values from config. To have a connection string to your bus in a file, add an appsettings.json file to your project. Also set its properties to Content and Copy if newer. This way it will be copied to the server when the project is deployed. My configuration file looks like this:

{
    "ServiceBusConnectionString":
      "Endpoint=sb://bialecki.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[removedForSafety]"
}

And ConfigurationHelper class looks like this:

public static class ConfigurationHelper
{
    private static string connection;

    public static string ServiceBusConnectionString()
    {
        if (string.IsNullOrWhiteSpace(connection))
        {
            connection = GetServiceBusConnectionString();
        }

        return connection;
    }

    private static string GetServiceBusConnectionString()
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json");

        var config = builder.Build();

        var value = config.GetValue<string>("ServiceBusConnectionString");
        return value;
    }
}

All the code needed to connect to the Service Bus is complete – congrats :)

However, our job is not yet done. I mentioned earlier that I want to send messages to the bus triggered by a web request. To achieve it I need a controller:

public class MessagesController : Controller
{
    [HttpPost]
    public async Task<IActionResult> Send([FromBody]SendMessageDto message)
    {
        try
        {
            var topicClient = ServiceBusHelper.GetTopicClient();
            await topicClient.SendAsync(new Message(Encoding.UTF8.GetBytes(message.Value)));

            return StatusCode(200);
        }
        catch (Exception e)
        {
            Console.WriteLine(e);
            return StatusCode(500, e.Message);
        }
    }
}

public class SendMessageDto
{
    public string Value { get; set; }
}

Notice that there is no ApiController. In .NET Core there is only one Controller base class, which can both handle API logic and return JSON, and serve views for a web page.

In order for routing to work I also added some code in the Startup class.

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseMvc(routes =>
    {
        routes.MapRoute("default", "api/{controller=Home}/{action=Index}/{id?}");
    });
}

Publishing to Azure App Service

Since it is much better to test an app online instead of locally, I published it with Azure App Service. It’s a powerful tool for deploying and scaling web, mobile and API apps running on any platform. Microsoft also says that it ensures performance, scalability and security, but the most important thing for me is that you can deploy your app within a couple of clicks from Visual Studio.

net-core-publish-to-azure-service-app

Now I can test my app by making a POST request like this:

net-core-sending-request-to-app-service
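In code, the same test request could be made like this (a sketch; replace the host with your own App Service name):

var client = new HttpClient();
var response = await client.PostAsync(
    "http://<your-app>.azurewebsites.net/api/Messages/Send",
    new StringContent("{\"value\": \"Hello from ASP.NET Core\"}", Encoding.UTF8, "application/json"));
// with the route from Startup, api/Messages/Send maps to MessagesController.Send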

And the response is 200! To sum up:

  • there is no problem with writing code in .Net Core or .Net Standard using Azure Service Bus
  • Code already written for the regular .Net Framework will not work, but it’s not a big job to make it compatible
  • Working with event hubs and relays will require installing separate nuget packages

To read more about the Azure Service Bus nuget package go to this announcement.

All code published here can be found in my public github repo.

How to prevent an ASP.Net API from going to sleep

Web Api is a framework for building HTTP based services, especially RESTful APIs. It is designed around request-triggered actions, where the system is ready to do some work when a request comes in. If it’s not asked, it shouldn’t do anything, and after a while it will go to sleep.

The problem starts when you’re planning to execute some background jobs. Even if you start a background thread on application start, it will go to sleep along with the application. For actions executed at pre-scheduled intervals I would strongly suggest using an external tool, like Hangfire. For other things, like listening for messages, you can configure your asp.net application properly.

What you need to set in IIS:

  • In Application Pool for your app, set startMode to AlwaysRunning
  • In Application Pool for your app, set idleTimeout to 0
  • In your app, set preloadEnabled to True

This of course can also be set in applicationHost.config.

You can find your applicationHost.config file in the C:\Windows\System32\inetsrv\config\ directory. Then you need to edit:

<applicationPools>
    <add name="StockExample" autoStart="true" managedRuntimeVersion="v4.0" startMode="AlwaysRunning">
        <processModel idleTimeout="00:00:00" />
    </add>
</applicationPools>

<site name="StockExample.Api" id="12006" serverAutoStart="true">
    <application path="/" applicationPool="StockExample">
        <virtualDirectory path="/" physicalPath="C:\sources\blog\StockExample" />
    </application>
    <applicationDefaults preloadEnabled="true" />
</site>