.NET Core Storage: cloud or file system storage made easy


If you have ever dealt with file storage during cloud development, you are probably familiar with solutions like Azure Blob Storage or Amazon S3. They are powerful services for storing large numbers of files in an efficient, scalable and reliable way.

However, they come with several constraints:

  • An internet connection or an emulator is required during development.
  • Their APIs are not as simple as filesystem ones for basic operations.
  • Your product becomes cloud dependent.
  • It often strongly ties your solution to a specific cloud provider.

It's a trap

Introducing GeekLearning's storage abstraction

Since I started working with the early ASP.NET Core alphas, I have maintained closed-source helpers around Azure Storage.
Their first goal was to provide a simplified API. After a few months of copy-pasting them into every single project, I decided it was time to turn them into a library.

Here are the objectives of this library:

  • Making offline development as simple as possible by providing a full-fledged file system based implementation.
  • Switching providers through configuration alone.
  • Allowing users to easily migrate to or from Azure with a simple azcopy operation.

I've made good progress towards this, and you can find version 0.3.0 on NuGet.

Features:

  • Read
  • Write
  • Delete
  • List files (including globbing support)
  • System Metadata (partial support for FileSystem)
  • Custom Metadata (Azure only)

This is all open source, and you can find it on GitHub!

Hello storage

We recommend using configuration to define your stores. You can, for instance, add a Storage section to your appsettings.json file:

 "Storage": {
    "Stores": {
      "LocalAssets": {
        "Provider": "FileSystem",
        "Parameters": {
          "Path": "LocalAssets"
        }
      },
      "SharedAssets": {
        "Provider": "Azure",
        "Parameters": {
          "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=account;AccountKey=skjfkjdfkj",
          "Container": "SharedAssets"
        }
      }
    }
  }

You also need to register the storage providers with your dependency injection container, typically in your Startup class's ConfigureServices method:

services.AddStorage()
        .AddAzureStorage()
        .AddFileSystemStorage(HostingEnvironment.ContentRootPath)
        .AddFileSystemStorageServer(options =>
        {
            options.SigningKey = signingKey;
            options.BaseUri = new Uri("http://localhost:11149/");
        });

// Bind the "Storage" section of appsettings.json to StorageOptions.
services.Configure<StorageOptions>(Configuration.GetSection("Storage"));

Then all you need to do is inject an IStorageFactory and retrieve the IStore you want to work with:

public class SampleController : Controller
{
    private IStore sharedAssets;

    public SampleController(IStorageFactory storageFactory)
    {
        this.sharedAssets = storageFactory.GetStore("SharedAssets");
    }
}
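
As an illustration of the write support mentioned in the feature list, here is a minimal sketch of an action saving an uploaded file to that store. The SaveAsync overload used here is an assumption on my part; check the API reference for the exact signatures available in your version.

[HttpPost("summaries")]
public async Task<IActionResult> UploadSummary(IFormFile file)
{
    // Stream the uploaded file into the "SharedAssets" store.
    // SaveAsync(stream, path, contentType) is assumed; the actual
    // overloads may differ between versions.
    using (var stream = file.OpenReadStream())
    {
        await this.sharedAssets.SaveAsync(stream, $"summaries/{file.FileName}", file.ContentType);
    }

    return Ok();
}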

Listing files is as trivial as this:

await this.sharedAssets.ListAsync("summaries", "*.txt", recursive: true, withMetadata: false);
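
If you capture the result, you can enumerate the returned entries. Here is a minimal sketch, assuming each entry exposes a Path property (check the API reference for the exact members):

var files = await this.sharedAssets.ListAsync("summaries", "*.txt", recursive: true, withMetadata: false);

// Each entry is an IFileReference; Path is assumed to expose its
// location within the store.
foreach (var file in files)
{
    Console.WriteLine(file.Path);
}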

The store returns a list of IFileReference objects exposing various methods and properties, allowing you to read, edit or delete files and their metadata. For instance, the following statement reads the content of summaries/don-quijote.txt:

var summary = await this.sharedAssets.GetAsync("summaries/don-quijote.txt");
return await summary.ReadAllTextAsync();
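
Deleting a file follows the same pattern. A short sketch, assuming IFileReference exposes a DeleteAsync method (as suggested by the delete feature listed above):

// Fetch a reference to the file, then remove it from the store.
// DeleteAsync on the file reference is an assumption; a
// DeleteAsync(path) overload on the store itself may also exist.
var summary = await this.sharedAssets.GetAsync("summaries/don-quijote.txt");
await summary.DeleteAsync();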

The complete API reference can be found here

What's next?

There are a few things planned for the future, such as:

  • An Amazon S3 provider.
  • Concurrency improvements: cloud providers offer interesting solutions to handle concurrency, and these deserve to be supported!
  • File share support, enabling NAS-based storage.
  • FileSystem Metadata store.

I hope you enjoyed this article and found it useful. I'd love to hear your thoughts in the comments section. We are looking forward to your feedback and suggestions. Thanks for reading! And may code be with you.