Using POCOs when persisting to Azure Table Storage
I'm planning to use Azure Table Storage in my ASP.NET 5 (MVC 6) app and have added the WindowsAzure.Storage NuGet package, but I got really disappointed when I noticed that all my entity models need to inherit from Microsoft.WindowsAzure.Storage.Table.TableEntity. Now I'm thinking the best solution is to have two sets of entities and create mappings between my main domain objects and the entity objects used to persist to Table Storage. I don't want to add the WindowsAzure.Storage package to all my projects.

The deprecated azure-sdk-for-net got support for POCOs at one point, but I don't see this in the current WindowsAzure.Storage.

What's the best practice here?

Heavyhearted answered 25/11, 2015 at 9:0 Comment(3)
For Table Storage the class needs to inherit from TableEntity, so you definitely need the storage namespace. You can think of the table entity like an EDMX entity and write a mapper class that converts between the table entity and plain POCO model classes, though you can also just use a model class that inherits from TableEntity. – Tyrosine
Yes, exactly Maesh! But this is what I want to avoid. Of course I need a reference to WindowsAzure.Storage in the project where I implement the persistence mechanism, but I keep my model objects in a different project/assembly. I want those model objects to be clean POCOs, not tied in any way to TableEntity or any other library. – Heavyhearted
Agree with OP. I think that it's ridiculous that something that "stores structured NoSQL data in the cloud" actually requires structured data! – Fulminant

You can get away from inheriting from TableEntity, but to do so you end up writing some mapping code. In the code that actually interacts with Table Storage, you can map between the raw table data and your object using DynamicTableEntity, which gives you complete control over serialization.

There are a few articles that may help you out:

The second article shows what the code looks like for a specific POCO object being saved and updated in Azure Table Storage. The third article builds on the first to add ETag support.
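A minimal sketch of what such mapping code can look like (the `Customer` type, its key choices, and the mapper class here are hypothetical examples, not taken from the articles):

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical domain POCO -- no TableEntity inheritance, no SDK reference needed
// in the assembly that defines it.
public class Customer
{
    public string Region { get; set; }
    public string Email { get; set; }
    public string Name { get; set; }
    public int Visits { get; set; }
}

// Lives in the persistence project; this is the only place that knows about the SDK.
public static class CustomerMapper
{
    // Map the POCO to a DynamicTableEntity, choosing PartitionKey and RowKey explicitly.
    public static DynamicTableEntity ToTableEntity(Customer c)
    {
        var entity = new DynamicTableEntity(c.Region, c.Email);
        entity.Properties["Name"] = EntityProperty.GeneratePropertyForString(c.Name);
        entity.Properties["Visits"] = EntityProperty.GeneratePropertyForInt(c.Visits);
        return entity;
    }

    // Map the stored entity back to the clean POCO.
    public static Customer FromTableEntity(DynamicTableEntity e)
    {
        return new Customer
        {
            Region = e.PartitionKey,
            Email = e.RowKey,
            Name = e.Properties["Name"].StringValue,
            Visits = e.Properties["Visits"].Int32Value ?? 0
        };
    }
}
```

Only the persistence assembly needs the WindowsAzure.Storage package; the domain model stays a clean POCO, which is exactly what the question is after.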

Juvenile answered 25/11, 2015 at 12:40 Comment(0)

You haven't given much detail about the kind of entities you're trying to write to Azure Table Storage. However, if your entities contain nested complex properties, and you want to write the entire object graph including those complex nested properties (which may themselves contain nested properties), none of the suggested solutions will work.

I came across a similar problem and implemented a generic object flattener/recomposer API that flattens your complex entities into flat EntityProperty dictionaries, making them writable to Table Storage in the form of DynamicTableEntity.

The same API then recomposes the entire complex object back from the EntityProperty dictionary of the DynamicTableEntity.

Have a look at: https://www.nuget.org/packages/ObjectFlattenerRecomposer/

I am working with the Azure team to integrate this API into the Azure Storage SDK. You can have a look at the pull request and the code here:

https://github.com/Azure/azure-storage-net/pull/337/commits

Usage:

// Flatten an object of type Order and convert it to an EntityProperty dictionary
Dictionary<string, EntityProperty> flattenedProperties = EntityPropertyConverter.Flatten(order);

// Create a DynamicTableEntity and set its partition key and row key
DynamicTableEntity dynamicTableEntity = new DynamicTableEntity(partitionKey, rowKey);
dynamicTableEntity.Properties = flattenedProperties;

// Write the DynamicTableEntity to Azure Table Storage using the client SDK

// Read the entity back from Azure Table Storage as a DynamicTableEntity using the same PK and RK
DynamicTableEntity entity = [Read from Azure using the PK and RK];

// Convert the DynamicTableEntity back to the original complex object
Order order = EntityPropertyConverter.ConvertBack<Order>(entity.Properties);

That's all :)

The latest version of the NuGet package also supports IEnumerable, ICollection, etc. property types.

The .NET Core version of the package is here: https://www.nuget.org/packages/ObjectFlattenerRecomposer.Core/

The CosmosDB Table API version of the package is here: https://www.nuget.org/packages/ObjectFlattenerRecomposer.CosmosDb.Table.Core/

Memoirs answered 28/1, 2016 at 18:13 Comment(2)
Links to MSDN Flatten and ConvertBack are broken. New link: learn.microsoft.com/en-us/dotnet/api/… – Honourable
I removed the obsolete links from the answer. The version merged into the Azure Storage SDK is an older version of my NuGet package. I added links to the .Core and CosmosDb.Table API versions on NuGet. – Memoirs

I made libraries that do just this:

TableStorage.Abstractions.TableEntityConverters converts POCOs to DynamicTableEntity objects and vice versa. It has functionality that lets you specify the partition key, row key, and ignored fields.

TableStorage.Abstractions.POCO builds on top of this and on a Table Storage repository library (TableStorage.Abstractions). Combined, they give you a fairly easy way to perform CRUD operations on Table Storage using POCOs.
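If it helps, usage looks roughly like this. This is a sketch based on my reading of the library's README; the `Employee` type is a hypothetical example, and the exact extension-method overloads may differ from what's shown, so treat the signatures as assumptions:

```csharp
using Microsoft.WindowsAzure.Storage.Table;
using TableStorage.Abstractions.TableEntityConverters;

// Hypothetical domain POCO -- no TableEntity inheritance.
public class Employee
{
    public string Company { get; set; }
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Demo
{
    public static void Run()
    {
        var employee = new Employee { Company = "Contoso", Id = 42, Name = "Pat" };

        // Choose which POCO properties become PartitionKey and RowKey.
        DynamicTableEntity entity = employee.ToTableEntity(e => e.Company, e => e.Id);

        // ...write/read via the storage SDK, then convert back:
        Employee roundTripped =
            entity.FromTableEntity<Employee, string, int>(e => e.Company, e => e.Id);
    }
}
```

The appeal of the expression-based approach is that the key mapping lives in one place in the persistence layer, and the POCO itself never learns about PartitionKey or RowKey.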

Verbose answered 7/1, 2019 at 18:19 Comment(0)