I'm writing code that traverses large amounts of picture data and prepares one big delta block containing all of it, compressed, for sending.
Here's a sample of what this data looks like:
using System;
using MessagePack;
using Microsoft.EntityFrameworkCore;

[MessagePackObject]
public class Blob : VersionEntity
{
    [Key(2)]
    public Guid Id { get; set; }

    [Key(3)]
    public DateTime CreatedAt { get; set; }

    [Key(4)]
    public string Mediatype { get; set; }

    [Key(5)]
    public string Filename { get; set; }

    [Key(6)]
    public string Comment { get; set; }

    [Key(7)]
    public byte[] Data { get; set; }

    [Key(8)]
    public bool IsTemporarySmall { get; set; }
}

public class BlobDbContext : DbContext
{
    public DbSet<Blob> Blob { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Blob>().HasKey(o => o.Id);
    }
}
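VersionEntity is left out above. Since the keys on Blob start at [Key(2)], assume the base class owns keys 0 and 1; the sketch below is only a placeholder to make the sample self-contained, and the property names are not the real ones:

// Placeholder base class, not the real VersionEntity; it stands in for
// whatever members own MessagePack keys 0 and 1 in the actual model.
[MessagePackObject]
public abstract class VersionEntity
{
    [Key(0)]
    public long Version { get; set; }

    [Key(1)]
    public DateTime ModifiedAt { get; set; }
}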
When working with this, I process everything into a FileStream, and I want to keep as little as possible in memory at any given time.
Is it enough to do it like this?
foreach (var b in context.Blob.Where(o => somefilters).AsNoTracking())
    MessagePackSerializer.Serialize(stream, b);
Will this still fill up memory with all the blob records, or will they be processed one by one as I iterate the enumerator? I'm not calling ToList, only enumerating, so Entity Framework should be able to stream the results as it goes, but I'm not sure that's what it actually does.
Can any Entity Framework experts give some guidance on how to handle this properly?
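For reference, here is a fuller sketch of the write path I have in mind. The WriteDelta helper, the CreatedAt filter, the output path and the GZip wrapper are placeholders standing in for my real filters and compression, not the actual code:

using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using MessagePack;
using Microsoft.EntityFrameworkCore;

public static class DeltaWriter
{
    // Streams matching blobs out of the database and appends each record,
    // MessagePack-serialized, to a gzip-compressed delta file.
    public static void WriteDelta(BlobDbContext context, DateTime changedSince, string deltaFilePath)
    {
        using var file = File.Create(deltaFilePath);
        using var gzip = new GZipStream(file, CompressionLevel.Fastest);

        // AsNoTracking keeps the change tracker from holding on to every entity;
        // enumerating the query (no ToList) reads rows from the data reader one by one.
        var query = context.Blob
            .AsNoTracking()
            .Where(b => b.CreatedAt >= changedSince);

        foreach (var blob in query)
        {
            // Each record goes straight into the compressed stream, so ideally only
            // the current Blob (and its Data array) is materialized at any time.
            MessagePackSerializer.Serialize(gzip, blob);
        }
    }
}

The intent is that each Blob is read, serialized into the compressed stream, and then becomes eligible for collection before the next row is materialized; I'd like to confirm that EF Core actually behaves this way here.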
(…where does stream come from?) Finally, handling SQL Server FILESTREAM data in a fast, streaming fashion requires a different approach that is beyond EF. – Diction