Yes, MARS still has its place in modern data access frameworks, because it provides an efficient solution to two general querying problems: streaming (i.e. not buffering) (1) data with eagerly loaded related data collections and (2) lazily loaded related data of any type.
In both cases, executing a query is expected to produce an `IEnumerator<T>` (or its async counterpart), which is the object equivalent of a data reader (or a database forward-only, read-only cursor). So each `MoveNext{Async}` call should map to a `Read{Async}` call on the data reader and is expected to provide one fully populated `T`, without buffering ahead all the others. To achieve that, the underlying data reader must be kept open during the enumeration and closed when it completes or is aborted early (for instance, by `FirstOrDefault()`) - one of the reasons `IEnumerator<T>` is `IDisposable`.
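For illustration only, here is a rough sketch (not EF Core's actual internals) of what such a mapping looks like: an enumerator whose `MoveNext` advances an open `DbDataReader` by one row and whose `Dispose` closes it. The `_materialize` delegate is a hypothetical row-to-entity mapper.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Data.Common;

// Minimal streaming enumerator over an open DbDataReader (illustrative sketch only).
class ReaderEnumerator<T> : IEnumerator<T>
{
    private readonly DbDataReader _reader;
    private readonly Func<DbDataReader, T> _materialize; // hypothetical row-to-entity mapper

    public ReaderEnumerator(DbDataReader reader, Func<DbDataReader, T> materialize)
    {
        _reader = reader;
        _materialize = materialize;
    }

    public T Current { get; private set; } = default!;
    object IEnumerator.Current => Current!;

    // Each MoveNext advances the reader by exactly one row - nothing is buffered ahead,
    // so the reader stays open (active) for the whole enumeration.
    public bool MoveNext()
    {
        if (!_reader.Read()) return false;
        Current = _materialize(_reader);
        return true;
    }

    public void Reset() => throw new NotSupportedException();

    // Disposing the enumerator (end of foreach, FirstOrDefault(), etc.) closes the reader.
    public void Dispose() => _reader.Dispose();
}
```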
Now imagine what happens if you have lazy loading enabled. You get some entity and access a navigation property. This triggers a lazy load operation, which of course needs to execute a reader to get the data from the database. Since the outer reader is still open (active), without MARS this operation will simply fail with a runtime exception. Not good. And there is nothing you or the framework can do about it, other than either buffering everything in advance (basically switching to snapshot mode) or not using lazy loading.
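As a hypothetical example (the `Blog` / `Posts` model is made up purely for illustration), with lazy-loading proxies enabled:

```csharp
// 'db' is some DbContext; Blog.Posts is a lazily loaded collection navigation.
foreach (var blog in db.Blogs) // the outer reader stays open while we iterate
{
    // Accessing the navigation triggers a lazy load, i.e. a second command
    // on the same connection. Without MARS, SqlClient throws an exception
    // along the lines of "There is already an open DataReader associated
    // with this Connection which must be closed first."
    var postCount = blog.Posts.Count;
}
```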
Let's say you don't use lazy loading (it's not recommended anyway). But your entities contain related data collections and you want to eager load them. SQL in relational databases produces flat result sets, i.e. it does not support "nested collections" inside a query result set. So how do you stream such data?
There are basically two solutions.
The first is based on a single SQL query which contains all master + correlated table columns and returns a sort of hybrid record set where some fields apply to one part of the result and the others are null. This approach is used by EF6 and EF Core 3.0+, while EF Core 1.x/2.x uses the other approach, and EF Core 5.0 lets you choose between the two. Why? Because when you have several sub-collections, this tends to produce quite inefficient queries (both in execution and in processing the result set, since it transfers a lot of unnecessary data).
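In EF Core 5.0+ this translation can be requested explicitly; a sketch with hypothetical `Blog` / `Posts` / `Tags` entities:

```csharp
using Microsoft.EntityFrameworkCore; // Include / AsSingleQuery extension methods

// EF Core 5.0+: explicitly request the single-query (JOIN based) translation -
// one SQL statement carrying both the master and the correlated table columns
// (the "hybrid records" described above).
var blogs = db.Blogs
    .Include(b => b.Posts)
    .Include(b => b.Tags)
    .AsSingleQuery()
    .ToList();
```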
The second uses separate queries - one for the main result set and one for each correlated sub-collection. The idea is simple. Since usually both PKs and FKs are indexed, the database can efficiently return the rows ordered by these columns using the index (which is needed for `join` operations anyway), and then they can easily be merged client side by reading ahead (buffering) at most one record.
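The split-query counterpart in EF Core 5.0+ (same hypothetical model) looks like this - one SQL statement for the main set plus one per included collection, with the results merged client side:

```csharp
using Microsoft.EntityFrameworkCore; // Include / AsSplitQuery extension methods

// EF Core 5.0+: request the split-query translation instead.
var blogs = db.Blogs
    .Include(b => b.Posts)
    .Include(b => b.Tags)
    .AsSplitQuery()
    .ToList();
```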
Sounds good, doesn't it? With one small but important caveat - it needs MARS! Otherwise it has to switch to buffering mode, which totally defeats the idea of `IEnumerator` and, in the async version, the cancellation concept. You can see the effect of the latter in my answer to How can I abort a running EF Core Query using GetAsyncEnumerator?, where at the end the suggestion was to enable MARS.
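For reference, with SQL Server MARS is enabled through the connection string; a minimal sketch (server and database names are placeholders):

```csharp
using Microsoft.Data.SqlClient;

// Either put the keyword directly in the connection string...
var connectionString =
    "Server=.;Database=Blogging;Integrated Security=true;MultipleActiveResultSets=True";

// ...or set it via the builder.
var builder = new SqlConnectionStringBuilder(connectionString)
{
    MultipleActiveResultSets = true
};
```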
For more info about EF Core querying, see the Split queries and How Queries Work sections (and basically the whole Query data section) of the official EF Core documentation.
Side note: A separate connection isn't really an option, especially if one needs transaction isolation levels like repeatable read. MARS provides exactly the abstraction needed over a connection. And AFAIK inside a stored procedure one can open as many cursors simultaneously as they want, hence I'm not sure what exactly the problem is at the ADO connection layer and why MARS is considered an optional feature that needs enabling, rather than just out-of-the-box functionality. The ORM, though, could try using a separate connection behind the scenes where applicable. EF Core currently doesn't.
So, to recap briefly: if you don't use lazy loading (likely) and don't have correlated collections (unlikely - one-to-many is quite common, and a correlated collection doesn't necessarily mean a navigation property and `Include` - the same applies to projections to lists and similar members), then you don't need MARS. Otherwise, better enable it - it's a feature, so use it.