A difficult challenge. If I need to do massive queries, I normally break the query into chunks: I create a stored procedure that takes parameters @ChunkNumber, @ChunkSize and @TotalChunks, and run the query only for records from (@ChunkNumber - 1) * @ChunkSize + 1 to @ChunkNumber * @ChunkSize. In your Delphi code, simply run a loop like this (pseudocode):
for (Chunk = 1 to TotalChunks)
{
    DataTableResults = sp_SomeProcedure @ChunkNumber = Chunk,
                                        @ChunkSize   = ChunkSize
    RenderTableToClient(DataTableResults)
}
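Here is a minimal runnable sketch of that loop in Python. It is an illustration only: sqlite3 stands in for SQL Server, the table name `records` is made up, and a plain `LIMIT`/`OFFSET` query stands in for the chunked stored procedure.

```python
import sqlite3

# Demo setup: an in-memory table with 250 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records (payload) VALUES (?)",
                 [(f"row-{i}",) for i in range(1, 251)])

CHUNK_SIZE = 100
total = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
total_chunks = (total + CHUNK_SIZE - 1) // CHUNK_SIZE  # ceiling division

rendered = []
for chunk in range(1, total_chunks + 1):
    # Equivalent to running the SP for rows
    # (chunk - 1) * CHUNK_SIZE + 1 .. chunk * CHUNK_SIZE.
    rows = conn.execute(
        "SELECT id, payload FROM records ORDER BY id LIMIT ? OFFSET ?",
        (CHUNK_SIZE, (chunk - 1) * CHUNK_SIZE),
    ).fetchall()
    rendered.append(len(rows))  # stand-in for RenderTableToClient(rows)

print(total_chunks, rendered)  # → 3 [100, 100, 50]
```

The `ORDER BY` matters: paging with an offset only gives stable, non-overlapping chunks if the ordering is deterministic.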
This way, say you have 10,000 records and a chunk size of 100: that gives 100 SP calls, and you can render each chunk as it arrives, so the user sees the table updating progressively.
The limitation is queries that need to process all records in one hit first, e.g. a GROUP BY. SQL Server supports OFFSET/FETCH (since SQL Server 2012), which you can combine with this approach to get something useful.
I use this for queries over about 800K records that take about 10 minutes to run. What I do is chunk up the source tables and then run the query per chunk: e.g. if a users table has 1M records and you want a query showing the total pages accessed per hour, you can chunk the users up, run the aggregate for each chunk of users only, and combine the partial results.
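That idea of chunking the source table by key range can be sketched as follows. Again this is an illustration under assumptions, not the author's actual code: sqlite3 stands in for SQL Server, and the `page_hits` table, column names, and chunk size are invented for the demo.

```python
import sqlite3
from collections import Counter

# Demo setup: 6 users, each with one hit in hour 0 and two in hour 1.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_hits (user_id INTEGER, hour INTEGER)")
conn.executemany("INSERT INTO page_hits VALUES (?, ?)",
                 [(u, h) for u in range(1, 7) for h in (0, 1, 1)])

USER_CHUNK = 2  # users per chunk (tiny for the demo)
totals = Counter()
for lo in range(1, 7, USER_CHUNK):
    hi = lo + USER_CHUNK - 1
    # Aggregate only the users in this key range. Adding the partial
    # counts is safe because the chunks partition the user_id space
    # and COUNT/SUM are additive across partitions.
    for hour, hits in conn.execute(
        "SELECT hour, COUNT(*) FROM page_hits "
        "WHERE user_id BETWEEN ? AND ? GROUP BY hour", (lo, hi)):
        totals[hour] += hits

print(dict(totals))  # → {0: 6, 1: 12}
```

Note this combining step only works for additive aggregates like COUNT and SUM; something like a median would still need all rows in one pass.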
Sorry I don't have Delphi-specific code examples, but I hope this suggestion leads you in a positive direction.