Does SQL Server support an in-memory database?
I basically have a testing scenario where I:

  1. Create database
  2. Fill it with some data
  3. Execute the business logic to be tested, which modifies the data. I don't own the business logic implementation or the DB schema. I must test what is already there.
  4. Assert that the data have changed as expected
  5. Delete the database

Does SQL Server support something like that in memory only, so that I can speed it up by skipping steps #1 and #5? Obviously I don't need persistence.

Luralurch asked 8/3, 2019 at 8:21 Comment(19)
I don't know if SQL Server has an in-memory-only version. But you may look into using a cache layer on top of SQL Server. Or, for a pure in-memory database, read about things like H2.Panoply
H2 does not help, because I don't own the existing implementation of the business logicLuralurch
That's the wrong question. SQL Server has had in-memory tables since SQL Server 2014. All versions have temporary tables and table variables. All of those things would cover the bullet points. None of these is relevant for testing though (except perhaps temporary tables)Duumvir
@PanagiotisKanavos I recant my earlier comment, but for integration testing, an actual SQL Server instance is precisely what the OP should be running those tests against. For unit testing, the database can be mocked.Panoply
@TimBiegeleisen: FYI, it is not unit testing. I'm testing really complicated workflows. As far as I know, in-memory tables and temporary tables are not helpful in my case. The DB schema already exists and I don't own it. I could at best execute some additional scriptsLuralurch
Yes, in-memory tables are helpful here. Just create a new test database with the same schema as what you have in production, and point your tests to it.Panoply
@Luralurch why not just create and fill the database each time you want it then? If you don't want to write the script (why?), restore it from a backupDuumvir
In-memory tables have their own limitations and don't offer all features. A simple DROP DATABASE and RESTORE would work just fine though. Using a script would work just fine too. In fact, even if it were in-memory, you wouldn't be able to tell the difference until the server restartedDuumvir
I do create and fill the database each time using the existing scripts. It's just that I can't change the scripts. However, restoring from a backup might be a good idea. I just wanted in-memory to make it much fasterLuralurch
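
A minimal sketch of that restore-based reset, assuming a template backup is taken once after the existing scripts have created and filled the database (the database name and backup path are placeholders):

-- One-off: take a template backup once the schema and test data are in place
BACKUP DATABASE [TestDb] TO DISK = N'D:\Backups\TestDb_template.bak' WITH INIT;
GO
-- Before each test run: kick out other connections and restore the template over the old copy
IF DB_ID(N'TestDb') IS NOT NULL
    ALTER DATABASE [TestDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE [TestDb] FROM DISK = N'D:\Backups\TestDb_template.bak' WITH REPLACE;
GO
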
@Liero, in fact, you could create the tables in tempdb and all of them would be gone the next time you restarted the server. An "in-memory database" though would require its own drop database statement to get deleted without a restartDuumvir
This is only half of the problem. You really should have a separate database environment for testing, versus QA and production. The test database can lean more on in-memory features and be optimized for small data sets and speed.Panoply
@Luralurch if you use EF Core you could use its in-memory and SQLite providers for low-fidelity tests or to "mock" the database without mocking the DbContext - in-memory is just a dictionary so it can't enforce unique constraints, while SQLite's syntax is very limited compared to SQL ServerDuumvir
@TimBiegeleisen as long as it's a separate database with a small dataset, in-memory isn't that important. Cleaning up leftover test databases though could be an issueDuumvir
@TimBiegeleisen: true, but this is not about test vs production environments. I have one legacy application. Now I'm developing another application that allows users to specify test cases (data to be filled in the DB) and test the logic in the legacy applicationLuralurch
@PanagiotisKanavos: I will investigate SQLite, but I doubt the tested BL implementation is compatible with SQLiteLuralurch
@Luralurch all of those features can work for what you want. You don't really need an in-memory database. After all, to compare new vs old outcomes the easiest way would be to join tables between the two databases and compare the data. What you ask is neither unit nor integration testingDuumvir
@Luralurch in fact, such scenarios appear in production as well - you need to test your data and compare it against known problems or expected results all the time, especially in integration scenarios. Just because you passed all your unit and integration tests doesn't mean you won't get any dirty data or inputs in production that end up causing problemsDuumvir
@Paul compared to SSD storage? There's really no need for in-memory in the OP's scenario. Just a way to quickly reset the databaseDuumvir
@Luralurch another possibility would be to create the test database, detach it, and use the files as a template. Each time you want a new test database, copy them to a new location followed by a CREATE DATABASE ... FOR ATTACH. This will be faster than RESTORE because it won't have a recovery phaseDuumvir
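
Roughly, that copy/attach template approach could look like this (file paths and database names are made up; the actual file copy happens outside T-SQL, e.g. in the test harness):

-- One-off: detach the prepared template database so its files can be copied freely
EXEC sp_detach_db @dbname = N'TestTemplate';
GO
-- Per test run: copy the .mdf/.ldf files to a new location, then attach the copy as a fresh database
CREATE DATABASE [TestRun01]
    ON (FILENAME = N'D:\TestCopies\TestRun01.mdf'),
       (FILENAME = N'D:\TestCopies\TestRun01_log.ldf')
    FOR ATTACH;
GO
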
SQL Server has In-Memory OLTP. This feature is quite close to what you are looking for. Starting with SQL Server 2016 SP1 it is available in all editions, including SQL Server Express.

Obviously I don't need persistence

The option DURABILITY=SCHEMA_ONLY persists only the table metadata, not the data. Such a scenario can be optimal for staging tables because it gives a lock-, latch-, and log-free way of loading data. Obviously, the data will not survive an instance/service restart.

CREATE DATABASE [Test]
GO
-- Add a memory-optimized filegroup; the FILENAME points to a folder (container), not a single file
ALTER DATABASE [Test] ADD FILEGROUP [MemoryOptimizedFG] CONTAINS MEMORY_OPTIMIZED_DATA
ALTER DATABASE [Test] ADD FILE (name='Test1', filename='D:\SQLData\TestInMemory') TO FILEGROUP [MemoryOptimizedFG]
GO

USE [Test]
GO
-- Memory-optimized, schema-only table: rows live only in memory and are never written to disk
CREATE TABLE dbo.[TestTable] (
    ID int NOT NULL IDENTITY (1, 1) PRIMARY KEY NONCLUSTERED,
    ColumnID int NOT NULL,
    Description varchar(100) NOT NULL,
    dateAndTime datetime NOT NULL
) WITH (MEMORY_OPTIMIZED=ON, DURABILITY=SCHEMA_ONLY)
GO


Spearhead answered 8/3, 2019 at 8:40 Comment(5)
This is almost what I'm looking for, but it doesn't help me with cleaning up the data after the test is finished, so I need to delete and recreate the DB anywayLuralurch
@Liero, for deletion and recreation you have steps 1 and 5 in your original testing scenario ;)Spearhead
@Luralurch what's the problem with that? What are you trying to avoid? DROP & CREATE don't take that long. Why would you need a transient database for this?Duumvir
Perhaps you will want to keep an empty database, but delete/recreate the in-memory objects in itSpearhead
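
For example, something along these lines could reset the data between tests while keeping the database and schema, using the table from the answer above (in practice every memory-optimized table would need the same treatment):

-- Reset between tests: with DURABILITY=SCHEMA_ONLY this is cheap, since nothing is logged to disk
DELETE FROM dbo.[TestTable];
-- or, if the existing scripts recreate the objects anyway, drop them entirely (SQL Server 2016+)
DROP TABLE IF EXISTS dbo.[TestTable];
GO
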
@PanagiotisKanavos: Not really a problem, but if there were a real in-memory DB, it would make things much simpler and faster (there are a lot of tests). Since there isn't, I will probably go with this approach. However, the DB scripts are given and I don't want to modify them. Let me check if I can ALTER TABLE WITH MEMORY_OPTIMIZED=ONLuralurch
