Get all keys in Redis database with python
Asked Answered
B

6

130

There is a post about a Redis command to get all available keys, but I would like to do it with Python.

Any way to do this?

Between answered 7/3, 2014 at 16:34 Comment(0)
M
188

Use scan_iter()

scan_iter() is superior to keys() for large numbers of keys because it gives you an iterator you can use rather than trying to load all the keys into memory.

I had 1B records in my Redis database and could never get enough memory to return all the keys at once.

SCANNING KEYS ONE-BY-ONE

Here is a python snippet using scan_iter() to get all keys from the store matching a pattern and delete them one-by-one:

import redis
r = redis.StrictRedis(host='localhost', port=6379, db=0)
for key in r.scan_iter("user:*"):
    # delete the key
    r.delete(key)

SCANNING IN BATCHES

If you have a very large list of keys to scan - for example, more than 100k keys - it will be more efficient to scan them in batches, like this:

import redis
from itertools import izip_longest  # Python 2; renamed to zip_longest in Python 3

r = redis.StrictRedis(host='localhost', port=6379, db=0)

# iterate a list in batches of size n
def batcher(iterable, n):
    args = [iter(iterable)] * n
    return izip_longest(*args)

# in batches of 500 delete keys matching user:*
for keybatch in batcher(r.scan_iter('user:*'), 500):
    # izip_longest pads the last batch with None; drop the padding before deleting
    r.delete(*filter(None, keybatch))

I benchmarked this script and found that a batch size of 500 was 5 times faster than scanning keys one-by-one. I tested different batch sizes (3, 50, 500, 1000, 5000) and found that 500 seems to be optimal.

Note that whether you use the scan_iter() or keys() method, the operation is not atomic and could fail part way through.

DEFINITELY AVOID USING XARGS ON THE COMMAND-LINE

I do not recommend this example I found repeated elsewhere. It will fail for unicode keys and is incredibly slow for even moderate numbers of keys:

redis-cli --raw keys "user:*"| xargs redis-cli del

In this example xargs creates a new redis-cli process for every key! That's bad.

I benchmarked this approach: it was 4 times slower than the first Python example, which deletes keys one-by-one, and 20 times slower than deleting in batches of 500.

Monophonic answered 8/12, 2015 at 21:50 Comment(11)
I keep getting "redis.exceptions.ResponseError: unknown command 'SCAN'" when iterating over r.scan_iter(). Any idea why? I haven't found an answer yet.Tupper
@Tupper Your version of redis is too old, install a new one.Cleodel
@Cleodel Well, I haven't upgraded my redis but your guess seems obviously right!Tupper
@Tupper It's not a guess. I had the same problem, upgrade solved that. Can't remember the version that I had that didn't support the SCAN, but it was few years old. Any recent version of Redis should be OK.Cleodel
what is user:* for?Smew
@LeiYang redis search allows globs/wildcards. So "mykey*", "user_*", "user:*". redis.io/commands/keysMonophonic
@PatrickCollins any idea of how to pass a codec while reading ?Agro
izip_longest was renamed to zip_longest in Python 3 #38635310Tessler
The "scanning in batches" section here is misleading here. You've probably got a better performance but it's not related to fetching of the keys which is what that question is about. The better performance that you've got is probably from deleting the keys in batches instead of 1 by 1.Nasalize
AVOID RUNNING THIS AS IS!!! It deletes all the Redis keys. I have improved on this answer and added export to CSV of the keys and the values.Kareykari
Note that for true "scanning in batches" experience we have count argument of the scan_iter method: redis-py-doc.readthedocs.io/en/master/…Dinh
P
81

Yes, use keys() from the StrictRedis class:

>>> import redis
>>> r = redis.StrictRedis(host=YOUR_HOST, port=YOUR_PORT, db=YOUR_DB)
>>> r.keys()

Giving no pattern will fetch all of them. As per the linked page:

keys(pattern='*')

Returns a list of keys matching pattern

Palecek answered 7/3, 2014 at 16:35 Comment(8)
Be aware that the use of this command is discouraged on production servers. If you have a high number of keys, your Redis instance will not respond to any other request while processing this one, that may take a rather long time to complete.Felsite
Consider adding a reference to SCAN command as it is now a preferred way to get all keys with O(1) time complexity of each request. (and O(N) for all of the requests)Cremator
r.keys() is quite slow when you are trying to match a pattern and not just returning all keys. Consider using scan as suggested in the answer belowValrievalry
@KonstantineNikolaou I notified the OP and he gladly unaccepted my answer to accept the other one. Thanks for reporting, I had used this so long ago but I now lack the focus on the topic to check what is best.Palecek
@Palecek glad to hear that 😊Valrievalry
Don't use keys(). Use scan() instead. With the added benefit of pattern matching.Reeding
@SoroushParsa if you uphold the scan() option, then upvote the other answer. In fact, mine was the accepted one and I asked the OP to accept the other one. To me, downvoting this one per se doesn't really match the "this answer is not useful" thingie.Palecek
@fedorqui'SOstopharming' valid point. I did in fact upvote the accepted answer.Reeding
R
19
import redis
r = redis.Redis("localhost", 6379)
for key in r.scan_iter():
    print(key)

using the redis-py library

scan command

Available since 2.8.0.

Time complexity: O(1) for every call. O(N) for a complete iteration, including enough command calls for the cursor to return back to 0. N is the number of elements inside the collection.

Royceroyd answered 7/5, 2015 at 10:12 Comment(0)
T
3

I'd like to add some example code to go with Patrick's answer and others.
This shows results using both the keys() and scan_iter() techniques. Note that Python 3 uses zip_longest instead of izip_longest. The code below loops through all the keys and displays them; I set the batch size to 12 to keep the output small.

I wrote this to better understand how the batching of keys worked.

import redis
from itertools import zip_longest

# connection/building of my redisObj omitted here (assumes decode_responses=True so keys/values come back as str)

# iterate a list in batches of size n
def batcher(iterable, n):
    args = [iter(iterable)] * n
    return zip_longest(*args)
    
result1 = redisObj.get("TestEN")
print(result1)
result2 = redisObj.get("TestES")
print(result2)

print("\n\nLoop through all keys:")
keys = redisObj.keys('*')
counter = 0
print("len(keys)=", len(keys))
for key in keys:
    counter += 1
    print(counter, "key=" + key, " value=" + redisObj.get(key))

print("\n\nLoop through all keys in batches (using itertools)")
# loop through the keys in batches of 12
counter = 0
batch_counter = 0
print("Try scan_iter:")
for keybatch in batcher(redisObj.scan_iter('*'), 12):
    batch_counter += 1
    print(batch_counter, "keybatch=", keybatch)
    for key in keybatch:
        if key is not None:
            counter += 1
            print("  ", counter, "key=" + key, " value=" + redisObj.get(key))

Example output:

Loop through all keys:
len(keys)= 2
1 key=TestES  value=Ola Mundo
2 key=TestEN  value=Hello World


Loop through all keys in batches (using itertools)
Try scan_iter:
1 keybatch= ('TestES', 'TestEN', None, None, None, None, None, None, None, None, None, None)
   1 key=TestES  value=Ola Mundo
   2 key=TestEN  value=Hello World

Note that Redis commands are single-threaded, so doing a keys() can block other Redis activity. See the excellent post here that explains this in more detail: SCAN vs KEYS performance in Redis

Tessler answered 27/10, 2020 at 18:11 Comment(0)
N
2

An addition to the accepted answer above.

scan_iter can be used with a count parameter to tell Redis to examine a number of keys during a single iteration. This can speed up key fetching significantly, especially when used with a matching pattern and on big keyspaces.

Be careful, though, when using very high values for count, since that may ruin the performance of other concurrent queries.

Here's an article with more details and some benchmarks: https://docs.keydb.dev/blog/2020/08/10/blog-post/

Nasalize answered 22/12, 2020 at 9:32 Comment(0)
K
2

I have improved on Patrick's and Neal's code and added export to csv:

import csv
import redis
from itertools import zip_longest

redisObj = redis.StrictRedis(host='localhost', port=6379, db=0, decode_responses=True)
searchStr = ""

# iterate a list in batches of size n
def batcher(iterable, n):
    args = [iter(iterable)] * n
    return zip_longest(*args)

with open('redis.csv', 'w', newline='') as csvfile:
    fieldnames = ['key', 'value']
    writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
    writer.writeheader()

    print("\n\nLoop through all keys in batches (using itertools)")
    counter = 0
    batch_counter = 0
    print("Try scan_iter:")
    for keybatch in batcher(redisObj.scan_iter('*'), 500):
        batch_counter += 1
        #print(batch_counter, "keybatch=", keybatch)
        for key in keybatch:
            if key is not None:
                counter += 1
                val = ""
                if (searchStr in key):
                    valType = redisObj.type(key)
                    print(valType)
                    match valType:  # match/case requires Python 3.10+
                        case "string":
                            val = redisObj.get(key)
                        case "list":
                            valList = redisObj.lrange(key, 0, -1)
                            val = '\n'.join(valList)
                        case "set":
                            valList = redisObj.smembers(key)
                            val = '\n'.join(valList)
                        case "zset":
                            valDict = redisObj.zrange(key, 0, -1, False, True)
                            val = '\n'.join(['='.join(i) for i in valDict.items()])
                        case "hash":
                            valDict = redisObj.hgetall(key)
                            val = '\n'.join(['='.join(i) for i in valDict.items()])
                        case "stream":
                            val = ""
                        case _:
                            val = ""
                print("  ", counter, "key=" + key, " value=" + val)
                writer.writerow({'key': key, 'value': val})
Kareykari answered 2/6, 2022 at 7:9 Comment(0)