If you're dumping it iteratively, you'd have to read it iteratively as well.
You can run a loop (as the accepted answer shows) to keep unpickling rows until you reach the end-of-file (at which point an EOFError
is raised).
data = []
with open("data.pickle", "rb") as f:
    while True:
        try:
            data.append(pickle.load(f))
        except EOFError:
            break
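If you read files like this in more than one place, the same loop can be wrapped in a small generator. This is just a sketch; the helper name load_all is my own, not part of the pickle API:
import pickle

def load_all(path):
    # Yield each object that was written with a separate pickle.dump call.
    with open(path, "rb") as f:
        while True:
            try:
                yield pickle.load(f)
            except EOFError:
                break

# Usage: data = list(load_all("data.pickle"))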
Minimal Verifiable Example
import pickle

# Dumping step
data = [{'a': 1}, {'b': 2}]
with open('test.pkl', 'wb') as f:
    for d in data:
        pickle.dump(d, f)

# Loading step
data2 = []
with open('test.pkl', 'rb') as f:
    while True:
        try:
            data2.append(pickle.load(f))
        except EOFError:
            break

data2
# [{'a': 1}, {'b': 2}]

data == data2
# True
Of course, this is under the assumption that your objects have to be pickled individually. You can also store your data as a single list of objects, then use a single pickle/unpickle call (no need for loops).
data = [{'a': 1}, {'b': 2}]  # list of dicts as an example
with open('test.pkl', 'wb') as f:
    pickle.dump(data, f)

with open('test.pkl', 'rb') as f:
    data2 = pickle.load(f)

data2
# [{'a': 1}, {'b': 2}]
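One reason you might still prefer the iterative approach is that you can append more records to the same file later without rewriting it, since each dump is stored back-to-back. A minimal sketch reusing the test.pkl file from above:
import pickle

# Write two records, then append a third in a later session.
with open('test.pkl', 'wb') as f:
    pickle.dump({'a': 1}, f)
    pickle.dump({'b': 2}, f)

with open('test.pkl', 'ab') as f:  # append mode keeps the earlier records
    pickle.dump({'c': 3}, f)

# Read everything back with the same EOFError loop.
records = []
with open('test.pkl', 'rb') as f:
    while True:
        try:
            records.append(pickle.load(f))
        except EOFError:
            break

records
# [{'a': 1}, {'b': 2}, {'c': 3}]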