How to make a class JSON serializable
Asked Answered
B

45

1413

How to make a Python class serializable?

class FileItem:
    def __init__(self, fname):
        self.fname = fname

Attempt to serialize to JSON:

>>> import json
>>> x = FileItem('/foo/bar')
>>> json.dumps(x)
TypeError: Object of type 'FileItem' is not JSON serializable
Bundy answered 22/9, 2010 at 11:52 Comment(14)
It's unfortunate that the answers all seem to answer the question "How do I serialize a class?" rather than the actual question "How do I make a class serializable?" These answers assume that you're doing the serialization yourself, rather than passing the object along to some other module that serializes it.Encephalograph
If you're using Python3.5+, you could use jsons. It will convert your object (and all its attributes recursively) to a dict. import jsons see answer below - it works perfectly fineSatiable
@KyleDelaney I was really hoping for an interface/magic method I could implement to become searializable too. I guess I will have to implement a .to_dict() function or something which can be called on the object before it is passed to the module which tries to serialize it.Rosado
see https://mcmap.net/q/41907/-how-to-make-a-class-json-serializable for a start for a JSONAble mixinForeyard
@FelixB. You can use the built-in vars function in combination with json.dumps (see my answer https://mcmap.net/q/41907/-how-to-make-a-class-json-serializable)Gelderland
@FelixB. If it suits your use case, you could use a dataclass, which has it's own .__dict__Spermatozoon
It's amazing that in 11 years there has not been a single response that answers this question. OP states he wants to use json.dumps yet all the answers, including with the bounty awarded, involve creating a custom encoder, which dodges the point of the question entirely.Leora
This article explains the specific methods of making a class serializable, which is exactly what the question asked: pynative.com/make-python-class-json-serializableWohlen
@Leora a custom encoder is not required; a default hook - which is a simple parameter to json.dumps - suffices. One answer simply offers json.dumps(..., default=vars). There's also an answer that does work solely by modifying the class: specifically, it must be modified to subtype dict. Your assessment of the answers is simply off base.Fallfish
That said, this question serves as a canonical now, so it's entirely reasonable that it attracts answers that tell beginners the right thing(s) to do.Fallfish
First working answer I found to make any arbitrary class and all of its children serialisable was given by @R H.Sedge
This article as @AsifShiraz points out, really helped to pick a method to my needs. pynative.com/make-python-class-json-serializableKrysta
The whole point of JSON is its independence from custom code. Don't entangle data with code! Instead, make your class know how to translate to/from a nested builtin python object (of type Builtin[Builtin] where Builtin = List|Tuple|Dict|float|str|bool).Periodical
learning python and coming from something like typescript, it's hard to believe how lousy python is with json... You should be able to serialize pretty much anything, especially a dataclassMisti
B
719

Do you have an idea about the expected output? For example, will this do?

>>> f  = FileItem("/foo/bar")
>>> magic(f)
'{"fname": "/foo/bar"}'

In that case you can merely call json.dumps(f.__dict__).

If you want more customized output then you will have to subclass JSONEncoder and implement your own custom serialization.

For a trivial example, see below.

>>> from json import JSONEncoder
>>> class MyEncoder(JSONEncoder):
        def default(self, o):
            return o.__dict__    

>>> MyEncoder().encode(f)
'{"fname": "/foo/bar"}'

Then you pass this class to json.dumps() as the cls kwarg, along with the object to serialize:

json.dumps(f, cls=MyEncoder)

If you also want to decode then you'll have to supply a custom object_hook to the JSONDecoder class. For example:

>>> from json import JSONDecoder
>>> def from_json(json_object):
        if 'fname' in json_object:
            return FileItem(json_object['fname'])
>>> f = JSONDecoder(object_hook=from_json).decode('{"fname": "/foo/bar"}')
>>> f
<__main__.FileItem object at 0x9337fac>
>>> 
Berard answered 22/9, 2010 at 12:2 Comment(10)
Using __dict__ will not work in all cases. If the attributes have not been set after the object was instantiated, __dict__ may not be fully populated. In the example above, you're OK, but if you have class attributes that you also want to encode, those will not be listed in __dict__ unless they have been modified in the class' __init__ call or by some other way after the object was instantiated.Mallon
+1, but the from_json() function used as object-hook should have an else: return json_object statement, so it can deal with general objects as well.Inquisition
@KrisHardy __dict__ also doesn't work if you use __slots__ on a new style class.Amaurosis
You could use a custom JSONEncoder as above to create a custom protocol, such as checking for the existence of __json_serializable__ method and calling it to obtain a JSON serializable representation of the object. This would be in keeping with other Python patterns, like __getitem__, __str__, __eq__, and __len__.Clipping
am new to python...can we use these libraries (json or simplejson) to serialize/deserialize objects which are extensively used by python developers say pandas dataframe, series etc?Hierolatry
__dict__ also won't work recursively, e.g., if an attribute of your object is another object.Ishtar
JSONEncoder only calls the subclass's default method if the given data is non-standard.Ordinary
@Hierolatry use the df.to_json() method for Dataframe, Series, etcRosemarierosemary
As mentioned by @Amaurosis I ran into the problem of having used __slots__ in for some of my classes, but I simply wrote a similar utility type function to serialize to json with @property def as_json(self): return json.dumps({k:getattr(self, k) for k in self.__slots__ if type(getattr(self, k)) is not set}) where in my case the filter clause of is not set is because sets are not json serializable. Otherwise though this works when you use __slots__ and so don't have __dict__ to iterate over. In my use case I don't need the sets, they are computed by the constructorAbm
My comment above (apologizes for the formatting, can't really do a code block in a comment but this didn't seem worth another separate answer) doesn't address any concerns of recursion when class variables are themselves objects of other non-primitive classes. Just how to do the simple thing for simple class if you used __slots__ and don't have __dict__. Almost everything I find here is going after the __dict__ internal implementation of classes, but you don't have that if you predefine member variables with __slots__Abm
S
858

Here is a simple solution for a simple feature:

.toJSON() Method

Instead of a JSON serializable class, implement a serializer method:

import json

class Object:
    def toJSON(self):
        return json.dumps(
            self,
            default=lambda o: o.__dict__, 
            sort_keys=True,
            indent=4)

So you just call it to serialize:

me = Object()
me.name = "Onur"
me.age = 35
me.dog = Object()
me.dog.name = "Apollo"

print(me.toJSON())

will output:

{
    "age": 35,
    "dog": {
        "name": "Apollo"
    },
    "name": "Onur"
}
Stolen answered 21/3, 2013 at 2:26 Comment(19)
Very limited. If you have a dict {"foo":"bar","baz":"bat"}, that will serialize to JSON easily. If instead you have {"foo":"bar","baz":MyObject()}, then you cannot. The ideal situation would be that nested objects are serialized to JSON recursively, not explicitly.Prognathous
It will still work. You're missing o.__dict___. Try your own example: class MyObject(): def __init__(self): self.prop = 1 j = json.dumps({ "foo": "bar", "baz": MyObject() }, default=lambda o: o.__dict__)Angulo
Is this solution reversible? I.e. Is it easy to reconstruct the object from json?Banjermasin
@J.C.Leitão No. You could have two different classes with the same fields. Objects a and b of that class (probably with the same properties) would have the same a.__dict__ / b.__dict__.Cognac
However, you could probably add a field unjson_class or so which gives a hint which class the object originally had. Then you could use introspection of your module and use something like https://mcmap.net/q/41978/-how-to-create-a-class-instance-without-calling-initializer to get the object back.Cognac
This does not work with datetime.datetime instances. It throws the following error: 'datetime.datetime' object has no attribute '__dict__'Saintsimonianism
The following will work for anything that doesn't have a __dict__: def _try(o): try: return o.__dict__ except: return str(o) and def to_JSON(self): return json.dumps(self, default=lambda o: _try(o), sort_keys=True, indent=0, separators=(',',':')).replace('\n', '')Saintsimonianism
@J.C.Leitão yes, it is reversible. when you load the json you can add each key/value pair to the object dict property see #6579486Crozier
I must be missing something but that seems like it doesn't work (ie., json.dumps(me) doesn't call Object's toJSON method.Raven
You can use default=vars instead of default=lambda o: o.__dict__.Danelle
@Raven That is because he didn't make the class serializable, he just made a method that spits JSON String. This is not a proper answer for the question, it is more of a hack for special cases. But the correct one is above. If you would need YourObject getting serialized as a part/content of another ParentObject, you need to create an encoder.Ellis
The question specifically asks about making json.dumps(). Not how to implement all of JSON serialization yourself.Castleberry
744 people did not understand the questionLeora
I found this to be super useful for finding out what classes don't work (or just bypassing them). This assumes, of course, you don't care about missing some data. def default_serialize_func(o): if hasattr(o, 'dict'): return o.__dict__ return f"<could not serialize {o.__class__}>"Acronym
Wow, comments are supremely broken in formatting ability. Check out this gist for your eyes to stop bleeding: gist.github.com/terabyte/8b961c29eab7fd155ddbe115d7941326Acronym
I may be late but to serialize a datetime with classes in classes: json.dumps(myObject, default=lambda o: o.isoformat() if (isinstance(o, datetime.datetime)) else o.__dict__, sort_keys=True, indent=4) - Do not need to implement an additional function.Informality
Is there a way to make this callable for lists?Earvin
@OnurYıldırım I reiterate NdrsImpk's question. Is there a way to make it callable for a list of object, because with this, the only way to call it a list of object is to loop on it. This way it creates multiple json for each Object, but I would like one for all the Object (The list of Object)Architectural
What is the point of this answer when it doesn't include the reverse operation?Lycaon
H
265

For more complex classes you could consider the tool jsonpickle:

jsonpickle is a Python library for serialization and deserialization of complex Python objects to and from JSON.

The standard Python libraries for encoding Python into JSON, such as the stdlib’s json, simplejson, and demjson, can only handle Python primitives that have a direct JSON equivalent (e.g. dicts, lists, strings, ints, etc.). jsonpickle builds on top of these libraries and allows more complex data structures to be serialized to JSON. jsonpickle is highly configurable and extendable–allowing the user to choose the JSON backend and add additional backends.

Transform an object into a JSON string:

import jsonpickle
json_string = jsonpickle.encode(obj)

Recreate a Python object from a JSON string:

recreated_obj = jsonpickle.decode(json_string)

(link to jsonpickle on PyPi)

Hamper answered 23/12, 2011 at 9:11 Comment(16)
Coming from C#, this is what I was expecting. A simple one liner and no messing with the classes.Rosaline
jsonpickle is awesome. It worked perfectly for a huge, complex, messy object with many levels of classesBloodstained
is there an example of the proper way to save this to a file? The documentation only shows how to encode and decode a jsonpickle object. Also, this was not able to decode a dict of dicts containing pandas dataframes.Younger
@Younger you can use obj = jsonpickle.decode(file.read()) and file.write(jsonpickle.encode(obj)).Marvelmarvella
It works for me!. It is what I needed. I just wanted to print a behave scenario object.Alanalana
This is a great solution. Rather than building a complex JSON decoder extension for the json module, I was able to serialize my class with one line: s=jsonpickle.encode(self). Restoration was every bit as simple. This is the true Python spirit. +1Fulvia
Sadly, doesn't seem to work on a openpyxl spreadsheet. I'm trying to turn a spreadsheet into .json, write it to a file, which works great. Then read that .json file back in, decode and save back to Excel. Unfortunately it fails on decode with Cannot convert [] to Excel.Simpleminded
Just have to be careful that you don't expose yourself to RCE if deserializing data from untrusted sourcesChantell
As a side note, if using redis to stream, replace json.dumps() and json.loads() with jsonpickle.encode() and jsonpickle.decode(). You will save having to write tons of boiler-plate code!Rendezvous
Despite being useful, this is still a link-only answer.Autosuggestion
@Rosaline Coming from python, we aren't all that different. This solution does not require modification of the class-to-be-serialized, it is simple and it's naming conventions are good enough to self document what's going on. That's the correct answer IMO.Welch
I've had jsonpickle fail with decoding dataframes, crashing the kernel in jupyter. as it was a dataframe in a class, i therefore had to use this workaround in order to serialize the class github.com/jsonpickle/jsonpickle/issues/213Mardellmarden
@Simpleminded That is likely because the JSON decoder cannot tell between [] an empty spreadsheet object and [] an empty list. Not totally sure on that, but it would be my suspicionLycaon
Is there a reason why one would not want to use this as a solution in all cases? To me, this looks like a canonical, default, no-brainer, no-thinking required solution. In other words, something people should "just use" without further consideration. However, I am wondering if there might be a reason why you would not want to "just use this".Lycaon
@Lycaon it is inherently unsafe. Same as just pickle lib. If there is some Python code embedded in the JSON that's being decoded - it will be executed. That's the #1 thing to consider when picking this over custom decoders for jsonRefreshing
@Refreshing Right ok thanks that does make a lot of sense. Since reading these questions I sort of convinced myself that jsons might be the "no brainer" solution. (Maybe with some configuration options to remove private fields.)Lycaon
T
218

Most of the answers involve changing the call to json.dumps(), which is not always possible or desirable (it may happen inside a framework component for example).

If you want to be able to call json.dumps(obj) as is, then a simple solution is inheriting from dict:

class FileItem(dict):
    def __init__(self, fname):
        dict.__init__(self, fname=fname)

f = FileItem('tasks.txt')
json.dumps(f)  #No need to change anything here

This works if your class is just a basic data representation; for trickier things you can always set keys explicitly in the call to dict.__init__().

This works because json.dumps() checks if the object is one of several known types via a rather unpythonic isinstance(value, dict) - so it would be possible to fudge this with __class__ and some other methods if you really don't want to inherit from dict.
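To make the round trip concrete, here is a minimal sketch based on the class above; the keyword-expansion used to rebuild the object is one possible approach, not part of the answer:

```python
import json

class FileItem(dict):
    def __init__(self, fname):
        dict.__init__(self, fname=fname)

f = FileItem('tasks.txt')
s = json.dumps(f)  # works unchanged, because FileItem passes isinstance(value, dict)
print(s)           # {"fname": "tasks.txt"}

# One way to deserialize: expand the parsed dict back into the constructor
g = FileItem(**json.loads(s))
print(g['fname'])  # tasks.txt
```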

Tadeas answered 3/7, 2015 at 13:22 Comment(16)
This can really be a nice solution :) I believe for my case it is. Benefits: you communicate the "shape" of the object by making it a class with init, it is inherently serializable and it looks interpretable as repr.Sixtieth
Though "dot-access" is still missing :(Sixtieth
Ahh that seems to work! Thanks, not sure why this is not the accepted answer. I totally agree that changing the dumps is not a good solution. By the way, in most cases you probably want to have dict inheritance together with delegation, which means that you will have some dict type attribute inside your class, you will then pass this attribute as parameter as initialisation something like super().__init__(self.elements).Raven
In my use case I needed to store data that was "invisible" to json.dumps(), so I used this method. The class DictWithRider takes in an arbitrary object, stores it as a member, and makes it accessible via a function get_rider_obj() but does not pass it to dict.__init__(). So parts of the application who want to see the "hidden" data can call d.get_rider_obj() but json.dumps() sees basically an empty dict. As @Sixtieth mentioned, you can't access regular members with dot notation, but you can access functions.Rhyme
for simple use this is ideal. dot notation can be easily enabled by additional lines subsequent to dict.__init__( line as self.fname = fname and the object can be deserialised with f = FileItem(**json.loads(serialised_f))Seamy
I absolutely love this one for its simplicity. You have to design your classes completely (no monkey-patching), but it works really nicely, even if your class members are lists of objects.Antetype
for "dot-access" I recommend using properties instead of adding additional attributesKampmann
this solution's a bit hacky - for a true, production quality solution, replace json.dumps() and json.loads() with jsonpickle.encode() and jsonpickle.decode(). You will avoid having to write ugly boilerplate code, and most importantly, if you are able to pickle the object, you should be able to serialize it with jsonpickle without boilerplate code (complex containers/objects will just work).Rendezvous
@Rendezvous this answer addresses cases where you have no control over the code which calls json.dumps.Tadeas
It would be nice to explain why this works, though. What Python magic is being relied upon here, and can we just implement that directly by implementing a special dunder function?Shererd
@Mike'Pomax'Kamermans there's no magic here, Python or otherwise, we're just subclassing dict. That works because json.dumps uses JSONEncoder which does a rather unpythonic isinstance(value, dict) - so unfortunately merely implementing the required methods won't work here.Tadeas
That is literally the magic I was hoping to get explained. It would be quite useful to put the bit about it using isinstance, explicitly checking for instances of dict, in the post =) That said, there is some magic pertaining to the json default function, which you can override as in https://mcmap.net/q/41907/-how-to-make-a-class-json-serializable and actually works really well. (why this is not the default json encoder behaviour is a bit of a mystery, as "objects" in json are just the data aspect, so literally just the dict of instance values)Shererd
@Mike'Pomax'Kamermans A function in a library checking the type of an object is neither magic, python specific or pythonic. If there was python "magic" at play we probably wouldn't need this hack. Happy to update the answer to include the additional explanation however.Tadeas
Another alternative, which is perhaps better from an OOP / design perspective is rather than to inherit from a dict (because you probably are not really building a dict) instead use composition, and have the class wrap a single dict object. In other words, contain a single dict. Serialization and deserialization require two functions to serialize and deserialize the inner dict.Lycaon
Some may consider that this comes with it's own disadvantage. You will have to write get/set methods for each keyed value of the dict you want to expose get/set operations for. If your dict contains N keys, that's 2 N functions to maintain.Lycaon
surprise: collections.UserDict is not JSON serializable, also collections.UserList and collections.UserStringKalliekallista
G
194

As mentioned in many other answers you can pass a function to json.dumps to convert objects that are not one of the types supported by default to a supported type. Surprisingly none of them mentions the simplest case, which is to use the built-in function vars to convert objects into a dict containing all their attributes:

json.dumps(obj, default=vars)

Note that this covers only basic cases; if you need more specific serialization for certain types (e.g. excluding certain attributes, or handling objects that don't have a __dict__ attribute) you need to use a custom function or a JSONEncoder as described in the other answers.
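Because the default hook is invoked for every object the encoder cannot serialize on its own, this also handles nested objects. A minimal sketch (the Person and Dog classes are made up for illustration):

```python
import json

class Dog:
    def __init__(self, name):
        self.name = name

class Person:
    def __init__(self, name, dog):
        self.name = name
        self.dog = dog  # a nested, non-serializable object

p = Person("Onur", Dog("Apollo"))

# vars(obj) returns obj.__dict__; json.dumps applies the default
# hook recursively to anything it cannot encode itself
print(json.dumps(p, default=vars, sort_keys=True))
# {"dog": {"name": "Apollo"}, "name": "Onur"}
```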

Gelderland answered 21/10, 2020 at 18:40 Comment(6)
it is unclear what you mean by default=vars, does that mean that vars is the default serializer? If not: This does not really solve the case where you can not influence how json.dumps is called. If you simply pass an object to a library and that library calls json.dumps on that object, it doesn't really help that you have implemented vars if that library does not use dumps this way. In that sense it is equivalent to a custom JSONEncoder.Rosado
You are correct, it is nothing else than just a simple choice for a custom serializer and doesn't solve the case you describe. If I see it correctly there is no solution to the case were you don't control how json.dumps is invoked.Gelderland
For some objects, this approach will throw vars() argument must have __dict__ attributeRajah
Thanks for this, pretty straightforward to use with library that have proper definition built in.Mignonne
Any workaround for the error vars() argument must have __dict__ attribute?Filial
this does not work: vars() argument must have dict attributeMerocrine
W
103

Just add to_json method to your class like this:

def to_json(self):
  return self.message # or how you want it to be serialized

And add this code (from this answer), to somewhere at the top of everything:

from json import JSONEncoder

def _default(self, obj):
    return getattr(obj.__class__, "to_json", _default.default)(obj)

_default.default = JSONEncoder().default
JSONEncoder.default = _default

This will monkey-patch json module when it's imported, so JSONEncoder.default() automatically checks for a special to_json() method and uses it to encode the object if found.

Just like Onur said, but this time you don't have to update every json.dumps() in your project.
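Putting both pieces together, a self-contained sketch (the Message class is hypothetical, standing in for any class you add a to_json method to):

```python
import json
from json import JSONEncoder

# The patch from the answer: fall back to the original default
# unless the object's class provides a to_json method
def _default(self, obj):
    return getattr(obj.__class__, "to_json", _default.default)(obj)

_default.default = JSONEncoder().default
JSONEncoder.default = _default

# Any class with a to_json method now serializes through plain json.dumps,
# even when nested inside other structures
class Message:
    def __init__(self, message):
        self.message = message
    def to_json(self):
        return self.message

print(json.dumps({"payload": Message("hello")}))
# {"payload": "hello"}
```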

Wexler answered 4/8, 2016 at 10:27 Comment(3)
Big thanks! This is the only answer that allows me to do what I want: be able to serialize an object without changing the existing code. The other methods mostly do not work for me. The object is defined in a third-party library, and the serialization code is third-party too. Changing them will be awkward. With your method, I only need to do TheObject.to_json = my_serializer.Scab
This is the correct answer. I did a small variation: import json _fallback = json._default_encoder.default json._default_encoder.default = lambda obj: getattr(obj.__class__, "to_json", _fallback)(obj) Conventional
There's gotta be a better way than hacking the JSON encoder at the top of everything. Too brittle to be a reliable solutionAdlib
M
83

TLDR: copy-paste Option 1 or Option 2 below

A Full Answer to:
Making Pythons json module work with Your Class

AKA, solving: json.dumps({ "thing": YOUR_CLASS() })


Explanation:

  • Yes, a reliable solution exists
  • No, there is no python "official" solution
    • By official solution, I mean there is no way (as of 2024) to add a method to your class (like toJSON in JavaScript) and/or no way to register your class with the built-in json module. When something like json.dumps([1,2, your_obj]) is executed, python doesn't check a lookup table or object method.
    • I'm not sure why other answers don't explain this
    • The closest official approach is probably andyhasit's answer which is to inherit from a dictionary. However, inheriting from a dictionary doesn't work very well for many custom classes like AdvancedDateTime, or pytorch tensors.
  • The ideal workaround is this:
    • Add def __json__(self) method to your class
    • Mutate json.dumps to check for __json__ method (affects everywhere, even pip modules that import json)
    • Note: Modifying built-in stuff usually isn't great; however, this change should have no side effects, even if it's applied multiple times by different codebases. It is entirely reversible during runtime (if a module wants to undo the modification). And, for better or worse, it is the best that can be done at the moment.


Option 1: Let a Module do the Patching


pip install json-fix
(extended + packaged version of Fancy John's answer, thank you @FancyJohn)

your_class_definition.py

import json_fix

class YOUR_CLASS:
    def __json__(self):
        # YOUR CUSTOM CODE HERE
        #    you probably just want to do:
        #        return self.__dict__
        return "a built-in object that is naturally json-able"

That's it.


Example usage:

from your_class_definition import YOUR_CLASS
import json

json.dumps([1,2, YOUR_CLASS()], indent=0)
# '[\n1,\n2,\n"a built-in object that is naturally json-able"\n]'

To make json.dumps work for Numpy arrays, Pandas DataFrames, and other 3rd party objects, see the Module (only ~2 lines of code but needs explanation).




How does it work? Well...

Option 2: Patch json.dumps yourself


Note: this approach is simplified; it fails on known edge cases (e.g. if your custom class inherits from dict or another builtin), and it misses out on controlling the JSON behavior for external classes (numpy arrays, datetime, dataframes, tensors, etc.).

some_file_thats_imported_before_your_class_definitions.py

# Step: 1
# create the patch
from json import JSONEncoder
def wrapped_default(self, obj):
    return getattr(obj.__class__, "__json__", wrapped_default.default)(obj)
wrapped_default.default = JSONEncoder().default
   
# apply the patch
JSONEncoder.original_default = JSONEncoder.default
JSONEncoder.default = wrapped_default

your_class_definition.py

# Step 2
class YOUR_CLASS:
    def __json__(self, **options):
        # YOUR CUSTOM CODE HERE
        #    you probably just want to do:
        #        return self.__dict__
        return "a built-in object that is naturally json-able"
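A self-contained sketch combining both steps (the Point class is hypothetical):

```python
import json
from json import JSONEncoder

# Step 1: the patch, falling back to the original default
# unless the object's class provides a __json__ method
def wrapped_default(self, obj):
    return getattr(obj.__class__, "__json__", wrapped_default.default)(obj)
wrapped_default.default = JSONEncoder().default
JSONEncoder.default = wrapped_default

# Step 2: any class defining __json__ now works with plain json.dumps
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y
    def __json__(self, **options):
        return self.__dict__

print(json.dumps({"thing": Point(1, 2)}))
# {"thing": {"x": 1, "y": 2}}
```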

_

All other answers seem to be "Best practices/approaches to serializing a custom object"

Which is already covered in the docs (search "complex" for an example of encoding complex numbers)

Madalene answered 25/8, 2021 at 16:52 Comment(6)
It's a bit aggressive to modify json.dumps across the whole codebase, but this is clearly the nicest solution IMO.Elmira
Good solution. is there an equivalent for json.loads?Comstockery
Sadly no @Comstockery and there kind of fundamentally can't be; json-dumping is effectively a one-way operation. Ex: think of a BigInt class that converts itself to a string for json.dumps. Now consider a random string-value somewhere in a json file. Maybe that string-value contains all-digits, does that mean it should be loaded as a BigInt? What about strings that just coincidentally contain all-digits, but are supposed to remain as strings? There's no way that json.loads can know, so instead you have to do something like BigInt.from_json(a_str) with a string that you KNOW is should be a BigInt.Madalene
really, I'm new to python, but man, simple things like serialize/deserialize classes to JSON should be simple. So many answers and really no solution. why do people like python so much these days? I wonder how many successful production software are created with python. It's just a pain for every little things. I come from dotnet/javascript world.Merocrine
@JeffHykin That's where type information comes in. Surely the solution would be to use class methods or a class constructor. This provides type information to disambiguate the type.Lycaon
@Merocrine - well, javascript world manages to make many other things horrifying. And yes, Python is in widespread use in successful production software. Often Python works fine and much better than competition.Alroi
N
68

I like Onur's answer but would expand to include an optional toJSON() method for objects to serialize themselves:

def dumper(obj):
    try:
        return obj.toJSON()
    except:
        return obj.__dict__

print(json.dumps(some_big_object, default=dumper, indent=2))
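A sketch that narrows the except clause so the fallback only triggers on a missing attribute (class names here are illustrative):

```python
import json

def dumper(obj):
    try:
        return obj.toJSON()
    except AttributeError:  # falls back when obj has no toJSON (or toJSON itself raises AttributeError)
        return obj.__dict__

class WithHook:
    def toJSON(self):
        return {"kind": "custom"}

class Plain:
    def __init__(self):
        self.x = 1

print(json.dumps([WithHook(), Plain()], default=dumper, indent=2))
```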
Neutralization answered 27/1, 2015 at 16:4 Comment(6)
I found this to be the best balance between using the existing json.dumps and introducing custom handling. Thanks!Hare
I actually really like this; but rather than try-catch would probably do something like if 'toJSON' in obj.__attrs__():... to avoid a silent failure (in the event of failure in toJSON() for some other reason than it not being there)... a failure which potentially leads to data corruption.Impressionism
@Impressionism as I understand it, idomatic python asks for forgiveness, not permission, so try-except is the right approach, but the correct exception should be caught, an AttributeError in this case.Bahamas
@phil a few years older and wiser now, I'd agree with you.Impressionism
This really should be catching an AttributeError explicitlyShadwell
And what if AttributeError is raised inside obj.toJSON()?Coset
T
48

If you're using Python 3.5+, you could use jsons. (PyPI: https://pypi.org/project/jsons/) It will convert your object (and all its attributes recursively) to a dict.

import jsons

a_dict = jsons.dump(your_object)

Or if you wanted a string:

a_str = jsons.dumps(your_object)

Or if your class implemented jsons.JsonSerializable:

a_dict = your_object.json
Trigon answered 19/12, 2018 at 9:34 Comment(7)
If you are able to use Python 3.7+, I found that the cleanest solution to convert python classes to dicts and JSON strings (and viceversa) is to mix the jsons library with dataclasses. So far, so good for me!Gare
This is an external library, not built into the standard Python install.Burdett
only for class that has slots attributeThwack
You can, but you don't need to use slots. Only when dumping according to the signature of a specific class you'll need slots. In the upcoming version 1.1.0 that is also no longer the case.Trigon
This library is extremely slow in both deserialization/serialization, at least from personal testing. I'd suggest other ser libraries instead.Chevalier
How are you charting the JSON in the visuals/images? By the way this works. json.dumps(obj) then json.loads(obj).Mercie
One still-missing feature is an option for human consumption / readability. Pretty-printing / indenting is not yet implemented (at least not yet as of v1.6.1).Hirundine
V
44

Another option is to wrap JSON dumping in its own class:

import json

class FileItem:
    def __init__(self, fname):
        self.fname = fname

    def __repr__(self):
        return json.dumps(self.__dict__)

Or, even better, subclassing FileItem class from a JsonSerializable class:

import json

class JsonSerializable(object):
    def toJson(self):
        return json.dumps(self.__dict__)

    def __repr__(self):
        return self.toJson()


class FileItem(JsonSerializable):
    def __init__(self, fname):
        self.fname = fname

Testing:

>>> f = FileItem('/foo/bar')
>>> f.toJson()
'{"fname": "/foo/bar"}'
>>> f
'{"fname": "/foo/bar"}'
>>> str(f) # string coercion
'{"fname": "/foo/bar"}'
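Note that json.dumps(f) itself would still raise TypeError here; a minimal sketch bridging that gap with the default hook:

```python
import json

class JsonSerializable(object):
    def toJson(self):
        return json.dumps(self.__dict__)

class FileItem(JsonSerializable):
    def __init__(self, fname):
        self.fname = fname

f = FileItem('/foo/bar')
# json.dumps(f) would raise TypeError; route unknown objects through their instance dict
print(json.dumps(f, default=lambda o: o.__dict__))
# '{"fname": "/foo/bar"}'
```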
Vonnievonny answered 16/6, 2012 at 10:30 Comment(4)
Hi, I don't really like this "custom encoder" approach, it would be better if u can make your class json seriazable. I try, and try and try and nothing. Is there any idea how to do this. The thing is that json module test your class against built in python types, and even says for custom classes make your encoder :). Can it be faked? So I could do something to my class so it behave like simple list to json module? I try subclasscheck and instancecheck but nothing.Gerson
@ADRENALIN You could inherit from a primary type (probably dict), if all class attribute values are serializable and you don't mind hacks. You could also use jsonpickle or json_tricks or something instead of the standard one (still a custom encoder, but not one you need to write or call). The former pickles the instance, the latter stores it as dict of attributes, which you can change by implementing __json__encode__ / __json_decode__ (disclosure: I made the last one).Connected
That doesn't make the object serializable for the json module. It only provides a method to get a JSON string returned (trivial). Thus json.dumps(f) will fail. That's not what's been asked.Plate
I don't see the point of this. You just created a class JsonSerializable which you are using as an interface to provide default implementations of toJson ?Lycaon
T
35

I came across this problem the other day and implemented a more general version of an Encoder for Python objects that can handle nested objects and inherited fields:

import json
import inspect

class ObjectEncoder(json.JSONEncoder):
    def default(self, obj):
        if hasattr(obj, "to_json"):
            return self.default(obj.to_json())
        elif hasattr(obj, "__dict__"):
            d = dict(
                (key, value)
                for key, value in inspect.getmembers(obj)
                if not key.startswith("__")
                and not inspect.isabstract(value)
                and not inspect.isbuiltin(value)
                and not inspect.isfunction(value)
                and not inspect.isgenerator(value)
                and not inspect.isgeneratorfunction(value)
                and not inspect.ismethod(value)
                and not inspect.ismethoddescriptor(value)
                and not inspect.isroutine(value)
            )
            return self.default(d)
        return obj

Example:

class C(object):
    c = "NO"
    def to_json(self):
        return {"c": "YES"}

class B(object):
    b = "B"
    i = "I"
    def __init__(self, y):
        self.y = y
        
    def f(self):
        print("f")

class A(B):
    a = "A"
    def __init__(self):
        self.b = [{"ab": B("y")}]
        self.c = C()

print(json.dumps(A(), cls=ObjectEncoder, indent=2, sort_keys=True))

Result:

{
  "a": "A", 
  "b": [
    {
      "ab": {
        "b": "B", 
        "i": "I", 
        "y": "y"
      }
    }
  ], 
  "c": {
    "c": "YES"
  }, 
  "i": "I"
}
Tear answered 18/2, 2016 at 14:10 Comment(1)
Although this is a bit old..I'm facing some circular imports error. So instead of return obj in the last line I did this return super(ObjectEncoder, self).default(obj). Reference HEREKatey
S
18

A really simplistic one-liner solution

import json

json.dumps(your_object, default=vars)

The end!

What follows is a quick test.

import json
from dataclasses import dataclass


@dataclass
class Company:
    id: int
    name: str

@dataclass
class User:
    id: int
    name: str
    email: str
    company: Company


company = Company(id=1, name="Example Ltd")
user = User(id=1, name="John Doe", email="[email protected]", company=company)


json.dumps(user, default=vars)

Output:

{
  "id": 1, 
  "name": "John Doe", 
  "email": "[email protected]", 
  "company": {
    "id": 1, 
    "name": "Example Ltd"
  }
}
Simonasimonds answered 19/12, 2022 at 12:5 Comment(4)
I worked for me! In my opinion, it is the best solution so far.Shut
This is by far the best solution.Mamie
Nice clean one-liner \o/Wandering
default=lambda __o: __o.__dict__ can be replaced by default=vars, as mentioned here. Also see python docs.Edraedrea
L
15
import simplejson

class User(object):
    def __init__(self, name, mail):
        self.name = name
        self.mail = mail

    def _asdict(self):
        return self.__dict__

print(simplejson.dumps(User('alice', '[email protected]')))

if using standard json, you need to define a default function

import json
def default(o):
    return o._asdict()

print(json.dumps(User('alice', '[email protected]'), default=default))
Lukash answered 17/6, 2015 at 3:17 Comment(1)
I simplifed this by removing the _asdict function with a lambda json.dumps(User('alice', '[email protected]'), default=lambda x: x.__dict__)Kirbie
C
8

json is limited in the objects it can print, and jsonpickle (you may need a pip install jsonpickle) can't indent its output. If you would like to inspect the contents of an object whose class you can't change, I still couldn't find a straighter way than:

 import json
 import jsonpickle
 ...
 print(json.dumps(json.loads(jsonpickle.encode(obj)), indent=2))

Note: object methods still can't be printed this way.

Chenay answered 4/4, 2016 at 13:41 Comment(0)
H
7

The simplest answer

import json

class Object(dict):
    def __init__(self):
        pass

    def __getattr__(self, key):
        return self[key]

    def __setattr__(self, key, value):
        self[key] = value

# test
obj = Object()
obj.name = "John"
obj.age = 25
obj.brothers = [ Object() ]
text = json.dumps(obj)

Now it gives you the output, don't change anything to json.dumps(...)

'{"name": "John", "age": 25, "brothers": [{}]}'
Hydroelectric answered 19/3, 2023 at 6:9 Comment(3)
Step by step: 1. Make sure your object inherits dict. 2. Add the __getattr__, so that json.dumps can access your attributes. 3. Add the __setattr__, so that when you add your own props, they are added to the dictionary.Sociometry
This is a nice answer if you really can't change json.dumps by injecting a custom serializer.Sociometry
Best answer so far, unless someone find an annotation to avoid having to repeat the above to all classes, or creating a base class with those 2 methods? If you want to avoid inheriting from dict, you can use self.__dict__[key]Anagnos
F
6

To throw another log on this 11 year old fire, I want a solution that meets the following criteria:

  • Allows an instance of class FileItem to be serialized using only json.dumps(obj)
  • Allows FileItem instances to have properties: fileItem.fname
  • Allows FileItem instances to be given to any library which will serialise it using json.dumps(obj)
  • Doesn't require any other fields to be passed to json.dumps (like a custom serializer)

IE:

fileItem = FileItem('filename.ext')
assert json.dumps(fileItem) == '{"fname": "filename.ext"}'
assert fileItem.fname == 'filename.ext'

My solution is:

  • Have obj's class inherit from dict
  • Map each object property to the underlying dict
class FileItem(dict):
    def __init__(self, fname):
        self['fname'] = fname

    #fname property
    fname: str = property()
    @fname.getter
    def fname(self):
        return self['fname']

    @fname.setter
    def fname(self, value: str):
        self['fname'] = value

    #Repeat for other properties

Yes, this is somewhat long-winded if you have lots of properties, but it is JSON serializable, it behaves like an object, and you can give it to any library that's going to json.dumps(obj) it.
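If the per-property boilerplate gets heavy, a hedged sketch of the same dict-subclass idea with generic attribute forwarding (illustrative, not a drop-in for every class):

```python
import json

class FileItem(dict):
    def __init__(self, fname):
        super().__init__(fname=fname)

    # forward attribute reads/writes to the underlying dict,
    # so no per-field property definitions are needed
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        self[key] = value

fileItem = FileItem('filename.ext')
assert json.dumps(fileItem) == '{"fname": "filename.ext"}'
assert fileItem.fname == 'filename.ext'
```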

Forthcoming answered 23/9, 2021 at 0:41 Comment(3)
Just an FYI, the type of fname isn’t str just because it’s a property that evaluates to a str at runtime. if you need to annotate fname as you have shown it should likely be something like fname: property = property(); the type of the getter and setter methods is typing.MethodWrapperType – and you can annotate fname.getter(…) to show a str return type.Kentledge
Inheriting from dict works. I then added __getattr__ and __setattr__ methods on my class so that it will use the dict values for any undefined attributes.Rhodia
I don't understand the line fname: str = property(). Isn't this acting like a static variable? (Not sure what Python calls them - a non member variable in other OO languages.)Lycaon
A
5

Here is my 3 cents ...
This demonstrates explicit json serialization for a tree-like python object.
Note: If you actually wanted some code like this you could use the twisted FilePath class.

import json, sys, os

class File:
    def __init__(self, path):
        self.path = path

    def isdir(self):
        return os.path.isdir(self.path)

    def isfile(self):
        return os.path.isfile(self.path)

    def children(self):        
        return [File(os.path.join(self.path, f)) 
                for f in os.listdir(self.path)]

    def getsize(self):        
        return os.path.getsize(self.path)

    def getModificationTime(self):
        return os.path.getmtime(self.path)

def _default(o):
    d = {}
    d['path'] = o.path
    d['isFile'] = o.isfile()
    d['isDir'] = o.isdir()
    d['mtime'] = int(o.getModificationTime())
    d['size'] = o.getsize() if o.isfile() else 0
    if o.isdir(): d['children'] = o.children()
    return d

folder = os.path.abspath('.')
json.dump(File(folder), sys.stdout, default=_default)
Atonality answered 10/7, 2013 at 17:59 Comment(0)
D
5

This class can do the trick: it converts an object to standard JSON.

import json


class Serializer(object):
    @staticmethod
    def serialize(obj):
        return json.dumps(obj, default=lambda o: list(o.__dict__.values())[0])

usage:

Serializer.serialize(my_object)

Works in Python 2.7 and Python 3.

Dorton answered 9/10, 2016 at 14:14 Comment(1)
I liked this method the most. I ran into issues when trying to serialize more complex objects whos members/methods aren't serializable. Here's my implementation that works on more objects: ``` class Serializer(object): @staticmethod def serialize(obj): def check(o): for k, v in o.__dict__.items(): try: _ = json.dumps(v) o.__dict__[k] = v except TypeError: o.__dict__[k] = str(v) return o return json.dumps(check(obj).__dict__, indent=2) ```Seafaring
C
5

jaraco gave a pretty neat answer. I needed to fix some minor things, but this works:

Code

# Your custom class
class MyCustom(object):
    def __json__(self):
        return {
            'a': self.a,
            'b': self.b,
            '__python__': 'mymodule.submodule:MyCustom.from_json',
        }

    to_json = __json__  # supported by simplejson

    @classmethod
    def from_json(cls, json):
        obj = cls()
        obj.a = json['a']
        obj.b = json['b']
        return obj

# Dumping and loading
import simplejson

obj = MyCustom()
obj.a = 3
obj.b = 4

json = simplejson.dumps(obj, for_json=True)

# Two-step loading
obj2_dict = simplejson.loads(json)
obj2 = MyCustom.from_json(obj2_dict)

# Make sure we have the correct thing
assert isinstance(obj2, MyCustom)
assert obj2.__dict__ == obj.__dict__

Note that we need two steps for loading. For now, the __python__ property is not used.

How common is this?

Using the method of AlJohri, I check popularity of approaches:

Serialization (Python -> JSON): (search-result chart omitted)

Deserialization (JSON -> Python): (search-result chart omitted)

Cognac answered 27/6, 2018 at 5:24 Comment(1)
You should delete the parts where you search github for strings. This seems to be a bit pointless because if you inspect the results you can see it doesn't give the statistics you want to present anyway. Also it distracts from what it otherwise a pretty good answer. I personally like the class method approach for deserialization because it enables you to use type information to tell Python what type you want to end up with. (Inspecting a string it can't know that. A string could represent any type, given a weird enough encoding, in theory. It is ambiguous.)Lycaon
T
4
import json

class Foo(object):
    def __init__(self):
        self.bar = 'baz'
        self._qux = 'flub'

    def somemethod(self):
        pass

'''
The parameter default(obj) is a function that should return a 
serializable version of obj or raise TypeError. The default 
default simply raises TypeError. 

https://docs.python.org/3.4/library/json.html#json.dumps
'''
def default(instance):
    return {k: v
            for k, v in vars(instance).items()
            if not str(k).startswith('_')}

json_foo = json.dumps(Foo(), default=default)
assert '{"bar": "baz"}' == json_foo

print(json_foo)
Transposition answered 17/7, 2015 at 6:20 Comment(3)
From doc: The parameter default(obj) is a function that should return a serializable version of obj or raise TypeError. The default default simply raises TypeError.Paction
@Paction isn't that what this does? A new default function is declared that returns a dictionary, and everything works as expected.Shererd
@Mike'Pomax'Kamermans yes. I just added the documentation, as only code as above isn't really helpful in explaining it.Paction
A
4

This has worked well for me:

class JsonSerializable(object):

    def serialize(self):
        return json.dumps(self.__dict__)

    def __repr__(self):
        return self.serialize()

    @staticmethod
    def dumper(obj):
        if "serialize" in dir(obj):
            return obj.serialize()

        return obj.__dict__

and then

class FileItem(JsonSerializable):
    ...

and

log.debug(json.dumps(<my object>, default=JsonSerializable.dumper, indent=2))
Audiphone answered 18/1, 2019 at 8:10 Comment(0)
C
3

If you don't mind installing a package for it, you can use json-tricks:

pip install json-tricks

After that you just need to import dump(s) from json_tricks instead of json, and it'll usually work:

from json_tricks import dumps
json_str = dumps(cls_instance, indent=4)

which'll give

{
        "__instance_type__": [
                "module_name.test_class",
                "MyTestCls"
        ],
        "attributes": {
                "attr": "val",
                "dct_attr": {
                        "hello": 42
                }
        }
}

And that's basically it!


This will work great in general. There are some exceptions, e.g. if special things happen in __new__, or more metaclass magic is going on.

Obviously loading also works (otherwise what's the point):

from json_tricks import loads
obj = loads(json_str)

This does assume that module_name.test_class.MyTestCls can be imported and hasn't changed in non-compatible ways. You'll get back an instance, not some dictionary or something, and it should be an identical copy to the one you dumped.

If you want to customize how something gets (de)serialized, you can add special methods to your class, like so:

class CustomEncodeCls:
        def __init__(self):
                self.relevant = 42
                self.irrelevant = 37

        def __json_encode__(self):
                # should return primitive, serializable types like dict, list, int, string, float...
                return {'relevant': self.relevant}

        def __json_decode__(self, **attrs):
                # should initialize all properties; note that __init__ is not called implicitly
                self.relevant = attrs['relevant']
                self.irrelevant = 12

which serializes only part of the attributes parameters, as an example.

And as a free bonus, you get (de)serialization of numpy arrays, date & times, ordered maps, as well as the ability to include comments in json.

Disclaimer: I created json_tricks, because I had the same problem as you.

Connected answered 10/11, 2016 at 12:53 Comment(2)
I've just tested json_tricks and it worked beautify (in 2019).Gowen
Why doesn't this answer have more upvotes? It looks like a de-facto default solution. Are there any downsides to this approach? What makes this library different from jsons and jsonpickle ?Lycaon
F
3

Kyle Delaney's comment is correct, so I tried to use the answer https://mcmap.net/q/41907/-how-to-make-a-class-json-serializable as well as an improved version of https://mcmap.net/q/41980/-encoding-nested-python-object-in-json

to create a "JSONAble" mixin.

To make a class JSON serializable, use "JSONAble" as a superclass and call either:

 instance.toJSON()

or

 instance.asJSON()

for the two offered methods. You could also extend the JSONAble class with other approaches offered here.

The test example for the Unit Test with Family and Person sample results in:

toJSON():

{
    "members": {
        "Flintstone,Fred": {
            "firstName": "Fred",
            "lastName": "Flintstone"
        },
        "Flintstone,Wilma": {
            "firstName": "Wilma",
            "lastName": "Flintstone"
        }
    },
    "name": "The Flintstones"
}

asJSON():

{'name': 'The Flintstones', 'members': {'Flintstone,Fred': {'firstName': 'Fred', 'lastName': 'Flintstone'}, 'Flintstone,Wilma': {'firstName': 'Wilma', 'lastName': 'Flintstone'}}}

Unit Test with Family and Person sample

def testJsonAble(self):
        family=Family("The Flintstones")
        family.add(Person("Fred","Flintstone")) 
        family.add(Person("Wilma","Flintstone"))
        json1=family.toJSON()
        json2=family.asJSON()
        print(json1)
        print(json2)

class Family(JSONAble):
    def __init__(self,name):
        self.name=name
        self.members={}
    
    def add(self,person):
        self.members[person.lastName+","+person.firstName]=person

class Person(JSONAble):
    def __init__(self,firstName,lastName):
        self.firstName=firstName
        self.lastName=lastName

jsonable.py defining JSONAble mixin

 '''
Created on 2020-09-03

@author: wf
'''
import json

class JSONAble(object):
    '''
    mixin to allow classes to be JSON serializable see
    https://mcmap.net/q/41907/-how-to-make-a-class-json-serializable
    '''

    def __init__(self):
        '''
        Constructor
        '''
    
    def toJSON(self):
        return json.dumps(self, default=lambda o: o.__dict__, 
            sort_keys=True, indent=4)
        
    def getValue(self,v):
        if (hasattr(v, "asJSON")):
            return v.asJSON()
        elif type(v) is dict:
            return self.reprDict(v)
        elif type(v) is list:
            vlist=[]
            for vitem in v:
                vlist.append(self.getValue(vitem))
            return vlist
        else:   
            return v
    
    def reprDict(self,srcDict):
        '''
        get my dict elements
        '''
        d = dict()
        for a, v in srcDict.items():
            d[a]=self.getValue(v)
        return d
    
    def asJSON(self):
        '''
        recursively return my dict elements
        '''
        return self.reprDict(self.__dict__)   

You'll find these approaches now integrated in the https://github.com/WolfgangFahl/pyLoDStorage project which is available at https://pypi.org/project/pylodstorage/

Foreyard answered 3/9, 2020 at 7:16 Comment(0)
D
3

Why are you guys making it so complicated? Here is a simple example:

#!/usr/bin/env python3

import json
from dataclasses import dataclass

@dataclass
class Person:
    first: str
    last: str
    age: int

    @property
    def __json__(self):
        return {
            "name": f"{self.first} {self.last}",
            "age": self.age
        }

john = Person("John", "Doe", 42)
print(json.dumps(john, indent=4, default=lambda x: x.__json__))

This way you could also serialize nested classes, as __json__ returns a python object and not a string. No need to use a JSONEncoder, as the default parameter with a simple lambda also works fine.

I've used @property instead of a simple function, as this feels more natural and modern. The @dataclass is also just an example, it works for a "normal" class as well.
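A sketch of the nested case, using a hypothetical Team class holding Person members (names are illustrative):

```python
import json
from dataclasses import dataclass

@dataclass
class Person:
    first: str
    last: str

    @property
    def __json__(self):
        return {"name": f"{self.first} {self.last}"}

@dataclass
class Team:
    title: str
    members: list

    @property
    def __json__(self):
        # returning plain Python objects lets json.dumps recurse;
        # each Person is then routed through the same default hook
        return {"title": self.title, "members": self.members}

team = Team("A", [Person("John", "Doe")])
print(json.dumps(team, default=lambda x: x.__json__))
# '{"title": "A", "members": [{"name": "John Doe"}]}'
```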

Dalessio answered 15/5, 2022 at 8:4 Comment(6)
possibly because you'd need to define a __json__ property for each class, which can be sometimes a pain. also, dataclasses provides asdict so technically you don't need a __json__ property at all.Chevalier
Sure, but what if you want to represent the json in a different way? Like in this case I combine first and last name. Thje asdict would not work for nested elements, right?Dalessio
hmm, in that case I would suggest making first and last as InitVar (init-only) fields, and setting name field in the __post_init__ constructor. I think that should hopefully work to represent json in a diff format in this case. Also, i might be wrong but I believe asdict works for nested dataclasses as well.Chevalier
But that does not work if you change the variables later on.Dalessio
Hmm, to best of my understanding it should. can you provide an example of what you mean?Chevalier
[First Comment on main question] "These answers assume that you're doing the serialization yourself, rather than passing the object along to some other module that serializes it. – Kyle Delaney Oct 17, 2019" I think that sums up the issueMadalene
H
2

jsonweb seems to be the best solution for me. See http://www.jsonweb.info/en/latest/

from jsonweb.encode import to_object, dumper

@to_object()
class DataModel(object):
  def __init__(self, id, value):
   self.id = id
   self.value = value

>>> data = DataModel(5, "foo")
>>> dumper(data)
'{"__type__": "DataModel", "id": 5, "value": "foo"}'
Hardspun answered 7/10, 2014 at 5:32 Comment(1)
Does it work well for nested objects? Including decoding and encodingOphthalmology
N
2

I came up with my own solution. Use this method, passing any document (dict, list, ObjectId, etc.) to serialize. (ObjectId is MongoDB's ID type, from bson.)

from bson import ObjectId

def getSerializable(doc):
    # check if it's a list
    if isinstance(doc, list):
        for i, val in enumerate(doc):
            doc[i] = getSerializable(doc[i])
        return doc

    # check if it's a dict
    if isinstance(doc, dict):
        for key in doc.keys():
            doc[key] = getSerializable(doc[key])
        return doc

    # Process ObjectId
    if isinstance(doc, ObjectId):
        doc = str(doc)
        return doc

    # Use any other custom serializting stuff here...

    # For the rest of stuff
    return doc
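A self-contained sketch of the same hook-point idea, with datetime standing in for ObjectId (illustrative only; the original targets bson.ObjectId, which needs pymongo installed):

```python
import json
from datetime import datetime

# Same recursive shape as the answer, rewritten with comprehensions
def getSerializable(doc):
    if isinstance(doc, list):
        return [getSerializable(v) for v in doc]
    if isinstance(doc, dict):
        return {k: getSerializable(v) for k, v in doc.items()}
    if isinstance(doc, datetime):  # custom type hook, like the ObjectId branch
        return doc.isoformat()
    return doc

doc = {"created": datetime(2021, 1, 2), "tags": ["a", "b"]}
print(json.dumps(getSerializable(doc)))
# '{"created": "2021-01-02T00:00:00", "tags": ["a", "b"]}'
```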
Nil answered 21/5, 2015 at 5:6 Comment(0)
K
2

Building on Quinten Cabo's answer:

def sterilize(obj):
    """Make an object more amenable to dumping as JSON
    """
    if type(obj) in (str, float, int, bool, type(None)):
        return obj
    elif isinstance(obj, dict):
        return {k: sterilize(v) for k, v in obj.items()}
    list_ret = []
    dict_ret = {}
    for a in dir(obj):
        if a == '__iter__' and callable(obj.__iter__):
            list_ret.extend([sterilize(v) for v in obj])
        elif a == '__dict__':
            dict_ret.update({k: sterilize(v) for k, v in obj.__dict__.items() if k not in ['__module__', '__dict__', '__weakref__', '__doc__']})
        elif a not in ['__doc__', '__module__']:
            aval = getattr(obj, a)
            if type(aval) in (str, float, int, bool, type(None)):
                dict_ret[a] = aval
            elif a != '__class__' and a != '__objclass__' and isinstance(aval, type):
                dict_ret[a] = sterilize(aval)
    if len(list_ret) == 0:
        if len(dict_ret) == 0:
            return repr(obj)
        return dict_ret
    else:
        if len(dict_ret) == 0:
            return list_ret
    return (list_ret, dict_ret)

The differences are

  1. Works for any iterable instead of just list and tuple (it works for NumPy arrays, etc.)
  2. Works for dynamic types (ones that contain a __dict__).
  3. Includes native types float and None so they don't get converted to string.
  4. Classes that have __dict__ and members will mostly work (if the __dict__ and member names collide, you will only get one - likely the member)
  5. Classes that are lists and have members will look like a tuple of the list and a dictionary
  6. Python3 (that isinstance() call may be the only thing that needs changing)
Kutchins answered 2/5, 2020 at 11:29 Comment(0)
S
2
class DObject(json.JSONEncoder):
    def delete_not_related_keys(self, _dict):
        for key in ["skipkeys", "ensure_ascii", "check_circular", "allow_nan", "sort_keys", "indent"]:
            try:
                del _dict[key]
            except:
                continue

    def default(self, o):
        if hasattr(o, '__dict__'):
            my_dict = o.__dict__.copy()
            self.delete_not_related_keys(my_dict)
            return my_dict
        else:
            return o

a = DObject()
a.name = 'abdul wahid'
b = DObject()
b.name = a

print(json.dumps(b, cls=DObject))
Stunsail answered 19/6, 2020 at 15:15 Comment(0)
S
1

I liked Lost Koder's method the most. I ran into issues when trying to serialize more complex objects whose members/methods aren't serializable. Here's my implementation that works on more objects:

class Serializer(object):
    @staticmethod
    def serialize(obj):
        def check(o):
            for k, v in o.__dict__.items():
                try:
                    _ = json.dumps(v)
                    o.__dict__[k] = v
                except TypeError:
                    o.__dict__[k] = str(v)
            return o
        return json.dumps(check(obj).__dict__, indent=2)
Seafaring answered 11/11, 2017 at 5:35 Comment(0)
S
1

I ran into this problem when I tried to store Peewee's model into PostgreSQL JSONField.

After struggling for a while, here's the general solution.

The key to my solution is going through Python's source code and realizing that the code documentation (described here) already explains how to extend the existing json.dumps to support other data types.

Suppose you currently have a model that contains some fields that are not serializable to JSON, and the model that contains the JSON field originally looks like this:

class SomeClass(Model):
    json_field = JSONField()

Just define a custom JSONEncoder like this:

class CustomJsonEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, SomeTypeUnsupportedByJsonDumps):
            return < whatever value you want >
        return json.JSONEncoder.default(self, obj)

    @staticmethod
    def json_dumper(obj):
        return json.dumps(obj, cls=CustomJsonEncoder)

And then just use it in your JSONField like below:

class SomeClass(Model):
    json_field = JSONField(dumps=CustomJsonEncoder.json_dumper)

The key is the default(self, obj) method above. For every single ... is not JSON serializable complaint you receive from Python, just add code to handle the unserializable-to-JSON type (such as Enum or datetime)

For example, here's how I support a class inheriting from Enum:

class TransactionType(Enum):
   CURRENT = 1
   STACKED = 2

   def default(self, obj):
       if isinstance(obj, TransactionType):
           return obj.value
       return json.JSONEncoder.default(self, obj)
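Outside the Peewee context, a self-contained sketch of the same encoder pattern, with Enum and datetime as example unserializable types:

```python
import json
from enum import Enum
from datetime import datetime

class TransactionType(Enum):
    CURRENT = 1
    STACKED = 2

class CustomJsonEncoder(json.JSONEncoder):
    def default(self, obj):
        # add one branch per type that json can't handle natively
        if isinstance(obj, Enum):
            return obj.value
        if isinstance(obj, datetime):
            return obj.isoformat()
        return super().default(obj)

print(json.dumps({"type": TransactionType.CURRENT, "at": datetime(2020, 1, 1)},
                 cls=CustomJsonEncoder))
# '{"type": 1, "at": "2020-01-01T00:00:00"}'
```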

Finally, with the code implemented like above, you can convert any Peewee model to a JSON-serializable object like below:

peewee_model = WhateverPeeweeModel()
new_model = SomeClass()
new_model.json_field = model_to_dict(peewee_model)

Though the code above is (somewhat) specific to Peewee, I think:

  1. It's applicable to other ORMs (Django, etc) in general
  2. Also, if you understood how json.dumps works, this solution also works with Python (sans ORM) in general too

Any questions, please post in the comments section. Thanks!

Serrated answered 30/7, 2018 at 15:4 Comment(0)
C
1

If the object can be pickled, one can use the following two functions to encode and decode an object:

import base64
import json
import pickle

def obj_to_json(obj):
    pickled = pickle.dumps(obj)
    coded = base64.b64encode(pickled).decode('utf8')
    return json.dumps(coded)

def json_to_obj(s):
    coded = base64.b64decode(json.loads(s))
    return pickle.loads(coded)

This is useful, for example, in combination with pytest and config.cache.

Cykana answered 23/3, 2023 at 16:11 Comment(0)
B
1

I don't know if that suits your needs, but using orjson as json and adding a dataclass decorator to your class solves the problem:

from dataclasses import dataclass

@dataclass()
class FileItem:
    def __init__(self, fname):
        self.fname = fname

import orjson as json
x = FileItem("/foo/bar")
json.dumps(x)
# -> returns b'{"fname":"/foo/bar"}'
Barbee answered 25/9, 2023 at 7:11 Comment(0)
H
0

This is a small library that serializes an object with all its children to JSON and also parses it back:

https://github.com/tobiasholler/PyJSONSerialization/

Hornpipe answered 17/7, 2014 at 9:44 Comment(0)
P
0

If you are able to install a package, I'd recommend trying dill, which worked just fine for my project. A nice thing about this package is that it has the same interface as pickle, so if you have already been using pickle in your project you can simply substitute in dill and see if the script runs, without changing any code. So it is a very cheap solution to try!

(Full anti-disclosure: I am in no way affiliated with and have never contributed to the dill project.)

Install the package:

pip install dill

Then edit your code to import dill instead of pickle:

# import pickle
import dill as pickle

Run your script and see if it works. (If it does you may want to clean up your code so that you are no longer shadowing the pickle module name!)

Some specifics on datatypes that dill can and cannot serialize, from the project page:

dill can pickle the following standard types:

none, type, bool, int, long, float, complex, str, unicode, tuple, list, dict, file, buffer, builtin, both old and new style classes, instances of old and new style classes, set, frozenset, array, functions, exceptions

dill can also pickle more ‘exotic’ standard types:

functions with yields, nested functions, lambdas, cell, method, unboundmethod, module, code, methodwrapper, dictproxy, methoddescriptor, getsetdescriptor, memberdescriptor, wrapperdescriptor, xrange, slice, notimplemented, ellipsis, quit

dill cannot yet pickle these standard types:

frame, generator, traceback
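As a quick sanity check (assuming dill is installed), here is a sketch of dill round-tripping a lambda, which the stdlib pickle rejects:

```python
import pickle
import dill

double = lambda x: 2 * x

# stdlib pickle cannot serialize a lambda...
try:
    pickle.dumps(double)
except Exception as exc:
    print("pickle failed:", exc)

# ...but dill round-trips it
restored = dill.loads(dill.dumps(double))
assert restored(21) == 42
```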

Pitts answered 18/12, 2018 at 16:48 Comment(0)
D
0

I see no mention here of serialization versioning or backward compatibility, so I will post the solution I've been using for a while. I probably still have a lot to learn here; in particular, the Java and JavaScript ecosystems are probably more mature in this area, but here goes:

https://gist.github.com/andy-d/b7878d0044a4242c0498ed6d67fd50fe

Darrondarrow answered 27/8, 2019 at 21:39 Comment(0)
R
0

To add another option: You can use the attrs package and the asdict method.

import json
from json import JSONEncoder, JSONDecoder
import attr

class ObjectEncoder(JSONEncoder):
    def default(self, o):
        return attr.asdict(o)

json.dumps(objects, cls=ObjectEncoder)

and to convert back

def from_json(o):
    if '_obj_name' in o:
        type_ = o['_obj_name']
        del o['_obj_name']
        return globals()[type_](**o)
    else:
        return o

data = JSONDecoder(object_hook=from_json).decode(data)

The class looks like this:

@attr.s
class Foo(object):
    x = attr.ib()
    _obj_name = attr.ib(init=False, default='Foo')
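Putting the pieces together, a runnable sketch of the full round trip (imports included; the example value 42 is arbitrary):

```python
import json
import attr

@attr.s
class Foo(object):
    x = attr.ib()
    _obj_name = attr.ib(init=False, default='Foo')

class ObjectEncoder(json.JSONEncoder):
    def default(self, o):
        return attr.asdict(o)

def from_json(o):
    if '_obj_name' in o:
        type_ = o.pop('_obj_name')
        return globals()[type_](**o)
    return o

data = json.dumps(Foo(42), cls=ObjectEncoder)
restored = json.JSONDecoder(object_hook=from_json).decode(data)
assert restored == Foo(42)
```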
Relativize answered 15/10, 2019 at 17:12 Comment(0)
W
0

In addition to Onur's answer, you may want to deal with the datetime type as below
(in order to handle the "'datetime.datetime' object has no attribute '__dict__'" exception).

import datetime
import json

def datetime_option(value):
    if isinstance(value, datetime.datetime):
        return value.timestamp()
    if isinstance(value, datetime.date):
        return value.isoformat()  # plain dates have no timestamp()
    return value.__dict__

Usage:

def toJSON(self):
    return json.dumps(self, default=datetime_option, sort_keys=True, indent=4)
Wornout answered 3/2, 2020 at 8:34 Comment(0)
M
0

First we need to make our object JSON-compliant, so we can dump it using the standard JSON module. I did it this way:

def serialize(o):
    if isinstance(o, dict):
        return {k:serialize(v) for k,v in o.items()}
    if isinstance(o, list):
        return [serialize(e) for e in o]
    if isinstance(o, bytes):
        return o.decode("utf-8")
    return o
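The helper only makes the structure JSON-compliant; json.dumps does the rest. For example (the helper is repeated so the snippet runs on its own):

```python
import json

def serialize(o):
    if isinstance(o, dict):
        return {k: serialize(v) for k, v in o.items()}
    if isinstance(o, list):
        return [serialize(e) for e in o]
    if isinstance(o, bytes):
        return o.decode("utf-8")
    return o

record = {"name": b"file.txt", "tags": [b"a", "b"]}
print(json.dumps(serialize(record)))
# prints {"name": "file.txt", "tags": ["a", "b"]}
```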
Millenary answered 27/2, 2020 at 8:6 Comment(0)
D
0

This function uses recursion to iterate over every part of the dictionary and falls back to repr() for values that are not built-in types.

def sterilize(obj):
    if isinstance(obj, dict):
        return {k: sterilize(v) for k, v in obj.items()}
    elif isinstance(obj, (list, tuple)):
        return [sterilize(v) for v in obj]
    elif isinstance(obj, (str, int, bool, float)) or obj is None:
        return obj
    else:
        return repr(obj)
Dou answered 30/3, 2020 at 20:12 Comment(0)
L
0

We often dump complex dictionaries in JSON format in log files. While most of the fields carry important information, we don't care much about objects that aren't JSON serializable (for example, a subprocess.Popen object). Due to the presence of unserializable objects like these, the call to json.dumps() fails.

To get around this, I built a small function that dumps the object's string representation instead of dumping the object itself. And if the data structure you are dealing with is too deeply nested, you can specify a maximum nesting level/depth.

import json
import subprocess
from time import time

def safe_serialize(obj , max_depth = 2):

    max_level = max_depth

    def _safe_serialize(obj , current_level = 0):

        nonlocal max_level

        # If it is a list
        if isinstance(obj , list):

            if current_level >= max_level:
                return "[...]"

            result = list()
            for element in obj:
                result.append(_safe_serialize(element , current_level + 1))
            return result

        # If it is a dict
        elif isinstance(obj , dict):

            if current_level >= max_level:
                return "{...}"

            result = dict()
            for key , value in obj.items():
                result[f"{_safe_serialize(key , current_level + 1)}"] = _safe_serialize(value , current_level + 1)
            return result

        # If it is an object of a user-defined class (it has a __dict__)
        elif hasattr(obj , "__dict__"):
            if hasattr(obj , "__repr__"):
                result = f"{obj.__repr__()}_{int(time())}"
            else:
                try:
                    result = f"{obj.__class__.__name__}_object_{int(time())}"
                except:
                    result = f"object_{int(time())}"
            return result

        # If it is anything else
        else:
            return obj

    return _safe_serialize(obj)

Since a dictionary can also have unserializable keys, dumping just their class name or object representation could produce several identical keys, and all keys need to be unique. That is why the current time since epoch is appended to object names with int(time()).

This function can be tested with the following nested dictionary with different levels/depths-

d = {
    "a" : {
        "a1" : {
            "a11" : {
                "a111" : "some_value" ,
                "a112" : "some_value" ,
            } ,
            "a12" : {
                "a121" : "some_value" ,
                "a122" : "some_value" ,
            } ,
        } ,
        "a2" : {
            "a21" : {
                "a211" : "some_value" ,
                "a212" : "some_value" ,
            } ,
            "a22" : {
                "a221" : "some_value" ,
                "a222" : "some_value" ,
            } ,
        } ,
    } ,
    "b" : {
        "b1" : {
            "b11" : {
                "b111" : "some_value" ,
                "b112" : "some_value" ,
            } ,
            "b12" : {
                "b121" : "some_value" ,
                "b122" : "some_value" ,
            } ,
        } ,
        "b2" : {
            "b21" : {
                "b211" : "some_value" ,
                "b212" : "some_value" ,
            } ,
            "b22" : {
                "b221" : "some_value" ,
                "b222" : "some_value" ,
            } ,
        } ,
    } ,
    "c" : subprocess.Popen("ls -l".split() , stdout = subprocess.PIPE , stderr = subprocess.PIPE) ,
}

Running the following will lead to-

print("LEVEL 3")
print(json.dumps(safe_serialize(d , 3) , indent = 4))

print("\n\n\nLEVEL 2")
print(json.dumps(safe_serialize(d , 2) , indent = 4))

print("\n\n\nLEVEL 1")
print(json.dumps(safe_serialize(d , 1) , indent = 4))

Result:

LEVEL 3
{
    "a": {
        "a1": {
            "a11": "{...}",
            "a12": "{...}"
        },
        "a2": {
            "a21": "{...}",
            "a22": "{...}"
        }
    },
    "b": {
        "b1": {
            "b11": "{...}",
            "b12": "{...}"
        },
        "b2": {
            "b21": "{...}",
            "b22": "{...}"
        }
    },
    "c": "<Popen: returncode: None args: ['ls', '-l']>"
}



LEVEL 2
{
    "a": {
        "a1": "{...}",
        "a2": "{...}"
    },
    "b": {
        "b1": "{...}",
        "b2": "{...}"
    },
    "c": "<Popen: returncode: None args: ['ls', '-l']>"
}



LEVEL 1
{
    "a": "{...}",
    "b": "{...}",
    "c": "<Popen: returncode: None args: ['ls', '-l']>"
}

[NOTE]: Only use this if you don't care about serialization of a built-in class object.

Laruelarum answered 26/12, 2022 at 20:4 Comment(0)
C
-1

To throw yet another log onto a 10-year-old fire, I would also offer the dataclass-wizard for this task, assuming you're using Python 3.6+. This works well with dataclasses, which is a built-in module from Python 3.7 onwards.

The dataclass-wizard library will convert your object (and all its attributes recursively) to a dict, and makes the reverse (de-serialization) pretty straightforward too, with fromdict. Also, here is the PyPi link: https://pypi.org/project/dataclass-wizard/.

Disclaimer: I am the creator and maintainer of this library.

import dataclass_wizard
import dataclasses

@dataclasses.dataclass
class A:
    hello: str
    a_field: int

obj = A('world', 123)
a_dict = dataclass_wizard.asdict(obj)
# {'hello': 'world', 'aField': 123}

Or if you wanted a string:

a_str = json.dumps(dataclass_wizard.asdict(obj))

Or if your class extended from dataclass_wizard.JSONWizard:

a_str = your_object.to_json()

Finally, the library also supports dataclasses in Union types, which basically means that a dict can be de-serialized into an object of either class C1 or C2. For example:

from dataclasses import dataclass

from dataclass_wizard import JSONWizard

@dataclass
class Outer(JSONWizard):

    class _(JSONWizard.Meta):
        tag_key = 'tag'
        auto_assign_tags = True

    my_string: str
    inner: 'A | B'  # alternate syntax: `inner: typing.Union['A', 'B']`

@dataclass
class A:
    my_field: int

@dataclass
class B:
    my_field: str


my_dict = {'myString': 'test', 'inner': {'tag': 'B', 'myField': 'test'}}
obj = Outer.from_dict(my_dict)

# True
assert repr(obj) == "Outer(my_string='test', inner=B(my_field='test'))"

obj.to_json()
# {"myString": "test", "inner": {"myField": "test", "tag": "B"}}
Chevalier answered 14/6, 2022 at 15:38 Comment(1)
I suppose it could be that I didn't have a disclaimer.. if so, just fixed that.Chevalier
B
-1

For anyone who wants basic conversion without an external library: you can simply override the __iter__ and __str__ methods of the custom class in the following way.

import json

class JSONCustomEncoder(json.JSONEncoder):
    def default(self, obj):
        return obj.__dict__


class Student:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

    def __iter__(self):
        yield from {
            "name": self.name,
            "age": self.age,
        }.items()

    def __str__(self):
        return json.dumps(
            self.__dict__, cls=JSONCustomEncoder, ensure_ascii=False
        )

Use the object by wrapping it in dict(), so that the data remains preserved:

s = Student("aman", 24)
dict(s)
Bruch answered 22/9, 2022 at 19:55 Comment(0)
L
-1

Given that there is no "standardized" way to perform serialization and deserialization in Python (compare what Python has to offer with Rust, an alternative language I happen to know that handles serialization and deserialization well), I think it would be helpful to have an answer that collects a summary of the possible approaches along with their advantages, disadvantages, and performance comparisons.

I cannot provide all this information myself, at least not all at once. So I will start off by providing some information and leave this answer for others to edit and contribute to. I will provide a summary of the most notable answers thus far. For the ones I have missed, please feel free to edit this answer or comment, and someone will (hopefully) update it.

When this becomes "production ready" I will clean up this preamble to remove it. My aim would be for this to become a long-term reference which provides the relevant information succinctly, rather than have it be distributed across a large number of individual answers, each arguing their case for why they should be used.

General

  • Serialization is a many-to-one operation: once serialized, type information is lost, and the same serialized string could represent infinitely many possible parent types. The obvious example is that of a set and a list: these are two different types which could contain the same elements yet would be serialized in the same way.
  • Many languages solve this problem by explicitly providing type information as part of the deserialization function call. For example Type::deserialize() or deserialize(..., type=Type). This is not code for any particular language, it is simply here to present how type information might be present in code.

json

Advantages:

  • Native to Python
  • Will serialize basic types: dict, list, str, int, float, bool, None

Disadvantages:

  • Does not serialize custom objects: if a Python object is not one of the basic types above (or contains a value that isn't), json.dumps raises a TypeError
  • Does not serialize common types like datetime.datetime or datetime.date

jsons

Advantages:

  • Correctly serializes and deserializes recursively nested types (?)
  • Correctly serializes and deserializes common types like datetime objects (?)

Disadvantages:

  • Slow (?)

jsonpickle

Advantages:

  • Correctly serializes and deserializes recursively nested types (?)
  • Correctly serializes and deserializes common types like datetime objects (?)

Disadvantages:

  • Type information is present in the encoded (serialized) output.
  • This means that the encoded output is more verbose, contains fields that you might not expect to see (type info) and the encoding is "special" to both Python and the jsonpickle library.
  • If you deserialize this in another languages or using another Python library, you will obviously not have access to the same behaviours. (In other words the code you write will behave differently. This is sort of obvious and goes without saying.)
  • You can suppress the type information using an argument unpicklable=False
  • Might be slow? (Citation required: please edit to add performance comparisons)

inheriting from dict, using Python's inbuilt json library

Advantages:

  • Works without requiring other libraries
  • Minimal boilerplate

Disadvantages:

  • Does not work for non-serializable types like datetime
  • Does not work for nested types (?)
  • Since your type is probably not actually a dict but something else, this violates fundamental principles of OOP design

Considerations:

  • You can use __getattr__ and __setattr__ methods so that it will use the dict values for any undefined attributes, see answer by Sunding Wei

Use composition over inheritance, aka wrap a dict

Advantages:

  • Does not violate OOP design compared to above alternative
  • Works for nested types, but requires a lot of boilerplate, better for things that do not have nesting

Disadvantages:

  • Unless more boilerplate is added, accessing elements of the dict requires more code, and the resulting code is less intuitive. If the dict attribute is named data_dict then, rather than accessing my_class.my_field, one has to write my_class.data_dict['my_field']
  • Properties or getters/setters can mitigate this, but that requires maintenance of 2N functions for N fields
  • Requires adding from_dict class method for deserialization and __json__ or to_json for serializing
  • As such, this is a more manual operation compared to the previously presented examples. That might be preferable/acceptable in cases where explicit code is preferred and there is no nesting of types

Manual implementation by returning a dict as an interface type

  • See answer by Martin Thoma, it is similar to the above option of wrapping a dict

Advantages:

  • Uses explicit type information from TYPE.from_json class methods
  • Allows creating a class with explicitly named fields rather than keys in a dictionary

Disadvantages:

  • Requires two step loading instead of a single line of code
  • Requires some boilerplate
  • Since the serialization is being done by relying on conversion of the class to a dict structure, might consider using above method more straightforward
  • Does not work in cases where the fields include types like datetime

subclassing JSONEncoder and JSONDecoder

Advantages:

  • Leverages Python native json library
  • others?

Disadvantages:

  • Not a 1 line solution
  • Requires creating a class to serialize and deserialize every type you want to be able to serialize and deserialize

Side note: This looks like it should be the "canonical" choice ... but I'm not completely sure I understand it, and the fact that this weird "hook" thing is required makes me suspect it's perhaps not that generalizable? Maybe someone else can edit this section and clarify?
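For reference, here is a minimal sketch of this approach (class and hook names are illustrative): the encoder's default is called for objects json can't handle, and the decoder's object_hook is called for every decoded JSON object (dict), which is why it must pass unrecognized dicts through unchanged.

```python
import json

class FileItem:
    def __init__(self, fname):
        self.fname = fname

class FileItemEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, FileItem):
            return {'fname': o.fname}
        return super().default(o)  # raises TypeError for other types

def as_file_item(d):
    # object_hook sees every dict; only convert the ones we recognize
    if set(d) == {'fname'}:
        return FileItem(d['fname'])
    return d

s = json.dumps(FileItem('/foo/bar'), cls=FileItemEncoder)
x = json.loads(s, object_hook=as_file_item)
assert x.fname == '/foo/bar'
```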

default=vars

Advantage:

  • Very quickly allows serialization of custom objects

Disadvantage:

  • Only works for types serializable with native json library, does not work for types like datetime
  • Does not work for nested types (?)
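A one-line illustration of this option, using the class from the question:

```python
import json

class FileItem:
    def __init__(self, fname):
        self.fname = fname

# vars(obj) returns obj.__dict__, so this serializes the attributes
print(json.dumps(FileItem('/foo/bar'), default=vars))
# prints {"fname": "/foo/bar"}
```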

json-fix

See answer by Jeff Hykin

simplejson

todo

json-tricks

todo

jsonweb

todo

Lycaon answered 14/2 at 11:24 Comment(0)
E
-2

There are many approaches to this problem. ObjDict (pip install objdict) is another. It emphasizes providing JavaScript-like objects which can also act like dictionaries, to best handle data loaded from JSON, but it has other features which can be useful as well. This provides another alternative solution to the original problem.

Expunction answered 2/10, 2016 at 4:29 Comment(0)
A
-2

I chose to use decorators to solve the datetime object serialization problem. Here is my code:

#myjson.py
#Author: jmooremcc 7/16/2017

import json
from datetime import datetime, date, time, timedelta
"""
This module uses decorators to serialize date objects using json
The filename is myjson.py
In another module you simply add the following import statement:
    from myjson import json

json.dumps and json.dump will then correctly serialize datetime and date 
objects
"""

def json_serial(obj):
    """JSON serializer for objects not serializable by default json code"""

    if isinstance(obj, (datetime, date)):
        serial = str(obj)
        return serial
    raise TypeError ("Type %s not serializable" % type(obj))


def FixDumps(fn):
    def hook(obj):
        return fn(obj, default=json_serial)

    return hook

def FixDump(fn):
    def hook(obj, fp):
        return fn(obj,fp, default=json_serial)

    return hook


json.dumps=FixDumps(json.dumps)
json.dump=FixDump(json.dump)


if __name__=="__main__":
    today = datetime.now()
    data = {'atime': today, 'greet': 'Hello'}
    serialized = json.dumps(data)
    print(serialized)

By importing the above module, my other modules use json in the normal way (without specifying the default keyword) to serialize data that contains datetime objects. The datetime serializer code is automatically called for json.dumps and json.dump.

Animal answered 16/7, 2017 at 17:28 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.