Easiest way to serialize a simple class object with simplejson?
I'm trying to serialize a list of python objects with JSON (using simplejson) and am getting the error that the object "is not JSON serializable".

The class is simple, having fields that are only integers, strings, and floats, and it inherits similar fields from one parent superclass, e.g.:

class ParentClass:
  def __init__(self, foo):
     self.foo = foo

class ChildClass(ParentClass):
  def __init__(self, foo, bar):
     ParentClass.__init__(self, foo)
     self.bar = bar

bar1 = ChildClass(my_foo, my_bar)
bar2 = ChildClass(my_foo, my_bar)
my_list_of_objects = [bar1, bar2]
simplejson.dump(my_list_of_objects, my_filename)

where foo and bar are simple types, as mentioned above. The only tricky thing is that ChildClass sometimes has a field that refers to another object (of a type that is not ParentClass or ChildClass).

What is the easiest way to serialize this as a JSON object with simplejson? Is it sufficient to make it serializable as a dictionary? Is the best way to simply write a dict method for ChildClass? Finally, does having a field that refers to another object significantly complicate things? If so, I can rewrite my code to only have simple fields in the classes (like strings/floats etc.).

Thank you.

Firooc answered 26/2, 2010 at 17:31 Comment(2)
#2250292Lobachevsky
possible duplicate of Python: how to make a class JSON serializableArmpit
30

I've used this strategy in the past and have been pretty happy with it: encode your custom objects as JSON object literals (like Python dicts) with the following structure:

{ '__ClassName__': { ... } }

That's essentially a one-item dict whose single key is a special string specifying what kind of object is encoded, and whose value is a dict of the instance's attributes.

A very simple implementation of an encoder and a decoder (simplified from code I've actually used) looks like this:

import json

TYPES = { 'ParentClass': ParentClass,
          'ChildClass': ChildClass }


class CustomTypeEncoder(json.JSONEncoder):
    """A custom JSONEncoder class that knows how to encode core custom
    objects.

    Custom objects are encoded as JSON object literals (ie, dicts) with
    one key, '__TypeName__' where 'TypeName' is the actual name of the
    type to which the object belongs.  That single key maps to another
    object literal which is just the __dict__ of the object encoded."""

    def default(self, obj):
        # isinstance() needs a tuple of types, not a list or dict view.
        if isinstance(obj, tuple(TYPES.values())):
            key = '__%s__' % obj.__class__.__name__
            return { key: obj.__dict__ }
        return json.JSONEncoder.default(self, obj)


def CustomTypeDecoder(dct):
    if len(dct) == 1:
        type_name, value = next(iter(dct.items()))
        type_name = type_name.strip('_')
        if type_name in TYPES:
            return TYPES[type_name].from_dict(value)
    return dct

This implementation assumes that the objects you're encoding have a from_dict() class method that knows how to recreate an instance from a dict decoded from JSON.

It's easy to expand the encoder and decoder to support custom types (e.g. datetime objects).
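For example, here is a minimal sketch of how the pieces might fit together. The from_dict() bodies are my own illustration of the class method mentioned above, not part of the original answer, and the TYPES mapping is rebuilt once the classes exist:

import json

class ParentClass(object):
    def __init__(self, foo):
        self.foo = foo

    @classmethod
    def from_dict(cls, d):
        # Rebuild an instance from the attribute dict decoded from JSON.
        return cls(d['foo'])


class ChildClass(ParentClass):
    def __init__(self, foo, bar):
        ParentClass.__init__(self, foo)
        self.bar = bar

    @classmethod
    def from_dict(cls, d):
        return cls(d['foo'], d['bar'])


# TYPES (from the snippet above) has to be built once the classes exist.
TYPES = {'ParentClass': ParentClass, 'ChildClass': ChildClass}

# Encode with the custom encoder; decode with the object_hook.
encoded = json.dumps([ChildClass('hello', 1.5)], cls=CustomTypeEncoder)
decoded = json.loads(encoded, object_hook=CustomTypeDecoder)

Round-tripping works because from_dict() simply mirrors what default() writes out.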

EDIT, to answer your edit: The nice thing about an implementation like this is that it will automatically encode and decode instances of any type found in the TYPES mapping. That means it will automatically handle a ChildClass like this:

class ChildClass(object):
    def __init__(self):
        self.foo = 'foo'
        self.bar = 1.1
        self.parent = ParentClass(1)

That should result in JSON something like the following:

{
  "__ChildClass__": {
    "bar": 1.1,
    "foo": "foo",
    "parent": {
      "__ParentClass__": {
        "foo": 1
      }
    }
  }
}
Canal answered 26/2, 2010 at 17:49 Comment(3)
How does this compare to using something like jsonpickle module? thanks.Firooc
I just coded up something very similar using the ClassName method in the new Underverse module. You can also auto detect types to encode by checking if they are in the TYPES dict, if not you can add them and encode on the fly. All you have to do then is add a list of classes when decoding the JSON and encoding is handled automatically.Secondclass
Good solution, but it won't work on any classes with names that start or end on an underscore because of how you save the class name. Better to just use the class names verbatim.Sepaloid
10

An instance of a custom class can be represented as a JSON-formatted string with the help of the following function:

import json

def json_repr(obj):
  """Represent an instance of a class as JSON.
  Arguments:
  obj -- any object
  Return:
  String that represents the JSON-encoded object.
  """
  def serialize(obj):
    """Recursively walk the object's hierarchy."""
    if isinstance(obj, (bool, int, long, float, basestring)):
      return obj
    elif isinstance(obj, dict):
      obj = obj.copy()
      for key in obj:
        obj[key] = serialize(obj[key])
      return obj
    elif isinstance(obj, list):
      return [serialize(item) for item in obj]
    elif isinstance(obj, tuple):
      return tuple(serialize(item) for item in obj)
    elif hasattr(obj, '__dict__'):
      return serialize(obj.__dict__)
    else:
      return repr(obj)  # Don't know how to handle, convert to string
  return json.dumps(serialize(obj))

This function will produce a JSON-formatted string for

  • an instance of a custom class,

  • a dictionary that has instances of custom classes as leaves, and

  • a list of instances of custom classes.
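For example, a quick usage sketch; the Car and Engine classes are hypothetical, and the Python 2 flavour matches the basestring/long checks above:

class Engine(object):
    def __init__(self, hp):
        self.hp = hp

class Car(object):
    def __init__(self, make, engine):
        self.make = make
        self.engine = engine

# Nested custom objects are walked recursively; key order may vary.
print(json_repr([Car('Saab', Engine(185)), {'spare': Engine(90)}]))
# e.g. [{"make": "Saab", "engine": {"hp": 185}}, {"spare": {"hp": 90}}]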
Yesseniayester answered 13/1, 2011 at 16:38 Comment(1)
This is awesome. Should the dict case be handled as such? ``` elif isinstance(obj, dict): objc = obj.copy() for key in obj: objc[key] = serialize(obj[key]) return objc ``` Also, "basestring" is not in Python3, it is now "str", plus "long" is not needed for Python 3.Ridenour
3

As described in Python's json docs (see help(json.dumps)):

You should simply override the default() method of JSONEncoder to provide a custom type conversion, and pass your encoder class as the cls argument.

Here is one I use to cover Mongo's special data types (datetime and ObjectId):

import json

class MongoEncoder(json.JSONEncoder):
    def default(self, v):
        # Map type names to conversion functions.
        types = {
            'ObjectId': lambda v: str(v),
            'datetime': lambda v: v.isoformat()
        }
        vtype = type(v).__name__
        if vtype in types:
            return types[vtype](v)
        else:
            return json.JSONEncoder.default(self, v)

Calling it is as simple as:

data = json.dumps(data, cls=MongoEncoder)
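For instance, a minimal sketch of the encoder in action with a datetime value (the record dict is illustrative; ObjectId would additionally require pymongo, so it is omitted here):

import json
import datetime

record = {'name': 'example', 'created': datetime.datetime(2013, 6, 12, 1, 55)}
print(json.dumps(record, cls=MongoEncoder))
# e.g. {"name": "example", "created": "2013-06-12T01:55:00"}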
Nobe answered 12/6, 2013 at 1:55 Comment(0)
2

If you are using Django, it can be easily done via Django's serializers module. More info can be found here: https://docs.djangoproject.com/en/dev/topics/serialization/
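For instance, a minimal sketch (SomeModel is a hypothetical Django model):

from django.core import serializers

# Serialize a queryset of model instances to a JSON string.
data = serializers.serialize("json", SomeModel.objects.all())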

Throe answered 1/8, 2011 at 0:39 Comment(0)
1

This is kind of hackish and I'm sure there's probably a lot that can go wrong with it. However, I was writing a simple script and ran into the issue that I did not want to subclass my JSON serializer just to serialize a list of model objects. I ended up using a list comprehension.

Let assets be a list of model objects.

Code:

myJson = json.dumps([x.__dict__ for x in assets])

So far it seems to have worked charmingly for my needs.

Tripos answered 11/6, 2014 at 19:33 Comment(0)
1

I have a similar problem, but the json.dump function is not called by me. So, to make MyClass JSON serializable without passing a custom encoder to json.dump, you have to monkey patch the json encoder.

First, create your encoder in your module my_module:

import json

# Keep a reference to the stock default() so the patched encoder can still
# fall back to it for types it does not know how to handle.
_original_default = json.JSONEncoder.default

class JSONEncoder(json.JSONEncoder):
    """To make MyClass JSON serializable you have to Monkey patch the json
    encoder with the following code:
    >>> import json
    >>> import my_module
    >>> json.JSONEncoder.default = my_module.JSONEncoder.default
    """
    def default(self, o):
        """For JSON serialization."""
        if isinstance(o, MyClass):
            return repr(o)
        return _original_default(self, o)

class MyClass:
    def __repr__(self):
        return "my class representation"

Then, as described in the docstring, monkey patch the json encoder:

import json
import my_module
json.JSONEncoder.default = my_module.JSONEncoder.default

Now even a call to json.dump in an external library (where you cannot change the cls parameter) will work for your my_module.MyClass objects.
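A quick check of the patched behaviour might look like this (a sketch, assuming the module above is importable as my_module):

import json
import my_module

json.JSONEncoder.default = my_module.JSONEncoder.default
print(json.dumps({"obj": my_module.MyClass()}))
# Expected output: {"obj": "my class representation"}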

Ajax answered 25/4, 2018 at 14:44 Comment(0)
0

Rereading this now, I feel a bit silly about my two possible solutions; of course, if you use django-rest-framework, that framework has some excellent features built in for the problem mentioned above.

See this model view example on their website.

If you're not using django-rest-framework, this can help anyway:

I found two helpful solutions for this problem on this page (I like the second one the most!).

Possible solution 1 (or way to go): David Chambers Design made a nice solution.

I hope David does not mind my copy-pasting his solution code here:

Define a serialization method on the instance's model:

def toJSON(self):
    import simplejson
    return simplejson.dumps(dict([(attr, getattr(self, attr)) for attr in [f.name for f in self._meta.fields]]))

and he even extracted the method above, so it's more readable:

def toJSON(self):
    fields = []
    for field in self._meta.fields:
        fields.append(field.name)

    d = {}
    for attr in fields:
        d[attr] = getattr(self, attr)

    import simplejson
    return simplejson.dumps(d)

Please mind that it's not my solution; all the credit goes to the link included. I just thought this should be on Stack Overflow.

This could be implemented in the answers above as well.

Solution 2:

My preferred solution is found on this page:

http://www.traddicts.org/webdevelopment/flexible-and-simple-json-serialization-for-django/

By the way, I saw that the writer of this second and best solution, Selaux, is on Stack Overflow as well.

I hope he sees this, so we can talk about starting to implement and improve his code as an open solution.

Underdeveloped answered 15/8, 2012 at 19:54 Comment(0)
