I have a table (a Datastore kind) in Google Cloud Datastore where I store a small data structure that one Python service writes and another reads. I am using gcloud version 0.15.0. Here is the Python code I use to write data to and read data from GCD:
from gcloud import datastore
import datetime
import json


class GCD(object):
    def __init__(self, project_id):
        self.client = datastore.Client(project_id)

    def put(self, table, key, data):
        with self.client.transaction():
            # The payload is stored as a JSON string; 'context' is excluded from indexes.
            entity = datastore.Entity(self.client.key(table, key),
                                      exclude_from_indexes=['context'])
            entity.update({'context': json.dumps(data),
                           'created': datetime.datetime.utcnow(),
                           'done': True})
            try:
                self.client.put(entity)
            except Exception as e:
                print "GCD save failed with exception: %s" % e
                return None

    def get(self, table, key):
        entity_key = self.client.key(table, key)
        entity = None
        try:
            entity = self.client.get(entity_key)
        except Exception as e:
            print "GCD read failed with exception: %s" % e
        if not entity:
            return None
        else:
            return json.loads(entity['context'])
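For context, the wrapper is used roughly like this (the kind, key, and payload below are made up for illustration):

gcd = GCD('my-project-id')                            # project id is illustrative
gcd.put('Task', 'task-123', {'payload': [1, 2, 3]})
data = gcd.get('Task', 'task-123')                    # decoded dict, or None on failure/miss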
I am observing a large number of read/write failures with the message "The read operation timed out"; more than 5% of calls fail, which is far higher than the roughly 1-in-30,000 failure rate the documentation suggests.
My questions then are:
1. Is it possible to increase the timeout used by the datastore.Client.get and datastore.Client.put calls? I am not looking for retry-based answers; I have already tried retries and don't want to depend on them alone. (A rough sketch of what I am imagining follows the list.)
2. Is there anything I should do when creating the kind ("table") or when setting up the client that could mitigate these timeout errors?
3. I read in https://github.com/GoogleCloudPlatform/gcloud-python/issues/1214 that gcloud-python uses httplib2.Http, which is not thread-safe and has timeout issues. Is there a way to use the (more stable) requests package instead?
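For reference, here is roughly what I have in mind for question 1: building the client on top of an httplib2.Http with an explicit socket timeout and passing it in through the Client's http argument. I have not verified that this http object is actually used for every RPC, and the 60-second value is just a guess:

import httplib2
from gcloud import datastore
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
# httplib2.Http takes a socket timeout in seconds; 60 is an arbitrary choice
http = credentials.authorize(httplib2.Http(timeout=60))
client = datastore.Client('my-project-id', http=http)  # project id is illustrative

Since httplib2.Http is not thread-safe, I assume I would also need one such client per thread (stored in a threading.local(), for example), which is part of why I am asking about requests.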
Thanks,