I have a PL/Python function which does some JSON magic. For this it obviously imports the json library.
Is the import executed on every call to the function? Are there any performance implications I have to be aware of?
The import is executed on every function call. This is the same behavior you would get if you wrote a normal Python module with the import statement inside a function body as opposed to at the module level.
Yes, this will affect performance.
You can work around it by caching your imports like this:
CREATE FUNCTION test() RETURNS text
LANGUAGE plpythonu
AS $$
if 'json' in SD:
    json = SD['json']
else:
    import json
    SD['json'] = json
return json.dumps(...)
$$;
This is admittedly not very pretty, and better ways to do this are being discussed, but they won't happen before PostgreSQL 9.4.
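Outside PostgreSQL, the logic of this workaround can be sketched in plain Python. Here SD is just an ordinary dictionary standing in for PL/Python's per-function static-data dictionary, and the payload passed to json.dumps is a made-up example:

```python
SD = {}  # stand-in for PL/Python's per-function SD dictionary

def test():
    # Reuse the module object cached by an earlier call, if any.
    if 'json' in SD:
        json = SD['json']
    else:
        import json
        SD['json'] = json
    return json.dumps({"answer": 42})
```

The first call pays the import cost and stores the module in SD; every later call is just a dictionary lookup.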
Python caches imported modules in sys.modules, and subsequent imports will simply fetch the module from there. – Oliguria

Run do $$ import functools $$ language plpython3u; and then:

do $$
import sys
if 'functools' in sys.modules:
    plpy.notice('functools has already been imported')
else:
    plpy.notice('functools has not been imported')
$$ language plpython3u;

The output is 'functools has already been imported'. – Nocturne

The body of a PL/Python function eventually becomes an ordinary Python function and thus behaves as one. When a Python function imports a module for the first time, the module is cached in the sys.modules dictionary (https://docs.python.org/3/reference/import.html#the-module-cache). Subsequent imports of the same module simply bind the import name to the module object found in that dictionary. In a sense, this casts some doubt on the usefulness of the tip given in the accepted answer, since Python already does similar caching for you, making it somewhat redundant.
To sum things up, I'd say that if you import in the standard way, simply using the import or from ... import constructs, then you need not worry about repeated imports, in functions or otherwise: Python has got you covered.
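The module cache is easy to observe from plain Python; this minimal sketch shows that a function-level import hands back the very same module object on every call:

```python
import sys

def uses_json():
    # After the first call, this import is essentially a sys.modules lookup.
    import json
    return json

first = uses_json()
second = uses_json()
print(first is second)               # prints True: same cached module object
print(sys.modules['json'] is first)  # prints True: cached under its name
```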
On the other hand, Python allows you to bypass its native import semantics and implement your own (with the __import__() function and the importlib module). If that is what you're doing, maybe you should review what's available in the toolbox (https://docs.python.org/3/reference/import.html).
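For instance, importlib.import_module also consults the cache first, so it normally hands back the same object as a plain import; importlib.reload is the explicit way to re-execute a module:

```python
import importlib
import json

# import_module checks sys.modules first, so this is the cached object.
mod = importlib.import_module('json')
print(mod is json)  # prints True

# reload() re-runs the module's code but reuses the same module object.
reloaded = importlib.reload(json)
print(reloaded is json)  # prints True
```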
Beware: I ran into this with a slow import in a BEFORE INSERT OR UPDATE ON my_table trigger. This trigger caused a noticeable 3-4 second delay for the first triggering transaction of every database connection. Ouch. I fixed it via a manual preload on connection initialization (moving the import delay up front). –
Telethermometer
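One hedged sketch of such a preload, assuming plpython3u and that json stands in for the slow module: run a one-off DO block from the application's connection-initialization hook, so the import cost is paid before the first trigger fires:

```sql
-- Run once right after connecting; the module then sits in the backend's
-- Python interpreter (in sys.modules) for the lifetime of this connection.
DO $$ import json $$ LANGUAGE plpython3u;
```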