So, let's imagine I've got a Python package of library code with 20 logically separated modules, and I want to select one or two classes from each of them for the package's public API. Rather than forcing users to import these classes from the modules directly, I want to make them available from the package's namespace, within `__init__.py`.
But I don't want everything to be loaded eagerly every time: importing all 20 modules whenever someone needs just one class from a single one of them is a huge waste (some of them contain their own expensive imports). So I implement a module-level `__getattr__()` as per PEP 562 (https://www.python.org/dev/peps/pep-0562/) and use `importlib` within it to load the module of a given class the first time someone accesses that class.
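To make the setup concrete, here is a minimal runnable sketch of the pattern. The package and submodule names (`mypkg`, `mypkg.heavy`, `Widget`) are hypothetical, and both modules are fabricated in memory so the example is self-contained; in a real package the `__getattr__` function would simply sit at the top level of `__init__.py`:

```python
import importlib
import sys
import types

# Hypothetical layout: package "mypkg" with an expensive submodule
# "mypkg.heavy" defining class Widget. Fabricated in memory here so
# the sketch runs on its own.
heavy = types.ModuleType("mypkg.heavy")
exec("class Widget:\n    pass", heavy.__dict__)

pkg = types.ModuleType("mypkg")
pkg.__path__ = []  # mark "mypkg" as a package
sys.modules["mypkg"] = pkg
sys.modules["mypkg.heavy"] = heavy

# The PEP 562 part: map each public name to the submodule that
# defines it, and import that submodule only on first access.
_lazy_names = {"Widget": "mypkg.heavy"}

def __getattr__(name):
    if name in _lazy_names:
        module = importlib.import_module(_lazy_names[name])
        obj = getattr(module, name)
        setattr(pkg, name, obj)  # cache: later accesses bypass __getattr__
        return obj
    raise AttributeError(f"module 'mypkg' has no attribute {name!r}")

# In a real package this definition lives directly in __init__.py;
# assigning it into the module's dict has the same effect.
pkg.__getattr__ = __getattr__

import mypkg
print(mypkg.Widget)  # the submodule is only imported at this point
```

The `setattr` call caches the resolved object on the package, so `__getattr__` only fires once per name.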
This is a relatively clean solution, but the part that makes it a nightmare is that it completely defeats static analysis tools like Jedi and PyCharm. Autocompletion and on-hover docstrings are a huge deal to me, since they massively increase productivity, so I don't want to write library code that IDEs cannot understand.
I could write typing stubs, but that would add a maintenance burden when all of my code is already type-annotated and has its docstrings inline. It's not a great solution.
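For reference, a stub would mean maintaining a parallel `mypkg/__init__.pyi` file (names hypothetical) that re-exports the public API, duplicating information the source already carries:

```python
# mypkg/__init__.pyi -- hypothetical stub file; type checkers read this
# instead of the real __init__.py, so they see the names eagerly.
# The "as" re-export form marks the names as intentionally public.
from mypkg.heavy import Widget as Widget
from mypkg.other import Gadget as Gadget
```

Every addition or signature change to the public API would then have to be mirrored here, which is exactly the duplication I want to avoid.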
Does anyone have an idea how else I could go about this? I'm hoping there's some clever way around this that I just haven't thought about.
How does `typing` help with the missing type annotations? How does defining a variable and then just deleting it do anything? In the comment, you mentioned that it replaces `typing.TYPE_CHECKING`, but `typing.TYPE_CHECKING` isn't mentioned in the linked PEP 562. – Corene
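For anyone else puzzled by the comment above: `typing.TYPE_CHECKING` is a constant that is `False` at runtime but assumed `True` by static analyzers, so imports guarded by it are visible to IDEs and type checkers without ever executing. A minimal sketch (the `mypkg.heavy`/`Widget` names are hypothetical, which is also why the guarded import is safe to leave unresolved at runtime):

```python
from typing import TYPE_CHECKING

# Static analyzers treat this branch as taken, so they see Widget and
# can offer completion/docstrings; at runtime the import never runs,
# so the expensive submodule stays unloaded.
if TYPE_CHECKING:
    from mypkg.heavy import Widget  # hypothetical submodule

print(TYPE_CHECKING)  # False
print("Widget" in globals())  # False
```

Combined with the PEP 562 `__getattr__`, this gives lazy loading at runtime while keeping the names statically discoverable.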