What a debate!
I am a relative newcomer to Python (but with years of programming experience and a dislike of Perl), and a relative layperson when it comes to the dark art of Apache setup, but I know what I (think I) need to get my little experimental projects working at home.
Here is my summary of what the situation seems to be.
If I use the -m 'module' approach, I need to:
- dot it all together (package.subfolder.module instead of a file path);
- run it from a parent folder;
- lose the '.py';
- create an empty (!) __init__.py file in every subfolder.
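For the record, here's a rough sketch of what that means for a hypothetical layout (the script name is made up):

py/
    build/
        __init__.py      <- empty; only there to mark 'build' as a package
        make_pages.py    <- the script I actually want to run

and then, from inside py/ (not from inside build/):

python -m build.make_pages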
How does that work in a CGI environment, where I have aliased my scripts directory, and want to run a script directly as /dirAlias/cgi_script.py??
Why is amending sys.path a hack? The Python documentation page states: "A program is free to modify this list for its own purposes." If it works, it works, right? The bean counters in Accounts don't care how it works.
I just want to go up one level and down into a 'modules' directory:
.../py
    /cgi
    /build
    /modules
So my 'modules' can be imported from either the CGI world or the server world.
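In other words, I want a script sitting in py/cgi or py/build to be able to say something like this (the module name is just a placeholder) and have it resolve to a file in py/modules:

# in py/cgi/cgi_script.py, or equally in a script under py/build
import shared_stuff      # should be found as py/modules/shared_stuff.py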
I've tried the -m /modules approach, but I think I prefer the following (and am not confused about how to run it in CGI space):
Create XX_pathsetup.py in the /path/to/python/Lib directory (or any other directory in the default sys.path list). 'XX' is some identifier that declares an intent to set up my path according to the rules in the file.
In any script that wants to be able to import from the 'modules' directory in the above directory layout, simply import XX_pathsetup (no '.py' on the import, of course).
And here's my really simple XX_pathsetup.py file:
import sys, os
# sys.path[0] is the directory of the script that was launched; strip the last
# path component to get its parent (the py/ directory in the layout above)
pypath = sys.path[0].rsplit(os.sep, 1)[0]
# put the sibling 'modules' directory at the front of the import search path
sys.path.insert(0, pypath + os.sep + 'modules')
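A script that wants the shared modules then just does something like this (the module and function names are placeholders for illustration):

import XX_pathsetup      # importing it tweaks sys.path as a side effect
import shared_stuff      # now found in the sibling 'modules' directory

print("Content-Type: text/html\n")
print(shared_stuff.render_page())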
It is not a 'hack', IMHO. It is one small file to put in the Python 'Lib' directory and one import statement which declares intent to modify the path search order.