I've managed to use MinGW on Linux to build some DLLs that my Python extension needs. Something along these lines:
import subprocess

from setuptools.command.build_py import build_py

# EXTRA_DLL, FAKE_WIN_BINDINGS and ARCHIVE_GENERATED_IN_PREVIOUS_STEP are
# placeholders for the real paths used in my project.

class BuildGo(build_py):
    def run(self):
        if ...:  # need to build Windows binaries (placeholder condition)
            self.build_win()
        build_py.run(self)

    def build_win(self):
        if ...:  # compilers and toolchain available (placeholder condition)
            try:
                # builds extra libraries necessary for this extension
                # (produces the archive that gets linked below)
                ...
            except subprocess.CalledProcessError as e:
                print(e.stderr)
                raise
            try:
                result = subprocess.check_output(
                    [
                        'x86_64-w64-mingw32-gcc-win32',
                        '-shared',
                        '-pthread',
                        '-o', EXTRA_DLL,
                        FAKE_WIN_BINDINGS,
                        ARCHIVE_GENERATED_IN_PREVIOUS_STEP,
                        '-lwinmm',
                        '-lntdll',
                        '-lws2_32',
                    ],
                    stderr=subprocess.PIPE,  # capture stderr so e.stderr is populated on failure
                )
                print(result)
            except subprocess.CalledProcessError as e:
                print(e.stderr)
                raise
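For context, the command gets hooked into setup() roughly like this (the project name and package list are placeholders, not my real ones):

from setuptools import setup

setup(
    name='myproject',                # placeholder
    version='0.1',
    packages=['myproject'],          # placeholder
    cmdclass={'build_py': BuildGo},  # run the extra Windows build during build_py
)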
I was now hoping I could avoid extending build_ext in the same painful way to get it to cross-compile Cython code for Windows... I looked into the abyss of the "elegant interplay of setuptools, distutils and cython", and before the abyss has a chance to look back into me: isn't there a way to just specify some flag, like the name of a compiler and a Python binary for the desired platform, and have it just do it?
I've read this article: http://whatschrisdoing.com/blog/2009/10/16/cross-compiling-python-extensions/ - it's almost 10 years old, and it just made me want to cry... has anything changed since it was written? Or are these steps more or less what I'll have to do to compile for a platform other than the one I'm running on?
Or, is there an example project on the web which does it?
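From what I can tell, the approach described there still boils down to subclassing build_ext and swapping in the cross toolchain by hand. A rough sketch of what I expect I'd have to write (the /opt/win-python paths and the python37 library name are placeholders I made up; the headers and import libraries would have to come from a Windows CPython, not the Linux host):

from setuptools.command.build_ext import build_ext

MINGW = 'x86_64-w64-mingw32-gcc-win32'

class CrossBuildExt(build_ext):
    def build_extensions(self):
        # Replace the host toolchain with the MinGW cross-compiler.
        self.compiler.set_executables(
            compiler=[MINGW, '-O2', '-Wall'],
            compiler_so=[MINGW, '-O2', '-Wall'],
            linker_so=[MINGW, '-shared'],
            linker_exe=[MINGW],
        )
        for ext in self.extensions:
            # Point at headers and import libs from a Windows CPython
            # (placeholder paths, adjust to the real layout).
            ext.include_dirs.append('/opt/win-python/include')
            ext.library_dirs.append('/opt/win-python/libs')
            ext.libraries.append('python37')
        build_ext.build_extensions(self)

Even then I'd apparently still have to deal with the output file name (build_ext uses the host's EXT_SUFFIX, so I'd get a ...-linux-gnu.so instead of a .pyd), which is exactly the kind of bookkeeping I was hoping a single flag would hide.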
Goal
My ultimate goal is to produce an egg package which contains both PE and ELF binaries and installs them in the correct location on either platform when installed by pip or pipenv. It should compile on Linux (compiling it on MS Windows isn't necessary).
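To make that concrete, the packaging side I have in mind looks roughly like this (the package name mypkg, the lib/ layout and the glob patterns are placeholders): ship both kinds of binaries as package data and let the platform decide which one actually gets used.

from setuptools import setup

setup(
    name='mypkg',                                        # placeholder
    version='0.1',
    packages=['mypkg'],
    package_data={'mypkg': ['lib/*.dll', 'lib/*.so']},   # ship both PE and ELF
    zip_safe=False,  # the binaries need to exist as real files on disk
)

If the DLLs are only link-time dependencies of the Cython module, having them installed next to it should be enough on Windows; if they're loaded explicitly, a sys.platform check around ctypes.CDLL would pick the right file at import time.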