Package pxd definitions from different installed packages - cython

I have an installed python module with cython extensions. Now I am writing a second (different) cython module that wants to import extensions from the installed cython module. However, it is not able to find the definition files of the first module.
The first module has .../python3.8/site-packages/plexim[version]/EGG-INFO/SOURCES.txt as follows:
setup.py
...
plexsim/models.cpp
plexsim/models.pxd
...
This looks good, as the pxd is packaged with the module.
In the other module I want to cimport from models.pxd. However, when I try to install this other extension module, it cannot find the extension definitions when doing
from plexsim.models cimport *
How do I package the data correctly such that the other module sees the definition from the already installed module?
My setup.py looks as follows:
setup(
    package_dir = {"": "imi"},
    namespace_packages = find_namespace_packages(include = ["imi.*"]),
    package_data = {"": "*.pxd *.pyx".split()},
    ext_modules = cythonize(
        exts,
        language_level = 3,
        compiler_directives = cdirectives,
        nthreads = mp.cpu_count(),
    ),
)
Thanks

After hours of debugging I figured the path issue out. Adding an __init__.pxd to the package seems to have cured the problem. For future reference, be mindful of whether setuptools actually finds the pxd files.
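For reference, here is a minimal sketch of the layout and the setup() pieces that make the pxd files visible to a second module. The file names follow the question; the remaining setup() arguments are illustrative assumptions, not the actual setup.py.

# Layout of the installed package, with the __init__.pxd that fixed the cimport lookup:
#
#   plexsim/
#       __init__.py
#       __init__.pxd      # presence of this file lets `from plexsim.models cimport *` resolve
#       models.pyx
#       models.pxd

from setuptools import setup, find_packages
from Cython.Build import cythonize

setup(
    name="plexsim",
    packages=find_packages(),
    # Ship the definition files next to the compiled extension so that a
    # downstream cimport can find them in site-packages.
    package_data={"plexsim": ["*.pxd", "*.pyx"]},
    zip_safe=False,  # cimport needs real files on disk, not files inside a zipped egg
    ext_modules=cythonize(["plexsim/models.pyx"], language_level=3),
)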

Related

Cython: Is there a use case for compiling multiple pyx-files into one extension module

I was looking for a method to compile multiple pyx-files using only a setup.py file. The solution was found in the official documentation:
ext_modules = [Extension("*", ["*.pyx"])]
setup(ext_modules=cythonize(ext_modules))
This will compile all pyx-files within the current directory and create a shared-object for each single file. However, someone suggested this alternative:
ext_modules = [Extension("*", [file for file in os.listdir("folder_where_your_pyx_files_are")
                               if file.endswith('.pyx')])]
setup(ext_modules=cythonize(ext_modules))
This will compile all pyx-files into one shared object.
However, none of the imports work properly (e.g. if there are imports between the files, none of them will work).
So my question is:
Is there a use case for compiling multiple pyx-files into one extension module?
Note: I am new to Cython and have little knowledge about the Extension module.
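For illustration only (not part of the original question), this is the kind of inter-file import that breaks once everything is merged into a single extension; a.pyx and b.pyx are hypothetical file names:

# a.pyx (hypothetical)
def helper():
    return 42

# b.pyx (hypothetical)
from a import helper   # fine while a.pyx is compiled into its own module "a"

def use_helper():
    return helper()

# Once both files are compiled into one shared object, there is no importable
# module named "a" any more, so the `from a import helper` line in b.pyx fails
# with an ImportError at import time.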

No module named _caffe

_caffe.so is present in the caffe/python/caffe folder
I have set the path variable as export PYTHONPATH=/home/itstudent1/caffe/python:$PYTHONPATH.
make pycaffe was also successful.
I do not understand what else might be the cause of this error. I am able to import caffe in python.
File "/home/itstudent1/MajorProject/densecap-master/lib/tools/../../python/caffe/pycaffe.py", line 13, in <module>
    from ._caffe import Net, SGDSolver, NesterovSolver, AdaGradSolver, \
ImportError: No module named _caffe
It seems like you have two versions of caffe:
one in /home/itstudent1/caffe and another in /home/itstudent1/MajorProject/densecap-master.
While the first version is built and compiled, the latter is not, and your import looks for _caffe.so in the latter.
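A quick diagnostic (my addition, not part of the original answer) to check which copy of caffe is picked up and whether a compiled _caffe.so sits next to it:

import os
import caffe

# Directory of the caffe package that PYTHONPATH resolved to
caffe_dir = os.path.dirname(caffe.__file__)
print("caffe imported from:", caffe_dir)

# pycaffe expects the compiled extension in the same directory as pycaffe.py;
# False here means this copy of caffe has not been built with `make pycaffe`.
print("_caffe.so present:", os.path.exists(os.path.join(caffe_dir, "_caffe.so")))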

how to prepare cython submodules

I have three questions, but they are related, and I am not sure how to split them up well. I've found a lot of information about these issues, like submodule extensions, hierarchy, having an empty __init__.py file, and how to cythonize multiple pyx files. But when I try to combine them I cannot make them work.
I've prepared a small repo intended to collect code samples of the issues once solved. I've even checked the code of some of the listed projects that use Cython, but I still don't get how to do these three things at the same time.
Empty __init__.py file:
In a project where all the files are pyx (and pxd if need be), with an __init__.pyx that includes all of them, when there is also an __init__.py file, importing the package loads the empty __init__.py instead of the ".so".
cythonize multiple files:
When, instead of preparing an __init__.pyx that includes all the elements of a module, I cythonize the files separately, like:
cythonize(['factory.pyx', 'version.pyx'])
importing the resulting ".so" raises an exception:
>>> import barfoo
(...)
ImportError: dynamic module does not define init function (PyInit_barfoo)
This may be related to the previous question, if it turns out that something needs to be written in __init__.py.
Submodule:
In fact, this is the main question. The singleton.pyx would be part of a submodule, let's say utils, to be used from other elements in the module.
For the sample there is a submodule (simply called subm) added in the setup.py as another extension. I've placed it earlier than the main one (I don't know if this really makes any difference; I didn't see one).
>>> import barfoo
>>> import barfoo.subm
(...)
ImportError: No module named 'barfoo.subm'
Separately, those recipes work, but together they don't. The "__init__.py" seems to be necessary when there is a mix of "py" and "pyx" files. The examples explain how to cythonize multiple files, but don't include the last key point for the import. And the submodule examples don't explain how they can be imported from one place or another (import the submodules when the base one is imported, or only optionally when they are requested explicitly).
Thanks to the comments from oz1 and DavidW, I've got the solution. Yes, those three things come together.
The order of the imports in the setup.py is very important. Even though PEP 8 doesn't say that imports should be alphabetically sorted, there are other guidelines (like reddit's) that do, and which I usually follow:
If cythonize is imported first and then setup, then when cythonize(find_pyx()) is called the result will be a list of distutils.extension.Extension objects.
from setuptools import setup, find_packages
from Cython.Build import cythonize
setuptools must be imported before Cython, and then the result of cythonize() will be a list of setuptools.extension.Extension objects that can be passed to the setup() call.
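Putting both points together, the top of setup.py could look like this. find_pyx() is only named in the answer, so this implementation of it is an assumption:

import glob

# setuptools first, so cythonize() yields setuptools (not distutils) Extension objects
from setuptools import setup, find_packages
from Cython.Build import cythonize

def find_pyx(root="barfoo"):
    # Assumed helper: collect every .pyx file under the package tree
    return glob.glob(f"{root}/**/*.pyx", recursive=True)

setup(
    name="barfoo",
    packages=find_packages(),
    ext_modules=cythonize(find_pyx(), language_level=3),
)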
It is important to understand the meaning of each of the __init__ files:
All the __init__.pyx files with includes have been removed, and each .pyx file produces its own .so binary.
The main module and the submodules will exist as long as their directories contain an __init__.py file, just as happens with pure Python code.
In the example I've linked, the file barfoo/__init__.py is not empty because I want import barfoo to provide access to elements like version() or Factory(). This __init__.py is what imports them, just like normal pure Python.
For the submodule:
The same applies to the submodule and its own __init__.py file. In this example import barfoo will do a from .factory import Factory, which will call from barfoo.subm import Bar, and then subm will be available.
But if the submodule is not imported in this indirect way, the user still has access to it with calls like import barfoo.subm.
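As a sketch, those __init__.py files might contain something like the following (module and class names are taken from the answer; the exact contents of the linked repo may differ):

# barfoo/__init__.py -- intentionally not empty: expose the public API
from .version import version
from .factory import Factory     # factory.pyx itself does `from barfoo.subm import Bar`

# barfoo/subm/__init__.py -- its presence is what makes `import barfoo.subm` work
from .bar import Bar             # hypothetical module name; the answer only says Bar lives in subm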
Last night I saw your question, and made a simple example according to the wiki. But that question was deleted quickly.
Here is the example: https://github.com/justou/cython_package_demo
Make sure the settings of the C compiler and Python environment are correct, then compile the pyx files by running:
python setup.py build_ext --inplace
Usage is the same as for a Python package:
from dvedit.filters import flip, inverse, reverse
flip.show() # print: In flip call Core function
inverse.show() # print: In inverse call Core function
reverse.show() # print: In reverse call Core function
BTW, there is no need to create an __init__.pyx; you can do the ext_module imports in the __init__.py file.
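For instance, a plain dvedit/filters/__init__.py with contents like the following (hypothetical, matching the usage shown above) is enough:

# dvedit/filters/__init__.py
# Pure-Python __init__ that re-exports the compiled extension modules, so that
# `from dvedit.filters import flip, inverse, reverse` works without an __init__.pyx.
from . import flip, inverse, reverse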

Plone/SQLAlchemy(?) - How can I import a python package (i.e. sqlalchemy) in a module in a subpackage?

I am trying to import sqlalchemy in a module in a subpackage.
Here is my folder layout
PloneInstance
    my.package
        my
            package
                subpackage
In the buildout.cfg file of the root folder, I add "sqlalchemy" to the eggs.
In my.package, in configure.zcml, I add:
In the subpackage, I have a blank __init__.py file, a configure.zcml file, and a file called mymodule.py
In mymodule.py I have a line for importing sqlalchemy
import sqlalchemy
Unfortunately, I am getting an error when I try to run an instance:
ImportError: No module named sqlalchemy
I'm assuming I am missing a step. How do I properly import python packages?
Thank you in advance. I apologize if my terminology is off.
Edit:
The module in question I am importing from turned out to be zope.sqlalchemy.
I accidentally overlooked this because prior to moving files to a subpackage, the import statement for zope.sqlalchemy was working without adding zope.sqlalchemy to the eggs section of the buildout.
Look in the setup.py file at the top directory of your package. You'll find a section like:
install_requires=['setuptools',
# -*- Extra requirements: -*-
],
In place of the "Extra requirements" comment, put a comma-separated list of strings specifying your package's requirements. You may even specify versions.
Do not add standard Plone packages to the list. They're taken for granted.
Re-run buildout after specifying your requirements. The result is that the new install_requires entries will be added to your Python environment when you start Plone.
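Given the edit to the question (the missing dependency turned out to be zope.sqlalchemy), the section would end up looking roughly like this:

install_requires=['setuptools',
                  # -*- Extra requirements: -*-
                  'zope.sqlalchemy',   # the package mymodule.py actually needs
                  'SQLAlchemy',        # only if you also `import sqlalchemy` directly
                  ],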

making one pyd for a set of files with cython [duplicate]

This question already has answers here:
Collapse multiple submodules to one Cython extension
I have multiple .py files in one package
packageA
    __init__.py
    mod1.py
    mod2.py
    mod3.py
can I configure Cython to compile and then pack them all into one packageA.pyd?
Personally, I would rather turn all the .py files into .pyx files, then include them in the main .pyx of the Cython extension:
packageA.pyx:
include "mod1.pyx"
include "mod2.pyx"
include "mod3.pyx"
Then, compile using a setup.py looking like:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    cmdclass = {'build_ext': build_ext},
    ext_modules = [
        Extension("packageA", sources=["packageA.pyx"])
    ]
)
Running this would generate an all-in-one packageA.pyd binary file.
Of course, this will output a single module named packageA, and I don't know if this is acceptable for you, or if you really need distinct modules in your package.
But there might be other ways that better fit your question...