The repo works on my machine but not on gcloud.
Structure of repo (on Google Cloud Source Repo):
project/
├── localdep.py
└── mylocalpackage/
    └── main.py
In main.py:
import localdep
Yet I receive the following error:
ModuleNotFoundError: No module named 'localdep'
What am I doing wrong?! There is no problem running this in PyCharm on my machine, yet when I push to gcloud there is...
The proper structure is to have main.py at the top level and the other files in nested folders. You can take a look at this, which talks about Structuring Your Project.
In addition, I have tried both from ..localdep import * and from ..package import localdep, and others like from ... import localdep, but I receive either ImportError: attempted relative import with no known parent package or ValueError: attempted relative import beyond top-level package.
It is worth rethinking your project structure.
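A minimal sketch of that kind of restructuring, reusing the names from the question (whether you keep the mylocalpackage name is up to you):
project/
├── main.py
└── mylocalpackage/
    ├── __init__.py      # makes the folder an importable package
    └── localdep.py
With that layout, main.py can use an absolute import, which also sidesteps the relative-import errors mentioned above:
from mylocalpackage import localdep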
In my Vue web project I would like to read a local JSON file.
This JSON should not be included in the webpack bundle.
From the Vue CLI documentation:
"Any static assets placed in the public folder will simply be copied and not go through webpack. You need to reference them using absolute paths."
https://cli.vuejs.org/guide/html-and-static-assets.html#the-public-folder
I created the Hello World application from Vue CLI with Vue 3.
Now when I import a static JSON file from the public folder, it does get copied as-is to the destination folder, but it gets included in the webpack bundle as well.
import config from '/public/config.json'
Any help is much appreciated!
My folder structure is like this:
cloud_fn_dir
├── cf1_dir
│   ├── main.py
│   ├── util.py
│   ├── requirements.txt
│   └── test_main_cf1.py
├── cf2_dir
│   ├── main.py
│   ├── requirements.txt
│   └── test_main_cf2.py
└── cf3_dir
    ├── main.py
    ├── requirements.txt
    └── test_main_cf3.py
I am executing unit tests and generating a coverage report using the command pytest -v --cov=main --cov-report=html.
If I execute the command from inside one of the Cloud Function folders, i.e. cf1_dir or cf2_dir, then pytest works as expected, executes the unit tests, and generates the report inside that folder.
But I want to execute all the unit tests at once and generate a single report, so I tried running the same command from the outermost folder, i.e. cloud_fn_dir, and encountered an ImportError where test_main_cf3.py ends up importing a class from the main.py of cf1_dir.
Each test file imports its respective main.py, creates an object, and calls the methods under test. So when I execute the pytest command, the last test file, i.e. test_main_cf3.py, ends up importing the main.py of cf1_dir and executing its methods. Pytest keeps reusing the first main.py it encountered.
How do I resolve this import error?
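One way around the name collision, as a sketch rather than a definitive recipe, is to load each function's main.py under a unique module name inside its own test file, so the three copies of main no longer compete for the same entry in sys.modules (the name cf1_main below is purely illustrative):
# cf1_dir/test_main_cf1.py (sketch; repeat with different names for cf2/cf3)
import importlib.util
import pathlib

# Load the sibling main.py under a unique module name so it does not clash
# with the main.py of the other cloud function folders.
_spec = importlib.util.spec_from_file_location(
    "cf1_main", pathlib.Path(__file__).parent / "main.py")
main = importlib.util.module_from_spec(_spec)
_spec.loader.exec_module(main)

def test_example():
    # use the freshly loaded module exactly as before, e.g. main.MyClass()
    assert main is not None
With that in place, running something like pytest -v --cov=. --cov-report=html from cloud_fn_dir should collect all three test folders into a single report (the --cov=. target is an assumption; point it at whatever you want measured).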
I have a file that I want to "cythonize", myfile.pyx. I also have a helper file, myhelper.pxi.
myfile.pyx contains within it the following line:
include 'myhelper.pxi'
If the helper file was not needed, I think setup.py would look like this:
from distutils.core import setup
from Cython.Build import cythonize
import numpy
setup(
    ext_modules = cythonize("myfile.pyx")
)
Can you tell me how to properly incorporate the helper file into the setup.py code?
It's fine as is - setup.py only needs to know about "myfile.pyx". It doesn't need to know about the internal details of "myfile.pyx" (i.e. what files it textually includes).
This does mean that setup.py won't recompile things when you've only changed "myhelper.pxi" because it doesn't know about the dependency. There isn't really a good way round that. Use --force on the command line to make setup rebuild everything if that's a problem.
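If stale builds become a nuisance, one option is to declare the include file as an explicit dependency of the extension so the build step knows to rebuild when it changes. This is only a sketch, assuming a distutils/setuptools-style Extension with the names from the question:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize

ext = Extension(
    "myfile",
    sources=["myfile.pyx"],
    depends=["myhelper.pxi"],  # rebuild when the included helper changes
)

setup(
    ext_modules=cythonize([ext])
)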
Here is a simple Cython package:
foo/
    __init__.py   # Contains: from . import foo
    foo.pyx
I use waf (I know pyximport or setup.py can be used too) to build the Python extension from foo.pyx:
foo/
    __init__.py   # Contains: from . import foo
    foo.pyx
build/
    foo/
        foo.cpython-35m-x86_64-linux-gnu.so
I would like to import the foo package without installing it (development mode).
If the Python extension were next to __init__.py like this:
foo/
    __init__.py   # Contains: from . import foo
    foo.cpython-35m-x86_64-linux-gnu.so
it would work, but __init__.py is in the source directory while the .so file is in the build directory.
I would like to avoid copying .so files into the source directory.
One solution would be to copy (with waf) foo/__init__.py to build/foo/__init__.py and import the foo package from build/.
Are there other alternatives?
Source code is here.
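One alternative, as a sketch (it assumes build/ sits next to the foo/ source directory, which matches waf's default layout; adjust the relative path otherwise), is to extend the package's __path__ in __init__.py so the compiled extension is found in the build tree without copying anything:
# foo/__init__.py (sketch)
import os

# Also search the waf build output for submodules of this package, so the
# compiled foo.*.so is importable during development without installing.
_build_dir = os.path.join(os.path.dirname(__file__), os.pardir, "build", "foo")
if os.path.isdir(_build_dir):
    __path__.append(_build_dir)

from . import foo
This keeps the .so out of the source tree while still letting from . import foo resolve, at the cost of a small development-only hack in __init__.py.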
I started a Django 1.7 OpenShift instance. When I have Python print all of the paths in sys.path, I do not see OPENSHIFT_REPO_DIR (/var/lib/openshift/xxxxx/app-root/runtime/repo).
When I use https://github.com/jfmatth/openshift-django17 to create a project I do see OPENSHIFT_REPO_DIR in the path.
Looking through the example app above I don't see anywhere that this is specifically added to the path. What am I missing?
To clarify:
I have to add the following to my wsgi.py:
import os
import sys
ON_PASS = 'OPENSHIFT_REPO_DIR' in os.environ
if ON_PASS:
    x = os.path.abspath(os.path.join(os.environ['OPENSHIFT_REPO_DIR'], 'mysite'))
    sys.path.insert(1, x)

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
OPENSHIFT_REPO_DIR is not in my path as I would expect. When I used the example repo above, I did not have to add anything to the path.
A little while back I had issues with some of the pre-configured OpenShift environment variables not appearing until I restarted my application.
For what it's worth, I started up a brand new Django gear, printed the environment variables to the application log, and verified that I do see OPENSHIFT_REPO_DIR (and all the other env vars) properly.
This issue appears to be caused by using the standard file structure that Django produces when you run startproject. OpenShift appears to need a flatter file structure. As soon as I moved wsgi.py up to be a sibling of mysite, the issue was resolved.
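For reference, a sketch of the flatter layout described above (names taken from the question; mysite's contents are just the usual startproject files):
repo/                # OPENSHIFT_REPO_DIR
├── wsgi.py          # moved up, sibling of mysite/
└── mysite/
    ├── __init__.py
    ├── settings.py
    └── urls.py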