We are using an SGE cluster.
From time to time I'm getting the error below. It happens when I run the same script multiple times at once (via the cluster) on different input files.
Do you have any idea what's causing it?
If I run it again, I won't get this error...
Traceback (most recent call last):
File "/groups/pupko/mosheein/pupkoSVN/trunk/scripts/ploiDB/buildGeneraTree.py", line 10, in <module>
import MySQLdb
File "build/bdist.linux-x86_64/egg/MySQLdb/__init__.py", line 19, in <module>
File "build/bdist.linux-x86_64/egg/_mysql.py", line 7, in <module>
File "build/bdist.linux-x86_64/egg/_mysql.py", line 6, in __bootstrap__
ImportError: libmysqlclient_r.so.16: cannot open shared object file: No such file or directory
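A minimal check that could help narrow this down (a sketch; probing each node this way is just a suggestion, not part of the original post) is to submit a tiny job to every execution host and see whether the library named in the traceback actually resolves there:

# Sketch: run on each SGE node (e.g. via qsub) to see which hosts can
# resolve the MySQL client library that the traceback complains about.
import ctypes
import socket

try:
    ctypes.CDLL("libmysqlclient_r.so.16")
    print(socket.gethostname(), "OK")
except OSError as err:
    print(socket.gethostname(), "MISSING:", err)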
Related
The game runs fine from source code. When I used cx_Freeze to build a binary on Linux, I got this error on a second machine:
~/tmp/exe.linux-x86_64-2.7 $ ./rungame
/home/local/tmp/exe.linux-x86_64-2.7/library.zip/lib/MenuItem.py:13: RuntimeWarning: use font: libSDL_ttf-2.0.so.0: cannot open shared object file: No such file or directory
(ImportError: libSDL_ttf-2.0.so.0: cannot open shared object file: No such file or directory)
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/cx_Freeze/initscripts/Console.py", line 27, in <module>
File "rungame.py", line 10, in <module>
File "/lib/gameloop.py", line 13, in <module>
File "/lib/settings.py", line 10, in <module>
File "/lib/menuitem.py", line 13, in <module>
File "/usr/lib64/python2.7/site-packages/pygame/__init__.py", line 74, in __getattr__
NotImplementedError: font module not available
(ImportError: libSDL_ttf-2.0.so.0: cannot open shared object file: No such file or directory)
On the machine where I compile the binary, it runs fine. Could someone advise me?
Run this: sudo apt-get install libsdl-ttf2.0-0
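To verify the fix (a small sketch, not from the original answer), the call that failed in the traceback should now succeed:

# After installing libsdl-ttf2.0-0, pygame should be able to initialise
# its font module (this is what raised NotImplementedError above).
import pygame

pygame.font.init()
print(pygame.font.get_init())  # True once libSDL_ttf-2.0 can be loaded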
I use Google Colab to train my dataset. I uploaded my dataset to Google Drive and access it from Google Colab, but running the train.py script produces the errors below. More precisely, I run:
!python3 /content/drive/tensorflow1/models/research/object_detection/train.py --logtostderr --train_dir=/content/drive/tensorflow1/models/research/object_detection/training/ --pipeline_config_path=/content/drive/tensorflow1/models/research/object_detection/training/faster_rcnn_inception_v2_pets.config
and I get these errors:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/usr/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/content/drive/tensorflow1/models/research/object_detection/train.py", line 47, in <module>
import tensorflow as tf
File "/usr/local/lib/python3.6/dist-packages/tensorflow/__init__.py", line 24, in <module>
from tensorflow.python import pywrap_tensorflow # pylint: disable=unused-import
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/__init__.py", line 49, in <module>
from tensorflow.python import pywrap_tensorflow
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 74, in <module>
raise ImportError(msg)
ImportError: Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/usr/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/usr/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory
Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/install_sources#common_installation_problems
for some common reasons and solutions. Include the entire stack trace
above this error message when asking for help.
Do I need to install or upload CUDA 9 or cuDNN to Google Drive first so they can be found on Colab? How can I get past these errors?
Do keep in mind that you have to enable the GPU explicitly in a notebook before you can use tensorflow-gpu. I suspect that this step is missing.
To enable the GPU, use the menu: Runtime -> Change runtime type -> Hardware accelerator -> GPU.
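As a quick sanity check (sketch), you can confirm from a notebook cell that TensorFlow actually sees a GPU:

# Prints something like '/device:GPU:0' when a GPU runtime is active,
# and an empty string otherwise.
import tensorflow as tf

print(tf.test.gpu_device_name())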
Mark this as the solution if it helped, so others can benefit.
Since tensorflow-gpu>=1.5.0 requires CUDA 9, you should install tensorflow-gpu==1.4.0 instead:
pip install --upgrade tensorflow-gpu==1.4
Please refer to the two links below:
https://github.com/tensorflow/tensorflow/issues/15604
https://www.tensorflow.org/install/install_sources#tested_source_configurations
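After the downgrade, a quick check (sketch) that the matching native runtime now loads:

# Importing tensorflow loads the native runtime; then verify the
# installed version and that this build links against CUDA.
import tensorflow as tf

print(tf.__version__)                # expect 1.4.x
print(tf.test.is_built_with_cuda())  # True for the GPU build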
First, enable the GPU in your Google Colab notebook:
Go to Menu > Runtime > Change runtime type.
Change the hardware accelerator to GPU.
See also: How to install CUDA in Google Colab GPU's
I'm having trouble installing and setting up the KeyRock development environment.
We set up a virtualenv and installed the requirements according to this guide.
Everything seemed to be OK, but none of the Fabric commands work. Every time we try to run a fab command within the cloned idm directory, this error appears:
Traceback (most recent call last):
File "/root/.virtualenvs/idm_tools/lib/python2.7/site-packages/fabric/main.py", line 658, in main docstring, callables, default = load_fabfile(fabfile)
File "/root/.virtualenvs/idm_tools/lib/python2.7/site-packages/fabric/main.py", line 165, in load_fabfile imported = importer(os.path.splitext(fabfile)[0])
File "/root/idm/fabfile.py", line 17, in <module> from deployment import keystone
File "/root/idm/deployment/keystone.py", line 25, in <module> from keystoneclient.v3 import client
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/__init__.py", line 34, in <module> from keystoneclient import client
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/client.py", line 13, in <module> from keystoneclient import discover
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/discover.py", line 22, in <module> from keystoneclient.v2_0 import client as v2_client
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/v2_0/__init__.py", line 1, in <module> from keystoneclient.v2_0.client import Client # noqa
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/v2_0/client.py", line 23, in <module> from keystoneclient.v2_0 import ec2
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/v2_0/ec2.py", line 16, in <module> from keystoneclient import base
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/base.py", line 31, in <module> from keystoneclient.openstack.common.apiclient import base
File "/root/.virtualenvs/idm_tools/src/fiwareclient/keystoneclient/openstack/common/apiclient/base.py", line 29, in <module> from oslo.utils import strutils
ImportError: No module named oslo.utils
As far as I can see, oslo.utils is installed in /root/.virtualenvs/idm_tools/lib/python2.7/site-packages
A fix was committed to KeyRock's GitHub repository, fixing the package's namespace.
If you pull the changes from GitHub and run:
sudo python tools/install_venv.py
it should fix it.
You can check out the GitHub issue here.
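To confirm the fix (a sketch), the import chain that failed when Fabric loaded the fabfile should now resolve inside the idm_tools virtualenv:

# Run in a Python shell inside the idm_tools virtualenv; this is the
# import that started the failing chain in the traceback above.
from keystoneclient.v3 import client

print(client.__file__)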
I have just got the release pipeline working with Google's push-to-deploy on App Engine.
I realised that some of my unit tests were failing.
======================================================================
ERROR: Failure: ImportError (No module named google.appengine.ext)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/nose/loader.py", line 414, in loadTestsFromName
addr.filename, addr.module)
File "/usr/lib/python2.7/dist-packages/nose/importer.py", line 47, in importFromPath
return self.importFromDir(dir_path, fqname)
File "/usr/lib/python2.7/dist-packages/nose/importer.py", line 94, in importFromDir
mod = load_module(part_fqname, fh, filename, desc)
File "/var/jenkins/workspace/deployment_5175583809994752_1410355681821/tests.py", line 9, in <module>
from cfc.search.nosqlhelper import set_meeting_date
File "/var/jenkins/workspace/deployment_5175583809994752_1410355681821/cfc/search/nosqlhelper.py", line 3, in <module>
from cfc.models.event import Event
File "/var/jenkins/workspace/deployment_5175583809994752_1410355681821/cfc/models/event.py", line 3, in <module>
from google.appengine.ext import ndb
ImportError: No module named google.appengine.ext
----------------------------------------------------------------------
Ran 1 test in 0.270s
I tried to run the following from the Python REPL in the VM containing Jenkins, which was provisioned automatically via the release pipeline in the Google Developer Console.
from google.appengine.ext import ndb
I got the following error.
ImportError: No module named google.appengine.ext
I logged into the VM and noticed that the files are located at, for instance:
var/lib/docker/aufs/mnt/bbc6deda44e4806c5c0e377df6f1109733e382e8531c1c9ed440084d8e98e3fe/google-cloud-sdk/platform/google_appengine/google/appengine/ext/ndb/utils.py
How do I set the path universally in the VM?
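One common approach (a sketch, not from the original thread; the SDK path below is an assumed location and may differ on the Jenkins VM) is to put the bundled App Engine SDK on sys.path before the tests import application code:

# Sketch: make google.appengine.* importable outside the dev server.
# SDK_PATH is an assumption; adjust it to wherever the Cloud SDK keeps
# its google_appengine platform directory on the VM.
import sys

SDK_PATH = '/usr/lib/google-cloud-sdk/platform/google_appengine'
sys.path.insert(0, SDK_PATH)

import dev_appserver
dev_appserver.fix_sys_path()  # also adds the SDK's bundled dependencies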
I'm having trouble getting SQLAlchemy to work with my compiled app after putting it through py2app. I've done this successfully on Windows with py2exe. The following is my setup file:
from setuptools import setup

APP = ['Blah.py']
DATA_FILES = []
OPTIONS = {'argv_emulation': True,
           'includes': ['sip',
                        'PyQt4._qt',
                        'sqlalchemy.dialects.mysql',
                        'MySQLdb',
                        'gzip']}

setup(
    app=APP,
    data_files=DATA_FILES,
    options={'py2app': OPTIONS},
    setup_requires=['py2app'],
)
This appears to be the right way to do it, as I've seen people use it for SQLite; however, I still get this error when trying to run the app after building:
sqlalchemy.exc.ArgumentError: Could not determine dialect for 'mysql+mysqldb'
I've recently been trying PyInstaller and have gotten stuck at pretty much the same spot, albeit with a different error, which is the following:
Traceback (most recent call last):
File "<string>", line 96, in <module>
File "/Users/tom/Downloads/pyinstaller-pyinstaller-2145d84/PyInstaller/loader/iu.py", line 386, in importHook
mod = _self_doimport(nm, ctx, fqname)
File "/Users/tom/Downloads/pyinstaller-pyinstaller-2145d84/PyInstaller/loader/iu.py", line 480, in doimport
exec co in mod.__dict__
File "build/bdist.macosx-10.7-intel/egg/MySQLdb/__init__.py", line 19, in <module>
File "/Users/tom/Downloads/pyinstaller-pyinstaller-2145d84/PyInstaller/loader/iu.py", line 386, in importHook
mod = _self_doimport(nm, ctx, fqname)
File "/Users/tom/Downloads/pyinstaller-pyinstaller-2145d84/PyInstaller/loader/iu.py", line 480, in doimport
exec co in mod.__dict__
File "build/bdist.macosx-10.7-intel/egg/_mysql.py", line 7, in <module>
File "build/bdist.macosx-10.7-intel/egg/_mysql.py", line 4, in __bootstrap__
File "OSX_Installer/Jango/build/pyi.darwin/Jango/out00-PYZ.pyz/pkg_resources", line 882, in resource_filename
File "OSX_Installer/Jango/build/pyi.darwin/Jango/out00-PYZ.pyz/pkg_resources", line 1352, in get_resource_filename
File "OSX_Installer/Jango/build/pyi.darwin/Jango/out00-PYZ.pyz/pkg_resources", line 1363, in _extract_resource
KeyError: '_mysql/_mysql.so'
You probably also need _mysql, which contains the native MySQL bindings. Also, these bindings need the binary MySQL client libraries to be installed on the target system.
Your application would probably be a lot more portable if you used a pure-Python MySQL library, such as PyMySQL or MySQL Connector/Python (both are supported by SQLAlchemy).
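For example (a sketch; the connection details are placeholders), switching the SQLAlchemy URL to the pure-Python driver removes the dependency on the compiled _mysql extension:

# PyMySQL is pure Python, so py2app/PyInstaller can bundle it without
# native MySQL client libraries. Credentials below are placeholders.
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://user:password@localhost/mydb")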