ModuleNotFoundError: No module named 'paddle.distributed' - ocr

I am trying to run the following code to train PaddleOCR:
import paddle
import paddle.distributed as dist
But I'm getting this error:
ModuleNotFoundError: No module named 'paddle.distributed'
This happens even after I have installed paddle-client.

docker pull paddlepaddle/paddle:2.3.0-gpu-cuda11.2-cudnn8
I use this image, which works well.

You can try paddlepaddle version 2.3.1; for a quick install, refer to: https://www.paddlepaddle.org.cn/en
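Note that paddle-client is a different distribution from PaddlePaddle itself (this is my reading of the error, not something stated in the thread), so installing it does not provide paddle.distributed. A minimal sanity check after installing the real package, assuming pip install paddlepaddle==2.3.1 (or paddlepaddle-gpu on a GPU machine):
# Quick check that the real PaddlePaddle is installed, mirroring the failing imports.
import paddle
import paddle.distributed as dist

print(paddle.__version__)   # expect 2.3.1
paddle.utils.run_check()    # PaddlePaddle's built-in installation check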

Related

ModuleNotFoundError: No module named 'torchvision.models.feature_extraction'

I want to extract features from ResNet101; however, I have trouble importing torchvision.models.feature_extraction.
Here is my code:
from torchvision import models
from torchvision.models.feature_extractor import create_feature_extractor
res101 = models.resnet101(pretrained=True)
extractor = create_feature_extractor(
    res101,
    return_nodes=[
        "conv1",
        "maxpool",
        "layer1",
        "layer2",
        "layer3",
        "layer4",
    ]
)
features = extractor(inputs)
And here is the error:
from torchvision.models.feature_extractor import create_feature_extractor
Traceback (most recent call last):
Input In [11] in <cell line: 1>
from torchvision.models.feature_extractor import create_feature_extractor
ModuleNotFoundError: No module named 'torchvision.models.feature_extractor'
You might be trying to use something like:
from torchvision.models.feature_extraction import create_feature_extractor
Note the difference: extraction vs. extractor. Check the torchvision.models.feature_extraction module.
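For reference, here is the question's snippet with only the import corrected (inputs is a stand-in batch here, since the original doesn't show how it was created):
import torch
from torchvision import models
from torchvision.models.feature_extraction import create_feature_extractor  # extraction, not extractor

res101 = models.resnet101(pretrained=True)
extractor = create_feature_extractor(
    res101,
    return_nodes=["conv1", "maxpool", "layer1", "layer2", "layer3", "layer4"],
)

inputs = torch.rand(1, 3, 224, 224)   # dummy image batch
features = extractor(inputs)          # dict mapping node name -> feature tensor
print({name: t.shape for name, t in features.items()})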
Same problem here. I installed PyTorch using conda and it works fine in Jupyter notebooks, but it does not work in the terminal.
It turned out the pip-installed torchvision version was 0.8.2.
Solved by updating torchvision using pip.
Some package probably installed the old version for me. I hope my experience helps you.
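If you hit the same mismatch, a quick way to see which torchvision a given interpreter actually imports (feature_extraction was added in torchvision 0.11, to the best of my knowledge):
import torchvision

print(torchvision.__version__)  # needs >= 0.11 for feature_extraction
print(torchvision.__file__)     # reveals which installation is being used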

ModuleNotFoundError: No module named 'snowflake.sqlalchemy'

I'm trying to use SQLAlchemy on Airflow on AWS.
For some reason I get ModuleNotFoundError: No module named 'snowflake.sqlalchemy',
even though I have it in my requirements.txt.
Does anyone know how to fix it? I'm trying to use it with PythonOperator. I also tried pinning the package:
snowflake-sqlalchemy==1.3.4
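One way to narrow this down (a diagnostic sketch of mine, not from the thread; the DAG and task ids are made up) is to run the import inside a PythonOperator task, so the failure is reported from the actual worker environment rather than from wherever the DAG file is parsed:
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def check_snowflake_import():
    # Fails loudly on the worker if the package is missing there.
    from snowflake.sqlalchemy import URL
    print("snowflake.sqlalchemy imported OK:", URL)

with DAG(dag_id="debug_snowflake_import", start_date=datetime(2022, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(task_id="check_import", python_callable=check_snowflake_import)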

ModuleNotFoundError: No module named 'fastai.vision'

I am trying to use ImageDataBunch from fastai. It used to work fine, but recently running my code started showing this error: ModuleNotFoundError: No module named 'fastai.vision'
Then I upgraded fastai with pip install fastai --upgrade. That error was cleared, but I landed in NameError: name 'ImageDataBunch' is not defined
Here's my code:
import warnings
import numpy as np
from fastai.vision import *
warnings.filterwarnings("ignore", category=UserWarning, module="torch.nn.functional")
np.random.seed(42)
data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224, num_workers=4,
                                  no_check=True).normalize(imagenet_stats)
How can I fix this?
I actually ran into this same issue when I started using Colab, but haven't been able to reproduce it. Here was the thread describing what I and another developer did to troubleshoot: https://forums.fast.ai/t/no-module-named-fastai-data-in-google-colab/78164/4
I would recommend trying to factory reset your runtime ( "Runtime" -> "Factory Reset Runtime")
Then you can check which version of fastai you have (you have to restart the runtime to use the new version if you've already imported it)
import fastai
fastai.__version__
I'm able to run from fastai.vision import * on fastai versions 1.0.61 and 2.0.13.
In Google Colab:
Upgrade fastai on Colab:
! [ -e /content ] && pip install -Uqq fastai
Import necessary libraries:
from fastai.vision.all import *
from fastai.text.all import *
from fastai.collab import *
from fastai.tabular.all import *
Get the images and annotations:
path = untar_data(URLs.PETS)
path_anno = path/'annotations'
path_img = path/'images'
print( path_img.ls() ) # print all images
fnames = get_image_files(path_img) # -->> 7390 images
print(fnames[:5]) # print first 5 images
The solution that worked for me was to mount (connect) my Google Drive and then run the cells.
You might have installed an older version of fastai. You need to upgrade to fastai v2. You can upgrade fastai using pip as shown below.
!pip install fastai --upgrade
Also, check your fastai version using:
import fastai
print(fastai.__version__)
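One more note, an inference on my part rather than something from the answers above: after upgrading to fastai v2, ImageDataBunch no longer exists; its v2 counterpart is ImageDataLoaders. A rough v2 equivalent of the question's snippet, assuming the same folder layout:
from fastai.vision.all import *

np.random.seed(42)
# ImageDataBunch.from_folder -> ImageDataLoaders.from_folder in fastai v2;
# ds_tfms/size become item_tfms/batch_tfms, and .normalize() becomes a Normalize transform.
dls = ImageDataLoaders.from_folder(
    path, train='.', valid_pct=0.2,
    item_tfms=Resize(224),
    batch_tfms=[*aug_transforms(), Normalize.from_stats(*imagenet_stats)],
    num_workers=4,
)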

How to include python mysql.connector into AWS Chalice deployment?

I am trying to deploy an AWS Lambda application that I implemented with the Chalice Python framework. My app.py connects to a MySQL server and therefore has to
import mysql.connector
But on every invocation of one of my Lambda functions I get an error in the log:
'Unable to import module 'app': No module named mysql.connector'
I tried adding mysql.connector to the requirements.txt file in the Chalice project:
mysql_connector==2.1.6
If I do so, two additional folders containing several files appear in the AWS Lambda environment:
/mysql_connector-2.1.6.data
/mysql_connector-2.1.6.dist-info
But the error remains the same. How do I deploy Python's mysql.connector with Chalice?
This finally worked for me:
import os, sys

lib_path = os.path.abspath(os.path.join(__file__, '..', 'mysql_connector-2.1.6.data', 'purelib'))
sys.path.append(lib_path)  # make the unpacked package importable
import mysql.connector
Putting "mysql_connector==2.1.6" into the requirements.txt file did install the MySQL connector in the Lambda environment. I then added the path of the package (../mysql_connector-2.1.6.data/purelib) to the system path.
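An alternative that may avoid the path manipulation altogether (my suggestion, not part of the original answer): the mysql-connector-python distribution can run in pure-Python mode, which tends to package cleanly into a Lambda bundle. With it pinned in requirements.txt, the plain import should work:
# requirements.txt would contain, e.g.: mysql-connector-python==8.0.33 (hypothetical pin)
import mysql.connector

conn = mysql.connector.connect(
    host="example-host",   # placeholder connection details
    user="user",
    password="secret",
    database="mydb",
    use_pure=True,         # force the pure-Python implementation, no C extension needed
)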

Running Google's DeepDream on Windows with CUDA: ImportError DLL load failed [duplicate]

I have built a .dll of _caffe.cpp on Windows (Release, x64).
I changed the extension from .dll to .pyd and am trying to import it in Python:
import caffe
File "\caffe-master\python\caffe\__init__.py", line 1, in <module>
from .pycaffe import Net, SGDSolver
File "\caffe-master\python\caffe\pycaffe.py", line 13, in <module>
from ._caffe import Net, SGDSolver
ImportError: DLL load failed: The specified module could not be found.
What does this mean? Is some dependency module missing that was included in the Visual Studio project where I built this DLL?
You need to add Python Caffe to PYTHONPATH. For example:
export PYTHONPATH=$PYTHONPATH:/home/username/caffe/python
For Windows:
Adding /caffe/Build/x64/Release/pycaffe to the system path (Path) works for me, and I think the best way to do it is:
Create a new system variable: PYTHON_PKG = /caffe/Build/x64/Release/pycaffe
Include PYTHON_PKG in Path: Path = %PYTHON_PKG%;%OtherDirs%
After I did this, I got an error about the package missing google.internal, so I ran pip install google.internal in CMD. It works.
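If you would rather not edit system variables, the same effect can be had per script (a sketch; the build path below is hypothetical and must point at your own Release pycaffe folder):
import sys

# Hypothetical build location; adjust to wherever your pycaffe was built.
sys.path.insert(0, r'C:\caffe\Build\x64\Release\pycaffe')

import caffe
print(caffe.__file__)  # confirms which caffe module was picked up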
Once you have compiled and built caffe, try
export PYTHONPATH=$PYTHONPATH:/path/to/caffe-dir/python
Also, you may need to run the following:
pip install -r requirements.txt