ModuleNotFoundError: No module named 'fastai.vision' - deep-learning

I am trying to use ImageDataBunch from fastai. It worked fine before, but recently my code started failing with ModuleNotFoundError: No module named 'fastai.vision'.
I then upgraded fastai with pip install fastai --upgrade. That cleared the import error, but now I get NameError: name 'ImageDataBunch' is not defined.
Here's my code:
import warnings
import numpy as np
from fastai.vision import *
warnings.filterwarnings("ignore", category=UserWarning, module="torch.nn.functional")
np.random.seed(42)
data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224,
                                  num_workers=4, no_check=True).normalize(imagenet_stats)
How can I fix this?

I actually ran into this same issue when I started using Colab, but I haven't been able to reproduce it. Here is the thread describing what another developer and I did to troubleshoot: https://forums.fast.ai/t/no-module-named-fastai-data-in-google-colab/78164/4
I would recommend trying a factory reset of your runtime ("Runtime" -> "Factory Reset Runtime").
Then you can check which version of fastai you have (if you have already imported fastai, you have to restart the runtime to pick up the new version):
import fastai
fastai.__version__
I'm able to run from fastai.vision import * on fastai versions 1.0.61 and 2.0.13.

In Google Colab:
Upgrade fastai:
! [ -e /content ] && pip install -Uqq fastai
Import necessary libraries:
from fastai.vision.all import *
from fastai.text.all import *
from fastai.collab import *
from fastai.tabular.all import *
Get the images and annotations:
path = untar_data(URLs.PETS)
path_anno = path/'annotations'
path_img = path/'images'
print(path_img.ls())                # list all image paths
fnames = get_image_files(path_img)  # 7390 images
print(fnames[:5])                   # first 5 image paths
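From here you can build the v2 DataLoaders, which replace v1's ImageDataBunch. A minimal sketch, assuming the usual PETS naming scheme where the breed is encoded in the filename; the regex, transforms, and batch size are illustrative:
dls = ImageDataLoaders.from_name_re(
    path_img, fnames, pat=r'(.+)_\d+.jpg$',  # regex extracts the breed from the filename
    item_tfms=Resize(224),                   # resize each image on the CPU
    batch_tfms=aug_transforms(),             # standard augmentations on the GPU
    bs=64)                                   # illustrative batch size
dls.show_batch(max_n=9)                      # sanity-check a few labelled images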

The solution that worked for me was to mount (connect) my Google Drive and then run the cells.

You might have the older version of fastai installed. You need to upgrade to fastai v2, which you can do with pip as shown below.
!pip install fastai --upgrade
You can also check your fastai version:
import fastai
print(fastai.__version__)
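Note that fastai v2 removed ImageDataBunch in favour of the DataLoaders API, so the code from the question also has to be translated. A rough sketch of the v2 equivalent, assuming a random 20% validation split is still what you want (the transforms shown are the usual v2 counterparts, not an exact one-to-one mapping):
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(
    path, train='.', valid_pct=0.2, seed=42,            # random 20% validation split
    item_tfms=Resize(224),                              # v2 counterpart of size=224
    batch_tfms=[*aug_transforms(),                      # v2 counterpart of get_transforms()
                Normalize.from_stats(*imagenet_stats)], # replaces .normalize(imagenet_stats)
    num_workers=4)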

Related

ModuleNotFoundError: No module named 'torchvision.models.feature_extraction'

I want to extract features from ResNet-101; however, I have trouble importing torchvision.models.feature_extraction.
Here is my code:
from torchvision import models
from torchvision.models.feature_extractor import create_feature_extractor
res101 = models.resnet101(pretrained=True)
extractor = create_feature_extractor(
    res101,
    return_nodes=[
        "conv1",
        "maxpool",
        "layer1",
        "layer2",
        "layer3",
        "layer4",
    ]
)
features = extractor(inputs)
And here is the error:
from torchvision.models.feature_extractor import create_feature_extractor
Traceback (most recent call last):
Input In [11] in <cell line: 1>
from torchvision.models.feature_extractor import create_feature_extractor
ModuleNotFoundError: No module named 'torchvision.models.feature_extractor'
You might be trying to use something like:
from torchvision.models.feature_extraction import create_feature_extractor
Note the spelling: extraction, not extractor.
Check the torchvision.models.feature_extraction module documentation.
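Putting it together, a minimal corrected sketch; this assumes torchvision >= 0.11, where feature_extraction was introduced, and the dummy input is purely illustrative:
import torch
from torchvision import models
from torchvision.models.feature_extraction import create_feature_extractor  # note: 'extraction'

res101 = models.resnet101(pretrained=True)
extractor = create_feature_extractor(
    res101,
    return_nodes=["conv1", "maxpool", "layer1", "layer2", "layer3", "layer4"])
inputs = torch.rand(1, 3, 224, 224)  # dummy batch, illustrative only
features = extractor(inputs)         # dict mapping node names to tensors
print({k: v.shape for k, v in features.items()})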
Same problem here. I installed PyTorch using conda, and it works fine in Jupyter notebooks but not in the terminal.
It turned out that the torchvision version shown by pip was 0.8.2, which predates feature_extraction (added in torchvision 0.11).
Solved by updating torchvision using pip.
Some other package had probably installed the old version for me. I hope my experience helps you.

ModuleNotFoundError: No module named 'paddle.distributed'

I am trying to run the following code to train PaddleOCR.
import paddle
import paddle.distributed as dist
But I'm getting this error:
ModuleNotFoundError: No module named 'paddle.distributed'
This happens even though I have installed paddle-client.
I use this image, which works well:
docker pull paddlepaddle/paddle:2.3.0-gpu-cuda11.2-cudnn8
You can also try paddlepaddle 2.3.1; for a quick install, see https://www.paddlepaddle.org.cn/en
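If I read the error right, paddle-client is not the framework itself; the paddle.distributed module ships with the paddlepaddle package. A sketch of the pip route, assuming the CPU build is enough (the version pin is illustrative):
pip install paddlepaddle==2.3.1        # CPU build; includes paddle.distributed
# or, on a CUDA machine:
# pip install paddlepaddle-gpu==2.3.1
python -c "import paddle, paddle.distributed; print(paddle.__version__)"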

How to include python mysql.connector into AWS Chalice deployment?

I am trying to deploy an AWS Lambda application that I implemented with the Chalice Python framework. My app.py connects to a MySQL server and therefore has to
import mysql.connector
But on every invocation of one of my Lambda functions I get this error in the log:
'Unable to import module 'app': No module named mysql.connector'
I tried adding mysql.connector to the requirements.txt file of the Chalice project:
mysql_connector==2.1.6
With that in place, two additional folders containing several files appear in the AWS Lambda environment:
/mysql_connector-2.1.6.data
/mysql_connector-2.1.6.dist-info
But the error remains the same. How do I deploy Python's mysql.connector with Chalice?
This finally worked for me:
import os, sys

# make the package installed under mysql_connector-2.1.6.data/purelib importable
lib_path = os.path.abspath(os.path.join(__file__, '..', 'mysql_connector-2.1.6.data', 'purelib'))
sys.path.append(lib_path)
import mysql.connector
Putting mysql_connector==2.1.6 into the requirements.txt file did install the MySQL connector in the Lambda environment, but under mysql_connector-2.1.6.data/purelib, which is not on the import path. Appending that directory to sys.path fixes the import.
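For context, a minimal sketch of where this goes in a Chalice app.py; the app name, route, and environment variables are illustrative placeholders:
import os
import sys

# extend the import path *before* importing mysql.connector
lib_path = os.path.abspath(os.path.join(__file__, '..', 'mysql_connector-2.1.6.data', 'purelib'))
sys.path.append(lib_path)

import mysql.connector
from chalice import Chalice

app = Chalice(app_name='mysql-demo')  # placeholder app name

@app.route('/ping')
def ping():
    # placeholder credentials; in practice, read them from environment variables
    conn = mysql.connector.connect(host=os.environ['DB_HOST'],
                                   user=os.environ['DB_USER'],
                                   password=os.environ['DB_PASS'])
    conn.close()
    return {'ok': True}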

Running Google's DeepDream on Windows with CUDA: ImportError DLL load failed [duplicate]

I have built a .dll from _caffe.cpp on Windows (Release, x64).
I changed the extension from .dll to .pyd and am trying to import it in Python:
import caffe
File "\caffe-master\python\caffe\__init__.py", line 1, in <module>
from .pycaffe import Net, SGDSolver
File "\caffe-master\python\caffe\pycaffe.py", line 13, in <module>
from ._caffe import Net, SGDSolver
ImportError: DLL load failed: The specified module could not be found.
What does it mean? Is some dependency missing that was included in the Visual Studio project where I built this DLL?
You need to add Python Caffe to PYTHONPATH. For example:
export PYTHONPATH=$PYTHONPATH:/home/username/caffe/python
For Windows:
Adding /caffe/Build/x64/Release/pycaffe to the system path works for me, and I think the best way to do it is:
Create a new system variable: PYTHON_PKG = /caffe/Build/x64/Release/pycaffe
Include PYTHON_PKG in path: path = %PYTHON_PKG%; %OtherDirs%
After I did this, I got an error about a missing google.internal package, so I ran pip install google.internal in CMD. It works.
Once you have compiled and built Caffe, try
export PYTHONPATH=/path/to/caffe-dir/python
Also, you may need to run the following:
pip install -r requirements.txt
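If you would rather test this from inside Python, a small sketch; the path is a placeholder for wherever your built pycaffe lives:
import sys

# placeholder path: point this at your built pycaffe directory
sys.path.insert(0, r'C:\caffe-master\Build\x64\Release\pycaffe')

import caffe           # raises ImportError if a dependent DLL is still missing
print(caffe.__file__)  # confirms which caffe package was picked up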

I get an error message when I try FreqDist() in NLTK -- NameError: name 'nltk' is not defined

I'm learning about NLTK on my Mac, and everything is working fine except that I have trouble with FreqDist(). (I saw another question about FreqDist(), but that asker was getting a different error: TypeError: unhashable type: 'list'.)
Here's an example:
>>> from nltk.corpus import brown
>>> news_text = brown.words(categories='news')
>>> fdist = nltk.FreqDist([w.lower() for w in news_text])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'nltk' is not defined
This error message is pretty consistent; I get it every time I call FreqDist(). Other commands, like brown.fileids(), are fine.
Thanks for your help!
Before you can use FreqDist, you need to import it.
Add a line as follows:
import nltk
Or, if you just want FreqDist, you can import it directly:
>>> from nltk.corpus import brown
>>> from nltk import FreqDist
>>> news_text = brown.words(categories='news')
>>> fdist = FreqDist([w.lower() for w in news_text])
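As a quick sanity check (assuming a recent NLTK, where FreqDist behaves like collections.Counter), you can then inspect the result, for example:
>>> fdist.most_common(10)  # the ten most frequent lowercased tokens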
This error means you haven't installed nltk.
Follow these steps to install nltk:
1. Go to https://pypi.python.org/pypi/setuptools; at the end of the page you will find setuptools-7.0.zip (md5). Download and unzip it; inside you will find the easy_install.py script.
2. Run sudo easy_install pip (make sure you are in the directory containing the easy_install script). pip will then be installed and ready to use.
3. Run sudo pip install -U nltk. Successful execution ensures that nltk is now installed.
4. Open IDLE and type the following:
import nltk
If nltk is installed properly, you will be returned to the console.
setuptools is only required for older versions of Python; there is no need for it if you are running 3.2+.
You can download nltk from https://pypi.python.org/pypi/nltk
For more information, see http://www.nltk.org/install.html
nltk requires data that you need to download first. Then run the following code:
import nltk
nltk.download('stopwords')        # fetch the stopwords corpus
from nltk.corpus import stopwords
stopwords.words("english")        # list of English stopwords