I am receiving the error message below when using the NetLogo GIS extension:
Extension exception: shapefile Data1/Hough.shp not found
error while observer running GIS:LOAD-DATASET
called by procedure SETUP
called by Button 'setup'
My shapefile is in the same folder as my script. Can someone help me troubleshoot this issue?
-Ivory
I made sure my shapefile is in the same folder as my script.
This looks like the error you get when you have misspelled something or the file path is not written correctly.
If your shapefile is in the same folder as your model, then you may just need set my-map gis:load-dataset "Hough.shp"
If your shapefile is instead in its own folder inside Data1, then it could be set my-map gis:load-dataset "Data1/Hough/Hough.shp"
Hope this helps. Next time, make sure you include in your question the part of the code that is giving you problems; it is much easier to help when we can see the code that produced the error!
I have a bunch of .csv files and I was looking for the easiest way to load them into Snowpark. I am not sure which APIs are required.
If someone could point me to the APIs or provide a code example, that would be great.
Thanks
I don't think you can do this directly in one step as Snowpark doesn't have an API for that.
What you can do is:
Load the CSV files to a stage.
// Upload a file to a stage.
session.file.put("file:///tmp/file1.csv", "#myStage/prefix1")
Create a DataFrame by reading files from the stage.
val dfCatalog = session.read.csv("@myStage/prefix1")
See the Snowpark documentation on staging files and on reading files from a stage for more information.
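Since the question mentions a bunch of CSV files, here is a rough sketch of the same two steps using Snowpark's Python API (snowflake-snowpark-python), looping over a local folder of files. The connection parameters, stage name, local path, and column schema are all placeholders rather than anything from the original question:

import glob
from snowflake.snowpark import Session
from snowflake.snowpark.types import StructType, StructField, StringType, IntegerType

# Placeholder connection details -- replace with your own account settings.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# 1. Upload every local CSV file to a named stage (the stage must already exist).
for path in glob.glob("/tmp/csvs/*.csv"):
    session.file.put(path, "@myStage/prefix1", auto_compress=False, overwrite=True)

# 2. Read all staged files under the prefix into one DataFrame.
#    An explicit schema is the simplest way to read CSVs; these columns are made up.
schema = StructType([
    StructField("id", IntegerType()),
    StructField("name", StringType()),
])
dfCatalog = session.read.schema(schema).csv("@myStage/prefix1")
dfCatalog.show()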
I am attempting to pickle.load three files that definitely exist in my Google Drive, yet I cannot load them because my code cannot find them.
Here is the error message:
FileNotFoundError: [Errno 2] Failed to open local file '/root/.cache/huggingface/datasets/good_reads_practice_dataset/main_domain/1.1.0/5f8cad709a7746be18b722642fc8ade5c1cedfa3b259440a401f7f4701079561/cache-af18ded1f5c5aaac.arrow'. Detail: [errno 2] No such file or directory
This is my code that is not working:
with open(r"/content/drive/MyDrive/Thesis/Datasets/book_preprocessing/PreTokenized/ALBERT_NER_512/train_dataset.pkl", "rb") as input_file:
train_dataset = pickle.load(input_file)
with open(r"/content/drive/MyDrive/Thesis/Datasets/book_preprocessing/PreTokenized/ALBERT_NER_512/val_dataset.pkl", "rb") as input_file:
val_dataset = pickle.load(input_file)
with open(r"/content/drive/MyDrive/Thesis/Datasets/book_preprocessing/PreTokenized/ALBERT_NER_512/test_dataset.pkl", "rb") as input_file:
test_dataset = pickle.load(input_file)
And here is a screenshot of my Google Drive directory tree:
I have checked the names of the files many times, so unless I am going crazy, the files exist exactly where I am pointing to them. Moreover, I am sure my code is correct, as I have loaded pickle objects from all the other folders in the 'PreTokenized' folder several times without issue, which makes this bug even more mysterious. I also have no clue why it is looking for the pickled objects in the cache of the Google Colab environment. If anyone has an idea of what is happening and how I can solve it, I would greatly appreciate any help.
I still don't know exactly why this is happening, but I did realize that my files were not actually being uploaded because they were too big. (I did not get an error or warning about this, but the uploaded pickle files were only a few KBs, so I knew something was up.)
What I did to get around this was to split the files into halves and then, after downloading them, concatenate the halves back together, as in the sketch below.
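This is only a rough sketch of that workaround, not the original code; the file name and the number of parts are assumptions.

import os

def split_file(path, parts=2):
    # Split one large file into roughly equal binary chunks before uploading.
    size = os.path.getsize(path)
    chunk = (size + parts - 1) // parts
    with open(path, "rb") as src:
        for i in range(parts):
            with open(f"{path}.part{i}", "wb") as dst:
                dst.write(src.read(chunk))

def join_files(parts_base, out_path, parts=2):
    # Concatenate the downloaded chunks back into a single file.
    with open(out_path, "wb") as dst:
        for i in range(parts):
            with open(f"{parts_base}.part{i}", "rb") as src:
                dst.write(src.read())

split_file("train_dataset.pkl")                                # before uploading to Drive
join_files("train_dataset.pkl", "train_dataset_rebuilt.pkl")   # after downloading the parts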
During the SETUP phase of an urban model, while loading .shp files, NetLogo reports:
Extension exception: unsupported shape type 15
error while observer running GIS:LOAD-DATASET
called by procedure SETUP
called by Button 'Go'
This happens when it tries to load this shp file: ZC.shp
Is there any way to fix this problem?
I saved the .shp file again, this time without the z-axis information, and it worked. Shape type 15 is PolygonZ in the shapefile specification, which is why dropping the Z values gets you back to a plain polygon type the extension can load.
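If you would rather script that step than re-save the file from a desktop GIS, one possible approach (an assumption on my part, using geopandas and shapely rather than anything NetLogo-specific) is to drop the Z coordinate and write a new shapefile:

import geopandas as gpd
from shapely.ops import transform

gdf = gpd.read_file("ZC.shp")

# Drop the third (Z) coordinate from every geometry, turning PolygonZ
# (shape type 15) into plain 2D polygons.
gdf["geometry"] = gdf.geometry.apply(
    lambda geom: transform(lambda x, y, z=None: (x, y), geom)
)

gdf.to_file("ZC_2d.shp")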
I am running into the following error when loading my shapefiles through the dashDB console:
My shapefiles are the following:
Has anyone with dashDB experience run into a similar problem?
UPDATE:
I downloaded a separate dataset with the following files, and I am still running into the same error:
Please find the following sample files https://www.dropbox.com/s/bkrac971g9uc02x/deng.zip?dl=0
I brought the shapefile into QGIS easily, so I knew the format was OK. I unzipped the shapefile, changed the file names to lower-case, and re-zipped it. Then I was able to get further in the dashDB upload UI, where I got a message saying the SRS was unknown. I then used QGIS to convert the SRS (spatial reference system) into a known one (EPSG:4269, NAD83), and I was then able to upload it into dashDB. Here's the version of your file that works:
https://dl.dropboxusercontent.com/u/8196680/dc.zip
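If you want to script the rename-and-re-zip step instead of doing it by hand, here is a minimal sketch; the archive names are assumptions, and the SRS conversion still has to be done separately (for example in QGIS, as described above):

import os
import zipfile

# Extract the original archive.
with zipfile.ZipFile("dc.zip") as zf:
    zf.extractall("shp_tmp")

# Re-zip the contents with lower-cased file names.
with zipfile.ZipFile("dc_lowercase.zip", "w", zipfile.ZIP_DEFLATED) as out:
    for name in os.listdir("shp_tmp"):
        out.write(os.path.join("shp_tmp", name), arcname=name.lower())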
I am getting the exception "ValueError: insecure string pickle" when attempting to run my program after creating a sandbox from MKS.
Hopefully you are still interested in helping if you are still reading this, so here's the full story.
I created an application in Python that analyzes data. When saving specific data from my program, I pickle it to a file. I read and write the file in binary mode, and everything works correctly on my computer.
I then used py2exe to wrap everything into an .exe. However, in order to get the pickled files to keep working, I have to physically copy them into the folder that py2exe creates. So my pickle files are inside the .exe folder, and everything works correctly when I run the .exe.
Next, I upload everything to MKS (an ALM, here is the Wikipedia page http://en.wikipedia.org/wiki/MKS_Integrity).
When I create a sandbox of my files and run the program, I get the dreaded "insecure string pickle" error. I am wondering if MKS mangled something or added end-of-line characters to my pickle files. When I compare the contents of the MKS pickle file and the one I created before uploading the program to MKS, there are no differences.
I hope this is enough detail to describe my problem.
Please help!
Thanks
Have you tried adding your pickled files to your Integrity sandbox as binaries and not text?
When adding the file, on the Create Archive interface, select the Options button and change the data type from "Auto" to "Binary". This keeps Integrity from treating the file as text and normalizing line endings, which can corrupt a binary pickle and produce exactly this kind of "insecure string pickle" error.