I am using Community edition 3.0.5 on Windows 10. I made multiple efforts to execute a LOAD CSV command before being told that such files cannot reside on an external drive. When I moved the file to users/user/ and tried to execute the LOAD CSV command, I got the same message "Couldn't load the external resource at: file:/F:/Neo4j%20DBs/Data.gov%20Consumer%20Complaints/Consumer%20Complaints%20DB/import/Users/CharlieOh/Consumer_Complaints.csv" in spite of the fact that the command I entered was
"LOAD CSV WITH HEADERS FROM
'file:///Users/CharlieOh/Consumer_Complaints.csv' AS line
WITH line
LIMIT 1
RETURN line"
I tried to locate the file neo4j.conf and could only find C:\Program Files (x86)\Neo4j Community 3.2.2\Neo4j Community.install4j\i4jparams.conf. I even deleted the old DB and recreated the small amount of data and got the same error, which seems to indicate that the LOAD CSV function is totally useless across all my Neo4j databases. BTW, the %20 in the file specification was due to suggestions on Stack Overflow, as was using underscores to avoid any blank spaces in the file specification. None of it worked, and now that I believe I may have solved the problem by putting the CSV file in the user directory, LOAD CSV still won't let me do it. One last thing: I am following the YouTube video https://www.youtube.com/watch?v=Eh_79goBRUk to learn how to load a CSV file into Neo4j.
The CSV file needs to go in the import directory of the specific database. With Neo4j Desktop this is easy to identify: click the Manage button of the database and then the open folder button. It looks like you've found it.
Once the database import directory is located, place the file there and reference it in the statement as LOAD CSV WITH HEADERS FROM 'file:///FN', where FN is your file name, including the .csv extension. You do NOT use the full path; that is assumed.
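For example, taking the file name from the question (adjust it to your own file), the statement would look like this:
LOAD CSV WITH HEADERS FROM 'file:///Consumer_Complaints.csv' AS line
WITH line
LIMIT 1
RETURN line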
After running a query and saving the results into a table, I went on to export its contents into a GCS bucket.
In the table view, I clicked Export and the export dialog appeared.
Because the table was bigger than 1 GB, I used
bucketname/all_years*
Then, because I wanted the data in both CSV and JSON, I specified the export format CSV, started the export, and repeated the process for JSON.
I didn't notice whether the files inside the bucket had .csv / .json extensions (I deleted the bucket right away due to costs, but my recollection is that they didn't), and once I downloaded the contents from the bucket to my Windows machine, I got files whose type showed up simply as "File".
To work around this I had to open each file's properties, add .csv / .json, and click OK.
Why is it that, even though I specified the export format as CSV and JSON, I got files of type "File"?
Just setting the Export format controls the file format but not the file extension - you should set the extension explicitly.
So, instead of bucketname/all_years*, you should use bucketname/all_years*.csv or bucketname/all_years*.json, for example.
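The same principle applies if you ever script the export with the bq command-line tool: the extension comes from the destination URI you type (the dataset, table and bucket names below are placeholders):
bq extract --destination_format=CSV mydataset.mytable 'gs://bucketname/all_years*.csv'
bq extract --destination_format=NEWLINE_DELIMITED_JSON mydataset.mytable 'gs://bucketname/all_years*.json'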
I am trying to load a file (a.txt) into MySQL with the LOAD DATA command, but it says "no such file or directory" even though the file is present at the specified path.
load data local infile 'F:\makarand\a.txt'
into TABLE file;
I have also tried it with the LOCAL keyword removed, but the issue remains the same.
It says:
Error No 2: No such file or directory
FILE is a protected name in MySQL:
https://dev.mysql.com/doc/refman/8.0/en/keywords.html
So the parser expects a filename there, rather than your table name. Either rename the table (add a prefix or suffix) or quote it in your query. These kinds of things sadly happen, which is why I always use a prefix on my tables to make sure I don't run into them.
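A minimal sketch of the quoted form, keeping the path from the question (note that MySQL treats backslashes in string literals as escape characters, so doubled backslashes - or forward slashes - are the safer way to write a Windows path):
load data local infile 'F:\\makarand\\a.txt'
into table `file`;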
// edit //
See also this topic. This has nothing to do with Windows or Linux
Load data infile, difference between Windows and Linux
My apologies if a solution has been provided elsewhere. I have searched and could not find anything similar to what I am experiencing. I am trying to upload categories on a Magento CE 2.1.1 website. I have a file with almost 4000 categories and sub categories and the only practical way is to upload via a csv file.
I downloaded a sample file to use, and when I upload that same sample file it works fine when I click the "Check Data" button. However, when I replace the values in the rows with my own and save the file as CSV with UTF-8 text encoding, I get the error message below. This also happens when I save the file as CSV even without changing the values. I have tested this with a CSV file saved from both Mac Numbers and Windows Excel.
I only need to upload Categories (and not products) but I am not sure if this is possible.
File links:
Importing
Not importing
Actual project sample file
The files are quite similar but strangely one is working and the other is not.
Error
We can't find required columns: sku.
Column names: "sku;store_view_code;attribute_set_code;product_type;categories;product_websites;name;description;short_description;weight;product_online;tax_class_name;visibility;price;special_price;special_price_from_date;special_price_to_date;url_key;meta_title;meta_keywords;meta_description;base_image;base_image_label;small_image;small_image_label;thumbnail_image;thumbnail_image_label;swatch_image;swatch_image_label;created_at;updated_at;new_from_date;new_to_date;display_product_options_in;map_price;msrp_price;map_enabled;gift_message_available;custom_design;custom_design_from;custom_design_to;custom_layout_update;page_layout;product_options_container;msrp_display_actual_price_type;country_of_manufacture;additional_attributes;qty;out_of_stock_qty;use_config_min_qty;is_qty_decimal;allow_backorders;use_config_backorders;min_cart_qty;use_config_min_sale_qty;max_cart_qty;use_config_max_sale_qty;is_in_stock;notify_on_stock_below;use_config_notify_stock_qty;manage_stock;use_config_manage_stock;use_config_qty_increments;qty_increments;use_config_enable_qty_inc;enable_qty_increments;is_decimal_divided;website_id;related_skus;related_position;crosssell_skus;crosssell_position;upsell_skus;upsell_position;additional_images;additional_image_labels;hide_from_product_page;bundle_price_type;bundle_sku_type;bundle_price_view;bundle_weight_type;bundle_values;bundle_shipment_type;associated_skus" are invalid
This might be because you opened the file in Excel, which will add a BOM (byte order mark) to the start of the file. When the Magento importer tries to read the file, it expects the first header/cell to say sku, but it instead sees the BOM.
Two ways to solve this:
1) Don't open it in Excel - use Google Sheets, or a text editor if you are feeling brave.
2) If you opened the file in Excel, close it, open it in Notepad++, click Encoding at the top and set it to "Encode in UTF-8" (NOT "Encode in UTF-8-BOM"). Then save and you are good to go.
For a course on Excel I was trying to load a CSV into Neo4j (first time using this application) when I got blocked at the very first step of replicating an example shown in said course: loading the file.
The command used in the example was this:
LOAD CSV WITH HEADERS FROM "file:/path/to/file/file.csv"
as row
CREATE (m:movie {name:row.movie})
But it gave syntax errors. I found out I could correct it by doubling the backslashes and adding "file:" in front:
LOAD CSV WITH HEADERS FROM "file://C:\\path\\to\\file\\file.csv"
as row
CREATE (m:movie {name:row.movie})
Neo4j accepts this syntax, processes for a few moments, and returns YET ANOTHER error:
Neo.TransientError.Statement.ExternalResourceFailure
I tried the same commands (the original and my own) in the online Neo4j console, but no luck. I can reach the file using that path without problem; it really is there. The CSV file consists of just 5 strings of regular letters, that's all. No fancy formatting or characters.
What's going on?
Not that mysterious: Neo4j's LOAD CSV clause looks for the specified CSV file in the import directory of your server configuration for that database, as specified near the top of its server configuration file (i.e. dbms.directories.import=import in your neo4j.conf file).
You should create the import directory in...
"C:\Users\[User Name]\Documents\Neo4j\default.graphdb\"
If you place your CSV file in there, you can specify any sub-directory, or just the "file.csv" you want to import with LOAD CSV, as below.
LOAD CSV WITH HEADERS FROM "file:///file.csv"
AS row
RETURN row
LIMIT 5
Try using:
"file:///C:/path/to/file/file.csv"
Since your file is on your local computer, the third / following the file scheme is not preceded by a host name or address -- but it still needs to be there. Also, file URI path separators should be forward slashes (even on Windows machines).
See the File URI scheme Wikipedia page if you need more information.
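Combined with the CREATE from the question, a sketch would be (the path is still the placeholder from the question; substitute your real file location):
LOAD CSV WITH HEADERS FROM "file:///C:/path/to/file/file.csv"
AS row
CREATE (m:movie {name: row.movie})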
Trying CSV import to Neo4j - doesn't seem to be working.
I'm loading a local file using the syntax:
LOAD CSV WITH HEADERS FROM "file:///location/local/my.csv" AS csvDoc
Am wondering if there's something wrong with my CSV file, or if there's some syntax problem here.
If you didn't read the title, the error is:
Couldn't load the external resource at: file:/location/local/my.csv
[Neo.TransientError.Statement.ExternalResourceFailure]
Neo4j seems to need a full path spec to get a file on the local system.
On Linux or Mac, try
LOAD CSV FROM "file:/Users/you/location/local/my.csv"
On Windows, try
LOAD CSV FROM "file://c:/location/local/my.csv"
In the browser interface (Neo4j 3.0.3, MacOS 10.11) it looks like Neo4j prefixes your file path with $path_to_graph_database/import. So you could move your files there. If you are using a command line tool, then see this SO question.
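So, assuming the default database location, you could drop my.csv into that import folder and refer to it by file name only (a sketch reusing the alias from the question):
LOAD CSV WITH HEADERS FROM "file:///my.csv" AS csvDoc
RETURN csvDoc
LIMIT 5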
Easy solution:
Once you choose your database location (in my case ReactomeGraphDB60)...
(here I placed my database)
...go to that folder, and inside it create a folder called "import".
Then, in the Cypher query, write (as an example):
LOAD CSV WITH HEADERS FROM "file:///ILClasiffStruct.csv" AS row
CREATE (n:Interleukines)
SET n = row
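Note that SET n = row copies every CSV column onto the node as a string property. If some columns should be numeric, a variant of the same query can convert them explicitly (the column name below is a made-up placeholder; use toInt() instead of toInteger() on older 3.0 releases):
LOAD CSV WITH HEADERS FROM "file:///ILClasiffStruct.csv" AS row
CREATE (n:Interleukines)
SET n = row
SET n.someNumber = toInteger(row.someNumber)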