I dynamically create CSV files with a program on another CentOS 7 server and send them to the Neo4j server over Samba, into the /home/t/Desktop/temp directory, and I need to load them into Neo4j.
But Neo4j cannot load the file, and I get this error:
java.sql.SQLException: Some errors occurred :
[Neo.ClientError.Statement.ExternalResourceFailed]:Couldn't load the external resource at: file:/home/t/Desktop/temp/5d8db3a4-83d3-4850-b134-7e3d24855b88.csv
I commented out the import line in the Neo4j config file and added the line below as well:
dbms.security.allow_csv_import_from_file_urls=true
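For reference, the relevant part of neo4j.conf then looks roughly like this (assuming the commented-out import line was the default import-directory setting):
#dbms.directories.import=import
dbms.security.allow_csv_import_from_file_urls=true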
The temp directory's ownership is nobody:nobody and its permissions are 0777.
But I still get the error.
I think Neo4j is running into SELinux or some other security restriction on CentOS 7.
You can create a new top-level directory (directly under /), e.g. named test, and set its permissions appropriately:
sudo mkdir /test
sudo chmod 777 /test
By making it a top-level directory, you don't have to worry about the permissions of intermediate directories.
Answer link: https://unix.stackexchange.com/a/127298
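If SELinux really is the blocker, a quick way to check and, if needed, relabel the directory (a sketch; it assumes the audit and policycoreutils tools are installed, and public_content_t is just one plausible type choice):
getenforce
sudo ausearch -m avc -ts recent    # look for denials mentioning the Neo4j java process
# relabel the new directory so confined services may read it; adjust the type to your policy:
sudo semanage fcontext -a -t public_content_t "/test(/.*)?"
sudo restorecon -Rv /test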
I'm using some fairly large MySQL tables with XAMPP, which is tough with the rather small internal storage of my Mac. I thought I would just keep the MySQL data on an external USB 3.0 SSD, but it looks like it's not that easy.
Here is what I've tried:
With XAMPP (not VM): I moved /Applications/XAMPP/xamppfiles/var/mysql to /Volumes/myexternalssd/mysql and then pointed everything in my.cnf to that directory. The permissions seem to have copied properly, but it didn't work: MySQL does not start at all if I trash the original directory, or just keeps using the original directory if I leave it in place.
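For what it's worth, the my.cnf edit amounts to something like this (a sketch using the paths above; XAMPP's my.cnf may reference the old directory in more than one place):
[mysqld]
datadir=/Volumes/myexternalssd/mysql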
With XAMPP-VM: I moved the ~/.bitnami directory to the external drive and then symlinked (ln -s) to the new location. The error is then:
Cannot load stack definitions.
Details:
1 error occurred:
* failed to create stack: cannot deserialize stack from file "/Users/arseni/.bitnami/stackman/machines/xampp/metadata.json": open /Users/arseni/.bitnami/stackman/machines/xampp/metadata.json: operation not permitted
I am getting the error below when I try to forward certain log files using syslog-ng on SUSE Linux:
Starting syslog services
Error opening file for reading; filename='/tmp/app.log', error='Permission denied (13)'
My conf file's source definition seems to be OK:
source app {
    file("/tmp/app.log");
};
I went through similar posts and don't see any problems with my steps. The weird part is that the file is owned by root, and even when I run syslog-ng as root it gives a read permission error.
Am I missing anything?
This problem is caused by the AppArmor Linux security module. The solution is described in the linked thread: syslog-ng read file permission denied
Here are the steps I followed:
Open /etc/apparmor.d/sbin.syslog-ng.
Add the line /opt/xxx/logs/* rw, anywhere inside the profile; rw means allow read and write access. Change the directory to match yours.
Run apparmor_parser -r /etc/apparmor.d/sbin.syslog-ng to load the new rules.
Restart syslog-ng using the service command or however you have it set up.
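To check the confinement and reload in one pass, the sequence looks roughly like this (a sketch; it assumes aa-status is installed and syslog-ng runs as a system service):
sudo aa-status | grep syslog    # confirm a syslog-ng profile is loaded and enforcing
sudo apparmor_parser -r /etc/apparmor.d/sbin.syslog-ng
sudo service syslog-ng restart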
I am trying to load nodes from a CSV file into Neo4j; however, every time I try, I get this error:
Neo.ClientError.Statement.ExternalResourceFailed: Couldn't load the external resource at: file:/var/lib/neo4j/import/events.csv
My events.csv file is in the /var/lib/neo4j/import directory with 777 permissions. The query I am trying to run looks like this:
USING PERIODIC COMMIT 500 LOAD CSV WITH HEADERS FROM "file:///events.csv" AS line
CREATE (e:Event { event_id: toInteger(line.event_id),
created: line.created,
description: line.description })
I set up Neo4j using the latest version of the Docker image. What might be wrong with the file permissions or the file location?
A Docker container cannot access files on the host machine unless you mount them into the container.
The solution is to bind-mount the directory into your container when calling docker run:
docker run -v /var/lib/neo4j/import:/var/lib/neo4j/import ... <IMAGE> <CMD>
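For example, with the official neo4j image, a minimal invocation might look like this (the ports and tag shown are the usual defaults; adjust to your setup):
docker run -d --name neo4j \
    -p 7474:7474 -p 7687:7687 \
    -v /var/lib/neo4j/import:/var/lib/neo4j/import \
    neo4j:latest
Inside the container the official image resolves file:///events.csv against /var/lib/neo4j/import, so with the mount in place the query should find the host file.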
I am trying to move a Bolt.cm install to a different location on disk. config.yml doesn't have any details about the location.
When I rename the directory, I get app config issues as well as this:
Fatal error: Uncaught exception 'Bolt\Exception\LowLevelDatabaseException
It's thrown from within Bolt. It seems like I need to update a config file, but I don't see one that stores the path to the web root or the path to the database.
Does such a file exist?
You need to flush the cache.
This can be done from the command line by:
php app/nut cache:clear
Or inside app/cache/ you will find the file config_cache.php that can be safely deleted.
I've created a simple 'hello'-type JRuby application and used Warbler to package it as a WAR and deploy it to JBoss. However, I get the following error when using the application:
ActiveRecord::JDBCError (The driver encountered an unknown error: java.sql.SQLException: path to '/opt/jboss/server/ruby/tmp/deploy/tmp8791905909469840942demo-exp.war/WEB-INF/db/production.sqlite3': '/opt/jboss/server/ruby/tmp/deploy/tmp8791905909469840942demo-exp.war/WEB-INF/db' does not exist):
Sure enough, when I dig into the demo.war file, the db directory is missing from the WEB-INF directory. The db directory does exist in the app directory, though, along with the test, development, and production database files.
Any ideas?
Usually the db directory contains only migrations. If you're using sqlite3, it also contains the database file; but since a WAR deployed in production gets unpacked somewhere in the server's innards and is potentially deleted on redeploy, your database file would go away with the redeploy. If the file is read-only and you're not worried about that, you can easily add the db directory by running warble config and then editing config/warble.rb to add db to the config.dirs array.
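As a sketch, after warble config generates config/warble.rb, the edit would look something like this (your generated file will contain other settings; db is the only addition to the default list):
# config/warble.rb
Warbler::Config.new do |config|
  config.dirs = %w(app config lib log vendor tmp db)   # add db so WEB-INF/db is packaged
end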