I'm trying to import data to my MySQL DB using Toad.
I'm importing the file as I always do, but suddenly I get this message:
Cannot use named and unnamed parameters in the same command.
What does this mean? I've googled, but I'm none the wiser.
Here is the entire error message:
Import Started [11.07.2011 10:16:17] Processing "somefile.csv" into "db1.table1"
Reading from file somefile.csv
Error: Cannot use named and unnamed parameters in the same command.
Error importing data, please check file format options: Cannot use named and unnamed parameters in the same command.
Import Finished [11.07.2011 10:16:19]
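For context (my reading, not from the thread): this message typically comes from the .NET data access layer that Toad for MySQL sits on, and it means a single command ended up mixing positional placeholders with named parameters, e.g. a hypothetical statement like:

INSERT INTO table1 (col1, col2) VALUES (?, @col2)  -- mixes a positional '?' with a named '@col2' parameter

During a CSV import this may mean something in the file, or in the delimiter/quote settings, is being misread as a parameter marker, which would be why the error suggests checking the file format options.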
I am trying to reconstruct a website from backups. When I try to import the database content via phpMyAdmin, I get the following error:
Fatal error: Uncaught TypeError: mb_strcut() expects parameter 1 to be string, bool given in /home/customer/public_html/phpmyadmin/libraries/classes/File.php:772 Stack trace: #0 /home/customer/public_html/phpmyadmin/libraries/classes/File.php(772): mb_strcut(false, 0, 32768) #1 /home/customer/public_html/phpmyadmin/libraries/classes/Import.php(432): PhpMyAdmin\File->read(32768) #2 /home/customer/public_html/phpmyadmin/libraries/classes/Plugins/Import/ImportSql.php(135): PhpMyAdmin\Import->getNextChunk(Object(PhpMyAdmin\File)) #3 /home/customer/public_html/phpmyadmin/libraries/classes/Controllers/ImportController.php(635): PhpMyAdmin\Plugins\Import\ImportSql->doImport(Object(PhpMyAdmin\File), Array) #4 /home/customer/public_html/phpmyadmin/libraries/classes/Routing.php(186): PhpMyAdmin\Controllers\ImportController->index(Array) #5 /home/customer/public_html/phpmyadmin/index.php(18): PhpMyAdmin\Routing::callControllerForRoute('/import', Object(FastRoute\Dispatcher\GroupCountBased), Object(Symfony\Component\DependencyInjecti in /home/customer/public_html/phpmyadmin/libraries/classes/File.php on line 772
I have looked everywhere to try to find a way around the problem, but so far, no joy.
Any help would be more than welcome.
We have tried importing on two different hosting platforms (Plesk and SiteGround's own).
Plesk produced an "unknown" Plesk version error.
We also tried importing a dump and importing directly via phpMyAdmin: the dump tool did not recognise the file, and phpMyAdmin produced the error above. We also tried to restore the backup, without any luck. Finally, we tried to open the MySQL zip file directly, but it is password protected.
Note that we have no control over the file source, which is the only backup.
Note 2: I get the same import error when I try to import into a local XAMPP server.
I am trying to load a JSON file into my MongoDB. When I try to load it, I keep getting this error:
error validating settings: incompatible options: --file and positional argument(s)
What is the problem?
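For what it's worth, this mongoimport error means the input file was specified twice: once with --file and once as a trailing positional argument, and mongoimport accepts only one of the two. A minimal sketch, with the database, collection, and file names all hypothetical:

# pass the file via --file ...
mongoimport --db test --collection people --file data.json

# ... or as the positional argument, but not both:
mongoimport --db test --collection people data.json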
I have updated the neo4j.conf file but can't seem to get rid of this error after changing the file and restarting. I am just trying to load a JSON file through Neo4j, and I have included the line apoc.import.file.enabled=true in neo4j.conf, but it doesn't seem to be working; I'm still getting the error message:
Failed to invoke procedure 'apoc.load.json': Caused by: java.lang.RuntimeException: Import from files not enabled, please set apoc.import.file.enabled=true in your neo4j.conf
I am using Neo4j CE 3.2.3 with the APOC 3.2.0.4 plugin, and I have used the right file path for the JSON file, as it previously worked on my desktop computer (I'm just trying to replicate it on my laptop). The procedure apoc.load.json also shows up when I list all procedures.
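Not a definitive fix, but a sketch of what usually resolves this on a 3.2 install: make sure the setting is in the neo4j.conf that the running instance actually reads (some installs ship more than one copy of the file), with no leading whitespace, and restart afterwards:

apoc.import.file.enabled=true

Then re-test with the file path that worked on the desktop machine, e.g. (file name hypothetical):

CALL apoc.load.json('file:///data.json') YIELD value RETURN value LIMIT 1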
I am using pyspark to do some work on a CSV file, hence I need to import the package from spark-csv_2.10-1.4.0.jar, downloaded from https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.4.0/spark-csv_2.11-1.4.0.jar
I downloaded the jar to my local machine due to a proxy issue.
Can anyone tell me the right way to refer to a local jar?
Here is the code I use:
pyspark --jars /home/rx52019/data/spark-csv_2.10-1.4.0.jar
It takes me to the pyspark shell as expected; however, when I run:
df = sqlContext.read.format('com.databricks.spark.csv').options(header='true',inferschema='true').load('hdfs://dev-icg/user/spark/routes.dat')
routes.dat is already uploaded to HDFS at hdfs://dev-icg/user/spark/routes.dat. It gives me this error:
: java.lang.NoClassDefFoundError: org/apache/commons/csv/CSVFormat
If I run:
df = sqlContext.read.format('com.databricks.spark.csv').options(header='true',inferschema='true').load('routes.dat')
I get this error:
py4j.protocol.Py4JJavaError: An error occurred while calling o72.load.
: java.lang.NoClassDefFoundError: Could not initialize class
com.databricks.spark.csv.package$
Can anyone help me sort this out? Thank you very much; any clue is appreciated.
The correct way to do this would be to add the options (say, if you are starting a spark shell):
spark-shell --packages com.databricks:spark-csv_2.11:1.4.0 --driver-class-path /path/to/csvfilejar.jar
I have not used the Databricks CSV jar directly, but I used a Netezza connector for Spark where they mention using this option:
https://github.com/SparkTC/spark-netezza
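For the pyspark case in the question specifically, the NoClassDefFoundError for org/apache/commons/csv/CSVFormat suggests the spark-csv jar itself is found but its transitive dependencies are not: --jars adds only the jars you list, while --packages also resolves dependencies from Maven (so it needs network access, or a configured proxy/local repository). A sketch, assuming the Scala 2.10 build matches your Spark:

pyspark --packages com.databricks:spark-csv_2.10:1.4.0

If Maven stays unreachable, the offline alternative is to download the dependencies too and list every jar explicitly (the commons-csv and univocity-parsers versions below are my assumption from the spark-csv 1.4.0 POM):

pyspark --jars /home/rx52019/data/spark-csv_2.10-1.4.0.jar,/home/rx52019/data/commons-csv-1.1.jar,/home/rx52019/data/univocity-parsers-1.5.1.jar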
I am wondering what the correct format is for passing connection string properties on the command line when using dtexec:
dtexec.exe /Ser IpAddress\Instance /IS "\SSISDB\Data Warehouse\MyProject\MyPackage.dtsx" /DumpOnError /Set \Package.Variables[DW_ConnectionString].Properties[Value];\""Data Source=IpAddress;Initial Catalog=DWDB;Provider=SQLNCLI10.1;IntegratedSecurity=SSPI;"\"
I have defined the above command line, where I am attempting to pass override properties for the default connection string. The packages I'm targeting are not using package connections; instead, project-level parameters/properties have been defined to store the DB connections.
For some reason I can't get this to work. I am getting an error message on the server saying:
Failed to configure an overridden property that has the following path: \Package.Variables[DW_ConnectionString].Properties[Value]. An error occurred while setting the value of property "Value". The error returned is 0x80020009
Is my format correct for overriding properties?
The packages are hosted on a remote server.
Try using DTEXECUI next time to generate your command string. It has places for all of the variables, connection managers, etc. All you have to do is bring up your package and it fills everything in; you then enter any changes you want in the GUI, go to the Command Line tab, and it gives you the string to put after DTEXEC.EXE. You can, of course, also run the package from DTEXECUI itself.
It turns out that my format was wrong: it's incorrect to use /Set with a \Package.Variables path in this context, because the connection string is a project-level parameter rather than a package variable. Project parameters are passed with /Par, so the correct format is:
/Par "$Project::DW_ConnectionString";\""Data Source=Server\Instance;Initial Catalog=myDb;Provider=SQLNCLI11.1;Integrated Security=SSPI;AutoTranslate=False;"\"
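Put together with the original command line, that gives something like the following (server address, catalog path, and connection details carried over from the question):

dtexec.exe /Ser IpAddress\Instance /IS "\SSISDB\Data Warehouse\MyProject\MyPackage.dtsx" /DumpOnError /Par "$Project::DW_ConnectionString";\""Data Source=IpAddress;Initial Catalog=DWDB;Provider=SQLNCLI10.1;Integrated Security=SSPI;"\"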