I am attempting to export a SAS dataset to an existing Access database as a new table. I am using the Copy Files tool and have connected it to my program.
I receive the following error: Target folder \filelocation\file.accdb does not exist or cannot be accessed on "NY52214" (computer name).
I am using the following code:
proc export data=WORK.DETAILS
    dbms=accesscs
    outtable="data"
    replace;
    database='\\filelocation\file.accdb';
run;
I have used similar code to export to Excel, which works, so I am not sure what I am doing wrong. I am unable to utilize the PC Files Server, but everything I have searched indicates this should work. I have tried different file locations and different versions of Access. Can anyone tell me what I am missing, please?
I'm trying to save several of my queries as XML files in order to re-assemble my database in another location where the only viable transfer method is text or XML files via email (long story).
When I use the built-in export function, Access allows me to select a save location and nest the schema inside of the XML file, and then says that the export was completed successfully. The file is not in the destination folder, and no error was thrown.
This only happens when exporting bound queries. Other Access elements (tables and forms, for example) export just fine.
If I watch the folder during the export process, I see a file appear very briefly and then disappear. Has anyone else experienced this?
How can I export a single database table from parse.com into a *.csv file that is stored online at Parse?
I once received a file in the following format, and now I need to produce one on my own:
http://files.parsetfss.com/f0e70754-45fe-43c2-5555-6a8a0795454f/tfss-63214f6e-1f09-481c-83a2-21a70d52091f-STUDENT.csv
So, the question is: how can I do this? I have not found a dashboard function for it yet.
Thank you very much
You can create a job in Cloud Code that queries through all the rows in the table and generates CSV data for each. This data can then be saved to a Parse file for access by URL.
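Cloud Code jobs themselves are written in JavaScript, but to illustrate the same loop-and-export idea, here is a minimal Python sketch that pages through a class over Parse's REST API and writes a CSV locally. The STUDENT class name, app ID, and REST key are placeholders you would replace with your own:

import csv
import requests

APP_ID = "your-app-id"      # placeholder: your Parse application ID
REST_KEY = "your-rest-key"  # placeholder: your Parse REST API key
URL = "https://api.parse.com/1/classes/STUDENT"  # "STUDENT" is an example class
HEADERS = {"X-Parse-Application-Id": APP_ID, "X-Parse-REST-API-Key": REST_KEY}

rows, skip = [], 0
while True:
    # Page through the class; 1000 is Parse's maximum page size.
    batch = requests.get(URL, headers=HEADERS,
                         params={"limit": 1000, "skip": skip}).json()["results"]
    if not batch:
        break
    rows.extend(batch)
    skip += len(batch)

# Collect every key that appears in any row so the CSV header is complete.
fields = sorted({key for row in rows for key in row})
with open("STUDENT.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)

A Cloud Code job would do the same thing server-side and save the result as a Parse file so it gets a files.parsetfss.com URL like the one above.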
If you are looking to simply export a class every once in a while and you are on a Mac, check out ParseToCSV on the Mac App Store. It works very well.
I'm going to be getting a new computer soon, and I don't want to lose all of the data I have entered in my tables, so I decided to test out the feature that allows you to export and import CSV files. I exported a table successfully (the data was transferred to a CSV file that opens in Microsoft Excel), but when I opened the file in Excel, added a few rows, and tried to import it back into MySQL Workbench, I got the following error:
"Error importing recordset
error calling Python module function
SQLIDEUtils.importRecordsetDataFromFile"
I've searched all over for info on this, but can't find any solutions. Does anyone know what I'm doing wrong?
In Workbench, open a MySQL connection and then navigate to [Server] --> [Data Export]. There are several backup options here, including saving the data as an individual file or folder. Choose the databases you want to export, and then click [Start Export].
If you ever prefer using Excel for editing and such, then use the MySQL for Excel plugin to access MySQL databases from within Excel. However, I don't think you need it here.
To export your MySQL data, use mysqldump, which will create all the schema statements for you.
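A typical invocation (the user and database names here are placeholders) dumps both the schema and the data to a single file:

mysqldump -u your_user -p your_database > your_database_backup.sql

You can then replay that file on the new machine with mysql -u your_user -p your_database < your_database_backup.sql.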
Excel probably added some stuff to your file, and now MySQL can't understand it. The best way to find out is by comparing the files before and after the change.
That error indicates a format problem. If the file is small enough, try opening it in WordPad (or the Mac equivalent) and see if there's any difference in the formatting. It could be that the delimiting got a little messed up (I've noticed this can happen especially with end-of-row markers in MySQL; it can also happen in Mac-to-PC handoffs). If all else fails, you could try exporting in a different format (maybe TSV) and see if that makes a difference when you add new rows. A quick script, shown below, can also report what the file actually contains.
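If you would rather not inspect the file by eye, a small Python check (a sketch; export.csv is a placeholder file name) reports which line endings and which delimiter the file actually uses:

import csv

with open("export.csv", "rb") as f:
    raw = f.read()

# Count each style of line ending to spot Windows/UNIX/Mac mixups.
crlf = raw.count(b"\r\n")
print("CRLF (Windows) endings:", crlf)
print("bare LF (UNIX) endings:", raw.count(b"\n") - crlf)
print("bare CR (old Mac) endings:", raw.count(b"\r") - crlf)

# Sniff the delimiter from a decoded sample of the file.
dialect = csv.Sniffer().sniff(raw.decode("utf-8", errors="replace")[:4096])
print("likely delimiter:", repr(dialect.delimiter))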
Another cause can be the line endings used. Depending on the system and the editor used to work with the CSV file, the line endings might get changed. For me, MySQL supported UNIX line endings, but the line ending in my editor had been set to Mac OS 9 since I was using a Mac. Changing it to UNIX line endings worked.
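If you would rather not depend on an editor setting, a minimal Python sketch (the file names are placeholders) normalizes everything to UNIX (LF) endings:

with open("export.csv", "rb") as src:
    data = src.read()

# Convert CRLF (Windows) first, then any remaining bare CR (Mac OS 9), to LF.
data = data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")

with open("export_unix.csv", "wb") as dst:
    dst.write(data)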
I found that it can also be due to the wrong encoding of the input file. Using Notepad++, for example (or another similar editor), change the file encoding to UTF-8.
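The same conversion can be scripted. A minimal Python sketch, assuming the exported file is Windows-1252 encoded (check what your source actually uses) and with placeholder file names:

with open("export.csv", "r", encoding="cp1252") as src:  # assumption: source encoding
    text = src.read()

with open("export_utf8.csv", "w", encoding="utf-8", newline="") as dst:
    dst.write(text)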
I have received a few .accdb files from a client, and I am trying to open them in Microsoft Access 2013. The files seem to open correctly, but whenever I click on any of the tables or queries on the left I get the following error message:
C:\[hard-coded path on client's computer] is not a valid path. Make sure that the path name is spelled correctly and that you are connected to the server on which the file resides.
Now, I know that the path does not exist on my computer. But why is Access looking for a hard-coded path on another person's computer? And how can I access the tables and queries in Access?
Additional question: Is there an easy way to import the data to SQL Server instead? I read a couple of posts about importing data from Access to SQL Server, but apparently the SQL Server Import and Export Wizard is expecting a file of a different format, not .accdb.
Thanks in advance.
You need to get the back-end database file from your client. All the tables are stored there. Once you receive it, save it at a convenient location on your computer and use the "Linked Table Manager" on the "External Data" tab in the .accdb you already have. That will allow you to update the table links for the current location of the back-end database file on your system.
I have a problem that has been annoying me for quite some time now and a few days ago I started googling for a solution, but I haven't really gotten anything to work. I've read a little about something called SSIS, but I'm not sure it does what I'm looking for or if there is something else I should research in order to accomplish my goal. This is my problem:
My accounting program produces and updates a .dbf file with information about all vouchers and places it in a folder on my local computer. Our MySQL database must continually be updated with this information. So this is what I do twice a day:
Open the .dbf file in Excel
Save it as a .csv
Close Excel
Open the file in Notepad++
Convert the formatting to UTF-8
Save
Log in to MySQL
Go to the right table
Upload the .csv
Replace the old data with the new
As this takes quite a bit of time, I feel there must be a better way to do this. It would be great if I could have this scheduled to run automatically, or if there were some kind of SQL query that could do it, because then I could use PHP to make a website that I could open and run the query from at the press of a button.
So my question is: what is the simplest way to continually get the info from the .dbf file into my MySQL server?
There is a way to do your job on a schedule with DBF Commander Pro's command-line interface. Use the following command in a *.BAT file:
dbfcommander.exe -edb <dbf_file_name> <server_table_name> <connection_string>
After that, create a schedule for this BAT file using the Windows Task Scheduler. The only remaining issue is that you need to clear the destination table in the MySQL database before the export process.
To try the export process in the app's GUI, click 'File -> Export to DBMS'. In the window that appears, click the Build button to build the connection string: select the MS OLE DB Provider for MySQL Server, choose your server from the list, provide a login and password, select a database, and click OK.
In the Export to DBMS window, select the destination table you want to import the source DBF file into, then click Export. You can find the command line you need at the bottom of the window.
You can find more info on importing and exporting DBF files to a database here. Detailed command-line usage is covered here.
You mention possibly doing this in PHP. What is stopping you from doing it there?
You could create one connection handle using the VFP OLE DB provider to open the path location of the table, then open and read the table. Then have a second connection to your MySQL database open and ready to push the data there.
Then, for each row read from the VFP OleDB connection result set, do whatever special cleansing you need to.
Then, query the MySQL connection to check whether the entry already exists, decide whether an insert or an update is needed, and send the data accordingly.
Continue for the rest of the records from the VFP result set.
No need to open in Excel, save to CSV format, load yet another tool, etc...
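To make that flow concrete, here is a minimal Python sketch of the same approach (Python instead of PHP, and the dbfread library instead of the VFP OLE DB provider). The file path, credentials, table schema, and field names are all assumptions for illustration:

from dbfread import DBF
import mysql.connector

# Read the accounting export directly; dbfread understands the DBF format.
table = DBF(r"C:\accounting\vouchers.dbf", encoding="cp1252")  # path and encoding assumed

conn = mysql.connector.connect(host="localhost", user="me",
                               password="secret", database="accounting")
cur = conn.cursor()
for rec in table:
    # Insert each voucher, or update it if the voucher number already exists
    # (assumes a vouchers table with a unique key on voucher_no).
    cur.execute(
        "INSERT INTO vouchers (voucher_no, amount, booked_on) "
        "VALUES (%s, %s, %s) "
        "ON DUPLICATE KEY UPDATE amount = VALUES(amount), booked_on = VALUES(booked_on)",
        (rec["VOUCHER"], rec["AMOUNT"], rec["DATE"]),
    )
conn.commit()
cur.close()
conn.close()

Run on a schedule (Windows Task Scheduler or cron), a script like this replaces all ten manual steps, and the upsert means you never have to clear and re-upload the table.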