MS Access 2013 export query-to-XML not saving file - ms-access

I'm trying to save several of my queries as XML files in order to re-assemble my database in another location where the only viable transfer method is text or XML files via email (long story).
When I use the built-in export function, Access lets me select a save location and embed the schema inside the XML file, and then reports that the export completed successfully. However, the file is not in the destination folder, and no error was thrown.
This only happens when exporting bound queries. Other Access elements (tables and forms, for example) export just fine.
If I watch the folder during the export process, I see a file appear very briefly and then disappear. Has anyone else experienced this?
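A minimal VBA sketch of the same export, which can help isolate whether the wizard or the query itself is at fault (the query name and output path below are placeholders):

Public Sub ExportQueryToXml()
    ' acExportQuery marks the data source as a saved query;
    ' acEmbedSchema nests the schema inside the XML file, like the wizard option described above.
    Application.ExportXML ObjectType:=acExportQuery, _
                          DataSource:="qryMyQuery", _
                          DataTarget:="C:\Temp\qryMyQuery.xml", _
                          OtherFlags:=acEmbedSchema
End Sub

If this call raises a runtime error, the error text is usually more informative than the wizard's silent failure.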

Related

How to handle Onedrive duplicating accdb files?

We currently have a shared Access database in a OneDrive folder.
Users interact with the file via an Excel file with read/write functions, e.g. dbInteract.xlsm.
Otherwise, nothing else interacts with the accdb file. The Excel file points to a specific accdb file name, e.g. myDb.accdb.
The problem arises when (I think) there is a syncing error and a user interacts with the database anyway.
OneDrive then creates duplicates of myDb.accdb and names them myDb-user1.accdb, etc.
The file myDb-user1.accdb becomes the latest updated file, while myDb.accdb remains untouched.
When a user retrieves info via dbInteract.xlsm, the info still comes from myDb.accdb (the file that was not updated). This has created confusion among users: they try to update the database, see that nothing appears to have changed, and unknowingly create even more accdb files on OneDrive.
The issue resolves itself once the spare databases have been deleted, but I was wondering whether there are any preventive measures I can take.
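One possible preventive measure (a minimal sketch, assuming the workbook and database share a synced folder and the file names above) is to have dbInteract.xlsm warn before any write when OneDrive conflict copies are present:

Public Function ConflictCopiesExist(ByVal folderPath As String) As Boolean
    ' OneDrive names conflict copies myDb-<user>.accdb, so any match on this
    ' pattern means the canonical myDb.accdb may be stale.
    ConflictCopiesExist = (Len(Dir(folderPath & "myDb-*.accdb")) > 0)
End Function

Public Sub CheckBeforeWrite()
    ' The folder path is a placeholder for the synced OneDrive location.
    If ConflictCopiesExist("C:\Users\Me\OneDrive\SharedDb\") Then
        MsgBox "OneDrive conflict copies of myDb.accdb were found. " & _
               "Resolve or delete them before updating the database.", vbExclamation
    End If
End Sub

This does not prevent the sync conflict itself, but it surfaces the duplicates before users spend time writing updates that others will never see.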

How to import excel/csv with "File Import" widget in Foundry's Slate?

Context:
For a data pipeline we need to ingest Excel spreadsheets directly into Foundry (they arrive via email). In order to avoid any manual handling errors, we'd like to build a small Slate app that basically just uploads an Excel sheet and automatically appends it to an existing dataset (given schema, headers, etc.).
Unfortunately, there is very little documentation on the "File Import" widget or the API that gets called when drag and dropping a file into a folder.
Idea: Is there a way of uploading a file with Slate? Could this file then be added to a dataset, similarly to the prompt that opens when dropping it into a folder?
You actually don't have to build a Slate app to do this! Datasets that are made up of underlying .csv files support new additions of files directly.
Note: All of the following screenshots are from the dataset preview page.
For example, I created the following dataset from 4 .csv files.
I can then click the Import button in the top right to add more files, with the same schema or not, depending on whether you want to strictly adhere to your applied schema.
If you have already applied a schema, you can also simply import new files on top of the dataset, but the schemas of those files must exactly match the one already present; otherwise the dataset will fail when it is read.

How to modify an Access DB file that is in a SharePoint Document Library

I have an accdb file (an MS Access DB file) in a SharePoint Document Library. I want to edit the file in the browser, or at least without downloading it to my local machine.
I am able to view the file but not able to edit it.
Need help.
Access doesn't really work this way. You can store the data tables in SharePoint, but you still need a local Access file to interact with them. You can link to tables in SharePoint: instead of creating a brand-new table, you import (link to) the data source.
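A rough VBA sketch of the linking step (the site URL and list name are placeholders):

Public Sub LinkSharePointList()
    ' acLinkSharePointList creates a linked table in the local front-end
    ' that reads and writes the list stored on the SharePoint site.
    DoCmd.TransferSharePointList TransferType:=acLinkSharePointList, _
                                 SiteAddress:="https://contoso.sharepoint.com/sites/MySite", _
                                 ListID:="MyList"
End Sub

The same linking can also be done interactively from the External Data ribbon.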
Microsoft does have a newish program out now called PowerApps that may help you accomplish something closer to what you’re describing.

SSIS empty Excel columns causing error

Using Microsoft Visual Studio Community 2015.
Goal of the project:
-create a "*\temp\email" directory
-start a program to extract all emails that include xls attachments into the previously created folder
-use a For Each loop to cycle through each file in the folder, process it, and load it into a SQL table.
The problem I am running into is caused either by a blank Excel document (which is occasionally sent from a remote location) or by some of the original xls reports containing only 5 columns instead of the 6 I have mapped. Is there any way to separate the files that contain the correct columns from those that do not match?
As long as these two problems do not occur, the SSIS package runs without issue.
Control flow:
File System Task (creates the directory) ---> Execute Process Task (xls extraction) ---> ForEach Loop (Data Flow Task "email2Sql")
Data flow:
Excel Source (uses an expression on ExcelFilePath, #user:filepath; DelayValidation == true)
(Columns are initially set to F1-F6 and are mapped to, for example, a, b, c, d, e, f. The older files that get mixed in only include a, b, c, d, e.) This is where I want to be able to separate the xls files.
Conditional Split transformation (column names are not in row 1; this helps remove "null" values)
OLE DB Destination (SQL table)
Sorry for the amount of reading, but for a first post I tried to include anything I thought might be relevant.
There are some tools out there that would let you open the Excel doc and read it. However, I think the simplest approach is to use SSIS out of the box:
1 - Add a File System Task after the data flow that reads the file.
2 - Set the precedence constraint from the data flow to the File System Task to "Failure". This causes the task to fire only when the Data Flow Task fails.
3 - Set the File System Task to move the "bad" files to another folder.
This lets you loop through all the files and move the failed ones. Ultimately, the package will still end in failure. If you don't want that behavior, you can change the ForceExecutionResult property to Success. However, it might be good to know that there were problems with some files so that they can be addressed.

Easiest way to continually import data to MySQL from a dbf file on my local computer

I have a problem that has been annoying me for quite some time now and a few days ago I started googling for a solution, but I haven't really gotten anything to work. I've read a little about something called SSIS, but I'm not sure it does what I'm looking for or if there is something else I should research in order to accomplish my goal. This is my problem:
My accounting program produces and updates a .dbf file with information about all vouchers and places it in a folder on my local computer. Our MySQL database must continually be updated with this information. So this is what I do twice a day:
I open up the .dbf file in Excel
Save it as a .csv
Close Excel
Open the file in Notepad++
Convert the encoding to UTF-8
Save
Log in to MySQL
Go to the right table
Upload the .csv
Replace the old data with the new
As this takes quite a bit of time, I feel there must be a better way. It would be great if this could be scheduled to run automatically, or if there is some kind of SQL query that could do it, because then I could use PHP to make a website I could visit and have the query run when I press a button.
So my question is: What is the most simple way to continually get the info from the .dbf file into my SQL server?
There is a way to do this job on a schedule with DBF Commander Pro's command-line interface. Use the following command in a *.BAT file:
dbfcommander.exe -edb <dbf_file_name> <server_table_name> <connection_string>
After that, create a schedule for this BAT file using the Windows Task Scheduler.
The only remaining issue is that you need to clear the destination table in the MySQL database before the export process.
To try the export process in the app's GUI, click 'File -> Export to DBMS'. In the window that appears, click the Build button to build the connection string: select the MS OLEDB Provider for MySQL Server, choose your server from the list, provide a login and password, select a database, and click OK.
In the Export to DBMS window, select the destination table you want to import the source DBF file into, then click Export. The command line you need can be found at the bottom of the window.
More info on importing and exporting DBF files to a database can be found here. Detailed command-line usage is here.
As for your mention of doing this in PHP: what is stopping you from doing it there?
You could create one connection handle using the VFP OLE DB provider to open the folder containing the table, then open and read the table. Have a SECOND connection to your MySQL database open and ready to push the data there.
Then, for each row read from the VFP OLE DB result set, do whatever special cleansing you need to.
Then query the MySQL connection to see whether the row already exists, decide whether an insert or an update is needed, and send the data accordingly.
Continue for the rest of the records from the VFP result set.
No need to open in Excel, save to CSV format, load yet another tool, etc...
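The same two-connection pattern, sketched here in VBA with ADO rather than PHP purely for illustration (the folder path, connection strings, table name, and column names are all assumptions):

Public Sub PushDbfToMySql()
    ' Connection 1: VFP OLE DB provider pointed at the folder that holds the .dbf file.
    Dim dbfConn As Object, myConn As Object, rs As Object
    Set dbfConn = CreateObject("ADODB.Connection")
    dbfConn.Open "Provider=VFPOLEDB.1;Data Source=C:\Accounting\Export\"

    ' Connection 2: MySQL, here via a MySQL ODBC driver (DSN-less connection string assumed).
    Set myConn = CreateObject("ADODB.Connection")
    myConn.Open "Driver={MySQL ODBC 8.0 Unicode Driver};Server=myserver;Database=mydb;Uid=myuser;Pwd=secret;"

    ' Read the voucher rows from the DBF table (file name without the .dbf extension).
    Set rs = dbfConn.Execute("SELECT voucher_no, amount FROM vouchers")

    Do Until rs.EOF
        ' REPLACE INTO keeps the example short; an explicit existence check with
        ' INSERT or UPDATE, as described above, works the same way. Values are
        ' concatenated for brevity; parameterised commands are preferable.
        myConn.Execute "REPLACE INTO vouchers (voucher_no, amount) VALUES (" & _
                       rs.Fields("voucher_no").Value & ", " & rs.Fields("amount").Value & ")"
        rs.MoveNext
    Loop

    rs.Close: dbfConn.Close: myConn.Close
End Sub

Scheduled to run twice a day (or more often), something along these lines replaces the Excel/Notepad++/CSV round trip entirely.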