How to run an append query from a data macro in MS Access? - mysql

I have 2 tables: one is a local table named 'Client' and the other is a linked table named 'ClientSql' that points to a MySQL DB on my web server.
When I insert data into the Client table, I want it to be inserted into ClientSql too. I tried it with a data macro (After Insert), but it shows me an error saying
It's not possible in linked tables.
I have successfully created an append query, and it works, but only when I execute it manually. My question is:
Is it possible to call it from a data macro? If so, can you show me how? If not, can you point me to a solution?

I'm not sure why you would be able to do it manually but not via a macro. According to this link
http://dev.mysql.com/doc/connector-odbc/en/connector-odbc-examples-tools-with-access-linked-tables.html
you should be able to do it either way.
Another thought is to eliminate the local Access Client table and have the Access program update the MySQL table directly. However, if another program is accessing the Client table at the same time, this could become tricky due to multi-user and locking situations.
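For what it's worth, the append query itself is just an INSERT ... SELECT against the linked table. A minimal sketch, assuming both tables share the same column layout (adjust the column list to your real schema):
INSERT INTO ClientSql
SELECT Client.*
FROM Client;
If you only want rows that aren't already on the server, add a WHERE clause that excludes existing keys, for example WHERE ClientID NOT IN (SELECT ClientID FROM ClientSql), where ClientID is a hypothetical key column.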

Related

SQL Server rows not editable for Access after Insert

I have this problem: I'm using a SQL Server 2008R2 backend and MS Access 2000 frontend where some tables are connected via ODBC.
Structure (all tables are on SQL Server):
Import (not connected to Access)
Products (connected via ODBC to Access)
Pricing (connected via ODBC to Access)
I want to fill the Pricing table automatically with some data from Products and Import. This is supposed to run as a SQL Agent job with a T-SQL script.
I want to insert the data from "Products" with the following command:
INSERT INTO Pricing (Productnr, Manufacturernr)
SELECT Productnr, Manufacturernr
FROM Products
WHERE Valid = 1
  AND Productnr NOT IN (SELECT Productnr FROM Pricing);
Right after that, the inserted rows are locked for Access; I can't change anything. If I execute SQL queries with SQL Server Management Studio, or if I start queries as SQL Agent jobs, everything works fine.
Why are the rows locked in MS Access after the query ran (even though it finished successfully)? And how can I unlock them, or have them unlock automatically right after the query/job has run?
Thanks
When SQL Server inserts new rows, those new rows are in fact exclusively locked to prevent other transactions from reading or manipulating them - that's by design, and it's a good thing! And it's something you cannot change - you cannot insert without those locks.
You can unlock them by committing the transaction that they're being inserted under - once they're committed to SQL Server, you can access them again normally.
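As a sketch of what that means for the Agent job's T-SQL step: if the INSERT runs inside an explicit transaction, the locks are held exactly until the COMMIT runs (this is just the pattern, not a change to your query):
BEGIN TRANSACTION;
INSERT INTO Pricing (Productnr, Manufacturernr)
SELECT Productnr, Manufacturernr
FROM Products
WHERE Valid = 1
  AND Productnr NOT IN (SELECT Productnr FROM Pricing);
COMMIT TRANSACTION;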
The error message I get says that the record has been changed by another user, and that if I save it, I would undo the other user's changes (and it asks whether I want to copy my changes to the clipboard).
This is different from "locked", and completely normal.
If you have a ODBC linked table (or form based on the table) open, and change data in the backend, Access doesn't know about the change.
You need to do a full requery (Shift+F9) in Access to reload the data, afterwards all records can be edited again.
I got the solution to my problem now.
I had to add a timestamp column to the Pricing table so that Access could recognize the change.
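For anyone hitting the same issue, adding such a column is a one-statement change. A sketch, where the column name RowVer is only an example:
ALTER TABLE Pricing ADD RowVer rowversion;
(rowversion is the current name for the old timestamp data type.) You will probably also need to refresh the linked table in Access afterwards so it picks up the new column.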
Access loads the data into the front end when the table is first accessed. If something in the backend changes the data, you need Access to refresh it first, before you can edit it from the front end (or see the changes).
Do this in Access by closing and reopening the table, by switching to the table and pressing Shift+F9 as Andre suggested, or programmatically using a Requery statement. You must requery, not refresh, for Access to release the locks and register the changes made in SQL Server.

Modifying table entries from LibreOffice Base, possible?

I've successfully connected LibreOffice Base to a MySQL database server. I've verified that if I modify my table on the host (a free hosting service on the internet), the changes are reflected when I refresh the table object in LO Base.
But my question is: can I modify the DB table directly from LO Base? I guess it's possible using SQL queries from LO Base, but how? Please give me some insights or tutorials. Thanks.
The normal way to alter a table:
Tools -> SQL
Enter an ALTER TABLE command and press Execute button.
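For example (the table and column names here are just placeholders for your own schema):
ALTER TABLE Clients ADD COLUMN Phone VARCHAR(20);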
A way that works, even though it complains that no result set is returned:
Create a query in SQL view.
Enter ALTER TABLE command.
Click the button in the toolbar to mark it as Run SQL command directly, or use Edit -> Run SQL command directly.
Close the query and double-click to run it.
My guess is it could be done with a macro as well, similar to https://forum.openoffice.org/en/forum/viewtopic.php?f=5&t=75763 but using ALTER TABLE.
For more ideas see https://forum.openoffice.org/en/forum/viewtopic.php?f=61&t=37687.
EDIT:
Inserting new row data in a form is easier than altering the table. First, make sure this works:
Double-click on your table under Tables.
Insert -> Record, or enter data in the last new row.
If Insert -> Record is disabled, then you need to set up the table for editing. Make sure that your connection to the database allows editing. Also the table must have a primary key.
Once you can insert records in Table view, it's time to create the form:
Under Forms, Use Wizard to Create Form.
Select your table and press >> to include all fields.
Click Finish.
Now you should be able to open the form and enter data into the final new row.
More complete instructions with examples are at http://www.open-of-course.org/courses/mod/url/view.php?id=786.

SSIS - check if row exists before importing from source to destination

Can someone please advise:
I have a SOURCE (an old SQL Server) and we need to move the data to a new DESTINATION (a new server), so I'm moving data between different instances.
I'm struggling with how to write the package so that it first looks up the row in the destination and, if the row exists, does nothing; otherwise it INSERTs.
Regards
Here are the steps:
Take an OLE DB Source and connect it to a Lookup transformation.
In the Lookup, select the column to look up against the destination; there should be some kind of ID for you to do this. Also select all the columns that need to be passed through (SSIS provides check boxes for this).
Connect the Lookup's "no match" output to an OLE DB Destination, do the mapping, and you are done.
If you want to redirect all the matching rows somewhere, say to a flat file, you can do that too...
I would use a Lookup transformation and redirect the match output to something like an OLE DB Command. In there you can write an IF EXISTS statement, or put that logic in a stored procedure, so that it either inserts or updates the data instead of inserting duplicates.
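If you prefer to keep that logic in T-SQL on the destination (for example in a stored procedure called from an OLE DB Command or Execute SQL Task), a rough sketch of the insert-if-not-exists pattern looks like this. The table and column names are placeholders, and it assumes the source rows have already been landed on the destination server (e.g. in a staging table), since the two tables live on different instances:
INSERT INTO dbo.TargetTable (Id, Name)
SELECT s.Id, s.Name
FROM dbo.StagingTable AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable AS t
                  WHERE t.Id = s.Id);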

SQL 2008 - Alternative to trigger

I am looking for a solution to the following:
Database: A
Table: InvoiceLines
Database: B
Table: MyLog
Every time lines are added to InvoiceLines in database A, I want to run a query that updates the table MyLog in database B. And I want it instantly.
Normally I would create a trigger in database A on INSERT in InvoiceLines. The problem is that database A belongs to an ERP program where I don't want to make any changes at all (updates, unknown functionality in the 3-layer program, etc.).
Any hints to help me in the right direction...?
You can use transactional replication to send changes from your table in database A to a copy in DB B, then create your triggers on the copy. It's not "instant," but it's usually considered "near real time."
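A rough sketch of what such a trigger on the replicated copy could look like (the column names of MyLog are assumptions here, since its real structure isn't given):
CREATE TRIGGER trg_InvoiceLines_Log
ON dbo.InvoiceLines
AFTER INSERT
AS
BEGIN
    INSERT INTO dbo.MyLog (LoggedAt, InvoiceLineID)
    SELECT GETDATE(), i.InvoiceLineID
    FROM inserted AS i;
END;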
You might be able to use DB mirroring to do this somehow, but you'd have to do some testing to see if you could get it to work right (maybe set up triggers in the mirror that don't exist in the original?)
One possible way to replicate the trigger's functionality without touching the database is to poll the table from an external application (e.g. Java), which, on finding a new insert, would fire the required query.
In SQL Server 2008, something similar can be done via a C# (SQLCLR) assembly, but again this needs to be installed, which requires changing the database.
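A sketch of the query such a polling application could run, assuming InvoiceLines has an ever-increasing identity column (InvoiceLineID here is an assumption) and the application remembers the last value it processed:
DECLARE @LastProcessedID int = 0;  -- in practice, kept by the polling app
SELECT InvoiceLineID
FROM A.dbo.InvoiceLines
WHERE InvoiceLineID > @LastProcessedID
ORDER BY InvoiceLineID;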

When a new row is added to the database, an external command-line program must be invoked

Is it possible for a MySQL database to invoke an external exe file when a new row is added to one of the tables in the database?
I need to monitor the changes in the database, so when a relevant change is made, I need to do some batch jobs outside the database.
Chad Birch has a good idea with using MySQL triggers and a user-defined function. You can find out more in the MySQL CREATE TRIGGER Syntax reference.
But are you sure that you need to call an executable right away when the row is inserted? It seems like that method will be prone to failure, because MySQL might spawn multiple instances of the executable at the same time. If your executable fails, then there will be no record of which rows have been processed yet and which have not. If MySQL is waiting for your executable to finish, then inserting rows might be very slow. Also, if Chad Birch is right, then you will have to recompile MySQL, so it sounds difficult.
Instead of calling the executable directly from MySQL, I would use triggers to simply record the fact that a row got INSERTED or UPDATED: record that information in the database, either with new columns in your existing tables or with a brand new table called say database_changes. Then make an external program that regularly reads the information from the database, processes it, and marks it as done.
Your specific solution will depend on what parameters the external program actually needs.
If your external program needs to know which row was inserted, then your solution could be like this: Make a new table called database_changes with fields date, table_name, and row_id, and for all the other tables, make a trigger like this:
DELIMITER $$
CREATE TRIGGER `my_trigger`
AFTER INSERT ON `table_name`
FOR EACH ROW BEGIN
  INSERT INTO `database_changes` (`date`, `table_name`, `row_id`)
  VALUES (NOW(), 'table_name', NEW.id);
END$$
DELIMITER ;
Then your batch script can do something like this:
Select the first row in the database_changes table.
Process it.
Remove it.
Repeat 1-3 until database_changes is empty.
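In SQL terms, one pass of that loop might look roughly like this (a sketch; the ? placeholders are bound by your batch program to the values it just fetched):
SELECT `date`, `table_name`, `row_id`
FROM `database_changes`
ORDER BY `date`
LIMIT 1;
-- process the change externally, then remove exactly that row
DELETE FROM `database_changes`
WHERE `table_name` = ? AND `row_id` = ?;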
With this approach, you can have more control over when and how the data gets processed, and you can easily check to see whether the data actually got processed (just check to see if the database_changes table is empty).
You could do what replication does: hook into the binary log. Set up your server as a master server, and instead of adding a slave server, run mysqlbinlog. You'll get a stream of every command that modifies your database.
Or step in between the client and server: check out MySQL Proxy. You point it at your server and point your client(s) at the proxy. It lets you interpose Lua scripts to monitor, analyze, or transform any SQL command.
I think it's going to require adding a User-Defined Function, which I believe requires recompilation:
MySQL FAQ - Triggers: Can triggers call an external application through a UDF?
I think it's really a MUCH better idea to have some external process poll the table for changes and execute the external program - you could also have a column which contains the status of this external program run (e.g. "pending", "failed", "success") - and just select rows where that column is "pending".
It depends how soon the batch job needs to be run. If it's something which needs to be run "sooner or later" and can fail and need to be retried, definitely have an app polling the table and running the jobs as necessary.
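A rough sketch of that status-column approach (the table name orders and the column processing_status are placeholders):
SELECT id
FROM orders
WHERE processing_status = 'pending';
-- after the external program has run for a row, mark it accordingly
UPDATE orders
SET processing_status = 'success'  -- or 'failed'
WHERE id = ?;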