I have been working on a module to sync data between a client-side CRM tool (Microsoft Dynamics C5, backed by an MS SQL database) and Magento.
I can see how to push updates from Magento to the CRM DB (an event/observer on the save_after methods): whenever a customer does something like editing their details or placing an order, we can fire an event and update the corresponding record in the CRM DB. So single-record updates are covered.
But how does this work for bulk records? I need to sync Products, Orders and Customers data in both directions.
Presumably it should run via cron, but how can we pick only the updated/added rows on each end to sync? Is that possible, or is the only way to compare every record in both DBs and update them on both sides?
Please help me with this, or can someone suggest the right way to do it?
Thanks
You are touching on the area of Master Data Management (MDM), which can be quite complex.
A direct answer to your question would be to do an SQL dump of the tables you need and import them into Magento. While not very elegant, it will give you a batch update.
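If a full dump is too heavy, one common pattern for picking up only the changed rows (not specific to Dynamics C5; the table, column, and high-water-mark names below are assumptions) is a timestamp-based delta pull that the cron job runs on each side. On the MS SQL side it might look roughly like this:

    -- Hypothetical incremental ("delta") pull on the CRM side.
    -- Assumes each synced table has a ModifiedOn datetime column and that a
    -- SyncState table records the high-water mark of the last successful run.
    DECLARE @LastSync datetime;

    SELECT @LastSync = LastSyncedOn
    FROM dbo.SyncState
    WHERE TableName = 'Customers';

    -- Only rows touched since the last run are exported for the cron job to push to Magento.
    SELECT CustomerId, Name, Email, ModifiedOn
    FROM dbo.Customers
    WHERE ModifiedOn > @LastSync;

    -- After the Magento import succeeds, advance the high-water mark.
    UPDATE dbo.SyncState
    SET LastSyncedOn = GETDATE()
    WHERE TableName = 'Customers';

The Magento side can do the equivalent against its updated_at columns, so each cron run on either end only touches rows changed since the previous run.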
We use Windows Workflow and BizTalk for MDM processes between Magento and CRM.
We also use the security federation platform to provide user information as claims to Magento, where one claim is the customer number.
What’s the best practice for integrating SQL Server with Active Directory (AD)?
NB. I’m using SQL Server 2016
Crux of the issue: I'm using SSRS 2016 and have several reports that need to be filtered based on the user accessing them. Originally I created a table of the users who would need to access the reports. Then, in Report Builder, I passed the UserID as a parameter within the query so that the resulting dataset would be limited to the data the user needed to see.
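For context, a minimal sketch of that original approach, assuming the dataset query joins to the maintained users table and the @UserID report parameter is fed from the built-in User!UserID field (the table and column names here are invented):

    -- Hypothetical SSRS dataset query. @UserID is a report parameter mapped to
    -- the built-in User!UserID field, so each user only sees the rows their
    -- entry in the maintained ReportUsers table allows.
    SELECT s.SaleId, s.Region, s.Amount
    FROM dbo.Sales AS s
    INNER JOIN dbo.ReportUsers AS u
        ON u.Region = s.Region
    WHERE u.WindowsLogin = @UserID;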
The problem this created is that the User table would have to be maintained, and Active Directory is dynamic. Now that I have some time to develop a better option, I'd like to link the LDAP data with SQL Server.
I’m wondering what the best practice for doing this is.
One way I pursued this was through an SSIS package with an ADO.NET connection: convert the data, load it into a table, then schedule a job to run the package however often I need it. This was problematic because, for whatever reason, I couldn't get the data conversion step to work.
The second way I've been approaching this is to create a linked server instance for the AD. My research indicates I'll need to create a function that overcomes the string limitation of the xp_sprintf function, then leverage temp tables and loop through the LDAP data to get around the 1,000-record limit imposed by AD. I've been able to accomplish all of this.
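For anyone following along, the linked-server route typically looks something like the sketch below: an ADSI linked server queried through OPENQUERY. The LDAP path and attribute list are placeholders, and the looping/paging logic that works around the 1,000-row limit is omitted:

    -- One-time setup: create an ADSI linked server (run by an administrator).
    EXEC sp_addlinkedserver
        @server = 'ADSI',
        @srvproduct = 'Active Directory Service Interfaces',
        @provider = 'ADSDSOObject',
        @datasrc = 'adsdatasource';

    -- Example LDAP query through the linked server. AD returns at most 1,000
    -- rows per query, so production code pages through the results (e.g. by
    -- ranges of sAMAccountName) and accumulates them in a temp table.
    SELECT sAMAccountName, displayName, mail
    FROM OPENQUERY(ADSI,
        'SELECT sAMAccountName, displayName, mail
         FROM ''LDAP://DC=example,DC=com''
         WHERE objectCategory = ''person'' AND objectClass = ''user''');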
At this point, though, there appear to be some other issues.
This ultimately increases the code needed in the views for my reports, which may make them harder for other database users to maintain if and when the time comes, to the point that I'd need to abandon the views and create stored procedures for the reports to pull from.
It also adds an LDAP round trip on top of the SQL Server work every time a user accesses a report.
To resolve that, I could wrap the original LDAP query in a stored procedure that populates a table, and then create a job to run that stored procedure every so often.
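That second variation could be as simple as something like the following (object names are placeholders), with a SQL Server Agent job running it on a schedule so the report views only ever read a local table:

    -- Hypothetical procedure that refreshes a local snapshot of AD users.
    -- Reports query dbo.ADUsers instead of hitting LDAP on every execution.
    CREATE PROCEDURE dbo.RefreshADUsers
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE dbo.ADUsers;

        INSERT INTO dbo.ADUsers (WindowsLogin, DisplayName, Email)
        SELECT sAMAccountName, displayName, mail
        FROM OPENQUERY(ADSI,
            'SELECT sAMAccountName, displayName, mail
             FROM ''LDAP://DC=example,DC=com''
             WHERE objectCategory = ''person'' AND objectClass = ''user''');
    END;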
Either option solves the problem of maintaining the users table, which is good, but it isn't perfect because AD changes can take place at any time.
Which option is better here?
If the SSIS package is the better route, I'm curious why. I'm not opposed to going back and figuring out what I'm missing in the SSIS package to make it work.
Are there additional options I should consider if I want to get the most up-to-date Active Directory listing?
Thanks.
Question: I have two databases: one is the client's (live) database and the other is mine; both are MySQL. I should not access the client's database directly, so I created my own database. Using the Talend data-warehousing tool I created a job for each table, and by executing all the jobs I can pull the updated data from the client's live database into my database. Currently I have to run these jobs manually to get the updated data. My question is: is there any process that will automatically notify me when the client inserts or updates data in their database, so I can run those jobs manually to pull the updated data into mine? Or, when the client updates a table, can the associated job be run automatically? Please help me with this.
You would need to set up a database trigger that somehow notifies the Talend job and runs it. To do this you'd typically call the job as a web service from a stored procedure or user-defined function. This link shows a typical way a web service may be called from an update trigger, for example.
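MySQL cannot call a web service natively, so a trigger-based approach relies on a user-defined function. A minimal sketch, assuming the sys_exec UDF from the lib_mysqludf_sys library is installed and the Talend job is exported as a web service (the table name and URL are placeholders):

    -- Hypothetical trigger that kicks off the Talend job's web service whenever
    -- a row is inserted. Requires the sys_exec UDF (lib_mysqludf_sys).
    DELIMITER $$

    CREATE TRIGGER trg_orders_after_insert
    AFTER INSERT ON orders
    FOR EACH ROW
    BEGIN
        -- Fire-and-forget HTTP call; output is discarded so the insert isn't blocked.
        DO sys_exec('curl -s "http://etl-server:8080/talend/SyncOrdersJob" > /dev/null &');
    END$$

    DELIMITER ;

Be aware that calling out from inside a trigger ties external work to every insert, which is one reason to prefer the CDC or scheduler options below.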
If your source tables are large, then rather than extracting all of the data and (I'm guessing) dropping and recreating your table, you could use a tMysqlCDC component to process only the changes. The built-in tutorial for the component pretty much covers a useful example of this in practice. If you see regular changes in the source database, this could make your job much more performant.
If you have absolutely no access to your client's database, then you could alternatively just run the job on a scheduler. The Enterprise editions of Talend come with the Talend Administration Console, which lets you set CRON triggers for a job; it could easily be set to run every minute or at any other interval (not seconds). Alternatively, you could use your operating system's scheduler to run the job at the desired interval.
If you can't modify your client's database (i.e. add triggers) and there is no other way to identify changed records (e.g. some kind of audit table), then you're out of luck.
Hey, I need some help here. I am developing a VB.NET desktop application using Visual Studio 2010 and MySQL 5.5. The database will live on a main server, so the applications will have to communicate with it to get/post information.
My problem, however, is that I want my applications to reflect updates to the database content in real time, such that if user1 updates the database, the change immediately shows up in user2's application. I have read articles that recommend using triggers and stored procedures in MySQL to do this, but I have no idea how that would work.
I have a table called 'Store'; when a user enters an item into the Store table, I'd like the application to pick it up and show the item and its contents in a listbox or datagrid.
How do I capture these events in my VB.NET code?
I hope I am clear enough; if not, please ask. All suggestions will be highly appreciated!
I think the easiest way to solve the problem would be to periodically fetch the records from the server and update the local list appropriately. In your client application you could store the last time the list was updated, send that as a parameter when you fetch, and retrieve only the items created after that time (that way you only get new items).
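Concretely, the client's periodic poll could issue something like this (assuming the Store table carries an updated_at timestamp column, which is an assumption, with @lastChecked bound as a parameter from the VB.NET code):

    -- Hypothetical polling query run every few seconds by the client.
    -- @lastChecked holds the timestamp of the last successful refresh, so only
    -- rows added or changed since then come back.
    SELECT item_id, item_name, quantity, updated_at
    FROM Store
    WHERE updated_at > @lastChecked
    ORDER BY updated_at;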
Another possible solution would be to have each client register with the server application (so the server keeps track of all the clients); then, when a new record is created, the server sends it to all the clients, which would be listening for that event.
It's sort of hard to say what the 'best' approach is in your situation. How is your code designed (what's the architecture)? Are connections between the client and server applications persistent? Is there a server application that your client application talks to or is each client directly talking to the database?
We currently have an OLTP SQL Server 2005 database for our project. We are planning to build a separate, de-normalized reporting database so we can take the load off our OLTP DB. I'm not quite sure of the best approach to sync these databases; we are not looking for a real-time system, though. Is SSIS a good option? I'm completely new to SSIS, so I'm not sure about the feasibility. Kindly provide your input.
Everyone has their own opinion of SSIS, but I have used it for years for data marts, and my current environment is a full BI installation. I personally love its data-movement capabilities, and it still holds the world record for moving 1.13 terabytes in under 30 minutes.
As for setup, we use log shipping from our transactional DB to populate a second box, then use SSIS to de-normalize and warehouse the data. The SSIS community is also very large, and there is plenty of free training and other helpful material online.
We build our data warehouse using SSIS and run our reports from it. It's a big learning curve and the errors it throws aren't particularly useful, so it helps to be good at SQL rather than treating it as a "row by row" transfer; what I mean is that you should be writing set-based queries in SQL command tasks rather than using lots of SSIS components and data flow tasks.
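To illustrate the set-based point: instead of a data flow that looks up and aggregates row by row, push the work into an Execute SQL Task with a query along these lines (the table and column names are invented):

    -- Hypothetical set-based load run from an SSIS Execute SQL Task: a single
    -- INSERT ... SELECT with aggregation replaces a row-by-row data flow.
    INSERT INTO dw.DailyJobSummary (JobDate, ClientId, JobCount, TotalHours)
    SELECT j.JobDate, j.ClientId, COUNT(*), SUM(j.Hours)
    FROM staging.Jobs AS j
    GROUP BY j.JobDate, j.ClientId;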
Understand that every warehouse is different and you need to decide how best to build yours. This link may give you some good ideas.
Here is how we implement ours (we have a Postgres backend and use the PGNP provider; making use of linked servers could make your life easier):
First of all, you need a timestamp column in each table so you can tell when each row was last changed.
Then write a query that selects the data that has changed since you last ran the package (an audit table would help here) and load it into a staging table. We run this as a data flow task because (using Postgres) we don't have any other choice, although you may be able to use a normal cross-database reference (dbname.schemaname.tablename or something like that) or a linked server query. Either way the idea is the same: you end up with just the data that has changed since your last run.
We then update (based on ID) the rows that already exist, then insert the new rows (using a left join against the warehouse table to find what doesn't already exist there).
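A sketch of that update-then-insert step in SQL (the staging and warehouse table/column names are invented):

    -- 1) Update warehouse rows that already exist, matching on the business key.
    UPDATE w
    SET w.Status     = s.Status,
        w.Hours      = s.Hours,
        w.ModifiedOn = s.ModifiedOn
    FROM dw.Jobs AS w
    INNER JOIN staging.Jobs AS s
        ON s.JobId = w.JobId;

    -- 2) Insert rows that don't exist yet: left join and keep the unmatched ones.
    INSERT INTO dw.Jobs (JobId, Status, Hours, ModifiedOn)
    SELECT s.JobId, s.Status, s.Hours, s.ModifiedOn
    FROM staging.Jobs AS s
    LEFT JOIN dw.Jobs AS w
        ON w.JobId = s.JobId
    WHERE w.JobId IS NULL;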
So now we have one denormalised table showing, in our case, jobs per day. From this we calculate other tables based on aggregated values.
Hope that helps; here are some good links that I found useful:
Choosing .Net or SSIS
SSIS Talk
Package Configurations
Improving the Performance of the Data Flow
Transformations
Custom Logging / Good Blog
I've perused the threads here on migration from SQL 2000 to SQL 2008 but haven't really run into my question, so here we go with another one.
I'm building a strategy to move specific SQL 2000 databases to a new SQL 2008 R2 instance. My question concerns the best method for transferring the schema and data. One way I know of is the quick-and-dirty detach/copy/attach method, which should work so long as I've done my homework with regard to compatibility, code, and so on.
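For reference, the detach/copy/attach route boils down to something like this (the database name and file paths are placeholders):

    -- On the SQL 2000 source: detach the database so its files can be copied.
    EXEC sp_detach_db @dbname = 'MyAppDb';

    -- Copy the .mdf/.ldf files to the new server, then on SQL 2008 R2:
    CREATE DATABASE MyAppDb
    ON (FILENAME = 'D:\SQLData\MyAppDb.mdf'),
       (FILENAME = 'E:\SQLLogs\MyAppDb_log.ldf')
    FOR ATTACH;

    -- Raise the database to the newest compatibility level it passes testing on.
    ALTER DATABASE MyAppDb SET COMPATIBILITY_LEVEL = 100;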
What if, though, I wrote the schema and logins via script and then copied the data via SSIS? I'm thinking of trying that so I can more easily integrate some of my test cases into the package (error handling and whatnot). What would I be setting myself up for if I did this?
Since you are moving the data between servers or instances, I would recommend moving it via data flows. If you don't expect to run the code more than once, you can let the wizard generate it for this move. However, when I did this 2+ years ago, the wizard generated Execute SQL Tasks that lumped many CREATE TABLE commands into a single task, plus a few data flow tasks with multiple sources and destinations for inserting the data. That was good enough to get up and running, but it was inadequate when I wanted to refresh the tables again after modifying the schema of the new target tables. If you expect to run the refresh more than once, you may want to take the time to create the target schema first and then build the data flows by hand.
Once you have moved the data, you can enable full-text search on the new server; I don't believe you need it enabled for the initial load.
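If you do need it, enabling full-text on the new server is just a catalog plus an index on the relevant table (the names here are placeholders):

    -- Hypothetical full-text setup on the 2008 R2 server after the data load.
    CREATE FULLTEXT CATALOG ftCatalog AS DEFAULT;

    -- Requires an existing unique index on the table (PK_Documents is assumed).
    CREATE FULLTEXT INDEX ON dbo.Documents (DocumentBody)
        KEY INDEX PK_Documents
        ON ftCatalog
        WITH CHANGE_TRACKING AUTO;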
One reason I recommend against the detach/attach method for migration is that you bring all the dirty laundry from the 2000 database over to the 2008 R2 database. If you had overly lax security on the 2000 server, or many ancient users that shouldn't exist, it can be easier to clean things up by starting from scratch. If you use the detach/attach method, then you have to worry about those users.
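If you do end up attaching the old databases anyway, the usual cleanup is to report and remap the orphaned users, for example:

    -- List database users whose SIDs no longer match a login on the new server.
    EXEC sp_change_users_login @Action = 'Report';

    -- Remap an orphaned user to an existing login (AppUser/AppLogin are placeholders).
    ALTER USER AppUser WITH LOGIN = AppLogin;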