SSIS: Updating User Variables from a CSV File

I am fairly new to SSIS and I have been looking everywhere for the answer to this question without finding it, which makes me think it's really simple and obvious, because I'm pretty sure this is a standard problem with SSIS.
I am building an SSIS package to automate the uploading of data.
We have a multi-instance environment across four servers and are using SQL Server 2005. I therefore have a user variable for the server name and one for the instance name. The database and table will always remain the same. The data is held in an Excel file, but I will import it as CSV.
Is there a way for me to update the user variables from the CSV file? Is T-SQL's OPENROWSET the way forward?
I had previously been updating the variables from the table I had imported the data into, but then I realised that in a live situation I won't know where to import the data to, as the values will still be in the CSV file.
Please help! This is driving me crazy, and I have a sinking feeling that the answer is really obvious, which is making it worse!!
Thank you!
Julie

There is a good example of how to load a user variable from a flat file here:
http://vsteamsystemcentral.com/cs/blogs/applied_team_system/archive/2007/01/10/247.aspx
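For what it's worth, the logic behind that example can be sketched outside SSIS too. The snippet below is written in Python purely for illustration; in SSIS 2005 the equivalent would live in a Script Task that assigns to Dts.Variables. The column names ServerName and InstanceName are assumptions, not from the original post.

```python
import csv

def read_connection_values(path):
    """Read the first data row of the CSV and return the values that
    would be assigned to the SSIS user variables."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)   # header row supplies the column names
        first_row = next(reader)     # only the first data row matters here
    # In a Script Task these assignments would become, e.g.:
    #   Dts.Variables("User::ServerName").Value = ...
    return first_row["ServerName"], first_row["InstanceName"]

server_name, instance_name = read_connection_values("upload.csv")
print(server_name, instance_name)
```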

Related

How to Import a CSV file into an entity automatically?

Is there a way to import a CSV file into a CRM record automatically, say when the CSV file is created?
The plan is that this CSV file would contain some cost center hours and a job number which corresponds to a certain record already created in CRM.
Uploading this CSV would then update that record.
Please help me solve this problem.
You can import data in D365 using both the UI and code.
There are also plenty of tools that solve exactly this problem: KingswaySoft SSIS, Scribe, etc.
But it looks like buying 3rd-party software might be overkill in your scenario. You can use Windows Task Scheduler and write a few PowerShell scripts to implement it; a sketch of the update call follows the links below.
Where to start:
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/import-data
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/sample-import-data-complex-data-map
https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/define-alternate-keys-entity
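As a rough illustration of the scripted route (shown in Python rather than PowerShell just to keep the example compact; the Web API requests are the same either way), the sketch below reads the CSV and PATCHes each matching record by its alternate key. The entity set new_jobs, key attribute new_jobnumber, and field new_costcenterhours are hypothetical names, and the org URL and OAuth token have to come from your own environment.

```python
import csv
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical org URL
TOKEN = "..."  # acquire via your OAuth flow; out of scope for this sketch

HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
}

with open("hours.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Address the existing record by its alternate key (the job
        # number), so no GUID lookup is needed.
        url = (f"{ORG_URL}/api/data/v9.2/"
               f"new_jobs(new_jobnumber='{row['JobNumber']}')")
        resp = requests.patch(
            url, headers=HEADERS,
            json={"new_costcenterhours": float(row["Hours"])},
        )
        resp.raise_for_status()
```

Defining the alternate key up front (the last link above) is what makes the PATCH-by-key addressing possible.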

Ruby on Rails - how to update my Rails database (MySQL) with values from a CSV file on a daily basis?

I have a CSV file which contains very detailed data on the products that my company sells, and it gets updated daily.
I want my Rails app to import the data from the CSV file and then update my database (MySQL) if any new changes are found.
What's the best way to achieve this? Some people mentioned MySQL for Excel. Would this be the way to go about it?
I would appreciate it if someone could give me some guidance on this. Thank you.
I'm not going to walk through the details, specifically because you have provided no details and nothing attempted at all, so I'm going to stick with an overview.
From a systems point of view I would (assuming your Rails app is live and not local):
have the CSV file live in a place where you (or whoever needs to) can update it and where the application can fetch it (Dropbox, an S3 bucket, your own server, whatever);
have a daily cron job run a rake task that downloads the CSV file;
parse the CSV file and decide what to update.
The trickiest part will be deciding what to update from the CSV, and that will depend on how the file itself can change: whether only new lines can be added, lines can be removed, columns in lines can be changed, etc. A rough sketch of that decision logic is below.
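Here is a minimal sketch of that last step, written in Python only because the upsert decision is framework-agnostic; in a Rails app the same logic would sit in the rake task, with the injected callbacks replaced by ActiveRecord calls. The sku column, the URL, and the three operations are assumptions for illustration.

```python
import csv
import urllib.request

CSV_URL = "https://example.com/products.csv"  # hypothetical location

def sync_products(find_by_sku, create, update):
    """Download the daily CSV and decide, row by row, what to do."""
    with urllib.request.urlopen(CSV_URL) as resp:
        rows = csv.DictReader(resp.read().decode().splitlines())
        for row in rows:
            existing = find_by_sku(row["sku"])  # dict of current values, or None
            if existing is None:
                create(row)                     # a new line was added to the CSV
            elif existing != row:
                update(row["sku"], row)         # values in an existing line changed
            # rows that match exactly are left untouched
```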

Copy and Paste Data into phpMyAdmin

After searching the internet for a bit, I'm pretty sure this hasn't been answered directly, so I'm asking here.
I am currently creating a Runescape (Laugh at me all you want ;P) Skilling Calculator for a School Programming Project, and am creating databases for XP values with phpMyAdmin, using information that is already on the web.
Instead of having to manually type out approximately 6000 different entries, each with 3 columns, I would rather copy and paste them, saving time and reducing the chance of errors. For example, I want to copy and paste all the information from here:
http://www.tip.it/runescape/pages/view/divination_calc.htm into phpMyAdmin in bulk, not one entry at a time. I was wondering if this was possible in any way.
I would suggest copying and pasting the HTML table into Excel, tidying up the columns to match your database, saving as a CSV, and importing using phpMyAdmin's import function.
Here's an article I found on importing a CSV into phpMyAdmin: importing a CSV into phpmyadmin
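If the Excel step proves fiddly, one possible shortcut (assuming pandas and lxml are available, and that the XP table is the first <table> on that page) is to scrape the table straight into a CSV and then use phpMyAdmin's import function on the result:

```python
import pandas as pd

# read_html returns a list of DataFrames, one per <table> on the page
tables = pd.read_html("http://www.tip.it/runescape/pages/view/divination_calc.htm")
tables[0].to_csv("xp_values.csv", index=False)  # adjust the index if the table isn't first
```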

Extract, Transform, Load into MySQL

I am basically from a Microsoft background, having worked a lot with SSIS on ETL sorts of projects.
Now I have another project on hand that deals with loading .csv files into a MySQL database. In the process of loading these tables, the data has to go through some transformations and then into the destination table. It is very much an ETL project.
The client doesn't have SSIS (BIDS), so I am compelled to use open-source tools.
I did a bit of research and found that the Talend Data Integration tool best fits my situation.
As I am new to this environment, and I am sure there are experts in this area, I need some advice on the best tools to do ETL of this type and on best practices.
If you need any further information, please let me know.
If I remember correctly, phpMyAdmin can import CSV into MySQL, and this question is about a similar topic too, but these don't come close to what SSIS can offer...
Yes, you are right: Talend Open Studio is a pretty good tool with hundreds of connectors.
In your case, just create a job which takes the CSV as your source and MySQL as the destination, applies any transformation if required, and loads it.
You can get more information on CSV-to-MySQL loads, with examples, on the Talend forum.
If you have any base plan, share it with me and I can guide you on how to transfer the CSV to a MySQL table.
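And in case Talend turns out to be heavier than the job needs, the same extract-transform-load can be sketched in plain Python with a MySQL driver. The products table, its columns, the connection details, and the transformation shown are all hypothetical; REPLACE INTO is just one idempotent-load option.

```python
import csv
import pymysql  # pip install pymysql

def transform(row):
    # Example transformation: normalise the SKU and cast the price.
    return (row["sku"].strip().upper(), row["name"], float(row["price"]))

conn = pymysql.connect(host="localhost", user="etl",
                       password="secret", database="warehouse")
try:
    with open("products.csv", newline="") as f:
        rows = [transform(r) for r in csv.DictReader(f)]
    with conn.cursor() as cur:
        cur.executemany(
            "REPLACE INTO products (sku, name, price) VALUES (%s, %s, %s)",
            rows,
        )
    conn.commit()
finally:
    conn.close()
```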

Best way to gather, then import data into Drupal?

I am building my first database driven website with Drupal and I have a few questions.
I am currently populating a Google Docs spreadsheet with all of the data I want to eventually be able to query from the website (after it's imported). Is this the best way to start?
If this is not the best way to start what would you recommend?
My plan is to populate the spreadsheet, then import it as a CSV into the MySQL DB via CCK nodes.
I've seen two ways to do this.
http://drupal.org/node/133705 (importing data into CCK nodes)
http://drupal.org/node/237574 (Inserting data using spreadsheet/csv instead of SQL insert statements)
Basically my question is: what is the best way to gather, then import data into Drupal?
Thanks in advance for any help, suggestions.
There's a comparison of the available modules at http://groups.drupal.org/node/21338
In the past when I've done this I've simply written code to do it on cron runs (see http://drupal.org/project/phorum for an example framework that you could strip down and build back up to do what you need).
If I were to do this now I would probably use the http://drupal.org/project/migrate module where the philosophy is "get it into MySQL, View the data, Import via GUI."
There is a very good module for this: node import. It allows you to take your Google Docs spreadsheet and import it as a .csv file.
It's really easy to use: the module allows you to map your .csv columns to the node fields you want them to go to, so you don't have to worry about setting your columns in a particular order. Also, if there is an error on some records, it will spit out a .csv with the failed records and what caused each error, but will import all the good records.
I have imported up to 3000 nodes with this method.