How to import a CSV file into an entity automatically?

Is there a way to import a CSV file into a CRM record automatically, say when the CSV file is created?
The plan is that the CSV file would contain some cost center hours and a job number that corresponds to a record already created in CRM.
Uploading the CSV would then update that record.
Please help me solve this problem.

You can import data into D365 either through the UI or through code.
There are also plenty of tools that solve exactly this problem: KingswaySoft SSIS adapters, Scribe, etc.
But buying third-party software looks like overkill in your scenario. You can use the Windows Task Scheduler and write a few PowerShell scripts to implement it; a rough sketch of the core upsert logic is shown after the links below.
Where to start:
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/import-data
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/sample-import-data-complex-data-map
https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell
https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/define-alternate-keys-entity
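To make the script route concrete, here is a minimal sketch (in Python rather than PowerShell, purely for illustration) of the core logic: read the CSV and update the matching record through the Dataverse/D365 Web API, using an alternate key on the job-number column as described in the last link above. The entity set and column names (new_jobs, new_jobnumber, new_costcenterhours) and the token handling are placeholders, not anything from a real org.

```python
# Minimal sketch, assuming D365/Dataverse Web API access and an alternate key
# defined on the job-number column. All entity/column names are hypothetical.
import csv
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"    # placeholder org URL
ACCESS_TOKEN = "<acquired via your OAuth flow>"  # token acquisition not shown

def update_job(job_number: str, hours: str) -> None:
    # PATCH against the alternate-key URL; If-Match: * ensures we only update
    # an existing record and never create a new one by accident.
    url = f"{ORG_URL}/api/data/v9.2/new_jobs(new_jobnumber='{job_number}')"
    resp = requests.patch(
        url,
        json={"new_costcenterhours": float(hours)},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "If-Match": "*",
            "Content-Type": "application/json",
        },
        timeout=30,
    )
    resp.raise_for_status()

with open("cost_centers.csv", newline="") as f:
    for row in csv.DictReader(f):
        update_job(row["JobNumber"], row["Hours"])
```

Task Scheduler (or a folder watcher) can then run the script whenever a new CSV lands; the Microsoft.Xrm.Data.PowerShell module linked above gives you equivalent building blocks if you prefer to stay in PowerShell.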

Related

Ruby on Rails - how to update my Rails database (MySQL) with values from a CSV file on a daily basis?

I have a CSV file that contains very detailed data on the products my company sells, and it is updated daily.
I want my Rails app to import the data from the CSV file and update my MySQL database whenever changes are found.
What's the best way to achieve this? Some people have mentioned MySQL for Excel; would that be the way to go?
I would appreciate any guidance on this. Thank you.
I'm not going to walk through the details, specifically because you have provided no details and nothing attempted, so I'm going to stick with an overview.
From a systems point of view, I would (assuming your Rails app is live and not local):
have the CSV file live somewhere you (or whoever needs to) can update it and the application can fetch it from (Dropbox, an S3 bucket, your own server, whatever);
have a daily cron rake task that downloads the CSV file;
parse the CSV file and decide what to update.
The trickiest part will be deciding what to update from the CSV, and that depends on how the file itself can change: whether only new lines can be added, whether lines can be removed, whether values within a line can change, and so on. A rough sketch of the download-parse-upsert loop follows.
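Purely as an illustration (the answer above assumes a Rails rake task driven by cron), this Python sketch shows the same fetch-parse-upsert shape with made-up table and column names:

```python
# Rough sketch of the daily job: fetch the CSV, parse it, and upsert product
# rows into MySQL. Table/column names (products, sku, name, price) are
# placeholders; in a Rails app you would express this as a rake task.
import csv
import io
import requests
import mysql.connector  # pip install mysql-connector-python

CSV_URL = "https://example.com/products.csv"  # wherever the file lives (S3, Dropbox, ...)

def fetch_rows():
    resp = requests.get(CSV_URL, timeout=60)
    resp.raise_for_status()
    return list(csv.DictReader(io.StringIO(resp.text)))

def upsert(rows):
    conn = mysql.connector.connect(host="localhost", user="app",
                                   password="...", database="shop")
    cur = conn.cursor()
    for row in rows:
        # Letting MySQL decide "new vs changed" via a unique key on sku keeps
        # the script simple; deletions would need a separate pass.
        cur.execute(
            "INSERT INTO products (sku, name, price) VALUES (%s, %s, %s) "
            "ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price)",
            (row["sku"], row["name"], row["price"]),
        )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    upsert(fetch_rows())
```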

NetSuite Migrations

Has anyone had much experience with data migration into and out of NetSuite? I have to export DB2 tables into MySQL, manipulate the data, and then export it into a CSV file. Then I take a CSV file of accounts and manipulate the data again so that accounts from our old system match up with the new one. Has anyone tried to do this in MySQL?
A couple of options:
Invest in a data transformation tool that connects to NetSuite and DB2 or MySQL. Look at Dell Boomi, IBM Cast Iron, etc. These tools let you connect to both systems, define the data to be extracted, perform data transformations and mappings, and do all the inserts/updates you need.
For MySQL to NetSuite, PHP scripts can be written to access MySQL and NetSuite. On the NetSuite side, you can either use the SOAP web services or write custom REST APIs (RESTlets) within NetSuite. SOAP is probably a bit slower than REST, but with REST you have to write the API yourself in server-side JavaScript - it's not hard, but there's a learning curve. A rough sketch of that second route follows below.
Hope this helps.
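As a rough, language-neutral sketch of that second route (shown in Python here, though the answer suggests PHP): pull rows from MySQL and push them in batches to a custom RESTlet. The RESTlet URL, authentication, table/column names, and payload shape are all hypothetical; the RESTlet itself is SuiteScript you would write inside NetSuite to create or update the corresponding records.

```python
# Rough sketch only - every name here is a placeholder.
import requests
import mysql.connector  # pip install mysql-connector-python

RESTLET_URL = ("https://ACCOUNT.restlets.api.netsuite.com/app/site/hosting/"
               "restlet.nl?script=100&deploy=1")   # hypothetical script/deploy ids
AUTH_HEADER = {"Authorization": "<token-based auth header>"}  # auth not shown

conn = mysql.connector.connect(host="localhost", user="etl",
                               password="...", database="legacy")
cur = conn.cursor()
cur.execute("SELECT account_no, name, balance FROM accounts")  # hypothetical table

def push(batch):
    # The RESTlet is assumed to accept a JSON array of records to create/update.
    resp = requests.post(RESTLET_URL, headers=AUTH_HEADER, json=batch, timeout=120)
    resp.raise_for_status()

batch = []
for account_no, name, balance in cur:
    batch.append({"accountNo": account_no, "name": name, "balance": float(balance)})
    if len(batch) == 100:       # keep each request reasonably small
        push(batch)
        batch = []
if batch:
    push(batch)
conn.close()
```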
I'm an IBM i programmer; try CPYTOIMPF to create a fairly generic CSV file. It will go to a stream file - if you have NetServer running you can map a network drive to the IFS directory, or you can use FTP to get the CSV file from the IFS to another machine on your network.
Try Adeptia's NetSuite integration tool to perform the ETL. You can also try Pentaho ETL for this (as far as I know, Celigo's NetSuite connector is built on Pentaho). Jitterbit also has an extension for NetSuite.
We primarily have two options for pumping data into NetSuite:
i) SuiteTalk - NetSuite's SOAP-based web services, which come in two flavours, synchronous and asynchronous. Typical tools like Boomi, Mule, and Jitterbit use synchronous SuiteTalk to push data into NetSuite, and they also have decent editors to help you with the mapping.
ii) RESTlets - NetSuite's REST-based endpoints can also be used, but you may have to write external brokers to communicate with them.
Depending on your need, either will work; in most cases you will use SuiteTalk to bring data into NetSuite.
Hope this helps.
We just finished doing this. We used an iPaaS platform called Jitterbit (similar to Dell Boomi). It can connect to MySQL and to NetSuite, and you can do transformations in the tool. I have been really impressed with the platform overall so far.
There are different approaches; I like the following for processing a batch job:
To import data into NetSuite:
Export a CSV from the old system and place it in a NetSuite File Cabinet folder (use a RESTlet or web services for this).
Run a scheduled script to load the files in the folder and update the records.
Don't forget to handle errors. Ways to handle them: send an email, create a custom record, log to a file, or write to the record.
Once a file has been processed, move it to another folder or delete it.
To export data out of NetSuite (a rough sketch of the external caller follows these steps):
Gather the data and export it to a CSV (you can use a saved search or similar).
Place the CSV in a File Cabinet folder.
From an external server, call web services or a RESTlet to grab new CSV files in the folder.
Process the file.
Handle errors.
Call web services or a RESTlet to move or delete the CSV file.
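Here is a rough sketch of the external-server half of the export steps above, in Python. The RESTlet contract, script/deployment IDs, and response shape are hypothetical and would be implemented in SuiteScript on the NetSuite side; token-based authentication is stubbed out.

```python
# Rough sketch: ask a (hypothetical) RESTlet which CSVs are waiting in a File
# Cabinet folder, download each one, process it, then ask the RESTlet to
# archive it. Auth is stubbed out - RESTlets normally expect token-based auth.
import requests

RESTLET_URL = (
    "https://ACCOUNT.restlets.api.netsuite.com/app/site/hosting/restlet.nl"
    "?script=123&deploy=1"          # hypothetical script/deployment ids
)
AUTH_HEADER = {"Authorization": "<token-based auth header goes here>"}

def list_pending_files():
    # Hypothetical contract: GET returns [{"id": ..., "name": ..., "url": ...}, ...]
    resp = requests.get(RESTLET_URL, headers=AUTH_HEADER, timeout=60)
    resp.raise_for_status()
    return resp.json()

def process(csv_text: str) -> None:
    # Whatever your downstream system needs to do with the rows.
    print(csv_text.splitlines()[0])

for f in list_pending_files():
    content = requests.get(f["url"], headers=AUTH_HEADER, timeout=60)
    content.raise_for_status()
    process(content.text)
    # Hypothetical contract: POST {"action": "archive", "id": ...} moves the file.
    requests.post(RESTLET_URL, headers=AUTH_HEADER,
                  json={"action": "archive", "id": f["id"]},
                  timeout=60).raise_for_status()
```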
You can also use Pentaho Data Integration; it's free and the learning curve is not that steep. I took a course on it and was able to play around with the tool within a couple of hours.

SSIS Updating User Variables from a CSV file

I am fairly new to SSIS, and I have been looking everywhere for the answer to this question and can't find it, which makes me think it's really simple and obvious, because I'm pretty sure this is a standard problem with SSIS.
I am building an SSIS package to automate the uploading of data.
We have a multi-instance environment across four servers and are using SQL Server 2005. I therefore have user variables for the server name and instance name. The database and table will always remain the same. The data is held in an Excel file, but I will import it as a CSV.
Is there a way for me to update the user variables from the CSV file? Is T-SQL's OPENROWSET the way forward?
I had previously been updating the variables from the table I had imported the data into, but then I realised that in a live situation I won't know where to import the data to, as the values will still be in the CSV file.
Please help! This is driving me crazy, and I have a sinking feeling that the answer is really obvious, which makes it worse!
Thank you!
Julie
There is a good example of how to load a user variable from a flat file here:
http://vsteamsystemcentral.com/cs/blogs/applied_team_system/archive/2007/01/10/247.aspx

How to convert an ESRI shapefile into SQL Server 2008?

I have a shapefile that I would like to upload to a spatially enabled SQL Server 2008. I have tried using this tool: SQL Server 2008 Spatial Tools, but without luck.
Does anyone know any other (free) tools for doing this?
You can use ogr2ogr to convert from shapefile to GML (or many other formats) and then use SQL Server's GeomFromGML to import. You will need to call GeomFromGML for each feature in your shapefile, but that's a relatively easy program to write.
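To make the per-feature loop concrete, here is a rough sketch using the GDAL/OGR Python bindings and pyodbc. The table name, column names, and SRID are placeholders, and depending on your data you may need to adjust the GML options (and namespaces) so that SQL Server's GeomFromGml accepts the markup.

```python
# Rough sketch: read the shapefile with GDAL/OGR, export each geometry as GML,
# and insert it via SQL Server's geometry::GeomFromGml(). Names are made up.
from osgeo import ogr   # pip install gdal
import pyodbc            # pip install pyodbc

SRID = 4326  # assumption - use your shapefile's actual spatial reference

ds = ogr.Open("parcels.shp")
layer = ds.GetLayer(0)

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=gis;Trusted_Connection=yes"
)
cur = conn.cursor()

for feature in layer:
    # GML3 output tends to match what GeomFromGml expects; adjust if needed.
    gml = feature.GetGeometryRef().ExportToGML(["FORMAT=GML3"])
    name = feature.GetField(0)   # first attribute column, just for illustration
    cur.execute(
        "INSERT INTO parcels (name, shape) VALUES (?, geometry::GeomFromGml(?, ?))",
        name, gml, SRID,
    )

conn.commit()
```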
I have written code in ArcObjects to do this task. If you have an ArcView or Engine license, you can create a console application in C# and use this code: see https://gis.stackexchange.com/questions/33917/how-to-import-shapefiles-into-ms-sql-2008-and-then-view-that-data-using-qgis?lq=1
UPDATE: I decided I would just point people to the official GitHub repo instead: https://github.com/zer0infinity/OGR2GUI
This [ogr2ogr fork] tool will attempt to parse the content of the input file (in my case, a shapefile) and output it in a bunch of different formats (in my case, I needed CSV, but you can even export your file as an SQLite file). Unfortunately it doesn't do straight-up SQL, but you can do a dump from an SQLite viewer such as SQLite Browser and import that into MS SQL. I did notice some issues when converting to SQLite (I lost some attributes/tables). I also tried MobileMapper Office (MMO), with better luck, to export to CSV, and it preserved a lot of the data, but you'd then need to write a script to import the data into SQL. If you are going that route, let me know; I'm currently writing a VBA script to deal with the data exported from MMO.
If you're hoping to build this into your application (a script of some sort), you may have some luck with ogr2ogr, but you never know what the data is going to look like. Documentation is found here: http://www.gdal.org/ogr2ogr.html
Original answer: Save yourself some time and just use this amazing tool: http://ogr2gui.ca/
It's based on ogr2ogr, but with a nice GUI.
I've made an app for importing shapefiles into SQL Server. It was made primarily to suit my needs, but I had some spare time so I made an installer for you.
Some nice things you can do with it:
choose encoding of input shapefile
rename/remap destination table column names
choose the destination table name
set the primary key for the destination table
It has a user interface and you can download it for free.
More details can be found on my blog, here's the link: Import shapefiles into SQL Server

Best way to gather, then import data into Drupal?

I am building my first database driven website with Drupal and I have a few questions.
I am currently populating a Google Docs spreadsheet with all of the data I want to eventually be able to query from the website (after it's imported). Is this the best way to start?
If this is not the best way to start what would you recommend?
My plan is to populate the spreadsheet, then import it as a CSV into the MySQL database via CCK nodes.
I've seen two ways to do this.
http://drupal.org/node/133705 (importing data into CCK nodes)
http://drupal.org/node/237574 (Inserting data using spreadsheet/csv instead of SQL insert statements)
Basically, my question is: what is the best way to gather and then import data into Drupal?
Thanks in advance for any help, suggestions.
There's a comparison of the available modules at http://groups.drupal.org/node/21338
In the past when I've done this, I simply wrote code to do it on cron runs (see http://drupal.org/project/phorum for an example framework that you could strip down and build back up to do what you need).
If I were to do this now, I would probably use the http://drupal.org/project/migrate module, where the philosophy is "get it into MySQL, view the data, import via the GUI."
There is a very good module for this, Node Import. It allows you to take your Google Docs spreadsheet and import it as a .csv file.
It's really easy to use: the module allows you to map your .csv columns to the node fields you want them to go to, so you don't have to worry about setting your columns in a particular order. Also, if there is an error on some records, it will spit out a .csv with the failed records and what caused the errors, but it will import all the good records.
I have imported up to 3000 nodes with this method.