I'd like to import a CSV file into Salesforce automatically, e.g. once a day. I can't use the Data Loader (Bulk API) because I don't have the required edition. Is there any other simple way?
Check whether DBAmp (on the AppExchange) is available for your edition. If it is, you can use a SQL stored procedure to do the import.
Otherwise, I think your only other option would be to schedule an Apex job in Salesforce to grab the file from somewhere accessible (like an FTP server).
I've got a test SSIS package that reads this API, https://api.coindesk.com/v1/bpi/currentprice.json, and exports the result to a table in SQL Server.
What is the best way of parsing this data so it is split into multiple columns correctly?
Disclaimer: I work for ZappySys (the company that makes API connectors/drivers for SSIS and ODBC).
Loading data from a JSON file or a REST API into SQL Server can be done in a few ways. For example, I literally took the URL you supplied, put it in the JSON Source, and got it working in two minutes.
Method-1: Use a 3rd-party JSON Source component (e.g. ZappySys)
Here is how to do it using the SSIS JSON Source by ZappySys (3rd party).
Method-2: Use C# code in a Script Component
If you'd like to use a free approach, you can write C# code in a Script Component along the lines of the sketch below.
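A minimal sketch, assuming you have defined an output named Output0 with string columns Code and Rate in the Script Component editor, and added a project reference to System.Web.Extensions for JavaScriptSerializer (names are illustrative; adjust to your own output and columns):

    // Method inside the Script Component's generated ScriptMain class.
    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Web.Script.Serialization;

    public override void CreateNewOutputRows()
    {
        // Older .NET defaults may not negotiate TLS 1.2, which this API requires.
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        using (var client = new WebClient())
        {
            string json = client.DownloadString(
                "https://api.coindesk.com/v1/bpi/currentprice.json");

            // Deserialize into nested dictionaries so no schema class is needed.
            var root = new JavaScriptSerializer()
                .Deserialize<Dictionary<string, object>>(json);
            var bpi = (Dictionary<string, object>)root["bpi"];

            // One output row per currency (USD, GBP, EUR in this feed).
            foreach (var entry in bpi)
            {
                var currency = (Dictionary<string, object>)entry.Value;
                Output0Buffer.AddRow();
                Output0Buffer.Code = entry.Key;
                Output0Buffer.Rate = Convert.ToString(currency["rate"]);
            }
        }
    }

Each currency entry in the feed's bpi object becomes one output row, which the rest of the data flow can land in your SQL Server table.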
Version of Newmarket Delphi I'm using:
I am looking for help creating a CSV of all of the data (Contacts, Bookings) available inside my old Delphi system, because I would like to import that data into our newer Salesforce-backed Delphi system without having to input every single booking manually through the Salesforce interface.
Is there a way to do this, or am I just stuck inputting everything by hand?
The input interface on the new system:
Besides streaming a CSV file yourself and painstakingly executing inserts for each line of data, is it possible to use the Google Cloud SDK to import an entire CSV file in bulk from inside a Cloud Function? I know that in the GCP console you can go to the Import tab, select a file from Storage, and just import. But how can I emulate this programmatically?
In general, one has to parse the .csv and generate SQL from it; one line in the .csv would be represented by one INSERT statement. First, you would have to upload the file, or pull it into the Cloud Function's temporary storage, e.g. with gcs.bucket.file(filePath).download.
Then the easiest approach might be to utilize a library, e.g. csv2sql-lite, with the big downside that one does not have full control over the import, while e.g. csv-parse would provide a little more control (checking for possible duplicates, skipping some columns, importing into different tables, and so on).
And in order to connect to Cloud SQL, see Connecting from Cloud Functions.
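To make the one-line-one-INSERT idea concrete: the libraries above are Node.js, but the pattern is language-neutral, so here is an illustrative C# sketch. It assumes the MySqlConnector NuGet package, a reachable Cloud SQL (MySQL) instance, a hypothetical contacts(name, email) table, and a CSV whose first line is a header; the connection string is a placeholder.

    using System;
    using System.IO;
    using System.Linq;
    using MySqlConnector;

    class CsvImporter
    {
        static void Main()
        {
            // Placeholder connection string; from a Cloud Function you would
            // use the socket/connector settings described in
            // "Connecting from Cloud Functions".
            using var conn = new MySqlConnection(
                "Server=127.0.0.1;Database=mydb;User ID=importer;Password=...");
            conn.Open();

            foreach (var line in File.ReadLines("data.csv").Skip(1))
            {
                // Naive split: fine for simple files, but quoting and embedded
                // commas are exactly why a real parser (e.g. csv-parse) is safer.
                var fields = line.Split(',');

                // One line in the .csv becomes one INSERT statement.
                using var cmd = conn.CreateCommand();
                cmd.CommandText =
                    "INSERT INTO contacts (name, email) VALUES (@name, @email)";
                cmd.Parameters.AddWithValue("@name", fields[0]);
                cmd.Parameters.AddWithValue("@email", fields[1]);
                cmd.ExecuteNonQuery();
            }
        }
    }

Parameterized commands keep the generated INSERTs safe from malformed or malicious field values, which matters once the file contents are outside your control.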
I'm posting this here because I couldn't find any such scenario on the web so far. I have a webpage which contains a set of reports in both XLS and PDF formats. I need to download the Excel files from the page and load them into my database. I wish I could use the URL for the XLS file directly, but the problem is that the naming convention may change every time (Sales_Quarter1.xlsx can be Sales_Q1.xlsx the next year). The only thing that would be constant in the following example is "Sales for Calendar Year". I need to look for the file that corresponds to this text and download it before loading it into a database table.
I would like to know from the experts whether this is possible.
<li>
<sub>Sales for Calendar Year 2015--All Countries </sub>
<a href="/Data/Downloads/Documents/Sales/Sales_Quarter1.xlsx">
<sub>[XLS]</sub></a><sub> , <sub>[PDF]</sub><sub></sub></sub>
</li>
PS: I am using SQL Server 2014.
Thanks!
Have a look at Integration Services. Create a package that pulls the web page using a Script Task, with package variables holding the downloaded local filenames for the HTML file and the Excel file (you will also have to parse the link out of the HTML file). Then use an Excel Source next in your package.
The variable for the Excel filename used in the Script Task will need to be set to ReadWrite as well.
You can also schedule the resulting package's execution via a SQL Agent job if you plan to run this on a recurring basis, placing logic into the script or the execution paths.
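A rough sketch of what the Script Task could look like, assuming two hypothetical package variables, User::PageUrl (ReadOnly) holding the report page URL and User::ExcelFilePath (ReadWrite) holding the local save path:

    using System;
    using System.Net;
    using System.Text.RegularExpressions;

    public void Main()
    {
        string pageUrl = (string)Dts.Variables["User::PageUrl"].Value;

        using (var client = new WebClient())
        {
            string html = client.DownloadString(pageUrl);

            // Find the first .xlsx href that follows the constant caption text.
            // Regex scraping is fragile; an HTML parser (e.g. HtmlAgilityPack)
            // would be more robust if the page layout varies.
            var match = Regex.Match(html,
                @"Sales for Calendar Year.*?href=""(?<href>[^""]+\.xlsx)""",
                RegexOptions.Singleline | RegexOptions.IgnoreCase);

            if (!match.Success)
            {
                Dts.TaskResult = (int)ScriptResults.Failure;
                return;
            }

            // Resolve the relative link against the page URL, then download.
            var fileUrl = new Uri(new Uri(pageUrl), match.Groups["href"].Value);
            string localPath = (string)Dts.Variables["User::ExcelFilePath"].Value;
            client.DownloadFile(fileUrl, localPath);
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }

The Failure result lets the package's precedence constraints branch (e.g. send an alert) when the expected link is not found, instead of failing downstream at the Excel Source.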
I understand that from the BusinessObjects client I have the option to export to "CSV (data only)", but my understanding is that such an export will not care about the report and will just dump the raw universe data.
Isn't there any way to export the report "view" to CSV?
It depends on the version of BusinessObjects you're working on.
Originally, the CSV export only looked at the Web Intelligence microcube (I assume you're referring to that particular client), meaning the raw data retrieved from the data provider(s), and disregarded any formatting, filters, aggregations, etc. you may have specified on your report.
GUI
However, you now have the option to export a report (not the whole document) as a CSV Archive, which results in a Zip file containing a CSV for the active report at the time of export.
I'm referring to BI 4.1 SP05; previous versions may also have this option, as I'm not sure when it was introduced.
API
Using the RESTful API that is available in BI4, you can also export a report to CSV. In this case, the actual CSV file will be returned instead of an archive.
Remember that in order to use the RESTful API, you need to have a WACS server in your BusinessObjects environment, running the RESTful API service. You cannot deploy the REST API on an external Java application server.
For more information, have a look at the section Exporting a Report in Listing Mode (SDK information for BI 4.1 SP05).
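As a rough illustration of the flow in C#: log on, then request a single report as CSV via the Accept header. The host, credentials, and the document/report IDs (1234/5678) are placeholders, and the endpoint paths are my reading of the BI 4.1 RESTful SDK documentation, so verify them against the guide for your exact release.

    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class WebiCsvExport
    {
        static async Task Main()
        {
            var http = new HttpClient
            {
                // 6405 is the default WACS port for the RESTful service.
                BaseAddress = new Uri("http://wacs-server:6405/biprws/")
            };

            // 1. Log on and capture the session token from the response header.
            var logonBody = new StringContent(
                "<attrs xmlns=\"http://www.sap.com/rws/bip\">" +
                "<attr name=\"userName\" type=\"string\">Administrator</attr>" +
                "<attr name=\"password\" type=\"string\">secret</attr>" +
                "<attr name=\"auth\" type=\"string\">secEnterprise</attr></attrs>",
                Encoding.UTF8, "application/xml");
            var logon = await http.PostAsync("logon/long", logonBody);
            string token = logon.Headers.GetValues("X-SAP-LogonToken").First();
            http.DefaultRequestHeaders.TryAddWithoutValidation(
                "X-SAP-LogonToken", token);

            // 2. Ask for the report (not the whole document) as CSV.
            var request = new HttpRequestMessage(HttpMethod.Get,
                "raylight/v1/documents/1234/reports/5678");
            request.Headers.Add("Accept", "text/csv");
            var response = await http.SendAsync(request);

            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }

Because the CSV comes back as the response body rather than an archive, it can be piped straight into a scheduled load job.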
Remarks
A report is a tab within a document; documents, however, are often (incorrectly) referred to as reports.