Automatically download CSV files from a website and add them to a database

I want to write a program that automatically downloads all the CSV files on a webpage and then enters the data from those CSV files into SQL Server. I have already written the macro that loads the data from a CSV into SQL Server; I just need help with automatically downloading the files from the website every day. Is this similar to web scraping? I am new to web programming, so please point me to the languages I should learn to do this.

Quite easy with PowerShell:
$web = New-Object System.Net.WebClient
$web.DownloadString("http://<your url here>") | Out-File "$env:tmp\MyFile.csv"
Then use Import-Csv and the SQL Server PowerShell provider to inject your data into SQL Server.
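Since the question asks for every CSV on the page, here is a minimal sketch of that step (the page URL, the href regex, and the download folder are placeholder assumptions to adapt to the real page):
# Hypothetical page URL; fetch the HTML and collect every link ending in .csv
$web = New-Object System.Net.WebClient
$page = $web.DownloadString("http://example.com/reports/")
$links = [regex]::Matches($page, 'href="([^"]+\.csv)"') | ForEach-Object { $_.Groups[1].Value }
foreach ($link in $links) {
    # Note: relative links would need the site's base URL prepended first
    $name = Split-Path $link -Leaf
    $web.DownloadFile($link, "$env:tmp\$name")
}
To run it every day, schedule the script with Windows Task Scheduler.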

I recommend Python, as usual.
Once you have figured out how to work with it, here are the modules I would use:
csv - for parsing CSV files.
urllib2 - for downloading files (Python 2; on Python 3 use urllib.request).
SQLAlchemy - for database operations.

I also recommend the PowerShell way to do the job.
Download the data from the web with WebClient, as @David Brabant said.
Then use Import-Csv to get the data row by row: Import-Csv -Path file.csv
Finally, use the SQL Server PowerShell module in your script to upload the data into SQL Server: Invoke-Sqlcmd -Query '<sql statements>'
Here is a complete sample script for you to download and learn from: How to use SQL Server PowerShell Module to import data from CSV file
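Putting those steps together, a minimal sketch (the server, database, table, and column names are placeholders, and the string-built INSERT is only for illustration):
# Hypothetical table dbo.MyTable(Name, Value); adjust names to your schema
Import-Module SqlServer
$rows = Import-Csv -Path "$env:tmp\MyFile.csv"
foreach ($row in $rows) {
    # Placeholder columns; a production script should escape or parameterize values
    $query = "INSERT INTO dbo.MyTable (Name, Value) VALUES (N'$($row.Name)', N'$($row.Value)')"
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDb" -Query $query
}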

Related

Can you upload non-SSRS files to an SSRS server with PowerShell?

SQL Server 2019. In the SSRS Report Server web interface I can upload any file: PDF, Excel, Word, etc. I've got a lot of files I want to upload, and the web interface only allows me to do one at a time. Can I upload all the files in a folder to the SSRS server using PowerShell? So far, what I've found only seems to work for SSRS file types (rdl, rsd, etc.). Is there some other way to upload multiple non-SSRS files? Thanks!
You can use the PowerShell script below; you will need to change the folder location and the report server URL, as well as the -RsFolder reference. The script will upload all files within the folder. Please be aware that SSRS does restrict some file types; the allowed extensions can be found using the following SQL:
SELECT ConfigInfoID
,Name
,Value
FROM ReportServer.dbo.ConfigurationInfo
WHERE Name = 'AllowedResourceExtensionsForUpload'
# PowerShell script
$FileLocation = "C:\Files\"
$Files = Get-ChildItem $FileLocation
foreach ($File in $Files)
{
    # Note: use the loop variable $File.FullName, not $Files.FullName
    Write-RsRestCatalogItem -Overwrite -ReportPortalUri http://ReportServer/Reports/ -Path $File.FullName -RsFolder "Files" -RestApiVersion V1.0
}
I should also have mentioned that you need to install a PowerShell module first:
Install-Module -Name ReportingServicesTools

I want to compare the data I have in a CSV file to the data in the LDAP production server

I want to compare the data I have in a CSV file to the data in the LDAP production server.
There are thousands of users' records in the CSV file, and I want to compare them with the data in the production server.
Let's suppose I have user ID xtz12345 in the CSV file with uid number 123456. Now I want to cross-check the uidNumber of the same user ID xtz12345 in the production server.
Is there any way I can automate this? There are thousands of user IDs to be checked, and doing it manually would take a lot of time. Can anyone suggest what I should do?
A PowerShell script is a good place to start.
Import the ActiveDirectory module in PowerShell to fetch information from AD (assuming Windows AD; download and install the RSAT tools to get the module).
Use Import-Csv in PowerShell to read the CSV values. Now, compare the first with the second; a rough sketch follows below.
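A minimal sketch of that comparison, assuming a Windows AD domain, a CSV with UserID and uidNumber columns, and that the uidNumber attribute is populated in AD (all of these are assumptions to adapt):
# Hypothetical CSV layout: UserID,uidNumber
Import-Module ActiveDirectory
$rows = Import-Csv -Path "C:\users.csv"
foreach ($row in $rows) {
    # -Filter returns $null instead of throwing when the account does not exist
    $adUser = Get-ADUser -Filter "SamAccountName -eq '$($row.UserID)'" -Properties uidNumber
    if ($null -eq $adUser) {
        Write-Output "$($row.UserID): not found in AD"
    }
    elseif ($adUser.uidNumber -ne $row.uidNumber) {
        Write-Output "$($row.UserID): CSV=$($row.uidNumber) AD=$($adUser.uidNumber)"
    }
}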
Happy to help

parse.com: export a database table into *.csv stored in the Parse cloud

How can I export one database table from parse.com into a *.csv file which is stored in Parse online?
I once received a file in the following format, and now I need to produce one like it on my own:
http://files.parsetfss.com/f0e70754-45fe-43c2-5555-6a8a0795454f/tfss-63214f6e-1f09-481c-83a2-21a70d52091f-STUDENT.csv
So, the question is: how can I do this? I have not found a dashboard function for it yet.
Thank you very much.
You can create a job in Cloud Code which queries through all the rows in the table and generates CSV data for each. This data can then be saved to a Parse file for access by URL.
If you are looking to simply export a class every once in a while and you are on a Mac, check out ParseToCSV on the Mac App Store. Works very well.

BigQuery table to be extracted as JSON on the local machine

I know how to extract table data to Cloud Storage using the bq extract command, but I would rather like to know whether there are any options to extract a BigQuery table as newline-delimited JSON to the local machine.
I can extract table data to GCS via the CLI and also download JSON data from the web UI, but I am looking for a way to use the bq CLI to download table data as JSON to the local machine. Is that even possible?
You need to use Google Cloud Storage for your export job. Exporting data from BigQuery is explained here; check also the variants for different path syntaxes.
Then you can download the files from GCS to your local storage.
The gsutil tool can help you further to download the file from GCS to the local machine.
You first need to export to GCS, then transfer the file to the local machine.
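For example, a minimal sketch of the two steps (the dataset, table, and bucket names are placeholders):
# Export the table to GCS as newline-delimited JSON, then copy it down locally
bq extract --destination_format=NEWLINE_DELIMITED_JSON mydataset.mytable gs://my-bucket/mytable.json
gsutil cp gs://my-bucket/mytable.json .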
If you use the bq CLI tool, you can set the output format to JSON and redirect it to a file. This way you can achieve some export locally, but it has certain limits.
This exports the first 1000 rows as JSON:
bq --format=prettyjson query --max_rows=1000 "SELECT * from publicdata:samples.shakespeare" > export.json
It's possible to extract data without using GCS, directly to your local machine, using BQ CLI.
Please see my other answer for details: BigQuery Table Data Export

Upload Data in MapQuest DMv2 through CSV using Data Manager API call

I need to upload data to MapQuest DMv2 through a CSV file. After going through the documentation, I found the following syntax for uploading data:
http://www.mapquestapi.com/datamanager/v2/upload-data?key=[APPLICATION_KEY]&inFormat=json&json={"clientId": "[CLIENT_ID]","password": "[REGISTRY_PASSWORD]","tableName": "mqap.[CLIENT_ID]_[TABLENAME]","append":true,"rows":[[{"name":"[NAME]","value":"[VALUE]"},...],...]}
This is fair enough if I want to put individual rows in rows[], but there is no mention of the procedure to follow to upload data through a CSV file. It has been clearly mentioned that "CSV, KML, and zipped Shapefile uploads are supported". How can I achieve this through the Data Manager API service?
Use a multipart POST to upload the CSV instead of the rows. You can see it working here.
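A minimal PowerShell sketch of such a multipart POST (Invoke-RestMethod's -Form parameter requires PowerShell 6.1 or later; the key, credentials, and table name below are placeholders):
# FileInfo values in -Form are sent as multipart file parts
Invoke-RestMethod -Method Post `
    -Uri "http://www.mapquestapi.com/datamanager/v2/upload-data?key=KEY&ambiguities=ignore" `
    -Form @{
        clientId  = "XXXXX"
        password  = "XXXXX"
        tableName = "mqap.XXXXX_xxxxx"
        append    = "false"
        file      = Get-Item "C:\file.csv"
    }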
I used the curl program to accomplish that. Here is an example of a curl.exe command line. You can call it from a batch file or, in my case, from a C# program. (Note that curl's -F option uses @ to attach a file.)
curl.exe -F clientId=XXXXX -F password=XXXXX -F tableName=mqap.XXXXX_xxxxx -F append=false --referer http://www.mapquest.com -F "file=@C:\\file.csv" "http://www.mapquestapi.com/datamanager/v2/upload-data?key=KEY&ambiguities=ignore"