Is it possible to filter data by adjusting the URL address? - html

I need to extract some numbers from the following table:
COVID-19 table
I need to display it as a supporting table in Tableau. I know I can download it as a .csv, import it into SQL/Tableau, and then display it in Tableau, but that is too much effort for what I need. Ideally I would like to display/filter it directly from Tableau, as an iframe or something. Is it possible to filter the data somehow by adding parameters to the URL address?
Is this somehow possible? I don't know anything about R.
The whole point is to avoid manually downloading a CSV file just to keep the workbook updated. That would be the easiest way. Do you have any other ideas for getting this table into Tableau easily?
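If nothing URL-based exists, I assume the fallback is a small script that rebuilds the CSV on a schedule for Tableau to read; a rough sketch of that idea (the URL below is just a placeholder for the table page, and pandas is my assumption, not something the page offers):

import pandas as pd

# Placeholder URL: substitute the actual page hosting the COVID-19 table.
tables = pd.read_html('https://example.com/covid-19-table')
tables[0].to_csv('covid.csv', index=False)  # point Tableau at this CSV

But I'd still prefer a no-script, URL-only approach.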
Thank you!

Related

Use Python requests to download a CSV file

I want to create a program that runs either weekly or daily to scrape data from a website and download the data in CSV format. Is this possible with Python, which is my preferred language? If so, how do you suggest I go about it? The website I want to use is https://finance.yahoo.com/, specifically the S&P 500 data. Thank you!
It's not clear which table you're looking to convert into a CSV. Yahoo doesn't block requests.
If you know your way around pandas:
import pandas as pd

# read_html returns a list of DataFrames, one per <table> on the page
df = pd.read_html('https://finance.yahoo.com/quote/%5EGSPC?p=^GSPC')[0]
df.to_csv('df.csv', index=False)
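For the weekly/daily part, that snippet can be wrapped in a function and triggered by cron or Windows Task Scheduler; a minimal in-process sketch (the interval and filename are arbitrary choices):

import time
import pandas as pd

URL = 'https://finance.yahoo.com/quote/%5EGSPC?p=^GSPC'

def refresh():
    # re-download the first table on the page and overwrite the CSV
    pd.read_html(URL)[0].to_csv('df.csv', index=False)

while True:
    refresh()
    time.sleep(24 * 60 * 60)  # once a day; cron is sturdier for long runs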

Specifying multiple columns of Quandl data to download with column_index

With Google Finance, you can specify which columns of data you want to download. Can that be done with Quandl data? If so, I can't find an example that illustrates how.
I want to download Open and Close data only, not the entire table, which is quite large. Quandl does supply a URL parameter, column_index, that allows specification of a single column, so I can query for Close data only, or Open data only, but not both.
Perhaps this can't be done. Could someone please confirm?
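The only workaround I can think of is issuing two requests and merging them client-side; a rough pandas sketch, where the dataset code and column indices (WIKI/AAPL, with Open assumed at index 1 and Close at index 4) are illustrative guesses:

import pandas as pd

# column_index accepts one column per request, so fetch twice and join on Date
BASE = 'https://www.quandl.com/api/v3/datasets/WIKI/AAPL.csv'
opens = pd.read_csv(BASE + '?column_index=1', index_col='Date')
closes = pd.read_csv(BASE + '?column_index=4', index_col='Date')
opens.join(closes).to_csv('open_close.csv')  # Open and Close only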

XML/JSON query to create Database

My coding knowledge is very basic, so please do bear that in mind. Basically, there is an encoding service called vid.ly. We have hundreds of videos on there and would like to create an Excel spreadsheet with all of their information.
Vidly accepts API queries in XML and JSON. Below is a basic example of the query I want to make:
http://api.vid.ly/#GetMediaList
Is there a way that I can get Excel to send that query to the Vidly website, receive an XML/JSON response, and make a table from it? I have gotten it to work with manually generated XML, but I really want Excel to pull that information automatically.
Sure, you need to write VBA code in the Excel sheet. Refer to the following URLs:
https://msdn.microsoft.com/en-us/library/dd819156%28v=office.12%29.aspx
http://www.dotnetspider.com/resources/19-STEP-BY-STEP-Consuming-Web-Services-through-VBA-Excel-or-Word-Part-I.aspx
http://www.automateexcel.com/2004/11/14/excel_vba_consume_web_services/
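If VBA proves hard going, the same job can be done outside Excel and the result opened as a spreadsheet; a Python sketch, with the caveat that the endpoint, parameters, and response shape below are placeholders (the real GetMediaList request format and credentials come from the vid.ly docs):

import requests
import pandas as pd

# Placeholder request: the real GetMediaList call needs your vid.ly
# user id/key and the request format from their API documentation.
resp = requests.get('http://api.vid.ly/GetMediaList', params={'format': 'json'})
media = resp.json()['media']  # assumed shape of the JSON response
pd.DataFrame(media).to_excel('vidly_media.xlsx', index=False)  # needs openpyxl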

"POST" form data to XLS/CSV

So I'm trying to essentially "POST" data from a form in an offline webpage to an Excel spreadsheet, or CSV, or even just a TXT file. I have seen this done using ActiveX in Internet Explorer; however, the methods I saw were pretty particular to the user's code, so I got a bit lost in translation, being a beginner. Some also recommended an offline database using JS, but I'm not sure where to begin with that.
Can anyone offer some insight on this? Is it possible? What would be the best route to take?
There are many ways to accomplish this. The best solution will be the one that suits your specific requirements. Obviously, creating a text/CSV file is easier than creating an XLS. That said, the basic pseudocode is as follows (a sketch of one concrete version follows the list):
Collect the form data.
Create an (in-memory or temporary) file from the collected form data.
Return the file as a download to the client, save it to some location, OR (best option) insert a row into a database.
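One way to realize those steps without ActiveX is to point the form's action at a tiny local server that appends each submission to a CSV; a minimal stdlib-only Python sketch (the port, filename, and 204 response are arbitrary choices):

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs
import csv

class FormToCsv(BaseHTTPRequestHandler):
    def do_POST(self):
        # read the urlencoded form body and append it as one CSV row
        length = int(self.headers['Content-Length'])
        fields = parse_qs(self.rfile.read(length).decode())
        with open('responses.csv', 'a', newline='') as f:
            csv.writer(f).writerow(v[0] for v in fields.values())
        self.send_response(204)  # accept silently; the page stays put
        self.end_headers()

HTTPServer(('localhost', 8000), FormToCsv).serve_forever()

The form would then use action="http://localhost:8000" and method="post".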

How to import PBP Data from NFL.com into R

I am trying to import data from past NFL games in the form of Play-by-play tables and am mostly working in R to collect the data and create a data set.
An example of the data I am after is on this page: http://www.nfl.com/gamecenter/2012020500/2011/POST22/giants#patriots#menu=gameinfo&tab=analyze&analyze=playbyplay
I know that NFL.com uses JSON and much of the necessary data are in JSON files attached to the site. My efforts at extracting data from these files using the JSON package in R have been pretty feeble. Any advice y'all have is appreciated.
Would I just be better off using PHP to farm the data?
I don't know if you have already succeeded in loading the JSON files into R, but here is an example of that:
library(rjson)

# parse the game's JSON feed straight from the URL
json <- fromJSON(file = 'http://www.nfl.com/liveupdate/game-center/2012020500/2012020500_gtd.json')

# e.g. the home team's stats
json$`2012020500`$home$stats
If you are having trouble finding the URL of the JSON file, use Firebug (a Firefox extension) to watch the webpage request the JSON file.
The JSON file is, of course, huge and complicated, but it is complicated data. Whatever you are looking for should be in there. If you just want a straight dump of the play-by-play text, you can use this URL:
http://www.nfl.com/widget/gc/2011/tabs/cat-post-playbyplay?gameId=2012020500
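You asked about PHP; any language with an HTTP client can fetch that URL. For instance, a short Python sketch (requests and BeautifulSoup are third-party packages you would need to install, not part of the NFL.com service):

import requests
from bs4 import BeautifulSoup

url = 'http://www.nfl.com/widget/gc/2011/tabs/cat-post-playbyplay?gameId=2012020500'
soup = BeautifulSoup(requests.get(url).text, 'html.parser')
print(soup.get_text('\n', strip=True))  # plain-text play-by-play dump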
I extracted all the data for one team for one season more or less manually. If you want data for a lot of games, consider emailing the league and asking for the files you mentioned. They publish the data, so maybe they will give you the files. The NFL spokesman is Greg Aiello; I suspect you could find his email address with Google.
Sorry this is not a suggested programming solution; if this answer is not appropriate for the forum, please delete it. It is my first posted answer.