XML/JSON query to create Database - json

My coding knowledge is very basic, so please bear that in mind. Basically, there is an encoding service called Vid.ly. We have hundreds of videos on there and would like to create an Excel spreadsheet with all of their metadata.
Vid.ly accepts API queries in XML and JSON. Below is a basic example of the query I want to make:
http://api.vid.ly/#GetMediaList
Is there a way to get Excel to send that query to the Vid.ly API, receive the XML/JSON response, and build a table from it? I have gotten it to work with a manually generated XML file, but I really want Excel to pull that information automatically.

Sure, you need to write VBA code in your Excel workbook. Refer to the following URLs:
https://msdn.microsoft.com/en-us/library/dd819156%28v=office.12%29.aspx
http://www.dotnetspider.com/resources/19-STEP-BY-STEP-Consuming-Web-Services-through-VBA-Excel-or-Word-Part-I.aspx
http://www.automateexcel.com/2004/11/14/excel_vba_consume_web_services/
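If it helps to see the overall flow before diving into VBA, below is a minimal TypeScript (Node 18+) sketch of the same request-and-tabulate idea. The endpoint, credential fields, and response shape are assumptions rather than the documented Vid.ly schema, so check the GetMediaList docs before relying on them; the resulting CSV opens directly in Excel.

// Hypothetical sketch only: the endpoint, credential fields, and response
// shape below are assumptions, not the documented Vid.ly schema.
import { writeFileSync } from "fs";

async function main(): Promise<void> {
  const resp = await fetch("https://api.vid.ly/", { // assumed endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      method: "GetMediaList",
      userid: "YOUR_USER_ID",   // placeholder credentials
      userkey: "YOUR_USER_KEY",
    }),
  });
  // Assumed response shape: an array with one metadata object per video.
  const media: Record<string, unknown>[] = await resp.json();

  // The union of all keys becomes the header row; one CSV row per video.
  const cols = [...new Set(media.flatMap((m) => Object.keys(m)))];
  const lines = media.map((m) =>
    cols.map((c) => JSON.stringify(m[c] ?? "")).join(","));
  writeFileSync("vidly_media.csv", [cols.join(","), ...lines].join("\n"));
}

main();

The same three steps (send the query, parse the response, write rows) are what the VBA approaches in the links above typically do with an XMLHTTP request object and worksheet cells.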


Converting nested JSON response to a CSV table

I am testing APIs on a locally run service using Postman and getting a large nested JSON response, but I am required to produce a CSV file with the data in a table, i.e., keys as column names and the values as separate rows.
An offline solution is needed, for privacy and confidentiality reasons.
First I came across this solution: the Postman saveResponseToFile template, a Postman template that sends the requests to a locally running Node server, which in turn saves the JSON to a CSV.
But all this does is save each line into a separate cell horizontally across the CSV, which is not much use for readability.
Desired Output
The needed table is something like what this online tool does:
ConvertCSV
This takes the keys as column names and creates a new row for each piece of nested data.
Is there an OSS script that can accomplish this? Sorry if this is a duplicate; I haven't found a proper solution after trying several scripts similar to the first one.
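In the absence of a posted answer, here is a small offline TypeScript (Node) sketch of the flattening itself, under the assumption that nested objects become dotted column names and every element of a nested array becomes its own row, as in the ConvertCSV output; the file names are illustrative.

// Offline flattening sketch: nested objects become dotted column names and
// every element of a nested array becomes its own row. File names are
// illustrative; run with Node/ts-node on the response Postman saved.
import { readFileSync, writeFileSync } from "fs";

type Row = Record<string, unknown>;

function flatten(value: unknown, prefix = ""): Row[] {
  if (Array.isArray(value)) {
    // Arrays fan out: one set of rows per element.
    return value.flatMap((item) => flatten(item, prefix));
  }
  if (value !== null && typeof value === "object") {
    let rows: Row[] = [{}];
    for (const [key, child] of Object.entries(value as Row)) {
      const sub = flatten(child, prefix ? `${prefix}.${key}` : key);
      // Cross-join the rows so far with the rows produced by this key.
      if (sub.length) rows = rows.flatMap((r) => sub.map((s) => ({ ...r, ...s })));
    }
    return rows;
  }
  return [{ [prefix]: value }]; // leaf value under its dotted name
}

const rows = flatten(JSON.parse(readFileSync("response.json", "utf8")));
const cols = [...new Set(rows.flatMap((r) => Object.keys(r)))];
const csv = [
  cols.join(","),
  ...rows.map((r) => cols.map((c) => JSON.stringify(r[c] ?? "")).join(",")),
].join("\n");
writeFileSync("response.csv", csv);

The cross-join is what turns a record containing a three-element nested array into three rows that repeat the parent's values, which is the ConvertCSV behaviour described above.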

How do I use Pentaho Spoon to push data from a MySQL database to a Facebook page

1) I have already made a transformation mapping that gets data from a specific MySQL table (Table Input) and converts it to Text File output.
2) I have also created a Facebook developer account and am trying to figure out how the Facebook API works to push data from MySQL to Facebook.
3) I would appreciate it if a transformation mapping could be provided. Also, I would not like to use XML; I would like to use JSON instead.
The MySQL table is already converted to a CSV file, but I am not sure how to post the CSV file to Facebook, or whether there is a way to connect the MySQL table to Facebook directly. Please share your ideas or a transformation mapping. Thanks.
I would assume you are familiar with the Facebook API and actions like post and get.
Pentaho has a step called "REST Client".
You will have an API URL to post the data you want from MySQL; the step supports several methods (GET, PUT, POST, DELETE).
Also set the application format to JSON (the options include XML, JSON, etc.).
As a workaround, I used to read data from FB with the REST Client step using the GET method.
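For reference, the HTTP call the REST Client step performs can be sketched outside Pentaho like this (TypeScript, Node 18+). The page ID and access token are placeholders, and POSTing to the /{page-id}/feed edge is the Graph API's standard way to publish to a page.

// Sketch of the HTTP call the REST Client step performs (Node 18+).
// PAGE_ID and PAGE_TOKEN are placeholders you must supply; /{page-id}/feed
// is the Graph API edge for publishing a post to a page.
const PAGE_ID = "YOUR_PAGE_ID";
const PAGE_TOKEN = "YOUR_PAGE_ACCESS_TOKEN";

async function postRow(message: string): Promise<void> {
  const resp = await fetch(`https://graph.facebook.com/${PAGE_ID}/feed`, {
    method: "POST",
    body: new URLSearchParams({ message, access_token: PAGE_TOKEN }),
  });
  console.log(await resp.json()); // the API echoes back the new post's id
}

// One row from the MySQL table, already rendered as the post text.
postRow("Row exported from MySQL");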

How to retrieve data from XML and display it in a table in HTML

I want to develop a small search website where the data is stored in XML files. When we search for anything, the matching data should be displayed as a table in the HTML page. How does one retrieve the data from XML files?
Below is a basic example that displays only two columns, but I want to display the data dynamically:
HTML file: http://www.w3schools.com/xml/xml_applications.asp
This is the sample code for retrieving data from XML for just two columns.
Well, the first problem I see is that you have two functions in there that are never called, so nothing will happen programmatically. When you have a function you need to call it, e.g. myFunction(). I would recommend reading up a little more on JavaScript instead of copying and pasting code and expecting it to just "work".
To further elaborate: you removed the function call from the example when you took off the button. Also, what is your XML endpoint? It will not be the same as the example's unless you build it that way; in the example it is just an XML file hosted on the server under the same root as the HTML.
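To make that concrete, here is a browser-side TypeScript sketch that keeps the columns dynamic; "data.xml" and the <record> tag are assumptions about how your files are laid out.

// Browser-side sketch: fetch an XML file hosted next to the page, derive the
// column names from the first record's child elements, and build the table
// dynamically. "data.xml" and the <record> tag are assumptions about your
// file's layout.
async function renderTable(): Promise<void> {
  const text = await (await fetch("data.xml")).text();
  const xml = new DOMParser().parseFromString(text, "application/xml");
  const records = Array.from(xml.getElementsByTagName("record"));
  if (records.length === 0) return;

  // Columns are discovered at runtime, so they are not limited to two.
  const cols = Array.from(records[0].children).map((c) => c.tagName);
  const header = `<tr>${cols.map((c) => `<th>${c}</th>`).join("")}</tr>`;
  const body = records.map((r) =>
    `<tr>${cols.map((c) =>
      `<td>${r.getElementsByTagName(c)[0]?.textContent ?? ""}</td>`
    ).join("")}</tr>`
  ).join("");

  const table = document.createElement("table");
  table.innerHTML = header + body;
  document.body.appendChild(table);
}

renderTable(); // unlike the copied snippet, the function is actually called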

"POST" form data to XLS/CSV

So I'm trying to essentially "POST" data from a form in an offline webpage to an Excel spreadsheet, a CSV, or even just a TXT file. I have seen this done using ActiveX in Internet Explorer; however, the methods I saw were very particular to each user's code, so as a beginner I got a bit lost in translation. Some also recommended an offline database driven by JS, but I'm not sure where to begin with that.
Can anyone offer some insight on this? Is it possible? What would be the best route to take?
There are many ways to accomplish this; the best solution is the one that suits your specific requirements. Obviously, creating a text/CSV file is easier than creating an XLS. That said, the basic pseudocode is as follows (a client-side sketch follows the list):
Collect form data
Create an (in-memory or temporary) file from the collected form data.
Return the file as a download to the client, save it to some location, or (best option) insert a row into a database.
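For the "return the file as a download" branch, a purely client-side TypeScript sketch could look like the following; it needs no ActiveX and no server, and the form id is a placeholder.

// Purely client-side sketch for an offline page: read the form fields, build
// a one-row CSV, and hand it to the user as a download. The form id is a
// placeholder.
function saveFormAsCsv(): void {
  const form = document.getElementById("myForm") as HTMLFormElement;
  const headers: string[] = [];
  const values: string[] = [];
  new FormData(form).forEach((value, key) => {
    headers.push(key);
    values.push(`"${String(value).replace(/"/g, '""')}"`); // CSV-escape quotes
  });

  // A Blob download works even from a file:// page, where a real POST cannot.
  const blob = new Blob([`${headers.join(",")}\n${values.join(",")}`],
                        { type: "text/csv" });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "form-data.csv";
  link.click(); // wire this function to the form's submit button
}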

How to import PBP Data from NFL.com into R

I am trying to import data from past NFL games in the form of play-by-play tables, and am mostly working in R to collect the data and create a data set.
An example of the data I am after is on this page: http://www.nfl.com/gamecenter/2012020500/2011/POST22/giants#patriots#menu=gameinfo&tab=analyze&analyze=playbyplay
I know that NFL.com uses JSON and much of the necessary data are in JSON files attached to the site. My efforts at extracting data from these files using the JSON package in R have been pretty feeble. Any advice y'all have is appreciated.
Would I just be better off using PHP to farm the data?
I don't know if you have already succeeded loading the JSON files into R, but here is an example of that:
library(rjson)
# Fetch and parse the game-center feed for one game; the URL embeds the game ID.
json <- fromJSON(file='http://www.nfl.com/liveupdate/game-center/2012020500/2012020500_gtd.json')
# The top-level key is the game ID; home-team stats live under $home$stats.
json$`2012020500`$home$stats
If you are having trouble finding the URL of the JSON file, use Firebug (an extension for Firefox) and you can see the webpage requesting the JSON file.
The JSON file is, of course, huge and complicated, but it is complicated data: whatever you are looking for should be in there. If you are just looking for a straight dump of the play-by-play text, you can use this URL:
http://www.nfl.com/widget/gc/2011/tabs/cat-post-playbyplay?gameId=2012020500
I extracted all the data for one team for one season more or less manually. If you want data for a lot of games, consider emailing the league and asking for the files you mentioned; they publish the data, so maybe they will give you the files. The NFL spokesman is Greg Aiello, and I suspect you could find his email address with Google.
Sorry this is not a suggested programming solution. If this answer is not appropriate for the forum, please delete it. It is my first posted answer.