I am creating a bubble cloud using this sample code on GitHub: https://github.com/vlandham/bubble_cloud
The code uses a .csv file in the data folder, but I want to create a page showing dynamic data, and I now have the link to a dynamic JSON page.
How do I change the code so the bubble cloud loads its data from the JSON page?
Are you just looking for something like the d3.json function? There are a bunch of request functions built into d3; check the d3 documentation.
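For example, wherever the sample code currently loads the CSV with d3.csv, you can swap in d3.json. A minimal sketch, assuming the callback-style d3 v3 API the sample uses (newer d3 versions return a Promise instead), a hypothetical JSON URL, and a placeholder display function standing in for whatever the sample uses to render the bubbles:

d3.json("https://example.com/bubble-data.json", function (error, data) {
  if (error) { console.error(error); return; }
  // data is already parsed JSON; hand it to the function that builds the bubbles,
  // just as the sample does with its parsed CSV rows
  display(data); // placeholder for the sample's own render/update function
});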
I am trying to create a webpage or page element that will read and display the data from an external XML data feed. I can't seem to find documentation on their site that will help and I am very new to this.
This is the XML url generated: https://spacedout.ampsuite.com/xml/releases?cid=2&s_date=2018-01-01&e_date=2019-01-11&order=release_date&dir=desc&limit=10
And this is an example of how I would like it displayed: https://client.ampsuite.com/
Pretty much just the section under "featured releases" that lists current music releases.
You can use a Wix Code backend function to do that. All you need is to use the wix-fetch API to get the data; then you can parse the XML with xml-js (a Node module you can install in the backend).
In your client code you'll need to call your backend function and then inject the results into something like a repeater or table element in your UI.
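A rough sketch of what that could look like, assuming a backend web module; the file name, function name, element id, and the shape of the parsed XML are all assumptions here, not Ampsuite or Wix specifics:

// backend/releases.jsw (hypothetical file name)
import { fetch } from 'wix-fetch';
import { xml2js } from 'xml-js';

export async function getReleases() {
  const response = await fetch('https://spacedout.ampsuite.com/xml/releases?cid=2&s_date=2018-01-01&e_date=2019-01-11&order=release_date&dir=desc&limit=10');
  const xml = await response.text();
  // compact mode returns a plain object keyed by element names
  return xml2js(xml, { compact: true });
}

// client-side page code (hypothetical element id)
import { getReleases } from 'backend/releases';

$w.onReady(async () => {
  const data = await getReleases();
  console.log(data); // inspect the parsed structure first
  // then map the release entries into $w('#releasesRepeater').data,
  // giving each item an _id, and bind the repeater's text/image elements
});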
I am trying to use Node.js to implement data scraping. I used axios to GET the HTML and cheerio to extract the data.
However, I found that the HTML comes back with only the layout and no data. I guess the website loads the layout first, then makes Ajax calls to query the data and render it.
So, does anyone know how to GET the full HTML with the data? Any library or tools?
Thanks.
I would suggest using the Selenium library together with the bs4 (BeautifulSoup) library in Python, if you have some experience with Python.
For Node there is:
https://www.npmjs.com/package/selenium-webdriver
I have written a scraper in Python using both libraries.
The scraper is for LinkedIn profiles: it takes names from an Excel file, searches for them, and if data is available adds it to another Excel file.
https://github.com/harsh4870/Scraper_LinkedIn
For Node, the code goes something like this:
// inside an async function
const { Builder } = require('selenium-webdriver');
const driver = await new Builder().forBrowser('firefox').build();
await driver.get('http://example.com');
const html = await driver.getPageSource();
await driver.quit();
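If you still want to do the parsing with cheerio, you can hand it the rendered source instead of the axios response; a minimal sketch where '.item' is just a placeholder selector:

const cheerio = require('cheerio');

const $ = cheerio.load(html); // html returned by driver.getPageSource()
$('.item').each((i, el) => {
  console.log($(el).text().trim()); // replace '.item' with the real selector for your data
});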
I have created a "connector" with a very nice tool called import.io, which lets me run a search query against another website and get back a result list. I followed another Stack Overflow article to do this:
basic import.io html search
This works well. But my question now is:
How do I style my HTML (the result list) with CSS like on that site?
Thanks
To get the data from your API into a web page, you need to access the API from a programming language or script. Once the API returns the data as JSON, you could try something like http://json2html.com/ to convert the data into HTML and write that to your page.
Alternatively, you could download the data as CSV, open it in Excel, wrap HTML tags around the data, and copy-paste that into your website. It's not ideal, but at least you can get the data online.
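For the first option, a minimal sketch in plain JavaScript that skips json2html and builds the list by hand; the API URL, field name, and element id are placeholders:

fetch('https://example.com/api/results.json') // placeholder API endpoint
  .then((response) => response.json())
  .then((results) => {
    const list = document.getElementById('resultlist'); // an empty <ul id="resultlist"> in your page
    results.forEach((result) => {
      const item = document.createElement('li');
      item.className = 'result-item'; // hook for your CSS
      item.textContent = result.title; // placeholder field name
      list.appendChild(item);
    });
  });

Once the result list is ordinary DOM like this, you can style it with normal CSS rules targeting .result-item (or whatever classes you add).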
So I want to create a chart using the Google Charts API. The problem is the following:
I need to create a chart that uses JSON data as its input.
Example:
I am creating a BarChart that should show monthly income. Each bar should represent an income value. But this income changes monthly, and I don't want to update the new numbers (there are plenty of them) manually by writing them into the JSON structure; I will use Perl scripts to gather the new data.
I have read plenty on the web, but everything uses JSON combined with PHP, and I do not have a web server. What I want is to create a JSON folder on my desktop which will contain 100 JSON files.
And when I browse my HTML page, I will click on Monthly Income (May), and this should open a new HTML page with some sort of include or call mechanism that loads the specific JSON file from the JSON folder on my computer corresponding to the item I chose on the webpage (in this case, Monthly Income (May)).
I think my problem is simpler than the PHP-based approach, because I am not worried about the updates; I will update the text files with a Perl script. I just need a way to include a JSON file in my HTML page without using PHP or any of that web-server-related stuff.
Any ideas or suggestions?
Thanks,
David
You can't reliably do it cross-browser with JSON because some browsers block ajax calls from local files (e.g., file:// URLs). But you can do it with JSON-P.
You create the files like this:
dataCallback(/*...JSON here...*/)
E.g.:
dataCallback({
  "foo": "bar"
});
...and use this to load the relevant file:
<script>
  function dataCallback(data) {
    // Here, the data will be the deserialized (parsed) data
  }
</script>
<script src="/path/to/jsonp/file"></script>
Basically this uses JavaScript rather than JSON, but the data file, although technically code, is really just a function call with the data as its argument.
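To tie this back to the Google Charts part of the question, here is a rough sketch of how dataCallback could feed a BarChart. The element id, the JSONP file name, and the assumption that the file passes { "rows": [["May", 1234], ...] } are all placeholders for whatever your Perl script actually writes:

<div id="chart"></div>
<script src="https://www.gstatic.com/charts/loader.js"></script>
<script>
  google.charts.load('current', { packages: ['corechart'] });

  function dataCallback(data) {
    google.charts.setOnLoadCallback(function () {
      var table = google.visualization.arrayToDataTable(
        [['Month', 'Income']].concat(data.rows)
      );
      var chart = new google.visualization.BarChart(document.getElementById('chart'));
      chart.draw(table, { title: 'Monthly Income' });
    });
  }
</script>
<script src="json/monthly-income-may.js"></script> <!-- hypothetical JSONP file -->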
I want to use the R language with HTML. Is there any R library or other method so I can put graphs created by R into an HTML page?
R will create the graphs and those graphs should then be shown in the HTML page. It is a continuous process (i.e. R will continuously create graphs and the HTML page will use them).
To embed charts (and R code) into a simple HTML page, just use knitr as suggested in the comment above. But if you need something more, try Rook, which is a web server interface for R. You'll be able not just to embed charts and code but also to let the user send requests to R and get responses (e.g. re-plot a chart, display different variables, etc.). There is a basic 'getting started' guide for it.
Another package I am looking at right now is hwriter.
The simplest way is to export the graph to an FTP location from which your webpage sources its graph. I am already using the same solution for private use: I export a PNG to a Google Drive subdirectory on my computer, and the Google Drive application syncs it automatically to cloud storage.