How do I create a web page to show dynamic data? - html

I have a microcontroller that monitors 4 different sensor variables (temperature, pressure, weight, speed) with each variable being read at a different rate. The microcontroller also runs a lightweight webserver. I want to create a simple webpage such that when a client connection is made to the webserver, it will display the values of the 4 variables. While I am somewhat familiar with HTML I can’t figure out how to populate a webpage with the data.
I am looking for some pointers on how to do this.
Secondly, I know the page that is displayed is static. How can it be made dynamic, so that when a new value is read, the page is updated with that value?
I am sure there are ‘standard’ ways to do this but I have no clue what the mechanisms are called or how they work.
Any guidance is really appreciated. – Thanks!
J

Perhaps SSI (Server Side Includes) is what you need to show dynamic data inside an HTML page. A lightweight webserver for a microcontroller will usually have this feature implemented already.
To make the page update itself, you would need to implement this Web 2.0 feature yourself as an extension of your lightweight webserver. For the client side you can use the JS library from Socket.IO.
Alternatively, you can have a socket.io server (written in Python, for example) provide this Web 2.0 feature, and have your microcontroller simply push its data to that web server through a JSON API. The socket.io server then pushes the received data on to the clients. This is what I did on a previous MCU project.
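For the client side, here is a minimal sketch. It assumes the Socket.IO client library is loaded on the page, that the server emits a 'sensors' event carrying all four readings, and that the page has elements with the ids used below; all of those names are assumptions.

    // Connect back to the server that served the page.
    var socket = io();

    // Update the page whenever the server pushes a new set of readings.
    socket.on('sensors', function (data) {
        // data is assumed to look like: { temperature: 21.5, pressure: 1013, weight: 4.2, speed: 0 }
        document.getElementById('temperature').textContent = data.temperature;
        document.getElementById('pressure').textContent = data.pressure;
        document.getElementById('weight').textContent = data.weight;
        document.getElementById('speed').textContent = data.speed;
    });

On a very constrained webserver, plain polling (setInterval plus an XMLHttpRequest for a small JSON document) would achieve the same visible effect, just with the client pulling instead of the server pushing.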

Related

Game Maker Studio HTML5 localStorage issue

I'm embedding a GMStudio game in the browser using an iframe. I need to send some data to the game from the site's frontend as JSON, and to receive some data from the game back in the frontend so I can trigger consequent actions.
My idea so far was to save data in cookies/localStorage and have the game read it somehow, using the HTTP functionality or DLLs. I'd also like to emit messages from the game using window.parent.postMessage and receive them correctly in the frontend.
Alas, I have not found a way to implement this. I hope there is some consistent approach to this problem that I simply don't know about.
The backup plan is to use Game Maker's http_post_string and web sockets to get the user's data before the game starts and to make the frontend do something after the game ends. That is clumsy and insecure, however.
The standard approach is to make a JavaScript extension.
That is done by creating a blank extension, adding a blank JS file to it, defining the functions via the context menu on it, and then adding the implementations to the JS file. You'll then be able to call them from the GML side as usual.
This way you can access localStorage/cookies, transmit/receive data from JS backends, and overall mess with the runtime as you please (with various degrees of understanding required to access internal data).
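For illustration only, the JS file of such an extension might contain something like the following; the function names are hypothetical and must match the functions you declare on the extension in the IDE.

    // Hypothetical extension functions; the names must match the ones declared
    // on the extension in GameMaker Studio.
    function gm_storage_get(key) {
        // Read a value saved by the surrounding page; GML receives "" if it is missing.
        return window.localStorage.getItem(key) || "";
    }

    function gm_storage_set(key, value) {
        window.localStorage.setItem(key, value);
    }

    function gm_notify_parent(message) {
        // Send a message from the game to the embedding page.
        window.parent.postMessage(message, "*");
    }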

How Would I Go About Using Node.js For Frontend And Wordpress As The Backend?

I've been thinking about using Wordpress as a CMS backend, because a lot of people know it and it is easy to use, and then using Node.JS as the front-end. You're probably wondering why I would want to do that in the first place, and what the advantage is.
I want to use websockets, and the wonderful Socket.io library for Node.JS provides beautiful cross-browser websocket support. Essentially I want a user to come to a site, a websocket to be created, and content to then be fed to the frontend asynchronously as JSON and decoded on the frontend, all without page refreshing.
Effectively I am making Wordpress become a real-time CMS. You visit a site, but every link you click fetches the page as JSON and returns it via a websocket to save multiple requests and of course, page size.
How do I go about getting Node.JS talking to a MySQL database, pulling out info and then showing it? Any tutorials, resources and other useful tips would be gratefully appreciated. A few of my colleagues have wondered the same thing, so I think the answers will be a big help to everyone.
To be exact, you can't use Node.js for a front-end solution, since it runs on the server, not the browser (think of it like any other server-side language such as PHP, JSP etc).
You can, however, create the described solution with jQuery or any other Javascript library; you just have to implement the data transfer with Socket.IO. On the server side you'd need something to handle websockets, so the most natural way would be to use Node.js. But since you want to use Wordpress it gets really complicated, because Wordpress is not meant to be used in the way you describe, so I'm afraid you'd have to write your CMS from the ground up in Node.
Also, the approach you described has a huge flaw. Search engine crawlers are still unable to parse and run Javascript, so if all of your content is loaded dynamically, the site would look empty to Google and others. It would then be impossible to ever rank in the search results, rendering your site pretty much useless.
For MySQL and other modules for Node, you should check NPM registry and the Node modules page.
EDIT
After Dwayne explained his solution in comments, this is how I'd do it:
I'd use jQuery for the front-end: bind a delegated handler on the document with .on() and set the selector to 'a', so that every anchor on the page fires the handler (a rough sketch follows this list).
The handler parses the a.href attribute and figures out whether it's an external link, which shouldn't be handled by Javascript, or a link to the next page, an article, and so on. You can prevent the default action by calling e.preventDefault() in the handler, which stops the browser from navigating to the location.
Then the handler gets the content as JSON by calling .getJSON() on a URL derived from the link. The easiest way would be to have a certain pattern (such as all URLs like www.domain.com/api) redirect to the Node service via .htaccess, to prevent cross-domain problems.
Node then sees the request, extracts the parameters and figures out what the user wants. It connects to the MySQL database with this module (it's as simple as it can get) and returns the corresponding content formatted as JSON. Don't forget to set the Content-Type header to 'application/json'.
jQuery gets the response, figures out the type of the request and updates the content accordingly. Profit.
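A rough sketch of that front-end flow, assuming the Node service is reachable under an /api prefix and returns JSON with title and body fields (the prefix, the element ids and the JSON shape are all assumptions):

    // Delegated handler: every anchor on the page goes through here.
    $(document).on('click', 'a', function (e) {
        var href = $(this).attr('href');

        // External links should be left to the browser.
        if (/^https?:\/\//.test(href) && href.indexOf(window.location.hostname) === -1) {
            return;
        }

        e.preventDefault();  // stop the normal full-page load

        // Fetch the page content as JSON from the Node service.
        $.getJSON('/api' + href, function (data) {
            $('#title').text(data.title);
            $('#content').html(data.body);
        });
    });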
As you can see, I wouldn't use WebSockets in this case, since you wouldn't really benefit much from them. They are mostly meant for small real-time updates (no huge HTTP headers, to reduce the bandwidth) that go both ways. This means that the server can also push data into the browser without the browser asking for it. In a blog context this is not required, and you won't have too many requests, so the difference in bandwidth wouldn't be noticeable anyway. If, however, you would like to use them for educational purposes, just replace the getJSON part with Socket.IO; I'm not sure whether Apache supports proxying WebSockets, though. Extra information about Socket.IO basics is here.
Edit: I overlooked the part with 'using Node.js on the front-end'. As Vahur Roosimaa said, Node.js is on the server-side (think of it as Nginx / Apache + PHP combination). Node isn't a frontend library like jQuery.
If you want you can use it just for the websockets functionality (I suggest using Socket.IO).
Nice tutorials about Node.js and MySQL:
http://www.giantflyingsaucer.com/blog/?p=2596
http://mclear.co.uk/2011/01/26/very-simple-nodejs-mysql-select-query-example/
http://www.hacksparrow.com/using-mysql-with-node-js.html
This SO question might also help: MySQL with Node.js
Also check the examples from the github repo of node-mysql.
If you want something more advanced like an ORM, I recommend Sequelize.
Another good question from SO: Which ORM should I use for Node.js and MySQL?
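As a small sketch of the database side with the node-mysql module mentioned above (the credentials are placeholders, and the query against the standard wp_posts table is only illustrative):

    var mysql = require('mysql');  // npm install mysql

    var connection = mysql.createConnection({
        host: 'localhost',
        user: 'wp_user',        // placeholder credentials
        password: 'secret',
        database: 'wordpress'
    });

    connection.connect();

    // Illustrative query: fetch a few published posts and print them as JSON.
    connection.query(
        'SELECT ID, post_title, post_content FROM wp_posts WHERE post_status = ? LIMIT 10',
        ['publish'],
        function (err, rows) {
            if (err) throw err;
            console.log(JSON.stringify(rows));  // ready to send to the client as application/json
            connection.end();
        }
    );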
You should check out Wordscript, to which I recently added a Node JS example that can act as a simple front end for doing basic post retrieval from a Wordpress database.
It uses a common MySQL library for Node, generates MySQL queries from GET parameters, and renders data as it is retrieved from the database, including tags.
Wordscript aims to free backend/frontend developers from being forced to work with the Wordpress PHP codebase, but still allows Wordpress's administrative interface to be used when needed (and prudent to do so). APIs have been written in Ruby and PHP that both return JSON feeds and function generally the same way the Node version does, so that's an additional option where a scripting language is available.
One option you have, if you want to have wordpress as the CMS and keep its admin UI, is to write your wordpress templates to output JSON instead of HTML.
In contrast to Wordscript, this is more solution specific, since you will need to write your JSON output for every template/data you want. The upside is that you can create the JSON specifically for your needs.
On the node side, you write a small server that will consume the JSON, letting you use whatever javascript template language you want. Nodejs will also help out with performance, since you can save the rendered content and/or the JSON output in memory, saving you roundtrips to the wordpress templates.
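A rough sketch of such a small Node server, assuming the Wordpress template is reachable at a made-up URL like http://localhost/wp-json-template/ and already returns JSON; the cache here is just an in-memory object keyed by request path:

    var http = require('http');

    var cache = {};  // in-memory cache of JSON responses, keyed by request path

    http.createServer(function (req, res) {
        if (cache[req.url]) {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(cache[req.url]);
            return;
        }

        // Ask the Wordpress template (assumed URL) for the JSON version of the page.
        http.get('http://localhost/wp-json-template' + req.url, function (wpRes) {
            var body = '';
            wpRes.on('data', function (chunk) { body += chunk; });
            wpRes.on('end', function () {
                cache[req.url] = body;  // save the roundtrip to Wordpress next time
                res.writeHead(200, { 'Content-Type': 'application/json' });
                res.end(body);
            });
        });
    }).listen(3000);

In practice you would also need some cache invalidation, for example expiring entries after a timeout or when a post is edited.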
I wrote a blog about this, which describes more of the benefits of using nodejs and wordpress together.
http://www.1001.io/improve-wordpress-with-nodejs/

http push django comet

I want to make a Django server refresh content as it arrives in the database. The idea is that the user first sees the current contents of the database, and then, as new content arrives, it is placed above the previous content without reloading the page. In another part of the site, the current content should be replaced with the new content as it reaches the database.
EvServer is probably my choice, but I really don't know how to go about this, and what would be the simplest and most efficient way?
I think you should avoid HTTP Polling. Here's why:
The frequency of the setInterval, combined with the number of users on your web app, is going to lead to a big resource drain. If you go through slides 9 to 19 in this presentation you'll see some quite dramatic figures for using Push (note: this example uses a hosted service, but hosting your own realtime server and using Push has similar benefits).
Between setInterval calls, the data displayed in your app is potentially out of date. Using a Push technology means that the instant new data is available it can be pushed to and displayed in your app. You don't want users looking at an app and thinking they are seeing correct information when they are not.
You should take a look at the following StackOverflow questions:
Django / Comet (Push): Least of all evils?
Need help understanding Comet in Python (with Django)
For Python/Comet see:
Python Comet Server
The latest recommendation for Comet in Python?
I'd recommend you also start considering "WebSockets" as well as "Comet". Most Comet servers now prefer to use a WebSocket connection when possible.
If you'd prefer to avoid installing and managing your own Comet/WebSocket solution then you could use a realtime hosted service, which will allow you to push data through them using a REST API; your clients can receive events by embedding a JavaScript library and writing a small amount of code to subscribe to and receive the events.
The steps are quite straightforward:
Write a model to store the data in the DB.
Write a view that will generate JSON-serialized data upon a POST request.
Write a template that contains JavaScript with setInterval(), which performs AJAX requests to the view and renders the received data (a rough sketch follows this list). I'd suggest using jQuery as it's well documented and widespread.
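The template's JavaScript could then look roughly like this; the /updates/ URL, the JSON shape and the 5-second interval are assumptions, and Django's CSRF handling for POST is omitted for brevity:

    // Ask the view for new items every 5 seconds and prepend them to the page.
    setInterval(function () {
        $.post('/updates/', { since: window.lastId || 0 }, function (items) {
            $.each(items, function (i, item) {
                $('#content').prepend('<div>' + item.text + '</div>');
                window.lastId = item.id;  // remember the newest item we have seen
            });
        }, 'json');
    }, 5000);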

how to dynamically update HTML5 canvas via server push without page reload?

What I would like to do is create a canvas that will show a network map. It's not really a network map, but for explaining things the network map example works best, so as not to bog you down with details that don't pertain to my question.
On the network map I want to display routes the traffic takes. These routes change in time, sometimes as frequently as multiple times per minute. On the server side I have a log file to which each route change is appended as it happens.
I know how to create the canvas, I know how to draw my routes onto the canvas.
Is it possible to have the server push an update to the canvas without requiring a page reload/refresh, essentially requiring no user interaction at all? The routes drawn just automagically change?
This would need to work on IIS so a jscript or .Net based solution would be necessary. I won't be able to install PHP, Python, Ruby etc.
Thanks in advance for any insights you can provide.
I recommend that you look at a WebSockets solution to push the information from the server to the client (JavaScript). When you receive the update you can update the canvas as required.
Technologies you should look at if your preferred server technology is .NET would be a service like Pusher, who I work for, and our .NET APIs which let you push updates to the client via our REST API.
If you would prefer to host your own realtime infrastructure then you could look at WebSync (which is actually a Comet technology) which integrates with IIS and also XSockets. There are also a number of realtime technologies on this guide which may interest you. If you've any further questions just let me know.
Server-Sent Events: http://www.html5rocks.com/en/tutorials/eventsource/basics
(Server-sent events is a technology for providing push notifications from a server to a browser client in the form of DOM events. The Server-Sent Events EventSource API is being standardized as part of HTML5 by the W3C.)
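On the client, the Server-Sent Events approach could look roughly like this; the /routes endpoint and the message format are assumptions, and the server side just has to keep a response open with Content-Type text/event-stream:

    var canvas = document.getElementById('map');
    var ctx = canvas.getContext('2d');

    // Open a long-lived connection; the browser reconnects automatically if it drops.
    var source = new EventSource('/routes');

    source.onmessage = function (event) {
        // Assume each message is a JSON array of points describing one route.
        var route = JSON.parse(event.data);

        ctx.clearRect(0, 0, canvas.width, canvas.height);
        ctx.beginPath();
        ctx.moveTo(route[0].x, route[0].y);
        for (var i = 1; i < route.length; i++) {
            ctx.lineTo(route[i].x, route[i].y);
        }
        ctx.stroke();
    };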
Yes, I've done this before (with an ajax-based drawing application). It's very possible. Send packets of information via AJAX (JSON or something), interpret them, and draw them on the client's canvas element. This is trivial to design (and easy to implement using something like jQuery). It seems that you already figured out you need to have a server-side script that pushes information to a web page and a web page that actually draws stuff on the canvas. That's essentially all it is.

Downloading a lot of slippery data

I have access to a web interface for a large amount of data. This data is usually accessed by people who only want a handful of items. The company that I work for wants me to download the whole set. Unfortunately, the interface only allows you to see fifty elements (of tens of thousands) at a time, and segregates the data into different folders.
Unfortunately, all of the data has the same url, which dynamically updates itself through ajax calls to an aspx interface. Writing a simple curl script to grab the data is difficult due to this and due to the authentication required.
How can I write a script that navigates around a page, triggers ajax requests, waits for the page to update, and then scrapes the data? Has this problem been solved before? Can anyone point me towards a toolkit?
Any language is fine, I have a good working knowledge of most web and scripting languages.
Thanks!
I usually use a program like Fiddler or Live HTTP Headers and just watch what's happening behind the scenes. 99.9% of the time you'll see that there's a querystring or REST call with a very simple pattern that you can emulate.
If you need to directly control a browser
Have you thought of using tools like WatiN? They are actually intended for UI testing purposes, but I suppose you could use them to programmatically make requests and act upon the responses.
If you just need to get the data
But since you can do whatever you please, you can just make the usual web requests from a desktop application and parse the results. You can customize it to your own needs, and simulate AJAX requests at will by setting certain request headers (see the sketch below).
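For example, here is a small Node sketch (a swap-in for the "desktop application" idea) that replays one of the observed AJAX calls; the host, path and cookie are placeholders you would copy from the traffic captured with Fiddler:

    var https = require('https');

    var options = {
        hostname: 'example.com',             // placeholder host
        path: '/data.aspx?page=1&size=50',   // placeholder path copied from the captured request
        method: 'GET',
        headers: {
            // Headers that make the request look like the browser's AJAX call.
            'X-Requested-With': 'XMLHttpRequest',
            'Cookie': 'ASP.NET_SessionId=PLACEHOLDER'  // copied from an authenticated session
        }
    };

    https.request(options, function (res) {
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            console.log(body);  // parse/scrape the returned fragment here
        });
    }).end();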
Maybe this?
Website scraping using jquery and ajax
http://www.kelvinluck.com/2009/02/data-scraping-with-yql-and-jquery/