XML-RPC versus HTML Form

Can anyone explain to me the benefits of using XML-RPC over a straight HTML form?
At first glance, they seem to accomplish the same thing.
XML-RPC is "formatted" using XML, but you can do the same in a form (think textbox).
I have an app that takes data from a script that runs on the client PC. The output from the script is XML. Currently it is submitted to the app (PHP using CodeIgniter) via a form POST. I have been told to look at using XML-RPC, but I am trying to understand "why"...

The primary benefit of XML-RPC is that you don't have to write any (or nearly any) glue code to get remote processes to communicate. XML-RPC client libraries are available for many languages. If you have a rich API of functions, XML-RPC can be a very easy way to connect remote processes to that API. Performing the same task with a plain www-form-encoded POST will require you to convert the API to a form and dispatch requests into API calls. There are a few systems that can help with that, but it is certain to be more difficult than just exposing the API through XML-RPC.
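As an illustration, here is a minimal sketch using Python's standard xmlrpc.client module; the endpoint URL and the add method are made up for the example:

    import xmlrpc.client

    # The library builds the XML request, POSTs it, and parses the XML response,
    # so there is no hand-written glue code on the client side.
    proxy = xmlrpc.client.ServerProxy("http://example.com/rpc")  # hypothetical endpoint
    print(proxy.add(2, 3))                                       # hypothetical remote method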
On the other hand, if (as it sounds) you already have a rich API exposed through plain form-encoded requests, it's hard to justify the work of porting both client and server to another interface.

If you mean encoding data in the www-form-encoded (or "query string style") format vs POSTing XML-RPC, it's really a matter of taste. Some people prefer to use XML for everything.
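To make the contrast concrete, here is a rough Python sketch of the same data POSTed both ways; the URL and field names are hypothetical:

    import urllib.parse
    import urllib.request

    url = "http://example.com/submit"  # hypothetical endpoint

    # Plain www-form-encoded ("query string style") POST
    form_body = urllib.parse.urlencode({"name": "widget", "qty": "3"}).encode()
    req_form = urllib.request.Request(
        url, data=form_body,
        headers={"Content-Type": "application/x-www-form-urlencoded"})

    # The same data POSTed as an XML document
    xml_body = b"<order><name>widget</name><qty>3</qty></order>"
    req_xml = urllib.request.Request(
        url, data=xml_body, headers={"Content-Type": "text/xml"})

    # Either request would then be sent with urllib.request.urlopen(...)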

XML-RPC is a bit dated, in my opinion. Unless you're talking about a different protocol, this is the one that preceded SOAP. I wouldn't use it at all. Instead, I'd use either a SOAP-based service or a REST-based one.

HTML forms are not comparable to XML-RPC.
You use HTML forms to get input data through a web page, while you use XML-RPC to execute remote procedures using XML.
Usually you would use XML-RPC from a process without a UI that needs to execute some procedure remotely.
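To see what "using XML" means on the wire, Python's standard library can print the request body an XML-RPC client would POST; the add method here is just an example:

    import xmlrpc.client

    # Serialize a call to a hypothetical "add" method into the XML-RPC request body.
    print(xmlrpc.client.dumps((2, 3), methodname="add"))
    # Prints a <methodCall> document with <methodName>add</methodName> and the two
    # integer parameters wrapped inside <params>/<param>/<value> elements.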

Related

Connect Sproutcore App to MySQL Database

I'm trying to build my first SproutCore app and I'm struggling to connect it to a MySQL database or any data source other than fixtures. I can't seem to find ANY tutorial except this one from 2009, which is marked as deprecated: http://wiki.sproutcore.com/w/page/12413058/Todos%2007-Hooking%20Up%20to%20the%20Backend .
Do people usually not connect SproutCore apps to a database? If they do, how do they find out how? Or does the above-mentioned tutorial still work? A lot of the gem commands in its introduction already seem to differ from the official SproutCore getting-started guide.
SproutCore apps, as client-side "in-browser" apps, cannot connect directly to a MySQL or any other non-browser database. The application itself runs only within the user's browser (it's just HTML, CSS & JavaScript once built and deployed) and typically accesses any external data via XHR requests to an API or APIs. Therefore, you will need to create a service wrapper around your MySQL database in order for your client-side app to be able to load and update data.
There are two things worth mentioning. The first is that since the SproutCore app contains all of your user interface and a great deal of business logic, your API can be quite simple and should only return raw data (such as JSON). The second is that the client-server design, while more tedious to implement, is absolutely necessary in practice, because you can never trust the client-side code, which is in the hands of a possibly nefarious user. Therefore, your API should also act as the final gatekeeper that validates all requests from the client.
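As a rough sketch of what such a wrapper can look like (standard-library Python only; the /tasks path and fetch_tasks() are hypothetical stand-ins, and a real version would query MySQL and validate the request):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def fetch_tasks():
        # Placeholder: a real implementation would validate the request
        # and run a MySQL query here.
        return [{"id": 1, "title": "Hook up the backend"}]

    class ApiHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/tasks":
                body = json.dumps(fetch_tasks()).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)  # raw JSON for the client-side app to load via XHR
            else:
                self.send_error(404)

    HTTPServer(("localhost", 8080), ApiHandler).serve_forever()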
This tutorial I found helped me a lot. It's very brief and demonstrates how to implement a very simple login app, how to send POST requests (triggered by the login button action) to the backend server, and how to asynchronously process the response inside the SproutCore app:
http://hawkins.io/2011/04/sproutcore_login_tutorial/

json-rpc with erlang

I have never used json before, and don't care anything about it except I now have a requirement to access an application through json-rpc.
I have done searches on "erlang json" which returned everything from proposed erlang bifs to mochijson to whatever. Thing is, I have yet to find any documentation or example using any of this stuff to do what I need to do, which is control another app through json-rpc. Most of the docs and examples I've seen have dealt with conversions and mappings from erlang data types to json and back. In fact, docs seem to go so overboard with enthusiasm for representing "language X" terms in json that I've often wondered whether there was something I've missed along the way. Thus far the topic has failed to stimulate any blood flow towards certain regions of the body, but whatever - it is what it is.
WHAT I DON'T WANT
I DON'T care about javascript, and I DON'T care about doing anything related to json-rpc from javascript or the browser.
WHAT I WANT
To use json-rpc from erlang SERVER SIDE to control an app SERVER SIDE.
At any rate...
1) Can someone point me to some docs and examples showing erlang using a json-rpc library to control or access another app?
2) Can someone recommend a library or libraries to do this? Since I am currently using yaws (or attempting to), my first choice would probably be yaws since it appears to have some json built-in. Thing is, the only yaws examples I saw were focused on using javascript code browser side to trigger some kind of json-rpc thing server side, and I don't want to do that.
At any rate, I'll accept the first thing that can do what I want and has documentation showing it being used that way.
Thanks.
WHAT I WANT
To use json-rpc from erlang SERVER SIDE to control an app SERVER SIDE.
Did you mean that there is a JSON-RPC server implemented in Erlang, and you want to control an app behind that JSON-RPC server?
Let's say there is a JSON-RPC server at "https://10.11.1.100:8006/json-rpc/".
You can use any programming language to access that URL with an HTTP library, for example Python's http module, sending either a POST with some data or a plain GET, depending on what your JSON-RPC server provides.
Let's say the JSON-RPC server provides "Getfruit" and "Putfruit" methods for controlling the backend app that loads fruit into a database.
Then you can use a Python HTTP library to send a GET query to "https://10.11.1.100:8006/json-rpc/Getfruit/3", and the server will return JSON data containing the details of the fruit with ID 3.
If you want to control the backend app by sending data to its database, use the POST method on the "Putfruit" path.
So, this is a simple example of how to use a JSON-RPC server.
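A quick sketch of those two calls in Python with the standard library; the host, port and the Getfruit/Putfruit methods are the made-up ones from above:

    import json
    import urllib.request

    base = "https://10.11.1.100:8006/json-rpc"

    # GET query: ask the server for the fruit with ID 3
    with urllib.request.urlopen(base + "/Getfruit/3") as resp:
        fruit = json.load(resp)  # JSON data with the fruit's details

    # POST query: send data for the backend app to store in its database
    payload = json.dumps({"name": "apple"}).encode()
    req = urllib.request.Request(base + "/Putfruit", data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)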
If I've misunderstood you and you actually want to know how to use Erlang to control an app through a JSON-RPC server, then just use Erlang's HTTP client library (httpc, for example).

GWT Convert RPC to JSON

My application uses GWT-RPC to communicate with the server. Is there any way to transparently serialize my data using JSON without changing the RPC layer?
IMHO this could be achieved by changing the serializers and using the AutoBean codex in the UI.
Why do I need that?
I want to make cross domain RPC calls
I want to call the server side from a non-GWT app without providing an extra layer on the server side.
UPDATE I just stumbled upon http://code.google.com/p/gerrit/source/browse/README?repo=gwtjsonrpc which is actively maintained (as it's part of Gerrit, the code-review tool used by the Android team)
Have a look at http://code.google.com/p/gwt-rpc-plus/ but it's no longer maintained...
If you really need it and don't want to move away from GWT-RPC, then replacing GWT's serializers should be possible: that's exactly what deRPC (com.google.gwt.rpc) does compared to standard GWT-RPC (com.google.gwt.user.rpc), but it needs to do a bit more than that (namely, generate the serialization code for the client side, as there's no reflection at runtime).
This is going to be a hard task. I don't think changing the serializers will work; GWT-RPC serializers work with the input as a stream (the data sent from the server is in fact basically in JSON format, but it can only be parsed by GWT-RPC). You will have to create a totally new generator, which will produce the code for parsing and object serialization/deserialization. The AutoBean framework might be very helpful in this case. In the end you should be able to migrate from GWT-RPC serialization to some other protocol without actually changing the current code that uses GWT-RPC services.
The biggest problem is cross-domain messaging. Normally you would use JSONP, but JSONP basically allows only GET requests; if you need to send a lot of data to the other server, you might not be able to fit everything into a single request. You can solve that with cross-domain document messaging (e.g. you open an iframe that loads a special communication JavaScript file from the remote server, and use this iframe as a proxy for your service via postMessage), but this feature is not supported in IE7.

How Would I Go About Using Node.js For Frontend And Wordpress As The Backend?

I've had a thought of using WordPress as a CMS backend, because, well, a lot of people know it and it is easy to use, and then using Node.js as the front-end. You're probably thinking: why would I want to do that in the first place, and what is the advantage?
I want to use websockets and the wonderful Socket.io library for Node.JS provides beautiful cross-browser websockets support. Essentially I want a user to come to a site, a websocket is created and then content is fed to the frontend asynchronously as JSON and then decoded on the frontend all without page refreshing.
Effectively I am making Wordpress become a real-time CMS. You visit a site, but every link you click fetches the page as JSON and returns it via a websocket to save multiple requests and of course, page size.
How do I go about getting Node.JS talking to a MySQL database, pulling out info and then showing it? Any tutorials, resources and other useful tips would be gratefully appreciated. A few of my colleagues have wondered the same thing, so I think the answers will be a big help to everyone.
To be exact, you can't use Node.js for a front-end solution, since it runs on the server, not the browser (think of it like any other server-side language such as PHP, JSP etc).
You can, however, create the described solution with jQuery or any other JavaScript library; you just have to implement the data transfer with Socket.IO. On the server side you'd need something to handle WebSockets, so the most natural way would be to use Node.js, but since you want to use WordPress, it gets really complicated: WordPress is not meant to be used in the way you described, so I'm afraid you'd have to write your CMS from the ground up in Node.
Also, the approach you described has a huge flaw. Search engine crawlers are still unable to parse and run JavaScript, so if all of your content is loaded dynamically, it will appear empty to Google and the others, making it impossible to ever rank in the search results and rendering your site pretty much useless.
For MySQL and other modules for Node, you should check NPM registry and the Node modules page.
EDIT
After Dwayne explained his solution in comments, this is how I'd do it:
I'd use jQuery for the front-end, binding a handler on the document with .on() and setting the selector to 'a', so that every anchor on the page fires the handler.
The handler parses the a.href attribute and figures out whether it's an external link, which shouldn't be handled by JavaScript, or a link to the next page, an article, etc. You can prevent the default action by calling e.preventDefault() in the handler, which stops the browser from redirecting to the location.
Then the handler would get the content in JSON by calling .getJSON() on a URL based on the article. The easiest way would be to have a certain pattern (such as all URLs like www.domain.com/api) redirect to the Node service via .htaccess, to prevent cross-domain problems.
Node would then see the request, extract the parameters and figure out what the user wants. Then it would connect to the MySQL database with this module (it's as simple as it can get) and return the corresponding content formatted as JSON. Don't forget to set the Content-Type header to 'application/json'.
jQuery gets the response, figures out the type of the request and updates the content accordingly. Profit.
As you can see, I wouldn't use WebSockets in this case, since you wouldn't really benefit much from them. They are mostly meant for small real-time updates (no huge HTTP headers, which reduces bandwidth) that go both ways, meaning the server can also push data into the browser without the browser asking for it. In a blog context this is not required, and you won't have too many requests, so the difference in bandwidth wouldn't be noticeable anyway. If, however, you would like to use them for educational purposes, just replace the getJSON part with Socket.IO; I'm not sure whether Apache supports proxying WebSockets, though. Extra information about Socket.IO basics is here.
Edit: I overlooked the part with 'using Node.js on the front-end'. As Vahur Roosimaa said, Node.js is on the server-side (think of it as Nginx / Apache + PHP combination). Node isn't a frontend library like jQuery.
If you want you can use it just for the websockets functionality (I suggest using Socket.IO).
Nice tutorials about Node.js and MySQL:
http://www.giantflyingsaucer.com/blog/?p=2596
http://mclear.co.uk/2011/01/26/very-simple-nodejs-mysql-select-query-example/
http://www.hacksparrow.com/using-mysql-with-node-js.html
This SO question might also help: MySQL with Node.js
Also check the examples from the github repo of node-mysql.
If you want something more advanced like an ORM, I recommend Sequelize.
Another good question from SO: Which ORM should I use for Node.js and MySQL?
You should check out Wordscript, to which I recently added a Node.js example that can act as a simple front end for doing basic post retrieval from a WordPress database.
It uses a common MySQL library for Node, generates MySQL queries from GET parameters, and renders data as it is retrieved from the database, including tags.
Wordscript aims to free backend/frontend developers from being forced to work with the WordPress PHP codebase, but still allows WordPress's administrative interface to be used when needed (and prudent to do so). APIs have been written in Ruby and PHP that both return JSON feeds and function generally the same way the Node version does, so that's an additional option where a scripting language is available.
One option you have, if you want to have WordPress as the CMS and keep its admin UI, is to write your WordPress templates to output JSON instead of HTML.
In contrast to Wordscript, this is more solution-specific, since you will need to write the JSON output for every template and piece of data you want. The upside is that you can tailor the JSON specifically to your needs.
On the Node side, you write a small server that consumes the JSON, letting you use whatever JavaScript template language you want. Node.js will also help with performance, since you can cache the rendered content and/or the JSON output in memory, saving you round trips to the WordPress templates.
I wrote a blog post about this, which describes more of the benefits of using Node.js and WordPress together.
http://www.1001.io/improve-wordpress-with-nodejs/

Retrieving JSON from a web URL

This may be a terribly uninformed question, brace yourself. A company I'm working with has given me an 'API' that I can use to access orders; however, there are only two real commands, getorders and getorderdetails. These commands are put in the format www.server.com/path/to/the/orderapi/getorders/UniqueKey/
If I go to that web address, I'm prompted for a username and password, and once authenticated, I'm presented with a page of JSON-formatted order details contained in the body of the HTML page. I would like a service to check this information and create orders in our CRM based on it. Is there an obvious way to access it without the browser/client interaction?
Update: We intended to use BizTalk to consume this resource, but after a bit of research and experimenting we have decided to use a different service (WSDL), mainly because BizTalk doesn't seem to have terribly great support for RESTful web services. If anyone with more knowledge of the subject would like to chime in, that's fine by me.
Update 2: I noticed that a (since deleted) thread on Stack Overflow is basically a feedback thread for BizTalk 2009 R2, and one of the requests is support for RESTful web services, so I don't think there is a graceful solution for RESTful services in BizTalk 2009.
Use your language of choice, along with some library that speaks HTTP, and start hacking away?
In PHP, you'd use the built-in cURL library to make the HTTP requests. You'd grab the JSON data, run it through json_decode() to create native PHP datatypes, and then operate on them at will, doing whatever you need to do to create orders in your CRM.
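The same fetch-and-decode flow sketched in Python, assuming the endpoint returns raw JSON and uses HTTP basic auth; the URL, credentials and final step are placeholders based on the question:

    import json
    import urllib.request

    url = "https://www.server.com/path/to/the/orderapi/getorders/UniqueKey/"

    # Answer the HTTP basic-auth prompt programmatically instead of via the browser.
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, url, "apiuser", "apipassword")
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(password_mgr))

    with opener.open(url) as resp:
        orders = json.load(resp)  # the Python counterpart of PHP's json_decode()

    for order in orders:
        print(order)  # here you would create the order in your CRM instead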