Connect Parse with external database? - MySQL

I am working on a web application project built with traditional technologies like PHP and MySQL.
Now we want to build an app for mobile users on platforms like Android and iOS.
So we are thinking of connecting MySQL with the Parse.com database.
I know Parse uses a NoSQL kind of database for storing objects.
So my question is: can we connect the Parse database to any other SQL database?
If yes, then how can we do that?
EDIT
@Luca Laco I just created a new cloud function like yours, which is below.
Parse.Cloud.define("get_parse4j_object", function(request, response) {
  // Parameters from client (iOS/Android app)
  //var requestedObjectId = request.params.objectId;
  // Calling backend service for getting user information
  Parse.Cloud.httpRequest({
    method: "GET",
    headers: {
      'Content-Type': 'application/json'
    },
    url: "https://api.parse.com/1/parse4j/MLiOgxncUM", /* This could be your url for the proper php module */
    //body: { "objectId":requestedObjectId }, /* Here you compose the body request for the http call, passing the php parameters and their values */
    success: function(httpResponse) {
      /* We expect that the php response is a Json string, using the header('application/json'), so: */
      var jsonResponse = JSON.parse(httpResponse.text);
      /* sample structure in jsonResponse: { "name":"Joe", "surname":"Banana", "birth_date":"01-02-1999" } */
      /* Do any additional stuff you need... */
      /* return the result to your iOS/Android client */
      return response.success({ "myRequestedUserInfo": jsonResponse });
      // note: alert() is not available in Cloud Code, and any statement after return is unreachable
    },
    error: function(httpResponse) {
      return response.error({ "msg": "Unable to fetch this user", "code": 123456 }); // sample error response
    }
  });
});
I followed the same approach that Luca Laco explained.
But I am getting an error when I call the function from my client JS.
This is my client JS:
<script type="text/javascript">
  Parse.initialize("APP_ID", "JAVASCRIPT_KEY");
  Parse.Cloud.run('get_parse4j_object', {}, {
    success: function(result) {
      alert(result);
    },
    error: function(error) {
      alert(JSON.stringify(error));
    }
  });
</script>
In the network tab I can see
POST https://api.parse.com/1/functions/get_parse4j_object 400 (Bad Request)
and the error is: {"code":141, "message":"function not found"}
Where am I going wrong?

If you mean something like a common MySQL connector, then the answer is no, you can't. As of now, the only way to relate Parse to anything else is to query from and to Parse. To be clear:
If you want to get a value from Parse that is stored in MySQL, you have to make an HTTP request to a specific PHP module on your PHP website (implemented by you) that expects some parameters and returns the result in a specific way, normally in JSON format, also setting the application/json HTTP header.
If you want to get a value from PHP that is stored in the Parse db, you can run a REST call from PHP following the spec on the Parse website ( https://parse.com/docs/rest/guide/ ), or simply use the PHP SDK ( https://github.com/ParsePlatform/parse-php-sdk ). Take a look also at Parse Webhooks.
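For reference, a REST read from Parse is just an authenticated HTTP call, so any PHP HTTP client (or curl) can perform it. A minimal curl sketch following the REST guide's conventions (the keys and the GameScore class are placeholders):
curl -X GET \
  -H "X-Parse-Application-Id: YOUR_APP_ID" \
  -H "X-Parse-REST-API-Key: YOUR_REST_API_KEY" \
  https://api.parse.com/1/classes/GameScore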
From what I understood, you already have a working web service, so doing this you would just proxy the resources stored in MySQL on your server to your clients through Parse. In other words, you should create a Parse Cloud function for each type of information you want to retrieve on the clients using the Parse SDK (for iOS or Android), and another Parse Cloud function for each action you perform on your devices and want to save to your MySQL db, always through the Parse system.
My personal opinion is to stay on MySQL, especially because Parse still has a lot of limitations on queries (no group by, no distinct, query timeouts, etc.), while it seems to be a really good service for push notifications. Anyway, all this depends on the complexity of your software and, as I said, is just my opinion.
[Edit]
Here is an example:
In Parse Cloud Code, let's make a cloud function called 'get_user_info':
Parse.Cloud.define("get_user_info", function(request, response) {
  // Parameters from client (iOS/Android app)
  var requestedUserId = request.params.user_id;
  // Calling backend service for getting user information
  Parse.Cloud.httpRequest({
    method: "POST",
    url: "https://www.yourPhpWebsite.com/getUser.php", /* This could be your url for the proper php module */
    body: { "php_param_user_id": requestedUserId }, /* Here you compose the body request for the http call, passing the php parameters and their values */
    success: function(httpResponse) {
      /* We expect that the php response is a Json string, using the header('Content-Type: application/json'), so: */
      var jsonResponse = JSON.parse(httpResponse.text);
      /* sample structure in jsonResponse: { "name":"Joe", "surname":"Banana", "birth_date":"01-02-1999" } */
      /* Do any additional stuff you need... */
      /* return the result to your iOS/Android client */
      return response.success({ "myRequestedUserInfo": jsonResponse });
    },
    error: function(httpResponse) {
      return response.error({ "msg": "Unable to fetch this user", "code": 123456 }); // sample error response
    }
  });
});
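On the device, this cloud function can then be invoked through the Parse SDK. For example, from the JavaScript SDK (the user_id value is just an example):
Parse.Cloud.run('get_user_info', { user_id: '42' }, {
  success: function(result) {
    console.log(result.myRequestedUserInfo);
  },
  error: function(error) {
    console.log(error);
  }
});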
The sample 'getUser.php' module could be
<?php
$expectedUserId = $_POST['php_param_user_id'];
// Query your MySQL db using the passed user id.
// Use PDO and a prepared statement so the user-supplied id cannot cause SQL injection.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'db_user', 'db_password'); // placeholder credentials
$stmt = $pdo->prepare("SELECT name, surname, birth_date FROM MyUserTable WHERE id = ?");
$stmt->execute(array($expectedUserId));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
// Build the json structure; json_encode takes care of quoting and escaping,
// e.g. { "name":"Joe", "surname":"Banana", "birth_date":"01-02-1999" }
$jsonResponseToParse = json_encode(array(
  'name' => $row['name'],
  'surname' => $row['surname'],
  'birth_date' => $row['birth_date']
));
header('Content-Type: application/json');
echo $jsonResponseToParse;
exit();
?>
Hope it helps

Related

FeathersJS - Add custom field to hook context object

When using FeathersJS on both the client and server side, in the app hooks (on the client) we receive an object with several fields, like the service, the method, the path, etc.
I would like, with Socket.io, to add a custom field to that object. Would that be possible? To be more precise, I would like to send the current version of the frontend app to the client, to be able to force or suggest a refresh when the frontend is outdated (using PWA).
Thanks!
For security reasons, only params.query and data (for create, update and patch) are passed between the client and the server. Query parameters can be pulled from the query into the context with a simple hook like this (where you can pass the version as the __v query parameter):
const setVersion = context => {
  const { __v, ...query } = context.params.query || {};

  context.version = __v;
  // Update `query` with the data without the __v parameter
  context.params.query = query;

  return context;
}
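On the client, the version can then be sent along with any service call as an ordinary query parameter. A minimal sketch (the 'messages' service name is just an example, and `client` is assumed to be an already configured Feathers client):
client.service('messages').find({
  query: {
    __v: '1.2.3'
  }
});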
Additionally, you can pass parameters like the version number as extraHeaders, which are then available as params.headers.
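With the Socket.io client, such headers are set on the connection itself. A rough sketch (the exact option placement varies by socket.io version, custom headers only apply to the polling transport, and the x-app-version header name is hypothetical):
const io = require('socket.io-client');
const feathers = require('@feathersjs/feathers');
const socketio = require('@feathersjs/socketio-client');

const socket = io('http://localhost:3030', {
  transportOptions: {
    polling: {
      extraHeaders: {
        'x-app-version': '1.2.3' // hypothetical header name
      }
    }
  }
});

const client = feathers().configure(socketio(socket));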
Going the other way around (sending the version information from the server) can be done by modifying context.result in an application hook:
const { version } = require('./package.json');

app.hooks({
  after: {
    all (context) {
      context.result = {
        ...context.result,
        __v: version
      }
    }
  }
});
It needs to be added to the returned data since websockets do not have any response headers.
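Putting it together on the client, every response now carries __v, which a PWA can compare against its own build version. A minimal sketch (CLIENT_VERSION and the 'messages' service are assumed names):
const CLIENT_VERSION = '1.2.3';

client.service('messages').find().then(result => {
  if (result.__v && result.__v !== CLIENT_VERSION) {
    // the frontend is outdated - suggest or force a refresh
    window.location.reload();
  }
});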

How to get around previously declared json body-parser in Nodebb?

I am writing a private plugin for NodeBB (open forum software). In NodeBB's webserver.js file there is a line that seems to be hogging all incoming JSON data.
app.use(bodyParser.json(jsonOpts));
I am trying to convert all incoming JSON data for one of my endpoints into raw data. However, the challenge is that I cannot remove or modify the line above.
The following code works ONLY if I temporarily remove the line above.
var rawBodySaver = function (req, res, buf, encoding) {
  if (buf && buf.length) {
    req.rawBody = buf.toString(encoding || 'utf8');
  }
}

app.use(bodyParser.json({ verify: rawBodySaver }));
However, as soon as I put the app.use(bodyParser.json(jsonOpts)); middleware back into the webserver.js file, it stops working. So it seems like body-parser only processes the first parser that matches the incoming data type and then skips the rest?
How can I get around that? I could not find any information in their official documentation.
Any help is greatly appreciated.
** Update **
The problem I am trying to solve is to correctly handle an incoming Stripe webhook event. The official Stripe documentation suggests doing the following:
// Match the raw body to content type application/json
app.post('/webhook', bodyParser.raw({type: 'application/json'}), (request, response) => {
  const sig = request.headers['stripe-signature'];
  let event;

  try {
    event = stripe.webhooks.constructEvent(request.body, sig, endpointSecret);
  } catch (err) {
    return response.status(400).send(`Webhook Error: ${err.message}`);
  }
  // ... handle the event, then return a response to acknowledge receipt
  response.json({received: true});
});
Both methods (the original at the top of this post and the official Stripe-recommended way) construct the Stripe event correctly, but only if I remove the middleware in webserver.js. So my understanding now is that you cannot have multiple middlewares handle the same incoming data. I don't have much wiggle room with the first middleware, except that I can modify the argument (jsonOpts) passed to it, which comes from a .json file. I tried adding a verify field, but I couldn't figure out how to add a function as its value. I hope this makes sense, and sorry for not stating the problem I am trying to solve initially.
The only solution I can find without modifying the NodeBB code is to insert your middleware in a convenient hook (which will run later than you want) and then hack into the layer list in the app router to move that middleware earlier in the list, so it runs before the things you need to get in front of.
This is a hack so if Express changes their internal implementation at some future time, then this could break. But, if they ever changed this part of the implementation, it would likely only be in a major revision (as in Express 4 ==> Express 5) and you could just adapt the code to fit the new scheme or perhaps NodeBB will have given you an appropriate hook by then.
The basic concept is as follows:
1. Get the router you need to modify. It appears it's the app router you want for NodeBB.
2. Insert your middleware/route as you normally would, letting Express do all the normal setup for your middleware/route and append it to the internal Layer list in the app router.
3. Then, reach into the list, take it off the end of the list (where it was just added) and insert it earlier in the list.
4. Figure out where to put it earlier in the list. You probably don't want it at the very start of the list, because that would put it ahead of some helpful system middleware that makes things like query parameter parsing work. So, the code looks for the first middleware that has a name we don't recognize from the built-in names we know, and inserts it right after that.
Here's the code for a function to insert your middleware.
function getAppRouter(app) {
  // History:
  //   Express 4.x throws when accessing app.router and the router is on app._router
  //   But, the router is lazy initialized with app.lazyrouter()
  //   Express 5.x again supports app.router
  //   And, it handles the lazy construction of the router for you
  let router;
  try {
    router = app.router; // Works for Express 5.x, Express 4.x will throw when accessing
  } catch(e) {}
  if (!router) {
    // Express 4.x
    if (typeof app.lazyrouter === "function") {
      // make sure router has been created
      app.lazyrouter();
    }
    router = app._router;
  }
  if (!router) {
    throw new Error("Couldn't find app router");
  }
  return router;
}

// insert a method on the app router near the front of the list
function insertAppMethod(app, method, path, fn) {
  let router = getAppRouter(app);
  let stack = router.stack;

  // allow function to be called with no path
  // as insertAppMethod(app, method, fn);
  if (typeof path === "function") {
    fn = path;
    path = null;
  }

  // add the handler to the end of the list
  if (path) {
    app[method](path, fn);
  } else {
    app[method](fn);
  }

  // now remove it from the stack
  let layerObj = stack.pop();

  // now insert it near the front of the stack,
  // but after a couple pre-built middlewares installed by Express itself
  let skips = new Set(["query", "expressInit"]);
  for (let i = 0; i < stack.length; i++) {
    if (!skips.has(stack[i].name)) {
      // insert it here before this item
      stack.splice(i, 0, layerObj);
      break;
    }
  }
}
You would then use this to insert your method like this from any NodeBB hook that provides you the app object sometime during startup. It will create your /webhook route handler and then insert it earlier in the layer list (before the other body-parser middleware).
let rawMiddleware = bodyParser.raw({type: 'application/json'});

insertAppMethod(app, 'post', '/webhook', (request, response, next) => {
  rawMiddleware(request, response, (err) => {
    if (err) {
      next(err);
      return;
    }
    const sig = request.headers['stripe-signature'];
    let event;

    try {
      event = stripe.webhooks.constructEvent(request.body, sig, endpointSecret);
      // you need to either call next() or send a response here
    } catch (err) {
      return response.status(400).send(`Webhook Error: ${err.message}`);
    }
  });
});
The bodyParser.json() middleware does the following:
1. Check the content type of an incoming request to see if it is application/json.
2. If it is that type, read the body from the incoming stream to get all the data from the stream.
3. When it has all the data from the stream, parse it as JSON and put the result into req.body so follow-on request handlers can access the already-read and already-parsed data there.
Because it reads the data from the stream, there is no longer any more data in the stream. Unless it saves the raw data somewhere (I haven't looked to see if it does), the original RAW data is gone - it's been read from the stream already. This is why you can't have multiple different middleware all trying to process the same request body. Whichever one goes first reads the data from the incoming stream, and then the original data is no longer there in the stream.
To help you find a solution, we need to know what end-problem you're really trying to solve. You will not be able to have two middlewares both looking for the same content-type and both reading the request body. You could replace bodyParser.json() with a single middleware that does both what it does now and whatever else you need, but it has to be the same middleware, not separate ones.
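For completeness, if you could influence that one line in webserver.js, the combined middleware would look like the following sketch, reusing the rawBodySaver from the question (note that a verify function cannot be expressed in a .json options file, which is why adding it to jsonOpts alone doesn't work):
const rawBodySaver = function (req, res, buf, encoding) {
  if (buf && buf.length) {
    req.rawBody = buf.toString(encoding || 'utf8');
  }
};

// one parser does both jobs: req.body gets the parsed JSON,
// req.rawBody keeps the exact bytes for the Stripe signature check
app.use(bodyParser.json(Object.assign({}, jsonOpts, { verify: rawBodySaver })));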

Twilio Sync token in ColdFusion

ColdFusion being as obscure as it is, Twilio doesn't have any SDKs for it. I'm trying to give Sync a go, but I'm not getting the JSON request for the access token right. Trying to mimic what is done by their node.js example here, I thought I could just output the JSON to the page in token.cfm:
{
"identity":"#Username#",
"token":["#AccountSID#","#APPSID#","#SECRET#"]
}
This is called from index.cfm:
<script src="js/jquery.js"></script>
<script src="https://media.twiliocdn.com/sdk/js/sync/releases/0.5.7/twilio-sync.min.js"></script>
<script>
  function fetchAccessToken(handler) {
    // We use jQuery to make an Ajax request to our server to retrieve our
    // Access Token
    $.getJSON('token.cfm', function(data) {
      // The data sent back from the server should contain a long string, which
      // is the token you'll need to initialize the SDK. This string is in a format
      // called JWT (JSON Web Token) - more at http://jwt.io
      console.log(data.token);
      // Since the starter app doesn't implement authentication, the server sends
      // back a randomly generated username for the current client, which is how
      // they will be identified while sending messages. If your app has a login
      // system, you should use the e-mail address or username that uniquely identifies
      // a user instead.
      console.log(data.identity);
      handler(data);
    });
  }

  $(document).ready(function() {
    fetchAccessToken(initializeSync);

    function initializeSync(tokenResponse) {
      var syncClient = new Twilio.Sync.Client(tokenResponse.token);
      // Use syncClient here
    }
  });
</script>
The response I receive is
{code: 400, message: "Unable to process JSON"}
Can I accomplish this? Or, alternatively, can the token be built by JavaScript alone?
Your JS is a very roundabout way of writing this:
$(function() {
  $.get('token.cfm').done(function (response) {
    var syncClient = new Twilio.Sync.Client(response.token);
    // ... use syncClient here
  });
});
but this still requires that the response is actually parseable as JSON.
If your CFM page just contains this:
<cfoutput>
{
"identity":"#Username#",
"token":["#AccountSID#","#APPSID#","#SECRET#"]
}
</cfoutput>
then this almost certainly produces syntactically wrong JSON. Don't do that.
JSON is to be produced from a data structure and a serialization function, that's no different in ColdFusion than in any other language.
<cfset AccountSID = "...">
<cfset APPSID = "...">
<cfset SECRET = "...">
<cfset tokenData = {
  "identity" = Username,
  "token" = [AccountSID, APPSID, SECRET]
}>
<cfcontent type="application/json"><cfoutput>#SerializeJSON(tokenData)#</cfoutput>
There are other, nicer ways of creating JSON responses, most prominently CF components with functions annotated with the "json" returnformat, but doing it manually like above is enough for a one-off.

Angular resource 404 Not Found

I've read other posts about similar 404 errors; my problem is that I can correctly query the JSON data, but I can't save without getting this error.
I'm using Angular's $resource to interact with a JSON endpoint. I have the resource object returning from a factory as follows:
app.factory('Product', function($resource) {
  return $resource('api/products.json', { id: '@id' });
});
My JSON is valid and I can successfully use resource's query() method to return the objects inside of my directive, like this:
var item = Product.query().$promise.then(function(promise) {
  console.log(promise); // successfully returns JSON objects
});
However, when I try to save an item that I've updated, using the save() method, I get a 404 Not Found error.
This is the error that I get:
http://localhost:3000/api/products.json/12-34 404 (Not Found)
I know that my file path is correct, because I can return the items to update the view. Why am I getting this error and how can I save an item?
Here is my data structure:
[
  {
    "id": "12-34",
    "name": "Greece",
    "path": "/images/athens.png",
    "description": ""
  },
  ...
]
By default the $save method uses the POST verb. You will need to figure out which HTTP verbs your server accepts in order to make an update; most modern API servers accept PATCH or PUT requests for updating data, rather than POST.
Then configure your $resource instance to use the proper verb, like this:
app.factory('Product', function($resource) {
  return $resource('api/products.json', { id: '@id' }, { 'update': { method: 'PUT' } });
});
check $resource docs for more info.
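With the custom action in place, an update could look like this sketch (the id comes from the question's sample data, and it assumes your backend accepts PUT for a single product):
Product.get({ id: '12-34' }, function(product) {
  product.description = 'Athens, Greece';
  product.$update(); // sends PUT instead of POST
});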
NOTE: $resource is meant to connect a frontend with a backend server supporting a RESTful protocol, unless you are using a server that receives the data and saves it into a file rather than a db.
Otherwise, if you are only working on a frontend solution where you need $resource and have no server for the moment, use a fake one; there are many great solutions out there, like deployd.
You probably don't implement the POST method for URLs like /api/products.json/12-34. Angular issues a POST request when saving a new resource, so you need to update your server-side application to support it and do the actual saving.
app.factory('Product', function($resource) {
  return $resource('api/products.json/:id', { id: '@id' });
});
Try adding "/:id" at the end of the URL string.

How to use update function to upload attachment in CouchDB

I would like to know what I can do to upload attachments in CouchDB using an update function.
Here you will find an example of my update function for adding documents:
function(doc, req) {
  if (!doc) {
    if (!req.form._id) {
      req.form._id = req.uuid;
    }
    req.form['|edited_by'] = req.userCtx.name;
    req.form['|edited_on'] = new Date();
    return [req.form, JSON.stringify(req.form)];
  }
  else {
    return [null, "Use POST to add a document."];
  }
}
And an example for removing documents:
function(doc, req) {
  if (doc) {
    for (var i in req.form) {
      doc[i] = req.form[i];
    }
    doc['|edited_by'] = req.userCtx.name;
    doc['|edited_on'] = new Date();
    doc._deleted = true;
    return [doc, JSON.stringify(doc)];
  }
  else {
    return [null, "Document does not exist."];
  }
}
Thanks for your help.
It is possible to add attachments to a document using an update function by modifying the document's _attachments property. Here's an example of an update function which will add an attachment to an existing document:
function (doc, req) {
  // skipping the create document case for simplicity
  if (!doc) {
    return [null, "update only"];
  }
  // ensure that the required form parameters are present
  if (!req.form || !req.form.name || !req.form.data) {
    return [null, "missing required post fields"];
  }
  // if there isn't an _attachments property on the doc already, create one
  if (!doc._attachments) {
    doc._attachments = {};
  }
  // create the attachment using the form data POSTed by the client
  doc._attachments[req.form.name] = {
    content_type: req.form.content_type || 'application/octet-stream',
    data: req.form.data
  };
  return [doc, "saved attachment"];
}
For each attachment, you need a name, a content type, and body data encoded as base64. The example function above requires that the client sends an HTTP POST in application/x-www-form-urlencoded format with at least two parameters: name and data (a content_type parameter will be used if provided):
name=logo.png&content_type=image/png&data=iVBORw0KGgoA...
To test the update function:
1. Find a small image and base64 encode it:
   $ base64 logo.png | sed 's/+/%2b/g' > post.txt
   The sed script encodes + characters so they don't get converted to spaces.
2. Edit post.txt and add name=logo.png&content_type=image/png&data= to the top of the document.
3. Create a new document in CouchDB using Futon.
4. Use curl to call the update function with the post.txt file as the body, substituting in the ID of the document you just created:
   curl -X POST -d @post.txt http://127.0.0.1:5984/mydb/_design/myddoc/_update/upload/193ecff8618678f96d83770cea002910
This was tested on CouchDB 1.6.1 running on OSX.
Update: @janl was kind enough to provide some details on why this answer can lead to performance and scaling issues. Uploading attachments via an update function has two main problems:
The update functions are written in JavaScript, so the CouchDB server may have to fork() a couchjs process to handle the upload. Even if a couchjs process is already running, the server has to stream the entire HTTP request to the external process over stdin. For large attachments, the transfer of the request can take significant time and system resources. For each concurrent request to an update function like this, CouchDB has to fork a new couchjs process, and since the process runtime will be rather long because of what is explained next, you can easily run out of RAM, CPU or the ability to handle more concurrent requests.
After the _attachments property is populated by the upload handler and streamed back to the CouchDB server (!), the server must parse the response JSON, decode the base64-encoded attachment body, and write the binary body to disk. The standard method of adding an attachment to a document -- PUT /db/docid/attachmentname -- streams the binary request body directly to disk and does not require the two processing steps.
The function above will work, but there are non-trivial issues to consider before using it in a highly-scalable system.
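For comparison, the standard streaming upload mentioned above is a single request. A sketch using the document id from the earlier example (the rev value is a placeholder for the document's current revision):
curl -X PUT \
  -H "Content-Type: image/png" \
  --data-binary @logo.png \
  "http://127.0.0.1:5984/mydb/193ecff8618678f96d83770cea002910/logo.png?rev=1-xxxxxxxx"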