Contact form 7 over the JSON REST API (WP API)? - json

Is this possible?
To get the JSON (plugin URL) of the form (fields, input types, etc.), how would I have to hook this up in a plugin?
Then, to use the plugin's send mechanism, how would I hand my REST POST over to the plugin's send function?
Any ideas would be appreciated.

You can hook into the wpcf7_before_send_mail action to get the POST data right before the mail is sent by CF7.
add_action('wpcf7_before_send_mail', 'my_wpcf7_choose_recipient');
function my_wpcf7_choose_recipient($WPCF7_ContactForm)
{
    // use $submission to access the POST data
    $submission = WPCF7_Submission::get_instance();
    $data = $submission->get_posted_data();
    $subject = $data['subject'];
    // use WPCF7_ContactForm->prop() to access form settings
    $mail = $WPCF7_ContactForm->prop('mail');
    $recipient = $mail['recipient'];
    // update a form property
    $WPCF7_ContactForm->set_properties(array('mail' => $mail));
}
Then in this function you can call your own plugin and pass $submission to it.
And if you want to alter the POST data, you can use the wpcf7_posted_data filter:
add_filter('wpcf7_posted_data', 'my_wpcf7_posted_data');
function my_wpcf7_posted_data($data)
{
    $data['subject'] = 'Test ' . $data['subject'];
    return $data;
}
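To cover the first part of the question (getting a form's fields as JSON), here is a rough sketch of a custom REST route. The 'myplugin/v1' namespace, the route, and the response shape are made up for illustration; scan_form_tags() is the CF7 method that lists a form's fields.
<?php
// Hypothetical sketch: expose a CF7 form's fields over a custom REST route.
add_action('rest_api_init', function () {
    register_rest_route('myplugin/v1', '/cf7/(?P<id>\d+)', array(
        'methods'             => 'GET',
        'permission_callback' => '__return_true',
        'callback'            => function ($request) {
            $form = WPCF7_ContactForm::get_instance((int) $request['id']);
            if (!$form) {
                return new WP_Error('not_found', 'Form not found', array('status' => 404));
            }
            $fields = array();
            foreach ($form->scan_form_tags() as $tag) {
                $fields[] = array('type' => $tag->type, 'name' => $tag->name);
            }
            return array('id' => $form->id(), 'title' => $form->title(), 'fields' => $fields);
        },
    ));
});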

I know this is two years old, but I needed the same exact thing so I made a plugin and happened to run across this post.
https://github.com/CodeBradley/contact-form-7-rest-api

Related

Google Sheets API OAuth Refresh Token Only Issued Once Per Account [duplicate]

I want to get the access token from Google. The Google API docs say that to get the access token, you send the authorization code and other parameters to the token-generating page, and the response will be a JSON object like:
{
"access_token" : "ya29.AHES6ZTtm7SuokEB-RGtbBty9IIlNiP9-eNMMQKtXdMP3sfjL1Fc",
"token_type" : "Bearer",
"expires_in" : 3600,
"refresh_token" : "1/HKSmLFXzqP0leUihZp2xUt3-5wkU7Gmu2Os_eBnzw74"
}
However, I'm not receiving the refresh token. The response in my case is:
{
"access_token" : "ya29.sddsdsdsdsds_h9v_nF0IR7XcwDK8XFB2EbvtxmgvB-4oZ8oU",
"token_type" : "Bearer",
"expires_in" : 3600
}
The refresh_token is only provided on the first authorization from the user. Subsequent authorizations, such as the kind you make while testing an OAuth2 integration, will not return the refresh_token again. :)
Go to the page showing Apps with access to your account:
https://myaccount.google.com/u/0/permissions.
Under the Third-party apps menu, choose your app.
Click Remove access and then click Ok to confirm.
The next OAuth2 request you make will return a refresh_token (provided that it also includes the 'access_type=offline' query parameter).
Alternatively, you can add the query parameters prompt=consent&access_type=offline to the OAuth redirect (see Google's OAuth 2.0 for Web Server Applications page).
This will prompt the user to authorize the application again and will always return a refresh_token.
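For illustration, a small PHP sketch of what the resulting authorization URL could look like; the client ID, redirect URI, and scope below are placeholders.
<?php
// Hypothetical values; only access_type and prompt matter for the refresh_token.
$authUrl = 'https://accounts.google.com/o/oauth2/v2/auth?' . http_build_query(array(
    'client_id'     => 'YOUR_CLIENT_ID',
    'redirect_uri'  => 'https://example.com/oauth2callback',
    'response_type' => 'code',
    'scope'         => 'https://www.googleapis.com/auth/drive.readonly',
    'access_type'   => 'offline',
    'prompt'        => 'consent',
));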
In order to get the refresh token you have to add both approval_prompt=force and access_type=offline.
If you are using the java client provided by Google it will look like this:
GoogleAuthorizationCodeFlow flow = new GoogleAuthorizationCodeFlow.Builder(
HTTP_TRANSPORT, JSON_FACTORY, getClientSecrets(), scopes)
.build();
AuthorizationCodeRequestUrl authorizationUrl =
flow.newAuthorizationUrl().setRedirectUri(callBackUrl)
.setApprovalPrompt("force")
.setAccessType("offline");
I'd like to add a bit more info on this subject for those frustrated souls who encounter this issue. The key to getting a refresh token for an offline app is to make sure you are presenting the consent screen. The refresh_token is only returned immediately after a user grants authorization by clicking "Allow".
The issue came up for me (and I suspect many others) after I'd been doing some testing in a development environment and therefore already authorized my application on a given account. I then moved to production and attempted to authenticate again using an account which was already authorized. In this case, the consent screen will not come up again and the api will not return a new refresh token. To make this work, you must force the consent screen to appear again by either:
prompt=consent
or
approval_prompt=force
Either one will work but you should not use both. As of 2021, I'd recommend using prompt=consent since it replaces the older parameter approval_prompt and in some api versions, the latter was actually broken (https://github.com/googleapis/oauth2client/issues/453). Also, prompt is a space delimited list so you can set it as prompt=select_account%20consent if you want both.
Of course you also need:
access_type=offline
Additional reading:
Docs: https://developers.google.com/identity/protocols/oauth2/web-server#request-parameter-prompt
Docs: https://developers.google.com/identity/protocols/oauth2/openid-connect#re-consent
Discussion about this issue: https://github.com/googleapis/google-api-python-client/issues/213
I searched for a whole night and this did the trick:
Modified user-example.php from admin-sdk
$client->setAccessType('offline');
$client->setApprovalPrompt('force');
$authUrl = $client->createAuthUrl();
echo "<a class='login' href='" . $authUrl . "'>Connect Me!</a>";
Then you get the code at the redirect URL, and authenticate with the code to get the refresh token:
$client->authenticate($_GET['code']);
echo $client->getRefreshToken();
You should store it now ;)
When your access key times out, just do:
$client->refreshToken($theRefreshTokenYouHadStored);
This has caused me some confusion so I thought I'd share what I've come to learn the hard way:
When you request access using the access_type=offline and approval_prompt=force parameters you should receive both an access token and a refresh token. The access token expires soon after you receive it and you will need to refresh it.
You correctly made the request to get a new access token and received the response that has your new access token. I was also confused by the fact that I didn't get a new refresh token. However, this is how it is meant to be since you can use the same refresh token over and over again.
I think some of the other answers assume that you wanted to get yourself a new refresh token for some reason and suggested that you re-authorize the user, but in actual fact you don't need to, since the refresh token you have will work until it is revoked by the user.
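To make that refresh step concrete, here is a minimal PHP sketch of the raw token request that library helpers such as refreshToken() perform under the hood; the client ID and secret are placeholders.
<?php
// Exchange the stored refresh_token for a fresh access_token.
// The same refresh_token can be reused for every renewal until it is revoked.
$refreshToken = 'STORED_REFRESH_TOKEN'; // whatever you persisted earlier
$ch = curl_init('https://oauth2.googleapis.com/token');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'client_id'     => 'YOUR_CLIENT_ID',
        'client_secret' => 'YOUR_CLIENT_SECRET',
        'refresh_token' => $refreshToken,
        'grant_type'    => 'refresh_token',
    )),
));
$token = json_decode(curl_exec($ch), true); // new access_token, but no new refresh_token
curl_close($ch);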
Rich Sutton's answer finally worked for me, after I realized that adding access_type=offline is done on the front end client's request for an authorization code, not the back end request that exchanges that code for an access_token. I've added a comment to his answer and this link at Google for more info about refreshing tokens.
P.S. If you are using Satellizer, here is how to add that option to the $authProvider.google in AngularJS.
In order to get the refresh_token you need to include access_type=offline in the OAuth request URL. When a user authenticates for the first time you will get back a non-nil refresh_token as well as an access_token that expires.
If you have a situation where a user might re-authenticate an account you already have an authentication token for (like #SsjCosty mentions above), you need to get back information from Google on which account the token is for. To do that, add profile to your scopes. Using the OAuth2 Ruby gem, your final request might look something like this:
client = OAuth2::Client.new(
  ENV["GOOGLE_CLIENT_ID"],
  ENV["GOOGLE_CLIENT_SECRET"],
  authorize_url: "https://accounts.google.com/o/oauth2/auth",
  token_url: "https://accounts.google.com/o/oauth2/token"
)

# Configure the authorization URL
client.authorize_url(
  scope: "https://www.googleapis.com/auth/analytics.readonly profile",
  redirect_uri: callback_url,
  access_type: "offline",
  prompt: "select_account"
)
Note the scope has two space-delimited entries, one for read-only access to Google Analytics, and the other is just profile, which is an OpenID Connect standard.
This will result in Google providing an additional attribute called id_token in the get_token response. To get information out of the id_token, check out this page in the Google docs. There are a handful of Google-provided libraries that will validate and “decode” this for you (I used the Ruby google-id-token gem). Once you get it parsed, the sub parameter is effectively the unique Google account ID.
Worth noting, if you change the scope, you'll get back a refresh token again for users that have already authenticated with the original scope. This is useful if, say, you have a bunch of users already and don't want to make them all un-auth the app in Google.
Oh, and one final note: you don't need prompt=select_account, but it's useful if you have a situation where your users might want to authenticate with more than one Google account (i.e., you're not using this for sign-in / authentication).
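If you are doing the same thing in PHP rather than Ruby, a minimal sketch with the google/apiclient library for reading that sub claim could look like this; the client ID and id_token values are placeholders.
<?php
// Hypothetical sketch: validate the id_token and read the stable account id.
$idToken = 'ID_TOKEN_FROM_TOKEN_RESPONSE'; // returned alongside the access_token
$client  = new Google_Client(array('client_id' => 'YOUR_CLIENT_ID'));
$payload = $client->verifyIdToken($idToken); // false if the token is invalid
if ($payload) {
    $googleAccountId = $payload['sub']; // unique, never-reused Google account ID
}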
1. How to get the 'refresh_token'?
Solution: the access_type='offline' option should be used when generating the auth URL.
Source: Using OAuth 2.0 for Web Server Applications
2. But even with 'access_type=offline', I am not getting the 'refresh_token'?
Solution: Please note that you will get it only on the first request. So if you store it somewhere, and your code overwrites it when fetching a new access_token after the previous one expires, make sure not to overwrite this value.
From the Google Auth docs (this value = access_type):
This value instructs the Google authorization server to return a refresh token and an access token the first time that your application exchanges an authorization code for tokens.
If you need 'refresh_token' again, then you need to remove access for your app as by following the steps written in Rich Sutton's answer.
I'm using the Node.js client to access private data.
The solution was to add the prompt property with the value consent to the settings object in the oAuth2Client.generateAuthUrl function.
Here is my code:
const getNewToken = (oAuth2Client, callback) => {
  const authUrl = oAuth2Client.generateAuthUrl({
    access_type: 'offline',
    prompt: 'consent',
    scope: SCOPES,
  })
  console.log('Authorize this app by visiting this url:', authUrl)
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  })
  rl.question('Enter the code from that page here: ', (code) => {
    rl.close()
    oAuth2Client.getToken(code, (err, token) => {
      if (err) return console.error('Error while trying to retrieve access token', err)
      oAuth2Client.setCredentials(token)
      // Store the token to disk for later program executions
      fs.writeFile(TOKEN_PATH, JSON.stringify(token), (err) => {
        if (err) return console.error(err)
        console.log('Token stored to', TOKEN_PATH)
      })
      callback(oAuth2Client)
    })
  })
}
You can use an online parameter extractor to get the code for generating your token:
Online parameters extractor
Here is the complete code from google official docs:
https://developers.google.com/sheets/api/quickstart/nodejs
I hope the information is useful
Setting this will cause the refresh token to be sent every time:
$client->setApprovalPrompt('force');
an example is given below (php):
$client = new Google_Client();
$client->setClientId($client_id);
$client->setClientSecret($client_secret);
$client->setRedirectUri($redirect_uri);
$client->addScope("email");
$client->addScope("profile");
$client->setAccessType('offline');
$client->setApprovalPrompt('force');
For me, I was trying out the CalendarSampleServlet provided by Google. After 1 hour the access key timed out and there was a redirect to a 401 page. I tried all the above options but they didn't work. Finally, upon checking the source code for 'AbstractAuthorizationCodeServlet', I could see that redirection would be disabled if credentials are present, but ideally it should have checked for refresh token != null. I added the code below to CalendarSampleServlet and it worked after that. Great relief after so many hours of frustration. Thank God.
if (credential.getRefreshToken() == null) {
    AuthorizationCodeRequestUrl authorizationUrl = authFlow.newAuthorizationUrl();
    authorizationUrl.setRedirectUri(getRedirectUri(req));
    onAuthorization(req, resp, authorizationUrl);
    credential = null;
}
Using offline access and prompt:consent worked well for me:
auth2 = gapi.auth2.init({
    client_id: '{client_id}'
});
auth2.grantOfflineAccess({prompt: 'consent'}).then(signInCallback);
In order to get a new refresh_token on each authentication, the type of OAuth 2.0 credentials created in the dashboard should be "Other". Also, as mentioned above, the access_type='offline' option should be used when generating the auth URL.
When using credentials with type "Web application" no combination of prompt/approval_prompt variables will work - you will still get the refresh_token only on the first request.
To get a refresh token using Postman, here is an example of the request configuration and the expected response (screenshots not included here).
Now Google refuses those parameters in my request (access_type, prompt), and there is no "Revoke Access" button at all, so I was stuck trying to get my refresh_token back.
UPDATE:
I found the answer here: you can get the refresh token back by revoking the current token with a request:
https://developers.google.com/identity/protocols/OAuth2WebServer
curl -H "Content-type:application/x-www-form-urlencoded" \
https://accounts.google.com/o/oauth2/revoke?token={token}
The token can be an access token or a refresh token. If the token is an access token and it has a corresponding refresh token, the refresh token will also be revoked.
If the revocation is successfully processed, then the status code of the response is 200. For error conditions, a status code 400 is returned along with an error code.
#!/usr/bin/env perl
use strict;
use warnings;
use 5.010_000;
use utf8;
binmode STDOUT, ":encoding(utf8)";
use Text::CSV_XS;
use FindBin;
use lib $FindBin::Bin . '/../lib';
use Net::Google::Spreadsheets::V4;
use Net::Google::DataAPI::Auth::OAuth2;
use lib 'lib';
use Term::Prompt;
use Net::Google::DataAPI::Auth::OAuth2;
use Net::Google::Spreadsheets;
use Data::Printer ;
my $oauth2 = Net::Google::DataAPI::Auth::OAuth2->new(
client_id => $ENV{CLIENT_ID},
client_secret => $ENV{CLIENT_SECRET},
scope => ['https://www.googleapis.com/auth/spreadsheets'],
);
my $url = $oauth2->authorize_url();
# system("open '$url'");
print "go to the following url with your browser \n" ;
print "$url\n" ;
my $code = prompt('x', 'paste code: ', '', '');
my $objToken = $oauth2->get_access_token($code);
my $refresh_token = $objToken->refresh_token() ;
print "my refresh token is : \n" ;
# debug p($refresh_token ) ;
p ( $objToken ) ;
my $gs = Net::Google::Spreadsheets::V4->new(
client_id => $ENV{CLIENT_ID}
, client_secret => $ENV{CLIENT_SECRET}
, refresh_token => $refresh_token
, spreadsheet_id => '1hGNULaWpYwtnMDDPPkZT73zLGDUgv5blwJtK7hAiVIU'
);
my($content, $res);
my $title = 'My foobar sheet';
my $sheet = $gs->get_sheet(title => $title);
# create a sheet if does not exit
unless ($sheet) {
($content, $res) = $gs->request(
POST => ':batchUpdate',
{
requests => [
{
addSheet => {
properties => {
title => $title,
index => 0,
},
},
},
],
},
);
$sheet = $content->{replies}[0]{addSheet};
}
my $sheet_prop = $sheet->{properties};
# clear all cells
$gs->clear_sheet(sheet_id => $sheet_prop->{sheetId});
# import data
my @requests = ();
my $idx = 0;
my @rows = (
[qw(name age favorite)], # header
[qw(tarou 31 curry)],
[qw(jirou 18 gyoza)],
[qw(saburou 27 ramen)],
);
for my $row (@rows) {
push @requests, {
pasteData => {
coordinate => {
sheetId => $sheet_prop->{sheetId},
rowIndex => $idx++,
columnIndex => 0,
},
data => $gs->to_csv(@$row),
type => 'PASTE_NORMAL',
delimiter => ',',
},
};
}
# format a header row
push @requests, {
repeatCell => {
range => {
sheetId => $sheet_prop->{sheetId},
startRowIndex => 0,
endRowIndex => 1,
},
cell => {
userEnteredFormat => {
backgroundColor => {
red => 0.0,
green => 0.0,
blue => 0.0,
},
horizontalAlignment => 'CENTER',
textFormat => {
foregroundColor => {
red => 1.0,
green => 1.0,
blue => 1.0
},
bold => \1,
},
},
},
fields => 'userEnteredFormat(backgroundColor,textFormat,horizontalAlignment)',
},
};
($content, $res) = $gs->request(
POST => ':batchUpdate',
{
requests => \@requests,
},
);
exit;
#Google Sheets API, v4
# Scopes
# https://www.googleapis.com/auth/drive View and manage the files in your Google Drive
# https://www.googleapis.com/auth/drive.file View and manage Google Drive files and folders that you have opened or created with this app
# https://www.googleapis.com/auth/drive.readonly View the files in your Google Drive
# https://www.googleapis.com/auth/spreadsheets View and manage your spreadsheets in Google Drive
# https://www.googleapis.com/auth/spreadsheets.readonly View your Google Spreadsheets
My solution was a bit weird. I tried every solution I found on the internet and nothing worked. Surprisingly, this worked: delete the credentials.json, refresh, and link your app to your account again. The new credentials.json file will have the refresh token. Back this file up somewhere.
Then keep using your app until the refresh-token error comes up again. Delete the credentials.json file, which by then only contains an error message (this happened in my case), then paste your old credentials file back into the folder, and it's done!
It's been a week since I did this and I have had no more problems.
Adding access_type=offline to the Google authorisation URL did the trick for me. I am using Java and the Spring framework.
This is the code that creates the client registration:
return CommonOAuth2Provider.GOOGLE
.getBuilder(client)
.scope("openid", "profile", "email", "https://www.googleapis.com/auth/gmail.send")
.authorizationGrantType(AuthorizationGrantType.AUTHORIZATION_CODE)
.authorizationUri("https://accounts.google.com/o/oauth2/v2/auth?access_type=offline")
.clientId(clientId)
.redirectUriTemplate("{baseUrl}/{action}/oauth2/code/{registrationId}")
.clientSecret(clientSecret)
.build();
The important part here is the authorization URI, to which ?access_type=offline is appended.

How do I create a webhook in clickfunnels to send information to Wishlist Member

The instructions in ClickFunnels ask me to start by creating a test endpoint URL:
Creating A Test Endpoint.
First, you will need to create a test endpoint at <your-domain/funnel_webhooks/test>
And it should include the headers below.
Content-Type as application/json
X-Clickfunnels-Webhook-Delivery-Id as an MD5 of the URL and Payload.
The payload (HTTP message body) will be a JSON object with a key of "time" and value of the current time in UTC as follows:
{ "time": "YYYY-MM-DD HH:MM:SS UTC" }
I went into the file manager in my hosting account and added a folder for funnel_webhooks and a file called test.
I think I changed the content type to JSON.
And I think I figured out how to create the JSON object within the file.
I'm not sure how to do this part:
X-Clickfunnels-Webhook-Delivery-Id as an MD5 of the URL and Payload.
This loom video will show where I'm at so far in the process.
https://www.loom.com/share/1c9be96014b8413a8c9ba54f56dd42a8
Any support would be greatly appreciated!
It's actually much simpler than you think. They just want to verify you own the domain by getting a 200 response from any file you place in the folder on your server.
Create this folder structure on your server in the root and add an index.php file or index.html file like this:
/funnel_webhooks/test/index.php (I wrote "hello" in this file for good measure)
Go to Click Funnels. Click your Funnel. Click Settings. Scroll to bottom and click Manage Your Funnel Webhooks
Create a New webhook with the URL: whateveryourdomain/funnel_webhooks/test/index.php
Click Create Funnel Webhook Button
If your file is in the right location then ClickFunnels will use this to verify you own the domain.
All of your webhooks going forward will work with no problem as long as you use this domain that was verified.
let me give it a try!
What you are doing right now is just creating a JSON file, which is not going to receive the webhook event from ClickFunnels.
Start by creating a PHP file if your server is compatible with PHP.
And in it just write this code and try it out.
<?php
// set the headers.
header("Content-Type: application/json");
header("X-Clickfunnels-Webhook-Delivery-Id: MD5 of CONTENTS HERE");
// check if there is input data.
if ($json = json_decode(file_get_contents("php://input"), true)) {
    // print and set the data as a variable
    print_r($json);
    $data = $json;
} else {
    // try getting the POST data.
    print_r($_POST);
    $data = $_POST;
}
// set the response as OK - 200
http_response_code(200); // response code for OK.
echo json_encode(array("status" => 'OK', "code" => 1, "payload" => $data));
// save the data to a file to check the information that was sent by the webhook.
$file = 'webhook_contents.txt';
$current = file_exists($file) ? file_get_contents($file) : '';
$current .= date("Y-m-d h:i:s") . json_encode($data);
file_put_contents($file, $current);
// From here, it all depends on what you want to do with the webhook event data.
?>
This code should set the headers, read a JSON payload, and echo a response containing said payload.
At the same time it should create a file with the name "webhook_contents.txt" in which you should be able to read the webhook event payload.
Let me know if it works!
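For the part of the question that was left as a placeholder above (the X-Clickfunnels-Webhook-Delivery-Id header), here is a minimal sketch. It assumes "an MD5 of the URL and Payload" means md5() of the request URL concatenated with the JSON payload; ClickFunnels' documentation does not spell out the exact formula, so treat this as a starting point.
<?php
// Assumption: delivery id = md5(current URL . JSON payload). Adjust if
// ClickFunnels expects a different concatenation.
$url     = (empty($_SERVER['HTTPS']) ? 'http://' : 'https://')
         . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
$payload = json_encode(array('time' => gmdate('Y-m-d H:i:s') . ' UTC'));

header('Content-Type: application/json');
header('X-Clickfunnels-Webhook-Delivery-Id: ' . md5($url . $payload));
echo $payload;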

What parameters does Contact Form 7 expect when sending JSON via the API?

I want to create an API for Contact Form 7.
How do I send data from the front end to Contact Form 7 using the WP REST API?
I mean, what should the data structure be to send it via the POST method?
http://xx.xxx/wp-json/contact-form-7/v1/contact-forms/<id-form>/feedback
I have tried different ways, but the request always returns the response “validation_failed”, “One or more fields contain erroneous data. Please check them and try again.”
I did not find anything about this in the documentation.
Were you able to find the solution? I've been working with the Contact Form 7 REST API and there are a few things you need to do to be able to get a 'success' response instead of validation_failed.
First, you need to know which form fields you need to submit. The field names are defined in your CF7 contact form; most likely, CF7 uses the naming structure your-name and your-email. So you will need to format your post body to match this.
Next, you will need to submit it using FormData() https://developer.mozilla.org/en-US/docs/Web/API/FormData. From personal experience, I found that if I send my request as a normal object by using post, CF7 sends back validation_failed.
Note: I am using Nuxt's http package to submit data, but you are able to use axios here.
// Format your body response
const emailBody = {
  "your-name": this.form.name,
  "your-email": this.form.email,
  "your-message": this.form.message,
};

// Create a FormData object, and append each field to the object
const form = new FormData();
for (const field in emailBody) {
  form.append(field, emailBody[field]);
}

// Submit your form body using axios, or any other way you would like
const response = await this.$http.post(this.getEndEndpoint, form);
This is working for me; I am no longer getting the validation_failed status. Instead I now get a spam status, which I'm trying to solve now.
Good luck
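If you need to submit from the server side instead of the browser, a rough PHP equivalent is sketched below. The site URL, form ID, and field names are placeholders for whatever your CF7 form defines; passing an array (not a query string) to CURLOPT_POSTFIELDS makes cURL send multipart/form-data, which is what the feedback endpoint expects.
<?php
// Hypothetical endpoint and field names - match them to your own form.
$ch = curl_init('https://example.com/wp-json/contact-form-7/v1/contact-forms/123/feedback');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => array(
        'your-name'    => 'Jane Doe',
        'your-email'   => 'jane@example.com',
        'your-message' => 'Hello from the REST API',
    ),
));
$response = json_decode(curl_exec($ch), true);
curl_close($ch);
// $response['status'] should be "mail_sent" on success,
// or something like "validation_failed" or "spam" otherwise.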
add_filter( 'wpcf7_mail_components', 'show_cf7_request', 10, 3 );
function show_cf7_request( $components, $wpcf7_get_current_contact_form, $instance ) {
    print_r( $_REQUEST );
    die();
    return $components;
}
Don't try on LIVE ;)
// Google reCAPTCHA v3 integration with the Contact Form 7 REST API
let email = $('input.email').val();
let g_recaptcha_response = $('textarea.g-recaptcha-response').val();
let data = new FormData(form);
data.append("email", email);
data.append("_wpcf7_recaptcha_response", g_recaptcha_response);
// the _wpcf7_recaptcha_response key is important and must match exactly
$.ajax({
    type: "POST",
    enctype: 'multipart/form-data',
    url: "/wp-json/contact-form-7/v1/contact-forms/783/feedback",
    data: data,
    processData: false,
    contentType: false,
    cache: false,
    timeout: 600000,
}).then((data) => { alert(data.message); });

How to use update function to upload attachment in CouchDB

I would like to know what I can do to upload attachments in CouchDB using an update function.
Here you will find an example of my update function for adding documents:
function(doc, req) {
    if (!doc) {
        if (!req.form._id) {
            req.form._id = req.uuid;
        }
        req.form['|edited_by'] = req.userCtx.name;
        req.form['|edited_on'] = new Date();
        return [req.form, JSON.stringify(req.form)];
    } else {
        return [null, "Use POST to add a document."];
    }
}
And an example for removing documents:
function(doc, req) {
    if (doc) {
        for (var i in req.form) {
            doc[i] = req.form[i];
        }
        doc['|edited_by'] = req.userCtx.name;
        doc['|edited_on'] = new Date();
        doc._deleted = true;
        return [doc, JSON.stringify(doc)];
    } else {
        return [null, "Document does not exist."];
    }
}
thanks for your help,
It is possible to add attachments to a document using an update function by modifying the document's _attachments property. Here's an example of an update function which will add an attachment to an existing document:
function (doc, req) {
    // skipping the create document case for simplicity
    if (!doc) {
        return [null, "update only"];
    }
    // ensure that the required form parameters are present
    if (!req.form || !req.form.name || !req.form.data) {
        return [null, "missing required post fields"];
    }
    // if there isn't an _attachments property on the doc already, create one
    if (!doc._attachments) {
        doc._attachments = {};
    }
    // create the attachment using the form data POSTed by the client
    doc._attachments[req.form.name] = {
        content_type: req.form.content_type || 'application/octet-stream',
        data: req.form.data
    };
    return [doc, "saved attachment"];
}
For each attachment, you need a name, a content type, and body data encoded as base64. The example function above requires that the client sends an HTTP POST in application/x-www-form-urlencoded format with at least two parameters: name and data (a content_type parameter will be used if provided):
name=logo.png&content_type=image/png&data=iVBORw0KGgoA...
To test the update function:
Find a small image and base64 encode it:
$ base64 logo.png | sed 's/+/%2b/g' > post.txt
The sed script encodes + characters so they don't get converted to spaces.
Edit post.txt and add name=logo.png&content_type=image/png&data= to the top of the document.
Create a new document in CouchDB using Futon.
Use curl to call the update function with the post.txt file as the body, substituting in the ID of the document you just created.
curl -X POST -d @post.txt http://127.0.0.1:5984/mydb/_design/myddoc/_update/upload/193ecff8618678f96d83770cea002910
This was tested on CouchDB 1.6.1 running on OSX.
Update: @janl was kind enough to provide some details on why this answer can lead to performance and scaling issues. Uploading attachments via an upload handler has two main problems:
The upload handlers are written in JavaScript, so the CouchDB server may have to fork() a couchjs process to handle the upload. Even if a couchjs process is already running, the server has to stream the entire HTTP request to the external process over stdin. For large attachments, the transfer of the request can take significant time and system resources. For each concurrent request to an update function like this, CouchDB will have to fork a new couchjs process. Since the process runtime will be rather long because of what is explained next, you can easily run out of RAM, CPU or the ability to handle more concurrent requests.
After the _attachments property is populated by the upload handler and streamed back to the CouchDB server (!), the server must parse the response JSON, decode the base64-encoded attachment body, and write the binary body to disk. The standard method of adding an attachment to a document -- PUT /db/docid/attachmentname -- streams the binary request body directly to disk and does not require the two processing steps.
The function above will work, but there are non-trivial issues to consider before using it in a highly-scalable system.
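For comparison, here is a small PHP sketch of that standard PUT /db/docid/attachmentname route; the database, document ID, and revision below are placeholders, and the current _rev must be supplied (for example via the rev query parameter).
<?php
// Upload an attachment directly, bypassing the update handler.
$db  = 'http://127.0.0.1:5984/mydb';
$doc = '193ecff8618678f96d83770cea002910';   // document ID
$rev = '1-CURRENT_REVISION';                 // current _rev of that document
$ch  = curl_init("$db/$doc/logo.png?rev=$rev");
curl_setopt_array($ch, array(
    CURLOPT_CUSTOMREQUEST  => 'PUT',
    CURLOPT_POSTFIELDS     => file_get_contents('logo.png'), // raw binary body
    CURLOPT_HTTPHEADER     => array('Content-Type: image/png'),
    CURLOPT_RETURNTRANSFER => true,
));
echo curl_exec($ch); // e.g. {"ok":true,"id":"...","rev":"2-..."}
curl_close($ch);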

How do I get the cookie values in a backbone.js model?

I have a simple Backbone.js/Bootstrap front end in HTML5 with a Node.js/Restify backend. I am setting cookies in a header response from the server as below:
res.setHeader("Set-Cookie", ["token=ninja", "language=javascript"]);
On the client side, I am making a REST call as
var response = this.model.fetch().success(function(data){
//success
}).error(function(data){
//error
}).complete(function(data){
//complete
});
which calls back into a parse method in the model.
How can I read the cookie value in the model?
Include Cookie.js.
You can then reference individual cookies like this:
var token = Cookie.get('token');
// token == 'ninja'
Here is what I figured out. My application has two components: the HTML/JS on one domain that talks to a REST service on another domain (and is therefore cross-domain). Because the cookie is set by the REST service, it appears it is not readable across domains, so the web page will not store the cookie even though the server is sending it. One alternative is to use local cookies or to use the technique illustrated at http://backbonetutorials.com/cross-domain-sessions/.
Assuming you are using jQuery with Backbone, you can get the headers by defining the parse function in your model by calling getAllResponseHeaders or getResponseHeader:
var model = Backbone.Model.extend({
    // the rest of your model
    parse: function(resp, xhr) {
        var allHeaders = xhr.getAllResponseHeaders();
        var cookieHeader = xhr.getResponseHeader("Set-Cookie");
        // do something with the headers
        return resp;
    }
});