I have a headless application written in Yii2, with an Angular application consuming the Yii2 API. Currently I'm using local storage for tokens, but I read this link and would like to store the token in a cookie.
Auth action:
\Yii::$app->response->cookies->add(new Cookie([
'name' => 'token',
'value'=> $token->__toString()
]));
AuthMethod:
if (($cookie = $cookies->get('token')) !== null) {
die('Token found in cookie');
$token = $parser->parse($cookie->value);
}
The token is always null, so it seems cookies are disabled by default in REST controllers / JSON responses. How can I enable this?
For future reference, if the link is dead: it concludes that cookies are better than local storage for JWT tokens.
Cookies, when used with the HttpOnly cookie flag, are not accessible through JavaScript, and are immune to XSS. You can also set the Secure cookie flag to guarantee the cookie is only sent over HTTPS. This is one of the main reasons that cookies have been leveraged in the past to store tokens or session data. Modern developers are hesitant to use cookies because they traditionally required state to be stored on the server, thus breaking RESTful best practices. Cookies as a storage mechanism do not require state to be stored on the server if you are storing a JWT in the cookie. This is because the JWT encapsulates everything the server needs to serve the request.
EDIT
Using the native PHP $_COOKIE, the cookie can be read by the Yii2 application, but setcookie() does not work. It looks like the yii2-rest controller strips away the headers before sending the response.
I will make my first attempt at an answer; I hope it makes sense and helps you and others.
I asked about your Angular version because, as you know, Angular works as a single-page app, and I need proof-of-concept code to show my point «I will assume AngularJS 1.6.x». It runs apart from your Yii2 application, unless you are rendering Angular on each call. Now there are two ways to “bypass” this in order to set up the cookie.
The first is to set the cookie inside Angular: in your controller, call your login endpoint and make it return the token (in my example I am not doing that part, just for speed), and then build the cookie using the $cookies service «you will have to import angular-cookies».
The other way is to call an endpoint inside Yii «look at the actionSetCookie example» and build the cookie inside it; I don't like that one as much, it looks dirty to me.
Now here comes the big problem you may have: Yii uses cookie validation that “signs” the cookie, but when you build the cookie outside Yii, it will not work. So, to make this example work, you will have to turn off cookie validation.
'components' => [
'request' => [
'enableCookieValidation' => false, //<---make it false
'enableCsrfValidation' => true,
'cookieValidationKey' => 'hdfghdsrthrgfdhfghthdrtth',
'parsers' => [
'application/json' => 'yii\web\JsonParser',
]
],
...
]
Now, I made a working test.
First I made two simple actions to test that I was actually building the cookie, and that I was able to call them with Postman.
public function actionSetCookie()
{
Yii::$app->response->format = Response::FORMAT_JSON;
$session = Yii::$app->session;
$cookies_rsp = Yii::$app->response->cookies;
$cookies_rsp->add(new Cookie([
'name' => 'token',
'value' => 'tt-eyJhbGciOiJIUzI1NiIs...',
]));
$response = ['cookie set'=>$cookies_rsp];
return $response;
}
public function actionCookie()
{
Yii::$app->response->format = Response::FORMAT_JSON;
$session = Yii::$app->session;
$cookies = Yii::$app->request->cookies;
$token = '';
if (isset($cookies['token'])) {
$token = $cookies['token']->value;
}
$response = ['token'=>$token];
return $response;
}
And I made two controllers in Angular to illustrate the explanation:
.controller('View1Ctrl', ['$cookies', function ($cookies) {
console.log('View1Ctrl++++++++');
var vm = this;
vm.title = 'Customers';
$cookies.remove('token');
$cookies.put('token', 'my-angular-token');
vm.myToken = $cookies.get('token');
console.log('myToken', vm.myToken);
}])
.controller('View2Ctrl', ['$http', function ($http) {
console.log('View2Ctrl');
// Let's set the cookie
$http({
method: 'GET',
url: 'http://test.dev/site/set-cookie'
}).then(function successCallback(response) {
console.log('response', response);
}, function errorCallback(response) {
console.error('err', response)
});
}]);
Now here is how it goes:
In solution 1 «View1Ctrl», I build the cookie using the $cookies service, and I can verify it using actionCookie; it will be readable only if enableCookieValidation is false.
In solution 2 «View2Ctrl», I use actionSetCookie as an HTTP GET endpoint that does little more than set the cookie; this cookie will work with enableCookieValidation set to false or true.
Conclusion
Remember that Angular and Yii2 are "supposed" to be agnostic and independent, so you will have to consume Yii as an endpoint.
You will have to set enableCookieValidation depending on your solution. I am not sure whether there is a way to do the signing from Angular, but it is probably not a good idea, because you would have to publish the cookieValidationKey inside Angular.
Personally, I don't like using cookies for APIs, because staying stateless and cookieless helps if you are developing a mobile app «I might be wrong about that».
About Postman: it will work with solution 2; manually created cookies only work if you turn off enableCookieValidation. Remember that this validation adds a signature (salt) to the token inside the cookie; that is Yii's additional security.
Finally, on Postman, if enableCookieValidation is set to true and you are manually creating the cookie, Yii will not receive the cookie, because of that security.
To illustrate this security around the signing of cookies, I captured a video; I hope it helps. This happens because enableCookieValidation is true, and it is a reason not to use PHP's default cookie functions but the ones Yii provides. In the video you will see how the signing is specific to each cookie, and why you may not see the cookie in Postman.
If you set enableCookieValidation to false, these manual cookies (PHP default cookies) will work again, but it is less secure.
Video
About the discussion on the blog and Angular: remember that Angular actually protects your app when it makes $http calls (it has built-in XSRF protection), but also don't forget to use ngSanitize in your app (see the sketch after the links below).
More on this:
https://docs.angularjs.org/api/ng/service/$http#cross-site-request-forgery-xsrf-protection
https://docs.angularjs.org/api/ngSanitize
https://docs.angularjs.org/api/ngCookies/service/$cookies
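To make that concrete, here is a minimal sketch (mine, not from the original post) of wiring up both protections, assuming AngularJS 1.6.x with angular-cookies and angular-sanitize loaded; the module and controller names are placeholders:
// A minimal sketch; "myApp" and CommentCtrl are placeholders.
angular.module('myApp', ['ngCookies', 'ngSanitize'])
  .config(['$httpProvider', function ($httpProvider) {
    // These are $http's defaults: on requests to your own domain it copies the
    // XSRF-TOKEN cookie (set by the server) into an X-XSRF-TOKEN header.
    $httpProvider.defaults.xsrfCookieName = 'XSRF-TOKEN';
    $httpProvider.defaults.xsrfHeaderName = 'X-XSRF-TOKEN';
  }])
  .controller('CommentCtrl', function () {
    // Anything bound with ng-bind-html is run through ngSanitize,
    // which strips dangerous markup before it reaches the DOM.
    this.userComment = '<b>hello</b><img src=x onerror=alert(1)>';
  });
Note that the cookie and header names have to match whatever the backend expects; Yii2's CSRF defaults use different names, so one side needs to be configured accordingly.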
Finally, if I find a way to secure the cookie from Angular the way Yii2 does, I will add it to this post.
Hope it helps.
Just in case you'd like to look at my code:
https://github.com/moplin/testdev
If you want to access cookies on the server side, you have to send them in your request as well. By default, Angular does not send cookies in cross-site XHR requests. To enable cookies in the request, add the following code:
.config(function ($httpProvider) {
$httpProvider.defaults.withCredentials = true;
});
OR
$http.get("URL", { withCredentials: true })
.success(function(data, status, headers, config) {})
.error(function(data, status, headers, config) {});
See the Usage section in Angular's $http documentation.
Also make sure you are setting that cookie on the client side using JavaScript, then check the HTTP request in your Chrome console to confirm the cookie is being sent with it.
Make sure you read cookies from $cookies = Yii::$app->request->cookies;
http://www.yiiframework.com/doc-2.0/guide-runtime-sessions-cookies.html#reading-cookies
And write cookies using $cookies = Yii::$app->response->cookies;
http://www.yiiframework.com/doc-2.0/guide-runtime-sessions-cookies.html#sending-cookies
If the token is an access token, you could use the authenticator behaviors in your REST controller:
yii\filters\auth\HttpBasicAuth;
yii\filters\auth\HttpBearerAuth;
yii\filters\auth\QueryParamAuth;
and when the token is received for the first time, save it to a cookie in the browser from Angular (a rough sketch of that Angular side follows).
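Here is a rough sketch of that Angular side (mine, not from the answer above); the /auth/login URL, the response shape, and the cookie name 'token' are assumptions, and it presumes yii\filters\auth\HttpBearerAuth is attached to the Yii2 controller:
// A sketch only; the URL, the response shape and the cookie name are assumptions.
angular.module('myApp', ['ngCookies'])
  .run(['$http', '$cookies', function ($http, $cookies) {
    // If a token was stored on a previous visit, send it as a Bearer header
    // so HttpBearerAuth on the Yii2 side can authenticate the request.
    var token = $cookies.get('token');
    if (token) {
      $http.defaults.headers.common['Authorization'] = 'Bearer ' + token;
    }
  }])
  .controller('LoginCtrl', ['$http', '$cookies', function ($http, $cookies) {
    var vm = this;
    vm.login = function (credentials) {
      return $http.post('/auth/login', credentials).then(function (response) {
        // First time we receive the token: keep it in a cookie and start
        // sending it on every subsequent request.
        $cookies.put('token', response.data.token);
        $http.defaults.headers.common['Authorization'] = 'Bearer ' + response.data.token;
      });
    };
  }]);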
Related
When I create a new Google Cloud function, the default code given is:
const functions = require('@google-cloud/functions-framework');
functions.http('helloHttp', (req, res) => {
res.send(`Hello ${req.query.name || req.body.name || 'World'}!`);
});
However, in this tutorial, the function is specified as:
exports.validateTemperature = async (req, res) => {
try {
if (req.body.temp < 100) {
res.status(200).send("Temperature OK");
} else {
res.status(200).send("Too hot");
}
} catch (error) {
//return an error
console.log("got error: ", error);
res.status(500).send(error);
}
};
What is the difference between the two? How do they fit into the bigger scheme of things?
In the second example, the code is listening for an HTTP POST request. Where is this specified?
With the two methods you showed, the result is the same: an HTTP path is used to receive the Request and Response objects.
Inside the Request object you will find the body (usually populated in POST and PUT requests, but not only those) and the method (GET, POST, PUT, etc.); see the short example below.
Therefore, your Cloud Function code will be reached by both GET and POST calls in either solution.
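For example (a sketch, not from the answer), you can see both pieces by logging them inside the handler:
// A sketch: both registration styles hand you the same Express-style objects.
exports.inspectRequest = (req, res) => {
  console.log('method:', req.method); // GET, POST, PUT, ...
  console.log('body:', req.body);     // parsed automatically for JSON POST/PUT bodies
  res.status(200).send(`You sent a ${req.method} request`);
};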
Functions Framework
The Functions Framework turns a Cloud Functions snippet into a workable server. It's kind of like registering a handler function on an Express router and running the Express app behind the scenes.
The main use cases are local development and migrating to App Engine or other services. Both need to start an HTTP server, and the Functions Framework does that; a sketch of the equivalence follows.
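As a sketch (mine, not official sample code), the tutorial-style handler from the question can be registered through the Functions Framework like this, and the behaviour is essentially the same as exporting it directly:
// index.js — a sketch: the plain (req, res) handler from the question,
// registered explicitly via the Functions Framework.
const functions = require('@google-cloud/functions-framework');

const validateTemperature = async (req, res) => {
  try {
    if (req.body.temp < 100) {
      res.status(200).send("Temperature OK");
    } else {
      res.status(200).send("Too hot");
    }
  } catch (error) {
    console.log("got error: ", error);
    res.status(500).send(error);
  }
};

// Roughly equivalent to `exports.validateTemperature = validateTemperature;`
// on the Node.js runtime; the framework just makes the HTTP registration
// explicit and lets you run the same code locally, e.g. with
// `npx functions-framework --target=validateTemperature`.
functions.http('validateTemperature', validateTemperature);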
Express
The Node.js runtime runs your code snippet using the Express framework.
validateTemperature accepts all HTTP methods.
We often filter HTTP methods with a router, although you can do it with req.method (see the sketch below); routing and other web-server-level concerns are things you don't want to repeat again and again.
If you want to split requests by method when building an API service, you could consider letting Cloud Endpoints or an API gateway stand in front of your functions.
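If you do want to restrict the verb inside the function itself, a minimal sketch (mine, not from the tutorial) could look like this:
// A sketch: the platform routes every HTTP method to the same function,
// so reject anything that is not a POST before doing the work.
exports.validateTemperature = async (req, res) => {
  if (req.method !== 'POST') {
    res.set('Allow', 'POST');
    return res.status(405).send('Method Not Allowed');
  }
  // ... the temperature check from the question goes here ...
  res.status(200).send(req.body.temp < 100 ? 'Temperature OK' : 'Too hot');
};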
I need to access the Infusionsoft API without user interaction. I do not want to make the user click on a link so I can get a token. Is it possible?
$infusionsoft = new Infusionsoft\Infusionsoft(array(
'clientId' => '...',
'clientSecret' => '...',
'redirectUri' => '...',
));
// If the serialized token is available in the session storage, we tell the SDK
// to use that token for subsequent requests.
if (isset($_SESSION['token'])) {
$infusionsoft->setToken(unserialize($_SESSION['token']));
}
// If we are returning from Infusionsoft we need to exchange the code for an
// access token.
if (isset($_GET['code']) and !$infusionsoft->getToken()) {
$infusionsoft->requestAccessToken($_GET['code']);
}
if ($infusionsoft->getToken()) {
// Save the serialized token to the current session for subsequent requests
$_SESSION['token'] = serialize($infusionsoft->getToken());
// MAKE INFUSIONSOFT REQUEST
} else {
echo 'Click here to authorize';
}
Make 3 files:
Request_new_token.php. It is similar to your code (it needs to run only once), but you will have to save the token to a database or a text file.
//Convert object to string
$token = serialize($infusionsoft->requestAccessToken($_GET['code']));
//Update the token in database.
$update = new Update("systemsettings");
$update->addColumn('systemsettings_strvalue', $token);
$update->run(1);
exit;
Refresh_token.php. With the saved token, you will need to refresh it within 21 hours. I suggest using a cron job to run it automatically on the server back end.
General_request.php (up to your system preference). Whenever you need to make a single GET/PUT/POST request, you just need to instantiate the Infusionsoft object and set its token from the value saved in the database.
Good luck!
If you're looking to interact with the API without getting access via the newer OAuth methods, you'll need to use the deprecated legacy API, which uses an API key from the actual Infusionsoft application. The upside is that unless the user changes their API key, you don't need to "renew" or "refresh" the token, and you don't need the user to click through and authorize the app.
The big downside, of course, is that this older API has been deprecated and all new applications need to use OAuth.
What is the use case where you can't walk the users through an OAuth authentication flow?
I'm trying to get Dojo to show JSON data that comes from a remote web service. I need to be clear though: the web server hosting the HTML/Dojo page I access isn't the same server as the one running the web service that returns the JSON data; the web service server just can't serve HTML pages reliably (don't ask!!).
As a test I moved the page onto the same web server as the web service, and the code below works. As soon as I move it back, so that the HTML/Dojo is served from Apache (//myhost.nodomain:82, say) and the web service sending the JSON is "{target: http://myhost.nodomain:8181}", it stops working.
I've used Firefox to look at the network and I see the web service being called OK; the JSON data is returned too and looks correct (I know it is, from the previous test), but the fields are no longer set. I've tried this with DataGrid and with the plain page below, with the same effect.
Am I tripping up over something obvious???
Thanks
require([
"dojo/store/JsonRest",
"dojo/store/Memory",
"dojo/store/Cache",
"dojox/grid/DataGrid",
"dojo/data/ObjectStore",
"dojo/query",
"dojo/domReady!"
],
function(JsonRest, Memory, Cache, DataGrid, ObjectStore, query) {
var myStore, dataStore, grid;
myStore = JsonRest(
{
target: "http://localhost:8181/ws/job/definition/",
idProperty: "JOB_NAME"
}
);
myStore.query("JOB00001"
).then(function(results) {
var theJobDef = results[0];
dojo.byId("JOB_NAME").innerHTML = theJobDef.JOB_NAME;
dojo.byId("SCHEDULED_DAYS").innerHTML = theJobDef.SCHEDULED_DAYS;
});
}
);
It's true what Frans said about the cross-domain restriction, but Dojo has the dojo/request/iframe module to work around the problem:
require(["dojo/request/iframe"], function(iframe){
iframe("something.xml", {
handleAs: "json"
}).then(function(data){
// Do something with the returned JSON data
}, function(err){
// Handle the error condition
});
// Progress events are not supported using the iframe provider
});
You can simply use this, and the returned data can be inserted into a store and then into the grid.
Are you familiar with the Same Origin Policy:
http://en.wikipedia.org/wiki/Same-origin_policy
Basically, it prevents websites from making AJAX requests to domains other than the one the HTML page was loaded from. Common solutions to overcome this are CORS and JSONP (a JSONP sketch follows below). However, remember that these restrictions exist for security reasons.
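If the web service can wrap its JSON in a callback, the JSONP route in Dojo would look roughly like this (a sketch; the URL and the name of the callback query parameter are assumptions about your service):
// A JSONP sketch using dojo/request/script; it only works if the service on
// port 8181 supports a callback query parameter (named "callback" here).
require(["dojo/request/script", "dojo/dom", "dojo/domReady!"],
  function (script, dom) {
    script.get("http://myhost.nodomain:8181/ws/job/definition/JOB00001", {
      jsonp: "callback" // which query parameter carries the callback name
    }).then(function (jobDef) {
      dom.byId("JOB_NAME").innerHTML = jobDef.JOB_NAME;
      dom.byId("SCHEDULED_DAYS").innerHTML = jobDef.SCHEDULED_DAYS;
    }, function (err) {
      console.error(err);
    });
  });
Otherwise, enabling CORS on the :8181 service (an Access-Control-Allow-Origin header naming the :82 origin) lets the existing JsonRest code work unchanged.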
I have a small multiplayer Flash game in which you can display a player profile by clicking at his or her avatar:
const PROFILE_URL:String = 'http://myserver/user.php?id=';
try {
navigateToURL(new URLRequest(PROFILE_URL+id), '_blank');
} catch(e:Error) {
}
This works well, but now I'd like to extend user.php so that players can add comments about each other. For authorization I'd like to use HTTP cookies, passed from the game.swf to user.php.
(I don't want to use GET or POST here, because GET would put the auth variables in the URL, and players might occasionally send that URL around or post it in my forum, while POST would ask to re-post the request when you reload.)
My problem is that I can't find a way to set HTTP cookies through the navigateToURL() method. Please advise me.
Regards,
Alex
You could first authenticate by logging in via a separate call, for example login.php, and that script would start a session. Then all other calls to the same domain would already have the session started and you could check authentication. No need to worry about cookies when PHP can do it for you.
Assuming that you already have the cookie value in your swf you should be able to use the URLRequestHeader together with the URLRequest as follows:
var header:URLRequestHeader = new URLRequestHeader("Cookie", "<the cookie>");
var request:URLRequest = new URLRequest("http://example.com/script.php");
request.requestHeaders.push(header);
request.method = URLRequestMethod.POST;
navigateToURL(request, "_blank");
Under certain circumstances, the browser will send the cookie to the server if it has already been set, even if you don't explicitly include it in the request. This depends on the browser and the version of the Flash Player. You might also need to adjust your crossdomain.xml file.
Also note that there might be security implications of passing around an unencrypted cookie token. See Firesheep.
Is it possible to intercept a 404 error without using a web server (browsing an HTML file in the filesystem)?
I tried with some JavaScript, using a hidden iframe that preloads the destination page, checks the result, and then triggers a custom error or redirects to the correct page.
This works fine but is not good for performance.
A 404 error is an HTTP status response. So unless you are trying to retrieve this file using an HTTP request/response, you can't have a genuine 404 error. You can only mimic one in something like the way you suggest. Any "standard" way of handling a 404 error is dependent on your flavour of web server anyway...
404 is an HTTP response code, and as such it is only delivered through the HTTP protocol by servers that speak it. The file:// scheme isn't a real protocol response as such; it's a mechanism built into clients (like browsers) to enable local file support, and it's up to the browsers/clients themselves whether they expose any response codes from their file:// implementation. In theory they could report them in the DOM, for example, but those would be response codes they expose to themselves, and this is rarely implemented. Most don't, and there isn't a standard way to do it. You could look into browser extensions (for Firefox, for example) and see if they support it, but that would be highly non-standard and would likely break if you put it on the web.
Why don't you want to use the server?
I don't believe that it's possible to handle a 404 error client-side, because a 404 error is server-side.
Whenever you load a webpage, you make a request to the server. Thus, when you ask for a file that's not there, it's the server that handles the error. Regular HTML/CSS/JavaScript only come into the picture when the server sends back a response to tell you that it can't find the file.
Steve
Because I was looking for this today: you can now do this without a server by using a Service Worker to cache the custom 404 page and then serve it when a fetch request's status is 404. Following the instructions in the Google cache lab, the worker file looks as follows:
const filesToCache = [
'/',
'404.html'
];
const staticCacheName = 'pages-cache-v1';
self.addEventListener('install', event => {
console.log('Attempting to install service worker and cache static assets');
event.waitUntil(
caches.open(staticCacheName).then(cache => {
return cache.addAll(filesToCache);
})
);
});
self.addEventListener('fetch', event => {
console.log('Fetch event for ', event.request.url);
event.respondWith(
caches.match(event.request).then(response => {
if (response) {
console.log('Found ', event.request.url, ' in cache');
return response;
}
console.log('Network request for ', event.request.url);
return fetch(event.request).then(response => {
console.log('response.status:', response.status);
// fetch request returned 404, serve the custom 404 page
if (response.status === 404) {
return caches.match('404.html');
}
// otherwise pass the network response through unchanged
return response;
});
})
);
});
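For completeness, the page still has to register the worker; a minimal usage sketch follows (the sw.js filename is an assumption about where you save the worker file):
// Register the service worker from the page(s) that should get the custom 404.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(reg => console.log('Service worker registered, scope:', reg.scope))
    .catch(err => console.error('Service worker registration failed:', err));
}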