I'm working on my first React/Reflux app, so I may be approaching this problem in completely the wrong way. I'm trying to return a promise from a Reflux store's action handler. Below is the minimal code that represents how I'm trying to do this. If I run this in the browser, I get an error saying that the promise rejection is never caught, because the result of the onAction handler is not passed back when the action is invoked. What is the best way to do this?
var Reflux = require('reflux');
var React = require('react/addons');
const Action = Reflux.createAction();
const Store = Reflux.createStore({
init: function() {
this.listenTo(Action, this.onAction);
},
onAction: function(username, password) {
var p = new Promise((resolve, reject) => {
reject('Bad password');
});
return p;
}
});
var LoginForm = React.createClass({
mixins: [Reflux.connect(Store, 'store')],
login: function() {
Action('nate', 'password1').catch(function(e) {
console.log(e); // This line is never executed
});
},
render: function() {
return (
<a onClick={this.login} href="#">login</a>
)
}
});
React.render(<LoginForm />, document.body);
Several things seem a bit confused here.
Reflux.connect(Store, 'store') is shorthand for listening to the provided store and automatically setting the "store" property of your component's state to whatever is passed in your store's this.trigger() call. However, your store never calls this.trigger, so "store" in your component's state will never be updated. Returning a value from a store's action handler does not trigger an update.
Stores should listen to actions to update their internal state, and typically then broadcast this state update by calling this.trigger. No component is going to get the promise returned from the store's onAction unless it explicitly calls Store.onAction (and then it doesn't matter whether the actual action was invoked or not).
Async work should typically happen in the action's preEmit hook, not in the store. You should then also declare the action as async in createAction by setting the asyncResult option to true, which automatically creates "completed" and "failed" child actions. Check out the Reflux documentation here to learn about async events. Async actions automatically return promises, whose resolve and reject are called when the "completed" and "failed" sub-actions are invoked, respectively. This is a bit opinionated, but that is definitely what I perceive to be the intended Reflux way.
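For illustration, here is a minimal sketch of that approach. The login action, the fakeAuthRequest helper, and the store handlers are hypothetical, not from the original code:

var Reflux = require('reflux');

// asyncResult: true creates Actions.login.completed and Actions.login.failed
var Actions = Reflux.createActions({
    login: { asyncResult: true }
});

// Kick off the async work when the action is invoked.
Actions.login.listen(function(username, password) {
    fakeAuthRequest(username, password)    // hypothetical promise-returning call
        .then(Actions.login.completed)     // resolves the action's promise
        .catch(Actions.login.failed);      // rejects the action's promise
});

// The store listens to the child actions and broadcasts state via this.trigger.
var Store = Reflux.createStore({
    init: function() {
        this.listenTo(Actions.login.completed, this.onLoginCompleted);
        this.listenTo(Actions.login.failed, this.onLoginFailed);
    },
    onLoginCompleted: function(user) { this.trigger({ user: user }); },
    onLoginFailed: function(err) { this.trigger({ error: err }); }
});

// A component can still use the promise returned by invoking the async action:
Actions.login('nate', 'password1').catch(function(e) {
    console.log(e); // now runs when Actions.login.failed fires
});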
I'm writing a unit test of an AngularJS 1.x directive.
If I use "template" it works.
If I use "templateUrl" it does not work (the directive element remains the same original HTML instead of being "compiled").
This is how I create the directive element to test in Jasmine:
function createDirectiveElement() {
scope = $rootScope.$new();
var elementHtml = '<my-directive>my directive</my-directive>';
var element = $compile(elementHtml)(scope);
scope.$digest();
if (element[0].tagName == "my-directive".toUpperCase()) throw Error("Directive is not compiled");
return element;
};
(this does not actually work, see Update for real code)
I'm using this workaround to use the $httpBackend from ngMockE2E (instead of the one in ngMock). In the browser developer "network" tab I don't see any request to the template file. It seems to work because I solved the error "Object # has no method 'passThrough'".
I know that the call to the template is done asynchronously using the $httpBackend (this means $compile exits before the template is really applied).
My question is:
obviously $compile is not doing what I expect. How can I trap this error?
If I use a wrong address in the templateUrl I don't receive any error.
How can I find out what problem happened when I called $compile(directive) or scope.$digest()?
Thanks,
Alex
[Solution]
As suggested by @Corvusoft, I inject $exceptionHandler and check for errors after every test.
In the end this is the only code I have added:
afterEach(inject(function ($exceptionHandler) {
if ($exceptionHandler.errors.length > 0)
throw $exceptionHandler.errors;
}));
Now I can clearly see the errors that occurred in the Jasmine test results (instead of searching for them in the console), for example:
Error: Unexpected request: GET /api/category/list
No more request expected,Error: Unexpected request: GET /api/category/list
No more request expected thrown
And, most importantly, my tests do not pass if there are errors.
[Update to show real example case]
Actually, the real code to make templateUrl work uses an asynchronous beforeEach ("done") and a timeout to wait for the compile/digest to finish.
My directive uses some providers/services, and the template contains other directives which in turn use their own templateUrl and make calls to some APIs in their link() functions.
This is the current (working) test code:
// workaround to have .passThrough() in $httpBackend
beforeEach(angular.mock.http.init); // set $httpBackend to use the ngMockE2E to have the .passThrough()
afterEach(angular.mock.http.reset); // restore the $httpBackend to use ngMock
beforeEach(inject(function (_$compile_, _$rootScope_, _$http_, $httpBackend, $templateCache, $injector) {
$compile = _$compile_;
$rootScope = _$rootScope_;
$http = _$http_;
$httpBackend.whenGET(/\/Scripts of my app\/Angular\/*/).passThrough();
$httpBackend.whenGET(/\/api\/*/).passThrough(); // comment out this to see the errors in Jasmine
}));
afterEach(inject(function ($exceptionHandler) {
if ($exceptionHandler.errors.length > 0)
throw $exceptionHandler.errors;
}));
beforeEach(function(done) {
createDirectiveElementAsync(function (_element_) {
element = _element_;
scope = element.isolateScope();
done();
});
});
function createDirectiveElementAsync(callback) {
var scope = $rootScope.$new();
var elementHtml = '<my-directive>directive</my-directive>';
var element = $compile(elementHtml)(scope);
scope.$digest();
// I haven't found an "event" to know when the compile/digest ends
setTimeout(function () {
if (element[0].tagName == "my-directive".toUpperCase()) throw Error("Directive is not compiled");
callback(element);
}, 0.05*1000); // HACK: adjust the delay to your system/code
};
it("is compiled", function () {
expect(element).toBeDefined();
expect(element[0].tagName).not.toEqual("my-directive".toUpperCase());
});
I hope this example helps someone else.
$exceptionHandler
Any uncaught exception in AngularJS expressions is delegated to this
service. The default implementation simply delegates to $log.error
which logs it into the browser console.
In unit tests, if angular-mocks.js is loaded, this service is overridden by mock $exceptionHandler which aids in testing.
angular.
module('exceptionOverwrite', []).
factory('$exceptionHandler', ['$log', 'logErrorsToBackend', function($log, logErrorsToBackend) {
return function myExceptionHandler(exception, cause) {
logErrorsToBackend(exception, cause);
$log.warn(exception, cause);
};
}]);
I use the Meteor Dev Tools plugin in Chrome, and I've noticed a cool new feature that is worrying me about the way I've coded my app.
The audit collection tool is telling me that some of my collections are insecure.
I am still using Meteor 1.2 with Blaze
1. One of them is meteor_autoupdate_clientVersions.
1.1. Should I worry about this one?
1.2. How do I protect it?
Insert, Update and Remove are marked as insecure.
2. Then I have a cycles collection, which is marked as insecure for update and remove.
This collection is updated in the database now and then, but it is not supposed to be accessed from the frontend and is not meant to be related to any client interaction.
For this collection I have these allow/deny rules in a common folder (both client and server)
I've tried applying these rules only on the server side, but I didn't see a difference in the audit results.
2.1. Should these rules be only on the server side?
Cycles.allow({
insert: function () {
return false;
},
remove: function () {
return false;
},
update: function () {
return false;
}
});
Cycles.deny({
insert: function () {
return true;
},
remove: function () {
return true;
},
update: function () {
return true;
}
});
2.2. How do I protect this collection?
3. And then I also have another collection with an insecure warning, users, where remove is marked as insecure.
On this webapp I don't make any use of users, there is no login, etc.
I might want to implement this in the future, though.
3.1 Should I worry about this collection being insecure, since I don't use it at all?
3.2 How do I protect this collection?
You do not have to allow or deny. Just remove the insecure package from the Meteor app.
Then you can use publish/subscribe and methods for data insert, update and delete; see the sketch after the code below.
Please remove this piece of code from your app:
Cycles.allow({
insert: function () {
return false;
},
remove: function () {
return false;
},
update: function () {
return false;
}
});
Cycles.deny({
insert: function () {
return true;
},
remove: function () {
return true;
},
update: function () {
return true;
}
});
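As mentioned above, reads should go through publications and writes through methods. Here is a minimal sketch of that approach, assuming the insecure and autopublish packages have been removed; the published fields, the method name 'cycles.update', and someCycleId are hypothetical:

// server: publish only what the client needs to read
Meteor.publish('cycles', function () {
    return Cycles.find({}, { fields: { name: 1, status: 1 } });
});

// lib (client and server): all writes go through a server-controlled method
Meteor.methods({
    'cycles.update': function (cycleId, fields) {
        check(cycleId, String);   // check comes from the core 'check' package
        check(fields, Object);
        return Cycles.update(cycleId, { $set: fields });
    }
});

// client: subscribe for reads, call the method for writes
Meteor.subscribe('cycles');
Meteor.call('cycles.update', someCycleId, { status: 'done' });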
For 1.1:
This happens while the user is logging in. Basically, the issue is not with this collection but with the login method.
See the wait time: https://ui.kadira.io/pt/2fbbd026-6302-4a12-add4-355c0480f81d
Why is the login method slow? This happens every time your app reconnects: after a successful login it re-runs all the publications. That's why you see such a delay on login, and hence on this publication.
There is no real remedy for this, but it is fine unless your app has a lot of throughput/subRate on this method/publication.
For 3.1:
You do not have to worry about the insecure warnings anymore after removing the allow/deny rules and the insecure package. But make sure you write secure methods.
I want to create a form on an index page that can store data via session storage. I also want to make sure that whatever data (let's say a name) ... is remembered and used throughout the site with Angular. I have researched pieces of this process, but I do not understand how to write it or really even what it's called.
Any help in the right direction would be useful as I am in the infant stages of all of this angular business. Let me know.
The service you want is angular-local-storage.
Just configure it in your app.js file:
localStorageServiceProvider
.setStorageType('sessionStorage');
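For localStorageServiceProvider to be injectable, the angular-local-storage module (registered as 'LocalStorageModule') has to be listed as a dependency of your app module. A minimal sketch, assuming your app module is named "pstat" as in the controller below:

angular.module('pstat', ['LocalStorageModule'])
    .config(['localStorageServiceProvider', function (localStorageServiceProvider) {
        localStorageServiceProvider.setStorageType('sessionStorage');
    }]);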
And then use it in the controller that contains whatever data you want to remember. Here is an example of a controller that loads the session storage data on initialization, and saves it when a user fires $scope.doSearch through the UI. This should give you a good place to start.
(function () {
    angular.module("pstat")
        .controller("homeCtrl", homeCtrl);

    homeCtrl.$inject = ['$scope', '$log', 'dataService', 'localStorageService', '$http'];

    function homeCtrl ($scope, $log, dataService, localStorageService, $http) {
        var savedQuery = localStorageService.get("query"); // returns null when there is no 'query' entry
        if (savedQuery) {
            // Or store the results directly if they aren't too large.
            // Do something with your saved query on page load, probably get data. Example:
            dataService.getData(savedQuery)
                .success(function (data) { /* use the data */ })
                .error(function (err) { $log.error(err); });
        }

        $scope.doSearch = function (query) {
            localStorageService.set("query", query);
            // Then actually do your search
        };
    }
})();
There aren't many examples demonstrating indexedDB in a ServiceWorker yet, but the ones I saw were all structured like this:
const request = indexedDB.open( 'myDB', 1 );
var db;
request.onupgradeneeded = ...
request.onsuccess = function() {
db = this.result; // Average 8ms
};
self.onfetch = function(e)
{
const requestURL = new URL( e.request.url ),
path = requestURL.pathname;
if( path === '/test' )
{
const response = new Promise( function( resolve )
{
console.log( performance.now(), typeof db ); // Average 15ms
db.transaction( 'cache' ).objectStore( 'cache' ).get( 'test' ).onsuccess = function()
{
resolve( new Response( this.result, { headers: { 'content-type':'text/plain' } } ) );
}
});
e.respondWith( response );
}
}
Is this likely to fail when the ServiceWorker starts up, and if so what is a robust way of accessing indexedDB in a ServiceWorker?
Opening the IDB every time the ServiceWorker starts up is unlikely to be optimal; you'll end up opening it even when it isn't used. Instead, open the db when you need it. A singleton is really useful here (see https://github.com/jakearchibald/svgomg/blob/master/src/js/utils/storage.js#L5), so you don't need to open IDB twice if it's used twice in its lifetime.
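A minimal sketch of that lazy singleton, reusing the 'myDB'/'cache'/'/test' names from the question (the getDb() helper itself is hypothetical):

let dbPromise;

function getDb() {
    if (!dbPromise) {
        dbPromise = new Promise(function(resolve, reject) {
            const request = indexedDB.open('myDB', 1);
            request.onupgradeneeded = function() {
                // Create the store on first open / version change.
                request.result.createObjectStore('cache');
            };
            request.onsuccess = function() { resolve(request.result); };
            request.onerror = function() { reject(request.error); };
        });
    }
    return dbPromise;
}

self.onfetch = function(e) {
    const path = new URL(e.request.url).pathname;
    if (path === '/test') {
        // The chain starts from the (possibly cached) db handle,
        // so the handler also works on a cold ServiceWorker start.
        e.respondWith(getDb().then(function(db) {
            return new Promise(function(resolve, reject) {
                const req = db.transaction('cache').objectStore('cache').get('test');
                req.onsuccess = function() {
                    resolve(new Response(req.result, { headers: { 'content-type': 'text/plain' } }));
                };
                req.onerror = function() { reject(req.error); };
            });
        }));
    }
};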
The "activate" event is a great place to open IDB and let any "onupgradeneeded" events run, as the old version of the ServiceWorker is out of the way.
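For example, a sketch building on the hypothetical getDb() helper above:

self.onactivate = function(e) {
    // Warm up the database, letting any onupgradeneeded handler run now,
    // while the old ServiceWorker is already out of the way.
    e.waitUntil(getDb());
};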
You can wrap a transaction in a promise like so:
var tx = db.transaction(scope, mode);
var p = new Promise(function(resolve, reject) {
tx.onabort = function() { reject(tx.error); };
tx.oncomplete = function() { resolve(); };
});
Now p will resolve/reject when the transaction completes/aborts. So you can do arbitrary logic within the tx transaction, chain on p.then(...), and/or pass a dependent promise into e.respondWith() or e.waitUntil(), etc.
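For example, a minimal sketch (again reusing the hypothetical getDb() helper from above) that passes the transaction promise into event.waitUntil() so the worker stays alive until the write commits:

self.onmessage = function(e) {
    e.waitUntil(getDb().then(function(db) {
        var tx = db.transaction('cache', 'readwrite');
        tx.objectStore('cache').put(e.data, 'test'); // queue the write
        return new Promise(function(resolve, reject) {
            tx.onabort = function() { reject(tx.error); };
            tx.oncomplete = function() { resolve(); };
        });
    }));
};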
As noted by other commenters, we really do need to promisify IndexedDB. But the composition of its post-task autocommit model and the microtask queues that Promises use make it... nontrivial to do so without basically completely replacing the API. But (as an implementer and one of the spec editors) I'm actively prototyping some ideas.
I don't know of anything special about accessing IndexedDB from the context of a service worker versus accessing IndexedDB via a controlled page.
Promises obviously make your life much easier within a service worker, so I've found using something like, e.g., https://gist.github.com/inexorabletash/c8069c042b734519680c to be useful instead of the raw IndexedDB API. But it's not mandatory, as long as you create and manage your own promises to reflect the state of the asynchronous IndexedDB operations.
The main thing to keep in mind when writing a fetch event handler (and this isn't specific to using IndexedDB), is that if you call event.respondWith(), you need to pass in either a Response object or a promise that resolves with a Response object. As long as you're doing that, it shouldn't matter whether your Response is constructed from IndexedDB entries or the Cache API or elsewhere.
Are you running into any actual problems with the code you posted, or was this more of a theoretical question?
I would like to use HTML5 Local Storage with my Ember.js.
I haven't been able to find any examples of doing this without Ember Data.
How should this be done? What do I need to consider?
So let's say we have an object called Storage that in our real-world implementation would represent an adapter-like object for the localStorage to store and retrieve data:
App.Storage = Ember.Object.extend({
init: function() {
this.clearStorage();
var items = ['foo', 'bar', 'baz'];
localStorage.items = JSON.stringify(items);
},
find: function(key) {
// pseudo implementation
if( !Ember.isNone(key) ) {
var items = [];
var storedItems = JSON.parse(localStorage[key]);
storedItems.forEach(function(item){
items.pushObject(item);
});
return items;
}
},
clearStorage: function() {
// pseudo implementation
localStorage.clear();
}
});
Besides the pseudo implementations, you can see there is a dummy array with some data stored at object initialization; we will use this later in our IndexRoute model hook to retrieve it, just to show that this works.
Now to the nicer stuff. You could do the register & inject directly after the application is ready, but what if we wanted it to be available already at application initialization? Well, "there's an Ember feature for that", called Application.initializer. Initializers are simple classes with a 'name' property and an initialize function, in which you have access to your application's container and can do whatever needs to be done. Let me explain this in code:
To be notified when the application starts loading, we can listen to the onLoad event and create our initializer classes, which will register and inject the aforementioned Storage object into every controller and every route:
Ember.onLoad('Ember.Application', function(Application) {
// Initializer for registering the Storage Object
Application.initializer({
name: "registerStorage",
initialize: function(container, application) {
application.register('storage:main', application.Storage, {singleton: true});
}
});
// Initializer for injecting the Storage Object
Application.initializer({
name: "injectStorage",
initialize: function(container, application) {
application.inject('controller', 'storage', 'storage:main');
application.inject('route', 'storage', 'storage:main');
}
});
});
Now, since the Storage object was injected into every route and every controller, we can finally access it in our IndexRoute model hook and make the stored array mentioned above available to our template through the call self.get('storage').find('items') (I just added a promise with a fictive delay to make it conform to the ember-way, rather than returning the array directly):
App.IndexRoute = Ember.Route.extend({
model: function(){
var self = this;
var promise = new Ember.RSVP.Promise(function(resolve) {
Ember.run.later(function() {
var data = self.get('storage').find('items');
console.log(data);
resolve(data);
}, 1000);
});
return promise;
}
});
In our index template we can now agnostically loop over the dummy array not caring where it is coming from:
<script type="text/x-handlebars" id="index">
<h2>Index</h2>
<ul>
{{#each item in model}}
<li>Item: {{item}}</li>
{{/each}}
</ul>
</script>
And lastly, you can see here all the above explained in a working example: http://jsbin.com/eqAfeP/2/edit
Hope it helps.
The accepted answer is great, but I thought I would add this alternative:
Dan Gebhardt has created a very interesting library called Orbit.js for coordinating different data sources on a client. There are three out of the box data sources: memory, local storage, and json api.
For ember integration, check out ember-orbit.
It is still under heavy development at this time, and it introduces a different paradigm than Ember Data, so proceed with caution!