Firing a javascript function from a dynamically created button - html

Updated code and issue:
I am creating a test harness for my RPC server. Currently it consists of a page which immediately fires off an AJAX request to retrieve all functions on the server. Once that is returned, it creates a list of buttons I can click to test. Eventually I will add dialog boxes to test parameter passing to the functions, but currently I just want to fire off the basic request when I click a button. The issue I am seeing is that the onclick handler always fires the last function in the list, presumably because by the time the click fires, key is set to the last value in the array. I thought about passing the button.innerHTML value instead, but that suffers the same problem: the last button.innerHTML is that of the final key.
What do I need to do to fire off the action correctly?
Here is the business end of the code:
$(document).ready(function() {
    $.jsonRPC.setup({
        endPoint: '//api.localhost/index.php'
    });
    $.jsonRPC.request('getExampleData', {
        params: [],
        success: function(result) {
            for (var key in result.result) {
                console.log(key + ' => ' + result.result[key]);
                var button = document.createElement('button');
                button.innerHTML = result.result[key];
                button.onclick = function() { callRPCFunction(result.result[key]); return false; };
                var foo = document.getElementById("page");
                foo.appendChild(button);
            }
        },
        error: function(result) {
            console.log(result);
        }
    });
});
function callRPCFunction(target) {
    $.jsonRPC.request(target, {
        params: [],
        success: function(result) {
            console.log(result);
        },
        error: function(result) {
            console.log(result);
        }
    });
}

Assignment to element.onclick will not work until the element is added to the DOM. You may assign button.onclick = function() { callRPCFunction(result.result[key]); }; after foo.appendChild(button);. That might work!
You may also use jQuery's live() here; it was created for these purposes:
$(button).live('click', function() { callRPCFunction(result.result[key]); });
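For what it's worth, a minimal sketch of capturing the per-button value with an immediately invoked function, assuming the same getExampleData response shape as above:
for (var key in result.result) {
    (function (name) {
        // 'name' is a fresh variable for each iteration, so each button keeps its own value
        var button = document.createElement('button');
        button.innerHTML = name;
        button.onclick = function () {
            callRPCFunction(name);
            return false;
        };
        document.getElementById("page").appendChild(button);
    })(result.result[key]);
}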

Custom Polymer Element refresh data when visible?

In the element script I have:
Polymer({
    is: 'projects-page',
    attached: function () {
        this.async(function () {
            // access sibling or parent elements here
            var model = this;
            $.ajax({
                url: "/api/projects",
                headers: { "Authorization": "Bearer " + sessionStorage.getItem("accessToken") }
            })
            .done(function (data) {
                console.log(data);
                model.projects = data;
                model.notifyPath('projects', model.projects);
            })
            .fail(function (jqXHR, textStatus) {
            })
            .always(function () {
            });
        });
    }
});
I am needing to refresh this data when the route changes or when it becomes visible to the user.
I am still finding the Polymer docs lacking and any help would be appreciated.
UPDATE:
This is a partial answer.
How can I refresh/reload Polymer element, when page changes?
You could listen to the window URL, or to any other variable that reflects the state of your application, with a change-event listener and reload your AJAX data then. The model should be turned into a property of your projects-page, and the ajax done listener should use Polymer's set function, so the view gets repopulated without any big hassle.
https://www.polymer-project.org/1.0/docs/devguide/data-binding.html#set-path
https://www.polymer-project.org/1.0/docs/devguide/properties.html#change-callbacks
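A rough sketch of that suggestion, assuming the element is given a route property bound in from whatever router is in use (the property name and the _loadProjects helper are illustrative, not part of the original code):
Polymer({
    is: 'projects-page',
    properties: {
        // assumed to be data-bound from the router; changes trigger a reload
        route: { type: String, observer: '_routeChanged' },
        projects: { type: Array, value: function () { return []; } }
    },
    attached: function () {
        this._loadProjects();
    },
    _routeChanged: function () {
        this._loadProjects();
    },
    _loadProjects: function () {
        var model = this;
        $.ajax({
            url: "/api/projects",
            headers: { "Authorization": "Bearer " + sessionStorage.getItem("accessToken") }
        }).done(function (data) {
            // Polymer's set() notifies the bindings, so the view repopulates
            model.set('projects', data);
        });
    }
});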

Service retrieves data from datastore but does not update ui

I have a service which retrieves data from the datastore (Web SQL). Afterwards, it stores the data in an AngularJS array. The problem is that this does not trigger any changes to the UI.
In contrast, if after retrieving the data from the datastore I call a web service using a $get method and append the results to the same array, all the data updates the UI.
Any suggestions? Is it possible that I fill the array before Angular binds the variable?
Can I somehow delay the execution of the service?
Most of the code has been taken from the following example: http://vojtajina.github.io/WebApp-CodeLab/FinalProject/
In order for the UI to magically update, some changes must happen on properties of the $scope. For example, if retrieving some users from a REST resource, I might do something like this:
app.controller("UserCtrl", function($scope, $http) {
    $http.get("users").success(function(data) {
        $scope.users = data; // update $scope.users IN the callback
    });
});
Though there is a better way to retrieve data before a template is loaded (via routes/ng-view):
app.config(function($routeProvider) {
    $routeProvider
        .when("/users", {
            templateUrl: "pages/user.html",
            controller: "UserCtrl",
            resolve: {
                // users will be available on UserCtrl (inject it);
                // the function returns a promise which must be resolved before $routeChangeSuccess
                users: function(userFactory) {
                    return userFactory.getUsers();
                }
            }
        });
});
app.factory("userFactory", function($http, $q) {
    var factory = {};
    factory.getUsers = function() {
        var delay = $q.defer(); // promise
        $http.get("/users").success(function(data) {
            delay.resolve(data); // resolve with the array of users (parsed from JSON)
        }).error(function() {
            delay.reject("Unable to fetch users");
        });
        return delay.promise; // route will not succeed unless resolved
    };
    return factory;
});
app.controller("UserCtrl", function($http, users) { // resolved users injected
// nothing else needed, just use users it in your template - your good to go!
)
I have implemented both methods and the latter is far more desirable for two reasons:
It doesn't load the page until the resource is resolved. This allows you to show a loading icon, etc., by attaching handlers to $routeChangeStart and $routeChangeSuccess.
Furthermore, it plays better with 'enter' animations in that all your items don't annoyingly play the enter animation every time the page is loaded (since $scope.users is pre-populated as opposed to being updated in a callback once the page has loaded).
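For instance, such handlers could toggle a flag roughly like this (a sketch; the loading flag name is just illustrative):
app.run(function($rootScope) {
    $rootScope.$on("$routeChangeStart", function() {
        $rootScope.loading = true;  // show a spinner while route resolves are pending
    });
    $rootScope.$on("$routeChangeSuccess", function() {
        $rootScope.loading = false; // hide it once the route (and its resolves) has completed
    });
});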
Assuming you're assigning the data to the array in the controller, call $scope.$apply() afterwards to have the UI update.
Ex:
$scope.portfolio = {};
$scope.getPortfolio = function() {
    $.ajax({
        url: 'http://website.com:1337/portfolio',
        type: 'GET',
        success: function(data, textStatus, jqXHR) {
            $scope.portfolio = data;
            $scope.$apply();
        },
        error: function(jqXHR, textStatus, errorThrown) {
            console.log(errorThrown);
        }
    });
};

Persistent store load callback in ExtJS 4.1.1

I need a way to catch the JSON response every time my datastore has loaded. My first try was to use the autoLoad property, but the callback fires only on the first load:
autoLoad: {
    callback: function (records, operation) {
        // do something with operation.response.responseText
    }
}
So, I have decided to extend the load method:
load: function (options) {
    var callback = options && options.callback;
    return this.callParent([Ext.apply(options || {}, {
        callback: function (records, operation) {
            // do something with operation.response.responseText
            if (callback) {
                return callback.apply(this, arguments);
            }
        }
    })]);
}
It works, but I wonder if the framework already provides a more elegant solution.
You can add a load listener to the store and grab the current request from its proxy when the load event is fired.
var myStore = Ext.create("Ext.data.Store", {
    // ...whatever here
    listeners: {
        load: function (store) {
            store.getProxy().activeRequest.options.operation.response.responseText;
        }
    }
});
That's if you want the response text specifically. If you want the response as a JSON object, you can use store.getProxy().reader.rawData, which is a little simpler.
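For example, the JSON variant might look like this (a sketch under the same assumptions as the snippet above):
var myStore = Ext.create("Ext.data.Store", {
    // ...whatever here
    listeners: {
        load: function (store) {
            // the reader keeps the decoded JSON from the last response
            var json = store.getProxy().reader.rawData;
            console.log(json);
        }
    }
});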

Using jQuery.when with array of deferred objects causes weird happenings with local variables

Let's say I have a site which saves phone numbers via an HTTP call to a service and the service returns the new id of the telephone number entry for binding to the telephone number on the page.
The telephones, in this case, are stored in an array called 'telephones' and datacontext.telephones.updateData sends the telephone to the server inside a $.Deferred([service call logic]).promise();
uploadTelephones = function (deffered) {
    for (var i = 0; i < telephones.length; i++) {
        deffered.push(datacontext.telephones.updateData(telephones[i], {
            success: function (response) {
                telephones[i].telephoneId = response;
            },
            error: function () {
                logger.error('Stuff errored');
            }
        }));
    }
};
Now if I call:
function () {
    var deferreds = [];
    uploadTelephones(deferreds);
    $.when.apply($, deferreds)
        .then(function () {
            editing(false);
            complete();
        },
        function () {
            complete();
        });
}
A weird thing happens. All the telephones are sent back to the service and are saved, but when the 'success' callback in the uploadTelephones method is called with the new id as 'response', no matter which telephone the request relates to, the value of i is always telephones.length+1, and the line
telephones[i].telephoneId = response;
throws an error because telephones[i] does not exist.
Can anyone tell me how to keep the individual values of i in the success callback?
All of your closures (your anonymous functions capturing a variable in the local scope) refer to the same index variable i, which will have the value telephones.length after the loop has finished. What you need is to create a different variable for every pass through the for loop, saving the value of i at the moment of creation for later use.
To create such a new variable, the easiest way is to create an anonymous function that captures the value at that particular point in the loop and immediately execute it.
either this:
for (var i = 0; i < telephones.length; i++) {
    (function () {
        var saved = i;
        deffered.push(datacontext.telephones.updateData(telephones[saved], {
            success: function (response) {
                telephones[saved].telephoneId = response;
            },
            error: function () {
                logger.error('Stuff errored ');
            }
        }));
    })();
}
or this:
for (var i = 0; i < telephones.length; i++) {
    (function (saved) {
        deffered.push(datacontext.telephones.updateData(telephones[saved], {
            success: function (response) {
                telephones[saved].telephoneId = response;
            },
            error: function () {
                logger.error('Stuff errored ');
            }
        }));
    })(i);
}
should work.
Now, that's a bit ugly, though. Since you are already going through the process of executing an anonymous function over and over, if you want your code to be a little bit cleaner you might want to look at Array.forEach and just use the arguments that are passed in, or just use jQuery.each since you are already using jQuery.
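A sketch of the forEach variant, keeping the same deffered array and datacontext calls as above:
telephones.forEach(function (telephone) {
    // each callback receives its own 'telephone' argument, so nothing is shared across iterations
    deffered.push(datacontext.telephones.updateData(telephone, {
        success: function (response) {
            telephone.telephoneId = response;
        },
        error: function () {
            logger.error('Stuff errored');
        }
    }));
});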

submit form using mootools

I have made a class that handles loading and submitting an HTML form. See the code below:
ND.Form = new Class({
    //--- Implements options and events.
    Implements: [Options, Events],
    //--- Options.
    options: {
        url: '',
        injectTo: ''
    },
    //--- Initialize the class.
    initialize: function (panel, options) {
        //--- Set options
        this.setOptions(options);
        this.panel = panel;
        this.loadForm();
    },
    loadForm: function () {
        var req = new Request.HTML({
            url: this.options.url,
            method: 'get',
            onSuccess: function (html) {
                $(this.options.injectTo).empty();
                $(this.options.injectTo).adopt(html);
                var formId = $(this.options.injectTo).getFirst('form').get('id');
                $(formId).addEvent('submit', function (e) {
                    e.stop();
                    this.submitForm(formId);
                }.bind(this));
            }.bind(this)
        }).send();
    },
    submitForm: function (formId) {
        $(formId).set('send', {
            onSuccess: function (resp) {
                this.panel.loadContent();
                if (resp != null) {
                    $('lbl_error').empty();
                    $('lbl_error').setStyles({ 'display': 'block', 'color': 'red' }).set('html', resp);
                }
            }.bind(this),
            onFailure: function (resp) {
                if (resp != null) {
                    $('lbl_error').empty();
                    $('lbl_error').setStyles({ 'display': 'block', 'color': 'red' }).set('html', resp);
                }
            }
        });
        $(formId).send();
    }
});
And it all works just fine, except that when I push the save button more than once, the "this.panel.loadContent();" in "submitForm: function (formId)" fires the same number of times I have pushed the button. How can I prevent this?
/Martin
Starting from MooTools 1.3, each call to "set('send')" adds another event.
So you need to write:
$('myForm').set('send', {
    onSuccess: function (html) {},
    onFailure: function (xhr) {}
}).addEvent('submit', function (e) {
    e.stop();
    this.send();
});
instead of:
$('myForm').addEvent('submit', function (e) {
    e.stop();
    this.set('send', {
        onSuccess: function (html) {},
        onFailure: function (xhr) {}
    });
}).send();
Then the request will be sent only once each time you submit the form.
Basically we need to do three things:
Listen for the ‘click’ event on the submit button.
Stop the event from submitting the form.
Send the form using $(formElement).send().
A solution could look something like this:
$('submit').addEvent('click', function (evt) {
    // Stops the submission of the form.
    new Event(evt).stop();
    // Sends the form to the action path,
    // which is 'script.php'
    $('myForm').send();
});
I have used the request.send() method, but I googled trying to find a way to simply replicate the action of a user hitting a form's submit button while still allowing some JavaScript logic to run beforehand. I did not find anything in discussions specifically addressing this. The answer I found is to use the form.submit() method.
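For example, something along these lines, assuming a save button and a form with the ids used below (doSomeJavascriptLogic is just a placeholder for the pre-submit logic):
$('saveButton').addEvent('click', function (e) {
    e.stop();                 // keep the click from submitting directly
    doSomeJavascriptLogic();  // hypothetical logic that must run first
    $('myForm').submit();     // then trigger a plain, non-AJAX form submission
});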