(saved) event is getting triggered twice

I am working on an ASP.NET project with an Angular frontend. My page has one editable field.
When the user edits the field, the (saved) event triggers my handler method. The handler is getting called twice, so the entry is added to the database twice.
.html -
<cool-inline-edit-field
  name="homePhone"
  [ngModel]="user.homePhone"
  (saved)="saveUserHomePhoneAsync($event)"
  required
  pattern="[0-9]{10}"
>
</cool-inline-edit-field>
component.ts
saveUserHomePhoneAsync($event: Event) {
  this.showSpinner();
  const oldVal = this.user.homePhone;
  const newVal = ($event as unknown as string | number).toString();
  this.accountService.setBorrowerPreferencesChangeData(
    this.borrowerIdentityData.fullName, "HomePhone", oldVal, newVal
  ).subscribe(
    (data) => {
      this.hideSpinner();
      this.user.homePhone = newVal;
      this.snackbarService.show("Value saved");
    },
    (error) => {
      this.user.homePhone = oldVal;
      this.hideSpinner();
      this.showError('Failed to save home phone. Please try again');
      console.log(error);
    }
  );
}
I don't understand why it is being called twice; the method should only run once.
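While tracking down the double emission (for example, a component that emits (saved) on both blur and Enter, or a handler bound twice), one defensive workaround is to skip the save when the incoming value matches the last one handled. A minimal framework-free sketch; `lastSavedValue`, `saveCount`, and `saveHomePhone` are illustrative names, not from the original component:

```typescript
// Hypothetical guard: ignore a (saved) emission whose value matches the
// last value already saved, so a duplicate event becomes a no-op.
let lastSavedValue: string | undefined;
let saveCount = 0; // stand-in for counting accountService calls

function saveHomePhone(newVal: string): boolean {
  if (newVal === lastSavedValue) {
    // Duplicate emission: skip instead of hitting the API a second time.
    return false;
  }
  lastSavedValue = newVal;
  saveCount++; // in the real component, the HTTP call would go here
  return true;
}
```

In the real component this guard would wrap the accountService call, so a duplicate (saved) emission with the same value would no longer insert a second row. It treats the symptom, not the cause, so it is worth still finding out why the event fires twice.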

Related

Chrome extension messaging error: Could not establish connection. Receiving end does not exist

I'm building an extension where, when the extension first starts (the browser is started or the extension is updated), a window is opened with an HTML file containing a form asking for a master password. When this master password does not match a certain string, a message is sent through chrome.runtime.sendMessage. A message is also sent the same way when the modal window is closed, through the chrome.windows.onRemoved listener.
Here is my service worker:
/// <reference types="chrome-types"/>
(async () => {
  console.log('Extension started. Modal opened...');
  const window = await chrome.windows.create({
    url: chrome.runtime.getURL("html/index.html"),
    type: "popup",
    width: 400,
    height: 600,
  });
  chrome.windows.onRemoved.addListener((windowId) => {
    if (windowId === window?.id) {
      chrome.runtime.sendMessage({ monitoringEnabled: true, reason: 'tab closed' }).catch(console.log);
    }
  });
  chrome.runtime.onMessage.addListener((message) => {
    if (Object.hasOwn(message, 'monitoringEnabled')) {
      console.log(`Monitoring ${message.monitoringEnabled ? 'enabled' : 'disabled'}. ${message.reason ? `Reason: ${message.reason}` : ''}`);
      chrome.storage.local.set({ monitoringEnabled: message.monitoringEnabled });
      if (window?.id) chrome.windows.remove(window.id);
    }
    return true;
  });
})();
The html file just has a form with a button which when clicked triggers a script:
const MASTER_PASSWORD = 'some_thing_here';
document.getElementById('submit-button').addEventListener("click", (e) => {
  const password = document.getElementById('master-password-text-field').value;
  if (password !== MASTER_PASSWORD) {
    return chrome.runtime.sendMessage({ monitoringEnabled: true, reason: 'invalid password' });
  }
  return chrome.runtime.sendMessage({ monitoringEnabled: false });
});
These are some logs:
The first error appears when the modal tab is closed; notice that nothing happens after it (i.e. the onMessage listener is not triggered). In the second case, however, when a message is sent from the modal script, the onMessage listener is triggered, but the connection error still appears after the code in the listener has run.
Not sure why this happens. I have checked multiple other threads on the same topic but none of them seem to help me. If you have a better idea on what I can do to achieve what I want right now, please suggest.
In my code, I was sending a message to the service worker from within the service worker itself. I've rewritten the code to use a plain function that is called both when the windows.onRemoved event fires and when a message is sent from the modal tab. This seems to have fixed my issue. Here is my service worker code for reference:
/// <reference types="chrome-types"/>
console.log('Extension started. Modal opened...');
let windowId: number | null = null;
chrome.windows
  .create({
    url: chrome.runtime.getURL('html/index.html'),
    type: 'popup',
    width: 400,
    height: 600
  })
  .then((created) => (windowId = created?.id ?? null));

chrome.windows.onRemoved.addListener((id) => {
  if (id === windowId) enableMonitoring('window closed');
});

chrome.runtime.onMessage.addListener((message) => {
  if (message.monitoringEnabled) {
    enableMonitoring(message.reason);
  }
  return undefined;
});

function enableMonitoring(reason: any) {
  console.log('monitoring enabled', reason);
}

Upload zip files in angular 8

I am trying to implement zip file upload functionality in an Angular 8 app. The 3 conditions I need to satisfy are:
1. Only allow zip files to be uploaded, else show an error message
2. The file size should not exceed 3 MB, else show an error message
3. When I choose a zip file, it should show a progress bar, but the file should only be uploaded via a REST API call when I click the 'Register' button separately.
What I have implemented so far is:
File Upload Service
postFile(fileToUpload: File, header): Observable<any> {
  const endpoint = 'your-destination-url';
  const formData: FormData = new FormData();
  formData.append('fileKey', fileToUpload, fileToUpload.name);
  if (fileToUpload.size <= 3048576) {
    return this.httpClient.post(endpoint, formData, { headers: header })
      .pipe(map(data => {
        console.log(data);
        return data;
      }, error => {
        console.log(error, 'reduce file size');
      }));
  }
}
Component TS File
handleFileInput(files: FileList) {
  this.fileToUpload = files.item(0);
}
uploadFileToActivity() {
  this.fileUploadService.postFile(this.fileToUpload, this.headers).subscribe(data => {
    // do something, if upload success
    console.log('the file has been uploaded successfully', data);
  }, error => {
    console.log(error);
  });
}
Component HTML
<input type="file" id="file" (change)="handleFileInput($event.target.files)">
Please suggest how I can modify this so that the functionality works as described.
For points 1 and 2, you should add a validation function to your code that checks both the file extension and the size.
The upload should be possible only if the file passes the validation.
In addition, you should return some kind of feedback to the user when the validation fails.
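Such a validation function might look like the sketch below. The FileLike interface, names, and messages are illustrative (a real implementation would use the browser's File object directly, which has the same name and size properties):

```typescript
// Hypothetical validation for the two rules above: .zip extension only,
// and at most 3 MB. Returns an error message, or null when the file is valid.
interface FileLike {
  name: string;
  size: number;
}

const MAX_SIZE_BYTES = 3 * 1024 * 1024; // 3 MB

function validateZipFile(file: FileLike): string | null {
  if (!file.name.toLowerCase().endsWith('.zip')) {
    return 'Only .zip files are allowed';
  }
  if (file.size > MAX_SIZE_BYTES) {
    return 'File size must not exceed 3 MB';
  }
  return null; // valid
}
```

You would call this in handleFileInput, keep the file only when the result is null, and show the returned message otherwise; the actual POST then stays behind the 'Register' button.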
You can track the file upload progress (and show a progress bar) by adding extra options to the .post method and listening for specific events:
return this.httpClient.post(endpoint, formData, {
  headers: header,
  reportProgress: true,
  observe: 'events'
}).pipe(map(event => {
  if (event.type === HttpEventType.Response) {
    // upload complete
  }
  if (event.type === HttpEventType.UploadProgress) {
    // the event contains information about loaded data
    // you can use event.loaded and event.total to display the progress bar
  }
}))

Why is it undefined? How can I solve it?

When I try console.log(this.empresas[0]), it says it is undefined, even though empresas is loaded in the function.
empresas: any;
constructor(...) {
  this.getEmpresas();
  console.log(this.empresas[0]);
}
getEmpresas() {
  this.empresas = [];
  this.http.get("http://url").subscribe(data => {
    this.empresas = JSON.parse(data["_body"]);
  }, err => {
    console.log(err);
  });
}
Just calling getEmpresas() does not update the value of this.empresas: getEmpresas() returns immediately, without updating it. Later on, when the HTTP request completes, this.empresas gets updated, but that is after the point where you printed its value.
Many things in JavaScript happen asynchronously. Observables and the subscribe function are one of them: their content is executed asynchronously, some time in the future (similar to callbacks).
You may want to change your code to this:
constructor(...) {
  this.getEmpresas();
  console.log('hey, getEmpresas() finished!');
}
getEmpresas() {
  this.empresas = [];
  this.http.get("http://url").subscribe(data => {
    this.empresas = JSON.parse(data["_body"]);
    console.log(this.empresas[0]); // print it here, inside the callback
  }, err => {
    console.log(err);
  });
}
If you want to wait in the constructor until the HTTP request has finished, you should use an async function.
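To make the ordering concrete, here is a minimal framework-free sketch of the same timing issue, with the hypothetical fetchEmpresas standing in for the $http call:

```typescript
// fetchEmpresas stands in for the HTTP request; it resolves asynchronously,
// on a later microtask, just like an Observable's subscribe callback.
function fetchEmpresas(): Promise<string[]> {
  return Promise.resolve(['Empresa A', 'Empresa B']);
}

async function demo(): Promise<string> {
  let empresas: string[] = [];
  const pending = fetchEmpresas().then(data => { empresas = data; });
  const before = empresas.length; // still 0: the "request" has not finished yet
  await pending;                  // equivalent of waiting for the callback
  const after = empresas.length;  // now populated
  return `${before}-${after}`;
}
```

Reading empresas right after starting the request sees the empty array; only after awaiting (or inside the callback) is the data there, which is exactly why the console.log in the constructor prints undefined.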

A solution for streaming JSON using oboe.js in AngularJS?

I'm pretty new to Angular, so maybe I'm asking the impossible, but anyway, here is my challenge.
As our server cannot paginate JSON data, I would like to stream the JSON and add it page by page to the controller's model. The user doesn't have to wait for the entire stream to load, so I refresh the view for every X (pagesize) records.
I found oboe.js for parsing the JSON stream and added it to my project using bower (bower install oboe --save).
I want to update the controller's model during the streaming. I did not use the $q implementation of promises, because only one .resolve(...) is possible and I want multiple pages of data loaded via the stream, so $digest needs to be called with every page. The restful service that is called is /service/tasks/search.
I created a factory with a search function which I call from within the controller:
'use strict';
angular.module('myStreamingApp')
  .factory('Stream', function() {
    return {
      search: function(schema, scope) {
        var loaded = 0;
        var pagesize = 100;
        // JSON streaming parser oboe.js
        oboe({
          url: '/service/' + schema + '/search'
        })
        // process every node which has a schema
        .node('{schema}', function(rec) {
          // push the record to the model data
          scope.data.push(rec);
          loaded++;
          // if another page has been received, refresh the view
          if (loaded % pagesize === 0) {
            scope.$digest();
          }
        })
        .fail(function(err) {
          console.log('streaming error ' + (err.thrown ? err.thrown.message : ''));
        })
        .done(function() {
          scope.$digest();
        });
      }
    };
  });
My controller:
'use strict';
angular.module('myStreamingApp')
  .controller('MyCtrl', function($scope, Stream) {
    $scope.data = [];
    Stream.search('tasks', $scope);
  });
It all seems to work. After a while, however, the system gets slow, and the http call doesn't terminate after refreshing the browser. The browser (Chrome) also crashes when too many records are loaded.
Maybe I'm on the wrong track, because passing the scope to the factory's search function doesn't "feel" right, and I suspect that calling $digest on that scope is giving me trouble. Any ideas on this subject are welcome, especially if you have an idea for implementing it where the factory (or service) could return a promise and I could use
$scope.data = Stream.search('tasks');
in the controller.
I dug in a little further and came up with the following solution. It might help someone:
The factory (named Stream) has a search function which is passed parameters for the Ajax request and a callback function. The callback is called for every page of data loaded by the stream. The callback function is invoked via a deferred.promise, so the scope can be updated automatically with every page. To access the search function I use a service (named Search) which initially returns an empty array of data. As the stream progresses, the factory calls the callback function passed by the service, and each page is added to the data.
I can now call the Search service from within a controller and assign the return value to the scope's data array.
The service and the factory:
'use strict';
angular.module('myStreamingApp')
  .service('Search', function(Stream) {
    return function(params) {
      // initialize the data
      var data = [];
      // add the data page by page using a stream
      Stream.search(params, function(page) {
        // a page of records is received.
        // add each record to the data
        _.each(page, function(record) {
          data.push(record);
        });
      });
      return data;
    };
  })
  .factory('Stream', function($q) {
    return {
      // the search function calls the oboe module to get the JSON data in a stream
      search: function(params, callback) {
        // the defer will be resolved immediately
        var defer = $q.defer();
        var promise = defer.promise;
        // counter for the received records
        var counter = 0;
        // I use an arbitrary page size.
        var pagesize = 100;
        // initialize the page of records
        var page = [];
        // call the oboe function to start the stream
        oboe({
          url: '/api/' + params.schema + '/search',
          method: 'GET'
        })
        // once the stream starts we can resolve the defer.
        .start(function() {
          defer.resolve();
        })
        // for every node containing an _id
        .node('{_id}', function(node) {
          // we push the node to the page
          page.push(node);
          counter++;
          // if the pagesize is reached, return the page using the promise
          if (counter % pagesize === 0) {
            promise.then(callback(page));
            // initialize the page
            page = [];
          }
        })
        .done(function() {
          // when the stream is done, make sure the last page of nodes is returned
          promise.then(callback(page));
        });
        return promise;
      }
    };
  });
Now I can call the service from within a controller and assign the response of the service to the scope:
$scope.mydata = Search({schema: 'tasks'});
Update August 30, 2014
I have created an angular-oboe module with the above solution, structured a little better:
https://github.com/RonB/angular-oboe

Service retrieves data from datastore but does not update UI

I have a service which retrieves data from the datastore (Web SQL). Afterwards, it stores the data in an AngularJS array. The problem is that this does not trigger any changes to the UI.
In contrast, if after retrieving the data from the datastore I call a web service using a $get method and append the results to the same array, all the data updates the UI.
Any suggestions? Is it possible that I fill the array before Angular binds the variable?
Can I somehow delay the execution of the service?
Most of the code has been taken from the following example: http://vojtajina.github.io/WebApp-CodeLab/FinalProject/
In order for the UI to magically update, some changes must happen on properties of the $scope. For example, when retrieving some users from a REST resource, I might do something like this:
app.controller("UserCtrl", function($scope, $http) {
  $http.get("users").success(function(data) {
    $scope.users = data; // update $scope.users IN the callback
  });
});
Though there is a better way to retrieve data before a template is loaded (via routes/ng-view):
app.config(function($routeProvider) {
  $routeProvider
    .when("/users", {
      templateUrl: "pages/user.html",
      controller: "UserCtrl",
      resolve: {
        // users will be available on UserCtrl (inject it)
        users: function(userFactory) {
          // returns a promise which must be resolved before $routeChangeSuccess
          return userFactory.getUsers();
        }
      }
    });
});
app.factory("userFactory", function($http, $q) {
  var factory = {};
  factory.getUsers = function() {
    var delay = $q.defer(); // promise
    $http.get("/users").success(function(data) {
      delay.resolve(data); // resolve with the array of users (parsed from JSON)
    }).error(function() {
      delay.reject("Unable to fetch users");
    });
    return delay.promise; // the route will not succeed unless resolved
  };
  return factory;
});
app.controller("UserCtrl", function($scope, users) { // resolved users injected
  // nothing else needed, just use users in your template - you're good to go!
});
I have implemented both methods, and the latter is far preferable, for two reasons:
It doesn't load the page until the resource is resolved. This allows you to show a loading icon, etc., by attaching handlers to the $routeChangeStart and $routeChangeSuccess events.
Furthermore, it plays better with 'enter' animations, in that your items don't annoyingly play the enter animation every time the page is loaded (since $scope.users is pre-populated, as opposed to being updated in a callback once the page has loaded).
Assuming you're assigning the data to the array in the controller, call $scope.$apply() afterwards to have the UI update.
Ex:
$scope.portfolio = {};
$scope.getPortfolio = function() {
  $.ajax({
    url: 'http://website.com:1337/portfolio',
    type: 'GET',
    success: function(data, textStatus, jqXHR) {
      $scope.portfolio = data;
      $scope.$apply();
    },
    error: function(jqXHR, textStatus, errorThrown) {
      console.log(errorThrown);
    }
  });
};