HTML5 Local Storage proper usage - html

I'm using the new local storage that HTML5 offers.
When my mobile app (using phonegap) runs, it first goes to the server to get a list of members. The list doesn't change very often, so I was thinking maybe to keep it in the local storage and just refresh it every week or so.
My question is whether it's sensible to do so, because it's a list of 900 people: not too big, but not small either.
Thanks.

900 people with (I'm guessing) 5-6 fields each comes to roughly 45 KB (I tested this by exporting a sample from my database to a text file), which works out to about 50 bytes per record.
Most phones and tablets can live with that comfortably (in other words, a single larger background image would add more weight to your app than this list does).
So go for it.
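For what it's worth, here is a minimal sketch of that caching pattern, assuming the server returns the member list as JSON; the /members URL and the storage key names are just placeholders:

```javascript
var MEMBERS_KEY = 'members';
var FETCHED_AT_KEY = 'membersFetchedAt';
var ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function getMembers(callback) {
    var cached = localStorage.getItem(MEMBERS_KEY);
    var fetchedAt = Number(localStorage.getItem(FETCHED_AT_KEY));

    // Serve from local storage if the cached copy is less than a week old
    if (cached && fetchedAt && (Date.now() - fetchedAt) < ONE_WEEK_MS) {
        callback(JSON.parse(cached));
        return;
    }

    // Otherwise hit the server and refresh the cache
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://example.com/members');
    xhr.onload = function () {
        // localStorage only stores strings, so keep the raw JSON text
        localStorage.setItem(MEMBERS_KEY, xhr.responseText);
        localStorage.setItem(FETCHED_AT_KEY, String(Date.now()));
        callback(JSON.parse(xhr.responseText));
    };
    xhr.send();
}
```

Since localStorage only stores strings, the list is kept as raw JSON text and parsed on read; for ~45 KB of data that round trip is negligible.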

Related

ASP.NET controls become unresponsive after working perfectly for a while

I have been struggling with this issue for a while now, so I thought I'd ask you folks if anyone could offer any help or inspiration.
I have an ASP.NET application that runs inside an iOS app on an iPod that sits atop a barcode scanner.
So below are the steps.
1. Scan a barcode
2. Catch the barcode as a query string parameter on a page in the ASP.NET application, via the iOS app
3. Search for the product and display its details on the ASP.NET page
4. The user enters a quantity, adds the product to the stock, and moves on to the next product. They can have 300-400 products at a time.
Now, when the number of products scanned and added to stock reaches around 100-105, the quantity textbox and other controls on the page, like buttons, start to become unresponsive and sometimes freeze completely.
If I scan a new product, the page searches for and displays it fine, but I am not able to access the textbox to enter the quantity.
I have contacted the iOS app developer and he has done some work to improve this from an initial 70 scans to 105. This is all on a 5th-generation iPod (500 MB of RAM).
It performs better when I use a 6th-generation iPod (1 GB of RAM); there I can get up to 170-180 scans before the issue comes back to haunt me.
I have to kill and restart the iOS app to be able to work again.
I was using LINQ in ASP.NET, but I have replaced it with a plain SQL data reader and stored procedures to keep things light. That did not help a great deal; maybe another 5 scans were added before it freezes.
I have also used the profiler in VS 2012 and reworked the most resource-hungry methods/objects to reduce memory usage.
Is there a way that I can further optimize the page to help the situation?
I am not even sure if I am barking up the right tree here.
It sounds like you have done all you can to reduce the resource usage of the ASP.NET page itself. You might squeeze a little more out of it by running the pages through a minifier, but since the iOS developer was able to gain a fairly large increase in scan count, the biggest resource overhead is probably on the iOS side.
I am not familiar with iOS development, but it seems your issue is largely down to the hardware capacity of the device. There may be more tricks the iOS developer can use to give the scanner app priority, or even to close unneeded background processes when it starts.

Long load times of webpage assets

I have a website that runs just fine on my local server. It's quick, responsive, and overall runs great.
But when I put it on the server of my domain host, sometimes it takes excessively long to load assets. For example, one 1MB png file took 2.31 seconds to load:
Chrome's Network Developer Tool reveals to me the following:
So is this likely due to a poor implementation in my code, or is it possibly just a bad server? (The company subscribes to the lowest hosting tier possible to host their content.) My internet connection is fast, so I doubt it's that.
I think it is probably a problem with your host. An image is an image :) there aren't a hundred ways to implement one!
Oversized images always take longer to load, so you should keep your images as small as possible. To reduce the content download time, you can optimize/compress the images without degrading their visual quality. If you are using graphics software to optimize the images, use the “Save for Web” option; this will reduce the file size and hence the image load time.
Furthermore, you can use a CDN to serve your website's static assets: images, CSS, JS, videos, etc. A CDN replicates your website's files to a geographically distributed network of servers called POPs (points of presence) and serves each resource from the location nearest the visitor, which means your assets will load faster.
You could also use an SSD-based host. SSDs have much better read/write rates than traditional HDDs, especially for random access, so assets served from solid-state storage tend to come back noticeably faster.
Here's a question for you: is the image you're trying to load a CSS background image, or is it an <img> tag?

Adding new BackgroundTransferRequests once the app is in the background

Adding BackgroundTransferRequests to the BackgroundTransferService once the app is in the background is successful, but the new requests don't actually run until the app comes back to the foreground. Not so great for my scenario of downloading lots of small files that may take a fair amount of time to complete.
I imagine Microsoft has probably implemented this behavior by design(?), but does anyone know a way around this or an alternative approach?
A bit of background to the problem:
I'm developing a Windows Phone 8 map app that allows sections of maps to be downloaded and cached for offline use. This process can lead to thousands of map tiles needing to be downloaded.
I've created a process that spawns the full limit of 25 BackgroundTransferRequests, then adds more to the BackgroundTransferService as requests complete. This all works fine until the app actually goes into the background.
I have considered doing some server-side processing to allow tiles to be bundled into a zip and downloaded as a single request, but this adds complexity and will require twice the space on the phone: enough to complete the download and then extract the files before deleting the original package. Ideally, though, I'd like to find a way to force new BackgroundTransferRequests to start running while the app is in the background.
It's not clear what your actual question is, but I'd definitely recommend bundling the tiles into a zip file and downloading that. It's almost always easier to work with a single file than with thousands.
If disk space is really a genuine issue (not just a theoretical one; I've fit thousands of map tiles in under 20 MB before, but it will depend on image complexity and quality), then you could split the download into a few zip files. That way you'd avoid the BackgroundTransferRequest issue and not take up as much disk space, even temporarily.

Having trouble using SpeedTracer for Google Chrome

I am trying to find the bottleneck in this site here. Granted, there is not much on the page now, but I will be uploading nearly 300 images to the site in the coming week, which I expect will take a toll on speed and performance. Therefore, I want to cut everything other than image loading down as much as possible.
I've attached an image which shows what I'm seeing in SpeedTracer, but the data makes no sense to me. The page loaded in around 4 seconds, yet each of these blue bars claims to have taken ~3.5 seconds to load. How can that be?
Could someone try to provide me with some explanation as to what I'm looking at here?
Thanks,
Evan
What you're looking at is a waterfall of the page and its resources loading.
Each row represents an individual resource and shows when it started loading and how long it took.
Browsers can download resources in parallel - Chrome can download up to six resources from the same hostname, e.g. www.nanisolutions.com, at the same time. It can also download up to six resources from each other hostname at the same time (up to a maximum of 40 connections).
Quite why your screenshot shows the request for the root HTML document starting after some of the other items I'm not sure, but I'm wondering whether you are on the same network as the web server?
I'd be tempted to run the same test using webpagetest.org

Options for Storing Images in a SQLite Database

I am writing a PhoneGap application to work on Android initially.
I am writing it to collect images from the camera and store them in a database (a SQLite database, using HTML5's database functionality).
So there are two options: store the images in the database, or store them on the SD card and reference them by filename.
If I store them on the SD card, how can I stop someone from deleting them? I can't, right? They could even remove the SD card altogether.
If I store them in the database, the HTML5 spec suggests that databases shouldn't be bigger than 5 MB, and if they grow larger the user will be asked whether they want to increase the size. But perhaps this is not really an issue for a PhoneGap app?
My recommendation is to store the images as files on the file system. After you take the picture it will be on the SD card, but you can then use the File API to move the image into /data/data/{app package name}. That directory is protected, so the user is very unlikely to delete the files in that sandbox, and it has the added bonus that the directory is cleaned up when your application is uninstalled.
This also gets you around a host of issues with taking a picture and getting Base64 data back. On a lot of phones the camera is good enough that the Base64-encoded string causes an out-of-memory error, so I tell people to avoid the DATA_URL option whenever they can. Plus, you'll store a lot less data in your DB and won't run into the size limits quite so easily.
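As a rough sketch of that approach, assuming the standard PhoneGap/Cordova Camera and File plugin APIs (cordova.file.dataDirectory points at the app's private data directory; the helper names and storage of the path are illustrative, not the answer's exact code):

```javascript
// Ask the camera for a file URI instead of Base64 data (avoids the DATA_URL memory problems)
navigator.camera.getPicture(onPictureTaken, onError, {
    quality: 50,
    destinationType: Camera.DestinationType.FILE_URI
});

function onPictureTaken(imageUri) {
    // Resolve the temporary image, then move it into the app's private data directory
    window.resolveLocalFileSystemURL(imageUri, function (fileEntry) {
        window.resolveLocalFileSystemURL(cordova.file.dataDirectory, function (dirEntry) {
            var newName = 'img_' + Date.now() + '.jpg';
            fileEntry.moveTo(dirEntry, newName, function (movedEntry) {
                saveReference(movedEntry.toURL()); // persist only the path, not the image bytes
            }, onError);
        }, onError);
    }, onError);
}

function saveReference(path) {
    // Illustrative placeholder: store the filename/path in your SQLite table (or elsewhere)
    localStorage.setItem('lastPhotoPath', path);
}

function onError(err) {
    console.log('Camera/File error: ' + JSON.stringify(err));
}
```

The key point is that only the file path goes into the database, so it stays well clear of the 5 MB limit.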