I am looking for a solution for an auto-complete dropdown box that needs to load entries from a huge JSON file (the JSON file is also being updated/regenerated every second).
I tried typeahead.js, but by default it caches the JSON file in the browser, so it was not able to display new entries added to the JSON file.
Is there a solution for an auto-complete text box that can load entries from the server as quickly as possible?
Please suggest.
Thanks.
In your case, you can take advantage of Bloodhound, the typeahead.js suggestion engine. It provides two options: Prefetch and Remote.
In Prefetch, data is fetched and processed on initialization. If the browser supports local storage, the processed data will be cached there to prevent additional network requests on subsequent page loads.
With Remote, data is fetched from the remote source whenever you need it. But remember: in order to prevent an obscene number of requests being made to the remote endpoint, requests are rate-limited.
I think you should use the Remote option in your situation.
Reference : Link
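A minimal sketch of the Remote option (the endpoint URL and field names are assumptions; adjust them to your server). With remote, Bloodhound queries the server as the user types instead of relying on a cached copy of the whole JSON file:

    // Hypothetical endpoint '/entries.json?q=...' that filters entries server-side
    var entries = new Bloodhound({
      datumTokenizer: Bloodhound.tokenizers.whitespace,
      queryTokenizer: Bloodhound.tokenizers.whitespace,
      remote: {
        url: '/entries.json?q=%QUERY',
        wildcard: '%QUERY'   // replaced with the current query string
      }
    });
    entries.initialize();

    $('#search').typeahead(null, {
      name: 'entries',
      source: entries.ttAdapter()
    });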
There are two ways mentioned in the documentation for typeahead.js:
You can set the TTL value to 1 for prefetch (this didn't work for me):
https://github.com/twitter/typeahead.js/blob/master/doc/bloodhound.md
Or you can call clearPrefetchCache on page load or on the click of a button:
category.clearPrefetchCache();
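A minimal sketch of both workarounds (the URL is an assumption). ttl controls how long the prefetched data stays in localStorage (in milliseconds), and clearPrefetchCache() drops it so the next initialization fetches fresh data:

    var category = new Bloodhound({
      datumTokenizer: Bloodhound.tokenizers.whitespace,
      queryTokenizer: Bloodhound.tokenizers.whitespace,
      prefetch: {
        url: '/categories.json',  // hypothetical endpoint
        ttl: 1                    // effectively disables the localStorage cache
      }
    });
    category.initialize();

    // e.g. on page load or on a button click:
    category.clearPrefetchCache();
    category.initialize(true);    // true forces re-initialization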
If I click this link, it shows the complete address. The problem is that the path is shared and anyone can hack this path. Is there any way not to show this entire path and only show abc.jpg in the browser?
You may use the following approaches:
Store the images in the database and use an HttpHandler to retrieve and display them.
As an additional precaution, you may pass the current DateTime as an encrypted URL parameter to verify whether the request is fresh, i.e. within a specific time period, say 10 minutes. You may refer to this article for a reference handler implementation:
display-images-from-database-in-gridview-in-aspnet-using-handler-ashx-file-in-csharp-vbnet
As another approach, you may implement the concept of temporary URLs, as described here: Generating Temporary Download URLs.
No. The complete path has to be shown so that browsers can retrieve the file.
You could implement a custom HTTP handler or ASP.NET page that takes the name of the file in the query string and returns the contents of the file, perhaps even using a unique id (a number, GUID, etc.) for each file so that people cannot "guess" other valid file names. So you'd have:
http://ipaddress/RetrieveUploadFile.aspx?fileid=36281
instead of
http://ipaddress/uplodfiles/2/abc.jpg
No, you don't want to "hide the URL"; the whole notion would not make any sense, because every HTTP request goes to some URL. What you want is called URL rewriting:
Please refer to this and this.
I need to set a specific expires header for JSON files, much lower than for the rest of the files. Can I do this in W3TC? I couldn't find a way.
The default of 31536000 seconds is OK for all other file types. But I use the JSON REST API to deliver data to an AngularJS + Cordova app, and I was having a problem with content not being updated. We figured out that it was the JSON expires header: things worked when we manually configured 300 seconds, but the problem is that W3TC constantly overrides this change.
Is there a way to tell W3TC to use a lower expires header for JSON files? Or a way to manually enter a value that isn't overridden by W3TC?
The only idea I've come up with is to rewrite the JSON expires header rule at the bottom of .htaccess, but I don't know whether that will stop W3TC from editing or erasing it. Also, having a repeated rule seems just plain wrong from the beginning.
Or is there any way to tell Angular to re-download the JSON file even if the cache header tells it to keep it for a year?
What do you think?
Thanks!
FG
Add a random property to the end of the URL to fetch the JSON file. This is what jQuery does to ensure that the cache is not used for JSON requests.
Assume your file URL is http://example.com/myfile.json. You would then fetch it as http://example.com/myfile.json?__random=1 the first time, http://example.com/myfile.json?__random=2 the second time, and so on. Of course, you should use truly random (or otherwise unique) values instead of 1, 2, etc.
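A small sketch of that idea in AngularJS (since the question mentions an AngularJS app; the $http usage below is illustrative, and any HTTP client works the same way):

    // Appending a unique value makes every request a distinct URL,
    // so the long-lived cached copy is bypassed.
    $http.get('http://example.com/myfile.json', {
      params: { __random: Date.now() }   // e.g. ?__random=1511870000000
    }).then(function (response) {
      // use response.data here
    });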
W3TC offers three groups of header policies for managing user agent (browser) caching. The third group on the Browser Cache settings page has an "Other" section; the headers available there should be applied to the JSON response from WordPress unless there is a bug or an implementation detail that prevents this behavior from occurring. W3TC sets the directives for the nginx or Apache web server to apply the headers specified on that page, and there are ways this policy could fail to be applied, but the intent is as indicated.
I'm currently creating PDF documents server-side with wkhtmltopdf and Node.js. The client side sends the HTML to be rendered (which may include img tags with a source). When the user previews the HTML in the browser, the images they uploaded to their account show up fine, because the user is authenticated via the browser and the Node route can simply look up the image based on the user id (saved in the session) and the image id (passed in each image request).
The issue is that when the images are rendered by wkhtmltopdf's WebKit, the renderer is not authenticated when it requests them, because wkhtmltopdf runs in a separate process started via Node's exec. A request like GET /user/images/<imageId> will fail because the session is not set when the request is made inside the headless wkhtmltopdf renderer.
Is there a way to pass authentication via some wkhtmltopdf option, or possibly a different way of authenticating the image requests? The only restriction is that the images must not be made public.
I asked a similar question a while back that might help you:
Generate PDF Behind Authentication Wall
wkhtmltopdf has --cookie-jar, which should get you what you need. Note that it didn't for me, and I wound up answering my own question with an alternate solution. In a nutshell, I accessed the page via cURL (much more flexible), wrote the result to a temporary file that I converted to PDF, and then deleted the temporary file.
A little roundabout, but it got the job done.
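wkhtmltopdf also has a --cookie <name> <value> switch, which may be simpler for a single session cookie. A minimal sketch of wiring it through Node, assuming you point wkhtmltopdf at a URL on your server (the cookie name connect.sid and all paths here are assumptions based on the question):

    // Spawn wkhtmltopdf with the user's session cookie so its WebKit
    // requests are authenticated like a normal browser request.
    var execFile = require('child_process').execFile;

    function renderPdf(pageUrl, outputPath, sessionId, callback) {
      execFile('wkhtmltopdf',
        ['--cookie', 'connect.sid', sessionId, pageUrl, outputPath],
        callback);
    }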
To implement authentication, I allowed a cookie id flag (with Connect the key defaults to connect.sid) as a query option in my image routes. The only "gotcha" is that, since the images are requested from the server's perspective, you must ensure all your image paths are absolute domain paths rather than paths relative to your application (unless those two are the same, of course).
Steps for Express.js (a rough sketch follows the list):
Set up the id flag middleware, which checks for, say, an id in the query via req.query (e.g. ?id=abc123, where abc123 is req.cookies['connect.sid'], or req.signedCookies['connect.sid'] if you're using a secret, as you probably should). You may need to ensure the query middleware is set up first.
Ensure req.headers contains this session id key and value before the cookie parser runs, so the session is set up properly (e.g. if a cookie header already exists, prepend the new value; if none exists, add it: req.headers.cookie = 'connect.sid=abc123;').
Ensure all image paths contain the full URL (e.g. https://www.yourdomain.com/images/imageId?id=abc123).
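Putting those steps together, a rough sketch of the middleware (the /images route, the id query key and the connect.sid name come from the steps above; register this before cookieParser()/session()):

    // If the request carries the session id in the query string (as the
    // image URLs generated for wkhtmltopdf do), recreate the cookie header
    // so the session middleware can authenticate the request normally.
    app.use('/images', function (req, res, next) {
      if (req.query.id) {
        var sid = 'connect.sid=' + req.query.id + ';';
        req.headers.cookie = req.headers.cookie
          ? sid + ' ' + req.headers.cookie
          : sid;
      }
      next();
    });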
One extra tidbit: the image source replacement should probably happen at the server level, to ensure the user cannot copy/paste an image URL containing the session id and, say, email it to a friend, which would obviously leave the door open to account hijacking.
I'm working on a web project where performance is a very important issue.
EDIT:
The situation:
I want to add some details about the user's workflow:
1. The user visits the welcome page of my website, http://example.org/.
2. He clicks a link in order to visit the page http://example.org/mypage.
3. The link's onclick handler is executed.
4. The handler loads data using XHR.
5. The handler creates http://example.org/mypage dynamically.
6. The handler saves mypage locally using the FileSystem API at filesystem:http://example.org/mypage. EDIT: (filesystem:http://example.org/mypage is a local resource stored in the FileSystem on the client side.)
7. The handler extends the history and changes the URL in the location bar using the History API, from http://example.org/ (the URL of the welcome page) to http://example.org/mypage (the page which the user wants to see).
8. The user visits another page in the meantime.
9. Later on, the user types http://example.org/mypage directly into the location bar.
10. The browser shows/loads filesystem:http://example.org/mypage (the locally stored version of http://example.org/mypage) instead of http://example.org/mypage. That means the browser doesn't create a new request; it uses the locally stored copy of http://example.org/mypage.
How can I get the browser to use the locally stored version of the page instead of creating a new request? EDIT: That's what I want to do in #10 of the list above.
EDIT:
My Question:
A client-side script has already created/generated http://example.org/mypage in #2 to #7 of the list above. I don't need to create that page a second time, which is why I don't want the browser to make a request for http://example.org/mypage.
This is what I want to do:
If filesystem:http://example.org/mypage has already been created (i.e. if the user has already visited http://example.org/mypage):
Use filesystem:http://example.org/mypage instead of http://example.org/mypage.
Otherwise:
Send a request for http://example.org/mypage
What I have tried:
I can't use the FALLBACK section of the manifest file to do something like the following (EDIT: aside from the origin problem):
FALLBACK:
http://example.org/mypage filesystem:http://example.org/mypage
The intent would be to make the browser use the local version stored in the FileSystem, but FALLBACK directives are only applied when the user is offline; otherwise they are ignored. EDIT: I want to use filesystem:http://example.org/mypage instead of http://example.org/mypage even when the user is online.
I know that I can use the Expires field in the response header of a server-generated page to avoid a new request and use the cached version.
But what if I create a page dynamically on the client side using JS and XHRs? EDIT: (I described that case in "The situation" above.) When you create a page on the client side, there is no way to get the client to cache that page. That's why I "cache" the page manually, using the FileSystem API to store it on the client side.
In order to improve performance, I'm trying to store locally every page the user has already visited. When the user visits a page again, I show him the old, locally stored version of the page, and my script makes an XHR request to find out whether the page has changed in the meantime.
But how can I get the browser to use the local version of the page?
I can save the generated page locally on the client side using the FileSystem API, and I can choose a URL for the generated page to display in the browser's location bar using the History API.
When the user then visits another site and presses the back button, I can catch the popstate event with an event handler.
And that event handler can load the dynamically created file using the FileSystem API.
But what should I do if the user doesn't use the back button and instead types the URL, which I registered using the History API, directly into the location bar?
Then the browser wouldn't use the locally stored version of the page; it would make a request to load the page from the server.
Don't put dynamic data in the application cache. If you want to put dynamic data in your pages then get it from the server with AJAX, store the data in Local Storage, and populate the page with the data from storage through JavaScript (you can hook into the History API for this).
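A hedged sketch of that pattern (the storage key, the /api path and render() are made-up names): the app shell stays in the Application Cache, while the dynamic data is fetched with AJAX, cached in localStorage, and rendered on navigation:

    function showPage(path) {
      var cached = localStorage.getItem('page:' + path);
      if (cached) {
        render(JSON.parse(cached));        // show the stored data immediately
      }
      var xhr = new XMLHttpRequest();      // then refresh it from the server
      xhr.open('GET', '/api' + path);
      xhr.onload = function () {
        localStorage.setItem('page:' + path, xhr.responseText);
        render(JSON.parse(xhr.responseText));
      };
      xhr.send();
      history.pushState({ path: path }, '', path);
    }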
By the way, this won't work because fallback entries have to be on the same domain:
FALLBACK:
http://example.org/mypage filesystem:http://example.org/mypage
Once your page is in the Application Cache (i.e. it is stored locally), the browser will always use the version from the Application Cache until the manifest is updated or the user deletes the cache. It doesn't really matter what expiry headers you put on the page, except that if you use a long expiry and update the manifest frequently, the Application Cache is likely to be repopulated from the browser cache rather than refreshed from the server. This is why the stuff you put in the Application Cache should be static files. Get your dynamic stuff with AJAX.
You might use URLs that encode the actual link within your hierarchy, e.g. "mypage", in the anchor part of the URL, i.e. http://example.com/#mypage. Then you can use window.location.hash to obtain the string after the # and do whatever magic you want. Just make sure your root (or whatever you want in front of the #) is in AppCache.
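A small sketch of the hash-based approach: the root page lives in the AppCache, so typing http://example.com/#mypage never hits the server for "mypage"; the fragment is routed entirely on the client (showPage() is a hypothetical renderer like the one sketched above):

    window.addEventListener('hashchange', function () {
      var page = window.location.hash.slice(1);   // "#mypage" -> "mypage"
      showPage(page);
    });

    // Also handle the initial load, e.g. when the URL is typed directly.
    if (window.location.hash) {
      showPage(window.location.hash.slice(1));
    }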
Is there a way to specify (in the cache manifest file) that all the resources included in the HTML page are to be cached?
I'm building a dynamic web app and want to give the user the ability to view the app while offline. Therefore I need all the images on the page cached (their sources are set from file names stored in the database, according to the query string provided in the request). Basically, what I need is something like the * wildcard that can be used in the NETWORK and FALLBACK sections.
If there is no way to specify this in the manifest file, what is the best approach? For example, making the manifest itself dynamic and including the resources based on a query string passed to it might work, but that involves getting the list of resources from the DB again.
Any help is greatly appreciated!
You can't use a wildcard in the CACHE section.
The approach you described seems practicable. But why retrieve the resources from the DB again? Once you've fetched them all, hand them to a listener that generates the manifest, or store them in a session attribute from which you can fetch them when generating the manifest.
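A purely illustrative sketch of a dynamically generated manifest (shown with Node/Express only for illustration; the route, the session fields and the version value are made up, and the same idea works with a listener or session attribute in any server stack):

    app.get('/app.manifest', function (req, res) {
      // Reuse the image list stored when the page was built, instead of
      // querying the database a second time.
      var session = req.session || {};
      var images = session.imageUrls || [];
      res.type('text/cache-manifest');
      res.send([
        'CACHE MANIFEST',
        '# version ' + (session.imageVersion || 1),  // bump to force a refresh
        'CACHE:',
        images.join('\n'),
        'NETWORK:',
        '*'
      ].join('\n'));
    });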