I have a website and want to display search results dynamically, meaning that as the user interacts with controls and selects options, the search results are populated in real time, i.e. the user doesn't need to click the search button.
The data is stored in a MySQL relational database.
Now I know this is likely to lead to a large server load once the user base grows beyond a certain size - are there any ways to mitigate this?
Max.
One way to mitigate the server load would be to introduce a slight timer delay before posting back to the server after each control is populated. If you give the user 3 seconds or so to fill in an additional field, they may have time to add another search parameter before the query fires. That could eliminate an extraneous query or two.
Also, I always like to set a maximum number of results returned.
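A minimal client-side sketch of both ideas, assuming a plain browser page and a hypothetical /search endpoint; the results cap shown here is only a hint, and the server should still enforce its own LIMIT in the SQL:

// Hypothetical page-specific renderer; assumed to exist elsewhere.
declare function renderResults(rows: unknown): void;

const DEBOUNCE_MS = 3000;   // give the user ~3 seconds to add another parameter
const MAX_RESULTS = 100;    // client-side hint; the server should enforce its own LIMIT

let timer: ReturnType<typeof setTimeout> | undefined;

// Re-arm the timer on every control change, so only the last change fires a query.
function scheduleSearch(form: HTMLFormElement): void {
  if (timer !== undefined) clearTimeout(timer);
  timer = setTimeout(async () => {
    const params = new URLSearchParams();
    new FormData(form).forEach((value, key) => params.append(key, String(value)));
    params.set("limit", String(MAX_RESULTS));
    const res = await fetch(`/search?${params.toString()}`);
    renderResults(await res.json());
  }, DEBOUNCE_MS);
}

const form = document.querySelector<HTMLFormElement>("#searchForm")!;
form.querySelectorAll("input, select").forEach((el) =>
  el.addEventListener("change", () => scheduleSearch(form))
);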
I'm using MS-Access (365) as a frontend to a Postgres table backend. The communication mechanism between them is ODBC. That seems to work fine.
Before I migrated away from the MS-Access backend, a list of check-able options would appear in a column's filter options when filtering column data. For example, when you clicked on the little caret next to the column name/header, you would see "Sort A to Z" and "Sort Z to A" (as before), then for the filtering you would see "Text Filters" as before (with options for specifying the filtering), but you would also see a list of options, including "Select All" (a toggle) and then each value with a checkable box next to it. The user could select/deselect to filter based on the values. But I no longer see that list of checkable values in the filtering.
The column values are constrained to values in a different table which contains the valid values for that column (a traditional primary/foreign key relationship) and that works fine in the pulldown as a means for a user to pick one of the valid values when editing a record/column.
Since the values of the column are constrained in a predictable, or at least queryable, way, I would think there might be a way to use that to restore those checkable boxes in the filtering. I looked at the "Design View" of the table and tried a few things to see if I could get this to work. No luck.
Any ideas?
You can try this setting:
File -> Options -> Current Database -> Filter lookup options.
The real problem here is that you really should not be loading up forms with SO MANY records in the first place. But you can try the above setting - and of course increase the 1,000 rows. Keep in mind this is going to cause performance issues, since it means that forms and views are loading large recordsets and THEN you are applying filters to that data. It is preferable to provide some kind of search form and let the user search BEFORE you pull large amounts of data.
So all in all this is a less than ideal option from a performance point of view. But if the data pulls are not too large, then this certainly is a "nice" feature - just not a great one for performance.
OK, so what is the best practice when it comes to paginating in MySQL? Let me make it more clear: let's say that at a given time I have 2,000 records and more are being inserted, and I am displaying 25 at a time. I know I have to use LIMIT to paginate through the records. But what am I supposed to do for the total count of my records? Do I count the records every time a user clicks to request the next 25 records? Please don't tell me the answer straight up, but rather point me in the right direction. Thanks!
The simplest solution would be to just continue working with the result set normally as new records are inserted. Presumably, each page you display will use a query looking something like the following:
SELECT *
FROM yourTable
ORDER BY someCol
LIMIT 25
OFFSET 100
As the user pages back and forth, if new data were to come in it is possible that a page could change from what it was previously. From a logical point of view, this isn't so bad. For example, if you had an alphabetical list of products and a new product appeared, then the user would receive this information in a fairly nice way.
As for counting, your code can allow moving to the next page so long as there is enough data to fill another page. New records being added might mean more pages are required to cover the entire table, but it should not affect the logic used to decide when to stop allowing further pages.
If your table has a date or timestamp column representing when a record was added, then you might actually be able to restrict the entire result set to a snapshot in time. In this case, you could prevent new data from entering over a given session.
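As a sketch of the paging logic above (assuming a Node/TypeScript backend with the mysql2 driver and the yourTable/someCol names from the query above), you can fetch one row more than the page size and use that extra row purely to decide whether a next page exists, instead of running a COUNT(*) on every click:

import { createPool } from "mysql2/promise";
import type { RowDataPacket } from "mysql2";

const PAGE_SIZE = 25;
// Connection details are placeholders.
const pool = createPool({ host: "localhost", user: "app", database: "appdb" });

async function getPage(page: number) {
  const offset = page * PAGE_SIZE;
  // Ask for one extra row: it never gets displayed, it only signals "there is more".
  const [rows] = await pool.query<RowDataPacket[]>(
    "SELECT * FROM yourTable ORDER BY someCol LIMIT ? OFFSET ?",
    [PAGE_SIZE + 1, offset]
  );
  return {
    rows: rows.slice(0, PAGE_SIZE),
    hasNextPage: rows.length > PAGE_SIZE,
  };
}

If you also want the snapshot behaviour, append a condition such as AND created_at <= ? (column name assumed) bound to the session's start time, so rows inserted after the session began never enter the result set.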
3 suggestions:
1. Only refresh the data grid when the user clicks the next button, via AJAX, or store the count in the session for the chosen search parameters.
2. Use memcached, which is more advanced and can be shared across all users. Generate a unique key based on the filter parameters and cache the count under it, so you won't hit the database. When a new record gets added, you need to clear the existing memcached key. This requires memcached to be running; see the sketch after this list.
3. Create an index; if you then hit the database just to get the count, there won't be much impact on performance.
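A sketch of suggestion 2 (the CacheClient interface below is a stand-in for whatever memcached client library you use; the key layout and TTL are assumptions):

// Minimal stand-in for a memcached client; only get/set/delete are assumed.
interface CacheClient {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
  delete(key: string): Promise<void>;
}

// The key is derived from the filter parameters, so each distinct search caches its own count.
function countKey(filters: Record<string, string>): string {
  const canonical = Object.keys(filters)
    .sort()
    .map((k) => `${k}=${filters[k]}`)
    .join("&");
  return `count:${canonical}`;
}

async function getTotalCount(
  cache: CacheClient,
  filters: Record<string, string>,
  countFromDb: () => Promise<number>   // runs the SELECT COUNT(*) query
): Promise<number> {
  const key = countKey(filters);
  const cached = await cache.get(key);
  if (cached !== null) return Number(cached);   // cache hit: no database round trip
  const total = await countFromDb();            // cache miss: count once and remember it
  await cache.set(key, String(total), 300);     // 300-second TTL is an arbitrary choice
  return total;
}

// When a new record is inserted, clear the affected key(s) so the count is recomputed.
async function onRecordInserted(cache: CacheClient, filters: Record<string, string>): Promise<void> {
  await cache.delete(countKey(filters));
}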
All,
I am using Tableau 9.0 to do data analysis. My data set is very large, containing 100 billion records.
I want to use a filter to narrow down the data first. But when I try to add a filter on a specific column in Tableau, it keeps running... forever. The reason is that Tableau wants to display every value of that field to me in ascending order and then let me make a selection, e.g. to select only one or two values to filter on...
But it keeps running because of the 100 billion records. How can I solve this problem? Can I switch off this behaviour (displaying every value of the field)? How do I filter such large data sets?
Thank you in advance
Pause Auto-Updates via the toolbar pause button before dragging a field to the filter shelf (or doing anything that you don't want to trigger a query refresh). Then either hit refresh or turn auto-updates back on when you want to run a query.
For discrete dimension filters, you can enter a custom value list to avoid the query that fills the list of items in the filter dialog.
You can improve performance by considering the following tips:
Use a custom SQL query in Tableau to filter the data source down to a smaller working set (data filtered at the backend is an added advantage).
Hide unwanted fields from the data source pane.
Publish your data set to Tableau Server and then connect Tableau to the published TDE server extract.
I don't feel Tableau is the right tool for such a large data set. But check out this article on performance.
http://kb.tableau.com/articles/knowledgebase/database-query-performance
What is better in terms of speed…
I am trying to determine whether or not a user has added a certain URL to their list of shortcuts. If they have added the URL there will be a link on the page to remove the page from the shortcuts otherwise they can add it to their shortcuts for quick access via a dropdown menu. Unfortunately I need to make this check at every page load so the code is in my AppController. I would like to do whatever I can to speed this up. I don't want this cached.
Would it be faster to do a find('first') while limiting the "fields" to just "id", a find('count'), or a field('id'), where the conditions for any of these would be 'URL' => $this->here? Only 1 or 0 results should be returned.
Assuming your table is indexed correctly, you will likely not see a difference. Per #mark's comment, use whichever one suits your needs.
The logic of which one to use should be your main concern.
If you're only trying to see IF there is one, then using field makes the most sense, since it uses LIMIT 1 and only returns a single field.
If you want to know how many there are, then you'll need count.
And if you want to know IF there is one and retrieve its data, then first or exists is the way to go.
I am trying to determine the best method of collecting a large list from a database and then displaying and filtering the results on the client side. Let me give a quick example:
Example: I've got a database with customer data, and currently it contains around 2,000 records. This number is constantly increasing. On my website I have a page from which I want to be able to query said database based on information such as name, email, phone number etc. and of course display the results (when a user types in Smith, it returns all records containing the name Smith). I am planning on using AJAX so that I can query the database and display the results on the fly, similar to how Google does it. When a user begins searching, results will start showing up on the page as they are found.
Possible Solutions:
Unfortunately I am stumped on how to go about implementing something like this. I am considering using a ValueList pattern. When the user first loads the page, should I query the database, store every record in a collection, and then search that collection and display the results on my JSP page? Essentially creating an in-memory copy of the database in Java. The thing I like about the ValueList pattern is that I take one huge hit on page load and dump the entire database into objects stored in a list. What if the database is larger though, say 2,000,000 records?
Or should I be using a simple DAO pattern without the ValueList and query the database for each individual search? This would result in a LOT of database queries, especially considering that I plan on returning results as the user types in the search box.
Edit: The more I think about this, the more it is an AJAX question. My biggest concern should be how to query my database while the user is typing. Do I set some sort of listener to listen for the user to stop typing and then perform the query?
I would use Solr for this type of task.
Fields which you are going to use for searching should be indexed in Solr.
Then you do an AJAX query to Solr and get the result. You can set the sort order and the number of items per page, and show results only for the current page.
Solr has a lot of other features that can be useful for you.
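For illustration, the per-keystroke query could look like the sketch below (the core name customers, the name field, and the host are assumptions; q, start, rows, sort, and wt are standard parameters of Solr's select handler):

// Fetch one page of matches from Solr for the current search term.
async function searchCustomers(term: string, page: number, pageSize = 25) {
  const params = new URLSearchParams({
    q: `name:${term}*`,              // prefix match, so partial input still returns results
    start: String(page * pageSize),  // offset of the first result
    rows: String(pageSize),          // number of items per page
    sort: "name asc",
    wt: "json",
  });
  const res = await fetch(`http://localhost:8983/solr/customers/select?${params}`);
  const body = await res.json();
  return {
    total: body.response.numFound,   // Solr reports the total match count for free
    docs: body.response.docs,        // only the documents for the current page
  };
}

Wire this to the search box's input event with a short pause (a few hundred milliseconds of inactivity), so a query only fires once the user stops typing, as described in the edit above.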