I'm making a member list for my website. We have 500+ members and want them all on one page. We used tables before, but that creates a lot of DOM elements and Firefox says it gets slow.
Is there a way to make a list with 3/4 columns in CSS to cut down the number of DOM elements?
Greets, Stefan
In cases like this, the best way to display 500 elements is with JavaScript. Holding 500 records in memory as JSON is not that big of a memory hog, totally acceptable. You'd have to make a dynamically updating table that takes the records from memory and displays only 40 rows at a time. You could also implement search etc. in JS to make it more user friendly.
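A minimal sketch of that idea, assuming the records are already sitting in a plain JSON array. The members variable, the field names, the PAGE_SIZE of 40 and the memberTable/prev/next element ids are all invented for illustration:

    // Assumed data shape - replace with your real member records (or an AJAX response).
    var members = [
        { name: "Alice", joined: "2009-01-12" },
        { name: "Bob",   joined: "2010-06-03" }
        // ... up to 500+ records
    ];

    var PAGE_SIZE = 40;   // rows kept in the DOM at any one time
    var offset = 0;

    function renderPage() {
        var rows = members.slice(offset, offset + PAGE_SIZE).map(function (m) {
            return "<tr><td>" + m.name + "</td><td>" + m.joined + "</td></tr>";
        });
        // memberTable is assumed to be the tbody of the visible table.
        document.getElementById("memberTable").innerHTML = rows.join("");
    }

    document.getElementById("next").onclick = function () {
        if (offset + PAGE_SIZE < members.length) { offset += PAGE_SIZE; renderPage(); }
    };
    document.getElementById("prev").onclick = function () {
        if (offset > 0) { offset -= PAGE_SIZE; renderPage(); }
    };

    renderPage();

Only 40 rows ever exist in the document at once, which is what keeps the page light; searching and sorting can then operate on the members array rather than on the DOM.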
I have a client who wants three table designs built only with HTML and CSS, all in those two files. The tables differ in content and in number of rows, and he wants to be able to easily change which table is displayed, or its design (background-color, table borders, Font Awesome icons, etc.) in the future.
So I think the best solution would be to create the structure in HTML and style the tables separately in 3 different CSS files, rather than styling them in the same file and using 3 classes (for every element that needs to be shown or hidden).
My question is this: I write approximately 300 lines of HTML for these 3 tables, and every time the browser reads the page only one table will be shown, say 100 lines. How will this affect the loading time of the page, considering the table won't be the only element displayed; there may also be images, videos, and so on?
Thanks!
The short answer is yes, it will affect loading time.
It affects load time because it adds to the overall size of the document, and therefore to the download time of the HTML and CSS files.
That being said, 300 lines of code may not affect your load times nearly as much as adding images and videos will. It's probably best to build it and do your own testing to see whether it's worth splitting into 3 sets of files.
I have an application that displays a page with 5,000-10,000 rows in a table element, and has a drop-down menu that switches to other views with a similar structure. Currently I do an async request when switching between views (I also clear/remove the current view from the document) and load the appropriate list each time; however, I was thinking of:
1) loading all views in the background before they are requested for viewing so they load instantly on click.
and
2) Just hiding a particular set of rows rather than removing it so if the client navigates back it will be instant as well.
This would mean that potentially tens of thousands of HTML elements would be loaded into the current document; is this a problem? What if it were more than tens of thousands?
Loading 10,000+ HTML elements onto your page is not a very smart idea. If the user's computer is of normal to fast speed, the user may experience slight lag; if it is slow, the page may even lock up the browser (depending on the amount of RAM on the machine).
A technique you could explore is to lazy-load the HTML elements: when the user scrolls down to a particular portion of the page, the next batch of elements is fetched via AJAX (also known as "infinite scrolling").
This means the browser's rendering engine does not have to render so many elements in one go, which saves memory and increases speed.
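A rough sketch of what that could look like with a plain scroll listener. The /members?offset=... endpoint, the memberBody id and the chunk size of 50 are assumptions, and the server is assumed to return ready-made table rows as an HTML fragment:

    var offset = 0, loading = false;

    function loadMore() {
        if (loading) return;
        loading = true;
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/members?offset=" + offset + "&limit=50"); // hypothetical endpoint
        xhr.onload = function () {
            // Append the returned <tr>...</tr> rows to the table body.
            document.getElementById("memberBody").insertAdjacentHTML("beforeend", xhr.responseText);
            offset += 50;
            loading = false;
        };
        xhr.send();
    }

    window.onscroll = function () {
        // Within 200px of the bottom of the page? Fetch the next chunk.
        if (window.innerHeight + window.pageYOffset >= document.body.offsetHeight - 200) {
            loadMore();
        }
    };

    loadMore(); // initial chunk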
It depends on the HTML elements, but I would say as a rule of thumb, don't load upfront.
When I say it depends on the elements, I mean take a look at Facebook, for example. They load maybe 10 or 20 items into the feed and then add more as you scroll, because each item is so rich in content (photos, videos, etc.).
However, on the flip side of that, think about how much info is in each row. If it's, say, less than 500 bytes, then 500 bytes × 10,000 rows = 5 MB, which isn't an awful payload for a web request, and if you can cache intelligently it may be a lot less than that.
The bottom line is, don't assume that the number of HTML elements matters so much, think about the amount of data that it amounts to. My advice, cross that bridge when you get there. Monitor your request times, if they are too long, come up with an AJAX page loader, which shouldn't be too hard.
The size of your HTML document would be considerable...
I once had a page with 5,000-10,000 items (table rows) and the browser (IE) took far too long to download, parse and render it.
The best solution seems to me to set up a web service with a lazy-loading system.
So IMHO, yes, the number of elements in an HTML document should be monitored.
Yes, of course the number of elements in an HTML document can be monitored! Use Firebug in Firefox!
We're building a web application that has a page where the user sees a large table of editable items. This table has controls on each row to move the row up/down and the option to delete the row. Each row also has two select elements.
This table could consist of around 200 rows in extreme circumstances, and it is when we have lots of rows that we run into severe performance problems. The page is incredibly slow to scroll up and down and we see "checkerboarding" on the screen; deleting a row also takes around 30 seconds, sometimes more! Moving rows up and down takes a similar amount of time, and the page is generally unusable.
We've been trying to narrow down exactly what the problem is and we're pretty sure it has to do with the select elements in the table: if we remove these from the rows, scrolling is perfect, moving up and down takes ~1 second and deleting a row ~7 seconds.
If we delete a row from the bottom of the 200-row table, it is near instant.
It seems like the problem has to do with the CSS on the page; when we run the profiler, it is the recalculation of styles that takes around 3 seconds.
The page performs fine in other browsers, any help/knowledge would be great.
Thanks
We have several tables in our web app for iOS 6, some of which contain several hundred read-only rows with just text, but some contain SELECT elements as well.
Setting the table-layout to fixed had a tremendous impact on performance. Some of the larger tables had major rendering problems with the default automatic table-width calculation, which we haven't overcome so far; this was especially the case when a row contained checkboxes.
Some tables didn't render the last line, for example, or the table was cut off somewhere near the bottom, and so on.
We use a special stylesheet for the portrait mode, which changes the width of the TABLE and TD elements and which completely removes some of the columns as well.
This works really fine and had caused no side effects or problems so far.
For scrolling, we use a special DIV element surrounding the table which has a defined height and is using "-webkit-overflow-scrolling: touch". Even the largest table with around 850 entries scrolls without any loss of performance or flickering.
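For reference, a stripped-down version of that kind of scroll container; the height and class name here are just examples:

    <style>
        .table-scroller {
            height: 400px;                      /* fixed height so the DIV scrolls, not the page */
            overflow-y: auto;
            -webkit-overflow-scrolling: touch;  /* momentum scrolling on iOS */
        }
        .table-scroller table {
            table-layout: fixed;
            width: 100%;
        }
    </style>

    <div class="table-scroller">
        <table>
            <!-- several hundred rows ... -->
        </table>
    </div>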
When do you render the table? During page rendering, or in the ready/load event?
Do you use an innerHTML assignment, or do you build the DOM tree in your logic?
Setting table-layout: fixed; as a style property on the table (in your stylesheet) might make it respond faster. The browser has to do fewer calculations; the downside is that you have to specify the column widths yourself.
I have no server-side programming knowledge, so I have an issue. Here is the situation:
I have a large group of pictures, about 1000, that are related to each other in some way. I have written a function in C that creates an index table of the relation between each pair of pictures. It is very long and complicated, no need to put it here; all you need to know is that if two pictures aren't related at all the index is 0, and if they are very close it goes up to 99, with 100 being the same picture. I can output the table into any sort of file that would be useful for retrieval online.
Now, I want to use this idea for a website. I have a domain and a file system, but I haven't messed with any server scripts or anything, because I don't know where to start. I have an HTML/CSS setup with several divs of decreasing size, and my goal is that when a certain picture is selected, it is loaded into the biggest div, and then the related images are loaded into the smaller divs in descending order of relation. I am not sure how to do this, both in terms of the 'relation' attribute and in terms of retrieving it. Thanks for your time; I know this is a very long question, so sorry.
edit: to avoid confusion, I only have a few divs in total, not a thousand all at once, haha
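One way to sketch the retrieval side, purely for illustration: have the C program write the relation table out as JSON, then fetch that file with JavaScript and sort the related images into the smaller divs. The related.json name, the data shape and the mainPic/related0... element ids are all invented here:

    // related.json, produced by the C program (assumed shape):
    // { "pic_042.jpg": { "pic_107.jpg": 97, "pic_311.jpg": 85 }, ... }

    function showPicture(filename) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "related.json");
        xhr.onload = function () {
            var table = JSON.parse(xhr.responseText);
            var related = table[filename] || {};

            // Sort the related filenames by relation index, highest first.
            var sorted = Object.keys(related).sort(function (a, b) {
                return related[b] - related[a];
            });

            document.getElementById("mainPic").src = filename;
            // Fill the smaller divs, assumed to contain <img id="related0">, <img id="related1">, ...
            sorted.slice(0, 4).forEach(function (name, i) {
                document.getElementById("related" + i).src = name;
            });
        };
        xhr.send();
    }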
I've recently been developing an application using tables with a large number (hundreds) of rows. The rows contain tabular data, and each row contains two drop-down lists as well as a link which displays hidden table rows directly below it.
I am currently experiencing lengthy delays when attempting to select a value from the drop-downs, and when displaying the hidden table rows.
Is the table the source of my problem here?
From what I gather, HTML tables can appear slow to render if widths are not explicitly stated. If they aren't, the browser has to finish loading the contents of the cells before it can calculate the correct column widths.
MSDN's "Building High Performance HTML Pages" article has some information that may help; regarding tables specifically, they suggest the following (a quick sketch follows the list):
Set the table-layout CSS attribute to fixed on the table.
Explicitly define col objects for each column.
Set the WIDTH attribute on each col.
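Put together, that advice looks something like this; the class name and the widths are arbitrary examples:

    <style>
        table.members {
            table-layout: fixed;  /* columns are sized from the col widths, not the cell contents */
            width: 600px;
        }
    </style>

    <table class="members">
        <col style="width: 300px">
        <col style="width: 150px">
        <col style="width: 150px">
        <tr>
            <td>Name</td><td>Joined</td><td>Status</td>
        </tr>
        <!-- ... hundreds more rows ... -->
    </table>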
The problem is more than likely the rendering of the controls (100 select boxes) rather than the table layout. How many items are there in the drop-down list? Does it perform the same way in all browsers on different operating systems?
Sorry to be so vague, but generally tables aren't slow to render; they are looked down upon mainly because of their lack of accessibility (and much of that comes down to screen readers not handling them well).
I can state from experience that it's almost certainly the dropdown lists that are causing the slowness. Tables with hundreds of rows can render almost instantaneously if they only contain text, but add a dropdown list to each row with just a few dozen options, and now even 50 rows will take a noticeable amount of time.
Try to design around the need for dropdowns in the table rows - if this is a form (and why else would you have dropdowns?), have the user select the row they wish to edit, and then put the editable controls only in that row, either via AJAX if you're into that sort of thing, or a more traditional "detail view" approach.
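A bare-bones sketch of that "controls only in the selected row" idea, without the AJAX part: every row is rendered as plain text, and a select is only created when the user clicks an editable cell. The memberTable id, the editable class and the option list are placeholders:

    document.getElementById("memberTable").onclick = function (e) {
        var cell = e.target;
        if (cell.tagName !== "TD" || cell.className.indexOf("editable") === -1) return;
        if (cell.querySelector("select")) return; // already editing this cell

        var current = cell.textContent;
        var select = document.createElement("select");
        ["Active", "Inactive", "Pending"].forEach(function (opt) { // placeholder options
            var o = document.createElement("option");
            o.value = o.textContent = opt;
            o.selected = (opt === current);
            select.appendChild(o);
        });

        select.onchange = function () {
            cell.textContent = select.value; // collapse back to plain text
        };

        cell.textContent = "";
        cell.appendChild(select);
    };

With this approach only one select element exists at a time, no matter how many rows the table has.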
I've not noticed any slowdowns when displaying static tables with a few thousand entries, but adding effects like sorting or dynamic zebra-striping can quickly bog down even a modern browser on a fast computer. Make sure you're not running any JavaScript iterations across the entire table.
Whatever you come up with, test the approach in a couple of browsers on a couple of platforms, with some slower machines. I once had an app that was speedy everywhere except in Safari for Macintosh. It turned out to be something about the way it rendered the dropdowns. There's just no substitute for experimentation. Uhm, I meant testing.
If you know how wide each column should be, try specifying table-layout: fixed in the CSS for the table and see if that makes any difference (it should stop the browser trying to re-render the whole table just because you've toggled visibility on a few rows.)
IE does not handle very large DOM manipulations well at all.
If you have two selects and a link in each row, and each select has, say, 5 options, that's ((5 options + 1 select) × 2) + 1 link + 3 TDs + 1 TR = at least 17 DOM items per row. Plus, if I'm not mistaken, IE treats each text item as its own DOM node, so add another 10 DOM nodes for the drop-down text and 1 for the link: that's 28 DOM items per row.
I agree with the others that it's not only your table that is causing the slow loading, but the population of the drop-down lists as well.
If it helps, you can try paginating your table. You can use JavaScript frameworks such as jQuery or Ext JS for pagination and AJAX.
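If you do go the pagination route, the client-side core is small even without a framework. A plain-JavaScript sketch, assuming the rows already sit in a tbody with the id "rows" and a page size of 50 (both invented here):

    var PAGE_SIZE = 50;
    var allRows = Array.prototype.slice.call(
        document.getElementById("rows").getElementsByTagName("tr"));

    function showPage(page) {
        allRows.forEach(function (row, i) {
            // Only the rows belonging to the requested page stay visible.
            row.style.display =
                (i >= page * PAGE_SIZE && i < (page + 1) * PAGE_SIZE) ? "" : "none";
        });
    }

    showPage(0); // show the first page initially

Hiding rows like this keeps them all in the DOM, so for very large tables it is worth fetching each page via AJAX instead, as suggested above.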