Is there a maximum width a webpage can be?

I'm planning a page where things are repeatedly appended horizontally. Will there be a point where elements can no longer be appended, because of a maximum page width?

There will probably be some implementation-specific limit to how wide a page can be, or to how many elements you can have on a page, but you will hit other limits long before then: how patient your users are when scrolling, and how many elements you can load before your page becomes too slow.
You might want to look at a paging solution and/or loading and unloading elements dynamically.
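For example, here is a minimal sketch of loading elements dynamically as the user nears the right edge of a horizontally scrolling container. The #strip id, the /items?page=N endpoint, and the item shape are hypothetical, just for illustration:

    // Minimal sketch: append a "page" of elements when the user nears the
    // right edge of a horizontally scrolling container. The #strip id,
    // the /items?page=N endpoint, and the item shape are hypothetical.
    const container = document.getElementById('strip');
    let nextPage = 0;
    let loading = false;

    async function appendNextPage() {
      loading = true;
      const items = await fetch('/items?page=' + nextPage).then(r => r.json());
      for (const item of items) {
        const el = document.createElement('div');
        el.className = 'item';
        el.textContent = item.label;
        container.appendChild(el);
      }
      nextPage++;
      loading = false;
    }

    container.addEventListener('scroll', () => {
      const nearRightEdge = container.scrollLeft + container.clientWidth >
                            container.scrollWidth - 500; // within 500px of the end
      if (nearRightEdge && !loading) appendNextPage();
    });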

Related

Increasing Rows per Page in SSRS

I'm looking for a way to increase the number of rows per page in an SSRS report beyond 50. There are dozens upon dozens of guides on changing the rows per page via a parent group that breaks based on a CEILING(RowNumber(Nothing)/X) calculation, but every guide I've come across seems to be focused on decreasing the rows per page, and I want to increase the rows per page.
If I try a number higher than 50 for my breaks--CEILING(RowNumber(Nothing)/100), for example--it still caps the rows per page at 50. Using a number below 50 does work. Is there a report setting I've missed somewhere?
A bit late, but today I was trying to do this same thing.
I managed to do it by going to Report Properties -> Page Setup and increasing the Height property.
Report Properties can be opened by right clicking in the blank space outside the report working area.
The number of rows per page depends on the Height; in my case I was able to show 100 rows per page with a Height of 21.5in.
No matter where you set page breaks, if the content exceeds the page size then the page will break. I'm assuming that you are viewing on screen only, in which case you need to increase the height in the report's InteractiveSize property.

Prefetching a larger image for hover div Page Speed issue

I currently have a page with about 20 or so images, with source data pulled from a database.
I display them at a width of 100px, and I have a hover function that appends an element containing the full-size image at a width of 250px.
I decided to use the full-size version for the original image and just scale it down to 100px instead of using a thumbnail version. My thinking was that users are very likely to hover over most of the images on this page, so the page would end up loading the full-size version for most of the images anyway; why make them download the thumbnail AND the full-size version for each element? Also, scaling from 250px down to 100px didn't seem to introduce much, if any, distortion in the smaller element.
Now I am running my page through Google's page speed analyzer, and it really does not like me using larger-than-necessary images for the smaller elements. Of course, it is ignoring the fact that those larger images are being used for the dynamically created popups.
In order to make my page play nice with Google's page speed tester, I am giving in and using thumbnails for the smaller elements, but I also want to prefetch the larger image to avoid an annoying delay when the user hovers over the element. This means I am essentially loading two versions of the same image just to make the Google speed test not yell at me.
This seems ridiculous to me, so I wanted to ask: is this really the best way to do this, or is there another way to make my page play nice with Google's speed test?
Thanks,
Adam
If you know what you're doing, there's no need to be a slave to the PageSpeed score.
Loading two copies of the images could make actual page speed slower. However, it depends on what you are trying to optimize for. Loading thumbnails first and then large versions could be better if you want time to full render to be fast (so the users can see the page) and then load the big images in the background to add interactivity later.
Or maybe a fast time to interactivity matters more and time to full render doesn't matter as much; then having one copy of each image is better.
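If you do keep two copies, one common compromise is to let the thumbnails render first and prefetch the full-size versions after the load event, so the initial render stays fast and later hovers are instant. A sketch, assuming each thumbnail carries its large URL in a hypothetical data-full attribute:

    // Sketch: prefetch the full-size images after the page has rendered,
    // so thumbnails paint first and hover popups are served from cache.
    // Assumes each thumbnail <img> carries its large URL in a
    // hypothetical data-full attribute.
    window.addEventListener('load', () => {
      document.querySelectorAll('img[data-full]').forEach(thumb => {
        const img = new Image();      // starts the download immediately
        img.src = thumb.dataset.full; // the hover code then hits the browser cache
      });
    });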

Should the number of html elements being loaded on a page be monitored?

I have an application that displays a page with 5-10,000 rows in a table element and has a drop-down menu that switches to other views with a similar structure. Currently I do an async request when switching between views (I also clear/remove the current view from the document) and load the appropriate list each time; however, I was thinking of:
1) loading all views in the background before they are requested for viewing so they load instantly on click.
and
2) just hiding a particular set of rows rather than removing them, so if the client navigates back it will be instant as well.
This would mean that potentially tens of thousands of HTML elements would be loaded into the current document; is this a problem? What if it was more than tens of thousands?
Loading 10,000+ HTML elements onto your page is not a very smart idea. If the user's computer is of normal to fast speed, the user may experience slight lag, and if the user's computer is slow, it may even cause the browser to hang or crash (depending on the amount of RAM on the computer).
A technique you could explore is to lazy-load the HTML elements: when the user scrolls down to a particular portion of the page, the elements for that portion are loaded via AJAX (this is also known as "infinite scrolling").
This means the browser's rendering engine does not have to render so many elements in one go, which saves memory and increases speed.
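A minimal sketch of that idea using an IntersectionObserver: a sentinel element below the table triggers an AJAX load of the next batch of rows when it scrolls into view. The #rows and #sentinel elements and the /rows?offset=N endpoint are assumptions for illustration:

    // Sketch of "infinite scrolling": a sentinel element below the table
    // triggers an AJAX load of the next batch of rows when it scrolls
    // into view. #rows, #sentinel, and /rows?offset=N are hypothetical.
    const tbody = document.querySelector('#rows');
    const sentinel = document.querySelector('#sentinel');
    let offset = 0;

    const observer = new IntersectionObserver(async entries => {
      if (!entries[0].isIntersecting) return;
      const rows = await fetch('/rows?offset=' + offset).then(r => r.json());
      for (const row of rows) {
        const tr = document.createElement('tr');
        tr.innerHTML = '<td>' + row.name + '</td>';
        tbody.appendChild(tr);
      }
      offset += rows.length;
    });
    observer.observe(sentinel);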
It depends on the HTML elements, but I would say as a rule of thumb, don't load upfront.
When I say it depends on the elements, I mean take a look at Facebook, for example. They load maybe 10 or 20 items into the feed and then add more as you scroll, because each item is so rich in content (photos, videos, etc.).
On the flip side, think about how much info is in each row. If it's, say, less than 500 bytes, then 500 x 10,000 = 5MB, which isn't an awful payload for a web request, and if you can cache intelligently, it may be a lot less than that.
The bottom line is: don't assume that the number of HTML elements matters so much; think about the amount of data it amounts to. My advice: cross that bridge when you get there. Monitor your request times, and if they are too long, come up with an AJAX page loader, which shouldn't be too hard.
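Such a loader can be quite small. A sketch that fetches each view on demand and caches the markup so switching back is instant, without keeping tens of thousands of live elements in the document; the /views/<name> endpoint and #content container are assumptions:

    // Sketch of an AJAX page loader: fetch each view on demand and cache
    // the markup so navigating back is instant. The /views/<name>
    // endpoint and the #content container are hypothetical.
    const viewCache = new Map();

    async function showView(name) {
      if (!viewCache.has(name)) {
        viewCache.set(name, await fetch('/views/' + name).then(r => r.text()));
      }
      document.querySelector('#content').innerHTML = viewCache.get(name);
    }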
The size of your HTML document would be considerable...
I once had a page with 5,000-10,000 items (table rows), and the browser (IE) took way too long to download, parse, and render it.
The best solution seems to me to be setting up a web service with a lazy-loading system.
So IMHO, yes, the number of elements in an HTML document should be monitored.
Yes, of course the number of elements in an HTML document can be monitored! Use Firebug in Firefox.

Div limit for a website

How many divs can you have on a webpage at one time? And what would be controlling that limit?
There's nothing limiting the number of divs on a website (other than the memory on your machine).
There isn't a hard limit in any modern browser. I've literally iterated and printed thousands of divs on pages before. Your real limiting factors are going to be how big you want your page to be, and, of course, eventually your page will take a very long time to load.
I just ran a test case where I inserted 2,097,152 divs in a page and though it was slow to respond, it did eventually come up without choking either IE or Chrome.
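For anyone who wants to try a similar test, here is one way to do it; building the divs in a DocumentFragment keeps the insertion to a single reflow (the count is the one from the test above):

    // One way to run a similar stress test: build the divs in a
    // DocumentFragment so the document reflows only once on insert.
    const fragment = document.createDocumentFragment();
    for (let i = 0; i < 2097152; i++) {
      const div = document.createElement('div');
      div.textContent = i;
      fragment.appendChild(div);
    }
    document.body.appendChild(fragment);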
If you need more information, or perhaps help figuring out an approach that wouldn't take so many divs as to require this question, just let us know.

Is there an upper limit on the number of divs in a document before performance is degraded?

I'm working on a site that will be showing a lot of information via divs. It's basically a chart made out of divs. The way it's set up, I've got an outer div with a set size whose contents you can scroll around (click and drag, like Google Maps).
From here I see two ways of going: have one large inner div, containing all the chart divs, that is moved around; this would be by far the easiest approach. The other option I see is a tiled approach, where I break the large inner div up into smaller divs and dynamically add/remove them as needed (sketched below).
The chart itself is potentially 1999425014 pixels square. Each point is made up of 6 divs and there are 100,000+ points.
What would be the best way to move forward?
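For concreteness, here is a rough sketch of the tiled approach I have in mind: divide the chart into fixed-size tiles and, on each pan, attach only the tiles that intersect the viewport and detach the rest. The tile size, the #viewport element, and the renderTile() helper are placeholders:

    // Rough sketch of the tiled approach: keep only the tiles that
    // intersect the viewport attached to the DOM. TILE (tile edge in px),
    // #viewport, and renderTile() are placeholders for illustration.
    const TILE = 1024;
    const viewport = document.querySelector('#viewport');
    const live = new Map(); // "x,y" -> tile element

    function updateTiles(scrollX, scrollY) {
      const x0 = Math.floor(scrollX / TILE);
      const y0 = Math.floor(scrollY / TILE);
      const x1 = Math.floor((scrollX + viewport.clientWidth) / TILE);
      const y1 = Math.floor((scrollY + viewport.clientHeight) / TILE);
      const wanted = new Set();
      for (let x = x0; x <= x1; x++) {
        for (let y = y0; y <= y1; y++) {
          const key = x + ',' + y;
          wanted.add(key);
          if (!live.has(key)) {
            const tile = renderTile(x, y); // hypothetical: builds this tile's divs
            live.set(key, tile);
            viewport.appendChild(tile);
          }
        }
      }
      for (const [key, tile] of live) {
        if (!wanted.has(key)) {
          tile.remove();    // unload tiles that scrolled out of view
          live.delete(key);
        }
      }
    }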
I believe most modern browsers render sections of pages only as needed. Even if the entire page fits in RAM, only the section that is currently visible is being drawn. I conclude this because large documents do not slow down my computer when I use other programs, but there is a bit of lag when I scroll across one. Plus, when the window is repainted through the operating system's APIs, only the pixels that are visible should be rendered. That's the only way it would make sense to design it, anyway.
Surely you can fit as many as you want on a page without worrying about real-time performance hits, unless you're talking about scrolling. Rendering the document should take less time than retrieving it out of system memory and determining where the scroll bars are.
Best of luck, and merry Christmas.
-Tom