How can I increase the number of posts my Squarespace site shows per page?

The Squarespace editor limits the number of blog posts per page to 20. I would like to increase this limit, but I don't know how or where to do that.
Is this possible?

Unfortunately, no; there are only workarounds. Summary blocks have a limit of 30 items instead of 20, and you can also "chain" them by placing several summary blocks on the same page, which is how most people do it.
There is also the "Lazy Load Summaries" plugin, which has an option to go beyond the 30-item limit.
There's no real native way to do it other than summary blocks or plugins (and even then, not for the actual posts). For now, anyway.

Related

Automatically insert thousands of review blocks into pages of a WordPress site

I'm hoping someone can help. I have a WordPress site where I use Elementor for pages.
I want to create an area of the site where reviews are divided into pages, with menus for navigating between them.
The problem is that there are thousands of reviews. How can I avoid entering them one by one? Is there an automated system that inserts them all into pages and also lets me add more in the future?
And if such a system exists, can I give it the style I want?
I've already done something similar by creating individual pages and inserting the reviews the way I want, but as the reviews grow this becomes difficult and you end up having to create hundreds of pages.
I have the reviews in a CSV file with the columns "review" and "name".
Thanks!
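There's no answer here yet, but the usual pattern for this kind of bulk import is to load the CSV programmatically (for example through the WordPress REST API or a CSV import plugin) into posts or a custom post type, and then let an Elementor posts/loop widget handle the paginated, styled display. Below is a minimal Node.js sketch of the import half, assuming the core /wp-json/wp/v2/posts endpoint, an application password for authentication, and a CSV with no commas or quotes inside the fields; the site URL, credentials and file name are placeholders.

// Sketch: create one WordPress post per CSV row via the core REST API.
// Assumes application passwords are enabled and uses the default
// /wp-json/wp/v2/posts endpoint; a real import would more likely target a
// registered "review" custom post type and use a proper CSV parser.
// Requires Node 18+ for the built-in fetch().
const fs = require('fs');

const SITE = 'https://example.com';                                // placeholder
const AUTH = Buffer.from('user:app-password').toString('base64');  // placeholder

async function importReviews(csvPath) {
  const [header, ...lines] = fs.readFileSync(csvPath, 'utf8').trim().split('\n');

  for (const line of lines) {
    const [review, name] = line.split(',');   // naive split: assumes clean fields

    await fetch(`${SITE}/wp-json/wp/v2/posts`, {
      method: 'POST',
      headers: {
        Authorization: `Basic ${AUTH}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ title: name, content: review, status: 'publish' }),
    });
  }
}

importReviews('reviews.csv');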

How to hide content from File2HD?

There is a website called file2hd.com which can download any type of content from your website, including audio, movies, links, applications, objects and style sheets. Of course this doesn't work for high-profile websites such as Google, but is there a method I can use to cloak content on my website and prevent this?
For example, using HTML code, or an .htaccess method?
Answers are appreciated. :)
If you hide something from the software, you also hide it from regular users, unless you have a password-protected part of your website. But even then, users with passwords will be able to fetch all loaded content; HTML is transparent. And since you didn't say what kind of content you're trying to hide, it's hard to give a more accurate answer.
One thing you can do, though it only works for certain file types, is to serve just small portions of a file. For example, you have a video on your page and you fetch 5-second bits of it from the server every 5 seconds. That way, in order for someone to download the whole thing, they'd have to get all the bits (by watching the whole thing) and then find a way to join the parts... and it's usually just not worth it. Think of Google Maps; Google uses this or a similar technique on a few other products as well.
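To make the "serve it in small pieces" idea concrete, here's a client-side sketch using the Media Source Extensions API: the page plays video from small segments fetched one at a time, so the server never exposes one whole downloadable file. The segment URLs, segment count and codec string are placeholders, and real sites usually do this through HLS/DASH players rather than hand-rolled code.

// Sketch: play a video from small fetched segments instead of one file.
// Assumes fragmented MP4 segments where the first segment carries the
// initialization data; URLs, count and codec string are placeholders.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

function appendSegment(sourceBuffer, data) {
  return new Promise(resolve => {
    sourceBuffer.addEventListener('updateend', resolve, { once: true });
    sourceBuffer.appendBuffer(data);
  });
}

mediaSource.addEventListener('sourceopen', async () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

  for (let i = 0; i < 20; i++) {                        // e.g. 20 x 5-second bits
    const res = await fetch(`/segments/clip-${i}.m4s`); // placeholder URLs
    await appendSegment(sb, await res.arrayBuffer());
  }
  mediaSource.endOfStream();
});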

Maximum number of <iFrame> tags per web page

A website can include multiple iframes to integrate external content. I know there are better ways to include such content, but I wonder whether there is a browser-specific limit on the number of iframes allowed on one web page.
I can think of two possible browser limitations:
the number of connections needed to load the iframe resources, and
a concrete maximum number of allowed tags.
For (1): I found that, for example, Mozilla Firefox (v17) has a config parameter for the maximum number of persistent connections to a server (network.http.max-persistent-connections-per-server) and an overall limit on concurrent connections (network.http.max-connections); see about:config.
For (2) I cannot find any information. Apart from higher resource demands and other performance issues, there seems to be no fixed limit on the number of such tags on one page.
Do you have any other information?
Thanks
There does not seem to be a limit. However, as you mentioned, your website's speed may decrease as the number of iframes increases.
Here's a great guide to iframes.
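If you want to see the effect for yourself, the quickest experiment is to add iframes from a script; marking them loading="lazy" is also the usual mitigation, since offscreen frames then don't open connections until they're scrolled into view. The count and embed URL below are placeholders.

// Sketch: add many iframes, deferring offscreen ones with loading="lazy"
// so the page doesn't open dozens of connections up front.
function addFrames(count, src) {
  for (let i = 0; i < count; i++) {
    const frame = document.createElement('iframe');
    frame.src = src;
    frame.loading = 'lazy';   // supported in current browsers
    frame.width = 300;
    frame.height = 150;
    document.body.appendChild(frame);
  }
}

addFrames(50, 'https://example.com/embed');   // placeholder embed URL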

What's the most efficient way to add social media "like" and "+1" buttons to your site?

The task sounds trivial but bear with me.
These are the buttons I'm working with:
Google (+1)
Facebook (Like)
Twitter (Tweet)
LinkedIn (Share)
With a little testing on webpagetest.org I found that it's incredibly inefficient if you grab the snippet from each of these services to place these buttons on your page. In addition to the images themselves you're also effectively downloading several JavaScript files (in some cases multiple JavaScript files for just one button). The total load time for the Facebook Like button and its associated resources can be as long as 2.5 seconds on a DSL connection.
Now it's somewhat better to use a service like ShareThis as you can get multiple buttons from one source. However, they don't have proper support for Google +1. If you get the code from them for the Google +1 button, it's still pulling all those resources from Google.
I have one idea which involves loading all the buttons when a generic looking "Share" button is clicked. That way it's not adding to the page load time. I think this can be accomplished using the code described here as a starting point. This would probably be a good solution but I figured I'd ask here before going down that road.
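A minimal sketch of that click-to-load idea: nothing third-party is requested until the visitor clicks a generic Share button, and only then are the widget scripts injected. The #share-toggle and #share-buttons elements are hypothetical, and the script URLs are the ones the services were handing out at the time, so they may have changed.

// Sketch: defer all third-party button scripts until the first click on a
// generic "Share" button. "#share-toggle" and "#share-buttons" are
// hypothetical elements; the script URLs may have changed since.
function injectScript(src) {
  const s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.head.appendChild(s);
}

let widgetsLoaded = false;
document.querySelector('#share-toggle').addEventListener('click', function () {
  if (widgetsLoaded) return;
  widgetsLoaded = true;

  document.querySelector('#share-buttons').style.display = 'block';
  injectScript('https://apis.google.com/js/plusone.js');       // Google +1
  injectScript('https://connect.facebook.net/en_US/all.js');   // Facebook Like
  injectScript('https://platform.twitter.com/widgets.js');     // Tweet
  injectScript('https://platform.linkedin.com/in.js');         // LinkedIn Share
});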
I found one possible solution if you don't care about the dynamic aspect of these buttons. In other words, if you don't care to show how many people have +1'd or liked your page, you can just use these links...
https://plusone.google.com/_/+1/confirm?hl=en&url={URL}
http://www.facebook.com/share.php?u={URL}
http://twitter.com/home/?status={STATUS}
http://www.linkedin.com/shareArticle?mini=true&url={URL}&title={TITLE}&summary={SUMMARY}&source={SOURCE}
You'd just have to insert the appropriate parameters. It doesn't get much simpler or lightweight than that. I'd still use icons for each button of course, but I could actually use CSS sprites in this case for even more savings. I may actually go this route.
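Filling in those parameters is just a matter of URL-encoding the current page's details into the templates above, for example:

// Sketch: build the static share links from the current page. The link
// templates are the ones listed above; "#share-facebook" is a hypothetical
// icon element.
const url = encodeURIComponent(location.href);
const title = encodeURIComponent(document.title);

const shareLinks = {
  googlePlusOne: 'https://plusone.google.com/_/+1/confirm?hl=en&url=' + url,
  facebook: 'http://www.facebook.com/share.php?u=' + url,
  twitter: 'http://twitter.com/home/?status=' +
    encodeURIComponent(document.title + ' ' + location.href),
  linkedin: 'http://www.linkedin.com/shareArticle?mini=true&url=' + url +
    '&title=' + title,
};

document.querySelector('#share-facebook').href = shareLinks.facebook;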
UPDATE
I implemented this change and the page load time went from 4.9 seconds to 3.9 seconds on 1.5 Mbps DSL. And the number of requests went from 82 to 63.
I've got a few more front-end optimizations to do but this was a big step in the right direction.
I wouldn't worry about it, and here's why: if the websites in question have managed their resources properly - and, come on, it's Google and Facebook, etc... - the browser should cache them after the first request. You may see the effect in a service where the cache is small or disabled, but, in all likelihood, all of your clients will already have those resources in their cache before they ever reach your page.
And, just because I was curious, here's another way:
Here's the relevant snippet from Stack Overflow's Facebook share JavaScript:
facebook:function(c,k,j){k=a(k,"sfb=1");c.click(function(){e("http://www.facebook.com/sharer.php?u="+k+"&ref=fbshare&t="+j,"sharefacebook","toolbar=1,status=1,resizable=1,scrollbars=1,width=626,height=436")})}}}();
Minified, because, hey, I didn't bother to rework the code.
It looks like the Stack Overflow engineers are simply opening the share page on click. That means it's just text until you click it, and everything gets pulled in lazily at that point.
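Unminified, the pattern is roughly the following; this is a readable paraphrase of the idea, not Stack Overflow's exact code, and "#fb-share" is a hypothetical link element.

// Sketch: the page carries nothing but a plain link; the Facebook sharer is
// opened in a popup only when it's clicked.
document.querySelector('#fb-share').addEventListener('click', function (event) {
  event.preventDefault();

  const shareUrl = 'http://www.facebook.com/sharer.php?u=' +
    encodeURIComponent(location.href) +
    '&t=' + encodeURIComponent(document.title);

  window.open(
    shareUrl,
    'sharefacebook',
    'toolbar=1,status=1,resizable=1,scrollbars=1,width=626,height=436'
  );
});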

Thumbnails from HTML pages created and used automatically in web application

I am working on a Ruby on Rails app that visualizes product trees. The tree is built of nodes and everything is rendered in HTML/CSS3. Some of the products require several hundred SQL queries as the tree is built up (up to 800 queries for the biggest tree).
I'd like to have small thumbnails of each tree to present on an index page. Rendering each tree again and modifying the CSS to make a tiny representation would be one option.
But I think it's probably easier to generate thumbnails, crop and cache them, and show those on the index page.
Any ideas on how to do this? Any links/articles/blog posts that could help me?
Check out websnapr; it looks like they provide 100,000 free snaps a month.
I should check this site more often. :D Anyway, I've done some more research, and it looks like you'll need to set up a server-side script that opens a browser to the page, takes a screenshot, and dumps the file to disk or stores it in a database, etc.
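One way to build that kind of script today is with a headless-browser library such as Puppeteer (not something this answer mentions). A minimal Node.js sketch, assuming npm install puppeteer, a writable thumbs/ directory, and placeholder tree URLs:

// Sketch: open each tree page in a headless browser and save a screenshot
// that can later be cropped/cached for the index page. Puppeteer is an
// assumption here, not part of the original answer.
const puppeteer = require('puppeteer');

async function snapshot(urls) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setViewport({ width: 1280, height: 800 });

  for (const [i, url] of urls.entries()) {
    await page.goto(url, { waitUntil: 'networkidle0' });
    await page.screenshot({ path: `thumbs/tree-${i}.png` });
  }
  await browser.close();
}

snapshot(['https://example.com/trees/1', 'https://example.com/trees/2']);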
This question has been open for quite a while. I have a proposal which actually fulfills most of the requirements.
Webkit2png can create screenshots and crop parts of the image. You can specify dimensions and crop areas, and it also provides a thumbnail of each page.
However, it will not support logging in to your application out of the box.
Webkit2png is really easy to use in a shell script, so you can just feed it a list of URLs and it will return all the image files.
More info in this blog post: Batch Screenshots with webkit2png
Webkit2png has an open feature request to add authentication (so you can use it on logged-in pages).
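Since the only other code in this thread is JavaScript, here's the same batch idea as a small Node.js wrapper instead of a shell script. It assumes webkit2png is on your PATH; the -T (thumbnail only) and -o (output basename) flags are taken from webkit2png's --help, so double-check them against your version.

// Sketch: feed webkit2png a list of URLs and collect one thumbnail per page.
// Assumes webkit2png is installed and on PATH; verify the flags with
// "webkit2png --help". The URLs are placeholders.
const { execFileSync } = require('child_process');

const urls = [
  'https://example.com/trees/1',
  'https://example.com/trees/2',
];

urls.forEach(function (url, i) {
  execFileSync('webkit2png', ['-T', '-o', `tree-${i}`, url], { stdio: 'inherit' });
});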