I think some people go into meltdown when it comes to testing for page speed, and more often than not much of it is pure hype designed to get people to sign up for website monitoring services. I ran a few of the speed test websites through their own tests and was hard pressed to find one that scored higher than 76%! So why were they at the top of Google's listings? Because they paid to be there.
Although it is generally accepted that Bootstrap does cause some initial bloat when visiting a site for the first time, it isn't the major issue that many make it out to be. If it is of real concern, there is nothing to stop you from serving Bootstrap from a CDN. This works because there are millions of websites out there using the Bootstrap framework, so the chances are that many users will already have a cached copy if they have visited another site served from the same CDN. Of course, you always run the risk that the CDN may be down when someone visits, so you would probably need a fallback to your own hosted version of Bootstrap anyway. You can do the same with JavaScript libraries, with the same caveats. Likewise, anyone who has visited your site will already have a cached version of your Bootstrap files, so repeat visits will be much faster. The same applies to fonts: you could have those delivered via a CDN on the off-chance that another site the user has visited uses the same fonts as your website.
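For anyone who fancies trying the CDN route, here's a rough sketch of what the fallback approach can look like. I'm assuming Bootstrap 5 from the jsDelivr CDN here, and the local paths (css/bootstrap.min.css and js/bootstrap.bundle.min.js) are just examples; point them at wherever your own copies actually live.

```html
<!-- Bootstrap CSS from a CDN (jsDelivr, version 5.3 assumed here) -->
<link rel="stylesheet"
      href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css">

<!-- Bootstrap JS bundle from the same CDN -->
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js"></script>

<!-- Fallback: if the CDN was unreachable, the bundle never defined
     window.bootstrap, so load the self-hosted copies instead
     (example paths only) -->
<script>
  if (!window.bootstrap) {
    var css = document.createElement('link');
    css.rel = 'stylesheet';
    css.href = 'css/bootstrap.min.css';
    document.head.appendChild(css);
    document.write('<script src="js/bootstrap.bundle.min.js"><\/script>');
  }
</script>
```

It isn't bullet-proof (the CSS and JS could in theory fail independently), but it covers the common case of the whole CDN being down or blocked.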
The other thing you can do is consolidate your CSS files into a single file and then minify it. This is little more than a copy-and-paste job and only requires a change to one line in your output HTML file. You could also enable GZIP on your hosting so that files are compressed before they leave the server.
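To make that "one line change" concrete, here's a sketch; the file names are made up for illustration:

```html
<!-- Before: several separate stylesheets -->
<link rel="stylesheet" href="css/style.css">
<link rel="stylesheet" href="css/custom.css">
<link rel="stylesheet" href="css/extras.css">

<!-- After: one consolidated, minified file -->
<link rel="stylesheet" href="css/site.min.css">
```

And if your hosting runs Apache, GZIP can usually be switched on with a few lines in your .htaccess file (assuming mod_deflate is available, which it is on most shared hosts):

```apache
<IfModule mod_deflate.c>
  # Compress the text-based file types before they leave the server
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>
```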
By far the biggest savings are made by correctly optimising your content, particularly images. As things stand, Blocs already minimises your HTML, CSS, JavaScript and Bootstrap files, so it's doing as much as it can to create faster-loading sites. Anything beyond that is down to you as the website developer. As almost every academic will tell you, the only real way of fully optimising your site is to hand code it. Of course, that suits their narrative, because they make their living teaching people how to hand code. Google, as you would expect, do their bit to support the narrative by convincing everyone that your ranking will be downgraded if you don't achieve a certain mythical score. The fact is, there are millions of Bootstrap websites out there that do appear in Google's listings, so there are clearly other factors involved in the ranking algorithm.
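Going back to the point about images, here's a rough illustration of what "correctly optimising" them tends to mean in practice: export appropriately sized versions of each image, let the browser pick the right one, and defer anything below the fold. The file names and widths here are only examples:

```html
<img src="images/hero-800.jpg"
     srcset="images/hero-400.jpg 400w,
             images/hero-800.jpg 800w,
             images/hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     loading="lazy"
     alt="Example hero image">
```

Combined with a well-compressed format (WebP, or a properly compressed JPEG), this sort of thing usually saves far more than any amount of CSS tinkering.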
Certainly, Google doesn't seem to worry about page speed when it comes to websites that are serving up Google ads and using Google tracking tools. The BBC News website scores about 68% (81% for a repeat visit), but it delivers a lot of content for Google. So expect the BBC and similar sites to get very high rankings from Google regardless.
For me, the benchmark of performance is page size. If it's less than the median of 2.11 MB, it will do just fine. Interestingly, even a page within that median can load in anywhere from 1.2 to 7.8 seconds, depending on where the test server is located. So much of this speed testing is very much driven by the gullible. Seeking utopia on the Google scale isn't the be-all and end-all of website design and development. People visit websites for all manner of reasons, and as long as they see something within a few seconds, they are normally happy. I don't know many people who sit there counting the milliseconds in order to decide whether to stick around and see what a site has to offer. If that were the case, I'm sure Amazon wouldn't get the number of visits it does. (Their score, by the way, is between 51% and 73%, with an average load time of 7.37 seconds.)