First Hugo site deployment: easy, but small website (90 Mo) still slow to display, no caching?


Hi, it’s a brand-new Hugo site, less than 100 Mo in size, almost nothing.
I’m surprised that loading each page is slow compared to my localhost experience. I understand it logically can’t be as immediate, but I’m barely touching my allowed bandwidth, and above all it seems to reload the wallpaper on every page, which doesn’t seem right. The theming overall (CSS?) also gets reloaded each time, and it makes for a pretty ugly experience. I don’t mind a page appearing half a second later than on my computer, but it looks quite different, and ugly, while on my disk that only happens on the first opening.
So I bet it’s a caching issue.


```toml
baseURL = '
languageCode = 'en-us'
title = 'Final Worldview'
theme = "hugo-book"
publishDir = "public"
disablePathToLower = false
enableGitInfo = true
enableInlineShortcodes = false

[params]
  BookTheme = 'dark'
  BookSearch = true
  BookComments = true
  BookPortableLinks = true
  BookServiceWorker = true
  BookTranslatedOnly = true

[markup.goldmark.renderer]
  # Needed for mermaid/katex shortcodes
  unsafe = true

[markup.tableOfContents]
  startLevel = 2

[languages.en]
  languageName = 'English'
  contentDir = 'content'
  weight = 1

[languages.fr]
  languageName = 'Français'
  contentDir = 'content_fr'
  weight = 2

[security.exec]
  allow = ['^dart-sass-embedded$', '^go$', '^npx$', '^postcss$', 'micro']
```

The website looks fast enough to me - especially since you’ve added a service worker, so it’s all getting cached after the first load.

It’s really slow compared to my local version; the theming and background should appear instantly after the first load, since they’re the same for the whole damn site! I had two people record their screens, and it’s about the same as here: the lower the bandwidth, the worse it gets. I expected better; it’s ugly. It should download all the pictures and then display them in one go, for a smooth experience.

@00120260a Several of your images are very large by web standards:

This one is 7.4 MB:

Even the background is 2.3 MB:

Try optimizing your images to reduce the overall page size.
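For a sense of why oversized images hurt so much: file size scales roughly with pixel count, so downscaling pays off quadratically. Here is a small JavaScript sketch of that arithmetic (the 1600 px maximum display width is just an assumption for illustration, not a rule):

```javascript
// Compute the scaled dimensions and approximate pixel-count reduction when
// resizing an image to a maximum display width (aspect ratio preserved).
function resizeTarget(width, height, maxWidth) {
  if (width <= maxWidth) return { width, height, pixelRatio: 1 }
  const scale = maxWidth / width
  return {
    width: maxWidth,
    height: Math.round(height * scale),
    // Pixel count shrinks with the square of the scale factor.
    pixelRatio: Number((1 / (scale * scale)).toFixed(1)),
  }
}
```

By that rough estimate, a 4000×3000 photo displayed at 1600 px wide carries over six times the pixels it needs.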

Oh… then I had grossly overestimated the web. I had never considered that issue!
I’m optimizing the pictures with a tool now.
But, at the risk of looking like an idiot, how did you get that useful table?


That’s just the “Network” tab of the “Developer Tools” in Chrome.

I stripped it down to 44 Mo; is that a more common size?
Is it possible to cache/precharge more, so that the background and styling stay in the browser, unlike now?
Or is it possible to load the pictures only when they are actually on the screen? That would help a lot!

I’m not sure what you mean by 44 Mo (or your other questions), but what I see for your site is:

That’s a first load, with the cache disabled.

How fast a first load feels will always be dependent on the connection speed of the person viewing.

The user’s browser will cache assets, but it can’t “precharge” or “precache”, meaning it can’t load things before it is told to, or before it knows it will need them.

I meant 44 Mo for the total size of all the pictures on the site, but since the size per page varies a lot it doesn’t mean much.
So browsers aren’t smart enough to load things only when they actually appear on the screen, then keep them in the cache. OK.
As for the other question, it’s simple: the wallpaper and the CSS style don’t change at all. Then why are they reloaded again and again?
Is that an improvement that could be made in Hugo, Netlify, or the Hugo theme I use?

That is precisely how the browser cache works.

When something is requested it is downloaded to the user’s computer, then cached according to the headers that were returned along with it. If the user views the same asset again before the cache expires, it is loaded from their disk rather than downloaded from the remote source again.

This is the same as with the other assets; it’s down to a combination of things like the user’s browser/cache settings and the headers delivered along with the file. With no change, on a subsequent reload a regular user would have the file retrieved from their browser’s cache, and thus it would load faster.
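The expiry rule described above can be sketched in a few lines of JavaScript. This is only an illustration of the idea (real browsers implement the full HTTP caching specification, with many more directives than `max-age`):

```javascript
// Sketch: decide whether a cached copy is still fresh, based on the
// Cache-Control max-age directive (illustrative, not the full HTTP spec).
function parseMaxAge(cacheControl) {
  const match = /max-age=(\d+)/.exec(cacheControl || '')
  return match ? Number(match[1]) : 0
}

// cachedAtMs: when the asset was stored; nowMs: the current time.
// Fresh entries are served from disk; stale ones are re-requested.
function isFresh(cachedAtMs, cacheControl, nowMs) {
  return nowMs - cachedAtMs < parseMaxAge(cacheControl) * 1000
}
```

With a header like `Cache-Control: max-age=31536000` on the wallpaper and stylesheet, a returning visitor’s browser can serve them from disk for up to a year without contacting the server.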

I’m not sure what you mean by this.

If your issue is with assets / load times etc, then the theme being used wouldn’t matter much outside just how many assets were being loaded and how large they were.

Your site performance is good now; the stats that I gave you earlier were with my cache intentionally disabled, so they represent a “first load” scenario.


It looks like Nathan’s suggestions got you unblocked and like you have a solution! If you have further questions here, please don’t hesitate to follow up.

Happy building!


Sure, my question was rather: why does it have to reinterpret the CSS sheet and reload the wallpaper for each page, even though neither changes across the entire site? Can we do something about it, and where?
Is it a problem of pages not being divided into frames or something?

It’s just how browsers and the internet work for basic individual pages.
It is largely stateless and each page change begins from the blank canvas of the request for the url.

You could read more about it online easily by googling around.

If you don’t like the behavior you could engineer your site with a more complex solution, for example producing a JavaScript based SPA (Single Page Application) where each “page change” isn’t actually a real change of the page, but an internal navigation within the application itself.

You (or the system you decide to use) become responsible for managing when the additional data for other pages is pulled in, and how the screen changes its content when a different route within the application is accessed.
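As a toy illustration of that idea (the routes and content here are made up), an SPA “navigation” just swaps content in place, so shared assets like a wallpaper are never requested again:

```javascript
// Toy SPA router: each "page change" returns new content for the same
// document instead of triggering a full page load. Routes are hypothetical.
const routes = {
  '/': () => 'Home content',
  '/about': () => 'About content',
}

function navigate(path) {
  const render = routes[path] || (() => 'Not found')
  // A real SPA would also call history.pushState(path) and update the DOM here.
  return render()
}
```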

I’m not sure what you mean by this.


Thanks, this was very informative. I had an inkling things were that way. Indeed, browsers are fairly dumb as they are.
One last attempt: is it possible to display lower-quality pictures but provide the real deal when the user clicks “open image in a new tab” or “download image” in their contextual menu?

In my opinion, that would be a highly incorrect statement in 2022, but to each their own.

I believe JavaScript makes a major chunk of this possible, including preventing the “dumb” behavior that browsers are showing on your site.

While we won’t be able to provide you with a working example, here’s a basic lazy loading image function that I use on my own projects:

```javascript
const lazyImagesObserver = new IntersectionObserver((entries, observer) => {
  entries.forEach(entry => {
    if (entry.isIntersecting) {
      const target = entry.target
      const image = new Image()
      const src = target.getAttribute('data-lazy') || 'blank'
      // Once the full-size image has downloaded, swap it into the element
      image.addEventListener('load', () => {
        target.setAttribute('src', src)
        target.removeAttribute('data-lazy')
      }, {
        once: true
      })
      image.src = src
      observer.unobserve(target)
    }
  })
}, {
  threshold: 0
})
document.querySelectorAll('[data-lazy]').forEach(element => {
  lazyImagesObserver.observe(element)
})
```

Images can have markup like:

```html
<img
  alt="alt"
  data-lazy="high-image-link"
  height="high-image-height"
  style="object-fit:contain"
  src="low-image-source"
  width="high-image-width"/>
```

All judgments are relative, that’s for sure :smiley: :smiley: I’m just keenly aware that software has not kept up with hardware improvements over the years.
Thanks for the code; for the sake of perfectionism, I’ll think about learning some JavaScript.