Nuxt is draining my credit

Hello everyone. I already had a conversation with the support team (super awesome people! :smiley: ), but that one was more Netlify-related; this time I want to focus on the Nuxt side.

My app is based on Nuxt 3 (with a bit of Prismic and Shopify as backends), just for context.
I thought I had set everything up correctly for the build to keep things smooth (and cheap).

I can't tell you exactly when, but my free credit (which used to last comfortably for almost a year) started eroding very fast.
Long story short, it seems my app is consuming web requests because of an incorrect Nuxt configuration for SSG. I've submitted the question to three different AIs and received three (or more) different answers.
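For reference, my understanding (this is just how I read the Nuxt docs, so please correct me if I'm wrong) is that with `ssr: true`, only routes covered by a `prerender` rule become static HTML at build time; everything else is rendered on demand by the deployed server function, and each of those renders counts as a web request. A minimal illustration (the route names here are made up):

```typescript
// Hypothetical example: with ssr: true, a route is only static
// if a prerender rule matches it. Anything else hits the server
// function on every request, which is what burns credits.
export default defineNuxtConfig({
  ssr: true,
  routeRules: {
    '/static-page': { prerender: true }, // built once, served from the CDN
    '/dynamic-page': {},                 // rendered per request by the function
  },
})
```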

So, why not a fourth? (Or even more? :smiley: )

This is my nuxt.config file (the sexy parts :wink: ).
Any help will be appreciated.

```ts
// ...some imports

const staticRoutes = ['/chi-siamo', '/eventi', '/eventi/**', '/contatti', '/cookie', '/privacy', '/condizioni']

const routes = ['/', '/products/', '/collections', '/eventi', '/contatti', '/recensioni', '/personalizzazioni']

export default defineNuxtConfig({
  modules: [
    // ...modules
  ],

  ssr: true,
  devtools: { enabled: true },

  runtimeConfig: {
    // ...env variables
  },

  // FIRST PART
  routeRules: {
    '/api/**': { cors: true, cache: { maxAge: 3600 } },
    ...Object.fromEntries(staticRoutes.map(route => [route, { prerender: true }])),
    ...Object.fromEntries(routes.map(route => [route, { prerender: true }])),
    '/': { prerender: true },
  },

  future: {
    compatibilityVersion: 4,
  },

  nitro: {
    logLevel: 1,
    preset: 'netlify',
    future: {
      nativeSWR: true,
    },

    prerender: {
      // failOnError: true,
      autoSubfolderIndex: true, // maybe commented
      crawlLinks: false,
      routes: [
        '/',
        '/blog',
        // these are imported from external files
        ...await getBlogRoutes(),
        ...await prerenderProducts(),
        ...await prerenderCollections(),
        '/404',
        '/sitemap.xml',
        '/robots.txt',
      ],
    },
  },

  robots: {
    enabled: true,
    cacheControl: '86400',
    blockAiBots: true,
    blockNonSeoBots: true,
    botDetection: true,
    mergeWithRobotsTxtPath: true,
    debug: false,
    robotsTxt: true,
    groups: [
      { userAgent: 'facebookexternalhit', disallow: '/' },
    ],
  },

  sitemap: {
    enabled: true,
    autoLastmod: true,
  },
})
```
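For completeness, one direction I'm considering (just a sketch, not yet tested on my app, and assuming a fully static site is acceptable) is to replace the per-route lists with a catch-all prerender rule and let the crawler discover the rest, so that in theory no request ever reaches the server function:

```typescript
// Sketch: prerender everything. The '/**' rule is a catch-all;
// crawlLinks makes the build follow internal links starting from '/'.
export default defineNuxtConfig({
  ssr: true,
  nitro: {
    preset: 'netlify',
    prerender: {
      crawlLinks: true,  // discover pages by following internal links at build time
      routes: ['/'],     // crawl starting point
    },
  },
  routeRules: {
    '/**': { prerender: true }, // every matched route becomes static HTML
  },
})
```

One caveat I'm aware of: the catch-all would also match `/api/**`, which I probably don't want prerendered, so a more specific rule (or excluding the API paths) would likely be needed.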

@biro You could try asking the Nuxt community:

Thanks for the quick response!

I also asked in the Nuxt community.
Anyway, is there something I can do to monitor or improve things to avoid this situation?