Not a Netlify-specific problem, but I want to build a script that makes 6-7 API calls, combines the results into a file and then uploads that file to my repo to trigger a rebuild of my Netlify site... how would one go about this?

So as the title says, this isn’t specifically a Netlify issue, but I am hoping some of the bright minds on here might be able to help a beginner out with this problem.

I am building a site that at the moment gets its data from a WordPress/Gravity Forms API. The API is limited to a maximum of 100 results per page, and eventually there will be 6-7 pages. I need ALL the results to build the data table and charts that the site needs. The front end is being built in React. Here is the current work in progress:

https://investors-vc.netlify.app/

So currently, each visit by a user triggers (eventually) these 6-7 API calls, and the app builds the front end from the received data. However, the data will rarely change, so a much better option would be a script that once a day makes the calls to the API, builds a file from the returned data, uploads it to the repo the site is built from, and then has my site reference that static file.

Thing is, I am still a bit new to this, so whilst I am sure this is possible, I am not really sure how to go about it. If someone here could give me some pointers on where to start, that would be great.

Alternatively, if someone could build the script for me and get it all set up, I would be happy to pay them a fair price for their time.

Thanks…

Hi @Paul_Haze,

This is definitely possible. While I can’t write an entire build script for you, I can still give you some pointers that might help you get going.

Considering you’re using GitHub as your repo host, you could make use of GitHub Actions to achieve this. The workflow would be something like:

You make some changes locally → You push them to GitHub with [skip netlify] anywhere in the commit message [1] → A GitHub Action is triggered → You run a Node.js script in that action which does the job of fetching the data and storing it as a JSON file [2] → You commit the file to the repo from within the action → Netlify builds from that commit. (A sketch of what the workflow file could look like follows the script below.)

[1] I suggested [skip netlify] anywhere in the commit message so that Netlify doesn't build for that commit, as it would be a waste of a build.

[2] The Node.js script can look something like:

const fetch = require('node-fetch')
const fs = require('fs')

const fetchURLs = ['url1', 'url2']

// Fetches one API page and resolves with the parsed JSON,
// rejecting with the HTTP status code on a non-OK response
function callApi(url) {
  return fetch(url).then(response => {
    if (response.ok) {
      return response.json()
    }
    throw response.status
  })
}

// Fetch all pages in parallel, and only write the file once every
// request has resolved (otherwise the file would be written while still empty)
Promise.all(fetchURLs.map(callApi))
  .then(fileData => {
    fs.writeFileSync('file-name.json', JSON.stringify(fileData))
  })
  .catch(error => {
    console.error('Failed to fetch API data:', error)
    process.exit(1)
  })

I haven't tested it, but it might work. Note that you'd have to rewrite a bit of your front-end logic to handle the change in data structure.
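As for the Action itself, the workflow file could look something like this. This is a rough, untested sketch: the file name, branch, script name, and cron schedule are all placeholders you'd adapt (I've added a schedule trigger since you mentioned wanting a daily run):

# .github/workflows/fetch-data.yml (the file name is illustrative)
name: Fetch API data

on:
  push:
    branches: [main]      # runs on your pushes, per the flow above
  schedule:
    - cron: '0 0 * * *'   # also run once a day for the data refresh

jobs:
  fetch:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - run: npm install node-fetch
      - run: node fetch-data.js   # the script from [2], saved as fetch-data.js here
      - run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add file-name.json
          # no [skip netlify] on this commit: it's the one Netlify should build from
          git diff --staged --quiet || git commit -m "Update API data"
          git push

With actions/checkout@v2, the checkout step keeps the workflow's credentials in place by default, so the git push at the end should work without any extra token setup.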


Hi Hrishikesh,

Thanks a lot for replying. This definitely sounds like a workable approach. I'll go have a play around with it now…

EDIT: Just thinking about it, there probably won't be any changes to the main build of the site; it just needs the JSON file updated, as the site internals will be referencing that, and the main UI and whatnot will stay the same. As such, I don't think I need to trigger a rebuild at all, would I? So perhaps the GitHub Action should commit with [skip netlify] as well?

If the GitHub Action commits with [skip netlify], the JSON file won't be added to the build, but you can definitely load it via GitHub's raw URL (so, one static URL in the front end). That might not work for private repos, though.
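For example, the front end could then fetch the file straight from the raw URL and flatten the pages back into one list. Again just a sketch: the user/repo/branch path is a placeholder, and it assumes each page saved by the script is itself an array of entries:

// One request for the pre-built file instead of 6-7 live API calls.
// Replace <user>/<repo>/main with your own repo details.
fetch('https://raw.githubusercontent.com/<user>/<repo>/main/file-name.json')
  .then(response => response.json())
  .then(pages => {
    // the script saved an array of pages, so flatten it into a single list
    const allResults = pages.flat()
    // ...build the table and charts from allResults
  })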