Gatsby-like implementation of SSG using a CMS

I’m fairly new to Redwood; I’ve mainly been using Gatsby recently.
I took a look at prerendering and I’m not totally sure if it meets the needs I have for this project. I would need to be able to statically generate pages using a CMS (with i18n), and also have some dynamic stuff on the frontend that would take advantage of Redwood Cells/API and auth.

Is this something that’s been done with Redwood before? Or would I have to run my own implementation of SSG and translations?

EDIT: I’ve seen a thread about using both Gatsby and Redwood for a project, but the main issue is that even the dynamic pages with Redwood would still have copy/text/forms that are generated by the CMS.

Welcome @davidli3100! My understanding is that right now you can only prerender entirely static pages (no data fetching during build). They are rehydrated client-side in the browser, and from there you can do whatever you want in terms of dynamic data fetching. What you can’t do is build, for example, a static page that lists all the records in a collection, where the collection has to be fetched server(less)-side when the static page is built.
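
For example, a route marked with Redwood’s prerender flag is generated as static HTML at build time, and any Cells on that page still fetch their data in the browser after rehydration. A minimal sketch (the page names are made up):

// web/src/Routes.js (sketch)
import { Router, Route } from '@redwoodjs/router'

const Routes = () => {
  return (
    <Router>
      {/* Prerendered to static HTML at build time; no data fetching happens during the build */}
      <Route path="/about" page={AboutPage} name="about" prerender />
      {/* Regular client-rendered page; its Cells fetch from the api side in the browser */}
      <Route path="/dashboard" page={DashboardPage} name="dashboard" />
      <Route notfound page={NotFoundPage} />
    </Router>
  )
}

export default Routes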

My understanding is also that the prerender package might include dynamic data fetching during static generation at some point in the future. There was a message at one point saying the core team felt it was critical to get right, so it won’t be in the v1 release.

Yeah I suspected that would be the case.

I guess I’ll just download everything from the CMS to a JSON file in a pre-build script and “dynamically” fetch the data that way. Feels a little clunky, but I think Redwood is worth the extra hassle.
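
For the wiring, one option (a sketch: the script path is made up, and the command should match whatever your build setup already runs) is to chain the export script in front of the Redwood build:

# netlify.toml (sketch)
[build]
  command = "node ./scripts/fetch-cms-content.js && yarn rw build"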

Hi @davidli3100 if you are using a CMS like Contentful or anything that has either a graphql or other api and sdk/client, you can just:

  • define your SDL to match the CMS’s model
  • define queries
  • create services but instead of connecting to Prisma, create your CMS client and fetch the data

I’ve done this using Contentful’s SDK, contentful.js (GitHub: contentful/contentful.js), the JavaScript library for Contentful’s Delivery API (node & browser).

Note: If you want to do mutations, check your CMS docs. For Contentful, there are separate delivery (read) and management (write) SDKs, which use different API keys.

import { createClient } from 'contentful'

// Contentful Delivery API client; credentials come from environment variables
const client = createClient({
  space: process.env.CONTENTFUL_SPACE,
  accessToken: process.env.CONTENTFUL_DELIVERY_API_KEY,
})

// Map Contentful assets and entries to the shape the SDL expects
const renderAsset = (fields) => {
  return { title: fields.title, file: fields.file }
}

const renderAssets = (assets) => {
  return assets.map((asset) => renderAsset(asset.fields))
}

const renderEntry = (entry) => {
  return {
    id: entry.sys.id,
    name: entry.fields.name,
    description: entry.fields.description,
    price: entry.fields.price,
    rating: entry.fields.rating,
    slug: entry.fields.slug,
    photos: renderAssets(entry.fields.photos),
  }
}

// Service backing the `cupcakes` query: list entries from Contentful instead of Prisma
export const cupcakes = async () => {
  const response = await client.getEntries({
    content_type: 'cupcake',
    limit: 1000,
    order: 'fields.name',
  })

  return response.items.map((entry) => renderEntry(entry))
}

export const cupcake = async ({ id }) => {
  const entry = await client.getEntry(id)

  return renderEntry(entry)
}

and the SDL:

export const schema = gql`
  type Cupcake {
    id: String!
    name: String!
    description: String!
    price: Float
    rating: Int
    slug: String!
    photos: [ContentfulAsset]
  }
  type Query {
    cupcakes: [Cupcake!]!
    cupcake(id: String!): Cupcake!
  }
`

and, in a separate SDL file, the Contentful asset types:

export const schema = gql`
  type ContentfulDimensions {
    width: Int
    height: Int
  }
  type ContentfulFileDetails {
    size: Int!
    image: ContentfulDimensions
  }
  type ContentfulFile {
    url: String!
    fileName: String!
    contentType: String!
    details: ContentfulFileDetails
  }
  type ContentfulAsset {
    title: String!
    file: ContentfulFile
  }
`
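
On the web side, a Cell can then consume these queries just as if the data came from Prisma. A minimal sketch (the component name and field selection are illustrative):

// web/src/components/CupcakesCell/CupcakesCell.js (hypothetical)
export const QUERY = gql`
  query CupcakesQuery {
    cupcakes {
      id
      name
      slug
      price
    }
  }
`

export const Loading = () => <div>Loading...</div>

export const Empty = () => <div>No cupcakes yet</div>

export const Failure = ({ error }) => <div>Error: {error.message}</div>

export const Success = ({ cupcakes }) => {
  return (
    <ul>
      {cupcakes.map((cupcake) => (
        <li key={cupcake.id}>
          {cupcake.name} ({cupcake.price})
        </li>
      ))}
    </ul>
  )
}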

Ideally, all of this would be done during the build phase and rendered as plain HTML, although I do see where you’re going with it.

I’m assuming using the CMS as a GraphQL data source would mainly be for fetching that data whenever there’s a request from the frontend?

@dthyresson that’s a really great idea. I did it with Hasura just playing around and following a tutorial. Two headless CMS vendors that provide native GraphQL APIs are Sanity and GraphCMS. Sanity has a JS client. I don’t see one for GraphCMS.

You could also do the html rendering in a serverless function and leverage Netlify’s On-Demand Builders

This would be cached until another deploy.

That is what they propose to do for large sites – so that content is not generated at build time.

You’d use a redirect proxy (with a slug or other identifier) to your serverless function that would fetch the CMS content, render the html and return the response.
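
A minimal sketch of that approach using Netlify’s @netlify/functions builder wrapper (the function name, redirect path, and CMS fetch are placeholders):

// netlify/functions/render-page.js (hypothetical) - an On-Demand Builder
// A redirect proxies e.g. /pages/* to /.netlify/functions/render-page/:splat with status 200
const { builder } = require('@netlify/functions')

const handler = async (event) => {
  // The slug arrives as the last path segment via the redirect proxy
  const slug = event.path.split('/').pop()

  // Fetch the CMS content for this slug and render it to html (omitted here)
  const html = `<html><body><h1>${slug}</h1></body></html>`

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/html' },
    body: html,
  }
}

// Wrapping the handler with builder() tells Netlify to cache the rendered response until the next deploy
exports.handler = builder(handler)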

You can totally use a generic GraphQL client like graphql-request as an api client in your services.

Here it talks to Hasura, but it can talk to anything:

import { GraphQLClient } from 'graphql-request'

// Generic helper: send a GraphQL query string to Hasura (or any other GraphQL endpoint)
export const request = async (
  query,
  domain = process.env.HASURA_DOMAIN
) => {
  const endpoint = `https://${domain}/v1/graphql`

  const graphQLClient = new GraphQLClient(endpoint, {
    headers: {
      'x-hasura-admin-secret': process.env.HASURA_KEY,
    },
  })

  try {
    return await graphQLClient.request(query)
  } catch (error) {
    console.log(error)
    return error
  }
}

and here is the SDL for a calendar query:

export const schema = gql`
  type Calendar {
    day: String!
    value: Int!
  }
  type Query {
    dailyStoryCounts: [Calendar!]!
  }
`

and the service that resolves it:

import { request } from 'src/lib/hasuraClient'

export const dailyStoryCounts = async () => {
  // Alias Hasura's `calendar` table and its `date` column to match the SDL above
  const query = `
  {
    dailyStoryCounts: calendar {
      day: date
      value
    }
  }
 `
  const data = await request(query)

  return data['dailyStoryCounts']
}

I’m actually using Sanity; I have a semi-naive solution to this (for anyone else looking):

// Pre-build script: export content from Sanity and write it to a JSON file the web side can import
const SanityClient = require('@sanity/client')
const fs = require('fs')

const client = SanityClient({
  projectId: '',
  dataset: 'production',
  apiVersion: '2021-06-27',
  token: '',
  useCdn: false
})

const query = '*[name == "Test Form"]'
const params = {}
client.fetch(query, params).then((data) => {
  console.log(data[0].sections)
  fs.writeFileSync(`${__dirname}/src/locales/form-test.json`, JSON.stringify(data[0], null, 2))
})

And then you could have a hook that feeds the translations and data back into presentational components for pre-rendering (pretty sure JSON is included in pre-rendering as a static asset?). Gonna take a look at Netlify On-Demand Builders though, it seems promising.
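
Something like this could work as that hook (a sketch: the hook name is made up, and it just reads the JSON file written by the script above):

// web/src/hooks/useCmsContent.js (hypothetical)
// The JSON written by the pre-build script gets bundled as a static asset,
// so everything here is resolved at build time and survives prerendering
import formTest from 'src/locales/form-test.json'

const useCmsContent = () => {
  return {
    name: formTest.name,
    sections: formTest.sections,
  }
}

export default useCmsContent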

Where would you store the generated html pages? How would you route to them?

If you use prerender, then you specify that on a Route and it needs a Page.

Would you create n React Pages?