Just started using Redwood last week, and I’m absolutely loving it!
I’ve created a few scripts with `yarn rw g script` and run them locally with `yarn rw exec`. I’ve also deployed my Redwood app using Vercel. Is there a way to run these scripts in “production” (i.e., on a Vercel deployment)?
For context, I’m coming from a Rails background, so I’m used to “serverful” deployments, where I can connect to the production server and run rake tasks there. I’m still wrapping my head around serverless architecture, so I’m not sure what the serverless equivalent of running a rake task would be.
Yeah it’s tough getting your head around things you used to do on a server when there are no more servers! I still struggle with this as I was a Rails dev for 12 years before starting to work on Redwood.
I’m not familiar with Vercel, but on other deploy targets like Netlify and Render, there’s a setting on their side that lists the deploy command(s) that are going to be run when you push to GitHub.
So if the default is `yarn rw deploy vercel`, you can add your script command in there so it runs on the next deploy: `yarn rw deploy vercel && yarn rw exec myScript`. After that deploy you can revert the command back to the default, `yarn rw deploy vercel`.
I know it’s weird… I wish we could just do something like `yarn rw exec myScript -e production` and have it know how to connect to a remote server somewhere and run the command (we do this in Rails all the time!).
Thanks so much for the quick reply, Rob. I wasn’t expecting to get a response within minutes, much less from a co-founder!
Including the script in the build command worked for me. In case it’s helpful to future readers, here’s what I did:
- Updated the build command for my Vercel project using the “override” option described here
- Found my most recent deployment on the Vercel dashboard, clicked the vertical ellipsis button, and then clicked “Redeploy”, checking the “Include build cache” checkbox
- Reverted the build command to the default by turning off the “override” option
Great to know that I can use the build step to run arbitrary scripts!
Hi @Isaac - I also come from a Rails background, and you can run “rake-like” tasks in serverless with Netlify as part of your build process. That time will count against your “build minutes” allotment.
But once deployed, your SPA is served up from a CDN and your api runs on lambdas that spin up and down – and typically can only run for 10 seconds.
And there’s no dyno running or console you can spin up and get into, like you would with, say, Heroku.
Now, Netlify does have long-running background functions that can run for up to 15 minutes.
You can invoke those via a REST endpoint, curl, or webhooks – just be sure to secure them with a secret or other signature to prevent undesired use. All serverless functions are open API endpoints.
I think the best thing to ask is: what type of tasks do you envision running?
If it is part of deployment, then this is a build-time task and then it’s fine to run those tasks as part of deploy.
The tasks I need to run would upload data from an external source to the Redwood app’s database or process existing data within the database.
- Download a CSV of books and use the data within it to create new book records
- Extract keywords from the book data, create keyword records in the database, and associate each book record with the relevant keywords
- For each keyword, cache the number of associated books
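For what it’s worth, the first and third tasks can be sketched as plain functions before wiring them into a Redwood script. Here’s a minimal, database-free sketch – the CSV columns, the naive parsing, and the “keywords are title words” rule are all made-up assumptions:

```javascript
// Parse a tiny CSV of books (no quoting/escaping handled – kept naive for brevity).
function parseBooksCsv(csv) {
  const [header, ...rows] = csv.trim().split('\n')
  const cols = header.split(',')
  return rows.map((row) => {
    const values = row.split(',')
    return Object.fromEntries(cols.map((col, i) => [col, values[i]]))
  })
}

// Stand-in keyword rule: lowercase words from the title.
function extractKeywords(book) {
  return book.title.toLowerCase().split(/\s+/)
}

// For each keyword, cache the number of associated books.
function countBooksPerKeyword(books) {
  const counts = {}
  for (const book of books) {
    for (const kw of new Set(extractKeywords(book))) {
      counts[kw] = (counts[kw] || 0) + 1
    }
  }
  return counts
}
```

In a real script each step would persist records via Prisma and run in small batches so each invocation stays inside the function time limit.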
Since these tasks aren’t really related to deployment, running them as part of the build step probably isn’t ideal. Unfortunately, as far as I can tell, Vercel doesn’t offer an equivalent to Netlify’s background functions, so I’ve stuck with using the build step for now. I could try using “regular” serverless functions, but since Vercel limits execution time to 5-30 seconds depending on your plan, I’d have to limit each task to a small batch of records.
I’ll keep thinking about these options and whether a different deploy target would serve me better. Thanks for the detailed reply, @dthyresson!
I have similar needs. For now I’m just running my scripts locally, but point them to my prod DB.
The plan is to turn them into serverful functions and have some kind of cron job service call them.
Ah, I hadn’t thought of that solution. Thanks for sharing!
Scheduled tasks, jobs, and job runners are the missing piece (well, one missing piece) in a fuller Jamstack stack – and trying to do them in a serverless world is not nearly as straightforward as in the Rails-and-Heroku world with rake tasks, Heroku Scheduler, Sidekiq/Redis/cron, etc. Also, CSV “files” are tough because in serverless there’s no filesystem (well, same issue with Heroku).
As @Tobbe said, I have done these in the past with some success using cron, a signed webhook, and a long-running background Netlify function. I have used these to fetch data from third-party APIs and persist info on a recurring basis, and also to run data processing jobs (article texts and some NLP). But they have some drawbacks: these functions can run up to 15 minutes but are asynchronous, so retries and error handling all have to be done manually (i.e., Sidekiq does that for you).
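Since retries have to be hand-rolled in that setup, here’s a minimal sketch of the shape that takes. It’s deliberately synchronous for brevity – a real background function would `await` the job and sleep between attempts, and the names are assumptions:

```javascript
// Run a job, retrying up to `maxAttempts` times before giving up.
// Synchronous sketch; real code would be async with backoff delays between tries.
function withRetries(job, maxAttempts = 3) {
  let lastError
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return job(attempt)
    } catch (err) {
      lastError = err // could log and back off here
    }
  }
  throw lastError
}
```

Sidekiq gives you this (plus a dead set of failed jobs you can inspect and re-run) for free; in a bare serverless function you own all of it.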
Some options I can think of are:
- pg_cron + pg_net with Supabase for cron-scheduled jobs and webhooks. Signing the webhooks is tricky, but expect some DX improvements here in the near future for signing with secrets.
- Faktory, from the makers of Sidekiq. It can run RW scripts, but you need a server to host Faktory and communicate with it. And you’ll still need a cron server if you want jobs run periodically. Could consider Render here. But then, you are serverful.
- Do all the work in Postgres via Supabase as a PG function to fetch, load, and process the data. Can also wire it up with a cron schedule.
I see 2022 as the year Redwood can focus and try to solve this – and some of the other remaining Jamstack gaps. It’s definitely something that is lacking in the space.