I am attempting to exclude a directory when running `yarn rw dev`, because it contains hundreds of gigabytes of data, which causes the file watcher to run out of memory shortly after launch. The simplified project structure looks like:
```
projectFolder
├── api
├── node_modules
├── staticFiles
└── web
```
I have attempted to ignore this in the following ways:
```js
/**
 * This file allows you to configure the Fastify Server settings
 * used by the RedwoodJS dev server.
 *
 * It also applies when running the api server with `yarn rw serve`.
 *
 * For the Fastify server options that you can set, see:
 * https://www.fastify.io/docs/latest/Reference/Server/#factory
 *
 * Examples include: logger settings, timeouts, maximum payload limits, and more.
 *
 * Note: This configuration does not apply in a serverless deploy.
 */

/** @type {import('fastify').FastifyServerOptions} */
const config = {
  requestTimeout: 15_000,
  logger: {
    level: process.env.NODE_ENV === 'development' ? 'debug' : 'warn',
  },
  watchOptions: {
    ignored: ['**/staticFiles/**'],
  },
}

module.exports = config
```
But nothing seems to work. Where and how should this be implemented?
Additionally, I would like to increase the amount of memory available to the api server when it runs. On other apps I’ve been able to simply do

```shell
node --max-old-space-size=XXXX app.js
```
But I am not sure how to pass this argument along either.
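One common way to pass Node flags through a CLI wrapper like `yarn rw` is the `NODE_OPTIONS` environment variable, which every Node process reads at startup and which is inherited by child processes. Whether Redwood’s dev server spawns its workers in a way that picks this up is worth verifying, but as a sketch:

```shell
# NODE_OPTIONS is read by each Node process at startup, so flags set here
# also reach processes the Redwood CLI spawns. 8192 MB is just an example.
NODE_OPTIONS="--max-old-space-size=8192" yarn rw dev
```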
It isn’t in a public folder, unless you are telling me that when RedwoodJS runs the dev server, everything inside the entire project is part of the public path? It is at the same level as the node_modules folder inside the project.
It is a large amount of geographical imaging data that the company doesn’t want colocated; it is all being hosted on prem.
I don’t store it in GitHub and it is excluded from my deployment setup.
Do these questions mean my request is not possible with RedwoodJS?
I don’t have an answer for your specific question, but I don’t believe David is saying it’s not possible. It seems he’s trying to figure out what you’re trying to accomplish by doing it this way. It might be better not to put large amounts of data inside the project, but rather keep it on the same server and just point the web server to it.
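As a sketch of that suggestion: the directory could be served by a small standalone process that lives outside the Redwood project, so the dev watcher never sees it. This is an illustration only, not something from the original setup; the `staticFiles` path is an assumption, and in practice nginx or any static file server would do the same job:

```javascript
import http from 'http'
import fs from 'fs'
import path from 'path'

// Assumed location of the data directory; adjust to the real on-prem path.
const ROOT = path.resolve('staticFiles')

const server = http.createServer((req, res) => {
  // Map the URL path onto ROOT; the URL parser normalizes ".." segments,
  // and the startsWith check refuses anything that still escapes ROOT.
  const urlPath = new URL(req.url, 'http://localhost').pathname
  const file = path.resolve(ROOT, '.' + urlPath)
  if (!file.startsWith(ROOT) || !fs.existsSync(file) || !fs.statSync(file).isFile()) {
    res.writeHead(404).end('Not found')
    return
  }
  res.writeHead(200, { 'Content-Type': 'application/octet-stream' })
  // Stream rather than buffer, so memory use stays flat per request.
  fs.createReadStream(file).pipe(res)
})

server.listen(0) // pick any free port for this demo
```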
It isn’t in a public folder, unless you are telling me that when RedwoodJS runs the dev server, everything inside the entire project is part of the public path? It is at the same level as the node_modules folder inside the project.
No, this isn’t the case. Just to clarify that point a bit further, are you referencing the files in the staticFiles folder on the api side or the web side? In the web side’s case, are you importing them into components, pages, etc.?
They are requested from the web side via an API call and streamed to the end user from a custom Redwood function. I don’t have the authentication functionality working yet on the custom function, but the idea is that the user roles will determine what files can be served to them for download. The code in the custom function is:
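The original function code wasn’t reproduced here, but purely as a hypothetical sketch, a Redwood function that serves a file with a basic path-traversal guard might look like the following. The `file` query parameter, the staticFiles location, and the buffered (rather than streamed) read are all assumptions, not the poster’s actual code:

```javascript
// Hypothetical sketch of api/src/functions/download.js — not the original code.
import fs from 'fs'
import path from 'path'

// Assumed: staticFiles sits at the project root.
const STATIC_ROOT = path.resolve('staticFiles')

export const handler = async (event) => {
  const requested = event.queryStringParameters?.file ?? ''
  const fullPath = path.resolve(STATIC_ROOT, requested)

  // Reject anything that resolves outside the static root (path traversal).
  if (!fullPath.startsWith(STATIC_ROOT + path.sep)) {
    return { statusCode: 400, body: 'Invalid path' }
  }
  if (!fs.existsSync(fullPath)) {
    return { statusCode: 404, body: 'Not found' }
  }

  // Redwood functions follow the Lambda handler shape; binary payloads
  // go out base64-encoded with isBase64Encoded set.
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/octet-stream' },
    isBase64Encoded: true,
    body: fs.readFileSync(fullPath).toString('base64'),
  }
}
```

A role check against `context.currentUser` would slot in before the file read once the authentication piece is wired up.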
All of this works as is. Keeping the data in the project was mostly a convenience: it let me confirm my directory traversal was correct while writing code in the IDE, without having to open another window. I’ve got to add some authentication and authorization checks before going live, but that is all that is left.
The only problem I have now is the Node memory settings that I’d like to be able to change.
Luckily the files being served are never more than 1 MB; most are less than 50 KB. In testing it has streamed the data faster than my previous setup using Express with sendFile().
I believe the memory allocated for the API call is per request and released once it completes? If so, I don’t think it will be an issue given how small the files being sent are.