I’m about to open up a can of worms but here goes.
I have built a POC of something cool using Fauna. Hosting is via Vercel, and I’m using the provided integration to store keys for three separate databases that I have as children of a master one (namely dev, test, and prod).
I have helper scripts to upload the GraphQL schema, with a flag to either merge into or override the database. I have also written helpers to upload the UDFs I use as custom resolvers for my GraphQL endpoints, along with the roles pertaining to those functions and general roles for users, admins, etc.
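For context, the schema upload helper is roughly this shape. This is a sketch: the import endpoint and its `mode` query parameter (`merge` | `override` | `replace`) come from Fauna's GraphQL import docs, while the file path, the `FAUNA_SECRET`/`SCHEMA_MODE` env var names, and the error handling are simplified placeholders of my own.

```ts
// uploadSchema.ts — sketch of the schema upload helper (Node 18+, global fetch).
import { readFile } from "node:fs/promises";

type ImportMode = "merge" | "override" | "replace";

async function uploadSchema(schemaPath: string, mode: ImportMode): Promise<void> {
  const schema = await readFile(schemaPath, "utf8");

  // POST the schema text to the GraphQL import endpoint of the target database.
  const res = await fetch(`https://graphql.fauna.com/import?mode=${mode}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.FAUNA_SECRET}`,
      "Content-Type": "text/plain",
    },
    body: schema,
  });

  if (!res.ok) {
    throw new Error(`Schema import failed (${res.status}): ${await res.text()}`);
  }
  console.log(await res.text());
}

uploadSchema("schema.graphql", (process.env.SCHEMA_MODE as ImportMode) ?? "merge")
  .catch((err) => { console.error(err); process.exit(1); });
```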
I have started writing a script that will run all of this automatically against each environment, using the key for the corresponding child database, which is stored as an environment variable in Vercel. All good.
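Picking the key per environment is the easy part and looks something like the sketch below. `VERCEL_ENV` is the variable Vercel sets per deployment ("production" | "preview" | "development"); the `FAUNA_SECRET_*` names are placeholders for whatever the integration stores for each child database.

```ts
// getFaunaSecret.ts — minimal sketch of selecting the key for the current environment.
export function getFaunaSecret(): string {
  const env = process.env.VERCEL_ENV ?? "development";

  const secret =
    env === "production" ? process.env.FAUNA_SECRET_PROD :
    env === "preview"    ? process.env.FAUNA_SECRET_TEST :
                           process.env.FAUNA_SECRET_DEV;

  if (!secret) {
    throw new Error(`No Fauna secret configured for environment "${env}"`);
  }
  return secret;
}
```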
Now the tricky part: both locally and in the cloud, functions are not overwritten with new versions if they already exist. On top of that, the merge schema flag always seems to fail and leaves residual data behind. I could use override instead, but when that hits my prod database it will wipe it out.
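The function problem I can work around by making the UDF upload an upsert: update the function if it exists, otherwise create it. A minimal sketch with the FQL v4 JavaScript driver, where the example function name, body, and role are placeholders:

```ts
// upsertFunction.ts — idempotent UDF deploy sketch using the faunadb (FQL v4) driver.
import faunadb, { query as q } from "faunadb";

const client = new faunadb.Client({ secret: process.env.FAUNA_SECRET! });

async function upsertFunction(name: string, body: object, role: string = "server") {
  return client.query(
    q.If(
      q.Exists(q.Function(name)),
      // Function already exists: overwrite its body and role with the new version.
      q.Update(q.Function(name), { body, role }),
      // First deploy: create it from scratch.
      q.CreateFunction({ name, body, role })
    )
  );
}

// Example: a trivial resolver body (placeholder).
upsertFunction(
  "hello",
  q.Query(q.Lambda("name", q.Concat(["Hello, ", q.Var("name")])))
)
  .then(() => console.log("deployed"))
  .catch(console.error);
```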
I then thought of backing up the database before running the override command to recreate the schema, and hit my next big issue (and the only one that really concerns me): there does not seem to be a modern flow for backing up and restoring databases manually. I know there is a tool for this (unmaintained, and it would cost more to run in the cloud), but what mechanism besides temporality does Fauna provide to enable this?
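The closest thing I can see to doing it manually is exporting the documents myself before running the override, along the lines of the sketch below (the collection name and output file are placeholders, and a restore would need a separate script that re-creates documents from the dump), but this hardly feels like a real backup/restore mechanism:

```ts
// backupCollection.ts — rough sketch: page through a collection and dump every document to JSON.
import faunadb, { query as q } from "faunadb";
import { writeFile } from "node:fs/promises";

const client = new faunadb.Client({ secret: process.env.FAUNA_SECRET! });

async function backupCollection(collection: string): Promise<void> {
  const docs: unknown[] = [];
  let after: unknown = undefined;

  do {
    // Fetch one page of documents, carrying the pagination cursor forward.
    const opts: Record<string, unknown> = { size: 1000 };
    if (after !== undefined) opts.after = after;

    const page: any = await client.query(
      q.Map(
        q.Paginate(q.Documents(q.Collection(collection)), opts),
        q.Lambda("ref", q.Get(q.Var("ref")))
      )
    );

    docs.push(...page.data);
    after = page.after; // cursor for the next page; absent when done
  } while (after);

  await writeFile(`${collection}.backup.json`, JSON.stringify(docs, null, 2));
  console.log(`Saved ${docs.length} documents from ${collection}`);
}

backupCollection("users").catch(console.error);
```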
Maybe I am missing something here, but how can Fauna be integrated into a production DevOps workflow, one that allows quick iteration with proper failsafes?