We are considering transferring a large amount of our OLTP traffic from Postgres to Fauna. Fauna’s seamless scaling is very appealing to us since we expect a large spike in our user base, and we already find ourselves fighting memory and CPU issues in Postgres.
The question is: as we accumulate large amounts of data in Fauna, what are our options for exporting a large collection to an external system? For example, Snowflake for BI queries, S3 for long-term storage, Druid for real-time OLAP queries, or a graph database like Neo4j.
When exporting data, I can think of two methods:
CDC, meaning capturing changes in real time. For example, projecting changes onto a Kafka topic.
Dump, meaning downloading large chunks of data in one operation. For example, exporting a collection to one (or several) big JSON files in S3.
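To make the second option concrete, here is a minimal sketch of the kind of cursor-based dump loop we have in mind. Note that `fetch_page` and `write_chunk` are hypothetical stand-ins, not Fauna APIs: `fetch_page` represents whatever paginated read the database exposes, and `write_chunk` represents persisting one chunk (e.g. uploading a JSON-lines file to S3).

```python
import json

def dump_collection(fetch_page, write_chunk, page_size=1000):
    """Drain a collection page by page using an opaque cursor.

    fetch_page(cursor, page_size) -> (documents, next_cursor) is a
    hypothetical stand-in for a paginated read; write_chunk(lines)
    persists one chunk of JSON lines (e.g. to an S3 object).
    """
    cursor = None
    total = 0
    while True:
        docs, cursor = fetch_page(cursor, page_size)
        if docs:
            write_chunk([json.dumps(d) for d in docs])
            total += len(docs)
        if cursor is None:  # no more pages
            break
    return total

# Usage with an in-memory stub standing in for the database:
data = [{"id": i} for i in range(5)]
chunks = []

def fetch_page(cursor, size):
    start = cursor or 0
    page = data[start:start + size]
    nxt = start + size if start + size < len(data) else None
    return page, nxt

total = dump_collection(fetch_page, chunks.append, page_size=2)
# total == 5; chunks holds 3 chunks of JSON lines
```

The important property for us is the opaque cursor: as long as the database hands back a resumable cursor, the export can be chunked, retried, and parallelized without holding a long transaction open.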
Does Fauna have something along those lines planned for the near future?
Not knowing whether we will have an export strategy, should we ever need one, might be a deal breaker for us.