Workaround for size limit while editing documents using the web dashboard?


I am facing a problem trying to edit a long document using the web dashboard.

The error is pretty generic, but I believe that I reached a certain size limit of the document. This is understandable, as my documents are whole books. One of the books is “The Brothers Karamazov” with 840 pages, for example.

I can start adding the book’s text, chapter by chapter, without an issue. But then, at some point, it throws an error and doesn’t save the changes.

The thing is, I need to add the whole book in that document, and I really wanted to do it using Fauna’s web dashboard so I don’t have to build an application on my own just to do that.

My main question:
Can someone at Fauna confirm that there is a size limit when editing a document using the web dashboard? If so, is there a workaround? Can I get an exception?

For context, the database is used for my site, which delivers classic novels in weekly installments by email. Each day I run through the subscribers and check which book each one is reading and which installment they are on. Then I get the “books” collection, grab the document for that book, find the current installment, and send the email. To be more precise, I have one document per author, but currently I only have one book per author, so it amounts to the same thing.

An example of a document:

"data": {
"authorId": "charlesDickens",
"authorName": "Charles Dickens",
"books": {
  "greatExpectations": {
    "bookId": "greatExpectations",
    "bookName": "Great Expectations",
    "installments": [
        "number": 1,
        "title": "Installment #1",
        "text": "Chapter I.\r\nMy father’s family name being Pirrip, and my Christian name Philip... <looong string with the whole chapter in here>.",
        "note": "",


    <one object for each installment in here, which can be 50 or more>


Thanks for any help!

@soneca Do you know what error you get? We have a limit of 1MB per request.

I just get a “Network Request Failed”.

If the editing request is sending the whole document again (and not just the part that I added), it is probably over 1MB.

Is there a workaround for that?

I am afraid we don’t have an easy workaround. Even if you use a driver instead of the dashboard, your "installments" element is an array, and to update an element of an array you have to provide the entire array; otherwise it is replaced with the provided values.

One way is to change your data model to use two different collections: books and installments. Each book would have an array of references to its installments. This would also work, provided each installment document is less than 1MB.
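For illustration, that split might look something like the sketch below. This is just a rough local mock-up of the shape, not actual Fauna documents; the field name "installmentRefs" is an assumption (in Fauna these would be real Refs to documents in an "installments" collection).

```python
# Hedged sketch of the two-collection model: the book document holds only
# references, and each chapter's text lives in its own small document.
book_doc = {
    "bookId": "greatExpectations",
    "bookName": "Great Expectations",
    # hypothetical field: references to installment documents, not the text
    "installmentRefs": ["installments/1", "installments/2"],
}

installment_doc = {
    "bookId": "greatExpectations",
    "number": 1,
    "title": "Installment #1",
    "text": "Chapter I. ...",  # the long chapter text lives only here
    "note": "",
}
```

With this shape, editing one chapter touches only one small installment document instead of rewriting the whole book.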

Ok, it is a better data model indeed.
But I wonder if I can make it work with a smaller rewrite.

What if I keep one collection, but change the "installments" element from an array to an object? Will that allow me to update only a particular installment (each of which is individually below 1MB) instead of the whole group?

Like this:

"installments": {
    1: {
        "title": "Installment #1",
        "text": "Chapter I.\r\nMy father’s family name being Pirrip, and my Christian name Philip... <looong string with the whole chapter in here>.",
        "note": "",
    2: {...},
    3: {...},

If I update only element "2" of "installments", will it try to substitute the whole "installments" object (as it does with an array), or will it update only the specific "2" object?
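For what it's worth, the merge-versus-replace behavior discussed in the answers below can be sketched locally like this. This is only an imitation of the semantics as described in this thread (objects merge field by field; arrays and scalars are replaced wholesale), not Fauna's actual implementation:

```python
# Rough sketch of Fauna-style Update merge semantics: dicts (objects) are
# merged recursively, while lists (arrays) and scalars are replaced outright.
def fauna_style_merge(existing, patch):
    if isinstance(existing, dict) and isinstance(patch, dict):
        merged = dict(existing)
        for key, value in patch.items():
            merged[key] = fauna_style_merge(existing.get(key), value)
        return merged
    return patch  # arrays and scalars: replaced, not merged

doc = {"installments": {"1": {"title": "Installment #1"}, "2": {"title": "old"}}}
patch = {"installments": {"2": {"title": "Installment #2"}}}
result = fauna_style_merge(doc, patch)
# installment "1" is left untouched; only installment "2" is updated
```

The key point is that with an object, the patch only needs to contain the subtree being changed.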

An object works fine, as it merges instead of replacing, but then you may not be able to create an index on ["data", "installments", "id"] to get a specific installment in a book. Just a tradeoff on what you really want. :slight_smile:

Perfect, I will do it that way then and leave the bigger rewrite for later.
And I don’t currently need to index by installment.

Thanks a lot @Jay!

Hi @Jay, I am sorry to say that it didn’t work! :frowning:

I changed the array to an object and then tried to add the remaining chapters. I got the same error.

Checking the network request payload, it appears to still be doing a replace with the whole document object (not just the "installments" object).

I believe I will have to go with the full rewrite to another collection, indeed.

Just to close the thread, @soneca and I took this offline and resolved it. Update with an object merges into the existing object in the document, as expected.


Yes, just to highlight that I was using the web dashboard GUI to manually edit the content of the document; but that apparently always uses replace, which sends the whole document as the payload (and the payload can be larger than 1MB in some of my cases).

Jay’s solution was to use the shell, so you can write your own query that does an Update with an object. That merges, sending only the last object of the tree that you are changing as the payload.
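To make the payload difference concrete, here is a rough size comparison with synthetic data standing in for real chapters (50 installments of about 20 KB each, roughly matching the numbers mentioned earlier in the thread):

```python
import json

# Sketch: a replace-style edit (what the dashboard editor effectively does)
# sends the whole document, while an object-merge update only needs the
# subtree being changed. Sizes below use fake data for illustration.
full_doc = {
    "books": {
        "greatExpectations": {
            "installments": {str(n): {"text": "x" * 20_000} for n in range(1, 51)}
        }
    }
}
update_payload = {
    "books": {"greatExpectations": {"installments": {"2": {"text": "new chapter"}}}}
}

full_size = len(json.dumps(full_doc).encode("utf-8"))      # around 1 MB
partial_size = len(json.dumps(update_payload).encode("utf-8"))  # tiny
```

Under these assumptions, the full document is right around the 1MB limit, while the partial payload is well under a kilobyte.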

Thanks for all the help, @Jay ! It solved my problem.


Update returns the full document as well. Do you need to write the query to return something else? Or will Fauna return results greater than 1MB?

I may just be confused about what you are referring to as “payload”. Is it the data you send to fauna in the query or the response from the query?

Hi @ptpaterson, sorry, just saw this.

The “payload” that I mention is what I am sending to FaunaDB.

The response apparently can be of any size, as you are correct that Update (as well as Replace) returns the full document, and my full documents may be larger than 1MB.

But I learned that if you do an Update on an object, it performs a merge operation, which only needs the part you are changing (i.e. the last property of the object path) as the payload.