Clocking in thousands of read requests with a single database and two collections

Yesterday, I created a database, a sub-database, and two collections within that sub-database. My collections do not contain any documents; I only have one index and one function, and I have used my database for testing purposes for around two hours at most. There is no way I could have gathered more than a few hundred read requests by simply browsing around my database, let alone thousands.

I made a similar post over a year ago as well, where I had the exact same issue: disproportionately many requests being registered relative to the number of interactions I'd had with my database.


Hi @vxern. There was an issue last year that caused the Dashboard to refetch data more often than necessary: it refetched whenever a tab lost and regained focus. That has since been resolved.

We have a helpdesk article that walks through how dashboard usage incurs operations. Since the dashboard always shows you the latest information, it needs to query the database every time you navigate to a new page on the site.

Rules of thumb for dashboard operations

  • Every time you visit the Home page, expect 48 Read Ops
  • Every time you visit a DB Overview page, expect 32 + [# of Collections] + [# of Indexes] Read Ops
  • Every time you visit the Collection details page, expect 8 + [# of Documents] Read Ops
  • Every time you click the Collections nav button, expect 8 + [# of Collections] Read Ops, plus the cost of loading the first Collection’s details.
  • Functions, Keys, and Roles are also plain old Documents under the hood, so browsing those pages incurs read costs as well.
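To make the arithmetic concrete, here is a minimal sketch (in Python, with hypothetical helper names; the constants simply mirror the rules of thumb above and may change as the dashboard evolves) estimating the reads incurred by a browsing session like the one described in the question:

```python
# Hypothetical cost estimator based on the rules of thumb above.
# The constants mirror the list; actual dashboard costs may differ.

def home_page_ops() -> int:
    return 48

def db_overview_ops(num_collections: int, num_indexes: int) -> int:
    return 32 + num_collections + num_indexes

def collection_details_ops(num_documents: int) -> int:
    return 8 + num_documents

def collections_nav_ops(num_collections: int, first_collection_docs: int) -> int:
    # Clicking the Collections nav button also loads the first
    # collection's details page.
    return 8 + num_collections + collection_details_ops(first_collection_docs)

# Setup from the question: two empty collections and one index.
per_loop = (
    home_page_ops()                                      # 48
    + db_overview_ops(num_collections=2, num_indexes=1)  # 32 + 2 + 1 = 35
    + collections_nav_ops(num_collections=2,
                          first_collection_docs=0)       # 8 + 2 + 8 + 0 = 18
)
print(per_loop)  # 101
```

Even with empty collections, one pass through Home → DB Overview → Collections costs roughly 100 reads under these assumptions, so a couple dozen page loads over two hours of browsing can plausibly add up to a few thousand read ops.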

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.