refactor: decrease max_page_size to 500
- Fetching the heaviest endpoint (TH2) with page_size=1000 took almost
  20 seconds and returned a 149 MB response (tracker - StreamExpress
  2024A - 378294 - PixelPhase1/Tracks/PXForward/clusterposition_xy_ontrack_PXDisk_-1).
  To stay under the 30-second limit, page_size=500 is workable for the
  same parameters (76.6 MB, 13 seconds).
  • Loading branch information
gabrielmscampos committed Nov 21, 2024
1 parent 5732746 commit 97c9550
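
The timing trade-off above comes from how clients walk the cursor-paginated API. Below is a minimal client sketch, assuming DRF's default CursorPagination response envelope ("results" plus a "next" link); the base URL and endpoint path are hypothetical placeholders, and only the page_size and next_token parameter names come from the diff that follows.

# Hedged sketch: base URL and endpoint path are hypothetical; the response
# shape assumes DRF's default CursorPagination envelope.
import requests

BASE_URL = "https://example.invalid/api/v1"  # hypothetical placeholder


def fetch_all(endpoint, params):
    """Collect every page by following the 'next' cursor links."""
    results = []
    url = f"{BASE_URL}/{endpoint}/"
    # Ask for the new maximum; anything larger is capped server-side.
    query = {**params, "page_size": 500}
    while url:
        resp = requests.get(url, params=query, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        results.extend(payload["results"])
        url = payload.get("next")  # absolute URL that already embeds next_token
        query = None               # later requests reuse the params baked into 'next'
    return results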
Showing 2 changed files with 2 additions and 2 deletions.
backend/utils/pagination.py (1 addition, 1 deletion)
@@ -19,7 +19,7 @@ class DynamicMultiOrderingCursorPagination(CursorPagination):
     cursor_query_param = "next_token"
     ordering_param = "order_by"
     page_size_query_param = "page_size"
-    max_page_size = 1000
+    max_page_size = 500

     def get_ordering(self, request, queryset, view):
         default_order_by = list(queryset.query.order_by)
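
For context, a sketch of how a pagination class like this is typically wired into a view; the viewset, model, and serializer names are hypothetical, and the import path assumes the package layout implied by backend/utils/pagination.py. DRF's CursorPagination caps any client-supplied page_size at max_page_size, so a request asking for 1000 now silently receives 500 rows per page.

# Hedged sketch: TH2 model, serializer, and viewset names are hypothetical.
from rest_framework import viewsets

from utils.pagination import DynamicMultiOrderingCursorPagination  # import path assumed
from .models import TH2
from .serializers import TH2Serializer


class TH2ViewSet(viewsets.ReadOnlyModelViewSet):
    queryset = TH2.objects.all()
    serializer_class = TH2Serializer
    pagination_class = DynamicMultiOrderingCursorPagination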
frontend/src/views/browser/index.jsx (1 addition, 1 deletion)
@@ -13,7 +13,7 @@ import API from '../../services/api'
 import { buildTree } from '../../utils/ops'

 const Browser = () => {
-  const defaultPageSize = 1000
+  const defaultPageSize = 500

   const [isLoadingDatasets, setIsLoadingDatasets] = useState(true)
   const [datasets, setDatasets] = useState()
