Add support for pagination at table level. #1244

Merged
Commits (18)
99551fc
feat(postgres): ✨ add transfer pagination support
DavideSegullo Mar 26, 2024
5671955
test(postgres): ✅ add transfer pagination test
DavideSegullo Mar 26, 2024
9794af7
feat(comlink): ✨ add pagination for getTransfers
DavideSegullo Mar 26, 2024
d958396
test(ender): ✅ update transfer handler test
DavideSegullo Mar 26, 2024
bcc7085
feat(postgres): ✨ add fill pagination support
DavideSegullo Mar 26, 2024
4dac677
test(postgres): ✅ add fill pagination test
DavideSegullo Mar 26, 2024
635dcd6
feat(comlink): ✨ add pagination for getTrades and getFills
DavideSegullo Mar 26, 2024
f4ca7ec
test: ✅ update pagination tests
DavideSegullo Mar 26, 2024
0d14f11
feat(postgres): 🏷️ add explicit types
DavideSegullo Mar 27, 2024
15bcf49
feat(postgres): ✨ add order pagination support
DavideSegullo Apr 5, 2024
34c4292
test(postgres): ✅ add order pagination test
DavideSegullo Apr 5, 2024
9b37c5a
feat: ♻️ update order findAll usage
DavideSegullo Apr 5, 2024
6bea2ce
revert: ⏪ revert add pagination for getTransfers
DavideSegullo Apr 5, 2024
5e21036
revert: ⏪ revert add pagination for getTrades and getFills
DavideSegullo Apr 5, 2024
d1ee4e0
feat: 🏷️ add explicit ts types
DavideSegullo Apr 6, 2024
f197ad1
docs: 💡 add pagination todo comment
DavideSegullo Apr 10, 2024
8f971ab
Merge branch 'main' into DavideSegullo/feat-support_pagination
DavideSegullo Apr 11, 2024
e8bbace
docs(postgres): 💡 improve pagination comment
DavideSegullo Apr 11, 2024
67 changes: 60 additions & 7 deletions indexer/packages/postgres/__tests__/stores/fill-table.test.ts
@@ -71,7 +71,7 @@ describe('Fill store', () => {
FillTable.create(defaultFill),
]);

- const fills: FillFromDatabase[] = await FillTable.findAll({}, [], {});
+ const { results: fills } = await FillTable.findAll({}, [], {});

expect(fills.length).toEqual(2);
expect(fills[0]).toEqual(expect.objectContaining(defaultFill));
@@ -91,7 +91,7 @@ describe('Fill store', () => {
}),
]);

- const fills: FillFromDatabase[] = await FillTable.findAll({}, [], {
+ const { results: fills } = await FillTable.findAll({}, [], {
orderBy: [[FillColumns.eventId, Ordering.DESC]],
});

@@ -103,6 +103,59 @@
expect(fills[1]).toEqual(expect.objectContaining(defaultFill));
});

it('Successfully finds fills using pagination', async () => {
await Promise.all([
FillTable.create(defaultFill),
FillTable.create({
...defaultFill,
eventId: defaultTendermintEventId2,
}),
]);

const responsePageOne = await FillTable.findAll({
page: 1,
limit: 1,
}, [], {
orderBy: [[FillColumns.eventId, Ordering.DESC]],
});

expect(responsePageOne.results.length).toEqual(1);
expect(responsePageOne.results[0]).toEqual(expect.objectContaining({
...defaultFill,
eventId: defaultTendermintEventId2,
}));
expect(responsePageOne.offset).toEqual(0);
expect(responsePageOne.total).toEqual(2);

const responsePageTwo = await FillTable.findAll({
page: 2,
limit: 1,
}, [], {
orderBy: [[FillColumns.eventId, Ordering.DESC]],
});

expect(responsePageTwo.results.length).toEqual(1);
expect(responsePageTwo.results[0]).toEqual(expect.objectContaining(defaultFill));
expect(responsePageTwo.offset).toEqual(1);
expect(responsePageTwo.total).toEqual(2);

const responsePageAllPages = await FillTable.findAll({
page: 1,
limit: 2,
}, [], {
orderBy: [[FillColumns.eventId, Ordering.DESC]],
});

expect(responsePageAllPages.results.length).toEqual(2);
expect(responsePageAllPages.results[0]).toEqual(expect.objectContaining({
...defaultFill,
eventId: defaultTendermintEventId2,
}));
expect(responsePageAllPages.results[1]).toEqual(expect.objectContaining(defaultFill));
expect(responsePageAllPages.offset).toEqual(0);
expect(responsePageAllPages.total).toEqual(2);
});
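The paginated contract exercised by the test above (results, offset, total) lends itself to a simple client-side loop that walks pages until a short page or the reported total is reached. A minimal standalone sketch with an in-memory stand-in for the store (fetchAllPages, Page, and find are hypothetical names, not part of the PR):

```typescript
type Page<T> = { results: T[]; offset?: number; total?: number };

// Collect every row by requesting consecutive pages until a short page
// arrives or the reported total has been gathered.
function fetchAllPages<T>(
  find: (page: number, limit: number) => Page<T>,
  limit: number,
): T[] {
  const all: T[] = [];
  for (let page = 1; ; page += 1) {
    const { results, total } = find(page, limit);
    all.push(...results);
    if (results.length < limit || (total !== undefined && all.length >= total)) {
      break;
    }
  }
  return all;
}

// In-memory stand-in for a paginated findAll.
const rows = [1, 2, 3, 4, 5];
const find = (page: number, limit: number): Page<number> => ({
  results: rows.slice((page - 1) * limit, page * limit),
  total: rows.length,
});

console.log(fetchAllPages(find, 2)); // [1, 2, 3, 4, 5]
```

The total check is a fast exit; the short-page check alone would also terminate one request later.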

it('Successfully finds Fill with eventId', async () => {
await Promise.all([
FillTable.create(defaultFill),
@@ -112,7 +165,7 @@
}),
]);

- const fills: FillFromDatabase[] = await FillTable.findAll(
+ const { results: fills } = await FillTable.findAll(
{
eventId: defaultFill.eventId,
},
@@ -134,7 +187,7 @@
) => {
await FillTable.create(defaultFill);

- const fills: FillFromDatabase[] = await FillTable.findAll(
+ const { results: fills } = await FillTable.findAll(
{
createdBeforeOrAt: createdDateTime.plus({ seconds: deltaSeconds }).toISO(),
},
@@ -155,7 +208,7 @@
) => {
await FillTable.create(defaultFill);

- const fills: FillFromDatabase[] = await FillTable.findAll(
+ const { results: fills } = await FillTable.findAll(
{
createdBeforeOrAtHeight: Big(createdHeight).plus(deltaBlocks).toFixed(),
},
@@ -177,7 +230,7 @@
) => {
await FillTable.create(defaultFill);

- const fills: FillFromDatabase[] = await FillTable.findAll(
+ const { results: fills } = await FillTable.findAll(
{
createdOnOrAfter: createdDateTime.minus({ seconds: deltaSeconds }).toISO(),
},
@@ -199,7 +252,7 @@
) => {
await FillTable.create(defaultFill);

- const fills: FillFromDatabase[] = await FillTable.findAll(
+ const { results: fills } = await FillTable.findAll(
{
createdOnOrAfterHeight: Big(createdHeight).minus(deltaBlocks).toFixed(),
},
52 changes: 48 additions & 4 deletions indexer/packages/postgres/__tests__/stores/order-table.test.ts
@@ -49,7 +49,7 @@ describe('Order store', () => {
it('Successfully creates an Order with goodTilBlockTime', async () => {
await OrderTable.create(defaultOrderGoodTilBlockTime);

- const orders: OrderFromDatabase[] = await OrderTable.findAll({}, [], {});
+ const { results: orders } = await OrderTable.findAll({}, [], {});

expect(orders).toHaveLength(1);
expect(orders[0]).toEqual(expect.objectContaining({
@@ -67,7 +67,7 @@ describe('Order store', () => {
}),
]);

- const orders: OrderFromDatabase[] = await OrderTable.findAll({}, [], {
+ const { results: orders } = await OrderTable.findAll({}, [], {
orderBy: [[OrderColumns.clientId, Ordering.ASC]],
});

@@ -79,6 +79,50 @@
}));
});

it('Successfully finds all Orders using pagination', async () => {
await Promise.all([
OrderTable.create(defaultOrder),
OrderTable.create({
...defaultOrder,
clientId: '2',
}),
]);

const responsePageOne = await OrderTable.findAll({ page: 1, limit: 1 }, [], {
orderBy: [[OrderColumns.clientId, Ordering.ASC]],
});

expect(responsePageOne.results.length).toEqual(1);
expect(responsePageOne.results[0]).toEqual(expect.objectContaining(defaultOrder));
expect(responsePageOne.offset).toEqual(0);
expect(responsePageOne.total).toEqual(2);

const responsePageTwo = await OrderTable.findAll({ page: 2, limit: 1 }, [], {
orderBy: [[OrderColumns.clientId, Ordering.ASC]],
});

expect(responsePageTwo.results.length).toEqual(1);
expect(responsePageTwo.results[0]).toEqual(expect.objectContaining({
...defaultOrder,
clientId: '2',
}));
expect(responsePageTwo.offset).toEqual(1);
expect(responsePageTwo.total).toEqual(2);

const responsePageAllPages = await OrderTable.findAll({ page: 1, limit: 2 }, [], {
orderBy: [[OrderColumns.clientId, Ordering.ASC]],
});

expect(responsePageAllPages.results.length).toEqual(2);
expect(responsePageAllPages.results[0]).toEqual(expect.objectContaining(defaultOrder));
expect(responsePageAllPages.results[1]).toEqual(expect.objectContaining({
...defaultOrder,
clientId: '2',
}));
expect(responsePageAllPages.offset).toEqual(0);
expect(responsePageAllPages.total).toEqual(2);
});

it('findOpenLongTermOrConditionalOrders', async () => {
await Promise.all([
OrderTable.create(defaultOrder),
@@ -106,7 +150,7 @@ describe('Order store', () => {
}),
]);

- const orders: OrderFromDatabase[] = await OrderTable.findAll(
+ const { results: orders } = await OrderTable.findAll(
{
clientId: '1',
},
@@ -202,7 +246,7 @@ describe('Order store', () => {
OrderTable.create(defaultOrderGoodTilBlockTime),
]);

- const orders: OrderFromDatabase[] = await OrderTable.findAll(
+ const { results: orders } = await OrderTable.findAll(
filter,
[],
{ readReplica: true },
indexer/packages/postgres/__tests__/stores/transfer-table.test.ts
@@ -115,7 +115,7 @@ describe('Transfer store', () => {
TransferTable.create(transfer2),
]);

- const transfers: TransferFromDatabase[] = await TransferTable.findAllToOrFromSubaccountId(
+ const { results: transfers } = await TransferTable.findAllToOrFromSubaccountId(
{ subaccountId: [defaultSubaccountId] },
[], {
orderBy: [[TransferColumns.id, Ordering.ASC]],
Expand All @@ -142,7 +142,7 @@ describe('Transfer store', () => {
TransferTable.create(transfer2),
]);

- const transfers: TransferFromDatabase[] = await TransferTable.findAllToOrFromSubaccountId(
+ const { results: transfers } = await TransferTable.findAllToOrFromSubaccountId(
{
subaccountId: [defaultSubaccountId],
eventId: [defaultTendermintEventId],
@@ -155,6 +155,57 @@
expect(transfers[0]).toEqual(expect.objectContaining(defaultTransfer));
});

it('Successfully finds all transfers to and from subaccount using pagination', async () => {
const transfer2: TransferCreateObject = {
senderSubaccountId: defaultSubaccountId2,
recipientSubaccountId: defaultSubaccountId,
assetId: defaultAsset2.id,
size: '5',
eventId: defaultTendermintEventId2,
transactionHash: '', // TODO: Add a real transaction Hash
createdAt: createdDateTime.toISO(),
createdAtHeight: createdHeight,
};
await Promise.all([
TransferTable.create(defaultTransfer),
TransferTable.create(transfer2),
]);

const responsePageOne = await TransferTable.findAllToOrFromSubaccountId(
{ subaccountId: [defaultSubaccountId], page: 1, limit: 1 },
[], {
orderBy: [[TransferColumns.id, Ordering.ASC]],
});

expect(responsePageOne.results.length).toEqual(1);
expect(responsePageOne.results[0]).toEqual(expect.objectContaining(defaultTransfer));
expect(responsePageOne.offset).toEqual(0);
expect(responsePageOne.total).toEqual(2);

const responsePageTwo = await TransferTable.findAllToOrFromSubaccountId(
{ subaccountId: [defaultSubaccountId], page: 2, limit: 1 },
[], {
orderBy: [[TransferColumns.id, Ordering.ASC]],
});

expect(responsePageTwo.results.length).toEqual(1);
expect(responsePageTwo.results[0]).toEqual(expect.objectContaining(transfer2));
expect(responsePageTwo.offset).toEqual(1);
expect(responsePageTwo.total).toEqual(2);

const responsePageAllPages = await TransferTable.findAllToOrFromSubaccountId(
{ subaccountId: [defaultSubaccountId], page: 1, limit: 2 },
[], {
orderBy: [[TransferColumns.id, Ordering.ASC]],
});

expect(responsePageAllPages.results.length).toEqual(2);
expect(responsePageAllPages.results[0]).toEqual(expect.objectContaining(defaultTransfer));
expect(responsePageAllPages.results[1]).toEqual(expect.objectContaining(transfer2));
expect(responsePageAllPages.offset).toEqual(0);
expect(responsePageAllPages.total).toEqual(2);
});

it('Successfully finds Transfer with eventId', async () => {
await Promise.all([
TransferTable.create(defaultTransfer),
@@ -234,7 +285,7 @@ describe('Transfer store', () => {
TransferTable.create(transfer2),
]);

- const transfers: TransferFromDatabase[] = await TransferTable.findAllToOrFromSubaccountId(
+ const { results: transfers } = await TransferTable.findAllToOrFromSubaccountId(
{
subaccountId: [defaultSubaccountId],
createdBeforeOrAt: '2000-05-25T00:00:00.000Z',
37 changes: 34 additions & 3 deletions indexer/packages/postgres/src/stores/fill-table.ts
@@ -25,6 +25,7 @@ import {
CostOfFills,
QueryableField,
QueryConfig,
PaginationFromDatabase,
} from '../types';

export function uuid(eventId: Buffer, liquidity: Liquidity): string {
@@ -49,10 +50,11 @@ export async function findAll(
createdOnOrAfter,
clientMetadata,
fee,
page,
Review comment (Contributor): Ideally, we'd have a check that if page is set, then orderBy must contain eventId. Can you add a TODO for now?
Author reply (DavideSegullo): Done!
}: FillQueryConfig,
requiredFields: QueryableField[],
options: Options = DEFAULT_POSTGRES_OPTIONS,
- ): Promise<FillFromDatabase[]> {
+ ): Promise<PaginationFromDatabase<FillFromDatabase>> {
verifyAllRequiredFields(
{
limit,
@@ -156,11 +158,40 @@ export async function findAll(
Ordering.DESC,
);

- if (limit !== undefined) {
+ if (limit !== undefined && page === undefined) {
Review comment (Contributor): Ensure that the limit is only applied when page is not provided to avoid unintended data truncation. Suggested change:
- if (limit !== undefined && page === undefined) {
+ if (limit !== undefined && page == null) {

baseQuery = baseQuery.limit(limit);
}
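On the loose-equality suggestion above: in TypeScript, page == null is true for both null and undefined, while page === undefined misses null. A tiny standalone check (hypothetical values, not PR code):

```typescript
// Values a `page` parameter might realistically carry.
const pages: (number | null | undefined)[] = [undefined, null, 3];

const loose = pages.filter((p) => p == null);        // matches undefined and null
const strict = pages.filter((p) => p === undefined); // matches undefined only

console.log(loose.length, strict.length); // 2 1
```

Whether null can actually reach this code path depends on the FillQueryConfig type; if page is typed as number | undefined, the two checks behave identically.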

- return baseQuery.returning('*');
/**
* If a query is made using a page number, then the limit property is used as 'page limit'
*/
if (page !== undefined && limit !== undefined) {
/**
* We make sure that the page number is always >= 1
*/
const currentPage: number = Math.max(1, page);
const offset: number = (currentPage - 1) * limit;

/**
* We need to remove the sorting as it is not necessary in this case.
* Also a casting of the ts type is required since the infer of the type
* obtained from the count is not performed.
*/
const count: { count?: string } = await baseQuery.clone().clearOrder().count({ count: '*' }).first() as unknown as { count?: string };
Review comment (Contributor): The comment about removing sorting is misleading, because the sorting is actually necessary for consistent pagination results. Recommend rephrasing or removing this comment to avoid confusion. Suggested wording:
- * We need to remove the sorting as it is not necessary in this case.
+ * Ensure sorting is applied to maintain consistent pagination results.


baseQuery = baseQuery.offset(offset).limit(limit);
Review comment (vincentwschau, Mar 28, 2024): Question: How does this logic interact with additional items being added that are higher in the sort order than the items returned? E.g. I want fills created after height X in descending order, with a page size of 10. I get page 2, which holds the 10th to 20th most recent fills. If 10 new fills are then added and I fetch page 3, would that be the same set of fills I got for page 2?

Author reply (DavideSegullo): To address your example: yes, you are right. We use offset-based pagination to improve UX through parallel data retrieval, accepting the potential for inconsistencies when rows are inserted between queries. Offset-based pagination provides fast access to any data segment, but it can return duplicate entries if data is added between two queries. Cursor-based pagination offers better data consistency by using a pointer (cursor) to retrieve data sequentially, eliminating duplicates and missed entries, but it lacks parallel retrieval and direct access to specific segments. The request for this feature stems from the need to paginate a user's full trading history on the front end. Given the frequency of data updates relative to incoming requests in our context, we believe offset-based pagination strikes the best balance for our current needs, optimising the user experience while accepting the trade-off in data consistency. In fact, for a user with a very long history, a sequential approach could actually degrade the experience, since each request would have to finish before the next could start. That said, we are open to adapting the approach as requirements evolve.


return {
results: await baseQuery.returning('*'),
limit,
offset,
total: parseInt(count.count ?? '0', 10),
};
}

return {
results: await baseQuery.returning('*'),
};
}
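The trade-off discussed in the review thread can be reproduced with a small standalone sketch (in-memory data and hypothetical helper names, not PR code): with offset pagination, rows inserted ahead of the current position shift later pages, so after ten new inserts page 3 repeats the rows that page 2 held before, whereas a cursor keyed on the last eventId seen is unaffected.

```typescript
type Fill = { eventId: number };

// 25 fills in descending eventId order, newest first.
const fills: Fill[] = Array.from({ length: 25 }, (_, i) => ({ eventId: 25 - i }));

function offsetPage(rows: Fill[], page: number, limit: number): Fill[] {
  const offset = (Math.max(1, page) - 1) * limit;
  return rows.slice(offset, offset + limit);
}

function cursorPage(rows: Fill[], afterEventId: number | null, limit: number): Fill[] {
  // Resume strictly after the last eventId the caller has already seen.
  const idx = afterEventId == null ? 0 : rows.findIndex((r) => r.eventId < afterEventId);
  const start = idx === -1 ? rows.length : idx;
  return rows.slice(start, start + limit);
}

// Page 2 today: the 11th to 20th most recent fills (eventIds 15..6).
const pageTwoBefore = offsetPage(fills, 2, 10);

// Ten newer fills arrive (eventIds 26..35).
const newer: Fill[] = Array.from({ length: 10 }, (_, i) => ({ eventId: 35 - i }));
const grown = [...newer, ...fills];

// Page 3 now returns exactly the rows that page 2 returned before the inserts.
const pageThreeAfter = offsetPage(grown, 3, 10);

// A cursor anchored at the last eventId of the old page 1 still yields the
// same "next 10" rows despite the inserts.
const cursorAfter = cursorPage(grown, 16, 10);
```

Offset pages can be fetched in parallel and allow jumping to an arbitrary segment, at the cost of this drift; a cursor gives stable sequential reads, which matches the trade-off the author describes.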

export async function create(