3 changes: 2 additions & 1 deletion jest.config.ts
@@ -15,5 +15,6 @@ export default {
testEnvironment: 'node',
setupFiles: ['dotenv/config'],
testPathIgnorePatterns: ['/node_modules/', '<rootDir>/dist/', '<rootDir>/test/fixtures/'],
moduleFileExtensions: ['js', 'ts']
moduleFileExtensions: ['js', 'ts'],
verbose: true
Contributor:
Did that just sneak into the commit, or is it meant to stay?

};
69 changes: 27 additions & 42 deletions src/api.ts
Contributor:
@coderabbitai can you resolve conflicts?

coderabbitai:

@mktcode It appears that there are no merge conflicts in the current branch. The working tree is clean, and no conflict markers were found. If you need to synchronize this branch with the latest changes from the master branch, you can update it by merging or rebasing with master.

If you need further assistance or specific updates, please let me know!

Analysis chain


Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Fetch the conflicting files and the nature of conflicts
git fetch origin
git merge-base HEAD origin/main
git diff --name-only --diff-filter=U

Length of output: 134


Script:

#!/bin/bash
# List all remote branches to identify the correct main branch name
git branch -r

Length of output: 62


Script:

#!/bin/bash
# Fetch the conflicting files and the nature of conflicts using the master branch
git fetch origin
git merge-base HEAD origin/master
git diff --name-only --diff-filter=U

Length of output: 93


Script:

#!/bin/bash
# Check the status of the repository
git status

# Search for conflict markers in the files
git grep -n '<<<<<<< HEAD'

Length of output: 104

Contributor:

@coderabbitai I'm afraid you're mistaken. GitHub is displaying conflicts rather than a merge button. Your conclusion is incorrect.

coderabbitai:

@mktcode The rebase attempt with the master branch has revealed conflicts in the following files:

  • .gitignore
  • package.json
  • src/api.ts
  • src/index.ts
  • tsconfig.json
  • yarn.lock

To resolve these conflicts, please follow these steps:

  1. Open each of the conflicting files and manually resolve the conflicts. Look for conflict markers (<<<<<<<, =======, >>>>>>>) and decide which changes to keep.
  2. Once you have resolved the conflicts, mark the files as resolved by running:
    git add <conflicted_file>
  3. Continue the rebase process by running:
    git rebase --continue
  4. If you encounter further conflicts, repeat the steps above. If you wish to abort the rebase process at any point, you can run:
    git rebase --abort

Please let me know if you need any further assistance!

Analysis chain
Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Fetch the conflicting files and the nature of conflicts using the master branch
git fetch origin
git merge-base HEAD origin/master
git diff --name-only --diff-filter=U

# Check the status of the repository
git status

# Search for conflict markers in the files
git grep -n '<<<<<<< HEAD'

Length of output: 198


Script:

#!/bin/bash
# Attempt to rebase the current branch with master to identify any conflicts
git fetch origin
git rebase origin/master

Length of output: 894

@@ -1,7 +1,9 @@
import express from 'express';
import { Readable } from 'stream';
import { capture } from '@snapshot-labs/snapshot-sentry';
import { parseQuery, resize, setHeader, getCacheKey } from './utils';
import { set, get, streamToBuffer, clear } from './aws';
import { parseQuery, resize, setHeader } from './utils';
import { streamToBuffer } from './aws';
import Cache from './resolvers/cache';
import resolvers from './resolvers';
import constants from './constants.json';
import { rpcError, rpcSuccess } from './helpers/utils';
@@ -33,13 +35,14 @@ router.post('/', async (req, res) => {
router.get(`/clear/:type(${TYPE_CONSTRAINTS})/:id`, async (req, res) => {
const { type, id } = req.params;
try {
const { address, network, w, h, fallback, cb } = await parseQuery(id, type, {
s: constants.max,
fb: req.query.fb,
cb: req.query.cb
});
const key = getCacheKey({ type, network, address, w, h, fallback, cb });
const result = await clear(key);
const cache = new Cache(
await parseQuery(id, type, {
s: constants.max,
fb: req.query.fb,
cb: req.query.cb
})
);
const result = await cache.clear();
res.status(result ? 200 : 404).json({ status: result ? 'ok' : 'not found' });
} catch (e) {
capture(e);
@@ -49,42 +52,31 @@ router.get(`/clear/:type(${TYPE_CONSTRAINTS})/:id`, async (req, res) => {

router.get(`/:type(${TYPE_CONSTRAINTS})/:id`, async (req, res) => {
const { type, id } = req.params;
let address, network, w, h, fallback, cb;
let parsedParams, address, network, w, h, fallback;

try {
({ address, network, w, h, fallback, cb } = await parseQuery(id, type, req.query));
parsedParams = await parseQuery(id, type, req.query);
({ address, network, w, h, fallback } = parsedParams);
} catch (e) {
return res.status(500).json({ status: 'error', error: 'failed to load content' });
}

const key1 = getCacheKey({
type,
network,
address,
w: constants.max,
h: constants.max,
fallback,
cb
});
const key2 = getCacheKey({ type, network, address, w, h, fallback, cb });
const cache = new Cache(parsedParams);

// Check resized cache
const cache = await get(`${key1}/${key2}`);
if (cache) {
// console.log('Got cache', address);
const cachedResizedImage = await cache.getResizedImage();
if (cachedResizedImage) {
setHeader(res);
return cache.pipe(res);
return (cachedResizedImage as Readable).pipe(res);
}

// Check base cache
const base = await get(`${key1}/${key1}`);
let baseImage;
if (base) {
baseImage = await streamToBuffer(base);
// console.log('Got base cache');
} else {
// console.log('No cache for', key1, base);
const cachedBaseImage = await cache.getBaseImage();
let baseImage: Buffer;

if (cachedBaseImage) {
baseImage = await streamToBuffer(cachedBaseImage as Readable);
} else {
let currentResolvers: string[] = constants.resolvers.avatar;
if (type === 'token') currentResolvers = constants.resolvers.token;
if (type === 'space') currentResolvers = constants.resolvers.space;
@@ -109,17 +101,10 @@ router.get(`/:type(${TYPE_CONSTRAINTS})/:id`, async (req, res) => {
res.send(resizedImage);

// Store cache
try {
if (!base) {
await set(`${key1}/${key1}`, baseImage);
console.log('Stored base cache', key1);
}
await set(`${key1}/${key2}`, resizedImage);
console.log('Stored cache', address);
} catch (e) {
capture(e);
console.log('Store cache failed', address, e);
if (!cachedBaseImage) {
await cache.setBaseImage(baseImage);
}
await cache.setResizedImage(resizedImage);
});

export default router;
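The refactored route above replaces the explicit getCacheKey/get/set calls with a Cache object that owns both keys and encapsulates the two-level lookup: resized image first, then base image, then the resolvers, with write-back on a miss. A minimal in-memory sketch of that flow (the Map stands in for the S3 layer, and `SketchCache`/`serve` are illustrative names, not the PR's actual classes):

```typescript
// In-memory stand-in for the S3-backed cache.
const store = new Map<string, Buffer>();

class SketchCache {
  constructor(private baseKey: string, private resizedKey: string) {}
  getResizedImage() { return store.get(`${this.baseKey}/${this.resizedKey}`); }
  getBaseImage() { return store.get(`${this.baseKey}/${this.baseKey}`); }
  setBaseImage(v: Buffer) { store.set(`${this.baseKey}/${this.baseKey}`, v); }
  setResizedImage(v: Buffer) { store.set(`${this.baseKey}/${this.resizedKey}`, v); }
}

// The lookup order from the route handler: resized cache, base cache,
// resolvers; then write both entries back so the next request hits.
function serve(cache: SketchCache, resolve: () => Buffer, resize: (b: Buffer) => Buffer): Buffer {
  const resized = cache.getResizedImage();
  if (resized) return resized;          // fast path: resized image cached

  let base = cache.getBaseImage();
  if (!base) {
    base = resolve();                   // full miss: run the resolvers
    cache.setBaseImage(base);
  }
  const out = resize(base);
  cache.setResizedImage(out);
  return out;
}
```

The point of the design: the expensive resolver step runs at most once per base image, and every distinct size is cached independently under the shared base-key prefix.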
69 changes: 30 additions & 39 deletions src/aws.ts
@@ -1,14 +1,18 @@
import * as AWS from '@aws-sdk/client-s3';
import { Readable } from 'stream';

let client;
let client: AWS.S3;
const dir = 'stamp-4';
const bucket = process.env.AWS_BUCKET_NAME;
const region = process.env.AWS_REGION;
const endpoint = process.env.AWS_ENDPOINT || undefined;
if (region) client = new AWS.S3({ region, endpoint });
const dir = 'stamp-4';

export async function streamToBuffer(stream: Readable) {
export const isConfigured = !!(bucket && region);

if (isConfigured) client = new AWS.S3({ region, endpoint });

export async function streamToBuffer(stream: Readable): Promise<Buffer> {
return await new Promise((resolve, reject) => {
const chunks: Uint8Array[] = [];
stream.on('data', chunk => chunks.push(Buffer.from(chunk)));
@@ -17,53 +21,40 @@ export async function streamToBuffer(stream: Readable) {
});
}

export async function set(key, value) {
try {
const command = new AWS.PutObjectCommand({
Bucket: bucket,
Key: `public/${dir}/${key}`,
Body: value,
ContentType: 'image/webp'
});
export async function set(key: string, value: Buffer) {
const command = new AWS.PutObjectCommand({
Bucket: bucket,
Key: `public/${dir}/${key}`,
Body: value,
ContentType: 'image/webp'
});

await client.send(command);
} catch (e) {
console.log('Store cache failed', e);
throw e;
}
return await client.send(command);
}

export async function clear(path) {
try {
const listedObjects = await client.listObjectsV2({
Bucket: bucket,
Prefix: `public/${dir}/${path}`
});
if (!listedObjects.Contents || listedObjects.Contents.length === 0) return false;
const objs = listedObjects.Contents.map(obj => ({ Key: obj.Key }));
await client.deleteObjects({
Bucket: bucket,
Delete: { Objects: objs }
});
if (listedObjects.IsTruncated) await clear(path);
console.log('Cleared cache', path);
return path;
} catch (e) {
console.log('Clear cache failed', e);
throw e;
}
export async function clear(path: string): Promise<boolean> {
const listedObjects = await client.listObjectsV2({
Bucket: bucket,
Prefix: `public/${dir}/${path}`
});
if (!listedObjects.Contents || listedObjects.Contents.length === 0) return false;
const objs = listedObjects.Contents.map(obj => ({ Key: obj.Key }));
await client.deleteObjects({
Bucket: bucket,
Delete: { Objects: objs }
});
if (listedObjects.IsTruncated) await clear(path);
return true;
}

export async function get(key) {
export async function get(key: string): Promise<Readable | boolean> {
try {
const command = new AWS.GetObjectCommand({
Bucket: bucket,
Key: `public/${dir}/${key}`
});

const { Body } = await client.send(command);

return Body;
return (await client.send(command)).Body as Readable;
} catch (e) {
return false;
}
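With the new `Promise<Buffer>` annotation, streamToBuffer's contract is explicit: it drains any Readable (such as the `Body` returned by `GetObjectCommand`) into a single Buffer. A self-contained usage sketch, with the function body reproduced as it appears in src/aws.ts:

```typescript
import { Readable } from 'stream';

// Same shape as streamToBuffer in src/aws.ts: collect 'data' chunks and
// concatenate them once the stream ends.
async function streamToBuffer(stream: Readable): Promise<Buffer> {
  return await new Promise((resolve, reject) => {
    const chunks: Uint8Array[] = [];
    stream.on('data', chunk => chunks.push(Buffer.from(chunk)));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  });
}

// Any Readable works; Readable.from turns an iterable of buffers into one.
const demo = Readable.from([Buffer.from('hel'), Buffer.from('lo')]);
streamToBuffer(demo).then(buf => console.log(buf.toString())); // logs: hello
```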
6 changes: 6 additions & 0 deletions src/helpers/metrics.ts
@@ -31,3 +31,9 @@ export const addressResolversCacheHitCount = new client.Counter({
help: 'Number of hit/miss of the address resolvers cache layer',
labelNames: ['status']
});

export const imageResolversCacheHitCount = new client.Counter({
name: 'image_resolvers_cache_hit_count',
help: 'Number of hit/miss of the image resolvers cache layer',
labelNames: ['status']
});
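The new counter is incremented once per cache lookup with a `status` label of HIT or MISS (see `_getCache` in src/resolvers/cache.ts). A tiny stand-in showing the labelled-counter semantics — prom-client's real `Counter` exposes the same `inc(labels, value)` shape, but this sketch avoids the dependency:

```typescript
// Minimal labelled counter, illustrating how imageResolversCacheHitCount
// is used: one increment per lookup, bucketed by the 'status' label.
class LabeledCounter {
  private counts = new Map<string, number>();
  inc(labels: { status: string }, value = 1) {
    this.counts.set(labels.status, (this.counts.get(labels.status) ?? 0) + value);
  }
  get(status: string): number {
    return this.counts.get(status) ?? 0;
  }
}

const hitCount = new LabeledCounter();
hitCount.inc({ status: 'HIT' }, 1);   // resized or base image found
hitCount.inc({ status: 'MISS' }, 1);  // nothing cached, resolvers ran
hitCount.inc({ status: 'HIT' }, 1);
```

Labelling by status rather than using two counters keeps the metric queryable as a single hit-rate series.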
110 changes: 110 additions & 0 deletions src/resolvers/cache.ts
@@ -0,0 +1,110 @@
import { createHash } from 'crypto';
import { Readable } from 'stream';
import { set as setCache, get as getCache, clear as clearCache, isConfigured } from '../aws';
import constants from '../constants.json';
import { imageResolversCacheHitCount } from '../helpers/metrics';
import { capture } from '@snapshot-labs/snapshot-sentry';

export function sha256(str: string) {
return createHash('sha256')
.update(str)
.digest('hex');
}

type ParamsType = {
type: string;
network: string;
address: string;
w: number;
h: number;
fallback?: string;
cb?: string;
};

export default class Cache {
baseImageCacheKey: string;
resizedImageCacheKey: string;
isConfigured: boolean;

constructor({ type, network, address, w, h, fallback, cb }: ParamsType) {
const data = { type, network, address, w, h };
if (fallback !== 'blockie') data['fallback'] = fallback;
if (cb) data['cb'] = cb;

const baseImageKey = this._buildKey({ ...data, w: constants.max, h: constants.max });
const resizedImageKey = this._buildKey(data);

this.baseImageCacheKey = `${baseImageKey}/${baseImageKey}`;
this.resizedImageCacheKey = `${baseImageKey}/${resizedImageKey}`;
this.isConfigured = isConfigured;

if (!this.isConfigured) {
console.log('[cache:resolver] Cache is not configured');
}
}

async getBaseImage(): Promise<Readable | boolean> {
return await this._getCache(this.baseImageCacheKey);
}

async getResizedImage(): Promise<Readable | boolean> {
return await this._getCache(this.resizedImageCacheKey);
}

async setBaseImage(value: Buffer) {
return await this._setCache(this.baseImageCacheKey, value);
}

async setResizedImage(value: Buffer) {
return await this._setCache(this.resizedImageCacheKey, value);
}

async clear(): Promise<boolean> {
if (!this.isConfigured) return false;

try {
const result = await clearCache(this.baseImageCacheKey);

console.log(`[cache:resolver] Cache cleared ${this.baseImageCacheKey}`);

return result;
} catch (e) {
console.log(`[cache:resolver] Failed to clear cache ${this.baseImageCacheKey}`);
capture(e);
return false;
}
}

private async _getCache(key: string) {
if (!this.isConfigured) return false;

try {
console.log(`[cache:resolver] Getting cache ${key}`);
const cache = await getCache(key);

imageResolversCacheHitCount.inc({ status: cache ? 'HIT' : 'MISS' }, 1);

return cache;
} catch (e) {
capture(e);
console.log(`[cache:resolver] Failed to get cache ${key}`);
return false;
}
}

private async _setCache(key: string, value: Buffer) {
if (!this.isConfigured) return false;

try {
console.log(`[cache:resolver] Setting cache ${key}`);
return await setCache(key, value);
} catch (e) {
capture(e);
console.log(`[cache:resolver] Failed to set cache ${key}`);
}
}

private _buildKey(params: ParamsType): string {
return sha256(JSON.stringify(params));
}
}
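The constructor derives both keys from the same parameters, substituting the maximum dimensions for the base-image key, so every size variant of one image shares a base-key prefix. A sketch of that derivation — `MAX` is an assumed stand-in for `constants.max`, whose real value lives in constants.json:

```typescript
import { createHash } from 'crypto';

const MAX = 500; // assumed; the real maximum comes from constants.json

function sha256(str: string): string {
  return createHash('sha256').update(str).digest('hex');
}

type Params = { type: string; network: string; address: string; w: number; h: number };

// Mirrors the key scheme in the Cache constructor: the base key hashes the
// params with max dimensions; the resized key hashes them as requested.
function buildKeys(p: Params) {
  const baseKey = sha256(JSON.stringify({ ...p, w: MAX, h: MAX }));
  const resizedKey = sha256(JSON.stringify(p));
  return {
    baseImageCacheKey: `${baseKey}/${baseKey}`,
    resizedImageCacheKey: `${baseKey}/${resizedKey}`
  };
}
```

Because clearing uses the base key as a prefix, one `clear()` call removes the base image and every resized variant derived from it.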
32 changes: 1 addition & 31 deletions src/utils.ts
@@ -1,4 +1,3 @@
import { createHash } from 'crypto';
import sharp from 'sharp';
import { Response } from 'express';
import { StaticJsonRpcProvider } from '@ethersproject/providers';
@@ -21,12 +20,6 @@ export function getProvider(network: number): StaticJsonRpcProvider {
return providers[`_${network}`];
}

export function sha256(str) {
return createHash('sha256')
.update(str)
.digest('hex');
}

export async function resize(input, w, h) {
return sharp(input)
.resize(w, h)
@@ -76,6 +69,7 @@ export async function parseQuery(id, type, query) {
if (h < 1 || h > maxSize || isNaN(h)) h = size;

return {
type,
address,
network,
w,
@@ -100,30 +94,6 @@ export function getUrl(url) {
return snapshot.utils.getUrl(url, gateway);
}

export function getCacheKey({
type,
network,
address,
w,
h,
fallback,
cb
}: {
type: string;
network: string;
address: string;
w: number;
h: number;
fallback: string;
cb?: string;
}) {
const data = { type, network, address, w, h };
if (fallback !== 'blockie') data['fallback'] = fallback;
if (cb) data['cb'] = cb;

return sha256(JSON.stringify(data));
}

export function setHeader(res: Response, cacheType: 'SHORT_CACHE' | 'LONG_CACHE' = 'LONG_CACHE') {
const ttl = cacheType === 'SHORT_CACHE' ? constants.shortTtl : constants.ttl;

Binary file added test/fixtures/sample.webp