A Python web app that searches for images matching a given text
Note: you need to do this stage whether you're going to run the app locally or via Docker.
First, create a virtual environment to keep package installation local to this directory
python3 -m venv venv
Enable it - this shows doing so for a normal Unix shell; there are other activation scripts for (for instance) the fish shell
source venv/bin/activate
Install the Python packages we need
python3 -m pip install -r requirements.txt
Create your PostgreSQL® database. An Aiven for PostgreSQL service will do very well - see the Create a service section in the Aiven documentation.
Either:
- Copy the template environment file
cp .env_example .env
Then edit the .env file to insert the credentials needed to connect to the database.
Note: if you're using an Aiven for PostgreSQL service, then you want the Service URI value from the service Overview in the Aiven console. The result should look something like:
DATABASE_URL=postgres://<user>:<password>@<host>:<port>/defaultdb?sslmode=require
or:
- Set the DATABASE_URL environment variable to the PostgreSQL Service URI.
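Either way, the scripts expect to find the connection string in the DATABASE_URL environment variable. As a rough illustration (this is a sketch rather than the repository's actual code, and the use of python-dotenv and psycopg2 here is an assumption), reading and using that value from Python looks something like:

```python
import os

import psycopg2
from dotenv import load_dotenv  # assumption: python-dotenv is in requirements.txt

# Read DATABASE_URL from a .env file if present, otherwise from the environment
load_dotenv()
service_uri = os.environ["DATABASE_URL"]

# psycopg2 accepts the PostgreSQL service URI directly as a DSN
conn = psycopg2.connect(service_uri)
print(conn.server_version)
conn.close()
```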
Enable pgvector and set up the table we need in the database
./create_table.py
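create_table.py isn't reproduced here, but in outline it needs to enable the pgvector extension and create a table with a vector column sized for CLIP ViT-B/32 embeddings (512 dimensions). A minimal sketch, assuming psycopg2 and a hypothetical table called pictures:

```python
import os

import psycopg2
from dotenv import load_dotenv

load_dotenv()
conn = psycopg2.connect(os.environ["DATABASE_URL"])

with conn, conn.cursor() as cur:
    # Enable the pgvector extension so the vector column type is available
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    # CLIP ViT-B/32 produces 512-dimensional embeddings
    cur.execute(
        """CREATE TABLE IF NOT EXISTS pictures (
               filename text PRIMARY KEY,
               embedding vector(512)
           )"""
    )
conn.close()
```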
Calculate the embeddings for the pictures in the photos directory, and upload them to the database
./process_images.py
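In outline, process_images.py has to run each photo through the CLIP model and store the resulting embedding. A sketch of that flow, reusing the hypothetical pictures table from above and assuming the pgvector Python package provides the psycopg2 adapter:

```python
import os
from pathlib import Path

import clip
import psycopg2
import torch
from PIL import Image
from dotenv import load_dotenv
from pgvector.psycopg2 import register_vector

load_dotenv()
conn = psycopg2.connect(os.environ["DATABASE_URL"])
register_vector(conn)  # lets psycopg2 send numpy arrays as pgvector values

device = "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

with conn, conn.cursor() as cur:
    # Assumes the photos are JPEGs; adjust the glob pattern if not
    for path in sorted(Path("photos").glob("*.jpg")):
        image = preprocess(Image.open(path)).unsqueeze(0).to(device)
        with torch.no_grad():
            embedding = model.encode_image(image)[0].cpu().numpy()
        cur.execute(
            "INSERT INTO pictures (filename, embedding) VALUES (%s, %s)"
            " ON CONFLICT (filename) DO UPDATE SET embedding = EXCLUDED.embedding",
            (path.name, embedding),
        )
conn.close()
```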
You can run find_images.py to check that everything is working - it looks for images matching the text man jumping and reports their filenames
./find_images.py
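The search itself boils down to encoding the query text with the same CLIP model and asking pgvector for the nearest embeddings. A sketch of that query (the table name and the choice of the cosine-distance operator are assumptions):

```python
import os

import clip
import psycopg2
import torch
from dotenv import load_dotenv
from pgvector.psycopg2 import register_vector

load_dotenv()
conn = psycopg2.connect(os.environ["DATABASE_URL"])
register_vector(conn)

model, _preprocess = clip.load("ViT-B/32", device="cpu")

# Encode the search text with the same CLIP model used for the images
with torch.no_grad():
    text_embedding = model.encode_text(clip.tokenize(["man jumping"]))[0].numpy()

with conn.cursor() as cur:
    # <=> is pgvector's cosine distance operator; smaller means more similar
    cur.execute(
        "SELECT filename FROM pictures ORDER BY embedding <=> %s LIMIT 4",
        (text_embedding,),
    )
    for (filename,) in cur.fetchall():
        print(filename)
conn.close()
```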
First, if you didn't already do so, create a virtual environment to keep package installation local to this directory
python3 -m venv venv
Enable it - this shows doing so for a normal Unix shell; there are other activation scripts for (for instance) the fish shell
source venv/bin/activate
Install the Python packages we need
python3 -m pip install -r requirements.txt
Note: sometimes we've seen the Python clip.load function fail to download the CLIP model, presumably due to the source server being busy. The code here will use a local copy of the model if it's available. To make that local copy:
mkdir models
curl https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt --output models/ViT-B-32.pt
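The fallback to the local copy can look roughly like this (a sketch, not necessarily the exact code in this repository):

```python
from pathlib import Path

import clip

local_model = Path("models/ViT-B-32.pt")
if local_model.exists():
    # clip.load also accepts a path to an already-downloaded checkpoint file
    model, preprocess = clip.load(str(local_model), device="cpu")
else:
    # Fall back to letting clip.load fetch the model itself
    model, preprocess = clip.load("ViT-B/32", device="cpu")
```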
If you set up a .env file containing the PostgreSQL service URI, then you're ready to run the app. Otherwise, you can set a local environment variable, DATABASE_URL, to that same URI.
Run the webapp locally using fastapi
fastapi dev app.py
Go to http://127.0.0.1:8000 in a web browser, and request a search.
Possible ideas include:
- cat
- man jumping
- outer space
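app.py itself isn't reproduced in this README. Purely for orientation, here is a much simplified sketch of how an HTMX-backed search endpoint in FastAPI can be structured; the results.html template name and the find_matching_images helper are hypothetical placeholders:

```python
from fastapi import FastAPI, Form, Request
from fastapi.templating import Jinja2Templates

app = FastAPI()
templates = Jinja2Templates(directory="templates")


def find_matching_images(search_text: str) -> list[str]:
    # Placeholder: the real app would encode the text with CLIP and query
    # pgvector for the nearest image embeddings, as in find_images.py
    return []


@app.get("/")
async def index(request: Request):
    # Serve the search page; its form posts back to /search_form via HTMX
    return templates.TemplateResponse(request, "index.html")


@app.post("/search_form")
async def search_form(request: Request, search_text: str = Form(...)):
    filenames = find_matching_images(search_text)
    return templates.TemplateResponse(
        request, "results.html", {"filenames": filenames}
    )
```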
Build the image.
docker build -t appimage .
Run the container. Pass the PostgreSQL service URI as an environment variable.
docker run -d --name mycontainer -p 3000:3000 -e DATABASE_URL=$DATABASE_URL appimage
Then go to http://localhost:3000/ and the app should be running.
The images in the photos directory are the same as those used in Workshop: Searching for images with vector search - OpenSearch and CLIP model.
They came from Unsplash and have been reduced in size to make them fit within GitHub filesize limits for a repository.
Note: when the app is running, it retrieves the images for the HTML page directly from this GitHub repository. This is not good practice for a production app, as GitHub is not intended to act as an image repository for web apps.
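If you wanted the app to serve the photos itself instead, FastAPI can mount the photos directory as static files. A minimal sketch of that alternative (not what the app currently does):

```python
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()
# Serve files from the local photos directory at /photos, instead of
# linking to the copies held in the GitHub repository
app.mount("/photos", StaticFiles(directory="photos"), name="photos")
```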
- When writing the Dockerfile, the default FROM python:3.11 pulls in much of a Debian base system, which we don't need. We can vastly reduce the size of the image by using FROM python:3.11-slim, at the cost of needing to install git (needed by the requirements to download git+https://github.com/openai/CLIP.git) and curl. See https://hub.docker.com/_/python for more about the Python images available.
- In the Docker context, we deliberately download the CLIP model file
RUN curl https://openaipublic.azureedge.net/clip/models/.../ViT-B-32.pt --output /app/models/ViT-B-32.pt
so that the file is ready for the app when it runs. This makes a lot of sense in a container context.
- At one point I was running the Dockerised application in an HTTPS context. In order to make the HTMX redirect to /search_form also use HTTPS, I needed to tell FastAPI redirect_slashes=False (and make sure that the /search_form in the templates/index.html file didn't end with /). I found the information at FastAPI redirection for trailing slash returns non-SSL link very helpful, particularly this comment.
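For reference, that setting is passed when the application object is created. A minimal sketch:

```python
from fastapi import FastAPI

# Disable FastAPI's automatic trailing-slash redirect, so an HTMX POST to
# /search_form is not bounced through a redirect that loses the https scheme
app = FastAPI(redirect_slashes=False)
```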
- The Workshop: Searching for images with vector search - OpenSearch and CLIP model, which does (essentially) the same thing, but using OpenSearch and Jupyter notebooks.
- Building a movie recommendation system with Tensorflow and PGVector, which searches text and produces a web app using JavaScript.
For help understanding how to use HTMX:
- Using HTMX with FastAPI
- and for help understanding how I wanted to use forms, Updating Other Content from the HTMX documentation (I went for option 1, as suggested).