Unexpected Results Depending on DisCoPy Installation Timing #207


Open
SiyoonKIM opened this issue Feb 13, 2025 · 6 comments

Comments

@SiyoonKIM

Issue Description

When following the PennyLane blog tutorial, I observed that the results differ based on when DisCoPy is installed.

  • If LAMBEQ and DisCoPy are pre-installed before running the code, the final results are different from when DisCoPy is installed dynamically at the "Initializing the model" step.
  • There is a noticeable drop in accuracy when DisCoPy is pre-installed before execution.
  • Uninstalling and reinstalling DisCoPy results in higher accuracy in a local environment, but in Google Colab, it produces the same results as if DisCoPy had not been uninstalled.

This suggests that pre-installing both LAMBEQ and DisCoPy might introduce unexpected interactions that affect the training process.

This issue affects the reproducibility of quantum NLP models trained with LAMBEQ and PennyLane. If the model's accuracy depends on the installation timing of DisCoPy, it becomes difficult to ensure consistent training results.


Environment

Component          | Version
Python Interpreter | 3.11.11
PyTorch            | 2.5.1
PennyLane          | 0.40.0
LAMBEQ             | 0.4.3
DisCoPy            | 1.2.0
  • Installation Method: pip

Steps to Reproduce

Case 1: Installing DisCoPy During Execution

  1. Start with only lambeq installed:
    pip3 install torch torchvision torchaudio
    pip install pennylane
    pip install lambeq
  2. Run the tutorial code from the PennyLane blog tutorial.
  3. During the 'Initializing the model' step of the tutorial, install DisCoPy dynamically:
    pip install "discopy>=1.1.0"
  4. Observe the final accuracy.
    [Image: final accuracy output]
  5. Restart the kernel and re-run the code.

Case 2: Pre-installing DisCoPy

  1. Install all dependencies before running the code:
    pip3 install torch torchvision torchaudio
    pip install pennylane
    pip install lambeq
    pip install "discopy>=1.1.0"
  2. Observe the final accuracy.
    [Image: final accuracy output]
  3. Compare the results.
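
To rule out a simple version mismatch between the two cases, the resolved package versions can be printed directly from the notebook before training. A minimal sketch using only the standard library (the package list is just the one from this issue):

```python
# Print the installed version of each relevant package, so the two
# installation orders can be compared side by side. Stdlib only.
from importlib.metadata import PackageNotFoundError, version


def report_versions(packages):
    """Return {package: version string or 'not installed'}."""
    result = {}
    for name in packages:
        try:
            result[name] = version(name)
        except PackageNotFoundError:
            result[name] = "not installed"
    return result


if __name__ == "__main__":
    for pkg, ver in report_versions(["torch", "pennylane", "lambeq", "discopy"]).items():
        print(f"{pkg}: {ver}")
```

Running this once per case, immediately before the "Initializing the model" step, would show whether the two environments really resolve to the same discopy version.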

Expected Behavior

The results should be consistent regardless of whether DisCoPy is pre-installed or installed dynamically.

Reference Notebook

The following Jupyter Notebook contains the code from the PennyLane blog tutorial, provided here for reference.
qnlp_pennylane_blog.ipynb.zip

@dimkart
Contributor

dimkart commented Feb 17, 2025

@SiyoonKIM Hi and thanks for reporting this. Can you please try the same with discopy >= 1.1.7 and see if you have the same issue?

@SiyoonKIM
Author

@dimkart Hi, thank you for your response.

When running:

pip install "discopy>=1.1.0"

DisCoPy 1.2.0 gets installed.

To verify, I first uninstalled DisCoPy and then ran:

pip uninstall discopy
pip install "discopy>=1.1.7"

Yet, DisCoPy 1.2.0 was still installed.

After rerunning the tutorial, the issue persists—the accuracy drop still occurs.

Would you recommend explicitly installing a specific version (e.g., 1.1.7 or 1.1.8) for further testing?

@dimkart
Contributor

dimkart commented Feb 17, 2025

So in both cases you mention (pre-installed and dynamically installed), do you end up with the same version of DisCoPy, or are they different? Can you check, @SiyoonKIM?

@dimkart
Contributor

dimkart commented Feb 17, 2025

In general, keep in mind that dynamic installation is much safer, since dependencies are taken care of automatically (they can be upgraded or downgraded as required), while a pre-installation can't do that. In any case, we'll have a closer look and come back to you. I would suggest running pip freeze for both cases after installation to check for differences in package versions.
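
The suggested freeze comparison can be sketched as a small script. The sample package lists below are illustrative placeholders, not taken from the actual environments:

```python
# Diff two `pip freeze` outputs to spot packages whose versions differ
# between the pre-installed and dynamically-installed environments.


def parse_freeze(lines):
    """Map package name -> version from `pip freeze` style `name==ver` lines."""
    pins = {}
    for line in lines:
        line = line.strip()
        if "==" in line:
            name, ver = line.split("==", 1)
            pins[name.lower()] = ver
    return pins


def diff_freezes(a_lines, b_lines):
    """Return {package: (version_in_a, version_in_b)} for packages that differ;
    a missing package shows up as None on the side where it is absent."""
    a, b = parse_freeze(a_lines), parse_freeze(b_lines)
    return {
        name: (a.get(name), b.get(name))
        for name in sorted(set(a) | set(b))
        if a.get(name) != b.get(name)
    }


if __name__ == "__main__":
    # Example data; in practice, read the `pip freeze > caseN.txt` outputs.
    pre = ["discopy==1.2.0", "lambeq==0.4.3"]
    dyn = ["lambeq==0.4.3", "torch==2.6.0"]
    print(diff_freezes(pre, dyn))
```

Feeding the two real freeze files into `diff_freezes` would surface exactly which pins differ between the cases.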

@neiljdo
Collaborator

neiljdo commented Feb 18, 2025

Hi @SiyoonKIM. I tried the steps you outlined above but couldn't replicate the issue: in both cases, I got the lower-accuracy result. I've verified that pip install discopy only installs the discopy library itself and doesn't update any of its dependencies, so we can rule out the possibility that the notebook is using different versions of the packages already imported. I have the same dependencies you listed above, except for torch==2.6.0.

What happens when you install discopy while in the notebook, restart the kernel, and rerun everything? Do you still get the higher accuracy? Also, can you compare the dependencies you end up installing (via pip freeze) with the one I got (see attached)? We'll continue investigating this and your input is highly appreciated.

dcp-pre-reqts.txt

@SiyoonKIM
Author

Hi @dimkart and @neiljdo,

Thank you for your responses. I ran some tests in a venv environment using Visual Studio Code and obtained the following results:

Installation Location | Timing of discopy Installation    | Requirements File | Accuracy
Terminal              | Before installing discopy         | requirements0.txt | -
Terminal              | Installing discopy mid-run        | requirements1.txt | High accuracy
Terminal              | Installing discopy from the start | requirements2.txt | Low accuracy
In an ipynb file      | Installing discopy mid-run        | requirements3.txt | High accuracy
  • requirements1.txt, requirements2.txt, and requirements3.txt are identical based on comparison.
  • requirements0.txt is the same as requirements1.txt, 2.txt, and 3.txt except that it does not contain discopy==1.2.0.
  • There are some differences between dcp-pre-reqts.txt and requirements1.txt, 2.txt, and 3.txt.

I hope this additional information helps with the investigation. Please let me know if you need any further details or clarifications.

requirements0.txt
requirements1.txt
requirements2.txt
requirements3.txt
