I encountered an issue when trying to use APScheduler to schedule tasks that involve a pyodbc.Connection. The error occurs after I start the scheduler, when it attempts to execute a function that interacts with the database. The function fails with the following error:
TypeError: cannot pickle 'pyodbc.Connection' object
The issue arises because pyodbc.Connection objects cannot be serialized (pickled), and APScheduler with a ProcessPoolExecutor must pickle the arguments passed to the scheduled job function in order to send them to a worker process.
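The same limitation can be demonstrated with the stdlib sqlite3 driver (used here only as a stand-in, since it needs no external database): its connection objects, like pyodbc's, are C-level handles that pickle refuses to serialize.

```python
import pickle
import sqlite3

# A DB-API connection wraps external state (sockets, file handles,
# transaction state), so the pickle module cannot serialize it.
conn = sqlite3.connect(":memory:")
try:
    pickle.dumps(conn)
except TypeError as exc:
    print(exc)  # cannot pickle 'sqlite3.Connection' object
```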
To work around this, I tried creating a new database connection inside the scheduled job function instead of passing the existing connection object, but the error persisted.
Steps to Reproduce:
Create a function that establishes a database connection using pyodbc and interacts with the database.
Use APScheduler with a ProcessPoolExecutor to schedule this function to run periodically.
Start the scheduler using scheduler.start().
The error occurs when the job is executed.
Expected Behavior:
The function should execute successfully, fetching data from the database periodically.
Actual Behavior:
The scheduler throws the following error when trying to execute the job:
TypeError: cannot pickle 'pyodbc.Connection' object
Could you update the documentation to highlight this behavior and suggest a workaround? Alternatively, if there's a way to configure APScheduler to handle non-pickleable objects more gracefully, guidance on that would be appreciated.
Hi @AnahitaHonarmandian. pyodbc.Connection objects are not meant to be pickled: they control database transactions, so passing a copy of one to another process or service, in any form, would be unusual. It's better for each of your threads/processes to create its own connection to the database, using the same credentials. I'm guessing those credentials can be passed into APScheduler through the args and kwargs parameters of add_job(). I appreciate you said you have already tried that; do pass on the error message if you want me to take a look at it.
Environment:
APScheduler version: (e.g., 3.9.1.post1)
Python version: (e.g., 3.9.7)
pyodbc version: (e.g., 4.0.34)
Database: (e.g., SQL Server)
Operating System: (e.g., Windows 10)