Mention SQLite in docs #246
Comments
This depends on the performance of SQLite's ODBC driver itself. I have not heard of much usage of it, so my expectations aren't that great. But in the end: just try it and see whether it is faster.
Actually, for others wondering about this: the Python standard-library sqlite3 module (https://docs.python.org/3/library/sqlite3.html) doesn't ship an ODBC driver at all. So any attempt like the one sketched below will fail with an error from the ODBC driver manager.
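A minimal sketch of such a failing attempt, assuming no SQLite ODBC driver is installed (the driver name "SQLite3" and the file name are placeholders; turbodbc forwards keyword arguments into the ODBC connection string):

```python
import turbodbc

# Hypothetical: "SQLite3" would have to be a driver name registered in
# odbcinst.ini. With no SQLite ODBC driver installed, no such entry exists.
connection = turbodbc.connect(driver="SQLite3", database="db.sqlite")
# Raises turbodbc.DatabaseError; the driver manager typically reports
# something like "Data source name not found and no default driver
# specified" (exact wording depends on unixODBC/iODBC).
```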
Related idea, but I think not realised yet: pandas-dev/pandas#17790 (comment)
Just to chip in here a little. I recently tested inserting into an SQLite database: a CSV file of 25k rows (somewhat larger than 3 MiB) into an SQLite DB using …
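The tool used in that test was cut off in the page above; for scale, though, a bulk insert of 25k rows is usually quick even with the standard-library sqlite3 module. This is an illustrative comparison, not the commenter's setup; the file name and two-column schema are assumptions:

```python
import csv
import sqlite3

# Assumed input: a CSV file with a header row and two columns.
with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    rows = list(reader)

conn = sqlite3.connect("db.sqlite")
conn.execute("CREATE TABLE IF NOT EXISTS data (col_a TEXT, col_b TEXT)")
# executemany batches all inserts into one transaction (committed below),
# which is what keeps a 25k-row insert fast.
conn.executemany("INSERT INTO data VALUES (?, ?)", rows)
conn.commit()
conn.close()
```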
Could you please add some information to the docs on whether turbodbc supports SQLite or not?
Is it fast or slow?
There doesn't seem to be a single mention of "sqlite" in the docs currently:
https://turbodbc.readthedocs.io/en/latest/search.html?q=sqlite&check_keywords=yes&area=default
My current use case is that I have ~10 pandas DataFrames, each ~100 MB in size, and would like to write them concatenated to a SQLite file. Using
sqlalchemy.create_engine(f'sqlite:///db.sqlite')
and then, in a loop, DataFrame.to_sql with if_exists="append" seems to work, but it's extremely slow. Is turbodbc the right tool for this job, or something else?
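Not an answer from the thread, but a common first speed-up for this exact pattern is to batch the inserts via to_sql's own options before reaching for another library. A sketch under assumed names (frames standing in for the ~10 DataFrames, table name "data"):

```python
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("sqlite:///db.sqlite")

# Stand-in for the real ~100 MB DataFrames from the question.
frames = [pd.DataFrame({"a": range(1000), "b": range(1000)})]

for df in frames:
    # method="multi" packs many rows into each INSERT statement; chunksize
    # bounds the rows per statement. Keep chunksize * number_of_columns below
    # SQLite's bound-parameter limit (999 in older builds). Worth benchmarking
    # against the default, which already uses executemany per chunk.
    df.to_sql("data", engine, if_exists="append", index=False,
              method="multi", chunksize=100)
```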