
streaming through a table #42

Open
alanpaulkwan opened this issue Sep 18, 2018 · 1 comment

@alanpaulkwan

Imagine the following:

res <- dbSendQuery(con, queryText)
while (nrow(x <- dbFetch(res, n = 50)) > 0) {
  # ... process the 50-row chunk in x ...
}

Right now, this can fail with a memory error. But as far as I can tell from asking about ClickHouse itself, the full result set should not need to be held in memory. I don't see why we can't stream through the results sequentially in R.

@inkrement
Member

This package does not support server-side cursors or "sequential streaming", as you call it. dbSendQuery returns a result that is then fetched as one large chunk. The second parameter of dbFetch does not set a batch size to load incrementally; it sets the maximum number of rows to load. So the loop above does not work as expected. That said, such a feature would be nice to have and we will consider it for a future enhancement. And of course: feel free to provide a pull request.
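Until incremental fetching is supported, one client-side workaround is to page through the table with LIMIT/OFFSET, issuing one query per chunk so only one chunk is ever in memory. A minimal sketch, assuming a DBI connection `con` and a table `events` with a sortable column `id` (all names here are illustrative, not from this package's API):

```r
library(DBI)

# Page through `events` in fixed-size chunks. Each iteration runs a
# fresh query, so only `batch` rows are held in R at a time.
offset <- 0
batch  <- 50000
repeat {
  chunk <- dbGetQuery(con, sprintf(
    "SELECT * FROM events ORDER BY id LIMIT %d OFFSET %d",
    batch, offset))
  if (nrow(chunk) == 0) break   # no rows left: done
  # ... process chunk here ...
  offset <- offset + batch
}
```

Note that large OFFSET values get progressively slower in ClickHouse; keyset pagination (WHERE id > last_seen_id) is usually faster for deep scans, at the cost of tracking the last key yourself.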

inkrement pushed a commit that referenced this issue Dec 4, 2019
* Add TypeAst cache.

* Add Type::Code in TypeAst.

* Use -O2 optimization to build.

* Add benchmark: SELECT from system.numbers