Druid + Hive query "22" error #7520
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. For admin, please label this issue
Mmmh, looks like out-of-sync Thrift stuff: the server is returning error codes that the client doesn't know about yet because it's too old. It seems you need a newer Hive thrift client.
For the interested, I'm working on a fix for this at dropbox/PyHive#292
To reproduce this issue, create an external table stored by Druid in Hive and try to query it from Superset.
In SQL Lab you will see something like this:
The real error is here:
The problem is in the connector.
Let's see what `ttypes.TTypeId._VALUES_TO_NAMES` contains:
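A self-contained sketch of the failure mode (the entries below are an assumed, truncated stand-in for the Thrift-generated mapping, not the exact contents of any particular pyhive release):

```python
# _VALUES_TO_NAMES in Thrift-generated code is a plain dict from numeric
# enum values to type names. The dict below is an assumed, truncated
# stand-in: an outdated client simply has no entry for type codes added
# in newer Hive releases, so code 22 is missing.
VALUES_TO_NAMES = {
    0: "BOOLEAN_TYPE",
    3: "INT_TYPE",
    7: "STRING_TYPE",
    8: "TIMESTAMP_TYPE",
    17: "DATE_TYPE",
}

try:
    VALUES_TO_NAMES[22]
except KeyError as exc:
    # The bare "22" in the Superset error message is just this KeyError.
    print("KeyError:", exc)
```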
OK. Now we need to know where `type_id` comes from:
And now if you check which column it is, you will see `__time`.
So I have created an issue in PyHive.
The easy workaround for now is to replace line 313 in `pyhive/hive.py`:

```python
type_code = ttypes.TTypeId._VALUES_TO_NAMES[type_id]
```

with

```python
type_code = ttypes.TTypeId._VALUES_TO_NAMES[type_id if type_id != 22 else 18]
```
Type 18 is datetime, so the column is then reported correctly.
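If editing the installed `pyhive/hive.py` is not an option, the same aliasing can be done at runtime before opening the connection. This is a hypothetical sketch: it simulates the generated mapping with a local dict, but with pyhive installed the target would be `TCLIService.ttypes.TTypeId._VALUES_TO_NAMES`:

```python
def alias_unknown_type(values_to_names, unknown_code, known_code):
    """Alias an enum value the client was not generated with onto a
    known one, without touching the installed pyhive sources."""
    values_to_names.setdefault(unknown_code, values_to_names[known_code])

# Stand-in for TCLIService.ttypes.TTypeId._VALUES_TO_NAMES; the real
# mapping ships with pyhive, and the name for code 18 comes from there.
values_to_names = {18: "TYPE_18_NAME"}
alias_unknown_type(values_to_names, 22, 18)

print(values_to_names[22])  # same name the client already maps 18 to
```

With pyhive installed, the equivalent call would be `alias_unknown_type(ttypes.TTypeId._VALUES_TO_NAMES, 22, 18)`, run once before `hive.connect(...)`.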
P.S. I know this is not a Superset problem, but I guess many Superset users use PyHive, so I just wanted to share my solution.