Hi there!
I am currently trying to perform some transforms of JSON from one Kafka queue to another with Flink, and I am really struggling to work with the deep JSON structure of the Kafka input queue:

```
ROW is not supported to be used as the element type of ArrayType.
ROW is not supported to be used as the field type of RowType.
```
I ended up specifying the complex column types as string and using `cast('json')` everywhere. Still, I am struggling to access values with this approach:

```python
myFieldValue = data.cast('string')
# myFieldValue will contain "{ \"field\": { \"subfield\": \"value\" } }"

myFieldValue = data['field']['subfield'].cast('string')
# myFieldValue will contain null

myFieldValue = data['field.subfield'].cast('string')  # or data['field.subfield'][0].cast('string')
# myFieldValue will contain "[\"value\"]"

myFieldValue = data['field.subfield[0]'].cast('string')
# myFieldValue will contain null
```

Any help on how I should handle deep objects with Ibis on Flink?
Originally posted by @christophediprima in #11642