
[Bug] [Oracle CDC to Kafka] Column with a default value: when NULL is inserted, the value delivered to Kafka is the default instead of NULL #8759

sin70611 opened this issue Feb 18, 2025 · 0 comments
Search before asking

  • I had searched in the issues and found no similar issues.

What happened

I synchronized multiple tables from Oracle to Kafka. One table has a column with the default value 'N'. I found that when a record was inserted into the table with NULL in that column, the column value delivered to Kafka via CDC was 'N' instead of NULL.
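To make the mismatch concrete, here is a minimal sketch (the table and column names `ID`/`COL1` are hypothetical, not taken from the actual schema): an explicit NULL insert into a column declared with `DEFAULT 'N'` should propagate as `null` in the change event, but the observed Kafka payload carries the default instead.

```python
# Hypothetical reproduction sketch. Assumed Oracle DDL/DML (illustrative only):
#   CREATE TABLE CCC (ID NUMBER, COL1 VARCHAR2(1) DEFAULT 'N');
#   INSERT INTO CCC (ID, COL1) VALUES (1, NULL);
import json

# What the Kafka consumer should receive for the inserted row:
expected_after = {"ID": 1, "COL1": None}

# What was actually observed in the compatible_debezium_json payload:
observed_after = {"ID": 1, "COL1": "N"}  # column default substituted for NULL

print("expected:", json.dumps(expected_after))
print("observed:", json.dumps(observed_after))
```

The expected payload serializes `COL1` as JSON `null`; the observed one silently replaces it with the column default, which changes the semantics of the replicated row.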

SeaTunnel Version

2.3.8

SeaTunnel Config

Hybrid Mode Cluster

Running Command

{
    "sink" : [
        {
            "format" : "compatible_debezium_json",
            "topic" : "${table_name}",
            "bootstrap.servers" : "xx.xx.xx.xx:9092",
            "source_table_name" : "t1",
            "kafka.request.timeout.ms" : 6000,
            "semantics" : "AT_LEAST_ONCE",
            "plugin_name" : "kafka",
            "kafka.config" : {
                "request.timeout.ms" : 6000,
                "acks" : "all",
                "buffer.memory" : 33554432
            }
        }
    ],
    "source" : [
        {
            "base-url" : "jdbc:oracle:thin:@xx.xx.xx.xx:1521:xx",
            "startup.mode" : "latest",
            "debezium" : {
                "key.converter.schemas.enable" : false,
                "value.converter.schemas.enable" : false,
                "database.server.name" : "XXX",
                "include.schema.changes" : true
            },
            "table-names" : [
                "AAA.BBB.CCC"
            ],
            "format" : "compatible_debezium_json",
            "result_table_name" : "t1",
            "database-names" : [
                "AAA"
            ],
            "schema-names" : [
                "BBB"
            ],
            "plugin_name" : "Oracle-CDC",
            "skip_analyze" : true,
            "password" : "XXX",
            "source.reader.close.timeout" : 120000,
            "username" : "XXX"
        }
    ],
    "env" : {
        "job.mode" : "STREAMING",
        "job.name" : "XXX",
        "read_limit.bytes_per_second" : 7000000,
        "parallelism" : 1,
        "read_limit.rows_per_second" : 400,
        "checkpoint.interval" : 5000
    }
}

Error Exception

The job runs normally; no error information is produced.

Zeta or Flink or Spark Version

zeta

Java or Scala Version

11

Screenshots

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

sin70611 added the bug label Feb 18, 2025