
[Bug] Filter pushdown !(col <=> value) is not correct when col has null value #6931

@xieshuaihu

Description


Search before asking

  • I searched in the issues and found nothing similar.

Paimon version

Latest version, 1.4-SNAPSHOT

Compute Engine

Spark

Minimal reproduce step

Step 1: prepare data

CREATE TABLE T (a INT, b INT);
INSERT INTO T VALUES (1, null);

Step 2: this expression evaluates to true

SELECT !(b <=> 1) FROM T;

Step 3: but this query returns no rows

SELECT a, b FROM T WHERE !(b <=> 1);
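
For comparison, a minimal sketch (assuming a local Spark session; no Paimon table is involved, so no filter pushdown happens) that runs the same predicate on an in-memory temp view. It produces the result that step 3 is expected to produce as well:

import org.apache.spark.sql.SparkSession

object NullSafeExpectedResult {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[1]").appName("null-safe-demo").getOrCreate()
    import spark.implicits._

    // Same data as step 1, but as a temp view, so the filter is evaluated by Spark itself.
    Seq((1, Option.empty[Int])).toDF("a", "b").createOrReplaceTempView("T")

    // b is NULL, so b <=> 1 is false and !(b <=> 1) is true: the row (1, null) is returned.
    spark.sql("SELECT a, b FROM T WHERE !(b <=> 1)").show()

    spark.stop()
  }
}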

What doesn't meet your expectations?

After filter pushdown, the result is not correct.

Spark supports Equal (i.e. =) and EqualNullSafe (i.e. <=>). After pushdown, Paimon treats both as Equal, and it simplifies Not(Equal) (two predicate expressions) into NotEqual (a single predicate expression). That simplification is wrong for Not(EqualNullSafe), because !(col <=> value) must keep rows where col is NULL; see the sketch after the table below.

This table shows the differences between Spark and Paimon when col is null:

       | WHERE col = 1 | WHERE col <=> 1 | WHERE !(col = 1) | WHERE !(col <=> 1)
Spark  | false         | false           | false            | true
Paimon | false         | false           | false            | false
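
For illustration only, here is a hedged sketch (not Paimon's actual converter code) of how a Spark-to-Paimon filter conversion could keep the null-safe semantics: Not(EqualNullSafe) has to be rewritten as "col <> v OR col IS NULL" rather than collapsed into a plain NotEqual. The Pred case classes and the convert helper are hypothetical stand-ins for Paimon's predicate API; only the Spark filter classes are real.

import org.apache.spark.sql.sources.{EqualNullSafe, EqualTo, Filter, Not}

object NullSafePushdownSketch {

  // Hypothetical predicate ADT standing in for Paimon's Predicate; names are illustrative only.
  sealed trait Pred
  case class Eq(col: String, v: Any)    extends Pred
  case class NotEq(col: String, v: Any) extends Pred // three-valued: NULL rows never match, like SQL <>
  case class IsNullPred(col: String)    extends Pred
  case class OrPred(l: Pred, r: Pred)   extends Pred

  // Sketch of a pushdown conversion that keeps null-safe semantics.
  def convert(filter: Filter): Option[Pred] = filter match {
    // For a non-null literal v, col = v and col <=> v accept the same rows.
    case EqualTo(col, v)            => Some(Eq(col, v))
    case EqualNullSafe(col, v)      => Some(Eq(col, v))
    // !(col = v) follows three-valued logic: NULL rows are filtered out.
    case Not(EqualTo(col, v))       => Some(NotEq(col, v))
    // !(col <=> v) must keep NULL rows: rewrite as (col <> v) OR (col IS NULL)
    // instead of collapsing it into the same NotEq as !(col = v).
    case Not(EqualNullSafe(col, v)) => Some(OrPred(NotEq(col, v), IsNullPred(col)))
    // Anything else is not pushed down and is evaluated by Spark after the scan.
    case _                          => None
  }
}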

Anything else?

No response

Are you willing to submit a PR?

  • I'm willing to submit a PR!
