[tests] Add integration tests for array data type #523
charlesdong1991 wants to merge 4 commits into apache:main from
Conversation
    pa.field("arr_ts_nano", pa.list_(pa.timestamp("ns"))),
    pa.field("arr_float", pa.list_(pa.float32())),
    pa.field("arr_double", pa.list_(pa.float64())),
    # TODO(fluss-python): support PyArrow FixedSizeBinary in schema conversion.
it seems fixed size binary will fail now, and should be supported.
it is nicer to address in a separate PR IMHO, so added a new issue: #524
@leekeiabstraction @fresh-borzoni it will be great if you can take a look! 🙏
fresh-borzoni left a comment

@charlesdong1991 Ty for the PR, LGTM overall.
Left a couple of nits and cleanups, non-blocking though.
        make_key_with_len(id, 3)
    }

    fn make_key_with_len(id: i32, field_count: usize) -> GenericRow<'static> {
nit: `make_key_with_field_count` feels like a better name
    if expected.is_nan() {
        assert!(actual.is_nan(), "expected NaN");
    } else if expected.is_infinite() {
        assert_eq!(actual.is_infinite(), true);
nit: `assert!(actual.is_infinite())`?
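A minimal sketch of the comparison idiom under discussion (the helper name and signature are illustrative, not the PR's actual code): NaN needs its own branch because NaN compares unequal to itself under IEEE 754, and `assert!` on the predicate replaces `assert_eq!(…, true)`.

```rust
// Illustrative float-assertion helper (a sketch, not the PR's real helper):
// NaN != NaN, so it gets an explicit is_nan() check, and assert! on a
// boolean predicate reads better than assert_eq!(.., true).
fn assert_float_eq(expected: f64, actual: f64) {
    if expected.is_nan() {
        assert!(actual.is_nan(), "expected NaN, got {actual}");
    } else if expected.is_infinite() {
        assert!(actual.is_infinite(), "expected an infinity, got {actual}");
        // Also distinguish +inf from -inf.
        assert_eq!(expected.is_sign_positive(), actual.is_sign_positive());
    } else {
        assert_eq!(expected, actual);
    }
}

fn main() {
    assert_float_eq(f64::NAN, f64::NAN);
    assert_float_eq(f64::INFINITY, f64::INFINITY);
    assert_float_eq(f64::NEG_INFINITY, f64::NEG_INFINITY);
    assert_float_eq(1.5, 1.5);
}
```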
        row
    }

    fn make_string_array(values: &[Option<&str>]) -> FlussArray {
we have a lot of similar helpers shared between log_table.rs and kv_table.rs
Let's factor them out to utils.rs, wdyt?
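The suggested refactor could look roughly like this sketch: shared helpers move into a `utils` module that both test files import. `StringArray` below is a stand-in alias, since the crate's real `FlussArray` constructor isn't shown in this thread.

```rust
// Sketch of the suggested refactor: test helpers duplicated across
// log_table.rs and kv_table.rs move into a shared utils module.
mod utils {
    // Stand-in for the crate's FlussArray type (assumption for this sketch).
    pub type StringArray = Vec<Option<String>>;

    // Previously duplicated helper, now defined once and shared.
    pub fn make_string_array(values: &[Option<&str>]) -> StringArray {
        values.iter().map(|v| v.map(str::to_owned)).collect()
    }
}

fn main() {
    let arr = utils::make_string_array(&[Some("a"), None, Some("b")]);
    assert_eq!(arr.len(), 3);
    assert_eq!(arr[0].as_deref(), Some("a"));
    assert!(arr[1].is_none());
}
```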
Thanks for your comments, addressed all, and rebased! @fresh-borzoni PTAL 🙏

All good, thank you for addressing 👍
leekeiabstraction left a comment

TY for the PR. Left a couple of comments. PTAL!
    }

    #[tokio::test]
    async fn upsert_and_lookup_with_array() {
Hmm, this approach increases the number of test cases (eventually when we have map types support as well). Can you try to incorporate these into all_supported_datatypes? You can have additional columns there e.g. array of arrays.
hmm, array test coverage is also added to all_supported_datatypes for both log and kv tests. This test covers the patterns that don't fit the all_supported_datatypes structure that well IMHO, since these require multi-row setups and different null patterns per row.
And if i understand correctly, all_supported_datatypes tests one row with values and one row with all nulls.
If we add Map, i think it's best to follow the same approach as here for Array: a basic column incorporated in all_supported_datatypes, while dedicated tests cover map-specific edge cases.
But i don't have a very strong opinion on this, happy to adjust if you prefer the consolidation! @leekeiabstraction 🙏
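The multi-row null patterns described above can be sketched like this, with plain nested `Option`/`Vec` standing in for the actual Fluss array type:

```rust
// Sketch of per-row null patterns for an array column, using nested
// Option/Vec as stand-ins for the real Fluss array type.
fn main() {
    let rows: Vec<Option<Vec<Option<i32>>>> = vec![
        Some(vec![Some(1), None, Some(3)]), // array containing a null element
        Some(vec![]),                       // empty (but non-null) array
        None,                               // null array
    ];

    // Each row exercises a distinct pattern, which a single
    // one-row-with-values / one-row-all-nulls layout cannot cover.
    assert_eq!(rows.len(), 3);
    assert_eq!(rows[0].as_ref().unwrap().len(), 3);
    assert!(rows[0].as_ref().unwrap()[1].is_none());
    assert!(rows[1].as_ref().unwrap().is_empty());
    assert!(rows[2].is_none());
}
```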
    }

    #[tokio::test]
    async fn append_and_scan_with_array() {
Similarly here on incorporating into all_supported_datatypes test case.
similar reply above 🙏
i think https://github.com/apache/fluss-rust/actions/runs/25275194644/job/74108466204 is a test flake and shouldn't be caused by changes in this PR @leekeiabstraction @fresh-borzoni hope a rerun will fix it. In the meantime, i can take a quick look to understand why.

@charlesdong1991 Yes, it's just a flake with the python tests; we use a single test cluster across 4 different python versions x 4 workers, so under contention this can happen. I need to take a look at this, because it's getting annoying sometimes :)
Purpose
Add integration tests for array data type support in Rust core and Python bindings.
C++ already has integration test coverage, so this PR focuses only on Rust and Python.
Linked issue: close #441
Tests
All tests have passed locally.