Test data labeled NP1100 has electrode positions inconsistent with NP1100 catalogue geometry #407

@h-mayorquin

Description

Three probes in the test dataset OE_Neuropix-PXI-NP-Ultra/settings.xml are labeled probe_part_number="NP1100" (L75, L274, L473), but their electrode positions contradict the NP1100 specification. The XML positions show 48 um vertical spacing across a y range of 0-2256 um. According to the ProbeTable catalogue, NP1100 is a passive probe with only 384 total electrodes at 6 um pitch (y = 0-282 um), a single bank, and no electrode selection. This is an 8x mismatch in vertical spacing and the y range is off by an order of magnitude. Every other probe type in our test data (NP1, NP2, NP2 4-shank, NP-Opto, NP1110, NP1121) matches its catalogue geometry exactly.
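The mismatch reduces to simple arithmetic. A minimal sketch (plain Python, not probeinterface code; the 8-electrodes-per-row assumption is mine, inferred from the stated 0-282 um range) comparing the y extent implied by each geometry:

```python
# Sketch (not probeinterface code): compare the y extent implied by the
# NP1100 catalogue geometry with the positions found in settings.xml.
def y_extent_um(n_rows: int, pitch_um: float) -> float:
    """y range spanned by n_rows electrode rows at a given vertical pitch."""
    return (n_rows - 1) * pitch_um

# NP1100 per the ProbeTable: 384 electrodes at 6 um pitch.
# Assuming 8 electrodes per row, that is 48 rows.
catalogue_extent = y_extent_um(48, 6)   # 282 um, matching the catalogue
# settings.xml: 48 distinct y values at 48 um vertical spacing.
xml_extent = y_extent_um(48, 48)        # 2256 um, matching the XML

print(catalogue_extent, xml_extent)     # 282 2256
```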

I suspect the probe_part_number in the XML is wrong and these are actually NP1110 probes. The positions match NP1110 exactly: NP1110 (datasheet) has 6144 electrodes in an 8x768 grid at 6 um pitch with active electrode selection. Selecting every 8th row, capped at the probe's 384 recording channels (48 full rows of 8 electrodes), gives 48 um effective vertical spacing over a 0-2256 um y range, exactly as observed. NP1110 is the only probe in the NP11xx family with more than 384 electrodes, making it the only one capable of producing the observed positions. The two part numbers differ by a single digit (NP1100 vs NP1110), and NP1100 is listed as non-commercial in the ProbeTable while NP1110 is commercial, so these may have been prototype units with an incorrect part number. Looking at the neuropixels-pxi plugin source, the part number is read directly from the probe's EEPROM chip via readProbePN() and passed through unchanged to the XML, so the mislabel would originate from the chip itself.
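The NP1110 row-selection arithmetic can be checked directly. A small sketch (hypothetical variable names, not plugin or probeinterface code):

```python
# Sketch: NP1110 has an 8 x 768 electrode grid at 6 um row pitch, but only
# 384 recording channels. Selecting every 8th full row (8 electrodes each)
# uses exactly 384 channels and reproduces the positions seen in the XML.
row_pitch_um = 6
n_columns = 8
n_channels = 384

selected_rows = list(range(0, 768, 8))[: n_channels // n_columns]  # 48 rows
y_positions = [row * row_pitch_um for row in selected_rows]

print(y_positions[1] - y_positions[0])  # 48 um effective spacing
print(y_positions[-1])                  # 2256 um, matching the XML
```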

I propose that read_openephys raise an error when it encounters a probe whose XML positions don't match the catalogue geometry, and that we remove the NP1100 test data from our test suite. In this case the mismatch was most likely caused by an incorrect part number on a non-commercial prototype, but in general both possible fallbacks (trusting the XML positions or trusting the catalogue) can be wrong, so I don't feel comfortable silently picking one. Raising an error surfaces the problem to users; if this happens more often we can revisit and discuss a proper solution based on the cases reported. I am doing this in #406.
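One possible shape for the proposed check, as a sketch only (hypothetical function name and signature; the real read_openephys internals differ):

```python
# Sketch of the proposed validation: compare positions parsed from
# settings.xml against the catalogue geometry and raise on disagreement
# rather than silently trusting either source.
import numpy as np

def check_positions_against_catalogue(xml_positions, catalogue_positions, part_number):
    """Raise ValueError if the electrode positions read from settings.xml
    disagree with the geometry the catalogue specifies for this part number."""
    xml_positions = np.asarray(xml_positions, dtype=float)
    catalogue_positions = np.asarray(catalogue_positions, dtype=float)
    if xml_positions.shape != catalogue_positions.shape or not np.allclose(
        xml_positions, catalogue_positions
    ):
        raise ValueError(
            f"Electrode positions in settings.xml do not match the catalogue "
            f"geometry for part number {part_number!r}. The part number "
            f"reported by the probe may be wrong; please open an issue with "
            f"your settings.xml."
        )
```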
