Dataset#

Adapters for mapping NWB datasets to LinkML classes

class DatasetMap#
abstract classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

abstract classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!
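As a sketch of this interface, a map is a pair of classmethods: a predicate (check) and a transformation (apply). The types below are simplified placeholders, not the real Dataset and BuildResult classes, and ScalarMap is a hypothetical example subclass:

```python
from abc import ABC, abstractmethod
from typing import Any, Optional

class DatasetMap(ABC):
    """Sketch of the map interface: a predicate plus a transformation."""

    @classmethod
    @abstractmethod
    def check(cls, dataset: dict) -> bool:
        """Check if this map applies to the given item to read."""

    @classmethod
    @abstractmethod
    def apply(cls, dataset: dict, res: Optional[Any] = None,
              name: Optional[str] = None) -> Any:
        """Actually apply the map!"""

class ScalarMap(DatasetMap):
    """Hypothetical concrete map: matches datasets with no shape."""

    @classmethod
    def check(cls, dataset: dict) -> bool:
        # No shape key (or shape is None) -> treat as a scalar
        return dataset.get("shape") is None

    @classmethod
    def apply(cls, dataset: dict, res=None, name=None):
        # Replace the would-be class with a single named slot
        return {"slot": dataset["name"]}
```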

class MapScalar#

Datasets that hold a single value should be a scalar value, not an array with size 1

Replaces the built class with a slot.

Examples

NWB Schema

datasets:
- name: MyScalar
  doc: A scalar
  dtype: int32
  quantity: '?'

LinkML

attributes:
- name: MyScalar
  description: A scalar
  multivalued: false
  range: int32
  required: false
classmethod check(cls: Dataset) → bool#

Attr                  Value
neurodata_type_inc    None
attributes            None
dims                  None
shape                 None
name                  str
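The conditions in the table above can be read as a single conjunction. A minimal sketch, using a plain namespace object in place of the real Dataset class:

```python
from types import SimpleNamespace

def is_plain_scalar(dataset) -> bool:
    """True when every condition in the table above holds."""
    return (
        dataset.neurodata_type_inc is None
        and dataset.attributes is None
        and dataset.dims is None
        and dataset.shape is None
        and isinstance(dataset.name, str)
    )

# Mirrors the MyScalar example above
scalar = SimpleNamespace(name="MyScalar", neurodata_type_inc=None,
                         attributes=None, dims=None, shape=None)
```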

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class MapScalarAttributes#

A scalar with attributes gets an additional slot, "value", that contains the actual scalar value of this field

classmethod check(cls: Dataset) → bool#

Attr                  Value
neurodata_type_inc    None
attributes            Truthy
dims                  None
shape                 None
name                  str
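A sketch of the transformation this map describes, with hypothetical dict-shaped inputs and outputs standing in for the real Dataset and BuildResult types:

```python
def scalar_with_attributes_to_class(dataset: dict) -> dict:
    """Build a class whose 'value' slot holds the scalar itself,
    with one additional slot per attribute."""
    slots = [{"name": "value", "range": dataset["dtype"], "required": True}]
    for attr in dataset.get("attributes", []):
        slots.append({"name": attr["name"], "range": attr["dtype"]})
    return {"name": dataset["name"], "attributes": slots}
```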

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class MapListlike#

Datasets that refer to other datasets (that handle their own arrays)

classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class MapArraylike#

Datasets without any additional attributes don’t create their own subclass; they’re just an array :).

Replace the base class with the array class, and make a slot that refers to it.

classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class MapArrayLikeAttributes#

The most general case - treat everything that isn’t handled by one of the special cases as an array!

Specifically, we make an Arraylike class such that:

  • Each slot within a subclass indicates a possible dimension.

  • Only dimensions that are present in all the dimension specifiers in the original schema are required.

  • Shape requirements are indicated using max/min cardinalities on the slot.

  • The arraylike object should be stored in the array slot on the containing class (since there are already properties named data).

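The dimension and shape rules above can be sketched as follows. This is a simplified illustration over plain lists; the real implementation operates on NWB spec objects and LinkML slot definitions:

```python
def required_dims(dims_variants):
    """A dimension is required only if it appears in every
    dimension specifier from the original schema."""
    sets = [set(v) for v in dims_variants]
    return set.intersection(*sets) if sets else set()

def shape_to_cardinality(dims, shape):
    """Fixed sizes become exact min/max cardinalities on the slot;
    unconstrained sizes (None) leave the cardinality open."""
    card = {}
    for dim, size in zip(dims, shape):
        card[dim] = (size, size) if size is not None else (None, None)
    return card
```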
NEEDS_NAME = True#
classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class Map1DVector#

When VectorData is subclassed with a name but without dims or attributes, treat it as a normal 1D array slot that replaces any class that would otherwise be built for it
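The matching conditions implied by the description can be sketched as a predicate over a hypothetical dict-shaped dataset spec (the exact conditions the real check uses are assumed here):

```python
def is_1d_vector(dataset: dict) -> bool:
    """Named VectorData subclass with no dims, shape, or attributes."""
    return (
        isinstance(dataset.get("name"), str)
        and dataset.get("neurodata_type_inc") == "VectorData"
        and dataset.get("dims") is None
        and dataset.get("shape") is None
        and not dataset.get("attributes")
    )
```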

classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

class MapNVectors#

An unnamed container that indicates an arbitrary quantity of some other neurodata type.

Most commonly: VectorData is subclassed without a name and with a ‘*’ quantity to indicate arbitrary columns.
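A sketch of that matching condition, again over a hypothetical dict-shaped spec (the real check may be stricter):

```python
def is_n_vectors(dataset: dict) -> bool:
    """Unnamed, includes some neurodata type, arbitrary ('*') quantity."""
    return (
        dataset.get("name") is None
        and dataset.get("neurodata_type_inc") is not None
        and dataset.get("quantity") == "*"
    )
```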

classmethod check(cls: Dataset) → bool#

Check if this map applies to the given item to read

classmethod apply(cls: Dataset, res: BuildResult | None = None, name: str | None = None) → BuildResult#

Actually apply the map!

pydantic model DatasetAdapter#

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.


Fields:
field cls: Dataset [Required]#
build() → BuildResult#

Make this abstract so it can’t be instantiated directly.

Subclasses call build_base() to get the basics true of both groups and datasets
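One way to picture the dispatch over the maps above: try every map's check() against the dataset and apply the single one that matches. This is a hedged sketch with toy maps; the real adapter's dispatch and error handling may differ:

```python
class ScalarMap:
    """Toy map: datasets with no shape become a slot."""
    @classmethod
    def check(cls, dataset):
        return dataset.get("shape") is None

    @classmethod
    def apply(cls, dataset, res=None, name=None):
        return {"slots": [dataset["name"]]}

class ArrayMap:
    """Toy map: datasets with a shape become a class."""
    @classmethod
    def check(cls, dataset):
        return dataset.get("shape") is not None

    @classmethod
    def apply(cls, dataset, res=None, name=None):
        return {"classes": [dataset["name"]]}

MAPS = [ScalarMap, ArrayMap]

def build(dataset):
    # Exactly one map should claim each dataset; anything else is a bug
    matches = [m for m in MAPS if m.check(dataset)]
    if len(matches) != 1:
        raise ValueError(f"expected exactly one matching map, got {len(matches)}")
    return matches[0].apply(dataset)
```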

make_arraylike(cls: Dataset, name: str | None = None) → ClassDefinition#
is_1d(cls: Dataset) → bool#
has_attrs(cls: Dataset) → bool#