Schema#

NWB doesn’t seem to have an established term for a single nwb schema file, so we’re going to call them “schema” objects

class SplitSchema(main, split)#

Create new instance of SplitSchema(main, split)

main: BuildResult#

Alias for field number 0

split: BuildResult | None#

Alias for field number 1
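SplitSchema behaves like a two-field NamedTuple, which is why its fields are documented as aliases for field numbers 0 and 1. A minimal sketch, using a hypothetical stand-in for BuildResult (the real class lives in nwb_linkml):

```python
from typing import NamedTuple, Optional

class BuildResult:
    """Hypothetical stand-in for nwb_linkml's BuildResult, for illustration only."""
    pass

class SplitSchemaSketch(NamedTuple):
    main: BuildResult             # field number 0
    split: Optional[BuildResult]  # field number 1, may be None

pair = SplitSchemaSketch(main=BuildResult(), split=None)
print(pair.main is pair[0])  # → True: the field name aliases the tuple index
```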

pydantic model SchemaAdapter#

An individual schema file in nwb_schema_language

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

__init__ uses __pydantic_self__ instead of the more common self for the first arg to allow self as a field name.

Fields:
field datasets: List[Dataset] [Optional]#
field groups: List[Group] [Optional]#
field imports: List[SchemaAdapter | str] [Optional]#
field namespace: str | None = None#

String name of the containing namespace. Populated by NamespacesAdapter.

field path: Path [Required]#
field version: str | None = None#

Version of the schema, populated by NamespacesAdapter since individual schema files don’t know their version in NWB Schema Language
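Taken together, the fields above can be sketched with a plain dataclass. This is a hedged stand-in: the real SchemaAdapter is a pydantic model, and Group/Dataset are nwb_schema_language types, replaced here with strings for illustration.

```python
from dataclasses import dataclass, field
from pathlib import Path
from typing import List, Optional

@dataclass
class SchemaAdapterSketch:
    """Illustrative stand-in for SchemaAdapter's documented fields."""
    path: Path                        # Required: location of the schema file
    groups: List[str] = field(default_factory=list)    # real type: List[Group]
    datasets: List[str] = field(default_factory=list)  # real type: List[Dataset]
    imports: List[str] = field(default_factory=list)
    namespace: Optional[str] = None   # filled in later by NamespacesAdapter
    version: Optional[str] = None     # filled in later by NamespacesAdapter

adapter = SchemaAdapterSketch(path=Path("core.nwb.base.yaml"))
print(adapter.path.stem)  # → core.nwb.base
```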

build() BuildResult#

Make the LinkML representation for this schema file

Things that will be populated later:

  • id (but need to have a placeholder to instantiate)

  • version
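As a hedged sketch of what build() assembles: the real method returns a BuildResult wrapping a LinkML schema, so the dict below only illustrates the placeholder-then-patch flow for id and version described above; all names are assumptions.

```python
from pathlib import Path
from typing import List, Optional

def build_sketch(path: Path, groups: List[str], datasets: List[str],
                 version: Optional[str] = None) -> dict:
    """Illustrative only: assemble a schema-like dict from adapter fields."""
    return {
        "id": path.stem,        # placeholder id, patched later
        "name": path.stem,
        "version": version,     # populated later by NamespacesAdapter
        "classes": list(groups) + list(datasets),
    }

result = build_sketch(Path("core.nwb.file.yaml"), ["NWBFile"], ["TimeSeries"])
print(result["name"])  # → core.nwb.file
```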

model_post_init(__context: Any) None#

This function is meant to behave like a BaseModel method to initialise private attributes.

It takes context as an argument since that’s what pydantic-core passes when calling it.

Parameters:
  • self – The BaseModel instance.

  • __context – The context.

property created_classes: List[Type[Group | Dataset]]#
property name: str#
property needed_imports: List[str]#

Classes that need to be imported from other namespaces

TODO: also check classes used in links/references
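One way needed_imports could be derived, as a hedged sketch: the type names a schema file references minus those it defines locally. The real property works on Group/Dataset objects; the plain-string helper below is illustrative only.

```python
from typing import List

def needed_imports_sketch(defined: List[str], referenced: List[str]) -> List[str]:
    """Names referenced by this schema but not defined in it must be imported."""
    return sorted(set(referenced) - set(defined))

print(needed_imports_sketch(["NWBFile"], ["NWBContainer", "NWBFile"]))
# → ['NWBContainer']
```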