
cli ¤

bioimageio CLI

Some docstrings use a hair space ' ' to place the added '(default: ...)' on a new line.

Classes:

Name Description
AddWeightsCmd

Add additional weights to a model description by converting from available formats.

ArgMixin
Bioimageio

bioimageio - CLI for bioimage.io resources 🦒

CmdBase
EmptyCache

Empty the bioimageio cache directory.

PackageCmd

Save a resource's metadata with its associated files.

PredictCmd

Run inference on your data with a bioimage.io model.

TestCmd

Test a bioimageio resource (beyond metadata formatting).

UpdateCmdBase
UpdateFormatCmd

Update the metadata format to the latest format version.

UpdateHashesCmd

Create a bioimageio.yaml description with updated file hashes.

ValidateFormatCmd

Validate the metadata format of a bioimageio resource.

WithSource
WithSummaryLogging

Attributes:

Name Type Description
JSON_FILE
WEIGHT_FORMAT_ALIASES
YAML_FILE

JSON_FILE module-attribute ¤

JSON_FILE = 'bioimageio-cli.json'

WEIGHT_FORMAT_ALIASES module-attribute ¤

WEIGHT_FORMAT_ALIASES = AliasChoices('weight-format', 'weights-format', 'weight_format', 'weights_format')

YAML_FILE module-attribute ¤

YAML_FILE = 'bioimageio-cli.yaml'
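WEIGHT_FORMAT_ALIASES lets any of the listed spellings populate the same field. A minimal sketch of how pydantic's AliasChoices achieves this, using a hypothetical WeightsArgs model (not part of the CLI):

```python
from pydantic import AliasChoices, BaseModel, Field


class WeightsArgs(BaseModel):
    # hypothetical model: any aliased spelling maps to `weight_format`
    weight_format: str = Field(
        default="all",
        validation_alias=AliasChoices(
            "weight-format", "weights-format", "weight_format", "weights_format"
        ),
    )


# all spellings validate into the same field
assert WeightsArgs.model_validate({"weights-format": "onnx"}).weight_format == "onnx"
assert WeightsArgs.model_validate({"weight_format": "torchscript"}).weight_format == "torchscript"
assert WeightsArgs().weight_format == "all"
```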

AddWeightsCmd ¤

AddWeightsCmd(**data: Any)

Bases: CmdBase, WithSource, WithSummaryLogging


              flowchart TD
              bioimageio.core.cli.AddWeightsCmd[AddWeightsCmd]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.WithSummaryLogging[WithSummaryLogging]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.AddWeightsCmd
              bioimageio.core.cli.WithSource --> bioimageio.core.cli.AddWeightsCmd
              bioimageio.core.cli.WithSummaryLogging --> bioimageio.core.cli.AddWeightsCmd
              bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
              bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSummaryLogging
              pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
              pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin

Add additional weights to a model description by converting from available formats.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after the class is actually fully initialized.

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
log
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics.

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

output CliPositionalArg[Path]

The path to write the updated model package to.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

source_format Optional[SupportedWeightsFormat]

Exclusively use these weights to convert to other formats.

summary List[Union[Literal['display'], Path]]

Display the validation summary or save it as JSON, Markdown or HTML.

target_format Optional[SupportedWeightsFormat]

Exclusively add this weight format.

tracing bool

Allow tracing when converting pytorch_state_dict to torchscript

verbose bool

Log more (error) output.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
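The ValidationError behavior described above can be seen with any model; a sketch using a hypothetical User model:

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):
    # hypothetical model for illustration
    name: str
    age: int


User(name="ada", age=36)  # valid keyword arguments construct normally

try:
    User(name="ada", age="not a number")
except ValidationError as err:
    caught = err  # __init__ raised: input could not form a valid model

assert caught.error_count() == 1
assert caught.errors()[0]["loc"] == ("age",)
```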

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

output instance-attribute ¤

output: CliPositionalArg[Path]

The path to write the updated model package to.

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

source_format class-attribute instance-attribute ¤

source_format: Optional[SupportedWeightsFormat] = Field(None, alias='source-format')

Exclusively use these weights to convert to other formats.

summary class-attribute instance-attribute ¤

summary: List[Union[Literal['display'], Path]] = Field(default_factory=lambda: ['display'], examples=[Path('summary.md'), Path('bioimageio_summaries/'), ['display', Path('summary.md')]])

Display the validation summary or save it as JSON, Markdown or HTML. The format is chosen based on the suffix: .json, .md, .html. If a folder is given (path w/o suffix) the summary is saved in all formats. Choose/add "display" to render the validation summary to the terminal.

target_format class-attribute instance-attribute ¤

target_format: Optional[SupportedWeightsFormat] = Field(None, alias='target-format')

Exclusively add this weight format.

tracing class-attribute instance-attribute ¤

tracing: bool = True

Allow tracing when converting pytorch_state_dict to torchscript (still uses scripting if possible).

verbose class-attribute instance-attribute ¤

verbose: bool = False

Log more (error) output.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
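The difference between the shallow __copy__ and the deep __deepcopy__ shows up with mutable field values; a sketch with a hypothetical Box model:

```python
from copy import copy, deepcopy

from pydantic import BaseModel


class Box(BaseModel):
    # hypothetical model holding a mutable field
    items: list


original = Box(items=[1, 2])
shallow = copy(original)   # __copy__: field values are shared
deep = deepcopy(original)  # __deepcopy__: field values are duplicated

shallow.items.append(3)
assert original.items == [1, 2, 3]  # shared list mutated through the shallow copy
assert deep.items == [1, 2]         # deep copy is unaffected
```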

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
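As the branches above show, __delattr__ also covers extra fields when extra='allow'; a sketch with a hypothetical Record model:

```python
from pydantic import BaseModel, ConfigDict


class Record(BaseModel):
    # hypothetical model accepting extra fields
    model_config = ConfigDict(extra="allow")
    x: int = 0


r = Record(y=1)
assert r.model_extra == {"y": 1}

del r.y  # removed from __pydantic_extra__, not from a regular field
assert r.model_extra == {}
```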

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
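Equality therefore compares field values (plus extra and private attributes), not how the values were set; a sketch with a hypothetical Item model:

```python
from pydantic import BaseModel


class Item(BaseModel):
    # hypothetical model for illustration
    name: str
    qty: int = 1


# a defaulted field and an explicitly set field compare equal
assert Item(name="a") == Item(name="a", qty=1)
assert Item(name="a") != Item(name="b")

# non-BaseModel operands get NotImplemented, delegating to the other side
assert Item(name="a").__eq__(42) is NotImplemented
assert (Item(name="a") == 42) is False
```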

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
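A typical override of this hook calls the handler for the default schema and then post-processes it; a sketch with a hypothetical Tagged model (resolve_ref_schema unwraps a possible $ref before mutation):

```python
from pydantic import BaseModel


class Tagged(BaseModel):
    # hypothetical model customizing its JSON schema
    x: int

    @classmethod
    def __get_pydantic_json_schema__(cls, core_schema, handler):
        json_schema = handler(core_schema)                  # default generation
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "custom description"   # post-process
        return json_schema


assert Tagged.model_json_schema()["description"] == "custom description"
```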

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }
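Because __getstate__ (with its __setstate__ counterpart) captures the model dict, extras, and the fields-set bookkeeping, models pickle round-trip cleanly; a sketch with a hypothetical Point model:

```python
import pickle

from pydantic import BaseModel


class Point(BaseModel):
    # hypothetical model for illustration
    x: int
    y: int = 0


p = Point(x=1)
restored = pickle.loads(pickle.dumps(p))

assert restored == p
assert restored.model_fields_set == {"x"}  # fields-set survives the round trip
```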

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.
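To make the caveat above concrete, here is a sketch (the `label` keyword is a hypothetical example, not a pydantic feature): a class keyword that is not a `ConfigDict` key is forwarded to `__init_subclass__`, while config keys like `extra` are consumed by `ModelMetaclass` first.

```python
from pydantic import BaseModel

class Tracked(BaseModel):
    def __init_subclass__(cls, **kwargs):
        # config keys (e.g. extra='allow') never arrive here;
        # only unexpected keywords such as 'label' do
        cls.label = kwargs.pop("label", None)
        super().__init_subclass__(**kwargs)

class Child(Tracked, label="demo"):
    x: int = 0

assert Child.label == "demo"
```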

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
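A quick sketch of what `__iter__` enables (assuming pydantic v2). Note that `dict(model)` yields raw field values — nested models stay model instances — unlike `model_dump`, which recurses:

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str

u = User(id=1, name="Ada")
# dict() consumes __iter__'s (key, value) tuples
assert dict(u) == {"id": 1, "name": "Ada"}
```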

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.
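A sketch of the timing difference described above (the registry is a hypothetical example): inside `__pydantic_init_subclass__`, `model_fields` is already populated for the new subclass, which is not guaranteed in a plain `__init_subclass__`.

```python
from typing import Any
from pydantic import BaseModel

registry: dict = {}

class Plugin(BaseModel):
    @classmethod
    def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
        # model_fields is fully populated for cls at this point
        registry[cls.__name__] = set(cls.model_fields)

class ImagePlugin(Plugin):
    path: str

assert registry["ImagePlugin"] == {"path"}
```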

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    model_descr = ensure_description_is_model(self.descr)
    if isinstance(model_descr, v0_4.ModelDescr):
        raise TypeError(
            f"model format {model_descr.format_version} not supported."
            + " Please update the model first."
        )
    updated_model_descr = add_weights(
        model_descr,
        output_path=self.output,
        source_format=self.source_format,
        target_format=self.target_format,
        verbose=self.verbose,
        allow_tracing=self.tracing,
    )
    self.log(updated_model_descr)

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

log ¤

log(descr: Union[ResourceDescr, InvalidDescr])
Source code in src/bioimageio/core/cli.py
def log(self, descr: Union[ResourceDescr, InvalidDescr]):
    _ = descr.validation_summary.log(self.summary)

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
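For illustration, a minimal computed-field sketch (assuming pydantic v2): the mapping is accessed from the class, since instance access is deprecated as noted above.

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    w: int
    h: int

    @computed_field
    @property
    def area(self) -> int:
        return self.w * self.h

# access from the class, not an instance
assert "area" in Rect.model_computed_fields
assert Rect(w=2, h=3).area == 6
```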

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.
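A minimal sketch of the no-validation behavior (assuming pydantic v2): values are stored as-is, defaults are still applied, and `model_fields_set` records only the fields actually passed in.

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anonymous"

# no validation: the string '1' is NOT coerced to int
u = User.model_construct(id="1")
assert u.id == "1"
assert u.name == "anonymous"          # default applied
assert u.model_fields_set == {"id"}   # only explicitly provided fields
```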

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.
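A short usage sketch (assuming pydantic v2); note that, as documented above, `update` values bypass validation:

```python
from pydantic import BaseModel

class Settings(BaseModel):
    host: str
    port: int

s = Settings(host="localhost", port=8080)
s2 = s.model_copy(update={"port": 9090})   # update values are not validated
assert (s2.host, s2.port) == ("localhost", 9090)
assert s.port == 8080                      # original is unchanged
```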

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.
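A minimal sketch of the `mode` distinction (assuming pydantic v2): `'python'` mode keeps native objects such as `date`, while `'json'` mode coerces everything to JSON-serializable types.

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="release", when=date(2024, 1, 15))
# default 'python' mode keeps the date object
assert e.model_dump() == {"name": "release", "when": date(2024, 1, 15)}
# 'json' mode coerces to JSON-serializable types
assert e.model_dump(mode="json") == {"name": "release", "when": "2024-01-15"}
```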

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
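
The `mode`, `exclude_none`, and related parameters described above can be illustrated with a minimal sketch (the `Meeting` model is a hypothetical example, assuming Pydantic v2):

```python
from datetime import date
from typing import Optional

from pydantic import BaseModel


class Meeting(BaseModel):  # hypothetical example model
    when: date
    where: Optional[str] = None


m = Meeting(when=date(2024, 1, 1))

# mode='python' (the default) keeps Python objects such as `date`
python_dump = m.model_dump()

# mode='json' restricts the output to JSON-serializable types
json_dump = m.model_dump(mode="json")

# exclude_none drops fields whose value is None
compact = m.model_dump(exclude_none=True)
```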

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
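
A short sketch of `model_dump_json` with the `indent` parameter (the `User` model is a hypothetical example, assuming Pydantic v2):

```python
import json

from pydantic import BaseModel


class User(BaseModel):  # hypothetical example model
    id: int
    name: str


u = User(id=1, name="Ada")

# compact output by default; `indent` pretty-prints
compact = u.model_dump_json()
pretty = u.model_dump_json(indent=2)
```

The result round-trips through the standard `json` module.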

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
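
A minimal sketch of `model_json_schema` on a hypothetical `User` model (assuming Pydantic v2; only the default parameters are exercised):

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical example model
    id: int
    name: str


# generates a JSON schema dict for the model class
schema = User.model_json_schema()
```

Field types map to JSON Schema types (`int` to `"integer"`), and fields without defaults appear under `required`.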

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
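
The default naming scheme can be observed by parametrizing a generic model (hypothetical `Response` model, assuming Pydantic v2):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Response(BaseModel, Generic[T]):  # hypothetical generic model
    data: T


# subscripting the generic class invokes model_parametrized_name
Parametrized = Response[int]
```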

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
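
A sketch of the forward-reference scenario described above (hypothetical `Tree`/`Branch` models, assuming Pydantic v2 and module-level class definitions so the referenced name is resolvable):

```python
from typing import Optional

from pydantic import BaseModel


class Tree(BaseModel):  # hypothetical model with a forward reference
    value: int
    child: "Optional[Branch]" = None  # `Branch` is not defined yet


class Branch(BaseModel):
    label: str


# resolve the forward reference now that `Branch` exists
Tree.model_rebuild()

t = Tree(value=1, child={"label": "left"})
```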

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
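
The `strict` parameter can be illustrated with a minimal sketch (hypothetical `User` model, assuming Pydantic v2):

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical example model
    id: int
    name: str


# lax mode (the default) coerces the string "1" to an int
u = User.model_validate({"id": "1", "name": "Ada"})

# strict mode rejects the same input
try:
    User.model_validate({"id": "1", "name": "Ada"}, strict=True)
    strict_accepted = True
except ValidationError:
    strict_accepted = False
```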

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
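
A minimal sketch of `model_validate_json`, including the `ValidationError` raised for malformed input (hypothetical `User` model, assuming Pydantic v2):

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical example model
    id: int
    name: str


# parses and validates the JSON in one step
u = User.model_validate_json('{"id": 7, "name": "Grace"}')

# malformed JSON raises ValidationError, not json.JSONDecodeError
try:
    User.model_validate_json("not json")
    parsed = True
except ValidationError:
    parsed = False
```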

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
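
`model_validate_strings` is useful when every leaf value arrives as a string, e.g. from environment variables or CSV cells. A minimal sketch (hypothetical `Event` model, assuming Pydantic v2):

```python
from datetime import date

from pydantic import BaseModel


class Event(BaseModel):  # hypothetical example model
    when: date
    attendees: int


# all values are strings; they are validated as in JSON mode
e = Event.model_validate_strings({"when": "2024-06-01", "attendees": "12"})
```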

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )
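Both deprecated methods map onto `model_json_schema`; `schema_json` is just `model_json_schema` plus an ordinary `json.dumps`. A small sketch (the `Point` model is a hypothetical example, assuming pydantic v2):

```python
import json

import pydantic


class Point(pydantic.BaseModel):
    x: int
    y: int


# Replacement for the deprecated `schema()` classmethod:
schema = Point.model_json_schema()
# Replacement for the deprecated `schema_json()` classmethod:
schema_str = json.dumps(schema)
print(schema["title"], sorted(schema["properties"]))
```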

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
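The typical use for `model_rebuild` (replacing `update_forward_refs`) is resolving a forward annotation that was undefined when the model class was created. A minimal sketch, assuming pydantic v2 and module-level class definitions (`Wrapper` and `Payload` are hypothetical names):

```python
import pydantic


class Wrapper(pydantic.BaseModel):
    payload: "Payload"  # `Payload` does not exist yet at this point


class Payload(pydantic.BaseModel):
    text: str


# Now that `Payload` is defined, resolve the forward reference.
Wrapper.model_rebuild()
w = Wrapper(payload={"text": "hi"})
print(w.payload.text)
```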

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

ArgMixin ¤

ArgMixin(**data: Any)

Bases: BaseModel


              flowchart TD
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                



Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
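To illustrate, `model_extra` is only populated when `extra='allow'` is configured; a short sketch (the `Flexible` model is a hypothetical example, assuming pydantic v2):

```python
import pydantic


class Flexible(pydantic.BaseModel):
    model_config = pydantic.ConfigDict(extra="allow")
    name: str


# `color` is not a declared field, so it lands in `model_extra`,
# and remains accessible as an attribute.
f = Flexible(name="a", color="red")
print(f.model_extra)
print(f.color)
```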

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
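From the user's side, this machinery is what makes subscripting a generic model produce (and cache) a concrete submodel. A small sketch (the `Box` model is a hypothetical example, assuming pydantic v2):

```python
from typing import Generic, TypeVar

import pydantic

T = TypeVar("T")


class Box(pydantic.BaseModel, Generic[T]):
    content: T


# `Box[int]` goes through `__class_getitem__`, which substitutes
# `T -> int` and caches the resulting submodel.
IntBox = Box[int]
b = IntBox(content="7")  # "7" is coerced to int by the bound field
print(type(b.content), b.content)
```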

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
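The practical difference between the two hooks: a shallow copy shares mutable field values with the original, a deep copy does not. A short sketch (the `Holder` model is a hypothetical example, assuming pydantic v2 and Python 3.9+):

```python
from copy import copy, deepcopy

import pydantic


class Holder(pydantic.BaseModel):
    items: list[int]


h = Holder(items=[1, 2])
shallow = copy(h)      # `__copy__`: shares the same list object
deep = deepcopy(h)     # `__deepcopy__`: gets an independent list

h.items.append(3)
print(shallow.items)   # reflects the mutation
print(deep.items)      # does not
```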

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
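The origin-based comparison mentioned in the first branch means a parametrized generic instance compares equal to an unparametrized one with the same field values. A minimal sketch (the `Pair` model is a hypothetical example, assuming pydantic v2):

```python
from typing import Any, Generic, TypeVar

import pydantic

T = TypeVar("T")


class Pair(pydantic.BaseModel, Generic[T]):
    x: T


# Only the generic *origin* types are compared, so this holds:
print(Pair(x=1) == Pair[Any](x=1))
```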

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
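A typical override calls the handler first and then mutates the generated schema; `resolve_ref_schema` unwraps a possible `$ref` before editing. A sketch of this customization pattern (the `Tagged` model is a hypothetical example, assuming pydantic v2):

```python
import pydantic


class Tagged(pydantic.BaseModel):
    value: int

    @classmethod
    def __get_pydantic_json_schema__(cls, core_schema, handler):
        # Generate the default schema, unwrap any `$ref`, then
        # attach extra metadata.
        json_schema = handler(core_schema)
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "A tagged value"
        return json_schema


print(Tagged.model_json_schema()["description"])
```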

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
1229
1230
1231
1232
1233
1234
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
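Because `__iter__` yields `(field, value)` pairs, `dict(model)` works directly; unlike `model_dump`, it does not recursively convert nested models. A short sketch (the `User` model is a hypothetical example, assuming pydantic v2):

```python
import pydantic


class User(pydantic.BaseModel):
    name: str
    age: int


u = User(name="ada", age=36)
# `dict(model)` consumes `__iter__`; values are left as-is.
print(dict(u))
```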

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

- `**kwargs` (`Any`, default `{}`): Any keyword arguments passed to the class definition that aren't used internally by Pydantic.
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

```python
data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)
```

Parameters:

- `include` (`AbstractSetIntStr | MappingIntStrAny | None`, default `None`): Optional set or mapping specifying which fields to include in the copied model.
- `exclude` (`AbstractSetIntStr | MappingIntStrAny | None`, default `None`): Optional set or mapping specifying which fields to exclude in the copied model.
- `update` (`Dict[str, Any] | None`, default `None`): Optional dictionary of field-value pairs to override field values in the copied model.
- `deep` (`bool`, default `False`): If `True`, the values of fields that are Pydantic models will be deep-copied.

Returns:

- `Self`: A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
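
The replacement suggested by the deprecation message can be sketched as follows (the `UserRow`/`User` classes are hypothetical stand-ins for an ORM row and its model):

```python
from pydantic import BaseModel, ConfigDict

class UserRow:  # stands in for an ORM row / attribute-bearing object
    def __init__(self):
        self.id = 1
        self.name = "Ada"

class User(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    id: int
    name: str

# Equivalent to the deprecated User.from_orm(UserRow()):
user = User.model_validate(UserRow())
```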

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
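
A minimal sketch (the `Rect` model is illustrative) showing the recommended class-level access:

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height

# Access on the class; instance access is deprecated and will not
# work in Pydantic V3.
computed = list(Rect.model_computed_fields)
```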

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

- `_fields_set` (`set[str] | None`, default `None`): A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the `model_fields_set` attribute. Otherwise, the field names from the `values` argument will be used.
- `values` (`Any`, default `{}`): Trusted or pre-validated data dictionary.

Returns:

- `Self`: A new instance of the `Model` class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
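
A short sketch (the `Item` model is illustrative) of the trust-the-data semantics described above:

```python
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    qty: int = 1

# Defaults are applied, but no validation runs on the provided values.
item = Item.model_construct(name="bolt")

# Even a wrongly typed value is accepted, since the data is trusted as-is.
unchecked = Item.model_construct(name=123)
```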

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

- `update` (`Mapping[str, Any] | None`, default `None`): Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
- `deep` (`bool`, default `False`): Set to `True` to make a deep copy of the model.

Returns:

- `Self`: New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
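
A short sketch (the `Config` model is illustrative) of `update` and of the shallow-vs-deep distinction noted above:

```python
from pydantic import BaseModel

class Config(BaseModel):
    host: str
    tags: list = []

base = Config(host="localhost", tags=["a"])

# `update` values are merged in without validation.
patched = base.model_copy(update={"host": "0.0.0.0"})

# A shallow copy shares mutable values with the original; a deep copy does not.
shallow = base.model_copy()
deep = base.model_copy(deep=True)
shallow.tags.append("b")
```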

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

- `mode` (`Literal['json', 'python'] | str`, default `'python'`): The mode in which `to_python` should run. If mode is `'json'`, the output will only contain JSON serializable types. If mode is `'python'`, the output may contain non-JSON-serializable Python objects.
- `include` (`IncEx | None`, default `None`): A set of fields to include in the output.
- `exclude` (`IncEx | None`, default `None`): A set of fields to exclude from the output.
- `context` (`Any | None`, default `None`): Additional context to pass to the serializer.
- `by_alias` (`bool | None`, default `None`): Whether to use the field's alias in the dictionary key if defined.
- `exclude_unset` (`bool`, default `False`): Whether to exclude fields that have not been explicitly set.
- `exclude_defaults` (`bool`, default `False`): Whether to exclude fields that are set to their default value.
- `exclude_none` (`bool`, default `False`): Whether to exclude fields that have a value of `None`.
- `exclude_computed_fields` (`bool`, default `False`): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead.
- `round_trip` (`bool`, default `False`): If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`.
- `warnings` (`bool | Literal['none', 'warn', 'error']`, default `True`): How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`.
- `fallback` (`Callable[[Any], Any] | None`, default `None`): A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` is raised.
- `serialize_as_any` (`bool`, default `False`): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- `dict[str, Any]`: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
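
A quick sketch (the `Event` model is illustrative) contrasting the two modes and field filtering:

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="release", when=date(2024, 1, 15))

python_dump = e.model_dump()              # keeps rich Python types
json_dump = e.model_dump(mode="json")     # only JSON-serializable types
partial = e.model_dump(exclude={"when"})  # field filtering
```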

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

- `indent` (`int | None`, default `None`): Indentation to use in the JSON output. If `None` is passed, the output will be compact.
- `ensure_ascii` (`bool`, default `False`): If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped. If `False` (the default), these characters will be output as-is.
- `include` (`IncEx | None`, default `None`): Field(s) to include in the JSON output.
- `exclude` (`IncEx | None`, default `None`): Field(s) to exclude from the JSON output.
- `context` (`Any | None`, default `None`): Additional context to pass to the serializer.
- `by_alias` (`bool | None`, default `None`): Whether to serialize using field aliases.
- `exclude_unset` (`bool`, default `False`): Whether to exclude fields that have not been explicitly set.
- `exclude_defaults` (`bool`, default `False`): Whether to exclude fields that are set to their default value.
- `exclude_none` (`bool`, default `False`): Whether to exclude fields that have a value of `None`.
- `exclude_computed_fields` (`bool`, default `False`): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead.
- `round_trip` (`bool`, default `False`): If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`.
- `warnings` (`bool | Literal['none', 'warn', 'error']`, default `True`): How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`.
- `fallback` (`Callable[[Any], Any] | None`, default `None`): A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` is raised.
- `serialize_as_any` (`bool`, default `False`): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- `str`: A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
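A short usage sketch of `model_dump_json` (the `User` model is illustrative, not part of this CLI):

```python
from typing import Optional

from pydantic import BaseModel


class User(BaseModel):
    name: str
    nickname: Optional[str] = None


u = User(name="Ada")

# Compact output by default; `indent` pretty-prints, `exclude_none` drops null fields.
assert u.model_dump_json() == '{"name":"Ada","nickname":null}'
assert u.model_dump_json(exclude_none=True) == '{"name":"Ada"}'
```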

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
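As the warning above notes, access `model_fields` on the class rather than an instance. A minimal illustrative sketch:

```python
from typing import Optional

from pydantic import BaseModel


class User(BaseModel):
    name: str
    nickname: Optional[str] = None


# Class-level access returns a mapping of field names to FieldInfo instances.
fields = User.model_fields
assert set(fields) == {"name", "nickname"}
assert fields["name"].is_required()
assert not fields["nickname"].is_required()
```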

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications.

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
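A brief sketch of `model_json_schema` on a simple illustrative model:

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int = 0


schema = User.model_json_schema()

# Only fields without defaults are listed as required.
assert schema["title"] == "User"
assert set(schema["properties"]) == {"name", "age"}
assert schema["required"] == ["name"]
```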

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
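A small sketch of the resulting naming scheme for a hypothetical generic model:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Response(BaseModel, Generic[T]):
    data: T


# Parametrizing the generic names the concrete class via model_parametrized_name.
assert Response[int].__name__ == "Response[int]"
```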

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
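A sketch of `model_rebuild` with a self-referencing (forward-referenced) model, assuming a typical usage:

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):
    value: int
    child: Optional["Node"] = None  # forward reference to the class itself


# Returns True when rebuilding succeeds; returns None if the schema was
# already complete and `force` is not set.
assert Node.model_rebuild(force=True) is True

n = Node(value=1, child={"value": 2})
assert n.child is not None and n.child.value == 2
```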

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
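A short sketch of `model_validate`, including the `from_attributes` option (model and class names are illustrative):

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str


# Validate a plain dict.
assert User.model_validate({"name": "Ada"}).name == "Ada"


# `from_attributes=True` reads fields from arbitrary object attributes.
class Row:
    name = "Grace"


assert User.model_validate(Row(), from_attributes=True).name == "Grace"
```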

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
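A sketch of `model_validate_json`, including the documented failure mode for non-JSON input (the `Config` model is illustrative):

```python
from pydantic import BaseModel, ValidationError


class Config(BaseModel):
    name: str


assert Config.model_validate_json('{"name": "unet"}').name == "unet"

# Malformed JSON raises ValidationError, not json.JSONDecodeError.
caught = False
try:
    Config.model_validate_json("not json")
except ValidationError:
    caught = True
assert caught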

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
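A sketch of `model_validate_strings`, which coerces all-string input per field type (the `Point` model is illustrative):

```python
from pydantic import BaseModel


class Point(BaseModel):
    x: int
    flag: bool


# String-only input (e.g. from environment variables) is coerced per field type.
p = Point.model_validate_strings({"x": "3", "flag": "true"})
assert p.x == 3
assert p.flag is True
```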

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

Bioimageio ¤

Bases: BaseSettings



bioimageio - CLI for bioimage.io resources 🦒

Methods:

Name Description
cli_cmd
settings_customise_sources

Attributes:

Name Type Description
add_weights CliSubCommand[AddWeightsCmd]

Add additional weights to a model description by converting from available formats.

empty_cache CliSubCommand[EmptyCache]

Empty the bioimageio cache directory.

model_config
package CliSubCommand[PackageCmd]

Package a resource

predict CliSubCommand[PredictCmd]

Predict with a model resource

test CliSubCommand[TestCmd]

Test a bioimageio resource (beyond meta data formatting)

update_format CliSubCommand[UpdateFormatCmd]

Update the metadata format

update_hashes CliSubCommand[UpdateHashesCmd]

Create a bioimageio.yaml description with updated file hashes.

validate_format CliSubCommand[ValidateFormatCmd]

Check a resource's metadata format

add_weights class-attribute instance-attribute ¤

add_weights: CliSubCommand[AddWeightsCmd] = Field(alias='add-weights')

Add additional weights to a model description by converting from available formats.

empty_cache class-attribute instance-attribute ¤

empty_cache: CliSubCommand[EmptyCache] = Field(alias='empty-cache')

Empty the bioimageio cache directory.

model_config class-attribute instance-attribute ¤

model_config = SettingsConfigDict(json_file=JSON_FILE, yaml_file=YAML_FILE)

package instance-attribute ¤

package: CliSubCommand[PackageCmd]

Package a resource

predict instance-attribute ¤

predict: CliSubCommand[PredictCmd]

Predict with a model resource

test instance-attribute ¤

test: CliSubCommand[TestCmd]

Test a bioimageio resource (beyond meta data formatting)

update_format class-attribute instance-attribute ¤

update_format: CliSubCommand[UpdateFormatCmd] = Field(alias='update-format')

Update the metadata format

update_hashes class-attribute instance-attribute ¤

update_hashes: CliSubCommand[UpdateHashesCmd] = Field(alias='update-hashes')

Create a bioimageio.yaml description with updated file hashes.

validate_format class-attribute instance-attribute ¤

validate_format: CliSubCommand[ValidateFormatCmd] = Field(alias='validate-format')

Validate the metadata format of a bioimageio resource.
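
The subcommands and their dashed aliases above translate into invocations like the following. This is an illustrative sketch only: `bioimageio.yaml` stands in for any resource description source, and the exact positional arguments each subcommand accepts are not documented on this page.

```shell
bioimageio validate-format bioimageio.yaml   # check the metadata format only
bioimageio test bioimageio.yaml              # full resource test (beyond metadata formatting)
bioimageio package bioimageio.yaml           # save metadata with its associated files
bioimageio update-format bioimageio.yaml     # migrate to the latest format version
bioimageio update-hashes bioimageio.yaml     # write a description with updated file hashes
bioimageio empty-cache                       # empty the bioimageio cache directory
```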

cli_cmd ¤

cli_cmd() -> None
Source code in src/bioimageio/core/cli.py
def cli_cmd(self) -> None:
    logger.info(
        "executing CLI command:\n{}",
        pformat({k: v for k, v in self.model_dump().items() if v is not None}),
    )
    _ = CliApp.run_subcommand(self)

settings_customise_sources classmethod ¤

settings_customise_sources(settings_cls: Type[BaseSettings], init_settings: PydanticBaseSettingsSource, env_settings: PydanticBaseSettingsSource, dotenv_settings: PydanticBaseSettingsSource, file_secret_settings: PydanticBaseSettingsSource) -> Tuple[PydanticBaseSettingsSource, ...]
Source code in src/bioimageio/core/cli.py
@classmethod
def settings_customise_sources(
    cls,
    settings_cls: Type[BaseSettings],
    init_settings: PydanticBaseSettingsSource,
    env_settings: PydanticBaseSettingsSource,
    dotenv_settings: PydanticBaseSettingsSource,
    file_secret_settings: PydanticBaseSettingsSource,
) -> Tuple[PydanticBaseSettingsSource, ...]:
    cli: CliSettingsSource[BaseSettings] = CliSettingsSource(
        settings_cls,
        cli_parse_args=True,
        formatter_class=RawTextHelpFormatter,
    )
    sys_args = pformat(sys.argv)
    logger.info("starting CLI with arguments:\n{}", sys_args)
    return (
        cli,
        init_settings,
        YamlConfigSettingsSource(settings_cls),
        JsonConfigSettingsSource(settings_cls),
    )
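
The returned source order means explicit CLI arguments take precedence over init settings, which in turn take precedence over `bioimageio-cli.yaml` and then `bioimageio-cli.json`. A hypothetical `bioimageio-cli.yaml` supplying defaults might look like this; the keys under each subcommand are illustrative assumptions, not options documented on this page:

```yaml
# bioimageio-cli.yaml -- picked up by YamlConfigSettingsSource as a fallback
# after CLI arguments and init settings (keys below are assumptions)
predict:
  weight_format: pytorch_state_dict
```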

CmdBase ¤

CmdBase(**data: Any)

Bases: BaseModel


              flowchart TD
              bioimageio.core.cli.CmdBase[CmdBase]
              pydantic.main.BaseModel[BaseModel]

                              pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                


              click bioimageio.core.cli.CmdBase href "" "bioimageio.core.cli.CmdBase"
              click pydantic.main.BaseModel href "" "pydantic.main.BaseModel"
            

Create a new model by parsing and validating input data from keyword arguments. Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete.

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics.

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__.

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__.

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.
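
Both properties are easiest to see side by side. A minimal sketch, assuming pydantic v2 is installed; the `Item` model is hypothetical:

```python
from pydantic import BaseModel, ConfigDict


class Item(BaseModel):
    model_config = ConfigDict(extra="allow")  # keep unknown keys instead of rejecting them
    name: str
    count: int = 0  # filled from the default, so absent from model_fields_set

item = Item(name="probe", note="unknown keys land in model_extra")

extra = item.model_extra                 # {'note': 'unknown keys land in model_extra'}
explicitly_set = item.model_fields_set   # contains 'name', but not the defaulted 'count'
```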

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
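
In practice this is what powers parametrization of generic models, including the caching visible above. A sketch, assuming pydantic v2; the `Box` model is hypothetical:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Box(BaseModel, Generic[T]):
    item: T

IntBox = Box[int]   # __class_getitem__ creates (and caches) a parametrized submodel
same = Box[int]     # the second lookup hits the generic-type cache

boxed = IntBox(item=5)
```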

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
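
The difference between the two copies matters for mutable field values. A sketch, assuming pydantic v2; the `Tags` model is hypothetical:

```python
from copy import copy, deepcopy

from pydantic import BaseModel


class Tags(BaseModel):
    values: list[str]

original = Tags(values=["a"])
shallow = copy(original)    # __copy__: copies __dict__, shares nested objects
deep = deepcopy(original)   # __deepcopy__: clones nested objects too

shallow.values.append("b")  # mutates the list shared with `original`
```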

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
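
A typical override calls the handler first and post-processes the result. A sketch, assuming pydantic v2; the `Meters` model and its description string are hypothetical, and `resolve_ref_schema` is used so the edit lands on the actual definition rather than a `$ref` wrapper:

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic_core import CoreSchema


class Meters(BaseModel):
    value: float

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ):
        json_schema = handler(core_schema)                  # default generation
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "A length in meters"   # post-process in place
        return json_schema

schema = Meters.model_json_schema()
```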

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
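
Because of this, a model behaves like an iterable of `(field_name, value)` pairs. A sketch, assuming pydantic v2; the `Pixel` model is hypothetical:

```python
from pydantic import BaseModel


class Pixel(BaseModel):
    x: int
    y: int

p = Pixel(x=3, y=4)

as_dict = dict(p)   # __iter__ makes this work; values are the raw field values
as_pairs = list(p)  # [('x', 3), ('y', 4)]
```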

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `**kwargs` | `Any` | Any keyword arguments passed to the class definition that aren't used internally by Pydantic. | `{}` |

Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

```python
data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)
```

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `include` | `AbstractSetIntStr \| MappingIntStrAny \| None` | Optional set or mapping specifying which fields to include in the copied model. | `None` |
| `exclude` | `AbstractSetIntStr \| MappingIntStrAny \| None` | Optional set or mapping specifying which fields to exclude in the copied model. | `None` |
| `update` | `Dict[str, Any] \| None` | Optional dictionary of field-value pairs to override field values in the copied model. | `None` |
| `deep` | `bool` | If `True`, the values of fields that are Pydantic models will be deep-copied. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | A copy of the model with included, excluded and updated fields as specified. |

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
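
For illustration (assuming pydantic v2 is installed), a model with one computed field; note that the mapping is accessed from the class, since instance access is deprecated:

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    w: int
    h: int

    @computed_field
    @property
    def area(self) -> int:
        return self.w * self.h

# Access the mapping on the class, not on an instance.
computed_names = list(Rect.model_computed_fields)
```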

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `_fields_set` | `set[str] \| None` | A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the `model_fields_set` attribute. Otherwise, the field names from the `values` argument will be used. | `None` |
| `values` | `Any` | Trusted or pre-validated data dictionary. | `{}` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | A new instance of the `Model` class with validated data. |

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
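
A short sketch (pydantic v2 assumed): `model_construct` skips validation but still applies defaults and records which fields were explicitly set:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# No validation is performed; 'x' is taken as-is and 'y' gets its default.
p = Point.model_construct(x=1)
```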

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `update` | `Mapping[str, Any] \| None` | Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. | `None` |
| `deep` | `bool` | Set to `True` to make a deep copy of the model. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | New model instance. |

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
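
A minimal sketch (assuming pydantic v2 is installed): `update` values are applied without validation, and the original instance is left untouched:

```python
from pydantic import BaseModel

class Config(BaseModel):
    host: str
    port: int = 8080

base = Config(host="localhost")

# The update mapping is applied without validation; base is unchanged.
prod = base.model_copy(update={"port": 443})
```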

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `mode` | `Literal['json', 'python'] \| str` | The mode in which `to_python` should run. If mode is `'json'`, the output will only contain JSON serializable types. If mode is `'python'`, the output may contain non-JSON-serializable Python objects. | `'python'` |
| `include` | `IncEx \| None` | A set of fields to include in the output. | `None` |
| `exclude` | `IncEx \| None` | A set of fields to exclude from the output. | `None` |
| `context` | `Any \| None` | Additional context to pass to the serializer. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias in the dictionary key if defined. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of `None`. | `False` |
| `exclude_computed_fields` | `bool` | Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead. | `False` |
| `round_trip` | `bool` | If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`. | `False` |
| `warnings` | `bool \| Literal['none', 'warn', 'error']` | How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`. | `True` |
| `fallback` | `Callable[[Any], Any] \| None` | A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | A dictionary representation of the model. |

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
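
A quick sketch (pydantic v2 assumed) contrasting the two modes: `'python'` keeps native objects such as `date`, while `'json'` coerces them to JSON-serializable types:

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="launch", when=date(2024, 5, 1))

python_dump = e.model_dump()           # keeps the date object as-is
json_dump = e.model_dump(mode="json")  # coerces to JSON-serializable types
```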

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `indent` | `int \| None` | Indentation to use in the JSON output. If `None` is passed, the output will be compact. | `None` |
| `ensure_ascii` | `bool` | If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped. If `False` (the default), these characters will be output as-is. | `False` |
| `include` | `IncEx \| None` | Field(s) to include in the JSON output. | `None` |
| `exclude` | `IncEx \| None` | Field(s) to exclude from the JSON output. | `None` |
| `context` | `Any \| None` | Additional context to pass to the serializer. | `None` |
| `by_alias` | `bool \| None` | Whether to serialize using field aliases. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of `None`. | `False` |
| `exclude_computed_fields` | `bool` | Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead. | `False` |
| `round_trip` | `bool` | If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`. | `False` |
| `warnings` | `bool \| Literal['none', 'warn', 'error']` | How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`. | `True` |
| `fallback` | `Callable[[Any], Any] \| None` | A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `str` | A JSON string representation of the model. |

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
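
Since instance access is deprecated, read the mapping from the class itself; a minimal sketch (the `Config` model is illustrative):

```python
from pydantic import BaseModel, Field

class Config(BaseModel):
    name: str
    retries: int = Field(default=3, description="number of attempts")

# Access model_fields on the class, not on an instance.
for field_name, info in Config.model_fields.items():
    print(field_name, info.annotation, info.is_required())
```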

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

A subclass of GenerateJsonSchema used to override the logic for generating the JSON schema with your desired modifications.

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
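
A short example of generating and inspecting a schema (the `User` model is illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anon"

schema = User.model_json_schema()
# Fields with defaults are omitted from the "required" list.
print(schema["required"])            # ['id']
print(schema["properties"]["name"])
```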

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
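
The default naming scheme can be observed by parametrizing a generic model (names are illustrative):

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Response(BaseModel, Generic[T]):
    data: T

# Parametrization calls model_parametrized_name((int,)) under the hood.
print(Response[int].__name__)  # Response[int]
```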

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
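
In the default lax mode, compatible input is coerced; passing `strict=True` disables that (the `Item` model is illustrative):

```python
from pydantic import BaseModel, ValidationError

class Item(BaseModel):
    sku: str
    qty: int

item = Item.model_validate({"sku": "A1", "qty": "3"})  # "3" is coerced to 3
print(item.qty)

try:
    Item.model_validate({"sku": "A1", "qty": "3"}, strict=True)
except ValidationError as err:
    # In strict mode the string "3" is rejected for an int field.
    print(err.error_count())
```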

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
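
Parsing and validating happen in one step, avoiding an intermediate json.loads pass (the `Event` model is illustrative):

```python
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    attendees: int

ev = Event.model_validate_json('{"name": "sprint", "attendees": 5}')
print(ev.attendees)  # 5
```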

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
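
This is useful when every leaf value arrives as a string, e.g. from environment variables or CSV cells (the `Record` model is illustrative):

```python
from datetime import date
from pydantic import BaseModel

class Record(BaseModel):
    count: int
    when: date

rec = Record.model_validate_strings({"count": "7", "when": "2024-01-31"})
print(rec.count, rec.when)  # 7 2024-01-31
```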

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)
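
The deprecated V1-style methods above each map to a V2 replacement, as their warnings state; a quick side-by-side (the `M` model is illustrative):

```python
from pydantic import BaseModel

class M(BaseModel):
    x: int

# Deprecated              -> current equivalent
# M.parse_obj(data)       -> M.model_validate(data)
# M.parse_raw(json_str)   -> M.model_validate_json(json_str)
# M.schema()              -> M.model_json_schema()
# M.validate(data)        -> M.model_validate(data)
# M.update_forward_refs() -> M.model_rebuild()
m = M.model_validate({"x": 1})
print(m.x)  # 1
```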

EmptyCache ¤

EmptyCache(**data: Any)

Bases: CmdBase


              flowchart TD
              bioimageio.core.cli.EmptyCache[EmptyCache]
              bioimageio.core.cli.CmdBase[CmdBase]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.EmptyCache
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                




Empty the bioimageio cache directory.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics.

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__.

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
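As a minimal illustrative sketch (using a hypothetical `Point` model, not part of this CLI), construction parses and validates keyword arguments and raises `ValidationError` on failure:

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int

# Valid keyword arguments are parsed and validated on construction.
p = Point(x=1, y=2)
assert (p.x, p.y) == (1, 2)

# Invalid input raises pydantic_core.ValidationError.
try:
    Point(x="not an int", y=2)
except ValidationError as exc:
    assert exc.error_count() == 1
```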

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__.

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
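A short sketch of the two cases, using hypothetical models:

```python
from pydantic import BaseModel, ConfigDict

class Flexible(BaseModel):
    model_config = ConfigDict(extra="allow")
    known: int

# Unrecognized inputs are collected when extra is set to "allow".
f = Flexible(known=1, surprise="hi")
assert f.model_extra == {"surprise": "hi"}

class Plain(BaseModel):
    known: int

# With the default config (extra not set to "allow"), model_extra is None.
assert Plain(known=1).model_extra is None
```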

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.
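For example (an illustrative model, not from this package), fields filled from defaults are excluded:

```python
from pydantic import BaseModel

class Cfg(BaseModel):
    host: str = "localhost"
    port: int = 8080

c = Cfg(port=9000)
# Only explicitly provided fields appear; defaulted fields do not.
assert c.model_fields_set == {"port"}
assert c.host == "localhost"
```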

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
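A sketch of the parametrization and caching behavior described above, with a hypothetical generic model:

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Wrapper(BaseModel, Generic[T]):
    value: T

# Parametrizing creates (and caches) a concrete submodel.
IntWrapper = Wrapper[int]
assert IntWrapper is Wrapper[int]            # cached, same object
assert IntWrapper.__name__ == "Wrapper[int]"  # via model_parametrized_name
assert IntWrapper(value="3").value == 3       # "3" coerced to int
```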

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
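The difference between the two copy protocols can be sketched with a hypothetical model holding a mutable field:

```python
from copy import copy, deepcopy
from pydantic import BaseModel

class Box(BaseModel):
    items: list[int]

b = Box(items=[1, 2])
shallow = copy(b)
deep = deepcopy(b)

# A shallow copy shares mutable field values with the original...
shallow.items.append(3)
assert b.items == [1, 2, 3]

# ...while a deep copy does not.
deep.items.append(4)
assert b.items == [1, 2, 3]
assert deep.items == [1, 2, 4]
```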

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
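The extra-field branch of this method can be sketched as follows (illustrative model only):

```python
from pydantic import BaseModel, ConfigDict

class Flex(BaseModel):
    model_config = ConfigDict(extra="allow")

f = Flex(tmp=1)
assert f.tmp == 1

# Deleting an extra attribute removes it from __pydantic_extra__.
del f.tmp
assert f.model_extra == {}
try:
    f.tmp
except AttributeError:
    pass
```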

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
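A sketch of overriding this hook on a hypothetical model (note `handler` may return a $ref wrapper, hence the `resolve_ref_schema` call before editing in place):

```python
from pydantic import BaseModel

class Tagged(BaseModel):
    name: str

    @classmethod
    def __get_pydantic_json_schema__(cls, core_schema, handler):
        json_schema = handler(core_schema)
        # Resolve a possible $ref wrapper before mutating the schema.
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "A tagged thing"
        return json_schema

schema = Tagged.model_json_schema()
assert schema["description"] == "A tagged thing"
```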

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }
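The shape of the returned state dictionary, sketched with a hypothetical model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
state = p.__getstate__()
# The state captures field values, extras, set-fields, and privates.
assert state["__dict__"] == {"x": 1, "y": 2}
assert state["__pydantic_fields_set__"] == {"x", "y"}
assert state["__pydantic_extra__"] is None
assert state["__pydantic_private__"] is None
```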

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
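For example (illustrative model), iteration yields (field, value) pairs:

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

u = User(name="Ada", age=36)
# Iteration yields (field, value) pairs, so dict(model) works.
assert dict(u) == {"name": "Ada", "age": 36}
# Values are not serialized here; use model_dump() for serialization.
assert u.model_dump() == {"name": "Ada", "age": 36}
```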

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'
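The resulting formats, sketched with a hypothetical model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

# __repr__ combines the class name with comma-separated field reprs;
# __str__ uses space-separated pairs without the class name.
assert repr(Point(x=1, y=2)) == "Point(x=1, y=2)"
assert str(Point(x=1, y=2)) == "x=1 y=2"
```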

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    empty_cache()

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)
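
A short sketch of the recommended replacement for the deprecated `copy(include=..., update=...)` pattern, using a hypothetical model for illustration. Round-tripping through `model_dump`/`model_validate` has the added benefit that the result is re-validated:

```python
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float

item = Item(name="pen", price=1.5)
# Instead of the deprecated item.copy(include=..., update=...):
data = item.model_dump(include={"name", "price"}, round_trip=True)
data = {**data, "price": 2.0}  # apply the update
copied = Item.model_validate(data)  # validated, unlike copy()
assert copied == Item(name="pen", price=2.0)
```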

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
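
A minimal sketch of the recommended replacement, assuming a hypothetical attribute-bearing object standing in for an ORM row:

```python
from pydantic import BaseModel, ConfigDict

class UserRow:  # stand-in for an ORM row object
    def __init__(self) -> None:
        self.id = 1
        self.name = "ada"

class User(BaseModel):
    # Instead of the deprecated from_orm(), enable from_attributes
    # and use model_validate() directly:
    model_config = ConfigDict(from_attributes=True)
    id: int
    name: str

user = User.model_validate(UserRow())
assert user.name == "ada"
```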

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
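
A small example (model name made up) showing the class-level access the warning above recommends:

```python
from pydantic import BaseModel, computed_field

class Square(BaseModel):
    side: float

    @computed_field
    @property
    def area(self) -> float:
        return self.side ** 2

s = Square(side=3.0)
assert s.area == 9.0
# Access the mapping from the class; instance access is deprecated:
assert "area" in Square.model_computed_fields
```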

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
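
A minimal sketch (hypothetical model) of the "no validation" behavior described above — an ill-typed value passes straight through, while defaults are still applied:

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anonymous"

# No validation is performed: the string is stored as-is for `id`,
# while the default for `name` is still applied.
u = User.model_construct(id="not-an-int")
assert u.id == "not-an-int"
assert u.name == "anonymous"
assert u.model_fields_set == {"id"}
```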

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
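
A short example (model made up for illustration) of the shallow-vs-deep distinction: by default only `__dict__` is copied, so mutable field values are shared with the original unless `deep=True`:

```python
from pydantic import BaseModel

class Config(BaseModel):
    tags: list[str]
    retries: int = 3

base = Config(tags=["a"])
deep = base.model_copy(deep=True)
shallow = base.model_copy(update={"retries": 5})

# The shallow copy shares the mutable list with the original:
shallow.tags.append("b")
assert base.tags == ["a", "b"]
# The deep copy does not:
deep.tags.append("c")
assert base.tags == ["a", "b"]
assert shallow.retries == 5
```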

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
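
A brief example (hypothetical model) contrasting the two dump modes and the `exclude` parameter described above:

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="release", when=date(2024, 1, 2))
# mode='python' (the default) keeps native Python objects;
# mode='json' coerces values to JSON-serializable types.
assert e.model_dump()["when"] == date(2024, 1, 2)
assert e.model_dump(mode="json")["when"] == "2024-01-02"
assert e.model_dump(exclude={"when"}) == {"name": "release"}
```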

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
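
A minimal example (model made up) of the compact default versus `indent`:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
compact = p.model_dump_json()  # no whitespace by default
pretty = p.model_dump_json(indent=2)
assert compact == '{"x":1,"y":2}'
assert "\n" in pretty
```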

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
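A short example of inspecting the mapping, accessed from the class as the warning above recommends (model and field names are illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int = 0

# Access model_fields on the class; instance access is deprecated.
fields = User.model_fields
print(sorted(fields))                # field names
print(fields["name"].is_required())  # True: no default given
print(fields["age"].default)         # 0
```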

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
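A minimal sketch of generating and inspecting a schema with the defaults above (model and field names are illustrative):

```python
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float = 1.0

schema = Item.model_json_schema()
print(schema["title"])                           # model name becomes the title
print(schema["required"])                        # only fields without defaults
print(schema["properties"]["price"]["default"])  # defaults are included
```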

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
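A sketch of the default naming scheme and a custom override (class names here are illustrative):

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Response(BaseModel, Generic[T]):
    data: T

# Default scheme: '{cls.__name__}[{params}]'
print(Response[int].__name__)

class Wrapped(BaseModel, Generic[T]):
    data: T

    @classmethod
    def model_parametrized_name(cls, params):
        # Custom scheme used when the concrete class is created.
        return f"{cls.__name__}Of{params[0].__name__.capitalize()}"

print(Wrapped[int].__name__)
```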

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
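A typical use: an annotation references a class defined later, so the first schema build leaves an unresolved `ForwardRef`; an explicit rebuild then completes it (class names are illustrative):

```python
from pydantic import BaseModel

class Outer(BaseModel):
    inner: "Inner"  # Inner does not exist yet -> schema stays incomplete

class Inner(BaseModel):
    x: int

first = Outer.model_rebuild()   # True: rebuilding was required and succeeded
second = Outer.model_rebuild()  # None: schema already complete
print(first, second)
print(Outer.model_validate({"inner": {"x": 1}}).inner.x)
```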

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
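A short sketch of lax versus strict validation (model and field names are illustrative):

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int

# Lax mode (the default) coerces compatible input such as numeric strings:
p = Point.model_validate({"x": 1, "y": "2"})
print(p.y)  # 2

# Strict mode rejects the same input:
try:
    Point.model_validate({"x": 1, "y": "2"}, strict=True)
except ValidationError:
    print("rejected in strict mode")
```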

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
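A minimal sketch, parsing and validating JSON in one step (model and field names are illustrative):

```python
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    count: int

# Parses the JSON and validates the result in a single pass.
e = Event.model_validate_json('{"name": "run", "count": 3}')
print(e.name, e.count)
```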

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
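A sketch for all-string input, as produced by environment variables, query parameters, or CSV rows (model and field names are illustrative):

```python
from datetime import date
from pydantic import BaseModel

class Record(BaseModel):
    id: int
    when: date

# Every value is a string; each is coerced according to its field type.
r = Record.model_validate_strings({"id": "7", "when": "2024-01-31"})
print(r.id, r.when)
```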

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)
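As the deprecation notice suggests, the V1-era helper maps directly onto `model_validate`; a quick before/after sketch (model name is illustrative):

```python
from pydantic import BaseModel

class Cfg(BaseModel):
    debug: bool

# V1 style (deprecated, emits a warning):  Cfg.parse_obj({"debug": True})
# V2 replacement:
c = Cfg.model_validate({"debug": True})
print(c.debug)
```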

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

PackageCmd ¤

PackageCmd(**data: Any)

Bases: CmdBase, WithSource, WithSummaryLogging


              flowchart TD
              bioimageio.core.cli.PackageCmd[PackageCmd]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.WithSummaryLogging[WithSummaryLogging]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.PackageCmd
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.PackageCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                


                bioimageio.core.cli.WithSummaryLogging --> bioimageio.core.cli.PackageCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSummaryLogging
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                





Save a resource's metadata with its associated files.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
log
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

path CliPositionalArg[Path]

The path to write the (zipped) package to.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

summary List[Union[Literal['display'], Path]]

Display the validation summary or save it as JSON, Markdown or HTML.

weight_format WeightFormatArgAll

The weight format to include in the package (for model descriptions only).

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

path instance-attribute ¤

path: CliPositionalArg[Path]

The path to write the (zipped) package to. If it does not have a .zip suffix this command will save the package as an unzipped folder instead.

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

summary class-attribute instance-attribute ¤

summary: List[Union[Literal['display'], Path]] = Field(default_factory=lambda: ['display'], examples=[Path('summary.md'), Path('bioimageio_summaries/'), ['display', Path('summary.md')]])

Display the validation summary or save it as JSON, Markdown or HTML. The format is chosen based on the suffix: .json, .md, .html. If a folder is given (path w/o suffix) the summary is saved in all formats. Choose/add "display" to render the validation summary to the terminal.
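The suffix-based format selection described above can be sketched with a hypothetical stdlib-only helper (`summary_formats` is illustrative, not part of the bioimageio API):

```python
from pathlib import Path
from typing import List

def summary_formats(dest: Path) -> List[str]:
    """Pick output formats from a destination path's suffix.

    Hypothetical helper mirroring the rules above: .json/.md/.html
    select one format; a path without a suffix is treated as a folder
    and the summary is saved in all formats.
    """
    by_suffix = {".json": "JSON", ".md": "Markdown", ".html": "HTML"}
    if dest.suffix in by_suffix:
        return [by_suffix[dest.suffix]]
    return list(by_suffix.values())  # folder: save in all formats

print(summary_formats(Path("summary.md")))             # ['Markdown']
print(summary_formats(Path("bioimageio_summaries/")))  # ['JSON', 'Markdown', 'HTML']
```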

weight_format class-attribute instance-attribute ¤

weight_format: WeightFormatArgAll = Field('all', alias='weight-format', validation_alias=WEIGHT_FORMAT_ALIASES)

The weight format to include in the package (for model descriptions only).

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
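The caching behavior above (identical type arguments yield the identical parametrized class) can be illustrated with a minimal stdlib sketch; this is an assumption-level mock of the pattern, not pydantic's actual `_generics` machinery:

```python
# Minimal sketch of the __class_getitem__ caching pattern: look the
# parametrization up in a cache first, build and cache it on a miss.
_generic_cache = {}

class CachingGenericMeta(type):
    def __getitem__(cls, typevar_values):
        key = (cls, typevar_values)
        cached = _generic_cache.get(key)  # "early" cache lookup
        if cached is not None:
            return cached
        # build a new parametrized subclass, then cache it
        submodel = type(f"{cls.__name__}[{typevar_values!r}]", (cls,), {})
        _generic_cache[key] = submodel
        return submodel

class Model(metaclass=CachingGenericMeta):
    pass

assert Model[int] is Model[int]      # second lookup hits the cache
assert Model[int] is not Model[str]  # different arguments, different class
```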

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
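The practical difference between the two methods above is whether nested values are shared or duplicated. A small illustration with plain objects (no pydantic required):

```python
from copy import copy, deepcopy

class Box:
    def __init__(self, payload):
        self.payload = payload

original = Box({"weights": [1, 2, 3]})

shallow = copy(original)            # new Box, but the payload dict is shared
shallow.payload["weights"].append(4)
print(original.payload["weights"])  # [1, 2, 3, 4] - mutation is visible

deep = deepcopy(original)           # payload (and the list inside) duplicated
deep.payload["weights"].append(5)
print(original.payload["weights"])  # still [1, 2, 3, 4] - deep copy is independent
```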

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)
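`__replace__` is the protocol behind `copy.replace` (Python 3.13+); the same copy-with-changes idea is familiar from `dataclasses.replace`, illustrated here with a dataclass rather than a pydantic model:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Config:
    device: str = "cpu"
    batch: int = 1

cfg = Config()
gpu_cfg = replace(cfg, device="cuda")  # new instance with one field changed
print(gpu_cfg)  # Config(device='cuda', batch=1)
print(cfg)      # Config(device='cpu', batch=1) - original is unchanged
```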

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    if isinstance(self.descr, InvalidDescr):
        self.log(self.descr)
        raise ValueError(f"Invalid {self.descr.type} description.")

    sys.exit(
        package(
            self.descr,
            self.path,
            weight_format=self.weight_format,
        )
    )

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)
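The migration recipe quoted in the docstring can be exercised directly. A sketch (model names are illustrative, assuming pydantic v2):

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

u = User(name="ada", age=36)

# equivalent of the deprecated u.copy(update={"age": 37}),
# but with validation of the resulting data
data = u.model_dump(round_trip=True)
data = {**data, "age": 37}
copied = User.model_validate(data)
print(copied)  # name='ada' age=37
```

Unlike the deprecated `copy`, this round-trip re-validates the updated values, so a bad `update` fails loudly instead of producing an inconsistent instance.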

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
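The replacement the deprecation message points to looks like this in practice; a sketch with a stand-in ORM object (assuming pydantic v2):

```python
from pydantic import BaseModel, ConfigDict

class UserRow:  # stand-in for an ORM row exposing attributes
    def __init__(self):
        self.id = 1
        self.name = "ada"

class User(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    id: int
    name: str

# reads obj.id / obj.name via attribute access instead of dict keys
user = User.model_validate(UserRow())
print(user)  # id=1 name='ada'
```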

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

log ¤

log(descr: Union[ResourceDescr, InvalidDescr])
Source code in src/bioimageio/core/cli.py
def log(self, descr: Union[ResourceDescr, InvalidDescr]):
    _ = descr.validation_summary.log(self.summary)

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
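A short sketch of how computed fields end up in this mapping, accessed from the class as the warning advises (assuming pydantic v2):

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height

# class access, not instance access (the latter is deprecated)
print(list(Rect.model_computed_fields))  # ['area']
print(Rect(width=2, height=3).area)      # 6
```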

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
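The "no validation" behavior described above is easy to demonstrate; a sketch (assuming pydantic v2):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# values are stored as-is, so even a non-int passes through;
# defaults are still applied for missing fields
p = Point.model_construct(x="not validated")
print(p.x, p.y)  # not validated 0
print(p.model_fields_set)  # {'x'} — y came from its default
```

This is why `model_construct` is only safe for trusted or pre-validated data: nothing checks that `x` is actually an `int`.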

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
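The shallow-vs-deep distinction and the unvalidated `update` noted above can be sketched as follows (assuming pydantic v2):

```python
from pydantic import BaseModel

class Config(BaseModel):
    name: str
    tags: list

c = Config(name="a", tags=["x"])
deep = c.model_copy(update={"name": "b"}, deep=True)
shallow = c.model_copy()

shallow.tags.append("y")  # shallow copy shares the list with c
deep.tags.append("z")     # deep copy has its own list

print(c.tags)     # ['x', 'y']
print(deep.tags)  # ['x', 'z']
print(deep.name)  # b — applied without validation
```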

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
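The effect of `mode` and `exclude` described above, in a short sketch (assuming pydantic v2):

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="release", when=date(2024, 1, 1))

print(e.model_dump())             # 'when' stays a datetime.date object
print(e.model_dump(mode="json"))  # {'name': 'release', 'when': '2024-01-01'}
print(e.model_dump(exclude={"when"}))  # {'name': 'release'}
```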

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
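A quick sketch of the compact-vs-indented output mentioned for `indent` (assuming pydantic v2):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
print(p.model_dump_json())          # {"x":1,"y":2} — compact by default
print(p.model_dump_json(indent=2))  # pretty-printed over multiple lines
```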

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
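For illustration, a short sketch of class-level access (the `Point` model is invented for this example):

```python
from pydantic import BaseModel, Field


class Point(BaseModel):
    # Hypothetical model; shows what model_fields exposes.
    x: int
    y: int = Field(default=0, description="vertical coordinate")


# Access model_fields on the class, not an instance (instance access
# is deprecated and will not work in Pydantic V3).
fields = Point.model_fields
print(sorted(fields))             # ['x', 'y']
print(fields["x"].is_required())  # True
print(fields["y"].description)    # vertical coordinate
```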

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `by_alias` | `bool` | Whether to use attribute aliases or not. | `True` |
| `ref_template` | `str` | The reference template. | `DEFAULT_REF_TEMPLATE` |
| `union_format` | `Literal['any_of', 'primitive_type_array']` | The format to use when combining schemas from unions together. `'any_of'` uses the `anyOf` keyword to combine schemas (the default); `'primitive_type_array'` uses the `type` keyword as an array of strings containing each type of the combination, falling back to `any_of` if any of the schemas is not a primitive type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata. | `'any_of'` |
| `schema_generator` | `type[GenerateJsonSchema]` | To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications. | `GenerateJsonSchema` |
| `mode` | `JsonSchemaMode` | The mode in which to generate the schema. | `'validation'` |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | The JSON schema for the given model class. |

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
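A basic sketch of schema generation (the `Item` model is invented; note that `union_format` is a newer option, so the example sticks to the defaults):

```python
from pydantic import BaseModel


class Item(BaseModel):
    # Hypothetical model for schema generation.
    name: str
    quantity: int = 1


schema = Item.model_json_schema()  # mode='validation' by default
print(schema["type"])                               # object
print(schema["required"])                           # ['name']
print(schema["properties"]["quantity"]["default"])  # 1
```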

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `params` | `tuple[type[Any], ...]` | Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. | required |

Returns:

| Type | Description |
| --- | --- |
| `str` | String representing the new class where `params` are passed to `cls` as type variables. |

Raises:

| Type | Description |
| --- | --- |
| `TypeError` | Raised when trying to generate concrete names for non-generic models. |

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
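A short sketch of the default naming scheme (the `Pair` model is invented for illustration):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

K = TypeVar("K")
V = TypeVar("V")


class Pair(BaseModel, Generic[K, V]):
    # Hypothetical generic model with two type variables.
    key: K
    value: V


# Parametrizing the generic model triggers model_parametrized_name,
# which computes the concrete class name from the type arguments.
print(Pair[str, int].__name__)  # Pair[str, int]
```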

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `force` | `bool` | Whether to force the rebuilding of the model schema, defaults to `False`. | `False` |
| `raise_errors` | `bool` | Whether to raise errors, defaults to `True`. | `True` |
| `_parent_namespace_depth` | `int` | The depth level of the parent namespace, defaults to 2. | `2` |
| `_types_namespace` | `MappingNamespace \| None` | The types namespace, defaults to `None`. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `bool \| None` | Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns `True` if rebuilding was successful, otherwise `False`. |

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
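A minimal sketch of resolving a forward reference (the self-referencing `Node` model is invented for illustration):

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):
    # Self-referencing model: "Node" is a forward reference at class
    # definition time.
    value: int
    child: Optional["Node"] = None


# model_rebuild resolves outstanding forward references. It returns None
# when the schema was already complete, True after a successful rebuild.
result = Node.model_rebuild()

n = Node(value=1, child=Node(value=2))
print(n.child.value)  # 2
```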

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `from_attributes` | `bool \| None` | Whether to extract data from object attributes. | `None` |
| `context` | `Any \| None` | Additional context to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If the object could not be validated. |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated model instance. |

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
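A short sketch of both dict and attribute-based validation (the `User` model and `UserRow` class are invented for illustration):

```python
from pydantic import BaseModel


class User(BaseModel):
    # Hypothetical model demonstrating validation entry points.
    id: int
    name: str


# Validate a plain dict; types are coerced in the default lax mode.
user = User.model_validate({"id": "1", "name": "alice"})
print(user.id)  # 1 (coerced from the string "1")


class UserRow:
    # Arbitrary object; from_attributes reads matching attribute names.
    def __init__(self):
        self.id = 2
        self.name = "bob"


row_user = User.model_validate(UserRow(), from_attributes=True)
print(row_user.name)  # bob
```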

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `json_data` | `str \| bytes \| bytearray` | The JSON data to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If `json_data` is not a JSON string or the object could not be validated. |

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
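A minimal sketch of parsing and failure handling (the `Event` model is invented for illustration):

```python
from pydantic import BaseModel, ValidationError


class Event(BaseModel):
    # Hypothetical model parsed straight from a JSON payload.
    kind: str
    priority: int


event = Event.model_validate_json('{"kind": "alert", "priority": 3}')
print(event.priority)  # 3

# Malformed JSON (or valid JSON with invalid data) raises ValidationError.
try:
    Event.model_validate_json("not json")
    caught = False
except ValidationError:
    caught = True
print(caught)  # True
```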

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object containing string data to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
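This is useful when every value arrives as a string, e.g. from environment variables or a CSV/INI source. A minimal sketch (the `Record` model is invented for illustration):

```python
from datetime import date

from pydantic import BaseModel


class Record(BaseModel):
    # Hypothetical model; all input values arrive as strings.
    count: int
    when: date


rec = Record.model_validate_strings({"count": "10", "when": "2024-01-31"})
print(rec.count)  # 10
print(rec.when)   # 2024-01-31
```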

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

PredictCmd ¤

PredictCmd(**data: Any)

Bases: CmdBase, WithSource


              flowchart TD
              bioimageio.core.cli.PredictCmd[PredictCmd]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.PredictCmd
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.PredictCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                





Run inference on your data with a bioimage.io model.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
blockwise bool

process inputs blockwise

descr
descr_id str

a more user-friendly description id

example bool

generate and run an example

inputs NotEmpty[List[Union[str, NotEmpty[List[str]]]]]

Model input sample paths (for each input tensor)

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

outputs Union[str, NotEmpty[Tuple[str, ...]]]

Model output path pattern (per output tensor)

overwrite bool

allow overwriting existing output files

preview bool

preview which files would be processed

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

stats Annotated[Path, WithJsonSchema({type: string}), PlainSerializer(lambda p: p.as_posix(), return_type=str)]

path to dataset statistics

weight_format WeightFormatArgAny

The weight format to use.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

blockwise class-attribute instance-attribute ¤

blockwise: bool = False

process inputs blockwise

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

example class-attribute instance-attribute ¤

example: bool = False

generate and run an example

  1. downloads example model inputs
  2. creates a {model_id}_example folder
  3. writes input arguments to {model_id}_example/bioimageio-cli.yaml
  4. executes a preview dry-run
  5. executes prediction with example input

inputs class-attribute instance-attribute ¤

inputs: NotEmpty[List[Union[str, NotEmpty[List[str]]]]] = Field(default_factory=lambda: ['{input_id}/001.tif'])

Model input sample paths (for each input tensor)

The input paths are expected to have shape...

  - (n_samples,) or (n_samples, 1) for models expecting a single input tensor,
  - (n_samples,) containing the substring '{input_id}', or
  - (n_samples, n_model_inputs) to provide each input tensor path explicitly.

All substrings that are replaced by metadata from the model description:

  - '{model_id}'
  - '{input_id}'

Example inputs to process sample 'a' and 'b' for a model expecting a 'raw' and a 'mask' input tensor: --inputs="[[\"a_raw.tif\",\"a_mask.tif\"],[\"b_raw.tif\",\"b_mask.tif\"]]" (Note that JSON double quotes need to be escaped.)

Alternatively a bioimageio-cli.yaml (or bioimageio-cli.json) file may provide the arguments, e.g.:

inputs:
- [a_raw.tif, a_mask.tif]
- [b_raw.tif, b_mask.tif]

.npy and any file extension supported by imageio are supported. Available formats are listed at https://imageio.readthedocs.io/en/stable/formats/index.html#all-formats. Some formats have additional dependencies.
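As an illustration of the '{input_id}' placeholder described above, the sketch below shows how such a pattern could be expanded into a concrete path. This is a hypothetical helper for explanation only, not the actual bioimageio implementation; the model and input ids are made-up examples.

```python
# Illustrative sketch (NOT the actual bioimageio.core implementation) of
# substituting the documented placeholders in an input path pattern.

def expand_pattern(pattern: str, model_id: str, input_id: str) -> str:
    """Replace '{model_id}' and '{input_id}' in an input path pattern."""
    return pattern.replace("{model_id}", model_id).replace("{input_id}", input_id)

# Expanding the default pattern for a hypothetical input tensor id "raw":
path = expand_pattern("{input_id}/001.tif", model_id="affable-shark", input_id="raw")
print(path)  # raw/001.tif
```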

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

outputs class-attribute instance-attribute ¤

outputs: Union[str, NotEmpty[Tuple[str, ...]]] = 'outputs_{model_id}/{output_id}/{sample_id}.tif'

Model output path pattern (per output tensor)

All substrings that are replaced:

  - '{model_id}' (from model description)
  - '{output_id}' (from model description)
  - '{sample_id}' (extracted from input paths)
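The substitutions above can be sketched as follows. This is a simplified, hypothetical illustration: here '{sample_id}' is taken to be the input file's stem, which is one plausible reading of "extracted from input paths", not a statement of how the CLI actually derives it.

```python
# Hypothetical sketch of filling the output path pattern; assumes {sample_id}
# is simply the input file's stem (e.g. "001" from "raw/001.tif").
from pathlib import Path

def output_path(pattern: str, model_id: str, output_id: str, input_path: str) -> str:
    sample_id = Path(input_path).stem
    return (
        pattern
        .replace("{model_id}", model_id)
        .replace("{output_id}", output_id)
        .replace("{sample_id}", sample_id)
    )

p = output_path(
    "outputs_{model_id}/{output_id}/{sample_id}.tif",
    "affable-shark", "mask", "raw/001.tif",
)
print(p)  # outputs_affable-shark/mask/001.tif
```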

overwrite class-attribute instance-attribute ¤

overwrite: bool = False

allow overwriting existing output files

preview class-attribute instance-attribute ¤

preview: bool = False

preview which files would be processed and what outputs would be generated.

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

stats class-attribute instance-attribute ¤

stats: Annotated[Path, WithJsonSchema({type: string}), PlainSerializer(lambda p: p.as_posix(), return_type=str)] = Path('dataset_statistics.json')

path to dataset statistics (will be written if it does not exist, but the model requires statistical dataset measures)

weight_format class-attribute instance-attribute ¤

weight_format: WeightFormatArgAny = Field('any', alias='weight-format', validation_alias=WEIGHT_FORMAT_ALIASES)

The weight format to use.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access
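A hedged illustration of when these setattr handlers run validation (the `Conf` model below is made up): with `validate_assignment=True` in the model config, attribute assignment goes through the same coercion and validation as `__init__`.

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Conf(BaseModel):
    model_config = ConfigDict(validate_assignment=True)
    n: int = 1

c = Conf()
c.n = "3"  # coerced to int by the assignment handler
assert c.n == 3

try:
    c.n = "not a number"
except ValidationError:
    pass  # invalid assignments raise instead of silently setting
```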

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
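`__getstate__`/`__setstate__` are what make models picklable; a minimal round-trip sketch (the `P` model is illustrative, not part of bioimageio.core):

```python
import pickle
from pydantic import BaseModel

class P(BaseModel):
    x: int

# Pickle serializes the model state dict; __setstate__ restores it.
restored = pickle.loads(pickle.dumps(P(x=7)))
assert restored.x == 7
```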

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    if self.example:
        return self._example()

    model_descr = ensure_description_is_model(self.descr)

    input_ids = get_member_ids(model_descr.inputs)
    output_ids = get_member_ids(model_descr.outputs)

    minimum_input_ids = tuple(
        str(ipt.id) if isinstance(ipt, v0_5.InputTensorDescr) else str(ipt.name)
        for ipt in model_descr.inputs
        if not isinstance(ipt, v0_5.InputTensorDescr) or not ipt.optional
    )
    maximum_input_ids = tuple(
        str(ipt.id) if isinstance(ipt, v0_5.InputTensorDescr) else str(ipt.name)
        for ipt in model_descr.inputs
    )

    def expand_inputs(i: int, ipt: Union[str, Sequence[str]]) -> Tuple[str, ...]:
        if isinstance(ipt, str):
            ipts = tuple(
                ipt.format(model_id=self.descr_id, input_id=t) for t in input_ids
            )
        else:
            ipts = tuple(
                p.format(model_id=self.descr_id, input_id=t)
                for t, p in zip(input_ids, ipt)
            )

        if len(set(ipts)) < len(ipts):
            if len(minimum_input_ids) == len(maximum_input_ids):
                n = len(minimum_input_ids)
            else:
                n = f"{len(minimum_input_ids)}-{len(maximum_input_ids)}"

            raise ValueError(
                f"[input sample #{i}] Include '{{input_id}}' in path pattern or explicitly specify {n} distinct input paths (got {ipt})"
            )

        if len(ipts) < len(minimum_input_ids):
            raise ValueError(
                f"[input sample #{i}] Expected at least {len(minimum_input_ids)} inputs {minimum_input_ids}, got {ipts}"
            )

        if len(ipts) > len(maximum_input_ids):
            raise ValueError(
                f"Expected at most {len(maximum_input_ids)} inputs {maximum_input_ids}, got {ipts}"
            )

        return ipts

    inputs = [expand_inputs(i, ipt) for i, ipt in enumerate(self.inputs, start=1)]

    sample_paths_in = [
        {t: Path(p) for t, p in zip(input_ids, ipts)} for ipts in inputs
    ]

    sample_ids = _get_sample_ids(sample_paths_in)

    def expand_outputs():
        if isinstance(self.outputs, str):
            outputs = [
                tuple(
                    Path(
                        self.outputs.format(
                            model_id=self.descr_id, output_id=t, sample_id=s
                        )
                    )
                    for t in output_ids
                )
                for s in sample_ids
            ]
        else:
            outputs = [
                tuple(
                    Path(p.format(model_id=self.descr_id, output_id=t, sample_id=s))
                    for t, p in zip(output_ids, self.outputs)
                )
                for s in sample_ids
            ]
        # check for distinctness and correct number within each output sample
        for i, out in enumerate(outputs, start=1):
            if len(set(out)) < len(out):
                raise ValueError(
                    f"[output sample #{i}] Include '{{output_id}}' in path pattern or explicitly specify {len(output_ids)} distinct output paths (got {out})"
                )

            if len(out) != len(output_ids):
                raise ValueError(
                    f"[output sample #{i}] Expected {len(output_ids)} outputs {output_ids}, got {out}"
                )

        # check for distinctness across all output samples
        all_output_paths = [p for out in outputs for p in out]
        if len(set(all_output_paths)) < len(all_output_paths):
            raise ValueError(
                "Output paths are not distinct across samples. "
                + f"Make sure to include '{{sample_id}}' in the output path pattern."
            )

        return outputs

    outputs = expand_outputs()

    sample_paths_out = [
        {MemberId(t): Path(p) for t, p in zip(output_ids, out)} for out in outputs
    ]

    if not self.overwrite:
        for sample_paths in sample_paths_out:
            for p in sample_paths.values():
                if p.exists():
                    raise FileExistsError(
                        f"{p} already exists. use --overwrite to (re-)write outputs anyway."
                    )
    if self.preview:
        print("🛈 bioimageio prediction preview structure:")
        pprint(
            {
                "{sample_id}": dict(
                    inputs={"{input_id}": "<input path>"},
                    outputs={"{output_id}": "<output path>"},
                )
            }
        )
        print("🔎 bioimageio prediction preview output:")
        pprint(
            {
                s: dict(
                    inputs={t: p.as_posix() for t, p in sp_in.items()},
                    outputs={t: p.as_posix() for t, p in sp_out.items()},
                )
                for s, sp_in, sp_out in zip(
                    sample_ids, sample_paths_in, sample_paths_out
                )
            }
        )
        return

    def input_dataset(stat: Stat):
        for s, sp_in in zip(sample_ids, sample_paths_in):
            yield load_sample_for_model(
                model=model_descr,
                paths=sp_in,
                stat=stat,
                sample_id=s,
            )

    stat: Dict[Measure, MeasureValue] = dict(
        _get_stat(
            model_descr, input_dataset({}), len(sample_ids), self.stats
        ).items()
    )

    pp = create_prediction_pipeline(
        model_descr,
        weight_format=None if self.weight_format == "any" else self.weight_format,
    )
    predict_method = (
        pp.predict_sample_with_blocking
        if self.blockwise
        else pp.predict_sample_without_blocking
    )

    for sample_in, sp_out in tqdm(
        zip(input_dataset(dict(stat)), sample_paths_out),
        total=len(inputs),
        desc=f"predict with {self.descr_id}",
        unit="sample",
    ):
        sample_out = predict_method(sample_in)
        save_sample(sp_out, sample_out)
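The path-pattern expansion performed above can be sketched with plain `str.format` (all ids below are made-up examples, not actual bioimageio identifiers):

```python
output_ids = ["mask", "labels"]
sample_ids = ["s0", "s1"]
pattern = "outputs/{model_id}/{sample_id}/{output_id}.npy"

# One tuple of output paths per sample, as in expand_outputs() above.
outputs = [
    tuple(
        pattern.format(model_id="demo-model", output_id=t, sample_id=s)
        for t in output_ids
    )
    for s in sample_ids
]
assert outputs[0][0] == "outputs/demo-model/s0/mask.npy"

# Including {sample_id} in the pattern keeps paths distinct across samples:
flat = [p for out in outputs for p in out]
assert len(set(flat)) == len(flat)
```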

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
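The recommended replacement for `from_orm` looks like this (the `PersonORM` class is a made-up stand-in for an ORM row): set `from_attributes=True` and call `model_validate`.

```python
from pydantic import BaseModel, ConfigDict

class PersonORM:  # stand-in for an ORM object with attribute access
    def __init__(self):
        self.name = "ada"

class Person(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    name: str

p = Person.model_validate(PersonORM())
assert p.name == "ada"
```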

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )
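The non-deprecated replacement, `model_dump_json`, in a minimal sketch (the `Msg` model is illustrative):

```python
from pydantic import BaseModel

class Msg(BaseModel):
    text: str

# model_dump_json produces a compact JSON string by default.
assert Msg(text="hi").model_dump_json() == '{"text":"hi"}'
```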

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
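A short sketch of where computed fields come from (the `Rect` model is made up): fields declared with `@computed_field` are derived at access time and included in serialization.

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    w: int
    h: int

    @computed_field
    @property
    def area(self) -> int:
        return self.w * self.h

r = Rect(w=3, h=4)
assert r.area == 12
# Computed fields appear in dumps alongside regular fields:
assert r.model_dump() == {"w": 3, "h": 4, "area": 12}
```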

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
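A minimal sketch of the trusted-data behavior described above (the `User` model is made up): defaults are applied, but nothing is validated.

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int = 0

# Defaults are filled in, but no validation runs:
u = User.model_construct(name="ada")
assert u.age == 0

# Even a wrong type passes through untouched:
bad = User.model_construct(name=123)
assert bad.name == 123
```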

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
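A minimal usage sketch (the `Point` model is illustrative): `model_copy` returns a new instance and leaves the original untouched, and `update` values are applied without validation.

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int = 0
    y: int = 0

p = Point(x=1, y=2)
q = p.model_copy(update={"y": 5})  # update values are NOT validated
assert (q.x, q.y) == (1, 5)
assert p.y == 2  # the original is unchanged
```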

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
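
As a quick illustration of `model_dump`, here is a minimal sketch using a hypothetical `Point` model (not part of this library):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

p = Point(x=1)
full = p.model_dump()                        # every field, defaults included
set_only = p.model_dump(exclude_unset=True)  # only fields set explicitly
```

`exclude_unset`, `exclude_defaults`, and `exclude_none` can be combined to trim the output further.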

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `indent` | `int \| None` | Indentation to use in the JSON output. If `None` is passed, the output will be compact. | `None` |
| `ensure_ascii` | `bool` | If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped. If `False` (the default), these characters will be output as-is. | `False` |
| `include` | `IncEx \| None` | Field(s) to include in the JSON output. | `None` |
| `exclude` | `IncEx \| None` | Field(s) to exclude from the JSON output. | `None` |
| `context` | `Any \| None` | Additional context to pass to the serializer. | `None` |
| `by_alias` | `bool \| None` | Whether to serialize using field aliases. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of `None`. | `False` |
| `exclude_computed_fields` | `bool` | Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead. | `False` |
| `round_trip` | `bool` | If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`. | `False` |
| `warnings` | `bool \| Literal['none', 'warn', 'error']` | How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`. | `True` |
| `fallback` | `Callable[[Any], Any] \| None` | A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `str` | A JSON string representation of the model. |

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
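
A minimal sketch of `model_dump_json`, again using a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

p = Point(x=1)
compact = p.model_dump_json()         # no indent: compact, single-line JSON
pretty = p.model_dump_json(indent=2)  # indented, multi-line JSON
```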

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
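
A short sketch of inspecting `model_fields` from the class (instance access is deprecated), with a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# Mapping of field name -> FieldInfo; access it on the class, not an instance.
fields = Point.model_fields
required = [name for name, info in fields.items() if info.is_required()]
```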

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `by_alias` | `bool` | Whether to use attribute aliases or not. | `True` |
| `ref_template` | `str` | The reference template. | `DEFAULT_REF_TEMPLATE` |
| `union_format` | `Literal['any_of', 'primitive_type_array']` | The format to use when combining schemas from unions together. `'any_of'` uses the `anyOf` keyword to combine schemas (the default); `'primitive_type_array'` uses the `type` keyword as an array of strings containing each type of the combination, falling back to `any_of` if any of the schemas is not a primitive type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata. | `'any_of'` |
| `schema_generator` | `type[GenerateJsonSchema]` | To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications. | `GenerateJsonSchema` |
| `mode` | `JsonSchemaMode` | The mode in which to generate the schema. | `'validation'` |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | The JSON schema for the given model class. |

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
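
A minimal sketch of `model_json_schema` with a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# Generates a standard JSON schema dict; fields without defaults are required.
schema = Point.model_json_schema()
```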

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `params` | `tuple[type[Any], ...]` | Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `str` | String representing the new class where `params` are passed to `cls` as type variables. |

Raises:

| Type | Description |
| --- | --- |
| `TypeError` | Raised when trying to generate concrete names for non-generic models. |

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
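
The default naming scheme can be seen by parametrizing a generic model; a minimal sketch with a hypothetical `Box` model:

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar('T')

class Box(BaseModel, Generic[T]):
    item: T

# Parametrization triggers model_parametrized_name to compute the class name.
IntBox = Box[int]
```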

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
612
613
614
615
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `force` | `bool` | Whether to force the rebuilding of the model schema. | `False` |
| `raise_errors` | `bool` | Whether to raise errors. | `True` |
| `_parent_namespace_depth` | `int` | The depth level of the parent namespace. | `2` |
| `_types_namespace` | `MappingNamespace \| None` | The types namespace. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `bool \| None` | `None` if the schema is already "complete" and rebuilding was not required. If rebuilding *was* required, `True` if rebuilding was successful, otherwise `False`. |

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
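
A minimal sketch of `model_rebuild` with a hypothetical self-referencing `Node` model; here the forward reference resolves at class creation, so the rebuild is a no-op and returns `None`:

```python
from typing import Optional
from pydantic import BaseModel

class Node(BaseModel):
    value: int
    next: Optional["Node"] = None  # forward reference to the class itself

# None when the schema is already complete; True/False reports rebuild success.
result = Node.model_rebuild()
chain = Node(value=1, next=Node(value=2))
```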

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object to validate. | *required* |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value (`pydantic.ConfigDict.extra`) for details. | `None` |
| `from_attributes` | `bool \| None` | Whether to extract data from object attributes. | `None` |
| `context` | `Any \| None` | Additional context to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If the object could not be validated. |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated model instance. |

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
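
A minimal sketch of `model_validate` with a hypothetical `Point` model, contrasting the default lax mode with `strict=True`:

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int = 0

ok = Point.model_validate({'x': '1'})  # lax mode coerces the string to int
try:
    Point.model_validate({'x': '1'}, strict=True)  # strict mode rejects it
    strict_rejected = False
except ValidationError:
    strict_rejected = True
```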

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `json_data` | `str \| bytes \| bytearray` | The JSON data to validate. | *required* |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value (`pydantic.ConfigDict.extra`) for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If `json_data` is not a JSON string or the object could not be validated. |

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
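
A minimal sketch of `model_validate_json` with a hypothetical `Point` model, including the failure path for malformed JSON:

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int = 0

p = Point.model_validate_json('{"x": 1}')  # parses and validates in one step
try:
    Point.model_validate_json('not valid json')
    parse_failed = False
except ValidationError:
    parse_failed = True
```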

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object containing string data to validate. | *required* |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value (`pydantic.ConfigDict.extra`) for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
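
A minimal sketch of `model_validate_strings` with a hypothetical `Event` model, where every leaf value arrives as a string (as it would from e.g. environment variables or query parameters):

```python
from datetime import date
from pydantic import BaseModel

class Event(BaseModel):
    day: date
    count: int

# String leaves are parsed per-field according to each field's type.
e = Event.model_validate_strings({'day': '2024-01-31', 'count': '3'})
```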

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)
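
A minimal migration sketch for this deprecated method, using a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int

# Deprecated v1-style call (emits a deprecation warning):
#     Point.parse_obj({'x': 1})
# Preferred v2 replacement:
p = Point.model_validate({'x': 1})
```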

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )
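Both deprecated methods above point to `model_json_schema` as their replacement; a minimal sketch of the supported route (the `Item` model is illustrative):

```python
import json

from pydantic import BaseModel


class Item(BaseModel):
    name: str
    count: int = 0


# model_json_schema replaces the deprecated `schema` method
schema = Item.model_json_schema()
assert schema["properties"]["count"]["type"] == "integer"

# json.dumps(model_json_schema(...)) replaces the deprecated `schema_json` method
schema_str = json.dumps(schema)
assert json.loads(schema_str) == schema
```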

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
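`model_rebuild` is the replacement for this deprecated method; a small sketch resolving a self-referencing forward annotation (the `Node` model is illustrative):

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):
    value: int
    # forward reference to the class being defined; resolved later
    child: "Optional[Node]" = None


# replaces the deprecated update_forward_refs()
Node.model_rebuild()

n = Node(value=1, child={"value": 2})
assert n.child is not None and n.child.value == 2
```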

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

TestCmd ¤

TestCmd(**data: Any)

Bases: CmdBase, WithSource, WithSummaryLogging


              flowchart TD
              bioimageio.core.cli.TestCmd[TestCmd]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.WithSummaryLogging[WithSummaryLogging]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.TestCmd
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.TestCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                


                bioimageio.core.cli.WithSummaryLogging --> bioimageio.core.cli.TestCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSummaryLogging
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                




            

Test a bioimageio resource (beyond metadata formatting).

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
log
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics.

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

determinism Literal['seed_only', 'full']

Modes to improve reproducibility of test outputs.

devices Optional[List[str]]

Device(s) to use for testing

format_version Union[FormatVersionPlaceholder, str]

The format version to use for testing.

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

runtime_env Union[Literal['currently-active', 'as-described'], Path]

The python environment to run the tests in

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

stop_early bool

Do not run further subtests after a failed one.

summary List[Union[Literal['display'], Path]]

Display the validation summary or save it as JSON, Markdown or HTML.

weight_format WeightFormatArgAll

The weight format to limit testing to.

working_dir Optional[Path]

(for debugging) Directory to save any temporary files.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
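A minimal sketch of this validation behavior on instantiation (the `Point` model is illustrative):

```python
from pydantic import BaseModel, ValidationError


class Point(BaseModel):
    x: int
    y: int


p = Point(x=1, y=2)
assert p.x == 1

# invalid keyword data raises pydantic_core.ValidationError
try:
    Point(x="not a number", y=2)
except ValidationError as err:
    assert err.error_count() == 1
else:
    raise AssertionError("expected ValidationError")
```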

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

determinism class-attribute instance-attribute ¤

determinism: Literal['seed_only', 'full'] = 'seed_only'

Modes to improve reproducibility of test outputs.

devices class-attribute instance-attribute ¤

devices: Optional[List[str]] = None

Device(s) to use for testing

format_version class-attribute instance-attribute ¤

format_version: Union[FormatVersionPlaceholder, str] = Field('discover', alias='format-version')

The format version to use for testing.

- 'latest': Use the latest implemented format version for the given resource type (may trigger auto updating)
- 'discover': Use the format version as described in the resource description
- '0.4', '0.5', ...: Use the specified format version (may trigger auto updating)

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

runtime_env class-attribute instance-attribute ¤

runtime_env: Union[Literal['currently-active', 'as-described'], Path] = Field('currently-active', alias='runtime-env')

The python environment to run the tests in:

- "currently-active": use the active Python interpreter
- "as-described": generate a conda environment YAML file based on the model weights description
- a path to a conda environment YAML file

Note: The bioimageio.core dependency will be added automatically if not present.
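As a hedged illustration, a conda environment file of the kind that could be passed for `runtime-env` might look like the following (the environment name and listed packages are assumptions, not a prescribed layout):

```yaml
name: bioimageio-test-env        # illustrative name
channels:
  - conda-forge
dependencies:
  - python=3.11
  - pytorch                      # assumed weight-format dependency
  - pip
  - pip:
      - bioimageio.core          # added automatically if not present
```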

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

stop_early class-attribute instance-attribute ¤

stop_early: bool = Field(False, alias='stop-early', validation_alias=AliasChoices('stop-early', 'x'))

Do not run further subtests after a failed one.

summary class-attribute instance-attribute ¤

summary: List[Union[Literal['display'], Path]] = Field(default_factory=lambda: ['display'], examples=[Path('summary.md'), Path('bioimageio_summaries/'), ['display', Path('summary.md')]])

Display the validation summary or save it as JSON, Markdown or HTML. The format is chosen based on the suffix: .json, .md, .html. If a folder is given (a path without a suffix), the summary is saved in all formats. Choose/add "display" to render the validation summary to the terminal.

weight_format class-attribute instance-attribute ¤

weight_format: WeightFormatArgAll = Field('all', alias='weight-format', validation_alias=WEIGHT_FORMAT_ALIASES)

The weight format to limit testing to.

(only relevant for model resources)

working_dir class-attribute instance-attribute ¤

working_dir: Optional[Path] = Field(None, alias='working-dir')

(for debugging) Directory to save any temporary files.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
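A short sketch of what parametrization via `__class_getitem__` does in practice (the `Box` model is illustrative):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Box(BaseModel, Generic[T]):
    value: T


IntBox = Box[int]  # triggers __class_getitem__

# parametrized submodels are cached: repeating the subscript returns the same class
assert Box[int] is IntBox

# the parametrized model validates against the bound type (lax coercion applies)
assert IntBox(value="3").value == 3
```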

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
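A sketch contrasting the two copy hooks (the `Config` model is illustrative):

```python
from copy import copy, deepcopy
from typing import List

from pydantic import BaseModel


class Config(BaseModel):
    tags: List[str]


c = Config(tags=["a"])

shallow = copy(c)   # __copy__: the inner list is shared
deep = deepcopy(c)  # __deepcopy__: the inner list is duplicated

c.tags.append("b")
assert shallow.tags == ["a", "b"]  # shallow copy sees the mutation
assert deep.tags == ["a"]          # deep copy does not
```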

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
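A sketch of `__delattr__` behavior for extra and declared fields (the `Flexible` model is illustrative; assumes `extra='allow'`):

```python
from pydantic import BaseModel, ConfigDict


class Flexible(BaseModel):
    model_config = ConfigDict(extra="allow")
    name: str


m = Flexible(name="a", note="temporary")

# extra values live in __pydantic_extra__ and can be deleted
del m.note
assert m.model_extra == {}

# deleting a declared field removes it from the instance __dict__
del m.name
try:
    m.name
except AttributeError:
    pass
else:
    raise AssertionError("expected AttributeError")
```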

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
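A sketch of the generic-origin equality rule described in the comments above (the `Wrapper` model is illustrative):

```python
from typing import Any, Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Wrapper(BaseModel, Generic[T]):
    x: T


# equality requires only matching generic *origin* types, not exact type equality
assert Wrapper(x=1) == Wrapper[Any](x=1)
assert Wrapper[int](x=1) == Wrapper[Any](x=1)

# field values still have to match
assert Wrapper(x=1) != Wrapper(x=2)
```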

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
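A minimal sketch of overriding this hook to post-process the generated schema (the `Tagged` model and the added `description` are illustrative):

```python
from pydantic import BaseModel


class Tagged(BaseModel):
    x: int

    @classmethod
    def __get_pydantic_json_schema__(cls, core_schema, handler):
        # call the default generator, then resolve a possible $ref
        json_schema = handler(core_schema)
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "a tagged model"  # illustrative tweak
        return json_schema


assert Tagged.model_json_schema()["description"] == "a tagged model"
```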

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
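A quick check of the behavior above (model is illustrative): iterating a model yields `(field_name, value)` pairs, so `dict(model)` recovers the field values:

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


u = User(name="Ada", age=36)
# __iter__ skips private attributes and includes extras, so dict() round-trips fields
assert dict(u) == {"name": "Ada", "age": 36}
```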

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
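`__getstate__` and `__setstate__` together make models picklable while preserving extras, private attributes, and the set of explicitly provided fields. A minimal round-trip (class name illustrative):

```python
import pickle

from pydantic import BaseModel


class Config(BaseModel):
    host: str
    port: int = 8080


c = Config(host="localhost")
restored = pickle.loads(pickle.dumps(c))
assert restored == c
# __pydantic_fields_set__ survives the round trip: only `host` was set explicitly
assert restored.model_fields_set == {"host"}
```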

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    sys.exit(
        test(
            self.descr,
            weight_format=self.weight_format,
            devices=self.devices,
            summary=self.summary,
            runtime_env=self.runtime_env,
            determinism=self.determinism,
            format_version=self.format_version,
            working_dir=self.working_dir,
        )
    )

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
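The non-deprecated equivalent, as the warning text suggests: set `from_attributes=True` in the model config and call `model_validate` (the ORM stand-in class below is illustrative):

```python
from pydantic import BaseModel, ConfigDict


class UserRow:  # stand-in for an ORM row object
    def __init__(self) -> None:
        self.id = 1
        self.email = "ada@example.com"


class User(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    email: str


user = User.model_validate(UserRow())
assert user.id == 1 and user.email == "ada@example.com"
```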

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

log ¤

log(descr: Union[ResourceDescr, InvalidDescr])
Source code in src/bioimageio/core/cli.py
def log(self, descr: Union[ResourceDescr, InvalidDescr]):
    _ = descr.validation_summary.log(self.summary)

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
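A sketch of what ends up in this mapping (model name illustrative): fields declared with `@computed_field` appear keyed by name, and, per the warning above, the mapping should be read off the class rather than an instance:

```python
from pydantic import BaseModel, computed_field


class Rect(BaseModel):
    w: int
    h: int

    @computed_field
    @property
    def area(self) -> int:
        return self.w * self.h


# access from the class, not an instance (instance access is deprecated)
assert "area" in Rect.model_computed_fields
assert Rect(w=2, h=3).area == 6
```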

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
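To make the "no other validation is performed" caveat concrete (model name illustrative): values are stored exactly as passed, and `model_fields_set` records which fields were supplied:

```python
from pydantic import BaseModel


class Item(BaseModel):
    name: str
    price: float = 0.0


# no validation or coercion runs: the string "19.9" is stored as-is
item = Item.model_construct(name="widget", price="19.9")
assert item.price == "19.9"
assert item.model_fields_set == {"name", "price"}
```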

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
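Typical usage of `model_copy` (model name illustrative): `update` overrides fields on the copy without touching the original and, as the docstring warns, without re-validating:

```python
from pydantic import BaseModel


class Server(BaseModel):
    host: str
    port: int = 80


s = Server(host="a")
s2 = s.model_copy(update={"port": 8080})
assert (s2.host, s2.port) == ("a", 8080)
assert s.port == 80  # the original is unchanged
```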

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
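
A minimal sketch of how the exclusion flags above interact, using a hypothetical `Item` model (not part of bioimageio.core):

```python
from typing import Optional

from pydantic import BaseModel


class Item(BaseModel):
    name: str
    price: Optional[float] = None


item = Item(name="widget")

# Default dump keeps every field, including unset defaults.
print(item.model_dump())                    # {'name': 'widget', 'price': None}
# exclude_none drops fields whose value is None.
print(item.model_dump(exclude_none=True))   # {'name': 'widget'}
# exclude_unset drops fields that were never explicitly provided.
print(item.model_dump(exclude_unset=True))  # {'name': 'widget'}
```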

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `indent` | `int \| None` | Indentation to use in the JSON output. If `None` is passed, the output will be compact. | `None` |
| `ensure_ascii` | `bool` | If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped. If `False` (the default), these characters will be output as-is. | `False` |
| `include` | `IncEx \| None` | Field(s) to include in the JSON output. | `None` |
| `exclude` | `IncEx \| None` | Field(s) to exclude from the JSON output. | `None` |
| `context` | `Any \| None` | Additional context to pass to the serializer. | `None` |
| `by_alias` | `bool \| None` | Whether to serialize using field aliases. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of `None`. | `False` |
| `exclude_computed_fields` | `bool` | Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead. | `False` |
| `round_trip` | `bool` | If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`. | `False` |
| `warnings` | `bool \| Literal['none', 'warn', 'error']` | How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`. | `True` |
| `fallback` | `Callable[[Any], Any] \| None` | A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `str` | A JSON string representation of the model. |

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
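
A short sketch of the compact-versus-indented output, with a hypothetical `Point` model (not part of bioimageio.core):

```python
from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int


p = Point(x=1, y=2)

# Compact by default; pass indent for pretty-printing.
print(p.model_dump_json())          # {"x":1,"y":2}
print(p.model_dump_json(indent=2))
```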

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
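
Per the warning above, access `model_fields` from the class rather than an instance. A sketch with a hypothetical `User` model (not part of bioimageio.core):

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    email: str = "n/a"


# The mapping preserves field definition order.
print(list(User.model_fields))              # ['id', 'email']
print(User.model_fields["email"].default)   # n/a
```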

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `by_alias` | `bool` | Whether to use attribute aliases or not. | `True` |
| `ref_template` | `str` | The reference template. | `DEFAULT_REF_TEMPLATE` |
| `union_format` | `Literal['any_of', 'primitive_type_array']` | The format to use when combining schemas from unions together. `'any_of'`: use the `anyOf` keyword to combine schemas (the default). `'primitive_type_array'`: use the `type` keyword as an array of strings, containing each type of the combination; if any of the schemas is not a primitive type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to `any_of`. | `'any_of'` |
| `schema_generator` | `type[GenerateJsonSchema]` | To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications. | `GenerateJsonSchema` |
| `mode` | `JsonSchemaMode` | The mode in which to generate the schema. | `'validation'` |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | The JSON schema for the given model class. |

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
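
A minimal sketch of the generated schema shape, using a hypothetical `Config` model (not part of bioimageio.core):

```python
from pydantic import BaseModel, Field


class Config(BaseModel):
    threads: int = Field(default=1, ge=1)
    name: str


schema = Config.model_json_schema()

print(schema["type"])                # object
print(sorted(schema["properties"]))  # ['name', 'threads']
# Only fields without defaults are listed as required.
print(schema["required"])            # ['name']
```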

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `params` | `tuple[type[Any], ...]` | Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. | required |

Returns:

| Type | Description |
| --- | --- |
| `str` | String representing the new class where `params` are passed to `cls` as type variables. |

Raises:

| Type | Description |
| --- | --- |
| `TypeError` | Raised when trying to generate concrete names for non-generic models. |

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
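
A sketch of the default naming scheme, with a hypothetical generic `Box` model (not part of bioimageio.core); parametrizing the class invokes `model_parametrized_name` internally:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Box(BaseModel, Generic[T]):
    content: T


# The concrete class name embeds the type parameters.
print(Box[int].__name__)  # Box[int]
```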

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `force` | `bool` | Whether to force the rebuilding of the model schema, defaults to `False`. | `False` |
| `raise_errors` | `bool` | Whether to raise errors, defaults to `True`. | `True` |
| `_parent_namespace_depth` | `int` | The depth level of the parent namespace, defaults to 2. | `2` |
| `_types_namespace` | `MappingNamespace \| None` | The types namespace, defaults to `None`. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `bool \| None` | Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns `True` if rebuilding was successful, otherwise `False`. |

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
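
A sketch of the forward-reference scenario described above, using hypothetical `Outer`/`Inner` models (not part of bioimageio.core):

```python
from pydantic import BaseModel


class Outer(BaseModel):
    inner: "Inner"  # ForwardRef: Inner is not defined yet


class Inner(BaseModel):
    x: int


# Resolve the ForwardRef now that Inner exists.
Outer.model_rebuild()
print(Outer.model_validate({"inner": {"x": 1}}).inner.x)  # 1
```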

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `from_attributes` | `bool \| None` | Whether to extract data from object attributes. | `None` |
| `context` | `Any \| None` | Additional context to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If the object could not be validated. |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated model instance. |

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
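
A sketch of lax versus strict validation, using a hypothetical `Account` model (not part of bioimageio.core):

```python
from pydantic import BaseModel, ValidationError


class Account(BaseModel):
    id: int


# Lax mode (the default) coerces compatible input: "3" -> 3.
print(Account.model_validate({"id": "3"}).id)  # 3

# Strict mode rejects the same input.
try:
    Account.model_validate({"id": "3"}, strict=True)
except ValidationError:
    print("strict validation failed")
```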

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `json_data` | `str \| bytes \| bytearray` | The JSON data to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If `json_data` is not a JSON string or the object could not be validated. |

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
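
A sketch of parsing and validating a JSON string in one step, using a hypothetical `Job` model (not part of bioimageio.core):

```python
from pydantic import BaseModel


class Job(BaseModel):
    id: int
    done: bool


# Parses the JSON and validates the result against the model.
job = Job.model_validate_json('{"id": 7, "done": false}')
print(job.id, job.done)  # 7 False
```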

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The object containing string data to validate. | required |
| `strict` | `bool \| None` | Whether to enforce types strictly. | `None` |
| `extra` | `ExtraValues \| None` | Whether to ignore, allow, or forbid extra data during model validation. See the [`extra` configuration value][pydantic.ConfigDict.extra] for details. | `None` |
| `context` | `Any \| None` | Extra variables to pass to the validator. | `None` |
| `by_alias` | `bool \| None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool \| None` | Whether to use the field's name when validating against the provided input data. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | The validated Pydantic model. |

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
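
A sketch of validating all-string input (as it arrives from environment variables or query parameters), using a hypothetical `Event` model (not part of bioimageio.core):

```python
import datetime

from pydantic import BaseModel


class Event(BaseModel):
    when: datetime.date
    count: int


# Every value is a string; validation converts each to its field type.
event = Event.model_validate_strings({"when": "2020-01-01", "count": "5"})
print(event.when, event.count)  # 2020-01-01 5
```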

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)
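The deprecation message above points at `model_validate_json` for JSON input; a minimal migration sketch (model name is illustrative):

```python
# Migration sketch (assumes Pydantic v2): for JSON input, the deprecated
# `parse_raw` is replaced by `model_validate_json`.
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str

raw = '{"id": 1, "name": "Ada"}'
user = User.model_validate_json(raw)  # instead of User.parse_raw(raw)
```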

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)
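As the implementation shows, the deprecated `schema()` simply forwards to `model_json_schema()`, which should be called directly (model name is illustrative):

```python
# Migration sketch (assumes Pydantic v2): call `model_json_schema` directly.
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

schema = Point.model_json_schema()  # instead of the deprecated Point.schema()
```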

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )
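The deprecation message suggests combining `model_json_schema` with `json.dumps`; a minimal sketch (model name is illustrative):

```python
# Migration sketch (assumes Pydantic v2): serialize the schema dict yourself.
import json

from pydantic import BaseModel

class Point(BaseModel):
    x: int

# instead of the deprecated Point.schema_json():
schema_str = json.dumps(Point.model_json_schema())
```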

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
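`model_rebuild` is the replacement for resolving forward references; a minimal sketch with a self-referencing model (name is illustrative):

```python
# Migration sketch (assumes Pydantic v2): `model_rebuild` resolves forward refs.
from typing import Optional

from pydantic import BaseModel

class Node(BaseModel):
    value: int
    child: Optional["Node"] = None

# instead of the deprecated Node.update_forward_refs():
Node.model_rebuild()

n = Node(value=1, child={"value": 2})
```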

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)
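The deprecated `validate` forwards to `model_validate`, which accepts arbitrary Python objects (model name is illustrative):

```python
# Migration sketch (assumes Pydantic v2): validate a dict into a model instance.
from pydantic import BaseModel

class Item(BaseModel):
    name: str

item = Item.model_validate({"name": "widget"})  # instead of Item.validate(...)
```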

UpdateCmdBase ¤

UpdateCmdBase(**data: Any)

Bases: CmdBase, WithSource, ABC


              flowchart TD
              bioimageio.core.cli.UpdateCmdBase[UpdateCmdBase]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.UpdateCmdBase
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.UpdateCmdBase
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                




              click bioimageio.core.cli.UpdateCmdBase href "" "bioimageio.core.cli.UpdateCmdBase"
              click bioimageio.core.cli.CmdBase href "" "bioimageio.core.cli.CmdBase"
              click bioimageio.core.cli.WithSource href "" "bioimageio.core.cli.WithSource"
              click bioimageio.core.cli.ArgMixin href "" "bioimageio.core.cli.ArgMixin"
              click pydantic.main.BaseModel href "" "pydantic.main.BaseModel"
            

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

diff Union[bool, Path]

Output a diff of original and updated bioimageio.yaml.

exclude_defaults bool

Exclude fields that have the default value (even if set explicitly).

exclude_unset bool

Exclude fields that have not explicitly been set.

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

output Union[Literal['display', 'stdout'], Path]

Output updated bioimageio.yaml to the terminal or write to a file.

source CliPositionalArg[str]

URL or path to a (folder with a) bioimageio.yaml/rdf.yaml file

updated Union[ResourceDescr, InvalidDescr]
Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

diff class-attribute instance-attribute ¤

diff: Union[bool, Path] = Field(True, alias='diff')

Output a diff of original and updated bioimageio.yaml. If a given path has an .html extension, a standalone HTML file is written, otherwise the diff is saved in unified diff format (pure text).

exclude_defaults class-attribute instance-attribute ¤

exclude_defaults: bool = Field(False, alias='exclude-defaults')

Exclude fields that have the default value (even if set explicitly).

exclude_unset class-attribute instance-attribute ¤

exclude_unset: bool = Field(True, alias='exclude-unset')

Exclude fields that have not explicitly been set.
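The difference between the two exclusion flags mirrors Pydantic's `model_dump` options (assuming the CLI forwards them to the model export; field names below are illustrative):

```python
# Sketch of `exclude-defaults` vs. `exclude-unset` semantics (plain Pydantic v2).
from pydantic import BaseModel

class Cfg(BaseModel):
    a: int = 1
    b: int = 2

cfg = Cfg(a=1)  # `a` set explicitly (to its default value), `b` left unset

by_unset = cfg.model_dump(exclude_unset=True)       # keeps only explicitly set fields
by_default = cfg.model_dump(exclude_defaults=True)  # drops fields equal to their default
```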

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

output class-attribute instance-attribute ¤

output: Union[Literal['display', 'stdout'], Path] = 'display'

Output updated bioimageio.yaml to the terminal or write to a file. Notes:
- "display": Render to the terminal with syntax highlighting.
- "stdout": Write to sys.stdout without syntax highlighting. (More convenient for copying the updated bioimageio.yaml from the terminal.)

source instance-attribute ¤

source: CliPositionalArg[str]

URL or path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

updated cached property ¤

updated: Union[ResourceDescr, InvalidDescr]

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
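In practice, `__class_getitem__` is what runs when a generic model is parametrized with `Model[...]`; a minimal sketch (model name is illustrative):

```python
# Sketch (assumes Pydantic v2): parametrizing a generic model triggers
# __class_getitem__, which builds and caches a specialized submodel.
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")

class Box(BaseModel, Generic[T]):
    content: T

IntBox = Box[int]          # builds (and caches) the parametrized model
box = IntBox(content="3")  # the specialized schema coerces "3" -> 3
```

Because results are cached, parametrizing with the same arguments twice yields the identical class object.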

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
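The shallow/deep distinction matters for mutable field values; a minimal sketch using the stdlib `copy` helpers, which dispatch to the two methods above (model name is illustrative):

```python
# Sketch (assumes Pydantic v2): __copy__ shares field values, __deepcopy__ does not.
from copy import copy, deepcopy

from pydantic import BaseModel

class Holder(BaseModel):
    items: list

h = Holder(items=[1, 2])
shallow = copy(h)   # field values are shared with the original
deep = deepcopy(h)  # field values are copied recursively

shallow.items.append(3)  # visible through `h` as well, but not through `deep`
```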

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
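The origin-type comparison described in the comments means a parametrized and an unparametrized instance with equal fields compare equal; a minimal sketch (model name is illustrative):

```python
# Sketch (assumes Pydantic v2): equality compares the generic *origin* type.
from typing import Any, Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")

class Pair(BaseModel, Generic[T]):
    x: T

# avoids headaches like Pair(x=1) != Pair[Any](x=1):
equal = Pair(x=1) == Pair[Any](x=1)
```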

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
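A typical override calls the handler first and then post-processes the result; a minimal sketch following the documented pattern (the `resolve_ref_schema` call guards against the handler returning a `$ref`; model name is illustrative):

```python
# Sketch (assumes Pydantic v2): hook into JSON schema generation for a model.
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import CoreSchema

class Documented(BaseModel):
    x: int

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        json_schema = handler(core_schema)
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "Added by the hook"
        return json_schema

schema = Documented.model_json_schema()
```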

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
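The extra-field branch above is what makes undeclared attributes reachable on models configured with `extra='allow'`. A minimal sketch of that behavior (the `Event` model and its fields are hypothetical):

```python
from pydantic import BaseModel, ConfigDict

class Event(BaseModel):
    model_config = ConfigDict(extra="allow")
    name: str

# "venue" is not a declared field, so it is stored in __pydantic_extra__;
# attribute access on it is routed through the __getattr__ shown above.
e = Event(name="sprint review", venue="remote")
print(e.venue)  # -> remote
```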

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

**kwargs (Unpack[ConfigDict], default: {}): Keyword arguments passed to the class definition, which set model_config.
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
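Because `__iter__` yields `(name, value)` pairs for fields and extras, the built-in `dict()` works directly on a model instance. A small sketch (the `Point` model is hypothetical):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
# __iter__ yields (field_name, value) tuples, so dict() consumes them directly:
assert dict(p) == {"x": 1, "y": 2}
```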

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

**kwargs (Any, default: {}): Any keyword arguments passed to the class definition that aren't used internally by Pydantic.
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
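`__getstate__` and `__setstate__` together make models picklable, and the explicitly-set field names in `__pydantic_fields_set__` survive the round trip. A quick sketch (the `Config` model is hypothetical):

```python
import pickle

from pydantic import BaseModel

class Config(BaseModel):
    host: str
    port: int = 8080

original = Config(host="localhost")
# __getstate__ captures __dict__, extras, fields-set, and private attrs;
# __setstate__ restores them on unpickling.
restored = pickle.loads(pickle.dumps(original))

assert restored == original
assert restored.model_fields_set == {"host"}
```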

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    original_yaml = open_bioimageio_yaml(self.source).unparsed_content
    assert isinstance(original_yaml, str)
    stream = StringIO()

    save_bioimageio_yaml_only(
        self.updated,
        stream,
        exclude_unset=self.exclude_unset,
        exclude_defaults=self.exclude_defaults,
    )
    updated_yaml = stream.getvalue()

    diff = compare(
        original_yaml.split("\n"),
        updated_yaml.split("\n"),
        diff_format=(
            "html"
            if isinstance(self.diff, Path) and self.diff.suffix == ".html"
            else "unified"
        ),
    )

    if isinstance(self.diff, Path):
        _ = self.diff.write_text(diff, encoding="utf-8")
    elif self.diff:
        console = rich.console.Console()
        diff_md = f"## Diff\n\n````````diff\n{diff}\n````````"
        console.print(rich.markdown.Markdown(diff_md))

    if isinstance(self.output, Path):
        _ = self.output.write_text(updated_yaml, encoding="utf-8")
        logger.info(f"written updated description to {self.output}")
    elif self.output == "display":
        updated_md = f"## Updated bioimageio.yaml\n\n```yaml\n{updated_yaml}\n```"
        rich.console.Console().print(rich.markdown.Markdown(updated_md))
    elif self.output == "stdout":
        print(updated_yaml)
    else:
        assert_never(self.output)

    if isinstance(self.updated, InvalidDescr):
        logger.warning("Update resulted in invalid description")
        _ = self.updated.validation_summary.display()

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

include (AbstractSetIntStr | MappingIntStrAny | None, default: None): Optional set or mapping specifying which fields to include in the copied model.
exclude (AbstractSetIntStr | MappingIntStrAny | None, default: None): Optional set or mapping specifying which fields to exclude from the copied model.
update (Dict[str, Any] | None, default: None): Optional dictionary of field-value pairs to override field values in the copied model.
deep (bool, default: False): If True, the values of fields that are Pydantic models will be deep-copied.

Returns:

Self: A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

_fields_set (set[str] | None, default: None): A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.
values (Any, default: {}): Trusted or pre-validated data dictionary.

Returns:

Self: A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
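Because `model_construct` performs no validation, values are stored exactly as passed while defaults are still filled in. A hedged sketch of that trade-off (the `User` model is hypothetical):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anonymous"

# No validation: the string "7" is kept as-is instead of being coerced to int,
# while the default for `name` is still applied.
u = User.model_construct(id="7")
assert u.id == "7"
assert u.name == "anonymous"
assert u.model_fields_set == {"id"}
```

This is why `model_construct` should only ever receive trusted or pre-validated data.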

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

update (Mapping[str, Any] | None, default: None): Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
deep (bool, default: False): Set to True to make a deep copy of the model.

Returns:

Self: New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
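As the docstring warns, `update` values bypass validation; the original instance is left untouched. A minimal sketch (the `Settings` model is hypothetical):

```python
from pydantic import BaseModel

class Settings(BaseModel):
    host: str
    retries: int = 3

s = Settings(host="db.local")
# update is applied without validation; use deep=True to deep-copy nested models.
s2 = s.model_copy(update={"retries": 5})
assert s2.retries == 5
assert s.retries == 3  # original unchanged
```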

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

mode (Literal['json', 'python'] | str, default: 'python'): The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.
include (IncEx | None, default: None): A set of fields to include in the output.
exclude (IncEx | None, default: None): A set of fields to exclude from the output.
context (Any | None, default: None): Additional context to pass to the serializer.
by_alias (bool | None, default: None): Whether to use the field's alias in the dictionary key if defined.
exclude_unset (bool, default: False): Whether to exclude fields that have not been explicitly set.
exclude_defaults (bool, default: False): Whether to exclude fields that are set to their default value.
exclude_none (bool, default: False): Whether to exclude fields that have a value of None.
exclude_computed_fields (bool, default: False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
round_trip (bool, default: False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
warnings (bool | Literal['none', 'warn', 'error'], default: True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
fallback (Callable[[Any], Any] | None, default: None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError is raised.
serialize_as_any (bool, default: False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

dict[str, Any]: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
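
A minimal sketch of the two dump modes (assuming pydantic v2 is installed; the `Event` model is illustrative):

```python
from datetime import date

from pydantic import BaseModel


class Event(BaseModel):
    name: str
    when: date


e = Event(name="release", when=date(2024, 1, 15))

# mode='python' (the default) keeps native Python objects, here a datetime.date.
python_dump = e.model_dump()

# mode='json' converts every value to a JSON-serializable type.
json_dump = e.model_dump(mode="json")

print(python_dump)  # {'name': 'release', 'when': datetime.date(2024, 1, 15)}
print(json_dump)    # {'name': 'release', 'when': '2024-01-15'}
```

This is why `mode='json'` is the safer choice when the result is handed to `json.dumps` or another serializer that only understands primitive types.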

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
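
A short sketch of common `model_dump_json` options (assuming pydantic v2; the `Point` model is illustrative):

```python
from typing import Optional

from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int
    label: Optional[str] = None


p = Point(x=1, y=2)

compact = p.model_dump_json()                   # no indentation, compact output
pretty = p.model_dump_json(indent=2)            # human-readable, one field per line
trimmed = p.model_dump_json(exclude_none=True)  # drop fields whose value is None

print(compact)
```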

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
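
A small sketch of inspecting `model_fields` from the class, as the deprecation warning above recommends (the `Sample` model is illustrative):

```python
from pydantic import BaseModel, Field


class Sample(BaseModel):
    x: int = Field(default=1, description="an int")


# Access from the class, not from an instance (instance access is deprecated).
fields = Sample.model_fields
print(list(fields))  # ['x']
```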

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, pass a subclass of GenerateJsonSchema with your desired modifications.
GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
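
A minimal sketch of generating a schema (assuming pydantic v2; the `Item` model is illustrative):

```python
from pydantic import BaseModel, Field


class Item(BaseModel):
    name: str = Field(description="Display name")
    count: int = 0


schema = Item.model_json_schema()

# The schema title defaults to the class name; fields without a default
# are listed under "required".
print(schema["title"])     # 'Item'
print(schema["required"])  # ['name']
```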

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
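
The default naming scheme can be observed by parametrizing a generic model (a sketch assuming pydantic v2; `Response` is illustrative):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Response(BaseModel, Generic[T]):
    data: T


# Parametrizing the generic creates a concrete class whose name comes
# from model_parametrized_name.
print(Response[int].__name__)  # 'Response[int]'
```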

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
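
A sketch of the forward-reference case described above (assuming pydantic v2 and module-level class definitions; `Wrapper` and `Payload` are illustrative):

```python
from pydantic import BaseModel


class Wrapper(BaseModel):
    payload: "Payload"  # `Payload` is not defined yet at class-creation time


class Payload(BaseModel):
    text: str


# The schema could not be completed while `Payload` was unresolved;
# rebuild it explicitly now that the name exists in this namespace.
Wrapper.model_rebuild()

w = Wrapper(payload={"text": "hi"})
print(w.payload.text)  # 'hi'
```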

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
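
A sketch contrasting lax and strict validation (assuming pydantic v2; the `User` model is illustrative):

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):
    id: int
    name: str


# Lax mode (the default) coerces the string "7" to the int 7.
u = User.model_validate({"id": "7", "name": "Ada"})
print(u.id)  # 7

# Strict mode rejects the string where an int is expected.
strict_failed = False
try:
    User.model_validate({"id": "7", "name": "Ada"}, strict=True)
except ValidationError:
    strict_failed = True
```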

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
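
A sketch of validating raw JSON directly, including the failure mode for malformed input (assuming pydantic v2; the `Config` model is illustrative):

```python
from pydantic import BaseModel, ValidationError


class Config(BaseModel):
    retries: int
    verbose: bool = False


cfg = Config.model_validate_json('{"retries": 3, "verbose": true}')
print(cfg.retries)  # 3

# Malformed JSON raises ValidationError, not json.JSONDecodeError.
bad_json = False
try:
    Config.model_validate_json("not json")
except ValidationError:
    bad_json = True
```

Parsing JSON this way avoids an intermediate `json.loads` round-trip and validates in one step.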

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
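
A sketch of string-data validation, useful when every leaf value arrives as a string (e.g. environment variables or CSV cells; the `Record` model is illustrative):

```python
from datetime import date

from pydantic import BaseModel


class Record(BaseModel):
    count: int
    created: date


# All values are strings; model_validate_strings coerces each one
# according to its field's type annotation.
rec = Record.model_validate_strings({"count": "5", "created": "2024-01-15"})
print(rec.count, rec.created)  # 5 2024-01-15
```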

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)
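
For JSON payloads, the deprecation message above points to `model_validate_json` as the replacement for `parse_raw`. A minimal sketch (the `Point` model is hypothetical):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

# Deprecated: Point.parse_raw('{"x": 1, "y": 2}')
# Modern replacement for JSON data:
p = Point.model_validate_json('{"x": 1, "y": 2}')
```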

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)
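
Both `schema` and the related deprecated `schema_json` below funnel into `model_json_schema`; serializing its result with `json.dumps` covers the `schema_json` use case. A minimal sketch (the `Point` model is hypothetical):

```python
import json

from pydantic import BaseModel

class Point(BaseModel):
    x: int

# Deprecated: Point.schema() and Point.schema_json()
schema = Point.model_json_schema()   # replaces schema()
as_json = json.dumps(schema)         # replaces schema_json()
```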

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
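
`model_rebuild` replaces `update_forward_refs` for resolving forward references, as in this self-referential sketch (the `Node` model is hypothetical):

```python
from pydantic import BaseModel

class Node(BaseModel):
    value: int
    children: "list[Node]" = []  # forward reference to Node itself

# Deprecated: Node.update_forward_refs()
Node.model_rebuild()

root = Node.model_validate({"value": 1, "children": [{"value": 2}]})
```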

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

UpdateFormatCmd ¤

UpdateFormatCmd(**data: Any)

Bases: UpdateCmdBase


              flowchart TD
              bioimageio.core.cli.UpdateFormatCmd[UpdateFormatCmd]
              bioimageio.core.cli.UpdateCmdBase[UpdateCmdBase]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.UpdateCmdBase --> bioimageio.core.cli.UpdateFormatCmd
                                bioimageio.core.cli.CmdBase --> bioimageio.core.cli.UpdateCmdBase
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.UpdateCmdBase
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                





              click bioimageio.core.cli.UpdateFormatCmd href "" "bioimageio.core.cli.UpdateFormatCmd"
              click bioimageio.core.cli.UpdateCmdBase href "" "bioimageio.core.cli.UpdateCmdBase"
              click bioimageio.core.cli.CmdBase href "" "bioimageio.core.cli.CmdBase"
              click bioimageio.core.cli.WithSource href "" "bioimageio.core.cli.WithSource"
              click bioimageio.core.cli.ArgMixin href "" "bioimageio.core.cli.ArgMixin"
              click pydantic.main.BaseModel href "" "pydantic.main.BaseModel"
            

Update the metadata format to the latest format version.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

diff Union[bool, Path]

Output a diff of original and updated bioimageio.yaml.

exclude_defaults bool

Exclude fields that have the default value (even if set explicitly).

exclude_unset bool

Exclude fields that have not been explicitly set.

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

output Union[Literal['display', 'stdout'], Path]

Output updated bioimageio.yaml to the terminal or write to a file.

perform_io_checks bool

Whether or not to attempt validation that may require file download.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

updated
Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

diff class-attribute instance-attribute ¤

diff: Union[bool, Path] = Field(True, alias='diff')

Output a diff of original and updated bioimageio.yaml. If a given path has an .html extension, a standalone HTML file is written, otherwise the diff is saved in unified diff format (pure text).

exclude_defaults class-attribute instance-attribute ¤

exclude_defaults: bool = Field(True, alias='exclude-defaults')

Exclude fields that have the default value (even if set explicitly).

Note

The update process sets most unset fields explicitly with their default value.

exclude_unset class-attribute instance-attribute ¤

exclude_unset: bool = Field(True, alias='exclude-unset')

Exclude fields that have not been explicitly set.

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

output class-attribute instance-attribute ¤

output: Union[Literal['display', 'stdout'], Path] = 'display'

Output updated bioimageio.yaml to the terminal or write to a file.

Notes:
- "display": Render to the terminal with syntax highlighting.
- "stdout": Write to sys.stdout without syntax highlighting. (More convenient for copying the updated bioimageio.yaml from the terminal.)

perform_io_checks class-attribute instance-attribute ¤

perform_io_checks: bool = Field(settings.perform_io_checks, alias='perform-io-checks')

Whether or not to attempt validation that may require file download. If True, file hash values are added if not present.

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

updated cached property ¤

updated

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel
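
`__class_getitem__` is what runs when a generic model is parametrized with `Model[...]`. A minimal sketch (the `Box` model is hypothetical):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")

class Box(BaseModel, Generic[T]):
    item: T

IntBox = Box[int]      # parametrization goes through __class_getitem__
b = IntBox(item="3")   # "3" is coerced to int by the int parametrization
```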

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
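
The difference between the two hooks above is visible with a mutable field: `copy.copy` shares field values, `copy.deepcopy` does not. A minimal sketch (the `Config` model is hypothetical):

```python
from copy import copy, deepcopy

from pydantic import BaseModel

class Config(BaseModel):
    tags: list[str]

original = Config(tags=["a"])
shallow = copy(original)    # shares the same list object
deep = deepcopy(original)   # fully independent copy

original.tags.append("b")   # visible through shallow, not through deep
```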

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
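
In practice, two instances compare equal when their fields are equal, and comparison with a non-`BaseModel` value falls through to `NotImplemented` (so it is simply unequal). A minimal sketch (the `Point` model is hypothetical):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

a = Point(x=1, y=2)
b = Point(x=1, y=2)
c = Point(x=9, y=2)
```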

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
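As a sketch of how this hook can be used (the `Point` model below is hypothetical), an override can post-process the default schema. `handler.resolve_ref_schema` follows a possible `$ref` so the model's actual schema entry is mutated:

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import CoreSchema


class Point(BaseModel):
    """A 2D point."""

    x: int
    y: int

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        # Generate the default schema first, then post-process it.
        json_schema = handler(core_schema)
        # Follow a possible `$ref` to the concrete schema before mutating it.
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["examples"] = [{"x": 0, "y": 0}]
        return json_schema


print(Point.model_json_schema()["examples"])
```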

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `**kwargs` | `Unpack[ConfigDict]` | Keyword arguments passed to the class definition, which set `model_config` | `{}` |
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
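A minimal illustration of what this enables (model and field names here are hypothetical):

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


user = User(name="Ada", age=36)

# __iter__ yields (field_name, value) pairs, so dict() works directly:
print(dict(user))  # {'name': 'Ada', 'age': 36}
```

Fields whose names start with an underscore are skipped; extra fields would also be yielded if the model allowed them.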

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `**kwargs` | `Any` | Any keyword arguments passed to the class definition that aren't used internally by Pydantic. | `{}` |
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
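Together, `__getstate__` and `__setstate__` make models picklable. A small sketch (the `Config` model is hypothetical):

```python
import pickle

from pydantic import BaseModel


class Config(BaseModel):
    host: str
    port: int = 8080


original = Config(host="localhost")
restored = pickle.loads(pickle.dumps(original))

# Field values survive the round trip ...
assert restored == original
# ... and so does the record of which fields were explicitly set.
assert restored.model_fields_set == {"host"}
```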

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    original_yaml = open_bioimageio_yaml(self.source).unparsed_content
    assert isinstance(original_yaml, str)
    stream = StringIO()

    save_bioimageio_yaml_only(
        self.updated,
        stream,
        exclude_unset=self.exclude_unset,
        exclude_defaults=self.exclude_defaults,
    )
    updated_yaml = stream.getvalue()

    diff = compare(
        original_yaml.split("\n"),
        updated_yaml.split("\n"),
        diff_format=(
            "html"
            if isinstance(self.diff, Path) and self.diff.suffix == ".html"
            else "unified"
        ),
    )

    if isinstance(self.diff, Path):
        _ = self.diff.write_text(diff, encoding="utf-8")
    elif self.diff:
        console = rich.console.Console()
        diff_md = f"## Diff\n\n````````diff\n{diff}\n````````"
        console.print(rich.markdown.Markdown(diff_md))

    if isinstance(self.output, Path):
        _ = self.output.write_text(updated_yaml, encoding="utf-8")
        logger.info(f"written updated description to {self.output}")
    elif self.output == "display":
        updated_md = f"## Updated bioimageio.yaml\n\n```yaml\n{updated_yaml}\n```"
        rich.console.Console().print(rich.markdown.Markdown(updated_md))
    elif self.output == "stdout":
        print(updated_yaml)
    else:
        assert_never(self.output)

    if isinstance(self.updated, InvalidDescr):
        logger.warning("Update resulted in invalid description")
        _ = self.updated.validation_summary.display()

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `include` | `AbstractSetIntStr \| MappingIntStrAny \| None` | Optional set or mapping specifying which fields to include in the copied model. | `None` |
| `exclude` | `AbstractSetIntStr \| MappingIntStrAny \| None` | Optional set or mapping specifying which fields to exclude in the copied model. | `None` |
| `update` | `Dict[str, Any] \| None` | Optional dictionary of field-value pairs to override field values in the copied model. | `None` |
| `deep` | `bool` | If `True`, the values of fields that are Pydantic models will be deep-copied. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | A copy of the model with included, excluded and updated fields as specified. |

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
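A short sketch of how computed fields show up in this mapping (the `Rectangle` model is hypothetical):

```python
from pydantic import BaseModel, computed_field


class Rectangle(BaseModel):
    width: float
    height: float

    @computed_field
    @property
    def area(self) -> float:
        return self.width * self.height


# Access the mapping on the class (instance access is deprecated):
print("area" in Rectangle.model_computed_fields)  # True

# Computed fields are also included when serializing:
print(Rectangle(width=2, height=3).model_dump())  # includes 'area': 6.0
```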

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `_fields_set` | `set[str] \| None` | A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the `model_fields_set` attribute. Otherwise, the field names from the `values` argument will be used. | `None` |
| `values` | `Any` | Trusted or pre-validated data dictionary. | `{}` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | A new instance of the `Model` class with validated data. |

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
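A small sketch of the "no validation" behavior described above (the `User` model is hypothetical):

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int = 0


# Defaults are applied, but no validation or coercion happens:
u = User.model_construct(name="Ada")
assert u.age == 0

raw = User.model_construct(name="Ada", age="30")
assert raw.age == "30"  # the string is stored as-is, not coerced to int
```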

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `update` | `Mapping[str, Any] \| None` | Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. | `None` |
| `deep` | `bool` | Set to `True` to make a deep copy of the model. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `Self` | New model instance. |

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
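A brief usage sketch (the `Settings` model is hypothetical):

```python
from pydantic import BaseModel


class Settings(BaseModel):
    host: str
    port: int = 8080


settings = Settings(host="localhost")

# `update` values are applied without validation, so they must be trusted:
copied = settings.model_copy(update={"port": 9000})
assert copied.port == 9000 and copied.host == "localhost"
assert settings.port == 8080  # the original instance is untouched
```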

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

- `mode` (`Literal['json', 'python'] | str`, default `'python'`): The mode in which `to_python` should run. If mode is `'json'`, the output will only contain JSON serializable types. If mode is `'python'`, the output may contain non-JSON-serializable Python objects.
- `include` (`IncEx | None`, default `None`): A set of fields to include in the output.
- `exclude` (`IncEx | None`, default `None`): A set of fields to exclude from the output.
- `context` (`Any | None`, default `None`): Additional context to pass to the serializer.
- `by_alias` (`bool | None`, default `None`): Whether to use the field's alias in the dictionary key if defined.
- `exclude_unset` (`bool`, default `False`): Whether to exclude fields that have not been explicitly set.
- `exclude_defaults` (`bool`, default `False`): Whether to exclude fields that are set to their default value.
- `exclude_none` (`bool`, default `False`): Whether to exclude fields that have a value of `None`.
- `exclude_computed_fields` (`bool`, default `False`): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead.
- `round_trip` (`bool`, default `False`): If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`.
- `warnings` (`bool | Literal['none', 'warn', 'error']`, default `True`): How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`.
- `fallback` (`Callable[[Any], Any] | None`, default `None`): A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` is raised.
- `serialize_as_any` (`bool`, default `False`): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- `dict[str, Any]`: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
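To illustrate the `mode` and exclusion parameters above, a short sketch with a hypothetical `Event` model (not part of the bioimageio CLI):

```python
import datetime
from typing import Optional

from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: datetime.date
    note: Optional[str] = None

e = Event(name="release", when=datetime.date(2024, 1, 2))
# 'python' mode keeps native objects such as datetime.date:
print(e.model_dump())
# 'json' mode coerces everything to JSON-serializable types:
print(e.model_dump(mode="json", exclude_none=True))  # {'name': 'release', 'when': '2024-01-02'}
```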

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation: `model_dump_json`

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

- `indent` (`int | None`, default `None`): Indentation to use in the JSON output. If `None` is passed, the output will be compact.
- `ensure_ascii` (`bool`, default `False`): If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped. If `False` (the default), these characters will be output as-is.
- `include` (`IncEx | None`, default `None`): Field(s) to include in the JSON output.
- `exclude` (`IncEx | None`, default `None`): Field(s) to exclude from the JSON output.
- `context` (`Any | None`, default `None`): Additional context to pass to the serializer.
- `by_alias` (`bool | None`, default `None`): Whether to serialize using field aliases.
- `exclude_unset` (`bool`, default `False`): Whether to exclude fields that have not been explicitly set.
- `exclude_defaults` (`bool`, default `False`): Whether to exclude fields that are set to their default value.
- `exclude_none` (`bool`, default `False`): Whether to exclude fields that have a value of `None`.
- `exclude_computed_fields` (`bool`, default `False`): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated `round_trip` parameter instead.
- `round_trip` (`bool`, default `False`): If `True`, dumped values should be valid as input for non-idempotent types such as `Json[T]`.
- `warnings` (`bool | Literal['none', 'warn', 'error']`, default `True`): How to handle serialization errors. `False`/`"none"` ignores them, `True`/`"warn"` logs errors, `"error"` raises a `PydanticSerializationError`.
- `fallback` (`Callable[[Any], Any] | None`, default `None`): A function to call when an unknown value is encountered. If not provided, a `PydanticSerializationError` is raised.
- `serialize_as_any` (`bool`, default `False`): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- `str`: A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
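A short sketch of compact vs. indented JSON output, again with a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

p = Point(x=1)
# Compact output by default (indent=None):
print(p.model_dump_json())  # {"x":1,"y":0}
# Pretty-printed output:
print(p.model_dump_json(indent=2))
```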

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

- `by_alias` (`bool`, default `True`): Whether to use attribute aliases or not.
- `ref_template` (`str`, default `DEFAULT_REF_TEMPLATE`): The reference template.
- `schema_generator` (`type[GenerateJsonSchema]`, default `GenerateJsonSchema`): To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications.
- `mode` (`JsonSchemaMode`, default `'validation'`): The mode in which to generate the schema.
- `union_format` (`Literal['any_of', 'primitive_type_array']`, default `'any_of'`): The format to use when combining schemas from unions together. Can be one of:
    - `'any_of'`: Use the `anyOf` keyword to combine schemas (the default).
    - `'primitive_type_array'`: Use the `type` keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to `any_of`.

Returns:

- `dict[str, Any]`: The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
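A minimal sketch of schema generation with a hypothetical `Point` model:

```python
from pydantic import BaseModel, Field

class Point(BaseModel):
    x: int = Field(description="horizontal coordinate")
    y: int = 0

schema = Point.model_json_schema()
print(schema["title"])               # Point
print(sorted(schema["properties"]))  # ['x', 'y']
print(schema["required"])            # ['x'] -- only fields without defaults
```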

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

- `params` (`tuple[type[Any], ...]`, required): Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`.

Returns:

- `str`: String representing the new class where `params` are passed to `cls` as type variables.

Raises:

- `TypeError`: Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
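A sketch of the default naming scheme and a custom override, with hypothetical `Box`/`ShortBox` models:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")

class Box(BaseModel, Generic[T]):
    item: T

# Default scheme produced by model_parametrized_name:
print(Box[int].__name__)  # Box[int]

class ShortBox(BaseModel, Generic[T]):
    item: T

    @classmethod
    def model_parametrized_name(cls, params):
        # Custom scheme: join parameter type names with underscores.
        return f"ShortBox_{'_'.join(p.__name__ for p in params)}"

print(ShortBox[int].__name__)  # ShortBox_int
```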

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

- `force` (`bool`, default `False`): Whether to force the rebuilding of the model schema.
- `raise_errors` (`bool`, default `True`): Whether to raise errors.
- `_parent_namespace_depth` (`int`, default `2`): The depth level of the parent namespace.
- `_types_namespace` (`MappingNamespace | None`, default `None`): The types namespace.

Returns:

- `bool | None`: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns `True` if rebuilding was successful, otherwise `False`.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
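A sketch with a self-referencing model; pydantic resolves many forward references automatically, so `model_rebuild()` may simply return `None` here to signal the schema was already complete:

```python
from typing import Optional

from pydantic import BaseModel

class Node(BaseModel):
    value: int
    child: Optional["Node"] = None  # forward reference to Node itself

# Resolve any outstanding forward references explicitly.
result = Node.model_rebuild()
n = Node(value=1, child=Node(value=2))
print(n.child.value)  # 2
```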

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

- `obj` (`Any`, required): The object to validate.
- `strict` (`bool | None`, default `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `from_attributes` (`bool | None`, default `None`): Whether to extract data from object attributes.
- `context` (`Any | None`, default `None`): Additional context to pass to the validator.
- `by_alias` (`bool | None`, default `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default `None`): Whether to use the field's name when validating against the provided input data.

Raises:

- `ValidationError`: If the object could not be validated.

Returns:

- `Self`: The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
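A sketch of lax vs. strict validation with a hypothetical `Point` model:

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int = 0

# Lax (default) mode coerces the string "1" to the int 1:
p = Point.model_validate({"x": "1"})
print(p.x)  # 1

# Strict mode rejects the same input:
try:
    Point.model_validate({"x": "1"}, strict=True)
except ValidationError as err:
    print(err.error_count())  # 1
```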

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation: JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

- `json_data` (`str | bytes | bytearray`, required): The JSON data to validate.
- `strict` (`bool | None`, default `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `context` (`Any | None`, default `None`): Extra variables to pass to the validator.
- `by_alias` (`bool | None`, default `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default `None`): Whether to use the field's name when validating against the provided input data.

Returns:

- `Self`: The validated Pydantic model.

Raises:

- `ValidationError`: If `json_data` is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
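Validating a raw JSON string directly, without a separate `json.loads` step, with a hypothetical `Point` model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# Parse and validate in one step (faster than json.loads + model_validate):
p = Point.model_validate_json('{"x": 3}')
print(p)  # x=3 y=0
```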

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

- `obj` (`Any`, required): The object containing string data to validate.
- `strict` (`bool | None`, default `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `context` (`Any | None`, default `None`): Extra variables to pass to the validator.
- `by_alias` (`bool | None`, default `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default `None`): Whether to use the field's name when validating against the provided input data.

Returns:

- `Self`: The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
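Validating data whose leaf values are all strings (e.g. parsed from environment variables or an INI file), with a hypothetical `Event` model:

```python
import datetime

from pydantic import BaseModel

class Event(BaseModel):
    when: datetime.date
    count: int

# All leaf values are strings; they are coerced as in JSON mode:
e = Event.model_validate_strings({"when": "2024-01-02", "count": "3"})
print(e.when, e.count)  # 2024-01-02 3
```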

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)
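Since `parse_obj` is deprecated, a sketch of the recommended migration (hypothetical `Point` model):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int

# Deprecated (emits a PydanticDeprecatedSince20 warning):
#   p = Point.parse_obj({"x": 1})
# Preferred replacement:
p = Point.model_validate({"x": 1})
print(p.x)  # 1
```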

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)
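For raw JSON input, `model_validate_json` accepts both `str` and `bytes` directly. A sketch with a hypothetical `Item` model; note that invalid JSON surfaces as a `ValidationError` in V2, rather than the V1-style `value_error.jsondecode` reproduced by the shim above:

```python
from pydantic import BaseModel, ValidationError


class Item(BaseModel):
    id: int


# Deprecated: Item.parse_raw(b'{"id": 7}')
item = Item.model_validate_json(b'{"id": 7}')

# Invalid JSON raises a ValidationError:
try:
    Item.model_validate_json(b"not json")
    raised = False
except ValidationError:
    raised = True
```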

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)
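The replacement returns the same JSON-schema dictionary. A sketch with a hypothetical `Box` model:

```python
from pydantic import BaseModel


class Box(BaseModel):
    width: float
    height: float


# Deprecated: Box.schema()
schema = Box.model_json_schema()
```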

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
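`model_rebuild` covers the common `update_forward_refs` use case of resolving string annotations. A sketch with a hypothetical self-referencing `Node` model:

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):
    value: int
    next: Optional["Node"] = None


# Deprecated: Node.update_forward_refs()
Node.model_rebuild()

chain = Node(value=1, next=Node(value=2))
```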

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

UpdateHashesCmd ¤

UpdateHashesCmd(**data: Any)

Bases: UpdateCmdBase


              flowchart TD
              bioimageio.core.cli.UpdateHashesCmd[UpdateHashesCmd]
              bioimageio.core.cli.UpdateCmdBase[UpdateCmdBase]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.UpdateCmdBase --> bioimageio.core.cli.UpdateHashesCmd
                                bioimageio.core.cli.CmdBase --> bioimageio.core.cli.UpdateCmdBase
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.UpdateCmdBase
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                






Create a bioimageio.yaml description with updated file hashes.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

diff Union[bool, Path]

Output a diff of original and updated bioimageio.yaml.

exclude_defaults bool

Exclude fields that have the default value (even if set explicitly).

exclude_unset bool

Exclude fields that have not been explicitly set.

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

output Union[Literal['display', 'stdout'], Path]

Output updated bioimageio.yaml to the terminal or write to a file.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

updated
Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
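Validation happens during `__init__`, so invalid input raises immediately. A sketch with a hypothetical `Size` model:

```python
from pydantic import BaseModel, ValidationError


class Size(BaseModel):
    pixels: int


ok = Size(pixels=512)

# Invalid input raises ValidationError from the constructor:
try:
    Size(pixels="not a number")
    failed = False
except ValidationError as err:
    failed = True
    loc = err.errors()[0]["loc"]
```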

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

diff class-attribute instance-attribute ¤

diff: Union[bool, Path] = Field(True, alias='diff')

Output a diff of original and updated bioimageio.yaml. If a given path has an .html extension, a standalone HTML file is written, otherwise the diff is saved in unified diff format (pure text).

exclude_defaults class-attribute instance-attribute ¤

exclude_defaults: bool = Field(False, alias='exclude-defaults')

Exclude fields that have the default value (even if set explicitly).

exclude_unset class-attribute instance-attribute ¤

exclude_unset: bool = Field(True, alias='exclude-unset')

Exclude fields that have not been explicitly set.

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

output class-attribute instance-attribute ¤

output: Union[Literal['display', 'stdout'], Path] = 'display'

Output updated bioimageio.yaml to the terminal or write to a file.

Notes:
- "display": Render to the terminal with syntax highlighting.
- "stdout": Write to sys.stdout without syntax highlighting. (More convenient for copying the updated bioimageio.yaml from the terminal.)

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

updated cached property ¤

updated

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
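The difference between the two copy methods shows up with mutable field values: a shallow copy shares them, a deep copy does not. A sketch with a hypothetical `Config` model:

```python
from copy import copy, deepcopy

from pydantic import BaseModel


class Config(BaseModel):
    tags: list


original = Config(tags=["a"])
shallow = copy(original)   # shares mutable field values
deep = deepcopy(original)  # fully independent

original.tags.append("b")  # visible via `shallow`, not via `deep`
```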

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

- core_schema (CoreSchema, required): A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.
- handler (GetJsonSchemaHandler, required): Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

Returns:

- JsonSchemaValue: A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

- **kwargs (Unpack[ConfigDict], default {}): Keyword arguments passed to the class definition, which set model_config.
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
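Because `__iter__` yields (field name, value) pairs, `dict(model)` produces a plain dictionary of the model's fields. A minimal sketch (the `Point` model and its fields are illustrative, not from the source):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
# dict() consumes __iter__'s (name, value) pairs, skipping private attributes
print(dict(p))  # {'x': 1, 'y': 2}
```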

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

- **kwargs (Any, default {}): Any keyword arguments passed to the class definition that aren't used internally by Pydantic.
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
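Together, `__getstate__` and `__setstate__` make models picklable, preserving extras and private attributes across the round trip. A small sketch (the `Item` model is illustrative):

```python
import pickle
from pydantic import BaseModel

class Item(BaseModel):
    name: str

item = Item(name="lens")
# pickle uses __getstate__ to capture __dict__, extras, fields-set, and
# private attributes, and __setstate__ to restore them
restored = pickle.loads(pickle.dumps(item))
print(restored == item)  # True
```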

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    original_yaml = open_bioimageio_yaml(self.source).unparsed_content
    assert isinstance(original_yaml, str)
    stream = StringIO()

    save_bioimageio_yaml_only(
        self.updated,
        stream,
        exclude_unset=self.exclude_unset,
        exclude_defaults=self.exclude_defaults,
    )
    updated_yaml = stream.getvalue()

    diff = compare(
        original_yaml.split("\n"),
        updated_yaml.split("\n"),
        diff_format=(
            "html"
            if isinstance(self.diff, Path) and self.diff.suffix == ".html"
            else "unified"
        ),
    )

    if isinstance(self.diff, Path):
        _ = self.diff.write_text(diff, encoding="utf-8")
    elif self.diff:
        console = rich.console.Console()
        diff_md = f"## Diff\n\n````````diff\n{diff}\n````````"
        console.print(rich.markdown.Markdown(diff_md))

    if isinstance(self.output, Path):
        _ = self.output.write_text(updated_yaml, encoding="utf-8")
        logger.info(f"written updated description to {self.output}")
    elif self.output == "display":
        updated_md = f"## Updated bioimageio.yaml\n\n```yaml\n{updated_yaml}\n```"
        rich.console.Console().print(rich.markdown.Markdown(updated_md))
    elif self.output == "stdout":
        print(updated_yaml)
    else:
        assert_never(self.output)

    if isinstance(self.updated, InvalidDescr):
        logger.warning("Update resulted in invalid description")
        _ = self.updated.validation_summary.display()

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

- include (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to include in the copied model.
- exclude (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to exclude in the copied model.
- update (Dict[str, Any] | None, default None): Optional dictionary of field-value pairs to override field values in the copied model.
- deep (bool, default False): If True, the values of fields that are Pydantic models will be deep-copied.

Returns:

- Self: A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

- _fields_set (set[str] | None, default None): A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.
- values (Any, default {}): Trusted or pre-validated data dictionary.

Returns:

- Self: A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
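Because `model_construct` performs no validation, it will happily store values that would fail normal validation while still applying field defaults. A sketch under that assumption (the `User` model is illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anon"

# no validation: the string is stored as-is; the default for `name` is applied
u = User.model_construct(id="not-an-int")
print(u.id, u.name)  # not-an-int anon
```

This is intended for trusted or pre-validated data, where skipping validation saves work.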

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

- update (Mapping[str, Any] | None, default None): Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
- deep (bool, default False): Set to True to make a deep copy of the model.

Returns:

- Self: New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
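A minimal sketch of `model_copy`, using a hypothetical `User` model. Note that values passed via `update` bypass validation, as the docstring warns:

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

u = User(name="Ada", age=36)

# Shallow copy with one field overridden; `update` values are NOT validated.
v = u.model_copy(update={"age": 37})

# Deep copy: nested containers are duplicated as well.
w = u.model_copy(deep=True)
```

The original instance is untouched; the copies compare equal field-by-field but are distinct objects.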

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default

mode ¤

Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'

include ¤

IncEx | None

A set of fields to include in the output.

None

exclude ¤

IncEx | None

A set of fields to exclude from the output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to use the field's alias in the dictionary key if defined.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
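A short illustration of the `mode` and `exclude_*` parameters, using a hypothetical `Event` model. In `'python'` mode native objects such as `date` are kept; in `'json'` mode they are coerced to JSON-serializable types:

```python
from datetime import date
from typing import Optional
from pydantic import BaseModel

class Event(BaseModel):
    title: str
    day: date
    notes: Optional[str] = None

e = Event(title="release", day=date(2024, 1, 15))

py = e.model_dump()                       # 'python' mode keeps the date object
js = e.model_dump(mode="json")            # 'json' mode serializes it to a string
compact = e.model_dump(exclude_none=True) # drops fields whose value is None
```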

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
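A quick sketch of `model_dump_json` with and without `indent`, on a hypothetical `Point` model. With `indent=None` (the default) the output is compact:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)

s = p.model_dump_json()         # compact JSON string
pretty = p.model_dump_json(indent=2)  # multi-line, indented output
```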

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
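As the warning above notes, `model_fields` should be accessed on the class, not on an instance. A minimal sketch with a hypothetical `Config` model:

```python
from pydantic import BaseModel, Field

class Config(BaseModel):
    host: str = "localhost"
    port: int = Field(default=8080, description="TCP port")

# Access on the class (instance access is deprecated, removed in Pydantic V3).
fields = Config.model_fields
```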

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
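A basic call to `model_json_schema` with the default arguments, on a hypothetical `Item` model. Required fields (those without defaults) end up in the schema's `required` list:

```python
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float = 0.0

# Default mode='validation', by_alias=True.
schema = Item.model_json_schema()
```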

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
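The default naming scheme can be observed by parametrizing a generic model; a sketch with a hypothetical `Response[T]`:

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Response(BaseModel, Generic[T]):
    data: T

# Parametrization derives the concrete class name via model_parametrized_name.
IntResponse = Response[int]
```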

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
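The typical use case is a forward reference that is only defined after the referencing model; a sketch with hypothetical `Wrapper`/`Payload` models:

```python
from pydantic import BaseModel

class Wrapper(BaseModel):
    payload: "Payload"  # not yet defined -> schema stays incomplete for now

class Payload(BaseModel):
    text: str

# Rebuild now that Payload exists; returns True because rebuilding was required.
result = Wrapper.model_rebuild()
w = Wrapper(payload={"text": "hi"})
```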

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
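A sketch of lax vs. strict validation with a hypothetical `User` model. In the default lax mode the string `"1"` is coerced to an `int`; with `strict=True` the same input is rejected:

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    name: str

u = User.model_validate({"id": "1", "name": "Ada"})  # lax mode coerces "1" -> 1

strict_failed = False
try:
    User.model_validate({"id": "1", "name": "Ada"}, strict=True)
except ValidationError:
    strict_failed = True  # strict mode does not coerce str -> int
```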

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
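A sketch of `model_validate_json` with a hypothetical `Task` model, including the `ValidationError` raised when the input is not valid JSON:

```python
from pydantic import BaseModel, ValidationError

class Task(BaseModel):
    title: str
    done: bool = False

t = Task.model_validate_json('{"title": "write docs", "done": true}')

bad_json = False
try:
    Task.model_validate_json("not json")
except ValidationError:
    bad_json = True  # invalid JSON also surfaces as a ValidationError
```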

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
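`model_validate_strings` is handy when every leaf value arrives as a string (e.g. environment variables or query parameters); each string is parsed as it would be from JSON. A sketch with a hypothetical `Record` model:

```python
from datetime import date
from pydantic import BaseModel

class Record(BaseModel):
    count: int
    when: date

# All leaf values are strings; they are parsed per-field like JSON values.
r = Record.model_validate_strings({"count": "3", "when": "2024-01-15"})
```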

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)
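The deprecation warning above points at loading the file yourself and then calling `model_validate_json`. A minimal migration sketch (the `Item` model and JSON payload are illustrative, not part of this API):

```python
import json
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    count: int

# Deprecated: item = Item.parse_file("item.json")
# Recommended: read the file yourself, then validate the JSON text.
raw = json.dumps({"name": "nucleus", "count": 3})  # stands in for Path("item.json").read_text()
item = Item.model_validate_json(raw)
```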

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)
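As the warning states, `model_validate` is a drop-in replacement; a short sketch with an illustrative model:

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

# Deprecated: p = Point.parse_obj({"x": 1, "y": 2})
# Recommended drop-in replacement:
p = Point.model_validate({"x": 1, "y": 2})
```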

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)
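For JSON input, the replacement suggested by the warning accepts `str` or `bytes` directly; a small sketch (the `Tag` model is made up for illustration):

```python
from pydantic import BaseModel

class Tag(BaseModel):
    label: str

# Deprecated: t = Tag.parse_raw(b'{"label": "cell"}')
# Recommended for JSON input; `model_validate_json` accepts str or bytes:
t = Tag.model_validate_json(b'{"label": "cell"}')
```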

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)
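A migration sketch for the deprecated `schema` (and, combined with `json.dumps`, for `schema_json` below); the `Sample` model is illustrative:

```python
from pydantic import BaseModel

class Sample(BaseModel):
    name: str

# Deprecated: Sample.schema() / Sample.schema_json()
js = Sample.model_json_schema()  # json.dumps(js) replaces schema_json()
```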

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
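The replacement named in the warning, `model_rebuild`, resolves forward references once the referenced names exist; a minimal sketch with an illustrative self-referencing model:

```python
from typing import Optional
from pydantic import BaseModel

class Node(BaseModel):
    value: int
    next: Optional["Node"] = None  # forward reference to the class itself

# Deprecated: Node.update_forward_refs()
# Recommended: rebuild the schema once the referenced names are defined.
Node.model_rebuild()
n = Node(value=1, next=Node(value=2))
```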

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

ValidateFormatCmd ¤

ValidateFormatCmd(**data: Any)

Bases: CmdBase, WithSource, WithSummaryLogging


              flowchart TD
              bioimageio.core.cli.ValidateFormatCmd[ValidateFormatCmd]
              bioimageio.core.cli.CmdBase[CmdBase]
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.WithSummaryLogging[WithSummaryLogging]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.CmdBase --> bioimageio.core.cli.ValidateFormatCmd
                                pydantic.main.BaseModel --> bioimageio.core.cli.CmdBase
                

                bioimageio.core.cli.WithSource --> bioimageio.core.cli.ValidateFormatCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                


                bioimageio.core.cli.WithSummaryLogging --> bioimageio.core.cli.ValidateFormatCmd
                                bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSummaryLogging
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                




              click bioimageio.core.cli.ValidateFormatCmd href "" "bioimageio.core.cli.ValidateFormatCmd"
              click bioimageio.core.cli.CmdBase href "" "bioimageio.core.cli.CmdBase"
              click bioimageio.core.cli.WithSource href "" "bioimageio.core.cli.WithSource"
              click bioimageio.core.cli.WithSummaryLogging href "" "bioimageio.core.cli.WithSummaryLogging"
              click bioimageio.core.cli.ArgMixin href "" "bioimageio.core.cli.ArgMixin"
              click pydantic.main.BaseModel href "" "pydantic.main.BaseModel"
            

Validate the metadata format of a bioimageio resource.

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
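A hedged usage sketch of this command via the `bioimageio` CLI (assuming the subcommand is exposed as `validate-format`; 'affable-shark' is the example resource identifier from the `source` field's documentation):

```shell
# Validate a resource by its bioimage.io identifier
bioimageio validate-format affable-shark

# Validate a local resource description
bioimageio validate-format path/to/bioimageio.yaml
```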

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
cli_cmd
construct
copy

Returns a copy of the model.

dict
from_orm
json
log
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

perform_io_checks bool

Whether or not to perform validations that require downloading remote files.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

summary List[Union[Literal['display'], Path]]

Display the validation summary or save it as JSON, Markdown or HTML.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
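A short sketch of the two return cases (the models are illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Flexible(BaseModel):
    model_config = ConfigDict(extra="allow")
    name: str

class Plain(BaseModel):  # default config: extra is not 'allow'
    name: str

f = Flexible(name="a", color="red")  # 'color' is kept as an extra field
p = Plain(name="a")                  # no extra storage -> model_extra is None
```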

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.
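A minimal sketch showing that only explicitly passed fields appear in the set (the `Cfg` model is illustrative):

```python
from pydantic import BaseModel

class Cfg(BaseModel):
    host: str = "localhost"
    port: int = 8080

c = Cfg(port=9000)  # 'host' keeps its default and is not in model_fields_set
```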

perform_io_checks class-attribute instance-attribute ¤

perform_io_checks: bool = Field(settings.perform_io_checks, alias='perform-io-checks')

Whether or not to perform validations that require downloading remote files. Note: the default value is set by the BIOIMAGEIO_PERFORM_IO_CHECKS environment variable.

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

summary class-attribute instance-attribute ¤

summary: List[Union[Literal['display'], Path]] = Field(default_factory=lambda: ['display'], examples=[Path('summary.md'), Path('bioimageio_summaries/'), ['display', Path('summary.md')]])

Display the validation summary or save it as JSON, Markdown or HTML. The format is chosen based on the suffix: .json, .md, .html. If a folder is given (path w/o suffix) the summary is saved in all formats. Choose/add "display" to render the validation summary to the terminal.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
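The difference between the two copy hooks can be seen with a mutable field; because `__copy__` copies `__dict__` shallowly, the list object is shared, while `__deepcopy__` duplicates it (the `Holder` model is illustrative):

```python
from copy import copy, deepcopy
from pydantic import BaseModel

class Holder(BaseModel):
    xs: list[int]

h = Holder(xs=[1, 2])
shallow = copy(h)    # __copy__: the list object is shared with h
deep = deepcopy(h)   # __deepcopy__: fully independent copy
shallow.xs.append(3)  # also visible through h
```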

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison
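The generic-origin rule described in the comments above can be demonstrated with a small illustrative generic model: a parametrized instance compares equal to an unparametrized one when all field values match.

```python
from typing import Any, Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Box(BaseModel, Generic[T]):
    item: T

# Box[Any] is a distinct class, but its generic origin is Box,
# so equal field values make the instances compare equal.
a = Box(item=1)
b = Box[Any](item=1)
```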

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
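The hook above can be overridden on a model subclass to customize the generated JSON schema. A minimal sketch (model and field names are illustrative, not from this library) that starts from the default schema and attaches extra metadata:

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import CoreSchema


class Point(BaseModel):
    x: int
    y: int

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        # Generate the default schema, resolve a possible $ref, then amend it.
        json_schema = handler(core_schema)
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema["description"] = "A 2D point"
        return json_schema


print(Point.model_json_schema()["description"])  # A 2D point
```

Calling `handler.resolve_ref_schema` first ensures the mutation lands on the actual model schema even when the generator emits a `$ref`.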

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
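Because `__iter__` yields field name/value pairs, `dict(model)` works directly. A quick sketch (model names are illustrative):

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


user = User(name="Ada", age=36)
# __iter__ yields (field, value) pairs, so dict() consumes them directly.
print(dict(user))  # {'name': 'Ada', 'age': 36}
```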

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

Name Type Description Default

**kwargs ¤

Any

Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

{}
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
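Together, `__getstate__` and `__setstate__` make model instances picklable. A minimal sketch (field names are illustrative) showing that a pickle round trip preserves both field values and the fields-set bookkeeping:

```python
import pickle

from pydantic import BaseModel


class Config(BaseModel):
    host: str = "localhost"
    port: int = 8080


original = Config(port=9000)
# __getstate__ captures __dict__, extras, fields-set, and private attrs;
# __setstate__ restores them on the new instance.
restored = pickle.loads(pickle.dumps(original))
print(restored == original)       # True
print(restored.model_fields_set)  # {'port'}
```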

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

cli_cmd ¤

cli_cmd()
Source code in src/bioimageio/core/cli.py
def cli_cmd(self):
    self.log(self.descr)
    sys.exit(
        0
        if self.descr.validation_summary.status in ("valid-format", "passed")
        else 1
    )

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

Name Type Description Default

include ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to include in the copied model.

None

exclude ¤

AbstractSetIntStr | MappingIntStrAny | None

Optional set or mapping specifying which fields to exclude in the copied model.

None

update ¤

Dict[str, Any] | None

Optional dictionary of field-value pairs to override field values in the copied model.

None

deep ¤

bool

If True, the values of fields that are Pydantic models will be deep-copied.

False

Returns:

Type Description
Self

A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
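The deprecation message points at the modern replacement: enable `from_attributes` in the model config and call `model_validate`. A sketch with a hypothetical stand-in for an ORM row object:

```python
from pydantic import BaseModel, ConfigDict


class UserRow:  # hypothetical stand-in for an ORM row object
    def __init__(self) -> None:
        self.id = 1
        self.name = "Ada"


class UserSchema(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    name: str


# model_validate reads attributes off the object instead of dict keys.
user = UserSchema.model_validate(UserRow())
print(user.id, user.name)  # 1 Ada
```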

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

log ¤

log(descr: Union[ResourceDescr, InvalidDescr])
Source code in src/bioimageio/core/cli.py
def log(self, descr: Union[ResourceDescr, InvalidDescr]):
    _ = descr.validation_summary.log(self.summary)

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
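Fields declared with `@computed_field` appear in this mapping; note the warning above — access it on the class, not an instance. A minimal sketch (model names are illustrative):

```python
from pydantic import BaseModel, computed_field


class Rect(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height


# Class-level access returns {name: ComputedFieldInfo}.
print(list(Rect.model_computed_fields))  # ['area']
print(Rect(width=3, height=4).area)      # 12
```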

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default

_fields_set ¤

set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None

values ¤

Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
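The behavior described above can be seen in a short sketch (model names are illustrative): defaults are still applied, but no validation runs, so ill-typed values pass through unchecked.

```python
from pydantic import BaseModel


class Item(BaseModel):
    name: str
    quantity: int = 1


# Defaults are respected even though validation is skipped.
item = Item.model_construct(name="widget")
print(item.quantity)  # 1

# No validation: a non-str name is stored as-is, without error.
unchecked = Item.model_construct(name=123)
print(type(unchecked.name))  # <class 'int'>
```

Use this only with trusted or pre-validated data, e.g. when deserializing from a source you already validated.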

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

Name Type Description Default

update ¤

Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None

deep ¤

bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
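A short usage sketch (model names are illustrative): the copy is shallow by default, and `update` values are applied without validation, so only pass trusted data.

```python
from pydantic import BaseModel


class Server(BaseModel):
    host: str
    port: int


base = Server(host="example.org", port=80)
# The update mapping overrides fields on the copy; the original is untouched.
tls = base.model_copy(update={"port": 443})
print(tls.port, base.port)  # 443 80
```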

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

• mode (Literal['json', 'python'] | str, default 'python'): The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.
• include (IncEx | None, default None): A set of fields to include in the output.
• exclude (IncEx | None, default None): A set of fields to exclude from the output.
• context (Any | None, default None): Additional context to pass to the serializer.
• by_alias (bool | None, default None): Whether to use the field's alias in the dictionary key if defined.
• exclude_unset (bool, default False): Whether to exclude fields that have not been explicitly set.
• exclude_defaults (bool, default False): Whether to exclude fields that are set to their default value.
• exclude_none (bool, default False): Whether to exclude fields that have a value of None.
• exclude_computed_fields (bool, default False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
• round_trip (bool, default False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
• warnings (bool | Literal['none', 'warn', 'error'], default True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
• fallback (Callable[[Any], Any] | None, default None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError is raised.
• serialize_as_any (bool, default False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

• dict[str, Any]: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
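
A short illustration of the exclusion flags described above (`Point` is a made-up model):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

p = Point(x=1)
full = p.model_dump()                      # all fields, defaults included
sparse = p.model_dump(exclude_unset=True)  # only fields that were set explicitly
```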

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

• indent (int | None, default None): Indentation to use in the JSON output. If None is passed, the output will be compact.
• ensure_ascii (bool, default False): If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.
• include (IncEx | None, default None): Field(s) to include in the JSON output.
• exclude (IncEx | None, default None): Field(s) to exclude from the JSON output.
• context (Any | None, default None): Additional context to pass to the serializer.
• by_alias (bool | None, default None): Whether to serialize using field aliases.
• exclude_unset (bool, default False): Whether to exclude fields that have not been explicitly set.
• exclude_defaults (bool, default False): Whether to exclude fields that are set to their default value.
• exclude_none (bool, default False): Whether to exclude fields that have a value of None.
• exclude_computed_fields (bool, default False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
• round_trip (bool, default False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
• warnings (bool | Literal['none', 'warn', 'error'], default True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
• fallback (Callable[[Any], Any] | None, default None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError is raised.
• serialize_as_any (bool, default False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

• str: A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
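
For example, the `indent` parameter switches between compact and pretty-printed output (`Point` is illustrative):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

compact = Point(x=1).model_dump_json()         # single-line JSON
pretty = Point(x=1).model_dump_json(indent=2)  # indented, multi-line JSON
```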

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
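
A small sketch of inspecting the mapping from the model class, per the deprecation note above (`Point` is illustrative):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

# Access from the class, not an instance (instance access is deprecated).
field_names = set(Point.model_fields)
required = [n for n, f in Point.model_fields.items() if f.is_required()]
```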

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

• by_alias (bool, default True): Whether to use attribute aliases or not.
• ref_template (str, default DEFAULT_REF_TEMPLATE): The reference template.
• union_format (Literal['any_of', 'primitive_type_array'], default 'any_of'): The format to use when combining schemas from unions together. Can be one of:
  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
• schema_generator (type[GenerateJsonSchema], default GenerateJsonSchema): A subclass of GenerateJsonSchema with your desired modifications, to override the logic used to generate the JSON schema.
• mode (JsonSchemaMode, default 'validation'): The mode in which to generate the schema.

Returns:

• dict[str, Any]: The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
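
A minimal example of the generated schema shape (`Point` is illustrative):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int = 0

schema = Point.model_json_schema()
# "properties" describes each field; "required" lists only fields
# without a default, here just "x".
```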

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

• params (tuple[type[Any], ...], required): Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

Returns:

• str: String representing the new class where params are passed to cls as type variables.

Raises:

• TypeError: Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
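
The default naming scheme can be observed by parametrizing a generic model (`Box` is a made-up example):

```python
from typing import Generic, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class Box(BaseModel, Generic[T]):
    item: T

# The parametrized class name is produced by model_parametrized_name.
name = Box[int].__name__
```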

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

• force (bool, default False): Whether to force the rebuilding of the model schema.
• raise_errors (bool, default True): Whether to raise errors.
• _parent_namespace_depth (int, default 2): The depth level of the parent namespace.
• _types_namespace (MappingNamespace | None, default None): The types namespace.

Returns:

• bool | None: Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
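
A minimal sketch of the forward-reference scenario described above (`Node` is illustrative; simple self-references usually resolve automatically, in which case the explicit rebuild is a no-op):

```python
from typing import Optional
from pydantic import BaseModel

class Node(BaseModel):
    value: int
    child: Optional["Node"] = None  # forward reference to the class itself

# Returns None when the schema is already complete and no rebuild was needed.
result = Node.model_rebuild()
n = Node(value=1, child={"value": 2})
```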

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

• obj (Any, required): The object to validate.
• strict (bool | None, default None): Whether to enforce types strictly.
• extra (ExtraValues | None, default None): Whether to ignore, allow, or forbid extra data during model validation. See the extra configuration value for details.
• from_attributes (bool | None, default None): Whether to extract data from object attributes.
• context (Any | None, default None): Additional context to pass to the validator.
• by_alias (bool | None, default None): Whether to use the field's alias when validating against the provided input data.
• by_name (bool | None, default None): Whether to use the field's name when validating against the provided input data.

Raises:

• ValidationError: If the object could not be validated.

Returns:

• Self: The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
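
A short illustration of the `strict` parameter (`Point` is a made-up model):

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int = 0

p = Point.model_validate({"x": "1"})  # lax mode coerces the string to an int

try:
    Point.model_validate({"x": "1"}, strict=True)
    strict_ok = True
except ValidationError:
    strict_ok = False  # strict mode rejects string-to-int coercion
```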

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

• json_data (str | bytes | bytearray, required): The JSON data to validate.
• strict (bool | None, default None): Whether to enforce types strictly.
• extra (ExtraValues | None, default None): Whether to ignore, allow, or forbid extra data during model validation. See the extra configuration value for details.
• context (Any | None, default None): Extra variables to pass to the validator.
• by_alias (bool | None, default None): Whether to use the field's alias when validating against the provided input data.
• by_name (bool | None, default None): Whether to use the field's name when validating against the provided input data.

Returns:

• Self: The validated Pydantic model.

Raises:

• ValidationError: If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
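
A minimal sketch, including the ValidationError raised on malformed input (`Point` is illustrative):

```python
from pydantic import BaseModel, ValidationError

class Point(BaseModel):
    x: int
    y: int = 0

p = Point.model_validate_json('{"x": 1}')

try:
    Point.model_validate_json("not json")
    parsed = True
except ValidationError:
    parsed = False  # invalid JSON raises ValidationError, not json.JSONDecodeError
```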

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

• obj (Any, required): The object containing string data to validate.
• strict (bool | None, default None): Whether to enforce types strictly.
• extra (ExtraValues | None, default None): Whether to ignore, allow, or forbid extra data during model validation. See the extra configuration value for details.
• context (Any | None, default None): Extra variables to pass to the validator.
• by_alias (bool | None, default None): Whether to use the field's alias when validating against the provided input data.
• by_name (bool | None, default None): Whether to use the field's name when validating against the provided input data.

Returns:

• Self: The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
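
A small sketch of string-mode validation, useful when every leaf value arrives as a string, e.g. from an env file or form data (`Event` is a made-up model):

```python
import datetime
from pydantic import BaseModel

class Event(BaseModel):
    when: datetime.date
    count: int

# All leaf values are strings; they are parsed according to the field types.
e = Event.model_validate_strings({"when": "2024-01-31", "count": "3"})
```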

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)
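A minimal sketch of the recommended replacement, using a hypothetical `Point` model:

```python
from pydantic import BaseModel


class Point(BaseModel):  # hypothetical model for illustration
    x: float
    y: float


# Deprecated: Point.schema()
# Replacement:
schema = Point.model_json_schema()
```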

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
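A sketch of the replacement for a self-referencing model, with a hypothetical `Node` class; `model_rebuild` resolves the forward reference explicitly:

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):  # hypothetical self-referencing model
    value: int
    next: Optional["Node"] = None


# Deprecated: Node.update_forward_refs()
# Replacement:
Node.model_rebuild()

chain = Node(value=1, next={"value": 2})
```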

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)

WithSource ¤

WithSource(**data: Any)

Bases: ArgMixin


              flowchart TD
              bioimageio.core.cli.WithSource[WithSource]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSource
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                



            

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which

__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass

__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
construct
copy

Returns a copy of the model.

dict
from_orm
json
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to

__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
descr
descr_id str

a more user-friendly description id

model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

source CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
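The validating `__init__` behavior described above can be illustrated with a hypothetical `User` model: valid keyword arguments produce an instance, invalid ones raise `ValidationError`.

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical model for illustration
    name: str
    age: int


ok = User(name="Ada", age=36)

try:
    User(name="Ada", age="not a number")
except ValidationError as exc:
    errors = exc.errors()  # structured details about each failed field
```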

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

descr cached property ¤

descr

descr_id property ¤

descr_id: str

a more user-friendly description id (replacing legacy ids with their nicknames)

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
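A short sketch of `model_extra` with `extra` set to `'allow'`, using a hypothetical `Flexible` model:

```python
from pydantic import BaseModel, ConfigDict


class Flexible(BaseModel):  # hypothetical model allowing extra fields
    model_config = ConfigDict(extra="allow")
    name: str


# `version` is not a declared field, so it lands in `model_extra`.
m = Flexible(name="demo", version=2)
```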

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.
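For example, with a hypothetical `Settings` model, only fields passed explicitly at instantiation appear in `model_fields_set`; defaulted fields do not:

```python
from pydantic import BaseModel


class Settings(BaseModel):  # hypothetical model with defaults
    host: str = "localhost"
    port: int = 8080


s = Settings(port=9000)  # only `port` is set explicitly
```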

source instance-attribute ¤

source: CliPositionalArg[str]

Url/path to a (folder with a) bioimageio.yaml/rdf.yaml file or a bioimage.io resource identifier, e.g. 'affable-shark'

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
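A minimal sketch of overriding this hook on a hypothetical `Tagged` model: call the handler to get the generated schema, then augment the result.

```python
from pydantic import BaseModel


class Tagged(BaseModel):  # hypothetical model customizing its JSON schema
    value: int

    @classmethod
    def __get_pydantic_json_schema__(cls, core_schema, handler):
        json_schema = handler(core_schema)  # generate the default schema
        json_schema["description"] = "A tagged value"  # then augment it
        return json_schema


schema = Tagged.model_json_schema()
```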

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
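This lookup is what makes extra values reachable as plain attributes. A minimal sketch, assuming a hypothetical `Record` model with `extra='allow'`:

```python
from pydantic import BaseModel, ConfigDict

class Record(BaseModel):
    model_config = ConfigDict(extra='allow')
    id: int

r = Record(id=1, source='api')  # 'source' is not a declared field
# attribute access falls through to __pydantic_extra__ via __getattr__
print(r.source)              # 'api'
print(r.__pydantic_extra__)  # {'source': 'api'}
```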

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

    **kwargs (Unpack[ConfigDict], default {}): Keyword arguments passed to the class definition, which set model_config.
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()
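In practice, `dict(model)` yields raw field name/value pairs without serialization, so nested models stay model instances, unlike `model_dump()`. A sketch with illustrative models:

```python
from pydantic import BaseModel

class Inner(BaseModel):
    x: int

class Outer(BaseModel):
    inner: Inner
    label: str

o = Outer(inner=Inner(x=1), label='a')
shallow = dict(o)  # uses __iter__
print(shallow['label'])                  # 'a'
print(type(shallow['inner']).__name__)   # 'Inner' -- not converted to a dict
print(o.model_dump())                    # {'inner': {'x': 1}, 'label': 'a'}
```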

__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

    **kwargs (Any, default {}): Any keyword arguments passed to the class definition that aren't used internally by Pydantic.
Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))
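Together, `__getstate__` and `__setstate__` make models picklable. A round-trip sketch with an illustrative model:

```python
import pickle

from pydantic import BaseModel

class Job(BaseModel):
    name: str
    retries: int = 0

j = Job(name='build', retries=2)
# dumps calls __getstate__, loads calls __setstate__
restored = pickle.loads(pickle.dumps(j))
print(restored == j)  # True
print(restored.model_fields_set)  # {'name', 'retries'} is preserved
```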

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

    include (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to include in the copied model.
    exclude (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to exclude in the copied model.
    update (Dict[str, Any] | None, default None): Optional dictionary of field-value pairs to override field values in the copied model.
    deep (bool, default False): If True, the values of fields that are Pydantic models will be deep-copied.

Returns:

    Self: A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)
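A migration sketch from the deprecated `copy` to `model_copy`, following the `include`/`exclude` recipe from the docstring (the `Item` model is illustrative):

```python
from pydantic import BaseModel

class Item(BaseModel):
    sku: str
    qty: int = 1

item = Item(sku='A1')

# old: item.copy(update={'qty': 5})
bumped = item.model_copy(update={'qty': 5})
print(bumped.qty)  # 5

# old: item.copy(exclude={'qty'}) plus an update -- re-validate instead
data = item.model_dump(exclude={'qty'}, round_trip=True)
restored = Item.model_validate({**data, 'qty': 3})
print(restored.qty)  # 3
```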

dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)
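The replacement named in the deprecation message can be sketched with a plain object standing in for an ORM row (all names below are illustrative):

```python
from pydantic import BaseModel, ConfigDict

class UserRow:  # stand-in for an ORM instance
    def __init__(self) -> None:
        self.id = 7
        self.name = 'alice'

class User(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    id: int
    name: str

# old: User.from_orm(UserRow())
user = User.model_validate(UserRow())
print(user.id, user.name)  # 7 alice
```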

json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )
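A migration sketch: `model_dump_json` covers the common `json()` call sites, while the removed `encoder` hook is replaced by field or model serializers. `Msg` is illustrative:

```python
import json

from pydantic import BaseModel

class Msg(BaseModel):
    text: str
    n: int

m = Msg(text='hi', n=2)

# old: m.json(exclude={'n'})
s = m.model_dump_json(exclude={'n'})
print(s)  # compact JSON string
print(json.loads(m.model_dump_json(indent=2))['n'])  # 2
```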

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
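This mapping is populated by fields declared with `@computed_field`; note the class-level access, since instance access is deprecated. A sketch with an illustrative model:

```python
from pydantic import BaseModel, computed_field

class Rect(BaseModel):
    w: int
    h: int

    @computed_field
    @property
    def area(self) -> int:
        return self.w * self.h

print(list(Rect.model_computed_fields))  # ['area']
# computed fields are included in dumps by default
print(Rect(w=2, h=3).model_dump())       # {'w': 2, 'h': 3, 'area': 6}
```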

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

    _fields_set (set[str] | None, default None): A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.
    values (Any, default {}): Trusted or pre-validated data dictionary.

Returns:

    Self: A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m
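The validation-bypass behavior can be sketched as follows; note the string `'42'` is stored as-is because `model_construct` skips coercion (the `User` model is illustrative):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = 'anon'

u = User.model_construct(id='42')  # trusted data: no validation, no coercion
print(u.id)                # '42' -- still a str!
print(u.name)              # 'anon' (default applied)
print(u.model_fields_set)  # {'id'}
```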

model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

    update (Mapping[str, Any] | None, default None): Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
    deep (bool, default False): Set to True to make a deep copy of the model.

Returns:

    Self: New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
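A usage sketch with an illustrative `Server` model, including the documented caveat that `update` values are not validated:

```python
from pydantic import BaseModel

class Server(BaseModel):
    host: str
    port: int = 8080

base = Server(host='localhost')
patched = base.model_copy(update={'port': 9000})
print(patched.port)  # 9000
print(base.port)     # 8080 -- the original is untouched

# caution: update values are NOT validated
bad = base.model_copy(update={'port': 'not-a-port'})
print(bad.port)      # 'not-a-port'
```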

model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

    mode (Literal['json', 'python'] | str, default 'python'): The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.
    include (IncEx | None, default None): A set of fields to include in the output.
    exclude (IncEx | None, default None): A set of fields to exclude from the output.
    context (Any | None, default None): Additional context to pass to the serializer.
    by_alias (bool | None, default None): Whether to use the field's alias in the dictionary key if defined.
    exclude_unset (bool, default False): Whether to exclude fields that have not been explicitly set.
    exclude_defaults (bool, default False): Whether to exclude fields that are set to their default value.
    exclude_none (bool, default False): Whether to exclude fields that have a value of None.
    exclude_computed_fields (bool, default False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
    round_trip (bool, default False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
    warnings (bool | Literal['none', 'warn', 'error'], default True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
    fallback (Callable[[Any], Any] | None, default None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError is raised.
    serialize_as_any (bool, default False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

    dict[str, Any]: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )
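A minimal sketch of `model_dump` in practice, using a hypothetical `User` model: the default `'python'` mode returns a plain dict of native Python objects, and `exclude_unset=True` keeps only fields that were explicitly set.

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = "anon"
    tags: list[str] = []


u = User(id=1)

# Default "python" mode: a plain dict with native Python values.
data = u.model_dump()

# Only explicitly set fields survive exclude_unset=True.
sparse = u.model_dump(exclude_unset=True)
```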

model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default

indent ¤

int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None

ensure_ascii ¤

bool

If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.

False

include ¤

IncEx | None

Field(s) to include in the JSON output.

None

exclude ¤

IncEx | None

Field(s) to exclude from the JSON output.

None

context ¤

Any | None

Additional context to pass to the serializer.

None

by_alias ¤

bool | None

Whether to serialize using field aliases.

None

exclude_unset ¤

bool

Whether to exclude fields that have not been explicitly set.

False

exclude_defaults ¤

bool

Whether to exclude fields that are set to their default value.

False

exclude_none ¤

bool

Whether to exclude fields that have a value of None.

False

exclude_computed_fields ¤

bool

Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.

False

round_trip ¤

bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False

warnings ¤

bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True

fallback ¤

Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None

serialize_as_any ¤

bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()
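A short example of `model_dump_json` with a hypothetical `Point` model: the default `indent=None` yields compact single-line JSON, while an integer `indent` pretty-prints the output.

```python
import json

from pydantic import BaseModel


class Point(BaseModel):
    x: float
    y: float


p = Point(x=1.5, y=2.0)
compact = p.model_dump_json()         # default: compact, single-line JSON
pretty = p.model_dump_json(indent=2)  # pretty-printed over several lines
```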

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
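Because instance access is deprecated, `model_fields` should be read from the class itself. A sketch with a hypothetical `Item` model:

```python
from pydantic import BaseModel, Field


class Item(BaseModel):
    name: str
    price: float = Field(default=0.0, description="unit price")


# Read from the class, not from an instance (instance access is deprecated).
fields = Item.model_fields
```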

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default

by_alias ¤

bool

Whether to use attribute aliases or not.

True

ref_template ¤

str

The reference template.

DEFAULT_REF_TEMPLATE

union_format ¤

Literal['any_of', 'primitive_type_array']

The format to use when combining schemas from unions together. Can be one of:

  • 'any_of': Use the anyOf keyword to combine schemas (the default).
  • 'primitive_type_array': Use the type keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (string, boolean, null, integer or number) or contains constraints/metadata, falls back to any_of.
'any_of'

schema_generator ¤

type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications

GenerateJsonSchema

mode ¤

JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
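A quick illustration with a hypothetical `Config` model: required fields land in the schema's `required` list, and field defaults appear under `properties`.

```python
from pydantic import BaseModel


class Config(BaseModel):
    host: str
    port: int = 8080


schema = Config.model_json_schema()
```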

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default

params ¤

tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
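The default naming scheme can be observed by parametrizing a generic model (a hypothetical `Response` class here); the generated class takes its name from `model_parametrized_name`.

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Response(BaseModel, Generic[T]):
    data: T


# The parametrized class is named via model_parametrized_name.
name = Response[int].__name__
```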

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default

force ¤

bool

Whether to force the rebuilding of the model schema, defaults to False.

False

raise_errors ¤

bool

Whether to raise errors, defaults to True.

True

_parent_namespace_depth ¤

int

The depth level of the parent namespace, defaults to 2.

2

_types_namespace ¤

MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
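A sketch with a hypothetical self-referencing `Node` model: `model_rebuild(force=True)` reconstructs the schema even when it is already complete, which is the usual recovery step when a `ForwardRef` could not be resolved at class definition time.

```python
from typing import Optional

from pydantic import BaseModel


class Node(BaseModel):
    value: int
    child: Optional["Node"] = None


# force=True rebuilds the schema even if it is already complete.
ok = Node.model_rebuild(force=True)
root = Node(value=1, child=Node(value=2))
```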

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

Name Type Description Default

obj ¤

Any

The object to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

from_attributes ¤

bool | None

Whether to extract data from object attributes.

None

context ¤

Any | None

Additional context to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
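A minimal sketch using a hypothetical `User` model: lax (default) validation coerces compatible input such as the string `"42"`, while `strict=True` rejects it.

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):
    id: int
    name: str


# Lax (default) mode coerces the string "42" to the int 42.
user = User.model_validate({"id": "42", "name": "Ada"})

# Strict mode rejects the same input.
strict_failed = False
try:
    User.model_validate({"id": "42", "name": "Ada"}, strict=True)
except ValidationError:
    strict_failed = True
```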

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default

json_data ¤

str | bytes | bytearray

The JSON data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
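A one-line usage sketch with a hypothetical `Event` model: the JSON string is parsed and validated in a single step.

```python
from pydantic import BaseModel


class Event(BaseModel):
    kind: str
    count: int


ev = Event.model_validate_json('{"kind": "click", "count": 3}')
```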

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default

obj ¤

Any

The object containing string data to validate.

required

strict ¤

bool | None

Whether to enforce types strictly.

None

extra ¤

ExtraValues | None

Whether to ignore, allow, or forbid extra data during model validation. See the [extra configuration value][pydantic.ConfigDict.extra] for details.

None

context ¤

Any | None

Extra variables to pass to the validator.

None

by_alias ¤

bool | None

Whether to use the field's alias when validating against the provided input data.

None

by_name ¤

bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
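A sketch with a hypothetical `Job` model: every leaf value is a string, as when reading environment variables or CSV cells, and string-mode validation coerces each one to its field type.

```python
import datetime

from pydantic import BaseModel


class Job(BaseModel):
    retries: int
    when: datetime.date


# All leaf values are strings; string-mode validation coerces them.
job = Job.model_validate_strings({"retries": "3", "when": "2024-01-02"})
```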

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)
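A minimal sketch of the `model_rebuild` replacement, assuming pydantic v2 is installed (the self-referential `Node` model is hypothetical):

```python
from typing import Optional
from pydantic import BaseModel

class Node(BaseModel):
    value: int
    next: Optional["Node"] = None  # forward reference to the class itself

# In pydantic v2, model_rebuild() replaces the deprecated update_forward_refs()
Node.model_rebuild()

n = Node(value=1, next=Node(value=2))
assert n.next is not None and n.next.value == 2
```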

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)
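A minimal sketch of the `model_validate` replacement, assuming pydantic v2 is installed (the `User` model is hypothetical):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    name: str

# model_validate replaces the deprecated `validate` classmethod
u = User.model_validate({"id": "7", "name": "Ada"})  # "7" is coerced to int
assert u.id == 7

# Invalid input raises a ValidationError locating the offending field
try:
    User.model_validate({"id": "not a number", "name": "Ada"})
except ValidationError as e:
    assert any(err["loc"] == ("id",) for err in e.errors())
```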

WithSummaryLogging ¤

WithSummaryLogging(**data: Any)

Bases: ArgMixin


              flowchart TD
              bioimageio.core.cli.WithSummaryLogging[WithSummaryLogging]
              bioimageio.core.cli.ArgMixin[ArgMixin]
              pydantic.main.BaseModel[BaseModel]

                              bioimageio.core.cli.ArgMixin --> bioimageio.core.cli.WithSummaryLogging
                                pydantic.main.BaseModel --> bioimageio.core.cli.ArgMixin
                



            

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Methods:

Name Description
__class_getitem__
__copy__

Returns a shallow copy of the model.

__deepcopy__

Returns a deep copy of the model.

__delattr__
__eq__
__get_pydantic_core_schema__
__get_pydantic_json_schema__

Hook into generating the model's JSON schema.

__getattr__
__getstate__
__init_subclass__

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.
__iter__

So dict(model) works.

__pydantic_init_subclass__

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete.
__pydantic_on_complete__

This is called once the class and its fields are fully initialized and ready to be used.

__replace__
__repr__
__repr_args__
__setattr__
__setstate__
__str__
construct
copy

Returns a copy of the model.

dict
from_orm
json
log
model_computed_fields

A mapping of computed field names to their respective ComputedFieldInfo instances.

model_construct

Creates a new instance of the Model class with validated data.

model_copy

Usage Documentation

model_dump

Usage Documentation

model_dump_json

Usage Documentation

model_fields

A mapping of field names to their respective FieldInfo instances.

model_json_schema

Generates a JSON schema for a model class.

model_parametrized_name

Compute the class name for parametrizations of generic classes.

model_post_init

Override this method to perform additional initialization after __init__ and model_construct.

model_rebuild

Try to rebuild the pydantic-core schema for the model.

model_validate

Validate a pydantic model instance.

model_validate_json

Usage Documentation

model_validate_strings

Validate the given object with string data against the Pydantic model.

parse_file
parse_obj
parse_raw
schema
schema_json
update_forward_refs
validate

Attributes:

Name Type Description
__class_vars__ set[str]

The names of the class variables defined on the model.

__fields__ dict[str, FieldInfo]
__fields_set__ set[str]
__pretty__
__private_attributes__ Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ bool

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ CoreSchema

The core schema of the model.

__pydantic_custom_init__ bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ _decorators.DecoratorInfos

Metadata containing the decorators defined on the model.

__pydantic_extra__ Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects.

__pydantic_fields_set__ set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics.
__pydantic_parent_namespace__ Dict[str, Any] | None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ bool

Whether the model is a RootModel.

__pydantic_serializer__ SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__
__repr_recursion__
__repr_str__
__rich_repr__
__signature__ Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__
model_config ConfigDict

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra dict[str, Any] | None

Get extra fields set during validation.

model_fields_set set[str]

Returns the set of fields that have been explicitly set on this model instance.

summary List[Union[Literal['display'], Path]]

Display the validation summary or save it as JSON, Markdown or HTML.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )

__class_vars__ class-attribute ¤

__class_vars__: set[str]

The names of the class variables defined on the model.

__fields__ property ¤

__fields__: dict[str, FieldInfo]

__fields_set__ property ¤

__fields_set__: set[str]

__pretty__ pydantic-field ¤

__pretty__ = _repr.Representation.__pretty__

__private_attributes__ class-attribute ¤

__private_attributes__: Dict[str, ModelPrivateAttr]

Metadata about the private attributes of the model.

__pydantic_complete__ class-attribute ¤

__pydantic_complete__: bool = False

Whether model building is completed, or if there are still undefined fields.

__pydantic_computed_fields__ class-attribute ¤

__pydantic_computed_fields__: Dict[str, ComputedFieldInfo]

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

__pydantic_core_schema__ class-attribute ¤

__pydantic_core_schema__: CoreSchema

The core schema of the model.

__pydantic_custom_init__ class-attribute ¤

__pydantic_custom_init__: bool

Whether the model has a custom __init__ method.

__pydantic_decorators__ class-attribute ¤

__pydantic_decorators__: _decorators.DecoratorInfos = _decorators.DecoratorInfos()

Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.

__pydantic_extra__ pydantic-field ¤

__pydantic_extra__: Dict[str, Any] | None

A dictionary containing extra values, if extra is set to 'allow'.

__pydantic_fields__ class-attribute ¤

__pydantic_fields__: Dict[str, FieldInfo]

A dictionary of field names and their corresponding FieldInfo objects. This replaces Model.__fields__ from Pydantic V1.

__pydantic_fields_set__ pydantic-field ¤

__pydantic_fields_set__: set[str]

The names of fields explicitly set during instantiation.

__pydantic_generic_metadata__ class-attribute ¤

__pydantic_generic_metadata__: _generics.PydanticGenericMetadata

Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these.

__pydantic_parent_namespace__ class-attribute ¤

__pydantic_parent_namespace__: Dict[str, Any] | None = None

Parent namespace of the model, used for automatic rebuilding of models.

__pydantic_post_init__ class-attribute ¤

__pydantic_post_init__: None | Literal['model_post_init']

The name of the post-init method for the model, if defined.

__pydantic_private__ pydantic-field ¤

__pydantic_private__: Dict[str, Any] | None

Values of private attributes set on the model instance.

__pydantic_root_model__ class-attribute ¤

__pydantic_root_model__: bool = False

Whether the model is a RootModel.

__pydantic_serializer__ class-attribute ¤

__pydantic_serializer__: SchemaSerializer

The pydantic-core SchemaSerializer used to dump instances of the model.

__pydantic_setattr_handlers__ class-attribute ¤

__pydantic_setattr_handlers__: Dict[str, Callable[[BaseModel, str, Any], None]]

__setattr__ handlers. Memoizing the handlers leads to a dramatic performance improvement in __setattr__

__pydantic_validator__ class-attribute ¤

__pydantic_validator__: SchemaValidator | PluggableSchemaValidator

The pydantic-core SchemaValidator used to validate instances of the model.

__repr_name__ pydantic-field ¤

__repr_name__ = _repr.Representation.__repr_name__

__repr_recursion__ pydantic-field ¤

__repr_recursion__ = _repr.Representation.__repr_recursion__

__repr_str__ pydantic-field ¤

__repr_str__ = _repr.Representation.__repr_str__

__rich_repr__ pydantic-field ¤

__rich_repr__ = _repr.Representation.__rich_repr__

__signature__ class-attribute ¤

__signature__: Signature

The synthesized __init__ [Signature][inspect.Signature] of the model.

__slots__ pydantic-field ¤

__slots__ = ('__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__')

model_config class-attribute ¤

model_config: ConfigDict = ConfigDict()

Configuration for the model, should be a dictionary conforming to ConfigDict.

model_extra property ¤

model_extra: dict[str, Any] | None

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
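A minimal sketch of when model_extra is populated, assuming pydantic v2 is installed (the models are hypothetical):

```python
from pydantic import BaseModel, ConfigDict

class Flexible(BaseModel):
    model_config = ConfigDict(extra="allow")
    name: str

class Strict(BaseModel):
    name: str  # default extra behavior is "ignore"

f = Flexible(name="a", color="red")
assert f.model_extra == {"color": "red"}  # extras captured under extra="allow"

s = Strict(name="a")
assert s.model_extra is None  # None because extra is not "allow"
```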

model_fields_set property ¤

model_fields_set: set[str]

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.

summary class-attribute instance-attribute ¤

summary: List[Union[Literal['display'], Path]] = Field(default_factory=lambda: ['display'], examples=[Path('summary.md'), Path('bioimageio_summaries/'), ['display', Path('summary.md')]])

Display the validation summary or save it as JSON, Markdown or HTML. The format is chosen based on the suffix: .json, .md, .html. If a folder is given (path w/o suffix) the summary is saved in all formats. Choose/add "display" to render the validation summary to the terminal.

__class_getitem__ ¤

__class_getitem__(typevar_values: type[Any] | tuple[type[Any], ...]) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef
Source code in pydantic/main.py
def __class_getitem__(
    cls, typevar_values: type[Any] | tuple[type[Any], ...]
) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
    cached = _generics.get_cached_generic_type_early(cls, typevar_values)
    if cached is not None:
        return cached

    if cls is BaseModel:
        raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
    if not hasattr(cls, '__parameters__'):
        raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
    if not cls.__pydantic_generic_metadata__['parameters'] and Generic not in cls.__bases__:
        raise TypeError(f'{cls} is not a generic class')

    if not isinstance(typevar_values, tuple):
        typevar_values = (typevar_values,)

    # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`,
    # this gives us `{T: str, U: bool, V: int}`:
    typevars_map = _generics.map_generic_model_arguments(cls, typevar_values)
    # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`):
    typevar_values = tuple(v for v in typevars_map.values())

    if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
        submodel = cls  # if arguments are equal to parameters it's the same object
        _generics.set_cached_generic_type(cls, typevar_values, submodel)
    else:
        parent_args = cls.__pydantic_generic_metadata__['args']
        if not parent_args:
            args = typevar_values
        else:
            args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

        origin = cls.__pydantic_generic_metadata__['origin'] or cls
        model_name = origin.model_parametrized_name(args)
        params = tuple(
            dict.fromkeys(_generics.iter_contained_typevars(typevars_map.values()))
        )  # use dict as ordered set

        with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
            cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
            if cached is not None:
                return cached

            if maybe_self_type is not None:
                return maybe_self_type

            # Attempt to rebuild the origin in case new types have been defined
            try:
                # depth 2 gets you above this __class_getitem__ call.
                # Note that we explicitly provide the parent ns, otherwise
                # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                # We don't want this here, as this has unexpected effects when a model
                # is being parametrized during a forward annotation evaluation.
                parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                origin.model_rebuild(_types_namespace=parent_ns)
            except PydanticUndefinedAnnotation:
                # It's okay if it fails, it just means there are still undefined types
                # that could be evaluated later.
                pass

            submodel = _generics.create_generic_submodel(model_name, origin, args, params)

            _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

    return submodel

__copy__ ¤

__copy__() -> Self

Returns a shallow copy of the model.

Source code in pydantic/main.py
def __copy__(self) -> Self:
    """Returns a shallow copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', copy(self.__dict__))
    _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
        )

    return m

__deepcopy__ ¤

__deepcopy__(memo: dict[int, Any] | None = None) -> Self

Returns a deep copy of the model.

Source code in pydantic/main.py
def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
    """Returns a deep copy of the model."""
    cls = type(self)
    m = cls.__new__(cls)
    _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
    _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
    # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
    # and attempting a deepcopy would be marginally slower.
    _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

    if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
        _object_setattr(m, '__pydantic_private__', None)
    else:
        _object_setattr(
            m,
            '__pydantic_private__',
            deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
        )

    return m
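A minimal sketch contrasting the shallow and deep copy hooks above, assuming pydantic v2 is installed (the `Bag` model is hypothetical):

```python
from copy import copy, deepcopy
from pydantic import BaseModel

class Bag(BaseModel):
    items: list[int]

b = Bag(items=[1, 2])
shallow = copy(b)    # __copy__: copies __dict__ shallowly, so the list is shared
deep = deepcopy(b)   # __deepcopy__: recursively copies field values

shallow.items.append(3)  # shared list: mutation is visible on the original
assert b.items == [1, 2, 3]

deep.items.append(4)     # independent list: the original is untouched
assert b.items == [1, 2, 3]
assert deep.items == [1, 2, 4]
```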

__delattr__ ¤

__delattr__(item: str) -> Any
Source code in pydantic/main.py
def __delattr__(self, item: str) -> Any:
    cls = self.__class__

    if item in self.__private_attributes__:
        attribute = self.__private_attributes__[item]
        if hasattr(attribute, '__delete__'):
            attribute.__delete__(self)  # type: ignore
            return

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            del self.__pydantic_private__[item]  # type: ignore
            return
        except KeyError as exc:
            raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc

    # Allow cached properties to be deleted (even if the class is frozen):
    attr = getattr(cls, item, None)
    if isinstance(attr, cached_property):
        return object.__delattr__(self, item)

    _check_frozen(cls, name=item, value=None)

    if item in self.__pydantic_fields__:
        object.__delattr__(self, item)
    elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
        del self.__pydantic_extra__[item]
    else:
        try:
            object.__delattr__(self, item)
        except AttributeError:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__eq__ ¤

__eq__(other: Any) -> bool
Source code in pydantic/main.py
def __eq__(self, other: Any) -> bool:
    if isinstance(other, BaseModel):
        # When comparing instances of generic types for equality, as long as all field values are equal,
        # only require their generic origin types to be equal, rather than exact type equality.
        # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
        self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
        other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

        # Perform common checks first
        if not (
            self_type == other_type
            and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
            and self.__pydantic_extra__ == other.__pydantic_extra__
        ):
            return False

        # We only want to compare pydantic fields but ignoring fields is costly.
        # We'll perform a fast check first, and fallback only when needed
        # See GH-7444 and GH-7825 for rationale and a performance benchmark

        # First, do the fast (and sometimes faulty) __dict__ comparison
        if self.__dict__ == other.__dict__:
            # If the check above passes, then pydantic fields are equal, we can return early
            return True

        # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
        # early if there are no keys to ignore (we would just return False later on anyway)
        model_fields = type(self).__pydantic_fields__.keys()
        if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
            return False

        # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
        # Resort to costly filtering of the __dict__ objects
        # We use operator.itemgetter because it is much faster than dict comprehensions
        # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
        # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute
        # raises an error in BaseModel.__getattr__ instead of returning the class attribute
        # So we can use operator.itemgetter() instead of operator.attrgetter()
        getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
        try:
            return getter(self.__dict__) == getter(other.__dict__)
        except KeyError:
            # In rare cases (such as when using the deprecated BaseModel.copy() method),
            # the __dict__ may not contain all model fields, which is how we can get here.
            # getter(self.__dict__) is much faster than any 'safe' method that accounts
            # for missing keys, and wrapping it in a `try` doesn't slow things down much
            # in the common case.
            self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
            other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
            return getter(self_fields_proxy) == getter(other_fields_proxy)

    # other instance is not a BaseModel
    else:
        return NotImplemented  # delegate to the other item in the comparison

__get_pydantic_core_schema__ classmethod ¤

__get_pydantic_core_schema__(source: type[BaseModel], handler: GetCoreSchemaHandler) -> CoreSchema
Source code in pydantic/main.py
@classmethod
def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
    # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass.
    # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to
    # *not* be called if not overridden.
    warnings.warn(
        'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling '
        '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using '
        '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected '
        'side effects.',
        PydanticDeprecatedSince211,
        stacklevel=2,
    )
    # Logic copied over from `GenerateSchema._model_schema`:
    schema = cls.__dict__.get('__pydantic_core_schema__')
    if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
        return cls.__pydantic_core_schema__

    return handler(source)

__get_pydantic_json_schema__ classmethod ¤

__get_pydantic_json_schema__(core_schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue

Hook into generating the model's JSON schema.

Parameters:

Name Type Description Default

core_schema ¤

CoreSchema

A pydantic-core CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema ({'type': 'nullable', 'schema': current_schema}), or just call the handler with the original schema.

required

handler ¤

GetJsonSchemaHandler

Call into Pydantic's internal JSON schema generation. This will raise a pydantic.errors.PydanticInvalidForJsonSchema if JSON schema generation fails. Since this gets called by BaseModel.model_json_schema you can override the schema_generator argument to that function to change JSON schema generation globally for a type.

required

Returns:

Type Description
JsonSchemaValue

A JSON schema, as a Python object.

Source code in pydantic/main.py
@classmethod
def __get_pydantic_json_schema__(
    cls,
    core_schema: CoreSchema,
    handler: GetJsonSchemaHandler,
    /,
) -> JsonSchemaValue:
    """Hook into generating the model's JSON schema.

    Args:
        core_schema: A `pydantic-core` CoreSchema.
            You can ignore this argument and call the handler with a new CoreSchema,
            wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
            or just call the handler with the original schema.
        handler: Call into Pydantic's internal JSON schema generation.
            This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
            generation fails.
            Since this gets called by `BaseModel.model_json_schema` you can override the
            `schema_generator` argument to that function to change JSON schema generation globally
            for a type.

    Returns:
        A JSON schema, as a Python object.
    """
    return handler(core_schema)
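A minimal sketch of overriding this hook to decorate the generated schema, assuming pydantic v2 is installed (the `Tagged` model and the added `examples` key are hypothetical):

```python
from pydantic import BaseModel, GetJsonSchemaHandler
from pydantic.json_schema import JsonSchemaValue
from pydantic_core import CoreSchema

class Tagged(BaseModel):
    x: int

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        schema = handler(core_schema)            # generate the default schema first
        schema = handler.resolve_ref_schema(schema)  # follow a $ref if one was returned
        schema["examples"] = [{"x": 1}]          # then decorate it
        return schema

assert Tagged.model_json_schema().get("examples") == [{"x": 1}]
```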

__getattr__ ¤

__getattr__(item: str) -> Any
Source code in pydantic/main.py
def __getattr__(self, item: str) -> Any:
    private_attributes = object.__getattribute__(self, '__private_attributes__')
    if item in private_attributes:
        attribute = private_attributes[item]
        if hasattr(attribute, '__get__'):
            return attribute.__get__(self, type(self))  # type: ignore

        try:
            # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
            return self.__pydantic_private__[item]  # type: ignore
        except KeyError as exc:
            raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
    else:
        # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
        # See `BaseModel.__repr_args__` for more details
        try:
            pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
        except AttributeError:
            pydantic_extra = None

        if pydantic_extra and item in pydantic_extra:
            return pydantic_extra[item]
        else:
            if hasattr(self.__class__, item):
                return super().__getattribute__(item)  # Raises AttributeError if appropriate
            else:
                # this is the current error
                raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

__getstate__ ¤

__getstate__() -> dict[Any, Any]
Source code in pydantic/main.py
def __getstate__(self) -> dict[Any, Any]:
    private = self.__pydantic_private__
    if private:
        private = {k: v for k, v in private.items() if v is not PydanticUndefined}
    return {
        '__dict__': self.__dict__,
        '__pydantic_extra__': self.__pydantic_extra__,
        '__pydantic_fields_set__': self.__pydantic_fields_set__,
        '__pydantic_private__': private,
    }

__init_subclass__ ¤

__init_subclass__(**kwargs: Unpack[ConfigDict])

This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs.

from pydantic import BaseModel

class MyModel(BaseModel, extra='allow'): ...

However, this may be deceiving, since the actual calls to __init_subclass__ will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are not expected keys in ConfigDict. (This is due to the way ModelMetaclass.__new__ works.)

Parameters:

Name Type Description Default

**kwargs ¤

Unpack[ConfigDict]

Keyword arguments passed to the class definition, which set model_config

{}
Note

You may want to override __pydantic_init_subclass__ instead, which behaves similarly but is called after the class is fully initialized.

Source code in pydantic/main.py
def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
    """This signature is included purely to help type-checkers check arguments to class declaration, which
    provides a way to conveniently set model_config key/value pairs.

    ```python
    from pydantic import BaseModel

    class MyModel(BaseModel, extra='allow'): ...
    ```

    However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
    of the config arguments, and will only receive any keyword arguments passed during class initialization
    that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

    Args:
        **kwargs: Keyword arguments passed to the class definition, which set model_config

    Note:
        You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
        *after* the class is fully initialized.
    """

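As an illustration of the class-keyword form shown above, config keys can be set directly in the class declaration instead of assigning model_config. A minimal sketch (the Flexible model and its fields are hypothetical):

```python
from pydantic import BaseModel

class Flexible(BaseModel, extra="allow"):
    name: str

# extra='allow' stores unknown fields instead of ignoring them
f = Flexible(name="x", unexpected=1)
assert f.unexpected == 1
```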
__iter__ ¤

__iter__() -> TupleGenerator

So dict(model) works.

Source code in pydantic/main.py
def __iter__(self) -> TupleGenerator:
    """So `dict(model)` works."""
    yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')]
    extra = self.__pydantic_extra__
    if extra:
        yield from extra.items()

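Because __iter__ yields (name, value) pairs, a model converts directly with dict(). A minimal sketch (the Point model is hypothetical):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

p = Point(x=1, y=2)
# dict() consumes the (name, value) pairs yielded by __iter__
assert dict(p) == {"x": 1, "y": 2}
```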
__pydantic_init_subclass__ classmethod ¤

__pydantic_init_subclass__(**kwargs: Any) -> None

This is intended to behave just like __init_subclass__, but is called by ModelMetaclass only after basic class initialization is complete. In particular, attributes like model_fields will be present when this is called, but forward annotations are not guaranteed to be resolved yet, meaning that creating an instance of the class may fail.

This is necessary because __init_subclass__ will always be called by type.__new__, and it would require a prohibitively large refactor to the ModelMetaclass to ensure that type.__new__ was called in such a manner that the class would already be sufficiently initialized.

This will receive the same kwargs that would be passed to the standard __init_subclass__, namely, any kwargs passed to the class definition that aren't used internally by Pydantic.

Parameters:

- **kwargs (Any, default {}): Any keyword arguments passed to the class definition that aren't used internally by Pydantic.

Note

You may want to override __pydantic_on_complete__() instead, which is called once the class and its fields are fully initialized and ready for validation.

Source code in pydantic/main.py
@classmethod
def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
    """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
    only after basic class initialization is complete. In particular, attributes like `model_fields` will
    be present when this is called, but forward annotations are not guaranteed to be resolved yet,
    meaning that creating an instance of the class may fail.

    This is necessary because `__init_subclass__` will always be called by `type.__new__`,
    and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
    `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

    This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
    any kwargs passed to the class definition that aren't used internally by Pydantic.

    Args:
        **kwargs: Any keyword arguments passed to the class definition that aren't used internally
            by Pydantic.

    Note:
        You may want to override [`__pydantic_on_complete__()`][pydantic.main.BaseModel.__pydantic_on_complete__]
        instead, which is called once the class and its fields are fully initialized and ready for validation.
    """

__pydantic_on_complete__ classmethod ¤

__pydantic_on_complete__() -> None

This is called once the class and its fields are fully initialized and ready to be used.

This typically happens when the class is created (just before __pydantic_init_subclass__() is called on the superclass), except when forward annotations are used that could not immediately be resolved. In that case, it will be called later, when the model is rebuilt automatically or explicitly using model_rebuild().

Source code in pydantic/main.py
@classmethod
def __pydantic_on_complete__(cls) -> None:
    """This is called once the class and its fields are fully initialized and ready to be used.

    This typically happens when the class is created (just before
    [`__pydantic_init_subclass__()`][pydantic.main.BaseModel.__pydantic_init_subclass__] is called on the superclass),
    except when forward annotations are used that could not immediately be resolved.
    In that case, it will be called later, when the model is rebuilt automatically or explicitly using
    [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild].
    """

__replace__ ¤

__replace__(**changes: Any) -> Self
Source code in pydantic/main.py
def __replace__(self, **changes: Any) -> Self:
    return self.model_copy(update=changes)

__repr__ ¤

__repr__() -> str
Source code in pydantic/main.py
def __repr__(self) -> str:
    return f'{self.__repr_name__()}({self.__repr_str__(", ")})'

__repr_args__ ¤

__repr_args__() -> _repr.ReprArgs
Source code in pydantic/main.py
def __repr_args__(self) -> _repr.ReprArgs:
    # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such
    # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__`
    # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration.
    computed_fields_repr_args = [
        (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr
    ]

    for k, v in self.__dict__.items():
        field = self.__pydantic_fields__.get(k)
        if field and field.repr:
            if v is not self:
                yield k, v
            else:
                yield k, self.__repr_recursion__(v)
    # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
    # This can happen if a `ValidationError` is raised during initialization and the instance's
    # repr is generated as part of the exception handling. Therefore, we use `getattr` here
    # with a fallback, even though the type hints indicate the attribute will always be present.
    try:
        pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
    except AttributeError:
        pydantic_extra = None

    if pydantic_extra is not None:
        yield from ((k, v) for k, v in pydantic_extra.items())
    yield from computed_fields_repr_args

__setattr__ ¤

__setattr__(name: str, value: Any) -> None
Source code in pydantic/main.py
def __setattr__(self, name: str, value: Any) -> None:
    if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None:
        setattr_handler(self, name, value)
    # if None is returned from _setattr_handler, the attribute was set directly
    elif (setattr_handler := self._setattr_handler(name, value)) is not None:
        setattr_handler(self, name, value)  # call here to not memo on possibly unknown fields
        self.__pydantic_setattr_handlers__[name] = setattr_handler  # memoize the handler for faster access

__setstate__ ¤

__setstate__(state: dict[Any, Any]) -> None
Source code in pydantic/main.py
def __setstate__(self, state: dict[Any, Any]) -> None:
    _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
    _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
    _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
    _object_setattr(self, '__dict__', state.get('__dict__', {}))

__str__ ¤

__str__() -> str
Source code in pydantic/main.py
def __str__(self) -> str:
    return self.__repr_str__(' ')

construct classmethod ¤

construct(_fields_set: set[str] | None = None, **values: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `construct` method is deprecated; use `model_construct` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_construct(_fields_set=_fields_set, **values)

copy ¤

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, deep: bool = False) -> Self

Returns a copy of the model.

Deprecated

This method is now deprecated; use model_copy instead.

If you need include or exclude, use:

data = self.model_dump(include=include, exclude=exclude, round_trip=True)
data = {**data, **(update or {})}
copied = self.model_validate(data)

Parameters:

- include (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to include in the copied model.
- exclude (AbstractSetIntStr | MappingIntStrAny | None, default None): Optional set or mapping specifying which fields to exclude in the copied model.
- update (Dict[str, Any] | None, default None): Optional dictionary of field-value pairs to override field values in the copied model.
- deep (bool, default False): If True, the values of fields that are Pydantic models will be deep-copied.

Returns:

- Self: A copy of the model with included, excluded and updated fields as specified.

Source code in pydantic/main.py
@typing_extensions.deprecated(
    'The `copy` method is deprecated; use `model_copy` instead. '
    'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
    category=None,
)
def copy(
    self,
    *,
    include: AbstractSetIntStr | MappingIntStrAny | None = None,
    exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
    update: Dict[str, Any] | None = None,  # noqa UP006
    deep: bool = False,
) -> Self:  # pragma: no cover
    """Returns a copy of the model.

    !!! warning "Deprecated"
        This method is now deprecated; use `model_copy` instead.

    If you need `include` or `exclude`, use:

    ```python {test="skip" lint="skip"}
    data = self.model_dump(include=include, exclude=exclude, round_trip=True)
    data = {**data, **(update or {})}
    copied = self.model_validate(data)
    ```

    Args:
        include: Optional set or mapping specifying which fields to include in the copied model.
        exclude: Optional set or mapping specifying which fields to exclude in the copied model.
        update: Optional dictionary of field-value pairs to override field values in the copied model.
        deep: If True, the values of fields that are Pydantic models will be deep-copied.

    Returns:
        A copy of the model with included, excluded and updated fields as specified.
    """
    warnings.warn(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import copy_internals

    values = dict(
        copy_internals._iter(
            self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
        ),
        **(update or {}),
    )
    if self.__pydantic_private__ is None:
        private = None
    else:
        private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

    if self.__pydantic_extra__ is None:
        extra: dict[str, Any] | None = None
    else:
        extra = self.__pydantic_extra__.copy()
        for k in list(self.__pydantic_extra__):
            if k not in values:  # k was in the exclude
                extra.pop(k)
        for k in list(values):
            if k in self.__pydantic_extra__:  # k must have come from extra
                extra[k] = values.pop(k)

    # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
    if update:
        fields_set = self.__pydantic_fields_set__ | update.keys()
    else:
        fields_set = set(self.__pydantic_fields_set__)

    # removing excluded fields from `__pydantic_fields_set__`
    if exclude:
        fields_set -= set(exclude)

    return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

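The dump-then-revalidate recipe quoted in the docstring above can be exercised like this. A minimal sketch (the User model and its fields are hypothetical):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

u = User(id=1, name="ada", email="ada@example.org")
# Replacement for the deprecated copy(update=...): dump, merge, revalidate.
data = u.model_dump(round_trip=True)
data = {**data, "name": "grace"}
copied = User.model_validate(data)
assert copied.name == "grace" and copied.email == "ada@example.org"
```

Unlike the deprecated copy, this path runs full validation on the merged data, so an invalid update is rejected instead of silently stored.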
dict ¤

dict(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) -> Dict[str, Any]
Source code in pydantic/main.py
@typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
def dict(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `dict` method is deprecated; use `model_dump` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return self.model_dump(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

from_orm classmethod ¤

from_orm(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `from_orm` method is deprecated; set '
    "`model_config['from_attributes']=True` and use `model_validate` instead.",
    category=None,
)
def from_orm(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if not cls.model_config.get('from_attributes', None):
        raise PydanticUserError(
            'You must set the config attribute `from_attributes=True` to use from_orm', code=None
        )
    return cls.model_validate(obj)

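The replacement for the deprecated from_orm is model_validate with from_attributes enabled in the model config, which reads fields off arbitrary attribute-bearing objects. A minimal sketch (UserOut and the Row stand-in are hypothetical):

```python
from pydantic import BaseModel, ConfigDict

class UserOut(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    id: int
    name: str

class Row:  # stand-in for e.g. an ORM row object
    def __init__(self):
        self.id = 7
        self.name = "ada"

# Reads attributes instead of requiring a dict
u = UserOut.model_validate(Row())
assert (u.id, u.name) == (7, "ada")
```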
json ¤

json(*, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, models_as_dict: bool = PydanticUndefined, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
def json(  # noqa: D102
    self,
    *,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
    models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
    **dumps_kwargs: Any,
) -> str:
    warnings.warn(
        'The `json` method is deprecated; use `model_dump_json` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if encoder is not PydanticUndefined:
        raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
    if models_as_dict is not PydanticUndefined:
        raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
    if dumps_kwargs:
        raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
    return self.model_dump_json(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
    )

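Migrating off the deprecated json method is a direct substitution with model_dump_json. A minimal sketch (the Item model is hypothetical):

```python
import json

from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float

i = Item(name="pen", price=1.5)
# Deprecated: i.json() -- use model_dump_json instead
s = i.model_dump_json()
assert json.loads(s) == {"name": "pen", "price": 1.5}
```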
log ¤

log(descr: Union[ResourceDescr, InvalidDescr])
Source code in src/bioimageio/core/cli.py
def log(self, descr: Union[ResourceDescr, InvalidDescr]):
    _ = descr.validation_summary.log(self.summary)

model_computed_fields classmethod ¤

model_computed_fields() -> dict[str, ComputedFieldInfo]

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})

model_construct classmethod ¤

model_construct(_fields_set: set[str] | None = None, **values: Any) -> Self

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

- _fields_set (set[str] | None, default None): A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.
- values (Any, default {}): Trusted or pre-validated data dictionary.

Returns:

- Self: A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod
def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: C901
    """Creates a new instance of the `Model` class with validated data.

    Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
    Default values are respected, but no other validation is performed.

    !!! note
        `model_construct()` generally respects the `model_config.extra` setting on the provided model.
        That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`
        and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.
        Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in
        an error if extra values are passed, but they will be ignored.

    Args:
        _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,
            this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.
            Otherwise, the field names from the `values` argument will be used.
        values: Trusted or pre-validated data dictionary.

    Returns:
        A new instance of the `Model` class with validated data.
    """
    m = cls.__new__(cls)
    fields_values: dict[str, Any] = {}
    fields_set = set()

    for name, field in cls.__pydantic_fields__.items():
        if field.alias is not None and field.alias in values:
            fields_values[name] = values.pop(field.alias)
            fields_set.add(name)

        if (name not in fields_set) and (field.validation_alias is not None):
            validation_aliases: list[str | AliasPath] = (
                field.validation_alias.choices
                if isinstance(field.validation_alias, AliasChoices)
                else [field.validation_alias]
            )

            for alias in validation_aliases:
                if isinstance(alias, str) and alias in values:
                    fields_values[name] = values.pop(alias)
                    fields_set.add(name)
                    break
                elif isinstance(alias, AliasPath):
                    value = alias.search_dict_for_path(values)
                    if value is not PydanticUndefined:
                        fields_values[name] = value
                        fields_set.add(name)
                        break

        if name not in fields_set:
            if name in values:
                fields_values[name] = values.pop(name)
                fields_set.add(name)
            elif not field.is_required():
                fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values)
    if _fields_set is None:
        _fields_set = fields_set

    _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None
    _object_setattr(m, '__dict__', fields_values)
    _object_setattr(m, '__pydantic_fields_set__', _fields_set)
    if not cls.__pydantic_root_model__:
        _object_setattr(m, '__pydantic_extra__', _extra)

    if cls.__pydantic_post_init__:
        m.model_post_init(None)
        # update private attributes with values set
        if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None:
            for k, v in values.items():
                if k in m.__private_attributes__:
                    m.__pydantic_private__[k] = v

    elif not cls.__pydantic_root_model__:
        # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist
        # Since it doesn't, that means that `__pydantic_private__` should be set to None
        _object_setattr(m, '__pydantic_private__', None)

    return m

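The trade-off described above can be seen directly: model_construct applies defaults but skips validation entirely, so it should only receive trusted data. A minimal sketch (the User model is hypothetical):

```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str = "anon"

# Defaults are respected, no validation is run
u = User.model_construct(id=1)
assert u.id == 1 and u.name == "anon"

# No validation means a wrong type slips through silently
u2 = User.model_construct(id="not-an-int")
assert u2.id == "not-an-int"
```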
model_copy ¤

model_copy(*, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's [__dict__][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]).

Parameters:

- update (Mapping[str, Any] | None, default None): Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.
- deep (bool, default False): Set to True to make a deep copy of the model.

Returns:

- Self: New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/models.md#model-copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied

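A typical model_copy call copies the instance and overrides selected fields via update, leaving the original untouched. A minimal sketch (the Config model is hypothetical):

```python
from pydantic import BaseModel

class Config(BaseModel):
    host: str
    port: int = 8080

c = Config(host="localhost")
# update overrides fields on the copy; note the values are not validated
c2 = c.model_copy(update={"port": 9090})
assert (c2.host, c2.port) == ("localhost", 9090)
assert c.port == 8080  # original unchanged
```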
model_dump ¤

model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> dict[str, Any]

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

- mode (Literal['json', 'python'] | str, default 'python'): The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.
- include (IncEx | None, default None): A set of fields to include in the output.
- exclude (IncEx | None, default None): A set of fields to exclude from the output.
- context (Any | None, default None): Additional context to pass to the serializer.
- by_alias (bool | None, default None): Whether to use the field's alias in the dictionary key if defined.
- exclude_unset (bool, default False): Whether to exclude fields that have not been explicitly set.
- exclude_defaults (bool, default False): Whether to exclude fields that are set to their default value.
- exclude_none (bool, default False): Whether to exclude fields that have a value of None.
- exclude_computed_fields (bool, default False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
- round_trip (bool, default False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
- warnings (bool | Literal['none', 'warn', 'error'], default True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
- fallback (Callable[[Any], Any] | None, default None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.
- serialize_as_any (bool, default False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- dict[str, Any]: A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump(
    self,
    *,
    mode: Literal['json', 'python'] | str = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> dict[str, Any]:
    """!!! abstract "Usage Documentation"
        [`model_dump`](../concepts/serialization.md#python-mode)

    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

    Args:
        mode: The mode in which `to_python` should run.
            If mode is 'json', the output will only contain JSON serializable types.
            If mode is 'python', the output may contain non-JSON-serializable Python objects.
        include: A set of fields to include in the output.
        exclude: A set of fields to exclude from the output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to use the field's alias in the dictionary key if defined.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A dictionary representation of the model.
    """
    return self.__pydantic_serializer__.to_python(
        self,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        context=context,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    )

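The mode parameter documented above controls whether native Python objects survive the dump or are coerced to JSON-compatible types. A minimal sketch (the Event model is hypothetical):

```python
from datetime import date

from pydantic import BaseModel

class Event(BaseModel):
    name: str
    when: date

e = Event(name="release", when=date(2024, 1, 15))
# 'python' mode keeps native objects; 'json' mode coerces to JSON types
assert e.model_dump()["when"] == date(2024, 1, 15)
assert e.model_dump(mode="json")["when"] == "2024-01-15"
```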
model_dump_json ¤

model_dump_json(*, indent: int | None = None, ensure_ascii: bool = False, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, exclude_computed_fields: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False) -> str

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

- indent (int | None, default None): Indentation to use in the JSON output. If None is passed, the output will be compact.
- ensure_ascii (bool, default False): If True, the output is guaranteed to have all incoming non-ASCII characters escaped. If False (the default), these characters will be output as-is.
- include (IncEx | None, default None): Field(s) to include in the JSON output.
- exclude (IncEx | None, default None): Field(s) to exclude from the JSON output.
- context (Any | None, default None): Additional context to pass to the serializer.
- by_alias (bool | None, default None): Whether to serialize using field aliases.
- exclude_unset (bool, default False): Whether to exclude fields that have not been explicitly set.
- exclude_defaults (bool, default False): Whether to exclude fields that are set to their default value.
- exclude_none (bool, default False): Whether to exclude fields that have a value of None.
- exclude_computed_fields (bool, default False): Whether to exclude computed fields. While this can be useful for round-tripping, it is usually recommended to use the dedicated round_trip parameter instead.
- round_trip (bool, default False): If True, dumped values should be valid as input for non-idempotent types such as Json[T].
- warnings (bool | Literal['none', 'warn', 'error'], default True): How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.
- fallback (Callable[[Any], Any] | None, default None): A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.
- serialize_as_any (bool, default False): Whether to serialize fields with duck-typing serialization behavior.

Returns:

- str: A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json(
    self,
    *,
    indent: int | None = None,
    ensure_ascii: bool = False,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    exclude_computed_fields: bool = False,
    round_trip: bool = False,
    warnings: bool | Literal['none', 'warn', 'error'] = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
) -> str:
    """!!! abstract "Usage Documentation"
        [`model_dump_json`](../concepts/serialization.md#json-mode)

    Generates a JSON representation of the model using Pydantic's `to_json` method.

    Args:
        indent: Indentation to use in the JSON output. If None is passed, the output will be compact.
        ensure_ascii: If `True`, the output is guaranteed to have all incoming non-ASCII characters escaped.
            If `False` (the default), these characters will be output as-is.
        include: Field(s) to include in the JSON output.
        exclude: Field(s) to exclude from the JSON output.
        context: Additional context to pass to the serializer.
        by_alias: Whether to serialize using field aliases.
        exclude_unset: Whether to exclude fields that have not been explicitly set.
        exclude_defaults: Whether to exclude fields that are set to their default value.
        exclude_none: Whether to exclude fields that have a value of `None`.
        exclude_computed_fields: Whether to exclude computed fields.
            While this can be useful for round-tripping, it is usually recommended to use the dedicated
            `round_trip` parameter instead.
        round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].
        warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,
            "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].
        fallback: A function to call when an unknown value is encountered. If not provided,
            a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.
        serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.

    Returns:
        A JSON string representation of the model.
    """
    return self.__pydantic_serializer__.to_json(
        self,
        indent=indent,
        ensure_ascii=ensure_ascii,
        include=include,
        exclude=exclude,
        context=context,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        exclude_computed_fields=exclude_computed_fields,
        round_trip=round_trip,
        warnings=warnings,
        fallback=fallback,
        serialize_as_any=serialize_as_any,
    ).decode()

model_fields classmethod ¤

model_fields() -> dict[str, FieldInfo]

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
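
A short sketch of inspecting fields on the class (model and field names are illustrative):

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    id: int
    name: str = Field(default="anon", description="display name")


# Access on the class, not on an instance (instance access is deprecated).
fields = User.model_fields
# Each value is a FieldInfo carrying the default, description, etc.
```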

model_json_schema classmethod ¤

model_json_schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', *, union_format: Literal['any_of', 'primitive_type_array'] = 'any_of') -> dict[str, Any]

Generates a JSON schema for a model class.

Parameters:

- `by_alias` (`bool`, default: `True`): Whether to use attribute aliases or not.
- `ref_template` (`str`, default: `DEFAULT_REF_TEMPLATE`): The reference template.
- `union_format` (`Literal['any_of', 'primitive_type_array']`, default: `'any_of'`): The format to use when combining schemas from unions together. Can be one of:
  - `'any_of'`: Use the `anyOf` keyword to combine schemas (the default).
  - `'primitive_type_array'`: Use the `type` keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to `any_of`.
- `schema_generator` (`type[GenerateJsonSchema]`, default: `GenerateJsonSchema`): To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications.
- `mode` (`JsonSchemaMode`, default: `'validation'`): The mode in which to generate the schema.

Returns:

- `dict[str, Any]`: The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
    *,
    union_format: Literal['any_of', 'primitive_type_array'] = 'any_of',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        union_format: The format to use when combining schemas from unions together. Can be one of:

            - `'any_of'`: Use the [`anyOf`](https://json-schema.org/understanding-json-schema/reference/combining#anyOf)
            keyword to combine schemas (the default).
            - `'primitive_type_array'`: Use the [`type`](https://json-schema.org/understanding-json-schema/reference/type)
            keyword as an array of strings, containing each type of the combination. If any of the schemas is not a primitive
            type (`string`, `boolean`, `null`, `integer` or `number`) or contains constraints/metadata, falls back to
            `any_of`.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls,
        by_alias=by_alias,
        ref_template=ref_template,
        union_format=union_format,
        schema_generator=schema_generator,
        mode=mode,
    )
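
A minimal sketch of generating a schema (the `Item` model is hypothetical):

```python
from pydantic import BaseModel, Field


class Item(BaseModel):
    name: str = Field(description="Human-readable item name")
    qty: int = 1


# The result is a plain dict following JSON Schema conventions:
# field defaults, descriptions, and required fields are all reflected.
schema = Item.model_json_schema()
```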

model_parametrized_name classmethod ¤

model_parametrized_name(params: tuple[type[Any], ...]) -> str

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

- `params` (`tuple[type[Any], ...]`, required): Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`.

Returns:

- `str`: String representing the new class where `params` are passed to `cls` as type variables.

Raises:

- `TypeError`: Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
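
A sketch with a hypothetical generic model, showing how the parametrized class name is derived:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar("T")


class Response(BaseModel, Generic[T]):
    data: T


# Parametrizing the generic produces a class whose __name__ comes from
# model_parametrized_name; the method can also be called directly.
name = Response.model_parametrized_name((int,))  # 'Response[int]'
```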

model_post_init ¤

model_post_init(context: Any) -> None

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """

model_rebuild classmethod ¤

model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None) -> bool | None

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

- `force` (`bool`, default: `False`): Whether to force the rebuilding of the model schema.
- `raise_errors` (`bool`, default: `True`): Whether to raise errors.
- `_parent_namespace_depth` (`int`, default: `2`): The depth level of the parent namespace.
- `_types_namespace` (`MappingNamespace | None`, default: `None`): The types namespace.

Returns:

- `bool | None`: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding was required, returns `True` if rebuilding was successful, otherwise `False`.

Source code in pydantic/main.py
@classmethod
def model_rebuild(
    cls,
    *,
    force: bool = False,
    raise_errors: bool = True,
    _parent_namespace_depth: int = 2,
    _types_namespace: MappingNamespace | None = None,
) -> bool | None:
    """Try to rebuild the pydantic-core schema for the model.

    This may be necessary when one of the annotations is a ForwardRef which could not be resolved during
    the initial attempt to build the schema, and automatic rebuilding fails.

    Args:
        force: Whether to force the rebuilding of the model schema, defaults to `False`.
        raise_errors: Whether to raise errors, defaults to `True`.
        _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.
        _types_namespace: The types namespace, defaults to `None`.

    Returns:
        Returns `None` if the schema is already "complete" and rebuilding was not required.
        If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.
    """
    already_complete = cls.__pydantic_complete__
    if already_complete and not force:
        return None

    cls.__pydantic_complete__ = False

    for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'):
        if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer):
            # Deleting the validator/serializer is necessary as otherwise they can get reused in
            # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()`
            # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used.
            # Same applies for the core schema that can be reused in schema generation.
            delattr(cls, attr)

    if _types_namespace is not None:
        rebuild_ns = _types_namespace
    elif _parent_namespace_depth > 0:
        rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {}
    else:
        rebuild_ns = {}

    parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {}

    ns_resolver = _namespace_utils.NsResolver(
        parent_namespace={**rebuild_ns, **parent_ns},
    )

    return _model_construction.complete_model_class(
        cls,
        _config.ConfigWrapper(cls.model_config, check=False),
        ns_resolver,
        raise_errors=raise_errors,
        # If the model was already complete, we don't need to call the hook again.
        call_on_complete_hook=not already_complete,
    )
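
A sketch of the typical forward-reference scenario (class names hypothetical): `Box` refers to `Item` before it is defined, so its schema is initially incomplete; once `Item` exists, `model_rebuild()` resolves the reference. (Recent Pydantic versions may also rebuild automatically on first use, in which case the explicit call is a harmless no-op.)

```python
from pydantic import BaseModel


class Box(BaseModel):
    content: "Item"  # forward reference; 'Item' does not exist yet


class Item(BaseModel):
    name: str


# Now that Item is defined, resolve the forward reference explicitly.
Box.model_rebuild()
b = Box(content={"name": "pen"})
```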

model_validate classmethod ¤

model_validate(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate a pydantic model instance.

Parameters:

- `obj` (`Any`, required): The object to validate.
- `strict` (`bool | None`, default: `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default: `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `from_attributes` (`bool | None`, default: `None`): Whether to extract data from object attributes.
- `context` (`Any | None`, default: `None`): Additional context to pass to the validator.
- `by_alias` (`bool | None`, default: `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default: `None`): Whether to use the field's name when validating against the provided input data.

Raises:

- `ValidationError`: If the object could not be validated.

Returns:

- `Self`: The validated model instance.

Source code in pydantic/main.py
@classmethod
def model_validate(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate a pydantic model instance.

    Args:
        obj: The object to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Raises:
        ValidationError: If the object could not be validated.

    Returns:
        The validated model instance.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_python(
        obj,
        strict=strict,
        extra=extra,
        from_attributes=from_attributes,
        context=context,
        by_alias=by_alias,
        by_name=by_name,
    )
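
A sketch contrasting lax and strict validation, plus `from_attributes` (the `Point` model is hypothetical):

```python
from pydantic import BaseModel, ValidationError


class Point(BaseModel):
    x: int
    y: int


# Lax mode (the default) coerces compatible values such as "2" -> 2.
p = Point.model_validate({"x": 1, "y": "2"})

# Strict mode rejects the same string-to-int coercion.
try:
    Point.model_validate({"x": 1, "y": "2"}, strict=True)
    strict_ok = True
except ValidationError:
    strict_ok = False


# `from_attributes=True` reads attributes off an arbitrary object.
class RawPoint:
    x = 3
    y = 4


q = Point.model_validate(RawPoint(), from_attributes=True)
```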

model_validate_json classmethod ¤

model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

- `json_data` (`str | bytes | bytearray`, required): The JSON data to validate.
- `strict` (`bool | None`, default: `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default: `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `context` (`Any | None`, default: `None`): Extra variables to pass to the validator.
- `by_alias` (`bool | None`, default: `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default: `None`): Whether to use the field's name when validating against the provided input data.

Returns:

- `Self`: The validated Pydantic model.

Raises:

- `ValidationError`: If `json_data` is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod
def model_validate_json(
    cls,
    json_data: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """!!! abstract "Usage Documentation"
        [JSON Parsing](../concepts/json.md#json-parsing)

    Validate the given JSON data against the Pydantic model.

    Args:
        json_data: The JSON data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.

    Raises:
        ValidationError: If `json_data` is not a JSON string or the object could not be validated.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_json(
        json_data, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
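
A sketch of parsing JSON directly into a model (the `Event` model is hypothetical); note that both invalid JSON and data failing validation raise `ValidationError`:

```python
from pydantic import BaseModel, ValidationError


class Event(BaseModel):
    kind: str
    count: int


# JSON is parsed and validated in one step, without json.loads().
e = Event.model_validate_json('{"kind": "click", "count": 3}')

# Invalid JSON raises ValidationError, not json.JSONDecodeError.
try:
    Event.model_validate_json("not json")
    parsed = True
except ValidationError:
    parsed = False
```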

model_validate_strings classmethod ¤

model_validate_strings(obj: Any, *, strict: bool | None = None, extra: ExtraValues | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None) -> Self

Validate the given object with string data against the Pydantic model.

Parameters:

- `obj` (`Any`, required): The object containing string data to validate.
- `strict` (`bool | None`, default: `None`): Whether to enforce types strictly.
- `extra` (`ExtraValues | None`, default: `None`): Whether to ignore, allow, or forbid extra data during model validation. See the `extra` configuration value for details.
- `context` (`Any | None`, default: `None`): Extra variables to pass to the validator.
- `by_alias` (`bool | None`, default: `None`): Whether to use the field's alias when validating against the provided input data.
- `by_name` (`bool | None`, default: `None`): Whether to use the field's name when validating against the provided input data.

Returns:

- `Self`: The validated Pydantic model.

Source code in pydantic/main.py
@classmethod
def model_validate_strings(
    cls,
    obj: Any,
    *,
    strict: bool | None = None,
    extra: ExtraValues | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None,
) -> Self:
    """Validate the given object with string data against the Pydantic model.

    Args:
        obj: The object containing string data to validate.
        strict: Whether to enforce types strictly.
        extra: Whether to ignore, allow, or forbid extra data during model validation.
            See the [`extra` configuration value][pydantic.ConfigDict.extra] for details.
        context: Extra variables to pass to the validator.
        by_alias: Whether to use the field's alias when validating against the provided input data.
        by_name: Whether to use the field's name when validating against the provided input data.

    Returns:
        The validated Pydantic model.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True

    if by_alias is False and by_name is not True:
        raise PydanticUserError(
            'At least one of `by_alias` or `by_name` must be set to True.',
            code='validate-by-alias-and-name-false',
        )

    return cls.__pydantic_validator__.validate_strings(
        obj, strict=strict, extra=extra, context=context, by_alias=by_alias, by_name=by_name
    )
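
A sketch for the common case where every value arrives as a string, e.g. from environment variables or query parameters (the `Record` model is hypothetical):

```python
from datetime import date

from pydantic import BaseModel


class Record(BaseModel):
    id: int
    when: date


# String values are parsed as they would be when coming from JSON,
# so "42" becomes an int and "2024-01-01" becomes a date.
r = Record.model_validate_strings({"id": "42", "when": "2024-01-01"})
```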

parse_file classmethod ¤

parse_file(path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
    'use `model_validate_json`, otherwise `model_validate` instead.',
    category=None,
)
def parse_file(  # noqa: D102
    cls,
    path: str | Path,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:
    warnings.warn(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    obj = parse.load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
    )
    return cls.parse_obj(obj)

parse_obj classmethod ¤

parse_obj(obj: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `parse_obj` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(obj)

parse_raw classmethod ¤

parse_raw(b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
    'otherwise load the data then use `model_validate` instead.',
    category=None,
)
def parse_raw(  # noqa: D102
    cls,
    b: str | bytes,
    *,
    content_type: str | None = None,
    encoding: str = 'utf8',
    proto: DeprecatedParseProtocol | None = None,
    allow_pickle: bool = False,
) -> Self:  # pragma: no cover
    warnings.warn(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    from .deprecated import parse

    try:
        obj = parse.load_str_bytes(
            b,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
    except (ValueError, TypeError) as exc:
        import json

        # try to match V1
        if isinstance(exc, UnicodeDecodeError):
            type_str = 'value_error.unicodedecode'
        elif isinstance(exc, json.JSONDecodeError):
            type_str = 'value_error.jsondecode'
        elif isinstance(exc, ValueError):
            type_str = 'value_error'
        else:
            type_str = 'type_error'

        # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
        error: pydantic_core.InitErrorDetails = {
            # The type: ignore on the next line is to ignore the requirement of LiteralString
            'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
            'loc': ('__root__',),
            'input': b,
        }
        raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
    return cls.model_validate(obj)

schema classmethod ¤

schema(by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE) -> Dict[str, Any]
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
def schema(  # noqa: D102
    cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
) -> Dict[str, Any]:  # noqa UP006
    warnings.warn(
        'The `schema` method is deprecated; use `model_json_schema` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

schema_json classmethod ¤

schema_json(*, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any) -> str
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
    category=None,
)
def schema_json(  # noqa: D102
    cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
) -> str:  # pragma: no cover
    warnings.warn(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    import json

    from .deprecated.json import pydantic_encoder

    return json.dumps(
        cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
        default=pydantic_encoder,
        **dumps_kwargs,
    )

update_forward_refs classmethod ¤

update_forward_refs(**localns: Any) -> None
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated(
    'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
    category=None,
)
def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
    warnings.warn(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    if localns:  # pragma: no cover
        raise TypeError('`localns` arguments are not longer accepted.')
    cls.model_rebuild(force=True)

validate classmethod ¤

validate(value: Any) -> Self
Source code in pydantic/main.py
@classmethod
@typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
def validate(cls, value: Any) -> Self:  # noqa: D102
    warnings.warn(
        'The `validate` method is deprecated; use `model_validate` instead.',
        category=PydanticDeprecatedSince20,
        stacklevel=2,
    )
    return cls.model_validate(value)