spark.core.config#

Attributes#

Classes#

InitializableFieldMetaclass

Metaclass that automatically injects common methods into the class.

InitializableField

Wrapper for fields that allows Initializers | InitializersConfig to define the init() method.

SparkMetaConfig

Metaclass that promotes class attributes to dataclass fields.

BaseSparkConfig

Base class for module configuration.

SparkConfig

Default class for module configuration.

Module Contents#

spark.core.config.logger[source]#
spark.core.config.METADATA_TEMPLATE[source]#
spark.core.config.IMMUTABLE_TYPES[source]#
class spark.core.config.InitializableFieldMetaclass[source]#

Bases: type

Metaclass that automatically injects common methods into the class.

class spark.core.config.InitializableField(obj)[source]#

Wrapper for fields that allows Initializers | InitializersConfig to define the init() method. The init() method is used extensively throughout Spark modules to initialize variables either from default values or from full-fledged initializers.

__obj__: Any[source]#
__getattr__(name)[source]#
Return type:

Any

__setattr__(name, value)[source]#
Return type:

None

__repr__()[source]#
Return type:

str

__str__()[source]#
Return type:

str

init(init_kwargs={}, key=None, shape=None, dtype=None, **kwargs)[source]#
Parameters:
Return type:

jax.Array | int | float | complex | bool
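The wrapping pattern described above can be illustrated without the Spark library itself. The sketch below is a minimal, hypothetical stand-in (the name FieldWrapper and its init() signature are not Spark's): it stores a wrapped object, delegates attribute reads to it via __getattr__, and lets init() either call the wrapped object when it is callable (an initializer) or return it as a plain default value.

```python
from typing import Any


class FieldWrapper:
    """Toy stand-in for InitializableField: wraps an object, delegates
    attribute access to it, and exposes an init() hook."""

    def __init__(self, obj: Any) -> None:
        # Store the wrapped object directly, bypassing attribute hooks.
        object.__setattr__(self, "_obj", obj)

    def __getattr__(self, name: str) -> Any:
        # Called only when normal lookup fails: delegate to the wrapped object.
        return getattr(self._obj, name)

    def __repr__(self) -> str:
        return f"FieldWrapper({self._obj!r})"

    def init(self, **kwargs: Any) -> Any:
        # A callable wrapped object acts as an initializer;
        # a plain value is returned as-is.
        return self._obj(**kwargs) if callable(self._obj) else self._obj
```

A wrapped float behaves like the float for reads, while a wrapped callable is invoked on init() — which is the behaviour the docstring describes for defaults versus full-fledged initializers.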

class spark.core.config.SparkMetaConfig[source]#

Bases: abc.ABCMeta

Metaclass that promotes class attributes to dataclass fields.

map_common_init_patterns(factory=dc.MISSING)[source]#
Parameters:

factory (Callable)

Return type:

tuple[Any, Any]
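The promotion of class attributes to dataclass fields can be sketched with a toy metaclass. This is not Spark's implementation (SparkMetaConfig derives from abc.ABCMeta and does more work); the hypothetical PromoteFieldsMeta below only shows the core idea: annotated class attributes are handed to dataclasses, which generates __init__, __repr__, and __eq__ from them.

```python
import dataclasses as dc


class PromoteFieldsMeta(type):
    """Toy metaclass in the spirit of SparkMetaConfig: annotated class
    attributes are promoted to dataclass fields."""

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        # dataclass() reads the annotations and generates __init__,
        # __repr__, and __eq__ from them.
        return dc.dataclass(cls)


class ToyConfig(metaclass=PromoteFieldsMeta):
    seed: int = 0
    dt: float = 0.1
```

With this metaclass in place, ToyConfig accepts keyword arguments for its fields without an explicit __init__, mirroring how SparkConfig subclasses declare plain annotated attributes.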

class spark.core.config.BaseSparkConfig(__skip_validation__=False, **kwargs)[source]#

Bases: abc.ABC

Base class for module configuration.

Parameters:

__skip_validation__ (bool)

__config_delimiter__: str = '__'[source]#
__shared_config_delimiter__: str = '_s_'[source]#
__metadata__: dict[source]#
__graph_editor_metadata__: dict[source]#
classmethod __init_subclass__(**kwargs)[source]#
__eq__(other)[source]#
Return type:

bool

merge(partial={}, __skip_validation__=False)[source]#

Update config with partial overrides.

Parameters:
Return type:

None

diff(other)[source]#

Return differences from another config.

Parameters:

other (BaseSparkConfig)

Return type:

dict[str, Any]
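The merge/diff pair above can be illustrated on a toy dataclass. The stand-in below (the name ToyMergeConfig and the exact override/rejection semantics are assumptions, not Spark's code) shows the documented contract: merge applies partial overrides in place and returns None, while diff reports the fields on which two configs disagree.

```python
import dataclasses as dc
from typing import Any


@dc.dataclass
class ToyMergeConfig:
    seed: int = 0
    dt: float = 0.1

    def merge(self, partial: dict[str, Any]) -> None:
        # Field-wise, in-place override; unknown keys are rejected.
        names = {f.name for f in dc.fields(self)}
        for key, value in partial.items():
            if key not in names:
                raise KeyError(key)
            setattr(self, key, value)

    def diff(self, other: "ToyMergeConfig") -> dict[str, Any]:
        # Values of the fields on which the two configs disagree.
        return {
            f.name: getattr(self, f.name)
            for f in dc.fields(self)
            if getattr(self, f.name) != getattr(other, f.name)
        }
```

After merging {"seed": 7} into a default instance, diff against an untouched instance reports only the seed field.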

validate(is_partial=False, errors=None, current_path=['main'])[source]#

Validates all fields in the configuration class.

Parameters:
Return type:

None

get_field_errors(field_name)[source]#

Returns the list of validation errors for the given field.

Parameters:

field_name (str)

Return type:

list[str]

get_metadata()[source]#

Returns all the metadata in the configuration class, indexed by the attribute name.

Return type:

dict[str, Any]

property class_ref: type[source]#

Returns the type of the associated Module/Initializer.

NOTE: It is recommended to set __class_ref__ to the name of the associated module/initializer when defining custom configuration classes. The automatic class_ref solver is extremely brittle and likely to fail in many custom scenarios.

Return type:

type

__post_init__()[source]#
to_dict(is_partial=False)[source]#

Serialize config to a dictionary.

Parameters:

is_partial (bool)

Return type:

dict[str, dict[str, Any]]

get_kwargs()[source]#

Returns a dictionary with pairs of key, value fields (skips metadata).

Return type:

dict[str, dict[str, Any]]

classmethod from_dict(dct)[source]#

Create config instance from dictionary.

Parameters:

dct (dict)

Return type:

BaseSparkConfig
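A minimal sketch of the to_dict/from_dict roundtrip, using stdlib dataclasses as a stand-in (the real to_dict returns a nested dict[str, dict[str, Any]] and carries metadata; the flat shape below is a simplification, and ToyDictConfig is a hypothetical name):

```python
import dataclasses as dc
from typing import Any


@dc.dataclass
class ToyDictConfig:
    seed: int = 0
    dt: float = 0.1

    def to_dict(self) -> dict[str, Any]:
        # Serialize all fields to a plain dictionary.
        return dc.asdict(self)

    @classmethod
    def from_dict(cls, dct: dict[str, Any]) -> "ToyDictConfig":
        # Rebuild an instance from its dictionary form.
        return cls(**dct)
```

The roundtrip is lossless for plain field values: from_dict(cfg.to_dict()) compares equal to cfg.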

to_file(file_path, is_partial=False)[source]#

Export a config instance to a .scfg file.

Parameters:
  • file_path (str)

  • is_partial (bool)

Return type:

None

classmethod from_file(file_path, is_partial=False)[source]#

Create config instance from a .scfg file.

Parameters:
  • file_path (str)

  • is_partial (bool)

Return type:

BaseSparkConfig
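The file roundtrip can be sketched the same way. The actual .scfg format is not specified on this page, so the stand-in below uses JSON purely as a placeholder serialization; ToyFileConfig and its method bodies are assumptions, not Spark's implementation.

```python
import dataclasses as dc
import json
import os
import tempfile


@dc.dataclass
class ToyFileConfig:
    seed: int = 0
    dt: float = 0.1

    def to_file(self, file_path: str) -> None:
        # Write all fields to disk (JSON here stands in for .scfg).
        with open(file_path, "w") as fh:
            json.dump(dc.asdict(self), fh)

    @classmethod
    def from_file(cls, file_path: str) -> "ToyFileConfig":
        # Rebuild an instance from the serialized file.
        with open(file_path) as fh:
            return cls(**json.load(fh))


# Roundtrip through a temporary file.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "toy.scfg")
    ToyFileConfig(seed=5).to_file(path)
    restored = ToyFileConfig.from_file(path)
```

As with the dictionary roundtrip, the restored instance compares equal to the one that was exported.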

__iter__()[source]#

Custom iterator to simplify SparkConfig inspection across the entire ecosystem. This iterator excludes private fields.

Output:

  • field_name (str): field name

  • field_value (tp.Any): field value

Return type:

Iterator[tuple[str, dataclasses.Field, Any]]
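The iteration contract — (name, Field, value) triples with private fields excluded — can be sketched on a toy dataclass. ToyIterConfig and the leading-underscore convention for "private" are assumptions for illustration:

```python
import dataclasses as dc
from typing import Any, Iterator


@dc.dataclass
class ToyIterConfig:
    seed: int = 0
    dt: float = 0.1
    _internal: str = "hidden"

    def __iter__(self) -> Iterator[tuple[str, dc.Field, Any]]:
        # Yield (field_name, Field, field_value), skipping private fields.
        for f in dc.fields(self):
            if not f.name.startswith("_"):
                yield f.name, f, getattr(self, f.name)
```

Iterating an instance therefore yields only the public fields, which is what makes config inspection loops (for name, field, value in cfg: ...) convenient across the ecosystem.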

__repr__()[source]#
inspect(simplified=False)[source]#

Returns a formatted string representation of the data structure.

Return type:

str

with_new_seeds(seed=None)[source]#

Utility method to recompute all seed variables within the SparkConfig. Useful when creating several populations from the same config.

Return type:

BaseSparkConfig

class spark.core.config.SparkConfig(__skip_validation__=False, **kwargs)[source]#

Bases: BaseSparkConfig

Default class for module configuration.

Parameters:

__skip_validation__ (bool)

seed: int[source]#
dtype: jax.typing.DTypeLike[source]#
dt: float[source]#