spark#
Submodules#
Attributes#
- register_module: Decorator used to register a new SparkModule.
- register_initializer: Decorator used to register a new Initializer.
- register_payload: Decorator used to register a new SparkPayload.
- register_config: Decorator used to register a new SparkConfig.
- register_cfg_validator: Decorator used to register a new ConfigurationValidator.
- Registry singleton.
Classes#
- Constant: jax.Array wrapper for constant arrays.
- Variable: The base class for all Variable types.
- SparkPayload: Abstract payload definition to validate exchanges between SparkModules.
- SpikeArray: Representation of a collection of spike events.
- CurrentArray: Representation of a collection of currents.
- PotentialArray: Representation of a collection of membrane potentials.
- Representation of a float array.
- Representation of an integer array.
- BooleanMask: Representation of an inhibitory boolean mask.
- PortSpecs: Base specification for a port of a SparkModule.
- InputSpec: Specification for an input port of a SparkModule.
- OutputSpec: Specification for an output port of a SparkModule.
- ModuleSpecs: Specification for the SparkModule automatic constructor.
Functions#
- split: Wrapper around flax.nnx.split to simplify imports.
- merge: Wrapper around flax.nnx.merge to simplify imports.
Package Contents#
- class spark.Constant(data, dtype=None)[source]#
jax.Array wrapper for constant arrays.
- Parameters:
data (Any)
dtype (Any)
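A minimal usage sketch, assuming Constant accepts any array-like data and an optional dtype (only the signature above is taken from this page):

```python
import jax.numpy as jnp

import spark

# Hedged sketch: wrap a fixed array so downstream modules can treat it as a constant.
# The concrete shape and dtype are illustrative assumptions.
conn_mask = spark.Constant(jnp.ones((4, 4)), dtype=jnp.float32)
```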
- class spark.Variable(value, dtype=None, **metadata)[source]#
Bases: flax.nnx.Variable
The base class for all Variable types. Note that this is just a convenience wrapper around Flax's nnx.Variable to simplify imports.
- Parameters:
value (Any)
dtype (Any)
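Because Variable is documented as a thin wrapper around flax.nnx.Variable, it should behave like any NNX variable; the sketch below assumes the usual .value accessor from the Flax API and is not verified against spark itself:

```python
import jax.numpy as jnp

import spark

# Hedged sketch: create a variable and update it through .value, as with
# flax.nnx.Variable (assumption based on the Flax API).
v = spark.Variable(jnp.zeros((3,)))
v.value = v.value + 1.0
```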
- class spark.SparkPayload[source]#
Bases: abc.ABC
Abstract payload definition to validate exchanges between SparkModules.
- class spark.SpikeArray[source]#
Bases: ValueSparkPayload
Representation of a collection of spike events.
- class spark.CurrentArray[source]#
Bases: ValueSparkPayload
Representation of a collection of currents.
- class spark.PotentialArray[source]#
Bases: ValueSparkPayload
Representation of a collection of membrane potentials.
- class spark.BooleanMask[source]#
Bases: ValueSparkPayload
Representation of an inhibitory boolean mask.
- class spark.PortSpecs(payload_type, shape, dtype, description=None)[source]#
Base specification for a port of a SparkModule.
- Parameters:
- payload_type: type[spark.core.payloads.SparkPayload] | None[source]#
- class spark.InputSpec(payload_type, shape, dtype, description=None)[source]#
Bases: PortSpecs
Specification for an input port of a SparkModule.
- Parameters:
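A hedged sketch of declaring an input port that carries spike events; the parameter names come from the signature above, while the concrete shape and dtype values are illustrative assumptions:

```python
import jax.numpy as jnp

import spark

# Hypothetical port declaration: a 128-wide boolean spike input.
spike_input = spark.InputSpec(
    payload_type=spark.SpikeArray,
    shape=(128,),
    dtype=jnp.bool_,
    description="Afferent spike events",
)
```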
- class spark.OutputSpec(**kwargs)[source]#
Bases: PortSpecs
Specification for an output port of a SparkModule.
- class spark.ModuleSpecs(name, module_cls, inputs, config)[source]#
Specification for the SparkModule automatic constructor.
- Parameters:
- module_cls: type[spark.core.module.SparkModule][source]#
- spark.split(node, *filters)[source]#
Wrapper around flax.nnx.split to simplify imports.
- Parameters:
node (A)
filters (flax.nnx.filterlib.Filter)
- Return type:
tuple[flax.nnx.graph.GraphDef[A], flax.nnx.graph.GraphState | flax.nnx.variablelib.VariableState, typing_extensions.Unpack[tuple[flax.nnx.graph.GraphState | flax.nnx.variablelib.VariableState, Ellipsis]]]
- spark.merge(graphdef, state, /, *states)[source]#
Wrapper around flax.nnx.merge to simplify imports.
- Parameters:
graphdef (flax.nnx.graph.GraphDef[A])
state (Any)
states (Any)
- Return type:
A
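Since these are documented as plain wrappers over flax.nnx.split and flax.nnx.merge, the usual NNX functional round-trip should apply; the sketch below assumes identical behaviour to the Flax functions:

```python
from flax import nnx

import spark

# Split a module into its static graph definition and its variable state,
# then rebuild an equivalent module from the two pieces.
model = nnx.Linear(2, 3, rngs=nnx.Rngs(0))
graphdef, state = spark.split(model)
restored = spark.merge(graphdef, state)
assert isinstance(restored, nnx.Linear)
```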
- spark.register_module[source]#
Decorator used to register a new SparkModule. Note that the module must inherit from spark.nn.Module (spark.core.module.SparkModule).
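A hypothetical sketch of registering a custom module; whether the decorator is used bare (as here) or takes arguments, and the exact spark.nn.Module interface, are assumptions, and only the inheritance requirement comes from this page:

```python
import spark
import spark.nn as snn

# Hypothetical module that simply forwards its spike input unchanged.
@spark.register_module
class Passthrough(snn.Module):
    def __call__(self, spikes: spark.SpikeArray) -> spark.SpikeArray:
        return spikes
```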
- spark.register_initializer[source]#
Decorator used to register a new Initializer. Note that the initializer must inherit from spark.nn.initializers.base.Initializer.
- spark.register_payload[source]#
Decorator used to register a new SparkPayload. Note that the payload must inherit from spark.SparkPayload (spark.core.payloads.SparkPayload).
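A hypothetical sketch of registering a custom payload type; the dataclass layout and the single `value` field are assumptions, and only the requirement to inherit from spark.SparkPayload is documented:

```python
from dataclasses import dataclass

import jax

import spark

# Hypothetical payload representing a collection of synaptic traces.
@spark.register_payload
@dataclass
class TraceArray(spark.SparkPayload):
    value: jax.Array
```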
- spark.register_config[source]#
Decorator used to register a new SparkConfig. Note that the config must inherit from spark.nn.BaseConfig (spark.core.config.BaseSparkConfig).
- spark.register_cfg_validator[source]#
Decorator used to register a new ConfigurationValidator. Note that the validator must inherit from spark.core.config_validation.ConfigurationValidator.
- class spark.GraphEditor[source]#
- launch()[source]#
Creates and shows the editor window without blocking. This method is safe to call multiple times.
- Return type:
None
- closeEvent(event)[source]#
Overrides the default close event to check for unsaved changes.
- Return type:
None
- new_session()[source]#
Clears the current session after checking for unsaved changes.
- Return type:
None
- load_session()[source]#
Loads a graph state from a Spark Graph Editor file after checking for unsaved changes.
- Return type:
None
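A hedged sketch of opening the editor; launch() is documented as non-blocking, so an application event loop (assumed to be Qt, given the closeEvent override) must keep running for the window to stay responsive, and the zero-argument constructor is an assumption:

```python
import spark

# Create the editor and show its window without blocking the calling thread.
editor = spark.GraphEditor()
editor.launch()
```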