spark.nn.interfaces#

Submodules#

Classes#

Interface

Abstract Interface model.

InterfaceConfig

Abstract Interface model configuration class.

ControlInterface

Abstract ControlInterface model.

ControlInterfaceConfig

Abstract ControlInterface model configuration class.

ControlInterfaceOutput

ControlInterface model output spec.

Concat

Combines several streams of inputs of the same type into a single stream.

ConcatConfig

Concat configuration class.

ConcatReshape

Combines several streams of inputs of the same type into a single stream.

ConcatReshapeConfig

ConcatReshape configuration class.

Sampler

Samples a fixed-size subset of a single input stream.

SamplerConfig

Sampler configuration class.

InputInterface

Abstract input interface model.

InputInterfaceConfig

Abstract InputInterface model configuration class.

InputInterfaceOutput

InputInterface model output spec.

PoissonSpiker

Transforms a continuous signal to a spiking signal.

PoissonSpikerConfig

PoissonSpiker model configuration class.

LinearSpiker

Transforms a continuous signal to a spiking signal.

LinearSpikerConfig

LinearSpiker model configuration class.

TopologicalPoissonSpiker

Transforms a continuous signal to a spiking signal.

TopologicalPoissonSpikerConfig

TopologicalPoissonSpiker configuration class.

TopologicalLinearSpiker

Transforms a continuous signal to a spiking signal.

TopologicalLinearSpikerConfig

TopologicalLinearSpiker configuration class.

OutputInterface

Abstract OutputInterface model.

OutputInterfaceConfig

Abstract OutputInterface model configuration class.

OutputInterfaceOutput

OutputInterface model output spec.

ExponentialIntegrator

Transforms a discrete spike signal to a continuous signal.

ExponentialIntegratorConfig

ExponentialIntegrator configuration class.

Package Contents#

class spark.nn.interfaces.Interface(config=None, **kwargs)[source]#

Bases: spark.core.module.SparkModule, abc.ABC, Generic[ConfigT]

Abstract Interface model.

Parameters:

config (ConfigT | None)

config: ConfigT[source]#
abstractmethod __call__(*args, **kwargs)[source]#

Computes the control flow operation.

Parameters:

args (spark.core.payloads.SparkPayload)

Return type:

InterfaceOutput

class spark.nn.interfaces.InterfaceConfig(**kwargs)[source]#

Bases: spark.core.config.SparkConfig

Abstract Interface model configuration class.

class spark.nn.interfaces.ControlInterface(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.base.Interface, abc.ABC, Generic[ConfigT]

Abstract ControlInterface model.

Parameters:

config (ConfigT | None)

config: ConfigT[source]#
abstractmethod __call__(*args, **kwargs)[source]#

Control operation.

Parameters:

args (spark.core.payloads.SparkPayload)

Return type:

ControlInterfaceOutput

class spark.nn.interfaces.ControlInterfaceConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.base.InterfaceConfig

Abstract ControlInterface model configuration class.

class spark.nn.interfaces.ControlInterfaceOutput[source]#

Bases: TypedDict

ControlInterface model output spec.


output: spark.core.payloads.SparkPayload[source]#
class spark.nn.interfaces.Concat(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.control.base.ControlInterface

Combines several streams of inputs of the same type into a single stream.

Init:

num_inputs: int
payload_type: type[SparkPayload]

Input:

input: type[SparkPayload]

Output:

output: type[SparkPayload]

Parameters:

config (ConcatConfig | None)

config: ConcatConfig[source]#
num_inputs[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

__call__(inputs)[source]#

Merge all input streams into a single data output stream.

Parameters:

inputs (list[spark.core.payloads.SparkPayload])

Return type:

spark.nn.interfaces.control.base.ControlInterfaceOutput
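
The merge behaviour can be sketched independently of the library, assuming payloads wrap plain arrays (the function name here is illustrative, not part of spark's API):

```python
import numpy as np

def concat_streams(inputs: list[np.ndarray]) -> np.ndarray:
    """Merge several same-typed input streams into one flat stream."""
    # Flatten each stream, then join them end to end.
    return np.concatenate([x.ravel() for x in inputs])

merged = concat_streams([np.zeros((2, 3)), np.ones(4)])  # shape (10,)
```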

class spark.nn.interfaces.ConcatConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.control.base.ControlInterfaceConfig

Concat configuration class.

num_inputs: int[source]#
class spark.nn.interfaces.ConcatReshape(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.control.base.ControlInterface

Combines several streams of inputs of the same type into a single stream.

Init:

num_inputs: int
reshape: tuple[int, …]
payload_type: type[SparkPayload]

Input:

input: type[SparkPayload]

Output:

output: type[SparkPayload]

Parameters:

config (ConcatReshapeConfig | None)

config: ConcatReshapeConfig[source]#
reshape[source]#
num_inputs[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

__call__(inputs)[source]#

Merge all input streams into a single data output stream. The output stream is reshaped to match the pre-specified shape.

Parameters:

inputs (list[spark.core.payloads.SparkPayload])

Return type:

spark.nn.interfaces.control.base.ControlInterfaceOutput
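
ConcatReshape adds a final reshape on top of the same merge; a minimal sketch (illustrative, not the library's API):

```python
import numpy as np

def concat_reshape(inputs: list[np.ndarray], shape: tuple[int, ...]) -> np.ndarray:
    """Merge input streams, then reshape the flat result to a fixed shape."""
    flat = np.concatenate([x.ravel() for x in inputs])
    # The total number of elements must match np.prod(shape).
    return flat.reshape(shape)

out = concat_reshape([np.zeros((2, 3)), np.ones(6)], (3, 4))  # shape (3, 4)
```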

class spark.nn.interfaces.ConcatReshapeConfig(**kwargs)[source]#

Bases: ConcatConfig

ConcatReshape configuration class.

reshape: tuple[int, ...][source]#
class spark.nn.interfaces.Sampler(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.control.base.ControlInterface

Samples a fixed-size subset of a single input stream. Indices are selected randomly at build time and remain fixed.

Init:

sample_size: int

Input:

input: type[SparkPayload]

Output:

output: type[SparkPayload]

Parameters:

config (SamplerConfig | None)

config: SamplerConfig[source]#
sample_size[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

property indices: jax.Array[source]#
Return type:

jax.Array

__call__(inputs)[source]#

Sub/Super-sample the input stream to get the pre-specified number of samples.

Parameters:

inputs (spark.core.payloads.SparkPayload)

Return type:

spark.nn.interfaces.control.base.ControlInterfaceOutput
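
The fixed-index sampling can be sketched as follows. This is a hedged illustration of the idea, not the module's internals: indices are drawn once (here at construction, standing in for build time) and reused on every call, with sampling with replacement covering the super-sampling case where sample_size exceeds the input size.

```python
import numpy as np

def make_sampler(input_size: int, sample_size: int, seed: int = 0):
    """Draw indices once; every call then gathers the same fixed subset."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(input_size, size=sample_size,
                     replace=sample_size > input_size)
    return lambda x: x[idx]

sample = make_sampler(input_size=8, sample_size=3)
x = np.arange(8.0)
assert np.array_equal(sample(x), sample(x))  # indices stay fixed across calls
```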

class spark.nn.interfaces.SamplerConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.control.base.ControlInterfaceConfig

Sampler configuration class.

sample_size: int[source]#
class spark.nn.interfaces.InputInterface(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.base.Interface, abc.ABC, Generic[ConfigT]

Abstract input interface model.

Parameters:

config (ConfigT | None)

config: ConfigT[source]#
abstractmethod __call__(*args, **kwargs)[source]#

Transforms the input signal into a spike signal.

Parameters:

args (spark.core.payloads.SparkPayload)

Return type:

InputInterfaceOutput

class spark.nn.interfaces.InputInterfaceConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.base.InterfaceConfig

Abstract InputInterface model configuration class.

class spark.nn.interfaces.InputInterfaceOutput[source]#

Bases: TypedDict

InputInterface model output spec.


spikes: spark.core.payloads.SpikeArray[source]#
class spark.nn.interfaces.PoissonSpiker(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterface

Transforms a continuous signal to a spiking signal. This transformation assumes a very simple Poisson neuron model without any type of adaptation or plasticity.

Init:

max_freq: float [Hz]

Input:

signal: FloatArray

Output:

spikes: SpikeArray

Parameters:

config (PoissonSpikerConfig | None)

config: PoissonSpikerConfig[source]#
max_freq[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

__call__(signal)[source]#

Input interface operation.

Input:

A FloatArray of values in the range [0,1].

Output:

A SpikeArray of the same shape as the input.

Parameters:

signal (spark.core.payloads.FloatArray)

Return type:

spark.nn.interfaces.input.base.InputInterfaceOutput
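
The underlying encoding is a Bernoulli approximation of a Poisson process: at each step of length dt, a unit with normalised input s spikes with probability s * max_freq * dt. A self-contained sketch (the dt value and rng handling are assumptions for illustration, not the module's exact internals):

```python
import numpy as np

def poisson_spikes(signal: np.ndarray, max_freq: float,
                   dt: float = 1e-3, rng=None) -> np.ndarray:
    """signal in [0, 1] scales the firing rate up to max_freq (Hz)."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Spike probability per step; clipped so large max_freq * dt stays valid.
    p = np.clip(signal * max_freq * dt, 0.0, 1.0)
    return rng.random(signal.shape) < p

spikes = poisson_spikes(np.array([0.0, 0.5, 1.0]), max_freq=100.0)
```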

class spark.nn.interfaces.PoissonSpikerConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterfaceConfig

PoissonSpiker model configuration class.

max_freq: float[source]#
class spark.nn.interfaces.LinearSpiker(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterface

Transforms a continuous signal to a spiking signal. This transformation assumes a very simple linear neuron model without any type of adaptation or plasticity. Units have a fixed refractory period and, at maximum input, fire at a fixed maximum frequency.

Init:

tau: float [ms]
cd: float [ms]
max_freq: float [Hz]

Input:

signal: FloatArray

Output:

spikes: SpikeArray

Parameters:

config (LinearSpikerConfig | None)

config: LinearSpikerConfig[source]#
tau[source]#
cd[source]#
max_freq[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

reset()[source]#

Reset module to its default state.

__call__(signal)[source]#

Input interface operation.

Input:

A FloatArray of values in the range [0,1].

Output:

A SpikeArray of the same shape as the input.

Parameters:

signal (spark.core.payloads.FloatArray)

Return type:

spark.nn.interfaces.input.base.InputInterfaceOutput
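
The linear variant is deterministic: a unit accumulates phase at a rate proportional to its input, spikes when the phase crosses threshold, and the refractory period cd caps its rate at 1/cd. A rough sketch with times in seconds (the module's parameters are in ms; the state layout here is an illustration, not the module's implementation):

```python
import numpy as np

def make_linear_spiker(max_freq: float, cd: float, dt: float = 1e-3):
    """Deterministic spiker: rate is linear in the input, capped at 1/cd."""
    phase = refrac = None

    def step(signal: np.ndarray) -> np.ndarray:
        nonlocal phase, refrac
        if phase is None:
            phase = np.zeros_like(signal)
            refrac = np.zeros_like(signal)
        rate = np.minimum(signal * max_freq, 1.0 / cd)  # refractory caps the rate
        phase = phase + rate * dt
        spikes = (phase >= 1.0) & (refrac <= 0.0)
        phase = np.where(spikes, phase - 1.0, phase)    # carry the overshoot
        refrac = np.where(spikes, cd, refrac - dt)
        return spikes

    return step
```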

class spark.nn.interfaces.LinearSpikerConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterfaceConfig

LinearSpiker model configuration class.

tau: float[source]#
cd: float[source]#
max_freq: float[source]#
class spark.nn.interfaces.TopologicalPoissonSpiker(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterface

Transforms a continuous signal to a spiking signal. This transformation maps a vector (a point in a hypercube) onto a simple manifold, with or without its borders glued. This transformation assumes a very simple Poisson neuron model without any type of adaptation or plasticity.

Init:

glue: jax.Array
mins: jax.Array
maxs: jax.Array
resolution: int
max_freq: float [Hz]
sigma: float

Input:

signal: FloatArray

Output:

spikes: SpikeArray

Parameters:

config (TopologicalPoissonSpikerConfig | None)

config: TopologicalPoissonSpikerConfig[source]#
resolution[source]#
max_freq[source]#
sigma[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

__call__(signal)[source]#

Input interface operation.

Input:

A FloatArray of values in the range [mins, maxs].

Output:

A SpikeArray of the same shape as the input.

Parameters:

signal (spark.core.payloads.FloatArray)

Return type:

spark.nn.interfaces.input.base.InputInterfaceOutput
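
The topological variant first expands each input dimension into a population code: `resolution` units with Gaussian tuning curves of width `sigma`, using a wrap-around distance along dimensions whose borders are glued. The resulting rates then drive the same Poisson mechanism. A hedged one-dimensional sketch of the general idea (not the module's exact math):

```python
import numpy as np

def bump_rates(x: float, lo: float, hi: float,
               resolution: int, sigma: float, glue: bool = False) -> np.ndarray:
    """Gaussian population code for a scalar x in [lo, hi]."""
    # With glued borders the centers tile a circle, so skip the duplicate endpoint.
    centers = np.linspace(0.0, 1.0, resolution, endpoint=not glue)
    u = (x - lo) / (hi - lo)                # normalise x to [0, 1]
    d = np.abs(u - centers)
    if glue:
        d = np.minimum(d, 1.0 - d)          # wrap-around (circular) distance
    return np.exp(-0.5 * (d / sigma) ** 2)  # in [0, 1]; scaled by max_freq upstream

rates = bump_rates(0.25, lo=0.0, hi=1.0, resolution=8, sigma=0.1, glue=True)
```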

class spark.nn.interfaces.TopologicalPoissonSpikerConfig(**kwargs)[source]#

Bases: TopologicalSpikerConfig, spark.nn.interfaces.input.poisson.PoissonSpikerConfig

TopologicalPoissonSpiker configuration class.

class spark.nn.interfaces.TopologicalLinearSpiker(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.input.base.InputInterface

Transforms a continuous signal to a spiking signal. This transformation maps a vector (a point in a hypercube) onto a simple manifold, with or without its borders glued. This transformation assumes a very simple linear neuron model without any type of adaptation or plasticity.

Init:

glue: jax.Array
mins: jax.Array
maxs: jax.Array
resolution: int
tau: float [ms]
cd: float [ms]
max_freq: float [Hz]
sigma: float

Input:

signal: FloatArray

Output:

spikes: SpikeArray

Parameters:

config (TopologicalLinearSpikerConfig | None)

config: TopologicalLinearSpikerConfig[source]#
resolution[source]#
tau[source]#
cd[source]#
max_freq[source]#
sigma[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

reset()[source]#

Reset module to its default state.

__call__(signal)[source]#

Input interface operation.

Input:

A FloatArray of values in the range [mins, maxs].

Output:

A SpikeArray of the same shape as the input.

Parameters:

signal (spark.core.payloads.FloatArray)

Return type:

spark.nn.interfaces.input.base.InputInterfaceOutput

class spark.nn.interfaces.TopologicalLinearSpikerConfig(**kwargs)[source]#

Bases: TopologicalSpikerConfig, spark.nn.interfaces.input.linear.LinearSpikerConfig

TopologicalLinearSpiker configuration class.

class spark.nn.interfaces.OutputInterface(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.base.Interface, abc.ABC, Generic[ConfigT]

Abstract OutputInterface model.

Parameters:

config (ConfigT | None)

abstractmethod __call__(*args, **kwargs)[source]#

Transforms incoming spikes into an output signal.

Parameters:

args (spark.core.payloads.SpikeArray)

Return type:

dict[str, spark.core.payloads.SparkPayload]

class spark.nn.interfaces.OutputInterfaceConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.base.InterfaceConfig

Abstract OutputInterface model configuration class.

class spark.nn.interfaces.OutputInterfaceOutput[source]#

Bases: TypedDict

OutputInterface model output spec.


signal: spark.core.payloads.FloatArray[source]#
class spark.nn.interfaces.ExponentialIntegrator(config=None, **kwargs)[source]#

Bases: spark.nn.interfaces.output.base.OutputInterface

Transforms a discrete spike signal to a continuous signal. This transformation assumes a very simple integration model without any type of adaptation or plasticity. Spikes are grouped into k non-overlapping clusters, and every neuron contributes the same amount to the output.

Init:

num_outputs: int
saturation_freq: float [Hz]
tau: float [ms]
shuffle: bool
smooth_trace: bool

Input:

spikes: SpikeArray

Output:

signal: FloatArray

Parameters:

config (ExponentialIntegratorConfig)

config: ExponentialIntegratorConfig[source]#
num_outputs[source]#
saturation_freq[source]#
tau[source]#
shuffle[source]#
smooth_trace[source]#
build(input_specs)[source]#

Build method.

Parameters:

input_specs (dict[str, spark.core.specs.InputSpec])

Return type:

None

reset()[source]#

Reset module to its default state.

__call__(spikes)[source]#

Transforms incoming spikes into an output signal.

Parameters:

spikes (spark.core.payloads.SpikeArray)

Return type:

spark.nn.interfaces.output.base.OutputInterfaceOutput
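
The decoding can be sketched as a leaky spike count per cluster: input neurons are split into num_outputs non-overlapping groups, each spike adds an equal increment, and the trace decays exponentially with time constant tau. The normalisation below (so a cluster firing steadily at saturation_freq reads as ~1.0) is an assumption for illustration, not the module's exact scaling:

```python
import numpy as np

def make_exp_integrator(num_inputs: int, num_outputs: int, tau: float,
                        saturation_freq: float, dt: float = 1e-3):
    """Exponentially integrated spike counts over fixed, non-overlapping clusters."""
    groups = np.array_split(np.arange(num_inputs), num_outputs)
    sizes = np.array([len(g) for g in groups], dtype=float)
    decay = np.exp(-dt / tau)
    trace = np.zeros(num_outputs)

    def step(spikes: np.ndarray) -> np.ndarray:
        nonlocal trace
        counts = np.array([spikes[g].sum() for g in groups], dtype=float)
        trace = decay * trace + counts   # leaky spike count per cluster
        # A cluster firing steadily at saturation_freq saturates near 1.0.
        return trace * (1.0 - decay) / (sizes * saturation_freq * dt)

    return step
```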

class spark.nn.interfaces.ExponentialIntegratorConfig(**kwargs)[source]#

Bases: spark.nn.interfaces.output.base.OutputInterfaceConfig

ExponentialIntegrator configuration class.

num_outputs: int[source]#
saturation_freq: float[source]#
tau: float[source]#
shuffle: bool[source]#
smooth_trace: bool[source]#