Reference

Neurons

LIF

WaspNet.LIF - Type
LIF{T<:Number}<:AbstractNeuron

Contains the necessary parameters for describing a Leaky Integrate-and-Fire (LIF) neuron as well as the current membrane potential of the neuron.

Fields

  • τ::T: Neuron time constant (ms)
  • R::T: Neuronal model resistor (MOhms)
  • θ::T: Threshold voltage (mV) - when state exceeds this, firing occurs.
  • vSS::T: Steady-state voltage (mV) - in the absence of input, this is the resting membrane potential.
  • v0::T: Reset voltage (mV) - immediately after firing, state is set to this.
  • state::T: Current membrane potential (mV)

Different relative orders of threshold voltage, resting voltage, and reset voltage will produce different dynamics. The default values, with resting > threshold >> reset, allow for a baseline firing rate that can be modulated up or down. A construction sketch follows below.
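
A minimal construction sketch, assuming a keyword constructor with default values is available for LIF (analogous to the @with_kw constructor described for Izh below); the parameter values here are illustrative, not the package defaults.

using WaspNet

# Construct an LIF neuron with explicit (illustrative) parameters; any field
# not supplied is assumed to fall back to the package default. Note the
# ordering vSS > θ >> v0 discussed above.
lif = WaspNet.LIF(τ = 8.0, R = 10.0, θ = 30.0, vSS = 40.0, v0 = -55.0, state = -55.0)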

WaspNet.update! - Method
update!(neuron::LIF, input_update, dt, t)

Evolve an LIF neuron subject to a membrane potential step of size input_update for a time duration dt, starting from time t. A usage sketch follows the list of inputs below.

Inputs

  • input_update: Membrane input charge (pC)
  • dt: timestep duration (s)
  • t: global time (s)
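
A minimal usage sketch, assuming WaspNet.LIF() constructs a neuron with default parameters and that update! advances the neuron's state in place, as the ! convention suggests.

using WaspNet

n = WaspNet.LIF()            # assumed default-parameter constructor

# One 1 ms step at t = 0 s with a 1 pC input charge.
update!(n, 1.0, 0.001, 0.0)

n.state                      # membrane potential after the step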

Izh

WaspNet.Izh - Type
struct Izh{T<:Number}<:AbstractNeuron

Contains the vector of parameters [a, b, c, d, I, θ] necessary to simulate an Izhikevich neuron as well as the current state of the neuron.

The @with_kw macro is used to produce a constructor which accepts keyword arguments for all values. This neuron struct is immutable, therefore we store the state of the neuron in an Array so that its values can change while the parameters remain static. This represents a minimal example of an AbstractNeuron implementation that can be built into a Layer. A construction sketch follows the field list below.

Fields

  • a::T, b::T, c::T, d::T: Neuron parameters as described at https://www.izhikevich.org/publications/spikes.htm
  • I::T: Background current (mA)
  • θ::T: Threshold potential (mV)
  • v0::T: Reset voltage (mV)
  • u0::T: Reset recovery variable value
  • state::T: Vector holding the current (v,u) state of the neuron
  • output::T: Vector holding the current output of the neuron
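
A hedged construction sketch using the keyword constructor generated by @with_kw; the values correspond to Izhikevich's regular-spiking example and are illustrative rather than the package defaults.

using WaspNet

# Regular-spiking parameters from the Izhikevich paper; unspecified fields
# (I, θ, v0, u0, state, output) are assumed to keep their defaults.
izh = WaspNet.Izh(a = 0.02, b = 0.2, c = -65.0, d = 8.0)
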
WaspNet.reset - Method
reset(neuron::Izh)

Resets the state of the Izhikevich neuron to its initial values given by v0, u0

WaspNet.update - Method
update(neuron::Izh, input_update, dt, t)

Evolves the given Neuron subject to an input of input_update for a time duration dt, starting from time t, according to the equations defined in the Izhikevich paper: https://www.izhikevich.org/publications/spikes.htm

We use an Euler update for solving the set of differential equations for its computational efficiency and simplicity of implementation.
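
For reference, the sketch below shows one forward-Euler step of the standard Izhikevich equations as a standalone function; it is not WaspNet's internal implementation, just an illustration of the update being described.

# One forward-Euler step of the Izhikevich model: v is the membrane potential (mV),
# u the recovery variable, I the input current, dt the step size (ms in the
# canonical parameterization). Spiking resets v to c and increments u by d.
function izh_euler_step(v, u, I, a, b, c, d, θ, dt)
    dv = 0.04 * v^2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    v += dt * dv
    u += dt * du
    if v >= θ
        v = c
        u += d
    end
    return v, u
end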

Functional Neurons

WaspNet.Functional - Type
struct Functional{T<:Number, F<:Function}<:AbstractNeuron

A neuron type which applies some scalar function to its input and returns that value as both its state and output.

Fields

  • func::F: A scalar function to apply to all inputs
  • state::T: The last value computed by this neuron's function
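
If the type defines only the two fields above and no custom inner constructor, a positional construction like the sketch below should work; this is an assumption, not a documented constructor.

using WaspNet

# A hypothetical rectifying neuron: apply max(x, 0) to the input and store the
# result as the neuron's state/output.
relu_neuron = WaspNet.Functional(x -> max(x, 0.0), 0.0)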

Layers

WaspNet.Layer - Type
Layer{
    L<:AbstractNeuron, N<:Number, A<:AbstractArray{N,1}, M<:Union{AbstractArray{N,2}, Array{AbstractArray{N,2},1}}
    }<:AbstractLayer

Track a population of neurons of one AbstractNeuron type, the other Layers those neurons are connected to, and the incoming weights.

Fields

  • neurons::Array{L,1}: an array of neurons for the Layer
  • W<:Union{Matrix,AbstractBlockArray}: either a Matrix or BlockArray containing weights for inputs from incoming layers
  • conns: either [] or Array{Int,1} indicating which Layers in the Network are connected as inputs to this Layer
  • input::Array{N,1}: a pre-allocated array of zeros for staging inputs to the layer
  • output::Array{N,1}: a pre-allocated array for staging outputs from this layer
WaspNet.Layer - Type
Layer(neurons, W[, conns = Array{Int,1}()])

Constructs a Layer with constituent neurons which accept inputs from the Layers denoted by conns (input 1 is the Network input) and either a BlockArray of weights if length(conns) > 1 or a Matrix of weights otherwise.
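
A feed-forward construction sketch, mirroring the package README; WaspNet.LIF() with default parameters is assumed.

using WaspNet, Random

Random.seed!(0)

# 4 LIF neurons driven by a 2-dimensional Network input, so W is 4x2.
# With no conns supplied, the Layer defaults to feed-forward wiring.
layer1 = Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))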

WaspNet.Layer - Method
Layer(neurons, W, conns, N_neurons, input, output)

Default non-parametric constructor for Layer, which pre-processes the inputs and computes the parametric types.

WaspNet.reset! - Method
reset!(l::AbstractLayer)

Reset all of the neurons in l to the state defined by their reset! function.

WaspNet.update! - Method
function update!(l::Layer{L,N,A,M}, input, dt, t) where {L,N,A,T, M<:AbstractArray{T,1}}

Evolve the state of all of the neurons in the Layer a duration dt, starting from time t, subject to a set of inputs from all Network layers in input.

Not all arrays within input are used; we iterate over l.conns to select the appropriate inputs to this Layer, and the corresponding Blocks from l.W are used to calculate the net Layer input.

WaspNet.update! - Method
function update!(l::Layer, input, dt, t)

Evolve the state of all of the neurons in the Layer a duration dt, starting from time t, subject to a set of inputs from all Network layers in input.

This (default) method assumes a feed-forward, non-BlockArray representation for l.W

Arguments

  • l::Layer: the Layer to be evolved
  • input: an Array of Arrays of output values from other Layers potentially being input to l
  • dt: the time step to evolve the Layer
  • t: the time at the start of the current time step
WaspNet.update! - Method
function update!(l::Layer{L,N,A,M}, input, dt, t)

Evolve the state of all of the neurons in the Layer a duration dt, starting from time t, subject to a set of inputs from all Network layers in input.

Not all arrays within input are used; we iterate over l.conns to select the appropriate inputs to this Layer, and the corresponding Blocks from l.W are used to calculate the net Layer input.

Networks

WaspNet.Network - Type
mutable struct Network<:AbstractNetwork

Contains constituent Layers, orchestrates the movement of signals between Layers, and handles first-layer input.

Fields

  • layers::Array{AbstractLayer,1}: Array of Layers ordered from 1 to N for N layers
  • N_in::Int: Number of input dimensions to the first Layer
  • prev_outputs::Vector: Vector of vectors sized to hold the output from each Layer
WaspNet.Network - Method
function Network(layers, N_in::Int)

Given an array of Layers and the dimensionality of the input to the network, make a new Network which is a copy of each Layer with weights converted to BlockArray format.

The output dimensionality is inferred from the final Layer in layers.

WaspNet.Network - Method
function Network(layers::Array{L, 1}) where L <: AbstractLayer

Given an array of Layers, constructs the Network resulting from connecting the Layers with their specified conns.

The input dimensionality is inferred from the size of the weight matrices for the first Layer in the layers array.
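
A two-layer construction sketch, again assuming WaspNet.LIF() default construction; the second Layer's weights are 3x4 because it receives input from the 4 neurons in the first Layer.

using WaspNet, Random

Random.seed!(0)

layer1 = Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))   # driven by the 2-D Network input
layer2 = Layer([WaspNet.LIF() for _ in 1:3], randn(3, 4))   # driven by layer1's 4 outputs

net = Network([layer1, layer2], 2)   # 2 = N_in, the Network input dimensionality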

Simulations

WaspNet.SimulationResult - Type
struct SimulationResult{
    OT<:AbstractArray{<:Number,2}, ST<:AbstractArray{<:Number, 2}, TT<:AbstractArray{<:Real,1}
    }<:AbstractSimulation

Contains simulation results from simulating a Network for a specific length of time with simulate!

Fields

  • outputs::OT: A Matrix containing the output of all simulated neurons at every time step.
  • states::ST: A Matrix containing the state of all simulated neurons at every time step. If states were not tracked, an Nx0 dimensional Matrix.
  • times::TT: An Array of times at which the WaspnetElement was sampled.
WaspNet.SimulationResult - Method
SimulationResult(element::EL, times::TT) where {EL<:WaspnetElement,TT<:AbstractArray{<:Real, 1}}

Given a WaspnetElement and the times at which to simulate the element, construct the SimulationResult instance to store the results of the simulation.

WaspNet.sim_update! - Method
function sim_update!(ne::WaspnetElement, input_update, dt, t)

Generic function for wrapping calls to update! from simulate!

WaspNet.sim_update! - Method
function sim_update!(neuron::AbstractNeuron, input_update::AbstractArray{T,N}, dt, t) where {T<:Number, N}

Wrapper to ensure that if a 1D array is passed to update a neuron, it is converted to a scalar first

WaspNet.simulate! - Function
simulate!(element::WaspnetElement, input::Function, dt, tf, t0 = 0.; track_state=false, kwargs...)

Simulates the supplied WaspnetElement subject to a function of time, input, by sampling input at the chosen sample times, and returns the relevant SimulationResult instance.
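
A usage sketch with a function-of-time input; the network construction assumes WaspNet.LIF() defaults, and the SimulationResult fields are those documented above.

using WaspNet, Random

Random.seed!(0)

layer1 = Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))
layer2 = Layer([WaspNet.LIF() for _ in 1:3], randn(3, 4))
net    = Network([layer1, layer2], 2)

# Drive the 2-dimensional input with a random signal, simulating from t = 0
# to t = 0.25 with dt = 0.001 and recording neuron states as well as outputs.
sim = simulate!(net, t -> rand(2), 0.001, 0.25, track_state = true)

sim.outputs   # outputs of every neuron at every time step
sim.states    # neuron states (populated because track_state = true)
sim.times     # the sampled time points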

WaspNet.simulate! - Function
simulate!(element::WaspnetElement, input::Matrix, dt, tf, t0 = 0.; track_state=false, kwargs...)

Simulates the supplied WaspnetElement subject to some pre-sampled input, where each column is one time step, and returns the relevant SimulationResult instance.
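
A sketch with a pre-sampled input matrix; the number of required columns is assumed to be roughly (tf - t0)/dt and may differ by one depending on how simulate! discretizes the interval.

using WaspNet, Random

Random.seed!(0)

layer = Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))
net   = Network([layer], 2)

# One column per time step: ~0.1 / 0.001 = 100 steps (see the caveat above).
inputs = rand(2, 100)
sim = simulate!(net, inputs, 0.001, 0.1)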

Utilities

General Utilities

Pruning

WaspNet.delete_entries - Method
function delete_entries(W, entries; axis::Int = 1)

Given an AbstractArray, deletes the specified entries (e.g. rows or columns) along the given axis; used for pruning weight matrices.

As an example, delete_entries(W, [3,4]; axis = 2) would delete columns 3 and 4 from W and return the modified W.
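
A concrete sketch of the example above, assuming delete_entries is called via the WaspNet module rather than being exported.

using WaspNet

W = [1 2  3  4;
     5 6  7  8;
     9 10 11 12]

# axis = 2 selects columns, so columns 3 and 4 are removed.
W_pruned = WaspNet.delete_entries(W, [3, 4]; axis = 2)
# W_pruned == [1 2; 5 6; 9 10]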

WaspNet.prune - Method
function prune(el::WaspnetElement, layers, neurons[, l_idx])

Given an element el along with indices for target Neurons, constructs new Layers and Networks with all references to those neurons removed by deleting rows and columns from the proper weight matrices in each Layer.

layers should be an array of indices relative to the Network it is being pruned in; neurons should be an array of arrays of indices where the entries in each inner array are indices of neurons within the respective Layer from layers.

Arguments

  • el::WaspnetElement: The element to prune neurons from, either a Network or Layer
  • layers: A list of indices specifying which Layers, within the Network where el resides, to remove neurons from
  • neurons: A list of lists of neuron indices to remove from the respective Layers given in layers.
  • l_idx: If prune is called on a Layer, l_idx denotes the index of that Layer if it were to appear in the list layers

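A pruning sketch on a small Network; the construction again assumes WaspNet.LIF() defaults, and prune is called via the WaspNet module in case it is not exported.

using WaspNet, Random

Random.seed!(0)

layer1 = Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))
layer2 = Layer([WaspNet.LIF() for _ in 1:3], randn(3, 4))
net    = Network([layer1, layer2], 2)

# Remove neurons 1 and 3 from Layer 1 and neuron 2 from Layer 2; a new Network
# is returned with the corresponding weight-matrix rows and columns deleted.
pruned_net = WaspNet.prune(net, [1, 2], [[1, 3], [2]])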