Reference
Neurons
LIF
WaspNet.LIF — Type: `LIF{T<:Number} <: AbstractNeuron`
Contains the necessary parameters for describing a Leaky Integrate-and-Fire (LIF) neuron as well as the current membrane potential of the neuron.
Fields
- `τ::T`: Neuron time constant (ms)
- `R::T`: Neuronal model resistance (MOhms)
- `θ::T`: Threshold voltage (mV) - when the state exceeds this, firing occurs.
- `vSS::T`: Steady-state voltage (mV) - in the absence of input, this is the resting membrane potential.
- `v0::T`: Reset voltage (mV) - immediately after firing, the state is set to this.
- `state::T`: Current membrane potential (mV)
Different relative orderings of the threshold, resting, and reset voltages produce different dynamics. The default ordering of resting > threshold >> reset allows for a baseline firing rate that can be modulated up or down.
WaspNet.update! — Method: `update!(neuron::LIF, input_update, dt, t)`

Evolve an `LIF` neuron subject to a membrane potential step of size `input_update`, over a time duration `dt`, starting from time `t`.
Inputs
- `input_update`: Membrane input charge (pC)
- `dt`: Timestep duration (s)
- `t`: Global time (s)
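As a sketch, a single LIF neuron can be constructed and stepped manually. The keyword constructor and the specific parameter values below are illustrative assumptions (the field names come from the docstring above; the ordering follows the default `vSS > θ >> v0` described there):

```julia
using WaspNet

# Illustrative LIF parameters (hypothetical values): resting potential above
# threshold so the neuron fires at a baseline rate, per the note above.
neuron = WaspNet.LIF(τ = 8.0, R = 10.0, θ = -55.0, vSS = -50.0, v0 = -100.0, state = -100.0)

# Step the neuron with a small input charge for one timestep.
WaspNet.update!(neuron, 0.2, 0.001, 0.0)
```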
Izh
WaspNet.Izh — Type: `struct Izh{T<:Number} <: AbstractNeuron`
Contains the vector of parameters [a, b, c, d, I, θ] necessary to simulate an Izhikevich neuron, as well as the current state of the neuron.
The `@with_kw` macro is used to produce a constructor which accepts keyword arguments for all values. This neuron struct is immutable; therefore we store the state of the neuron in an `Array` so that its values can change while the parameters remain static. This represents a minimal example of an `AbstractNeuron` implementation that can be built into a `Layer`.
Fields
- `a::T` - `d::T`: Neuron parameters as described at https://www.izhikevich.org/publications/spikes.htm
- `I::T`: Background current (mA)
- `θ::T`: Threshold potential (mV)
- `v0::T`: Reset voltage (mV)
- `u0::T`: Reset recovery variable value
- `state::T`: Vector holding the current (v, u) state of the neuron
- `output::T`: Vector holding the current output of the neuron
WaspNet.reset — Method: `reset(neuron::Izh)`

Resets the state of the Izhikevich neuron to its initial values given by `v0` and `u0`.
WaspNet.update — Method: `update(neuron::Izh, input_update, dt, t)`

Evolves the given `Neuron` subject to an input of `input_update`, over a time duration `dt`, starting from time `t`, according to the equations defined in the Izhikevich paper https://www.izhikevich.org/publications/spikes.htm.
We use an Euler update to solve the set of differential equations, for its computational efficiency and simplicity of implementation.
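For reference, one Euler step of the Izhikevich equations can be sketched in plain Julia. This is a minimal illustration of the model from the paper, not WaspNet's actual implementation; the parameter names mirror the fields listed above, and the defaults are the regular-spiking values from the paper:

```julia
# One Euler step of the Izhikevich model:
#   dv/dt = 0.04v^2 + 5v + 140 - u + I
#   du/dt = a(b*v - u)
# with the reset rule v ← c, u ← u + d when v ≥ θ.
function izh_euler_step(v, u; a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, θ=30.0, dt=0.1)
    v += dt * (0.04v^2 + 5v + 140 - u + I)
    u += dt * a * (b * v - u)
    spiked = v >= θ
    if spiked
        v = c      # reset membrane potential
        u = u + d  # bump the recovery variable
    end
    return v, u, spiked
end
```

Iterating this step over time reproduces the spiking regimes catalogued in the paper as `a`-`d` vary.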
Functional Neurons
WaspNet.Functional — Type: `struct Functional{T<:Number, F<:Function} <: AbstractNeuron`
A neuron type which applies some scalar function to its input and returns that value as both its state and output.
Fields
- `func::F`: A scalar function to apply to all inputs
- `state::T`: The last value computed by this neuron's function
Layers
WaspNet.Layer — Type: `Layer{L<:AbstractNeuron, N<:Number, A<:AbstractArray{N,1}, M<:Union{AbstractArray{N,2}, Array{AbstractArray{N,2},1}}} <: AbstractLayer`
Tracks a population of neurons of one `AbstractNeuron` type, the other `Layer`s those neurons are connected to, and the incoming weights.
Fields
- `neurons::Array{L,1}`: an array of neurons for the `Layer`
- `W<:Union{Matrix,AbstractBlockArray}`: either a `Matrix` or `BlockArray` containing weights for inputs from incoming layers
- `conns`: either `[]` or `Array{Int,1}` indicating which `Layer`s in the `Network` are connected as inputs to this `Layer`
- `input::Array{N,1}`: a pre-allocated array of zeros for staging inputs to the layer
- `output::Array{N,1}`: a pre-allocated array for staging outputs from this layer
WaspNet.Layer — Type: `Layer(neurons, W[, conns = Array{Int,1}()])`

Constructs a `Layer` with constituent `neurons` which accept inputs from the `Layer`s denoted by `conns` (input 1 is the `Network` input), and either a `BlockArray` of weights if `length(conns) > 1` or a `Matrix` of weights otherwise.
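A minimal construction sketch, assuming a zero-argument keyword constructor for `LIF` (an assumption; see the `LIF` fields above) and a dense feed-forward weight matrix:

```julia
using WaspNet

# Four LIF neurons receiving a 2-dimensional input through a 4x2 weight matrix.
# Leaving `conns` at its default makes this a simple feed-forward layer.
neurons = [WaspNet.LIF() for _ in 1:4]   # assumes keyword defaults exist
W = randn(4, 2)
layer = WaspNet.Layer(neurons, W)
```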
WaspNet.Layer — Method: `Layer(neurons, W, conns, N_neurons, input, output)`

Default non-parametric constructor for `Layer`s, for pre-processing inputs and computing parametric types.
WaspNet.get_neuron_count — Method: `get_neuron_count(l::AbstractLayer)`

Return the number of neurons in the given `Layer`.
WaspNet.get_neuron_outputs — Method: `get_neuron_outputs(l::AbstractLayer)`

Return the current output of `l`'s constituent neurons.
WaspNet.get_neuron_states — Method: `get_neuron_states(l::AbstractLayer)`

Return the current state of `l`'s constituent neurons.
WaspNet.reset! — Method: `reset!(l::AbstractLayer)`

Reset all of the neurons in `l` to the state defined by their `reset!` function.
WaspNet.update! — Method: `update!(l::Layer{L,N,A,M}, input, dt, t) where {L,N,A, M<:AbstractArray{T,1}}`

Evolve the state of all of the neurons in the `Layer` for a duration `dt`, starting from time `t`, subject to a set of inputs from all `Network` layers in `input`.
Not all arrays within `input` are used; we iterate over `l.conns` to select the appropriate inputs to this `Layer`, and the corresponding `Block`s from `l.W` are used to calculate the net `Layer` input.
WaspNet.update! — Method: `update!(l::Layer, input, dt, t)`

Evolve the state of all of the neurons in the `Layer` for a duration `dt`, starting from time `t`, subject to a set of inputs from all `Network` layers in `input`.
This (default) method assumes a feed-forward, non-`BlockArray` representation for `l.W`.
Arguments
- `l::Layer`: the `Layer` to be evolved
- `input`: an `Array` of `Array`s of output values from other `Layer`s potentially being input to `l`
- `dt`: the time step to evolve the `Layer`
- `t`: the time at the start of the current time step
WaspNet.update! — Method: `update!(l::Layer{L,N,A,M}, input, dt, t)`

Evolve the state of all of the neurons in the `Layer` for a duration `dt`, starting from time `t`, subject to a set of inputs from all `Network` layers in `input`.
Not all arrays within `input` are used; we iterate over `l.conns` to select the appropriate inputs to this `Layer`, and the corresponding `Block`s from `l.W` are used to calculate the net `Layer` input.
Networks
WaspNet.Network — Type: `mutable struct Network <: AbstractNetwork`

Contains constituent `Layer`s, orchestrates the movement of signals between `Layer`s, and handles first-layer input.
Fields
- `layers::Array{AbstractLayer,1}`: Array of `Layer`s ordered from 1 to N for N layers
- `N_in::Int`: Number of input dimensions to the first `Layer`
- `prev_outputs::Vector`: Vector of vectors sized to hold the output from each `Layer`
WaspNet.Network — Method: `Network(layers, N_in::Int)`

Given an array of `Layer`s and the dimensionality of the input to the network, makes a new `Network` which is a copy of each `Layer` with weights converted to `BlockArray` format.
The network input dimensionality is given by `N_in`.
WaspNet.Network — Method: `Network(layers::Array{L, 1}) where L <: AbstractLayer`

Given an array of `Layer`s, constructs the `Network` resulting from connecting the `Layer`s with their specified `conns`.
The input dimensionality is inferred from the size of the weight matrices for the first `Layer` in the `layers` array.
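Putting the constructors together, a small two-layer network might be assembled as follows. This is a sketch under the assumption that `LIF()` provides sensible keyword defaults:

```julia
using WaspNet

# Two feed-forward layers: 2 inputs -> 4 LIF neurons -> 2 LIF neurons.
layer1 = WaspNet.Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))
layer2 = WaspNet.Layer([WaspNet.LIF() for _ in 1:2], randn(2, 4))

# N_in = 2 is the dimensionality of the external input to the first layer.
net = WaspNet.Network([layer1, layer2], 2)
```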
Simulations
WaspNet.SimulationResult — Type: `struct SimulationResult{OT<:AbstractArray{<:Number,2}, ST<:AbstractArray{<:Number,2}, TT<:AbstractArray{<:Real,1}} <: AbstractSimulation`

Contains simulation results from simulating a `Network` for a specific length of time with `simulate!`.
Fields
- `outputs::OT`: A `Matrix` containing the output of all simulated neurons at every time step.
- `states::ST`: A `Matrix` containing the state of all simulated neurons at every time step. If states were not tracked, an Nx0 dimensional `Matrix`.
- `times::TT`: An `Array` of times at which the `WaspnetElement` was sampled.
WaspNet.SimulationResult — Method: `SimulationResult(element::EL, times::TT) where {EL<:WaspnetElement, TT<:AbstractArray{<:Real,1}}`

Given a `WaspnetElement` and the times at which to simulate the element, constructs the `SimulationResult` instance to store the results of the simulation.
WaspNet.sim_update! — Method: `sim_update!(ne::WaspnetElement, input_update, dt, t)`

Generic function for wrapping calls to `update!` from `simulate!`.
WaspNet.sim_update! — Method: `sim_update!(neuron::AbstractNeuron, input_update::AbstractArray{T,N}, dt, t) where {T<:Number, N}`

Wrapper to ensure that if a 1D array is passed to update a neuron, it is converted to a scalar first.
WaspNet.simulate! — Function: `simulate!(element::WaspnetElement, input::Function, dt, tf, t0 = 0.; track_state=false, kwargs...)`

Simulates the supplied `WaspnetElement` subject to a function of time, `input`, by sampling `input` at the chosen sample times, and returns the relevant `SimulationResult` instance.
WaspNet.simulate! — Function: `simulate!(element::WaspnetElement, input::Matrix, dt, tf, t0 = 0.; track_state=false, kwargs...)`

Simulates the supplied `WaspnetElement` subject to some pre-sampled `input`, where each column is one time step, and returns the relevant `SimulationResult` instance.
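A simulation sketch combining the pieces above; the constant-input closure and the network layout are illustrative assumptions:

```julia
using WaspNet

# A small feed-forward network: 2 inputs -> 4 LIF neurons.
net = WaspNet.Network([WaspNet.Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2))], 2)

# Drive it with a constant 2-dimensional input from t0 = 0 to tf = 250
# with dt = 0.1, tracking neuron states as well as outputs.
result = WaspNet.simulate!(net, t -> [0.5, 0.5], 0.1, 250.0; track_state = true)

result.outputs  # Matrix of neuron outputs, one column per time step
result.times    # sample times
```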
Utilities
General Utilities
Pruning
WaspNet.delete_entries — Method: `delete_entries(W, entries; axis::Int = 1)`

Given an `AbstractArray`, deletes the specified `entries` (e.g. rows or columns) along the given axis; used for pruning weight matrices.
As an example, `delete_entries(W, [3,4]; axis = 2)` would delete columns 3 and 4 from `W` and return the modified `W`.
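The example above can be spelled out concretely (assuming `W` is an ordinary `Matrix`):

```julia
using WaspNet

W = reshape(collect(1.0:20.0), 4, 5)            # a 4x5 weight matrix
W2 = WaspNet.delete_entries(W, [3, 4]; axis = 2)
size(W2)  # expected (4, 3): columns 3 and 4 removed, per the docstring
```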
WaspNet.prune — Method: `prune(el::WaspnetElement, layers, neurons[, l_idx])`

Given an element `el` along with indices for target `Neuron`s, constructs new `Layer`s and `Network`s with all references to those neurons removed, by deleting rows and columns from the proper weight matrices in each `Layer`.
`layers` should be an array of indices relative to the `Network` it is being pruned in; `neurons` should be an array of arrays of indices, where the entries in each inner array are indices of neurons within the respective `Layer` from `layers`.
Arguments
- `el::WaspnetElement`: The element to prune neurons from, either a `Network` or a `Layer`
- `layers`: A list of indices for which `Layer`s we're removing neurons from, relative to the `Network` where they reside
- `neurons`: A list of lists of neurons to remove, matching the respective entries in `layers`
- `l_idx`: If `prune` is called on a `Layer`, `l_idx` denotes the index of that `Layer` if it were to appear in the list `layers`
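An illustrative pruning call on a network; the network layout and the layer/neuron indices below are arbitrary:

```julia
using WaspNet

# A 2 -> 4 -> 3 feed-forward network of LIF neurons (keyword defaults assumed).
net = WaspNet.Network(
    [WaspNet.Layer([WaspNet.LIF() for _ in 1:4], randn(4, 2)),
     WaspNet.Layer([WaspNet.LIF() for _ in 1:3], randn(3, 4))],
    2)

# Remove neurons 1 and 3 from layer 1, and neuron 2 from layer 2; the
# corresponding rows/columns of each weight matrix are deleted.
pruned = WaspNet.prune(net, [1, 2], [[1, 3], [2]])
```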