Tuning curves#

With Pynapple you can easily compute n-dimensional tuning curves (for example, firing rate as a function of 1D angular direction or firing rate as a function of 2D position). It is also possible to compute average firing rate for different epochs (for example, firing rate for different epochs of stimulus presentation).


import pynapple as nap
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import xarray as xr
from pprint import pprint
custom_params = {"axes.spines.right": False, "axes.spines.top": False}
sns.set_theme(style="ticks", palette="colorblind", font_scale=1.5, rc=custom_params)
xr.set_options(display_expand_attrs=False);

From timestamps or continuous activity#

Computing tuning curves is done using compute_tuning_curves.

When computing from a general time series, the mandatory arguments are:

  • data: a TsGroup (or single Ts) or TsdFrame (or single Tsd) containing the neural activity of one or more units.

  • features: a Tsd or TsdFrame containing one or more features.

By default, 10 bins are used for all features, but you can specify the number of bins, or the bin edges explicitly, using the bins argument.

The min and max of the tuning curves are by default the minima and maxima of the features. This can be tweaked with the range argument.

If an IntervalSet is passed via the epochs argument, everything is restricted to those epochs; otherwise, the time support of the features is used.

If you do not want the sampling rate of the features to be estimated from the timestamps, you can pass it explicitly using the fs argument.

You can also pass a list of strings to label each feature dimension via feature_names (by default, the column names of the features are used).

The output is an xarray.DataArray in which the first dimension represents the units and further dimensions represent the features. The occupancy and bin edges are stored as attributes.

If you explicitly want a pd.DataFrame as output (which is only possible when you have just one feature), you can set return_pandas=True. Note that this will not return the occupancy and bin edges.
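For intuition, the core computation behind a 1D tuning curve is a normalized histogram: count the spikes that fall in each feature bin and divide by the time the feature spent there. Here is a minimal numpy sketch of that idea (a hypothetical helper, not Pynapple's implementation; it assumes regularly sampled features and a rough nearest-sample lookup):

```python
import numpy as np

def tuning_curve_1d(spike_times, feature_times, feature_values,
                    bins=10, range=None, dt=None):
    """Sketch of a 1D tuning curve: spike histogram / occupancy time."""
    if dt is None:
        dt = np.median(np.diff(feature_times))  # estimate sampling period
    # index of the feature sample at or just after each spike
    # (a rough nearest-value lookup)
    idx = np.searchsorted(feature_times, spike_times)
    idx = idx.clip(0, len(feature_times) - 1)
    spike_features = feature_values[idx]
    counts, edges = np.histogram(spike_features, bins=bins, range=range)
    occupancy, _ = np.histogram(feature_values, bins=edges)
    with np.errstate(invalid="ignore", divide="ignore"):
        rate = counts / (occupancy * dt)  # Hz; NaN where a bin was never visited
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, rate
```

This also makes the role of the occupancy attribute clear: it is the denominator that turns counts into rates.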

1D tuning curves from spikes#

We will start by simulating some spiking units modulated by a 1D circular variable:

# Feature
T = 500
dt_feature = 0.02
times_feature = np.arange(0, T, dt_feature)
feature = nap.Tsd(
    t=times_feature, d=np.pi + np.pi * np.cos(2 * np.pi * times_feature / 10)
)

# Spikes
N = 6
max_rate = 20
dt_spikes = 0.002
feature_interp = feature.interpolate(nap.Ts(np.arange(0, T, dt_spikes)))
centers = np.linspace(0, 2 * np.pi, N, endpoint=False)
rates = max_rate * np.exp(
    -10 * (np.sin((feature_interp.d[:, np.newaxis] - centers) / 2)) ** 2
)
tsgroup_1d = nap.TsGroup(
    {
        i + 1: nap.Ts(
            feature_interp.t[np.random.poisson(rates[:, i] * dt_spikes) > 0]
        )
        for i in range(N)
    },
)

We now have all the ingredients to compute tuning curves:

tuning_curves_1d = nap.compute_tuning_curves(
    data=tsgroup_1d,
    features=feature,
    bins=120,
    range=(0, 2 * np.pi),
    feature_names=["feature"],
)
tuning_curves_1d
<xarray.DataArray (unit: 6, feature: 120)> Size: 6kB
array([[20.37931034, 20.75      , 19.7       , 20.5       , 19.33333333,
        18.33333333, 13.66666667, 14.5       , 12.5       , 10.75      ,
         7.5       ,  8.83333333,  6.        ,  6.25      ,  6.75      ,
         5.        ,  3.5       ,  1.25      ,  3.        ,  2.        ,
         0.75      ,  1.25      ,  1.        ,  0.5       ,  0.        ,
         0.25      ,  0.5       ,  0.        ,  0.        ,  0.5       ,
         0.5       ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.5       ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.25      ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.5       ,  0.25      ,  0.5       ,  0.75      ,
         0.25      ,  1.        ,  1.        ,  1.        ,  1.        ,
...
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.25      ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
         0.        ,  0.        ,  0.        ,  0.        ,  0.25      ,
         0.        ,  0.25      ,  0.        ,  0.        ,  0.        ,
         1.        ,  0.        ,  1.25      ,  2.5       ,  2.        ,
         1.5       ,  2.5       ,  4.        ,  5.        ,  5.5       ,
         6.        ,  7.25      ,  7.        ,  6.75      ,  7.        ,
        11.75      , 11.5       , 10.75      , 15.        , 16.        ,
        14.75      , 16.5       , 16.75      , 21.5       , 23.        ,
        19.        , 18.        , 18.25      , 20.5       , 16.25      ,
        16.        , 16.5       , 14.5       , 12.5       ,  9.25      ,
         7.        ,  8.33333333,  5.66666667,  4.66666667,  6.33333333,
         3.        ,  3.375     ,  3.6       ,  1.83333333,  2.20689655]])
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * feature  (feature) float64 960B 0.02618 0.07854 0.1309 ... 6.152 6.205 6.257
Attributes: (4)

The xarray.DataArray can be treated like a numpy array.

It has a shape:

tuning_curves_1d.shape
(6, 120)

It can be sliced:

tuning_curves_1d[1, 2:8]
<xarray.DataArray (feature: 6)> Size: 48B
array([2.6       , 3.        , 5.        , 4.33333333, 5.        ,
       6.33333333])
Coordinates:
  * feature  (feature) float64 48B 0.1309 0.1833 0.2356 0.288 0.3403 0.3927
    unit     int64 8B 2
Attributes: (4)

It can also be indexed using the coordinates:

tuning_curves_1d.sel(unit=1)
<xarray.DataArray (feature: 120)> Size: 960B
array([20.37931034, 20.75      , 19.7       , 20.5       , 19.33333333,
       18.33333333, 13.66666667, 14.5       , 12.5       , 10.75      ,
        7.5       ,  8.83333333,  6.        ,  6.25      ,  6.75      ,
        5.        ,  3.5       ,  1.25      ,  3.        ,  2.        ,
        0.75      ,  1.25      ,  1.        ,  0.5       ,  0.        ,
        0.25      ,  0.5       ,  0.        ,  0.        ,  0.5       ,
        0.5       ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.5       ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.25      ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.5       ,  0.25      ,  0.5       ,  0.75      ,
        0.25      ,  1.        ,  1.        ,  1.        ,  1.        ,
        3.5       ,  2.        ,  1.5       ,  5.25      ,  4.        ,
        4.5       ,  8.        ,  7.25      ,  8.66666667, 10.        ,
       11.75      , 12.16666667, 13.83333333, 16.5       , 13.33333333,
       13.83333333, 18.125     , 20.4       , 18.75      , 19.17241379])
Coordinates:
  * feature  (feature) float64 960B 0.02618 0.07854 0.1309 ... 6.152 6.205 6.257
    unit     int64 8B 1
Attributes: (4)

xarray further has matplotlib support, allowing for easy visualization:

tuning_curves_1d.plot.line(x="feature", add_legend=False)
plt.ylabel("firing rate [Hz]");

You can either customize the plot labels yourself using matplotlib, or you can set them in the tuning curve object:

tuning_curves_1d.name = "firing rate"
tuning_curves_1d.attrs["unit"] = "Hz"
tuning_curves_1d.coords["feature"].attrs["unit"] = "rad"
tuning_curves_1d.plot.line(x="feature", add_legend=False);

Internally, compute_tuning_curves calls the value_from method, which maps each timestamp to the value closest in time in a Tsd object. We can therefore validate the tuning curves by displaying the timestamps together with their associated feature values.


plt.figure()
plt.subplot(121)
plt.plot(tsgroup_1d[3].value_from(feature), 'o')
plt.plot(feature, label="feature")
plt.ylabel("feature")
plt.xlim(0, 20)
plt.xlabel("Time [s]")
plt.subplot(122)
plt.plot(
    tuning_curves_1d[3].values,
    tuning_curves_1d.coords["feature"],
    label="tuning curve (unit 3)",
)
plt.xlabel("firing rate [Hz]")
plt.legend();
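value_from can be understood as a nearest-neighbor lookup in time. A pure-numpy sketch of the idea (a hypothetical helper, not Pynapple's actual implementation):

```python
import numpy as np

def value_from(event_times, sample_times, sample_values):
    """For each event, return the sample value closest in time (numpy sketch)."""
    # insertion index of each event into the sorted sample times
    idx = np.searchsorted(sample_times, event_times)
    idx = idx.clip(1, len(sample_times) - 1)
    # pick whichever of the two neighboring samples is nearer in time
    left_closer = (event_times - sample_times[idx - 1]) < (sample_times[idx] - event_times)
    idx = idx - left_closer
    return sample_values[idx]
```

Pynapple's actual method additionally respects time supports and epochs, which this sketch ignores.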

It is also possible to get just the spike counts per bin by setting the argument return_counts=True. The output is again an xarray.DataArray with the same dimensions as the tuning curves.

spike_counts = nap.compute_tuning_curves(
    data=tsgroup_1d,
    features=feature,
    bins=30, 
    range=(0, 2*np.pi),
    feature_names=["feature"],
    return_counts=True
)


plt.figure()
plt.subplot(131)
plt.plot(tsgroup_1d[3].value_from(feature), 'o')
plt.plot(feature, label="feature")
plt.ylabel("feature")
plt.xlim(0, 20)
plt.xlabel("time [s]")
plt.subplot(132)
plt.plot(tuning_curves_1d[3].values, tuning_curves_1d.coords["feature"])
plt.xlabel("firing rate [Hz]")
plt.subplot(133)
plt.barh(
    spike_counts.coords["feature"],
    width=spike_counts[3].values,
    height=np.mean(np.diff(spike_counts.coords["feature"])),
)
plt.xlabel("spike count")
plt.tight_layout()

2D tuning curves from spikes#

Now, let us simulate some spiking units modulated by a 2D circular variable:

features = nap.TsdFrame(
    t=times_feature,
    d=np.stack(
        [
            np.cos(2 * np.pi * times_feature / 10),
            np.sin(2 * np.pi * times_feature / 10),
        ],
        axis=1,
    ),
    columns=["a", "b"],
)
features_interp = features.interpolate(nap.Ts(np.arange(0, T, dt_spikes)))
alpha = (
    np.arctan2(features_interp["b"].values, features_interp["a"].values)
    / np.pi
)

N = 6
centers_2d = np.linspace(-1, 1, N)
rates_2d = (
    max_rate
    * np.exp(50.0 * np.cos(alpha[:, np.newaxis] - centers_2d))
    / np.exp(50.0)
)
tsgroup_2d = nap.TsGroup(
    {
        i + 1: nap.Ts(
            features_interp.t[
                np.random.poisson(rates_2d[:, i] * dt_spikes) > 0
            ]
        )
        for i in range(N)
    },
)

If you pass more than one feature, a multi-dimensional tuning curve is computed:

tuning_curves_2d = nap.compute_tuning_curves(
    data=tsgroup_2d, 
    features=features,
    bins=(5,5),
    range=[(-1, 1), (-1, 1)],
    feature_names=["a", "b"]
)
tuning_curves_2d
<xarray.DataArray (unit: 6, a: 5, b: 5)> Size: 1kB
array([[[ 4.36363636, 12.05714286,  8.36363636,  0.        ,
          0.        ],
        [ 0.88571429,         nan,         nan,         nan,
          0.        ],
        [ 0.03030303,         nan,         nan,         nan,
          0.        ],
        [ 0.        ,         nan,         nan,         nan,
          0.        ],
        [ 0.        ,  0.        ,  0.        ,  0.        ,
          0.        ]],

       [[10.36363636,  3.85714286,  0.48484848,  0.        ,
          0.        ],
        [17.42857143,         nan,         nan,         nan,
          0.        ],
        [16.06060606,         nan,         nan,         nan,
          0.        ],
        [ 4.71428571,         nan,         nan,         nan,
          0.        ],
        [ 0.86363636,  0.08571429,  0.        ,  0.        ,
...
         11.09090909],
        [ 0.        ,         nan,         nan,         nan,
         16.8       ],
        [ 0.        ,         nan,         nan,         nan,
         13.66666667],
        [ 0.        ,         nan,         nan,         nan,
          6.17142857],
        [ 0.        ,  0.        ,  0.        ,  0.22857143,
          0.77272727]],

       [[ 0.        ,  0.        ,  9.        , 13.28571429,
          3.68181818],
        [ 0.        ,         nan,         nan,         nan,
          0.77142857],
        [ 0.        ,         nan,         nan,         nan,
          0.09090909],
        [ 0.        ,         nan,         nan,         nan,
          0.        ],
        [ 0.        ,  0.        ,  0.        ,  0.        ,
          0.        ]]])
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * a        (a) float64 40B -0.8 -0.4 1.11e-16 0.4 0.8
  * b        (b) float64 40B -0.8 -0.4 1.11e-16 0.4 0.8
Attributes: (4)

tuning_curves_2d is again an xarray.DataArray, but now with three dimensions: one for the units of the TsGroup and two for the features; the coordinates contain the bin centers. Bins that the feature never visited are assigned a NaN value.
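The NaN handling can be made concrete with a small numpy sketch of the 2D case (a hypothetical helper, not Pynapple's implementation): occupancy and spike counts share the same bin edges, and bins with zero occupancy become NaN rather than being divided by zero.

```python
import numpy as np

def tuning_curve_2d(spike_xy, feature_xy, bins, dt):
    """2D sketch: spike counts / occupancy time; unvisited bins become NaN."""
    # occupancy histogram defines the bin edges
    occ, xe, ye = np.histogram2d(feature_xy[:, 0], feature_xy[:, 1], bins=bins)
    # spike counts use the same edges
    counts, _, _ = np.histogram2d(spike_xy[:, 0], spike_xy[:, 1], bins=[xe, ye])
    rate = np.full_like(counts, np.nan)
    visited = occ > 0
    rate[visited] = counts[visited] / (occ[visited] * dt)
    return rate
```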

Two-dimensional tuning curves can also easily be visualized:

tuning_curves_2d.name="firing rate"
tuning_curves_2d.attrs["unit"]="Hz"
tuning_curves_2d.plot(col="unit", col_wrap=3);

The accuracy of the tuning curves can once more be verified by displaying the spikes aligned to the features with value_from, which assigns to each spike the corresponding feature values. Here we do this for unit 1:

ts_to_features = tsgroup_2d[1].value_from(features)
ts_to_features
Time (s)            a           b
----------  ---------  ----------
5.046       -0.999684  -0.0251301
5.05        -0.999289  -0.0376902
5.058       -0.999289  -0.0376902
5.072       -0.998737  -0.0502443
5.094       -0.998027  -0.0627905
5.206       -0.992115  -0.125333
5.236       -0.988652  -0.150226
...
495.7       -0.904827  -0.425779
495.732     -0.893841  -0.448383
495.772     -0.882291  -0.470704
495.806     -0.876307  -0.481754
495.836     -0.863923  -0.503623
496.07      -0.778462  -0.627691
496.32      -0.675333  -0.737513
dtype: float64, shape: (826, 2)

tsgroup_2d[1], which is a Ts object, has been transformed into a TsdFrame, with each timestamp (spike time) associated with a feature value.


fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4), sharey=True)
ax1.plot(features["b"], features["a"], label="features")
ax1.plot(
    ts_to_features["b"],
    ts_to_features["a"],
    "o",
    color="red",
    markersize=4,
    label="spikes",
)
ax1.set_xlabel("b")
ax1.set_ylabel("a")
[ax1.axvline(b, linewidth=0.5, color="grey") for b in np.linspace(-1, 1, 6)]
[ax1.axhline(b, linewidth=0.5, color="grey") for b in np.linspace(-1, 1, 6)]
extents = (
    np.min(features["a"]),
    np.max(features["a"]),
    np.min(features["b"]),
    np.max(features["b"]),
)
tuning_curves_2d[0].plot(ax=ax2)
ax2.set_ylabel("")
plt.tight_layout()

1D tuning curves from continuous activity#

We do not always have spikes. Sometimes we are analyzing continuous firing rates or calcium intensities. As an example, we will simulate noisy continuous activity for some units modulated by the 1D variable:

noise_level = 2.0
traces = np.exp(-10 * (np.sin((feature.d[:, np.newaxis] - centers) / 2)) ** 2)
traces = traces * (1 + noise_level * np.random.randn(*traces.shape))
tsdframe_1d = nap.TsdFrame(t=times_feature, d=traces)

The same function accepts a Tsd or TsdFrame as data and computes tuning curves from continuous activity:

tuning_curves_1d_continuous = nap.compute_tuning_curves(
    data=tsdframe_1d,
    features=feature,
    bins=120,
    range=(0, 2*np.pi),
    feature_names=["feature"]
)
tuning_curves_1d_continuous
<xarray.DataArray (unit: 6, feature: 120)> Size: 6kB
array([[1.08477641e+00, 9.60433279e-01, 9.22796276e-01, 1.08351874e+00,
        9.35081507e-01, 9.15050288e-01, 7.66570022e-01, 6.53180951e-01,
        6.18149920e-01, 5.68996494e-01, 4.54209215e-01, 5.04315142e-01,
        2.69240929e-01, 2.63468559e-01, 2.34233798e-01, 1.96962981e-01,
        1.77484698e-01, 1.45835546e-01, 9.84924153e-02, 1.28468624e-01,
        7.64791359e-02, 5.37590392e-02, 5.04121685e-02, 3.84385028e-02,
        2.87816592e-02, 2.25061636e-02, 1.27978518e-02, 1.09564994e-02,
        7.56002027e-03, 8.94175625e-03, 5.89208851e-03, 4.87595233e-03,
        4.57960660e-03, 3.34154957e-03, 2.55324439e-03, 1.76787902e-03,
        1.08423413e-03, 9.52873655e-04, 9.46326529e-04, 4.85222707e-04,
        4.75407581e-04, 4.04051138e-04, 2.92468325e-04, 3.62785749e-04,
        2.16612798e-04, 1.54136227e-04, 1.22844340e-04, 1.52777177e-04,
        1.32572685e-04, 1.13053052e-04, 9.18594561e-05, 7.66691512e-05,
        6.20040046e-05, 4.78685544e-05, 6.38321896e-05, 5.71988973e-05,
        4.49206761e-05, 4.77731110e-05, 4.13092249e-05, 4.61476169e-05,
        4.46751009e-05, 4.43199602e-05, 4.90687166e-05, 5.25482585e-05,
        4.36986185e-05, 6.89992916e-05, 5.85537792e-05, 8.44292223e-05,
        8.09003092e-05, 8.73583935e-05, 9.37684010e-05, 1.23426737e-04,
        1.17274247e-04, 1.57040514e-04, 1.49561555e-04, 3.22051807e-04,
        2.13450041e-04, 3.02022147e-04, 3.44462106e-04, 4.09013538e-04,
...
        3.83279724e-05, 3.21689757e-05, 4.71612178e-05, 4.73473787e-05,
        5.13298083e-05, 5.43345616e-05, 5.26675580e-05, 6.18918040e-05,
        8.55512006e-05, 5.23094167e-05, 9.40863530e-05, 9.96522658e-05,
        1.42772700e-04, 1.59753048e-04, 1.15818509e-04, 2.32479313e-04,
        2.91591353e-04, 3.16597059e-04, 4.93655104e-04, 5.43764742e-04,
        7.66080480e-04, 6.41571028e-04, 7.59369075e-04, 1.30923194e-03,
        1.77284428e-03, 1.41597493e-03, 1.94889588e-03, 3.05576605e-03,
        3.28811865e-03, 6.42634762e-03, 5.52824700e-03, 1.33424369e-02,
        1.19525249e-02, 2.17322016e-02, 1.88288730e-02, 2.74444891e-02,
        4.29159931e-02, 4.77455041e-02, 4.91905149e-02, 5.40250927e-02,
        9.15208117e-02, 5.29856329e-02, 1.39478972e-01, 1.75399458e-01,
        2.17692761e-01, 1.46259707e-01, 3.31548836e-01, 3.50717567e-01,
        4.92979802e-01, 5.75550107e-01, 4.76028699e-01, 6.52629294e-01,
        7.01095029e-01, 7.17498154e-01, 7.00581183e-01, 5.55762738e-01,
        9.29990274e-01, 8.18465850e-01, 1.17753425e+00, 9.57615518e-01,
        1.00328412e+00, 1.00032014e+00, 1.03362093e+00, 7.72318839e-01,
        1.06881998e+00, 9.11002622e-01, 6.98208475e-01, 8.39210933e-01,
        6.08721711e-01, 6.99117853e-01, 4.77307494e-01, 4.15735628e-01,
        3.27381609e-01, 2.86449721e-01, 2.17044739e-01, 2.22199734e-01,
        1.96798553e-01, 1.53056325e-01, 1.21861100e-01, 8.93957122e-02]])
Coordinates:
  * unit     (unit) int64 48B 0 1 2 3 4 5
  * feature  (feature) float64 960B 0.02618 0.07854 0.1309 ... 6.152 6.205 6.257
Attributes: (3)
tuning_curves_1d_continuous.name="ΔF/F"
tuning_curves_1d_continuous.attrs["unit"]="a.u."
tuning_curves_1d_continuous.plot.line(x="feature", add_legend=False);
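For continuous data the computation changes: instead of dividing spike counts by occupancy time, each bin gets the average of the signal over the samples that fall in it. A minimal numpy sketch (a hypothetical helper, not Pynapple's implementation):

```python
import numpy as np

def continuous_tuning_curve(trace, feature_values, bins, range=None):
    """Average a continuous signal within each feature bin (numpy sketch)."""
    edges = np.histogram_bin_edges(feature_values, bins=bins, range=range)
    # bin index of each sample (inner edges only)
    which = np.digitize(feature_values, edges[1:-1])
    sums = np.bincount(which, weights=trace, minlength=len(edges) - 1)
    occ = np.bincount(which, minlength=len(edges) - 1)
    with np.errstate(invalid="ignore", divide="ignore"):
        return sums / occ  # NaN where the feature never visits a bin
```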

2D tuning curves from continuous activity#

This also works with more than one feature. Let us first simulate noisy continuous activity for some units modulated by the 2D variable:

alpha = np.arctan2(features["b"].values, features["a"].values) / np.pi
traces = np.exp(50.0 * np.cos(alpha[:, np.newaxis] - centers_2d)) / np.exp(50.0)
traces = traces * (1 + noise_level * np.random.randn(*traces.shape))
tsdframe_2d = nap.TsdFrame(t=times_feature, d=traces, columns=range(1, N+1, 1))

The same function again handles computing the tuning curves:

tuning_curves_2d_continuous = nap.compute_tuning_curves(
    data=tsdframe_2d,
    features=features,
    bins=5,
    feature_names=["a", "b"],
)
tuning_curves_2d_continuous
<xarray.DataArray (unit: 6, a: 5, b: 5)> Size: 1kB
array([[[2.63564940e-01, 6.62824211e-01, 4.76589803e-01, 4.64507383e-28,
         5.27241975e-26],
        [4.27013807e-02,            nan,            nan,            nan,
         3.46441606e-23],
        [2.88013998e-03,            nan,            nan,            nan,
         2.65901977e-20],
        [1.04267693e-04,            nan,            nan,            nan,
         2.63424958e-17],
        [2.12976271e-06, 6.44548082e-08, 2.48006681e-10, 1.17784061e-12,
         1.97584657e-15]],

       [[6.54912002e-01, 2.10853256e-01, 1.97811642e-02, 1.68478544e-19,
         2.71521213e-17],
        [9.89672571e-01,            nan,            nan,            nan,
         1.13546642e-14],
        [6.93526290e-01,            nan,            nan,            nan,
         4.12400982e-12],
        [2.95005162e-01,            nan,            nan,            nan,
         1.24310573e-09],
        [5.74295989e-02, 7.05023358e-03, 2.38298614e-04, 5.16691861e-06,
...
         5.92884066e-01],
        [1.04439972e-14,            nan,            nan,            nan,
         9.65458975e-01],
        [4.17458461e-12,            nan,            nan,            nan,
         7.67685407e-01],
        [1.29764162e-09,            nan,            nan,            nan,
         2.54157736e-01],
        [6.59764204e-08, 5.31801250e-06, 2.63666593e-04, 7.74703892e-03,
         5.12386096e-02]],

       [[4.94551166e-26, 3.65790467e-28, 4.82981365e-01, 5.91853167e-01,
         2.21251770e-01],
        [3.80984194e-23,            nan,            nan,            nan,
         4.50375028e-02],
        [2.72861835e-20,            nan,            nan,            nan,
         3.21899150e-03],
        [2.60538625e-17,            nan,            nan,            nan,
         1.06197132e-04],
        [2.39346567e-15, 1.02129662e-12, 2.80472711e-10, 6.70846957e-08,
         1.98506325e-06]]])
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * a        (a) float64 40B -0.8 -0.4 1.11e-16 0.4 0.8
  * b        (b) float64 40B -0.8 -0.4 1.11e-16 0.4 0.8
Attributes: (3)
tuning_curves_2d_continuous.name="ΔF/F"
tuning_curves_2d_continuous.attrs["unit"]="a.u."
tuning_curves_2d_continuous.plot(col="unit", col_wrap=3);

From epochs#

When computing from epochs, you should store them in a dictionary:

epochs_dict = {
    "A": nap.IntervalSet(start=[0, 20, 40, 60, 80], end=[10, 29, 49, 69, 89]),
    "B": nap.IntervalSet(start=[10, 30, 40, 60, 90], end=[19, 39, 59, 79, 99]),
}

You can then compute the tuning curves using nap.compute_response_per_epoch. You can pass either a TsGroup for spikes, or a TsdFrame for rates/calcium activity.

The output is an xarray.DataArray with labeled dimensions:

epochs_tuning_curves = nap.compute_response_per_epoch(tsgroup_2d, epochs_dict)
epochs_tuning_curves
<xarray.DataArray (unit: 6, epochs: 2)> Size: 96B
array([[1.91304348, 1.89230769],
       [4.2173913 , 3.84615385],
       [2.5       , 2.69230769],
       [3.39130435, 3.44615385],
       [3.52173913, 3.66153846],
       [1.80434783, 1.70769231]])
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * epochs   (epochs) <U1 8B 'A' 'B'
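Conceptually, the response per epoch is just the number of events inside an epoch's intervals divided by the total duration of those intervals. A numpy sketch of the idea (a hypothetical helper, not Pynapple's implementation):

```python
import numpy as np

def response_per_epoch(spike_times, epochs):
    """Mean rate per labeled epoch: spikes inside the intervals / total duration."""
    rates = {}
    for label, intervals in epochs.items():  # intervals: rows of (start, end)
        intervals = np.asarray(intervals, dtype=float)
        n_spikes = sum(
            np.count_nonzero((spike_times >= s) & (spike_times < e))
            for s, e in intervals
        )
        rates[label] = n_spikes / np.sum(intervals[:, 1] - intervals[:, 0])
    return rates
```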

We can visualize the result using bar plots:

fig, axs = plt.subplots(
    1, N, constrained_layout=True, sharey=True, figsize=(8, 3)
)
for unit, ax in zip(epochs_tuning_curves.coords["unit"], axs):
    ax.bar(
        epochs_tuning_curves.coords["epochs"],
        epochs_tuning_curves.sel(unit=unit),
    )
    ax.set_title(f"unit {unit.item()}")
axs[0].set_xlabel("epoch")
axs[0].set_ylabel("firing rate [Hz]");

Error bars#

Often, you will want error bars on your tuning curves to quantify uncertainty. Pynapple does not provide explicit functions for this, but in this section we will show how you can easily compute error bars yourself, using the functions introduced above.

From timestamps or continuous activity#

If you are computing tuning curves against features, you can split your session into n_splits chunks, compute a tuning curve per split, and then compute statistics across those splits.

We will start by creating splits:

n_splits = 4
full_session = feature.time_support
split_length = full_session.tot_length() / n_splits
splits = full_session.split(split_length)
splits
  index    start      end
      0    0      124.995
      1  124.995  249.99
      2  249.99   374.985
      3  374.985  499.98
shape: (4, 2), time unit: sec.
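Conceptually, the split amounts to cutting the time support into equal, contiguous chunks. A numpy sketch (note that this hypothetical helper takes a chunk count, whereas IntervalSet.split takes a chunk length):

```python
import numpy as np

def split_interval(start, end, n_splits):
    """Split [start, end) into n equal, contiguous (start, end) chunks."""
    edges = np.linspace(start, end, n_splits + 1)
    return np.column_stack([edges[:-1], edges[1:]])
```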

Then, we can compute the tuning curves like before, by looping over the splits:

tuning_curves_per_split = [
        nap.compute_tuning_curves(
            tsgroup_1d,
            epochs=split,
            features=feature,
            bins=120,
            range=(0, 2 * np.pi),
            feature_names=["feature"],
        )
        for split in splits
]

To make things easier down the line, we advise combining these into one big xarray.DataArray using xarray.concat, adding a dimension for the splits:

tuning_curves_per_split = xr.concat(tuning_curves_per_split, dim="split")
tuning_curves_per_split
<xarray.DataArray (split: 4, unit: 6, feature: 120)> Size: 23kB
array([[[24.58563536, 26.        , 20.4       , ..., 22.8       ,
         15.        , 18.59504132],
        [ 2.48618785,  3.66666667,  4.        , ...,  1.2       ,
          1.33333333,  2.0661157 ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.82872928,  1.        ,  1.2       , ...,  2.8       ,
          2.        ,  1.65289256]],

       [[18.18181818, 18.        , 22.8       , ..., 16.4       ,
         20.66666667, 19.88950276],
        [ 1.65289256,  1.66666667,  2.4       , ...,  1.2       ,
          2.66666667,  1.38121547],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
...
          0.        ,  0.27548209],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 1.93370166,  1.66666667,  0.4       , ...,  4.4       ,
          1.        ,  2.20385675]],

       [[20.11019284, 17.33333333, 18.4       , ..., 23.2       ,
         21.66666667, 20.22160665],
        [ 1.51515152,  1.        ,  3.2       , ...,  1.6       ,
          2.        ,  2.35457064],
        [ 0.13774105,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 0.        ,  0.        ,  0.        , ...,  0.        ,
          0.        ,  0.        ],
        [ 2.0661157 ,  0.66666667,  1.2       , ...,  3.6       ,
          2.66666667,  1.93905817]]], shape=(4, 6, 120))
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * feature  (feature) float64 960B 0.02618 0.07854 0.1309 ... 6.152 6.205 6.257
Dimensions without coordinates: split
Attributes: (4)

Computing the mean and standard deviation can then be done easily using:

means = tuning_curves_per_split.mean(dim="split")
stds = tuning_curves_per_split.std(dim="split")

Visualization also becomes simpler:

lines = means.plot.line(x="feature", add_legend=False)
for line, unit in zip(lines, means.coords["unit"]):
    mean = means.sel(unit=unit)
    std = stds.sel(unit=unit)
    plt.fill_between(
        means["feature"],
        mean - std,
        mean + std,
        color=line.get_color(),
        alpha=0.2,
    )
plt.xlabel("feature [rad]")
plt.ylabel("firing rate [Hz]");

To make things easier in the future, here is a function that does all of this:

def compute_tuning_curves_with_error_bars(
    data, features, bins, range, feature_names, n_splits
):
    # Get splits
    full_session = features.time_support
    split_length = full_session.tot_length() / n_splits
    splits = full_session.split(split_length)

    # Compute tuning curves per split
    tuning_curves_per_split = [
        nap.compute_tuning_curves(
            data,
            features=features,
            epochs=split,
            bins=bins,
            range=range,
            feature_names=feature_names,
        )
        for split in splits
    ]
    tuning_curves_per_split = xr.concat(tuning_curves_per_split, dim="split")

    # Return mean and standard deviation
    return (
        tuning_curves_per_split.mean(dim="split"), 
        tuning_curves_per_split.std(dim="split")
    )

Feel free to extend it to your needs!

From epochs#

If you want error bars for epochs, the typical use case is that you have multiple presentations of a stimulus and want the mean response across those:

epochs_dict
{'A':   index    start    end
       0        0     10
       1       20     29
       2       40     49
       3       60     69
       4       80     89
 shape: (5, 2), time unit: sec.,
 'B':   index    start    end
       0       10     19
       1       30     39
       2       40     59
       3       60     79
       4       90     99
 shape: (5, 2), time unit: sec.}

We can therefore call nap.compute_response_per_epoch in a loop, once per presentation:

epochs_tuning_curves_per_presentation = [
    nap.compute_response_per_epoch(
        tsgroup_2d, {"A": stimulus_A, "B": stimulus_B}
    )
    for stimulus_A, stimulus_B in zip(epochs_dict["A"], epochs_dict["B"])
]

To make things easier down the line, we advise combining these into one big xarray.DataArray using xarray.concat, adding a dimension for the presentations:

epochs_tuning_curves_per_presentation = xr.concat(
    epochs_tuning_curves_per_presentation, dim="presentation"
)
epochs_tuning_curves_per_presentation
<xarray.DataArray (presentation: 5, unit: 6, epochs: 2)> Size: 480B
array([[[1.6       , 1.77777778],
        [4.1       , 4.22222222],
        [2.8       , 2.11111111],
        [3.1       , 3.55555556],
        [2.9       , 3.44444444],
        [1.7       , 2.22222222]],

       [[2.44444444, 1.33333333],
        [3.66666667, 4.        ],
        [2.33333333, 1.88888889],
        [3.55555556, 3.44444444],
        [3.33333333, 4.22222222],
        [2.44444444, 1.55555556]],

       [[1.88888889, 1.73684211],
        [3.55555556, 3.42105263],
        [2.77777778, 3.21052632],
        [3.33333333, 3.31578947],
        [3.88888889, 3.73684211],
        [2.        , 1.73684211]],

       [[2.55555556, 2.26315789],
        [4.77777778, 4.05263158],
        [2.55555556, 2.94736842],
        [3.55555556, 3.15789474],
        [4.        , 3.42105263],
        [1.66666667, 1.42105263]],

       [[1.11111111, 2.11111111],
        [5.        , 3.77777778],
        [2.        , 2.44444444],
        [3.44444444, 4.22222222],
        [3.55555556, 3.66666667],
        [1.22222222, 1.88888889]]])
Coordinates:
  * unit     (unit) int64 48B 1 2 3 4 5 6
  * epochs   (epochs) <U1 8B 'A' 'B'
Dimensions without coordinates: presentation

We can then visualize again, but now with error bars:

means = epochs_tuning_curves_per_presentation.mean(dim="presentation")
stds = epochs_tuning_curves_per_presentation.std(dim="presentation")

fig, axs = plt.subplots(
    1, N, constrained_layout=True, sharey=True, figsize=(8, 3)
)
for unit, ax in zip(epochs_tuning_curves.coords["unit"], axs):
    ax.bar(
        means.coords["epochs"], means.sel(unit=unit), yerr=stds.sel(unit=unit)
    )
    ax.set_title(f"unit {unit.item()}")
axs[0].set_xlabel("epoch")
axs[0].set_ylabel("firing rate [Hz]");

To make things easier in the future, here is a function that does all of this:

def compute_response_per_epoch_with_error_bars(data, epochs_dict):
    epochs_dict_per_presentation = [
        dict(zip(epochs_dict.keys(), presentations))
        for presentations in zip(*epochs_dict.values())
    ]
    epochs_tuning_curves_per_presentation = [
        nap.compute_response_per_epoch(data, presentation_epochs_dict)
        for presentation_epochs_dict in epochs_dict_per_presentation
    ]
    epochs_tuning_curves_per_presentation = xr.concat(
        epochs_tuning_curves_per_presentation, dim="presentation"
    )
    return (
        epochs_tuning_curves_per_presentation.mean(dim="presentation"),
        epochs_tuning_curves_per_presentation.std(dim="presentation"),
    )

Feel free to extend it to your needs!

Mutual information#

Given a set of tuning curves, you can use compute_mutual_information to compute the mutual information between the activity of the neurons and the features, regardless of the number of feature dimensions. See the Skaggs et al. (1992) paper for more details on what mutual information measures.
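The quantity computed here is the Skaggs information: bits per spike = Σᵢ pᵢ (λᵢ/λ̄) log₂(λᵢ/λ̄), where pᵢ is the occupancy probability of bin i, λᵢ the firing rate in that bin, and λ̄ = Σᵢ pᵢ λᵢ the overall mean rate; bits per second is λ̄ times that. A numpy sketch of the standard formula (Pynapple's implementation may differ in details such as NaN handling):

```python
import numpy as np

def skaggs_information(rates, occupancy):
    """Skaggs et al. mutual information from a tuning curve (numpy sketch)."""
    p = occupancy / occupancy.sum()      # occupancy probability per bin
    mean_rate = np.sum(p * rates)        # overall mean firing rate
    ratio = rates / mean_rate
    valid = ratio > 0                    # 0 * log(0) -> 0 by convention
    bits_per_spike = np.sum(p[valid] * ratio[valid] * np.log2(ratio[valid]))
    return mean_rate * bits_per_spike, bits_per_spike  # (bits/sec, bits/spike)
```

A perfectly selective cell firing in half the bins carries 1 bit per spike; a flat tuning curve carries 0.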

MI = nap.compute_mutual_information(tuning_curves_1d)
MI
   bits/sec  bits/spike
1  7.756818    1.022207
2  5.043786    1.428777
3  5.643588    2.177223
4  5.565737    2.362273
5  5.688925    2.271844
6  5.106260    1.425476
MI = nap.compute_mutual_information(tuning_curves_2d)
MI
   bits/sec  bits/spike
1  3.913554    2.368885
2  5.974394    1.750925
3  5.966003    1.702558
4  5.788383    1.689478
5  5.656865    1.696652
6  4.213189    2.412955

Take a look at the tutorial on head direction cells for a realistic example.