I have two questions that fit well together, so I will try to describe them in a single example. If this is too long, I would not mind being asked to reformulate my questions more simply, or to talk about it on Skype.

My first question is the following: is it possible to access, from the transfer functions, specific neuron populations defined in the brain section? For example, would it be possible to define several "layers" (PyNN populations) in the brain section and return a PyNN Assembly as the circuit, so that I can refer to the different populations in the transfer functions? My second question is: how can I use the decorator "@nrp.MapRobotPublisher"?

Both questions may become more explicit if I write a minimal example of a brain definition and transfer functions:

In the brain definition

    sourceTargetVector = []  # here I just connect some neurons of LGN to some neurons of V1
    for k in range(nOri):
        for i in range(nRows):
            for j in range(nColumns):
                if someCondition:  # some condition on the neuron indices
                    sourceTargetVector.append((i*nColumns + j, k*nRows*nColumns + i*nColumns + j))
    sim.Projection(LGNLayer, V1Layer, MyConnector(sourceTargetVector), sim.StaticSynapse(weight=someWeight))

    circuit = LGNLayer + V1Layer  # this is an Assembly, and I want to refer to both these populations in the TFs
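One detail worth being explicit about when concatenating populations into an Assembly: if the circuit is built as LGN followed by V1, the V1 neurons sit after the LGN block in the assembly's index space. The sketch below is plain Python (no PyNN needed) and just shows that index bookkeeping; the helper names (`lgn_size`, `v1_index_to_circuit`) are hypothetical, for illustration only.

```python
# Sketch: if the circuit Assembly concatenates LGNLayer followed by V1Layer,
# a V1 neuron with within-population index i appears at global index
# len(LGNLayer) + i in the assembly.

lgn_size = 9   # e.g. a 3x3 LGN layer (hypothetical size)
v1_size = 27   # e.g. 3 orientations x 3 rows x 3 columns (hypothetical size)

def v1_index_to_circuit(i, lgn_size=lgn_size):
    """Map a within-V1 index to an index into the concatenated circuit."""
    return lgn_size + i

# The LGN population occupies the first lgn_size slots of the circuit,
# so these are the circuit-level indices of all V1 neurons:
circuit_indices_of_v1 = [v1_index_to_circuit(i) for i in range(v1_size)]
```

With that mapping, a slice of the circuit referring only to V1 can be computed from the population sizes, which is what a transfer function would need to target one "layer" inside the assembly.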

In the transfer functions

Now, let's say I have an input transfer function that gives a resized version of the robot's camera output as input to my LGN layer. For a second transfer function, I want to publish the live activity of the V1 layer as a "retinotopic" 2D array which displays the current spike rate of each neuron of the layer. I know how to generate, from all the spike counts of the oriented neurons of the V1 layer, a nicely plottable 2D array.
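For concreteness, here is a small NumPy sketch of that "spike counts to retinotopic 2D array" step. It assumes the neuron ordering used in the brain definition above (index = k*nRows*nColumns + i*nColumns + j); the variable names mirror that snippet but the sizes and data are made up.

```python
import numpy as np

# Turn a flat vector of per-neuron spike counts from the V1 layer into a
# "retinotopic" 2D array, assuming row-major (orientation, row, column) order.

nOri, nRows, nColumns = 4, 3, 5
spike_counts = np.arange(nOri * nRows * nColumns, dtype=float)  # dummy data

# One (nRows, nColumns) map per orientation channel
per_orientation = spike_counts.reshape(nOri, nRows, nColumns)

# Collapse the orientation channels (here: sum) into one plottable 2D array
plottableV1 = per_orientation.sum(axis=0)
```

Whether to sum, average, or take the max over orientations is a modelling choice; the reshape is the part that depends on the neuron ordering.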

What I would like to know is how I can use the decorator @nrp.MapRobotPublisher to make the TF plot, during the simulation, the 2D array I generated. I tried some pseudo-code here to illustrate what I want to do, but I may have done it all wrong:

    # (some code that uses V1LayerToPlot and cumulativePlotDensityV1 (just a
    # global array I need) to get the current spike rate of each neuron of the
    # V1 layer and to generate a plottable 2D array, "plottableV1", out of it)
    plottableV1 = ...
    V1Figure = plt.figure()
    plt.imshow(plottableV1)
    plt.show()

You're trying to plot the activity of a layer of neurons (i.e., plot an array of values) by sending a matplotlib figure on a ROS topic. It won't work like this: there is no ROS message type for a matplotlib figure.

Instead, the proper ROS way would be to send the actual array you want to plot on a topic (there is a message type for 2D arrays), and another ROS node should take care of plotting it. Sadly, we do not have such a ROS 2D plotter node yet, but it is planned: https://bbpteam.epfl.ch/project/issues/browse/NRRPLT-4524
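To make the "send the array on a topic" idea concrete: std_msgs/Float64MultiArray carries a flat data list plus a layout describing the dimensions. The sketch below does not import rospy; a plain dict stands in for the message so the snippet stays self-contained, but the real TF would fill a std_msgs.msg.Float64MultiArray with the same fields.

```python
import numpy as np

# Build the layout/data fields of a Float64MultiArray-style message
# from a 2D array, using row-major flattening.

def array_to_multiarray_fields(arr):
    """Return layout and data for a 2D array, as Float64MultiArray expects."""
    rows, cols = arr.shape
    return {
        "layout": {
            "dim": [
                {"label": "rows", "size": rows, "stride": rows * cols},
                {"label": "cols", "size": cols, "stride": cols},
            ],
            "data_offset": 0,
        },
        "data": arr.astype(float).ravel().tolist(),  # row-major flat list
    }

msg = array_to_multiarray_fields(np.arange(6.0).reshape(2, 3))
```

A subscriber node can then recover the 2D shape from the layout's dim sizes and reshape the flat data before plotting.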

Meanwhile, I recommend you instead use the CSV recorder feature to save your 2D array at every TF run, so that you can later plot it in a Jupyter notebook directly from the Collab space. Probably not the most efficient way, but it should work.
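The row format is the main thing to get right: one CSV row per TF run, with the 2D array flattened row-major into scalar columns. The sketch below uses only the standard library (the NRP CSV recorder itself is driven by a TF decorator, which is omitted here); the function name and header names are made up for illustration.

```python
import csv
import io

# Write one CSV row per simulation step; each row is a timestamp followed
# by the flattened 2D activity array.

def array_rows_to_csv(arrays):
    """arrays: list of (time, 2D-list) pairs. Returns the CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["t"] + ["v%d" % i for i in range(4)])  # header row
    for t, arr2d in arrays:
        flat = [v for row in arr2d for v in row]  # row-major flatten
        writer.writerow([t] + flat)
    return buf.getvalue()

out = array_rows_to_csv([(0.02, [[1, 2], [3, 4]])])
```

In the notebook you can then reshape each row back to (nRows, nColumns) and plot it with imshow.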


I tried this on the online version of the NRP, and it still gives me an error when I run the simulation. Here it is:

global name '_apply_' is not defined (Runtime)

I guess I made a mistake in the call to csv_recorder.record_entry(*plottableV1). If that is useful, here is how I invoke it (code below, interesting line = the last one). If I comment out the last line (the csv_recorder call), the simulation runs without error (but does not record V1 activity). Thank you very much for your support!!
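One possible cause (an assumption, since the full TF is not shown): unpacking a 2D array with * passes whole rows as arguments, while a recorder typically expects one scalar per column, so flattening first may be needed. The stub below is a stand-in for the recorder call, just to show the difference in what gets passed.

```python
import numpy as np

# Demonstrate the difference between *arr2d and *arr2d.flatten():
# the former unpacks into row arrays, the latter into scalars.

plottableV1 = np.array([[0.5, 1.0], [1.5, 2.0]])

def record_entry_stub(*values):
    """Stand-in for a recorder call; True iff every argument is a scalar."""
    return all(np.isscalar(v) or np.ndim(v) == 0 for v in values)

rows_ok = record_entry_stub(*plottableV1)            # passes row arrays
flat_ok = record_entry_stub(*plottableV1.flatten())  # passes scalars
```

If the recorder expects scalar entries, flattening (and matching the number of headers to the flattened length) would be the fix to try first.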

Here is an update on what I decided to do: when you told me that image_view was available on the Collab version of the NRP, I decided to forget about the CSV recorder and try to display a live image of my neurons' activity inside the NRP experiment. So here is approximately what my transfer function does right now:
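For the image route, the core step is converting the float activity array into 8-bit grayscale, which is the form a sensor_msgs/Image with mono8 encoding carries as its byte data. The sketch below shows only that normalization (building the actual ROS message, e.g. via cv_bridge, is left out); the function name is made up for illustration.

```python
import numpy as np

# Normalize a 2D float activity array to uint8 [0, 255] for a mono8 image.

def activity_to_mono8(activity):
    """Scale a 2D float array to the uint8 range for grayscale display."""
    a = np.asarray(activity, dtype=float)
    span = a.max() - a.min()
    if span == 0:                        # flat activity -> all-black image
        return np.zeros(a.shape, dtype=np.uint8)
    scaled = (a - a.min()) / span * 255.0
    return scaled.astype(np.uint8)

img = activity_to_mono8([[0.0, 0.5], [1.0, 2.0]])
```

Normalizing per frame like this maximizes contrast but makes brightness relative to each frame's own min and max; a fixed scale would be needed if absolute rates should be comparable across frames.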