The cmap package is intended to make the PDF files generated by
pdflatex "searchable and copyable" in Acrobat Reader and other
compliant PDF viewers.

Just put the line \usepackage{cmap} at the beginning of your
preamble, and it will pre-load the needed CMap translations for the
fonts used in the document, provided a file <encoding>.cmap exists
for the font encoding you use.
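For instance, a minimal preamble looks like this (cmap must be loaded early, before the font encoding is set up; t1.cmap ships with the package):

```latex
\documentclass{article}
\usepackage{cmap}          % load first, before fontenc
\usepackage[T1]{fontenc}   % matching t1.cmap is picked up automatically
\begin{document}
Text copied from the resulting PDF keeps its accents: \'el\'ementaire.
\end{document}
```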

a first read to the manual (texdocmedia9), seemed to be a
regression, as it could do more complicated stuff (reading youtube
files from within a PDF) but not the simple stuff you really use (I
will never read a youtube files from within a PDF ...). In addition,
the movie15 package was removed from texlive, so you have to
make the transition (this was easily fixed by downloading a copy of
movie15.sty)

Video files must be in the MP4 format. Videos in other formats must be converted to MP4 or recreated from the raw material. The H.264 codec gives high-quality video at stunningly small file sizes. With ffmpeg, such video files can be created easily.
From an existing video file, such as Sample.avi:
ffmpeg -i Sample.avi -vcodec libx264 Sample.mp4
From a numbered sequence of bitmaps, such as frame-0.png, frame-1.png, ... :
ffmpeg -i frame-%d.png -vcodec libx264 myvideo.mp4
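These invocations are also easy to drive from a script; a minimal sketch (the helper name is mine, ffmpeg with libx264 must be installed, and the returned list is meant to be executed with subprocess.run):

```python
# Build the ffmpeg command lines shown above, e.g. for batch conversion
# from Python. `source` is either a video file (Sample.avi) or a
# printf-style frame pattern (frame-%d.png); run the returned list with
# subprocess.run(...). This helper is an illustrative sketch.
def encode_cmd(source, target="out.mp4"):
    return ["ffmpeg", "-i", source, "-vcodec", "libx264", target]
```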

It is recommended that, instead of relying on BibTeX, you copy and
paste the content of the generated .bbl file into your document.
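Concretely, this means running BibTeX once locally, then replacing the \bibliographystyle and \bibliography lines with the contents of the generated .bbl file, which looks something like this (the entry shown is illustrative):

```latex
% Paste the contents of the generated .bbl file here, in place of
% \bibliographystyle{...} and \bibliography{...}:
\begin{thebibliography}{1}
\bibitem{Knuth84}
D.~E.~Knuth, \emph{The \TeX book}. Addison--Wesley, 1984.
\end{thebibliography}
```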

Moral: one advantage of LaTeX is that your data, your dear prose, is
in text (not binary) format and has an open syntax. This process
allows the creation of the red-lined article file in one shot, with no
fiddling and no time lost copying and pasting. So whatever software you
use to put your ideas into readable digital format, just use something
simple, structured and open.

inline

Because people requested an easier way to enter LaTeX, I've added the
possibility to write $ ... $ to obtain inline formulas. This is
equivalent to the existing LaTeX markup and has the same
single-line limitation (but anything beyond that isn't really useful in
inline formulas anyway). To enable it, install the inline_latex.py
parser and add #format inline_latex to your page (alternatively, configure
the default parser to be inline_latex). This parser accepts all regular
wiki syntax, plus the $ ... $ syntax. Additionally, the
inline_latex formatter supports $$ ... $$ style formulas (still limited
to a single line though!), which put the formula into a paragraph of its
own.

Note: in the Nikola blog, this is handled directly by reST: \$\lambda\$ renders as $\lambda$.

In order to produce proceedings for the
NeuroComp08 that we organized, I used a
combination of LaTeX and
Python to generate a
PDF from our
preprint server based on ConfMaster.
This was due to the lack of an appropriate tool for this system and the
need to stay flexible to any last-minute change made by the authors. I
used the following steps (they are summarized in the
Makefile
included at the bottom, which allowed everything to be rebuilt whenever
a small change was made at any of these steps).

First, in ConfMaster, download the
papers from the system (Administrator/Export DB/Download Files/Submit)
but also all the metadata in CSV format (Administrator/Export DB/CSV Data
to export/Papers). The CSV file had to be cleaned up manually (using
vim and OpenOffice) to correct the character encoding and some
errors made by users. In fact, names sometimes contain accented
characters, and I ultimately found that the most flexible way to
preserve all accents was to translate everything to a good old
LaTeX-style encoding.
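The accent translation can be sketched as a simple character map (the mapping and function below are a small illustrative subset, not the actual clean-up script):

```python
# Map accented characters to LaTeX-style escapes, so that author names
# survive any downstream encoding trouble. Illustrative subset only.
LATEX_ACCENTS = {
    "é": r"\'e", "è": r"\`e", "ê": r"\^e",
    "à": r"\`a", "ô": r"\^o", "ç": r"\c{c}", "ü": r'\"u',
}

def to_latex(text):
    """Replace accented characters by their LaTeX equivalents."""
    for char, escape in LATEX_ACCENTS.items():
        text = text.replace(char, escape)
    return text
```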

The following script,
body.py,
linked the CSV metadata to the folder of PDFs, and also
generated index terms in the resulting
body.tex
file for building the author and keyword tables:
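A minimal sketch of what such a script could look like (the column names "authors", "keywords" and "pdf", the ";" separator, and the use of \includepdf and multiple \index registers are my assumptions, not the actual body.py):

```python
import csv

# Read the cleaned CSV of paper metadata and emit LaTeX that includes
# each PDF (pdfpages package) and indexes its authors and keywords
# (multiple indexes as provided, e.g., by the imakeidx package).
def make_body(csv_path="papers.csv", tex_path="body.tex"):
    with open(csv_path, newline="", encoding="utf-8") as f:
        papers = list(csv.DictReader(f))
    lines = []
    for paper in papers:
        # one index entry per author and per keyword
        for author in paper["authors"].split(";"):
            lines.append(r"\index[authors]{%s}" % author.strip())
        for keyword in paper["keywords"].split(";"):
            lines.append(r"\index[keywords]{%s}" % keyword.strip())
        # insert the paper itself
        lines.append(r"\includepdf[pages=-]{%s}" % paper["pdf"])
    with open(tex_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")
```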

\subsection*{Introduction}
Ce recueil contient les actes de la seconde conférence française de neurosciences computationnelles qui s'est tenue à Marseille du 8 au 11 octobre 2008.
Les neurosciences computationnelles portent sur l'étude des processus de traitement de l'information dans le système nerveux, du niveau de la cellule jusqu'à celui des populations de neurones et du contrôle du comportement. Le but de cette conférence est de rassembler des chercheurs issus de différentes disciplines, incluant les neurosciences, les sciences de l'information, la physique statistique ou encore la robotique, afin d'offrir un large panorama des recherches menées dans le domaine.
Ce recueil présente les 68 contributions qui ont été présentées lors de la conférence, dans leur ordre d'apparition dans le programme. Le premier jour était consacré aux modèles de la cellule neurale, aux modèles des traitements visuels et corticaux, ainsi qu'aux modèles de réseaux de neurones bio-mimétiques. La seconde journée était consacrée aux interfaces cerveau-machine, à la dynamique des grands ensembles de neurones, à la plasticité fonctionnelle et aux interfaces neurales.
Cette conférence a été rendue possible grâce au soutien de nombreuses institutions, et nous tenons à remercier le CNRS, la Société des neurosciences, Le conseil régional de la région Provence Alpes Côte d'Azur, le conseil général des Bouches-du-Rhône, la mairie de Marseille, l'université de Provence, l'IFR "Sciences du cerveau et de la cognition", et l'INRIA. Nous remercions chaleureusement la faculté de médecine de Marseille et l'université de la Méditerranée qui nous ont hébergés pendant tout le déroulement de la conférence.
Les organisateurs de la conférence remercient les membres du comité scientifique et du comité de lecture, les auteurs des différentes contributions ainsi que tous ceux qui ont contribué au bon déroulement de ces journées.
{\it This proceedings book contains the contributions that were presented at the second French conference on Computational Neuroscience, held in Marseille from October 8th to 11th, 2008.
Computational neuroscience is the study of the mechanisms governing the processing of information in the nervous system, from the cellular level up to populations of neurons and the control of behaviour. The aim of this conference was to gather people from various fields, including neuroscience, information science, statistical physics and robotics, in order to give a broad panorama of the ongoing research in the field.
This book presents the 68 contributions which were presented at the conference, in their order of appearance in the conference program. The first day was devoted to models of the neural cell, to models of visual and cortical processing, and to realistic neural network models. The second day was devoted to brain-machine interfaces, large-scale and dynamical models, functional plasticity and neural interfaces.
This conference was made possible by financial support from the CNRS, the French Society of Neuroscience, the regional councils of Provence and of Bouches-du-Rhône, the city of Marseille, the University of Provence, the IFR "Sciences du Cerveau et de la Cognition" and the INRIA. It was kindly hosted by the Marseille faculty of medicine and the University of the Mediterranean. We are grateful to all these supporting organizations for helping us gather the computational neuroscience community in Marseille.
The organizers of this conference would like to thank the members of the scientific committee and the reviewers, the authors of the submitted papers, and all those who helped us provide you with the best possible conditions.
}
\vfill
\noindent Laurent Perrinet and Emmanuel Daucé\hfill October, 2008
\newpage