Let $f:\mathbb{R}\to\mathbb{R}$ be a $T$-periodic function, that is, $f(t+T)=f(t)$ for all $t\in \mathbb{R}$. Assume that
$$\int_0^{+\infty}|f(s)|ds<+\infty.$$
If we assume in addition that $f$ is continuous, my intuition tells me that we must necessarily have $f=0$. Is this correct?

This is correct. Since $f$ is continuous and $T$-periodic, $|f|$ attains its maximum on $[0,T]$, and hence on all of $\mathbb{R}$; call it $L$. Suppose toward a contradiction that $L>0$, and pick $x\ge 0$ with $|f(x)|=L$. By continuity there is a $\delta>0$ (with $2\delta<T$, say) such that $|f(y)| > \frac{L}{2}$ for all $y$ with $|x-y| < \delta$. Can you see how to argue it from here?
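For completeness, here is one way to finish the hint. By periodicity, $|f(y)|>\frac{L}{2}$ also holds on each translated interval $(x+nT-\delta,\,x+nT+\delta)$, and these intervals are pairwise disjoint because $2\delta<T$. Hence

$$\int_0^{+\infty}|f(s)|\,ds \;\ge\; \sum_{n=1}^{\infty}\int_{x+nT-\delta}^{x+nT+\delta}|f(s)|\,ds \;\ge\; \sum_{n=1}^{\infty} 2\delta\cdot\frac{L}{2} \;=\; +\infty,$$

contradicting the integrability assumption. Therefore $L=0$, i.e. $f\equiv 0$.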