I tried a basic approach: I wrote $x$ as a matrix of four unknown entries $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, squared it to obtain $\begin{pmatrix} a^2 + bc & ab + bd \\ ca + dc & cb + d^2\end{pmatrix}$, and by setting this equal to $I_2$ I got the following system:

$a^2 + bc = 1$

$ab +bd = 0$

$ca + dc = 0$

$bc + d^2 = 1$

I don't know how to proceed. (Also, if anyone knows of a better or simpler way of solving this matrix equation I'd be more than happy to know).
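(For reference, the system above can be solved symbolically with SymPy, assuming it is available; this is just a sketch to cross-check whatever answer one derives by hand:)

```python
from sympy import symbols, Matrix, eye, solve

a, b, c, d = symbols('a b c d')
X = Matrix([[a, b], [c, d]])

# The four entrywise equations of X^2 = I_2.
equations = list(X**2 - eye(2))
solutions = solve(equations, [a, b, c, d], dict=True)
for sol in solutions:
    print(sol)
```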

What you need to know about the Jordan canonical form (JCF) is that for any $2\times 2$ matrix $x$, there is an invertible matrix $P$ such that $P^{-1}xP=\left(\begin{smallmatrix}t&0\\0&t\end{smallmatrix}\right)$ or $P^{-1}xP=\left(\begin{smallmatrix}t&1\\0&t\end{smallmatrix}\right)$ or
$P^{-1}xP=\left(\begin{smallmatrix}t&0\\0&r\end{smallmatrix}\right)$. Those three possibilities are the various Jordan forms you might get.

We have $x^2=I$. Multiply on the left by $P^{-1}$ and on the right by $P$ and you get $P^{-1}x^2P=P^{-1}P=I$. But then $P^{-1}x^2P=P^{-1}xIxP=P^{-1}x(PP^{-1})xP=(P^{-1}xP)(P^{-1}xP)=I$.

Hence if $x^2=I$, then the JCF also squares to $I$. The second form squares to $\left(\begin{smallmatrix}t^2&2t\\0&t^2\end{smallmatrix}\right)$, which would require both $t^2=1$ and $2t=0$, so that case is out; in the first or third case, $t^2=r^2=1$. Hence the only possible JCFs are the four matrices $\left(\begin{smallmatrix}\pm1 &0\\0&\pm1\end{smallmatrix}\right)$.
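(As a quick sanity check, not part of the argument itself, squaring the non-diagonal Jordan form in SymPy shows directly why it is ruled out:)

```python
from sympy import symbols, Matrix

t = symbols('t')
J = Matrix([[t, 1], [0, t]])  # the non-diagonal 2x2 Jordan form

# J**2 = [[t**2, 2*t], [0, t**2]]: forcing t**2 = 1 leaves the
# off-diagonal entry 2*t nonzero, so J**2 can never equal I.
print(J**2)
```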

You are looking for matrices $A$ satisfying the polynomial equation $A^2-I=0$. Note that the corresponding polynomial splits: $X^2-1=(X+1)(X-1)$, and since you are presumably not working over a field of characteristic$~2$ but rather over the rational, real or complex numbers (although the question is not entirely clear about this), these two linear factors are distinct. It is a general fact that matrices satisfying a polynomial equation that splits into distinct linear factors are diagonalisable, and their eigenvalues must be among the roots of those linear factors (since the polynomial in $A$ is supposed to kill eigenvectors in particular). So no Jordan forms are needed. For the case $A^2=I$ you can also argue "by hand" that any vector $v$ is the sum of $\frac12(v+Av)$ in the eigenspace for$~1$ and $\frac12(v-Av)$ in the eigenspace for$~{-}1$, so those eigenspaces span the whole space and $A$ is diagonalisable. None of this depends on the size of the square matrix$~A$.
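(The "by hand" decomposition above can be checked numerically; in this NumPy sketch the particular involution $A$, a reflection, is just an illustrative choice, and the two maps $v\mapsto\frac12(v\pm Av)$ appear as the matrices $\frac12(I\pm A)$:)

```python
import numpy as np

# An example involution: reflection across the line y = x.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert np.allclose(A @ A, np.eye(2))  # A^2 = I

P_plus = (np.eye(2) + A) / 2   # sends v to (v + Av)/2, lands in the eigenspace for +1
P_minus = (np.eye(2) - A) / 2  # sends v to (v - Av)/2, lands in the eigenspace for -1

print(P_plus + P_minus)        # the identity: the two pieces recover every v
print(A @ P_plus)              # equals P_plus: its range consists of +1 eigenvectors
```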

Now to find all solutions for $A$, you just need to determine what the eigenspaces for $1$ and $-1$ can be. They can be any pair of complementary subspaces. If one of the eigenspaces is reduced to $\{0\}$ (so it is not really an eigenspace) one gets $A=I$ or $A=-I$, and these are isolated solutions. In the $2\times 2$ case the only other possibility is having two complementary eigenspaces of dimension$~1$. If $\binom pq$ and $\binom rs$ are linearly independent vectors, then you get a matrix with eigenspace for$~1$ spanned by the first and the eigenspace for$~{-}1$ spanned by the second as
$$
A=\begin{pmatrix}p&r\\q&s\end{pmatrix}
\begin{pmatrix}1&0\\0&-1\end{pmatrix}
\begin{pmatrix}p&r\\q&s\end{pmatrix}^{-1}
=\frac1{ps-qr}\begin{pmatrix}ps+qr&-2pr\\2qs&-ps-qr\end{pmatrix},
$$
and that is the general form for this case. There is some redundancy in this expression since scaling either $\binom pq$ or $\binom rs$ has no effect; the set of such involutions only has dimension$~2$, and it can also be described as the set of matrices
$$
\begin{pmatrix}a&b\\c&-a\end{pmatrix}
\qquad\text{with $a^2+bc=1$,}
$$
in other words as those with characteristic polynomial $X^2-1$.
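(Both descriptions of the general solution can be verified symbolically; here is a SymPy sketch whose variable names mirror the text:)

```python
from sympy import symbols, Matrix, simplify, zeros, eye

p, q, r, s = symbols('p q r s')
P = Matrix([[p, r], [q, s]])
D = Matrix([[1, 0], [0, -1]])

# Conjugating D by the eigenbasis matrix P reproduces the closed form.
A = P * D * P.inv()
closed = Matrix([[p*s + q*r, -2*p*r],
                 [2*q*s, -(p*s + q*r)]]) / (p*s - q*r)
print((A - closed).applyfunc(simplify))  # the zero matrix

# The trace-zero description: [[a, b], [c, -a]] with a^2 + bc = 1.
a, b, c = symbols('a b c')
B = Matrix([[a, b], [c, -a]])
print((B**2).subs(a**2, 1 - b*c))        # the identity matrix
```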

Whenever we get a value of $a = 0$, we return to conditions $(1)$ and $(2)$. In that case, those conditions reduce to $\vec{b}\cdot\vec{b} = \alpha$ and $\vec{0} = \vec{\beta}$. So we see that this occurs when $x$ is proportional to the identity matrix $I_{2}$.