You could also make an attempt at it in vector notation,
[tex]L = \frac{1}{2}m (\dot{\vec{x}})^2 + e(\vec{B}\times\vec{x})\cdot\dot{\vec{x}}[/tex]
but you'd have to use a vector identity,
[tex](\vec{A}\times\vec{B})\cdot\vec{C} = (\vec{B}\times\vec{C})\cdot\vec{A} = (\vec{C}\times\vec{A})\cdot\vec{B}[/tex]
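If you want to convince yourself of that identity before trusting the algebra, here's a quick numerical sanity check (not part of the derivation, just a sketch using random vectors):

```python
# Numerical check of the cyclic symmetry of the scalar triple product:
# (A x B) . C = (B x C) . A = (C x A) . B
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))  # three random 3-vectors

t1 = np.dot(np.cross(A, B), C)
t2 = np.dot(np.cross(B, C), A)
t3 = np.dot(np.cross(C, A), B)

# All three orderings give the same number (up to floating-point error)
assert np.isclose(t1, t2) and np.isclose(t1, t3)
```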
so you could write
[tex]L = \frac{1}{2}m (\dot{\vec{x}})^2 + e(\dot{\vec{x}}\times\vec{B})\cdot\vec{x}[/tex]
Then you get
[tex]\frac{\partial L}{\partial \vec{x}} = e \dot{\vec{x}}\times\vec{B} = -e \vec{B}\times\dot{\vec{x}}[/tex]
and
[tex]\frac{\mathrm{d}}{\mathrm{d}t}\frac{\partial L}{\partial \dot{\vec{x}}} = \frac{\mathrm{d}}{\mathrm{d}t}\left[m \dot{\vec{x}} + e \vec{B}\times\vec{x}\right] = m \ddot{\vec{x}} + e \vec{B}\times\dot{\vec{x}}[/tex]
and again, when you put them together, you get
[tex]m \ddot{\vec{x}} + 2e \vec{B}\times\dot{\vec{x}} = 0[/tex]
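If you'd rather have a computer grind through the Euler-Lagrange equations, here's a sketch with sympy (my own check, not something from the derivation above) that starts from the same Lagrangian with a constant [itex]\vec{B}[/itex] and recovers [itex]m\ddot{\vec{x}} + 2e\vec{B}\times\dot{\vec{x}} = 0[/itex]:

```python
# Verify m x'' + 2e (B x x') = 0 by applying the Euler-Lagrange equations
# to L = (1/2) m (x')^2 + e (B x x) . x' with a constant field B.
import sympy as sp

t = sp.symbols('t')
m, e = sp.symbols('m e', positive=True)
B = sp.Matrix(sp.symbols('B1 B2 B3'))  # constant magnetic field components
x = sp.Matrix([sp.Function(f'x{i}')(t) for i in (1, 2, 3)])
xdot = x.diff(t)

L = sp.Rational(1, 2)*m*xdot.dot(xdot) + e*B.cross(x).dot(xdot)

# Euler-Lagrange: d/dt (dL/d xdot_i) - dL/d x_i = 0, one component at a time
EL = sp.Matrix([sp.diff(sp.diff(L, xdot[i]), t) - sp.diff(L, x[i])
                for i in range(3)])

expected = m*x.diff(t, 2) + 2*e*B.cross(xdot)
assert (EL - expected).applyfunc(sp.simplify) == sp.zeros(3, 1)
```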

It's the same math going on underneath; there are just several different ways to write it out. If I didn't already have a lot of experience with using vector or tensor notation, I'd probably do it the way you did it, manually, one component at a time.

Ah, well, in that one you have to remember that j and k are dummy (summed-over) indices, so [itex]\dot{x}^j[/itex] and [itex]\dot{x}^k[/itex] are really the same thing. So the derivative applies to both of them equally.

If you want to show it explicitly, you can apply the product rule. I'm going to relabel the index of the [itex]\dot{x}^k[/itex] in the derivative so there's no confusion (of course, the label makes no difference because it is a dummy index):
[tex]\frac{\partial}{\partial \dot{x}^a}\biggl[\frac{1}{2}m\delta_{jk}\dot{x}^j\dot{x}^k\biggr] = \frac{1}{2}m\delta_{jk}\frac{\partial \dot{x}^j}{\partial \dot{x}^a}\dot{x}^k + \frac{1}{2}m\delta_{jk}\dot{x}^j\frac{\partial \dot{x}^k}{\partial \dot{x}^a}[/tex]
Now, do you accept that [itex]\partial \dot{x}^j/\partial \dot{x}^a = \delta^{j}_{a}[/itex]? Then
[tex]\begin{align*}
\frac{\partial}{\partial \dot{x}^a}\biggl[\frac{1}{2}m\delta_{jk}\dot{x}^j\dot{x}^k\biggr] &= \frac{1}{2}m\delta_{jk}\delta^{j}_{a}\dot{x}^k + \frac{1}{2}m\delta_{jk}\dot{x}^j\delta^{k}_{a} \\
&= \frac{1}{2}m\delta_{ak}\dot{x}^k + \frac{1}{2}m\delta_{ja}\dot{x}^j \\
&= \frac{1}{2}m\delta_{ja}\dot{x}^j + \frac{1}{2}m\delta_{ja}\dot{x}^j \\
&= m\delta_{ja}\dot{x}^j
\end{align*}[/tex]
I went from [itex]\delta_{ak}\dot{x}^k[/itex] to [itex]\delta_{ja}\dot{x}^j[/itex] just by relabeling that index from k to j, and then switching the order of indices in the Kronecker delta, which is allowed because it's a symmetric tensor.
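The index gymnastics above can also be checked mechanically. Here's a sketch (my own illustration, with hypothetical symbol names) that writes the kinetic term as the explicit double sum over the dummy indices and confirms that differentiating it gives [itex]m\delta_{ja}\dot{x}^j = m\dot{x}^a[/itex]:

```python
# Check that d/d(xdot^a) [ (1/2) m delta_jk xdot^j xdot^k ] = m xdot^a
import sympy as sp

m = sp.symbols('m', positive=True)
xdot = sp.Matrix(sp.symbols('xdot1 xdot2 xdot3'))
delta = sp.eye(3)  # Kronecker delta as the 3x3 identity matrix

# Kinetic term written with the explicit double sum over j and k
T = sp.Rational(1, 2)*m*sum(delta[j, k]*xdot[j]*xdot[k]
                            for j in range(3) for k in range(3))

# Differentiate with respect to each component xdot^a in turn
grad = sp.Matrix([sp.diff(T, xdot[a]) for a in range(3)])
assert sp.simplify(grad - m*xdot) == sp.zeros(3, 1)
```

The product rule fires on both factors, and the two half-terms combine exactly as in the derivation above.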