4 Answers

For a fixed $x$, and a given $\epsilon>0$, first find a $\delta$ so that, if $r$ is rational and $|r-x|<\delta$, then $|f(r)-f(x)|<\epsilon$. Then if $t$ is any real with, say, $|t-x|<\delta/2$, we can pick a rational $r$ very close to $t$ such that $r$ is also within $\delta$ of $x$. Now apply the triangle inequality to
$(f(t)-f(x)) = (f(t)-f(r)) + (f(r)-f(x))$ and obtain

$|f(t)-f(x)|\le|f(t)-f(r)|+|f(r)-f(x)|.$

Since $t$ is close to $r$ the first term is small, and since $r$ is close to $x$ the second term is small. The idea is that each term pairs one rational with one real, so your assumption that limits through rational values converge at every real applies to each term separately. I didn't fill in all the details, but with some manipulation one can get the whole thing below $2\epsilon$.
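To make the idea concrete, here is a small numerical sketch (the choice $f(u)=u^2$, the point $x=2$, and the tolerances are my own illustrative assumptions, not part of the answer): a rational $r$ taken close to $t$ keeps both terms on the right-hand side small.

```python
from fractions import Fraction

# Illustrative assumptions: f(u) = u^2 (any function continuous along
# the rationals would do), x = 2, and t a real close to x.
def f(u):
    return u * u

x = 2.0
t = x + 3e-4                               # a real t close to x
r = Fraction(t).limit_denominator(10**6)   # a rational r very close to t
r_f = float(r)

lhs = abs(f(t) - f(x))
rhs = abs(f(t) - f(r_f)) + abs(f(r_f) - f(x))
# the triangle-inequality bound from the answer:
assert lhs <= rhs
# each right-hand term is individually small:
assert abs(f(t) - f(r_f)) < 1e-2 and abs(f(r_f) - f(x)) < 1e-2
```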

EDIT: This needs to be thought through more carefully. I agree with the OP that it is not clear how the details should go, but it seems to me the argument will go through.

Another try, with more details. Fix the real $x$ and some $\epsilon>0$. First we can pick $\delta_1>0$ so that for rational $r$ we have

$|r-x|<\delta_1$ implies $|f(r)-f(x)|<\epsilon/2$.

Define $\delta=\delta_1/2$.

Suppose $t$ is real with $|t-x|<\delta = \delta_1/2$.

We can now pick $\delta'>0$ so that for rational $r$ we have

$|r-t|<\delta'$ implies $|f(r)-f(t)|<\epsilon/2$.

Now put $\delta_2=\min(\delta,\delta')$ and pick a rational $r$ with $|r-t|<\delta_2$.

Then $|r-x|\le|r-t|+|t-x|<\delta_1/2+\delta_1/2=\delta_1$ (since $\delta_2\le\delta=\delta_1/2$), so that $|f(r)-f(x)|<\epsilon/2$; and since $|r-t|<\delta_2\le\delta'$ we also have $|f(r)-f(t)|<\epsilon/2$. Applying the triangle inequality, we finally arrive at $|f(t)-f(x)|\le|f(t)-f(r)|+|f(r)-f(x)|<\epsilon$.
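The $\delta$-bookkeeping above can be traced numerically. A minimal sketch, assuming $f(u)=u^2$ near $x=2$ (so a local Lipschitz constant of $5$ stands in for the hypothesis when choosing $\delta_1$ and $\delta'$; all concrete numbers are my own choices):

```python
from fractions import Fraction

# Illustrative assumptions: f(u) = u^2, x = 2, eps = 1e-2.
# Near 2, |f(a) - f(b)| = |a + b||a - b| <= 5|a - b|, so eps/10 is a
# valid delta_1 (and delta') for the eps/2 bounds below.
def f(u):
    return u * u

x, eps = 2.0, 1e-2

delta1 = eps / 10            # |r - x| < delta1  =>  |f(r) - f(x)| < eps/2
delta = delta1 / 2

t = x + delta / 3            # any real with |t - x| < delta
delta_p = eps / 10           # |r - t| < delta'  =>  |f(r) - f(t)| < eps/2
delta2 = min(delta, delta_p)

r = float(Fraction(t).limit_denominator(10**9))  # rational, |r - t| < delta2
assert abs(r - t) < delta2

# the chain of inequalities from the answer:
assert abs(r - x) <= abs(r - t) + abs(t - x) < delta1
assert abs(f(r) - f(x)) < eps / 2
assert abs(f(r) - f(t)) < eps / 2
assert abs(f(t) - f(x)) < eps
```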

Fix a $\xi\in{\mathbb R}$. If $f$ were not continuous at $\xi$ we could find an $\epsilon_0>0$ and for each $\delta>0$ a point $x_\delta\in U_\delta(\xi)$ with $$|f(x_\delta)- f(\xi)|\geq\epsilon_0\ .$$ Consider such an $x_\delta$. As $$\lim_{q\to x_\delta, \ q\in{\mathbb Q}} f(q)=f(x_\delta)$$ we can find a $q_\delta\in{\mathbb Q}$ with $|q_\delta-x_\delta|<\delta$ such that $$|f(q_\delta)-f(x_\delta)|<{\epsilon_0\over2}\ .$$

It follows that for each $\delta>0$ there is a point $q_\delta\in U_{2\delta}(\xi)\cap{\mathbb Q}$ such that
$$|f(q_\delta)-f(\xi)|\geq|f(x_\delta)-f(\xi)|-|f(q_\delta)-f(x_\delta)|>\epsilon_0-{\epsilon_0\over2}={\epsilon_0\over2}\ .$$ This contradicts $\lim_{q\to \xi, \ q\in{\mathbb Q}} f(q)=f(\xi)$.
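The contrapositive can also be checked numerically. A sketch under my own assumptions, with a function that genuinely jumps at $\xi=\sqrt2$, showing how each $x_\delta$ hands its jump to a nearby rational $q_\delta$, so the rational limit at $\xi$ must fail:

```python
import math
from fractions import Fraction

# Illustrative assumption: f jumps at xi = sqrt(2), so f is not
# continuous there and the rational-limit hypothesis must fail at xi.
xi = math.sqrt(2)

def f(u):
    return 1.0 if u >= xi else 0.0   # f(xi) = 1, but f = 0 just below xi

eps0 = 0.5
for k in range(1, 8):
    delta = 10.0 ** -k
    x_delta = xi - delta / 2                      # f(x_delta) = 0
    assert abs(f(x_delta) - f(xi)) >= eps0
    # a rational q_delta near x_delta, hence in U_{2 delta}(xi):
    q = float(Fraction(x_delta).limit_denominator(10 ** (k + 8)))
    assert abs(q - xi) < 2 * delta
    assert abs(f(q) - f(xi)) >= eps0 / 2          # the jump survives
```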

The former is the definition of the limit, and the latter the definition of continuity. Maybe what he is getting at is that $x$ being a limit point of $\mathbb{Q}$ guarantees that every $\delta$-neighborhood of $x$ contains an appropriate rational $r$, and that we can use the exact same $\delta$ for any corresponding $\epsilon$ used to establish the limit in the hypothesis.

Here is my own answer (right or not).
Proof: We will use the Heine theorem (the sequential criterion for continuity).
Suppose $x_n\to x_0$ as $n\to \infty$; we want to prove that:
$$\lim_{n\to \infty}f(x_n)=f(x_0).$$
Due to $$\lim_{r\to x,r\in\mathbb{Q}}f(r)=f(x)\ \ (*)$$
we know, applying $(*)$ at the point $x_n$, that for any given $\epsilon>0$ there exists $\delta>0$ such that whenever $|r-x_n|<\delta$ and $r\in\mathbb{Q}$,
$$|f(r)-f(x_n)|<\epsilon.$$
Now take $\epsilon_n=\frac{1}{n}$; then there exists $\delta_n'>0$ such that whenever $|r-x_n|<\delta_n'$ and $r\in\mathbb{Q}$,
$$|f(r)-f(x_n)|<\epsilon_n=\frac{1}{n}.$$
Let $\delta_n=\min\{\delta_n',1/n\}$; clearly $\delta_n\leq\frac{1}{n}$.
Now take a rational $r_n$ such that $|r_n-x_n|<\delta_n\leq\frac{1}{n}.$
It then satisfies $$|f(r_n)-f(x_n)|<\epsilon_n=\frac{1}{n}.$$
By $|r_n-x_n|<\frac{1}{n}$ and $x_n\to x_0$, we know that $r_n\to x_0.$
Combining $(*)$, we get $$\lim\limits_{n\to \infty}f(r_n)=f(x_0).$$
For any given $\epsilon>0$ there exists $N>0$ such that for all $n>N$
$$\frac{1}{n}<\frac{\epsilon}{2}\ \text{and}\ |f(r_n)-f(x_0)|<\frac{\epsilon}{2}.$$
So $|f(x_n)-f(x_0)|\leq|f(x_n)-f(r_n)|+|f(r_n)-f(x_0)|<\frac{1}{n}+\frac{\epsilon}{2}<\epsilon.$
This completes the proof.
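A quick numerical check of this construction (with my own illustrative choices $f(u)=u^2$, $x_0=2$, and $x_n=x_0+(-1)^n/n$; `limit_denominator(n)` supplies a rational within $\frac{1}{2n}$ of $x_n$, which is even better than the $\frac{1}{n}$ required):

```python
from fractions import Fraction

# Illustrative assumptions: f(u) = u^2, x0 = 2, x_n = x0 + (-1)^n / n.
def f(u):
    return u * u

x0 = 2.0
last_err = None
for n in range(1, 1000):
    xn = x0 + (-1) ** n / n                  # x_n -> x0
    rn = float(Fraction(xn).limit_denominator(n))
    assert abs(rn - xn) < 1 / n              # the rational r_n of the proof
    last_err = abs(f(rn) - f(x0))

assert last_err < 1e-2                       # f(r_n) is converging to f(x0)
```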