Model order reduction of continuous-time linear time-delay systems over limited frequency intervals is discussed in this paper. The approximation performance is characterized by an index based on the maximum singular value of the error transfer function over the finite frequency interval. With the aid of fundamental matrix inequality techniques, sufficient conditions are derived that guarantee stability of the reduced-order model and bound the finite-frequency approximation error. The resulting model order reduction problems can then be tackled by solving the corresponding linear matrix inequality (LMI)-based optimization problems. A numerical example is given to show the effectiveness of the proposed technique.
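For intuition, the finite-frequency index can be checked a posteriori by sampling the error transfer function on the interval of interest. The sketch below is a minimal illustration, not the paper's method: it assumes a delayed state-space realization x'(t) = A x(t) + A_d x(t - tau) + B u(t), y = C x(t) (one common form for such systems), and all function names, the reduced-model matrices, and the grid density are hypothetical. Gridding only estimates the supremum of the maximum singular value; the LMI conditions discussed in the paper certify a bound without frequency sampling.

```python
import numpy as np

def freq_response(A, Ad, B, C, tau, w):
    """G(jw) = C (jw I - A - Ad e^{-jw tau})^{-1} B for the delayed system
    x'(t) = A x(t) + Ad x(t - tau) + B u(t),  y(t) = C x(t)."""
    n = A.shape[0]
    return C @ np.linalg.solve(1j * w * np.eye(n) - A - Ad * np.exp(-1j * w * tau), B)

def finite_freq_error(A, Ad, B, C, Ar, Adr, Br, Cr, tau, w_lo, w_hi, n_grid=400):
    """Grid-based estimate of  sup_{w in [w_lo, w_hi]} sigma_max(G(jw) - Gr(jw)),
    i.e. the finite-frequency approximation-error index evaluated by sampling."""
    worst = 0.0
    for w in np.linspace(w_lo, w_hi, n_grid):
        E = (freq_response(A, Ad, B, C, tau, w)
             - freq_response(Ar, Adr, Br, Cr, tau, w))
        # Largest singular value of the error frequency response at w.
        worst = max(worst, np.linalg.svd(E, compute_uv=False)[0])
    return worst

if __name__ == "__main__":
    # Toy 2nd-order delayed system and a hypothetical 1st-order reduced model,
    # for illustration only.
    A = np.array([[-2.0, 1.0], [0.0, -3.0]]); Ad = 0.1 * np.eye(2)
    B = np.array([[1.0], [1.0]]); C = np.array([[1.0, 0.0]])
    Ar = np.array([[-2.0]]); Adr = np.array([[0.1]])
    Br = np.array([[1.0]]); Cr = np.array([[1.0]])
    print(finite_freq_error(A, Ad, B, C, Ar, Adr, Br, Cr,
                            tau=0.5, w_lo=0.1, w_hi=10.0))
```

Such a sampled estimate is useful for validating a reduced model returned by the LMI-based optimization, since it directly approximates the index being minimized over the chosen frequency interval.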