An optimal control problem associated with the dynamics of the orientation of a bipolar molecule in the plane can be understood by means of tools from differential geometry. For the first time in the literature, the k-symplectic formalism is used to provide a geometric formulation of the optimal control problems associated with some families of partial differential equations. A parallel is established between the classical formalism of optimal control theory with ordinary differential equations and the one with particular families of partial differential equations. This description allows us to state and prove Pontryagin's Maximum Principle in the k-symplectic formalism. We also consider the unified Skinner-Rusk formalism for optimal control problems governed by an implicit partial differential equation.
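As a sketch of the structure involved (not stated in the abstract itself, but standard in the k-symplectic literature; the notation for coordinates is assumed here), the Hamiltonian field equations in the k-symplectic setting read:

```latex
% k-symplectic Hamiltonian field equations (standard form; notation assumed):
% a solution is a map t = (t^1,\dots,t^k) \mapsto (q^i(t), p^\alpha_i(t)) satisfying
\frac{\partial H}{\partial q^i}
  = -\sum_{\alpha=1}^{k} \frac{\partial p^\alpha_i}{\partial t^\alpha},
\qquad
\frac{\partial H}{\partial p^\alpha_i}
  = \frac{\partial q^i}{\partial t^\alpha}.
```

For k = 1 these reduce to the usual Hamilton equations, which is what makes the parallel with the ordinary-differential-equation formalism possible.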

A close relationship between the classical Hamilton-Jacobi theory and the kinematic reduction of control systems by decoupling vector fields is shown in this paper. The geometric interpretation of this relationship relies on new mathematical techniques for mechanics defined on a skew-symmetric algebroid. This geometric structure allows us to describe in a simplified way the mechanics of nonholonomic systems with both control and external forces.

We present a new setting of the geometric Hamilton-Jacobi theory by using the so-called time-evolution operator K. This new approach unifies both the Lagrangian and the Hamiltonian formulations of the problem developed in [7], and can be applied to the case of singular Lagrangian dynamical systems.
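For context, the geometric Hamilton-Jacobi problem that such approaches build on can be stated as follows (a standard formulation, not claimed to be the exact one of the cited work): one seeks a closed section of the cotangent projection along which the Hamiltonian dynamics projects to the base.

```latex
% Geometric Hamilton-Jacobi condition (standard form; \gamma a closed 1-form on Q,
% \pi_Q : T^*Q \to Q the cotangent projection):
X_H \circ \gamma = T\gamma \circ X_H^{\gamma},
\qquad
X_H^{\gamma} := T\pi_Q \circ X_H \circ \gamma,
% which, writing \gamma = dW, recovers the classical Hamilton-Jacobi equation
H\!\left(q, \frac{\partial W}{\partial q}\right) = \text{const.}
```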

For the last forty years, differential geometry has provided a means of understanding optimal control theory. Usually the best strategy for solving a difficult problem is to transform it into a different problem that can be dealt with more easily. Pontryagin's Maximum Principle endows the optimal control problem with a Hamiltonian structure. The solutions to the Hamiltonian problem that satisfy particular conditions are candidates for solutions to the optimal control problem; these candidates are called extremals. Thus, Pontryagin's Maximum Principle lifts the original problem to the cotangent bundle.
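As a reminder of the structure referred to here (the standard statement of the Maximum Principle; sign conventions vary across references), for the problem of minimizing the integral of a cost L along trajectories of a control system:

```latex
% Pontryagin Hamiltonian for minimizing \int L(x,u)\,dt subject to \dot x = f(x,u):
H(x, p, p_0, u) = \langle p, f(x,u)\rangle + p_0\, L(x,u),
\qquad p_0 \le 0,\ (p, p_0) \neq 0.
% An extremal (x(t), p(t)) satisfies Hamilton's equations on T^*M,
\dot{x} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial x},
% together with the pointwise maximization condition on the control:
H\big(x(t), p(t), p_0, u^*(t)\big) = \max_{u}\, H\big(x(t), p(t), p_0, u\big).
```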
In this thesis, we develop a complete geometric proof of Pontryagin's Maximum Principle. We carefully investigate the crucial points of the proof, such as the perturbations of the controls, the linear approximation of the reachable set, and the separation condition.
Among all the solutions to an optimal control problem are the abnormal curves. These do not depend on the cost function to be minimized, but only on the geometry of the control system. Some work has been done on the study of abnormality, although only for control-linear and control-affine systems, mainly with control-quadratic cost functions. Here we present a novel geometric method to characterize all the different kinds of extremals (not only the abnormal ones) in general optimal control problems. This method is an adaptation of the presymplectic constraint algorithm. Our interest in abnormal curves centers on the strict abnormal minimizers, which can be characterized by the geometric algorithm presented in this thesis.
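The presymplectic constraint algorithm that such an adaptation starts from can be sketched as follows (the standard Gotay-Nester iteration; the notation here is assumed, and the thesis's control-theoretic version refines this scheme):

```latex
% Presymplectic constraint algorithm (Gotay-Nester style):
% given a presymplectic manifold (M, \omega) and a Hamiltonian H, iterate
M_1 := M, \qquad
M_{i+1} := \left\{ x \in M_i \ :\ \exists\, v \in T_x M_i
  \ \text{with}\ \iota_v \omega(x) = dH(x) \right\},
% until the sequence stabilizes at a final constraint submanifold M_f,
% on which \iota_X \omega = dH admits solutions tangent to M_f.
```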
As an application of the above-mentioned method, we characterize the extremals for free optimal control problems, which include, in particular, the time-optimal control problem. Moreover, an example of a strict abnormal extremal for a control-affine system is found using the geometric method.
Furthermore, we focus on the description of abnormality in optimal control problems for mechanical control systems, because no results about the existence of strict abnormal minimizers are known for these problems. Results about the abnormal extremals are given when the cost function is control-quadratic or when the time must be minimized. In this dissertation, the abnormality is characterized in particular cases through geometric constructions, such as the vector-valued quadratic forms that appear as a result of applying the previous geometric procedure.
Optimal control problems for mechanical control systems are also tackled by taking advantage of the equivalence between nonholonomic control systems and kinematic control systems. In this thesis, an equivalence between the time-optimal control problems for both kinds of control systems is established. The results allow us to give an example of a local strict abnormal minimizer in a time-optimal control problem for a mechanical control system.
Finally, setting aside the abnormality, the non-autonomous optimal control problem is described geometrically using the Skinner-Rusk unified formalism. This approach is valid for the implicit control systems that arise in optimal control problems for controlled Lagrangian systems and for descriptor systems, both of which are common in engineering problems.
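For orientation, the Skinner-Rusk construction in its standard mechanical version (the control-theoretic variant developed in the thesis extends this picture) works on the Whitney sum of the tangent and cotangent bundles:

```latex
% Skinner-Rusk unified formalism (standard mechanical version; notation assumed):
W := TQ \oplus_Q T^*Q, \qquad
\Omega := \mathrm{pr}_2^{\,*}\, \omega_Q \quad (\text{presymplectic on } W),
% with the Hamiltonian
\mathcal{H}(v_q, \alpha_q) := \langle \alpha_q, v_q \rangle - L(v_q).
% The dynamical equation \iota_X \Omega = d\mathcal{H} recovers both the
% Euler-Lagrange and the Hamilton equations on the first constraint submanifold.
```

Because the dynamics lives on a presymplectic manifold, implicit equations are handled naturally: consistency is restored by a constraint algorithm rather than by assuming the system can be written in explicit form.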