One tendency that has emerged during the last ten years is compiling to
byte code for virtual machines. This is not new, but with Java it has
become widespread. It has in turn spawned the "just in time" breed of
compilers, which compile platform-neutral byte code to native code for
the target platform, for faster execution.

Does this qualify as a major trend? I think it does.

Another trend, not in compilers as such but related, is the tendency,
driven by ever faster machines, to write larger and larger programs in
interpreted code. In the old days, everyone had experienced slow BASIC
interpreters, and moving to compiled code (Pascal or C), together of
course with better hardware, made a huge difference in throughput. Now,
with powerful hardware, interpreted languages such as PHP and JavaScript
are in many cases fast enough to solve the problem at hand.
[In the really old days in the 1950s, byte-code-like interpreters were
very popular, mostly to provide a higher-level instruction set than the
simple computers of the day had, e.g., adding floating point and stacks. -John]