Temperature and trend

We ran an experiment in class where we measured how the temperature evolved over the course of 5 weeks.

I'd like to show that the increase in temperature is accelerating as summer approaches.

Is it enough to calculate the slope of the linear trend over the full 5 weeks, show that it is smaller than the slope of the linear trend over the past week (i.e. the trend has increased), and deduce from this that the warming is accelerating?
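For concreteness, this is roughly the calculation I have in mind (a small numpy sketch; the made-up daily readings just stand in for our real measurements):

```python
import numpy as np

# Made-up daily readings over the 5 weeks; replace with the class measurements
rng = np.random.default_rng(0)
days = np.arange(35)
temps = 15 + 0.1 * days + rng.normal(0, 1.5, size=days.size)

# Slope of the linear trend over the full 5 weeks
slope_full = np.polyfit(days, temps, 1)[0]

# Slope of the linear trend over the past week only
slope_last_week = np.polyfit(days[-7:], temps[-7:], 1)[0]

print(slope_full, slope_last_week, slope_last_week > slope_full)
```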

I'm asking because my teacher said that comparing linear trends over different time periods is not always appropriate and can be biased, but I don't understand why.

There are many possible issues here:
a) Trends over different durations
b) Trends over different time phases
c) Trends over different numbers of datapoints
d) Trends over sufficient numbers of datapoints
e) Trends over sufficiently long intervals (for the purposes of the inference)

Weather temperatures tend to vary in cycles (daily, obviously, but in cities also weekly). You should be OK on that front if the hours and days of the week are equally represented: e.g. whole weeks and the same times of day in both datasets, though not necessarily the same number of weeks.
They're also prone to fluctuations lasting several days as weather systems move through. This makes trends of that timescale or shorter rather useless for tracking the change of season.
(c) should not be an issue, provided you understand that the reliability of a trend depends on the number of datapoints used to calculate it. So if you compare a trend over two weeks of daily noontime data with one over three weeks, the second trend will be the more reliable of the two, but that does not invalidate the comparison.
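
As a rough illustration (a numpy sketch with simulated noontime readings; `slope_with_se` is just a hypothetical helper), the standard error that `np.polyfit` can report alongside the slope shows how the reliability depends on the number of datapoints:

```python
import numpy as np

rng = np.random.default_rng(1)

def slope_with_se(n_days):
    """Fit a line to n_days of simulated noontime readings; return slope and its standard error."""
    x = np.arange(n_days)
    y = 15 + 0.1 * x + rng.normal(0, 1.5, size=n_days)
    coeffs, cov = np.polyfit(x, y, 1, cov=True)
    return coeffs[0], np.sqrt(cov[0, 0])

print(slope_with_se(14))  # two weeks of daily data
print(slope_with_se(21))  # three weeks: typically a smaller standard error, i.e. more reliable
```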

Your specific proposal might be invalidated by (e): a single week is short enough that a passing weather system can dominate its trend. You could make it more robust by looking at successive differences, i.e. the increase over each period of exactly 24×7 hours (one week). Then you can see whether there is a trend in these deltas.
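
A minimal sketch of that idea, assuming you first average each whole week (the numbers below are placeholders for your weekly averages):

```python
import numpy as np

# Placeholder: mean temperature of each of the 5 weeks
weekly_means = np.array([14.2, 15.0, 16.1, 17.5, 19.3])

# Increase over each successive one-week period
deltas = np.diff(weekly_means)

# If the warming is accelerating, these deltas should themselves trend upward
delta_slope = np.polyfit(np.arange(len(deltas)), deltas, 1)[0]
print(deltas, delta_slope)
```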