Suppose $$f(t,u)=f(0,u)+\int_0^t{\mu (w,u)\,dw}+\int_0^t{\sigma(w,u)\,dB_w},$$ where $B_w$ is a standard Brownian motion. I would like to calculate the drift and diffusion of $Y_t=-\int_t^s{f(t,u)\,du}$ (under sufficient regularity conditions).

The problem comes from the HJM model in finance, and the answer is $$dY_t=\left(f(t,t)-\int_t^s{\mu(t,u)\,du}\right)dt+\left(-\int^s_t{\sigma(t,u)\,du}\right)dB_t.$$ I am really confused about where the term $f(t,t)$ in the drift comes from. Formally, I can compute this using the Leibniz rule and get $$dY_t=-\int_t^s{d_t f(t,u)\,du}+f(t,t)\,dt,$$ which is equivalent to the answer, but this calculation is not justified. The notes I have say the answer follows from Fubini's theorem for stochastic integrals. I understand that Fubini's theorem gives $$Y_t=-\int_t^sf(0,u)\,du-\int^t_0\int^s_t\mu(w,u)\,du\,dw-\int_0^t\int_t^s\sigma(w,u)\,du\,dB_w.$$ But I don't see how this leads to the answer: the inner integral $\int^s_t\mu(w,u)\,du$ still depends on $t$, and I don't see where the term $f(t,t)$ could come from.
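For what it's worth, here is the formal bookkeeping of the $t$-dependence in the Fubini representation (a sketch, not a rigorous proof — justifying the boundary terms is exactly where the stochastic Fubini theorem is needed). Each of the three terms depends on $t$ both through the outer integral and through the lower limit $t$ of the inner integral, and the lower-limit boundary terms reassemble into $f(t,t)$:

```latex
% Formal differentiation in t of
%   Y_t = -\int_t^s f(0,u)\,du
%         - \int_0^t \int_t^s \mu(w,u)\,du\,dw
%         - \int_0^t \int_t^s \sigma(w,u)\,du\,dB_w.
\begin{align*}
d\!\left(-\int_t^s f(0,u)\,du\right)
  &= f(0,t)\,dt,\\
d\!\left(-\int_0^t\!\int_t^s \mu(w,u)\,du\,dw\right)
  &= -\left(\int_t^s \mu(t,u)\,du\right)dt
     + \left(\int_0^t \mu(w,t)\,dw\right)dt,\\
d\!\left(-\int_0^t\!\int_t^s \sigma(w,u)\,du\,dB_w\right)
  &= -\left(\int_t^s \sigma(t,u)\,du\right)dB_t
     + \left(\int_0^t \sigma(w,t)\,dB_w\right)dt.
\end{align*}
% Collecting the three boundary terms and using the dynamics of f:
%   f(0,t) + \int_0^t \mu(w,t)\,dw + \int_0^t \sigma(w,t)\,dB_w = f(t,t),
% which is the f(t,t)\,dt term in the drift.
```

The usual way to make the boundary terms rigorous is to split $\int_t^s = \int_0^s - \int_0^t$ and apply the stochastic Fubini theorem once more to the pieces with both limits in $\{0,t\}$, so that each resulting term is a genuine semimartingale in $t$.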