I want to compute the derivative of binary cross-entropy loss with respect to the input of the sigmoid function, and was wondering if there's a closed-form expression. I've seen derivations of binary cross-entropy loss with respect to model weights/parameters (derivative of cost function for Logistic Regression), as well as derivations of the sigmoid function with respect to its input (Derivative of sigmoid function $\sigma (x) = \frac{1}{1+e^{-x}}$), but nothing that combines the two. I would greatly appreciate any help with this.
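For concreteness, here is how I imagine the chain rule would combine the two cited derivatives (writing $z$ for the pre-sigmoid input, $\hat{y} = \sigma(z)$ for the prediction, and $y \in \{0, 1\}$ for the label — my notation, not from either post), though I'd appreciate confirmation that this is right:

$$
L(y, \hat{y}) = -\bigl[y \log \hat{y} + (1 - y)\log(1 - \hat{y})\bigr], \qquad \hat{y} = \sigma(z),
$$
$$
\frac{\partial L}{\partial \hat{y}} = -\frac{y}{\hat{y}} + \frac{1 - y}{1 - \hat{y}}, \qquad \frac{\partial \hat{y}}{\partial z} = \hat{y}(1 - \hat{y}),
$$
$$
\frac{\partial L}{\partial z} = \frac{\partial L}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial z} = -y(1 - \hat{y}) + (1 - y)\hat{y} = \hat{y} - y.
$$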

There's also a post that computes the derivative of categorical cross-entropy loss with respect to the pre-softmax outputs (Derivative of Softmax loss function). I am looking for something similar in the binary case (perhaps that result specializes to the binary case, but I'm not sure).
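In case it helps, here is a small numerical sanity check I put together (plain Python, central finite differences). The `grad_closed_form` function encodes $\sigma(z) - y$, which is my own guess at the closed form rather than anything established in the posts above:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce(y, z):
    # Binary cross-entropy expressed as a function of the pre-sigmoid input z.
    p = sigmoid(z)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def grad_closed_form(y, z):
    # Candidate closed form I'm hoping to verify: dL/dz = sigmoid(z) - y.
    return sigmoid(z) - y

def grad_numeric(y, z, eps=1e-6):
    # Central finite-difference approximation of dL/dz.
    return (bce(y, z + eps) - bce(y, z - eps)) / (2 * eps)

# Compare the candidate gradient against finite differences at a few points.
for y in (0.0, 1.0):
    for z in (-2.0, 0.5, 3.0):
        assert abs(grad_closed_form(y, z) - grad_numeric(y, z)) < 1e-5
```

The two gradients agree to within the finite-difference tolerance at every point I tried, which makes me suspect $\sigma(z) - y$ is correct, but I'd still like to see the derivation.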