How does the negative log likelihood function that’s used as the objective in gamma regression link back to the shape and scale parameters of the gamma distribution?

I’ve been looking particularly at xgboost, which uses this definition:

https://github.com/dmlc/xgboost/blob/7663de956c37eb4dd528132214e68ba2851d9696/src/metric/elementwise_metric.cu#L270-L286

I don’t understand the purpose of psi in this equation, given that it is set to 1, which would allow the final function to be simplified significantly.
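To illustrate what I mean, here is my reading of that snippet transcribed into Python (the function names are mine; psi is hard-coded to 1 in the source), alongside what I believe the formula collapses to when psi = 1:

```python
import math

def xgb_gamma_nloglik(y, pred, psi=1.0):
    """Gamma negative log likelihood, as I read it from the linked xgboost
    code (exponential-dispersion-family form; psi is the dispersion)."""
    theta = -1.0 / pred
    a = psi
    b = -math.log(-theta)  # = log(pred)
    c = (1.0 / psi) * math.log(y / psi) - math.log(y) - math.lgamma(1.0 / psi)
    return -((y * theta - b) / a + c)

def simplified_nloglik(y, pred):
    """With psi = 1, c vanishes (log(y) - log(y) - lgamma(1) = 0),
    leaving just y/pred + log(pred)."""
    return y / pred + math.log(pred)
```

With psi fixed at 1 the two functions agree, which is why the extra terms look redundant to me.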

The gamma deviance is more intuitive to me, as the deviance for an individual prediction vs. actual remains constant whenever the multiplicative difference between prediction and actual stays constant. That ties in with how I interpret the connection between mean and variance for a gamma distribution: to be exact, when the gamma distribution is parameterised by the mean and a fixed shape parameter alpha, the variance increases as the mean increases.
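Concretely, the invariance property I mean, using the standard unit gamma deviance formula:

```python
import math

def gamma_deviance(y, mu):
    """Unit gamma deviance: 2 * ((y - mu)/mu - log(y/mu))."""
    return 2.0 * ((y - mu) / mu - math.log(y / mu))

# Scaling actual and prediction by the same factor leaves the
# deviance unchanged, since both terms depend only on the ratio y/mu.
base = gamma_deviance(2.0, 3.0)
for k in (10.0, 100.0, 1000.0):
    assert abs(gamma_deviance(2.0 * k, 3.0 * k) - base) < 1e-9
```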