Actually, what is the maximum output token limit for models nowadays? I think I've been confusing context length with output length. I assumed output length is usually capped at 4k tokens. Does such a limit apply to all models, including the ones I mentioned? Their model cards don't have any info on token limits.