I know that vLLM and TensorRT can be used to speed up LLM inference. I'm trying to find other tools that do similar things so I can compare them. Do you guys have any suggestions?

vLLM: speeds up inference (e.g. via PagedAttention and continuous batching)

TensorRT: speeds up inference, optimized for NVIDIA GPUs

DeepSpeed: mainly speeds up the training phase, though it also offers inference optimizations (DeepSpeed-Inference)