Last week we released c3-llamacpp, a containerized llama.cpp with a fast Hugging Face downloader. This week it's c3-vllm, which containerizes vLLM, the final boss of LLM API servers.
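
vLLM serves an OpenAI-compatible HTTP API, so once the container is up you can talk to it with any HTTP client. Here is a minimal sketch of what that looks like, assuming the c3-vllm container is published on localhost:8000 and serving the model id shown; the port mapping and model name are illustrative assumptions, not something specified by this release.

```python
# Minimal sketch: querying a running c3-vllm container through vLLM's
# OpenAI-compatible API. Assumptions (not from the announcement): the
# container is reachable on localhost:8000 and serves the model id below.
import requests

BASE_URL = "http://localhost:8000/v1"          # assumed host/port mapping
MODEL_ID = "meta-llama/Llama-3.1-8B-Instruct"  # hypothetical model id

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the same request works with any OpenAI client library by pointing its base URL at the container.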
