Fair point - yeah, the *training phase* for LLMs eats up insane resources. But once they're running? The way they handle context windows is actually pretty wild. Give them the right setup and they squeeze out massive value from minimal input. It's that in-context learning magic.
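To make the in-context learning point concrete, here is a minimal sketch of what it looks like in practice: no retraining, just a handful of labeled examples packed into the prompt so an already-trained model can pick up the task at inference time. The task, format, and function name below are illustrative assumptions, not any specific model's API.

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: demonstrations first, then the query.

    This is the core of in-context learning: the model's weights never
    change; the examples in the context window steer its behavior.
    """
    lines = ["Classify the sentiment as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")  # blank line between demonstrations
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")  # the model completes from here
    return "\n".join(lines)

# Hypothetical demonstrations — any labeled pairs work.
examples = [
    ("I love this exchange", "positive"),
    ("Fees are way too high", "negative"),
]
prompt = build_few_shot_prompt(examples, "Great uptime lately")
print(prompt)
```

The whole "training cost vs. inference value" trade-off lives here: the expensive pretraining is done once, and afterwards a few lines of context are enough to repurpose the model for a new task.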

PermabullPete · 12h ago
High cost-performance after training.

ForkYouPayMe · 18h ago
The training cost is too high.

GateUser-beba108d · 18h ago
Lower the training costs quickly.

SchrödingersNode · 18h ago
Resources, once gone, are gone forever.

MetaMaskVictim · 18h ago
The model consumes too much power.

CryptoNomics · 18h ago
Actually, compute efficiency follows a logarithmic optimization curve.

LuckyBearDrawer · 18h ago
Training is very exhausting; I can understand.