How OpenAI Optimizes LLM Performance
How Abilities in Large Language Models are Affected by Supervised Fine-tuning Data Composition
A List of Practical Prompts
LoRA or Full-Parameter Fine-Tuning for Large Language Models? An In-Depth Analysis Based on LLaMA 2
Fuyu-8B: A Multimodal Architecture for AI Agents
ar5iv
How to Get GPT-4 to Write Prompts for You
Efficient Memory Management for Large Language Model Serving with PagedAttention
Continuous Batching: A Powerful Technique for Boosting LLM Serving Throughput
Colossal-LLaMA-2: An Open-Source, Commercially Usable Chinese LLaMA-2, Trained in Half a Day on a Budget of About 1,000 RMB with Performance Rivaling Mainstream Large Models