MEDITRON is a pair of open-source large language models (LLMs) with 7 and 70 billion parameters, tailored to the medical domain. The models were trained on carefully curated medical data sources, including peer-reviewed literature and clinical practice guidelines. MEDITRON outperforms other open-source models and closed models such as GPT-3.5 on medical benchmarks, coming within 5-10% of GPT-4 and Med-PaLM-2.
This page was last edited on 2024-04-12.