Abstract
Mixture-of-Experts Tuning (MoE-Tuning) has effectively enhanced the performance of general-purpose MLLMs with fewer parameters, yet its application in resource-limited medical settings has not been fully explored. To address this gap, we developed MoE-TinyMed, a model tailored to medical applications that significantly reduces parameter demands. In evaluations on the VQA-RAD, SLAKE, and PathVQA datasets, MoE-TinyMed outperformed LLaVA-Med in all closed-ended Med-VQA settings with just 3.6B parameters. Additionally, a streamlined 2B-parameter version surpassed LLaVA-Med's performance on PathVQA, showcasing its effectiveness in resource-limited healthcare settings.
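To make the MoE-Tuning idea concrete, the sketch below shows a generic sparsely-gated Mixture-of-Experts feed-forward layer with top-k routing, which is the standard mechanism that keeps the number of *active* parameters per token small. This is a minimal illustration under common assumptions, not MoE-TinyMed's actual implementation; the class name, expert count, and dimensions are hypothetical.

```python
# Generic sparse MoE FFN layer: a router picks the top-k experts per token,
# so only k of the E expert MLPs run for each token (hypothetical sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoEFFN(nn.Module):
    def __init__(self, d_model: int = 768, d_hidden: int = 3072,
                 num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # token-to-expert gate
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing
        tokens = x.reshape(-1, x.shape[-1])
        gate_logits = self.router(tokens)                         # (N, E)
        weights, expert_idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                      # renormalize over top-k
        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = expert_idx[:, slot] == e
                if mask.any():  # each expert only processes the tokens routed to it
                    out[mask] += weights[mask, slot, None] * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = SparseMoEFFN()
    y = layer(torch.randn(2, 16, 768))
    print(y.shape)  # torch.Size([2, 16, 768])
```

Because only the router and the k selected experts participate in each forward pass, the compute per token stays close to that of a single dense FFN even as total expert capacity grows, which is why MoE tuning suits resource-limited deployments.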
URL
https://arxiv.org/abs/2404.10237