Embedding model: bge-small-en-v1.5 (130 MB), 384-dimensional vectors
Reranker: Qwen3-Reranker-0.6B (1.2 GB), MTEB-R score 65.80
Query expansion: Qwen2.5-0.5B-Instruct (1.0 GB), runs locally
Inference framework: PyTorch (CPU/CUDA), auto-detects and uses GPU acceleration
...
Primary use case: Finding optimal BERT models for ontology classification with limited compute resources.
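A minimal sketch of how the embedding half of this stack could be wired up, including the CPU/CUDA auto-detection the list mentions. The Hugging Face model id `BAAI/bge-small-en-v1.5` and the use of the `sentence-transformers` package are assumptions, not stated in the source:

```python
import torch


def pick_device() -> str:
    """Auto-detect the device as described above: prefer CUDA, fall back to CPU."""
    return "cuda" if torch.cuda.is_available() else "cpu"


EMBED_DIM = 384  # output dimension of bge-small-en-v1.5


def embed(texts: list[str]):
    """Embed texts with bge-small-en-v1.5 via sentence-transformers.

    Hypothetical wiring: the model id and library choice are assumptions.
    Returns an array of shape (len(texts), 384).
    """
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("BAAI/bge-small-en-v1.5", device=pick_device())
    return model.encode(texts, normalize_embeddings=True)


if __name__ == "__main__":
    print(pick_device())
```

Normalizing embeddings lets downstream retrieval use a plain dot product as cosine similarity, which keeps the first-stage search cheap before the heavier Qwen3 reranker is applied.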