- C2LLM Technical Report: A New Frontier in Code Retrieval via Adaptive Cross-Attention Pooling (Paper · 2512.21332 · Published · 13)
- codefuse-ai/C2LLM-7B (Feature Extraction · 8B · Updated · 92 · 5)
- codefuse-ai/C2LLM-0.5B (Feature Extraction · 0.5B · Updated · 73 · 5)
- codefuse-ai/F2LLM-0.6B (Feature Extraction · 0.6B · Updated · 198 · 11)
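The three models above are feature-extraction (embedding) models aimed at code retrieval. As a minimal sketch of how such embeddings are consumed downstream, the snippet below ranks code snippets by cosine similarity to a query embedding. The vectors here are toy stand-ins, not real model outputs, and the C2LLM models' own pooling is adaptive cross-attention per the paper title; only the generic retrieval step is shown.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors (plain Python lists)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for embeddings that a model such as codefuse-ai/C2LLM-0.5B
# would produce for a natural-language query and two code snippets.
query = [0.9, 0.1, 0.0]
snippets = {
    "binary_search.py": [0.8, 0.2, 0.1],
    "http_server.go": [0.1, 0.9, 0.3],
}

# Retrieval = rank candidate snippets by similarity to the query embedding.
ranked = sorted(snippets, key=lambda k: cosine_sim(query, snippets[k]), reverse=True)
print(ranked[0])  # binary_search.py ranks first
```

In a real pipeline the snippet embeddings would be precomputed and indexed; only the query is embedded at search time.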
CodeFuse AI
Papers
- C2LLM Technical Report: A New Frontier in Code Retrieval via Adaptive Cross-Attention Pooling
- F2LLM Technical Report: Matching SOTA Embedding Performance with 6 Million Open-Source Data
This is a collection of the Ling-Coder Lite open-source models and datasets.
- inclusionAI/Ling-Coder-lite (Text Generation · 17B · Updated · 149 · 50)
- inclusionAI/Ling-Coder-lite-base (Text Generation · 17B · Updated · 125 · 26)
- inclusionAI/Ling-Coder-SFT (Viewer · Updated · 4.48M · 1.74k · 33)
- inclusionAI/Ling-Coder-DPO (Viewer · Updated · 253k · 102 · 8)
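The two datasets above serve different training stages: SFT data pairs a prompt with a single target response, while DPO data pairs a prompt with a preferred and a dispreferred response. The exact column names in Ling-Coder-SFT and Ling-Coder-DPO are not shown on this page, so the field names below are hypothetical illustrations of the two record shapes.

```python
# Hypothetical record shapes; the actual schemas of inclusionAI/Ling-Coder-SFT
# and inclusionAI/Ling-Coder-DPO may use different column names.
sft_record = {
    "prompt": "Write a Python function that reverses a string.",
    "response": "def reverse(s):\n    return s[::-1]",
}

dpo_record = {
    "prompt": sft_record["prompt"],
    "chosen": "def reverse(s):\n    return s[::-1]",
    "rejected": "def reverse(s):\n    return reversed(s)  # iterator, not a str",
}

def to_dpo_pair(record):
    """Split a preference record into (preferred, dispreferred) prompt/completion pairs."""
    preferred = (record["prompt"], record["chosen"])
    dispreferred = (record["prompt"], record["rejected"])
    return preferred, dispreferred

preferred, dispreferred = to_dpo_pair(dpo_record)
```

A DPO trainer consumes both completions for the same prompt and optimizes the model to rank the chosen one higher.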
Code LLMs with extra training on third-party models
- codefuse-ai/CodeFuse-CodeLlama-34B (Text Generation · 34B · Updated · 33 · 93)
- codefuse-ai/CodeFuse-DeepSeek-33B (Text Generation · 33B · Updated · 131 · 62)
- codefuse-ai/CodeFuse-DeepSeek-33B-4bits (Text Generation · Updated · 24 · 10)
- codefuse-ai/CodeFuse-CodeLlama-34B-4bits (Text Generation · Updated · 27 · 27)
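The -4bits variants are quantized versions of the 33B/34B models, which shrinks the weight footprint by roughly 4x versus a 16-bit baseline. A back-of-the-envelope calculation of the savings (assuming an fp16 baseline and ignoring activations, KV cache, and quantization overhead):

```python
def approx_weight_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Rough weight-memory footprint in GB: params * bits / 8, ignoring overhead."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# CodeFuse-CodeLlama-34B: fp16 weights vs the 4-bit variant.
fp16_gb = approx_weight_gb(34, 16)  # ~68 GB
int4_gb = approx_weight_gb(34, 4)   # ~17 GB
print(fp16_gb, int4_gb)
```

Real quantized checkpoints are somewhat larger than this estimate because some layers (embeddings, norms) typically stay in higher precision.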
- MFTCoder: Boosting Code LLMs with Multitask Fine-Tuning (Paper · 2311.02303 · Published · 12)
- CodeFuse-13B: A Pretrained Multi-lingual Code Large Language Model (Paper · 2310.06266 · Published · 2)
- CoBa: Convergence Balancer for Multitask Finetuning of Large Language Models (Paper · 2410.06741 · Published · 3)
- Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM (Paper · 2503.17793 · Published · 23)
Rodimus models developed by the CodeFuse team
Native models by the CodeFuse team