• Qwen2.5-Coder

  • Nov 12 2024
  • Length: 24 mins
  • Podcast

  • Summary

  • 🔷 Qwen2.5-Coder Technical Report

    The report introduces the Qwen2.5-Coder series, which includes the Qwen2.5-Coder-1.5B and Qwen2.5-Coder-7B models. These models are specifically designed for coding tasks and have been pre-trained on a massive dataset of 5.5 trillion code-related tokens. A significant focus is placed on data quality, with detailed cleaning and filtering processes, and advanced training techniques such as file-level and repo-level pre-training. The models were rigorously tested on various benchmarks, including code generation, completion, reasoning, repair, and text-to-SQL tasks, where they demonstrated strong performance, even surpassing larger models in some areas. The report concludes with suggestions for future research, such as scaling model size and enhancing reasoning abilities.
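As a hedged illustration of the file-level fill-in-the-middle (FIM) pre-training the report describes, a completion prompt can be assembled from FIM special tokens. The token names follow the report's description of the format; the exact template below is an assumption for illustration, not a verified API:

```python
# Sketch of a fill-in-the-middle (FIM) prompt for a code model such as
# Qwen2.5-Coder. The model is trained to generate the missing "middle"
# given the code before (prefix) and after (suffix) the gap.
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange prefix and suffix so the model fills in the gap between them."""
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"

# Example: ask the model to complete the body of a function.
prompt = build_fim_prompt("def add(a, b):\n    return ", "\n")
```

The model's output following `<|fim_middle|>` is then spliced between the prefix and suffix, which is what powers in-editor code completion.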

    📎 Link to paper

