QuixiAI/dolphin-coder
How to use monsterapi/gpt2_137m_DolphinCoder with PEFT:
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base GPT-2 model, then attach the DolphinCoder adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("gpt2")
model = PeftModel.from_pretrained(base_model, "monsterapi/gpt2_137m_DolphinCoder")

Model Used: gpt2
Dataset: cognitivecomputations/dolphin-coder
Dolphin-Coder dataset – a high-quality collection of 100,000+ coding questions and responses. It is well suited for supervised fine-tuning (SFT) and for teaching language models to improve on coding tasks.
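For SFT, each question/response pair is typically flattened into a single training string. The record layout and prompt template below are illustrative assumptions, not the dataset's documented schema – check the column names on the dataset's hub page before use:

```python
# Hypothetical dolphin-coder-style record; the field names "question" and
# "response" are assumed here for illustration.
record = {
    "question": "Write a Python function that reverses a string.",
    "response": "def reverse(s):\n    return s[::-1]",
}

def to_sft_text(rec):
    """Concatenate a question and its response into one supervised training string."""
    return f"### Question:\n{rec['question']}\n\n### Response:\n{rec['response']}"

print(to_sft_text(record))
```

A formatting function like this is usually mapped over the whole dataset before tokenization, so the model learns to continue a question prompt with a code response.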
Using MonsterAPI's no-code LLM finetuner, this finetuning cost $1.96 for one full epoch.

License: apache-2.0
Base model
openai-community/gpt2