OLMo-2-7B Fine-tuned on AICB Benchmark

This model is a fine-tuned version of allenai/OLMo-2-1124-7B-Instruct on the AICB (Analog Integrated Circuit Benchmark) dataset.

Model Details

  • Base Model: allenai/OLMo-2-1124-7B-Instruct
  • Training Data: AICB Benchmark (520 QA samples on analog circuits)
  • Task: Multiple-choice question answering for analog integrated circuits

Usage

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# BF16 weights; device_map="auto" places the 7B model on available accelerators
model = AutoModelForCausalLM.from_pretrained(
    "vikrahul77/OLMo-2-7B-AICB-SFT",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("vikrahul77/OLMo-2-7B-AICB-SFT")

# Example usage
messages = [
    {"role": "user", "content": "Your question about analog circuits here..."}
]
# add_generation_prompt=True appends the assistant turn header so the model answers
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
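
Since AICB items are multiple-choice, the answer options typically go directly into the prompt. This card does not document the exact template used during fine-tuning, so the A–D layout and the sample question below are illustrative assumptions; adapt them to the dataset's actual format. The snippet continues from the model and tokenizer loaded above.

# Hypothetical multiple-choice prompt; the A-D layout is an assumption,
# not the documented AICB training format.
question = (
    "Which mismatch primarily sets the input-referred offset of a MOSFET "
    "differential pair?\n"
    "A) Load resistance tolerance\n"
    "B) Threshold-voltage mismatch\n"
    "C) Supply-voltage ripple\n"
    "D) Package parasitics"
)
messages = [{"role": "user", "content": question}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=16, do_sample=False)
# Decode only the newly generated tokens, i.e. the model's chosen option
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))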

Training Details

  • Framework: Oumi
  • Training Type: Supervised Fine-Tuning (SFT)
  • Hardware: NVIDIA H100 PCIe
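
The Oumi configuration used for this run is not published with the card. As a rough orientation only: Oumi drives SFT from a YAML config launched via its CLI (oumi train -c <config>), so a run along these lines might look like the sketch below. Every key, value, and hyperparameter here is an illustrative assumption, not the actual recipe.

# olmo2_aicb_sft.yaml -- hedged sketch, launched with: oumi train -c olmo2_aicb_sft.yaml
model:
  model_name: "allenai/OLMo-2-1124-7B-Instruct"
  torch_dtype_str: "bfloat16"

data:
  train:
    datasets:
      - dataset_name: "..."   # the AICB SFT data; exact dataset id not published here

training:
  trainer_type: "TRL_SFT"
  num_train_epochs: 3          # assumed; hyperparameters are not documented
  per_device_train_batch_size: 4
  output_dir: "output/olmo2-aicb-sft"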