itarutomy committed (verified)
Commit b9037b4 · 1 Parent(s): df157f8

Re-save with cleaned inner.* keys and torch serialization; include custom code

Files changed (3)
  1. README.md +5 -6
  2. config.json +1 -1
  3. pytorch_model.bin +3 -0
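
The commit message mentions re-saving the checkpoint with cleaned `inner.*` keys and plain torch serialization. Below is a minimal sketch of what such a re-save step could look like; the `inner.` prefix handling is taken from the commit message, but the checkpoint file names and the exact clean-up code are assumptions, not the repository's actual export script.

```python
import torch

# Hypothetical clean-up step (illustrative file names): strip a leading
# "inner." prefix left over from a wrapper module's state-dict keys,
# then re-save the weights with plain torch serialization.
state_dict = torch.load("checkpoint_with_wrapper.bin", map_location="cpu")

cleaned = {
    (k[len("inner."):] if k.startswith("inner.") else k): v
    for k, v in state_dict.items()
}

# Plain torch serialization (pytorch_model.bin), as opposed to e.g. safetensors.
torch.save(cleaned, "pytorch_model.bin")
```
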
README.md CHANGED
@@ -2,13 +2,13 @@
  title: LLM Workshop Hands-on GPT
  license: mit
  language:
- - ja
+ - ja
  datasets:
- - hotchpotch/fineweb-2-edu-japanese
+ - hotchpotch/fineweb-2-edu-japanese
  tags:
- - gpt
- - transformer
- - from-scratch
+ - gpt
+ - transformer
+ - from-scratch
  pipeline_tag: text-generation
  ---

@@ -26,7 +26,6 @@ pipeline_tag: text-generation
  ## 使い方
  ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer
-
  tok = AutoTokenizer.from_pretrained("gpt2")
  model = AutoModelForCausalLM.from_pretrained(
      "itarutomy/llm_workshop_hands_on_gpt-model",
config.json CHANGED
@@ -1,7 +1,7 @@
  {
  "activation_function": "gelu_new",
  "architectures": [
- "GPTScratchForCausalLM"
+ "ExportableGPTScratchForCausalLM"
  ],
  "attn_pdrop": 0.1,
  "auto_map": {
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a3a0044fb8374fab6cbf7861280e819920f351a1ca53c3d586dc533e5d2467ca
+ size 54781246
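
The new pytorch_model.bin is stored as a Git LFS pointer (about 55 MB of actual weights). One way to verify that the re-saved state dict no longer carries the `inner.*` prefix is to download the raw file and inspect its keys, as in the sketch below; the printed key names are of course model-specific.

```python
import torch
from huggingface_hub import hf_hub_download

# Download the LFS-backed weights file and peek at the state-dict keys to
# confirm the "inner." prefix was stripped during the re-save.
path = hf_hub_download(
    repo_id="itarutomy/llm_workshop_hands_on_gpt-model",
    filename="pytorch_model.bin",
)
state_dict = torch.load(path, map_location="cpu")
print(any(k.startswith("inner.") for k in state_dict))  # expected: False
print(list(state_dict)[:5])
```
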