---
library_name: transformers
pipeline_tag: text-generation
inference: true
widget:
- text: Hello!
example_title: Hello world
group: Python
---
This model is for debugging. It is randomly initialized using the config from [ibm-fms/Bamba-9B](https://huggingface.co/ibm-fms/Bamba-9B) but is much smaller in size.

Code to create this model:
```python
import torch
from transformers import (AutoConfig, AutoModelForCausalLM, AutoTokenizer,
                          GenerationConfig, set_seed)

model_id = "ibm-fms/Bamba-9B"
repo_id = "tiny-random/bamba"
save_path = f"/tmp/{repo_id}"

# Load the original config and shrink it so the model stays tiny
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
config.attn_layer_indices = [1]
config.attn_rotary_emb = 4
config.hidden_size = 16
config.intermediate_size = 32
config.num_attention_heads = 2
config.num_hidden_layers = 2
config.num_key_value_heads = 1
config.mamba_expand = 4
config.mamba_d_head = 8
config.mamba_n_heads = 8
config.mamba_d_state = 8

# Reuse the original tokenizer unchanged
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.save_pretrained(save_path)

# Build the model from the shrunken config (weights are randomly initialized)
model = AutoModelForCausalLM.from_config(
    config, torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
# model.generation_config = GenerationConfig.from_pretrained(
#     model_id, trust_remote_code=True
# )

# Re-initialize all parameters deterministically
set_seed(42)
with torch.no_grad():
    for name, p in sorted(model.named_parameters()):
        torch.nn.init.normal_(p, 0, 0.5)
        print(name, p.shape)
model.save_pretrained(save_path)

# Smoke test: reload the saved checkpoint and generate a couple of tokens
model = AutoModelForCausalLM.from_pretrained(save_path).cuda()
tokenizer = AutoTokenizer.from_pretrained(save_path)
message = ["Hello, world!"]
inputs = tokenizer(message, return_tensors='pt', return_token_type_ids=False).to(model.device)
response = model.generate(**inputs, max_new_tokens=2)[0]
print(tokenizer.convert_ids_to_tokens(response, skip_special_tokens=False))
```
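
A minimal usage sketch via the `pipeline` API, assuming the checkpoint above has been pushed to the Hub under the `tiny-random/bamba` repo id used in the script:

```python
import torch
from transformers import pipeline

# Assumes the checkpoint was pushed to the Hub as tiny-random/bamba
pipe = pipeline(
    "text-generation",
    model="tiny-random/bamba",
    torch_dtype=torch.bfloat16,
)
# Output is noise: the weights are randomly initialized, not trained
print(pipe("Hello!", max_new_tokens=2))
```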