How do I access Llama 3.1 70B in my Space?

This doesn't seem to work. Can someone help me with working code?


from transformers import AutoConfig, AutoModelForCausalLM

# Load the config and override the RoPE scaling settings
config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3.1-70B", revision="main")
config.rope_scaling = {"type": "llama3", "factor": 8.0}

# Load the model with the modified config (requires access to the gated repo)
model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-70B", config=config, token=True)
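For context, loading the full 70B weights inside a Space usually exceeds the hardware available on free or small paid tiers, so one common alternative is to call the model through the Inference API instead of loading it locally. Below is a minimal sketch using huggingface_hub's InferenceClient, assuming you have accepted the gated-model license, that the Instruct variant (meta-llama/Meta-Llama-3.1-70B-Instruct) is reachable through the serverless Inference API, and that your token is stored as a Space secret named HF_TOKEN (the secret name is an assumption).

import os
from huggingface_hub import InferenceClient

# Sketch only, not an official recipe. Assumes:
#   - you have access to the gated Llama 3.1 repo
#   - a Space secret named HF_TOKEN holds your access token (name is an assumption)
#   - the Instruct variant is served by the serverless Inference API
client = InferenceClient(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",
    token=os.environ.get("HF_TOKEN"),
)

# Send a chat-style request instead of loading 70B weights in the Space itself
response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=128,
)
print(response.choices[0].message.content)

If you do need the weights in-process, the Space would have to run on hardware with enough GPU memory for a 70B model (typically multiple A100/H100-class GPUs, or a quantized variant), which is usually the real reason the from_pretrained approach fails on a Space.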