LLM Settings
Model: @cf/meta/llama-3.2-3b-instruct
Temperature: controls sampling randomness; higher values give more varied output.
Max Tokens: upper limit on the length of the generated response.
System Prompt: You are a helpful assistant.
Stream Response: return tokens incrementally as they are generated instead of waiting for the full reply.
Advanced Settings
Top P: nucleus sampling; only tokens within the top cumulative probability mass are considered.
Top K: restricts sampling to the K most likely next tokens.
Frequency Penalty: lowers the probability of a token in proportion to how often it has already appeared.
Presence Penalty: lowers the probability of any token that has appeared at least once.
Repetition Penalty: scales down repeated tokens to reduce verbatim loops.
Reset settings
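The fields above map naturally onto a single settings object that the chat client would send with each request. The sketch below assumes camelCase field names, hypothetical default values, and a snake_case request body shaped like common Workers AI text-generation inputs; none of these names are confirmed by the app itself.

```typescript
// Assumed shape of the settings this panel edits.
interface LlmSettings {
  model: string;
  temperature: number;       // sampling randomness
  maxTokens: number;         // response length cap
  systemPrompt: string;
  stream: boolean;
  topP: number;              // nucleus sampling threshold
  topK: number;              // candidate-token cutoff
  frequencyPenalty: number;
  presencePenalty: number;
  repetitionPenalty: number;
}

// Hypothetical defaults; "Reset settings" would restore these.
const defaultSettings: LlmSettings = {
  model: "@cf/meta/llama-3.2-3b-instruct",
  temperature: 0.7,
  maxTokens: 1024,
  systemPrompt: "You are a helpful assistant.",
  stream: true,
  topP: 1,
  topK: 40,
  frequencyPenalty: 0,
  presencePenalty: 0,
  repetitionPenalty: 1,
};

// Map the camelCase settings onto a snake_case request body.
// The system prompt is prepended as the first chat message.
function toRequestBody(
  s: LlmSettings,
  messages: { role: string; content: string }[],
) {
  return {
    messages: [{ role: "system", content: s.systemPrompt }, ...messages],
    temperature: s.temperature,
    max_tokens: s.maxTokens,
    stream: s.stream,
    top_p: s.topP,
    top_k: s.topK,
    frequency_penalty: s.frequencyPenalty,
    presence_penalty: s.presencePenalty,
    repetition_penalty: s.repetitionPenalty,
  };
}
```

For example, a single user turn would produce a two-message body (system prompt plus the user message) carrying all of the sampling parameters shown in the panel.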
Hosted on NuxtHub
Hub Chat
Ask me anything; type a message to get started.