
Updated Dec 10 · 10,000 context

$0.00 / 1M input tokens · $0.00 / 1M output tokens

RWKV is an RNN (recurrent neural network) with transformer-level performance. It aims to combine the best of RNNs and transformers – great performance, fast inference, low VRAM, fast training, “infinite” context length, and free sentence embedding.
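The "infinite context with low VRAM" claim follows from RWKV's recurrent form: each step reads and updates a fixed-size state instead of attending over the whole history. A minimal conceptual sketch of the RWKV-4-style WKV time-mixing recurrence (simplified here: per-channel decay `w`, current-token bonus `u`; variable names are illustrative, not the official implementation):

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Simplified sketch of the RWKV WKV recurrence.

    k, v: (T, D) key/value sequences; w: (D,) positive per-channel decay;
    u: (D,) bonus for the current token. The state is only two (D,)
    vectors, so memory stays constant regardless of sequence length.
    """
    T, D = k.shape
    num = np.zeros(D)          # running exp-weighted sum of values
    den = np.zeros(D)          # running sum of the exp weights
    out = np.zeros((T, D))
    decay = np.exp(-w)         # older tokens fade geometrically
    for t in range(T):
        cur = np.exp(u + k[t])           # current token's bonus weight
        out[t] = (num + cur * v[t]) / (den + cur)
        num = decay * num + np.exp(k[t]) * v[t]   # absorb token t into state
        den = decay * den + np.exp(k[t])
    return out
```

Because the loop carries only `num` and `den` forward, generation cost per token is O(1) in context length, which is the property the paragraph above refers to.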

RWKV-5 is trained on 100+ world languages (70% English, 15% multilingual, 15% code).

RWKV 3B models are provided for free by Recursal.AI for the beta period. More details here.

#rnn