While recurrent neural network (RNN) models are powerful for tasks that demand fast inference and domain-specific tuning, such as e-commerce, transformer RNN models excel at complex language understanding and are well suited to common AI tasks like classification, prediction, and sequence modeling. Because a transformer RNN model uses a frozen pre-trained transformer (FPT) base, its customizability is limited, and the complexity of the transformer architecture means it requires more training and inference time than RNN-only models. However, it also delivers high-quality results with minimal tuning, even when your data is limited or noisy. (A code sketch of this frozen-base pattern follows the model list below.) To help you get started faster, Lucidworks AI provides the models below as pre-trained bases:
- all_minilm_l12_rnn (Hugging Face)
- snowflake_arctic_embed_l_rnn (recommended model)
- e5_small_v2_rnn
- e5_base_v2_rnn
- e5_large_v2_rnn
- multilingual_e5_small_rnn
- multilingual_e5_base_rnn
- multilingual_e5_large_rnn
- gte_small_rnn
- gte_base_rnn
- gte_large_rnn
- snowflake_arctic_embed_xs_rnn
- snowflake_arctic_embed_s_rnn
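
To make the frozen pre-trained transformer (FPT) pattern concrete, here is a minimal sketch in PyTorch with the Hugging Face transformers library: a pre-trained transformer base is loaded and frozen, and only a small RNN head on top of it is trained. The all-MiniLM-L12-v2 base, the GRU head, and the two-class classification task are illustrative assumptions for this sketch, not Lucidworks' actual implementation.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

# Illustrative base model; Lucidworks' bases wrap similar pre-trained encoders.
BASE = "sentence-transformers/all-MiniLM-L12-v2"

class FrozenTransformerRNN(nn.Module):
    """A frozen pre-trained transformer encoder with a trainable GRU head."""

    def __init__(self, base_name=BASE, hidden_size=256, num_classes=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(base_name)
        # Freeze the pre-trained base: only the RNN head and classifier
        # are tuned, which is why customizability is limited but the
        # model works well with limited or noisy data.
        for param in self.encoder.parameters():
            param.requires_grad = False
        self.rnn = nn.GRU(self.encoder.config.hidden_size, hidden_size,
                          batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, input_ids, attention_mask):
        # The base is frozen, so skip gradient bookkeeping for it.
        with torch.no_grad():
            hidden = self.encoder(
                input_ids=input_ids,
                attention_mask=attention_mask,
            ).last_hidden_state
        # Run the token embeddings through the GRU and classify from
        # its final hidden state: shape (1, batch, hidden_size).
        _, final_state = self.rnn(hidden)
        return self.classifier(final_state.squeeze(0))

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = FrozenTransformerRNN()
batch = tokenizer(["running shoes", "wireless headphones"],
                  padding=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([2, 2])
```

Because gradients flow only through the small RNN head, fine-tuning touches far fewer parameters than full transformer training, which is the trade-off the paragraph above describes: less customizability, but strong results with minimal tuning.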