`useCaseConfig` parameter.
`modelConfig` parameter.

* Llama-3.1-8b-instruct (`llama-3-8b-instruct`)
* Llama-3.2-3b-instruct (`llama-3v2-3b-instruct`)
* Mistral-7B-instruct (v0.2) (`mistral-7b-instruct`)

Model | Sync Length (tokens) | Async Length (tokens) |
---|---|---|
llama-3-8b-instruct | 32000 | 128000 |
llama-3v2-3b-instruct | 32000 | 64000 |
mistral | 8192 | 8192 |
nu-zero-ner | 384 | 384 |
* `gpt-4`
* `gpt-4o`
* `gpt-4o-2024-05-13`
* `gpt-4-0613`
* `gpt-4-turbo`
* `gpt-4-turbo-2024-04-09`
* `gpt-4-turbo-preview`
* `gpt-4-1106-preview`
* `gpt-3.5-turbo`
* `gpt-3.5-turbo-1106`
* `gpt-3.5-turbo-0125`
Azure OpenAI models you want to use. Lucidworks does not support Azure AI Studio.

* Deployment Name for the model you want to use. Use this as the value of the Lucidworks AI API `"modelConfig": "azureDeployment"` field.
* Key1 or Key2 for the model you want to use. Use either as the value of the Lucidworks AI API `"modelConfig": "apiKey"` field.
* Endpoint for the model you want to use. Use this as the value of the Lucidworks AI API `"modelConfig": "azureEndpoint"` field.

The MODEL_ID for Azure OpenAI is `azure-openai`.
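As a minimal sketch of how these three values fit together in a request, the `modelConfig` object might look like the following. The placeholder values are illustrative only; substitute the deployment name, key, and endpoint from your own Azure OpenAI resource:

```json
{
  "modelConfig": {
    "azureDeployment": "my-gpt-4o-deployment",
    "apiKey": "<Key1 or Key2 from the Azure portal>",
    "azureEndpoint": "https://my-resource.openai.azure.com/"
  }
}
```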
* `gemini-2.5-pro` (based on `gemini-2.5-pro-preview-03-25`)
* `gemini-2.5-flash` (based on `gemini-2.5-flash-preview-04-17`)
* `gemini-2.0-flash`
* `gemini-2.0-flash-lite`
These models require the `modelConfig` fields `apiKey`, `googleProjectId`, and `googleRegion`. There are no defaults for any of these fields. The value for `apiKey` is a base64-encoded Google Vertex AI service account key. To learn how to create it, see Create a Google service account key.
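A minimal sketch of these fields in a request follows. The project ID and region values are illustrative placeholders, and the key placeholder stands in for a service account key that has already been base64-encoded:

```json
{
  "modelConfig": {
    "apiKey": "<base64-encoded Google Vertex AI service account key>",
    "googleProjectId": "my-gcp-project",
    "googleRegion": "us-central1"
  }
}
```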
* `claude-sonnet-4-20250514`
* `claude-3-7-sonnet-20250219`
* `claude-3-5-sonnet-20241022`
* `claude-3-5-haiku-20241022`
The generic path for the Prediction API is `/ai/prediction/USE_CASE/MODEL_NAME`.

The generic path for the Async Prediction API is `/ai/async-prediction/USE_CASE/MODEL_NAME`.
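For example, substituting the RAG use case and the `gpt-4o` model into the two templates gives paths like the following. The pairing is purely illustrative, and `rag` as the path segment for that use case is an assumption:

```
/ai/prediction/rag/gpt-4o
/ai/async-prediction/rag/gpt-4o
```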
The GenAI use cases based on the generic path are as follows:
* `memoryUuid`. This use case can be invoked during the RAG use case.
* The `ner` use case, where the LLM ingests text and the entities to extract, and returns a JSON response that contains a list of entities extracted from the text.
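As a hedged illustration of the `ner` use case, a request to `/ai/prediction/ner/nu-zero-ner` (using the `nu-zero-ner` model from the table above) might carry the text plus the entity types to extract. The body shape below, including the `batch` and `useCaseConfig` field names, is an assumption for illustration only, not the documented schema:

```json
{
  "batch": [
    { "text": "Jane Doe joined Acme Corp in Boston in 2021." }
  ],
  "useCaseConfig": {
    "entities": ["person", "organization", "location"]
  }
}
```

Per the description above, the response is a JSON list of the entities found in the text.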