Set Up a Pre-Trained Cold Start Model for Smart Answers
Lucidworks provides these pre-trained cold start models for Smart Answers:

- `qna-coldstart-large` - a large model trained on a variety of corpora and tasks.
- `qna-coldstart-multilingual` - covers 16 languages: Arabic, Chinese (Simplified), Chinese (Traditional), English, French, German, Italian, Japanese, Korean, Dutch, Polish, Portuguese, Spanish, Thai, Turkish, and Russian.
When you use these models, you do not need to run the model training job. Instead, you run a job that deploys the model into Managed Fusion. The Create Seldon Core Model Deployment job deploys your model as a Docker image in Kubernetes, which you can scale up or down like other Managed Fusion services.
These models are a good basis for a cold start solution if your data does not contain much domain-specific terminology. Otherwise, consider training a model using your existing content.
The vector dimension for both models is 512. You might need this information when creating collections in Milvus.
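As a minimal sketch of where the dimension matters, the snippet below builds the vector-field parameters you would supply when defining a Milvus collection for either model. The field and key names here are illustrative, not a documented API; only the dimension (512) and the model names come from the description above.

```python
def milvus_field_params(model_name: str) -> dict:
    """Return vector-field parameters for a cold start model's output.

    The parameter keys below are illustrative; only dim=512 is fixed by
    the models themselves.
    """
    # Both qna-coldstart-large and qna-coldstart-multilingual emit
    # 512-dimensional vectors, so the schema is the same for either model.
    return {
        "name": "vector",        # output column produced by the model
        "type": "FLOAT_VECTOR",  # Milvus vector field type
        "dim": 512,              # dimension shared by both models
    }

params = milvus_field_params("qna-coldstart-large")
print(params["dim"])  # 512
```

If you later switch between the two models, the collection schema can stay the same because the dimension does not change.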
1. Deploy a pre-trained cold-start model into Managed Fusion
The pre-trained cold-start models are deployed using a Managed Fusion job called Create Seldon Core Model Deployment. This job downloads the selected pre-trained model and installs it in Managed Fusion.
1. Navigate to Collections > Jobs.
2. Select Add > Create Seldon Core Model Deployment.
3. Enter a Job ID, such as `deploy-qna-coldstart-multilingual` or `deploy-qna-coldstart-large`.
4. Enter the Model Name, one of the following:
   - `qna-coldstart-multilingual`
   - `qna-coldstart-large`
5. In the Docker Repository field, enter `lucidworks`.
6. In the Image Name field, enter one of the following:
   - `qna-coldstart-multilingual:v1.1`
   - `qna-coldstart-large:v1.1`
7. Leave the Kubernetes Secret Name for Model Repo field empty.
8. In the Output Column Names for Model field, enter one of the following:
   - For `qna-coldstart-multilingual`: `[vector]`
   - For `qna-coldstart-large`: `[vector, compressed_vector]`
9. Click Save.
10. Click Run > Start to start the deployment job.
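The steps above can be sketched as a single job configuration object. The payload below mirrors the fields entered in the UI; the key names, the job `type` value, and the REST endpoint in the trailing comment are assumptions for illustration, not documented Managed Fusion API, so treat the UI steps as authoritative.

```python
import json


def seldon_deploy_job(model_name: str, image_tag: str, columns: list) -> dict:
    """Build a Create Seldon Core Model Deployment job configuration.

    The key names and the "type" value are hypothetical stand-ins for
    whatever the Fusion jobs API actually expects.
    """
    return {
        "id": f"deploy-{model_name}",         # Job ID, as in step 3
        "type": "create-seldon-deployment",   # hypothetical job type key
        "modelName": model_name,              # Model Name, step 4
        "dockerRepository": "lucidworks",     # Docker Repository, step 5
        "imageName": f"{model_name}:{image_tag}",  # Image Name, step 6
        "kubernetesSecretName": "",           # left empty, step 7
        "outputColumnNames": columns,         # Output Column Names, step 8
    }


job = seldon_deploy_job("qna-coldstart-large", "v1.1",
                        ["vector", "compressed_vector"])
print(json.dumps(job, indent=2))

# Submitting the payload to Fusion's jobs API (endpoint path is an
# assumption) would look roughly like:
# requests.post(f"{fusion_url}/api/jobs", json=job, auth=(user, password))
```

Keeping the configuration in code like this makes it easy to see that only the model name, image tag, and output columns differ between the two deployments.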