Lucidworks AI offers several APIs for managing and automating your AI implementation. You can use them to train and manage models, run synchronous and asynchronous predictions, preview and debug pass-through prompts and embedding model tokens, and chunk text. The APIs let you control and customize your AI orchestration to suit your business needs by enabling developer teams to build bespoke integrations and implementations.

You can use the Lucidworks AI APIs in addition to, or instead of, the Lucidworks Platform UI, but some actions are only supported through the APIs or Fusion pipeline stages. For example, you can train and deploy custom models with either the custom model training user interface or the Models API, but running predictions and chunking text require the APIs or Fusion stages. Some APIs are used in conjunction with others. For example, use the Use Case API and the Tokenization API to generate values for the Prediction API.
## API access tokens
Lucidworks AI APIs require you to authenticate using JSON Web Tokens (JWTs). Generate these tokens using the Authentication API, then include them in the request header.
### Fetch an access token
#### Locate your integration
In the Platform UI, navigate to Models > Integrations and click on your integration.
To fetch an access token, you’ll need these details from your integration:
- Client ID
- Client secret
#### Compose your request

You'll send your access token request using the Authentication API. The request uses basic authentication, where your base64-encoded client ID is the username and your base64-encoded client secret is the password. Some clients perform the encoding automatically; if not, you can use the `base64` command line utility.

The value of the `scope` query parameter varies between APIs, and you need a separate access token for each scope:

- The Models API uses the `machinelearning.model` scope.
- All other APIs use the `machinelearning.predict` scope.
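The steps above can be sketched with `curl`. The token endpoint path below is a placeholder, not a documented route; take the real path from the Authentication API reference:

```shell
# Base64-encode the client ID and client secret from your integration.
CLIENT_ID_B64=$(printf '%s' "YOUR_CLIENT_ID" | base64)
CLIENT_SECRET_B64=$(printf '%s' "YOUR_CLIENT_SECRET" | base64)

# TOKEN_PATH is a placeholder; use the path from the Authentication API docs.
TOKEN_URL="https://identity.lucidworks.com/TOKEN_PATH"

# Request a token scoped for the Prediction API; use scope=machinelearning.model
# when you need a token for the Models API instead.
curl -s -X POST "$TOKEN_URL?scope=machinelearning.predict" \
  -u "$CLIENT_ID_B64:$CLIENT_SECRET_B64" || true  # request fails offline or with placeholder credentials
```

With real credentials and the correct path, the response is the JSON body described in the next step.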
#### Fetch the access token
When your Authentication API request succeeds, the JSON response includes an `access_token` key. Copy its value and use it for API authentication.

## Using the APIs
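The response body is omitted from this copy of the page; as a hedge, a successful response follows the standard OAuth2 token shape, roughly like the illustrative example below (values are placeholders, and fields other than `access_token` may vary):

```json
{
  "access_token": "<JWT access token>",
  "token_type": "Bearer",
  "expires_in": 1800,
  "scope": "machinelearning.predict"
}
```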
Once you have an access token, you're ready to send API requests. The base URL depends on the API:

| API | Base URL |
|---|---|
| Authentication API | https://identity.lucidworks.com |
| Async Chunking API | https://APPLICATION_ID.applications.lucidworks.com |
| Async Prediction API | https://APPLICATION_ID.applications.lucidworks.com |
| Models API | https://api.lucidworks.com |
| Prediction API | https://APPLICATION_ID.applications.lucidworks.com |
| Prompting Preview API | https://APPLICATION_ID.applications.lucidworks.com |
| Tokenization API | https://APPLICATION_ID.applications.lucidworks.com |
| Use Case API | https://APPLICATION_ID.applications.lucidworks.com |
Replace APPLICATION_ID with the value from your integration (at Models > Integrations in the Platform UI).
Your request must include a Bearer authorization header containing your access token.
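For illustration, a request with the Bearer header can be composed like this; the endpoint path is a placeholder, not a documented route:

```shell
ACCESS_TOKEN="YOUR_ACCESS_TOKEN"        # access_token value from the Authentication API
APPLICATION_ID="YOUR_APPLICATION_ID"    # from Models > Integrations in the Platform UI

# The Authorization header sent with every request.
AUTH_HEADER="Authorization: Bearer $ACCESS_TOKEN"

# ENDPOINT_PATH is a placeholder; use the route from the relevant API reference.
curl -s "https://$APPLICATION_ID.applications.lucidworks.com/ENDPOINT_PATH" \
  -H "$AUTH_HEADER" || true  # request fails offline or with placeholder values
```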