
    Authentication API

    The Authentication API authenticates a user of the Lucidworks AI APIs. Its single endpoint generates an access token to use in subsequent API requests.

    To view the full configuration specification for an API, click the View API specification button.

    Token guidelines and behavior

    • Lucidworks uses OAuth 2.0 to authenticate your credentials.

    • Tokens generated by the Authentication API are JSON Web Tokens (JWTs).

    • Each generated token is valid for 3600 seconds (one hour). After the token expires, API requests that use it fail with an HTTP 401 Unauthorized error.

    • You can generate a new token at any time. To avoid authorization failures, generate a new token before the current token expires. For example, generate a new token five minutes before the current token expires.

      To calculate the token expiration time, Base64-decode the JWT payload and review the exp field, which contains the expiration time in seconds since the Unix epoch. See the example after this list.
    • The scope for the:

      • Models API is machinelearning.model.

      • Use Case API, Prediction API, Async Prediction API, and Async Chunking API is the same: machinelearning.predict. Therefore, an authentication request for a token for these APIs does not affect an existing Models API token.

        Because the Use Case API, Prediction API, Async Prediction API, and Async Chunking API share the machinelearning.predict scope, requesting a new token with that scope invalidates the existing machinelearning.predict token, so all of those APIs are affected. Manage token regeneration so those APIs continue to run successfully.
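
    For example, the following shell sketch decodes the JWT payload and prints the exp claim. This is a minimal sketch, not part of the API itself: it assumes the token is stored in an ACCESS_TOKEN environment variable, that the jq utility is installed, and that your base64 command supports the --decode flag (on some platforms the flag is -D).

      # Extract the JWT payload (the second dot-separated segment) and convert
      # base64url characters to standard Base64.
      PAYLOAD=$(printf '%s' "$ACCESS_TOKEN" | cut -d '.' -f2 | tr '_-' '/+')

      # Pad the payload to a multiple of four characters so base64 can decode it.
      while [ $(( ${#PAYLOAD} % 4 )) -ne 0 ]; do PAYLOAD="${PAYLOAD}="; done

      # Print the exp claim (expiration time in seconds since the Unix epoch).
      printf '%s' "$PAYLOAD" | base64 --decode | jq '.exp'

    Compare the exp value with the current Unix time (for example, date +%s) to decide when to request a new token.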

    Credentials

    This section contains information exclusive to your organization. Ensure that only authorized individuals can access this data.

    To obtain the API credentials, navigate to the megamenu and click Models > Integrations.

    Click the integration instance to display the details screen.

    The following values are used to obtain authentication tokens and are also sent in API queries:

    • Application ID. The unique value that identifies a specific application. Each application has a different value.

    • Client ID. The unique client value that is part of the information required to obtain OAuth 2.0 authorization. For a self-hosted Fusion client integration, this value is unique to the specific integration.

    • Client Secret. The private client value that is part of the information required to obtain OAuth 2.0 authorization. You must keep this value secret. Do not use it in public clients such as client-side applications.

    The Customer ID is not displayed on this screen. It is the unique value that identifies your organization and is also required for API usage. Lucidworks provides that value to your organization. The Customer ID is generated when the workspace is created, is unique to that workspace, and cannot be reset or changed.

    Obtain the access token

    To use the Authentication API, you must obtain or renew the access token. The access token is also submitted in the other Lucidworks AI API requests.

    1. Copy the Client ID and Client Secret from the Integration Details screen and paste the values into the following command to encode those values into base64:

      echo -n "CLIENT_ID:CLIENT_SECRET" | base64
    2. Copy the following curl command, replacing CLIENT_ID:CLIENT_SECRET_BASE64 with the value you encoded, then paste and run it in your command-line tool.

      curl --request POST \
        --url 'https://identity.lucidworks.com/oauth2/ausao8uveaPmyhv0v357/v1/token?scope=machinelearning.predict&grant_type=client_credentials' \
        --header 'Accept: application/json' \
        --header 'Authorization: Basic [CLIENT_ID:CLIENT_SECRET_BASE64]' \
        --header 'Cache-Control: no-cache' \
        --header 'Content-Type: application/x-www-form-urlencoded'
    If you request a new token with the machinelearning.predict scope, the existing token for that scope is invalidated and all APIs that use it are affected. Manage token regeneration so those APIs continue to run successfully.
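
    After you run the command, the token endpoint returns a standard OAuth 2.0 JSON response that typically includes access_token, token_type (usually Bearer), expires_in, and scope fields. The following sketch is one way to capture the token and send it as a Bearer credential in a later Lucidworks AI API request. It assumes jq is installed; the field name access_token follows the OAuth 2.0 convention, and LUCIDWORKS_AI_API_URL and the request body are placeholders for the specific API you are calling.

      # Request a token and capture the JSON response (same endpoint and headers as above).
      RESPONSE=$(curl --silent --request POST \
        --url 'https://identity.lucidworks.com/oauth2/ausao8uveaPmyhv0v357/v1/token?scope=machinelearning.predict&grant_type=client_credentials' \
        --header 'Accept: application/json' \
        --header 'Authorization: Basic [CLIENT_ID:CLIENT_SECRET_BASE64]' \
        --header 'Content-Type: application/x-www-form-urlencoded')

      # Extract the access token from the response.
      ACCESS_TOKEN=$(printf '%s' "$RESPONSE" | jq -r '.access_token')

      # Pass the token as a Bearer credential in a subsequent API call.
      # LUCIDWORKS_AI_API_URL and the JSON body are placeholders; substitute the
      # endpoint and payload for the Lucidworks AI API you are calling.
      curl --request POST \
        --url "$LUCIDWORKS_AI_API_URL" \
        --header "Authorization: Bearer $ACCESS_TOKEN" \
        --header 'Content-Type: application/json' \
        --data '{}'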