Sending prompts and messages using GenAI LLM Passthrough endpoints

This topic describes how to send prompts and messages to a large language model (LLM) through the GenAI LLM Passthrough API endpoints.
  1. Use these endpoints to send either a single prompt (/prompt or /prompt/stream) or multiple messages (/messages or /messages/stream):
    • api/v1/prompt
    • api/v1/messages
    • api/v1/prompt/stream
    • api/v1/messages/stream
  2. Complete the request body with the model and version properties that are returned by GET /api/v1/models, as shown in the sketch after this list.
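
For example, this sketch retrieves the available models with Python and the requests package. The base URL, header name, and logical ID value are placeholder assumptions; substitute the values for your environment:

  import requests

  # Assumptions: the base URL, header name, and logical ID value vary by environment.
  BASE_URL = "https://<your-genai-host>"
  HEADERS = {"X-Infor-LogicalID": "lid://infor.example.1"}

  # Retrieve the models and versions that the Passthrough endpoints accept.
  response = requests.get(f"{BASE_URL}/api/v1/models", headers=HEADERS)
  response.raise_for_status()

  # Assumption: the response is a JSON array whose entries expose the
  # Name and ID parameters described later in this topic.
  for model in response.json():
      print(model.get("name"), model.get("id"))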

Ensure that these requirements are met:

  • A valid Infor Registry logical ID header is required to invoke the endpoints. A request without a valid header value fails with an invocation error. See the sketch after this list.
  • You can select only one model at a time. If a model and version are not specified, the endpoint defaults to Claude 3 Haiku.
Note: Expect mild variations in the output on every invocation of the LLM, even when you send the same prompt.
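
As a minimal sketch, assuming Python with the requests package, a /prompt invocation with the required logical ID header might look like this. The base URL, header name, and logical ID value are placeholder assumptions:

  import requests

  # Assumptions: the base URL, header name, and logical ID value vary by environment.
  BASE_URL = "https://<your-genai-host>"
  HEADERS = {"X-Infor-LogicalID": "lid://infor.example.1"}

  # Sample /prompt payload from this topic. If model and version are omitted,
  # the endpoint defaults to Claude 3 Haiku.
  payload = {
      "model": "CLAUDE",
      "version": "claude-3-haiku-20240307-v1:0",
      "prompt": "Hello world!",
  }

  response = requests.post(f"{BASE_URL}/api/v1/prompt", json=payload, headers=HEADERS)
  response.raise_for_status()

  # Assumption: the endpoint returns a JSON body.
  print(response.json())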

For reference, review these sample payloads:

  • /prompt API

    {
      "model": "CLAUDE",
      "version": "claude-3-haiku-20240307-v1:0",
      "prompt": "Hello world!"
    }
    
  • /messages API

    {
        "system": "You are a math profressor, you help people answer math questions.",
        "model": "CLAUDE",
        "version": "claude-3-7-sonnet-20250219-v1:0",  
        "messages": [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "data": "What is your favorite algorithm?"
                    },
                    {
                        "type": "text",
                        "data": "I am curious which one you like best."
                    }
                ]
            },
            {
                "role": "assistant",
                "content": [
                    {
                        "type": "text",
                        "data": "Can you tell me yours first?."
                    }
                ]
            },
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "data": "But I asked first"
                    }
                ]
            }
        ]
    }

Where:

  • The model parameter in any Passthrough LLM API is the Name parameter returned by GET /api/v1/models.
  • The version parameter in any Passthrough LLM API is the ID parameter returned by GET /api/v1/models.
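
As an illustrative sketch, assuming the models response exposes the Name and ID parameters as name and id fields, one returned entry maps onto a /prompt payload like this:

  # Assumptions: "entry" is one item returned by GET /api/v1/models, with the
  # Name and ID parameters exposed as "name" and "id" fields.
  entry = {"name": "CLAUDE", "id": "claude-3-haiku-20240307-v1:0"}

  payload = {
      "model": entry["name"],    # Name parameter from GET /api/v1/models
      "version": entry["id"],    # ID parameter from GET /api/v1/models
      "prompt": "Hello world!",
  }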