
# Ollama

This Integration is part of the Ollama Pack.

## Supported versions

Supported Cortex XSOAR versions: 6.0.0 and later.

Integrate with open source LLMs using Ollama. With an instance of Ollama running locally, you can use this integration to hold a conversation in an Incident, download models, and create new models.

## Configure Ollama on Cortex XSOAR

  1. Navigate to Settings > Integrations > Servers & Services.

  2. Search for Ollama.

  3. Click Add instance to create and configure a new integration instance.

    | **Parameter** | **Description** | **Required** |
    | --- | --- | --- |
    | Protocol | HTTP or HTTPS. | False |
    | Server hostname or IP | Enter the Ollama IP or hostname. | True |
    | Port | The port Ollama is running on. | True |
    | Path | By default Ollama's API path is /api, but you may be running it behind a proxy with a different path. | True |
    | Trust any certificate (not secure) | Trust any certificate (not secure). | False |
    | Use system proxy settings | Use system proxy settings. | False |
    | Cloudflare Access Client ID | If Ollama is running behind Cloudflare Zero Trust, provide the Service Access ID here. | False |
    | Cloudflare Access Client Secret | If Ollama is running behind Cloudflare Zero Trust, provide the Service Access Secret here. | False |
    | Default Model | Some commands allow you to specify a model. If no model is provided, this value is used. | False |
  4. Click Test to validate the URLs, token, and connection.
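
Behind the scenes, an integration like this combines the instance parameters above into a base URL and request headers for every call to Ollama. The sketch below is illustrative, not the integration's actual code; the Cloudflare service-token header names (`CF-Access-Client-Id`, `CF-Access-Client-Secret`) are the standard ones Cloudflare Access expects.

```python
# Illustrative sketch: how the instance parameters could be assembled into
# the base URL and headers for requests to Ollama. Function names are
# assumptions, not the integration's real code.

def build_base_url(protocol: str, host: str, port: int, path: str = "/api") -> str:
    """Join the Protocol, Server hostname or IP, Port, and Path parameters."""
    return f"{protocol}://{host}:{port}{path}"

def build_headers(cf_client_id: str = "", cf_client_secret: str = "") -> dict:
    """Cloudflare Zero Trust service tokens are passed as request headers."""
    headers = {"Content-Type": "application/json"}
    if cf_client_id and cf_client_secret:
        headers["CF-Access-Client-Id"] = cf_client_id
        headers["CF-Access-Client-Secret"] = cf_client_secret
    return headers

print(build_base_url("https", "ollama.example.com", 11434))
# https://ollama.example.com:11434/api
```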

## Commands

You can execute these commands from the Cortex XSOAR CLI, as part of an automation, or in a playbook. After you successfully execute a command, a DBot message appears in the War Room with the command details.

### ollama-list-models

Get a list of all available models.

#### Base Command

`ollama-list-models`

#### Input

There are no input arguments for this command.

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.models | unknown | Output of the command. |
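
Ollama serves its installed-model list from the `GET /api/tags` endpoint, and the command output under `ollama.models` reflects that response. A sketch of pulling the model names out of a sample payload (the sample data is illustrative):

```python
# Illustrative sample of the JSON shape returned by Ollama's GET /api/tags.
sample_response = {
    "models": [
        {"name": "llama3:latest", "size": 4661224676},
        {"name": "mistral:latest", "size": 4109865159},
    ]
}

# Extract just the model names from the response.
model_names = [m["name"] for m in sample_response["models"]]
print(model_names)  # ['llama3:latest', 'mistral:latest']
```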

### ollama-model-pull

Pull a model.

#### Base Command

`ollama-model-pull`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | Name of the model to pull. See https://ollama.com/library for a list of options. | Optional |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.pull | unknown | Output of the command. |
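
The pull command maps to Ollama's `POST /api/pull` endpoint. A minimal sketch of building that request body (older Ollama releases document the key as `name`; recent releases also accept `model`):

```python
import json

def pull_payload(model: str, stream: bool = False) -> str:
    """Build the JSON body for Ollama's POST /api/pull endpoint.
    stream=False asks Ollama for a single final response instead of
    a stream of progress objects."""
    return json.dumps({"name": model, "stream": stream})

print(pull_payload("llama3"))  # {"name": "llama3", "stream": false}
```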

### ollama-model-delete

Delete a model.

#### Base Command

`ollama-model-delete`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | The name of the model to delete. | Optional |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.delete | unknown | Output of the command. |

### ollama-conversation

General chat command that tracks the conversation history in the Incident.

#### Base Command

`ollama-conversation`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | The model name. | Optional |
| message | The message to be sent. | Required |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.history | unknown | Output of the command. |
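
Ollama's `POST /api/chat` endpoint is stateless: the caller must resend the full message list on every turn for the model to see prior context. A sketch of the history bookkeeping this command presumably performs with the list it keeps under `ollama.history` (helper names are illustrative):

```python
def append_turn(history: list, role: str, content: str) -> list:
    """Record one chat turn in the Ollama /api/chat message format."""
    history.append({"role": role, "content": content})
    return history

history = []
append_turn(history, "user", "What is Cortex XSOAR?")
append_turn(history, "assistant", "A security orchestration platform.")
append_turn(history, "user", "How does it run playbooks?")

# Body sent to POST /api/chat on the third turn: every prior message
# rides along so the model can answer in context.
payload = {"model": "llama3", "messages": history}
print(len(payload["messages"]))  # 3
```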

### ollama-model-info

Show information for a specific model.

#### Base Command

`ollama-model-info`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | Name of the model to show. | Optional |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.show | unknown | Output of the command. |

### ollama-model-create

Create a new model from a Modelfile.

#### Base Command

`ollama-model-create`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | Name of the model to create. | Required |
| model_file | Contents of the Modelfile. | Required |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.create | unknown | Output of the command. |

### ollama-generate

Generate a response for a given prompt with a provided model. Conversation history is **not** tracked.

#### Base Command

`ollama-generate`

#### Input

| **Argument Name** | **Description** | **Required** |
| --- | --- | --- |
| model | The model name. | Optional |
| message | The message to be sent. | Optional |

#### Context Output

| **Path** | **Type** | **Description** |
| --- | --- | --- |
| ollama.generate | unknown | Output of the command. |
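
Unlike `ollama-conversation`, this command corresponds to Ollama's one-shot `POST /api/generate` endpoint: only the current prompt is sent, so earlier turns are invisible to the model. A sketch of the request body:

```python
import json

def generate_payload(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for Ollama's POST /api/generate endpoint.
    No message history is included -- each call stands alone."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = json.loads(generate_payload("llama3", "Summarize this alert."))
print(sorted(body))  # ['model', 'prompt', 'stream']
```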