Global settings define the default behavior for your entire flow agent. These defaults apply across all nodes unless overridden at the node level.

Overview

Global settings are organized into these categories:
Category           | Purpose
-------------------|------------------------------
System Prompt      | Core AI instructions
Voice Settings     | Default voice for all nodes
LLM Settings       | Default language model
Noise Cancellation | Audio processing
Knowledge Base     | RAG data sources
MCP Tools          | Globally available tools
Outcome            | Call result tracking
Phone Number       | Assigned phone numbers
Call Settings      | Timing and behavior defaults
Webhook            | External event notifications

System Prompt

The system prompt defines the agent’s identity, behavior, and constraints. It applies across all conversation nodes unless a node has its own prompt that overrides it.
System prompt is required. Your flow cannot be saved without it.
The default value is "You are a helpful assistant that will answer users' questions." The system prompt supports system variables and custom variables only. Extracted variables (collected during conversation) are not available here.

Example

You are a customer service representative for Acme Corp.
Today is {{current_date}} and the time is {{current_time}}.
The caller's number is {{user_number}}.
Working hours: {{working_hours}}
→ See Writing Effective Prompts for full guidance on structuring prompts and using variables.
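To make the variable behavior concrete, here is a minimal sketch of how {{variable}} placeholders in a prompt template could be resolved before the call starts. The function name `renderPrompt` and the substitution behavior are assumptions for illustration, not Hamsa's actual implementation.

```typescript
// Hypothetical sketch: resolve {{variable}} placeholders in a prompt template.
// Unknown placeholders are left untouched rather than replaced with blanks.
function renderPrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) =>
    name in vars ? vars[name] : match
  );
}

const prompt = renderPrompt(
  "Today is {{current_date}}. The caller's number is {{user_number}}.",
  { current_date: "2024-11-20", user_number: "+15551234567" }
);
// prompt: "Today is 2024-11-20. The caller's number is +15551234567."
```

Leaving unresolved placeholders visible (rather than silently dropping them) makes missing variables easy to spot during testing.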

Voice Settings

Select a voice from Hamsa’s voice library. The selected voice is the default for all nodes; individual nodes can override it. Voice configuration works the same way as in single-prompt agents. → See Voice Settings for full details on browsing, filtering, previewing, and selecting voices.

LLM Settings

Configure the default language model for your flow. Individual nodes can override the model and temperature.

Provider and Model

Provider         | Models
-----------------|---------------------------------------------------------------
OpenAI (default) | GPT-5, GPT-5-Mini, GPT-5-Nano, GPT-4.1, GPT-4.1-Mini, GPT-4.1-Nano, GPT-4o, GPT-4o-mini
Gemini           | Gemini 2.5-Pro, Gemini 2.5-Flash
Groq             | GPT-120-OSS, GPT-20-OSS
DeepMyst         | gpt-4.1-optimize, gpt-4.1-mini-optimize
Custom           | Any OpenAI-compatible endpoint (requires base URL and API key)
Default: OpenAI GPT-4.1

Temperature

Controls response variability. Range: 0.0–1.0. Default: 0.2.
GPT-5 family models (GPT-5, GPT-5-Mini, GPT-5-Nano) require temperature = 1.0. This is enforced automatically when you select a GPT-5 model.
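The temperature rules above can be sketched as a small resolution function: GPT-5 family models are pinned to 1.0, and anything else is clamped to the documented 0.0–1.0 range. The function name is an assumption for illustration; only the rules themselves come from the documentation.

```typescript
// Sketch of the documented temperature rules: GPT-5 family models are
// forced to 1.0, other models are clamped to the 0.0-1.0 range.
const GPT5_FAMILY = ["GPT-5", "GPT-5-Mini", "GPT-5-Nano"];

function effectiveTemperature(model: string, requested: number): number {
  if (GPT5_FAMILY.includes(model)) return 1.0;
  return Math.min(1.0, Math.max(0.0, requested));
}
```

For example, `effectiveTemperature("GPT-5-Mini", 0.2)` yields 1.0, while `effectiveTemperature("GPT-4.1", 0.2)` keeps the requested 0.2.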

Node-Level Overrides

Model and temperature can be overridden per conversation node. This lets you use a more capable model for complex reasoning nodes and a lighter model for simple confirmations.
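The override behavior amounts to "node value if set, otherwise global default." A minimal sketch, assuming a field layout that is illustrative rather than the platform's actual schema:

```typescript
// Sketch of global-default / node-override resolution for LLM settings.
// Field names are assumptions for illustration.
interface LLMSettings {
  model?: string;
  temperature?: number;
}

function resolveLLM(
  global: Required<LLMSettings>,
  node: LLMSettings
): Required<LLMSettings> {
  return {
    model: node.model ?? global.model,           // node override wins if present
    temperature: node.temperature ?? global.temperature,
  };
}

const resolved = resolveLLM(
  { model: "GPT-4.1", temperature: 0.2 },
  { model: "GPT-5" } // complex reasoning node: heavier model, inherited temperature
);
```

A node that sets only `model` still inherits the global `temperature`, which is what makes partial overrides cheap to apply per node.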

Noise Cancellation

Removes background noise from the caller’s audio. Options: Disabled, Per Sentence, Per Conversation
Do not enable both Noise Cancellation and Background Noise — they conflict and create audio artifacts.

Knowledge Base

Attach knowledge sources that your agent can query during the conversation using Retrieval-Augmented Generation (RAG). When a user asks a question, relevant content is retrieved from the knowledge base and provided to the LLM as context. Knowledge base items are created and managed in the Knowledge Base section and then attached to your agent here. → See Knowledge Base for full details on creating and managing knowledge items.
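To illustrate the retrieval step in miniature: score knowledge items against the user's question and hand the best match to the LLM as context. Real RAG systems use embeddings and vector search; this keyword-overlap toy only shows the shape of the flow, and all names in it are assumptions.

```typescript
// Toy illustration of retrieval: score each knowledge item by word overlap
// with the question and return the best match. Production RAG uses
// embeddings; this is only a sketch of the control flow.
function retrieve(question: string, items: string[]): string | undefined {
  const qWords = new Set(question.toLowerCase().split(/\W+/).filter(Boolean));
  let best: string | undefined;
  let bestScore = 0;
  for (const item of items) {
    const score = item
      .toLowerCase()
      .split(/\W+/)
      .filter((w) => qWords.has(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = item;
    }
  }
  return best; // undefined when nothing overlaps at all
}
```

The retrieved text would then be injected into the LLM's context alongside the system prompt before the model generates its answer.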

MCP Tools

Model Context Protocol (MCP) tools attached here are available to all conversation nodes in the flow — any node can invoke them without per-node configuration. Use MCP tools for functions that multiple nodes need (for example, a customer lookup or a business hours check). For actions that only one node needs, use a node-level tool instead. → See Tools for full details on creating and configuring tools.

Outcome

Define how call results are classified for analytics and reporting.

Configuration

outcomeField: string          // Name of the outcome category
outcomeResponseShape?: object // Expected response structure (optional)

Example

{
  "outcome": "APPOINTMENT_BOOKED",
  "details": {
    "appointment_date": "2024-11-20",
    "appointment_time": "14:00"
  }
}
Outcomes are typically set in End Call nodes, but can also be set via tool calls or webhooks during the conversation.
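Since `outcomeResponseShape` declares an expected structure, a flow might check a reported outcome against it before recording the result. A hedged sketch, reusing the field names from the example above; the validation logic itself is illustrative, not documented platform behavior:

```typescript
// Hypothetical check that an outcome payload carries the detail fields
// the flow expects. Field names come from the example above.
interface OutcomeResult {
  outcome: string;
  details?: Record<string, string>;
}

function hasRequiredDetails(result: OutcomeResult, required: string[]): boolean {
  return required.every(
    (key) => result.details !== undefined && key in result.details
  );
}

const booked: OutcomeResult = {
  outcome: "APPOINTMENT_BOOKED",
  details: { appointment_date: "2024-11-20", appointment_time: "14:00" },
};
```

Here `hasRequiredDetails(booked, ["appointment_date", "appointment_time"])` is true, while a missing field such as `"appointment_location"` would make it false.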

Phone Number

Assign phone numbers to this agent for inbound and outbound calls.
Inbound: When a customer calls an assigned number, the call routes to this flow agent.
Outbound: When initiating a call via API, the agent uses one of the assigned numbers as the caller ID.
All assigned numbers use the same flow configuration. To use different flows for different numbers, create separate agents.

Call Settings

Configure timing and behavior defaults for all calls.
Setting                       | Range         | Default
------------------------------|---------------|--------
Response Delay                | 100–1500 ms   | 400 ms
Interrupt                     | On / Off      | On
User Inactivity Timeout       | 5–60 s        | 15 s
Max Call Duration             | 30 s – 1 hour | 5 min
Minimum Interruption Duration | 0.2–1.5 s     | 0.5 s
VAD Activation Threshold      | 0.2–0.9       | 0.5
Background Noise              | On / Off      | Off
Thinking Voice                | On / Off      | Off
→ See Call Behavior Settings for full details on each setting.

Webhook

Send call events to an external HTTP endpoint.

Configuration

webhookUrl: string | null
webhookAuth: {
  authKey: 'noAuth' | 'bearer' | 'basic'
  authSecret?: string
}
The webhook endpoint must respond within 5 seconds. Return 200 OK quickly and process the payload asynchronously. → See Webhooks for event types, payload structure, and authentication details.
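As one way to picture the `webhookAuth` config in action, here is a sketch of building the outgoing Authorization header from it. The assumption that `authSecret` holds a raw token for bearer auth and a `user:pass` pair for basic auth is illustrative, not documented behavior.

```typescript
// Hypothetical mapping from the webhookAuth config above to an outgoing
// Authorization header. The authSecret layout is an assumption.
type WebhookAuth = {
  authKey: "noAuth" | "bearer" | "basic";
  authSecret?: string;
};

function authHeader(auth: WebhookAuth): string | null {
  switch (auth.authKey) {
    case "bearer":
      return `Bearer ${auth.authSecret ?? ""}`;
    case "basic":
      // basic auth base64-encodes a "user:pass" credential pair
      return `Basic ${Buffer.from(auth.authSecret ?? "").toString("base64")}`;
    default:
      return null; // noAuth: no header is sent
  }
}
```

For example, `authHeader({ authKey: "bearer", authSecret: "abc" })` yields `"Bearer abc"`, and `noAuth` yields no header at all.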

Next Steps

Node Types

Learn about the 8 available node types

Transitions

Control how conversations move between nodes

Variables

Pass data between nodes and into prompts

Best Practices

Tips for building reliable flow agents