Tools let your AI agent take actions during a conversation – transfer calls, end calls, or call external APIs – based on the context of the conversation and your prompt instructions. When a tool is attached to a workflow node, the LLM decides when to invoke it and what parameters to pass, based on the user's spoken intent and your node-level instructions.
Documentation Index
Fetch the complete documentation index at: https://docs.dograh.com/llms.txt
Use this file to discover all available pages before exploring further.
Tool Types
Dograh provides two categories of tools:
Built-in Tools
Pre-configured tools that handle common telephony operations out of the box:
- Call Transfer – Transfer the active call to a phone number or SIP endpoint
- End Call – Terminate the call when the conversation is complete
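As an illustration of how these built-in tools might be configured on a node, here is a minimal sketch. The field names and shape are assumptions for illustration, not Dograh's documented schema:

```python
# Hypothetical configuration for the two built-in tools.
# Field names here are illustrative assumptions, not Dograh's actual API.
call_transfer_tool = {
    "type": "call_transfer",
    "destination": "+15551234567",  # phone number or SIP endpoint to transfer to
}

end_call_tool = {
    "type": "end_call",  # no parameters needed; the LLM decides when to invoke it
}
```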
Custom Tools
Tools you define to integrate with any external system:
- HTTP API – Call any REST API endpoint during a conversation (e.g., CRM updates, data lookups, triggering automations)
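To make the pieces of a custom tool concrete, here is a hedged sketch of what an HTTP API tool definition could look like: a name, a description the LLM reads, and a JSON-Schema-style parameter block. The exact format, endpoint, and field names are assumptions, not Dograh's documented schema:

```python
# Hypothetical custom HTTP API tool for a CRM lookup.
# The schema and the endpoint URL are illustrative assumptions.
crm_lookup_tool = {
    "name": "lookup_customer",
    "description": "Fetch a customer's account record from the CRM by phone number.",
    "method": "GET",
    "url": "https://crm.example.com/api/customers",  # placeholder endpoint
    "parameters": {
        "type": "object",
        "properties": {
            "phone_number": {
                "type": "string",
                "description": "Caller's phone number in E.164 format",
            }
        },
        "required": ["phone_number"],
    },
}
```

The `description` fields matter most: the LLM relies on them to decide when the tool applies and how to fill each parameter.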
How Tools Work
- You define a tool with a name, description, and parameters
- You attach the tool to one or more workflow nodes
- During a call, the LLM reads your node prompt, the tool description, and the caller's intent to decide whether to invoke the tool
- The tool executes and returns a result that the agent can use to continue the conversation
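The cycle above can be sketched in a few lines. This assumes the LLM emits a tool call as a JSON object with `name` and `arguments` keys; the handler names and shapes are illustrative, not Dograh internals:

```python
import json

def end_call(**kwargs):
    # Built-in tool: terminate the call.
    return {"status": "call_ended"}

def lookup_customer(phone_number):
    # Custom tool: a real implementation would issue an HTTP request here.
    return {"phone_number": phone_number, "plan": "premium"}

# Only the tools attached to the current node are exposed to the LLM.
TOOL_HANDLERS = {"end_call": end_call, "lookup_customer": lookup_customer}

def dispatch(tool_call_json):
    """Execute the tool the LLM chose; the result flows back into the conversation."""
    call = json.loads(tool_call_json)
    handler = TOOL_HANDLERS[call["name"]]
    return handler(**call.get("arguments", {}))

result = dispatch('{"name": "lookup_customer", "arguments": {"phone_number": "+15550100"}}')
```

The returned `result` is what the agent uses to continue the conversation, e.g. confirming the caller's plan before deciding whether to transfer.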
Best Practices
- Attach only relevant tools to each node – fewer tools means more reliable invocations
- Write clear tool descriptions – the LLM uses these to decide when to call the tool
- Guide the LLM in your node prompt – explicitly describe when a tool should be used
- Test tool behavior – verify your agent invokes tools at the right moments using web or phone calls