AI Phone Agent Operations

Dynamic Variables

By Vadim Kouznetsov, Founder of BubblyPhone · Last updated April 5, 2026

Dynamic variables are placeholders in an AI agent’s system prompt that are filled in at call time with per-call data — the caller’s name, a looked-up account balance, a prospect’s company, anything that should change from one call to the next without changing the underlying prompt. They are what turn a generic agent into a personalised one.

Why they exist

The naive way to personalise an AI phone agent is to write a new system prompt for every call. If you are calling 1,000 prospects, you would assemble 1,000 slightly different prompts on the fly and send each one along with its call. This works, but it is wasteful: 95% of every prompt is identical, and the 5% that changes is always in the same handful of places.

Dynamic variables formalise those handful of places. The system prompt is written once, with explicit placeholders, and the values are supplied at call time. It looks like this:

system_prompt: | You are calling {{prospect_name}} at {{company}}. Introduce yourself as Sarah from TechCorp and follow up on their enquiry about our {{product_interest}} offering from {{enquiry_date}}. variables: prospect_name: "John Smith" company: "Acme Corp" product_interest: "enterprise plan" enquiry_date: "last Tuesday"

Where dynamic variables live in the pipeline

There are three distinct places where variables can be interpolated, and each has different properties:

  • At the start of the call.Variables are baked into the system prompt before the call begins. This is the most common pattern and works for anything known before the call connects — caller name, campaign type, known account data.
  • During the call. New variables are injected mid-conversation when your backend returns data from a tool call or an external lookup. The AI sees the new information and incorporates it into the next turn.
  • After the call. Transcript variables are extracted from the conversation and stored in the call log for later analytics. These flow the opposite direction — from conversation to structured data, not the other way.

Static prompt, dynamic data

The mental model that helps the most: the system prompt iscode, and dynamic variables are data. The same code runs for every call, data changes per call. Treat the prompt as a versioned artifact you deploy, and treat the variables as the request payload. This separation is what makes prompt engineering debuggable at scale — when you change the prompt, you know exactly which part changed and for which calls.

The prompt injection problem

Dynamic variables are prompt injection vectors. If you put arbitrary user-supplied text into a system prompt, a malicious user can write input that escapes the intended context and instructs the model to ignore your rules. This is the LLM equivalent of SQL injection, and it is a real production problem.

The defences are the same as for any injection vulnerability:

  • Never put untrusted user input directly in the system prompt. If a variable comes from a caller, a form, or a CRM field populated by customers, treat it as adversarial.
  • Constrain the shape of the value. A prospect name should be letters and spaces only. A date should match a date format. Reject anything that does not fit.
  • Delimit clearly. Wrap user-supplied values in quoted strings or XML tags so the model can tell what is data and what is instruction. Modern LLMs handle this well when given the boundary explicitly.
  • Do not make the AI’s guardrails depend on variables.If the system prompt says “never discuss topic X unless the caller’s role is admin”, and the role is a dynamic variable, an attacker just sets their role to admin.

Common uses

  • Outbound sales calls. Inject the prospect name, company, industry, and any known pain points so the AI opens with relevant context instead of a generic pitch.
  • Appointment reminders. Inject the patient name, appointment date, time, and provider so the AI can confirm or reschedule specifically.
  • Account-aware support.Inject the customer’s plan tier, recent interactions, and open tickets so the AI starts with a full picture instead of asking redundant questions.
  • Multi-language routing. Inject the detected language so the prompt instructs the AI to respond in the right one.

Dynamic variables in BubblyPhone Agents

BubblyPhone Agents supports per-call system prompt overrides on outbound calls, which is the dynamic-variable pattern in its simplest form: you supply the full prompt (with values already interpolated) in the call creation request, and that prompt applies to that call only. For mid-call context injection, use the context endpoint to push new information into an active conversation. See the guide on AI outbound calls for a full campaign example.

Further reading