## Overview
TalkAlly is a universal voice and text frontend that connects to any backend system via a simple JSON-based API. Your backend provides configuration and function endpoints — TalkAlly handles voice recognition, text-to-speech, and the mobile interface.
### What TalkAlly Provides
| Feature | Description |
|---|---|
| Voice Interface | Real-time speech-to-text and text-to-speech via OpenAI Realtime API |
| Text Interface | Chat-style text input with file attachments |
| Function Calling | AI-driven execution of your backend functions via webhooks |
| Multi-Profile | Users can connect to multiple backends (yours + others) |
| Local Functions | Optional app-based functions (notes, contacts, navigation) |
### What You Provide
| Component | Description |
|---|---|
| Config Endpoint | HTTP GET returning JSON with system prompt and function definitions |
| Function Endpoints | HTTP POST webhooks that execute actions and return results |
| OpenAI API Key | Your key (billed to you) or let users provide their own |
## Architecture

### Request Flow
- Connect: User creates a connection profile in TalkAlly with your config URL
- Load Config: TalkAlly fetches your config endpoint (GET request)
- Start Session: TalkAlly connects to OpenAI with your system prompt and function definitions
- User Speaks/Types: Input is processed by OpenAI's AI model
- Function Call: When AI decides to call a function, TalkAlly POSTs to your webhook
- Response: Your webhook returns data, AI formulates response, TalkAlly speaks/displays it
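Steps 4-6 above boil down to a lookup-and-POST: the AI's function call is matched against an entry in your config, and the arguments are forwarded to that entry's webhook URL. A minimal Python sketch of that dispatch, with `post_json` standing in for the real HTTP call (all names here are illustrative, not TalkAlly internals):

```python
def dispatch_function_call(config, name, arguments, post_json):
    """Find `name` in the config's functions and POST `arguments` to its webhook_url.

    `post_json(url, body)` is a stand-in for an HTTP POST returning parsed JSON.
    """
    for fn in config.get("functions", []):
        if fn["name"] == name:
            return post_json(fn["webhook_url"], arguments)
    # The AI can recover from this by telling the user the capability is missing
    return {"error": f"Unknown function: {name}"}
```

The returned JSON is handed back to the model, which turns it into the spoken or displayed reply.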
## API Contract
TalkAlly expects your backend to implement a simple HTTP-based protocol:
| Endpoint | Method | Purpose |
|---|---|---|
| `/config` | GET | Returns JSON configuration (system prompt, functions, settings) |
| `/function-name` | POST | Executes a function and returns its result |
## Config Endpoint
The config endpoint is the core of TalkAlly integration. It tells the app how to behave and what functions are available.
### Request

```http
GET https://your-backend.com/webhook/config?tenant=sales&passkey=secret123
```
### Query Parameters (Optional)

| Parameter | Description |
|---|---|
| `tenant` | Tenant/profile identifier for multi-tenant setups |
| `passkey` | Optional authentication token |
| `custom_greeting` | Overrides the default greeting message |
| `display_name` | User's display name for personalization |
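A client assembling the request above can build the query string safely with the standard library (the base URL and parameter values are placeholders):

```python
from urllib.parse import urlencode

# Build the config URL with properly encoded query parameters
base = "https://your-backend.com/webhook/config"
params = {"tenant": "sales", "passkey": "secret123"}
url = f"{base}?{urlencode(params)}"
print(url)
# https://your-backend.com/webhook/config?tenant=sales&passkey=secret123
```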
### Response Schema

```json
{
  "system_prompt": "You are a helpful assistant for Acme Corp...",
  "greeting": "Hello! How can I help you today?",
  "functions": [
    {
      "name": "function_name",
      "description": "What this function does",
      "webhook_url": "https://your-backend.com/webhook/function_name",
      "http_method": "POST",
      "acknowledgment": "Processing your request...",
      "parameters": {
        "type": "object",
        "properties": {
          "param1": {
            "type": "string",
            "description": "Description of param1"
          }
        },
        "required": ["param1"]
      }
    }
  ],
  "openai_api_key": "sk-...",
  "voice": "alloy",
  "voice_speed": 1.0,
  "enable_local_functions": true
}
```
### Field Reference

| Field | Type | Required | Description |
|---|---|---|---|
| `system_prompt` | string | Required | Instructions for the AI assistant's behavior |
| `openai_api_key` | string | Required | OpenAI API key used for voice/text processing |
| `greeting` | string | Optional | Initial greeting spoken when the session starts |
| `functions` | array | Optional | List of callable functions |
| `voice` | string | Optional | OpenAI voice: `alloy`, `ash`, `ballad`, `coral`, `echo`, `sage`, `shimmer`, `verse` |
| `voice_speed` | number | Optional | Playback speed (0.5 to 2.0, default: 1.0) |
| `enable_local_functions` | boolean | Optional | Enables app-based functions (notes, contacts, navigation) |
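Before wiring a backend into the app, it can save a round of debugging to sanity-check the config against the contract above. A small, hypothetical validator (not part of TalkAlly):

```python
def validate_config(config: dict) -> list[str]:
    """Return a list of problems with a TalkAlly config dict (empty list = OK)."""
    errors = []
    # system_prompt and openai_api_key are the only required fields
    for field in ("system_prompt", "openai_api_key"):
        if not isinstance(config.get(field), str) or not config[field]:
            errors.append(f"missing or empty required field: {field}")
    # voice_speed, when present, must stay in the documented 0.5-2.0 range
    speed = config.get("voice_speed", 1.0)
    if not (0.5 <= speed <= 2.0):
        errors.append("voice_speed must be between 0.5 and 2.0")
    return errors
```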
### Function Schema

| Field | Type | Required | Description |
|---|---|---|---|
| `name` | string | Required | Unique function identifier (snake_case) |
| `description` | string | Required | What the function does (the AI uses this to decide when to call it) |
| `webhook_url` | string | Required | Full URL to POST to when the function is called |
| `acknowledgment` | string | Optional | Message spoken while the function executes |
| `parameters` | object | Optional | JSON Schema defining the function's parameters |
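Entries of this shape map naturally onto OpenAI-style tool definitions. The exact payload TalkAlly sends to the Realtime API is internal, so the target shape below is an assumption for illustration only:

```python
def to_openai_tool(fn: dict) -> dict:
    """Map a TalkAlly function entry to an OpenAI-style tool definition.

    The target shape here is assumed, not taken from TalkAlly's source.
    """
    return {
        "type": "function",
        "name": fn["name"],
        "description": fn["description"],
        # Default to a parameterless schema when none is given
        "parameters": fn.get("parameters", {"type": "object", "properties": {}}),
    }
```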
## Function Endpoints

When the AI decides to call a function, TalkAlly sends a POST request to the function's `webhook_url`.
### Request

```http
POST https://your-backend.com/webhook/get_weather
Content-Type: application/json

{
  "location": "Sydney"
}
```
### Response

Return JSON with the function result. The AI uses it to formulate a response to the user.

```json
{
  "temperature": 25,
  "conditions": "Sunny",
  "humidity": 65,
  "wind": "15 km/h NE"
}
```
### Error Handling

If a function fails, return an error object instead of an HTTP error page; the AI will use it to explain the problem to the user in plain language.

```json
{
  "error": "Location not found",
  "message": "Could not find weather data for the specified location."
}
```
## Integration Examples

### Minimal Config (Voice Only)

The simplest possible integration: just voice chat with no functions.
```json
{
  "system_prompt": "You are a helpful assistant for Acme Corporation.",
  "greeting": "Hello! Welcome to Acme support.",
  "openai_api_key": "sk-your-api-key-here"
}
```
### With Custom Functions
```json
{
  "system_prompt": "You are a sales assistant. Help customers find products.",
  "greeting": "Hi! I can help you find products.",
  "openai_api_key": "sk-your-api-key-here",
  "voice": "coral",
  "voice_speed": 1.1,
  "functions": [
    {
      "name": "search_products",
      "description": "Search for products by name or category",
      "webhook_url": "https://api.example.com/search",
      "acknowledgment": "Searching our catalog...",
      "parameters": {
        "type": "object",
        "properties": {
          "query": {
            "type": "string",
            "description": "Product name or category"
          }
        },
        "required": ["query"]
      }
    }
  ]
}
```
### Multi-Tenant Example

```js
// Your backend pseudo-code for GET /webhook/config?tenant=XXX
if (tenant === "sales") {
  return {
    system_prompt: "You are a sales assistant...",
    functions: [/* sales functions */],
    openai_api_key: "sk-sales-key"
  };
}
if (tenant === "support") {
  return {
    system_prompt: "You are a support agent...",
    functions: [/* support functions */],
    openai_api_key: "sk-support-key"
  };
}
```
### n8n Code Node

```javascript
// n8n Code node (mode: "Run Once for All Items").
// Query parameters from a Webhook trigger arrive on the item's json.query,
// not directly on $input, and the node must return an array of items.
const query = $input.first().json.query || {};
const tenant = query.tenant || 'default';
const displayName = query.display_name || 'User';

return [{
  json: {
    system_prompt: `You are a helpful assistant for ${displayName}.`,
    greeting: `Hello ${displayName}! How can I help?`,
    openai_api_key: "sk-your-key",
    functions: [
      {
        name: "get_weather",
        description: "Get weather for a location",
        webhook_url: "https://your-n8n.com/webhook/weather",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string", description: "City name" }
          },
          required: ["location"]
        }
      }
    ]
  }
}];
```
Python Flask
from flask import Flask, request, jsonify
app = Flask(__name__)
@app.route('/config', methods=['GET'])
def get_config():
tenant = request.args.get('tenant', 'default')
return jsonify({
"system_prompt": f"You are assistant for {tenant} team.",
"greeting": "Hello! How can I help?",
"openai_api_key": "sk-your-key",
"functions": [
{
"name": "get_data",
"description": "Retrieve data from database",
"webhook_url": "https://your-server.com/api/data",
"parameters": {
"type": "object",
"properties": {
"query": {"type": "string"}
}
}
}
]
})
@app.route('/api/data', methods=['POST'])
def get_data():
query = request.json.get('query', '')
# Your logic here
return jsonify({"result": f"Data for: {query}"})
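The webhook route can be exercised locally with Flask's built-in test client before pointing TalkAlly at a deployed URL. A self-contained sketch (the route mirrors the example above; no running server is required):

```python
from flask import Flask, request, jsonify

# Self-contained copy of the /api/data webhook route, exercised with
# Flask's test client instead of a live server.
app = Flask(__name__)

@app.route('/api/data', methods=['POST'])
def get_data():
    query = (request.get_json(silent=True) or {}).get('query', '')
    return jsonify({"result": f"Data for: {query}"})

client = app.test_client()
resp = client.post('/api/data', json={"query": "orders"})
print(resp.get_json())  # {'result': 'Data for: orders'}
```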
## Compatible Platforms

TalkAlly works with any platform that can serve HTTP endpoints returning JSON:

### Workflow Automation

| Platform | Description |
|---|---|
| n8n | Open-source workflow automation |
| LangFlow | Visual LLM workflow builder |
| Flowise | LangChain visual builder |
| Node-RED | Flow-based programming |
| Activepieces | Open-source automation |
| Dify | AI application platform |

### Low-Code Platforms

| Platform | Description |
|---|---|
| Make | Visual automation platform |
| Zapier | Webhooks by Zapier |
| Power Automate | Microsoft HTTP triggers |
| Pipedream | Developer workflows |

### Serverless & Custom

| Platform | Description |
|---|---|
| AWS Lambda | With API Gateway |
| Azure Functions | Microsoft serverless |
| Google Cloud Functions | GCP serverless |
| Flask / FastAPI | Python frameworks |
| Express / Node.js | JavaScript backend |
| Cloudflare Workers | Edge computing |
## Service Integrations

TalkAlly connects to popular services, giving you hands-free access to email, calendar, contacts, and files. Each integration has its own setup guide.

### Google Workspace

Gmail, Calendar, Contacts, and Drive. Multi-account support with automatic token refresh.

### Microsoft 365 (Coming Soon)

Outlook email, Calendar, and OneDrive integration for personal and work accounts.
## Getting Started
- Create a config endpoint on your platform that returns the JSON schema above
- Test the endpoint by visiting it in a browser — you should see JSON
- Install TalkAlly from Google Play Store
- Add a Connection Profile → Choose "Business" mode
- Enter your config URL as the "Orchestrator URL"
- Test the connection — TalkAlly will fetch your config
- Connect and talk!
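For step 2, a quick script beats eyeballing the browser. This hypothetical checker flags the two most common mistakes: a response that isn't JSON at all (e.g. an HTML error page) and JSON that's missing the required fields:

```python
import json

def check_config_body(body: str) -> str:
    """Sanity-check the raw body returned by a config endpoint."""
    try:
        config = json.loads(body)
    except json.JSONDecodeError:
        return "not valid JSON"
    # The two fields the config contract marks as required
    missing = [f for f in ("system_prompt", "openai_api_key") if f not in config]
    if missing:
        return "missing fields: " + ", ".join(missing)
    return "OK"

print(check_config_body('{"system_prompt": "Hi", "openai_api_key": "sk-x"}'))  # OK
```

Fetch the body with any HTTP client and pass it through this check before testing the connection in the app.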