# Indian Workflow Benchmark

Stage: Designing v0.1

Audience: Indian enterprises, AI buyers, and product teams

## Summary

A workflow benchmark for Indian business tasks: finance, support, multilingual handoffs, document reasoning, sales ops, and evidence-grounded escalation.

## Buyer Problem

Most benchmarks do not test the messy surface Indian businesses actually operate on: mixed-language tickets, GST documents, vendor email threads, policy exceptions, and workflows where the answer must cite the right evidence.

## Metrics

- Outcome correctness
- Evidence citation
- Escalation judgement
- Cost per accepted output
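One way the four metrics above could be operationalized is as per-run labels aggregated over a batch. The sketch below is a minimal illustration with hypothetical field names (`RunResult`, `score`, and all its fields are assumptions, not the benchmark's actual schema); "accepted output" is assumed here to mean an outcome that is both correct and properly cited.

```python
from dataclasses import dataclass

@dataclass
class RunResult:
    """One model run on a benchmark workflow task (hypothetical schema)."""
    outcome_correct: bool   # did the final answer match the gold outcome?
    evidence_cited: bool    # did the answer cite the required evidence?
    escalated: bool         # did the model choose to escalate?
    should_escalate: bool   # gold label: was escalation the right call?
    cost_usd: float         # inference cost of producing this output

def score(runs: list[RunResult]) -> dict[str, float]:
    """Aggregate the four benchmark metrics over a batch of runs."""
    n = len(runs)
    # An "accepted" output is assumed to be correct AND evidence-grounded.
    accepted = [r for r in runs if r.outcome_correct and r.evidence_cited]
    return {
        "outcome_correctness": sum(r.outcome_correct for r in runs) / n,
        "evidence_citation": sum(r.evidence_cited for r in runs) / n,
        "escalation_judgement": sum(r.escalated == r.should_escalate for r in runs) / n,
        # total spend divided by the number of outputs a buyer would accept
        "cost_per_accepted": sum(r.cost_usd for r in runs) / max(len(accepted), 1),
    }
```

Reporting cost per *accepted* output, rather than per raw output, is what lets a cheap-but-sloppy model and an expensive-but-reliable one be compared on the same axis.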

## Deliverables

- Workflow task pack
- Model comparison memo
- Evidence audit
- Deployment readiness map

## Buyer Questions

- Which models survive Indian document and support workflows?
- Where do multilingual or policy tasks fail?
- What can be safely automated versus escalated?
- How do quality, latency, and cost change by workflow type?

## Demo State

The live benchmark console is connected to the Indian Enterprise Workflow Suite; the next step is replacing seed traces with real, client-approved examples.

Demo readiness: 82/100

Missing for live demo:
- Product walkthrough video
- Client-approved examples
- Real run export

## Connected Evidence

- [Indian Enterprise Workflow Suite](/benchmarks/indian-enterprise-workflow-suite)
- [Indian workflow article](/articles/designing-the-indian-enterprise-ai-workflow-benchmark)
- [Leaderboard methodology](/articles/building-a-useful-ai-leaderboard-without-fooling-ourselves)

## Visual Preview

![Indian Workflow Benchmark live Studio preview screenshot](/reports/studio/previews/indian-workflow-benchmark.png)
