A bad AI RFP produces bad responses. When you ask generic questions ("describe your AI capabilities," "what is your implementation methodology"), you get generic answers — polished decks designed to impress a procurement committee, not to demonstrate that the vendor can solve your specific problem.
Here's how to write an RFP that forces vendors to show you what they can actually do.
Section 1: Company and Relevant Experience
Don't ask for a company overview — ask for evidence of relevant delivery.
What to ask:
- Describe 2–3 AI agent systems you've built that are currently in production. What does each one do, what systems does it integrate with, and what measurable business outcome has it delivered?
- Have you built AI systems in our industry? If yes, describe the use case and any compliance requirements you addressed.
- What is your team's direct experience with the specific integrations our project requires? [List your key systems — CRM, ERP, etc.]
- What percentage of your AI projects reach production, and what percentage are abandoned before deployment?
Section 2: Technical Approach to Your Specific Problem
Describe your use case in detail — the workflow, the systems involved, the data, the volume. Then ask specific questions that require the vendor to engage with your actual problem.
What to ask:
- Based on the workflow described, what is your proposed architecture? Which LLM would you use and why?
- What data quality issues do you anticipate, and how would you address them before building?
- How would you handle [specific edge case in your workflow]?
- What does your human-escalation design look like for this workflow?
- What are the three most likely reasons this project could fail, and how would you mitigate them?
Section 3: Project Structure and Delivery
What to ask:
- Provide a detailed project plan with milestones, deliverables, and timelines for this specific scope.
- Who specifically will work on this project? Provide names, roles, and relevant experience — not just team descriptions.
- How do you handle scope changes? Provide a specific example from a past engagement.
- What does your definition of "done" look like? What production readiness criteria do you use before handing off?
- What documentation, training, and knowledge transfer do you include at project close?
Section 4: Pricing and Total Cost of Ownership
What to ask:
- Provide a fixed-price or not-to-exceed estimate for the build scope described. Break down by phase.
- Estimate monthly ongoing costs: LLM API costs at our stated volume, infrastructure, and your maintenance/support fees.
- What's not included in your estimate? What assumptions are you making about data quality, integration availability, and access to your team?
- What events would trigger a change order? What is your change order process?
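When you ask vendors to estimate LLM API costs at your stated volume, it helps to have your own back-of-the-envelope number to sanity-check their figures. A minimal sketch of that arithmetic is below; every number in it is a placeholder assumption, and the per-token prices are illustrative, not any specific provider's current rates — substitute your actual volumes and your vendor's published pricing.

```python
# Rough monthly LLM API cost estimate for the volume stated in your RFP.
# All figures are placeholder assumptions; replace with your own volumes
# and the provider's current per-token pricing.

def monthly_llm_cost(requests_per_day: int,
                     input_tokens_per_request: int,
                     output_tokens_per_request: int,
                     price_per_1m_input: float,
                     price_per_1m_output: float,
                     days_per_month: int = 30) -> float:
    """Return estimated monthly API spend in dollars."""
    monthly_input = requests_per_day * input_tokens_per_request * days_per_month
    monthly_output = requests_per_day * output_tokens_per_request * days_per_month
    return (monthly_input / 1_000_000) * price_per_1m_input \
         + (monthly_output / 1_000_000) * price_per_1m_output

# Example: 2,000 requests/day, 3,000 input + 500 output tokens each,
# at assumed prices of $3 / $15 per million tokens.
cost = monthly_llm_cost(2_000, 3_000, 500, 3.00, 15.00)
print(f"${cost:,.2f}/month")  # → $990.00/month
```

If a vendor's estimate differs from yours by an order of magnitude, that gap itself is a useful RFP follow-up question.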
Section 5: References and Work Samples
What to ask:
- Provide 2–3 references from clients with similar use cases who are willing to speak with us. Include the client name, project description, and contact information.
- Can you provide a technical architecture diagram from a comparable past project (with client details redacted)?
- What does your post-deployment monitoring and support look like? Provide an example of an issue that arose after go-live and how it was resolved.
Red Flags in Vendor Responses
- No named references — vendors without real clients they can refer you to are not production-experienced
- Vague project plans — "we'll assess and plan in phase one" is not a project plan; capable vendors know enough about AI delivery to give you a specific timeline
- No discussion of failure modes — experienced AI builders know what breaks; vendors who only describe success have either never shipped or are hiding the difficult parts
- Generic technical architecture — if their proposed architecture doesn't reference your specific systems, they haven't engaged with your actual problem
- Incomplete cost disclosure — proposals that don't address API costs, infrastructure, and maintenance are showing you a fraction of the real cost
The reference call is the most important step: After shortlisting vendors, call their references. Ask specifically: Did the project ship on time and budget? What surprised you about working with them? What would you do differently? What would you tell a company in your position considering this vendor? References who answer questions vaguely or seem reluctant are a signal worth taking seriously.
Want to Talk Through Your AI Vendor Selection?
We'll help you scope your project, write your RFP, and evaluate vendor responses — or just answer your questions honestly about what a project like yours should cost and how long it should take.
Talk to the Team