Designing Your First Developer Experience Survey: A Practical Guide
An end-to-end guide to developer survey design, covering everything from defining the structure and designing questions to a simple scoring mechanism and interpreting the results.
We currently work on the SDLC and Infrastructure Platform, which spans multiple teams responsible for the end-to-end developer lifecycle. This includes everything from setting up scaffolded repositories, webhooks for pull request checks, and standardized CI/CD templates, to enabling access to service URLs, jumphosts, logging, and monitoring tools.
Recently, we had the opportunity to design a DevEx survey for our platform tribe. The goal of this survey was to gather feedback from developers on:
- Ease of development and overall sentiment toward the features provided
- Perceived productivity, pain points, and areas of improvement to help prioritize the backlog
- Developer satisfaction and engagement, including what’s working well (e.g., platform shortcuts, automation jobs, documentation)
While many of us have filled out surveys and can recognize good or bad ones, designing one from scratch can be daunting. We hope this blog helps others who are getting started.
References and Inspiration
To understand the "what, why, and how" of DevEx surveys, we referred to:
- DevEx: What Actually Drives Productivity
- The LinkedIn DPH Framework
- Developer Experience (DevEx) – Engineering Fundamentals Playbook
- DevEx in Action
- Designing Developer Experience Surveys
The Process We Followed
1. Choosing the Survey Structure
The first decision was around structuring the survey. A flat list of questions can feel overwhelming and offers limited actionable insight. We considered three approaches:
a. By Persona
Questions are tailored for different user personas (e.g., frontend, backend, DevOps). Each developer answers only the relevant sections, which keeps responses contextual and focused. LinkedIn uses this structure.
b. By Product
Questions are grouped by platform products (e.g., dashboards, observability tools). This approach works well when tools are shared across personas. However, it can make the survey long and harder to analyze if many products are covered.
c. By Developer Lifecycle
Questions align with the stages of the software development lifecycle—build, test, deploy/release, and operate. We chose this structure because:
- It offers a natural flow for developers
- It ensures coverage of all stages common to most workflows
- It allows squads to own and refine their respective sections
- It simplifies responsibility for question design, while allowing consistency in tone and style
This structure aligned well with how our teams are organized, and it proved to be effective.
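To make this concrete, here is a minimal sketch (in Python, with illustrative stage, product, and squad names rather than our real ones) of how a lifecycle-structured survey can be laid out as data, with one section per stage and one owning squad per section:

```python
# Minimal sketch: the survey laid out by developer-lifecycle stage.
# Stage, product, and squad names are illustrative, not our real ones.
SURVEY_SECTIONS = {
    "build": {
        "owner_squad": "sdlc-build",
        "products": ["repo-scaffolding", "ci-templates", "build-reporting"],
    },
    "test": {
        "owner_squad": "quality-platform",
        "products": ["test-runners", "coverage-dashboards"],
    },
    "deploy_release": {
        "owner_squad": "delivery-platform",
        "products": ["cd-pipelines", "release-automation"],
    },
    "operate": {
        "owner_squad": "observability",
        "products": ["logging", "monitoring", "jumphost-access"],
    },
}

# Each squad owns exactly one section, which keeps question design
# and the follow-up actions clearly scoped.
for stage, section in SURVEY_SECTIONS.items():
    print(f"{stage}: owned by {section['owner_squad']}, "
          f"{len(section['products'])} products covered")
```

Keeping the ownership explicit in the definition is what made it easy for each squad to design, and later act on, its own questions.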
2. Designing the Questions
Let’s take the Build stage as an example, covering:
- Repository scaffolding
- CI templates
- A custom build reporting tool
a. Cover Each Product
Since this was our first survey, we ensured that each product—regardless of adoption level—was represented with at least one question. This helped us capture both awareness and early sentiment across tools.
b. Use Three Key Dimensions
For each product, we crafted questions across the following DevEx dimensions:
- Flow State – Does the tool support uninterrupted development?
- Feedback Loop – Are there delays that interrupt progress?
- Cognitive Load – How easy is it to understand, debug, or fix issues?
Example – Repository Scaffolding:
- Flow State: “What is the average turnaround time for the first commit?”
- Feedback Loop: “What is the wait time for the first build?”
- Cognitive Load: “When setting up a new repository using the scaffolding tool, how often do you need to refer to external documentation or ask for help?”
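To keep question design consistent across squads, it also helps to treat the question bank itself as data. The sketch below (the field names are illustrative, only the wording comes from the examples above) captures the repository scaffolding questions, one entry per DevEx dimension:

```python
# Sketch of a per-product question bank keyed by DevEx dimension.
# Wording follows the repository-scaffolding examples above; field names are illustrative.
QUESTIONS = {
    "repo-scaffolding": {
        "flow_state": "What is the average turnaround time for the first commit?",
        "feedback_loop": "What is the wait time for the first build?",
        "cognitive_load": (
            "When setting up a new repository using the scaffolding tool, how often "
            "do you need to refer to external documentation or ask for help?"
        ),
    },
    # ...one entry per product in the Build section (CI templates, build reporting, ...)
}
```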
c. Objective vs. Subjective Questions
While objective metrics (e.g., build duration) are valuable, they don’t tell the full story. Subjective questions uncover developer sentiment and should take precedence when there’s a conflict between the two.
For instance, a 5-minute build might be acceptable for ETL jobs but not for lightweight microservices. Only subjective questions can uncover that nuance.
More examples:
- Feedback Loop (subjective): “How does your build time impact focus and productivity?”
- Cognitive Load (subjective): “How likely are build errors in scaffolded repos?”
- [If very likely]: “Are those errors easily debuggable?”
d. Open-Ended Feedback
At the end of each section, we added an optional open-text question:
“Is there anything else you’d like to share about your experience with this stage?”
This allows developers to share insights we might not have captured explicitly.
3. Keeping the Survey Lightweight
Despite the comprehensive scope, we designed the survey to be lightweight by:
- Grouping sub-questions under each product
- Using conditional logic to show follow-ups only when relevant (sketched below)
- Maintaining a consistent structure across sections to avoid cognitive switching
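The conditional-logic point is easy to sketch. Question IDs and answer labels below are hypothetical; in practice the branching is configured directly in the survey tool:

```python
# Sketch of a conditional follow-up rule (hypothetical question IDs and labels).
# The follow-up about debuggability only appears when build errors are reported
# as likely, mirroring the Cognitive Load example earlier.
FOLLOW_UP_RULES = {
    "build_errors_likelihood": {
        "show_follow_up_if": {"Very likely", "Likely"},
        "follow_up_question": "Are those errors easily debuggable?",
    },
}

def next_questions(question_id: str, answer: str) -> list[str]:
    """Return any follow-up questions triggered by an answer."""
    rule = FOLLOW_UP_RULES.get(question_id)
    if rule and answer in rule["show_follow_up_if"]:
        return [rule["follow_up_question"]]
    return []

print(next_questions("build_errors_likelihood", "Very likely"))
# ['Are those errors easily debuggable?']
```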
4. Scoring Mechanism
To quantify sentiment, we used the Net Promoter Score (NPS) format, which is commonly used in DevEx surveys.
Each question offered options such as:
- Very Satisfied, Satisfied, Neutral, Sometimes Dissatisfied, Always Dissatisfied
We mapped these to a 0–5 scale:
- Promoters: 4–5
- Passives: 3
- Detractors: 0–2
NPS = % of Promoters – % of Detractors
In addition, we used user profile parameters—such as role or designation—to adjust scoring where appropriate. For example, a lead engineer’s input might carry more weight for certain questions. Tools like Qualtrics allow this level of customization.
This scoring system gave us a baseline and allowed us to track trends over time.
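For illustration, here is a minimal sketch of the scoring. It assumes the five satisfaction labels map onto scores 5 down to 1 and that role-based weighting is a simple per-response multiplier; the weights shown are made up, and in our case the weighting was configured in the survey tool rather than in code:

```python
# Sketch of NPS-style scoring for satisfaction answers.
# The label-to-score mapping and the role weights are illustrative assumptions.
SCORE = {
    "Very Satisfied": 5,
    "Satisfied": 4,
    "Neutral": 3,
    "Sometimes Dissatisfied": 2,
    "Always Dissatisfied": 1,
}

ROLE_WEIGHT = {"lead_engineer": 1.5, "engineer": 1.0}  # illustrative weights

def nps(responses: list[dict]) -> float:
    """NPS = % promoters (score 4-5) minus % detractors (score 0-2), optionally weighted."""
    promoters = detractors = total = 0.0
    for r in responses:
        weight = ROLE_WEIGHT.get(r.get("role", "engineer"), 1.0)
        score = SCORE[r["answer"]]
        total += weight
        if score >= 4:
            promoters += weight
        elif score <= 2:
            detractors += weight
    return 100.0 * (promoters - detractors) / total

responses = [
    {"answer": "Very Satisfied", "role": "engineer"},
    {"answer": "Neutral", "role": "lead_engineer"},
    {"answer": "Sometimes Dissatisfied", "role": "engineer"},
]
print(round(nps(responses), 1))  # 0.0 for this tiny sample
```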
5. Pilot Testing
Before rolling out the survey broadly, we conducted a pilot with a representative cohort of developers. This helped us validate:
- Clarity and tone of questions
- Total time required to complete the survey
- Any missing areas or redundant questions
After incorporating the feedback, we launched the full survey to the wider developer organization.
6. Interpreting Results
The initial NPS score gave us a high-level view of developer sentiment. From there, each squad reviewed its respective section to:
- Identify what was working well
- Pinpoint pain points and prioritize improvements in the roadmap
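Because each response can be tagged with the lifecycle stage it belongs to, the per-squad view is just a group-by over the same scoring. A small sketch, reusing the nps() helper from the scoring section and assuming each exported response carries a "stage" field:

```python
# Sketch: group responses by lifecycle stage so each squad can review its own section.
# Assumes a "stage" tag per response; nps() is the helper from the scoring sketch above.
from collections import defaultdict

def nps_by_section(responses: list[dict]) -> dict[str, float]:
    by_stage: dict[str, list[dict]] = defaultdict(list)
    for r in responses:
        by_stage[r["stage"]].append(r)
    return {stage: nps(rs) for stage, rs in by_stage.items()}
```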
What’s Next: Evolving the Survey
Survey design is iterative. Based on what we learned, here's how we plan to improve future versions:
- Conduct developer interviews to dive deeper into specific feedback
- Retain most questions to allow trend comparison
- Add focused deep-dives into areas flagged in previous surveys
- Share results and trends in business review meetings, to help:
  - Track NPS improvement over time
  - Enable cross-tribe knowledge sharing and alignment
Final Thoughts
This blog outlines how we approached the design and rollout of our first DevEx survey. Over time, the same framework can evolve into:
- Persona-specific surveys
- Product-specific feedback loops
- Tools for tracking change and platform maturity across the organization
A well-designed survey isn’t just about collecting responses—it’s a lever to improve how developers build, collaborate, and deliver software.