
Practice-Based Evidence for Surveying Adult Clients

3/19/2025


 
The Plan, Do, Study, Act (PDSA) cycle, applied to survey design, data collection, response analysis, reflection, use, and future planning, is a solid process for surveying adult clients. The results should advance any social sector program’s aims of learning from clients and incorporating feedback to improve services.
 
PLAN: GOOD SURVEY RESPONSE REQUIRES SOLID SURVEY PLANNING
Data collection happens after intentional survey design, and various data collection strategies are tied to the survey tool. Design, collection strategy, and response rates vary depending on whether the survey is provided to clients during program engagement, immediately post-program, or at set intervals such as annually.
 
The survey should be designed and structured to ensure that it is:
  • aligned with organizational/program needs (theory of change) and the potential for improvement based on diverse input;
  • developed to be client-centered, with feedback loops for diverse and inclusive respondent input (multiple languages) and “client voice” (through processes such as cognitive interviewing during the design phase);
  • designed to reduce bias in question-and-answer types and wording, fully tested for errors and gaps, and ready for distribution;
  • focused on ease of completion, including length, flow, logic, topic, and perceived connection;
  • considered for “use” for each response item and demographic field, to understand the resulting data: What categories are critical to understanding responses? Who defines the categories? This includes deciding whether the survey will be:
    • anonymous (demographic data required for later meaningful segmentation) vs. identifiable respondent (a unique ID ties back to a database that allows for segmentation)
    • for high-touch services (direct clients) vs. transactional engagement (community participants)

DO: PROMISING DATA COLLECTION PRACTICES FOR CLIENT SURVEYS

(does not include community surveys for one-off events/workshops/trainings)
  1. Who to survey?
    1. Consider starting with a portion of your target population (pilot)
    2. Ensure the timing is relevant for the recipients (e.g., holidays)
  2. When to survey? Map backward from when you need the results (see the scheduling sketch after this list).
    1. Consider incentives (gift cards or other meaningful encouragement, either to all who complete or for the “first 50 respondents”)
    2. Promotion (marketing and communications to raise awareness)
    3. Dissemination (who sends the email or text, combined with the clarity of the invitation wording, is a key determinant of response)
    4. Collection communication (provide at least 2 reminders over a 3-4 week period to complete the survey, including a final notice prior to closing responses)
  3. How to distribute? (consider client safety and organizational capacity concerns)
    1. Text: convenient for wide reach and easy data analysis.
    2. Email: effective for contacting established supporters.
    3. Snail mail: for specific demographics that may not have electronic access.
    4. Phone/In-person: for specific demographics, especially those who cannot access the survey in other ways or who require assistance (safety concerns, language, incarceration, disabilities).
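
To make “map backward” concrete, here is a minimal Python sketch of a collection schedule (the deadline and the two-week analysis window are hypothetical assumptions; the 3-4 week collection window, two reminders, and final notice follow the guidance above):

    from datetime import date, timedelta

    results_needed = date(2025, 6, 30)       # hypothetical: report due to the board
    analysis_window = timedelta(weeks=2)     # assumed time for analysis and reflection
    collection_window = timedelta(weeks=4)   # 3-4 week collection period

    close_date = results_needed - analysis_window
    launch_date = close_date - collection_window
    reminders = [launch_date + timedelta(weeks=w) for w in (1, 2)]  # at least 2 reminders
    final_notice = close_date - timedelta(days=3)  # final notice before closing responses

    print("Launch:", launch_date)
    print("Reminders:", *reminders)
    print("Final notice:", final_notice, "| Close:", close_date)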

STUDY: WHAT RESPONSE RATE IS GOOD, OR MEANINGFUL?

“Response rate” is the percentage calculated by dividing the number of complete survey responses by the total number of people who received the survey, then multiplying by 100.
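
As a quick worked example, a minimal sketch of the calculation in Python (the counts are hypothetical):

    def response_rate(completed: int, invited: int) -> float:
        """Percentage of invitees who submitted a complete response."""
        return completed / invited * 100

    # Hypothetical: 600 clients received the survey and 90 completed it.
    print(response_rate(90, 600))  # 15.0

Only complete responses count in the numerator, per the definition above.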
 
According to Qualtrics, a large survey firm:
A typical customer satisfaction survey response rate usually falls between 10% and 30% across various industries, with an excellent response rate considered to be anything above 50% depending on the survey design and customer engagement level; however, this can vary significantly based on the industry and how the survey is delivered. (AI overview).
 
Response rate benchmarks are usually qualified by a specific distribution channel or survey type:
  • 33% as the average response rate for all survey channels, including in-person and digital (SurveyAnyplace, 2018)
  • >20% being a good survey response rate for Net Promoter Score surveys (Genroe, 2019)
 
Note: these benchmarks were gathered pre-pandemic, and response rates for surveys distributed by email have decreased over the last several years. It is therefore important to meet clients where they are and embed surveys into clients’ normal flow to gather more feedback.
 
The response rate matters for assessing both the reliability of and confidence in the results. That confidence stems from whether the respondents were representative of the survey’s target audience and whether conclusions can be drawn from the responses to inform business decisions, improvements, and authentic marketing efforts, beyond anecdotal experience.
 
Sources (from experience, blogs, and analysis): Qualtrics, Urban Institute, SmartSurvey
Note: this does not include statistical elements such as significance or margin of error calculations.
 
ACT: POST-COLLECTION
After your survey data is collected, make sure to reserve energy and time to complete:
  • analysis and interpretation
  • internal learning and reflection
  • communication of results (internally and externally)
  • program quality improvements
  • future plans (determine whether you’ll use the same survey again and when, or whether you need to make edits first based on what you learned)
 
Beyond response rate, statistical significance, or any attempt to benchmark results against large data sets, planning a solid tool to use consistently, setting internal goals, and tracking trends over time are often the most practical and meaningful ways to interpret change within a target population for your organization or specific program.
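
A minimal sketch of what that trend tracking might look like, using Python and pandas (every figure below is hypothetical):

    import pandas as pd

    # Hypothetical annual survey waves for one program
    waves = pd.DataFrame({
        "year": [2022, 2023, 2024],
        "avg_satisfaction": [3.8, 4.0, 4.2],   # mean score on a 1-5 scale
        "response_rate_pct": [18, 21, 24],
    })

    # Year-over-year change is the trend to compare against internal goals
    waves["yoy_change"] = waves["avg_satisfaction"].diff()
    print(waves)

Tracked this way, your own earlier waves are the baseline, rather than an external benchmark.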
 
There are many tools to assist with the internal learning and reflection phase. For example, Excel is useful for analyzing the quantitative elements of a survey, including charts and graphs. There are also numerous AI tools that can be used to assist with qualitative analysis and interpretation, especially to summarize and bucket open-ended responses into themes and action items. For communicating the results, data visualization is often useful, and Excel, Tableau, and Power BI are some options.
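
As one illustrative sketch of that bucketing idea, using simple keyword matching (the responses and theme map are hypothetical; an AI tool would typically produce richer buckets):

    import pandas as pd

    # Hypothetical open-ended responses exported from a survey tool
    responses = pd.Series([
        "The intake process was confusing",
        "Staff were friendly and helpful",
        "Wish there were evening hours",
        "Helpful staff, but hard to reach by phone",
    ])

    # Simple keyword-to-theme map; refine through qualitative review
    themes = {
        "staff": "Staff experience",
        "hours": "Scheduling and access",
        "process": "Process clarity",
        "phone": "Communication",
    }

    def bucket(text: str) -> list:
        # A response can touch several themes; unmatched ones need human review
        hits = [label for kw, label in themes.items() if kw in text.lower()]
        return hits or ["Unthemed"]

    print(responses.apply(bucket))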
 
Continuous improvement is likely if your program or organization engages in a Plan, Do, Study, Act process for survey design, data collection, response analysis, reflection, use, and future planning. 