Use the following FAQs to answer common questions from Council/Board members or other stakeholders about Polco and The NCS instrument. Feel free to download and share the attached PDF as needed!
Who is Polco? What is the company’s background?
Polco and National Research Center together form the gold standard in civic engagement, bringing people and data together to help build stronger, healthier communities.
With award-winning community engagement tools and services, Polco allows leaders to engage residents around their most important topics, align goals, and strengthen public trust. Polco’s online civic engagement platform is designed to provide officials with ongoing community feedback and data-rich reports to support more informed decisions. Polco enables local government staff to quickly deploy surveys online, gather critical information from community stakeholders, and easily analyze the results. Polco’s tools are used by governments, economic development organizations, and nonprofits across the nation.
National Research Center (NRC) has been designing and conducting surveys for local governments—and other public sector organizations—for over 30 years. In that time, NRC has worked with hundreds of jurisdictions across the nation, giving them the data they need to effectively measure community livability and make more informed decisions. NRC is best known for its national benchmarking surveys and custom survey research. Their benchmark database holds hundreds of thousands of resident opinions, making it the largest of its kind in the United States. NRC is a team of leading experts in the survey research industry, acting as thought-leaders in the field, authoring numerous articles and books, and presenting research at conferences around the nation.
In 2019, NRC merged with Polco, combining survey research expertise with cutting-edge online engagement capabilities. Now Polco offers local leaders comprehensive insight into their community’s strengths, needs, and key priorities. Polco is committed to a strict privacy policy, designed to keep participants’ responses confidential and protect personal data. Learn more at info.polco.us.
What are Polco’s credentials?
Polco is a leading U.S.-based government technology firm specializing in community engagement, survey research, data collection, analysis, reporting, and performance management. With a commitment to innovation, our extensive experience and holistic approach make us the preferred choice for more than 400 government agencies. Polco is the only strategic partner of the International City/County Management Association (ICMA) to offer survey research products and services to local governments.
National Research Center (NRC) is Polco’s in-house research firm, boasting a 30+ year legacy of conducting trusted community surveys. Our leadership on surveying extends to educational efforts such as authoring books (Citizen Surveys for Local Government) and conducting workshops on survey methodology. Polco's data and survey scientists consistently drive innovation in survey methodologies across multiple industries.
What is The NCS?
The NCS is an expertly-written, standardized, comprehensive survey that allows municipalities and counties to assess resident opinion about their community and local government. It is a trusted benchmarking tool used to evaluate services, measure quality of life, build public trust, inform budgeting decisions, assist in strategic planning, and monitor performance trends over time. The NCS focuses on ten main facets of community livability, which have been identified by survey researchers as most vital to creating a high-quality community that people want to live in.
The NCS survey instrument was developed by National Research Center (NRC, now the research arm of Polco) in 2001, in partnership with ICMA, to address the need for a standardized survey tool that could provide robust benchmark comparisons for local governments across the nation. It was among the first scientific surveys developed to gather resident opinion on a range of community issues, and has since been used in more than 500 jurisdictions across 46 states. The NCS instrument is the gold standard in community surveys, and it is tried and true. We periodically refresh and update the survey instrument to make sure it's relevant, and any changes to the survey questions are extensively tested by our research experts. Communities using The NCS have reported that the tool improved service delivery, strengthened communications with community stakeholders, and helped leaders identify clear priorities for use in strategic planning and budget setting.
What are the benefits of using The NCS rather than another survey option?
The NCS was developed, written, and vetted by survey research experts, ensuring unbiased questions, high-quality data, and trustworthy results. Polco’s 30 years of verified resident feedback powers the most comprehensive benchmark comparisons available, allowing you to compare results for specific municipal services or features against those same services and features in other communities.
With Polco-led implementations of The NCS, you can be confident that you're hearing from a truly representative sample of your residents—especially residents that you may not hear from in council meetings or in an open participation survey, for one reason or another. Our rigorous methodology ensures a high level of precision and confidence in your results, empowering you to use this data to inform budgets, strategic plans, or other major decisions.
Should we add custom questions to our implementation of The NCS?
Our standard survey questions are fairly comprehensive and provide a broad overview of the different facets of livability in the community. These questions provide a solid baseline: you can compare your community against others using the benchmark comparisons, and you can monitor your own progress in future years via the trendlines.
However, if you do opt to purchase custom questions, you’ll have ½ page of space. You may purchase more space if needed, but we recommend keeping the survey as short as possible, because longer surveys can negatively affect response rates. A ½ page typically accommodates about 2-3 questions, depending on their length. Custom questions can be a good place to get resident input on current or upcoming proposals, gauge the effectiveness of past initiatives, evaluate specific priorities for the community, or explore other timely topics.
If you do decide to include custom questions on the survey, we strongly recommend that you consider and plan for how residents' input/responses will affect the decisions and plans made by your local government. It's important that questions are only asked if/when resident input will affect the outcome in some way. Avoid asking questions about decisions that have already been made/will not be changed based on the results! If you’d like some inspiration for questions to ask, you can look through our library of already developed and vetted questions. We’re happy to help revise or construct new questions if you have specific topics in mind.
What grade does The NCS read at?
We aim for a 6th grade reading level, and always do our best to avoid any jargon.
What is the estimated time that it takes people to complete this survey?
Typically it takes respondents about 20 minutes to fill out the survey, though it can take a little longer depending on the number and complexity of any custom questions you add.
What's a probability-based survey and what are the benefits?
A probability-based survey uses a sampling method where every eligible person or household in a community has a known chance of being selected to participate. Respondents are selected at random from a comprehensive list (usually of residential addresses) as an effort to reduce bias. Probability-based sampling is currently considered the gold standard for survey methods, although with declining response rates and the advent of new outreach and weighting methods, this may change in the future.
What's an open participation survey and what are the benefits?
An open participation survey is offered to all adult residents in a community. This approach to gathering resident feedback is available to anyone who has access to an internet-connected device, and often provides a higher number of completed responses both overall and across specific population sub-groups. It can also be less resource intensive than a probability-based survey and requires less time from initial administration to receiving actionable data.
Polco’s open participation surveys are administered digitally and require the promotional support of as many trusted communications channels as possible. Open participation surveys benefit from “snowball” sampling, whereby survey respondents recruit their friends and acquaintances by forwarding them the survey link. Encouraging residents to invite participation from others can increase overall response numbers as well as bolster responses from traditionally underrepresented communities.
Can I do a probability-based survey and an open participation survey at the same time?
Yes, in fact it is always advisable to include an open participation survey when implementing a survey with a probability-based sample. The probability-based method ensures that every resident has a known chance of being selected, but most residents will not end up in the final sample. The open participation survey increases the number of completed surveys and ensures that the whole community feels included and everyone has a chance to be heard.
Please note, the open participation survey and random sample survey data will be provided separately in the final report.
What is sampling and how does it work?
Sampling is the process of selecting a random subset of households from your community to receive the survey. To do this, we use the boundary data you provide, along with USPS lists of addresses in your jurisdiction, to geocode the households that fall within your community’s boundaries. We then randomly select a number of households for our sample, because surveying every household would be prohibitively expensive. Because response rates tend to be lower among multi-family households, we often select a higher proportion of multi-family homes than single-family homes to keep the sample as representative as possible.
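For readers who want to see the general idea in practice, the sketch below illustrates the selection step in Python. It is a simplified illustration only, not Polco’s actual production process; the file name, column names, and sample sizes are hypothetical.

```python
import pandas as pd

# Hypothetical USPS-style address list that has already been geocoded and
# filtered to the jurisdiction's boundary. Column names are illustrative.
addresses = pd.read_csv("addresses_within_boundary.csv")  # columns: address, unit_type

single_family = addresses[addresses["unit_type"] == "single_family"]
multi_family = addresses[addresses["unit_type"] == "multi_family"]

# Draw a random sample, deliberately including a higher proportion of
# multi-family homes (illustrative counts only) to offset their lower
# response rates.
sample = pd.concat([
    single_family.sample(n=1700, random_state=42),
    multi_family.sample(n=1300, random_state=42),
])

print(f"Mailing list size: {len(sample)} households")
```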
What does it mean to track an area, and what kind of information does this provide?
If you choose to track areas, your final report will allow you to compare results between respondents located within each of those different areas. These comparisons can show if (and how) perspectives vary between residents from different parts of the community, which could provide useful data for future decision-making. Some clients choose to track council districts, wards, or general geographic locations (e.g., North/South/Central). Others choose not to track areas at all! It's entirely up to you and your colleagues. It may be something to consider as you discuss custom questions, too, in case any of those may benefit from area breakdowns.
What stops respondents from taking the survey multiple times?
We have conducted extensive testing over the past two decades on duplicate responses, and have also presented on this topic at AAPOR. In our research, we have found duplicate responses and submissions to be quite rare; in fact, it is a long-standing joke in the survey industry that it is difficult enough to get the respondent to take the survey once, much less multiple times.
In particular, duplicate responses or “ballot-stuffing” are very rarely a concern for our benchmark surveys such as The NCS, as these surveys do not cover contentious topics and are unlikely to elicit such strong feelings and actions from residents. Even so, Polco does automatically check for duplicate responses during analysis and removes those from the dataset when found.
If duplicate responses are a major concern for your community, you could consider requiring registration on the survey. This is an online survey setting that requires respondents to provide their email address and zip code in order to submit a response. We generally recommend allowing guest responses for the random sample survey, to reduce any potential barriers for residents; requiring registration is one way to deter multiple responses from a single individual, but it might also deter response in general, so it’s worth additional consideration and discussion.
Should we require participants to register on Polco and provide their email address and zip code?
When guest responses are enabled (our default and recommended setting for the random sample survey), respondents will be invited—but not required—to provide their name, zip code, and email address. If registration is required, respondents must provide their zip code and email address, with full name optional, in order to submit their responses.
Both settings have their benefits, so it’s worth considering your goals for the survey. Guest responses may encourage more responses overall, but they are less effective at building a panel of subscribers for future efforts, will likely result in fewer verified responses, and won’t deter duplicate responses. Requiring registration may result in fewer responses and more respondent questions or concerns about privacy, but it also offers mutual benefits beyond those listed above: by building a larger panel of trusted subscribers, the organization can lean on these interested constituents for future feedback needs, and the subscribers can easily stay informed about and connected with ongoing opportunities to share input with their local leaders.
What does it mean if a respondent is verified?
When guest responses are enabled (our default and recommended setting for the random sample survey), respondents will be invited, but not required, to provide their name, zip code, and email address. If registration is required, respondents must provide their zip code and email address, with full name optional, in order to submit their responses. Behind the scenes, Polco uses a proprietary matching algorithm to verify the user-provided name and zip code against local voter registration records, giving administrators additional demographic information about their respondents, such as gender, age range, and voting district. This can provide valuable context to results, allowing leaders to see how their respondents align with the larger community demographics. By comparing verified results to unverified results, admins can get a better idea of the accuracy of their results and identify any potential issues.
Historically, Polco has matched around 60% to 70% of its responding users on local voter files. Each user successfully matched against the verification list no longer needs to self-report most additional demographic and geographic information (age, gender, precinct/ward, etc.). This not only improves the accuracy of the results, but helps stave off survey fatigue. In addition, verification ensures that respondents only respond to each survey once, avoiding fraud and ballot box stuffing. Polco values respondent privacy and does not report any individually identifying information.
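To make the matching concept concrete, here is a deliberately naive sketch in Python. Polco’s actual algorithm is proprietary and far more sophisticated (handling nicknames, typos, and household-level records, among other things), and all of the data and column names below are invented.

```python
import pandas as pd

# Invented example data; real voter files and response sets are far larger.
responses = pd.DataFrame({
    "name": ["Pat Jones", "Sam Lee"],
    "zip": ["80301", "80304"],
})
voter_file = pd.DataFrame({
    "name": ["Pat Jones", "Alex Kim"],
    "zip": ["80301", "80302"],
    "age_range": ["35-44", "55-64"],
    "gender": ["F", "M"],
    "district": ["2", "4"],
})

# Naive exact match on normalized name + zip code.
responses["name_key"] = responses["name"].str.lower().str.strip()
voter_file["name_key"] = voter_file["name"].str.lower().str.strip()
verified = responses.merge(
    voter_file, on=["name_key", "zip"], how="left", suffixes=("", "_voter")
)

# Matched respondents pick up demographic fields; unmatched rows stay blank.
print(verified[["name", "zip", "age_range", "gender", "district"]])
```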
Residents are concerned about needing to provide their email addresses to submit the survey (if registration is required). They’re saying it’s not anonymous/confidential. How should I respond?
Polco’s privacy policy is very strict and favors respondents. Participant email addresses are not shared with or displayed anywhere within the administrative portal, providing a critical layer of anonymity between the survey giver (city, state, organization) and the respondent. In addition, results are reported in aggregate, so an individual’s identity cannot be determined from their responses. Polco also does not share, sell, or give away participant email addresses. Please view our privacy policy for more information.
By requiring residents to include their zip code and email address when they submit the survey, we help ensure that each resident responds only once. Any residents who have registered with Polco will receive an email notification whenever the jurisdiction posts additional content on Polco in the future, giving them the chance to respond and engage with their local government if and when they wish, and they can opt out at any time. Polco will never use a participant’s email for any reason outside of civic engagement!
What type of response rate can I expect?
Typically we see a response rate of 15-25% from communities. In general, response rates to surveys similar to The NCS have lowered over time across the entire survey research industry. We continue to explore different methods for encouraging engagement!
What can we try to increase our response rates?
Promotion makes the biggest difference. Sharing the survey across as many trusted communication channels as possible, encouraging residents to invite friends and neighbors to take the open participation survey, and communicating how the results will be used to inform upcoming decisions all help boost participation.
How many responses do I need for my results to be valid?
The data will be valid regardless of the number of surveys received, because we use industry standard best practices for sampling and weighting the data. However, the number of surveys received will impact the margin of error. We typically strive for a margin of error of +/-5%, and generally meet that threshold for the vast majority of surveys we conduct.
What is the confidence interval of the random sample survey?
We use the 95% confidence interval, and our goal is a margin of error of +/- 5% or less.
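For reference, the margin of error for a simple random sample at the 95% confidence level can be approximated with the standard formula below. This is a simplification; the figures reported for your survey also account for weighting and sample design.

```python
import math

# Approximate margin of error at the 95% confidence level (z = 1.96),
# using the conservative worst-case assumption that p = 0.5.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 384 completed surveys corresponds to about +/-5%;
# more responses shrink the margin of error.
for n in (200, 384, 600, 1000):
    print(f"n = {n:4d}  ->  +/-{margin_of_error(n) * 100:.1f}%")
```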
Our response rate was 15% – are the data even valid?
While we always strive for a higher response rate, a low response rate certainly does not mean you should disregard the data. We use rigorous methodologies to ensure that every person in your community has an equal chance of being selected to receive a survey and that those who are selected have several opportunities to complete the survey. Once we have all the responses, we also weight the data to best represent the profile of the community. Respondents receive a higher weight if they are in groups that are under-represented (for example, younger males) and a lower weight if they are in a group that is over-represented (for example, older females). Each respondent’s opinion is still counted, but the sum of the opinions reflects the true population. Again, while we may prefer a high response rate, this survey (with its rigorous methodology and weighting) is going to be a much better indicator of community opinion than only listening to those residents who choose to write or telephone the city or attend a meeting.
What does it mean for Polco to “clean” the data?
Cleaning data refers to the process of reviewing and verifying the collected responses to ensure they are complete, accurate, and consistent with the survey questionnaire. This involves identifying and addressing issues such as missing or inconsistent responses, duplicate responses, outliers, and invalid data which can distort the analysis and interpretation of the results. The goal of cleaning survey data is to produce a dataset that is valid and reliable, and that can support meaningful insights and conclusions.
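As an illustration only, a few of these cleaning steps might look like the sketch below in Python; the actual review also involves analyst judgment, and the file and column names here are hypothetical.

```python
import pandas as pd

raw = pd.read_csv("survey_responses.csv")  # hypothetical export of raw responses

# Remove exact duplicate submissions.
cleaned = raw.drop_duplicates()

# Drop responses that are essentially blank (fewer than 3 answered items).
question_cols = [c for c in cleaned.columns if c.startswith("q")]
cleaned = cleaned[cleaned[question_cols].notna().sum(axis=1) >= 3]

# Flag out-of-range values on a 1-4 rating question for manual review.
q1 = cleaned["q1_overall_quality_of_life"]
invalid = cleaned[q1.notna() & ~q1.isin([1, 2, 3, 4])]

print(f"{len(raw) - len(cleaned)} rows removed; {len(invalid)} flagged for review")
```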
What is weighting?
Weighting is a method used to make the demographics of respondents match the demographics of the total population of your community. We select a random sample of households to receive survey invitations. After we get the results back, we compare certain variables, such as age, race, and gender, between the sample and the entire population using Census data. Those who respond to the survey are not always representative of the total population; some groups, such as males and renters, are commonly underrepresented. Weighting allows us to adjust the results to bring them more in line with population norms. For example, if our sample of survey respondents contained 40% males and the population contained 49% males, weighting can be used to correct for this discrepancy. This gives underrepresented groups a voice and aligns the sample with the entire population of your community.
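To make the arithmetic concrete, here is a minimal sketch of the gender example above. Real weighting balances several variables (age, race, gender, housing tenure, and so on) at once, so the calculation is more involved than this.

```python
# Post-stratification weight = population share / respondent share.
population_share = {"male": 0.49, "female": 0.51}   # e.g., from Census data
respondent_share = {"male": 0.40, "female": 0.60}   # from the survey sample

weights = {group: population_share[group] / respondent_share[group]
           for group in population_share}

print(weights)  # {'male': 1.225, 'female': 0.85}
```

Each male respondent’s answers count a bit more (weight 1.225) and each female respondent’s a bit less (weight 0.85), so the weighted totals reflect the community’s actual makeup.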
How seriously should benchmarks be taken?
Benchmark comparisons provide important context for interpreting your survey results. Benchmarking allows communities to compare their performance and services with those of other communities and identify areas for improvement. By measuring performance against recognized benchmarks and best practices, organizations can identify gaps, prioritize efforts, and demonstrate value to their constituents. Benchmarking is a critical tool in any effort to increase transparency, accountability, and service delivery.
In all cases, benchmarks should be used as one piece of the puzzle. In addition to the benchmark comparisons, we recommend considering each rating relative to past years where possible, as trends over time can provide very valuable data.
What are crosstabs and how do they work?
Crosstabulations, also called subgroup comparisons, are a common way to analyze the relationship between two or more questions, making data more actionable by identifying patterns, trends, and correlations within survey results. Crosstabs also break down results into more manageable pieces, allowing for increased focus on specific topics and demographics. Understanding how different groups of respondents answered certain questions can foster deeper awareness about community needs and power data-driven decisions.
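For readers who work with the underlying data, the small sketch below illustrates the idea of a crosstab in Python; the data and column names are invented.

```python
import pandas as pd

# Invented example: how a quality-of-life rating breaks down by housing tenure.
df = pd.DataFrame({
    "tenure": ["own", "rent", "own", "rent", "own", "rent"],
    "quality_of_life": ["excellent", "good", "good", "fair", "excellent", "good"],
})

# Percent of each tenure group giving each rating.
crosstab = pd.crosstab(df["tenure"], df["quality_of_life"], normalize="index")
print((crosstab * 100).round(1))
```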
How have other jurisdictions shared these results with their communities?
Most organizations post the URL to their interactive Tableau report of results from The NCS on their websites, and then share that URL across their communication channels. Many will distribute a press release summarizing the results, create quick videos, and/or include high level takeaways on blogs or social media. It is a best practice, if possible, to share your planned next steps alongside the survey results, demonstrating to the community that their time taking The NCS was valuable, and that their feedback will be used to make important decisions.
How have other communities used their survey results?
Most organizations use The NCS to make policy decisions, assist with strategic planning and budgeting, and measure performance over time. Results are also used to understand resident sentiment, identify issues and opportunities early, and highlight gaps in perception and/or education. The first year of results establishes a baseline for performance and service delivery, and subsequent years produce trendlines that help gauge the success of new initiatives and determine areas of focus.