Market Research for Education Nonprofit
- Market Research
- Confidential
- Jan. 2017 - Dec. 2018
- Quality: 5.0
- Schedule: 5.0
- Cost: 5.0
- Willing to Refer: 5.0
"Their customer service was amazing and they really connected well with the staff."
- Other industries
- Culver City, California
- 51-200 Employees
- Phone Interview
- Verified
An education nonprofit hired Q-Insights for market research services. The team helped recruit 12,000 participants for the client's education program and conducted a series of screening surveys to qualify them.
The project was a huge undertaking, but Q-Insights was able to overcome the challenges and provide the client with the results they were looking for. The internal stakeholders were impressed with how the team accomplished its work on a very tight schedule.
A Clutch analyst personally interviewed this client over the phone. Below is an edited transcript.
BACKGROUND
Introduce your business and what you do there.
I am the former chief advancement officer at the XPRIZE Foundation, which is a global leader in the design and operation of large-scale incentivized prize competitions and philanthropy.
OPPORTUNITY / CHALLENGE
What challenge were you trying to address with Q-Insights?
On the adult literacy XPRIZE, we were looking to recruit 12,000 adult learners in Los Angeles, Dallas, and Philadelphia, all of whom spoke either English or Spanish and read at or below the equivalent of a third-grade level. This was the largest educational program of its kind in the United States, and we were looking to recruit all of those people within four months.
SOLUTION
What was the scope of their involvement?
We contracted with Q-Insights to do a lot of the operational work of the educational program, which primarily included canvassing and street recruitment, and they set up a lot of the infrastructure to do all of that. They were integral in strategizing how we would recruit people who were not enrolled in educational programs. They went to sites all over Los Angeles and Dallas where we were likely to find participants who did not meet that basic literacy requirement and were therefore eligible for our program.
They set up mobile testing sites to recruit people, screen them to see if they were eligible, administer a test, help them download a learning app, and collect their demographic and contact information. They would then get back in touch with those people a year later to bring them back in for post-testing. This was the most difficult part of the work because this is a population that is highly transient and not likely to maintain the same phone number year over year.
They are transient for a number of reasons: they might be migrant workers, they might move to get another job even where there isn’t typical migrant work, and there are other reasons they move and become hard to contact. Our target was to bring back between one in five and one in six people, and their job was to track those people throughout the year so that they had a way to contact them at the end of the year and bring them back in. They did all of that work, from strategizing what it would look like all the way to execution.
We had a technology setup in which all the testing was done on either a computer or a tablet, and all the data fed back directly. They actually built their own setup on top of a data system that easily generated the reports we needed on who the participants were and how they tested, and every participant was assigned a student ID. The whole process was difficult, but they managed to pull it off. They ended up being responsible for close to 40% of our recruits for the entire study, which I think was pretty remarkable. They did an unbelievable job, and honestly, without them, there was no way we could have done it.
The other thing was that we had a lot of Spanish speakers who came back easily from other venues, and we needed more of the English speakers to return. Recruiting English speakers is actually harder than recruiting Spanish speakers, for a whole slew of reasons that have to do with the stigma associated with low literacy when you are a native speaker of English versus a non-native speaker. They were able to focus on recruiting native speakers of English, and most of the native English speakers who came back for post-testing came through them.
I think close to 80% of the native English speakers came through Q-Insights. They took the hardest part of the job, and they did it.
What is the team composition?
I worked closely with Jamie (VP of Account Services). Jamie was my primary point of contact and the overall project lead. She had quite a large team working on this that included a project manager, and a couple of other people who were doing coordination and data and things. On top of that, she had all the people in the field who were doing the work and that was quite a large number of people.
How did you come to work with Q-Insights?
I was introduced through someone that I knew. I knew what I was looking for and spoke with someone who suggested I speak to a research firm, and the research firm suggested I talk to Q-Insights. I spoke to seven different potential firms to do this work, from political canvassing firms to the folks who collect signatures for propositions in California, as well as market research firms. Through our conversations, Q-Insights just seemed to have the infrastructure to do the project; it wasn’t something you could just build from scratch.
They had a lot of sites around both of these cities, and they were also honest with us. Our third city was Philadelphia, but they let us know they couldn’t do it there because they had never worked there and wouldn’t know how to approach it.
What is the status of this engagement?
We started working with them in January 2017, and we wrapped up in December 2018.
RESULTS & FEEDBACK
What evidence can you share that demonstrates the impact of the engagement?
I really cannot overemphasize how logistically difficult this type of work was; this was large-scale research. By comparison, when you do research on vaccines, your first phase will involve a very small number of people, and the second phase will also involve a very small number of people. It is only in phase three that you really get into large numbers, and usually you’re looking for a general slice of the population; you aren’t digging into a very specific group of people.
When you look at things like medicines that are developed for a specific disease or disorder, you are testing a very narrow group because the total addressable market is small — so generally, recruiting within a very small group of people is exceedingly difficult. Reaching large numbers is not tenable.
We were targeting an unprecedented number of people from a very narrow population. When you look at the total addressable market of low-literacy adults in the two cities where they were doing the work, they recruited nearly 1% of all of those people within our very narrow definition of what we needed, which was insane. Our addressable market was very narrow: people had to be native speakers of English or Spanish, they had to be between the ages of 16 and 64, they had to have a smartphone they could use to access the internet, they had to live in Los Angeles or Dallas, and they could not be enrolled in any educational program at the time.
For them to recruit 4,000 people out of that narrow population actually represented a pretty large percentage of it. I don’t think anybody else would have undertaken that, as it is logistically very difficult.
I’d say it was nearly 300 individual sites spread out across the cities, and they had staff wheeling around those different sites. They were tracking all the staff, and they also had to give out gift cards to participants, which was a big part of the cost; we were giving a $25 gift card to every single participant who came in. They had to coordinate all of that, which was insane, and they had to make sure that nothing was stolen and nothing went missing.
To me, that was very difficult to do under the time pressure that we were giving them. We gave them four months total to do all of that work and I was incredibly impressed. There was zero chance that we would have actually reached our numbers without them, and we knew that from the get-go — they were our fail switch. We had other recruitment methods, but without Q-Insights we were just never going to hit the numbers.
How did Q-Insights perform from a project management standpoint?
I’m not a very clear thinker, and logistics like this were a nightmare for me, but they were able to make it seem workable. They were able to speak to me in a language that I could understand, and I appreciated that. We had general communication and didn’t use any tools in particular.
What did you find most impressive about them?
I think their customer service was amazing and they really connected well with the staff. They connected well with learners, and they were able to navigate very different types of rooms and different types of people very well, which stood out for me. They worked well under pressure, and they never freaked out. We were also working with people who had this as their day job, and they were freaking out about what we were asking them to do. Q-Insights always said they had this project figured out, and I appreciated that. They were very smart, and they knew how to solve difficult problems; they knew how to ask the right questions.
Are there any areas they could improve?
Not in this particular case. I don’t want to overstate how smooth it was because it was hard. We would meet regularly, and they would come over to our office to look at the data, and figure out what was working. One of the things I thought was incredible was the quickness with which they were able to tell, based on their data analysis, which sites were working well, and which sites were not — they would shut down the sites that were not productive within a week.
For example, we did a screener that was supposed to be about 80% accurate in determining whether or not someone should be a participant. We cared about that because if we just randomly grabbed people off the street and had to give everyone $25, it would’ve been financially unfeasible for us. We had to make sure that no more than 20% of participants tested out, meaning they screened in but then took the test and tested too high. For us, they still got the app and the money, but they didn’t count in our data collection.
Within a week, they knew if a particular site had too high a percentage of people who tested out when they took the test. People sometimes just clicked through the site and didn’t really care. Q-Insights identified one site where that was happening too many times. They were able to identify all of those things, work with the data, and figure them out. As far as improving anything, my sense is that they did a great job.
Do you have any advice for potential customers?
Hire them. In my experience working with Jamie, she had project managers who were really top-notch. Our teams connected and got along really well. I think part of that was because the people on my team were good at vendor management, but when you have a vendor who is committed to the work and really sits with you through it, it’s going to be easy. They were over-communicative and sometimes sent too much information, which was hard to process.
RATINGS
- Quality: 5.0 (Service & Deliverables)
- Schedule: 5.0 (On time / deadlines)
- Cost: 5.0 (Value / within estimates)
- Willing to Refer: 5.0 (NPS)