Our monthly Evaluation Therapy Newsletter shares strategies, ideas, and lessons learned from our decades of evaluating learning in non-school spaces - museums, zoos, gardens, and after-school programs. Jessica is a learning researcher who is an educator at heart. She loves helping education teams really understand and build insights from data that they can use immediately – even those who are a bit wary of evaluation.
The Search is Over: Finding treasure in the data you have
Hi Reader,
In the steamy dog days of summer, there has always been something soothing about sinking into an icily air-conditioned movie theater. Especially for an over-the-top adventure flick. Whether we're following The Goonies, Indiana Jones, or Nicolas Cage doing unspeakable things to the Declaration of Independence, give me AC on high and a bucket of popcorn. I'm in.
This month, let's channel our inner Goonies and talk about the buried treasures of data that could be hiding in information your organization already has.
Also included:
September kicks off conference season. Where might you run into us next?
A free webinar this Thursday (8/15) introducing the On-the-Spot Feedback approach. What's that? Read on to find out.
Cheers,
Jessica
The search is over. It was with you all the while.*
Typically, when a team thinks about an evaluation project, they have visions of new surveys, interviews, or focus groups dancing in their heads. The assumption when contacting an evaluator is that collecting new data is the solution. And, sure, that is very often true.
But sometimes, it's not.
Sometimes, I hear comments that scream, "There's already data here!" Like this:
"We need a survey to find out what our network members have been doing toward our project goals. ...
I've also been having check-in calls with each member of our network over the past year, just to see what they're doing, what they need, and how we can help."
[insert record-scratch sound]
In cases like these, my data-sniffing nose tells me that an institution is sitting on piles of information that could inform the exact questions they have about audiences and programs.
I understand why they wouldn't see what they have as useful. The information is a mess. Or a bit haphazard. Or trapped in a CRM system that everyone low-key hates. Or massive in scale. Or in a form that doesn't look like data.
At this stage, I love to start digging to find out whether they have a pile of buried treasure (in need of some analytical love) or just fool's gold.
So, let's get real about the pros and cons of "found data"
A few common places where really valuable data can be hiding in plain sight
Pro: A bird in the hand is worth two in the bush.
Hunting idioms aside, this is the most obvious strength of found data. They already exist. Someone already expended energy giving, gathering, and recording them. This strategy can leverage and respect those prior expenditures of effort, whether they came from your staff or from your participants.
Con: Pushing against "new is always better."
By its very nature, this approach values what you already have. But there are always folks who look at existing material and think it means "already known." And, as we'll get to below, existing material tends to be a bit messy. Human nature can place a higher value on new-and-shiny or fresh-and-clean, making "found data" a harder sell.
Pro: It may capture internal expertise.
Some types of found data have the fingerprints of internal staff all over them. I strongly believe that people on the ground know a lot about audiences and how things really work. When data from that close work is available, it can capture reality in a way you may not get through less relationship-centered methods.
Con: The data are probably in a jumbled pile of chaos.
Let's be honest, if these data were in highly organized folder systems, with clear labels, names, and metadata... well, you would have used them by now. Most of the time the valuable bits are intermingled with lots of other stuff. They are stored in an assortment of idiosyncratically named Google folders. Sound familiar? The data exist, but it's gonna take some investment to pull out what's useful and get it into a format to analyze.
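Curious what that investment looks like in practice? Here's a minimal sketch in Python (pandas), assuming the scattered files have already been pulled down from those Google folders as CSVs into one local folder. The folder name and columns here are hypothetical stand-ins, not a prescription:

    # Sweep a folder of inconsistently named CSV exports into one table,
    # tagging each row with the file it came from.
    # Assumes the exports live locally in ./exports/ (hypothetical path).
    from pathlib import Path
    import pandas as pd

    frames = []
    for path in Path("exports").rglob("*.csv"):
        df = pd.read_csv(path)
        df.columns = df.columns.str.strip().str.lower()  # tame header drift
        df["source_file"] = path.name  # keep a breadcrumb for auditing
        frames.append(df)

    combined = pd.concat(frames, ignore_index=True)
    print(combined.shape)

Even a rough combined table like this makes it much easier to see what you actually have.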
Pro: You can benefit from the power of data-merging.
It's a little like Captain Planet and the Planeteers. (Stay with me.) Each set of found data has value on its own, but it has limits. When you combine those data with other data, their power multiplies. This is where publicly available datasets (also big, messy, and already in existence) can enrich what you already have. Got ZIP codes? Got school or district names? Those are doors to so much more information.
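To make the Captain Planet move concrete, here's a small hedged sketch of a ZIP-code join in Python (pandas). Both file names and the median_income column are hypothetical; the public file stands in for any ZIP-level dataset you might download (Census figures, for example):

    import pandas as pd

    # Your own records (e.g., a ticketing export) and a public ZIP-level
    # dataset -- both file names here are hypothetical placeholders.
    visits = pd.read_csv("ticketing_export.csv", dtype={"zip": str})
    census = pd.read_csv("zip_demographics.csv", dtype={"zip": str})

    # Standardize to 5-digit ZIPs so the join keys actually match
    visits["zip"] = visits["zip"].str.strip().str[:5].str.zfill(5)

    # One merge, and every visit row now carries neighborhood context
    enriched = visits.merge(census, on="zip", how="left")
    print(enriched[["zip", "median_income"]].head())  # hypothetical column

One caution from experience: read ZIP codes as text, not numbers, or leading zeros vanish and the join quietly fails.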
Con: The data are probably messy.
Even if data are in spreadsheets, I'll bet that someone, at some point, started entering the data differently. If it's a big data set (say, ticketing data), there are inevitably errors and mistaken entries. To be analyzed, all of that will need to be cleaned up first. To re-quote something I once heard: "I will let the data speak for itself when it has cleaned itself." (source, as best I can tell.)
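What does that cleanup actually involve? Often it's as humble as collapsing the same answer entered three different ways. A tiny illustrative sketch in Python (pandas), where the column and the mapping are invented examples:

    import pandas as pd

    # The same category, typed several ways over the years (invented data)
    df = pd.DataFrame({"school_type": ["Title I", "title 1", "TITLE-I", "Private", "private "]})

    df["school_type"] = (
        df["school_type"]
        .str.strip()   # drop stray spaces
        .str.lower()   # unify capitalization
        .replace({"title 1": "title i", "title-i": "title i"})  # merge variants
    )
    print(df["school_type"].value_counts())  # now just two categories

Multiply that by every column and every year of entry habits, and you'll see where the investment goes.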
Con: The data are imperfect.
As rosy as I am on found data, they are never perfect. Sometimes what you find is a pile of goo that cannot be salvaged. But more often, there are degrees of imperfection.
Maybe questions were poorly worded. Maybe the data were gathered inconsistently. Maybe "sampling" is a punchline. Maybe records kept changing names, or the grain size of what got written down kept shifting. Maybe processes or tools evolved over time, so it's kinda consistent... but also not really.
I'm not a believer that data must be "perfect" to be useful. At the same time, factor in the data's source, issues, and completeness when drawing conclusions. When patterns do emerge, they likely spotlight something real that is going on, within the data's constraints.
Pro: Leverage what you have to focus future investment.
This is where the real value comes in. Existing data may not answer all of your questions. But it can clarify what your institution does, in fact, already know.
Now you can figure out the next question. With some data pointing the way, you'll know better where to look. Maybe you discover a big gap in field trips from Title I schools, which is the segment you most want to support. Now you know that your evaluation questions aren't about all schools, but about the unique needs and perspectives of those high-need schools. With a narrower, action-oriented question, you will get more bang for your evaluation buck.
Also: Digging through the messy, imperfect data you have can help you get serious about improving in-house data processes. You'll weed out the crappy survey questions, track those schools carefully, or finally understand why you ask for ZIP codes.
Real World Example:
We worked with a project that was sitting on a mountain of survey data collected over 3+ years of programming. It was time to report back to funders. But past efforts analyzing the data had been somewhat disappointing. And, of course, the data had some issues.
Questions were tweaked, added, and subtracted over time (for good reason - stop asking questions that aren't working).
Some surveys were collected online, some on paper.
Data entry was consistent. Except when it wasn't.
A global pandemic screwed up everything for a year or two.
In discussions, we also realized they had more than those piles of surveys. They had a ton of staff knowledge and documentation that could be mined:
Application information
Staff observations during 1:1 sessions
Staff knowledge of 6+ years of past participants' current status -- if they stayed actively involved, cut off contact, kept up with skills, or had life events that impacted involvement.
Staff were engaged in the community on a regular basis. They had tons of knowledge. It was more efficient (and respectful) to wrangle that knowledge than to ask people yet again to provide information. And staff knowledge filled information gaps that a fresh survey or interview project likely would have missed.
The task for us was to systematize, clean, merge, and analyze those piles of existing information. And that made the data far stronger than the sum of the individual parts.
Ultimately, we learned not only about participants' skill growth, but that incoming experience had a lasting effect on self-confidence -- which also told us that participants' self-reports (on surveys) could be somewhat unreliable.
Has anyone else leveraged interesting data that were lying around? Reply and tell me. (I love getting creative with data.)
*It's no Eye of the Tiger, but The Search is Over is a Survivor classic.
On the Road Again
Where you can find JSC team members out in the wild:
Science-Art Symposium: Mechanisms of Change (September 13-14, Denver Botanic Gardens): Jessica will be giving a lightning talk about our project that combines strategies of visual art and data literacy in science teaching. It looks like DBG has put together a great program. Interested? Registration is open here.
ASTC Annual Conference (September 28-30, Chicago): We love the Windy City! Jessica, Angie, and Michelle will all be at ASTC this year. We have several sessions, posters, and a workshop on the docket. Will you be there? Email us and let's meet up!
Webinar: On-the-Spot Feedback
Thursday (August 15) at 3 p.m. ET
Jessica will be part of a webinar about the On-the-Spot Feedback approach. OTSF was developed to help people who communicate about science get feedback from audiences on-the-spot.
(I love a project title that doesn't beat around the bush.)
Curious? During the webinar, you'll:
Learn how these strategies help science communicators get instant feedback (and adjust what they do) during public engagement.
Hear some research and evaluation evidence about how scientists actually used this approach in real life and how they changed in response.
Get the scoop on resources available if you want to try this approach.
The full project team is leading this webinar. Jessica will present evaluation highlights, alongside Kelly Hoke (from Oregon State) who will be sharing late-breaking results from the project's research.
P.S. Got a question you'd like us to answer in an upcoming newsletter? Hit reply and tell me what's on your mind!
P.P.S. Get this email from a colleague? Sign up to get your very own copy every month.
Why the "Evaluation Therapy" Newsletter?
The moniker is light-hearted. But the origin is real. I have often seen moments when evaluation causes low-key anxiety and dread, even among evaluation enthusiasts. Maybe it feels like a black-box process sent to judge your work. Maybe it’s worry that the thing to be evaluated is complicated, not going to plan, or politically fraught. Maybe pressures abound for a "significant" study. Maybe evaluation gets tossed in your "other duties as assigned" with no support. And so much more.
Evaluation can be energizing! But the reality of the process, methods, and results means it can also feel messy, risky, or overwhelming.
I've found that straightforward conversation about the realities of evaluation and practical solutions can do wonders. Let's demystify the jargon, dial down the pressure, reveal (and get past) barriers, and ultimately create a spirit of learning (not judging) through data. This newsletter is one resource for frank talk and learning together, one step at a time.
You are receiving this email because you signed up for our newsletters somewhere along the line. Changed your mind? No hard feelings. Unsubscribe anytime.
Wanna send us some snail mail? J. Sickler Consulting, 100 S. Commons, Suite 102, Pittsburgh, PA 15212
The Evaluation Therapy Newsletter
J. Sickler Consulting