This past March I presented on data visualization techniques to a very engaged crowd at the Minnesota Evaluation Studies Institute (MESI). A few of the charts I presented garnered a lot of interest, judging by the number of people who took out their cell phones to snap pictures (which to me is like reaching star status). In this post, I review a few of those useful but overlooked charts. They’re effective, new(er) ways to visualize common types of data. And they all use Excel!

1. Dot plots for making comparisons

I first learned about dot plots from the very talented Ann Emery. She has a simple, 5-minute tutorial on how to create them on her blog, which you can find here. Stephanie Evergreen also wrote about them on her blog, though she refers to them as dumbbell plots; you can find her post here. I highly recommend you learn how to create these! We recently submitted some reports that looked at grantee data from Year 1 to Year 2, and this kind of chart was so helpful in showing how each grantee’s data shifted from year to year. Here’s an example of one of those charts:
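
If you ever want to reproduce this kind of chart outside of Excel, here is a minimal sketch of the same dumbbell idea in Python with matplotlib. The grantee names and Year 1/Year 2 values are made up, purely to show the structure:

    import matplotlib.pyplot as plt

    # Made-up grantee names and Year 1 / Year 2 values, for illustration only
    grantees = ["Grantee A", "Grantee B", "Grantee C", "Grantee D"]
    year1 = [42, 55, 61, 38]
    year2 = [58, 53, 70, 49]

    fig, ax = plt.subplots()
    for i, (y1, y2) in enumerate(zip(year1, year2)):
        ax.plot([y1, y2], [i, i], color="gray", zorder=1)  # connector (the "dumbbell" bar)
        ax.scatter(y1, i, color="steelblue", zorder=2, label="Year 1" if i == 0 else None)
        ax.scatter(y2, i, color="darkorange", zorder=2, label="Year 2" if i == 0 else None)
    ax.set_yticks(range(len(grantees)))
    ax.set_yticklabels(grantees)
    ax.legend()
    plt.show()

Each grantee gets one row: a dot for each year and a thin line connecting them, which is exactly what makes year-to-year shifts so easy to scan.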

2. Dot plots for descriptive statistics

I also started using dot plots to visualize descriptive statistics. I think of it as an improved version of a box plot. These are pretty simple to create once you get the hang of using dot plots. I recently wrote about these for aea365, which you can find here. Here is an example of using dot plots to visualize the min, mean, and max of a variable across several programs. I found this was a much more effective visual than showing these data in a table.
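
For readers who work in code rather than Excel, the same min/mean/max layout might be sketched in matplotlib like this (the program names and values are hypothetical):

    import matplotlib.pyplot as plt

    # Hypothetical min / mean / max of a variable across three programs
    programs = ["Program 1", "Program 2", "Program 3"]
    lows = [10, 18, 14]
    means = [32, 41, 29]
    highs = [55, 63, 48]

    fig, ax = plt.subplots()
    for i, (lo, mu, hi) in enumerate(zip(lows, means, highs)):
        ax.plot([lo, hi], [i, i], color="lightgray", zorder=1)  # min-to-max range line
        ax.scatter([lo, hi], [i, i], color="gray", zorder=2, label="Min / Max" if i == 0 else None)
        ax.scatter(mu, i, color="steelblue", zorder=2, label="Mean" if i == 0 else None)
    ax.set_yticks(range(len(programs)))
    ax.set_yticklabels(programs)
    ax.legend()
    plt.show()

Because each program sits on its own row, readers can compare both the spread (min to max) and the center (mean) at a glance, which is what makes this beat the table.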

3. Line charts for timelines

Lately I’ve been using a line chart to create timelines. Nothing fancy, but I find they do the job, and I know my way around a line chart. This is also something I recently wrote about for aea365, which you can find here. Here is an example of a timeline created with a line chart. In this example I’m showing a timeline of when intake and follow-up data are available, as well as when the reports were submitted.
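
The same trick translates directly to code: plot the milestone dates along a single flat line and label each marker. Here is a minimal matplotlib sketch with hypothetical dates:

    import matplotlib.dates as mdates
    import matplotlib.pyplot as plt
    from datetime import date

    # Hypothetical milestone dates, for illustration only
    events = {
        date(2015, 1, 15): "Intake data available",
        date(2015, 4, 1): "Follow-up data available",
        date(2015, 6, 30): "Report submitted",
    }

    fig, ax = plt.subplots(figsize=(8, 2))
    dates = list(events)
    ax.plot(dates, [0] * len(dates), "-o", color="steelblue")  # flat line, one marker per milestone
    for d, label in events.items():
        ax.annotate(label, (d, 0), xytext=(0, 12), textcoords="offset points", ha="center")
    ax.get_yaxis().set_visible(False)  # the y-axis carries no meaning here
    ax.xaxis.set_major_formatter(mdates.DateFormatter("%b %Y"))
    plt.show()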

It’s nice to have more options for effectively visualizing data in Excel, and I hope this post expanded your options as well!


Posted in Data visualization, General

All of our hard work trying to reach as many people as we can with our follow-up surveys is paying off!

The North American Quitline Consortium (NAQC) recently updated their quitline map, which includes (among many other things) survey response rates for states that collect evaluation follow-up data. PDA was thrilled to see that we evaluate and collect the follow-up data for three of the four states with the highest response rates – Minnesota, Hawaii, and Florida.

This, of course, is about more than just being “the best”. A strong survey response rate matters because it helps ensure that the data we collect and report are a good representation of all of the participants who used the program. A low response rate can lead to inflated outcomes because the responses disproportionately come from individuals who are easier to reach and more successful with the program.1
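
Here is a toy calculation that shows how that inflation happens. Every number below is made up purely to illustrate the mechanism; it is a minimal sketch, not data from any actual evaluation:

    # Toy illustration of nonresponse bias (all numbers hypothetical)
    n_participants = 1000
    true_quitters = 250  # true quit rate: 25%

    # Suppose quitters are easier to reach: 60% of quitters complete the
    # follow-up survey, but only 1 in 3 non-quitters do.
    responding_quitters = int(0.60 * true_quitters)                 # 150
    responding_nonquitters = (n_participants - true_quitters) // 3  # 250
    n_responders = responding_quitters + responding_nonquitters     # 400

    print(f"Response rate:      {n_responders / n_participants:.0%}")       # 40%
    print(f"Observed quit rate: {responding_quitters / n_responders:.0%}")  # 38%
    print(f"True quit rate:     {true_quitters / n_participants:.0%}")      # 25%

With a 40% response rate and easier-to-reach quitters, the survey “reports” a 38% quit rate when the true rate is 25%. That gap is exactly what a strong response rate protects against.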

We spend a LOT of time and energy figuring out how to best reach people for follow-up surveys. Should we offer the survey via phone, web, and/or mail? How many times should we contact them, and on what days and at what time? How much of an incentive should we provide? Should the incentive be pre-paid or promised? How often should we send them a reminder via mail or email to let them know we’re trying to get a hold of them? There’s more to it than you may think.

Luckily, there are people like Don Dillman who have studied this topic at length and provided many best practices for this very thing. We included the reference to his latest book below, along with some other helpful resources related to obtaining strong response rates. Also, later this year NAQC will publish an updated issue paper on calculating quit rates, which will include a section on strategies for conducting follow-up surveys. Stay tuned!

1 Lien RK, Schillo BA, Goto CJ, Porter L. (2015). The Effect of Survey Nonresponse on Quitline Abstinence Rates: Implications for Practice. Nicotine Tob Res. pii: ntv026. [Epub ahead of print]


Additional resources

Dillman DA, Smyth JD, Christian LM. (2014). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Ed. Hoboken, NJ: John Wiley & Sons, Inc.

Lepkowski JM, Tucker C, Brick JM, et al. (Eds.). (2008). Advances in Telephone Survey Methodology. Hoboken, NJ: John Wiley & Sons, Inc.

De Leeuw E, Callegaro M, Hox J, Korendijk E, Lensvelt-Mulders G. (2007). The influence of advance letters on response in telephone surveys: a meta-analysis. Public Opin Q, 71(3), 413–443.

Cantor D, O’Hare B, O’Connor K. (2007). The use of monetary incentives to reduce nonresponse in random digit dial telephone surveys. In J. Lepkowski, C. Tucker, J. Brick, E. de Leeuw, L. Japec, P. Lavrakas, et al. (Eds.), Advances in Telephone Survey Methodology (pp. 471-498). Hoboken, NJ: John Wiley & Sons, Inc.

De Leeuw ED. (2005). To mix or not to mix data collection modes in surveys. J Off Stat, 21(5), 233–255.

The American Association for Public Opinion Research: www.aapor.org


Posted in General, Surveys

Contributing staff: Becky Lien

Last week, Becky Lien presented at the 2015 conference of the American Association for Public Opinion Research (AAPOR). The conference took place in Hollywood, FL, and attracted a diverse group of attendees: political pollsters, market research firms, methodologists from academia, and survey practitioners.

Becky presented a paper comparing a web+phone follow-up survey strategy to a phone-only strategy, looking at both the effectiveness and the cost of the two approaches. She found the web+phone strategy was more effective: it produced a higher survey response rate and did a better job of recruiting younger people to respond to the survey. In addition, for surveys that require around N=140 completes per month, the web+phone strategy was also more cost-effective.

Hot topics at the conference included Big Data, non-probability samples, and multi-mode survey strategies. Becky returned from the conference with some great ideas for PDA to test in our surveys.

Posted in Presentations, Surveys

In March we welcomed Andy Raddatz to the PDA team. Andy will be serving as our Senior Developer. Prior to joining PDA, Andy worked as a Senior Developer at The Nerdery, where he gained invaluable experience as a lead on several different teams working with varied software platforms. Andy is a full-stack developer with a passion for architecture and code style. At PDA, Andy will take a leadership role in our web-based solutions division and will play key roles in planning and building software for our clients.

Andy adds, “I love the thoughtful approach that PDA takes, and I am inspired by the leadership’s commitment to long-term solutions. I am driven by the desire to create software that faithfully serves a purpose, built with the flexibility to grow with the future needs of PDA and our clients.”

We are so grateful to have you on board, Andy!

Posted in General, PDA Staff

This weekend Heather is attending the Pediatric Academic Societies Annual Meeting in San Diego, CA. This conference is the largest international meeting focused on research in child health. Through her role at Children’s Hospitals and Clinics of Minnesota, she is presenting one poster and is a contributing author to another. If you’re there, check out her work:

Variation in emergency department care for children with asthma
Zook, H.G., Payne, N.R., Puumala, S.E., Burgess, K.M., & Kharbanda, A.B.
Tuesday, April 28 | 7-11am

Implicit bias towards American Indian children in the emergency department
Puumala, S.E., Kharbanda, A.B., Burgess, K.M., Zook, H.G., Pickner, W.J., & Payne, N.R.
Sunday, April 26 | 4:15-7:30pm


Posted in General, Presentations

The welcome is a bit delayed, but we would like to share our delight in having Heather Zook join the PDA team as an Associate Evaluator. Heather will graduate in May with a Master of Arts degree in Evaluation Studies from the University of Minnesota. Prior to starting at PDA, Heather worked as a Clinical Research Associate at Children’s Hospitals and Clinics of Minnesota, where she studied emergency department use by American Indian children and racial disparities in emergency department care. She presented her research at several regional and international conferences and published an article in the Journal of Emergency Medicine.

At PDA, Heather primarily works on evaluating the Florida Department of Health’s tobacco cessation programs. She enjoys working with data and developing user-friendly data management solutions for clients. She also loves writing reports and presenting data in formats that are clear and easy to understand for a variety of audiences. She’s clearly a good fit for our team and we’re happy to have her!


Posted in General, PDA Staff

The Society for Research on Nicotine and Tobacco’s annual conference is going on right now, and some PDA’ers are there! PDA staff collaborated with our client ClearWay Minnesota on five different poster presentations, which are listed below. If you’re attending SRNT, all of these posters will be presented on Friday, Feb 27 at 5:15pm (because who doesn’t want to learn about the challenges of measuring secondhand smoke exposure at the end of the day on a Friday?). Anne and Harlan will be in attendance, so stop by and say hi if you see them. For more information about the SRNT conference, click here.


Perceptions of secondhand smoke risk and restrictions: Challenges and opportunities for new policies

Anne Betzner, Michael Amato, Melissa Haynes, Ann St. Claire & Raymond G. Boyle

Poster Number: 155


Measurement of secondhand smoke exposure using self-report survey methods: Complexities, challenges, and recommendations

Melissa Haynes, Anne Betzner, Ann St. Claire, Raymond G. Boyle & Michael Amato

Poster Number: 156


Using a modified reasoned action approach model to assess intention to quit tobacco

Jake Depue, Michael Luxenberg, Barbara Schillo, Andrea Mowery & Marietta Dreher

Poster Number: 30


If you build it, will they come? Initial impact of expanding QUITPLAN® services

Paula A. Keller, Marietta Dreher, Rebecca Lien, Matt Christenson & Barbara A. Schillo

Poster Number: 103


The QuitCash Challenge: Service utilization results from six years of Quit and Win contests in Minnesota

Erin O’Gara, Raymond Boyle, Marietta Dreher, Mike Sheldon & Rebecca Lien

Poster Number: 130


Posted in General

Written by: Anne Betzner

There is a wide range of qualitative methods available: focus groups, in-depth interviews, participant observation, and photo analysis, just to name a few. When are focus groups the right choice?

Focus groups are best used to assess a group response. By bringing people together to answer a set of planned questions, an evaluator can assess how individuals respond within a group and the level of consensus that develops through discussion. One situation where a group response is useful is gathering information about potential new policies or community-level initiatives. For example, we conducted focus groups to assess participants’ opinions about new policies to limit non-smokers’ exposure to secondhand smoke. Because policies are implemented and experienced at a community level, the group setting of a focus group was an ideal way to learn more about people’s opinions.

Focus groups also provide the opportunity to see how people’s opinions change in a group setting.  One interesting exercise is to ask participants to respond to a question privately on paper prior to discussing the issue as a group. By examining individuals’ private responses against group discussion, an evaluator can see how a person’s thinking changes when talking with others.

Another good use of focus groups is getting people’s reactions to an idea or product. Because a gut reaction is immediate and can be expressed quickly, a focus group is a great way to learn what people think about something specific.

Finally, focus groups are good when you want to gather information from more people than in-depth interviews allow. They also let you ask probing questions and gather richer information than could be obtained with more quantitatively oriented telephone, web, or paper surveys.

And when is a focus group not a good idea?

If you want to understand the habits of individuals with precision, a focus group is not a good fit, because it’s sometimes difficult for every participant to provide a very specific response in a group discussion. Also, if you really want to hear people’s stories about a certain topic, individual interviews might be better.

Focus groups are not ideal for collecting sensitive information because people may not feel comfortable sharing within a group. A more private interaction, such as an interview (for richer information) or a survey (for more precise measurement of behavior), would probably be preferable.

A few tips

As with all methods, interpret focus group findings carefully. Keep in mind that focus group findings are not intended to generalize the way population-based surveys do. It is important that the evaluator provide enough information about the context of the study (methods, participants, conditions, etc.) so that readers can determine how well the findings transfer to other situations.

It can be helpful to remember that focus groups can be combined with other evaluation methods. For example, we had participants complete a short survey after a focus group to help us collect a few pieces of information that would be used in the evaluation. Some focus group participants were also selected for an in-depth interview over the phone on a topic that came up during the focus group. You are not limited to one method – mix it up if it helps!

If a focus group is right for your needs, we have two words of advice. First, pilot. Pilot your protocol, either by conducting interviews over the phone or by running an actual pilot focus group. While it requires a little extra planning, it will help you refine your protocol. Second, sample carefully. Think through where your focus group participants are coming from and what might make them different from the group of people you want to learn about. Be sure to keep this in mind when interpreting results, too.

Below are a few resources that may help you determine if a focus group is a good fit for your needs.

Krueger, R.A, and Casey, M.A. (2015). Focus Groups: A Practical Guide for Applied Research. 5th Ed. Thousand Oaks, CA: Sage.

Patton, M.Q. (2015). Qualitative Research & Evaluation Methods: Integrating Theory and Practice. 4th Ed. Thousand Oaks, CA: Sage.


Posted in General, Qualitative Methods

Written by Becky Lien

Most tobacco quitline managers in the U.S. and Canada conduct evaluations that include outcome studies, yet many struggle to achieve adequate survey response rates. Low survey response rates can result in biased quit rates, and the quit rate is a key measure of a quitline’s success. No worries though, because PDA recently coauthored a journal article that helps inform the quitline field about the impact of low survey response rates on outcome studies and offers suggestions for improving them.

The article is entitled “The effect of survey nonresponse on quitline abstinence rates: Implications for practice” and will be published in the journal Nicotine & Tobacco Research. I coauthored the article with three of our clients – Barb Schillo at ClearWay Minnesota, Cindy Goto at the Hawaii Tobacco Prevention and Control Trust Fund, and Lauren Porter at the Bureau of Tobacco Free Florida. I found collaborating with these three to be especially rewarding, and I want to thank ClearWay Minnesota for funding my time on the project.

The abstract is available online to the public at: http://ntr.oxfordjournals.org/cgi/content/abstract/ntv026?ijkey=Md4IzP98xy7wtJa&keytype=ref. Journal subscribers will be able to access the full article at the same location. The paper will be published in print in an upcoming issue.


Posted in General, PDA in Print

Hazelden Publishing, Clemson University, and Professional Data Analysts, Inc. are releasing a new white paper/report card on bullying in the U.S. According to the report, “despite a dramatic increase in public awareness and antibullying legislation nationwide, the prevalence of bullying is still one of the most pressing issues facing our nation’s youth.”

A stratified random sample of 20,000 Olweus Bullying Questionnaires™ was selected from over 200,000 collected during the 2012-2013 school year from schools across the United States that had not yet implemented the Olweus Bullying Prevention Program. The sample included 1,000 girls and 1,000 boys from each grade, third through twelfth, and the results were broken down by grade level and gender. The most striking findings are:

  1. Overall, 18% of all students surveyed were involved in bullying, whether as one who bullied others, was bullied by others, or both.
  2. The highest prevalence of bullying was in third and fourth grades, where roughly 22% of school children reported that they were bullied two or three times or more per month.
  3. Cyberbullying was one of the least common forms of bullying experienced. Only 4% of boys and 6% of girls reported being cyberbullied two or three times a month or more. Students were less likely to be cyberbullied than to be bullied in any of the following ways: called mean names (verbal bullying), the target of rumors or lies, deliberately left out (exclusion), bullied with words with a sexual meaning, bullied about race, physically bullied, or threatened or forced to do things against his or her will.
  4. Some forms of bullying, such as cyberbullying and being bullied in ways that had a sexual meaning, were much more common among high school students than elementary or middle school students.
  5. On average, students reported that they were bullied in four different ways; only 16% said they were bullied in only one way.
  6. Bullying incidents are not restricted to one location: 20% of students who were bullied reported it happening in two locations, and 45% reported being bullied in three or more locations.
  7. Students who reported both being bullied and bullying others (bully-victims) experienced the most forms of bullying. In fact, by high school these students experienced, on average, six different forms of bullying two to three times or more per month.
  8. A substantial proportion of bullied students did not confide in anyone about being bullied, and boys were less likely to confide in others than girls. For girls: 17% of third through fifth graders, 23% of middle schoolers, and 27% of high school youth kept silent about bullying they had experienced. For boys: 22% of third through fifth graders kept silent; 29% in middle school; and a very high 43% in high school.
  9. Although more than 90% of girls and 80% of boys said they felt sorry for students who are bullied, far fewer reached out to help them.
  10. Students’ confidence in administrative and teaching staff to address bullying was low. By high school, fewer than one-third of bullied students had reported the bullying to adults at school, and only 36% of all high school students said that school staff often addressed bullying in school.

Despite significant attention about bullying from policymakers, educators, and community members, the number of students who are bullied remains unacceptably high. The results of this report show that there is still much work to be done in strengthening school environments so that every student has a safe place to learn and grow.

For more information visit www.violencepreventionworks.org.

The full report is available here: http://www.hazelden.org/web/public/document/obppbullyingtrends_2014_final.pdf


Posted in General