St. Anthony Main
219 Main Street SE, Suite 302
Minneapolis, Minnesota 55414
Last month, Anne, Becky, and Julie presented at the North American Quitline Consortium’s (NAQC) conference in Atlanta, GA. The conference was a great opportunity for the quitline community to come together to discuss the future of quitlines and hot topics, such as electronic nicotine delivery systems (ENDS), e-referrals, and the quitline’s role in treating tobacco users with mental health and/or behavioral health issues.
Anne’s presentation was a sneak peek into a recently released NAQC issue paper on how to calculate NAQC quit rates, which can be found here. NAQC invited PDA to author this issue paper with recommendations for how to calculate standard NAQC quit rates, which quitlines in the US and Canada will use to assess and improve their performance. In her presentation, Anne focused on key changes in the recommended NAQC standard quit rate, including how to handle participants who receive web- or text-based cessation treatment, how to handle electronic cigarette users, and options for gathering high-quality quit rate data on a limited budget. The room was filled with quitline managers and researchers who had a lively discussion about the challenges and best practices of measuring high-quality quit rates.
Becky and Julie’s presentation provided strategies to improve survey response rates and data quality. Getting a strong survey response rate is important because it helps ensure that the evaluation data is a good representation of all of the participants who used the program. Becky and Julie provided concrete examples of strategies related to study design, working with vendors, and survey methods, along with the cost implications of each strategy. They also reviewed how to pick the best mix of strategies to balance cost and effectiveness. See an image of their handout below.
Contributing staff: Traci Capesius
More and more, we are seeing health care systems shift their focus further upstream, using a systems change approach to address behaviors, such as tobacco use, that cause or exacerbate many chronic health conditions. A systems change approach can be defined as permanently altering protocols, policies, and infrastructure so that addressing a target behavior (e.g., helping tobacco users quit) becomes systematic and part of daily practice. Systems change may occur within an agency, between multiple departments, or across an entire health system.
Why systems change?
Health systems are engaging in more systems change work in order to systematically improve the health of their patient populations and, ultimately, reduce the cost of care. Treating tobacco dependence, for example, is becoming more important to improving patient health and reducing the cost of care, as tobacco use can cause or worsen many chronic health conditions. Implementing tobacco-related systems change strategies can enhance health systems’ ability to identify and treat tobacco use, thereby reducing health care costs. In 2014, PDA co-authored a journal article with ClearWay MinnesotaSM that summarized how health systems were able to make systematic changes to address tobacco dependence among their patient populations and provided recommendations for health systems and funders of systems change initiatives.
Systems change strategies
The following are a few examples of key systems change strategies that have the potential to increase the sustainability of tobacco dependence treatment interventions. While some of these strategies are more suited to clinical environments, some could also be applicable to community or social service-based organizations.
- Obtain buy-in from providers, clinical staff, and other front-line personnel via trainings and ongoing communication (e.g., daily “huddles” or meetings) regarding the importance of identifying and treating tobacco users.
- Recruit project champions from multiple levels who fully support the systems change effort, are willing to provide leadership, and are in a position to influence colleagues and decision makers.
- Integrate brief intervention into standard clinic practice. This includes regularly asking about tobacco use status during clinic visits and making conversations about tobacco use and treatment options (including cessation pharmacotherapy) a standard part of care for all tobacco-dependent clients.
- Continually monitor clinic and provider performance in implementing screening, referral, service provision, and follow-up. Regularly review the data and report results back to key stakeholders.
Evaluating systems change initiatives
PDA has evaluated several systems change initiatives in multiple states. In our evaluation approach, we often conduct interviews with key staff members at the start and end of the initiative (and perhaps at a midpoint as well). The interviews help determine where programs are starting and ending, what the facilitators and barriers to making changes were, and what lessons they learned along the way. We also review the literature on systems change (such as the sources listed below) to help inform our methods, protocols, results, and conclusions.
Centers for Disease Control and Prevention. Best Practices for Comprehensive Tobacco Control Programs — 2014. Atlanta: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health, 2014.
Fiore MC, Jaen CR, Baker TB, et al. Treating Tobacco Use and Dependence: 2008 Update. Clinical Practice Guideline. Rockville, MD: U.S. Department of Health and Human Services. Public Health Service. May 2008.
LaPelle N, Zapka J, Ockene J. Sustainability of public health programs: the example of tobacco treatment services in Massachusetts. Am J Public Health. 2006;96(8):1363-1369.
Savaya R, Spiro SE. Predictors of sustainability of social programs. Am J Evaluation. 2012;33(1):26-43.
Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059-2067.
Recently our Senior Analyst, Becky Lien, shared some uses for Venn diagrams based on this recent report from the Pew Research Center, and I’m surprised to say I actually see a use for them! The twist with this Venn diagram is that you change the size of each circle, as well as the size of the circle overlaps, to accurately represent the proportions in the data. Sweet, and duh! You can do this fairly accurately in Microsoft Office, or super-duper accurately in R.
Now, I often avoid using circles when I visualize data because people have a hard time estimating area, but this seems like a useful way to visualize data when you have different combinations of responses. Take the following example. We collect data on what type of stop-smoking medication people used, if any, to help them quit using tobacco. Someone might use one type of medication, but they could also use a combination of medications, which can result in a lengthy, and perhaps unclear, bar chart. Becky thought that a proportional Venn diagram might be a great way to show the various patterns of medication use. Here is a diagram that shows the number of participants who used nicotine replacement therapy (NRT) in the form of the patch, gum, and/or lozenge.
Patterns of NRT use
Becky created this in R, but you could also do it in Microsoft PowerPoint, Word, or Excel. You can either go the quick route and eyeball the proportions of the circles, which could be totally fine, or you can revisit your geometry and algebra classes to get more exact circle sizes, which is what I tried out.
I replicated Becky’s work in PowerPoint. I first determined the total number of people represented by each of the four circles. For Patch, 661 + 106 + 60 + 40 = 867. For Gum, the corresponding segments summed to 349. I added up the numbers for the remaining two circles.
I then created the largest circle; in this case, that’s Patch. To figure out the size of the Gum circle, I divided the number of people who used Gum (349) by the number of people who used Patch (867), which equals .40. So the Gum circle needs to be 40% of the area of the Patch circle.
Here is where that geometry and algebra come into play. The area of a circle = 3.14 × r², where the radius (r) is half the height (or width) of the circle. The area of the Patch circle came out to 2.83, so I knew the area of the Gum circle needed to be 40% of that, or 1.13. I solved for r (the radius of the Gum circle) and then followed these steps for the remaining circles.
By the way, I could solve for r all day long! Geometry wasn’t really my thing, but I was way into algebra.
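For anyone who would rather let the computer solve for r, the same arithmetic is a few lines of Python. This is just a sketch of the steps above; the 1.9-unit starting diameter is an assumption I chose because it reproduces the 2.83 Patch area, and, like the post, it rounds pi to 3.14:

```python
import math

PI = 3.14  # the post rounds pi to 3.14

# Totals from the example above
patch_total = 867
gum_total = 349

# Diameter chosen for the largest (Patch) circle; ~1.9 units gives the 2.83 area
patch_diameter = 1.9
patch_area = PI * (patch_diameter / 2) ** 2

# Scale the Gum circle's AREA (not its diameter) by the ratio of the counts
ratio = gum_total / patch_total               # about 0.40
gum_area = patch_area * ratio

# Solve area = pi * r^2 for r, then double it to get the size to draw
gum_radius = math.sqrt(gum_area / PI)
gum_diameter = 2 * gum_radius

print(round(patch_area, 2), round(ratio, 2), round(gum_diameter, 2))
# prints: 2.83 0.4 1.21
```

Note that because area grows with the square of the diameter, a circle with 40% of the area ends up about 63% as wide, not 40% as wide; that is exactly the mistake eyeballing tends to make.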
The trickier, less exact part is trying to overlap the circles in a proportional way. R will do this for you fairly accurately, but you will have to eyeball it in PowerPoint. I think you could do this well enough though to get the point across.
Finally, I added text boxes for the data labels and made the color fill of the circles 40% transparent so that you can see the overlaps.
What do you think about using Venn diagrams in this way? How would you visualize these data?
PDA is currently seeking a Senior Statistician to work on both evaluation and statistical projects. This is a full-time position (32 to 40 hours per week). For more information about this position, qualifications, and how to apply, click here.
This past March I presented on data visualization techniques to a very engaged crowd at the Minnesota Evaluation Studies Institute (MESI). I noticed that a few of the charts I presented garnered a lot of interest, judging by the number of people who took out their cell phones to snap pictures (which to me is like reaching star status). In this post, I review a few of those useful but overlooked charts. They’re really effective and new(er) ways to visualize common types of data. And they all use Excel!
1. Dot plots for making comparisons
I first learned about dot plots from the very talented Ann Emery. She has a simple, 5-minute tutorial on how to create them on her blog, which you can find here. Stephanie Evergreen also wrote about them on her blog, though she refers to them as dumbbell plots; you can find her post here. I highly recommend you learn how to create these! We recently submitted some reports that looked at grantee data from Year 1 to Year 2, and this kind of chart was so helpful in showing how each grantee’s data shifted from year to year. Here’s an example of one of those charts:
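If you work in code rather than Excel, the same dumbbell-style dot plot is quick to sketch. Here is a minimal example in Python with matplotlib; the grantee names and values are made up for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line to view interactively
import matplotlib.pyplot as plt

# Hypothetical Year 1 and Year 2 values for three grantees
grantees = ["Grantee A", "Grantee B", "Grantee C"]
year1 = [35, 52, 48]
year2 = [50, 58, 41]

fig, ax = plt.subplots()
rows = range(len(grantees))
for row, (y1, y2) in enumerate(zip(year1, year2)):
    ax.plot([y1, y2], [row, row], color="lightgray", zorder=1)  # connector line
ax.scatter(year1, rows, label="Year 1", zorder=2)  # one dot series per year
ax.scatter(year2, rows, label="Year 2", zorder=2)
ax.set_yticks(list(rows))
ax.set_yticklabels(grantees)
ax.set_xlabel("Value")
ax.legend()
fig.savefig("dumbbell.png")
```

The gray connector makes the size of each grantee’s year-to-year shift readable at a glance, which is the whole point of the dumbbell layout.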
2. Dot plots for descriptive statistics
I also started using dot plots to visualize descriptive statistics. I think of it as an improved version of a box plot. These are pretty simple to create once you get the hang of using dot plots. I recently wrote about these for aea365, which you can find here. Here is an example of using dot plots to visualize the min, mean, and max of a variable across several programs. I found this was a much more effective visual than showing these data in a table.
3. Line charts for timelines
Lately I’ve been using a line chart to create timelines. Nothing fancy, but I find they do the job, and I know my way around a line chart. This is also something I recently wrote about for aea365, which you can find here. Here is an example of a timeline created with a line chart. In this example I’m showing a timeline of when intake and follow-up data are available, as well as when the reports were submitted.
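The same line-chart trick works in code, too. Here is one way to sketch it in Python with matplotlib; the milestone names and dates are invented for illustration:

```python
import datetime
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line to view interactively
import matplotlib.pyplot as plt

# Hypothetical milestones; each gets its own row so the line steps through time
events = [
    ("Intake data available", datetime.date(2015, 1, 15)),
    ("Follow-up data available", datetime.date(2015, 7, 15)),
    ("Report submitted", datetime.date(2015, 9, 1)),
]
labels = [name for name, _ in events]
dates = [date for _, date in events]

fig, ax = plt.subplots()
ax.plot(dates, range(len(events)), marker="o")  # one line, one dot per milestone
ax.set_yticks(range(len(events)))
ax.set_yticklabels(labels)
fig.savefig("timeline.png")
```

Plotting dates on the x-axis and milestone rows on the y-axis gives you a timeline for free; the markers do the work of the Excel data points.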
It’s nice to have some more options to effectively visualize data using Excel, and I hope this expanded upon your options as well!
All of our hard work trying to reach as many people as we can with our follow-up surveys is paying off!
The North American Quitline Consortium (NAQC) recently updated their quitline map, which includes (among many other things) survey response rates for states that collect evaluation follow-up data. PDA was thrilled to see that we evaluate and collect the follow-up data for three of the four states with the highest response rates: Minnesota, Hawaii, and Florida.
This, of course, is about more than just being “the best.” Getting a strong survey response rate is important because it helps ensure that the data we collect and report is a good representation of all of the participants who used the program. A low response rate could lead to inflated outcomes because the responses would most likely represent individuals who are easier to reach and more successful with the program.1
We spend a LOT of time and energy figuring out how to best reach people for follow-up surveys. Should we offer the survey via phone, web, and/or mail? How many times should we contact them, and on what days and at what time? How much of an incentive should we provide? Should the incentive be pre-paid or promised? How often should we send them a reminder via mail or email to let them know we’re trying to get a hold of them? There’s more to it than you may think.
Luckily, there are people like Don Dillman who have studied this topic at length, and he provides many best practices for this very thing. We’ve included a reference to his latest book below, along with some other helpful resources related to obtaining strong response rates. Also, later this year NAQC will publish an updated issue paper on calculating quit rates, which will include a section on strategies for conducting follow-up surveys. Stay tuned!
1 Lien RK, Schillo BA, Goto CJ, Porter L. (2015). The Effect of Survey Nonresponse on Quitline Abstinence Rates: Implications for Practice. Nicotine Tob Res. pii: ntv026. [Epub ahead of print]
Dillman DA, Smyth JD, Christian LM. (2014). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Ed. Hoboken, NJ: John Wiley & Sons, Inc.
Lepkowski JM, Tucker C, Brick JM. (2008). Advances in Telephone Survey Methodology. Hoboken, NJ: John Wiley & Sons, Inc.
De Leeuw E, Callegaro M, Hox J, Korendijk E, Lensvelt-Mulders G. (2007). The influence of advance letters on response in telephone surveys: a meta-analysis. Public Opin Q, 71(3), 413-443.
Cantor D, O’Hare B, O’Connor K. (2007). The use of monetary incentives to reduce nonresponse in random digit dial telephone surveys. In J. Lepkowski, C. Tucker, J. Brick, E. de Leeuw, L. Japec, P. Lavrakas, et al., Advances in Telephone Survey Methodology (pp. 471-498). Hoboken, NJ: John Wiley & Sons, Inc.
De Leeuw ED. (2005). To mix or not mix data collection modes in surveys. J Off Stat, 21(5), 233-255.
The American Association for Public Opinion Research: www.aapor.org
Contributing staff: Becky Lien
Last week, Becky Lien presented at the 2015 conference of the American Association for Public Opinion Research (AAPOR). The conference took place in Hollywood, FL, and attracted a diverse group of attendees: political pollsters, market research firms, methodologists from academia, and survey practitioners.
Becky presented a paper comparing a web+phone follow-up survey strategy to a phone-only strategy, looking at both the effectiveness and the cost of the two strategies. She found the web+phone strategy was more effective: it produced a higher survey response rate and did a better job of recruiting younger people to respond to the survey. In addition, for surveys that require around N=140 completes per month, the web+phone strategy was also more cost-effective.
Hot topics at the conference included Big Data, non-probability samples, and multi-mode survey strategies. Becky returned from the conference with some great ideas for PDA to test in our surveys.
In March we welcomed Andy Raddatz to the PDA team. Andy will be serving as our Senior Developer. Prior to joining PDA, Andy worked as a Senior Developer at The Nerdery where he gained invaluable experience as a lead on several different teams working with varied software platforms. Andy is a full-stack developer with a passion for architecture and code style. At PDA, Andy will take a leadership role in our web-based solutions division and will have key roles in planning and building software for our clients.
Andy adds, “I love the thoughtful approach that PDA takes, and I am inspired by the leadership’s commitment to long-term solutions. I am driven by the desire to create software that faithfully serves a purpose, built with the flexibility to grow with the future needs of PDA and our clients.”
We are so grateful to have you on board, Andy!
This weekend Heather is attending the Pediatric Academic Societies Annual Meeting in San Diego, CA. This conference is the largest international meeting focused on research in child health. Through her role at Children’s Hospitals and Clinics of Minnesota, she is presenting one poster and is a contributing author to another. If you’re there, check out her work:
Variation in emergency department care for children with asthma
Zook, H.G., Payne, N.R., Puumala, S.E., Burgess, K.M., & Kharbanda, A.B.
Tuesday, April 28 | 7-11am
Implicit bias towards American Indian children in the emergency department
Puumala, S.E., Kharbanda, A.B., Burgess, K.M., Zook, H.G., Pickner, W.J., & Payne, N.R.
Sunday, April 26 | 4:15-7:30pm
The welcome is a bit delayed, but we would like to share our delight in having Heather Zook join the PDA team as an Associate Evaluator. Heather will graduate in May with a Master of Arts degree in Evaluation Studies from the University of Minnesota. Prior to starting at PDA, Heather worked as a Clinical Research Associate at Children’s Hospitals and Clinics of Minnesota where she studied emergency department use by American Indian children and racial disparities in emergency department care. She presented her research at several regional and international conferences, and published an article in the Journal of Emergency Medicine.
At PDA, Heather primarily works on evaluating the Florida Department of Health’s tobacco cessation programs. She enjoys working with data and developing user-friendly data management solutions for clients. She also loves writing reports and presenting data in formats that are clear and easy to understand for a variety of audiences. She’s clearly a good fit for our team and we’re happy to have her!