St. Anthony Main
219 Main Street SE, Suite 302
Minneapolis, Minnesota 55414

612.623.9110
(f) 612.623.8807


In January we welcomed Mark DiPasquale to our team. Mark will be serving as a Senior Developer. Prior to joining PDA, Mark worked as a Systems Developer at Capella University, where he supported marketing initiatives for prospective learners in a fast-paced agile environment. As a full-stack developer experienced with a variety of architectures and technologies, Mark looks forward to planning and building software for PDA’s clients.

Mark adds, “I love PDA’s mission, commitment to building flexible long-term solutions, and pragmatic can-do approach to technical challenges. It’s a pleasure and privilege to be a part of this team.”

Welcome, Mark! We are thrilled to expand our programming capabilities!


Posted in General, PDA Staff

A handful of PDA staff are heading to Chicago tomorrow for the American Evaluation Association’s annual conference. Check out our presentations below. We hope to see you there!


From tragic to magic: Building organizational capacity to produce high-quality data visualizations, reports, and presentations. An ignite presentation by Angie Ficek on Wed, Nov 11 at 5:35pm.

What is going on here? The use of audio recordings and “secret shoppers” to assess program implementation in public health. Presented by Emily Subialka Nowariak, Angie Ficek, and Anne Betzner on Thurs, Nov 12, 2:00-2:45pm.

Novel public health program evaluation: Adaptive tobacco cessation evaluation and dissemination models. Presented by Amy Kerr, Vanessa Kittelson, and Emily Subialka Nowariak, and co-authored by Paula Keller and Marietta Dreher from ClearWay Minnesota on Fri, Nov 13, 3:30-4:15pm.

Part 1: Responsive and relevant evaluation: Using university/community partnerships to prepare exemplary evaluators. Discussion led by Melissa Chapman Haynes on Fri, Nov 13, 3:30-4:15pm.

Part 2: Responsive and relevant evaluation: Using university/community partnerships to prepare exemplary evaluators. “Building university/community partnerships through an interdisciplinary training institute,” presented by Jean A. King and Melissa Chapman Haynes on Fri, Nov 13, 4:30-5:15pm.


Posted in General, PDA Staff, Presentations

In August and September, we welcomed two new members to the PDA team and welcomed back one familiar face.


Andy Hajrovic, Business Analyst

Andy will be serving as PDA’s first Business Analyst. Prior to joining PDA, Andy worked as a Project Manager at Optum-UnitedHealth Group, where he helped remediate, manage, and implement wellness incentives on the Employer Health Portal. At PDA, Andy will gather requirements, report on business objectives, provide data analysis, assist with quality assurance, oversee training, and maintain software documentation. Andy hopes to help our clients achieve their goals by transforming their business needs into effective technical solutions.


Sara Richter, Senior Statistician

Sara comes to us from Park Nicollet, where she served as a Senior Statistician working on health services research projects and clinical trials in the fields of diabetes, eating disorders, oncology, orthopedics, and movement disorders. At PDA, Sara will primarily analyze data for tobacco cessation evaluations and for the Olweus Bullying Prevention Program. She holds a Master of Science in Statistics from the University of Minnesota and is passionate about using high-quality data to draw practical conclusions and help solve real-world problems.


Melissa Chapman Haynes, Senior Evaluator

We are so happy to have Melissa back! For about two years she served as the Managing Director for the Minnesota Evaluation Studies Institute at the University of Minnesota. At PDA, Melissa will primarily evaluate a prevention initiative in Ohio focused on increasing access to healthy foods, raising awareness and promotion of prediabetes lifestyle change programs, and promoting active transportation. Melissa is an outstanding evaluator and person, and we are delighted to have her knowledge, smile, and energy back in the office!


Posted in General, PDA Staff

Last month, Anne, Becky, and Julie presented at the North American Quitline Consortium’s (NAQC) conference in Atlanta, GA. The conference was a great opportunity for the quitline community to come together to discuss the future of quitlines and hot topics such as electronic nicotine delivery systems (ENDS), e-referrals, and the quitline’s role in treating tobacco users with mental and/or behavioral health issues.

Anne’s presentation was a sneak peek at a recently released NAQC issue paper, which can be found here. NAQC invited PDA to author the paper, which recommends how to calculate standard NAQC quit rates that quitlines in the US and Canada will use to assess and improve their performance. In her presentation, Anne focused on key changes in the recommended standard quit rate, including how to handle participants who receive web- or text-based cessation treatment, how to handle electronic cigarette users, and options for gathering high-quality quit rate data on a limited budget. The room was filled with quitline managers and researchers, who had a lively discussion about the challenges and best practices of measuring high-quality quit rates.

Becky and Julie’s presentation provided strategies to improve survey response rates and data quality. Getting a strong survey response rate is important because it helps ensure that the evaluation data is a good representation of all of the participants who used the program. Becky and Julie provided concrete examples of strategies related to study design, working with vendors, and survey methods, along with the cost implications of each strategy. They also reviewed how to pick the best mix of strategies to balance cost and effectiveness. See an image of their handout below.


Posted in PDA in Print, Presentations, Surveys

Contributing staff: Traci Capesius


More and more, we are seeing health care systems shift their focus further upstream, using a systems change approach to address behaviors, such as tobacco use, that cause or exacerbate many chronic health conditions. A systems change approach can be defined as permanently altering protocols, policies, and infrastructure so that addressing a target behavior (e.g., helping tobacco users quit) becomes systematic and part of daily practice. Systems change may occur within an agency, between multiple departments, or across an entire health system.


Why systems change?

Health systems are engaging in more systems change work in order to systematically improve the health of their patient population and, ultimately, reduce the cost of care. Treating tobacco dependence, for example, is becoming more important to improving patient health and reducing costs, as tobacco use can cause or worsen many chronic health conditions. Implementing tobacco-related systems change strategies can enhance health systems’ ability to identify and treat tobacco use, thereby reducing health care costs. In 2014, PDA co-authored a journal article with ClearWay Minnesota℠ that summarized how health systems were able to make systematic changes to address tobacco dependence among their patient populations and provided recommendations for health systems and funders of systems change initiatives.


Systems change strategies

The following are a few examples of key systems change strategies that have the potential to increase the sustainability of tobacco dependence treatment interventions. While some of these strategies are better suited to clinical environments, others could also apply to community or social service organizations.

  • Obtain buy-in from providers, clinical staff, and other front-line personnel via trainings and ongoing communication (e.g., via daily “huddles” or meetings) regarding the importance of tobacco user identification and treatment.
  • Recruit project champions from multiple levels who fully support the systems change effort, are willing to provide leadership, and are in a position to influence colleagues and decision makers.
  • Integrate brief intervention into standard clinic practice; this includes regularly asking about tobacco use status during clinic visits and making conversations about tobacco use and treatment options (including cessation pharmacotherapy) a standard part of care for all tobacco-dependent clients.
  • Continually monitor clinic and provider performance in implementing screening, referral, service provision, and follow-up. Regularly review data and report results back to key stakeholders.


Evaluating systems change initiatives

PDA has evaluated several systems change initiatives in multiple states. In our evaluation approach, we often conduct interviews with key staff members at the start and end of the initiative (and perhaps at a midpoint as well). The interviews help determine where programs are starting and ending, what the facilitators and barriers to making changes were, and what lessons they learned along the way. We also review the literature on systems change (such as those shown below) to help inform our methods, protocols, results, and conclusions.


Resources

Centers for Disease Control and Prevention. Best Practices for Comprehensive Tobacco Control Programs — 2014. Atlanta: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health, 2014.

Fiore MC, Jaen CR, Baker TB, et al. Treating Tobacco Use and Dependence: 2008 Update. Clinical Practice Guideline. Rockville, MD: U.S. Department of Health and Human Services. Public Health Service. May 2008.

LaPelle N, Zapka J, Ockene J. Sustainability of public health programs: the example of tobacco treatment services in Massachusetts. Am J Public Health. 2006;96(8):1363-1369.

Savaya R, Spiro SE. Predictors of sustainability of social programs. Am J Eval. 2012;33(1):26-43.

Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–2067.


Posted in General, Sustainability

Recently, our Senior Analyst, Becky Lien, shared some uses for Venn diagrams based on this report from the Pew Research Center, and I’m surprised to say I actually see a use for them! The twist with this Venn diagram is that you change the size of each circle, as well as the size of the circle overlap, to accurately represent the proportion of the data. Sweet, and duh! You can do this fairly accurately in Microsoft Office, or super-duper accurately in R.

Now, I often avoid using circles when I visualize data because people have a hard time estimating area, but this seems like a useful way to visualize data when you have different combinations of responses. Take the following example. We collect data on what type of stop-smoking medication people used, if any, to help them quit using tobacco. Someone might use one type of medication, but they could also use a combination of medications, which can result in a lengthy, and perhaps unclear, bar chart. Becky thought that a proportional Venn diagram might be a great way to show the various patterns of medication use. Here is a diagram that shows the number of participants who used nicotine replacement therapy (NRT) in the form of the patch, gum, and/or lozenge.

Patterns of NRT use

Becky created this in R, but you could also do it in Microsoft PowerPoint, Word, or Excel. You can either go the quick route and eyeball the proportions of the circles, which could be totally fine, or you can revisit your geometry and algebra classes to get more exact circle sizes, which is what I tried out.

I replicated Becky’s work in PowerPoint. I first determined the total number of people represented by each of the four circles. For the Patch, 661 + 106 + 60 + 40 = 867. Summing the corresponding counts for the Gum gives 349, and I added up the numbers for the remaining two circles the same way.

I then created the largest circle, which in this case is the Patch. To figure out the size of the Gum circle, I divided the number of people who used the Gum (349) by the number who used the Patch (867), which equals .40. So the Gum circle needs to be 40% of the area of the Patch circle.

Here is where that geometry and algebra come into play. The area of a circle is πr² (about 3.14 × r²), where the radius (r) is half the height or width of the circle. The area of the Patch circle came out to 2.83, so I knew the area of the Gum circle needed to be 40% of that, or 1.13. I solved for r (the radius of the Gum circle) and followed the same steps for the remaining circles.
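If you’d rather not punch this into a calculator, the same arithmetic can be scripted. Here’s a short Python sketch of the sizing step (the counts and the 2.83 starting area come from this post; everything else is just the circle-area formula):

```python
import math

# Counts from the post: total people represented by each circle.
patch_total = 867
gum_total = 349

# Drawing size chosen for the largest circle (the post's Patch
# circle had an area of 2.83).
patch_area = 2.83
patch_radius = math.sqrt(patch_area / math.pi)

# A circle representing 40% as many people needs 40% of the AREA,
# so its radius shrinks by the square root of that ratio.
ratio = gum_total / patch_total            # about .40
gum_area = patch_area * ratio              # about 1.13
gum_radius = math.sqrt(gum_area / math.pi)

print(f"Patch: radius {patch_radius:.2f}, diameter {2 * patch_radius:.2f}")
print(f"Gum:   radius {gum_radius:.2f}, diameter {2 * gum_radius:.2f}")
```

Note that the diameter ratio works out to the square root of .40 (about .63), not .40 itself; setting the Gum circle’s height and width to 40% of the Patch’s would make it far too small.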

By the way, I could solve for r all day long! Geometry wasn’t really my thing, but I was way into algebra.

The trickier, less exact part is trying to overlap the circles in a proportional way. R will do this for you fairly accurately, but you will have to eyeball it in PowerPoint. I think you could do this well enough though to get the point across.
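For the curious, the geometry behind a proportional overlap is approachable too. This is a hedged Python sketch, not what R does internally: it computes the “lens” area where two circles intersect and bisects for the center-to-center distance that hits a target overlap. The 146-person overlap below is a hypothetical figure for illustration, not a number from the chart.

```python
import math

def lens_area(r1: float, r2: float, d: float) -> float:
    """Area of intersection of two circles with radii r1, r2
    whose centers are d apart."""
    if d >= r1 + r2:           # circles don't touch
        return 0.0
    if d <= abs(r1 - r2):      # one circle entirely inside the other
        return math.pi * min(r1, r2) ** 2
    a1 = math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    triangle = 0.5 * math.sqrt(
        (-d + r1 + r2) * (d + r1 - r2) * (d - r1 + r2) * (d + r1 + r2)
    )
    return r1 * r1 * a1 + r2 * r2 * a2 - triangle

def center_distance(r1: float, r2: float, target: float) -> float:
    """Bisect for the center distance giving `target` overlap area
    (overlap shrinks as the centers move apart)."""
    lo, hi = abs(r1 - r2), r1 + r2
    for _ in range(100):
        mid = (lo + hi) / 2
        if lens_area(r1, r2, mid) > target:
            lo = mid          # too much overlap: push circles apart
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical example: Patch and Gum circles sized as in the post,
# with 146 of the 867 Patch users assumed to fall in the overlap.
r_patch, r_gum = 0.95, 0.60
overlap = (146 / 867) * math.pi * r_patch ** 2
d = center_distance(r_patch, r_gum, overlap)
print(f"Place the circle centers {d:.2f} units apart")
```

In PowerPoint you would still nudge the circles by eye, but this tells you how far apart the centers should land.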

Finally, I added in text boxes for the data labels, and the color fill of the circles is 40% transparent so that you can see the overlap.

What do you think about using Venn diagrams in this way? How would you visualize these data?



Posted in Data visualization, General

PDA is currently seeking a Senior Statistician to work on both evaluation and statistical projects. This is a full-time position (32 to 40 hours per week). For more information about this position, qualifications, and how to apply, click here.

Posted in Job Openings

This past March, I presented on data visualization techniques to a very engaged crowd at the Minnesota Evaluation Studies Institute (MESI). A few of the charts I presented garnered a lot of interest, judging by the number of people who took out their cell phones to snap pictures (which, to me, is like reaching star status). In this post, I review a few of those useful but overlooked charts. They’re really effective and new(er) ways to visualize common types of data. And they all use Excel!

1. Dot plots for making comparisons

I first learned about dot plots from the very talented Ann Emery. She has a simple, 5-minute tutorial on how to create them on her blog, which you can find here. Stephanie Evergreen also wrote about them on her blog, though she refers to them as dumbbell plots; you can find that post here. I highly recommend you learn how to create these! We recently submitted some reports that looked at grantee data from Year 1 to Year 2, and this kind of chart was so helpful in showing how each grantee’s data shifted from year to year. Here’s an example of one of those charts:
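If you ever want the same chart outside Excel, a dumbbell-style dot plot is only a few lines of matplotlib. This is just a sketch with made-up grantee numbers, not the data from our reports:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical grantee data: percent of goal reached in each year.
grantees = ["Grantee A", "Grantee B", "Grantee C", "Grantee D"]
year1 = [35, 52, 48, 70]
year2 = [55, 60, 44, 85]

fig, ax = plt.subplots(figsize=(6, 3))
ys = range(len(grantees))

# Light connector line between the two years for each grantee.
for y, v1, v2 in zip(ys, year1, year2):
    ax.plot([v1, v2], [y, y], color="lightgray", linewidth=2, zorder=1)

# One dot per year, on top of the connectors.
ax.scatter(year1, ys, color="gray", label="Year 1", zorder=2)
ax.scatter(year2, ys, color="steelblue", label="Year 2", zorder=2)

ax.set_yticks(list(ys))
ax.set_yticklabels(grantees)
ax.set_xlabel("Percent of goal reached")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("dot_plot.png")
```

The connector line is what makes the year-to-year shift pop, which is exactly what these charts did in our grantee reports.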

2. Dot plots for descriptive statistics

I also started using dot plots to visualize descriptive statistics. I think of it as an improved version of a box plot. These are pretty simple to create once you get the hang of using dot plots. I recently wrote about these for aea365, which you can find here. Here is an example of using dot plots to visualize the min, mean, and max of a variable across several programs. I found this was a much more effective visual than showing these data in a table.

3. Line charts for timelines

Lately I’ve been using a line chart to create timelines. Nothing fancy, but I find they do the job, and I know my way around a line chart. This is also something I recently wrote about for aea365, which you can find here. Here is an example of a timeline created with a line chart. In this example I’m showing a timeline of when intake and follow-up data are available, as well as when the reports were submitted.

It’s nice to have more options to effectively visualize data using Excel, and I hope this expanded your options as well!


Posted in Data visualization, General

All of our hard work trying to reach as many people as we can with our follow-up surveys is paying off!

The North American Quitline Consortium (NAQC) recently updated their quitline map, which includes (among many other things) survey response rates for states that collect evaluation follow-up data. PDA was thrilled to see that we evaluate and collect the follow-up data for three of the four states with the highest response rates – Minnesota, Hawaii, and Florida.

This, of course, is about more than being “the best”. Getting a strong survey response rate is important because it helps ensure that the data we collect and report is a good representation of all of the participants who used the program. A low response rate could lead to inflated outcomes, because the people who do respond most likely represent individuals who are easier to reach and more successful with the program.1

We spend a LOT of time and energy figuring out how to best reach people for follow-up surveys. Should we offer the survey via phone, web, and/or mail? How many times should we contact them, and on what days and at what time? How much of an incentive should we provide? Should the incentive be pre-paid or promised? How often should we send them a reminder via mail or email to let them know we’re trying to get a hold of them? There’s more to it than you may think.

Luckily, there are people like Don Dillman who have studied this topic at length, and he provides many best practices for this very thing. We included the reference to his latest book below, along with some other helpful resources related to obtaining strong response rates. Also, later this year NAQC will publish an updated issue paper on calculating quit rates, which will include a section on strategies for conducting follow-up surveys. Stay tuned!

1 Lien RK, Schillo BA, Goto CJ, Porter L. (2015). The Effect of Survey Nonresponse on Quitline Abstinence Rates: Implications for Practice. Nicotine Tob Res. pii: ntv026. [Epub ahead of print]


Additional resources

Dillman DA, Smyth JD, Christian LM. (2014). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Ed. Hoboken, NJ: John Wiley & Sons, Inc.

Lepkowski JM, Tucker C, Brick JM. (2008). Advances in Telephone Survey Methodology. Hoboken, NJ: John Wiley & Sons, Inc.

De Leeuw E, Callegaro M, Hox J, Korendijk E, Lensvelt-Mulders G. (2007). The influence of advance letters on response in telephone surveys: a meta-analysis. Public Opin Q, 71(3), 413–443.

Cantor D, O’Hare B, O’Connor K. (2007). The use of monetary incentives to reduce nonresponse in random digit dial telephone surveys. In J. Lepkowski, C. Tucker, J. Brick, E. Leeuw, L. Japec, P. Lavrakas, et al. (Eds.), Advances in Telephone Survey Methodology (pp. 471-498). Hoboken, NJ: John Wiley & Sons, Inc.

De Leeuw ED. (2005). To mix or not to mix data collection modes in surveys. J Off Stat, 21(5), 233-255.

The American Association for Public Opinion Research: www.aapor.org


Posted in General, Surveys

Contributing staff: Becky Lien

Last week, Becky Lien presented at the 2015 conference of the American Association for Public Opinion Research (AAPOR). The conference took place in Hollywood, FL, and attracted a diverse group of attendees: political pollsters, market research firms, methodologists from academia, and survey practitioners.

Becky presented a paper comparing a web+phone follow-up survey strategy to a phone-only strategy, looking at both the effectiveness and the cost of the two strategies. She found the web+phone strategy was more effective: it produced a higher survey response rate and did a better job of recruiting younger people to respond. For surveys that require around N=140 completes per month, the web+phone strategy was also more cost-effective.

Hot topics at the conference included Big Data, non-probability samples, and multi-mode survey strategies. Becky returned from the conference with some great ideas for PDA to test in our surveys.

Posted in Presentations, Surveys