Professional Development Workshops

Professional Development Workshops at AEA's Annual Conference are hands-on, interactive sessions that provide an opportunity to learn new skills or hone existing ones.

IMPORTANT BACKGROUND: Professional development workshops precede and follow the conference. These workshops differ from sessions offered during the conference itself in at least three ways: 1) each is longer (either 3, 6, or 12 hours in length) and thus provides a more in-depth exploration of a skill or area of knowledge, 2) presenters are paid for their time and are expected to have significant experience both presenting and in the subject area, and 3) attendees pay separately for these workshops and are given the opportunity to evaluate the experience. Sessions are filled on a first-come, first-served basis and many are likely to fill before the conference begins.

FEES: Professional development workshops cost $300 for a two-day session, $150 for a full-day session, and $75 for a half-day session for AEA members. For nonmembers, the fees are $400, $200, and $100 respectively; for students, they are $160, $80, and $40.

REGISTRATION: Registration for professional development sessions is handled right along with standard conference registration. You may register for professional development workshops even if you are not attending the conference itself.

FULL SESSIONS: Sessions that are closed because they have reached their maximum attendance are clearly marked below the session name. No more registrations will be accepted for full sessions and AEA does not maintain waiting lists. Once sessions are closed, they will not be re-opened.

 

TWO DAY, MONDAY-TUESDAY, NOV 3-4, FROM 9 am to 4 pm

Qualitative Methods  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Qualitative data can humanize evaluations by portraying people and stories behind the numbers. Qualitative inquiry involves using in-depth interviews, focus groups, observational methods, and case studies to provide rich descriptions of processes, people, and programs. When combined with participatory and collaborative approaches, qualitative methods are especially appropriate for capacity-building-oriented evaluations.

Through lecture, discussion, and small-group practice, this workshop will help you to choose among qualitative methods and implement those methods in ways that are credible, useful, and rigorous. It will culminate with a discussion of new directions in qualitative evaluation.

You will learn:

§    Types of evaluation questions for which qualitative inquiry is appropriate,

§    Purposeful sampling strategies,

§    Interviewing, case study, and observation methods,

§    Analytical approaches that support useful evaluation.

 

Michael Quinn Patton is an independent consultant and professor at the Union Institute. An internationally known expert on utilization-focused evaluation and qualitative methods, he published the third edition of Qualitative Research and Evaluation Methods through Sage in 2001. 

 

Session 1: Qualitative Methods

Scheduled: Monday and Tuesday, 11/3-4, 9 am to 4 pm

Level: Beginner, no prerequisites
Fee: Members $300, Nonmembers $400, Students $160

Quantitative Methods
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Quantitative data offers opportunities for numerical descriptions of populations and samples. The challenge is in knowing which analyses are best for a given situation. Designed for the evaluator with little statistical background, the workshop covers the basics of parametric statistics as well as nonparametric statistics for use with small or skewed samples.

Hands-on exercises interspersed with mini-lectures will introduce methods and concepts. The instructor will review examples of research and evaluation questions and will provide data disks so you can prepare data for analysis, run the analysis, and interpret the output. 

Specifically, the parametric techniques covered will include power analysis, reliability analysis, factor analysis, t-tests, ANOVAs, and, if time allows, ANCOVAs. Multiple regression will not be covered. The nonparametric techniques covered will include tests of association (several variants of correlations and chi-squares), pre/post/follow-up tests, tests for significant differences between groups, and inter-rater reliability. Additionally, statistical basics such as confidence intervals and standard error will be reviewed to help attendees maximize understanding of output. Finally, because the workshop focuses on SPSS as the data analysis tool, some time reviewing navigation of the software will be included at the start of the workshop.
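
For readers who want a feel for the kinds of tests covered, here is a minimal sketch in Python using the scipy library. This is an illustration only: the workshop itself uses SPSS, and the data below are invented.

    # Illustrative only: the workshop uses SPSS, not Python; data are invented.
    from scipy import stats

    treatment = [12, 15, 14, 10, 13, 17, 16]
    comparison = [9, 11, 10, 8, 12, 10, 11]

    # Parametric: independent-samples t-test
    t, p = stats.ttest_ind(treatment, comparison)
    print(f"t = {t:.2f}, p = {p:.3f}")

    # Nonparametric: chi-square test of association on a 2x2 table
    # (rows: group; columns: outcome reached / not reached)
    table = [[18, 7], [11, 14]]
    chi2, p, dof, expected = stats.chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")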

You will learn:

§    Appropriate statistical techniques for large or small/skewed samples,

§    Which analysis technique is best for a given data set or evaluation question,

§    Analysis using SPSS,

§    How to interpret and report findings.

Jennifer Camacho applies quantitative analysis in her practice as the Director of Evaluation and Quality Assurance at Sinai Community Institute. She has presented workshops at AEA’s annual conference for the past four years and enjoyed consistently positive reviews.  

Session 2: Quantitative Methods

Scheduled: Monday and Tuesday, 11/3-4, 9 am to 4 pm

Level: Beginner, no prerequisites

Fee: Members $300, Nonmembers $400, Students $160


Consulting Skills for Evaluators: Getting Started

Do you have what it takes to be a successful independent consultant? Designed for evaluators who are considering becoming independent consultants or who have recently begun a consulting practice, the workshop will help you assess your own skills and characteristics to determine whether you have what it takes to succeed, and to strategize about areas in need of improvement.

The workshop will focus on the full scope of operating an independent consulting practice from marketing to developing client relationships to project management, ethics, and business operations. Case examples, hands-on activities, and take-home materials will prepare you to enter the world of consulting.

You will learn:

§   If consulting is an appropriate career choice for you,

§   How to break into the evaluation consulting market – and stay there,

§   Time and money management strategies,

§   Professional practices including customer service, ethical operations, and client relations.

Gail Barrington started Barrington Research Group 18 years ago as a sole practitioner. Today, she has a staff of 20 and a diverse client base. A top rated presenter, she has taught workshops throughout the US and Canada.

Session 3: Consulting Skills

Scheduled: Monday and Tuesday, 11/3-4, 9 am to 4 pm

Level: Beginner, no prerequisites

Fee: Members $300, Nonmembers $400, Students $160


Evaluation 101: Intro to Evaluation Practice

Begin at the beginning and learn the basics of evaluation from an expert trainer. The session will focus on the logic of evaluation to answer the key question: "What resources are transformed into what program evaluation strategies to produce what outputs for which evaluation audiences, to serve what purposes?" Enhance your skills in planning, conducting, monitoring, and modifying the evaluation so that it generates the information needed to improve program results.

A case-driven instructional process using discussion, exercises, and lecture will introduce the steps in conducting useful evaluations: getting started, describing the program, identifying evaluation questions, collecting data, analyzing and reporting, and using results.

You will learn:

§   The basic steps to an evaluation,

§   Contextual influences on evaluation and ways to respond,

§   Logic modeling as a tool to describe a program and develop evaluation questions and foci,

§   Methods for analyzing and using evaluation information.

John McLaughlin has been part of the evaluation community for over 30 years working in the public, private, and non-profit sectors. He has presented this workshop in multiple venues and will tailor this two-day format for Evaluation 2003.

Session 4: Evaluation 101

Scheduled: Monday and Tuesday, 11/3-4, 9 am to 4 pm

Level: Beginner, no prerequisites

Fee: Members $300, Nonmembers $400, Students $160


Using Appreciative Inquiry in Evaluation

Experience the power of appreciative reframing! Appreciative evaluation maximizes chances for sustainable impact by helping programs identify what is working and drawing on existing strengths to build capacity and improve program effectiveness. Appreciative evaluation does not veil problems, but rather refocuses energy in a constructive and empowering way.

You will experience the various phases of Appreciative Inquiry (AI) by developing evaluation questions, indicators and data collection tools; conducting and analyzing appreciative interviews; and sharing results. You will also explore ways to use AI for evaluation capacity building.

You will learn:

§   The principles and applications of AI to evaluation,

§   To formulate evaluation goals, questions and indicators using AI,

§   Development and use of instruments using an AI approach,

§   Ways to employ AI for evaluation capacity building.

Tessie Catsambas, President of EnCompass LLC, Ana Coghlan, evaluation specialist at the Peace Corps, and Hallie Preskill, University of New Mexico professor and evaluation consultant, together bring to the workshop years of training experience and hands-on practice using AI.

Session 5: Appreciative Inquiry

Scheduled: Monday and Tuesday, 11/3-4, 9 am to 4 pm

Level: Beginner, no prerequisites
Fee: Members $300, Nonmembers $400, Students $160

TUESDAY, NOV 4, FULL DAY SESSIONS, 9 am to 4 pm

Using Effect Size and Association Measures

Answer the call to report effect size and association measures as part of your evaluation results. Improve your capacity to understand and apply a range of measures including: standardized measures of effect sizes from Cohen, Glass, and Hedges; Eta-squared; Omega-squared; the Intraclass correlation coefficient; and Cramer’s V.

Through mini-lecture, hands-on exercises, and demonstration, you will improve your understanding of the theoretical foundation and computational procedures for each measure as well as ways to identify and correct for bias.
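
As a taste of the computations involved, here is a minimal Python sketch of one measure named above, the standardized mean difference (Cohen's d), with Hedges' small-sample correction. The data values are invented for illustration.

    # Minimal sketch: Cohen's d and Hedges' g; data are invented.
    import math

    group1 = [34.0, 38.0, 31.0, 36.0, 40.0, 35.0]
    group2 = [29.0, 33.0, 27.0, 30.0, 32.0, 28.0]

    def mean(xs): return sum(xs) / len(xs)

    def var(xs):  # sample variance (n - 1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    n1, n2 = len(group1), len(group2)
    # Pooled standard deviation across both groups
    sp = math.sqrt(((n1 - 1) * var(group1) + (n2 - 1) * var(group2)) / (n1 + n2 - 2))
    d = (mean(group1) - mean(group2)) / sp
    # Hedges' g applies a small-sample bias correction to d
    g = d * (1 - 3 / (4 * (n1 + n2) - 9))
    print(f"Cohen's d = {d:.2f}, Hedges' g = {g:.2f}")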

You will learn:

§   To compute a range of effect size and association measures,

§   Considerations in the use of confidence intervals,

§   Ways to identify and correct for measurement bias,

§   How to select the appropriate measure of effect size or association.

Jack Barnette, from the University of Iowa, has been conducting research and writing on the topic of how best to use effect size and association measures for five years. He also brings over 30 years of teaching and workshop facilitation experience and has received awards for outstanding teaching.

Session 6: Effect Size Measures

Prerequisites: Univariate statistics through ANOVA & power

Scheduled: Tuesday, 11/4, 9 am to 4 pm

Level: Intermediate
Fee: Members $150, Nonmembers $200, Students $80


Evaluation-specific Methodology  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Is there 'something more' that an evaluator needs to be able to do, that a doctorate in a social science won't have taught her/him? Here we’ll spell out the 'something more' in some detail, so that you not only know what it is, but acquire basic skills in it. We'll cover: 1) validation of values; 2) the process of integration of values with factual claims; 3) needs assessment; 4) integration of evaluations of a program (etc.) on several dimensions of merit into an overall evaluation of merit; and 5) setting standards of merit.

Via discussion and mini-lectures, we will first examine the possibility that there is no evaluation-specific methodology and then investigate the opposite position through small-group problem solving.

You will learn:

§   The ways in which evaluation differs from other social sciences,

§   Evaluation-specific skills,

§   When and how to apply such methodologies.

Michael Scriven is among the best-known professionals in the field today, with over 90 publications related to evaluation methodology. He is currently a professor at Auckland University in New Zealand.

Session 7: Evaluation Methodology

Prerequisites: Basic training or experience in evaluation

Scheduled: Tuesday, 11/4, 9 am to 4 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Participatory Evaluation Practice: Issues and Strategies

Participatory evaluation practice requires evaluators to be skilled facilitators of interpersonal interactions. This workshop will provide you with theoretical grounding (social interdependence theory, conflict theory, and evaluation use theory) and practical frameworks for analyzing and extending your own practice.

Through presentations, discussion, reflection, and case study, you will experience a range of strategies that can be used to enhance participatory evaluation and foster interaction. You are encouraged to bring examples of challenges faced in your evaluation practice for discussion.

You will learn:

§   Strategies to foster effective interaction, including belief sheets, values voting, three-step interview, cooperative rank order, graffiti, and constructive controversy,

§   Responses to challenges when using participatory evaluation practices,

§   Four frameworks for reflective evaluation practice.

Laurie Stevahn is a professor at Seattle University and has extensive facilitation experience as well as applied experience in participatory evaluation. In 2002, she co-taught this workshop when it received the highest overall ranking among the sessions offered.

Session 8: Participatory Evaluation

Prerequisites: Basic eval skills

Scheduled: Tuesday, 11/4, 9 am to 4 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Shoestring Evaluation: Overcoming Constraints  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

What do you do when asked to perform an evaluation on a program that is well underway? When time and resources are few, yet expectations high? When questions about baseline data and control groups are met with blank stares? The Shoestring Evaluation approach seeks to ensure the best quality evaluation under real-life constraints.

Through presentations and discussion, with real-world examples drawn from international development evaluation, you will study the Shoestring Evaluation approach. The workshop focuses on developing-country evaluation, but the techniques are applicable to evaluators working in any context with budget, time, and data constraints.

You will learn:

§   The six steps of the Shoestring Evaluation approach,

§   Ways to reduce the costs and time of data collection,

§   How to reconstruct baseline and control group data,

§   Methods for addressing threats to validity and accuracy of findings.

 

Michael Bamberger and Lucia Fort of the World Bank, and Jim Rugh of CARE International, will draw upon their experience in international training and application of the Shoestring Method.

 

Session 9: Shoestring Evaluation

Prerequisites: Basic evaluation skills and field experience

Scheduled: Tuesday, 11/4, 9 am to 4 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Exploring Qualitative Data Analysis Software

Which qualitative data analysis (QDA) software package is right for your work? How can you take full advantage of the QDA package you have purchased? Issues beyond marketing, sales figures, and colleague suggestions should guide your choice. Leave this session able to make informed decisions both before and after your QDA software purchase.

Through demonstration and discussion the session will feature a range of functions within major commercial packages to illustrate how you can use the computer to illuminate conceptual connections that emerge from your data.

You will learn:

§   The fit of QDA packages with your personal analysis style,

§   Considerations for making informed QDA software purchase and use decisions,

§   Ways to ask informed questions of current software users,

§   Important do’s and don’ts in qualitative software use.

 

Ray Maietta is President and founder of ResearchTalk Inc, a qualitative inquiry consulting firm. His training and content expertise are extensive, as reflected in this workshop's ranking among the top 10% of sessions offered in 2002.

 

Session 10: Qualitative Software

Prerequisites: Experience in qualitative data analysis

Scheduled: Tuesday, 11/4, 9 am to 4 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80

WEDNESDAY, NOV 5, FULL DAY, 8 am to 3 pm

 

Logic Modeling for Dummies

Many evaluations fall short not in rigor of scientific method but in a failure to describe adequately the program and its intended outcomes. The logic model, as a schematic for what a program is and intends to do, is a useful tool for clarifying program objectives and identifying how to improve the relationship between program activities and those objectives.

We will recapture the utility of program logic modeling as a simple discipline. We will examine steps for constructing logic models, identify what models can tell us, and explore how to use models to get managers, staff, and evaluators rowing in the same direction. We will also touch on ways to improve logic models using insights from program theory and system dynamics. The session is a series of modules, each incorporating short didactic presentations, small group case studies, and plenary debriefs to reinforce group work.
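
For a flavor of what a logic model captures, here is a hypothetical Python sketch that records a toy program's chain from inputs to outcomes as a simple data structure. The program and all entries are invented for illustration, not workshop material.

    # Hypothetical sketch: a logic model as a simple data structure,
    # making the chain from resources to outcomes explicit.
    logic_model = {
        "inputs":     ["funding", "2 FTE tutors", "classroom space"],
        "activities": ["weekly after-school tutoring sessions"],
        "outputs":    ["120 students tutored per semester"],
        "short_term_outcomes": ["improved homework completion"],
        "long_term_outcomes":  ["higher reading proficiency scores"],
    }

    # Reading the model left to right yields the program's "if-then" story.
    for stage, items in logic_model.items():
        print(f"{stage}: {', '.join(items)}")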

You will learn:

§   To construct and refine logic models,

§   To use logic models to identify and answer strategic planning questions,

§   To develop an evaluation focus based on a logic model.

Thomas Chapel is the central evaluation resource person and logic model trainer at the Centers for Disease Control. He has taught this workshop at AEA for the past two years to much acclaim.

Session 11: Logic Modeling Intro

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Using GIS in Evaluation 

Geographic Information Systems (GIS) are a suite of tools that can help you to manage, analyze, model and display complex spatial information and relationships simply. GIS have been used in a variety of contexts, and are highly applicable to evaluators examining community-level or larger change.

Through lecture in the morning and hands-on plotting and analysis of real-world data in the afternoon, you will investigate how to use GIS to depict change through examining spatial relationships. You will receive a free demo copy of one of the most commonly used GIS software tools, ARCGIS.

Participants should bring a laptop computer (running WINXP Home, WINXP Pro, WIN2000 Pro, or NT 4.0 with Service Pack 6a) with the following specifications: minimum 128MB RAM (256MB recommended), a minimum processor speed of 450MHz (650 MHz recommended), minimum disk space of 605 MB and minimum Swap Space of 300 MB.
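
To illustrate the kind of spatial question a GIS answers, here is a small pure-Python sketch that asks which program sites lie within 5 km of a service center. The coordinates are invented, and the sketch is only an aside; ARCGIS, not Python, is the tool taught in the workshop.

    # Illustrative only: invented coordinates, not workshop material.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two lat/lon points, in km
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    clinic = (35.22, -80.84)  # hypothetical service center
    sites = {"A": (35.25, -80.80), "B": (35.60, -81.10), "C": (35.20, -80.86)}

    for name, (lat, lon) in sites.items():
        d = haversine_km(clinic[0], clinic[1], lat, lon)
        print(f"site {name}: {d:.1f} km, {'within' if d <= 5 else 'beyond'} 5 km")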

You will learn:

§   The purposes, strengths, and weaknesses of GIS,

§   When and how to apply GIS to show change over time,

§   The basics of running ARCGIS,

§   How to make a map and identify spatial patterns in the data.

Ralph Renger, Sydney Pettygrove, Seumas Rogan, and Adriana Cimetta authored an article in the Winter 2002 issue of the American Journal of Evaluation on using GIS as an evaluation tool. The team will reconvene to share their expertise in a hands-on format.

Session 12: Using GIS

Prerequisites: Basic Evaluation Skills, Computer literacy

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Grantwriting 101: Creating Winning Proposals

How do we find and tap into needed funding? One way is through public and private sector grants, but to access these you must master effective grant writing. This workshop will enhance your ability to identify potential grant funds, to conceptualize and organize a grant application, to oversee the grant application process through to fruition, and to administer a funded grant.

Through a combination of lecture and small group exercises you will investigate each stage of the grantwriting process and become conversant in the art and science of effective grantwriting. You will also receive extensive resource materials for take-home use.

You will learn:

§   National and international sources for potential grant funding,

§   How to conceptualize and organize a grant application,

§   The intricacies of budget development,

§   Key components to successful grants management and administration.

Michael Shafer is a returning trainer for 2003 and has personally generated in excess of $12 million in grant revenue. He currently manages a grant-funded research and training center at the University of Arizona.

Session 13: Grantwriting 101
Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Evaluation in Immigrant Communities

Attend to the unique issues of working in communities and cultures with which you may be unfamiliar and within which your craft is unknown. This workshop will examine such issues as entry, access, trust and relationship-building, sampling, methodological syncretism and adaptation, instrument development, translation, culturally appropriate behavior and reporting, and stakeholder participation.

Drawing on case examples from extensive experience in immigrant communities, we will illustrate what has and hasn’t worked well, principles of good practice, and the learning opportunities for all involved. Through simulations and exercises you will experience the challenges and rewards of cross-cultural evaluation.

You will learn:

§   New and transformational approaches to evaluation practice in unfamiliar cultures and settings,

§   How to draw upon the traditions of communities in mutually beneficial ways,

§   Useful, respectful and credible ways to collect and report information for stakeholders.

Joan Othieno, Mia Robillos, and Barry Cohen are on the staff of Rainbow Research, Inc., a non-profit research and evaluation firm celebrating its thirtieth year in 2003.

Session 14: Immigrant Comm.

Prerequisites: Basic Eval Skills
Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Social Network Analysis  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Social Network Analysis (SNA) allows you to examine the use of connections between and among individuals and things as indicators of the capacity to attain goals. SNA methodology provides the appropriate tool for evaluators interested in informal and formal networks and structural patterns in groups, teams, and collaborations, as well as for examining organizational capacity for reform and project implementation.

You will have the opportunity to review and apply the SNA methodology from the hands-on development of appropriate data-collection instruments to computer-based data set construction, analysis and reporting. A takeaway workbook will ensure that what you learn in the workshop is at your fingertips in the field. Please bring a laptop if you have one, but do not hesitate to register without one – we’ll share!
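
For a sense of the kinds of measures SNA produces, here is a brief sketch using the open-source Python package networkx, with invented collaboration ties. It is not necessarily the software used in the workshop.

    # Illustrative sketch with networkx; the ties below are invented.
    import networkx as nx

    g = nx.Graph()
    g.add_edges_from([
        ("Ana", "Ben"), ("Ana", "Cara"), ("Ben", "Cara"),
        ("Cara", "Dev"), ("Dev", "Eli"),
    ])

    # Density: share of possible ties that actually exist (0 to 1)
    print(f"density = {nx.density(g):.2f}")

    # Degree centrality flags the best-connected members of the network
    for person, score in sorted(nx.degree_centrality(g).items(),
                                key=lambda kv: -kv[1]):
        print(f"{person}: {score:.2f}")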

You will learn:

§   What SNA is and how it differs from other methodologies,

§   The types of evaluation questions and data for which SNA is best,

§   How to apply SNA to a real-world problem from data-collection to analysis and reporting.

Maryann Durland has worked with Social Network Analysis for over 16 years in business and education settings, and uses the methodology consistently in her independent consulting practice.

Session 15: Network Analysis

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Utilization-focused Evaluation  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Evaluations should be useful, practical, accurate, and ethical. Utilization-focused Evaluation is a process that meets these expectations and promotes use of evaluation from beginning to end. By carefully implementing evaluations for increased utility, this approach encourages situational responsiveness, adaptability, and creativity.

With an overall goal of teaching you the process of Utilization-focused Evaluation, the session will combine lectures with concrete examples and interactive case analyses, including cases provided by the participants.

You will learn:

§   The fundamental premises of Utilization-focused Evaluation,

§   The implications of focusing an evaluation on intended use by intended users,

§   Options for evaluation design and methods based on situational responsiveness, adaptability and creativity,

§   How to use the Utilization-focused Evaluation checklist & flowchart.

Michael Quinn Patton is an independent consultant and professor at the Union Institute. An internationally known expert on Utilization-focused Evaluation, in 1997 he published the third edition of the book on which this session is based, Utilization Focused Evaluation: The New Century Text.

 

Session 16: Utilization-focused

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Needs Assessment for Evaluators

A needs assessment identifies gaps between current results ("What Is") and required ones ("What Should Be"), prioritizes those gaps on the basis of the costs and benefits of closing versus ignoring them, and selects the needs to be reduced or eliminated.
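
As a toy illustration of this gap logic, the following Python sketch scores invented needs by a simple benefit-to-cost ratio. This is one plausible prioritization rule for illustration, not necessarily the workshop's method.

    # Toy sketch: each need is the gap between "What Is" and "What
    # Should Be," prioritized by benefit-to-cost ratio. Figures invented.
    needs = [
        # (name, current result, required result, cost to close, benefit)
        ("reading proficiency", 62, 80, 40_000, 120_000),
        ("attendance rate",     88, 95, 15_000,  30_000),
        ("graduation rate",     70, 85, 90_000, 150_000),
    ]

    for name, current, required, cost, benefit in sorted(
            needs, key=lambda n: n[4] / n[3], reverse=True):
        gap = required - current
        print(f"{name}: gap = {gap}, benefit/cost = {benefit / cost:.1f}")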

Working individually and in small groups supported by mini-lectures, you will explore: 1) identifying current results and specifying desired ones, 2) cause (SWOT) analysis, 3) performance requirements analysis (objective setting), and 4) solution alternatives analysis (decision making). Throughout the workshop you will receive tools for applying these techniques within your organizations.

You will learn:

§   Derivation of long-term and short-term objectives,

§   Identification and prioritization of gaps in results (needs),

§   Development of a data collection plan to support improvement,

§   Performance and causal analysis,

§   Selection of solution alternatives.

Doug Leigh chairs AEA’s Needs Assessment TIG and teaches at Pepperdine University. He has authored multiple publications on improving organizations using needs assessment and brings this content expertise as well as extensive training experience to the workshop.

Session 17: Needs Assessment

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Survey Development

Surveys and questionnaires are often critical components of evaluations. Skillfully designing, pilot testing, and administering surveys and questionnaires can improve your response rate, as well as increase the reliability, validity, and usefulness of the data collected. This workshop will help you more efficiently create quality surveys and questionnaires while avoiding the common pitfalls.

Through lecture, discussion, and hands-on exercises, we will focus on the practical “how to” aspects of surveying. Materials include question examples and design and administration checklists to ensure the information from the workshop is at your fingertips after the session.
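
Reliability is one of the survey qualities the workshop addresses; a common index for multi-item scales is Cronbach's alpha. Here is a minimal Python sketch with invented responses, offered as an aside for readers rather than as workshop material.

    # Minimal sketch: Cronbach's alpha for a 4-item scale; data invented.
    import numpy as np

    # Rows = respondents, columns = items on a 1-5 scale
    scores = np.array([
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 5, 4, 5],
        [2, 3, 3, 2],
        [4, 4, 5, 4],
    ])

    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of scale totals
    alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")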

You will learn:

§   What information to gather before you begin survey development,

§   Question formats to gather specific types of data (and the advantages and disadvantages of each),

§   Tips for practical pilot-testing,

§   An overview of administration options & process considerations,

§   Steps to increase the reliability and validity of your survey data.

Kelly Hannum and Jennifer Martineau conduct evaluations at the Center for Creative Leadership and bring over 17 years of combined experience developing and using surveys in the public, private, and non-profit sectors.

Session 18: Survey Development

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Systems Concepts, Methods, and Evaluation Practice

Systems-based approaches examine the inter-relationships among human and contextual actors in systems both small and large. They have the potential to yield powerful insights for program improvement, yet they are often misunderstood by evaluators. This workshop will focus on the practical application of three commonly used systems-based approaches: system dynamics, soft systems methodology, and complex adaptive systems.

Through short lectures and case studies, you will have the opportunity to apply systems approaches to real evaluation contexts that produce meaningful recommendations for change.

You will learn:

§   The underlying concepts of systems theory,

§   Key facets of three systems theory approaches and their application to evaluation,

§   How to use practical systems-based approaches as a tool to improve programs.

Bob Williams, Bill Harris and Glenda Eoyang are published experts on systems-based approaches. Pioneers in applying systems-based approaches to evaluation, they supplement unparalleled theoretical grounding with extensive practical experience.

Session 19: Systems Approaches

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Evaluation Practice: A Collaborative Approach

Collaborative evaluation is an approach that actively engages program stakeholders in the evaluation process. When stakeholders collaborate with evaluators, stakeholder and evaluator understanding increases and the utility of the evaluation is often enhanced.

Employing discussion, hands-on activities, and roleplaying, this workshop focuses on strategies and techniques for conducting successful collaborative evaluations, including ways to avoid common collaborative evaluation pitfalls.

You will learn:

§   A collaborative approach to evaluation,

§   Levels of collaboration and when and how to employ them,

§   Techniques used in collaborative evaluation,

§   Collaborative evaluation design and data-collection skills.

Rita O’Sullivan of the University of North Carolina and John O’Sullivan of North Carolina A&T State University have offered this well-received session for the past five years at AEA. The presenters have used collaborative evaluation techniques in a variety of program settings, including education, extension, family support, health, and non-profit organizations.

Session 20: Collaborative Eval

Prerequisites: Basic Eval Skills

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80


Boundaries, Borderlands and Border-Crossers

We live and work in an interconnected community of global “neighborhoods” that challenge us to communicate and engage others across “diversity divides.” Join us in walking the pathways towards multicultural competencies as an ongoing, open-ended process that supports the design and implementation of more authentic, responsive, and effective evaluations.

The teaching-learning journey will be guided by a 10-member team that is planfully diverse in race/ethnicity and gender. Team members will individually and jointly engage attendees in a deliberative capacity-building process that models inclusive ways of inviting and attending to socioculturally diverse voices and experiences.

You will learn:

§   Mindfully-inclusive ways to invite and incorporate socioculturally diverse experiences,

§   The challenges of unpacking, deconstructing and reconstructing conventional evaluation practices,

§   Ways to work more responsively and effectively across “diversity divides,”

§   The value of cultivating intercultural/multicultural competencies.

Hazel Symonette hails from the University of Wisconsin-Madison and is the Board’s liaison to AEA's Building Diversity Initiative and International Committee. Drawing on over 30 years of experience in diversity-related arenas, she has convened five gender-balanced, race/ethnic-specific teams to collaborate in the facilitation of this session.

Session 21: Border-Crossers

Prerequisites: Basic evaluation skills

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Intermediate
Fee: Members $150, Nonmembers $200, Students $80

ROI: Providing a Balanced Viewpoint of Program Success
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Measuring the Return on Investment (ROI) of training, human resources, education, and social programs is important in both the public and private sectors. The proven, credible ROI methodology presented in this workshop generates six types of measures of program success, including participant reaction and satisfaction; learning; application and implementation; impact; ROI; and intangible measures. The process includes a critical step to isolate the effects of the program without the need for a control group.
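
For orientation, here is worked arithmetic with invented figures, using the commonly cited form of the ROI calculation: net program benefits divided by fully loaded program costs.

    # Worked arithmetic with invented figures; a commonly cited form of
    # the ROI calculation, not necessarily the workshop's exact procedure.
    program_costs = 80_000.0       # fully loaded costs (hypothetical)
    monetary_benefits = 200_000.0  # benefits converted to money (hypothetical)

    bcr = monetary_benefits / program_costs  # benefit-cost ratio
    roi_pct = (monetary_benefits - program_costs) / program_costs * 100

    print(f"BCR = {bcr:.2f}, ROI = {roi_pct:.0f}%")  # BCR = 2.50, ROI = 150%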

Exercises and case studies will offer you the opportunity to work through critical steps of the ROI methodology currently in use at hundreds of organizations around the world. To ensure that you have resources available after the workshop, you will receive a workbook, models, and a copy of Return on Investment in Training and Performance Improvement Programs. 

You will learn:

§   A proven, credible, feasible method for measuring ROI,

§   Methods to isolate program effects,

§   Methods to convert program effects to monetary value,

§   Strategies for implementing the ROI methodology and reporting ROI results.

 

Jack Phillips is the developer of and an internationally recognized expert on the ROI methodology. He has conducted workshops on ROI at companies and conferences around the world. Patti Phillips focuses her work on issues of accountability and ROI. Both have published extensively on the topic.

 

Session 22: Return on Investment

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Beginner, no prerequisites

Fee: Members $150, Nonmembers $200, Students $80


Practical Meta-analysis

Meta-analysis (also known as quantitative synthesis) is widely used to summarize the findings of evaluation studies for researchers, policy makers, and practitioners. Increasingly, it is playing a central role in the movement for evidence-based practice in social programs through such initiatives as the Campbell Collaboration and the Department of Education’s What Works Clearinghouse.

This workshop will use lecture, exercises, and demonstrations to introduce the basic techniques of meta-analysis. We will discuss the steps in conducting a meta-analysis as well as the ways in which the techniques can be used in the design and analysis of individual evaluation studies. Along the way, we will review some of the insights meta-analysis has revealed about evaluation research methods and program effects.
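
As a preview of the core computation, here is a minimal Python sketch of a fixed-effect (inverse-variance weighted) mean effect size. The five study results below are invented.

    # Minimal sketch: fixed-effect meta-analysis; study data are invented.
    import math

    # (effect size d, variance of d) for each study
    studies = [(0.30, 0.04), (0.55, 0.09), (0.10, 0.02),
               (0.42, 0.05), (0.25, 0.03)]

    weights = [1.0 / v for _, v in studies]           # precision weights
    mean_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                # standard error of the mean
    lo, hi = mean_d - 1.96 * se, mean_d + 1.96 * se   # 95% confidence interval

    print(f"pooled d = {mean_d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")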

You will learn:

§   How to conduct a simple meta-analysis to summarize evaluation findings,

§   How to interpret and use meta-analysis results,

§   How to apply meta-analysis techniques to the design and analysis of evaluation studies.

Mark Lipsey is the Director of the Center for Evaluation Research and Methodology at Vanderbilt University and has been conducting meta-analysis research for nearly two decades. He is a co-author with David Wilson of a meta-analysis primer, Practical Meta-Analysis (Sage), and co-author with Peter Rossi of the textbook, Evaluation: A Systematic Approach.

Session 23: Meta-analysis

Prerequisites: Basic quantitative methods and analysis skills

Scheduled: Wednesday, 11/5, 8 am to 3 pm

Level: Intermediate

Fee: Members $150, Nonmembers $200, Students $80

WEDNESDAY, NOV 5, HALF DAY, FROM 8 am to 11 am

Costs, Cost-effectiveness, and Cost-benefit Analysis

By building a model for a program you have already evaluated or wish to evaluate, you will learn Cost -> Procedure -> Process -> Outcome Analysis. CPPOA is a comprehensive, well-tested approach for modeling, evaluating, managing, and systematically improving the cost-effectiveness and cost-benefit of health and human services programs.

 

The presenter will provide real-world examples for each step in understanding and improving relationships between resources used, processes enacted, and outcomes produced. After each step’s explanation, you will construct that element of the model supported by personal coaching from the facilitator.
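
To make the distinction concrete, here is worked arithmetic with invented figures contrasting a cost-effectiveness ratio with a cost-benefit ratio.

    # Invented figures illustrating two of the ratios the workshop distinguishes.
    cost = 250_000.0              # annual program cost (hypothetical)
    clients_improved = 125        # outcome: clients meeting the target
    monetary_benefit = 400_000.0  # outcomes valued in dollars (hypothetical)

    ce_ratio = cost / clients_improved  # cost per successful outcome
    cb_ratio = monetary_benefit / cost  # benefit returned per dollar spent

    print(f"cost-effectiveness: ${ce_ratio:,.0f} per client improved")
    print(f"cost-benefit: ${cb_ratio:.2f} returned per $1 spent")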

 

You will learn:

§   Differences among costs, benefits, cost-effectiveness, & cost-benefit,

§   To develop a logic model organized into cost, procedure, process, and outcome variables,

§   Strategies for analyzing costs, cost-effectiveness, & cost-benefit,

§   How to improve the costs and benefits of programs based on a CPPOA evaluation.

 

Brian Yates hails from American University and brings to AEA more than a quarter-century of experience, supported by over 60 publications, on cost-effectiveness and cost-benefit analysis in a variety of health, mental health, and substance abuse settings.

 

Session 24: Cost-benefit Analysis

Scheduled: Wednesday, 11/5, 8 am to 11 am

Level: Beginner, no prerequisites

Fee: Members $75, Nonmembers $100, Students $40


Observation Tools and Techniques

When is observation an appropriate data collection method? How do you plan for and manage an observation of a program and its participants, service delivery, events, and/or facilities? This workshop will provide an overview of observation as a method of data collection including when it should be used, how to prepare, and development and use of instruments. 

 

Through mini-lecture, discussion, small-group exercises, and hands-on practice, you will investigate, create, and then use observation tools and techniques. You will leave the workshop with the beginnings of an observation plan for a program of your choosing.

 

You will learn:

§   The purpose, value, advantages and disadvantages of observation as a data collection technique,

§   Steps to planning an observation,

§   Techniques for constructing an observer rating checklist,

§   How to conduct an observation.

 

Dawn Hanson Smart is a primary partner in The Evaluation Forum, created to build internal evaluation capacity in community agencies and their funders. She has over twenty years of evaluation experience with social services agencies, government funders, and United Ways including design, implementation and technical assistance projects.

 

Session 25: Observation Tools

Scheduled: Wednesday, 11/5, 8 am to 11 am

Level: Beginner, no prerequisites

Fee: Members $75, Nonmembers $100, Students $40


Reconsidering Representation in Evaluation

Plays, poetry, and pictures have been suggested as "new" or "alternative" forms of representation for evaluators. The focus of this workshop is the importance of considering representational form in the presentation of evaluation findings and what that means for the evaluation process from planning and design to analysis and reporting.

 

We will move beyond the novelty of these forms to engage in discussion of issues that surface when you present findings differently. You will leave with ideas for incorporating new representational formats into your evaluation practice.

 

You will learn:

§   To structure evaluation design to allow for alternative forms of representations of findings,

§   To combine "traditional" and "non-traditional" representations to inspire learning and action,

§   How thinking about possibilities for representing findings can open up new possibilities for evaluation design, methods and reporting.

 

Leslie Goodyear’s dissertation focused on combining forms of representation. She conducts evaluation training as part of her work at City Year. Merrill Chandler co-led an AEA workshop in 2002 on using performance in evaluation.

 

Session 26: Representation in Eval

Prerequisites: Observation of new forms of representation

Scheduled: Wednesday, 11/5, 8 am to 11 am

Level: Intermediate
Fee: Members $75, Nonmembers $100, Students $40


Making Sense of Qualitative Evaluation Data  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Explore strategies for category development, coding, analysis, and interpretation of interview transcripts, field notes, reflective journal entries, and archival documents, including photographs. This hands-on workshop will give you the opportunity to experience the tasks of category development and the creation of working models.

 

Through mini-lecture, small group, and large group activities you will gain facility in working with large amounts of narrative data, including writing up a narrative representation. We will examine ethical issues that arise with qualitative projects as well as the problem of member checks and audit reviews of initial reporting to various types of agencies.

 

You will learn:

§   To make sense of interview transcripts,

§   Strategies for handling member checks of data,

§   Effective use of archival information and photographs.

 

Valerie Janesick’s life work is teaching, writing, and researching about qualitative methodology. Her second edition of Stretching Exercises for Qualitative Researchers (Sage) is due out this year. She has taught courses and workshops on methodology for over a quarter of a century.

 

Session 27: Qualitative Data

Prerequisites: Experience working with qualitative data

Scheduled: Wednesday, 11/5, 8 am to 11 am

Level: Advanced

Fee: Members $75, Nonmembers $100, Students $40


Cluster Evaluation Design and Data Collection Tools

Cluster evaluation focuses on evaluating a set, or cluster, of projects targeting a single issue such as literacy. The projects may have little in common other than (usually) a funding source and focus area, and their disparate nature brings unique evaluation challenges and opportunities. Cluster evaluation can be used for knowledge generation, formative and/or summative evaluation.

 

Oriented around group dialogue, this session will address 1) when to use cluster versus multi-site evaluation, and 2) designs for four areas of attention often addressed through cluster evaluation: a program’s quality, sustainability, ability to cultivate depth and breadth, and mutual benefits among partners.

 

You will learn:

§   The differences between cluster and multi-site evaluation and the defining features of each,

§   When to use cluster evaluation,

§   Four cluster evaluation designs and their applicability to different evaluation purposes.

 

Beverly Parsons, Executive Director of InSites, has over 20 years of experience in evaluation and has conducted and consulted on numerous cluster evaluations. James Sanders is one of the originators of cluster evaluation, developed while he was director of evaluation at the W.K. Kellogg Foundation.

 

Session 28: Cluster Eval Design

Prerequisites: Basic Eval Skills

Scheduled: Wednesday, 11/5, 8 am to 11 am
Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40

WEDNESDAY, NOV 5, HALF DAY, FROM 12 pm to 3 pm

Complex Cluster Evaluation Management and Data Use

Move beyond data collection to the challenges of cluster evaluation management, data interpretation and use. Pre-registrants can submit design and data interpretation/use challenges to serve as the basis for dialogue. Post-workshop, you may participate in a new cluster evaluation Community of Practice and receive on-line coaching through January.

 

We will use case studies to develop a deep understanding of a basic premise of cluster evaluation—interactive data interpretation and data-based development of program changes. This will be followed by an examination of complex cluster evaluation designs that change over time and serve multiple purposes. The session content is largely new relative to previous years.

 

You will learn:

§   Differential data interpretation and use techniques,

§   Design of staffing plans and site visits for efficient data analysis, interpretation, and use,

§   Conscientious adjustment of evaluation over multiple years.

 

Beverly Parsons and James Sanders together bring half a century of experience in evaluation to the session. See session 28 for more on these presenters.

 

Session 29: Cluster Management

Prerequisites: Workshop #28 OR Cluster Evaluation Experience

Scheduled: Wednesday, 11/5, 12 pm to 3 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40


Empowerment Evaluation

Empowerment Evaluation builds program capacity and fosters program improvement. It teaches people to help themselves by learning how to evaluate their own programs. The basic steps of empowerment evaluation include: 1) establishing a mission or unifying purpose for a group or program; 2) taking stock - creating a baseline to measure future growth and improvement; and 3) planning for the future - establishing goals and strategies to achieve goals, as well as credible evidence to monitor change. The role of the evaluator is that of coach or facilitator in an empowerment evaluation, since the group is in charge of the evaluation itself.

Employing lecture, activities, demonstration and discussion, the workshop will introduce you to the steps of empowerment evaluation and tools to facilitate the approach.

You will learn:

§   Steps to empowerment evaluation,

§   How to facilitate the prioritization of program activities,

§   Ways to guide a program’s self-assessment.

 

David Fetterman hails from Stanford University and is the author of the seminal text on this subject: Empowerment Evaluation. He chairs AEA's Collaborative, Participatory and Empowerment Evaluation Topical Interest Group and is a highly experienced and sought-after facilitator.

 

Session 30: Empowerment Eval

Scheduled: Wednesday, 11/5, 12 pm to 3 pm

Level: Beginner, no prerequisites

Fee: Members $75, Nonmembers $100, Students $40


Coaching Nonprofits For Outcome-based Evaluation  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Enhance your coaching skills related to building the capacity of nonprofit and community groups to plan and implement effective outcome-based evaluation. The areas of focus will be: outcome and indicator selection; theory of change frameworks and logic models; and selecting evaluation designs and data collection methods.

You will be introduced to relevant "how-to" methods and practice exercises that can readily be used in your own evaluation coaching work. Group discussion will provide opportunities to share and learn from one another's experiences in order to enhance knowledge and skills to address challenges related to building evaluation capacity.

You will learn:

§   Coaching skills in defining outcomes and indicators,

§   Coaching skills in constructing logic models and theories of change frameworks,

§   Coaching skills in selecting evaluation designs and data collection methods for outcome-based evaluation.

 

Jane Reisman, Anne Gienapp, and Bill Leon all hail from The Evaluation Forum, a leading provider of evaluation, materials, training and coaching.

 

Session 31: Coaching 4 Outcomes

Prerequisites: Eval Skills, experience w/nonprofits & funders

Scheduled: Wednesday, 11/5, 12 pm to 3 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40

 


Presentation Skills for Evaluators

Solid presentation skills are of paramount importance to the ultimate success of your evaluation. We will cover the entire presentation process, from developing effective visual aids and content plans to adding impact to your presentations by enhancing your skills to control attention and increase interaction.

 

Through interactive mini-lectures, discussion, demonstrations, and hands-on exercises, you will experience a variety of presentation methods, creative uses of media, and well-crafted materials that illustrate how to achieve a high impact presentation. These activities are designed to simulate and replicate problems, challenges, and decisions related to real case presentations.

You will learn:

§   To refine your presentation style and engage participants’ interest,

§   Ways to overcome stage fright,

§   Visual layout guidelines and how to apply them for maximum effect,

§   To deal with difficult participants and handle distractions.

 

Liliana Rodriguez-Campos and Rigoberto Rincones-Gomez of Western Michigan University have taught evaluation and management-related courses and have worked together for eight years designing and delivering workshops and seminars.

 

Session 32: Presentation Skills

Prerequisites: Basic Eval Skills

Scheduled: Wednesday, 11/5, 12 pm to 3 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40


Interviewing as an Evaluation Strategy  
This session is full. No more registrations will be accepted for this session. We do not maintain waiting lists and once a session is full it will not re-open.

Interviewing is a key strategy for generating useful information for evaluation purposes. Interviewing includes talking with people formally and informally as the process takes the evaluator into their worlds. It is frequently the primary method in evaluations that rely exclusively on qualitative methods and may supplement quantitative approaches in a mixed methods design. 

 

Through interactive lectures, small-group work, and simulations, you will develop and refine interviewing approaches for specific evaluation settings. You will have the opportunity to practice interviewing and receive critique.

You will learn:

§   Purposes and situations for interviewing in an evaluation,

§   Types of interviewing approaches,

§   When to talk and when to listen,

§   Strategies for implementing these approaches.

 

Gretchen Rossman and Sharon Rallis are evaluation consultants, university professors, and co-authors of Learning in the Field, 2nd edition. Each has taught qualitative methods to graduate students and education professionals for more than 15 years. Both have conducted workshops on specific qualitative methods at various professional conferences.

 

Session 33: Interviewing

Prerequisites: Basic Eval Skills

Scheduled: Wednesday, 11/5, 12 pm to 3 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40

SUNDAY, NOV 9, HALF DAY, FROM 9 am to 12 pm

Tools and Techniques to Improve Logic Models

Using logic models is an increasingly accepted component of evaluation practice. Yet employing a logic model does not assure a quality evaluation. Models are only as good as their ability to illustrate program theory and to point to the evidence needed to test the model’s validity.

 

Rated among the top 5 workshops offered in 2002, this session will include an overview of concepts, tools, and techniques used to improve logic model quality and utility, as well as guided hands-on opportunities to experiment with a variety of logic models. The workshop draws on the W.K. Kellogg Foundation’s Logic Model Development Guidebook, which you will receive for use during and after the session.

 

You will learn:

§   How to critically examine and improve logic models,

§   Uses and misuses of logic models for program design, implementation, and evaluation,

§   Components of logic model quality and utility.

 

Cynthia Phillips hails from Phillips Wyatt Knowlton, Inc. She has extensive experience in research and evaluation and is a respected convener of professional development opportunities.

 

Session 34: Improve Logic Models

Prerequisites: Logic Model Basics

Scheduled: Sunday, 11/9, 9 am to 12 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40


The Case Study as an Instructional Tool

Faculty often find themselves with a small window of opportunity to ensure their students are competent to make judgment, implementation, and knowledge-generation decisions. This workshop explores the case study as an instructional tool to build core evaluator competencies, including knowledge of different evaluation approaches, skill with data collection and data analysis, and attention to ethics and values.

 

In this hands-on session employing multiple teaching strategies, you will have access to classroom-proven materials, as well as opportunities to develop your own. The workshop is meant to spur discussion about "protective" but authentic instructional strategies that mirror the realities of the evaluator.

 

You will learn:

§   The range of competencies to which the case study attends,

§   Authentic case study design,

§   Techniques for workload management,

§   To connect the project with students’ likely future field work.

 

Marcie Bober teaches the field-based evaluation sequence at San Diego State University where she shepherds 20-25 students each year through the practicum process.

 

Session 35: Case Study

Prerequisites: Experience Teaching Evaluation

Scheduled: Sunday, 11/9, 9 am to 12 pm

Level: Advanced

Fee: Members $75, Nonmembers $100, Students $40


Using Cognitive Interviews for Survey Pretesting

Applying cognitive principles to survey methodology can improve item development and pre-testing by identifying cognitive and non-cognitive problems respondents encounter in answering survey questions. Cognitive Interviewing (CI) complements rather than replaces traditional methods of pre-testing and is aligned with current trends in evaluation practice—increasing statistical power through quality measurement practices, attending to cultural sensitivity, and increasing participation.

 

Through lecturettes and experiential learning opportunities, this workshop will provide an overview of the theory and practice of CI including identifying cognitive sources of survey error and how using CI in survey pre-testing can help identify and ameliorate these problems.

 

You will learn:

§   To apply CI to improve surveys,

§   Cognitive principles as they apply to survey pretesting,

§   When and how to conduct cognitive interviews,

§   How to write up CI reports.

 

Michael Schwerin is a survey methodologist at RTI International and Vicki Staebler-Tardino is an experienced organization development consultant with extensive training experience.

 

Session 36: Cognitive Interviews

Prerequisites: Interviewing Techniques, Qualitative Analysis

Scheduled: Sunday, 11/9, 9 am to 12 pm

Level: Intermediate

Fee: Members $75, Nonmembers $100, Students $40


The Success Case Method

The Success Case Method (SCM) is an innovative evaluation approach that organizations can use to find out quickly what is working and what is not. The SCM is faster and less expensive than traditional evaluation methods. Trading statistical complexity for clarity and storytelling, SCM studies provide compelling evidence about the impact of organizational interventions such as training programs and technology installations, in ways that senior leaders believe and understand.

 

The workshop will familiarize you with the basic principles and procedures of the SCM and engage you in exercises to apply and critique the SCM approach. You will take home materials to use to plan and conduct your own SCM study.

 

You will learn:

§   Fundamentals of the SCM,

§   To plan a Success Case Study for evaluation of an organizational development initiative,

§   Strategic applications of the SCM.

 

Bob Brinkerhoff is on the faculty at Western Michigan University and is an experienced and sought-after facilitator. He created the SCM in 1998, and it has since been developed in dozens of applications with major organizations.

 

Session 37: Success Case Method

Prerequisites: Survey and Interview Skills

Scheduled: Sunday, 11/9, 9 am to 12 pm

Level: Advanced

Fee: Members $75, Nonmembers $100, Students $40


Working Collaboratively in a Conflictive Environment

Master the mechanics of collaboration in a conflictive evaluation environment! Using the collaborative framework presented in this workshop, you can establish a more open and shared culture within programs being evaluated while attending to the intended and unintended effects of the working relationships.

 

Using discussion, demonstration, and hands-on exercises, the workshop blends theoretical grounding with the application of a collaboration framework to real-life evaluations. You will explore how evaluators can become mediators of conflict in order to create greater cooperation among stakeholders.

 

You will learn:

§   Ways to negotiate and respond to conflictive situations,

§   How to capitalize on stakeholders’ strengths to encourage feedback,

§   Techniques to respond to and minimize resistance to evaluation,

§   Processes for selecting appropriate collaboration methods.

 

Liliana Rodriguez-Campos hails from Western Michigan University and brings to the workshop extensive experience in conducting evaluations across a variety of settings and countries, having consulted in the private sector, with nonprofit organizations, and with institutions of higher education.

 

Session 38: Collaborative Work

Prerequisites: Basic Eval Skills

Scheduled: Sunday, 11/9, 9 am to 12 pm

Level: Intermediate
Fee: Members $75, Nonmembers $100, Students $40