User-Friendly Handbook for Mixed Method Evaluations
Edited by
Joy Frechtling
Laure Sharp
Westat
August 1997
Table of Contents
Part I. Introduction to Mixed Method Evaluations
1. Introducing This Handbook
(Laure Sharp and Joy Frechtling)
The Need for a Handbook on Designing and Conducting Mixed Method Evaluations
Key Concepts and Assumptions
2. Illustration: A Hypothetical Project
(Laure Sharp)
Project Title
Project Description
Project Goals as Stated in the Grant Application to NSF
Overview of the Evaluation Plan
Part II. Overview of Qualitative Methods and Analytic Techniques
3. Common Qualitative Methods
(Colleen Mahoney)
Observations
Interviews
Focus Groups
Other Qualitative Methods
Appendix A: Sample Observation Instrument
Appendix B: Sample In-Depth Interview Guide
Appendix C: Sample Focus Group Topic Guide
4. Analyzing Qualitative Data
(Susan Berkowitz)
What Is Qualitative Analysis?
Processes in Qualitative Analysis
Summary: Judging the Quality of Qualitative Analysis
Practical Advice in Conducting Qualitative Analyses
Part III. Designing and Reporting Mixed Method Evaluations
5. Overview of the Design Process for Mixed Method Evaluation
(Laure Sharp and Joy Frechtling)
Developing Evaluation Questions
Selecting Methods for Gathering the Data: The Case for Mixed Method Designs
Other Considerations in Designing Mixed Method Evaluations
6. Evaluation Design for the Hypothetical Project
(Laure Sharp)
Step 1. Develop Evaluation Questions
Step 2. Determine Appropriate Data Sources and Data Collection Approaches to Obtain Answers to the Final Set of Evaluation Questions
Step 3. Reality Testing and Design Modifications: Staff Needs, Costs, Time Frame Within Which All Tasks (Data Collection, Data Analysis, and Report Writing) Must Be Completed
7. Reporting the Results of Mixed Method Evaluations
(Gary Silverstein and Laure Sharp)
Ascertaining the Interests and Needs of the Audience
Organizing and Consolidating the Final Report
Formulating Sound Conclusions and Recommendations
Maintaining Confidentiality
Tips for Writing Good Evaluation Reports
Part IV. Supplementary Materials
8. Annotated Bibliography
9. Glossary
List of Exhibits
1. Common techniques
2. Example of a mixed method design
3. Advantages and disadvantages of observations
4. Types of information for which observations are a good source
5. Advantages and disadvantages of in-depth interviews
6. Considerations in conducting in-depth interviews and focus groups
7. Which to use: Focus groups or in-depth interviews?
8. Advantages and disadvantages of document studies
9. Advantages and disadvantages of using key informants
10. Data matrix for Campus A: What was done to share knowledge
11. Participants’ views of information sharing at eight campuses
12. Matrix of cross-case analysis linking implementation and outcome factors
13. Goals, stakeholders, and evaluation questions for a formative evaluation
14. Goals, stakeholders, and evaluation questions for a summative evaluation
15. Evaluation questions, data sources, and data collection methods for a formative evaluation
16. Evaluation questions, data sources, and data collection methods for a summative evaluation
17. First data collection plan
18. Final data collection plan
19. Matrix of stakeholders
20. Example of an evaluation/methodology matrix