On Friday, May 13th, I had my Face-to-Face Meeting with two of my mentors, Amber Budden and Heather Soyka (my third mentor is Viv Hutchison), to discuss my project, Developing a Survey Instrument for Evaluation of Teaching Materials, and to develop the project plan for all three parts of the project. The three parts are: 1) a literature review of best practices for soliciting and receiving user feedback on teaching and educational resources; 2) the development of an assessment tool for evaluating current educational resources from DataONE; and 3) identification of target audiences and recommendations for deployment of the tool. With the project plan in place, I began the first week of my project with the first part of the two-week literature review phase.
For the first part of the literature review, I focused primarily on finding best practices and concrete examples of methods for evaluating education/training resources. Since my mentors and I had identified four education/training delivery modes relevant to DataONE (i.e. in person, online, presentation slides, and text-based documents), I also conducted additional literature searches focused on evaluating these specific formats. The following are the keywords and databases that I used for the literature review.
- Search Engine/Database – Google (for information published outside of scholarly journal formats)
- Searched keywords/phrases – (the keywords/phrases were constructed during the Face-to-Face Meeting)
- training evaluation criteria
- develop a method of training evaluation
- how to evaluate training effectiveness
- education evaluation methods
- education effectiveness assessment methods
- online education elearning evaluation criteria
- e learning evaluation survey
- e learning evaluation questionnaire
- presentation evaluation criteria
- best practices when evaluating presentation
- syllabus evaluation criteria
- evaluate effectiveness of written communication
- Search Engine/Database – (for information published mainly as peer-reviewed journal articles)
- Library & Information Science Source from UIUC Library
- ERIC (Education Resource Information Center) Education Literature from UIUC Library
- Association for Computing Machinery (ACM) Digital Library from UIUC Library
- Searched keywords/phrases – ((training OR workshop) AND effectiveness AND evaluation AND (criteria OR method))
- The same phrase was used for all three databases and was constructed after reviewing the Google search results.
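As a side note, the boolean phrase above can be unpacked into the concrete keyword combinations it covers by expanding its OR groups. The sketch below is purely illustrative (the databases themselves handle the boolean logic; the group names are my own labels):

```python
from itertools import product

# The boolean search phrase used for all three databases:
#   ((training OR workshop) AND effectiveness AND evaluation AND (criteria OR method))
# Each inner list is one OR group; AND terms are single-item groups.
or_groups = [
    ["training", "workshop"],   # (training OR workshop)
    ["effectiveness"],
    ["evaluation"],
    ["criteria", "method"],     # (criteria OR method)
]

# Cartesian product over the OR groups yields every combination the
# query matches: 2 x 1 x 1 x 2 = 4 concrete keyword phrases.
combinations = [" ".join(terms) for terms in product(*or_groups)]
for combo in combinations:
    print(combo)
```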
After reviewing more than 750 results, the preliminary literature review indicates that a questionnaire/survey/poll might be the most applicable method for soliciting user feedback on DataONE’s education/training resources, mainly because questionnaires/surveys/polls can be administered digitally. When designing them, evaluation areas such as learning objectives, content structure, reasoning, language/vocabulary, level of interactivity, usage of visual aids, technical functions, timing/duration, and return/retention rate of attendees can all be included. Specific evaluation areas can also be emphasized further depending on the education/training delivery method being evaluated. However, care should be taken to ensure that the questionnaire/survey/poll is easy to use and not overly time-consuming for users. Further, the frequency of the evaluation (e.g. before vs. during vs. after the education/training session) and the types of evaluation questions (i.e. open-ended, closed scalar, multiple-choice, checklist, ranked/ranking) can both be adjusted to support the evaluations more effectively.
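To make the question types and evaluation areas above concrete, here is a minimal sketch of how they could be organized in a survey instrument. The class names, field names, and example questions are all hypothetical, not part of any DataONE tool:

```python
from dataclasses import dataclass, field
from enum import Enum

class QuestionType(Enum):
    """The five evaluation question types noted in the review."""
    OPEN_ENDED = "open-ended"
    CLOSED_SCALAR = "closed scalar"
    MULTIPLE_CHOICE = "multiple-choice"
    CHECKLIST = "checklist"
    RANKED = "ranked/ranking"

@dataclass
class SurveyQuestion:
    prompt: str
    qtype: QuestionType
    # Evaluation area the question targets, e.g. "learning objectives",
    # "content structure", "timing/duration" (labels are hypothetical).
    area: str
    options: list = field(default_factory=list)

# Hypothetical questions for evaluating an in-person training session:
questions = [
    SurveyQuestion("Were the learning objectives clearly stated?",
                   QuestionType.CLOSED_SCALAR, "learning objectives",
                   options=["1", "2", "3", "4", "5"]),
    SurveyQuestion("Which visual aids did you find helpful?",
                   QuestionType.CHECKLIST, "usage of visual aids",
                   options=["slides", "handouts", "live demo"]),
    SurveyQuestion("What would you change about the session's pacing?",
                   QuestionType.OPEN_ENDED, "timing/duration"),
]
```

Tagging each question with its evaluation area would make it easy to generate variants of the same survey that emphasize different areas for different delivery modes, as suggested above.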
Continuing the literature review next week, I will look further into scholarly publications on performing evaluations effectively and into the available information on questionnaire/survey/poll design and implementation.