So this next assignment was to create a survey, have a few people complete it, and then submit the finalized survey with any changes that needed to be made. Well, I have taken lots of surveys in my life and have even made a few really minor ones, so I was surprised that it was difficult to know where to start creating one. My first decision was what the survey should be about. I decided to relate my survey to my final project, due in a couple of weeks. There were a few different surveys I could have created, one for educators or one for parents, but I decided to create one for students.
I wanted to create a survey that would help evaluate what type of experience students had during their outdoor education class. The purpose of the course is to create a fun, engaging class that is a positive learning experience for all students. This survey would be given anonymously to all the students after the course is completed to evaluate how they felt about it. I wasn't sure where to begin, so I decided to look back at my logic model to re-read my course goals and base my questions on those goals. From there I broke the survey into three main parts: reasons the students chose the course, the students' experience in the course, and lastly the students' written opinions and suggestions. The original survey can be found here: Assignment #5: Survey- original.
I then sent my survey to some trustworthy friends who I knew would be honest and give me their ideas for any changes that needed to be made. The first change I made was to reduce the number of choices I gave the students from 6 to 4; I took out the 'yeah' choice and the 'does not apply' choice. I generally like having an odd number of choices, but one of my reviewers said it is better to take out the middle option, as most people default to the middle. Another change was made to a question in the second part that asked if the resources were used well. Feedback was that the question was too vague, so I split it into two questions: first, whether the handouts were helpful, and second, whether the sporting equipment used enhanced learning. My next change involved the question asking if the students felt they had the opportunity to be a part of the community. This question wasn't very descriptive; what I wanted to ask was whether the students felt they were able to explore places in the local communities, which would help to increase their knowledge of local ecology and biology. Since that would make a really long question, I compressed it into a question asking whether the students had the opportunity to discover the local community. The last major change I made was to reword the written-response questions to minimize the chance that students could respond with nothing or "no suggestions." I fixed a few minor grammatical issues, changed some wording for consistency, and ended up with my final survey. This survey will be part of my final course evaluation to evaluate the effectiveness of the outdoor education program. Assignment #5- survey final version.
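When it comes time to compile the closed-ended answers on the 4-point scale, a few lines of code could do the tallying. This is just a hypothetical sketch; the option labels below are stand-ins, not the wording from my actual survey.

```python
# Hypothetical sketch: tally anonymous responses to one survey question
# on a 4-point scale. The option labels are stand-ins, not the real survey wording.
from collections import Counter

SCALE = ["Strongly disagree", "Disagree", "Agree", "Strongly agree"]

def tally(responses):
    """Count how many students picked each option, reported in scale order."""
    counts = Counter(responses)
    return {option: counts.get(option, 0) for option in SCALE}

answers = ["Agree", "Strongly agree", "Agree", "Disagree", "Agree"]
print(tally(answers))
# → {'Strongly disagree': 0, 'Disagree': 1, 'Agree': 3, 'Strongly agree': 1}
```

Reporting every option, even the ones nobody picked, makes it easier to compare the same question across years.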
Adventures in 809
Thursday, March 14, 2013
Sunday, February 10, 2013
Assignment #4- Logic Model
So I have been having some trouble posting to my blog; it acts like it should be loading, but all I get is a blank screen. I gave up last night and tried with a refreshed mind this morning, even though it should have been done yesterday- oops. I jumped outside of my knowledge box and took the plunge into using Google Docs, as Bill and J.R. had talked about using it to post in class. I have to say that I learned a lot and will likely start to use it regularly; I didn't know how useful it could be! That being said, I'll need to see if my link actually works or if I need to go back and do more research. Here it is, see below for the link- fingers crossed.
Logic Model- Outdoor Education
Now to create my survey :)
Saturday, February 9, 2013
Assignment #3- Planning a Program Evaluation
Outdoor Education
Engage Stakeholders
Who should be involved?
The outdoor education teachers, administrators, and participating students at Delisle Composite School will be involved in the assessment, and the general community, including parents, will also be informed of the results.
How might they be engaged?
Staff and students will reflect on the program through survey questions, interviews, and observations. Students will also be involved through summative evaluation, by comparing their final exam answers with those of non-participants and by interviews of selected students.
Focus on the Evaluation
What are you going to evaluate? Describe
program (logic model)
The plan is to evaluate the:
-effectiveness of student learning in Biology 20
and Physical Education 20
-level of student engagement and satisfaction
Delisle Composite School (DCS) serves 230
students from grade 7 to grade 12. The school is part of the Prairie Spirit
School Division. The student population in the school has been decreasing
in recent years due to the influx of students choosing to attend Saskatoon high
schools. As a way to entice students to keep attending school in Delisle, a
number of interesting programs have been created. It is the hope that these
programs will increase student engagement and will help DCS compete with the
course offerings of bigger city schools.
The outdoor education program combines Physical Education 20 and Biology 20, which helps to increase the hands-on perspective of Biology 20. The phys. ed. portion of the course is easily incorporated through biking, snowshoeing, canoeing, hiking, and other activities. The biology portion typically consists of hands-on projects and activities such as pond dipping, collecting samples, and making observations. This knowledge is supplemented by coursework and research, which helps to occupy any days when weather does not permit students to be outdoors. This is a unique course, as it allows students to enter the community and use the resources available around them. Therefore, strong community relationships need to be forged, allowing students access to resources such as the golf course and continued funding from local businesses.
What is the purpose of this evaluation?
This is a formative and summative assessment with the intent of identifying the effectiveness of the outdoor education program and ways to improve it by making any needed adjustments.
Who will use the evaluation? How will they use it?
| Who/users | How will they use the information? |
|---|---|
| Outdoor Education Teacher | Results can be used to ensure that the teacher is delivering adequate knowledge in both subject areas as well as engaging students with a positive educational experience. Results will also help guide the teacher if changes need to be made. |
| Administrators | Information from the evaluation will be used to improve the program and to gauge budget needs. |
| Staff | Results and conclusions from the program evaluation will be shared with all staff members to increase staff knowledge of school programs. |
| Community | Results from the evaluation will be shared with the community in order to build relationships and assure contributing businesses that their funding is being well used. |
| Parents | Results from the evaluation will be shared with parents, who will hopefully endorse the program and spread the word through the community. |
What questions will the evaluation answer?
- How does Outdoor Education affect student engagement?
- How does Outdoor Education affect student learning?
- Do parents see benefits to Outdoor Education?
- Is the teacher utilizing instructional time efficiently?
- Are students performing as well on final exam questions as non-participants?
- Are resources being used to their potential?
- Are the existing resources adequate?
What information do you need to answer the questions?
| What I wish to know | Indicators – how will I know it? |
|---|---|
| Does Outdoor Ed. affect student engagement? Is the teacher using time efficiently? | Observation and student surveys in both the outdoor education classes and the traditional classes |
| Are students obtaining the same knowledge as those in traditional classes? Are students performing as well on final exam questions? | Compare marks between the traditional and outdoor education classes |
| Do parents see benefits to the program? | Survey given to the parents |
| Are resources adequate and being well used? | Interview the outdoor education teacher and administration |
When is the evaluation needed?
The evaluation would be carried
out throughout the semester and would be finished by the end of the year on June
30. This would give the evaluator time to compare final exam results, compile
student and parent surveys and complete the necessary interviews.
What evaluation design will I use?
The evaluation will be both formative and summative. It is formative since the research will be ongoing during the semester, and summative since data will be collected at the end of the year and any changes or modifications would not be made until the evaluation is complete. The evaluation will be goals-based, using a CIPP model.
Collect the information
What sources of information will you use?
People: Administration, teachers, students and parents
Pictorial records and observations: Administrative observations and teacher observations
What data collection method(s) will you use?
- Surveys (before and after)
- Interviews (after)
- Observation (during)
- Marks on final exams (after)
- Focus group (during)
Instrumentation:
What is needed to record the information?
- Surveys: design questions; distribute surveys by email to parents and as handouts to students; collect and organize results
- Interviews: create questions; choose a random sample of students; schedule interviews at the end of the course
- Observations: general documentation, to be done throughout the term (beginning, middle, end)
- Focus group: design questions; invite participants; record the conversation
When will you collect data for each method you've chosen?

| Method | Before program | During program | Immediately after | Later |
|---|---|---|---|---|
| Surveys | X | | X | |
| Interviews | | | X | |
| Data (marks) | | | X | |
| Observation | | X | | |
| Focus Group | | X | | |
Will a sample be used?
Not overall; however, a random sample of students will be selected for interviews.
Analyze and Interpret
How will the data be analyzed and what methods will be used?
- Surveys: interpret and compile information (evaluator)
- Interviews and focus group: interpret, compare, and compile information (evaluator)
- Observations: compile and compare the three sets of observations (teacher, admin., evaluator)
- Data (marks): compile and compare final exam marks between the two groups of students
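As a sketch of that last step, the mark comparison could be done with a short script. This is purely illustrative and not part of the plan; the group names and marks below are hypothetical, and the real data would come from the final exams.

```python
# Hypothetical sketch: compare final exam marks between the outdoor
# education group and a traditional class. All marks below are made up.
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

outdoor = [78, 82, 74, 88, 91, 69, 85]      # hypothetical marks (%)
traditional = [75, 80, 72, 84, 79, 70, 83]  # hypothetical marks (%)

print(f"Outdoor mean:     {mean(outdoor):.1f}")
print(f"Traditional mean: {mean(traditional):.1f}")
print(f"Welch t:          {welch_t(outdoor, traditional):.2f}")
```

A t statistic near zero would suggest the two classes are performing similarly; with real class sizes the evaluator would also want a p-value, which a statistics package can provide.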
How will the information be interpreted, and by whom?
The evaluator will be responsible for analyzing the data. The administration and outdoor ed. teacher will then be brought into the discussion so that their interpretations can be included.
What did you learn? What are the limitations?
- Since the evaluator is an internal member of staff, some care will need to be taken when analyzing data and making judgements on the program.
- There would also be some time constraints on collecting all the data between the final exam and the end of the year, when marks are distributed.
- The evaluator needs to consider any variations between the traditional classrooms and the outdoor education groups; some years the demand for the program requires that students get 'accepted' into the program.
Use the information
How will the evaluation be communicated and shared?
The evaluation will be analyzed and compiled into a report that will then be shared with administration and the outdoor education teacher.
What are the next steps?
Together, the administration, teacher, and evaluator will create a brief statement of the results and any changes that may be implemented in the future. This can be shared with other stakeholders whenever possible.
Manage the evaluation
Human subjects protection
Surveys will be completed anonymously by students and parents, which will allow participants to answer questions truthfully. The data collected will have all names and identification removed before the report is made.
Management chart
The evaluator will create a Gantt chart (used to manage complex schedules) and share it with administrators.
Timeline
- January: complete all preparatory evaluation work; prepare surveys and interview questions
- February: conduct pre-surveys and do observations
- March/April: analyze pre-survey results; complete another observation
- May: begin conducting interviews of the teacher and administration; distribute parent surveys; conduct the student focus group
- June: distribute, conduct, and analyze post-surveys; collect and analyze final exam data; prepare the report and meet with the administrator and teacher to finalize it and add in improvements
Responsibilities
The evaluator is responsible for creating all survey and interview questions. In addition, they will need to distribute surveys, set up interviews, analyze and summarize the results, and create a report to share with administration and the teacher. Collaboratively, a list of recommendations will be compiled and included in a brief report that will be shared with the stakeholders. The teacher will be responsible for creating the final exam and collecting the exams for the evaluator afterwards.
Budget
All required devices and materials are available internally; therefore, the only fees would be any expenses incurred by the evaluator.
Standards
Utility
The evaluation will be used to make improvements to the program and to assess its effectiveness.
Feasibility
The outdoor education classes tend to be quite small, so the evaluation should be manageable. The most difficult part would be in June, when a lot needs to be accomplished and schools are at their busiest; however, the interviews and focus group could be conducted in May to help ease this time constraint. If necessary, arrangements could be made to extend the deadline if all the parties involved agree to it.
Propriety
There would have to be an element of trust from the very outset of the evaluation so that all parties are respected in all steps of the evaluation.
Accuracy
The evaluation is looking at specific goals and outcomes and should be accurate due to this concentrated focus.
Saturday, January 26, 2013
Assignment #2-Evaluating a Pre-Natal Exercise Program
Assignment #2: Program Evaluation of the Pre-Natal Exercise Program for
Aboriginal Women.
It has been found that children born to a mother with gestational diabetes (GDM) have a greater chance of developing type 2 diabetes (Klomp, Dyck, & Sheppard, 2003). Diabetes is a concern in the Aboriginal population, as an estimated 20% is affected by the disease (Canadian Diabetes Association, 2013). This program was set up to try to decrease the incidence of GDM, and eventually type 2 diabetes, in Aboriginal people. To accomplish this, pregnant Aboriginal women were invited to participate in an exercise program. I think that Stufflebeam's CIPP model would be useful in the evaluation of this program.
I chose this model as it is well suited to assessing needs and planning, monitoring the process of implementation, and providing feedback and judgment of the project's effectiveness for continuous improvement, which is precisely what this project needs (Zhang et al., 2011). Stufflebeam's CIPP model is composed of four components: Context, Input, Process, and Product.
Context: The objective of context evaluation is to help decision makers assess needs, problems, assets, and opportunities while defining goals and actions (Tan, Lee, & Hall, 2010). The context of this study is the pre- and post-natal health of Aboriginal women. There are many socioeconomic needs to consider when working with this group. Limited transportation, program costs, and a lack of child care were all barriers that were addressed so they would not prevent women from participating in the program. The project was initiated to increase physical activity among young Aboriginal women, leading to healthy pregnancies and reducing the occurrence of diabetes in future generations. The program intends to achieve this goal by hosting a weekly 45-minute exercise class led by a qualified instructor.
Input: Input evaluation helps decision makers assess plans for feasibility and cost-effectiveness in achieving planning objectives (Tan, Lee, & Hall, 2010). The study was funded by the National Health Research and Development Program (NHRDP), which would be using the information gathered by the evaluation. Assessing cost-effectiveness is very important, as many costs are incurred in delivering the program. Staffing costs include a certified fitness instructor, registered nurse, project facilitator, and physiotherapist. Additional costs incurred include childcare, transportation, maternity swimwear, door prizes, food and drinks, and educational materials and resources.
Process: Process evaluation allows decision makers to determine if the program is achieving what it was intended to (Tan, Lee, & Hall, 2010). Examining the program and how it adapted to the participants allows evaluators to gauge how the current program is functioning as well as how to structure any future programs. It is unclear how the participants were able to give their feedback, whether through surveys or verbal conversations, but either method would work.
Product: Product evaluation aids in identifying and assessing outcomes, both intended and unintended, short-term and long-term (Tan, Lee, & Hall, 2010). Due to the short length of this 'pilot' study, only short-term outcomes could be determined. If the study continued, long-term outcomes could be analyzed by seeing whether GDM actually decreased due to the program. The short-term goal of 'optimizing healthy pregnancies' could be assessed by allowing the participants to give feedback on the benefits of the program. Although I don't believe it was done, vital signs (blood pressure, weight gain, etc.) could be taken and used to give some concrete data to the evaluation. Some unintended outcomes would be the added value of the program, such as the information resources available for loan and the social aspect of participants meeting after class.
I think a major problem with the program is the lack of clear and concise goals. Initially the program was designed to reduce type 2 diabetes in Aboriginal people, yet nothing in the program is set up to actually determine if this is occurring, not to mention what a huge undertaking that would be. Another problem is the low participation rate (<7%). Was this acceptable to the program, or was it a goal to increase participation as well?
Canadian Diabetes Association. (2013). Six hundred Aboriginal diabetes programs at risk across Canada. Retrieved from http://www.newswire.ca/en/story/576011/six-hundred-aboriginal-diabetes-programs-at-risk-across-canada.

Klomp, H., Dyck, R., and Sheppard, S. (2003). Description and evaluation of a prenatal exercise program for urban Aboriginal women. Canadian Journal of Diabetes, 27: 231–238.

Tan, S., Lee, N., and Hall, D. (2010). CIPP as a model for evaluating learning spaces. Swinburne University of Technology. Retrieved from http://www.swinburne.edu.au/spl/learningspacesproject/.

Zhang, G., Zeller, N., Griffith, R., Metcalf, D., Williams, J., Shea, C., and Misulis, K. (2011). Using the Context, Input, Process, and Product Evaluation Model (CIPP) as a comprehensive framework to guide the planning, implementation, and assessment of service-learning programs. Journal of Higher Education Outreach and Engagement, 15(4), 57.