[Rasch] underfitting items and response rate

Gracia, Susan sgracia at ric.edu
Wed Sep 19 01:24:29 EST 2007


Hello-
The 11 scales correspond to the 11 components of the federal Comprehensive School Reform program.  The items in each scale represent (theoretically) varying degrees of implementation of each component.  Each scale contains 6 to 15 items.  There are 5 response options, ranging from "This is not at all true of my school" to "This is very true of my school."
 
The items with very few responses are interspersed throughout the survey.  They are not concentrated at the end, which would have suggested respondent fatigue.  
 
There are a total of 30 underfitting items.  They are items that I would associate with very high levels of implementation.  Very few of the respondents to the pre-survey would have been in schools with high implementation last year.  Where the number of respondents to the other items ranges from 104 to 140, the number of respondents to these underfitting items ranges from 1 to 17.
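
One quick screen, before interpreting fit at all, would be to pull the item statistics out of Winsteps (IFILE= writes them to a file) and flag anything with too few responses to trust.  A rough sketch in Python, assuming the item file has been exported in CSV form; the file name and column names are placeholders for whatever the exported file actually contains:

import pandas as pd

# Item statistics exported from Winsteps (IFILE=), assumed here to be
# in CSV form.  Column names are placeholders -- rename to match the
# actual file.
items = pd.read_csv("pre_item_stats.csv")

MIN_COUNT = 30  # arbitrary floor; fit based on 1-17 responses is mostly noise

flagged = items[items["count"] < MIN_COUNT]
print(flagged[["name", "count", "infit_mnsq", "outfit_mnsq"]])

Items below the floor would get set aside for substantive review rather than judged on their fit statistics.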
 
Juho Looveer asked me if I am more interested in changes in responses or in the post-survey responses.  If I have to choose, I guess I am more interested in changes in responses from pre-test to post-test.  I have to report on this to my client.  However, I'd also like to learn how to better measure these variables in the future, which might be a reason for leaving the questionable items in the post-test.
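
Once both administrations are calibrated, one way to quantify change on an item (or a person) is the usual standardized difference between the two independently estimated measures.  A toy sketch, with made-up numbers:

import math

def standardized_change(measure_pre, se_pre, measure_post, se_post):
    # Standardized difference between two Rasch measures estimated
    # from independent calibrations (approximately a z-statistic).
    diff = measure_post - measure_pre
    se = math.sqrt(se_pre ** 2 + se_post ** 2)
    return diff / se

# e.g., an item at 1.20 logits (SE 0.15) pre and 0.70 logits (SE 0.12) post:
print(standardized_change(1.20, 0.15, 0.70, 0.12))  # about -2.6

Values beyond roughly +/-2 would suggest change larger than measurement error alone.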
 
Finally, I'm using Winsteps for my analyses.
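
For anyone wondering why I distrust fit values based on a handful of responses: the outfit mean-square Winsteps reports is just the average of the squared standardized residuals (infit is an information-weighted version), so an item with 4 respondents has its fit estimated on almost no information.  A bare-bones illustration of the outfit calculation for the dichotomous case (toy values, not my data):

import math

def outfit_mnsq(responses, person_measures, item_difficulty):
    # Outfit mean-square for one item under the dichotomous Rasch model:
    # the average of the squared standardized residuals across the
    # persons who responded to the item.
    z_squared = []
    for x, b in zip(responses, person_measures):
        p = 1.0 / (1.0 + math.exp(-(b - item_difficulty)))  # P(x = 1)
        z_squared.append((x - p) ** 2 / (p * (1.0 - p)))
    return sum(z_squared) / len(z_squared)

# With only 4 responses, one surprising answer swings the statistic wildly:
print(outfit_mnsq([1, 1, 0, 1], [0.5, 1.0, -0.2, 0.8], 0.0))

With so few residuals in the average, a single surprising response can push the mean-square to an extreme in either direction.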
 
Thanks for your help.
 
Susan

________________________________

From: Trevor Bond [mailto:trevor.bond at jcu.edu.au]
Sent: Mon 9/17/2007 7:46 PM
To: Gracia, Susan; rasch listserve
Subject: [SPAM] Re: [Rasch] underfitting items and response rate



Dear Susan

11 scales (!!!) How many items on each?
What is the response format? Likert? How many response options?
The response load looks a bit tough on your sample... Is that why they skip?
I am all for understanding the variable first.

best
T

At 4:33 PM -0400 9/17/07, Gracia, Susan wrote:
>Hello-
>I administered an online survey last year as a pre-test for an
>educational intervention.  It's almost time for me to administer the
>post-test.  The survey consists of 11 scales that I want to analyze
>using Rasch methods.  Ultimately, I want to look at any change that
>took place over the past year on each of the 11 variables.
>
>Before administering the post-test, I have looked at the fit
>statistics for the items in the various scales.  I would like to
>eliminate any items that do not fit the model and are not part of
>the variables I'm interested in.  There's also some appeal to me in
>shortening the survey and reducing burden on the respondents.
>
>I have approximately 150 respondents to the pre-survey.  On each of
>the scales, there's no problem with underfitting items.  Some
>scales, however, do have overfitting items with very low mean
>squares (e.g., .38, .02).  Looking more closely, I see that
>those items are the ones that most people skipped.  They might have
>4 responses.  So, they obviously did not "fit" in some way from
>respondents' points of view.
>
>I could eliminate those items now.  However, I suspect that some of
>these items actually do belong on the variable(s) but they
>represented situations that respondents had never encountered or did
>not understand.  After a year in the program, I think their
>responses might be different.
>
>So, I am torn between reducing the number of items based on this
>preliminary item analysis (and probably increasing my response rate because
>the survey will be shorter) versus leaving the survey as is.  If I
>leave the survey as is and the same items don't fit the second time,
>I will eliminate them from any analyses of change.  If the items fit
>the second time around, I will learn more about the variables I'm
>interested in, but I realize that I won't be able to speak much
>about change in respondents on items, since almost no one responded to
>them the first time.
>
>How would you recommend that I proceed?  Any assistance would be
>most appreciated.
>
>Thank you.
>
>Susan Gracia, PhD
>Associate Professor, Educational Leadership Program
>Dept. of Counseling, Educational Leadership, and School Psychology
>Director of Assessment, Feinstein School of Education and Human Development
>Adams 103
>Rhode Island College
>600 Mount Pleasant Avenue
>Providence, RI 02908-1991
>tel.:  401-456-8577
>email:  sgracia at ric.edu
>
>_______________________________________________
>Rasch mailing list
>Rasch at acer.edu.au
>http://mailinglist.acer.edu.au/mailman/listinfo/rasch


--
Trevor G BOND Ph D
Professor and Head of Dept
Educational Psychology, Counselling & Learning Needs
D2-2F-01A EPCL Dept.
Hong Kong Institute of Education
10 Lo Ping Rd, Tai Po
New Territories HONG KONG

Voice: (852) 2948 8473
Fax:  (852) 2948 7983




