Thursday, March 5, 2015

Alternative paths to college completion: Effect of attending a 2-year school on the probability of completing a 4-year degree: Jonathan Sandy, Arturo Gonzalez, Michael Hilmer

This paper tries to estimate the effect that attending a 2-year college before attending a 4-year university has on bachelor's degree completion rates. It uses the National Longitudinal Study of the High School Class of 1972 (NLS72), the 1994 round of the Beginning Postsecondary Students study (BPS), and the 1992 round of the sophomore cohort of High School and Beyond (HSB) as separate data sources to see whether the results are consistent. It also uses Oaxaca's decomposition method to separate the gap between students who start at 2-year institutions and those who start at 4-year institutions into a portion due to differences in student quality and a portion due to differences in institution quality.
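For readers unfamiliar with the method: a standard Oaxaca (Blinder-Oaxaca) decomposition splits the mean gap in an outcome between two groups into a part explained by differences in observed characteristics and a part attributed to differences in estimated coefficients. As a sketch of the textbook form (not necessarily the exact variant the authors use), with group 4 denoting students who start at 4-year institutions and group 2 those who start at 2-year institutions:

    \bar{Y}_4 - \bar{Y}_2 = (\bar{X}_4 - \bar{X}_2)'\hat{\beta}_4 + \bar{X}_2'(\hat{\beta}_4 - \hat{\beta}_2)

The first term is the piece the paper attributes to student quality (differences in characteristics), and the second the piece attributed to institution quality (differences in returns). How the gap splits depends on which group's coefficients serve as the reference, which is one standard caveat about such decompositions.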

The most interesting part for my purposes was the analysis of the HSB data, since this was the data Adelman used in his first toolbox study. He found that transferring from a 2-year to a 4-year institution, while earning more than 10 credits at each, increased graduation rates. Would these researchers find the same result? They did not; in fact, they found that the chance of graduating was 19.3% lower for those who attended 2-year institutions first, with about 48% of the gap due to lower student quality and about 52% due to lower institution quality. Why the opposite results?
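To put those shares in concrete terms (my own back-of-the-envelope arithmetic, reading the 19.3% figure as the size of the graduation gap):

    0.48 \times 19.3 \approx 9.3 \quad\text{and}\quad 0.52 \times 19.3 \approx 10.0

so roughly 9 points of the gap would be attributed to student quality and about 10 points to institution quality.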

For Sandy et al., mother with some college, father with some college, GPA above a B average, being female, verbal and quantitative SAT scores, and hours worked were all significant controls. For Adelman, high school performance, having children, continuous enrollment, first-year grades, dropping many courses (in the first year and overall), and grade trend were all significant controls. He also found sex, whether a student worked, and starting at a 4-year university to be insignificant, while socioeconomic status (SES), which Sandy et al. try to represent with parental education, was significant only at the .1 level.

Given the differences in the findings despite the apparent use of the same data, one of the controls must explain the gap between them. In my eyes, the two major factors that Adelman accounted for and Sandy et al. did not are having a child and continuous enrollment. Of these, continuous enrollment, one of the biggest factors in Adelman's model, seems the likely culprit. Perhaps 2-year students who move directly into 4-year programs have more momentum, or lower opportunity costs, or were more motivated in the first place, or some other factor that makes them more likely to graduate than those who wait between finishing at a 2-year institution and enrolling at a 4-year one. Another potential factor is how transfer is defined: Adelman requires at least 10 credits from each institution. Perhaps some students struggle with the transition from 2-year to 4-year and never complete even 10 credits at the latter, which would keep them in Sandy et al.'s analysis but exclude them from Adelman's, as sketched below.
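To make the definitional point concrete, here is a rough Python sketch of how a 10-credit threshold changes who counts as a transfer. The data and column names are invented for illustration; neither study publishes code, and the actual sample definitions may differ in detail.

    import pandas as pd

    # Invented example records: one row per (student, institution level) enrollment
    records = pd.DataFrame({
        "student_id": [1, 1, 2, 2, 3],
        "level":      ["2yr", "4yr", "2yr", "4yr", "2yr"],
        "credits":    [30, 45, 24, 6, 15],
    })

    # Total credits earned at each level, per student
    by_level = (records.pivot_table(index="student_id", columns="level",
                                    values="credits", aggfunc="sum")
                       .fillna(0))

    # Loose definition: any enrollment at both levels counts as attending both
    attended_both = (by_level["2yr"] > 0) & (by_level["4yr"] > 0)

    # Adelman-style definition: at least 10 credits earned at each level
    adelman_transfer = (by_level["2yr"] >= 10) & (by_level["4yr"] >= 10)

    print(by_level.assign(attended_both=attended_both,
                          adelman_transfer=adelman_transfer))

Student 2 attends both levels but earns only 6 credits at the 4-year school, so the two definitions classify that student differently; depending on which one an analysis uses, the comparison groups (and plausibly the estimated effect) shift.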

Either way, this shows how tangled all this data is and how seemingly similar analyses of the same data can yield wildly different results.

2 comments:

  1. The issue of excluding data that is in the sample but doesn't fit criteria the researcher establishes for inclusion is interesting in itself. You say that Adelman requires at least 10 credits at each institution. Why? Is 10 some magic number here, or is it an artificial boundary? Or is this a proxy for some other issue that isn't getting as much attention in the analysis?

    For example, consider the question of when someone is a student versus when a person considers being a student but then decides to opt out. However you draw the boundary to answer that question, you will probably have to admit that it is a fuzzy boundary, not precise at all. If so, then one would want to know whether the results are sensitive to how the boundary is drawn.

  2. After rereading Adelman, I see a few reasons for the 10-credit requirement. I agree a large part is to eliminate those who briefly attend an institution and then quickly drop out, though this can be hard to differentiate. If a student drops out two weeks in, then I think not counting the subject as an attendee is wise. On the other hand, if a student stays a semester or more but fails enough classes that total credits end up below 10, I think s/he probably should be counted.

    Another reason seems to be to distinguish summer school from actual transfer. It is also in part to exclude high school students who earned some college credit, since Adelman does not want to count them as transfers.
