Evaluations from the Summer 2014 CMGPD Workshop at SJTU

We held the 4th CMGPD summer workshop at Shanghai Jiaotong University this summer. As usual, we conducted a survey at the end to get student feedback. I'm a fan of making student evaluations public, so I have uploaded the scanned forms via the link below:

Anonymous questionnaires (匿名问卷)

Overall, I was pleased with the results of the workshop. We have always had good participants. This year, however, we were fortunate to have an especially large share of participants who were interested in historical topics and had some facility with quantitative methods. In previous offerings, participants were often one or the other.

The students made presentations on the last day with preliminary results and I think that with some more work, many of them can be turned into papers.

Student evaluations for SOSC 1860 and SSMA 5010, Fall 2013


I received student evaluations for the two courses that I taught last fall, SOSC 1860 (Population and Society) and SSMA 5010 (Research Methods).

The former is a general education (Common Core in HKUST parlance) course aimed at freshmen and sophomores, while the latter is a required course in our self-taught Social Science MA program. I enjoyed teaching both courses. The students were bright and highly motivated.

Here are the evaluations for SOSC 1860.

I was initially surprised to read that the students in SOSC 1860 thought I required too much work, but eventually concluded this probably reflects that they have less prior exposure to open-ended written assignments and projects than students I have taught elsewhere. In fact, the course was a simplified version of an upper-division course I taught regularly at UCLA that was only ten weeks long (versus thirteen here) yet had even more written assignments and reading. The assignments mostly required them to visit some websites to collect demographic data, and then write about trends and patterns. The final project required them to carry out an analysis at IPUMS.

Talking to students here, it seems that they found the relatively open-ended assignments intimidating. The students here are just as smart and motivated as the ones I taught at UCLA, and they actually did a good job on the assignments and their final projects, so I suspect their reaction may have more to do with a lack of familiarity or confidence with open-ended written assignments than with any actual lack of ability. Several students I talked to said this was the first class they had ever taken that made such heavy use of written assignments. I will probably need to adjust the number of assignments next fall.

The evaluations for SSMA 5010 are unremarkable, and about what I expected. Some of the comments reflect that this was a new prep, and I will have to continue revising my course plan and the lecture slides. This is the first time I have taught a research methods course, and it was fun. The students were highly motivated and engaged, making it a relatively pleasant task.


Evaluations from my summer 2013 short course in Social Demography at SJTU

I received the evaluations from my summer 2013 short course in Social Demography at Shanghai Jiaotong University.  This undergraduate course is an abbreviated version of the one that I have taught at UCLA in the past, and am teaching at HKUST now.  If I understand the scores correctly, I don’t seem to have done too much damage.

Personally, I believe that aggregated information from teaching evaluations should be public, at least to students. This should also be combined with efforts to maximize response rates. I liked the system implemented at UCLA where the administration provided a list of students who had completed web-based evaluations in time for the instructor to award a small amount of credit toward the final grade. Obviously, all the administration provided was a list of names; it didn't include the content of the responses. We only saw the summary report on the evaluations and the collected written comments from students after we turned in grades.

If you are having difficulty viewing the embedded Excel spreadsheet, you can download it here.

You can view other entries where I have posted the class evaluations.

Course evaluations from Sociology 181B (Contemporary Chinese Society) in Spring 2013

I just submitted final grades for my class on contemporary Chinese society.  Doing so unlocked the results for my online course evaluations.

Sociology 181B Spring 2013 Evaluations

The response rate this year was 88%, a substantial improvement over the old bubble forms that students filled out in the last week of class.  As in winter quarter, I gave the students who had completed evaluations a small number of points as an incentive.  I was able to do this because when the online evaluation period closes, I receive a list of students who have completed evaluations, though of course nothing about their actual responses.

The response rate is a huge improvement over last spring’s offering of Sociology 181B, in which only 7 of 32 students completed online evaluations.  That quarter was the first in which I used online evaluations, and I didn’t offer any grade incentive to students who completed the evaluations.  Comparison of the response rates between the two offerings suggests that offering an incentive, albeit a very small one, does make a big difference.

Looking at the evaluations, there weren’t many surprises.

Medians on the scaled responses reflecting general feelings about the course were all 8 ("The instructor was concerned about student learning", "Your overall rating of the instructor", "Your overall rating of the class", etc.) and the means were between 7 and 8.  This usually seems to be where I end up.  Maybe I could get means or medians over 8 if I did something really entertaining, like wear period-appropriate costume when talking about different points in time?

As usual, the students completing the evaluations appear to have overestimated their final grades, in the sense that the distribution of estimated grades is (as it always seems) superior to the distribution of actual grades.  I don't know why this is always the case.  I post scores on assignments, the midterm, and so forth, and have a fairly clear scale for assignment of grades.  I'm not an especially harsh grader, but neither am I prone to being as generous as the distribution of expected grades suggests.

The written comments are about what I expected.

Some students find my lectures boring, but I don't particularly mind that.  I don't subscribe to the school of thought that we are supposed to be dancing bears, performing for the amusement and delight of the students.  The students do seem to agree that my lectures are clear and well-organized, which is what matters.

I'm still trying to figure out how to make better use of the i>clicker in a course like this.  I liked using the i>clicker in my social demography course in winter 2013, because it allowed for quick, anonymous surveys on issues related to population.  For example, when we discussed fertility trends in the United States, I could ask people how many children they hoped or expected to have, and so forth.  But for a class on Chinese society, I am still wrestling with how to use the i>clicker to animate class discussion.

And I am still trying to figure out how to stimulate more class discussion.  The class was small enough that we could have had more discussion.  There was some, because at least a few students did like to contribute to discussion, but overall it wasn’t what I would have liked.

Another lingering puzzle is the generally low enrollment for a class on contemporary Chinese society.  I would have thought that at a large university like UCLA with a lot of students who have broad, international interests, there would be large enrollments.  In the past, my contemporary Chinese society courses regularly drew 75-100 students.  Lately, however, the enrollments have been lower than I would have liked or expected.  Go figure.

Evaluations from my spring 2012 Chinese Society (Sociology 181B) course at UCLA

While I’m posting course evaluations, here is the final report from my on-line student evaluations from the course on Chinese society that I taught at UCLA this spring, Sociology 181B.  The response rate, 7 out of 32, was the lowest response rate I have ever seen for an undergraduate class.  The evaluations seem to be OK, but given the low response rate, I don’t know what to make of them.  I guess I should be glad that students didn’t take advantage of the anonymity offered by the internet to vent their rage against me.

I suppose this has something to do with the transition from paper to on-line evaluations.  In the past, the evaluations were on paper, and students filled them out during class at some point in the last week or two.  Of course, the TA and I left the room when they did.  The response rate was generally somewhat higher.  For example, when I taught Social Demography in Fall 2011, I had a whopping 37.7 percent response rate: 26 of 69 students turned in their forms.  I never did figure out why the response rate on the paper forms was as low as it was, because it always seemed like there were many more students in the room with pencils ready on the days when forms were distributed than there were forms turned in.  Maybe some students who looked like they were ready to fill out a form ended up giving up and focusing on updating their Facebook status, or playing Minecraft.

It does look like we need to work on improving the response rates for on-line course evaluations.  Maybe the university should delay students’ online access to their final grades unless they complete their evaluations in a timely fashion?

If we could increase the response rate, then perhaps it would be possible for the university to carry out more sophisticated analysis that takes advantage of the possibility of internal linkage to other data.  For example, it should be possible to break the evaluations down by students’ overall or major GPA, or perhaps even their class grade, and compare how top students evaluate the class with how other students evaluate the class.  At least for large lecture courses with a high response rate, this could be done internally without affecting the anonymity of the respondents, and results presented to faculty in aggregated form.

Evaluations from my summer short course on Social Demography at Shanghai Jiaotong University

I taught an undergraduate course in Social Demography this summer at Shanghai Jiaotong University.  The university introduced a short summer semester this year.  As I understand it, at least part of the reason was to give undergraduates more opportunity to take courses with faculty like myself who have visiting appointments, and are only there during the summer.  The short semester was one month long, and immediately followed the end of the spring semester.

I just received the summary of student evaluations from the short course I taught at Shanghai Jiaotong University this summer.  I embedded the spreadsheet below.  I don't know how they compare with other courses there.  At first glance, they don't seem disastrous, which is always a relief.  I was pleased that even though the students seemed to think the homework load rather heavy, they didn't seem to hold it against me as an instructor.

There were some crossed wires, so some students enrolled without knowing that I would be teaching in English.  I had provided a course description and syllabus that specified English as the language of instruction, but as far as I can tell, that was not widely disseminated to students when they were making their choices for the short semester.  I did try to summarize main points in Chinese when people's expressions suggested an unusual level of confusion.  I do admire the students with limited English who stuck with it and plugged away and ended up doing reasonably well.  I suppose I could have lectured in Chinese, but it probably would have been painful for the students to listen to my Chinese.  More importantly, many of them plan to go abroad for graduate school, so I thought that they might prefer a relatively short and painless taste of what an English-language course would be like.

I was really impressed with the students.  Shanghai Jiaotong University is one of the best science and engineering schools in China.  It attracts very smart and ambitious students.

In terms of engagement, the students were much like a typical undergraduate class at UCLA.  One-quarter to one-third of the class routinely sat at the front of the room, were very engaged, listened attentively, raised questions, and expressed opinions.  Perhaps another one-third tended to sit in the middle of the room, and paid attention and took notes, but didn't participate much in discussion.  And of course, just like UCLA or probably any other university, there were the students who sat in the back of the class, had their laptops open and connected to the campus internet, and I suppose were on social networks or playing Minecraft or World of Warcraft.  I can't be too harsh on such students since when I was an undergraduate at Caltech, I was one of them.  We didn't have laptops to bring to class, or internet connectivity, so when I attended lecture, I usually sat in the back and doodled.

For the final project, the students had to use the IPUMS site to do a basic analysis of some aspect of American population.  Since there were only four weeks, the projects couldn't be as ambitious as some of the projects that my undergraduates at UCLA attempt.  That said, most of the students produced reasonably competent analyses on a subject of their choice, mostly consisting of tabulations.  Some ambitious students attempted regression analysis, and one team of economics students downloaded data and estimated quantile regressions to model wages.  Another student, a physics undergraduate, compiled marriage statistics from IPUMS, and then wrote code in Matlab to estimate a non-linear regression to fit Coale and McNeil's marriage model to contemporary American marriage patterns.  I couldn't really follow the explanation after the student introduced tensors, but it looked pretty good.
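For readers curious what that last project involves, here is a minimal sketch in Python rather than Matlab, using the commonly cited parameterization of the Coale-McNeil (1972) first-marriage schedule (constants 0.19465, 0.174, 0.2881, 6.06); the synthetic "observed" frequencies stand in for the student's actual IPUMS tabulations, which I don't have:

```python
# Sketch: fitting the Coale-McNeil first-marriage model by nonlinear
# least squares. In this parameterization C is the proportion ever
# marrying, a0 the age at which first marriages begin, and k the pace
# of marriage relative to the standard schedule.
import numpy as np
from scipy.optimize import curve_fit

def coale_mcneil(age, C, a0, k):
    """First-marriage frequency g(a) under the Coale-McNeil standard."""
    x = (age - a0) / k
    return (C / k) * 0.19465 * np.exp(-0.174 * (x - 6.06)
                                      - np.exp(-0.2881 * (x - 6.06)))

# Illustrative only: generate noiseless "observed" frequencies from
# known parameters, then recover them with curve_fit.
ages = np.arange(15, 50, dtype=float)
observed = coale_mcneil(ages, 0.90, 15.0, 1.7)

popt, _ = curve_fit(coale_mcneil, ages, observed, p0=[0.8, 14.0, 2.0])
C_hat, a0_hat, k_hat = popt
```

With real data one would tabulate first-marriage frequencies (or differences in proportions ever married) by single year of age from the census extracts and fit those instead of the synthetic series.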

The most popular topics for the student projects were marriage and divorce patterns, especially by education, and educational and occupational attainment of immigrants, especially Chinese-Americans.  In discussion and in written work, the students generally displayed a relatively mature and sophisticated understanding of contemporary American society.  Probably they were most surprised by the regional divides in socioeconomic and demographic outcomes, and the probably not unrelated regional divides in religious, political, and social orientations.  To the extent the students had anything wrong, it was that they assumed that the entire country was very liberal and open-minded in terms of social attitudes, and weren’t really aware of how socially conservative very large swathes of the country actually are.

I was pleased by how many of the students told me they were not only from outside Shanghai, but from small towns or rural areas in the interior provinces.  I ran into some of them after the last class, and many of them were about to embark on two- or three-day hard-sleeper train rides to return home.  As is the case at UCLA, many students were first-generation college students, or from otherwise humble origins.  It was a nice reminder of one of the distinguishing features of Caltech, and indeed the UC schools, which among top research universities are all distinguished by their relative (emphasis here on relative!) accessibility to students from modest origins.

Course evaluations from fall 2011 offering of Social Demography (Sociology 116)

I decided to start posting the summary sheets from my course evaluations, if only to demonstrate my commitment to transparency. I’m beginning with the summary sheet from my offering of Social Demography last fall, that is fall 2011. This is an upper division undergraduate lecture course that surveys key issues in population and demography. It has a little bit of everything, including relationships between population and resources (Malthus vs. Boserup), the demographic transition, marriage and divorce, fertility, and migration. This year I increased the number of exercises that required students to visit the IPUMS website to collect data, and added a final project in which students conceived of and executed a simple analysis of a demographic pattern or trend of interest to them by constructing tables at IPUMS.

I would like to think that the course evaluations are a somewhat more systematic assessment of student reactions to the course than comments made at websites devoted to rating professors.  That said, the response rate was low.  Only 26 of the 69 students enrolled in the course bothered to turn in evaluations.  Attendance that day was close to normal, so I guess many students did not feel strongly enough one way or the other about the course.  The resulting ratings were par for the course for me.  Reading the written comments, there was the typical spread in reactions, with some students praising me for the clarity and organization of my lectures, and others claiming I was hard to follow and prone to tangents.
As usual, the distribution of students' expected grades differed from the final distribution of grades.  I'm not sure if it is because the students who completed the evaluations were overly optimistic about their prospects, or because they were motivated students who did indeed do well in the class.  The class is structured to reward students' time input as demonstrated by their performance on exams and assignments.  Here is the actual distribution of grades…
Out of curiosity, I went back and looked at the evaluations from my offering of the class the previous year, in fall 2010.  That summary sheet is included below.  That year I was assigned a much larger room, so the enrollment was much larger, with 124 students completing the class.  That year we were unable to find two TAs for the course, so we cancelled section, and I made do with a very dedicated and able reader.  One might have expected the fall 2011 offering to have better evaluations than fall 2010 because the class was smaller in fall 2011 and I had a TA who held section every week, but the medians on all the items are identical.  Go figure.  The fall 2010 evaluations are below…
The question remains: how do we measure effectiveness in the classroom?  Student evaluations certainly have an important role, but they do have their limitations.  Maybe at some point in the future we can have rigorous peer review of teaching by videotaping lectures for viewing by the anonymous members of the committees that write our evaluations for the merit review?  I have a feeling that would not be very popular.
And I always wonder if we put too much emphasis on lectures when thinking about the impact of classes.  My own sense is that a lot of student learning is actually via completion of exercises, and that well-designed exercises are as important to learning as clear, well-organized lectures.  They may be more important.  When I think back to my own years as an undergraduate, the classes that I ended up appreciating the most were the ones where the exercises were carefully designed, and guided students step by step to an understanding of the material.  The quality of the lectures seemed less important.  There were people who gave very entertaining and engaging lectures whom I probably evaluated very highly, but from whom in retrospect I probably learned very little.  And there were other people whose lectures were clear and well-organized but not especially scintillating, from whom in retrospect I learned a great deal.