Do you have the power?


I was recently contacted by the author of the blog PsychBrief (@PsyBrief) about a post they were writing on our recent article, “Value added or misattributed? A multi-institution study on the educational benefit of labs for reinforcing physics content,” in the journal Physical Review Physics Education Research. They were using the article as an example of how to handle selection biases in education and psychology research.

I was very interested to see a thorough power analysis associated with our study. Given that we found a null result, how do we know we aren’t just missing a subtle, yet interesting, effect? This is something I often try to teach in our intro physics labs, so I was very excited to see how the analysis would pan out. It turns out that, given our sample size, we had more than enough power to detect some pretty small effect sizes, and yet saw none. (Phew!) I highly recommend checking out the description of the analysis on the PsychBrief blog here. I certainly plan to refer to the analysis there next time I find myself dealing with null effects!
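The blog post above doesn’t reproduce the numbers here, but the logic of this kind of sensitivity analysis can be sketched with a standard normal-approximation formula. The group size below is purely illustrative, not the study’s actual sample:

```python
# A minimal sensitivity-analysis sketch: for a two-sample comparison
# with n participants per group, the smallest standardized effect size
# (Cohen's d) detectable with a given power at significance level alpha
# is approximately d = (z_{1-alpha/2} + z_{power}) * sqrt(2 / n).
from math import sqrt
from scipy.stats import norm

def min_detectable_d(n_per_group, alpha=0.05, power=0.80):
    """Approximate smallest Cohen's d a two-sided two-sample test can detect."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for a two-sided test
    z_power = norm.ppf(power)           # quantile for the desired power
    return (z_alpha + z_power) * sqrt(2 / n_per_group)

# With a (hypothetical) 1000 students per group, even a very small
# effect of d ~ 0.13 would have been detectable with 80% power.
print(min_detectable_d(1000))
```

If the computed minimum detectable effect is well below anything educationally meaningful, a null result is informative rather than simply inconclusive — which is the point of the analysis on PsychBrief.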


New adventures!


I am absolutely thrilled to announce that my partner and I are taking on tenure-track assistant professor positions at Cornell University! It has been a long journey to get here, but we are beyond ecstatic to be pursuing this new adventure together. Here are some things I’m particularly excited about:

The job

I am being hired into a tenure-track assistant professor position as a physics education researcher. The department has shown great commitment to treating this like any other physics hire. They have been upfront about what we will need to figure out over time, but have otherwise set clear and reasonable expectations. They have also provided support in a variety of ways to help me succeed at meeting those expectations. In particular:

Cornell’s Active Learning Initiative

In the past few years, Cornell has been piloting an “Active Learning Initiative”, partially modeled after the Science Education Initiatives at CU-Boulder and UBC. When I visited Cornell in early April, I met with a number of faculty in a range of STEM disciplines who were working on transforming, improving, and evaluating their teaching practices. The introductory physics sequence, in general, has been transformed with a variety of interactive engagement activities, with at least half a dozen faculty taking part. They even have a beautiful new “Learning Suite” space with flexible classrooms and open study space (see press release).

Fortunately, the faculty involved in the initiative seem eager for collaboration on evaluating the success of their courses, paving the way nicely for a PER group to join in.

Even more fortunately, they transformed the lectures and discussion sections, leaving the labs unchanged — for now 🙂

The two body solution

My partner and I have been living apart for almost 4 years now. We met during grad school at UBC and he, being 3 years ahead of me, moved to New Mexico for his postdoc at the Los Alamos National Lab. Two years later, the postdoc opportunity for me at Stanford was too good to pass up. Personally, it would at least put us in the same country. This meant that we both pursued our ideal positions professionally, but Job Season 2016 began with a strong desire [understatement] to get ourselves into the same city.

We are so thankful to have found an incredible two-body solution – with a strong condensed matter program for him and groundwork laid for a PER hire in me.

Moving forward

We will begin our positions in January, leaving us lots of time in the meantime to finish up existing projects and even take some much-needed vacation time.

After almost 8 years on the west coast, my family is excited to have me within driving distance. Not to mention that Ithaca is in wine country, the Finger Lakes region offers mountain biking and hiking opportunities, and I’m a 3-hour drive from Broadway! Who could ask for anything more?

More from productive discussions at Aarhus University


Thinking about the big picture

I cannot overstate how valuable it is to talk to people in other disciplines, at other schools, with other perspectives. I was challenged yesterday to think about the big picture of my work and philosophy on labs, and to really step back to see the forest for the trees. What I have come to is this:

The way we teach in labs is not new. There is lots of fundamental research that tells us that the components that we used are all helpful and productive for learning. The key features are as follows:

  • Students make comparisons
  • They reflect on them and make sense of them
  • Then they decide what to do about them (how to act), ultimately leading to new decisions

Figure from Holmes, et al. (2015) in PNAS with “Decisions” added to the center to reinforce the student autonomy involved.

The importance of making comparisons has been well documented (e.g., Gick & Holyoak, 1980; Bransford et al., 1989; Bransford & Schwartz, 1999). The importance of revising and iterating with feedback has been well documented (e.g., Ericsson et al., 1993; Schwartz et al., 1999 — I should have more references than this… hmm…). Clearly, combining the two is going to be helpful.

And while we focused very specifically on the sorts of comparisons that introductory physics students make with data and models, these comparisons could take a number of forms. For example, students could come up with multiple experimental designs, compare them, determine which is best, and then try one of them. Of course, they then have to evaluate the outcome of acting on it to determine whether it worked as well as expected (comparing to expectations or predictions). This may then lead to trying the other design or modifying it in a small way. The initial comparison, however, can happen before students take any data in a lab, which can be beneficial when iterating and repeating measurements is costly (e.g., non-reusable materials in chemistry or biology labs). Another example would be to compare designs with another group, have each group try its design, and then compare how well they worked to determine the optimal design.

Comparing to predictions is also useful, but the problem with most courses is that students stop at the comparison. They reflect, but often draw funny conclusions, such as “they disagree because of human error.” [What does that mean?!] Encouraging students to then test that interpretation (or act on it in some way) is going to make those comparisons much more meaningful.

I’m pretty sure I wrote about a lot of this in our papers, but I think I needed to be pushed on it to really come to believe or understand it.

Ideas and perspectives on inquiry labs from discussions at Aarhus University


My first day (morning) visiting the Centre for Science Education at Aarhus University has already elicited some very productive discussions and new perspectives.

The first is the notion of a continuum of inquiry.

One member of the group described how people often see labs as either cookbook (where the recipe for how to conduct the experiment is given, with no space for critical thinking or decision making by the students) or fully open inquiry (where students come up with their own goals, design, etc.). They then described how the framework for labs I presented shows that there is a continuum between those two extremes.

More recently, I’ve been thinking about lab course design based on the “cognitive task analysis” for experiments in physics, recently published by Carl Wieman (here). The cognitive task analysis presents a list of all the decisions and cognitive tasks that one must complete to conduct an experiment in physics. This list, merged with the notion of a continuum of inquiry, presents a very clear perspective for developing labs.

For each lab, one can choose which decisions to leave open to the students. No single lab needs to address all of the items; rather, each course (or experiment) can focus on developing the skills necessary to complete one of the tasks. In our SQILabs framework, we’ve been focusing on experimental design, data analysis, and evaluating results. In something like the ISLE labs, the focus also extends to determining the goals and criteria (which hypothesis are you going to test, for example, and what outcome will help you decide whether the hypothesis is correct).

This can also inform curricular design, such that by the end of the program, students can complete all of the tasks in, for example, a senior thesis project.

The second is the use of invention activities for experimental design.

Several members of the group have been working on evaluating and improving chemistry lab courses. In many of these, the fact that materials are not reusable (e.g., chemicals get used up, and then you need more chemicals) makes the iterative, repetitive cycling of experiments undesirable (and costly). We came up with the idea of having students iterate on the experimental design before they actually conduct the experiment. This can be done using the invention-as-preparation-for-future-learning framework.

That is, students can be given the research goal and equipment and then be tasked with coming up with an experimental design to meet that goal. They would work in groups to define all the different elements that need to be considered. Then, a class discussion can compare the various designs, ultimately leading to a best design, presumably the one the instructor intended for the experiment. Students can then be handed the regular cookbook protocol to follow, but now with a more fundamental understanding of why we perform each of those steps.

This maps right onto the invention activities we’ve been using to teach data analysis: students are given a problem and have to invent a way to solve it. Then we discuss the different approaches, ultimately leading to the expert solution (or equation). This type of activity has been shown to help students not only apply the equation, but also understand the various components of the solution and better recognize which components variations on the equation fail to address. It essentially breaks the black box apart into its various components and pieces.

Another option (or something to do in addition) would be for students to write, next to each step of the protocol, why we perform it and what it should accomplish, and to group the steps by the different phases of the process. This again forces them to make sense of each step, rather than following the instructions blindly. One could also get students to predict what they should see at each step. That way, they have comparisons in hand while conducting the experiment, allowing them to make sense of the outcomes as they get them and think on their feet.

One of the group members had used this approach (getting students to explain why we perform each step) and found that a 10–15 minute activity at the beginning of lab saved the students 1–2 hours during the lab, because they knew what was going on. The instructors also got students to assign who would do what at each stage to better balance the division of tasks. This was in response to observations (both in my work and theirs) that students distribute the work unevenly during the lab (often along gender or other demographic lines).

Priceless


[Image: slide with the student’s quote]

This quote came from a student interview I did this summer – after asking what the purpose of lab courses was, this is the response I got. I promise I did not ask or pay the student to say this.

Telling students what to do, but not how to do it


I’ve been spending a lot of time this summer speaking with undergrad students doing summer research projects. I’ve asked lots of questions about their research experiences, but have been sneaking in questions about their lab courses. Their ideas and opinions have been fascinating. In particular, a student recently gave a very clear description of what they want – both in their research and in their lab courses.

Tell me what to do, but let me figure out how to do it.

They made this comment first in relation to their research experience when I asked about independence and autonomy in their project. They agreed that it was beneficial for their research mentor to decide what they should be working on, since they, as early undergraduate students, did not have the breadth of knowledge to make decisions about what should be studied. They really appreciated that their mentors left them to their own devices to figure out how to get it done, though. Of course, this involves lots of feedback and support from their mentors, but they still have the autonomy to make their own decisions, see how/whether it works, and then iterate or ask for advice.

Then, in describing some of their favourite (and least favourite) lab courses, the same philosophy emerged. Tell them what they’re supposed to investigate, but let them investigate it!

I think this rings true with the way I’ve been designing lab activities, but they put it so eloquently that I thought I would share it.

They then started telling me that they wished labs taught them about critical thinking, scientific reasoning skills, etc. and this was all just music to my ears 🙂