New adventures!

I am absolutely thrilled to announce that my partner and I are taking on tenure-track assistant professor positions at Cornell University! It has been a long journey to get here, but we are beyond ecstatic to be pursuing this new adventure together. Here are some things I’m particularly excited about:

The job

I am being hired into a tenure-track assistant professor position as a physics education researcher. The department has shown great commitment to treating this like any other physics hire. They have been upfront about what we will need to figure out over time, but have otherwise set clear and reasonable expectations, and they have provided support in a variety of ways to help me succeed at meeting them. In particular:

Cornell’s Active Learning Initiative

In the past few years, Cornell has been piloting an “Active Learning Initiative”, partially modeled after the Science Education Initiatives at CU-Boulder and UBC. When I visited Cornell in early April, I met with a number of faculty in a range of STEM disciplines who were working on transforming, improving, and evaluating their teaching practices. Much of the introductory physics sequence has been transformed with a variety of interactive engagement activities, with at least half a dozen faculty taking part. They even have a beautiful new “Learning Suite” space with flexible classrooms and open study space (see press release).

Fortunately, the faculty involved in the initiative seem eager for collaboration on evaluating the success of their courses, paving the way nicely for a PER group to join in.

Even more fortunately, they transformed the lectures and discussion sections, leaving the labs unchanged — for now 🙂

The two body solution

My partner and I have been living apart for almost 4 years now. We met during grad school at UBC, and he, being 3 years older than me, moved to New Mexico for his postdoc at the Los Alamos National Lab. Two years later, a postdoc opportunity for me at Stanford was too good to pass up; personally, it would at least put us in the same country. We both got to pursue our ideal positions professionally, but Job Season 2016 began with a strong desire [understatement] to get ourselves in the same city.

We are so incredibly thankful to have found a two-body solution: a strong condensed matter program for him, and groundwork laid for a PER hire for me.

Moving forward

We will begin our positions in January, leaving us lots of time in the meantime to finish up existing projects and even squeeze in some much-needed vacation.

After almost 8 years on the west coast, my family is excited to have me within driving distance. Not to mention that Ithaca is in wine country in the Finger Lakes region, with mountain biking and hiking opportunities, and I’m a 3-hour drive from Broadway! Who could ask for anything more?

More from productive discussions at Aarhus University

Thinking about the big picture

I cannot overstate how valuable it is to talk to people in other disciplines, in other schools, with other perspectives. I was challenged yesterday to think about the big picture of my work and philosophy on labs, and to really step back and see the forest for the trees. What I have come to is this:

The way we teach in labs is not new. There is a lot of fundamental research telling us that the components we use are all helpful and productive for learning. The key features are as follows:

  • Students make comparisons
  • They reflect on them and make sense of them
  • Then they decide what to do about them (how to act), ultimately leading to new decisions
[Figure from Holmes et al. (2015) in PNAS, showing the compare-reflect-act cycle, with “Decisions” added to the center to reinforce the student autonomy involved.]

The importance of making comparisons has been well documented (e.g. Gick & Holyoak, 1980; Bransford et al., 1989; Bransford & Schwartz, 1999). The importance of revising and iterating with feedback has been well documented (e.g. Ericsson et al., 1993; Schwartz et al., 1999 — I should have more references than this… hmm…). Clearly, combining the two is going to be helpful.

And while we focused very specifically on the sorts of comparisons that introductory physics students make with data and models, these comparisons could take a number of forms. For example, students could come up with multiple experimental designs, compare them, determine which is best, and then try one of them. Of course, they then have to evaluate the outcome of acting on it (compare it to expectations or predictions) to determine whether it worked as well as expected, which may lead to trying the other design or modifying it in a small way. The initial comparison, however, can happen before students take any data in a lab, which can be beneficial when iterating and repeating measurements is costly (e.g. non-reusable materials in chemistry or biology labs). Another example would be for groups to compare designs with one another, each try their own design, and then compare how well the designs worked to determine the optimal one.

Comparing to predictions is also useful, but the problem with most courses is that students stop at the comparison. They reflect, but often draw funny conclusions, such as “they disagree because of human error.” [What does that mean?!] Encouraging students to then test that interpretation (or act on it in some way) is going to make those comparisons much more meaningful.
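
As a quick sketch of what acting on such a comparison can look like quantitatively (Python, with made-up numbers; the cutoff conventions below are illustrative, not from any particular course), one can compute the difference between measurement and prediction in units of the combined uncertainty and let the size of that ratio drive the next decision:

```python
import math

def t_prime(a, u_a, b, u_b):
    """Difference between two values, in units of their combined uncertainty."""
    return (a - b) / math.sqrt(u_a**2 + u_b**2)

# Made-up example: a measured g compared to the accepted value
g_meas, u_meas = 9.71, 0.06  # measurement and its uncertainty
g_acc, u_acc = 9.81, 0.0     # accepted value, treated as exact

t = t_prime(g_meas, u_meas, g_acc, u_acc)
print(f"t' = {t:.1f}")
# Roughly: |t'| < 1 suggests agreement within uncertainty; |t'| > 3 suggests
# a genuine disagreement worth acting on (improve the measurement, hunt for
# a systematic effect, or question the model), not "human error".
```

A student who gets t′ ≈ −1.7 here has something concrete to do next: take more data, reduce the uncertainty, or start looking for a systematic effect.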

I’m pretty sure I wrote about a lot of this in our papers, but I think I needed to be pushed on it to really come to believe and understand it.

Ideas and perspectives on inquiry labs from discussions at Aarhus University

My first day (morning) visiting the Centre for Science Education at Aarhus University has already elicited some very productive discussions and new perspectives.

The first is the notion of a continuum of inquiry.

One member of the group described how people often see labs as either cookbook (where the recipe for how to conduct the experiment is given, with no space for critical thinking or decision making by the students) or fully open inquiry (where students come up with their own goals, design, etc.). They then described how the framework for labs I presented shows that there is a continuum to that inquiry.

I’ve also been thinking about lab course design based on the “cognitive task analysis” for experiments in physics recently published by Carl Wieman (here). The cognitive task analysis presents a list of all the decisions and cognitive tasks that one must carry out to conduct an experiment in physics. This list, merged with the notion of a continuum of inquiry, can present a very clear perspective for developing labs.

Each lab can focus on the decisions you want to leave open to the students. No single lab needs to address all of the items; rather, each lab (or experiment) can focus on developing the skills necessary to complete one of the tasks. In our SQILabs framework, we’ve been focusing on experimental design, data analysis, and evaluating results. In something like the ISLE labs, the focus also extends to determining the goals and criteria (which hypothesis are you going to test, for example, and what outcome will help you decide whether the hypothesis is correct).

This can also inform curricular design, such that by the end of the program, students can complete all of the tasks in, for example, a senior thesis project.

The second is the use of invention activities for experimental design.

Several members of the group have been working on evaluating and improving chemistry lab courses. In many of these, the fact that materials are not re-usable (e.g. chemicals get used up and then you need more chemicals) makes the iterative, repetitive cycling of experiments undesirable (and costly). We came up with the idea of iterating on the experimental design before students actually conduct the experiment. This can be done using the “invention as preparation for future learning” framework.

That is, students can be given the research goal and equipment and then be tasked with coming up with an experimental design to meet that goal. They would work in groups to define all the different elements that need to be considered. Then a class discussion can compare the various designs, ultimately leading to a best design, presumably the one the instructor intended for the experiment. Students can then be handed the regular cookbook protocol to follow, but now with a more fundamental understanding of why we perform each of those steps.

This maps directly onto the invention activities we’ve been using to teach data analysis. Students are given a problem and have to invent a way to solve it. Then we discuss the different approaches, ultimately leading to the expert solution (or equation). This type of activity has been shown to help students apply the equation, but also to understand the various components of the solution and to better recognize faults in variations on the equation (and which components those variations fail to address). It essentially breaks the black box apart into its components and pieces.
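
To make “the expert solution (or equation)” concrete, here is a minimal sketch (Python; the target equation is my illustrative choice, not necessarily the one these particular activities use) of a classic data-analysis invention target, the standard uncertainty in the mean of repeated measurements:

```python
import numpy as np

# Made-up repeated measurements of the same quantity
trials = np.array([2.45, 2.51, 2.48, 2.55, 2.43])

mean = trials.mean()
std = trials.std(ddof=1)          # sample standard deviation
sem = std / np.sqrt(len(trials))  # standard uncertainty in the mean

print(f"mean = {mean:.3f} +/- {sem:.3f}")
```

An invention activity would have students design a measure that rewards tightly clustered data and more trials before they ever see the sigma-over-root-N equation; the class discussion then surfaces why each component is there.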

Another option (or something to do in addition) would be for students to explain, next to the protocol, why we perform each step and what it should accomplish, and to group the steps by different phases of the process. This again forces them to make sense of each step, rather than following the instructions blindly. One could also get students to predict what they should see at each step. That way, they have comparisons in hand while conducting the experiment, allowing them to make sense of the outcomes as they get them and think on their feet.

One of the group members had used this approach (getting them to explain why we perform each step) and found that a 10-15 minute activity at the beginning of lab saved the students 1-2 hours doing the lab, because they knew what was going on. The instructors also got students to assign who was going to do what at each stage to better balance the division of tasks. This was in response to some observations (both in my work and theirs) that students distribute the work unevenly during the lab (often based on gender or other demographics).

Priceless

[Image: a slide with the student quote described below.]

This quote came from a student interview I did this summer: after I asked what the purpose of lab courses was, this is the response I got. I promise I did not ask or pay the student to say this.

Telling students what to do, but not how to do it

I’ve been spending a lot of time this summer speaking with undergrad students doing summer research projects. I’ve asked lots of questions about their research experiences, but have been sneaking in questions about their lab courses. Their ideas and opinions have been fascinating. In particular, one student recently gave a very clear description of what they want – both in their research and in their lab courses:

Tell me what to do, but let me figure out how to do it.

They made this comment first in relation to their research experience when I asked about independence and autonomy in their project. They agreed that it was beneficial for their research mentor to decide what they should be working on, since they, as early undergraduate students, did not have the breadth of knowledge to make decisions about what should be studied. They really appreciated that their mentors left them to their own devices to figure out how to get it done, though. Of course, this involves lots of feedback and support from their mentors, but they still have the autonomy to make their own decisions, see how/whether it works, and then iterate or ask for advice.

Then, as they described some of their favourite (and least favourite) lab courses, the same philosophy emerged: tell them what they’re supposed to investigate, but let them investigate it!

I think this rings true with the way I’ve been designing lab activities, but I thought they put it so eloquently that I had to share it.

They then started telling me that they wished labs taught them about critical thinking, scientific reasoning skills, etc., and this was all just music to my ears 🙂

The calm before the storm…

The first big project of my postdoc begins next week, so I thought I’d squeeze in another blog post, since I expect the next 10 weeks to be crazy. For this post, I thought I’d just write about some of the things I’ve been thinking about lately, in an attempt to start some conversations and also just to get my thoughts down somewhere.

TA training and course development

The new project I’m working on has involved taking an existing course and restructuring it to use pedagogy and structure from my dissertation course. There are a couple neat challenges here, such as:

  1. The UBC course was 24 weeks long, this one is 10 weeks
  2. The UBC sections were 3 hours long with no pre-lab or homework, these ones are 2 hours long with pre-lab assignments
  3. The UBC course was graded, this one is pass/fail

Trying to fit 24 weeks of goals developed in 3-hour lab sections into 10 weeks and 2-hour labs has caused me to really evaluate what the important and necessary goals are. Hopefully this process will make our lab structure more generally useful to lab instructors. Depending on whether it works, I suppose.

I also spent several weeks testing each of the experiments to see if they were workable:

  • They need to work. None of this ‘human error’ stuff
  • Students need to be able to make precise and accurate measurements
  • Some of the experiments need opportunities for models to be revised when the data quality is high enough
  • They also need to highlight particular analysis tools or ideas (e.g. fitting, comparing pairs of measurements, different sources of uncertainty…); a quick sketch of the fitting idea follows this list
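
For the fitting item above, here is a minimal sketch (Python, with made-up pendulum numbers; the formulas are the textbook weighted least-squares results, e.g. from Taylor’s An Introduction to Error Analysis, not anything specific to this course) of an uncertainty-weighted linear fit:

```python
import numpy as np

def weighted_linear_fit(x, y, u):
    """Weighted least-squares fit of y = m*x + b, weighting each point by 1/u**2.

    Standard analytic formulas; returns (m, b, u_m, u_b).
    """
    x, y, u = map(np.asarray, (x, y, u))
    w = 1.0 / u**2
    delta = w.sum() * (w * x**2).sum() - (w * x).sum()**2
    m = (w.sum() * (w * x * y).sum() - (w * x).sum() * (w * y).sum()) / delta
    b = ((w * x**2).sum() * (w * y).sum() - (w * x).sum() * (w * x * y).sum()) / delta
    return m, b, np.sqrt(w.sum() / delta), np.sqrt((w * x**2).sum() / delta)

# Made-up pendulum data: T^2 vs. L, where the slope estimates 4*pi^2/g
L = np.array([0.2, 0.4, 0.6, 0.8, 1.0])        # lengths (m)
T2 = np.array([0.83, 1.64, 2.37, 3.25, 4.05])  # measured period^2 (s^2)
u = np.full_like(T2, 0.05)                     # uncertainty in each T^2 value

m, b, u_m, u_b = weighted_linear_fit(L, T2, u)
print(f"slope = {m:.2f} +/- {u_m:.2f} s^2/m, so g = {4 * np.pi**2 / m:.2f} m/s^2")
```

Comparing the fitted slope (and the g it implies) to expectations, and then deciding what to do about any disagreement, is exactly the sort of comparison cycle described in the posts above.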

Several labs got thrown out, and several got turned into two-week labs to give students enough time to explore the ideas. I expect more changes to happen throughout the course. As the goals of the course were refined, the lab documents started to come together. The first week of TA training, which was a 1.5-hour intro to the course, went along swimmingly! TAs seemed to buy in to what I was doing after a quick discussion about how experiments work and interact with theory. With only 5 TAs, it was also fairly easy to have productive and engaging discussions about the goals and content. Next week we get to dig into data and feedback.

I’m also dealing with the little things: creating a syllabus, posting it on the online course management system, and coordinating labs and pre-lab documents, for example. There are neat challenges when the course is also being used for research: needing to inform students about the research, figuring out how to coordinate access to students’ materials around TAs marking them, etc. All in all – a great learning experience so far!

Lab goals and structures

With the AAPT’s endorsement of new recommendations for lab goals and curriculum, I’ve been evaluating my previously held goals for labs. Compared with the AAPT’s 1998 document of the same nature, the new document is much more focused on developing skills (technical skills, modeling skills, data analysis skills…). My current opinion is that, while lab experiments should try to resemble authentic experimentation (that’s a loaded term, mind you; see “What is science?” below), we can’t drop novices into expert behaviours and activities. Labs should aim to develop the skills and behaviours slowly and deliberately, until students are ready to engage in real science experimentation. Even graduate students and postdocs rarely engage in a full experimental design process on their own (identifying research questions, designing the experiment from scratch, etc.). It’s always guided by a mentor or a team. There’s no need to get students to do that on their own while they’re still learning the basics of data and uncertainty.

How can we assess these goals?

My dissertation work involved lots and lots of coding and analyzing of students’ written lab notes. I looked at whether students iterated to improve their measurements or models during the lab, the sophistication of their analyses and interpretations of methods and results, whether they could identify physical assumptions of a model that had been violated, and whether and how they made and corrected measurement errors. While that was very fruitful for a PhD, coding lab books is not a useful or generalizable assessment for people looking to evaluate their labs. So I’ve been trying to figure out whether there’s an FCI-like way of assessing labs. There are several validated lab-related assessments, such as the Concise Data Processing Assessment (CDPA) or the Physics Measurement Questionnaire (PMQ). The CDPA, however, is hard and measures conceptual understanding of specific data-handling concepts. The PMQ is open-ended and requires the instructor/researcher to code student responses. I’m currently working to develop an assessment somewhere between these two, one that also includes a focus on modeling through experimentation. It will be piloted next week! Stay tuned!

What is science?

Discussions with supervisors and colleagues have sent me down a rabbit hole of philosophy of science. In trying to disentangle whether the activities students do in the lab are actually “authentic science,” I made the mistake of asking what “authentic science” really is. Is it a noun or a verb? A set of facts and theories or a process for developing those facts and theories? [My current idea is that it’s both…] This, of course, has spun into much bigger questions that are only partially relevant to labs, but it’s been fun to think about anyway. My PhD advisor sent me to this fascinating 1978 documentary interview between Bryan Magee and Hilary Putnam about the philosophy of science, which I highly recommend. Please send along other favourite nature of science resources for me to check out!