SEEM 2018 accepted to ICSE!

SECM is now SEEM! And we are pleased to announce that our proposal, under the new name, the 2nd International Workshop on Software Engineering Education for Millennials, was accepted to ICSE 2018. The workshop will be held in Gothenburg, Sweden, on June 2, 2018. The SEEM 2018 web site launched recently. Organization is under way. Stay tuned for updates!

Teaching As Research

CMU’s Eberly Center for Teaching Excellence has outstanding resources to support faculty in their education research endeavors. They advocate an approach called Teaching as Research (TAR) that combines real-time teaching with on-the-fly research in education, for example to evaluate the effectiveness of a new teaching strategy while applying that strategy in a classroom setting.

TAR Workshops

The Eberly Center’s interactive TAR workshops help educators identify new teaching and learning strategies to introduce, or existing teaching strategies to evaluate, in their courses; pinpoint potential data sources; determine proper outcome measures; design classroom studies; and navigate ethical concerns and the Institutional Review Board (IRB) approval process. Their approach builds on seven parts, each addressing central questions:

  1. Identify a teaching or learning strategy that has the potential to impact student outcomes. What pedagogical problem is the strategy trying to solve?
  2. What is the research question regarding the effect of the chosen strategy on student outcomes? In other words, what do you want to know about it?
  3. What teaching intervention is associated with the strategy that will be implemented in the course as part of the study design? How will the intervention incorporate existing or new instructional techniques?
  4. What sources of data (i.e., direct measures) on student learning, engagement, and attitudes will the instructors leverage to answer the research question?
  5. What study design will the instructors use to investigate the research question?  For example, will collecting data at multiple times (e.g., pre- and post-intervention) or from multiple groups (e.g., treatment and control) help address the research question?
  6. Which IRB protocols are most suitable for the study? For example, different protocols are available depending on whether the study relies on data sources embedded in normally required course work, whether student consent is required for activities not part of the required course work, and whether any personal information, such as student registrar data, is needed.
  7. What are the actionable outcomes of the study? How will the results affect future instructional approaches or interventions?

After reviewing relevant research methods, literature, and case studies in small groups to illustrate how the above points can be addressed, each participant identifies a TAR project. The participants have a few months to refine and rethink their projects, after which the Eberly Center staff follow up to develop a concrete plan in collaboration with the faculty member.

Idea

I teach a graduate-level flipped-classroom course on Foundations of Software Engineering with my colleague Cécile Péraire. We have been thinking about how to better incentivize the students to take assigned videos and other self-study materials more seriously before attending live sessions. We wanted them to be better prepared for live-session activities and to improve their uptake of the theory throughout the course, but we had little idea how effective the self-study videos and reading materials were. One suggestion from the Eberly Center folks was to use low-stakes assessments with multiple components, which seemed like a good idea (and a lot of work). Cécile and I set out to implement this idea in the next offering, but we also wanted to measure and assess its impact.

Our TAR project

Based on the above idea, our TAR project is summarized below in terms of the seven questions.

  • Learning strategy: Multi-part, short, low-stakes assessments composed of an online pre-quiz taken by each student just before reviewing a self-study component, a matching online post-quiz completed right after reviewing that component, and an online in-class quiz on the same topic taken at the beginning of the next live session. The in-class quiz is immediately followed by a plenary session to review and discuss the answers. The assessments are low-stakes in that a student’s actual quiz performance (as measured by quiz scores) does not count towards the final grade, but taking the quizzes is mandatory and each completed quiz counts towards a student’s participation grade.
  • Research question: Our research question is also multi-part. Are the self-study materials effective in conveying the targeted information? Do the low-stakes assessments help students retain the information given in self-study materials?
  • Intervention: The new intervention here consists of the pre- and post-quizzes. The in-class quiz simply replaces and formalizes an alternative technique, based on online polls and an ensuing discussion, used in previous offerings.
  • Data sources: Low-stakes quiz scores, exam performance on matching topics, and basic demographic and background information collected through a project-team formation survey (already part of the course).
  • Study design: We used a repeated-measures, multi-object design that introduces the intervention (pre- and post-quizzes) to a pseudo-randomly determined, rotating subset of students. The students are divided into two groups each week: the intervention group A and the control group B. The groups are switched in alternating weeks, so each student ends up receiving the intervention in alternate weeks only, as shown in the figure below. The effectiveness of the self-study materials will be evaluated by comparing pre- and post-quiz scores. The effectiveness of the intervention will be evaluated by comparing the performance of the control and intervention groups on in-class quizzes and on related topics of the course exams. (A small sketch of the rotation and the two comparisons appears after this list.)

  • IRB protocols: Because the study relies on data sources embedded in normally required course work (with the new intervention becoming part of normal course work), guarantees anonymity and confidentiality, and requires students only to consent to their data being used in the analysis, we used an exempt IRB protocol that applies to low-risk studies in an educational context. To be fully aware of all research compliance issues, we recommend that anyone pursuing this type of inquiry consult with the IRB office at their institution before proceeding.
  • Actions: If the self-study materials turn out to be inadequately effective, we will look for ways to revise them and make them more effective, for example by shortening them, breaking them into smaller bits, adding examples or exercises, or converting them to traditional lectures. If the post-quizzes do not appear to improve retention of the self-study materials, we will consider withdrawing the intervention and trying alternative incentives and assessment strategies. If we get positive results, we will retain the interventions, keep measuring, and fine-tune the strategy with an eye to further improving student outcomes.
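
To make the study design concrete, here is a minimal sketch, in Python, of the alternating-week rotation and the two planned comparisons. The function names, student ids, and quiz scores are all hypothetical, and this is not our actual tooling; it simply illustrates, under toy assumptions, the kind of analysis we have in mind.

```python
# A minimal, hypothetical sketch of the group rotation and the two comparisons.
# Names, ids, and scores are made up for illustration; this is not our tooling.
import random

from scipy.stats import ttest_ind, ttest_rel


def assign_groups(student_ids, seed=2018):
    """Pseudo-randomly split the roster into two fixed groups, A and B.

    Seeding the shuffle keeps the assignment reproducible across the semester.
    """
    roster = list(student_ids)
    random.Random(seed).shuffle(roster)
    half = len(roster) // 2
    return {"A": set(roster[:half]), "B": set(roster[half:])}


def intervention_group(week, groups):
    """Return the group that receives the pre-/post-quizzes in a given week.

    Here, arbitrarily, group A is treated in odd weeks and group B in even
    weeks, so each student receives the intervention in alternate weeks only.
    """
    return groups["A"] if week % 2 == 1 else groups["B"]


# Example rotation for a toy roster of four students.
groups = assign_groups(["s1", "s2", "s3", "s4"])
print("week 3 intervention group:", sorted(intervention_group(3, groups)))

# Toy quiz scores (percentages), keyed by student id.
pre_scores = {"s1": 40, "s2": 55, "s3": 35, "s4": 60}
post_scores = {"s1": 70, "s2": 80, "s3": 65, "s4": 75}

# Effectiveness of the self-study materials: paired (repeated-measures)
# comparison of each student's matching pre- and post-quiz scores.
students = sorted(pre_scores)
paired = ttest_rel([pre_scores[s] for s in students],
                   [post_scores[s] for s in students])
print(f"pre vs. post: t = {paired.statistic:.2f}, p = {paired.pvalue:.3f}")

# Effectiveness of the intervention: compare in-class quiz scores of the
# intervention and control groups for a given week (again, toy numbers).
in_class_intervention = [85, 78, 90, 72]
in_class_control = [70, 75, 68, 74]
between = ttest_ind(in_class_intervention, in_class_control)
print(f"intervention vs. control: "
      f"t = {between.statistic:.2f}, p = {between.pvalue:.3f}")
```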

Status

We are in the middle of conducting the TAR study. Our results should be available by early Spring. Stay tuned for a sneak peek.

Acknowledgements

We are grateful to Eberly Center staff members Drs. Chad Hershock and Soniya Gadgil-Sharma for their guidance and help in designing the TAR study. Judy Books suggested the low-stakes assessment strategy. The section explaining the TAR approach is drawn from Eberly Center workshop materials.

Further Information

For further information on the TAR approach, visit the related page by the Center for the Integration of Research, Teaching and Learning (CIRTL). CIRTL is an NSF-funded network for learning and teaching in higher education.

SECM 2017 took place in Buenos Aires, Argentina

The first workshop on Software Engineering Curricula for Millennials (SECM) was co-located with ICSE 2017 in Buenos Aires, Argentina. The full-day event took place on May 27 at the Pontificia Universidad Católica Argentina (UCA) with Diego Fontdevila as the featured keynote speaker. Cécile and I would like to thank you all for sharing your insights with everyone and for making this first gathering a success! The collective experience and knowledge of the group were truly impressive. I have posted snapshots from the workshop on the web site (thanks for sending your photos to me). Also, many thanks to the PC members for reviewing the submissions and to Nico Paez for helping with the local logistics.

Find the list of papers and workshop agenda on the SECM web site. You’ll also find a list of attendees and position slides from participants.

Here is a summary of the day by Nico Paez.

Our Twitter hashtag was #secm2017.

This beautiful artwork was created on the fly by Nayla Portas during Diego Fontdevila’s keynote. Great job!

How shall we proceed next based on the action items collected at the end of the workshop? Some concrete suggestions were:

  • Establish a communication channel – done (Google group SE-EDU)
  • Refactor the workshop web site to get it ready for the next event and create a community web site for doing whatever we need to do in the interim – done (SECM 2017 is now served at subdomain secm2017.se-edu.org, and the domain se-edu.org points to this web site)
  • Invite others who might be interested in joining to grow the community (need suggestions)
  • Address a few of the high-priority topics that we did not get a chance to discuss at the workshop, perhaps via blogposts on this web site (more on this below)
  • Change the name of the workshop series, before it becomes legacy, to something that everybody likes and that is a bit less pedantic (e.g., SE Education for Millennials)
  • Create a draft agenda for the next event (done thanks to Cécile, see this link)
  • Form a committee and submit a proposal to ICSE 2018 (ASAP)

Here is a list of high-priority topics identified at the workshop that could be addressed by attendees via blogposts on this web site:

  • Inverted curriculum (Nico P.)
  • Scaling up flipped classroom
  • Assessment and grading
  • Team formation
  • Teaching collaboration skills
  • Maintaining student attention, motivation
  • Replicating courses
  • Active learning (Hakan, observations from a recent CMU Eberly Center workshop)
  • Teaching skills / Book reviews (Diego, Nico P.,  perhaps a summary of classroom-tested strategies from the books mentioned at the workshop, including Teaching from the Back of the Room and Pedagogical Patterns)
  • Teaching as Research (TAR, basically running classroom studies to measure effectiveness of teaching strategies, Hakan)
  • Teaching ethics and sustainability in SE (Davide, Claudia)
  • Sustainable/holistic learning (Juraj, based on his Yoga & Human Development experience. Here is the movie he has shared with us in an email)

Suggestions for other topics are always welcome!

Finally, here are a number of millennial traits validated by students who attended the workshop, assembled by Cécile from her workshop notes:

  • Tech-savvy
  • Want personalized experiences
  • Lack team focus
  • Do not like toy examples
  • Need challenging goals
  • Favor practice over theory
  • Active learners
  • Easily bored
  • (Ineffective) Multi-taskers
  • Want immediate feedback
  • Super scared of failure (their goal is not to learn from failure 🙂)
  • Curious
  • Discovery driven
  • Socially aware
  • Visually focused (maybe?)

These traits stem from circumstances (the internet, social media, etc.) rather than from personal characteristics.