
Deputy Pro Vice-Chancellor, L&T Update

Re-thinking student evaluations

This post by Colleen Flaherty published earlier this year in Inside Higher Ed speaks to issues long debated amongst L&T leadership in the College.

Teaching Eval Shake-Up

Research is reviewed in a rigorous manner, by expert peers. Yet teaching is often reviewed only or mostly by pedagogical non-experts: students. There’s also mounting evidence of bias in student evaluations of teaching, or SETs — against female and minority instructors in particular. And teacher ratings aren’t necessarily correlated with learning outcomes.

All that was enough for the University of Southern California to do away with SETs in tenure and promotion decisions this spring. Students will still evaluate their professors, with some adjustments — including a new focus on students’ own engagement in a course. But those ratings will not be used in high-stakes personnel decisions.

The changes took place earlier than the university expected. But study after recent study suggesting that SETs advantage faculty members of certain genders and backgrounds (namely white men) and disadvantage others was enough for Michael Quick, provost, to call it quits, effective immediately.

‘I’m Done’

“He just said, ‘I’m done. I can’t continue to allow a substantial portion of the faculty to be subject to this kind of bias,’” said Ginger Clark, assistant vice provost for academic and faculty affairs and director of USC’s Center for Excellence in Teaching. “We’d already been in the process of developing a peer-review model of evaluation, but we hadn’t expected to pull the Band-Aid off this fast.”

While Quick was praised on campus for his decision, the next, obvious question is how teaching will be assessed going forward. The long answer is through a renewed emphasis on teaching excellence in terms of training, evaluation and incentives.

“It’s a big move. Everybody’s nervous,” Clark said. “But what we’ve found is that people are actually hungry for this kind of help with their teaching.”

SETs — one piece of the puzzle — will continue to provide “important feedback to help faculty adjust their teaching practices, but will not be used directly as a measure in their performance review,” Clark said. The university’s evaluation instrument also was recently revised, with input from the faculty, to eliminate bias-prone questions and include more prompts about the learning experience.

Umbrella questions such as, “How would you rate your professor?” and “How would you rate this course?” — which Clark called “popularity contest” questions — are now out. In are questions on course design, course impact and instructional, inclusive and assessment practices. Did the assignments make sense? Do students feel they learned something?

Students also are now asked about what they brought to a course. How many hours did they spend on coursework outside of class? How many times did they contact the professor? What study strategies did they use?

While such questions help professors gauge how their students learn, Clark said, they also signal to students that “your learning in this class depends as much on your input as on your professor’s work.” There is also new guidance about keeping narrative comments — which are frequently subjective and off-topic — focused on course design and instructional practices.

Still, SETs remain important at USC. Faculty members are expected to explain how they used student feedback to improve instruction in their teaching reflection statements, which continue to be part of the tenure and promotion process, for example. But evaluation data will no longer be used in those personnel decisions.

Schools and colleges may also use evaluations to gather aggregate data on student engagement and perceptions about the curriculum, or USC’s diversity and inclusion initiatives, Clark said. They may also use them to identify faculty members who do “an outstanding job at engaging students, faculty who may need some support in that area of their teaching, or problematic behaviors in the classroom that require further inquiry.”

Again, however, SETs themselves will not be used as a direct measure in performance evaluations.

More Than a Number

While some institutions have acknowledged the biases inherent in SETs, many cling to them as a primary teaching evaluation tool because they’re easy — almost irresistibly so. That is, it takes a few minutes to look at professors’ student ratings on, say, a 1-5 scale, and label them strong or weak teachers. It takes hours to visit their classrooms and read over their syllabi to get a more nuanced, and ultimately more accurate, picture.

Yet that more time-consuming, comprehensive approach is what professors and pedagogical experts have been asking for, across academe, for years. A 2015 survey of 9,000 faculty members by the American Association of University Professors, for instance, found that 90 percent of respondents wanted their institutions to evaluate teaching with the same seriousness as research and scholarship.

The survey gave additional insight into the questionable validity of SETs: two-thirds of respondents said these evaluations create pressure to be easy graders, a quality students reward, and many reported low rates of feedback.

Echoing other studies and faculty accounts, responses to the AAUP survey suggested that SETs have an outsize impact on professors teaching off the tenure track, in that high student ratings can mean a renewed contract — or not.

The AAUP committee leading the 2015 study argued that faculty members within departments and colleges — not administrators — should develop their own, holistic teaching evaluations. It also urged “chairs, deans, provosts and institutions to end the practice of allowing numerical rankings from student evaluations to serve as the only or the primary indicator of teaching quality, or to be interpreted as expressing the quality of the faculty member’s job performance.”

Faculty committees at USC also have worked to address teaching excellence for the past five years, recommending that the university invest more in teaching, adopt incentives for strong instruction, and move toward a peer model of review.

USC’s teaching evaluation plan reflects some of those recommendations — as well as a new emphasis on teaching excellence.

“We must renew our focus on the importance of teaching and mentorship, putting into place the systems necessary to train, assess, and reward exceptional teaching,” Quick, the provost, and Elizabeth Graddy, vice provost, said in a March memo to the faculty. “In short, let’s make USC the great research university that expects, supports, and truly values teaching and mentoring.”

Clark, at the campus Center for Excellence in Teaching, is helping USC put its money where its mouth is. She said its new model of peer evaluation involves defining teaching excellence and developing training for the faculty, from graduate students who will become professors to full professors.

Peer Review Instead

Peer review will be based on classroom observation and review of course materials, design and assignments. Peer evaluators also will consider professors’ teaching reflection statements and their inclusive practices.

Rewards for high-quality teaching will include grants and leaves for teaching development, as well as an emphasis on teaching performance in merit, promotion and tenure reviews, Clark said. Most significantly, thus far, the university has introduced continuing appointments for qualifying teaching-intensive professors off the tenure track.

Trisha Tucker, an assistant professor of writing and president of USC’s Dornsife College of Letters, Arts and Sciences Faculty Council, said different professors have had different reactions to the “culture shift.” But she said she applauded the institution’s ability to resist the “easy shorthand” of teacher ratings in favor of something more meaningful — albeit more difficult. (USC also has made clear that research and service expectations will not change.)

“It does take work to do this peer review,” she said, “but teaching is important, and it takes a lot of time and resources to make that more than just empty words.”

As writing is a feedback-driven process, Tucker said her program already emphasizes pedagogy and peer review. But professors in some other programs will have to adjust, she said.

“For the many faculty who haven’t been trained in this way or hired based on these expectations, it can produce some anxiety,” she said. So an important measure of this new approach’s success is how USC supports people who “initially fall short.”

Clark said the teaching center offers a model for peer review that individual programs will adjust to their own needs over the next year. That kind of faculty involvement in shaping peer review should make for a process that feels less “threatening” and more like an “investment in each other’s success,” she said.

In the interim, professors’ teaching will be assessed primarily on their own teaching reflections. And while the center avoids using words such as “mandatory” with regard to training, it is offering a New Faculty Institute, open to all instructors, for 90 minutes monthly over lunch for eight months. Sample topics include active learning, maximizing student motivation and effective, efficient grading practices.

Not Just USC

Philip B. Stark, associate dean of the Division of Mathematical and Physical Sciences and a professor of statistics at the University of California at Berkeley, has studied SETs and argued that evaluations are biased against female instructors in so many ways that adjusting them for that bias is impossible. He called the USC news “terrific.”

“Treating student satisfaction and engagement as what they are — and I do think they matter — rather than pretending that student evaluations can measure teaching effectiveness is a huge step forward,” he said. “I also think that using student feedback to inform teaching but not to assess teaching is important progress.”

Stark pointed out that the University of Oregon also is on the verge of killing traditional SETs and adopting a Continuous Improvement and Evaluation of Teaching System based on non-numerical feedback. Under the system, student evaluations would still be part of promotion decisions, but they wouldn’t reduce instructors to numbers.

Elements of the program already have been piloted. Oregon’s Faculty Senate is due to vote on the program as a whole this week, to be adopted in the fall. The proposed system includes a midterm student experience survey, an anonymous web-based survey to collect non-numerical course feedback to be provided only to the instructor, along with an end-of-term student experience survey. An end-of-term instructor reflection survey also would be used for course improvement and teaching evaluation. Peer review and teaching evaluation frameworks, customizable to academic units, are proposed, too.

“As of Fall 2018, faculty personnel committees, heads, and administrators will stop using numerical ratings from student course evaluations in tenure and promotion reviews, merit reviews, and other personnel matters,” reads the Oregon Faculty Senate’s proposal. “If units or committees persist in using these numerical ratings, a statement regarding the problematic nature of those ratings and an explanation for why they are being used despite those problems will be included with the evaluative materials.”

The motion already has administrative support, with Jayanth R. Banavar, provost, soliciting pilot participants on his website, saying, “While student feedback can be an important tool for continual improvement of teaching and learning, there is substantial peer-reviewed evidence that student course evaluations can be biased, particularly against women and faculty of color, and that numerical ratings poorly correlate with teaching effectiveness and learning outcomes.”

More than simply revising problematic evaluation instruments, the page says, Oregon “seeks to develop a holistic new teaching evaluation system that helps the campus community describe, develop, recognize and reward teaching excellence.” The goal is to “increase equity and transparency in teaching evaluation for merit, contract renewal, promotion and tenure while simultaneously providing tools for continual course improvement.”

Craig Vasey, chair of classics, philosophy and religion at the University of Mary Washington and chair of AAUP’s Committee on Teaching, Research and Publications, said the “most pernicious element” of quantitative student evaluations is that the results “get translated into rankings, which then take on a life of their own and don’t really improve the quality of education.”

Review of syllabi and classroom observation by peers are both more “useful means of evaluating,” he said. “And I think asking students how engaged they were in the class — and especially if they also ask why — gets better input from them than the standard questionnaire.”

Ken Ryalls, president of The IDEA Center, a learning analytics organization and publisher of SETs, told Inside Higher Ed earlier this year that not all evaluations are created equal.

“Our advice: Find a good SET that is well designed and low in bias; use the data carefully, watching for patterns over time, adjusting for any proven bias, and ignoring irrelevant data; and use multiple sources of data, such as peer evaluations, administrative evaluations, course artifacts and self-evaluations, along with the student perspective from SETs,” he said via email.


HE Learning Framework

The Science of Learning Research Centre has released a Higher Education Learning Framework (HELF), an evidence-informed model for university learning.

HELF has seven principles:

[Image: the seven HELF principles]

Each principle is explored through an explanation and theoretical overview as well as strategies for teachers, strategies for students and strategies for assessment.

The Science of Learning Research Centre (SLRC) was established in 2013, funded as an ARC Special Research Initiative, to improve learning outcomes at pre-school, primary, secondary and tertiary levels through scientifically-validated learning tools and strategies.

Administered by the University of Queensland, the SLRC brings together neuroscientists, psychologists and education researchers from across the country, collaborating on programs to better understand learning, using a range of innovative experimental techniques and programs.

 

It’s OK to not be OK

We know that 1 in 4 tertiary students experiences a mental health issue each year.

Despite being common, these experiences remain a hidden adversity for many.

In October RMIT Wellbeing & Inclusion, RUSU and Communications ran a mental health campaign, It’s OK to not be OK, inviting students to acknowledge their own experiences of mental illness and share messages of support.

The 200 messages the students have produced so far are powerful and may be a lifeline for students in a time of need.

A student-centric video went behind the scenes of a parallel Wellbeing initiative, while an alumni-focused story showed how things really can and do get better for students managing mental health issues.

 

Big Idea student makes it to finals

BA International Studies student Meaghan Galindo has successfully made it through to the finals of the Big Idea undergraduate competition. Meaghan won the RMIT competition and went on to pitch her winning social enterprise idea, competing against 12 Australian universities in the semi-finals of the competition.

The Big Idea finals are being held on Dec 4 at PricewaterhouseCoopers. We wish Meaghan well for the finals! Great outcomes already from this terrific initiative.

AQF review

In the 2017-18 budget, the Australian Government announced a review of the AQF, to ensure that the framework continues to meet the needs of students, employers, education providers and the wider community.

The AQF was last reviewed between 2009–2011. Since then there have been technological advances in education delivery, increased uptake of sub-qualifications and changes to standard international practice related to qualifications frameworks.

The Minister for Education and Training has appointed the following experts to the AQF Review Panel:

  • Professor Peter Noonan (Chair), Professor of Tertiary Education Policy at Victoria University
  • Mr Allan Blagaich, Executive Director of School Curriculum and Standards, WA Department of Education
  • Professor Sally Kift, Adjunct Professor, College of Business, Law & Governance at James Cook University
  • Ms Megan Lilly, Head of Workforce Development, Ai Group
  • Ms Leslie Loble, Deputy Secretary of External Affairs and Regulation, NSW Department of Education
  • Professor Elizabeth More AM, Dean of the Australian Institute of Management School of Business
  • Ms Marie Persson, Chair, Industry Reference Group of the NSW Skills Board.

The Minister for Education and Training, and the Council of Australian Governments Education and Industry and Skills Councils, have approved the Terms of Reference for the review. The review will involve broad public consultation in response to a discussion paper and will include extensive consultations with the sector. Release of a discussion paper and public consultations are expected in the second half of 2018, with the final report to be provided to government by June 2019.

For further information, see https://www.education.gov.au/australian-qualifications-framework-review-0

 

Changes to DSC L&T leadership

I recently signalled my intention to focus on my work as Dean of the School of Education. After six years in the DPVC role and two years doing both roles, I have decided to step down from the DPVC role.

To ensure a smooth transition, we have made some changes at the College level. Clare Renner, who has provided excellent leadership of the Senior Advisors, will focus her attention on the establishment of the DSC VE School. My thanks to Clare for her respectful leadership of and commitment to the SALT team, particularly over the recent busy period setting up the Areas of Focus for 2019. Stepping back into the role of SALT Manager is Helen McLean. Helen has considerable experience in this role and will lead the team strongly over the coming months as we set up processes for the coming year.

I am also pleased to announce that Professor Tania Broadley has been appointed the new Deputy Pro Vice-Chancellor for Learning and Teaching in DSC.

Tania brings a strong and distinctive record of leadership in Higher Education, a proven commitment to teaching and learning, and excellent insight into the importance of the student experience. Most recently, through her role as Assistant Dean (Teaching and Learning) in the Faculty of Education at QUT, Tania provided strategic leadership at the Faculty level and contributed to the wider University priorities. Prior to QUT, Tania established the Curtin Learning Institute at Curtin University. Reporting to the DVC (Academic), she was responsible for providing strategic leadership in academic development and learning space design at the University level.

Tania has worked in online learning leadership in the Curtin Business School and as an academic in educational technology in the School of Education at Curtin. Her research is concentrated in the fields of educational technologies, academic professional development and teacher education. Professor Broadley completed her PhD in the discipline of education, is on the Teaching Performance Assessment National Expert panel for the most recent teacher education reform agenda, is currently the Queensland representative on the Australian Council of Deans of Education's NADLATE group, and has been a State President and National Board member of the Australian College of Educators.

Tania starts in mid-February 2019 and will help to drive the College’s Plan and deliver on the promises in RMIT’s five Areas of Focus (AoF). I’m sure you will join me in welcoming Tania in the new year.

 

Staff Creds

Following the successful launch of RMIT Creds to students in 2017, RMIT is extending micro-credentials to staff, to allow them to develop new skills and capabilities supporting their professional development goals.

The first Staff Cred, Advancing Reconciliation – Preparing for Bundyi Girri, released as part of the official launch of Bundyi Girri by Vice-Chancellor Martin Bean, is now available via the DevelopMe portal for all staff to enrol in.

Additional Staff Creds released in this first tranche include Agile Ways of Working, Coaching for Performance and Leading a Culturally Intelligent Workplace. By the end of the year, there will be eight Staff Creds in total, with Program Design, Work Integrated Learning, Building Belonging and Applying Belonging coming online.

Staff Creds have been created, developed and released thanks to the efforts of the 21CC, ITS, Communications and credential product owners from HR, Belonging, Activator and Bundyi Girri – all working in a collaborative way to realise this significant organisational milestone.

GUSS Social Care and VE Design teams win LearnX awards


Tony Graham accepts LearnX award for the GUSS Social Care team

The LearnX Impact Awards is an annual event run by the LearnX Foundation, a not-for-profit organisation promoting innovative workforce learning and supporting technologies. The event attracts the leading learning and training industry organisations in Australia.

This year the combined forces of the GUSS Social Care team and the VE Design Team entered the competition. Beginning as a gleam in the eye of Glenn Blair, Assistant Director, VE Operations, this collaboration has delivered impressive outcomes in a short period of time.

The GUSS Social Care team of Renee Costa and Gwen Cawsey has worked with the VE Design team (now 27 staff strong and led by Elissa McKenzie) for the last eight months developing the Certificate III in Individual Support for online delivery. The Certificate III has been a launching place for some key innovations in Canvas and has provided the opportunity for the design team to secure various industry partnerships with their considered approach to learning design and course content. Industry feedback has been very positive.

The GUSS team won Best Learning Design 70:20:10 and the VE Design Team took out awards in Design Accessibility and Best Talent.

Congratulations to all!


Divine 9 – Photo Futures Lab

In this series of posts I’d like to share some of the outcomes that have emerged from the Divine 9 programs involved in Every Graduate Ready this year.

I start with the impressive Photo Futures Lab in Collingwood, which I visited last week.

The Photo Futures Lab is an offsite Bachelor of Arts (Photography) learning & teaching initiative. The purpose of the Lab is to foster work-ready graduates through an industry-based incubator which reflects current, future and expansive work contexts in photography. It enables students, staff, alumni, industry partners, and the local community to engage in a range of collaborative projects.

The project has three core components:

  • A vertically integrated photography studio, The Social Turn: Collingwood Studio;
  • An alumni-in-residence program; and
  • A series of community partnerships and projects exploring critical approaches to photography, collaboration and ethics.

This semester, students have been working in collaboration with The Bendigo Hotel, documenting the social history of the venue; the Social Studio, producing content to help tell the story of this innovative social enterprise; the African Diaspora Women’s Summit, documenting the achievements of the African diaspora in Melbourne; and Collingwood College, mentoring Year 9 students in the politics of representation through the craft of photography.

Screen Shot 2018-10-30 at 10.43.19 pm.png

What the partners said…

The RMIT students were fantastic to work with. They were independent and really took the lead on this project. I love their photos…they have produced an amazing book which I want to keep on the bar. Guy Palermo, The Bendigo Hotel

The RMIT photography students were so impressive; they worked independently and took initiative throughout the process. We hope to continue working with them in the future. Their work is really impressive and we will definitely be using their work to promote our products. Jade McKenzie, CEO The Social Studio

I am very moved by the impact our work had on the students. I am grateful for the collaboration, and for their work promoting the achievements of the African diaspora in Australia. Dr Mimmie Watts, Founder of the African Diaspora Women’s Summit

It was so inspiring for our year 9 students to work with the RMIT photography students who were patient and encouraging. As a teacher it was such a gift to have extra teachers in the room. It has been an unforgettable and very positive experience for all of us.
Tim Webster, Collingwood College

And what our students said…

Mimmie was so open and welcoming. I initially felt out of my depth as I previously had no connection to the continent of Africa. However, as I started to participate in events and educate myself, I became concerned about issues of representation, and in particular photography’s role in misrepresentation. Working with individuals who spoke so eloquently on these issues was a great privilege, and I owe much to their labour. I hope our film goes some way to sharing those positions and experiences more broadly. 2nd year RMIT BA Photography student

I loved teaching the year 9 students. It has opened up a whole new possible career for me—teaching! I didn’t even know that I loved teaching.  2nd year RMIT BA Photography student

It’s been a great experience to be a part of the course. I really enjoyed the presentations and the opportunity to consolidate a lot of our research over the semester, and find an additional way to present that work. It was also really moving to hear the personal impacts some of the other projects had on students and their way of thinking. It really did highlight the transformative power that collaboration can have. 2nd year RMIT BA Photography student

I found that once I started working with the designers at the social studio, I wanted to know more about them as people. I went to visit one of the women who hand knits the iconic ‘social studio’ jumpers. I feel we developed a meaningful relationship and collaboration that I hope will continue into the future. As she is a full time carer and finds it difficult to leave her property, we decided to photograph her knitwear in her own home. This made everything so much more personal, and meant she could participate in the art direction. 1st year RMIT BA Photography student

The students I met spoke passionately about their learning. The success of this initiative owes much to the commitment and respectful facilitation of Kelly Hussey-Smith under the Program Leadership of Pauline Anastasiou. Angela Clarke was also instrumental in helping realise this idea.

A big congratulations to the School of Art on this very successful initiative.
